Placing Writing Tasks in Local and Global Contexts: The Case of Argumentative Writing

  • Hannah Ringler, Carnegie Mellon University
  • Beata Beigman Klebanov, Educational Testing Service (ETS)
  • David Kaufer, Carnegie Mellon University
Keywords: argument, diagnostic, distance, first-year composition, genre, measurement, rhetorical profiles, rhetorical strategies, sequence, student writing, text analysis, writing analytics

Abstract

  • Background: Current research in composition and writing studies is concerned with writing program evaluation and with how writing tasks and their sequences scaffold students toward learning outcomes. Writing analytics research is beginning to address these issues: it can identify recurring types of language in writing assignments and show how those patterns inform task design and student outcomes. This study contributes a three-step method of sequencing, comparison, and diagnosis for understanding how specific writing tasks fit into a classroom sequence and how they compare to larger genres of writing beyond the immediate classroom environment. In doing so, we provide writing program administrators with tools for describing the skills students demonstrate across a sequence of writing tasks and for diagnosing how those skills match the writing students will do in later contexts.
  • Literature Review: Student writing produced in response to classroom assignments can be understood in terms of genre, insofar as such texts are constructed responses that arise in similar rhetorical situations and perform similar social actions. Previous corpus-analytic work has examined these genres, helping writing instructors understand what kinds of constructed responses are required of students and make those expectations explicit. Aull (2017), for example, examined a corpus of first-year undergraduate writing assignments in two courses to create “sociocognitive profiles” of those assignments. We analyze student writing that responds to similar writing tasks, but we use a different corpus method that allows us to understand the tasks in both local and global contexts. In doing so, we gain confidence and depth in our understanding of these tasks, analyze how they sequence together, and can compare argumentative writing across institutions and contexts.
  • Research Questions: Two questions guided our study:
  1. What is the trajectory of skills targeted by the sequence of tasks in the two first-year writing courses, as evidenced by the rhetorical strategies employed by the writers in successive assignments?
  2. Focusing on the final argument assignments, how similar are they to argumentative writing in other contexts, in terms of rhetorical profiles?
  • Methodology: We first conducted a local analysis, using a dictionary-based corpus method to examine the rhetorical strategies writers employed in the two first-year writing courses and to understand how successive assignments built on one another to form a sequence. Having established what skills students demonstrate in a course, we then conducted a global analysis that calculated a “distance” between the first-year argument writing and a corpus of argument writing drawn from other contexts (a sketch of one such distance computation follows this abstract). Finding a non-trivial distance, we then identified and evaluated its sources so that the writing tasks could be assessed or modified.
  • Results: The local analysis revealed eight key rhetorical strategies that student writing exhibits across the two first-year writing courses. With this understanding, we then placed the argument writing in global contexts and found that the assignments in both courses differ somewhat from argument writing in other contexts. Analyzing this difference, we found that the first-year writing differs primarily in its use of academic language, the personal register, assertive language, and reasoning. We suggest that these differences stem primarily from the rhetorical situation and learning objectives associated with first-year writing, as well as from the sequencing of the courses.
  • Discussion: The three-step method presented here gives writing program administrators a means of describing and analyzing the writing their students produce. We intend these steps to be understood as an iterative process, whereby writing programs can use the results to evaluate which rhetorical skills their students exhibit and to benchmark those skills against the program’s goals and/or those of other, similar writing programs.
  • Conclusions: By presenting these analyses together, we ultimately provide a cohesive method for analyzing a writing program and benchmarking students’ use of rhetorical strategies in relation to other argumentative contexts. We believe this method to be useful not only to individual writing programs, but to the assessment literature broadly. In future research, we anticipate learning how this process can feed back into pedagogy in practice, as well as what placing writing tasks in a global context can tell us about genre theory.
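
Note on the distance computation: the reference list cites both Burrows’s (2002) Delta and the Kullback–Leibler (1951) divergence, two standard ways of quantifying the distance between frequency profiles. As a minimal illustration of the general idea only (not the authors’ actual procedure, categories, or data), the Python sketch below computes a Delta-style distance between hypothetical per-1,000-word frequencies of rhetorical categories; every corpus name, category name, and number is an invented placeholder.

    # Illustrative sketch only: a Burrows-style Delta distance between
    # rhetorical profiles (per-1,000-word frequencies of rhetorical
    # categories). All names and numbers are hypothetical placeholders,
    # not the study's data.
    import statistics

    # Hypothetical reference corpora of argumentative writing.
    REFERENCE_PROFILES = {
        "civic_essays": {"academic": 31.0, "personal": 12.0, "assertive": 18.0, "reasoning": 22.0},
        "op_eds":       {"academic": 24.0, "personal": 17.0, "assertive": 21.0, "reasoning": 19.0},
        "timed_exams":  {"academic": 20.0, "personal": 23.0, "assertive": 25.0, "reasoning": 15.0},
    }

    # Hypothetical profile of the first-year argument assignments.
    FYW_PROFILE = {"academic": 27.0, "personal": 20.0, "assertive": 14.0, "reasoning": 26.0}

    def delta(profile_a, profile_b, references):
        """Mean absolute difference of z-scored category frequencies,
        standardized by the spread across the reference corpora
        (the core idea of Burrows's 2002 Delta)."""
        diffs = []
        for cat in sorted(profile_a):
            values = [ref[cat] for ref in references.values()]
            mean, sd = statistics.mean(values), statistics.stdev(values)
            diffs.append(abs((profile_a[cat] - mean) / sd - (profile_b[cat] - mean) / sd))
        return sum(diffs) / len(diffs)

    for name, profile in REFERENCE_PROFILES.items():
        print(f"Delta(first-year, {name}) = {delta(FYW_PROFILE, profile, REFERENCE_PROFILES):.2f}")

A diagnosis step of the kind described in the Results could then rank categories by their contribution to such a distance, pointing to the dimensions (in this study’s findings: academic language, the personal register, assertive language, and reasoning) on which the first-year writing diverges most from other argumentative contexts.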

Author Biographies

Hannah Ringler, Carnegie Mellon University
PhD student, English department
Beata Beigman Klebanov, Educational Testing Service (ETS)
Senior research scientist
David Kaufer, Carnegie Mellon University
Professor, English department

References

Allen, J. (2004). The impact of student learning outcomes assessment on technical and professional communication programs. Technical Communication Quarterly, 13(1), 93–108. Retrieved from https://doi.org/10.1207/S15427625TCQ1301_9

Aull, L. L. (2015). First-year university writing: A corpus-based study with implications for pedagogy. UK: Palgrave Macmillan.

Aull, L. L. (2017). Corpus analysis of argumentative versus explanatory discourse and its implications for writing task design. Journal of Writing Analytics, 1(1), 1–47. Retrieved from https://journals.colostate.edu/analytics/article/view/106

Aull, L. L., & Lancaster, C. I. Z. (2014). Linguistic markers of stance in early and advanced academic writing: A corpus-based comparison. Written Communication, 31(2), 151–183. Retrieved from https://doi.org/10.1177/0741088314527055

Bakhtin, M. (1990). Mikhail Bakhtin. In P. Bizzell & B. Herzberg (Eds.), The rhetorical tradition: Readings from classical times to the present (pp. 1206–1245). Boston: Bedford/St. Martin's.

Beck, S. W., & Jeffery, J. V. (2007). Genres of high-stakes writing assessments and the construct of writing competence. Assessing Writing, 12(1), 60–79. Retrieved from https://doi.org/10.1016/j.asw.2007.05.001

Beigman Klebanov, B., Kaufer, D., Yeoh, P., Ishizaki, S., & Holtzman, S. (2016). Argumentative writing in assessment and instruction: A comparative perspective. In N. Stukker, W. Spooren, & G. Steen (Eds.), Genre in language, discourse and cognition (pp. 167–192). Berlin, Boston: De Gruyter Mouton.

Beigman Klebanov, B., Ramineni, C., Kaufer, D., Yeoh, P., & Ishizaki, S. (in press). Advancing the validity argument for standardized writing tests using quantitative rhetorical analysis. Language Testing.

Bennett, R. E. (1991). On the meanings of constructed response. ETS Research Report Series, 1991(2), i–46. Retrieved from https://doi.org/10.1002/j.2333-8504.1991.tb01429.x

Biber, D. (1988). Variation across speech and writing. Cambridge: Cambridge University Press.

Biber, D., Conrad, S., Reppen, R., Byrd, P., & Helt, M. (2002). Speaking and writing in the university: A multidimensional comparison. TESOL Quarterly, 36(1), 9–48. Retrieved from https://doi.org/10.2307/3588359

Brown, D. W., & Aull, L. L. (2017). Elaborated specificity versus emphatic generality: A corpus-based comparison of higher- and lower-scoring Advanced Placement exams in English. Research in the Teaching of English, 51(4), 394–417.

Burrows, J. (2002). ‘Delta’: A measure of stylistic difference and a guide to likely authorship. Literary and Linguistic Computing, 17(3), 267–287. Retrieved from https://doi.org/10.1093/llc/17.3.267

Cotos, E., Huffman, S., & Link, S. (2015). Furthering and applying move/step constructs: Technology-driven marshalling of Swalesian genre theory for EAP pedagogy. Journal of English for Academic Purposes, 19, 52–72. Retrieved from https://doi.org/10.1016/j.jeap.2015.05.004

Crossley, S. A., Kyle, K., & McNamara, D. S. (2016). The tool for the automatic analysis of text cohesion (TAACO): Automatic assessment of local, global, and text cohesion. Behavior Research Methods, 48(4), 1227–1237. Retrieved from https://doi.org/10.3758/s13428-015-0651-7

CWPA, NCTE, & NWP. (2011). Framework for success in postsecondary writing. Retrieved from http://wpacouncil.org/files/framework-for-success-postsecondary-writing.pdf

Devitt, A., & Reiff, M. J. (2014). Reproducing genres: Pattern-related writing. In E. Jakobs & D. Perrin (Eds.), Handbook of writing and text production (pp. 263–284). Berlin: De Gruyter Mouton.

Dryer, D. B., Bowden, D., Brunk-Chavez, B., Harrington, S., Halbritter, B., & Yancey, K. B. (2014). Revising FYC outcomes for a multimodal, digitally composed world: The WPA outcomes statement for first-year composition (version 3.0). Writing Program Administration, 38(1), 129–143.

Ford, J. E., & Perry, D. R. (1982). Research paper instruction in the undergraduate writing program. College English, 44(8), 825–831. Retrieved from http://www.jstor.org/stable/377339

Geisler, C. (2016a). Current and emerging methods in the rhetorical analysis of texts – Introduction: Toward an integrated approach. Journal of Writing Research, 7(3), 417–424.

Geisler, C. (2016b). Current and emerging methods in the rhetorical analysis of texts – Closing: Toward an integrated approach. Journal of Writing Research, 7(3), 511–526.

Gray, B. (2015). Linguistic variation in research articles: When discipline tells only part of the story. Amsterdam: John Benjamins Publishing Company.

Hardy, J. A., & Römer, U. (2013). Revealing disciplinary variation in student writing: A multi-dimensional analysis of the Michigan Corpus of Upper-level Student Papers (MICUSP). Corpora, 8(2), 183–207. Retrieved from https://doi.org/10.3366/cor.2013.0040

Harrington, S., Malenczyk, R., Peckham, I., Rhodes, K., & Yancey, K. B. (2001). WPA outcomes statement for first-year composition. College English, 63(3), 321–325. Retrieved from https://doi.org/10.2307/378996

Hart, R. P. (2000). Campaign talk: Why elections are good for us. Princeton, NJ: Princeton University Press.

Hood, C. L. (2010). Ways of research: The status of the traditional research paper assignment in first-year writing/composition courses. Composition Forum, 22. Retrieved from http://compositionforum.com/issue/22/ways-of-research.php

Hoover, D. L. (2004a). Testing Burrows’s delta. Literary and Linguistic Computing, 19(4), 453–475. Retrieved from https://doi.org/10.1093/llc/19.4.453

Hoover, D. L. (2004b). Delta prime? Literary and Linguistic Computing, 19(4), 477–495. Retrieved from https://doi.org/10.1093/llc/19.4.477

Ishizaki, S., & Kaufer, D. (2011). The DocuScope text analysis and visualization environment. In P. McCarthy & C. Boonthum (Eds.), Applied natural language processing and content analysis: Advances in identification, investigation, and resolution (pp. 275–296). Hershey, PA: Idea Group Inc. (IGI).

Jarvis, S., Grant, L., Bikowski, D., & Ferris, D. (2003). Exploring multiple profiles of highly rated learner compositions. Journal of Second Language Writing, 12(4), 377–403. Retrieved from https://doi.org/10.1016/j.jslw.2003.09.001

Jenseth, R. (1989). Understanding Hiroshima: An assignment sequence for freshman English. College Composition and Communication, 40(2), 215–219. Retrieved from http://www.jstor.org/stable/358131

Kaufer, D. S., & Butler, B. S. (2000). Designing interactive worlds with words: Principles of writing as representational composition. New York: Routledge.

Kaufer, D., Ishizaki, S., Collins, J., & Vlachos, P. (2004). Teaching language awareness in rhetorical choice: Using IText and visualization in classroom genre assignments. Journal of Business and Technical Communication, 18(3), 361–402. Retrieved from https://doi.org/10.1177/1050651904263980

Kiniry, M., & Strenski, E. (1985). Sequencing expository writing: A recursive approach. College Composition and Communication, 36(2), 191–202. Retrieved from http://www.jstor.org/stable/357441

Kučera, H., & Francis, W. N. (1967). Computational analysis of present-day American English. Providence: Brown University Press.

Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22(1), 79–86. Retrieved from http://www.jstor.org/stable/2236703

Lindemann, E. (2001). A rhetoric for writing teachers. Oxford: Oxford University Press.

Manning, A. D. (1961). The present status of the research paper in freshman English: A national survey. College Composition and Communication, 12(2), 73–78. Retrieved from http://www.jstor.org/stable/355440

McNamara, D. S., Graesser, A. C., McCarthy, P. M., & Cai, Z. (2014). Automated evaluation of text and discourse with Coh-Metrix. Cambridge: Cambridge University Press.

Melzer, D. (2009). Writing assignments across the curriculum: A national study of college writing. College Composition and Communication, 61(2), W240–W261.

Michigan Corpus of Upper-level Student Papers. (2009). Ann Arbor, MI: The Regents of the University of Michigan.

Miller, C. R. (1984). Genre as social action. Quarterly Journal of Speech, 70(2), 151–167. Retrieved from https://doi.org/10.1080/00335638409383686

Moxley, J. (2013). Big data, learning analytics, and social assessment. The Journal of Writing Assessment, 6(1), 1–10.

National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century (J. W. Pellegrino & M. L. Hilton, Eds.). Committee on Defining Deeper Learning and 21st Century Skills; Board on Testing and Assessment and Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Rankin, E. (1990). From simple to complex: Ideas of order in assignment sequences. Journal of Advanced Composition, 10(1), 126–135. Retrieved from http://www.jstor.org/stable/20865704

Sandhaus, E. (2008). The New York Times annotated corpus (LDC2008T19). Philadelphia, PA: Linguistic Data Consortium.

Simons, H. W. (1978). ‘Genre-alizing’ about rhetoric: A scientific approach. In K. K. Campbell & K. H. Jamieson (Eds.), Form and genre: Shaping rhetorical action (pp. 33–50). Falls Church, VA: The Speech Communication Association.

Soliday, M. (2011). Everyday genres: Writing assignments across the disciplines. Carbondale, IL: Southern Illinois University Press.

Stein, S., & Argamon, S. (2006). A mathematical explanation of Burrows’s Delta. In Proceedings of the Digital Humanities Conference (pp. 207–209).

Thomas, S., & McShane, B. J. (2007). Skills and literacies for the 21st Century: Assessing an undergraduate professional and technical writing program. Technical Communication, 54(4), 412–423.

WAC Clearinghouse. (2001). Sequencing writing assignments. Fort Collins, CO: University of Delaware Writing Center.

White, E. M., Elliot, N., & Peckham, I. (2015). Very like a whale: The assessment of writing programs. Logan, UT: Utah State University Press.

Widaman, K. (2007). Common factors versus components: Principals and principles, errors and misconceptions. In R. Cudeck & R. C. MacCallum (Eds.), Factor analysis at 100: Historical developments and future directions (pp. 177–203). Mahwah, NJ: Erlbaum.

Wolfe, J. (2009). How technical communication textbooks fail engineering students. Technical Communication Quarterly, 18(4), 351–375. Retrieved from https://doi.org/10.1080/10572250903149662

Yancey, K. B., & Morrison, B. M. (2006). Coming to terms: Vocabulary as a means of defining first-year composition. In P. Sullivan & H. Tinberg (Eds.), What is college-level writing? (pp. 267–280). Urbana, IL: NCTE.

Zhu, W. (2004). Writing in business courses: An analysis of assignment types, their characteristics, and required skills. English for Specific Purposes, 23, 111–135. Retrieved from https://doi.org/10.1016/S0889-4906(02)00046-7

Published
2018-12-09