It's All in the Notes: What Session Notes Can Tell Us About the Work of Writing Centers

  • Genie N. Giaimo The Ohio State University
  • Joseph J. Cheatle Michigan State University
  • Candace K. Hastings Texas A&M University
  • Christine Modey University of Michigan
Keywords: corpus analysis, session notes, writing analytics, writing centers

Abstract

  • Aim: This research note focuses on how corpus analysis tools can help researchers make sense of the data writing centers collect. Writing centers function, in many ways, like large data repositories; however, these data remain under-analyzed. One example of data collected by writing centers is session notes, often recorded after each consultation. The four institutions featured in this note (Michigan State University, the University of Michigan, Texas A&M University, and The Ohio State University) have analyzed a subset of their session notes: over 44,000 notes comprising around 2,000,000 words. By analyzing session notes with tools such as Voyant, a web-based application for performing text analysis, writing center researchers can begin to critically explore their large data repositories to understand and establish evidence-based practice, as well as to shape external messaging about writing center labor, separate from and in addition to its impact on student writers, for institutional administrators, state legislators, and other stakeholders.
  • Problem Foundation: This section identifies a key problem in writing centers: they hold large amounts of data but have no easy way to analyze it. Session notes are a common record-keeping practice in writing centers, yet few researchers have critically examined these documents, and fewer still have applied any type of discourse, textual, or corpus analysis to them. The studies that do exist are either limited in scope (Brown, 2010) or labor-intensive because they require hand coding (Hall, 2017). Additionally, few cross-institutional collaborations or partnerships exist in writing center research. Therefore, our experiment with corpus-level analysis of session notes has the potential to break new ground in writing center studies and lead to innovations in training, assessment, and field-based practices.
  • Information Collection: This research note uses Voyant, a free, open-access, web-based application, to perform a textual analysis of session notes from four institutions. Each institutional writing center analyzed 500,000 words of session notes spanning the last two to five years. Each institution has a different audience for its session notes: OSU, MSU, and TAMU share notes with clients upon request; TAMU's online writing center automatically sends notes to clients; and the University of Michigan shares notes only among staff (not with instructors or students). Voyant is one tool among many (e.g., AntConc, Coh-Metrix, KNIME Analytics Platform, and WordStat) that researchers can use to pursue their analytical priorities, whether structural linguistics, contextual linguistics, socio-cultural discourse, or some combination thereof. Because the field of writing center studies is only just beginning to apply corpus analysis, we suggest that as corpora are developed within the field, corpus analysis can become more sophisticated and varied, and different programs can serve different functions and needs. Voyant is free, multi-featured, and provides high-quality visualizations. For this project, each researcher chose which Voyant tools to use based on the interests and goals of the institution and writing center. Tools used include Corpus Terms (a table view of term frequency across the entire corpus), Cirrus (a word cloud that visualizes the highest-frequency words in a corpus), the Contexts tool (which shows each occurrence of a keyword along with the words/phrases to its left and right), and the Collocates Graph/Links tool (which represents keywords and terms that occur in close proximity as a force-directed network graph).
  • Conclusions: Tools such as Voyant offer an effective way to provide broad insights into the work of writing centers through a corpus analysis of session notes. As each institution demonstrates, a wide variety of questions are answerable by tools like Voyant, including ones that are specific to individual centers and institutions. Corpus analysis of session notes can provide a broad view of the ways that language functions within sessions to enhance and concretize writing centers’ sense of the work they, and their consultants, do. While corpus analysis does not provide all of the answers for writing centers, when this analytic method is coupled with other quantitative and qualitative strategies for understanding the work of writing centers and the interactions that take place therein, it can provide us with the insights needed to support consultants and clients while improving the center.
  • Directions for Further Research: While this research note demonstrates the capability of analytical tools like Voyant to help individual institutions understand and assess their writing centers, directions for further research include comparing the institutions as well as creating a corpus of the combined institutions’ session notes. Because there is no current reference corpus for writing centers, as there is for contemporary American English (for example, the Corpus of Contemporary American English, or COCA), this collaboration can aggregate and create corpora for other institutions to use in their own analyses. By creating a reference corpus, we can extend the research impact of our work and make findings more powerful and, potentially, significant as other writing centers take up this work.
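The core Voyant operations described above (term frequency, keyword-in-context, and collocation) can be approximated with a few lines of standard-library Python. The sketch below is illustrative only: the sample "session notes" are invented, and the tokenizer and window sizes are simplifying assumptions, not the note's actual method or data.

```python
import re
from collections import Counter

# Toy "session notes" standing in for a writing center corpus;
# this text is invented purely for illustration.
notes = [
    "We worked on the thesis and discussed paragraph organization.",
    "The student revised the thesis statement and we discussed citations.",
    "We discussed the introduction and worked on thesis clarity.",
]

def tokenize(text):
    """Lowercase a note and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

tokens = [tok for note in notes for tok in tokenize(note)]

# 1. Corpus Terms: raw term frequency across the whole corpus.
term_freq = Counter(tokens)

# 2. Contexts (KWIC): each occurrence of a keyword with a few
#    words of left and right context, like Voyant's Contexts tool.
def contexts(keyword, window=2):
    hits = []
    for note in notes:
        toks = tokenize(note)
        for i, tok in enumerate(toks):
            if tok == keyword:
                left = " ".join(toks[max(0, i - window):i])
                right = " ".join(toks[i + 1:i + 1 + window])
                hits.append((left, tok, right))
    return hits

# 3. Collocates: terms co-occurring within a window of a keyword,
#    the counts behind a collocates graph.
def collocates(keyword, window=3):
    counts = Counter()
    for note in notes:
        toks = tokenize(note)
        for i, tok in enumerate(toks):
            if tok == keyword:
                neighbors = toks[max(0, i - window):i] + toks[i + 1:i + 1 + window]
                counts.update(neighbors)
    return counts

print(term_freq.most_common(5))
print(contexts("thesis"))
print(collocates("thesis").most_common(5))
```

Running these three operations over a real session-note corpus would reproduce the kind of evidence the Corpus Terms, Contexts, and Collocates tools surface in Voyant, though a word cloud like Cirrus additionally requires a visualization layer.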

References

Anson, C. M. (2008). The intelligent design of writing programs: Reliance on belief or a future of evidence. WPA: Writing Program Administration, 32(1), 11–36.

Aull, L. L. (in press). Corpus analytic tools: Constructing and understanding student writing assessment. Assessing Writing.

Brown, R. (2010). Representing audiences in writing center consultations: A discourse analysis. The Writing Center Journal, 30(2), 72–99.

Bugdal, M., Reardon, K., & Deans, T. (2016). Summing up the session: A study of student, faculty, and tutor attitudes toward tutor notes. The Writing Center Journal, 35(3), 13–36.

Cogie, J. (1998). In defense of conference summaries: Widening the reach of writing center work. The Writing Center Journal, 18(2), 47–70.

Conway, G. (1998). Reporting writing center sessions to faculty: Pedagogical and ethical considerations. Writing Lab Newsletter, 22(8), 9–12.

Crump, E. (1993). Voices from the net: Sharing records: Student confidentiality and faculty relations. Writing Lab Newsletter, 18(2), 8–9.

Driscoll, D., & Perdue, S. (2012). Theory, lore, and more: An analysis of RAD research in "The Writing Center Journal," 1980–2009. The Writing Center Journal, 32(2), 11–39. Retrieved from http://www.jstor.org/stable/43442391

Gee, J. P. (2014). How to do discourse analysis: A toolkit. Oxon: Routledge.

Getting started. (n.d.). Retrieved from https://voyant-tools.org/docs/#!/guide/start

Giaimo, G. (2017). Focusing on the blind spots: RAD-based assessment of students’ perceptions of a community college writing center. Praxis: A Writing Center Journal, 15(1), 55–64.

Hacker, D., & Sommers, N. (2015). A writer's reference (8th ed.). Boston: Bedford/St. Martin's.

Hall, R. M. (2017). Around the texts of writing center work: An inquiry-based approach to tutor education. Logan: Utah State University Press.

Haswell, R. (2005). NCTE/CCCC’s recent war on scholarship. Written Communication, 22(2), 198–223.

Haswell, R., & Elliot, N. (2017). Innovation and the California State University and Colleges English Equivalency Examination, 1973–1981: An organizational perspective. The Journal of Writing Assessment, 10(1). Retrieved from http://journalofwritingassessment.org/article.php?article=118

Jackson, K. (1996). Beyond record-keeping: Session reports and tutor education. Writing Lab Newsletter, 20(6), 11–13.

Larrance, A. J., & Brady, B. (1995). A pictogram of writing center conference follow-up. Writing Lab Newsletter, 20(4), 5–7.

Mackiewicz, J. (2017). The aboutness of writing center talk: A corpus-driven and discourse analysis. New York: Routledge.

Mackiewicz, J., & Thompson, I. (2016). Adding quantitative corpus-driven analysis to qualitative discourse analysis: Determining the aboutness of writing center talk. The Writing Center Journal, 35(3), 187–225.

Partington, A., Duguid, A., & Taylor, C. (2013). Patterns and meanings in discourse: Theory and practice in corpus-assisted discourse studies (CADS). Philadelphia: John Benjamins.

Pemberton, M. (1995). Writing center ethics: Sharers and seclusionists. Writing Lab Newsletter, 20(3), 13–14.

Schendel, E., & Macauley, W. (2012). Building writing center assessments that matter. Boulder, Colorado: University Press of Colorado.

Schön, D. (1987). Educating the reflective practitioner. San Francisco: Jossey-Bass.

Weaver, M. (2001). Resistance is anything but futile: Some more thoughts on writing conference summaries. The Writing Center Journal, 21(2), 35–56.

Published
2018-12-09