This week, I emphasized articles from Computers and Composition in my blog for two reasons. First, as my
attention for the past two weeks has been on digital writing research methods, Computers and Composition was an obvious
choice for research articles related to this subfield of writing studies. Second,
and more important, this journal’s articles employ a wide range of research designs.
Having such variety to analyze, I felt, would enable me to address (if rather
simplistically and generally) how research in this area adheres to the
methodological theories and approaches that I have been examining for the past
month. What I have noticed in reading these articles
is that they employ traditional research methods in ways suited to the
complexity of digital writing research.
I want to begin by discussing how some of these articles
adhere to more traditional research methods. The studies I selected for this
week’s readings covered a variety of research methods, but as an example of how
these studies rely on and differ from traditional formulations, I will address
Jin and Zhu’s case study approach. A case study is exploratory in
nature, aiming not to establish cause-and-effect relationships but to identify
potential variables to explore more fully (Lauer and Asher 23). In addition,
case studies tend to rely on either broad or representative samples of populations
to study, a variety of sources for data, careful coding and reliability
examinations, and “descriptive accounts” to report their findings (Lauer and
Asher 25, 26-27, 31, 32). Jin and Zhu are clear that their study is
exploratory, especially in terms of discovering potential ways
computer-mediated tools affect students’ activity (286). They also
demonstrate selective sampling by choosing two students with
different levels of computer skills and knowledge (287-88). However, while Jin and
Zhu do address their coding procedures, they do not address reliability issues
related to their coding. This lapse does not seem to weaken their study. Their
focus on motives, combined with the numerous streams of data (video observations,
interviews, IM transcripts, etc. [288]) that they triangulated to understand and
validate participants’ claims about their motives (289), suggests an effective
methodology for making sense of the data they collected.
These studies also adhere to the principles addressed
throughout Heidi A. McKee and Danielle Nicole DeVoss’s anthology on digital
writing research, especially approaches to theory, ethics, and collecting data
effectively. As is clear in McKee and DeVoss’s collection, many scholars in
digital writing studies rely on theory as a guiding principle to understand and
explore their research topics (e.g., Hilligoss and Williams 232-36; Kimme Hea 273-74; Romberger 251-54). Jin and
Zhu, for example, embrace activity theory to establish and explicate their
focus on motivation and how technology can affect these motives (285-86). In
addition, some articles in McKee and DeVoss’s text emphasize the complexity of
ethical issues related to digital research (see Banks and Eble; Pandey for a
couple of examples). In my readings this week, Stedman’s article addresses most
explicitly the complexity of these issues, noting his concerns about the
treatment of fan communities by previous researchers, his decisions that being
ethical in the eyes of this community meant being explicit about his intentions,
and his sense that his IRB permissions did not precisely address the ethical
considerations he had to make (110-11, 117). Finally, many of the authors in
McKee and DeVoss’s anthology note the need for considerable flexibility in
terms of their research methods to adapt to the malleability of digital
research (e.g., DePew; Rickly). One particularly representative example from
my reading this week was Jin and Zhu’s article. Though they did not rely
on quantitative methods, they did use multiple approaches to collect their data,
as noted above. Many of these methods (video recording and chat transcripts, for
example) allowed them to gain insight into their participants’ motives less
obtrusively than traditional observation and possibly even yielded more
accurate information.
What these articles demonstrate in terms of research
methods, then, is both an attention to the research tradition in maintaining
high intellectual standards and a willingness to add to or tweak these
practices in response to changing research contexts as a result of
computer-mediated technologies. As I have noted previously, such flexibility is
important for research in the digital age if it is to be ethical and rigorous,
and despite some lapses in methods or analysis (see my entry on Garrison’s
article), the research I examined this week seems to illustrate such ethics and
rigor.
Works Cited
Banks, Will, and Michelle Eble. “Digital Spaces, Online
Environments, and Human Participant Research: Interfacing with Institutional
Review Boards.” McKee and DeVoss 27-47. Print.
Garrison, Kevin. “An Empirical Analysis of Using
Text-to-Speech Software to Revise First-Year College Students’ Essays.” Computers and Composition 26.4 (2009):
288-301. Print.
Hilligoss, Susan, and Sean Williams. “Composition Meets
Visual Communication: New Research Questions.” McKee and DeVoss 229-47. Print.
Jin, Li, and Wei Zhu. “Dynamic Motives in ESL
Computer-Mediated Peer Response.” Computers
and Composition 27.4 (2010): 284-303. Print.
Kimme Hea, Amy. “Riding the Wave: Articulating a Critical
Methodology for Web Research Practices.” McKee and DeVoss 269-86. Print.
Lauer, Janice M., and J. William Asher. Composition Research: Empirical Designs. New York: Oxford UP, 1988.
Print.
McKee, Heidi A., and Danielle Nicole DeVoss, eds. Digital Writing Research: Technologies,
Methodologies, and Ethical Issues. New Dimensions in Computers and
Composition. Ed. Gail E. Hawisher and Cynthia Selfe. Cresskill: Hampton, 2007.
Print.
Pandey, Iswari. “Researching (with) the Postnational
‘Other’: Ethics, Methodologies, and Qualitative Studies of Digital Literacy.”
McKee and DeVoss 107-25. Print.
Romberger, Julia E. “An Ecofeminist Methodology: Studying
the Ecological Dimensions of the Digital Environment.” McKee and DeVoss 249-67.
Print.
Stedman, Kyle D. “Remix Literacy and Fan Compositions.” Computers and Composition 29.2 (2012):
107-23. Print.