Archive | Digital Writing Assessment

Assignment: Annotated Bibliography Part 2 – Peer Comments

Thoughts after reading Annotated Bibliography entries:
  • We can run no longer. The era of critical making is here. Daniel's blog discussed the implications of data collection in digital spaces, and Leslie's blog discussed the need for digital composition and assessment strategies. Both carry the inherent argument that the ramifications of digital spaces are firmly embedded within our discipline. What this tells me is that as scholars and instructors we have the responsibility to understand how to create technologies for scholarship and the classroom. We have to understand it as fully as teachers need to understand anything they teach and as scholars need to understand anything they analyze. It's a kind of fluency with technology production that most of us lack, or are scared of, or refuse to accept as part of the discipline. If there was a historical schism between linguistics and English, oral communication and English, and creative writing and English, with these disciplines splintered off from the department, we must do the opposite for technology. As it now exists as a completely separate discipline, technology studies needs to be enveloped by English studies. We have to bring courses in technology production, management, and theory into our world alongside our surveys and seminars in literature and rhetoric. If we are going to teach students to be producers of digital content, we must understand the technology that facilitates that production. The time has come; I hope it has not passed.
  • The assignment for this entry asked for a summary of what/how/why I learned. The what is pretty easy: my comment above speaks to that. The how and why are far more difficult to answer. How did I learn? Collaboratively would be one way to answer that. My peers, whom I hold in the highest regard and who regularly inspire and teach me, were able to comprehend the readings, coherently summarize them in their pieces, and offer broader implications for our course and discipline. The knowledge that I built rests on their solid foundation, so in that sense I learned by working with Daniel and Leslie. These kinds of jigsaw activities (as I believe they are called, if my memory of undergraduate courses in education serves me) are among my favorite ways to learn. Others often bring ideas to the table or make connections that I would not have made, based on our diverse experiences and bodies of knowledge.
  • Why did I learn what I learned? I was open. I was willing. I was humble in the face of the vastness of things that I do not know. My peers are intelligent. My instructors asked me to. The material was selected for me. I have been trained in and have practiced the active reading and critical thinking skills needed to learn. I have prior knowledge from which to draw. All of these are potential answers to the question of why I, or anyone for that matter, might have learned. But I think that the real answer here is that I learned these particular take-aways because I am acutely aware of the issues surrounding critical making, or technology production, as a result of my own research. The "why" is inextricable from the "what" in which I am interested. The human mind most easily makes connections across the pathways that are already there. As new information comes in, my mind works to fit it into the files that exist, to see the relevance to that which I already find important. Perhaps this is why networks grow organically. I want to expand the nodes I value, so I seek out the connections that can be made to and from them. The network grows where knowledge can be easily created and around the nodes that are most valued.

Comment on Daniel's blog

Comment on Leslie's blog (at time of posting, this comment was awaiting moderation)

Synthesis Post: What did I learn from the Annotated Bibliography?

First of all, let me say that after reading classmates’ bibliography posts, I’m convinced I must read this publication in its entirety. For this post, however, I chose to focus on readings reviewed by Maury and Summer. I selected Maury’s …

Annotated Bibliography: Reilly and Atkins

Reilly, Colleen A., and Anthony T. Atkins. “Rewarding Risk: Designing Aspirational Assessment Processes for Digital Writing Projects.” Digital Writing Assessment & Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Logan, UT: Computers and Composition Digital Press/Utah State University Press, 2013. Web.

Weigh the Elephant.

In the article “Rewarding Risk,” Reilly and Atkins describe a process for creating a method of assessing digital writing projects that encourages students to take risks. At the beginning of the article, Reilly and Atkins discuss the challenges of assessing digital writing projects and the ways in which assessments can either discourage or encourage students from taking risks in a digital writing project. While students are learning to use new technologies, they benefit from assessments that take into account their attempts to learn those technologies without punishing them for imperfect outcomes. Reilly and Atkins claim that the language of assessments for digital writing projects should be generalizable, generative, and aspirational (encouraging students to use new tools and learn new skills), and that assessment creation should solicit student involvement, which they claim will localize and contextualize the assessment. In creating assessments, Reilly and Atkins claim that assessment should clearly support pedagogical practices, that the educational values of the course should be evident, that the assessment should focus on instructive aspects of the course, and that the assessment should give feedback to guide future work (2-3). The assessments developed for digital writing projects need to be nimble and adaptable and must take into account students’ unfamiliarity with the technologies they are utilizing. Connecting assessment to student learning outcomes can help encourage students to learn new skills. Digital writing projects, according to Reilly and Atkins, should include collaboration, acknowledge the ways in which writing has changed, incorporate peer review, and include a revision plan (5). The authors suggest the use of deliberate practice, which “overtly requires a process that includes trial and error, the experience which leads to expanding proficiencies and developing expertise” (5).
They say that assessment must encourage students to move past their current skill level and develop their expertise (6). Deliberate practice requires increasing the level of challenge, so assessments should take into account that students will make mistakes in the learning process. Reilly and Atkins say that assessments that students view as a checklist “discourage the deviation and innovation essential to engaging in deliberate practice and embarking on the process of developing expertise” (7). The article details both authors’ attempts to develop aspirational assessment processes. They suggest that one way to facilitate assessment is to have students write reflections on their experiences working on projects, which can help students think about their work in rhetorical terms, demonstrate their knowledge of course concepts, provide rationales for design choices, and learn through analyzing their experiences (9). Another approach that they explore is the use of primary trait scoring for digital writing assessment (10). This process involves students in the creation of an assessment that “accounts for the risks they need to take to complete a project successfully while simultaneously blurring distinctions between formative and summative assessment and making assessment part of the writing process, informing the development, production, and revision of digital compositions” (11). This process begins with assessment of the assignment and also includes “analyzing the writing performance, and formulating primary traits” (12). This approach acknowledges that student “accomplishments may be much greater than the product they submit” (13). When students are involved in the making of the assessment, the assignment becomes aspirational (15).
In their conclusion, Reilly and Atkins acknowledge that there are some limitations to the process, such as time constraints, but they explain the benefits that they have seen in their courses, such as increased student motivation and students learning to determine how projects should be assessed. Their results suggest that utilizing the aspirational assessment process or the primary trait scoring process can increase student motivation and encourage students to take risks as they learn to use new technologies.

How is this Relevant to the Course:

One way in which the discussion of creating assessments for digital writing projects is relevant to the course is that we are ourselves creating digital writing projects with which some of us do not have much experience. Before my first 894 class, I had a blog but had never really used it. We also use other technologies like Popplet. While I don’t need to know how to write code in the class, I have run into a few issues with WordPress.

As I was reading this article by Reilly and Atkins, it occurred to me that some of the writing prompts, particularly the reading notes prompt, were designed as an aspirational process of assessment. Because we have choices in the type of content and the format, we students have the freedom to try new things with our blogs. We have the opportunity to aspire to continue developing our skills with blogging. The reading notes assignment prompt accomplishes two things that Reilly and Atkins felt were important in the assessment of digital media projects: it encourages experimentation with composing in digital media, and it motivates “students to move beyond the basic activities necessary to produce the digital compositions” (n.p.).

Another parallel that I saw was that some of our assignments (particularly the Mind Map) include built-in reflection on the use of technology. Reilly and Atkins explain that using a written reflection for digital projects encourages students to think rhetorically about their technological choices, show knowledge of course concepts, and articulate goals. The written discussions of our Mind Maps help us explain the rhetorical choices that we made and give us a place to delve deeper into how course concepts guided our choices.

Vatz and the Rhetorical Situation: How can assessment encourage rhetorical thinking?

Reilly and Atkins say that student reflections “about their digital compositions should involve rhetorically oriented rationales of content and design choices” (9). Why is it important that students be able to explain their rhetorical choices? For the answer to that question, we can turn to Vatz. Vatz says that if “you view meaning as a consequence of rhetorical creation, your paramount concern will be how and by whom symbols create the reality to which people react” (158). I think that this quote from Vatz sheds some light on the ways in which attention to rhetorical concerns can impact assignment design and assessment.

1. By writing rhetorical rationales, students begin to develop an awareness of the ways in which meaning is “a consequence of rhetorical creation”.

2. Reflection on the assignment helps teachers understand the rhetorical nature of their assignment designs. By examining students’ discussions of the ways in which they interpreted and grappled with an assignment, teachers begin to see how symbols [particularly assignment design symbols] “create the reality to which people react” (158).

Annotated Bibliography: Digital Writing Assessment & Fairness

Poe, Mya. “Making Digital Writing Assessment Fair for Diverse Writers.” Digital Writing Assessment & Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Logan, UT: Computers and Composition Digital Press/Utah State University Press, 2013. Web. 3 February 2014.

In “Making Digital Writing Assessment Fair for Diverse Writers,” Mya Poe argues that digital writing assessment must consider fairness in order to provide an equal opportunity to all students. Digital writing assessment is becoming more important due to the rise of digital writing and multimodal composition. Poe presents two framings of assessment and technology: “writing assessment as technology” and “writing assessment with technology.” First, writing assessment is a technology. Poe cites the work of George Madaus, who argued that assessments fall under “very simple definitions of technology—the simplest being something put together for a purpose, to satisfy a pressing and immediate need, or to solve a problem” (qtd. in Poe). Second, digital writing is being assessed with technology through automated essay scoring (AES). The research on AES is mixed: on one hand, it has been shown to be reliable; on the other hand, the programs are said to carry the “raced ideologies of their designers” (4). In response, Poe presents three key terms in digital writing assessment: validity, reliability, and fairness. She defines validity and reliability; however, her focus is fairness.

Fairness is fundamental to digital assessment because through fairness instructors are able to “make valid, ethical conclusions from assessment results so that we may provide all students the opportunity to learn” (7). Poe uses the Standards for Educational and Psychological Testing as a starting point for developing more equitable large-scale digital writing assessments. Using the Standards, Poe argues that fairness guidelines require the following:

  • Thoughtful design
  • Extension of fairness through the entire assessment
  • Data collection (locally sensitive data through surveys and ethnographic research)
  • Interpretation of evidence in context
  • Framing of assessment results for the public

Under the fairness guidelines, instructors consider the goal of the assessment and ensure that students understand its purpose. In addition, when interpreting assessments, instructors consider the social context and connect writing program data to institutional data. The fairness guidelines also encourage instructors to gather evidence about digital identities, understanding students’ past digital writing experiences and the nature of those experiences. The Standards “provide us ways to think about the interpretation and use of assessment results” (14). Digital writing assessment has to move beyond traditional rubrics toward seeing digital assessment as a way for instructors to make informed choices for the benefit of their teaching and student learning.

The questions posed in Poe’s work (and in the entire collection) remind me of our initial class discussion, where we talked about defining a network and understanding the affordances and roles of networks. Poe’s work takes aim at the question: “How might the multimodal, networked affordances of digital writing affect issues of equity and access?” (Preface). This goes beyond questions of access to the network to include the benefits that said network offers to a particular group. A minority group may have access to the network but lack the knowledge or ability to capitalize on that access. This could be a network of physical friends and co-workers or a digital network.

Poe’s work also made me think of assessment as a kind of boundary genre. Boundary genres are defined as genres that “may actively participate in interprofessional struggles about hierarchies, dominance, and values, helping to create, mediate, and store tensions” (Popham 283). Assessments work in this way: as scholar-teachers, we struggle over the role of assessment and over when and how to assess. Assessment crosses professional disciplines/boundaries because teachers, administrators, and the public use it to make changes to pedagogy, develop policies, and judge quality/effectiveness, respectively.

I am still pondering the connection between boundary genres and networks. Would boundary genres serve as nodes, providing connections between disciplines and groups in order to redistribute information and communicate between different points?

Works Cited:

McKee, Heidi A., and Dànielle Nicole DeVoss, Eds. “Preface.” Digital Writing Assessment & Evaluation. Logan, UT: Computers and Composition Digital Press/Utah State University Press, 2013. Web.

Poe, Mya. “Making Digital Writing Assessment Fair for Diverse Writers.” Digital Writing Assessment & Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Logan, UT: Computers and Composition Digital Press/Utah State University Press, 2013. Web. 3 February 2014.

Popham, Susan L. “Forms as Boundary Genres in Medicine, Science, and Business.” Journal of Business and Technical Communication 19.3 (2005): 279-302.


Van Deuren, Joe. “Fairness Heading.” Balanced Life Skills. Web. 3 February 2014.

Tarbell, Jared. “Node Garden.” Gallery of Computation, 2004. Web. 3 February 2014.