
Mindmap #4: Drawing Some Genre Lines


Mindmap update #4: Popplet screen capture

This week, like others in the class, I felt a need to add a little more structure to my mindmap. In response, I added a color key in the top left that codes each popple according to its function in the map or its relation to theorists. I identified two functions, marked in black and blue (without intentional reference to the intellectual bruising these theories are giving me…), to indicate “Networking” and “Descriptions, Questions & Commentary.” Networking references the parts of a generalized network as I’ve encountered them: nodes, connections, hierarchies, and frameworks. Descriptions, Questions & Commentary refers to questions and comments I made as I struggled with particularly puzzling aspects of theorists’ ideas or network functions. I’ve found less need to interrogate theorists as I’ve moved ahead in the class, at least in part because our latest theorists write more clearly about their own objects of study than our earlier readings did. I continue to connect these questions and comments to other parts of the map as I find additional or more nuanced ways to answer or address them.

Adding the color coding also encouraged me to more clearly articulate the relationship of ideas to theorists, so I ended up more closely aligning Foucault to “contradiction” and “historical a priori” and Bazerman, Miller, and Popham to “genre,” “genre system,” “boundary genre,” and “activity system.” Interestingly, I discovered Miller discussed hierarchy in more detail than I had remembered, so I drew that connection. Miller (1984) identifies form as “metadata” for substance that offers instruction on how the symbolic representation is to be perceived; as a result, “form and substance thus bear a hierarchical relationship to one another” (p. 159). I connected Bazerman’s (2004) “activity system” to a network framework, as I understood the way Bazerman constructed the hierarchical relationship of a genre set as part of a genre system, and a genre system as part of an activity system; Bazerman claims analyses of the relationships among and between these systems provide “a focus on what people are doing and how texts help people do it, rather than on texts as ends themselves” (p. 319). Focusing on how texts help people do things is both active and framing, in that such focus offers a clearer understanding of text (and relations to people) within a framework of text (and related people) functions.

I also threw in a new theoretical position, that of assessment theory from Digital Writing Assessment and Evaluation; in this case, digital compositions are the object of assessment study, assignments that are often networked, whether physically (within computer or cloud-based networks) or through curricula and lesson planning (within class assignment sets). I focused on Crow’s (2013) concern with new media composition networks as surveillant assemblages and drew connections to network frameworks, genre systems, and activity systems. Of special interest from a network perspective are the very practical issues related to shifting understandings of privacy and our discipline’s responsibility to protect the privacy interests of our students. As Crow (2013) notes:

“[I]n the midst of venues that facilitate social networks, and in the midst of increasing technology capabilities by corporations and nation states, conceptions of privacy are changing shape rapidly, and individuals draw on a range of sometimes unconscious rubrics to determine whether they will opt in to systems that require a degree of personal datasharing.”

These unconscious rubrics are likely themselves hierarchically networked, with diminishing levels of privacy concern along a continuum of the perceived importance of the data held in a network.

As a result, I added privacy as a node in my network and started connecting it to other nodes. Given the multidimensional character of data (à la rabbit holes), in which one network serves as a node in larger networks, lower privacy concerns at lower levels of the network might become greater concerns at higher levels. For example, while a collection of course assignments in Google Drive is itself of little privacy concern, information found in those documents, like student ID, name, school, and email address, might find its way, as part of a school’s Google Drive network, into a larger surveillant assemblage maintained by Google. Corporate “Google” might be able to connect those Google Drive documents with emails sent via Gmail, websites visited following Google search results, and ads clicked from Google-affiliated display advertising networks to generate a remarkably accurate, if aggregated, profile of the user. Trust becomes the operative word in the relationship between the user and Google. As a result, I predict adding “trust” as a node in the next update of my mindmap.

In Google We Trust – Trailer from Journeyman Pictures on Vimeo.


Bazerman, C. (2004). Speech acts, genres, and activity systems: How texts organize activities and people. In Bazerman & Prior (Eds.), What writing does and how it does it: An introduction to analyzing texts and textual practices (pp. 309-340). New York, NY: Routledge.

Crow, A. (2013). Managing datacloud decisions and “big data”: Understanding privacy choices in terms of surveillant assemblages. In McKee, H. A., & DeVoss, D. N. (Eds.). Digital writing assessment & evaluation. Logan, UT: Computers and Composition Digital Press/Utah State University Press. Retrieved from

Miller, C. R. (1984). Genre as social action. Quarterly Journal of Speech, 70(2), 151-67.

[Ropes draw patterns: Creative Commons licensed image from flickr user floriebassingbourn]

Annotated Bibliography Entry: Crow in DWAE

Crow, A. (2013). Managing datacloud decisions and “big data”: Understanding privacy choices in terms of surveillant assemblages. In McKee, H. A., & DeVoss, D. N. (Eds.). Digital writing assessment & evaluation. Logan, UT: Computers and Composition Digital Press/Utah State University Press. Retrieved from

Crow addresses the ethics of assessment by defining online composition portfolios as surveillant assemblages, collections of electronic student data that may be used to create increasingly accurate aggregate student profiles. Composition studies seeks assessment techniques, strategies, and technologies that are effective and fair. As big data continues to proliferate, Crow argues that we need to understand and communicate the specific ways that student data are used in surveillance. Our goal should be to move toward the caring end of a surveillance continuum that runs from caring to control.

Google Drawing Visualization of Surveillance Continuum


For-profit assessment platforms, from Google Apps to ePortfolio companies, have troubling sharing and profiling policies that lean more toward control than caring. These controlling policies may remove agency from students, faculty, and composition or English departments and transfer it to university IT departments, university governance, or even corporate entities. Crow concludes that the best option would be a discipline-specific and discipline-informed DIY assessment technology that takes into consideration these real concerns about surveillant assemblages.

The concept of a surveillant assemblage is a network concept. It’s a dynamic collection of student information grown ever larger by the addition of student files. Crow demonstrates that electronic portfolios used for assessment are networked collections of files, collected over time for assessments, that build a (potentially) dangerously accurate profile of the student in aggregate—a profile that can be used for extra-assessment purposes through data mining.

Contemporary networks make privacy a complicated issue and a moving target, one that requires participants to decide what levels of privacy they expect.

“[I]n the midst of venues that facilitate social networks, and in the midst of increasing technology capabilities by corporations and nation states, conceptions of privacy are changing shape rapidly, and individuals draw on a range of sometimes unconscious rubrics to determine whether they will opt in to systems that require a degree of personal datasharing.” (Crow, 2013)

Crow responds that English studies as a (supra)discipline has a responsibility to investigate the effects of surveillant assemblage collections and to maintain student, faculty, and departmental or disciplinary agency in technology and network selection and implementation.

Miller’s genre, Bazerman’s genre set, and Popham’s boundary genre all demonstrate the socially active nature of genre and genre collections. Crow makes similar observations about student files as surveillant data collections: they have and take on a social activity of their own that can’t necessarily be predicted or controlled. As networked action, genre can expand within its framework and, in the case of boundary genre, expand into interdisciplinary spaces. Tension and contradiction (à la Foucault) are continually present in such networks, including surveillant assemblages, and unexpected results—like the superimposition of business on medical practice seen in Popham’s analysis or the potential marketing of aggregated student data from assessment processes and results mentioned in Lundberg’s foreword—can, perhaps likely will, occur if disciplinary agency is not maintained.

I’ve been working on my Twitter identity this past week, and a Tweet from @google about its transparency efforts caught my eye in relationship to Crow’s article.

The tweet links to an entry in Google’s Official Blog, “Shedding some light on Foreign Intelligence Surveillance Act (FISA) requests,” dated Monday, February 3, 2014, which reports that Google is now legally able to share how many FISA requests it receives. The blog entry, in turn, links to Google’s Transparency Report, which “disclose[s] the number of requests we [Google] receive[s] from each government in six-month periods with certain limitations.”

What struck me about the Transparency Report, the blog post, and the Twitter post in relation to Crow’s article is the important role reporting plays in my willingness to contribute to my own surveillant assemblage. I feel a little better knowing that Google reports on such requests in an open and relatively transparent way, even if I also know that Google uses my data to create a profile of me that feeds me advertising and other profile-specific messages. This is my own “sometimes unconscious rubric” to which I turn when making decisions about how much and whether to opt in. The question it raises is whether we give our students, faculty, staff, and prospects agency to make these opt-in decisions, consciously or unconsciously. As a Google Analytics and web metrics consumer, I deal with these especially sensitive issues on a daily basis.

[CC licensed image from flickr user Richard Smith]