
Research articles

Towards Open Annotation: Examples and Experiments

Author:

Lindsey Seatter

University of Victoria, CA
About Lindsey
Lindsey Seatter (BA, MA, Simon Fraser University) is a SSHRC-funded Doctoral Candidate in the Department of English at the University of Victoria. Her research focuses on the British Romantic period and Digital Humanities, with special interest in women writers, the evolution of the novel, reader engagement, and online communities of practice. Seatter has given presentations at national and international conferences on female literary networks, reading Austen with computers, and teaching digital Romanticism. She is also the creator and editor of an open access digital anthology that curates lesser-known, mostly manuscript, works by Romantic women writers. Seatter works as a Research Assistant in the Electronic Textual Cultures Lab and is a Colloquium Co-Chair for the Digital Humanities Summer Institute.

Abstract

This article interrogates how digital text annotation tools and projects facilitate online engagement and virtual communities of practice. With the rise of the Web 2.0 movement and the proliferation of digital resources, annotation has evolved from an isolated practice into a collaborative one. The article unpacks the impact of this shift through an in-depth discussion of five web-based tools and two social reading projects, examining issues of design, usability, and applicability to pedagogical intervention, and underscoring how productive group dynamics can be fostered through digital, social annotation.

How to Cite: Seatter, L., 2019. Towards Open Annotation: Examples and Experiments. KULA: knowledge creation, dissemination, and preservation studies, 3(1), p.12. DOI: http://doi.org/10.5334/kula.49
Published on 27 Feb 2019
Accepted on 13 Sep 2018
Submitted on 01 Jun 2018

Intent and Theory

In his symposium lecture delivered in 2000 at King’s College London, John Unsworth advanced what now stands as the formative theory of scholarly primitives. In his address, Unsworth (2000) explained that primitives, according to Aristotle, are a set of self-understood principles or terms that require no definition or explanation, and out of which all further logic proceeds. Using the theory of primitives as a metaphor, Unsworth isolated seven essential research activities that he claimed to be the foundation of scholarly inquiry. Unsworth argued that these basic functions are unbounded by discipline, time, or theoretical orientation. The scholarly primitives he advanced were: discovering, comparing, referring, sampling, illustrating, representing, and annotating. As a cross-disciplinary, functional primitive, annotation, Unsworth observed, requires linking objects and creating a sense of association ‘between, among, and within’ them.

While the fundamental theory of annotation remains the same today, in the nearly two decades since Unsworth’s presentation, annotation practices have substantially evolved. In 2000, Unsworth argued that there were no platforms or tools that facilitated the effective sharing of annotations. Online annotation tools were clunky, imprecise, and limited in their content applicability. Today, by contrast, the parallel rise of digital humanities research, Web 2.0, and collaborative methods has shifted annotation strategies from static to dynamic. Now, there is an expanse of annotation platforms that can be used for describing a variety of materials and accommodating different annotation strategies, sharing preferences, and needs for cataloguing and posterity (Siemens et al. 2017). Most significantly, annotation practices have become open—existing online in spaces accessible to scholars, students, academic professionals, and citizen scientists. The objective of this brief article is to explore and evaluate a selection of these open tools and projects, specifically considering how they facilitate group annotation practices.

This research builds on Siemens et al.’s (2017) recently published article and annotated bibliography by evaluating and discussing a selection of the scan’s resources. In their article, Siemens et al. theorize annotation in light of our age of rapid, digital communication and real-time, interactive technologies. The accompanying bibliography presents a scan of relevant annotation tools as well as catalogues publications on annotation models, annotation organization/retrieval, and flexible annotation systems. They argue that the dynamic annotation platforms currently available facilitate ‘a network with a thousand entrances to individual works, as well as from those individual works to entire, interconnected fields of inquiry’ (Siemens et al. 2017, 154). The complexity and multiplicity of the contemporary, digital annotation practices discussed by Siemens et al. stand in contrast to the material, foundational, and contained annotation strategies described by Unsworth. However, the basic impulse and outcome of annotation—regardless of medium—remains the same: remarking on materials in order to deepen understanding and facilitate engagement with others. The Web 2.0 movement has expanded these sites of interaction by rendering documents digitally, thereby allowing multiple readers to critically engage with a single object at the same time. Increasing the proximity of user interaction raises the following questions of group dynamics and online communities of practice:

  • How do we create productive, effective, connected communities in the absence of physical interaction with our objects of inquiry and with each other?
  • How can tools and technologies enhance and enrich foundational scholarly practices, including annotation?
  • How do group annotation platforms and projects present an opportunity for pedagogical interventions?

Using a selection of annotation tools and collaborative reading projects, this paper investigates how virtual collaboration is facilitated by web annotation tools and interrogates issues of interface, usability, and online engagement. Because annotation is a scholarly primitive, evaluating how we teach it is as important as understanding how we enact the practice ourselves. Therefore, much of this investigation is focused through the lens of pedagogy, specifically how instructors can use these tools in the classroom as a way of teaching skills such as critical thinking, close reading, and argumentation.

Method

As illustrated by the Siemens et al. (2017) bibliography, digital annotation resources are rich and abundant. However, it was imperative for the purpose of this exploration to focus on a small group of comparable tools and projects in order to ensure a keen evaluation of their objectives, affordances, and challenges. Therefore, it was necessary that I whittle down the extensive list presented in the Siemens et al. document in order to assemble a specific subgenre of annotation resources.

My first decision was to focus on tools designed for textual material, as opposed to visual material. As a literary humanities practitioner and instructor, the form of annotation I most commonly encounter is text annotation. I chose to home in on tools that facilitated collaborative annotation of textual documents and projects that demonstrated group text annotation strategies because they are most closely applicable to my own scholarly work, my work as an instructor, and the work of my colleagues. Second, I chose tools that were free and standalone. Limited classroom funds mean that software or subscription products are often inaccessible given their cost. Furthermore, the diversity of students’ machines (e.g., phones, laptops, institutional desktops) and platforms (e.g., Windows, Apple) raises questions of compatibility with particular plug-ins or downloadable software. Because of these challenges, I decided to focus on open, web-based tools and projects. Finally, I privileged resources that explicitly expressed a pedagogical focus. Many of the online annotation platforms are built for workplace settings. While these can be adapted for the classroom, tools that are built with students and instructors in mind, and with learning objectives as a core tenet of their design, more easily fit pedagogical goals. With these selection criteria—text-based annotation, free, standalone, and pedagogically-focused—I combed through the Siemens et al. (2017) bibliography and compiled a shortlist of five tools (Annotation Studio, Hypothes.is, NowComment, Prism, and Google Docs) and two projects (Open Utopia and Infinite Ulysses).

The remainder of this article provides detailed descriptions and evaluations of these seven selected resources. Each of the resources is evaluated based on three criteria, which measure their overall usefulness and specific applicability to pedagogical annotation exercises:

  • Flexibility: Does the tool ingest multiple file formats and/or allow for the annotation of web-based documents? Does the commentary easily export to other file formats? Does the project facilitate varied and efficient textual annotation?
  • Usability: How intuitive is the design of the tool/website? Are there an adequate number of features (highlighting, annotating, marking) to support various types of commentary and textual interaction? Are there resources provided to educate the user on how to effectively navigate the platform?
  • Sociality: Does the tool allow users to interact through their commentary? Does the platform facilitate both public and private annotation spaces? Does the project use social technologies to create a community of users?

Each of the resources received a rating of ‘excellent,’ ‘good,’ or ‘poor’ for how directly, fully, and effectively they met the criteria. In order to develop a test case for each of the tools, I selected a passage of text from the first chapter of Jane Austen’s novel Pride and Prejudice (1813). Keeping the content identical across platforms facilitated a more accurate comparison of the functionality. On the other hand, it is the diverse content of the two collaborative annotation projects that makes their comparison fruitful and demonstrates the cross-disciplinary applicability of social reading platforms. The aims of these brief reports are to (a) summarize how the tools and projects function and (b) explore how they enable social reading through textual commentary.

Examples: Tools for Collaborative, Text-Based Annotation

Example One: Annotation Studio

Flexibility: Excellent | Usability: Good | Sociality: Excellent

Annotation Studio, developed by Hyperstudio at the Massachusetts Institute of Technology, is a ‘suite of collaborative web-based annotation tools’ (Annotation Studio Team 2017, n.p.). Annotation Studio is specifically designed for students; it requires no previous knowledge of TEI—a commonly required skill for text annotation programs—nor does it require an in-depth understanding of literary markup techniques.

When users log in to their free Annotation Studio account, they are taken directly to a ‘Dashboard’ page where they can access public documents, a list of their annotations, and an address book of people participating in their same class or project. Users create new documents simply by uploading a DOC, TXT, or PDF file or pasting the text into the built-in text editor. Annotations are created by highlighting the applicable passage of text and selecting the writing icon that hovers above it. Clicking this icon opens a text box where users can write their annotation (Figure 1); annotations can be made private or public. Users can also filter annotations by user, portion of document, or tags. Tagging is folksonomic—that is, the vocabulary is user-generated.

Figure 1 

Annotation text box in Annotation Studio.

The greatest advantage of Annotation Studio is its applicability to classroom use and characteristic humanities reading assignments. The website foregrounds pedagogy and provides resources—case studies, student interviews, instructor interviews, and assessment reflections—that assist users in optimizing their use of the tool. The ability to create groups of individuals that have access to specific, private documents facilitates a productive and efficient collaboration space. By allowing instructors to curate documents for groups of students, Annotation Studio lends itself to the types of social annotation activities that fit seamlessly into a pedagogical model.

The simplicity of the tool is also a distinct advantage. Students register, upload a document, and begin annotating within a few minutes. This simple, user-friendly workflow is helpful for students and instructors, who may be unfamiliar with similar tools. However, this simplistic design does sacrifice the breadth of abilities available on other platforms. Annotations can only appear in a single highlighted colour, which diminishes the effectiveness of some compare and contrast strategies. Further, because tagging is folksonomy based, the ability to sort by tag is almost useless unless the group predetermines a set vocabulary. I would encourage those using Annotation Studio in the classroom to develop a tagging vocabulary as a pedagogical exercise before diving into collaborative markup. While the Annotation Studio system is generally simple and straightforward, not being able to edit or export published documents forced me to upload texts multiple times as I learned to navigate the site.
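One low-tech way to operationalize that suggestion is to agree on a controlled vocabulary before markup begins and check student tags against it. A minimal sketch in Python (the vocabulary terms are hypothetical examples, not drawn from Annotation Studio):

```python
# A class-agreed controlled vocabulary (terms are illustrative only).
CONTROLLED_TAGS = {"irony", "characterization", "setting", "dialogue"}

def check_tags(tags):
    """Split a student's folksonomy tags into accepted terms and
    off-vocabulary terms that need discussion before they proliferate."""
    accepted = [t for t in tags if t.lower() in CONTROLLED_TAGS]
    off_vocabulary = [t for t in tags if t.lower() not in CONTROLLED_TAGS]
    return accepted, off_vocabulary

accepted, off = check_tags(["Irony", "vibes"])
print(accepted)  # ['Irony']
print(off)       # ['vibes']
```

Running this check as part of a tagging exercise turns vocabulary drift into a teaching moment rather than a sorting problem later.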

Example Two: Hypothes.is

Flexibility: Excellent | Usability: Excellent | Sociality: Excellent

Hypothes.is was originally launched in July 2011 by founder and CEO Dan Whaley. The mission of Hypothes.is is to ‘enable a conversation over the world’s knowledge’ (Hypothes.is Team, n.d.). This tool aims to foster community through its free, open, non-profit platform. Hypothes.is understands the importance of annotation to pedagogy and integrates particular features for classroom use.

To get started with the web version of Hypothes.is, users must create a free account and download the Google Chrome extension. The extension facilitates group annotations of websites, articles, or online PDF documents. When marking up the document, users have the choice between highlighting and annotating; both types of notes are collected in the sidebar (Figure 2). Users can select whether to post their notes as public or to catalogue them in a private group. One of the most appealing features is the ability for users not only to create their own notes but to reply to the annotations left by others. Once a user has completed their markup, the annotated page can be shared via an active link. All annotations and their associated web pages are also stored on the user account page on the Hypothes.is site.

Figure 2 

Annotation side bar in Hypothes.is plug-in.

Hypothes.is is an incredibly flexible tool with a wide user base. The ability to layer annotation tools onto any website makes annotating ephemeral documents—like current news articles or Twitter streams—a viable classroom assignment. Given the mutable nature of digital content, annotating these types of documents in print is impractical. Hypothes.is, however, provides a solution for including variable online content in critical, close reading exercises, thereby enriching and diversifying foundational pedagogical strategies. Further, the opportunity for students to engage not only with each other, through the ‘reply’ feature, but also with a wider audience is particularly useful for bridging the gap between university learners and the community. By facilitating community beyond the walls of the classroom, the tool emphasizes the impact of critical thinking and the transfer of close reading skills to wider engagement.

Hypothes.is’s intuitive interface is supplemented with ‘How To’ instructions pinned to the top of the sidebar. This makes discovering the different functions of the tool simple and alleviates concerns about the learning curve associated with any new technology. The challenge with Hypothes.is is that it is a web extension compatible with a single browser. This means that students must all be using Google Chrome in order to participate in Hypothes.is annotations. Google Chrome is free to download, but needing to do so does add another step. While more browser extensions are in development, this does limit the web version of the tool.
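Beyond the browser extension, Hypothes.is also makes public annotations available as JSON through a web API. The sketch below parses the general shape of a search response; the field names and values here are a hedged approximation for illustration, so verify them against the current API documentation before relying on them.

```python
import json

# A trimmed, hypothetical search response: a total count plus annotation rows.
sample = json.loads("""
{
  "total": 2,
  "rows": [
    {"user": "acct:reader1@hypothes.is", "text": "First impressions matter.",
     "tags": ["irony"], "uri": "https://example.com/pride-and-prejudice"},
    {"user": "acct:reader2@hypothes.is", "text": "Note the narrator's irony.",
     "tags": ["style"], "uri": "https://example.com/pride-and-prejudice"}
  ]
}
""")

def public_notes(payload):
    """Collect (user, text) pairs from a search response for classroom review."""
    return [(row["user"], row["text"]) for row in payload["rows"]]

for user, text in public_notes(sample):
    print(f"{user}: {text}")
```

A workflow like this could let an instructor gather a class's public annotations on an assigned article without visiting each page in the browser.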

Example Three: NowComment

Flexibility: Poor | Usability: Poor | Sociality: Good

NowComment, developed by Fairness.com, is a ‘sophisticated group collaboration app for the discussion and annotation of online documents’ (Fairness.com LLC 2018). The tool facilitates threaded conversations in the context of a source document. NowComment prides itself on being a ‘fast, powerful, feature-rich’ platform for online discussion no matter the size of the collaboration team (Fairness.com LLC 2018).

Users begin by creating a free NowComment account. The hub of this account is the ‘My Library’ page where users can see all of their annotated documents, as well as access their private groups and view commonly used tags. NowComment supports the uploading of HTML web content, DOC files, PDF files, or text copied and pasted into the application’s text editor. Because NowComment has been designed not only with students in mind but also with an eye towards working professionals, there are many additional features—timestamps, comment privacy, metadata, customizable document legends—that can be layered onto any given text. Once the document is created, users can comment on specific document paragraphs or on the document generally (Figure 3). The highlight feature includes a wide array of colours, and users are able to invite others to collaborate on any of their private documents.

Figure 3 

Annotation interface in NowComment.

While many of the additional features available in NowComment do not seem applicable to a classroom setting, the ability to develop a built-in tag vocabulary and customize the meaning of each highlight colour would be useful for pedagogical collaboration purposes. The ability to generate metadata and a tailored document legend were features that appealed more to me as a researcher than as an instructor. While outside the scope of this exploration, NowComment does also facilitate image and video annotation, which is a useful additional feature for those working at the intersection of text and image or in the field of media studies.

The problem with NowComment is that it is designed for paragraph-level commentary rather than the word-level commentary often required for close reading. Not being able to highlight an individual word or phrase was something I found particularly challenging, and it made my document notations clumsy. This problem would certainly be exacerbated in a collaborative setting where multiple people would be adding their comments to the document; group annotation requires precision. Additionally, the interface was not nearly as intuitive as that of Annotation Studio or Hypothes.is. It was unclear what functions like ‘Comments: Full,’ ‘Comments: Summaries,’ and ‘Comments: Sorted’ meant and how they could be useful for collaborative annotation.
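The gap described here comes down to how an annotation anchors to the text: a paragraph index alone cannot single out a phrase, while character offsets can. A schematic contrast (this is an illustrative sketch, not NowComment's actual data model):

```python
from dataclasses import dataclass

@dataclass
class ParagraphAnchor:
    """Coarse anchor: the comment attaches to a whole paragraph."""
    paragraph: int

@dataclass
class SpanAnchor:
    """Fine anchor: character offsets pin the comment to a word or phrase."""
    paragraph: int
    start: int
    end: int

paragraphs = ["It is a truth universally acknowledged, that a single man..."]

# A span anchor can target exactly the phrase a close reading discusses;
# a paragraph anchor can only gesture at the surrounding block.
anchor = SpanAnchor(paragraph=0, start=8, end=38)
print(paragraphs[anchor.paragraph][anchor.start:anchor.end])
```

Word-level tools like Annotation Studio and Hypothes.is effectively store the finer kind of anchor, which is why their notations feel precise where NowComment's feel clumsy.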

Example Four: Prism

Flexibility: Poor | Usability: Excellent | Sociality: Good

Prism is a tool for crowdsourcing interpretation created by the Praxis Program at the Scholars’ Lab at the University of Virginia. Prism is a simple annotation program that allows users to highlight vocabulary and sort it into predetermined categories, or facets. The platform is designed to reveal the ‘patterns that exist in the subjective experience of reading a text’ (Praxis Program and Scholars’ Lab, n.d.). What is unique about Prism is that annotations are turned into visualizations, which adds another dimension to this critical practice.

To create a prism, users must sign up for a free account. Text documents are copied and pasted into the built-in text editor. At this stage, users must also define their variables; there is a choice of up to three facets. Once the document is finalized, users can highlight the text using one of their pre-defined variables (Figure 4). There is also an easy erase function for errors. Once the user has finished highlighting the document, they can switch to visualization view. They can choose to view the document as a ‘Winning Facet Visualization,’ where each word is coloured according to the facet into which the group most often sorted it, or as a ‘Font Size Visualization,’ where the words in a facet are displayed in a font size relative to the number of times the group categorized them.

Figure 4 

Facet highlighting in Prism.
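Both visualizations aggregate the group's highlights word by word. A sketch of the underlying arithmetic (the facet names and the linear font-scaling formula are my assumptions for illustration, not Prism's published internals):

```python
from collections import Counter

def winning_facet(votes):
    """The facet a word is coloured by: the one most readers assigned it."""
    return Counter(votes).most_common(1)[0][0]

def font_size(times_marked, total_readers, base=12.0, scale=24.0):
    """Display size grows with the share of readers who categorized the word."""
    if total_readers == 0:
        return base
    return base + scale * (times_marked / total_readers)

# Hypothetical votes for one word, each from a reader's predefined facets.
votes = ["irony", "irony", "narration"]
print(winning_facet(votes))       # the majority facet wins the colour
print(font_size(len(votes), 10))  # larger than the 12.0 baseline
```

Even this toy version shows why Prism's output reads so clearly: each word reduces to one colour and one size, with disagreement absorbed into the tally.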

The brilliance of Prism is in its simplicity. It is certainly the easiest tool to use and understand. The clean, minimalist design of the interface is aesthetically pleasing and highlights the subject of the annotation over the various features of the tool. The simple, visually-oriented features of Prism make it an ideal tool for classroom collaboration. In addition, there are a variety of instructor resources, including hundreds of open prisms on the website and different lesson plans collected on external sites like Pedagogy Toolkit for English (Christie n.d.), which makes designing annotation activities for this tool uncomplicated. Recently, Prism was discovered by the K-12 community and is now being widely used by educators outside of post-secondary institutions.

The downside of Prism is that it does not support text annotations; users can only highlight words and classify them into the predetermined categories. When marking up my text, I found the mutual exclusivity of categories limiting, and struggled to categorize items without providing a rationale for doing so. The design of this tool definitely restricts the extent of the user’s critical intervention and does not provide space for extended conversation between participants. If instructors want to encourage discussion through commentary, this tool is not a functional option. There is very limited space on the platform for interaction because multiple people cannot contribute to the same project at the same time. Instead, Prism is a great spot to introduce and model close reading because of its emphasis on bold, simple visualizations.

Example Five: Google Docs

Flexibility: Good | Usability: Excellent | Sociality: Excellent

Google Docs is part of the Google Drive suite of online drafting, collating, and presenting programs. Specifically, Google Docs is a web-based document creation platform with smart editing and sharing features that support multiple users working on the same document in real time.

To begin working in Google Docs, users must first create a free Google account. Once logged in, the user can create a document (either blank or template) or upload an existing document from their computer. One benefit of Google Docs is that it is compatible with Microsoft Word, which is the most common word processing software. Once the text is uploaded and converted to Google Doc format, the user can add annotations using the ‘Comment’ function. Comments are added by highlighting a passage of text and clicking the text bubble icon on the function toolbar. This highlights the text in yellow and subsequently prompts a text box to open where commentary can be written (Figure 5). All comments display the user name and include a time stamp. Google Docs documents can be made private or public and can be shared via link or through a group file.

Figure 5 

Annotation interface in Google Docs.

Google Docs has become the standard shared writing platform, and the proliferation of its use makes it an easy choice for practitioners because students likely already understand how to manipulate the technology and navigate the platform interface. For those who are not already users of Google Docs, the fact that its design mirrors the popular Microsoft Word—with its upper action toolbar (e.g., ‘File,’ ‘Edit,’ ‘Insert,’ etc.) and lower function toolbar (e.g., font, type size, bullet list, etc.)—makes the transition from local program to web program unproblematic. The familiarity of this interface makes it the easiest to navigate and eliminates the platform learning curve almost entirely.

However, while the interface aesthetics are exceptional, the comment function does not appear as central to the platform’s purpose or design. Google Docs is much more a collaborative writing space than it is a collaborative commenting space. The comment icon is obscure and locating the function using the action toolbar requires users to look under the ‘Insert’ heading, which is not an entirely obvious selection. Further, annotations appear to be treated more as disposable notes attached to the body of the text rather than equally valued commentary. This is demonstrated by the ‘Resolve’ function, which allows users to delete annotations once they have been addressed. By foregrounding the dismissal of annotations, the Google Docs platform privileges the use of comments to alert other document contributors to questions or errors, instead of using them as markers to enhance the body of the text. That said, one redeeming quality of the comment function is that users can ‘Reply’ to each other. This feature automatically threads the comments, which facilitates discussions, makes conversations easy to read, and enhances the platform’s sociability.
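The comment model described here, threaded replies plus a ‘Resolve’ action that dismisses the whole thread, can be sketched as a small data structure (field and method names are illustrative, not Google's API):

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    """A Docs-style comment: replies thread beneath it; resolving hides it."""
    author: str
    text: str
    resolved: bool = False
    replies: list = field(default_factory=list)

    def reply(self, author, text):
        """Append a threaded reply beneath this comment."""
        self.replies.append(Comment(author, text))

    def resolve(self):
        """Dismiss the thread: the behaviour that treats notes as disposable."""
        self.resolved = True

thread = Comment("instructor", "Can you support this claim with a citation?")
thread.reply("student", "Added the page reference.")
thread.resolve()
print(len(thread.replies), thread.resolved)  # 1 True
```

Notice that resolution applies to the thread as a whole, which is exactly why the design reads as error-triage rather than durable commentary.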

Experiments: Collaborative Annotation Projects

Example One: Open Utopia

Flexibility: Good | Usability: Good | Sociality: Excellent

Created by Stephen Duncombe, Open Utopia is a digital edition of Thomas More’s Utopia that enacts the work’s primary precept—all property is common property—through its form and facilitation of group annotation (Duncombe 2018). Duncombe’s edition of Utopia is open to reading, copying, and modification. Its greatest value to the culture of openness is its mounting of a social edition of the text. Open Utopia is a pilot project of Social Book, a social reading platform developed by the Institute for the Future of the Book. This platform allows users to read and engage in an open community—commenting on the original text, adding to Duncombe’s commentary, and entering into conversations with one another.

For users to participate in Open Utopia they must create a free account. Once validated, the account gives them access to Duncombe’s social edition where they can immediately begin annotating. The annotation interface comprises three panes: the account pane on the left-hand side, where users access their profile, bookshelf, and the ‘Help’ menu; the reading pane in the middle, where the text of Utopia appears; and the comment pane on the right-hand side, where the user’s annotations and those of other readers appear (Figure 6). Text that has been commented on appears highlighted in blue; the more annotations on a particular word or phrase, the darker the blue colour becomes. This colour opacity variation helps to signal passages of importance or debate within the community.

Figure 6 

Reading and commenting interface in Open Utopia.
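The ‘darker blue with more annotations’ behaviour amounts to mapping a passage's comment count onto highlight opacity. A minimal sketch (the floor, ceiling, and linear mapping are my assumptions, not Social Book's published values):

```python
def highlight_opacity(n_comments, max_comments, floor=0.15, ceiling=0.85):
    """Map a passage's comment count to a blue-highlight opacity:
    lightly annotated text stays pale; heavily debated text darkens."""
    if max_comments <= 0 or n_comments <= 0:
        return 0.0  # unannotated text gets no highlight
    share = min(n_comments, max_comments) / max_comments
    return floor + (ceiling - floor) * share

print(highlight_opacity(1, 10))   # pale blue
print(highlight_opacity(10, 10))  # darkest blue
```

Whatever the platform's actual constants, the pedagogical point survives the sketch: the heat-map effect turns the community's attention itself into a navigational cue.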

The Social Book interface, particularly the edition of Utopia, is user-friendly, clean, and efficient. While the vast majority of the platform is intuitive, the useful ‘Help’ menu clarifies functions and features for those unsure of how to approach the social edition. The varying opacity of highlighting and the threading of comments encourage users to converse with one another and help to facilitate a true community of readers. Open Utopia also allows for the creation of reading groups. These groups can be designated open or closed, making this social edition easy to use in a classroom or private setting.

Open Utopia includes hyperlinked footnotes to provide useful context and incorporates links to outside source materials, like Wikipedia, for users’ further reading. All of this content makes Open Utopia a particularly robust edition. However, the reading pane is quite narrow, which necessitates a lot of clicking. Further, there is no clear table of contents, so navigation can be confusing. Finally, there are still some comments on the text that were clearly left by the creator or developers in order to test the Social Book features. Encountering these comments interrupts the reading experience.

Example Two: Infinite Ulysses

Flexibility: Good | Usability: Excellent | Sociality: Good

Infinite Ulysses (2016) was the dissertation project of Amanda Visconti, who completed her doctoral degree at the Maryland Institute for Technology in the Humanities in April 2015. Infinite Ulysses was an open, social edition of James Joyce’s modernist novel Ulysses. This project was recently archived and is no longer active. However, Visconti’s stellar documentation of the site makes it possible to understand how the tool was designed and how it promoted an online annotation community.

When the edition was live, users would create an account and log in to the reading interface. The interface was separated into two panes: the reading pane, which was located on the left side of the screen and presented Joyce’s original text, and the annotation pane, which was located on the right side of the screen and displayed annotations (Figure 7). To annotate, users selected the text and added their comment or question to the popup textbox. Users could also add tags to their notations using folksonomy tagging. Submitted annotations appeared in the annotation pane. Notations could be sorted by creation date, reader, or rating, or filtered by tag, which allowed readers the flexibility to tailor the annotations that appeared to their particular interests. As with Open Utopia, words and phrases with more annotations would appear in a deeper highlight (in this case, yellow) while text attached to a single comment would appear in a paler colour.

Figure 7 

Reading and commenting interface in Infinite Ulysses.
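The sorting and filtering options described above are straightforward to model. A sketch over hypothetical annotation records (the readers, ratings, and tags are invented for illustration, not drawn from the archived site):

```python
from datetime import date

# Invented annotation records mirroring the fields the edition exposed.
annotations = [
    {"reader": "stephen", "rating": 4, "created": date(2015, 4, 1), "tags": ["allusion"]},
    {"reader": "leopold", "rating": 9, "created": date(2015, 5, 2), "tags": ["dublin"]},
    {"reader": "molly", "rating": 7, "created": date(2015, 4, 20), "tags": ["allusion", "music"]},
]

def sort_annotations(notes, key):
    """Sort ascending by creation date or reader, descending by rating."""
    return sorted(notes, key=lambda n: n[key], reverse=(key == "rating"))

def filter_by_tag(notes, tag):
    """Keep only annotations carrying a given folksonomy tag."""
    return [n for n in notes if tag in n["tags"]]

print(sort_annotations(annotations, "rating")[0]["reader"])           # highest-rated first
print([n["reader"] for n in filter_by_tag(annotations, "allusion")])  # allusion-tagged readers
```

Combining a sort with a tag filter is what let each reader curate a personal margin out of the crowd's collective one.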

Infinite Ulysses struck a balance between facilitating active community and maintaining an interface that was very conducive to reading online. Features like page bookmarking and chapter navigation took into consideration that no reader would be devouring Joyce’s tome in a single sitting and that they would necessarily need to return to the text later. Importantly, the project displayed a deep awareness of its community orientation and worked to facilitate a positive environment by developing community policies and an accessibility strategy. Infinite Ulysses was the only resource examined in this paper that foregrounded accessibility. Visconti’s attention to group dynamics is commendable and should certainly be held as the standard.

The blessing and the curse of Infinite Ulysses was the lack of privacy settings. While this worked to create an open reading community, it would have been challenging to wrangle comments from a designated group of readers, such as a class of students. Filtering options, group settings, or the ability to create private reading groups within the public reading space would have greatly improved the edition’s applicability to the classroom. Further, while users could interact with each other’s comments through ‘favouriting’ and ‘liking’ annotations, the lack of a ‘reply’ feature diminished the conversational potential of this social edition. A function that enables users to dialogue with one another—as is supported by Hypothes.is—would have enhanced the effectiveness of the community and made the project better suited for pedagogical initiatives.

Conclusion and Next Steps

The development of digital technologies and the growth of online community environments have promoted the transformation of annotation practices from static to social. These platforms enable meaningful, engaged, collaborative scholarship in diverse online environments. Overall, Hypothes.is is the most functional tool for pedagogical annotation exercises whereas NowComment is least suited for these purposes. Most of the tools possess navigable interfaces and intuitive annotation technologies; however, it is the breadth of applications—facilitated by the browser plug-in technology—that really sets Hypothes.is apart from the other resources included in this survey. The inability to generate granular commentary through the NowComment interface makes it a less desirable tool for social annotation because it lacks the precision and design affordances necessary for successful online interaction. Annotation Studio and Google Docs are both ideal environments for creating private, shared classroom documents because they focus on facilitating textual commentary in a controlled environment. On the other hand, Prism is ideal for open lectures or in-class group exercises where strong visuals outweigh the importance of varied functionality. Both Open Utopia and Infinite Ulysses demonstrate the potential of social reading for creating community and facilitating extended, online engagement with texts—transforming an oft-times solitary activity into an interactive one. While neither project presents both an ideal interface and a fully functional suite of annotation tools, considering them together gestures towards the technologies necessary to build an open, online reading platform that is at once flexible, usable, and social.

However, it is imperative, in the open annotation community's next steps, that we acknowledge the lingering exclusivity of many 'open' platforms. Universal design and accessibility need to become integral to web-based tools, yet such functionality was not evident in the majority of the platforms explored in this paper. Developing features that facilitate an inclusive learning environment would work towards transforming aspirationally open technologies into more objectively open ones. This is a key consideration as we work to integrate these technologies into the classroom.

This snapshot of collaborative annotation tools and projects demonstrates that many open tools are comprehensive and uniquely designed for effective, pedagogical purposes—from the simple, comparative structure of Prism to the private, collaborative annotation spaces of Annotation Studio. Further, specific reading environments curated around individual texts facilitate collaborative, critical reading that merges various publics and enhances isolated scholarly practice. Overall, engaging with open annotation platforms—in the classroom, in our research, and in our community—adds yet another unbounded dimension to Unsworth’s scholarly primitive: space.

Acknowledgements

I would like to thank my team at the Electronic Textual Cultures Lab and the members of the Implementing New Knowledge Environments research partnership for their support of this work. I would also like to thank my peer reviewer for their insightful feedback.

Competing Interests

The author has no competing interests to declare.

References

  1. Annotation Studio Team. 2017. “Annotation Studio.” Accessed June 1, 2018. www.annotationstudio.org. 

  2. Christie, Alex. n.d. “Pedagogy Toolkit for English: Prism.” Accessed June 1, 2018. www.pedagogy-toolkit.org/tools/PRISM.html. 

  3. Duncombe, Stephen. 2018. “Open Utopia.” Accessed June 1, 2018. www.theopenutopia.org/about/. 

  4. Fairness.com LLC. 2018. “NowComment.” Accessed June 1, 2018. https://nowcomment.com. 

  5. Google Drive. n.d. “Google Docs.” Accessed September 6, 2018. https://www.google.ca/docs/about/. 

  6. Hypothes.is Team. n.d. “Hypothes.is.” Accessed June 1, 2018. https://web.hypothes.is/about/. 

  7. Praxis Program, and Scholars’ Lab. n.d. “Prism.” Accessed June 1, 2018. http://prism.scholarslab.org. 

  8. Siemens, Ray, Alyssa Arbuckle, Lindsey Seatter, Randa El Khatib, and Tracey El Hajj, with the ETCL Research Group. 2017. “The Value of Plurality in ‘The Network with A Thousand Entrances.’” International Journal of Humanities and Arts Computing, 11(2): 153–173. DOI: https://doi.org/10.3366/ijhac.2017.0190 

  9. Unsworth, John. 2000. “Scholarly Primitives: What Methods Do Humanities Researchers Have in Common, and How Might Our Tools Reflect This?” Paper presented at the Symposium on Humanities Computing: Formal Methods, Experimental Practice at King’s College, London, May 13. Accessed June 1, 2018. www.people.virginia.edu/~jmu2m/Kings.5-00/primitives.html. 

  10. Visconti, Amanda. 2016. “Infinite Ulysses.” Accessed June 1, 2018. www.infiniteulysses.com. 
