RESEARCH ARTICLE

Is Digital Scholarship Meaningful?: A Campus Study Tracking Multidisciplinary Perceptions

Nickoal Eichmann-Kalwara

University of Colorado Boulder

Frederick Carey

University of Colorado Boulder

Melissa Hart Cantrell

University of Colorado Boulder

Stacy Gilbert

University of Colorado Boulder

Philip B. White

University of Colorado Boulder

Katherine Mika

Harvard University

Increased computational and multimodal approaches to research over the past decades have enabled scholars and learners to forge creative avenues of inquiry, adopt new methodological approaches, and interrogate information in innovative ways. As such, academic libraries have begun to offer a suite of services to support these digitally inflected and data-intense research strategies. These supports, dubbed digital scholarship services in the library profession, break traditional disciplinary boundaries and highlight the methodological significance of research inquiry. Externally, however, these practices appear as domain-specific niches, e.g., digital history or digital humanities in humanities disciplines, e-science and e-research in STEM, and e-social science or computational social science in social science disciplines. The authors conducted a study examining the meaningfulness of the term digital scholarship within the local context at the University of Colorado Boulder by investigating how interpretations of digital scholarship vary among graduate students, faculty, and other researchers. Forty-three percent of the definitions mentioned research processes or methods as part of digital scholarship. Faculty and staff declined or were unable to define digital scholarship more often than graduate students or post-doctoral researchers. Therefore, digital scholarship as a term is not meaningful to all researchers. We recommend that librarians inflect their practices with the understanding that researchers’ and library users’ perceptions of digital scholarship vary greatly across contexts.

Keywords: digital scholarship; academic libraries; qualitative data; needs assessment

 

How to cite this article: Eichmann-Kalwara, Nickoal, Frederick Carey, Melissa Hart Cantrell, Stacy Gilbert, Philip B. White, and Katherine Mika. 2021. Is Digital Scholarship Meaningful?: A Campus Study Tracking Multidisciplinary Perceptions. KULA: Knowledge Creation, Dissemination, and Preservation Studies 5(1). https://doi.org/10.18357/kula.130

Submitted: 3 August 2020 Accepted: 9 October 2020 Published: 30 December 2021

Copyright: © 2021 The Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See http://creativecommons.org/licenses/by/4.0/.

 

Introduction

Digital scholarship (DS) support in academic libraries has grown significantly over the last decade and has been associated primarily with scholarly communication and digital humanities (DH) support. As a discipline-agnostic kaleidoscope of practices used throughout the research lifecycle, DS enables scholars and learners to experiment and investigate research questions in new ways and on a larger scale. Within the context of the University Libraries at the University of Colorado Boulder (CU Boulder), DS is characterized as the use of computational and multimodal approaches to explore and/or answer research questions in new and innovative ways.

The University Libraries at CU Boulder sits at an important nexus, both facilitating the development of DS and supporting ongoing engagement with it. Based on the findings of a 2013 DH needs assessment at CU Boulder (Lindquist et al. 2013), the University Libraries identified that dedicated infrastructure and functional specialists familiar with emergent computational research methods and open scholarship practices added significant value to its suite of support and programming. This led to the founding of a new Libraries department and a cross-campus research center, the Center for Research Data and Digital Scholarship (CRDDS), to enhance support for digital research, the data lifecycle, and open research practices. Librarians and information professionals, working in conjunction with the CRDDS, offer training, education, and outreach in this shifting scholarly ecosystem. We have learned that supporting DS exploration and creation, whether for research or teaching, requires attention to local context and regular reassessment, since the nature of open and digital scholarship within the research lifecycle continues to evolve.

We, therefore, distributed a campus-wide survey (Appendix A) in April 2018 to better understand how researchers across disciplines understand the term digital scholarship and how to improve outreach, engagement, and communication methods between the University Libraries and its users. This paper uses that survey data to investigate how the definition of DS varies among CU Boulder scholars by their academic status, discipline, and familiarity and engagement with DS. Our findings indicate that digital scholarship is not always a meaningful term for researchers, even as DS strategies and tools become increasingly integrated into university curricula and research across disciplines. This lack of cross-disciplinary understanding has implications for library professionals at CU Boulder, who distinguish DS from “analog” or “traditional” research methods to improve support for users, but whose users do not necessarily see DS as distinct from “traditional” scholarship and often define their work in more discipline-specific terms. Based on this local context, we recommend that libraries use the term digital scholarship internally—to inform decisions about library infrastructure and services—and externally when referring to research methods, tools, publishing, and pedagogy on a large scale for diverse library users. However, we recommend using more discipline-specific language for outreach and education when engaging directly with researchers.

Literature Review

Our literature review yields two main findings: 1) there are two main definitions of DS, one used by researchers and one used by libraries; and 2) these differing definitions make it difficult for libraries to describe and promote services intended to support researchers doing DS, particularly since researchers often use discipline-specific terms in lieu of the term digital scholarship. In 1999, Kelly Russell, Ellis Weinberger, and Andy Stone provided one of the first uses of the term digital scholarship, writing in the context of preserving digitally created scholarship, including “databases, electronic journals, and the hypertext links of the world wide web [which] have become standard fare in academia” (271). Since then, DS has evolved into a much broader umbrella term at many institutions, with two apparent approaches emerging within libraries. The first approach defines DS by interdisciplinary methodologies that employ computational and digital research methods, often reflected in DH. For example, librarian and DH scholar John Cox describes DS as “the application of digital technologies and content to enable new methods of enquiry, often involving large-scale manipulation of data” (2018, 230, emphasis added). A second approach to DS is even broader in scope and addresses the entire research data lifecycle, including scholarly publishing and data management planning and preservation, in addition to digital research methodologies. This approach aligns with the 2011 Scholarly Communication Institute’s definition of DS as “the use of digital evidence and method, digital authoring, digital publishing, digital curation and preservation, and digital use and reuse of scholarship” (Rumsey 2011, emphasis added). In the Association of Research Libraries (ARL) SPEC Kit 350: Supporting Digital Scholarship, Rikk Mulligan similarly considers DS as the “use of digital evidence and method, digital authoring, digital publishing . . . and digital use and reuse of scholarship,” including research and publication of “print and web-based text, video, audio, still images, annotation, and new modes of multi-threaded, nonlinear discourse” (2016, 2).

These two conceptual approaches to DS manifest structurally in the language used by different academic libraries and the various DS-engaged units within them. A library’s use of particular terminology reflects how DS is situated and regarded locally and may serve as a tactical convenience in institutionalizing DS. For instance, despite their overlaps with each other and with DS, data science and digital humanities may be more recognizable or understandable terms when seeking funding opportunities and buy-in for capacity building at certain academic institutions. Further, as Mulligan notes, “this battle over definition can also be a battle for recognition and is one of the initial challenges for promoting and supporting DS in many of our institutions” (2016, 2). As such, while DS is broad enough to include multiple approaches and may serve as an inclusive catch-all, it also presents challenges to libraries in defining the scope of, promoting, and supporting DS, especially when working with researchers who do not necessarily define their work as DS.

Indeed, academic departments and disciplines tend to apply their own alternative terms for and understandings of DS. One systematic study of digital scholarship articles identified three main disciplinary strands of research: digital libraries from the social sciences, networked scholarship from information science and computer science, and DH from the humanities; the low rate of cross-citation between these three areas suggests that each field is fragmented and sees DS through its own disciplinary lens (Raffaghelli et al. 2016). Additionally, there is e-Research, an older term used synonymously with digital scholarship, with subsets including e-social science for the social sciences and e-Science for the sciences (Brandt 2007). William H. Dutton and Eric T. Meyer’s 2009 exploratory study of social scientists’ attitudes towards e-Research found that recent graduates were the most interested in e-social science. In a STEM context, e-Science is often associated with cloud computing and related technologies (Casacuberta and Vallverdú 2013; Yang et al. 2014).1 However, while a great deal of literature has explored the definition of DH (Terras, Nyhan, and Vanhoutte 2013), an exhaustive search found no articles that examine the definitions of e-research, e-social science, e-science, digital libraries, or networked scholarship, despite the popularity of these terms.

How researchers define DS as a more general practice outside of libraries remains underexplored. Over the years, campus needs assessment surveys have demonstrated discord between researchers’ and librarians’ definitions. For instance, Pamela Price Mitchem and Dea Miller Rice note that at Appalachian State University, “even though many faculty were doing projects that are considered digital scholarship, they did not think of their undertakings in those terms. There appeared to be little understanding regarding what constitutes digital scholarship” (2017, 833). Thea Lindquist et al. (2013) reached similar findings in a DH needs assessment survey at CU Boulder. In fact, some scholars wonder if the term digital scholarship applies to all research and is, therefore, too broad to offer clear meaning. As Clifford A. Lynch (2014) argues, “digital scholarship is an incredibly awkward term that people have come up with to describe a complex group of developments. The phrase is really, at some basic level, nonsensical. After all, scholarship is scholarship.”

Acknowledging that the phrase digital scholarship seems meaningful to some library professionals yet obscure to many researchers, our study addresses a gap in the scholarship by exploring researchers’ perspectives on DS at an R1 institution. Because definitions of and approaches to DS vary across libraries and the field more broadly, each library needs to know how DS is understood at its own institution; only by understanding the local context and adjusting to the vocabulary of specific library users can we build and strengthen community, develop strategic partnerships, and create sustainable infrastructure and support around DS. Thus, this paper seeks to answer two questions: 1) How does the definition of DS vary among CU Boulder scholars according to their academic status, discipline, and familiarity and engagement with DS? 2) What are the implications of these variations for library services?

Methods

Survey Design

A team of CU Boulder librarians developed and distributed a campus survey in April 2018 to understand: 1) how researchers on CU Boulder’s campus perceive DS, 2) how those perceptions impact the University Libraries’ outreach, engagement, and communication efforts, and 3) how evolving perspectives on DS shape the practice of librarianship. As participants might not use the term digital scholarship to describe their own research, the survey was distributed as a needs assessment on “digital research and tools.” Participants used a Likert scale to assess their current engagement with and interest in learning digital methods and tools before providing their own definitions of DS.

Defining Digital Scholarship

At the end of the survey, participants were asked a set of three questions about the definition of DS in order to assess their understanding of and engagement with the concept. The survey refrained from using the phrase digital scholarship until this final “Defining Digital Scholarship” section; by neither using nor defining the term earlier in the survey, the authors intended to gather participant definitions that were not heavily influenced by the librarians’ views. Because taking the survey had already prompted participants to reflect on their DS practices, asking for a definition of DS at the end of the survey was the best method for collecting additional data about perceptions of the concept.

In this section, the survey stated, “All of the digital research methods covered in this survey are considered forms of Digital Scholarship. How do you define Digital Scholarship?” This question was followed by a text box in which participants could write their responses. Next, the survey asked two follow-up questions. The first asked, “Before taking this survey, how familiar were you with the term Digital Scholarship?” and gave participants the option to respond unfamiliar, somewhat familiar, or very familiar. The second question asked, “To what extent do you consider yourself engaged in Digital Scholarship?” and provided participants with the option to answer not at all, some engagement, or engaged (integral to my work).

Distribution

Working with the CU Office of Data Analytics’ Institutional Research unit, we distributed the survey using Qualtrics (Eichmann-Kalwara et al. 2018) and sent it to 3,612 CU Boulder researchers. We closed the survey after twenty-five days, having received 451 responses (a 12.5 percent response rate). Graduate students provided the majority of responses (68 percent, n = 305), followed by faculty (25 percent, n = 115), postdocs (5 percent, n = 21), and other (2 percent, n = 10). A majority of respondents represented the College of Arts and Sciences (55 percent, n = 247) and the College of Engineering and Applied Science (25 percent, n = 115). While the survey was intended to capture response data from all disciplines, the data are most reflective of respondents from the sciences (natural sciences, computing, engineering, etc.) due to the proportionally high number of respondents from these areas. To permit multiple analyses across academic units and to safeguard the confidentiality of the respondents, participants’ responses were categorized based on their department’s broad discipline (e.g., sciences, humanities, arts, engineering, etc.).

Content Analysis

Of the 451 survey participants, 235 responded to the question “How do you define Digital Scholarship?” To analyze these responses systematically, the authors performed a content analysis. First, we developed a codebook (Appendix B) from a random 10 percent sample (twenty-three) of the 235 responses: we each read the sample definitions and individually compiled a list of themes, met to discuss them, and created a collective list that became the codes. Next, we tested the first draft of the codebook on the same twenty-three definitions and met again to refine the codes. After reaching consensus on the codebook, we divided the DS definitions among ourselves for coding. The final codebook, summarized in Table 1, has nine codes.


Table 1: Summary of codes and definitions from the Digital Scholarship at the University of Colorado Boulder 2018 Campus Survey Codebook

Code | Definition
Process or methods | Applications/ways of doing research, not the stuff you use to do it (broader than tools, referring to broader research concepts); integrated into research process
Tools | Specifically calls out using a thing, such as tools, technology, computers (as opposed to techniques and methods, even if methods are enabled by the tools)
Publication, dissemination, and communication | Distribution of scholarship and outputs for consumption
Undefined | No definition given, for a variety of reasons (too complex, don’t use term)
Not new/just research | Broad research activities, not a distinct form of research practices; progress/evolution of research
New/non-traditional | Distinctly different from previous practices; novelty, paradigm shift
Digital output | Digital output/object as a result of research process; includes data when specifically called out as output
Teaching/pedagogy | Digital stuff used in the context of instruction; pedagogy and teaching are specifically called out
Values/ethos | Reflects ethics or beliefs as practices in research lifecycle; a community of practice
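
To connect the codebook to the statistical analysis below, the following sketch illustrates one minimal way such multi-label coded responses can be tallied into the per-group contingency tables reported in Tables 2–5. It is an illustration only, not the authors’ actual pipeline; the record structure, field names, and sample values are hypothetical.

from collections import Counter

# Hypothetical coded responses: each definition may carry several codes
# from the codebook, plus the respondent's demographic attributes.
coded_responses = [
    {"status": "graduate", "codes": {"process_methods", "tools"}},
    {"status": "faculty", "codes": {"undefined"}},
    {"status": "postdoc", "codes": {"process_methods"}},
    # ... one record per definition (235 in this study)
]

def tally(code, group_field, responses):
    """Build a 2 x k table: per group, definitions with/without a code."""
    with_code, without_code = Counter(), Counter()
    for r in responses:
        target = with_code if code in r["codes"] else without_code
        target[r[group_field]] += 1
    groups = sorted(with_code.keys() | without_code.keys())
    return groups, [[with_code[g] for g in groups],      # row 1: coded
                    [without_code[g] for g in groups]]   # row 2: not coded

groups, table = tally("process_methods", "status", coded_responses)
print(groups, table)

Because a single definition can carry several codes, each code is tallied independently against the full set of responses, which is why the columns of Tables 2–5 sum to more than the number of respondents.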

Fisher's Exact Test

Since the survey also asked participants to indicate a variety of demographic factors, it was possible to analyze how these nine codes featured in participants’ definitions of DS according to academic discipline, academic status, and prior familiarity and/or engagement with DS. To analyze the results, the authors tested the relationships between the demographic factors and the DS definition codes for statistical significance using the Freeman-Halton extension of Fisher’s exact test. Fisher’s exact test is well suited to this analysis because it can calculate p values for small data sets (Boslaugh 2008). Since the standard Fisher’s exact test applies only to 2 x 2 tables, the Freeman-Halton extension was calculated with the Fisher Exact Probability Test: 2 x 3 calculator (Lowry 2019). A Fisher exact probability of ≤ .05 was used to indicate statistical significance. No statistical significance means that the use of a code does not vary in a statistically significant way with participants’ academic affiliation, discipline, or familiarity or engagement with DS. For example, when examining the use of different codes by participants’ academic status, no statistical significance would indicate that participants of a particular academic status did not feature a code more or less often than participants of other statuses. No statistical significance does not mean that the codes appear an insignificant number of times or that these codes do not matter.
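
For readers who want to check these calculations without the VassarStats calculator, the sketch below is a minimal Python reimplementation of the Freeman-Halton extension for a 2 x 3 table: it enumerates every table sharing the observed margins and sums the probabilities of all tables no more likely than the observed one. This is an illustration under our own naming, not the tool used for the study.

import math

def table_prob(table):
    # Hypergeometric probability of a 2x3 table with fixed margins:
    # P = (prod of row sums!)(prod of col sums!) / (N! * prod of cells!)
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    n = sum(row_sums)
    log_p = (sum(math.lgamma(r + 1) for r in row_sums)
             + sum(math.lgamma(c + 1) for c in col_sums)
             - math.lgamma(n + 1)
             - sum(math.lgamma(x + 1) for row in table for x in row))
    return math.exp(log_p)

def freeman_halton_2x3(table):
    # Two-sided exact p value: enumerate all 2x3 tables with the same
    # row and column sums; the first row (a, b, c) fixes the whole table.
    r1 = sum(table[0])
    cols = [table[0][j] + table[1][j] for j in range(3)]
    p_obs = table_prob(table)
    p_value = 0.0
    for a in range(min(r1, cols[0]) + 1):
        for b in range(min(r1 - a, cols[1]) + 1):
            c = r1 - a - b
            if c > cols[2]:
                continue  # second row would go negative
            cand = [[a, b, c], [cols[0] - a, cols[1] - b, cols[2] - c]]
            p = table_prob(cand)
            if p <= p_obs + 1e-9:  # tolerance for floating-point ties
                p_value += p
    return p_value

# Example: the process [methods] row of Table 2 below. Row 1 = definitions
# carrying the code, row 2 = definitions without it; columns = faculty/staff
# (n = 71), graduate students (n = 152), postdoctoral fellows (n = 12).
observed = [[22, 70, 8], [49, 82, 4]]
print(round(freeman_halton_2x3(observed), 2))  # should approximate the reported 0.02

Applied to the undefined row of Table 2 ([[21, 23, 1], [50, 129, 11]]), the same function should likewise approximate the reported p = 0.03.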

Findings and Discussion

Out of the 235 definitions analyzed, the most frequently identified code, process or methods, appeared in 43 percent of the definitions (n = 100). People who described digital scholarship in terms of process or methods used phrases like “research methods,” “online research,” or “computational methods.” Tools, the second most frequently identified code, was identified in 38 percent of the definitions (n = 90). Participants who described DS in terms of tools used phrases such as “using digital resources,” “media,” “tools,” “computers,” “technology,” “online resources,” “platforms,” or “electronic platforms.”

The two next most frequently identified codes, publication, dissemination, and communication and undefined, were each identified in 19 percent of the definitions (n = 45). Definitions coded under publication, dissemination, and communication used phrases such as “how recorded academia is distributed and accessed online,” “dissemination of scholarly research,” and “creates archives or web pages or other ways to access information on line.” The fifth most frequently identified code, not new/just research, was identified in 14 percent of the definitions (n = 33). It manifested in phrases like “scholarship with a digital component,” “academic research,” “a form of developing knowledge,” and “the tools in the survey are all necessary for research so ‘digital scholarship’ is becoming just ‘scholarship.’” Conversely, 10 percent (n = 24) of the definitions described DS as a new/non-traditional form of research, characterized by participants as, for example, “on the cutting edge” or “not your grandpa’s scholarship.”

The three least identified codes were digital output (object) (8 percent or n = 18), teaching/pedagogy (5 percent or n = 12), and values/ethos (3 percent or n = 7). Definitions that prioritized outputs included phrases such as “research or creative works whose output is a digital product,” while in definitions with an emphasis on teaching and pedagogy DS was described as the “use of technology in teaching or research” or “one-on-one appointments with specialists.” Definitions that emphasized the values or ethos around DS were more nebulous than the other codes but brought attention to ethical concerns, in certain cases noting that DS “systemically appreciates open sharing and collaboration” or “improve[s] the quality, repeatability, and accessibility of research.” One respondent simply defined digital scholarship as “money.” The responses from participants who declined to write a definition for DS were coded undefined. These participants wrote that they were too novice or unfamiliar with DS and therefore unable to write a definition (e.g., “I’m too much of a novice to say,” “I am not sure what digital scholarship is”) or acknowledged that DS might be too elusive to define (e.g., “Not jumping down this rabbit hole. ;)”).

Perceptions of Digital Scholarship at CU Boulder

Despite its ambiguity to many scholars, the practice of DS has been applied across academic disciplines in a variety of ways. Historically, as digital research methods gained popularity and momentum, academic disciplines focused on unique applications of DS to keep their work relevant, interesting, and meaningful. It could be argued that, if DS applications in one discipline become too similar to those in another, each discipline risks losing its individual identity. Therefore, unique, subject-specific understandings of DS organically emerged under different names, promoting the perception that certain practices pertain only to certain disciplines.

To accommodate these subject-specific understandings of DS, current DS support in libraries focuses heavily on the common ground between academic disciplines and the similarities in the methods and tools employed during the research process. As a result, a distinction between analog modes of scholarship and DS has begun to emerge. However, as different disciplines incorporate DS practices at varying paces, confusion over the definition of DS persists, and many researchers revert to the comfort of discipline-specific terms, such as data science, digital history, and data humanism, to situate their work. Our survey results confirm this trend and reveal broad confusion around the term digital scholarship, despite its generic, interdisciplinary meaning, which is intended to include all digital and data-inflected research practices and tools.

Perceptions According to Academic Status

Survey respondents who identified as faculty, graduate students, or postdoctoral researchers predominantly defined DS using the same two codes: process or methods or tools (Table 2). However, the popularity of each of these two codes varied considerably depending on participants’ academic status, indicating a conceptual shift in the perception of DS between early and late career stages. Forty-six percent (n = 70) of graduate students and 67 percent (n = 8) of postdoctoral researchers defined DS in terms of processes or methods, while only 31 percent (n = 22) of faculty members’ definitions were coded as discussing processes or methods. There was a significant difference (p = 0.02) between the proportions of the three groups whose definitions were coded as process or methods. This result suggests that respondents in earlier stages of their careers more frequently understand DS practices as an inherent part of the research process and frame DS in terms of its methodological contribution to their research techniques. Tools was identified in similar proportions across status groups: 38 percent (n = 27) of faculty, 38 percent (n = 58) of graduate students, and 42 percent (n = 5) of postdoctoral researchers’ definitions were coded as tools, with no significant differences in the proportions of these groups that defined DS as such.


Table 2: Content analysis of DS definitions by respondents’ academic status (number of respondents by academic status)

Code | Faculty/staff (n = 71) | Graduate students (n = 152) | Postdoctoral fellows (n = 12) | Total number of respondents (N = 235) | Fisher exact probability
Digital output [object] | 7 (10%) | 10 (7%) | 1 (8%) | 18 (8%) | NS
New/non-traditional | 7 (10%) | 17 (11%) | 0 (0%) | 24 (10%) | NS
Not new/just research | 7 (10%) | 23 (15%) | 3 (25%) | 33 (14%) | NS
Process [methods] | 22 (31%) | 70 (46%) | 8 (67%) | 100 (43%) | 0.02*
Publication / dissemination [communication] | 12 (17%) | 31 (20%) | 2 (17%) | 45 (19%) | NS
Teaching/pedagogy | 5 (7%) | 7 (5%) | 0 (0%) | 12 (5%) | NS
Tools | 27 (38%) | 58 (38%) | 5 (42%) | 90 (38%) | NS
Undefined | 21 (30%) | 23 (15%) | 1 (8%) | 45 (19%) | 0.03*
Values/ethos | 1 (1%) | 6 (4%) | 0 (0%) | 7 (3%) | NS
Note: A Fisher exact probability of ≤ 0.05 was used to indicate statistical significance. * indicates statistical significance; NS indicates that the differences between the different groups were not statistically significant.

Further statistical differences emerged between participants of different academic status in responses coded undefined. Faculty declined to define or were unable to define DS more often than graduate students or postdoctoral researchers. Thirty percent (n = 21) of faculty members’ definitions were coded as undefined, while only 15 percent (n = 23) of graduate students and 8 percent (n = 1) of postdoctoral researchers elected not to define DS. These results represent a significant difference (p = 0.03) in the proportions of these academic groups whose definitions were coded as undefined, which suggests newer researchers are more familiar with DS and the methods presented in the survey.

Although conducted nearly a decade apart, this study’s findings align with those of Dutton and Meyer’s study, which found that recent social sciences graduates were more interested than social sciences faculty in e-social science (Dutton and Meyer 2009). For the remaining definition codes, the academic status groups did not differ significantly from one another.

Perceptions According to Academic Discipline

There were few variations in how participants from different disciplines defined DS. Excluding thirty participants who declared themselves as interdisciplinary or “other” disciplines, the remaining 205 participants represented three main disciplinary areas: arts and humanities, STEM (science, technology, engineering, and mathematics), and social sciences. Art, art history, classics, English, history, philosophy, and religious studies make up arts and humanities; natural sciences and engineering make up STEM; and communication, media studies, information sciences, business, and education make up the social sciences. None of the codes demonstrated statistical significance between these discipline areas (Table 3).


Table 3: Content analysis of DS definitions by respondents’ discipline (number of respondents by discipline)

Code | Arts and humanities (n = 18) | STEM (n = 125) | Social sciences (n = 62) | Total number of respondents (N = 205) | Fisher exact probability
Digital output [object] | 0 (0%) | 8 (7%) | 4 (8%) | 12 (6%) | NS
New/non-traditional | 0 (0%) | 12 (10%) | 8 (13%) | 20 (10%) | NS
Not new/just research | 1 (6%) | 18 (14%) | 12 (19%) | 31 (15%) | NS
Process [methods] | 7 (39%) | 53 (43%) | 25 (40%) | 85 (41%) | NS
Publication / dissemination [communication] | 5 (28%) | 26 (21%) | 9 (15%) | 40 (20%) | NS
Teaching/pedagogy | 2 (11%) | 7 (6%) | 2 (3%) | 11 (5%) | NS
Tools | 10 (56%) | 50 (40%) | 22 (35%) | 82 (40%) | NS
Undefined | 1 (6%) | 25 (20%) | 12 (19%) | 38 (19%) | NS
Values/ethos | 0 (0%) | 5 (4%) | 1 (3%) | 6 (3%) | NS
Note: A Fisher exact probability of ≤ 0.05 was used to indicate statistical significance. NS indicates that the differences between the different groups were not statistically significant.

The frequency of codes remained consistent across disciplines. In each discipline, process or methods and tools were either the first or second most frequently identified codes. Among arts and humanities’ participants, 56 percent (n = 10) of definitions included the code tools and 39 percent (n = 7) included process or methods. Those affiliated with STEM disciplines included the code process or methods in 43 percent (n = 53) of their definitions and tools in 40 percent (n = 50) of their definitions. Forty percent (n = 25) of the responses written by social sciences participants included the code process or methods and 35 percent (n = 22) included tools. With the exception of arts and humanities including tools in 56 percent of their definitions, none of the disciplines had any one code applied to more than 50 percent of the definitions. The words and phrases each discipline used to describe process or methods and tools were similar.

Differences in thought and understanding of DS emerged from the codes new/non-traditional research and not new/just research. The code new/non-traditional research was identified in 10 percent (n = 12) of STEM’s definitions and 13 percent (n = 8) of social sciences’ definitions; however, it was not identified in any definitions from the arts and humanities. Similarly, 14 percent (n = 18) of STEM’s definitions and 19 percent (n = 12) of social sciences’ definitions were identified with the code not new/just research, while none of the arts and humanities’ responses included this code. It is also interesting to note that only one participant from the arts and humanities (6 percent) elected not to define DS; however, undefined was coded in 20 percent (n = 25) of STEM and 19 percent (n = 12) of social sciences’ definitions.

Scholarly societies in the arts and humanities, such as the American Historical Association (AHA) and the Modern Language Association (MLA), have developed special evaluative criteria for digital research outputs such as DH projects and datasets, and our findings suggest that DS may be understood, but perceived as complicated, by our humanities researchers. The AHA’s “Guidelines for the Professional Evaluation of Digital Scholarship by Historians” (2015) argues that evaluation of scholarship and scholarship itself should evolve alongside one another. AHA (2015) applies the terms digital scholarship, digital publication, and digital history interchangeably as “scholarship that is either produced using computational tools and methods or presented using digital technologies.”2 Similarly, the MLA’s “Guidelines for Evaluating Work in Digital Humanities and Digital Media” from 2012 note that humanities computing is not new for their disciplines; however, “humanists are adopting new technologies and creating new critical and literary forms and interventions in scholarly communication.” These guidelines further point to the need for libraries to offer specialized support and meaningful communication that considers the varied approaches, language, criteria, and needs for digital research across disciplines and local contexts.

Perceptions According to Familiarity and Engagement with DS

The results of participants’ familiarity and engagement with DS were analyzed together because these categories represent two different but related aspects of each participant’s prior exposure to DS. Nearly half of the participants (49 percent) identified themselves as unfamiliar with DS (n = 115), while another 40 percent identified as somewhat familiar (n = 95). Only 11 percent of respondents identified as very familiar (n = 25). These figures are interesting when compared with participants’ self-identified engagement levels: only 17 percent of participants indicated that they were not at all engaged in DS (n = 39), while over half (55 percent) identified as having some engagement (n = 129), and 28 percent were engaged (n = 67) (Table 5). No correlation emerged between how respondents rated their familiarity and how they rated their engagement with DS. In fact, 17 percent (n = 39) of respondents answered that they were unfamiliar with DS but that they were engaged (i.e., DS is integral to their work). Notably, nearly two-thirds (64 percent, n = 25) of participants answering in this way also identified themselves with engineering or natural sciences. Yet only three respondents indicated that they were very familiar but not at all engaged with DS, and the largest proportion of respondents (26 percent, n = 62) reported themselves as somewhat familiar with and having some engagement with DS (Table 4).


Table 4: Content analysis of DS definitions by respondents’ level of familiarity with DS (number of respondents by level of familiarity)

Code | Very familiar (n = 25) | Somewhat familiar (n = 95) | Unfamiliar (n = 115) | Total number (%) of respondents (N = 235) | Fisher exact probability
Digital output [object] | 2 (8%) | 12 (13%) | 3 (3%) | 17 (7%) | 0.03*
New/non-traditional | 3 (12%) | 13 (14%) | 7 (6%) | 23 (10%) | NS
Not new/just research | 1 (4%) | 18 (19%) | 13 (12%) | 32 (14%) | NS
Process [methods] | 11 (44%) | 50 (53%) | 39 (34%) | 100 (43%) | 0.02*
Publication / dissemination [communication] | 4 (16%) | 26 (27%) | 14 (13%) | 44 (19%) | 0.03*
Teaching/pedagogy | 1 (4%) | 1 (2%) | 9 (8%) | 11 (5%) | NS
Tools | 9 (36%) | 48 (51%) | 33 (29%) | 90 (38%) | 0.00*
Undefined | 4 (16%) | 3 (3%) | 38 (33%) | 45 (19%) | NS
Values/ethos | 0 (0%) | 3 (3%) | 3 (3%) | 6 (3%) | NS
Note: A Fisher exact probability of ≤ 0.05 was used to indicate statistical significance. * indicates statistical significance; NS indicates that the differences between the different groups were not statistically significant.


Table 5: Content analysis of DS definitions by respondents’ level of engagement with DS (number of respondents by level of engagement)

Code | Engaged (n = 67) | Some engagement (n = 129) | Not at all (n = 39) | Total number of respondents (N = 235) | Fisher exact probability
Digital output [object] | 4 (6%) | 11 (9%) | 3 (8%) | 18 (8%) | NS
New/non-traditional | 6 (10%) | 15 (12%) | 0 (0%) | 21 (9%) | 0.05*
Not new/just research | 10 (15%) | 20 (16%) | 3 (8%) | 33 (14%) | NS
Process [methods] | 32 (48%) | 55 (43%) | 12 (31%) | 99 (42%) | NS
Publication / dissemination [communication] | 16 (24%) | 23 (18%) | 6 (15%) | 45 (19%) | NS
Teaching/pedagogy | 3 (4%) | 8 (6%) | 1 (3%) | 12 (5%) | NS
Tools | 26 (39%) | 51 (40%) | 12 (31%) | 89 (38%) | NS
Undefined | 10 (15%) | 20 (16%) | 15 (38%) | 45 (19%) | 0.01*
Values/ethos | 4 (6%) | 1 (1%) | 2 (5%) | 7 (3%) | NS
Note: A Fisher exact probability of ≤ 0.05 was used to indicate statistical significance. * indicates statistical significance; NS indicates that the differences between the different groups were not statistically significant.

Participants’ self-identified familiarity and/or engagement levels directly related to whether or not they explicitly defined DS. A third (33 percent, n = 38) of participants unfamiliar with DS did not define the term, compared to 3 percent (n = 3) of those somewhat familiar and 16 percent (n = 4) of those very familiar. Similarly, 38 percent (n = 15) of participants not at all engaged in DS did not define the term, compared to 16 percent (n = 20) of those with some engagement and 15 percent (n = 10) of those engaged. Across all levels of familiarity and engagement, process or methods and tools were by far the most common codes applied to the definitions. In general, the more familiar respondents claimed to be with DS, the less likely they were to supply a definition coded as undefined. Responses that skipped the question entirely followed the same pattern as responses coded undefined: across all disciplines and statuses, the less familiar or engaged participants reported themselves to be, the more likely they were to skip defining DS. These findings are in line with previous research on faculty’s DS activities, which demonstrated that many faculty members did not think of their research as DS (Mitchem and Rice 2017; Lindquist et al. 2013), and they support Lynch’s argument that DS is “nonsensical” because “scholarship is scholarship” (2014).

Impacts on Librarianship at CU Boulder

Within libraries, thinking about DS as a form of research distinct from traditional or analog scholarship aids in creating structures and support for users. Because of the kind of research support that the CRDDS at CU Boulder offers, its services do not always fit neatly into “traditional” librarian roles. Using the term digital scholarship therefore proves useful within the local context of the Libraries at CU Boulder, especially for subject specialists whose constituents do not necessarily think of their research artifacts as data, or of using machines to process and analyze those data, as DS. Such a distinction between digital and analog research allows libraries to improve support for users and facilitates collaboration between transdisciplinary functional and subject specialists.

Although the term digital scholarship is meaningful within the Libraries at CU Boulder from service-oriented and organizational perspectives, in practice it proves convoluted at best, and misleading or too vague to be meaningful at worst, when working with researchers outside of the Libraries. The authors’ experiences consulting on a variety of research topics, from English and history to Earth sciences and information science, suggest that digital research methods are widespread and often not distinguished from “traditional” or “analog” research methods. Indeed, the discrepancy between unfamiliar and not at all engaged responses in the authors’ 2018 digital scholarship engagement survey perhaps indicates that users self-report higher levels of engagement with methods and tools related to digital research but lower familiarity with DS as a distinct type of research. Our findings show that building a local understanding of DS still requires education and outreach and that library professionals need to tailor messages to disciplinary contexts, as researchers use different terms to define their work, such as digital humanities, data science, or, simply, research.

Conclusion

Two factors may limit the broad applicability and significance of this survey’s findings. First, while the survey refrained from using the term digital scholarship until asking participants to define the term, the question that was posed had the potential to influence participant responses. As stated above, the survey prompted: “All of the digital research methods covered in this survey are considered forms of Digital Scholarship. How do you define Digital Scholarship?” The number of respondents who used the phrase “digital research methods” or similar phrases involving “methods” or “methodology” within their definitions suggests that the wording of the question may have shaped some responses. Second, the survey responses overrepresent graduate students as well as respondents from the sciences (natural sciences, computing, engineering, etc.). Since we received responses from these groups at a higher rate, the data do not necessarily provide a complete reflection of the CU Boulder community’s perceptions of DS. Future directions for this and similar studies include expanding the scope to more campuses and broader demographic groups, which would create a larger sample and potentially garner a more representative body of participants. Conducting focus groups in addition to the survey methodology used here would help generate more nuanced explanations of respondents’ perceptions as well as a better understanding of the terms and practices used within various local contexts.

Based on these findings, the authors recommend using the term digital scholarship to inform libraries’ service offerings and support infrastructure internally but advise that libraries avoid assuming that researchers and users understand its use or definition. While there is some evidence to suggest that researchers are most likely to interpret libraries’ DS-branded offerings as in some way related to digital tools and/or research methods, there is no consistent understanding of the phrase and its use is likely more confusing than descriptive. These data suggest that there is likely not a single word or phrase to describe digital or e-research across disciplines and statuses. By marketing services and products with one term exclusively, libraries may be losing potential users. Rather, it may be more important to fold librarians’ understanding of DS into specific contexts when engaging in targeted outreach and education and consulting with individual researchers. However, it is also important to use a holistic approach to discussing research methods, tools, publishing, and pedagogy at a broad scale among diverse groups of library users. This balance is essential to avoid confusing users while simultaneously inspiring confidence that libraries are well equipped with the resources and skills to support digital scholarship.

Appendices

Competing Interests

The authors declare that they have no competing interests.

References

American Historical Association. 2015. “Guidelines for the Professional Evaluation of Digital Scholarship by Historians.” Accessed May 1, 2020. https://www.historians.org/teaching-and-learning/digital-history-resources/evaluation-of-digital-scholarship-in-history/guidelines-for-the-professional-evaluation-of-digital-scholarship-by-historians. Archived at: https://perma.cc/2BZJ-GHZ9.

Boslaugh, Sarah. 2008. “Fisher’s Exact Test.” In Encyclopedia of Epidemiology. SAGE Research Methods. Thousand Oaks, CA: SAGE Publications. https://dx.doi.org/10.4135/9781412953948.n159.

Brandt, D. Scott. 2007. “Librarians as Partners in e-Research: Purdue University Libraries Promote Collaboration.” College & Research Libraries News 68 (6): 365–96. https://doi.org/10.5860/crln.68.6.7818.

Casacuberta, David, and Jordi Vallverdú. 2013. “E-Science and the Data Deluge.” Philosophical Psychology 27 (1): 126–40. https://doi.org/10.1080/09515089.2013.827961.

Cox, John. 2018. “Positioning the Academic Library within the Institution: A Literature Review.” New Review of Academic Librarianship 24 (3–4): 217–41. https://doi.org/10.1080/13614533.2018.1466342.

Dutton, William H., and Eric T. Meyer. 2009. “Experience with New Tools and Infrastructures of Research: An Exploratory Study of Distance From, and Attitudes Toward, e-Research.” Prometheus 27 (3): 223–38. https://doi.org/10.1080/08109020903127802.

Eichmann-Kalwara, Nickoal, Frederick C. Carey, Melissa Hart Cantrell, Stacy Gilbert, Philip B. White, and Katherine Mika. 2018. “Survey Response Data.” 2018 Digital Scholarship Campus Survey, December 17, 2018. Last updated March 27, 2019. https://doi.org/10.17605/OSF.IO/AK8FM.

Heidorn, P. Bryan. 2011. “The Emerging Role of Libraries in Data Curation and e-Science.” Journal of Library Administration 51 (7–8): 662–72. https://doi.org/10.1080/01930826.2011.601269.

Hensley, Merinda Kaye, and Steven J. Bell. 2017. “Digital Scholarship as a Learning Center in the Library: Building Relationships and Educational Initiatives.” College & Research Libraries News 78 (3): 155–58. https://doi.org/10.5860/crln.78.3.9638.

Johnson, Layne M., John T. Butler, and Lisa R. Johnston. 2012. “Developing e-Science and Research Services and Support at the University of Minnesota Health Sciences Libraries.” Journal of Library Administration 52 (8): 754–69. https://doi.org/10.1080/01930826.2012.751291.

Lindquist, Thea, Holley Long, Alexander Watkins, Leo Arellano, Michael Dulock, Eric Harbeson, Erika Kleinova, Viktoriya Oliynynk, Elaine Paul, and Esta Tovstiadi. 2013. “dh+CU: Future Directions for Digital Humanities at CU Boulder.” CU Scholar, University Libraries. https://scholar.colorado.edu/concern/reports/1n79h519n.

Lowry, Richard. 2019. “Fisher Exact Probability Test: 2x3.” Accessed May 24, 2019. http://vassarstats.net/fisher2x3.html. Archived at: https://perma.cc/Y6FA-GGEP.

Lynch, Clifford A. 2014. “The ‘Digital’ Scholarship Disconnect.” EDUCAUSE Review 49 (3): 10–15. https://er.educause.edu/articles/2014/5/the-digital-scholarship-disconnect. Archived at: https://perma.cc/FW6L-8QLF.

Martin, Lindsey. 2016. “The University Library and Digital Scholarship: A Review of the Literature.” In Developing Digital Scholarship: Emerging Practices in Academic Libraries, edited by Alison Mackenzie and Lindsey Martin, 3–22. London: Facet Publishing. https://doi.org/10.29085/9781783301799.002.

Mitchem, Pamela Price, and Dea Miller Rice. 2017. “Creating Digital Scholarship Services at Appalachian State University.” portal: Libraries and the Academy 17 (4): 827–41. https://doi.org/10.1353/pla.2017.0048.

Modern Language Association. 2012. “Guidelines for Evaluating Work in Digital Humanities and Digital Media.” Accessed May 1, 2020. https://www.mla.org/About-Us/Governance/Committees/Committee-Listings/Professional-Issues/Committee-on-Information-Technology/Guidelines-for-Evaluating-Work-in-Digital-Humanities-and-Digital-Media. Archived at: https://perma.cc/L7QG-TPF6.

Mulligan, Rikk. 2016. “SPEC Kit 350: Supporting Digital Scholarship (May 2016).” Washington, DC: Association of Research Libraries. https://doi.org/10.29242/spec.350.

Raffaghelli, Juliana E., Stefania Cucchiara, Flavio Manganello, and Donatella Persico. 2016. “Different Views on Digital Scholarship: Separate Worlds or Cohesive Research Field?” Research in Learning Technology 24 (December). https://doi.org/10.3402/rlt.v24.32036.

Robertson, Stephen. 2016. “The Differences between Digital Humanities and Digital History.” In Debates in the Digital Humanities 2016, edited by Matthew K. Gold and Lauren F. Klein. Minneapolis, MN; London: University of Minnesota Press. https://dhdebates.gc.cuny.edu/read/65be1a40-6473-4d9e-ba75-6380e5a72138/section/ed4a1145-7044-42e9-a898-5ff8691b6628#ch25. Archived at: https://perma.cc/5BE7-FG3K.

Rumsey, Abby Smith. 2011. “Scholarly Communication Institute 9: New-Model Scholarly Communication: Road Map for Change.” Charlottesville, VA: University of Virginia Library. http://uvasci.org/institutes-2003-2011/SCI-9-Road-Map-for-Change.pdf. Archived at: https://perma.cc/K2VC-JXWN.

Russell, Kelly, Ellis Weinberger, and Andy Stone. 1999. “Preserving Digital Scholarship: The Future Is Now.” Learned Publishing 12 (4): 271–80. https://doi.org/10.1087/09531519950145670.

Terras, Melissa, Julianne Nyhan, and Edward Vanhoutte, eds. 2013. Defining Digital Humanities: A Reader. Surrey: Ashgate.

Yang, Xiaoyu, David Wallom, Simon Waddington, Jianwu Wang, Arif Shaon, Brian Matthews, Michael Wilson, Yike Guo, Li Guo, Jon D. Blower, Athanasios V. Vasilakos, Kecheng Liu, and Philip Kershaw. 2014. “Cloud Computing in e-Science: Research Challenges and Opportunities.” The Journal of Supercomputing 70: 408–64. https://doi.org/10.1007/s11227-014-1251-5.

Footnotes

1 Libraries have supported e-science by assisting researchers with the data curation lifecycle (Heidorn 2011), and Layne M. Johnson, John T. Butler, and Lisa R. Johnston (2012) have identified data management planning, data archiving and sharing, bibliometric and informatic support, and aid in grant preparation as potential library roles in supporting e-science.

2 Not all historians agree with this approach. Stephen Robertson, for example, prefers the term digital history to digital humanities (2016). Libraries (at least in CU Boulder’s local context) group these terms all under the same umbrella, akin to the AHA’s usage.