RESEARCH ARTICLE

Viewing Citation Analysis Through the Lens of Citation Justice

Linda C. Smith
University of Illinois Urbana-Champaign

Abstract

Given the recent growing emphasis on citation justice and related concepts, this paper examines "Citation Analysis," published in Library Trends in 1981, and revisits it through the lens of citation justice. Overarching questions include: How can citation analysis be more just? How can research evaluation go beyond citation analysis to be more just? Sections include a discussion of the concept of citation justice, applications of citation analysis with particular emphasis on evaluative bibliometrics, characterization of assumptions underlying citation analysis, identification of problems posed in dealing with citation data, and an outline of possible approaches to achieving citation justice. Several different entities and actions are discussed with the goal of working toward citation justice. These include author roles, pedagogical approaches, resource compilation, editor and reviewer roles, publisher roles, advocacy, recommendations for research evaluation reform, and higher education institutional roles. Viewing citation analysis through the lens of citation justice reveals significant limitations in citation analysis and suggests ways to correct them, both to ensure that more diverse scholars are part of the scholarly conversation that underlies citation analysis and to encourage approaches to research evaluation that are not dependent solely on citation counts.

Keywords: citation analysis; citation counts; citation justice; evaluative bibliometrics; research evaluation

 

How to cite this article: Smith, Linda C. 2026. Viewing Citation Analysis Through the Lens of Citation Justice. KULA: Knowledge Creation, Dissemination, and Preservation Studies 9(1). https://doi.org/10.18357/kula.305

Submitted: 2 March 2025 Accepted: 4 November 2025 Published: 19 January 2026

Competing interests and funding: The author declares that they have no competing interests.

Copyright: © 2026 The Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See http://creativecommons.org/licenses/by/4.0/.

 

Introduction

The theme of the Summer 1981 issue of Library Trends, “Bibliometrics,” was a response to the lack of critical analyses of the field and sought “to provide analyses of the major concepts of bibliometrics and to indicate its present and future directions” (Potter 1981, 5). The editor explained that authors of articles in the issue “were chosen to bring some new names and, it is hoped, new ideas to the literature” (Potter 1981, 5). I was the author of “Citation Analysis” (designated as CA for the remainder of this paper), which focused “on the development of citation analysis as a research method, uses and abuses of this method, and prospects for the future” (Smith 1981, 83). Better known for my work on artificial intelligence and information retrieval (Smith 1976), I had previously published one study using citation context analysis (CCA) to trace the influence of Vannevar Bush’s ideas as described in “As We May Think” (1945) on the subsequent design and development of information retrieval systems (Smith 1980). As explained by Valeria Aman and Jochen Gläser (2025, 170), “while a citation only indicates that a knowledge flow may have taken place, CCA can establish whether knowledge flows occurred, which knowledge from the cited text was used, and how it was used.” My 1980 study demonstrated that many citing documents mentioned “As We May Think” only in passing, perfunctory citations adding little to one’s understanding of the article’s influence. Therefore, simple citation counts were imperfect measures of impact.

Reflecting my early published work, an author co-citation analysis of information science (White and McCain 1998, 337) analyzing the 120 most-cited authors in twelve information science journals for 1972 to 1995 concluded that I was “about equally known as an experimental retrievalist, a practical retrievalist, and a citationist.” The author co-citation analysis method thus “highlights the careers of individual authors as perceived by citers” (White and McCain 1998, 348). My research interests gradually evolved with the opportunity to co-lead a grant from the Institute of Museum and Library Services beginning in 2007. That grant supported enhancing and expanding the doctoral program in library and information science at my university to have a stronger focus on information in society, including policy, economic, and historical dimensions. In collaboration with colleagues and students involved in this effort, I became much more engaged with critical information studies, exploring “the structures, functions, habits, norms, and practices that guide global flows of information and cultural elements” (Vaidhyanathan 2006, 292).

The critique of citation analysis in CA outlined the assumptions often made and the problems that can arise in data collection. Subsequent publications (e.g., MacRoberts and MacRoberts 1996) highlighted multiple concerns, including the potential for cultural bias. Recognizing the increasing focus on social justice in science communication (Dawson, Iqani, and Lock 2024), this paper takes CA as a starting point and revisits it through the lens of citation justice. Viewed through this lens, citation analysis shows significant limitations unless efforts are made to address citation justice. This raises two questions: How can citation analysis be more just? How can research evaluation go beyond citation analysis to be more just?

The remaining sections of this paper include a discussion of the concept of citation justice, applications of citation analysis with particular emphasis on evaluative bibliometrics, characterization of assumptions underlying citation analysis, identification of problems posed in dealing with citation data, and an outline of possible approaches to achieving citation justice.

Citation Justice

Tracing the development of the concept of “citation justice” can begin with concerns expressed in a letter to the editor from a chemical engineer (Radovic 1996, 6): “Not only are authors, papers, or journals simply misidentified, but too many references do not match authors’ interpretations. Under publish-or-perish pressure, one is often tempted to cite the papers that are at hand, not necessarily the most appropriate papers.” Twenty-four years later, an invited commentary from a nursing ethicist (Fowler 2020, 2) lamented how style guides obscure female authorship, and advocated “for nursing authors, editors, organizations and publishers to call for full and equal recognition of women’s authorship. It is a question of social justice.” By 2022, a news feature in Nature reported “the rise of citational justice” (Kwon 2022, 569) as an emerging movement that calls “on academics to acknowledge the inequities in citational practices — and, by paying more heed to work from groups that are typically under-cited, take action to reduce them.” It has become evident that both individual and institutional biases contribute to inequities in citation (Dworkin, Zurn, and Bassett 2020, 890).

Citation justice seeks to critically examine the ethics and politics of knowledge production (Baffour, Garcia, and Rich 2024). It promotes the deliberate effort to find and cite work authored by scholars with varied backgrounds, not only to help correct citation imbalance but also to affirm contributions to their fields of study. Inclusive citation that is meaningful and intentional reflects conscientious engagement “with those authors and voices we want to carry forward” (Mott and Cockayne 2017, 954). As highlighted by the Citational Justice Collective et al. (2022, 81), “fostering a critical awareness around citational practices is needed so that we can learn to take personal and political responsibility and recognize our role in eliminating citational injustices.”

Ingie Hovland and Britt Halvorson (2024, 170) note the need for “the emergence of a language to talk about citation.” This paper focuses on “citation(al) justice” as the core concept, but it is important to acknowledge that there are now several related terms appearing in the literature, such as citation bias, citation ethics, citation politics, citation power, and critical citation. For example, in the context of #BlackCitesMatter, to overcome the marginalization of research by Black scholars, Fred A. Bonner II and Barbara L. Garcia-Powell (2022, 12) add the terms citational injustice, citational erasure, citational exclusion, citational equivocation, citational gentrification, citational disparity, and citational racism. Advocates argue for more citation diversity and inclusive citation. Related concepts, beyond the scope of this paper, include epistemic (in)justice and epistemicide (Patin et al. 2021). There are also terms to describe misbehavior in citation practice, including citation cartel, citation malpractice, citation manipulation, and coercive citation. Many of these arise in response to the increasing emphasis on evaluative bibliometrics as discussed in the next section on applications.

Applications of Citation Analysis

CA (Smith 1981, 94–98) identified eight potential areas of application of citation analysis: “Literature of” studies; “type of literature” studies; user studies; historical studies; communication patterns; evaluative bibliometrics; information retrieval; and collection development. While all continue to be investigated, the use of citation analyses for evaluative purposes is most relevant to citation justice. Nearly fifty years ago, Francis Narin (1976) observed that the broad application of citation analysis for evaluative bibliometrics was clearly attributable to the appearance of the Science Citation Index, with citation counts second in importance only to the counting and classification of publications. Citation counts were interpreted as a measure of the use or influence of a single publication or of all publications of an individual, grant, contract, department, university, funding agency, or country.

Yves Gingras (2016) traces the evolution in evaluative bibliometrics from science policy in the 1970s to an emphasis on research evaluation starting in the 1980s. “Research excellence” considered the number of publications produced as an indicator of productivity and the number of citations those publications received as an indicator of scholarly impact. Use of citation analysis expanded with the wide availability of citation data from the Web of Science and Scopus subscription databases in many universities and with freely available citation tools such as Google Scholar. “Historically, research evaluation was done by specialists,” but easy access to these data led to their use in measurement by administrators, policymakers, and others not trained in bibliometrics (Sugimoto and Larivière 2018, 5). “These individuals need to be able to compile and interpret research indicators in a meaningful manner,” which requires “knowledge of the data available (and the limitations of these data), an understanding of how indicators are constructed and compiled, and the ability to contextualize the use of these indicators” (Sugimoto and Larivière 2018, 5). Diana Hicks, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols (2015, 429) observe: “Research evaluations that were once bespoke and performed by peers are now routine and reliant on metrics. The problem is that evaluation is now led by the data rather than by judgement.”

In The Tyranny of Metrics, Jerry Z. Muller (2018) characterizes this trend in higher education as consistent with a “metric fixation” in other sectors. This fixation includes replacing personal judgment with numerical indicators of comparative performance based upon standardized metrics and attaching rewards and penalties to individually measured performance. Metric fixation is the persistence of these practices regardless of their unintended negative consequences. Despite known limitations (Light, Gullickson, and Harrison 2025), researchers may be judged by metrics such as the h-index, which combines productivity and citation counts (Egghe 2010; Hirsch 2005). The journals where their publications appear may be compared by journal impact factor, computed based on citations received by articles published in a journal over a given period of time (Garfield 1999; Simons 2008). The impact factor is seen not only as an indicator of the quality of the journals but also—and often wrongly—as a measure of the quality of the individual articles. The ease of use of the h-index and impact factor has increased their influence in assessing researchers, with the effect that where papers are published and how many times they are cited carry more weight than what is said in the papers (Wood 2021, 14). This creates a culture where “publish or perish” is merging with “impact or perish” (Biagioli and Lippman 2020, 1).
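Because the h-index and the journal impact factor anchor so much of this evaluative machinery, it may help to see how mechanically simple both are. The sketch below (function names and figures are illustrative, not drawn from the article) computes each; its brevity underscores how much of a paper's substance these single numbers leave out.

```python
def h_index(citation_counts):
    """Hirsch's h-index: the largest h such that the author has
    h publications with at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

def two_year_impact_factor(citations_received, citable_items):
    """Two-year journal impact factor for year Y: citations received
    in Y by items the journal published in Y-1 and Y-2, divided by
    the number of citable items published in Y-1 and Y-2."""
    return citations_received / citable_items

# An author with papers cited [10, 8, 5, 4, 3] times has h = 4.
# A journal receiving 300 citations to the 150 articles it published
# in the prior two years has an impact factor of 2.0.
```

Note that the h-index folds productivity and citation counts into a single integer, which is precisely the kind of reduction the critiques above target.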

The past decade has seen more attention to issues raised by the increased use of evaluative bibliometrics. Scholarly Metrics Under the Microscope: From Citation Analysis to Academic Auditing (Cronin and Sugimoto 2015) compiled papers to provide a critical commentary on the use of metrics to assess the quality of scholarship and to measure the impact of lines of research. The Metric Tide (Wilsdon et al. 2015), a review of the role of metrics in research assessment and management, cautions that use of indicators like journal impact factors and citation counts could create perverse incentives. Metrics should support, not replace, expert judgment provided by peer review (Moed 2005, 4). Outcomes of citation analysis should be considered in an evaluative framework that also takes into account the substantive content of the works under evaluation. Mike Thelwall (2025, 2–3) summarizes arguments for and against the use of citation-based indicators in research evaluation. Those supporting more use of indicators note the potential for time savings, facilitation, impartiality, and objectivity. Those objecting to the use of citation indicators express concerns regarding oversimplification, partial evidence, field or disciplinary differences, perverse incentives, and gaming (discussed below).

Echoing Lutz Bornmann and Hans-Dieter Daniel (2008) and Iman Tahamtan and Bornmann (2019), one can ask: "What do citation counts measure?" Of particular concern is the tendency to regard citation counts as an indicator of the quality of research (Thelwall et al. 2023). Most recent studies are reluctant to treat citations as a direct proxy for quality, often preferring the term "impact" or "visibility." Dag W. Aksnes, Liv Langfeldt, and Wouters (2019) explain that research quality is a multidimensional concept, with plausibility/soundness, originality, scientific value, and societal value as key components. Citation counts do capture certain aspects of impact, but they may not measure other important but less accessible dimensions, such as societal value. Societal impact is significantly more difficult to measure than impact on science (Bornmann 2016, 348). Factors that could be considered include policy impact, innovation and economic impact, social and environmental impact, and cultural impact (Woolston 2023). In addition, the societal impact of research may not be observable for many years, and it is difficult in many cases to establish a causal connection between a particular piece of research and its effects. More generally, only a small proportion of article readings are done for the purpose of writing other articles; most reading is done by non-authors and is not captured by citation counts. Instead, other measures of societal benefit and capacity building may be more effective (Thelwall 2025, 9).

Despite these cautions, when institutions use citation metrics as indicators of impact, how often a researcher is cited plays an important role in that person’s career advancement (Ray et al. 2022, 158). Research has shown, moreover, that citation patterns and practices are affected by biases related to the gender, race, and nationality of the authors being cited. Lai Ma (2019) objects to the “flattened self” in citation databases: “Persons and work are flattened and objectified in citation databases, and metrics are embedded in departmental reports, strategic plans, grant applications, and university rankings.” Because citations may factor so heavily into hiring, promotion, and tenure processes, any inequity in the way citations are created and disseminated can lead to inequity in higher education institutions.

The focus on citation counts and impact factors can shape the topics and issues scholars write about, their choice of methodology, and their choice of publication venue for their work. In addition, attention to these metrics may influence the decisions that journal editors make about the work they choose to publish (Smeyers and Burbules 2011). Of growing concern are emerging issues in the gaming and manipulation of metrics in higher education that are designed to amplify individual and institutional reputation (Oravec 2019). These distortions were anticipated by Narin (1976, 163), who noted that if citation analysis becomes an accepted method of evaluating research, scientists may conspire with their colleagues to cite one another to effect an increase in their individual citation counts. Nicholas C. Burbules (2015, 716) cautions that the more important these metrics become, the more likely there will be efforts to “game” the system, eroding the reliability of the metrics.

CA identified two types of limitations that can affect citation analyses for evaluative bibliometrics as well as other applications: Assumptions made may not be true, and the data collected may have inadequacies. Invalid conclusions will be drawn unless these limitations are recognized in the design of a study and in the interpretation of results. The next two sections address these potential limitations through the lens of citation justice.

Assumptions of Citation Analysis

CA identified the following five assumptions frequently underlying citation analysis (Smith 1981, 87–89):

  1. Citation of a document implies use of that document by the citing author.

  2. Citation of a document (author, journal, etc.) reflects the merit (quality, significance, impact) of that document (author, journal, etc.).

  3. Citations are made to the best possible works.

  4. A cited document is related in content to the citing document.

  5. All citations are equal.

Citation justice seeks to reinforce assumptions one and three. Citation manipulation challenges assumptions two, four, and five and poses threats to citation justice.

Assumptions one and three support citation of and engagement with relevant publications that might otherwise be overlooked because the author was not aware of the document, or could not obtain it, or could not read the language in which it was published. Bias may also be a factor in failures to cite. Scholars may use convenient heuristics when finding sources to cite, such as the language a paper is written in, the author’s institution, and journal reputation. Such practices may silence certain voices, including those writing in a language other than English, affiliated with a less prestigious institution, or publishing in a journal with a lower impact factor. One important way to counter citational disparities is to expand the range of scholarship with which a scholar critically engages.

Multiple bibliometric analyses have discovered that women and people of color are cited less often than their white, male counterparts and that citations contain geographic bias. Echoing Margaret W. Rossiter’s (1993) characterization of the Matilda Effect, a term coined to refer to the way that women’s contributions to science are often undervalued or attributed to men, Cassidy R. Sugimoto and Vincent Larivière (2023, 145) report that an overwhelming majority of studies have found that women’s publications are cited less frequently than those by men. A recent investigation in environmental studies confirms this pattern, with H. O’Leary, T. Gantzert, A. Mann, E. Z. Mann, N. Bollineni, and M. Nelson (2024) finding a significant gender gap between who creates novel contributions to the field (measured by new publications) and whose contributions are being recognized (measured through new citations). Race and ethnicity are also factors. For example, Paula Chakravartty, Rachel Kuo, Victoria Grubbs, and Charlton McIlwain (2018) find that non-white scholars continue to be underrepresented in publication rates, citation rates, and editorial positions in communication studies.

Looking at geographic differences, Alicia Mattiazzi and Martin Vila-Petroff (2024) emphasize the role of international or region-based citation bias, finding that scientific research from rich countries or regions is more widely cited than comparable studies from nations or regions with fewer resources, also reported by Charles J. Gomez, Andrew C. Herman, and Paolo Parigi (2022). Contributing factors can include lack of visibility (scientists from low-income countries have fewer opportunities to travel and to showcase their research in international forums) and greater difficulties in publishing their results (due to prejudices of reviewers, unaffordable article processing charges, expectations for English language proficiency [Lenharo 2025]). A study by Suleman Lazarus (2025) examines factors influencing citation metrics through an autoethnographic lens for his research portfolio focused on Nigerian society. He expresses concern that systemic prejudices disproportionately affect individuals from Africa south of the Sahara. Authors may experience pressure to replace citations of papers in Africa-based journals with those from the Global North, even though the former are more contextually relevant. More generally, scholars from both the Global North and South may cite work from the North, even when discussing the Global South (Velho 1986).

Citation manipulation undermines assumptions two, four, and five, which state that citations are made because of the merit of the cited work, are related in content to the citing document, and are equal for the purposes of citation analysis. The increasing reliance on metrics to evaluate scholarly publications has produced new forms of academic fraud and misconduct (Biagioli and Lippman 2020): the gaming and manipulation of indicators. These actions differ from traditional academic misconduct (falsification, fabrication, plagiarism). Payal B. Joshi and Manoj Pandey (2024, 285) find that citation manipulation in scientific publishing is a pervasive issue that undermines the integrity of academic research with strategies seeking to boost the reputations of authors, reviewers, and editors. Samuel V. Bruton, Alicia L. Macchione, Mitch Brown, and Mohammad Hosseini (2025, 330) enumerate a range of unethical practices, including coercive citations (when journal editors require authors to cite other works from the same journal or peer reviewers suggest citing their own work), excessive self-citations, and selective citations (when citations are deliberately selected or omitted to skew the significance of the reported results). Other questionable tactics include citation rings or cartels (when authors or editors collude to increase citation metrics), citation padding (when authors attempt to impress journal reviewers or editors by citing prestigious but irrelevant articles), citation copying (when authors merely copy citations from another source without reading them or checking their accuracy), and the citation of sources authors have not read (such as citations based only on an article’s title or skimmed abstract). Other concerns include citation stacking (journal editors build a network to exchange citations excessively) and citation contamination (authors cite articles from predatory journals) (Joshi and Pandey 2024, 293–94). There is even evidence that citations can be bought in bulk (Ibrahim et al. 2025). In many cases of citation manipulation, there is no connection between the cited reference and the scholarly content of the manuscript.

Problems with Data

Given the difficulties with the assumptions that underlie many citation analyses, one must also be aware of the problems that can exist in sources of citation data, including citation indexes. CA identified nine problems (Smith 1981, 91–93):

  1. Multiple authorship (how to allocate credit in multi-authored publications)

  2. Self-citations (whether or not to include in citation counts)

  3. Homographs (how to differentiate among authors with the same name)

  4. Synonyms (how to handle name variants for the same author)

  5. Types of sources (whether to limit analysis to journal articles or include other publication types)

  6. Implicit citations (how to handle mentions of a publication in the text omitted from the reference list)

  7. Fluctuations with time (how to choose a time interval for analyzing citations, when there may be large variations in citation counts from one year to another)

  8. Field variations (citation rates vary greatly in different fields, posing challenges for cross-discipline comparison)

  9. Errors (inaccuracies in cited author and publication data)

The implications of each of these potential problems will be addressed from the perspective of citation justice and developments since CA was published in 1981.

Multiple Authorship

A challenge in factoring publications with multiple authors into citation counts has been determining how to allocate credit. The Contributor Role Taxonomy (CRediT), approved in 2022 as an ANSI/NISO standard, is a fourteen-role taxonomy that can be used to describe the key types of contributions typically made to the production and publication of research outputs such as research articles. By fragmenting the scientific production process into clearly distinct tasks, CRediT was designed to go beyond the customary rules specific to name orderings in scientific publications (Larivière, Pontille, and Sugimoto 2021). Roles that can be identified include conceptualization, data curation, formal analysis, funding acquisition, investigation, methodology, project administration, resources, software, supervision, validation, visualization, writing—original draft, writing—review & editing. CRediT complements authorship guidelines by providing greater transparency regarding the specific inputs of each team member. Tove Godskesen, Gert Helgesson, and Stefan Eriksson (2025) caution that limited adoption of CRediT means that evaluation of author contributions may still emphasize author position (e.g., first author, last author) in multi-authored publications rather than role as identified by CRediT.
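Separate from the CRediT taxonomy, one long-standing bibliometric answer to the allocation problem is fractional counting, under which each of a publication's n authors receives 1/n of its citation credit. A minimal sketch, using a hypothetical data shape of (author list, citation count) pairs:

```python
from collections import defaultdict

def fractional_credits(papers):
    """papers: iterable of (author_list, citation_count) pairs.
    Under fractional counting, each author of an n-author paper
    receives 1/n of that paper's citations."""
    credits = defaultdict(float)
    for authors, cites in papers:
        share = cites / len(authors)
        for author in authors:
            credits[author] += share
    return dict(credits)

papers = [(["A", "B"], 10), (["A"], 5), (["B", "C", "D"], 9)]
# A: 10/2 + 5 = 10.0; B: 5 + 3 = 8.0; C and D: 3.0 each
```

Fractional counting treats all contributions as equal shares, so it sidesteps rather than answers the question CRediT raises of who did what.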

Self-Citations

John P. A. Ioannidis (2015) warns that, if the goal of self-citing is to increase an author’s citation counts, excessive self-citation practices may be highly misleading and may distort the scientific literature. There is also a gender gap in self-citation: Men are more likely to cite their own work than women (King et al. 2017). This gap suggests that there would be greater equity if self-citations were generally omitted from citation counts.
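Whether self-citations are included is a simple switch in any counting procedure, yet it changes the resulting metric directly. A minimal sketch, assuming a hypothetical record format of (citing-author set, cited-author set) pairs:

```python
def citation_count(records, author, exclude_self=True):
    """Count citations received by `author`.
    records: iterable of (citing_authors, cited_authors) set pairs.
    A citation is a self-citation if `author` appears on both sides."""
    total = 0
    for citing, cited in records:
        if author in cited:
            if exclude_self and author in citing:
                continue  # omit self-citations from the count
            total += 1
    return total

records = [({"A", "B"}, {"C"}), ({"C"}, {"C", "D"}), ({"B"}, {"C"})]
# C is cited three times, once by a paper C co-authored:
# excluding self-citations yields 2; including them yields 3
```

In a real database, resolving who "the author" is on each side is itself the homograph and synonym problem discussed in the next subsection.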

Homographs and Synonyms

A tool to assist in resolving problems with both homographs and synonyms in author names is the Open Researcher and Contributor ID (ORCID). Introduced in 2009, ORCID is a registry that associates a unique persistent identifier to scholars that connects their contributions across disciplines, borders, and time (Hernigou and Scarlat 2024). This approach addresses a number of challenges to ensuring that an author receives credit for all of their publications: Most personal names are not unique, can change, have different name orders in different cultures, use first-name abbreviations inconsistently, may vary due to different transliteration systems and approaches to handling accents and other diacritics, and may have multiple family names.

Types of Sources

Web of Science and Scopus, the major commercial citation databases, both cover journal, conference, and book content from a wide range of disciplines. They disproportionately index English-language publications, with social sciences and humanities still underrepresented in comparison to natural sciences and medical and health sciences. Toluwase Asubiaro, Sodiq Onaolapo, and David Mills (2024) completed a regional analysis of journals represented in Web of Science and Scopus across all eight UNESCO world regions compared to the total number of active Ulrich’s directory academic journals in those regions. They found that journals published in Europe, Oceania, and North America were much more likely to be indexed in Web of Science and Scopus compared to other world regions. Journals published in sub-Saharan Africa were the most underrepresented and four times less likely to be indexed than those published in Europe. Research of Global South scholars is thus less likely to appear in the main citation indexes, which are based in the Global North, and so citation analysis results may be incomplete for those scholars. As noted by Mills and Asubiaro (2024, 122), “Web of Science and Scopus are designed to support knowledge flows at a global scale, rather than nurture national research cultures and knowledge ecosystems.” The globalization of citation metrics undermines local research strength as researchers in the periphery are forced to choose between either the international recognition that comes with publishing in indexed journals or supporting regional research by publishing in local journals. Writing from the perspective of Taiwanese researchers, Chuing Prudence Chou, Hsiao Fang Lin, and Yun-Ju Chiu (2013) report that citation numbers and impact factors in Global North databases do not represent the research quality and social impact of Taiwanese scholars in the social sciences and the humanities. 
Negative consequences include the following: English-language publications have become more important than their Chinese-language counterparts; mainstream international issues, instead of local-regional issues, are highlighted; publishing in a foreign English-language journal has become a more prestigious accomplishment than in a local-regional journal; and books have been devalued and downgraded compared with journal articles. While Google Scholar is more expansive, covering journal and conference papers, theses and dissertations, academic books, preprints, abstracts, technical reports, and other scholarly literature from all broad areas of research, it provides no master list of indexed sources comparable to those available for Web of Science and Scopus. As an alternative to Web of Science and Scopus, Zehra Taşkın (2025) provides a review of the open citation movement, which seeks to make citation data freely accessible, downloadable, and reusable.

Implicit Citations, Fluctuations with Time, Field Variations

There is particular concern with applying bibliometric techniques well adapted to science, technology, engineering, and mathematics (STEM) fields to arts and humanities (Ochsner 2021). Unlike STEM, arts and humanities scholarship may be characterized by: (1) national or regional orientation and different languages of publication; (2) diverse range of publication types, especially books; (3) different pace of theoretical development with an important fraction of older citations; (4) often sole authorship; (5) more outputs directed at a non-scholarly public; and (6) citations that are not always explicit. More generally, the focus on metrics neglects important intangible and social benefits of arts and humanities research. Thelwall and Maria M. Delgado (2015) caution that artists produce a wide variety of types of outputs that are not naturally citable. Creativity is valued in the arts, and citation counts lack a straightforward connection to disciplinary research goals.

Errors

Errors persist in citation data. James H. Sweetland (1989) suggests that this persistent inaccuracy in citations may be related to a general lack of training in the norms and purposes of bibliographic citation. Jaime A. Teixeira da Silva (2024) highlights the frequency of errors in dealing with authors with compound family names, noting that misrepresentation of an author’s professional publishing name may distort their publishing record. An emerging source of errors, with as yet unknown impact, is the use of large language models (LLMs), which can introduce bias into citation data. Andres Algaba, Carmen Mazijn, Vincent Holst, Floriano Tori, Sylvia Wenmackers, and Vincent Ginis (2024) analyzed the characteristics and potential biases of references recommended by LLMs and observed an inherent bias toward generating references to previously highly cited papers, irrespective of other characteristics of the references. Enrique Orduña-Malea and Álvaro Cabezas-Clavijo (2023) advise both journals and publishers to be vigilant in order to prevent fake references created by ChatGPT and other applications of generative artificial intelligence (GenAI) from inclusion in published papers. They illustrate the risk with fake references found in preprint publications. Citation integrity errors, where statements from the cited paper are not accurately reflected in the citing article (including instances where the citing article misrepresents claims made in the cited paper), raise additional concerns (Sarol et al. 2024).

Approaches to Achieving Citation Justice

The overview published in Nature (Kwon 2022, 571) notes that achieving citation justice “requires a multi-pronged approach, with efforts to seek equity in all parts of the scholarly communication system.” Recognizing this, it is important to identify approaches that several different entities can pursue with the goal of working toward citation justice. This section includes discussion of author roles, pedagogical approaches, resource compilation, editor and reviewer roles, publisher roles, advocacy, recommendations for research evaluation reform, and higher education institutional roles.

Author Roles

Every individual author has a direct impact on the decisions made around citations (Poquet et al. 2024, 808), and these decisions are important because they may directly affect careers. Citation justice “encompasses an equitable recognition of the credibility of diverse voices and perspectives, particularly those from marginalised or underrepresented communities” (Dadze-Arthur and Mangai 2024, 336).

Examples of authors who overtly articulate their intent are Sara Ahmed (2017) and Catherine D’Ignazio and Lauren F. Klein (2020). In her book Living a Feminist Life, Ahmed does not cite any white men, instead citing those “who have contributed to the intellectual genealogy of feminism and antiracism” (2017, 15). D’Ignazio and Klein’s book Data Feminism includes an appendix documenting the extent to which their citations reached their aspirational metrics related to the structural problems of racism, patriarchy, cissexism, heteronormativity, ableism, colonialism, classism, and proximity. Januschka Schmidt (2022, 178) notes that critical citation practices should entail engaging with scholars from underrepresented groups in the literature review as early as possible as well as considering knowledge from outside of academia. Lori Wright, Neisha Wiley, Elizabeth VanWassenhove, Brandelyn Tosolt, Rae Loftis, and Meg L. Hensley (2022) articulate feminist citational praxis as intentionally citing authors of color, women, transgender, and nonbinary scholars as well as disabled authors/authors with disabilities.

A number of journals encourage or expect authors to include a citation diversity statement (CDS). For example, all Biomedical Engineering Society (BMES) journals encourage authors to include a CDS within their manuscripts to work towards correcting citation imbalances in the biomedical engineering literature (Rowson et al. 2021). A CDS is a paragraph placed before the reference section of a manuscript in which the authors address the diversity and equity of their references in terms of gender, race, ethnicity, or other factors and affirm a commitment to promoting diversity and equity in sources and references (Ray et al. 2022). The statement typically reports the percentages of different diversity categories for first and last authors, the method used to determine these percentages, and the method’s limitations. The Gender Balance Assessment Tool (Sumner 2018), a web-based tool for estimating gender balance in bibliographies, has been designed to make the task easier, with the aim of remedying the gender gap. Perry Zurn, Danielle S. Bassett, and Nicole C. Rust (2020) note that the CDS is one way to increase awareness of and mitigate citation bias. Ultimately, what is more important than a CDS itself is the reflexive process undertaken by the author (see, for example, Okune 2019).

Cana Uluak Itchuaqiyaq (2025) demonstrates unconventional citation methods to challenge traditional citation practices that can flatten and erase the contributions of cited authors. Techniques can include adding scholarly identity markers (e.g., race, gender) to every author cited, employing annotated string citations explaining how the author is building on each of the cited texts, and listing all author names for every in-text citation to prevent erasure.

Pedagogical Approaches

Awareness of and engagement with citation justice can be supported through workshops and courses taught by librarians and writing instructors that include attention to citation practice. Jodi H. Coalter (2023) argues that as research and citation experts, librarians are in an excellent position to support citation justice by introducing new researchers to the practice and teaching about tools to help writers track and manage citations. Librarians’ expertise includes knowledge of when and how to cite, how to find relevant publications that include diverse authors, and familiarity with citation management. Audiences for courses and workshops may include undergraduate students, graduate students, and faculty. Coalter provides examples of content to be covered at each level. Liz Chenevey (2023, 152) also advocates for librarians to empower “students in their engagement with citations, particularly through a critical lens” and has developed a workshop on the politics of citation. Recognizing that those involved in academic decision-making should be “metrics literate,” with an understanding of how metrics are derived as well as basic knowledge of their strengths and weaknesses, Lauren A. Maggio, Alyssa Jeffrey, Stefanie Haustein, and Anita Samuel (2022) analyzed available educational videos about the h-index in order to provide recommendations for future educational initiatives.

Co-authors Kylie E. Quave and Savannah Hagen Ohbi (2024, 39), respectively a writing teacher and a student, promote teaching and learning what they term a “joyful” citation praxis. Instead of focusing on a citation practice primarily motivated by plagiarism fears, they advocate discussion of “more imaginative, inclusionary, joy-generating reasons for attribution.” Rather than having students view citation as a compulsory act under threat of sanction, they should be encouraged to view it as an opportunity for community and dialogue, purposefully including certain voices. In the same spirit, Teaching Citational Practice (TCP) (n.d.) is an open-access resource for higher education instructors interested in strategies for teaching research and citation. It documents teaching approaches of instructors from across academic disciplines and institutions. TCP aims to develop and promote citational practices that legitimize excluded, overlooked, and non-traditional sources and scholarship. L. C. Santangelo (2023) describes a lesson that reconceptualizes citation practices as an engaging component of “feminist memory” instead of an obligatory chore. Christa Craven (2021, 122) describes teaching antiracist citational politics inspired by the Cite Black Women movement, which advocates citing Black women as a starting point for broader inclusion of indigenous scholars, Black men, other people of color, scholars outside of the United States, and queer, trans, and nonbinary voices.

Resource Compilation

Resources developed to support citation justice can be used for direct instruction or self-study. Librarians have developed LibGuides on citation justice and related topics, identifiable through a search of the LibGuides Community (Springshare n.d.). For guides related to research impact, Sheila Craft-Morgan (2024) emphasizes the importance of sharing information about diversity, equity, inclusion, and justice (DEIJ) as they relate to impact and efforts to mitigate bias.

Citation guides can supplement existing style manuals. Aurélie Carlier, Hang Nguyen, Lidwien Hollanders, Nicole Basaraba, Sally Wyatt, and Sharon Anyango (2022) discuss the development of the UM Citation Guide: A Guide by FEM (Carlier et al. 2021) to draw attention to the ways in which citations can be used to make more visible the contributions of women and other underrepresented groups in the production of knowledge. There are also efforts to fill in gaps in existing style manuals. For example, Lorisia MacLeod (2021) developed templates to better account for oral teachings, citing Indigenous Elders and Knowledge Keepers together with their nation/community and other applicable information, such as where they live and a brief description or title of the teaching. Such templates can be included in institutional citation guides. S. V. Chetan (2024) advocates challenging style manuals such as the Publication Manual of the American Psychological Association, which, unlike the MLA Handbook and The Chicago Manual of Style, prescribes the use of last names and initials rather than full names, thus obscuring the gender of authors. Amber Lancaster and Carie S. Tucker King (2025) express concern with APA guidance that uses “et al.” for in-text citations of publications by three or more authors and uses an ellipsis to replace author names in reference lists when there are more than twenty co-authors. These practices render some authors invisible. They call for empowering authors and editors to develop new approaches to ensure authorship credit.

Resources have been developed to facilitate identification of female and underrepresented authors. The UM Citation Guide (Carlier et al. 2021, 7) includes a list of resources to locate women researchers in a range of fields. Itchuaqiyaq and Jordan Frith (2022) discuss developing the multiply marginalized and underrepresented (MMU) scholar database to assist in locating self-identified MMU scholars. Under “How to Diversify Your Citations,” Andrea Miller-Nesbitt (2025) provides an extensive list of resources for finding diverse sources (interdisciplinary, subject specific, additional) as well as a list of academic databases for finding diverse sources (Indigenous, regional, international).

The Framework for Open and Reproducible Research Training (FORRT) (Sauvé et al. 2025, 2) has developed a citational justice toolkit to promote equitable scholarship by curating “actionable resources, tools, and practices helping scholars and institutions to audit, diversify, and reflect on their citation practices across the research cycle.” The toolkit features tools to use in project planning, while reading, while writing, and at the publication stage. The motivation for compiling such a toolkit is “to demonstrate that citational justice is not peripheral to good scholarship, but central to research integrity, accountability, and inclusivity” (Sauvé et al. 2025, 23). FORRT (2025) has also produced a video providing a walkthrough of the toolkit.

Professional associations have also sought to create position statements and guidelines to foster citation justice. The Society for Personality and Social Psychology has developed “Guidelines for Promoting Inclusive Citing Practices” (2025) with five recommendations, such as including an annotated references section that explains why each cited work was included in the paper. The Conference on College Composition and Communication, the world’s largest professional organization for researching and teaching composition, issued a “Position Statement on Citation Justice in Rhetoric, Composition, and Writing Studies” (2022) to encourage scholars to engage in citation justice in all areas of scholarly production. It provides guidance on practicing citation justice and a compilation of relevant resources.

Editor and Reviewer Roles

Editors and reviewers can support citation justice in a number of ways. An editorial in American Political Science Review (2023) has multiple recommendations, including broadening the reviewer pool, explicitly requesting that reviewers be attuned to citation biases, and alerting authors to relevant research that may be missing from bibliographies. Schmidt (2022) cautions against biases in the peer review process (e.g., non-anglophone names, author’s gender, institutional affiliation) that can contribute to the exclusion of scholars from underrepresented groups, including holding work by and about such groups to higher standards. The Society for Personality and Social Psychology’s Anti Colorism/Eurocentrism in Methods and Practices (ACEMAP) task force (Ledgerwood et al. 2024) provides nine recommendations and example resources for editors and reviewers to encourage inclusiveness in journals, including removing obstacles to improved citation practices. Abena Dadze-Arthur and Mary S. Mangai (2024, 336) recommend that editors generally provide authors with guidance and tools that consider citational inequities in their work, including preparation of citation diversity statements.

Debra Z. Basil, Suzan Burton, Alena Soboleva, and Paul Nesbit (2023) advocate training about reviewer conflict of interest, warn about coercive citation requests, suggest self-engagement through an honor code, and provide a review of recommendations to add citations. Additional recommendations to discourage coercive citations include the need for a clearer definition of standards for reviewer citation suggestions and author responses to them (McLeod 2021, 285). Mattiazzi and Vila-Petroff (2024, 3) suggest that journal editors consider including reference list reviewers, similar to the employment of special statistical and technical reviewers.

Publisher Roles

The Committee on Publication Ethics (COPE) is a resource for guidance to publishers on topics related to citation justice, including two position statements: “Editors Requiring Authors to Cite Papers in Their Journal” (“Editors can suggest relevant citations to improve articles but should avoid mandating citations to their own work or journal. Additional citations shouldn’t determine acceptance” [COPE 2024a]) and “Handling Citation Manipulation” (“When references do not contribute to the scholarly content of the article and are included only to increase citations of the author or specific peers or journals” [COPE 2024b]). Mina Mehregan and Mohammad Moghiman (2024) recommend that publishers scrutinize all citations suggested by reviewers to ensure research integrity.

The Royal Society of Chemistry (2023) has brought together fifty-six publishing organizations to ensure a more inclusive and diverse culture within scholarly publishing. Beginning work in June 2020, the Joint Commitment for Action on Inclusion and Diversity in Publishing has launched a set of standardized questions for self-reported diversity data collection and has developed minimum standards for inclusion and diversity in scholarly publishing. As this effort moves forward, authors, reviewers, and editors should expect to be invited to respond to a standard set of questions regarding gender, race, and ethnicity whenever they send their papers to journals and when they review or edit manuscripts (Else and Perkel 2022).

Decisions to make a publication open access can contribute to greater visibility of research, particularly from the Global South. Open-access research outputs generally receive more diverse citations from institutions, countries, subregions, regions, and fields of research (Huang et al. 2024). Latin America is a leader in non-profit open-access journals (Moutinho 2024), with the Scientific Electronic Library Online (SciELO) hosting more than a thousand titles. To be part of SciELO’s collection, journals must meet a series of quality criteria, such as keeping an editorial board of experts in the journal’s subject area, publishing on a regular schedule, identifying authors with an ORCID, and having abstracts in English. Geographic representation in SciELO includes fourteen Latin American and Caribbean countries, Portugal and Spain, and South Africa. Liz Allen and Elizabeth Marincola (2020) suggest that researchers in the Global South have effectively gained a position to leapfrog over much of the legacy system of scholarly publishing and pursue new models. They describe the African Academy of Sciences initiative, which is now Open Research Africa, an innovative open-access platform to enable researchers to publish rapidly without barriers and with the benefit of transparent peer review, making African research freely available and usable. Included are scholarly articles and other research outputs (e.g., posters, slides, and documents) reporting basic, applied, translational, and clinical research findings. New modes and outlets for sharing research outputs are reducing the practical barriers for researchers from across the globe who wish to share their findings and participate in a more connected and open science system.

Advocacy

Advocacy efforts have raised the issue of citation justice and offer recommendations to achieve greater recognition for authors who may be disadvantaged by gender, race, geographic location, or discipline. Amanda Murdie (2018, 347) proposes that the process to end the gender citation gap can benefit from the study of international relations, specifically from the methods that advocates and entrepreneurs use to develop international norms. She outlines strategies “to build a critical mass of scholars who understand the problem and are working to alleviate it” (Murdie 2018, 347).

Christen A. Smith (2021) started Cite Black Women in November 2017. This global campaign seeks to center Black women’s ideas and intellectual contributions, using citation as a practice to engage with the voices that are often silenced or left behind (Smith et al. 2021). Authors are encouraged to critically and actively reflect on how gender, race, nationality, and class shape the possibilities of knowledge production. Rather than “haphazardly inserting sources into bibliographies,” authors “must deeply engage with Black women’s ideas and experiences” (Smith and Garrett-Scott 2021, 33).

Advocacy for scholars in the Global South argues that increased citation of Global South publications as well as going beyond citation-based metrics should be primary criteria for assessing research. Citational Justice Collective et al. (2023, 16) offer several ideas to contribute to a more equitable and fair citation of work beyond traditional geographical regions. These include searching proactively for research conducted in the Global Souths by “attending conferences organized by or located within the Global Souths, publishing in Global Souths open access journals, or using search engine filters to look for research in particular regions” (Citational Justice Collective et al. 2023, 16). To create a more equitable publishing environment for research outside of core anglophone countries, Shannon Mason and Margaret Merga (2021) argue for citing articles beyond those in the “top” journals because lower impact does not necessarily mean lower quality. Citations can also identify articles from diverse contexts and include works in languages other than English. A related effort is the Helsinki Initiative on Multilingualism in Scholarly Communication (2019) to protect national infrastructures for publishing locally relevant research and to promote language diversity in research assessment, evaluation, and funding systems. Any metrics-based systems should ensure that publications in all languages are adequately considered.

Turning attention to how to assess research excellence in the Global South, Julie Shi (2023, 26) notes the importance of locally adaptable governance policies for publishing, access, and academic evaluation to foster “bibliodiversity.” In Transforming Research Excellence: New Ideas from the Global South, Erika Kraemer-Mbula, Robert Tijssen, Matthew L. Wallace, and Robert McLean (2020, 8) observe that citation ranking is not very helpful for capturing scientific performance that addresses local issues or problems. They recommend measures able to capture multiple dimensions related to social value. “Such performance criteria also depend on geography – the location where the science is done, and where the primary users and potential beneficiaries of scientific findings are to be found” (Kraemer-Mbula et al. 2020, 8).

Disciplinary differences must also be considered. The arts and humanities have distinct patterns of publication that should be reflected in the approach to research assessment (Ochsner et al. 2016). Björn Hammarfelt (2016) argues that to evaluate humanities researchers, they should be actively consulted to identify the time spans and types of publications appropriate for their disciplines. In computer science, Neha Kumar and Naveena Karusala (2021, 7) have issued “a call for collective action” in order to “break the culture of unnecessary silence around citations.” Citational Justice Collective et al. (2021) describe a conference workshop on Braving Citational Justice Together for researchers in human-computer interaction (HCI) and computer-supported cooperative work (CSCW) to collectively reflect upon and to improve citation justice.

Recommendations for Research Evaluation Reform

Alexander Rushforth and Hammarfelt (2023, 892) observe that since the late 2010s the responsible metrics reform movement has shifted from the more specific focus on appropriate uses of bibliometrics into a wider framing of responsible research assessment. Consistent with that observation, over the past decade influential initiatives and reports have emphasized the limitations of quantitative indicators in research assessment and argued for their use to support rather than replace expert judgment as the primary evidence of research quality or value.

The Leiden Manifesto for Research Metrics (Hicks et al. 2015, 430–31) enumerates ten principles for guiding research evaluation to combat the misuse of bibliometrics. It is intended as a “distillation of best practice in metrics-based research assessment” (Hicks et al. 2015, 430) that emphasizes detailed evaluation of research rather than excessive use of quantitative data. Examples of these principles include recognition of locally relevant research, adjustment for variations by field in publication and citation practices, the addition of a qualitative judgment of a researcher’s portfolio, and correction of possible systemic biases in assessments.

The Coalition for Advancing Research Assessment (CoARA) is a collective of organizations committed to reforming the methods and processes by which research, researchers, and research organizations are evaluated. Over seven hundred organizations have agreed on a common direction and guiding principles as outlined in their Agreement on Reforming Research Assessment (CoARA 2022). This agreement requires assessment to be based primarily on qualitative judgment, for which peer review is central, supported by responsible use of quantitative indicators.

The San Francisco Declaration on Research Assessment (DORA) (2012) is a statement that criticizes the practice of correlating the journal impact factor to the merits of a specific researcher’s contributions. The declaration outlines recommendations for funding agencies, institutions, publishers, and organizations that supply metrics, as well as for researchers. Themes include the need to eliminate the use of journal-based metrics in funding, appointment, and promotion decisions; the need to assess research on its own merits rather than on the basis of the journal in which it is published; and the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles). DORA sponsors Project TARA (Tools to Advance Research Assessment) (2024), a project to “facilitate the development of new policies and practices for academic career assessment.” In related work, Jennifer S. Trueblood et al. (2025, 7) present “a set of perspectives on changing the publish-or-perish culture by reforming academic evaluation.”

Higher Education Institutional Roles

At the institutional level, Davies et al. (2021) advocate for the use of inclusive metrics of success and impact to dismantle what they see as a discriminatory reward system. They observe that citation counts continue to be valued despite their evident gender and racial biases and argue strongly for a more holistic assessment that takes into account a range of contributions such as pedagogy, community engagement, science communication, and mentoring.

Widening the scope of activities worthy of academic recognition and reward will depend on how well institutions serve as the “crucibles of innovation” (Moher et al. 2018, 16) and continue to be the model for others. A new tool to aid in this process is Reformscape, sponsored by DORA (Owens 2024) and designed to inspire university leaders by collecting examples of responsible career assessment. This searchable collection of criteria and standards for hiring, review, promotion, and tenure seeks to support the development of new policies and practices for responsible research assessment (RRA). Such assessments reflect, incentivize, and reward the plural characteristics of high-quality research in support of diverse and inclusive research cultures.

Conclusion

This paper posed two questions: How can citation analysis be more just? How can research evaluation go beyond citation analysis to be more just? Using the lens of citation justice in the review of assumptions underlying citation analysis and problems posed in dealing with citation data goes well beyond the critical analysis first undertaken in CA. The subsequent discussion of approaches to citation justice identifies many components in the scholarly communication process that can make citation analysis more just: authors, editors and reviewers, publishers, librarians, writing instructors, resource developers, and other advocates. This work involves engagement with a wider range of scholarship while resisting attempts at citation manipulation.

A recent paper by Bornmann and Christian Leibel (2025) discusses citation accuracy, citation noise, and citation bias. They explain that in the research evaluation context, “citation accuracy refers to the precise attribution of knowledge flow from cited to citing papers, ensuring that each citation reflects a genuine intellectual contribution” (Bornmann and Leibel 2025, 37). The application of citation analysis in research evaluation processes presupposes that the accuracy of the citation decisions is high. Both citation bias and citation noise can undermine the validity of using citation analysis in a research evaluation context. The citation justice movement as discussed in this paper seeks to reduce bias. Resistance to citation manipulation and other reasons for citing that do not reflect actual knowledge flow, such as the example of perfunctory citations to “As We May Think” noted at the beginning of this paper, can reduce citation noise.

Citation justice inspires authors to reflect on how they cite, whom they cite, and why they cite. Bornmann and Leibel (2025, 33) advocate creating a citation justification table appended to each paper in which the author indicates why a certain publication was cited at a certain point in the manuscript, i.e., the knowledge flow for which a particular citation was inserted in the publication. While efforts at research evaluation reform and an institutional focus on responsible research assessment can make research evaluation more just, the responsibility for improving research evaluation also lies with authors. As Perry Zurn, Erin G. Teich, Samantha C. Simon, Jason Z. Kim, and Dani S. Bassett (2022, 3) observe, “citations are trails of where the author has been and trails of where the reader might go.”

The publication of CA in 1981 represented my initial effort to provide a critical assessment of citation analysis as a method. More than forty years later, as the citation justice and responsible research assessment movements demonstrate, citation still matters, but there is a better understanding of how to improve both the practice of citation and citation analysis. In summary, citation justice shows that

Valuing, engaging, and being in conversation with the contributions of a broader array of scholars ultimately improves the intellectual rigor of our research, the health of higher education institutions, the strength of our professional associations and journals, and the creation of a vibrant intellectual community. In these ways, pursuing citational justice benefits not only the women, people of color, LGBTQ+ scholars, and scholars in the Global South whose work will be more robustly engaged but also will benefit knowledge production and the academy more generally. (American Political Science Review 2023, vii)

Citation justice resonates with open science’s core values of quality and integrity, collective benefit, equity and fairness, and diversity and inclusiveness (UNESCO 2021, 17). As support for open science increases, there is increased potential for more diverse scholars to be part of the scholarly conversation that underlies citation analysis and for research evaluations that are not dependent solely on citation counts.

Acknowledgements

The author wishes to thank Dr. Janaynne Carvalho do Amaral, special issue editor Sally Wyatt, and the two reviewers for their valuable comments on earlier drafts of this paper.

References

Ahmed, Sara. 2017. Living a Feminist Life. Duke University Press.

Aksnes, Dag W., Liv Langfeldt, and Paul Wouters. 2019. “Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories.” Sage Open 9 (1): 1–17. https://doi.org/10.1177/2158244019829575.

Algaba, Andres, Carmen Mazijn, Vincent Holst, Floriano Tori, Sylvia Wenmackers, and Vincent Ginis. 2024. “Large Language Models Reflect Human Citation Patterns with a Heightened Citation Bias.” Preprint, last revised August 24, 2024. https://doi.org/10.48550/arXiv.2405.15739.

Allen, Liz, and Elizabeth Marincola. 2020. “Rethinking Scholarly Publishing: How New Models Can Facilitate Transparency, Equity, Efficiency and the Impact of Science.” In Transforming Research Excellence: New Ideas from the Global South, edited by Erika Kraemer-Mbula, Robert Tijssen, Matthew L. Wallace, and Robert McLean. African Minds. https://library.oapen.org/handle/20.500.12657/23441.

Aman, Valeria, and Jochen Gläser. 2025. “Investigating Knowledge Flows in Scientific Communities: The Potential of Bibliometric Methods.” Minerva 63 (1): 155–82. https://doi.org/10.1007/s11024-024-09542-2.

American Political Science Review. 2023. “Notes from the Editors: Citation Matters.” American Political Science Review 117 (1): v–viii. https://doi.org/10.1017/S0003055422001368.

Asubiaro, Toluwase, Sodiq Onaolapo, and David Mills. 2024. “Regional Disparities in Web of Science and Scopus Journal Coverage.” Scientometrics 129 (3): 1469–91. https://doi.org/10.1007/s11192-024-04948-x.

Baffour, Tiffany, Myra Garcia, and Mindi Rich. 2024. “Advancing the Grand Challenge to Eliminate Racism: A Call to Action for Citational Justice in Social Work.” Journal of Ethnic & Cultural Diversity in Social Work 34 (5–6): 275–85. https://doi.org/10.1080/15313204.2024.2406257.

Basil, Debra Z., Suzan Burton, Alena Soboleva, and Paul Nesbit. 2023. “Coercive Citation: Understanding the Problem and Working Toward a Solution.” Academy of Management Perspectives 37 (3): 205–19. https://doi.org/10.5465/amp.2022.0081.

Biagioli, Mario, and Alexandra Lippman. 2020. “Introduction: Metrics and the New Ecologies of Academic Misconduct.” In Gaming the Metrics: Misconduct and Manipulation in Academic Research, edited by Mario Biagioli and Alexandra Lippman. The MIT Press. https://doi.org/10.7551/mitpress/11087.001.0001.

Bonner, Fred A., II, and Barbara L. Garcia-Powell. 2022. “#BlackCitesMatter: Foregrounding a Citational Justice Movement.” Diverse: Issues in Higher Education 39 (14): 12–13. https://www.diverseeducation.com/opinion/article/15297041/blackcitesmatter-foregrounding-a-citational-justice-movement.

Bornmann, Lutz. 2016. “Scientific Revolution in Scientometrics: The Broadening of Impact from Citation to Societal.” In Theories of Informetrics and Scholarly Communication: A Festschrift in Honor of Blaise Cronin, edited by Cassidy R. Sugimoto. De Gruyter Saur. https://doi.org/10.1515/9783110308464-020.

Bornmann, Lutz, and Hans-Dieter Daniel. 2008. “What Do Citation Counts Measure? A Review of Studies on Citing Behavior.” Journal of Documentation 64 (1): 45–80. https://doi.org/10.1108/00220410810844150.

Bornmann, Lutz, and Christian Leibel. 2025. “Citation Accuracy, Citation Noise, and Citation Bias: A Foundation of Citation Analysis.” Preprint, August 18, 2025. https://doi.org/10.48550/arXiv.2508.12735.

Bruton, Samuel V., Alicia L. Macchione, Mitch Brown, and Mohammad Hosseini. 2025. “Citation Ethics: An Exploratory Survey of Norms and Behaviors.” Journal of Academic Ethics 23 (2): 329–46. https://doi.org/10.1007/s10805-024-09539-2.

Burbules, Nicholas C. 2015. “The Changing Functions of Citation: From Knowledge Networking to Academic Cash-Value.” Paedagogica Historica 51 (6): 716–26. https://doi.org/10.1080/00309230.2015.1051553.

Bush, Vannevar. 1945. “As We May Think.” The Atlantic Monthly 176 (1): 101–8. https://cdn.theatlantic.com/media/archives/1945/07/176-1/132407932.pdf.

Carlier, Aurélie, Hang Nguyen, Lidwien Hollanders, Nicole Basaraba, Sally Wyatt, and Sharon Anyango. 2021. UM Citation Guide: A Guide by FEM. Maastricht University. https://fasos.maastrichtuniversity.nl/weekly/wp-content/uploads/2022/03/um_citation_guide_fem_2022.pdf.

Carlier, Aurélie, Hang Nguyen, Lidwien Hollanders, Nicole Basaraba, Sally Wyatt, and Sharon Anyango. 2022. “Aspirational Metrics: A Guide for Working Towards Citational Justice.” LSE Impact Blog. https://blogs.lse.ac.uk/impactofsocialsciences/2022/05/16/aspirational-metrics-a-guide-for-working-towards-citational-justice/. Archived at: https://perma.cc/3MR8-6SDV.

Chakravartty, Paula, Rachel Kuo, Victoria Grubbs, and Charlton McIlwain. 2018. “#CommunicationSoWhite.” Journal of Communication 68 (2): 254–66. https://doi.org/10.1093/joc/jqy003.

Chenevey, Liz. 2023. “Teaching the Politics of Citation: Challenging Students’ Perceptions.” College & Research Libraries News 84 (5): 152–57. https://crln.acrl.org/index.php/crlnews/article/view/25887/33825.

Chetan, S. V. 2024. “Women and Queer Researchers Cited, But Not in Sight: Rethinking APA Citation Style.” Journal of International Women’s Studies 26 (5): Article 13. https://vc.bridgew.edu/jiws/vol26/iss5/13.

Chou, Chuing Prudence, Hsiao Fang Lin, and Yun-Ju Chiu. 2013. “The Impact of SSCI and SCI on Taiwan’s Academy: An Outcry for Fair Play.” Asia Pacific Education Review 14 (1): 23–31. https://doi.org/10.1007/s12564-013-9245-1.

Citational Justice Collective, Gabriela Molina León, Lynn Kirabo, Marisol Wong-Villacrés, Naveena Karusala, Neha Kumar, Nicola Bidwell, Pedro Reynolds-Cuéllar, Pranjal Protim Borah, Radhika Garg, Sushil K. Oswal, Tee Chuanromanee, and Vishal Sharma. 2021. “Following the Trail of Citational Justice: Critically Examining Knowledge Production in HCI.” In CSCW ’21 Companion: Companion Publication of the 2021 Conference on Computer Supported Cooperative Work and Social Computing, edited by Jeremy Birnholtz, Luigina Ciolfi, Sharon Ding, Susan Fussell, Andrés Monroy-Hernández, Sean Munson, Irina Shklovski, and Mor Naaman. Association for Computing Machinery. https://doi.org/10.1145/3462204.3481732.

Citational Justice Collective, Syed Ishtiaque Ahmed, Sareeta Amrute, Jeffrey Bardzell, Shaowen Bardzell, Nicola Bidwell, Tawanna Dillahunt, Sane Gaytán, Naveena Karusala, Neha Kumar, Rigoberto Lara Guzmán, Maryam Mustafa, Bonnie Nardi, Lisa Nathan, Nassim Parvin, Beth Patin, Pedro Reynolds-Cuéllar, Rebecca Rouse, Katta Spiel, Soraia Silva Prietch, Ding Wang, and Marisol Wong-Villacrés. 2022. “Citational Justice and the Politics of Knowledge Production.” Interactions 29 (5): 78–82. https://doi.org/10.1145/3556549.

Citational Justice Collective, Amy Ogan, Frederick van Amstel, Gabriela Molina León, Juan Fernando Maestre, Kristin Williams, Nicola J. Bidwell, Pedro Reynolds-Cuéllar, Saiph Savage, Sushil Oswal, and Vishal Sharma. 2023. “Why Do We Need to Learn about Citational Practices? Recognizing Knowledge Production from the Global Souths and Beyond.” XRDS: Crossroads, The ACM Magazine for Students 29 (3): 12–17. https://doi.org/10.1145/3589256.

Coalition for Advancing Research Assessment (CoARA). 2022. “Agreement on Reforming Research Assessment.” https://coara.eu/agreement/the-agreement-full-text/.

Coalter, Jodi H. 2023. “Citation Power: Overcoming Marginalization One Citation at a Time.” In Perspectives on Justice, Equity, Diversity, and Inclusion in Libraries, edited by Nandita S. Mani, Michelle A. Cawley, and Emily P. Jones. IGI Global. https://doi.org/10.4018/978-1-6684-7255-2.ch004.

Committee on Publication Ethics (COPE). 2024a. “COPE Position: Editors Requiring Authors to Cite Papers in Their Journal.” https://publicationethics.org/guidance/cope-position/editors-requiring-authors-cite-papers-their-journal.

Committee on Publication Ethics (COPE). 2024b. “COPE Position: Handling Citation Manipulation.” https://publicationethics.org/guidance/cope-position/handling-citation-manipulation.

Conference on College Composition and Communication. 2022. “Position Statement on Citation Justice in Rhetoric, Composition, and Writing Studies.” https://cccc.ncte.org/cccc/citation-justice.

Craft-Morgan, Sheila. 2024. “Citational Justice: How Librarians Can Improve Equity in Measuring Research Impact.” American Libraries 55 (6): 54. https://americanlibrariesmagazine.org/2024/06/03/citational-justice/.

Craven, Christa. 2021. “Teaching Antiracist Citational Politics as a Project of Transformation: Lessons from the Cite Black Women Movement for White Feminist Anthropologists.” Feminist Anthropology 2 (1): 120–29. https://doi.org/10.1002/fea2.12036.

Cronin, Blaise, and Cassidy R. Sugimoto, eds. 2015. Scholarly Metrics Under the Microscope: From Citation Analysis to Academic Auditing. Information Today.

Dadze-Arthur, Abena, and Mary S. Mangai. 2024. “The Journal and the Quest for Epistemic Justice.” Public Administration and Development 44 (4): 326–41. https://doi.org/10.1002/pad.2064.

Davies, Sarah W., Hollie M. Putnam, Tracy Ainsworth, Julia K. Baum, Colleen B. Bove, Sarah C. Crosby, Isabelle M. Côté, Anne Duplouy, Robinson W. Fulweiler, Alyssa J. Griffin, Torrance C. Hanley, Tessa Hill, Adriana Humanes, Sangeeta Mangubhai, Anna Metaxas, Laura M. Parker, Hanny E. Rivera, Nyssa J. Silbiger, Nicola S. Smith, Ana K. Spalding, Nikki Traylor-Knowles, Brooke L. Weigel, Rachel M. Wright, and Amanda E. Bates. 2021. “Promoting Inclusive Metrics of Success and Impact to Dismantle a Discriminatory Reward System in Science.” PLOS Biology 19 (6): e3001282. https://doi.org/10.1371/journal.pbio.3001282.

Dawson, Emily, Mehita Iqani, and Simon Lock. 2024. “Why Should We Think About Social Justice in Science Communication?” Journal of Science Communication 23 (04): E. https://doi.org/10.22323/2.23040501.

Declaration on Research Assessment (DORA). 2012. “San Francisco Declaration on Research Assessment.” https://sfdora.org/read/.

D’Ignazio, Catherine, and Lauren F. Klein. 2020. Data Feminism. The MIT Press. https://doi.org/10.7551/mitpress/11805.001.0001.

Dworkin, Jordan, Perry Zurn, and Danielle S. Bassett. 2020. “(In)citing Action to Realize an Equitable Future.” Neuron 106 (6): 890–94. https://doi.org/10.1016/j.neuron.2020.05.011.

Egghe, Leo. 2010. “The Hirsch Index and Related Impact Measures.” Annual Review of Information Science and Technology 44: 65–114. https://doi.org/10.1002/aris.2010.1440440109.

Else, Holly, and Jeffrey M. Perkel. 2022. “The Giant Plan to Track Diversity in Research Journals.” Nature 602 (7898): 566–70. https://www.nature.com/articles/d41586-022-00426-7.

Fowler, Marsha. 2020. “Citation Justice.” Nursing Inquiry 27 (1): e12331. https://doi.org/10.1111/nin.12331.

Framework for Open and Reproducible Research Training (FORRT). 2025. “FORRT’s Citation Politics Toolkit—A Walkthrough.” YouTube video, 18:36. https://www.youtube.com/watch?v=HuQEmrME6uk.

Garfield, Eugene. 1999. “Journal Impact Factor: A Brief Review.” Canadian Medical Association Journal 161 (8): 979–80. https://www.cmaj.ca/content/161/8/979.short.

Gingras, Yves. 2016. Bibliometrics and Research Evaluation: Uses and Abuses. The MIT Press. https://doi.org/10.7551/mitpress/10719.001.0001.

Godskesen, Tove, Gert Helgesson, and Stefan Eriksson. 2025. “Implementation, Barriers, and Improvement Strategies for CRediT: A Scoping Review.” Accountability in Research, 1–22. https://doi.org/10.1080/08989621.2025.2528953.

Gomez, Charles J., Andrew C. Herman, and Paolo Parigi. 2022. “Leading Countries in Global Science Increasingly Receive More Citations Than Other Countries Doing Similar Research.” Nature Human Behaviour 6: 919–29. https://www.nature.com/articles/s41562-022-01351-5.

Hammarfelt, Björn. 2016. “Beyond Coverage: Toward a Bibliometrics for the Humanities.” In Research Assessment in the Humanities: Towards Criteria and Procedures, edited by Michael Ochsner, Sven E. Hug, and Hans-Dieter Daniel. Springer. https://doi.org/10.1007/978-3-319-29016-4.

“Helsinki Initiative on Multilingualism in Scholarly Communication.” 2019. Helsinki: Federation of Finnish Learned Societies, Committee for Public Information, Finnish Association for Scholarly Publishing, Universities Norway & European Network for Research Evaluation in the Social Sciences and the Humanities. https://www.helsinki-initiative.org/.

Hernigou, Philippe, and Marius M. Scarlat. 2024. “Unlocking Citation Obsession and Academic Identity: The Importance of ORCID Numbers.” International Orthopaedics 48 (1): 1–3. https://doi.org/10.1007/s00264-023-06069-1.

Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. “Bibliometrics: The Leiden Manifesto for Research Metrics.” Nature 520 (7548): 429–31. https://doi.org/10.1038/520429a.

Hirsch, J. E. 2005. “An Index to Quantify an Individual’s Scientific Research Output.” Proceedings of the National Academy of Sciences 102 (46): 16569–72. https://doi.org/10.1073/pnas.0507655102.

Hovland, Ingie, and Britt Halvorson. 2024. “Problems of Citation in the Study of Religion: Who Do We Cite and Why?” Studies in Religion/Sciences Religieuses 53 (2): 167–84. https://doi.org/10.1177/00084298241245663.

Huang, Chun-Kai, Cameron Neylon, Lucy Montgomery, Richard Hosking, James P. Diprose, Rebecca N. Handcock, and Katie Wilson. 2024. “Open Access Research Outputs Receive More Diverse Citations.” Scientometrics 129 (2): 825–45. https://doi.org/10.1007/s11192-023-04894-0.

Ibrahim, Hazem, Fengyuan Liu, Yasir Zaki, and Talal Rahwan. 2025. “Citation Manipulation Through Citation Mills and Pre-print Servers.” Scientific Reports 15: Article 5480. https://doi.org/10.1038/s41598-025-88709-7.

Ioannidis, John P. A. 2015. “A Generalized View of Self-Citation: Direct, Co-Author, Collaborative, and Coercive Induced Self-Citation.” Journal of Psychosomatic Research 78 (1): 7–11. https://doi.org/10.1016/j.jpsychores.2014.11.008.

Itchuaqiyaq, Cana Uluak. 2025. “Citational Checkup for an Antiracist, Justice-Oriented Field.” In The Routledge Handbook of Social Justice in Technical and Professional Communication, edited by Natasha N. Jones, Laura Gonzales, Angela M. Haas, and Miriam F. Williams. Routledge. https://doi.org/10.4324/9781003455158.

Itchuaqiyaq, Cana Uluak, and Jordan Frith. 2022. “Citational Practices as a Site of Resistance and Radical Pedagogy: Positioning the Multiply Marginalized and Underrepresented (MMU) Scholar Database as an Infrastructural Intervention.” Communication Design Quarterly 10 (3): 10–19. https://doi.org/10.1145/3507870.3507872.

Joshi, Payal B., and Manoj Pandey. 2024. “Deception Through Manipulated Citations and References as a Growing Problem in Scientific Publishing.” In Scientific Publishing Ecosystem: An Author-Editor-Reviewer Axis, edited by Payal B. Joshi, Prathamesh P. Churi, and Manoj Pandey. Springer. https://doi.org/10.1007/978-981-97-4060-4.

King, Molly M., Carl T. Bergstrom, Shelley J. Correll, Jennifer Jacquet, and Jevin D. West. 2017. “Men Set Their Own Cites High: Gender and Self-Citation Across Fields and Over Time.” Socius 3: 1–22. https://doi.org/10.1177/2378023117738903.

Kraemer-Mbula, Erika, Robert Tijssen, Matthew L. Wallace, and Robert McLean. 2020. “Introduction.” In Transforming Research Excellence: New Ideas from the Global South, edited by Erika Kraemer-Mbula, Robert Tijssen, Matthew L. Wallace, and Robert McLean. African Minds. https://library.oapen.org/handle/20.500.12657/23441.

Kumar, Neha, and Naveena Karusala. 2021. “Braving Citational Justice in Human-Computer Interaction.” In CHI EA ’21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, edited by Yoshifumi Kitamura, Aaron Quigley, Katherine Isbister, and Takeo Igarashi. Association for Computing Machinery. https://doi.org/10.1145/3411763.3450389.

Kwon, Diana. 2022. “The Rise of Citational Justice.” Nature 603 (7902): 568–71. https://www.nature.com/articles/d41586-022-00793-1.

Lancaster, Amber, and Carie S. Tucker King. 2025. “Empowerment Through Authorship Inclusivity: Toward More Equitable and Socially Just Citation Practices.” Communication Design Quarterly 12 (4): 1–15. https://doi.org/10.1145/3658438.3658440.

Larivière, Vincent, David Pontille, and Cassidy R. Sugimoto. 2021. “Investigating the Division of Scientific Labor Using the Contributor Roles Taxonomy (CRediT).” Quantitative Science Studies 2 (1): 111–28. https://doi.org/10.1162/qss_a_00097.

Lazarus, Suleman. 2025. “An Autoethnographic Perspective on Scholarly Impact, Citation Politics, and North-South Power Dynamics.” Life Writing 22 (3): 603–29. https://doi.org/10.1080/14484528.2024.2430666.

Ledgerwood, Alison, Katherine M. Lawson, Michael W. Kraus, Johanna Ray Vollhardt, Jessica D. Remedios, Dulce Wilkinson Westberg, Ayse K. Uskul, Adeyemi Adetula, Colin Wayne Leach, Joel E. Martinez, Laura P. Naumann, Geetha Reddy, Charlotte Chucky Tate, Andrew R. Todd, Katherine Weltzien, NiCole Buchanan, Roberto González, L. James Montilla Doble, Rainer Romero-Canyas, Erin Westgate, and Linda X. Zou. 2024. “Disrupting Racism and Global Exclusion in Academic Publishing: Recommendations and Resources for Authors, Reviewers, and Editors.” Collabra: Psychology 10 (1): 1–33. https://doi.org/10.1525/collabra.121394.

Lenharo, Mariana. 2025. “Breaking Language Barriers: ‘Not Being Fluent in English Is Often Viewed as Being an Inferior Scientist.’” Nature, February 10. https://doi.org/10.1038/d41586-025-00157-5.

Light, Ryan, Aaron Gullickson, and Jill Ann Harrison. 2025. “Inequality in Measuring Scholarly Success: Variation in the h-index Within and Between Disciplines.” PLOS One 20 (1): e0316913. https://doi.org/10.1371/journal.pone.0316913.

Ma, Lai. 2019. “From Metrics to Representation: The Flattened Self in Citation Databases.” Information Research 24 (4): paper colis1927. https://informationr.net/ir/24-4/colis/colis1927.html.

MacLeod, Lorisia. 2021. “More Than Personal Communication: Templates for Citing Indigenous Elders and Knowledge Keepers.” KULA: Knowledge Creation, Dissemination, and Preservation Studies 5 (1): 1–5. https://doi.org/10.18357/kula.135.

MacRoberts, M. H., and Barbara R. MacRoberts. 1996. “Problems of Citation Analysis.” Scientometrics 36 (3): 435–44. https://doi.org/10.1007/BF02129604.

Maggio, Lauren A., Alyssa Jeffrey, Stefanie Haustein, and Anita Samuel. 2022. “Becoming Metrics Literate: An Analysis of Brief Videos That Teach About the h-index.” PLOS One 17 (5): e0268110. https://doi.org/10.1371/journal.pone.0268110.

Mason, Shannon, and Margaret Merga. 2021. “Less ‘Prestigious’ Journals Can Contain More Diverse Research, by Citing Them We Can Shape a More Just Politics of Citation.” LSE Impact Blog. https://blogs.lse.ac.uk/impactofsocialsciences/2021/10/11/less-prestigious-journals-can-contain-more-diverse-research-by-citing-them-we-can-shape-a-more-just-politics-of-citation/.

Mattiazzi, Alicia, and Martin Vila-Petroff. 2024. “Unveiling the Ethical Void: Bias in Reference Citations and Its Academic Ramifications.” Current Research in Physiology 7: 100130. https://doi.org/10.1016/j.crphys.2024.100130.

McLeod, Sam. 2021. “Should Authors Cite Sources Suggested by Peer Reviewers? Six Antidotes for Handling Potentially Coercive Reviewer Citation Suggestions.” Learned Publishing 34 (2): 282–86. https://doi.org/10.1002/leap.1335.

Mehregan, Mina, and Mohammad Moghiman. 2024. “The Unnoticed Issue of Coercive Citation Behavior for Authors.” Publishing Research Quarterly 40 (2): 164–68. https://doi.org/10.1007/s12109-024-09994-0.

Miller-Nesbitt, Andrea. 2025. “Citation Justice in STEMM.” McGill University Library. https://libraryguides.mcgill.ca/citation_justice.

Mills, David, and Toluwase Asubiaro. 2024. “Does the African Academy Need Its Own Citation Index?” Global Africa (7): 115–25. https://doi.org/10.57832/18yw-xv96.

Moed, Henk F. 2005. Citation Analysis in Research Evaluation. Springer. https://doi.org/10.1007/1-4020-3714-7.

Moher, David, Florian Naudet, Ioana A. Cristea, Frank Miedema, John P. A. Ioannidis, and Steven N. Goodman. 2018. “Assessing Scientists for Hiring, Promotion, and Tenure.” PLOS Biology 16 (3): e2004089. https://doi.org/10.1371/journal.pbio.2004089.

Mott, Carrie, and Daniel Cockayne. 2017. “Citation Matters: Mobilizing the Politics of Citation Toward a Practice of ‘Conscientious Engagement.’” Gender, Place and Culture 24 (7): 954–73. https://doi.org/10.1080/0966369X.2017.1339022.

Moutinho, Sofia. 2024. “Breaking the Glass.” Science 386 (6726): 1087–89. https://www.science.org/doi/epdf/10.1126/science.adv0401.

Muller, Jerry Z. 2018. The Tyranny of Metrics. Princeton University Press.

Murdie, Amanda. 2018. “We Need a New International Norm: Eradicating the Gender Citation Gap.” Political Analysis 26 (3): 345–47. https://doi.org/10.1017/pan.2018.27.

Narin, Francis. 1976. Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity. Computer Horizons.

Ochsner, Michael. 2021. “Bibliometrics in the Humanities, Arts and Social Sciences.” In Handbook Bibliometrics, edited by Rafael Ball. De Gruyter Saur. https://doi.org/10.1515/9783110646610.

Ochsner, Michael, Sven E. Hug, and Hans-Dieter Daniel, eds. 2016. Research Assessment in the Humanities: Towards Criteria and Procedures. Springer. https://doi.org/10.1007/978-3-319-29016-4.

Okune, Angela. 2019. “Self-Review of Citational Practice.” Zenodo. https://www.researchdatashare.org/content/okune-angela-2019-may-21-self-review-citational-practice-zenodo.

O’Leary, H., T. Gantzert, A. Mann, E. Z. Mann, N. Bollineni, and M. Nelson. 2024. “Citation as Representation: Gendered Academic Citation Politics Persist in Environmental Studies Publications.” Journal of Environmental Studies and Sciences 14 (3): 525–37. https://doi.org/10.1007/s13412-024-00928-y.

Oravec, Jo Ann. 2019. “The ‘Dark Side’ of Academics? Emerging Issues in the Gaming and Manipulation of Metrics in Higher Education.” The Review of Higher Education 42 (3): 859–77. https://doi.org/10.1353/rhe.2019.0022.

Orduña-Malea, Enrique, and Álvaro Cabezas-Clavijo. 2023. “ChatGPT and the Potential Growing of Ghost Bibliographic References.” Scientometrics 128 (9): 5351–55. https://doi.org/10.1007/s11192-023-04804-4.

Owens, Brian. 2024. “How to Make Academic Hiring Fair: Database Lists Innovative Policies.” Nature, January 30. https://doi.org/10.1038/d41586-024-00273-8.

Patin, Beth, Melinda Sebastian, Jieun Yeon, Danielle Bertolini, and Alexandra Grimm. 2021. “Interrupting Epistemicide: A Practical Framework for Naming, Identifying, and Ending Epistemic Injustice in the Information Professions.” Journal of the Association for Information Science and Technology 72 (10): 1306–18. https://doi.org/10.1002/asi.24479.

Poquet, Oleksandra, Srecko Joksimovic, and Pernille Brams. 2024. “The Role of Gender in Citation Practices of Learning Analytics Research.” In LAK ’24: Proceedings of the 14th Learning Analytics and Knowledge Conference. Association for Computing Machinery. https://doi.org/10.1145/3636555.3636878.

Potter, William Gray. 1981. “Introduction.” Library Trends 30 (1): 5–7. https://www.ideals.illinois.edu/items/7138.

Project TARA. 2024. DORA. https://sfdora.org/project-tara/.

Quave, Kylie E., and Savannah Hagen Ohbi. 2024. “Teaching and Learning a Joyful Citation Praxis: Affective Relations for Fostering Community Through Our Compositions.” Radical Teacher 128: 37–48. https://doi.org/10.5195/rt.2024.1219.

Radovic, Ljubisa R. 1996. “Citation Justice.” Chemical & Engineering News, October 28.

Ray, Keisha S., Perry Zurn, Jordan D. Dworkin, Dani S. Bassett, and David B. Resnik. 2022. “Citation Bias, Diversity, and Ethics.” Accountability in Research 31 (2): 158–72. https://doi.org/10.1080/08989621.2022.2111257.

Rossiter, Margaret W. 1993. “The Matthew Matilda Effect in Science.” Social Studies of Science 23 (2): 325–41. https://doi.org/10.1177/030631293023002004.

Rowson, Bethany, Stefan M. Duma, Michael R. King, Igor Efimov, Ann Saterbak, and Naomi C. Chesler. 2021. “Citation Diversity Statement in BMES Journals.” Annals of Biomedical Engineering 49 (3): 947–49. https://doi.org/10.1007/s10439-021-02739-6.

Royal Society of Chemistry. 2023. “Joint Commitment for Action on Inclusion and Diversity in Publishing.” https://www.rsc.org/policy-evidence-campaigns/inclusion-diversity/joint-commitment-for-action-inclusion-and-diversity-in-publishing/.

Rushforth, Alexander, and Björn Hammarfelt. 2023. “The Rise of Responsible Metrics as a Professional Reform Movement: A Collective Action Frames Account.” Quantitative Science Studies 4 (4): 879–97. https://doi.org/10.1162/qss_a_00280.

Santangelo, L. C. 2023. “Reconceptualizing Citation Practices as ‘Feminist Memory.’” Feminist Pedagogy 3 (4): Article 12. https://digitalcommons.calpoly.edu/feministpedagogy/vol3/iss4/12/.

Sarol, Maria Janina, Shufan Ming, Shruthan Radhakrishna, Jodi Schneider, and Halil Kilicoglu. 2024. “Assessing Citation Integrity in Biomedical Publications: Corpus Annotation and NLP Models.” Bioinformatics 40 (7): btae420. https://doi.org/10.1093/bioinformatics/btae420.

Sauvé, Sarah A., Sara L. Middleton, Helena M. Gellersen, and Flavio Azevedo. 2025. In Pursuit of Citational Justice: A Toolkit for Equitable Scholarship. Framework for Open and Reproducible Research Training (FORRT). https://osf.io/preprints/metaarxiv/qjecy_v2.

Schmidt, Januschka. 2022. “Whom We Cite: A Reflection on the Limits and Potentials of Critical Citation Practices.” In Diversity, Inclusion, and Decolonization: Practical Tools for Improving Teaching, Research, and Scholarship, edited by Abby Day, Lois Lee, Dave S. P. Thomas, and James Spickard. Bristol University Press. https://doi.org/10.2307/j.ctv2jn920v.18.

Shi, Julie. 2023. “Articulations of Language and Value(s) in Scholarly Publishing Circuits.” Canadian Journal of Academic Librarianship 9 (April): 1–33. https://doi.org/10.33137/cjal-rcbu.v9.38148.

Simons, Kai. 2008. “The Misused Impact Factor.” Science 322 (5899): 165. https://doi.org/10.1126/science.1165316.

Smeyers, Paul, and Nicholas C. Burbules. 2011. “How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality.” Journal of Philosophy of Education 45 (1): 1–17. https://doi.org/10.1111/j.1467-9752.2011.00787.x.

Smith, Christen A. 2021. “An Introduction to Cite Black Women.” Feminist Anthropology 2 (1): 6–9. https://doi.org/10.1002/fea2.12050.

Smith, Christen A., and Dominique Garrett-Scott. 2021. “‘We Are Not Named’: Black Women and the Politics of Citation in Anthropology.” Feminist Anthropology 2 (1): 18–37. https://doi.org/10.1002/fea2.12038.

Smith, Christen A., Erica L. Williams, Imani A. Wadud, Whitney N. L. Pirtle, and The Cite Black Women Collective. 2021. “Cite Black Women: A Critical Praxis (A Statement).” Feminist Anthropology 2 (1): 10–17. https://doi.org/10.1002/fea2.12040.

Smith, Linda C. 1976. “Artificial Intelligence in Information Retrieval Systems.” Information Processing & Management 12 (3): 189–222. https://doi.org/10.1016/0306-4573(76)90005-4.

Smith, Linda C. 1980. “‘Memex’ as an Image of Potentiality in Information Retrieval Research and Development.” SIGIR ’80: Proceedings of the 3rd Annual ACM Conference on Research and Development in Information Retrieval. Butterworth. https://dl.acm.org/doi/abs/10.5555/636669.636692.

Smith, Linda C. 1981. “Citation Analysis.” Library Trends 30 (1): 83–106. https://www.ideals.illinois.edu/items/7149.

Society for Personality and Social Psychology. 2025. “Guidelines for Promoting Inclusive Citing Practices.” https://spsp.org/professional-development/publishing-resources/resources-for-inclusive-practices/guidelines-for-promoting-inclusive-citing-practices.

Springshare. n.d. “LibGuides Community.” https://community.libguides.com/.

Sugimoto, Cassidy R., and Vincent Larivière. 2018. Measuring Research: What Everyone Needs to Know. Oxford University Press.

Sugimoto, Cassidy R., and Vincent Larivière. 2023. Equity for Women in Science: Dismantling Systemic Barriers to Advancement. Harvard University Press.

Sumner, Jane Lawrence. 2018. “The Gender Balance Assessment Tool (GBAT): A Web-Based Tool for Estimating Gender Balance in Syllabi and Bibliographies.” PS: Political Science & Politics 51 (2): 396–400. https://doi.org/10.1017/S1049096517002074.

Sweetland, James H. 1989. “Errors in Bibliographic Citations: A Continuing Problem.” The Library Quarterly 59 (4): 291–304. https://www.jstor.org/stable/4308405.

Tahamtan, Iman, and Lutz Bornmann. 2019. “What Do Citation Counts Measure? An Updated Review of Studies on Citations in Scientific Documents Published Between 2006 and 2018.” Scientometrics 121 (3): 1635–84. https://doi.org/10.1007/s11192-019-03243-4.

Taşkın, Zehra. 2025. “Sustaining the ‘Frozen Footprints’ of Scholarly Communication Through Open Citations: An Annual Review of Information Science and Technology (ARIST) Paper.” Journal of the Association for Information Science and Technology, 1–17. https://doi.org/10.1002/asi.24982.

Teaching Citational Practice. n.d. General Overview. https://journals.library.columbia.edu/index.php/citationalpractice/generaloverview.

Teixeira da Silva, Jaime A. 2024. “The Inaccurate Representation of an Author’s Publishing Name, and Impact on Reference Accuracy.” Scientometrics 129 (5): 2923–32. https://doi.org/10.1007/s11192-024-05029-9.

Thelwall, Mike. 2025. Quantitative Methods in Research Evaluation: Citation Indicators, Altmetrics, and Artificial Intelligence. University of Sheffield. https://doi.org/10.48550/arXiv.2407.00135.

Thelwall, Mike, and Maria M. Delgado. 2015. “Arts and Humanities Research Evaluation: No Metrics Please, Just Data.” Journal of Documentation 71 (4): 817–33. https://doi.org/10.1108/JD-02-2015-0028.

Thelwall, Mike, Kayvan Kousha, Emma Stuart, Meiko Makita, Mahshid Abdoli, Paul Wilson, and Jonathan Levitt. 2023. “In Which Fields Are Citations Indicators of Research Quality?” Journal of the Association for Information Science and Technology 74 (8): 941–53. https://doi.org/10.1002/asi.24767.

Trueblood, Jennifer S., David B. Allison, Sarahanne M. Field, Ayelet Fishbach, Stefan D. M. Gaillard, Gerd Gigerenzer, William R. Holmes, Stephan Lewandowsky, Dora Matzke, Mary C. Murphy, Sebastian Musslick, Vencislav Popov, Adina L. Roskies, Judith ter Schure, and Andrei R. Teodorescu. 2025. “The Misalignment of Incentives in Academic Publishing and Implications for Journal Reform.” Proceedings of the National Academy of Sciences 122 (5): e2401231121. https://doi.org/10.1073/pnas.2401231121.

UNESCO. 2021. UNESCO Recommendation on Open Science. United Nations Educational, Scientific and Cultural Organization. https://unesdoc.unesco.org/ark:/48223/pf0000379949.

Vaidhyanathan, Siva. 2006. “Afterword: Critical Information Studies: A Bibliographic Manifesto.” Cultural Studies 20 (2–3): 292–315. https://doi.org/10.1080/09502380500521091.

Velho, Léa. 1986. “The ‘Meaning’ of Citation in the Context of a Scientifically Peripheral Country.” Scientometrics 9 (1–2): 71–89. https://doi.org/10.1007/BF02016609.

White, Howard D., and Katherine W. McCain. 1998. “Visualizing a Discipline: An Author Co-citation Analysis of Information Science, 1972–1995.” Journal of the American Society for Information Science 49 (4): 327–55. https://doi.org/10.1002/(SICI)1097-4571(19980401)49:4<327::AID-ASI4>3.0.CO;2-4.

Wilsdon, James, Liz Allen, Eleonora Belfiore, Philip Campbell, Stephen Curry, Steven Hill, Richard Jones, Roger Kain, Simon Richard Kerridge, Mike Thelwall, Jane Tinkler, Ian Viney, Paul Wouters, Jude Hill, and Ben Johnson. 2015. The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. https://doi.org/10.13140/RG.2.1.4929.1363.

Wood, Felicity. 2021. “The Cult of the Quantifiable: The Fetishism of Numbers in Higher Education.” Prometheus 37 (1): 8–26. https://www.jstor.org/stable/10.13169/prometheus.37.1.0008.

Woolston, Chris. 2023. “How to Measure the Societal Impact of Science.” Nature 614: 375–77. https://doi.org/10.1038/d41586-023-00345-1.

Wright, Lori, Neisha Wiley, Elizabeth VanWassenhove, Brandelyn Tosolt, Rae Loftis, and Meg L. Hensley. 2022. “Feminist Citational Praxis and Problems of Practice.” Women’s Studies Quarterly 50 (3/4): 124–40. https://www.jstor.org/stable/27201442.

Zurn, Perry, Danielle S. Bassett, and Nicole C. Rust. 2020. “The Citation Diversity Statement: A Practice of Transparency, a Way of Life.” Trends in Cognitive Sciences 24 (9): 669–72. https://doi.org/10.1016/j.tics.2020.06.009.

Zurn, Perry, Erin G. Teich, Samantha C. Simon, Jason Z. Kim, and Dani S. Bassett. 2022. “Supporting Academic Equity in Physics Through Citation Diversity.” Communications Physics 5: Article 240. https://www.nature.com/articles/s42005-022-00999-9.