Sections

Sections are curated clusters of thematically related content that accrue over a period of three or more years. KULA was established to create space for intellectual exchange on topics related to knowledge creation, dissemination, and preservation, and sections aim to generate dialogue on topical issues relevant to the journal’s mandate. Contributions often take the form of commentaries or invited articles, which undergo a collaborative, open editorial review process rather than traditional anonymous peer review.

Section editors supervise the publication of several submissions per year. In this way, section editors and contributors build a collection of scholarship on a particular issue—much like a special issue—but extend the production of that scholarship over several years, allowing contributors time to respond to and build on the work of other scholars.

Not all sections accept open submissions; please see the relevant section page to determine if open submissions are accepted.

Note: Since sections are intended to cultivate content over time, a section’s focus and description may evolve with shifts in scholarly conversations and changes in editorship. Sections may also be put on hiatus or retired, with no new content added, depending on section editor availability and the number of other active sections.

AI and Academic Publishing

The editorials in this section take a speculative approach to the topic of artificial intelligence and equity in academic publishing. The transformative potential of AI in academic knowledge production has been a topic of considerable discussion, particularly since ChatGPT brought the technology widespread attention. However, the development and integration of AI have been shaped predominantly by the interests of publishers and surveillance-based publishing. While AI offers promising advances and improved efficiency for the publishing industry, concerns remain unaddressed about the biases, ethical implications, and unforeseen consequences that could arise from adding AI to an already highly inequitable system. AI thus prompts us to question and rethink our relationship to the processes of knowledge production, and the pieces in this section explore what AI could be from different ontological and epistemological perspectives, free from the constraints imposed by publishers.

Section Editor: Leslie Chan, Associate Professor, Department of Global Development Studies; Director, Knowledge Equity Lab, University of Toronto Scarborough

Open submissions accepted: No


Conversations on Epistemic Injustice

This section contributes to the conversation in ethics and epistemology about epistemic injustice, which Miranda Fricker defines as “a wrong done to someone specifically in their capacity as a knower.” Philosophers working on decolonization have taken up the concept of epistemic injustice to analyze the marginalization and delegitimization of knowledge systems outside the dominant mainstream knowledge economy of the Global North. They argue that epistemic justice requires recentring these marginalized knowledge systems in order to dismantle the oppressive power structures upheld through epistemic supremacy.

The thought pieces in this section may explore different strands of epistemic injustice (e.g., testimonial, hermeneutical); ideas about epistemic authority, epistemic agency, or epistemic risk; the relationship of epistemic injustice to epistemic decolonization and contributory injustice; the necessity of epistemic justice for material, social, and political empowerment; and the importance of situating conversations about epistemic injustice in particular geo-sociopolitical locations. This section aims to bring vital conversations about epistemic injustice happening in social epistemology and ethics to a multidisciplinary audience.

Section Editor: Veli Mitova, Professor and Director of the African Centre of Epistemology and Philosophy of Science, University of Johannesburg 

Open submissions accepted: No
