What are harmful systems in scholarship?
Session 1 of 3 of the Scholarship Values Summit 2021
The context for scholarship has changed. The cultural norms and incentives that govern scholarship today (e.g. ‘publish or perish’) were established in a now-outdated context of traditional academia (see Lancaster et al., 2018). Such framing and assumptions no longer work in today’s mainstream academia, let alone among the growing numbers of independent scholars and researchers who operate under different incentives, motivations, and business models.
The Ronin Institute and IGDORE co-hosted a “Scholarship Values Summit” (SVS 2021) using an unconference format. The goals were to identify the harmful scholarship-related systems that no longer work for us as scholars and to define the values to help us create new systems that are healthier and more supportive of our goals as scholars.
The 500+ scholars and researchers in our communities of IGDORE and the Ronin Institute, in addition to the broader community of our mailing list and social media followers, were the primary audience for SVS 2021. The scholarship experience of this audience ranges from those working independently to those at brick-and-mortar universities. What we have in common is a recognition that change is needed in scholarship today. The starting point for finding solutions is to define the core values that underpin scholarship. Shared values and a common language give scholars a common reference and guide when discussing how to avoid harmful systems and create positive change.
This is the first post (cross-posted on the Ronin blog) of a series of three posts that summarize three sessions of breakout table discussions. Note that these summaries were put together by the authors of this post, and are based on real-time note-taking from participants during the sessions. Therefore, these posts reflect additional contextualization by the authors and aren’t intended to be fully comprehensive; they may not reflect every view expressed during the sessions. Here is the outline of the series:
- Session 1: Identify harmful systems: Identify the scholarship-related systems and actions that scholars participate in and that lead to exploitation and harm (this post).
- Session 2: Propose values & solutions: Define and propose a set of values for scholarship that can help scholars begin to extract themselves from exploitative and harmful systems, and engage in efforts that align better with our values and our current socioeconomic and environmental context.
- Session 3: Creating change: Develop strategies for communicating and normalizing these stated values and suggested new systems.
Acknowledgements: We would like to acknowledge the contributions of participants at Session 1 of the Scholarship Values Summit 2021, particularly the breakout-table leads and the individuals who took notes to help summarize the discussions.
English as the dominant language
We often assume that English is the dominant language of scholarship because it has become effectively mandatory for publishing, especially of scientific articles, in many parts of the world. This has advantages and disadvantages. A common language for expressing concepts in highly technical fields (e.g. the biological and geological sciences) is generally considered a good thing. As a disadvantage, however, we often miss the multilingual richness inherent in international research, especially for concepts that are not easily expressed in English or not common in cultures where English dominates. A loss of linguistic diversity is akin to a loss of biodiversity. Likewise, we often ignore older research done before English became a common language of scholarship. There is also a difference between writing an academic publication (in whatever language) and making the results accessible to non-expert readers of any language, even the same one as the original text. One solution we can consider is publishing research in a machine-readable fashion and then translating from there into different human languages, although this would be most appropriate for very structured papers or datasets. Another approach is to conduct literature reviews using multilingual teams.
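The machine-readable approach mentioned above can be sketched as structured records with language-tagged fields that are rendered into human-readable summaries on demand. The following Python sketch is purely illustrative (it is not an established publishing standard, and the record fields and example text are hypothetical):

```python
# Illustrative sketch: a structured, machine-readable research summary
# with language-tagged prose fields, from which human-readable text in
# several languages can be rendered. All field names and content are
# hypothetical examples.

record = {
    "title": {
        "en": "Pollinator diversity in urban gardens",
        "es": "Diversidad de polinizadores en jardines urbanos",
    },
    "finding": {
        "en": "Native bee richness increased with flower diversity.",
        "es": "La riqueza de abejas nativas aumentó con la diversidad floral.",
    },
    "sample_size": 42,  # structured data is language-independent
}

def render(record, lang, fallback="en"):
    """Render a one-line summary in the requested language,
    falling back to English when a translation is missing."""
    title = record["title"].get(lang, record["title"][fallback])
    finding = record["finding"].get(lang, record["finding"][fallback])
    return f"{title}: {finding} (n={record['sample_size']})"

print(render(record, "es"))
```

Because rendering falls back to a default language when a translation is missing, prose fields can be translated incrementally while the structured data (such as the sample size) remains usable in any language.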
Lack of reproducibility
The lack of reproducibility in science results from many factors, including poor documentation of methodologies, a lack of standards for writing methodologies, a publication bias towards positive research results, and the male-dominated nature of reproducibility research within the open-science movement. One solution is to document methodological details close to the time when that step in the methodology actually happens (e.g. when the lab experiment is done or the algorithm is written), preferably on a daily basis. Methodological standards should be used whenever possible, while documenting variations, deviations, or new methods in meticulous detail. Publishing negative results, information that isn’t deemed “novel”, and derivative work or commentaries is needed to reproduce, contextualize, and validate existing work and studies. Finally, increasing diversity in the new metascience subfield of reproducibility is critical for its full benefits to be realized.
> we should all be thoughtful about who and what we cite, and how it contributes to existing power dynamics… [which] sometimes manifests itself when scholars (either in-training or in more junior roles) are often recommended to cite particular authors or groups without clear guidance as to what is important about those authors and not others.
Ignoring what is known
There is much scholarly work that has been captured in some form but is ignored for 1) accidental or practical reasons; 2) legal or financial barriers to sharing; or 3) willful reasons, such as ignoring others’ contributions. In the first category, scribbles in the margin of an experimental lab notebook do not make it into manuscripts, grant proposals, or follow-up work, while many manuscripts and most grant proposals remain unpublished, often locking up that knowledge. Much knowledge is in different formats, written in a non-English language, or released under conditions that potential reusers cannot make use of. There are also items published in places where people do not look. In the second category are legal barriers to sharing, e.g. non-disclosure agreements (which have become increasingly common even in academic environments). In the United States, the Bayh-Dole Act, passed in 1980, incentivizes academic institutions to commercialize work, creating a paranoid climate that keeps people and organizations from sharing. In the third category, willful ignorance of the achievements of others is one of the hardest problems to address, but it is often a result of perverse incentives in the academic system. The group identified some key ideas for ignoring less of what is known: 1) better documenting what is known or done, 2) having networked discussions between “known” pieces of information and potential (re)users of that information, and 3) leveraging people’s desire to share while removing perverse incentives.
Lack of a system of bug reporting for institutional practices
Even after problems have been identified in academia, it can be hard to get anything done about them. What if individuals could report bugs they have observed and suggest changes to resolve them? In this case, the bugs aren’t errors or omissions in the scholarly literature (various methods exist to address those, although whether they are sufficient is debatable). Instead, these could be specific issues affecting how science or scholarship gets done (e.g. barriers to working with Open Data), reported (e.g. giving awards for material that isn’t publicly accessible), or funded (e.g. supporting core infrastructure with project grants), or broader, more structural issues such as using excellence as a measure of success. Alternatively, these could be systemic issues that limit the participation of some groups, such as women and carers, within academia or the broader scientific ecosystem. Such issues are often raised internally within institutions, or sometimes reported anonymously, but ultimately bug reports need to do more than identify problems: they need to lead to fixes.
Biased citation practices
A recommendation from the group was that we should all be thoughtful about who and what we cite, and how our citations contribute to existing power dynamics. This sometimes manifests itself when scholars (either in training or in more junior roles) are recommended to cite particular authors or groups without clear guidance as to what makes those authors important and not others. One suggestion for how this might work in practice was that journals should have (and enforce) explicit policies requiring “citation accountability” for authors, reviewers, and editors. Citing work that doesn’t fall into “traditional” formats (e.g. work other than peer-reviewed papers) can also increase the diversity of scholars who participate in knowledge creation. Social media can also be a venue for increasing the recognition of groups that are currently underrepresented in scholarship (e.g., on Twitter, the hashtag #CiteBlackWomen).
> Much of the knowledge generation process … often assumes a unified experience of research that tends to correspond to Western-centric norms. This means a large fraction of human experience is often disregarded.
Predefining research value
Many institutions, funders, and employers require scholars to provide extensive justifications before they can initiate or fund research. Sometimes, however, this justification is only discovered through the discovery process itself (i.e. it can’t easily be defined a priori); in other words, sometimes the value is in the discovery of the value. Such requirements also underestimate the role of intuition, of “following your nose”, within research. The usual workaround is for scholars to do some of the research before writing a proposal or asking for permission or funding. However, this is very difficult without existing funding, and it favours established researchers who have already accrued significant academic “capital”. This can prevent new ideas from junior researchers or those who are just entering the field. In addition, many funders are focused only on supposedly “novel” research and aren’t interested in research that replicates results or refines ideas that already exist in the literature or in practice.
Disregarding human experience
Much of the knowledge generation process ignores or “abstracts away” the experience of the individual researcher. It often assumes a unified experience of research that tends to correspond to Western-centric norms. This means a large fraction of human experience is often disregarded. This group focused on what other kinds of approaches could be taken within our knowledge system. For example, how can indigenous science, contemplative science, meditation, and other ways of knowing that incorporate both human and non-human experiences (see recent works by Isabelle Stengers, Richard Powers, Andreas Weber and Jeremy Lent) be integrated within science and scholarship? Because they pay attention to aspects of reality that are often ignored by science’s increasing focus on applications and the economy, they can lead to different research questions. Further, these questions have real-world relevance, since they can influence how some people interact with, respond to, or participate in science-related matters. The group also considered the role of acknowledging the biases of researchers themselves, and the importance of qualitative research in broadening researchers’ perspectives and allowing them to better understand the subjects of their research.
Another aspect of this disregard of human experience is that scholars are often treated as replaceable cogs in academia: they are expected to produce scholarly outputs as fast as possible to increase the productivity metrics of their institutions, an approach that often conflicts with conducting quality research or scholarship that addresses societal issues or values. This dynamic is described by Goodhart’s Law, which states that when a measure becomes a target, it ceases to be a good measure. The group considered whether certain kinds of metrics, or groups of metrics, could be developed that are resistant to being gamed. In other words, how do you incentivize research productivity, but not too much? The idea of a simple “one paper per year” threshold has some merit relative to the current status quo.
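The threshold idea above can be sketched as a saturating score: beyond a modest target, extra output earns no additional credit, which removes the incentive to game raw paper counts. This is a toy illustration only; the function name and target value are hypothetical, not anything proposed at the summit:

```python
# Toy sketch of a saturating productivity score (hypothetical, for
# illustration only): output counts toward the score only up to a
# modest target, so publishing many thin papers earns no more credit
# than meeting the threshold once.

def capped_score(papers_per_year: int, target: int = 1) -> int:
    """Credit output up to `target` papers per year; extras add nothing."""
    return min(papers_per_year, target)

# Ten papers in a year score the same as one.
print(capped_score(10), capped_score(1))
```

A measure like this still rewards meeting the threshold, but because it saturates, optimizing it further is pointless, which is one simple way to blunt the Goodhart dynamic described above.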
> How can we increase the visibility of the places where real knowledge generation occurs (such as putting research insights into practice through public and policy outreach), which is among people, rather than the institute or place of affiliation?
Excessive grant competition
State-funded grant systems in many countries, including the United States, often involve a certain level of favouritism, bordering on corruption, that favours people with connections and those at prestigious universities and leads to intense competition between researchers for grants. Additionally, especially in the United States, universities often seem interested only in supporting applications for large grants: because the percentage taken as overheads is independent of grant size, administering larger grants gives universities a better cost-benefit ratio. This restricts the ability of scholars to conduct small-scale projects. The US National Science Foundation (NSF) now recognizes the Ronin Institute as a trusted administrator of research grants, which we see as a promising development for the future of independent scholars.
Single track career paths
In addition to traditional institutions being biased towards supporting people with particular academic pedigrees, there is also a very limited set of paths that scholars are expected to tread. The default career path is often offered without taking into account real life and other opportunities, which limits the diversity of people who can walk it. Those who choose to leave the path are assumed to be leaving scholarship, and thus to be unable to contribute to knowledge generation in our society. Many leave this path with sadness about what they are leaving behind, but many also discover joy in that freedom. How can we increase the visibility of the places where real knowledge generation occurs (such as putting research insights into practice through public and policy outreach), which is among people, rather than the institute or place of affiliation?
> …pressures result in selecting for scholars that fit academic institutions’ notion of a good scientist or scholar. … We must ask ourselves: What does this lack of diversity mean for the scholarship? Where is there space for people who don’t fit the status quo?
Survival of the “fittest”
The group noted that the academic background and pedigree of individuals has a huge influence on who seems to thrive in our current academic system. However, this pedigree often distracts from the value of the research itself and reinforces the status quo. For example, the quality of a person or work is often judged by proxies such as who they know, which universities they trained at, or where they published. Funders have great influence over which research gets attention. Self-funding is also often assumed: research work is often expected to be done before a proposal is accepted for funding, and researchers must often bring their own money both for their salary (e.g., soft-money positions) and to do their research. Indeed, the scholarship system depends on people doing what they love for free (e.g., providing “service” to their fields through unpaid peer review or outreach work). These pressures select for scholars who fit academic institutions’ notion of a good scientist or scholar. In addition, scholarly topics themselves are under selective pressure, which constrains which areas are prioritized for funding. We must ask ourselves: What does this lack of diversity mean for scholarship? Where is there space for people who don’t fit the status quo?
Daniel Mietchen is a biophysicist whose research areas span the temporal and spatial scales of life, with an additional focus on integrating scholarly workflows with the open parts of the web. He frequently contributes to the Wikipedia ecosystem, edits the journal Research Ideas and Outcomes and recently joined the Ronin Institute.
Gavin Taylor is a researcher and global board member at the Institute for Globally Distributed Open Research and Education (IGDORE). His research background is in visual neuroscience and computational biophysics, and he is currently trying to establish infrastructure that facilitates independent research and Open Science.