🌿 The Spore
Brought to you by the team at Organizational Mycology
Something that caught our eye: Impact assessment in open science projects
In July, the Chan Zuckerberg Initiative released a report assessing the impact of five cycles of its Essential Open Source Software program. The program, founded in 2019, aims to support “the maintenance, growth, development and community engagement of computational tools that are key to the success of modern biomedical research.” The report assessed impact along several dimensions: the program’s impact on funded software projects, on the broader open source community, and on diversity in open source; funded projects’ impact on biomedical research; and general impacts on maintainers, users, institutions, and funders.
The report prompted some comments in the CZI community’s online spaces, which then developed into conversations with peers about the challenges of monitoring project health and assessing impact.
Some challenges are obvious: How do you even define “impact”? The CZI authors did an excellent job of considering different kinds of impact, but not everyone is so clear about the nuances. Impact needs to be more than just what is easy to measure: downloads, views, dependency graphs, or citations. The numbers alone don’t tell the whole story.
How do we measure a program or project’s social effects? What does it mean to make communities stronger (or weaker), to make scientific research more (or less) accessible and transparent, or to generate (or stifle) interdisciplinary collaboration? We often see assessments that seek easy, favorable answers to these questions, such as surveys that lead respondents toward answers that make a training or grant look impactful, without digging much beyond the surface.
To be sure, there is plenty of research and insight out there on project health monitoring and impact assessment that does account for these “softer” impacts. Beth has contributed to Code for Science &amp; Society’s efforts to assess the impact of data education events and to the UK Software Sustainability Institute’s guidance on measuring the impact of workshops, for example.
OrgMycology’s impact assessment approach is based in large part on Beth’s expertise and emphasizes the need to make impact assessment an ongoing, sustainable practice rather than something that just happens when a report needs to be written.
No matter the approach, the practicalities of resourcing impact assessment make it even harder to do well: it is a lot of work. That raises questions about who is responsible for the intensive assessment work. Funders, for example, ask open source projects to report their impact, but the projects often lack the capacity and people-power to do thorough assessments (especially on the more social, community-oriented questions). Moreover, software projects need funds and resources to track impact well, but in many cases they also need to demonstrate impact before they can get those funds… a chicken-and-egg problem.
We are in the early stages of brainstorming ideas for how to make impact assessment easier for everyone: software projects, their leaders and developers, funders, institutions, government agencies, and the like. But what has become clear so far is the dearth of tools and sustainable processes for making impact assessment an ongoing, transparent practice.
What might it look like to automate the quantifiable metrics and make them accessible to communities and funders?
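As a thought experiment, here is a minimal sketch in Python of what that automation could start to look like: collecting a few easy-to-count numbers on a schedule and publishing them as an open snapshot. The GitHub REST API and pypistats.org endpoints are our assumptions about where such numbers might come from, and the project names below are placeholders, not real grantees.

```python
"""A minimal sketch of automated metric collection (assumptions noted inline)."""
import json
import urllib.request


def fetch_json(url: str) -> dict:
    """Fetch a URL and parse the JSON response."""
    req = urllib.request.Request(url, headers={"User-Agent": "impact-snapshot-sketch"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def github_stats(owner: str, repo: str) -> dict:
    """Pull a few easy-to-measure numbers from the GitHub REST API."""
    data = fetch_json(f"https://api.github.com/repos/{owner}/{repo}")
    return {
        "stars": data["stargazers_count"],
        "forks": data["forks_count"],
        "open_issues": data["open_issues_count"],
    }


def pypi_downloads(package: str) -> dict:
    """Recent download counts from the pypistats.org API (assumed response shape)."""
    data = fetch_json(f"https://pypistats.org/api/packages/{package}/recent")
    return data["data"]  # e.g. {"last_day": ..., "last_week": ..., "last_month": ...}


if __name__ == "__main__":
    # Hypothetical project names, for illustration only.
    snapshot = {
        "github": github_stats("example-org", "example-tool"),
        "downloads": pypi_downloads("example-tool"),
    }
    # Publishing this JSON on a schedule (e.g. via CI) would keep the
    # quantifiable side of "impact" visible to communities and funders.
    print(json.dumps(snapshot, indent=2))
```

A snapshot like this, regenerated automatically and published alongside a project’s documentation, would keep the easy-to-measure numbers transparent and current, reserving human effort for the harder, qualitative questions above.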
How can we build avenues for community members and users to document how a project has impacted their work or their career, which can then be used in grant applications, reports, demos and presentations, C.V.s, job applications, tenure packets, and project websites?
We’ll be discussing some of these topics a bit more in the next CZI EOSS community call and in some of our upcoming consulting engagements, and we’re looking forward to sharing what we learn. Have ideas, or are you interested in helping move this effort forward? Reply and let us know!
Short Updates
EOSS Community Call Resources
As we mentioned in a previous newsletter, we’ve been hosting CZI’s EOSS community calls. Our first resource from those calls, “Managing an open source project: A Checklist of Issues to Consider,” is now published and open for contributions. It is a community output from folks who attended the call and contributors from the broader EOSS community. Stay tuned for our next resource, “10 Simple Rules for Leadership without Formal Authority,” from our August call.
OM - The Turing Way - JupyterHub Collaboration
This week, we attended Collaboration Cafés at both The Turing Way and JupyterHub to introduce our research and community development project. In short, we’ll be querying the JupyterHub community about their successes, challenges, and specific needs, then working to help the community develop and implement community-building strategies. We’re excited to share progress and outputs openly and to generalize some of the lessons so other projects can draw on them.