tonyRH.bsky.social
@tonyrh.bsky.social
6/ 📚 Read the full study:

🔗 “Open Science at the Generative AI Turn”
Published in Quantitative Science Studies (MIT Press):
👉 doi.org/10.1162/qss_...

Let’s work together to ensure that #AI & #GenAI align with the values of #OpenScience!
December 17, 2024 at 10:45 AM
5/ 🌍 A Call for Responsible Use
To ensure GenAI aligns with Open Science values:
- Researchers must integrate GenAI with care and scrutiny.
- Developers need to create transparent, unbiased tools.
- Policymakers must balance innovation and risk.
December 17, 2024 at 10:44 AM
4/ 🔍 The Risk
Despite the potential, there are challenges:
❌ Opaque “black box” models undermine transparency
❌ Bias in training data risks reinforcing inequalities
❌ High computational demands raise sustainability concerns
December 17, 2024 at 10:35 AM
3/ ✨ The Opportunity
GenAI can:
✅ Make research documentation faster and more thorough
✅ Simplify complex science into accessible language
✅ Break language barriers through translation
✅ Enable public participation in research
✅ Promote inclusivity, accessibility, and understanding
December 17, 2024 at 10:35 AM
2/ TL;DR. Mohammad Hosseini, Serge Horbach, @kristiholmes.bsky.social and I explore GenAI's enormous potential to enhance accessibility and efficiency in science. But we emphasise that to do so, GenAI must embody the Open Science principles of openness, fairness, and transparency.
December 17, 2024 at 10:35 AM
It's presumptuous, but I don't mind that so much. What I really dislike is when I need to use those details and validate the new account just in order to decline.
December 11, 2024 at 11:18 AM
8/ Read the full paper here for insights on how to reshape research evaluation systems for fairness and effectiveness: doi.org/10.1093/rese...
December 5, 2024 at 11:22 AM
7/ We close with recommendations: clarify core purposes of research assessment, use shared frameworks, train assessors on bias, reduce over-frequent assessments, and move beyond binary thinking on qualitative/quantitative methods.
December 5, 2024 at 11:22 AM
6/ We examine the “performativity of assessment criteria,” revealing a tension between rigid/flexible criteria and how transparently they are communicated. Transparent, equitable frameworks are vital to align formal criteria with the realities of research evaluation.
December 5, 2024 at 11:22 AM
5/ Respondents noted that beyond metrics, informal factors—social dynamics, politics, and demographics—play key roles in assessment outcomes. These hidden criteria emerge in opaque processes, granting assessors significant flexibility.
December 5, 2024 at 11:22 AM
4/ Through qualitative analysis of free-text responses from 121 international researchers, we highlight a major gap between formal evaluation criteria and their practical application.
December 5, 2024 at 11:22 AM
3/ How do current systems enable “hidden factors” like cronyism or evaluator biases, and how might these change under proposed reforms? Our study examines researchers' perceptions of social and political influences on assessment processes.
December 5, 2024 at 11:22 AM
2/ Reform of research assessment, especially to avoid over-quantification and empower qualitative assessment, is a hot topic. Change is coming. But how do we balance broader criteria that value activities beyond publishing and funding against reliance on peer review and merit-based rewards?
December 5, 2024 at 11:22 AM
1/ The full paper is available at: doi.org/10.1093/rese...
December 5, 2024 at 11:22 AM