Asaf Bartov
I have been a practicing (volunteer) Wikipedian for many years; my home wiki is the Hebrew Wikipedia, but I am also active on Wikidata and Wikimedia Commons, and I make occasional contributions to the English Wikipedia as well, all under my volunteer account. I have a strong background in software engineering, and an abiding interest in literature, education, open access, linked data, and library science.
At the Foundation, I am working on increasing reach (readers) and participation (editors) in the developing world, through community support as well as partnerships with grantees, other NGOs, and governments.
My main project since 2019 has been the WikiLearn online learning environment. Previously, I worked on the Community Capacity Development program and served as a grants program officer.
Sessions
As the Wikimedia movement expands learning opportunities, how we recognize learning achievements matters. This session explores WikiLearn’s certification system to foster alignment on best practices for learning recognition across the movement. What makes a certificate meaningful? How do these recognitions support community growth and sustainability? How can Wikimedians apply these principles to their own capacity-building efforts?
This interactive session is designed for Wikimedia trainers, representatives from affiliates and partner organizations, and Wikimedians enthusiastic about capacity building—especially those interested in using WikiLearn. Through guided hands-on activities, we will:
- Build a shared understanding of what learning certificates mean for the Wikimedia movement.
- Evaluate WikiLearn’s existing process for certifications and badges.
- Identify opportunities to remix, exchange, and scale learning recognition models.
Participants will leave with concrete insights for their own capacity-building initiatives, contributing to a more unified and effective approach to learning certifications across the Wikimedia movement.
Generative AI is everywhere, and it is here to stay. While it is useful for certain tasks, it can also easily be weaponized against wiki communities: overwhelming volunteers with endless cleanup labor, inundating our wikis with huge waves of spam, or even introducing harder-to-detect, subtle POV-pushing (bias).
How can we defend our wikis against this?