Artificial intelligence can handle many tasks, but it isn’t yet refined enough to fully replace human editors. Wikipedia recognizes this, which is why it won’t be replacing its human contributors anytime soon.
Wikipedia Volunteers Are Set to Receive AI Assistance
The Wikimedia Foundation has revealed plans to incorporate AI for creating new features. These enhancements are designed specifically to support and uplift Wikipedia’s volunteer community.
Rather than replacing editors, moderators, or volunteers, the new AI tools will automate time-consuming tasks and ease the onboarding of new contributors through “guided mentorship.” AI will also improve information discoverability on the platform, freeing editors to focus on thoughtful engagement and consensus-building as they create, refine, or update Wikipedia articles.
Wikipedia aims for its volunteers to focus on their goals rather than getting bogged down by technical tasks. Automating processes like translation and adaptation of common topics will enable editors to better convey local perspectives and insights.
As AI continues to evolve and disrupt job markets, especially in creative fields, it’s refreshing to see Wikipedia prioritize its volunteers. For further details on the foundation’s AI strategy, you can read the announcement on Meta-Wiki. A notable excerpt from the announcement states:
We believe our future efforts with AI will thrive not just due to our actions, but also our approach. Our initiatives will be driven by our enduring values, principles, and policies (like privacy and human rights) which will guide us: we will adopt a human-centered approach, emphasize human agency, prioritize open-source AI, champion transparency, and embrace a nuanced perspective on multilinguality, a core characteristic of Wikipedia.
Generative AI Lacks Human Oversight
While Wikipedia may not always be seen as the most reliable information source online, its human oversight certainly enhances its credibility compared to generative AI alternatives, which often fabricate or misrepresent facts.
Most AI tools, such as ChatGPT, Gemini, and Grok, are trained on diverse internet data that can contain inaccuracies, and they can confidently generate false or fabricated outputs, known as “hallucinations.” Wikipedia asserts that it remains central to AI training datasets, making it essential for grounding these models in accurate information and context.
Unlike generative AI tools, which often lack human creativity, empathy, and contextual understanding, Wikipedia benefits from human review, making it a more trustworthy option for verifying facts, figures, and historical details.