Wikipedia Won’t Be Using AI to Replace Humans, Thank Goodness

AI can do a lot of things, but it’s not yet good enough to replace human editors. Wikipedia’s new AI strategy recognizes that and won’t be replacing humans on the platform anytime soon.

Wikipedia Volunteers Are About To Get AI Support

The Wikimedia Foundation has announced that it will be using AI to build new features. However, these new features are all in the “service of creating unique opportunities that will boost Wikipedia’s volunteers.”

In other words, instead of replacing editors, volunteers, and moderators, Wikipedia’s new AI tools will automate tedious tasks and help onboard new volunteers with “guided mentorship.” AI will also be used to improve information discoverability on the platform. That frees up editors to spend more time thinking and building consensus when creating, editing, or updating Wikipedia entries.


Wikipedia wants its volunteers to spend more time on what they want to accomplish instead of worrying about technical details. Tasks like translating and adapting common topics will also be automated, which the foundation believes will help editors share local perspectives and context.

At a time when AI threatens human jobs, especially in content creation, it’s good to see Wikipedia take a stand for its volunteers. You can read the foundation’s new AI strategy on Meta-Wiki, but this excerpt from the announcement sums it up well:

We believe that our future work with AI will be successful not only because of what we do, but how we do it. Our efforts will use our long-held values, principles, and policies (like privacy and human rights) as a compass: we will take a human-centered approach and will prioritize human agency; we will prioritize using open-source or open-weight AI; we will prioritize transparency; and we will take a nuanced approach to multilinguality, a fundamental part of Wikipedia.

Generative AI Is No Substitute for Human Oversight

Wikipedia isn’t the most credible source of information on the internet, but it does have human oversight, which, in my opinion, makes it more reliable than generative AI tools that often hallucinate or make up facts.

Most, if not all, AI tools, including ChatGPT, Gemini, and Grok, were trained on datasets scraped from the internet, and errors in that data lead models to hallucinate or give incorrect answers. Wikipedia claims that it’s at the “core of every AI training model,” which means it needs to ensure the information it publishes is factual and provides the necessary context.

Generative AI tools lack human creativity, empathy, contextual understanding, and reasoning. They’re great if you want to research a topic or quickly analyze a large spreadsheet, but when facts, information, and history are at stake, having a human review the text is always the better option.
