Machine translators have made it easier than ever to create error-plagued Wikipedia articles in obscure languages. What happens when AI models get trained on junk pages?
That’s kinda what Wikipedia does. They have quite an elaborate review process before changes go live: https://en.wikipedia.org/wiki/Wikipedia:Reviewing
On the English Wikipedia, that process works quite well. But tiny-language editions such as the Welsh Wikipedia might only have a handful of reviewers in total. There’s no way such a small group of people could be knowledgeable in all subjects.
Welsh Wikipedia has fewer than 200 active users in total, and there are dozens of small language or dialect Wikipedias with fewer than 30 active users.
https://en.wikipedia.org/wiki/List_of_Wikipedias
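You can check the current numbers yourself via the public MediaWiki siteinfo API, which exposes per-wiki statistics including a 30-day active-user count. A minimal sketch (the language codes are just illustrative examples):

    # Fetch active-user counts for a few small Wikipedias via the
    # MediaWiki API (meta=siteinfo, siprop=statistics).
    import json
    import urllib.request

    for lang in ("cy", "gd", "kw"):  # Welsh, Scottish Gaelic, Cornish
        url = (f"https://{lang}.wikipedia.org/w/api.php"
               "?action=query&meta=siteinfo&siprop=statistics&format=json")
        with urllib.request.urlopen(url) as resp:
            stats = json.load(resp)["query"]["statistics"]
        # "activeusers" = registered users with an action in the last 30 days.
        print(f"{lang}: {stats['activeusers']} active users, "
              f"{stats['articles']} articles")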
I don’t think there’s a real solution to this problem until AI translation becomes so good that there’s no need for language-specific content any more. If that ever happens.