Liron Dorfman

Wikipedia in the New Age

Updated: Dec 5



On July 8, the Hebrew Wikipedia project celebrated its 20th anniversary, and a week later, a commemorative event took place in Tel Aviv to honor this milestone.


The attendees in the hall had varying connections to the project; many had not been part of its inception or early years, and some had not even been born at the time.


Nevertheless, there was unanimous agreement that the world has undergone significant transformations since the project's inception, and with those changes the challenges its contributors face have also evolved. Despite these shifts, those dedicated to making information accessible through Wikipedia in their native language remain unwavering, working tirelessly day and night.

 

During its early years, the Israeli project struggled with a limited number of writers and editors, which left it short of articles and with significant gaps in coverage. The situation has since evolved, and the Hebrew Wikipedia now boasts over 338,000 articles. The current challenge, in Hebrew as in other languages and countries, is to ensure the quality of such a vast amount of content while safeguarding against biased information and deliberate inaccuracies.

 

As the project has gained wider usage, contributions from casual writers have increased. These individuals might make minor corrections they consider appropriate, add personal details about someone they know, or update information they came across in the media. Unfortunately, such contributions often prove inaccurate, are of limited relevance, or simply do not belong in an encyclopedia. Some writers intentionally insert false information, a practice commonly known as "vandalism." They might do this either to test the system or to further personal agendas, misleading readers and potentially shaping public perceptions.

 

The sheer number of contributions necessitates a more sophisticated system to monitor the constant changes in the project. This requires increased collaboration among editors, particularly those responsible for continuous monitoring, known as Patrollers (New Page Reviewers). Their diligent review work underpins the project's reliability, making Wikipedia a dependable resource for everything from simple inquiries to academic research.

 

To assist the Patrollers in their tasks, various technological tools, including some based on artificial intelligence, have been developed over the years. Bots help identify edits with undesirable content and flag potential instances of vandalism. They streamline the process, allowing human Patrollers to promptly assess changes, address suspicious content, block vandals, and focus on more intricate matters.
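
To make the idea concrete, here is a minimal sketch of how such a bot might work; it is not Wikipedia's actual anti-vandalism system, and the threshold and keyword list below are invented for the example. It polls the public MediaWiki API for recent edits and flags candidates for a human Patroller to review.

```python
# Toy illustration only (not the real anti-vandalism bots): scan the latest
# article edits on the Hebrew Wikipedia via the public MediaWiki API and
# print the ones a human Patroller might want to look at first.
import requests

API_URL = "https://he.wikipedia.org/w/api.php"
SUSPICIOUS_WORDS = {"fake", "!!!"}   # illustrative blocklist, chosen for the example
LARGE_REMOVAL = -2000                # bytes removed; arbitrary threshold for the example

def fetch_recent_changes(limit=50):
    """Fetch recent article edits with their size change and edit summary."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcnamespace": 0,            # main (article) namespace only
        "rctype": "edit",
        "rcprop": "title|user|comment|sizes|ids",
        "rclimit": limit,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json()["query"]["recentchanges"]

def flag_edit(change):
    """Return a reason string if the edit looks suspicious, otherwise None."""
    size_delta = change.get("newlen", 0) - change.get("oldlen", 0)
    comment = change.get("comment", "").lower()
    if size_delta <= LARGE_REMOVAL:
        return f"large removal ({size_delta} bytes)"
    if any(word in comment for word in SUSPICIOUS_WORDS):
        return "suspicious edit summary"
    return None

if __name__ == "__main__":
    for change in fetch_recent_changes():
        reason = flag_edit(change)
        if reason:
            print(f"[{reason}] {change['title']} edited by {change['user']}")
```

The real tools in use on Wikipedia are far more sophisticated than this, but the flow is similar: machines surface suspicious edits, and humans make the final call.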

 

In summary, the project's early years were marked by a scarcity of contributors and articles, while the current focus is on maintaining content quality and safeguarding against biased or deliberately incorrect information. Collaborative efforts by dedicated editors and the support of advanced technological tools are crucial to upholding Wikipedia's reliability as a valuable information source for diverse readers.

 

Initially, Wikipedia's editors relied heavily on "traditional" sources such as printed encyclopedias, books, and academic journals. However, as the platform's popularity surged and it became one of the most visited websites globally, it attracted attention from various groups, including malicious actors. Regrettably, attempts to corrupt content on the platform have grown more sophisticated, requiring protection not only against mischief but also against fake news, often spread through social networks. This challenge extends to countering content generated with Deep Fake tools, which use AI to create realistic but fabricated images and videos by overlaying fabricated material onto existing media.

 

Fortunately, compared with other platforms, Wikipedia has likely been less affected by Deep Fakes. This can be attributed to the project's foundational principles, which prioritize content from open sources that are easily verifiable, as opposed to closed sources hidden behind paywalls. Additionally, from the early days of the Hebrew Wikipedia, the decision was made not to keep articles consisting of only one or two lines, except under exceptional circumstances. This added filtering layer means that even short entries are admitted only with proper references and source validation. This decision has contributed to the Hebrew Wikipedia's reputation as one of the top three Wikipedias in terms of content depth.

 

Nevertheless, Wikipedia is only partially immune to the potential risks of deep fakes. Individuals who gain prominence in the mass media may find their way into Wikipedia's pages, and editors could take content about them to be legitimate simply because of that exposure. As the technology for creating deep fakes continues to evolve, the editorial community recognizes the need for additional automated identification systems to provide an extra layer of protection for the project, complementing the examination done by human editors.

 

In a reality where generative AI tools make content creation easy, Wikipedia editors must continue writing articles themselves, because tools like ChatGPT or Google Bard draw on information from various sources, including Wikipedia itself. Allowing the insertion of text generated by these tools risks introducing errors that have yet to be identified or corrected in Wikipedia articles in the same language, or even in Wikipedias in other languages. This risk is inherent in generative AI tools, which currently repackage existing information in various forms without any simple way of thoroughly examining the quality of the original content.

 

However, despite the new challenges, it is essential to recognize the significant technological advancements that have greatly improved the work of Wikipedia writers and editors in various aspects over the past two decades. Notably, new technologies have made editing more accessible and convenient for contributors. 

 

In the past, editing on Wikipedia occurred primarily on computers, but a growing share of edits is now made through Wikipedia's mobile interface. People can make improvements and additions on the go: while traveling by train, during a trip abroad, waiting at the doctor's office, or even from the comfort of their bed just before falling asleep.

 

One area of remarkable progress is the development of a more user-friendly editing interface. In the early years, writing on Wikipedia required learning a markup language called "wiki markup." This complexity sometimes discouraged potential new contributors from getting involved. In recent years, however, Wikipedia introduced the Visual Editor, which allows people to edit and contribute without mastering intricate syntax. Adding footnotes with corroborating references to external sources became simpler, leading to deeper content and improved reliability. Other tools have made it more straightforward and convenient to integrate images, video, and sound files. Furthermore, the accessibility of digital photography has brought higher-quality photographs contributed by the public.

 

These technological advancements have undoubtedly played a significant role in making Wikipedia a more inclusive and robust platform, encouraging greater participation from contributors worldwide. 

 

"Zoom" has even come to Wikipedia's aid. Today, various Wikipedia editors and members of the Wikimedia Foundations around the world conduct training sessions for interested individuals also using remote communication. Zoom has eliminated the need to travel long distances to attend meetings in big cities or send guides to remote locations. Now, it is more convenient to participate in these trainings, opening the door for broader involvement in the project. 

 

As a result, Wikipedia, in general, and the Hebrew Wikipedia, in particular, have become more dynamic, accurate, and accessible centers of knowledge. The new tools empower contributors and editors to create richer and more comprehensive content, benefiting readers from across the globe. 

 

So, why not join them? You are all invited! 



 
