Leading the way towards objective truth and fact-driven media narratives

Our Mission


We aim to stimulate and support the revival of the search for truth. Facing a world full of fake news and manipulation, we fight for the opportunity to decide together what truth means and to find new ways to come as close to it as possible.

The Problem


Media in all its various forms has an increasingly important impact on our society. Information has become one of the most valuable resources, and recent events have shown how elections, wars, and economic change are manipulated through media narratives. Our socio-economic systems have created an environment that incentivizes clickbait and rewards falsehoods and polarizing narratives. It has become increasingly difficult to obtain objective, reliable information on current events and socioeconomic issues. The search for truth has become an outdated ambition reserved for a few privileged individuals with the ability to critically analyze and reflect on the information presented.

Our Solution


With a systematic and unbiased approach to finding the truth, Veritá will help journalists produce better media and help the public critically reflect on the information presented. This will be achieved through decentralized source verification and advanced language processing.
A decentralized network will analyze and cross-reference millions of sources to determine the credibility of the data, statistics, and events used in media narratives. Combined with advanced language-processing algorithms, this database can then be used to identify logical mistakes, unintended assumptions, and manipulative arguments in media narratives. Because the algorithms will be open source and continuously reviewed and improved by a large body of journalists and individuals around the globe, the system provides full transparency and minimizes manipulation.

Why is it important?


We live in a society where everyone chooses their own facts. Instead of searching for the truth together, we each live in our own reality and scream at those who disagree with us. Veritá detaches itself from biases, beliefs, and assumptions. Taking all data, facts, and opinions into account, we aim to carefully analyze and determine their credibility and coherence, to find what is most likely to be true instead of what we want to hear or what makes media companies the most money.

What it could look like


One current idea is a text editor for journalists that enables the use of verified data as well as the detection of unintended assumptions and logical mistakes. Another is a browser plugin that informs the reader about the data used, the assumptions made, and any manipulative narratives.

Contact us!


Are you curious to learn more, or do you want to join our team? Don't hesitate to contact our project initiator!

Kaspar Rothenfusser

Project initiator

The next steps


To turn this idea into a successful project, we need all the support we can get! We are looking for journalists, philosophers, ethicists, computer scientists, developers, and more to join our team, and we are also looking for a source of funding. If you want to join us on our mission to find truth, whether as a collaborator, an advisor, or through funding, please do not hesitate to contact us!

5 steps towards the truth


Verifying data

To ensure reliable data, each data source referenced in a news article must be verified as objectively and independently as possible. To this end, an algorithm will analyze and cross-reference millions of data points. It will use complex rules to calculate credibility scores for sources based on their background and expertise, as well as their congruence with other findings. To minimize bias, the algorithm will be open source and constantly updated in a democratic manner. It will be run by a network of operating nodes, similar to blockchain systems, increasing reliability and processing power. It is crucial that this network be large and widely spread across world demographics to avoid manipulation. Based on the verification algorithm, each datapoint will receive a credibility score indicating its trustworthiness, which is constantly updated as new confirming or contradicting data becomes available.
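One minimal way to picture this scoring step is sketched below. This is an illustrative assumption, not the actual Veritá algorithm: the function names, the two inputs (source expertise and congruence with other findings), and the weights are all hypothetical placeholders for the "complex rules" described above.

```python
# Hypothetical sketch of credibility scoring (all names and weights are
# illustrative, not Veritá's actual algorithm).

def credibility_score(expertise: float, congruence: float,
                      w_expertise: float = 0.4, w_congruence: float = 0.6) -> float:
    """Combine a source's background/expertise rating and its congruence
    with other findings into a single score in [0, 1]."""
    return w_expertise * expertise + w_congruence * congruence

def update_on_new_data(score: float, supports: bool, rate: float = 0.1) -> float:
    """Constant updating: nudge an existing score toward 1 when new
    confirming data arrives, toward 0 when contradicting data arrives."""
    target = 1.0 if supports else 0.0
    return score + rate * (target - score)

score = credibility_score(expertise=0.8, congruence=0.5)
score = update_on_new_data(score, supports=True)  # confirming data raises it
```

The key property this toy model captures is that a score is never final: every new confirming or contradicting datapoint moves it.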

Access and transparency for everyone

A database will store all verified data and the current credibility score of each datapoint. The database will be distributed across several servers and will use entanglement technologies, similar to blockchain hashing, to ensure immunity against manipulation. It will be accessible to everyone to maximize transparency of information. Datapoints can be accessed directly through APIs, so that the credibility of outdated data can be updated automatically, even in articles on other platforms.
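The "similar to blockchain hashing" idea can be sketched as a simple hash chain: each stored record includes the hash of its predecessor, so tampering with any datapoint invalidates every later hash. This is only a toy illustration under that assumption; the record fields and helper names are invented for the example.

```python
# Toy sketch of tamper-evident storage via hash chaining, in the spirit
# of the blockchain-style hashing mentioned above. Field names are
# illustrative assumptions.
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first record

def record_hash(datapoint: dict, prev_hash: str) -> str:
    """Hash a datapoint together with its predecessor's hash."""
    payload = json.dumps(datapoint, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(datapoints: list) -> list:
    """Store datapoints so each record is bound to the one before it."""
    chain, prev = [], GENESIS
    for dp in datapoints:
        h = record_hash(dp, prev)
        chain.append({"data": dp, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any edited datapoint breaks the chain."""
    prev = GENESIS
    for rec in chain:
        if rec["prev_hash"] != prev or record_hash(rec["data"], prev) != rec["hash"]:
            return False
        prev = rec["hash"]
    return True
```

Because each record commits to the entire history before it, a manipulated datapoint cannot be hidden without recomputing, and redistributing, every subsequent hash across all servers.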

Understanding narrative building

Using the database, a language-processing artificial intelligence will identify the assumptions required to shape a proposed narrative from the sourced data. To do so, it will check the logical coherence between the data points and the narrative; gaps in that coherence reveal the additional assumptions made by the author. These assumptions will then be cross-checked against the database to identify possible supporting or contradicting data points, and based on this cross-analysis each assumption will be scored for factuality. By analyzing the connection between facts, data, and media narratives, we can not only identify true facts but also draw better conclusions.
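The final cross-checking step could, in its simplest form, weigh supporting against contradicting evidence. The sketch below assumes the language-processing stage has already extracted an assumption and matched it to database entries; the data structures and the evidence-weighting rule are hypothetical.

```python
# Toy sketch of scoring an author's assumption for factuality by the
# balance of credibility-weighted supporting vs. contradicting datapoints.
# The database schema here is an illustrative assumption.

def factuality_score(assumption: str, database: list) -> float:
    """Return a score in [0, 1]: the fraction of credibility-weighted
    evidence that supports the assumption (0.5 = no evidence either way)."""
    support = sum(d["credibility"] for d in database
                  if d["claim"] == assumption and d["stance"] == "supports")
    contradict = sum(d["credibility"] for d in database
                     if d["claim"] == assumption and d["stance"] == "contradicts")
    total = support + contradict
    return 0.5 if total == 0 else support / total
```

Weighting by credibility means a single highly trusted contradicting source can outweigh several low-credibility supporting ones, which matches the goal of privileging reliable evidence over volume.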

Deciding together

To ensure that the algorithm determining the credibility of data is as objective and up to date as possible, a reliable democratic system for constant review and change needs to be established, along with a continuous discussion about the way we search for truth and whether the developed tool achieves this accurately. While many forums already exist to fulfill similar needs in cryptocurrency networks, in this particular case extra caution is required regarding transparency and immunity against manipulation. Therefore, a dedicated team will evaluate the most effective and secure way to implement the democratic review and revision of the algorithms.
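One possible shape such a review mechanism could take, purely as an assumption, since the dedicated team has yet to choose a design, is a supermajority vote among reviewer nodes before any algorithm change is accepted:

```python
# Minimal sketch of one conceivable review rule (an assumption, not a
# settled design): an algorithm change is accepted only if a
# supermajority of distributed reviewer nodes approves it.

def change_accepted(votes: list, threshold: float = 2 / 3) -> bool:
    """votes is a list of booleans, one per reviewer node."""
    return len(votes) > 0 and sum(votes) / len(votes) >= threshold
```

A threshold above a simple majority makes it harder for a coordinated minority of nodes to push through a manipulative change, which is exactly the concern raised above.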

The Use Case

Two user interfaces will be developed: one for media publishers and one for their readers. For publishers, an application will provide access to all verified data and offer the possibility to submit new data for review. Additionally, the language-processing AI will be available to check written text for underlying assumptions, enabling more reflective and cautious writing. For readers, a browser extension will provide access to the credibility scores of the presented data and show the assumptions used to shape a given narrative. The precise implementation is not yet clear; however, first ideas are being tested, which can be seen through the links below!

A long way to go


To get it right and achieve the intended objective, every aspect needs to be implemented with great attention to detail, after a thorough analysis of possible unintended consequences and ethical concerns. Furthermore, several technological innovations and new solutions need to be developed. This project therefore requires a large team with a wide range of expertise.

What is truth and how can we find it?

To answer this question in a thorough and unbiased manner and to ensure close ethical oversight, a team of philosophers, epistemologists, ethicists, and journalists will be assembled. This team will determine the underlying principles used to develop the algorithmic search for truth, as well as the democratic process for constantly reviewing and improving it. It will then be up to a team of computer scientists and developers to translate those principles into a working application and to provide all the necessary technological innovations. Close interdisciplinary collaboration between the theoretical and technical teams is essential.

Technology development

Several technological innovations are still necessary to implement the proposed solution. These include new database structures, elaborate search algorithms, algorithmic data contextualization, and cryptographic entanglement. Combining techniques from a variety of fields poses an interesting challenge and promises to be a source of meaningful innovation.

Funding

The next step is the development of a concrete project plan and proposal. Furthermore, relevant experts will be consulted to compile a more detailed list of the required expertise and possible risks. This will make it easier to acquire the necessary funding and expertise. The funding could come from research institutes or the private sector; however, full autonomy of the project must be guaranteed to prevent corporate manipulation of Veritá's quest to find the truth.
