fbTREX is a tool to empower citizens facing the opaqueness of Automated Decision Making in their lives. We address Facebook as the first platform because the personal and social impact of News Feed algorithms is, in our opinion, one of the most significant violations of freedom of our time.
It also aims to explore the benefits of publicly owned datasets, to collect forensically valid evidence that citizens can use to exert their rights, and to work with academia to educate people on algorithmic influence.
Algorithms are supposed to help us discern and select meaningful content among the enormous amount of information available. Implicitly, an algorithm is a de facto delegate that prioritizes content. This set of values, implemented in the algorithm, should be under the control of the users themselves, because no one but they know what matters most to them. Empowering every user, letting them decide which algorithm to use, making algorithms shareable, allowing them to be remixed; this is what we call “algorithm diversity.” It stands opposed to the algorithmic hegemony currently imposed by platforms like Facebook. This goal sits at the intersection of politics, technology, and freedoms, and we should consider it no different from freedom of speech and choice. Algorithm diversity is the techno-political goal of this project.
The mandate of fbTREX is to focus on technology, letting other partners be the domain experts in the social, political, and governance issues fbTREX can be used to address.
Targeted political advertising is an issue, and in 2018 we focused on it as one of our main use cases, but it is just one among many. Misinformation is another, along with polarization and extremization of content, echo chambers, content moderation, algorithmic censorship, and automated decision-making accountability; they are all symptoms of the current design of the platforms.
We want to support research and projects that address these topics, with the strategic goal of reaching a wide variety of users, offering them alternative perspectives on how social networks should work and on how data could be treated in the interest of society rather than exploited.
If it is true that “you are what you eat”, then in the information age, “you are what you know”. If your information diet is composed almost exclusively of puppies, memes, and selfies, then maybe you want to be aware of it. If you have 500 contacts, but your timeline is composed of the same 30 friends, you should be able to understand why. This is nothing less than a distorted perception of reality, a reality that social networks present to users. Facebook provides neither statistics nor clarity as to why it shows you certain things related to your interests. Analytics can help users assess how they are using the platform, whether the interaction is productive, and how much time they spend on it.
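As an illustration of what such analytics could look like (a hypothetical sketch, not part of fbTREX's actual codebase; the post fields and category names are invented for the example), one could summarize an information diet from collected timeline entries by counting how many distinct authors actually surface and what share of posts each content type represents:

```python
from collections import Counter

def diet_summary(posts, total_contacts):
    """Summarize an information diet from a list of timeline posts.

    Each post is a dict with hypothetical 'author' and 'kind' keys
    (e.g. kind in {'photo', 'meme', 'news', 'status'}).
    """
    authors = {p["author"] for p in posts}          # who actually appears
    kinds = Counter(p["kind"] for p in posts)       # content-type counts
    total = len(posts)
    return {
        "distinct_authors": len(authors),
        "author_coverage": len(authors) / total_contacts,
        "kind_share": {k: n / total for k, n in kinds.items()},
    }

# Example: 500 contacts, but only a few surface in the timeline.
posts = [
    {"author": "alice", "kind": "meme"},
    {"author": "alice", "kind": "photo"},
    {"author": "bob", "kind": "meme"},
    {"author": "carol", "kind": "news"},
]
summary = diet_summary(posts, total_contacts=500)
print(summary["distinct_authors"])    # 3
print(summary["kind_share"]["meme"])  # 0.5
```

A user seeing that three authors produce their whole feed, or that half of it is memes, gets exactly the kind of awareness described above.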
At the moment, the debate on the influence of Facebook's algorithm is based solely on anecdotal evidence. This is a consequence of the very nature of the Facebook experience. The timeline is ephemeral and personalised, and Facebook doesn't provide any information as to how or why content is shown in it. Once the user refreshes the page, that chunk of information is lost.
Because social networks provide a personalized experience, the technology we developed records the timeline as each individual user sees it. The research questions of these analyses are to better understand how the platform interacts with users' data, and to obtain third-party observation of the platform's algorithm, whose behavior can be inferred by comparing a statistically significant number of timelines.
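One simple way to compare timelines is to measure how much two users' feeds overlap. The sketch below is illustrative only (the post IDs are invented, and Jaccard similarity is one possible metric, not fbTREX's stated method):

```python
def jaccard(timeline_a, timeline_b):
    """Overlap between two timelines, each given as a collection of post IDs.

    Returns 1.0 for identical feeds and 0.0 when no post is shared,
    i.e. fully personalized experiences.
    """
    a, b = set(timeline_a), set(timeline_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two users following the same pages may still see different posts.
user1 = {"p1", "p2", "p3", "p4"}
user2 = {"p3", "p4", "p5", "p6"}
print(jaccard(user1, user2))  # 2 shared posts out of 6 total -> 0.333...
```

Aggregated over many user pairs, low overlap between otherwise similar users is exactly the kind of third-party evidence of personalization described above.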
We will visualize the differences for the user in an easy-to-understand way. On top of this educational experience, we want to implement accessible interfaces and designs that allow users to experiment with different algorithms. An algorithm is a form of prioritization that is here to stay in our lives; we want to let users experiment with their data to see if they can develop a better algorithm than Facebook's. It might seem like an impossible task given the unmatchable capacity of the two entities. Still, we have a chance: as free-willed human beings, we are curious and often change, especially when we take input received outside social networks and bring it back into our online activities, and this curiosity can and should be reflected in our interactions within social networks. We believe that a more personalized algorithm “owned” by users might eventually perform differently than Facebook's, and that users could pick the most appropriate one in different situations of daily life. fbTREX offers a new potential reality of how users and algorithms can coexist, by giving users agency and ownership over the algorithm and letting them experiment with all the potential alternatives for social interactions, information, and news that an algorithm can display. Display the possibility, suggest a new potential, let users imagine a new reality. Show the power of the algorithm; let the user experiment with alternatives.
We say “perform differently” and not “perform better than Facebook” because better or worse is a matter of metrics. To be clear, we are still exploring these metrics; our first scientific publication and the initial months of research have been devoted to exploring how algorithms can be measured, which is necessary before we start to develop any kind of replacement.
You can download the 15-page Project Vision and Status as a PDF; it was released in winter 2018.