fbTREX collaborates with researchers and journalists, providing technical assistance and useful datasets to expose Facebook’s personalisation algorithms and their impacts on society.
Facebook has immense influence over the flow of information today. Its personalisation algorithm feeds us information based on our existing orientations, reinforcing confirmation biases and group polarisation in our societies. We increasingly lack common understandings as personalisation algorithms design bespoke realities for each of us, divergently shaping our lifestyles, moods and opinions on a daily basis. Facebook’s domination over, and enclosure of, information has led to an extreme knowledge asymmetry between Facebook and society. Not only does Facebook understand us better than we understand it, due in part to its divisive personalisation algorithms; it increasingly understands our societies better than we do ourselves.
This problem is not easy to solve, as it is difficult for regulatory bodies to address platforms whose very functioning is kept secret.
fbTREX provides open technology solutions and research-quality datasets useful to researchers interested in discovering, investigating, and addressing the social impacts of personalisation algorithms. We hope to foster interdisciplinary cooperation and knowledge-sharing among a growing, global community in the production of evidence-based critiques of personalisation algorithms based on the perspectives and priorities of people and cultures outside of Palo Alto.
So far, fbTREX has contributed to peer-reviewed publications and other reports, and continues to support emergent research and experimentation, discovering diverse ways of reusing its datasets. The time-sensitive, filtered news feed content collected by fbTREX gives data analysts empirical evidence for studying real-world patterns behind the social problems presented by personalisation algorithms.
fbTREX can also facilitate investigations into algorithmic discrimination and data exploitation, as it is ideal for comparing different users’ news feed content and advertisements based on parameters like gender, political orientation, age, likes, device, etc. This can help us gain insights into issues seen, for instance, in reports of advertisers excluding protected groups from seeing their ads on Facebook. We wish to support strategic litigation to this end, and to aid regulatory authorities in investigating whether they can adequately address the issues that arise from personalisation algorithms.
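As a minimal sketch of the kind of comparison such a dataset enables, the snippet below contrasts which sources appear in the feeds of two user groups. The field names, group labels, and sample records are entirely hypothetical illustrations, not the real fbTREX schema:

```python
from collections import Counter

# Hypothetical sample of feed observations: each record is one news feed
# impression, tagged with an attribute of the observing user.
# (Illustrative data only -- not actual fbTREX output.)
impressions = [
    {"user_group": "A", "source": "OutletX"},
    {"user_group": "A", "source": "OutletX"},
    {"user_group": "A", "source": "OutletY"},
    {"user_group": "B", "source": "OutletY"},
    {"user_group": "B", "source": "OutletY"},
    {"user_group": "B", "source": "OutletZ"},
]

def source_shares(records, group):
    """Fraction of a group's impressions coming from each source."""
    counts = Counter(r["source"] for r in records if r["user_group"] == group)
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}

shares_a = source_shares(impressions, "A")
shares_b = source_shares(impressions, "B")

# Sources shown to one group but never to the other are a starting
# point for investigating possible algorithmic filtering.
only_a = set(shares_a) - set(shares_b)
only_b = set(shares_b) - set(shares_a)
```

The same pattern extends to any attribute collected alongside the feed data (age, political orientation, device), replacing the share comparison with whatever statistic suits the investigation.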
If you are interested in utilising fbTREX datasets in your research or investigations, please reach out to us at firstname.lastname@example.org. We are also working on an experimental tool for analysing the YouTube algorithm. You might also find our statement on data activism worthwhile.