Researcher

fbTREX is collaborating with researchers and journalists, providing technical assistance and useful datasets to expose Facebook’s personalisation algorithms and their impacts on society

Facebook has immense influence over the flow of information today. Its personalisation algorithm feeds us information based on our existing orientations, reinforcing confirmation bias and group polarisation in our societies. We increasingly lack common understandings as personalisation algorithms construct bespoke realities for each of us, shaping our lifestyles, moods and opinions in divergent ways on a daily basis. Facebook’s domination over and proprietisation of information has produced an extreme knowledge asymmetry between Facebook and the societies it operates in. Not only does Facebook understand us better than we understand it, due in part to its divisive personalisation algorithms, it increasingly understands our societies better than we do.

This problem is not easy to solve, as it is difficult for regulatory bodies to address platforms whose very functioning is kept secret.

fbTREX provides open technology solutions and research-quality datasets useful to researchers interested in discovering, investigating, and addressing the social impacts of personalisation algorithms. We hope to foster interdisciplinary cooperation and knowledge-sharing among a growing, global community in the production of evidence-based critiques of personalisation algorithms based on the perspectives and priorities of people and cultures outside of Palo Alto.

So far, fbTREX has contributed to peer-reviewed publications and other important reports around elections and referendums in 14 countries, and it continues to support emergent research and experimentation, uncovering diverse ways of reusing its datasets. The time-stamped, algorithmically filtered news feed content collected by fbTREX gives data analysts empirical evidence with which to study real-world patterns in the social problems created by personalisation algorithms.

Our collected data is only useful to understanding Facebook if it is utilised for research: fbTREX prioritises the protection of personal data. Access to the full fbTREX dataset is strictly limited to researchers analysing collective phenomena in the public interest. We ask individuals to share only some of the data that Facebook gives them – the goal is to study social media influence, not the participants themselves. Still, this information can contain a great deal of personally identifying information (PII). fbTREX’s ethical policy therefore imposes the following limits:

1. We observe only news feeds, not individual profiles or pages.
2. We store only posts marked as “public,” not posts restricted to friends.
3. Users who install the extension retain full control over their data. They can delete all of the data they have provided at any time.
4. Third-party access to user data is granted by each user on a case-by-case and strictly opt-in basis.
5. Analyses run on the dataset are strictly aimed at understanding social phenomena, not individuals.

fbTREX can also facilitate investigations into algorithmic discrimination and data exploitation, as it is ideal for comparing different users’ news feed content and advertisements across parameters like gender, political orientation, age, likes and device. This can help us gain insight into issues such as those seen in reports of advertisers using Facebook to exclude protected groups from seeing their ads. We wish to support strategic litigation to this end, and to aid regulatory authorities in determining whether they can adequately address the issues that arise from personalisation algorithms.

If you are interested in utilising fbTREX datasets in your research or investigations, please reach out to us at support@tracking.exposed.

CTA options

Collaborate with fbTREX to expose the influence and abuses of Facebook
Join a collaborative effort to investigate personalisation algorithms
Collaborate with us to research the dangerous social effects of personalisation algorithms
Join a collaborative effort to expose personalisation algorithm influence and abuse