This project's goals and ethos

fbTREX operates to expose how tracking and profiting from user data has a negative impact on society, so that corrective political and civil actions can be taken to address these issues.

We raise awareness, offer assistance, and develop tools to help individuals and organizations face the challenges posed by data tracking. We provide analytical instruments to reuse the data collected, restoring the benefits of this data to its rightful owners.

We are techno-activists, but we operate in the interest of all of society – we strive to elucidate what has been hidden, and to make understandable what has been obscured, through the use of clear language, free thinking, and unfailing optimism.

Responsible use of technology

We recognise that technology can elevate human potential, but we believe that this pursuit brings serious responsibilities. Users must be aware of the implications of the technology that they use, and companies have an ethical obligation to operate as transparently and fairly as possible.

Shared knowledge

We believe that knowledge symmetry is the greatest equalizer between users and companies, allowing users to make informed decisions about the risks and advantages of technology. We undertake and advocate an open approach to knowledge, and do our best to share what we know.

Critical thinking

Critical thinking enables us to question the world around us and search for answers, instead of just accepting the status quo. We consider this to be the most crucial element of a healthy relationship with technology.

Technical ethos

We do not share full access to the fbTREX dataset with third parties – we believe this asset has been provided in trust by our users, and we pledge to protect it.

At the moment, our data protection model is by policy, but as part of our development plan, we will improve our technology to offer data protection by design.

When installing fbTREX, users share only some of the data that Facebook provides to them, in the form of public news feed posts. Still, some personally identifying information (PII) may be present in this data, in particular because it can be linked to a user's profile before anonymisation. As we collect data only in the public interest, transparency and fairness are essential to our handling of it. We have no business model to develop, and we will not use our datasets for profiling.
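The separation described above — keeping the public post content while breaking the link to the contributing user's profile — might be sketched as follows. This is an illustrative sketch only, not fbTREX's actual pipeline; all names and fields here are assumptions:

```python
import hashlib
import hmac

# Hypothetical server-side secret: with a keyed hash (HMAC), pseudonyms
# are stable but cannot be reversed or recomputed by outsiders.
SERVER_SECRET = b"replace-with-a-real-secret"

def pseudonymise(user_id: str) -> str:
    """Replace a contributor's identifier with a stable, non-reversible token."""
    return hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def anonymise_post(post: dict, user_id: str) -> dict:
    """Keep only public news feed fields; drop anything tied to the profile."""
    return {
        "contributor": pseudonymise(user_id),  # token, not the profile ID
        "post_text": post.get("text", ""),
        "source_page": post.get("source", ""),
        "seen_at": post.get("timestamp"),
    }
```

The keyed hash means two posts from the same contributor can still be grouped for analysis, without the stored record revealing which profile they came from.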

fbTREX's ethics define the following limitations, which we set for ourselves and for any third parties handling fbTREX datasets, in both the collection and the analysis of data.

  1. We only collect public news feed data. We do not collect data from users’ profiles.

    • This represents the difference between fbTREX, which enables self-assessment, and social media intelligence. We do not engage in social media intelligence, as we consider it to be a potentially abusive practice.
    • We consider the public data from users’ news feeds to potentially contain personally identifying information (PII), and treat it as such.
  2. We only store public news feed data on our servers.

  3. Users who install the fbTREX browser extension have full control over their data.

    • Users can halt our collection of their data at any time by uninstalling the extension.
    • Collected data can be deleted at any time on request to
  4. We follow a strict opt-in policy. Users must meaningfully opt in before their data is shared with any third party.

  5. fbTREX will only run analyses on our datasets to study social effects of personalisation algorithms, not the individuals participating.

    • While this cannot be formally verified yet, as part of our development we intend to formulate and publish updates on the safeguards we are implementing in our research.
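The strict opt-in rule in point 4 amounts to a simple gate: a post reaches a third party only if its contributor explicitly opted in for that recipient. A minimal sketch, with all names and data shapes assumed for illustration:

```python
def shareable_posts(posts, opt_ins, third_party):
    """Return only posts whose contributors opted in for this third party.

    posts:       list of dicts, each with a "contributor" pseudonym
    opt_ins:     mapping of contributor pseudonym -> set of approved recipients
    third_party: identifier of the recipient requesting access
    """
    # Default to the empty set: no opt-in record means no sharing.
    return [p for p in posts
            if third_party in opt_ins.get(p["contributor"], set())]
```

The default-deny behaviour (absence of a record means no sharing) is what distinguishes opt-in from opt-out.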


In July 2018, we received confirmation from the European Research Council that our 12-month development plan has been funded as a joint project with the University of Amsterdam. We will pursue partnerships and additional funding complementary to this project plan.

fbTREX is a free software project designed to promote accountability for personalisation algorithms and to increase understanding and awareness of their effects. We aim to address Facebook's closed model as a decentralized community. As we exist in the public interest, we do not sell our data. fbTREX intends to achieve the following goals in collaboration with partners who share the values, priorities, and ethics described above.

Short-term goals

Here are some of our specific goals for the coming months.

  1. To enlarge and diversify our community. We plan to do this by offering users more features:

    • To help social media users address misinformation issues by improving our ability to visualise the information diet of users and making our UX more accessible to a broader audience.
    • To focus on additional political contexts affected by content moderation and algorithm filtering; not just elections, but also conflict areas and other politically-charged events.
    • To develop awareness of the impacts of personalisation algorithms.
  2. To affirm fbTREX as a valid forensic evidence collector by:

    • Identifying target users who are subject to particular legal rights.
    • Using evidence collected to criticise Facebook’s content moderation and raise questions about the news feed algorithm.
    • Contributing to the publication of more peer-reviewed studies and reports utilising our datasets.
  3. To educate various stakeholders on ethical data usage by:

    • Collaborating with additional research groups and partners.
    • Offering a clear and transparent opt-in procedure for our data processing.
    • Fortifying our role as technical supporters for politically-motivated activism globally.

Long-term goals

Throughout 2019, we intend to make considerable progress toward our vision for fbTREX. We look forward to the following advancements:

  1. Providing our community more granular engagement with their information diets by:

    • Enabling users to experiment with, customise, and share visualisations to better reflect their individual values, priorities, and perspectives.
    • Enabling users to customise their algorithms, representing a major step toward algorithmic diversity.
  2. Enabling our community to compare their news feeds with others, based on mutual agreement. We hope to involve a diverse community to compare feeds globally, in order to observe and verify whether fbTREX tools can be used to address misinformation and foster greater understanding of each other’s viewpoints.

  3. Providing third-party researchers access to our anonymised datasets, so they can contextualise their findings, combine them with additional data, and complement them with social science methodologies to arrive at more rigorous conclusions. We hope that through this process we will contribute to ethical data use models by:

    • Protecting user data by sharing it only with networks and researchers with whom we have built trusted relationships.
    • Continuing to develop a framework for ethical data reuse. We hope to set an example for the potential of collecting datasets in the public interest, allowing us to better understand social phenomena while protecting individuals.