11/18/2021 | Press release | Distributed by Public on 11/18/2021 13:03
The intersection of technology and civil rights is an emerging space that requires further attention from the entire industry, especially given the perceived rise of digital discrimination and bias. As questions have been raised about technology's potential effects on members of marginalized communities, we've heard the calls for more research.
Today, I want to explain more about our approach to understanding how people from marginalized communities experience Meta technologies. Some people have said that their opportunities are limited or that they're having a different experience than others, but we don't have the data to fully understand what may be happening and why. We can't address what we can't measure, so establishing a more accurate measurement framework is vital to creating more inclusive products, policies and operations across the company.
To do this right, we can't work alone. We're consulting with the civil rights community, privacy experts, academics, regulators and other organizations on the best way to measure these potential differences in people's experiences. As we've explored options for measurement, experts reaffirmed that any work we do must take into account privacy, security and transparency.
To start, we plan to introduce a framework for studying our platforms and identifying opportunities to increase fairness when it comes to race in the United States. We have explored using aggregate US Census and ZIP code data, which is an accepted way of measuring demographics in the US. However, this approach has some limitations. To address them, we plan to augment that approach with two methodologies that will produce more accurate insights, in a way that allows important measurement while honoring people's privacy.
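To give a sense of what ZIP-code-based measurement looks like in principle, here is a minimal, purely illustrative sketch. It is not Meta's implementation and is not drawn from the technical paper; the ZIP codes, group names and shares are hypothetical. The key property it shows is that Census shares are summed over a population in aggregate, so no individual is ever assigned a demographic label.

```python
# Illustrative sketch only (not Meta's method): estimating the aggregate
# demographic makeup of a user population from hypothetical ZIP-code-level
# Census shares, without classifying any individual.

from collections import Counter

# Hypothetical Census-derived shares: fraction of residents in each ZIP
# code belonging to each coarse, made-up group.
CENSUS_SHARES = {
    "10001": {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15},
    "60601": {"group_a": 0.40, "group_b": 0.45, "group_c": 0.15},
    "94105": {"group_a": 0.35, "group_b": 0.25, "group_c": 0.40},
}

def aggregate_demographics(user_zips):
    """Average per-ZIP Census shares across a population to produce an
    expected aggregate distribution; individuals are never labeled."""
    totals = Counter()
    counted = 0
    for z in user_zips:
        shares = CENSUS_SHARES.get(z)
        if shares is None:
            continue  # skip ZIPs with no Census coverage
        counted += 1
        for group, share in shares.items():
            totals[group] += share
    if counted == 0:
        return {}
    return {group: s / counted for group, s in totals.items()}

population = ["10001", "10001", "60601", "94105"]
print(aggregate_demographics(population))
# → {'group_a': 0.4625, 'group_b': 0.325, 'group_c': 0.2125}
```

The sketch also hints at the limitation the post mentions: ZIP-level aggregates assume residents mirror their neighborhood's demographics, which is exactly the inaccuracy the two augmenting methodologies aim to reduce.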
Learn more about these methodologies in our technical paper. While this work will initially focus on race in the US, it will help us lay the groundwork for how to address concerns from other marginalized communities here and around the world, consistent with our corporate human rights policy.
I joined the company at the beginning of the year to establish a new civil rights organization as part of our commitment following an independent audit of our policies and practices. As civil rights expert Laura Murphy stated in the audit, Meta has a "responsibility to ensure that the algorithms and machine learning models that can have important impacts on billions of people do not have unfair or adverse consequences." We know that this journey will not be easy. However, we remain committed to this work, to doing it thoughtfully, and to being transparent about our efforts.