Product Inclusion

Besides lacking diversity itself, the tech sector operates in a system that amplifies the voices of the majority, resulting in the neglect of underrepresented people. For example, female crash test dummies only became available in late 2022; until then, vehicles designed around male dummies contributed to higher death and injury rates for women in car crashes. Furthermore, recidivism algorithms are more likely to rate Black defendants as higher risk for criminal offense than white defendants.

This lack of consideration for underrepresented communities results in products that don’t work for everybody; facial recognition algorithms have largely been trained on white and male faces, which means lower accuracy rates for everybody else.

Product inclusion emphasizes taking active steps to include underrepresented communities in the product development process, ensuring it works for them and all those who use the technology.

Readings

Algorithmic Justice League, Mission, Team and Story

Recommended time: 9 minutes. The Algorithmic Justice League, a digital advocacy nonprofit fighting algorithmic bias, highlights its origin and approach.

Google, Belonging in Products

Recommended time: 5 minutes. Here, Google highlights a few of its recently implemented features that promote product inclusion, such as the “Black-owned” tag on Google Maps and a more representative skin tone scale to standardize online content.

IDEO.org, On Gender, Identity & Intersectionality

Recommended time: 17 minutes. In this piece, two team members from nonprofit design studio IDEO.org describe their journeys toward designing for gender equality and the lessons they’ve learned along the way.

Privacy

Companies collect a significant amount of information about their users for many reasons — including improving their products and delivering relevant advertising (which, for a company like Google, where 80% of revenue comes from advertising, matters a lot!). In the present day, we take advantage of free services by giving companies our data. Privacy efforts in tech focus on ensuring that users have control over access to their data. There are numerous instances in which seemingly benign data can be used to gain insights about people: in 2012, for example, Target’s analytics predicted that a teenage girl was pregnant before her family knew. There are also more explicit examples of how poor privacy controls have led to mass data leakage, such as the Facebook–Cambridge Analytica scandal, in which Cambridge Analytica was able to harvest millions of user profiles.

Washington Post, Tour Amazon’s dream home, where every appliance is also a spy.

Recommended time: 15 minutes. Article PDF without paywall (note that the animations are not well represented in this PDF). The Washington Post delivers an interactive article showing the amount of data that Amazon smart gadgets collect on a typical user and why it matters.

You Should Get Paid For Your Data (Episode 2)

Recommended time: 5 minutes. The New York Times made a three-part video series with Jaron Lanier, a Silicon Valley thought leader who has a vision for how we can build a society where individuals have control over their personal data.

ACLU, In Big Win, Settlement Ensures Clearview AI Complies With Groundbreaking Illinois Biometric Privacy Law

Recommended time: 7 minutes. The ACLU describes how Clearview AI, a company that scraped the open web to build a facial recognition tool, has responded to the Illinois Biometric Privacy Law.

Poverty Reduction