Driving towards a double bottom line, through participation and choice

by
Ashley Boyd and Mark Surman
from The Mozilla Blog

As global political landscapes shift, mission-driven organizations face a critical challenge: creating resilient models that deliver meaningful social impact and financial stability.

For over 25 years, Mozilla has worked to strike this balance across diverse political and economic climates. While Mozilla's structure is unique - a non-profit Foundation as the sole shareholder of multiple commercial subsidiaries - our approach to social impact doesn't only rely on our unique governance model. It also requires a portfolio-wide commitment to using technology development, industry influence and consumer power as a way to build meaningful alternatives.

From 2016 to 2024, we ran several advocacy programs focused on the consumer power piece of this equation. This post reflects back on this work with the hope that leaders of other social impact organizations can learn from the path that Mozilla has traveled. We focus on three concrete ways to create change by focusing on mobilizing consumer demand - for technology they love and trust, and for social impact.

Mozilla's advocacy history

Mozilla's advocacy philosophy centers on creating change by building technology with public support and participation. Mozilla co-founder Mitchell Baker identified Mozilla's core strength as advocacy through building technology, using an "architecture of participation."

In the early 2000s, Mozilla channeled public frustration over Microsoft's growing monopolization of computing and the web, sparking both a consumer backlash and a cadre of open source projects aimed at creating alternatives.

These projects - and the responsible tech companies that many of them spawned - showed that you could push technology in a better direction by creating alternatives. Starting in 2016, we decided that Mozilla should not only continue to build alternatives but also that we should expand our efforts to include advocacy programs that invited the public to participate in this future, both in helping to define what alternatives should be built and to push on tech companies to do better.

This included three key strategies to increase and meet consumer demand for "trustworthy AI" and privacy-centric products.

Empowering consumers

Consumer power grows when and where we cultivate it. Where are consumer sentiment and commercial offerings out of sync? These intersections present a rich opportunity for mission-based organizations and companies to stand out and make a difference.

Beginning in 2017, our Mozilla team noted a large gap between consumer attitudes about privacy and the market share of privacy-preserving products in the U.S. and elsewhere. When we asked our global community for insight, the same questions surfaced again and again: "Do products that respect consumer privacy exist?" and "How can I tell if a product I own (or want to buy) respects my privacy?" Indeed, to our surprise, we couldn't find a comprehensive, accessible consumer guide to privacy-preserving tech!

We set out to address this gap by launching Mozilla's Privacy Not Included initiative. From 2017 to 2024, the Privacy Not Included team researched and evaluated the privacy and security practices of over 2,000 consumer technology products, platforms and apps. Their product reviews balanced accessibility and reliability, which translated into detailed reviews with a strong dose of snark and humor. We performed the research everyone knows they should do before using a product, but never does.

From a public engagement and empowerment perspective, Privacy Not Included was an instant success. It quickly reached millions of consumers globally through Mozilla platforms, social and traditional media, as well as organic sharing. The guide's unique content drew more than a million unique visitors to Mozilla's website each year, while the project has garnered more than 7,000 local, national and global media stories to date. Privacy Not Included also sparked ongoing, year-round engagement with our community as thousands of readers nominated products for review and thousands more rated products using our custom "creep-o-meter" rating system.

As the reach and visibility of Privacy Not Included grew, the project further empowered consumers by providing a platform to pressure companies to improve the privacy and security of products already in use. It was a natural evolution that connected easily with our other advocacy capabilities but, honestly, it wasn't something we planned ahead of time. Connecting our research and campaigns functions, the Mozilla Foundation regularly issued action alerts calling on companies with the lowest-rated products to improve their products and policies right away. In response to our Privacy Not Included campaigns, companies reached out to us to complain, inquire and ask questions, and ultimately made significant improvements.

As proof of the impact: several large tech companies changed their product release schedule to ensure our team would review their updated products in the next issue of Privacy Not Included.

Privacy Not Included's research and public campaigns have, to date, sparked more than 150 improvements to consumer technology products used by billions. Among the most far-reaching were the changes that resulted from our 2023 investigation into data collection and sharing by car manufacturers. Our researchers uncovered data collection and sharing by all 25 major car manufacturers, with many collecting vast amounts of personal information, including sensitive data like genetic information and sexual activity, and sharing it with third parties.

The report sparked widespread public outrage and media attention (over 900 press stories), and prompted further investigation by Senator Markey and the Federal Trade Commission (FTC). In response, the car manufacturers' association (the Alliance for Automotive Innovation) announced its support for federal data privacy legislation for the first time. Additionally, Toyota, Lexus and GM pledged to stop selling driver data to data brokers and to expand drivers' ability to delete their data. (There is now a court case pending in Nebraska based on this research.)

Both by design and by accident, Privacy Not Included expanded its impact well beyond providing consumers with reliable, accessible information about the privacy (or not) in consumer technology products. Its research prompted meaningful, voluntary actions by companies unaccustomed to being investigated and called out for invasive features and practices. Today, "responsible technology" is increasingly the default expectation of consumers and a bar companies seek to meet.

Shaping products

Responsible products aren't always "born." More commonly, they evolve through incremental improvements, including changes driven by consumer demand and relevant regulations. Trusted organizations can effectively surface, translate and mobilize public opinion to spark actionable changes by companies and policymakers.

Mozilla tackled each of these functions when we created a platform for consumers to share their experiences with YouTube's recommendation algorithm and participate in research to demonstrate what changes were needed to improve YouTube's safety.

Several years after scrutiny of social media's role in the spread of misinformation during the 2016 U.S. election, questions began to emerge about the role of YouTube's recommendation algorithm in spreading viral misinformation. At the time, no significant independent research had been conducted to show whether - or how - misinformation was spreading on the platform. To learn more, we asked Mozilla supporters to tell us about their experience with video recommendations on YouTube. We received over 2,000 stories highlighting specific instances in which people were recommended violent content or misinformation unrelated to their previous viewing history or searches.

These powerful stories compelled us to further examine YouTube's recommendation algorithm using research methods that would validate our concerns and spark action by Google and policymakers. In 2020, we launched "YouTube Regrets," a global crowdsourced research study analyzing data from more than 37,000 participants across 190 countries. Based on our extensive analysis of the data submitted from user sessions, we validated the platform's practice of recommending extreme content users had not requested. Our findings also highlighted striking differences between the volume and type of content recommended to users in the U.S. and in other countries.

The YouTube Regrets research and its crowdsourced methodology brought people's real-life experiences to life and gained widespread media attention, followed by extensive interest among global policymakers. Mozilla held briefings with policymakers, including the European Commission as it was crafting the EU Digital Services Act, legislation introducing sweeping new requirements for technology platforms. Our body of work on YouTube's recommendation algorithm - both the findings themselves and the gap in independent research to uncover platform issues - highlighted the need for transparency from technology platforms and independent researcher access to platform data.

In response to press inquiries and mounting consumer pressure, YouTube product leaders questioned the validity of our research findings and recommendations. However, the increased public scrutiny and our ongoing, direct conversations with YouTube leadership appeared to spark significant changes within YouTube. In 2021 and 2022, the company announced a series of voluntary changes to better surface "helpful" content and reduce recommendations based on engagement metrics alone.

The greatest changes came with the passage of the EU Digital Services Act, which requires platforms like YouTube to grant independent researchers access to platform data so they can identify and advocate for improvements. Starting in 2023, YouTube opened its doors to independent researchers, and research is underway. The "YouTube Regrets" research was cited in the Digital Services Act language, validating the need for and impact of platform research of this kind. Mozilla's community-backed campaign not only succeeded in changing one powerful platform but also in creating a groundswell of support for transparency across technology platforms.

Building alternatives

Sometimes, responsible alternatives simply don't exist and must be created from scratch. This option can be daunting, but it is doable for mission-based organizations, particularly those with engaged communities. Building tech alternatives through the power of community is the "architecture of participation" in action.

Mozilla undertook this approach when launching an initiative to address bias in voice recognition systems caused by limited training data. Beginning in 2017, Mozilla set out to build an open dataset of diverse voices through crowdsourcing and consensual data collection. The goal was to build a dataset of underserved languages, accents and ages in order to power equitable speech recognition technology. Today, Common Voice is the world's largest crowdsourced open speech dataset, and enables community and commercial projects to offer voice-enabled services in underrepresented languages.

Common Voice's dataset was created (and continues to grow) from thousands of hours of speech contributions from the Mozilla community and beyond. To engage Mozilla's existing and new supporters, we shared our vision of advancing equitable speech recognition technology and created a highly accessible platform to solicit contributions. Importantly, speech contributions and other volunteer activities do not require technical or AI expertise, making participation open to a wide range of supporters. To date, we've collected over 33,000 hours of voice data in 300 languages, contributed by more than 750,000 people.

With voice clips donated by volunteers on every continent, Common Voice reflects real-world speech and centers underrepresented voices. It's more than just infrastructure - it's a movement for data dignity and linguistic justice, and it proves that open data can be powerful, participatory and global. The project is now being used to train machine learning models, so that AI all over the world is more inclusive.

Conclusion

Mozilla's three-pronged experiment in innovative advocacy strategies - empowering consumers, shaping products, and building alternatives - offers a potential model for mission-driven organizations that seek to create lasting social impact while maintaining financial stability. This strategy has shaken up the big tech landscape and driven real impact.

As the internet has transformed in the age of AI, so has Mozilla. Mozilla Foundation's advocacy work continues, with urgent campaigns on issues like surveillance tech. Mozilla Corporation and MZLA are transforming Firefox and Thunderbird for a new era through adoption of emerging technologies. Mozilla Ventures, an impact fund, is investing in dozens of founders and start-ups, seeding a new generation of companies committed to the Mozilla Manifesto. And Mozilla.ai, an AI incubator, aims to empower developers with trustworthy AI.

Mozilla's mission remains a double bottom line: advancing our manifesto and succeeding in the market, so we can do even more to build a better internet, and better AI, with people, for people.

Ashley Boyd was Mozilla Foundation's Senior Vice President of Global Advocacy from 2017 to 2024. Mark Surman is President of Mozilla Foundation.

