Join us at MozFest 2023!

Superbloom will be hosting five sessions at the Mozilla Festival (Monday, March 20 - Friday, March 24, 2023). If you’ll be there, we’d love to see you, meet you and get to know you. Come join us! Want to learn how to design a Tech Policy playbook? Are you interested in global tech transparency? Would you like to find out how shadow data affects you? Do you want to understand design’s impact on encrypted messaging? Are you looking for how to center human rights in usability? Join us and 1000s of others at MozFest 2023! This year’s event will be held in person in Amsterdam and online, and the Superbloom team will be participating in five of the 360+ sessions. Intrigued? Read more in this post.

You can register for MozFest here, and links to each of the sessions may be found below.

The unfreedom monitor: a discussion on transparency in technology

March 21, 2023 | 10-11am EDT | 2-3pm GMT

Veszna Wessenauer and Sindhuri Nandhakumar

For info and to book:

The session’s goal is to provide space to have an interactive discussion on how greater transparency around the use and development of technologies, such as commercial spyware or algorithmic content moderation/ranking/curation, could help communities respond to the threats imposed by digital authoritarianism. The session will illustrate the utility of transparency through real-life events in environments where digital authoritarianism is growing.

The session will build on the research gathered through The Unfreedom Monitor, a project to analyze, document, and report on the growing use of digital communications technology to advance authoritarian governance worldwide.

We will introduce panelists from our research project, including from Brazil, Zimbabwe and Hong Kong, who will share country-specific insights based on the Unfreedom Monitor research and their own experiences, and show how their findings fit into the global context. Before they begin, we will ask participants to reflect on what ‘transparency in technology’ means to them; the range of interpretations could be broad, which we welcome. Participants will also be encouraged to suggest anecdotal evidence, data and case studies that they think are relevant for the panelists, and to pose questions based on this information. Our researchers will then share their insights and respond to these questions, allowing the discussion to be driven as much by participants as by panelists.

At the end of the session, participants will be asked to reflect and share how they might apply what the researchers have said and the panelists will be asked to talk about whether they learned anything new and revealing about what transparency means to both individuals and communities.

Bringing light to shadow data

March 21, 2023 | 3-4pm EDT | 7-8pm GMT

Georgia Bullen, Susan Kennedy, Cameron Hickey and Shayne Longpre

For info and to book: 

Shadow data is information about individuals that is collected and maintained by social media platforms, but is not necessarily visible or accessible to the individuals themselves. This can include data about online activity, personal preferences, and demographics, as well as data that is inferred or calculated. While some of this data is assembled with the explicit consent of the individual, much of it is obtained through other means, such as third-party data aggregation or machine learning model predictions and scoring.

The production, collection, and maintenance of shadow data has significant implications for efforts to establish transparency about social media and for society broadly. Shadow data is a known unknown: we know it exists, but we don’t know its scale and shape. We don’t know where it is used or why. Bringing greater transparency to the platforms that house our digital public square requires establishing a framework for bringing transparency to shadow data as well.

During this conference session, participants will work together to identify and classify the different types of shadow data that can be inferred and maintained by social media platforms, and will discuss the potential implications of this data. They will also propose mechanisms for providing greater transparency about shadow data, including ways in which individuals can access and control their own data, as well as ways in which aggregate data can be made available to researchers and policymakers.

Overall, the goal of this session is to bring greater awareness to the issue of shadow data and to encourage the development of approaches that prioritize individual privacy and control. By bringing light to shadow data, we can work towards a more transparent and equitable digital landscape.

Sender unknown: design’s impacts on deception and trust in messaging

March 21, 2023 | 3-4pm EDT | 7-8pm GMT

Caroline Sinders and Justin Hendrix

For info and to book:

Consumers are increasingly using encrypted messaging apps to share information, engage with one another, and conduct commerce. But while the promise of encrypted messaging is private communications and user control over the spread of personal information, the reality is more complicated. An overlapping and interconnected set of engineering, design, and system factors, coupled with varied user behaviors, create the conditions for individuals to subvert their own interests or the interests of their communities on encrypted apps.

In this workshop, participants will collaboratively identify how design failures, dark patterns and adversarial behaviors by various parties may combine to produce malicious effects in encrypted messaging. Such malicious effects may include patterns that nudge users to share personal information or forward messages to insecure channels, as well as suggestive user interfaces or flawed security mechanisms that can compromise user safety. What design choices have app makers made that lead or confuse the user into making poor security decisions? Are there insecure apps that use security UI to look more secure than they are? How does this relate to content moderation capabilities on messaging apps? How do people end up compromising their own safety? What is the prevalence of these phenomena, and what are the policy solutions?

This workshop is split into two parts. Part 1 is a presentation on our preliminary research.

In Part 2, the facilitators will break the participants into 2-3 groups. Each group will be given a specific prompt or question to work on, and then everyone will come back together to share findings, questions, and suggestions. For example, one group may focus on message forwarding within Telegram or Signal, while another may focus on Facebook messaging in conversations with tens or hundreds of participants.

Designing a tech policy playbook 101: tools and resources to tackle OGBV

March 22, 2023 | 7am-8.30am EDT | 11am-12.30pm GMT

Georgia Bullen, Ngọc Triệu, Ame Elliott, Katherine Townsend, Raashi Saxena, Candy Rodríguez, Bulanda Nkhowani and Victory Brown.

For info and to book:

Bringing design to tech policy can have material benefits for the safety and well-being of people on the web and beyond. Building on work from the last two years with the Web Foundation and many global partners, we will share the Tech Policy Design Lab (TPDL) Playbook: a framework that employs multi-stakeholder collaboration and an interactive design approach to build consensus from diverse perspectives and tackle key tech policy problems with measurable results.

The TPDL Playbook brings learnings from the first two TPDLs, led by the Web Foundation on Online Gender-Based Violence (OGBV) and Deceptive Design, to focus on addressing the urgent challenges of OGBV.

At Mozilla Festival this year, we are opening up our process and invite everyone who identifies as belonging to the global majority, is interested in and/or working on OGBV and abuses, deceptive design or similar issues, as well as anyone who wants to address tech policy issues in a multi-stakeholder way, to join our co-design session.

Keeping the web open, safe and trusted in a way that empowers everyone requires new ways of working to combat the polarization, normalization of threats, erosion of trust, and manipulative interactions threatening the internet. We believe there is power at the intersection of technology, policy, and design, and that bringing a design approach to policy problems can create positive change to make the web safe for everyone.

The workshop will be hosted via Zoom and use digital collaboration tools (Google Docs and Miro).

Centering human rights: prioritizing accessibility + privacy + usability

March 24, 2023 | 3-4pm EDT | 7-8pm GMT

Caroline Sinders and Natalie Cadranel

For info and to book:

While best practices and standards for human-rights centered concepts are emerging in the computer science and tool-building spaces, very few have been created for the research and design fields. This session will help fill this gap by presenting a research and design methodology to those developing technologies to help marginalized communities. This work is designed to streamline the process of user research, promote human-rights centered design, and aid technologists who build tools, platforms, and services.

It is challenging to ensure products or services are adaptive to a specific community’s privacy and safety needs, especially when you are building a platform to help vulnerable communities or providing people with direct resources. In light of this, we developed the Human Rights Centered Design (HRCD) curriculum, which provides best practices, use cases, and knowledge from human rights activists, community organizers, and technologists from across the globe.

In this session, we aim to share our curriculum, gather feedback from session participants, and learn from others seeking to adopt and standardize usability best practices when working on privacy-enhancing tools. To make the curriculum widely applicable, we will encourage participants to draw upon their previous practices and projects and share their experiences and suggestions. MozFest provides a unique opportunity to connect with like-minded groups, and we will use the session to find areas for collaboration and to discuss related projects.


Georgia Bullen, Susan Kennedy, Raashi Saxena.

Photo by Connor Ballard-Pateman | @PinotConoir.