- The organizations doing vital design and user research work in the Human Rights Technology space are good at sharing insights when they convene in person, but less so asynchronously, due to security and privacy concerns.
- With support from the Open Technology Fund Secure Usability and Accessibility Lab, we gathered UX Designers, User Researchers, Digital Security Trainers and OSS tool builders at the Human Rights Centered Design gathering at RightsCon 2023 for critical discussions about the challenges and opportunities of sharing user insights, and how that sharing could make OSS tools for human rights needs more context-sensitive and specific.
- Starting this discussion was a critical step in acknowledging that our informal, in-person ways of sharing are valuable, yet fall short when we are unable to meet. As we talked, we discovered the limitations of and considerations needed when sharing information, and we prototyped ideas - such as sharing through trusted funder networks - for facilitating an information-sharing process digitally.
Many designers, researchers and trainers have returned to attending events and conferences post-pandemic. Events provide a critical space for connecting with the community, sharing insights, and having organic conversations that are often challenging to have online. In-person events do introduce challenges of their own, such as visa access, funding, and the general safety and accessibility of attendees. However, certain security challenges - such as recognising who is from which organization and in what capacity - are easier to navigate at events like RightsCon, which has processes to verify and validate attendees, making it safer to share critical user insights, stories, and needs with one another. At the RightsCon 2023 Human Rights Centered Design community event on Day 0, we discussed what we could do to better augment in-person information sharing in ways that are safe, private, and empowering for human rights activist users and tool teams, without compromising risky personal information.
At Superbloom, some of our team have been focusing on this from the vantage point of improving practices for participant and researcher safety, with a particular focus on high-risk contexts. This conversation became front of mind for Eriol Fox, Senior Designer, after attending the Nonprofit Software Development Summit in November 2022, where folks who work directly as front-line support for human rights activists expressed a need to have their experiences and frustrations with human rights tools heard by the technical teams. They told stories not only of the ways in which tools did not meet user needs, but also of the workarounds they had developed in certain contexts and their frustrations with the feedback channels many tools offer. One attendee expressed exhaustion at raising an issue in GitHub or filling out a Google form, given they have “no idea how long or where that goes”, not to mention needing to provide the same or similar information in multiple tools, and they wondered what could be done to limit the time spent communicating the same cases across many tools.
Coincidentally, José Gutiérrez - as a newcomer to the Internet Freedom space who has done consultancy work for Okthanks and Internews - has seen the kinds of problems that researchers/designers, end users, and tool makers face in the space. One of those is the apparent togetherness that communities have, paired with a lack of actual information sharing. Another is the great opportunity our community has to materialize values such as openness, transparency, and collaboration on a new level.
The Human Rights Centered Design community, co-facilitated by Superbloom, held space on Day 0 at RightsCon 2023 for those practicing design functions in the Human Rights Technology space, as well as those interested in design practices in human rights work. The day was informal and centered around collaboration, conversation and problem-solving discovery sessions.
Eriol facilitated a session that explored questions around how and why we might share information about user needs, behaviors and contexts. Session collaborators broke into discussion groups based on the stakeholder group they felt most affiliated with: the tool teams, the designers/researchers, or the end-user human rights activists/defenders.
The groups were encouraged to discuss the topic of sharing user insights/research safely and securely, and were also offered a series of questions, including:
- How might tool teams ‘receive’ user insights/research directly from users, digital security trainers, designers and intermediaries through technologies and processes that are comfortable for the information-givers, rather than ones designed around the tools’ convenience?
- How can designers/researchers share user insights/research safely, securely and ethically with other designers/researchers and with relevant tool teams they aren’t directly connected with? What might that sharing enable for the tool teams or designers/researchers?
- How might end users directly participate in the sharing of their own experiences and contexts with a clear understanding of what that information might be useful or used for? What guidance would end users need? What safety measures do they need across various contexts?
The session collaborators were encouraged to discuss ideas around the above prompts, but also to generate their own questions, boundaries, needs and challenges. The hypothesis: if we can find ways for users to remain safe while sharing their insights and context, this information might directly improve tools for human rights needs, both by informing new features that prioritize human rights users’ needs and by broadening the range of tools that can be used in human rights circumstances.
What we learned from the discussions
As stated above, the purpose of the discussions was not to come up with definitive answers, but to spend the remainder of the time discussing the challenges, opportunities, concerns and details of how user insights/research can be shared amongst OSS tool teams. Here’s what each group (Tools, Design, End Users) discovered.
- The default would need to be heavy anonymisation of information, in case specific details are identifiable. We decided that the existence of a completely anonymised, redacted document was better than the information not appearing to exist at all. In this way, trusted networks of researchers and tools teams could still seek out more detailed information or less anonymised documents.
- Threat modeling for user insight could help designers/researchers better understand what needs heavy anonymisation and what doesn’t, and could be shared more openly.
- Designers/researchers and their organizations could still write reflections from more of a researcher point of view to protect the users’ contexts.
- Designers ultimately want to gain a better understanding of needs and context by sharing information across organizations and teams. We surmised that this information would rarely be used to make ‘final product decisions’, but it could help designers/researchers better shape their own research into specific subjects and contexts.
- We briefly discussed what role funders play in the encouragement of cross-organisation information sharing, and whether those funders could act as the host of a trusted network of tool teams and researchers that could securely share via an online platform which is maintained and secured by that funder.
- If a user’s insight is specific to a tool or function, that tool or platform needs a secure way to capture the information without creating a risk to its own security.
- Collecting feedback through purpose-specific surveys or collection tools that are later ‘destroyed’ could be an option for destructible feedback, gathered over a specific period of time or for a specific need.
- Some concerns were expressed about bad actors sending in information that actively stops or derails a tool’s progress, or compromises the integrity of the tool.
- Users should remain the ultimate owners of their own data and understand how that has been used, by whom, and for what. This introduces a level of complexity on any ‘platforms’ that might facilitate the sharing of user insight, but allows for critical safety measures to be integrated and also awareness of what happens to user insight.
- The ability to join together with other similar end users would allow for a generalization of user insight that could better protect specific individuals.
- Relationships between end users, the designer/researcher and a particular tool were expressed as important contexts for user insight. How could these relationships be shared, if at all?
Moving forward with secure, safe user insights sharing
At Superbloom we’re investigating how we can better facilitate feedback and research, with a focus on participant and researcher safety, including what a platform that facilitates the sharing of user research and insights could look like. This platform might contain not only the research reports and insights, but also the secure methodologies and processes we use internally to keep at-risk individuals and communities from harm. To maintain safety, these processes and methods are often shared in private confidence with fellow designers and user researchers. But what would a privacy, security and human rights OSS tool space look like if we could all benefit from the insights and knowledge we learn across tools? As open source and openness advocates, we believe that sharing as much as we can, when we can, helps to maintain a healthy OSS ecosystem with safer tools for all.
How to get involved?
We would recommend getting involved in the HRCD community. Reach out to let us know how you would like to collaborate and progress this conversation.
You can also check out the resources we co-created with Okthanks and Internews as part of the USABLE project, and the A Dev’s Guide to… Adoptable Projects resource, which focuses on design resources for security and privacy OSS tools teams.