The dream of the ‘90s is over. We now have an Internet of Things thriving in physical space where our bodies live instead of a Cyberspace envisioned as a realm solely of the mind. That’s cause for optimism as we consider how resilient organizations can collaboratively build a different future in this, our physical reality.
Note: This essay was written in December 2022 as a personal reflection on four years of education, design, and tech policy with the Open Design of Trusted Things (OpenDoTT), an EU Horizon 2020 programme, as contrasted with experiences in the 1990s tech community in the San Francisco Bay Area. It originally appeared in Reboot IoT: Regain Trust/Refresh Hope, edited by Jon Rogers and Michelle Thorne.
John Perry Barlow’s 1996 manifesto against government regulation, A Declaration of the Independence of Cyberspace, laid out a naive vision of the internet as a self-organizing place guided only by the Golden Rule. At a time when computer equipment, internet connections, and email addresses were housed primarily at universities, with private access restricted to the wealthiest people in the world, Barlow drew artificial battle lines with governments on one side and a disembodied group of denizens of Cyberspace on the other.
Published from the World Economic Forum in Davos, Switzerland, this techno-libertarian tract went on to shape internet culture. Barlow wrote, “Our identities have no bodies, so, unlike you, we cannot obtain order by physical coercion. We believe that from ethics, enlightened self-interest, and the commonweal, our governance will emerge.”
1990s cyberspace is dead
Twenty-seven years later, this way of thinking is as embarrassingly dated as the thinking of the Industrial World’s “weary giants of flesh and steel” was to Barlow. Of course Cyberspace is not separate from the material world. Of course people have bodies. Of course those bodies inform online experiences. And of course the sensors of Internet of Things (IoT) devices installed in physical places have made the idea of Cyberspace as a civilization of the mind obsolete.
Unsurprisingly for a text published from the World Economic Forum, Barlow’s A Declaration of the Independence of Cyberspace is founded on the idea of marketplaces as the obvious organizing principle for society. This ethos spawned a tech ecosystem where a handful of software companies headquartered in the United States, a physical place with a particular set of cultural values and legal frameworks, shape the internet experience for the global majority.
Maybe companies will save us from government
Over time the “us” of Barlow’s Cyberspace, the group opposed to government intervention, has been consolidated into publicly traded companies acting as custodians of the internet. The commercialization of the internet and the associated infrastructure for innovation and product development produced an explicitly capitalist theory of how technology changes.
Consumers buy products, and that changes company behavior: companies produce more of the products that meet demand and stop producing products for which there is no demand.
No, companies won’t change when it hurts profits
What can people do to change technology? Buy different things. In the advertising-fueled context of surveillance capitalism, that means not only buying alternative products, but directing attention differently: viewing different content, configuring different options, and spending time differently. The ecosystem of innovation and product development running on Human-Centered Design practices relies on a similar model of tracking user behavior to understand how to allocate resources to make product changes. The vocabulary of user engagement, organic traffic, time on task, and conversion rates guides investment.
Under this paradigm, individual choice and individual responsibility for selecting and configuring products are presented as central, with individuals suffering the consequences of their mistakes. If their privacy is violated, money wasted, or reputation damaged, that is a consequence of not understanding the application, not reading the fine print, or not being an informed customer. The public interest tech community works for change within this system through policy and advocacy campaigns to raise awareness of issues, educate people to use different products, and train them in how to use available products differently.
The result of the 1990s dream of a Cyberspace free from government control is an internet with power firmly consolidated within a small number of quasi-monopolistic companies, a configuration recognizable from the industrial age. Nowhere is this clearer than in the consumer IoT space, which is dominated by Amazon products (e.g. the Alexa voice assistant, Echo smart speakers, and Ring connected doorbells) and Google products (e.g. the Google Assistant and Nest thermostats, cameras, speakers, and doorbells). The fresh companies of the 1990s became the same type of “weary giants” Barlow abhorred.
Maybe governments will save us from companies
How did we get here? Barlow’s trust in markets was misplaced, and today’s companies are “imposing tyrannies” the way he feared governments would. What if governments are needed to protect people from companies? How would treating people as citizens rather than customers change technology development? Within the European Union, where governments enjoy more positive reputations than tech companies, this has become the dominant view, resulting in a different Theory of Change.
People elect governments to create regulations that change tech company behavior to protect citizens.
It’s worth pointing out that democracy and elections are not necessary for this Theory of Change. Governments can also make and enforce laws without input from the public. There is nothing inherently democratic about regulation, but in a European context, the actions people take as citizens to change technology rely on democratic processes. The public interest tech community and associated policy and advocacy campaigns still have an awareness-building and educational focus, but the motor of change is not commercial activity. Instead citizens are encouraged to sign petitions, make their views known to politicians, elect different people, and comment on proposed legislation.
The Declaration addresses this: “Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you.” Contemporary Europeans are enthusiastically inviting their governments to regulate technology. Enforcement of regulations is essential to this model because companies are motivated by profit to avoid expensive fines, costly legal proceedings, and the potential loss of customers in markets where their products become illegal.
Yet looking at the state of technology regulation and enforcement, there is cause for alarm. Despite the General Data Protection Regulation (GDPR) in the EU providing broad privacy protection for citizens, the day-to-day experience of using Big Tech products has changed little, and companies are making record profits without making high-profile changes. The implementation of GDPR has offloaded responsibility to individuals, who again bear responsibility for clicking the right boxes and making the right configuration choices, for example by refusing consent to data collection.
No, governments are not helping quickly enough
Does that mean governments are unsuited to the task of changing technology? No. Governments such as the European Union work in a slow, consensus-building, multilateral way. Compared to product development cycles driven by quarterly earnings, the years-long timescale of regulation is glacial. However, time is of the essence because IoT devices pose an urgent threat by extending a web of surveillance beyond the screens of 1990s Cyberspace and into the homes, schools, stores, streets, hospitals, and transit hubs of today.
The simplistic binary of governments versus companies also elides the complicated ways they work together, sometimes in ways that compromise the rights of citizens and consumers. For example, public-private partnerships abound for deploying IoT devices in civic infrastructure. Touted benefits such as saving taxpayer money and benefiting from companies’ previous experience developing technology have led to joint projects in areas including regulation of parking, monitoring usage of public transit, and managing crowds. Controlling the spread of disease during a pandemic and conserving energy during a war are adding pressure for more public-private collaborations. Yet unclear models for data ownership, transparency, and accountability in these alliances between governments and companies create opportunities for misuse that threaten the public well-being.
Is anybody coming to save us?
Despite growing threats and the absence of any quick fixes, there are still causes for optimism. First, the user experience design techniques used to develop current IoT products can be applied to undo some of the harms those products cause. Studying how ideas move from policies to pixels can create new processes for change. In this approach, policies are embodied in the user interface and change the user experience in visible ways. Awareness and education move from abstract media campaigns directly into the interface. The work of policy, regulation, and enforcement is disproportionately textual, while the experience of using the internet, particularly with connected devices, is not.
Policies become pixels (and sounds and movements)
Exploring interaction design patterns and systems that demonstrate how policies are expressed in interfaces closes the feedback loop of user behavior and tech change. The regulation discussion shifts from the featureless expanse of generic Cyberspace to an embodied experience: where the eye is drawn to click when navigating a menu, what to say to activate a voice assistant, which alert sounds convey an appropriate amount of urgency, and how a device feels in the hand as it moves. Expanding the rich vocabulary of sensorial qualities for interacting with IoT devices beyond the corporate product design world can include more people in creating those experiences, including policy makers and the public. User experience design practices such as flow diagrams and system maps make it easier to identify manipulative elements of deceptive designs and to combat them. The internet may still be 0s and 1s, but discussion of it should not be only alphanumeric.
Intentionally resilient organizations
A second reason for optimism is a set of promising experiments with organizational resilience. The current climate crisis, wars, natural resource shortages, and pandemics all require new approaches that explore systems of power in a time of decaying trust in institutions. Plugging a fraying social safety net with corporate apps will not solve these problems. But encouragingly, these cataclysmic shifts have provided fertile ground for new kinds of organizations that side-step companies, governments, and public-private partnerships.
Mutual accountability makes organizations resilient in the face of change and uncertainty. Neighborhood groups, trade associations, investment clubs, coalitions, labor unions, data ownership collectives, and membership organizations of many kinds are thriving. One factor driving this renaissance is a new attention to governance and organizational structures. Being intentional about how these structures work and explicitly naming and identifying power is an exciting alternative to waiting for top-down answers.
New collective governance structures can also subvert the emphasis on individual action and individually borne harm. Other ways of being emerge, beyond the more informed consumer or the more engaged citizen. When help isn’t coming from companies or governments, people known to each other, working in a decentralized manner, have capabilities those centralized power structures lack. We can help each other. Collectively, we are better equipped to fight back against technology that threatens our freedoms, rather than being left helpless.
Re-examining Cyberspace, described by Barlow as “both everywhere and nowhere, but it is not where bodies live,” through an IoT lens exposes this as a dangerously outmoded way of thinking. The IoT happens where our bodies live, and embodied action is the only possibility for addressing its potential harms. Two starting points to address those harms are putting policies into pixels and designing resilient organizations with innovative governance structures. Building on these suggestions, we can collaboratively build a different future.
We can save each other.
Credits
Project Contributors: Ame Elliott, industry expert for the Open Design of Trusted Things project. Team of Reboot IoT: Regain Trust/Refresh Hope: Michelle Thorne, Jon Rogers, Georgia Bullen, Babitha George, Davide Gombe, Solana Larsen, Kasia Odrozek, Irini Papadimitriou, Bas Raijmakers, and Geke van Dijk. A special thank you to the Early Stage Researchers, faculty, and staff of the OpenDoTT programme.
H2020 MSCA-ITN 2018
This project received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 813508.
Funded by the European Union.