Welcome to this week’s Tip of the Hat!
Last Saturday LDH attended the All Tech Is Human Summit, joining 150+ technologists, designers, ethics professionals, academics, and others to discuss the intersection of technology and social issues. There were many good conversations, and we're passing some of them along as you consider how your organization might approach these issues.
The summit takes inspiration from the Ethical OS Toolkit, which identifies eight risk zones in technology design:
- Truth, Disinformation, Propaganda
- Addiction & the Dopamine Economy
- Economic & Asset Inequalities
- Machine Ethics & Algorithmic Biases
- Surveillance State
- Data Control & Monetization
- Implicit Trust & User Understanding
- Hateful & Criminal Actors
Each risk zone has the potential to create social harm, and the Toolkit helps planners, designers, and others in the development process mitigate those risks. One way to mitigate risk in many of these zones (such as Data Control and Surveillance) is to incorporate privacy into the design and development processes themselves. Privacy by Design is one example: it integrates privacy throughout the entire process instead of bolting it on at the end. Much like paying down technical debt, incorporating privacy and other risk mitigation strategies throughout design and development lessens the need for intensive, short-notice resource investment when something goes wrong.
Another way to approach ethical design comes from George Aye, co-founder of the Greater Good Studio. In his lightning talk, George identified three qualities of good design:
- Good design honors reality
- Good design creates ownership
- Good design builds power
Viewed through a privacy lens (or, in the case of LDH, with our data privacy hat on), these qualities can also help designers and planners address the realities surrounding data privacy:
- Honoring reality – how can the product or service meet the demonstrated/declared needs of the organization while honoring the many different expectations of privacy among library patrons? Which patron privacy expectations should be elevated, and what is the process to determine that prioritization? What societal factors should be taken into account when doing privacy risk assessments?
- Creating ownership – how can the product or service give patrons a sense that they have ownership over their data and privacy? How can organizations cultivate that sense of ownership through various means, including policies surrounding the product? For vendors, what would it take to cultivate a similar relationship between library customers and the products they buy or license?
- Building power – building off of the ownership questions, what should the product or service do to give patrons agency over data collection and sharing when they use it? What data rights must be present to allow patrons control over their interactions with the product or service? Libraries – how can patrons have a voice in the design process, including those most impacted by the risk of privacy harm? Vendors – how can customers have a voice in the design process? All – how will you ensure that the process is not just a "mark the checkbox" exercise but an intentional act to include and honor those voices in the design process?
There’s a lot to think about in the questions above, but they illustrate the importance of grappling with these issues while still in the design process. It’s hard to build privacy into a product or service once it is already out there collecting and sharing high-risk data. Addressing the hard ethical and privacy questions during design not only avoids the pitfalls of technical debt and high-risk practices, but also provides a valuable opportunity to build stronger relationships between libraries, patrons, and vendors.