Hello. I'm Sarah Gold.
My work, and this website, is dedicated to helping organisations and individuals design trust into data enabled services. I want to make trustworthiness in the products and services we all rely on the expectation and not the exception.
I'm a leading expert in trust, design and technology, with a well-founded reputation for my work to change the way personal data is collected, managed and used.
I've given evidence in Parliament on transparency in algorithms. I'm a Forbes 30 Under 30 awardee, I've received the Gold Award from Creative Conscience, been named a New Radical by Nesta and the Observer, and a Future Pioneer by the Design Council.
A digital leader once described me as a "rebel with the right cause."
I'm founding CEO of IF
I started Projects by IF in 2016 to develop the field of Responsible Technology through practical design. IF is in business to redesign trust in technology.
IF is not a business-as-usual agency. We guide, show and coach you on how to move beyond ethical principles and marketing statements and into practical action. This work is increasingly a commercial and moral imperative as technology mediates how we work, learn, care, love and are remembered.
We work with innovative organisations that want to lead. Our clients include: Google, Meta, Citizens Advice, Oxfam, DeepMind, Mozilla Foundation, Our Future Health, BBC, Rolls Royce, Blue Cross Blue Shield. Join us.
Whatever Sarah Gold and team put out into the world is always done with unusual and exemplar care.
— John Maeda (@johnmaeda) March 20, 2020
Where my work began
I started in the built environment
Studying architecture was my first step into systems thinking: it taught me how to design for multiple people, and how to consider the relationships between those people and the objects they interact with. Learning how systems were made was a key part of learning how to deconstruct them too. That's a skill I still use to this day.
Architecture taught me how to design for communities
Architecture is different from product design: you're designing for groups of people, perhaps even entire communities, as opposed to individuals or consumers. Even so, I became disillusioned by the economic model under which we build in the UK: designing to maximise profit for the landowner rather than focusing on quality homes for people and communities.
It helped me think critically about the world we're creating
Although I quit architecture, I still believed in the power of design to inspire and create change. So whilst helping to found and grow Open Systems Lab with Architecture 00, I began reading for a Master's in Industrial Design at Central Saint Martins in London.
I became interested in how technology changes the social contract
In my penultimate year, the Snowden revelations were published and I began to understand how emerging technologies could change the dynamics of our social contract.
The surprise at the time was about state surveillance. Fast forward to today, and the relationship between digital platforms and surveillance is much more widely understood. But people's understanding of how questions of personal data and privacy affect them is still pretty incomplete. The same goes for leadership and product teams: many of us know that we want to do things differently, but we don't always know how.
I created The Alternet
All of this thinking was the catalyst for The Alternet, a speculative project for a 'fair trade', community-stewarded internet. The Alternet was my final masters project. The idea was to show how we could manage data in a way that is profoundly different to today.
My proposition was that we use data licences: users would create custom licences for data about them by answering four simple questions. This would give users greater control over what personal data organisations could use and for what purposes. Companies, in turn, would gain a strong basis for consent, enabling them to use data to innovate in ways otherwise deemed too high risk.
Data licences, for now, are a speculative data pattern. But what they represent is important, because if society wants to enable the use of data for societal benefit our current approach to consent needs to change.
People often let companies decide how data about them is used by accepting defaults, without considering the implications. Companies are overly conservative about using data for purposes beyond increasing revenue, while pursuing revenue aggressively. All of this makes using data for societal benefit harder to achieve.
After freelancing for a while, I started IF
I started IF in 2016 as a vehicle for the change I wanted to see in the world. I wanted to work collaboratively with other designers and developers to demonstrate how design can change trust in technology.
Too often, issues of data and trust are treated as technical problems or an afterthought. But they can't be solved with just more technology or compliance, or simply automated away. Trust is a relationship that's earned over time, and designing for trust should be integral to an organisation's strategy and delivery.
Read more about my work
Here are a few places where you can read more about my work:
- Design and Digital Interfaces: Designing with Aesthetic and Ethical Awareness (2021) features an interview with me.
- Big Data, Big Design: Why Designers Should Care about Artificial Intelligence (2021) features an interview with me.
- An interview for Google PAIRS in 2020.
- I wrote a piece on privacy and architecture for Icon magazine in 2019.
- IF's work in data ethics was featured in Eye magazine in 2019.
- I was interviewed on why designers should care about GDPR for FastCoDesign in 2018.
- Critical Design in Context: History, Theory, and Practices (2017) features the Alternet project.
- The Alternet was featured in Vice in 2014.