Insights / February 8, 2023

Software with Soul

A black-and-white photo of people with their hands raised, tinted teal blue, with the words “Software with Soul” overlaid in glowing gold. Photo: Luis Quintero, Pexels

By Sharon Kennedy Vickers

Our affinity for convenience is destroying our humanity. We can no longer afford the convenience of technology without the inconvenience of moral integrity. Because, as the prophet King warned us, “we are in danger of destroying ourselves in the misuse of our own instruments.” — Rev. Dr. Bernice King, January 16, 2023

The “next big thing in tech”

A little while back, the moderator of a panel asked me a question I’ve heard many times: What’s the next big thing in tech?

That question, as it’s been posed to me over the years, is usually an invitation to name some disruptive innovation that’s just on the verge of changing the world. At the beginning of 2023, there are many next big things that fit that description — above all the revolution that is generative AI. But in this moment, when artificial intelligence is making dramatic new strides, I believe more than ever that the next big thing in tech is not a thing at all. It’s the return of the human soul to technology.


What gives technology its soul

What I call technology with soul liberates human beings to live freely and fully. Technology with soul helps people dream, build, create and care for each other, as only human beings can do. The lines of code that make up databases, internet protocols, and search engines, for example, expand the scope of human freedom and compassion.

You can recognize software like that the way you recognize soul music. Soul music is a blend of gospel, R&B, and jazz — music that creates harmony, connection, call and response. Software with soul blends different languages, frameworks and protocols to create a living relationship with the people who use it. When you hear soul music, you feel the lived experience in the lyrics, and you feel hope and joy in the listening. When you use software with soul, you interact with the lived experiences of the people who created it — and in using it, your life gets richer, more connected, more free.

The trouble is that building technology with soul is often very inconvenient — because it can only be built in ways that prioritize human wellbeing above everything else. As Bernice King reminded us this Martin Luther King Day, “We can no longer afford the convenience of technology without the inconvenience of moral integrity.” Soulless technology, which puts the convenience of some over shared wellbeing, lacks moral integrity — and it’s leading us to our destruction.


Convenience and soullessness

AI in particular can certainly make life more convenient by automating activities that only humans were capable of before. Today AI-enabled software can compose poems, paint pictures, create videos, and even write code — applications of software that can liberate human genius in truly transformative ways. But AI is also making it more convenient for state power to deprive people of their liberty by automating law and punishment; more convenient for banks to deny women access to capital; more convenient for healthcare companies to deny people of color the care we need to live. These applications of AI are convenient for people with more power, and oppressive to people with less. They are soulless.

That’s because of how they’re built. As Joy Buolamwini explains, “The underrepresentation of women and people of color in technology, and the under-sampling of these groups in the data that shapes AI, has led to the creation of technology that is optimized for a small portion of the world.” She founded the Algorithmic Justice League to make sure that software built on AI is optimized for everyone, not just those who have the most power — to make sure, in other words, that it has soul.


Tech leadership with soul

Buolamwini is one of the Black women leading the global effort to return the soul to the software we live by. Another is Ruha Benjamin, who is teaching the world how software supports and reinforces the violence of racism. Another is Timnit Gebru, who is leading the effort to reimagine how AI software can be built, trained, and used to benefit the whole human community. In these courageous and soulful women I recognize the lineage of Black feminist thought that formed me as a college student and continues to guide Software for Good today.

With guidance from these prophetic activists and intellectuals, people of goodwill all over the world are building the next big thing in technology: returning the soul to the software we live by. That’s exactly what we’re doing every day here at Software for Good. But even if you’ve never written a line of code, you can help bring soul back to your own technological life: look beyond the intent and convenience of our tools, stay curious about their impact, and ask some simple questions about the software you use.


Does the software you’re using have soul? Four questions to consider

Why was it built? Software with soul exists to empower the people who use it, but most software actually exists to enrich the people who own it. Software owned by publicly traded companies or funded by traditional venture capital has to prioritize making money for investors — and often, it does so at the expense of the people who build it and use it. Open-source software, like the code that powers Software for Good’s CommitChange platform, has a lot more room for soul, because it prioritizes its creators and users from the start. Learn about what’s motivating the software you use by finding out why it was built — and consider open-source alternatives.

Who keeps it running? Often, soulless software seems “magical” by exploiting hidden human labor. OpenAI, the for-profit company behind DALL-E and ChatGPT, is currently valued at $29 billion — and in order to make the datasets its neural nets learn from safe for consumption in the global North, it hired Kenyan workers, paid $2 an hour, to view horrifying text and images from the darkest corners of the internet and remove them from the AI’s training dataset. If a piece of software feels like magic, get curious about how that works — particularly when it comes to AI.

What effects does this software have on human lives? When I joined social networks, they connected me in a whole new way to people I love and care for. They still do, and for me they feel soulful in that way. But even as social media makes my life better, those same lines of code are injecting disinformation into our politics, leading children to take their own lives, and facilitating genocide around the world. When you interact with software, ask whose life it’s making better — and whether it’s making other lives worse.

Who’s in the code and who isn’t? Software and datasets don’t come from nowhere: they are written and generated by human beings, who bring with them their histories, biases, and ways of seeing the world. That’s why diversity and inclusion in tech are so profoundly important — and why Software for Good prioritizes building a team with as many different kinds of experience and identity as we can. Blind spots for the dominant culture become visible when a diverse set of people builds software together. Ask yourself this about the software you use: Who wrote the code for this? Who decided what datasets would be used, and who’d be represented there? And who isn’t included?