Trust is the ultimate currency

This was an invited talk, given today at the Talent Arena at Mobile World Congress.

I’m going to make an argument that humans are not built to live and work the way we do today. We have not evolved far enough to comfortably exist in our modern context. There is a whole lot of information coming toward us every day and not a lot of direct, deep social engagement – and even less physically proximate social engagement.

About twenty-five years ago, Robert Putnam wrote a book called Bowling Alone. The book focused on the US and the lack of social connectedness that continues to manifest in interesting ways. His title speaks to one example: bowling leagues. More broadly, he describes the missing element of third places. A third place is a park, a public social area, or any convening group or location. Broadly speaking, he is addressing non-work social groups that bring people together in order to move towards some sort of good. Other examples were neighborhood socials, schools and churches, affinity groups, and Rotary International. This sort of non-work, non-family social construct has been disappearing as we have been doing more and more things online – getting our services, our recommendations, and our engagement with the world at scale and online.

Where I believe this breakdown is most visible is in the concept of trust.  Greater trust between people could actually help us improve not only our happiness and engagement in the world, but also allow us to build better, healthier businesses.

Because I’m a nerd, I always start with definitions. So for today’s talk, I’m gonna narrow trust down to a belief in the reliability, truth, or ability of someone or something. That covers it both as a noun and as a verb.

The very simplest case of building trust is two people. Usually, we build trust between people. In this case, Alice trusts Bob, and Bob trusts Alice. 

Now there’s also a concept of transitive trust, where Alice trusts Bob, Bob trusts Alice, and Bob and Carol trust each other as well. When Bob introduces Alice to Carol, he offers this transitive trust. You’ve done this anytime you’ve made an introduction for someone. Hey, I love you both. I think you should work together.
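To make the mechanics concrete, here’s a minimal sketch of transitive trust as a directed graph in Python. The set-of-names representation and the one-hop introduction rule are illustrative assumptions on my part, not a formal trust model:

```python
# A tiny sketch of transitive trust as a graph of who-trusts-whom.
# The introduction rule below is an illustrative assumption.
trusts = {
    "Alice": {"Bob"},
    "Bob": {"Alice", "Carol"},
    "Carol": {"Bob"},
}

def introduce(graph, introducer, a, b):
    """If the introducer trusts both parties, extend trust one hop."""
    if a in graph.get(introducer, set()) and b in graph.get(introducer, set()):
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

# Bob trusts both Alice and Carol, so his introduction links them.
introduce(trusts, "Bob", "Alice", "Carol")
print("Carol" in trusts["Alice"])  # True: Alice now trusts Carol via Bob
```

The point of the sketch is that the new edge exists only because of Bob’s two existing edges – which is exactly why sharing something untrustworthy spends the introducer’s credibility.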

So this is super important in building these mutually reliable and engaged groups, communities, pairings, and partnerships. Again: mutual reliability, truth, and ability. Now how do you actually foster this intentionally? There are several techniques I use. One of them is fairly straightforward: be vulnerable and authentic. I am myself and no one else here on this stage. I’m gonna stumble over my words, I will make nerdy jokes, and I’m gonna tell you some stories about me and my life.

One of the ways that I see trust being undermined most recently is by sharing misinformation.

People are frequently engaging with things that are not necessarily trustworthy and sharing them onward, offering that transitive trust we talked about between Alice, Bob, and Carol. Now in this image, Bob is offering information that he saw and that he trusts. As he offers this piece of information, he has unintentionally offered that transitive trust. Now here’s a funny story. My partner and I trade links and articles back and forth all day long. He is one of my highly trusted views into the world, and he sees and notices things I don’t because his perception of the world is very different from mine.

There was a moment several months ago where he sent me a link to something, and because I trust him, I trusted that link and I trusted that piece of information – without thinking critically about it. About fifteen minutes later, he texted me again saying, “I’m so sorry. That was bad information. Here is the right information.” By that point, I had already shared this really pertinent piece of information with other people I knew. It caused my limbic system to react, and that shortcut to belief is how we see so much misinformation being shared today. This hijack of our trust by information and influencers online, and by other entities that want to get you to share their message, is incredibly important and is starting to break down the way we trust in the world today.

Now we are rewarded online for some of these behaviors. There’s a sense of people in the know and out of the know. The hijacking of trust, and the use of social media to share information in this high-volatility, high-value way by someone who maybe has the first scoop, is one of those spaces where we’re starting to undermine trust.

I framed basic trust as between two people, or a person and a group of people. However, there are nonhuman things that we trust. We trust our senses to give us information, even though our senses can sometimes give us bad info. We trust organizations. We trust a hierarchy in our businesses. We trust cars to drive us from place to place. We trust technology to make sure our bank accounts are correct. So there is trust in things that are nonhuman, but they’re just tools, and that distinction makes the trust a little bit different.

We’ve been trusting, and talking about trust, a lot in the last twenty years in the tech industry, because the cloud brought us something that was super different as far as how we trust in the world. My friend Mike Bursell wrote this amazing book, Trust in Computer Systems and the Cloud, and it really clarified for me how much trust we put into a single bit of software – and into the belief that the software will behave as we expect.

So this is a great example using the same alphabetical notation that I used earlier for Alice, Bob, and Carol. But now we have “Company A deploying software from Provider B using libraries from open source Community C and proprietary software Provider D.” My favorite line in this entire book is “we realize that we are already halfway through the alphabet and have yet to consider any of the humans in the mix.”
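One hedged way to picture that chain: treat each link as a confidence score and compose them. The scores and the multiplicative rule below are purely illustrative assumptions of mine, not anything from the book – the takeaway is just that every added party dilutes the trust Company A can place in the whole stack.

```python
# Illustrative only: model Company A's trust in each supply-chain link
# as a score in [0, 1] and compose the scores multiplicatively.
chain = {
    "Provider B": 0.95,                # trust in B's deployment pipeline
    "Open source Community C": 0.90,   # trust in C's libraries
    "Proprietary Provider D": 0.85,    # trust in D's software
}

overall = 1.0
for link, confidence in chain.items():
    overall *= confidence  # each extra party can only reduce composed trust

print(f"Company A's composed trust in the stack: {overall:.2f}")  # 0.73
```

Even with every individual link rated highly, the composed number drops quickly – and, as the book’s quip notes, no humans are in this calculation yet.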

Now I did promise that this talk would have more AI in it, so I’m gonna start with the concept of generative AI here. Generative AI is fascinating and fun and exciting to play with, and it is a statistical prediction model. I’ve quipped “T9 v4.0”.
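In the spirit of that quip, here’s a toy next-word frequency model in Python. The training sentence and the bigram-counting approach are illustrative assumptions; real LLMs are vastly more sophisticated, but the “predict a statistically likely next token” core is the same:

```python
from collections import Counter, defaultdict

# Toy "T9-style" predictor: count which word follows which,
# then suggest the most frequent successor. Training text is made up.
text = "i trust you i trust them i believe you".split()

successors = defaultdict(Counter)
for current, nxt in zip(text, text[1:]):
    successors[current][nxt] += 1

def predict(word):
    """Return the statistically most likely next word, or None."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("i"))  # "trust" - it follows "i" more often than "believe"
```

The model has no idea what “trust” means; it only knows what tends to come next. That gap between fluent prediction and understanding is why the trust we extend to these tools has to be contextualized.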

Now AI broadly does lots of really interesting things. And we can learn to trust these tools over time. But part of what we really need to focus on is how that trust is contextualized.

So do I want to attend an AI-generated talk about trust? No. In fact, if you look back up at my abstract, the abstract was written by an LLM. I figured I’d play with it and see what happened, putting something up on the conference page to make sure that I had a bit of content in the program catalog as needed. But it’s really bland and it’s really boring, and it doesn’t sound like me or share any of my authenticity.

So we have this whole new set of tools that we get to use to build things. Those tools may let us work faster, and, if used improperly, they may help us break down trust. There are lots of other things they’ll help us do as well, but those are the two that I’m thinking about right now. So how do we trust tools? Here is an example with a crowbar. It’s an example of a simple machine, a lever.

Levers are incredibly useful for all sorts of things. But I guess my question would be, “would I trust this tool to wash my window?” 

No. I really wouldn’t, because that’s not the context in which it should be used. So I need to always think in terms of the context of the tool and how we’re using it. For generative AI, there is a context and a setting where it’s incredibly useful and incredibly powerful. And then there’s a context where it’s exactly the wrong tool.

In those wrong cases, you end up with content, or product, or interactions with other people on your teams that destroy trust. It is the same with all tools: there are ways to build a new world with them, to build a faster world with them. There is also the possibility for tech to become something that Dan Davies calls an accountability sink. There was a great comedy clip from the UK in the nineties –

someone was trying to answer questions of a customer and the only answer was “the computer says no.” “The computer says no.”

There was no ability for that person to make changes. The computer said no. That was the end of the conversation. And there was no accountability for this “no” because it wasn’t the clerk’s fault. The clerk was very clearly just using the tool in front of them to answer the question. 

That is an accountability sink. It separates the person who is answering the question from the accountability of bringing back a good answer. Anyone who has tried to change a flight, or get an answer as to why a flight was canceled, or do something that is just a little bit outside the statistical norm, has found that the tools sometimes just stop and won’t let you move forward. Your tools say that doesn’t work, that can’t happen, and there’s no accountability for this problem not being solved. So we have to be really careful to avoid these broken promises, these broken pieces of trust that are technological accountability sinks.

This is the big question at this point. How do we resolve this? And, honestly, as much as I think about this, I think about ways to build community and strengthen the teams that I am working in. The only suitable tool I can find is the human investment of time, because “at scale” is one of the scariest phrases I know. We’ve tried to supplant trusting humans with trust in brands or trust in tools. Now, again, this can work, but there has to be a feedback mechanism to make sure we hear from the people who are having issues, and about the issues that are happening outside of the average experience.

The problems that are happening outside of that average, and the humans who look different from the average the software was built for – they need to be served too. In order to build that mutual reliability, truth, and ability that we talked about before, we have to have feedback mechanisms. We have to have ways to bring feedback into all of these new tools. So that investment of human time is important both for building the trust that you have between humans and for building the feedback loop for when these tools around us break or give bad information. We need to fall back to the old model of human trust, built by the human investment of time.

So here are some suggestions for human investment of time. First, let’s put down any technology that is not facilitating the conversation at the moment. Spend time connecting deeply with people. That could be long written letters, as we saw in decades and centuries past; that could be phone calls and conversations; that could be video calls. It may even be interactions on social media, but check them to make sure that what you’re having is a conversation and what you’re building is mutual trust, as opposed to a parasocial relationship where you feel like you are being heard and understood by someone who does not know you. Parasocial relationships are a catalyst for people becoming disenfranchised and stepping away from their interconnectedness in their communities.

Another effort that may help you is to really slow down. I know it feels crazy to say slow down in a world that’s moving so quickly, where it can feel as if the AI-powered revolution is getting away from you again and again and again. Maybe it’s timeboxing: I’m going to not work at speed, or I’m going to do something very slow and methodical and good for me, two hours a day, one hour a day, several hours a week. I’m focused on the kids; I’m going to spend time with them in person. Whatever that looks like, slow down and enjoy the humans for who they are, in order to be able to go faster when you need to. Nurture yourself and your relationships.

Another suggestion is to make sure that you are spending time in nontraditional or dissimilar spaces. If you are trying to grow trust with your team and you always and only see them at work, try to find some social space to connect with them, or take a walking one-on-one, or meet people out in the world. Engage with them as whole humans as opposed to simply heads in a video conference.

And then my last direct suggestion is one that has pulled humans together for millennia: eating together. Now this is not always possible in our very connected world, where I might be here in Barcelona talking to someone in Seattle, or taking early and late-night meetings, but you can do this. Let’s invite each other to share time with tea: you bring your tea and you tell each other about your tea. And you have a connection and a little bit more of a human space, made by handling the necessity of eating and drinking.

Today, people are connected in all sorts of ways. As I said, we’re sometimes connected very virtually from other ends of the globe. Sometimes we’re connected very locally. Sometimes you are in chosen groups. Sometimes you are in corporate groups, or affinity groups. You may exist in any number of these relationships. Maybe you’re an end node in a graph that isn’t very connected. Or maybe you’re someone deeply connected to two distinct graphs. All of these are great, and any work you put into this will make your connectedness better and deeper, and ultimately will allow you to engage more authentically with these people, at a higher level of trust and with higher expectations.

Building and nurturing trusting teams really brings us stronger, more interconnected teams – resilient teams that do better with hardship in front of them, or hardship around them, and still produce greater business impact.

Studies also find that connected and trusting humans are generally happier humans with greater business impact. There’s a great study from Harvard that’s been following people for eighty years, and it really does show that people are happier with a sense of place and well-being in a community, whether that community is your corporate team, an open source project, a social group you’re part of, or an intramural sport you play. These connections are what build happier humans and bigger business impact.

References:

Bowling Alone by Robert D. Putnam

The Unaccountability Machine by Dan Davies

Trust in Computer Systems and the Cloud by Mike Bursell

Harvard Happiness Study
