If you’d like trust to develop in your office, group or team — and who wouldn’t? — the key is sharing your weaknesses, says business writer Daniel Coyle.
At some level, we intuitively know that vulnerability tends to spark cooperation and trust. But we may not realize how well this process works, particularly when it comes to group interactions.
Jeff Polzer, a professor of organizational behavior at Harvard, has spent much of his career examining these seemingly insignificant social exchanges. “People tend to think of vulnerability in a touchy-feely way, but that’s not what’s happening,” Polzer says. “It’s about sending a really clear signal that you have weaknesses, that you could use help. And if that behavior becomes a model for others, then you can set the insecurities aside and get to work, start to trust each other and help each other. If you never have that vulnerable moment, on the other hand, then people will try to cover up their weaknesses, and every little microtask becomes a place where insecurities manifest themselves.”
Vulnerability is less about the sender than the receiver. “The second person is the key,” Polzer says. “Do they pick it up and reveal their own weaknesses, or do they cover up and pretend they don’t have any? It makes a huge difference in the outcome.”
Polzer has become skilled at spotting the moment when the signal travels through the group. “You can actually see the people relax and connect and start to trust. The group picks up the idea and says, ‘Okay, this is the mode we’re going to be in,’ and it starts behaving along those lines, according to the norm that it’s okay to admit weakness and help each other.”
This interaction can be called a vulnerability loop. A shared exchange of openness, it’s the most basic building block of cooperation and trust. Vulnerability loops seem swift and spontaneous, but they all follow the same steps:
1. Person A sends a signal of vulnerability.
2. Person B detects this signal.
3. Person B responds by signaling their own vulnerability.
4. Person A detects this signal.
5. A norm is established; closeness and trust increase.
Each signal takes only a few seconds to deliver. But they’re vital, shifting the dynamic and allowing two people who have been separate to function as one. It’s useful to zoom in on this shift. Scientists have designed an experiment to do exactly that, called the Give-Some Game. You and another person, whom you’ve never met, each get four tokens. Each token is worth a dollar if you keep it but two dollars if you give it to the other person. The game consists of one decision: How many tokens do you give the other person?
This is not a simple decision. If you give everything, you might end up with nothing. On average, people give 2.5 tokens to a stranger — a slight bias toward cooperation. But what’s interesting is how people tend to behave when their vulnerability levels are increased a few notches.
In one experiment, subjects were asked to deliver a short presentation to a roomful of people who were instructed by experimenters to remain silent. They played the Give-Some Game afterward. You might imagine that the subjects who endured this experience would respond by becoming less cooperative, but the opposite turned out to be true: the speakers’ cooperation levels increased by 50 percent.
That moment of vulnerability did not reduce their willingness to cooperate but boosted it. The inverse was also true: Increasing people’s sense of power — tweaking a situation to make them feel more invulnerable — dramatically diminished their willingness to cooperate.
We think about trust and vulnerability the way we think about standing on solid ground and leaping into the unknown. First we build trust, then we leap. But science is showing we’ve got it backward. Vulnerability doesn’t come after trust — it precedes it. Leaping into the unknown, when done alongside others, causes the solid ground of trust to materialize beneath our feet.
Question: How would you go about finding ten large red balloons deployed at secret locations throughout the United States?
That question was dreamed up ten years ago by the Defense Advanced Research Projects Agency (DARPA), a division of the US Department of Defense tasked with helping America’s military prepare for future technological problems. The Red Balloon Challenge, which DARPA announced in October 2009, was designed to mimic real-life situations like terrorism and disease control, and offered a $40,000 prize to the first group to locate all ten balloons. The immensity of the task — ten balloons in 3.1 million square miles — led some to wonder if DARPA had gone too far. A senior intelligence analyst declared it “impossible.”
Groups quickly signed up. They were a cross-section of America’s brightest minds: hackers, social media entrepreneurs, tech companies and research universities. The vast majority took a logical approach to the problem. They built tools to attack it. They constructed search engines to analyze satellite imagery, tapped into existing social and business networks, launched publicity campaigns, built open-source intelligence software, and nurtured communities of searchers on social media.
A team from MIT Media Lab found out about the challenge just four days before launch. A group of students, led by postdoctoral fellow Riley Crane, realized they had no time to do anything that resembled an organized approach. Instead, they took a different tack. They built a website that consisted of this invitation:
When you sign up to join the MIT Red Balloon Challenge Team, you’ll be provided with a personalized invitation link [with your name in it].
Have all your friends sign up using your personalized invitation. If anyone you invite, or anyone they invite, or anyone they invite (. . . and so on) wins money, so will you!
We’re giving $2000 per balloon to the first person to send us the correct coordinates, but that’s not all — we’re also giving $1000 to the person who invited them. Then we’re giving $500 to whoever invited the inviter, and $250 to whoever invited them, and so on . . . (see how it works).
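The halving reward chain in the invitation has an elegant property: however long an invitation chain grows, the total paid out per balloon stays bounded. A short sketch (the function and its name are mine, for illustration) makes the geometric series visible:

```python
# A sketch of MIT's halving reward chain: $2,000 to the balloon's
# finder, $1,000 to their inviter, $500 to the next link, and so on.

def chain_payouts(depth: int, base: float = 2000.0) -> list[float]:
    """Rewards paid along an invitation chain of the given depth."""
    return [base / (2 ** i) for i in range(depth)]

# The payouts form a geometric series: 2000 + 1000 + 500 + ...
# Its sum never reaches 2 * base = $4,000, so MIT could honor every
# link of every chain and still fit ten balloons inside the $40,000 prize.
print(sum(chain_payouts(10)))  # 3996.09375, and always < 4000
```

This is why the scheme could afford to reward inviters at all: the cost of generosity was capped by the arithmetic itself.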
The MIT team’s approach was laughably primitive. There was no organizational structure, strategy or software. It was closer to a hastily scrawled plea shoved into a bottle and lobbed into the ocean of the Internet: “If you find this, please help!”
Two days before launch, MIT switched on the website. For a few hours, nothing happened. Then, at 3:42 p.m., people began to join. Connections first bloomed out of Boston, then radiated to Chicago, Los Angeles, San Francisco, Minneapolis, Denver, Texas, and far beyond, including Europe. Viewed in time lapse, the spread of connections resembled the spontaneous assembly of a gigantic nervous system, with hundreds of new people joining the effort with each hour.
At precisely 10:00 a.m. Eastern on December 5, DARPA launched the balloons in ten secret locations. Thousands of teams swung into action, and the organizers settled in to wait: they estimated it would take up to a week for a team to accurately locate all ten balloons.
Eight hours, fifty-two minutes, and forty-one seconds later, it was over.
The MIT team found all ten balloons and did so with the help of 4,665 people — or as DARPA organizer Peter Lee put it, “a huge amount of participation from shockingly little money.”
The other teams had used a logical, incentive-based message: Join this project, and you might win money. This sounds motivating, but it doesn’t really encourage cooperation — in fact, it does the opposite. If you tell others about the search, you are reducing your chances of winning prize money. These teams were asking for participants’ vulnerability, while remaining invulnerable themselves.
The MIT team, on the other hand, signaled its vulnerability by promising that everyone connected to finding a balloon would share in the reward. Then it provided people with the opportunity to create networks of vulnerability by reaching out to friends, then asking them to reach out to friends. The team did not dictate what participants should do or how they should do it; it gave out the link and let people do what they pleased. And what they pleased was to connect with lots of other people. Each invitation created another vulnerability loop that drove cooperation — Hey, I’m doing this crazy project and I need your help.
So what made the difference wasn’t how many people a person reached or how good their balloon-search technology was — it was how effectively people created relationships of mutual risk. The Red Balloon Challenge wasn’t really a technology contest; it was a vulnerability-sharing contest.
Most of us see vulnerability as a condition to be hidden. But when it comes to creating cooperation, vulnerability is not a risk but a psychological requirement. Polzer says, “Being vulnerable gets the static out of the way and lets us do the job together, without worrying or hesitating. It lets us work as one unit.”
I began to see vulnerability loops in other places I visited. Sometimes they were small, quick exchanges. A pro baseball coach began a season-opening speech to his players by saying, “I was so nervous about talking to you today,” and the players responded by smiling sympathetically — they were nervous too. Sometimes these loops took physical form, like the Failure Wall at Dun & Bradstreet Credibility Corporation, where people could share moments where they’d fallen short.
Sometimes they were habits of seemingly invulnerable leaders, such as Steve Jobs’s penchant for beginning conversations with “Here’s a dopey idea.” (“And sometimes they were,” recalls Jonathan Ive, Apple’s senior vice president of design, in his memorial to Jobs. “Really dopey. Sometimes they were truly dreadful.”)
Each loop was different yet shared a deeper pattern — an acknowledgment of limits, a keen awareness of the group nature of the endeavor. The signal being sent was the same: You have a role here. I need you.
Cooperation does not simply descend out of the blue. It is a group muscle that is built according to a specific pattern of repeated interaction, and that pattern is always the same: a circle of people engaged in the risky, occasionally painful, ultimately rewarding process of being vulnerable together.
Excerpted with permission from the new book The Culture Code: The Secrets of Highly Successful Groups by Daniel Coyle. Published by Bantam Books, an imprint and division of Penguin Random House LLC, New York. Copyright © 2018 by Daniel Coyle. All rights reserved.