A “yes man” culture that is averse to dissent can not only be stifling for employees, but in some cases, can be downright dangerous. So how do you create a culture where everyone feels empowered to bring their ideas to the table?

On today’s episode of The Culture Kit with Jenny & Sameer, Haas School of Business professors and organizational culture experts Jenny Chatman and Sameer Srivastava answer a question from Shuchi Mathur, the Vice President of Customer Experience at Reelgood. Jenny and Sameer share examples of companies they’ve worked with like Pixar and Netflix that have built cultures around celebrating failure and farming for dissent.

Do you have a vexing question about work that you want Jenny and Sameer to answer? Submit your “Fixit Ticket!”

You can learn more about the podcast and the Berkeley Center for Workplace Culture and Innovation at www.haas.org/culture-kit.

*The Culture Kit with Jenny & Sameer is a production of Haas School of Business and is produced by University FM.*

Jenny & Sameer’s 3 Main Takeaways:

  1. Be intentional – recognize that you need to go out of your way to prioritize dissent; otherwise you might inadvertently stifle it.
  2. Build systems – some organizations even establish processes to encourage people to take deliberate action to surface dissent. This is mission-critical in an organization where life and safety are on the line.
  3. Model what you want to see – leaders need to actively model a willingness to admit when they’re wrong and own up to mistakes. At the same time, they can seek out and defer to expertise, rather than acting like they always have the answers.


Transcript

How To Avoid Creating a ‘Yes Man’ Culture

[00:00:04] Sameer: From Berkeley Haas and the Berkeley Center for Workplace Culture and Innovation, this is The Culture Kit with Jenny and Sameer.

[00:00:09] Jenny: I’m Jenny Chatman.

[00:00:10] Sameer: And I’m Sameer Srivastava.

[00:00:13] Jenny: We’re professors at the Haas School of Business. On this podcast, we’ll answer your questions about workplace culture.

[00:00:20] Sameer: We’ll give you practical advice that you can put to work right away.

[00:00:24] Jenny: Join us to start building your culture toolkit.

[00:00:28] Sameer: Hey, Jenny!

[00:00:28] Jenny: Hey, Sameer. Well, we have another great question this week. Let’s hear it.

[00:00:33] Sameer: Sounds good.

[00:00:34] Shuchi: Hi, my name is Shuchi Mathur, and I’m the VP of Customer Experience at Reelgood. My question is, how can we build a culture where people are not afraid to bring their perspectives to the table and be comfortable with the difference in opinion to their management and not become a “yes” man? The true value of having a diverse team is when everyone feels empowered to put their point across or raise problems without losing sight of the company’s vision. Thank you.

[00:01:03] Jenny: Yeah, this is a really critical question. And to answer it, we should start by talking about the perils of creating a culture where people are uncomfortable about disagreeing. This can be a huge problem, and in some cases, can even be dangerous. One example I’m thinking of is Boeing. You know, Boeing was founded on the values of safety and quality but has more recently experienced some really tragic accidents, partly because people in the company are no longer comfortable speaking up.

[00:01:34] Sameer: Yeah, that culture at Boeing is credited with helping the company revolutionize commercial aviation.

[00:01:40] Jenny: Exactly, Sameer. Maybe you remember the slogan, “If it’s not Boeing, I’m not going,” reflecting people’s trust in the planes that Boeing built. So, in earlier days, Boeing was often described as an engineer-centered culture. The company valued openness and speaking up. In fact, withholding information was considered a transgression and was sanctioned as counter-normative.

But those norms started to change after Boeing acquired McDonnell Douglas in 1997. McDonnell Douglas brought in a more aggressive, kind of, profit-driven culture. And once profit was at the top of the priority list, something had to give, and quality and safety were, unfortunately, the things that did.

Employees began to feel uncomfortable about reporting safety issues. Whistleblowers who attempted to raise alarms faced pushback. And in some cases, they were even retaliated against. And sadly, this turned absolutely catastrophic with the 737 crashes, both in 2018 and 2019. These, together, killed a staggering 346 people. Really, just a terrible tragedy.

[00:02:51] Sameer: And let’s not forget the four missing bolts on the Alaska Airlines door plug earlier this year. That could also have easily turned tragic.

[00:02:58] Jenny: Yeah, that’s right. And I believe we talked about in an earlier episode the Space Shuttle Challenger incident in 1986, where all the astronauts on board were killed. That case has some eerie similarities. It was NASA’s hierarchical structure that resulted in suppressing dissent. The Morton Thiokol engineers knew about the relationship between the ambient temperature and O-ring failure. They knew that, when it got cold, the O-rings were going to break.

In fact, one was quoted as saying, “I was just as sure as death that the failure would occur,” but no one at NASA was listening to them. And in fact, this is exactly what caused the Challenger to crash. It was a cold morning and the O-rings failed.

[00:03:43] Sameer: This also reminds me of the case of Wells Fargo and the sales misconduct that occurred there back in 2016 and thereafter. And really, what happened there is that the culture got in the way of people speaking up. They had conflicting incentives. And there was a total breakdown of what Robert Simons from HBS refers to as internal control systems — for example, the systems that provide early warning signs of problems or the incentive systems that can fuel unethical behavior.

[00:04:13] Jenny: Oh, that’s a super interesting analysis. How do you think Simons’ framework applies to Wells Fargo?

[00:04:19] Sameer: Well, one of the biggest breakdowns at Wells Fargo was in their so-called belief systems. Interestingly, two of the company’s five core values were, one, “honesty, trust, and integrity are essential for meeting the highest standards of corporate governance”; and two, “We value what’s right for customers in everything we do.” But, at the same time, there was a widely shared belief, which was reinforced by high-powered incentives, that cross-selling was the way to create shareholder value.

[00:04:47] Jenny: Right, yeah. My recollection of that situation is that the company failed to provide guidance on how people should resolve these conflicting priorities, on the one hand, of meeting sales targets, on the other hand, behaving ethically.

[00:05:00] Sameer: That’s exactly right. And all of the implicit signals, for example, who got promoted or who was celebrated, emphasized the importance of hitting your financial targets without much regard, if any, for how one did so. So, it’s not surprising that the salespeople created, astonishingly, over two million unauthorized customer accounts.

[00:05:19] Jenny: Wow. You know, it also strikes me that this is a similar situation as when organizations go through culture change. And what I often see is they keep all of the cultural priorities that they had previously, but they just make that list longer. So, not only are we going to pay attention to customers and be ethical and be profitable, we’re also going to be collaborative. We’re also going to be innovative.

And when you pile on more and more, it’s so easy for those values to become conflicting, and employees have a really hard time figuring out what to prioritize and how to, kind of, break those, those ties, like in the Wells Fargo case.

[00:06:04] Sameer: That’s exactly right. And sometimes, those long lists of values can just become meaningless. So, Jenny, what do you think, then, are some ways of getting people to speak up more?

[00:06:13] Jenny: So, one thing that organizations have done is they’ve tried to create flatter structures with less hierarchy, so people feel more accountable and freer to speak up. And I think here about Zappos, which had a, kind of, famous experiment with what they called holacracy, a decentralized approach that, kind of, does away with management and distributes decision-making through self-organized teams.

The idea behind holacracy is that it gives people on the front lines higher levels of responsibility and accountability and can result in, kind of, more informed decisions because they’re right there in front of things.

[00:06:55] Sameer: So, what happened with holacracy at Zappos?

[00:06:57] Jenny: Well, the extreme level of flat hierarchy ended up being a little bit unwieldy. So, Zappos backed off somewhat, but the company’s culture continued to be much less hierarchically organized, and people viewed it as highly collaborative and a place where it was easy to weigh in, regardless of your level.

[00:07:15] Sameer: Yeah, another company that works hard to stay decentralized is Netflix, one that we’ve talked about in a previous episode as well. Their approach was not quite as radical as Zappos’. They still try to limit the layers of the hierarchy and they push decision rights to individuals, but they complement that decentralized decision-making with a strong norm around a person’s so-called zone of discretion.

What that means is that they want people to take the perspective of the company as a whole rather than their own function or department when they’re making decisions. And although it doesn’t happen all the time, they curate and circulate narratives in which a junior person might, for example, green-light a project even when a senior person disagrees with that choice.

[00:07:59] Jenny: Wow. I mean, how did they avoid chaos?

[00:08:03] Sameer: Well, one way is that they strongly encourage people to “farm for dissent.” That is to seek out disconfirming feedback from people outside their functional area and ensure that their idea addresses those concerns. If a person’s proposed new or risky idea has farmed for dissent and it fails, that’s generally okay. But if they haven’t farmed for dissent, it’s really not.

And then they reinforce this idea with the choice of language. And I’ve already mentioned a few different terms that they use, which are a part of the culture. But another one they talk about: a decision that’s been made after the person has farmed for dissent is a so-called noble bet. And if it doesn’t work out, it’s a so-called noble failure, as opposed to just a failure.

[00:08:49] Jenny: Well, that’s interesting. It aligns with what behavioral economists call confirmation bias, the tendency to look only for evidence that our ideas are good. And that leaves our ideas much less robust. What you’re talking about is a way of looking for disconfirming evidence, which should make those ideas much more robust.

[00:09:15] Sameer: Exactly.

[00:09:15] Jenny: So, going back to Shuchi’s question, part of the reason organizations end up with a culture of “yes” people is by inadvertently stifling dissent. In general, people don’t like to be wrong. They don’t like to fail. And they worry that failures will be punished. So, cultivating criticism of your idea makes it feel more like failure. We can see why people would resist that.

So, I’m thinking about organizations that have redefined failure, which would be another way of instigating more dissent. And Pixar, the animation studio, is a great example of that. When people pitch story ideas, the likelihood that that story idea is going to be adopted is very low. But Pixar recognizes that the innovation pipeline, the front end of their innovation pipeline for story ideas, needs to be so big that everyone has to really participate.

And so, you know that, when you pitch your idea, you’re very likely to be turned down. But that’s not viewed as a failure. Failure is not that your idea wasn’t adopted as the next Toy Story franchise, failure would be that you didn’t even stand up and try to pitch a new idea.

[00:10:31] Sameer: Any other examples that come to mind?

[00:10:33] Jenny: Yeah, I’m thinking of an executive in a company that told me that they have something called the Golden Toilet Award for the worst idea. So, you get to, kind of, visibly display this golden toilet. It’s, you know, about half the size of a regular toilet, sprayed in gold paint. And you get to put it on your desk that week and proudly display it, showing that you had the worst idea of the week, right?

So, again, this is a way of, kind of, showcasing failure and redefining it not as something universally bad but as a necessary step for coming up with really great ideas.

[00:11:14] Sameer: Yeah, I’m trying to picture the golden toilet in my office, and it doesn’t look so good.

[00:11:19] Jenny: I know. I deserve to have it on my desk, frequently.

[00:11:23] Sameer: So, you know, another person who comes to mind as you’re talking, Jenny, is Microsoft CEO, Satya Nadella, who famously ushered in a new culture centered on Carol Dweck’s idea of a growth mindset. And core to that culture change was Nadella himself role-modeling what it means to have a growth mindset. There were a couple of missteps, both his and the company’s, and he was quite open about those. One was a chatbot that went awry and quickly learned to be racist in how it was communicating. They had to shut it down. He took full responsibility for that. And at one point in a conference for women engineers, he gave a truly tone-deaf response to a question about the gender wage gap.

[00:12:09] Jenny: Yeah, I think I remember that, something about raises coming through good karma rather than women needing to ask directly. And so, I guess what you’re saying is that a growth mindset is being open to critiques of your ideas.

[00:12:26] Sameer: Exactly. And in this case, Nadella came back and said he was completely wrong. He said it in a company-wide email. He said it externally as well. And he worked with their Chief People Officer, and they together decided to use this as an example of how he was himself learning. So, in both of these failures, he modeled publicly owning up to the mistakes, and that was a powerful message to send to the whole organization.

[00:12:49] Jenny: Yeah, I think what this really comes down to is that, when there’s a lack of intention, leaders can inadvertently stifle dissent since we’re all hypersensitive to criticism. So, invitations to dissent might need to be completely obvious, deliberate, and overt in organizations. I’m thinking even way back to Toyota’s Andon Cord. This was a case where any worker could pull this cord if they saw a problem on the manufacturing line. It didn’t matter what your status was in the organization. If you saw a problem, you pulled that cord. And this was an intentional, easily accessible way for employees to provide useful information about problems arising.

[00:13:35] Sameer: Yeah, what a powerful artifact to have in an organization.

[00:13:38] Jenny: Yeah, you know, I’m even thinking back to our colleague, Karlene Roberts, who used to be at the Haas School. She has since retired. She did really interesting work about how the military cultivates that mentality as well, despite its reputation for rigid hierarchy. And they do so because of the high stakes, often, lives hanging in the balance. And so, calling out problems is absolutely essential in what she called high-reliability organizations.

For example, on aircraft carriers, it’s critical that anyone who sees a problem raises it immediately. And they make a public display of commending people who raise problems.

So, Sameer, what do you think our takeaways are from today?

[00:14:25] Sameer: Well, as we’ve done before, I think we can sum them up in three points. First, be intentional. Recognize that you need to go out of your way to prioritize dissent. Otherwise, you might inadvertently stifle it. Two, build systems. Some organizations even establish processes to encourage people to take deliberate action to surface dissent. This is mission critical in an organization where life and safety are on the line. And finally, model what you want to see. Leaders need to actively model a willingness to admit when they’re wrong and own up to mistakes. At the same time, they can seek out and defer to expertise rather than acting like they always have the answers.

[00:15:04] Jenny: Great. I’m going to go question the status quo right now.

[00:15:08] Sameer: Excellent. See you next time.

[00:15:10] Jenny: Thanks, Sameer.

Thanks for listening to The Culture Kit with Jenny and Sameer. Do you have a question about work that you want us to answer? Go to haas.org/culture-kit to submit your fix-it ticket today.

[00:15:24] Sameer: The Culture Kit Podcast is a production of the Berkeley Center for Workplace Culture and Innovation at the Haas School of Business, and it’s produced by University FM. If you enjoyed the show, be sure to hit that Subscribe button, leave us a review, and share this episode online, so others who have workplace culture questions can find us, too.

[00:15:44] Jenny: I’m Jenny.

[00:15:45] Sameer: And I’m Sameer.

[00:15:46] Jenny: We’ll be back soon with more tools to help fix your work culture challenges.
