Can GM Make it Safe for Employees to Speak Up?

In early April, following the news of faulty ignition switches and recall of more than 6 million cars, GM CEO Mary Barra announced a “Speak Up for Safety” program. “GM must embrace a culture where safety and quality come first,” Barra said at a company town hall meeting. “GM employees should raise safety concerns quickly and forcefully, and be recognized for doing so.” (Since April, the number of vehicles the company has recalled for a variety of reasons has doubled.)

In an industry that involves selling machines that transport humans at high speeds, the notion that employees wouldn’t be encouraged to discuss safety seems odd. But the company’s transition to a culture that discourages silence has been slow. As Harvard Business School professor David Garvin explains, “Culture is remarkably durable and resistant to change.” And at GM, as at many other large organizations, the challenges may be especially complex.

As Mary Barra put it to her employees, following today’s release of Anton R. Valukas’s three-month internal investigation, “The lack of action was a result of broad bureaucratic problems and the failure of individual employees in several departments to address a safety problem…Repeatedly, individuals failed to disclose critical pieces of information that could have fundamentally changed the lives of those impacted by a faulty ignition switch.”

In other words, it wasn’t a cover-up or a deliberate attempt to thwart safety; it was just the way GM did business, with “no demonstrated sense of urgency” and “nobody [raising] the problem to the highest levels of the company.” But even though Barra attempted to blame a few “individuals,” her underlying message hints at the importance of company — and industry — culture.

A fundamental problem at many American car companies is a legacy of reacting slowly, protecting executives from bad news, and focusing on cutting costs. And the sheer complexity of building cars also plays a role — David Cole, the former head of the Center for Automotive Research (and son of a former GM president), insists that this is really what’s behind the ignition switch error and why it was so hard for the company to get to the bottom of the problem.

But that’s exactly why it would be a mistake to look past organizational behavior and culture at GM: It is utterly inevitable that things will go wrong, according to Harvard Business School professor Amy Edmondson. This is “not because people screw up, but because of the immense complexity of what we do,” she told The Washington Post. “The phenomenal number of interacting parts, interacting people and continuing changes in technology mean that we will always have failures, full stop.” (Consider: there are bugs in your smartphone, too – another complex device – but they won’t kill you.)

And yet we have a canon of research and experience telling us that while mistakes always happen, it’s the environment in which they occur that really makes the difference. And it’s really hard for leaders to change that culture, even when they become aware of the problem.

The classic example of repeated institutional failure, of course, is NASA. “If you think about the Challenger space shuttle explosion [in 1986], the incident was attributed to cultural issues: an unwillingness to speak up and accept dissonant voices,” says David Garvin. “Then, 17 years later, we have the Columbia explosion. Culture is tough because it gets embedded.”

Conversely, says Garvin, there are programs like SUBSAFE for the U.S. Navy’s nuclear submarines. “At NASA, you had to prove something was broken, which is hard to do,” he explains. “At the nuclear submarine program, the working mantra is ‘prove to me that it’s right, that it’s workable.’ That’s a very different mindset.”

Each mindset requires very different types of communication — and proving that something works requires raising open-ended questions. In order to show that something is broken, however, you may feel you have to be completely confident in your facts. Garvin notes that this is where Edmondson’s work on implicit voice theories comes into play. These are “theories we have in our heads about the risks of speaking up. Things like: Don’t embarrass the boss in public. Don’t go up the chain of command. Everything must be done before you present it.”

“These theories are hard to dislodge, and you need leaders who explicitly invoke the kind of behavior they’re asking for,” Garvin says.

After the Challenger explosion, new requirements were put in place, but none of them required major changes to the organization itself. More than a decade later, under a culture that sociologist Diane Vaughan says fell “back on routine under uncertain circumstances,” a piece of foam broke off the Columbia shuttle and hit a section of the wing during liftoff. Engineers’ repeated requests for photos and data of the shuttle, to discern any problems that might occur during reentry, were dismissed. In a recent New York Times video on both disasters, one former engineer, Rodney Rocha, recalls asking why his request was rejected. The manager’s answer: “I don’t want to be a Chicken Little about this.”

At NASA, notes Rocha, “part of our engineering culture is that you work through your chain of command. I will regret always why I didn’t break down the door by myself.”

At GM, despite the company’s insistence that its culture is changing, there are a few key sticking points worth examining.

First, Maryann Keller, a former auto analyst, notes that, historically, GM hasn’t invested in root-cause analysis. While she was working on her 1989 book on GM, an engineer shared this story with her: “They were having a problem with enormous warranty claims for a window washer motor. The original response from GM was not to look for the root cause because that wasn’t part of the company’s thought process. No one asked, ‘Why are they failing?’”

Instead, she said, “their proposal was to build another factory to make window washer motors.” And the reason the motors were failing in the first place? “To save a few pennies, someone had changed the design of the motor so that there was no internal way to cool it. So it had to use the washer fluid itself to cool down.”

Keller notes that root-cause analysis has gotten better, but the lack of it illuminates the industry’s mentality that you “build a car to the specifications that have been accepted for that segment of the market, make sure your bill of materials equals a certain cost, and whatever happens after that is extraneous.” American car companies, she says, have tolerated high warranty claims based, in part, on an attitude that “if something happens along the way, so be it. It’s not my job.”

Second, Keller says that for years it was considered bad for your career if information filtered up to the highest ranks. “I had people tell me that everyone would know about a problem, but no one would speak about it,” she explains. “The goal was to insulate the senior executives and hope that nothing happens.” It’s telling, as the AP reported yesterday, that the director of vehicle safety at GM was four rungs down from the CEO prior to the recall. Both Ford’s and Chrysler’s hierarchies place their safety directors closer to the CEO, and management experts told the AP that “safety ranks higher at other companies as well, especially food, drug, and chemical makers. At some, the safety chief has direct access to the CEO.”

But although changing a corporate culture is hard, it is not impossible with the right leadership. Just take this now-famous story about Ford CEO Alan Mulally. As Fortune first reported:

“Mulally instituted color coding for reports: green for good, yellow for caution, red for problems. Managers coded their operations green at the first couple of meetings to show how well they were doing, but Mulally called them on it. ‘You guys, you know we lost a few billion dollars last year,’ he told the group. ‘Is there anything that’s not going well?’ After that the process loosened up. Americas boss Mark Fields went first. He admitted that the Ford Edge, due to arrive at dealers, had some technical problems with the rear lift gate and wasn’t ready for the start of production. ‘The whole place was deathly silent,’ says Mulally. ‘Then I clapped, and I said, “Mark, I really appreciate that clear visibility.”’ And the next week the entire set of charts were all rainbows.”

It’s a striking moment, in which a subordinate was allowed to admit failure and the boss praised him for it — something Edmondson says is crucial to developing a culture that can learn from failure and communicate more openly. In a case study related to her research at Children’s Hospital in Minneapolis, Edmondson chronicled the efforts of one executive to reshape workplace norms. As Garvin explained to me, this executive borrowed the concept of blameless reporting from aviation. “If you have a near miss, and you file it with the FAA within 10-14 days, you are exempt from punishment.” He says the hospital executive instituted something similar, and worked to distinguish blameless acts from blameworthy ones to maintain accountability while encouraging employees to speak up.

GM is seemingly not there yet, despite the company’s insistence that today’s culture is markedly different than it was before its 2008 bailout. Sure, Mary Barra told employees, “If you are aware of a potential problem affecting safety or quality and you don’t speak up, you are part of the problem. And that is not acceptable. If you see a problem that you don’t believe is being handled properly, bring it to the attention of your supervisor. If you still don’t believe it’s being handled properly, contact me directly.”

But everything we know about speaking up shows that doing what Barra has asked people to do is completely and utterly terrifying. And while the first steps are important — the Speak Up for Safety program and hiring a new safety chief — transitioning from hiring someone in that role to embedding a safety culture throughout GM is far from guaranteed. “A strong safety culture stems from psychological safety — the ability, at all levels, to speak up with any and all concerns, mistakes, failures, and questions related to even the most tentative issues,” writes Edmondson. “Simply appointing a safety chief will not create this culture unless he and the CEO model a certain kind of leadership.” The stick of firing a handful of people isn’t enough to send the message — the CEO must also use the carrot of publicly praising employees who speak up.

At the same time, many of the people I spoke with are optimistic about Barra’s ability to lead going forward. “You hope a crisis brings change,” says Maryann Keller. “But GM has had a hard time internalizing that past crises were their fault.” And because Barra isn’t from the financial side of the company that’s obsessed with counting beans, Keller hopes she’ll have better insight into how difficult it is to put cars together — how tough it is to talk about things that go wrong.

 

 

This blog first appeared on Harvard Business Review on 06/05/2014.

View our complete listing of Strategic HR blogs.
