Work Culture and Constructive Uncertainty

Jonah Sachs says that companies that emphasize being ‘nice’ wind up making poorer decisions, as in his example where only one worker knows about a competitor’s recent moves, and another has new information about an emerging technology:

In a workplace that values harmony and respect (and nearly all now do), that new information, sadly, will almost certainly get buried. That’s thanks to a pernicious and powerful quirk in group psychology called shared information bias.

Here’s what happens: in nice organizations, team members become highly attuned to each other’s feelings and short-term well-being. Individuals rightly assume that their survival and advancement are based as much on how nice they can be and how good they make others feel as on the results they produce.

How does this lead to bad decisions?

Good as it feels, this emphasis on niceness leads to poor decision-making and low levels of creativity by limiting the number of inputs a group will consider and diverting focus away from risk-taking and results. It also, surprisingly, reinforces cultural biases against women and people of color. This is ironic because “nice culture” is, in many ways, an outgrowth of attempts to make companies more inclusive and welcoming, a backlash against the hard-driving, dog-eat-dog, exclusively white male workplaces of decades past. But niceness not only fails to increase equality, it can make inequality worse.

But I think Sachs mischaracterizes this. The real problem is not that the company culture emphasizes niceness: it’s that the culture does not place a premium on constructive uncertainty, the active effort to slow down decision-making (see Work Skills for the Future: Constructive Uncertainty), or on its close cousin, dissent (see Dissensus, not consensus, is the shorter but steeper path).

One way to actually counter the stupidity of groupthink is to institutionalize the exploration of alternative perspectives, the surfacing of all known information relevant to the issue at hand, and the sidestepping of the usual dumb approach of arguing for a particular course of action before hearing out and contemplating the situation. Basically, to slow down decision-making.

Shared information bias (or sharedness bias) is simply the tendency to give more weight to information known to more people, even though something known to only one person or a few, perhaps someone of lower status, may be of greater importance. Preference bias is the tendency to avoid looking deeply into alternatives we believe are less likely to matter, without real justification.

Perhaps I can restate what Sachs means this way: when companies short-circuit the full evaluation of alternatives leading up to a decision, whether because of cognitive biases or business conventions that minimize contention and dissent, they are likely to make poor decisions.

So, place less value on ‘making nice’ and focus instead on the science of sidestepping our biases.

Written by

Founder, Work Futures. Editor, GigaOm. My obsession is the ecology of work, and the anthropology of the future.
