The University of Prius offered up another gem as I was driving to work this week – this time from ABC's great All in the Mind podcast. For copyright reasons, the 3rd July 2010 episode was a re-run of a 2007 piece with Philip Zimbardo, titled When Good People Turn Bad. Zimbardo was the Psychology Professor at Stanford University behind the infamous Stanford Prison Experiment – in which students played the roles of guards and prisoners for a few days, with controversial and disturbing results for all concerned. The podcast is good – his reflection on the flaws in the experiment – and his own flaws – makes for interesting listening.
Philip Zimbardo: Yes, it's a very common phenomenon that's been widely studied by psychologists, I know, in Australia as well as in the United States. You know, when your behaviour does not mesh with your values and attitudes, typically what happens is your attitudes and values change to fit the behaviour rather than the other way around. Talmudic scholars would say get people to pray before you try to get them to believe; once they start praying they'll come to believe what they are doing. So a lot of evil done by good people is really more from the evil of inaction.
Some of his conclusions are, to put it mildly, disputed – not least by a more rigorous, if televised, version run in 2002: The Experiment.
But what did strike me was the degree to which people adapt to fit the community they're working in. It's something I've seen in organisations I've worked in – my own values being subsumed in favour of other behaviours. Not necessarily because the people around me believed in those behaviours, but because they had become accustomed to them – and were now passing them on. And the rituals we follow will shape the principles we hold – not necessarily the other way around.
(I'm not for a second invoking the insidious "following orders" argument – this is rather a recognition that values and behaviours can be subtly influenced and change over time, contrary to an individual's expressed values.)
Where this has particular resonance is when a department or organisation is malfunctioning. We assume that negative actions are deliberate, that individuals are at fault and that the overall organisation will spot these problems and resolve them. And the truth is that none of these necessarily hold true.
When it's an individual, it's easier to see – through surveys and other means – when there's someone that's out of sync. But seeing a group of people on a slow shift towards negative behaviours is far more difficult from without. It's hard to anticipate, hard to detect.
It's often not the result of conscious decisions, but of a steady, slow one-upmanship that started heading in the wrong direction and that the competitive dynamic then took too far, too fast. So hunting for the person who made the decision is a poor approach – it results only in scapegoating.
And often it arises from the environment – cultural, economic, political and organisational. Processes and targets play a part too – if it's easier to meet a target and fulfil a process than to do the right thing, few people will do the right thing. And the unintended consequences of targets and processes are regularly the things that trip us up.
The approach that we've been looking at comes back to narrative – which situations are "allowable" and which tip over into "inappropriate". It ties in with David Maister's idea that culture is set as much by what we tolerate as by what we aspire to. It's entirely possible to gather stories from people about what's happening, but more useful to present them with examples of behaviour and ask them where those examples sit.
An "ethical audit" of this kind needs to be done with care – the example stories must not have clear answers (i.e. it shouldn't be obvious what the answer is – "should you lie to a customer to win a contract?" is a bit too blunt!). But it allows for department-wide or even organisation-wide assessment of where people are, how ethical they are in their dealings and therefore what might be done.