I found myself an unwilling participant, yesterday, in a study into the nature of perception. Not that there really was a "study" in any formal sense of the word. No. It was more an observation following on from several real studies that took place around the world when nations were debating the whole WMD thing.
Back then, the research discovered something disturbing, especially to people like me who claim the power of objectivity. It found that when assessing evidence, our brains subconsciously assign "importance points": the closer the evidence is to what we already believe, or the closer its presentation is to what we would expect of accurate information, the more points it gets. Conversely, the brain assigns fewer points to information that contradicts what we believe, or that is presented in a format we would not expect of accurate data. In other words, when you come across information supporting your belief, or conforming to your expectation of accurate presentation, you are more likely to see it and pay attention to it, no matter how unlikely it is; and when you come across information that contradicts what you believe, or is presented in a manner inconsistent with your expectation of accuracy, you are more likely to miss it or ignore it, no matter how likely it is.
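To make that scoring idea concrete, here's a toy sketch in Python. The weights are purely invented for illustration (nothing in the research suggests the brain uses numbers like these); the sketch only captures the shape of the bias: agreement and attractive presentation inflate a piece of evidence's perceived weight, while disagreement and shabby presentation shrink it.

```python
# Toy model of the "importance points" idea described above.
# The multipliers are made up for illustration only; they are not
# taken from any of the studies mentioned in this post.

def perceived_weight(agrees_with_belief: bool, looks_credible: bool) -> float:
    """Return the subjective weight a reader assigns to one piece of evidence."""
    weight = 1.0
    weight *= 2.0 if agrees_with_belief else 0.5   # confirmation bias
    weight *= 1.5 if looks_credible else 0.7       # presentation bias
    return weight

# A weak but agreeable, nicely presented claim ends up outweighing
# a strong, contradictory, badly presented one:
print(perceived_weight(True, True))    # 3.0
print(perceived_weight(False, False))  # 0.35
```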
Here's a simplistic example. You're handed information scribbled on a scrunched-up piece of paper: the spelling is poor, the grammar worse, and the handwriting child-like. You're handed other information beautifully printed and bound, with an official-looking insignia on the cover. If you have no preconception, you're more likely to believe the bound report. You are only more likely to believe the scribble if it supports a fundamental belief of yours, or if the bound report contradicts one.
Let me give you another simplistic, though marginally more complex, example. Let's say you believe the earth is flat, and you're standing on a high cliff looking out over the ocean. Your brain will not see the curve of the earth at the horizon, even though it is there and is objectively measurable. The person standing next to you believes the earth is round, and says "wow, look at that curve". Your most likely reaction will be "what curve?", and you still won't see it, even though it's been pointed out to you AND it's there.
I was doing some research yesterday, trawling through hundreds of research papers looking for links between whey protein and a beneficial effect in preventing or fighting cancer, especially breast cancer.
Hundreds of papers.
And suddenly, I noticed that I was subconsciously dismissing or glossing over papers that did not support the link, or being overly critical of such studies, while being far more accepting of studies that confirmed the link, even though some of those studies were poorly constructed.
I realised that this sort of filtering happens every day. Our brains look for information that validates our pre-existing understanding of the world. We're far more likely to notice things that fit into the little model of the world that sits inside our heads, and equally likely to ignore things that do not fit into that model.
Even more important is that our brains give more importance or credence to information that comes to us in a form that is either more expected, or more attractive, than other information.
Think about that, and what it means to the way companies sell stuff to you. Many years ago, when marketing was more a dark art and less an evil science, a gin company had a leading brand. They were losing market share, so they decided to find out why.
When they did a blind taste test, their brand ALWAYS scored higher for taste than the competitor's brand; that is, more people thought their gin tasted better. They then repeated the experiment with different subjects, but this time the subjects could see the bottles from which the gin was poured. Now the competitor's brand ALWAYS scored higher. Finally, they repeated the experiment again, but this time they switched the contents of the bottles: they put their gin in the competitor's bottle, and vice versa. The gin poured from the competitor's bottle (now actually theirs) again scored higher.
The test subjects' perceptions of what they were about to taste were influenced more by the way they saw the product than by the actual taste. The company changed its bottle and regained market dominance.
If you doubt whether this applies to just about everything we buy, take a look at the cosmetics industry. Here, form triumphs over substance, because consumers' brains are far more likely to believe that something in a beautiful package will make them beautiful than something in a plain, dull package.
I read a book once called "Suppressed Inventions and Other Discoveries". It had stories about a whole raft of "new technologies" that its author suggested had been suppressed, either by government or by competing commercial interests. I have no doubt that some of the stories in that book were more likely true than not, but one less paranoid explanation may be that those evaluating the new technology simply couldn't see the evidence that it worked.
We see it in medicine every day. As recently as last year, an Australian doctor was being investigated because his patients were claiming that he had cured their incurable cancers. He was using microwave radiation therapy, setting the wavelengths of the microwaves to specifically target individual types of cancer.
There were hundreds of patient files, almost all of which showed near-miraculous recovery from cancer previously diagnosed as terminal by another, more recognised oncologist. Almost all of those patients were still alive, years after they were expected to have died, and almost all of them were cancer-free.
The authorities charged with evaluating this radical new treatment ignored the patient files and instead focused on the mechanism. Their conclusion was that, as there was no known mechanism by which the treatment could be killing the tumours, it had no validity and did not warrant further investigation. The investigative team did not talk to a single patient, and did not refer to any patient files in its final report.
For some, this was clear evidence of conspiracy by the evil drug companies. I think the explanation is a great deal simpler. The investigators just didn't believe the treatment could work (probably because they had been indoctrinated by the aforementioned evil drug companies), so they only gave credence to evidence that supported that position.
What I'm trying to say here is that the same mental processes are at work. We see stuff that we want to believe. That belief may be that a particular fact is true, or it may be that something more beautiful works better than something less beautiful.
Sadly, though, there's no solution. We're overwhelmed by input, and all of it is dealt with subjectively in our brains. If you start second-guessing your motivation for assigning validity to one piece of information, or one product over another, I suspect you'll go nuts. But maybe knowing why people don't see what you see might help keep you out of an argument or two.
3 comments:
This is ESSENTIAL reading, my dear Chester. You have absolutely hit the nail on the head.
Your words resonated with me over & over, for you know what an evidentiary little duck I am. The need for proof to back up a particular belief, and so on.
This is a beautiful piece of writing, very eloquent and most riveting :)
Re the gin ref. You should visit DMM's latest post, simply called "Interesting". It is along similar lines to your gin thing and is also compulsive reading. How funny you should both post something on a similar subject on the same day even tho you have never met! If you want the link to her blog, just go to mine. She is on my blogroll :)
I am going to save this post of yours. It's terrific.
xxx
Also, have you read The Tipping Point?
There is FASCINATING reading on how people interpret information, how children process Sesame Street, why some people are better at spreading messages than others and so on. Fantastic stuff.
Spooky. I just read the Ross Gittins piece CAW/DMM referred to from the Sydney Morning Herald this morning (see below), and it talks about the same stuff. Weird. He even talks about the gin, only in his writing it was brandy.
I'd be inclined to believe it was brandy, probably because I would assign more points to something a well-respected journalist wrote than to something my fuzzy, failing brain remembers.
I think the philosophers call that synchronicity.
To read Ross's article, go to http://www.smh.com.au/news/ross-gittins/what-we-see-is-what-we-think-we-get/2006/11/14/1163266547675.html?page=fullpage#contentSwap1