You have my word…but not my connotation

In the beginning, there was the Word.

Then the Word quickly became taken out of context. It was associated with ‘beginning’ things, the creator’s personal clique and that party where Lucifer left in a huff. The Word became a sly insult, even though there was no change in spelling or pronunciation.

Over time, the Word came to mean something completely different. Its evolution saw it take up a pile of emotional baggage and then lose it all at some foreign terminal before slipping back into common usage through a back door. Dictionaries struggle to keep up and are typically on the back foot when it comes to providing a comprehensive description of the Word’s current denotation.

As any linguist knows (or tourist with a Lonely Planet phrase book, for that matter), a strict reductionist approach to language is doomed to failure. It’s like a geneticist trying to understand how an organism functions by only looking at the protein translations of each gene – it overlooks the multiple effects of how fundamental units work within a greater context. Dictionaries and translation guides might give a rough idea of what a single word might mean in a generalised fashion, but beyond that they’re good for pressing flowers and taking up valuable shelf space.

A famous Australian athlete was recently the centre of attention for all of the wrong reasons when she boasted about a favourable rugby union win by quipping on Twitter, ‘Suck on that, faggots’. Enter press vultures, minority-group talking heads and anybody else with an opinion on sport-celebrity worship and homophobia.

The word ‘gay’ has had a rather adventurous journey over the past century, progressing from something Fred Flintstone would describe his parties as to something Barney Rubble would one day come out of the closet as (yeah, come on…it’s obvious), to eventually becoming a pejorative implying something to be weak or unsuitable. This back-and-forth struggle over the word’s meaning is usually taken to betray a single, deeper driving force; in this case, one of homophobia. So too, words like ‘faggot’ have evolved under conditions of hatred, fear and generous helpings of misinformation.

However, the second driving force of word usage is the social paradigm. We don’t always use words for their denotative meaning, but often for select connotative ones that are emphasised within our personal tribes. The Aussie tradition of ribbing your friends often leads to borrowing pejoratives with an ironic implication. That doesn’t let this particular athlete off the hook, however – we always have a responsibility to consider the implications that a term’s various connotations will have on our entire audience. Not expecting individuals within a diverse audience to feel uncomfortable about a tweet that uses the word ‘faggots’ is rather naïve, even if it is born out of a habit of not thinking before writing.

Failing to consider the tokens of information that hide amongst a word’s shadows – including the substantial impact of body language, tone and context – frequently leads to heated discussions such as this. We each bring our own cultural assumptions regarding the subtle differences in what a word implies. For some, ‘faggot’ is deeply offensive. For another individual on the other side of the city, their own sociocultural context blinds them to its barbs and emphasises its playfulness. In crossing cultural boundaries, what is a cheeky jibe is seen as a hate crime. Of course, the distinction is not a clear boundary. For it to have evolved, it needed ignorance. It needed people to forget the impact words have in getting past our cognitive defences and changing how we feel without our detection. We might not consider a word’s diverse implications when we use it, but they exist nonetheless, confusing communication when we least expect it.

The culture of cyberspace is crippled by its own blindness. Devoid of non-verbal cues, often deprived of the immediacy of direct feedback, and distanced from the audience by time and space, communication is severely handicapped. The saga of the rationalist surge’s ‘Tone Wars’ is a perfect testament to this, with calls to come up with concrete definitions of what it means to be a ‘dick’, all the while people provide perfect examples of the behaviour in comments and blogs.

Yet to be fair, the blog bubbles that concentrate such heated debates do the act of communication such a disservice that one has to try hard not to be perceived as ‘dickish’. Tools we frequently use in everyday life, such as wry smiles, rising and falling tones, nodding, or even the ability to quickly gauge how an audience interprets the use of a term, don’t exist in the comments section of a backwater blog. Rarely do those engaging in web-based discussions give the benefit of the doubt to an author, instead seizing on a turn of phrase and demanding a duel to the death with dictionaries at dawn.

On the flip side, it’s easy to hide in the shadows of a word. We might all think we know what ‘God’ means, yet typically forget that the term often lacks clear defining features, being stretched to imply vague notions of ‘something out there’. ‘Science’, as I’ve often said on this blog, suffers greatly from the vagueness of its own meaning. Some do this intentionally, while others are simply oblivious to the fact and carry on with the presumption that meaning will be magically transferred from one mind to another fully intact.

Knowing the power of the word as well as its limitations goes a long way towards being an effective science communicator, even if those limitations mean one cannot ever be perfect at the game. Larger audiences tend to encompass diverse cultural backgrounds, and tuning language to suit everybody can often rob it of much of its heart and essential subtext. But finding that balance is important for getting benefits out of any discussion.


Published in: on September 10, 2010 at 11:12 am  Comments (2)  

Processes and products – the gears behind the facade of science

Let me introduce you to an old colleague of mine. For the purposes of this discussion, we’ll say he was a middle-aged bloke, and we’ll call him Joe. We both taught junior science in the same school, although Joe also covered subjects in IT and physical education.

Joe and I got on rather well, especially when he discovered I was an atheist. On one level I could sympathise with his eye rolling and venting about creationist beliefs in the community or the call for prayers for somebody-or-other’s hospitalised relative, although I never felt passionate enough about it to feel a need to match his vehemence.

When it came to evolution, Joe seemed to know his stuff. He could recite passages from Darwin’s ‘On the Origin of Species’, knew all of the significant details of the Dover intelligent design trial and always had interesting discussions with his classes on some strange new discovery in biology that exemplified speciation. By most accounts he was a superb teacher, loved by his students and enthusiastic about their education. Hence it came as a surprise when he expressed doubts about the validity of climate change.

I was curious. His reasons weren’t particularly novel; part conspiracy theory and part ignorance of atmospheric chemistry, from what I could gather at first. The justifications behind his doubt were so mundane I wouldn’t have even blinked, had it been anybody other than Joe. So over a beer we discussed the details of our opinions on the matter, which meandered through other topics such as psychology, sociology and anthropology, and somehow back into his pet field of evolution. We agreed on most things; however, when I began to press him on his philosophical underpinnings regarding the principles of science, I was in for a shock.

By the end of the evening it seemed to me as if Joe believed in evolution because he had a particular dislike of religion. Psychology was a soft science, therefore useless as it didn’t give us the dichotomy of ‘certainty’ that physics did. And climate change was probably an invention of climatologists because they weren’t being paid a great deal. It became clear that while we agreed on what constituted a valid scientific theory, we disagreed on what made them valid.

Joe taught me a lot about epistemology – one needn’t know a lot about how science operates as a methodology to embrace ‘scientific’ beliefs. It might seem obvious in hindsight, but it was something of an epiphany to me. From that day I started to notice broken logic slipping into Joe’s arguments. Not to say I disagreed with his conclusions, but primed with an insight into his epistemology when it came to science, I found myself spotting strange non sequiturs, colourful straw men and the occasional twisted fallacy.

Two months ago I received an email from a reader of the publication I write. Let’s say she was a young mother, and we’ll call her Mary. She was concerned that an article I’d written had been phrased so as to presume that evolution was factual, and felt obliged to question me on it.

I always have the option of responding to critical emails I receive with a standard ‘thank you for your feedback, here is an explanation of our policy on evolution/climate change/paranormal events etc.’ paragraph, if I feel anything more in depth would only be a waste of time. Mary’s email was quite polite and worded in a way that made me give her the benefit of the doubt. So I took the time to respond.

Several emails bounced back and forth, and none of them persuaded her to accept that evolution was a robust theory. However, she did demonstrate that she had read the information I had sent her and understood that her previous beliefs weren’t at all internally consistent. She promised to buy a book I suggested on the topic and continue to think on it.

Where Joe and I shared similar conclusions but differed in our means of arriving at them, Mary’s language gave me confidence that her way of thinking would see her through eventually. The fact she didn’t take my word on it was also somewhat gratifying; she subscribed to my publication because she liked science and wanted a practical resource to give her ideas to use with her children. But given what she’d been told about evolution, she was concerned I might be wrong.

Joe and Mary are the two people who immediately come to mind when I think about education. Joe reflects the product-based communicator, who eagerly distributes knowledge with a passion, describing it as absolute truth or absolute nonsense. Unfortunately, it is accompanied by an epistemology that says ‘if it sounds ridiculous, it probably is’ and ‘if you can’t prove it, it’s not scientific’.

On the other hand, Mary makes me think of the right epistemology, even if I disagree with her conclusions. Somewhere along the way she’d heard from a trusted friend that the human eye is irreducibly complex, and therefore evolution was flawed as an idea explaining biodiversity. She had never studied science in her senior years of high school and openly admitted she was heavily influenced by a friend she considered to be well educated. On further discussion, she admitted that her lack of education in science made her feel easily intimidated by those who claimed to know better and she sought a better way of understanding the concept in the face of what she saw as a conflict of views.

Obviously in the end, it’s the products of science that have a pragmatic impact on our lives. Good epistemology means nothing if you still believe that fairies will save you from cancer, or that Reptilians are running a New World Order global government. We can debate the validity of climate models until the cows come home, but the methodology is useless unless it is used to make a decision. Good information is a necessity in education – science cannot be taught without facts and theories. Yet prioritising the communication of the products of thinking over the process is just like painting a wall without a primer coat; you need more paint and risk having it peel off once the bad weather hits.

The world has a glut of science communicators and rationalists who are eager to promote scientific facts, theories and hypotheses in the face of misinformation, as if by shouting with enough passion they will somehow drown out the cacophony of nonsense. In some cases, that passion does rub off and inadvertently changes epistemologies. People who would sooner stay silent are encouraged to demonstrate their epistemological values, influencing their children, nieces, nephews, students, fans, football team or band members that tiny bit, just enough so they, too, start to think more scientifically.

But it is an accidental success, one that is unqualified and incidental. In rare cases, there are those who understand the importance of encouraging young people in adopting a scientific epistemology and focus their efforts not on driving home the facts, but encouraging open investigation, discussion, critical thinking and experimentation.

Much of the time it’s hard to distinguish the Joes from the Marys, as we readily associate compatible beliefs with compatible philosophies. Joe’s students might side with Darwin, but only for as long as it makes sense or for as long as they like him as a teacher. Mary’s children might echo her disbelief in the effects of natural selection, but so long as she demonstrates flexibility in her beliefs and a willingness to ask questions, I have little fear that they’d defend their disbelief irrationally in the face of logical, internally consistent evidence. Hence I’m far more concerned about Joe’s impact on his students than I am about Mary’s influence on her children.

Published in: on September 6, 2010 at 4:28 pm  Leave a Comment  

Psych-out over ridiculous communication

And this bump means I'm clumsy.

I’m not a psychologist. I haven’t got a degree in it and I failed the only subject I did in it during my undergrad about fourteen years ago. That said, much of my career has revolved around making myself well acquainted with certain topics in this field. Education and cognitive psychology go together like pedals and bicycles. To butcher the analogy further, education without psychology is like a bicycle without pedals – it might feel like you’re going for a ride, but once you get to a hill you’re screwed.

While I cannot profess to significant expertise, I have worked hard to understand the fundamentals and have a keen eye for deciphering literature on current educational psychology. In my world, psychology is a useful science for determining worthwhile methods for communicating, educating, engaging and assessing.

Much like its sibling sociology, psychology is often flippantly dismissed as a pseudoscience on the grounds that it is compromised by a myriad of variables that no experiment could possibly iron out. As such, diverse opinions on anything to do with human thinking or behaviour are treated as equally valid. Or, more precisely, for some people, personal stories and ad-hoc rationalisations are granted more weight when it comes to claims in psychology than they would if the topic was physics or chemistry.

I find this rather odd, given my understanding of science as a methodology. It’s true that psychology and its ilk are impeded when it comes to devising experiments. Ethics aside, controlling for a diverse range of niggly factors makes it hard to definitively pinpoint a simple relationship between any two observations. Fortunately science is not limited to the constrictive dichotomy of ‘proven by experiment’ and ‘falsified by experiment’; it isn’t about ticks and crosses, but about the weight of evidence accumulated through a combination of logic and observation. It’s true that a social experiment cannot carry the same convincing power as one done with lasers and atom counters. But that’s not to say that all beliefs regarding social behaviours or cognitive reactions are of equal weighting.

The so-called tone wars rage on, with Phil Plait recently fleshing out his TAM8 speech with several blog articles, and Richard Dawkins weighing in with his two cents. While I can appreciate the sharing of opinions, I do find it quite strange how the numerous responses to each – as well as the original comments themselves – utilise personal accounts to support claims of the role of ridicule in communication. I find it odd because this is a community of skeptics; people who prize science and would quickly put the boot into anybody foolish enough to try to support their claim with an anecdote or a just-so-story.

While by no means does it come across as a majority view, I have had it pointed out to me that psychology is a soft science, and therefore it’s impossible to address such a claim with evidence. I wonder, however, how many of these skeptics would quickly suggest the role of confirmation bias in psychic claims, or cite any number of other neurological quirks or cognitive hiccoughs as a suitable explanation for some otherwise paranormal observation.

No, psychology is not physics. We won’t have laws of the brain in the next few years, or a formula for sociology. The local tabloid might have articles on equations for the best handshakes or how best to meet girls, but few self-respecting psychologists would entertain such notions seriously. But to dismiss it as a science and believe it has no role in helping us understand communication simply because it’s complicated is to misunderstand how science operates. We’ll probably never know whether ridicule is the best course of action for creating a more critical community in the same way we know the laws of thermodynamics. But by no means does that mean all bets are off when it comes to studying human behaviour, and in no way does it mean all opinions on the topic are equally valid.

Published in: on August 25, 2010 at 12:35 am  Comments (4)  

A public immunity to freeloaders

Imagine a dinner party where a guest openly admits they don’t work. On inquiry, they smile warmly and cite statistics concerning death or injury in the workplace. ‘I just don’t want to risk it,’ they say.

The next obvious question comes up: ‘So, how do you get by?’

‘Oh,’ they say, pausing to take a sip of wine. ‘Easy. You see, enough people pay taxes to provide me with welfare. I really don’t need to work.’

Everybody smiles and nods politely, believing it’s their friend’s choice to refrain from working if they don’t want to, and they move on to other topics to do with politics and religion.

Sound familiar? No? I must admit, in spite of the numerous dinner parties I’ve attended in my life, I’ve never encountered that scenario. There’s probably a simple reason for that – most people would be embarrassed to openly confess such a thing. Who would want to say to others that they’re not willing to roll the dice for themselves, but are happy to enjoy the benefits provided by the risk-taking of others? Very few would.

Yet several years ago during a lunch outing with work colleagues I had an acquaintance openly state they chose not to vaccinate their children. Why not? Simple – they cited the risks of vaccination and stated they didn’t want to take a chance on their child’s health. Unfortunately the mood of the conversation lightly condoned their choice and even congratulated them on making such a decision.

Now, I have an infant son who is of an age where he is getting his vaccinations, and I must confess I hate seeing him in pain or discomfort. The thought of him dying is the most terrifying thought I have ever had to encounter in my life – and that isn’t hyperbole. There’s only one thing worse; if his suffering was the direct result of a decision I made.

I chose to have my son vaccinated knowing he could face uncomfortable side effects. Not only did I know this from my undergraduate degree and occupational experience as a medical scientist, but my wife and I knew because we had easy access to literature and a physician who discussed the situation frankly and honestly. I understand not everybody is so fortunate, and that there are undoubtedly those in the medical profession who would avoid discussing the potential for harmful consequences. However, I find it hard to reconcile my experience as a parent with the concerns of groups like the Australian Vaccination Network, who feel a responsibility to present an emotion-laden ‘balance’ of information to the public.

This past colleague based their decision to refrain from vaccinating their children on two notions – one was that they’d read accounts of children suffering from seizures and even dying following vaccination, and the second was that they’d never heard of children dying or suffering from the conditions being vaccinated against. At least, not recently. When another colleague pointed out that vaccination could well be the reason behind such an absence of modern mortality, they referred vaguely in response to statistics indicating that death rates from communicable diseases were dropping long before public vaccination programs.

This is, in a way, quite correct. Better sanitation and improved healthcare practices during the first half of the 20th century saw a gentle decline in deaths from diseases such as measles and whooping cough in most post-industrial countries like Australia. It’s difficult to tease out with a simple glance at a graph the precise impact of any public vaccination program from the impact of improved medical intervention, given we don’t have a control population with modern healthcare practices sans immunisation. The best we can do is to watch the statistics of health complications that arise from epidemics when public vaccination falters.

Unfortunately, there is no shortage of such natural experiments. From 1988 to 1990, for instance, California experienced a measles epidemic of 16,400 cases that resulted in 75 deaths. That wasn’t in an impoverished, third-world country without access to sanitation or medicine – it was in a modern, post-industrial nation where immunisation rates had fallen.

Nations with social health care systems like Australia haven’t been free of such outbreaks, either – in 1993, the Western Public Health Unit received 889 measles notifications for Sydney’s western suburbs. Ten percent resulted in hospital admissions, with one case of encephalitis. Fortunately no deaths occurred.

How does this compare to the risks posed by a measles vaccine? During the Measles Control Program in 1998, there were 89 reported reactions out of 1.78 million vaccinations – the same number as hospital admissions in the Sydney outbreak of only 889 cases of infection. These reported reactions included 8 rashes, 4 cases of inflammation of the parotids and one febrile seizure. No children died.

There are, of course, plenty of anecdotes to suggest horrific experiences of vaccinated infants. Seizures are certainly possible, and for me to experience such an event with my child would be unimaginable. But in a community where nobody took that risk, the dice I’d be rolling for my son would be heavily loaded. Even if each anecdote was verified, it’s hard to imagine the risks would come close to the chance of complications from contracting a disease like measles. I might not like the one in a million chance of my son having a seizure, or the slightly increased chance of death such a side effect could present, but the odds he’d face in a world with no vaccination simply wouldn’t compare.
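The figures quoted above can be put side by side with a little arithmetic. This is a rough sketch only – the 1998 vaccination program and the 1993 Sydney outbreak are not strictly comparable populations – but it illustrates the scale of the gap between the two risks:

```python
# Back-of-envelope comparison of the rates quoted above:
# 1998 Measles Control Program reactions vs 1993 Sydney outbreak admissions.

vaccinations = 1_780_000
reported_reactions = 89      # rashes, parotid inflammation, one febrile seizure

infections = 889
hospital_admissions = 89     # ten percent of notified cases

reaction_rate = reported_reactions / vaccinations   # 0.00005, i.e. 1 in 20,000
admission_rate = hospital_admissions / infections   # roughly 1 in 10

print(f"Adverse reaction rate per vaccination: 1 in {round(1 / reaction_rate):,}")
print(f"Hospital admission rate per infection: 1 in {round(1 / admission_rate):,}")
print(f"Infection carried roughly {round(admission_rate / reaction_rate):,}x the risk")
```

Even granting the limits of the comparison, the reported reaction rate sits three orders of magnitude below the admission rate for actual infection.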

Yet the community this colleague can happily raise their child in is not unvaccinated. Enough people roll that dice, so their children can appreciate good health in a community where pathogens have nowhere to proliferate. So long as a high enough percentage of their fellow citizens take that risk for them, they won’t have to take that tiny but real chance of suffering vaccine side effects.
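The ‘high enough percentage’ here can be given a ballpark figure with the textbook herd-immunity threshold, 1 − 1/R0, where R0 is a disease’s basic reproduction number. Note the R0 range used below is a common literature estimate for measles, not a figure from this post:

```python
# Textbook herd-immunity threshold: the fraction of a population that must
# be immune for an infection with basic reproduction number R0 to die out.

def herd_immunity_threshold(r0: float) -> float:
    """Return the immune fraction needed to halt sustained transmission."""
    return 1 - 1 / r0

# Measles is famously contagious; its R0 is often estimated at 12-18.
for r0 in (12, 18):
    print(f"R0 = {r0}: ~{herd_immunity_threshold(r0):.0%} of the community must be immune")
```

For measles this works out to somewhere around 92–94 percent – which is why even a modest number of freeloaders can give pathogens somewhere to proliferate.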

I’m happy to shoulder that burden on behalf of any individual whose constitution puts them at a significantly greater risk of illness should they be vaccinated, just as I’m happy for the taxes I pay to help benefit those who are impeded from working. Yet for those who simply don’t like the idea that the demonstrably minuscule odds are too much for their child to risk, I feel no such obligation.

When it comes to most things in the community, I’m a process-driven rationalist. I support people making their own decisions regarding their own finances, health and wellbeing, and choose to engage in outreach that assists them in making decisions with the best possible chance of matching the outcomes they hope for. I oppose regulations that take those decisions out of others’ hands, and stand against the use of authority to deprive any citizen of the right to decide a course of action for themselves. I believe in education over legislation…for most things.

Vaccination is a communal decision. While education remains a vital necessity in this regard, I also feel that relying on the risk I take to see to one’s child’s safety borders on the criminal, and at the very least can be considered highly immoral. I struggle with the idea of legally enforcing vaccination, yet find it even more difficult to welcome the choice of others to take for granted the protection my risk-taking has provided for them.

Needless to say, I don’t tend to go to many dinner parties any more. I stayed quiet during that lunch. As a relatively new father, I’m not sure if I could manage such silence again in the future.

Published in: on July 25, 2010 at 11:14 pm  Comments (10)  

First, do no harm

In chewing through my weekly readings for my current medical anthropology studies, I came across a paper which explores four social theories of global health. The first was described as follows:

…the unintended consequences of purposive (or social) action. Introduced by the sociologist Robert Merton, this theory holds that all social interventions have unintended consequences, some of which can be foreseen and prevented, whereas others cannot be predicted. Therefore all social action needs to be routinely evaluated for unintended consequences that might lead to the modification of programmes, and even, if the consequences are serious enough, their termination. This theory would seem to be the social science equivalent of medicine’s ‘first, do no harm’, but it goes well beyond that ancient saw to reason that every action can have unintended and often harmful consequences of programmes…(Kleinman, A. (2010), The art of medicine: Four social theories for global health, The Lancet 375: 1518–1519)

It struck me how relevant this was not just to global health but to any social engagement, especially that of the contemporary rationalist surge.

Communication of rational thought by grassroots communities appears to be a rather ad-hoc affair, justified by assertions that it takes all manner of styles to educate people, and by anecdotal evidence of what worked for them (so must surely work for others too!). On occasions that I’ve addressed this vague, almost whimsical method of outreach, I’ve been met with explanations of how grassroots communities consist primarily of volunteers and amateurs with limited time and resources and little professional support. Which is true, of course.

Yet these same individuals demonstrate boundless passion in their production of volumes of research on all manner of paranormal and pseudoscientific debunking. There are countless pages of words pumped out daily by a veritable army of bloggers who devote enormous man hours of reading and writing in the name of making the public ‘aware’ of what they perceive to be nonsense. This army consists of a community rich with academics, teachers, physicians, engineers and researchers who all have years of experience in doing their homework and solving problems.

It’s possible, of course, that communication is simply not regarded as a problem to be solved. Which is unfortunate. Given the drive so many rationalists have in wanting to promote ‘awareness’ and ‘educate the community’, it would be a shame if all of that passion was bottlenecked by a myopic refusal to pause and consider this simple question – ‘What is the full impact of my actions?’

Published in: on July 21, 2010 at 9:56 pm  Comments (10)  