The placebo protest: under the microscope

Placebo pills

Protesting makes me feel good.

At the end of April last year, I wrote an essay on the rise of protest-based demonstrations as a means of engaging with the public on certain irrational beliefs. While the 10:23 homeopathy campaign and the so-called Boobquake protest were the two references I provided, other examples such as the atheist billboards in the US can be arguably included in the category of what I termed the ‘placebo protest’.

In the simplest terms, a protest can be described as any collective attempt to coerce others into changing behaviour or taking action, making it a fairly broad category that can include many different forms of public engagement. However, the term also suggests the active opposing of an existing social condition, so it is commonly negative or antagonistic in nature. A campaign that promotes the message ‘don’t eat pizza from Joe’s’ is a protest, while ‘eat at Joe’s Pizzeria’ is less likely to fall into that category.

While the academic literature varies somewhat on the precise boundaries of what constitutes a protest, there is a consensus amongst outreach researchers that any attempt to influence public behaviour relies on specific environmental conditions (termed ‘opportunities’) to succeed. Identifying these conditions can make the difference between winning people over and wasting resources trying.

A good example of identifying opportunities involves understanding how a target demographic interprets a particular message. Communication tools that convey subtle variations in meaning between sub-cultures risk losing key messages in translation; stunts, demonstrations or slogans that mean one thing to the protester (or stem from a culture within that group rather than one understood externally) and another to the audience could render any effort to change behaviour impotent, or even counterproductive.

A significant impediment to identifying opportunities in a demographic is a lack of – or wide variation in – explicit goals. Terms such as ‘promoting education in…’ and ‘raising awareness of…’ are commonly bandied about without objective qualifiers or even a hint of an observable indicator. Often, the qualities of the target audience are too broadly defined or presumed without good evidence. Without clear aims or targets there is an added risk of ad hoc justifications of success, typically relying on output (audience scope and reach) to stand in for impact (change of behaviour).

For a form of outreach to be a placebo protest, however, there is one last important feature – those engaged demonstrate little interest in evaluating the circumstances or effectiveness of their actions. Like placebo medicine, placebo activism is practised not with a true desire to guard against bias, but simply to feel better for having acted, regardless of the true impact of the effort.

While I can’t accuse all individuals engaged in any single protest campaign of acting for merely placebo reasons, there is some irony in that a number of people will happily offer explanations for their participation that are not unlike those offered by many users of homeopathy or natural medicine: ‘soft science like sociology or psychology is too ineffective to study the effects of what I intuitively already suspect to be true’, ‘it takes all types of action to make a difference’, ‘doing something is better than doing nothing’, and ‘it might not work for all people, but what’s the harm in trying?’

Central to the placebo protest is the apparent assumption that sharing feelings is synonymous with sharing knowledge. An emotional reaction to a wrongdoing leads to encouraging others to see it as silly, immoral or dangerous. That’s not to say this is always ineffective (history is full of examples of fear campaigns that are immensely successful in changing behaviours), however when it comes to rational outreach, should it be the desired approach?

Boobquake, for example, was proposed as a scientific study, but was presented more as a satirical exercise poking fun at an Iranian prayer leader’s claim that the exposed skin of females is positively correlated with earthquakes. Either way, it’s unclear what – precisely – the point of the exercise was, if not an outlet for indignation. Many people have their own view of the agenda, whether it was to promote scientific values, to encourage people to understand more about tectonics, or simply to ridicule a specific view (thereby encouraging an emotional reaction in the population to an emotional claim).

The actual impact, regardless of the intentions, is unknown. Was it antagonistic towards the goals of many feminists? Did it polarise views or change them? Were a significant number of people more aware of the science of earthquakes, or of the importance of statistics in science? It’s not clear. Yet there was still a sense of ‘success’ given it had a large output.

When the sense of success carries more importance than a true understanding, however, science loses out. This is the placebo protest. For a community of people protesting in the name of science, it is a rather bitter hypocrisy.

Likewise, when American Atheists launched a billboard campaign in time for Christmas 2010, telling people ‘You know it’s a myth!’, it’s hard to know what the real aim was. At face value, it might serve as encouragement for members of the driving public who hold some theistic beliefs to abandon them. How successful was it? Are billboards an effective means of spending such funds, or could the same (or better) results have been achieved by spending elsewhere? If it was successful, how did it fare against the reciprocal billboard funded by Catholics stating ‘You know it’s real’?

What of the 10:23 campaign, now in its second year? Interestingly, one individual decided to take a closer look at the 2010 homeopathic ‘suicide’ stunt and seek some evidence of its impact.

As a part of a research project, David Waldock sampled reports from the mainstream and social media and analysed them in relation to the event. Focusing on a single objective of the campaign – ‘To educate the public about the full story of homeopathy, to cause them to question and become opinionated about homeopathy’ – he found that the context of the various forms of media discussion changed from being more scientific and clinical to being more political, tending towards language that reflected regulation rather than the specific mechanics of the practice.

Of course, this lays the foundation for a rather interesting discussion. Given evidence of a discourse that is leaning towards regulation, should this be the goal of future protests? Is it better to influence politics and act top-down, or should activists continue to focus on changing attitudes from the bottom-up? Are resources being well used if this is the response, or should they change?

The important thing is that useful discussion can now progress on the back of evidence rather than blind assertions. David’s work is by no means the final word on the matter, but it has at least provided grist for the mill and is a clear attempt at marrying observed consequences with actions.

For activism to be successful, it needs to be done with evidence, experience and expertise. Currently, protests and stunts seem to be performed more as a means of expressing frustration, anger or bigotry than as a measured way of encouraging a change of culture. As such, success is measured by how many people know you’re upset.

Yet if we truly wish to combat the poor consequences of irrational thinking, we need to identify what makes outreach effective, and distinguish this from occasions when it is merely a way to placate the irate.


The Others

 

Apothecary

Medicines stupid people use (nb., I'm not one of them).

“How can skeptics have a dialogue with homeopaths?” Michelle asks that modern well of insight and wisdom, ‘Yahoo’. “[W]ithout pointing out the stupidity of their arguments? I’m thinking about the paranoid ramblings about big pharma as well as the ignorance of simple science.”

Ignoring for a moment the framing of Michelle’s query, I was interested to scan through the responses for a solution two centuries of debate on the topic might have overlooked.

“Crucially, homeopaths lack the educational level to understand how their potions can only be water,” says Dave, a top contributor. Another top contributor says, “They only start with the fallacies to avoid providing evidence – so no matter what they crap on about, keep dragging them back to evidence.”

“Never argue with an idiot, they’ll drag you down to their level and beat you with experience,” says Flizbap 2.0.

And on it goes. There are some who advocate avoiding engagement without resorting to well-poisoning or flippant antagonism, but for the most part the advice involves engaging in a behaviour anthropologists and other social scientists refer to as ‘othering’.

Regardless of the intentions, the practice involves judgments of inferiority or impeded progress based on observations of contrasting social beliefs, behaviours and values. It is born of ethnocentrism, where observations are made with the assumption that one’s own experiences define what is objectively desirable. The result is a sense that a group of people, ‘them’, is inferior to one’s own company, or ‘us’, on account of differences in beliefs and values.

By the dawn of the 21st century, however, ethnology had exerted enough influence on the developed world that it has become difficult to ‘other’ non-local cultures without seeming naïve or xenophobic. Most people have come to see that subsistence farming or hunter-gathering is not a mark of inferiority or low intelligence, and that limited technological dependence is a socioeconomic issue rather than a cultural or cognitive failing. Openly claiming that a village in the Papua New Guinea highlands is ignorant, stupid or indulgent in logical fallacies would probably raise eyebrows, leading such discussions of cultural practices to be couched in less derisive terms. While the debate over racial intelligence might continue, it’s harder to find people who justify their beliefs by pointing to contrasting traditions, lifestyles or cultural practices.

Within national borders, however, ethnocentrism returns with all the ignorance of our colonial ancestors. If there’s one habit we can’t seem to shake, it’s that our nationalistic heritage has embedded in us a strong correlation between culture and country, as if by being white and sharing an accent our cultural values must be homogeneous. As a result, othering occurs far more easily with those who appear (at least superficially) to share an ethnic background.

What’s missed is that within our own community there are shades of culture and subculture that pool, ebb and overlap. Healthcare is just one example, yet one with significant consequences beyond other examples of cultural behaviour such as art or language. Medicine framed as a scientific product leads many to interpret healthcare as a ‘culture without a culture’. Science and medicine are typically presented as timeless, truthful and, above all, objectively correct: strictly biophysical, with the sociocultural component reduced to a vestigial nub.

As such, it’s far easier to other those who demonstrate contrasting medical behaviours. Lack of intellect or education can be held up, without evidence, as the reason for their alternative beliefs, since it’s assumed that all else must be equal. Archaic and empty solutions such as ‘better education’ or legal enforcement are then suggested as ways of making people see sense.

In truth, there is a multitude of reasons why people use alternative medicines, few of which (if any) have much of a direct link with level of education or cognitive deficiencies. Rather, differing values about what constitutes good evidence, familial traditions, cultural identities and distrust of contrasting sociocultural groups play far greater roles in determining health behaviour than university degrees or brain function. In other words, the very same factors medical anthropologists deal with abroad when studying any other health culture are responsible for the same alternative beliefs in our own community.

The question of how best to address culture change is just as relevant here as it is elsewhere. It’s all well and good that African or Indigenous communities retain their cultural heritage, but what does one do when it conflicts with treatments for HIV, alcohol abuse or diabetes? This is a question a good friend of mine is currently researching through the Australian National University; as you might expect, the resulting discussion demands more than a simplistic ‘they need better education’ or ‘they’re just stupid’. Yet it’s not a novel dilemma; whether it’s vaccination, water sanitation, nutrition or infant care, the question of how to effectively assist diverse communities in improving and maintaining their health and wellbeing has occupied anthropologists for years, producing rich debate and diverse results.

Ironically, those who propose answers for Michelle seem to identify as individuals who would normally value science as a way of presenting useful solutions to a problem. Why then do few seem to be informed by research? Why are the answers without citations or references, seeming to be based on little more than personal hunches or folk wisdom?

Based on my own experience, few would be inclined to look further, as they already assume themselves to be correct. Science works for some things…unless you already think you know, at which point it’s all rhetoric and pedantry. Social science is a soft science, so gut feelings and intuition are seen as just as useful (if not more so).

Michelle’s question and many of the answers reveal the roadblock we face here in our efforts to address alternative healthcare. Rather than treating it as a legitimate sociological question, where science might provide some insight, the issue is reduced to a dichotomy of smart vs. stupid, of educated vs. naive. When those are the questions we ask, we certainly can’t hope for any productive answers.

ACARA’s bent spoon

Charles Darwin - the perfect anti-creationist picture

Let me be upfront and honest about something – I’m no great fan of anti-accolades at the best of times. You know the ones; an ‘award’ for the worst dressed/stupidest/most laughable actor/book/production/product and so on. I simply don’t see the point, beyond the smug sense of superiority the awarder feels over the awardee. But, given human nature, I rarely say much, as it’s hardly worthy of comment.

I didn’t attend TAMOz this year for numerous reasons. But I did hear on the grapevine that the annual ‘Bent Spoon Award’ was presented in absentia to the Australian Curriculum Assessment and Reporting Authority (ACARA) for the imminent National Curriculum science framework.

And, frankly, I was pretty gobsmacked.

First, some background. The Bent Spoon Award is an annual raspberry blown by the Australian Skeptics at ‘the perpetrator of the most preposterous piece of paranormal or pseudo-scientific piffle.’ As such, it must have been decided that of all paranormal and pseudoscientific acts, products and claims made in 2010, the National Curriculum must be the worst offender. Given I’d spent a good part of the year reading through it, I naturally presumed there was a sizable chunk of witchcraft, alchemy, geocentrism, voodoo or spiritualism I must have missed.

Fortunately that’s not the case. What was it that was so offensive in this draft framework? According to the nomination, it was for ‘devaluing the teaching of evolution in schools, allowing creationism to be taught, and for teaching alternative theories such as traditional Chinese medicines and Aboriginal beliefs as part of the Science Curriculum.’

Not for removing evolution altogether, and going down the dark path of Texan education. Not for putting creationism or intelligent design or Raelianism into the year 7 classroom. Nothing quite so definitive. It was a vague ‘devaluing’ of education that Australian Skeptics wanted to advertise to the world as the most deserving of scorn over all other media items, pseudoscientific products or audacious claims. One would hope that they had some pretty strong evidence to support the connection between ACARA’s choice of content and a loss of educational ‘value’.

Education is, of course, valuable. Anything that reduces the effectiveness of the system in preparing children and adolescents for their future should be addressed. A quick flick through my blog is enough to gauge my views as far as the topic goes. Indeed, it’s so important, I take claims that it is ‘devalued’ quite seriously.

Before we look at the criticism and ask whether the curriculum really warrants its prize, it might pay to quickly establish some context.

Australian education is a responsibility of the state level of government. As such, all states have an education act that prescribes what and how people will be taught important knowledge and skills. An example can be found at Queensland’s Department of Education and Training site. All states have similar documentation, which in part dictates the creation of units or subjects in schools that reflect a curriculum created by a state body. How this is assessed varies between the states, but typically includes the collection of student work samples along with a syllabus that demonstrates significant effort has been made to follow the curriculum’s framework.

It has long been a concern that while there is strong similarity between the state curricula, the order and timing of the skills and content taught has the potential to create difficulty for population movement. A student whose family moves from Perth to Sydney might be disadvantaged by having missed some topics while repeating others. This led to growing support for a national curriculum.

In April 2008 a national curriculum board was put together with the purpose of meeting this challenge. In early 2010, ACARA released Phase 1 of its kindergarten to year 10 National Curriculum for feedback, which covered mathematics, English and science subjects. Later it opened Phase 2, covering geography, language and the arts for public review.

It appears that science will be the most rigorously revised field, as the dominant criticism in the feedback was that a sense of perspective was lost by focusing on certain details. Going into detail on my personal views of the document’s strengths and shortfalls would fall outside the scope of this post. Overall, in spite of certain small reservations, I felt that as far as science went it was a robust framework that balanced the diverse needs of the community it was serving.

Having been responsible for working on unit plans and analysing curriculum frameworks, I can sympathise with their creators, especially when faced with a wall of teachers and community members who feel their particular pet field is more important than the others. We’re all familiar with the ‘overcrowded curriculum’, and knowing what is vital for the future citizen to know is no easy task. Keeping everybody happy while delivering a working structure is a nightmare.

Which brings us to the nomination for the Bent Spoon. It states that the teaching of evolution has:

‘become virtually sidelined, appearing in one section of Year 10 only.’

Evolution does indeed appear explicitly as the first point in Science Understanding in Year 10. I would ask the author where else he feels it should be. In my experience as an educator, covering evolution explicitly as a topic in its own right is difficult before students can grasp abstract concepts, which more or less rules out going into much depth before years 6 or 7; I’ve never covered it as a concept before year 10. Of course, content based on biological categorisation – important for grasping evolution later – can be covered, and is, in year 4. Physiological adaptations are usually covered in year 8 or 9, while fossils (and discussion of ancient animals) are covered early, in about year 3. Genetics typically works alongside evolution in year 10 (as is the case here). So while the word only appears once, concepts fundamental to understanding evolution litter the curriculum.

“The evolution of man is not part of the syllabus, and all the examples of evolution given as ‘Elaborations’ in the syllabus deal with non-controversial or small scale applications of natural selection (e.g. ‘the impact of cane toads on the evolution of Australian Predators such as snakes.’)”

The evolution of man has never been part of the K–10 curriculum of any state to my knowledge (I’m happy to be corrected), as it is covered in detail in senior subjects. One can argue for it being moved forward or made compulsory, and I can think of arguments for and against doing so. However, I can’t help but feel that this isn’t being argued with pragmatic necessity in mind, but rather as a defensive posture against potential religious indoctrination.

But more on that later.

The elaborations in the document aren’t official requirements, but suggested guidelines on how a topic might be approached. They are typically framed with students’ prior knowledge in mind, making it easier for the teacher to determine a useful way to introduce the topic.

What of teaching that dreadful ‘Aboriginal’ science? What does the framework have to say about that?

“Specific knowledge and understanding of Aboriginal and Torres Strait Islander peoples is incorporated where it relates to science and relevant phenomena, particularly knowledge and understanding of nature and of sustainable practices. For example, systematic observations by Aboriginal and Torres Strait Islander cultures over many generations of the sequence of various natural events contribute to our scientific understanding of seasons in Australia.”

And,

“Students should learn that all sorts of people, including people like themselves, use and contribute to science. Historical studies of science, mathematics and technology in the early Egyptian, Greek, Chinese, Arabic and Aboriginal and Torres Strait Islander cultures extending to modern times will help students understand the contributions of people from around the world.”

Given my upcoming book goes into some depth on this topic, I’d like to think I have something of an informed opinion. I feel the demarcation problem makes it difficult to describe precisely what science is and isn’t. Indigenous Australians developed systems for describing nature, which might be viewed as scientific; they certainly created technology. Personally, I’m inclined to define science by its values of describing natural events in an impersonal fashion, so I would see this as an interesting contrast, where I’d present to students the question ‘Is science the same as technology?’. Nonetheless, I think it’s a valuable contrast in the classroom, and one (when taught in accordance with the skills implicit in the curriculum) that would benefit students’ comprehension of how science isn’t simply defined.

“Thus the syllabus leaves open the option of teaching Creationism, while teaching just the basic theory of Natural Selection to Year 10 students only, omitting any reference to the evolution of man, and not mentioning Darwin once. This must be of great concern to sceptics as this document will form the basis of Science teaching for the next generation.”

Here’s the core of the matter. ACARA was found guilty of peddling pseudoscience because it didn’t seal up the cracks, preventing the possibility of creationists slipping their venom into the ears of kiddies. That’s it – it didn’t account for the reds under the beds.

What troubles me most is that in spite of a greater focus on good scientific thinking, in spite of a move towards evaluative tools and a stronger critical epistemology than in any prior document, they got their wrists slapped because they didn’t put in enough Darwin. The assumption is that this is what impedes creationism in the classroom – evolution put in bold ink and underlined in a state-enforceable document.

There is a valid concern about pseudoscience slipping into the classroom. I’ve encountered it all before – teachers who believe that the spin of a planet causes gravity, who endorse conspiracy theories, or who teach that dolphins are a type of fish. But greater detail in the curriculum would not have made a lick of difference, given the existing documents failed to dissuade such errors or misinformation. Putting another evolution topic in primary school and adding Darwin to the list of great scientists will not safeguard schools against creationist teaching, and for that to be the focus of attack demonstrates a complete ignorance of pedagogy.

For skeptics, nothing should be more important than the arming of students with the fundamental skills that allow them to hear nonsense and identify it. This is not a question of content, but skills.

What does make a difference, then? A number of things. Better trained teachers. A school culture that reinforces cross-curricular skills. Improved career prospects for teaching and non-teaching staff. Good resources. Community involvement in the classroom. It’s not a simple solution, let alone one that can be addressed through a liberal dose of public mockery. Rather, differences are made by proactively contributing to the discussion with good information on pedagogy, cognitive psychology, best classroom practices etc.

The draft science curriculum is certainly not without its faults, and can definitely stand improvement; nobody would argue otherwise. Informed and constructive feedback is vital, and a group like the Australian Skeptics could well have pulled together a team of sceptical educators and produced well-supported feedback grounded in research, promoted on its website to demonstrate a measured approach to education. Discussing how to go about this would be worthwhile.

Yet what is the likelihood of the group being taken seriously by any curriculum council when its response is instead to ridicule ACARA, effectively calling them pseudoscientists because their conclusions don’t contain enough evolution for its liking? Not great, I’m afraid.

Published in: on November 27, 2010 at 9:14 pm  Comments (18)  

The other side

ITB06

Published in: on July 28, 2010 at 8:49 am  Leave a Comment  

A public immunity to freeloaders

Syringe

Imagine a dinner party where a guest openly admits they don’t work. On inquiry, they smile warmly and cite statistics concerning death or injury in the workplace. ‘I just don’t want to risk it,’ they say.

The next obvious question comes up; ‘So, how do you get by?’

‘Oh,’ they say, pausing to take a sip of wine. ‘Easy. You see, enough people pay taxes to provide me with welfare. I really don’t need to work.’

Everybody smiles and nods politely, believing it’s their friend’s choice to refrain from working if they don’t want to, and they move on to other topics to do with politics and religion.

Sound familiar? No? I must admit, in spite of the numerous dinner parties I’ve attended in my life, I’ve never encountered that scenario. There’s probably a simple reason for that – most people would be embarrassed to openly confess such a thing. Who would want to say to others that they’re not willing to roll the dice for themselves, but are happy to enjoy the benefits provided by the risk-taking of others? Very few would.

Yet several years ago during a lunch outing with work colleagues I had an acquaintance openly state they chose not to vaccinate their children. Why not? Simple – they cited the risks of vaccination and stated they didn’t want to take a chance on their child’s health. Unfortunately the mood of the conversation lightly condoned their choice and even congratulated them on making such a decision.

Now, I have an infant son who is of an age where he is getting his vaccinations, and I must confess I hate seeing him in pain or discomfort. The thought of him dying is the most terrifying thought I have ever had to encounter in my life – and that isn’t hyperbole. There’s only one thing worse; if his suffering was the direct result of a decision I made.

I chose to have my son vaccinated knowing he could face uncomfortable side effects. Not only did I know this from my undergraduate degree and occupational experience as a medical scientist, but my wife and I also had easy access to the literature and a physician who discussed the situation frankly and honestly. I understand not everybody is so fortunate, and that there are undoubtedly those in the medical profession who would avoid discussing the potential for harmful consequences. However, I find it hard to reconcile my experience as a parent with the concerns of groups like the Australian Vaccination Network, who feel a responsibility to present an emotion-laden ‘balance’ of information to the public.

This past colleague based their decision not to vaccinate their children on two notions: first, that they’d read accounts of children suffering seizures and even dying following vaccination; and second, that they’d never heard of children dying or suffering from the conditions being vaccinated against – at least, not recently. When another colleague pointed out that vaccination could well be the reason behind such an absence of modern mortality, they referred vaguely to statistics indicating that death rates from communicable diseases were dropping long before public vaccination programs.

This is, in a way, quite correct. Better sanitation and improved healthcare practices during the first half of the 20th century saw a gentle decline in deaths from diseases such as measles and whooping cough in most post-industrial countries like Australia. It’s difficult to tease out with a simple glance at a graph the precise impact of any public vaccination program from that of improved medical intervention, given we don’t have a control population with modern healthcare practices sans immunisation. The best we can do is watch the statistics of health complications arising from the epidemics that occur today when public vaccination falters.

Unfortunately, there is no shortage of such natural experiments. From 1988 to 1990, for instance, California experienced a measles epidemic of 16,400 cases that resulted in 75 deaths. That wasn’t in an impoverished, third-world country without access to sanitation or medicine – it was in a modern, post-industrial nation where immunisation rates had fallen.

Nations with social health care systems like Australia haven’t been free of such outbreaks, either – in 1993, the Western Public Health Unit received 889 measles notifications for Sydney’s western suburbs. Ten percent resulted in hospital admissions, with one case of encephalitis. Fortunately no deaths occurred.

How does this compare to the risks taken from a measles vaccine? During the Measles Control Program in 1998, there were 89 reported reactions out of 1.78 million vaccinations – the same number as hospital admissions in the Sydney outbreak of only 889 cases of infection. These reported reactions included 8 reported rashes, 4 with inflammation of the parotids and one febrile seizure. No children died.
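Taking the figures above at face value, the two risks can be put on the same scale with some back-of-envelope arithmetic. This is a rough sketch only – it ignores the difference in severity between a ‘reported reaction’ and a hospital admission:

```python
# Figures quoted above: 89 reported reactions out of 1.78 million
# vaccinations (1998 Measles Control Program) versus 89 hospital
# admissions out of 889 infections (1993 Sydney outbreak).

vaccinations = 1_780_000
vaccine_reactions = 89

infections = 889
admissions = 89

reaction_rate = vaccine_reactions / vaccinations   # 1 in 20,000
admission_rate = admissions / infections           # roughly 1 in 10

print(f"Reaction rate:  1 in {round(1 / reaction_rate):,}")
print(f"Admission rate: 1 in {round(1 / admission_rate)}")
print(f"Hospitalisation from infection was roughly "
      f"{round(admission_rate / reaction_rate):,} times more likely "
      f"than a reported vaccine reaction.")
```

On these numbers, a case of measles in the Sydney outbreak was roughly two thousand times more likely to end in hospital than a vaccination was to produce any reported reaction at all.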

There are, of course, plenty of anecdotes to suggest horrific experiences of vaccinated infants. Seizures are certainly possible, and for me to experience such an event with my child would be unimaginable. But in a community where nobody took that risk, the dice I’d be rolling for my son would be heavily loaded. Even if each anecdote was verified, it’s hard to imagine the risks would come close to the chance of complications from contracting a disease like measles. I might not like the one in a million chance of my son having a seizure, or the slightly increased chance of death such a side effect could present, but the odds he’d face in a world with no vaccination simply wouldn’t compare.

Yet the community in which this colleague can happily raise their child is not unvaccinated. Enough people roll that dice that their children can enjoy good health in a community where pathogens have nowhere to proliferate. So long as a high enough percentage of their fellow citizens take that risk for them, they won’t have to face that tiny but real chance of vaccine side effects.
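The ‘high enough percentage’ at work here has a textbook approximation: the herd immunity threshold, commonly estimated as 1 − 1/R0, where R0 is the average number of people one infected person goes on to infect in a fully susceptible population. A minimal sketch, assuming the simplest homogeneous-mixing model and commonly cited illustrative R0 values (the figures are not from this post):

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune for an infection
    to stop spreading, under the simple 1 - 1/R0 approximation
    (assumes homogeneous mixing and fully effective immunity)."""
    return 1.0 - 1.0 / r0

# Illustrative R0 values, not figures from the post:
for disease, r0 in [("measles", 15.0), ("pertussis", 14.0), ("influenza", 2.0)]:
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} must be immune")
```

The striking feature is how unforgiving the threshold is for highly contagious diseases like measles: with an R0 in the mid-teens, well over ninety per cent of the community must be immune, which is why even a modest number of free riders can open the door to an outbreak.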

I’m happy to shoulder that burden on behalf of any individual whose constitution puts them at a significantly greater risk of illness should they be vaccinated, just as I’m happy for the taxes I pay to help benefit those who are impeded from working. Yet for those who simply don’t like the idea that the demonstrably minuscule odds are too much for their child to risk, I feel no such obligation.

When it comes to most things in the community, I’m a process-driven rationalist. I support people making their own decisions regarding their own finances, health and wellbeing, and choose to engage in outreach that assists them in making decisions with the best possible chance of matching the outcomes they hope for. I oppose regulations that take such decisions out of others’ hands, and stand against the use of authority to deprive any citizen of the right to decide a course of action for themselves. I believe in education over legislation…for most things.

Vaccination, however, is a communal decision. While education remains vital in this regard, I also feel that relying on the risks others take to secure one’s own child’s safety borders on the criminal, and at the very least can be considered highly immoral. I struggle with the idea of legally enforcing vaccination, yet find it even more difficult to welcome the choice of others to take for granted the protection my risk-taking has provided them.

Needless to say, I don’t tend to go to many dinner parties any more. I stayed quiet during that lunch. As a relatively new father, I’m not sure I could manage such silence again in the future.

Published in: on July 25, 2010 at 11:14 pm  Comments (10)  