

InPsych 2019 | Vol 41

August | Issue 4

Debunking autism myths

Challenging junk science in a misinformation age

Autism has been labelled a ‘fad magnet’, with many clients exposed to unscientific and sometimes dangerous practices, enormous resources wasted, and a real risk of harm. While autism has more than its share of fad interventions, it is not alone in attracting interventions that have no demonstrated value and the potential to cause harm (e.g., coloured filters to combat reading problems [Irlen lenses] and special diets to treat attention deficit hyperactivity disorder [ADHD]). One key factor perpetuating this issue is misinformation regarding the evidence base. Attempts to correct autism misinformation – like the myth that vaccines are linked to autism – can be ineffective or, ironically, even backfire and strengthen misconceptions. The cognitive science literature can help to explain why backfire effects occur, and evidence-based strategies can assist in busting myths about which autism interventions are based on solid research and which are not. These strategies include a flexible approach to correcting misinformation that we have found effective in the field of autism and that may be used more widely to share information with clients, other health professionals, educators and the wider community.

Autism is a lifelong neurodevelopmental disorder characterised by impairments in social communication and the presence of restricted and repetitive behaviours and interests. A variety of untested and potentially harmful practices have been developed and marketed for this condition – so much so that the field of autism interventions has been described by Alison Singer, President of the Autism Science Foundation and mother of a child with autism, as “a cottage industry of false hope”. This industry encompasses more than 1000 different practices and interventions, including a plethora of untested and even dangerous practices that proliferate in the community (see the list here).

Options available include practices (e.g., facilitated communication) and medications (e.g., secretin) shown in empirical research to be ineffective, as well as those with considerable risk of harm (e.g., chelation). In contrast, fewer than 30 practices have been documented with sufficient evidence to be classified as empirically supported (e.g., National Autism Center, 2015). Navigating the vast information available via parent groups, social media, Google and the preferable, but harder to access, empirical sources (e.g., journal articles) therefore becomes, for both us as psychologists and our clients, a metaphorical search for an evidence-based practice needle in a haystack of unsupported practices.

Why autism seems to attract such a variety of practices has been the subject of a range of research both in Australia and internationally. Key explanations include that no single intervention is effective for all individuals with autism; that parents and professionals are often keen to do anything that might help and less keen to wait for research to ‘catch up’; that marketing is often aggressive, based on anecdote and appeals to emotion rather than empirical evidence; and that misinformation regarding the effectiveness and evidence base of practices is common among both parents and health professionals themselves. For example, allied health professionals report using practices they believe are evidence-based, even when these beliefs are incorrect (Paynter, Sulek, Luskin-Saxby, Trembath, & Keen, 2018).

This can be challenging for us as psychologists, who may likewise struggle to navigate the vast information available, and who may be faced both with selecting practices to implement and with supporting our clients and their families to select the most suitable practice or intervention, whether implemented by the families themselves or by other allied health and medical professionals. This is complicated by the fact that misinformation (including misinformation regarding the evidence base of practices) is surprisingly difficult to correct.

Misinformation and backfire effects

Any information that is accepted as true despite being false can be considered ‘misinformation’. Take, for example, the erroneous belief that vaccines cause autism. Attempts to debunk this notion have been largely unsuccessful, with some attempts to correct the misinformation actually leading to greater intentions not to vaccinate (e.g., Nyhan, Reifler, Richey, & Freed, 2014). Indeed, a number of ‘backfire effects’ have been outlined in the cognitive science literature that explain why providing a correction to misinformation can ironically strengthen people’s beliefs.

These effects are observed not only in the field of autism, but also in climate science (e.g., climate scepticism), politics (in our ‘post-truth’ era), and medicine (e.g., vaccinations), to name a few examples. Some of the key backfire effects are outlined below.

Backfire effects include the continued influence, familiarity, overkill, and worldview backfire effects (for a review, see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012).

The continued influence effect refers to the fact that it is hard to unlearn information: a retracted claim leaves a gap in a person’s understanding, and the original misinformation tends to keep filling it. When we provide a correction (e.g., that vaccines don’t cause autism), we therefore need to fill the gap with a plausible alternative (e.g., that autism has a strong genetic component).

The familiarity backfire effect refers to the fact that hearing a piece of information repeatedly makes us more likely to accept it as accurate. A debunking attempt – for example, repeated advertising that it is not true that ‘secretin cures autism’ – may therefore inadvertently reinforce the myth ‘secretin cures autism’ by repeating it. This may be exacerbated by the journalistic maxim of balance, whereby equal airtime is given to controversial non-evidence-based views, such as the latest fad treatment, and to evidence-based views. This creates a false sense of balance and controversy when it is, for example, already well established that secretin does not change autism symptoms (Williams, Wray, & Wheeler, 2012).

The overkill backfire effect refers to the fact that a simple myth is easier to accept than a complicated truth. This is a challenge where the science is equivocal: how often do we respond with “it depends” or “the truth is a bit more complicated”? Consider the question, “What caused my child’s autism?” We may want to respond that a combination of biological, genetic and perinatal factors is associated with differences in neurobiology and behavioural phenotypes, but that no exact cause is known for an individual child. An anti-vaccination advocate would respond inaccurately, but simply: vaccines. Voila! A nice, simple, albeit incorrect, explanation. The solution, which is far easier said than done, is to make the facts simple and easy to understand – almost the opposite of what we are taught to do as scientist-practitioners, particularly given that we are bound by ethical mandates that include, for example, not promising that a particular intervention will work and not using anecdotes to promote our services.

The worldview backfire effect refers to the fact that if, when we provide a correction, we threaten someone’s worldview, they are likely to reject the information (e.g., threatening the belief that parents know how to feed their children when explaining that a restrictive diet is unlikely to change autism or ADHD symptoms). This is particularly the case for individuals with strong worldviews, who are the least likely to be persuaded. Corrections are more effective for those not firmly decided (e.g., a client seeking advice on a therapy they have not yet decided to use, as opposed to someone who has already invested time and money in it).

Worldview backfire effects are fuelled by the tendency in all of us to seek out information that supports our existing beliefs (i.e., confirmation bias), often exacerbated by online ‘echo chambers’ – in the field of autism, for example, social media groups supporting a range of fad and unconventional treatments. We are also inclined to spend more time and energy opposing information that does not fit our worldviews (i.e., disconfirmation bias). Thus, a number of backfire effects can, and do, occur when combating misinformation, both in the field of autism and in the world more generally.

How to debunk misinformation

So how do we overcome, or at least minimise, the risk of backfire effects when our clients, or even colleagues, approach us with misinformation regarding interventions they are pursuing? We have turned to cognitive science to understand how best to bust autism intervention myths, taking a refutational approach combined with additional evidence-based elements. Our approach involves more than simply stating that something is false (we wouldn’t just say, “Hey, you’re wrong, that’s actually not based on research” – imagine the reaction that gets). Instead, we explain why a piece of information is false and what led people to believe it in the first place, while keeping the message as simple as possible. We also include six additional elements that have each been shown to make corrections more effective (see the review by Lewandowsky et al., 2012): source credibility, self-affirmation, social norming, warnings, visuals, and salience of core messages.

  Source credibility refers to having corrections or evidence coming from a trustworthy source. For our research we included a photo of myself that had been pre-rated as being trustworthy (thankfully for me!). This could also include information coming from a trusted source, such as sharing information from the Australian Psychological Society.
  Self-affirmation means corrections should avoid attacking others and should instead affirm the personal values of the recipient in order to combat worldview backfire effects. This could include, for example, affirming that we all want to do as much as possible to help people with autism.
  Social norming refers to including descriptive norms (e.g., “most psychologists agree…”) and/or injunctive norms (e.g., “drawing from the evidence-based practice framework is the right thing to do…”).
  Warnings refers to prefacing misinformation with a warning when repeating the misinformation in order to correct it, so people can process it with caution (e.g., ‘Attention: Myth Follows! Facilitated Communication, Parent and Professional Attitudes towards Evidence-based Practice, and the Power of Misinformation’ was the title of one of our recent papers).
  Visuals refers to including graphical representations of the evidence, such as a pie chart of the number of studies supporting a particular practice. This can also make the information more salient.
  Salience refers to making the corrective messages stand out, such as by emphasising the message in the way the written text is displayed, or by simply spending more time talking about the facts (e.g., what practices are recommended for working with clients with autism) rather than the myths (e.g., only talking about what practices are not effective but forgetting to provide alternatives, such as letting you know that the National Autism Center review of interventions provides a great list for interested readers – see bit.ly/2FQ7f9u).

Our research (Paynter et al., 2019) showed that adopting the above approach to sharing information with early intervention staff – including teachers, allied health professionals, and early childhood and care professionals – was more effective than existing information-based materials that did not use this approach. It increased support for evidence-based autism practices (e.g., the Picture Exchange Communication System) and reduced support for practices unsupported by research (e.g., Facilitated Communication). The approach is flexible, however, and may be applied to developing professional development materials and to providing information to clients, as well as to other professionals, across a range of areas.

While it may not be feasible to combine all of the above elements on every occasion, many can be used across a range of situations. For example, a client may approach you for advice on whether they should use, with their child, a therapy that their friend found helpful but that is not based on scientific evidence. You could respond using the following script:

A script for myth busting

“I appreciate your concern about selecting the best practices for your child (self-affirmation), and as I work in this area (source credibility), I need to let you know that it’s actually not based on scientific research that this therapy is helpful (warning). Some people believe this therapy is helpful based on hearing stories from friends, but if you look at the scientific data, there is actually no evidence for this therapy (refutational approach). Here’s a graph showing the evidence (visual) that I collated from the APS guideline (source credibility). In fact, there is strong consensus against this therapy amongst psychologists (social norming). So, it seems this therapy is not based on evidence, but there is a great list of evidence-based practices available [provide list/information sheets on what to do instead] (salience of corrective messages).”

Promoting practices that are based on evidence, as well as reducing or eliminating the use of practices that are ineffective or even harmful for our clients, is an important responsibility for us as psychologists. This responsibility seems especially salient in the field of autism, where fad practices, including some associated with direct harm, proliferate.

Using evidence-based strategies drawn from the cognitive science literature – including source credibility, self-affirmation, social norming, warnings, visuals, and making key messages salient – can help us as clinicians to more effectively combat misinformation (and avoid backfire effects) that may otherwise interfere with psychoeducation regarding the selection and implementation of the best-available practices, for both ourselves and our clients.

The author can be contacted at [email protected]

Acknowledgement

I would like to thank Ullrich Ecker for his insights into cognitive science and for his role in developing the debunking template described in this article and overviewed in Paynter et al. (2019). I would also like to thank our colleagues in this program of research – Kathryn Fordyce, Grace Frost, Christine Imms, Deb Keen, Sarah Luskin-Saxby, Scott Miller, Rebecca Sutherland, David Trembath, and Madonna Tucker – for their valued contributions. Finally, I would like to thank all the participants in our research; without you, this work would not have been possible.

References

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

National Autism Center. (2015). Findings and conclusions: National standards project, phase 2. Randolph, MA: Author.

Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4), e835-e842.

Paynter, J., Luskin-Saxby, S., Keen, D., Fordyce, K., Frost, G., Imms, C., . . . Ecker, U. (2019). Evaluation of a template for countering misinformation—Real-world autism treatment myth debunking. PLOS ONE, 14(1), e0210746. doi:10.1371/journal.pone.0210746

Paynter, J., Sulek, R., Luskin-Saxby, S., Trembath, D., & Keen, D. (2018). Allied health professionals’ knowledge and use of ASD intervention practices. Journal of Autism and Developmental Disorders, 48(7), 2335–2349. doi:10.1007/s10803-018-3505-1

Williams, K., Wray, J. A., & Wheeler, D. M. (2012). Intravenous secretin for autism spectrum disorders (ASD). Cochrane Database of Systematic Reviews.

Disclaimer: Published in InPsych in August 2019. The APS aims to ensure that information published in InPsych is current and accurate at the time of publication. Changes after publication may affect the accuracy of this information. Readers are responsible for ascertaining the currency and completeness of information they rely on, which is particularly important for government initiatives, legislation or best-practice principles which are open to amendment. The information provided in InPsych does not replace obtaining appropriate professional and/or legal advice.