One of the major health care crises currently facing the United States is the exploding incidence of autism diagnoses. Thirty years ago it was estimated that roughly one in 2500 children had autism, while today it is estimated that approximately one in 166 children is diagnosed with the condition – more than a ten-fold increase.1 In turn, because of the high costs of treating and caring for a typical autistic individual over his or her lifetime, the annual cost of autism to society is estimated at thirty-five billion dollars (Ganz 2006). Clearly, the highest priority needs to be given to better understanding what is causing the dramatic increase in diagnoses and, if possible, to using that improved knowledge to reverse the trend.
Despite the recent rapid increase in diagnoses and the resulting attention the condition has received both in the media and in the medical community, very little is known about what causes autism. Starting with the work of Rimland (1964), it has been well understood that genetics or biology plays an important role, but many in the medical community argue that the increased incidence must be due to an environmental trigger that is becoming more common over time (a few argue instead that the cause is a widening of the diagnostic criteria, and that the increased incidence is thus illusory). However, there is little consensus and little evidence concerning what the trigger or triggers might be. In this paper we empirically investigate a possibility that has received almost no attention in the medical literature, namely, that early childhood television watching is an important trigger for the onset of autism.
Researchers might also turn new attention to the study of the Amish. Autism is rare in Amish society, and the standing assumption has been that this is because most Amish refuse to vaccinate their children. The Amish, however, also do not watch television.