Sunday, December 18, 2005

Rats and cancer, Part III

In Parts I and II we laid out some of the reasons why identifying chemicals that cause cancer is important but difficult, and why epidemiology is a poor policy option for doing so. It is true that epidemiology addresses the species of interest (humans), exposed under the conditions of interest (exposure levels commonly encountered in the workplace or the environment), but there are so many loose ends sticking out of the natural experiment that is an epidemiologic study that it has difficulty distinguishing even moderately strong signals from the noise. And of course there is the problem that cancer is a long-latency disease, so that detecting it by counting bodies in the morgue condemns us to decades of continued cancer from the same chemical even if we were to eliminate it instantly from the environment (see Part II).

Rodent studies address both of these issues. The use of experimental design allows us to isolate just the variable of interest (exposure to the chemical) by using an exposed group and a genetically identical control group. And because the latency period of a cancer is measured in "biological time," not "calendar time," i.e., in terms of a fraction of the organism's lifetime rather than in years or months, we can observe a long-latency disease in a rat or mouse in two to three years, not the twenty to forty years required for humans.
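A quick back-of-envelope illustration, using round lifespan figures purely for the arithmetic: if latency runs as a fixed fraction of lifespan, a tumor that takes decades to surface in a person surfaces within a standard two-to-three-year bioassay in a rat.

```python
# Illustrative only: round lifespan figures, and the simplifying assumption
# that latency scales as a constant fraction of lifespan.
LIFESPAN_YEARS = {"rat": 2.5, "human": 75.0}
LATENCY_FRACTION = 0.25  # a tumor surfacing a quarter of the way through life

for species, lifespan in LIFESPAN_YEARS.items():
    print(f"{species}: latency ~ {LATENCY_FRACTION * lifespan:.1f} years")
# rat: ~ 0.6 years; human: ~ 18.8 years
```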

But there is still one major problem, the problem of rarity. Cancer risks of public health significance are still small in absolute terms. A chemical that causes one cancer per 10,000 exposed people per year, say through a water supply, would still produce over two hundred cancers a year in a city like Boston, whose water supply serves 2.1 million people. But trying to test this with rats would require an infeasibly large number of animals. Just for one dose and one route of administration, intuition suggests we would need 10,000 exposed rats and 10,000 control rats, but for statistical reasons we would actually need something like 23,000 rats (at the 95% confidence level). If you have ever worked with rats in the laboratory, the thought of such an experiment is mind-boggling, even though a risk of one in 10,000 is still quite large in public health terms. A risk of one in a million for something to which virtually the entire population (plus fetuses) is exposed, like saccharin, shows how impossible such an animal test would be if it had to be carried out at the doses to which humans are usually exposed. A large-scale bioassay uses only about 600 animals per group, and even that is a formidable logistical undertaking.
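To see where numbers of this order come from, here is one back-of-envelope version of the calculation. It is a sketch, not the derivation behind the 23,000 figure above (which rests on further assumptions about background tumor rates and comparison with controls): it simply asks how many exposed animals are needed to have a 95% chance of seeing at least one excess tumor, assuming no background incidence at all.

```python
import math

def animals_for_one_case(excess_risk, detect_prob=0.95):
    """Animals needed so that, with probability `detect_prob`, at least one
    excess tumor appears. Assumes zero background incidence and independent
    animals -- a deliberate oversimplification for illustration."""
    # P(no tumors among n animals) = (1 - excess_risk)**n; solve for n.
    return math.ceil(math.log(1 - detect_prob) / math.log(1 - excess_risk))

for risk in (1e-2, 1e-4, 1e-6):
    print(f"risk of 1 in {int(1 / risk):,}: ~{animals_for_one_case(risk):,} animals")
# 1 in 100:       ~299 animals -- within reach of a 600-per-group bioassay
# 1 in 10,000:    ~29,956 animals
# 1 in 1,000,000: ~2,995,731 animals
```

With roughly 600 animals per group, a bioassay run at human-scale doses could hope to detect risks on the order of one in a hundred, nowhere near one in ten thousand. That is what drives the solution described next.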

The solution, as is well known, is to increase the doses given to the animals to the point where the probability of tumor occurrence is high enough that an experiment of feasible size can detect it. This raises two questions. The first is whether something that causes cancer at very high doses is also plausibly able to do so at much lower doses. Might there be, for example, some carcinogenic mechanism that comes into play at high doses but not at low doses? The second question is whether things that cause cancer in rodents are also plausibly likely to do so in humans. Both of these questions go to Professor Rappaport's query, although there is more to the problem.
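To make the first question concrete, here is a caricature of the extrapolation at stake. All of the numbers are hypothetical, and the one-hit model used is just one simple choice among several; whether any such extrapolation from high to low doses is valid is precisely the point in dispute.

```python
import math

def one_hit_slope(high_dose, incidence):
    """Fit the one-hit model P(d) = 1 - exp(-q * d) to a single
    high-dose bioassay result and return the slope q."""
    return -math.log(1.0 - incidence) / high_dose

# Hypothetical bioassay result: 30% tumor incidence at 100 mg/kg/day.
q = one_hit_slope(high_dose=100.0, incidence=0.30)

# Extrapolate to a hypothetical environmental dose of 0.001 mg/kg/day;
# this far below the data the model is effectively linear: P(d) ~= q * d.
low_dose = 0.001
print(f"extrapolated lifetime risk: {q * low_dose:.1e}")  # ~3.6e-06
```

The design assumes that a slope estimated at 100 mg/kg/day says something meaningful about risk at 0.001 mg/kg/day; a carcinogenic mechanism that operated only at high doses would break exactly that assumption.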

Whether high doses create an artifactual situation has been a matter of debate for years. In the view of most scientists who work in this area, exposures at high doses are relevant for identifying carcinogenic potential at low doses, but this has been difficult to prove conclusively and has long been the favored argument of those whose chemical has been put under suspicion by bioassays. There are good biological arguments for the relevance of high doses, but defensible biological arguments also exist for the opposite. In the last analysis it comes down to a question of public health prudence, i.e., do we wish to ignore the high-dose evidence, and if so, what are the possible costs and to whom? Even if we monetize the suffering of cancer patients, the costs and benefits accrue to different people (consumers versus corporations).

The animal-to-human extrapolation is a similar question. As we noted in our earlier post, rodents are extremely well understood biologically, have biological systems similar to ours (lungs, digestive systems, nervous systems, etc.), and have been shown to be good predictors of human biology and physiology. This is why NIH and researchers the world over spend so much time and money supporting work using rodent models, on topics ranging from Alzheimer's disease to diabetes to neurophysiology. Rodent models are the well-validated workhorse (pardon the mixed metaphor) of biology.

But we return to the policy argument. Because there are good scientific and biological reasons for using rodent experiments to inform us about important questions of human biology (of which carcinogen bioassays are a tiny fraction), we use them to accomplish an important task: identifying the needle in the chemical haystack that can cause cancer in humans. We have few other options, much less options with as much scientific support and biological plausibility as this one. Animal bioassays are the "gold standard" for detecting the ability of a chemical to cause cancer in a whole animal.

Professor Rappaport says we should not ignore them, but then goes on to ask if they tell us anything:
It is not enough to plead ignorance and move on. I am not saying that absent good information, we should necessarily ignore rodent tests. But we should not just assume that they are informative about humans, either.
If rodent experiments aren't informative, why in the world should we not ignore them? We don't ignore them because, I think it is clear, we are not pleading ignorance. We know a great deal, just not everything we wish to know. And we still have to make a decision. Yet Professor Rappaport asks:
I would be interested in knowing whether animal carcinogens that are widely used by humans seem to be human carcinogens. It is sometimes said there are 24 carcinogens in coffee. Is there any evidence it is a human carcinogen?
I, too, would be interested to know whether those animal carcinogens for which we do not yet have epidemiological (or clinical) evidence are human carcinogens. Does he have a suggestion as to how we might find out? And in the meantime, what does he suggest we do (i.e., what is his policy recommendation for a chemical definitively known to cause cancer in animals at high doses but with suggestive, very limited, or no information about humans)? At the moment most public health scientists lean toward the conservative position that there is a rebuttable presumption that the minority of chemicals so identified are human carcinogens.

As to his coffee question, I'd like to know exactly which of the chemicals he references are confirmed animal carcinogens. If there are some, we need to think of the policy options: status quo, labeling, trying to remove or neutralize those chemicals in coffee production, banning coffee, etc. I am a confirmed coffee drinker, and if I knew it contained some animal carcinogens I would likely continue to drink it. But I think I'd like to know. For other carcinogens in the environment we may wish to have a different set of options or make a different choice, especially when the exposure is involuntary or unknowing.

In any event, denying the relevance of the animal data to humans doesn't seem like a reasonable resolution to a difficult problem. As we learn more about the mechanisms of cancer production we may be able to improve our detection methods, or at least make them cheaper and quicker. In the meantime, we have good reasons to believe the methods we have are highly accurate.