The dawning of a new age of toxicology

Scientists are amassing a growing body of evidence that pollutants and chemicals might be altering genes—not by mutating them, but by silencing them or switching them on at the wrong times. This is leading to the dawn of a new age of toxicology.

The need for change
In the last week of July 2009, several dozen researchers and experts tackled this complicated topic, called epigenetics, at a two-day workshop in Washington, D.C. They discussed new findings that suggest chemicals in our environment and in our food can alter genes, leaving people vulnerable to a variety of diseases and disorders, including diabetes, asthma, cancer and obesity. They also considered whether regulatory agencies and industry should start testing the thousands of chemicals in use today for these effects.

Earlier that month, the prestigious science journal Nature published an article by Thomas Hartung of the Department of Environmental Health Sciences at Johns Hopkins University entitled "Toxicology for the twenty-first century". The paper identifies critical weaknesses in the system of regulation that has underpinned international toxicology testing of man-made chemicals produced in industrial quantities and used globally.

"Synthetic chemicals have been components of consumer products for just over a century. A system identifying which chemicals pose a danger to individuals and the environment was first put in place about 80 years ago. But after several productive decades, in which a patchwork of testing approaches was formed, fewer and fewer of the latest scientific developments were incorporated. The system of regulatory toxicology fell asleep - much like the fairy-tale character Snow White when she bit into the poisonous apple. In the case of toxicology, the poison was international guidelines. This international harmonization was tempting because it allowed chemical manufacturers & suppliers to use fewer resources, and it overcame barriers to trade in global markets. But implementing these guidelines came at a price: the slow and complicated international consensus process hindered self-criticism and modernization of the field of toxicology," wrote Hartung.

"There is almost no other scientific field in which the core experimental protocols have remained unchanged for more than 40 years. Yet consumers continually increase in their expectations about the safety of products," he wrote.

In 2007 the winds of change blew through the international system of regulatory toxicology. The European Union enacted a new legislative regulation: Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH). The new legislation will address the lack of up-to-date toxicological and safety assessment data on chemicals developed before 1981. According to Hartung, these pre-1981 chemicals represent 97% of the major chemicals in use globally and a staggering 99% of chemical manufacturing output by volume. He noted that 86% of these chemicals lack up-to-date data; the REACH process seeks to redress this.

The new regulation affects an estimated 27,000 chemical companies, which are now required to provide information on the toxic properties and uses of some 30,000 chemicals, after a pre-registration phase in 2008.

"REACH might turn out to be the Prince whose kiss awoke Snow White after a long sleep, rousing toxicology at last," he wrote.

Future directions for Toxicology - Global perspective
So the current system of testing chemicals needed to change, and change is coming. There is now acknowledgement that the individual testing tools have limitations and are inadequate for toxicology in the twenty-first century. The groundwork for change was laid by increasing awareness of the shortcomings of current methods, as well as by emerging technologies and political pressures.

In Europe, the prince who awakened toxicology was politics: the REACH legislation calls for a new safety testing approach on a large scale, and was assisted by the imposition of a total ban on animal testing under the seventh amendment to the Cosmetics Directive.

In the United States, the catalyst for change was the scientific community responding to the U.S. Environmental Protection Agency’s (EPA) call for a new vision.

The dimensions of the proposed overhaul are global. What lies ahead, however, must be an entirely independent scientific process. The opportunity to create a new regulatory toxicology lies in the science that analyses the interactions of small molecules with cells.

The main challenge is to design a new global system of regulatory toxicology based on new testing technologies. Hartung is direct in proposing a three-step solution. First, the limitations of the current tools need to be objectively assessed, and a better understanding of their uses is needed. We need to analyse the prevalence of particular hazards because appropriate test strategies depend strongly on whether the hazard is rare or frequent. Second, in the mid-term, the various approaches need to be integrated into testing strategies, making the best use of the existing methods by combining them strategically. And third, an entirely new system of toxicological testing is urgently needed and should be built from scratch, using modern methods.

The basis of the new system has emerged over the past two decades: advances in cell-culture techniques enabling a range of biological and cytological phenomena to be studied in vitro. Another avenue now opening is designing a new regulatory toxicology through the combination of bioinformatics and biotechnological approaches. Three important technologies developed during the past decade have now entered the field of toxicology: the ‘omics’ technologies, such as genomics and proteomics; imaging techniques; and robotized testing platforms.

The automated testing platforms allow high throughput of samples, enabling large numbers of substances to be tested under standardized conditions. Omics technologies and imaging methods compile enormous sets of information about a single compound. Used together, the three technologies not only allow researchers to ‘fish’ for new biological markers of specific toxic effects but allow the deduction of patterns - or signatures - that are characteristic of certain toxic effects.

By harnessing advances in bioinformatics and in silico modelling, this toxicity data can be mined and then integrated with knowledge from other areas of life sciences. The combination of biochemical knowledge of cellular pathways with genomics, proteomics and metabonomics (the study of metabolic responses to environmental factors, drugs and diseases) is already advancing as systems biology, and systems toxicology is a new sub-branch of this field.

This systems approach to a new regime of toxicological testing in the 21st century was put forward in a 2007 report by the US National Academy of Sciences on behalf of the EPA. It has already led to a revised toxicity testing strategy at the EPA and to the EPA ToxCast program.

Biomonitoring
Scientific advances that make it possible to detect the smallest traces of chemicals in the human body and the environment are shaping efforts to modernize U.S. chemical policies and regulations.

Biomonitoring is the modern tool that is destined to overhaul a 1976 United States statute, the Toxic Substances Control Act. The new technology measures chemicals at minute levels in human tissue and bodily fluids, providing evidence of direct exposure.

'Understanding the connection between our health and our environment, with its mixture of chemicals, diet and lifestyle stressors, is no less complex than understanding the intricacies of the human genome,' said Linda Birnbaum, director of the National Institute of Environmental Health Sciences.

Thomas Zoeller, a University of Massachusetts endocrinologist, said animal studies show the effects of chemicals that interfere with hormones differ depending on when that exposure occurs. Exposure matters most when an animal's brain is young and developing. But biomonitoring, he said, usually measures chemicals in adults without any idea of what kinds of exposure occurred during development.

So to prove a direct causation, scientists must link the timing of the chemical measurement to some damage believed to be connected to that specific time, Zoeller said. But that kind of information is hard to get. And teasing out problems caused by chemical exposures remains a challenge, he said.

The latest Centers for Disease Control and Prevention biomonitoring study of 2,500 people detected 212 chemicals in their bodies. The Learning and Developmental Disabilities Initiative last week released a biomonitoring study with the purpose of exploring the relationship between toxic chemicals and increasing rates of learning disabilities, such as autism and attention-deficit hyperactivity disorder. It found what the researchers said were 61 neurotoxic and endocrine-disrupting chemicals in the 12 people tested. The study cannot establish that the presence of any chemical causes developmental or other health disorders, but the report's authors say the detection of so many chemicals of concern should be enough to warrant strong regulatory action. 'I realize now more than ever why reforming our federal toxics law is absolutely essential to protecting our health and our children's health,' said Maureen Swanson with the Learning Disabilities Association of America.

Others argue that biomonitoring data must be interpreted in context and that regulators should not bypass traditional risk-assessment processes that consider both exposure and hazard data. The National Institute of Environmental Health Sciences is working to develop methods that allow for improved risk assessments. The institute is funding 32 projects focused on testing technologies that can measure environmental exposures, diet, physical activities, psychosocial stress and other factors in disease development, said Birnbaum, the agency's director.

Surveys of the Human Toxome
In 2001 the concept of human 'body burden' testing was initiated: European environmentalists started testing for residues of chemicals in people's bodies through the collection of blood and urine samples. The initiative came from the World Wildlife Fund in the United Kingdom in 2003. WWF tested 155 individuals across Britain, and then WWF and Greenpeace Netherlands asked celebrities and members of the European Parliament (amongst others) to take part in a large EU survey. By 2006 the concept of measuring body burden had spread to the United States and Canada. All these surveys on either side of the Atlantic convincingly demonstrated that regardless of age, ethnicity, place of work or residence, everyone is contaminated.

In Europe, the body burden testing by environmental organisations was the catalyst in the campaign to implement the Registration, Evaluation, Authorisation and Restriction of Chemicals program (REACH) - a European Union initiative to regulate chemicals. In Canada, the Chemicals Management Plan (CMP) was created as a consequence of the Canadian Toxic Nation testing. And in the United States, the collective efforts of various organisations using body burden tests, together with leadership from the Environmental Working Group, resulted in the introduction of the Kid-Safe Chemicals Act into the US Congress in late 2007 and in further policy and regulatory changes at the US EPA announced in December 2009 (see next section).

December 2009 Statement from the Administrator of the US EPA
On Tuesday 2 December, Lisa Jackson, Administrator of the US EPA, appeared before a legislative hearing of the Senate Committee on Environment and Public Works. Her statement highlights the changes being implemented to the regulation of toxic substances in the USA and the need for further reform of the United States Toxic Substances Control Act 1976.

Australasian Society for Ecotoxicology, September 2009 - calls for change in Australia
Calls for a different approach to the way toxicology is performed in Australia in the 21st century were reiterated at the 13th Australasian Society for Ecotoxicology conference held in Adelaide (20-23 September 2009).

Several speakers highlighted the deficiencies in the current toxicity assessment methodologies. Michael Warne of the Centre for Environmental Contaminants Research, CSIRO Land & Water, and Rick Van Dam of the Environmental Research Institute in Darwin presented a paper entitled: 101 reasons to stop generating and using hypothesis-based toxicity estimates – NOECs and LOECs should be banned.

Two main measures of toxicity are generated in ecotoxicology – (1) hypothesis based data (e.g. no/lowest observed effect concentrations – NOEC/LOEC data) and (2) point estimate data (e.g. effect/lethal concentrations – EC/LC data).

The use of NOEC and LOEC data has been severely criticised since the 1990s, yet these data continue to be generated and deemed acceptable by regulatory authorities in Australia and internationally. The question must be asked: why is this the case? The problems associated with NOEC and LOEC data fall into three main groups:
 * the misleading nature of their names;
 * the inappropriateness of the method by which they are calculated; and
 * the validity of the statistical methods used.

Warne and Van Dam argue that one of the key reasons for the continued generation and use of NOEC and LOEC data is that no regulatory authority (e.g. USEPA, Environment Canada), international organisation (e.g. OECD, ASTM) or professional society (e.g. SETAC or the Australasian Society for Ecotoxicology) has banned their use. In their view, the generation and use of NOEC and LOEC data should be banned.
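The difference between the two kinds of toxicity measure can be sketched with synthetic data: a point estimate such as an EC50 is read directly off a fitted concentration–response curve, rather than inferred from pairwise hypothesis tests against a control. The concentrations, effect fractions and two-parameter log-logistic model below are illustrative assumptions, not data from the conference paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """Two-parameter log-logistic model: fraction of test organisms affected."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

# Hypothetical concentration-response data (mg/L, fraction affected)
conc = np.array([1.0, 3.2, 10.0, 32.0, 100.0])
effect = np.array([0.02, 0.12, 0.48, 0.85, 0.98])

# Point estimate: fit the curve and read the EC50 off it directly,
# with no dependence on which treatment concentrations were tested
(ec50, slope), _ = curve_fit(log_logistic, conc, effect, p0=[10.0, 1.0])
print(f"EC50 ~ {ec50:.1f} mg/L (slope {slope:.2f})")
```

By contrast, a NOEC can only ever equal one of the tested concentrations, which is one of the methodological objections raised against it.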

The revision of the Australian and New Zealand water quality guidelines over the next three years presents the perfect opportunity for Australia and the Australasian Society for Ecotoxicology to take the lead on this statistical issue and progress it to the point where unambiguous and sound guidance can be provided on the most appropriate statistical methods for typical ecotoxicological studies.

Toxicity testing remains largely in the same position as it was 50 years ago, i.e. measuring the effect of increasing doses of toxicants on populations at fixed times of exposure. That is the opinion of Francisco Sánchez-Bayo of the Centre for Ecotoxicology, Department of Environment and Climate Change. Certainly, endpoints other than lethality are tried in ecotoxicology, but the underlying approach is always the same – to estimate the median (or other percentile) effect dose or concentration at a given time. However useful this information may be, it ignores the fact that internal doses are time-dependent and therefore the values of the estimated endpoints vary with the exposure time. Besides, what really matters is the survival of populations in time after repeated exposure of several generations to a toxicant. Two recent developments that tackle these issues are presented here: time-to-effect (TTE) bioassays and population growth rates. TTEs address the first deficiency, allowing the estimation of percentile effects for doses at variable exposure times by effectively integrating these two essential parameters in a single test. Data from a TTE take the form of a matrix of measured endpoints vs time, which can be analysed using simple models. Moreover, as the estimated median effective times (ET50) are related to the doses, standard L(D)C50s can be obtained directly from that relationship, and such median values can be used to predict any level of effect for any dose at any time using appropriate modelling.
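A TTE dataset and the two readouts described above can be sketched as follows. The dose/time grids and the mortality matrix are hypothetical numbers chosen only to show how an ET50 per dose and a time-specific LC50 both fall out of the same matrix; linear interpolation stands in for the "simple models" mentioned, which in practice would be probit or survival models.

```python
import numpy as np

# Hypothetical TTE matrix: % mortality at each observation time (one row per dose)
times = np.array([24.0, 48.0, 72.0, 96.0])   # hours
doses = np.array([5.0, 10.0, 20.0, 40.0])    # mg/L
mortality = np.array([
    [ 5.0, 15.0, 30.0, 45.0],   #  5 mg/L (never reaches 50% in the test window)
    [10.0, 30.0, 50.0, 65.0],   # 10 mg/L
    [25.0, 50.0, 75.0, 90.0],   # 20 mg/L
    [45.0, 70.0, 90.0, 98.0],   # 40 mg/L
])

# ET50 per dose: time at which mortality crosses 50% (np.interp clamps to the
# last observation time when 50% is never reached, as for the lowest dose)
et50 = np.array([np.interp(50.0, row, times) for row in mortality])

# A standard LC50 at a fixed time drops out of the same matrix:
# the dose at which mortality crosses 50% in the 48-hour column
lc50_48h = np.interp(50.0, mortality[:, 1], doses)

print("ET50 (h) per dose:", et50)       # decreases as dose increases
print("48-h LC50 (mg/L):", lc50_48h)
```

The ET50 falling as the dose rises is exactly the dose–time relationship the TTE approach exploits to predict any level of effect for any dose at any time.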

The second issue is crucial for the protection of species and the management of natural resources. Indeed, species reproduce while being unaware of their exposure to toxic pollutants, and the rate of growth of their populations will determine whether they survive or eventually go extinct. Procedures for measuring this rate in laboratory cultures were determined a long time ago, but so far have not attracted sufficient attention in ecotoxicology.

Chemicals can turn genes on and off; New tests needed, scientists say
Exposure to gene-altering substances, particularly in the womb and shortly after birth, “can lead to increased susceptibility to disease,” said Linda S. Birnbaum, who was named director of the National Institute of Environmental Health Sciences and of the US National Toxicology Program in December 2008. "The susceptibility persists long after the exposure is gone, even decades later. Glands, organs, and systems can be permanently altered," she said.

Animal studies indicate that some environmental chemicals cause epigenetic changes that trigger breast and prostate cancer, obesity, diabetes, heart disease, asthma, Alzheimer’s, Parkinson’s disease and learning disabilities, she said. And some new human studies are now adding to the evidence.

"There is a huge potential impact from these exposures, partly because the changes may be inherited across generations. You may be affected by what your mother and grandmother were exposed to during pregnancy.” Linda Birnbaum, Director, National Institute of Environmental Health Sciences said.

Some environmental chemicals enable methyl groups (a carbon atom with three hydrogen atoms attached) to attach to genes, turning them off, or silencing them, at a time when they should be turned on. When genes are turned off, they can’t direct the manufacture of proteins that are essential for proper cell function. Chemicals can also uncoil parts of the chromosome, causing genes to be expressed, or turned on, at inappropriate times.

An example is asthmatic children. Wan-Yee Tang, a researcher at the University of Cincinnati, found that children in New York City exposed in the womb to high levels of polycyclic aromatic hydrocarbons (PAHs), common air pollutants from traffic, were much more likely to have asthma than those who were not exposed. By studying cord blood, she found that a particular gene (ACSL3) was methylated in the asthmatic children and unmethylated in the unexposed children, and concluded that the abnormal methylation patterns probably caused the asthma.

The finding could in part explain why asthma rates have skyrocketed in much of the world, reaching epidemic proportions among children. In the boroughs of New York City with the worst air pollution, about 25 percent of children are asthmatic.

Epigenetic changes also have been observed in children conceived with assisted reproductive technologies, said Richard Meehan of the Medical Research Council in Scotland.

Some metals, such as nickel, chromium and arsenic, are well-known carcinogens - not because they are directly toxic to cells but because of their epigenetic effect, said Max Costa, a New York University professor of environmental medicine and pharmacology. They increase methylation of DNA, which results in gene silencing and cell transformation and leads to cancer, he explained.

Most researchers there agreed that compounds need to be tested for epigenetic effects. But practical testing of the 80,000 or so chemicals in commerce would require rapid screens that would prioritize the compounds into high, medium, and low-risk groups. Those at high risk for epigenetic effects could then be subjected to more definitive and expensive tests.

John M. Greally, associate professor at the Albert Einstein College of Medicine in New York City, pointed out that no single test is ideal for detecting epigenetic effects. "All of the assays have drawbacks," he said. For example, one assay requires immediate sample processing so it cannot be used on stored samples.

Dr Linda Birnbaum, who formerly was head of experimental toxicology at the U.S. Environmental Protection Agency, said regulators and industry don’t have to start from square one. “We’re already marching down this road,” said Birnbaum. “The National Toxicology Program is already talking about including some epigenetic studies in the program.”

The most important public health issue that arises from epigenetics, Birnbaum said, is that the current environment may not be the crucial factor to consider when examining what causes diseases.

“Asking heart attack victims what they ate this year or last may be far less important than what they were exposed to in the womb and shortly after birth,” she said.

United States EPA unveils plan to review 6 controversial chemicals and announces a radical reform agenda for US Toxics Policy
President Barack Obama’s top environmental official announced a new push to transform the way the nation regulates toxic chemicals that may endanger people and the environment. U.S. Environmental Protection Agency Administrator Lisa Jackson called the workings of a 1976 law 'inordinately cumbersome and time-consuming' and said the administration will promote a new chemical law in Congress. In the meantime, the EPA will analyze and regulate six high-profile, widely used chemicals that have raised health concerns, including BPA [bisphenol A] and phthalates.

Saying that the public is "understandably anxious and confused" about chemicals in their bodies and in their environment, President Obama’s top environmental official announced on Tuesday [29 September 2009] a new push to transform the way the nation regulates industrial compounds. U.S. Environmental Protection Agency Administrator Lisa Jackson called the nation's 1976 toxics law “inordinately cumbersome and time-consuming." As a result, she said the Obama Administration will promote a new chemical law in Congress in the coming months that puts the responsibility on industry to prove that its compounds are safe.

In the meantime, Jackson said, the EPA will begin to analyze and regulate six high-profile chemicals that have raised health concerns. Included are bisphenol A, or BPA, found in hard, clear polycarbonate bottles, and phthalates, which are used in vinyl and cosmetics. Also targeted are brominated flame retardants [PBDEs] added to electronics and other goods; perfluorinated compounds used in manufacturing non-stick coatings and food packaging; some paraffins, used in lubricants, and benzidine dyes and pigments. Many scientists say these chemicals can mimic hormones and obstruct development of fetuses and children, as well as possibly cause reproductive problems, cancer or other health effects.

Jackson’s announcement signals a dramatic shift away from the policy of the Bush administration. Top EPA officials who testified before Congress three years ago defended the Toxic Substances Control Act as effective in safeguarding public health from industrial chemicals.

Jackson said the EPA is gathering data from industry on the six chemicals so the agency can assess their safety and develop action plans with firm deadlines to limit exposure. The EPA may restrict their use or require labels on consumer products to warn of risks. The agency already has such authority under the existing law, she said.

The EPA will start with the six high-profile chemicals, then add more. EPA officials said they will post four "chemical action plans" in December describing how they will handle the initial compounds, and then post plans for more chemicals at four-month intervals. Some 80,000 chemicals, some of them widely used in consumer products, are in commerce today, and some lack detailed health and safety data. Jackson said the agency and the manufacturers will review and act on the highest-priority chemicals in a timely manner.

“As more and more chemicals are found in our bodies and the environment, the public is understandably anxious and confused. Many are turning to government for assurance that chemicals have been assessed using the best available science, and that unacceptable risks haven’t been ignored,” Jackson told an audience of several hundred people during a speech at the Commonwealth Club in San Francisco on Tuesday night.

The American Chemistry Council, which represents chemical manufacturers, says it has agreed in principle to “supply some level of support” to pay for increased study and other efforts to assess the safety of compounds.

“We understand that industry has to provide more data and a greater transparency to that data,” said Cal Dooley, president of the American Chemistry Council. One of the driving forces of the industry’s participation is the desire to win consumer confidence in products and to regain world leadership in chemical safety, Dooley said. Under the current law, some 7,000 chemicals are produced or imported annually in amounts above 25,000 pounds, according to industry figures.

Only five have been banned or restricted since the law was enacted 33 years ago. The law requires the EPA to prove a toxic substance "presents an unreasonable risk of injury to health or the environment," consider the costs of restricting its use and choose "the least burdensome" approach to regulate industry.

Legislation to reform the law is expected to be introduced this autumn. Jackson released a “set of principles” that she hoped would guide Congress. The EPA isn’t focused as much on rewriting the existing law as it is on coming up with a new one that would strengthen the agency’s ability to protect the public, she said.

Under the set of principles the EPA would require manufacturers to supply enough information to conclude that new and existing chemicals are safe and don’t endanger public health or the environment. The EPA also wants clear authority to ban or restrict chemicals, although it would retain flexibility to consider social benefits and costs.

Many experts say the United States has fallen far behind in regulating toxic substances. In 2007, the European Union began implementing the world's most restrictive chemicals law. It requires manufacturers to provide basic data on the properties of thousands of chemical substances. The European Chemicals Agency then will review the chemicals, and require substitution of the most dangerous ones.

Lawsuit initiated to protect endangered species in USA from pesticide impacts
January 28, 2010, San Francisco - The Center for Biological Diversity today filed notice of intent to sue the U.S. Environmental Protection Agency for failing to adequately evaluate and regulate nearly 400 pesticides that are harmful to hundreds of endangered species throughout the nation and also threaten human health. The EPA has violated the US Endangered Species Act by failing to consult with wildlife regulatory agencies about the impacts of pesticides on hundreds of protected species that are threatened by pesticide use. The agency has also violated the US Migratory Bird Treaty Act by registering pesticides that are known to kill and harm migratory birds.

Childhood acute lymphoblastic leukemia and pesticide use
In a 2009 California case–control study of childhood leukemia, children’s residential histories were compared with available agricultural pesticide-use reporting data to assess exposures to specific pesticides and groupings of pesticides during specific time periods of interest. The researchers examined whether residential proximity to applications of these agents was associated with acute lymphoblastic leukemia (ALL), the most common type of childhood leukemia. Leukemia is the most common form of childhood cancer; approximately 80% of childhood leukemia is acute lymphoblastic leukemia, which peaks at 4 to 12 years of age. The researchers found that ‘elevated ALL risk was associated with lifetime moderate exposure, but not high exposure, to certain categories of pesticides, including organophosphates, chlorinated phenols, and triazines, and with pesticides classified as insecticides or fumigants’.

An increased risk of childhood acute lymphoblastic leukemia was correlated with a moderate lifetime exposure to several categories of agricultural pesticides, including insecticides or fumigants and other chemicals identified as probable or possible carcinogens, developmental or reproductive toxins, genotoxins, suspected endocrine disruptors, and anti-cholinesterases. Increased risks were not observed in the highest categories of exposure. A similar exposure–response pattern was observed in single-class models of chlorinated phenols (MCPA, 2,4-D and diclofop-methyl), organophosphates (including chlorpyrifos, mevinphos and phorate), and triazines (atrazine, simazine and cyanazine).

The toxic effects of certain pesticides include oxidative stress, genotoxicity, endocrine disruption, and cholinesterase inhibition, but little is known about what role these effects may play in inducing ALL. There is limited toxicological evidence of a leukemogenic effect from exposure to specific types of agricultural pesticides such as organophosphates. Previous toxicological studies observed a leukemogenic effect from exposure to isofenphos, an organophosphate insecticide. By design, organophosphates and other anti-cholinesterase compounds inhibit the ability of the enzymes acetylcholinesterase (AChE) and butyrylcholinesterase (BuChE) to regulate acetylcholine, leading to an over-accumulation of this neurotransmitter. Emerging evidence suggests that change in AChE and BuChE activity is associated with tumor development and may play a role in cell proliferation and differentiation, although it is not clear whether this is a cause or consequence of neoplastic processes.

In summary, this study detected a modest increase in acute lymphoblastic leukaemia risk with residential proximity to moderate levels of agricultural use of several types of pesticides, but not at higher levels of use. The observed consistency of this association across toxicological and physicochemical classes of chemicals warrants further exploration in future studies. These studies should have a larger pool of cases and controls to allow for the evaluation of the effects of specific pesticides on acute lymphoblastic leukaemia, acute myeloid leukaemia, and other leukemias.

Pesticide exposure assessment should account for crop and/or plantation locations and be further refined by including factors that influence the spray drift potential of agricultural pesticides in the environment, and integrating pesticide exposure from other sources such as diet and home use. In addition, prenatal residential histories should be collected and geocoded in order to characterize exposure during the critical gestational period.

Pesticides, Gene Translocation and Non-Hodgkin Lymphoma
Childhood leukaemia and follicular non-Hodgkin lymphoma (NHL) are characterised by specific chromosomal translocations that bring into close physical proximity certain genes normally found on separate chromosomes. In follicular NHL, the anti-apoptotic B-cell leukaemia/lymphoma (bcl-2) gene, normally found on chromosome 18, translocates to the immunoglobulin heavy chain locus of chromosome 14.

An increased frequency of the t(14;18) translocation is recognized as a marker of increased lymphoma risk. Epidemiological findings indicate a positive association between t(14;18)-NHL and exposure to a variety of pesticides, including dieldrin, toxaphene, lindane, atrazine and certain fungicides.

Occupational exposure to pesticides can increase the frequency of t(14;18) translocations, both in terms of the number of people affected and the number of affected lymphocytes within those individuals. In addition, exposure to the most potent dioxin, 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), is also correlated with increased numbers of circulating lymphocytes carrying t(14;18) translocations. The frequency of pesticide exposure thus appears to correlate with an increased frequency of t(14;18) translocations in lymphocytes, and the risk of NHL rises with the ongoing accumulation of genetic instability. Hence genetic translocations induced by repeated pesticide exposure alter disease risk, in this case the risk of NHL.

Challenges to old scientific concepts of toxicity with important implications for human health
A core assumption of the current toxicological procedures used to establish health standards for chemical exposures is that testing the safety of chemicals at high doses can be used to predict the effects of low-dose exposures, such as the residue levels commonly detected in the general population. This assumption rests on the 16th-century precept that “the dose makes the poison”: higher doses cause greater effects. If a chemical’s effects follow a monotonic dose–response curve, more of the chemical produces a greater measurable response. However, when toxicologists began to focus on the potential health effects of endocrine disrupting chemicals (EDCs), endocrinologists questioned the appropriateness of assuming a monotonic linear dose–response as the basis for chemical risk assessments. A general characteristic of endogenous hormones, hormonally active drugs, and environmental chemicals with hormonal activity is that, over a logarithmic range of concentrations, the dose–response curve may be non-monotonic, with the greatest responses occurring at concentrations far below the designated ‘no observed adverse effect level’ (NOAEL).

Non-monotonic dose–response curves have been reported for adverse effects of a number of EDCs, including the polycarbonate plastic monomer bisphenol A (BPA), used in some baby bottles, water bottles, and food can linings; di(2-ethylhexyl) phthalate (DEHP), used in medical devices and other products made with polyvinyl chloride plastic; and the pesticides dieldrin, endosulfan, and hexachlorobenzene. Exposure to the phthalate DEHP at a concentration 1,000 times lower than the current safety standard, which is based on DEHP’s liver toxicity at high doses, exacerbated allergic reactions. Similarly, exposure to extremely low levels (picomolar, i.e. 10⁻¹² molar, or parts-per-trillion concentrations) of several persistent organic pollutants increased allergic responses.
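The contrast between monotonic and non-monotonic dose–response curves described above can be illustrated with a toy model (the constants below are illustrative assumptions, not values fitted to any real EDC data): a receptor-mediated stimulation that rises with dose is combined with a second process that suppresses the response at high doses, producing an inverted-U curve whose peak sits well below the top of the tested range.

```python
def monotonic_response(dose_nm, k=10.0):
    """Classic saturating (Hill-type) dose-response: always rises with dose."""
    return dose_nm / (dose_nm + k)

def nonmonotonic_response(dose_nm, k_stim=0.3, k_inhib=20.0):
    """Toy inverted-U curve: low-dose stimulation multiplied by a
    high-dose inhibitory process (constants chosen for illustration only)."""
    stimulation = dose_nm / (dose_nm + k_stim)
    inhibition = k_inhib / (dose_nm + k_inhib)
    return stimulation * inhibition

# A high-dose-only protocol that starts testing at 100 nM sees almost no
# response, while doses one to two orders of magnitude lower are near the peak.
for dose in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"{dose:7.2f} nM -> response {nonmonotonic_response(dose):.3f}")
```

In this sketch the response at 1 nM is several times larger than at 100 nM, mirroring the argument that a study beginning at the highest concentration would miss the low-dose effect entirely.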
None of these effects was predicted by studies that examined only the high-dose effects of these chemicals. In the case of bisphenol A (BPA), Wetherill and colleagues examined cell proliferation in prostate cancer cells (the androgen-independent LNCaP cell line). The prostate cells were propagated for 72 hr in 5% charcoal-/dextran-treated fetal bovine serum supplemented with 0.1% ethanol vehicle and increasing BPA concentrations (0.1–100 nM). Cells were then labelled with bromodeoxyuridine (BrdU), and BrdU incorporation was detected via indirect immunofluorescence. Data shown are mean ± SD of three independent experiments in which at least 250 cells per experiment were analysed. The highest cell proliferation responses occurred within the range of BPA concentrations detected in humans (i.e. between ~0.1 and ~10 ppb). The response to 100 nM (23 ppb) BPA did not differ from the control (0 ppb). A standard toxicity test, starting the dose–response curve at 100 nM (23 ppb) BPA and testing at incrementally higher doses, would have shown no difference in cell proliferation between controls and doses at that level or above. The erroneous conclusion drawn from applying a standard toxicity test with 100 nM (23 ppb) as the lowest test concentration would be that bisphenol A has a ‘no observed adverse effect level’ (NOAEL) greater than 100 nM, or 23 ppb. This experiment shows that, had concentrations one and two orders of magnitude lower been tested, the stimulatory effect of BPA at 1 nM and 10 nM would have been observed. In an experiment specifically designed to test the adequacy of high-dose testing with the phthalate DEHP in rats, researchers found that a high dose of DEHP increased the activity of the estrogen-synthesizing enzyme aromatase in the brains of neonatal male rats, while a dose two orders of magnitude lower appeared to be a “no effect dose”. This methodology was used to estimate the exposure concentration deemed safe for human exposure.
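The unit conversion in the BPA example (100 nM ≈ 23 ppb) follows directly from BPA's molar mass. A short sketch of the arithmetic, assuming a dilute aqueous solution in which 1 L weighs approximately 1 kg:

```python
BPA_MOLAR_MASS = 228.29  # g/mol for bisphenol A (C15H16O2)

def nanomolar_to_ppb(conc_nm, molar_mass):
    """Convert a nanomolar aqueous concentration to parts per billion
    (µg per litre), assuming 1 L of dilute solution has a mass of ~1 kg."""
    grams_per_litre = conc_nm * 1e-9 * molar_mass
    return grams_per_litre * 1e6  # g/L -> µg/L, i.e. ppb by mass

print(nanomolar_to_ppb(100, BPA_MOLAR_MASS))  # ~22.8, i.e. the "23 ppb" cited above
print(nanomolar_to_ppb(1, BPA_MOLAR_MASS))    # ~0.23 ppb
```

The same conversion shows why the 0.1–10 nM range where stimulation was observed sits inside the sub-ppb to low-ppb concentrations reported in human biomonitoring.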
[The aromatase enzyme is involved in determining gender-associated differences in brain function.] In this experiment the scientists also tested DEHP at lower doses and found a significant down-regulation of aromatase at a dose 37 times lower than the putative ‘no effect dose’, an effect opposite to, and unpredicted from, the results of testing only very high doses. It is acknowledged that at high concentrations some chemicals exert toxic effects that are not directly mediated by hormone receptors. At concentrations several orders of magnitude lower, however, these chemicals act as endocrine disrupting chemicals (EDCs), acting on the synthesis or function of enzymes that may be responsible for the synthesis or degradation of hormones, on co-regulatory proteins that interact with receptors and, in the case of neurologic actions, on neurotransmitters and their receptors. One example is the triazine herbicide atrazine. At very low concentrations atrazine activates the aromatase gene in zebrafish embryos; this activity can alter gender determination via a rapid signalling system. Above the concentration at which a hormonally active chemical saturates, that is occupies, virtually all receptors, any measured change in the dose–response cannot be caused by a receptor-mediated mechanism, because such a mechanism requires a change in receptor occupancy. Receptors for steroid hormones are ligand-activated transcription molecules that require a change in ligand binding to affect the rate of gene transcription. It is therefore argued that high-dose experiments cannot be used to predict the low-dose responses mediated by a putative endocrine disrupting chemical that binds to hormone receptors and alters receptor-mediated responses at low doses.
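The receptor-saturation argument above can be made concrete with the standard single-site occupancy formula, occupancy = L / (L + Kd): once the ligand concentration is far above the dissociation constant Kd, even large further increases in dose barely change the fraction of receptors bound, so any additional response cannot be receptor-mediated. (Concentrations here are expressed in multiples of Kd; the values are an illustrative choice, not data from any specific hormone.)

```python
def receptor_occupancy(ligand, kd=1.0):
    """Fraction of receptors bound at ligand concentration `ligand`,
    for single-site binding with dissociation constant `kd` (same units)."""
    return ligand / (ligand + kd)

# Tenfold dose increases near Kd change occupancy substantially; far above
# Kd the receptors are effectively saturated and occupancy barely moves.
for conc in (0.1, 1.0, 10.0, 100.0, 1000.0):
    print(f"{conc:8.1f} x Kd -> occupancy {receptor_occupancy(conc):.4f}")
```

Going from 100×Kd to 1000×Kd changes occupancy by less than one percentage point, which is why responses observed in that regime must arise from mechanisms other than a change in receptor binding.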
Consequently, the current approach in regulatory toxicology of testing chemicals only at relatively high concentrations within a narrow range (with the highest concentration deemed the maximum tolerated dose) does not serve to predict the hazards posed by the low-level exposures to numerous putative EDCs found in most people in biomonitoring studies conducted in the United States and elsewhere. Research over the past 20 years has identified multiple endocrine disrupting chemicals that mimic or disrupt hormone function at low doses in ways that are not predicted by conventional high-dose (high-concentration) response studies. Biomonitoring studies have established that many of these contaminants are now widespread in people. Paradoxically, classical regulatory toxicology ignores non-monotonic responses by putative EDCs despite the empirical evidence that non-monotonic dose–response patterns are typical of endogenous hormones. This disconnect with current science pervades virtually all regulatory agencies responsible for chemical safety around the world, and it means that many regulatory decisions are highly likely to have underestimated risks. If the human and animal health implications were inconsequential, this clash between regulatory toxicology and endocrinology would remain buried in academia. However, the range of health conditions now plausibly linked to EDCs, including but not limited to prostate cancer, breast cancer, attention deficit hyperactivity disorder (ADHD), infertility, male and female reproductive disorders, miscarriage and, most recently, hyperallergic diseases, asthma, obesity, heart disease and type 2 diabetes, makes it imperative that the challenges facing endocrinology and regulatory toxicology be resolved in ways that reflect modern scientific understanding.
These chronic diseases are major contributors to the steadily increasing human disease burden and to the escalating cost of health care throughout the world. Extensive, careful, and replicable animal research suggests that numerous common man-made chemicals to which people are exposed every day, but that have not been adequately studied for health effects in humans, may be significant contributors to these adverse health trends. Because the endocrine system is highly conserved between humans and the animals used as models in biomedical research, the default assumption should be that the non-monotonic dose–responses of EDCs observed in laboratory animals and in vitro, including with human cells and tissues, are applicable to human health. Modernizing relevant health standards by incorporating endocrinologic principles could help reduce a significant portion of the human disease burden, but this will require regulatory decision makers to fundamentally change the paradigm commonly used to assess the risk to human health posed by chemicals.

Related SourceWatch articles

 * Pollution Information Tasmania