Tuesday, June 4, 2019

Predicting Effects of Environmental Contaminants

1.1. Debunking some chemical myths

In October 2008, the Royal Society of Chemistry announced it was offering £1 million to the first member of the public who could produce a 100% chemical-free material. This attempt to reclaim the word "chemical" from the advertising and marketing industries that use it as a synonym for poison was a reaction to a decision of the Advertising Standards Authority to back an advert perpetuating the myth that natural products are chemical free (Edwards 2008). Indeed, no material, regardless of its origin, is chemical free. A related common misconception is that chemicals made by nature are intrinsically good and, conversely, those manufactured by man are bad (Ottoboni 1991). There are many examples of toxic compounds produced by algae or other micro-organisms, venomous animals and plants, and even examples of environmental harm resulting from the presence of relatively benign natural compounds either in unexpected places or in unexpected quantities. It is therefore of prime importance to define what is meant by "chemical" when referring to chemical hazards in this chapter and the rest of this book. The correct term to describe a chemical compound an organism may be exposed to, whether of natural or synthetic origin, is xenobiotic, i.e. a substance foreign to an organism (the term has also been used for transplants). A xenobiotic can be defined as a chemical which is found in an organism but which is not normally produced or expected to be present in it.
It can also cover substances which are present in much higher concentrations than usual.

A grasp of some of the fundamental principles of the scientific disciplines that underlie the characterisation of effects associated with exposure to a xenobiotic is required in order to understand the potential consequences of the presence of pollutants in the environment and to critically appraise the scientific evidence. This chapter will attempt to briefly summarise some main concepts of basic toxicology and environmental epidemiology relevant in this context.

1.2. Concepts of Fundamental Toxicology

Toxicology is the science of poisons. A poison is commonly defined as any substance that can cause an adverse effect as a result of a physicochemical interaction with living tissue (Duffus 2006). The use of poisons is as old as the human race, as a method of hunting or warfare as well as murder, suicide or execution. The evolution of this scientific discipline cannot be separated from the evolution of pharmacology, or the science of cures. Theophrastus Phillippus Aureolus Bombastus von Hohenheim, more commonly known as Paracelsus (1493-1541), a physician contemporary of Copernicus, Martin Luther and da Vinci, is widely considered the father of toxicology. He challenged the ancient concepts of medicine based on the balance of the four humours (blood, phlegm, yellow and black bile) associated with the four elements, and believed illness occurred when an organ failed and poisons accumulated. This use of chemistry and chemical analogies was particularly offensive to the medical establishment of his time. He is famously credited with the observation that the dose makes the poison, which still underlies present-day toxicology. In other words, all substances are potential poisons, since all can cause injury or death following excessive exposure.
Conversely, this statement implies that all chemicals can be used safely if handled with appropriate precautions and exposure is kept below a defined limit, at which risk is considered tolerable (Duffus 2006). The concepts both of tolerable risk and of adverse effect illustrate the value judgements embedded in an otherwise scientific discipline relying on observable, measurable empirical evidence. What is considered abnormal or undesirable is dictated by society rather than science. Any change from the normal state is not necessarily an adverse effect, even if statistically significant. An effect may be considered harmful if it causes damage, irreversible change or increased susceptibility to other stresses, including infectious disease. The stage of development or state of health of the organism may also have an influence on the degree of harm.

1.2.1. Routes of exposure

Toxicity will vary depending on the route of exposure. There are three routes via which exposure to environmental contaminants may occur:
- Ingestion
- Inhalation
- Skin absorption

Injection may also be used in environmental toxicity testing. Toxic and pharmaceutical agents generally produce the most rapid response and greatest effect when given intravenously, directly into the bloodstream. A descending order of effectiveness for environmental exposure routes would be inhalation, ingestion and skin absorption.

Oral toxicity is most relevant for substances that might be ingested with food or drinks. Whilst it could be argued that this is generally under an individual's control, there are complex issues regarding information both about the presence of substances in food or water and about the current state of knowledge on associated harmful effects.

Gases, vapours and dusts or other airborne particles are inhaled involuntarily (with the notable exception of smoking). The inhalation of solid particles depends upon their size and shape.
In general, the smaller the particle, the further into the respiratory tract it can go. A large proportion of airborne particles breathed in through the mouth or cleared by the cilia of the lungs can enter the gut.

Dermal exposure generally requires direct and prolonged contact with the skin. The skin acts as a very effective barrier against many external toxicants, but because of its great surface area (1.5-2 m2), some of the many diverse substances it comes in contact with may still elicit topical or systemic effects (Williams and Roberts 2000). If dermal exposure is most often relevant in occupational settings, it may nevertheless be pertinent in relation to bathing waters (although ingestion is an important route of exposure in this context). Voluntary dermal exposure related to the use of cosmetics raises the same questions regarding the adequate communication of current knowledge about potential effects as those related to food.

1.2.2. Duration of exposure

The toxic response will also depend on the duration and frequency of exposure. A single dose of a chemical may have severe effects, whilst the same total dose given at several intervals may have little if any effect. An example would be to compare the effects of drinking four beers in one evening with those of drinking four beers in four days. Exposure duration is generally divided into four broad categories: acute, sub-acute, sub-chronic and chronic. Acute exposure to a chemical usually refers to a single exposure event or repeated exposures over a duration of less than 24 hours. Sub-acute exposure refers to repeated exposures for 1 month or less, sub-chronic exposure to continuous or repeated exposures for 1 to 3 months or approximately 10% of an experimental species' lifetime, and chronic exposure to exposures for more than 3 months, usually 6 months to 2 years in rodents (Eaton and Klaassen 2001).
Chronic exposure studies are designed to assess the cumulative toxicity of chemicals with potential lifetime exposure in humans. In real exposure situations, it is generally very difficult to ascertain with any certainty the frequency and duration of exposure, but the same terms are used.

For acute effects, the time component of the dose is not important, as a high dose is responsible for these effects. However, if acute exposure to agents that are rapidly absorbed is likely to cause immediate toxic effects, it does not rule out the possibility of delayed effects that are not necessarily similar to those associated with chronic exposure, e.g. the latency between exposure to a carcinogenic substance and the onset of certain cancers. It is worth mentioning here that the effect of exposure to a toxic agent may be entirely dependent on the timing of exposure; in other words, long-term effects resulting from exposure during a critically sensitive stage of development may differ widely from those seen if an adult organism is exposed to the same substance. Acute effects are almost always the result of accidents. Otherwise, they may result from criminal poisoning or self-poisoning (suicide). Conversely, whilst chronic exposure to a toxic agent is generally associated with long-term low-level chronic effects, this does not preclude the possibility of some immediate (acute) effects after each administration. These concepts are closely related to the mechanisms of metabolic degradation and excretion of ingested substances and are best illustrated by Figure 1.1.

Figure 1.1. Line A: chemical with very slow elimination. Line B: chemical with a rate of elimination equal to the frequency of dosing. Line C: rate of elimination faster than the dosing frequency. The blue-shaded area represents the concentration at the target site necessary to elicit a toxic response.
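The three elimination scenarios of Figure 1.1 can be sketched with a simple one-compartment model under first-order elimination. The dose, dosing interval and half-life values below are hypothetical, chosen only to reproduce the qualitative shapes of lines A and C:

```python
import math

def body_burden(dose, n_doses, interval_h, half_life_h):
    """Body burden immediately after the n-th dose, assuming instantaneous
    absorption and first-order elimination (one-compartment sketch)."""
    k = math.log(2) / half_life_h  # elimination rate constant
    burden = 0.0
    for _ in range(n_doses):
        # decay over one dosing interval, then add the next dose
        burden = burden * math.exp(-k * interval_h) + dose
    return burden

# Line A: elimination much slower than dosing (t1/2 = 200 h, dosed every 24 h)
# -> the burden builds up dose after dose.
slow = [body_burden(1.0, n, 24, 200) for n in (1, 5, 10)]
# Line C: elimination faster than dosing (t1/2 = 4 h) -> almost no build-up.
fast = [body_burden(1.0, n, 24, 4) for n in (1, 5, 10)]
print(slow)  # rising toward a high plateau
print(fast)  # stays close to a single dose
```

With slow elimination the burden can eventually cross the toxic-response threshold shaded in the figure even though each individual dose is harmless; with fast elimination it never does.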
1.2.3. Mechanisms of toxicity

The interaction of a foreign compound with a biological system is two-fold: there is the effect of the organism on the compound (toxicokinetics) and the effect of the compound on the organism (toxicodynamics).

Toxicokinetics relates to the delivery of the compound to its site of action, including absorption (transfer from the site of administration into the general circulation), distribution (via the general circulation into and out of the tissues), and elimination (from the general circulation by metabolism or excretion). The target tissue refers to the tissue where a toxicant exerts its effect, which is not necessarily where the concentration of the toxic substance is highest. Many halogenated compounds such as polychlorinated biphenyls (PCBs) or flame retardants such as polybrominated diphenyl ethers (PBDEs) are known to bioaccumulate in body fat stores. Whether such sequestration processes are actually protective to the individual organism, i.e. by lowering the concentration of the toxicant at the site of action, is not clear (O'Flaherty 2000). In an ecological context, however, such bioaccumulation may serve as an indirect route of exposure for organisms at higher trophic levels, thereby potentially contributing to biomagnification through the food chain.

Absorption of any compound that has not been injected directly intravenously will entail transfer across membrane barriers before it reaches the systemic circulation, and the efficiency of absorption processes is highly dependent on the route of exposure.

It is also important to note that distribution and elimination, although often considered separately, take place simultaneously. Elimination itself comprises two kinds of processes, excretion and biotransformation, which also take place simultaneously.
Elimination and distribution are not independent of each other, as effective elimination of a compound will prevent its distribution in peripheral tissues, whilst, conversely, wide distribution of a compound will impede its excretion (O'Flaherty 2000). Kinetic models attempt to predict the concentration of a toxicant at the target site from the administered dose. Although the ultimate toxicant, i.e. the chemical species that induces structural or functional alterations resulting in toxicity, is often the compound administered (the parent compound), it can also be a metabolite of the parent compound generated by biotransformation processes, i.e. toxication rather than detoxication (Timbrell 2000; Gregus and Klaassen 2001). The liver and kidneys are the most important excretory organs for non-volatile substances, whilst the lungs are active in the excretion of volatile compounds and gases. Other routes of excretion include the skin, hair, sweat, nails and milk. Milk may be a major route of excretion for lipophilic chemicals due to its high fat content (O'Flaherty 2000).

Toxicodynamics is the study of the toxic response at the site of action, including the reactions with and binding to cell constituents, and the biochemical and physiological consequences of these actions. Such consequences may be manifested and observed at the molecular or cellular level, at the target organ or in the whole organism. Therefore, although toxic responses have a biochemical basis, the study of toxic responses is generally subdivided either according to the organ in which toxicity is observed, including hepatotoxicity (liver), nephrotoxicity (kidney), neurotoxicity (nervous system) and pulmonotoxicity (lung), or according to the type of toxic response, including teratogenicity (abnormalities of physiological development), immunotoxicity (immune system impairment), mutagenicity (damage to genetic material) and carcinogenicity (cancer causation or promotion).
The choice of the toxicity endpoint to observe in experimental toxicity testing is therefore of critical importance. In recent years, rapid advances in the biochemical sciences and technology have resulted in the development of bioassay techniques that can contribute invaluable information regarding toxicity mechanisms at the cellular and molecular level. However, the extrapolation of such information to predict effects in an intact organism for the purpose of risk assessment is still in its infancy (Gundert-Remy et al. 2005).

1.2.4. Dose-response relationships

The concept of dose-response relationships is based on the assumptions that the activity of a substance is not an inherent quality but depends on the dose an organism is exposed to, i.e. all substances are inactive below a certain threshold and active over that threshold, and that dose-response relationships are monotonic, the response rising with the dose. Toxicity may be detected either as an all-or-nothing phenomenon, such as the death of the organism, or as a graded response, such as the hypertrophy of a particular organ. The dose-response relationship involves correlating the severity of the response with exposure (the dose). Dose-response relationships for all-or-nothing (quantal) responses are typically S-shaped, reflecting the fact that the sensitivity of individuals in a population generally exhibits a normal or Gaussian distribution. Biological variation in susceptibility, with fewer individuals being either hypersusceptible or resistant at either end of the curve and the majority responding between these two extremes, gives rise to a bell-shaped normal frequency distribution. When plotted as a cumulative frequency distribution, a sigmoid dose-response curve is observed (Figure 1.2).
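The sigmoid quantal dose-response curve can be sketched as a logistic function of log dose, a common parameterisation (the slope and LD50 values below are hypothetical). It also shows why a median dose alone can mislead: the compound with the lower LD50 is not necessarily the one that affects organisms first at low doses.

```python
import math

def quantal_response(dose, ld50, slope):
    """Fraction of a population responding at a given dose: a logistic
    curve in log10(dose), centred on the LD50 (sketch, not a fitted model)."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log10(dose) - math.log10(ld50))))

# Hypothetical toxicants: A has the lower LD50 but a steep curve;
# B has a higher LD50 but a shallow curve, i.e. a lower threshold.
resp_a = quantal_response(1.0, ld50=10.0, slope=8.0)
resp_b = quantal_response(1.0, ld50=50.0, slope=1.5)
print(resp_a < resp_b)  # True: at this low dose, B already affects more organisms
```

By construction the function returns exactly 0.5 at the LD50, and the steepness parameter controls how quickly the response saturates once the threshold region is exceeded.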
Studying dose response, and developing dose-response models, is central to determining safe and hazardous levels. The simplest measure of toxicity is lethality, and determination of the median lethal dose, the LD50, is usually the first toxicological test performed with new substances. The LD50 is the dose at which a substance is expected to cause the death of half of the experimental animals, and it is derived statistically from dose-response curves (Eaton and Klaassen 2001). LD50 values are the standard for comparison of acute toxicity between chemical compounds and between species. Some values are given in Table 1.1. It is important to note that the higher the LD50, the less toxic the compound.

Similarly, the EC50, the median effective dose, is the quantity of the chemical that is estimated to have an effect in 50% of the organisms. However, median doses alone are not very informative, as they do not convey any information on the shape of the dose-response curve. This is best illustrated by Figure 1.3. While toxicant A appears more toxic than toxicant B on the basis of its lower LD50, toxicant B will start affecting organisms at lower doses (lower threshold), while the steep slope of the dose-response curve for toxicant A means that once individuals become overexposed (exceed the threshold dose), the increase in response occurs over much smaller increments in dose.

Low dose responses

The classical paradigm for extrapolating dose-response relationships at low doses is based on the concept of a threshold for non-carcinogens, whereas it assumes that there is no threshold for carcinogenic responses, for which a linear relationship is hypothesised (Figures 1.4 and 1.5).

The NOAEL (No Observed Adverse Effect Level) is the highest exposure level at which there is no statistically or biologically significant increase in the frequency or severity of adverse effects between the exposed population and its appropriate control.
The NOAEL for the most sensitive test species and the most sensitive indicator of toxicity is usually employed for regulatory purposes. The LOAEL (Lowest Observed Adverse Effect Level) is the lowest exposure level at which there is a statistically or biologically significant increase in the frequency or severity of adverse effects between the exposed population and its appropriate control. The main criticism of the NOAEL and LOAEL is that they are dependent on study design, i.e. the dose groups selected and the number of individuals in each group. Statistical methods of deriving the concentration that produces a specific effect (ECx), or a benchmark dose (BMD), the statistical lower confidence limit on the dose that produces a defined response (the benchmark response or BMR), are increasingly preferred.

Understanding the risk that environmental contaminants pose to human health requires the extrapolation of limited data from animal experimental studies to the low doses typically encountered in the environment. Such extrapolation of dose-response relationships at low doses is the source of much controversy. Recent advances in the statistical analysis of very large populations exposed to ambient concentrations of environmental pollutants have, however, not observed thresholds for cancer or non-cancer outcomes (White et al. 2009). The actions of chemical agents are triggered by complex molecular and cellular events that may lead to cancer and non-cancer outcomes in an organism. These processes may be linear or non-linear at an individual level. A thorough understanding of critical steps in a toxic process may help refine current assumptions about thresholds (Boobis et al. 2009). The dose-response curve, however, describes the response, or variation in sensitivity, of a population.
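The study-design dependence of the NOAEL and LOAEL is easy to see in code: both are simply read off the tested dose groups, so a dose that was never tested can never be returned. A minimal sketch with hypothetical dose groups:

```python
def noael_loael(results):
    """Pick NOAEL and LOAEL from (dose, significant_effect) pairs.
    Illustrates that both values depend entirely on which doses were tested."""
    doses_no_effect = [d for d, sig in results if not sig]
    doses_effect = [d for d, sig in results if sig]
    loael = min(doses_effect) if doses_effect else None
    noael = max((d for d in doses_no_effect if loael is None or d < loael),
                default=None)
    return noael, loael

# Hypothetical study: effects significant at 10 and 30 mg/kg, not below.
study = [(0.1, False), (1.0, False), (10.0, True), (30.0, True)]
print(noael_loael(study))  # (1.0, 10.0)
```

Had the study tested a 5 mg/kg group instead of 1 mg/kg, the reported NOAEL would move accordingly, which is exactly the criticism that motivates ECx and benchmark-dose approaches.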
Biological and statistical attributes such as population variability, additivity to pre-existing conditions or diseases induced at background exposure will tend to smooth and linearise the dose-response relationship, obscuring individual thresholds.

Hormesis

Dose-response relationships for substances that are essential for normal physiological function and survival are actually U-shaped. At very low doses, adverse effects are observed due to a deficiency. As the dose of such an essential nutrient is increased, the adverse effect is no longer detected and the organism can function normally in a state of homeostasis. Abnormally high doses, however, can give rise to a toxic response. This response may be qualitatively different, and the toxic endpoint measured at very low and very high doses is not necessarily the same.

There is evidence that other substances may also exert an effect at very low doses (Figure 1.6). Some authors have argued that hormesis ought to be the default assumption in the risk assessment of toxic substances (Calabrese and Baldwin 2003). Whether such low dose effects should be considered stimulatory or beneficial is controversial. Further, the potential implications of the concept of hormesis for the risk management of the combinations of the wide variety of environmental contaminants present at low doses that individuals with variable sensitivity may be exposed to are at best unclear.

1.2.5. Chemical interactions

In regulatory hazard assessment, chemical hazards are typically considered on a compound-by-compound basis, the possibility of chemical interactions being accounted for by the use of safety or uncertainty factors. Mixture effects still represent a challenge for the risk management of chemicals in the environment, as the presence of one chemical may alter the response to another chemical.
The simplest interaction is additivity: the effect of two or more chemicals acting together is equivalent to the sum of the effects of each chemical in the mixture when acting independently. Synergism is more complex and describes a situation where the presence of both chemicals causes an effect that is greater than the sum of their effects when acting alone. In potentiation, a substance that does not produce a specific toxicity on its own increases the toxicity of another substance when both are present. Antagonism is the principle upon which antidotes are based, whereby a chemical can reduce the harm caused by a toxicant (James et al. 2000; Duffus 2006). Mathematical illustrations and examples of known chemical interactions are given in Table 1.2.

Table 1.2. Mathematical representations of chemical interactions (reproduced from James et al., 2000)
- Additive (2 + 3 = 5): organophosphate pesticides
- Synergistic (2 + 3 = 20): cigarette smoking + asbestos
- Potentiation (2 + 0 = 10): alcohol + carbon tetrachloride
- Antagonistic (6 + 6 = 8, 5 + (-5) = 0, or 10 + 0 = 2): toluene + benzene; caffeine + alcohol; dimercaprol + mercury

There are four main ways in which chemicals may interact (James et al. 2000):
1. Functional: both chemicals have an effect on the same physiological function.
2. Chemical: a chemical reaction between the two compounds affects the toxicity of one or both compounds.
3. Dispositional: the absorption, metabolism, distribution or excretion of one substance is increased or decreased by the presence of the other.
4. Receptor-mediated: when two chemicals have differing affinity and activity for the same receptor, competition for the receptor will modify the overall effect.

1.2.6. Relevance of animal models

A further complication in the extrapolation of the results of toxicological experimental studies to humans, or indeed other untested species, relates to the anatomical, physiological and biochemical differences between species.
This paradoxically requires some previous knowledge of the mechanism of toxicity of a chemical and of the comparative physiology of the different test species. When adverse effects are detected in screening tests, these should be interpreted with the relevance of the animal model chosen in mind. For the derivation of safe levels, safety or uncertainty factors are again usually applied to account for the uncertainty surrounding inter-species differences (James et al. 2000; Sullivan 2006).

1.2.7. A few words about doses

When discussing dose-response, it is also important to understand which dose is being referred to and to differentiate between concentrations measured in environmental media and the concentration that will elicit an adverse effect at the target organ or tissue. The exposure dose in a toxicological testing setting is generally known or can be readily derived or measured from concentrations in media and average consumption (of food or water, for example) (Figure 1.7). Whilst toxicokinetics helps to develop an understanding of the relationship between the internal dose and a known exposure dose, relating concentrations in environmental media to the actual exposure dose, often via multiple pathways, is in the realm of exposure assessment.

1.2.8. Other hazard characterisation criteria

Before continuing further, it is important to clarify the difference between hazard and risk. Hazard is defined as the potential to produce harm; it is therefore an inherent, qualitative attribute of a given chemical substance. Risk, on the other hand, is a quantitative measure of the magnitude of the hazard and the probability of it being realised. Hazard assessment is therefore the first step of risk assessment, followed by exposure assessment and finally risk characterisation.
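The exposure-dose arithmetic described in 1.2.7, concentration in a medium multiplied by average consumption and scaled to body weight, can be sketched as follows; the concentration, intake and body-weight figures are hypothetical:

```python
def exposure_dose_mg_per_kg_day(conc_mg_per_l, intake_l_per_day, body_weight_kg):
    """Average daily exposure dose from drinking water:
    concentration x intake rate / body weight."""
    return conc_mg_per_l * intake_l_per_day / body_weight_kg

# Hypothetical case: 0.05 mg/l of a contaminant in drinking water,
# 2 l/day consumption, 70 kg adult.
dose = exposure_dose_mg_per_kg_day(0.05, 2.0, 70.0)
print(round(dose, 5))  # 0.00143 mg/kg-day
```

Real exposure assessments sum such contributions over every relevant pathway (water, food, air, dermal contact), each with its own concentration and intake term.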
Toxicity is not the sole criterion evaluated for hazard characterisation purposes. Some chemicals have been found in the tissues of animals in the Arctic, for example, where these substances of concern have never been used or produced. The realisation that some pollutants were able to travel long distances across national borders because of their persistence, and to bioaccumulate through the food web, led to the consideration of such inherent properties of organic compounds alongside their toxicity for the purpose of hazard characterisation.

Persistence is the result of resistance to environmental degradation mechanisms such as hydrolysis, photodegradation and biodegradation. Hydrolysis only occurs in the presence of water, photodegradation in the presence of UV light, and biodegradation is primarily carried out by micro-organisms. Degradation is related to water solubility, itself inversely related to lipid solubility; persistence therefore tends to be correlated with lipid solubility (Francis 1994). The persistence of inorganic substances has proven more difficult to define, as they cannot be degraded to carbon dioxide and water.

Chemicals may accumulate in environmental compartments and constitute environmental sinks that could be re-mobilised and lead to effects. Further, whilst a substance may accumulate in one species without adverse effects, it may be toxic to that species' predator(s). Bioconcentration refers to accumulation of a chemical from the surrounding environment rather than specifically through food uptake. Conversely, biomagnification refers to uptake from food without consideration of uptake through the body surface. Bioaccumulation integrates both paths, surrounding medium and food. Ecological magnification refers to an increase in concentration through the food web from lower to higher trophic levels.
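The link between lipid solubility and accumulation is often expressed as a regression of the bioconcentration factor (BCF) on the n-octanol/water partition coefficient (Kow). The sketch below uses the frequently quoted Veith et al. (1979) fish regression, log BCF = 0.85 log Kow - 0.70, taken here purely as an illustrative assumption rather than a recommended model:

```python
def estimate_log_bcf(log_kow):
    """Estimate log10(BCF) from log10(Kow) with the Veith et al. (1979)
    fish regression (illustrative sketch only)."""
    return 0.85 * log_kow - 0.70

# A hydrophobic organochlorine (log Kow ~ 6) is predicted to bioconcentrate
# orders of magnitude more than a fairly polar compound (log Kow ~ 2).
print(round(estimate_log_bcf(6.0), 2))  # 4.4  -> BCF ~ 25000
print(round(estimate_log_bcf(2.0), 2))  # 1.0  -> BCF ~ 10
```

Such one-variable regressions break down for very large or readily metabolised molecules, which is one reason standardised bioaccumulation tests are still required.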
Again, accumulation of organic compounds generally involves transfer from a hydrophilic to a hydrophobic phase and correlates well with the n-octanol/water partition coefficient (Herrchen 2006).

The persistence and bioaccumulation of a substance are evaluated by standardised OECD tests. Criteria for the identification of persistent, bioaccumulative and toxic substances (PBT), and very persistent and very bioaccumulative substances (vPvB), as defined in Annex XIII of the European Regulation on the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) (European Union 2006), are given in Table 1.3. To be classified as a PBT or vPvB substance, a given compound must fulfil all criteria.

Table 1.3. REACH criteria for identifying PBT and vPvB chemicals

Persistence
- PBT: either half-life > 60 days in marine water, > 40 days in fresh or estuarine water, > 180 days in marine sediment, > 120 days in fresh or estuarine sediment, or > 120 days in soil
- vPvB: either half-life > 60 days in marine, fresh or estuarine water, > 180 days in marine, fresh or estuarine sediment, or > 180 days in soil

Bioaccumulation
- PBT: bioconcentration factor (BCF) > 2000
- vPvB: bioconcentration factor (BCF) > 5000

Toxicity
- PBT: either chronic no-observed effect concentration (NOEC) < 0.01 mg/l; or the substance is classified as carcinogenic (category 1 or 2), mutagenic (category 1 or 2), or toxic for reproduction (category 1, 2 or 3); or there is other evidence of endocrine disrupting effects
- vPvB: no toxicity criterion

1.3. Some Notions of Environmental Epidemiology

A complementary, observational approach to the study of scientific evidence of associations between environment and disease is epidemiology. Epidemiology can be defined as the study of how often diseases occur and why, based on the measurement of disease outcome in a study sample in relation to a population at risk (Coggon et al. 2003).
Environmental epidemiology refers to the study of patterns of disease and health related to exposures that are exogenous and involuntary. Such exposures generally occur through the air, water, diet or soil, and include physical, chemical and biological agents. The extent to which environmental epidemiology is considered to include social, political, cultural, and engineering or architectural factors affecting human contact with such agents varies between authors. In some contexts, the environment can refer to all non-genetic factors, although dietary habits are generally excluded, despite the facts that some deficiency diseases are environmentally determined and that nutritional status may modify the impact of an environmental exposure (Steenland and Savitz 1997; Hertz-Picciotto 1998).

Most of environmental epidemiology is concerned with endemics, in other words acute or chronic disease occurring at relatively low frequency in the general population due partly to a common and often unsuspected exposure, rather than with epidemics, or acute outbreaks of disease affecting a limited population shortly after the introduction of an unusual known or unknown agent. Measuring such low-level exposure of the general public may be difficult if not impossible, particularly when seeking historical estimates of exposure to predict future disease. Estimating very small changes in the incidence of health effects of low-level, common, multiple exposures on common diseases with multifactorial etiologies is particularly difficult, because greater variability may often be expected for other reasons, and environmental epidemiology has to rely on natural experiments that, unlike controlled experiments, are subject to confounding by other, often unknown, risk factors. It may nevertheless be of importance from a public health perspective, as small effects in a large population can have large attributable risks if the disease is common (Steenland and Savitz 1997; Coggon et al. 2003).1.3.1.
Definitions

What is a case?

The definition of a case generally requires a dichotomy, i.e. for a given condition, people can be divided into two discrete classes: the affected and the non-affected. It increasingly appears that diseases exist in a continuum of severity within a population rather than as an all-or-nothing phenomenon. For practical reasons, a cut-off point to divide the diagnostic continuum into cases and non-cases is therefore required. This can be done on a statistical, clinical, prognostic or operational basis. On a statistical basis, the norm is often defined as within two standard deviations of the age-specific mean, thereby arbitrarily fixing the frequency of abnormal values at around 5% in every population. Moreover, it should be noted that what is usual is not necessarily good. A clinical case may be defined by the level of a variable above which symptoms and complications have been found to become more frequent. On a prognostic basis, some clinical findings may carry an adverse prognosis, yet be symptomless. When none of the other approaches is satisfactory, an operational threshold will need to be defined, e.g. based on a threshold for treatment (Coggon et al. 2003).

Incidence, prevalence and mortality

The incidence of a disease is the rate at which new cases occur in a population during a specified period:

Incidence = number of new cases / (population at risk x period of observation)

The prevalence of a disease is the proportion of the population that are cases at a given point in time. This measure is appropriate only in relatively stable conditions and is unsuitable for acute disorders. Even in a chronic disease, the manifestations are often sporadic, and a point prevalence will tend to underestimate the frequency of the condition. A better measure, when possible, is the period prevalence, defined as the proportion of a population that are cases at any time within a stated period.
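The incidence and prevalence measures defined above reduce to simple ratios; a minimal sketch with hypothetical counts:

```python
def incidence_rate(new_cases, population_at_risk, years):
    """Incidence: new cases per person-year at risk over the study period."""
    return new_cases / (population_at_risk * years)

def point_prevalence(current_cases, population):
    """Point prevalence: proportion of the population that are cases now."""
    return current_cases / population

# Hypothetical figures: 50 new cases among 10,000 people followed for
# 2 years, and 300 existing cases in the same population today.
print(incidence_rate(50, 10_000, 2))   # 0.0025 cases per person-year
print(point_prevalence(300, 10_000))   # 0.03
```

Note the asymmetry the text describes: incidence needs a follow-up period (a rate), whereas point prevalence is a snapshot (a proportion), which is why prevalence is unsuitable for acute, short-lived disorders.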
