The assessment of dietary exposure is itself a rather complex process. In accordance with Higley and Strenge (1993), the basic algorithm for calculating total daily dietary intake is as follows: I = Σf [(Uf) × (Rf)], where I = total intake of the contaminant in question, Uf = daily consumption rate of food type f, and Rf = residue level of the contaminant in food type f (the sum being taken over all food types f).
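As a minimal sketch, the Higley and Strenge (1993) summation can be expressed in a few lines of code. The food types, consumption rates, and residue levels below are purely hypothetical illustrations, not survey or monitoring data:

```python
def total_daily_intake(consumption, residues):
    """Total intake I = sum over food types f of (Uf * Rf).

    consumption: dict mapping food type -> daily consumption rate Uf (kg/day)
    residues:    dict mapping food type -> residue level Rf (mg/kg)
    Returns total daily intake I in mg/day.
    """
    # Sum Uf * Rf over every food type with both a consumption rate
    # and a residue level; foods missing either datum contribute nothing.
    return sum(u * residues[f] for f, u in consumption.items() if f in residues)


# Hypothetical example values for illustration only
consumption = {"leafy vegetables": 0.05, "milk": 0.30, "beef": 0.10}  # kg/day
residues = {"leafy vegetables": 0.2, "milk": 0.05, "beef": 0.1}       # mg/kg

I = total_daily_intake(consumption, residues)
# 0.05*0.2 + 0.30*0.05 + 0.10*0.1 = 0.035 mg/day
```

In practice the consumption terms would come from survey-based intake rates and the residue terms from monitoring or modeled data, as described below.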

Daily intake rates used are typically those defined in the Nationwide Food Consumption Survey (e.g., USDA, 1983), for which data on food intake were collected from thousands of individuals living in various regions. For certain chemicals, the residue levels are taken from the open literature. In other instances, where specific data are unavailable and bioaccumulation (e.g., through the food chain) is anticipated, residue levels are extrapolated from measured concentrations in soil, plants, and water in the contaminated area. For pesticide residues in raw agricultural commodities, levels are estimated from field trials, federal monitoring programs, or established tolerances (see Slide 11).

U.S. EPA (2000b) has provided guidance for the assessment of exposure and risk from pesticides in food. A public meeting was recently scheduled by U.S. EPA (Federal Register, 2000) to evaluate the components and methodologies of three commercial models (Calendex/DEEM(tm), Lifeline(tm), and REx2000(tm)) as tools for dietary and residential pesticide exposure/risk assessments. With respect to the drinking water component, U.S. EPA (1999b) has also issued a science policy paper to support the use of several screening models for producing high-end estimates of pesticide concentrations in drinking water.