Browsing by Author "Benham, Brian L."
- Addressing gaps in the US EPA Lead and Copper Rule: Developing guidance and improving citizen science tools to mitigate corrosion in public water systems and premise plumbing
Kriss, Rebecca Boyce (Virginia Tech, 2023-06-21)
Lead and copper in drinking water are known to pose aesthetic and health concerns for humans and pets. The United States Environmental Protection Agency (US EPA) Lead and Copper Rule (LCR) set 90th percentile action levels for lead (15 ppb) and copper (1.3 mg/L), above which utilities must implement systemwide corrosion control. However, gaps in the US EPA LCR leave at least 10% of residents using municipal water, and all private well users, vulnerable to elevated lead and copper in their drinking water. To help address these gaps in the LCR, this dissertation 1) evaluates the accuracy of at-home lead-in-water test kits to help residents identify lead problems, 2) refines orthophosphate corrosion control guidance to help reduce cuprosolvency, 3) identifies challenges to mitigating cuprosolvency by raising pH, and 4) develops guidance that can help residents assess and address cuprosolvency problems. Lead in drinking water can pose a variety of health concerns, particularly for young children. The revised LCR will still leave many residents unprotected from elevated lead in their drinking water and potentially wondering what to do about it. Many consumers concerned about lead may choose to purchase at-home lead-in-water test kits, but there is no certification authority to ensure their accuracy. Most off-the-shelf tests purchased in this work (12 of 16) were not able to detect dissolved or particulate lead at levels of concern in drinking water (i.e., near the lead action level of 15 ppb) due to high detection limits (5,000-20,000 ppb).
Binary-type tests, which indicate the presence or absence of lead based on a trigger threshold of 15 ppb, were often effective at detecting dissolved lead, but they failed to detect leaded particles that often cause high lead exposures in drinking water. Some of these problems detecting particles could be reduced using simple at-home acid dissolution with weak household acids such as vinegar or lemon juice. Our analysis points out the strengths and weaknesses of various types of at-home lead-in-water tests, which could be particularly important considering potential distrust in official results in the aftermath of the Flint Water Crisis. Elevated cuprosolvency, or copper release into drinking water, can be an aesthetic concern due to fixture staining, blue water, and green hair, and can pose health concerns for residents and pets. In addition to the general gaps in the LCR described above, compliance sampling in the LCR focuses on older homes at highest risk of elevated lead, rather than the newer homes at highest risk of elevated copper. Problems with elevated copper can sometimes go undetected as a result. Guidance was developed to help proactive utilities address cuprosolvency issues through the addition of orthophosphate corrosion inhibitors or pH adjustment as a function of a water's alkalinity. Linear regressions developed from pipe cuprosolvency tests (R² > 0.98) determined a "minimum" orthophosphate dose or a "minimum" pH for a given alkalinity that was expected to almost always reduce copper below the 1.3 mg/L EPA action level in a reasonable length of time. The subjective nature of the terms "almost always" and "reasonable length of time" was quantitatively discussed based on laboratory and field data. Orthophosphate addition was generally very effective at cuprosolvency control: in copper tube cuprosolvency tests, orthophosphate treatment reduced copper below the action level within the first week of treatment.
As expected, orthophosphate-treated waters sometimes resulted in higher long-term cuprosolvency than the same waters without orthophosphate corrosion control treatment. This is consistent with the formation of phosphate scales, which have an intermediate solubility between the cupric hydroxide in new pipes and the malachite or tenorite scales expected in pipe aging without orthophosphate. A linear regression (R² > 0.98) was used to determine the orthophosphate dose needed for a given alkalinity to yield copper below the 1.3 mg/L action level in the pipe segments with the highest, second-highest, or third-highest copper concentrations (100th, 95th, or 90th percentile; n = 20 replicates, five each from four manufacturers) after 4 or 22 weeks of pipe aging. This regression was generally in good agreement with a bin approach put forth in the 2015 Consensus Statement from the National Drinking Water Advisory Council, but in some cases the regression predicted that higher orthophosphate doses would be needed. In contrast, due to the greater complexity of the reactions involved, a similarly simplistic approach for pH adjustment is not widely applicable. A linear regression predicted that higher "minimum" pH values would be needed to control cuprosolvency compared to those suggested by the 2015 National Drinking Water Advisory Council Consensus Statement. Results indicate that factors such as the potential for calcite precipitation, pipe age, and significant variability in cuprosolvency from pipes of different manufacturers may warrant further research. Field LCR monitoring data indicated that 90th percentile copper concentrations continued to decline over a period of years or decades when orthophosphate was not used, and our laboratory results demonstrated a few cases where copper levels even increased with time.
Consideration of confounding effects from other water quality parameters such as natural organic matter, silica, and sulfate would be necessary before the "minimum" pH criteria could be broadly applied. Guidance was then developed to help address cuprosolvency issues on a single-building or single-home basis for residents with private wells or those with high copper in municipal systems meeting the LCR. A hierarchy of costs and considerations for various interventions is discussed, including replumbing with alternative materials; using bottled water or point-of-use pitcher, tap, or reverse osmosis filters to reduce copper consumption; and using whole-house interventions like more conventional orthophosphate addition and pH adjustment, or unproven strategies like granular activated carbon filtration, reverse osmosis treatment, and ion exchange treatment. Laboratory and citizen science testing demonstrated that some inexpensive at-home tests for pH and copper were accurate enough to serve as inputs for this guidance and could empower consumers to diagnose their problems and consider possible solutions. Citizen science field testing and companion laboratory studies of potential interventions indicate that short-term (<36 weeks) use of pH adjustment, granular activated carbon, anion exchange, and reverse osmosis treated water was not effective at forming a protective scale in the case-study resident's water. In this case study, cuprosolvency problems were ultimately related to water chemistry and linked to variability in influent water pH. Overall, this work highlighted weaknesses in the current US EPA Lead and Copper Rule. It attempted to close some of these gaps by assessing the accuracy of at-home citizen science tests for lead and copper detection and developing guidance to support voluntary interventions by utilities or consumers.
Ideally, local authorities (utilities, health departments, cooperative extension programs) could adapt this guidance to account for local water quality considerations and support consumers in resolving cuprosolvency issues. This guidance may also serve as a citizen science approach that some consumers could use to make decisions on their own. Future work could extend and improve on these initial efforts.
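The LCR's 90th percentile action-level trigger that runs through this abstract can be sketched as a simple compliance check. The sample values and the helper function below are hypothetical illustrations, not data from the dissertation:

```python
import numpy as np

# Hypothetical first-draw tap samples; values are illustrative only.
copper_mg_L = np.array([0.3, 0.5, 0.9, 1.1, 1.4, 0.2, 0.8, 1.6, 0.6, 1.0])
lead_ppb = np.array([2, 4, 1, 18, 3, 7, 22, 5, 9, 2])

def exceeds_action_level(samples, action_level):
    """LCR-style trigger: systemwide corrosion control is required when
    the 90th percentile of compliance samples exceeds the action level."""
    return float(np.percentile(samples, 90)) > action_level

print(exceeds_action_level(copper_mg_L, 1.3))  # copper action level: 1.3 mg/L
print(exceeds_action_level(lead_ppb, 15))      # lead action level: 15 ppb
```

Because the check targets the upper tail of the sample distribution, a system can be compliant while up to roughly 10% of sampled homes still see elevated levels, which is the gap the dissertation emphasizes.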
- Advances in Watershed Management: Modeling, Monitoring, and Assessment
Benham, Brian L.; Yagow, Eugene R.; Chaubey, I.; Douglas-Mankin, K. R. (American Society of Agricultural and Biological Engineers, 2011)
This article introduces a special collection of nine articles that address a wide range of topics all related to improving the application of watershed management planning. The articles are grouped into two broadly defined categories: modeling applications, and monitoring and assessment. The modeling application articles focus on one of two widely used watershed-scale water quality modeling packages: HSPF or SWAT. The HSPF article assesses the model's robustness when applied to watersheds across a range of topographic settings and climatic conditions. In the SWAT-related articles, researchers used the model to inform watershed management efforts in a variety of ways, including subwatershed prioritization in the context of achieving broader watershed management goals, examining the utility of applying SWAT in a watershed receiving groundwater inputs from outside the topographic watershed boundaries, and estimating the uncertainty and risk associated with meeting TMDL target loads. The monitoring and assessment articles cover such diverse topics as an examination of how best management practice effectiveness is assessed, examination of estimated nutrient loads to a reservoir where a nutrient TMDL has been developed, examination of the sources of fecal indicator bacteria in an urban watershed, and detailed accounting of issues related to flow measurements in small watersheds. The articles in this collection contribute to the body of literature that seeks to inform and advance sound watershed management planning and execution.
- Application of the Analytic Hierarchy Process Optimization Algorithm in Best Management Practice Selection
Young, Kevin D. (Virginia Tech, 2006-05-30)
The efficiency of a best management practice (BMP) is defined simply as a measure of how well the practice or series of practices removes targeted pollutants. While this concept is relatively simple, mathematical attempts to quantify BMP efficiency are numerous and complex. Intuitively, the pollutant removal capability of a BMP should be fundamental to the BMP selection process. However, as evidenced by the absence of removal efficiency as an influential criterion in many BMP selection procedures, it is typically not at the forefront of the BMP selection and design process. Additionally, of particular interest to any developer or municipal agency is the financial impact of implementing a BMP. Not only does the implementation cost exist, but there are long-term maintenance costs associated with almost any BMP. Much like pollutant removal efficiency, implementation and maintenance costs seem as though they should be integral considerations in the BMP selection process. However, selection flow charts and matrices employed by many localities neglect these considerations. Among the categories of criteria to consider in selecting a BMP for a particular site or objective are site-specific characteristics; local, state, and federal ordinances; and implementation and long-term maintenance costs. A consideration such as long-term maintenance cost may manifest itself in a very subjective fashion during the selection process. For example, a BMP's cost may be of very limited interest to the reviewing locality, whereas cost may be the dominant selection criterion in the eyes of a developer. By contrast, the pollutant removal efficiency of a BMP may be necessarily prioritized in the selection process because of the required adherence to governing legislation. These are merely two possible criteria influencing selection.
As more and more selection criteria are considered, the task of objectively and optimally selecting a BMP becomes increasingly complex. One mathematical approach for optimization in the face of multiple influential criteria is the Analytic Hierarchy Process. "The analytic hierarchy process (AHP) provides the objective mathematics to process the inescapably subjective and personal preferences of an individual or a group in making a decision" (Schmoldt, 2001, pg. 15). This paper details the development of two categories of comprehensive BMP selection matrices expressing long-term pollutant removal performance and annual maintenance and operations cost respectively. Additionally, the AHP is applied in multiple scenarios to demonstrate the optimized selection of a single BMP among multiple competing BMP alternatives. Pairwise rankings of competing BMP alternatives are founded on a detailed literature review of the most popular BMPs presently implemented throughout the United States.
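The AHP machinery described above can be sketched with the common geometric-mean approximation to the priority vector. The criteria, the pairwise judgments on Saaty's 1-9 scale, and the n = 3 random index below are illustrative assumptions, not the thesis's actual matrices:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical selection criteria:
# pollutant removal efficiency, implementation cost, maintenance cost.
# A[i, j] = how strongly criterion i is preferred over criterion j.
A = np.array([
    [1.0, 3.0, 5.0],   # removal efficiency vs. the others
    [1/3, 1.0, 2.0],   # implementation cost
    [1/5, 1/2, 1.0],   # maintenance cost
])

# Geometric-mean (row) approximation to the principal eigenvector.
geo_means = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()

# Consistency check: CR < 0.1 is conventionally acceptable.
n = A.shape[0]
lam_max = (A @ weights / weights).mean()
CI = (lam_max - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI
print(weights, CR)
```

Competing BMP alternatives would then be scored against each criterion the same way, and each alternative's criterion scores combined with these weights to rank the alternatives.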
- Assessing Strontium and Vulnerability to Strontium in Private Drinking Water Systems in Virginia
Scott, Veronica; Juran, Luke; Ling, Erin; Benham, Brian L.; Spiller, Asa (MDPI, 2020-04-08)
A total of 1.7 million Virginians rely on private drinking water (PDW) systems and 1.3 million of those people do not know their water quality. Because most Virginians who use PDW do not know the quality of that water and since strontium poses a public health risk, this study investigates sources of strontium in PDW in Virginia and identifies the areas and populations most vulnerable. Physical factors such as rock type, rock age, and fertilizer use have been linked to elevated strontium concentrations in drinking water. Social factors such as poverty, poor diet, and adolescence also increase social vulnerability to health impacts of strontium. Using water quality data from the Virginia Household Water Quality Program (VAHWQP) and statistical and spatial analyses, physical vulnerability was found to be highest in the Ridge and Valley province of Virginia, where agricultural land use and geologic formations with high strontium concentrations (e.g., limestone, dolomite, sandstone, shale) are the dominant aquifer rocks. In terms of social vulnerability, households with high levels of strontium are more likely than the average VAHWQP participant to live in a food desert. This study provides information to help 1.7 million residents of Virginia, as well as populations in neighboring states, understand their risk of exposure to strontium in PDW.
- Assessing the effects of cattle exclusion practices on water quality in headwater streams in the Shenandoah Valley, Virginia
Maschke, Nancy Jane (Virginia Tech, 2012-01-27)
Livestock best management practices (BMPs) such as streamside exclusion fencing are installed to reduce cattle impacts on stream water quality, such as increases in bacteria through direct deposition and sediment through trampling. The main objective of this study is to assess the effects of different cattle management strategies on water quality. The project site was located near Keezletown, VA, encompassing the Cub Run and Mountain Valley Road Tributary streams. During two one-week studies, eight automatic water samplers took two-hour composites for three periods: baseline, cattle access, and recovery. During the cattle access period, livestock were able to enter the riparian zone that is normally fenced off. Water samples were analyzed for E. coli, sediment, and nutrients to understand the impact of short-term, high-density ("flash") grazing on water quality. Additional weekly grab and storm samples were collected. Results show that cattle do not have a significant influence on pollutant concentrations except in stream locations where cattle gathered for an extended period of time. Approximately three cattle in the stream created an increase in turbidity above baseline concentrations. E. coli and TSS concentrations at the impacted sites returned to baseline within approximately 6 to 20 hours of peak concentrations. Weekly samples show that flash grazing does not have a significant influence on pollutant concentrations over a two-year time frame. Sediment loads from storms and a flash grazing event showed similar patterns. Pollutant concentrations through the permanent exclusion fencing reach tended to decrease for weekly and flash grazing samples.
- Assessing the Performance of HSPF When Using the High Water Table Subroutine to Simulate Hydrology in a Low-Gradient Watershed
Forrester, Michael Scott (Virginia Tech, 2012-04-17)
Modeling groundwater hydrology is critical in low-gradient, high water table watersheds where groundwater is the dominant contribution to streamflow. The Hydrological Simulation Program-FORTRAN (HSPF) model has two different subroutines available to simulate groundwater, the traditional ground-water (TGW) subroutine and the high water table (HWT) subroutine. The HWT subroutine has more parameters and requires more data but was created to enhance model performance in low-gradient, high water table watershed applications. The objective of this study was to compare the performance and uncertainty of the TGW and HWT subroutines when applying HSPF to a low-gradient watershed in the Coastal Plain of northeast North Carolina. One hundred thousand Monte Carlo simulations were performed to generate the data needed for model performance comparison. The HWT model generated considerably higher Nash-Sutcliffe efficiency (NSE) values while performing slightly worse when simulating the 50% lowest and 10% highest flows. Model uncertainty was assessed using the Average Relative Interval Length (ARIL) metric. The HWT model operated with more average uncertainty throughout all flow regimes. Based on the results, the HWT subroutine is preferable when applying HSPF to a low-gradient watershed and the accuracy of simulated stream discharge is important. In situations where a balance between performance and uncertainty is called for, the choice of which subroutine to employ is less clear-cut.
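The two comparison metrics named in this abstract can be sketched from their standard definitions. These are my assumed formulations (a common reading of NSE and ARIL), not the thesis's code, and the observed/simulated series and interval bounds are synthetic:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def aril(obs, lower, upper):
    """Average Relative Interval Length: mean prediction-interval width
    relative to the observation; smaller implies less uncertainty."""
    obs = np.asarray(obs, float)
    return np.mean((np.asarray(upper, float) - np.asarray(lower, float)) / obs)

obs = [1.2, 0.8, 2.5, 3.1, 1.9]   # synthetic observed flows
sim = [1.0, 0.9, 2.2, 3.4, 2.0]   # synthetic simulated flows
print(nse(obs, sim))
print(aril(obs, [0.9, 0.6, 2.0, 2.8, 1.5], [1.5, 1.1, 2.9, 3.6, 2.4]))
```

In a Monte Carlo setting like the study's, the interval bounds passed to `aril` would come from the spread of the simulation ensemble at each time step.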
- An Assessment of the Quality of Agricultural Best Management Practices in the James River Basin of Virginia
Cunningham, Janelle Hope (Virginia Tech, 2003-08-21)
Assessment tools were developed to address the need for a low-cost, rapid method of quantifying the quality of agricultural best management practices (BMPs). Best management practices are either cost-shared, where some or all of the capital costs of the practice were subsidized with federal, state, or local funds, or non cost-shared, where the cost of the practice and its upkeep is paid for by the landowner or farm operator. Cost-share practices are required to comply with state standards, while non cost-share practices are not subject to any standards. For this study, BMP quality is defined as the adherence to design, site selection, implementation, and maintenance criteria relating to water quality as specified by state and federal agencies promoting BMP implementation. The two objectives of this research were to: 1) develop a set of assessment tools to quantify the quality of agricultural best management practices in a rapid, low-cost manner, and 2) test the tools and determine whether differences in quality exist between cost-share and non cost-share BMPs in the James River Basin of Virginia. Assessment tools were developed for sixteen practices: alternative water systems, stream fencing, streambank stabilization, grass filter strips, wooded buffers, permanent vegetative cover on critically eroding areas, permanent vegetative cover on erodible cropland, reforestation of erodible crop and pasture land, animal waste storage facilities, grazing land protection systems, loafing lot management systems, late winter split application of nitrogen on small grains, protective cover for specialty crops, sidedress application of nitrogen on corn, small grain cover crops-fertilized and harvested, and small grain cover crops for nutrient management.
Assessment tools were developed using both Virginia BMP standards and expert knowledge. Virginia Department of Conservation and Recreation (DCR) and Virginia and national Natural Resources Conservation Service (NRCS) BMP standards were collected and sorted into the four quality component categories: design, site selection, implementation, and maintenance. Standards that pertained directly to a BMP's potential to protect water quality were translated into question format. Multiple-choice or yes/no questions were used as often as possible to avoid potential bias and for ease of processing. Assessment tool development involved an iterative process that included input from a research team (university-based researchers) and an expert team (public and private sector professionals and practitioners responsible for BMP design and assessment). One hundred fifty-five cost-shared BMPs and 150 non cost-shared BMPs were assessed on 128 independent farms in the James River Basin of Virginia over a period of four months. The assessment tools were loaded onto a personal digital assistant (PDA), which facilitated data collection and eliminated the need for data transcription. Data collected on the PDA were uploaded periodically to a computer database. A digital camera was used to develop a photographic record of the assessed BMPs. Best management practice quality scores were based on a five-point scale, with one being the lowest quality score and five the highest. Statistical analyses conducted on both the overall quality scores and the quality component scores indicate that there is not a strong significant difference (p = 0.05) in quality between the cost-shared and non cost-shared BMPs assessed for this study. Statistically significant differences between cost-share and non cost-share practices did, however, exist.
For the filter/buffer strip practices (grass filter strips and wooded buffers), the implementation quality component cost-share mean (3.35) and the non cost-share mean (3.88) were statistically different at the 0.05 level (p-value = 0.026). One other statistically significant difference was found. For stream fencing, the overall quality cost-share mean was 4.68 while the non cost-share mean was 4.20; the means are statistically different at the 0.05 level (p-value = 0.043). Statistical analyses were performed to determine if age of practice, farm size, or Soil and Water Conservation District (SWCD) had effects on BMP quality. No statistically significant differences (p = 0.05) were found relating to the age of an assessed BMP or farm size. One SWCD, the Robert E. Lee district, had a statistically significant difference in the design quality component means (cost-share mean = 4.21, non cost-share mean = 2.94, p-value = 0.048). The statistically significant differences that were detected do not establish a clear trend; it appears that for the BMPs assessed here the qualities of cost-share and non cost-share practices are roughly equal. This rough parity may be the result of education and outreach programs sponsored by Virginia's SWCDs and Virginia Cooperative Extension. Non cost-share practices may be of equal quality to cost-share practices because those implementing BMPs without the benefit of cost-share may have a greater stake (both financial and personal) in those practices performing well. If no statistically significant difference in quality exists between cost-share and non cost-share practices, then non cost-share practices should be given equal weight when accounting for BMPs in watershed management and NPS pollution computer modeling.
Currently, only cost-share practices are included in computer models, in part because these are the only practices tracked by the existing BMP establishment infrastructure. Estimating the numbers and distribution of non cost-share practices and incorporating them into NPS water quality modeling efforts will more accurately reflect the steps agricultural producers have taken and are taking to decrease the amount of NPS pollution reaching water bodies. Additionally, policy regarding NPS pollution and BMPs should reflect the apparent equal qualities of cost-share and non cost-share practices. The assessment tools developed as a part of this study can potentially be applied to determine the quality of BMPs on basin or state-wide scales to give policy makers a better understanding of the practices and populations that the policies are created for. Moreover, BMP quality scores have the potential to be used as a surrogate measure for BMP performance. Further research recommendations include correlating BMP quality scores with BMP performance, wider scale testing of the tools, continued revision of the tools, and using the assessment tool scores to diagnose BMP quality problems.
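The mean comparisons reported in this abstract (p-values at the 0.05 level) can be illustrated with a two-sample Welch's t-test. The quality scores below are synthetic stand-ins on the study's 1-5 scale, and the critical value is a t-table entry for roughly 14 degrees of freedom, not the study's actual data or test output:

```python
import math

# Synthetic 1-5 quality scores for the two groups (illustrative only).
cost_share     = [4.7, 4.2, 3.9, 4.5, 4.0, 3.8, 4.6, 4.1]
non_cost_share = [4.4, 3.9, 4.1, 4.6, 3.7, 4.0, 4.3, 3.8]

def welch_t(x, y):
    """Welch's t statistic for two samples with unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

t = welch_t(cost_share, non_cost_share)
# Two-sided critical value for alpha = 0.05 at ~14 degrees of freedom.
significant = abs(t) > 2.145
print(f"t = {t:.2f}, significant at 0.05: {significant}")
```

For these synthetic scores the difference in means is not significant, mirroring the study's headline finding of roughly equal quality between the two groups.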
- Associations between Fecal Indicator Bacteria Prevalence and Demographic Data in Private Water Supplies in Virginia
Smith, Tamara L. (Virginia Tech, 2013-06-12)
Over 1.7 million Virginians rely on private water systems to supply household water. The heaviest reliance on these systems occurs in rural areas, which are often underserved in terms of financial resources and access to environmental health education. As the Safe Drinking Water Act (SDWA) does not regulate private water systems, it is the sole responsibility of the homeowner to maintain and monitor these systems. Previous limited studies indicate that microbial contamination of drinking water from private wells and springs is far from uncommon, ranging from 10% to 68% depending on the type of organism and geological region. With the exception of one thirty-year-old government study on rural water supplies, there have been no documented investigations of links between private system water contamination and household demographic characteristics, making the design of effective public health interventions very difficult. The goal of the present study is to identify potential associations between concentrations of fecal indicator bacteria (e.g., coliforms, E. coli) in 831 samples collected at the point of use in homes with private water supply systems and homeowner-provided demographic data (e.g., homeowner age, household income, education, water quality perception). Household income and the education of the perceived head of household were determined to have an association with bacteria concentrations. However, when a model was developed to evaluate strong associations between total coliform presence and potential predictors, no demographic parameters were deemed significant enough to be included in the final model. Of the 831 samples tested, 349 (42%) tested positive for total coliform and 55 (6.6%) tested positive for E. coli contamination.
Chemical and microbial source tracking efforts using fluorometry and qPCR suggested possible E. coli contamination from human septage in 21 cases. The findings of this research can ultimately aid in determining effective strategies for public health intervention and in gaining a better understanding of interactions between demographic data and private system water quality.
- Automatic Calibration Tool for Hydrologic Simulation Program-FORTRAN Using a Shuffled Complex Evolution Algorithm
Seong, Chounghyun; Her, Younggu; Benham, Brian L. (MDPI, 2015-02-04)
Hydrologic Simulation Program-Fortran (HSPF) model calibration is typically done manually due to the lack of an automated calibration tool as well as the difficulty of balancing the multiple objective functions to be considered. This paper discusses the development and demonstration of an automated calibration tool for HSPF (HSPF-SCE). HSPF-SCE was developed using the open source software “R”. The tool employs the Shuffled Complex Evolution optimization algorithm (SCE-UA) to produce a pool of qualified calibration parameter sets from which the modeler chooses a single set of calibrated parameters. Six calibration criteria specified in the Expert System for the Calibration of HSPF (HSPEXP) decision support tool were combined to develop a single, composite objective function for HSPF-SCE. The HSPF-SCE tool was demonstrated, and automated and manually calibrated model performance were compared, using three Virginia watersheds where HSPF models had been previously prepared for bacteria total maximum daily load (TMDL) development. The example applications demonstrate that HSPF-SCE can be an effective tool for calibrating HSPF.
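The composite-objective idea in this abstract can be sketched as follows: each HSPEXP-style percent-error criterion is normalized by its tolerance and the terms are summed into one scalar for the optimizer to minimize. The criterion names, tolerances, and error values below are illustrative assumptions, not the exact six-criterion HSPF-SCE formulation:

```python
# Combine several calibration criteria into one scalar objective.
def composite_objective(errors, tolerances):
    """errors and tolerances are dicts keyed by criterion name.
    Each term is |error| / tolerance, so a term <= 1 means that
    criterion is within its tolerance."""
    return sum(abs(errors[k]) / tolerances[k] for k in tolerances)

# Illustrative HSPEXP-like tolerances (assumed, not the published values).
tolerances = {"total_volume_pct": 10.0,
              "low_flow_recession": 0.03,
              "storm_volume_pct": 15.0}

# Errors a candidate parameter set might produce in one model run.
errors = {"total_volume_pct": 4.2,
          "low_flow_recession": 0.01,
          "storm_volume_pct": 9.0}

score = composite_objective(errors, tolerances)
print(score)
```

An SCE-UA search would evaluate this scalar for each candidate parameter set, evolving complexes of candidates toward lower scores and retaining the qualifying sets as the pool the modeler chooses from.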
- Characterizing Waterborne Lead in Private Water Systems
Pieper, Kelsey J. (Virginia Tech, 2015-07-21)
Lead is a common additive in plumbing components despite its known adverse health effects. Recent research has attributed cases of elevated blood lead levels in children, and even fetal death, to the consumption of drinking water containing high levels of lead. Although the U.S. Environmental Protection Agency (USEPA) strives to minimize lead exposure from water utilities through the Lead and Copper Rule (LCR), an estimated 47 million U.S. residents reliant on private unregulated water systems (generally individual and rural) are not protected. Detection, evaluation, and mitigation of lead in private systems is challenging due to the lack of monitoring data, appropriate sampling protocols, and entities to fund research. Through a statewide sampling survey, over 2,000 homeowners submitted water samples for analysis. This survey documented that 19% of households had lead concentrations in the first draw sample (i.e., a 250 mL sample collected after 6+ hours of stagnation) above the EPA action level of 15 ppb, with concentrations as high as 24,740 ppb. Due to the high incidence observed, this research focused on identifying system and household characteristics that increased a homeowner's susceptibility to lead in water. However, 1% of households had elevated lead concentrations after flushing for five minutes, which highlighted potential sources of lead release beyond the faucet. Therefore, a follow-up study was conducted to investigate sources and locations of lead release throughout the entire plumbing network.
Using profiling techniques (i.e., sequential and time series sampling), three patterns of waterborne lead release were identified: no elevated lead, or lead elevated in the first draw of water only (Type I); erratic spikes of particulate lead mobilized from plumbing during periods of water use (Type II); and sustained detectable lead concentrations (>1 ppb) even with extensive flushing (Type III). Lastly, emphasis was given to understanding potential lead leaching from NSF Standard 61 Section 9 certified lead-free plumbing components, as the synthetic test water is not representative of water quality observed in private water systems. Overall, this dissertation research provides insight into a population that is outside the jurisdiction of many federal agencies.
- Comparing Alternative Methods of Simulating Bacteria Concentrations with HSPF Under Low-Flow Conditions
Hall, Kyle M. (Virginia Tech, 2007-09-03)
During periods of reduced precipitation, flow in low-order, upland streams may be reduced and may stop completely. Under these "low flow" conditions, fecal bacteria directly deposited in the stream dominate in-stream bacteria loads. When developing a Total Maximum Daily Load (TMDL) to address a bacterial impairment in an upland, rural watershed, direct deposit (DD) fecal bacteria sources (livestock and wildlife defecating directly in the stream) often drive the source-load reductions required to meet water quality criteria. Due to limitations in the application of existing watershed-scale water quality models, under low-flow conditions the models can predict unrealistically high in-stream fecal bacteria concentrations. These unrealistically high simulated concentrations result in TMDL bacteria source reductions that are much more severe than what actually may be needed to meet applicable water quality criteria. This study used the Hydrological Simulation Program-FORTRAN (HSPF) to compare three low-flow DD simulation approaches and combinations (treatments) on two Virginia watersheds where bacterial impairment TMDLs had been previously developed and where low-flow conditions had been encountered. The three methods, Flow Stagnation (FS), DD Stage Cut-off (SC), and Stream Reach Surface Area (SA), have all been used previously to develop TMDLs. A modified version of the Climate Generation (CLIGEN) program was used to stochastically generate climate inputs for multiple model simulations. Violations of Virginia's interim fecal coliform criteria and the maximum simulated in-stream fecal coliform concentration were used to compare each treatment using ANOVA and Kruskal-Wallis rank sum procedures.
Livestock DD bacteria sources were incrementally reduced (100%, 50%, 15%, 10%, 5%) to represent TMDL load reduction allocation scenarios (allocation levels). Results from the first watershed indicate that the FS method simulated significantly lower instantaneous criterion violation rates at all allocation levels than the Control. The SC method reduced the livestock DD load compared to the Control, but produced significantly lower instantaneous criterion violation rates only at the 100% allocation level. The SA method did not produce significantly different instantaneous criterion violation rates compared to the Control. Geometric mean criterion violation rates were not significantly different from the Control at any allocation level. The distributions of maximum in-stream fecal coliform concentrations simulated by the combinations SC + FS and SC + SA + FS were both significantly different from the Control at the 100% allocation level. The second watershed did not produce low-flow conditions sufficient to engage the FS or SC methods. However, the SA method produced significantly different instantaneous violation rates than the Control at all allocation levels, which suggests that the SA method continues to affect livestock DD loads when low-flow conditions are not simulated in the watershed. No significant differences were found in the geometric mean violation rate or distribution of maximum simulated in-stream fecal coliform concentrations compared to the Control at any allocation level. This research suggests that a combination of the SC and FS methods may be the most appropriate treatment for addressing unrealistically high concentrations simulated during low-flow conditions. However, this combination must be used with caution as the FS method may increase the maximum simulated in-stream fecal coliform concentration if HSPF simulates zero volume within the reach.
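As a minimal illustration of the violation-rate metric compared across treatments above, the sketch below counts exceedances of assumed fecal coliform criteria. The instantaneous limit of 400 cfu/100 mL and geometric mean limit of 200 cfu/100 mL are assumptions for illustration; the actual Virginia interim criteria and averaging rules are defined in the state water quality standards.

```python
import math

# Assumed criteria (illustrative values; consult Virginia's water
# quality standards for the actual interim fecal coliform criteria).
INSTANTANEOUS_LIMIT = 400.0   # cfu/100 mL, single-sample maximum
GEOMEAN_LIMIT = 200.0         # cfu/100 mL, geometric mean

def instantaneous_violation_rate(concs):
    """Fraction of simulated concentrations exceeding the single-sample limit."""
    return sum(c > INSTANTANEOUS_LIMIT for c in concs) / len(concs)

def geometric_mean(concs):
    """Geometric mean of a window of concentrations (all values must be > 0)."""
    return math.exp(sum(math.log(c) for c in concs) / len(concs))

# Example: hypothetical daily simulated fecal coliform concentrations (cfu/100 mL)
daily = [120.0, 850.0, 90.0, 40.0, 1500.0, 60.0, 200.0, 330.0]
print(instantaneous_violation_rate(daily))  # 0.25 (2 of 8 samples > 400)
print(geometric_mean(daily))
```

Rates like these, computed per treatment and allocation level across many stochastically generated climate traces, are what the ANOVA and Kruskal-Wallis procedures compare.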
- Comparing Two Methods for Developing Local Sediment TMDLs to Address Benthic ImpairmentsWallace, Carlington W. (Virginia Tech, 2012-04-23)Excessive sedimentation is a leading cause of aquatic life use impairments in Virginia. As required by the Clean Water Act, a total maximum daily load (TMDL) must be developed for impaired waters. When developing a TMDL for aquatic life use impairment where sediment has been identified as the primary pollutant, the target sediment load is often determined using a non-impaired reference watershed, i.e., the reference watershed approach (RWA). The RWA has historically been used in Virginia to establish TMDL target sediment loads because there is no numeric ambient water quality criterion for sediment. The difference between the sediment load generated by the reference watershed and the load generated by the impaired watershed is used to determine the sediment load reduction required to meet the TMDL target load in the impaired watershed. Recent quantification of the Chesapeake Bay TMDL based on Phase 5.3 of the Chesapeake Bay Watershed Model (CBWM) offers a simpler and potentially more consistent method of calculating target sediment loads for impaired watersheds within the Chesapeake Bay watershed. Researchers in the Biological Systems Engineering department at Virginia Tech have developed the "disaggregate method" (DM), which uses landuse inputs to, and pollutant load outputs from, the CBWM to determine pollutant load reductions needed in watersheds whose areas are smaller than the smallest modeling segments generally used in the CBWM. The DM uses landuse-specific unit area loads from two CBWM model runs (an existing condition run and a TMDL target load run) and a finer-scale, locally assessed landuse inventory to determine sediment loads. The DM is simpler and potentially more consistent than the reference watershed approach.
This study compared the reference watershed approach and the disaggregate method in terms of required sediment load reduction. Three sediment-impaired watersheds (Long Meadow Run, Taylor Creek and Turley Creek) within the Chesapeake Bay watershed were used for the study. Study results showed that the TMDL development method used to determine sediment loads would have noticeable effects on resulting sediment-load reduction requirements. For Taylor Creek, the RWA required 20.4 times greater reductions in sediment load (tons/yr) when compared to the DM. The RWA also required 9.2 and 10.4 times greater reductions for Turley Creek and Long Meadow Run watersheds, respectively. On a percentage basis, the reduction called for by the RWA for Taylor Creek was 7.3 times greater than that called for by the DM. The RWA called for 4.4 and 4.6 times greater percent reductions for Turley Creek and Long Meadow Run watersheds, respectively. An ancillary objective of this research was to compare the sediment load reductions required for the impaired watersheds and their respective RWA-reference watersheds, using the DM. This comparison revealed that both Taylor Creek and Turley Creek watersheds required less sediment load reduction than their respective reference watersheds, while the load reduction required for Long Meadow Run was slightly greater than that for its reference watershed. There are several issues associated with both the RWA and the DM for developing sediment TMDLs. Those issues are discussed in detail. Recommendations for further studies, based on questions raised by the research presented here, are also discussed.
- A comparison of runoff quantity and quality among three cattle stocking treatmentsWilliams, Emily Diane (Virginia Tech, 2014-03-11)Measurements of runoff quantity and quality from three cattle stocking treatments applied to pastureland in southwestern Virginia indicate the need for further research to determine treatment effects. Three cattle stocking treatments ((1) Continuous, (2) Rotational, and (3) Mob) were applied to three pastures at the Virginia Tech Prices Fork Research Farm. Rainfall simulations were performed over replicated plots in each treatment to induce runoff for collection of runoff quantity and quality data during the 2012 grazing season. Additionally, rainfall simulations were performed prior to applying the grazing treatments to establish initial conditions. Monitored runoff quantity and quality response variables included runoff depth, mean nutrient concentrations, and nutrient mass loss. Response variables were compared among the three pastures for initial conditions and among treatments for post-treatment conditions. Additionally, the trends in response variables within the 2012 season were compared among treatments. Plot and rainfall conditions that were expected to influence responses were also collected and analyzed in relation to response variables. Analyses of the response variables suggested that the variability within treatments likely muted any treatment effect on the response variables. Therefore, we concluded that further research is needed to determine treatment effects on runoff quantity and quality.
- Comparison of Two Alternative Methods for Developing TMDLs to Address Sediment ImpairmentsWallace, Carlington W.; Benham, Brian L.; Yagow, Eugene R.; Gallagher, Daniel L. (2018-12)While excessive sediment is a leading cause of aquatic life use impairments in free-flowing rivers in Virginia, there is no numeric sediment-water quality criterion. As a result, total maximum daily load (TMDL) sediment loads are often established using a comparable, nonimpaired reference watershed. Selecting a suitable reference watershed can be problematic. This case study compared the reference watershed approach (RWA), which uses the Generalized Watershed Loading Function model, with the disaggregate method (DM), which uses output from Phase 5.3 of the Chesapeake Bay Watershed Model. In this case study, the two methods were used to develop sediment TMDLs for three impaired watersheds in Virginia (Taylor Creek, Turley Creek, and Long Meadow Run). In this case study comparison, the RWA required between 12.8 and 14.7 times greater sediment load reductions (t/year) to reach the TMDL load (Taylor Creek > Long Meadow Run > Turley Creek) when compared to the reductions called for using the DM. While each TMDL development method has inherent limitations, the DM establishes TMDL target loads using output from the Chesapeake Bay Watershed Model, which restricts its application to the Chesapeake Bay watershed.
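The disaggregate-method arithmetic described in the abstract above can be sketched as a sum of landuse-specific unit-area loads over a locally assessed landuse inventory. All values below are hypothetical placeholders, not actual CBWM output or inventory data:

```python
# Hypothetical unit-area sediment loads (t/ha/yr) from two CBWM runs:
# an existing-condition run and a TMDL target-load run.
existing_ual = {"cropland": 2.8, "pasture": 0.9, "forest": 0.1, "developed": 0.6}
target_ual   = {"cropland": 1.1, "pasture": 0.5, "forest": 0.1, "developed": 0.4}

# Hypothetical locally assessed landuse inventory for an impaired watershed (ha).
areas = {"cropland": 400.0, "pasture": 900.0, "forest": 1500.0, "developed": 200.0}

def watershed_load(ual, areas):
    """Total sediment load (t/yr): sum of unit-area load x landuse area."""
    return sum(ual[lu] * areas[lu] for lu in areas)

existing_load = watershed_load(existing_ual, areas)   # existing-condition load
target_load = watershed_load(target_ual, areas)       # TMDL target load
reduction = existing_load - target_load               # required reduction (t/yr)
percent_reduction = 100.0 * reduction / existing_load
```

The RWA differs in that the target load comes from modeling a separate, nonimpaired reference watershed rather than from a target-condition run of the same watershed, which is one source of the large differences in required reductions reported above.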
- Comparison of watershed boundaries derived from SRTM and ASTER digital elevation datasets and from a digitized topographic mapPryde, J. K.; Osorio, J.; Wolfe, Mary Leigh; Heatwole, Conrad D.; Benham, Brian L.; Cardenas, A. (2007)Watersheds are natural integrators of hydrological, biological, and geological processes and as such require an integrated approach to data analysis and modeling, which usually begins with accurately delineating a polygon vector layer of watershed boundaries as input. Accordingly, the Río Illangama watershed in Alto Guanujo, Ecuador, was delineated with the objective of evaluating the accuracy of watershed boundaries derived from three different sources: one was delineated by hand, and the other two were derived from a 30-m ASTER DEM and a 90-m SRTM DEM using the Spatial Analyst extension of ArcGIS. Visually, there are small differences between the manually delineated and SRTM-based boundaries, while the ASTER-based boundary deviates from the manually delineated one. The area of the manually delineated watershed is 13,061.3 ha, while the SRTM-based and ASTER-based watersheds are 0.66% and 2.6% larger, respectively. The regression analyses comparing the complete boundaries yielded an R2 of 0.999 between the SRTM and manual boundaries and 0.988 between the ASTER and manual boundaries. The t-test comparing DEMs indicated a significant difference (p
- Comparison Watershed Selection When Applying the AllForX Approach for Sediment TMDL DevelopmentBronnenkant, Kristine Nicole (Virginia Tech, 2014-04-15)This study compared physical characteristics used when selecting comparison (healthy) watersheds for the All-Forested Load Multiplier (AllForX) Approach, and examined a quantitative watershed characteristic as a selection criterion. The AllForX Approach uses a regression relationship between Virginia Stream Condition Index (VSCI) scores and AllForX values (a unit-less multiplier that is the ratio of a modeled existing sediment load to a modeled all-forested load) for an impaired watershed and several comparison watersheds to develop sediment TMDL target loads. The Generalized Watershed Loading Function (GWLF) model was used to simulate sediment loads for 20 watersheds (four impaired and 16 comparison) in the Upper James and New River basins in Virginia's Ridge and Valley physiographic region. Results suggest that within Virginia's Ridge and Valley physiographic region it may be possible to select comparison watersheds that are of a different stream order (watershed size) and lie in different river basins from the impaired watershed. Results further indicated that the topographic index (TI) distributions were not different across the modeled watersheds, indicating the watersheds are hydrologically similar. These results support selecting comparison watersheds regardless of river basin or stream order within Virginia's Ridge and Valley physiographic region. Finally, there was no statistical difference between the AllForX regressions when using the entire period of record or the two most recent VSCI data points. Therefore, for the watersheds modeled for this study, either all of the VSCI samples or the two most recent may be used in the AllForX Approach.
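The AllForX target-setting step described above can be sketched as an ordinary least-squares regression of VSCI score on AllForX value across comparison watersheds, inverted at an aquatic-life threshold to yield a target multiplier. The (AllForX, VSCI) pairs, the VSCI threshold of 60, and the all-forested load below are all assumptions for illustration, not values from the study:

```python
# Hypothetical (AllForX, VSCI) pairs for comparison watersheds; real values
# would come from GWLF sediment modeling and benthic monitoring.
data = [(1.2, 78.0), (1.8, 72.0), (2.5, 66.0), (3.4, 59.0), (4.1, 52.0)]

def least_squares(points):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

slope, intercept = least_squares(data)

# Invert the regression at an assumed VSCI impairment threshold of 60 to get
# the target AllForX multiplier; the TMDL target sediment load is then the
# modeled all-forested load times this multiplier.
VSCI_THRESHOLD = 60.0
target_allforx = (VSCI_THRESHOLD - intercept) / slope
all_forested_load = 250.0  # t/yr, hypothetical GWLF all-forested load
target_load = target_allforx * all_forested_load
```

The study's comparison-watershed questions (stream order, river basin, TI distribution) concern which watersheds may legitimately contribute points to this regression.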
- Consideration of BMP Performance Uncertainty in Chesapeake Bay Program Implementation: Workshop ReportBenham, Brian L.; Easton, Zachary M.; Hanson, Jeremy; Hershner, Carl; Julius, Susan; Stephenson, Stephen Kurt; Hinrich, Elaine (Scientific and Technical Advisory Committee, Chesapeake Bay Program, 2018-02-21)Achieving Chesapeake Bay Program (CBP) nutrient and sediment reduction goals will require securing reductions largely from agricultural and urban nonpoint sources. While state and local governments rely largely on best management practices (BMPs) to achieve these goals, uncertainty surrounds the pollutant control effectiveness of these investments. Currently, the variation of BMP performance is not well documented or characterized in the CBP. Furthermore, knowledge gaps exist surrounding the sources and extent of the variation surrounding BMP performance. The purpose of this workshop was to make recommendations for improving the documentation and characterization of BMP performance uncertainty and to suggest how more detailed information on BMP uncertainty could be used to inform management decisions. Through this report, the workshop participants make several recommendations for characterizing uncertainty during the process of generating BMP effectiveness estimates (BMP Expert Panel Process). These include recommendations that the Chesapeake Bay Program partnership take measures to:
- Systematically document and represent uncertainties throughout the BMP treatment process;
- Produce information about the distribution of removal effectiveness of each BMP;
- Develop a method for simply and effectively communicating the degree and type of uncertainty across all approved BMPs; and
- Provide additional guidance for how to most effectively solicit “best professional judgment” as part of the expert panel process, including best practices for structured literature syntheses, identifying and avoiding potentially inappropriate heuristics (shortcuts) and biases when obtaining expert opinion, and expert elicitation.
- Development of a Risk Assessment Model to Assess TMDL Implementation StrategiesJocz, Robert Michael (Virginia Tech, 2012-06-18)High levels of fecal indicator bacteria (e.g. E. coli) are the leading cause of identified surface water impairments in the United States. The US Clean Water Act of 1972 requires that jurisdictions establish priority rankings for impaired waterways and develop a Total Maximum Daily Load (TMDL) plan for each. Although past research indicates that the risk of illness to humans varies by source of fecal contamination, current watershed assessments are developed according to total concentration of indicator bacteria, with all sources weighed equally. A stochastic model using Quantitative Microbial Risk Assessment (QMRA) principles to translate source-specific (e.g. human, livestock) daily average concentrations of E. coli into a daily average risk of gastroenteritis infection was developed and applied to Pigg River, an impaired watershed in southern Virginia. Exposure was calculated by multiplying a ratio of source-related reference pathogens to predicted concentrations of E. coli and a series of qualifying scalars. Risk of infection was then determined using appropriate dose response relationships. Overall, human and goose sources resulted in the greatest human health risk, despite larger overall E. coli loading associated with cattle. Bacterial load reductions specified in the Pigg River TMDL were applied using Hydrological Simulation Program-FORTRAN (HSPF) to assess the effect these reductions would have on the risk of infection attributed to each modeled bacterial source. Although individual risk sources (neglecting geese) were reduced below the EPA limit of 8 illnesses per 1000 exposures, the combined risk of illness varied between 0.006 and 64 illnesses per 1000 exposures.
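The QMRA translation described above (indicator concentration to pathogen dose to infection risk) can be sketched with an exponential dose-response model. The pathogen-to-indicator ratio, ingestion volume, and dose-response parameter below are hypothetical placeholders, not estimates from the Pigg River study:

```python
import math

def daily_infection_risk(ecoli_cfu_per_100ml, pathogen_ratio, ingestion_ml, r):
    """Exponential dose-response: P(infection) = 1 - exp(-r * dose).

    dose = E. coli concentration x pathogen:indicator ratio x volume ingested.
    All parameter values used here are illustrative, not source-specific.
    """
    pathogens_per_ml = ecoli_cfu_per_100ml / 100.0 * pathogen_ratio
    dose = pathogens_per_ml * ingestion_ml
    return 1.0 - math.exp(-r * dose)

# Hypothetical scenario: 235 cfu/100 mL E. coli, 1 reference pathogen per
# 10,000 indicator organisms, 30 mL ingested during recreation, and an
# assumed dose-response parameter r = 0.02.
risk = daily_infection_risk(235.0, 1e-4, 30.0, 0.02)
illnesses_per_1000 = 1000.0 * risk
```

In the study, source-specific versions of this calculation (with source-appropriate pathogen ratios and dose-response relationships) are what allow human and goose contributions to dominate risk even though cattle dominate the E. coli load.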
- Emerging Contaminants: Occurrence of ECs in Two Virginia Counties Private Well Water Supplies and Their Removal from Secondary Wastewater EffluentVesely, William C. (Virginia Tech, 2018-06-29)Emerging contaminants (ECs) are chemicals such as pharmaceuticals and personal care products that have been detected in various environmental matrices, including drinking water supplies at trace concentrations (ng/L-ug/L or ng/kg-ug/kg). Current wastewater treatment plant technology is largely ineffective at removing ECs. The objectives of this investigation were to: 1) determine the occurrence of ECs in private well water supplies in Montgomery and Roanoke Counties, VA; 2) quantify the concentrations of three ECs in selected private water supplies; 3) examine the relationship of water quality constituents (nitrate, bacteria, pH and total dissolved solids) to EC occurrence in private water supplies; and 4) determine the ability of the MicroEvap™, a novel wastewater treatment technology, to remove ECs from secondary wastewater effluent. In partnership with the Virginia Household Water Quality Program, 57 private water supplies were sampled and tested for the occurrence of 142 ECs and 43 other water quality constituents. Up to 73 ECs were detected in the sampled private water supplies. Higher numbers of ECs detected in the tested private water supplies were associated with nitrate >1 mg/L, total dissolved solids >250 mg/L, and the presence of total coliform bacteria. Results indicate the MicroEvap™ technology had >99% removal effectiveness for all 26 tested ECs from three secondary wastewater effluents. With the increasing detection of ECs in water bodies, it is essential to understand the occurrence of ECs and environmental predictors of EC presence in different water matrices and continue to develop water treatment technology capable of treating wastewater for EC removal.
- Estimating Uncertainty in HSPF based Water Quality Model: Application of Monte-Carlo Based TechniquesMishra, Anurag (Virginia Tech, 2011-07-28)To propose a methodology for uncertainty estimation in water quality modeling as related to TMDL development, four Monte Carlo (MC) based techniques—single-phase MC, two-phase MC, Generalized Likelihood Uncertainty Estimation (GLUE), and Markov Chain Monte Carlo (MCMC)—were applied to a Hydrological Simulation Program–FORTRAN (HSPF) model developed for the Mossy Creek bacterial TMDL in Virginia. Predictive uncertainty in percent violations of instantaneous fecal coliform (FC) concentration criteria for the prediction period under two TMDL pollutant allocation scenarios was estimated. The average percent violations of the applicable water quality criteria were less than 2% for all the evaluated techniques. Single-phase MC reported greater uncertainty in percent violations than the two-phase MC for one of the allocation scenarios. With the two-phase MC, it is computationally expensive to sample the complete parameter space, and with increased simulations, the estimates of single and two-phase MC may be similar. Two-phase MC reported a significantly greater effect of knowledge uncertainty than stochastic variability on uncertainty estimates. Single and two-phase MC require manual model calibration, as opposed to GLUE and MCMC, which provide a framework to obtain posterior or calibrated parameter distributions based on a comparison between observed and simulated data and prior parameter distributions. Uncertainty estimates using GLUE and MCMC were similar when GLUE was applied following the log-transformation of observed and simulated FC concentrations. GLUE provides flexibility in selecting any model goodness-of-fit criterion for calculating the likelihood function and does not make any assumption about the distribution of residuals, but this flexibility is also a controversial aspect of GLUE.
MCMC has a robust formulation that utilizes a statistical likelihood function, but requires normally distributed model errors. MCMC is also computationally expensive to apply in a watershed modeling application compared to GLUE. Overall, GLUE is the preferred approach among all the evaluated uncertainty estimation techniques for watershed modeling applications related to bacterial TMDL development. However, the application of GLUE in watershed-scale water quality modeling requires further research to evaluate the effect of different likelihood functions and different parameter set acceptance/rejection criteria.
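The GLUE workflow discussed above can be illustrated with a toy one-parameter model standing in for HSPF. The model, prior range, likelihood measure (Nash-Sutcliffe efficiency), and behavioral threshold below are all simplifying assumptions for the sketch:

```python
import random

random.seed(1)

observed = [2.0, 3.5, 5.1, 6.4, 8.2]  # toy observations

def model(theta):
    """Toy stand-in for HSPF: predictions depend on one parameter theta."""
    return [theta * t for t in range(1, 6)]

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, one common GLUE likelihood measure."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

# GLUE: 1) sample parameter sets from a prior (uniform here), 2) keep
# "behavioral" sets above a likelihood threshold, 3) likelihood-weight the
# retained sets to characterize parameter and predictive uncertainty.
BEHAVIORAL_THRESHOLD = 0.7  # subjective cutoff, a noted criticism of GLUE
behavioral = []
for _ in range(5000):
    theta = random.uniform(0.5, 3.0)
    score = nse(observed, model(theta))
    if score > BEHAVIORAL_THRESHOLD:
        behavioral.append((theta, score))

total = sum(s for _, s in behavioral)
weighted_theta = sum(t * s for t, s in behavioral) / total
```

The subjective choices visible here (the likelihood measure and the behavioral cutoff) are exactly the flexibility, and the controversy, that the abstract attributes to GLUE; MCMC replaces them with a formal statistical likelihood at the cost of a normality assumption on model errors.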