Speaker Info

Ethan Weikel
Deputy Director
USGS MD-DE-DC Water Science Center

Email: eweikel@usgs.gov


Ethan Weikel is the Deputy Director of the USGS Maryland-Delaware-DC Water Science Center. He previously worked for the US Army Corps of Engineers and in the private sector. He holds a Bachelor of Science in Geology with a focus on Structural Geology from the College of William and Mary and completed graduate coursework in Engineering-Geotechnics at the Missouri University of Science and Technology. Ethan is a registered Professional Geologist in Virginia, California, Pennsylvania, and Delaware. He formerly served as a senior technical expert and advisor for the Corps in the areas of hydrogeology and applied physical science. Ethan was chosen as USACE Innovator of the Year in 2015 for the design and implementation of a lightweight, portable field in-situ thermal conductivity test unit.


Seminar Abstract

This presentation highlights decreasing trends in funding for certain federal programs and then presents an adaptive management strategy that helps move projects forward. The data used in the presentation are from groundwater cleanup projects where the Environmental Protection Agency and the Department of Defense, among others, are stakeholders. The strategy, as detailed in a case study from one of the projects where it was implemented, is a process that leverages a handful of actionable practices focused on timely, critical, and cost-reducing outcomes, with the result of being able to pursue additional tasks or apply the savings to other programs.

Seminar Transcript

>> Ethan Weikel: Thank you. So one of the roles I had prior to joining USGS was as a technical lead for a number of multi-million-dollar groundwater cleanups. And for a number of those, you know, there were financial limitations, right? So I'm going to go through kind of a process, and then a particular case study, and then bring it back to some actionable practices, and hopefully you'll find maybe there are some things that resonate with you, and maybe some parallels that might be useful. So like I said before, we're going to cover a case study. But right before that, I'm going to talk to you about funding from the Department of Defense and the Superfund environmental cleanup program versus calculated cost to complete and projected lifecycle cost. And then, what strategies we can use to address a particular disconnect between funding versus projected cost, and finally, what some implementation of those different strategies might look like. So a lot of data right here up front, but what I'd like for you to focus on first is this top line. This is for the Department of Defense's Environmental Restoration Program, and this is their estimated liability, the cost-of-cleanup liability over time. And you can see, in general, it's going down. That's a good thing, all right? This is the delta year to year, the decrease from one year to the next, and larger numbers are better. Smaller numbers mean we're not making as much progress as maybe we should toward a particular liability, and we can talk about why in a second. And a negative number means that your liability went up. There's something that's happened that has made the work that you have to complete more expensive overall. This is the annual restoration funding. These numbers are all in billions of dollars here, all right? So this is the annual funding that you can get, and you can see, generally, that has declined as well over the years.
And down at the bottom, you see this is basically the progress, given the annual funding and the change in the cost-to-complete liability. So back in FY09, for $2.02 or $2.08 billion, there was a reduction of $2.2 billion. So that was good. That's a positive outcome for us. That means we're making progress. For every dollar that we put in, there's some decrease in liability, but over the years, you can see we kind of get to a negative, and that's not so good. So our liabilities are increasing, or not decreasing as much as they could per dollar that we're putting into it. All right, these are EPA Superfund appropriations. Again, you can see this nice downward trend. This red dashed line here is due to BRAC funding. And so in 2009, there was an extra chunk of money, but that was a one-time shot, so I'm not really including it here. These values are in millions; we can move the decimal place in our heads and see, you know, we're at somewhere between $1 and $1.5 billion a year. On the other side of this, there are some state liability costs, and I use Colorado just because they happen to have particularly good data with regard to what their projected costs into the future are, as far as their responsibility for some of their CERCLA sites, and you can see their costs are going up and up, while the federal funding is going down and down. So why is there this disagreement between the cost-to-complete liability and the funds that are expended or projected for a site? Well, one I mentioned before: there can be newly discovered liabilities. Somebody spills something new. Some contaminant comes out or emerges that we didn't know about before. So, you know, in theory, this liability should be decreasing over time, but there are some really good reasons why it would increase. Also, sometimes the previous source investigations that were done might not have been entirely accurate or precise enough. Another reason is the nature of remediation itself.
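The "progress per dollar" arithmetic described above can be sketched in a few lines. This is only an illustration of the bookkeeping the speaker walks through; the figures below are the approximate FY09 values mentioned in the talk (in billions of dollars), not an official dataset, and the function names are my own.

```python
# Illustrative sketch of the year-over-year liability "delta" and the
# progress-per-dollar ratio described in the talk. Figures are approximate
# values from the presentation, in billions of dollars.

def liability_delta(prior_liability, current_liability):
    """Year-over-year reduction in cleanup liability (positive = progress)."""
    return prior_liability - current_liability

def reduction_per_dollar(delta, annual_funding):
    """Liability reduction achieved per dollar of annual restoration funding.

    Above 1: each funded dollar retired more than a dollar of liability.
    Negative: liability grew despite spending (e.g., newly discovered sources).
    """
    return delta / annual_funding

# FY09: roughly $2.08B in funding produced about a $2.2B liability reduction.
print(reduction_per_dollar(2.2, 2.08))          # a little over 1: good progress

# A hypothetical later year where liability actually rose flips the sign.
print(reduction_per_dollar(liability_delta(58.6, 59.1), 1.3))  # negative
```

The point of the ratio is exactly the speaker's: when it drifts below 1, or goes negative, spending is no longer keeping pace with the liability.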
Frequently, and that's what I'm showing here, in the early time, the reduction that you can get in contaminant concentrations is high, and as you move toward the out years in your project, it's much more difficult to attain a significant drop in concentration. So it's just the fact that the largest reductions in contaminants come early on. Secondly, there was a move toward monitored natural attenuation and long-term monitoring. The thinking is that once we knock things down low enough, under ideal geochemical conditions, contaminants will degrade on their own. We don't have to worry about them. We'll just let this happen, and we'll do long-term monitoring, all right? But that has a lengthy time frame. And then lastly, the method of calculating cost to complete. It might seem kind of counterintuitive, but these costs to complete, a lot of times, were based on a set project lifespan, just for ease of calculation. So I might assume that I'm going to calculate 30 years, even though the cost to actually get to closure, where the site is usable again and meets all applicable or relevant and appropriate requirements, might be 75 years out. You know, I don't know exactly what those are. So there's a difference there. And this is reflected in the fact that in FY14, the Department of Defense estimated their environmental liability at $58.6 billion, but the cost-to-complete calculations for how much money they thought it was going to take to resolve these issues came to only $27.2 billion. So right there, there's a bit of a difference. There are some other factors for specific projects that play into this, too: an assumption of better remedial technologies in the future, advancements, things like that. So another one: in 1994, EPA estimated the environmental liability for Superfund sites at $75 billion and completion in 2075.
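The gap between a fixed-horizon cost-to-complete and the true lifecycle cost can be seen with a back-of-the-envelope sketch. The annual cost here is purely hypothetical and the arithmetic is deliberately crude (no discounting, flat annual cost); it is meant only to mirror the 30-year-versus-75-year example above, not to reproduce any agency's actual accounting method.

```python
# Hypothetical sketch: cost-to-complete computed over a convenient 30-year
# horizon versus the full remedial lifecycle to closure. Flat annual cost,
# no discounting; units are arbitrary.

annual_cost = 1.0          # hypothetical annual monitoring/O&M cost
calc_horizon_years = 30    # horizon assumed "for ease of calculation"
actual_horizon_years = 75  # years until the site actually reaches closure

cost_to_complete = annual_cost * calc_horizon_years   # what gets reported
lifecycle_cost = annual_cost * actual_horizon_years   # what closure really costs

# The truncated estimate captures well under half of the real liability.
print(cost_to_complete / lifecycle_cost)  # 0.4
```

Under these toy assumptions the reported figure is only 40% of the lifecycle cost, the same flavor of mismatch as the $27.2 billion cost-to-complete against the $58.6 billion estimated liability.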
This is probably a little bit better estimate of how long it's going to take them to get there and how much, but, you know, if I go back to this graph of how much money they're spending and when they started, they're still not going to get to the end by 2075 at that cost. All right, so I think I've shown pretty well that costs are going to outpace our funding. So what strategies can we use to reduce our costs overall and still achieve the project goals? And what I'm showing here on the slide is an adaptive management strategy, and it's really too vague on its own to really work for us. So I'm going to try and break it down a little bit more into some actionable things that we used on site after site after site to try and bring the project to the point where we were seeing reductions, reductions in costs and contaminants at the same time. First, you need to understand why your contaminant levels are not decreasing. That seems like it should be self-evident, but when you get into the mode of "I'm just monitoring, I'm just monitoring," then it gets really hard to do the analysis and figure out why things aren't working like they're supposed to. Secondly, you must focus funding where the biggest long-term impacts can be made while mitigating risk, and I'll talk about that more in a little bit. Third, you must only monitor what is necessary for remedy performance and protectiveness, and I understand this is a little bit different for the work that you do versus the work that I was doing, but I think there are some parallels that we can look at. Fourth, you must use modern sampling and characterization tools, and fifth, aggressive, precise actions can reduce our costs and remedial time frames. So I'm going to get into each of those topics a little bit more. First, did the source investigations precisely identify what the problem was? And I'll show a picture here of this outflow from these culverts, and there are buckets on there.
Maybe the contamination came from there historically, but maybe it came from the roadway. Or maybe it came from the sewer system. We don't really know, and so we can make some assumptions about, you know, where that source came from. Did we get it right? That's something you need to take a look at down the road. Then, did the remedy that you implemented at that source actually effectively and precisely target the source? Right? And you can only do that by going back and looking at exactly what was done, instead of assuming, well, we just dug it all up. It's good. We got it all. Was the remedial action actually effective? Did you do the monitoring of the remedy itself, or was your monitoring more broadly focused on the extent of the contamination? Were incorrect data or assumptions used in the modeling of the remedial timeframe? This is a big one we're talking about: you need to figure out how long you're going to have to spend this money on a project. If the data that went into the model weren't good, of course I'm going to get bad results out. And then lastly, does the monitoring accurately reflect the subsurface conditions? And I could have just said, does the monitoring accurately reflect the conditions? Okay, so we need to focus funding where we can get the biggest impact while still mitigating the risks. Sometimes stakeholders on projects had a tunnel vision that focused their efforts on just those items that had the most risk, while not paying as much attention to the other parts of the project that were more costly or lengthy. One way to put this would be looking at the shiny object over here while ignoring the elephant in the room. Second, the need was there to avoid a stepwise approach where we can't do C until we've finished A and B.
There was a very methodical reason for why some of the stuff was being done, but in a lot of cases, it got them into the situation where there was no other way to operate, in the minds of the parties involved, than to do X, then Y, then Z. And lastly, you know, when we delay addressing some of those big and costly impacts, then we can't realize the advantage of other strategic mitigations that we might be able to implement. Okay, monitoring what's necessary for remedy performance and effectiveness. Optimization of monitoring, as you know, should be a part of every remedy that we implement. And in order for this to work out, that optimization approach needs to be applied consistently, must be well documented, must be logic-based, and must have evaluation capabilities on the end of it: What does this mean for my project? What happens if I just stay the course that I'm on? And then lastly, you must have stakeholder buy-in on the goals, or the potential outcomes, what you get out of that optimization, are not likely to be accepted. All right, and then use of modern sampling and characterization tools. On the project that I'm going to talk about, and many of the other ones I worked on, there are a lot of other technologies out there, but if you're heavily invested in a particular sampling method, it can be very hard to step away from that. In this instance, they were doing a lot of traditional low-flow groundwater sampling, but that was never meant to be the gold standard by which everything else was compared. It was being treated like that, though, because that's what they had been doing for years and years and years. So instead of worrying about what the exact answer was, they were looking at: how does this compare with the data that we collected? Again, along those same lines, other methods for groundwater data collection.
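A logic-based, documented monitoring optimization of the kind described above can be sketched very simply: classify each well by its recent concentration trend and keep frequent sampling only where remedy performance is still in question. Everything in this sketch is hypothetical, the well names, concentrations, the crude trend rule, and the per-sample cost; a real optimization would use formal trend statistics and site-specific decision logic.

```python
# Minimal sketch of a logic-based monitoring optimization.
# Well names, concentrations, and the per-sample cost are hypothetical.

COST_PER_SAMPLE = 1500  # hypothetical all-in cost per sampling event, USD

def trend(concs):
    """Crude trend call from recent results: 'decreasing' only if every
    result is at or below the previous one."""
    decreasing = all(b <= a for a, b in zip(concs, concs[1:]))
    return "decreasing" if decreasing else "not decreasing"

def optimize(wells, current_events_per_year=4):
    """Drop stably decreasing wells from quarterly to annual sampling;
    keep quarterly sampling where the trend is unclear or increasing.
    Returns (events-per-year plan, estimated annual savings)."""
    plan, savings = {}, 0
    for name, concs in wells.items():
        if trend(concs) == "decreasing":
            plan[name] = 1  # annual
            savings += (current_events_per_year - 1) * COST_PER_SAMPLE
        else:
            plan[name] = current_events_per_year  # keep quarterly
    return plan, savings

wells = {
    "MW-01": [120, 95, 80, 62],  # steadily decreasing -> annual
    "MW-02": [15, 18, 22, 30],   # increasing -> keep quarterly
}
plan, savings = optimize(wells)
print(plan, savings)  # {'MW-01': 1, 'MW-02': 4} 4500
```

The "evaluation capability" the speaker asks for is the second return value: the plan comes with an explicit estimate of what changing (or not changing) costs, which is what stakeholders need in order to buy in.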
They relied on the same principles as groundwater low-flow sampling, but even though they had been studied for a long time, they weren't uniformly accepted. So acceptance was on a case-by-case basis, and we had to continue reinventing the wheel in that regard. In some cases, there was a lack of acceptance of methods due to field comparisons in the literature, and what we found there is that, in order to do real comparisons, and again, this seems like it would be just intuitive, the data that you collect need to be contemporaneous. You can't collect one set of data with one method and then go out months later and collect a data set with the other method and expect them to be exactly the same. And lastly, and really importantly, precision field characterization tools were critical to understanding discrete contamination of the subsurface. Lastly, and I think this is an important point overall, aggressive, precise actions could really reduce your costs and remedial time frames, and so, notwithstanding all of the other stuff that I've talked about, these aggressive and precise actions can really have a significant impact, and I'll show that in the case study that I'm going to talk about in a second. Overall, the science and practice of contaminant hydrogeology and [inaudible] and transport have changed significantly over the last 30 years, and so being able to take advantage of these things is pretty important if we're looking to do things cost-effectively. So one example of that is, you know, there was this thought about contamination, that it's this blob in the subsurface, and part of that was because of the sampling methods and the way we were looking at groundwater. And of course, with more modern tools, we see that it's not like that at all, and I don't expect you to be able to see all the details of this column across the bottom here. But what I'm going to point out is two things.
One, this is the electron capture detector in this particular tool, and our scale here is 30 to about 32. All right? So we can see the contamination within the subsurface is actually just within a couple of inches in a particular horizon, and it's not feet worth of contamination. But if your remedy is designed to treat feet worth of contamination, it's not going to be very effective. All right, so as I mentioned before, I worked with a number of agencies. Some of them really bought into this approach already, some not so much. I'm going to talk about one case study. This is in the central Virginia Piedmont. There are multiple operable units, or areas of contamination, at this site, and many of them are in the remedial-action operation stage. So a remedy has been decided upon. They've implemented it. Now they're seeing how well it works, or if it's working the way they thought. At this particular site, the source is mostly chlorinated solvents from former military activities. The records of decision, where they decided what remedy they were going to implement, were signed in the late 1980s through the early 1990s. We're going to focus on one operable unit in particular, with some off-site impacts, a school and some residences, just to make things really exciting. And the historical remedial activities at this particular operable unit were fairly regular. So they dug out and hauled some contamination away. They put in some wells. It was a dual-phase extraction system; that's what DPE stands for. So they're pumping out water and treating soil vapor at the same time. When the dual-phase extraction system became not very effective, they moved to monitored natural attenuation and long-term monitoring. Now, unfortunately, this remedy proved to not be very effective. And so they ended up implementing some contingency actions that were built into the record of decision. Fortunately, they did build those contingency actions in. So this was less complicated than it might have been.
All right, so here's an outline of the plume itself from 2014. It's quite large in extent. These are other residences here, and as you can see, the school is right up here. The source area for this plume is right down here at the end of the A to A-prime cross-section. It has a kind of separated plume configuration, and that's likely due to initial transport along some old infrastructure at the installation. Overall groundwater flow is basically southwest to northeast in this area. I should note that this area is very highly investigated. Every single dot that you see on here is a boring or a well, and there are hundreds of them over this fairly tiny little area. This is the geologic cross-section along that kind of south-to-north trend that we looked at before. These brown layers up here are clays and silts near the surface. You get a little deeper, and you're into silts and gravels, and then below that, you're back into some clays. And this was a cross-section based on all of these points. They're showing just the ones here that happened to hit the cross-section, but you can see there are quite a few. This is done based on standard penetration testing-type sampling, which gives us a sample 24 inches long that we can look at. But in creating a cross-section, there is quite a lot of lumping that was done. So I'm going to compare some of these more precise methods for characterization here. You can see your cross-section here that was done with standard penetration testing. Then this more high-resolution tool, and it's a membrane interface and hydraulic profiling tool. And I don't expect anybody to remember that. But what it does is collect a number of different types of data at very high precision in the subsurface, including electrical conductivity and the hydraulic profiling, where all they're doing is, there's a little port that injects water at a certain rate. So there's a pressure that builds up, and they can correlate that directly with hydraulic conductivity.
And then I can turn that into an estimated hydraulic conductivity itself, and then there's the electron capture detector tool, which tells me exactly where the contaminants are and their relative abundance, or relative concentration. You can see at the top, it fairly well matches the cross-section. We have high amounts of hydraulic pressure, which correlates to a low hydraulic conductivity, and there doesn't happen to be any contamination really to speak of in that area. We get into this silt and gravelly layer here, and you can see that our hydraulic conductivity is much higher. And this boring, when it was initially drilled, they put their boring in, they put their well in, and they collected groundwater samples afterwards, thinking that most of the contamination is in this sand and gravel unit, which is not a bad theory, figuring that it's going to travel along the highest hydraulic conductivity zones. So they terminated that boring at 25 feet when it was initially drilled. When we were doing this more precise boring, we realized that when we got to 25 feet, we had not seen any indication of contaminants. Now, this was really interesting, because the groundwater samples that we had, and this is PCE, TCE, cis-1,2-DCE, and vinyl chloride, showed some detections of chlorinated solvents in the groundwater there. So it was really odd to not have seen any. So they kept going, and they found that most of the contamination is bound up in this one little zone down here, which happens to correlate to just a couple-inch-thick zone of higher hydraulic conductivity. So it's important to use the best and most modern sampling and characterization tools. Some other issues: the efforts that had been done on the site focused a lot on the shiny object. And the shiny object in this case was the residences over here and the school.
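The point that the contamination turned out to be confined to a couple-inch zone can be illustrated with a toy version of the log interpretation: scan a high-resolution detector response versus depth and report the contiguous intervals above background. The depths, readings, and threshold here are all hypothetical, and real MIHPT logs are interpreted with the tool vendor's own software, not a script like this.

```python
# Toy sketch of picking a thin contaminated horizon out of a high-resolution
# detector log. Depths (ft) and detector responses are hypothetical.

def contaminated_intervals(log, background):
    """Return (top_ft, bottom_ft) spans where the response exceeds background.

    `log` is a list of (depth, response) pairs in increasing depth order.
    """
    spans, start, prev = [], None, None
    for depth, response in log:
        if response > background and start is None:
            start = depth                 # entering an elevated zone
        elif response <= background and start is not None:
            spans.append((start, prev))   # leaving the elevated zone
            start = None
        prev = depth
    if start is not None:                 # log ended while still elevated
        spans.append((start, prev))
    return spans

# Readings every 0.1 ft; only a ~0.1 ft (couple-inch) slice is elevated,
# even though a coarser 24-inch sample would smear it across two feet.
log = [(30.0, 5), (30.1, 6), (30.2, 5), (30.3, 90), (30.4, 120),
       (30.5, 6), (30.6, 5)]
print(contaminated_intervals(log, background=10))  # [(30.3, 30.4)]
```

A remedy sized to the thin interval this returns looks very different from one sized to the full thickness of the sand and gravel unit, which is exactly the speaker's argument for high-resolution tools.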
And so, in addition to that, there was limited effectiveness of the remedy on decreasing the concentrations, as I talked about earlier. They dug some material out. They put in a dual-phase extraction system, but it wasn't very effective, so they shut it down pretty quickly. The stakeholders in the project felt that the level of monitoring that was being done was the bare minimum, that they couldn't possibly do any less than that. However, no actual analysis had been done on whether that was the right amount or not. The plume was not stable in many areas. It was migrating to the north. It was migrating to the southeast here. Not stable in any way, shape, or form. So it is not surprising that folks in the residences and folks at the school were concerned. Lastly, and this is very important, some of the more costly liabilities or issues, in particular the source area that was causing this plume to continue to expand kind of unabated, weren't being aggressively addressed, because of focusing on the shiny objects and not being able to do more than one thing at a time. All right, so what was done? We went in and put in about 40 of these higher-resolution MIHPT borings to figure out exactly where the contaminants were. Once that was done, we put in two fairly low-flow pumping systems, where the two big red Xs are there. And that's less than 5 gallons per minute each, just to provide capture and stabilization in those specific areas. And then there were four very small-scale, targeted, enhanced reductive dechlorination sites where material was injected into the subsurface, and these are all those upside-down purple triangles there. At the same time, there was some subsurface and ambient soil vapor sampling that was done around these residences to confirm whether or not there was actually any contamination to worry about that was getting up from the subsurface, instead of just, you know, kind of waving our hands and worrying about it greatly.
Source-area directed groundwater recirculation was done in this giant red area here, and that's exactly what it sounds like. We put in wells, and we make the water go a different way than it would naturally, using a directed recirculation system, and that helps get to contaminants that might be bypassed in other ways. At the same time, there were a variety of activities that were done at other operable units to take advantage of time, basically. So this is a thing where, unless you fix some of these problems as you go along, you're going to have to continue to deal with them down the road. So you try to do as many things as you can at once. And so one of them was to do some additional injections, to do enhanced reductive dechlorination at other operable units, to reduce the need to go off-site, because the cost of getting an easement off-site is very expensive, and we'd rather use that money to actually do something, instead of just paying somebody to access their property. So there were some targeted ERD injections in other operable units to knock down source concentrations there, and then additionally, we did that monitoring optimization and sampling improvements at the three largest operable units at the site overall, and that resulted in about $250,000 in savings per year, and we directly took that money and started doing other stuff with it. All right, so this is where the rubber meets the road: it actually worked. I want to note here that remediation efforts started at the site in 1991. So in 2014 here, this yellowish outline is the 5-microgram-per-liter contour of PCE. The pinkish one is the 50, and the black is the 500-microgram-per-liter contour. So after the actions that we took, in 2016, with the same monitoring points and the same data collected, you can see the difference. One of the big takeaways here is that we were able to actually separate the plume, and now we have other natural processes working where we didn't before.
So we're cutting the length of the remedial timeframe down substantially here. Okay, so how does this translate to other types of work? What I've really talked about is a process, and one that we implemented at site after site after site. So like I mentioned before, why aren't the contaminants decreasing? Well, you know, we had a plan, but we're not seeing the results. Why is that? Is the plan going to show us the type of answers or the information we're looking for in a reasonable timeframe? Have we asked that question? When we do optimizations, we need to communicate why: if we're going to change, why are we changing? And then secondly, and this is really important, what's the cost of not changing? What's the cost of just staying the course? Because so often, these optimizations are done, and it says, here's what we think we should do, and here's how much it's going to cost. But what's left out is: what's the cost if we don't make the changes? Second, focus funding where the biggest long-term impact can be made while still mitigating risk. And we found out really quickly, we can't do everything everywhere cost-effectively. It just can't be done. And you must have buy-in to do those things where that biggest impact can be made overall. Otherwise, you're chasing things all the time if you don't address the big problems. As I mentioned before, the optimization of monitoring is really, really important. As part of that, an agreed-upon, logic-based optimization strategy must be used. You have to have buy-in on what the particular outcomes might be, in order to get people to agree. And then lastly, you need to avoid this investment pitfall and confirmation bias, and this is something that happened a lot, because you have a lot of people that have been involved with the project for a really long time, and they were, you know, well invested in the type of monitoring that was done. And it was the right thing to do three years, five years, ten years prior.
But because you've put all this effort into it, it's really hard to step back and say, okay, what do I need now? So, as I showed before, we must use those modern sampling and characterization tools. You know, the world of science is changing around us, so we need to use the most up-to-date tools. And this is the last slide: really importantly, those aggressive, precise actions can reduce the costs and remedial timeframes.

Seminar Discussion

Coming Soon