Speaker Info

Dr. William Dennison
Vice President for Science Application
Integration and Application Network, University of Maryland Center for Environmental Science

Email: dennison@umces.edu

More info...

Seminar Abstract

William Dennison provides a comparison of approaches used by a suite of water quality monitoring programs to operate and sustain their monitoring activities based on the first two panel discussions of the STAR BASINs review process.

Seminar Transcript

>> So what I am going to do is talk about a series of webinars that we've had to better understand our [inaudible], putting them in context with a broader suite of monitoring efforts. We call this effort Building and Sustaining Integrating Networks. First of all, we've had two webinars. One was in December, where we brought in folks from Puget Sound as well as the Great Lakes. And then in January, last month, we brought in a larger, more diverse group: two Australian examples, Moreton Bay and the Great Barrier Reef; the upper Mississippi; and a group of scientists coordinating what is called MARACOOS, the Mid-Atlantic Regional Association Coastal Ocean Observing System. It's a long acronym, but it's a group of people with technological capacity that are bringing new ways forward. I would like to talk a little bit here about one of the things that helped predicate what we're doing in this analysis, and that comes from the Science and Technical Advisory Committee, or STAC, and some of their monitoring concerns. A large part of the monitoring concern has to do with the adaptive management cycle: how monitoring fits that adaptive management cycle and provides evaluation and feedback for the various efforts that are ongoing. But it's also a switch in thought process. We have talked about monitoring for attainment of our water quality goals, but STAC is concerned that we make sure we talk about monitoring for adaptive management and how it actively feeds into that management. The other aspect that STAC has been interested in is the integration of citizen science and modern technology, and the idea that this is an opportunity to take a broad-sweep look at the monitoring, not just small tweaks but a broad sweep.
And then finally, the new [inaudible] agreement, which is within months of being signed, should clearly articulate the goals, strategies, and various objectives that will lead to specific monitoring needs. So we've got to make sure the monitoring connects into the adaptive management cycle, to the goals, and provides feedback in that fashion. So here are the case studies that we solicited from folks in Puget Sound and Moreton Bay, the upper Mississippi River, the Great Lakes, and the Great Barrier Reef. I put these on a sort of scale here from left to right. On just a crude scale, they go from about a tenth of the size of Chesapeake Bay to 20 times Chesapeake Bay, with Chesapeake in the middle of that spectrum. So there are some much smaller and some much larger, but we're looking across that span of scale to see what kinds of things we could learn. We posed five questions, and I am going to go through each of those five questions and give you my interpretation of the summary of the results. The first question we asked each of the groups was: what are the objectives and design of your monitoring network? We kind of went through our Chesapeake objectives and design: we have the water quality monitoring, the shallow water monitoring, the [inaudible], the aquatic grasses, the fisheries, and the various plankton sampling, and sort of how we went through that. What we learned from the case studies is that just about everybody has similar elements of water quality and then various habitat and fisheries components, so that was a pretty common trait. There were a couple that I thought were unique enough to pull out and highlight. One was the sewage plume mapping that Simon Costanzo [phonetic] talked about in Moreton Bay: a methodology of using stable isotopes and then watching the shrinkage of that sewage plume as the sewage upgrades went online.
And then secondly, an expanded monitoring effort in the Great Barrier Reef is being used for pressure-state-response. In other words, they are putting in a lot of effort looking at what the farmers are doing on the landscape and monitoring their progress toward achieving best practice. That represented a departure from just the state-of-environment reporting, which we have mostly concentrated on. And, of course, the Chesapeake STAC concept would be more of that pressure-state-response monitoring. The next question we asked of all groups was to describe your operations model, including innovations. For the Chesapeake, we talked about how we were doing data flow and vertical profilers and engaging citizen science; we've got regular but still qualitative remote sensing, and highly evolved reporting with report cards and barometers. In the case studies we heard that just about everybody has achieved this technical capacity through various agencies, and that there is a sort of professional cadre of people involved in the monitoring program who are delivering that science. Citizen scientists are engaged across the board, but to different levels and in different manners; each group has got some component of citizen science. Some of the groups, like Puget Sound, have what they call their vital signs wheel, a very evolved and well-presented set of metrics that connects across a broad range: they've got healthy human populations, water quality, restoring habitat, water quantity. So, a broad range of indicators. And in Moreton Bay they had a freshwater network to complement the marine monitoring, with reporting built around that, so they get report card scores for fresh water as well as marine. We also asked them to describe their business model. What we learned is that there are multiple funding sources with different mechanisms of delivery to different science providers.
So in some cases it was EPA money, or money funneled to USGS, or a variety of different ways of doing that. And each time we saw the partner organizations involved, they provided significant matching funding. That is something characteristic of the Chesapeake monitoring: for every monitoring dollar, you leverage many more dollars through the partnerships. And there's this evolution toward user-paid. A model that we have at Chesapeake Bay is that we use EPA dollars largely to fund the monitoring network. One of the models that we heard about is in Moreton Bay, where not just the regional councils but also some of the industry partners (the fertilizer company, the meat works, and the cement companies) contribute on a user-paid basis. So it's a financially shared burden on the monitoring effort. And it's connecting those potential users to the monitoring results that is key. In terms of the governance models, this turned out to be really variable. Everybody had some kind of [inaudible] chart that showed different pathways and different groupings, with few commonalities that I could see between them, partly because of the different scales as well as the different structures. They tend to be fairly complex. None of these monitoring programs is simple in execution. They all have complex governance arrangements, different jurisdictional arrangements, different science providers, and different oversight, but there was a commonality: technical oversight and review is provided through the monitoring network. So the governance model is very much seen as a way to ensure the quality of the data, and that was the theme we got throughout. Everybody wanted to see that same quality of data. Now, we're not going to monitor everywhere, so the sampling cannot be exhaustive. For example, it turns out the upper Mississippi is a series of lakes and [inaudible].
It is not a free-flowing river; it is a bunch of staircase lakes and ponds. And for sampling, you can't even go to each pond; there are so many. So you have to subsample. How do you subsample that? You have to go through the rigor of determining how to do that effectively. So this monitoring effort clearly takes a lot of organizational involvement. That is not unique to the Chesapeake Bay Program: you have a lot of people involved, and it's important to maintain the rigor throughout. And then our final question was to describe your successes and challenges. For Chesapeake, we had one that just came out a couple of days ago, the new insights report, where I would claim one of our bigger successes was integrating lots of data over many years to tell some stories. We've had the scientific basis in the Bay Program for a long time for nutrient and sediment reduction strategies. We've identified and tracked major inputs and their impacts. We've provided feedback on overall management effectiveness. Our challenges in Chesapeake have been a steady, slow erosion of funding support, a realignment or rebalancing from tidal to non-tidal, and a recent major funding shortfall. So that is what we provided from Chesapeake. Now, what I did is take some examples of successes from the other case studies. I think one of the more powerful successes of the MARACOOS group, that Mid-Atlantic Regional Association, is the kind of partnerships they form: the really solid linkages they've made with various industry partners on specific monitoring efforts. We think that is something very much worth emulating. We see some [inaudible] partnerships in the future of Chesapeake Bay monitoring, connecting with these scientists who bring a lot of technological innovation into the monitoring realm.
Another one that I alluded to earlier is expanding the monitoring to include the management response: figuring out ways to track quantitative [inaudible] management response and report back on it as well as on ecosystem health. And then there's the success in Moreton Bay of having a series of sewage upgrades, which resulted in a significant reduction of nitrogen and phosphorus flow into the Bay; the ecological response to that empowered the community to take on harder, more difficult challenges like diffuse sources. As for the challenges: in all cases we were looking, maybe naively when we started this webinar series, for novel solutions. Where are these magic pots of money that people are going to get for monitoring? How do they maintain a sustained funding effort for monitoring? The answer, resoundingly, was: "There ain't no such thing." Monitoring money is hard to find, secure, and maintain over time. It takes diligence and effort, persistence and patience. We also heard in the Moreton Bay example, after about 13 years of report cards, of report card fatigue. People get tired of saying [inaudible] or bad news. We've seen that in Chesapeake already, and we haven't been doing it as long. So it's something we need to watch out for and anticipate. We also saw that some of the programs that aren't as mature as Chesapeake, like the Great Lakes and the upper Mississippi, are still selecting which indicators, out of a vast [inaudible] of indicators, to really focus on and use. I think the Great Lakes has 70-some indicators, and they are trying to figure out which of those they need to track and report. So that's clearly a common issue among monitoring groups. Those are the kinds of things that we saw from those different webinars. The other aspect that we share with all these groups is that fieldwork is really quite expensive. There was no magic associated with that either.
It's people, and it's equipment, and it's vehicles and boats. The data analysis is time intensive: you need to develop and maintain databases and do statistical analyses. And these recurring costs are subject to inflationary pressures, so monitoring programs funded today are going to be challenged with constant dollars tomorrow. So we used this opportunity to reflect a little bit on the Chesapeake efforts. We have had about a quarter century of good long-term monitoring, and in that time period we've been able to identify [inaudible] causes and impacts, particularly the so-called dead zone, the hypoxia and anoxia in the bottom waters due to nutrient enrichment and [inaudible] production and decomposition. We also use the monitoring data sets to detect and explore the impacts of climate change. We have seen that Chesapeake Bay is getting warmer, it's getting saltier, it's getting deeper. On the status and trends of key indicators, in some cases we have things that are improving; our nutrient concentrations are improving. But looking at monitoring data over time also gave us the opportunity to bring into focus the decline in water clarity that we've seen over the bay over the last couple of decades, which has stimulated research. A couple more highlights: the ecological thresholds, or tipping points, have been articulated, along with the ecological feedbacks, both positive and negative, that stimulate those tipping points. We use these monitoring data as input to public dissemination, to report cards and barometers, and to feed into the research. And we've used it to assess our water quality for our current management of the total maximum daily load mandate for water quality improvements.
Some of the convergent frameworks that we've seen across a couple of these scales: these are just three examples of using the stoplight colors that the Great Lakes, Chesapeake Bay, and Puget Sound are using in different kinds of formats. There's also an attempt to look at trends in different ways. We've got trajectories mapped, we've got arrows, and there are different ways to do that. But in all cases a public format, a public dissemination of that reporting, is common to all of the examples that have been developed. So I've got a couple of things that I want to talk about in these next three to five slides: why we think we need to explore these monitoring efforts more fully. One is that we need this institutional monitoring to provide the skeletal backbone for any additional monitoring like citizen science monitoring. Citizen science monitoring is great, but you need the skeletal backbone. You are always going to rely on high quality, timely, accessible data with continuity to make important management decisions, and piecemeal data doesn't replace this integrated monitoring. We can't see the patterns effectively without the institutional monitoring. The other part of this that we have come to appreciate is that adaptive monitoring is part of adaptive management. We all know about the importance of adaptive management, but monitoring has to be adaptive too; it needs to be something that evolves. And that's one of the reasons we're initiating these webinars to rethink monitoring. Citizen science can augment but not entirely replace the institutional monitoring. You need the coordination and the training; there's personnel turnover, which means you are always training; there are quality assurance and quality control issues; you need the continuity; and there are some difficult or dangerous locations where trained people are needed. And yet, on the positive side, there's tremendous untapped potential.
So we see this as a growth area, with these caveats that need to be overcome to make it an effective monitoring effort. And then technology. We've got in situ technologies, we've got gliders that fly in the water, we've got satellites and aircraft that sense remotely. We've got to remember they're not free. These gadgets are expensive. They aren't maintenance free; they require calibration and maintenance and operational costs. And some features you still need to be on site to sample. Yet one of the exciting things, and I think the MARACOOS case study was the best demonstration of this, is that technological innovation provides new partnership opportunities. If we can use this effectively, we can create new opportunities for data acquisition. So here is my final slide, and it tries to bring some of these points together from these webinars. First of all, one of the realizations is we don't do this enough with monitoring. We have a very elaborate and highly evolved scheme of sharing research results through scientific journals and conferences and workshops, but we haven't done that nearly as much for monitoring, apart from perhaps just the water quality bits and pieces. This broader view of monitoring has not had that kind of peer sharing, and one of the things we've learned is that we need to grow that. We also recognize that we've got to be a little more cognizant of the terminologies we use. Monitoring and assessment mean something to a scientific community; they don't necessarily mean much to the managers. It sounds boring. It sounds hard. It doesn't sound interesting. One example we developed in one of the discussions and webinars: let's not use the word monitoring; let's use the words intelligence gathering. So we've got to be cognizant of the fact that we are using monitoring in a broader context, for a broader audience than just the scientific audience.
The funding and [inaudible] challenge is common. So we're not alone, and we recognize that it's going to be an ongoing battle on which we're just going to have to maintain diligence. We also saw broad engagement: there are multiple stakeholders involved, and there are different reporting mechanisms. There's no real one way of doing it, no particular best practice. I think we've got a very elaborate scheme in Chesapeake Bay of engaging various stakeholder groups, but there are some ideas we saw, like the vital signs wheel in Puget Sound, that we really ought to evaluate as potential ways to improve our reporting. And I think the overarching theme we saw from all the case studies is the critical need to connect monitoring results to management actions. Not only is it a need, it's a current gap. It's not something that anybody feels they've achieved in any really satisfactory way, except in a few places and pieces. But we really need to keep our focus on using monitoring to connect these results to management actions. Each of these basin webinars is on our Chesapeake Bay website, where you can look in detail at all those different case studies that I have just glossed over in this very short overview. So I invite you to go look at that. We still have one more webinar planned, and we are going to continue this conversation and discussion about future monitoring needs and ways to best serve the adaptive management of Chesapeake Bay and restore this ecosystem. So I'll end it there. Thank you.

Seminar Discussion

Coming Soon