Valuation of Ecosystem Services
Introduction
Today’s lecture focuses on the valuation of ecosystem services. Throughout this course, we have spent considerable time on the concept of ecosystem services and have focused heavily on estimating the production function side of things. We have used the InVEST toolkit to establish where specific ecosystem service values are generated, producing spatially explicit information through our QGIS work. However, the entire value proposition of ecosystem services to conservation has centered on the idea that it puts a dollar value on ecosystem services. So far, we have not focused much on that monetization component. Today, we will focus on how to take InVEST outputs, which are biophysical in nature, and assign a specific dollar value to them.
Reading the Landscape: Digital Elevation Models
Understanding DEMs
The first slide presented appears familiar from our prior work in QGIS. When thinking back to what we have been using, this data represents a digital elevation model, or DEM. These models display beautiful variation that almost appears organic in nature, even though it is entirely computational. The variation in elevation across a DEM can help identify geographic locations and features.
Identifying Geographic Features
Looking at this particular DEM, we can identify several key features. The image shows approximately where the Mississippi River is located, specifically at the bend where the Minnesota and Mississippi rivers join. The St. Paul campus is located in this region, and if we followed the river’s flow, we could see it continue throughout the landscape. If we zoomed out slightly, it would become much easier to identify the location because we would see the coastlines and other definitive features.
Elevation as a Key Input
Having spent considerable time in the geospatial world, one begins to see things differently. Elevation emerges as one of the key inputs in understanding landscapes. This perspective is crucial for understanding how we approach ecosystem services from a spatial perspective and how we eventually apply value to the services provided by these landscapes.
Agenda and Framework for Today’s Lecture
Overview of Topics
Today’s lecture will cover two distinct sub-themes. First, we will discuss the different types of economic value. While these concepts are present in many parts of economics, we will emphasize the parts most relevant to ecosystem services and the more general task of putting a dollar value on nature. This understanding is crucial for the cost-benefit analyses we conduct. Second, we will discuss the methods we might use to put specific value on ecosystem services. Thus, the lecture structure moves from types of value to methods for establishing that value.
The Total Economic Value Framework
We will return to a fundamental framework throughout this lecture and fill it out as we progress. This framework presents a taxonomy of the different types of total economic value, sometimes indicated as TEV. We will slowly build up to the complete diagram, but we will start with some of the sub-components. The first component we will discuss falls on the use side of things: use value.
Use Value: The Foundation of Economic Valuation
Direct Use Value
Direct use value is the easiest of all values to think about. It covers situations where we directly consume something in nature. The direct use components of ecosystem services are things that have market value. Fish, for instance, can be bought at the store, as can roots that we might otherwise forage for in a forest. Hunting certainly has a market value insofar as people buy permits for it, though sometimes the value contained in the permit is less than the total amount people would be willing to pay. Timber is another straightforward example of direct use value with an observable market price.
Non-Consumptive Direct Use
It is worth noting that direct use does not necessarily mean that the resource actually gets used up. Thinking back to our discussion of public goods versus private goods and consumptive versus non-consumptive goods, there are all sorts of things with some degree of non-rivalness. Walking down a trail, at least initially if there are not too many people, does not consume it—it does not use the trail up, or at least not noticeably. If enough people walk down a trail, you start to get erosion, and if too many people go at the same time, it does start to be rival. Everything we discussed about the rivalness versus non-rivalness of public goods is relevant here. The key point is that an ecosystem service could be either type of good.
Rivalness in Ecosystem Services
Ecosystem services, just like rival versus non-rival goods, would have different characteristics. You would compete with other fishers to be the first to get the fish out of the lake right as the fishing season opens. So you would consume these resources, but you do not have to worry about that with a trail unless it becomes congested. The nature of the good—whether it is rivalrous or non-rivalrous—becomes important when considering direct use values.
Indirect Use Value
The Importance of Indirect Value
Where ecosystem services really start to matter is with indirect value. Indirect value is present in many parts of the economy but is especially important for ecosystem services, because many of the ways that nature provides value to us are not through direct use. Rather, indirect value covers the ways that ecosystem services support something else that has direct use value.
Forest Services and Water Filtration
A good example is the vegetation and root structure of forests. We know that forests increase water filtration. Just as in the sediment retention model, forests hold water in place rather than letting it run off across the landscape, allowing it to seep in, which ultimately increases the amount of water available to the river over time. This contributes to the direct use of water, but the valuation of the forest itself is indirect use, because we do not consume the forest in this case. We are consuming the water it provides.
Market Observability of Use Values
One thing to note is that the use values—both direct and indirect—are generally things that have an observable market value. The purchasing of fish in a grocery store gives us information about how much people actually value it. For all the reasons we have discussed before, a market price is an excellent aggregate measure of how much people care about a good. Many of these things, whether direct or indirect use, will often have a large component that can be observed directly from the market. This is quite convenient, because the method for putting monetary numbers on these ecosystem services can rely on those market values, making the process straightforward.
Option Value: The Bridge Between Use and Non-Use
Understanding Option Value
We must now discuss something that sits between the two previous categories: option value. If you take a finance course, you learn a lot about option value. Instead of paying for something you want to consume right now—like buying a fish and immediately eating it—option value means that people have a positive willingness to pay, or WTP, to preserve an option for future use.
Financial Context
In finance, options are actual contracts that give the holder the right, but not the obligation, to buy or sell a stock at a set price if certain conditions are met. Contracts like these were among the things that contributed to the 2008 financial crisis, but that is a whole different story.
Ecosystem Services and Option Value
For ecosystem services, this just says that we might put a positive value on avoiding a loss, especially when there is uncertainty or the possibility of irreversible degradation. Just as with a financial option, many people would be willing to pay a premium today for the right to enjoy the ecosystem's benefits later on. A classic case is the value of genetic resources in biodiversity-rich ecosystems. People derive all sorts of direct value from taking the genetic information of newly discovered species and figuring out how it might be useful for making drugs; this is one of the major sources of drug discovery. There is research showing it made a great deal of sense to pay people not to degrade these biodiversity-rich ecosystems, precisely because degradation might be irreversible and destroy value we could use later on. Of course, you do not know exactly what that value is—you have to do the research and development to find out—but there is direct value in preserving the possibility that the value exists.
Precautionary Principle
One of the interesting things about option value is that it provides an economic rationale for being careful or precautionary. Even when current benefits seem low, accounting for the preservation of the option for later use can raise the estimated value considerably. This bridges the gap between market values and non-market values, which we turn to next.
Non-Use Value: Beyond Market Transactions
Introduction to Non-Use Values
Referring back to our original slide, those were the use values. The second major type is non-use value. These are things for which market valuation is typically not possible. We will split this into two specific types of non-use value: existence value and bequest value.
Existence Value
Existence value represents the fact that we might be willing to pay for a resource simply to continue existing, even with no intention of ever using it. It sounds similar to option value, but option value is about preserving the option of using something, whereas existence value is purely non-use. An example would be people who will never visit the Amazon and yet still report that they would be willing to pay for conservation actions that preserve it.
The Paradox of Existence Value
From the ultra-rational economist’s perspective, this makes absolutely no sense. How can you be willing to put value on something and even pay for its preservation if you are never going to be the one who gets to consume it? Well, lo and behold, billions of dollars flow to organizations like the Nature Conservancy, where they are doing exactly this—taking money from individuals who are paying to protect nature they have heard about but will probably never go to.
Personal Experience with Tropical Ecosystems
I, for sure, will never visit the Amazon. I am terrified of going to a place like that, with its gigantic bugs and huge snakes. The closest experience I have had was a summer trip to Sri Lanka, when we went to a national park. Their parks are a lot less developed than you might expect: we showed up at the main office, and it was clear nobody had been there for weeks and weeks. We totally startled the ranger keeping track of the national park. He was surprised we wanted to buy tickets, so he dusted off some official government tickets and handed them to us.
Wild Nature and Existence Value
We went out on the trail and the whole thing was absolutely terrifying. We made it around the first turn, and there was a log across the way. We went to step over it, except the whole log was moving—there were tons of insects everywhere, millipedes and other scary things. We immediately turned around. This was not the well-manicured type of national park we tend to picture, like many U.S. national parks. So I place a positive value on the Amazon, and it is definitely a non-use value: I do not want to go there. That is an example of someone who would pay to protect a place without ever benefiting from it directly.
Charismatic Species and Conservation Funding
Another example would be charismatic species. A lot of money flows into environmental protection because of a handful of cute, cuddly, or sometimes ferocious-looking animals. The panda, the red panda, Bengal tigers—these are very charismatic. We probably care about the whole ecosystem, but something in our human value system makes us willing to pay more for an animal that is cute, fuzzy, and has features that resemble ours. That makes it far easier to drive donations than trying to raise money for cockroaches. This is a big part of why conservation funds flow toward these particular species and their habitats.
Bequest Value: Intergenerational Equity
But even existence value does not quite capture the whole picture of non-use values. When you think about what creates value for people, there is also a willingness to preserve things for future generations—hence the word bequest. This one becomes especially important as you get older and start to think about what sort of world you will leave to your children, grandchildren, or their grandchildren. It is motivated by intergenerational equity, and it is even further removed from narrowly rational economics. Existence value already takes a big step away, because you will never consume the resource yet still put a dollar value on it. Bequest value goes further: it captures the desire for other people to have the option of valuing the resource. We are not talking about our own future uses but about other people's future uses.
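As a compact summary of the taxonomy built up so far, here is a sketch in the form of a simple data structure. The categories and examples come from the lecture; the structure itself is just an illustrative convention, not a standard representation.

```python
# A sketch of the Total Economic Value (TEV) taxonomy from the lecture.
total_economic_value = {
    "use value": {
        "direct use": ["fish", "foraged roots", "hunting", "timber"],
        "indirect use": ["water filtration by forests", "sediment retention"],
        # Option value sits between the two sides: it concerns future
        # use, even though no use is happening today.
        "option value": ["genetic resources for future drug discovery"],
    },
    "non-use value": {
        "existence value": ["the Amazon", "charismatic species"],
        "bequest value": ["preserving nature for future generations"],
    },
}

for side, subtypes in total_economic_value.items():
    print(side, "->", ", ".join(subtypes))
```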
Richness of Economic Frameworks
One great thing about this framework: standard economics, in its most simplified form, would treat everything as use value—all direct, and all measured through market prices. This framework provides a much richer understanding of how people actually make decisions. That is probably true well beyond ecosystem services; a lot of ordinary market goods are likely similar, but economics still has a bias toward that single line. The framework presented here shows the heterogeneity among the different ways we value things.
Moving from Types to Methods
The Necessity of Multiple Valuation Methods
That covers the many different types of value. The heterogeneity among these different ways of valuing nature means we will have to use a wide variety of methods to establish those values. This is the second key theme for today. Different ecosystem services will have different components of value, so we will have to explore a variety of methods for establishing what that value is. Instead of types, we now move to methods.
Methods Overview
What are these methods? One way to organize these methods is to align this with the previous graphic. At the bottom of the previous framework, we talked about market valuation on the left and non-market valuation on the right. This methods framework is similar but moves down to more specific approaches.
Methods for Valuing Ecosystem Services
Market Valuation Methods
Introduction to Market-Based Approaches
For market valuation, we start with replacement cost. These market-based methods work because we observe something in the market: we see a person who had to pay the replacement cost, or we see damages that actually happened or could have been avoided.
Replacement Cost Method
What is replacement cost? There is interesting research on wild pollinators. It turns out that research and development funds are being put into tiny drones that can replicate pollination services. They are given basic artificial intelligence to navigate to different plants, bump into them, and use little feelers to collect pollen and carry it to other flowers or plants that need pollination. That sounds expensive, right? It would indeed be a very expensive replacement for pollination services we currently get for free. Here is the point: pollination does not have an easily observed market value, but it leads to the production of crops, which do. If people are willing to spend actual money on these drones, then the natural service we got for free has value. Put differently, however much you would be willing to pay for a replacement is a good estimate of the environmental value of the thing lost.
Wetland Replacement Cost Example
A classic example is wetland values. If you lose a wetland, any municipal engineer will immediately tell you to consider what alternative you will use for water filtration and storage. My parents are members of a church that was struggling financially when a big condo development wanted to build right next to them. As part of the permitting process, environmental engineers determined that the condo would pave over a lot of natural land, and they have equations for how much water filtration and storage capacity must be installed to compensate. The church granted an easement on their property and was paid $660,000 just for the ability to convert a field into a retaining pond. You see these ponds all over, especially in suburbs where there is lots of space and no underground piping. The ponds hold the influx of water after a big storm while letting it slowly percolate into the groundwater. That $660,000 is a good estimate of the wetland value of keeping that land natural. So one way of thinking about it: as we degrade nature, there are literal, expensive engineering solutions we must pay for to replace what the wetland was providing for free.
Limitations of Replacement Cost
This logic is strong enough that it even makes it into municipal budget planning discussions. But it can break down; it does not work in all cases. First, there is not always a replacement. The pollination drones may turn out not to be a very good replacement—it just does not seem likely to work very well. Other services have no replacement at all, and in those cases the method underestimates the value. The method is therefore quite useful when replacements exist but must be used with caution when they do not.
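The logic can be stated compactly: value the lost service at the cheapest adequate substitute. Here is a minimal sketch, using the $660,000 easement figure from the lecture and one invented alternative for contrast.

```python
# Replacement-cost sketch. The $660,000 easement figure is from the
# lecture's retaining-pond example; the alternative is invented.
replacements = {
    "retaining pond easement": 660_000,       # paid in the church example
    "underground storage vault": 1_400_000,   # hypothetical engineered option
}

# The method values the lost wetland service at the cheapest adequate
# substitute. Caveat from above: if no adequate substitute exists,
# this approach underestimates the true value.
wetland_value_estimate = min(replacements.values())
print(f"${wetland_value_estimate:,}")  # $660,000
```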
Avoided Cost Method
Definition and Concept
The second method is avoided cost. It sounds similar to replacement cost but is different: avoided cost values an environmental good by how much cost to society it prevents in the first place. A chart of flood damages over time (the red bars on the slide) shows how expensive these events are. Hurricanes are very costly—Hurricane Sandy, for example, caused roughly $61 billion in damage. That damage is very easy to observe: buildings flooded, buildings knocked down, people paying for repairs. There is no doubt about the costs associated with the event.
Environmental Protection and Cost Avoidance
But if it happens to be the case—and it often does—that nature provides a way to avoid those costs, we should be putting a dollar value on that. An example is mangroves. Mangroves are considered coastal armor because they grow right at the edge of saltwater. Has anybody ever been to mangroves? They are so cool. On one trip we got to snorkel underneath them, and it was remarkable: the water was only a few feet deep, but the roots arced over like tunnels, the fish were dense, and it was like a maze. The root structure is literally armor. In a big hurricane or storm event, it drastically reduces the storm surge, waves, and other flooding drivers that would otherwise destroy adjacent property. Many other ecosystems do this besides mangroves—coral reefs have a huge protective value, as do seagrasses and others. The basic idea is that if we did not have them, the damages incurred by nearby buildings would be much higher.
Overlap Between Methods
This is similar to replacement cost but different insofar as we are avoiding damage rather than paying for a substitute. There is some overlap: hurricane strikes are so expensive that we also pay for replacements directly—cities are literally putting up seawalls, and global expenditure on them is rising as oceans rise and storms become more extreme. We are paying a lot of money for very expensive solutions. Mangroves, then, could probably be valued in multiple ways: by replacement cost and by avoided cost.
Common Applications
Common applications of avoided cost include flood protection, air filtration, and pollination. This works really well when the avoided costs are well defined, but it can get very hard when the damages are diffuse and spread over all of society. The case of hurricanes is easy to see because whoever owns that building, it is pretty obvious they were the ones damaged. But what about things like climate change through slightly increased temperatures that affect labor forces all throughout society? It is a little bit harder to see the direct damages, and it is hard because it is diffuse among the whole population.
Water Treatment Avoided Costs in Minnesota
Some of the work done in Minnesota—by a couple of close collaborators—looks at the avoided costs of water treatment. Minnesota is a very agricultural state, so there is a lot of nutrient runoff. One of the many ways this becomes a problem is when nutrients get into the groundwater. People with wells pulling from that groundwater now have polluted water coming up, which causes all sorts of health problems, one of which is blue baby syndrome. As nitrogen rises in these wells, people suffer direct damages that can be measured from the behavior of well owners and what they spend to mitigate the damage. A study by Bonnie Keeler and Stephen Polasky from this department looked at the specific costs paid by specific well owners. The spatial data shows existing wells, with color indicating the level of nitrate pollution. They also surveyed those owners on what they would do or have done to mitigate the damage: reverse osmosis (filtration through a membrane), distillation (evaporating the water with heat), anion exchange (using chemicals), or simply drilling a new well. People do this because you simply cannot use these wells once they get too polluted.
Future Scenario Analysis
These are real values: if nature had filtered the water effectively before it reached the wells, these are dollar values the well owners would not have had to pay. The researchers also ran an interesting analysis modeling what would happen if agriculture expanded in the future—more agricultural fields and more nutrients on the landscape. They modeled how many wells would surpass the dangerous threshold under this agricultural expansion scenario and found that the dollar values exploded. This is a good example of why you cannot just look at present costs; you also have to consider where things might go under future scenarios.
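The avoided-cost accounting and the scenario analysis can be sketched as follows. Every number here is invented for illustration; the real study used measured nitrate levels and surveyed mitigation expenditures.

```python
# Hypothetical sketch of the avoided-cost logic in the well study.
NITRATE_THRESHOLD = 10.0  # mg/L, the common drinking-water standard

# (nitrate concentration in mg/L, cheapest mitigation cost in dollars,
#  e.g. reverse osmosis, distillation, anion exchange, or a new well)
wells_baseline = [(3.2, 1200), (11.5, 1500), (8.9, 900), (14.0, 8000)]

def total_avoided_cost(wells, threshold=NITRATE_THRESHOLD):
    """Sum mitigation costs for wells above the threshold.

    If natural filtration had kept every well below the threshold,
    these are the dollars well owners would not have had to pay.
    """
    return sum(cost for nitrate, cost in wells if nitrate > threshold)

# Scenario analysis: agricultural expansion raises every well's nitrate
# level by 3 mg/L, pushing a third well over the threshold.
wells_expansion = [(n + 3.0, c) for n, c in wells_baseline]

print(total_avoided_cost(wells_baseline))   # 9500
print(total_avoided_cost(wells_expansion))  # 10400
```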
Non-Market Valuation Methods
Introduction
So that is the market valuation side. Now let us switch to the non-market side. These market-based approaches work because we are observing something in the market—we saw a person who had to pay the replacement cost or we saw damages actually happen or could be avoided. But in many cases, it is hard to do that or there is no observable market transaction, so you have to get a little bit more clever. These are a lot of fun.
Hedonic Analysis
Definition and Origins
The first non-market method is hedonic analysis. This is something that exists well outside environmental economics. Hedonic comes from the same root as hedonism: how do we get direct pleasure from things? The method initially came about to understand how people value attributes of things like houses—how much money is an extra bathroom worth? Hedonic analysis is a set of statistical techniques for analyzing transactions on houses of different qualities, like three bathrooms versus four, and building a predictive model for how much higher the price would be with an extra bathroom.
Application to Environmental Values
But in the domain of environmental values, what is really cool is that this technique can use the same datasets—the transactions of all housing sales in a metro area—and look at how houses near environmental amenities sell for more. The difference in price between, say, a home surrounded by a beautiful environment and a home in an urban concrete area is another way of putting a dollar value on how much people care about nature.
The Confounding Variable Problem
Who can think of a problem with this approach? What else might be true of houses next to a really beautiful park versus houses in an urban concrete area? Rich people tend to live near nature—we are all thinking it, and it is the case. Part of that price premium genuinely reflects the fact that nature is valuable, but rich people also tend to buy bigger, nicer houses in areas with lots of green space and access to parks. This creates a confounding variable problem.
Statistical Solutions
This is why hedonic analysis is a deeply statistical approach built on regression analysis, which essentially means trying to isolate the environmental attribute, like park proximity, from all the other housing characteristics. If we ignored characteristics like the size of the house, we would obviously misestimate how valuable the environmental amenity is. But there are tons of examples where we have enough data to do this well.
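To make the regression logic concrete, here is a minimal simulation (all numbers invented) showing how controlling for house size separates a hypothetical park premium from the confounded size effect.

```python
import numpy as np

# Hedonic regression sketch: simulated house sales where park
# proximity is confounded with house size. All numbers are invented.
rng = np.random.default_rng(0)
n = 5000

near_park = rng.integers(0, 2, n)            # 1 if the house is near a park
# Confounding: houses near parks tend to be bigger
sqft = 1500 + 400 * near_park + rng.normal(0, 200, n)

# "True" price process: $150 per square foot plus a $30,000 park premium
price = 150 * sqft + 30_000 * near_park + rng.normal(0, 10_000, n)

# A naive comparison of means mixes the park premium with the size effect
naive = price[near_park == 1].mean() - price[near_park == 0].mean()

# Regression with sqft as a control isolates the park premium
X = np.column_stack([np.ones(n), sqft, near_park])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)

print(f"naive premium:      ${naive:,.0f}")    # near $90,000 (30k + 150 * 400)
print(f"regression premium: ${coef[2]:,.0f}")  # near the true $30,000
```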
The Loon Study Example
One fun example is “a loon on every lake.” This was a hedonic analysis of lake water quality in the Adirondacks. The researchers collected a boatload of data, including the number of loons present on the lake in the year of the sale and other loon-related measures. They included many other variables as well, several of them environmental: the acidity of the water, how close the house was to the water, and the size of the lake. They also had a whole lot of data on the attributes of the house—number of rooms, building age, building age squared (in case the relationship is nonlinear), and square footage. Combining all of these, even ignoring the environment, yields a very accurate prediction of a house’s sale price. With thousands and thousands of transactions, the predictions get very accurate.
Isolating Environmental Effects
What they were interested in is not just predicting the price of the house but rather isolating out the effect of the presence of loons. The results show that if you had an eleven percent increase in loons present in the year of sale, the mean property value impact was $21,803. People love loons. People will pay a lot more money for a cabin on a lake that is pretty enough and wild enough to support loon populations.
Indicator Species and Environmental Bundles
You might want to break that down—maybe the loon is just an indicator species. They also looked at many other things. Maybe it is not just the loons but other bundles of environmental goods like better fishing. Nonetheless, you can use statistical analysis as long as you account for all the non-environmental things like the size of the cabin and distance from the city. The remaining price markup is a good estimate of how valuable nature is.
Travel Cost Method
Basic Concept
Another method is travel cost. Our goal is still to understand how much people value things that are hard to price directly. Often, people spend money on things that are necessary to consume an environmental good. The basic idea of travel cost analysis is that the travel cost—the amount you spend to get to the park or whatever environmental amenity you are visiting—is a good estimate of your willingness to pay. It is certainly a minimum estimate, because you might get a lot of value on top of whatever you spent traveling there; that is the whole point of going on vacation. But at least it is better than zero. If people spend $1,000 to make it to a lake, that is at least $1,000 of value they attribute to going to the lake.
Demand Curves from Distance Data
The cool thing about this is you can use that information. The fact that visitors come from all sorts of different locations either closer to or further from the park means we can actually look at how likely they are to make a visit to that park and compare it with the distance they had to travel, which is essentially the cost. We can get a full demand curve. This is a demand curve for that park where we have price on the vertical axis and the quantity of visits to that park on the horizontal axis.
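The demand-curve construction can be sketched with a toy zonal model (all numbers invented): regress each zone's visit rate on its travel cost, then treat the area under the fitted line above a zone's cost as that zone's consumer surplus.

```python
import numpy as np

# Zonal travel-cost sketch. Hypothetical data: each zone's round-trip
# travel cost to the park and its observed visit rate.
costs  = np.array([10.0, 25.0, 40.0, 55.0, 70.0])  # dollars per trip
visits = np.array([90.0, 70.0, 45.0, 30.0, 10.0])  # per 1,000 residents

# Fit a linear demand curve: visits = a + b * cost (we expect b < 0)
b, a = np.polyfit(costs, visits, 1)

# Choke price: the travel cost at which predicted visits fall to zero
choke = -a / b

def consumer_surplus(c):
    """Surplus per 1,000 residents for a zone facing travel cost c:
    the triangle under the demand curve between c and the choke price."""
    q = a + b * c
    return 0.5 * q * (choke - c)

print(f"slope: {b:.2f} visits per extra dollar of travel cost")
print(f"choke price: ${choke:.2f}")
print(f"surplus for the $25 zone: ${consumer_surplus(25.0):,.2f}")
```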
Flickr Photo User Days Data
One of the reasons this method has generated a lot of interest is the data—there is really fun data you can use. This is research I was involved in at the Institute on the Environment down the hall. Researchers estimated something called FPUDs, or Flickr Photo User Days. They found a really rich dataset of geotagged photos. If you look on your phone, a photo often carries the specific latitude and longitude of where it was taken, and if it is uploaded to a database like Flickr, we can use that data.
Inferring Visitation from Photo Data
What we can do is make an inference: if lots and lots of photos are taken within a certain park, that is a good indicator that many people actually traveled there. We can combine this to create an indicator of how many photos per day, on average, were taken in each of these parks in northern Minnesota. Then we cross-reference this with a travel network showing the road distances it would take to get there, to estimate the travel cost of actually arriving. If anybody has ever done the Boundary Waters, it is very costly to get there, right? It is quite a distance away, whereas closer destinations are quite a bit cheaper. The estimates were then adjusted with other data, like actual entrance fees, so the sites closest to the city were not necessarily the cheapest overall—they just had the cheapest travel component, with usage costs on top.
Predicting Lake Quality Effects
The idea here is the same as with houses. If we collected data on lake size, lake clarity, depth, and other attributes—whether the lake was in the Boundary Waters canoe area, whether it was a state park, and, critically, whether invasive species were present—we could combine those variables in a statistical model with the travel cost to see how much a variable like invasive species reduces how far people are willing to travel to a given park.
Invasive Species Case Study
The basic point is that invasive species like zebra mussels dramatically decrease the quality of a lake. There will be very few fish; zebra mussels essentially filter out all the nutrients, and they are also very sharp, so you cannot step on them. We can actually get data showing how much less travel happens to these places. If we also know the amount of money people would have spent on that travel, we can get a lower-bound estimate of how valuable the lake is and how much value it lost when invasive species came in. This is a little different from the house example but uses the same basic idea: as long as we can get costs and do a statistical analysis of how much they mattered to people's decision-making, that becomes useful information for putting a price tag on nature.
Contingent Valuation
Historical Context and Oil Spills
We might have to save the last method for the next class, but we should touch on contingent valuation. The questions that really motivated this method came from oil spills and other disasters. A lot of people were upset by events like the Exxon Valdez oil spill, which is very famous in the history of environmental protection because of the ecological damage it caused.
The Survey Approach
You could simply ask Americans at the time how much they would be willing to pay to avoid an oil spill and add that up; this is a brute-force method with all sorts of challenges. But there are methods that let us figure out the overall amount people would have been willing to pay to prevent, for instance, another oil spill. A huge study found that on average each American household would have been willing to spend $31 to prevent another spill similar to the Exxon Valdez. Multiplied across U.S. households, that gave a dollar value of $2.8 billion in damages that people in aggregate would have felt.
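The aggregation is simple arithmetic. The household count below is an assumption chosen to be roughly right for the early 1990s and consistent with the $2.8 billion total; it is not a figure from the study itself:

```python
# Back-of-envelope aggregation of the Exxon Valdez contingent-valuation
# result. The household count is an assumption for this sketch, not a
# number reported by the study.
wtp_per_household = 31          # dollars of willingness to pay
us_households = 91_000_000      # approximate U.S. household count at the time
total_damages = wtp_per_household * us_households
print(f"${total_damages / 1e9:.1f} billion")  # → $2.8 billion
```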
Legal Applications
What is useful is that these numbers can be used in court, and they actually were. Many of the lawsuits that followed the Exxon Valdez oil spill, and many other environmental disasters, rely on a contingent valuation. While the methods will be saved for next class, the main point is that we can identify how much people in aggregate were damaged. This can be used in court, and Exxon actually had to make these payments. The payments could either go directly to individuals or toward trying to restore environmental quality to where it was.
Recent Examples
More recently, the Deepwater Horizon oil spill released roughly 200 million gallons in 2010; that is the rig that dramatically blew up. The exact same approach was used there. The contingent valuation method has proven to be a useful tool in environmental litigation and policy.
Bringing Methods Together with Biophysical Models
Integration of Valuation Methods with InVEST
Just to close out for today, remember why we spent so much time showing where ecosystem services are provided: most of those outputs were biophysical indicators, essentially the quantity of an ecosystem service. Now we are talking about a big grab bag of different methods for actually putting a price on that quantity. It was not emphasized in the ecosystem service models we ran, but many of them have options for attaching price tags, derived from these different methods, to those ecosystem service goods. The methods we have discussed today—replacement cost, avoided cost, hedonic analysis, travel cost, and contingent valuation—can all be applied to the biophysical outputs from tools like InVEST to create comprehensive valuations of ecosystem services.
Next Steps
We will pick up with contingent valuation in the next lecture, where we will dive deeper into the methodological details of this approach. Additionally, there are quizzes to hand back. The scores have been online, but if you would like to see the actual quiz results from Micro Quiz 3, we can discuss those after class for those individuals who are particularly interested in reviewing their performance. The names of those who should stay for discussion include Denton, Alex, Kellen, Rhea, and Griffin. The rest of the quizzes have been left in the office and can be picked up at a later time. More details will be provided next time. Thank you.
Administrative Announcements and Course Updates
Room Change and Meeting Logistics
The class will return to the normally scheduled room for the remainder of the semester. There was only one scheduling conflict, and the instructor will end the class five minutes early to prevent delays for another group setting up their event in the space.
Guest Lecturer Schedule for the Following Week
The instructor will provide updates on upcoming events and personal travel plans, as well as details about guest lecturers who will be visiting the class. On Monday, the instructor will teach as scheduled but will need to leave immediately after class to drive to the airport for a flight. There is approximately a ninety percent chance of making the flight, though the timing will be tight and will depend on factors like TSA wait times.
Colleen Miller, who serves as Senior Biodiversity Scientist at NatCap, will visit on Wednesday to discuss how biodiversity forms the basis of all ecosystem services. This topic is sometimes taught before diving into ecosystem services discussions because the rich and complex web of life that comprises ecosystems is the fundamental reason that nature provides value to human society. By presenting this content later in the course, it will serve as a retrospective look at the foundation underlying all the ecosystem services that have been discussed throughout the semester.
Distinguished McKnight Professor Carlisle Ford Runge will visit on Friday to discuss land and the history of economic analysis—or the historical exclusion thereof—regarding how land affects the economy. A key historical point is that the original economists cared greatly about land as a factor of production. However, during the 1950s, 1960s, and 1970s, economists decided that land was simply identical to any other type of capital. This decision led to economic models that focused only on labor and capital while ignoring land. This fundamental shift in economic thinking has been greatly detrimental to understanding environmental economics and the role of natural resources in economic systems.
Personal Travels and Professional Speaking Engagement
The instructor will be traveling to the Chilean Central Bank to present on earth economy modeling. This is the same location where similar materials have been presented previously, so some slides will be reused from earlier presentations. The instructor will be giving a keynote address at a large international conference with representatives from many different central banks who want to implement earth economy modeling. The content being taught in this class will likely be incorporated into that presentation to central bankers, though the audience will differ significantly from college students. Instead of students, the presentation will be heard by individuals with substantial financial resources who make important decisions regarding environmental protection. This provides an interesting opportunity to demonstrate how the academic material has real-world applications for high-level policy and financial decision-making.
Instructor Health Update
The instructor experienced a kidney infection over the weekend, which resulted in a high fever reaching 102.9 degrees Fahrenheit. While this was unpleasant, the instructor is not contagious and is on antibiotics. Kidney infections typically resolve quickly, unlike lingering flu viruses or coronavirus infections. The instructor is feeling better and is ready to continue with the course.
Course Structure and Final Project Overview
Introduction to the Final Project Framework
The instructor has updated the course website with the final project link and will walk through the details of the assignment. While the instructor has been indicating the general direction of the final project, the official assignment details have now been finalized. The final project is built on the foundation of all the skills and concepts introduced throughout the course.
The core concept of the final project is that students will imagine being asked by a senior policymaker in an assigned country to prepare a briefing document. This mirrors exactly what the instructor will be doing the following week, except the briefing will be for a central bank policymaker in Chile. The briefing will address what earth economy interactions the country will face over the coming decades and what actions should be taken in response.
Central Bank Perspectives on Environmental Economics
Central banks have long been analyzing climate change and are increasingly concerned about systemic risks to their countries’ economies. The mandate of central banks is to maintain a stable economy and stable currency. However, central banks are increasingly thinking beyond climate change alone, recognizing that the challenge involves both climate change and nature. There is growing interest among central banks in ensuring that their countries are resilient to both climate change and possible nature crises. The final project asks students to write a briefing that addresses this emerging concern from the perspective of a hypothetical central bank.
Final Project Content and Themes
The student reports will address key themes including market failure, sustainability, climate change, land use change, ecosystem services, future scenarios, and essentially all the topics covered throughout the course. These are the foundational concepts that students have been building towards since the beginning of the semester. The final project represents an integration of all these disparate topics into a coherent policy analysis.
Project Components and Grading Structure
The final project has two main components. The first component is a five-minute lightning talk where on the last day of class, all students will present their work. Five minutes is a brief presentation window, typically allowing for approximately two to four slides. The second component is the written report itself, which contains more detailed information and a more comprehensive rubric specifying expectations for each step of the project.
Key Deadlines
The rough draft of the final report is due on the second-to-last day of class. Students will present the slides version of their project on the last day of class. The final report is then due on the final exam date. These staggered deadlines allow students to receive feedback on their rough draft and incorporate those suggestions into their final submission.
Proposed Changes to Course Grading Structure
Rationale for Eliminating the Final Exam
The instructor is proposing a significant change to the course structure. In the initial syllabus, both a final report and a final exam were planned, similar to the midterm examination. However, upon reflection, the instructor believes that this type of material is not well-suited to a traditional examination format. Microeconomics material can be effectively assessed through a traditional exam, but when the course moves into spatial analysis, policy thinking, and sustainability, essay questions do not work particularly well as an assessment method. The material is better suited to the kind of in-depth analysis and synthesis that the final project requires.
The instructor proposes that the final project replace the points that would have been assigned to the final exam, eliminating the final exam entirely. This means that students will not have to sit down on May twelfth and write essay questions by hand, which is the only way to administer such an exam in the age of ChatGPT and other language models. This change would allow students to focus more time on creating a quality report that demonstrates their understanding of the course material.
Advantages of the Proposed Structure
One significant advantage of replacing the final exam with the final project is that students have substantial time to respond to feedback and make appropriate revisions. Rather than a single deadline for the report, the staggered deadline structure provides time between the rough draft due date and the final due date for students to make meaningful improvements. Students can also use feedback from their presentations on the last day of class to improve their final reports. They will essentially extract slides from the essay, and can anticipate the key figures they will make, such as maps of their country with one or two of the ecosystem services. These figures will appear in the report but can then be easily copied into the presentation slides, making the project creation process more efficient.
Process for Implementing the Change
The instructor is trying to be fair to all students because the syllabus represents an agreement made at the beginning of the semester. Changing things in the middle of the course could potentially be unfair. However, the instructor believes that everyone will be in favor of this change because it provides superior educational outcomes. Here is how the change will be implemented: the instructor will update the website with the new percentages for the final grade, taking the score that would have gone to the final exam and allocating it to the final project. The instructor will post this change and send an announcement to the class. Students will then be given a couple of days to anonymously submit any concerns about the change. If there are no concerns, the class will move forward with the new structure. If concerns are raised, the instructor will address them.
An alternative approach would be to allow students to choose between doing the final exam or just the final project, but this would not be a good solution. The students choosing the final exam would still face the same amount of effort on the final project, and would receive only fifteen percent of their score from an additional exam they would have to take. This would create an inequitable and inefficient situation.
Final Project Details and Requirements
Report Length and Structure
The report should be approximately two thousand words as a guideline, though the instructor will not count words precisely. Instead, the instructor will evaluate whether the student makes the key points that have been introduced in the assignment instructions.
Required Content Areas
Students must include discussion of their assigned country and how it fits into the planetary donut framing or other relevant framings of environmental economics. The report should discuss the specific challenges and market failures that the country faces. Students must discuss land use in their country and the ecosystem services that are present or absent. The report should analyze natural capital in the context of the country. Students should examine future scenarios under different shared socioeconomic pathways (SSPs) and what those scenarios mean for their country. The report should conclude with policy recommendations and conclusions that flow from the preceding analysis.
Data Sources and Tools
The data that students will use are materials and datasets that have been introduced throughout the course. The instructor has collected key links together, including the SSP database which shows what will happen to GDP in each country under different assumptions about socioeconomic development. Students will use real data that analysts and policymakers use in professional settings. Additionally, students will use country geospatial data that the instructor has collected, which will be discussed in detail during the lecture and which students will use to run InVEST and conduct other spatial analyses for their projects.
The Course as Foundation for the Project
This entire course has been building toward the final project, which is why the instructor has been enthusiastic about developing it. This is the first time the instructor has taught this course. The course is not primarily an environment and natural resources course in the traditional sense—only about four lectures have focused on that topic. Nearly all of the other course content constitutes what will eventually be rebranded as “Earth Economy Modeling.” This is the frontier of where this type of analysis is going, and earth economy modeling is becoming an increasingly important approach to environmental policy and decision-making.
Valuation Methods: Contingent Valuation and Choice Experiments
Overview of Contingent Valuation
The instructor is returning to the slides left off from the previous class on valuation methods. One of the last methods discussed was contingent valuation, which is particularly important in environmental economics. Contingent valuation has been used for major environmental disasters like oil spills. It has been especially important to environmental economics discourse because the dollar values assigned through this method—often billions of dollars of damages to the environment—are what corporations have to pay if they lose lawsuits related to the environmental damage.
Legal Framework and Methodology
Contingent valuation often enters policy through court cases. Some body of people sues an oil company, arguing that the company has caused them harm. This is a very standard type of lawsuit, similar to a case where somebody crashed into your car and refused to pay. In such a case, you would have a civil claim against them for damages. The fundamental difference with environmental cases is that it is much harder to put a precise dollar value on the damages.
In a car accident case, you go to a mechanic and ask how much it will cost to fix the car. The court case then becomes about whose mechanic is correct regarding the repair costs. The assessment of whether the car is properly fixed is relatively clear and straightforward, so there is usually not a great deal of variation in expert opinions. Environmental cases follow the same basic structure, but with an important difference. You have experts—scientists instead of mechanics—who propose how much it will cost to restore or remediate the environmental damage.
Identifying and Quantifying Damages
In an oil spill case, some costs are relatively easy to identify. These include the amount of money spent on containment boats, the cost of fuel, and the captain’s salary. However, there are many other types of values that do not have an easily identifiable dollar value attached to them. What is the value of all the birds that were lost in the spill? While these damages may be identified by experts, they do not have a market value that can be easily discovered. When experts go to court, they use methods specifically like contingent valuation to demonstrate what the average person would pay for preventing that environmental harm from occurring.
Data Collection Approaches
The general approach used in contingent valuation is to ask people, either in a laboratory setting or in the field, how much they care about this environmental resource or avoiding this environmental harm. One approach involves experts mailing physical letters to randomly chosen individuals, asking how much they would be willing to pay to prevent or remediate the environmental damage. This approach is problematic because respondents can make up whatever number they want without any actual obligation to pay it. There is very little constraint on the answer they provide.
Some better approaches use real money to make the contingent valuation exercise more realistic. A key example comes from research on hunting access. Hunting permit applicants were given a choice between keeping their free hunting permit or receiving a one-hundred-dollar gift card, but not both. A hunter who really loves hunting would presumably keep the license, which is good evidence that they valued it at more than one hundred dollars. The researchers sent different gift card values to different groups to figure out how much hunting was actually worth. This approach is clever because it used real money rather than a fill-in-the-blank survey where respondents provide a made-up dollar value.
The original study in this area actually sent a check to respondents and checked whether it was cashed. If the check was cashed, the researchers canceled the hunting permit. Though this seems harsh, the point is that there are many different ways to ask people and get a dollar value. If done well, these values are hopefully usable in court to establish damages and appropriate compensation.
The Problem of Strategic Responses
Problems with contingent valuation arise when researchers do not use payment-card methods or real money. When respondents are clever enough to recognize that a survey is probably for environmental valuation, and they really like the environment, they will report a very high value for the resource because they know they do not actually have to pay. There is very good evidence in the academic literature that stated willingness to pay is two to three times higher in studies that do not use real money than in studies that do. People do game the system when they can do so without real financial consequences.
Connection to Choice Experiments
Contingent valuation is tightly related to choice experiments, which derive from a similar underlying idea but consider a more complex set of choice options. A typical choice experiment might be aimed at eliciting what ecosystem people really prefer. Respondents might be given a choice between an ecosystem that is a park with benches and garbage cans and development that changes species distribution—perhaps mice come in around the garbage cans while songbirds leave—versus another option with a path but no infrastructure and a different set of animals.
This method asks people which bundle of goods they would prefer. There was a real Department of Natural Resources study trying to figure out the value of lake clarity under different configurations. The study had lots of different scenarios with things like lake color, boat launch availability, and distance from the respondent’s home. By polling enough people, researchers can create a statistical model that predicts which lake configuration people would choose. This approach is useful for eliciting value on a bundle of different ecosystem amenities. A clear lake is generally good, but whether people prefer it depends on whether they are a fisher person who values fish or a wakeboarder who values clean water for the activity itself. Choice experiments can help disentangle these preferences.
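Choice experiments like the DNR lake study are typically analyzed with a logit-style choice model, where each bundle of attributes gets a utility score and the model predicts choice probabilities. The coefficients and lake attributes below are entirely hypothetical; this is a sketch of how a fitted model scores bundles, not the DNR study itself:

```python
import math

# Sketch of scoring lake "bundles" with a fitted choice model. All
# coefficients and attribute values are hypothetical assumptions.
coeffs = {"clarity_ft": 0.4, "boat_launch": 0.8, "miles_away": -0.05}

lakes = {
    "clear_far":  {"clarity_ft": 12, "boat_launch": 1, "miles_away": 80},
    "murky_near": {"clarity_ft": 4,  "boat_launch": 1, "miles_away": 10},
}

def utility(attrs):
    # Utility is linear in the attributes.
    return sum(coeffs[k] * v for k, v in attrs.items())

# Standard logit formula: P(choose i) = exp(U_i) / sum_j exp(U_j)
utils = {name: utility(a) for name, a in lakes.items()}
denom = sum(math.exp(u) for u in utils.values())
probs = {name: math.exp(u) / denom for name, u in utils.items()}
```

Once fitted to enough respondents, the coefficients themselves reveal the trade-offs, for example how many miles of extra driving a foot of clarity is worth to the average respondent.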
Valuation Methods: Benefits Transfer
Definition and General Approach
The last method in the valuation toolkit is benefits transfer, which is sometimes referred to as the dark arts. It is called that because most academics find this method very problematic. However, it is the easiest to implement, so high-paid consultants often use it when the government pays them to put a dollar value on nature. The benefits transfer approach involves taking existing studies where somebody did a good job assessing ecosystem service value through any of the other valuation methods and applying those values to new contexts.
Implementation Process
The benefits transfer approach uses existing studies as its foundation. Researchers conduct a literature review, extract the stated willingness to pay or estimated value from those studies, and ideally construct a function describing all studies. This function maps how the value depends on something like lake clarity or some other variable that affects ecosystem service provision or the demand for ecosystem services. Then researchers take that willingness to pay value and transfer it to all hectares of that particular land type.
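The value-function step can be sketched in a few lines. The coefficients below stand in for a function fitted to a literature review and are invented for illustration:

```python
# Minimal sketch of a benefits-transfer "value function". The intercept
# and slope are hypothetical stand-ins for a function fitted to existing
# studies; they are not estimates from any real literature review.
def value_per_hectare(clarity_ft, base=200.0, slope=75.0):
    """Dollars per hectare per year as a linear function of lake clarity."""
    return base + slope * clarity_ft

# Transfer to a new, unstudied site: 1,500 ha of lake with 6-foot clarity.
site_value = value_per_hectare(6) * 1_500
```

The mechanics are trivially easy, which is precisely why the method is popular with consultants and why it goes wrong when the new site differs from the studied ones.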
The Costanza Example: Valuing Global Ecosystems
A famous and heavily criticized example of benefits transfer is the Costanza et al. estimate of how valuable all of nature is globally. The researchers found that nature was worth thirty-three trillion dollars per year. What they did was one gigantic benefits transfer: they took studies of how much a particular parcel of land is worth and transferred those values to every hectare of land on Earth with the same type. In the Costanza case, they had studies on how valuable ocean access was for fishing and recreation and then assumed every hectare of ocean had that same value. This created an enormous overestimate of the value of global ocean ecosystems.
Problems with Spatial Heterogeneity
There are obvious reasons why this would not be a good method for establishing the total ocean value. The studies that the Costanza analysis was based on were heavily biased towards extremely high-value areas. Miami’s real estate amenity value from beach access is probably much higher than the Congo coastline and exponentially higher than a random hectare in the deep sea with no beaches, no surfing opportunities, and limited accessibility. The fundamental problem is that it is hard to get a correct value that scales appropriately with spatial heterogeneity across the globe. The Costanza numbers were heavily criticized for essentially taking the value of Miami beachfront as the value of every single part of the entire ocean, which obviously does not seem correct.
Legitimate Applications
There are legitimate applications of the benefits transfer method when it is done less egregiously and with more attention to spatial and ecological heterogeneity. However, when ecosystem services start becoming valuable to governments, consultants who make money off this science crop up and can essentially make the science bad. These consultants will do whatever is easiest to produce and easiest to sell to their clients, which often means oversimplifying the analysis and transferring values inappropriately across contexts. This is a significant problem in applied environmental economics when real money is involved and there are financial incentives to frame problems in particular ways.
The Reality of Ecosystem Service Valuation in Practice: The Minnesota Lakes Case
Complexity Beyond Modeling
The instructor wants to tie the theoretical discussion of valuation back to real-world practice in Minnesota. While the course has discussed ecosystem service models and noted that tools like InVEST make it easy to estimate value, in reality when you look at specific cases, the work becomes much more complicated. It is hard to put an adequate dollar value on something when there are many different types of users with different preferences and different benefits from environmental changes.
Research on Water Quality Valuation
There was a really good study conducted by Bonnie Keeler, a former employer of the instructor and now director of the Water Research Institute, where they looked at this complexity closely. The researchers tried to figure out what you need to consider when tracking why people might value water quality in lakes. They argued that you need a valuation approach that is sensitive to different actions that affect water quality. Additionally, you need to identify different use endpoints—how people actually use the water—and must recognize that there are unique groups of beneficiaries who are all differently affected by environmental changes.
Framework for Water Quality Valuation
The researchers presented a systematic way of thinking about how to assess some set of actions affecting water quality and see how, under different action sets, you have changing water quality. Research links those physical changes to ecosystem services. Then, once you get a change in water quality, you identify the change in ecosystem services that results. But critically, as a final and essential step, you have to think about the change in value that is specific to different benefit groups. Different people and communities will be affected differently by changes in water quality, and the economic value of those changes will differ accordingly.
Drivers of Water Quality Change in Minnesota Lakes
The researchers mapped this framework out with the specific case of Minnesota lakes. They started by asking what actions might happen that could affect water quality. They identified primary and secondary drivers of water quality change. For example, with nitrogen from applying fertilizer on agricultural land, you have an action causing increased nitrogen loading to water bodies, which affects water clarity through algal blooms. These algal blooms have secondary effects on fish abundance and pest abundance. The researchers continued analyzing all different actions and drivers, including sediment loading, temperature changes, or toxin inputs.
Effects on Different Value Components
These drivers have different effects on different parts of the value change. They identified specific Minnesota lake ecosystem services: lake and river fishing opportunities, swimming, boating, trout angling, nature viewing, navigation, hydropower, commercial fishing, and safe drinking water. These are things for which people spend a lot of money and which provide significant value to communities. You then have to get from that physical change in water quality—which affects opportunities for swimming or fishing—to the dollar value associated with it for different groups.
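The final step of the framework, valuing the same physical change differently for each beneficiary group, can be sketched as a simple weighted sum. The groups, counts, and marginal values below are all hypothetical:

```python
# Sketch of the framework's final step: one physical change in water
# quality (say, one extra meter of clarity) valued separately by each
# beneficiary group. All numbers are hypothetical assumptions.
delta_clarity_m = 1.0

groups = {
    # group: (number of beneficiaries, marginal value per person per meter)
    "anglers":          (5_000, 12.0),
    "swimmers":         (8_000, 6.0),
    "lakefront_owners": (300, 400.0),
}

total_value = sum(n * mv * delta_clarity_m for n, mv in groups.values())
```

Skipping this step and applying one average value to everyone is exactly the kind of shortcut the Keeler framework warns against.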
Valuation Methods for Minnesota Lake Ecosystem Services
The researchers enumerated all the different valuation methods and their applications to Minnesota lakes. From their work, you can see references to specific valuation examples: avoided sedimentation through avoided water treatment costs, which is addressed using the avoided costs method; the value of swimming, which is harder to estimate but might come from contingent valuation studies; and the value of avoided death or illness through contaminated irrigation water. There is a whole additional set of literature on how to deal with changes in ecosystem services and their value when they prevent people from dying or becoming ill. There will be a special lecture on this topic, and this component represents a big part of the overall valuation puzzle in water quality applications.
Introduction to the InVEST Annual Water Yield Model
Model Overview and Purpose
The instructor now wants to quickly introduce the InVEST annual water yield model. Students do not need to open up or access the model during this lecture—the instructor will save actually running it for the next class. The instructor wants to go slowly and spend time introducing the model because this is where students will actually look at the data they will be using for their final projects.
Defining Water Yield
The basic concept the model addresses is water yield. When researchers refer to water yield, they are referring to water that is yielded into the economic system—water that becomes available for human use. In this case, the specific water yield being modeled is water that ends up in a reservoir, the area behind a dam. This water is particularly useful because it can be pumped to a water treatment plant for drinking water or straight to fields for center pivot irrigation for agricultural production.
Factors Determining Water Yield
What factors determine the water yielded into a reservoir? Obviously, precipitation matters as the key input to the system. However, a whole set of other factors matter as well: subsurface processes and vegetation processes are important for determining the final thing people care about, the yield. Understanding water yield requires thinking about multiple interconnected hydrological processes.
Water Inputs and Groundwater Recharge
You have to think about inflow: similar to how sediment retention works, water flowing in from other locations in the watershed matters. Precipitation combined with inflow infiltrates the ground, and some of it becomes groundwater recharge, which is certainly valuable for anybody with a well; this recharge is why your well does not go dry during droughts. But from the perspective of the reservoir, water that infiltrates into the ground is not available as yielded water that can be pumped. There is a fundamental trade-off between groundwater recharge, which is valuable for some uses, and surface water yield, which is what ends up available in the reservoir.
Evaporation and Transpiration
The second major process everybody knows about is evaporation. Depending on temperature, wind, and exposure, some of the precipitated water evaporates and returns to the atmosphere before it goes below ground. But the critical thing that many people forget is the complexity the ecosystem adds through transpiration. Transpiration is the water on the surface or underground that gets sucked up by plants, which use it to produce the sugars and other compounds that help them grow. For the purposes of water yield, what is really important is that plants keep some water from ever being yielded to the reservoir, so you need to account for what vegetation is present in the watershed.
Complex Effects of Vegetation on Water Availability
This matters in some good ways and some bad ways. Vegetation can slow down water flow and increase groundwater recharge by slowing water movement through the soil. But vegetation also changes timing of water availability. Some water will transpire up into the atmosphere through plant leaves, which is bad for short-term yield, but it is good in another sense because transpired water eventually precipitates again elsewhere, spreading out the window in which water reaches the reservoir system. This temporal smoothing of water availability can be quite valuable for water management.
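The processes described above amount to a simple bookkeeping exercise. A minimal sketch in Python, with all numbers purely illustrative (the actual InVEST model uses a Budyko-type curve rather than straight subtraction):

```python
# Toy annual water balance for one patch of a watershed.
# All values in mm of water per year; numbers are illustrative only.

precipitation = 1200.0   # rain falling on the patch
inflow        = 300.0    # surface water arriving from upslope
evaporation   = 250.0    # returned to the atmosphere from soil and surfaces
transpiration = 400.0    # pulled up and released by plants
recharge      = 350.0    # infiltrates to groundwater (good for wells,
                         # but not available to the reservoir)

# Whatever is left over is the surface water yielded to the reservoir.
yielded = precipitation + inflow - evaporation - transpiration - recharge
print(yielded)  # 500.0 mm reaches the reservoir
```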
Pause for Next Class
The instructor will pause here and pick up the detailed discussion and modeling of water yield in the next class when the actual InVEST model will be run. That exercise will involve using geospatial data to compute and understand water yield in specific contexts. The basics of what will be computed using InVEST and geospatial data have now been introduced, and students should have a conceptual foundation for understanding the model when it is actually demonstrated.
Introduction and Course Logistics
Welcome and Daily Agenda
Day 3 of the valuation slides builds upon the PowerPoint presentations that have been used throughout the course. Students are encouraged to refresh their browsers if they maintain multiple Chrome tabs, as the slides have been updated with new information and examples. The agenda for this lecture session focuses on diving directly into water yield analysis using country-specific data, which simultaneously provides students with practical preparation for their final projects. Following the water yield section, the lecture will transition to examining the value of a statistical life, remaining within the ecosystem service valuation framework because ecosystem services contribute significantly to human welfare by literally keeping people alive. The extent to which ecosystem services reduce mortality through channels such as air pollution reduction represents one of the key mechanisms by which value is obtained from these services.
Administrative Announcements Regarding the Final Exam
Several logistical comments deserve attention at the outset of this lecture. No objections were raised regarding the proposed modifications to the final exam structure, which means that the final project will now take the place of the traditional final exam. This structural change offers students considerable flexibility, as they will be able to submit their final projects early, which solves any travel-related complications for students with holiday plans or commitments. Students will present their final projects on the last day of class, which occurs many days before what would have been the final exam day—approximately the fourth of the month. Because no objections were received from the class, the syllabus and grade weightings in Canvas will be updated to reflect these changes accordingly. This modification will give students more time to focus on what the instructor considers the novel and meaningful aspect of this class rather than traditional midterm-style examinations. The instructor sought confirmation that students were comfortable with these changes before proceeding.
Upcoming Guest Lectures and International Engagement
Speaking of reports and presentations, the instructor has been consistently referencing guest lectures, and the preliminary agenda has now been released. The instructor will be giving a keynote lecture at this series of guest lectures. The main speaker is the head of the Network for Greening the Financial System, which is an organization composed of approximately 140 central banks worldwide working cooperatively to future-proof their banking strategies so that climate change and nature collapse do not cause their economies to collapse. These represent really significant names and institutions in the field, which understandably creates some nervousness for the instructor. The keynote will require speaking for an hour and a half, which is longer than any lecture the instructor has previously given, and this has necessitated generating entirely new slides for the presentation. As a Midwesterner, the instructor remarked that they do not typically like to boast, but in this case they decided to mention this significant professional opportunity.
Water Yield Model: Theory and Application
Scientific Foundation of Water Yield
Water yield, one of the ecosystem service models explicitly included in the report to be presented to the Chilean Central Bank, has already been introduced through a quick discussion of its scientific basis at the end of the previous class. The basic science of water yield relies on the relationship between precipitation and evaporation, but this relationship becomes substantially more complex when plants enter the equation. Plant roots play a really significant role in the water cycle by increasing the extent to which groundwater recharges into the soil rather than simply running off the surface. Simultaneously, plants extract water from the soil through their root systems and transpire this water back into the atmosphere, further reducing the net water yield available for other uses or reaching downstream areas. This combination of processes—infiltration through plant roots, soil percolation, and transpiration—creates the fundamental dynamic that the water yield model attempts to capture.
Connecting Theory to Student Data Collection
The instructor wants to connect the water yield theory to the data that students have been collecting for the class throughout the semester. Screenshots included in the course slides show how the instructor has organized their own data if students want to refer back to this example for their own organizational approaches. Students are obviously free to organize their data however they prefer, but the instructor’s approach provides a useful template. The instructor maintains a folder structure for APEC 3611, which contains the repositories used to publish the course website. Previously, students had been using InVEST base data, where all the different ecosystem services and their associated data were ready to input directly into the model without significant preprocessing.
The Challenge of Real-World Data Organization
The instructor has taught InVEST many times across different courses and institutions, and almost universally the main sticking point is when students rightfully point out that it is easy when all the data is provided in a pre-organized format. The experience of using InVEST feels deceptively easy when all the data is nicely prearranged by someone else. That is why for this particular class, the instructor actually recommends keeping a separate folder organized for the final country report. That separate folder is where students should download their country-specific zip file. The instructor will use Nicaragua for demonstration purposes, believing that nobody else selected Nicaragua for their country project. This zip file has been placed in the appropriate location, and the instructor and students will be using this data for the InVEST analysis during this class session. Even an initial exploration of the Nicaragua data makes apparent that it is not organized by model, so one of the key jobs students will undertake in their country reports is figuring out what is actually contained in these data files and how it should be organized.
Data Literacy as a Professional Skill
As a dedicated researcher, the instructor notes that most of their time on new research projects is spent just looking through datasets and reading documentation. This is a very real skill that professionals use constantly in environmental science and related fields. One of the key files the instructor wants students to reference is the documentation included with the data packets; it looks somewhat different for each country, but not dramatically so, and the instructor will use the Nicaragua version. It is very tempting for students to ignore files like readme.pdf, but the instructor strongly cautions against this approach: these readme files contain condensed and critical information. All of this data comes from the Integrated Economic Environmental Modeling Platform created by one of the instructor’s colleagues and friends, O’Neill Banerjee. The documentation describes what is inside each data packet. These packets are designed to be plug-and-play for the four InVEST models that students have the option of running in this course. The documentation also gives a good set of descriptions for how students might go about using the models, describing the different sections and providing important notes on all four models. The instructor strongly recommends reading this documentation carefully to get up to speed on what is contained in the data package.
Data Sources and Documentation Structure
The documentation will describe where the data came from and how it was assembled. If students were undertaking this analysis for another region for which they did not have a person providing them the data already compiled, this documentation section is where they would go to understand exactly how to proceed and where to obtain the data themselves. It documents precisely how students would conduct the process and gather the necessary datasets independently. A really useful table is included in the documentation showing the four ecosystem service models that these data packets are built to support. This table also shows which of the data layers are used in which of the different models. For instance, the carbon storage model will use land use data and information on soil carbon storage, while the annual water yield model will use substantially more data layers. Students should check out these tables in detail to understand the full scope of what data are involved in each model.
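The documentation’s model-by-layer table can be thought of as a lookup from model to required inputs. The sketch below abbreviates the layer lists and uses made-up layer names, purely to illustrate how layers are shared across models:

```python
# Abbreviated, illustrative version of the documentation's table showing
# which data layers feed which InVEST models. Names are placeholders,
# not the actual filenames in the data packets.
model_inputs = {
    "carbon_storage": ["lulc", "soil_carbon"],
    "annual_water_yield": ["lulc", "precipitation", "reference_et",
                           "root_restricting_depth", "pawc",
                           "watersheds", "biophysical_table"],
}

# Land use / land cover is the layer the two models have in common.
shared = set(model_inputs["carbon_storage"]) & set(model_inputs["annual_water_yield"])
print(shared)  # {'lulc'}
```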
Country-Specific Considerations and Data Challenges
For specific countries, many of them have country-specific information about the data included in their documentation. There are always different challenges when working with data from many countries, and even when using global datasets, there might be different complications or complete omissions of data that need to be understood. Students will want to read through these country-specific notes carefully. A lot of the information will be the same as what is in the general README, but these notes will highlight any important things that need to be known for working with that particular country’s data. The instructor would strongly recommend that if students are using Google Drive Desktop Sync, they should copy this data over onto their computer before beginning analysis. It can get quite challenging when pointing to cloud-hosted files and trying to perform intensive geospatial analysis on them. The best and most reliable way to do this work is to download the data. The instructor has downloaded the zip file and extracted it on their local computer, so they are not working directly from Google Drive, and they recommend this same approach for students.
Getting Started with InVEST and QGIS
Initial Setup and Workspace Configuration
With all that background and setup information covered, the instructor invites students to open up both InVEST and QGIS, as they are going to dive straight into working with the water yield model. As InVEST loads up on students’ computers, they should see the home screen with all the available models listed. If InVEST loads directly into one of the different models and students do not see this full list of options, they can simply click on the home button that brings them back to the interface showing all the available models. The class will be working with the annual water yield model today. This process should start to feel familiar now that students have gone through this setup with two other models in previous sessions. The big difference in today’s session is making sure that the workspace is configured to point to the country-specific folder that each student is using. For the instructor, teaching this class means the final report will be based on Nicaragua, and that is where the workspace should be configured to point. That is where the instructor recommends setting each student’s workspace as well.
ISO Country Code Conventions
The instructor pauses to ensure that everyone is following along appropriately, because they do not want to power ahead and leave behind students who are still setting up their workspaces. Students should note that their files will not be named NIC—that is the three-letter code specific to Nicaragua. Each student’s country-specific files will carry their own three-letter code from ISO 3166-1, the international standard that assigns three-letter codes to countries. The instructor asks whether students have the country data downloaded yet or whether they are still in the process of finding and downloading it. After confirming that some students are still working on getting their data, the instructor indicates they will continue moving forward while checking back again to ensure everybody has reached this point in the setup process.
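As a quick illustration of the naming convention, a lookup from country to ISO 3166-1 alpha-3 code might look like the following; the idea that the data folder is named after the code is an assumption made for illustration:

```python
# A few ISO 3166-1 alpha-3 codes, the convention used to name the
# country-specific data (e.g. NIC for Nicaragua).
iso_alpha3 = {
    "Nicaragua": "NIC",
    "Mexico": "MEX",
    "Chile": "CHL",
}

def data_folder(country):
    """Hypothetical helper: the expected folder name for a country's data."""
    return iso_alpha3[country]

print(data_folder("Nicaragua"))  # NIC
```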
Using File Suffixes for Model Iteration
One thing to note about the InVEST interface is that students have been skipping over the file suffix option, but this is actually a really useful trick that becomes important when conducting sensitivity analysis or testing different model specifications. If students want to run multiple different versions of the model—for instance, to test what happens if they use one precipitation layer versus another—this small text box lets them put a suffix on the end of all the output files. Instead of each new run completely overwriting the previous files, the outputs from each run will have that suffix appended, allowing them to be kept separate. Students can then compare the multiple output files to understand how changes in inputs affect the model outputs. This functionality can be quite useful for their final projects.
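The suffix mechanic can be sketched in a few lines of Python; the exact filenames InVEST produces may differ, but the idea is appending the run suffix before the file extension:

```python
import os

def suffixed(path, suffix):
    """Append a run suffix before the file extension, the way InVEST's
    results-suffix option keeps runs from overwriting each other.
    (Illustrative sketch; InVEST's exact naming may differ slightly.)"""
    root, ext = os.path.splitext(path)
    return f"{root}_{suffix}{ext}" if suffix else path

print(suffixed("water_yield.tif", "chirps_precip"))
# water_yield_chirps_precip.tif
```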
Water Yield Model Parameters
Precipitation Data
For the precipitation parameter, that is obviously one of the main drivers of water yield and the amount of water available for the ecosystem and human use. This is where students are going to get themselves accustomed to navigating a different file structure than they may have used before, but the Nicaragua example’s file structure is pretty straightforward to follow. The precipitation file is named annual precipitation, and in the instructor’s example, it is specifically the Nicaragua annual precipitation file. Students should look for the equivalent annual precipitation file for their own country in their country-specific data folder.
Reference Evapotranspiration
For the reference evapotranspiration parameter, the instructor has located an example file, and in the Nicaragua case, it is named Nicaragua reference evapotranspiration. This parameter drives the amount of water that is returned to the atmosphere through evaporation from soil and water surfaces plus transpiration from vegetation.
Root-Restricting Layer Depth
Root-restricting layer depth is an important parameter in the water yield model that can sometimes be overlooked. This parameter is represented as a geospatial map indicating the soil depth at which root penetration is strongly inhibited because of physical or chemical characteristics of the soil. This parameter really matters because it indicates how far down into the soil profile plants can push their roots before hitting bedrock or other impenetrable layers. If you are on a mountainside that has not had much vegetation growing on it for very long, you are going to have pretty thin soil, and the root-restricting layer is going to be relatively close to the surface. Essentially, the root-restricting layer depth indicates where the soil transitions to stone. That information can be derived from soil maps created by soil scientists and other experts, and it has already been preprocessed and provided in the data packages.
Plant Available Water Content
Plant available water content depends on the vegetation type of the area and is represented as a fraction. This parameter answers the question: what is the fraction of water that can be stored in the soil profile available to plants? This depends on all sorts of different attributes of the soil, including texture, structure, and mineralogy, but it is absolutely critical in the water yield model context because it determines how much water percolates into the groundwater versus how much is evapotranspired up through the plant back into the atmosphere.
Land Use Land Cover Data
Land use land cover is a parameter that students have worked with before in previous ecosystem service models. In this case, the data layer used is the LULC CCI layer, where CCI stands for Climate Change Initiative, a project of the European Space Agency. The European Space Agency, known as ESA, does a whole lot of interesting research nowadays, and although they may not have rockets as advanced as some other space agencies, they put their rockets to better use. When specifying this parameter, students should point to the TIFF file, which is the raster format, rather than to any XLS or other tabular files.
Biophysical Tables
For the biophysical tables parameter, this information is found under Model Lookup Tables in the data structure. Students need to make sure they point the model to the annualwatercci.csv file specifically, which contains the necessary biophysical coefficients for the water yield model applied to their landscape.
Estimating Model Parameters
The Challenge of Parameter Estimation
Now it is going to be a little harder for students to fill in the remaining parameters, because they are not explicitly provided in the data. If students really want to impress the instructor on the final project, they could look up values for these parameters in academic literature published for their particular country. Alternatively, students could use the user’s guide for the InVEST model, which gives the typical range of values these parameters take across different environmental contexts. For the purposes of getting the model running today, students can just choose something in the middle of that range. Of course, students could always spend more time justifying these parameter choices if they wanted to really develop their projects. If students submit this work to peer review and go down the route of becoming a scientist, this is usually where they get criticism: why did you choose a value of 15? Is that just the middle of what the user’s guide says? A reviewer would ask whether the choice is based on local research or expert opinion; the midpoint of a published range is not usually considered a very strong argument from a scientific standpoint. But it will work for the purposes of this class assignment and for getting started with the model. For today’s demonstration, the instructor is going to use a Z parameter value of 15.
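For context, the InVEST annual water yield model is built on a Budyko-type (Fu) curve in which the Z parameter shapes how much precipitation evapotranspires rather than running off. A sketch of that relationship, with illustrative input values, shows why drier pixels yield a smaller fraction of their precipitation; consult the InVEST user’s guide for the authoritative formulation:

```python
def water_yield_fraction(precip, et0, kc, awc, z):
    """Fraction of annual precipitation yielded as runoff, following the
    Budyko-type (Fu) curve described in the InVEST user's guide.
    precip and et0 in mm/yr, kc a vegetation coefficient, awc the
    plant-available water content in mm, z the seasonality constant."""
    pet = kc * et0                      # potential evapotranspiration
    omega = z * awc / precip + 1.25     # curve shape parameter
    ratio = pet / precip
    aet_frac = 1.0 + ratio - (1.0 + ratio ** omega) ** (1.0 / omega)
    return 1.0 - aet_frac               # share of precip that runs off

# Illustrative numbers only: the wetter pixel yields a larger fraction.
wet = water_yield_fraction(precip=2000, et0=1200, kc=1.0, awc=150, z=15)
dry = water_yield_fraction(precip=800,  et0=1200, kc=1.0, awc=150, z=15)
print(round(wet, 3), round(dry, 3))
```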
Watershed and Sub-Watershed Data
Then there are a few more elements that need to be specified. Students are going to have to give the model both a watershed vector file and a sub-watershed vector file. Before proceeding with the technical details, the instructor wants to talk about watersheds themselves, as they are a foundational concept in hydrology and ecosystem service modeling. The instructor has referenced this concept very briefly before, but it deserves more thorough explanation. If students imagine having a country, let us just pretend this is an island country. Like most islands, this country is probably volcanic, so there is probably a mountain right in the middle of it. Just pretend that mountain represents the topography of the island.
Understanding Watershed Delineation
If students want to know what the watersheds are for this island, and they identify the peak, they can think about defining a watershed by asking the question: if a drop of water landed at some random location on the island, where would it flow? The answer depends entirely on the valleys and the slopes. The water would flow down through the valleys and eventually out to the ocean. Water landing at some other point on the island is going to flow down some other valley, depending on the local topography. These flow paths are not yet streams, although traced out they might look like a simple stream network; at some point the accumulated flow gets so substantial that it actually does become a recognizable stream. But basically, a watershed is defined by the very bottommost point where water flows out and the whole area that drains into that outlet point. It is basically just a catchment area, the same concept students learned about before when discussing the sediment retention model.
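The drop-of-water thought experiment can be turned into a toy algorithm: follow the steepest descent from every cell of a small elevation grid, and group cells by the outlet they reach. Real delineation tools handle flats and pits much more carefully, but the core idea is the same:

```python
# Toy watershed delineation on a tiny elevation grid: each cell drains
# to its lowest neighbor, and a watershed is the set of cells that end
# up at the same outlet. Elevations are made-up numbers.
elev = [
    [9, 8, 9, 9],
    [7, 6, 7, 8],
    [5, 4, 6, 3],
    [2, 1, 5, 0],   # two outlets: the cells holding 1 and 0
]

def downhill(r, c):
    """Steepest-descent neighbor, or None if (r, c) is an outlet."""
    best, best_elev = None, elev[r][c]
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nr, nc = r + dr, c + dc
        if 0 <= nr < 4 and 0 <= nc < 4 and elev[nr][nc] < best_elev:
            best, best_elev = (nr, nc), elev[nr][nc]
    return best

def outlet(r, c):
    """Follow the flow path from (r, c) until it reaches an outlet."""
    step = downhill(r, c)
    while step is not None:
        r, c = step
        step = downhill(r, c)
    return (r, c)

# Group cells by shared outlet: each group is one watershed.
watersheds = {}
for r in range(4):
    for c in range(4):
        watersheds.setdefault(outlet(r, c), []).append((r, c))
print(len(watersheds), "watersheds found")  # prints: 2 watersheds found
```

Starting the same grouping from an intermediate point instead of the final outlet gives exactly the nested sub-watershed idea described above.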
Sub-Watersheds and Hierarchical Thinking
The cool thing is that students can also think about sub-watersheds as a hierarchical concept nested within the larger system. What matters is that this is the whole watershed because the water flow really ends there at the outlet, but students could also conceptually ask what would happen if they started counting from some higher point upstream. They could then draw the sub-watershed, which would be the subset of the larger watershed of all the points that flow into that intermediate point, which will then obviously continue flowing on down to the ultimate outlet. This hierarchical nesting of watersheds is really important in hydrology.
Applications to Ecosystem Service Modeling
Hydrological engineering comes in here as a key consideration, and lots of different models operate on a watershed-by-watershed basis because water flows downstream and interacts with other water and materials at different scales. Students might be thinking about the chemical mixing that happens as nutrients and other compounds combine with everything else entering the water as it travels down the stream and eventually makes it out to the ocean or into a reservoir. So the InVEST model is going to report these results at both the full watershed and the sub-watershed level. For the Nicaragua data that the instructor is using, the watershed and sub-watershed files are nicely labeled, making identification straightforward. The first one for students to select is the watershed file, and then for the other one, students select the sub-watershed file. The instructor indicates they are going to circulate around the classroom and check on people’s data to help troubleshoot, but students can also keep powering ahead if they want to.
Data Formatting Across Countries
Some of the countries are not formatted exactly the same way as others in the data packages provided, but so far everything is looking great as students load their data. The data should be relatively consistent across countries, but students may notice some variations in naming conventions or file organization.
Running the Water Yield Model
Optional Demand Tables
For the optional demand tables, students could run the water yield model without them, but the model interface prompts students to locate them. The instructor actually thought the demand table would be there but is not finding it immediately, and asks whether anybody else has found the demand table in their country-specific data yet. Since it is not turning up, the instructor decides to run the model without it for now and gives the go-ahead for students to do the same.
Practical Considerations for the Final Project
For students’ final projects, locating the data for and properly incorporating the water yield model is going to be one of the heavier lifts if they want to include this particular ecosystem service in their analysis. Figuring out how to work with this model and its various data requirements will be an important skill, and the InVEST user’s guide is going to be absolutely essential for this work. The instructor has already clicked Run on their model, so students can do that as well and see what happens with their results.
Processing Time and Data Scale Considerations
For the very first time in the class, students are not all running on the exact same data. This is where students get to discover whether their country is large or small in terms of how long the model takes to run. Much of the instructor’s professional life has been focused on getting models to run faster on computers, optimizing code and algorithms for efficiency. The instructor chose a very small country in Nicaragua, so the model took only 4.18 seconds to complete. But as students get to bigger and bigger countries, no matter how fast their computer is, it starts to become a real big data challenge that cannot be overcome by hardware alone. The instructor asks who has the largest country among the students. Mexico emerges as the largest, and the instructor checks whether that student is still waiting for their model to run. The real reason the instructor wanted to do this analysis in class is to acclimate students to the reality that geospatial analysis can take time. This is a first stress test of students’ computers to see how they handle intensive processing. If students really want to throw in the towel and choose a smaller country instead, the instructor says maybe they will allow it, but they encourage students to see if they can make it work, and to let the instructor know if they have any troubles. The model should generally run in a reasonable amount of time for most countries. But students with one of the really big countries might not get results in the time available right now; those students will have to plug in their computers and wait overnight or for an extended period for the model to complete. If students think they are not going to get to a battery charging facility for their computer, they might want to hit pause on the model run.
The instructor has had that happen before where they were running a model and they could not drive home fast enough to get their computer plugged back in before the battery died. When using all the cores on the computer’s CPU, the device uses the battery way faster than normal operation. Only one student has finished so far, which suggests the instructor’s prediction about processing time was accurate, so the group is doing well. The instructor decides to take a look at the results from that completed model run.
Interpreting Water Yield Model Results
Understanding the Output Structure
This is one of the models where the developers of InVEST have not yet created an automatic report generation system. The other ecosystem service models had two buttons at the completion screen: one labeled Open Workspace, which opens the folder where results are stored, and another labeled Open Report, which provides a nicely formatted HTML document with all the results laid out. For this water yield model, students just have to do it themselves using the output files. What students see after running the model is the results for Nicaragua, or whatever country they are working with. The instructor might have been wise to put the outputs in a separate folder from the input data, but the data structure is kind of nice in that it has an output data folder, and that folder is where all the actual results are stored.
Visualizing Watershed Results in QGIS
The instructor is going to open up their QGIS and show students the first and most basic thing they might want to do with shapefiles in QGIS, which is to load and visualize them. They are going to start with the easy one: watershedResultsWyield.shp. The instructor drags that file over into the QGIS workspace to display it. Here are all the watersheds present in Nicaragua displayed as polygons on the map. Because the class is also learning the basics of QGIS, the instructor wants to point students to some of the key things they can do with shapefiles. Like before with raster files, if students lose track of a layer and are pointing at somewhere that does not look like it has any data, they can use Zoom to Layer. They can also select specific polygons with a selection tool. Another useful tool is the one with the little information icon: if students click that and then click inside a polygon, it shows all of the information and attributes associated with that polygon. It might be too much information, but it shows students what data are being stored in this shapefile. The shapefile has these polygons, and for each polygon, it records a bunch of extra information and attributes. GIS experts learn a lot about what these different attribute fields represent. The instructor just points students to the end of the attribute list, where they can notice fields for mean water yield and water yield volume. These are the key results that InVEST generated and stored in the shapefile.
Viewing the Full Attribute Table
The instructor also wants to show students how to look at the table of all the results at once. So instead of just clicking on one polygon to see its attributes, what if students want to see results for water yield for all the polygons simultaneously? For that, students should right-click on the layer name and go down to Open Attribute Table. What students will get is a table view where each row is the data associated with one of the polygons. Before, when the instructor clicked on a polygon, it was essentially just showing that single row. But now students are seeing all the different polygons and all their associated data in a tabular format. Another fun thing students can do is click through this table, and it will highlight which of the polygons it is representing in the map. Looking here at the attribute table, a lot of this pre-calculated stuff was not generated by this particular model run, but the class is going to skip over that. What the class really cares about is if students scroll all the way to the right of the attribute table. That is where the water yield volume column is located. This column is going to show what is literally the volume of water measured, which the instructor believes is in cubic meters, though students can refer to the InVEST user’s guide if they want confirmation. But this is the key model output saying that given everything the model knows about evaporation, transpiration, root interactions, and all the other hydrological processes, this is how much water flows into the bottom point of each of the sub-watersheds. The instructor sees that some of the watersheds do not have the sub-watershed data, so students might want to load up that secondary result, the subwatersheds.shp file, and look at it there. But the point is, this water yield volume is the key result they have been working toward.
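Once students can see the attribute table, a natural next step is summarizing it. If the table were exported to CSV (QGIS can do this through the export dialog), totaling the water yield volumes takes a few lines of Python; the field name and numbers below are made up for illustration:

```python
import csv
import io

# A hypothetical CSV export of the watershed results attribute table.
# The water yield volume field (here called wyield_vol, in cubic meters)
# stands in for the key InVEST output discussed above; values are made up.
exported = io.StringIO("""ws_id,wyield_vol
1,1200000
2,450000
3,87500
""")

rows = list(csv.DictReader(exported))
total_m3 = sum(float(r["wyield_vol"]) for r in rows)
print(f"Total yield across {len(rows)} watersheds: {total_m3:,.0f} m^3")
# Total yield across 3 watersheds: 1,737,500 m^3
```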
Connecting Biophysical Results to Economic Value
Tying this back to how the class has been talking about ecosystem services throughout the course, students should recall that there is the ecosystem structure. From there, it produces, through an ecosystem service production function, some level of biophysical ecosystem service provision. That is what we have in this column: the volume of water. That biophysical provision is the quantity that students can then, hopefully, multiply by some price or other monetary metric to bring it from biophysical terms into economic terms that policymakers and the public can understand. Water yield is a biophysical variable, not an economic one; it is measured in physical units like cubic meters. But students might suspect that water, in particular, connects fairly directly into the economy, especially where it is used for hydroelectric power generation or drinking water supply. After viewing the results, the instructor is going to circulate around the classroom to check the status of everybody's model runs and see whether anyone is having issues.
Technical Reminders for QGIS Users
A good question has come up: if students have not gotten the attribute table up on their screen like the instructor has shown, just as a reminder, they should go to their Layers tab and right-click on the layer that they want to look at and go to Open Attribute Table. The attribute table is essential for viewing the full results. Some people are asking about the icons for different tools being in different locations on people’s computers, which can make it confusing to find tools. The tool that students want to select if they want to select a specific polygon is the one that has the pointer arrow inside a little box next to a bigger box. That is the polygon select feature. Another thing to note is that students can select multiple polygons at once. That can be useful for various purposes in their analysis too.
Exporting Subsets of Spatial Data
This class is noted as being way above average in terms of tech competency, which is very good and will serve students well in their future environmental work. The last thing the instructor might say, in terms of the bonus GIS training students are getting from this class, is that oftentimes students only care about one sub-watershed, or maybe they have a map of the world and only care about one country. In those cases, QGIS gives a very easy way to create new files containing only the subset of data that matters. Say students only care about this watershed, this watershed, and this watershed. If students select the ones they care about, they can right-click on the layer they have loaded, go to Export, and choose Save Selected Features As. What is really nice is that it defaults to the GeoPackage format instead of the shapefile format. The instructor threw a lot of shade at the shapefile format earlier, but has been too busy prepping for the Chile trip to actually fix the data layers for the students, so they are kind of a hypocrite about this. Either way, the export will be a GeoPackage by default, though students can switch the format back to shapefile if they like it old school. The instructor wants to make sure students have no questions about this process before moving on.
Working with Attributes in QGIS
The instructor reminds students about opening the attribute table to view all features at once: right-click on the layer name and go down to Open Attribute Table. The terminology is a little different in QGIS than in ArcGIS, which can be a source of confusion for students who have worked in ArcGIS before. Unless there are any remaining questions, the instructor wants to pivot to the really important topic: what do students actually do with these water yield results?
Valuation of Water Yield: Hydropower Case Study
From Biophysical to Economic Values
The instructor wants to pivot to the topic of what students do with the water yield results they have generated. Students got the volume of water, right? What might students want to do with it if they are going to a policymaker and arguing that some environmental restoration program is or is not worth it from an economic standpoint? Basically, they are trying to do a cost-benefit analysis, but now properly including all the values that are either ignored as externalities or simply ignored because we do not even know the value exists, which is even worse than an externality. An externality is at least visible: we know it is there and that it has effects, even if nobody is pricing it. An unmeasured ecosystem value is worse still: an invisible value that most people do not even know exists.
Hydropower Valuation Framework
The valuation method from the ending point of InVEST is really straightforward because it is basically market economics plus physics. We have a pretty clear idea of why we spend so much money having the Army Corps of Engineers and similar agencies build dams. When you create a dam at the bottom of a watershed or sub-watershed, you know that, depending on how much water you let through, some subset of that water will fill the reservoir up to the height of the dam. In principle, if you had a dam higher than all the surrounding elevation, it would eventually fill up the whole basin, but that would be a pretty bad idea because now you would have a massive flood on your hands. The point is, however high we build the dam determines how much reservoir capacity there is, and that capacity directly constrains how much of the water volume we just computed in InVEST can be used. Since we already have the water volume for each watershed, we need a little more information: H sub D, the height of the water behind the dam at the turbine.
Physics of Hydropower Calculation
There are some other coefficients that go into the hydropower calculation, like gravity: the gravitational acceleration, approximately 9.81 meters per second squared, a constant that comes up in a lot of physical calculations. Gravity determines how much electricity we can get from falling water because potential energy is mass times gravity times height. There is also rho, the density of water, approximately 1,000 kilograms per cubic meter; the number is so round because the kilogram was originally defined by reference to the mass of a liter of water. That is a little bit of what goes into the calculation. Students would also need information from the government agency responsible for managing the dams in their country, which is actually pretty easy to get in most cases through public records or agency websites. A few additional things are a little harder to find. Students need to know the turbine efficiency: what percentage of the inflow water volume at the reservoir will actually be used to generate electricity? Most dams let a large portion of the water through without ever using it to generate electricity, because there may be more water flowing through than the dam has capacity to put through the generators.
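The pieces described above can be sketched in a few lines. This is an illustrative simplification of the energy-from-falling-water idea, not the exact InVEST valuation formula (the InVEST user's guide has the authoritative equations), and the example volume, head, and efficiency numbers are made up.

```python
# Energy from water passing through a dam, following the lecture's
# pieces: potential energy = mass * gravity * height, with only the
# turbine-efficiency fraction of the inflow generating power.

RHO = 1000.0  # density of water, kg per cubic meter (approximately)
G = 9.81      # gravitational acceleration, m per second squared

def hydropower_energy_kwh(volume_m3, head_m, turbine_efficiency):
    """Electricity (kWh) from `volume_m3` of water falling `head_m` meters."""
    mass_kg = RHO * volume_m3 * turbine_efficiency  # water actually turbined
    joules = mass_kg * G * head_m                   # E = m * g * h
    return joules / 3.6e6                           # 1 kWh = 3.6e6 joules

# Hypothetical example: 1 million cubic meters of yield, 50 m head, 85% efficiency
energy_kwh = hydropower_energy_kwh(1e6, 50.0, 0.85)
```

Substituting the InVEST water yield volume for each sub-watershed, together with the actual head and efficiency of the dam below it, would give per-watershed energy estimates.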
Economic Calculation of Hydropower Value
The final calculation combines all of the economically relevant components to determine the total value from hydropower. What do we get? The price of electricity multiplied by this epsilon D that we calculated above, which came from the previous two equations about gravity, density, head, and turbine efficiency; that gives us the quantity of electricity. Price times quantity minus total costs: now we are back to basic economics, right? This is total revenue minus total costs, the standard economic profit calculation. Then we add one extra term: the discounting factor. Students have seen this concept a bunch of times throughout the course. The farther into the future a flow of value accrues, the larger the discounting denominator grows with time t, and so the smaller the discounted value becomes. This means that dollar values farther out in time are worth less in present value terms. Discounting accounts for human time preferences and the opportunity cost of capital. Summing these discounted profits gives the net present value of hydropower at that dam: a single number that can be used in policy analysis.
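The revenue-minus-costs-with-discounting logic can be sketched as a short function; the three-year horizon, $1 million annual profit, and 5 percent discount rate below are invented for illustration.

```python
# Net present value of a stream of annual profits (revenue minus costs).

def npv(annual_profits, discount_rate):
    """Discount each year's profit by (1 + r)^t and sum, t = 0, 1, 2, ..."""
    return sum(p / (1 + discount_rate) ** t
               for t, p in enumerate(annual_profits))

# Hypothetical dam: $1 million profit per year for 3 years at a 5% rate
dam_npv = npv([1_000_000] * 3, 0.05)
```

A dollar of profit two years out counts for only 1/1.05² of its face value, which is exactly the "dollars in the future are worth less today" point from the lecture.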
Policy Relevance of Hydropower Valuation
What makes this relevant to policymakers is that if there is ever a cost-benefit analysis where something is going to disrupt the hydrological cycle—or maybe we are going to divert water to irrigation and it will not make it into the hydropower dam downstream—a model like this lets us more accurately compare the costs and benefits of that decision. Has anybody heard about the proposal to pipe water from Lake Superior down to Arizona? That is a big example of the kind of large-scale water diversion where hydropower impacts would matter. These are huge diversion projects that could have enormous hydropower implications. The instructor does not think that particular one is going to go through, but diversions like this do happen in various places around the world, and they are classic cases where local costs and benefits differ from the costs and benefits at the larger scale. That is the valuation method for hydropower, which is one important way to put economic value on the water yield that the InVEST model produces.
The Value of a Statistical Life
Introduction to VSL Concept
That is the valuation method for hydropower, which leaves us with just 10 minutes to talk about the very last topic the instructor wants to cover in this lecture: the value of a statistical life. What is kind of nice is that in calculating the value of a statistical life, which students will henceforth just call VSL because you see that abbreviation a lot, they will find that the basic methods are already familiar. It is going to be hedonics in most cases, just applied to a new context.
Hedonic Analysis Refresher
Students might think back to hedonics from earlier in the course. They talked about the value of a lake: how different amenities, like whether there is a boat launch or how clear the water is, affected the sale prices of houses on that lake, and how changes in water clarity in particular revealed how much people cared about water quality. Hedonic analysis requires a lot of hard-to-assemble data, but researchers can apply the exact same approach to measure how much people value their own lives.
Individual Perspectives on Valuing Life
The instructor acknowledges that some people have cognitive dissonance putting a dollar value on a life, and they wonder if that seems wrong to people in the class. When you put it in a specific person’s terms, it becomes a really hard-to-assess thing. But the fact of the matter is, the government does this calculation all the time in various policy contexts. What is sort of interesting is that different agencies have different numbers. The Pentagon has a much lower price or value that they put on a statistical life than many other agencies, and it is actually quite relevant to them because they do lose lives in military operations. They do not use hedonic analysis; they use replacement cost instead—essentially, how much does it cost to train up a soldier? The instructor can sort of see the logic at least from a decision-making metric, even if it feels cold. Oftentimes, though, people are not the military, and they are caring about environmental things instead. How can they do the valuation without looking at it from the perspective of replacement cost? Well, there has been tons of really awesome academic literature on the point that you can use people’s observed market behavior in job markets to determine how much they care about, in fact, their own life.
Job Market Differentiation and Risk
This comes from the fact that, just like with a house or a lake, there is going to be a big bundle of different attributes that people care about with respect to any given job. For a job, it might be what are the responsibilities, do people get to be a supervisor, do they need to travel, can they work from home? That last one is a big one now, especially after the pandemic. What are the hours? But then, critically, there is one attribute that is kind of unique to labor markets: risk of death or injury. We do not think about this too often in most of our daily lives; a lot of the jobs that college-educated people take on have essentially zero risk of death during normal work. But there is actually a ton of data about job risk because some jobs are genuinely risky, like working on an oil rig or driving a truck through a war zone. There is the TV show Ice Road Truckers—a genuinely risky job. Another is Deadliest Catch, where fishermen work the Bering Sea in very dangerous conditions. That is also very risky and well documented in popular media. Any job has some inherent risk of death or serious injury, though this risk ranges dramatically, from nearly zero for desk workers to very substantial for certain occupations.
Using Wage Differentials to Estimate VSL
We can leverage this fact and all the different observations on how much these riskier jobs pay their workers to determine, in the same way that hedonic analysis was used for house prices and water quality, how much a human’s life is worth in terms of willingness to accept risk. Basically, the way it works is researchers collect data on all those job characteristics—the things that might matter for wage determination, like are you a supervisor, what are the hours, how far do you commute? It is kind of like how many bathrooms in a house or whatever—because researchers are trying to isolate the effect of the risk. They use a statistical model that predicts wage as a function of multiple variables including education of the workers, physical attributes, hours worked, distance to the job, whether work can be done from home—a big one these days. But then researchers include all these things trying to describe all of the attributes that go into predicting wage. But then one last one: risk. This is what researchers call the coefficient of interest if you do statistics.
Specification and Interpretation of VSL Models
Assuming that you have all of these things correctly identified and that you did not leave anything important out—if researchers left out hours, that would be really bad, because the estimate for the risk coefficient would then pick up the effect of hours instead—then, if the model is well specified, researchers can look at how different risk levels across jobs affect the wage. The key question is how much wages must rise to compensate workers for, say, a one percent increase in the risk of death. That risk coefficient is what comes out of the regression and allows the valuation of statistical lives. This relates to environmental economics because for many of the things we do, such as cleaning up the air or a toxic spill, much of the value comes not through ecosystem services like sediment retention, but simply through keeping people from dying. The mortality reduction channel is often the dominant economic value, even when ecosystem services also have value.
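A minimal sketch of the wage regression on synthetic data may make the "coefficient of interest" idea concrete. Everything here is invented: the data are simulated so that the true risk coefficient, and thus the implied VSL, is $5 million, and the regression simply recovers it. A real study would use actual labor-market microdata and many more controls.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Synthetic worker data (all numbers invented for illustration)
education = rng.normal(14, 2, n)  # years of schooling
hours = rng.normal(40, 5, n)      # weekly hours worked
risk = rng.uniform(0, 0.01, n)    # annual death risk, between 0 and 1%

# True wage process: each 0.001 of risk adds $5,000 to the annual wage,
# implying a VSL of $5,000 / 0.001 = $5 million.
wage = (20_000 + 2_000 * education + 300 * hours
        + 5_000_000 * risk + rng.normal(0, 1_000, n))

# Hedonic wage regression: wage on a constant, education, hours, and risk
X = np.column_stack([np.ones(n), education, hours, risk])
coef, *_ = np.linalg.lstsq(X, wage, rcond=None)
risk_coef = coef[3]  # the coefficient of interest: the implied VSL
```

Dropping `hours` from `X` would bias `risk_coef` if risk and hours were correlated, which is the omitted-variable worry the lecture raises.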
Case Study: Environmental Justice and the HERC
The instructor will give one example: the HERC, the Hennepin Energy Recovery Center. It burns garbage and is very unfortunately located right in the middle of the city, next to a large number of low-income residents. There is a massive environmental justice aspect to this facility because it imposes costs on people who did not choose it and do not particularly benefit from the waste disposal service. How would researchers use the tools of VSL to establish the costs and benefits of a policy to close or replace the facility? The benefits of the facility are clear: it takes our garbage and burns it, managing the solid waste stream. But what are the costs to the surrounding residents? This is a stylized example, but suppose 10,000 people are exposed to the emissions from the facility, and closing the HERC would reduce each person's annual mortality risk by 1 in 10,000, a figure that could come from epidemiological studies. Suppose further that hedonic analysis shows each person is willing to pay $200 for that risk reduction. The instructor has jury-rigged the numbers so they come out nicely: 10,000 residents times a 1-in-10,000 risk reduction equals one statistical life saved by the policy. The second thing researchers need is the value of that life. The total willingness to pay is $200 across all 10,000 residents, or $2 million, which is the implied value of the one statistical life saved. These numbers are obviously not real, but the calculation gives researchers an estimate of the health benefit of getting rid of the HERC.
If the replacement of the HERC costs more than this calculated health benefit, that is important information. If it costs less, then closing the facility passes a basic cost-benefit test on health grounds. It does not say anything at all about the environmental justice component, and that is where there is a real caution. But it is just a good example of how researchers can use this VSL concept in some really meaningful debates about pollution and health.
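The stylized arithmetic can be written out explicitly. The numbers below are one internally consistent version of the lecture's illustrative figures: a 1-in-10,000 per-person risk reduction is assumed so that the 10,000 residents add up to exactly one statistical life.

```python
# Stylized HERC calculation; every number here is illustrative, not real.
residents = 10_000
risk_reduction = 1 / 10_000  # assumed per-person drop in mortality risk
wtp_per_person = 200         # dollars each resident would pay for that drop

lives_saved = residents * risk_reduction    # expected statistical lives saved
total_benefit = residents * wtp_per_person  # population-wide willingness to pay
implied_vsl = total_benefit / lives_saved   # dollars per statistical life
```

Comparing `total_benefit` with the cost of replacing the facility is the cost-benefit test described above; the environmental justice dimension sits outside this arithmetic entirely.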
Values and Range of Statistical Life Estimates
What the instructor will end with is that there is tons of data out there, and the class will return to this topic at the beginning of next lecture in more depth. There is a really rich literature on the value of a statistical life, and different studies give different numbers depending on the context and populations studied. Just so that students do not leave without this information: estimates range from a little less than a million dollars per life at the low end to around $20 million at the high end. That is the range of estimates for how much the economy, as revealed by these studies, values a human life in different contexts. These numbers shape real policy decisions about pollution control, workplace safety, and many other things.
Conclusion and Next Steps
Class Performance and Data Troubleshooting
The instructor asks whether students had any questions about running their country data through the models; it looked like everybody was pretty successful with their model runs. This class session was also structured to give students practice troubleshooting their country-specific data and all the challenges that come with real-world datasets. Either way, students should feel free to reach out to the instructor if they had any issues; now is a good time to get help, and the instructor will be available to assist. The next class meeting is coming up soon, and the instructor will see all students then.
Welcome to Day 4: Valuation Methods
Welcome to Day 4 of our discussion on valuation methods. Today’s session will cover the completion of the Value of a Statistical Life (VSL) analysis, followed by an interactive classroom game to reinforce these concepts. We will then discuss the various policies that emerge from different VSL estimates and explore other non-market valuation methods, including ecosystem services. This lecture represents a significant portion of the course material on valuation components.
Completing the VSL Framework
The Literature on Statistical Life Valuation
A vast and extensive literature exists on establishing the value of a statistical life through published, peer-reviewed studies. The critical methodological distinction in this body of work centers on the approach used to estimate these values. Almost all modern studies derive their estimates from labor market analysis, though a smaller subset employs contingent valuation methods. Understanding this methodological foundation is essential for comprehending how economists arrive at VSL estimates and the strengths and limitations of each approach.
Contingent Valuation Versus Labor Market Analysis
Contingent valuation represents a form of direct questioning where researchers ask people about their willingness to pay or accept compensation for various outcomes. When the topic concerns matters as critical as life and death, this method becomes especially challenging to execute accurately. People struggle to provide reliable responses when asked directly about their valuation of life itself. Consequently, the field has progressively shifted toward labor market analysis, particularly in more modern studies. This methodological evolution reflects both the practical challenges of contingent valuation and the theoretical advantages of observing actual market behavior.
The Labor Market Approach to VSL Estimation
The labor market method focuses on analyzing how workers respond to risky employment opportunities. Specifically, economists examine whether workers demand higher compensation to accept jobs with greater mortality risk. This revealed preference approach operates on the principle that workers’ actual decisions in the marketplace reveal their implicit valuation of risk. Rather than asking people hypothetically what they would pay, researchers observe real wage differentials between risky and safer jobs and use these differences to infer valuations of mortality risk.
A Worked Example: Understanding the VSL Calculation
Setting Up the Problem
Consider a straightforward example that illustrates the fundamental calculation underlying VSL estimation. Suppose a worker demands five thousand dollars more in annual compensation to accept a job with significantly higher mortality risk compared to a safer alternative position. We need to establish two pieces of information clearly. First, we need to identify the willingness-to-accept risk premium, which represents the wage differential necessary to induce the worker to take on additional risk. This premium is not simply the total wage for the risky job, but rather the difference between what the worker would require for the risky job versus an unrisky alternative. Second, we need to specify the actual mortality risk associated with the risky job. Suppose the job carries a one-in-one-thousand (0.1 percent) probability of death annually. This means that on average, if one thousand workers accept this job, one of them will die.
Calculating the Value of a Statistical Life
The calculation of VSL from this information follows a logical progression. If workers will only accept the risky job for an additional five thousand dollars of compensation, and if a one-in-one-thousand mortality risk means that among one thousand workers one will die, then the value of a statistical life is simply the risk premium multiplied by the inverse of the risk probability. In this case, five thousand dollars times one thousand equals five million dollars. This calculation reveals that workers implicitly value a statistical life at five million dollars based on their wage-risk trade-offs.
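The calculation is a single division; this sketch uses the $5,000 premium and a one-in-one-thousand annual death risk from the worked example.

```python
# VSL implied by a wage-risk trade-off: premium divided by the risk change.

def vsl(risk_premium, annual_death_risk):
    """Dollars of compensation demanded per expected death."""
    return risk_premium / annual_death_risk

# $5,000 extra pay demanded to accept a 1-in-1,000 annual death risk
vsl_estimate = vsl(5_000, 1 / 1_000)
```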
Why This Approach Matters
This methodology provides a more satisfying approach to valuing life than direct questioning because it is based on observed marketplace information. Real people are working real jobs and making actual decisions about accepting or rejecting risky positions. Economists can extract from the different prices paid in these markets the real premium that people place on their lives when accepting risky employment. The behavior revealed through actual labor market choices provides a foundation grounded in genuine economic decision-making rather than hypothetical responses to survey questions.
The Abalone Fishing Game: A Classroom Demonstration
Setting the Stage for the Experiment
To reinforce these concepts with greater specificity and provide a hands-on learning experience, we conduct an interactive classroom game centered on abalone harvesting. Abalone are mollusks that live underwater in shells and are harvested by divers who feel around underwater with their hands in murky water to locate them. They represent a delicacy and, for our purposes, an example of a genuinely risky occupation. In this classroom experiment, the instructor takes the role of a fishing boat captain who already owns the boat and equipment but requires workers to do the dangerous diving. The captain will hire four divers whose task is to harvest abalone and bring them to market.
The Physical Setup and Game Materials
The physical setup for this game uses common household items to simulate fishing grounds. The primary constraint was finding items that looked and felt different while representing different outcomes in the game. Flavored Keurig coffee pods serve as the base materials for this experiment. The good fish, representing valuable and harvestable abalone, are the vanilla, caramel, and mocha flavored pods. These are desirable outcomes. The bad fish, representing dangerous electric eels that will harm the divers, are the maple pecan flavored pods, which the instructor identifies as the worst possible coffee flavor. The goal of the game is for workers to harvest three abalone each to validate their compensation.
How the Game Works
The game operates through a bidding and labor market clearing process. Workers must write their names on a sheet and choose a wage between one and five dollars that represents the compensation they would require to become a diver. The game proceeds across three different rounds, each representing different fishing grounds with different risk profiles. The captain will then choose workers based on the bids, with the market clearing at a wage where the captain can attract exactly four workers. Workers who bid at or below the clearing wage are selected to participate. If a worker bids below the clearing wage, they benefit because they receive the higher market wage. Compensation is provided in class points, which can be used to skip homework assignments or restore late or incomplete work to full credit value. If a worker successfully completes their job by drawing three good fish, they receive class points equal to the wage they bid or the market wage, whichever is higher.
Safety Valves and Risk Mitigation
Because the game involves some workers being uncomfortable with the risk component, there is a built-in safety valve. If a worker fears the fishing experience, they can simply bid five dollars, which is set above the expected market clearing price. The market will clear somewhere below this threshold, meaning that if workers bid the maximum, the captain will need to find other workers willing to work for less. Additionally, workers who participate but die during the experiment, though they lose their class points for that round, can continue in the class without penalty. This eliminates any genuine consequence beyond the academic incentive.
The Three Fishing Areas and Their Risk Profiles
The game features three distinct fishing areas with progressively increasing risk levels. In Area 1, all nine items in the fishing pool are good abalone with no dangerous eels. This represents a completely safe fishing ground where divers have a one hundred percent chance of drawing a good fish on each draw. Since workers must draw three times, the probability of successfully completing the job is one.
Area 2 introduces moderate risk by adding one maple pecan eel to the nine abalone, making ten items in total, so ten percent of the items are dangerous. To calculate the probability of being electrocuted when drawing three times, we first multiply the probability of drawing a good fish across all three draws, treating the draws as independent: 0.9 times 0.9 times 0.9 equals 0.729. The complement of that survival probability means there is a 27.1 percent chance of drawing at least one eel and being electrocuted.
Area 3 represents a truly dangerous fishing ground with three eels among ten total items, meaning thirty percent of items are dangerous. The probability of survival when drawing three times is calculated as 0.7 times 0.7 times 0.7 equals 0.343, which means there is a 65.7 percent chance of drawing at least one eel and being electrocuted. This represents a substantially riskier job.
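The survival arithmetic for the areas can be sketched as follows, treating each of the three draws as independent with probability eels/total of hitting an eel, just as the lecture's 0.9 times 0.9 times 0.9 calculation does.

```python
# Probability calculations for the fishing areas, with each draw treated
# as independent (i.e., drawing with replacement), as in the lecture.

def survival_probability(eels, total_items, draws=3):
    """Chance of drawing zero eels across `draws` independent draws."""
    p_good = 1 - eels / total_items
    return p_good ** draws

area2_death = 1 - survival_probability(1, 10)  # one eel in ten items
area3_death = 1 - survival_probability(3, 10)  # three eels in ten items
```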
Labor Market Theory Applied to the Game
This classroom experiment can be directly mapped onto fundamental microeconomic theory. On the vertical axis of a standard labor market diagram, we would plot wages ranging from one to five dollars. On the horizontal axis, we would plot the number of workers. The firm, represented by the fishing boat captain, has a fixed demand for exactly four workers, represented as a vertical line at four workers. This labor demand does not change regardless of wage because the captain needs exactly four people to harvest twelve abalone at three per person.
On the supply side, individual workers will determine their willingness to work at different wages based on the risk profile of the job. Workers have varying risk preferences and requirements, so the labor supply curve will slope upward, indicating that higher wages are needed to attract more workers. The market will clear where the labor supply curve intersects the fixed demand, determining the wage at which exactly four workers are willing to participate. The captain, as a profit-maximizing firm, will set the wage at this clearing point. If the wage is set too low, fewer than four workers will volunteer. If the wage is set too high, the captain will not maximize profit. The equilibrium wage emerges where supply meets demand.
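One way to sketch the market-clearing mechanism, with a hypothetical set of bids: sort the bids and take the fourth-lowest, since the captain needs exactly four divers and pays every hired diver the clearing wage.

```python
# Find the market-clearing wage for the abalone game. Each bid is the
# minimum wage a worker will accept; the example bids are invented.

def clearing_wage(bids, workers_needed=4):
    """Lowest wage at which exactly `workers_needed` divers can be hired."""
    ranked = sorted(bids)
    if len(ranked) < workers_needed:
        raise ValueError("not enough willing workers")
    # The fourth-lowest bid clears the market; everyone hired is paid
    # this wage, so workers who bid under it come out ahead.
    return ranked[workers_needed - 1]

wage = clearing_wage([2, 5, 1, 3, 2, 4])  # hypothetical bids in dollars
```

The diver who bid $1 still earns the $3 clearing wage, which is exactly the "if a worker bids below the clearing wage, they benefit" rule from the game description.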
Running Round 1: The Safe Fishing Ground
In the first round, workers are asked to write down the wage they would require to fish in Area 1, where all nine items are good abalone and there is no risk of electrocution. After collecting all bids, the captain announces the clearing wage. In a typical result, five workers indicate they are willing to work for two dollars. Using a random selection mechanism, the captain selects four workers to proceed. These four workers come to the front of the classroom and each draw three items from the pool. Since all items are good abalone, all workers successfully complete their draws and receive their payment in class points.
Running Round 2: Moderate Risk
Round 2 introduces risk by adding one maple pecan eel to the nine abalone. Now there is a ten percent chance of drawing the eel on any single draw. Workers are asked to indicate the wage they would require to participate in this round, knowing that the risk profile has changed. When workers announce their bids, a typical result shows that most workers maintain their bids at two dollars, indicating that they do not perceive the ten percent risk as substantial enough to warrant higher compensation. However, some workers may increase their demands to three or four dollars. The captain selects four workers based on the market clearing wage, which typically remains at two or three dollars.
The selected workers come forward and draw three times each. With a ten percent chance per draw, approximately one worker may draw an eel and be electrocuted. That worker receives no class points and must sit out the remainder. The surviving workers typically all complete their draws without electrocution and receive their class points. This round demonstrates how workers respond to the introduction of moderate risk with minimal wage increases in many cases.
Running Round 3: High Risk
In the final round, the risk profile increases substantially. The captain removes two abalone and adds two more eels, creating a pool with three eels and seven abalone. Now thirty percent of the items are dangerous. Workers are explicitly told that if they are electrocuted, they receive nothing, and they must carefully consider whether the compensation justifies the risk. When the captain asks for bids, the results typically show a substantial shift in worker preferences. Fewer workers are willing to work for one or two dollars. More workers demand three or four dollars to compensate for the increased risk. Some workers join the five-dollar group, essentially opting out entirely.
The captain selects four workers based on the new market clearing wage, which has typically increased to three or four dollars. These workers come to the front and draw three times from the pool. With roughly a sixty-five percent chance of at least one electrocution across a worker's three draws, multiple workers are likely to draw an eel. Those who do receive no compensation. The survivors receive their class points at the higher wage. This round viscerally demonstrates how workers adjust their compensation demands when faced with substantially higher risks, consistent with the theoretical predictions of labor market economics.
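The lecture's quoted electrocution probability depends on how the three draws are modeled, which the text does not pin down. A minimal sketch of both readings, assuming a pool of seven abalone and three eels: treating the draws as independent (with replacement) gives roughly the sixty-five percent figure cited above, while drawing three items without replacement gives a somewhat higher probability.

```python
from fractions import Fraction

# Round 3 pool: 7 abalone, 3 eels; each worker draws 3 items.

# Reading 1: draws are independent (with replacement).
# P(at least one eel) = 1 - P(no eel on all three draws).
p_replacement = 1 - (7 / 10) ** 3          # ≈ 0.657, close to the ~65% cited

# Reading 2: the 3 items are drawn without replacement.
p_no_eel = Fraction(7, 10) * Fraction(6, 9) * Fraction(5, 8)   # = 7/24
p_without = 1 - p_no_eel                                        # = 17/24 ≈ 0.708

print(round(p_replacement, 3))  # 0.657
print(float(p_without))
```

The with-replacement reading matches the lecture's number most closely, which is why 0.665 (rather than 0.708) appears in the calculation below.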
Deriving the Value of a Statistical Life from Game Results
Collecting and Organizing the Data
Once the game concludes, the instructor compiles the results by recording the market clearing wage for each of the three rounds. The labor supply is revealed through the number of workers willing to work at each wage level in each round. By plotting these points, an upward-sloping labor supply curve emerges, satisfying the law of supply. The data shows that workers required increasing compensation as the risk increased, consistent with economic theory.
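The clearing-wage mechanism described above can be sketched as a simple rule: sort the bids and take the bid of the marginal hired worker. The function name and the example bids are hypothetical, chosen to mirror the typical Round 1 result described earlier (five workers bidding two dollars, four slots to fill).

```python
def clearing_wage(bids, slots=4):
    """Market clearing wage when the captain hires `slots` workers:
    sort bids ascending and take the bid of the marginal worker."""
    if len(bids) < slots:
        raise ValueError("not enough workers to fill all slots")
    return sorted(bids)[slots - 1]

# Hypothetical Round 1 bids in dollars: five workers bid $2, others higher.
bids = [2, 2, 2, 2, 2, 3, 4, 5]
print(clearing_wage(bids))  # 2
```

Plotting the number of workers willing to work at each wage level, round by round, traces out the upward-sloping labor supply curve the instructor compiles.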
The Two-Step Calculation Process
To derive the value of a statistical life from the game results, we employ a two-step process. First, we calculate the wage increment, or risk premium, by taking the difference between the clearing wage in the riskiest area and the clearing wage in the risk-free area. In a typical year, Area 1 cleared at a wage of two dollars, while Area 3 cleared at a wage of three dollars and twenty-two cents. The increment is therefore three dollars and twenty-two cents minus two dollars, which equals one dollar and twenty-two cents.
Second, we combine this wage increment with the actual risk metrics. In Area 3, the probability of death is approximately 66.5 percent, or 0.665 as a decimal. We calculate the inverse of the risk by dividing one by 0.665, obtaining approximately 1.504. We then multiply the wage increment by the inverse of the risk: one dollar and twenty-two cents times 1.504 equals approximately one dollar and eighty-three cents. This represents the value per person. However, the value of a statistical life is expressed as the value of one death across a population. We therefore multiply this per-person value by approximately one hundred to obtain the total value of a statistical life in the relevant units.
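The two-step calculation above can be written out directly, using the wage and risk figures quoted in the text:

```python
# Two-step VSL calculation from the classroom game numbers.
wage_safe  = 2.00    # Area 1 clearing wage (no risk)
wage_risky = 3.22    # Area 3 clearing wage
risk       = 0.665   # probability of death in Area 3

# Step 1: the risk premium is the wage increment between areas.
risk_premium = wage_risky - wage_safe            # 1.22

# Step 2: scale the premium by the inverse of the risk.
value_per_person = risk_premium * (1 / risk)     # ≈ 1.83

# Population scaling used in class: value of one death across ~100 people.
vsl = value_per_person * 100

print(round(risk_premium, 2))      # 1.22
print(round(value_per_person, 2))  # 1.83
print(round(vsl))                  # ≈ 183 (in class-point units)
```

Dividing the premium by the risk is equivalent to asking what workers would demand to face the risk of death with certainty, which is the logic behind the "statistical" life.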
Historical Results and Consistency
The current and previous instructors have maintained meticulous records of this classroom game for several years, including data from 2025, 2024, 2023, 2022, 2021, and earlier. Despite variations in class size and student composition, the results consistently show the same patterns. When people play this game using relatively low stakes, they systematically require more compensation to accept higher levels of risk. The derived values of a statistical life from these classroom experiments fall within ranges that are plausible given labor market research. The consistency across years and classes suggests that the patterns observed reflect genuine economic behavior rather than idiosyncratic results.
The Hedonic Pricing Framework
This entire exercise illustrates the powerful technique of hedonic pricing, or hedonics. Hedonics is a method for extracting implicit prices from market data by examining how the market price varies with different characteristics or attributes. In the context of housing, hedonics might examine how the price of a house varies with access to fishing grounds or a view of a beautiful park. In the context of labor markets and mortality risk, hedonics examines how wages vary with the mortality risk associated with different jobs. Whether examining houses or risky jobs, hedonics provides a powerful methodology for eliciting these implicit prices. This technique has become extensively used throughout the economic literature on valuation.
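In practice, hedonic wage studies estimate the implicit price of risk as the slope of a regression of wages on mortality risk. A minimal sketch of that idea, using assumed per-round death probabilities for the three game areas (zero for Area 1, 1 − 0.9³ ≈ 0.271 for Area 2, and 0.665 for Area 3) and the clearing wages from the text; a real study would control for many other job characteristics:

```python
import numpy as np

# Hypothetical hedonic wage regression: wage_i = a + b * risk_i.
# The slope b is the implicit price of mortality risk.
risk = np.array([0.0, 0.271, 0.665])   # assumed death probability per round
wage = np.array([2.00, 2.20, 3.22])    # clearing wages from the game

b, a = np.polyfit(risk, wage, 1)       # least-squares line (slope first)
print(round(b, 2))                     # ≈ 1.89, the wage per unit of risk
```

For these assumed points the slope comes out close to the per-person value derived in the two-step calculation, which is reassuring: the regression and the arithmetic shortcut are extracting the same implicit price.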
Policy Applications: Air Quality and Valuation
The Epidemiological Foundation
One of the most significant real-world applications of VSL estimation concerns air quality policy. The Environmental Protection Agency identifies key pollutants that require monitoring and regulation, with particulate matter smaller than 2.5 microns (PM2.5) being among the most important. PM2.5 comes from various sources but primarily originates from emissions produced by burning coal. This particulate matter enters the lungs and causes premature mortality. One of the seminal articles in this field compiled epidemiological evidence about the relationship between PM2.5 concentrations and health outcomes, then multiplied these health effects by VSL estimates to monetize the policy benefits.
Calculating Avoided Mortality and Benefits
The epidemiological research conducted a literature review to identify evidence on the number of deaths per one hundred thousand people at different PM2.5 concentrations, with concentration targets of thirteen, twelve, and eleven micrograms per cubic meter considered. The research calculated the “avoided mortality” at each concentration level, meaning the number of lives that would be saved by reducing pollution to that concentration compared to a higher baseline. For adult mortality, reducing PM2.5 to thirteen micrograms per cubic meter avoided one hundred forty deaths per one hundred thousand people. Reducing it further to twelve micrograms per cubic meter avoided four hundred sixty deaths per one hundred thousand. Reducing it even further to eleven micrograms per cubic meter avoided fifteen hundred deaths per one hundred thousand.
Beyond Mortality: Measuring Morbidity Benefits
While the mortality benefits form the core of the analysis, the research also examined a range of non-fatal health outcomes. Infant mortality was examined, though the numbers were substantially smaller than adult mortality. Beyond mortality considerations, the research quantified benefits including avoided non-fatal heart attacks, hospital admissions for various respiratory and cardiac conditions, emergency room visits for asthma and other respiratory conditions, and lost work days due to illness. These non-fatal outcomes represent genuine health improvements that people care about. The research recognized that while these outcomes do not result in death, they nevertheless represent significant harms that deserve to be valued in a policy analysis.
Personal Experience and the Severity of Air Pollution
The impact of air pollution on health and quality of life becomes concrete when considering acute asthma responses to pollution. Many people have experienced the terrifying sensation of being unable to breathe during exposure to heavy pollution or air quality events. In one such instance, a traveler in Africa encountered the local practice of burning garbage at a designated time after the workday ends. The burning occurred all at once and created a dense haze of pollution. Exposure to this pollution triggered hyperventilation and a struggle to obtain sufficient air, a genuinely frightening experience. Though the person was ultimately fine, the episode illustrates why people care deeply about air quality and why non-fatal impacts deserve consideration in policy analysis.
Monetizing All Health Benefits
Once the epidemiological relationships between PM2.5 and health outcomes are established, researchers can monetize all these benefits by applying appropriate valuations. For mortality outcomes, they multiply the number of avoided deaths by the VSL. For non-fatal outcomes like heart attacks, hospital admissions, emergency room visits, and lost work days, they apply unit values derived from other sources or studies to monetize these health improvements. The result is a comprehensive picture of the total health benefits at each pollution reduction level, expressed in monetary terms. This allows for comparison with the costs of pollution abatement, following the cost-benefit framework discussed in earlier lectures.
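The mortality side of this monetization can be sketched directly from the avoided-death figures quoted earlier. The VSL of ten million dollars below is an assumed illustrative value, not a figure from the lecture:

```python
# Monetizing avoided mortality: avoided deaths per 100,000 people at each
# PM2.5 target (figures from the lecture), times an assumed VSL.
VSL = 10_000_000  # assumed illustrative value in dollars

# {PM2.5 target in µg/m³: avoided adult deaths per 100,000 people}
avoided_deaths = {13: 140, 12: 460, 11: 1500}

benefits = {conc: deaths * VSL for conc, deaths in avoided_deaths.items()}
for conc, value in sorted(benefits.items(), reverse=True):
    print(f"{conc} ug/m3: ${value:,.0f} per 100,000 people")
```

Non-fatal outcomes would be added in the same way, each multiplied by its own unit value, before comparing total benefits against abatement costs.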
Extending VSL Beyond Human Lives
The Value of a Statistical Dog Life
The valuation methodology is not limited to human lives. One creative research paper applied contingent valuation methodology to estimate the value of a statistical dog life. Using surveys where respondents were asked their willingness to pay to reduce risks to dogs, researchers found that people value a statistical dog life at approximately ten thousand dollars. This application demonstrates that people care about the welfare of other beings and will express this care through monetary valuations.
Why Hedonic Analysis Cannot Be Applied to Dogs
When considering why hedonic pricing cannot be applied to estimate the value of a dog’s life, the key issue becomes apparent. Hedonic analysis relies on observing market choices and trade-offs. Dogs do not participate in labor markets. They do not make choices about whether to accept risky employment in exchange for compensation. Researchers cannot observe dogs’ revealed preferences through their market behavior because dogs do not engage in the types of economic transactions that generate the necessary data. Consequently, contingent valuation, which directly asks people about their valuations, becomes the only practical methodology for estimating the value of a statistical dog life. This limitation highlights that hedonic analysis, while powerful when applicable, requires observable market transactions involving the entity being valued.
Critical Examination of VSL Methodology: Strengths and Limitations
The Virtue of Observed Market Behavior
The fundamental strength of hedonic valuation of a statistical life is that it is grounded in observed behavior in actual markets. Workers make real decisions in the marketplace, accepting or rejecting jobs based on the wages offered and the risks involved. These actual choices reveal people’s genuine preferences regarding mortality risk in ways that hypothetical questions cannot replicate. The wage-risk trade-offs observed in actual labor markets provide a foundation for VSL estimates that reflects genuine economic decision-making.
The Problem of Perfect Information
Despite the virtues of labor market analysis, significant limitations require careful consideration. The first critical assumption is that workers have perfect information regarding the risks they face. In reality, workers often systematically underestimate risk, particularly when risks are relatively small. Behavioral evidence strongly suggests this assumption is frequently violated. Workers may not have accurate information about occupational mortality rates, may not understand or internalize small probability risks, or may base their risk perceptions on biased information or cognitive errors. If workers lack accurate risk information when making their job choices, the resulting wage-risk trade-offs do not reflect informed decisions and may not provide reliable valuations.
The Problem of Non-Representative Risk Preferences
A second significant issue concerns whether the workers who accept risky jobs have risk preferences representative of the broader population. Consider workers on off-shore oil derricks or cowboys engaged in inherently dangerous occupations. These professions often develop a particular culture that celebrates danger and risk-taking, potentially attracting individuals with unusual risk preferences who are more willing to accept danger for lower compensation than the general population. If the workers willing to take risky jobs are systematically different in their risk preferences, using their wage-risk trade-offs to estimate values for the general population may produce misleading estimates. A revealed preference in a self-selected group of risk lovers may not generalize to the population as a whole.
The Issue of Uncompensated Externalities
When a worker accepts a risky job, that worker bears the personal cost of potentially dying. However, the costs of a death extend far beyond the individual. The family members and loved ones of the deceased suffer significant emotional and financial harm. The employer and coworkers lose a productive member of the workforce. Society loses whatever contributions the deceased would have made. These external costs are not borne by the worker and therefore do not factor into their wage-risk trade-off. If workers only consider their private cost and benefit when deciding whether to accept a risky job, their wage demands will not capture the full social cost of the risk, leading to an underestimate of the true value from a societal perspective.
The Age Question: Differential Valuation Across Lifecourse
Perhaps the most contentious issue in VSL estimation concerns whether the value of a statistical life should vary by age. Should the death of an eighteen-year-old be valued differently from the death of a ninety-year-old? This question involves both economic logic and moral intuitions. From an economic standpoint, there is an argument that younger people have more remaining life years ahead of them, so avoiding their death preserves more years of life. However, this calculation raises profound questions about whether it is fair or moral to place different monetary values on different people’s lives based on age. When researchers have surveyed the general public about this question, responses were roughly evenly divided, with some people firmly believing that all people should be valued equally and others endorsing differential valuations based on remaining life expectancy.
The Political Controversy Surrounding Differential Valuation
Several governments have attempted to implement policies that value lives differently based on age, and the political response has been severe. Citizens and advocacy groups argued vehemently that all people are created equal and that their lives should not be assigned different monetary values based on age. The controversy demonstrates that this is not merely an academic or technical question but a deeply political one involving fundamental questions about equality and human dignity. Public opposition to age-differentiated valuations has been substantial enough to derail or modify several policy initiatives.
Quality-Adjusted Life Years as a Solution
Within the economics profession, the quality-adjusted life year (QALY) has emerged as a preferred framework for thinking about the age question while respecting the value of all lives. QALYs recognize that when assessing mortality risk reduction, we should consider two distinct variables. First is the timing of death: how long does the person live? Second is the quality of life during those years: what is the person’s health status and functional capacity throughout their lifespan? The benefit is conceptualized as the area under the curve of quality of life over the person’s remaining lifespan, somewhat analogous to consumer surplus but applied to health outcomes.
QALYs allow for comparison of different health interventions with different impacts on longevity and quality of life. Someone might die early but experience good health for a period before a sudden decline due to exposure to a hazardous substance like mercury. Their quality-adjusted life years would be calculated as the area under that curve. Someone else might live longer but experience declining health throughout their lifespan. The two individuals might have different total QALYs even if one lived longer in years. A policy that increases safety by reducing the probability of death will appear as an increase in quality-adjusted life years simply because the person shifts from early death to later death, extending the period of life. Alternatively, an intervention like improving indoor air quality might increase quality-adjusted life years even without extending life span, because it improves the quality of the years the person does live. QALYs thus provide a framework that values both mortality and morbidity improvements consistently and avoids the moral objection to assigning different values to different people based solely on age.
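The area-under-the-curve idea can be made concrete with a small numerical sketch. The ages and quality scores below are assumed for illustration (quality is on a 0-to-1 scale, with 1 representing perfect health), and the area is computed with the trapezoidal rule:

```python
def qalys(ages, quality):
    """QALYs as the area under a quality-of-life curve,
    computed with the trapezoidal rule between sample points."""
    total = 0.0
    for i in range(1, len(ages)):
        total += (ages[i] - ages[i - 1]) * (quality[i] + quality[i - 1]) / 2
    return total

# Person A (assumed): good health, then a sudden decline and death at 60.
a = qalys([20, 55, 60], [0.9, 0.9, 0.0])   # 31.5 + 2.25 = 33.75 QALYs

# Person B (assumed): lives to 80, but health declines steadily throughout.
b = qalys([20, 80], [0.8, 0.0])            # 60 * 0.4 = 24.0 QALYs

print(round(a, 2), round(b, 2))  # 33.75 24.0
```

Person A accumulates more QALYs despite dying twenty years earlier, illustrating the point above that total QALYs can diverge from years lived.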
Historical Perspectives on Risk and Occupational Hazards
Mercury and the Mad Hatters
Historical evidence provides powerful illustrations of the harsh consequences of occupational exposure to toxic substances. In earlier centuries, hat makers suffered from chronic mercury poisoning. Mercury was used in the process of blocking hats to achieve the desired shape and finish. Workers in this industry were routinely exposed to mercury dust and vapor without protection or awareness of the dangers. The mercury accumulated in their bodies and caused severe neurological damage, leading to the expression “mad as a hatter” to describe the symptoms of mercury poisoning: erratic behavior, violent mood swings, cognitive decline, and other mental symptoms.
Chimney Sweeps: The Ultimate Occupational Hazard
Perhaps the most horrifying historical example concerns chimney sweeps in Victorian England. Because of the narrow interior dimensions of chimneys, the most efficient workers for this task were children. Homeless and poor children were employed as chimney sweeps, forced to climb inside narrow chimneys caked with coal soot to scrape out the accumulated deposits. The expected lifespan of a chimney sweep child was seven years. These children worked in conditions of constant exposure to coal dust, unable to breathe properly, spending days cleaning multiple chimneys. The combination of injuries from climbing in confined spaces, respiratory damage from coal dust exposure, and poor living conditions meant that these children rarely survived to adulthood. This represents perhaps the most extreme example of workers accepting extraordinarily dangerous and unhealthy conditions because of overwhelming poverty and lack of alternatives. The moral dimensions of occupational risk and valuation become starkly clear when considering how desperately poor children were exploited in dangerous conditions with minimal compensation and anticipated early death.
Conclusion: Summary of Key Concepts
The value of a statistical life represents a powerful tool for economic policy analysis and decision-making. Derived primarily from labor market analysis of wage-risk trade-offs, VSL estimates allow policymakers to quantify the benefits of policies that reduce mortality risk in monetary terms. The classroom abalone fishing game demonstrates the core concept: workers require higher compensation to accept greater risks, and this wage differential combined with risk probability yields a value for a statistical life. These estimates have found extensive application in air quality policy and other regulatory contexts where mortality risk reduction is a central consideration. However, significant limitations and ethical concerns surround the VSL methodology. Assumptions about perfect information, representative risk preferences, uncompensated externalities, and appropriate age-differentiation all deserve careful scrutiny. Quality-adjusted life years provides a framework that addresses several of these concerns by incorporating both mortality and morbidity into a comprehensive measure of health improvements. Understanding both the strengths and limitations of VSL represents an essential component of competence in environmental economics and policy analysis.