Climate Change Economics: From Science to Policy
This lecture marks a significant pivot in the course, shifting from the foundational microeconomic and macroeconomic theory covered in the first half to a topics-focused approach centered on climate change, ecosystem services, and natural capital. Up to this point, the course has concentrated on the basics of environmental and natural resource economics, including supply and demand models, cost-benefit analysis, and discounting. Now the focus shifts to applying those tools to specific, real-world topics. This also means that the style of assessment will change, becoming less focused on micro-level mathematics and more oriented toward having students answer questions with informed stances. This is a natural progression from detailed tools to more advanced understanding.
Course Logistics and the Final Project
Weekly Questions and Country Selection
A new weekly question has been posted, due on Friday of the current week. This question asks students to choose which country they will focus on for their final project, which is part of the shift toward more applied work. The final project will also earn students their writing credit required for graduation.
Students will indicate which countries from a provided list they are most interested in learning about. The instructor will use these responses to manually assign countries, ensuring full coverage across the class. Students should list their favorite few countries and write the most compelling argument for why they want to study those countries, as the most persuasive arguments will receive priority in assignment.
Assignment and Micro-Quiz Schedule Changes
An assignment will be posted on Canvas, due Monday of the following week, serving as preparation for the third micro-quiz. The schedule has been adjusted from previous iterations. Previously, the assignment was due right before the micro-quiz, which prevented the instructor from providing an answer sheet in advance. Under the new schedule, the assignment is due on Monday, the answer sheet is posted on Tuesday so students can study what they did or did not get right, and then students have Wednesday and Thursday to prepare for the micro-quiz on Friday. The micro-quiz will still be the same format: a very short assessment, probably one question, drawn very similarly from the assignment. This assignment and micro-quiz will focus on fisheries and/or discounting and net present value.
Final Project Overview
The final project is a significant part of the grade. Students will choose one country from a provided list and create an Earth Economy country report. This report will pull together information specific to the chosen country, including coverage of climate change — such as how exposed the country is, what its risks and opportunities are — as well as ecosystem services and natural capital present in that country.
The report will also incorporate some of the microeconomic tools learned in the first half of the course. The question students will address is what the basic tools in environment and natural resource economics say about their country and the incentives or policies that it is considering.
The project will use cutting-edge tools, including spatial data analyzed using geographic information systems (GIS). The course will start from no assumed knowledge about GIS, and students will come out of the project with basic GIS skills, which may be a marketable job skill. The instructor plans to switch to a classroom with table-style seating, as the project will involve hands-on exercises on computers with the instructor walking around to assist.
The Country List
The list of available countries is biased toward Central and South America. This focus was chosen for several reasons, the most important being that this is one of the areas that will be most critical for preserving nature. There is not a lot of high-value nature remaining in Europe or the United States compared to these locations. The instructor considered focusing on Africa as an alternative but chose Central and South America because of better data coverage.
The countries on the list range in size from the smallest, Jamaica, to the largest, Brazil. Because the project will involve computing actual models of climate change and ecosystem services, students may want to consider choosing a country that is not super large, unless they are confident in their computer skills, since larger countries require more computation.
The Basic Science of Climate Change
The Energy Budget of the Earth
The underlying dynamic of climate change is illustrated by the Earth's energy budget, which is fundamentally a simple physics question. Energy comes in from the sun, with approximately 174,000 terawatts arriving at the top of the atmosphere. Just as a space heater draws 1,500 watts or a laptop draws 140 watts, this is a basic flow of energy, and the energy budget illustrates what happens to it.
The budget describes equilibrium conditions. Some energy is reflected by the atmosphere. Of the 174,000 terawatts coming in from space, roughly 10,000 terawatts bounce off the atmosphere, and another 35,000 terawatts bounce off clouds. The remainder makes it into the atmosphere and reaches the surface.
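The simple accounting above can be sketched in a few lines. The incoming and reflected figures are the lecture's approximate numbers; the remainder labeled "absorbed" is simply what is left over, not a measured breakdown.

```python
# Toy accounting of the Earth's energy budget, using the approximate
# figures from the lecture (all values in terawatts, TW).
incoming = 174_000              # solar energy arriving at top of atmosphere
reflected_by_atmosphere = 10_000
reflected_by_clouds = 35_000

reflected = reflected_by_atmosphere + reflected_by_clouds
absorbed = incoming - reflected  # remainder enters the Earth system

print(f"Reflected: {reflected} TW ({reflected / incoming:.0%})")
print(f"Absorbed:  {absorbed} TW ({absorbed / incoming:.0%})")
# In equilibrium the absorbed energy must eventually be re-radiated as
# infrared heat; greenhouse gases intercept part of that outgoing flow.
```

Roughly a quarter of the incoming energy is reflected; the rest must be absorbed and re-radiated, which is where the greenhouse effect enters.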
Reflection Versus Radiation
An important distinction in understanding climate change is the difference between reflecting and radiating energy. Although this sounds like a small linguistic difference, it is very different physically. Reflected energy behaves like a mirror: it comes in as light and leaves almost exactly the same, without being changed or absorbed. The reflected portions are the easy part — the energy simply gets sent back into space.
However, the majority of the problem is that not all energy is reflected. Much of it is absorbed by different parts of the Earth system, and in the process of being absorbed it is transformed from the energy in photons of light into infrared heat. Some of that absorbed energy is eventually re-radiated out to space, but unlike reflected energy, which bounces back out essentially unchanged, absorbed energy leaves a residual of heat behind.
Some of this energy is absorbed by the atmosphere, some by land and oceans. Of everything that is absorbed, some fraction is re-radiated out to space as heat rather than being perfectly mirrored. The fundamental problem is that some of the radiation that would have escaped to space is reabsorbed by the atmospheric layer. This protective layer, which shields the Earth from various hazards, captures outgoing infrared radiation and sends it back down to the surface. Understanding all of these different effects could fill a lifetime of scientific work.
Carbon Dioxide as the Central Molecule
The central molecule in this story is carbon dioxide (CO₂). A CO₂ molecule consists of one carbon atom and two oxygen atoms. We know this molecule very well because we are constantly breathing it out from our lungs. Carbon dioxide is exceptionally good at absorbing outgoing infrared radiation and re-radiating it back toward the Earth’s surface, which is why increasing its concentration in the atmosphere leads to warming.
The Evidence for Human-Caused Climate Change
Setting Aside Climate Skepticism on the Science
The lecture quickly addresses the arguments about climate change and the basis of evidence for it being human-caused, so that the class can set aside the scientific debate and focus on economics. The class is focused on what science says, and there is very little science supporting the climate skeptic position on whether climate change is happening.
However, the class will engage meaningfully with climate economics. Although there is very little debate about the basic facts — such as whether carbon dioxide concentrations have increased in the last hundred years — there is a great deal of debate about what we should do about it. This is where economics becomes valuable. The class will explore the full spectrum from the very conservative answer of “not a big problem, don’t worry about it” to much more progressive answers of “we need to act quickly, swiftly, and immediately.”
The focus will not be on skepticism that says climate change is not happening, but rather on using economics tools to analyze the cost-benefit analysis of climate mitigation. Economists are in a perfect situation to apply cost-benefit analysis, and different approaches within that method will illuminate the very conservative and very progressive political policies on climate change. The class is being inclusive across the political spectrum on the economics, just not on the basic science.
The 400,000-Year Record of CO₂ Concentrations
Carbon dioxide concentrations are measured in parts per million (PPM). Looking at the time series of CO₂ concentrations going back 400,000 years, which corresponds to how deep scientists can drill into ice sheets to observe tiny bubbles of air that were trapped there, we see all sorts of natural variation including ice ages. The climate skeptic point that climate has changed in the past is undeniably true. However, the question is simply whether the concentrations we see now are putting us in territory where we feel comfortable. The historical record shows that CO₂ concentrations swing around over time, but the current levels are going into wholly new and unprecedented territory.
The Industrial Revolution and the Great Acceleration
Zooming in to the period from roughly the year 0 to 2000 CE, the carbon dioxide record shows no real change, just a stable level, until a sharp rise beginning shortly before 1800. That rise coincides with the Industrial Revolution. Prior to it, all the energy humanity relied upon came from the sun directly, captured by plants. People could grow crops, have animals eat plants, and derive energy from those processes, but these were all relatively small sources of energy.
The Industrial Revolution introduced a wholly new source of energy that had never been significantly used before. Instead of relying on things coming from the sun in the short term, like wood, humanity began digging into the earth and extracting stored solar energy — plants and other organic matter that had been transformed over geological time into coal. This suddenly exploded the amount of energy available to humankind.
The rise begins with the Industrial Revolution, and energy usage becomes increasingly intensive over time. Then, in the middle of the twentieth century, essentially right after World War II, things accelerate dramatically. This is the great acceleration, which the course discussed previously. All sorts of indicators — per capita consumption, miles driven by cars, electricity usage — go exponential. This great acceleration is what drives the drastically higher concentration of CO₂ in the atmosphere.
The Mauna Loa Observatory and Direct Measurement
The era from 1960 onwards represents the modern science era, during which we have detailed records of a very simple experiment: capturing some of the atmosphere at a specific location and counting how many molecules of CO₂ are present. The Mauna Loa Observatory in Hawaii has the longest time series, though this measurement is now done at many locations around the world.
The data show two key features. First, there is an annual oscillation, representing the fact that the Earth breathes. There is much more vegetation in the Northern Hemisphere than the Southern Hemisphere, so photosynthetic activity during the northern summer causes CO₂ levels to cycle seasonally. This is predictable and should not distract from the overall trend. Second, the overall trend shows concentrations reaching unprecedented territory at approximately 420 parts per million.
This measurement is extraordinarily simple to replicate. A third-grade student could do it. The experiment requires only the ability to read numbers off of instruments and approximately $35 to buy a carbon sensor. Since the COVID pandemic, carbon sensors have become much cheaper because people have been using them to assess their exposure risk to COVID, reasoning that if CO₂ concentrations go up in a room, there are probably many humans nearby exhaling CO₂ and potentially pathogens. All one has to do is take a measurement one year, wait for the annual cycle to complete, and measure again the next year. Every single year, concentrations will be higher.
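The year-over-year logic of that home experiment can be made concrete with a small synthetic model of the Mauna Loa pattern: a rising trend plus an annual oscillation. The numbers here are illustrative, not real observatory data.

```python
# Synthetic sketch of the Mauna Loa pattern: a rising trend plus the
# annual "breathing" oscillation. Coefficients are illustrative only.
import math

def co2_ppm(year):
    """Illustrative CO2 concentration (ppm): trend plus seasonal cycle."""
    trend = 315 + 1.6 * (year - 1960)            # rough long-run growth
    season = 3.0 * math.sin(2 * math.pi * year)  # annual oscillation
    return trend + season

# Measuring at the same point in the annual cycle, one year apart,
# cancels the oscillation and isolates the underlying rise:
print(co2_ppm(2023.5) - co2_ppm(2022.5))
```

This is exactly why waiting a full year between measurements matters: sampling at the same phase of the seasonal cycle removes the oscillation, leaving only the trend.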
Temperature Reconstruction and the Modern Record
With more CO₂ in the atmosphere, the problem is that CO₂ is very effective at bouncing radiation back down toward the Earth’s surface. Looking at temperature reconstructed back to 100,000 years ago reveals a similar pattern of natural variation followed by an unprecedented spike in modern temperatures.
This is where skepticism becomes harder to dismiss with simple experiments. Measuring temperature, especially backwards into the past, requires more scientific understanding, including accounting for factors like sunspots and other natural variability. Nevertheless, it remains quite straightforward that we are in unprecedented territory.
Attribution Through Climate Models
Climate models are critical for understanding the human contribution to warming. The concept of models is familiar from the course, having encountered the circular flow diagram, the supply and demand model, and many other models. Models are toy representations of reality that allow us to ask “what if?” questions about what would have happened if something had been different.
Climate models allow for reconstruction into the past. Scientists simulate what happened, try to figure out what temperatures were, and match those reconstructions against what we do know from direct evidence, such as the concentrations of chemicals obtained from ice cores.
Once a model that understands the past is in hand, it can be used to ask a different type of “what if?” question about attribution. There is a model of temperature with humans — representing what was actually observed, since we are here producing emissions. Separately, a model of temperature without humans can be run, simulating only the natural forces that have changed over time.
This is important because the extent of the gap between these two projections answers the question of whether climate change is anthropogenic. When specified with this precision, we can even ask about the probability of being wrong. A 95 percent confidence interval is computed for each projection, and as soon as the projections diverge enough that the confidence intervals no longer overlap, we can say with that degree of certainty that humans caused the change. And they have diverged; it is us.
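The overlap test behind that attribution logic is simple enough to sketch. The temperature bands below are invented for illustration; real attribution studies derive them from ensembles of climate model runs.

```python
# Hedged illustration of the attribution logic: a "natural forcings only"
# band versus a "with humans" band, each a 95% confidence interval, and a
# check for when the bands stop overlapping. All numbers are made up.
def bands_overlap(lo_a, hi_a, lo_b, hi_b):
    """True if interval [lo_a, hi_a] overlaps interval [lo_b, hi_b]."""
    return lo_a <= hi_b and lo_b <= hi_a

# (year, natural-only band, with-humans band), degrees C anomaly
scenarios = [
    (1950, (-0.2, 0.2), (-0.1, 0.3)),  # bands overlap: cannot attribute
    (1990, (-0.2, 0.2), (0.1, 0.5)),   # still a slight overlap
    (2020, (-0.2, 0.2), (0.6, 1.0)),   # no overlap: attributable at 95%
]
for year, natural, human in scenarios:
    attributable = not bands_overlap(*natural, *human)
    print(year, "attributable to humans:", attributable)
```

Once the with-humans band sits entirely above the natural-only band, the gap itself is the evidence of anthropogenic warming.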
A Long-Known Problem
The scientific understanding of this phenomenon has existed for a very long time. One of the earliest references to climate change appears in Popular Mechanics magazine from March 1912. The article discusses the tonnage of carbon dioxide being produced from burning 2 billion tons of coal per year and notes that the effects may be considerable in a few centuries. They had the basics right, and their prediction was correct.
Temperature Trends and Recent Extremes
Looking more closely at the recent record, global surface temperatures show the same rising pattern. These temperatures are expressed relative to the 1880 to 1920 mean, which is important because the baseline already includes some of the Industrial Revolution. The deviations would appear even larger if the comparison were to pre-Industrial Revolution temperatures, but direct observational records (literally from thermometers) do not extend back that far. The data could be reconstructed further back, but this record represents direct instrument-based observations.
The record shows variation, including a real flattening period often pointed to by climate skeptics, who note that there was once concern about global cooling; that concern dates to this flat era. But that is old news. Since then, there has been an inexorable march upward in temperatures.
Scientists have excellent explanations for what caused the flattening. The discussion does not dive into those explanations because the important point is that warming is happening and may even be accelerating, especially in recent years. The weather in the last few years has been extreme, more extreme than scientists were expecting. Even the most pessimistic climate scientists have been surprised, because temperatures have exceeded their projections.
One cannot draw a conclusion of great statistical certainty from only five or six years of above-trend temperature increases, but there comes a point where concern is warranted. At some point it becomes statistically relevant that the trajectory may not be a linear fit but perhaps something much scarier, like a tipping point.
Climate Futures and Representative Concentration Pathways
Projecting Forward with Scenarios
The question from an economics perspective does not depend much on where temperatures are now, but on where they are going. What will temperatures rise to if we continue on the current path?
Climate models are now used not to project backwards for validation, but to run forward. The course will increasingly use scenarios to explore the space of different possible futures. When projecting backwards, there is only one line — something specific happened, and while there may be uncertainty about exact values, there is no uncertainty about the specific series of events that occurred. But the future is different. It has not happened yet. Multiple options of what might happen must be considered, and these are called scenarios.
The RCP Framework
For climate, the scenarios are called Representative Concentration Pathways, abbreviated as RCPs. A Representative Concentration Pathway is essentially a set of assumptions about what humanity will do in the future with respect to emissions. The numbers associated with each RCP are the key indicators.
RCP 1.9 is the most positive frequently used scenario, representing little climate change. It represents a world where all Paris Agreement commitments are immediately implemented and warming is kept below 2 degrees Celsius. Running models under this scenario shows how beneficial such an outcome would be.
There are a range of scenarios between the best and worst possible outcomes. RCP 8.5 is the worst frequently used scenario, representing catastrophic climate change. RCP 8.5 was defined a considerable number of years ago, and there is an encouraging note: twenty years ago, this scenario was a very plausible possibility. However, humanity has taken some positive steps — many agreements have been signed, solar power has become much cheaper — and it is now becoming consensus that unless something unexpected happens, such as crossing a tipping point that causes methane to begin pumping out of permafrost, it is very unlikely that RCP 8.5 will be reached.
This is genuinely good news. There is even an argument that using RCP 8.5 represents a form of climate alarmism. The more appropriate interpretation is that humanity should be commended for having slowed down emissions growth. Nevertheless, the full spectrum of climate futures will be used to analyze the costs and benefits of different policy options, from aggressive mitigation now to essentially ignoring the problem.
The Human Costs of Climate Change
Uninhabitable Zones and Migration
Much of the analysis of climate change costs comes down to understanding the impacts. Under middle-of-the-road RCPs, large areas of the planet become simply uninhabitable due to extreme heat. This causes knock-on effects beyond direct mortality, including massive migration, which is a politically sensitive issue.
Wet Bulb Temperature and Physiological Limits
One of the most important and undeniable limits that climate change may impose comes from the question of how hot conditions can get before the human body can no longer physically survive. Research by Matt Huber and colleagues, published in 2010, examined the physiological response of human bodies to extreme heat. Although global average temperatures may not reach the most extreme projections, in some locations warming could exceed 7 degrees Celsius.
The research introduces the concept of wet bulb temperature, which is measured by wrapping a thermometer in a water-soaked cloth so that evaporation cools it, mimicking the body's cooling mechanism of sweating. Wet bulb temperature is the best indicator of whether a human can survive in given conditions.
Above a certain wet bulb temperature, no amount of personal mitigation or self-protection short of actively powered air conditioning would work. At those temperatures, even if a person were standing in gale force winds, completely doused with water, wearing no clothing to maximize heat dissipation, and, critically, not working at all, standing as still as possible to minimize heart rate, the body would still reach lethal temperatures. There is a definite, known point at which the human body physically stops functioning.
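Wet bulb temperature can be estimated from ordinary temperature and humidity readings. The sketch below uses Stull's (2011) empirical approximation, which is one commonly cited formula for surface conditions; it is a meteorological approximation, not the physiological model from the research discussed above.

```python
# Approximate wet bulb temperature from air temperature (deg C) and
# relative humidity (%), using Stull's (2011) empirical formula.
import math

def wet_bulb_stull(t_c, rh_pct):
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# A 35 C day at 90% relative humidity already pushes the wet bulb
# temperature toward the limit of what the human body can shed:
print(round(wet_bulb_stull(35, 90), 1))
```

The point of the calculation is that dangerous wet bulb readings arise from combinations of heat and humidity that, taken separately, sound survivable.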
A Fictional but Grounded Depiction
Kim Stanley Robinson’s novel The Ministry for the Future opens with a vivid depiction of what happens when a heat wave passes the wet bulb maximum temperature that the human body can tolerate. Set in India, the opening describes a cascading power failure where the entire electrical grid goes down, eliminating air conditioning, and resulting in a horrific catastrophe. The novel uses this catastrophe as a galvanizing force for societies to come together to try to solve climate change. While devastating in its opening, the book takes a somewhat positive trajectory from that point.
Economic Solutions to Climate Change
The Carbon Tax as Consensus Solution
There is essentially consensus across the political spectrum of economists that the best solution to climate change would be to implement a tax on carbon. Even conservative economists agree on this point. The basic mechanics are straightforward: put a tax in place that makes the marginal costs to society, including all climate damages, equal to the marginal costs faced by the producers of carbon-emitting technologies. This would make carbon-intensive energy sources like coal much more expensive.
The problem is that there does not seem to be sufficient political willpower to implement such a tax. In a certain sense, there is not much to debate — the optimal solution is known. However, underneath this basic answer lies enormous complexity about specifics.
The Discount Rate and the Value of the Future
One of the most critical factors in climate economics is the discount rate, which governs how much we care about the future versus the present. This concept was already explored earlier in the course in the context of fisheries and was shown to change optimal decisions dramatically for any resource problem involving time.
The impact of discounting is especially dramatic when applied to climate change. The basic idea is that a changing interest rate and a changing view of how far into the future one looks will drastically alter how much future values are worth today.
A stark example: if one stood to gain $1,000 of value, delivered as a check in 100 years, and the applicable discount rate is 10 percent (which is close to the market discount rate for investments), that future payment is worth only 7 cents today. One should pay no more than 7 cents for such a contract.
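The arithmetic behind that example is a one-line present value calculation, sketched here:

```python
# Present value of a future payment under exponential discounting,
# a sketch of the calculation behind the lecture's example.
def present_value(future_value, rate, years):
    return future_value / (1 + rate) ** years

# $1,000 delivered in 100 years, discounted at 10% per year:
pv = present_value(1_000, 0.10, 100)
print(f"${pv:.2f}")  # about seven cents
```

Raising a sensible-sounding annual rate to the hundredth power is what crushes distant values to nearly nothing.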
In the climate context, this means that almost any massively beneficial policy that might reduce climate damages in the future is going to be worth very little today. This simply reflects the fact that the benefits of climate action will be felt by future generations, not the current generation making the investment.
To put a concrete number on this, suppose climate change is going to cause $1 trillion in damages by the year 2100. At a discount rate of 7 percent, it would only be optimal, following cost-benefit analysis, to spend about $7 billion today to avoid those damages. This result is problematic from many perspectives, but it is the straightforward output of standard cost-benefit analysis when applied to long time horizons.
The DICE Model: An Integrated Assessment of Climate and the Economy
What are Integrated Assessment Models?
Integrated Assessment Models (IAMs) combine some representation of the environment with some representation of the economy. They are the primary tools economists use to analyze climate policy.
Introduction to the DICE Model
The Dynamic Integrated Model of Climate and the Economy (DICE model) is the most famous integrated assessment model and was created by William Nordhaus. The course examines this model for two reasons.
First, it is a genuinely impressive economics model. It won its inventor the Nobel Prize in Economics in 2018. It draws directly on material covered earlier in the course, namely the Ramsey model of economic growth.
Second, the DICE model is the basis for much climate skepticism from economists. The DICE model has been used to argue, using economic logic that is quite solid internally, that it is not optimal to spend very much money on abating climate change. This has led to skepticism of a particular type — not skepticism about whether climate change is happening, but skepticism that says climate change is happening yet it is not optimal to spend very much on dealing with it. The class will engage with this question very directly.
The Structure of the DICE Model
The DICE model is an integrated assessment model that links economic activity to climate change through cost-benefit analysis. It couples a climate model that takes emissions from economic activity with intertemporal optimization. The optimization asks: what is the optimal amount of emissions and emissions abatement over time in order to maximize the net present value of consumption?
This structure is not far from standard macroeconomics. It uses a growth model to determine optimal growth and optimal decisions that maximize welfare, but adds an extra dimension: the option to invest in climate mitigation, where the benefit of that investment is that future consumption might be higher because climate change damages are reduced.
In summary, the DICE model is a cost-benefit optimization that integrates climate and economy.
History of the DICE Model
The most recent version discussed in the course is the 2023 version of the DICE model. The first version dates to 1992, making it the first integrated climate-economy optimization model. It has been continuously updated over the years, reflecting additional years of observations, what countries have actually done in terms of emissions policy, and the latest climate science. It was this long history of model development and refinement that led to Nordhaus winning the Nobel Prize.
The Three Components of the DICE Model
The model works through a basic linkage. An economic model produces emissions, which affect two things. First, the emissions feed into a carbon cycle module that tracks how much carbon is stored in the atmosphere, the upper oceans, and the deep oceans, thereby tracking CO₂ concentrations. Second, the model calculates from those emissions the amount of climate change (temperature increase) that results. These two outputs are then combined into a damage function, which quantifies how much loss to economic activity occurs as a function of the temperature increase.
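That linkage can be caricatured in a few lines of code. Everything below is invented for illustration: the real DICE model uses a three-reservoir carbon cycle, a calibrated temperature response, and a formal optimization, none of which appear in this toy loop.

```python
# Minimal toy of the DICE linkage: output -> emissions -> atmospheric
# carbon -> temperature -> damages -> output. All parameters invented.
def simulate(periods, abatement=0.0):
    output = 100.0   # gross economic output (arbitrary units)
    carbon = 850.0   # atmospheric carbon stock (rough order, GtC)
    temps = []
    for _ in range(periods):
        emissions = 0.1 * output * (1 - abatement)    # abatement cuts emissions
        carbon += emissions - 0.01 * (carbon - 600)   # slow natural uptake
        temp = 3.0 * (carbon / 600 - 1)               # toy temperature response
        damage_frac = 0.005 * temp ** 2               # quadratic damages
        output *= (1 + 0.02) * (1 - damage_frac)      # growth minus damages
        temps.append(temp)
    return temps

print(simulate(10)[-1], simulate(10, abatement=0.5)[-1])
```

Even in this caricature, the key feedback is visible: abating emissions lowers the carbon stock, which lowers temperature, which lowers damages and raises future output.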
The Welfare Maximization Problem
The DICE model maximizes welfare, where welfare is defined as the sum over all time periods into the future of a utility function, representing the idea that humans derive happiness from consumption. This is identical in structure to what the course covered previously with the Ramsey model.
The welfare function is linked to population growth, because the model is applied to the real economy rather than a purely theoretical one, so it must keep track of how many people there are.
The welfare function also includes a discount factor: one plus the discount rate, raised to a negative power of time. This is the mathematical mechanism by which utility in a distant time period (such as period 100) is reduced by a very large amount.
A final term allows the model to consider time steps, so the utility can be computed over intervals such as every five years.
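Putting those pieces together, the welfare objective described above can be written, in notation close to Nordhaus's (some symbol choices here are assumptions), as:

```latex
W = \sum_{t=1}^{T_{\max}} U\big(c(t)\big)\, L(t)\, R(t)\, \Delta t,
\qquad R(t) = (1+\rho)^{-t}
```

where $U$ is utility from per-capita consumption $c(t)$, $L(t)$ is population, $R(t)$ is the discount factor with pure rate of time preference $\rho$, and $\Delta t$ is the length of the time step (for example, five years).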
The Damage Function
The real twist of the DICE model is that the welfare maximization is subject to a damage function. In the Nordhaus formulation, the damage function takes temperature (uppercase T) as its input — where lowercase t represents time and uppercase T represents temperature. The damage function is one of the simplest possible specifications: the temperature multiplied by some coefficient, plus the temperature squared multiplied by another coefficient. The values of these two parameters are chosen to match the damage function to studies that have attempted to understand climate damages.
Concretely, the damage function looks like this: at 1 degree Celsius of warming, there is a tiny reduction in GDP, well below 1 percent. At 6 degrees Celsius of warming, there is a 19 percent loss of GDP. The curvature in the function between these points reflects the squared temperature term.
The coefficients in the damage function are calibration parameters — values drawn from the academic literature that make the shape of the function match empirical studies. Nordhaus examines a collection of studies in the literature and determines what parameter values make his equation best fit those studies.
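The shape of that calibration can be sketched directly. The coefficients below are illustrative, chosen only so the curve passes near the two points mentioned above (well under 1 percent loss at 1 degree, about 19 percent at 6 degrees); Nordhaus's published values differ across DICE versions.

```python
# The lecture's damage function, D(T) = psi1*T + psi2*T^2, with
# illustrative coefficients (not Nordhaus's published calibration).
PSI1 = 0.0          # linear term, assumed zero here for simplicity
PSI2 = 0.19 / 36    # chosen so that D(6) = 19% of GDP

def damage_fraction(temp_c):
    """Fraction of GDP lost at a given level of warming (deg C)."""
    return PSI1 * temp_c + PSI2 * temp_c ** 2

print(f"{damage_fraction(1):.1%}")  # about 0.5% at 1 C of warming
print(f"{damage_fraction(6):.1%}")  # 19.0% at 6 C of warming
```

The squared term is what makes damages negligible at low warming yet severe at high warming, which is exactly the curvature the empirical studies suggest.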
Solving for the Optimal Amount of Climate Change
Given the structure of maximizing welfare subject to damages, where the model gets to choose how much climate change occurs through the level of abatement investment, Nordhaus solves for the optimal amount of climate change. The result is expressed as the social cost of carbon, which represents the dollar value of the damage caused by emitting one additional ton of carbon dioxide.