Air Quality Archives - Reason Foundation https://reason.org/topics/environment/air-quality/

It Is Time for Environmentalism 3.0 https://reason.org/commentary/it-is-time-for-environmentalism-3-0/ Thu, 28 Jan 2021 05:00:29 +0000

The modern environmental movement, founded on the very best of principles—protecting people and the planet—is failing.

To put it bluntly, many of the policies promoted to address the most serious and high-profile environmental problems—climate change, declining biodiversity, oceanic dead zones, and tropical deforestation—simply aren’t working. Worse, some of those policies have become politically divisive and, in more than a few cases, actually make the problems worse.

It’s time to reconsider what we’re doing.

For the sake of this discussion, we can view environmentalism as a kind of social operating system, akin to a computer’s operating system. Like computer operating systems, environmentalism has proceeded through a series of versions, which can be referred to as Env1.0, 2.0, etc. This essay describes and evaluates Env1.0, which was in operation from about 1900 to about 1970, and Env2.0, which has been in operation since 1970. It then lays out a vision for Env3.0.

Env1.0, that of Theodore Roosevelt and Gifford Pinchot (sometimes called conservationism), was grounded in the idea of a human-centered conservation of the environment. Env1.0 was primarily about making sure that humans didn’t over-consume natural resources that humanity might want to use later. The rallying cry of Env1.0 might have been, ‘Save the whales! We will want to eat them later.’ I joke, but this was actually a decent starting point for an environmental operating system because it was scaled to our then-understanding of the environment, both locally and globally. It also reflected what we had the ability to influence in any meaningful way at the time. Unfortunately, it didn’t work well enough and didn’t fulfill our evolving ideas about the environment. The passenger pigeon still went extinct.

Env2.0, founded by people such as Julia Hill, Rachel Carson, Paul Ehrlich, and David Suzuki, and now associated with rituals such as Earth Day, non-governmental organizations such as Greenpeace, Earth First!, and Sierra Club, marked a significant departure from Env1.0.

In Env2.0, nature was seen to have intrinsic value outside of its utility to humanity and to lie largely beyond the human power of rational management. Thus, the goal of Env2.0 wasn’t to maintain and preserve the flow of environmental goods and services feeding human society but to wall off the environment from human society as much as possible. The rallying cry of Env2.0 might be caricatured as, ‘The environment is more important than us, we can only bung it up, so leave it alone!’

But Env2.0 is also failing on its own stated terms. Nature is neither being conserved via Env1.0 nor has it been elevated above human needs and desires or left alone in its perfection via Env2.0.

While most of the drivers of Env2.0 were quite reasonable, it is failing—not because it was wrong to sound the alarm about environmental degradation and humanity’s role in that degradation. That was largely correct. Nor have environmentalists been wrong in pushing for environmental degradation remediation and greater ecological and human health protections. That was right as well. Even the fact that Env2.0 wasn’t particularly frugal in its methods (to put it lightly) isn’t grounds for calling it a failure. No, Env2.0 is failing for fundamental physical reasons. And by physical, I mean the fundamental cause-begets-effect, gravity-is-a-bitch kind of physical. Env2.0 is running afoul of the laws of both the physical world and intrinsic human nature. Here are three examples.

First, Env2.0 approaches generally ignore a defining characteristic of life: that living organisms respond to external stimuli. Put simply, human beings respond to incentives. Economics is the study of how humans respond to incentives. And all life is economic in one way or another. Paying attention to incentives when you’re looking at the movement of, say, glucose through an ecosystem but ignoring incentives when considering the flow of money in human ecosystems is a prescription for failure. Simply put, setting people against their own best interests and forcing them to pursue someone else’s interests makes them likely to refuse to go along with the plan.

Second, when Env2.0 concerns turn toward practical solutions, policy proponents seem to lose sight of how living systems actually respond to threats. Nature is dynamic and resilient, balancing the rigid and the flexible, continually evolving, yet still highly efficient in ecological, engineering, and economic terms. But when it comes to regulation, environmental activists focus overwhelmingly on rigid prescriptions and inflexible regulation, attempting to pin humanity’s relationship to nature down to their specific vision of how an environmental utopia might appear, right down to setting the optimal number of polar bears that would constitute a “healthy ecosystem.” This approach has often been self-defeating because it fails to grasp how nature actually works and it conflicts with dynamic number one: people respond to incentives, and people want things like prosperity, liberty, health, opportunity, children, housing, jobs, and other things that are not satisfied by the Env2.0 approach.

Despite the claims of proponents that we can have both Env2.0 and all other human desires met, the reality comes back to science fiction author Robert Heinlein’s universal law:  “There Ain’t No Such Thing as a Free Lunch.”

At times, Env2.0 has indeed appeared to embrace policies that are resilient, dynamic, and take account of human desires with some market-based measures, such as emission trading, tradeable quotas for fisheries, and pricing mechanisms such as carbon taxes, energy taxes, and other economically mediated solution pathways.

The usual exemplar of this approach (and nearly the only one that has been given a serious effort in the United States) sought to control air emissions that caused acid rain. Some Env2.0 advocates argued that pollution causing acid rain was addressed efficiently with a market mechanism and that ambient lead pollution, and a few other air pollutants, were also managed with market mechanisms. They usually assert that carbon taxes are a market mechanism as well. However, these arguments are peripherally true but centrally false. What most people think of as market-based environmental control systems may well be resilient, flexible, efficient, and decentralized, giving people a certain degree of latitude to express their preferences through choices in a “market.”

But environmental trading systems to date haven’t tapped into the critical feature that lets markets genuinely manifest the true cost of something: allowing individuals to assess the costs and benefits of trade without artificial constraints. Rather, in current environmental-trading practice, some third party makes the value assessments, imposes its preferred outcome in accordance with its own view, and allows for only a subset of the range of choices individuals might make freely.

In other words, sulfur trading, lead trading, and carbon trading are pseudo-market systems, not genuine ones, and they do not embody the resilience that is missing from Env2.0. That is not to say that they did not achieve benefits: they most certainly did. However, they do not disprove the fact that Env2.0 was primarily a system of command-and-control, not one that embraced true market-driven or even market-based environmental policies.

A third reason why Env2.0 is failing is that it didn’t live up to its original mantra: think globally, act locally. Rather than allowing for a diversity of environmental protection approaches to flourish at municipal, state, or regional levels, the environmental movement quickly moved to centralize such regulations, first at the state level (in places like California), and then increasingly at the national level, and ultimately at the global level (through the United Nations).

Each step toward centralization was a step along a primrose path. Yes, it was a predictable and readily available approach—after all, governments have regulated for as long as we’ve had governments—but the further removed regulation became from the needs of people at the local level, the less uniform were the costs and benefits allocated to different people and different communities. All of this led to declining public support for the various regulatory regimes, and to the subversion—overtly or covertly—of those regimes.

One example is the alienation of states heavily invested in coal production or coal-fired power generation from states better endowed with lower-carbon sources of energy and economic growth. Another is the one-size-fits-few problem of designating broad-brush carbon targets that, by causing energy prices to rise, hit lower-income earners harder than higher-income earners.

That’s not to say there have not been important and notable environmental successes.

Nobody wants to return to the pollution levels of the 1960s and 1970s. I know I don’t—I have carried an asthma inhaler ever since collapsing from an asthma attack while running laps during high school physical education in California’s notoriously smoggy San Fernando Valley. And only a cretin would not celebrate the resurgence of whale populations, the protection of migratory bird populations, and of our most iconic raptor species. The successes of Env2.0 make up a fairly long list.

But as I mentioned at the outset, the rigid compulsory/engineering approach of Env2.0 is increasingly running into dead ends. For example, we are continually failing to achieve emission targets, reduction deadlines, treaty obligations, measurable improvements to climate stability, biodiversity targets, and so on. National and international environmental protection targets have been missed far more often than they have been hit. Virtually all national targets for greenhouse gas emissions have been missed, and almost no serious people think current greenhouse gas control targets (net-zero by 2050) are in any way attainable, much less compatible with the living standards achieved in developed countries and desired around the world.

In the meantime, the technological breakthroughs promised by Env2.0 (which, ironically, were supposed to arise spontaneously from markets) have stubbornly failed to appear. Despite subsidies, electric vehicles still make up a tiny fraction of the automobile market. Wind and solar power are intermittent, as California was recently reminded, and, because of the need for scarce materials and redundant backup systems, renewables are still more expensive than fossil fuels. The long-promised super-batteries that would let people power their homes off the grid, and even power entire cities, states, and countries using wind and solar, are nowhere to be seen. And now the public hears rhetoric asserting a “climate emergency,” rhetoric that dispenses with the very idea that humans can sustain the quality of life they’ve worked so hard to achieve and instead cautions us to prepare for a dystopian future while still futilely plugging away at the same failed policies that got us here.

So what’s the answer?

If at first you don’t succeed, revise and resurge. Now that Env2.0 is crashing painfully against the unyielding shores of reality, it’s time to turn toward approaches that work with fundamental human nature, economic incentives, fundamental ecological system characteristics, and fundamental physical reality by moving away from the static, non-economic, physically impossible approaches of Env2.0.

It’s time for Env3.0.

Doing this efficiently requires differentiating between two broad classes of environmental problems. One class consists of those environmental problems we know can be successfully managed with resilient, dynamic systems that accurately define and efficiently distribute property rights, including the right to property in your own body as well as in things outside it. The overwhelming majority of environmental problems fit into this category, with the limits mostly being a matter of whether a given jurisdiction has a market-compatible policy environment. Local air pollution, local water pollution, local chemical exposure, local resource over-utilization, and local wildlife endangerment can be (and sometimes have been) managed through markets and the rule of law. And by “local,” I mean “within one geopolitical region,” which can be a city, a state, or even a country, but virtually never a transnational, trans-economic, or trans-environmental-boundary problem.

In 1991, Terry Anderson and Donald Leal published Free Market Environmentalism, an excellent book that summarized and expanded our understanding of why environmental problems exist and how incentives, markets, and property rights could help solve those problems. Their answer was not, as the prevailing orthodoxy of the time held, that humans were just selfish and thoughtless despoilers of the world that had to be regulated into submission to what the environmentalist elites demanded of them. Unfortunately, some Env2.0 aficionados who distrust people’s inclination to prioritize Env2.0’s particular view of what it means to have a healthy environment (often meaning one untainted by human hands) have resisted this property-based approach to environmental management and protection.

But sometimes we can’t use property rights and a legal framework for pollutants we really care about, pollutants that cross borders, that cannot be traced to a responsible party, that persist beyond the time when the original actors can be held responsible for remediating their harms. Greenhouse gas pollution is one of these problems. So is conventional air pollution that moves between jurisdictions, even at the continental level. Plastic pollution is another, and biodiversity (especially of migratory wildlife) is still another. Ocean mammal protection would be such a problem (save for some revolutionary technology that would allow for private management of animals like whales), as would the risk of polar bear extinction. And those critics of environmentalism based purely on property rights are correct: some environmental problems simply won’t yield to that framework.

That’s where a different approach from Env2.0 needs to be brought to bear. What is needed is a management approach that shifts away from viewing the goal of environmental protection as the meeting of fixed, often arbitrary and unattainable targets and timelines for the reduction of single sources of environmental concern (such as, say, greenhouse gases) toward a focus on building the overall resilience of our integrated, globally shared, social-ecological system that is actually compatible with those stubborn laws of physical reality and human behavior mentioned above. This is a framework that treats nature and the human economy as an integrated social-ecological system, of the sort described by economist Elinor Ostrom in Governing the Commons.

Before going further though, we need to define what we mean by “resilience,” which has, unfortunately, become something of a code word lately for “more government” in all sorts of realms—from infectious disease recovery to managing climate change to any number of other perceived social ills.

The kind of resilience we’re talking about here is not about preventing any or all changes to a social-ecological system, or about imposing a pre-defined vision of exactly what that system is supposed to look like; it is about managing such systems so that they can bounce back from disturbance and return to a desirable base state.

In the case of the United States, one might define that desirable base state as a society based on the principles of the Constitution and the other documents of America’s founding, as well as on the strong environmental protection ethic that has come to permeate American sensibilities.

But what are the elements that make a social-ecological system (SES) resilient? More importantly, what elements of SES organization or management can make an SES less resilient, or more fragile?

Identifying the components of SES resilience was the goal of researchers David A. Kerner and J. Scott Thomas in “Resilience Attributes of Social-Ecological Systems: Framing Metrics for Management.” Kerner and Thomas break SES-resilience determinants into three broad categories: those that promote or compromise system stability, those that promote or compromise the system’s adaptive capacity, and those that promote or compromise the system’s readiness, which can be seen as the system’s speed and scope of responsiveness. Let’s consider a few examples (a minimal sketch of how these attributes might be organized as metrics follows the lists below).

Among the things that might compromise SES stability, Kerner and Thomas identify the presence of “single points of failure,” which might cause the entire system to fail, along with:

  • System balance, or “The degree to which a system is not skewed toward one strength at the expense of others;”
  • And system dispersion: “The degree to which the system is distributed over space and time”.

Among the things that might compromise SES adaptive capacity, Kerner and Thomas identify:

  • System response diversity, or the ability to employ alternative components to withstand stresses;
  • Collaborative capacity, or the “potential of system managers to work cooperatively to ensure system function in a timely and flexible manner;”
  • Connectivity, or how readily a system can exchange resources and information internally and externally to ensure continued function in the face of existential threats;
  • And learning capacity.

Finally, Kerner and Thomas identify some of the system attributes that might compromise SES readiness, including:

  • The absence of simplicity or understandability;
  • The presence of false subsidies that may do more harm than good;
  • And the presence or absence of autonomy, the degree to which “an organization, operation, or function can self-select alternate actions, configurations, and strategies to achieve the specific mission or function—essentially, control over its destiny.”
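
To make the framework concrete, here is a minimal sketch, in Python, of how these attributes might be organized into a simple checklist for managers. It is purely illustrative: the attribute names follow Kerner and Thomas as summarized above, but the 0-to-1 scoring scale, the equal weighting, and the example scores are assumptions made for this sketch, not part of their paper.

```python
# Illustrative only: a minimal checklist built from the Kerner and Thomas
# resilience attributes summarized above. The 0-to-1 scores and equal
# weighting are assumptions for this sketch, not part of the paper.
from dataclasses import dataclass, field

RESILIENCE_ATTRIBUTES = {
    "stability": ["absence of single points of failure", "system balance", "system dispersion"],
    "adaptive_capacity": ["response diversity", "collaborative capacity", "connectivity", "learning capacity"],
    "readiness": ["simplicity/understandability", "absence of false subsidies", "autonomy"],
}

@dataclass
class ResilienceAssessment:
    # scores[category][attribute] -> 0.0 (compromised) .. 1.0 (fully present)
    scores: dict = field(default_factory=dict)

    def category_score(self, category: str) -> float:
        attrs = RESILIENCE_ATTRIBUTES[category]
        return sum(self.scores.get(category, {}).get(a, 0.0) for a in attrs) / len(attrs)

    def weakest_category(self) -> str:
        return min(RESILIENCE_ATTRIBUTES, key=self.category_score)

# Example: a hypothetical system that is stable but slow to adapt.
assessment = ResilienceAssessment(scores={
    "stability": {"absence of single points of failure": 0.8, "system balance": 0.7, "system dispersion": 0.9},
    "adaptive_capacity": {"response diversity": 0.3, "collaborative capacity": 0.4, "connectivity": 0.5, "learning capacity": 0.4},
    "readiness": {"simplicity/understandability": 0.6, "absence of false subsidies": 0.5, "autonomy": 0.7},
})
print(assessment.weakest_category())  # -> adaptive_capacity
```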

Astute readers will readily recognize that market systems or, as a second-best alternative, market-based environmental protection measures, will likely perform better on the various parameters of building SES-resilience than would either the conservation approaches of Env1.0 or the command-and-control approaches of Env2.0.

Summary

We should be concerned about the failure of Env2.0, which, by viewing humanity as somehow outside the environment, dismisses the physical and evolutionary laws that govern human behavior and produces an utterly dysfunctional, politically antagonistic environmental protection regime that fails at its self-proclaimed priority: protecting the environment.

What is needed now, 120 years after the advent of Env1.0 and 50 years after the advent of Env2.0, is a new system. We need an Env3.0 that recognizes that most environmental problems, especially localized ones, can be fixed most efficiently, and with respect for human needs, through genuine, bottom-up, market-based management systems.

Env3.0 would also move beyond the limitations of Env2.0 by managing systems not as either “human” systems or “environmental” systems but as integrated social-ecological systems that require more integrated thinking and accommodation of different people’s needs and desires, with a focus on resilience rather than on static, inflexible, arbitrarily defined targets, timelines, and quantitative goals.

The Economic Consequences of Fuel Economy Standards https://reason.org/policy-brief/the-economic-consequences-of-fuel-economy-standards/ Mon, 30 Mar 2020 21:58:50 +0000

Since 1978, the National Highway Traffic Safety Administration (NHTSA) has promulgated “corporate average fuel economy” (CAFE) standards. Meanwhile, the California Air Resources Board (CARB) has, since 1990, employed its own low emission vehicle (LEV) standards, including zero-emission vehicle (ZEV) standards. Twelve other states and the District of Columbia subsequently adopted California’s LEV standards and nine of those adopted its ZEV standards.

The original CAFE standards sought to reduce U.S. dependence on foreign oil. While that remains one accepted purpose of current standards, the rationale has now largely shifted to reductions of emissions—especially those associated with climate change.

In 2012, NHTSA and the Environmental Protection Agency (EPA) issued a joint rulemaking for CAFE and GHG emission standards covering the period 2017–2021 and describing “augural” (i.e. intended) standards for 2022–2025. In 2018, EPA and NHTSA issued a proposal for revised standards for 2021 and new standards for 2022–2025 under the auspices of the proposed Safer Affordable Fuel-Efficient Vehicles rule (“SAFE” rule).

This is the fourth in a series of briefs published by Reason Foundation explaining and evaluating CAFE and ZEV standards. It incorporates lessons from those briefs and other studies that consider the overall effect of CAFE standards, as well as more recent work looking at revisions to federal CAFE and GHG standards under the proposed SAFE rule.

Fuel economy standards undoubtedly result in harmful distortions to markets, undermining choice and competition and leading overall to a reduction in beneficial innovation and economic growth. To the extent that reductions of various emissions (such as NOx and CO2) are desirable, there are more economically efficient ways to achieve these reductions, such as taxes and, even better, mileage-based user fees (MBUFs).

In terms of its effects on the economy, the SAFE rule thus represents a substantial improvement over the existing CAFE and GHG emission standards for MY 2017–2021 and an enormous improvement over the augural standards for 2022–2025. The rule will likely save consumers and manufacturers tens if not hundreds of billions of dollars, freeing up resources to be spent on innovation that will increase rates of economic growth and result in vehicles that consumers actually prefer.

Looking to the future, it seems likely that vehicle fuel economy will increase even without fuel economy standards, as vehicle manufacturers implement cost-effective improvements and respond to the felt needs of consumers, especially in the face of gas prices that are generally higher than during the late 1980s and 1990s.

In a study published in 2017, Kenneth Small used an adapted version of the National Energy Modeling System to estimate that if NHTSA and EPA were to leave the average CAFE requirement for light-duty vehicles (LDVs) at 33.7 mpg, the average fuel economy for new cars would rise from 36.7 mpg in 2015 to 41.7 mpg in 2025, while the fuel economy of new light trucks would rise from 27.3 mpg to 35.6 mpg over the same period.

In other words, the average fuel economy for new LDVs would rise to 38.9 miles per gallon (about 2 mpg higher than the level mandated under the proposed SAFE rule). Small assumed that the price of a gallon of gasoline would rise to $3.58 by 2025, which is lower than the $3.78 assumed by NHTSA and EPA in their 2012 rulemaking but higher than the $2.93 assumed in the SAFE rulemaking.
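
As a rough check on how the combined 38.9 mpg figure relates to the separate car and truck projections, the sketch below (Python) combines the two using a sales-weighted harmonic mean, the averaging convention used in CAFE compliance calculations. The car/truck sales shares are assumptions for illustration; the brief does not state the split underlying Small's combined figure.

```python
# Illustrative check: combine projected 2025 car and light-truck fuel economy
# into a single fleet average using a sales-weighted harmonic mean (i.e.,
# average gallons per mile, then invert), the convention used for CAFE.
# The car/truck sales shares below are assumptions, not figures from the brief.
def fleet_average_mpg(car_mpg: float, truck_mpg: float, car_share: float) -> float:
    truck_share = 1.0 - car_share
    gallons_per_mile = car_share / car_mpg + truck_share / truck_mpg
    return 1.0 / gallons_per_mile

for car_share in (0.45, 0.50, 0.55, 0.60):
    avg = fleet_average_mpg(41.7, 35.6, car_share)
    print(f"car share {car_share:.0%}: combined average {avg:.1f} mpg")
# A combined figure of about 38.9 mpg corresponds to a car share between
# roughly 55% and 60% under this convention.
```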

These findings are relevant because they suggest that if gas prices are higher than expected in the proposed SAFE rule, the fuel economy standards could cease to be binding, thereby reducing the costs they impose on society. It is difficult to predict the future of gasoline prices, but it seems clear that consumers and manufacturers adjust quickly to unexpected price changes, whereas mandated fuel economy standards run the risk of becoming either ineffectual or seriously harmful.

Full Brief: The Economic Consequences of Fuel Economy Standards

Related Research:

The Effect Of Corporate Average Fuel Economy Standards On Consumers

CAFE and ZEV Standards: Environmental Effects and Alternatives

CAFE Standards in Plain English

Fuel Economy Standards Hurt Consumers and the Economy https://reason.org/commentary/fuel-economy-standards-hurt-consumers-and-the-economy/ Tue, 10 Apr 2018 05:22:54 +0000

Last week, the Environmental Protection Agency announced it will not raise fuel economy and greenhouse gas emission standards for cars and light trucks by as much as the Obama administration had proposed. “The Obama administration’s determination was wrong,” said Scott Pruitt, head of the Environmental Protection Agency (EPA). The EPA’s decision is good news for consumers, the economy, and the environment.

Corporate average fuel economy (CAFE) standards were first introduced by Congress in 1975, ostensibly to reduce US reliance on foreign oil. The standards have been tightened several times since then, most recently in 2016, when they were combined with standards intended to reduce greenhouse gas emissions. Those standards, which are binding, peak at 41 miles per gallon in 2021. The Obama administration also proposed raising the standards after 2021, suggesting that they should reach 54.5 miles per gallon by 2025. It is these latter, non-binding standards that now won’t be implemented.

Economists have long been skeptical of fuel economy standards because they are an inefficient means of reducing fuel consumption and emissions. The majority of U.S. consumers demonstrably prefer larger, more powerful vehicles, such as pickups and SUVs. But vehicle manufacturers continue to produce large numbers of small vehicles, which have higher fuel economy, in order to comply with fuel economy standards. As a result, manufacturers and dealers are forced to discount the prices of new cars, undermining their overall profitability.

In addition, fuel economy standards result in a “rebound” effect: when it costs less to drive each mile, people tend to drive more, which means more emissions. Moreover, by increasing the price of new vehicles, fuel economy standards incentivize consumers to keep older vehicles on the road longer. This is especially true for larger, more powerful “gas guzzlers.” Studies suggest these consumer responses reduce the potential fuel savings and emissions reductions by 25 to 50 percent.

In a recent study for Reason Foundation, Arthur Wardle and I conservatively estimated that implementing the proposed 2025 standards would reduce annual carbon dioxide emissions by 50 million tons at a cost of $50 billion. That puts the cost of CO2 reduced at an astronomical $1,000 per ton – about 50 times the “social cost of carbon” estimated by the EPA during the Obama administration.

As I noted in another recent study, the social cost of carbon is likely less than $10 per ton (and could be zero). So, had the EPA proceeded with the Obama-era proposals, it would have imposed costs at least 100 times greater than the benefits. Those costs would have fallen disproportionately on poorer consumers.
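
The arithmetic behind these comparisons is simple; the sketch below (Python) just restates the figures cited above.

```python
# Back-of-envelope arithmetic using only the figures cited above.
annual_cost_usd = 50e9           # estimated annual cost of the proposed 2025 standards
annual_co2_reduced_tons = 50e6   # estimated annual reduction in CO2 emissions (tons)

cost_per_ton = annual_cost_usd / annual_co2_reduced_tons
print(f"Implicit cost of CO2 reduced: ${cost_per_ton:,.0f} per ton")          # ~$1,000

social_cost_of_carbon = 10.0     # $/ton, the upper-bound estimate cited above
print(f"Cost-to-benefit ratio: {cost_per_ton / social_cost_of_carbon:.0f}x")  # ~100x
```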

In a study published last week, I calculated that a person purchasing a three-year-old pickup is already paying about $100 per year more for their vehicle and fuel than they would be in the absence of the new CAFE standards. As fuel standards become more onerous, which they will until at least 2021, the net cost is likely to increase. If the more ambitious Obama-era standards had been imposed, consumers could have been paying $500 per year more than necessary by 2025.

Other policies are better suited to address the pollution associated with vehicle use. Arguably the best policy would be to charge a fee per mile driven, varied according to the time and location of the driving, as well as the emissions from specific vehicles. Such a “mileage-based user fee” could ensure that drivers pay for the harm they generate – and would incentivize drivers to drive less when the harm they cause is greater.
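
To illustrate how such a fee might be computed per trip, here is a minimal sketch in Python. All of the rates and multipliers are placeholder values invented for the example; they are not a proposal or an estimate of appropriate charges.

```python
# Illustrative only: a per-trip mileage-based user fee that varies with the
# time and location of driving and with the vehicle's emissions. Every rate
# and multiplier below is an assumed placeholder, not a recommended value.
BASE_RATE_PER_MILE = 0.02                                            # dollars per mile (assumed)
TIME_MULTIPLIER = {"peak": 2.5, "off_peak": 1.0}                     # time-of-day factor (assumed)
LOCATION_MULTIPLIER = {"urban": 1.5, "suburban": 1.0, "rural": 0.5}  # location factor (assumed)
EMISSIONS_RATE_PER_GRAM_MILE = 0.00005                               # dollars per gram of CO2 per mile (assumed)

def trip_fee(miles: float, period: str, area: str, co2_grams_per_mile: float) -> float:
    """Distance charge scaled by when and where the driving occurs, plus an emissions surcharge."""
    distance_charge = miles * BASE_RATE_PER_MILE * TIME_MULTIPLIER[period] * LOCATION_MULTIPLIER[area]
    emissions_charge = miles * co2_grams_per_mile * EMISSIONS_RATE_PER_GRAM_MILE
    return distance_charge + emissions_charge

# Example: a 20-mile peak-hour urban commute in a vehicle emitting 400 g CO2/mile.
print(f"${trip_fee(20, 'peak', 'urban', 400):.2f}")  # 1.50 + 0.40 = $1.90
```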

Even the existing gas tax is more efficient than fuel economy standards. The University of California, San Diego, economist Marc Jacobsen estimates that using fuel economy standards to reduce fuel use and emissions costs three to five times as much as a gas tax. In addition to harming consumers and the economy, fuel economy standards are bad for the environment. As noted, they encourage consumers to use their cars more than they otherwise would and to keep “gas guzzlers” on the road for longer. And because these effects drive up the cost of reducing emissions, they effectively waste resources that could be spent on other environmental amenities.

Unfortunately, Congress has tied the hands of agencies that set fuel economy standards, preventing them from rolling back existing standards that are already in force. It would take an act of Congress to scrap those standards.

EPA says it will produce new emission standards for 2022 and beyond that are less onerous. Given that the existing standards already impose costs greater than the benefits, the best outcome would be for EPA not to increase the standards after 2021.

The Effect Of Corporate Average Fuel Economy Standards On Consumers https://reason.org/policy-brief/the-effect-of-corporate-average-fuel-economy-standards-on-consumers/ Sun, 01 Apr 2018 04:00:52 +0000

Corporate Average Fuel Economy (CAFE) standards require manufacturers to meet minimum fuel economy requirements for their fleets of vehicles sold in the U.S. As a result, manufacturers adjust certain vehicle attributes in order to comply with these standards. Among the many vehicle attributes that a manufacturer may adjust are weight, power, and drivetrain. Such adjustments have consequences for the cost and performance of vehicles, which affects consumers.

In their assessment of the likely effects of CAFE standards, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) claim that the new standards introduced since 2011 generate substantial benefits for consumers. Underlying that claim is an assumption that consumers fail to take adequate account of the economic benefits of more fuel-efficient vehicles when making purchasing decisions. However, a slew of recent studies questions the assumptions made by NHTSA and EPA. This brief assesses the effects of CAFE standards on consumers.

Proponents of CAFE standards claim that they benefit consumers by reducing the total costs of purchasing and using vehicles. The evidence contradicts this claim. Consumers generally purchase vehicles with characteristics that meet their needs, including their expectation of the total cost of future gas purchases. CAFE standards distort manufacturers’ incentives, forcing them to produce new vehicles with lower gas consumption than would be preferred by consumers. As a result, the range of vehicle options available to consumers is limited and many consumers are effectively forced to purchase vehicles that are less able to meet their preferences.

Among the most adversely affected consumers are those, predominantly in rural areas, who seek to purchase used pickups. The distortions created by CAFE standards artificially raise the cost of these vehicles by more than the average savings from reduced gas usage, increasing the total cost of ownership. Given the steep rise in the price of used pickup trucks that resulted from CAFE standards for the 2012–2016 period and current increases occurring as the 2017–2021 standards are implemented, it is likely that prices would rise at an even faster rate if the agencies were to implement standards along the lines of those proposed as “augural” for 2022–2025.

In addition, as noted in a previous paper, fuel economy and greenhouse gas emissions standards for vehicles are a very inefficient way to address issues related to fuel consumption and emissions.

Ideally, the federal government would scrap the federal CAFE and greenhouse gas emissions standards. However, this option is not currently on the table. Nonetheless, the agencies implementing the standards do have the option of setting future greenhouse gas emissions and CAFE standards at the same level currently set for the model year 2021. That would certainly be preferable to the alternative of raising the standards further.

In addition, to the extent that other extant EPA and NHTSA regulations serve as barriers to the introduction of vehicles that better suit consumer preferences, it behooves the agencies to seek ways to remove these barriers. One example noted herein concerns the essentially arbitrary and unnecessary differences between U.S. and international standards for a variety of vehicle parts. Harmonization of these standards would likely result in the production of vehicles that better serve consumers at a lower price. In addition, to the extent that the threat of anti-trust action impedes collaboration between manufacturers in the development of new technologies, a simple process for the granting of anti-trust waivers could facilitate more rapid innovation, not only of more-efficient vehicles but also in many other aspects of automotive technology.

Full Brief: The Effect Of Corporate Average Fuel Economy Standards On Consumers

Related Research:

CAFE and ZEV Standards: Environmental Effects and Alternatives

Climate Change, Catastrophe, Regulation and The Social Cost of Carbon

Federal Fuel Economy Standards are Costly, Inefficient and Harm the Environment https://reason.org/commentary/federal-fuel-economy-standards-are-costly-inefficient-and-harm-the-environment/ Fri, 11 Aug 2017 14:54:23 +0000

New vehicles sold in the U.S. must comply with increasingly stringent Corporate Average Fuel Economy (CAFE) standards. Originally intended to reduce reliance on oil imports, the new standards focus mainly on reducing emissions of greenhouse gases. But even the federal government’s own studies show that CAFE standards are an extremely costly and inefficient means of achieving these objectives. In a review now under way, the Trump administration has an opportunity to reduce these costs by leaving the standards unchanged after 2021.

In 2012, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) issued new CAFE standards for vehicles manufactured in the years 2017-2025. Under these rules, the minimum fleetwide average fuel economy for passenger cars would rise from 39.6 mpg this year to 55.3 mpg in 2025. For light trucks, minimum fuel economy would rise from 29.1 mpg this year to 39.3 mpg in 2025.

Final rules were only set for model years 2017-21, with subsequent rules to be established after a mid-term evaluation. In that evaluation, EPA and NHTSA found that between 2016 and 2028 the standards would reduce total carbon dioxide emissions by up to 748 million metric tons. In its regulatory impact assessment of the new CAFE standards, NHTSA estimated the annualized cost of implementation at between $5.4 billion and $7.6 billion. Even at that low end — $5.4 billion per year — the CAFE standards represent an implicit cost of $87 per ton of carbon dioxide (CO2) reduced. That is higher than most estimates of the cost that carbon dioxide imposes on society – and more than twice the EPA’s own estimate (since withdrawn by President Trump).

Additionally, these estimates almost certainly overstate the benefit of the new fuel economy standards and underestimate their costs. When consumers buy more fuel efficient cars, they typically drive more because their per-mile cost of driving is lower. And many other consumers prefer large, powerful vehicles. Because the new CAFE standards drive up the cost of new large vehicles disproportionately, consumer demand for older, large “gas guzzlers” rises and those vehicles stay on the road longer than they might otherwise. In combination, these two factors could reduce the effectiveness of CAFE standards by as much as one-half of EPA and NHTSA’s estimates.

Even assuming a more modest total effect, the new CAFE standards might optimistically reduce CO2 emissions by around 50 million metric tons per year, which equals just one percent of total U.S. emissions.

Researchers at the Heritage Foundation found that the costs of meeting the new CAFE standards could range from $61.2 billion to $186.1 billion per year. The higher figure represents approximately 1 percent of U.S. GDP – a staggering cost. If the true cost of the new CAFE standards is even $50 billion per year and the true emissions savings were as much as 50 million metric tons a year, CAFE standards would represent an implicit cost of $1,000 per ton of CO2 reduced – making them among the least efficient ways to reduce CO2 emissions.
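
The implicit cost-per-ton figures above follow directly from the cited cost and emissions estimates. The sketch below (Python) restates that arithmetic; the roughly 12-year period used to annualize the 748-million-ton figure is my assumption about how the $87 figure was derived.

```python
# Back-of-envelope arithmetic using the figures cited above.

# Agencies' own estimates (2012 rulemaking and mid-term evaluation):
annualized_cost_low = 5.4e9    # NHTSA's low-end annualized cost, $/year
total_co2_reduced = 748e6      # metric tons of CO2 reduced, 2016-2028
years = 12                     # assumed averaging period for the 2016-2028 span
annual_co2 = total_co2_reduced / years
print(f"Implicit cost, agency estimates:   ${annualized_cost_low / annual_co2:,.0f}/ton")        # ~$87

# Less optimistic estimates discussed above:
annual_cost_high = 50e9        # $/year
annual_co2_adjusted = 50e6     # metric tons/year after rebound and fleet-turnover effects
print(f"Implicit cost, adjusted estimates: ${annual_cost_high / annual_co2_adjusted:,.0f}/ton")  # $1,000
```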

CAFE standards distort the market, forcing companies to build vehicles that are more fuel-efficient than consumers want. When consumers purchase these more-expensive-than-necessary vehicles, they have fewer resources to invest in other things, including conservation and other practical environmental improvements. Moreover, companies forced to invest in the development of more fuel-efficient vehicles have fewer funds to invest in more useful innovations, such as the development of autonomous vehicles, which are likely to save fuel and reduce emissions and traffic congestion.

NHTSA recently announced plans to prepare an environmental impact statement for CAFE standards for model years 2022-2025. One of the options under consideration is to keep the fuel economy standards at the levels set for the model year 2021. Doing so would reduce market distortions, save consumers billions of dollars, and enable companies to invest more in the development of improvements that actually benefit consumers and the environment.

This column first appeared in The Hill.

CAFE and ZEV Standards: Environmental Effects and Alternatives https://reason.org/policy-brief/cafe-and-zev-standards-environmental-effects-and-alternatives/ Thu, 03 Aug 2017 16:24:29 +0000

Introduction

Manufacturers and importers of vehicles sold in the U.S. are subject to federal fuel economy and greenhouse gas (GHG) emission standards. In addition, California and nine other states have imposed more-stringent standards and require a minimum proportion of vehicles to have “zero” emissions.

The stated objectives of Corporate Average Fuel Economy (CAFE) standards are: (1) to reduce fuel consumption and (2) to reduce greenhouse gas (GHG) emissions. This brief considers the extent to which the standards are likely to achieve these objectives, the relative importance of achieving such objectives compared with other possible environmental objectives, and more cost-effective alternative policies.

CAFE and ZEV Standards

CAFE standards were first introduced in 1978 and new standards have been implemented at various intervals since then. In 2012, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) issued new CAFE and GHG emission rules for vehicles manufactured in years 2017–2025. Under these rules, fleetwide average fuel economy for passenger cars rises to 39.6 mpg in 2017 (from 38.2 mpg in 2016) and then incrementally to a minimum of 55.3 mpg in 2025. For light trucks, minimum fuel economy rises to 29.1 mpg in 2017 (from 28.9 mpg in 2016) and then incrementally to 39.3 mpg in 2025. These minimum fuel economy standards are based on the maximum vehicle footprint in each category: fleets with smaller average vehicle footprints are required to meet stricter standards.

Under the GHG emission rules in the 2017–2025 standards, manufacturers may offset other efficiency improvements, such as in air conditioning systems, thereby reducing the minimum effective fuel efficiency for passenger vehicles in 2025 to 49.6 mpg. Moreover, manufacturers may count each electric and plug-in hybrid vehicle as more than one effective vehicle: for plug-in hybrids, this multiple starts at 1.6 in 2017 and declines to 1.3 in 2021; for electric vehicles, the multiple is 2 in 2017, falling incrementally to 1.5 in 2021.
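
To see how these multipliers work, consider a stylized fleet-average calculation. The sketch below (Python) is a simplified illustration, not the regulatory compliance formula: it assumes electric vehicles are counted at zero grams of CO2 per mile and simply weights each vehicle's sales by its multiplier, and the fleet composition and emission rates are invented for the example.

```python
# Stylized illustration of how production multipliers affect a sales-weighted
# fleet-average emissions calculation. Fleet composition, emission rates, and
# the zero-grams-per-mile treatment of EVs are simplifying assumptions; this
# is not the regulatory compliance formula.
def fleet_average_co2(vehicles):
    """vehicles: list of (units_sold, grams_co2_per_mile, multiplier) tuples."""
    weighted_units = sum(units * mult for units, _, mult in vehicles)
    weighted_grams = sum(units * mult * g for units, g, mult in vehicles)
    return weighted_grams / weighted_units

fleet_with_multiplier = [
    (95_000, 250, 1.0),  # conventional gasoline vehicles
    (5_000,    0, 2.0),  # electric vehicles, each counted twice (2017 multiplier)
]
fleet_without_multiplier = [(95_000, 250, 1.0), (5_000, 0, 1.0)]

print(f"With 2x EV multiplier:  {fleet_average_co2(fleet_with_multiplier):.1f} g/mi")     # ~226.2
print(f"Without the multiplier: {fleet_average_co2(fleet_without_multiplier):.1f} g/mi")  # 237.5
```

Counting each electric vehicle more than once lowers the calculated fleet average, making a given standard easier to meet.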

Also in 2012, California’s Air Resources Board adopted the Advanced Clean Cars program, which created standards for new vehicles in California through 2025. A key provision of the program requires that, starting in 2018, major motor vehicle manufacturers sell a minimum proportion of “Zero Emission Vehicles” (ZEVs—i.e. electric cars) or combination of ZEVs and “Transitional Zero Emission Vehicles” (TZEVs—i.e. plug-in hybrids). For model year 2018, the minimum is 4.5%, including at least 2% ZEVs. This rises annually reaching 22% in 2025, of which at least 16% must be ZEVs. California was permitted to implement these stricter standards under a waiver granted by the Environmental Protection Agency. Other states have been permitted to adopt the same standards as California and so far nine additional states have done so, namely: Connecticut, Maine, Maryland, Massachusetts, New Jersey, New York, Oregon, Rhode Island and Vermont.

Full Brief – CAFE and ZEV Standards: Environmental Effects and Alternatives

Devastating Fires Show Forest Management Reforms Are Badly Needed https://reason.org/policy-brief/forest-fires-management-reform/ Tue, 01 Sep 2015 14:00:00 +0000

On August 19, a ferocious wildfire in Chelan, Washington, claimed the lives of three firefighters. That fire, now 50 percent “contained,” has burned about 90,000 acres. Meanwhile, in nearby Okanogan, Washington, five massive fires merged to create a 305,000-acre monster, the largest in the state’s history – and still only 25 percent contained (as of 8/30). Across the Northwest (Washington, Oregon, Idaho and Montana), currently active fires have burned over 1.4 million acres, destroying homes and businesses in their wake. Meanwhile, the U.S. Forest Service alone is reportedly spending $100 million per week on fire suppression. With conditions tinder dry and fire crews at their limit, worse may yet be to come.

Many of those living near federal forest lands in the Northwest fear a repeat of August 1910, when a single, monstrous wildfire burned 3 million acres in Washington, Idaho, and Montana in two days, killing 92 people. With five times as many people living in those states, such a fire today would likely be far more devastating. Yet, ironically, today’s problems in large part stem from the misguided federal response to that fire – especially management practices on the 63 percent of forests owned by the federal government (see Figure 1). If worse problems are to be avoided in the future, management practices on those federally owned forests must change.


[Figure 1 omitted. Source: Brian Seasholes using data from U.S. Forest Service]

Catastrophic Wildfires Increase

In the past three decades, the area burned by wildfires in the U.S. each year has grown dramatically, while the number of fires has remained roughly constant (having peaked in the 1970s) – see Figure 2. The big change has been the average size of each wildfire, which has more than doubled – as can be seen clearly in Figure 3 (solid line is the trend). (Data for these figures is from the National Interagency Fire Center.)


2015 will likely break records – and that is not a good thing. Year-to-date, more land has been destroyed by fire in the U.S. than at any comparable period in the past decade – see Figure 4.

The scale of wildfires in the Northwest this year is in part due to a persistent drought, high temperatures and a snow pack that was well below average and melted earlier. Snowpack typically melts over the course of the spring and summer, providing water to the forest ecosystem. A low snowpack (resulting from below-average snowfall during the winter months), combined with hot summer weather, reduces the availability of water for trees, increasing the likelihood of fires.

But such climatic factors cannot explain the pattern of fires observed over the past century. Data from the National Oceanic and Atmospheric Administration show that average winter precipitation in the Northwest (which NOAA defines as Washington, Oregon and Idaho – but the situation is similar for the forested mountains of Western Montana) has not changed significantly since 1895. Although winter temperatures in the “Northwest” and Western Montana show a small trend (though not statistically “significant”) – and were well above average last winter – summer droughts show no particular trend, as can be seen in Figure 5. So, while it is possible that climate change has played a role in increasing the size of fires, the primary cause seems to be forest management practices, which have changed several times over the course of the past 200 years.

[Figure 5: Palmer Drought Severity Index for July–September, 1895–2014 (yellow = worse). Source: National Oceanic and Atmospheric Administration]

200 Years of Changing Forest Management Practices

Prior to European settlement in the early 1800s, Native Americans set forest fires for hunting, harvesting and many other purposes – including fire-proofing. These frequent, small, low-intensity fires, usually set at times of year when they were unlikely to rage out of control, burned much of the dead wood and produced more park-like forests of variegated tree types and ages, which made for favorable hunting conditions. By removing fuel, these fires reduced the number and scale of catastrophic wildfires.

European settlement resulted in the collapse of the Native American population – and with it the practice of intentionally setting small fires. Forests reverted to a more “natural” state (i.e. how they might have been without human intervention): pioneer trees such as pine, aspen and larch were able to mature, while shade-tolerant species, such as spruce, fir and giant cedar, grew in the understory. In these densely packed forests, trees compete for water and in drier years become severely water stressed and thereby more susceptible to disease and pests.

When lightning hits a healthy tree in a lightly packed, variegated forest, the resultant sparks might cause a small fire that clears the understory. In a dense forest full of diseased and dead trees, a lightning strike ignites a fire that climbs the ladder created by the thick understory, reaching the crown. The resulting, explosive crown fire then spreads rapidly, especially in arid, windy conditions.

Those were exactly the conditions that prevailed on August 20, 1910, when perhaps 3,000 separate wildfires in Washington, Idaho and Montana merged into one. The resulting inferno, driven by ferocious winds, burned about three million acres of forest in two days, killing 92 people, destroying seven towns and displacing thousands of people.

Prior to the “Big Blowup”, as the 1910 fire became known, some argued that the U.S. Forest Service, which was founded in…, should carry out small, prescribed burns (essentially imitating what the Native Americans had done), while others – including Gifford Pinchot, first chief of the Forest Service – favored total suppression. The Big Blowup settled the debate: fire suppression became the rule.

For decades, the Forest Service’s policy of fire suppression appeared to be highly successful. As Figure 6 shows, from 1910 to the mid-1950s, the area of National Forest burned fell from over 1 million acres to around 200,000 acres – and stayed more-or-less constant until the 1980s, when it inevitably began to rise once again.

[Figure 6: Effects of Fire Suppression on the Area of National Forest Burned, 1910-97. Source: General Accounting Office]

In the hot, dry summer of 1988, several ferocious crown fires raged through Yellowstone, burning more than a third of the National Park’s 2.2 million acres, as well as hundreds of thousands of acres in the surrounding area. In some respects, the Yellowstone fire was “natural”: 80 percent of the trees in the park were lodgepole pine, which is habituated to intense, stand-replacing fires. But the scale of the fire was nonetheless likely exacerbated by the changes in forest management since the 1800s, including 70 years of fire suppression. And the political response was far from natural, as forestry expert Alison Berry notes: “Managers were mandated to create fire plans for all federal forest lands, and all fires were to be suppressed until fire plans were in place.”

In 1990, the Fish and Wildlife Service listed the northern spotted owl as “threatened” under the Endangered Species Act. Largely in response to this listing, the Forest Service implemented a new “ecosystem management” policy, which resulted in a further reduction in timber harvests. Then, shortly before he left office, President Clinton introduced a rule that severely restricted the use of existing roads and the construction of new roads on 49 million acres of National Forest, limiting the ability of the Forest Service to thin the dense thickets of trees that resulted from almost a century of fire suppression – or even to remove dead trees.

The net result of these changes was a dramatic decline in the amount of timber removed from federal lands. As Figure 7 shows, between 1960 and 1990, an average of 10.3 billion board feet of timber were removed each year from Forest Service land. Removals declined precipitously between 1991 and 2000. From 2000 to 2013, an average of just 2.1 billion board feet of timber were removed from Forest Service land. That represents a nearly 80 percent decline in removals. Similarly, from 1990 to 2002, timber sales on BLM land fell 74 percent as a direct result of these policy changes and have remained at these suppressed levels since then.

Decades of suppression and low levels of timber harvesting on federal lands exacerbated the tendency of forests in the Northwest to become thickets of mature but spindly pines with a dense understory of spruce and fir. (These conditions also favor outbreaks of mountain pine beetle, which may exacerbate the risk of fire.)



[Figure 7 omitted. Source: Forest Service]

The Situation Today

As a result of these various factors, and especially the combination of very limited removals of fuel (either by logging or by intentionally setting fires) and fire suppression, large crown fires have become the norm in the Northwest, destroying millions of acres of trees every year and threatening people’s lives and livelihoods. Meanwhile, the Forest Service and other forest-owning federal agencies, such as the Bureau of Land Management, have increased their expenditure on fire suppression in direct proportion to the area burned – as shown in Figure 8.

As this year’s fires demonstrate, another megafire like the Big Blowup seems increasingly likely. With more people living in or near forests, such a fire could be far more deadly and economically devastating. It would also likely destroy vast swathes of species habitat, including that of the spotted owl.

For the sakes of people and the environment, the management of the forests in the Northwest must change. One objective might be to return the forest lands in the Northwest to a condition similar to that which pertained prior to European settlement. That could be done through a combination of ecologically sensitive logging (in most cases, this would mean thinning, not clear-cutting) and prescribed burns, while allowing more wildfires to burn naturally.



[Figure 8 omitted. Source: National Interagency Fire Center]

Unfortunately, this is easier said than done. Past efforts to reorient the priorities of the Forest Service and BLM have often foundered due to pressure from vested interests – and such pressures are unlikely to disappear. Employees of federal agencies seek to protect their jobs. Homeowners and businesses in forested areas demand “protection” from fires. And environmental activists vigorously oppose a resumption of logging.

The challenge, then, is to identify practical and desirable reforms that are politically feasible. Here are a few ideas:

  1. Remove the requirement that all fires be suppressed unless a fire plan is in place. This would free up the Forest Service and BLM to invest resources according to more locally determined priorities.
  2. Cut Congressional appropriations for fire management. In its 2016 Budget, the Forest Service has allocated itself $2.35 billion for “fire management” and an additional $854 million “suppression cap adjustment.” If these funds were cut to, say, $1 billion, the Forest Service would have fewer incentives to operate from crisis to crisis and instead have greater incentives to make more long-term and rational decisions concerning expenditures on fire management. Freed from the obligation to fight every fire, it might, for example, spend more on thinning, creating fire breaks, and prescribed burns. (A yet more incremental policy would be to require that the Forest Service fund fire suppression activities from local budgets – at least in some areas, as Alison Berry has suggested.)
  3. Make other Forest Service activities self-funding. With federal budgets stretched and unfunded liabilities measured in multiples of annual GDP, this could save taxpayers around $5 billion per year. At the same time, it would incentivize the Forest Service to provide goods and services to people living near its land – for example by selling more timber.
  4. Scrap the “roadless” rule. This would immediately remove an unnecessary – and counterproductive – regulatory restriction on logging on federal land. That would enable the removal of hazardous fuels, thereby reducing the threat to species habitat posed by explosive crown fires. It would also improve the economic prospects of those living in towns in the Northwest that have traditionally been heavily reliant on the wood products industry.
  5. A more radical approach would be to make the Forest Service entirely self-financing – as Randal O’Toole and others have suggested. In addition to selling more timber, the Forest Service might sell off some of its landholdings, thereby converting land that is currently a liability into an asset. The Forest Service might also contract with states, counties, towns, companies and individuals to provide fire management services – ensuring that those services are more closely aligned with local interests. (An incremental move in this direction would be to create what Robert Nelson calls “charter forests”.)
  6. A complementary approach would see federal agencies transfer some lands to the states – as Don Leal, Ken Ivory and others have suggested. Utah has already passed legislation requesting that it be granted control of federal lands in the state. By reducing their landholdings in this way, the federal agencies would be better able to prioritize their resources and serve the public. Meanwhile, by taking over responsibility for managing resources, states would be able to ensure that those resources are put to uses that are consistent with the needs and wishes of the local people, while better protecting the environment.

This is far from an exhaustive list of possible reforms. But it is at least suggestive of what might be done to reduce the potential for catastrophe in the Northwest, while simultaneously improving the prospects of the people living there – and the environment in which they live.

In July, the House passed a bill (H.R. 2647) that would, if enacted, reduce some of the barriers that currently limit the ability of the Forest Service to remove trees that pose a fire risk. It also includes several measures that would increase collaboration between federal agencies and state, county and local communities regarding forest management. These changes might make the Forest Service somewhat more responsive to local needs. However, H.R. 2647 would also fully fund fire suppression activities based on a rolling average of suppression expenditures over the previous ten years. Given that such expenditures have generally been increasing over the past ten years, this would mean continuous increases in fire suppression expenditures, which would in all probability make the problem of catastrophic fires worse.

H.R. 2647 was passed by the House 262-147 on July 9. While the majority of support came from Republicans, 19 Democrats voted in favor, indicating some bipartisan support. However, the White House declared its intention to veto the bill should it come before the President. As the Senate considers its own bill to address the mismanagement of forests, it would do well to consider some of the options outlined above that would at least begin to fix the problem.

Julian Morris is vice president of research at Reason Foundation.

The Social Cost of Carbon Underestimates Human Ingenuity, Overestimates Climate Sensitivity https://reason.org/commentary/climate-change-overestimates-sensit/ Fri, 14 Aug 2015 15:30:00 +0000 http://reason.org/commentary/climate-change-overestimates-sensit/ To great fanfare, the White House recently released its final "Clean Power Plan," which seeks to reduce carbon dioxide emissions from electricity generation. The Obama administration also released a 343-page "regulatory impact assessment," which purports to detail the Clean Power Plan's costs and benefits. But as I show in a new study for Reason Foundation, one of the key assumptions underlying that assessment, the so-called "social cost of carbon," is deeply flawed. As a result, the administration has massively exaggerated the benefits from reducing carbon dioxide emissions. Under more realistic assumptions, the plan would fail a basic cost-benefit test.

The Clean Power Plan is the latest in a series of federal regulations targeting emissions of greenhouse gases. Under federal law, agencies proposing new regulations are required to assess their costs and benefits. Since 2010, federal agencies have counted the benefits of reducing greenhouse gas emissions by multiplying the estimated total reduction of such emissions (measured in tons) by the "social cost of carbon" (measured in dollars per ton).


The group of federal agencies that developed the social cost of carbon (SCC), known as the Interagency Working Group, gave a range of estimates for the SCC that depend on: the year during which the emissions occur (the SCC is assumed to rise over time), the discount rate (the rate at which society is assumed to be willing to forego current consumption in return for future consumption – the higher the discount rate, the lower the SCC) and the degree to which climate change is expected to cause harm (the greater the expected harm, the higher the SCC).

In the original analysis by the Interagency Working Group, published in 2010, estimates of the SCC for the year 2015 ranged from $5.70 per ton of carbon dioxide (for the “median” forecast of climate change impact discounted at 5 percent) to $64.90 per ton (for the “95th percentile” forecast discounted at 3 percent). This analysis was heavily criticized, with some saying the estimates were too high, others that they were too low. In 2013, the Interagency Working Group revised its estimates of the SCC twice. In the May 2013 revision, the “median” forecast assuming a 5 percent discount rate had more than doubled, to $12 per ton (from $5.70 per ton in the 2010 estimate). In a minor revision made in November 2013, this was reduced to $11 per ton. The SCC was again revised slightly in July 2015 (but no change was made to the aforementioned “median” forecast discounted at 5 percent).
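To make the arithmetic concrete, here is a minimal sketch (in Python) of the benefit calculation described above: tons of emissions reduced multiplied by the social cost of carbon. The 100 million-ton reduction is a purely hypothetical figure chosen for illustration; the per-ton values are the Interagency Working Group estimates quoted above.

# Hypothetical illustration of the benefit calculation described above:
# benefits = tons of CO2 reduced x social cost of carbon ($ per ton).

# SCC estimates for 2015 quoted in the text (dollars per ton of CO2).
scc_estimates = {
    "2010 median, 5% discount rate": 5.70,
    "2010 95th percentile, 3% discount rate": 64.90,
    "Nov 2013 median, 5% discount rate": 11.00,
}

tons_reduced = 100_000_000  # hypothetical reduction, in tons of CO2

for label, scc in scc_estimates.items():
    benefit = tons_reduced * scc
    print(f"{label}: ${benefit:,.0f} in estimated benefits")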

While the Interagency Working Group increased its estimates of the SCC between 2010 and 2013, academic economists using the same models generally did not. Moreover, the trend in estimates has been declining over the long term. In 2011, Professor Richard Tol – one of the world’s leading experts on the economics of climate change and the developer of one of the three models used by the Interagency Working Group – produced a meta-analysis of 311 estimates of the social cost of carbon and found that the “mean” estimate fell consistently from 1992 to 2011, which, as he notes, “suggests that estimates of the impact of climate change have become less dramatic over time.” In addition, the proportion of estimates of the SCC that are negative – implying that additional carbon dioxide has net benefits – has increased over time.

If one were to base the social cost of carbon on these models alone, it would be very difficult to justify a rate of more than a few dollars. But the models are really no better than the assumptions built into them. And if those assumptions are wrong, so are the conclusions. As the old saying goes: garbage in, garbage out.

One assumption that is particularly problematic concerns the relationship between increases in carbon dioxide concentrations and increases in temperature – known as climate sensitivity. In its assessment, the Interagency Working Group assumed the same range for climate sensitivity as did the Intergovernmental Panel on Climate Change (IPCC) in its 2007 assessment. But since 2010 several studies have shown that the actual temperature change since 1990 has been about half that of the lowest forecasts made using the IPCC’s models.

A likely explanation for the failure of the climate models is that they assume the climate is more sensitive to increases in carbon dioxide than is actually the case. This is supported by several recent studies, which suggest that climate sensitivity is lower than even the lowest estimate assumed by the Interagency Working Group. The IPCC implicitly acknowledged this in its 2013 update, which reduced the bottom of its range of estimates of future warming, but the Interagency Working Group ignored it in its own updates.

A further significant problem with the Interagency Working Group assessment of the social cost of carbon is that it is based on highly pessimistic assumptions concerning the ability of humanity to adapt to climate change. While one of the three models used by the group assumes that adaptive capacity will increase over time and in response to climate change, the others do not. Yet human adaptive capacity has been increasing at a rapid rate, making us less susceptible to a range of climate-related problems than at any time in the past. This is likely to continue.

Take food production. Pessimists claim that climate change is already reducing food availability. But the trend is in the opposite direction. Over the course of the past half-century, improvements in agricultural technology have dramatically increased food production. For example, cereal yields have nearly tripled, from 0.6 tons per acre in 1961 to 1.7 tons per acre in 2013. As a result, per capita food availability is greater and fewer people die today from malnutrition than 50 years ago in spite of a doubling of the world’s population. New technologies available today – but not yet widely adopted – enable farmers to grow food in hotter, drier, and saltier soils than was previously possible. As new technologies continue to be developed and adopted, it seems almost certain that yields will continue to increase even as the world becomes warmer.

Diseases that are affected by climate have also been on the decline. Until the early 1900s, each year about 1 in every 1,000 people contracted typhoid fever in the U.S. – and about 20 percent of those, most of them children, died. By 1920 the incidence had fallen to 1 in 2,500 and by 1960 it affected fewer than 1 in 100,000 people. This decline was largely a result of improvements in, and more widespread disinfection of, drinking water.

Deaths from heat have also been falling in the U.S. in spite of increases in temperature. In 1987, 50 of every 1,000 deaths were attributable to heat; by 2005 that had fallen to 19 per 1,000 deaths. Much of this reduction in heat-related mortality was a result of more widespread use of air conditioning. But other factors, including better treatment for heart attacks, also contributed.

While extreme weather events such as hurricanes and droughts continue to cause disease and misery, not to mention destruction of property, they have become far less deadly. Across the world, improvements in nutrition, water availability, access to health care and other factors associated with economic development have reduced the mortality rate from extreme weather events by more than 90 percent since 1920.

Over the past half-century, trade, technological progress and associated economic growth have lifted billions out of poverty. Indeed, there are fewer people in the world living in absolute poverty today than at any time since 1820, even though the population has grown seven-fold in that time. Over the coming decades, billions more people in Africa, Asia and Latin America will likely escape from poverty. With this increase in wealth will come better access to clean drinking water and sanitation, better health care, more widespread use of air conditioning and other technologies that have already reduced humanity’s vulnerability to climate change.

It is important to avoid Panglossian thinking. We do not live in the best of all possible worlds. Increased concentrations of carbon dioxide will likely impose costs on many people. However, many others will almost certainly benefit. Higher concentrations of carbon dioxide have already increased the rate of growth of most crops – and this is likely to continue. In a warmer world, growing seasons at higher latitudes will be longer, enabling additional crops to be grown. And warmer winters mean fewer deaths from cold, as well as lower heating bills.

So, to sum up: (1) estimates of climate sensitivity suggest that global warming is likely to be milder than even the lowest forecast made by the Interagency Working Group; (2) humanity’s ability to adapt to climate change will almost certainly be greater than even the most optimistic forecasts of the Interagency Working Group; (3) while there will undoubtedly be losers from global warming, there will also be winners and it is quite possible that additional carbon dioxide emissions will generate net benefits for humanity, at least until the end of this century.

This does not mean that the costs and benefits actually cancel each other out: the winners are unlikely to compensate the losers, though in an ideal world perhaps they should. (In a future paper, I intend to identify ways in which laws and policies might be crafted so as to achieve this more equitable end.) But the social cost of carbon is based on cost-benefit analysis and in that analysis, for better or worse, it does not matter whether the winners compensate the losers.

Attempting to use economic models to investigate such complex propositions is so fraught with difficulty that, on the basis of what we know today, to a first approximation the social cost of carbon is zero. That means every regulation promulgated by the White House under the presumption that the social cost of carbon is greater than zero, including the Clean Power Plan, must be re-evaluated.

Julian Morris is vice president of research at Reason Foundation and author of the study, “Assessing the Social Costs and Benefits of Regulating Carbon Emissions.”

Permission to reprint this column is hereby granted, provided that the author, Julian Morris, and Reason Foundation are properly credited.

Assessing the Social Costs and Benefits of Regulating Carbon Emissions https://reason.org/policy-study/social-cost-of-carbon/ Thu, 06 Aug 2015 13:00:00 +0000 http://reason.org/policy-study/social-cost-of-carbon/ Taking into account a wide range of climate models, impact evaluations, economic forecasts and discount rates, as well as the most recent evidence on climate sensitivity, this study finds that the range of social cost of carbon should be revised downwards. At the low end, carbon emissions may have a net beneficial effect (i.e. carbon should be priced negatively), while even at the high end carbon emissions are very unlikely to be catastrophic.

Atmospheric concentrations of greenhouse gases (GHGs), which include carbon dioxide and methane, have been increasing for more than a century. Rising human emissions of these gases, especially from the combustion of fossil fuels and from agriculture, appear to be the primary cause of this increase in concentrations. The temperature of the atmosphere has also increased over the past century. Some of that increase is likely the result of the increase in concentration of GHGs.

Such an increase in temperatures has various consequences, some of which are likely to be beneficial, others harmful. In the late 1970s, economists began assessing the impact of rising greenhouse gas concentrations – and the consequences of restricting emissions. The framework they adopted for this analysis is called “cost-benefit analysis.” The objective of such analysis is to identify policies whose benefits exceed their costs.

In 1993, President Clinton signed Executive Order 12866 which, among other things, requires agencies of the U.S. government to “assess both the costs and the benefits of the intended regulation and, recognizing that some costs and benefits are difficult to quantify, propose or adopt a regulation only upon a reasoned determination that the benefits of the intended regulation justify its costs.”

Starting in 2008, in compliance with this executive order, some agencies of the U.S. government began to incorporate estimates of the “social cost of carbon” (SCC) into their regulatory impact analyses (RIAs). However, not all agencies were using the same estimates of the social cost of carbon, resulting in regulatory impact analyses that were inconsistent. In response, the Office of Management and Budget and the Council of Economic Advisors convened an interagency working group to establish a consistent SCC for use in RIAs relating to regulations that restrict greenhouse gas emissions.

In February 2010, the Interagency Working Group (IWG) published “Technical Support Document: Social Cost of Carbon for Regulatory Impact Analysis Under Executive Order 12866.” In that document, a range of estimates was given for the SCC. The SCC was calculated at five-year increments from 2010 to 2050 and it is expected to rise over time. As with all U.S. government estimates of costs and benefits, future costs and benefits are discounted (that is to say, future amounts are reduced by a certain percentage per annum to give their current dollar value).

However, unusually, the IWG did not discount at the rate recommended by the Office of Management and Budget (7% per year), instead choosing to use a range of lower and variable discount rates (these averaged 2.5%, 3% and 5%). In addition, while most of the estimates provided are for the average (in this case, median) forecast of future costs and benefits, the IWG also gave an estimate of the “95th percentile” – that is, the estimate that is above 95% of all forecasts, or in other words the estimate that is expected to occur with only 5% probability. The IWG has revised its estimates three times since 2010. In the first revision (May 2013), the range of costs shifted upwards dramatically. In the second revision (November 2013), the costs were revised downwards slightly compared to the May 2013 revision. In the third revision (June 2015), the costs were again revised down slightly.
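As a rough illustration of how the choice of discount rate drives the SCC, the following sketch (using hypothetical numbers, not the IWG’s models) discounts a single $100 climate damage occurring 35 years in the future at the IWG’s rates and at the OMB-recommended 7 percent.

# Present-value discounting as described above: a cost or benefit occurring
# t years in the future is divided by (1 + r)^t, where r is the discount rate.

future_damage = 100.0   # hypothetical: $100 of climate damage in a future year
years_ahead = 35        # e.g., 2050 valued from 2015

for rate in (0.025, 0.03, 0.05, 0.07):  # IWG rates plus OMB's recommended 7%
    present_value = future_damage / (1 + rate) ** years_ahead
    print(f"discount rate {rate:.1%}: present value ${present_value:.2f}")

The higher the discount rate, the smaller the present value of distant damages – which is why the SCC falls as the discount rate rises.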

Study Outline

This study seeks to assess the Interagency Working Group’s estimates of the social cost of carbon (SCC). It begins with a discussion of the framework that underpins the SCC, i.e. cost-benefit analysis.

Part Two provides a brief history of economists’ attempts to estimate the social costs and benefits of carbon.

Part Three reviews some of the estimates of the social cost of carbon that have been derived using integrated assessment models (that is, the types of models used by the IWG).

Part Four describes the methodology adopted by the IWG for calculating the social cost of carbon and assesses some of the criticisms of that assessment.

Part Five focuses on two key factors affecting the “damage function”: the sensitivity of the climate to increases in greenhouse gases and the ability of society to adapt to climate change.

The final section draws conclusions and recommends that policies and regulations predicated on the assumption that the SCC is different from zero should be adjusted to reflect an SCC of zero.

Study: Social Cost of Carbon Likely Much Lower Than Previous Estimates https://reason.org/news-release/social-cost-of-carbon-study/ Thu, 06 Aug 2015 13:00:00 +0000 http://reason.org/news-release/social-cost-of-carbon-study/ A new study from Reason Foundation shows that estimates of the "social cost of carbon" have been falling - and would fall further if new scientific evidence were incorporated. The study calls into question the analyses used to underpin the Obama administration's new Clean Power Plan and other federal regulations targeting emissions of greenhouse gases.

Los Angeles, CA – A new study from Reason Foundation shows that estimates of the “social cost of carbon” have been falling – and would fall further if new scientific evidence were incorporated. The study calls into question the analyses used to underpin the Obama administration’s new Clean Power Plan and other federal regulations targeting emissions of greenhouse gases.

The study by Julian Morris, vice president of research at Reason Foundation, finds the administration’s estimates of the social cost of carbon are “biased upwards” due to their reliance on three “simplistic models, all of which use estimates of climate sensitivity that are likely too high and two of which likely overestimate the economic impact of climate change.”

The study shows that by combining more reliable estimates of the sensitivity of the climate to changes in concentrations of greenhouse gases with more realistic assumptions about both the benefits and costs of carbon dioxide emissions, the social cost of carbon falls dramatically.

“The current best estimate for the social cost of carbon is zero,” said Morris, who has been researching the economics of climate change for over 20 years. “Using that number, most federal regulations limiting carbon emissions, including the Clean Power Plan, could no longer be justified.”

Morris notes that, “The government’s estimates are based on pessimistic assumptions that don’t accurately reflect humanity’s ability to innovate and adapt, and they ignore recent evidence showing that the climate is less sensitive to rising concentrations of carbon dioxide than was previously assumed.”

Full Study Online

The full study, Assessing the Social Costs and Benefits of Regulating Carbon Emissions, is available at https://reason.org/policy-study/social-cost-of-carbon/.

About Reason Foundation

Reason Foundation is a nonprofit think tank dedicated to advancing free minds and free markets. Reason Foundation produces respected public policy research on a variety of issues and publishes the critically acclaimed Reason magazine and its website www.reason.com. For more information please visit www.reason.org.

Contact

Julian Morris, Vice President of Research, Reason Foundation, (212) 495-9599

Rebutting the Obama Administration’s Clean Power Plan’s Claims https://reason.org/commentary/obama-clean-power-plan-climate-chan/ Mon, 03 Aug 2015 19:00:00 +0000 http://reason.org/commentary/obama-clean-power-plan-climate-chan/ The Obama administration released its new clean power plan today. The White House Fact Sheet claims: "The Clean Power Plan is a Landmark Action to Protect Public Health, Reduce Energy Bills for Households and Businesses, Create American Jobs, and Bring Clean Power to Communities across the Country."

It would be more accurate to say that, "The Clean Power Plan is a centralized plan for electricity generation in the United States that is likely to harm public health, increase energy bills for households and businesses, destroy American jobs, and cause blackouts in communities across the country."

Here are some thoughts on the plan's main claims:


1. The White House asserts, “The Clean Power Plan, and other policies put in place to drive a cleaner energy sector, will reduce premature deaths from power plant emissions by nearly 90 percent in 2030 compared to 2005 and decrease the pollutants that contribute to the soot and smog and can lead to more asthma attacks in kids by more than 70 percent.”

Indeed, in the EPA’s regulatory impact assessment of the original Clean Power Plan rules, these public health benefits accounted for the majority of all estimated benefits.

But these benefits do not come from reducing greenhouse gas emissions: they come from reducing other emissions, such as particulates. Yet these “public health benefits” would be achieved far more cost effectively by targeting harmful emissions directly. The Environmental Protection Agency (EPA) is already targeting such emissions. To claim such benefits for the Clean Power Plan is either double counting or misdirection, or both.

2. The White House asserts that the Clean Power Plan will, “Create tens of thousands of jobs.”

This may be true, but it will also likely destroy hundreds of thousands of jobs.

Given the enormous capital cost of implementing the plan, it seems highly likely that the net effect on jobs will be negative.

3. The EPA claims that these jobs will be created, “while ensuring grid reliability.”

This is surely a joke. The Clean Power Plan envisages a significant increase in the proportion of electricity supplied by “renewable” power, much of which will come from intermittent sources such as solar and wind – sources that, as the North American Electric Reliability Corporation notes, tend to reduce grid reliability. To compensate, grid operators will be forced to spend billions of dollars on new technologies that enable them to balance power generation with demand.

4. The White House asserts that the final rule will, “Drive more aggressive investment in clean energy technologies than the proposed rule, resulting in 30 percent more renewable energy generation in 2030 and continuing to lower the costs of renewable energy.”

The rule does indeed require more electricity generation from “renewable” sources – from about 13 percent today to 28 percent by 2030. That will likely drive innovation, lowering costs of such technologies over time. But this will come at the cost of investments in other forms of innovation that would likely have greater benefits to society. The history of attempts to plan the direction of innovation is replete with white elephants. There is no good reason to think that this time is any different.

While the effectiveness of different sources of renewable generation varies according to climate and geography, we estimate that the maximum that could be produced using the current generation of intermittent technologies (that is, technologies such as wind and solar that are not capable of producing continuous power) is about 10 percent. Currently, solar and wind account for about 5 percent of electricity generation. Taking this to 20 percent or above will require heroic feats of engineering that may simply be impossible – and certainly will be enormously expensive. It is highly likely, therefore, that the Clean Power Plan will result in a substantial increase in the number of brown- and blackouts.

5. The White House claims that the plan will, “Save the average American family nearly $85 on their annual energy bill in 2030, reducing enough energy to power 30 million homes, and save consumers a total of $155 billion from 2020-2030.”

This assertion is based on the assumption that consumers will install energy-saving devices because of the rule. In reality, the rule will almost certainly increase the cost of energy considerably over the next 15 years, and beyond, as a result of the massive increase in investment in costly “renewable” generation that is required by the rule. While this will likely encourage consumers to be more frugal in their use of energy, it is entirely disingenuous to claim that this will “save” them any money: consumers will almost certainly spend more in total on energy and energy-saving devices than without the rule.

6. The White House claims that the rule will, “Give a head start to wind and solar deployment and prioritize the deployment of energy efficiency improvements in low-income communities that need it most early in the program through a Clean Energy Incentive Program.”

While it is no doubt true that people living in low-income communities would benefit from energy efficiency improvements, it is far from clear that a federal rule is the best way to stimulate such investments. Meanwhile, increasing the proportion of energy generated by wind and solar power is likely to be highly regressive – harming the poor more than others.

Poorer consumers spend a higher proportion of their income on energy, so any policy that increases the cost of energy will disproportionately affect them: they will spend more on energy and have less available for other items. In other words, it will make poor people poorer. Since poverty is associated with poorer health, the Clean Power Plan will almost certainly adversely affect the health of those who are already poor.

7. Finally, the White House claims that the rule will, “Continue American leadership on climate change by keeping us on track to meet the economywide emissions targets we have set, including the goal of reducing emissions to 17 percent below 2005 levels by 2020 and to 26-28 percent below 2005 levels by 2025.”

This is perhaps the most bizarre claim of all. First, it is by no means clear that the targets set for reducing emissions would pass an impartial cost-benefit test. The Obama administration’s current estimate of the “social cost of carbon,” which is used by the EPA to justify these cuts, is almost certainly higher than is justified by an impartial assessment of the costs and benefits of emissions of carbon dioxide, as we show in a paper to be released later this week.

Second, carbon dioxide emissions have been falling in the U.S. largely as a result of an autonomous switch from burning coal to burning natural gas, yet instead of assuming that this trend will continue, the revised Clean Power Plan rule requires states to focus on increasing the amount of power that will be generated by “renewable” and nuclear power. This is simply illogical.

Julian Morris is vice president of research at Reason Foundation.

The Limits of Wind Power https://reason.org/policy-study/the-limits-of-wind-power/ Thu, 04 Oct 2012 04:00:00 +0000 http://reason.org/policy-study/the-limits-of-wind-power/ Very high wind penetrations are not achievable in practice due to the increased need for power storage, the decrease in grid reliability, and the increased operating costs. Given these constraints, this study concludes that a more practical upper limit for wind penetration is 10%. At 10% wind penetration, the CO2 emissions reduction due to wind is approximately 45g CO2 equivalent/kWh, or about 9% of total.

Executive Summary

Environmentalists advocate wind power as one of the main alternatives to fossil fuels, claiming that it is both cost effective and low in carbon emissions. This study seeks to evaluate these claims.

Existing estimates of the life-cycle emissions from wind turbines range from 5 to 100 grams of CO2 equivalent per kilowatt hour of electricity produced. This very wide range is explained by differences in what was included in each analysis, and the proportion of electricity generated by wind. The low CO2 emissions estimates are only possible at low levels of installed wind capacity, and even then they typically ignore the large proportion of associated emissions that come from the need for backup power sources (“spinning reserves”).

Wind blows at speeds that vary considerably, leading to wide variations in power output at different times and in different locations. To address this variability, power supply companies must install backup capacity, which kicks in when demand exceeds supply from the wind turbines; failure to do so will adversely affect grid reliability. The need for this backup capacity significantly increases the cost of producing power from wind. Since backup power in most cases comes from fossil fuel generators, this effectively limits the carbon-reducing potential of new wind capacity.

The extent to which CO2 emissions can be reduced by using wind power ultimately depends on the specific characteristics of an existing power grid and the amount of additional wind-induced variability risk the grid operator will tolerate. A conservative grid operator can achieve CO2 emissions reduction via increased wind power of approximately 18g of CO2 equivalent/kWh, or about 3.6% of total emissions from electricity generation.

The analysis reported in this study indicates that 20% would be the extreme upper limit for wind penetration. At this level the CO2 emissions reduction is 90g of CO2 equivalent/kWh, or about 18% of total emissions from electricity generation. Using wind to reduce CO2 to this level costs $150 per metric ton (i.e. 1,000 kg, or 2,200 lbs) of CO2 reduced.

Very high wind penetrations are not achievable in practice due to the increased need for power storage, the decrease in grid reliability, and the increased operating costs. Given these constraints, this study concludes that a more practical upper limit for wind penetration is 10%. At 10% wind penetration, the CO2 emissions reduction due to wind is approximately 45g CO2 equivalent/kWh, or about 9% of total.
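The percentages above imply an average emissions intensity for total electricity generation of roughly 500 grams of CO2 equivalent per kWh (since 45 g is described as about 9% of the total and 90 g as about 18%). The short sketch below merely reproduces that back-of-envelope arithmetic; it is an inference from the study’s headline figures, not part of its model.

# Back-of-envelope reproduction of the percentages quoted above.
# The ~500 g CO2e/kWh grid-average intensity is inferred from the study's own
# figures (45 g is ~9% of total; 90 g is ~18%), not taken from its model.

grid_intensity = 500.0  # g CO2e per kWh, inferred average for total generation

scenarios = {
    "conservative grid operator": 18.0,
    "10% wind penetration": 45.0,
    "20% wind penetration (upper limit)": 90.0,
}

for label, savings in scenarios.items():
    share = savings / grid_intensity
    print(f"{label}: {savings:g} g CO2e/kWh saved = {share:.1%} of total emissions")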

The Environmental Protection Agency and Boiler MACT Regulation https://reason.org/policy-brief/epa-boiler-mact-standards/ Tue, 10 Apr 2012 04:00:00 +0000 http://reason.org/policy-brief/epa-boiler-mact-standards/ Boiler MACT is an example of a regulation that could be amended in simple, appropriate ways to adhere to the spirit of President Obama's Executive Order 13563. Instead of moving forward with the current proposed rule, EPA should address the issues raised in this brief, including by:

  • Basing MACT floor policy decisions on the performance of actual existing boilers, not the performance of a hypothetical boiler that combines emission limits for individual pollutants that are currently only achieved in isolation.
  • Setting health-based standards per Section 112(d)(4) of the CAA for acid gases that are prevalent and have historically been regulated according to such standards.
  • Only reclassifying fuels as "solid waste" (with all the associated additional burdens) if the EPA is able to prove that such a reclassification will result in substantial health benefits. Currently EPA is moving in the opposite direction, placing the burden of proof on industry to petition to remove substances that have historically been used as fuel.

“Boiler MACT” is the name given to national emission standards being promulgated by the Environmental Protection Agency in an effort to curb emissions of hazardous air pollutants (HAP) from industrial boilers and process heaters.

The regulation imposes standards and emission limits for more than 200,000 boilers used in manufacturing, processing, mining, refining and other industries, as well as commercial boilers used in malls, apartments, restaurants and hotels. It does not apply to major commercial electricity generators, which are subject to different rules. Boilers burn fuels, such as natural gas, coal, biomass and oil, to produce heat, which is then either used directly in industrial processes or used to produce steam, which drives turbines to produce electricity.

Under Boiler MACT, facilities are divided into two categories: “major sources” and “area sources.” Major sources are facilities that emit 10 tons per year (tpy) or more of any single Hazardous Air Pollutant (HAP) or 25 tpy or more of any combination of HAPs; area sources are facilities that emit less than this. According to EPA, there are approximately 13,840 major source and 187,000 area source boilers in the U.S.

A Cost Effective Way to Cut Smog https://reason.org/commentary/a-cost-effective-way-to-cut-smog/ Wed, 14 Mar 2012 14:01:00 +0000 http://reason.org/commentary/a-cost-effective-way-to-cut-smog/ Advocates for the EPA plan claim that ever lower ozone standards would have additional health benefits. If the impact of ozone were linear that would be true; but it is not: once ozone levels reach a certain level, the additional health benefits of further reductions are likely small to non-existent.

The post A Cost Effective Way to Cut Smog appeared first on Reason Foundation.

]]>
The smog that so often hangs over Los Angeles has long been viewed as a byproduct of Americans’ enjoyment of gas-guzzling cars. But while domestic causes of this hazardous haze have been declining, as much as 20 percent of California’s smog may now be coming from Asia. Rather than imposing enormously expensive new restrictions on American emissions of ozone, as the Environmental Protection Agency (EPA) has proposed, perhaps it is time to look across the Pacific for more cost-effective ways to clear the air.

In the mid-20th century, smogs in Los Angeles and many other urban centers were truly horrific and deadly. But they have declined dramatically. Since 1980, average levels of ozone, the main constituent of smog, have declined by 25 percent nationally according to the EPA. And in Los Angeles the average has fallen by more than 60 percent. A significant part of this decline can be attributed to regulations limiting emissions of ozone-forming chemicals by vehicles and industry. Indeed, concentrations of ozone have now fallen so far that in many places and times they are close to background levels – that is to say the levels that would occur without any industry or vehicles.

Moreover, there is now evidence that a small but significant proportion of ozone on the West Coast of the U.S. comes from Asia. A recent study published in the Journal of Geophysical Research showed that in spring, low pressure systems in Asia can cause ozone to rise two to six miles into the atmosphere, where it is blown across the Pacific by strong winds. It is then pulled back down to ground-level by high-pressure systems in the Northeastern Pacific. The study found that Asian emissions contributed 8 to 15 parts per billion (ppb) of ozone in California during the spring of 2010 – accounting for about half of the days the state was in non-compliance with EPA’s threshold of 75 ppb.

The EPA had planned to ratchet down the limit to between 60 and 70 ppb. Given the proportion of ozone flowing over from Asia, that would mean an effective limit of 45-50 ppb for domestic emissions in some places – that’s below the springtime background level in some locations. In other words, some counties could have found themselves in breach of EPA rules even if they had no industry and no cars.

Advocates for the EPA plan claim that ever lower ozone standards would have additional health benefits. If the impact of ozone were linear, that would be true; but it is not: once ozone falls below a certain level, the additional health benefits of further reductions are likely small to non-existent. Moreover, there are costs to further cutting emissions of ozone-forming chemicals. The EPA estimated the costs of its proposed rule at $90 billion a year; industry put the bill at $1 trillion. Economists call these “opportunity costs” because of the opportunities that are foregone by spending money to cut ozone instead of spending it on other things. How many lives could be saved if $90 billion were spent on other health improvement measures?

To put this into context: if the lower ozone threshold had been in operation in 2010, the number of counties in non-attainment would have increased from 242 to around 600. Companies wishing to invest in non-attainment areas would face costly and time-consuming permitting processes, would have to install expensive equipment, and could be required to reduce emissions from nearby sources to offset emissions at new facilities. Counties could be denied highway and infrastructure funds if they fail to meet the ozone level. Many businesses would simply avoid these areas due to the potential costs and headaches.

Last fall, under intense pressure from American businesses, President Obama scrapped the EPA’s proposed rule. But the EPA is currently conducting its regular five-year review of ozone, which could impose a lower level as early as 2013. The regulation would have a direct impact on industries as diverse as manufacturing, construction, trucking, farming and electric utilities, among other important sectors. Additionally, the EPA and California regulators are engaged in a continuous effort to create auto and fuel regulations curbing ozone-forming pollutants, which also have a direct impact on the cost of gasoline – something that affects nearly every American family and business.

While additional regulations would no doubt cut emissions of ozone-forming chemicals further, they would also drive up the cost of doing business in America. Since abundant, affordable energy helps create the wealth that allows Americans to afford an ever-cleaner environment, the cost of such regulations is far from an ephemeral concern. A sense of proportion is required. Moreover, to the extent that the ozone problem is now being imported from Asia, instead of focusing exclusively on domestic emissions, it might make sense to look at ways of encouraging China, Thailand, Vietnam and other culprit countries to reduce their emissions.

Julian Morris is Vice President of Research and Adam Peshek is a research associate at the Reason Foundation.

The Facts Behind the EPA https://reason.org/commentary/the-facts-behind-the-epas-latest-pr/ Thu, 16 Feb 2012 15:50:00 +0000 http://reason.org/commentary/the-facts-behind-the-epas-latest-pr/ On Thursday, the Environmental Protection Agency put a regulation on the books that will cost $10 billion in just one year, but the regulation's details show it may not even come close to accomplishing its goal of improving public health.

On Thursday, the Environmental Protection Agency (EPA) put a regulation on the books that will cost $10 billion a year and will do almost nothing to accomplish its aim of improving public health. It is merely another example of the EPA’s politicization of science in a continued effort to bypass Congress to regulate greenhouse gases and eliminate coal.

The “Mercury and Air Toxics Standards” (MATS) rule is the first of its kind, requiring the installation of expensive equipment at over 700 power plants. It has been estimated that this rule and other related regulations will shut down nearly ten percent of coal-fired electricity generation in the country. This could be considered worth the cost if the regulation produced a significant impact on public health.

Clean air is an important aspect of a healthy society and its economy. And mercury, at high levels of exposure, is a dangerous neurotoxin. But despite its name, the regulation does almost nothing to reduce mercury or toxic emissions. Less than one-tenth of one percent of the regulation’s benefits come from reductions in mercury. But this fact has not stopped the EPA and its supporters from exaggerating the regulation’s benefits.

EPA Administrator Lisa Jackson said of the rule, “By cutting emissions that are linked to developmental disorders and respiratory illnesses like asthma, these standards represent a major victory for clean air and public health – and especially for the health of our children.” The Center for American Progress claims that “slashing mercury and other contaminants will save 11,000 lives annually” and “provide economic benefits” that the EPA estimates to be between $33 and $90 billion. With these kinds of figures, surely mercury exposure is an epidemic destroying our economy and families.

Hardly. For a regulation that will cost $10 billion a year, the EPA estimates only $500,000 to $6 million in benefits from the reduction of mercury.
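To give a rough sense of scale, the following sketch compares the rule’s annual cost with the EPA’s own estimate of its annual mercury-related benefits, using only the figures cited above.

# Rough arithmetic using the figures cited above: annual cost of the MATS rule
# versus the EPA's own estimate of annual benefits from mercury reduction alone.

annual_cost = 10_000_000_000            # $10 billion per year
mercury_benefits = (500_000, 6_000_000) # EPA's low and high estimates

for benefit in mercury_benefits:
    ratio = annual_cost / benefit
    print(f"cost is roughly {ratio:,.0f} times the ${benefit:,} mercury benefit")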

So how does one get from $6 million to $90 billion in benefits? The answer lies in what is essentially a bait-and-switch accounting trick at the hands of the EPA. Almost all of the regulation’s “benefits” come from the reduction of soot. Soot, or “particulate matter,” comes both from natural sources, such as forest fires and volcanic ash, and from man-made sources, including cars, power plants and even dust from unpaved roads. Even if the reduction of soot were an appropriate use of industry and taxpayer resources, there are three main problems here.

First, the EPA already regulates soot through other national standards. Second, since MATS targets mercury and toxic emissions, the EPA can only count health benefits from reducing soot below the level it already considers safe under the national soot standards – otherwise it would be double counting.

Third, if the EPA can identify up to $90 billion in health benefits from one set of regulations, one could logically conclude that the EPA should spend more time focusing on the reduction of soot from all industries across the country. But the EPA isn’t doing that.

Under the Clean Air Act, the Agency is required to review current science every five years and set standards accordingly. The EPA missed this deadline in October and announced last month that it would delay setting new standards until June 2013. The Agency stated that reviewing all of the relevant science associated with exposure to soot is “a massive undertaking.”

The EPA’s estimates for MATS are based on two cherry-picked studies whose findings align with the Agency’s objectives. Harvard toxicologist Dr. Julie Goodman recently told a House committee that “the fact that EPA only considered studies that suggested an association [between soot reductions and health] means that it conducted a biased assessment of the available data.” She noted that “dozens of other studies are available and many report no such correlations.”

MATS claims to target one pollutant but draws all of its benefits from another pollutant that is already below EPA-approved safe levels. The air is cleaner than it’s ever been, but at $10 billion a year, MATS will be the most expensive EPA air regulation ever. Last week, nine power plant closures in four states were announced as a direct result of the regulation, and more are looming. Affordable energy is key to a recovering economy, and when the costs and benefits are weighed, it’s clear that this regulation’s costs are enormous and the benefits to society are minimal at best.

Adam Peshek is a research associate at the Reason Foundation.

Impacts of Transportation Policies on Greenhouse Gas Emissions in U.S. Regions https://reason.org/policy-study/greenhouse-gas-policies-cost-transp/ Wed, 30 Nov 2011 05:00:00 +0000 http://reason.org/policy-study/greenhouse-gas-policies-cost-transp/ This report compares the cost and effectiveness of improved fuel economy, transportation system improvements and shifts in travel behavior on the reduction of man-made CO2 emissions in urban areas. We study in detail 48 major U.S. regions containing 41% of the U.S. population, 60% of transit use and 90% of congestion delay. This report quantifies how much CO2 cars, light trucks and commercial trucks currently emit (base year 2005) in each region, how much CO2 would have increased with prior CAFE standards, how much the new CAFE standards will reduce, and how much CO2 might be reduced by other commonly suggested policies. These policies include the new fuel economy standards, additional smaller-car sales, signal timing and speed controls, capacity increases, high-occupancy or priced lanes, travel reduction policies, transit use increases, carpooling, telecommuting and walking to work. We then assess the cost versus effectiveness of each policy for each region and recommend detailed regional strategies.

Interest in man-made CO2 has sharply increased in the last several decades. A small portion of global CO2 emissions is a byproduct of fossil fuel combustion from human activity. Most such combustion occurs in the production of energy, and about a third of this involves transportation. CO2 reduction policy options in the transportation sector primarily focus either on reducing the underlying activity (i.e., travel) or on reducing the amount of CO2 in vehicle exhaust by mandating increased vehicle fuel efficiency.

New Corporate Average Fuel Economy (CAFE) standards setting an overall new-car/truck efficiency of 35 MPG by 2020 have recently been put in place. The Supreme Court has ruled that even though CO2 is not a “listed pollutant,” the EPA must provide standards for its management. The U.S. transportation community is also increasing attention to the issue. The Transportation Research Board, a national transportation research organization, made “climate change” its theme for its 2009 meeting. At the state and local level, according to a recent survey, 36 states and several hundred local governments have “signed on to aggressive plans to cut back greenhouse gas emissions from electric energy generation, industry, and transportation.” California has recently passed legislation calling for a reduction in greenhouse gas emissions to 1990 levels by 2020.

This interest in emission-reduction goals has materialized without much region-specific research or economic assessment. A few regions have conducted substantial analyses. However, baseline estimates of CO2 in specific urban regions, and the impact of plans to reduce it, have not yet been developed. Many factors affect transportation’s contribution to greenhouse gas levels in various urban regions, and because each urban area is unique in its transportation needs and behavior and the costs and effectiveness of various emission-reduction policies, a one-size-fits-all plan is not appropriate. In order to balance achievable impact with affordable costs, each region must tailor its policies. Region-specific data would be very helpful to local governments and transportation communities in preparing sensible plans to reduce emissions most cost-efficiently.

This study compares the cost-effectiveness of attempting to reduce CO2 emissions through specific transportation systems and behavior policies with the likely impact of new federal CAFE standards alone in 48 major U.S. urbanized areas. The major findings of this study are:

1. U.S. man-made carbon dioxide emissions constitute about 21% of global CO2 emissions.

2. U.S. surface transportation emissions constitute about 6% of global CO2 emissions.

3. New CAFE standards will result in about a 31% reduction in U.S. surface transportation-related CO2 emissions by 2030 (1.9% of world CO2 emissions), compared with prior standards, at a cost of about $52/ton reduced.

4. Regions vary widely in the circumstances that affect their ability to reduce CO2 emissions. If a policy of reducing carbon dioxide emissions is seen as desirable, it is clear that such emissions reductions should not be imposed uniformly on all regions or sectors. Instead, it would make sense to encourage emissions reduction in the most efficient way possible.

5. If consumers shift sharply to smaller vehicles, an additional reduction of about 2.7% of U.S. transportation emissions (0.16% of global CO2 emissions) is achievable, even with conventional fuels. Although some shift has been noticed recently, it is not guaranteed to continue and may lag if gasoline prices remain relatively low.

6. Improved signalization for arterials could yield as much as 2.3% additional savings (0.14% of global CO2 emissions). The nominal cost of this would be $112 per ton of CO2 removed. However, there are substantial other benefits to improved signalization, making the reduction in CO2 essentially an ancillary benefit.

7. Fifty-five-mph speed limits (caps on high freeway speeds) could reduce as much as 3.0% of CO2 emissions (0.18% of global CO2 emissions), at a very low cost of $0.13 per ton reduced. However, speed caps have very large societal costs in extra travel time.

8. Major road capacity improvements could achieve as much as a 4.1% reduction in CO2 emissions (0.25% of global CO2 emissions), but at costs averaging $3,995 per ton reduced. However, there are other reasons for improving capacity (travel time savings, reduced accidents, lower operating costs, greater choices of jobs/housing/retail and economic benefits). The most cost-effective sites are likely to be major bottlenecks and turn lane capacity actions on the minor arterials and collector systems.

9. Working at home is already reducing CO2 emissions considerably; a 50% increase in work-at-home shares could cut an additional 0.5% of CO2 emissions (0.03% of global CO2 emissions) at about $3,496 per ton reduced.

10. A doubling of HOV lane and stand-alone HOT lane mileage could reduce about 0.64% of CO2 emissions (0.04% of global emissions), but at quite a high price of $2,462 per ton reduced, assuming that new lanes would be needed. This application is probably limited to larger regions.

11. A 25% increase in carpooling-to-work shares could reduce about 0.75% of CO2 (0.05% of global CO2 emissions) but also at a relatively high cost of $2,776 per ton reduced, assuming that the increase is in the form of vanpool services. However, carpooling is declining nationally, and so this policy’s applicability is probably limited to regions that have a strong history of ridesharing.

12. An across-the-board 5% reduction in personal travel could reduce about 4.0% of CO2 (0.24% of global CO2 emissions), but the gasoline price needed to achieve this reduction is in the range of $5 per gallon, equivalent to about $3,923 per ton of CO2 reduced.

13. Transit service improvements necessary to achieve a 50% increase in transit work shares could reduce about 1.1% of CO2 emissions (0.07% of global CO2 emissions), but also at a high price of $4,257 per ton reduced.

14. A 50% increase in walk-to-work shares would yield about a 0.35% reduction in CO2 (0.02% of global CO2 emissions), but its implementation is dependent on changing land use patterns, with costs believed to be very high.

In short, policies aimed at reducing transportation-related CO2 emissions by improving overall fleet fuel efficiency are likely to have the greatest and most cost-effective impact. Technological improvements to vehicles, along with traffic signal timing and speed harmonization, hold out the most hope for significant reductions in future CO2 emissions.

Next in line are policies aimed at improving the efficiency of the transportation system, particularly signal timing and coordination, and speed harmonization. Next in cost-effectiveness are policies aimed at changing commuting behavior, particularly work-at-home policies. Likely to be less effective, both absolutely and relatively, are major capacity increases, more HOV or stand-alone HOT lanes, transit shift policies and carpooling, although in some areas they can provide modest savings. However, none of the reviewed policies alone, including the new CAFE standards, is likely to reduce global CO2 emissions by more than about 2%, and most would have less than a 0.2% impact. Even if implemented across many U.S. regions in a concerted fashion, the policies reviewed here would not likely have a significant effect on global CO2 emissions, and at the regional level they may prove very difficult to implement and may not reduce CO2 emissions significantly.
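To make the scale of these figures concrete, the short sketch below recomputes each policy's approximate share of global CO2 from the numbers reported in the findings above, using the study's estimate that U.S. surface transportation accounts for about 6% of global emissions. The script is our illustration of the arithmetic, not part of the study; small rounding differences from the reported figures are expected.

    # Minimal sketch: convert each policy's reduction in U.S. surface-transportation CO2
    # into its approximate share of global CO2, using the figures reported above.
    US_TRANSPORT_SHARE_OF_GLOBAL = 0.06  # finding 2: ~6% of global CO2 emissions

    # (reduction in U.S. surface-transportation CO2, cost in $ per ton reduced)
    policies = {
        "New CAFE standards":            (0.310, 52),
        "Shift to smaller vehicles":     (0.027, None),   # cost not reported above
        "Arterial signalization":        (0.023, 112),
        "55-mph speed limits":           (0.030, 0.13),
        "Major capacity improvements":   (0.041, 3995),
        "50% more work-at-home":         (0.005, 3496),
        "Doubling HOV/HOT mileage":      (0.0064, 2462),
        "25% more carpooling":           (0.0075, 2776),
        "5% less personal travel":       (0.040, 3923),
        "50% higher transit work share": (0.011, 4257),
        "50% more walk-to-work":         (0.0035, None),  # land-use dependent
    }

    for name, (us_cut, cost) in sorted(policies.items(), key=lambda kv: kv[1][0], reverse=True):
        global_cut = us_cut * US_TRANSPORT_SHARE_OF_GLOBAL
        cost_str = f"${cost:,.2f}/ton" if cost is not None else "cost not estimated here"
        print(f"{name:32s} {us_cut:7.2%} of U.S. transport CO2 = {global_cut:6.3%} of global ({cost_str})")

Running it reproduces, for example, the 1.9% global figure for the new CAFE standards (0.31 x 6%) and the 0.16% figure for a shift to smaller vehicles, which is the sense in which even the largest single policy has a small global footprint.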

Given the high relative cost of many policies, therefore, policy makers should not rush to implement them.

National Strategy

To the extent that CO2 reduction in the transportation sector is a goal, the new CAFE standards largely achieve it. Encouraging the purchase of more fuel-efficient vehicles is also effective. But neither policy is without significant consequences beyond CO2 reduction, some of them negative.

Most of the strategies being widely discussed in transportation plans (transit increases, VMT reductions, carpooling, pricing, making cities denser and more walk-friendly) would have little measurable effect on regional, national or certainly global CO2 emissions, even if implemented widely at very high cost. Several other policies, such as improving CAFE standards even further than presently mandated and encouraging small-car purchases, are beginning to be discussed. Still others, such as signalization improvements and telecommuting, are being largely ignored even though they have proven effective.

National policy should not encourage regions to implement strategies that are not cost-effective. There is currently significant uncertainty about the potential and cost-effectiveness of various alternative fuels, particularly whether any fuel will expand beyond its current applications and become national in scope. We believe that prudent policy regarding alternative fuels is to focus now on preparing for the mid-term, 15 to 30 years in the future, by concentrating on the reductions achievable with conventional fuels and letting technologies evolve further. It is simply too uncertain to project the impact of individual fuel types at this time. The nation should agree on and put in place mechanisms for measuring (not just estimating) CO2 emissions from the transportation sector, perhaps by region and/or mode. These will be necessary to determine progress over time and to set baselines when and if further actions become necessary. The nation should resist the temptation to mandate CO2 reduction plans as part of long-range transportation planning.

We should not mandate CO2 reduction targets for regions and states that are based on behavioral shifts. Such mandates are likely to be unnecessary, to waste staff effort and to create unrealistic expectations. Instead, national policy should encourage the study, analysis and quantification of CO2 emissions in regions, but not mandate actions to deal with them, because fleet turnover is likely to significantly mitigate, and possibly even reverse, the growth of CO2 emissions in most regions.

What Should Regions Do?

Even though the new CAFE standards and possibly more small-car sales will reduce the rate of increase in transportation-related CO2 emissions, this strategy will not be enough to achieve significant reductions in CO2 emissions for some fast-growing regions. Most regions would benefit significantly from major attention to signalization improvements and limited applications of speed harmonization. These policies have significant benefits in terms of saving time for drivers and in the delivery of goods, and while they are not necessarily very cost-effective ways of reducing CO2 emissions, they are less expensive than some other proposed policies. Speed limits are not recommended because of their enforcement problems and large societal costs. Other policies such as telecommuting, HOV or HOT lanes, carpooling, capacity improvements, VMT reductions and transit service improvements are likely to be even less cost-effective CO2 emission reducers, although of course there are other reasons for doing some of them. Some slow-growing regions may be able to achieve significant reductions in CO2 emissions with modest actions in addition to vehicle technology improvements. However, even large "baskets" of policies are not likely to reduce transportation-related CO2 emissions more than about 15 to 20% below our chosen baseline of 2005 levels in most regions, and their effect on global CO2 emissions is likely not observable.

In most regions policy-driven reductions of 20 to 50% are unlikely to be achieved, and therefore long-range plans should be realistic and not overly optimistic. In particular, plans should eschew promoting actions related to living patterns or the like, which are not likely to be approved, or to be effective even if adopted. Given the wide range of circumstances across regions, the report recommends that all major actions be subjected to a detailed assessment of CO2 reduction potential versus cost.

In short:

  • Regions should understand local circumstances (growth rate, mixes, transit shares, etc.). Slower growing regions are likely to achieve considerable reductions just through fleet turnover.
  • Each region should review its opportunities for emission reduction, considering cost-effectiveness.
  • Most regions should focus more on work-at-home strategies and on speed-related system improvements such as reducing delays at intersections and on the arterial system. Some might also benefit from speed reduction on fast-flowing facilities, but the loss of time is substantial and can harm economies.
  • Regions should resist the temptation to over-hype transit impacts and other “green” actions. Transit impacts are likely to be very small, very costly and cost-ineffective in most regions, particularly those with less than one million people.
  • Similarly, most high-capacity additions are likely to be cost-ineffective too. They should be evaluated carefully, looking for possibilities such as turn lanes, signal actions, bottleneck removal and selective widenings.

In conclusion, our assessment finds that CO2 emission reductions beyond those already mandated by the new CAFE standards are likely to be relatively small, particularly on a global scale, and may be unnecessary.


The post Impacts of Transportation Policies on Greenhouse Gas Emissions in U.S. Regions appeared first on Reason Foundation.

Reducing Greenhouse Gas Emissions From Automobiles https://reason.org/policy-study/reducing-greenhouse-gases-from-cars/ Tue, 29 Nov 2011 05:00:00 +0000 http://reason.org/policy-study/reducing-greenhouse-gases-from-cars/

Federal, state and local governments are considering or have implemented policies that seek to reduce human emissions of greenhouse gases (GHGs).

This study seeks to assess the relative merits of specific policies intended to reduce GHGs from automobiles. (It does not consider whether or not reductions in GHGs are actually desirable.) Current policies and proposals for reducing GHGs from autos would require implementation of strong land use restrictions (compact development). Technological alternatives for reducing GHG emissions have received considerably less attention.

We estimated the costs of a range of such policies, beginning with government documents and reports prepared in cooperation with organizations advocating behavioral policies. Behavioral strategy costs and the costs of technological strategies were evaluated against the upper limit on acceptable costs for GHG emissions reductions as estimated by the Intergovernmental Panel on Climate Change. (This upper limit, $50/ton of carbon dioxide equivalent in 2020-2030, is used because of its source, not because we endorse that value).

GHG emission reduction goals cannot be realistically achieved by applying “fair share” quotas to economic sectors. Depending on the availability of strategies requiring expenditures less than $50 per ton, a sector might account for more or less of the eventual reduction in GHG emissions than its share of total emissions. A “fair share” approach would require some unnecessarily expensive strategies, while neglecting some less costly strategies. As an example, IPCC research indicates that transportation represents 23% of global emissions, yet estimates the economic potential for GHG reduction in transport to be less than one-half that figure (10% or less).
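The logic of screening strategies against a per-ton cost ceiling, rather than assigning each sector a quota equal to its emission share, can be illustrated with a short sketch. The strategy list below is entirely hypothetical (the sectors, tonnages and costs are ours, not the IPCC's or the study's); only the $50-per-ton ceiling comes from the discussion above.

    # Illustrative sketch (hypothetical numbers): screen abatement strategies against
    # the $50/ton ceiling cited above instead of giving each sector a "fair share" quota.
    CEILING = 50.0  # $ per ton CO2-equivalent

    # (sector, abatement potential in Mt CO2e, cost in $ per ton) -- all hypothetical
    strategies = [
        ("power",     120, 20),
        ("power",      80, 45),
        ("industry",   60, 30),
        ("industry",   40, 90),
        ("transport",  50, 35),
        ("transport",  70, 150),   # e.g., an expensive behavioral/land-use strategy
        ("buildings",  90, 15),
    ]

    affordable = [s for s in strategies if s[2] <= CEILING]
    total = sum(mt for _, mt, _ in affordable)

    by_sector = {}
    for sector, mt, _ in affordable:
        by_sector[sector] = by_sector.get(sector, 0) + mt

    for sector, mt in sorted(by_sector.items(), key=lambda kv: -kv[1]):
        print(f"{sector:10s} {mt:4d} Mt ({mt / total:.0%} of affordable reductions)")

In this toy example a sector's share of the affordable reductions differs sharply from its share of emissions, which is the report's argument against "fair share" quotas.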

Research by McKinsey & Company and The Conference Board found that substantial GHG emission reductions can be accomplished cost-effectively while “maintaining comparable levels of consumer utility” (an economic term denoting quality of life). This means “no change in thermostat settings or appliance use, no downsizing of vehicles, home or commercial space and traveling the same mileage” and “no shift to denser urban housing.”

Sustainability is often narrowly defined as pertaining to the environment, such as GHG reduction. However, environmental sustainability also depends upon achieving other dimensions of sustainability, including financial, economic and political.

Behavioral Strategies (Compact Development)

Proponents of this approach argue that GHG reduction will require radical changes in lifestyles. Their solution is behavioral strategies (compact development) to increase urban densities and change the way people travel. The two most prominent reports on this approach (Driving and the Built Environment and Moving Cooler) predict that compact development could reduce GHGs from autos by between 1% and 9% between 2005 and 2050. Driving and the Built Environment acknowledges that there will still be significant increases in overall driving (vehicle miles traveled or VMT).

Compact development raises various issues:

Reasonable Expectations: Projected results from the most aggressive scenarios appear to be implausible based upon reservations stated in Driving and the Built Environment and broader criticisms of Moving Cooler. It is suggested that a range of 1% to 5% is more realistic for the maximum GHG emissions reductions from autos between 2005 and 2050 under compact development policies.

Traffic Congestion and Compact Development: Even this modest level of GHG reduction could be further diminished by the "GHG Traffic Congestion Penalty." The higher densities required under compact development would cause greater local traffic congestion. As traffic slows and moves more erratically, the GHG reductions from less driving are diminished. Further, traffic congestion retards the quality of life of households and imposes economic costs on metropolitan areas.

Housing Affordability and Compact Development: Compact development is associated with higher housing prices. This is burdensome to lower income households, which are disproportionately minority. Assessing the impact of compact development on house prices, a Latino (Hispanic) think tank noted “an increase is always the result.” The increased household expenditures for mortgage interest and rents alone could amount to nearly $20,000 per GHG ton annually, nearly 400 times the IPCC $50 maximum expenditure by 2050 (2010$). This loss of housing affordability would represent a huge transfer of wealth from lower and middle income households.

Infrastructure Costs and Compact Development: Despite theoretical claims that suburban infrastructure is more expensive than in more dense areas, data for metropolitan areas indicates no such premium.

Higher Densities: Compact development would require unprecedented increases in density, well beyond those envisioned by current compact development policies. This densification could require aggressive use of eminent domain and could be prevented by neighborhood resistance and public reaction.

Compact development is incapable of reducing GHG emissions within the IPCC $50 maximum expenditure. Compact development’s higher than necessary expenditures could reduce economic growth, increase congestion costs, and result in public resistance and greater social imbalances. Because of its detrimental impact on financial, economic and political sustainability, compact development is unsustainable as a strategy for reducing GHG emissions from autos.

Facilitative Strategies

The alternate view is that technology solutions can achieve sufficient GHG reduction from autos. These facilitative strategies would alter the underlying GHG intensity of how people live and travel without requiring major changes in behavior or the standard of living.

There is substantial potential for reducing GHGs:

The trend of present fuel-efficiency improvements, if continued beyond 2030, would produce auto-related GHG reductions of 18% by 2050 (from 2005). And if VMT increases at a lower rate, as some experts now project, a 33% reduction could be achieved (a rough sketch of the underlying arithmetic follows below).

If the average auto were to achieve the best current hybrid fuel economy by 2040, GHGs would fall 55% between 2005 and 2050.

Emerging fuel technologies also offer promise. Hydrogen fuel cells and zero-emission cars (principally plug-in electric vehicles), if paired with electricity from hydro-power, could help reduce GHGs from autos by 2050.
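The projections above rest on a simple relationship: auto GHGs scale roughly with total driving divided by fleet-average fuel economy, for a given fuel. The sketch below shows that arithmetic with illustrative annual growth rates of our own choosing; they are not the study's inputs, though they produce reductions of roughly the same order as the 18% and 33% figures cited.

    # Minimal sketch of the arithmetic behind the projections above: auto GHGs scale
    # roughly with total driving (VMT) divided by fleet-average fuel economy.
    # The growth rates below are illustrative assumptions, not the study's inputs.

    def ghg_change(vmt_growth_per_yr, mpg_growth_per_yr, years=45):
        """Fractional change in auto GHGs over `years`, relative to the base year."""
        vmt_factor = (1 + vmt_growth_per_yr) ** years
        mpg_factor = (1 + mpg_growth_per_yr) ** years
        return vmt_factor / mpg_factor - 1

    # 2005 -> 2050 is 45 years.
    for label, vmt_g, mpg_g in [
        ("historical VMT growth, steady MPG gains", 0.015, 0.020),
        ("slower VMT growth, steady MPG gains",     0.010, 0.020),
    ]:
        print(f"{label}: {ghg_change(vmt_g, mpg_g):+.0%} change in auto GHGs by 2050")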

Various issues are examined with respect to facilitative strategies:

GHG Reduction and VMT Increases: Department of Energy projections indicate that auto GHG emissions will decline, even though total driving will continue to increase.

Maximum Expenditures: Facilitative strategies that would require more than the $50 IPCC maximum expenditure are rejected.

Quality of Life: Current technologies can be implemented without retarding Americans' quality of life. However, some of the more advanced technology strategies may reduce quality of life by requiring smaller autos. Under either scenario, people could continue to live in houses of the same size at affordable prices and to travel the same mileage, and there would be no necessity to shift to denser urban housing. Research associates greater economic growth with geographic mobility, which is preserved even under the more advanced technologies.

Relying on Technology: Based upon the current availability of far more fuel-efficient technologies, such as hybrid vehicles, it is plausible to assume continued GHG reductions after 2030. The emerging strategies could accelerate the improvement. Of course, as noted above, any projection is uncertain.

New technologies have the potential to achieve substantial GHG emission reductions at costs within the $50 IPCC maximum expenditure per ton. This could be accomplished while preserving quality of life. As a result, public acceptance is more likely.

Conclusions and Recommendations

Generally, existing and likely future technologies have a far greater potential to reduce GHG emissions than compact development. Based upon Driving and the Built Environment and Moving Cooler, compact development provides little possibility of achieving a reduction of more than 5% in auto GHGs by 2050.

On the other hand, wider application of existing technologies could produce GHG emission reductions of up to 54% by 2050 with current hybrid technology. GHG reductions from new technologies, such as electric cars, could be even greater. These technologies are potentially sustainable financially, economically and politically, and thus environmentally. By contrast, imposing compact development would be enormously expensive, is likely to reduce economic growth substantially, and could stifle opportunity for lower income households, which are disproportionately African-American and Hispanic. These factors render compact development unsustainable financially, economically and politically, and thus environmentally.

As governments consider policies intended to reduce GHG emissions from autos:

· Compact development strategies should be neither mandated nor encouraged.

· Technology strategies should receive priority.

At the same time, any such policies other than removing government-imposed barriers to new technology development and adoption should be implemented with great caution.

The EPA Is Overreaching Again https://reason.org/commentary/the-epa-is-overreaching-again/ Tue, 25 Oct 2011 10:00:00 +0000 http://reason.org/commentary/the-epa-is-overreaching-again/

On October 13th, the House voted to postpone the implementation of Environmental Protection Agency’s “Boiler MACT” regulations. Coming only a few weeks after President Obama’s high profile intervention to scrap the EPA’s proposed new ozone rules, the House vote, which was supported by 41 Democrats, should be a wake-up call to an agency that has over-reached. Instead, the EPA looks set to push forward with final rules in the next few weeks that would significantly increase the cost of energy for consumers and industrial users. The “EPA Regulatory Relief Act of 2011” would postpone implementation of the EPA’s “Boiler MACT” rules, which aim to curb emissions from boilers used to produce electricity and heat for industry and commercial use.

While policies that reduce harmful air pollution are obviously desirable, the methods used by EPA in this instance would be enormously difficult to achieve and impose huge costs on society. The EPA is aware of these problems and delayed the promulgation of a final rule by a year in order to “calculate standards that fully reflect operational reality” but it now says it will likely not address major issues in the final rules scheduled for release at the end of the month.

One major issue is the way the EPA has cherry-picked emissions data from various boilers and then sought to apply the results to all boilers. The result is a set of standards that have never been demonstrated as operationally achievable when applied together. Some observers have dubbed the hypothetical boiler to which these standards would apply a “frankenboiler.” Unless the EPA addresses this and other important issues, such as the dubious basis for its cost-benefit test, Congress is justified in intervening to prevent these rules from being promulgated.

The EPA says that its frankenboiler standards are justified under the Clean Air Act, which requires standards based on the “maximum achievable control technology,” or MACT. It is true that MACT standards impose stringent emission limits on boilers used in manufacturing, processing, mining, refining, and other industries, as well as commercial boilers used in malls, apartments, restaurants, and hotels. Under the Clean Air Act, the EPA may force companies to upgrade to the technologies used in the top 12 percent of facilities within any particular industry. For example, the EPA can require existing steel mills to apply technologies used in the cleanest 12 percent of steel mills nationwide.

The most natural interpretation of the MACT standard is that the EPA should look across an industry, identify the least-polluting 12 percent of facilities and direct the rest of the industry to move toward similar controls. Instead, the EPA looked at individual pollutants at facilities, cherry-picked the lowest emitters of any particular pollutant (regardless of the same boiler’s emissions of other pollutants), spliced them together, and set the bar there. And like Dr. Frankenstein’s cobbled-together freak show in Mary Shelley’s famous novel, this monstrous boiler is entirely imaginary.

The EPA has taken a similarly burdensome approach before on a smaller scale – and the result was devastating. In 1997, it issued regulations for incinerators used in the disposal of hospital waste. Environmentalists sued, claiming that the standards were too lax, and two years later the agency was told by a court to redo them. EPA took its time issuing new standards, but the regulations deemed too weak by environmentalists nevertheless decimated the industry – cutting the number of facilities from over 2,300 to 57 in just one decade.

By the time the EPA finally issued new standards for hospital waste, in 2007, it had taken the environmentalists’ message to heart. Not only did it use new data from 2007, by which point 98 percent of businesses had already exited the industry, but it also claimed that “there appears … to be a substantial ambiguity in the statutory language about whether the MACT floor is to be based on the performance of an entire source or on the performance achieved in controlling particular hazardous air pollutants.” It then decided that the obvious solution to this ambiguity was to adopt the most perverse interpretation. Thus begat the frankenboiler.

Earlier we noted that no boilers comply with the new MACT standards. That may be a slight exaggeration: it has been estimated that between zero and two percent of the boilers in the country might comply with the proposed standard. Ninety-eight percent or more would fail. Unfortunately, industrial and commercial boilers are not part of an industry America can afford to decimate. Under the regulation, the majority of the country’s 14,000 large boilers will have to be retrofitted with new and costly technologies. The EPA estimates the upfront price tag of doing this at $10 billion, with annual compliance costs of around $3 billion. Industry estimates are much higher.

Earlier this year President Obama signed an executive order requiring federal agencies to use the “least burdensome tools” that take “into account benefits and cost” when creating regulations. This is an example where the EPA can adhere to that mandate by making a simple, sensible policy decision: withdraw the proposed standards and, as EPA Administrator Lisa Jackson has put it, “calculate standards that fully reflect operational reality.”

Unless the EPA withdraws the proposed standards voluntarily – and it seems unlikely to do so – Congress should require it to. It’s time to rein in this Dr. Frankenstein Agency.

Julian Morris is vice president at Reason Foundation. Adam Peshek is a research associate at Reason Foundation.

EPA jumps the gun with job-killing rules https://reason.org/commentary/epa-jumps-the-gun-with-job-killing/ Fri, 19 Aug 2011 15:20:00 +0000 http://reason.org/commentary/epa-jumps-the-gun-with-job-killing/

Twice this year, President Obama asked federal agencies to review regulations to ensure that they are not interfering with efforts to rebuild the U.S. economy. In January, he signed an executive order directing agencies to use the “least burdensome tools” that take “into account benefits and cost” and “[promote] economic growth … and job creation.”

Either the Environmental Protection Agency didn’t get the memo or it was lost under the growing stack of regulations the agency is advancing at record speed.

Last week, the EPA said it would soon release updated ozone regulations that are going to kill jobs and impose substantial costs on the U.S. economy – at least $90 billion, by its own estimates, and $1 trillion annually between 2020 and 2030 according to industry estimates.

“Good” ozone in the upper atmosphere protects us from ultraviolet radiation, but at ground level it can affect human health and is the main constituent of urban smog. This “bad” ozone comes mostly from vehicle exhaust and industrial emissions, but can also come from natural sources.

The EPA has already missed four self-imposed deadlines (most recently last month) to impose new standards. And environmentalists are not happy about this. Earthjustice and other environmental groups have demanded that the EPA immediately release the rule, stating that the agency “has run out of excuses for any more stalling on this decision.”

But this is hubris. The EPA is under no obligation to develop new regulations at this time. The Clean Air Act – the legal basis for most federal air quality regulations – requires the EPA to review national air quality standards every five years. If it finds that current thresholds are detrimental to health, the EPA can go through the process of setting a new, scientifically backed standard. The last time these standards were reviewed was three years ago. Legally, the EPA is not obliged to initiate a review for another two years.

So, why is it doing so now? Is smog on the rise? Nope. According to the EPA, ozone levels have been falling year after year. Since 1980, ozone emissions have fallen by nearly 50 percent. And yes, new standards played a significant role in this. But it takes time for companies to develop and implement technologies that will enable them to comply in cost-effective ways. Small companies in particular find this difficult because they have fewer resources to devote to compliance.

In 1997, the Clinton administration set an eight-hour ozone standard of 84 parts per billion. By 2004, there were still 474 counties that had not achieved that goal. Today, just 242 counties aren’t meeting the standard. In 2008, the Bush administration set an even stricter standard that lowered the ozone level to 75 parts per billion. Small businesses are still in the process of implementing technologies to comply with that ruling. Yet the EPA now wants to ratchet down emissions to between 60 and 70 parts per billion.

At those levels, the number of nonattainment counties – places with ozone levels that exceed the “acceptable” amount – would increase from 242 to between 515 and 650, depending on where the new bar is set.

Even some wilderness areas, including Yellowstone Park, could find themselves in the EPA’s non-attainment category. That’s because ozone also occurs in nature – as a result of chemicals released by plants and soil.

Nonattainment is not a trivial label for a county or region. Companies wishing to invest in these areas would be forced to comply with costly and time-consuming permitting processes and expensive equipment requirements, and could even be required to reduce emissions from nearby plants to offset emissions at new facilities. Counties themselves could be denied highway and infrastructure funds if ozone levels increased in their area. A smart business would simply avoid these areas due to the potential costs and headaches.

The Manufacturers Alliance, a manufacturing industry research group, estimated that the regulation could result in the loss of 7.3 million jobs. While those numbers might be exaggerated, it is clear that the regulations are going to cause job losses. And with a teetering economy and unemployment lingering over 9 percent, the EPA should not be pushing through a premature, costly, job-killing regulation. If Mr. Obama is serious about putting the kibosh on regulations that limit economic growth, he would be wise to focus on this one.

Adam Peshek is a research associate at the Reason Foundation. This column first appeared in The Washington Times.

The EPA https://reason.org/commentary/the-epas-carbon-footprint/ Mon, 15 Feb 2010 16:37:00 +0000 http://reason.org/commentary/the-epas-carbon-footprint/

"Reasonable" here is in the eye of the beholder. The 1990 Clean Air Act was designed for conventional air pollutants such as particulates and ozone smog, not for carbon dioxide. Applying those rules to CO2 will mean imposing costly regulations not just on cars and factories but on commercial buildings, churches, and even residences. All told, more than 1 million entities could become subject to new federal controls on greenhouse emissions.

The post The EPA appeared first on Reason Foundation.

]]>
On December 7, as delegates from around the world gathered in Copenhagen for the United Nations climate conference, Environmental Protection Agency Administrator Lisa Jackson announced that her bureaucracy would begin to regulate the emission of carbon dioxide and other gases deemed to be warming the planet. “Today, I’m proud to announce that EPA has finalized its endangerment finding on greenhouse gas pollution,” Jackson proclaimed. As a consequence, the agency “is now authorized and obligated to take reasonable efforts to reduce greenhouse pollutants under the Clean Air Act.”

“Reasonable” here is in the eye of the beholder. The 1990 Clean Air Act was designed for conventional air pollutants such as particulates and ozone smog, not for carbon dioxide. Applying those rules to CO2 will mean imposing costly regulations not just on cars and factories but on commercial buildings, churches, and even residences. All told, more than 1 million entities could become subject to new federal controls on greenhouse emissions.

The EPA power grab was no surprise; indeed, it was inevitable. The regulatory train was set in motion in 2007, when the Supreme Court ruled by a 5-4 vote in Massachusetts v. EPA that the Clean Air Act applied to greenhouse gases. The EPA probably would have made the same move had John McCain been president, by court order if not voluntarily. Now that the train is picking up speed, it will be almost impossible to stop and difficult to control. If you think federal environmental regulation is costly and inefficient, you ain’t seen nothing yet.

Orders From the Court

The push to extend the Clean Air Act began late in the Clinton administration. In 1998, during a House Appropriations Committee hearing, EPA Administrator Carol Browner told Congress that existing law provided enough authority for the agency to begin regulating the greenhouse gases that environmentalists fear are warming the planet past the point of no return. An EPA legal memorandum on this point soon followed. Environmental groups then tried to force the agency’s hand by filing a petition demanding regulation, but the Clinton White House, still smarting over a failed 1993 attempt to impose a nation-wide energy tax, was in no rush to adopt such far-reaching regulations.

By the time the Bush administration took over, the greens were tired of waiting for an answer. In 2002, the petitioners sued the EPA for failing to act. The Bush EPA formally denied the petition in 2003, on grounds that it lacked the authority to regulate greenhouse gases because the Clean Air Act was written to address localized air pollutants, not globally dispersed emissions such as carbon dioxide. If Washington wanted to fight climate change, the administration argued, coordinated international efforts made more sense than haphazard regulation via a law written in a different time for a different purpose.

The petitioners, now joined by several northeastern states and others, promptly sued. They prevailed in 2007, when the Supreme Court’s narrow majority concluded that the EPA had power to regulate greenhouse gas emissions and had acted arbitrarily in declining to exercise the Clean Air Act’s underlying authority.

Under the original law, the EPA is required to regulate any emissions that “cause, or contribute to, air pollution which may reasonably be anticipated to endanger public health or welfare.” According to the five-justice majority, the six greenhouse gases cited by the petitioners (carbon dioxide, methane, nitrous oxide, hydrofluorocarbons, perfluorocarbons, and sulfur hexafluoride) “fit well within the Clean Air Act’s capacious definition of ‘air pollutant,’ ” as they contribute to the accumulation of greenhouse gases in the atmosphere, which in turn contribute to a gradual warming that could “threaten the public health and welfare of current and future generations.” Writing for the majority, Justice John Paul Stevens brushed aside concerns that a complex regulatory statute designed to combat local pollution problems was a poor fit for global climate control. EPA regulation of greenhouse gases “would lead to no…extreme measures,” he wrote.

The Supreme Court stopped short of ordering the EPA to regulate greenhouse gases, but the writing was on the wall. If the EPA concluded that, per the Clean Air Act, greenhouse gas emissions “may reasonably be anticipated to endanger public health or welfare,” the agency would now be legally obligated to regulate. Since even the Bush EPA had repeatedly warned that global warming was a problem the nation “must address,” greenhouse gas regulation became a question of “when,” not “if.”

Not Just Cars and Trucks

The immediate consequence of the sweeping new EPA authority will be more stringent regulation of automobiles. Section 202 of the Clean Air Act requires the EPA to adopt emission controls once an “endangerment” finding is made. In September, anticipating that finding, the EPA and the National Highway Traffic Safety Administration proposed new regulations that would effectively require automakers to produce cars and light trucks with an average fuel efficiency rating of 35.5 miles per gallon by 2016. According to the EPA’s own estimates, the regulations could cost automakers more than $50 billion and increase new vehicle prices by an average of $1,000. The rules could also reduce auto safety by encouraging production of lighter, smaller cars. With the endangerment finding on the books, a final rule should follow shortly.

That won’t be the only new regulation set in motion. While the EPA made its endangerment finding under Section 202, other provisions of the act have virtually identical language. For example, Section 111, which governs emissions for newly built or modified industrial facilities, likewise requires the agency to set standards for stationary sources of emissions that cause or contribute to “air pollution which may reasonably be anticipated to endanger public health or welfare.” If the EPA must regulate automotive emissions under Section 202, it will also have to set standards for newly constructed industrial facilities under Section 111.

And that just scratches the surface of the EPA’s potential greenhouse impact. Under Section 165 of the Clean Air Act, when companies construct or modify any facility that qualifies as a “major” stationary source of air pollution, they are required to adopt the “best available control technology” for all emissions subject to regulation by any part of the act. The law defines a “major” source as a facility that has the potential to emit 250 tons per year of a regulated pollutant (or, for some specified facilities, 100 tons per year). For traditional air pollutants, such as sulfur dioxide or nitrogen oxides, these thresholds mean that only the biggest and dirtiest facilities, amounting to several thousand nationwide, are subject to federal controls.

Carbon dioxide, however, is a ubiquitous by-product of modern industrial society. (Indeed, some efforts to control traditional pollutants increase carbon dioxide emissions by design, as a by-product of more complete combustion.) Plenty of industrial facilities emit more than 250 tons of carbon dioxide per year. So do many commercial and residential buildings. The EPA itself estimates that a strict application of Section 165 would increase the number of required air pollution permits “more than 140-fold.” A study commissioned by the U.S. Chamber of Commerce goes even further, estimating that the 250-ton threshold would encompass nearly 200,000 manufacturing facilities, approximately 20,000 farms, and at least 1 million commercial buildings, including a substantial percentage of hospitals, hotels, large restaurants, and even some churches. On average, the Chamber of Commerce study reported, “a building with over 40,000 square feet uses enough hydrocarbons to become a regulated source.” And since the act applies to all facilities with the mere potential to emit 250 tons in a year, the regulatory net could be spread even wider.
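To see how easily a facility can cross the statutory threshold, the sketch below converts annual natural-gas use into tons of CO2 using a commonly cited emission factor of roughly 53 kg of CO2 per MMBtu. The factor is an approximation, the threshold is treated as metric tons for simplicity, and the building's gas use is a hypothetical figure of ours, not drawn from the EPA or Chamber of Commerce analyses cited above.

    # Rough sketch: how much on-site natural-gas combustion puts a facility over the
    # Clean Air Act's 250-ton "major source" threshold discussed above.
    # Assumptions: ~53.1 kg CO2 per MMBtu of natural gas (approximate emission factor);
    # the threshold is treated as metric tons; the building's gas use is hypothetical.

    KG_CO2_PER_MMBTU_GAS = 53.1
    THRESHOLD_TONS = 250

    def annual_tons_co2(mmbtu_per_year):
        """Tons of CO2 per year from burning the given amount of natural gas."""
        return mmbtu_per_year * KG_CO2_PER_MMBTU_GAS / 1000.0

    # Gas use needed to reach the threshold:
    mmbtu_at_threshold = THRESHOLD_TONS * 1000.0 / KG_CO2_PER_MMBTU_GAS
    print(f"~{mmbtu_at_threshold:,.0f} MMBtu/yr of natural gas reaches the {THRESHOLD_TONS}-ton threshold")

    # A hypothetical large commercial building:
    building_mmbtu = 6000  # assumed annual gas use, for illustration only
    tons = annual_tons_co2(building_mmbtu)
    verdict = "over" if tons > THRESHOLD_TONS else "under"
    print(f"Hypothetical building burning {building_mmbtu:,} MMBtu/yr: ~{tons:,.0f} tons CO2 ({verdict} the threshold)")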

Just one EPA permit for a new or modified source can cost hundreds of thousands of dollars for the applicant and require more than 300 person-hours for a regulatory agency. Adding hundreds of thousands of new permit seekers would likely overwhelm the state agencies that typically implement EPA rules, causing extensive delays and cost increases. It would be a substantial new burden on an already struggling economy.

‘Absurd Results’

The EPA is well aware of the potential regulatory nightmare (and political backlash) that enforcement of Section 165 could create, so the agency has offered to modify its carbon rules. In September, shortly after proposing the new regulations for cars and trucks, the EPA proposed a dramatically higher new threshold of 25,000 tons per year before the new greenhouse gas requirements are imposed, even though the statute expressly sets a limit of 250. The EPA estimates that the new threshold, if adopted, would force fewer than 15,000 facilities to obtain carbon permits, and most of those are already subject to other environmental regulations. “This is a common-sense rule that is carefully tailored to apply to only the largest sources,” the EPA’s Jackson explained.

However sensible it may be, the proposal directly conflicts with the act’s explicit text. Section 165 applies to “any…source with the potential to emit two hundred and fifty tons per year or more of any pollutant” (emphasis added). The EPA justified its elastic reading of the law on grounds that a lower threshold is “not feasible” for greenhouse gases. Without any statutory text to support this decision, the EPA relied on the doctrines of “administrative necessity” and avoiding “absurd results.”

According to the EPA, applying the Clean Air Act as written to greenhouse gas emissions would “extensively disrupt” existing regulatory programs and perhaps make them “impossible” to administer. Yet such administrative concerns did not persuade a majority of the Supreme Court in Massachusetts v. EPA. Nor did they convince the attorneys who sued the EPA to force greenhouse gas regulation in the first place, some of whom now work for the EPA. Environmentalists were happy to push inflexible readings of statutory provisions that expand the EPA’s regulatory power. Now that their friends are in charge, some want regulators to have discretion over how this authority is exercised. The EPA says it won’t regulate smaller facilities now, but it’s also pledging to revisit the 25,000 ton threshold within five years.

The EPA’s regulatory benevolence could be even more short-lived, as the agency’s creative reading of the Clean Air Act is unlikely to survive judicial review. During the Bush years, federal courts repeatedly chastised the agency for taking liberties with the Clean Air Act. “Read the statute!” the U.S. Court of Appeals for the D.C. Circuit exclaimed in one case. In another, the court declared that the EPA’s interpretation of the act could make sense “only in a Humpty Dumpty world.” One can imagine its response to the EPA’s effort to turn 250 into 25,000. If the text of the Clean Air Act applies to greenhouse gases, it requires more stringent regulation than even this administration wants.

Pollute Globally, Regulate Locally

Section 165 is not the only potentially massive regulatory consequence of the EPA’s carbon announcement. In all likelihood the finding will force the agency to set National Ambient Air Quality Standards for carbon dioxide and other greenhouse gases as well. This could end up making Section 165 look like a walk in the park.

Sections 108 and 109 of the Clean Air Act require the EPA to set standards for all emissions which, in the judgment of the administrator, “cause or contribute to air pollution which may reasonably be anticipated to endanger public health or welfare” and “the presence of which in the ambient air results from numerous or diverse mobile or stationary sources.” We’ve seen this first requirement before; it’s the basis of the endangerment finding, so it has already been met. Emitted from countless mobile and stationary sources, greenhouse gases easily satisfy the second criterion as well.

Air quality standards must be set at the level “requisite to protect the public health” with “an adequate margin of safety” and “requisite to protect the public welfare from any known or anticipated adverse effects.” States then must develop implementation plans detailing how they will ensure that local air quality meets the standard before a deadline. These plans must include more stringent facility permitting requirements and whatever other measures are necessary to ensure the target is met, including regulation of automobile use. States that fail to comply risk sanctions, including loss of highway funds and direct imposition of even stricter rules.

The problem: It makes no sense to set ambient air quality standards for greenhouse gases. There is simply no way for state and local regulators to ensure that individual cities, or even larger regions, meet an air quality standard for a globally dispersed atmospheric pollutant. Local emissions could be reduced to zero, and a given area would still violate the standards if global emissions did not decline. It would be a pointless regulatory exercise.

Nonetheless, some people feel the exercise is necessary. On December 2, the Center for Biological Diversity filed a petition with the EPA demanding that it adopt National Ambient Air Quality Standards for greenhouse gases. It was a petition of just this sort that set the greenhouse regulatory train in motion in the first place. The Center for Biological Diversity is more than ready to file suit if the EPA does not comply, and the Clean Air Act is on its side.

A Done Deal

Barring congressional action to amend the Clean Air Act, most of this mischief is a done deal. Anti-regulatory groups have already announced their intention to challenge the EPA’s endangerment finding in court, but they are unlikely to get very far. All the EPA must show is that it could reasonably anticipate that global warming could threaten public health or “welfare,” an expansive term the act explicitly defines to include effects on climate, “economic values,” and “personal comfort and well-being.” Reviewing courts will not substitute their reading of the relevant scientific evidence for that of the EPA, so it’s no use arguing the agency placed too much weight on one study while discounting another.

Although EPA head Jackson would later claim the timing was coincidental, when she announced her decision to regulate carbon she said it would allow U.S. negotiators to “arrive at the climate talks in Copenhagen with a clear demonstration of our commitment to facing this global challenge.” Lacking climate change legislation with binding targets, the White House concluded that an announcement of EPA regulation was the next best thing. The greenhouse regulatory train set in motion by Massachusetts v. EPA will continue to steam ahead unless Congress intervenes. All aboard.

Jonathan H. Adler (jha5@case.edu) is a professor of law and the director of the Center for Business Law and Regulation at the Case Western Reserve University School of Law. He participated in an amicus curiae brief of law professors and the Cato Institute opposing EPA regulation of greenhouse gases in Massachusetts v. EPA. This column first appeared at Reason.com.
