Climate Change Archives - Reason Foundation
https://reason.org/topics/environment/climate-change/

The government's bad idea to stop using single-use plastics
https://reason.org/commentary/the-governments-bad-idea-to-stop-using-single-use-plastics/ (Mon, 24 Oct 2022)
The General Services Administration should not ban single-use plastics from its supply and acquisition chains.

The General Services Administration is considering phasing out single-use plastics from its supply chain and procurement processes, a move that would have major ramifications for America's economy and the functioning of its production and service sectors. Given the size and market power of the GSA, the proposed rule's effects would likely ripple through the national plastics economy and down to individual Americans, who could find their ability to choose single-use plastics significantly constrained.

On July 7, 2022, the General Services Administration (GSA) issued an advance notice of proposed rulemaking asking contractors who make or use single-use plastics what they think of the Center for Biological Diversity's proposal that the agency stop contracting for goods or services that use such materials.

In the notice, GSA poses a long list of questions about the scale and scope of single-use plastics used in the goods and services it sources through its providers and what it would cost those providers to go along with the plan to ditch single-use plastics. The class of single-use plastics includes plastic drinking straws, plastic water bottles, plastic packaging materials, plastic grocery bags, plastic cutlery, and many other plastic items often treated as environmental villains of the moment. But it also includes things less commonly considered nuisances, even when found out of place as litter, such as the many sorts of single-use medical containers, products, and devices, including surgical masks.

The GSA appears to be acting on this issue in response to a petition from the Center for Biological Diversity (CBD). This aggressive group describes itself as a conservation organization "dedicated to the protection of endangered species and wild places." But in this case, what CBD is asking the General Services Administration to do would not improve global, national, local, or individual environmental health and safety. These proposed actions would, in all probability, compromise those very things.

The Center for Biological Diversity argues that banning single-use plastics aligns with President Joe Biden’s Executive Order 14008, “Tackling the Climate Crisis at Home and Abroad,” which calls for federal agencies to align their activities with the president’s climate change agenda. The crux of CBD’s petition is on page 9:

In furtherance of its stated policy to purchase sustainable products, and in line with its directive to procure environmentally preferable and nonhazardous products, the GSA must issue a rule committing the federal government to reduce and eventually eliminate its procurement and acquisition of single-use disposable plastic products….

Petitioners request that the GSA revise its regulations to reduce and eventually eliminate the acquisition of single-use plastic bags, single-use plastic utensils and straws, beverage bottles, packaging, and other single-use food service items and personal care products.

These revisions should apply to the procurement of single-use plastics for federal government meetings, conferences, and events; food service facilities in leased and custodial buildings; and supplies for federal government operations. In addition, the new regulations should apply to all manners by which civilian executive agencies acquire goods and services, directly or indirectly, including through lease, procurement, contracting, and purchase orders.

We further request that the rulemaking contains exemptions for disability accommodations, disaster recovery, medical use, and personal protective equipment. GSA regulations must clarify that “single-use product” does not include medical products necessary for the protection of public health, or personal protective equipment, including masks, gloves, or face shields.

To give the CBD some credit where it is due, this last paragraph is refreshingly grounded in the reality of real-world trade-offs—some of them, anyway. More such thinking would improve environmental policy considerably. But neither CBD's petition nor GSA's notice gives much weight to the trade-offs raised by the many other problematic elements of this proposed approach to plastics.

Policy Problem One: First, do no harm (proximal)

Perhaps the first test of sound public policy is the same test used to determine sound medical policy, which is, as the Aesculapians like to say, primum non nocere, or first, do no harm.

It doesn't take much review of the research literature on plastic material substitution to see that plastic substitutes are usually worse for the environment than plastics, as well as worse for human health and safety. I have written about the downsides of plastics substitutions at some length. My recent piece here examines the Canadian context, where Canada is even farther ahead of the United States in pursuing "zero plastic waste."

So why are alternatives to single-use plastics worse for the environment? One of the biggest reasons is that the "reusables," as I'll call them, consume more energy over their life cycles than their single-use plastic alternatives. More energy in manufacturing, distribution, utilization, and disposal means greater environmental impacts coming out of the soil (oil production), going into the air (conventional pollutants and greenhouse gases), running off into the water, and going back into the land (landfilling).
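To make the life-cycle comparison concrete, here is a minimal back-of-envelope sketch. The per-bag impact figures are illustrative assumptions chosen to fall in the range reported by published life-cycle assessments (for example, the UK Environment Agency's 2011 carrier-bag study); they are not taken from this article.

```python
# Illustrative break-even calculation for reusable vs. single-use bags.
# The impact figures below are rough assumptions in the range reported by
# published life-cycle assessments; they are not data from this article.

single_use_impact = 1.6    # kg CO2e per conventional plastic bag, manufacture plus disposal (assumed)
cotton_bag_impact = 270.0  # kg CO2e per cotton tote over its whole life (assumed)

break_even_uses = cotton_bag_impact / single_use_impact
print(f"A cotton tote must replace ~{break_even_uses:.0f} single-use bags "
      "before its life-cycle emissions fall below the plastic alternative.")
# With these assumptions the answer is on the order of 170 uses; published
# estimates range from roughly 130 to several thousand, depending on the
# reusable material and on whether the plastic bag itself is reused.
```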

The downsides with regard to human health involve something that should be top-of-mind for everyone in the post-Covid-19 pandemic landscape—biological contamination. Single-use products are more likely to be sterile when first used, and they are rarely used again in a context where sterility is essential.

The same is not true for durable alternatives, which see repeated use in precisely the activities where biological contamination is a concern: eating and contact with bodily fluids.

The research literature on reusable bags is fairly solid on this issue, and the finding extends to reusable alternatives to plastic packaging (for food and medicines, for example). Reusable materials are more likely to be contaminated with second and subsequent uses and are simply less safe.

It should be obvious, but this is one reason why single-use plastics were adopted over reusable materials in the first place, particularly in medical settings, but also with regard to food contamination and preservation.

Policy Problem Two: Also, first, do no harm (proximal-distal)

The second policy problem is the same as the first: The policy is likely to violate the idea of doing no harm—in this case, distally, through its impacts on the economy in which we all live and from whose productive powers we receive all the wonderful goods and services that give us a quality of life that is, by historical standards, absurdly comfortable.

From the more proximal economic standpoint of impacts on the American economy specifically, the proposal to get GSA out of the business of participating in the market for single-use plastics can only be a net harm. America's economy is a high-tech transformation and service economy. America specializes in a certain kind of material and energy transformation, which is the creation and use of advanced technologies, materials, and heaps and gobs of powered gizmos and gewgaws of every sort. That's our thing. We're not a nation of farmers anymore. We're not "hewers of wood and haulers of water," as some of our Canadian friends have been styled. We're not a raw natural-resource economy where we just dig up materials found in our environment and trade in them.

We are increasingly a consumer goods and services economy that engages in a vast spectrum of activities requiring a vast spectrum of materials with which to provide those goods and services. And though manufacturing has shrunk as a share of US Gross Domestic Product, America still invents, makes, uses, invents new uses of, and, importantly, sells high-tech goods and services rendered with such goods, in high quantities, at high speed, to as big a market as we can reach.

And plastics have become a significant part of that over a short span of time—only about 60 years since early adoption in the US materials economy.

More distal still (but, to this biologist, no less compelling) is the fact that rationing and restricting access to a useful material such as plastic is unwise from the higher-order perspective of humanity's evolutionary niche. Unlike other animals, human beings evolved to use technology and energy to transform raw materials into things that let us survive in the places that we otherwise might not, which is most of the surface of the Earth, and compete against animals that would otherwise view us as a light snack, or perhaps a decent lunch. Our transformative capabilities also let us defend ourselves against other humans, some of whom might not have gotten the memo that cooperation is a better strategy for mutual coexistence.

This is obligatory stuff woven into human evolution. Humans need to make use of virtually all materials available to them (and need a lot more that are not yet created, like that catalyst that will split water with little energy input) in order to meet their evolutionary imperatives to survive. Banning plastics, arguably one of the most useful materials ever available to Homo sapiens (as easily shown by the eagerness with which it has been incorporated into the human materials ecosystem freely, without government compulsion), would needlessly—and obviously—set back humanity's ability to prosper in a hostile universe.

Summary

The General Services Administration's proposal to remove single-use plastics from its supply and acquisition chains at the behest of the Center for Biological Diversity would be detrimental to environmental health and safety, to humanity's evolutionary imperatives, to America's social and economic imperatives, to individuals' choices and rights, and to the protection of the environment itself, both locally and globally.

The GSA might feel obligated to act on the Center for Biological Diversity's anti-plastics petition. However, sound public policy principles suggest that the agency should not give the CBD what it wants. The General Services Administration should not ban single-use plastics from its supply and acquisition chains. Doing so would do America more harm than good.

Can virtual reality technology encourage remote work and slow climate change?
https://reason.org/commentary/can-virtual-reality-technology-encourage-remote-work-and-slow-climate-change/ (Wed, 24 Nov 2021)
Virtual reality headsets and meeting rooms could enhance the telecommuting experience.

Virtual reality, a technology normally associated with gaming, promises to enhance remote collaboration, further reducing the necessity for physical meetings and the need for workers to commute. If virtual reality (VR) successfully builds upon the technological advances that have enabled remote work, it could help put a dent in greenhouse gas emissions attributable to in-person commerce.

Although the term telecommuting was coined all the way back in 1973, the practice required technological progress, and, sadly, a global pandemic to come into its own. Businesses, non-profits, and government agencies in the early 1970s lacked personal computers, had rudimentary telecommunications capabilities, and relied heavily on physical files for information storage. Workers needed to be physically present just to obtain and share information.

The rise of personal computers (PCs), the internet, digital video cameras, smartphones, and cloud technologies has enabled most knowledge workers to exchange information and complete tasks from almost anywhere. Meanwhile, categories of work not traditionally thought of as knowledge or office work have come online.

Over the last two decades, online collaboration software has proliferated and improved. But online meeting tools don’t fully capture the interpersonal dynamics of physical meetings. And for many users, they can be tiring. Stanford Communications Professor Jeremy N. Bailenson has researched the phenomenon of “Zoom fatigue.” He considered four possible aspects of online meetings that could cause participants to become fatigued:

  • Excessive amounts of close-up eye gaze, 
  • Cognitive load arising from difficulty in sending non-verbal signals, 
  • Increased self-evaluation from staring at video of oneself, and 
  • Constraints on physical mobility.

New technology promises to reduce some of these drawbacks by making online meetings less stressful and tiring. Meta (formerly known as Facebook) is beta testing Horizon Workrooms, which leverages the company's Oculus Quest 2 VR headsets to create a virtual workspace. Currently, Meta's workroom service is free, but to get the full benefit, participants need to have the headsets, which cost $299 each.

Instead of staring at colleagues’ video headshots on Zoom, participants in a Horizon Workrooms meeting view their coworkers’ avatars around a virtual conference table, a configuration that demands less one-on-one eye contact. Avatars can move around and share virtual whiteboards, mimicking a common tool used for planning, ideation, and product design in physical meetings.

Reviews suggest that Meta's VR workroom will face barriers to adoption. The headsets are heavy, and extended use can cause discomfort. But these issues should be ameliorated through technological innovation. For example, the Quest 2 headsets weigh 12% less than their predecessor, following the trajectory of laptops and other electronic devices that have become lighter over time.

In the wake of Facebook's name change to Meta, the company's emphasis on virtual reality and the metaverse has been subject to derision not unlike that directed at older platforms such as Minecraft and Second Life. But those who dismiss VR may not fully appreciate how technologies move from obscurity to household names. Technologies that seemed solely fit for recreation have become mainstream. Consider, for example, how Robinhood attracts younger investors by "gamifying" the task of portfolio management. iPads and other tablets originally targeted at home users are now commonly used for point-of-sale applications. With heavy investment from Meta, Microsoft, and others, there is reason to think that VR will enter the mainstream later in the 2020s.

Policy Implications

The idea that telecommuting has environmental benefits goes back to at least 1979 when economist Frank Schiff wrote in The Washington Post about the possibility of remote work reducing gasoline consumption, congestion, and air pollution.

Now, with VR joining a stable of older enabling technologies, remote work appears to be on its way to becoming commonplace with or without policy change.

The San Francisco Bay Area's Metropolitan Transportation Commission (MTC) considered mandating that large employers keep at least 60% of their employees at home by 2035. The mandate would have applied to all businesses with 25 or more employees in jobs eligible for remote work. But the idea was criticized by transit advocates concerned about ridership impacts. Ultimately, MTC watered down the mandate, instead recommending that employers implement trip reduction programs "to shift auto commuters to any combination of telecommuting, transit, walking, and/or bicycling."

But, from a climate perspective, mass transit travel is not a full substitute for telecommuting. Even trips completed on fully electrified transit modes are not fully green. The Bay Area’s utility, Pacific Gas and Electric, generates one-third of its electricity from renewable sources and hopes to reach 60% from renewables by 2030. But this means a sizable proportion of electricity is, and will continue to be, derived from burning fossil fuels.

In any event, the Bay Area could reach the now discarded remote work target without a mandate. In July, the Bay Area Council found that 68% of the employers it surveyed expected a typical employee to go into the office three days or less post-pandemic. A Bay Area News Group poll found “that 70% of those able to work from home now want to stay out of the office most, if not all, of the time once the pandemic is over.”

So, at least in one large metropolitan area, a transition to remote work appears to be inevitable without government funding or encouragement. That said, government policies that could slow this climate-friendly development deserve more careful scrutiny. These policies include explicit and implicit subsidies for physical travel. For example, fares covered less than 10% of the cost of operating public transit in Santa Clara County before the pandemic. If this 90% travel subsidy were reduced or focused just on students and low-income passengers, more white-collar employees and their employers might opt for remote work.
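To give a sense of the scale of that subsidy, here is a rough illustration. The cost and fare figures are assumptions chosen only to match the roughly 10% farebox recovery mentioned above; they are not data from this article.

```python
# Rough illustration of the fare-subsidy point above. The per-boarding cost
# and fare figures are assumptions for illustration, not figures from the article.

operating_cost_per_boarding = 10.00  # dollars (assumed, for a high-cost system)
average_fare_paid = 1.00             # dollars (assumed, ~10% farebox recovery)

subsidy_per_boarding = operating_cost_per_boarding - average_fare_paid
recovery_ratio = average_fare_paid / operating_cost_per_boarding

print(f"Subsidy per boarding: ${subsidy_per_boarding:.2f} "
      f"(farebox recovery {recovery_ratio:.0%})")
# On these assumptions a daily round-trip commuter receives roughly $18 per
# workday, or around $4,500 per year, in operating subsidy alone.
```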

On the other hand, policies that reduce the cost of accessing broadband networks can make it less expensive to work from home while taking advantage of advanced technology like VR. That said, direct broadband subsidies in recent federal legislation could result in waste. Encouraging greater competition among private providers is a more fiscally sustainable approach.

But, as long as governments do not get in the way, we can expect the megatrend of remote work to continue—yielding unexpected climate change-related dividends with minimal costs to taxpayers. The further development of VR and other advanced technologies promises to accelerate this welcome trend.

A primer on carbon taxes
https://reason.org/policy-brief/a-primer-on-carbon-taxes/ (Thu, 28 Oct 2021)

Executive Summary

Carbon taxes are again being discussed in the United States as a means of reducing emissions of carbon dioxide and other greenhouse gases (GHGs). Three main arguments are proffered in support of carbon taxes, either alone or in combination:

  1. That by setting a price on greenhouse emissions equal to the “social cost of carbon,” a carbon tax would optimally reduce GHG emissions.
  2. That replacing existing regulations, subsidies, and tax expenditures with a carbon tax would more cost-effectively achieve emissions-reduction goals.
  3. That a revenue-neutral carbon tax would be economically beneficial.

These arguments are found to be wanting.

First, in theory, a carbon tax set at the "social cost of carbon" would lead to an optimal rate of greenhouse gas emissions. However, the "social cost of carbon" is highly uncertain. The current U.S. administration has chosen to use estimates of the "social cost of carbon" developed during the Obama administration, which would be in the region of $53 per metric ton of "carbon dioxide equivalent" emissions. This is likely significantly higher than the optimal rate.

A carbon tax applied with no offsetting reductions in other taxes or changes in regulations would increase the cost of goods and services. Energy and energy-related goods would be especially hard hit. A tax of around $50 per ton would raise natural gas prices by about 40% and gasoline prices by about 15% above recent levels. This would reduce economic growth by as much as 0.2% and also reduce employment. Even taking into account reductions in damage associated with GHG emissions, applying a carbon tax at a rate of $53 per ton would most likely cause net economic harm.
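As a back-of-envelope check on the price effects just cited, the sketch below applies a $50-per-ton tax to approximate fuel emission factors. The emission factors are standard approximations (on the order of 8.9 kg of CO2 per gallon of gasoline and 53 kg per thousand cubic feet of natural gas), and the "recent" price levels are illustrative assumptions rather than figures from the brief.

```python
# Back-of-envelope check of the price effects cited above. Emission factors
# are approximate defaults; the "recent" price levels are assumptions.

carbon_tax = 50.0                 # dollars per metric ton of CO2

gasoline_co2_per_gallon = 0.0089  # metric tons CO2 per gallon (~8.9 kg)
gasoline_price = 3.00             # dollars per gallon (assumed)

natgas_co2_per_mcf = 0.053        # metric tons CO2 per thousand cubic feet (~53 kg)
natgas_price = 6.60               # dollars per Mcf delivered (assumed)

gas_increase = carbon_tax * gasoline_co2_per_gallon
ng_increase = carbon_tax * natgas_co2_per_mcf

print(f"Gasoline: +${gas_increase:.2f}/gal (~{gas_increase / gasoline_price:.0%})")
print(f"Natural gas: +${ng_increase:.2f}/Mcf (~{ng_increase / natgas_price:.0%})")
# With these assumptions: roughly +$0.45/gal (~15%) and +$2.65/Mcf (~40%),
# consistent with the percentages quoted above.
```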

Second, numerous existing regulations, subsidies, and tax expenditures currently aim to reduce greenhouse gas emissions, including the Renewable Fuel Standard, vehicle fuel economy and GHG emission standards, renewable portfolio standards, and tax credits for renewable energy and low-emission vehicles.

These regulations, subsidies, and tax expenditures cost hundreds of billions of dollars but do relatively little to reduce emissions. Replacing them all with a carbon tax applied at a uniform rate would in principle be both much less costly and more effective as a means of incentivizing reductions in emissions.

In practice, the likelihood of such a “grand bargain” being successfully implemented is extremely low because powerful, concentrated special interests who currently benefit from the regulations, subsidies, and tax expenditures would lobby heavily to maintain them.

Third, a revenue-neutral carbon tax, achieved by reducing either corporate income tax or the payroll tax or both, could have net benefits even if existing policies aimed at reducing carbon emissions were not repealed. However, it is unlikely that a carbon tax would be implemented in a truly revenue-neutral manner.

Many proposals from Congress and the Biden administration would pay for new programs with carbon tax revenues, suggesting that there would be pressure to increase such revenue. This is precisely what happened in British Columbia, where an initially revenue-neutral carbon tax has gradually been increased. Elsewhere in the world, carbon taxes have almost invariably been used to raise revenue.

Introduction

Governments across the world, including the U.S. federal government and many state governments, have sought to regulate emissions of greenhouse gases (GHGs) through a vast array of regulations and subsidies. Several of these have been controversial. For example, federal mandates and subsidies to promote the production and use of ethanol as a fuel have been criticized by both economists and environmentalists.

Meanwhile, vehicle and appliance energy efficiency standards became a cause célèbre in the recent U.S. presidential election.

In the face of such controversies, many economists and policy advocates (on both the political left and right) have argued that a carbon tax would be a more efficient policy to reduce GHG emissions than these regulations and subsidies. There are broadly three arguments made in favor of introducing a carbon tax.

First, from an economic perspective, it is viewed as an efficient way to reduce GHG emissions, thereby internalizing the “social cost” of those emissions.

Second, regardless of whether a carbon tax is desirable per se, it is widely viewed as being superior to existing regulations and subsidies.

Third, even if a carbon tax were introduced on top of existing regulations and subsidies, some argue that it would have net economic benefits if it were implemented revenue-neutrally.

This brief considers the arguments for and against such a tax. It is organized as follows:

• Part 2 describes and evaluates the merits and drawbacks of the social cost argument.

• Part 3 discusses the economic effects of introducing a carbon tax without any other changes in tax, subsidy, or regulatory policy. The aim is to describe the effects of introducing a carbon tax on top of existing policies.

• Part 4 describes and critically evaluates the “grand bargain” argument, whereby a carbon tax is introduced as a replacement for existing regulations, subsidies, and tax expenditures that are aimed at reducing GHG emissions.

• Part 5 assesses the possibility and consequences of introducing a revenue-neutral carbon tax, that is to say combining the introduction of a carbon tax with offsetting reductions in other taxes so that net revenue remains constant.

• Part 6 offers some concluding remarks.

Excerpt of the Policy Brief’s Conclusions

This brief has explored the main arguments put forward in support of introducing a carbon tax.

Part 2 considered the argument that a carbon tax is justified on the grounds that carbon emissions impose a net external cost on society. While that may be true, the scale of those external costs remains uncertain. In determining an appropriate price for carbon emissions, the current U.S. administration uses "social cost of carbon" estimates of approximately $53 per metric ton of CO2-e, developed during the Obama administration.

Part 3 explored the economic implications of applying a carbon tax at about that rate. Such a tax would significantly increase the cost of energy and energy-related goods. Studies show that, in the absence of any other changes to taxes, subsidies, or regulations, a carbon tax of around $50 per metric ton would cause U.S. GDP to fall by about 0.4%, lead to hundreds of thousands of lost jobs, and cause incomes to fall across the board, perhaps especially among those already on lower incomes.

While a carbon tax on its own would undoubtedly cause economic harm (notwithstanding any environmental benefits that might arise), it would likely be far less harmful than the many regulations and subsidies currently implemented to reduce carbon emissions. It would also likely be more effective than those policies in reducing emissions. So, in principle, a “grand bargain” in which a carbon tax was introduced in return for eliminating all those more harmful policies would have merit.

Unfortunately, as discussed in Part 4, the existing regulations and subsidies have created sets of concentrated beneficiaries, while the harms they cause are dispersed among the wider population. As such, any attempt to reform these policies is likely to be met with fierce and well-funded opposition.

Part 5 noted that a revenue-neutral carbon tax, achieved either by reducing (possibly even eliminating) corporate income tax or by reducing the payroll tax, could have net benefits even if existing policies aimed at reducing carbon emissions were not repealed. However, it seems unlikely that a carbon tax would be implemented in a truly revenue-neutral manner. Even if such a tax were initially close to revenue-neutral, numerous pressures would almost inevitably lead to it being increased at a rate such that it would generate additional net tax revenues.

Given the economic harm that would be caused by a carbon tax, and since most if not all the benefits from either a grand bargain or a revenue-neutral carbon tax would be generated by the reduction in other taxes, regulations, and subsidies, it would seem preferable for governments to reduce those taxes (or, at least not increase them), and remove those regulations and subsidies without imposing a carbon tax.

Full Policy Brief: A Primer on Carbon Taxes

Full Study: Evidence-Based Policies to Slow Climate Change

Proposed electric vehicles tax credit prioritizes labor unions over carbon reduction goals
https://reason.org/commentary/proposed-electric-vehicles-tax-credit-prioritizes-labor-unions-over-carbon-reduction-goals/ (Wed, 27 Oct 2021)

Whenever politicians win, they like to use the cliche that elections have consequences. When Democrats won control of the House and the Senate last year, a variety of policies aimed at addressing climate change were expected. But an electric vehicle provision in Congress' evolving $1.5 to $2 trillion reconciliation bill would undermine their climate change-related efforts in order to provide an unprecedented gift to the United Auto Workers union. The provision, in Section 136401 of the initial House reconciliation bill, would create a new federal tax credit of $4,500 for buyers of some electric vehicles—but only if they purchase electric vehicles assembled by union labor.

CNET reports:

This bill adds $4,500 to the current $7,500 tax credit available for a total of $12,500 potentially available to EV buyers. As the bill stands today an EV must be assembled in the US and with union labor, which would disqualify nonunion automakers such as Tesla and Toyota. In addition, it must use a US-built battery to qualify for the full $12,500 incentive. The bill is part of the broader Democratic-backed budget plan, though it continues to face major hurdles. It’s unclear if this provision will stick as President Biden and others in his party try to work out a compromise on the spending plan.

Back in the 1960s, when Detroit's Big Three automakers (Chrysler, Ford, and General Motors) assembled nearly all of the cars sold in America, the term auto worker was synonymous with being a United Auto Workers member. By the 1980s, that began to change, as Honda, Nissan, and Toyota introduced mostly better-built cars that became widely popular with drivers. Threatened with new tariffs, however, the Japanese car companies, along with BMW, Mercedes-Benz, Volkswagen, and others, began building assembly plants in the United States. But, knowing full well the high costs, restrictive work rules, and strike potential at auto plants located in Michigan, these carmakers located nearly all of their new plants in Southeastern states, such as South Carolina and Tennessee. Unlike Michigan, these are "right-to-work" states, with laws that make it harder to form unions.

Today, only two of the 50 electric vehicles that would otherwise qualify for the new tax credit are assembled by union workers (at Ford and GM plants). The rest would become considerably more costly to consumers because they would not receive the bonus tax credit.

Every major auto company has more electric vehicle (EV) models in its pipeline, but even U.S. auto companies don’t plan to build most of them in Michigan. Both Ford and GM have announced plans for new EV production facilities in Tennessee, and the non-U.S. companies plan to keep building their vehicles (including EVs) in right-to-work states.

On Sept. 30, 12 of the largest non-U.S. auto companies sent a joint letter to House Speaker Nancy Pelosi (D-CA) and senior House members arguing that this tax credit measure would discriminate against their 131,000 American auto workers at 500 facilities in 36 states. They noted that their companies currently account for 55% of all new vehicle registrations each year.

Cody Lusk, CEO of the American International Automobile Dealers Association, told Politico, “If I want to buy a Volvo made in the US, I wouldn’t get the same benefits as someone who buys from GM in Michigan. The $4,500 credit is a huge amount and makes any non-union vehicles in a market non-competitive.”

That may be an exaggeration, but the proposed tax credit would obviously reduce the sales of electric vehicles made by every company except for the Big Three models assembled in a handful of states. Policymakers hoping to increase the number of electric vehicles on the road should worry about the unintended consequences of this tax credit proposal. For example, if electric vehicle sales by all of the popular carmakers that would be ineligible for the $4,500 tax credit are cut in half in the coming decades because their EVs look less competitive, by 2050 the overall fraction of electric vehicles in the U.S. personal vehicle fleet could be only 25%, rather than the projected 50%.
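The rough arithmetic behind that illustration looks like this. The baseline projection, the split between credit-eligible and ineligible EV sales, and the assumed sales cut are illustrative assumptions drawn from the framing above, not modeling results.

```python
# Rough arithmetic behind the illustration above. The baseline projection and
# the sales split between credit-eligible and ineligible EVs are assumptions.

baseline_2050_ev_share = 0.50        # projected EV share of the personal fleet (assumed)
ineligible_share_of_ev_sales = 0.96  # ~48 of 50 current models lack union assembly (approx.)
sales_cut_for_ineligible = 0.50      # suppose ineligible models' sales are halved

eligible_part = baseline_2050_ev_share * (1 - ineligible_share_of_ev_sales)
ineligible_part = (baseline_2050_ev_share * ineligible_share_of_ev_sales
                   * (1 - sales_cut_for_ineligible))

new_share = eligible_part + ineligible_part
print(f"EV fleet share falls from {baseline_2050_ev_share:.0%} to about {new_share:.0%}")
# With these assumptions the 2050 share drops from 50% to roughly 26%, in line
# with the ~25% figure used above.
```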

A major emphasis of the Democratic Party's agenda in Congress right now is implementing policies intended to reduce America's carbon footprint. The transition to electric vehicles over the next 30 years will be a major element of reaching that goal. Yet, this tax credit prioritizes labor union membership ahead of reducing the country's carbon footprint. By trying to push buyers to union-assembled cars, Congress risks slowing the shift to electric vehicles and undercutting the carbon reduction goals it is trying to achieve.

Evidence-based policies to slow climate change
https://reason.org/policy-study/evidence-based-policies-to-slow-climate-change/ (Tue, 26 Oct 2021)
Top-down policy approaches to control emissions may not be as effective as bottom-up approaches that harness the natural tendency of entrepreneurs and innovators.

Executive Summary

Human emissions of greenhouse gases (GHGs) are contributing to a rise in global average temperatures, with potentially significant effects on the climate. In response, governments around the world have introduced policies intended to reduce emissions of GHGs. Most of these policies are “top-down” and include mandatory restrictions on emissions, mandatory use of certain “low carbon” technologies, and subsidies to specific technologies.

This study finds that such top-down approaches to controlling GHG emissions may not be as effective as bottom-up approaches that harness the natural tendency of entrepreneurs and innovators to identify more efficient and cost-effective ways to produce goods and services.

The study identifies several key trends that suggest bottom-up approaches are already delivering results:

  • Energy use per dollar of gross domestic product (GDP) has been declining at a fairly constant rate in the U.S. for about a century.
  • Emissions of carbon dioxide (CO2) per dollar of GDP have been falling faster than the rate of decline in energy use for the past half century, both in the U.S. and globally.
  • Over the past 30 years, emissions of other GHGs per dollar of GDP have been falling faster than emissions of CO2 globally.

These trends are largely driven by improvements in efficiency and changes in the sources of energy, including a centuries-long shift toward more energy-dense, lower-carbon fuels. These improvements were mainly driven by market forces, not government intervention.

While continued improvements in energy efficiency may slow or even stop the growth in energy use, they are unlikely to lead to a reduction in energy use, let alone a reduction in CO2 emissions. As such, if reductions in carbon dioxide emissions are to occur, they will need to come primarily from a continued shift toward lower-carbon fuels.
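The logic of that last point can be made concrete with a simple Kaya-identity-style decomposition: emissions track GDP growth multiplied by changes in energy intensity and in the carbon intensity of energy. The growth and intensity rates in the sketch below are illustrative assumptions, not figures from this study.

```python
# Kaya-style illustration of why falling emissions intensity alone does not
# guarantee falling emissions. All rates below are illustrative assumptions.

years = 30
gdp_growth = 0.025                # annual real GDP growth (assumed)
energy_intensity_decline = 0.02   # annual fall in energy use per dollar of GDP (assumed)
carbon_intensity_decline = 0.01   # annual fall in CO2 per unit of energy (assumed)

emissions_index = 1.0
for _ in range(years):
    emissions_index *= ((1 + gdp_growth)
                        * (1 - energy_intensity_decline)
                        * (1 - carbon_intensity_decline))

print(f"Emissions after {years} years: {emissions_index:.2f}x today's level")
# With these rates, emissions fall only about 15% in 30 years; cutting them
# substantially requires the carbon intensity of energy to fall much faster,
# i.e. a shift toward lower-carbon fuels, as argued above.
```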

Currently, about 90% of the world's energy and 80% of U.S. energy are supplied by carbon-based fuels. Numerous lower-carbon energy sources are currently available and are able to supply some portion of current energy demand cost-effectively. Unfortunately, however, attempts to shift largely or exclusively to zero-carbon fuels in the short term are likely to be prohibitively costly.

  • Hydropower can only be cost-effectively produced in locations that are geologically suitable.
  • Geothermal energy can be cost-effective in certain locations and applications.
  • Solar and wind power can be cost-effective in a relatively wide range of locations and applications but cannot be relied upon by themselves to supply power because the sun only shines for an average of 12 hours per day and the wind does not blow continuously. For these intermittent power sources to form a significant proportion of energy supply, storage (such as batteries) or back-up generation will be needed.
  • Battery storage is currently not cost-competitive with natural gas as a source of back-up power for renewable energy systems.
  • Nuclear power remains an important source of energy in the U.S., but the cost of a new nuclear power plant is more than twice the cost of a new natural gas-fired power plant per unit of energy generated.

For low- and especially zero-carbon energy to become the dominant source of power in the U.S. and globally, continued innovation is key. The question is, which policies are most appropriate to drive such innovation?

In some cases, innovations may derive laterally from innovations in other technologies. For example, large-scale battery storage technology has already benefited from dramatic improvements in lithium-ion batteries initially developed for laptops and other small consumer electronics. Likewise, geothermal energy generation is already benefiting from innovations developed to enable the extraction of oil and natural gas from shale formations.

Because many factors are geographically specific, the optimal combination of lower-carbon technologies will vary significantly from place to place. It will also change over time as innovation drives down costs. So, policymakers should avoid one-size-fits-all, top-down approaches and instead look at ways to encourage innovation and implementation from the bottom up—both in general and specifically in energy markets.

Some of the most important factors affecting innovation in general are:

  • Competition, both in general and specifically in capital markets;
  • Flexible labor markets;
  • Low personal and corporate taxes; and
  • Streamlined, cost-effective regulation.

Meanwhile, governments could improve the prospects for low-carbon energy generation specifically by taking actions to:

  • De-monopolize electricity markets;
  • Remove trade barriers in the energy sector (both exports and imports);
  • Reduce subsidies and tax expenditures for energy and energy-related technologies;
  • Streamline permitting for all forms of energy generation, including nuclear; and
  • Eliminate arbitrary, technology-specific energy mandates.

Innovation has the potential dramatically to reduce carbon emissions over the course of the next half-century. Indeed, if the United States were to adopt the pro-innovation approach outlined here, U.S. GHG emissions could fall to zero, or close to zero, by about 2060. Globally, it could take a little longer, but with a concerted effort to remove barriers to innovation, greenhouse gas emissions could approach zero in the last two decades of the century without any need for explicit restrictions on CO2 or other greenhouse gases.

Introduction

Human emissions of greenhouse gases (GHGs) are contributing to a rise in global average temperatures. Concerns about the effects of increased temperature stemming from further increases in atmospheric GHG concentration have led governments around the world to implement policies that aim to reduce GHG emissions.

Unfortunately, many of the policies so far implemented have done little to reduce GHG emissions or reduce the risk of future temperature increases, at enormous cost. The Renewable Fuels Standard, discussed in Part 7 of this study, is an extreme example, but there are many others. Going forward, it is important to identify cost-effective policies to reduce GHG emissions and thereby slow the rate of climate change.

This study examines and explains the mechanisms underpinning reductions in GHG emissions and describes a set of policy changes that would achieve such reductions cost-effectively. It begins in Part 2 with a simple description of the relationship between economic activity, GHG emissions, and global warming.

Part 3 delves more deeply into the changing relationship between economic activity and emissions and offers a hypothetical projection of future emissions based on this changing relationship.

Parts 4 and 5 consider the role of energy density and dematerialization as explanations for the changing relationship between output and emissions.

Then, Part 6 assesses various factors that underpin both increasing energy density and dematerialization.

Part 7 evaluates the prospects for increasing energy efficiency both in general and through targeted policies.

Part 8 identifies technologies and policies that might lead to lower-carbon energy generation.

Finally, Part 9 draws together the several strands of policies discussed throughout the paper and offers conclusions.

Full Study: Evidence-Based Policies to Slow Climate Change

The Limited Role Transit Can Play In the Bay Area's Climate Change Strategies
https://reason.org/commentary/the-limited-role-transit-can-play-in-the-bay-areas-climate-change-strategies/ (Wed, 04 Aug 2021)
Spending billions of dollars to replace a relatively small number of car trips is not a cost-effective approach to combating climate change.

Many Bay Area residents are deeply concerned about climate change and actively support local transit projects to reduce greenhouse gas emissions. But building more mass transit infrastructure locally often fails to move the needle very much. Rather than build expensive new transit infrastructure, Bay Area innovators should be applying technologies to make off-peak transit on existing lines and working from home more convenient.

Well before the COVID-19 pandemic, the Bay Area’s mass transit ridership was declining. This means that the costly transit projects now in the works or on the drawing board are likely to have a much smaller effect on the climate than previously expected.

A Santa Clara Civil Grand Jury report found, “[The Santa Clara Valley Transportation Authority] VTA’s light rail system is one of the most expensive, heavily subsidized and least used light rail systems in the country.”

Yet the Valley Transportation Authority is continuing to move forward with a $468 million, 2.4-mile light rail extension from Eastridge to Milpitas. Given that average weekday ridership was less than 10,000 prior to the suspension of service, the more appropriate question is whether light rail service should be reduced rather than extended.

Similarly, the recently opened BART extension to Milpitas and Berryessa cost $2.3 billion, but, as the pandemic continues, the two new stations are each currently serving only around 400 exiting passengers daily. Extending service to downtown San Jose and Santa Clara is expected to cost $6.9 billion and take until 2030, while the estimate of 52,000 daily riders, made before the pandemic, seems unlikely to materialize.
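Dividing those capital costs by the ridership figures quoted above gives a rough sense of scale. This is purely illustrative: it ignores operating costs, financing, and the useful life of the assets, and it uses the article's own numbers as given.

```python
# Quick ratios using the figures quoted above: capital cost divided by the
# ridership numbers given in the article. Purely illustrative.

berryessa_cost, berryessa_daily_exits = 2.3e9, 2 * 400    # two stations, ~400 exits each
san_jose_cost, san_jose_projected_riders = 6.9e9, 52_000  # pre-pandemic projection

print(f"Berryessa extension: ~${berryessa_cost / berryessa_daily_exits:,.0f} "
      "per current daily exiting passenger")
print(f"San Jose extension: ~${san_jose_cost / san_jose_projected_riders:,.0f} "
      "per projected daily rider")
# Roughly $2.9 million and $133,000 of capital per daily rider respectively,
# before a single operating dollar is spent.
```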

Meanwhile, San Francisco is spending $1.6 billion on its repeatedly delayed Central Subway and $346 million for a two-mile Bus Rapid Transit line on Van Ness Avenue, which is now three years behind schedule. Collectively, these projects will put no more than a tiny dent into the 940,000 private automobile trips that occur in the city of San Francisco on an average weekday.

Spending billions of dollars to replace a relatively small number of car trips is not a cost-effective approach to combating climate change, but the Bay Area has other powerful tools in its shed. High-speed internet, cloud computing, online meeting tools and project collaboration software are reducing the need for many professionals to drive to offices or fly to conferences.

Bay Area companies have contributed to these innovations and are taking the lead in reconceptualizing the workplace to support hybrid or fully remote work. And these new technologies can be implemented around the world, reducing the need for commuting and in-person meetings globally.

We can replace millions of car trips by making it cheaper and easier for employees to collaborate with colleagues from home or from co-working spaces within walking distance from home. This means making technology improvements such as higher transmission speeds, increased network reliability and better software tools for collaboration.

Transit innovations could also contribute. If transit agencies could implement driverless buses and trains, they could affordably maintain five-minute headways throughout the day rather than just at peak rush hours. As more employees return to their offices, they could begin coming in on a staggered basis without having to worry about long waits on the platform.

To the extent that Bay Area innovators can leverage autonomous vehicle technologies to implement driverless transit at low cost, they can benefit not only VTA, BART and SF Muni, but other transit systems around the country and the world.

Home to only 0.1 percent of the world's population, the Bay Area can contribute only so much to climate change alleviation through local mass transit initiatives. Rather than build expensive new transit infrastructure, the Bay Area should find more cost-effective solutions.

A version of the column previously appeared in the Mercury News.

High-Speed Rail Is Unlikely to Play a Major Role In Achieving Climate Goals
https://reason.org/commentary/high-speed-rail-is-unlikely-to-play-a-major-role-in-achieving-climate-goals/ (Tue, 23 Mar 2021)

Advocates of high-speed rail projects sometimes argue that high-speed rail would help reduce emissions and fight climate change. However, the construction timelines, costs, and travel patterns of typical high-speed rail projects make that unlikely in the United States.

In terms of timelines, the obvious example in the U.S. is California's high-speed rail system, which was approved by the state's voters in 2008. As of this writing in 2021, there's no sign of the trains. The latest estimates suggest that even a small section of the rail system will not begin service until 2028 at the earliest. While some rail proponents cite China's speedy completion of high-speed rail projects as a model to follow, that nation does not respect property and labor rights. Thus, China's rail construction costs and timelines aren't comparable to those in the United States.

In a 2010 University of California—Berkeley study, professors Mikhail Chester and Arpad Horvath estimated that the entire California high-speed rail project would generate 9.7 million metric tons of carbon dioxide during construction. They also estimated that it would take high-speed rail 71 years of operation at medium occupancy to offset its own construction-related greenhouse-gas emissions.
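A quick way to see what that 71-year figure implies is to divide the construction emissions by the payback period. The division below uses the study's own numbers; it is a rough check, not a reproduction of the Chester and Horvath analysis, and the per-car comparison uses an approximate figure of about 4.6 metric tons of CO2 per typical gasoline car per year.

```python
# Rough check of the payback figure quoted above, using the study's own
# construction-emissions estimate. A simple division, not a reproduction of
# the underlying analysis.

construction_emissions_t = 9.7e6   # metric tons CO2 from construction (per the study)
payback_years = 71                 # years to offset at medium occupancy (per the study)

implied_annual_offset = construction_emissions_t / payback_years
print(f"Implied net emissions avoided: ~{implied_annual_offset:,.0f} t CO2 per year")
# About 137,000 t CO2 per year, roughly the annual tailpipe emissions of some
# 30,000 typical gasoline cars (at ~4.6 t CO2 per car per year, an approximate figure).
```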

Building high-speed rail systems requires steel and concrete, the manufacturing of which typically generates greenhouse gases. Trucks, bulldozers, and other construction site equipment also consume energy. Thus, during their long construction phases, high-speed rail projects add greenhouse gases. Adding lanes to existing highways also generates greenhouse gases, but to the extent that recycled asphalt is used for road paving, climate impacts can be somewhat reduced.

There are far quicker, more cost-effective ways to reduce greenhouse gas emissions than high-speed rail. By the time high-speed rail projects commence service, more cars will be fully electric, so future high-speed rail systems would be replacing fewer gasoline-powered automobile trips than they would've been replacing decades ago. California, for example, plans to end sales of new gasoline-powered cars by 2035. Similar bans are being implemented in Canada and the United Kingdom. Given the California rail project's delays and the carbon reductions being achieved by new technology, like electric vehicles, it is possible that, if built, the rail system will never pay back the carbon investment required to build it.

High-speed rail projects are also very costly. California's high-speed rail project, which was estimated to cost $33 billion when presented to voters in 2008, is now estimated to cost about $100 billion to eventually connect San Francisco to Anaheim. If the California high-speed rail is completed on its current budget, the cost would be approximately $192 million per mile. This compares to about $10 million for a new mile of an interstate highway or $4 million per mile to widen an existing highway.
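The per-mile figure can be sanity-checked by simple division. The roughly 520-mile route length below is an assumption based on the project's San Francisco-to-Anaheim Phase 1 alignment; the cost figures are the article's.

```python
# Sanity check of the per-mile figure quoted above. The ~520-mile route length
# is an assumption; the cost figures come from the article.

total_cost = 100e9   # dollars, current estimate per the article
route_miles = 520    # approximate San Francisco-to-Anaheim alignment (assumed)

cost_per_mile = total_cost / route_miles
print(f"High-speed rail: ~${cost_per_mile / 1e6:,.0f} million per mile")
print(f"That is ~{cost_per_mile / 10e6:,.0f}x the article's $10M figure for a new interstate mile")
# Roughly $192 million per mile, i.e. about 19x the article's figure for a new
# interstate mile and about 48x its figure for widening an existing highway.
```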

Costs are high in other parts of the country as well. The Texas project connecting Dallas and Houston is estimated to cost $30 billion. A 2012 plan to convert Amtrak’s Northeast corridor to true high-speed service (220 miles per hour) said it would cost $118 billion.

High-speed rail’s ridership is also uncertain at best, in part, because high-speed rail rarely serves a traveler’s point of origin and destination directly. Rail typically requires connecting services. If these connecting services are inconvenient, many potential rail riders may choose to drive or choose a mode of transportation that offers more convenience or a shorter travel time.

High-speed rail best connects riders in large central cities. However, most U.S. cities are dispersed, with the majority of the population living in suburbs. Thus, in most major urban areas that would consider high-speed rail, suburban customers would either need to take another form of mass transit or drive to get to the high-speed rail station, further reducing any environmental benefits of the high-speed rail system itself.

Currently, and in the short-to-mid-term future, travel, work, and leisure habits are going to be changed by the COVID-19 pandemic. When the pandemic is behind us, there may be permanent changes in working from home and commuting patterns, and a permanent reduction in intercity travel. As such, there are numerous reasons for taxpayers and policymakers to be wary of high-speed rail's potential to fight climate change or replace automobile and air travel in cost-effective ways.

Would a Green Fiscal Stimulus Help the Environment and the Economy?
https://reason.org/policy-brief/would-a-green-fiscal-stimulus-help-the-environment-and-the-economy/ (Thu, 28 Jan 2021)
This policy brief considers the main "green recovery" proposals and evaluates whether they would achieve their stated objectives.

Introduction

The COVID-19 pandemic and associated shutdowns and stay-at-home orders have dealt a significant blow to the United States economy. U.S. gross domestic product (GDP) fell at an annualized rate of 30 percent during the second quarter of 2020, and unemployment rose above 14 percent in April but has been declining since then; at the end of October it was 6.9 percent. In response, the federal government has passed numerous bills intended to stimulate the economy.
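For readers unfamiliar with the "annualized rate" convention, the sketch below shows how a one-quarter contraction is converted into an annualized figure. The roughly 9% quarterly decline used here is an approximation of the official Q2 2020 number; the point is the arithmetic, not the exact value.

```python
# How a quarterly GDP drop becomes an "annualized rate." The ~9% quarterly
# contraction is an approximation of the official Q2 2020 figure.

quarterly_change = -0.09   # approximate quarter-over-quarter change in Q2 2020

annualized_rate = (1 + quarterly_change) ** 4 - 1
print(f"Annualized rate: {annualized_rate:.1%}")
# A ~9% quarterly fall annualizes to roughly -31%, which is why headline
# figures near -30% describe a quarter in which output actually fell by
# about a tenth.
```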

Between March and July of 2020, numerous policymakers and pundits published reports and proposals claiming that COVID-19 offers an opportunity to make the economy more equitable and environmentally friendly. Of particular note, now-President Joe Biden proposed a $2 trillion plan that aims simultaneously to stimulate the economy, achieve net-zero carbon emissions by 2050, and create “millions of good, union jobs.” Even a small group of Republican senators wrote in support of significant additional subsidies to renewable energy.

This policy brief considers the main "green recovery" proposals and evaluates whether they would achieve their stated objectives. Part 2 summarizes the reports and proposals. Part 3 analyzes a selection of the proposals and the claims that underpin them. Finally, Part 4 offers some conclusions.

Full Policy Brief: Would a Green Fiscal Stimulus Help the Environment and the Economy?

It Is Time for Environmentalism 3.0
https://reason.org/commentary/it-is-time-for-environmentalism-3-0/ (Thu, 28 Jan 2021)

The modern environmental movement, founded on the very best of principles—protecting people and the planet—is failing.

To put it bluntly, many of the policies promoted to address the most serious and high-profile environmental problems—climate change, declining biodiversity, oceanic dead zones, and tropical deforestation—simply aren't working. Worse, some of those policies have become politically divisive and, in more than a few cases, actually make the problems worse.

It’s time to reconsider what we’re doing.

For the sake of this discussion, we can view environmentalism as a kind of social operating system, akin to a computer’s operating system. Like computer operating systems, environmentalism has proceeded through a series of versions, which can be referred to as Env1.0, 2.0, etc. This essay describes and evaluates Env1.0, which was in operation from about 1900 to about 1970, and Env2.0, which has been in operation since 1970. It then lays out a vision for Env3.0.

Env1.0, that of Theodore Roosevelt and Gifford Pinchot (sometimes called conservationism), was grounded in the idea of a human-centered conservation of the environment. Env1.0 was primarily about making sure that humans didn’t over-consume natural resources that humanity might want to use later. The rallying cry of Env1.0 might have been, ‘Save the whales! We will want to eat them later.’ I joke, but this was actually a decent starting point for an environmental operating system because it was scaled to our then-understanding of the environment, both locally and globally. It also reflected what we had the ability to influence in any meaningful way at the time. Unfortunately, it didn’t work well enough and didn’t fulfill our evolving ideas about the environment. The passenger pigeon still went extinct.

Env2.0, founded by people such as Julia Hill, Rachel Carson, Paul Ehrlich, and David Suzuki, and now associated with rituals such as Earth Day, non-governmental organizations such as Greenpeace, Earth First!, and Sierra Club, marked a significant departure from Env1.0.

In Env2.0, nature was seen to have intrinsic value outside of its utility to humanity and to lie largely beyond the human power of rational management. Thus, the goal of Env2.0 wasn't to maintain and preserve the flow of environmental goods and services feeding human society but to wall off the environment from human society as much as possible. The rallying cry of Env2.0 might be caricatured as, 'The environment is more important than us, we can only bung it up, so leave it alone!'

But Env2.0 is also failing on its own stated terms. Nature is neither being conserved via Env1.0 nor has it been elevated above human needs and desires or left alone in its perfection via Env2.0.

While most of the drivers of Env2.0 were quite reasonable, it is failing—not because it was wrong to sound the alarm about environmental degradation and humanity’s role in that degradation. That was largely correct. Nor have environmentalists been wrong in pushing for environmental degradation remediation and greater ecological and human health protections. That was right as well. Even the fact that Env2.0 wasn’t particularly frugal in its methods (to put it lightly) isn’t grounds for calling it a failure. No, Env2.0 is failing for fundamental physical reasons. And by physical, I mean the fundamental cause-begets-effect, gravity-is-a-bitch kind of physical. Env2.0 is running afoul of the laws of both the physical world and intrinsic human nature. Here are three examples.

First, Env2.0 approaches generally ignore a defining characteristic of life: that living organisms respond to external stimuli. Put simply, human beings respond to incentives. Economics is the study of how humans respond to incentives. And all life is economic in one way or another. Paying attention to incentives when you’re looking at the movement of, say, glucose through an ecosystem but ignoring incentives when considering the flow of money in human ecosystems is a prescription for failure. Simply put, setting people against their own best interests and forcing them to pursue someone else’s interests makes them likely to refuse to go along with the plan.

Second, when Env2.0 concerns turn toward practical solutions, policy proponents seem to lose sight of how living systems actually respond to threats. Nature is dynamic and resilient, balancing the rigid and the flexible, continually evolving, but still highly efficient in ecological, engineering, and economic terms. But when it comes to regulation, environmental activists focus overwhelmingly on rigid prescriptions and inflexible regulation, attempting to pin humanity’s relationship to nature down to their specific vision of how an environmental utopia might appear, right down to setting the optimal numbers of polar bears that would constitute a “healthy ecosystem.” This approach has often been self-defeating because it fails to grasp how nature actually works, and it conflicts with dynamic number one: people respond to incentives, and people want things like prosperity, liberty, health, opportunity, children, housing, jobs, and other things that are not satisfied by the Env2.0 approach.

Despite the claims of proponents that we can have both Env2.0 and all other human desires met, the reality comes back to science fiction author Robert Heinlein’s universal law:  “There Ain’t No Such Thing as a Free Lunch.”

At times, Env2.0 has indeed appeared to embrace policies that are resilient, dynamic, and take account of human desires with some market-based measures, such as emission trading, tradeable quotas for fisheries, and pricing mechanisms such as carbon taxes, energy taxes, and other economically mediated solution pathways.

The usual exemplar of this approach (and nearly the only one that has been given a serious effort in the United States) sought to control air emissions that caused acid rain. Some Env2.0 advocates argued that pollution causing acid rain was addressed efficiently with a market mechanism and that ambient lead pollution, and a few other air pollutants, were also managed with market mechanisms. They usually assert that carbon taxes are a market mechanism as well. However, these arguments are peripherally true but centrally false. What most people think of as market-based environmental control systems may well be resilient, flexible, efficient, and decentralized, giving people a certain degree of latitude to express their preferences through choices in a “market.”

But environmental trading systems to date haven’t tapped into the critical part of what lets markets genuinely manifest the true cost of something: allowing individuals to assess the costs and benefits of trade without artificial constraints. Rather, in current environmental-trading practice, some third party makes the value assessments, imposes its preferred outcome in accordance with its view, and allows for only a subset of the range of choices individuals might make freely.

In other words, sulfur trading, lead trading, and carbon trading are pseudo-market systems, not genuine ones, and they do not embody the resilience that is missing from Env2.0. That is not to say that they did not achieve benefits: they most certainly did. However, they do not disprove the fact that Env2.0 was primarily a system of command-and-control, not one that embraced true market-driven or even market-based environmental policies.

A third reason why Env2.0 is failing is that it didn’t live up to its original mantra: think globally, act locally. Rather than allowing for a diversity of environmental protection approaches to flourish at municipal, state, or regional levels, the environmental movement quickly moved to centralize such regulations, first at the state level (in places like California), and then increasingly at the national level, and ultimately at the global level (through the United Nations).

Each step toward centralization was a step along a primrose path. Yes, it was a predictable and readily available approach—after all, governments have regulated for as long as we’ve had governments—but the further regulation moved from the needs of people at the local level, the less evenly its costs and benefits were allocated across different people and different communities. All of this led to declining public support for the various regulatory regimes, and to the subversion—overtly or covertly—of those regimes.

One example is the alienation of states that were heavily invested in coal production or power generation versus those states better endowed with lower-carbon sources of energy and economic growth. Another is the one-size-fits-few problem of designating broad-brush carbon targets that, by causing energy prices to rise, hit lower-income earners harder than higher-income earners.

That’s not to say there have not been important and notable environmental successes.

Nobody wants to return to the pollution levels of the 1960s and 1970s. I know I don’t—I’ve carried an asthma inhaler ever since collapsing from an asthma attack during high school physical education while running laps in California’s notoriously smoggy San Fernando Valley. And only a cretin would not celebrate the resurgence of whale populations, the protection of migratory bird populations and of our most iconic raptor species. The successes of Env2.0 make up a fairly long list.

But as I mentioned at the outset, the rigid compulsory/engineering approach of Env2.0 is increasingly running into dead ends. For example, we are continually failing to achieve emission targets, reduction deadlines, treaty obligations, measurable improvements to climate stability, biodiversity targets, and so on. National and international environmental protection targets have been missed far more often than they’ve been hit. Virtually all national targets for greenhouse gas emissions have been missed, and almost no serious people think current greenhouse gas control targets (net-zero by 2050) are in any way attainable, much less compatible with the living standards achieved in developed countries and desired around the world.

In the meantime, technological breakthroughs promised by Env2.0 (which were to have spontaneously arisen because, ironically, markets) have stubbornly failed to appear. Despite subsidies, electric vehicles still make up a tiny fraction of the automobile market. Wind and solar power are intermittent, as California was recently reminded, and, because of the need for scarce materials and redundant backup systems, renewables are still more expensive than fossil fuels. The long-promised super-batteries that would let people power their homes off the grid, and even power entire cities, states, and countries using wind and solar are nowhere to be seen. And now the public hears rhetoric that asserts a “climate emergency” that dispenses with the very idea that humans can sustain the quality of life they’ve worked so hard to achieve, instead cautioning us to prepare for a dystopian future while still futilely plugging away at the same failed policies that got us here.

So what’s the answer?

If at first you don’t succeed, revise and resurge. Now that Env2.0 is crashing painfully against the unyielding shores of reality, it’s time to turn toward approaches that work with fundamental human nature, economic incentives, fundamental ecological system characteristics, and fundamental physical reality by moving away from the static, non-economic, physically impossible approaches of Env2.0.

It’s time for Env3.0.

Doing this efficiently requires differentiating between two broad classes of environmental problems. One class consists of those environmental problems we know can be successfully managed with resilient and dynamic systems that accurately manifest, and efficiently distribute, property rights, including the right to property in and of your own body as well as to things outside of it. The overwhelming majority of environmental problems fit into this category, with the limits mostly being a matter of whether or not a given jurisdiction has a market-compatible policy environment. Local air pollution, local water pollution, local chemical exposure, local resource over-utilization, and local wildlife endangerment can be (and sometimes have been) managed through markets and the rule of law. And by “local,” I mean “within one geopolitical region,” which can be a city, a state, or even a country; such problems are virtually never transnational, trans-economic, or trans-environmental-boundary problems.

In 1991, Terry Anderson and Donald Leal published Free Market Environmentalism, an excellent book that summarized and expanded our understanding of why environmental problems exist and how incentives, markets, and property rights could help solve those problems. Their answer was not, as the prevailing orthodoxy of the time held, that humans were just selfish and thoughtless despoilers of the world that had to be regulated into submission to what the environmentalist elites demanded of them. Unfortunately, some Env2.0 aficionados who distrust people’s inclination to prioritize Env2.0’s particular view of what it means to have a healthy environment (often meaning one untainted by human hands) have resisted this property-based approach to environmental management and protection.

But sometimes we can’t use property rights and a legal framework for pollutants we really care about, pollutants that cross borders, that cannot be traced to a responsible party, that persist beyond the time when the original actors can be held responsible for remediating their harms. Greenhouse gas pollution is one of these problems. So is conventional air pollution that moves between jurisdictions, even at the continental level. Plastic pollution is another, and biodiversity (especially of migratory wildlife) is still another. Ocean mammal protection would be such a problem (save for some revolutionary technology that would allow for private management of animals like whales), as would the risk of polar bear extinction. And those critics of environmentalism based purely on property rights are correct: some environmental problems simply won’t yield to that framework.

That’s where a different approach from Env2.0 needs to be brought to bear. What is needed is a management approach that shifts away from viewing the goal of environmental protection as the meeting of fixed, often arbitrary and unattainable targets and timelines for the reduction of single sources of environmental concern (such as, say, greenhouse gases) toward a focus on building the overall resilience of our integrated, globally shared, social-ecological system that is actually compatible with those stubborn laws of physical reality and human behavior mentioned above. This is a framework that treats nature and the human economy as an integrated social-ecological system, of the sort described by economist Elinor Ostrom in Governing the Commons.

Before going further though, we need to define what we mean by “resilience,” which has, unfortunately, become something of a code word lately for “more government” in all sorts of realms—from infectious disease recovery to managing climate change to any number of other perceived social ills.

The kind of resilience we’re talking about here is not about preventing any or all changes to a social-ecological system, or imposing a pre-defined vision of exactly what that system is supposed to look like. Rather, it is about managing such systems in such a way that they can bounce back from disturbance and return to a desirable base state.

In the case of the United States, one might define that desirable base state as a society based on the principles of the Constitution and the other documents of America’s founding, as well as the strong environmental protection ethic that has come to permeate American sensibilities.

But what are the elements that make a social-ecological system (SES) resilient? More importantly, what elements of SES organization or management can make an SES less resilient, or more fragile?

Identifying the components of SES resilience was the goal of researchers David A. Kerner and J. Scott Thomas in “Resilience Attributes of Social-Ecological Systems: Framing Metrics for Management.” Kerner and Thomas break SES-resilience determinants into three broad categories: those that promote or compromise system stability, those that promote or compromise the system’s adaptive capacity, and those that promote or compromise the system’s readiness, which can be seen as the system’s speed and scope of responsiveness. Let’s consider a few examples.

Among the things that might compromise SES stability, Kerner and Thomas identify the presence of “single points of failure” that might cause the entire system to fail, along with:

  • System balance, or “The degree to which a system is not skewed toward one strength at the expense of others;”
  • And system dispersion: “The degree to which the system is distributed over space and time”.

Among the things that might compromise SES adaptive capacity, Kerner and Thomas identify:

  • System response diversity, or the ability to employ alternative components to withstand stresses;
  • Collaborative capacity, or the “potential of system managers to work cooperatively to ensure system function in a timely and flexible manner;”
  • Connectivity, or how readily a system can exchange resources and information internally and externally to ensure continued function in the face of existential threats;
  • And learning capacity.

Finally, Kerner and Thomas identify some of the system attributes that might compromise SES readiness, including:

  • The absence of simplicity or understandability;
  • The presence of False Subsidies that may do more harm than good;
  • And the presence or absence of autonomy, the degree to which “an organization, operation, or function can self-select alternate actions, configurations, and strategies to achieve the specific mission or function—essentially, control over its destiny.”

Astute readers will readily recognize that market systems or, as a second-best alternative, market-based environmental protection measures, will likely perform better on the various parameters of building SES-resilience than would either the conservation approaches of Env1.0 or the command-and-control approaches of Env2.0.

Summary

We should be concerned about the failure of Env2.0, which, through its lens of humanity as somehow outside of the environment, dismisses the physical and evolutionary laws that govern human behavior and produces an utterly dysfunctional environmental protection regime of political antagonism that fails at its self-proclaimed priority: protecting the environment.

What is needed now, 120 years since the advent of Env1.0 and 50 years after the advent of Env2.0, is a new system. We need an Env3.0 that recognizes that most environmental problems, especially localized problems, can be fixed most efficiently, and with respect for human needs, by genuine, bottom-up, market-based management systems.

Env3.0 would also move beyond the limitations of Env2.0 by managing systems not as either “human” systems or “environmental systems,” but as integrated social-ecological systems that require more integrated thinking and accommodation of different people’s needs and desires, with a focus on resilience, rather than static, arbitrarily defined targets, timelines, quantitative goals, and inflexibility.

The post It Is Time for Environmentalism 3.0 appeared first on Reason Foundation.

COVID-19’s Lasting Impact on Housing, Commuting and Climate Change https://reason.org/commentary/covid-19s-lasting-impact-on-housing-commuting-and-climate-change/ Wed, 20 Jan 2021 05:00:18 +0000 https://reason.org/?post_type=commentary&p=39665 The COVID-19 pandemic is forcing Californians and policymakers to adapt to accelerating trends that were already underway.

The post COVID-19’s Lasting Impact on Housing, Commuting and Climate Change appeared first on Reason Foundation.

The ongoing battle against the COVID-19 pandemic has been brutal. There have already been over 30,500 COVID-related deaths in California. In addition to managing this crisis and developing better strategies to get vaccines out more quickly, policymakers need to also examine the long-term impact the pandemic will have on important issues and vexing problems related to the economy, climate change, infrastructure, and homelessness.

Transportation accounts for about 40 percent of California’s greenhouse gas emissions. The pandemic has shown workers and companies that many of us can work from home and a lot of in-person meetings and conferences can be effectively replaced with group calls and webinars. The need to reduce in-person contact has dramatically accelerated the adoption of these technologies, which should continue to improve and evolve as demand increases.

As more employers shift to work-from-home policies, we could see a permanent reduction in commuting and business travel, thereby lowering carbon emissions from cars, buses, and planes. And the state can get many of these climate change benefits with little or no policy change. About the only thing policymakers need to do is encourage competition among internet providers, which would help workers and companies by reducing costs and increasing connection speeds.

Admittedly, there will be substantial changes and collateral damage from a large-scale permanent shift to remote work. Among others, office buildings will lose value, retailers and restaurants in business districts could be hurt, and demand for mass transit commuting options will fall.

But if a primary state goal is to minimize greenhouse gas emissions, policymakers should patiently watch these trends emerge and resist pressure from interest groups that will continue to urge the construction of costly rail, transit, and other pre-pandemic projects. Even operating a fully electrified rail system adds to greenhouse gas emissions to the extent that the electricity is still generated from fossil fuels. And travel patterns will continue to change over the long term. People who are able to work from home, for example, may choose different places to live and prioritize quality-of-life improvements if they are less concerned about commute times than they have ever been.

With less demand for office space, it is important to let developers convert commercial buildings to residential use. This is also the case with hotels and retail facilities that go out of business in the coming months and years. California already had an excess of retail space before the pandemic, and now, unfortunately, this surplus is growing as many stores and restaurants are permanently closing. Likewise, even before the pandemic, developers were converting shuttered malls to mixed-use communities, turning old office buildings into apartment buildings or hotels, and storefronts into live/workspaces.

Thankfully, much of the state’s underutilized commercial space can be converted to housing. And it can be done at lower costs than building homes from scratch, especially in high-cost areas in California’s major cities and along the coast. By rezoning more areas to residential use, or de-zoning them entirely, local governments can facilitate the transition of commercial space to housing rather than perpetuating commercial vacancies at the risk of blight.

Converted properties would add much-needed supply to California’s overly expensive housing market. More housing supply would help drive down prices and rents. California governments have already rented hotel rooms and acquired motels to accommodate homeless individuals during the pandemic and beyond, so this added supply could play a role in addressing the state’s massive homelessness problem.

The COVID-19 pandemic is forcing Californians and policymakers to adapt to accelerating trends that were already underway. Technology is allowing us to do more of our work, shopping, and playing from home. We’re all anxious to return to pre-pandemic activities but we can also take advantage of the transformations we’re experiencing to reduce greenhouse gas emissions and the state’s housing problems. And, importantly, we can obtain most of these benefits with relatively minor policy changes and without large public spending.

Hopefully, California’s political leadership will recognize how residential housing conversions and the greater use of telecommuting can help the state meet its long-term climate and housing goals. Rather than over-regulating these areas, policymakers should let innovation and adaptation create these changes in the public interest.

A version of this column previously appeared in the OC Register

The post COVID-19’s Lasting Impact on Housing, Commuting and Climate Change appeared first on Reason Foundation.

Fuel Economy Standards Hurt Consumers and the Economy https://reason.org/commentary/fuel-economy-standards-hurt-consumers-and-the-economy/ Tue, 10 Apr 2018 05:22:54 +0000 https://reason.org/?post_type=commentary&p=23271 A “mileage-based user fee” could ensure that drivers pay for the harm they generate – and would incentivize drivers to drive less when the harm they cause is greater.

The post Fuel Economy Standards Hurt Consumers and the Economy appeared first on Reason Foundation.

Last week, the Environmental Protection Agency announced it will not raise fuel economy and greenhouse gas emission standards for cars and light trucks by as much as the Obama administration had proposed. “The Obama administration’s determination was wrong,” said Scott Pruitt, head of the Environmental Protection Agency (EPA). The EPA’s decision is good news for consumers, the economy and environment.

Corporate average fuel economy (CAFE) standards were first introduced by Congress in 1975, ostensibly to reduce US reliance on foreign oil. The standards have been tightened several times since then, most recently in 2016, when they were combined with standards intended to reduce greenhouse gas emissions. Those standards, which are binding, peak at 41 miles per gallon in 2021. The Obama administration also proposed raising the standards after 2021, suggesting that they should reach 54.5 miles per gallon by 2025. It is these latter, non-binding standards that now won’t be implemented.

Economists have long been skeptical of fuel economy standards because they are an inefficient means of reducing fuel consumption and emissions. The majority of U.S. consumers demonstrably prefer larger, more powerful vehicles, such as pickups and SUVs. But vehicle manufacturers continue to produce large numbers of small vehicles, which have higher fuel economy, in order to comply with fuel economy standards. As a result, manufacturers and dealers are forced to discount the prices of new cars, undermining their overall profitability.

In addition, fuel economy standards result in a “rebound” effect: when it costs less to drive each mile, people tend to drive more, which means more emissions. Moreover, by increasing the price of new vehicles, fuel economy standards incentivize consumers to keep older vehicles on the road longer. This is especially true for larger, more powerful “gas guzzlers.” Studies suggest these consumer responses reduce the potential fuel savings and emission reductions by 25 to 50 percent.
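
To see how the rebound effect eats into the expected savings, here is a minimal sketch of the arithmetic. The fuel-economy figures, annual mileage, and 10 percent rebound rate are invented for illustration; they are not taken from the studies cited above.

```python
# Illustrative rebound-effect arithmetic; all figures are invented for this sketch.
baseline_mpg = 25.0    # hypothetical fuel economy before the standard
improved_mpg = 35.0    # hypothetical fuel economy after the standard
annual_miles = 12_000  # miles driven per year before the standard
rebound = 0.10         # assumed 10% increase in driving as the cost per mile falls

fuel_before = annual_miles / baseline_mpg                        # gallons per year
fuel_no_rebound = annual_miles / improved_mpg                    # gallons if driving were unchanged
fuel_with_rebound = annual_miles * (1 + rebound) / improved_mpg  # gallons with extra driving

naive_savings = fuel_before - fuel_no_rebound
actual_savings = fuel_before - fuel_with_rebound
print(f"Share of expected savings lost to rebound: {1 - actual_savings / naive_savings:.0%}")
```

With these placeholder numbers, a 10 percent rebound alone wipes out about a quarter of the expected fuel savings, before counting the effect of keeping older vehicles on the road longer.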

In a recent study for Reason Foundation, Arthur Wardle and I conservatively estimated that implementing the proposed 2025 standards would reduce annual carbon dioxide emissions by 50 million tons at a cost of $50 billion. That puts the cost of CO2 reduced at an astronomical $1,000 per ton – about 50 times the “social cost of carbon” estimated by the EPA during the Obama administration.
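
The per-ton figure is simply the ratio of the two estimates quoted above; a quick check of the arithmetic:

```python
annual_cost = 50e9          # $50 billion per year (estimate cited above)
annual_tons_reduced = 50e6  # 50 million tons of CO2 per year (estimate cited above)
print(f"${annual_cost / annual_tons_reduced:,.0f} per ton of CO2")  # -> $1,000 per ton
```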

As I noted in another recent study, the social cost of carbon is likely less than $10 per ton (and could be zero). So, had the EPA proceeded with the Obama-era proposals, it would have imposed costs at least 100 times greater than the benefits. Those costs would have fallen disproportionately on poorer consumers.

In a study published last week, I calculated that a person purchasing a three-year-old pickup is already paying about $100 per year more for their vehicle and fuel than they would be in the absence of the new CAFE standards. As fuel standards become more onerous, which they will until at least 2021, the net cost is likely to increase. If the more ambitious Obama-era standards had been imposed, consumers could have been paying $500 per year more than necessary by 2025.

Other policies are better suited to address the pollution associated with vehicle use. Arguably the best policy would be to charge a fee per mile driven, varied according to the time and location of the driving, as well as the emissions from specific vehicles. Such a “mileage-based user fee” could ensure that drivers pay for the harm they generate – and would incentivize drivers to drive less when the harm they cause is greater.
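
As a rough sketch of how such a mileage-based fee might be structured, consider the toy schedule below. Every rate and adjustment factor is a hypothetical placeholder, chosen only to show the mechanics of varying the charge by time, location, and vehicle emissions; it is not a rate design proposed in any study.

```python
# Toy mileage-based user fee: a base per-mile rate adjusted for when and where
# the driving occurs and for the vehicle's emissions. All numbers are placeholders.
def mileage_fee(miles, peak_hours, urban, grams_co2_per_mile):
    base_rate = 0.02                              # dollars per mile
    time_factor = 1.5 if peak_hours else 1.0      # congestion surcharge at peak times
    place_factor = 1.3 if urban else 1.0          # higher local harms in dense areas
    emission_factor = grams_co2_per_mile / 400.0  # scaled to a reference vehicle
    return miles * base_rate * time_factor * place_factor * emission_factor

# A 30-mile peak-hour urban commute in a vehicle emitting 400 g CO2 per mile:
print(f"${mileage_fee(30, True, True, 400):.2f}")  # -> $1.17
```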

Even the existing gas tax is more efficient than fuel economy standards. The University of California, San Diego, economist Marc Jacobsen estimates that using fuel economy standards to reduce fuel use and emissions costs three to five times as much as a gas tax. In addition to harming consumers and the economy, fuel economy standards are bad for the environment. As noted, they encourage consumers to use their cars more than they otherwise would and to keep “gas guzzlers” on the road for longer. And because these effects drive up the cost of reducing emissions, they effectively waste resources that could be spent on other environmental amenities.

Unfortunately, Congress has tied the hands of agencies that set fuel economy standards, preventing them from rolling back existing standards that are already in force. It would take an act of Congress to scrap those standards.

EPA says it will produce new emission standards for 2022, and beyond, that are less onerous. Given that the existing standards already impose costs greater than the benefits, the best outcome would be for EPA not to increase the standards after 2021.

The post Fuel Economy Standards Hurt Consumers and the Economy appeared first on Reason Foundation.

The Effect Of Corporate Average Fuel Economy Standards On Consumers https://reason.org/policy-brief/the-effect-of-corporate-average-fuel-economy-standards-on-consumers/ Sun, 01 Apr 2018 04:00:52 +0000 https://reason.org/?post_type=policy-brief&p=23175 Fuel economy and greenhouse gas emissions standards for vehicles are a very inefficient way to address issues related to fuel consumption and emissions.

The post The Effect Of Corporate Average Fuel Economy Standards On Consumers appeared first on Reason Foundation.

Corporate Average Fuel Economy (CAFE) standards require manufacturers to meet minimum fuel economy requirements for their fleets of vehicles sold in the U.S. As a result, manufacturers adjust certain vehicle attributes in order to comply with these standards. Among the many vehicle attributes that a manufacturer may adjust are weight, power, and drivetrain. Such adjustments have consequences for the cost and performance of vehicles, which affects consumers.

In their assessment of the likely effects of CAFE standards, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) claim that the new standards introduced since 2011 generate substantial benefits for consumers. Underlying that claim is an assumption that consumers fail adequately to take into consideration the economic benefits of more fuel-efficient vehicles when making purchasing decisions. However, a slew of recent studies questions the assumptions made by NHTSA and EPA. This brief assesses the effects of CAFE standards on consumers.

Proponents of CAFE standards claim that they benefit consumers by reducing the total costs of purchasing and using vehicles. The evidence contradicts this claim. Consumers generally purchase vehicles with characteristics that meet their needs, including their expectation of the total cost of future gas purchases. CAFE standards distort manufacturers’ incentives, forcing them to produce new vehicles with lower gas consumption than would be preferred by consumers. As a result, the range of vehicle options available to consumers is limited and many consumers are effectively forced to purchase vehicles that are less able to meet their preferences.

Among the most adversely affected consumers are those, predominantly in rural areas, who seek to purchase used pickups. The distortions created by CAFE standards artificially raise the cost of these vehicles by more than the average savings from reduced gas usage, increasing the total cost of ownership. Given the steep rise in the price of used pickup trucks that resulted from CAFE standards for the 2012–2016 period and current increases occurring as the 2017–2021 standards are implemented, it is likely that prices would rise at an even faster rate if the agencies were to implement standards along the lines of those proposed as “augural” for 2022–2025.

In addition, as noted in a previous paper, fuel economy and greenhouse gas emissions standards for vehicles are a very inefficient way to address issues related to fuel consumption and emissions. Ideally, the federal government would scrap the federal CAFE and greenhouse gas emissions standards. However, this option is not currently on the table.

Nonetheless, the agencies implementing the standards do have the option of setting future greenhouse gas emissions and CAFE standards at the same level currently set for the model year 2021. That would certainly be preferable to the alternative of raising the standards further. In addition, to the extent that other extant EPA and NHTSA regulations serve as barriers to the introduction of vehicles that better suit consumer preferences, it behooves the agencies to seek ways to remove these barriers. One example noted herein is the set of essentially arbitrary and unnecessary differences between U.S. and international standards for a variety of vehicle parts. Harmonization of these standards would likely result in the production of vehicles that better serve consumers at a lower price. In addition, to the extent that the threat of anti-trust action impedes collaboration between manufacturers in the development of new technologies, a simple process for the granting of anti-trust waivers could facilitate more rapid innovation, not only of more-efficient vehicles but also in many other aspects of automotive technology.

Full Brief: The Effect Of Corporate Average Fuel Economy Standards On Consumers

Related Research:

CAFE and ZEV Standards: Environmental Effects and Alternatives

Climate Change, Catastrophe, Regulation and The Social Cost of Carbon

The post The Effect Of Corporate Average Fuel Economy Standards On Consumers appeared first on Reason Foundation.

Climate Change, Catastrophe, Regulation and the Social Cost of Carbon https://reason.org/policy-study/climate-change-catastrophe-regulation-and-the-social-cost-of-carbon/ Thu, 08 Mar 2018 11:00:33 +0000 https://reason.org/?post_type=policy-study&p=22827 Going forward, a more fruitful approach to addressing the problem of climate change would address barriers to adaptation, especially those created by government, such as regulations, taxes, and subsidies.

The post Climate Change, Catastrophe, Regulation and the Social Cost of Carbon appeared first on Reason Foundation.

Executive Summary

Federal agencies are required to calculate the costs and benefits of new regulations that have significant economic effects. Since a court ruling in 2008, agencies have included a measure of the cost of greenhouse gas emissions when evaluating regulations that affect such emissions. This measure is known as the “social cost of carbon” (SCC).

Initially, different agencies applied different SCCs. To address this problem, the Office of Management and Budget and Council of Economic Advisors organized an Interagency Working Group (IWG) to develop a range of estimates of the SCC for use by all agencies. However, the IWG’s estimates were deeply flawed. In April 2017, President Trump issued an executive order rescinding the IWG’s estimates and disbanded the IWG. The question now is what value regulatory agencies should use for the SCC—if any—when evaluating rules that affect greenhouse gas emissions.

PROBLEMS WITH CALCULATING A SOCIAL COST OF CARBON

Most analyses of the social cost of carbon, including the IWG’s, have utilized “integrated assessment models” (IAMs), the basic methodology of which involves the following six steps:

  1. Develop (or choose from existing) scenarios of future emissions of GHGs;
  2. Use those scenarios to estimate future atmospheric concentrations of GHGs;
  3. Project changes in average global temperature and/or climate resulting from these future atmospheric GHG concentrations;
  4. Estimate the economic consequences of the resultant changes in temperature/climate;
  5. Estimate the costs of abating specific amounts of GHG emissions;
  6. Combine the estimates from steps 4 and 5 to produce an assessment of the net economic effect of different scenarios and thereby identify the optimum path of emissions.
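
A toy sketch of how these six steps chain together is shown below. Every functional form and parameter value is a placeholder invented for illustration; real IAMs are far more elaborate, but the structure of the calculation is the same.

```python
import math

# Toy integrated-assessment pipeline following the six steps above.
def emissions_scenario(year):                  # step 1: an assumed emissions path (GtCO2/yr)
    return 35 + 0.1 * (year - 2020)

def concentration(emissions_path):             # step 2: crude carbon-cycle bookkeeping
    ppm = 410.0
    for e in emissions_path:
        ppm += 0.45 * e / 7.8                  # assume ~45% stays airborne; ~7.8 GtCO2 per ppm
    return ppm

def warming(ppm, sensitivity=3.0):             # step 3: warming per doubling of CO2
    return sensitivity * math.log(ppm / 280.0, 2)

def damages(delta_t, gdp=100e12):              # step 4: a stylized quadratic damage function
    return 0.003 * delta_t ** 2 * gdp

def abatement_cost(cut_fraction, gdp=100e12):  # step 5: a stylized convex cost curve
    return 0.01 * cut_fraction ** 2.6 * gdp

years = range(2020, 2101)
warming_2100 = warming(concentration(emissions_scenario(y) for y in years))
print(f"Warming by 2100: {warming_2100:.1f} C; "
      f"damages: ${damages(warming_2100) / 1e12:.1f} trillion/yr")
# Step 6 would compare damages avoided with abatement_cost across many such
# scenarios to identify the emissions path that maximizes net benefits.
```
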

Each step in this process is fraught with difficulty:

1. Future emissions of GHGs are unknown—and unknowable—but likely lower than assumed in most IAMs.

Future human-related emissions of GHGs will depend on many factors, especially: the human population, the extent and use of technologies that result in energy consumption, the types of technology used to produce energy, and the efficiency with which technologies use energy.

None of these factors can be forecast with any precision. Predicting future technologies is particularly challenging. However, greenhouse gas emissions from U.S. sources have declined from their peak (see Figure ES1), mainly as a result of using more energy-dense, lower carbon fuels (especially natural gas) and by using energy more efficiently (see Figure ES2).

Figure ES1: U.S. Energy Consumption and CO2 Emissions, 1949–2016. Source: Energy Information Administration, March 2017 Monthly Energy Review.

Figure ES2: Sources of U.S. Energy, 1776–2012. Source: Energy Information Administration, https://www.eia.gov/todayinenergy/detail.php?id=11951

Global emissions are rising but at a declining rate, in spite of robust economic growth. If these trends continue, future concentrations of greenhouse gases are likely to be at the low end of estimates used by the IWG when calculating the SCC.

2. The relationship between emissions and concentrations of greenhouse gases is complicated.

Calculating future atmospheric concentrations of GHGs, based on estimates of future human emissions, requires knowledge of the length of time that these GHGs will remain in the atmosphere. That, in turn, requires knowledge about the rate at which they will break down and/or be absorbed. This is no simple task. The rate at which GHGs such as methane and dinitrogen monoxide break down depends on such things as temperature and the amount of water vapor and other chemicals in the atmosphere with which they might react. The rate at which CO2 is taken up by plants, soil and oceans varies considerably depending on factors such as temperature and the availability of nutrients. The dynamic and interactive nature of these effects complicates the picture further.

3. The climate is likely much less sensitive to increased emissions of GHGs than has been presumed in most IAMs, including those used by the IWG.

Early estimates of the sensitivity of the climate to increased concentrations of greenhouse gases found that a doubling of atmospheric carbon dioxide would result in a warming of between 1.5°C and 4.5°C, with a “best guess” of 3°C. But those estimates were based on poorly specified models. Tests of models using those estimates of climate sensitivity predict about twice as much warming as actually occurred. Nonetheless, the IWG used those early, inaccurate estimates. More recent estimates of climate sensitivity suggest that future emissions are likely to result in much more modest warming of the atmosphere (with a doubling of carbon dioxide concentrations resulting in a warming of 1.5°C or less).
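
In simplified treatments, equilibrium warming is taken to scale with the logarithm of the CO2 concentration ratio, so a sensitivity estimate translates directly into a warming projection. A minimal sketch, using the sensitivity range quoted above and an illustrative concentration of 500 ppm:

```python
import math

def equilibrium_warming(sensitivity_per_doubling, ppm, preindustrial_ppm=280.0):
    # Warming scales with log2 of the CO2 concentration ratio.
    return sensitivity_per_doubling * math.log(ppm / preindustrial_ppm, 2)

for s in (1.5, 3.0, 4.5):  # low end, "best guess", and high end of the early range
    print(f"Sensitivity {s} C/doubling -> {equilibrium_warming(s, 500):.1f} C at 500 ppm")
```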

4. The effects of climate change are unknown—but the benefits may well be greater than the costs for the foreseeable future.

If the recent lower estimates of climate sensitivity are correct and emissions follow a relatively low path, warming will likely be modest and its effects mild. Likely effects include:

  • Warming will be greater in cold places (i.e. farther from the equator), seasons (winter), and times (night) than in warm places (equatorial regions), seasons (summer) and times (day).
  • At higher latitudes, winters will be less extreme.
  • Precipitation will increase, but not everywhere, and some places will become drier.
  • Sea levels will continue to rise slowly, as the oceans expand and land-based glaciers melt. (If current trends continue, sea level will rise by about 11 inches by 2100.)
  • The incidence of extreme weather events will not change dramatically.

While increased temperatures in warm places and seasons may result in higher mortality among those who are less able to cope with higher temperatures, warmer winters will reduce the number of people who die from cold. Since 20 times as many people currently die from cold as die from heat, modest warming will reduce temperature-related deaths. These effects will be tempered by the use of heating and cooling technologies, but the costs of additional cooling will be more than offset by reduced expenditure on heating.

While rising temperatures have the potential to increase the incidence of some diseases, such as diarrhea, these effects are likely to be moderated by the adoption of better technologies, including piped clean water and sewerage.

Increased concentrations of carbon dioxide and higher temperatures are likely to increase agricultural output in many places. While agricultural output may fall in other places, this effect is likely to be moderated by the adoption of new crop varieties and other technologies. On net, crop production is likely to rise in the U.S. and globally.

Many economic models of climate change, including two of the three IAMs used by the IWG, assume very limited adaptation. Yet the history of human civilization is one of adaptation. Food availability per capita and access to clean water have risen dramatically over the past half-century, reducing malnutrition and water-borne diseases and increasing life expectancy (see Figure ES3). Rising wealth and the adoption of new technologies have reduced mortality from extreme weather events by 98% in the past century (see Figure ES4). It seems highly likely that continued innovation and more widespread adoption of adaptive technologies will continue to reduce mortality, mitigating most—if not all—of the adverse consequences of rising temperatures.

Figure ES3: Food Availability Per Capita in Select Regions (Food Supply, kcal/day). Source: Food and Agriculture Organization of the United Nations, FAOStat (http://www.fao.org/faostat/en/#home)

Figure ES4: Global Mortality From Weather-Related Natural Disasters and Per Capita Income. Source: Author’s calculations based on data from EM-DAT (the international disasters database: http://www.emdat.be/), the Angus Maddison Project (http://www.ggdc.net/maddison/maddison-project/home.htm), and World Bank World Development Indicators (http://databank.worldbank.org/data/reports.aspx?source=world-development-indicators).

5. The costs of reducing future emissions of GHGs are unknown—and will depend very much on the extent and timeframe of any reduction.

Proponents of taking action now argue that any delay would increase the total cost of emissions reductions—because baseline emissions (i.e. the emissions that would occur without any mandated reductions) would be higher and the size of any such future reduction would have to be greater. But such arguments presume both significant increases in baseline emissions and a need dramatically to reduce such emissions. If the trends in technology identified earlier do continue, growth in baseline GHG emissions will continue to slow and in the longer term may even fall without any government mandates. Indeed, it is possible that baseline emissions in the future (i.e. after 2050) will be consistent with a pathway of emissions that results in atmospheric GHG concentrations that generate net benefits.

Even if baseline emissions rise to a level that justifies intervention in the future, that does not necessarily justify reducing emissions now. Humanity currently relies predominantly on carbon-based fuels for energy generation, and the costs of alternative sources of energy are in most cases relatively high. (If alternative sources of energy were less expensive, then it would make economic sense to adopt them.) Continued innovation will almost certainly result in lower emissions per unit of output in the future, so the costs of reducing a unit of GHG emissions in the future will be lower than they are today.

6. When combining benefits and costs, the IWG used inappropriately low discount rates, giving the false impression that the benefits of reducing emissions are greater than the costs. At discount rates that reflect the opportunity cost of capital, the current costs of taking action to reduce GHG emissions now and in the near future are almost certainly greater than the benefits.

OMB guidelines state that, for the base case, “Constant-dollar benefit-cost analyses of proposed investments and regulations should report net present value and other outcomes determined using a real discount rate of 7%. This rate approximates the marginal pretax rate of return on an average investment in the private sector in recent years.”

Unfortunately, when discounting the benefits and costs associated with global warming, many analysts have used discount rates that do not reflect the opportunity cost of capital. For example, the IWG provided an estimate of the SCC at a 5% discount rate, but that is the highest rate it used. In its guidance, the IWG emphasized the SCC calculated at a 3% discount rate. Its rationale for using the lower rate is that future benefits from avoiding climate change costs relate to future consumption, rather than investment. Policies to address climate change would affect both consumption and investment, but for the purposes of evaluation what matters is the effect on investment, since it is the effect of policies on investment decisions that will determine rates of innovation and hence economic growth, the ability to adapt to climate change, and future consumption. In other words, while future consumption is of primary concern, due to its relationship to human welfare, return on investment is the key factor determining future consumption. Thus, the appropriate discount rate is the rate of return on capital.
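
To see why the choice of discount rate matters so much, consider the present value of a single climate damage occurring 80 years from now. The damage figure and horizon below are arbitrary; only the discounting arithmetic is the point.

```python
def present_value(future_damage, years, rate):
    # Standard discounting: the value today of a cost incurred `years` from now.
    return future_damage / (1 + rate) ** years

damage, horizon = 100.0, 80      # $100 of damage, 80 years out (illustrative)
for rate in (0.03, 0.05, 0.07):  # the IWG's emphasized rate, its highest rate, and OMB's 7%
    print(f"{rate:.0%}: ${present_value(damage, horizon, rate):.2f} today")
```

At 7 percent, the present value is roughly a twentieth of its value at 3 percent, which is why the choice of rate can flip a regulation from appearing net-beneficial to net-costly.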

Changing the Assumptions

Changing the assumptions made in the IWG’s models can have a dramatic effect on estimates of the SCC. Anne Smith and Paul Bernstein of National Economic Research Associates ran the IAMs used by the IWG, making four changes:

  1. they changed the emissions scenario to reflect more realistic assumptions regarding the relationship between emissions and economic growth;
  2. they changed the time horizon from 2300 to 2100;
  3. they changed the discount rate from 3% to 5%;
  4. they changed the scope from global to U.S. only.

When all these changes were combined, the effect was to reduce the SCC by 97%, from $43 to about $1.30. Smith and Bernstein’s analysis did not change any assumptions regarding climate sensitivity or other relevant climate parameters that might have been misspecified in the IAMs used by the IWG. Kevin Dayaratna, Ross McKitrick and David Kreutzer assessed the effects of using more-recent empirical estimates of climate sensitivity to calculate updated SCC estimates using two of the IWG models. They found that, for one model, the average SCC fell by 30%–50% and for the other it fell by over 80%. Moreover, at a 7% discount rate, one of the models generated a negative SCC.

If all of the adjustments made by Smith and Bernstein were combined with those made by Dayaratna et al. it seems likely that the SCC would fall to well below $1. Indeed, given uncertainties in the various parameters used, it seems difficult to avoid the conclusion that for practical purposes the SCC is effectively $0.
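
As a back-of-the-envelope check on that claim, compounding the two sets of published adjustments (treating the percentage reductions as multiplicative, which is an assumption of this sketch) gives:

```python
iwg_scc = 43.0                                # starting estimate quoted above ($ per ton)
after_smith_bernstein = iwg_scc * (1 - 0.97)  # combined 97% reduction -> about $1.30
low_cut, high_cut = 0.30, 0.80                # further reductions reported by Dayaratna et al.
print(f"${after_smith_bernstein * (1 - high_cut):.2f} "
      f"to ${after_smith_bernstein * (1 - low_cut):.2f} per ton")  # well below $1
```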

What About Catastrophic Climate Change?

Some economists have objected that conventional measures of the SCC fail adequately to account for the possibility of catastrophic climate change. However, such criticisms are based on assumptions concerning the probability of catastrophe that have no empirical basis. A recent attempt to estimate the SCC by surveying experts to find out what they would be willing to pay to avert catastrophe is so riddled with defects as to be of no utility.

THE SOCIAL COST OF CARBON AND REGULATORY REFORM

The IWG’s SCC was developed under Executive Order 12866, which requires regulatory agencies to consider the costs and benefits of regulations they are promulgating—and alternatives—and choose the regulatory option that maximizes net benefits to society.

If the SCC is $0, mandatory reductions of GHG emissions are not justified. Thus, regulations predicated on a positive SCC should be reconsidered. While these regulations often also have purported “co-benefits” of significant magnitude (such as reduced emissions of particulates), those co-benefits could almost certainly be achieved at much lower cost through alternative means. As such, when evaluating these regulations, agencies should compare their cost to alternative regulations that specifically address the co-benefit elements.

Other alternatives to mandatory reductions in emissions of GHGs that should be considered include the reform or removal of regulatory and other restrictions on the development of lower-carbon forms of energy and the removal of barriers to adaptation.

Full Study: Climate Change, Catastrophe, Regulation and The Social Cost of Carbon

The post Climate Change, Catastrophe, Regulation and the Social Cost of Carbon appeared first on Reason Foundation.

CAFE and ZEV Standards: Environmental Effects and Alternatives https://reason.org/policy-brief/cafe-and-zev-standards-environmental-effects-and-alternatives/ Thu, 03 Aug 2017 16:24:29 +0000 http://reason.org/?post_type=policy-brief&p=16555 Introduction Manufacturers and importers of vehicles sold in the U.S. are subject to federal fuel economy and greenhouse gas (GHG) emission standards. In addition, California and nine other states have imposed more-stringent standards and require a minimum proportion of vehicles … Continued

The post CAFE and ZEV Standards: Environmental Effects and Alternatives appeared first on Reason Foundation.

Introduction

Manufacturers and importers of vehicles sold in the U.S. are subject to federal fuel economy and greenhouse gas (GHG) emission standards. In addition, California and nine other states have imposed more-stringent standards and require a minimum proportion of vehicles to have “zero” emissions.

The stated objectives of Corporate Average Fuel Economy (CAFE) standards are: (1) to reduce fuel consumption and (2) to reduce greenhouse gas (GHG) emissions. This brief considers the extent to which the standards are likely to achieve these objectives, the relative importance of achieving such objectives compared with other possible environmental objectives, and more cost-effective alternative policies.

CAFE and ZEV Standards

CAFE standards were first introduced in 1978 and new standards have been implemented at various intervals since then. In 2012, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) issued new CAFE and GHG emission rules for vehicles manufactured in years 2017–2025. Under these rules, fleetwide average fuel economy for passenger cars rises to 39.6 mpg in 2017 (from 38.2 mpg in 2016) and then incrementally to a minimum of 55.3 mpg in 2025. For light trucks, minimum fuel economy rises to 29.1 mpg in 2017 (from 28.9 mpg in 2016) and then incrementally to 39.3 mpg in 2025. These minimum fuel economy standards are based on the maximum vehicle footprint in each category: fleets with smaller average vehicle footprints are required to meet stricter standards.

Under the GHG emission rules in the 2017–2025 standards, manufacturers may take credit for other efficiency improvements, such as in air conditioning systems, thereby reducing the minimum effective fuel efficiency for passenger vehicles in 2025 to 49.6 mpg. Moreover, manufacturers may count each electric and plug-in hybrid vehicle as more than one effective vehicle: for plug-in hybrids, this multiple starts at 1.6 in 2017 and declines to 1.3 in 2021; for electric vehicles, the multiple is 2 in 2017, falling incrementally to 1.5 in 2021.
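
To illustrate how the multiple-counting interacts with the fleet average, here is a simplified compliance calculation. CAFE fleet averages are computed as sales-weighted harmonic means; the fleet mix and fuel-economy figures below are invented, and the multipliers are the 2017 values described above.

```python
# Sales-weighted harmonic-mean fleet fuel economy with EV/PHEV multipliers.
# The fleet mix and fuel-economy figures are invented purely for illustration.
fleet = [
    # (units sold, mpg or mpg-equivalent, compliance multiplier)
    (500_000, 32.0, 1.0),   # conventional passenger cars
    (300_000, 26.0, 1.0),   # larger cars and crossovers
    (20_000,  60.0, 1.6),   # plug-in hybrids (2017 multiple from the rule)
    (10_000, 110.0, 2.0),   # electric vehicles (2017 multiple from the rule)
]

def compliance_average(fleet):
    effective_units = sum(units * mult for units, _, mult in fleet)
    fuel_use = sum(units * mult / mpg for units, mpg, mult in fleet)
    return effective_units / fuel_use   # harmonic mean over effective units

print(f"Compliance fleet average: {compliance_average(fleet):.1f} mpg")  # ~30.6 mpg
```

Because each electric vehicle counts as two high-mpg vehicles, a small number of them lifts the compliance average by more than their actual share of sales would suggest.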

Also in 2012, California’s Air Resources Board adopted the Advanced Clean Cars program, which created standards for new vehicles in California through 2025. A key provision of the program requires that, starting in 2018, major motor vehicle manufacturers sell a minimum proportion of “Zero Emission Vehicles” (ZEVs—i.e. electric cars) or a combination of ZEVs and “Transitional Zero Emission Vehicles” (TZEVs—i.e. plug-in hybrids). For model year 2018, the minimum is 4.5%, including at least 2% ZEVs. This rises annually, reaching 22% in 2025, of which at least 16% must be ZEVs. California was permitted to implement these stricter standards under a waiver granted by the Environmental Protection Agency. Other states have been permitted to adopt the same standards as California and so far nine additional states have done so, namely: Connecticut, Maine, Maryland, Massachusetts, New Jersey, New York, Oregon, Rhode Island and Vermont.
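
Stated as a back-of-the-envelope requirement for a single manufacturer, using the 2018 percentages quoted above (the sales volume is hypothetical, and the actual program operates through a credit system that this sketch ignores):

```python
california_sales = 200_000                   # hypothetical annual sales in California
combined_minimum, zev_minimum = 0.045, 0.02  # 2018 minimums quoted above

zev_or_tzev_needed = california_sales * combined_minimum  # ZEVs plus plug-in hybrids
pure_zev_needed = california_sales * zev_minimum          # of which pure ZEVs
print(f"{zev_or_tzev_needed:,.0f} ZEV/TZEV sales, "
      f"of which at least {pure_zev_needed:,.0f} must be ZEVs")
```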

Full Brief – CAFE and ZEV Standards: Environmental Effects and Alternatives

The post CAFE and ZEV Standards: Environmental Effects and Alternatives appeared first on Reason Foundation.

The Paris Agreement: An Assessment https://reason.org/policy-brief/the-paris-agreement-an-assessment/ Tue, 19 Apr 2016 04:00:00 +0000 http://reason.org/policy-brief/the-paris-agreement-an-assessment/ On December 12, 2015, government officials from 195 nations meeting in Paris, France, finalized a new agreement on climate change. This brief assesses the merits of the Paris Agreement in order to encourage informed debate about its ratification by signatory nations. It examines the nature and extent of the threat posed by climate change, the effectiveness of solutions adopted in Paris, and considers other, better solutions.

The post The Paris Agreement: An Assessment appeared first on Reason Foundation.

On December 12, 2015, government officials from 195 nations meeting in Paris, France, finalized a new agreement on climate change. This brief assesses the merits of the Paris Agreement in order to encourage informed debate about its ratification by signatory nations. It examines the nature and extent of the threat posed by climate change, the effectiveness of solutions adopted in Paris, and considers other, better solutions.

The brief begins with a concise history of the emergence of concern over climate change. This is followed by a status report on, first, the science of climate change and, second, economic analyses of the problem of climate change. It then assesses the Paris Agreement in light of the scientific and economic evidence. Finally, it explores some other policy options that would better address the problem of climate change.

The Paris Agreement seeks to limit the increase in global mean temperatures to “well below 2°C above pre-industrial levels.” The Agreement asserts that this goal is undertaken in order to enhance the implementation of the objective of the UN Framework Convention on Climate Change (FCCC), which is to prevent “dangerous anthropogenic interference with the climate system.”

There is considerable disagreement over what increase in global mean temperature might be considered “dangerous.” For example, economic analyses suggest that warming of 3°C above pre-industrial levels may generate net benefits. In the very scenarios in which climate change is presumed to be greatest, its impact will be distributed most evenly and the harms will be most readily mitigated by adaptive responses. Merely asserting that an increase of 2°C above pre-industrial levels will be “dangerous” does not make it so.

Recent analysis indicates that the sensitivity of the climate to rising levels of carbon dioxide is lower than previously assumed. This is consistent with the finding that forecasts of warming by the Intergovernmental Panel on Climate Change have substantially over-estimated the likely extent of warming. If these new analyses of sensitivity are correct, average global temperatures might rise to less than 2°C above preindustrial levels without any restrictions on greenhouse gas emissions.

Regardless of whether the new analyses of climate sensitivity are correct or not, the voluntary commitments to reduce emissions made by governments in their Intended Nationally Determined Contributions (INDCs) would have little impact on the rate of warming.

Thus, the Paris Agreement likely does not enhance the objective of the FCCC, either because the 2°C target is not desirable, or because restrictions on greenhouse gas emissions are not necessary (or, if necessary, not sufficient) to achieve that target. As such, the Paris Agreement must be considered a new and separate treaty. Indeed, it implicitly acknowledges its status as a separate treaty, noting that it will only enter into force once it has been ratified by signatories representing 55% or more of global greenhouse gas emissions.

Innovation and associated economic development will likely be the most effective means by which humans address climate change. But the commitments made under the Paris Agreement would divert trillions of dollars into low-carbon technologies and government-funded schemes for mitigation and adaptation, thereby undermining the bottom-up processes that drive more widespread innovation and, as a result, impeding the ability of people to adapt to climate change and other threats.

Given the potential for the Paris Agreement to result in harmful and even counterproductive restrictions on economic activity, it would appear that ratification is not in the interests of the majority of signatory nations.

The post The Paris Agreement: An Assessment appeared first on Reason Foundation.

Urban Containment: The Social and Economic Consequences of Limiting Housing and Travel Options https://reason.org/policy-study/urban-containment-the-social-travel/ Mon, 07 Mar 2016 05:00:00 +0000 http://reason.org/policy-study/urban-containment-the-social-travel/ Responding to a growing interest in curtailing carbon emissions, some cities are limiting their urban footprint-a practice called "urban containment." Urban containment policy seeks to control "urban sprawl" and to reduce GHG emissions by densifying urban areas and substituting transit, cycling and walking for car and other light duty vehicle use. This study evaluates four urban containment reports-by the U.S. Department of Energy, the Transportation Research Board (Driving and the Built Environment), the Urban Land Institute (Moving Cooler) and the U.S. Environmental Protection Agency-to determine their cost-effectiveness in reducing greenhouse gas (GHG) emissions and their impact on household affluence and the poverty rate.

Responding to a growing interest in curtailing carbon emissions, some cities are limiting their urban footprint, a practice called “urban containment.” Urban containment policy seeks to control “urban sprawl” and to reduce GHG emissions by densifying urban areas and substituting transit, cycling and walking for car and other light duty vehicle use. This study evaluates four urban containment reports, by the U.S. Department of Energy, the Transportation Research Board (Driving and the Built Environment), the Urban Land Institute (Moving Cooler) and the U.S. Environmental Protection Agency, to determine their cost-effectiveness in reducing greenhouse gas (GHG) emissions and their impact on household affluence and the poverty rate.

Urban Containment and Cities

Cities have experienced declining population densities for centuries. This occurred as urban areas expanded at a greater rate than population, due in large measure to improved transportation technologies, as walking was substantially replaced by transit and, later, transit was substantially replaced by cars. Even in the densest parts of urban areas (the core municipalities), population densities have declined virtually around the world.
The physical expansion of cities, known as “urban sprawl,” has been a principal concern of urban planners for decades, a concern that has led to the adoption of “urban containment.” The most important urban containment policies are restrictions on urban fringe development (by means of urban growth boundaries or similar land-rationing measures) and policies to reduce light duty vehicle use.

Concern about GHG emissions drives an increasing emphasis on urban containment policy. This is based on the assumption that higher densities and less car use would translate into materially lower GHG emissions. In effect, urban containment policy seeks to replace the more liberal land-use policies that have been typical in U.S. metropolitan areas since World War II.

Urban Containment and Greenhouse Gas Emissions

The DOE report, which reviews other reports, indicates that urban containment policies “have significant potential to impact … GHG emissions significantly over the long term.” It provides an overview of urban containment policy and summarizes GHG emissions reduction projections from previous research. The two most important reports reviewed, Driving and the Built Environment and Moving Cooler, indicate that urban containment policies could reduce 2050 greenhouse gas emissions from light duty vehicles by 1% to more than 10%. The later EPA report projected 2050 GHG emissions reductions at 4.3%.

These projections raise several issues for analysis:

  • Driving and the Built Environment itself raises doubts about the political feasibility of implementing the policies the reports deem “necessary” to reduce GHG emissions.
  • Moving Cooler was strongly criticized by a sponsor, AASHTO (American Association of State Highway and Transportation Officials), which withdrew from the project indicating that the conclusions were based on “assumptions that are not plausible” and that the report “did not produce results upon which decision-makers can rely.”
  • Recent analysis casts further doubt on the potential for urban containment to reduce GHG emissions.
  • Comprehensive research at the University of California questions the robustness of the association between strategies to increase population densities and reductions in GHG emissions.

In response to these uncertainties, this analysis examines and evaluates the range of projections from both Driving and the Built Environment (range minimum) and the EPA report (range maximum). This study finds that the overwhelming share of the GHG emissions reduction projected in each of the reports is attributable to the fuel economy improvements from the base years assumed in the modeling, not to urban containment policy. Since fuel economy is likely to continue to improve, even greater GHG emission reductions are likely in the near future. Moreover, this study contends that the additional GHG emissions from the increased traffic congestion likely to be produced by the denser environments created by urban containment policies could materially offset or even overwhelm the emissions reductions projected in the reports.
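To make the decomposition concrete, the sketch below works through the arithmetic with invented numbers; they are placeholders, not figures from the DOE, TRB, ULI or EPA reports. The point is only to show how an assumed fleet-wide fuel economy gain can dominate a projection that also includes a modest containment-driven cut in driving.

```python
# Illustrative decomposition of a projected light-duty-vehicle GHG reduction.
# All figures are invented placeholders, not values from the DOE, TRB, ULI
# or EPA reports; the point is only to show how an assumed fuel-economy
# improvement can dominate the combined projection.

def annual_tons_co2(miles_driven: float, grams_co2_per_mile: float) -> float:
    """Annual CO2 emissions in metric tons for one driver."""
    return miles_driven * grams_co2_per_mile / 1_000_000  # grams -> metric tons

baseline         = annual_tons_co2(10_000, 400)  # base-year driver
fuel_econ_only   = annual_tons_co2(10_000, 250)  # assumed fleet fuel-economy gain
with_containment = annual_tons_co2(9_500, 250)   # containment also trims driving ~5%

total_cut         = baseline - with_containment
from_fuel_economy = baseline - fuel_econ_only
from_less_driving = fuel_econ_only - with_containment

print(f"total projected cut:  {total_cut:.2f} t")
print(f"from fuel economy:    {from_fuel_economy:.2f} t ({from_fuel_economy / total_cut:.0%})")
print(f"from reduced driving: {from_less_driving:.2f} t ({from_less_driving / total_cut:.0%})")
```

With these hypothetical inputs, roughly nine-tenths of the combined reduction comes from the fuel economy assumption rather than from reduced driving, which is the pattern the study identifies in the reports it reviews.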

Finally, this study cautions that the use of long-term projections based on anticipated human behavioral changes is inherently unreliable, suggesting substantial margins of error. Moreover, the GHG emissions reductions projected from urban containment policy are so small that they could be swamped by projection error alone.

Urban Containment and Mobility

Economic growth in metropolitan areas is strongly associated with higher levels of mobility. Metropolitan areas are labor markets. If employees are able to access a larger percentage of jobs in a fixed period of time (such as 30 minutes), the economic productivity of the metropolitan area is likely to be greater.
U.S. metropolitan areas rely principally on light duty vehicles for personal mobility. Transit access is very limited. On average, only 6% of jobs in major metropolitan areas can be reached on transit in 45 minutes by the average employee. In contrast, nearly two-thirds of jobs can be reached by light duty vehicle in that same time frame.
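The accessibility figures above rest on a simple metric: the share of a region's jobs reachable from a residence within a fixed travel-time budget, by mode. The sketch below illustrates that calculation with an invented four-zone example; the zone sizes and travel times are hypothetical, chosen only so the transit and car shares land near the cited 6% and two-thirds figures.

```python
# Hypothetical illustration of the "share of jobs reachable within a travel
# time budget" metric. Zone job counts and door-to-door travel times are
# invented; a real analysis would use zone-to-zone travel-time matrices.

job_zones = [
    # (jobs in zone, transit minutes, driving minutes) from one residence
    (10_000, 40, 15),
    (35_000, 70, 25),
    (50_000, 95, 40),
    (55_000, 130, 55),
]

def reachable_share(zones, minutes_budget, time_index):
    """Share of all jobs whose zone is reachable within the time budget."""
    total_jobs = sum(zone[0] for zone in zones)
    reachable = sum(zone[0] for zone in zones if zone[time_index] <= minutes_budget)
    return reachable / total_jobs

print(f"reachable by transit in 45 min: {reachable_share(job_zones, 45, 1):.0%}")
print(f"reachable by car in 45 min:     {reachable_share(job_zones, 45, 2):.0%}")
```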

Low transit use reflects not only the reachability of employment but also the relative quality of the transportation modes. While transit works for some point-to-point downtown commuters, it is less effective for other trips, including non-work travel, which makes up nearly 85% of trips. This is because light duty vehicles offer a vastly speedier, less burdensome mode of transportation for all manner of non-commute trips, such as parents transporting children or pets, equipment or large or heavy items, groceries in need of refrigeration/freezing, or “trip-chaining” several errands.

Higher densities are strongly associated with increased traffic congestion. This not only impedes personal mobility but also raises concerns for commercial traffic and business costs. Texas A&M Transportation Institute data indicate a strong relationship between limiting the expansion of roadways and greater traffic congestion over the last three decades.

By favoring modes of transport (transit, cycling and walking) that cannot equal the mobility provided by light duty vehicles, urban containment could retard the productivity of metropolitan areas and significantly degrade people’s everyday lives, leading to a lower standard of living and greater poverty.

Urban Containment and Housing Affordability

For much of the period since World War II, there has been comparatively little variation in house prices relative to household incomes around the country. However, significant differences have arisen in more recent decades, in some places more than in others, and especially in dense urban areas.
Economic theory indicates that limits on supply tend to increase prices, all things being equal, regardless of the good or service (including land for housing).

This potential association is largely dismissed by urban containment advocates and the DOE report, yet a considerable body of research confirms the economic theory: limiting the supply of a good (in this case land) upsets the balance between demand and supply, leading to higher prices (in this case house prices). The fundamental difficulty is that the “competitive supply of land” identified by economist Anthony Downs is not maintained.
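The price mechanism at work can be illustrated with a deliberately simple linear supply-and-demand sketch; all parameters are invented. Capping the quantity of developable land below the unconstrained equilibrium pushes the market price up along the demand curve.

```python
# Deliberately simple linear supply-and-demand sketch of the argument above:
# capping the quantity of developable land below the unconstrained equilibrium
# pushes the market price up along the demand curve. Parameters are invented.

def demand_price(q):   # buyers' willingness to pay falls as quantity rises
    return 1000 - 2.0 * q

def supply_price(q):   # sellers' marginal cost rises as quantity rises
    return 100 + 1.0 * q

# Unconstrained equilibrium: demand_price(q) == supply_price(q)
q_star = (1000 - 100) / (2.0 + 1.0)   # 300 units of land
p_star = demand_price(q_star)         # price of 400 per unit

# An urban growth boundary caps land supply below the equilibrium quantity
q_capped = 240
p_capped = demand_price(q_capped)     # price is set by what buyers will pay

print(f"unconstrained: quantity {q_star:.0f}, price {p_star:.0f}")
print(f"with land cap: quantity {q_capped}, price {p_capped:.0f} (+{p_capped - p_star:.0f})")
```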

This study finds the expected correlation between higher house prices and limited land supply confirmed by the research. As early as 1973, British researchers were associating higher house prices with urban containment and especially noting negative effects on low-income households. A number of researchers have identified similar results across the United States and internationally. A study by the Tomas Rivera Institute in California expressed concern about the negative impacts on Hispanic and African American households.

A detailed examination by Dartmouth economist William Fischel identified the land use regulatory structure as the principal reason for California’s extraordinary house price increases. An examination of housing affordability in Portland shows substantial price increases since the research cited in the DOE report, and particularly large increases in housing costs in high-density and low-income core areas. Housing is generally a household’s largest expenditure item, thus the variation in housing costs between metropolitan areas has a greater impact than that of other expenditure items. Therefore the higher house prices relative to incomes that are associated with urban containment reduce household discretionary income, leading to a lower standard of living and higher poverty rates.

Urban Containment and GHG Emission Reduction Costs

It is necessary to minimize the costs of any level of GHG emissions reductions to preserve economic growth and the standard of living. The normal metric for evaluating the cost of GHG emissions reduction is the cost per metric ton of carbon dioxide equivalent. Cost varies significantly between economic sectors, and it is important to select the most cost-effective strategies, regardless of economic sector. “Across-the-board” reductions can lead to more-costly and less-effective strategies being implemented, which could threaten economic growth.

The cost per metric ton of carbon dioxide equivalent emissions from urban containment policy is hundreds to thousands of times the cost of reducing emissions in the power sector. There are thus vastly more cost-effective alternatives to urban containment policy for reducing GHG emissions. Research by both the Congressional Budget Office and Resources for the Future found that sufficient GHG emission reductions can be achieved without reducing driving or living in denser housing.
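A minimal sketch of the cost-per-ton comparison follows; the dollar and tonnage figures are hypothetical placeholders rather than numbers from the CBO or RFF studies, and serve only to show how the metric ranks strategies.

```python
# Illustrative cost-effectiveness comparison using the cost-per-ton metric
# described above. Dollar and tonnage figures are hypothetical, not taken
# from the CBO or RFF studies.

strategies = {
    "power-sector fuel switching": {"cost_usd": 50_000_000, "tons_avoided": 1_000_000},
    "urban containment policy":    {"cost_usd": 50_000_000, "tons_avoided": 2_500},
}

for name, s in strategies.items():
    cost_per_ton = s["cost_usd"] / s["tons_avoided"]
    print(f"{name}: ${cost_per_ton:,.0f} per metric ton of CO2e avoided")
```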

Urban Containment and the Broader Economy

There are broader consequences to urban containment policy. Research has associated urban containment policy with slower metropolitan area employment growth and slower economic growth. Further, during the last decade there was a pronounced net domestic migration from higher-cost metropolitan areas toward those with lower housing costs. By restricting development outside the urban footprint, urban containment policies effectively trap people and businesses in higher-cost areas, with unintended consequences for the broader economy.

Urban Containment and the Standard of Living

The United States has the most affluent metropolitan areas in the world, despite their low density. International data indicate that, compared to other nations, traffic congestion in the United States is less intense and average work trip travel times are shorter, indicating a higher level of mobility. International data also indicate that housing in the United States is generally more affordable relative to incomes than in other nations.

Implementation of urban containment policies will likely lead to more-congested cities and less mobility, as well as lower discretionary incomes as house prices rise relative to incomes. The result would be a lower standard of living and greater poverty.

Sufficient GHG emissions reductions can be achieved without urban containment policy and its attendant economic problems. The key is focusing on the most cost-effective strategies, without unnecessarily interfering with the dynamics that have produced the nation’s affluence.

Obama’s Clean Power Plan Is Bad News for California https://reason.org/commentary/obamas-clean-power-plan-is-bad-news/ Fri, 28 Aug 2015 20:42:00 +0000 http://reason.org/commentary/obamas-clean-power-plan-is-bad-news/ The Clean Power Plan will ultimately force America to outsource more of its energy-intensive industrial production, and the jobs that go with it, to other countries. That's potentially a lot of pain for very little or no environmental gain.

The White House recently released its Clean Power Plan, which aims to reduce carbon dioxide emissions from the nation’s power plants by 32 percent from 2005 levels by 2030. Almost immediately, California Gov. Jerry Brown praised the plan, claimed it should be the model for international agreements and touted California’s own statewide plans. But California’s carbon control program should be a warning to the rest of the country, not an endorsement of the president’s plan.
California’s Assembly Bill 32, passed in 2006, requires the state to lower greenhouse gas emissions to 1990 levels by 2020, mainly by forcing utilities to purchase one-third of their electricity from renewable sources and by imposing a convoluted carbon cap-and-trade system. Before AB32, California had some of the least reliable and most expensive electricity in the nation, the most expensive gasoline and among the highest unemployment rates. AB32 made each of those problems worse, not better.
California’s gasoline prices are typically $1 a gallon more than in most other states and its electricity prices are 33 percent higher than the average of other states. Higher energy prices, coupled with stagnant employment numbers, have contributed to California’s poverty rate being much higher than in the rest of the country. Other states are likely to suffer a similar fate if they follow California’s lead by implementing the Obama administration’s Clean Power Plan.
Like California’s AB32, the Clean Power Plan envisages a significant increase in the proportion of electricity supplied by renewable power, much of which will allegedly come from intermittent sources such as solar and wind. When the wind doesn’t blow, and when it blows too hard, wind turbines must stand idle. Wind power cannot be stored up for when you need it. Similarly, the sun doesn’t shine at night and storing solar power is expensive. To keep the state’s lights on and the air conditioning functioning – even when the wind is not blowing and sun is not shining – power grid operators must rely on power from additional sources, usually gas turbines that can rapidly be spun up.
Even so, maintaining stability on a grid using intermittent sources subject to unpredictable and rapid changes in supply is challenging. The North American Electric Reliability Corporation notes that solar and wind tend to reduce grid reliability, meaning outages and blackouts occur more frequently. As the proportion of power from intermittent sources increases, grid operators will be forced to spend billions of dollars on new technologies that enable them to balance power generation with demand.
While some Californians may be tempted to smirk at the idea of other states suffering the same energy fate as us, the reality is that the federal energy plan will worsen California’s woes. The state currently imports over 30 percent of its electricity – even more in summer months when air conditioning loads peak. While a third of that is carbon-free hydroelectric from the Northwest, most of the rest comes from coal-fired power plants in the Southwest. But the Environmental Protection Agency’s new plan will force many of these coal plants to shut down. That will hurt California – leaving it vulnerable to brownouts and blackouts.
You might think there must be some benefit from these disastrous costs. Well, maybe, but not much. Even assuming that the prognostications of the administration’s climate pessimists prove correct, the Clean Power Plan would reduce global temperatures by perhaps two-hundredths of a degree. Meanwhile, California’s plan would reduce global temperatures by less than one-hundredth of one degree. Such changes are, for all intents and purposes, unmeasurable – and are certainly not enough to affect rainfall or sea levels.
While political leaders in Sacramento and Washington, D.C. have put Californians on a carbon reduction diet, the leaders of China, India and other emerging industrial giants have made no such commitment.
Just as AB32 has forced California to outsource its energy production to other states, the Clean Power Plan will ultimately force America to outsource more of its energy-intensive industrial production, and the jobs that go with it, to other countries. That’s potentially a lot of pain for very little or no environmental gain.
Tom Tanton is senior fellow and Julian Morris is vice president of research at Reason Foundation. This article originally appeared in the Orange County Register.

The Social Cost of Carbon Underestimates Human Ingenuity, Overestimates Climate Sensitivity https://reason.org/commentary/climate-change-overestimates-sensit/ Fri, 14 Aug 2015 15:30:00 +0000 http://reason.org/commentary/climate-change-overestimates-sensit/ To great fanfare, the White House recently released its final "Clean Power Plan," which seeks to reduce carbon dioxide emissions from electricity generation. The Obama administration also released a 343-page "regulatory impact assessment," which purports to detail the Clean Power Plan's costs and benefits. But as I show in a new study for Reason Foundation, one of the key assumptions underlying that assessment, the so-called "social cost of carbon," is deeply flawed. As a result, the administration has massively exaggerated the benefits from reducing carbon dioxide emissions. Under more realistic assumptions, the plan would fail a basic cost-benefit test.

To great fanfare, the White House recently released its final “Clean Power Plan,” which seeks to reduce carbon dioxide emissions from electricity generation. The Obama administration also released a 343-page “regulatory impact assessment,” which purports to detail the Clean Power Plan’s costs and benefits. But as I show in a new study for Reason Foundation, one of the key assumptions underlying that assessment, the so-called “social cost of carbon,” is deeply flawed. As a result, the administration has massively exaggerated the benefits from reducing carbon dioxide emissions. Under more realistic assumptions, the plan would fail a basic cost-benefit test.

The Clean Power Plan is the latest in a series of federal regulations targeting emissions of greenhouse gases. Under federal law, agencies proposing new regulations are required to assess their costs and benefits. Since 2010, federal agencies have counted the benefits of reducing greenhouse gas emissions by multiplying the estimated total reduction of such emissions (measured in tons) by the “social cost of carbon” (measured in dollars per ton).
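The benefit calculation itself is simple multiplication, which the following sketch makes explicit; the emission-reduction figure and the SCC values are placeholders, not the Clean Power Plan RIA's actual inputs. It also shows why the claimed benefits stand or fall with the SCC assumption.

```python
# Sketch of the benefit calculation described above: estimated emission
# reductions (tons) multiplied by the social cost of carbon ($ per ton).
# The reduction figure and the SCC values are placeholders, not the Clean
# Power Plan RIA's actual inputs.

tons_co2_avoided = 200_000_000  # hypothetical reduction in a given year

for scc_per_ton in (0, 11, 40):  # alternative $/ton assumptions to compare
    claimed_benefit = tons_co2_avoided * scc_per_ton
    print(f"SCC = ${scc_per_ton:>2}/ton -> claimed benefit ${claimed_benefit:,}")
```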

The group of federal agencies that developed the social cost of carbon (SCC), known as the Interagency Working Group, gave a range of estimates for the SCC that depend on: the year during which the emissions occur (the SCC is assumed to rise over time), the discount rate (the rate at which society is assumed to be willing to forego current consumption in return for future consumption – the higher the discount rate, the lower the SCC) and the degree to which climate change is expected to cause harm (the greater the expected harm, the higher the SCC).
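The role of the discount rate can be seen in a short present-value calculation; the damage stream below is invented. The same assumed future climate damages shrink substantially in present-value terms as the discount rate rises, which is why the 5 percent estimates of the SCC are so much lower than the 2.5 percent estimates.

```python
# Minimal illustration of why a higher discount rate implies a lower SCC:
# the same stream of assumed future climate damages is worth less in
# present-value terms when discounted more heavily. The damage stream is
# invented ($1 per ton per year for a century).

def present_value(damage_per_year, years, rate):
    """Present value of a constant annual damage stream."""
    return sum(damage_per_year / (1 + rate) ** t for t in range(1, years + 1))

for rate in (0.025, 0.03, 0.05, 0.07):
    pv = present_value(1.0, 100, rate)
    print(f"discount rate {rate:.1%}: present value ${pv:.2f} per ton")
```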

In the original analysis by the Interagency Working Group, published in 2010, estimates of the SCC for the year 2015 ranged from $5.70 per ton of carbon dioxide (for the “median” forecast of climate change impact discounted at 5 percent) to $64.90 per ton (for the “95th percentile” forecast discounted at 3 percent). This analysis was heavily criticized, with some saying the estimates were too high, others that they were too low. In 2013, the Interagency Working Group revised its estimates of the SCC twice. In the May 2013 revision, the “median” forecast, assuming a 5 percent discount rate had doubled to $12 per ton (from $5.70 a ton in the 2010 estimate). In a minor revision made in November 2013, this was reduced to $11 per ton. The SCC was again revised slightly in July 2015 (but no change was made to the aforementioned “median” forecast discounted at 5 percent).

While the Interagency Working Group increased its estimates of the SCC between 2010 and 2013, academic economists using the same models generally did not. Moreover, the trend in estimates has been declining over the long-term. In 2011, Professor Richard Tol – one of the world’s leading experts on the economics of climate change and the developer of one of the three models used by the Interagency Working Group – produced a meta-analysis of 311 estimates of the social costs of carbon and found that the “mean” estimate fell consistently from 1992 to 2011, which as he notes “suggests that estimates of the impact of climate change have become less dramatic over time.” Moreover, the proportion of estimates of the SCC that are negative — implying that additional carbon dioxide has net benefits – has increased over time.

If one were to base the social cost of carbon on these models alone, it would be very difficult to justify a rate of more than a few dollars. But the models are really no better than the assumptions built into them. And if those assumptions are wrong, so are the conclusions. As the old saying goes: garbage in, garbage out.

One assumption that is particularly problematic concerns the relationship between increases in carbon dioxide concentrations and increases in temperature – known as climate sensitivity. In its assessment, the Interagency Working Group assumed the same range for climate sensitivity as did the Intergovernmental Panel on Climate Change (IPCC) in its 2007 assessment. But since 2010 several studies have shown that the actual temperature change since 1990 has been about half that of the lowest forecasts made using the IPCC’s models.
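The link between assumed sensitivity and projected warming follows from the standard logarithmic approximation, in which equilibrium warming scales with the equilibrium climate sensitivity (ECS) per doubling of CO2: ΔT = ECS × log2(C/C0). The sketch below applies that relationship to an illustrative future concentration; the concentration and sensitivity values are chosen for illustration and are not drawn from the IPCC or the Interagency Working Group.

```python
# Sketch of how assumed climate sensitivity maps onto projected equilibrium
# warming, using the standard logarithmic relationship dT = ECS * log2(C/C0).
# The concentrations and sensitivity values are illustrative only.

import math

def equilibrium_warming(c_ppm, c0_ppm, ecs_per_doubling):
    """Equilibrium temperature change (C) for a rise from c0_ppm to c_ppm."""
    return ecs_per_doubling * math.log2(c_ppm / c0_ppm)

C0 = 280.0        # approximate pre-industrial CO2 concentration, ppm
C_FUTURE = 500.0  # an illustrative future concentration, ppm

for ecs in (1.5, 2.0, 3.0, 4.5):  # assumed warming (C) per doubling of CO2
    dt = equilibrium_warming(C_FUTURE, C0, ecs)
    print(f"ECS {ecs:.1f} C/doubling -> {dt:.1f} C equilibrium warming at {C_FUTURE:.0f} ppm")
```

A lower assumed sensitivity translates directly into less projected warming for the same concentration path, which is why the sensitivity assumption matters so much to the SCC.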

A likely explanation for the failure of the climate models is that they assume the climate is more sensitive to increases in carbon dioxide than is actually the case. This is supported by several recent studies, which suggest that climate sensitivity is lower than even the lowest estimate assumed by the Interagency Working Group. The IPCC implicitly acknowledged this in its 2013 update, which reduced the bottom of its range of estimates of future warming, but the Interagency Working Group ignored it in its own updates.

A further significant problem with the Interagency Working Group assessment of the social cost of carbon is that it is based on highly pessimistic assumptions concerning the ability of humanity to adapt to climate change. While one of the three models used by the group assumes that adaptive capacity will increase over time and in response to climate change, the others do not. Yet human adaptive capacity has been increasing at a rapid rate, making us less susceptible to a range of climate-related problems than at any time in the past. This is likely to continue.

Take food production. Pessimists claim that climate change is already reducing food availability. But the trend is in the opposite direction. Over the course of the past half-century, improvements in agricultural technology have dramatically increased food production. For example, cereal yields have nearly tripled, from 0.6 tons per acre in 1961 to 1.7 tons per acre in 2013. As a result, per capita food availability is greater and fewer people die today from malnutrition than 50 years ago in spite of a doubling of the world’s population. New technologies available today – but not yet widely adopted – enable farmers to grow food in hotter, drier, and saltier soils than was previously possible. As new technologies continue to be developed and adopted, it seems almost certain that yields will continue to increase even as the world becomes warmer.

Diseases that are affected by climate have also been on the decline. Until the early 1900s, each year about 1 in every 1,000 people contracted typhoid fever in the U.S. – and about 20 percent of those, most of them children, died. By 1920 the incidence had fallen to 1 in 2,500 and by 1960 it affected fewer than 1 in 100,000 people. This decline was largely a result of improvements in, and more widespread disinfection of, drinking water.

Deaths from heat have also been falling in the U.S. in spite of increases in temperature. In 1987, 50 of every 1,000 deaths were attributable to heat; by 2005 that had fallen to 19 per 1,000 deaths. Much of this reduction in heat-related mortality was a result of more widespread use of air conditioning. But other factors, including better treatment for heart attacks, also contributed.

While extreme weather events such as hurricanes and droughts continue to cause disease and misery, not to mention destruction of property, they have become far less deadly. Across the world, improvements in nutrition, water availability, access to health care and other factors associated with economic development have reduced the mortality rate from extreme weather events by more than 90 percent since 1920.

Over the past half-century, trade, technological progress and associated economic growth have lifted billions out of poverty. Indeed, there are fewer people in the world living in absolute poverty today than at any time since 1820, even though the population has grown seven-fold in that time. Over the coming decades, billions more people in Africa, Asia and Latin America will likely escape from poverty. With this increase in wealth will come better access to clean drinking water and sanitation, better health care, more widespread use of air conditioning and other technologies that have already reduced humanity’s vulnerability to climate change.

It is important to avoid Panglossian thinking. We do not live in the best of all possible worlds. Increased concentrations of carbon dioxide will likely impose costs on many people. However, many others will almost certainly benefit. Higher concentrations of carbon dioxide have already increased the rate of growth of most crops – and this is likely to continue. In a warmer world, growing seasons at higher latitudes will be longer, enabling additional crops to be grown. And warmer winters mean fewer deaths from cold, as well as lower heating bills.

So, to sum up: (1) estimates of climate sensitivity suggest that global warming is likely to be milder than even the lowest forecast made by the Interagency Working Group; (2) humanity’s ability to adapt to climate change will almost certainly be greater than even the most optimistic forecasts of the Interagency Working Group; (3) while there will undoubtedly be losers from global warming, there will also be winners and it is quite possible that additional carbon dioxide emissions will generate net benefits for humanity, at least until the end of this century.

This does not mean that the costs and benefits actually cancel each other out: the winners are unlikely to compensate the losers, though in an ideal world perhaps they should. (In a future paper, I intend to identify ways in which laws and policies might be crafted so as to achieve this more equitable end.) But the social cost of carbon is based on cost-benefit analysis and in that analysis, for better or worse, it does not matter whether the winners compensate the losers.

Attempting to use economic models to investigate such complex propositions is so fraught with difficulty that, on the basis of what we know today, to a first approximation the social cost of carbon is zero. That means every regulation promulgated by the White House under the presumption that the social cost of carbon is greater than zero, including the Clean Power Plan, must be re-evaluated.

Julian Morris is vice president of research at Reason Foundation and author of the study, “Assessing the Social Costs and Benefits of Regulating Carbon Emissions.”

Permission to reprint this column is hereby granted, provided that the author, Julian Morris, and Reason Foundation are properly credited.

Assessing the Social Costs and Benefits of Regulating Carbon Emissions https://reason.org/policy-study/social-cost-of-carbon/ Thu, 06 Aug 2015 13:00:00 +0000 http://reason.org/policy-study/social-cost-of-carbon/ Taking into account a wide range of climate models, impact evaluations, economic forecasts and discount rates, as well as the most recent evidence on climate sensitivity, this study finds that the range of social cost of carbon should be revised downwards. At the low end, carbon emissions may have a net beneficial effect (i.e. carbon should be priced negatively), while even at the high end carbon emissions are very unlikely to be catastrophic.

Atmospheric concentrations of greenhouse gases (GHGs), which include carbon dioxide and methane, have been increasing for more than a century. Rising human emissions of these gases, especially from the combustion of fossil fuels and from agriculture, appear to be the primary cause of this increase in concentrations. The temperature of the atmosphere has also increased over the past century. Some of that increase is likely the result of the increase in concentration of GHGs.

Such an increase in temperatures has various consequences, some of which are likely to be beneficial, others harmful. In the late 1970s, economists began assessing the impact of rising greenhouse gas concentrations, and the consequences of restricting emissions. The framework they adopted for this analysis is called “cost-benefit analysis.” The objective of such analysis is to identify policies whose benefits exceed their costs.

In 1993, President Clinton signed Executive Order 12866 which, among other things, requires agencies of the U.S. government to “assess both the costs and the benefits of the intended regulation and, recognizing that some costs and benefits are difficult to quantify, propose or adopt a regulation only upon a reasoned determination that the benefits of the intended regulation justify its costs.”

Starting in 2008, in compliance with this executive order, some agencies of the U.S. government began to incorporate estimates of the “social cost of carbon” (SCC; see box) into their regulatory impact analyses (RIAs). However, not all agencies were using the same estimates of the social cost of carbon, resulting in regulatory impact analyses that were inconsistent. In response, the Office of Management and Budget and the Council of Economic Advisers convened an interagency working group in order to establish a consistent SCC for use in RIAs relating to regulations that restrict emissions of these gases.

In February 2010, the Interagency Working Group (IWG) published “Technical Support Document: Social Cost of Carbon for Regulatory Impact Analysis Under Executive Order 12866.” In that document, a range of estimates was given for the SCC. The SCC was calculated at five-year increments from 2010 to 2050 and it is expected to rise over time. As with all U.S. government estimates of costs and benefits, future costs and benefits are discounted (that is to say, future amounts are reduced by a certain percentage per annum to give their current dollar value).

However, unusually, the IWG did not discount at the rate recommended by the Office of Management and Budget (7% per year), instead choosing to use a range of lower and variable discount rates (these averaged 2.5%, 3% and 5%). In addition, while most of the estimates provided are for the average (in this case, median) forecast of future costs and benefits, the IWG also gave an estimate of the “95th Percentile,” that is, the estimate that is higher than 95% of all forecasts, or in other words an outcome expected to be exceeded with only 5% probability. The IWG has revised its estimates three times since 2010. In the first revision (May 2013), the range of costs shifted upwards dramatically. In the second revision (November 2013), the costs were revised downwards slightly compared to the May 2013 revision. In the third revision (June 2015), the costs were again revised down slightly.
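The distinction between the median and the 95th percentile figures can be seen in a toy example: draw a right-skewed set of SCC estimates and read off both statistics. The simulated distribution below is hypothetical and is not generated from the IWG's models; it only shows why the 95th percentile figure sits far above the median.

```python
# Toy illustration of "median" versus "95th percentile" SCC figures.
# The simulated distribution is hypothetical, not drawn from the IWG's
# integrated assessment models.

import random
import statistics

random.seed(0)
# Right-skewed placeholder distribution of SCC estimates ($ per ton)
estimates = sorted(random.lognormvariate(2.3, 0.9) for _ in range(10_000))

median = statistics.median(estimates)
p95 = estimates[int(0.95 * len(estimates)) - 1]   # value above 95% of draws

print(f"median estimate:          ${median:.2f} per ton")
print(f"95th percentile estimate: ${p95:.2f} per ton")
```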

Study Outline

This study seeks to assess the Interagency Working Group’s estimates of the social cost of carbon (SCC). It begins with a discussion of the framework that underpins the SCC, i.e. cost-benefit analysis.

Part Two provides a brief history of economists’ attempts to estimate the social costs and benefits of carbon.

Part Three reviews some of the estimates of the social cost of carbon that have been derived using integrated assessment models (that is, the types of models used by the IWG).

Part Four describes the methodology adopted by the IWG for calculating the social cost of carbon and assesses some of the criticisms of that assessment.

Part Five focuses on two key factors affecting the “damage function”: the sensitivity of the climate to increases in greenhouse gases and the ability of society to adapt to climate change.

The final section draws conclusions and recommends that policies and regulations predicated on the assumption that the SCC is different from zero should be adjusted to reflect an SCC of zero.

Study: Social Cost of Carbon Likely Much Lower Than Previous Estimates https://reason.org/news-release/social-cost-of-carbon-study/ Thu, 06 Aug 2015 13:00:00 +0000 http://reason.org/news-release/social-cost-of-carbon-study/ A new study from Reason Foundation shows that estimates of the "social cost of carbon" have been falling - and would fall further if new scientific evidence were incorporated. The study calls into question the analyses used to underpin the Obama administration's new Clean Power Plan and other federal regulations targeting emissions of greenhouse gases.

Los Angeles, CA – A new study from Reason Foundation shows that estimates of the “social cost of carbon” have been falling – and would fall further if new scientific evidence were incorporated. The study calls into question the analyses used to underpin the Obama administration’s new Clean Power Plan and other federal regulations targeting emissions of greenhouse gases.

The study by Julian Morris, vice president of research at Reason Foundation, finds the administration’s estimates of the social cost of carbon are “biased upwards” due to their reliance on three “simplistic models, all of which use estimates of climate sensitivity that are likely too high and two of which likely overestimate the economic impact of climate change.”

The study shows that by combining more reliable estimates of the sensitivity of the climate to changes in concentrations of greenhouse gases with more realistic assumptions about both the benefits and costs of carbon dioxide emissions, the social cost of carbon falls dramatically.

“The current best estimate for the social cost of carbon is zero,” said Morris, who has been researching the economics of climate change for over 20 years. “Using that number, most federal regulations limiting carbon emissions, including the Clean Power Plan, could no longer be justified.”

Morris notes that, “The government’s estimates are based on pessimistic assumptions that don’t accurately reflect humanity’s ability to innovate and adapt, and they ignore recent evidence showing that the climate is less sensitive to rising concentrations of carbon dioxide than was previously assumed.”

Full Study Online

The full study, Assessing the Social Costs and Benefits of Regulating Carbon Emissions, is available here and here (.pdf).

About Reason Foundation

Reason Foundation is a nonprofit think tank dedicated to advancing free minds and free markets. Reason Foundation produces respected public policy research on a variety of issues and publishes the critically acclaimed Reason magazine and its website www.reason.com. For more information please visit www.reason.org.

Contact

Julian Morris, Vice President of Research, Reason Foundation, (212) 495-9599
