Natural gas saves water, even when factoring in water lost to hydraulic fracturing

For every gallon of water used to produce natural gas through hydraulic fracturing, Texas saved 33 gallons of water by generating electricity with that natural gas instead of coal (in 2011). Credit: University of Texas at Austin

A new study finds that in Texas, the U.S. state that annually generates the most electricity, the transition from coal to natural gas for electricity generation is saving water and making the state less vulnerable to drought.

Even though natural gas extraction through hydraulic fracturing requires significant water consumption in Texas, that new consumption is more than offset by the water efficiency gains of shifting electricity generation from coal to natural gas. The researchers estimate that the water saved by shifting a power plant from coal to natural gas is 25 to 50 times as great as the amount of water used in hydraulic fracturing to extract the natural gas. Natural gas also enhances drought resilience by providing so-called peaking plants to complement increasing wind generation, which doesn't consume water.

The results of The University of Texas at Austin study are published this week in the journal Environmental Research Letters.

The researchers estimate that in 2011 alone, Texas would have consumed an additional 32 billion gallons of water—enough to supply 870,000 average residents—if all its natural gas-fired plants were instead coal-fired plants, even after factoring in the additional consumption of water for hydraulic fracturing to extract the natural gas.
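As a back-of-envelope sanity check (not a figure from the study itself), spreading 32 billion gallons over 870,000 residents for a year works out to roughly 100 gallons per person per day, consistent with typical U.S. residential water use:

```python
# Back-of-envelope check of the article's figures (values taken from the article)
saved_gallons = 32e9   # water saved in 2011 by using gas instead of coal
residents = 870_000    # number of average residents that water could supply
per_person_per_day = saved_gallons / residents / 365
print(round(per_person_per_day))  # about 100 gallons per person per day
```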

Hydraulic fracturing is a process in which water, sand and chemicals are pumped at high pressure into a well to fracture surrounding rocks and allow oil or gas to more easily flow. Hydraulic fracturing and horizontal drilling are the main drivers behind the current boom in U.S. natural gas production.

Environmentalists and others have raised concerns about the amount of water that hydraulic fracturing consumes. In Texas, concerns are heightened because the use of hydraulic fracturing is expanding rapidly while water supplies are dwindling as the third year of a devastating drought grinds on. Because most electric power plants rely on water for cooling, the electric power supply might be particularly vulnerable to drought.

"The bottom line is that hydraulic fracturing, by boosting natural gas production and moving the state away from water-intensive coal technologies, makes our electric power system more drought resilient," says Bridget Scanlon, senior research scientist at the university's Bureau of Economic Geology, who led the study.

To study the drought resilience of Texas power plants, Scanlon and her colleagues collected water use data for all 423 of the state's power plants from the Energy Information Administration and from state agencies including the Texas Commission on Environmental Quality and the Texas Water Development Board, as well as other data.

Since the 1990s, the primary type of power plant built in Texas has been the natural gas combined cycle (NGCC) plant with cooling towers, which uses fuel and cooling water more efficiently than older steam turbine technologies. About a third of Texas power plants are NGCC. NGCC plants consume about a third as much water as coal steam turbine (CST) plants.

The other major type of natural gas plant in the state is a natural gas combustion turbine (NGCT) plant. NGCT plants can also help reduce the state's water consumption for electricity generation by providing "peaking power" to support expansion of wind energy. Wind turbines don't require water for cooling; yet wind doesn't always blow when you need electricity. NGCT generators can be brought online in a matter of seconds to smooth out swings in electricity demand. By combining NGCT generation with wind generation, total water use can be lowered even further compared with coal-fired power generation.

The study focused exclusively on Texas, but the authors believe the results should be applicable to other regions of the U.S., where rates of water use for the key technologies evaluated (hydraulic fracturing, NGCC plants with cooling towers and traditional coal steam turbine plants) are generally similar.

The Electric Reliability Council of Texas, manager of the state's electricity grid, projects that if current market conditions continue through 2029, 65 percent of new power generation in the state will come from NGCC plants and 35 percent from natural gas combustion turbine plants, which use no water for cooling, but are less energy efficient than NGCC plants.

"Statewide, we're on track to continue reducing our water intensity of electricity generation," says Scanlon.

Hydraulic fracturing accounts for less than 1 percent of the water consumed in Texas. But in some areas where its use is heavily concentrated, it strains local water supplies, as documented in a 2011 study by Jean-Philippe Nicot of the Bureau of Economic Geology. Because natural gas is often used far from where it is originally produced, water savings from shifting to natural gas for electricity generation might not benefit the areas that use more water for hydraulic fracturing.


Journal information: Environmental Research Letters

Citation: Natural gas saves water, even when factoring in water lost to hydraulic fracturing (2013, December 19) retrieved 14 October 2019


User comments

Dec 20, 2013
NGCT plants can also help reduce the state's water consumption for electricity generation by providing "peaking power" to support expansion of wind energy.

But NGCT plants built for wind turbine load following are only about half as efficient as NGCC plants, in terms of both energy and water use (approx. 35% versus 60% efficient).

Since wind power isn't dispatchable, it has to displace baseload power generation so that demand can always be met. This means whenever the wind is not blowing, the gas turbines are running, and you alternate between the two to maintain constant output.

So, for a typical wind power capacity factor of 0.25 you need 75% of the energy from the NGCT gas turbines. This means that to produce 1 unit of energy, you consume 0.75 / 0.35 = 2.14 units of natural gas.

Compare that to NGCC alone with no wind power, which would use 1 / 0.6 = 1.67 units of gas and about 40% of the water, and it becomes apparent the whole wind power argument is just bunk.
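The commenter's arithmetic can be reproduced directly (the capacity factor and efficiency values are the commenter's assumptions, not measured data):

```python
cp_wind = 0.25    # assumed onshore wind capacity factor
eff_ngct = 0.35   # assumed simple-cycle (NGCT) thermal efficiency
eff_ngcc = 0.60   # assumed combined-cycle (NGCC) thermal efficiency

# Wind + NGCT backup: gas fills the (1 - cp_wind) share at NGCT efficiency
gas_with_wind = (1 - cp_wind) / eff_ngct
# NGCC alone supplying the same 1 unit of electricity
gas_ngcc_only = 1 / eff_ngcc
print(round(gas_with_wind, 2), round(gas_ngcc_only, 2))  # 2.14 1.67
```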

Dec 20, 2013
Furthermore, the energy balance of the equation shows that you get 0.25 units of energy from wind power, but you use 0.47 units more natural gas, which means utilizing wind power actually increases your fossil fuel consumption!

In order to break even in terms of energy, you need to add no more than 0.25 units of gas consumption over the best baseline of 1.67, which means the capacity factor of the wind turbines must be over 0.33 which on average never happens with wind turbines on land.

Texas had 12,212 MW of wind capacity and produced 31,860,000 MWh in 2012, yielding a capacity factor of 0.297, so despite claims of improved Cp they're not actually achieving any savings over gas alone.
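The quoted capacity factor follows from those Texas 2012 numbers (2012 was a leap year, so 8,784 hours):

```python
capacity_mw = 12_212          # installed wind capacity
generation_mwh = 31_860_000   # 2012 wind generation
hours_2012 = 366 * 24         # leap year
cf = generation_mwh / (capacity_mw * hours_2012)
print(round(cf, 3))  # 0.297
```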

This again goes to show what the reality is with non-dispatchable intermittent renewables with low capacity factors, such as wind and solar. The external costs of simply using them are massive, which is not factored into the public propaganda for renewable energy.

Dec 20, 2013
The consumption of water during shale gas fracking is huge, but still negligible in comparison to the amount of groundwater contaminated by it. But these losses aren't considered in the above study.

Dec 20, 2013
On the other hand, General Electric seems to be claiming wind turbine capacity factors of over 50%, which is physically impossible unless you do a little sleight of hand.

For the statisticians out there: to simulate an ideal wind turbine, you can use a Rayleigh distribution with sigma 5 (mean wind speed of about 6.25 m/s) to approximate how many hours the wind blows at each speed over the course of a year; you can vary the parameter slightly to get different estimates. Raise the speed to the third power and you get the distribution of wind energy over wind speeds. Mind that above 12 m/s, wind turbines limit power to prevent damage to the turbine, so your maximum output and nameplate capacity correspond to 12^3. Then integrate over 0-20 m/s to get the output of an ideal wind turbine that captures all the wind energy there is to be had.

Then compare this amount with 12^3 and you get the ideal capacity factor. With these numbers, it turns out to be 0.24.

So how do you get to 50%?
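A minimal sketch of the calculation the commenter describes, assuming the stated Rayleigh model with sigma 5, a cubic power curve capped at 12 m/s, and a 20 m/s cut-out:

```python
import math

SIGMA = 5.0        # assumed Rayleigh scale parameter (~6.25 m/s mean wind)
RATED = 12.0 ** 3  # power capped above 12 m/s; nameplate scales as v^3

def rayleigh_pdf(v):
    return (v / SIGMA**2) * math.exp(-v**2 / (2 * SIGMA**2))

# Riemann sum over 0-20 m/s (turbine cuts out above 20 m/s)
dv = 0.001
energy = sum(rayleigh_pdf(i * dv) * min((i * dv)**3, RATED) * dv
             for i in range(1, int(20 / dv) + 1))
cp_ideal = energy / RATED
print(round(cp_ideal, 2))  # about 0.24, matching the comment's figure
```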

Dec 20, 2013
Well, you get to 50% by pretending that your wind turbine is smaller than it really is. If you start to cap power at 7 m/s instead of 12 m/s and tell everyone that is your real nameplate power, you get constant power at anything over 7 m/s and you produce a Cp of 0.57, at least in theory.

But you also lose 52% of the available wind energy, which means your cost per unit of energy roughly doubles: the turbine produces only about half the energy it is capable of.

Some of you might question why I chose sigma 5 instead of 4 or 6 or whatever. Well, I chose it because it closely matches the average wind speed distribution measured on a wind farm site at Lee Ranch, Colorado: http://en.wikiped...ency.svg
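The derating trick can be checked numerically under the same assumptions (Rayleigh wind with sigma 5, cubic power curve, 20 m/s cut-out): capping at 7 m/s instead of 12 m/s pushes the apparent capacity factor to about 0.57 while keeping only about half the energy.

```python
import math

SIGMA = 5.0  # assumed Rayleigh scale parameter

def rayleigh_pdf(v):
    return (v / SIGMA**2) * math.exp(-v**2 / (2 * SIGMA**2))

def cp_and_energy(cap_speed, cutout=20.0, dv=0.001):
    """Capacity factor and captured energy for a cubic power
    curve capped at cap_speed, with cut-out above cutout."""
    rated = cap_speed ** 3
    energy = sum(rayleigh_pdf(i * dv) * min((i * dv)**3, rated) * dv
                 for i in range(1, int(cutout / dv) + 1))
    return energy / rated, energy

cp12, e12 = cp_and_energy(12.0)  # honest 12 m/s rating
cp7, e7 = cp_and_energy(7.0)     # derated 7 m/s "nameplate"
print(round(cp7, 2))       # about 0.57
print(round(e7 / e12, 2))  # fraction of energy kept, about 0.48
```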

Dec 20, 2013
And the irony is that government wind power subsidies offer absolutely no incentive to increase capacity factors by throwing away energy, because feed-in tariffs reward producers for pushing more energy at worse capacity factors.

The result is that wind power producers get paid and the rest of society has to use more energy than it saves to accommodate the forced feed of wind power into the grid.
