
What is a Megawatt? - By Bob Bellemare, President and CEO

Mega-what? The term is tossed around a lot. Megawatts are basic to understanding electricity planning concepts, but what are they? (In the spirit of today's IssueAlert™, we will present a complimentary copy of UtiliPoint's Renewables InfoGrid to the first three readers who correctly answer the question at the end of this article.)

News stories covering electric generation topics often try to illustrate the worth of a megawatt in terms of how many homes a particular amount of generation could serve. A June 11, 2003 Reuters' article describing the potential sale of AEP's Texas generation facilities states that AEP is offering to sell “29 generating units with a total net generation capacity of 4,497 megawatts, or roughly enough electricity to power 4.5 million average homes.” A May 21, 2003 article in the San Diego Union Tribune describes an agreement with Sempra that “involves 1,900 megawatts, enough to supply 1.9 million homes.”

Such articles give the impression that one megawatt is enough electricity to supply 1,000 homes. Yet occasionally an article will use a different conversion, such as an April 17, 2003 article by the Environment News Service, which states that "Tucson Electric Power expanded its solar capacity to 2.4 megawatts, enough to power 420 homes."

So what really is a megawatt (MW) and how many homes can one MW of generation really serve?

The Basics

The answer starts with understanding the basic definition of energy terms. Watts (W) are the yardstick for measuring power. A one hundred watt light bulb, for example, is rated to consume one hundred watts of power when turned on. If such a light bulb were on for four hours it would consume a total of 400 watt-hours (Wh) of energy. Watts, therefore, measure instantaneous power while watt-hours measure the total amount of energy consumed over a period of time.
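To make the distinction concrete, here is a minimal sketch in Python that simply restates the light-bulb example above (the variable names are our own, chosen for illustration):

```python
# Power is measured in watts; energy is power multiplied by time.
bulb_power_w = 100   # a 100-watt light bulb
hours_on = 4         # the bulb is switched on for four hours

energy_wh = bulb_power_w * hours_on
print(f"Energy consumed: {energy_wh} watt-hours")  # 400 Wh
```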

A megawatt (MW) is one million watts and a kilowatt (kW) is one thousand watts. Both terms are commonly used in the power business when describing generation or load consumption. For instance, a 100 MW rated wind farm is capable of producing 100 MW during peak winds, but will produce much less than its rated amount when winds are light. As a result of these varying wind speeds, over the course of a year a wind farm may only average 30 MW of power production. Similarly, a 1,000 MW coal plant may average 750 MW of production over the course of a year because the plant will shut down for maintenance from time to time and because it operates at less than its rated capability when other power plants can produce power less expensively.

The ratio of a power plant's average production to its rated capability is known as capacity factor. In the previous example, the wind farm would have a 30 percent capacity factor (30 MW average production divided by 100 MW rated capability) and the coal plant would have a 75 percent capacity factor (750 MW average divided by 1,000 MW rated capability). Load factor, on the other hand, is generally calculated by dividing the average load by the peak load over a certain period of time. If the residential load at a utility averaged 5,000 MW over the course of a year and the peak load was 10,000 MW, then the residential customers would be said to have a load factor of 50 percent (5,000 MW average divided by 10,000 MW peak).
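Those two ratios can be restated as a short Python sketch using the wind-farm, coal-plant, and residential-load figures from the paragraphs above (the function names are ours, purely for illustration):

```python
def capacity_factor(average_output_mw, rated_capacity_mw):
    """Ratio of a plant's average production to its rated capability."""
    return average_output_mw / rated_capacity_mw


def load_factor(average_load_mw, peak_load_mw):
    """Ratio of average load to peak load over a given period."""
    return average_load_mw / peak_load_mw


print(capacity_factor(30, 100))     # wind farm: 0.30, a 30 percent capacity factor
print(capacity_factor(750, 1000))   # coal plant: 0.75, a 75 percent capacity factor
print(load_factor(5000, 10000))     # residential load: 0.50, a 50 percent load factor
```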

Knowing the peak and average demand of a power system is critical to proper planning. The power system must be designed to serve the peak load, in this example 10,000 MW. But the actual load will vary. The load might be 10,000 MW at noon, but only 4,000 MW at midnight, when fewer appliances are operating. The capacity or load factor gives utility planners a sense of this variation. A 40 percent load factor would indicate large variations in load, while a 90 percent load factor would indicate little variation. Residential homes tend to have low load factors because people are home and using appliances only during certain hours of the day, while certain industrial customers will have very high load factors because they operate 24 hours a day, 7 days a week.

Residential Electricity Consumption

The amount of electricity consumed by a typical residential household varies dramatically by region of the country. According to 2001 Energy Information Administration (EIA) data, New England residential customers consume the least electricity, averaging 653 kilowatt hours (kWh) of load in a month, while the East South Central region, which includes states such as Alabama, Mississippi, and Tennessee, consumes nearly double that amount at 1,193 kWh per household.

Census Division / State      Number of Consumers    Average Monthly Consumption (kWh)
New England                  5,822,935              618
Middle Atlantic              15,045,495             641
East North Central           18,705,754             763
West North Central           8,287,837              903
South Atlantic               22,473,797             1,088
East South Central           7,356,975              1,193
West South Central           12,883,403             1,151
Mountain                     7,368,280              847
West Coast                   15,763,570             668
Hawaii & Alaska              609,661                642
U.S. Total                   114,317,707            877



The large disparity in electric consumption is driven by many factors, including the heavier use of air conditioning in the South. It stands to reason, then, that a one-megawatt generator in the Northeast would be capable of serving about twice as many households as a generator located in the South, because households in the Northeast consume roughly half as much electricity as those in the South.

Going through the math, a 1,000 megawatt rated coal generator with a 75 percent capacity factor generates about 6.6 billion kWhs in a year, equivalent to the amount of power consumed by about 900,000 homes in the Northeast but only 460,000 homes in the South. In other words, each megawatt of rated capacity for a coal plant in the Northeast generates the equivalent amount of electricity consumed by 900 homes in the Northeast but only about 460 homes in the South.

By comparison, a 30 percent capacity factor, 100 MW wind farm would generate the equivalent amount of power consumed by about 35,000 homes in the Northeast and 18,000 homes in the South. In other words, each megawatt of rated capacity for a wind farm in the Northeast generates the equivalent amount of electricity consumed by 350 homes in the Northeast and 180 homes in the South.
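The arithmetic behind those two paragraphs can be sketched in Python as follows. The annual household figures used here (roughly 7,400 kWh in the Northeast and 14,000 kWh in the South) are our own approximations, consistent with the monthly averages in the table above:

```python
HOURS_PER_YEAR = 8760


def homes_served(rated_mw, capacity_factor, annual_kwh_per_home):
    """Equivalent number of homes a plant's annual output could supply."""
    annual_generation_kwh = rated_mw * 1000 * capacity_factor * HOURS_PER_YEAR
    return annual_generation_kwh / annual_kwh_per_home


# Assumed annual household consumption, derived from the monthly table above
NORTHEAST_KWH_PER_YEAR = 7_400   # roughly 618 kWh per month
SOUTH_KWH_PER_YEAR = 14_000      # roughly 1,150 to 1,190 kWh per month

print(homes_served(1000, 0.75, NORTHEAST_KWH_PER_YEAR))  # coal plant: ~890,000 homes
print(homes_served(1000, 0.75, SOUTH_KWH_PER_YEAR))      # coal plant: ~470,000 homes
print(homes_served(100, 0.30, NORTHEAST_KWH_PER_YEAR))   # wind farm: ~36,000 homes
print(homes_served(100, 0.30, SOUTH_KWH_PER_YEAR))       # wind farm: ~19,000 homes
```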

So What is a Megawatt Worth?

The examples demonstrate that there are two very important aspects to knowing what a MW of generation capacity is worth in terms of how many equivalent homes it represents. The first is how much electricity the power plant itself produces, which can be calculated from the plant's rating and capacity factor. The second is the plant's location, since the amount of electricity consumed by a typical household varies dramatically across the country.

The numbers used in the examples were typical representations of coal and wind power plants. A low-cost coal plant typically operates at capacity factors of 60 percent or higher. High quality wind sites will generate at about 30 to 40 percent of their rated capability on average because of wind speed variations. Solar generators average even less production, typically under 25 percent capacity factor, because the generators do not produce electricity during the nighttime or during cloudy days.

The commonly cited rule that one MW of generation equates to 1,000 homes is a myth that likely originated years ago, when households were smaller and air conditioning was less common. For a conventional generator, such as a coal plant, a megawatt of capacity will produce about the same amount of electricity as 400 to 900 homes consume in a year. For renewable energy such as wind or solar, the equivalent is even lower because these plants typically produce less energy than conventional generators, since their "fuel source" is intermittent.

Of course, no one generator is normally considered sufficient by itself to supply an individual customer. All generators must be taken out of service for maintenance and some types of generators, such as nuclear, wind, and solar, are not normally able to “follow” changes in load. For these reasons power systems require the use of backup generation sources and occasionally electric energy storage, such as batteries, to ensure the amount of power generated always matches the load demand, every second.

Test Yourself

UtiliPoint will send a copy of our Renewables InfoGrid to the first three readers who answer the following problem correctly. Email your answer to rbell@utilipoint.com.

In the following paragraph, calculate the capacity factor for the situation described. Assume the average house in Colorado consumes 8,000 kWh (8 MWh) of electricity in a year.

“At 162 megawatts, the new Colorado Green wind farm will be the third-largest in the nation and will produce enough electricity each year to meet the needs of roughly 75,000 homes.” June 16, 2003, Pueblo Chieftain.

Hint:

Let's check the previously referenced statement in the June 11, 2003 Reuters' article concerning AEP's Texas generators: “29 generating units with a total net generation capacity of 4,497 megawatts, or roughly enough electricity to power 4.5 million average homes.”

According to EIA data, in Texas the average household consumes about 14,000 kWh (14 MWh) in a year. If 4,497 MW of generation were truly to serve the equivalent of 4.5 million homes, then the generation fleet would need to operate at an average capacity factor of 160 percent, which is not possible.

The calculation goes as follows:

First calculate the total residential load for a year. This is done by multiplying 14 MWh per household by 4.5 million homes for a total of 63 million MWhs.

Next calculate the maximum amount of generation that could be produced from the generators in a year. Here it is assumed the generators will operate at their rated capability (4,497 MW) during all hours of a year (8,760 hours). Multiplying 4,497 MW by 8,760 hours gives 39.4 million MWhs of generation capability in a year. In other words, if the generators operated at a 100 percent capacity factor they would generate 39.4 million MWhs in a year.

Now divide 63 million MWhs by 39.4 million MWhs to calculate a capacity factor of 1.6 or 160 percent.

This is another way to calculate capacity factor: take the total generation produced over a year and divide it by the maximum potential production in that year. In this case we divide 63 million MWhs (the total generation needed in a year to serve the 4.5 million homes) by 39.4 million MWhs (the theoretical maximum production from 4,497 MW of generation), resulting in a 160 percent capacity factor, which is not possible to achieve.
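The same check can be written as a quick back-of-the-envelope calculation in Python, using the assumed 14 MWh per Texas household from above:

```python
HOURS_PER_YEAR = 8760

rated_capacity_mw = 4497        # AEP's quoted fleet capacity
claimed_homes = 4_500_000       # homes the article claims it could power
annual_mwh_per_home = 14        # assumed average annual use per Texas household

required_mwh = claimed_homes * annual_mwh_per_home       # 63 million MWh
max_possible_mwh = rated_capacity_mw * HOURS_PER_YEAR    # ~39.4 million MWh

implied_capacity_factor = required_mwh / max_possible_mwh
print(f"Implied capacity factor: {implied_capacity_factor:.0%}")  # ~160%, not achievable
```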

Clearly, 4,497 MW of generation will not serve 4.5 million households in Texas. In fact, in practice, 4,497 MW of generation will produce substantially less than 39.4 million MWhs in a year because no generating unit operates at 100 percent of its rated value year round.