Wednesday, June 21, 2006

Nuclear Power

Nuclear power is the controlled use of nuclear reactions to release energy for work, including propulsion, heat, and the generation of electricity. Human use of nuclear power to do significant useful work is currently limited to nuclear fission and radioactive decay. Nuclear energy is produced when a fissile material, such as uranium-235 (235U), is assembled in sufficient concentration that neutrons released by fission induce further fissions in a controlled chain reaction. The chain reaction releases heat, which is used to boil water, produce steam, and drive a steam turbine; the turbine can be used for mechanical work and also to generate electricity. Nuclear power propels most military submarines and aircraft carriers and provides 7% of the world's energy and 17% of the world's electricity. The United States produces the most nuclear energy, with nuclear power providing 20% of the electricity it consumes, while France produces the highest share of its electricity from nuclear reactors, about 80% as of 2006. [1] [2]
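
To give a rough sense of the scale involved, here is a back-of-the-envelope sketch in Python. The roughly 200 MeV released per U-235 fission is a standard textbook figure; the 1 GW electrical output and 33% steam-cycle efficiency are illustrative assumptions, not figures taken from this article.

# Back-of-the-envelope estimate: how much U-235 a 1 GW(e) plant fissions per day.
# Assumptions (illustrative, not from the article): ~200 MeV per fission,
# ~33% thermal-to-electric efficiency, 1 GW of electrical output.

MEV_TO_JOULES = 1.602e-13            # 1 MeV expressed in joules
ENERGY_PER_FISSION_J = 200 * MEV_TO_JOULES
ELECTRIC_OUTPUT_W = 1e9              # 1 GW of electricity (assumed plant size)
THERMAL_EFFICIENCY = 0.33            # assumed steam-cycle efficiency

thermal_power_w = ELECTRIC_OUTPUT_W / THERMAL_EFFICIENCY
fissions_per_second = thermal_power_w / ENERGY_PER_FISSION_J

AVOGADRO = 6.022e23
U235_MOLAR_MASS_G = 235.0
grams_per_second = fissions_per_second * U235_MOLAR_MASS_G / AVOGADRO
kg_per_day = grams_per_second * 86400 / 1000

print(f"fissions per second: {fissions_per_second:.2e}")   # on the order of 1e20
print(f"U-235 fissioned:     {kg_per_day:.1f} kg per day")  # roughly 3 kg/day

On these assumptions a large reactor fissions on the order of a few kilograms of U-235 per day, which illustrates why nuclear fuel requirements are tiny compared with the thousands of tonnes of coal a fossil plant of similar output burns each day.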

Origins
The first successful experiment with nuclear fission was conducted in 1938 in Berlin by Otto Hahn, Lise Meitner and Fritz Strassmann.
During the Second World War, a number of nations embarked on crash programs to develop nuclear energy, focusing first on the development of nuclear reactors. The first self-sustaining nuclear chain reaction was obtained at the University of Chicago by Enrico Fermi on December 2, 1942, and reactors based on his research were used to produce the plutonium necessary for the "Fat Man" weapon dropped on Nagasaki, Japan. Several nations began their own construction of nuclear reactors at this point, primarily for weapons use, though research was also being conducted into their use for civilian electricity generation.
Electricity was generated for the first time by a nuclear reactor on December 20, 1951 at the EBR-I experimental fast breeder station near Arco, Idaho, which initially produced about 100 kW.

In 1952 a report by the Paley Commission (The President's Materials Policy Commission) for President Harry Truman made a "relatively pessimistic" assessment of nuclear power, and called for "aggressive research in the whole field of solar energy". [3]
A December 1953 speech by President Dwight Eisenhower, "Atoms for Peace", set the U.S. on a course of strong government support for the international use of nuclear power.

Early years

The Shippingport Atomic Power Station in Shippingport, Pennsylvania was the first commercial reactor in the USA and was opened in 1957.
On June 27, 1954, the world's first nuclear power plant to generate electricity for a power grid started operations at Obninsk, USSR [4]. The reactor was graphite-moderated, water-cooled and had a capacity of 5 megawatts (MW). The world's first commercial nuclear power station, Calder Hall at Sellafield, England, was opened in 1956; it was a gas-cooled Magnox reactor with an initial capacity of 45 MW (later 196 MW) [5]. The Shippingport Reactor (Pennsylvania, 1957), a pressurized water reactor, was the first commercial nuclear generator to become operational in the United States.

In 1954, Lewis Strauss, chairman of the United States Atomic Energy Commission (forerunner of the U.S. Nuclear Regulatory Commission), famously declared that nuclear power would be "too cheap to meter" [6] and foresaw 1,000 nuclear plants online in the USA by the year 2000.
In 1955 the United Nations' "First Geneva Conference", then the world's largest gathering of scientists and engineers, met to explore the technology. In 1957 EURATOM was launched alongside the European Economic Community (the latter is now the European Union). The same year also saw the launch of the International Atomic Energy Agency (IAEA).

Thanks to the presence of the nearby Bettis Atomic Power Laboratory and the Shippingport power plant, Pittsburgh, Pennsylvania became known as the world's first nuclear-powered city in 1960.

Development
Installed nuclear capacity initially grew relatively quickly, rising from less than 1 gigawatt (GW) in 1960 to 100 GW in the late 1970s and 300 GW in the late 1980s. Since the late 1980s capacity has grown much more slowly, reaching 366 GW in 2005, with the recent increase coming primarily from Chinese expansion of nuclear power. Between around 1970 and 1990, more than 50 GW of capacity was under construction (peaking at over 150 GW in the late 1970s and early 1980s); in 2005, around 25 GW of new capacity was planned. More than two-thirds of all nuclear plants ordered after January 1970 were eventually cancelled.[7]

During the 1970s and 1980s, rising costs (related to greatly extended construction times, largely due to regulatory delays) and falling fossil fuel prices made nuclear power plants then under construction less attractive. In the 1980s (U.S.) and 1990s (Europe), flat load growth and electricity market liberalization also made the addition of large new baseload capacity unnecessary.

A general movement against nuclear power arose during the last third of the 20th century, based on fears of nuclear accidents and latent radiation and on opposition to the production, transport and final storage of nuclear waste. Perceived risks to public health and safety, the 1979 accident at Three Mile Island and the 1986 Chernobyl accident played a key part in stopping new plant construction in many countries. Austria (1978), Sweden (1980) and Italy (1987) voted in referendums to oppose or phase out nuclear power, while opposition in Ireland prevented a nuclear programme there. However, the Brookings Institution suggests in [8] that new nuclear units have not been ordered primarily for economic reasons, rather than because of fears of accidents.

As of 2006, stated plans to use nuclear power for electricity generation in Iran and North Korea have been suspected of serving as cover for nuclear weapons proliferation.
