Friday, September 25, 2009

Water detected on the Moon

WASHINGTON (AFP) – Water particles have been detected on the surface of the Moon by three missions, including an Indian probe.
The evidence, disclosed in new scientific papers, overturns the long accepted view that lunar soil is dry and comes just two weeks before a NASA probe is to crash into the surface near the Moon's southern pole to see if water can be detected in the dust and debris released by the impact.
The new data was gathered by probes equipped with NASA instruments designed to map the Moon's mineral composition.
The so-called "Moon Mineralogy Mapper," or M3, uses the reflection of sunlight off the Moon's surface to determine soil composition.
In one of the three papers published in the latest edition of the journal Science, researchers said they analyzed light waves detected by an M3 instrument on board an Indian satellite, Chandrayaan-1.
The reflected light waves indicated a chemical bond between oxygen and hydrogen -- proof, the researchers said, of the existence of water on the Moon's surface.
Larry Taylor of the University of Tennessee, one of the study's co-authors, said the instrument is capable of detecting the composition of the thin upper layer of the Moon's surface only to a depth of two or three inches.
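As a rough illustration of the idea, here is a minimal sketch of pulling an absorption dip out of a reflectance spectrum. The wavelengths and reflectance values are invented for the example; real M3 data and retrieval methods are far more involved.

```python
# Illustrative only: invented reflectance spectrum with a dip near 3 microns,
# the region where O-H bonds (water or hydroxyl) absorb.
wavelengths_um = [2.6, 2.7, 2.8, 2.9, 3.0, 3.1]        # microns (assumed)
reflectance    = [0.30, 0.29, 0.24, 0.22, 0.25, 0.28]  # assumed values

# Straight-line "continuum" drawn between the band shoulders at 2.6 and 3.1 um.
left, right = reflectance[0], reflectance[-1]
span = wavelengths_um[-1] - wavelengths_um[0]

for wl, r in zip(wavelengths_um, reflectance):
    continuum = left + (right - left) * (wl - wavelengths_um[0]) / span
    band_depth = 1.0 - r / continuum   # positive = absorption below continuum
    print(f"{wl:.1f} um: band depth {band_depth:+.2f}")

# A consistent dip centred near 2.8-3.0 um is the kind of O-H signature the
# papers describe; a flat profile would indicate dry soil.
```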
Until now, scientists had advanced the theory that there might be ice at the permanently dark bottom of craters at the Moon's poles but that the rest of the Moon was totally dry.
Lunar rocks and soil contain about 45 percent oxygen, but the source of the hydrogen observed by the instruments on the three probes remains to be determined.
Taylor and his colleagues believe it may have come from an astronomical phenomenon called the solar wind, which consists mainly of streams of positively charged hydrogen ions emitted as the sun undergoes nuclear fusion.
They estimate that each ton of lunar soil holds, at most, roughly a litre of water.
Two other probes equipped with M3-type instruments also detected the chemical signature for the presence of water.
These include data gathered by the American spacecraft Cassini as it passed near the Moon a decade ago on its way to Saturn.
The third probe, also American, was Deep Impact, which was launched toward the comet Tempel-1 in 2005 to pierce it with a projectile in order to analyze the dust cloud created by the impact.
Deep Impact passed near the Moon to gather data with an instrument similar to M3.
Samples of lunar rock and soil brought back to Earth by Apollo astronauts in the 1960s also contained traces of water.
But the containers in which they were transported were not hermetically sealed so researchers dismissed the presence of water as coming from the Earth.
"To some extent, we were fooled," said Taylor, who has studied the original Apollo missions. "Since the boxes leaked, we just assumed the water we found was from contamination with terrestrial air."
Indian scientists lost radio contact with the Chandrayaan-1 lunar satellite last month, but it had already collected enough data to provide the firmest evidence so far of water concentrated near the lunar poles.
"To find water on the moon was one of the main objectives," mission director Mylswamy Annadurai told AFP in Bangalore.
"The baby has done its job," a clearly delighted Annadurai said. "It's a major milestone, although we still have to quantify the findings."
"It was a combined team effort and of great significance for international space cooperation," he added. link....

Tuesday, September 22, 2009

New Solar Panels Generate Energy from Indirect Sunlight

Commonly used solar cells require direct sunlight in order to produce electricity; if they do not get enough sunlight, their efficiency drops considerably. Recently engineers from GreenSun Energy, a company based in Tel Aviv, presented their latest invention - a solar cell able to produce power from diffused light. The new solar cell features a specialized colored panel, resembling colored plexiglass.
According to GreenSun Energy, the panel's glass is made with fluorescent dyes and metal nanoparticles. Besides being more efficient, the new solar cells could also cost less than traditional solar cells. Another advantage is that they require 80 percent less silicon than traditional cells (less silicon means a lower cost of production). When sunlight, direct or indirect, strikes the panel, it disperses across the glass and the metal nanoparticles guide it to the edges, where the silicon is placed.

The company's latest invention costs $2.10/W and has an efficiency of about 12 percent, compared with around $4.54/W for a traditional solar cell. In addition, conventional solar cells lose efficiency because of heat that is not turned into energy, CleanTechnica reports. In the new panels, the sunlight is diffused across the entire panel, and the nanoparticles bring the light to the edges of the panel where it is transformed into energy. Currently the Tel Aviv-based company is working on making its latest invention even more efficient.
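To put the quoted per-watt figures side by side, here is a quick back-of-the-envelope comparison; the 5 kW system size is an arbitrary assumption for illustration.

```python
# Quick cost comparison using the per-watt figures quoted above.
# The 5 kW system size is an assumed example, not a figure from the article.
system_w = 5_000                 # assumed system size in watts
greensun_cost_per_w = 2.10       # $/W, figure quoted above
conventional_cost_per_w = 4.54   # $/W, figure quoted above

greensun_total = system_w * greensun_cost_per_w
conventional_total = system_w * conventional_cost_per_w

print(f"GreenSun:     ${greensun_total:,.0f}")
print(f"Conventional: ${conventional_total:,.0f}")
print(f"Saving:       ${conventional_total - greensun_total:,.0f} "
      f"({1 - greensun_cost_per_w / conventional_cost_per_w:.0%})")
```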

Engineers at GreenSun aim to increase the efficiency of their new solar cell from 12 percent to 20 percent. They also hope to reduce the cost of producing the new panels to $0.94/W. link....

Saturday, September 19, 2009

Europe Launches Biggest Offshore Wind Farm in the World

Today wind power represents the fastest developing source of green energy in Europe. Recently Denmark announced that this week it hopes to launch Horns Rev 2, the largest offshore wind farm in the world.
The farm features 91 turbines spread over an area of 35 square kilometers and will be able to generate 209 megawatts of electricity, enough to power about 200,000 homes.
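For a rough sense of what those numbers imply, the arithmetic below simply divides the quoted nameplate capacity by the number of turbines and homes; actual output will vary with the wind.

```python
# Rough arithmetic behind the figures quoted above (209 MW, 91 turbines,
# ~200,000 homes). Purely illustrative; real output depends on wind conditions.
capacity_mw = 209
turbines = 91
homes = 200_000

print(f"Capacity per turbine: {capacity_mw / turbines:.1f} MW")
print(f"Nameplate capacity per home: {capacity_mw * 1e6 / homes:,.0f} W")
```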
The Danish utility company Dong Energy puts the cost of the project at about $1 billion. Although the price is high, Horns Rev 2 is expected to considerably cut carbon emissions, a step toward Europe's goal of getting 20 percent of its energy from renewable sources by 2020, reports Green Inc.
According to the European Wind Energy Association, if the development of offshore wind projects continues, offshore turbines could generate around 10 percent of Europe's power over the next 11 years. In addition, offshore wind projects are expected to provide 200,000 new jobs by 2025.
link....

Nationwide Hydrogen Fuel Network to Be Created in Germany by 2015

A lot of attention is paid today to green cars, especially electric ones, but the popularity of hydrogen-powered vehicles is also increasing.
Interest is likely to grow further after automakers announced that billions of dollars have already been spent on fuel cell cars, and Germany now aims to build a nationwide hydrogen fueling network by 2015.

Eight companies plan to build the network: Daimler, EnBW, Linde, OMV, Shell, Total, Vattenfall and NOW GmbH (the National Organisation Hydrogen and Fuel Cell Technology).

During the first step, planned for 2009-2011, the companies are going to lobby for public transport and start installing fuel stations, informs Gas2.0. Step number two will include the rollout of hydrogen-powered vehicles and the completion of the fueling network.

Besides Germany, countries planning to adopt hydrogen fuel cell technology include Canada (which is currently working on a hydrogen highway connecting Vancouver and Whistler in time for the 2010 Winter Olympic Games) and Denmark (which plans to build a hydrogen network linking Denmark, Sweden, Norway and Germany). link....

Vehicle That Runs on Algae-based Fuel

The latest invention from American environmentalists is Algaeus, the world's first plug-in hybrid car powered by algae fuel. The vehicle closely resembles a Toyota Prius, with a nickel metal hydride battery as well as a plug. The car runs on green crude, created by Sapphire Energy.
No alterations to the gasoline engine are needed. Rebecca Harrell, the producer of a documentary film entitled FUEL and the co-founder of the Veggie Van Organization, says the fuel is so effective that the car can run from coast to coast on just 25 gallons. She has likened the push for algae-based fuel to a new Apollo program.
Algaeus will be used to promote both green energy and the movie, which examines the United States' ongoing dependence on foreign oil. Rebecca Harrell will be on a ten-day tour together with the film's director, Josh Tickell. The two will take the vehicle on tour along with several other green vehicles, including a biodiesel-powered bus. Because fuel produced from green algae will not be available at local gas stations, Sapphire Energy plans to increase production of green crude by 2 million gallons a year over a two-year period. The company hopes that in the near future the cost of algae-based fuel will be competitive with fossil fuels. link....

World's Biggest Solar Plant to Be Built in China

A company headquartered in Arizona recently announced plans to build the largest solar plant on the planet. The 2,000 megawatt plant is expected to be completed in Ordos, China, by 2019. According to engineers, the plant will generate enough electricity to power 3 million homes.
It is worth mentioning that thanks to this project, China, which is currently the second biggest energy-consuming country, could become one of the most important nations in terms of solar energy production. The plant represents a multi-billion dollar investment and will occupy an area of 25 square miles.
The first phase of the construction is expected to begin in 2010. If everything goes well, the world's biggest polluter could grow to become the biggest green energy consumer on the planet. link....

Thursday, September 10, 2009

Amazing Green Shanghai Pavilion Built From Used CD Cases

The Shanghai World Expo 2010 has already engaged a large number of designers from various countries to work on daring projects. One of the most impressive constructions is the Shanghai Corporate Pavilion, designed by the architecture firm Atelier Feichang Jianzhu.
The most remarkable thing about this building, besides the fact that it uses a great number of recycled materials, is that it is built from thousands of plastic tubes produced from used CD cases.
The construction's exterior is made of hundreds of transparent recycled polycarbonate tubes arranged in a grid-like matrix. At the end of its life cycle the construction can be easily recycled. Designers plan to incorporate multi-colored LED lights into the outer walls of the building. All of these lights will be controlled by a computer, which will regularly change the appearance of the construction, reports ArchDaily.
The heat gathering tubes installed on the building's roof will be able to generate energy to power the pavilion and heat water. The entire solar thermal energy system will occupy an area of 1,600 square meters.
The electricity required for expositions as well as everyday needs will be produced using ultra-low temperature power generation. A mist-generating system will add to the building's air of mystery; the mist will also serve to lower temperatures and purify the air. Water and mist will be produced from filtered rainwater. This is a truly amazing construction and might be a step towards a greener future. link....

Tuesday, September 1, 2009

Turning food waste into energy

Food waste is one of the least recycled materials in municipal solid waste systems, according to the Environmental Protection Agency. But at least one organization in the San Francisco Bay Area is trying to change that.
The East Bay Municipal Utility District is experimenting with innovative techniques to convert raw food waste into usable energy, taking some of the massive amounts of food waste generated by local restaurants and using it to power its operations in Oakland, Calif.
In 2007, EBMUD was awarded a $50,000 grant from the EPA as part of the Resource Recovery Program to explore new ways of digesting food waste to produce methane gas.
Today, the site is home to a million-dollar facility that is generating usable methane and producing nearly 100 percent of the power needed to operate the regional wastewater treatment operation.
The main wastewater treatment plant sits at the base of the San Francisco-Oakland Bay Bridge. It was the first sewage treatment facility in the nation to convert post-consumer food scraps to energy via anaerobic digestion. EBMUD is adapting some of the equipment that's been used for decades in water treatment. That, coupled with new ideas and patented processes, enables EBMUD to produce nearly 100 percent of the energy needed to power the facility.
The process all starts at restaurants like this one, Pizzaiolo in Oakland. Currently, a few hundred restaurants contribute to the program, amounting to 100 tons of compostable food waste that gets delivered to the EBMUD facility every week to be anaerobically digested to produce methane.
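For scale, here is a back-of-the-envelope estimate of what 100 tons of food waste a week might yield; all of the yield and efficiency figures are generic assumed values from the anaerobic digestion literature, not EBMUD's actual numbers.

```python
# Back-of-the-envelope estimate of the energy in 100 tons of food waste a week.
# All yield figures below are rough, assumed values, not EBMUD's data.
tons_per_week = 100
biogas_m3_per_ton = 120       # assumed biogas yield per wet ton
methane_fraction = 0.6        # assumed methane share of the biogas
kwh_per_m3_methane = 10.0     # approximate energy content of methane
electrical_efficiency = 0.35  # assumed engine conversion efficiency

methane_m3 = tons_per_week * biogas_m3_per_ton * methane_fraction
electricity_kwh = methane_m3 * kwh_per_m3_methane * electrical_efficiency

print(f"Methane: ~{methane_m3:,.0f} m^3/week")
print(f"Electricity: ~{electricity_kwh:,.0f} kWh/week "
      f"(~{electricity_kwh / (7 * 24):,.0f} kW average)")
```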
Restaurants in nearby Alameda, Contra Costa, and San Francisco counties generate 1,700 tons of food waste each day, giving the program a lot of potential for expansion, according to Dave Williams, the director of wastewater at EBMUD.
A bill signed in 2006 by California Governor Arnold Schwarzenegger requires the state to limit greenhouse gas emissions to 1990 levels by the year 2020. Processes like the one EBMUD is testing could help the state achieve that goal, said Williams.
Roger Taylor, an employee at Pizzaiolo, says the restaurant generates around 100 gallons of food waste a day, more compostable waste than any restaurant he has worked at before. Pizzaiolo sends all of it to the EBMUD processing facility to be turned into usable energy.
A truck arrives with a load of commercial restaurant food waste and dumps it into a holding tank below. The scraps from restaurants typically contain all sorts of foreign material like plates, knives, forks, chopsticks, and other plastics. These items must be filtered out before the food makes its way into the digestor to produce methane.
Technicians at the processing facility watch as a load of food waste is added to the tanks below.
After delivering a truckload of waste into the holding tanks, water is added and the mixture is stirred to give the solution the proper consistency for processing.
The processes EBMUD employs to convert the food scraps into methane are done using existing waste treatment infrastructure. As it has developed techniques, several of the processes have been patented. Before the pulp can be processed, it goes through the paddle finisher, which screens and filters out unwanted fibrous materials and bits of paper and plastics. This rejected material, the pomace, is sent over to the landfill to be composted, Williams said.
With the pomace removed, the end product is a smooth liquid that flows down a trough and is then pumped into anaerobic digestors, where it will break down in an oxygen-free environment, producing methane gas.
The methane gas flows out of this huge tube and is sent toward the three co-generation engines that power the wastewater treatment operations. In this photo, shot from the top of digestor No. 12, you can see a view of downtown Oakland.
Using organic solids to produce methane is not uncommon across the country, but an average facility would probably only make about 35 percent of the energy necessary to power the wastewater facilities. By implementing the food scrap program, EBMUD is now generating nearly 100 percent of the energy it needs, and hopes to someday produce more, which could then be sold back to PG&E. link....

Monday, August 31, 2009

South Korea to Feature Green Super City

Recently the government of South Korea announced its intention to build a self-sufficient super-city. The project was designed by Foster + Partners, working in cooperation with PHA and Mobility in Chain. The city will serve as a territory where eco-friendly technologies will be developed. Once constructed, the Incheon eco-city will house 320,000 residents. It is expected to become a place where sustainable industries carry out high-tech research and development programs, creating photovoltaic panels and wind turbines. Incheon is thus expected to feature high-tech eco-friendly technologies, including biomass energy production, hydrogen fuel cells, as well as hydroponic roofs. The city will be finished in 10-15 years.
At the moment the region is mostly agricultural and houses about 35,000 people. Green roofs will replace terrace farming, which should reduce the loss of agricultural space. No building in the city will exceed 50 meters in height. According to Grant Brooker, a design director at Foster + Partners, the idea behind the whole project is to explore the sustainable possibilities of the island. link....

Sunday, August 23, 2009

The Top 10 Worst Polluted Places on Earth

More than 10 million people in eight different countries are at serious risk for cancer, respiratory diseases, and premature death because they live in the 10 most polluted places on Earth, according to a report by the Blacksmith Institute, a nonprofit organization that works to identify and solve specific environmental problems worldwide.
Top 10 Worst Polluted Places Remote but Toxic
Chernobyl in Ukraine, site of the world's worst nuclear accident to date, is the best known place on the list. The other places are unknown to most people and located far from major cities and population centers, yet 10 million people either suffer or risk serious health effects because of environmental problems ranging from lead contamination to radiation.
“Living in a town with serious pollution is like living under a death sentence,” the report says. “If the damage does not come from immediate poisoning, then cancers, lung infections, mental retardation, are likely outcomes.”
“There are some towns where life expectancy approaches medieval rates, where birth defects are the norm not the exception,” the report continues. “In other places children's asthma rates are measured above 90 percent, or mental retardation is endemic. In these places, life expectancy may be half that of the richest nations. The great suffering of these communities compounds the tragedy of so few years on earth."
Top 10 Worst Polluted Sites Serve as Examples of Widespread Problems
Russia leads the list of eight nations, with three of the 10 worst polluted sites. Other sites were chosen because they are examples of problems found in many places around the world. For example, Haina, Dominican Republic has severe lead contamination—a problem that is common in many poor countries. Linfen, China is just one of several Chinese cities choking on industrial air pollution. And Ranipet, India is a nasty example of serious groundwater pollution by heavy metals.
The Top 10 Worst Polluted Places
The Top 10 worst polluted places in the world are:
1. Chernobyl, Ukraine
2. Dzerzhinsk, Russia
3. Haina, Dominican Republic
4. Kabwe, Zambia
5. La Oroya, Peru
6. Linfen, China
7. Mailuu-Suu, Kyrgyzstan
8. Norilsk, Russia
9. Ranipet, India
10. Rudnaya Pristan/Dalnegorsk, Russia

Choosing the Top 10 Worst Polluted Places
The Top 10 worst polluted places were chosen by the Blacksmith Institute’s Technical Advisory Board from a list of 35 polluted places that had been narrowed from 300 polluted places identified by the Institute or nominated by people worldwide. The Technical Advisory Board includes experts from Johns Hopkins, Hunter College, Harvard University, IIT India, the University of Idaho, Mount Sinai Hospital, and leaders of major international environmental remediation companies.
Solving Global Pollution Problems
According to the report, “there are potential remedies for these sites. Problems like this have been solved over the years in the developed world, and we have the capacity and the technology to spread our experience to our afflicted neighbors.”
“The most important thing is to achieve some practical progress in dealing with these polluted places,” says Dave Hanrahan, chief of global operations for the Blacksmith Institute. “There is a lot of good work being done in understanding the problems and in identifying possible approaches. Our goal is to instill a sense of urgency about tackling these priority sites.” link....

Water Pollution

News and information about the causes of water pollution and how to combat it.
Tap Water in 42 States Contaminated by Chemicals
Public water supplies in 42 U.S. states--the tap water millions of Americans drink every day--are contaminated with 141 unregulated chemicals for which the U.S. Environmental Protection Agency has never established safety standards, according to an investigation by the Environmental Working Group.
Coca-Cola Charged with Groundwater Depletion and Pollution in India
Groundwater depletion has become a serious problem in India, and villagers blame Coca-Cola for aggravating the groundwater problem. Learn what India's government and citizens are doing, and how Coca-Cola is responding to the groundwater and pollution charges.
Water Pollution - Canada Takes Crap for Flushing Raw Sewage into the Ocean
Canada flushes some 200 billion litres of raw sewage directly into natural waterways every year, from the St. Lawrence River to the Strait of Juan de Fuca and the Pacific Ocean. One of the worst offenders is the city of Victoria, the picturesque capital of British Columbia, the province that is preparing to host the 2010 Winter Olympics. link....

Environmental Issues: Ozone Depletion

What is ozone depletion? And how does ozone depletion affect the earth? Learn about the causes and effects of ozone depletion, and how it changes the environment for humans, animals, and plants.
Ozone Hole Tour
Detailed information about the hole in the ozone over Antarctica, provided by the Centre for Atmospheric Science at Cambridge University.
Forecast Earth: Ozone Depletion Videos
These two short videos about the science and response to ozone depletion and the health effects of ultraviolet radiation were created through a partnership between the U.S. Environmental Protection Agency and The Weather Channel. The videos can be viewed over broadband or 56K modem connections. Text transcripts are also available.
Check the UV Level Where You Live
The U.S. Environmental Protection Agency provides an online UV index that is searchable by zip code. Check the UV level in your neighborhood to see how much ultraviolet radiation you and your neighbors are exposed to.
EPA: Ozone Depletion
Detailed information about ozone depletion from the U.S. Environmental Protection Agency.
Global Efforts to Control Ozone Depletion
Information about worldwide collaboration to mitigate the effects of ozone depletion, from the United Nations Environment Programme Ozone Secretariat.
Ozone: The Good and Bad of Ozone
From a human perspective, ozone is both helpful and harmful, both good and bad. In the upper atmosphere, ozone protects all life on Earth. At ground level, ozone is toxic and corrosive, a threat to human health, ecosystems, plants and marine life. link....

Part I: The History behind the Ozone Hole

The Beginning ...
Dramatic loss of ozone in the lower stratosphere over Antarctica was first noticed in the 1970s by a research group from the British Antarctic Survey (BAS) that was monitoring the atmosphere above Antarctica from a research station.
The Halley Research Station - Information
BAS research stations in the Antarctic
Folklore has it that when the first measurements were taken in 1985, the drop in ozone levels in the stratosphere was so dramatic that at first the scientists thought their instruments were faulty. Replacement instruments were built and flown out, and it wasn't until they confirmed the earlier measurements, several months later, that the ozone depletion observed was accepted as genuine.
Another story goes that the TOMS satellite data didn't show the dramatic loss of ozone because the software processing the raw ozone data from the satellite was programmed to treat very low values of ozone as bad readings! Later analysis of the raw data, carried out when the British Antarctic Survey team's results were published, confirmed their findings and showed that the loss was rapid and large-scale, extending over most of the Antarctic continent.

What Is Ozone And How Is It Formed?
Ozone (O3 : 3 oxygen atoms) occurs naturally in the atmosphere.
The earth's atmosphere is composed of several layers. We live in the "Troposphere" where most of the weather occurs; such as rain, snow and clouds. Above the troposphere is the "Stratosphere"; an important region in which effects such as the Ozone Hole and Global Warming originate. Supersonic jet airliners such as Concorde fly in the lower stratosphere whereas subsonic commercial airliners are usually in the troposphere. The narrow region between these two parts of the atmosphere is called the "Tropopause".
Ozone forms a layer in the stratosphere, thinnest in the tropics (around the equator) and denser towards the poles. The amount of ozone above a point on the earth's surface is measured in Dobson units (DU) - typically ~260 DU near the tropics and higher elsewhere, though there are large seasonal fluctuations. It is created when ultraviolet radiation (sunlight) strikes the stratosphere, dissociating (or "splitting") oxygen molecules (O2) into atomic oxygen (O). The atomic oxygen quickly combines with further oxygen molecules to form ozone:
O2 + hν → O + O   (1)
O + O2 → O3   (2)
(where hν is a photon of sunlight with wavelength < ~240 nm)
It's ironic that at ground level, ozone is a health hazard - it is a major constituent of photochemical smog. However, in the stratosphere we could not survive without it. Up in the stratosphere it absorbs some of the potentially harmful ultraviolet (UV) radiation from the sun (at wavelengths between 240 and 320 nm) which can cause skin cancer and damage vegetation, among other things. Although the UV radiation splits the ozone molecule, ozone can reform through the following reactions, resulting in no net loss of ozone:
O3 + hν → O2 + O   (3)
O + O2 → O3   (2, as above)
Ozone is also destroyed by the following reaction:
O + O3 → O2 + O2   (4)
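As an aside on the Dobson unit mentioned above: 1 DU corresponds to a layer of pure ozone 0.01 mm thick at standard temperature and pressure, or roughly 2.687e20 molecules per square metre. A minimal conversion sketch based on that definition:

```python
# Converting a column ozone measurement in Dobson units to molecules per
# square metre, using the definition of the Dobson unit given above.
MOLECULES_PER_M2_PER_DU = 2.687e20

def dobson_to_column(du: float) -> float:
    """Return the ozone column in molecules per square metre."""
    return du * MOLECULES_PER_M2_PER_DU

print(f"{dobson_to_column(260):.2e} molecules/m^2 (typical tropical value)")
```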
The Chapman Reactions
The reactions above, labelled (1)-(4), are known as the "Chapman reactions". Reaction (2) becomes slower with increasing altitude while reaction (3) becomes faster. The concentration of ozone is a balance between these competing reactions. In the upper atmosphere, atomic oxygen dominates where UV levels are high. Moving down through the stratosphere, the air gets denser, UV absorption increases and ozone levels peak at roughly 20 km. As we move closer to the ground, UV levels decrease and ozone levels decrease. The layer of ozone formed in the stratosphere by these reactions is sometimes called the 'Chapman layer'.
The Missing Reactions..
But there was a problem with the Chapman theory. In the 1960s it was realised that the loss of ozone given by reaction (4) was too slow. It could not remove enough ozone to give the values seen in the real atmosphere. There had to be other, faster reactions controlling the ozone concentrations in the stratosphere. We'll learn about these in Part III of this tour of the ozone hole.
What Is The Ozone Hole?
The Ozone Hole often gets confused in the popular press and by the general public with the problem of global warming. Whilst there is a connection, because ozone contributes to the greenhouse effect, the Ozone Hole is a separate issue. However, it is another stark reminder of the effect of man's activities on the environment. Over Antarctica (and recently over the Arctic), stratospheric ozone has been depleted over the last 15 years at certain times of the year. This is mainly due to the release of manmade chemicals containing chlorine such as CFCs (chlorofluorocarbons), but also compounds containing bromine, other related halogen compounds and also nitrogen oxides (NOx). CFCs are a common industrial product, used in refrigeration systems, air conditioners, aerosols, solvents and in the production of some types of packaging. Nitrogen oxides are a by-product of combustion processes, e.g. aircraft emissions.
A more detailed description of the chemistry will follow in Part III.
The current levels of depletion have served to highlight a surprising degree of instability in the atmosphere, and the amount of ozone loss is still increasing. Greenpeace has documented many of the concerns that this raises.
What Is Being Done?
The first global agreement to restrict CFCs came with the signing of the
Montreal Protocol in 1987 ultimately aiming to reduce them by half by the year 2000. Two revisions of this agreement have been made in the light of advances in scientific understanding, the latest being in 1992. Agreement has been reached on the control of industrial production of many halocarbons until the year 2030. The main CFCs will not be produced by any of the signatories after the end of 1995, except for a limited amount for essential uses, such as for medical sprays.
The countries of the European Community have adopted even stricter measures than are required under the Montreal Protocol agreements. Recognising their responsibility to the global environment they have agreed to halt production of the main CFCs from the beginning of 1995. Tighter deadlines for use of the other ozone-depleting compounds are also being adopted.
It was anticipated that these limitations would lead to a recovery of the ozone layer within 50 years of 2000; the World Meteorological Organisation estimated 2045 (WMO reports #25, #37), but recent investigations suggest the problem is perhaps on a much larger scale than anticipated. link....

Part II: Recent Ozone Loss over Antarctica

Why the Antarctic?
There are now many measurements and observations of the changes in ozone that occur over Antarctica. Such measurements come from ground-based instruments at the Antarctic research stations, from aircraft during scientific missions and from satellites.
Ozone loss was first detected in the stratosphere over the Antarctic (see Part I). Although mid-latitude and Arctic depletion has also been observed, the loss is most dramatic in the lower stratosphere over the Antarctic continent, where nearly all the ozone is destroyed over an area the size of Antarctica within a layer of the lower stratosphere that is many km thick.
Halley Bay, Antarctica
The graph to the right shows the measured total ozone above the Halley Bay station in Antarctica. Each point represents the average total ozone for the month of October. Note the sudden change in the curve after about 1975. By 1994, the total ozone in October was less than half its value during the 1970s, 20 years previous. This dramatic fall in ozone was caused by the use of man-made chemicals known as 'halocarbons', which include the well-known CFCs commonly used in fridges and so on. These CFCs had made their way into the upper atmosphere where the much stronger UV radiation from the Sun had broken them down into their component molecules, releasing the potentially damaging chlorine (and bromine) atoms, which, given the right conditions, could destroy ozone. We'll learn more about the chemistry behind the loss of ozone in Part III of this tour.
Regular ozone measurements have been made from the Halley Bay Research Station for many years. Ozone depletion is most marked in the Antarctic spring, around October.
TOMS Satellite Measurements
The TOMS (Total Ozone Mapping Spectrometer) is a satellite-borne instrument used to gain a global picture of ozone levels. The following movie shows how the ozone levels over the Antarctic have been changing over the last 15 years. Measurements are taken daily, and the frames in the movie are constructed from monthly averages of the data. The data is freely available from several sites, including the British Atmospheric Data Centre.
Inline movie of TOMS ozone measurements from Nov 1978 to Jan 1992(3.7 Mb)
MPEG movie of TOMS ozone measurements from Nov 1978 to Jan 1992(1 Mb)
The TOMS instrument measures ozone levels from the back-scattered sunlight, specifically in the ultra-violet range. It measures wavelength bands centred at 312.5, 317.5, 331.3, 339.9, 360.0 and 380.0 nanometres. The first four wavelengths are absorbed to greater or lesser extents by ozone; the final two are used to assess the reflectivity. The ozone levels computed are 'column ozone' (i.e. Dobson Units, or DU for short).
During the Antarctic winter (May - July), data is unavailable near the pole, which is in total darkness.
For more information, do visit the TOMS Home Page.
Monthly Averages for October
It is important to appreciate that the atmosphere behaves differently from year to year. Even though the same processes that lead to ozone depletion occur every year, the effect they have on the ozone is altered by the meteorology of the atmosphere above Antarctica. This is known as the 'variability' of the atmosphere. This variability leads to changes in the amount of ozone depleted and the dates when the depletion starts and finishes. To illustrate this, the monthly averages for October, from 1980 to 1991, are shown below.
link....

Part III. The Science of the Ozone Hole

Introduction
Evidence that human activities affect the ozone layer has been building up over the last 20 years, ever since scientists first suggested that the release of chlorofluorocarbons (CFCs) into the atmosphere could reduce the amount of ozone over our heads.
The breakdown products (chlorine compounds) of these gases were detected in the stratosphere. When the ozone hole was detected, it was soon linked to the increase in these chlorine compounds. The loss of ozone was not restricted to the Antarctic - at around the same time the first firm evidence was produced that there had been an ozone decrease over the heavily populated northern mid-latitudes (30-60N). However, unlike the sudden and near total loss of ozone over Antarctica at certain altitudes, the loss of ozone in mid-latitudes is much less and much slower - only a few percent per decade. It is nevertheless a very worrying trend and one which is the subject of intense scientific research at present. More on this in Part IV of the tour.
Many of these findings have since been reinforced by a variety of internationally supported scientific investigations involving satellites, aircraft, balloons and ground stations, and the implications are still being quantified and assessed. More about these international investigations in Part IV.
The Recipe For Ozone Loss
In trying to understand how the ozone loss occurs and the things that need to happen to destroy so much ozone, it helps to think of it as a 'recipe'. We need several ingredients to make the ozone loss occur. We'll now look at these 'ingredients' one at a time.
The Special Features of Polar Meteorology
We start by looking at the way the atmosphere behaves over the poles - the features of the meteorology in the stratosphere. The figure to the right shows schematically what happens over Antarctica during winter. During the winter polar night, sunlight does not reach the south pole. A strong circumpolar wind develops in the middle to lower stratosphere. These strong winds are known as the 'polar vortex'. This has the effect of isolating the air over the polar region.
Since there is no sunlight, the air within the polar vortex can get very cold - so cold that special clouds can form once the air temperature falls below about -80°C. These clouds are called Polar Stratospheric Clouds (or PSCs for short), but they are not the clouds that you are used to seeing in the sky, which are composed of water droplets. PSCs first form as nitric acid trihydrate. As the temperature gets colder, however, larger droplets of water-ice with nitric acid dissolved in them can form. However, their exact composition is still the subject of intense scientific scrutiny. These PSCs are crucial for ozone loss to occur. So, we have the first few ingredients for our 'ozone loss recipe'. We must have:
1. Polar winter leading to the formation of the polar vortex, which isolates the air within it.
2. Cold temperatures; cold enough for the formation of Polar Stratospheric Clouds. As the vortex air is isolated, the cold temperatures persist.

Chemical Processes Leading To Polar Ozone Depletion

It is now accepted that chlorine and bromine compounds in the atmosphere cause the ozone depletion observed in the 'ozone hole' over Antarctica and over the North Pole. However, the relative importance of chlorine and bromine for ozone destruction in different regions of the atmosphere has not yet been clearly explained. Nearly all of the chlorine, and half of the bromine, in the stratosphere, where most of the depletion has been observed, comes from human activities.
The figure above shows a schematic illustrating the life cycle of the CFCs; how they are transported up into the upper stratosphere/lower mesosphere, how sunlight breaks down the compounds and then how their breakdown products descend into the polar vortex.
The main long-lived inorganic carriers (reservoirs) of chlorine are hydrochloric acid (HCl) and chlorine nitrate (ClONO2). These form from the breakdown products of the CFCs. Dinitrogen pentoxide (N2O5) is a reservoir of oxides of nitrogen and also plays an important role in the chemistry. Nitric acid (HNO3) is significant in that it sustains high levels of active chlorine (as explained soon).
Production of Chlorine Radicals
One of the most important points to realise about the chemistry of the ozone hole is that the key chemical reactions are unusual. They cannot take place in the atmosphere unless certain conditions are present: our first two ingredients in the recipe for ozone loss.
The central feature of this unusual chemistry is that the chlorine reservoir species HCl and ClONO2 (and their bromine counterparts) are converted into more active forms of chlorine on the surface of the polar stratospheric clouds. The most important reactions in the destruction of ozone are:
HCl + ClONO2 → HNO3 + Cl2   (1)
ClONO2 + H2O → HNO3 + HOCl   (2)
HCl + HOCl → H2O + Cl2   (3)
N2O5 + HCl → HNO3 + ClONO   (4)
N2O5 + H2O → 2 HNO3   (5)
It's important to appreciate that these reactions can only take place on the surface of polar stratospheric clouds, and they are very fast. This is why the ozone hole was such a surprise: heterogeneous reactions (those that occur on surfaces) were neglected in atmospheric chemistry (at least in the stratosphere) before the ozone hole was discovered. Another ingredient, then, is these heterogeneous reactions, which allow reservoir species of chlorine and bromine to be rapidly converted to more active forms.
The nitric acid (HNO3) formed in these reactions remains in the PSC particles, so that the gas phase concentrations of oxides of nitrogen are reduced. This reduction, known as 'denoxification', is very important as it slows down the rate of removal of ClO that would otherwise occur by the reaction:
ClO + NO2 + M → ClONO2 + M   (6)   (where M is any air molecule)
... and so helps to maintain high levels of active chlorine. Here is some more information on Polar Stratospheric Clouds.
This movie shows a 3D model simulation of how chlorine nitrate (ClONO2) changes during a northern hemisphere winter in the lower stratosphere. Remember that ClONO2 is destroyed when the PSCs form, so for a large part of the movie you see nothing. But as sunlight returns to the polar night region over the Arctic we see the ClONO2 start to recover. This first happens around the edge of the polar vortex, and we see the now classic doughnut shape of the so-called 'chlorine nitrate collar'.
Evolution of ClONO2 over the North Pole during winter 1994 (3.4 Mb)
Evolution of ClONO2 over the North Pole during winter 1994 (small) (554 Kb)
Evolution of ClONO2 over the North Pole during winter 1994 (large)(1.2Mb)
The Return Of Sunlight
Lastly note that we have still only formed molecular chlorine (Cl2) from reactions (1)-(5). To destroy ozone requires atomic chlorine.
Molecular chlorine is easily photodissociated (split by sunlight):
Cl2 + hν → Cl + Cl
This is the key to the timing of the ozone hole. During the polar winter, the cold temperatures that form in the 'vortex' lead to the formation of polar stratospheric clouds. Heterogeneous reactions convert the reservoir forms of the ozone destroying species, chlorine and bromine, to their molecular forms. When the sunlight returns to the polar region in the southern hemisphere spring (northern hemisphere autumn) the Cl2 is rapidly split into chlorine atoms which lead to the sudden loss of ozone. This sequence of events has been confirmed by measurements before, during and after the ozone hole.
There is still one more ingredient for our recipe of ozone destruction. We have most of it but we have still not explained the chemical reactions that the atomic chlorine actually takes part in to destroy the ozone. We'll discuss this next.
Catalytic Destruction of Ozone
Measurements taken of the chemical species above the pole show the high levels of active forms of chlorine that we have explained above. However, there are still many more molecules of ozone than there are of active chlorine, so how is it possible to destroy nearly all of the ozone?
The answer to this question lies in what are known as 'catalytic cycles'. A catalytic cycle is one in which a molecule significantly changes or enables a reaction cycle without being altered by the cycle itself.
The production of active chlorine requires sunlight, and sunlight drives the following catalytic cycles thought to be the main cycles involving chlorine and bromine, responsible for destroying the ozone:
(I)
ClO + ClO + M → Cl2O2 + M
Cl2O2 + hν → Cl + ClO2
ClO2 + M → Cl + O2 + M
then: 2 × (Cl + O3 → ClO + O2)
net: 2 O3 → 3 O2
and (II)
ClO + BrO → Br + Cl + O2
Cl + O3 → ClO + O2
Br + O3 → BrO + O2
net: 2 O3 → 3 O2
The dimer (Cl2O2) of the chlorine monoxide radical involved in Cycle (I) is thermally unstable, and the cycle is most effective at low temperatures. Hence, again low temperatures in the polar vortex during winter are important. It is thought to be responsible for most (70%) of the ozone loss in Antarctica. In the warmer Arctic a large proportion of the loss may be driven by Cycle (II).
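To see where the 'net' lines come from, here is a minimal sketch that sums the reactions in Cycle (I) (with the chlorine/ozone step counted twice) and cancels the species appearing on both sides; the reaction list is simply transcribed from the cycle above.

```python
from collections import Counter

# Cycle (I) from the text; the last reaction runs twice.
# 'hv' stands for a photon; 'M' is any air molecule.
reactions = [
    ({"ClO": 2, "M": 1}, {"Cl2O2": 1, "M": 1}),
    ({"Cl2O2": 1, "hv": 1}, {"Cl": 1, "ClO2": 1}),
    ({"ClO2": 1, "M": 1}, {"Cl": 1, "O2": 1, "M": 1}),
    ({"Cl": 1, "O3": 1}, {"ClO": 1, "O2": 1}),
    ({"Cl": 1, "O3": 1}, {"ClO": 1, "O2": 1}),
]

consumed, produced = Counter(), Counter()
for reactants, products in reactions:
    consumed.update(reactants)
    produced.update(products)

net = Counter(produced)
net.subtract(consumed)   # positive = net product, negative = net reactant

print({s: n for s, n in net.items() if n != 0})
# -> {'O2': 3, 'O3': -2, 'hv': -1}, i.e. net: 2 O3 + photon -> 3 O2
```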
The Recipe For Ozone Loss
To summarise, then, we have looked at the 'ingredients' or conditions necessary for the destruction of ozone that we see in Antarctica. The same applies, more or less, to the loss of ozone in the Arctic stratosphere during winter, although in this case the loss is not nearly so severe.
To recap then, the requirements for ozone loss are:
The polar winter leads to the formation of the polar vortex which isolates the air within it.
Cold temperatures form inside the vortex; cold enough for the formation of Polar Stratospheric Clouds (PSCs). As the vortex air is isolated, the cold temperatures and the PSCs persist.
Once the PSCs form, heterogeneous reactions take place and convert the inactive chlorine and bromine reservoirs to more active forms of chlorine and bromine.
No ozone loss occurs until sunlight returns to the air inside the polar vortex and allows the production of active chlorine and initiates the catalytic ozone destruction cycles. Ozone loss is rapid. The ozone hole currently covers a geographic region a little bigger than Antarctica and extends nearly 10km in altitude in the lower stratosphere. link....

Part IV. The Ozone Hole - Current Research Work

Where Does All The Ozone Go?
A major European campaign, the European Arctic Stratospheric Ozone Experiment (EASOE), was organised to study the polar regions during the winter of 1991/92. Much new information was gained, but many questions still remained:
What caused the mid-latitude loss?
How were the losses over the poles linked to those at mid-latitudes?
While CFCs and the bromine-containing compounds known to destroy ozone over the poles are strongly implicated in the mid-latitude loss, many uncertainties remain.
In 1994 and 1995 European scientists conducted SESAME, the Second European Stratospheric Arctic and Mid-latitude Experiment. They investigated the processes occurring at both high and mid-latitudes and how they are linked. At the same time a US-led expedition considered similar processes in the southern hemisphere.
The latest European campaign is called THESEO (THird European Stratospheric Experiment on Ozone), which takes place from 1997-1999. Scientists from many European countries, including some from this site, are collaborating on a wide range of experiments to determine the processes responsible for depleting ozone in the lower stratosphere at mid-latitudes over the northern hemisphere.
The European Ozone Research Coordinating Unit has full details of the THESEO programme. Visit their website to find out more about the missions planned, press releases and the latest report of the UK Stratospheric Ozone Review Group.
Chemical Modelling
Most of the research work here at the Centre for Atmospheric Science involves various computer models of the atmosphere. These models 'blow' (or advect) chemical species around the globe using known or computed weather patterns - winds, temperatures and pressures. The rates of various chemical reactions are dependent on temperature, pressure, and, in the case of photolytic processes, the position of the sun. At each step of the model, the computer code attempts to predict what chemical changes will occur by solving the equations representing each reaction.
The schematic figure below gives some idea of the different parts of such computer models and the sequence of events as the model executes on the computer. Such models can be, and often are, very complex, with many man-years of work behind them.
Anatomy of Chemical Model
Different classes of model are used. These are:
Box Models consider just a single point in the atmosphere. Such models are comparatively cheap to develop and run on a PC or workstation. The advantage of such models is that very complex chemical reactions can be included since only the chemistry at a single point is simulated. This is very useful for comparing model simulations with measurements in idealised cases and also for developing less complex chemistry schemes which are used in multi-dimensional models.
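As a toy illustration of the box-model idea (not any of the Centre's actual models), the sketch below integrates two made-up reactions at a single point with simple Euler time steps; all rate constants and initial values are invented for demonstration.

```python
# Toy chemical "box model": one point in the atmosphere, two species,
# integrated forward in time with simple Euler steps.
# Rate constants and initial values are made up for illustration only.
j_o3 = 1.0e-4      # photolysis-like loss of O3 (1/s), assumed
k_form = 2.0e-3    # formation of O3 from O (1/s), assumed

o, o3 = 1.0, 50.0  # initial mixing ratios in arbitrary units, assumed
dt = 1.0           # time step in seconds

for step in range(3600):           # simulate one hour
    form = k_form * o              # O + O2 -> O3 (O2 assumed abundant)
    split = j_o3 * o3              # O3 + hv -> O2 + O
    o  += dt * (split - form)
    o3 += dt * (form - split)

print(f"After 1 h: O = {o:.3f}, O3 = {o3:.3f}")
```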
Trajectory Models are the next step up from box models. Essentially a trajectory model is a 'box model that moves'. A trajectory of a point (or points) of air is calculated from known wind fields. The chemistry is then calculated for all points along the path that the parcel of air took. This type of model is very useful for determining the chemical properties of air reaching observation stations. By running very many chemical trajectory models, it is also possible to begin to develop a three-dimensional picture of the chemistry in the atmosphere.
Three-dimensional Models use the traditional technique of simulating the atmospheric system on a grid of latitude/longitude points and vertical levels (surfaces of constant Potential Temperature or Pressure). Such models have a realistic representation of the movement or meteorology of air as well as other processes such as clouds, solar radiation and so on. In a way, you can think of a 3D model as a grid of box models where the air is being moved through the boxes. As many points are being represented, it becomes impossible to use the complex chemistry schemes found in box models, as this would place too great a demand on computing power. As it is, these 3D chemical models of the atmosphere require the most powerful High Performance Computers around. In the UK we use the Cray and Fujitsu supercomputers at the Rutherford Appleton Laboratory in Oxfordshire.
Models and Observations
Comparison of model results with observations both helps confirm our understanding of the processes responsible for ozone depletion, and can highlight those processes that require further study. A model of chemistry and transport has been used extensively in recent observational campaigns in the Arctic and Antarctic. The following graphics compare the output of the TOMCAT (grid-point) model with TOMS satellite data for the beginning of the Antarctic spring - the ASHOE campaign. TOMCAT was run at a resolution of approximately 5 deg x 5 deg; further studies have used far higher resolutions. The TOMS instrument relies on backscattered sunlight for its measurements, hence for the Antarctic winter the data tends to be sparse and incomplete. This data came from the Meteor 3 satellite. More information on TOMS is available here.
Comparison between Model Results and Actual Satellite Data
Inline movie of comparison between TOMS satellite data and TOMCAT model run(1.7 Mb)
MPEG movie of comparison between TOMS satellite data and TOMCAT model run(366 Kb)
The model column ozone is very similar to that observed by satellite. Over the Antarctic continent there are low amounts of ozone, where there has been chemical destruction. Around the edge of the vortex, between 30S and 60S, there are higher amounts of ozone. These high amounts result from the transport of ozone from the region of production in the tropics. link....

Tuesday, August 18, 2009

When it comes to using climate models to assess the causes of the increased amount of moisture in the atmosphere, it doesn't much matter if one model is better than another

Total amount of atmospheric water vapor over the oceans on July 4, 2009. The scale is 10° x 10° latitude/longitude. These results are from operational weather forecasts of the European Centre for Medium-Range Weather Forecasting (ECMWF).
They all come to the same conclusion: Humans are warming the planet, and this warming is increasing the amount of water vapor in the atmosphere.
In new research appearing in the Aug. 10 online issue of the Proceedings of the U.S. National Academy of Sciences, Lawrence Livermore National Laboratory scientists and a group of international researchers found that model quality does not affect the ability to identify human effects on atmospheric water vapor.
“Climate model quality didn't make much of a difference,” said Benjamin Santer, lead author from LLNL's Program for Climate Model Diagnosis and Intercomparison. “Even with the computer models that performed relatively poorly, we could still identify a human effect on climate. It was a bit surprising. The physics that drive changes in water vapor are very simple and are reasonably well portrayed in all climate models, bad or good.”
The atmosphere's water vapor content has increased by about 0.4 kilograms per square meter (kg/m²) per decade since 1988, and natural variability alone can't explain this moisture change, according to Santer. “The most plausible explanation is that it's due to human-caused increases in greenhouse gases,” he said.
More water vapor - which is itself a greenhouse gas - amplifies the warming effect of increased atmospheric levels of carbon dioxide.
Previous LLNL research had shown that human-induced warming of the planet has a pronounced effect on the atmosphere's total moisture content. In that study, the researchers had used 22 different computer models to identify a human “fingerprint” pattern in satellite measurements of water vapor changes. Each model contributed equally in the fingerprint analysis. “It was a true model democracy,” Santer said. “One model, one vote.”
But in the recent study, the scientists first took each model and tested it individually, calculating 70 different measures of model performance. These “metrics” provided insights into how well the models simulated today's average climate and its seasonal changes, as well as on the size and geographical patterns of climate variability. This information was used to divide the original 22 models into various sets of “top ten” and “bottom ten” models. “When we tried to come up with a David Letterman type 'top ten' list of models,” Santer said, “we found that it's extremely difficult to do this in practice, because each model has its own individual strengths and weaknesses.”
Then the group repeated their fingerprint analysis, but now using only “top ten” or “bottom ten” models rather than the full 22 models. They did this more than 100 times, grading and ranking the models in many different ways. In every case, a water vapor fingerprint arising from human influences could be clearly identified in the satellite data.
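A minimal sketch of the kind of ranking-and-splitting step described above follows; the model names, the number of metrics and the scores are all invented for illustration, and the real study explored many different ways of combining its roughly 70 metrics.

```python
# Illustrative sketch of ranking models by a composite skill score and
# splitting them into "top ten" and "bottom ten" subsets.
# Model names and metric values are invented for demonstration.
import random

random.seed(0)
models = [f"model_{i:02d}" for i in range(22)]

# Pretend skill scores for a handful of metrics (lower = better here).
metrics = {m: [random.uniform(0.0, 1.0) for _ in range(5)] for m in models}

# Composite score: simple mean over metrics (one of many possible choices).
composite = {m: sum(v) / len(v) for m, v in metrics.items()}

ranked = sorted(models, key=composite.get)
top_ten, bottom_ten = ranked[:10], ranked[-10:]

print("Top 10:", top_ten)
print("Bottom 10:", bottom_ten)
# The fingerprint analysis would then be repeated separately with each subset.
```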
“One criticism of our first study was that we were only able to find a human fingerprint because we included inferior models in our analysis,” said Karl Taylor, another LLNL co-author. “We've now shown that whether we use the best or the worst models, they don't have much impact on our ability to identify a human effect on water vapor.”
This new study links LLNL's “fingerprint” research with its long-standing work in assessing climate model quality. It tackles the general question of how to make best use of the information from a large collection of models, which often perform very differently in reproducing key aspects of present-day climate. This question is not only relevant for “fingerprint” studies of the causes of recent climate change. It is also important because different climate models show different levels of future warming. Scientists and policymakers are now asking whether we should use model quality information to weight these different model projections of future climate change.
“The issue of how we are going to deal with models of very different quality will probably become much more important in the next few years, when we look at the wide range of models that are going to be used in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” Santer said.
Other LLNL researchers include Karl Taylor, Peter Gleckler, Celine Bonfils, and Steve Klein. Other scientists contributing to the report include Tim Barnett and David Pierce from the Scripps Institution of Oceanography; Tom Wigley of the National Center for Atmospheric Research; Carl Mears and Frank Wentz of Remote Sensing Systems; Wolfgang Brüggemann of the Universität Hamburg; Nathan Gillett of the Canadian Centre for Climate Modelling and Analysis; Susan Solomon of the National Oceanic and Atmospheric Administration; Peter Stott of the Hadley Centre; and Mike Wehner of Lawrence Berkeley National Laboratory.
Founded in 1952, Lawrence Livermore National Laboratory is a national security laboratory, with a mission to ensure national security and apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by Lawrence Livermore National Security, LLC for the U.S. Department of Energy's National Nuclear Security Administration.
link....

Wednesday, August 12, 2009

Elephants Face Extinction By 2020

Researchers claim that elephants could face extinction as soon as 2020 as a result of the high death rate caused by poaching.
African elephants are widely killed for their ivory, and the killing is continuing at a rapid pace.
University of Washington biologists say that the public is unaware of how dangerous the situation has become for these mammals.
In 1989, an elephant death rate of 7.4 percent a year led to the international ban on the ivory trade. Recent studies show that the death rate of African elephants is now 8 percent a year, and today no such ban is in place. Moreover, the fatality rate 20 years ago applied to a population of more than one million, whereas the elephant population today is less than 470,000, which makes the situation all the more precarious.
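To get a feel for how the quoted rate compounds, here is a minimal sketch using only the figures in this article (an 8 percent annual loss applied to a population of 470,000). It shows the trajectory of simple compounding, not the researchers' actual extinction analysis, which also considers where poaching is concentrated.

```python
population = 470_000   # African elephant population cited above
annual_loss = 0.08     # current yearly death rate reported by the researchers

for year in range(2009, 2021):
    print(year, round(population))
    population *= 1 - annual_loss   # assumes the 8% loss continues unchanged each year
```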
“If the trend continues, there won’t be any elephants except in fenced areas with a lot of enforcement to protect them,” said Samuel Wasser, a UW biology professor and lead author of the research paper.
Scientists warn that at today's rate most of the large elephant populations face extinction by 2020 unless serious measures are taken.
Wasser developed special DNA techniques to determine the origins of elephant ivory. These tools could be helpful because poachers often evade law enforcement by killing elephants in one country but shipping the ivory from a neighboring nation. The recent research shows that poachers target specific areas for ivory, and authorities should use this information to prevent the large-scale killing of elephants.
Wasser added that the situation is very serious and that public pressure is needed for an international effort to stop the hunting. He explained that the extinction of elephants would lead to major habitat changes: within a habitat, each species depends on the others and contributes to the habitat's integrity, so the disappearance of elephants would trigger the loss of other species adapted to live in these ecosystems.
Researchers said it is important to concentrate enforcement in the areas where elephants are killed, before their ivory is fed into the global criminal trade network. link....

Latest Rare Animal Sightings to Be Observed on Google Earth

Soon those who enjoy watching wildlife will be able to observe the latest rare animal sightings simply by using their computers.
Some researchers use cameras with infrared triggers, also known as camera traps, to watch, count and spot large mammals in remote areas.
It was recently reported that Google Earth will feature the latest images taken by camera traps set up in Ecuador by researchers from Earthwatch.
By adding their photos to Google Earth, the researchers hope to increase awareness of endangered species, encourage donations and engage tourists in supporting preservation efforts.
"It's a form of fishing or hunting that doesn't kill anything," says one of Earthwatch's scientists, Mika Peck of the University of Sussex in Brighton, UK, who is also the project's lead researcher.
The rare mammals caught on camera in the cloud forest include the spectacled bear, also known as the "Paddington Bear", which lives in South America, as well as pumas and deer.
Researchers hope to expand the project by capturing rare sightings in other reserves, which could motivate the local government to use the approach to monitor all of its forests. They hope the system will be running by July. link....

Charging Electric Cars in a Solar Forest

With the number of electric vehicles increasing as more people become aware of environmental issues, talented designers are generating new ideas for supplying electric cars with clean energy.
One such designer is Neville Mars, who came up with the idea of a striking EV charging station that resembles an evergreen grove of solar trees. More inventions that harness solar energy can be found at www.InfoNIAC.com - check the links at the bottom of the story.
The photovoltaic forest has two functions: the solar trees harness solar energy to charge electric vehicles, and their canopies shade the vehicles from the hot sun while they charge, reports EcoFriend.
Every solar tree features a set of photovoltaic leaves mounted on poles, and the base of each tree carries a power outlet, so whenever a person parks a car under a solar tree they can immediately plug the vehicle into the socket to charge it. A forest-like parking lot that runs on renewable energy is a truly inspiring idea. link....

Ten Unnoticed Effects of Global Warming

LiveScience has published a list of the top 10 global warming side effects that not everyone knows about. Global warming is not only about Arctic ice melting and temperatures rising; it can also lead to some very strange things. Here they are:
10. More aggravated and aggressive allergies (the "MAAAA")
People are experiencing more aggravated and aggressive allergy attacks, and experts partly blame global warming. How are the two related? On one side, changes in people's lifestyle, compounded by pollution, weaken their protection against allergens, and the result is allergies. On the other side, global warming has made spring come earlier, so plants bloom earlier and produce more pollen. All of this results in a simple rule: a longer spring equals a longer allergy season.
9. Change of the habitat
Small forest rodents have not been seen at their usual elevations for some time now. This is probably because animals such as mice and squirrels are moving to higher elevations, and probably this too is caused by global warming. A similar threat hangs over polar bears, which live on sea ice that is gradually melting. What happens to this species if its habitat disappears is not even a guess - it is a fact: they will disappear with the ice they live on.
8. The blooming ices
While everyone is concerned with the effects of global warming on the places where people live, we forget about what lives and blooms in the Arctic. While the flora we are used to seeing around us is struggling, plants in the Arctic now enjoy more and longer sunshine each year, and since their usual habitat was locked under ice, they are eager to start a small forest on the ice. Research has found higher levels of chlorophyll, a product of photosynthesis, in modern soils than in ancient soils, which points to a small biological boom among the icy mountains.
7. Draining the lakes
Reports show that at least 125 lakes have disappeared in the Arctic in just the last few decades. Research indicates that the lakes seeped away through the soil. This happens because the permafrost, the layer beneath the lakes that kept the water in the basin, is gradually melting; once the layer thaws, the water seeps into the soil. When these lakes disappear, the creatures living in them disappear too. One researcher compared the event to pulling the plug in a bathtub.
6. The Earth thawing underneath
Global warming is not only causing massive melting of Arctic glaciers; it is also thawing the permanently frozen soil beneath the earth's surface. This process deforms the ground, and no one can predict what shape the upper layer of the earth will take. Among the after-effects are ruined buildings, railroads and highways.
5. "There will be only one"
Global warming tests everyone, and if things continue as they are now, natural selection will take place: only the fittest will survive. With springs beginning early and winters growing short, plants bloom earlier (as mentioned in point 10), so animals that wait until the plants' usual blooming time to start their migration might miss all the food. Only animals able to adapt and reset their biological clocks will make it to the places with the greenest grass and the best pastures, and ultimately these species will pass their genes on to following generations, shifting the genetic makeup of populations to match the changing global situation.
Some traces of this kind of change can already be seen this year: in some regions, an abnormally warm winter and an early spring have triggered a mass invasion of silkworms. These insects are eating everything green they see, partly destroying crops and leaving forests leafless.
4. Lower density of the atmosphere
It is no secret that emissions cause great harm to our planet, and carbon dioxide emissions do their dirty work even in the atmosphere's outermost layer. The air in the exosphere is very thin, yet it still creates enough drag to slow satellites, requiring engineers to boost them back onto their orbits around the earth from time to time. However, the growing concentration of carbon dioxide makes this weak drag even weaker. In the lower layers of the atmosphere, colliding carbon dioxide molecules release heat and warm the air; in the upper layers there are far fewer molecules, so collisions are rarer and the energy radiates away instead. The conclusion is that more carbon dioxide means more cooling, and the air in the exosphere settles: its density decreases, and so does the drag on satellites.
3. The growing mountains
Since the beginning of the industrial era, mountains have been growing faster. This may go unnoticed by hikers, but it is a fact nonetheless. The immense weight of the glaciers on mountain tops had been slowing the mountains' growth; as the glaciers thaw, with global warming as the main cause, the rock rebounds faster now that there is nothing pushing it down.
2. The memory doesn't remain
Not many 'living' monuments remain to prove the existence of ancient civilizations, and they are not as stable and strong as today's engineering wonders. Extreme weather and rising waters may therefore write the final lines in the history of these ancient monuments. The process has already started: floods have damaged the remarkable 600-year-old Sukhothai site, once the capital of a Thai kingdom.
1. Forests burning
Global warming means not only melting Arctic ice but also more wildfires. Over the past several decades, more fires have devastated the countryside in the western states of the USA, and forest fires are now more intense and last for longer periods of time. Experts say that because spring now arrives in the middle of winter, the snow melts earlier, leaving longer periods of drought in the forests and increasing the risk of ignition during the dry season.
Perhaps people should not be so indifferent to this obvious problem, as even the G8 leaders, at their latest meeting in Germany, discussed global warming and harmful emissions into the atmosphere at length.
link....

World's First Ship that Uses Solar Energy to Power Its Main Electrical Grid

This huge solar-powered ship is called M/V Auriga Leader. It features 328 solar panels mounted on its top deck. Auriga Leader is part of a demonstration project developed by the Port of Long Beach, Toyota and NYK Line, a shipping company with headquarters in Tokyo.
The ship is the world's first to use solar energy for its power needs. With the help of the solar panels the ship is able to burn less diesel fuel while in port. The solar array of Auriga Leader can produce about 40 kilowatts, which is enough to power 10 average homes.
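The "10 average homes" comparison is simply the array's output divided by a typical household demand of roughly 4 kilowatts; the per-home figure in this quick check is an assumption, not a number given in the article.

```python
array_output_kw = 40   # reported output of Auriga Leader's 328-panel array
per_home_kw = 4        # assumed average household demand

print(array_output_kw / per_home_kw, "homes")   # -> 10.0
```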
Although other ships have previously used solar energy to power small electronics such as auxiliary lights, this ship is the first in the world to use solar energy to power its main electrical grid (additional information about green technology can be found at infoniac.com - check out the links below this story).
According to Fumihiko Shimizu, manager of the U.S. car carrier group for NYK Logistics and Megacarrier, an NYK subsidiary, the company is currently assessing the effectiveness of the solar panels. Engineers still have to determine whether the panels can withstand the corrosive effects of salty sea air.
The 60,000-ton Auriga Leader is 665 feet long, yet it is a rather simple vessel in terms of what it requires when it arrives in port. It is worth mentioning that the shipping business is very harmful to the environment, generating huge amounts of greenhouse gases, and although Auriga Leader's 328 solar panels will not significantly cut the ship's emissions, they are still an important step toward a greener shipping business. link....