
The Weekly Carboholic: a bit of everything


Due to the large size of this week’s Carboholic, I’ve busted it up into sections for readability.

Science

When you’re talking about Antarctica and global heating, there are some serious observational problems. The few satellites in polar orbits can’t directly measure how thick an ice shelf is, or what the salinity and temperature of the water beneath it are. Floating sensor buoys can’t get beneath the ice shelves and are rather limited in their operational depth. And sending people out to drill deep holes and manually measure temperature and salinity is far too expensive and dangerous. Given that the behavior of the water under floating ice shelves may be key to understanding how Antarctica will respond to global heating, these are serious limitations. Which is why the key to understanding what’s happening immediately around and beneath the Antarctic ice shelves may well be elephant seals.

According to the Canberra Times article, scientists outfitted elephant seals with special sensor-laden headgear and then sat back and recorded the data that came back until the sensors ran out of battery power or fell off the seals’ heads. In the process, the scientists collected 30 times more data about the sea ice zone of the Southern Ocean than they had previously, and 9 times more data than conventional buoys would have provided. The seals also improved the accuracy of Antarctic ocean current maps around the southern Indian Ocean, the Ross Sea, and the Western Antarctic Peninsula.

———-

As of last week, the area of Arctic sea ice had shrunk to its second-lowest extent in recorded history, and with three more weeks of melting to come before the sea ice hits its seasonal minimum in September, there’s an excellent chance that this year will set a new record low. In addition, both the Northwest and Northeast passages were open to shipping as of last week, making the Arctic sea ice cap essentially a floating island for what is believed to be the first time in 125,000 years.

It’s possible that the Arctic has hit a “tipping point,” as the BBC article quotes Mark Serreze, a senior scientist at the Colorado-based NSIDC, saying – a point past which the sea ice is guaranteed to keep shrinking until it disappears altogether in the summer. This creates a major problem, because less ice means more absorption of solar energy, so the Arctic warms even faster. That extra warming could thaw the permafrost along the Arctic shore, which would then start decomposing and releasing massive amounts of methane into the air, and it may also lead to releases of methane from offshore hydrate deposits.
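
To put rough numbers on that ice-albedo feedback, here’s a back-of-envelope sketch in Python. The albedo values are typical textbook figures I’ve assumed for illustration – they don’t come from the BBC article.

```python
# Back-of-envelope ice-albedo feedback calculation.
# The albedo values below are typical textbook assumptions, not from the article.
albedo_sea_ice = 0.6     # sea ice reflects roughly 60% of incoming sunlight
albedo_open_ocean = 0.06  # open water reflects only about 6%

absorbed_by_ice = 1 - albedo_sea_ice       # ~0.40 of solar energy absorbed
absorbed_by_ocean = 1 - albedo_open_ocean  # ~0.94 absorbed

print(f"Open water absorbs ~{absorbed_by_ocean / absorbed_by_ice:.2f}x "
      "the solar energy of the ice it replaces")
# -> ~2.35x, which is why losing summer ice accelerates Arctic warming
```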

———-

Speaking of permafrost, an article in TerraDaily last week pointed out that “the climate change models upon which future projections are based, do not include the potential impact of the gases trapped [in] frozen Arctic soils.” This was part of a commentary by German researcher Christian Beer of the Max Planck Institute for Biogeochemistry, published alongside a study by a team of American researchers who sampled the amount of carbon stored in Alaskan permafrost. They found that prior studies had underestimated the amount of carbon by 60% – no prior study had probed deeper than 40 cm, while the new study sampled to a depth of at least a meter.

To put this into perspective, North America has an amount of organic carbon stored away in its Arctic permafrost roughly equal to 1/6 of the total carbon in the atmosphere. And the scientists behind the study believe the permafrost of Europe and Asia holds similar amounts.
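
As a rough sanity check on that comparison, here’s my own arithmetic – the CO2 level and the standard ppm-to-gigatonnes conversion are my assumptions, not figures from the article:

```python
# Rough arithmetic behind the "1/6 of atmospheric carbon" comparison.
# Conversion factor and CO2 concentration are standard values (my assumptions).
GTC_PER_PPM = 2.13   # gigatonnes of carbon per ppm of atmospheric CO2
co2_ppm = 385        # approximate atmospheric CO2 concentration circa 2008

atmospheric_carbon = GTC_PER_PPM * co2_ppm    # ~820 GtC in the atmosphere
na_permafrost_carbon = atmospheric_carbon / 6  # the article's 1/6 fraction

print(f"Atmosphere: ~{atmospheric_carbon:.0f} GtC")
print(f"North American permafrost (1/6 of that): ~{na_permafrost_carbon:.0f} GtC")
# -> roughly 820 GtC and ~137 GtC, respectively
```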

———-

According to an article in the Lexington Herald Leader by a McClatchy Newspapers reporter, some scientists believe that the Permian-Triassic mass extinction was caused by runaway global heating, which was in turn caused by massive carbon dioxide (CO2) emissions from the Siberian Traps lava flows. Lee Kump, a geoscientist at Pennsylvania State University, is quoted as saying “The end-Permian catastrophe is an extreme version of the consequences of global warming.” For comparison, the P-T mass extinction wiped out approximately 95% of all marine organisms and up to 85% of land organisms.

———-

A new study published in Nature and reported in Science News indicates that the carbon cycle in the ocean may be more complex than previously thought, and that as a result climate models may be overestimating how much carbon the ocean can store.

Previous models assumed that the availability of nutrients like nitrogen and iron determines how fast ocean life can absorb CO2 from the water and air. The new study indicates that the ratio of nutrients to carbon matters too – when there are too many nutrients relative to the CO2, bacteria may thrive instead of phytoplankton (the organisms that actually pull CO2 out of the water and sequester it in the ocean). This could result in a net release of CO2 from the oceans instead of a net absorption, increasing the amount of CO2 in the air and thus boosting global heating. The study was based on the Arctic Ocean, however, and the scientists involved don’t yet know which parts of the world’s oceans are carbon-limited versus nutrient-limited. Regardless, this new information will need to be included in the next generation of climate models.

———-

In yet another study, this one published in the journal Science, scientists from Israel, the University of Maryland-Baltimore County, and NASA report that the interaction between aerosols and clouds depends greatly on how clear or cloudy the area already is. Where the sky is clear, aerosols like soot from fires and pollution provide a cooling effect by stimulating more cloud formation. But when too much aerosol is added to an already cloudy area, the additional aerosol actually heats the air and burns off the existing clouds.

The impressive thing about this study is that, when the analytical model developed by the researchers was tested against actual data from the Amazon, the results matched very accurately – so accurately, in fact, that the researchers were able to rule out cloud cover as a source of temperature changes in the measured data near areas of Amazon forest burning. I haven’t read the paper myself, so there’s a chance there’s some circular modeling going on here (using Amazon data to develop the model and then testing it on the same Amazon data would naturally produce high accuracy), but if it holds up to scientific scrutiny, then it’s excellent news. Cloud models built from first principles have been one of the biggest holes in climate modeling, so an improvement in this area would dramatically improve the accuracy of climate modeling in general.

———-

Back in 1998, Michael Mann of Penn State University released his infamous “hockey stick” graph. Based largely on tree rings, it showed that modern temperatures were the highest in the last thousand years or so. The graph, however, was roundly criticized for its methodology, even though the U.S. National Research Council generally accepted it as accurate. The BBC reports that a new study by Mann has just been published, and it shows that the “hockey stick” was right.

Mann and his team analyzed 1,200 new proxy records from terrestrial and marine sources throughout the Northern Hemisphere and discovered that the hockey stick held up even when tree rings were entirely excluded. Not only that, but ten more years of data have pushed the far end of the stick back to 700 AD with significant accuracy, and back to roughly 0 AD with less accuracy. And by using multiple statistical methods to analyze the data, the researchers have largely immunized this conclusion against the criticisms that the 1998 hockey stick faced.

Overall, the data now show that the latter half of the 20th century and the first decade of the 21st were unusually hot in the Northern Hemisphere compared to the last 1,300 years – and that includes the so-called Medieval Warm Period. Unfortunately, Mann’s team can only draw such conclusions for the Northern Hemisphere, as there is at present insufficient data from the Southern Hemisphere to draw detailed conclusions for the entire planet.


Technology/Research and Development

Cement production is believed to be responsible for about 5% of the world’s total CO2 emissions, largely from the coal- and natural gas-fired kilns used to cook limestone into cement for buildings, roads, bridges, and the like. But if medical cement inventor and Stanford professor Brent Constantz has anything to say about it, his new construction cements will be net CO2 absorbers, not emitters.

The trick is that Constantz’s new cement doesn’t need to be cooked in a kiln, although he was understandably reticent to discuss the details with the San Francisco Chronicle – he hasn’t been awarded his patent on the process yet. Not only that, but he says he can take coal plant emissions, bubble them through seawater, and sequester an additional ton of CO2 for every ton of cement he makes. And the same process can supposedly be used to manufacture aggregate, the sand and gravel that underlies roads and makes up concrete and asphalt. This suggests that he’s somehow converting the power plant’s emissions directly into a mineral.

There are, however, two potential problems. The first is the major one – construction companies are understandably conservative when it comes to new building materials, and they may not be willing to use Constantz’s new cement right away. The second is price, although if he’s right that he can sell his cement for $100 per ton, versus $110 per ton for standard Portland cement, that problem will fade quickly.

———-

One of the big problems with carbon sequestration is ensuring that the pressurized gases or liquid CO2 don’t leak back out into the environment. If chemists could develop a way to convert CO2 into a mineral like chalk or sand, sequestering CO2 would be downright easy – just dump it or use it as fill somewhere. It now appears that a Pennsylvania State University chemical engineer may have figured out how to do this very thing, at least on a small scale.

Dirk Van Essendelft of Penn State described how to use the mineral serpentine to absorb CO2, converting it into sand and magnesium carbonate, a mineral similar to chalk. He calculated that capturing CO2 this way would consume much less energy (about a 10% loss of the power plant’s output) than present methods (about a 30% loss). His method uses water, an acid, ammonia, and carbon dioxide to create sand and magnesium carbonate, and the magnesium carbonate could be used instead of limestone to make cement (gee, this sounds familiar…). Unfortunately, while serpentine is pretty common along the coasts, it’s too expensive to transport inland, so this is not a magic bullet for carbon capture and sequestration (CCS).
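
For a sense of scale, here’s a sketch of the standard serpentine carbonation stoichiometry. I can’t confirm this is exactly the chemistry Van Essendelft uses, so treat it as illustrative:

```python
# Textbook mineral carbonation of serpentine (illustrative, not necessarily
# Van Essendelft's exact process):
#   Mg3Si2O5(OH)4 + 3 CO2 -> 3 MgCO3 + 2 SiO2 + 2 H2O
MG, SI, O, H, C = 24.305, 28.086, 15.999, 1.008, 12.011  # atomic masses, g/mol

serpentine = 3 * MG + 2 * SI + 9 * O + 4 * H  # ~277 g/mol of serpentine
co2 = 3 * (C + 2 * O)                         # 3 mol of CO2, ~132 g

print(f"~{serpentine / co2:.1f} tons of serpentine per ton of CO2 captured")
# -> ~2.1 tons of rock per ton of CO2, which is why hauling serpentine
#    inland gets expensive fast
```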

———-

Fluorocarbons (FCs) are amazing things. They’re very stable chemical compounds, and they have huge greenhouse potential – most FCs are 100x to 10,000x more potent greenhouse gases than CO2. The reason everyone focuses on CO2, however, is that there’s so much of it in the air that it presently far overwhelms the impact of FCs. But the more FCs that are emitted into the atmosphere, the more heating we’ll get from those industrial chemicals. And because FCs are so chemically stable, they last in the atmosphere a very, very long time – decades to centuries.

According to New Scientist, however, chemists have discovered a way to break open the chemical bonds of these long-lived industrial chemicals that is low-energy and leaves environmentally benign waste products. Given that the only other alternatives are long-term storage (50,000 years) or expensive, energy-intensive destruction methods, if the chemists are right, FCs can gradually be taken off the list of greenhouse gases contributing to global heating.

———-

All three of the discussions above are examples of what the American Chemical Society (ACS) calls green chemistry, and green chemistry is the subject of this month’s Chemical & Engineering News cover story, Calling All Chemists.

The basic ideas behind green chemistry are laid out in the ACS’ “12 principles of green chemistry.” The principles basically come down to: prevent waste, use every atom, design and use safer chemicals and solvents, make reactions low-energy and efficient, and use renewable feedstocks wherever possible. As an example, chemists are working on adapting solvent-based synthetic chemistry to water-based chemistry, and on using hydrogen peroxide as a benign oxidant in place of more toxic chemicals.

According to the story, John C. Warner, president and chief technology officer of the Warner Babcock Institute for Green Chemistry, estimates that about 10% of all chemical processes and products are presently environmentally benign and that another 25% could be made benign “relatively easily.” That leaves 65% to be invented or reinvented. Yet as important as a green chemical industry is to the future, most universities don’t require their chemistry and chemical engineering graduates to demonstrate any understanding of the environmental impacts of the processes and products they use every day. Green chemists are looking to nanotechnology to lead the way – as a new field, it can be made green from the beginning.

It’s a very long article, and this little blurb doesn’t do it justice. But it’s good to know that there are a lot of people interested in making chemistry green.

Politics

A United Nations report issued last week urges every nation to phase out energy subsidies, calling them unfair to the poor, economically wasteful, and environmentally questionable. According to the AP article, the UN estimates that governments spend $300 billion every year on energy subsidies that “encourage consumption and discourage efficiency.” Politically expedient subsidies also perpetuate “inefficiencies” in global markets, and some nations spend more on energy subsidies than on their education and health budgets combined. Obviously, this isn’t a good thing.

Perhaps most interesting, the AP quotes Kaveh Zahedi, UNEP’s climate change coordinator, as saying that “cutting off the subsidies would be good for the environment as it would reduce carbon emissions by as much as 6 percent.” Those reductions would come both from correcting the market inefficiencies the subsidies create and from redirecting some of that $300 billion per year into green energy.

———-

Back in April, the Carboholic brought news that the UK was starting a conversation about abandoning low-lying villages and towns to rising seas. Last week brought news that the head of the UK’s Environment Agency, Lord Chris Smith of Finsbury, was already drawing up plans to evacuate and resettle people away from the areas hardest hit by coastal erosion and sea level rise.

According to the Daily Mail story, the Environment Agency is drawing up evacuation plans and will present them to the affected communities in Norfolk, Suffolk, and any other areas of East Anglia and beyond that are deemed unsavable due to cost or lack of technology. And Lord Smith warned that UK taxpayers would probably have to foot the bill for resettlement, since private insurers would be unlikely to cover losses from sea level rise.

I understand that no one wants to abandon their home and community, but Lord Smith is unfortunately correct – some communities will simply be untenable in the face of rising sea levels. The United States Congress needs to catch up to the UK on this issue. Some flood-plain towns along the Mississippi River should probably be relocated, since the U.S. taxpayer shouldn’t have to keep paying to rebuild doomed communities. The same is unfortunately true of low-lying parts of major metropolitan areas such as New York City, Baltimore, Washington D.C., San Diego, Miami, and even New Orleans.

Energy

Back in August, the New York Times reported that two companies will build large solar power plants in central California. The electricity will be sold to Pacific Gas and Electric (PG&E), and when complete, the two plants will supply 800 MW of power to the California grid. The larger of the two, at 550 MW, will be almost 9x the size of the largest solar plant anywhere else in the world, and together the plants will cover approximately 12.5 square miles with photovoltaic panels.
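
Out of curiosity, here’s the power density implied by those numbers. The arithmetic and unit conversion are mine, not the Times’:

```python
# Implied power density of the two California plants, computed from the
# article's own figures (the unit conversion is my addition).
total_power_mw = 800
area_sq_miles = 12.5
KM2_PER_MI2 = 2.58999  # square kilometers per square mile

area_km2 = area_sq_miles * KM2_PER_MI2   # ~32.4 km^2
density = total_power_mw / area_km2      # ~24.7 MW/km^2

print(f"~{density:.0f} MW of nameplate capacity per square kilometer")
# i.e., roughly 25 W per square meter of land -- and that's peak capacity,
# before accounting for nighttime and clouds
```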

While PG&E doesn’t expect the new solar electricity to be cost-competitive with coal or natural gas, it does expect cost parity with other renewable sources – and since California law requires 20% of the state’s electricity to come from renewable sources by 2010, PG&E doesn’t really have a whole lot of choice in the matter.

———-

You know something’s wrong when you have to turn off your renewable energy sources (solar panels, wind turbines) to keep from overloading your transmission lines, but that’s exactly what a NYTimes article reported last week. Generating power is becoming easier and cheaper, but because our transmission capacity is so limited, moving that cheap renewable energy to where it can be used is as hard as ever.

The problem, according to the NYTimes article, is that our present electricity grid was built with local and regional needs in mind. To borrow the article’s metaphor:

[i]t resembles a network of streets, avenues and country roads.

“We need an interstate transmission superhighway system,” said Suedeen G. Kelly, a member of the Federal Energy Regulatory Commission (FERC).

It’s so bad, in fact, that the best wind generation areas aren’t being built out with turbines because there are no transmission lines to get the electricity from those turbines to market. The states, meanwhile, see little advantage in helping neighboring states improve their grids without getting some benefit themselves. And when the Energy Policy Act of 2005 went into effect and the Department of Energy declared two large sections of the country “national interest electric transmission corridors,” the affected states understandably hit the roof, metaphorically speaking.

The problem is that the U.S. has a vested national interest in ensuring that the national grid is reliable and redundant, but right now, with over 500 owners controlling the 200,000 miles of transmission line in the United States, there’s no good way to force the issue. You don’t want FERC coming in and throwing people off their land using federal eminent domain, but “local control” cannot be permitted to impede the construction of high-voltage DC transmission lines from, say, solar-thermal plants in Arizona to Denver and San Francisco either.

———-

According to the Houston Chronicle, people living near wind turbines have started reporting various medical problems, ranging from sleep disorders, headaches, ringing in the ears, and vertigo to problems with concentration and memory.

According to the Chronicle article, Dr. Nina Pierpont of Malone, N.Y. has written a book about what she calls “wind turbine syndrome,” in which she details the medical problems she believes are caused by living too close to wind turbines. She recommends that turbines be erected no closer than two miles from the nearest home, lest the low-frequency vibrations from the spinning blades cause health problems. As you might imagine, wind turbine manufacturers and the utilities erecting the turbines don’t buy Dr. Pierpont’s conclusions.

Applying a little basic science and logic, it makes sense to me that some people could be susceptible to those low-frequency vibrations and suffer various maladies as a result. But until we get actual peer-reviewed medical studies instead of books for mass publication, it’ll be difficult to draw any serious conclusions.

———-

I’ve discussed the problems that would result from the release of methane from methane hydrates, but the methane also represents a potential opportunity. There’s too much methane to burn it all off before it becomes a major climate threat (and burning it produces CO2 anyway, although that’s still a net greenhouse gas reduction, since methane is the far more potent greenhouse gas). But there’s also so much of it that we could potentially tap it and burn it for electricity as a bridge technology between extremely dirty coal and clean renewables such as solar and wind – cutting CO2 emissions roughly in half in the process. This article discusses the potential in a little more detail.
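
Here’s a quick check of that “half” figure, using round emission factors and plant efficiencies that are my assumptions rather than numbers from the article:

```python
# Why gas-for-coal substitution roughly halves CO2 per kWh of electricity.
# Emission factors and plant efficiencies are typical round numbers
# (my assumptions), not figures from the article.
coal_g_co2_per_mj = 94   # grams of CO2 per MJ of heat from burning coal
gas_g_co2_per_mj = 55    # grams of CO2 per MJ of heat from burning methane
coal_efficiency = 0.35   # typical coal steam plant
gas_efficiency = 0.50    # typical combined-cycle gas turbine

MJ_PER_KWH = 3.6
coal_g_per_kwh = coal_g_co2_per_mj / coal_efficiency * MJ_PER_KWH  # ~967
gas_g_per_kwh = gas_g_co2_per_mj / gas_efficiency * MJ_PER_KWH     # ~396

print(f"Coal: ~{coal_g_per_kwh:.0f} g CO2/kWh, gas: ~{gas_g_per_kwh:.0f} g CO2/kWh")
print(f"Gas emits ~{gas_g_per_kwh / coal_g_per_kwh:.0%} as much CO2 per kWh")
# -> roughly 40%, i.e., a bit better than "half"
```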

———-

If I could afford it, I’d do two things to my home in an instant – add photovoltaic solar panels to my roof and batteries to my basement, and have a small ground source heat pump installed in my back yard. These heat pumps pull heat out of your home in the summer and store it in the ground beneath your house; in the winter, they pull that stored heat back out of the ground and pump it into the house. And while a heat pump doesn’t eliminate the need for heating or air conditioning/evaporative cooling, according to the NYTimes article it can reduce energy costs by 25% to 65%.

Unfortunately, I can’t afford this right now, and even if I could, I’m not sure the cost of electricity and natural gas in Colorado is high enough to justify the expense just yet. And considering that the most cost-effective systems are commercial rather than residential, it might not make sense for a while. But if the Department of Energy statistics in the NYTimes article are accurate (roughly a doubling of installed heat pump systems over three years), I’d expect costs to come down and residential effectiveness to go up.
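
For anyone running the same numbers on their own home, here’s a minimal simple-payback sketch. The dollar figures are placeholders I made up; only the 25% to 65% savings range comes from the article:

```python
# Simple-payback sketch for a residential ground source heat pump.
# The installed cost and annual energy bill are hypothetical placeholders;
# only the 25-65% savings range comes from the NYTimes article.
installed_cost = 20_000      # hypothetical net installed cost, dollars
annual_energy_bill = 2_000   # hypothetical annual heating + cooling spend

for savings_fraction in (0.25, 0.65):
    annual_savings = annual_energy_bill * savings_fraction
    payback_years = installed_cost / annual_savings
    print(f"At {savings_fraction:.0%} savings: ${annual_savings:.0f}/yr, "
          f"~{payback_years:.0f}-year payback")
# -> 40 years at 25% savings, ~15 years at 65% -- which is why the economics
#    are marginal unless energy prices rise or installation costs fall
```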

Image credits:
NSIDC
Jeff Schmaltz aboard ISS, NASA Visible Earth
RealClimate
