Researchers find that PECONFs might offer a better CO2 sorbent at lower costs | Global CCS Institute

The search for a material that can soak up high volumes of CO2, at a low cost, remains a tantalizing goal at labs around the world. Ideally, such a material will mop up lots of CO2 under certain conditions, but happily let it go under others. Likewise, it should be selective for CO2, so that it won't soak up other gases. And it should be fairly stable across a range of chemical conditions and temperatures. Last but not least, it shouldn't be too costly.

Earlier this month, chemists at Lehigh University, in eastern Pennsylvania, released findings on a new porous material that promises to do all that, and maybe more. The researchers, Kai Landskron, Paritosh Mohanty and Lillian D. Kull, reported their findings on 19 July in the journal Nature Communications.

Results of the study show that the new material captures CO2 almost as well as more expensive materials in use today, but works at lower pressures, and is stable at temperatures as high as 752 degrees Fahrenheit (400 degrees Celsius). The combination suggests it would be cheaper to deploy, given that current materials achieve optimal performance only in conditions that are costlier to maintain.

Quoted by Australia’s ABC News, lead researcher Dr Landskron said he believes the material can be mass produced on a large scale. “We can make this material… simpler than most other materials can be made,” he said. “We can make them from relatively inexpensive building blocks in simple solution reactions, by so-called polycondensation reactions.”

The material still has a long way to get from lab to power plant though. CSIRO’s Dr Lincoln Paterson, who focuses on carbon capture, told ABC News’ Meredith Griffiths that Dr Landskron’s work is a fundamental advance but “…still needs to be taken a long way towards practical application.” He added: “The tests they’ve got so far are that it performs extremely well in the laboratory and it uses low cost materials.”

For those with the appetite, here’s what I could find on the chemical recipe for this new material. According to USA Today, the team reacted two chemicals, hexachlorocyclotriphosphazene and diaminobenzidine, to create a gel. Once dried, the material formed porous electron-rich covalent organonitridic frameworks (or PECONFs) that proved to have a strong appetite for CO2.

Here’s how the researchers describe it in their abstract:

Here we report the synthesis and CO2, CH4, and N2 adsorption properties of hierarchically porous electron-rich covalent organonitridic frameworks (PECONFs). These were prepared by simple condensation reactions between inexpensive, commercially available nitridic and electron-rich aromatic building units. The PECONF materials exhibit high and reversible CO2 and CH4 uptake and exceptional selectivities of these gases over N2. The materials do not oxidize in air up to temperatures of 400 °C.

For a deeper dive, click through to the full article here: “Porous covalent electron-rich organonitridic frameworks as highly selective sorbents for methane and carbon dioxide.” (Subscription required.)

This is just the latest in a string of research findings looking for lower cost sorbent materials. Back in April, I checked out promising lab work showing that a form of treated sawdust might offer an affordable option as a CO2 sorbent. See ‘Capturing carbon with sawdust‘ 6 April, 2011.

Fretting over a ‘valley of death’ for basic CCS research | Global CCS Institute

As Christopher Short pointed out on these pages earlier this week, American Electric Power (AEP) recently suspended operations at its Mountaineer project in West Virginia, a move that underscores how policy uncertainty is having a corrosive effect on viable CCS projects. Short reminds us that, based on Global CCS Institute projects data, Mountaineer is just one of a half dozen US projects that have been shelved partly because of a lack of federal carbon policy.

There’s a second troubling dimension to this policy problem that occurred to me while catching up on what should otherwise pass for good news in the realm of CCS research and development.

In investment circles, the phenomenon is known as the ‘valley of death’. It happens when promising early-stage technologies fail not for lack of groundbreaking performance improvements, but for lack of finance or other business-related barriers to scaling.

In the case of CCS, the absence of clear policy means that promising research has fewer paths to scale up for commercial deployment.

Here’s what brought the thought to mind. On 12 June, just two days before AEP’s announcement, the US Department of Energy (DOE) expanded by three the group of projects designed to confirm the safety of long-term sequestration of CO2. (Find details of the projects further down.)

It’s welcome news, of course, but given the AEP news, generally dim prospects for US carbon policy, and resulting indecision among both private and public-sector players, there’s a worrisome question over how the results of the DOE’s valuable CCS research can evolve.

Take a step back. Much has been written about the failings of the US R&D machine. The country is inarguably blessed with many of the planet’s finest research universities, and is famously skilled at incubating discoveries. But we’re notoriously poor at commercializing those advances. Exceptions exist, to be sure, such as IT and software, but the spectre of ‘invented here, built there’ haunts much of US economic and job growth policy discussions.

Now there’s reason to argue that just such a pattern is setting up in CCS. And there’s certainly risk that a ‘valley of death’ may open up, distancing CCS R&D projects from crucial commercialization opportunities.

The DOE is seeding numerous R&D projects, but there’s a decreasing population of commercial players who can take on the risk of commercializing them. Likewise, talented researchers drawn to carbon related technical fields face dimmer prospects with the erosion of mid-stage projects.

Now, back to the good news. Cribbing from Carbon Capture Journal, here are details of the projects being newly funded. Funding for the trio will total $34.5 million over four years:

* Blackhorse Energy, based in Houston, Texas, plans to inject approximately 53,000 tons of CO2 into a geologic formation located in Livingston Parish, Louisiana. The project will assess the suitability of strandplain geologic formations for future large-scale geologic storage of CO2 in association with enhanced oil recovery. Additionally, they will test the efficacy of increased storage using short-radius horizontal well technology to inject supercritical CO2 and CO2 foam into the reservoir.

* The University of Kansas Center for Research, in Lawrence, Kansas, will inject at least 70,000 metric tons of CO2 into multiple formations. The project will demonstrate the application of state-of-the-art monitoring, verification, and accounting tools and techniques to monitor and visualize the injected CO2 plume and establish best practice methodologies for MVA and closure in ‘shelf clastic’ and ‘shelf carbonate’ geologic formations.

* Virginia Polytechnic Institute & State University, in Blacksburg, Virginia, will test the properties of coal seams, and evaluate the potential for enhanced coalbed methane recovery by injecting approximately 20,000 tons of CO2 into un-mineable coalbeds. (Click here for further details at Carbon Capture Journal.)

As a signal of continuing commitment to CCS, this is encouraging. Given political realities in the US, where legislative policy is blocked by partisan politics, the White House is smart to use federal agencies—the DOE and Environmental Protection Agency, mainly—to spur the climate policy agenda.

But in the absence of full-blown federal policy, I can only wonder: how far can this approach really go, for how long?

Review: Revenge of the Electric Car | OnEarth

Chris Paine’s 2006 documentary Who Killed the Electric Car? arrived with perfect timing, capturing the country’s collective frustration with sky-high energy prices as well as our growing disenchantment with the automotive alternatives on offer. Let’s hope his sequel, Revenge of the Electric Car, previewed last week in New York and set for wide release this October, proves equally prescient. The film, which captures what may turn out to be the first stages of the auto industry’s evolution away from oil, cruises smoothly over the finish line where its predecessor ultimately stalled short.

For Revenge, Paine scored fly-on-the-wall access to three of the most charismatic leaders in the auto industry. And he did so at a key moment — just as each was in the midst of executing a high-risk, multi-billion-dollar bet on battery-powered cars. Add in the fact that Paine’s crew was filming during the 2008 economic crisis and implosion of GM, and the result is more than just a snapshot of the gamesmanship behind the creation of mass-market vehicles. Revenge offers a look inside the minds of business leaders struggling through one of the most troubled periods of recent economic history.

As the documentary opens, U.S. automakers face an environment that’s radically different from the cheap-oil days that ruled when GM developed its first electric vehicle, EV1. Now oil prices are running at historic highs, and governments around the world have begun to put some real muscle behind the idea of the electric car.

Here’s Bob Lutz, GM’s American-born vice chairman and a veteran of the Big Three (Chrysler, Ford, and GM), becoming the unlikely champion of the Chevy Volt, and opening a door to GM’s salvation after the company’s downfall. Known in Detroit as “Mr. Horsepower,” Lutz personifies the about-face that the industry as a whole went through in the time that passed between the making of the two films. Once a deep skeptic of EVs, he now artfully tilts GM’s monolithic culture toward his goal of developing the Volt.

Facing off against GM is the enigmatic Carlos Ghosn, the Brazilian-Lebanese CEO of Nissan/Renault, which is building the all-electric Leaf. Ghosn’s orderly execution of the Leaf offers a welcome perspective on EVs from beyond American borders. After all, battery-powered cars are likely to flourish on the roads of Paris, Shanghai, and Tokyo before they do here, for the same reasons that small cars did.

Playing counterpoint to the corporate titans is PayPal founder Elon Musk, a charismatic South African-Canadian struggling to steer the scrappy Tesla from startup mode to full-scale manufacturing. With confidence bordering on hubris, the then 38-year-old is at once inspiring and pain-inducing, as he underestimates the complexity of manufacturing and struggles to produce a stream of fault-free $100,000-plus electric sportsters. (This while also navigating his way through a painful divorce and playing doting dad to his five sons.) There’s real drama in watching Musk’s brave face flicker as he inspects an armada of faulty cars and in watching him awkwardly deliver the news to early depositors that the price of their vehicles will have to rise yet again.

One of the film’s delightful subplots involves the struggles of Greg “Gadget” Abbott, a goateed indie tinkerer who made a brief appearance in Who Killed and who excels at retrofitting classic cars with batteries and electric motors. With an infectious, mischievous air, Gadget offers a reminder of the gear-head roots of EVs’ most devoted fans.

Unlike with his first film, where Paine came to the topic too late to build a “how-it-happened” tale and leaned instead on activists and half-baked acolytes, Revenge captures rich natural tension as it unfolds. Who Killed, for example, featured a parade of Hollywood A-listers (Tom Hanks) and B-listers (Phyllis Diller), many of them sore about having lost their exotic cars and whining about GM’s decision to kill the EV1. Revenge gives us mercifully few Hollywood prima donnas — though Danny DeVito does get downright giddy test-driving the Volt.

It won’t be giving anything away to tell you that the end of Revenge is a happy one. Of course, it’s far from the end of the story. Should Paine opt to complete what seems like a natural triptych, the final installment will no doubt prove more global in scope. Beijing has set national EV goals that dwarf those of Washington, for example, and the Chinese have much deeper capital resources. They also have a strong knack for building things like smart grids, which will be necessary for the wide-scale adoption of EVs. And the race to build a better battery is heating up elsewhere overseas, with labs in dozens of countries working to build batteries capable of matching the range of your average gas tank.

With the gee-whiz stage of EV creation now complete, GM, Nissan, and Tesla also face the tougher slog of turning these enormous bets into reliable, mass-market machines that can actually make some money. Sales of EVs and hybrids are so far running far below the ambitious targets set by national governments, including our own.

Lurking farther out is the persistent threat of volatile oil prices. Many, myself among them, would argue that the real killer of the electric car was cheap oil. In the late 1990s, prices hit a post-’60s low, in inflation-adjusted terms, at the very moment that GM’s EV1 was being rolled out. That wouldn’t make it easy for any $1.25-million prototype to get off the ground, I don’t care how many starlets tell you it’s a great idea. Sub-$2-a-gallon gasoline may seem unimaginable to us today, but a double-dip recession — a real possibility given the anemic economic growth and sovereign debt woes on both sides of the Atlantic — could send energy demand crashing, rendering the EV once again an intolerably uneconomic prospect.

Revenge closes with a scene featuring the Los Angeles Times reporter Dan Neil. The sole automotive writer ever to win a Pulitzer, Neil is cynical about the industry’s abysmal record on eco-cars. At the same time, reflecting on a lifelong affair with gas-guzzlers, he admits that in recent years even he has begun to “let go” of the idea of the traditional car, and to acknowledge that it may finally be rolling toward the sunset.

Original URL: http://www.onearth.org/article/revenge-of-the-electric-car

GM Upgrades OnStar to Power First Real-World, Smart Grid EV Pilot | GreenBiz

Hard to believe that OnStar — GM’s in-car mobile data service — celebrates its Sweet 16 this year.

Back in 1995, when the service was launched for GM’s luxury line, pundits griped it was just a superfluous add-on. This was back in the cell-phone Stone Age, when phones were still a luxury: analog and kinda huge. Few predicted then that telematics would mushroom in importance over the next decade. These days six million subscribers pay for OnStar’s emergency assistance, remote diagnostics, mapping, entertainment and more.

To that long list, add one more trick OnStar is helping GM to pull off: offering a short-cut to connect electric vehicles (EVs) to the smart grid. GM yesterday announced the launch of a pilot program that can let utilities and customers skip the need to install physical smart grid points to manage recharging of their EVs. The new OnStar service will act as a remote brain, wirelessly tracking and governing the EV’s charging behavior, coordinating the timing and billing, and potentially dramatically lowering the costs to extend smart-grid management features to EVs.

By skipping the need to install physical smart-grid apparatus, the OnStar system can save utilities some $18 million per 1,000 customers, said Vijay Iyer, GM’s director of communications for OnStar, citing GE estimates. To mesh OnStar’s data services with utilities’ internal information management systems, GM worked with GE, whose IQ Demand Optimization Services unit is used by utilities to monitor demand response systems.

This is an important step for utilities, which are busily, and expensively, building intelligent power and data devices in customers’ garages, as well as at charging terminals, to referee how and when EVs will recharge. Utilities don’t want fleets of EVs drawing power on 95-degree summer afternoons when power is in short supply. Customers, likewise, will prefer the option of charging at night, when power is much cheaper.
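The scheduling logic at the heart of such a service can be sketched simply: wait for the cheap overnight window when possible, but never so long that the car isn't ready when the driver needs it. The function and thresholds below are invented for illustration; this is not GM's or GE's actual algorithm.

```python
def pick_charge_start(hours_needed, departure_hour, offpeak_start=22):
    """Pick an hour (24h clock) to begin charging, preferring off-peak.

    Purely illustrative -- not OnStar's actual logic. Assumes whole hours
    and a single off-peak window opening at `offpeak_start`.
    """
    # Hours available between the off-peak window opening and departure,
    # wrapping past midnight.
    available = (departure_hour - offpeak_start) % 24
    if hours_needed <= available:
        return offpeak_start  # cheap overnight charging suffices
    # Otherwise, start just early enough to finish by departure time.
    return (departure_hour - hours_needed) % 24

# A car needing 6 hours of charge before a 7 a.m. departure can wait for off-peak:
print(pick_charge_start(6, 7))   # 22 (10 p.m.)
# A 12-hour charge can't wait that long:
print(pick_charge_start(12, 7))  # 19 (7 p.m.)
```

The point of putting this brain in a remote service rather than in a wall box is that the same decision can be made wherever the car happens to plug in.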

The Detroit automaker is calling the trial the first “real-world pilot of smart grid solutions.” This quarter, staff of regional utilities will become the guinea pigs for this program, driving Chevrolet Volts for everyday use. Ford announced plans to scale up a broad smart-grid integration at the last Detroit auto show. And Toyota has laid out ambitions to collaborate with Microsoft.

GM is betting that this approach will let it leapfrog the smart-grid technology demos being piloted across the U.S. Given that OnStar can pick up recharging activity anywhere — whether at home or on a distant road trip — the approach promises to offer deeper insight into how, where and when EVs are charged. Since it doesn’t matter whether the EV is connected to a smart-grid charge point, OnStar should let utilities more accurately model how to manage peak versus non-peak charging too.

GM’s EV approach may get real traction where others have struggled. It appears to offer utilities a faster, cheaper way to hook up a major new source of electricity consumption to the grid. If utilities don’t see the benefit, it will be DOA. We saw evidence of the importance of this recently when software heavyweights Google and Microsoft suspended efforts to develop software applications for home energy management, in part because of the difficulty of getting access to the all-important data stream from utilities.

There are some intriguing long-term implications to GM’s announcement. With smart-grid enabling technology embedded in the car, GM opens the door to a faster rollout of sophisticated vehicle recharging schemes than would be possible if utilities must first build hardware networks of recharging stations.

There’s big global potential too: GM as a company remains the No. 2 auto producer in the world, even after its recent near death experience. Two of the top 10 selling vehicles in China, the world’s largest auto market, are GMs. And China has the largest goals for EV deployment of any country.

The inevitable next question is whether GM might make this service available to other automakers, so that they could roll out smart-grid EV charging on a faster track too. Until 2006, GM licensed OnStar to a variety of other carmakers, but has since stopped. This Sunday, however, GM will release a portable form of OnStar that can be installed in any car: OnStar for My Vehicle (OnStar FMV).

Who knows? In time, maybe even your Toyota could hook up to the smart grid via GM’s OnStar.


Business Loves Lighting Efficiency, So Why Try to Dim Efforts to Make a Better Bulb? | OnEarth

Efficiency is generally considered a good thing. Good politics. Good business. That’s why efforts ranging from national mileage standards for cars to rules requiring your refrigerator to use less energy have proven popular and effective, quietly spurring the gradual replacement of outdated technology with better-performing alternatives.

And that’s why, back in 2007, barely anyone raised an eyebrow when Congress applied efficiency standards to an energy guzzler that hadn’t changed much in more than a century: the light bulb.

A requirement that would make bulbs at least a third more efficient starting next year passed Congress as part of the Energy Independence and Security Act of 2007 in a 3-to-1 bipartisan vote. Half of House Republicans supported the bill; Rep. Fred Upton, a Michigan Republican who now chairs the House Energy and Commerce Committee, called the legislation a “common sense, bipartisan approach … to save energy as well as help foster the creation of new domestic manufacturing jobs”; and President George W. Bush duly signed it into law.

The lighting industry welcomed the bill, which gave it time to work on meeting the new standards and set out a schedule for gradually phasing them in, starting New Year’s Day 2012. First to be replaced will be the bulbs we’ve long known as “100-watt” incandescents. Over the next two years, today’s 75-, 60-, and 40-watt bulbs will likewise have to cut their energy use by about a third.

And guess what? The new rules have worked just as intended, accelerating the development of a variety of new lighting offerings, all of which save consumers money in the long run. In addition to improved CFLs, the new options include low-energy halogens that look like today’s incandescents, as well as LED bulbs that last for years. They’re already on sale at your local hardware store or Home Depot. You’re probably using some of them in your house, perhaps without even realizing there’s a difference.

“Efficiency is a desirable thing, and this type of standard has been a part of our body politic for a long time,” said Randall Moorhead, vice president of government affairs at Philips, as quoted at ThinkProgress.org. “The reality is, consumers will see no difference at all. The only difference they’ll see is lower energy bills because we’re creating more efficient incandescent bulbs.” The National Electrical Manufacturers Association and General Electric have also voiced support for the new rules.

So who’s got a problem with lighting efficiency standards now? Not business, certainly. And not the consumers reaping the benefits (which are estimated to reach $6 billion a year). It’s some of the same House Republicans (including Upton, whose statement crowing about the 2007 law as a “common sense, bipartisan approach” that will create jobs has disappeared from his website) who think they can score cheap political points with fans of Rush Limbaugh — who decries efficiency standards as “nanny state-ism” — and Glenn Beck, who apparently thinks anything that saves consumers money is “all socialist.”

On Tuesday night, those House Republicans failed to pass a law known as the “Better Use of Light Bulbs Act,” or BULB Act, which would have repealed the 2007 efficiency standards. Business and consumers can hope this signals the end of a misguided effort to roll back progress. That’s never been a very bright option.

UPDATE 7/14/2011: Not the end! On Thursday, House Republicans launched yet another misguided attack on light bulb efficiency. Sigh.

Original URL: http://www.onearth.org/blog/business-loves-lighting-efficiency-so-why-try-to-dim-efforts-to-make-a-better-bulb

Can big oil jump-start CCS? Expanding enhanced oil recovery could absorb decades’ worth of U.S. coal-plant CO2 emissions | Global CCS Institute

Just how big is the potential to sequester power-plant CO2 emissions into the U.S. oil patch?

In a word, “vast,” says a report released last month by MIT and The University of Texas at Austin that evaluated the capacity of the oil sector to pump CO2 into ageing wells to boost oil recovery, a process known as enhanced oil recovery, or EOR.

Aligning oil-producing areas with potential supplies of power-plant CO2, the researchers identified a variety of geographies that could accept an estimated 15 years or more of current, total CO2 output from U.S. coal plants, or approximately 3,500 gigawatt-years-equivalent of CO2.

That’s a potentially huge wedge to remove from the country’s climate challenge, given that coal plants account for about 30% of total US CO2 emissions.

(Jump to the bottom for links to the report and related resources.)

What’s more, domestic U.S. oil output would surge. The report estimates that, using the full CO2 output of coal-fired power plants to drive more petroleum from oil reservoirs, an additional 3 million barrels per day could be produced by 2030. That would be a 50 per cent increase over current domestic output.

The promise of scaling up CCS to expand EOR is nothing short of tantalizing. Near term, there is no larger potential source of commercial demand for CO2. The U.S. needs more domestic oil and the resulting economics could substantially subsidize the scaling up of CCS technology.

To be sure, widespread adoption of combined EOR-CCS faces major hurdles. The report names three: a lack of CO2 transport and injection infrastructure; regulations that remain underdeveloped at best; and scant, inconsistent incentives to match up the supply and demand of CO2. Each of these shortcomings, the authors conclude, could be overcome with better government coordination.

To get to the levels imagined by the report – that EOR could absorb a full year’s worth of coal plant CO2 output for 15 years – the industry has a long way to go. At its current scale, it could handle only 3 per cent of that amount.

Here’s how the industry looks today, by that measure:

  • Demand for CO2, from current EOR operations – EOR uses about 115 million metric tons (MT) of CO2 per year currently.  Of this, 65 million MT are “new”,  rather than recycled CO2 being re-injected. This “new” CO2 comes mostly from natural geological CO2 reservoirs, and is pumped to oil wells via a network of pipelines.
  • Supply of CO2, from coal-fired power plants – Coal-fired power plants in the U.S. produce about 2,000 million MT of CO2. As a share of the total, EOR’s current demand (65 million MT of CO2) amounts to 3 per cent. Put another way, EOR’s appetite for CO2 could be met today with the emissions from approximately 10 gigawatts electric (GWe) of high-efficiency (supercritical) baseload coal power plant capacity, according to the report.
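The 3 per cent figure falls straight out of the two bullets above, and the 10 GWe equivalence can be roughly cross-checked too. A quick back-of-envelope sketch (the supercritical emissions rate and capacity factor are my assumptions, not figures from the report):

```python
# Back-of-envelope check of the EOR figures quoted above (illustrative only).
new_co2_demand_mt = 65    # million metric tons/year of "new" CO2 used by EOR
coal_plant_co2_mt = 2000  # million metric tons/year from US coal plants

share = new_co2_demand_mt / coal_plant_co2_mt
print(f"EOR demand as a share of coal-plant CO2: {share:.0%}")  # 3%

# Rough cross-check of the "10 GWe of supercritical coal" equivalence,
# assuming ~0.8 t CO2/MWh and a ~90% capacity factor (both are my
# assumptions, not numbers from the report).
t_per_mwh = 0.8
hours_per_year = 8760 * 0.9
emissions_mt = 10_000 * t_per_mwh * hours_per_year / 1e6  # 10 GWe = 10,000 MW
print(f"~{emissions_mt:.0f} million t CO2/yr")  # lands in the mid-60s, near 65
```

The two numbers line up closely enough to make the report's equivalence plausible.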

For a deeper dive into the MIT–University of Texas study, along with the research papers underlying the report, and other related material, follow the links below:

  • The report, summarizing the findings of a conference held in June last year, was published in May 2011 and can be downloaded from the Univ. of Texas here. You can view the individual academic presentations given at the July 2010 meeting at the homepage of MIT Energy Initiative, here.

Check out the original post at:
http://www.globalccsinstitute.com/community/blogs/authors/adamaston/2011/07/13/can-big-oil-jump-start-ccs-expanding-enhanced-oil-recov

Firing up the world’s first coal-fired CCS plant: Five questions for Southern Co | Global CCS Institute

After two years of construction, Southern Co. flipped the switch last month on the world’s largest-scale, coal-fired CO2 capture facility, at a site on the banks of the Mobile River in Barry, Alabama.

Teaming up with Mitsubishi Heavy Industries, Southern Co. brought on line a 25-megawatt, coal-fired carbon capture and sequestration (CCS) facility on a patch of river-side land that is home to the James M. Barry Electric Generating Plant, one of the largest in Southern Co.’s portfolio.

With more than 42 gigawatts of total generating capacity, and 4.4 million customers, Atlanta-based Southern Co. is one of the largest electric utilities in the United States.

For more details on the Barry CCS project, I had a quick exchange with Southern Co.’s Nick Irvin, a principal research engineer, just before the July 4th long weekend.

What carbon capture technology is the facility using? 

Southern Company teamed up with Mitsubishi Heavy Industries; together, we are responsible for the CO2 capture plant’s design and operation. This facility utilizes the KM CDR Process, a capture technology which was developed jointly by Mitsubishi Heavy Industries and The Kansai Electric Power Co.

The first step, of course, is coal combustion, which generates electricity and gives off a flue gas. A share of the flue gas from the main coal power plant is piped to the CCS facility. There, as part of the KM CDR process, the flue gas reacts with KS-1, an amine solvent, which captures the CO2. This creates a flow of CO2 that can then be separated from the KS-1, compressed and sent off site for sequestration.

What made the Barry plant the site of choice?

Barry was chosen on the basis of the facility’s size and status as a flagship site among Southern Co.’s fleet. The main facility here – the James M. Barry Electric Generating Plant – is home to seven generating units, powered by coal and natural gas, with a total nameplate generating capacity of 2,657 megawatts. The CCS plant takes a slipstream of the existing plant’s flue gas, equivalent to about 25 megawatts out of the total 700-megawatt gas flow.

What volume of CO2 are you capturing?

The facility is designed to capture about 500 metric tons per day, pulling about 90 per cent of the CO2 out of the inbound flue gas slipstream. Annually, it will operate with the capacity to capture 150,000 tons to 200,000 tons of CO2.
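The daily and annual figures quoted above are consistent with each other; a quick arithmetic check (the capacity-factor inference is mine, not Southern Co.'s):

```python
# Sanity check on the Barry plant's stated capture figures (from the post).
tons_per_day = 500
annual_at_full_uptime = tons_per_day * 365
print(annual_at_full_uptime)  # 182,500 -- inside the quoted 150,000-200,000 range

# The 150,000-ton low end of the range implies roughly this uptime,
# i.e. an ~82% capacity factor (my inference, not a figure from Southern Co.).
print(150_000 / annual_at_full_uptime)
```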

How is the CO2 being handled?

Pipeline construction is underway to pump the CO2 to a site about 12 miles away. Beginning this autumn, the Southeast Regional Carbon Sequestration Partnership will transport the captured CO2 through a pipeline to the Citronelle Oil Field, which is operated by Denbury Resources.

There the CO2 will be injected 9,500 feet into a deep saline geologic formation. The CO2 is being injected into an oil-drilling region but it is not being used for enhanced oil recovery. The CO2 is being sequestered in a formation about 3,000 feet above the deeper oil deposits, where it will remain permanently stored.

The U.S. Department of Energy (DOE), along with its program participants – Denbury Resources, the Electric Power Research Institute and the Southern States Energy Board – is managing the design and operation of the pipeline and injection system.

What’s next for Southern Co.’s CCS strategy? 

The first goal for Southern Co. is developing options to reduce emissions and meet potential regulatory requirements. We want to look at technologies of the future as an option to do this. In addition to the Barry CCS project, the company is also:

  • Managing the DOE National Carbon Capture Center in Alabama, where we’re testing the next generation of technologies to capture carbon dioxide emissions.
  • Building a commercial-scale, 582-MW generating plant in Kemper County, Mississippi, using local lignite and the company’s Transport Integrated Gasification (TRIG) technology, with 65 per cent carbon capture and re-use.
  • Drilling wells to assess geologic suitability for carbon storage at other Southern Co. power plants.
  • Partnering with universities to train the next generation of CCS engineers and to advance the industry’s geologic testing capabilities.

Other resources: For more information on Mitsubishi’s KM CDR process, which is described as less energy-intensive than other CO2 capture technologies, see this introduction. Mitsubishi is rolling out the process at a variety of other facilities globally, which are listed here.

Check out the original story here:
http://www.globalccsinstitute.com/community/blogs/authors/adamaston/2011/07/06/firing-first-world%E2%80%99s-first-coal-fired-ccs-plant-five-qu

How IBM is Enabling Smarter Management of … Medieval Abbeys | GreenBiz

Question: What does a rack of high-performance network blade servers have in common with the 15th-century Flemish tapestry The Hunt of the Unicorn?

Answer: Both are being watched over by some of IBM’s most advanced smart building systems.

Last week, as part of a roll-out of a broader suite of smart building technologies, I got to enjoy a dose of high culture and high technology, catching Big Blue’s announcement of a novel collaboration with New York City’s Metropolitan Museum of Art.

IBM is supplying a suite of hardware and software as part of its Intelligent Building Management system to help the Met fine-tune the maintenance and preservation of its one-of-a-kind collection of medieval artwork.

The move adapts technology developed originally to monitor and manage energy-intensive data centers — cutting-edge technology of the 21st century — to help care for some of the most priceless hand-made religious and secular artifacts, dating back to 800 AD.

At the announcement, Dr. Paolo DionisiVici, an Associate Research Scientist at the Met, made clear why doing the latter represents the greater challenge.

The Met’s main medieval collection is housed in The Cloisters Museum & Gardens, an assemblage of medieval French abbey structures, transplanted nearly a century ago (more on that below).

Authentic as the setting is, the challenges of managing temperature, humidity, and other variables are daunting: castle-like thick rock walls define a space that sees tens of thousands of visitors each year.

The artwork itself also offers up material preservation challenges of a stupefying variety.

The tapestries, like the most-famous-of-all Unicorn hanging, are made of a combination of wool, linen, metal threads, vegetable dyes and other delicate materials. After 600 years, they remain remarkably vivid.

Nearby in the museum are painted wooden religious artifacts that require different preservation conditions.

“Every object is unique,” DionisiVici said.

That’s where IBM comes in. To monitor this menagerie, IBM is deploying a network of low-power motes — small electronics packages fitted with sensors and antennae for communication. [An example of a type of mote IBM has developed, the MEMSIC IRIS mote, is pictured at left.]

“The motes can sense temperature, humidity, corrosion, contamination, light levels, air flow, pressure and more,” said Dr. Hendrik F. Hamann, one of IBM’s research scientists working on the project. The motes are also very energy-efficient and self-configure into a “mesh” network, he explained, by independently figuring out how to route data from one to the next.
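The self-organizing behavior Hamann describes can be illustrated with a toy model — this is an illustrative sketch, not IBM’s actual mote firmware, and the mote names, positions, and radio range are all hypothetical. Each mote discovers which neighbors sit within radio range and hands its reading, hop by hop, to the in-range neighbor closest to the gateway:

```python
import math

# Toy greedy "mesh" routing sketch (hypothetical layout, not IBM's system).
# Motes within RADIO_RANGE can exchange data; each hop forwards toward
# the in-range neighbor nearest the gateway.
RADIO_RANGE = 12.0

motes = {            # hypothetical (x, y) gallery positions, in meters
    "gateway": (0, 0),
    "m1": (10, 0),
    "m2": (20, 2),
    "m3": (28, 6),
}

def dist(a, b):
    """Straight-line distance between two positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route(src, dst="gateway"):
    """Greedy hop-by-hop path: each mote hands the reading to the
    in-range neighbor closest to the destination."""
    path, here = [src], src
    while here != dst:
        in_range = [n for n in motes
                    if n != here and dist(motes[n], motes[here]) <= RADIO_RANGE]
        here = min(in_range, key=lambda n: dist(motes[n], motes[dst]))
        path.append(here)
    return path

print(route("m3"))  # ['m3', 'm2', 'm1', 'gateway']
```

The point of the sketch is that no mote holds a global map: each one makes a purely local decision, which is how a mesh can reconfigure itself when nodes are added or moved.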

For now, the technology covers seven rooms, a portion of the Cloisters’ overall layout, but including the famous Unicorn tapestries.

What can motes do when overseeing a vaulted-ceiling room filled with priceless art? In a way the museum cannot today, the sensors and analytic software can tap into real-time data to observe and model microclimates, revealing areas of higher heat, dryness, or other variables with greater precision than conventional sensor systems can.

IBM’s analytical models can even guide how to place particular works of art, given better understanding of a room’s thermal dynamics. Wooden materials, for example, need steady humidity for optimal preservation.

“This is the first large-scale deployment of a smart sensing infrastructure coupled with deep analytics in a museum environment,” said Hamann.

IBM’s pairing with The Cloisters continues a century-old tradition of one-of-a-kinds that define the site.

About a century ago, craftsmen in France painstakingly dismantled five cloisters — essentially, abandoned convents or monasteries designed for secluded, contemplative living — packed the pieces in boxes, and shipped them via boat across the Atlantic.

Once landed, the crates were hauled up to the highest promontory of Manhattan, to land owned by John D. Rockefeller — the wealthiest man in U.S. history. Through the years of the Great Depression, some of those same craftsmen methodically reassembled the stones into a meticulous recreation of an abbey complex.

To complete the setting, Rockefeller bought up scores of acres of Manhattan land around the Cloisters and — just for good measure — also acquired the ridge of land immediately across the Hudson River, so as to protect the view from his creation.

Rockefeller had no interest in living in the transplanted abbey. In fact, he died a year before the complex was completed.

It was built strictly as a home for the more than 5,000 pieces of medieval art he collected in his twilight years, with the intention of donating the entire complex — building, land and artwork — to the Met.

If visiting New York City, The Cloisters is a must see — one of the best and least-known treasures the city has to offer. With IBM’s help, the masterworks there will be around to enjoy for a long time to come.

Interior Cloisters photo CC-licensed by respres. Mote photo courtesy of IBM.


Utilities Turn to Mergers as Demand for Power Slows | New York Times

The slowdown is spurring a fresh cycle of deal-making among publicly traded utilities. Not unlike the wave of consolidation that came after deregulation in the 1990s, major electricity players are looking to get bigger to protect their bottom lines.

So far this year, utilities in the United States have announced mergers and acquisitions with a total value of $44 billion. That compares with $30 billion in all of 2010, according to Thomson Reuters.

A Progress Energy nuclear plant in New Hill, N.C.  The utility’s merger with Duke Energy is pending.

If approved, Duke Energy’s $26 billion deal in January to buy Progress Energy would create the country’s largest utility. The combined company would own power plants with 57 gigawatts of capacity, generate $22.7 billion in revenue and serve 7.1 million customers across six states.

The surge of deals “marks the acceleration of a long-awaited consolidation of the U.S. electric utility industry,” said Todd A. Shipman, a credit analyst covering utilities and infrastructure ratings at Standard & Poor’s.

By his take, today’s deal-making will pick up from the previous era of consolidation. Since deregulation, the industry has shrunk to roughly 50 publicly traded companies, from 100. That number could be halved to 25 in as little as five years, Mr. Shipman said.

While the usual financial pressures to fortify balance sheets and improve credit quality are once again pushing mergers and acquisitions, environmental dynamics are playing a bigger role than in the past. Utilities — facing pending regulation on greenhouse gas emissions and renewed enforcement of older rules on air pollution — must reckon with the rising costs of compliance.

The added expenses come just as growth in electricity demand is being crimped by efficiency gains. Electricity usage increased 0.5 percent a year on average for the decade that ended in 2010, down from 2.4 percent a year during the 1990s, according to the Energy Information Administration.

The anemic figures represent the tail end of a six-decade deceleration of electricity demand. The rate peaked in the 1950s, at 9.8 percent a year, during a period of supercharged industrial growth and home construction.
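Compounding makes the gap between those annual rates even starker over a full decade. A quick back-of-envelope comparison, using the EIA averages cited above:

```python
# Back-of-envelope: cumulative electricity demand growth over a decade
# at the average annual rates cited above (EIA figures).
def decade_growth(annual_rate_pct):
    """Total percent growth after 10 years of compounding."""
    return ((1 + annual_rate_pct / 100) ** 10 - 1) * 100

growth_1990s = decade_growth(2.4)  # ~26.8% total over the 1990s
growth_2000s = decade_growth(0.5)  # ~5.1% total over 2001-2010
```

In other words, demand grew roughly five times as much, cumulatively, in the 1990s as in the decade that followed.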

Customers’ plans reflect a secular shift. Nine out of 10 businesses and 70 percent of consumers have set specific goals to lower their electricity costs, according to a recent study by the Deloitte Center for Energy Solutions with the Harrison Group, a research services firm. Nearly a third of companies polled have goals to self-generate electricity, whether through solar panels, reuse of wasted heat or other methods.

Utilities are adjusting to the new reality. With customers tapering their electricity use, Consolidated Edison is deferring the installation of transformers and other costly capital equipment in New York, said Rebecca Craft, the company’s director of energy efficiency and demand management. Con Ed trimmed its outlook for how much the city’s appetite for power will grow in the coming decade to 1 percent a year, from 1.7 percent.

“Practically every utility today is thinking about flattened growth of demand for energy,” said Gregory E. Aliff, a vice chairman of energy and resources at Deloitte.

“In the last wave of utility mergers, it was more offensive — companies were seeking growth,” he said. “Today is different: the industry is more on the defensive. Companies face a question of how to grow, and consolidation is a way to grow earnings.”

New environmental regulations are only heightening the growth challenge. The industry faces potentially sizable bills to meet a raft of air pollution rules being pushed by the White House.

Utilities with a big reliance on coal face the steepest emissions penalties. American Electric Power, which derives about 85 percent of its power from coal, recently estimated the new rules could cost $6 billion to $8 billion in coming years. The money would pay for adding filters to power plant smokestacks, closing coal-fired generators and switching to lower-emission natural gas generators.

Some merger-minded utilities are shedding assets to lower their exposure to such rules. As part of its $7.9 billion deal to buy Constellation Energy, Exelon plans to sell a batch of coal-fired plants. While coal fuels 12 percent of their current generation, the companies aim to halve that share after the merger.

A.E.P., Duke and Exelon declined to comment.

Unlike in previous periods of consolidation, regulators seem more willing to approve deals, given the sluggish economy and job market. Previously, the process could drag on for years, but recent mergers have been moving along more quickly.

This month Connecticut regulators effectively approved Northeast Utilities’ tie-up with NStar, despite opposition from the state’s attorney general. When the $6.9 billion deal, including debt, was announced last October, the companies pledged that “no broad-based, corporatewide layoffs or early retirements are planned.”

Said Mr. Shipman of S.&P., “In most of these deals, executives have been careful to emphasize that jobs will be spared, rather than cut.”

DealBook - A Financial News Service of The New York Times

Copyright 2011 The New York Times Company

See the original story here: http://dealbook.nytimes.com/2011/06/16/utilities-turn-to-mergers-as-demand-for-power-wanes/

How Peppermill Makes the Most of Its Geological Blessings | GreenBiz

In the virtuous race over bragging rights for the title of greenest building, there are various shades of LEED, there’s net-zero energy, and there’s Energy Star, too.

But how many facilities can sink a pipe into a parking lot, tap into more than enough energy to supply year-round heating needs, and just maybe have enough left over to generate much of the site’s electricity?

Welcome to Reno, Nevada, where the filigreed micro-fractures that spread out from California’s San Andreas and other major faults have created a subterranean wealth of hot rocks and heated water.

These geological blessings have made it possible, right in downtown Reno, for the Peppermill Resort Spa Casino — a 2.2-million-square-foot complex — to switch off its fleet of fossil-fueled boilers as part of a $9.7-million conversion to geothermal heat.

The hotel is saving $2.2 million per year in natural gas purchases. “That is what you call a no-brainer,” says Dean Parker, who, as Peppermill’s Executive Director of Facilities, has overseen the four-year project.

Peppermill’s past and future sit side by side in its spotless power plant shed, a stone’s throw from the main hotel. To get to the new geothermal heating unit, you have to pass by four hulking Cleaver-Brooks boilers. Now barely used, the central-plant boilers once cranked out up to 100 million BTUs of heat.

But today, Parker jokes he’s thinking of putting them up on eBay. Replacing all four of the garage-sized boilers is a single so-called “plate and frame” heat exchanger, made by Alfa Laval and pictured at right, no larger than a modest bathroom.

The device sandwiches together hundreds of thin, table-sized stainless steel sheets that work as a radiator, transferring heat from incoming geothermal fluids that measure 174 degrees Fahrenheit into fresh water passing through the heated plates. The fresh water is thereby warmed to 120 degrees and then stored in an armada of oversized Thermoses. Earth-heated water supplies all of the hotel’s needs: 1,600 guest rooms, heating, kitchens, laundry, pools, hot tubs, spas and more.

The heat exchanger can meet the hotel’s needs and then some, pulling about 900 gallons per minute of geothermal fluids from the earth. Both it and the well are designed to crank up to 1,500 gallons per minute. “So far as we know, we’re the only hotel-casino fully heated by on-site geothermal,” says Parker.
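Those figures allow a rough estimate of the exchanger’s heat duty. Assuming, illustratively, that the brine cools from 174 degrees to about the 120-degree delivery temperature (the article doesn’t give the brine’s outlet temperature) and using plain-water properties rather than the actual brine’s:

```python
# Rough heat-duty estimate for the plate-and-frame exchanger.
# Assumptions (not from the article): brine cools 174 F -> 120 F,
# and behaves like plain water (8.34 lb/gal, 1 BTU per lb per deg F).
GPM = 900            # geothermal flow, gallons per minute (as cited)
LB_PER_GAL = 8.34    # density of water
DELTA_T = 174 - 120  # assumed temperature drop, degrees Fahrenheit

btu_per_hr = GPM * 60 * LB_PER_GAL * DELTA_T
print(f"{btu_per_hr / 1e6:.1f} million BTU/hr")  # 24.3 million BTU/hr
```

On those assumptions the single exchanger is moving on the order of 24 million BTU per hour — a sizable fraction of the 100 million BTU the four retired boilers could produce, from one bathroom-sized unit.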

Geothermal Heating is a Go; Energy is on the Horizon

Electricity generation could be the company’s next project. The geothermal resource below Peppermill has enough capacity that the Peppermill is scoping out a 6- or 7-megawatt electricity generator. With commercial rates for power in the Reno area at around 9 cents per kilowatt-hour, the savings could be considerable.
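The potential value is easy to sketch. Assuming, purely for illustration, a 6-megawatt plant, a 90 percent capacity factor (geothermal plants typically run nearly around the clock; the exact figure is an assumption here), and the 9-cent commercial rate cited above:

```python
# Illustrative annual value of self-generated geothermal power.
# The capacity factor is an assumption, not a figure from the article.
MW = 6
CAPACITY_FACTOR = 0.90
RATE = 0.09  # dollars per kWh, the commercial rate cited above

kwh_per_year = MW * 1000 * 8760 * CAPACITY_FACTOR  # 8760 hours per year
value = kwh_per_year * RATE
print(f"${value / 1e6:.1f} million per year")  # $4.3 million per year
```

Even on rough numbers, a plant in that size range would offset power purchases worth several million dollars a year — roughly double what the hotel already saves on natural gas.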

There are challenges, however. Though warm enough to meet the hotel’s day-to-day hot water demands, Peppermill’s geothermal supply may be too cool to generate power, since conventional geothermal power plants typically need water at 275 degrees and higher.

But the technology is improving quickly: Parker is hopeful that a so-called “binary cycle” power plant might work. Rather than spinning a turbine by boiling water into steam, these systems use special working fluids that vaporize at much lower temperatures.

The hotel’s experience developing its new project also reveals the risks of prospecting geothermal resources. Northwest Nevada is rich in geothermal resources, and the hotel had years ago built a so-called “shallow” well that, at around 900 feet deep, supplied 120-degree water — hot enough to heat its pools, but not enough to meet its operational needs.

To upgrade its geothermal capacity, the hotel decided to dig deeper. Following studies suggesting hotter water could be found deeper down, the hotel erected a drilling rig in a parking lot late in the summer of 2009. Sound barriers helped minimize the noise from affecting guests and neighbors.

At first, the drill came up dry, raising the specter of a costly error. Pushing on, and operating round the clock for 28 days — at $100,000 per day — the rig bored to a depth of 4,421 feet. There it found a gusher of 174-degree water that today provides all of the hotel’s heat.

These days, with the drill long gone, the well is all but invisible, hidden under a manhole cover that straddles two parking spaces in the restored lot. Opening that cover, all that can be seen now is a 9-inch-diameter pipe that surfaces into a fire-hydrant-sized valve, which can steer up to 1,500 gallons per minute of hot, salty brine into the hotel’s power plant a few hundred yards away.

In January 2010, the hotel drilled a second, similar well about 1,000 feet away across the hotel’s property. Plunging to just 3,900 feet, this second well is used to re-inject the hot brine back into the earth after it has been used to heat fresh water.

This closed-loop is designed to make sure the subterranean heat resource isn’t depleted. Once the hot fluid surfaces, only some of its heat is extracted before the fluid is piped back to the injection well and sent back down into the same geological formation. Down below, the water re-absorbs heat from the earth, to be recycled again and again.

The Peppermill’s location is blessed with more than ample superheated water. Over the past decade, Reno has emerged as a hotbed (couldn’t resist…) of the geothermal industry. About a quarter of Reno’s daytime electricity demand, and most of its nighttime needs, are supplied by Ormat’s 100-megawatt Galena geothermal plant just outside the city.

Along with Ormat, the bulk of the U.S. geothermal industry — including Ram Power, Magma Energy, Terra Gen, Gradient Resources and Enel Green Power — all operate locally. Dozens of geothermal power plants are in the pipeline, in large part to help meet California’s looming goal of generating one-third of its electricity from renewables.

Political support is strong too: Reno is an area where “Republicans want renewables,” and lawmakers have supported state-level renewables policy, university R&D programs, and community college courses focused on training geothermal technicians.


Writer, editor, content advisor, creative leader – energy, climate | Chief storyteller at RMI | Co-founder of T Brand at The New York Times