Sunday, October 31, 2010

Dimethylfuran and 5-Hydroxymethylfurfural From Biomass

The biomass-derived chemical dimethylfuran (DMF) has been shown to possess combustion and emissions qualities competitive with gasoline.
5-Hydroxymethylfurfural (HMF) is also derived from biomass, and can serve as a platform for the production of renewable fuels and high-value chemicals.
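For a rough sense of why DMF is considered competitive, the quick comparison below uses approximate volumetric energy densities from the open literature -- ballpark assumptions on my part, not figures from this post:

# Approximate volumetric energy densities (MJ per liter).
# These are literature ballpark values, assumed here for illustration.
energy_density = {
    "gasoline": 32.0,              # roughly 32-34 MJ/L depending on blend
    "2,5-dimethylfuran": 31.5,
    "ethanol": 21.3,
}

base = energy_density["gasoline"]
for fuel, mj_per_l in energy_density.items():
    print(f"{fuel:20s} {mj_per_l:5.1f} MJ/L  ({mj_per_l / base:.0%} of gasoline)")

On those numbers DMF carries nearly all of gasoline's energy per liter, while ethanol carries only about two-thirds -- part of why DMF is considered a credible gasoline competitor where ethanol is not.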

To this point, it is the production of this wide array of useful chemicals and fuels from biomass which has been demonstrated -- not necessarily their economical production at large scale. It will take a number of years to accomplish that.

It is best to see the future of biomass derived fuels and chemicals as a broad highway with many lanes of traffic. Not all lanes will reach the goal of profitable, large-scale production. Certainly not all lanes will reach the goal at the same time. And counter-intuitively, the high-speed lane may not be the lane which allows the largest flow of traffic in total.

There will be no magic bullet, instant replacement for geological hydrocarbons. That is why it is so important to maintain access to the wide array of geo-hydrocarbons for as long as possible. But as soon as the dieoff.orgiasts and the energy starvationists can be hounded out of public office, the prospects for a prosperous and expansive human future will enlarge vastly.

Friday, October 29, 2010

GE-Hitachi Teams with Savannah River Nuclear Solutions to Explore Gen IV PRISM Reactor for Burning Spent Nuclear Fuel

The PRISM reactor design, which completed U.S. Nuclear Regulatory Commission pre-application reviews in 1994, is an advanced, Generation IV sodium-cooled reactor technology. A key attribute of PRISM technology is that it generates additional electricity from recycling used nuclear fuel...

...GE-Hitachi leans away from reprocessing, and towards recycling, as a potential future fuel cycle in the USA: "Our vision is recycling—you burn the actinides in a fast reactor, such as PRISM, and you don't separate plutonium. That alleviates some of the proliferation concerns, and you get as close to closing the fuel cycle as practical. You also reduce waste, and reduce the half-life of radioisotopes that are disposed." _NuclearEngineering
The GE-Hitachi PRISM reactor is a Gen IV design that promises to recycle and burn the mountains of spent nuclear fuel left over from conventional nuclear fission reactors. If this sodium-cooled Gen IV reactor proves out in testing at the DOE Savannah River Site, it will revolutionise the nuclear waste picture.
GE Hitachi Nuclear Energy (GEH) and Savannah River Nuclear Solutions, LLC, (SRNS) signed a memorandum of understanding (MOU) to explore the potential of deploying a prototype of GEH’s Generation IV PRISM reactor as part of a proposed demonstration of small modular reactor (SMR) technologies at the US Department of Energy’s (DOE) Savannah River Site.

The MOU sets the stage for continued discussions on the potential NRC licensing and deployment of a 299-megawatt PRISM reactor at the federally owned facility. SRNS is the management and operating contractor for DOE at Savannah River Site (SRS)....

....The PRISM reactor design, which completed US Nuclear Regulatory Commission pre-application reviews in 1994, is an advanced, Generation IV reactor technology that builds on research and development of sodium-cooled reactors. A key attribute of PRISM technology is that it generates additional electricity from recycling used nuclear fuel. _GCC

First US Coal-to-Gasoline Plant for West Virginia?

TransGas Development Systems, LLC announced an agreement with SK Engineering & Construction Co., Ltd (SKE&C) leading to engineering, procurement and construction of its first US coal-to-gasoline plant—Adams Fork Energy—to be located in Mingo County, West Virginia. _GCC

US coal deposits contain roughly 12 times as much energy as all the known oil in Saudi Arabia. The gasification process to be used in the new West Virginia CTL plant could cleanly utilise coals of any grade -- including the cheapest and dirtiest coal. By moving US coal reserves into the liquid fuels arena, the prospects for peak oil remain slight -- unless the Obama administration decides to shut down all coal projects, even clean coal projects. Obama has promised to bankrupt coal companies, and all his other policies are consistent with an "energy starvation" approach to shutting down US industrial production. Time will tell.
The Adams Fork Energy project will convert regional coal into premium-grade gasoline, producing 18,000 barrels per day (756,000 gallons US, 2.86 million liters). When fully developed, the Adams Fork project will be the largest coal-to-gasoline project in the world, according to Adam Victor, President and CEO of TransGas Development Systems.

The project team has been issued a permit to construct by the West Virginia Department of Environmental Protection and plans to begin work on the site during the second quarter of 2011.

The plant will have several process components. First, coal is gasified to produce synthesis gas, using Uhde PRENFLO PDQ gasifiers. The synthesis gas will then be cleaned to remove impurities, turning most into marketable byproducts. Next, the synthesis gas will be converted into methanol, which in turn will be converted into gasoline utilizing ExxonMobil Research and Engineering Company’s (EMRE) MTG process. During the operation of the integrated facility, air emissions are expected to be so low that it will qualify as a minor source under US law. _GCC
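A quick unit check on the quoted output figures, assuming the standard 42 US gallons per barrel and about 3.785 liters per gallon:

# Unit check on the Adams Fork output figures quoted above.
barrels_per_day = 18_000
gallons_per_barrel = 42
liters_per_gallon = 3.785

gallons_per_day = barrels_per_day * gallons_per_barrel
liters_per_day = gallons_per_day * liters_per_gallon
print(f"{gallons_per_day:,} gallons per day")                # 756,000
print(f"{liters_per_day / 1e6:.2f} million liters per day")  # ~2.86 million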

Thursday, October 28, 2010

Proved Oil Reserves Continue to Grow Over Time

Proved oil reserves at the end of 2008 are estimated to have been 1258.0 thousand million (billion) barrels. That represented an increase of 17.7% over the end-1998 figure of 1068.5 billion barrels, despite estimated cumulative production of 290 billion barrels during the intervening ten years, i.e. global reserves additions amounted to around 480 billion barrels between end-1998 and end-2008.

...The reserve numbers published in the BP Statistical Review of World Energy are an estimate of proved reserves, drawn from a variety of official primary sources and data provided by the OPEC Secretariat, Cedigaz, World Oil and the Oil & Gas Journal and an independent estimate of Russian oil reserves based on information in the public domain. Oil reserves include field condensate and natural gas liquids as well as crude oil. They also include an estimate of Canadian oil sands 'under active development' as a proxy for proved reserves. This inclusive approach helps to develop consistency with the oil production numbers published in the Review, which also include these categories of oil. _Source
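The BP numbers reconcile with a single line of accounting: gross reserve additions over the decade must cover both the net growth in proved reserves and everything produced along the way. A minimal sketch using the figures quoted above:

# Reconciling the BP Statistical Review figures quoted above (billion barrels).
reserves_end_1998 = 1068.5
reserves_end_2008 = 1258.0
cumulative_production = 290

gross_additions = (reserves_end_2008 - reserves_end_1998) + cumulative_production
print(f"Gross reserve additions, end-1998 to end-2008: ~{gross_additions:.0f} billion barrels")
# ~480 billion barrels, matching the figure in the excerpt.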
In fact, proved oil reserves have increased since the beginning of the use of petroleum as an economic resource. One reason for this is that the total amount of hydrocarbon resources distributed throughout the Earth's crust is unknown -- and likely to be far higher than anyone has estimated. Whenever economic conditions require that new exploration be carried out, new reserves are discovered. In addition, new technologies are constantly being developed which convert unproved reserves into proved reserves.

Well over 70% of the Earth's crust remains to be explored for hydrocarbons (including oil, coal, gas, kerogens, bitumens, peats, clathrates, heavy oils, etc.). The truly giant oil fields may not have been discovered yet. Technology for oil exploration and production is still in an early stage of development, and the geologic theory of oil reservoirs is due for radical overhaul.

The oil fields that are being tapped currently are generally only a few hundreds of millions of years old at most. Most of the truly ancient oil fields have been submerged beneath other forms of rock -- besides being under the seafloor -- which are not thought to be fertile ground for oil drilling. The classical theories of oil exploration are suitable for low-hanging fruit. As the old geological theories are subsumed by more realistic approaches, ever new proved reserves of oil will emerge.

It is a race against time, of course. Within 20 years, microbial fuels will be both price competitive and approaching a scale to rival the modern oil industry. BTL, CTL, GTL, BitTL, KerTL, clathrates-to-liquids, etc. will be performed in situ and on-site, using advanced nano-catalytic technologies combined with microbial assist. It will be a new world of energy, but a world with a far lower demand for liquid fuels, as small modular reactors take over the electrical sector.

Wednesday, October 27, 2010

Densifying Biomass: Biomass Tablets at 20,000 PSI

MU's Bradford Research and Extension Center farm and a local business have built a machine that can compact corncobs, switchgrass and other biomass so four times as much material can fit in the same amount of space.

Instead of needing an 18-wheeler truck to move biomass to burn as fuel for electricity and ethanol, the same amount could be transported in a dump truck. _Missourian
Biomass is an inherently bulky and clumsy material to transport. Finding ways to compress and/or densify biomass very close to the source is crucial to building a thriving biomass economy. Otherwise, a significant amount of energy will be wasted simply transporting the biomass to pre-processing, processing, and refining facilities.

Current methods of densifying biomass include compression into pellets and briquettes, and fast pyrolysis conversion to bio-crude. But those processes involve complex, multi-stage machines which are difficult to transport. University of Missouri ag researchers, along with collaborators, have developed a special hydraulic press operating at 20,000 psi (as opposed to the 3,000 psi typical of briquette presses) which achieves almost 20% greater density than briquettes and pellets, and nearly 4 times the density achieved by commercial balers.
The tabletizer works like this: a hopper with a hydraulic motor turns the auger and feeds the 4- to 6-inch diameter cylindrical mold with biomass. Then a ram pounds the biomass tightly into the mold, shrinking the material from about 10 inches to 2, which is smaller than most biomass briquettes.

“Basically, it squeezes the snot out of it,” VanEngelenhoven laughs. The mold then turns and ejects the compacted tablet. The pressure exerted on the biomass in the mold is about 20,000 pounds per square inch, enough to force the material to adhere together without additional binders. “We don’t put anything extra in it,” VanEngelenhoven says. Long, coarse-cut feedstocks are favorable in the process, as they stick together more easily, he adds.


The resulting tablets have an average density of 55 pounds per cubic foot, compared with average bale density of 15 pounds per cubic foot and pellet density of about 45 pounds per cubic foot, VanEngelenhoven boasts. “So it’s significantly better than a baler, but it uses more energy as well,” he says. “So in that realm, you have to try to compare how much energy this machine is using versus another machine and is there one that’s inherently better? And the answer is, it depends on what you want it to do.” If the densified material is being used at a power plant or to burn for heat in a home, a bale doesn’t fit the bill because it’s too big, VanEngelenhoven says. “So if you want something smaller, you can go with pellets, or now you have this option.” _BiomassMag
As commercial versions of these presses become available, private startups will move from farm to farm -- or from farm region to farm region -- to compress biomass for more convenient and economical hauling to either point of use, or further processing into fuels or materials.
The tablets could provide fuel for co-firing with coal in stocker and fluidized-bed boilers for power generation, fuel for heating buildings and feedstock for producing charcoal. Primary markets would be power plants and ethanol plants. _MUNews
As economies of scale bring the costs of production down, individuals and communities may elect to install their own boilers to provide combined heat and power using compressed biomass.

One ton of compressed biomass is roughly equivalent to 3 barrels of heating oil in energy content. The biomass press described above can use a wide range of raw biomass without having to grind or heat the feedstock before compression into tablets.
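The compression ratios quoted above follow directly from the densities, and the heating-oil equivalence can be sanity-checked with typical heating values. The per-unit energy figures in the sketch below (about 5.8 million Btu per barrel of heating oil, roughly 17 million Btu per dry ton of biomass) are common ballpark assumptions, not numbers from the articles:

# Density ratios from the figures quoted above (lb per cubic foot).
tablet_density = 55
pellet_density = 45
bale_density = 15
print(f"Tablets vs pellets: {tablet_density / pellet_density - 1:.0%} denser")  # ~22%
print(f"Tablets vs bales:   {tablet_density / bale_density:.1f}x denser")       # ~3.7x

# Rough check on "1 ton of compressed biomass ~ 3 barrels of heating oil".
btu_per_barrel_heating_oil = 5.8e6   # assumed typical value
btu_per_dry_ton_biomass = 17e6       # assumed typical value
print(f"Heating-oil barrels per ton: {btu_per_dry_ton_biomass / btu_per_barrel_heating_oil:.1f}")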

Tuesday, October 26, 2010

As Phantoms of Peak Oil Fade into Dawn

Peak oil has taken on the trappings of an apocalyptic religion recently. True believers tout "Hubbert Curves" and "Simmons Depletion Analyses" in circular jerkulars and echo choruses across the webosphere. Oil should be well over $500 a barrel by now, according to the not-so-distant predictions of the peakers -- some dead and some alive. But oil is in the $80 range, largely due to a combination of the weak Obama dollar, the speculative lemmings' rush to a safe investment haven, and a misleading surge in demand from India, China, and Brazil.

But ignoring the doomseekers, what are the actual hydrocarbon resources which we have to deal with? Below are some rather conservative early estimates of hydrocarbon resources -- neglecting the vast terrestrial and sub-sea coal and methane clathrate resources. It may take a while to achieve "peak oil" when you consider how quickly industry is developing means of converting these unconventional hydrocarbons to conventional liquid fuels.

Here is more on the vast hydrocarbon resources remaining:
How fast are we using our oil reserves?

Currently, we are consuming about 31,100,000,000 barrels a year.

How much oil do we have? Or more specifically, how much do we have left, and in what forms?
- Conventional Oil, About 1,750,000,000,000 barrels left.
- Oil Sands Oil, About 3,600,000,000,000 barrels left.
- Oil Shale Oil, About 3,300,000,000,000 barrels left.
- Bio Fuel, Till the sun burns out…

At this burn rate, it’ll take approximately 56 years before we run out of easy to get conventional oil, or 278 years till we run out of oil reserves altogether. One of the common misconceptions about our consumption of oil is that we are using it at an ever increasing rate. In reality, refining capacity peaked around 1980 in most of the world and has been remarkably steady ever since. _Source
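The source's "56 years" and "278 years" are simple division of the quoted reserves by the annual burn rate, holding consumption flat (the source's own simplification):

# Reserve-lifetime arithmetic behind the figures quoted above (barrels).
annual_consumption = 31.1e9

conventional = 1.75e12
oil_sands = 3.6e12
oil_shale = 3.3e12

print(f"Conventional only:   {conventional / annual_consumption:.0f} years")                            # ~56
print(f"All listed reserves: {(conventional + oil_sands + oil_shale) / annual_consumption:.0f} years")  # ~278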
The outlook expressed above is remarkably rosy when viewed from a doomer's perspective. But in reality, the resource estimates are likely to be far too low in the long run.

It is all a moot point, however, because long before conventional oil is exhausted, humans in advanced countries will have begun to shift to resources which are essentially inexhaustible, such as advanced interlocking fuel cycles of nuclear fission, microbial biofuels, and perhaps nuclear fusion.

The world's transportation and chemical industries are far too reliant upon conventional crude oil at this time. But Canadian oil sands are coming on quickly -- as long as the price of oil stays above $70 a barrel. Other unconventional fuels are following along. More will follow as needed.

Microbial biofuels are perhaps ten years away from price competitiveness, and another ten years from scaling up to the volumes needed to truly compete. By that time, development of CTL, GTL, BTL, and oil sands, oil shales, methane clathrates, etc. should be in full bloom. Advanced nuclear fission will also be taking off, as small modular reactors begin to take over from coal generation plants.

History is witness to the scatterings of myriad apocalyptics all the way back to antiquity. It's not a bad scam, if you can keep your self respect and your sanity.

Sunday, October 24, 2010

Biofuels Technology: Not Standing Still for Anyone

New Oil Resources' plant in Louisiana is utilising hydrothermal liquefaction of biomass, and subsequent direct production of transportation fuels from liquefaction products.
Louisiana-based startup New Oil Resources (NOR) is commercializing a near-critical (i.e., 320-390 °C, 200-420 bar) aqueous phase process which converts biomass containing cellulose, hemicellulose and lignin into a high-octane gasoline fraction. New Oil Resources licensed the technology (US Patent 6,180,845) in 2009 from Louisiana State University (LSU); the original developers of the process are Drs. James Catallo and Thomas Junk.

Catallo and Junk determined that reacting organic compounds in near-critical or supercritical aqueous phases can transform the compounds over short time periods (i.e., minutes to hours) into petroleum-like hydrocarbon mixtures. The reductive process is conducted in anaerobic or near-anaerobic conditions, essentially free of any strong oxidants. Strong reducing agents or other co-reactants may be added to tailor product distributions...

...Some 70% to 80% of the energy in the feed is returned in the final products. The remaining 20% to 30% of the energy is used to run the process. The process has a small footprint, produces renewable energy and is water friendly. It also utilizes technology and equipment already in use in the petrochemicals industry.

Our advantage is that we apply basic chemistry instead of using biological processes or relying on catalysts. The chemistry we use is similar to gas phase kinetics which is more reliable and much easier to scale up.

Our process is similar to that used by several companies worldwide. We use hot water to depolymerize the cellulose, lignins, lipids and other polymers contained in the biomass. The difference between all these companies is what you do next. We impact the chemistry so that the depolymerized biomass turns into the products we want. Cellulosic material comes out of our process as oxygen free aliphatics containing five to eight carbon atoms and aromatics. This product is similar to the high octane gasoline produced in petroleum refineries.
Dr. Gary Miller _GCC
The New Oil process claims near 80% energy yields from feedstock, with the versatility of creating either hydrocarbons or alcohols as final products. Meanwhile, Tuskegee University and Florida State University are exploring various means of deoxygenating bioliquids in order to directly form hydrocarbons from biomass.

Thursday, October 21, 2010

$2 to $4 a Gallon Diesel from Waste Biomass Projected by UMass

The process is an extension of early work by Huber and James Dumesic that first presented a catalytic process for the conversion of biomass-derived carbohydrates to liquid alkanes (C7–C15).

The four steps of the process are:

  1. Combined acid hydrolysis into xylose and acid-catalyzed biphasic dehydration of xylose into furfural;

  2. aldol condensation of the furfural extract through which the alkane precursor F-Ac-F is formed through the reaction of furfural with acetone;

  3. low-temperature hydrogenation of the F-Ac-F dimer to thermally stable hydrogenated dimers (H-FAFs). In this step, three types of double bonds of F-Ac-F are saturated and the final hydrogenated dimers contain merely spiro and alcohol forms of dimers; and

  4. high-temperature hydrodeoxygenation of the hydrogenated dimer solution with H2 to produce jet and diesel fuel range alkanes over a bifunctional catalyst.
_GCC
A developing process from UMass Amherst and the University of Maine promises to provide much higher yields of fuels from wood-processing waste.
The researchers obtained experimental yields of 76% for jet fuel range alkanes, corresponding to a weight-percent yield of 0.46 kg of alkanes per kg of xylose (monomer and oligomers) in the hemicellulose extract. The theoretical yield for this process is 0.61 kg. Currently, the low-yielding steps are dehydration and hydrodeoxygenation. For the dehydration step, they were able to obtain a yield of 90% with model xylose solutions, and the kinetic model indicates that a yield of 95% can be achieved.

Therefore it should be possible to obtain yields close to 95% for this process by further optimizing the reaction system. Yields higher than 95% are very challenging due to the undesired decomposition and polymerization reactions. The yield for the hydrodeoxygenation step could also be improved from 91% to 95% with improvements in the catalysts and reactor design.

We predict that the overall yield for jet and diesel fuel range alkanes could be increased up to 88% with these modest process improvements. The straight alkanes produced in our process can be further upgraded via the hydroisomerization process to form branched alkanes. The straight and branched alkanes together can either be directly sold as chemicals or liquid fuels, or sent to a refinery as additives to make the desired jet and diesel fuels by blending with other hydrocarbons.

Currently the alternative approach to the synthesis of straight and branched alkanes for synthetic jet and diesel fuels is the Fischer–Tropsch process, using synthesis gas derived from natural gas. As such, our process provides another way to make jet and diesel fuels range alkanes from waste hemicellulose-derived solutions.
—Xing et al.
The team performed a preliminary economic analysis for this process and concluded that jet and diesel fuel range alkanes can be produced for between $2.06/gal and $4.39/gal, depending on the feed xylose concentration in the hemicellulose extract, the plant capacity and the overall yields. _GCC
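The reported numbers hang together. The experimental mass yield divided by the theoretical maximum gives the ~76% figure, and because step yields multiply, improving the two weakest steps moves the overall number substantially -- a minimal sketch using only the values quoted above:

# Checks on the UMass yield figures quoted above.
experimental = 0.46   # kg alkanes per kg xylose (measured)
theoretical = 0.61    # kg alkanes per kg xylose (theoretical maximum)
print(f"Fraction of theoretical yield: {experimental / theoretical:.0%}")   # ~75-76%

# Raising dehydration from 90% to 95% and hydrodeoxygenation from 91% to 95%
# gives a relative gain of about 10%, moving the overall yield from 76%
# toward the authors' projected 88% (which assumes further optimization as well).
gain = (0.95 / 0.90) * (0.95 / 0.91)
print(f"Relative gain from the two improved steps: {gain - 1:.0%}")   # ~10%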

Given the low energy density of biomass, it is critical that BTL processes be as efficient and high-yielding as possible. Naturally, it is also important that biomass feedstocks be energy-densified at as early a stage in the collection process, and at as low a cost, as possible.

Biomass is a relatively low cost means of converting solar energy into gaseous, liquid, and solid forms of fuel -- as well as plastics, high value chemicals, fertilisers, animal feeds, and more. As more integrated means of processing and utilising biomass feedstocks are developed, the costs should drop as the activities are made more efficient, and are scaled up.

Wednesday, October 20, 2010

More on Small Modular Nuclear Fission Reactors

With six nuclear companies involved in the design of SMRs, competition could create a global market for U.S. SMRs. These companies include Westinghouse and Babcock & Wilcox, which build small reactors for the U.S. Navy, as well as start-ups like NuScale, Hyperion, and Intellectual Ventures, a Seattle firm financed by Microsoft _Standard

Small modular fission reactors can jump-start a new era in US power production. Newer technologies will require ever greater electric power generation capacities. This new power will need to be available when it is required. It must be both baseload and dispatchable. Given the modular nature of Small Modular Reactors (SMRs), and the potentially load-following nature of some of the new designs, SMRs offer incredible utility and high value for the money.
Ranging in capacity from 45 to 140 megawatts, SMRs are a fraction of the size of conventional 1000 MW nuclear plants and can be rapidly assembled at new or existing nuclear plant sites. Major SMR components such as the reactor vessel and turbine-generator are factory fabricated and transported by truck, rail or barge for prompt onsite-assembly. Some SMR designs operate continuously for over a decade before refueling.

Because factory fabrication permits greater cost savings, SMRs are expected to cost about $5,000 per kilowatt, or several hundred million dollars rather than the billions required for a 1000 MW plant.... _Standard
It is difficult to predict how large a boon to an economy these versatile new reactors represent. But we are beginning to learn:
A prototypical 100 megawatt (MW) SMR costing $500 million to manufacture and install onsite is estimated to create nearly 7,000 jobs and generate $1.3 billion in sales, $627 million in value-added impacts (a measure of GDP), $404 million in earnings (payroll), and $35 million in indirect business taxes;
The annual operation of each 100 MW SMR unit is estimated to create about 375 jobs and generate $107 million in sales, $68 million in value-added impacts, $27 million in earnings (payroll), and $9 million in indirect business taxes. _Heritage
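The headline cost figure is easy to sanity-check: at the quoted $5,000 per kilowatt, a 100 MW unit comes out at roughly half a billion dollars, which is where the "several hundred million rather than billions" contrast comes from:

# Sanity check on the SMR cost figures quoted above.
cost_per_kw = 5_000              # dollars per kilowatt of capacity
unit_capacity_kw = 100 * 1_000   # a prototypical 100 MW unit

capital_cost = cost_per_kw * unit_capacity_kw
print(f"100 MW SMR:     ~${capital_cost / 1e6:.0f} million")   # ~$500 million

# At the same per-kW figure (an assumption, purely for contrast), a 1,000 MW plant:
print(f"1,000 MW plant: ~${cost_per_kw * 1_000 * 1_000 / 1e9:.0f} billion")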
Certainly US utility giant TVA is figuring SMRs into its future plans:
"In the longer term, we are just gearing up to participate in small nuclear reactors," Brinkworth said.

These are a new generation of reactor, about one-third the size of traditional reactors, that are built in a factory rather than on-site, can be transported by truck or rail, and are much less expensive than a site-built reactor.

"It is possible down the road that as we get more experience with that technology that we may see more of those units become part of our resource mix," Brinkworth said. "One of the advantages is they are smaller and you can add them in smaller increments so you can match your units to your growth (of demand)." _KnoxvilleNews
SMRs are far from being a totally new technology. In fact they have been around for quite some time, and have accrued quite a safety record:
The United States has been operating small nuclear power plants continuously since the early 1950s. These reactors have been used for research and development, power generation, and ship propulsion. There have been tens of thousands of people associated with the design, development, manufacture and operation of these smaller reactors.

The enterprise has accumulated an admirable safety and reliability record and is the basis for some of the optimism associated with the development of small modular reactors that can produce power in volumes that would be uneconomical if produced in the far larger nuclear plants that became the standard during a period when coal, oil and natural gas were extremely cheap. _RodAdams
Besides safety advantages, there are a large number of cost advantages inherent in the use of small modular reactors in place of most other power technologies:
Small reactor clusters located at recycled coal-fired power plant sites potentially have greatly simplified grid connections. Not only can they be located near the cities they are intended to serve, but grid hookup is facilitated by existing transformer farms and grid connections. Because they can be located close to the cities they serve, new transmission lines will not have to cover long distances, thus lowering grid expansion costs. Large reactors may require new transmission lines that are hundreds of miles long in order to move surplus electricity to market.

In addition to the savings mentioned above, there are other potential savings that may be available with small reactors. For example, with advanced nuclear technology such as molten salt reactors, combined Rankine (steam) and Brayton (gas) cycles are possible. A bottoming desalination cycle could be added to the system, offering formidable efficiency from small reactor packages. A high temperature reactor can provide top-cycle heat for industrial processes, as well as middle-cycle electricity generation and bottom-cycle heat for electrical generation. By adding a second generating cycle, small reactors can lower their electrical generation costs. Desalination would add a further revenue stream from the reactor's operation through the sale of potable water.

Thus it can be concluded that shifts to small nuclear power plants will potentially offer significant savings over current conventional nuclear costs. Shifts from conventional nuclear technology to some advanced nuclear technologies also offer significant potential savings. Some advanced technology savings are available to both large and small nuclear power plants, but the flexibility of small NPPs may mean that, at least in certain situations, small advanced nuclear power plants may offer very significant savings in comparison to large conventional NPPs. _CharlesBarton
While many of the newer SMR designs will need to prove themselves, some of the designs are minor variants of already-proven designs with excellent safety and operating records. The NRC needs to take that into account as it begins the approval process for these designs.

The Nasty Truth Behind Big Wind Energy

Wind energy has become a great cause and crusade by faux environmentalists and Obamists in and out of government. But big wind energy is most famous for the things it is not: Big wind is not affordable, it is not reliable, it is not dispatchable, it is not baseload. Big wind has become a dogma in a religion of pseudo-science -- a vital tenet in the implementation of a designed energy starvation.
1. Study after study shows that wherever wind development was put in place, natural gas demand went up and the environmental benefits were the opposite of what the advocates expected.

“Cycling” coal plants to accommodate wind generation makes the plants operate inefficiently, which drives up emissions. Moreover, when they are not operated consistently at their designed temperatures, the variability causes problems with the way they interact with their associated emission control technologies, frequently causing erratic emission behavior that can last for several hours before control is regained. Ironically, using wind to a degree that forces utilities to temporarily reduce their coal generation results in greater SO2, NOX and CO2 than would have occurred if less wind energy was generated and coal generation was not impacted.

2. There is a huge disparity between installed capacity and actual output into the system. In many cases the actual output in the system is less than 20% and in some cases even far less.

There are other unsavory facts that are included in these graphics, such as the area required by a wind farm compared to, e.g., a nuclear power plant. The Roscoe wind farm in Texas occupies 100,000 acres for a bit less than 800 MW of installed capacity; the Palo Verde nuclear power plant in Arizona occupies 4,050 acres (4 percent of the Texas wind farm) but has a 500 percent larger power capacity (almost 4,000 MW).

Even more obscene are the government subsidies that go into wind power. For an energy source that barely exceeds one percent of energy output, wind subsidies are $23 per megawatt hour -- about 60 times the $0.44 per megawatt hour that goes to the mainstay of US electrical power output, coal, and 100 times the $0.25 per megawatt hour that goes to natural gas, the two sources that account for over 70 percent of US power supply. Way to go for social engineering. _EnergyTribune
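The excerpt's comparisons boil down to two ratios, recomputed below from the quoted raw numbers. They land in the same ballpark as the article's rounded "60 times" and "100 times" figures; note the land comparison uses nameplate capacity only, and capacity factors would widen the gap further:

# Recomputing the comparisons from the figures quoted above.
roscoe_mw, roscoe_acres = 800, 100_000            # Roscoe wind farm (approx.)
palo_verde_mw, palo_verde_acres = 4_000, 4_050    # Palo Verde nuclear plant (approx.)

wind_kw_per_acre = roscoe_mw * 1_000 / roscoe_acres
nuke_kw_per_acre = palo_verde_mw * 1_000 / palo_verde_acres
print(f"Wind:    {wind_kw_per_acre:6.0f} kW per acre")
print(f"Nuclear: {nuke_kw_per_acre:6.0f} kW per acre "
      f"(~{nuke_kw_per_acre / wind_kw_per_acre:.0f}x the areal power density)")

subsidy_per_mwh = {"wind": 23.00, "coal": 0.44, "natural gas": 0.25}
for fuel in ("coal", "natural gas"):
    print(f"Wind subsidy is ~{subsidy_per_mwh['wind'] / subsidy_per_mwh[fuel]:.0f}x that of {fuel}")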

Tuesday, October 19, 2010

The Age of Easy Oil: Just Beginning?

The oil minister of Saudi Arabia says that the nation's oil fields still have plenty of cheap and easy crude yet to be pumped. Saudi Arabia's and the world's largest oil field -- Ghawar -- is said to still contain at least 88 billion barrels of oil alone. And most of Saudi Arabia has yet to be thoroughly surveyed for the scores of smaller, still undiscovered fields that remain.
"I am sorry to disappoint people, easy oil is not over," Ali Al-Naimi told reporters in Riyadh on the occasion of OPEC's 50th anniversary.

"How can you say that easy oil is over, when we still have over 88 billion (barrels) in the Ghawar field...You can dismiss that notion that easy oil in Saudi Arabia is gone," he added.

Ghawar, the world's largest onshore oilfield has been in production since 1951.

Proponents of the theory that global oil output is at or near its peak have said Saudi reserves may be less than stated and that fields like Ghawar may be under strain.

...Each barrel of oil equivalent cost an average of just over $3 to discover last year, compared with just $1.18 in 2001, according to upstream consultant Wood Mackenzie.

Saudi Aramco is undertaking a $16 billion development at its Manifa offshore oilfield. The field would pump 900,000 barrels per day (bpd) of heavy crude by 2024.

Naimi said the development of Manifa will allow the kingdom to tap into heavy oil production and was not a signal to the end of the easy oil age.

"That is one of the reasons but we also need it for new refineries that are built."

Asked about depletion rates in fields, Naimi said the kingdom has sufficient production capacity at 12 million barrels per day (bpd) and has a strategy of preserving its resources and developing new sources of energy.

"We have the production capacity and we don't have to deplete our reservoirs as fast as someone who's just there for investment...so we don't really have to pull our reservoirs as hard as we should," Naimi said.

...The kingdom had reached a production rate of up to 70 percent at some of its fields.

Naimi said the world would remain dependent on fossil fuels for at least 50 years and reiterated his country was looking into other forms of energy.

OIL PRICES

For now Naimi said the market is balanced and a price range between $70-80 per barrel was fair to both consumers and producers, giving no indication on what action could be taken in OPEC's next meeting in December. _Reuters

Commercial oil development took off in North America long before it did in the Middle East. Roughly a thousand times more exploration for oil deposits has gone on in North America over the past 160 years than in Saudi Arabia, the Persian Gulf, or the rest of the Middle East.

While the Peak Oil Doom! business may be booming among many folks who have nothing better to do, among those who are into solving problems, reality must be taken into account.

Monday, October 18, 2010

Two New Gasification Biomass-to-Alcohols Plants

Ineos Bio is opening a new plant in Indian River County, Florida, to produce ethanol from biomass via gasification and catalytic conversion. The process can be expanded to add any number of new high value chemical products to the line in the future -- depending upon the catalytic conversion processes used.

Maverick Biofuels is building a biomass-to-mixed-alcohols plant in North Carolina, based upon a similar approach. Biomass is gasified to syngas, which is catalytically converted to C2-4 olefins, and then to C2-4 alcohols. The mixed ethanol-propanol-butanol product is a higher value fuel additive than pure ethanol, with a higher energy density.

The decision to use gasification plus catalytic conversion to produce alcohols, rather than attempting to break up the cellulose / hemicellulose into sugars and then ferment the sugars to alcohols, is an economic one. Gasification and catalytic conversion are relatively mature technologies, in comparison to the technologies which will be used to dismember cellulose and ferment the mixed sugars to alcohols.

In the long run, microbial approaches will probably prove more economical than high temperature approaches such as pyrolysis and gasification, due to lower energy requirements. Greenhouse gas laws, mandates, regulations, taxes, and penalties will also likely play a part in the calculation -- much to the detriment of the underlying economy.

Saturday, October 16, 2010

Algae Biomass is Prolific and Leaves Farmland for Growing Crops

The Earth is 70% covered by salt water. It makes sense to utilise the saltwater marine environment to produce feedstocks for fuels, chemicals, plastics, and other materials.
Biofuels from algae grown in seawater are the only fossil fuel alternative that doesn't compromise food and freshwater supplies, believes Yusuf Chisti. Algae are an increasingly popular potential feedstock for biofuels, but the Massey University, New Zealand, scientist says that currently used techniques won't provide fuel in the quantities needed. _IOP


Algae, says Mayfield, is going to be the next big agricultural crop. The only difference is algae grows on water, whereas traditional ag crops grow on land.

Today, researchers across the country are studying algae to produce fuel and feed and maybe even some day fiber, and Mayfield told me during an interview as part of a San Diego Algae Tour that what we’re looking for in algae is exactly what they worry about in ag.

There are four things that Mayfield and his team are focusing on in their algae research: growth rate, the product being made, crop protection and harvestability. For example, when his team is growing algae, they need it to grow fast, produce a high amount of lipids, be free of disease, and be harvested as cheaply as possible. _DF

Both micro-algae and macro-algae will be most useful for their biomass in the early stages of algal fuels and chemicals development. Using pyrolysis, gasification, fermentation, and catalytic synthesis, industrial chemists will be able to turn algal biomass into fuels, chemicals, plastics, and a wide range of other valuable materials which would otherwise be made from fossil fuel sources.

While fossil fuel sources are far more plentiful than generally acknowledged, biomass crops such as algal forms can be grown at will over most of the world's surface -- including the oceans. This ability to locate and scale one's feedstock source -- and to replenish it year after year -- is an advantage which has not yet been figured into the economic picture.

Nuclear Energy Carnival #23 at NEI Nuclear Notes

Here are a few excerpts from the 23rd Carnival of Nuclear Energy:

Brian Wang at Next Big Future noted how fast India is increasing its nuclear generation and exceeding its projected targets.


Charles Barton at Nuclear Green pointed out the number of cost savings that can be achieved from small and advanced reactors.

Steve Aplin at Canadian Energy Issues discussed another go at building new nuclear at Darlington in order to replace 6,000 MW of coal. Aplin pointed out how history has proved that when decision-makers take bogus ideas like Amory Lovins’ ‘negawatts’ seriously, the result is skyrocketing greenhouse gas emissions and a stagnant job market.

much more from NEI Nuclear Notes via Brian Wang

Meanwhile, Russia has agreed to build a nuclear power plant in Venezuela for Hugo Chavez.

Energy is the lifeblood of advanced civilisations. Any government which turns away from the development of all viable energy sources -- as the US Obama regime is doing -- is condemning its citizens to many years of hard labour by way of compensating for their government's short-sightedness.

Nuclear, coal, gas, oil sands, oil shale, offshore oil etc. -- all must be developed and utilised as needed. It is the nation whose energy supplies are there when demanded which will be able to weather political peak oil and the other desperate strains which are coming. Carbon hysteria must be jettisoned as the suicidal contrivance that it is.

Friday, October 15, 2010

New Diesel and Bio-Diesel Powered Fuel Cell

Brian Westenhaus has a story about a new solid-acid fuel cell capable of running on diesel and biodiesel fuels. The development comes from research taking place in Norway. If the new fuel cell proves out, it should be a big boost for electric vehicles -- allowing rapid re-fueling with current infrastructure. Future biodiesel infrastructures will easily fit into the scheme for both vehicular and stationary applications.
In trials, a 200-watt solid-acid fuel cell ran on both pure hydrogen and on hydrogen produced from diesel by the unit’s reformer – with only an insignificant difference in performance. The system is another handy way to solve the hydrogen production and storage issue as well as keep consumers’ access to abundant fuels used at very high efficiencies.

Diesel is a hydrocarbon, thus CO2 is an issue. The reformer section converts the hydrocarbons into hydrogen, CO2 and heat. Due to the unit’s high efficiency, CO2 emissions are substantially lower than in conventional combustion engines, and no other demonstrable exhaust is discharged – meaning that diesel particulates, black carbon soot, nitrogen oxides (NOx) and carbon monoxide (CO) are eliminated. An added plus is that the reformer emits no smoke or odor. And, it’s dead silent.

The silent electric generator is being developed and produced by the Norwegian company Nordic Power Systems (NPS). The new type of fuel cell is being developed and delivered by the California firm SAFCell. The development of solid-acid fuel cells (SAFC) was pioneered in the Haile Lab of the Materials Science Department at Caltech. Dr. Calum Chisholm, together with a team of experienced scientists, engineers, and business executives, founded SAFCell to bring the technology to the market in November of 2009. Things are moving very fast – it has not been a year yet and prototype field test units are already being built. _BrianWestenhaus
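As a rough illustration of what the reformer stage is doing, the sketch below works out the ideal hydrogen yield from steam reforming plus water-gas shift of n-dodecane, a common laboratory surrogate for diesel. The surrogate choice and the assumption of complete conversion are simplifications of mine, not details from the article:

# Idealized reforming stoichiometry for a diesel surrogate (n-dodecane, C12H26).
# Steam reforming:  C12H26 + 12 H2O -> 12 CO + 25 H2
# Water-gas shift:  12 CO  + 12 H2O -> 12 CO2 + 12 H2
# Overall:          C12H26 + 24 H2O -> 12 CO2 + 37 H2
carbon_atoms, hydrogen_atoms = 12, 26

h2_from_reforming = carbon_atoms + hydrogen_atoms // 2   # n + (2n+2)/2
h2_from_shift = carbon_atoms                             # one extra H2 per CO shifted
print(f"Ideal yield: {h2_from_reforming + h2_from_shift} mol H2 "
      f"and {carbon_atoms} mol CO2 per mol of dodecane")   # 37 H2, 12 CO2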

Update on waste-to-fuels enterprise by Terrabon

Thursday, October 14, 2010

Fuels, Chemicals, Plastics, and More from Biomass

Smart producers are planning for multiple revenue streams from their investments in biomass to biofuels. By integrating the production of high value chemicals, plastics, and other products into the mix, these companies can achieve profitability earlier, and hedge their revenue streams from slowdowns in individual sectors.
Avantium, which spun off from Shell in 2000 to develop furan-based biofuels and biomaterials, has begun construction of a pilot plant at the Chemelot site (Geleen, the Netherlands) to convert carbohydrates into furanic building blocks—which the company calls “YXY”—for making renewable materials and fuels. Furanics are heteroaromatic compounds derived from the chemical intermediate HMF (hydroxymethylfurfural, C6H6O3).

The pilot plant is expected to become operational in the first quarter of 2011 and marks a major milestone in the commercialization of Avantium’s technology. The plant will produce several tons of YXY building blocks per year to support product development.

...The Chemelot site in Geleen, the Netherlands offers services and a specialized chemical infrastructure to the industrial producers, among others DSM and Sabic.

YXY (pronounced “ixy”) is a patented technology that converts biomass into Furanics using Avantium’s catalytic technology. YXY can be implemented in existing chemical production methods. _GCC
A look at 2007 testing of furanic fuels by Avantium

A Nascent Energy Boom in Oil Shale

The US is sitting on more reserves of oil-equivalent than is likely to be found in the entire Persian Gulf. Under the Obama regime that is all the US will do -- sit on it. In a more rational post-Obama world, energy resources will be utilised. More from Alexander Smith:
Oil shale has been used as fuel and as a source of oil in small quantities for many years; however, few countries currently produce oil from oil shale on a significant commercial level - the US wants to change that.

Many countries do not have significant oil shale resources, but in the countries that do, the oil shale industry has been very slow to develop. This is because, until recently, the cost of oil derived from oil shale has been significantly higher than conventional pumped oil. The same principles apply in mining. Whenever something is deeper, or harder to get to, the costs go up. With that stated, new technologies and the quick disappearance of easy access oil has spurred a revival in this sector.

...The two most famous oil shale deposits in the United States, and the world for that matter, are the Green River Formation and the Bakken Oil Shale.

The Green River Formation consists of fine-grained sedimentary rocks which hold an exceptional amount of kerogen. Estimates have accounted for more than 800 billion barrels of recoverable oil in the Green River Formation! It is vast and spreads across Wyoming, Colorado and Utah. However, close to 75% of the deposit is located on federally owned land which will make any production from private or public companies extremely difficult. In addition, the production process at the Green River Formation has many holes in its game, and production here has yet to occur.

The Bakken Oil Shale is spread across Montana, North Dakota and Saskatchewan. The U.S. Geological Survey has estimated that up to 4.3 billion barrels of technically recoverable oil exists in the Bakken. Not the unbelievable 800 billion barrels of the Green River, but still nothing to shake a stick at. The Bakken is producing oil and gas and unlike the Green River Formation, it is primarily light sweet crude. Not all oil shales are overly expensive or complicated to produce.

...Major improvements in directional and horizontal drilling techniques, along with hydraulic fracturing methods, have revolutionized shale exploration and development in the United States.

...The depth of the shale resource is important in that there are economic and physical limits on horizontal drilling along with temperature constraints for the directional tools required to orient the well. So, the deeper true vertical depth (TVD) of the shale, the more problematic the well becomes.

Most operators are comfortable with a max true vertical depth of 14,000 ft and a max horizontal leg of 10,000 ft or less. A good average for shale wells currently drilled or drilling would be 9,000 ft total vertical depth and 4,000 ft laterals. So imagine a large pole being drilled 9,000 feet down with an arm reaching out 4,000 feet in any direction. You can imagine how this technique allows operators to penetrate huge areas below the surface from a single well bore and tap into resources that would otherwise remain uneconomic. The deepest TVD wells drilled are found in the Haynesville play and the longest lateral wells are currently being attempted in the Bakken play. _SeekingAlpha
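The quoted geometry gives a feel for how much area a single surface location can drain. Swinging a 4,000 ft lateral through a full circle (my simplification; real laterals are drilled along chosen azimuths) sweeps on the order of a thousand acres:

import math

# Area reachable from one well bore with the lateral reach quoted above.
lateral_reach_ft = 4_000
reachable_ft2 = math.pi * lateral_reach_ft ** 2
acres = reachable_ft2 / 43_560        # square feet per acre
print(f"~{acres:,.0f} acres reachable from a single surface location")   # ~1,150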

The Obama regime has adopted an intentional policy of energy starvation, applied to as many forms of energy as possible. The only forms of energy promoted by Obama are non-viable sources that are exorbitantly expensive -- such as wind and solar. The others -- coal, nuclear, gas, oil, oil shales, etc. -- have either been blocked or impeded by strangulating regulations, bureaucratic prohibitions and foot-dragging.

Energy starvation policies tend to kill economies, as we see in the US with the ongoing Obama depression. But sooner or later, Obama will be gone. And those who are closely associated with him will be largely discredited. At that point, the vast energy resources of North America will be tapped. Because the alternative is energy starvation. And you do not want to see that.

Saturday, October 09, 2010

22nd Carnival of Nuclear Energy at ANS Nuclear Cafe

ANS Nuclear Cafe is hosting the 22nd Carnival of Nuke Energy.  Here are a few excerpts:

At Next Big Future, Brian Wang has two compelling blog posts. First, he reports on how laser enrichment of uranium could be made more efficient. Brian also gave a talk at the TEDx conference in the Bay Area on October 5th on energy technologies. Brian is a futurist and his views are always thought provoking. Check out his slides from the talk, which are now online.

At the Nuclear Green Revolution, Charles Barton has a blog post titled “Reverse engineering the future of energy.”

This post calls attention to further problems in renewable energy plans, problems which appear to limit the ability of renewable energy sources to keep the grid stable. A renewable-dominated grid appears likely to rely on carbon emitting natural gas power generation facilities for peak power, and to respond to summer and winter temperature variations. While conventional nuclear power approaches do not appear to offer satisfactory solutions, molten salt nuclear approaches appear to offer attractive solutions to a number of post-carbon energy options.

At NEI Nuclear Notes, Everett Redmond, Director, Nonproliferation and Fuel Cycle Policy, NEI, has his first blog post on Yucca Mountain. He writes “What is certain in policy consideration is that we will be securely storing used fuel in above-ground facilities for an extended period of time.”

More: The US is to cooperate with France and Japan in the development of sodium-cooled fast reactors.
Such advanced reactors are necessary in order to make nuclear energy safer, cleaner, more sustainable, and less prone to weapons proliferation concerns.

Small modular reactors are shaping up to be the most logical near-term approach to rapid nuclear energy development.  If only the Obama regime's Nuclear Regulatory Commission were not dragging its feet.

Scientists: Realistic US Biofuels Potential is Massive

More than 80% of total agricultural production in the United States is used to feed animals, not human beings directly;
Our analysis shows that the US can produce very large amounts of biofuels, maintain domestic food supplies, continue our contribution to international food supplies, increase soil fertility, and significantly reduce GHGs. If so, then integrating biofuel production with animal feed production may also be a pathway available to many other countries. Resolving the apparent “food versus fuel” conflict seems to be more a matter of making the right choices rather than hard resource and technical constraints. If we so choose, we can quite readily adapt our agricultural system to produce food, animal feed, and sustainable biofuels.
—Dale et al
_GCC
Up until now, most people analysing US biofuels potential have failed to look at a realistic and integrated system of fuels and food. In real life -- unlike a typical computer model with excessively simplified and misleading assumptions -- new economies grow up to utilise by-products of new and existing processes and industries. When these new markets and economies are neglected by forecasters and modelers, their results become completely erroneous.
In their study, they analyzed only the 114 million ha of cropland used now to produce animal feed, corn ethanol, and exports. Cropland used for direct human consumption, forests, grassland pasture, and rangeland are not considered. Thus, they note, the analysis provides an example of what is technically feasible, not an upper limit on US biofuel production.

For the study, they considered two land-efficient animal feed technologies: ammonia fiber expansion (AFEX) pretreatment to produce highly digestible (by ruminants) cellulosic biomass and leaf protein concentrate (LPC) production.

During AFEX, concentrated ammonia is contacted with cellulosic biomass at moderate temperatures, resulting in greatly increased production of fermentable sugars by enzymatic hydrolysis. AFEX increases the digestibility of cellulosic biomass for ruminant animals while increasing protein production in the animal rumen due to the addition of ammonia-based byproducts.

Although extensive feed testing and commercial applications have not yet been introduced, AFEX-treated rice straw has been successfully included in dairy cattle diets, and tests with switchgrass and corn stover have shown increased cell wall digestibility when exposed to rumen microorganisms.


High-protein LPC products are generally produced by first pulping and then mechanically pressing fresh green plant matter. The resulting protein-rich juice is then coagulated and dried. The remaining fibrous material is depleted in protein, but is still suitable for animal feed or biofuel production.


Animal feeding operations can be adapted to these new feeds, thereby freeing land for biofuel production, according to the authors. They also considered aggressive double-cropping, thereby increasing the total biomass produced per ha. _GCC


Even the CO2 that is produced in fermentation reactions can be filtered and used in high-value operations -- such as oil well recovery, algae growth, food production, and a wide range of chemical processes.

Instead of seeing the CO2 as a net positive, third-rate scientists and analysts tend to foolishly and short-sightedly look at CO2 as a "dangerous pollutant" and a complete liability. This faulty perspective is most likely to be seen where politics unduly influences scientific funding and publishing.

Cross-posted to Al Fin

Thursday, October 07, 2010

Buying Heat from the Devil In The Age of Global Warming

Southern Methodists Locate Gate to Hell in West Virginia



Traditional folklore tells us about "gates to hell" which provide the devil ready access to the human world. Some scientists are wondering if these hellgates might be put to good use. A 2007 MIT study found that exploitable geothermal heat resources could provide a significant portion of future heat and power needs for North America. Now, scientists from the Geothermal Lab of Southern Methodist University have located an area in West Virginia with elevated crustal temperatures and heat flow. A detailed mapping of bottom hole temperatures from oil and gas wells has clarified the view of potential geothermal resources, as part of an ongoing effort to create an update of the Geothermal Map of North America (GMNA).
The GMNA was developed from roughly 3,600 heat flow and 12,000 BHT data measurements along with regional thermal conductivity models. However, large areas of the Central and Eastern United States contain few data points and have been under sampled in all previous national geothermal resource assessments. Since the previous GMNA data sets were completed, approximately 7,500 new data points have been analyzed and currently more are being processed for this project. The data was collected from oil, gas, water, and thermal gradient wells from New York, Pennsylvania, West Virginia, Ohio, Indiana, Illinois, Kentucky, Tennessee, and Michigan. As a result of the new heat flow determinations, estimates of heat content and MWe potential for Michigan, Pennsylvania and West Virginia are substantially increased. _SMU

More from Brian Westenhaus:
The high temperature zones beneath West Virginia revealed by the new mapping are concentrated in the eastern portion of the state. Starting at depths of 4.5 km (greater than 15,000 feet), temperatures reach over 150°C (300°F), which is hot enough for commercial geothermal power production.

Blackwell continues, “The early West Virginia research is very promising but we still need more information about local geological conditions to refine estimates of the magnitude, distribution, and commercial significance of their geothermal resource.”

Zachary Frone, an SMU graduate student researching the area said, “More detailed research on subsurface characteristics like depth, fluids, structure and rock properties will help determine the best methods for harnessing geothermal energy in West Virginia.” The next step in evaluating the resource will be to locate specific target sites for focused investigations to validate the information used to calculate the geothermal energy potential in this study.

Of added significance, the team’s work may also shed light on other similar geothermal resources. “We now know that two zones of Appalachian age structures are hot — West Virginia and a large zone covering the intersection of Texas, Arkansas, and Louisiana known as the Ouachita Mountain region,” said Blackwell. “Right now we don’t have the data to fill in the area in between,” Blackwell continued, “but it’s possible we could see similar results over an even larger area.” Let’s hope the research finds a large extent of fast-rising heat for geothermal production in the Eastern US. _NewEnergyandFuel
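The quoted depth and temperature imply an average geothermal gradient of roughly 30 °C per kilometer, on the high side for the eastern US. The surface temperature in the sketch below is an assumption of mine, not a figure from the article:

# Implied average geothermal gradient from the figures quoted above.
depth_km = 4.5
temperature_c = 150
surface_temperature_c = 15     # assumed mean annual surface temperature

gradient_c_per_km = (temperature_c - surface_temperature_c) / depth_km
print(f"Average gradient: ~{gradient_c_per_km:.0f} deg C per km")   # ~30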
Google provided a grant to the SMU Geothermal Lab to do the study. But who will give the devil his due, when humans start stealing heat from these nether regions? Will Google? Will the Southern Methodists? Stay tuned.

Wednesday, October 06, 2010

One Reason Why Obama's "Smart Grid" is Such a Stupid Idea

President Barack Obama’s talk about the need for a “smart grid” sounds, well, smart...As currently envisaged, however, it’s a dangerously dumb idea. _SciAm
President Obama tends to be attracted to ideas that sound good on the surface, but which are incredibly stupid and destructive at their core. The so-called "smart grid" is yet one more in a long line of such stupid and destructive ideas coming from the Obama White House.
The problem is cybersecurity. Achieving greater efficiency and control requires hooking almost every aspect of the electricity grid up to the Internet—from the smart meter that will go into each home to the power transmission lines themselves. Connecting what are now isolated systems to the Internet will make it possible to gain access to remote sites through the use of modems, wireless networks, and both private and public networks. And yet little is being done to make it all secure.

The grid is already more open to cyberattacks than it was just a few years ago. The federal government has catalogued tens of thousands of reported vulnerabilities in the 200,000-plus miles of high-voltage transmission lines, thousands of generation plants and millions of digital controls. Utilities and private power firms have failed to install patches in security software against malware threats. Information about vendors, user names and passwords has gone unsecured. Logon information is sometimes unencrypted. Some crucial systems allow unlimited entry attempts from outside.

As the power industry continues to invest in information technology, these vulnerabilities will only get worse. Smart meters with designated public IP addresses may be susceptible to denial of service attacks, in which the devices are overwhelmed with spurious requests—the same kind of attacks now made on Web sites. Such an attack could result in loss of communication between the utility and meters—and the subsequent denial of power to your home or business.

The smart grid would also provide hackers with a potential source of private information to steal. Just as they use phishing attacks to elicit passwords, credit-card numbers and other data stored on home computers, hackers could find ways of intercepting customer data from smart meters. A sophisticated burglar might use these data to figure out when you’re away on vacation, the better to rob your house.

Customer data could also give hackers a way to bring down the grid. Smart meters injected with malware, for instance, could disrupt the grid just as networks of PC botnets—home computers hijacked by viruses—now disrupt the Internet. A network of drone smart meters could cause a swath of the grid to power down, throwing off the grid’s electrical load. The imbalance would send large flows of electricity back to generators, severely damaging them or even blowing them up. _SciAm

The integrated electronics in the "smart grid" will also make the Obama Grid more prone to devastating damage from an EMP attack or a solar storm.

In uncertain economic times such as the current Obama Depression, it makes more sense to toughen the power grid against hacking and the various forms of catastrophic failure. Instead, it seems almost as if the Obama regime wants to make the US more vulnerable, rather than less.
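To make "toughening" concrete, here is a minimal sketch of one of the basic protections the SciAm piece says is often missing: capping logon attempts from a given source instead of allowing unlimited entries from outside. The attempt limit and lockout window are assumed policy values for illustration, not anything a real utility publishes.

```python
# Minimal sketch of a logon-attempt lockout for a grid control interface --
# one of the basic protections the article notes is often absent. The limit
# and lockout window below are assumed, illustrative policy values.
import time
from collections import defaultdict

MAX_FAILURES = 5         # failed attempts tolerated per window (assumed)
LOCKOUT_SECONDS = 900    # 15-minute lockout once the budget is spent (assumed)

_failures = defaultdict(list)   # source address -> timestamps of failed logons

def logon_allowed(source_addr, now=None):
    """Refuse further attempts from a source that has failed too often recently."""
    now = time.time() if now is None else now
    recent = [t for t in _failures[source_addr] if now - t < LOCKOUT_SECONDS]
    _failures[source_addr] = recent
    return len(recent) < MAX_FAILURES

def record_failed_logon(source_addr, now=None):
    """Call on every rejected credential so brute-force guessing locks itself out."""
    _failures[source_addr].append(time.time() if now is None else now)
```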

cross-posted to Al Fin

Labels: ,

Tuesday, October 05, 2010

Toughening Industrial Microbes to Create Chemical Powerhouses

In its recent series on the new biorefinery project, Biofuels Digest emphasized the need for biofuels makers to incorporate renewable chemicals manufacture early in the game -- for crucial early profit streams.
If we have learned anything from the stories of hot companies like Amyris, LS9, Gevo, Solazyme, ZeaChem, Algenol, or Cobalt Technologies, as well as exciting pure-plays like Segetis, Elevance, GlycosBio or Rivertop Renewables, it is the importance of producing chemicals or other bio-based materials first to generate revenues, before taking the company further down the cost curve and up in scale in order to make competitively-priced renewable fuels. _BiofuelsDigest
Microbes will be instrumental in the changeover from dependence on petroleum to widespread utilisation of sustainable bio-derived fuels. But before microbes become the prime producers of liquid fuels, they will be crowned king of chemicals. High-value chemicals bring in more profit than bulk commodity fuels, and the new industrial microbe ventures will rightly target them as cash cows -- on the way to large-scale biofuels production. Here is one challenge: making the microbes tough enough to survive the chemicals they are creating.
Microorganisms can sometimes produce chemicals and fuels as cheaply as conventional methods, while using sugar instead of petroleum. Technology developed by researchers at Argonne National Laboratory could help reduce the cost of production of chemicals or fuels made using microorganisms, and potentially increase the range of such materials. It does this, in part, by keeping the bugs alive for longer. The researchers recently announced that Nalco, a company based in Naperville, Illinois, will commercialize the technology.

The new tool uses an advanced form of electrodeionization (EDI), a technology used to make ultrapure water. The advanced EDI provides a better way to control the acidity of the solution in which an organism grows, and this helps optimize the microbe-driven production process. It also efficiently removes the chemicals that microorganisms make, allowing the process to run continuously. By creating a less harmful environment for growth, the tool could make it easier to engineer microorganisms for producing new chemicals.

...The Argonne researchers have developed what they call resin wafers, which combine multiple resins, binding agents, and additives to improve the conductivity and porosity of the material. The approach also keeps electricity costs low--just a few cents of electricity is needed to produce a pound of a chemical that sells for a dollar, says Seth Snyder, who is leading the project at Argonne.

The technology could be used first for cheaper fermentation. One of the biggest industrial applications of fermentation is the production of plastics precursors such as succinic acid, and food additives such as citric acid. But these acids need to be counteracted by adding large quantities of base materials, such as calcium hydroxide, which are then converted in the process to waste products such as gypsum.

...The Argonne technology also makes continuous production possible. In conventional fermentation, organisms are grown in a tank where they excrete a product until it reaches a concentration that is toxic to the organism. The contents of the fermentation tank are then poured out, and the process starts again. With the new EDI tool, the chemical product can be removed as it is made, so concentrations stay low. Microorganisms can continue producing chemicals for 10 times longer.

The new tool may also aid in engineering organisms for new applications. It makes it unnecessary to use acid-resistant organisms, or organisms that can tolerate high concentrations of the chemical they produce.

The Argonne researchers have demonstrated the production and purification of chemicals at a small pilot-scale plant. The partnership with Nalco will make production versions of the system available for companies and other researchers. _TechnologyReview
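The continuous-removal point is easy to see with a toy model. The sketch below compares a batch tank, where the acid product accumulates until it poisons the culture, with a tank whose product is continuously stripped out -- the role the EDI resin wafers play. The production rate, toxicity limit, and removal fraction are made-up illustrative numbers, not Argonne data.

```python
# Toy comparison of batch fermentation versus continuous product removal
# (the job the Argonne EDI resin wafers do). All rates and limits below are
# made-up illustrative values, not Argonne data.

PRODUCTION_RATE = 1.0    # g/L of product formed per hour (assumed)
TOXIC_LIMIT = 40.0       # broth concentration that shuts the organism down (assumed)

def hours_until_toxic(removal_fraction, max_hours=1000):
    """Hours the culture keeps producing before the toxic limit is reached."""
    concentration = 0.0
    for hour in range(1, max_hours + 1):
        concentration += PRODUCTION_RATE * (1.0 - removal_fraction)
        if concentration >= TOXIC_LIMIT:
            return hour
    return max_hours  # still below the limit when the simulation ends

print("Batch, no removal:        ", hours_until_toxic(0.0), "hours")
print("Continuous removal (95%): ", hours_until_toxic(0.95), "hours")
# With these toy numbers the removal case runs about 20x longer; the article
# reports roughly a 10x improvement in practice.
```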

Labels: ,

Monday, October 04, 2010

More on Methane Clathrates

Japan has the most ambitious plans for developing its methane clathrate resources. But other countries investigating the development of undersea methane hydrates include Canada, China, South Korea, and the US. And no wonder -- frozen methane clathrates may contain twice as much carbon as all known fossil fuels combined.

Faux environmentalists, carbon hysterics, and peak oil doomers are all agreed that any energy resource this promising should be left alone. But people who work solving real-world problems have a different attitude. Rather than obstructing crucial energy resources, problem-solvers want to develop a wide variety of abundant, clean, and versatile energy sources.
Methane trapped in marine sediments as a hydrate represents such an immense carbon reservoir that it must be considered a dominant factor in estimating unconventional energy resources; the role of methane as a 'greenhouse' gas also must be carefully assessed.
Dr. William Dillon,
U.S. Geological Survey
Hydrates store immense amounts of methane, with major implications for energy resources and climate, but the natural controls on hydrates and their impacts on the environment are very poorly understood.

Gas hydrates occur abundantly in nature, both in Arctic regions and in marine sediments. Gas hydrate is a crystalline solid consisting of gas molecules, usually methane, each surrounded by a cage of water molecules. It looks very much like water ice. Methane hydrate is stable in ocean floor sediments at water depths greater than 300 meters, and where it occurs, it is known to cement loose sediments in a surface layer several hundred meters thick.

The worldwide amount of carbon bound in gas hydrates is conservatively estimated to total twice the amount of carbon to be found in all known fossil fuels on Earth.

This estimate is made with minimal information from U.S. Geological Survey (USGS) and other studies. Extraction of methane from hydrates could provide an enormous energy and petroleum feedstock resource. Additionally, conventional gas resources appear to be trapped beneath methane hydrate layers in ocean sediments. _USGS
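A rough stoichiometric calculation shows why hydrates pack so much methane. The sketch below uses standard textbook approximations for structure I methane hydrate (roughly CH4·5.75H2O at about 0.9 g/cm³); these are assumptions for illustration, not values from the USGS excerpt.

```python
# Back-of-the-envelope estimate of how much methane gas (at surface
# conditions) is locked into one cubic metre of methane hydrate. Composition
# and density are standard textbook approximations, not USGS figures.

HYDRATE_DENSITY_KG_M3 = 900.0   # ~0.9 g/cm^3
HYDRATION_NUMBER = 5.75         # water molecules per methane in structure I hydrate
M_CH4 = 16.04                   # g/mol
M_H2O = 18.02                   # g/mol
MOLAR_VOLUME_STP_L = 22.4       # litres per mole of ideal gas at STP

hydrate_molar_mass = M_CH4 + HYDRATION_NUMBER * M_H2O           # g/mol
moles_per_m3 = HYDRATE_DENSITY_KG_M3 * 1000.0 / hydrate_molar_mass
gas_m3 = moles_per_m3 * MOLAR_VOLUME_STP_L / 1000.0             # one CH4 per formula unit

print(f"~{gas_m3:.0f} m^3 of methane per m^3 of hydrate")
# Prints roughly 168 -- in line with the commonly quoted ~160-to-1 expansion.
```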

Labels:

Biomass King Can Grow over 70% of Planet's Surface

The king of biomass is not switchgrass, miscanthus, or even micro-algae. The king of biomass is macro-algae -- seaweed.

A key trend from the Algae 2020 study finds most macro-algae projects prior to 2010 focused on ethanol. Since 2010, however, oil and petrochemical majors DuPont and Statoil have entered the field, expressing increased interest in extracting sugars from seaweed to create drop-in fuels, biochemicals, and other valuable co-products such as biobutanol. This follows a key trend of Shell and BP investing $12 and $8 billion respectively in sugar-based conglomerates in Brazil to produce bio-butanol, drop-in fuels, and bio-based chemical products.

Emerging Markets Online’s updated Algae 2020 study finds the surging investments in extracting sugars from seaweed follows an emerging microbial “sugar to biofuels” trend in the Americas in Brazil for ethanol, biobutanol, and advanced biofuels. In September 2010, Bunge and Chevron invested in US-based Solazyme to create renewable algae-based oils. In addition, LS9, Amyris, and Virent aim to use plant-based sugars to produce drop-in fuels,  bio-based diesel, biobutanol, biogasoline, biochemicals and bioplastics.

Will sea-based sugars from macro-algae provide a new feedstock for advanced biofuels, drop-in fuels, and biochemicals for these emerging sugar-based, infrastructure-compatible biofuels and chemicals platforms? Evidently, an increasing number of petrochemical majors, including DuPont and Statoil, believe harvesting sugars from seaweed is attractive and are investing in next-generation, sea-based macroalgae projects as a feedstock for advanced biofuels, drop-in fuels, biochemicals, and biopolymers.

_BiofuelsDigest


Emerging Trends in Macroalgae Investment

Each entry below lists the project and partners, the target products, and a brief description:

South Korea National Energy Ministry (Ethanol) -- Korea: $275 million USD project over 10 years to produce nearly 400 million gallons a year of ethanol by 2020, approximately 13% of South Korea's consumption. The project will create an offshore seaweed forest approximately 86,000 acres in size.

City of Venice JV with Port Authority and Electric Power Plant (Algae Biofuel for Electric Power) -- Italy: 200 million Euro project announced in March 2009 by the city of Venice to harvest algae (seaweed) and generate 40 MW of power from algae biofuel, supplying up to half of the city's power needs as well as port facilities and docked ships in the harbor. The project will also cultivate microalgae in closed photobioreactors to generate biomass for power generation.

Biomara / Scotland's Ministry of Energy (Algal Biofuels) -- Scotland: $8 million USD from Scotland's Energy Ministry, the EU's INTERREG IVA Programme, and the Crown Estate in April 2009 to investigate seaweed and microalgae strains for commercial-scale production.

Chilean Economic Development Corporation (CORFO) and Bio-Architecture Lab (BAL) (Ethanol) -- Chile: $7 million USD investment in 2010 in a seaweed-based bio-ethanol project led by US-based BAL in collaboration with Chilean oil company ENAP and the Universidad de Los. The project goal is to replace 5% of Chile's gasoline consumption with 165 million litres of ethanol.

Philippines National Government and Korean Institute for Industrial Technology (Ethanol and biofuels) -- Philippines: $5 million from the Philippines government to develop a 250-acre, seaweed-based ethanol plant and aquafarm cluster. The aquafarms will be in 4 locations and will utilize a South Korean ethanol extraction technology developed at the Korean Institute for Industrial Technology.

Statoil and Bio-Architecture Lab (BAL) (Ethanol and Co-Products: Lipids, Proteins, Iodine) -- Norway: starting in late 2010, Statoil will fund BAL's R&D and demonstration projects in Norway, with the goal of commercializing BAL's technology in Norway and in Europe. Under the partnership, BAL's process technology will convert seaweed from Statoil's aquafarming operations into ethanol and co-products.

DuPont / Bio-Architecture Lab (BAL) (Biobutanol, Sugars for Advanced and Drop-In Fuels) -- USA: $9 million from the US Advanced Research Projects Agency-Energy (ARPA-E), announced in Spring 2010, to fund a DuPont/BAL macroalgae project aimed at supplying biobutanol to be marketed by Butamax, the BP-DuPont JV.

Sources: Algae 2020 study Vol. 2, Biofuels Digest, Emerging Markets Online, industry journals
Fast-growing, frequently harvestable, and capable of thriving in salt water -- the most abundant form of water on the planet -- seaweed has the capacity to out-mass any other feedstock for biomass-to-fuels, biomass-to-electricity, and biomass-to-chemicals production.
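A quick check on the scale implied by the largest project in the table: dividing South Korea's 2020 ethanol target by the planned seaweed forest acreage gives the per-acre yield the plan assumes. The corn-ethanol benchmark in the sketch is a rough, commonly cited figure (on the order of 450 gallons per acre per year) and is an assumption, not a number from the table.

```python
# Implied per-acre ethanol yield for the South Korean seaweed project listed
# above. Project figures come from the table; the corn-ethanol benchmark is a
# rough, commonly cited value and is an assumption here.

SEAWEED_GALLONS_PER_YEAR = 400_000_000   # 2020 production target (from the table)
SEAWEED_ACRES = 86_000                   # planned offshore seaweed forest (from the table)
CORN_ETHANOL_GAL_PER_ACRE = 450          # rough benchmark (assumed)

seaweed_yield = SEAWEED_GALLONS_PER_YEAR / SEAWEED_ACRES
print(f"Implied seaweed ethanol yield: ~{seaweed_yield:,.0f} gallons/acre/year")
print(f"Roughly {seaweed_yield / CORN_ETHANOL_GAL_PER_ACRE:.0f}x a typical corn-ethanol acre")
```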

Labels: ,

Saturday, October 02, 2010

500 Billion Barrels of Oil Equivalent in Jordan's Oil Shale

JORDAN has over 0.59 trillion barrels of oil equivalent (boe) of oil-shale reserves, says one of the companies trying to develop the resource. It is an estimate that could revolutionise the country's economy – and slash its dependence on energy imports.

And, suddenly, the resource could be viable at around $65 a barrel, making it profitable in the range in which international oil prices have traded in the past two years. Estonia's state-run Eesti Energia says it can produce Jordan's oil-shale economically, and cleanly, within seven years. _PetroleumEconomist
As with Canada's vast oil sands resources, the enemy of oil shale development is low oil prices. As long as the price of oil stays in the vicinity of $80 a barrel, a worldwide bonanza of new oil and oil-equivalent resources will be unleashed onto world markets -- and OPEC will have little to say about it.

While Jordan's oil shale resources are tiny in comparison to North American oil shale and oil sands resources, there is a significant difference -- the government of Jordan is not handicapped by a philosophy of energy starvation, nor is it under the influence of a faux environmental desire for a great human dieoff.orgy. Jordan will use its resources, if it is economical to do so.
Dr. Maher Hijazin, the Director General of The Natural Resources Authority, added, "The oil shale deposits are strategically important to Jordan's national energy policy. Through this concession, we have put in place the framework for future development of the oil shale industry. We strive to balance the benefits from commercial exploration of Jordan's natural resources with the need to ensure that the projects are conducted in a sustainable and environmentally responsible manner." _Rigzone

Labels:

Nuke Carnival #21 at Next Big Future

Brian Wang is hosting the 21st edition of the Carnival of Nuclear Energy. Here is a quick look:

3. At the ANS Nuclear Cafe, guest contributor Ted Rockwell questions the 'special status' accorded to nuclear technologies in regulatory circles and in public perception. Requirements that don't make a nuclear power plant safer, or cheaper, or better in some way merely add to the cost and saddle the developer with a device or procedure that may bring problems of its own. Adding more and more "safety" requirements does not necessarily make a system safer.


Rockwell outlines the rewards and penalties associated with being special and concludes that there is wisdom in the advice, "Don't fix what ain't broke."

4. Atomic Insights reports that Michael Brune of the Sierra Club discusses actions that the club is taking to reduce the environmental impact of extracting and burning fossil fuels. The actions should make nuclear energy more competitive. If they are not against us, they are for us.

5. The Areva North America blog has an article arguing that expanding nuclear energy makes sense for Americans.


As we consider ways to meet our nation's energy demands and increase our energy security while reducing our CO2 emissions, building new nuclear power plants makes a lot of sense. Each new nuclear power plant that we build also will create thousands of jobs and spur billions of dollars of investment in local communities. In this economic environment, who wouldn't welcome new jobs and investment in their community?
_NukeCarnival21



Brian also looks at an interesting partnership between Toyota, Toshiba, and Hitachi to produce the world's first commercial molten salt reactor, using fertile thorium as breeder fuel.


NextBigFuture also keeps up with advances in fusion energy research.

Item #4 in the carnival excerpt above points to an irrationality in the argument for nuclear energy -- the compulsion of many nuke advocates to reflexively attack hydrocarbon fuels out of carbon hysteria. Such an approach may sway a few opinions toward nuclear power in the short term, but in the long term it is yet another form of suicidally self-destructive energy starvation. It is painfully obvious that nukes alone will not see us through this century without a devastating energy suffocation along the way. And clearly wind and solar are huge rat holes, sucking up maximal resources for less than minimal returns.

Get real, people.

Labels:

Friday, October 01, 2010

Additional Hydrocarbon Production by Adding Algal Biomass to Delayed Coker Feedstock

Oil refiners are straining to extract as much value as possible from petroleum -- particularly from sour heavy crudes. Foster Wheeler's delayed coking technology extracts even more hydrocarbon from the residue remaining after fractional and vacuum distillation of petroleum.

Now, Foster Wheeler reports that the addition of algal biomass to petroleum "vacuum residue" yields additional high value hydrocarbons:
Testing was conducted to demonstrate that the biomass is an effective add-in complement to vacuum residue coker feedstock, and does not significantly affect overall coker operations. The initial test results demonstrate that biomass, mixed with vacuum residue, yields additional valuable hydrocarbons as a result of biomass carbohydrate and lipid decomposition. Further testing and engineering development is underway to optimize process parameters and feedstock blend ratios.

"We are very pleased with the results from our initial testing of PetroAlgae's biomass that generates green fuels from a blend of biomass and petroleum vacuum residue," said Umberto della Sala, president and chief operating officer of Foster Wheeler AG. "These results could lead to a change in the way refineries look at biofuels in the future, as we believe this presents a commercially scalable source of biomass which produces a true 'drop in' feedstock which is compatible with the entire existing transportation fuel infrastructure."
_TradingMarkets_via_BiofuelsDigest

This obscure approach may represent a near-term means for the economical use of algal biomass in fuel and chemical production. We will have to learn more about the yields and economics involved, and whether government mandates or subsidies are required to achieve profitability. Those are questions which press releases often overlook.
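To illustrate the kind of bookkeeping those questions involve, here is a purely hypothetical sketch of liquid-product yield from a fixed vacuum-residue charge with algal biomass added on top. Both yield fractions are assumptions for illustration; Foster Wheeler has not published such numbers in the release quoted above.

```python
# Purely hypothetical sketch of the blend bookkeeping behind the press
# release: extra liquid product from adding algal biomass to a fixed vacuum
# residue coker charge. Both yield fractions are assumptions, not Foster
# Wheeler or PetroAlgae data.

RESID_LIQUID_YIELD = 0.60     # assumed liquid yield from vacuum residue alone
BIOMASS_LIQUID_YIELD = 0.35   # assumed liquid yield from the algal biomass add-in

def liquid_product(resid_tonnes, biomass_tonnes):
    """Tonnes of liquid product from a coker charge of residue plus biomass."""
    return (resid_tonnes * RESID_LIQUID_YIELD
            + biomass_tonnes * BIOMASS_LIQUID_YIELD)

base = liquid_product(100.0, 0.0)
blended = liquid_product(100.0, 10.0)
print(f"Residue only (100 t):        {base:.1f} t of liquid product")
print(f"Plus 10 t of algal biomass:  {blended:.1f} t (+{blended - base:.1f} t)")
# Whether the increment pays depends on biomass cost per tonne versus product
# value -- exactly the economics the post says remain to be demonstrated.
```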

Labels:
