Jan 10, 2014 Sponge
In an article titled “Feedstocks for Lignocellulosic Biofuels,” published in Science, Chris Somerville of the University of California, Berkeley, and Deputy Director Steve Long of the University of Illinois at Urbana-Champaign, with bioenergy analysts Caroline Taylor, Heather Youngs and Sarah Davis at the Energy Biosciences Institute, suggest that a diversity of plant species, adaptable to the climate and soil conditions of specific regions of the world, can be used to develop “agroecosystems” for fuel production that are compatible with contemporary environmental goals.
Press release and research notes aside, they mean that a set of plant species could provide substantial amounts of biomass, grown widely across the planet, without an impact on food and feed production. The troubled firm BP, well before the Gulf well crisis, funded the study.
The study authors discuss the sustainability of current and future crops that could be used to produce advanced biofuels with emerging technologies that use non-edible parts of plants. Such crops include perennial grasses like Miscanthus grown in the rain-fed areas of the U.S. Midwest, East and South; sugarcane in Brazil and other tropical regions, including the southeastern U.S.; Agave in semiarid regions such as Mexico and the U.S. Southwest; and woody biomass from various sources.
The team takes some license by making simplifying assumptions: that technology will become available for converting most of the structural polysaccharides that comprise the bodies of plants to sugars, that all the sugars can be used for fuel production, and that the process energy required for the conversion of the sugars to fuels will be obtained from combustion of the other components of the biomass, mostly the lignin. On that basis, in a sugar-to-ethanol bioconversion process using current technology, a metric ton (MT) of switchgrass or poplar, for example, would be expected to yield about 310 liters of ethanol.
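The ~310 liters per metric ton figure can be roughly reproduced with back-of-the-envelope stoichiometry. A minimal sketch, where the fractions and efficiencies are this writer's illustrative assumptions, not figures from the paper:

```python
# Back-of-the-envelope ethanol yield from one metric ton of lignocellulosic
# biomass. All fractions below are illustrative assumptions, not figures
# taken from the Science paper.
biomass_kg = 1000.0          # one metric ton of switchgrass or poplar
polysaccharide_frac = 0.60   # assumed structural polysaccharide content
hydrolysis_eff = 0.90        # assumed fraction converted to sugars
hydration_gain = 1.11        # mass gain: glucan units (162 g/mol) hydrolyze to glucose (180 g/mol)
etoh_per_sugar = 0.511       # theoretical g ethanol per g glucose (fermentation stoichiometry)
fermentation_eff = 0.90      # assumed practical fraction of theoretical yield
etoh_density = 0.789         # kg per liter of ethanol

sugars_kg = biomass_kg * polysaccharide_frac * hydrolysis_eff * hydration_gain
ethanol_kg = sugars_kg * etoh_per_sugar * fermentation_eff
ethanol_liters = ethanol_kg / etoh_density  # lands in the same ballpark as ~310 L/MT
```

With these round numbers the result comes out in the low-to-mid 300s of liters, consistent in magnitude with the authors' figure; the lignin, per their assumption, supplies the process heat.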
The authors’ case rests on comparative soil impacts. Maize (corn) plants harvested whole remove much more soil fertility than a perennial plant does. Perennial plants that use C4 photosynthesis, such as sugarcane, energy cane, elephant grass, switchgrass, and Miscanthus, have intrinsically high light, water, and nitrogen use efficiency compared with C3 species. Moreover, reduced tillage and perennial root systems add carbon to the soil and protect against erosion.
The team reports that tropical Napier grass in El Salvador and natural stands of Echinochloa polystachya on the Amazon floodplain can reach production of 88 and 100 MT/ha/year respectively, while temperate Miscanthus x giganteus produced in England at 52°N reached a peak biomass of 30 MT/ha/year and harvestable biomass of 20 MT/ha/year. (ha is hectare; one hectare is about 2.47 U.S. acres.) Miscanthus also offers an important soil protection effect. Seasonality leads to an annual cycle of senescence, in which perennial grasses such as Miscanthus mobilize mineral nutrients from the stem and leaves to the roots at the end of the growing season. Thus, harvest of biomass during the winter results in relatively low rates of removal of minerals.
That could account for the observation that stands grown at Rothamsted, UK showed no response to added nitrogen during a 14-year period during which all biomass was removed each year. In side-by-side trials in central Illinois, unfertilized M. x giganteus produced 60% more biomass than a well-fertilized highly productive maize crop, and across the state, winter-harvestable yields averaged 30 MT/ha/year.
The authors note that if Miscanthus were used as the only feedstock, less than half of the 14.2 Mha currently set aside for the U.S. Conservation Reserve Program (CRP) would be required to deliver the ethanol mandate of the Energy Independence and Security Act of 2007. That said, readers should know that a great chunk of the CRP land area consists of small headlands, terraces, protective filter strips along watercourses and the like. But there are vast amounts of highly erodible land that could better serve the economy this way than being used for corn or soybean production.
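The CRP claim is easy to sanity-check with the yield numbers quoted above. A rough calculation, assuming the mandate in question is the cellulosic portion of EISA 2007 (16 billion gallons by 2022) and the 30 MT/ha Illinois Miscanthus yield:

```python
# Sanity check: can less than half the CRP area deliver the cellulosic mandate?
# The 16-billion-gallon cellulosic target is this writer's reading of EISA 2007.
GAL_TO_L = 3.785
mandate_liters = 16e9 * GAL_TO_L       # cellulosic biofuel target, in liters
liters_per_mt = 310.0                  # ethanol yield per metric ton of biomass
yield_mt_per_ha = 30.0                 # winter-harvestable Miscanthus yield, Illinois
crp_area_ha = 14.2e6                   # U.S. Conservation Reserve Program area

ha_needed = mandate_liters / (liters_per_mt * yield_mt_per_ha)
crp_fraction = ha_needed / crp_area_ha   # comes out under one half
```

The arithmetic gives roughly 6.5 Mha, about 46% of the CRP area, which matches the authors' "less than half" statement.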
It’s worthwhile to note that while the authors seem to overlook some details, they turned up others. “The Global Potential of Bioenergy on Abandoned Agriculture Lands,” published in 2008, reveals that more than 600 Mha of land worldwide has fallen out of agricultural production, mostly in the last 100 years.
Most readers will know that for tropical production sugarcane isn’t beaten yet, and most likely won’t be. Harvested cane arrives with the sugar in liquid form ready for fermentation, and the plant remnants can be burned to power distillation with electricity left over for the grid. Many other regions of the world beyond Brazil are also well suited to sugarcane production, or formerly produced sugarcane on land that has been abandoned. Thus, “the total amount of fuel that may be produced from sugarcane worldwide could eventually be a very substantial proportion of global transportation fuels.” As the authors seem to be aware, the potential in sugarcane defies responsible calculation for now.
Approximately 18% of the earth’s surface is semi-arid and prone to drought. The authors suggest that various Agave species, which thrive under arid and semi-arid conditions with high water-use efficiency and drought resistance, hold a potential opportunity for production of biomass for fuels. These species use a type of photosynthesis called Crassulacean acid metabolism (CAM), which strongly reduces the amount of water transpired: they absorb CO2 during the cold desert night and then internally assimilate it into sugars through photosynthesis during the warmer days. By opening their stomata at night, they lose far less water than they would during the day. Much of the land that “The Global Potential of Bioenergy on Abandoned Agriculture Lands” identifies as having fallen out of agricultural production worldwide is semi-arid, and it appears that the amount of land that may be available for cultivation of Agave species is vast.
The research paper points out that about 89 to 107 Mha of land formerly in agriculture globally are now in forests and urban areas. The authors bravely note that the biomass harvested annually in the Northern Hemisphere for wood products has an energy content equivalent to approximately 107% of the liquid fuel consumption in the United States. Wood resources provide regionally specific opportunities for sustainably harvested biomass feedstocks. That explains the Chevron and Weyerhaeuser deal for biomass.
For this summary it’s important to note one more point the authors took the time to briefly discuss. Since it is inevitable that some mineral soil nutrients will be removed when biomass is harvested, it will be essential to recycle mineral nutrients, which are not consumed in the production of biofuels, from biomass-processing facilities back onto the land. That is virtually all of the minerals. It needs to be a built-in cost before soils are degraded further by any new biomass effort.
This writer’s summary leaves a lot out of the published study, including the references, the supporting documentation and the available links. Science offers the article with free registration, a small effort well worth it.
The authors did a good job here, but left a lot out. There are lots more plants to consider, but the local weather and soils are going to decide what farming can accomplish, and the profit from production will decide in the end. This writer’s main concern is that highly profitable biomass could displace prime food and feedstock land and force food and feedstock production onto less optimal soils. Some oversight, as oppressive as that is, is going to be needed to balance the demands with the conditions, something competition isn’t going to get done.
Original post here: New Energy and Fuel
Dec 31, 2013 Energy Talks
A new magnetic effect was discovered by accident when a UC Berkeley postdoctoral researcher and several students grew graphene on the surface of a platinum crystal. Graphene is a one-atom-thick sheet of carbon atoms arranged in a hexagonal pattern that looks like chicken wire. Examination showed that when grown on platinum, the carbon atoms do not perfectly line up with the metal surface’s triangular crystal structure, which creates a strain pattern in the graphene as if it were being pulled from three different directions.
Michael Crommie, professor of physics at UC Berkeley and a faculty researcher at Lawrence Berkeley National Laboratory, runs the lab where the discovery was made. Charles Kane and Eugene Mele of the University of Pennsylvania first predicted the appearance of a “pseudomagnetic” field in response to strain in graphene for carbon nanotubes in 1997. Nanotubes are a rolled-up form of graphene.
Crommie explains the strain produces small, raised triangular graphene bubbles 4 to 10 nanometers across in which the electrons occupy discrete energy levels rather than the broad, continuous range of energies allowed by the band structure of unstrained graphene. This new electronic behavior was detected spectroscopically by scanning tunneling microscopy. These so-called Landau levels are reminiscent of the quantized energy levels of electrons in the simple Bohr model of the atom.
Crommie said, “This gives us a new handle on how to control how electrons move in graphene, and thus to control graphene’s electronic properties, through strain. By controlling where the electrons bunch up and at what energy, you could cause them to move more easily or less easily through graphene, in effect, controlling their conductivity, optical or microwave properties. Control of electron movement is the most essential part of any electronic device.”
Inventive engineers take note – this opens a new field.
What happens is the electrons within each nanobubble segregate into quantized energy levels instead of occupying energy bands, as in unstrained graphene. The energy levels are identical to those that an electron would occupy if it were moving in circles in a very strong magnetic field, as high as 300 tesla, which is stronger than any laboratory can produce except in brief explosions, said Crommie. For comparison, a magnetic resonance imager uses magnets running at less than 10 tesla, while the Earth’s magnetic field at ground level is only 31 microtesla. The scale, while atom-sized in one dimension, is incredible.
Meanwhile over the last year Francisco Guinea of the Instituto de Ciencia de Materiales de Madrid in Spain, Mikhael Katsnelson of Radboud University of Nijmegen, the Netherlands, and A. K. Geim of the University of Manchester, England predicted what they termed a pseudo quantum Hall effect in strained graphene. This is the very quantization that Crommie’s research group has experimentally observed. Boston University physicist Antonio Castro Neto, who was visiting Crommie’s laboratory at the time of the discovery, immediately recognized the implications of the data, and subsequent experiments confirmed that it reflected the pseudo quantum Hall effect predicted earlier.
This is pretty cheerful stuff. Crommie observes, “Theorists often latch onto an idea and explore it theoretically even before the experiments are done, and sometimes they come up with predictions that seem a little crazy at first. What is so exciting now is that we have data that shows these ideas are not so crazy. The observation of these giant pseudomagnetic fields opens the door to room-temperature ‘straintronics,’ the idea of using mechanical deformations in graphene to engineer its behavior for different electronic device applications.”
The catch in all the excitement is that the nanobubble experiments in Crommie’s laboratory were performed at very low temperature. However, Crommie notes that the pseudomagnetic fields inside the nanobubbles are so high that the energy levels are separated by hundreds of millivolts, much higher than the thermal energy at room temperature. Thus, thermal noise would not interfere with this effect in graphene even at room temperature.
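The "hundreds of millivolts" claim follows from the Landau-level formula for graphene's massless Dirac electrons, E_n = sign(n)·v_F·sqrt(2·e·ħ·B·|n|). A quick check at the reported 300-tesla pseudofield, using the textbook graphene Fermi velocity (the specific numbers are standard constants, not values from the paper):

```python
import math

HBAR = 1.0546e-34     # reduced Planck constant, J*s
E_CHARGE = 1.602e-19  # elementary charge, C
K_B = 1.381e-23       # Boltzmann constant, J/K
V_F = 1.0e6           # m/s, typical Fermi velocity in graphene
B = 300.0             # tesla, pseudomagnetic field reported for the nanobubbles

# First Landau level above the Dirac point: E_1 = v_F * sqrt(2 * e * hbar * B)
e1_joules = V_F * math.sqrt(2.0 * E_CHARGE * HBAR * B)
e1_ev = e1_joules / E_CHARGE          # roughly 0.6 eV

# Thermal energy at room temperature (300 K), for comparison
kt_room_ev = K_B * 300.0 / E_CHARGE   # roughly 0.026 eV
level_to_noise = e1_ev / kt_room_ev   # the level spacing dwarfs room-temperature noise
```

The first level sits around 0.6 eV, and even the shrinking gaps between higher levels stay in the hundreds-of-millivolt range, which is why thermal smearing at 26 meV would not wash the effect out.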
Normally, electrons moving in a magnetic field circle around the field lines. Within the strained nanobubbles, the electrons move in circles in the plane of the graphene sheet, as if a strong magnetic field has been applied perpendicular to the sheet even when there is no actual magnetic field. Apparently, Crommie said, the pseudomagnetic field only affects moving electrons and not other properties of the electron, such as spin, that are affected by real magnetic fields.
There’s a lot of “pseudo” so far in the press release and the paper’s abstract at Science, but the research effort is measuring tesla-scale effects. That point focuses attention in a major way. Getting to 10 tesla requires lots of power; an effect thirty times as strong with no such power input is prey worthy of the best minds in science. Should the effect make it beyond microelectronics in scale to, say, motors, the impact would be huge.
The long term potential isn’t known in precise terms. There is a great deal of further exploration and experimentation to come. Yet the early theory ideas have borne fruit – by accident.
The serendipitous postdoc remains unnamed, but the paper’s author list includes Castro Neto and Francisco Guinea; Sarah Burke, now a professor at the University of British Columbia; Niv Levy, now a postdoctoral researcher at the National Institute of Standards and Technology; graduate student Kacey L. Meaker; undergraduate Melissa Panlasigui; and physics professor Alex Zettl of UC Berkeley. It’s a paper that might be worth the reading fee for the inventive engineer.
Here is the original: New Energy and Fuel
Dec 26, 2013 Sponge
It’s out there, or so they say in the cosmic physics community. There is a mysterious “dark energy” believed to constitute nearly three-fourths of the mass and energy of the Universe. That’s a lot of power.
Dark energy is the label scientists have given to what is causing the Universe to expand at an accelerating rate. The acceleration was discovered in 1998, but its cause remains unknown. Physicists have advanced competing theories to explain the acceleration with the necessary dark energy the current leader.
The numbers are comprehensible only in order-of-magnitude ranges, leaving estimations in pure supposition mode. It’s safe to say that once dark energy is nailed down, the amount, from the point of view of a species looking out from a small planet, will be ‘endless’.
There are major problems. As much as there should be out there, it’s going to be widely dispersed – and even if there are concentrations, these will be thin pickings. The energy will likely be quite far away, too. Yet, something is pulling the Universe apart, and figuring out what and how might have important implications for energy in general.
That makes news on the dark energy front interesting, maybe useful and certainly instructive.
Pioneering observations with the National Science Foundation’s giant Robert C. Byrd Green Bank Telescope (GBT) have given astronomers a new tool for mapping large cosmic structures. Physicists believe the best way to test the competing theories to explain the acceleration is to precisely measure large-scale cosmic structures.
Green Bank Telescope. Image Credit: The National Science Foundation.
The thinking is that sound waves in the matter-energy soup of the extremely early Universe left detectable imprints on the large-scale distribution of galaxies. The GBT researchers developed a way to measure such imprints by observing the radio emission of hydrogen gas. Their technique, called intensity mapping, when applied to greater areas of the Universe, could reveal how such large-scale structures have changed over the last few billion years, giving insight into which theory of dark energy is the most accurate.
Imagine you’re in space with a wee explosive charge where gravity is about uniform in all directions. Set the explosive off, and the instant the chemical energy is expended, maximum velocity is achieved. Assuming your explosive isn’t too vigorous, the exploded bits would be overcome by their own gravity and clump back together, if the gravity around isn’t too powerful. There’s your recurring big bang theory in ultra-simplified mode.
That’s not what’s happening out there.
There isn’t enough matter with gravity to slow the whole thing down.
Some kind of energy is pushing things ever faster and farther apart, some 14 or 15 billion years later.
Inquiring minds are intrigued. What is that energy? Can it be put to use? First it has to be found. Science is still in ‘clue’ mode on this one.
To get their results, the researchers used the GBT to study a region of sky that previously had been surveyed in detail in visible light by the Keck II telescope in Hawaii. This optical survey used spectroscopy to map the locations of thousands of galaxies in three dimensions. With the GBT, instead of looking for hydrogen gas in these individual, distant galaxies — a daunting challenge beyond the technical capabilities of current instruments — the team used their intensity-mapping technique to accumulate the radio waves emitted by the hydrogen gas in large volumes of space including many galaxies.
Jeffrey Peterson, of Carnegie Mellon University explained, “Since the early part of the 20th Century, astronomers have traced the expansion of the Universe by observing galaxies. Our new technique allows us to skip the galaxy-detection step and gather radio emissions from a thousand galaxies at a time, as well as all the dimly-glowing material between them.”
Tzu-Ching Chang, of the Academia Sinica in Taiwan and the University of Toronto comes to the point, “Our project mapped hydrogen gas to greater cosmic distances than ever before, and shows that the techniques we developed can be used to map huge volumes of the Universe in three dimensions and to test the competing theories of dark energy.”
The astronomers also developed new techniques that cleaned up both man-made radio interference and radio emission caused by more-nearby astronomical sources, leaving only the extremely faint radio waves coming from the very distant hydrogen gas. The result was a map of part of the “cosmic web” that correlated neatly with the structure shown by the earlier optical study. The team first proposed their intensity-mapping technique in 2008, and their GBT observations were the first test of the idea.
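The logic of intensity mapping plus cross-correlation can be illustrated with a toy model: bury a shared "large-scale structure" signal in two independently noisy maps, one standing in for the faint 21-cm radio data and one for the optical galaxy survey, and check that their cross-correlation recovers the common signal. The numbers here are purely illustrative, not the team's pipeline:

```python
import random

random.seed(42)
n = 10000

# A shared "cosmic structure" field, plus independent noise in each survey.
structure = [random.gauss(0, 1) for _ in range(n)]
radio_map = [s + random.gauss(0, 2.0) for s in structure]    # faint signal, heavy noise
optical_map = [s + random.gauss(0, 0.5) for s in structure]  # well-measured survey

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = correlation(radio_map, optical_map)  # clearly nonzero: shared structure shows through
```

Neither map looks like much on its own, but the correlation between them isolates the structure both surveys trace, which is essentially how the GBT signal was confirmed against the Keck II survey.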
Ue-Li Pen of the University of Toronto said, “These observations detected more hydrogen gas than all the previously-detected hydrogen in the Universe, and at distances ten times farther than any radio wave-emitting hydrogen seen before.”
Where does that get us? Closer by some clues. The theories that explain the Universe flying apart at ever faster speeds involve a great deal of power. Accelerating a baseball is one thing; moving up to a galaxy and then the whole Universe is something entirely different.
Just what the energy is will be the precious answer. Then the questions really get fascinating. Has anyone put a calculation to the power required to accelerate the Universe? It will be a big number, an energy source well worth understanding even if never collected.
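Some numbers can be put to the density, at least. From the standard Friedmann relation the critical density of the Universe is ρ_c = 3H²/(8πG), and taking roughly 73% of that as dark energy gives its energy density. A sketch with round textbook values (the 73% fraction and 70 km/s/Mpc Hubble constant are standard estimates, not figures from this press release):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
MPC_M = 3.086e22     # meters per megaparsec
H0 = 70e3 / MPC_M    # Hubble constant, 70 km/s/Mpc, converted to 1/s

# Critical density of the Universe from the Friedmann equation.
rho_crit = 3.0 * H0**2 / (8.0 * math.pi * G)    # ~9e-27 kg/m^3

# Dark energy at ~73% of the total, expressed as an energy density.
dark_energy_frac = 0.73
rho_de_joules = dark_energy_frac * rho_crit * C**2   # J per cubic meter
```

The result is on the order of 6×10⁻¹⁰ joules per cubic meter: a staggering total over the volume of the Universe, but vanishingly dilute locally, which is exactly the "thin pickings" problem noted above.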
Post written by: New Energy and Fuel
Dec 11, 2013 Sponge
Dr. Steve Larter holds the University of Calgary’s Canada Research Chair in Petroleum Geology and has more than 30 years’ research experience in petroleum geology and geochemistry in both academia and industry. Dr. Larter was named one of the Top 10 Geologists in the U.K. in 2003, and has received numerous awards for his scientific contributions, including the Friendship Medal of the People’s Republic of China. When he speaks, the smart people pay attention.
Dr. Larter was the keynote speaker June 17 for the 2010 Goldschmidt Conference hosted by the University of Tennessee, Knoxville, and Oak Ridge National Laboratory. In his presentation, “Can Studies of Petroleum Biodegradation Help Fossil Fuel Carbon Management,” Larter discussed microbes in the environment and their role in breaking down oil and generating natural gas.
This is with an eye to the feasibility of recovering hydrogen, instead of oil, directly from oilfields undergoing natural biodegradation processes. Larter is also examining the feasibility of using a related process, biologically assisted carbon capture and conversion of CO2 to methane or natural gas via H2 + CO2 methanogenesis in the hydrogen-rich environments of weathering subsurface ultrabasic rocks, as a route to recycle carbon dioxide in flue gases back to methane.
But the most interesting is the in-field conversion of oil to natural gas. If Larter can develop the idea into a working process, much of the oil in place, about 4 times the oil already pumped and used, could be available in the form of natural gas. It’s an astonishing concept.
Over two years ago Dr. Larter showed how crude oil in some oil deposits around the world, including Alberta’s oil sands, is naturally broken down by microbes in the reservoir. Larter is working on understanding how crude oil biodegrades into methane, or natural gas, opening the door to recovering the clean-burning methane directly from deeply buried, or in situ, oil sands deposits.
Currently a problem exists out of the media’s and public’s view: biodegradation of crude oil into heavy oil in petroleum reservoirs is a worldwide problem for the petroleum industry. The natural process is caused by bacteria that consume the oil, making it viscous, or thick, and contaminating it with pollutants such as sulfur. This makes recovering and refining heavy oil difficult and costly. People don’t realize they’re competing with microbes for the oil.
Using a combination of microbiological studies, laboratory experiments and oilfield case studies, the University of Calgary team demonstrated the anaerobic degradation of hydrocarbons to produce methane. The findings offer the potential of ‘feeding’ the microbes and rapidly accelerating the breaking down of the oil into methane.
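The net stoichiometry of methanogenic alkane degradation shows why so much of the oil's carbon can end up as methane. Taking hexadecane as a representative crude-oil alkane (the specific reaction is from the general biodegradation literature, not this press release), the widely cited net reaction is 4 C16H34 + 30 H2O → 49 CH4 + 15 CO2. A quick balance check:

```python
# Verify the net methanogenic degradation of hexadecane (C16H34):
#   4 C16H34 + 30 H2O -> 49 CH4 + 15 CO2
# Atom counts per side confirm the reaction balances, and the carbon split
# shows roughly three-quarters of the oil's carbon leaving as methane.
reactant_C = 4 * 16
reactant_H = 4 * 34 + 30 * 2
reactant_O = 30 * 1

product_C = 49 * 1 + 15 * 1
product_H = 49 * 4
product_O = 15 * 2

balanced = (reactant_C == product_C and reactant_H == product_H
            and reactant_O == product_O)
methane_carbon_frac = 49 / (49 + 15)   # fraction of the carbon that becomes CH4
```

About 77% of the alkane's carbon emerges as methane in this net reaction, which is what makes "feeding" the reservoir microbes such an attractive recovery route.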
Larter is now working on an approach of capturing carbon dioxide and pumping it and special bacteria underground into alkaline rock formations where the carbon dioxide, a greenhouse gas, will be converted into natural gas.
Larter says the petroleum industry already has expressed interest in trying to accelerate biodegradation in a reservoir.
The business end has already started, with Dr. Larter involved with Gushor, a Canadian consulting firm. Gushor is focusing on heavy oil recovery, fluid mobility, biodegradation, and carbon emissions management.
To date Larter’s findings indicate that feeding the oil reservoir microbes rapidly accelerates the breaking down of oil into natural gas. Larter says, “Instead of 10 million years, we want to do it in 10 years. We think it’s possible. We can do it in the laboratory. The question is: can we do it in a reservoir?”
The matter now is the sense of urgency. With ‘peak oil’ losing its public momentum, a great U.S. success in the Bakken formation of the Williston basin, a major oil well disaster at a huge discovery in the Gulf of Mexico, and a series of discovery successes over the past two years around the world, the recovery techniques Larter is proposing are getting pushed back into the less urgent category.
That might not be the best idea. Petroleum hydrocarbons will be needed for centuries, in declining amounts. Natural gas isn’t particularly good as a motor fuel, but would certainly be useful for light transport substitution. And for making heat, whether for a home or for producing steam, natural gas is a very desirable product.
The clean motive, less CO2, also has a friend in natural gas. The single carbon atom in methane (CH4), paired with four hydrogen atoms, makes for a lot of heat for a minimum of carbon reacting with oxygen. Methane also could have a big role in high-efficiency fuel cells.
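That hydrogen-rich ratio can be put in numbers as heat released per kilogram of CO2 emitted. A sketch using round lower-heating-value figures from the general literature (assumed here for illustration, and treating coal as pure carbon):

```python
# Heat delivered per kg of CO2 emitted, for three fuels.
# Lower heating values (MJ/kg) are round literature figures, not measurements.
def heat_per_kg_co2(lhv_mj_per_kg, kg_co2_per_kg_fuel):
    return lhv_mj_per_kg / kg_co2_per_kg_fuel

# CH4: burning 16 g of methane releases 44 g of CO2 -> 2.75 kg CO2 per kg fuel
methane = heat_per_kg_co2(50.0, 44.0 / 16.0)
# Gasoline approximated as octane C8H18: 8 * 44 g CO2 per 114 g fuel
gasoline = heat_per_kg_co2(44.0, 8 * 44.0 / 114.0)
# Coal idealized as pure carbon: 44 g CO2 per 12 g fuel
coal = heat_per_kg_co2(32.8, 44.0 / 12.0)
# Methane yields the most heat per unit of CO2 released.
```

Methane comes out near 18 MJ per kg of CO2, versus roughly 14 for gasoline and 9 for carbon-like coal, which is the quantitative version of the "clean motive" above.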
Larter’s work is getting noticed and consideration. The move to commercial interest is underway. It’s an idea well worth having in the world’s fuel production arsenal.
Here is the original post: New Energy and Fuel