A Bite of Reality for Climate Science

On June 4th, 2009, Roger Pielke Sr. wrote a short piece on how “climate science” papers, if there is such a reputable thing, are short-circuiting the scientific method, producing falsehoods and a dangerous trend in science that deserves attention from taxpayers, grantors and others interested in good science, properly done, factually accurate and useful for humankind.

Pielke points out, as others and I have in the past, that much if not all of the “climate science” is based on assumptions and built out using computer modeling.  No experimentation is done.  No testing, no verifiable conclusions, no facts.

But Pielke goes a little further: he’s calling the publishers of the “climate science” to account, asking that they adhere to the minimum standards of the scientific method.  With peer-review responsibilities on his résumé, Pielke has good reason to see the credibility problems that arise when peer-reviewed journals and the media that follow them rush to print sensationalism rather than science.

I repeat here again: a computer model is not a fact.  Reliance on computers, programs and the assumptions or data fed into them is, at best, speculation.  Pielke offers the six steps commonly used to describe the scientific method, as condensed by sciencebuddies.org:

  1. Ask a question
  2. Do background research
  3. Construct a hypothesis
  4. Test the hypothesis with experimentation
  5. Analyze the data for conclusions
  6. Communicate the results

But today, the peer-review publishers are short-circuiting the scientific method.  Having read a few of the papers, it’s much more like: pose a conclusion, construct a hypothesis, prove it with your computer and press-release your results.  It’s insulting to informed readers, to the scientific institutions providing the resources and to others researching properly.

What’s lost are accurate descriptions of how the real world functions.  When one has a hypothesis that can withstand testing, one has a fact, at least until a test comes along that unravels the theory.  That’s how humanity got out of the wild and into civilization.


The Oil Spill Disaster That Isn’t

The BP well blowout, fire, explosion and platform collapse, and the ensuing crude oil leak are without doubt the result of human failings.  Underestimating the quality of the reservoir is one reason; engineering choices, safety oversights, inadequate equipment, testing that didn’t work out in the real world and all the rest only show that human planning can come up short.

Now that it’s over, this writer can recoil from the anger felt as the catastrophe unfolded.  Yes, the well getting away is cause No. 1 – something that has happened before and will happen again – hopefully less and less often.  The lessons keep coming from drilling into the earth since Drake’s day: the pressures down there can surprise you.

But the sorrow of the lives lost was quickly overtaken by the sheer idiocy of the media and political response.  There has been essentially no worthy information making the mainstream press or incorporated into political activity.  The reverse is the fact – misinformation is rampant and the consequences, not counting the loss of life itself, are simply incredible.

The President’s behavior has been an utter failure – doing far more damage than the oil itself.  The offshore drilling ban is leaving 50,000 jobs without paychecks, topping $2 billion in payroll losses alone, not counting the effect throughout a local economy whose major economic engine, tourism, has disappeared.  The President’s action wasn’t just foolish, but cruelly focused on a few innocents, thoughtless and without any kind of leadership or sense of responsibility to the local area or the nation as a whole.  The reaction actually fed the media hysteria – a fault beyond forgiving in a leader.  No Gulf beach trips and mini-golf photo shoots will take away the realization that the President is out of his league.

In the meantime property values are going to be hit with incomes going down.  From Texas to Florida the tourism business is in shambles and may take years to recover.

There are many reports that no one is buying Gulf seafood, even in areas unaffected by the spill. Gulf Coast shrimpers and fishermen are in a tough spot: On the one hand, as more areas of the Gulf are declared safe, they presumably won’t be able to collect compensation from BP or the government and will have to get back to work; on the other hand, no one’s buying their catch. Given the public fear of toxins in food, this problem could last a long time.  But this writer is buying – Gulf seafood, if you can find it, hasn’t been so reasonably priced in decades.

For the future, perhaps the most important lesson is that the current administration can’t be trusted to act in the national interest.  Bans, moratoriums and other fear-based knee-jerk reactions have spoiled regulatory certainty, which will exact a huge cost from oil firms, their shareholders, management and employees, and in particular us consumers. Some insider reports suggest that oil assets in the Gulf are already being disposed of at fire-sale prices.  Fear leading fear, just what an economic recovery cannot stand.

The most damning realization is that the most liberal administration in American history is composed of people who lack the reflexive skepticism that intelligence and science demand when reading the mainstream media and the left-wing blogs. Spend some time following the reporting and blogging on Deepwater Horizon, and you come to realize that the administration’s behavior in the crisis likely wasn’t based on a cynical progressive master plan.  The administration was overwhelmed by sheer emotional panic about the magnitude of the potential disaster it faced as outlined by its most loyal supporters.  Embarrassing to thoughtful, knowledgeable citizens.

Here is why.  In what President Obama called the “worst environmental disaster America has ever faced,” the oil has pretty much already disappeared into the environment.  The disaster was a man-made, broad-based failure on the part of the media, the science establishment, and the federal bureaucracy. With the nation and its leaders looking for facts, information was replaced with a massive plume of apocalyptic disinformation and threats of losing a significant part of the coastline to the goo.

While the leaking oil was terrible in many respects, the magnitude was vastly overwrought.  In June a slick computer-modeled animated video showed a gigantic part of the spill making its way around the southern tip of Florida and up the East Coast. Oil covered everything from the Gulf to the Grand Banks.  The New York Daily News said, “BP Oil Slick Could Hit East Coast In Weeks: Government Scientists.”  CBS, MSNBC and many others followed on.  The video was a huge YouTube hit.  It was one of history’s most successful news frauds from the National Center for Atmospheric Research – paid for by taxpayers.  Then the National Oceanic and Atmospheric Administration (NOAA) disavowed the scenario.  Too late – who ever hears about the recantations when the media screws up?

Watson Technical Consulting of Savannah, Ga., a firm specializing in computer modeling of the effects of hurricanes, seismic events, geophysical hazards, and weapons of mass destruction, asserts the simulation was bogus from the very beginning because it ignored important conditions in the Gulf. Furthermore, says Chuck Watson, the media never took account of how diluted the oil would be once it got around Florida, through the Gulf Stream and finally out to the Atlantic: the bulk of the theoretically massive spill the video shows amounts to roughly a quart of oil per square mile. Watson claims flat-out that NOAA was “gold digging” for grants, as there’s probably more federal research money floating around the Gulf than there is oil. “There is a feeding frenzy with people trying to get funding for their specialty,” he said.  “Never let a disaster go to waste,” or some such cleverness from the administration – does that sound like people who can be trusted?
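Watson’s quart-per-square-mile point is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses hypothetical placeholder figures, not Watson’s actual inputs; the point is only to show how quickly a finite volume of oil thins out once it is smeared over open-ocean distances.

```python
# Back-of-the-envelope dilution check (hypothetical inputs, not Watson's actual figures).
QUARTS_PER_BARREL = 42 * 4              # 42 US gallons per barrel, 4 quarts per gallon

oil_reaching_atlantic_barrels = 50_000  # assumed: small fraction of the spill carried out by the Gulf Stream
spread_area_square_miles = 8_000_000    # assumed: slice of the western Atlantic shown in the animation

quarts_per_square_mile = (oil_reaching_atlantic_barrels * QUARTS_PER_BARREL
                          / spread_area_square_miles)
print(f"~{quarts_per_square_mile:.1f} quart(s) of oil per square mile")  # ~1.1 with these inputs
```

With any inputs in that general neighborhood, the "oil covering the East Coast" picture works out to traces spread very thin, which is the gist of Watson's complaint.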

The nail in the coffin for this writer was the “giant plumes” of oil.  Here the lying got very creative and flunked high school general science class.  Halfway into May, coming up with oil on the surface was getting problematic, so some marine researchers were drafted to provide the answer.  Water tests were showing small quantities of oil under the water’s surface from wave action, but how much no one could say, nor, obviously, was there any peer-reviewed literature against which to check the known facts.

Media reports implied and even tried to assert that “enormous oil plumes” were waiting, like nuclear submarines, to rise and attack unsuspecting beaches and wetlands. The New York Times summed up the media consensus on May 15: “Scientists are finding enormous oil plumes in the deep waters of the Gulf of Mexico, including one as large as 10 miles long, 3 miles wide, and 300 feet thick in spots. The discovery is fresh evidence that the leak from the broken undersea well could be substantially worse than estimates that the government and BP have given.” The article quoted Samantha Joye, a marine-sciences professor at the University of Georgia, as saying that this oil was mixed with water in the consistency of “thin salad dressing.”  Except there weren’t any plumes at all, let alone any ‘salad dressing’ type stuff.

By the end of May NOAA, where some grownups still have responsibility, released a study finding weak concentrations of oil in the area surrounding the Deepwater Horizon site – only 0.5 parts per million at the maximum. The median was a little over 0.2 parts per million.
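To put those concentrations in everyday units, a rough conversion is sketched below; it assumes the figures are mass-based parts per million and takes seawater density at about 1,025 kg per cubic meter.

```python
# Rough sense check: what does oil at a fraction of a part per million look like?
SEAWATER_DENSITY_KG_PER_M3 = 1025       # typical near-surface seawater

def oil_grams_per_cubic_meter(ppm_by_mass: float) -> float:
    """Grams of oil in one cubic meter (roughly one ton) of seawater at the given mass fraction."""
    return ppm_by_mass / 1_000_000 * SEAWATER_DENSITY_KG_PER_M3 * 1000  # kg -> grams

print(oil_grams_per_cubic_meter(0.5))   # ~0.5 g per cubic meter, roughly a dozen drops in a ton of water
print(oil_grams_per_cubic_meter(0.2))   # ~0.2 g per cubic meter, a few drops
```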

Again, as with the “giant” spill that threatened the East Coast, that’s barely above the threshold of detection.  By late July and early August, BP, the Federal Government, and some independent researchers were saying they couldn’t find any plumes at all. “We’re finding hydrocarbons around the well, but as we move away from the well, they move to almost background traces in the water column,” said Admiral Thad Allen, the administration’s point man on the spill. By then some 75 percent of the oil released was gone – and that’s based on new estimates that put the spill rate at the high end of earlier projections.

The giant-plume threat was greatly overstated by scientists and further blown out of proportion by the media. This writer believes those ‘scientists’ are not scientists at all.  As everyone who passed high school general science knows, oil is lighter than water and rises above it in all known situations on this planet. The idea of underwater plumes defies everything that we know about the physical laws on earth.  It’s been a great source of irritation and anger for weeks.  It’s a very good thing the notion is so incredibly dumb that it’s funny – but to watch people report it is to see a stunning display of ignorance.  Are there no fact checkers left in the mass media?

The Gulf of Mexico and some of the coast of California are warm ecological systems where oil seeps are part of the food chain.  The leak was a bonanza for oil-eating bacteria, and that bacterial bonanza will work its way up the food chain with its abundance.  While the leak was perhaps a four-fold increase in the annual oil supply to the Gulf, the natural ecosystem adjusted quite well, and as seen decades ago in the Mexican leak, it’s a very short-term matter. Truly it’s a disaster not to be left unused – by bacteria.

Dispersants turn thick, ugly slicks into widely distributed droplets, minimizing damage to beaches and sensitive wetlands.  When slicks are broken up the light oil fractions evaporate, and the bacteria more easily eat the heavier parts.  Corexit is thought to be the major dispersant used in the treatment – something you shouldn’t spray directly on coral, marshlands or other living things, as it’s a detergent-like chemical.  Corexit has made lots of disinformation news too, even being the subject of a Congressional hearing.  But the EPA, which recently started proceedings to treat milk spills as hazardous-material events, has approved Corexit for supervised use.  In a reality check, the point of using dispersants is to break up oil before it gets to shore, piles up and gets out of the water – where the oil breakdown slows down and things get quite messy for wildlife and the flora.  It’s a very good thing the EPA kept its act together and the dispersants flowing – an issue of debate that did have some suspense.

Finally, this writer has a question for everyone – where is the link to a reputable Gulf shrimp supplier?  I’d like a five-gallon bucket full, packed in dry ice for a three-day UPS Ground trip. A shrimp feast might make the anger recede a little more.

In closing, people lost their lives; condolences are due their families and are herewith given, heartfelt.  Jobs are lost, suffering and troubles are mounting, so this writer is speaking out for you and will be your customer again.

The disaster isn’t about oil anymore; it’s about the impact of media and politics – something that should and could be fixed in just a few words by just one man.  Do you think it will happen?

Here is the original: New Energy and Fuel

Impossible But Done

This writer thinks, along with others, that the “laws” of science and other notions have useful purposes when they work to our advantage.  But the laws need challenging now and, one must suppose, forever.  So when the impossible happens, a law is flouted, or whatever gets crashed, there’s cause for some celebration – not for the breaking, but for the new frontier.

Back on the July Fourth weekend, Rick Cavallaro and the crew at fasterthanthewind.org proved a wind-powered vehicle can travel directly downwind faster than the wind itself.  Naysayers said it couldn’t be done, but the anarchist in this writer can’t help but spread the word.  It’s official – impossible but done.  The North American Land Sailing Association made it official on July 27th, 2010, when it ratified the results.  And at better than 2.8 to 1 as well.

The achievement means physics texts, record books, and a pile of assumptions all have to be rewritten and reevaluated.  A new frontier, indeed.

Richard Jenkins wrote in part, “My heart is split between belittling idiots, and saluting eccentrics, and this downwind quest lay somewhere in the middle. These loonies were pursuing a pointless goal, doomed to failure, but there was some genuine merit in the myth and their enthusiasm . . . Traveling through zero apparent wind, with no stored power? Impossible. Why would you even attempt it?

A few months later I actually met the idiots in question and, to my surprise and concern we not only have a few mutual friends, but they seemed to be rather technically credible. But, everyone makes mistakes, and I let them off as decent people with a blinkered view of fundamentally flawed engineering . . .  A few months later they were claiming success!

There was, however, a growing momentum of technical people (who should have known better), saying that these idiots have actually proven that it is possible to travel faster than the wind going directly down wind.”

Jenkins shot the video.

The backing for the record attempts isn’t full of dopes either. The list includes JobyEnergy, Google, MetOne Instruments, and SportVision.  Some eccentric press picked it up, including Wired Magazine, Popular Science, Discover, Sail Magazine, Discovery Channel, and Thin Air Designs.  Let the skeptics rest.

Cavallaro and his crew designed an innovative ultra-lightweight, aerodynamically sound cart with a 17-foot propeller that’s driven by the vehicle’s wheels. The wheels turn the prop, while the prop turns the wheels – possible thanks to an incredibly heavy-duty transmission – with the wind acting as an external power source that propels the cart faster than the wind itself.

The team set out to prove such a feat was possible and now that they’ve set a record they’ve fixed their sights on breaking it. Cavallaro hopes to reach three times the speed of the wind within a few weeks.

The counterintuitive idea that you can travel downwind faster than the wind is casus belli for aerodynamic arguments from Internet forums to college classrooms. The concept DWFTTW (Down Wind Faster Than The Wind) can cause world-renowned physicists to throw their Nobel Prizes in fits of rage.

Cavallaro explains, “If you’re on a bike and you’re going downwind, you don’t feel any wind anymore at all. You lose the power of the wind when you reach the wind speed, because there is no relative wind at that point.”   Working with a hang-gliding buddy, Cavallaro did the math and built a model to prove DWFTTW is possible.  The equations didn’t persuade anyone, “I thought people would say, ‘That’s cool,’ but they didn’t. They said, ‘Wow, you’re an idiot.’ So we decided to build a full-size one. That’s when we approached a couple of sponsors.”

Cavallaro lined up help from Google and JobyEnergy and set to work with the San Jose State University aero department on an ultralight, four-wheeled vehicle with a 17-foot-tall propeller. The vehicle is made mostly of foam and parallels the aerodynamics of a Formula 1 racecar.  The propeller is key to how it is possible to travel downwind faster than the wind. It’s also the source of the biggest misunderstandings about how the vehicle works.

Cavallaro goes on, “Skeptics think that the wind is turning the prop, and the car is turning the wheels, and that’s what makes the car go. That’s not the case. The wheels are turning the prop. What happens is the prop thrust pushes the vehicle.”

“It sounds like a perpetual motion machine – the wheels turn the prop, which turns the vehicle’s wheels, which turn the prop, which turns the vehicle’s wheels – but you’ve got the wind as an external power source,” Cavallaro said.
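A simplified steady-state energy balance shows why this isn’t perpetual motion. The sketch below is illustrative only: it ignores rolling resistance and body drag and lumps all propeller and transmission losses into a single assumed efficiency, so it is not a model of Cavallaro’s actual cart.

```python
# Directly-downwind-faster-than-the-wind (DWFTTW), steady-state energy balance.
# Simplifying assumptions (not the team's figures): no rolling resistance or body drag,
# all propeller and transmission losses lumped into one efficiency 'eta'.

def net_forward_force(wheel_brake_force: float, cart_speed: float,
                      wind_speed: float, eta: float) -> float:
    """Net force on the cart: propeller thrust minus the retarding force at the wheels.

    Wheels extract power against the ground:           P_in = F_wheel * v
    Prop turns that power into thrust T by pushing
    air that moves at (v - w) relative to the cart:    T * (v - w) = eta * P_in
    """
    power_in = wheel_brake_force * cart_speed
    thrust = eta * power_in / (cart_speed - wind_speed)
    return thrust - wheel_brake_force

wind = 10.0  # mph, arbitrary
# At 2.8x the wind speed with an assumed 70% overall efficiency, net force is still positive:
print(net_forward_force(wheel_brake_force=100.0, cart_speed=2.8 * wind,
                        wind_speed=wind, eta=0.70))   # > 0, so the cart keeps accelerating

# Break-even: thrust exceeds wheel drag only while v/w < 1 / (1 - eta).
print(1 / (1 - 0.70))   # ~3.3x the wind speed for this assumed efficiency
```

In other words, the wind really is the external power source: the gap between the cart’s speed over the ground and its speed through the air is what the wheels-to-prop path exploits, up to the limit set by losses.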

Building a transmission capable of transferring power from the wheels to the prop was almost as hard as convincing skeptics that the vehicle would work. It took longer than a year and a lot of trial and error to make it work. “You’ve got to come up with a transmission that can handle those loads, even though it’s not at a high horsepower,” Cavallaro said. “You break some things, and then you build bigger.”

Sometimes it’s the laws that break.  The anarchist in this writer is pleased; other laws can fall, too.  In many fields, from BlackLight and cold fusion in physics on to chemistry and biology, there’s a wealth of laws that need to be sent back to being ideas, with new frontiers in their place.


Original post created by: New Energy and Fuel

A New Look At the Carbon Cycle

Two new studies with international participation might change the way scientists view the crucial relationship between Earth’s climate and the carbon cycle. The reports explore the global photosynthesis and respiration rates — the planet’s deep “breaths” of carbon dioxide, in and out.  The researchers say that the new findings will be used to update and improve upon traditional models that couple together climate and carbon.

Well . . .  Let’s look at the press release info. The two reports were published online by the journal Science at the Science Express Web site on July 5th. Science is published by the American Association for the Advancement of Science (AAAS), the nonprofit science society.  Still, when it comes to global warming, the AAAS isn’t leading the ‘get it right’ crowd at all.  Anyway . . .

Led by Christian Beer from the Max Planck Institute for Biogeochemistry in Jena, Germany, along with colleagues from 10 other countries around the world, the first study looks at Earth’s Gross Primary Production (GPP), which represents the total amount of carbon that terrestrial plants take in through photosynthesis each year. With a novel combination of observations and modeling, they estimate the total amount of carbon that the world’s plant life inhales annually at 123 billion tons.

The other group, led by Miguel Mahecha, also from the Max Planck Institute for Biogeochemistry, with its own international team of researchers, believes it has settled a long-standing debate over the effects of short-term variations in air temperature on ecosystem respiration, or the Earth’s exhalation of carbon dioxide back into the atmosphere. They show that the sensitivity of ecosystem respiration to short-term variations in temperature is similar around the world. The researchers also suggest that factors other than temperature, such as the slow, ongoing transformations of carbon in the soil and water availability, appear to play crucial roles in long-term ecosystem carbon balances.

Mahecha’s group looks to be working on the actual carbon cycle loop and the variables that affect the rates at which vegetation operates.  Beer’s group is going for the big numbers; novel or not, the field is rife with opportunities for assumptions, omissions and errors.  The math may be impeccable, but the assumptions are going to be suspect from the start.  Maybe not due to the team, but the Climategate fraud casts a very long shadow.

The press release groups the two studies in the hope of selling the idea that the findings shed more light on the global cycle of carbon into and out of the atmosphere and how those processes are coupled with Earth’s ever-changing climate. The researchers analyzed vast amounts of climate and carbon data from around the world, and they say their results should help to improve the validity of predictive models and help resolve how climate change might affect the carbon cycle — and our world — in the future.

Beer sensibly says, “An understanding of the factors that control the GPP of various terrestrial ecosystems is important because we humans make use of many ecosystem services, such as wood, fiber, and food.  Additionally, such an understanding is important in the context of climate change as a consequence of carbon dioxide emissions from burning fossil fuels because vegetation greatly modulates the land-atmosphere exchanges of greenhouse gases, water, and carbon dioxide…”  All very well reasoned and reasonable.

Beer and his colleagues pooled large amounts of data from FLUXNET, an international initiative established more than 10 years ago to monitor exchanges of carbon dioxide between Earth’s ecosystems and the atmosphere, with remote sensing and climate data from around the world to calculate the spatial distribution of mean annual GPP between 1998 and 2006.

The Beer led researchers highlight the fact that uptake of carbon dioxide is most pronounced in the planet’s tropical forests, which are responsible for a full 34 percent of the inhalation of carbon dioxide from the atmosphere. Savannas then account for 26 percent of the global uptake, although the researchers note that savannas also occupy about twice as much surface area as tropical forests.

Well, NO.  Looking back up at the beginning, the report is only looking at terrestrial activity; everything the oceans are doing is being ignored.  Someone needs to have a talk with the press office.
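That objection aside, the shares quoted for the land biomes can be put into tonnage against the 123-billion-ton annual total; the short sketch below is just arithmetic on the press-release figures, not numbers taken from the paper itself.

```python
# Rough tonnage by biome from the press-release percentages (illustrative arithmetic only).
GLOBAL_TERRESTRIAL_GPP_BILLION_TONS = 123     # annual terrestrial uptake quoted above

shares = {"tropical forests": 0.34, "savannas": 0.26}
for biome, share in shares.items():
    print(f"{biome}: ~{GLOBAL_TERRESTRIAL_GPP_BILLION_TONS * share:.0f} billion tons per year")
# tropical forests: ~42, savannas: ~32; the remaining ~40 percent is spread over all other biomes
```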

The Beer led researchers found rainfall also plays a significant role in determining the gross global carbon dioxide uptake. They suggest that rainfall has a significant influence on the amount of carbon that plants utilize for photosynthesis on more than 40 percent of vegetated lands, a discovery that stresses the importance of water availability for food security. According to the study, climate models often show great variation, and some of them overestimate the influence of rainfall on global carbon dioxide uptake.

Biomass Growth Chart by Temperature and Precipitation.

No one is dialing in the atmospheric humidity, which has a huge impact, too.

But Beer sums up with: “We reached a milestone with this paper by using plenty of data from FLUXNET in addition to remote sensing and climate reanalysis. With our estimation of global GPP, we can do two things — compare our results with Earth system process models and further analyze the correlation between GPP and climate.”

It sounds good but doesn’t look solid – still a theory looking for statistics, and the stats aren’t good enough, by far.

Mahecha and his team of researchers also relied on the global collaboration within the FLUXNET network during their investigation of ecosystems’ sensitivity to air temperature. Compiling and analyzing data from 60 different FLUXNET sites, the researchers found that the respiratory sensitivity to temperature of the world’s ecosystems, commonly referred to as Q10, is actually quite set in stone — and that the Q10 value is independent of the average local temperature and of the specific ecosystem conditions.  This group looks much more sensible – the conclusion matches standard agronomy know-how.  Farmers know this and count up the heat units they get to predict yields.  Decades of experience make the process a certainty.

Which makes one wonder, since the press release asserts experts have long debated the effect that air temperature has on global respiration, or the collective metabolic processes of organisms that return carbon dioxide to the atmosphere from Earth’s surface. Most empirical studies suggest that such ecosystem respiration around the world is highly sensitive to increasing temperatures, while the majority of predictive models suggest otherwise. Scientists say that global air temperatures may rise due to the presence of heat-trapping carbon dioxide from the burning of fossil fuels.

But, this new result suggests that the temperature sensitivity of the natural exhalation of carbon dioxide from ecosystems has been overestimated and should be reevaluated.

Mahecha and his team considered the processes of the 60 different ecosystems on the exact same time-scale in order to nail the global mean Q10 down to a value of 1.4. Their new, standard value for various ecosystems’ sensitivity to air temperature suggests a less pronounced short-term climate-carbon feedback compared to previous estimates.  The study might settle a controversy – suggesting that previous field studies failed to disentangle processes acting on different time-scales.
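For readers unfamiliar with the term, Q10 is the standard factor by which a rate increases for every 10 °C rise in temperature. The short sketch below simply applies that textbook definition, comparing the converged value of 1.4 with a commonly assumed default of about 2.0; the 2.0 figure is used here only as an illustrative contrast, not a number from the paper.

```python
# Q10 temperature sensitivity: R(T) = R_ref * Q10 ** ((T - T_ref) / 10)
def respiration_scale(q10: float, delta_t_celsius: float) -> float:
    """Factor by which respiration changes for a temperature swing of delta_t degrees C."""
    return q10 ** (delta_t_celsius / 10)

# Short-term response to a 5 degree C warm spell:
print(respiration_scale(1.4, 5.0))   # ~1.18x with the converged global Q10 of 1.4
print(respiration_scale(2.0, 5.0))   # ~1.41x with an assumed higher sensitivity of 2.0
```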

Mahecha says, “Our key finding is that the short-term temperature sensitivity of ecosystem respiration to air temperature is converging to a single, global value.  Contrary to previous studies, we show that the sensitivity of ecosystem respiration to temperature variations seems to be independent from external factors and constant across ecosystems. In other words, we found a general relationship between variation in temperature and ecosystem respiration… Our findings reconcile the apparent contradictions of modeling and field studies.”  Thank you, sirs.

It’s reported that the two studies can allow for more precise predictions of how Earth’s warming climate will affect the exchange of carbon between our ecosystems and the atmosphere — and vice versa. They provide scientists with important tools for better understanding the world’s ecosystems and how the human race continues to influence and alter them.

But there isn’t much new in the work.  Crop scientists, agronomists, farmers, gardeners and herbalists have all known for generations that warm humid rainy weather is better for biomass growth than cool dry droughts.  Actually everyone knows that.

Mahecha’s group may well straighten some things out.  Beer’s group props up some more bizarre modeling.  Maybe this is progress.  But it challenges the practical, food-consuming critters on earth – we’re grateful for every bit of atmospheric carbon dioxide, and cutting CO2 back sometimes sounds like planetary genocide.


Source: New Energy and Fuel