Each week dozens of prognosticators toss out their predictions of the weekly gas storage change, the most significant piece of supply and demand information available in the gas market today. It may come as a surprise, then, that the predictions of 34 top firms (five energy companies, 28 financial houses and one energy consulting firm) have been off by an average of about 13.2 Bcf/week since the Energy Information Administration (EIA) took over the survey from the American Gas Association in May 2002.

A draft paper circulated earlier this month by Gerald D. Gay of Georgia State University and Betty J. Simkins and Marian Turac, both of Oklahoma State University, provided these and other results after taking a closer look at the accuracy of the weekly storage predictions of the 34 firms and those in the Bloomberg weekly survey of analysts’ predictions.

Although analysts providing storage forecasts assisted "significantly in the price discovery" on the natural gas futures market by providing important information, the paper found that since 1997 there were "significant cross-sectional differences in analyst forecasting ability."

“We find that analyst forecasts are less accurate and more widely dispersed during the critical withdrawal season than during the injection season; that energy company analysts have generally outperformed their counterparts employed in financial companies; and that analyst forecasts became overall more accurate and less dispersed following the takeover by the [EIA] of responsibility for publishing a key weekly storage report,” the paper’s authors stated.

Looking at weekly forecasts of the 34 firms since 1997, average weekly errors ranged from as low as 9.49 Bcf (in the case of Citigroup) to as high as 23.13 Bcf (Cresvale).

Although the 13.2 Bcf average error since EIA started the survey might be considered a fairly large number, forecasting success used to be even further off the mark (an average error of 16.4 Bcf/week) when the American Gas Association was in charge of the storage report, according to the paper.

Furthermore, using a "standardized analysts forecast error" calculation, which compares the forecaster's weekly error to the actual working gas level for the week, the paper found that performance ranged from a weekly average error of only about 0.47% (in the case of Southwest Securities) to a high of 1.3% (Cresvale). Such a mistake, even on a weekly basis, could hardly be characterized as substantial.
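For readers who want to see the arithmetic, the sketch below shows one plausible reading of that standardization, assuming the error is simply the absolute weekly miss expressed as a percentage of the week's working gas level. The function name and all figures are illustrative, not taken from the paper.

```python
# Illustrative sketch (not the paper's exact methodology): the absolute
# weekly forecast error expressed as a percentage of that week's actual
# working gas level. All input numbers below are hypothetical.

def standardized_forecast_error(forecast_bcf, actual_change_bcf, working_gas_bcf):
    """Absolute forecast error as a percent of the week's working gas level."""
    return abs(forecast_bcf - actual_change_bcf) / working_gas_bcf * 100

# Hypothetical week: an analyst predicted a 70 Bcf injection, the actual
# change was 60 Bcf, and working gas in storage stood at 2,500 Bcf.
err = standardized_forecast_error(70, 60, 2500)
print(round(err, 2))  # 0.4
```

Scaling errors this way lets a 10 Bcf miss be judged against how much gas is actually in storage, which is why the resulting percentages in the paper come out so small.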

There are alternatives available, but none of them does as good a job of predicting the weekly storage number, said Gay, one of the paper's authors. "You can ask yourself…'what information are these guys bringing to the table that is not already reflected in a lot of naive approaches' — and we do test five or six naive models," Gay said in an interview with NGI. "We find that whatever goes into those [naive] models, whether they are elaborate time-series models that take into account cyclicality and seasonality and those kinds of factors, analysts are bringing information and they are doing better predicting than these alternative models.

“Their forecast errors are smaller,” he said. “We also provide evidence that the market is also embedding these analysts’ forecasts into prices given the way that prices react when these weekly reports by the EIA are released. How good they are I don’t know, but they are certainly better than a lot of these alternative approaches.”

Salomon Smith Barney Futures was on average the most accurate forecaster compared to its peers over the period, while Cresvale International was the least accurate.

The paper also determined that weekly storage changes are more difficult to forecast during withdrawal periods than during injection periods. “Individual analyst forecast errors…and the weekly Bloomberg consensus forecast errors…are significantly larger (all at the 1% significance level) during the withdrawal seasons. In addition, the dispersion of analysts’ forecasts is also larger during withdrawal seasons,” according to the paper.

There were several other interesting findings comparing the results from different types of forecasting firms during different periods. For example, the paper found that when AGA was running the survey energy firms had “significantly better performance than did financial firm analysts.”

That trend didn't carry over on average when EIA assumed control of the survey. After May 2002, the performance differences between the two groups were insignificant unless examined seasonally. Energy firm analysts still performed better than financial firms during the November through March withdrawal cycle, however; there were insignificant differences during injection cycles (April through October).

For more details from the paper contact Gay at the Department of Finance, J. Mack Robinson College of Business, Georgia State University in Atlanta at (404) 651-1889 or ggay@gsu.edu.

©Copyright 2004 Intelligence Press Inc. All rights reserved. The preceding news report may not be republished or redistributed, in whole or in part, in any form, without prior written consent of Intelligence Press, Inc.