Perhaps we use the term “explosive” too often when describing the demand for data, but no other adjective seems to convey the scale of growth as well.  The latest indicator is a recent announcement by a Norwegian company that offers a glimpse into the “explosive” growth of data on some offshore oil and gas platforms.

Ziebel AS, a Stavanger-based provider of specialist well intervention services, announced what it believes to be the largest-ever collection and transfer of data in the oil and gas industry.  The collection, carried out on behalf of a major operator on wells in the North Sea, was completed over an eight-month period.  The volume of data gathered was 1,708 Terabytes, stored on servers on the platform rather than transmitted to shore.
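To put that figure in perspective, here is a back-of-the-envelope sketch of the average data rate the project implied.  The 30-day month is our own simplifying assumption; the actual collection schedule was surely uneven.

```python
# Back-of-the-envelope: average data rate for 1,708 TB gathered over 8 months.
# Assumes ~30-day months (an approximation) and decimal units (1 TB = 1e6 MB).
total_tb = 1708
days = 8 * 30  # ~240 days

tb_per_day = total_tb / days
mb_per_second = total_tb * 1e6 / (days * 86400)

print(f"{tb_per_day:.1f} TB/day, roughly {mb_per_second:.0f} MB/s sustained")
```

In other words, averaged over the whole campaign, the gear was producing on the order of seven Terabytes every day.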

In its release, Ziebel made a few interesting points:

  • The shifting sands of the oil & gas industry mean that firms must collect more data than ever before in order to make informed, responsible production decisions.
  • Advances in investigation and collection methods mean that the volume of oilfield data is expanding at a phenomenal rate. 
  • Industry experts believe that the sheer amount of data collected is growing by a factor of five every year.
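That last claim is worth dwelling on, because a factor-of-five annual growth rate compounds very quickly.  A short illustrative sketch, taking the 1,708 Terabyte project as a hypothetical baseline:

```python
# Illustrative compounding: 5x growth per year from a 1,708 TB baseline.
# The baseline and the five-year horizon are our own assumptions.
baseline_tb = 1708
for year in range(1, 6):
    volume = baseline_tb * 5 ** year
    print(f"Year {year}: {volume:,} TB")
```

At that rate, a single project's data volume would pass five million Terabytes within five years.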

We have sometimes joked about forecasts that describe Internet data traffic in terms of Zettabytes and Exabytes, describing them as numbers so large as to be almost meaningless.  It was only a few years ago that people would have called these forecasts laughable.  Remember how in the 1990s people were talking about “information overload?”  They argued that there was a ceiling on how much data people could reasonably process and that anything beyond it was unnecessary.  When was the last time you heard that argument?

It wasn’t that long ago that generating 1,708 Terabytes of data over eight months for a single offshore oil project would have been considered equally implausible, but this is what the future holds for us.