Teradata oil and gas 'big data' roadshow (December 2013). Partners tout big data analytics for oil and gas. Spotfire use cases from Chevron, Encana. Hortonworks Hadoop for the enterprise. Teradata's quest for better upstream analytics.
Teradata, along with partners Tibco/Spotfire and Hortonworks, invited Oil IT Journal along to their EU roadshow last month. Tibco’s John Guthrie kicked off the proceedings with a reprise of a Spotfire Energy Forum presentation on Chevron’s water flood surveillance. This mashed data from operations and finance into a holistic asset view. Spotfire has application everywhere—from E&P to O&M. The latest release (V6) provides metrics on handhelds, multi-layer maps and event stream analytics. In the non-conventional space, Guthrie cited a recent webinar in which Tim Yotter (Encana) discussed big data analytics for production surveillance and completions optimization. More from Spotfire.
Next up was Hortonworks’ Ben Rudall on the business value of Hadoop. Hadoop is an open source data architecture tuned for ‘big data’ analytics. The tool has been in regular ‘at scale’ use chez Yahoo since 2008. Hortonworks was spun out of Yahoo in 2011 with the objective of marketing and supporting ‘enterprise’ Hadoop with a RedHat-style business model. Rudall positioned Hadoop as a component of a data architecture alongside the RDBMS, Teradata and the data warehouse. With Hadoop, it is apparently possible to ‘manage seismic data in under 15 minutes!’ More from Hortonworks.
Teradata’s Niall O’Doherty continues in his quest to convince the upstream of the need for a better handle on its big data. Oil and gas has been a consumer of big data for years, especially seismic. But with the PC revolution, much data now sits in silos (read Petrel). Other verticals (such as retail and telcos) are leveraging novel architectures for their big data. Oil and gas has made the first steps, with better handling of metadata, but it has not yet made the move to the data warehouse. Today, interest in ‘big data’ has revitalized O’Doherty’s crusade to sell Teradata into oil and gas. ‘We are at a fork in the road and need to decide whether we need more big applications or better analytics.’ The true potential of better data availability is in facilitating deeper analytics on new and different data types. O’Doherty presented a case study performed for an EU major on 4D seismic data management that leveraged a combination of Hadoop and Teradata to accelerate seismic processing between successive surveys. The project leveraged the Mahout machine learning application to classify and score seismic anomalies. This represents a new paradigm for seismic data management—breaking down data and functionality into small chunks. There are also applications of the technology in production monitoring, which today is ‘like watching TV.’ Here Teradata has developed an integrated data environment for hauling, production surveillance, maintenance and the digital oilfield that is being trialled by an Eagle Ford operator. This enables queries across 12 domains and 120 data sets. Within a year the project gave a 90% reduction in shut-in wells and $6 million per month in savings due to better logistics and production management. ‘You don’t have to throw away your investment in (say) Petrel, just build a unified data architecture with Teradata, Spotfire and Hortonworks.’ More from Teradata.
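The ‘small chunks’ paradigm O’Doherty described can be sketched in a few lines. The following is an illustrative Python toy (not Teradata’s or Mahout’s actual code—function names, chunk size and the variance-based anomaly score are all our own assumptions): a trace is split into fixed-size chunks, each chunk is scored independently (the parallelizable ‘map’ step), and anomalous chunks are collected (the ‘reduce’ step).

```python
import statistics

def score_chunk(chunk):
    """Illustrative anomaly score for one chunk of trace samples:
    high amplitude variance flags a possible anomaly."""
    return statistics.pvariance(chunk)

def classify_chunks(trace, chunk_size=4, threshold=1.0):
    """Break a trace into fixed-size chunks and flag anomalous ones.
    The 'map' over chunks is independent, so it could be distributed
    across a Hadoop cluster; here it runs serially."""
    chunks = [trace[i:i + chunk_size] for i in range(0, len(trace), chunk_size)]
    scores = map(score_chunk, chunks)            # map: score each chunk
    return [i for i, s in enumerate(scores) if s > threshold]  # reduce: collect hits

if __name__ == "__main__":
    trace = [0.1, 0.0, 0.2, 0.1,    # quiet chunk 0
             5.0, -4.0, 6.0, -5.0,  # high-variance chunk 1
             0.0, 0.1, 0.1, 0.2]    # quiet chunk 2
    print(classify_chunks(trace))   # → [1]
```

In a real deployment the per-chunk scoring would be a Mahout classifier run across the cluster rather than a variance test, but the shape of the workflow—independent scoring of small data chunks followed by aggregation—is the same.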
Copyright © 2013 The Data Room - all rights reserved.