Ernest Capozzoli of the Coles College of Business at Kennesaw State University has offered Oil IT Journal readers an in-depth analysis of the impact of the eXtensible Business Reporting Language (XBRL) on oil and gas companies. The US Securities and Exchange Commission’s (SEC) initiative to enhance financial reporting is now extending to financial disclosures. The extended taxonomy is expected to be complete by year-end 2007 and will contain over 200 financial statement disclosures, ten of which are specific to oil and gas.
Capozzoli believes that the XBRL standard is now ready for prime time and will be ‘transformational’ over the next couple of years. XBRL comprises a set of standard XML-based taxonomies and presents financial and performance data that is ‘shareable, royalty free, reusable and easily understood.’
Accounting standards provide financial executives direction on how to account for business transactions and how to report financial information. Unfortunately, much of today’s reporting is supplied in unstructured text or PDF documents, making it hard to exploit. XBRL gets around this by presenting financial reports in a structured way that lets consumers access and understand the reported data.
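Capozzoli’s point about structure can be made concrete. The sketch below shows the XBRL idea: each figure is a tagged fact carrying a context (the reporting period) and a unit, so that a consumer can parse it without scraping PDF text. The `ProvedReserves` concept name and the figures are invented for illustration, not real US-GAAP tags; only the `xbrli` namespace URI is genuine.

```python
# Minimal sketch of an XBRL-style instance document. Element structure follows
# the xbrli instance namespace; the 'ProvedReserves' concept and all values
# are hypothetical examples, not actual US-GAAP CI taxonomy tags.
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
ET.register_namespace("xbrli", XBRLI)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# Reporting context: which entity/period the facts below refer to
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2007")
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}instant").text = "2007-12-31"

# Unit declaration: facts are reported in US dollars
unit = ET.SubElement(root, f"{{{XBRLI}}}unit", id="USD")
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"

# A single tagged fact, machine-readable rather than buried in a PDF
fact = ET.SubElement(root, "ProvedReserves",
                     contextRef="FY2007", unitRef="USD", decimals="0")
fact.text = "1250000000"

print(ET.tostring(root, encoding="unicode"))
```

The point is that the figure, its period and its unit travel together as data, which is what makes cross-company comparison and automated analysis possible.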
The SEC made its intentions clear in 2006 when chairman Christopher Cox announced that XBRL should have documented ‘every taxonomy that’s necessary to produce financial statements for any industry using US-GAAP CI by no later than mid-year 2007.’ To achieve this, the SEC has sponsored a massive project to incorporate financial statement disclosures in the US-GAAP CI XBRL taxonomy. A $54 million investment is transforming the SEC’s EDGAR database to handle XBRL data. At the same time the US-GAAP CI financial statement taxonomy has been extended with the 200 new disclosures.
Participation in the SEC’s voluntary program from the oil and gas sector so far has been limited to Anadarko and Petrobras. Capozzoli believes that the low participation, along with pending XBRL initiatives, will force organizational changes as oils struggle to meet the new reporting requirements.
Oil and gas XBRL
Two XBRL projects will specifically impact the oil and gas industry. The first is the development of an oil and gas taxonomy including data tags for all US GAAP financial statements and footnote disclosures. The second involves the development of taxonomy specifications for the Supplemental Information on Oil and Gas Exploration and Production Activities required by FAS 69. Capozzoli concludes by noting that, ‘The question is not whether XBRL will impact the oil and gas industry, but when!’ Read the full paper on www.oilit.com/papers/xbrl.pdf.
Schlumberger has acquired oil country data quality assurance boutique InnerLogix of Houston for an undisclosed amount. InnerLogix uses XML/XSL technology to access and cleanse data in proprietary environments.
But along with the technology, InnerLogix has a stellar marketing track record. At its first user group meet in 2003, after only a couple of years in business, InnerLogix already counted ConocoPhillips, ExxonMobil and ChevronTexaco as clients and had partnerships with Schlumberger, Halliburton and IHS Energy.
Schlumberger Information Solutions president Olivier Le Peuch said, ‘InnerLogix is a proven solution for automated data quality management. The technology is used by some 26 oil and gas companies around the world.
Center of Excellence
The addition of InnerLogix to the SIS portfolio brings data quality to users, increasing productivity and reducing technical risk.’ The InnerLogix toolset will be integrated into Schlumberger’s Data Quality Center of Excellence in Houston and will leverage Schlumberger’s global marketing organization.
At the end of an entertaining talk on geological ‘database blues’ (see our report from ECIM 2007 on pages 6-7), ExxonMobil’s Stephan Setterdahl polled the audience to see what datatypes gave them the database ‘blues.’ Perforation data, faults, seismics and well bores all got a mention, but what came out ahead was ‘naming conventions’ in general. There was a consensus that in the real world, especially when exchanging data with partners, it is necessary to work with multiple naming conventions. And that a database that ‘understands synonyms’ would be a big help.
This issue crops up in different contexts and granularities and across pretty well all industries. In one sense, the issue is one of data uniqueness and cleanliness. The issue of calling the top Cretaceous pick ‘Cret,’ ‘Cret.,’ ‘Kret’ or whatever is the same problem that I share with my bank and other figures of authority. Living in France, the simple question ‘What’s your name?’ is a source of great confusion. ‘McNaughton,’ I reply. My interlocutor asks me to spell that. ‘MC...’ Invariably this is written down as ‘MAC’ and thus I can be ‘Mac Naughton,’ ‘Mcnaughton,’ ‘Mc Naughton’ and so on.
I mention this more general ‘naming’ problem because it brings us closer to the generic solutions that the data management community outside of E&P is familiar with. To a bank, the data issues above can be approached from two directions. One is data cleansing: using scripts to look for Mac Naughtons who, say, live at my address and turn them into McNaughtons. The other is synonym management: the McNaughton in our address database is the same person as the Mac Naughton in our ‘people who owe us a lot of money’ database. I’m sorry if my understanding of banking is sketchy, but you get the picture.
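The cleansing side of this can be sketched in a few lines of Python. The normalization rule below (strip spaces and punctuation, fold the ‘Mac’ prefix to ‘Mc’) is a made-up example of the kind of script a bank, or an E&P data manager, might run:

```python
# Sketch of a data cleansing rule: collapse spelling variants of a surname
# to a single canonical key. The specific folding rules are illustrative.
import re

def normalize_name(name: str) -> str:
    """Lowercase, strip non-letters, and fold the 'Mac' prefix to 'Mc'."""
    key = re.sub(r"[^a-z]", "", name.lower())
    return re.sub(r"^mac", "mc", key)

variants = ["McNaughton", "Mac Naughton", "Mcnaughton",
            "Mc Naughton", "MACNAUGHTON"]
keys = {normalize_name(v) for v in variants}
print(keys)  # all five variants collapse to one key: {'mcnaughton'}
```

Real-world scripts would add address or other corroborating checks before merging records, since folding rules alone will occasionally conflate genuinely different names.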
The issue of giving different names to the same thing has been widely studied by the banking and financial services sector. It comes under the broad heading of ‘master data management,’ and is key to cross database reporting, business intelligence and other high level applications. These approaches work essentially by identifying and managing synonyms. Are they applicable to the upstream? I’m not sure—they are likely burdened by a lot of financial services domain knowledge—there should be an easier way...
Another ECIM presentation gave me some more food for thought. Contesto’s Anne-Lill Holme advocates ‘proper’ document classification, using a modified Dublin Core scheme for metadata. Here we are in the world of controlled vocabularies and taxonomies, all lists of ‘master data.’ Moreover, managing different document libraries following say, a company merger, will present the same ‘synonym’ issues as above.
I submit that all of the above issues are facets of the same problem and that is how to represent lists of information in a really useful way. Surely this completely generic issue has been addressed by academia, computer science or someone?
Enter my next witness, Kadme’s Vasily Borisov with whom I chatted at the ECIM gala dinner. He explained how Kadme had compounded information from multiple public data sources using the W3C’s resource description framework—RDF. The different ‘synonyms’ can then be mapped with RDF-aware tools such as Stanford University’s Protégé.
Well this tickled me because as some of you may remember, back in 2004 I sat in on the W3C’s Semantic Web special interest group and editorialized about the technology (OITJ March 2004). Also we reported recently (OITJ June 2007) on the Chevron/Schlumberger-backed Open Oilfield Ontology* effort—another RDF/Protégé effort.
And the answer is ...
RDF, if you will, is academia’s best shot at fixing the ‘how to make a list’ problem. It goes further than simple lists—in fact the semantic web has a lot to offer ‘master data management.’ It is all about understanding and sharing information across disparate resources whether they are documents, web pages or data stores. It works by a) assigning a unique identifier to the list (the namespace) and b) by storing the list items in a rather turgid, but machine readable form. If you want to know how it’s done check out for instance the W3C’s ‘vcard**’ namespace below.
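For readers without an RDF toolkit to hand, the mechanism can be mimicked in plain Python: give the list a unique namespace URI, store each item as a (subject, predicate, object) triple, and record synonyms as links between identifiers. The pick-name namespace below is invented for the example; only the owl:sameAs URI is a real W3C identifier.

```python
# Toy illustration of the RDF approach to 'making a list': a unique namespace
# identifies the list, items are (subject, predicate, object) triples, and a
# synonym is just another triple. The example.org namespace is made up.
NS = "http://example.org/picks#"                  # unique ID for our list
SAME_AS = "http://www.w3.org/2002/07/owl#sameAs"  # standard W3C synonym link

triples = {
    (NS + "Cret", "rdfs:label", "Top Cretaceous"),
    (NS + "Kret", SAME_AS, NS + "Cret"),          # 'Kret' is a synonym
}

def canonical(term, triples):
    """Follow sameAs links to resolve a term to its canonical identifier."""
    for s, p, o in triples:
        if s == term and p == SAME_AS:
            return canonical(o, triples)
    return term

print(canonical(NS + "Kret", triples))  # -> http://example.org/picks#Cret
```

Tools like Protégé do essentially this at scale, with inference and a user interface on top.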
Cain and Barnes, with backing from Devon, ExxonMobil and Shell, are inviting sponsors to a joint industry project (JIP) that sets out to address the geospatial integrity of applications used in the upstream. The prospectus notes that ‘costly errors in coordinate data have been a direct result of software problems.’
The JIP sets out to ‘transform the management of geospatial data in geoscience software’ by developing best practices for developers and by ‘creating a sustainable improvement process in geoscience software applications based on sound geospatial data management.’ The JIP is seeking additional funding from new sponsors. Sponsors will receive copies of previous geodetic integrity studies of Blue Marble and ‘eight major G&G software packages.’ More from email@example.com.
OITJ—What is Liquid Computing’s (LC) approach to the oil and gas vertical?
Weston—I moved to LC from Sun Microsystems (which has exited the HPC business) because it represented something new—both a startup and new computing paradigm with oil and gas as one of its initial targets. Our first target is upstream HPC. We’ll then move into ERP with our ‘fabric-based’ architecture.
Weston—The HPC market has settled on clustered PCs linked with Gigabit Ethernet, Myrinet or InfiniBand. In reality these are somewhat ‘Rube Goldberg’ solutions. Our ‘LiquidIQ’ fabric eliminates these ad-hoc connectivity solutions, replacing them with a dedicated network and blade computing infrastructure. LC’s founders came from Nortel and have leveraged the blade technology originally developed for the telecommunications industry. This means that you can have as many nodes as you need. Everything on the fabric is visible to everything else—network, storage, applications and so on.
OITJ—Sounds like ‘virtualization.’
Weston—The virtualization concept has been applied in software à la VMWare but our solution does not have the performance trade-off of traditional virtualization. Tests with the two major fluid flow simulator vendors show VIP performance up by 40% and linear scalability to 64 CPUs.
OITJ—We’ve heard that in some clusters, CPU usage may be only 10% of the theoretical rating.
Weston—Sure, because of the communications bottleneck. The LC fabric gets 90% efficiency from processors – even dual core devices. Moreover the power and cooling issues are fixed because the technology was developed for the strict environmental requirements of telcos. Our liquid cooling leverages technology developed for gaming machines and lets CPUs run 10-15° cooler. This leads to greater efficiency and reliability.
OITJ—Have you addressed the seismic market yet?
Weston—Not yet, we are concentrating on the reservoir simulator market. But our solution should be good for seismic processing. It is easy to maintain, economical on manpower and price competitive.
OITJ—Does the fabric address the shared memory issue?
Weston—Yes. It is great for larger models. Everything is shared. In the future, Tesla GPU devices will become shared resources on the fabric. In fact HPC will just be another compute resource on the system. Today we can deliver 2,000 node systems with considerable upside—the bus length can stretch to 300 m!
OITJ—What sort of compiler support is there?
Weston—We are ‘completely COTS!’ We support Red Hat Linux, AMD and whatever runs on these. VIP and Eclipse binaries and SCALI all ran. We provide tools that continuously manage and monitor components and reconfigure virtual machines to assure quality of service by adding resources or limiting application access.
OITJ—You mentioned ERP?
Weston—Yes, we plan to offer technical and commercial computing on a single architecture. The fabric will run Oracle’s ‘Unbreakable Linux’ and, as Landmark’s R5000 release will support Oracle 10g RAC, we will be one of the first certified platforms. Oracle RAC will actually work on the fabric!
More from Nick.Weston@liquidcomputing.com.
The venerable Center for Wave Phenomena of the Colorado School of Mines just published its report to sponsors for the 2006/7 academic year. The CWP has broad based industry support for geophysical R&D and this year’s papers focus on imaging, time lapse seismics, AVO and interferometry (passive seismics).
The CWP runs a four-year-old, 32 node Linux cluster with some 5TB of disk space. Software deployed includes Mathematica, Matlab, the Intel C and Fortran compilers and the NAG Fortran compiler.
The CWP releases open source software, notably the Seismic Un*x (SU) package, now installed at 3,600 sites in 68 countries. Other open source software includes Dave Hale’s Mines Java Toolkit and Paul Sava’s Madagascar seismic processing package, which captures seismic processing ‘recipes’ for re-use. More from www.cwp.mines.edu.
Speaking at the X-Tech conference in Paris earlier this year, Peter Murray-Rust of the University of Cambridge made a plea for ‘open data’ in science. Science is increasingly based on the re-use of existing published data. Traditionally this has been associated with primary journal articles, either within the text or attached as supplementary information. There is increasing community pressure to publish this in open, machine-accessible form, either into data centers or as supplemental data. This allows data and text mining to generate new areas of knowledge-driven science. Linking data from different disciplines is enabled by new web technologies that create syntheses from which new insights arise.
Unfortunately, most publishers make no effort to encourage the machine-readable publication of data and several actively oppose it by practices such as licenses, copyright and bans on robotic downloads. Many publishers require authors to hand over copyright on data, even though it can be argued that data are facts. Murray-Rust advocates a policy of unrestricted access to scientific data in semantic, ‘machine-understandable’ form. This can be leveraged by text-mining and automated high-throughput spidering. More from http://en.wikipedia.org/wiki/Open_Data.
Stavanger-based Kadme has announced a ‘Virtual Warehouse Framework’ (VWF) for E&P metadata management. The VWF underpins the Kadme-Kestrel asset data management system presented at the 2007 ECIM data management meet. VWF builds an ‘address space’ of all enterprise information resources that Kadme believes will ultimately replace the corporate database.
VWF is based on a top-level ontology and semantic schema from Kadme. These are stored using the W3C’s ‘resource description framework’ (RDF). Information from local databases is captured with an ‘RDF-izer’ such as MIT’s Simile and the Jena API. VWF binds metadata from different models based on ‘semantic proximity.’
Kadme’s approach leverages open source ‘semantic web’ technology as envisaged by Tim Berners-Lee, the father of the world wide web and long time advocate of RDF. VWF underpins Kadme’s K-search and K-map solutions. More from www.kadme.com.
UK-based Offshore Hydrocarbon Mapping (OHM) has acquired Rock Solid Images (RSI) of Houston in a £10.4 million cash and paper transaction. RSI’s rock physics toolset ties well and seismic data to predict reservoir properties. OHM uses controlled source electromagnetic imaging (CSEM) to map hydrocarbons below the seabed. OHM believes that CSEM and seismic data are ‘completely complementary’ and can provide ‘quantitative measurement’ of rock and fluid properties.
OHM CEO Dave Pratt said, ‘The acquisition follows a year-long collaboration on integrating CSEM data with RSI’s geologic models with exciting results. We can now accelerate our R&D and secure the ownership of intellectual property rights developed in this field.’ OHM is also to leverage a ‘strategic alliance’ with CGGVeritas to further its CSEM/seismic data analysis. CGGVeritas holds a 15% stake in OHM.
Founded in 2004, OHM has had a roller coaster ride on the UK AIM market as its IPR was contested by competitor Electromagnetic Geoservices AS (EMGS). After the European Patent Office’s refusal to grant EMGS’s wide CSEM claim, EMGS abandoned its suit, paying OHM’s costs. In 2004 Schlumberger’s Dalton Boutte claimed that CSEM ‘could replace seismics.’ Today CSEM remains a minute fraction of the exploration market.
ENI Australia has awarded SpectrumData a contract for seismic data management and tape copying services. The contract includes duplication of ENI’s exploration data, using a range of media including SDLT/DLT tapes, 3590 tape cartridges, LTO and DAT tapes and hard copy.
SpectrumData CEO Guy Holmes said, ‘In today’s boom times, high quality and cost effective seismic data management lets our clients focus on commercial priorities in the knowledge that their valuable data is in safe hands.’
Northern Territory Government
The Australian Northern Territory Government (NTG) has contracted SpectrumData for its onshore petroleum data bank. The project sets out to preserve the Territory’s data and provide access to industry and the NTG.
Houston-based startup Upstream Professionals (UP) has announced a ‘Well Lifecycle Efficiency Framework,’ a.k.a. the UP Framework. The system, currently being implemented in several oil and gas companies, sets out to ‘optimize operations, control costs and increase revenues.’
UP has a strategic alliance with RWD Technologies to leverage its expertise and tools for training and performance support solutions. The resulting ‘knowledge transfer system’ is tuned to an organization’s requirements, particularly in the face of the ‘big crew change.’
UP CEO Jeff Dyk said, ‘Improving the visibility of key information elements across all departments increases staff productivity and positions an organization to adapt to change through future acquisitions and drilling.’ The solution includes best practices for drilling, workover and production and embeds benchmarks for operational key performance indicators. A custom dashboard provides financial insight into operations. Upstream Professionals was founded in January 2007 by the creators of the Production Access package.
UP’s Chris Niven told Oil IT Journal – ‘Our new solution includes software from multiple vendors as well as our own tools. We have a vision of what an upstream optimization and knowledge transfer environment should look like and are leveraging existing systems, filling the gaps with customized workflow, collaboration and dashboard solutions.’ Clients include Petrohunt, St Mary’s Land & Exploration and Denbury Resources. More from firstname.lastname@example.org.
Zokero has announced SeisWare 6.5 including a ‘true’ compiled 64 bit version for large data sets and Windows Vista 64 and 32 bit compatibility. The package also offers a new hardcopy editor, enhanced OpenSpirit import and export, interactive velocity QC and grid editing for improved depth conversion.
Google has submitted its Keyhole Markup Language (KML) to the Open Geospatial Consortium (OGC) for adoption as a standard. The plan is for KML V3.0 to be ‘harmonized’ with relevant OpenGIS baseline standards. KML is used to display geographic data in Google Earth and Google Maps. The OGC and Google have agreed to better align KML with the OGC’s Geography Markup Language (GML).
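A minimal KML document is simple enough to generate with any XML library. The sketch below emits a single Placemark under the OGC KML 2.2 namespace; the placemark name and the (approximate) Ekofisk coordinates are illustrative. Note that KML wants longitude before latitude, an easy thing to get backwards.

```python
# Minimal KML document: one Placemark displayable in Google Earth/Maps.
# The coordinates (lon,lat,alt order) are an approximate, illustrative
# position for the Ekofisk field in the North Sea.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # default namespace, no prefix

kml = ET.Element(f"{{{KML_NS}}}kml")
pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Ekofisk field (approximate)"
point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "3.21,56.55,0"

print(ET.tostring(kml, encoding="unicode"))
```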
TGS-NOPEC has released Facies Map Browser 2 (FMB2), an update of its well-data derived database of depositional systems models. FMB2 offers improved map visualization, single well review and multi-well correlation displays and a query tool for trend analysis. Grid manipulation allows for data use in third party interpretation systems. FMB2 provides access to an online library of reports and references, adding a virtual ‘audit trail’ of supporting documents for an interpretation.
Beicip-Franlab has just announced ‘FRACAFlow,’ a ‘next generation’ fracture characterization and modeling tool. The tool is a component of Beicip’s OpenFlow environment (OITJ May 2007) which also includes CONDORFlow for assisted history matching and PUMAFlow for reservoir simulation. The new portfolio is made up of plug-in modules that share a common user interface, data model and visualization environment.
LogTech has just released LOGarc VE, a well log management system for digital and raster log data. A new WELLfile module adds well-related file data types. LOGarc can be used as a component of outsourced data management with data hosted by LogTech. LogTech’s database currently holds over 7,500,000 curves.
Deloitte Petroleum Services has launched a ‘prospects and play’ module for ArcGIS 9.2 and MapInfo 8.5. The module is available for all PetroView datasets and incorporates a customizable schema that lets users load, edit and query their own prospects and business development opportunities. The module imports information from many formats (PPT, XLS, DOC and CAD) into a single spatially-enabled database. Users view and query corporate prospect outlines and business development opportunities alongside PetroView data and other GIS layers.
Ryder Scott has added metric units capability to its SNAP well performance and nodal analysis package. SNAP includes code from the Valve Performance Clearinghouse, a consortium of oil companies that establishes gas lift valve performance correlations. Another SNAP customization optimizes tubing tail location in long well completions. SNAP is used by engineers at the Prudhoe Bay and Kuparuk oil fields.
A new release of Geosoft’s Oasis montaj, release 6.4, adds voxel extraction, isosurface plotting for grid visualization and a ‘snapshot’ tool to bookmark an area of interest. Math expression support, interactive grid filtering and SEG-Y import are among other new features.
Johan Kinck described Hydro’s Petech architecture, a new application portfolio with OpenSpirit as the data ‘glue.’ OpenSpirit (OS) acts as a data transfer utility between the project data store and Hydro’s applications. This ‘fit for purpose’ solution will enable deployment of ‘best of breed’ applications while assuring data security in a central data store. OpenSpirit was used to move data from the old project data store (GeoFrame) to the new (OpenWorks). This was a ‘demanding task’ involving large volumes of data.
The Petech project aims to establish more effective cross disciplinary work processes and reduce cycle times for the upstream. Applications from Schlumberger, SMT and Roxar will access data in OpenWorks via OpenSpirit. With the imminent merger with Statoil, Kinck showed that Hydro’s current data volumes put the company in the same league as Statoil. Commenting on OS’ functionality, Kinck said, ‘OpenSpirit is the Fast Move button, but you must understand what you are asking it to do!’
Lynn Babec outlined the future for OS development. New data types will extend OS’ footprint into production engineering, drilling, well planning, reservoir simulation and field operations. The plan is to extend the OS workflow from its current ‘static earth model’ focus out into field development and production. This is proving challenging, with issues like unstructured and proprietary data stores, multiple realizations and time variant data. All of which raises questions for the OS community as to the prioritization of new datatypes, applications and data stores.
Marc Hockfield described how Production Access’ Oilfield Data Manager (ODM3) was integrated into Talisman Norway’s OS-enabled environment. ODM3 is a geological toolkit for data integration and interpretation. An OS plug-in was used to connect to Talisman’s GeoFrame data store. OS was successfully used to import Talisman’s massive database in a fraction of the time it would otherwise have taken.
Schlumberger’s Armando Gomez described current and future workflows between Petrel and OS-enabled data sources. Improvements to Petrel’s OS module have enhanced usability with coordinate system awareness, expanded data type support and the integration of ArcGIS events for selected Petrel data types. An upcoming ‘Petrel for the Enterprise’ release will include an Oracle database.
Wolfgang Jaich showed how OMV is planning to leverage OS in a ‘next generation,’ SOA-based E&P information system.
Torkel Thime from the Norwegian State Archive (Statsarkivet) described a program to capture documents relating to the early days of Norwegian oil and gas exploration. Oil and gas is a crucial business to Norway and it is important to understand how it developed. The Statsarkivet has signed deals with Statoil, ConocoPhillips, Total, Exxon, OLF and others for the archival of documents from the 1970s relating to the discovery and development of Norway’s major oilfields like Ekofisk and Frigg.
The real story
Documents include scanned memoranda, handwritten notes and video archives. These include ‘behind the scenes,’ internal non-public records that ‘contain the real story.’ They show, for instance, the unpublished negotiations on Ekofisk crude prices and tax discussions between Phillips and the Norwegian government. Management Committee minutes show Bartlesville memoranda on the Ekofisk Bravo blowout. Discussions with trade unions are also captured. According to Thime, ‘There is no truth here, only different views; eyewitness views bring us very close to events.’
A data migration strategy ensures that documents are moved out of their original formats, removing software dependency. Confidentiality issues are subject to agreements with companies as to when documents will become publicly available.
Lars Olav Grøvik (Hydro) believes there is a ‘mismatch’ between what we spend acquiring data and what is spent on managing it. This is a significant issue that ‘could turn the dream of the digital oilfield into a nightmare.’ Hydro’s operations control center (OCC) at Sandsli was best in class when built. The OCC was replicated on the West Venture rig offshore. But because Hydro ‘omitted’ to change the underlying work process, the CAVE visualization system was underused. Grøvik still believes in 3D visualization, but this has largely failed because of the difficulty of keeping CAVE data current. Hydro has tried to get real time data into the CAVE but ‘software is not ready to consume field data.’
Wired pipe has brought a huge increase in data rates, and broadband from the bit represents a major challenge for data managers. ‘Why is our software not real time ready?’ ‘Why do we have to hit a button to refresh the screen?’ A middleware integration layer can help with ‘plug and play.’ Hydro was reluctantly ‘forced’ to use OpenSpirit to link its local, project and master data stores. The human factor and change management are critical—especially when changing software while operations are up and running! Grøvik concluded with a reference to ‘Putt’s Law’: those who understand technology don’t manage it, and those who manage don’t understand technology!
XML in oil and gas
Bjorn Rugland (Statoil) traced the complex history of XML in the Norwegian petroleum industry. The Daily Production Reporting (DPR) standard has backing from OLF, Petoro and the main Norwegian operators. DPR embeds the ISO 15926 ontology, Tieto Enator’s Energy Components and an XML database from LicenseWeb. Petoro uses DPR for partner surveillance across 50 plus fields. A related standard for Monthly Production Reporting (MPR) addresses reallocation issues left open by DPR; MPR requires correct volumes aligned with commercial systems. ProdML’s initial scope was the ‘office’ domain, upstream of the historian, but in the future ProdML’s scope may embrace process data. Statoil is testing ProdML with SIS on zonal allocation on Snorre.
Open Standards in E&P
Nils Hofseth works on system integration at Capgemini’s (CG) ‘EpiCentre,’ an E&P industry competency center which opened earlier this year to leverage CG projects in the upstream/midstream. Hofseth traced the history of integration efforts across vendor applications, data models and APIs. Today, shared data models such as OpenSpirit and WITSML alleviate the fundamental problems with reusable programming interfaces by minimizing ties to platform and language.
Eric Toogood (Norwegian Petroleum Directorate—NPD) offered a brief historical review from the early agreement between the NPD and oils to cooperate on data provision, focusing initially on post stack seismics. NPD is also a DISKOS user as it does its own interpretation to negotiate license terms with oils on a level playing field. The NPD also supervises data preservation and manages data release by the simple expedient of changing the entitlement to ‘public.’ Seismic data release rules have been clarified and data in relinquished acreage is no longer confidential. DISKOS membership is up from 17 companies last year to 41. Users can now log into the database from anywhere in the world through a new web interface. SINAS operations manager Steven Eames reported that there is now around 2TB of well data and nearly 90TB of seismic online, with around 3TB downloaded per month.
Anne-Lill Holme (Contesto) advocates tagging documents with contextual metadata to make them findable and understandable to other users and to support version control. Holme deprecates the use of folders to mimic metadata. Storing documents in folders may not be consistent across the company and there are security, version, tampering and erasure issues with such naïve document management. Metadata capture starts with a database of controlled lists for owner, creator, document type, etc. These are presented to users as a form to fill in before storing a document. Search can then use metadata and free text to constrain the hits returned. It is important to think through tag structures to align them with business needs. This is where the Dublin Core set of metadata terms comes in. DC offers common semantics, with a dozen or so elements to choose from. Unfortunately, there are always some attributes that do not fit DC. Metadata capture should allow for material that cannot be structured this way—an issue that is not recognized by the DC community. To help and encourage users to fill in metadata, tricks include prepopulating fields and color coding documents.
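Holme’s controlled-list approach can be sketched as a simple validation step at document check-in. The element names ‘creator’ and ‘type’ are genuine Dublin Core terms; the controlled values and the document record below are invented examples:

```python
# Sketch of metadata capture against controlled lists, in the spirit of the
# Dublin Core approach. Controlled values and the sample record are made up.
CONTROLLED = {
    "creator": {"J. Smith", "A. Jones"},
    "type": {"report", "memo", "presentation"},
}

def validate(metadata: dict) -> list:
    """Return the fields whose values fall outside the controlled lists."""
    return [field for field, allowed in CONTROLLED.items()
            if metadata.get(field) not in allowed]

doc = {"title": "Well completion report",
       "creator": "J. Smith", "type": "report", "subject": "completion"}
print(validate(doc))  # -> [] : all controlled fields are valid
```

A document management system would run such a check on the fill-in form before storing the document, which is also where prepopulated fields come in.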
According to Stephan Setterdahl (ExxonMobil), geologists are bad at databases and their DBAs are ‘singing the blues.’ Setterdahl suggests starting with an empty database and moving key data into the new repository. What should that repository be? A big corporate database with a 1,000 page manual is ‘a disaster,’ and an open database gets too cluttered after a couple of years of use. Setterdahl confessed he was ‘tired of cleaning databases’ and decided that there had to be a better way—without sacrificing geologists’ creativity. He got together a geologist and a script writer (SQL and Perl) and started developing a set of scripts to identify database inconsistencies and notify users. Next, a unique ‘pick name’ is required for the plethora of synonyms ‘Cret.,’ ‘Cret,’ ‘Kret,’ ‘SIS_Cret’ etc. Because there are so many, new users are forced to add their own to the mess! Scripts find and rename rogue names to a standard taxonomy. Other tools map from BP, Statoil and other pick conventions to Exxon’s A17 Maximum Flooding Surface (MFS) convention. ‘Police Action Scripts’ control the database and prevent a mess before it happens. If a user diverges from the standard, the system gives the user one month to correct or their data is deleted! A debate on ‘database blues’ revealed similar issues for perforation data, faults, seismics and well bores. But the fly in the ointment is Petrel. As Setterdahl says, ‘You can’t run scripts on Petrel data, I’m hosed!’
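A rename script of the kind Setterdahl describes might look like the following, sketched in Python rather than the SQL and Perl actually used. The synonym table and pick names are invented for illustration:

```python
# Sketch of a pick-name cleanup script: map rogue synonyms onto a standard
# taxonomy and flag anything unresolved. Names below are illustrative.
SYNONYMS = {"Cret": "TopCretaceous", "Cret.": "TopCretaceous",
            "Kret": "TopCretaceous", "SIS_Cret": "TopCretaceous"}

def clean_picks(picks):
    """Rename known synonyms to the standard name; collect unresolved strays."""
    renamed, strays = [], []
    for name in picks:
        if name in SYNONYMS:
            renamed.append(SYNONYMS[name])
        else:
            strays.append(name)  # candidate for a 'fix within a month' notice
    return renamed, strays

renamed, strays = clean_picks(["Cret.", "Kret", "TopJurassic?"])
print(renamed, strays)
```

The ‘police action’ part is the strays list: rather than silently deleting unknown names, the script notifies the owner and only escalates if nothing is corrected.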
Otis Stelly outlined Shell’s move from semi-autonomous operating companies to a ‘global’ EP IM discipline with central control. This has resulted in a Global IM Forum and an EP CIO. IM design responds to business drivers like productivity and reporting. As an example, Shell’s ‘Focused Basins Information Platform’ uses global document resources to collate information on an area of interest, leveraging MetaCarta. The Shell EP Catalog is now extending to embrace field development and to ‘put workflows atop IM.’ Shell uses the ISO 15926 standard for its facilities, where up to 300 contractors may work on a project.
Liv Maeland (Statoil) announced a new university study program designed for professionals in oil and gas information management. The ECIM-managed course starts later this year at the University of Stavanger. A second ‘IM value creation’ conference is planned for 2009.
This report is a short version of a Technology Watch report from The Data Room—more from email@example.com.
Back in 1999 at the SEG, Eleanor Jack presented a paper entitled ‘The future is random: Archiving seismic data to random access media.’ In an ECIM workshop, Jack, now with Landmark, teamed with colleague David Holmes to examine the degree to which the 1999 ‘future’ predictions have come true, taking a fresh look at the data archival landscape. In 1999, random access archival media meant CD/DVD or optical disk. Magnetic (spinning) disk was considered too expensive for archival. In 2007, archival is still to tape—either IBM 3590E, 3592 or LTO. 3592 capacity is now 750GB going on a TB and LTO4 is currently 800GB. Will these be the solution in 2015? For Jack, this is far from clear: ‘tapes getting bigger without evolving, just like the dinosaurs!’
Holmes then surveyed the 2007 random access media landscape. Nearline storage now includes CD-ROM, DVD, memory stick and portable disk (now a major data delivery mechanism for Landmark—even for prestack seismic—although it is unsuitable for archiving). Online options include SLED (‘single large expensive disk’), JBOD (‘just a bunch of disks,’ for gamers) and RAID (but watch out for deletions: not good for the archive).
For all continuously spinning disks, heating and cooling are major issues; in this context, tape is ‘way out ahead.’ One answer to this issue is a new technology called MAID, a ‘massive array of idle drives.’ A typical MAID vendor is Copan, whose 42-unit cabinet sports a 700TB capacity. Only a few disks are active at any time; data is spread around drives which are powered down between accesses. MAID has six times the data density of a 3592 tape library. Latency to the first GB is acceptable and much better than a robot’s. But the system has the same delete issues as RAID. Finally, hierarchical storage managers (HSM) bring organization to complex storage architectures, managing backup to disk in a transparent fashion.
Jack recommends media-neutral formats for archival. SEGY can be used for tape or CD. The current seismic field data format is SEG D, whose Rev 2 has a byte-stream version that can be stored on anything. Encapsulation is an important technology for gapped tape formats. Encapsulation options include RODE (a complicated option) and the ATLAS TIF format, which replaces tape gaps with a digital marker. Current TIF implementations have a 2GB file size limit, but a TIF 8 version with a 24-byte address gives a file size limit of 8 PB!
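The idea behind encapsulation—replacing physical tape gaps with in-stream digital markers so records survive on any byte-addressable medium—can be sketched as follows. This is a schematic Python illustration, not the actual ATLAS TIF or RODE byte layout:

```python
import struct

def encapsulate(records):
    """Join tape records into one byte stream, prefixing each with a
    4-byte little-endian length marker in place of the physical tape gap."""
    out = bytearray()
    for rec in records:
        out += struct.pack("<I", len(rec))  # digital marker replaces the gap
        out += rec
    out += struct.pack("<I", 0)  # zero-length marker signals end of data
    return bytes(out)

def decapsulate(stream):
    """Recover the original records from an encapsulated stream."""
    records, pos = [], 0
    while True:
        (n,) = struct.unpack_from("<I", stream, pos)
        pos += 4
        if n == 0:
            break
        records.append(stream[pos:pos + n])
        pos += n
    return records

recs = [b"SEGD record 1", b"SEGD record 2"]
assert decapsulate(encapsulate(recs)) == recs
```

A 4-byte marker caps file size around the 2GB of current TIF; widening the address field is what lifts the limit in later revisions.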
Returning to the archival issue, Holmes noted that current practices do not meet industry requirements. These are a) to remove the requirement for future remastering and b) to be able to throw away the original tapes. Holmes suggests looking again at the eXtensible Access Method (XAM) that is now supported by the storage industry.
Tape or not?
Jack concluded by noting that tape has disappeared in the entertainment industry and in home computing: ‘The present is random. It’s time for the seismic world to catch up.’ There ensued a healthy debate as to the merits or otherwise of tape as an archival medium. The proponents of tape cited the cooling and energy advantages, as well as the fact that tape technology’s progress shows no sign of slowing in the immediate future.
BHP Billiton has appointed Tim Cutt as president of global production. Cutt was previously with ExxonMobil.
CapRock Communications has named Doug Tutt as president of global energy markets.
Chevron has teamed with The Economist to launch ‘Energyville,’ an online SimCity clone. Energyville uses content provided by the Economist Intelligence Unit to ‘examine the economic and environmental trade-offs and opportunities associated with different energy sources.’ More from www.willyoujoinus.com. Chevron has also appointed John Watson as executive VP for strategy and development.
The Chemical Industry Data Exchange (CIDX) has just published a white paper entitled ‘Chem eStandards Initiative – Radio Frequency ID’ to ‘help chemical company CIOs and CFOs evaluate their RFID investment.’ More from www.cidx.org.
Xiaojun Lu has joined COADE as senior engineer and developer for plant engineering. Lu was previously with China National Offshore Oil Corporation and Texas A&M University.
John Lowe has been promoted to executive VP E&P for ConocoPhillips.
Det Norske Oljeselskap (DNO) has joined de Groot-Bril’s Sequence Stratigraphic Interpretation System consortium (SSIS II).
Donald Zmick is now VP client services with Caesar Systems. Zmick was previously with Decision Strategies.
Dennis Clark has joined the data processing division of Fairfield Industries as sales manager.
Petris has been awarded US patent No. 7,257,594 for the ‘dynamic common model’ that underpins the PetrisWINDS Enterprise product.
Invensys has appointed Rick Bullotta VP and CTO of its Wonderware unit. Bullotta was previously with SAP.
John Rose is now president of KBR’s Upstream unit and Tim Challand is president of a new technology division.
The Michelsen Centre, a Norwegian centre for research-based innovation, has been established to develop innovative solutions for the ‘next generation of measurement technology.’

Hanno Tam has been appointed VP sales for AVEVA Americas.

OpenSpirit Corp. has donated its units catalogue and data model to the POSC/Energistics standards body. OpenSpirit’s technology originated in POSC’s ‘business objects’ project back in 1998.
Michael Brumby is Asian regional manager for Petrosys’ new office in Kuala Lumpur, Malaysia.
Gilles Munier is general manager of the new ‘Geogreen’ joint venture between the French Petroleum Institute (IFP), Géostock and the BRGM. Geogreen was established to offer engineering services for CO2 transport and sequestration.
Ann Rolison of Magellan LP is vice chair of the PIDX Downstream eBusiness Workgroup.
Keith Woodrome has joined Ryder Scott as petroleum engineer. Woodrome was previously with ExxonMobil.
The US Securities and Exchange Commission has appointed Texas A&M professor John Lee as petroleum engineer. Lee is responsible for ‘evaluating oil and gas reserves disclosure requirements.’
Shahin Khan has been appointed VP corporate strategy with SGI. Khan will help ‘build the new SGI with a sharp focus on high-performance computing (HPC).’ Khan hails from Sun Microsystems.
Lars Saunes is director of Simmons & Co.’s new Norwegian office in Oslo.
Tom Even Mortensen is now CEO of Scandpower Petroleum Technology (SPT) Group following the departure of Dag Terje Rian who stays on as a board member.
The 2007 annual Harris Poll ranks oil companies way down the list alongside the tobacco industry. Only 33% believe oil companies ‘do a good job.’
Following our piece last month, geoLOGIC Systems president David Hood points out that ‘geoSCOUT is not just for Canadian data. US data is also available and many companies use international data with the product.’
Following its acquisition of Maintenance Repair and Operations (MRO) Software last year (OITJ September 2006), IBM has consolidated its Maximo offering and is extending the MRO philosophy into IT asset management with ‘cross fertilization’ from its own brand ‘Tivoli’ line of business. Presentations to the Maximo World 2007 held last month in Orlando included both the IT asset and the ‘nuts and bolts’ side of the MRO business. BP Exploration hosted an ‘innovation that matters’ session on the use of Maximo across its global operations.
Terry Ray (IBM) described the functionality of the current ‘Zakum’ release of Maximo for oil and gas. BS ISO 14224 failure codes and equipment categories have been deployed. Standardized failure data is essential for reliability analysis. Similarly, standard asset specifications enable comparisons of equipment performance. The Zakum release also leverages Maximo location information, and ‘operating context’ is used to associate production data with maintenance so that production losses can be recorded against a work order. Work order prioritization and condition-for-work information support regulatory compliance. Maximo ‘knows’ what condition an asset has to be in before work such as a plant shutdown can commence. Maximo oil and gas members include ADNOC, BP, Chevron, Flint Hills Resources, KNPC, Occidental, Syncrude, Accenture, Ledet and Associates and UC Berkeley.
IBM has added GIS functionality to Maximo. ESRI’s ArcGIS Server 9.2 Enterprise is now available from within Maximo, making dynamic GIS data available to Maximo users. Conversely, Maximo information is also available to the mapping community. Co-visualization of GIS and Maximo data is further enhanced with support for geoprocessing, geocoding and versioning.
A presentation from Intel focused on ‘converged asset management.’ The use of ‘pervasive devices’ including RFID, sensors and IP addresses for assets is causing physical and IT asset management to converge as users seek to ‘reduce costs, simplify infrastructure and manage assets from a single console with repeatable processes.’ Last year Intel performed over a million Maximo work orders.
Following its web services-based delivery of well production and header data (OITJ March 2007), IHS is now offering a slimmed-down service for occasional use. US Data Online provides web-based access to current well and production data and lets users build map-based queries and create areas of interest (AOIs). Selected data can be exported in any standard format and reports are stored free for 90 days.
Users get access to new data as soon as IHS collects, audits and stores it in the hosted database. Well data is uploaded nightly, production data throughout the month. Enerdeq-powered text queries and national searches complement AOI search.
In addition to well and production data, IHS offers over 170,000 drill stem test reports from the American Institute of Formation Evaluation. This data, which goes back to 1948, has not been previously available as no regulatory authority had responsibility for its collection. US Data Online targets low volume accounts with less than $3,000 of data purchase per year. More from www.oilit.com/ads/ihs.
TruePoint Systems Inc., California State University at Long Beach’s first technology transfer startup (2006), has signed a deal with service-oriented architects TenFold Corp. to extend its mote-based distributed inventory management system (DIMS) with an SOA-compliant, Ajax-enabled applications engine. DIMS uses motes, small sensors with onboard business rules and connectivity, to provide real-time location solutions for asset management.
TruePoint president Anne Yee said, ‘We needed an inventory management application to leverage the location information provided by our infrastructure. The result is RTLSManager, a robust front-end to our asset location solution.’ TruePoint’s motes enable asset tracking in harsh environments. TruePoint’s anchor client is the US Department of Defense. TenFold’s development platform ‘EnterpriseTenFold’ is used to replace legacy applications and to build complex SOA-compliant systems.
Kongsberg Intellifield has received certification from Energistics (formerly POSC) for its SiteCom, Discovery Wells and Discovery Portal products. SiteCom acts as a WITSML server for real-time and other drilling data from WITSML and non-WITSML sources.
LIOS Technology of Cologne, Germany has also received certification for the WITSML data interface for its distributed temperature survey (DTS) devices. LIOS’ technology provides temperature profiles, backscatter, and event data for onshore and offshore installation.
Energistics’ WITSML certification program is a ‘baseline, self-certification’ process for WITSML Version 1.3. Although the program does not currently include compliance testing, this is envisaged at a future date. Certification is free to Energistics Sustaining Members; non-members pay an annual fee on a sliding scale based on turnover.
To date, certification has seen limited take-up. Knowledge Systems was first out of the blocks with certification of its Drillworks Connect module and Geologix’ GEO Suite is also compliant.
Mobile computing solutions provider, Illinois-based Syclo is extending the reach of IBM Maximo for Oil and Gas to field workers with its new Maximo Mobile Syclo Edition (MMSE) product line. Syclo MMSE improves operator productivity by targeting asset ‘touches’ to keep equipment functioning. Field data from Syclo’s mobile units can be integrated into back-end applications, speeding management decision making and providing accurate information for compliance reporting. Syclo’s products are built on the Agentry platform and are pre-integrated with leading back-end systems including Maximo, SAP, Primavera and Datastream.
Richard Padula, Syclo president and CEO said, ‘We work with partners to extend their applications’ reach with our Agentry-based platform and streamline operations with leading-edge mobile technology.’ Other oil country Syclo solutions target work and inventory management. A new Smart Task Tracker application has been deployed by a major oil company client to speed plant turnarounds, shutdowns and outages by eliminating excess foot traffic and paper-driven sign-offs. Syclo’s mobile products are used by some 40 oil and gas companies worldwide including Alyeska Pipeline, BP, Columbia Gas, El Paso, Gulf South Pipeline and Scotia Gas Networks.
Environmental Support Solutions (ESS) has released Essential Suite 7.0, the latest version of its risk management platform. Essential Suite 7.0 (ES7) comprises a suite of integrated environmental, health and safety and crisis management solutions.
Robert Johnson, ESS President and CEO said, ‘Our software is the preferred platform for global enterprises which are under pressure to move their EH&S performance beyond compliance to risk reduction and operational excellence.’
The new release adds support for 100 languages, allowing global enterprises to manage and report worldwide. An Enterprise Hierarchy lets managers roll up EH&S and crisis management performance from all levels of the organization. New security functions control access, enabling information sharing without compromising security.
ESS is also working on new incident reporting and auditing functions which will be available in the 7.1 release. ESS partners with Microsoft, IBM and OSIsoft and is working towards SAP Netweaver certification. Clients include Halliburton, ConocoPhillips, Lyondell, Cheniere Energy, Shell, Sasol, Chemtura, Huntsman and Total.
Southern Star Central Gas Pipeline has deployed GE Oil & Gas’ Pipeline Solutions’ ThreatScan pipeline monitoring system. Southern Star is the first North American company to install ThreatScan as part of its pipeline integrity management activities. ThreatScan uses acoustic monitoring to improve pipeline safety and alert operators to potential damage such as may be caused by construction machinery (OITJ July 2007).
Northern Wichita has experienced considerable urban expansion, with construction encroaching upon a 16-inch gas pipeline. This led Southern Star to install ThreatScan on a section of the line to monitor work crew activity. ThreatScan sensors detect abnormal vibrations and alert GE call centers, where data is analyzed and the operator notified of the location of a potential threat.
Warren Etheridge, Manager of Pipeline Compliance for Southern Star said, ‘ThreatScan is proving to be everything we thought it was and much more. Although the product is an impact-detection system, we are working with GE to demonstrate its capability to anticipate third-party encroachments in time to protect the line. ThreatScan will help us improve safety and reliability for our customers and the Wichita community.’ Third-party damage is the leading cause of pipeline failures in the United States. Southern Star Central Gas Pipeline is a division of Owensboro, Kentucky-based Southern Star, whose natural gas transmission network spans more than 6,000 miles in the Midwest and mid-continent regions of the United States.
CGGVeritas has acquired 15% of Controlled Source Electromagnetic imaging specialist OHM at a price of 240 pence per share. The companies have also entered into a ‘strategic operating alliance’ to jointly develop the global CSEM market and ‘to capitalize on the integration opportunities between seismic and CSEM.’
CB&I is to acquire ABB’s Lummus Global business for $950 million on a debt and cash free basis and subject to adjustments at closing. Lummus Global provides process technologies to the oil & gas and petrochemical industries and is a global EPC contractor.
Boston-based VC firm Abry Partners has acquired Houston-based data center specialist CyrusOne. The company recently signed up a new oil and gas client to run seismic applications on its 2,000-node cluster.
Fugro has acquired the oil and gas data storage activities of Canada’s Kestrel from Recall, Brambles’ information management business. The deal covers petrotechnical and scientific data and samples. The business has annual revenue of approximately €4 million and will be integrated with Fugro Data Solutions’ Canadian operations.
Invensys has extended its partnership with Innotec whereby Invensys will resell and support the Comos plant engineering management modules worldwide.
Petroleum Geo-Services (PGS) has acquired Roxicon Geogrids AS in a $12 million cash transaction. PGS will roll Roxicon into a new ‘MegaSurvey Centre of Excellence’ in Stavanger.
SEOS has divested its visualization systems business to a VC-backed management buyout led by Martin Howe. The new company is called Global Immersion Ltd.
Emerson Process Management has acquired Decision Management International (DMI) for an undisclosed amount. DMI provides software and expertise for electronic management of workflow, materials, equipment, personnel, and documentation. DMI’s Compliance Suite is tightly integrated with Emerson’s PlantWeb architecture.
Honeywell has acquired Delft Instruments’ Enraf Holding unit for $260 million. Enraf provides measurement and controls solutions for the exploration, production and transportation of energy products for land and marine applications.
GE Energy has made a £289 million cash offer for Sondex Plc. UK-based Sondex designs and manufactures electro-mechanical downhole tools and surface equipment for well-site operations.
A recent webinar outlines the role played by Impress for Engineering Project Management (EPM) in the 2006 shutdown of BP’s Lingen, Germany refinery. Impress Software’s package was used by BP to synchronize project data between SAP and its Microsoft Project-based Primavera application. The project mobilized some 2,100 external contractors and 300 BP employees (OITJ October 2006). The turnaround was the first complete shutdown of the refinery in 50 years and involved servicing 1,600 pieces of large equipment, 60 reactors and thousands more equipment items.
€54 million
The €54 million project included 800,000 hours of contractor work broken down into 6,000 ‘activities.’ Impress for EPM was the bridge between the Lingen core SAP system and Primavera 3 Enterprise (P3E). SAP work orders were synchronized to P3E to enable project schedule planning, confirmation of completed work and timely payment of contractors. Dynamic scheduling successfully reduced the duration of the turnaround and assured a ‘timely response’ to unplanned work.
Turnaround data is now available to ongoing operations from a BP internal information portal. It is estimated that the systems will reduce planning time and costs by two thirds for future turnarounds. Other Impress customers include Shell, Total, ConocoPhillips, Saudi Aramco, Marathon and Suncor.
Quorum Business Solutions has rolled out a new version of its Quorum Land Suite (QLS) at two client sites. The latest version, QLS 5.0, has been re-coded to benefit from the Microsoft .NET development platform. According to Quorum, .NET includes technologies that improve system flexibility, increase system scalability, and reduce the vulnerability of applications and computers to security threats.
Quorum executive VP Gary O’Dwyer said, ‘We have consistently been first to market with new, proven technology in the land and lease management arena with features such as XML based data export/import tools, web-based ad-hoc business intelligence tools, and integrated web-based GIS.’ Quorum expects all of its US-based Land Suite clients to operate live on the .NET platform by mid-2008.
Quorum also announced the acquisition of Integra Solutions of Dallas. Integra provides business intelligence consulting and education services and is a Business Objects Alliance Partner. The merger will allow Quorum to extend its expertise, providing industry standard software and consulting solutions to industries outside of the oil and gas market.
Energy Maintenance Services Group (EMS) is to install a pipeline integrity SCADA demonstrator at the Pipeline and Hazardous Materials Safety Administration’s (PHMSA) training center (formerly the TSI) in Oklahoma City.
SCADA demonstrations of new techniques in cathodic protection automation will leverage web monitoring and control. Remote access to the resource will be available so that remote users can train over the web. A static pipeline will be utilized to demonstrate techniques for pipeline inspector certification and training.
EMS Group’s strategic partners on this project include CalAmp for wireless data communications, Solarcraft for solar and back-up power systems, American Innovations (wireless remote monitoring and control) and NTG’s cell phone network connectivity.
Distributed temperature sensor (DTS) specialist Sensornet has teamed with UK-based FloQuest for the provision of data interpretation services for its portfolio of monitoring solutions. Under the license agreement, Sensornet will provide design, installation and commissioning services, with FloQuest providing data monitoring and interpretation.
Sensornet CEO Neale Carter said, ‘FloQuest’s focus on inflow performance and its understanding of data handling and interpretation will offer Sensornet clients a best in class value extraction technology.’ Sensornet has also just published a case study of well integrity monitoring for a high temperature well in the Middle East. The fiber optic DTS was deployed in conjunction with a tubing-conveyed perforating (TCP) gun, believed to be a world first. The TCP/DTS combination was used to optimize a steam injection program by monitoring steam breakthrough from adjacent injection wells via a continuous temperature profile of the complete well.
Calgary-based WellPoint Systems has acquired Bolo Systems of Denver, a supplier of integrated financial, land and production accounting solutions. WellPoint is to roll the Bolo package into its Microsoft Dynamics Axapta-based solution for oil and gas enterprise resource planning (ERP).
WellPoint CEO Frank Stanford said, ‘We have strengthened our position by improving our access to the sizeable US market with country specific strategies, products, expertise and infrastructure.’
Bolo’s most recent client is Calgary-based Nations Petroleum which uses the package to record revenue, expenditures, assets, production, lease records, and joint interest billing transactions. Nations has implemented Bolo’s Strategic Management Platform and Executive Dashboard. Bolo is now working with Nations to integrate its US and Indonesian-based operations.
Intermap Technologies has awarded a grant to Auburn University to research ways to save fuel using its NEXTMap 3D GIS road geometries. Initial focus is on the development of a predictive cruise controller and automatic gear shifting algorithm that calculates vehicle speed and gear selection to optimize fuel economy and operating costs.
Researcher Wei Huang said, ‘This project is innovative in that the system is tested with commercial 3D road geometry. The influence of road geometry and sensor accuracy on fuel economy will also be investigated. The GPS-based control system will reduce the heavy trucks’ fuel consumption using information from vehicle state estimators, the road geometry, and an optimizing control system.’ The optimizer takes input from the global positioning system (GPS) technology and Intermap’s 3D road geometry and figures when to accelerate, decelerate, or change gears going into and coming out of slopes and curves.
$3 billion saving
Early results show that fuel consumption can be reduced by up to three percent with no increase in travel time. A 3% fuel saving could represent annual US savings of around $3 billion, or a billion gallons of diesel. The Auburn study will also look at the tradeoffs between increased travel time and additional fuel savings. The study is also backed by Eaton Corp., supplier of heavy duty transmissions. Other NEXTMap applications include virtual tours, topographic maps and the addition of ‘interactive intelligence’ to airborne and satellite imagery.
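The arithmetic behind the headline figure can be sanity-checked with rough inputs. The consumption and price figures below are illustrative 2007-era assumptions, not numbers from the Auburn study:

```python
# Assumed inputs (illustrative, not from the study)
annual_gallons = 33e9     # US heavy-truck diesel use, gallons per year
price_per_gallon = 3.0    # pump price, USD
saving_fraction = 0.03    # 3% saving from predictive speed/gear control

gallons_saved = annual_gallons * saving_fraction
dollars_saved = gallons_saved * price_per_gallon
print(f"{gallons_saved/1e9:.1f} billion gallons, ${dollars_saved/1e9:.1f} billion")
# → 1.0 billion gallons, $3.0 billion
```

The result is consistent with the quoted ‘billion gallons’ and ‘$3 billion’ savings.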
A new ‘protocol’ from Det Norske Veritas (DNV) addresses what is described as ‘the continuing impact of major accident hazards’ in the oil and gas industry with a process safety management (PSM) tool, ‘isrs7.’ The Baker and Chemical Safety Board reports on BP’s Texas City disaster confirm DNV’s experience that existing safety management systems did not perform well in the face of a major accident.
Graham Bennett, DNV Energy’s downstream business leader commented, ‘There is a need for a fresh approach to effective PSM. The new isrs7-PSM tracks the status of multiple layers of hazard management protection such as process plant condition, people issues, management effectiveness and business processes. In addition, it helps drive the improvement process.’
DNV’s International Safety Rating System (ISRS) is used to measure, improve and demonstrate safety, environmental and business performance. The new service takes a holistic view of the management of major accident hazards. DNV has already undertaken a number of PSM projects and plans to leverage these as a reference base for benchmarking PSM performance.
IHS appears intent on cornering the market in energy research. Following its acquisition of Cambridge Energy Research Associates (OITJ, September 2004), IHS has now bagged Norwalk, CT-based John S. Herold, Inc. in a $48 million cash transaction. Herold provides in-depth analyses and key financial and operational data on more than 400 global oil and gas companies.
IHS president Ron Mobed said, ‘Herold shares our vision of providing high-quality critical information and insight, and adds commercial evaluations and company fundamental analyses to our oil and gas industry coverage. The acquisition expands our offering to operating companies, banking and investment communities, broadening and deepening the reach of IHS across the energy sector.’
Herold chairman and CEO Art Smith added, ‘The data, information and research capabilities of Herold and IHS are complementary and will enable insights from wellhead to Wall Street.’ Founded in 1948, Herold’s online transaction databases and deal analysis are used in ‘virtually every’ major oil and gas company. Herold also holds the annual Pacesetters Energy Conference.