January 2009


Peer to peer iRING

ISO 15926 network to demo at upcoming FIATECH tradeshow. Chevron and engineering partners to unveil iRING, an open source, peer to peer, oil and gas engineering data infrastructure.

The US FIATECH construction industry standards body and the Norwegian POSC Caesar organization have announced a new standards-based interoperability infrastructure, the ISO 15926 real time interoperability network grid (iRING). The project is part of an ongoing effort from the standards bodies to ease information transfer from engineering contractors to owner operators and to assure lifecycle data management of plant and production facilities.

The iRING’s web-based infrastructure, accessible by any company, will facilitate full specification ISO 15926 data exchange. The iRING is designed as a peer to peer service run from member companies’ firewall ‘demilitarized zones,’ with no central iRING authority.

A demonstrator, codenamed ‘Camelot,’ has just been kicked off to leverage ISO 15926 to model business objects, map legacy systems to the ISO 15926 Reference Data Set and to trial data exchange scenarios between member companies. Camelot components include web services, web clients and hosted applications. The demonstrator will provide interaction between the RDS and commercial applications, and will include prototype mapping and transformation tools, browsers and editors. Software for the prototype tools will be managed in an open source code hosting service such as SourceForge. Along with the demonstrator, Camelot deliverables include an ISO 15926 Implementers Guide, a Software Development Kit (SDK) and a Wiki for developers. The iRING components to be built by the Camelot subproject will be available as open source software under a BSD license, allowing redistribution and modification under certain conditions.

Camelot’s ‘storyboard’ includes the following ‘actors’: a data modeler, an application administrator and an engineering end user. The data modeler will be using the RDF template mapping developed in earlier ISO 15926 work. The iRING mapping editor and RDS/WIP browser will be used to map and integrate legacy applications. More detail will be added to Camelot’s storyboard as members deploy their iRING components and authoring tools in their DMZ.

The iRING can be deployed within a company to enable data exchanges between internal applications. The components are the same for both deployment domains. The approach will also work for a mix of internal and external applications.

Camelot’s ‘King Arthur’ is Bechtel’s Engineering Automation Manager Robin Benjamins. Camelot members will likely include Bechtel, Bentley Systems, Chevron, DNV, Fluor Corp., Hatch, NRX Global and Tata Consultancy Services. Preliminary results are to be presented at the upcoming FIATECH Conference in Las Vegas this April.


Petris grows ...

Petris expands its WINDS Enterprise infrastructure with acquisition of Intervera’s data quality toolset and Zeh’s upstream printing solutions.

Petris Technology has acquired both data quality boutique Intervera and the venerable upstream printing solutions provider Zeh Software. The Intervera acquisition sees Petris assume worldwide responsibility for the sales, service and future development of Intervera’s DataVera solution and expand its international presence into Canada. Intervera founder and president Paul Gregory is to become Petris’ VP, Data Quality. DataVera is to be integrated with the PetrisWINDS Enterprise data infrastructure to assure data cleanliness.

The acquisition of Zeh sees the addition of enterprise plotting and montaging solutions to PetrisWINDS Enterprise and a significant expansion of Petris’ client base and international reach. Additionally, Zeh’s seismic data management offering, SeisInfo, adds 2D and 3D seismic field data management to the Petris portfolio.

Petris president and CEO Jim Pritchett said, ‘Zeh’s applications are complementary to our product line and will enhance the value of our Recall and PWE solutions. We look forward to providing more solutions of significant value to our joint customers.’

See our interview with Pritchett below.


On computers in oil and gas, politics and the semantic web

Oil IT Journal editor Neil McNaughton, back from the Semantic Web in oil and gas workshop, reports on the state of play in RDF modeling in geology, engineering and ... politics!

When I was a whippersnapper, maybe 12 or so, circa 1960, computers were coming into the public eye. Curiously, one of the first real world uses in the UK was in the accounts department of Joe Lyons’ chain of coffee shops, neatly combining two of my future passions—but I digress. A personal interest at the time was politics and I frequented the school debating society and some memorable political meetings. I can’t say that I was terribly successful as a politico; I wasn’t comfortable with the disingenuity or the ‘robust’ discourse in which, frankly, I generally came off worst. It seemed to me at the time that the obvious thing to do with computers was to use them to find the ‘answer’ to these hotly debated political issues, to replace the debating societies and parliament, and in general run the country. Instead of endless debate you should be able to ‘feed’ everything into the computer and come up with ‘the answer.’ I think that at the time I assumed that the computer would come out on my side of the debate—but I digress again...

Even though this idea seems as ridiculous now as it did at the time to my relatives and friends, a lot of subsequent effort in the field of information processing has been devoted to finding ‘the answer’ to increasingly complex questions—even if it is not (yet) used to run the country. You can trace the evolution of the computer’s identity in how people describe it—from a ‘computer’ for doing stuff with numbers, through an ‘information processor’ and a tool for ‘information management,’ to even more clever devices capable of machine to machine interaction and ‘reasoning.’

Which brings me to our attendance at the first World Wide Web Consortium (W3C) Semantic Web in Oil and Gas (SWOG) workshop held last month in Chevron’s Houston location. Oil IT Journal subscribers can read our report from this event on page 6, and the presentations and ‘official*’ report are also available online.

If you don’t know about the semantic web, I suggest that a quick visit to Wikipedia may help you understand what follows.

I have editorialized before about the semantic web—notably after our attendance at an earlier W3C event**.

Very briefly, Tim Berners-Lee’s (TBL) vision was for a ‘next generation’ world wide web of ‘linked data.’ There is a lot of structured information tucked away in vanilla HTML pages—information about people, conferences and webcasts, company profiles and stock tickers. It would be nice to be able to bring all this together and assemble it into larger chunks of useful information. TBL’s idea was to use a simple ‘triple’ modeling construct—the Resource Description Framework, RDF—to tag all this useful stuff on folks’ web sites.
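
For the curious, here is what a triple looks like in practice. The following is a minimal sketch using the open source Python rdflib package; all names and URIs are invented for the example.

    from rdflib import Graph, Namespace

    # An invented namespace for the example
    ex = Namespace("http://example.org/")

    g = Graph()
    # One factoid, one 'subject, property, value' triple
    g.add((ex.TimBernersLee, ex.proposed, ex.SemanticWeb))
    g.add((ex.OilITJournal, ex.coversTopic, ex.SemanticWeb))

    print(g.serialize(format="turtle"))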

Since its introduction around the turn of the millennium, RDF has been something of a flop. On the one hand the calendars and stock quotes, in so far as they have been standardized (not much!), have mostly been done in XML ‘microformats,’ not RDF. On the broader data front, the hegemony of the relational database is unshaken. Data modelers seeking to innovate have mostly gone for XML over RDF—witness our own industry’s WITSML and PRODML efforts.

I think that the semantic web’s failure is partly due to a subtle difference in the kind of data we are talking about and what we mean by ‘data.’ Is a web factoid like ‘Britney Spears is dating Prince Charles’ (I made that up) data? How about a statement like ‘the fuel consumption of a Ferrari F50 is 75 liters/100 km’ (also made up)? Capturing the first factoid in an RDF statement is pretty trivial; the second is not so easy. If you have an engineering mindset, you’ll see the potential pitfalls of typos in units, unspecified conditions (at what speed?) and the likely requirement of other representations (fuel consumption in miles per gallon). Right away we are confronted with a data modeling issue. And believe me, RDF may be simple up front, but modeling stuff of minimal complexity produces some impenetrable RDF ‘graphs’ in no time at all.
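
To make the contrast concrete, here is a sketch (again Python with rdflib, everything invented) of the two factoids. The gossip is a single triple; the engineering statement balloons into a small graph once value, unit and conditions are modeled.

    from rdflib import BNode, Graph, Literal, Namespace

    ex = Namespace("http://example.org/")
    g = Graph()

    # The gossip factoid: trivially one triple
    g.add((ex.BritneySpears, ex.isDating, ex.PrinceCharles))

    # The engineering factoid: the value, its unit and the measurement
    # conditions each need a node of their own before the statement is usable
    m = BNode()  # the measurement itself
    g.add((ex.FerrariF50, ex.fuelConsumption, m))
    g.add((m, ex.value, Literal(75.0)))
    g.add((m, ex.unit, ex.litersPer100km))
    g.add((m, ex.atSpeed, Literal("unspecified")))  # the pitfall in person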

The SWOG workshop gave examples of both kinds of modeling. Geological constraints like ‘the Comblanchian is a member of the Dogger’ sound quite similar to the Britney Spears relationship. Such text-oriented factoids can be dumped into a triple store for machine ‘reasoning.’ Chevron has reported using RDF to gather collections of factoids in a way that mirrors master data management—or as Chevron’s Frank Chum put it at the SWOG, ‘like a souped-up business intelligence system.’ On the other hand, the FIATECH/POSC Caesar ISO 15926 RDS/WIP*** is built around engineering data relationships like the Ferrari example above. This is an exciting extension of RDF-based modeling into the engineering world. The engineers have had to invent some RDF extensions to achieve this, possibly, for the purists, breaking a few eggs in the process.

ISO 15926 is a flagship project not just for oil and gas but for the semantic web at large—not least for the new peer to peer deployment of the iRING (page 1).

One facet of meetings like the SWOG is the facility with which difficult terms are bandied about. My favorite is ‘ontology.’ If I had a penny for everyone who has heard this term used without having a clue what it means, I would probably have enough to buy that Ferrari by now.

After the event, I signed up with the ‘ontolog’ online community****. I was not disappointed. Ontology is an Alice in Wonderland concept that can mean whatever you like—from a list of ‘stuff,’ through hierarchies of more ‘stuff,’ to a categorization of the whole of human existence!

I found this of great comfort and have now downloaded Protégé and am busy inputting factoids from the modern politico-economic situation. I’m also hard at work on my political upper ontology—although I’m having a little trouble disambiguating ‘liberal’ between its American and European contexts. But I’m nearly there. Next month I hope to report on progress in fixing the world economic crisis and provide a definitive solution to the liberal/conservative schism that has held up progress for so long. Yes we can!

* www.oilit.com/links/0901_6

** www.oilit.com/links/0901_8

*** www.oilit.com/links/0901_9

**** www.oilit.com/links/0901_10


Jim Crompton on data, integration and semweb’s promise

Chevron’s iField guru thinks the data pipeline is kinked. Will semantics straighten it out?

In the keynote address to the World Wide Web Consortium’s (W3C) Semantic Web in Oil and Gas Workshop (page 6), Chevron’s iField program advisor Jim Crompton recalled the bygone days of the well organized paper file room, saying, ‘Life has never been that good since.’ Companies ‘lost control’ of data in the passage to the digital world. Surveys show that even today, 30-70% of our time is spent finding data. Knowledge workers may only have access to a small fraction of the information they need—so, ‘they give up looking and go with a best guess.’ Unfortunately, as the collective experience of the workforce diminishes, the quality of ‘best guesses’ is declining.

The information age has given us Google-type full text search. This is all very well, but for accurate search, you still need to tag documents and you need to know if you have found everything. The reality is that we can still only search accurately within a given environment. Chevron’s intranet search is OK, but does not include email and many databases. Search has become siloed.

Meanwhile the information pipeline is fed with increasingly large amounts of data from low cost sensors with implications for HSE, operations and maintenance. Downhole intelligent completions provide real time data and an exponential increase in data gathering capacity. Our modeling capability is likewise now ‘massive’. But between the expanding data volumes and the modeling capability there is a ‘kink in the pipeline,’ a yawning gap between data collection and modeling/analysis for decision support.

But we are reasonable people! How did we screw it up? The answer is in the economic cycles – when the cycle is down, there is no money for information management. When the going gets good, acquisitions and mergers kick in. Chevron has experienced ten years of data hell following the Texaco merger! After a merger, nobody knows how the legacy systems work. The next worst thing is when someone leaves and hands over their spreadsheet. Engineers are not very good at documenting what they do.

The business impact of the current state of affairs is that engineers use month-old data for decisions. Optimization works fine at a small scale, but multiple attempts at optimization at different scales hamper the ‘big picture’ optimization. It is hard to react to dynamic changes such as water breakthrough or equipment failure. Today we are reactive and we need to be more proactive.

Regarding the semantic web, haven’t we been here before? Perhaps not exactly, but there have been other attempts to bridge IM and the business. Even IT folks shy away from this problem.

Crompton believes that the path to data sanity lies in data governance, a reference and master data architecture spanning structured and unstructured data, and data quality. Such an information architecture is a new thing for Chevron. Standardization is important but ‘we need to go further and allow for portfolio rationalization, we need a common language.’

Crompton advocates a three tier architecture. SOA may be a way to achieve this, but oil and gas does not yet have a taxonomy to support it—a potential role for the semantic web. This could be applied to legacy data by re-tagging and structuring the data. The industry has had some success with XML data protocols. Chevron came late to the WITSML party but has been more proactive with PRODML. There have been some successes here, but Chevron is still ‘kicking the rock’ on internal deployment. We already tried the one big data model approach and it failed miserably. Today Chevron has 600TB of data (70% technical, 25% business, 5% financial) and this is growing. There are currently 300 million Office documents.


Oil IT Journal Interview—Jim Pritchett, Petris Technology

With Intervera and Zeh in the bag, Petris president and CEO is looking for more strategic targets.

OITJ—What’s the rationale behind Petris’ acquisitiveness?

Pritchett—In 2009, with the current economic climate, everyone will be looking at mature fields. These systems will hit the bull’s eye. Our ongoing program of acquisitions is designed to evolve our PetrisWINDS Enterprise (PWE) integration framework across more upstream application domains. Both Intervera and Zeh fit with our multi-vendor integration model. PWE is designed for integration with our own applications and those from the main third party vendors.

OITJ—Both Zeh and Intervera are already ‘multi vendor’ providers.

Pritchett—Exactly. Intervera’s quality and master data management offerings are obviously complementary to our data infrastructure. But Zeh’s graphics and printing offering can also be seen in the light of hydrocarbon data visualization and integration across disparate sources. The Petris framework now adds automated data quality and consistency to what is becoming an enterprise wide business intelligence system—with business process management to top it off. Also, Zeh’s SeisInfo brings us a seismic data management and data quality system for field seismic data.

OITJ—What’s next?

Pritchett—We’ll continue to look at potential acquisitions that fit with our cross domain integrations strategy. The Intervera and Zeh acquisitions have also brought us better global coverage, Intervera in Calgary and Zeh in Perth. Petris sees this as a long term technology development play. We will be working on acquisitions for a time.

OITJ—How do you characterize Petris today, as an application developer or infrastructure supplier?

Pritchett—We are both. We would love to be a full vendor to everyone. Ultimately we may expand beyond E&P and pipeline. But the reality is that applications come and go but information stays.

OITJ—You mean that Petris’ strength lies as an integrator of third party applications.

Pritchett—Yes. We don’t really want to do the whole thing. Our WINDS Enterprise framework means that when we do an acquisition, we are well equipped to achieve a quick integration of the new portfolio. We are not just buying the new company’s balance sheet.

OITJ—How much did you pay for Zeh and Intervera?

Pritchett—We are a privately held company so I’ll pass on that one. I can say that we are well entrenched in the second tier of software houses, behind Landmark, Schlumberger and Paradigm.


R5000 release adds parallel 3D seismic processing

Landmark’s new SeisSpace environment supports interpretative seismic processing.

Halliburton’s Landmark software arm has released SeisSpace R5000, the latest manifestation of its seismic imaging package. SeisSpace R5000 supports time and depth domain processing and QC of large seismic data volumes and is claimed to be an ‘open’ platform that supports proprietary technologies and ‘specialty’ processing services.

R5000 code can take advantage of the latest multi-core CPUs and storage devices to perform compute intensive tasks such as 3D noise suppression using a 3D FKK filter, spatial mix and FX-Y decon and a new surface-related multiple elimination tool. R5000 also targets the seismic interpreter with ‘collaborative’ workflows that assure closer cooperation between processors and the asset team. The software supports visualization of 3D, 4D, and ‘5D’ pre-stack volumes with a direct connection to Landmark’s OpenWorks project database. SeisSpace can be downloaded from Landmark’s Software Manager (LSM) along with release notes and an installation guide.


GPU in seismic processing, ‘don’t try this at home’

Headwave presentation reveals potential of GPU-based seismic compression.

Speaking at the Las Vegas SEG meet late last year, Headwave’s Steve Briggs provided an inside track on the company’s work on using graphics processing units (GPUs) in seismic data analysis. Headwave’s focus is prestack volume visualization and analysis. Prestack visual quality control mandates quick look access to data sets of terabyte size. Headwave’s technology uses wavelet data compression to enable this to be performed on a notebook PC. This opens up access to processing that was hitherto only available to the processing center, such as prestack horizon picking.

The technology can run on GPU or CPU. With NVIDIA’s CUDA, programming the GPU is much easier, but Briggs suggests, ‘don’t try this at home.’ While you can do a lot of ‘cool stuff’ with GPUs, ‘they are not going to handle full datasets.’ GPUs are good for testing but still present bottlenecks and roadblocks, and this is in the face of bigger datasets, now around a terabyte for 10 US offshore blocks prestack. Headwave uses GPUs to compress data for visualization and to accelerate attribute calculation.

Current Gigabit Ethernet infrastructure is ‘painful’ as you need to move lots of bytes. Even compressed data sets can take weeks to get from tape. Storage is a problem—disk cache is useless for these volumes. You need RPMs, lots of spindles and multiple 10 Gigabit Ethernet or InfiniBand pipes. Compression needs large JBOD arrays and maybe SSD. But investment in infrastructure bandwidth really does pay in terms of visual and computational bandwidth. More from contact_eame@headwave.com or contact_americas@headwave.com.
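
For readers who want the gist of the compression step, here is a purely illustrative Python sketch using the open source PyWavelets package. The trace, wavelet choice and threshold are invented; this is not Headwave’s actual code.

    import numpy as np
    import pywt

    # A synthetic trace stands in for real prestack seismic data
    trace = np.random.randn(4096).cumsum()

    # Decompose, discard small wavelet coefficients, reconstruct a quick look
    coeffs = pywt.wavedec(trace, "db4", level=5)
    threshold = 0.5 * np.std(trace)
    coeffs = [pywt.threshold(c, threshold, mode="hard") for c in coeffs]
    quicklook = pywt.waverec(coeffs, "db4")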


Discrete cosine transform for fluid flow characterization

Shell-backed MIT research team maximizes return on limited subsurface measurements.

New mapping technology developed by the Massachusetts Institute of Technology’s (MIT) Department of Civil and Environmental Engineering has the potential to ‘significantly’ increase oil recovery. The new technology uses a discrete cosine transform (DCT) to describe spatially distributed reservoir properties such as permeability. MIT’s experiments suggest that the DCT makes the history matching problem better-posed and improves the realism of reservoir property estimates.
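
As a rough illustration of the idea (a Python sketch using SciPy’s DCT routines; the grid size and truncation are invented and this is not MIT’s code), representing a permeability field by a few low-order DCT coefficients collapses thousands of unknowns into a handful, which is what makes the history matching problem better-posed.

    import numpy as np
    from scipy.fft import dctn, idctn

    # An invented 64x64 permeability field stands in for a real model
    perm = np.random.rand(64, 64)

    # Keep only the 8x8 lowest-frequency DCT coefficients
    coeffs = dctn(perm, norm="ortho")
    mask = np.zeros_like(coeffs)
    mask[:8, :8] = 1.0

    # History matching now estimates 64 parameters instead of 4,096
    smooth = idctn(coeffs * mask, norm="ortho")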

MIT researcher Behnam Jafarpour, now an assistant professor in petroleum engineering at Texas A&M, said, ‘Our studies indicate that this approach has the potential to improve current reservoir characterization techniques and to provide better production forecasts and development strategies.’ The technique uses oil flow rates and pressure data from oilfield wells to create a realistic image of the subsurface reservoir. The MIT technique characterizes the ‘complex subsurface pathways’ that convey oil to wells even when they are beneath the resolution of seismic and well observations.

Co-researcher Dennis McLaughlin explained, ‘The methods we’ve developed extract more information from those limited measurements to provide better descriptions of subsurface pathways and the oil moving through them.’ This research was funded by Shell International and will be the subject of a paper in an upcoming issue of the SPE’s Journal of Petroleum Technology.


Ingrain and Knowledge Reservoir team on digital rock physics

Companies to extend 3-D NanoXCT x-ray tomography with enhanced modeling workflows.

Knowledge Reservoir has teamed with Ingrain to promote the use of Ingrain’s ‘digital rock physics’ measurements in reservoir characterization. Ingrain uses a combination of 3-D NanoXCT x-ray tomography and high performance computing to analyze the reservoir in what has been described as ‘quantitative virtuality’ (OITJ July 2008). Ingrain technology works on core samples or drill cuttings and is claimed to speed turnaround time compared to regular laboratory special core analysis.

Knowledge Reservoir is to develop enhanced reservoir modeling workflows around the large amount of data that is generated by Ingrain’s technique to support clients’ rock property studies. Knowledge Reservoir president and CEO Ivor Ellul said, ‘Ingrain’s digital rock physics lab represents the future of rock property analysis. This deal means that our asset team consultants will further assist clients in maximizing recovery, reducing costs and improving understanding of the reservoir.’


Software, hardware short takes

Advanced Geotechnology, Cyviz, Ernst & Young, Pansoft, Energy Solutions, Safe Software, Stone Bond, Badley’s, Caesar Systems, Energy Navigator, Entero, IES, SGI, Transzap, Tibco, WellEz.

Advanced Geotechnology has announced the 3.8 release of STABView, its well planning package with new wellbore cross-section plots, spider and tornado plots for sensitivity analyses and borehole collapse mode comparisons. A quantitative risk analysis module is planned for later this year.

A survey by Cyviz and Kristensen Consulting found a high failure rate in oil and gas collaborative environments (CE) due to ‘poor reliability and poor usability in real life contexts.’ Better user interfaces, standardization and training are identified as key remedies.

Ernst & Young is to assist Chinese energy ERP solutions provider Pansoft with SOX 404 compliance.

Energy Solutions has released V 3.1 of PipelineManager with instrument analysis, data playback, a SCADA simulator for leak studies and a web service API.

Safe Software’s FME 2009 offers faster access to spatial data and optimized conversion of large volumes of spatial data. FME Server now natively supports 64-bit Windows, Linux and Solaris. New supported formats include Adobe Geospatial PDF, AutoDesk 3ds, CityGML, IBM Informix Spatial and OpenStreetMap (OSM) XML.

Stone Bond has added remote function call support for SAP inter-operability to its AppComm package. AppComm now supports bi-directional integration with SAP, RFCs, iDOCs and web services.

Badley’s has released Move2009.1 with a native 64-bit version on Windows and Linux and improved component consistency, cross-component workflows and data transfer. Badley’s is now working on 2D and 3D modeling of thrust belts and on direct links to Petrel and OpenWorks.

Caesar Systems’ PetroVR V6.3 will offer greater insight into the impact of completion activities on potential decisions, particularly important in deepwater projects. Drilling plans and schedules have been enhanced and wellbore drilling and completions are treated independently.

Energy Navigator’s Value Navigator application meets the new SEC guidelines for reporting of probable and possible reserves. The SEC’s new rules are the first revision in over twenty-five years.

EnteroOne 2009 sees the introduction of a ‘single-source’ data architecture and a foundation for real-time sharing of business information. Entero’s interface can be customized to external systems.

Schlumberger unit IES’ PetroMod 11 petroleum systems modeler offers local grid refinement, seismic-derived facies analysis and improved fault modeling. Other enhancements target data exchange and a new 14-component phase kinetics module includes secondary cracking and automated fluid property calibration.

SGI’s Altix system now offers up to 8TB of global shared memory. A flagship Altix 4700 deployment at the Ames facility ran 2,048 cores and 4TB of memory under a single Linux system image.

Transzap reports that its Oildex e-payable service has again achieved SAS 70 Type II certification for the design and execution of its operational controls. SAS 70 audit assures customers and their auditors that controls and procedures are in place to manage and protect their data.

Tibco Spotfire V 8.1 now embeds S-PLUS, Tibco’s integrated development environment that scales from desktop to ‘gigabyte class’ data sets. S-PLUS is a library of statistical and mathematical algorithms for portfolio management and business process modeling with Bayesian statistical methods. The package includes a Wavelets package for image, signal and time series analysis and a SpatialStats module for spatial data. A new Spotfire S+ Eclipse workbench offers compatibility with the open-source ‘R’ environment.

WellEz has upgraded its web-based oilfield operations reporting service with M2E, an Excel reporting and charting add-on, and a new helpdesk, ‘WellEz on Demand.’


ITF reports successful 2008, poneys up £9 million for 2009

UK-based fund backs spin-outs from several UK universities and software start-ups.

The UK-based Industry Technology Facilitator (ITF) reports an ‘exceptional’ 2008 and is back in 2009 with more support for innovative technologies that address the challenge of maximizing the hydrocarbon recovery from the UK Continental Shelf. The ITF is a not for profit organization owned by 21 majors and service companies. The ITF is now inviting proposals for solutions that enhance reservoir understanding and development, in particular with low cost technologies for production enhancement, water control and chemical technologies.

2008 saw nine new technology implementations, 29 joint industry projects launched, £9.4 million of funding secured and 5 field trial projects launched. ITF MD Neil Poxon said, ‘The key to ITF’s success is the delivery of high quality, innovative projects that address our members’ needs. During 2008 we worked to strengthen our facilitation process, ensuring that it offers value to our members—so that we can help to address the big issues and unearth solutions that would not be developed without a collaborative industry effort.’

An example of the ITF’s 2008 successes is the COFFERS project carried out by Edinburgh University. COFFERS is a statistical reservoir model that uses correlations between injectors and producers to calibrate faults and fractures that impact production. The software is now commercially available and the university has undertaken its first North Sea contract. Another example comes from Rockfield, which used an ITF grant to develop geomechanics software to model fractures and faulting. Sponsors are already using Rockfield’s software in their field studies. Two other projects address flow assurance: Heriot-Watt’s hydrate monitoring and early warning system, now commercialized through a university spin-off, Hydrafact, and Manchester University’s Acoustek package, which uses acoustic technology to detect blockages and leaks in gas pipelines at distances of up to 10km.

R&D themes for 2009 include carbonate reservoirs, long tie-backs, imaging in challenging environments, corrosion mitigation, well intervention and data communications. An idea of the amount of support available comes from the £5 million ‘indicative investment’ allocated by the Trustee Savings Bank. More from www.oil-itf.com.

Please note the following correction to the above article, which appeared in the February 2009 edition of Oil IT Journal.

Folks at the Industry Technology Facilitator were understandably upset when we referred to them as the ‘Industry Task Force’ in our January ‘Folks, Facts’ section. In the same issue we also made multiple errors in our article ‘ITF reports successful 2008.’ We received the following corrections from ITF. ‘ITF is a not for profit organization therefore the phrase “ITF poneys up £9 million” is inaccurate. ITF facilitates technology development by securing funding from its members. Also, as a global organization, ITF’s remit and objectives go far beyond the UK. Finally, the RFP for solutions to enhance reservoir understanding and development was not an ITF call. It was made by the TSB [the Technology Strategy Board—not as we gaffed, the Trustee Savings Bank!]. ITF was merely supporting the TSB to promote their event to developers.’ Our abject apologies to ITF and the TSB!


W3C Semantic Web in Oil and Gas Workshop, Houston

Oil IT Journal attended the World Wide Web Consortium’s first Semantic Web in Oil and Gas Workshop, hosted by Chevron. The technology underpins industry programs including Chevron’s Integrated Asset Management and ‘Exploratory Pilot’ projects and the ISO 15926 ‘WIP.’ Other projects address the semantic web in geology and natural language processing.

The World Wide Web Consortium’s (W3C) Workshop on the Semantic Web in Oil and Gas* was held in Chevron’s Houston offices with attendance from a good cross-section of industry decision makers and semantic practitioners. Attendees hailed from BP, Oxy, Shell, Total, Halliburton, Energistics and Schlumberger inter alia. On the oil company side, Chevron is the most enthusiastic proponent of the semantic web although the technology still hovers between academic proof of concepts and enterprise deployment.

W3C CEO Steve Bratt traced the evolution of the web from hyperlinked documents (Web 1.0) to ‘one web’ of creators and consumers (Web 2.0) and now, linked data (Web 3.0) a.k.a. the Semantic Web. RDF is the core semantic web standard – using a simple ‘subject, property, value’ triple to describe anything. This turns the web into a ‘big global relational database.’ According to Bratt, the Gartner Group is ‘very positive’ but sees full take up of semantic data by 2017 and by 2027 for a ‘semantic environment.’

The healthcare/life sciences (HCLS) industry is the W3C’s poster child for semantic web take-up. HCLS, like oil and gas, has been plagued by barriers to interoperability including commercial applications, external resources, a lack of APIs and data mismatches. A possible outcome of the workshop would be the inception of an Energy/Oil and Gas interest group along the lines of the HCLS group above. In the Q&A, Bratt acknowledged that growth in HCLS has been slow.

Schlumberger’s Bertrand du Castel offered a short history of upstream data initiatives, noting the failed effort in the 1990s to agree on a common data model. Du Castel spoke with first hand knowledge: he was head of POSC when it was an $8 million per year operation. In terms of data management standards, ‘nothing came out of it, an ROI of zero!’ More recent attempts to find ways to speak to each other through common exchange formats like WITSML and PRODML have taken root. Today, we have a good model of what exploration is and an understanding of what we are all talking about. Now we need to go to the next level and reap the value. Mistakes were made in the 1990s, but now ‘we are getting back on track.’

Chevron’s main semantic effort is the three year Integrated Asset Management (IAM) R&D project that was carried out at the University of Southern California’s CiSoft department. CiSoft’s Ram Soma described IAM as a ‘comprehensive transformational approach’ to integrated oilfield operations that sets out to increase integration, enable ‘what if’ scenarios, create a knowledge base and reduce risk. The ‘non disruptive’ technology works across previously non interoperable data silos. IAM is now seeing real world deployment through a technology transfer project with UK-based Microsoft developer Avanade. IAM’s metacatalog is an OWL triple store. Semantic web technology provides an expressive and rich data model suited for inference and rule based reasoning; it is also vendor-independent. The three level design includes a domain-independent upper ontology. Beneath this are domain-level models of asset elements and finally application and workflow specific ontologies.

Frank Chum presented Chevron’s position paper. Chevron has published ontologies for information integration in oil and gas on the W3C site**. Chevron’s problem areas include the difficulty of ‘semantic reconciliation’ of enterprise metadata, the standardization of information and integration across WITSML, PRODML, ISO15926, PPDM etc. The ‘N° 1 role of the semantic web is data integration across applications.’ Chevron’s Exploratory Pilot seeks to achieve the ‘holy grail’ of enterprise search by linking technical data and documentation. Chevron developed a semantic metadata store for technical data – ‘in a way that brought value and did things that were not possible before.’ The Exploratory Pilot mustered metadata from Unix-based SeisWorks and GoCad projects, building an ontology and RDF data store. RDF proved ‘extremely useful to us in bringing together factoids of unrelated information.’ Chevron found that the semantic web was like a ‘souped-up business intelligence system.’ It does require a lot of effort to ‘corral’ metadata.

In a wide ranging Q&A, several oil companies expressed their hopes for semantic technologies. For Total, the expectation is that the semantic web will help with knowledge transfer to the next generation of oil industry workers and that the semantic web can help locate information that you didn’t know existed. BP stated that it had started down the path of a service oriented architecture, but didn’t seem to be getting huge benefits. BP was attending the workshop to check out the ‘next IT wave.’ For Schlumberger, the move to deepwater and harsh environments requires new technology that goes beyond human capabilities. All of which points to a long term move to automation and technologies to support this. Shell noted the semantic web’s potential in exploration for capturing ‘story telling’ and subtle inference from geological models. Speaking from an engineering perspective, Fluor noted that everyone is now losing money due to the lack of interoperability. ISO 15926 is changing this, enabling interoperability between engineering partners. Semantic techniques will become mainstream if they are used.

Fluor Corp.’s Onno Paap presented a paper on ISO 15926 data modeling with RDF/OWL. Today’s engineers are data mappers; they spend all their time ‘yellow lining’ documents, checking PDF documents against the database. With better data management, ‘one engineer could do the work of three!’ Commenting on the W3C’s preferred semantic modeling building block, Paap said, ‘triples are too limited in data description, you need more than just “subject,” “relation,” “object.”’ Hence the ISO 15926 use of ‘templates,’ ‘a pattern for facts.’ Each participant exposes its data through a ‘façade,’ and façades are grouped in a ‘confederation of participating façades.’ A laptop with a façade browser sends SPARQL queries to the ‘confederation.’ Fluor uses 10,000 equipment vendors on 800 current projects. Paap’s ‘façade’ concept did not go down well with the W3C’s purists!
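
For a flavor of what such a query might look like, here is a hypothetical sketch (Python with rdflib against a local file; the file, namespace and property names are invented and are not the actual ISO 15926 vocabulary).

    from rdflib import Graph

    g = Graph()
    g.parse("facade_export.ttl")  # a hypothetical façade data dump

    # Find all pumps and their design pressures
    query = """
    PREFIX ex: <http://example.org/rds#>
    SELECT ?pump ?pressure WHERE {
        ?pump a ex:CentrifugalPump ;
              ex:designPressure ?pressure .
    }
    """
    for pump, pressure in g.query(query):
        print(pump, pressure)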

Jean-François Rainaud, a researcher at the French Petroleum Institute (IFP), showed how the semantic web has been used to enable ‘intelligent’ document search in the context of a CO2 storage project. The Energy-Web Ontology Knowledge Hub (E-WOK) process begins with the annotation of the document collection down to the document ‘fragment’ level. The semi-automated tagging process uses natural language processing to extract significant words which are ‘conceptualized’ in a domain ontology. Existing semantic resources have been re-used, including Dublin Core, GEON, NADM and GeoSciML. Interaction with a subject matter expert is required to define concepts. OWL is used to describe data modeling of geological formations including formation boundaries, faults and fractures.

Speaking on behalf of ENI, Brooke Aker of Expert System stated that the Italian major has widely adopted the company’s Cogito semantic platform. Cogito offers natural language processing, including morphological and grammatical analysis, used to ‘disambiguate’ words with more than one meaning. ENI’s semantic network contains 350,000 words and 2.8 million relationships. A search for ‘China’s nuclear energy strategy in 2020’ winnowed 25 relevant documents from millions in a collection. The technique is good at capturing ‘weak signals.’

David Norheim described Computas’ Active Knowledge System for Integrated Operations (AKSIO). This was designed to stop Norwegian operators repeating errors and also to help with the ‘big crew change.’ AKSIO is an ‘active socio-technical system for experience transfer in drilling,’ sponsored by StatoilHydro. AKSIO regards a drill rig crew as a ‘friend of a friend’ (FOAF) network. AKSIO generates ‘experience reports’ with semantics to screen and annotate knowledge and to build a searchable knowledge base. This can be filtered by discipline, operation, equipment state, etc. The AKSIO drilling ontology (in OWL-DL) was created by subject matter experts (SMEs) and knowledge engineers using ‘question driven query scripting.’ Incident reports are routed to SMEs for screening and annotating with the domain ontology. The result is an increased rate of knowledge reuse, better take-up of best practices, fewer repeated mistakes and process improvement.

* www.oilit.com/links/0901_6

** www.oilit.com/links/0901_2

This report is an extract from The Data Room’s Technology Watch report from the W3C Semantic Web in Oil and Gas event—more from tw@oilit.com.


World Business Research Digital E&P conference, Houston

Highlights from knowledge and data management presentations by Repsol-YPF, BP, Devon and ENI.

The first World Business Research Digital E&P Conference was held in Houston late last year. Repsol-YPF CIO Agustin Diz kicked off the proceedings with a personal journey through information management (IM) and a reflection on how the value of an IM project can be estimated. Valuations need to include the time taken to implement, customer expectations, risks and dependencies, and resource requirements, along with the return on investment. A simple scoring system is used, but weighting the individual scores requires a stakeholder consensus. Repsol-YPF has a portfolio of IM project proposals; the big question is, which project is best for the company?

A framework is required that holds the relationships between drivers, project expenses and ‘income.’ This allows problem areas to be identified, such as potential information loss due to a change of data ownership, and clarifies the management time scales involved. Well logs for instance have a ‘full life cycle’ requirement that differs from the short term requirements of a daily drilling report. The trick is to look across different information flows, identify the gaps and then bring everything together to create corporate objectives and a balanced IM project portfolio.

BP Technology Director Paul Stone investigated the impact of digital innovation in oil and gas. BP’s innovators are looking for ‘game changing’ technologies. One such project was the introduction of GE into oil and gas with projects including wireless automation and workforce automation. A game changer starts with an idea and ‘evangelization,’ before there is a detailed technology blueprint. Next come pilot projects and finally, ‘at scale’ transition to business segments. Stone notes, ‘there is no innovation until technology is adopted!’ The scale of the first phase, which generally lasts about a year, can be seen from the GE game changer. Here the idea was to use predictive analytics to anticipate and fix equipment problems on production facilities. This mobilized around 50 vendors working to explain and evangelize the technology before discussing specifics and agreeing on proof of concept pilots. The equipment health monitoring trials showed how small changes in compressor vibration can herald an incipient problem*.

Devon Energy CIO Jerome Beaudoin described how a data rich E&P environment can enhance workforce efficiency. Data management is still ‘hot,’ but we need to improve confidence in our data. Today’s data ‘reality’ remains the spreadsheet! Users are comfortable with spreadsheets and know the data has not been manipulated by IT!

Beaudoin suggests working from this state of affairs to identify what users see, to be able to clean the data and identify ownership. Devon is working on ‘intelligent data recognition,’ managing unstructured content in a system of reference. For Devon this means Endeca’s Information Access Platform. Endeca has allowed Devon to locate and manage data that was previously impossible to locate and to access all of the ‘authoritative sources’ of well data.

Luigi Salvador, ENI’s chief knowledge officer stressed that, ‘People are the key factors because technology and organizational changes can only succeed when they are accepted by people.’

Only people can make sense of data, so the answer is a combination of tools and behaviors. We also need more willingness to share and ‘less individual, competitive behavior.’ ‘Knowledge sharing is so much more productive.’ In some ENI units users spend up to 10% of their time exchanging knowledge, helped by a knowledge facilitator. The answer is not Taylorism. Complex problems require complex solutions and the knowledge enterprise requires actors, not executors.

* More on this in the current issue of BP’s Frontiers Magazine – www.oilit.com/links/0901_1.

Correction (Oil IT Journal, February 2009) Endeca points out that ‘Devon is not an Endeca client. Jerome Beaudoin made it clear that Devon implemented its E&P Portal Solution internally. Endeca was presented as a new technology that appeared to offer an alternative to the custom coded application.’ Our apologies to Jerome Beaudoin, Devon and Endeca.


Folks, facts, orgs ...

CygNet, AGR, AJM Petroleum Consultants, Aker, ARKeX, Atlas Pipeline, Baker Hughes, Barco, Total, BP, CO-LaN, E3 Consulting, Emerson, FileTek, Geomodeling, Gray Wireline, Hess, Seismic Ventures, Ingrain, ITF, Merrick, P2ES, Palantir, Pioneer Drilling, The Pipeline Group, Petrosys, TerraSpark, Roxar, Schlumberger, Siemens, Tibco, Tieto, WellPoint Systems.

Steve Robb, VP of Business Development, is to head-up CygNet Software’s new office in Calgary.

Lasse Øvreås has been appointed VP inspection and integrity services with AGR’s Bergen, Norway field operations unit. Øvreås hails from Aker Solutions.

Barry Ashton, COO of AJM Petroleum Consultants, has been elected to the executive committee of the Society of Petroleum Evaluation Engineers’ Board of Directors.

Aker Solutions has announced changes in its executive management team. Gary Mandel is VP process and construction. Jarle Tautra is VP energy development and services.

ARKeX has appointed Jim Sledzik to its board and Stuart Gibson as CFO. Sledzik comes from WesternGeco.

Eugene Dubay has been appointed president and CEO of Atlas Pipeline Partners. Dubay was previously COO Continental Energy Systems.

Russ Cancilla has been named VP of HS&E and security by Baker Hughes.

Martin De Prycker has resigned as CEO of Barco and is replaced by Eric van Zele.

Marc Blaizot has succeeded Jean-Marie Masset as Total’s senior VP Geosciences.

Lamar McKay is chairman and president of BP America, succeeding Bob Malone who has retired.

Eduardo Inglez is now an associate member of CO-LaN.

E3 Consulting has hired Mark Juneau, as Executive Director. Juneau comes from his own firm, Juneau & Associates.

Emerson has opened new regional headquarters in Dubai, with more than 300 employees.

FileTek has appointed Gary Szukalski as president and chief marketing officer.

Geomodeling Technology has appointed Les Dabek and Peter Phillips as product managers.

Jim Meneely has been appointed interim CEO of Gray Wireline and David Apseloff executive VP and CFO. Meneely was previously with Halliburton.

Greg Hill is now president worldwide E&P with Hess Corp. Hill joins Hess from Shell.

Steven Rutherford has joined Seismic Ventures as director of its new direct hydrocarbon detection division. Rutherford was previously with Anadarko.

Ingrain has opened a digital rock physics lab in Rio de Janeiro, Brazil.

The UK-based Industry Technology Facilitator is inviting students to take part in a ‘talent development’ competition by submitting Masters research topics related to the upstream.

P2 Energy Solutions has appointed David Verdun as CTO. Verdun hails from Paradigm.

Palantir Solutions has named Dan Fichte VP, North American operations. Fichte was previously with Energy Navigator.

Philippe Flichy has left Merrick Systems and is now available for consulting.

Pioneer Drilling has appointed Lorne Phillips as executive VP and CFO and Carlos Pena as VP. Phillips comes from Cameron International Corporation, and Pena was formerly with AT&T.

Val Jackson has joined The Pipeline Group as client and community affairs manager, heading up the company’s Houston sales office.

Tom Robinson has left Petrosys and is now VP Sales and Service with TerraSpark Geosciences.

Roxar has appointed Serena Arif as manager, Europe and Africa for its Flow Measurement Division. Arif was previously co-founder and business development manager at PolyOil, UK.

Ken Havlinek heads-up Schlumberger’s new Technology Center in Calgary.

Tom Blades has been appointed CEO of Siemens Energy’s Oil and Gas Division, succeeding Frank Stieler. Blades previously worked for Schlumberger.

Tibco has appointed Murray Rode as COO and Sydney Carey as executive VP and CFO.

Nina Christiansen has been appointed Alliance Director at Tieto. She hails from Unisys. Tieto is the new brand name of TietoEnator Corp.

WellPoint Systems has appointed Richard Slack as president and CEO succeeding Frank Stanford.


Done deals

geoLOGIC, Iron Mountain, Germanischer Lloyd, Seismic Equipment, SensorTran, Helix, IFP, BRGM, ModViz, NVIDIA.

geoLOGIC has acquired Calgary-based Whitehot Innovations. The deal includes all proprietary technology and surrounding assets. Whitehot staff, including president Lori Adams, have transitioned to geoLOGIC and are now working to integrate Whitehot’s QFind document management system with geoSCOUT, geoLOGIC’s E&P decision support system. Adams started Whitehot in 2007 after purchasing QFind from her previous employer, Rapid Technology.

Information protection and storage service provider Iron Mountain has gained entry to the S&P 500 Index, in the S&P 500 GICS (Global Industry Classification Standard) Industrials Economic Sector and the Diversified Support Services Sub-Industry Index.

Germanischer Lloyd has acquired Singapore-based International Refinery Services adding risk management expertise and advanced inspection techniques to its oil and gas portfolio. Terms of the deal were not disclosed.

Private equity firm Perseus LLC has bought in to Seismic Equipment Solutions (SES). The leveraged deal was arranged by Westlake Securities in the face of ‘extremely difficult conditions in the credit markets and a precipitous decline in energy prices during the process.’ Houston-based SES rents and sells equipment to seismic contractors and oil companies worldwide.

Austin, TX-based distributed temperature sensor (DTS) specialist SensorTran has raised $3.5 million venture capital, to be used for working capital and to accelerate the growth of SensorTran’s Smart Grid and wellbore monitoring businesses. Participants in the round included Advantage Capital Partners, Expansion Capital Partners, WHEB Ventures, and Stonehenge Capital Company.

Helix Energy Solutions has sold approximately 6% of its stake in Cal Dive for $86 million. The stake was acquired by Cal Dive as a stock repurchase. After the sale, Helix’s Cal Dive share will reduce to 51%.

The French Petroleum Institute (IFP) and the French Geological Survey (BRGM) have signed a research partnership agreement for the development of software tools for the study, dimensioning and monitoring of geological CO2 storage facilities. BRGM and IFP are both members of CO2GeoNet, the European network of excellence on geological CO2 storage.

Finally, though this news is a little stale, those who, like us, were puzzled over the disappearance of ModViz will no doubt be interested to learn that the company was acquired by Nvidia Corp. in April last year. ModViz provided complex 3D data visualization on high-performance computing clusters.


PPDM ‘What is a Well?’ tool rolls-out

Professional Petroleum Data Management Association provides interactive well dictionary.

The Professional Petroleum Data Management Association, formerly the Public Petroleum Data Model Association, has just released an online interactive tool to its membership as a deliverable of its ‘What is a Well?’ (WIAW) workgroup. WIAW is developing baseline definitions for the structural components of a well and for the well life cycle to help industry understand how a well evolves over time. The first phase of WIAW focuses on definitions and mappings to key regulatory agencies in the US and Canada. Subsequent phases will examine vendor data and application software.

PPDM notes that, ‘The business entity we call a well has become increasingly complicated. Corporate and regulatory processes have evolved to manage this in different ways resulting in a mixture of systems and data sources that are difficult to integrate and use effectively.’ WIAW sets out to be the ‘Rosetta stone’ for consumers of well data.

Critical to the process are the different regulatory environments in which oil companies operate. The online interactive tool offers users access to terminology definitions emanating from regulators including the US Department of the Interior’s Minerals Management Service and Canadian regulators. Extension to the international environment may be considered when the methodology has been proven for the US and Canada. A Chevron internal analysis was the starting point for the initiative.


Virtual Geomatics for El Paso pipeline management

LiDAR software suite to support route corridor visualization and planning.

El Paso Corporation has chosen Virtual Geomatics’ VG4D Production Manager Suite of LiDAR software tools to manage the mapping of its pipeline network. VG4D supports LiDAR data management, filtering, visualization and contour generation for pipeline mapping. Pipeline corridor mapping frequently involves both major and minor changes to route planning. Often, an alternative alignment needs to be evaluated when the project is already underway and new mapping activities initiated in short order. VG4D gives geospatial analysts control over all elements of the LiDAR data evaluation and processing and helps avoid costly delays and compliance risks.

Matt Simmons, Senior GIS Technician at El Paso, said, ‘With this technology we can analyze our data in-house and come up with alternative routes for proposed pipeline projects. VG4D helps us stay on schedule without the need for third party evaluation.’ Virtual Geomatics demoed VG4D at ILMF 2009 in New Orleans this month. The new release provides cartographic quality contouring with full control over smoothing. A new ‘Boresight’ module provides high accuracy LiDAR calibration.


‘GProfile’ evaluator for UKCS acquisitions

Hannon-Westwood founder advocates ‘dynamic’ modeling of UK portfolios.

Speaking at the OilVoice* Forum this month, Jim Hannon of Hannon Westwood (HW) demonstrated the UK-based consultancy’s new ‘GProfile’ application that allows asset buyers to ‘play out’ the development of potential reserves in a ‘time sensitive, commercially-based profile.’ Hannon described the approach as ‘translating static reserves reported on an individual basis into a dynamic model that includes the interaction of plays, prospects, wells or complete corporate portfolios.’

GProfile provides a plethora of graphical metrics including NPV, EMV, PI, cash flow and forecast production profiles. The GIS based model can be tweaked for different hypotheses as to future oil price, tax take, inflation and cost of capital.
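
The arithmetic at the heart of such metrics is simple enough; the value lies in running it dynamically across whole portfolios and play types. Here is a toy Python sketch, with all figures invented and no pretense of reproducing GProfile’s actual model.

    # Expected monetary value (EMV) of a single prospect, toy numbers
    discount_rate = 0.10                     # assumed cost of capital
    p_success = 0.3                          # assumed chance of success
    dry_hole_cost = -25.0                    # $MM, assumed
    cash_flows = [-120, 40, 55, 50, 35, 20]  # $MM per year if successful

    npv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))
    emv = p_success * npv + (1 - p_success) * dry_hole_cost
    print(f"NPV ${npv:.1f}MM, EMV ${emv:.1f}MM")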

GProfile is loaded with HW’s comprehensive UK continental shelf (UKCS) dataset. A prototype is available which allows for live interaction and study of any selected part of the UKCS. HW is about to release ‘GProfile15,’ a new version of the tool that allows analysts to rebuild the model using up to 15 different geological play types. Selected play types can be filtered on companies, pipeline catchments or other configurations to analyze acquisition targets or perform competitor analysis.

* www.oilvoice.com.


GeoFields announces across the board APDM support

ESRI’s ArcGIS geodatabase now alternative to PODS and other industry data models.

Pipeline data management specialist GeoFields has announced support for the ESRI-backed ArcGIS Pipeline Data Model (APDM) across its product suite. Operators can now manage pipeline data in an independent APDM V4.0 geodatabase, leveraging native ArcSDE and ArcGIS functionality.

GeoFields executive VP Keith Chambless said, ‘Combining the functionality of our enterprise suite with the ESRI geodatabase will help operators manage their asset data, reduce costs and make regulatory requirements more manageable.’

The new enterprise suite for APDM extends GeoFields’ pipeline GIS technology. GeoFields applications use ArcGIS 9.3 as the underlying core GIS technology and are compatible with SQL Server and Oracle databases. The APDM-compatible release complements GeoFields’ existing solutions designed for use with the Pipeline Open Data Standard (PODS) and other industry data models.


Sales, contracts and deployments

SpectrumData, TelaPoint, Quorum, Aker, Eurodecision, Paradigm, SMT, Meridium and Locus.

SpectrumData reports the completion of a seismic data assessment project with an unnamed Egyptian client. The evaluation is the first step in the development of a long-term seismic data asset management strategy. The company has also signed an agreement with DataBank, Australia’s largest off site tape and data storage provider for provision of joint data storage and management solutions.

Southern Maryland Oil has selected an internet petroleum supply chain solution from TelaPoint Inc. of Louisville, KY. TelaPoint’s TelaFuel ‘Smart Replenishment’ module will be used for inventory management of fuel supplies to convenience stores and other wholesale customers. TelaFuel, a browser-based application, handles fuel purchasing, replenishment, truck logistics and invoice reconciliation. TelaPoint’s Smart Scheduling module is also used to manage fuel dispatch for company-owned trucks and common carriers.

Quorum Business Solutions has completed implementation of its PGAS measurement solution for SemCAMS, one of Alberta’s largest licensed sour gas processors. Phase one of the roll-out includes gas and liquid analysis with third-party lab interfaces, inbound analysis validation and non-compliant analysis escalation. Quorum’s interfaces allow for automated data transfer from major laboratories. Gas and liquid analysis management supports automated sample validation, flagging exceptions to historical data for closer scrutiny.

Aker Solutions has seen its modification and maintenance support contract with BP Norway extended through 2011. The extension is valued at between NOK 1,000 and 1,500 million. The framework agreement covers 12 platforms. Aker has also been awarded a front-end engineering design (FEED) contract by Agip for Phase II of the Kashagan field development. Contract value to Aker and its joint venture partners CB&I and WorleyParsons is £90 million.

Total has commissioned a supply chain optimization solution from Eurodecision of Versailles, France. The 'Optilog' project leverages Eurodecision's LP-SupplyChain solution to model Total's domestic fuel oil distribution network in France. Total's Bruno Ollivier said, 'LP-SupplyChain has improved visibility of our logistics organization. Mathematical modeling has replaced empirical processes and a few preconceptions have been revised, particularly in the way we organize our distribution channels.' The solution was co-developed with the Alligra consultancy.

BP has extended its global deal with Paradigm for the continued use of the Geolog petrophysical analysis and formation evaluation package.

SMT has signed a global software agreement with India’s Oil and Natural Gas Corporation (ONGC) for the provision of its Kingdom geoscience interpretation software. SMT will also provide training to 200 ONGC geoscientists. Local partner Suvira Group is helping with the deployment.

Saudi Basic Industries Corporation (SABIC) has selected Meridium’s Total Reliability Program for project planning. Local partner Saudi Business Machines is assisting with implementation.

Locus Environmental Software has been selected by Hovensa refinery in St. Croix, US Virgin Islands for environmental sampling and remedial operations data management. Hovensa is a joint venture between Hess and PDVSA.


Standards Stuff

SEG technical standards and CSEM, Fieldbus Foundation safety spec, Process Safety Forum.

There have been some interesting discussions at the Society of Exploration Geophysicists' technical standards committee on the extension of the SEG-D field data recording format. SEG-D Rev 3 is a potential candidate for the recording of novel data types such as controlled source electromagnetics (CSEM). However, the CSEM community is lukewarm about seismic-oriented formats, preferring modern, open source, self-describing binary formats such as NetCDF/Unidata*, used by the meteorological and astronomical communities, and HDF5**. The possibility of such a paradigm shift in the SEG's own field recording standards seems remote; the feeling is that the industry is not going to rewrite 100,000 lines of code. Hence the current SEG-D revamp, a halfway house that integrates modern IT concepts (embedded navigation, tagged data and a table of contents) with the existing technology.
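
To see what 'self-describing' means in practice, consider this minimal h5py sketch, an illustration of the HDF5 approach rather than any proposed SEG or CSEM layout, in which traces carry their own metadata so a reader needs no external format description:

    # Illustration of a self-describing binary format (HDF5 via h5py).
    # Dataset names and attribute values are invented for the example.
    import numpy as np
    import h5py

    with h5py.File("survey.h5", "w") as f:
        traces = f.create_dataset("traces",
                                  data=np.zeros((1000, 2500), dtype="f4"))
        traces.attrs["sample_interval_ms"] = 2.0
        traces.attrs["units"] = "amplitude"
        nav = f.create_dataset("navigation",
                               data=np.zeros((1000, 2), dtype="f8"))
        nav.attrs["crs"] = "EPSG:23031"  # metadata embedded, not in a side file

    # A reader can discover structure and metadata without a format spec:
    with h5py.File("survey.h5", "r") as f:
        for name, dset in f.items():
            print(name, dset.shape, dict(dset.attrs))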

The Fieldbus Foundation has announced a new Safety Instrumented Functions (SIF) technical specification, an interoperability test kit and a description library for SIF-aware devices. The SIF protocol has been approved by the German TÜV authority. Last year, SIF solutions were successfully demoed at the Shell Global Solutions technology center in Amsterdam. In a live test, Fieldbus-enabled safety valves, pressure, temperature and diagnostic devices were tested. Shell’s Audun Gjerde said, ‘Shell expects enhanced diagnostics through this integrated asset management system. We also anticipate less testing thanks to smart testing and diagnostics, as well as online testing and partial stroke testing. This will result in early detection of dangerous device failures and fewer spurious trips.’

The UK Petroleum Industry Association (UKPIA) and Oil & Gas UK are teaming with other trade bodies in the chemical and nuclear industries to create a ‘pan sector’ Process Safety Forum. The Forum will provide a platform where initiatives, best practice, incident learnings and process safety strategy can be distilled and shared across all the industry sectors, with the aim of preventing major incidents. The Forum has been set up in the wake of the UK’s Buncefield tank farm disaster. UKPIA director Chris Hunt said, ‘Incidents in the downstream oil industry in recent years have focused attention on the need for greater sharing of knowledge and best practice across the high hazard sectors. Through the Buncefield Standards Task Group and its successor the Process Safety Leadership Group, the value of a wider pan industry dialogue has been demonstrated, to complement the existing safety forums within these sectors.’

* www.oilit.com/links/0901_2

** www.oilit.com/links/0901_4


Schlumberger teams with ERF for wireless communications

‘Broadband trailer’ brings high-end communications to remote North American drill sites.

Schlumberger has signed an exclusive reseller agreement with League City, Texas-based ERF Wireless to bring broadband and WiMAX communications to the US and Canadian oil patch. Schlumberger is to extend the footprint of its IPresence and IPerformer services with ERF Wireless' high-speed, low-latency wireless coverage.

Slavo Pastor, VP Schlumberger Information Solutions (SIS) said, 'This deal will provide cost-effective, ubiquitous communications to the oil and gas industry and will increase real-time activities and collaboration between remote sites and office-based asset teams.'

ERF Wireless CEO John Nagel added, ‘The new broadband service will let energy companies realize their digital oilfield initiatives and increase productivity, safety and crew welfare.’ ERF Wireless’ wireless broadband services are deployed with a broadband trailer located at the drill site or pipeline facility. The network delivers real-time data, VoIP, video conferencing, surveillance and monitoring. Schlumberger’s own communications offering includes satellite, wireless, terrestrial, data and voice solutions.


Malaysia-based companies sign on for ‘Software as a Service’

Hosted data solution IDS DataNet2 offers semi-automated daily reporting from WITSML sources.

IDS has just announced the worldwide implementation of 'DataNet2,' its 'next generation' hosted reporting service for the upstream oil and gas industry. DataNet2's suite of reporting and data management tools minimizes data entry effort and offers enhanced data visualization and analysis.

Early adopters of DataNet2 include three major Malaysia-based companies: Talisman, CHOC and Petrofac. DataNet2 offers the flexibility of a spreadsheet with global data delivery to any web browser. DataNet2 can also share data with any WITSML-compliant source. In what is claimed as an industry first, daily drilling and geological reporting can leverage real-time WITSML data sources, saving valuable time at the rig site. The Malaysian early adopters are currently using a combination of IDS' DrillNet, SafeNet, GeoNet and ProNet products, tailored to their reporting requirements.
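
WITSML itself is a SOAP-based store interface; clients pull objects with calls such as WMLS_GetFromStore. The sketch below shows the shape of such a request in Python. The endpoint URL, credentials and well identifiers are placeholders, and this illustrates the standard interface, not IDS's implementation:

    # Hedged sketch of pulling a drilling log from a WITSML store via the
    # standard WMLS_GetFromStore SOAP call. URL, credentials and IDs are
    # placeholders; error handling is omitted for brevity.
    import requests

    STORE_URL = "https://example.com/witsml/store"  # placeholder endpoint

    ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <WMLS_GetFromStore xmlns="http://www.witsml.org/wsdl/120">
          <WMLtypeIn>log</WMLtypeIn>
          <QueryIn>&lt;logs xmlns="http://www.witsml.org/schemas/131"
            version="1.3.1.1"&gt;&lt;log uidWell="W-1" uidWellbore="WB-1"/&gt;&lt;/logs&gt;</QueryIn>
          <OptionsIn>returnElements=all</OptionsIn>
          <CapabilitiesIn></CapabilitiesIn>
        </WMLS_GetFromStore>
      </soap:Body>
    </soap:Envelope>"""

    resp = requests.post(
        STORE_URL,
        data=ENVELOPE.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": '"http://www.witsml.org/action/120/Store.WMLS_GetFromStore"'},
        auth=("user", "password"),  # placeholder credentials
    )
    print(resp.status_code)  # the XMLout element of the reply carries the log data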

IDS CTO Reuben Wee said, ‘We aim to eliminate many of the time-consuming elements of reporting and to make it easy to share and analyze data. The latest advances in WITSML and other data sharing technologies have enabled us to both reduce manual data entry and make the whole reporting process much more straightforward and intuitive. DataNet2 shows real savings in time, money and frustration with immediate effect.’ More from www.oilit.com/links/0901_7.


Invensys’ Foxboro unit kits-out Qatargas LNG upgrade

Foxboro unit upgrades Qatar Gas 1 LNG train with new mesh-based system.

Foxboro DCS Systems, a unit of Invensys Process Systems (IPS), has signed a 'multi-million' dollar contract with Qatargas for a major automation upgrade at the Qatar Gas 1 liquefied natural gas (LNG) facility in Qatar's Ras Laffan Industrial City. IPS is to upgrade control processors, gateways, local area networks and network security with its Foxboro I/A Series distributed control system (DCS).

IPS Business Development Manager Mohsen Sorour said, ‘Our goal is to keep our clients’ DCSs running at peak performance levels. The existing DCS is a mixture of legacy Foxboro I/A systems. IPS will migrate these NodeBus-based systems to our latest mesh technology. By retaining existing field wiring and system cabinets and by using online upgrade procedures, we can ensure that plant downtime is minimized.’

Jambulingam Balu, Qatargas Lead I&C Systems Engineer added, ‘The phased upgrade will allow different generations of products and software to operate as one system, ensuring that we can stagger our investments, extend the DCS’s lifetime and enhance plant performance.’ Qatargas currently exports 10 million tons per year of LNG to customers in Japan and Spain. By 2010, annual capacity is expected to increase to 42 million tons.


Roxar reports on 2008 success in Middle East, Asia Pacific

Noteworthy software sales to Petronas, Sinopec, CNOOC, PetroVietnam.

Stavanger, Norway-based Roxar reports a successful 2008 for its Asia Pacific division with ‘multi million’ dollar software sales and closer relationships with Asia Pacific NOCs and educational establishments. A $10 million, three-year contract was signed with Petronas for the full suite of Roxar’s reservoir modeling toolset, Irap RMS, Fracperm, Tempest and the EnAble history matching application. The deal includes customized training programs for one hundred geoscientists and reservoir engineers.

Other significant sales were made to national oil companies (NOCs) in China and Vietnam including PetroChina, Sinopec, CNOOC and PetroVietnam. The company has also signed a memorandum of understanding with the Hanoi University of Mining and Geology for the establishment of a laboratory at the university. Roxar is to donate US$2.5 million worth of software for academic use. Another academic partnership deal has been struck with Universitas Padjadjaran (UNPAD) in Bandung, Indonesia. Roxar's customer base in the Asia Pacific region now includes over 100 international and national E&P companies. Roxar's Asian push is supported from offices in Beijing, Ho Chi Minh City, Jakarta, Kuala Lumpur and Perth.


Semantic cure for Excel ‘hell’

Cambridge Semantics’ Anzo for Excel leverages semantic web techniques in Chevron-backed proof of concept.

In a trial for Chevron, start-up Cambridge Semantics has shown how semantic technology can unify data in multiple spreadsheets scattered through the enterprise. The proof of concept involved daily production data supplied by joint venture partner BHP. Cambridge Semantics' Anzo for Excel collected data from the daily reports, collating it with internal Chevron data. Anzo for Excel uses semantic web templates to map spreadsheet fields to enterprise databases; Excel cell to production database field mapping is done once and for all. Anzo Exposé was used to view shared data from spreadsheets and other sources. Different 'lenses' can be applied to data views, including a cute 'timeline' display that was used to track production report comments against time. Other 'lenses' allow, for instance, spreadsheet data to be recast to PRODML.
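
The mechanics of the mapping can be illustrated with the open source rdflib library: a spreadsheet cell's value becomes an RDF triple against a shared vocabulary and is then queryable alongside other sources. The namespace, property names and values below are invented for the example; Anzo's actual templates are proprietary:

    # Illustration of the spreadsheet-to-RDF mapping idea behind Anzo for
    # Excel. Namespace, properties and values are invented for the example.
    from rdflib import Graph, Literal, Namespace

    EX = Namespace("http://example.com/production#")
    g = Graph()

    # One-off mapping: cell B2 of the partner's daily report holds oil
    # production for well W-1.
    cell_value = 1250.0  # barrels per day, as read from the spreadsheet
    well = EX["well/W-1"]
    g.add((well, EX.dailyOilProduction, Literal(cell_value)))
    g.add((well, EX.reportedBy, Literal("partner-daily-report.xls")))

    # Once in RDF, the data is queryable with SPARQL alongside other sources.
    q = ("SELECT ?w ?v WHERE { "
         "?w <http://example.com/production#dailyOilProduction> ?v }")
    for row in g.query(q):
        print(row.w, row.v)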

Speaking at the W3C workshop (page 6), Cambridge Semantics CTO, ex-IBMer Lee Feigenbaum described the semantic web as an 'expressive' way of building domain and expertise models that align with the business better than the relational database (RDB) or XML. Feigenbaum recognizes the spreadsheet as the linchpin of human-machine interaction but observes that Excel has created a 'shadow IT' world of data that is not discoverable or searchable. According to Feigenbaum, tools like Anzo and other semantic technologies will be the cornerstone of the next generation of data management.


N4 Systems’ RFID for Hy-Safe fall protection systems

Radio frequency tags key to Field-ID safety supply chain and compliance monitoring.

Fall protection specialist Hy-Safe Technology is to use radio frequency identification (RFID) technology from N4 Systems to manage its safety system supply chain and operations. N4 Systems' 'Field ID' safety network is an inspection and safety compliance management (ISCM) solution that provides traceability and compliance for manufacturers, distributors, inspectors and end users of safety equipment. Field ID eliminates compliance paperwork and reduces the errors, guesswork and liability inherent in paper-based compliance and inspection management.

Hy-Safe CEO Frank Anzaldi said, ‘We have always believed that RFID could help us and the industry as a whole. Field ID is a powerful tool for managing inspections and safety for both us and our customers.’ N4’s Field ID combines RFID technology, wireless mobile computing, secure data management, and web-based network integration. Field ID assures automated compliance and access to current and historical inspection data.


Emerson Smart Wireless for Chevron’s San Ardo field

Twin wireless networks monitor steam injection and well test operations.

Chevron has deployed a 'Smart Wireless' network from Emerson Process Management at its San Ardo, California oilfield. Two separate networks monitor steam injection and down-hole well pressures. In a test, wireless pressure transmitters on steam injectors identified an 'over-steaming' situation in one well. The excess steam produced more wastewater, which had to be pumped from the well and treated before discharge, and meant more natural gas burned in steam generation. The wireless network has also improved operator safety by minimizing travel in the field. An Ethernet gateway connects the injectors to the oilfield's control system. Installation took only three hours.

The second Smart Wireless installation replaces a legacy remote telemetry unit (RTU) wireless system that collected downhole pressure data used in periodic well pressure testing. Chevron automation engineer Mohammad Heidari reported that the battery-powered Emerson solution cost $30,000, as opposed to approximately $90,000 for a conventional RTU replacement. According to Heidari, 'The new system is reliable and has passed our rigorous IT security review. Installation was easy and we haven't had any problems.'


Qatar Cloud Computing Center

Open Source Hadoop development to target seismic imaging, production operations.

IBM has backed several research establishments with its ‘Blue Cloud’ cloud computing solutions. Blue Cloud will see deployment at the Qatar Cloud Computing Center (QCCC) of the University of Qatar, home to educational outposts of both Carnegie Mellon and Texas A&M. IBM expects that cloud computing will accelerate projects and research initiatives that were hitherto constrained by time or by limited system resources. The QCCC is to open its cloud infrastructure to local businesses and industries to test applications and complete projects such as seismic modeling.

Professor Majd Sakr of the Carnegie Mellon University in Qatar said, ‘We are working with IBM to create the first cloud computing platform in the Middle East and to realize the vision of a cloud computing infrastructure targeting regional applications and R&D projects.’ The QCCC will be used for research on data mining, scientific modeling and simulation and financial modeling and forecasting. Pilot applications include seismic modeling for oil and gas exploration, integrated production operations, an Arabic language web search engine and the testing and migration of various applications using Hadoop, an open source cloud computing infrastructure. Willy Chiu, VP IBM Cloud Labs added, ‘For decades, clients have turned to IBM to integrate new technologies and computing paradigms into their operations and now, Linux, open source and the Internet.’ IBM is also partnering with rSmart to deploy Sakai, an open source learning management system. IBM claims 100,000 users of its 13 worldwide cloud computing centers.
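
Hadoop's MapReduce model splits a job into a 'map' step run in parallel across the cluster and a 'reduce' step that aggregates the results. The minimal Hadoop Streaming sketch below, a generic illustration rather than QCCC code, computes the peak absolute amplitude per seismic trace from text-formatted input:

    #!/usr/bin/env python
    # max_amp.py -- minimal Hadoop Streaming job (generic illustration).
    # Input lines: "<trace_id><TAB><sample_value>". Run the same script as
    # mapper and reducer:
    #   hadoop jar hadoop-streaming.jar -input traces.txt -output peaks \
    #     -mapper "python max_amp.py map" -reducer "python max_amp.py reduce" \
    #     -file max_amp.py
    import sys

    def mapper():
        for line in sys.stdin:
            trace_id, value = line.strip().split("\t")
            print(f"{trace_id}\t{abs(float(value))}")  # key, absolute amplitude

    def reducer():
        # Hadoop sorts mapper output by key, so each trace arrives contiguously.
        current, peak = None, 0.0
        for line in sys.stdin:
            trace_id, value = line.strip().split("\t")
            if trace_id != current:
                if current is not None:
                    print(f"{current}\t{peak}")
                current, peak = trace_id, 0.0
            peak = max(peak, float(value))
        if current is not None:
            print(f"{current}\t{peak}")

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()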
