March 2010


Seismic ‘ecosystem’

Shell and HP are to build a new seismic system, part of HP’s ‘Central Nervous System for the Earth.’ New system faces stiff competition from WesternGeco and a complex web of patents.

Shell and Hewlett Packard (HP) have announced that they are to develop an ‘ultra high resolution’ land wireless seismic recording system. The system is to be built around HP’s new sensing technology, a key enabler of HP’s vision for a ‘Central Nervous System for the Earth*’ (CeNSE). The new seismic system promises a ‘much higher channel count’ and ‘broader frequency range’ than is currently available. The solution will integrate with Shell’s high-performance computing and seismic imaging environment.

Shell’s Gerald Schotman said, ‘We think this will represent a leap forward in seismic data quality that will provide Shell with a competitive advantage in exploring difficult oil and gas reservoirs, such as sub-salt plays in the Middle East and unconventional gas in North America.’

The strategic relationship with Shell is a component of HP’s CeNSE ‘information ecosystem.’ CeNSE involves a network of sensors, data storage and analytical tools. HP Labs’ Peter Hartwell effervesced, ‘CeNSE will revolutionize communication between objects and people. With a trillion sensors embedded in the environment, it will be possible to ‘hear’ the heartbeat of the Earth. This will impact human interaction with the globe as profoundly as the internet has revolutionized communication.’

HP’s new sensing technology includes low-power MEMS** accelerometers that are said to be ‘1,000 times more sensitive than high-volume, commercial products’ (no comparison with existing geophone technology was offered). The devices can be customized with single or multiple axes per chip—with implications for full waveform seismic recording. The solution will be delivered by HP Enterprise Services.

Comment

An announcement from WesternGeco this month underscores just how high the bar for what is ‘currently available’ is set. Schlumberger’s new ‘UniQ’ seismic recording system is capable of 150,000 ‘point-receiver’ channels at a 2ms sample rate, generating around two terabytes of data per hour.
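As a back-of-the-envelope check (our arithmetic, not WesternGeco’s figures), the quoted channel count and sample rate are of the right order for the stated data volume, with the outcome hinging on assumed sample word size and recording overhead. A short Python sketch:

    channels = 150_000                   # UniQ point-receiver channels
    sample_interval_s = 0.002            # 2 ms sampling
    bytes_per_sample = 4                 # assumption: 32-bit samples
    overhead = 1.5                       # assumption: headers, aux and QC traces
    samples_per_second = channels / sample_interval_s             # 75 million
    bytes_per_hour = samples_per_second * bytes_per_sample * overhead * 3600
    print(f"{bytes_per_hour / 1e12:.1f} TB/hour")                 # ~1.6 TB/hour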

The idea of a seismic recording solution ‘exclusive’ to Shell harks back improbably to the days when oils ran their own seismic crews, notably when Conoco astonished the seismic community by patenting the Vibroseis method***.

The situation today is that the co-venturers will be confronted with a complex web of patents surrounding seismic systems—as witnessed by the ongoing $25 million spat between ION Geophysical and CGGVeritas.

* links/1003_9.

** Micro-electro mechanical systems (on-chip accelerometers).

*** links/1003_10.


EnergyPath to SAP

Capgemini rolls out SAP-qualified solution for drilling and oilfield service companies. EnergyPath methodology endorsed by Seahawk Drilling.

Capgemini’s new EnergyPath offers oilfield service companies an SAP-based enterprise resource planning (ERP) system ‘at an accelerated pace and reduced cost.’ EnergyPath promises ‘streamlined operations and greater visibility and control of core business processes.’

Capgemini’s Rob McKay said, ‘EnergyPath enables clients to deploy a comprehensive industry specific ERP solution in as little as 90 days.’ John Varkey, CIO with EnergyPath anchor client Seahawk Drilling added, ‘Following a comprehensive evaluation and learning from Capgemini’s experience with some of our peers, we chose EnergyPath to provide us with a system that positions us to respond to pressing industry challenges.’

EnergyPath draws on Capgemini’s experience of SAP implementations with clients such as Pacific Drilling, Rowan Companies and Vantage Drilling Company. The system offers pre-configured components for back-office processes such as finance, procurement and HR. Industry-specific scenarios such as intercompany billing and asset transfers, freight forwarding, rig moves and rechargeables complete the picture. Capgemini claims 2,000 global clients and 5,000 SAP-based projects. More from capgemini.com.


An open letter to the president of the SPE

Oil IT Journal Editor Neil McNaughton writes to the President of the Society of Petroleum Engineers, Behrooz Fattahi, to advocate freer access to the SPE’s publications, suggest an alternative way of ‘accounting’ for societal revenues and question growing commerciality in the SPE’s output.

Dear Sir,

In your Conversation column, ‘The Matter of Money*’ in the February 2010 issue of the Journal of Petroleum Technology, you report that in your travels, one question that you are always asked is, ‘Why [the SPE’s] services and products are not free.’ You report that one audience member accused the SPE of ‘not being serious about technology dissemination’ because of its paid-for publications policy.

I think that you do well to report on such questioning, which echoes my own March 2008 editorial where I called for open publication by the learned societies. Back then I noted that the ‘preponderance of information on the web is unscientific chit chat ... a medieval world of charms, half-truths and snake oil.’ While the chit chat is free, the science often is not, and the whole body of publicly available knowledge is skewed away from peer-reviewed, authoritative content towards self-publicists, commercial interests and the ‘crazies.’

Just to show that ‘free’ need not equate with ‘loss-making,’ I note that, from this month, Popular Mechanics has opened up access to its back issues. The UK’s illustrious Royal Society likewise offers free access to its journals after a year or two of paid-for use. And, modestly, our own Oil IT Journal, an entirely ‘commercial’ publication, has been offering free access to all content over a year old for the last 15 years, and we are still going strong.

Oil IT Journal operates of course at a rather different scale from the SPE. Our ‘commercial’ activity has a turnover of roughly one hundredth of that of the ‘not for profit’ SPE. ‘Commercial’ in this context by the way means making a living from writing, rather than from advertising.

But to get back to your editorial and, if I may, to summarize your argument briefly: you argue against free Journal access from a cost accounting standpoint. Publishing all this stuff costs a lot and is a loss-making activity that is subsidized by the SPE’s conferences and tradeshows—the Annual Technical Conference and Exhibition, Offshore Technology, Intelligent Energy and the like.

But there is more to accounting than cost and a balance sheet is about more than cash flow. The figure in your editorial shows membership as a downward-trending ‘cost center’ that is losing the society a couple of million dollars per year. But this analysis fails to account for the true value of the membership to the SPE.

To appreciate what is missing in the analysis, imagine for a moment that the SPE spun off its profitable conference activity as a separate business. This would leave the membership and publishing activities as a large loss-making rump. Could the rump survive once deprived of subsidy from the tradeshows?

We are in the realm of speculation here, but my contention is that the ‘loss-making’ bit may actually do better than the ‘profitable’ conferencing business. Why? The answer is simple. With a 92,000-strong membership and a popular website this should be a rather successful concern. Membership fees represent a few million dollars after all—a nice amount of ‘seed capital’ for a new venture! And sponsors would be very keen to engage the membership, bringing in more revenue streams.

Now let’s turn to the other half, the conference organization. I wonder how long it would last and how its finances might develop. I suspect that, cut off from its membership, the Conferences section would wither away. It would also face stiff competition from other tradeshows.

How do we ‘account’ for this? You take the whole, split it into ‘moneymaking’ and ‘loss-making’ parts and, a few years down the road, ‘moneymaking’ is struggling while ‘loss-making’ is doing fine. This is where the cash flow analysis falls short, failing to take account of the goodwill that the membership brings to the table. The real ‘hidden’ asset of any learned society is the goodwill, both in an accounting sense and figuratively, of its membership.

The strong positive cash flow from commercial sponsors, the millions of dollars of booth space, advertising, product placement and so forth is part chimera, part dangerous distraction as it takes attention away from the needs of the membership and towards commercial sponsors. This is, I feel, reflected in the growing amount of non-peer-reviewed content that the SPE puts out—often provided by the aforementioned commercial entities.

The arguments for and against free access are presented in depth in a position paper from the Royal Society**. This is actually a defense of the role of a ‘commercial’ publisher as used by the Royal Society on the grounds that managing the publishing cycle and peer-reviews costs money.

While the SPE’s ‘heavyweight’ specialist publications do receive attention from the reviewer, much of the content in the JPT is unrefereed, as are papers presented at the tradeshows. This again leads to creeping commercialism, fueled by the focus on cash flow. The current presentation guidelines regarding commercialism are in need of a major refresh as they stifle honest discussion of technically significant products and software while letting through many a ‘commercial presentation,’ as long as it is suitably drafted and delivered by a suitable ‘sponsor.’

The SPE’s finances all derive from the goodwill provided by its membership. I think that the requests made by members around the world, and by myself, should be reconsidered in this light. After all, if Popular Mechanics can do it, and if the Royal Society does it, why not the SPE too? Practically speaking, I suggest that all SPE publications should be free to members from the date of publication. Free public access could kick in after a year of exclusivity to members, which would ensure that pretty well everyone paid their dues. It would also add a significant body of knowledge to what is available on the world wide web, redressing the balance in favor of public science.

Yours truly,

Neil McNaughton (SPE).

* links/1003_5.

** links/1003_6.


Review—Data Modeling for the Business

Oil IT Journal reviews ‘Data Modeling for the Business’ by Steve Hoberman et al. The book outlines a new approach to data modeling and includes a chapter on BP’s enterprise architecture.

Someone once said the ideal number of data modelers is one*. The book ‘Data Modeling for the Business**’ (DMFTB) takes practically the opposite approach, advocating a series of corporate Rolfing sessions and pizza parties to thrash out what should be modeled, how, and for how long information should be retained. If the single modeler approach presupposes a domain specialist who knows all, Hoberman’s approach is rather one of journeymen data modelers, perhaps without deep domain knowledge, who can extract all the information required from other stakeholders. The thrust of DMFTB is communication and debate with non-specialists. This can be rather labored—as in the first chapter, which plods through the analogy of a data model and a blueprint for a house.

Those expecting technology insights and a discussion of tools will be disappointed. We learn from the frontispiece that the graphical models in the text were created with CA’s ERwin tool. But the book does not really connect with technology. The subtitle, ‘aligning business with IT using high level data models,’ says it all. This discussion is far removed from databases and SQL and focuses on a bird’s-eye view of the enterprise rather than on implementation.

There are ‘traditionally’ four levels of models—very high, high, logical and physical. High level models communicate core data concepts like ‘customer,’ ‘order,’ ‘engineering’ and ‘sales.’ Even the ‘logical’ model is ‘a graphical representation of [...] everything needed to run the business.’ All of which is a far cry from the Express logical model of Epicentre or ISO 15926!

The body of DMFTB is concerned with business, rather than technical, data, examining in depth how, for instance, the concept of ‘customer’ can be implemented in ‘hundreds of database tables on a variety of platforms.’ Business requirements may mean changing definitions of key concepts like customer. These start at the high level and ripple down through the model layers. Modelers can then perform impact analysis to see ‘what changes are required at the logical and physical levels,’ although how such changes are effected across ‘hundreds’ of databases, including pre-packaged behemoths like SAP, is glossed over.

Of particular interest is a chapter on data modeling in an international energy company by BP’s Mona Pomraning. Here an enterprise architecture initiative set out with a vision of a ‘shared corporate data asset that is easily accessible.’ Amusingly, half way through their work, the team found that there was another initiative working on master data management whose goal was also ‘a single version of the truth.’ Such is the nature of the large decentralized beast! BP’s modelers leveraged industry data models including PPDM, PSDM (ESRI), MIMOSA and PRODML—although exactly how these different circles were squared is not explained!

Despite its technical weakness, DMFTB makes an interesting and perhaps inspiring read for technologists who are trying to engage with their fellow stakeholders.

* links/1003_4.

** A handbook for aligning the business with IT using high level data models. Hoberman et al. Technics Publications 2009. ISBN 9780977140077.


Fifth High Performance Computing in Oil & Gas Workshop

Rice meet hears of ‘new dawn’ in HPC, Eclipse benchmarks, ‘single chip cloud’ and call for OpenCL.

Bill Brantley described AMD’s ‘Fusion’ accelerated processing unit (APU) and its ‘direct connect architecture 2.0’ as a scalable design that supports up to 16 cores per CPU. The APU heralds the ‘dawn’ of a new era of heterogeneous computing.

Owen Brazell reported on benchmarking of Schlumberger’s Eclipse and FrontSim reservoir simulators running across various multi-core chips including Intel’s Nehalem and AMD’s Shanghai (AMD’s latest 12 core Magny-Cours was not ready in time for the test). Various million cell models were run, most showing tail-off at around 16 or 32 CPUs. The conclusion was that increasing cores per socket without an increase in memory bandwidth is no use for distributed codes. Multi-threaded codes such as FrontSim do benefit from the new architectures. Multi-core is the future and software developers will need to re-code to reap the benefits.

Paul Fjerstad described tests of the jointly Chevron/Schlumberger-developed Intersect simulator on a super-giant oilfield. The ‘next generation’ simulator uses large scale parallel simulation. A new solver has already shown a threefold speed-up over conventional simulators. A deviation from optimum scalability was noted, stemming from the serial part of the program, where one processor works while others are idle.
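The scalability deviation Fjerstad describes is the textbook Amdahl’s law effect: whatever fraction of the run stays serial caps the achievable speed-up. A minimal Python illustration (the five per cent serial fraction is our assumption, not an Intersect figure):

    def speedup(n_procs, serial_fraction):
        # Amdahl's law: serial work is not shared, parallel work divides by n
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

    for n in (16, 64, 256, 1024):
        print(n, round(speedup(n, 0.05), 1))
    # 16 -> 9.1, 64 -> 15.4, 256 -> 18.6, 1024 -> 19.6, i.e. scaling flattens out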

Tim Mattson unveiled Intel’s futuristic concept chip, the ‘Single-chip Cloud Computer’ (SCC). Intel’s first ‘tera-scale’ computer, the 1997 ASCI Red, had 9,000 CPUs and required one megawatt of electricity and 1,600 square feet of floor space. The SCC is a terascale computer on a chip—a 48 core CPU requiring 97 watts of power and occupying 275 sq. mm! Intel plans to release 100 SCCs to partners for research into ‘user-friendly’ programming models that don’t depend on coherent shared memory. Mattson then turned to the topic of ‘software in a many-core world,’ noting that ‘parallel hardware is ubiquitous, parallel software is rare.’ He enumerated some 95 attempts to find a parallel programming model, to conclude that ‘we have learnt more about creating programming models than how to use them.’ OpenMP was cited as a poster child for open systems and Mattson concluded with a plea for a similar push for OpenCL. ‘If users don’t demand standards, we, industry and academia, will proliferate languages again and our many-core future will be uncertain.’

Dave Hale (Colorado School of Mines) asked, ‘Who will write the software for multi core/parallel machines?’ Most geosciences students program in MATLAB and lack the skills or ambition to program computers in the ‘fundamentally new ways’ required to exploit modern hardware. One solution is to recruit science grads to work alongside geoscientists, but Hale favors a different approach—that of getting geoscience students excited about computing. One is tempted to suggest a third way—get MATLAB to sort the parallel programming mess out! More from links/1003_11.


ECCMA data cleansing whitepaper reviewed

Electronic Commerce Code Management Association CTO on definitions and data project execution.

Peter Benson, founding Director of the Electronic Commerce Code Management Association (ECCMA), where he is currently CTO, has published a whitepaper, ‘Managing a Data Cleansing Project*.’ Data cleansing appears ‘deceptively simple’ and while common sense will see cleansers through a small project, more structured processes and knowledge are required for large-scale initiatives. The 64 page whitepaper offers insightful definitions of terms such as cataloguing (a synonym for cleansing), master and metadata and more, leveraging the ISO 8000 quality standards and best practices.

‘Quality’ means data that meets stated requirements, so that IT can be used to verify data entry against external references. Cleansing also includes structuring data and adding context through an ‘ontology,’ here taken to mean an assembly of ‘a data dictionary, classifications, data requirement statements and rendering guides.’ Other terms, such as metadata, class, classification and property, are given equally pragmatic definitions with examples.
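In practice, verifying entries against an external reference can be as simple as checking free-text values against a dictionary of approved terms. A minimal Python sketch in the spirit of ISO 8000 (the reference terms and records below are invented for illustration):

    reference_terms = {"GATE VALVE", "BALL VALVE", "CENTRIFUGAL PUMP"}
    records = [
        {"id": 1, "class": "GATE VALVE"},
        {"id": 2, "class": "gate vlv"},          # non-standard entry
        {"id": 3, "class": "CENTRIFUGAL PUMP"},
    ]
    for rec in records:
        ok = rec["class"].upper() in reference_terms
        print(rec["id"], "OK" if ok else "fails reference check")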

The whitepaper has a tendency to resemble a standards ‘smorgasbord’ with references to Federal, NATO, UNPC, NCS, ISO and the ECCMA Open Technical Dictionary (eOTD). The latter is a super-registry of terminology where concepts are assigned a public domain identifier. eOTD is a key enabler of quality in that it provides a standardized superset of industry and technical terminology. Users can ask for terms to be added to the registry for free and the eOTD is updated in under a day.

The whitepaper discusses roles and responsibilities for clean-up, with candidate owners drawn from the C-suite. Benson thinks that ultimately, responsibility for corporate data rests with the CEO, who should delegate it to a ‘master data quality oversight committee.’ Data management is fundamentally no different from physical inventory management and merits similar attention from both the board and the bean counters. More from eccma.org.

* links/1003_3.


BP’s ‘flagship’ technologies highlighted in Annual Review

Field of the Future, high end seismic and supercomputing to add billions of barrels to reserves.

BP has divvied up its R&D push into ten technology ‘flagships,’ each ‘with the potential to deliver over one billion barrels of reserves.’ These include ‘inherently reliable facilities,’ whereby monitoring is used to anticipate equipment failure and increase operating efficiency, and the trademarked ‘Field of the Future’ digital oilfield program that is to deliver 100,000 boed through real time reservoir, well and topside management.

The flagships feature in BP’s 2009 Annual Review, published this month, where they are highlighted as ‘key to increasing recovery from our resource base and to operating safely and efficiently.’ The flagships also provide BP with a competitive advantage when bidding for new licenses.

Seismic technology figures prominently among BP’s flagships with wide azimuth towed streamer and ‘proprietary’ independent simultaneous sweep (ISS) acquisition.

Last year, BP’s high-performance super-computing centre deployed 3 petabytes of disk. But as Doug Suttles revealed at the SPE Intelligent Energy event in Utrecht, NL this month (a full report in next month’s Journal) this has since been doubled to 6 petabytes—along with a CPU count of 27,000. More from bp.com.


Petrosys User Group report

dbMap gains traction. Web map services, Petrel and Google Earth extensions prove popular.

Petrosys’ international user group meetings were held late last year. dbMap appears to be (re)gaining traction as a ‘practical solution’ to the data management challenge with its image georeferencing, export to Google Earth and links to Schlumberger’s Petrel. Users are also leveraging Petrosys’ E&P master data management and PPDM expertise.

Petrosys notes a ‘continued reliance’ on both Windows and Linux desktops with a trend towards ‘batch automation’ of intelligent workflows leveraging web services and ‘conditional processing.’ In this context, OGC-compliant web map services allow geodata to be ‘consumed’ by a variety of devices.
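For the record, ‘consuming’ geodata from an OGC web map service boils down to a parameterized HTTP GetMap request. A hedged Python sketch, with a hypothetical server URL and layer name:

    from urllib.parse import urlencode

    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "wells", "STYLES": "",          # hypothetical layer
        "SRS": "EPSG:4326",
        "BBOX": "3.0,56.0,4.0,57.0",              # lon/lat extent
        "WIDTH": 800, "HEIGHT": 600,
        "FORMAT": "image/png",
    }
    print("http://example.com/wms?" + urlencode(params))   # returns a map image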

The session concluded with a discussion on geostatistical enhancements to Petrosys in the form of kriging modelling improvements to gridding processes. Clients argued the case for extending integration to Paradigm’s applications and discussed roles and drivers for future OpenSpirit-enabled workflows. More from petrosys.com.au.


IHS Enerdeq API for international data

Software development kit gives third party developers access to former Petroconsultants’ data.

IHS has announced a software development kit (SDK) for its international E&P data. The new Enerdeq SDK allows programmers to develop web services that access IHS’s international well, field, contract and production information from third party interpretation environments.

IHS VP global product management Richard Herrmann said, ‘Industry has been looking for a solution that enables disparate systems and applications to exchange data and achieve interoperability. Web services allow data to be integrated directly into an application, freeing users from the repetitive and laborious aspects of data management.’

One satisfied user is PetroWeb MD Gina Godfrey who added, ‘The Enerdeq SDK has enabled us to integrate IHS’ international E&P database into our Gateway product, adding direct online access to international data.’ The system is set to replace end users’ ‘labor-intensive’ processes for re-populating spreadsheets, updating projects or refreshing a corporate well data repository. More from ihs.com/energy.


Software, hardware short takes

DataMatters, Emerson, Exemys, Cartopac, Fugro, Ikon, Iridium, P2ES, Quorum, Schlumberger, SGI.

Exemys’ GTS device offers remote diagnostics and alarms via SMS (text messaging). The system accepts analog and digital inputs from a wide range of devices.

CartoPac has released V 3.0 of its eponymous field data collection solution. The new release improves mobile access to a diverse set of databases and a web-based server option. A ‘Studio’ module lets users design custom data collection solutions.

Emerson’s Smart Wireless Field Starter Kit now includes a THUM adapter and wireless position monitors to facilitate the incorporation of wired HART instruments into a wireless infrastructure.

EnergySys has announced a new production reporting application, a hosted service designed to ‘make production reporting accessible and affordable for all sizes of oil and gas operations.’ The service includes standardized reports by well, field and producer. The application includes processes for data import, validation and reconciliation. A workflow scheduler automates complex sequences of events.

The 4.1 release of Fugro-Jason’s EarthModel sees its integration with the Geoscience Workbench. The new release adds stochastic modeling and upscaling and streamlines the link between petrophysics, seismics and the reservoir simulator. The modeling technology originated in Volumetrix’ ‘FastTracker’ tool, acquired by Fugro in 2003.

Ikon Science has announced a new RokDoc-Well Tie package, available either as a stand-alone solution, a RokDoc module, or a Petrel plug-in. The workflow-driven tool shows the effects on the tie from changes such as wavelet estimation and stretch/squeeze log editing.

Iridium and SkyBitz have teamed on a global, bi-directional remote asset tracking and monitoring solution, combining SkyBitz’ Global Locating System with Iridium’s new 9602 satellite data transceiver. The solution targets oil, gas and chemicals with ‘intelligent’ sensor solutions.

DMNG has released SeiSee, a free SEG-Y viewer, filtering and header data manipulation solution. The software received enthusiastic endorsement from OpendTect users. More from links/1003_7.

P2 Energy Solutions has extended functionality of Qbyte PRISM. V9.6 of the Canadian production revenue accounting software adds step-by-step processes for capturing facility charges along with ‘robust’ analytical tools and more accurate representation of facility flows.

Schlumberger’s PetroMod unit has announced a data exchange plug-in providing Petrel users with access to basin and petroleum systems modeling data.

A new version of Quorum’s Geospatial Information System enhances land department operations with a tract interest summary by depth severance, new map properties and tools to increase quality and performance through map caching and enhanced map and layer security.

Data Matters is developing an Apple iPhone application to provide ‘anytime, anyplace’ access to the PPDM data model.

Another iPhone developer is RedTree/Virtual Materials Group whose ‘Alph’ thermodynamic and physical process calculator has been ported to the iPhone and iPod Touch. A cut-down freeware version is available.

SGI has launched Cyclone, a ‘data center in the cloud’ service offering access to an HPC infrastructure of Altix and hybrid clusters including either Nvidia Tesla or AMD FireStream GPU-based accelerators. The solution is available in both hosted and in-house ‘Infrastructure as a Service’ modes.


More from the Artificial Lift R&D Council Workshop...

Relationship between TellWell and WellSavvy explained. Weatherford’s i-DO in action.

In last month’s Journal we reported from the 2010 Artificial Lift R&D Council meet, notably with a presentation from Weatherford on its WellSavvy ‘digital engineer’ application. This was followed by a presentation from Neil De Guzman on IntelligentAgent Corp.’s (IAC) ‘TellWell’ tool. Like WellSavvy, TellWell leverages periodic data from well tests along with pressure, injection rates and models to monitor gas lift wells, diagnose problems, recommend corrective actions and explain the results. A software ‘agent’ mimics an expert’s train of thought using pattern recognition and case-based reasoning. The system provides ‘visibility’ into the data and into the reasoning behind its recommendations.

After the event, Oil IT Journal asked De Guzman about the relationship between TellWell and WellSavvy. He explained, ‘IAC holds several patents for the application of intelligent agent technology in oil and gas. We have researched this field for several years with support from Baker Hughes, BP, Chevron, Halliburton, Marathon, and Statoil. More recently we shifted from R&D to building applications for the use of agent technology in artificial lift. TellWell is IAC’s gas lift application that uses any of the commercially-available well models such as WinGlu, Prosper and SNAP. Weatherford’s WellSavvy is a version of TellWell customized for use with Weatherford products such as Wellflo and LOWIS.’

Fathi Shnaib (Dubai Petroleum) and co-authors from Smart Zone Solutions and Weatherford’s i-DO team presented on the digital oilfield transformation taking place on Dubai Petroleum’s offshore gas fields. Optimization of the complex, mature asset is currently a ‘highly manual effort’ with a long cycle time. Slow identification of under-performing wells makes it hard to allocate well production and pinpoint losses. Weatherford’s i-DO methodology has been used to automate data acquisition and QC. Well models are tuned with current well test data and update the asset model automatically, allowing full-field optimization at any point in time. i-DO provides a full-field performance dashboard tracking KPIs such as lost production, reconciliation and back allocation against theoretical rates, identifying underperforming wells and upside opportunities. The i-DO server is an Oracle-based web server and application engine. These are early days for the system and many opportunities remain, but i-DO has already reduced downtime, provided at-a-glance tracking of full-field production and improved workflows, enabling faster and better decision making from a single set of data available to all.
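Back allocation itself is conceptually simple: metered field production is split across wells pro rata to their theoretical (well model) rates, and the gap to theoretical is tracked as lost production. A minimal Python sketch with invented figures (not Dubai Petroleum data):

    metered_field_rate = 9500.0                                   # boe/d at the export meter
    theoretical = {"A-1": 4000.0, "A-2": 3500.0, "A-3": 2500.0}   # well model rates
    alloc_factor = metered_field_rate / sum(theoretical.values()) # 0.95
    print("field lost production:",
          sum(theoretical.values()) - metered_field_rate, "boe/d")
    for well, rate in theoretical.items():
        print(well, "back-allocated", round(rate * alloc_factor),
              "boe/d, theoretical", round(rate))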

SPT Group’s Juan Carlos Mantecon showed how simulation is used to optimize well clean-up. Dynamic simulation defines the minimum rate and time required to clean a well along with the best size of test equipment. With offshore drilling unit costs of $600,000 a day, this kind of modeling quickly pays off. More from alrdc.org.


12th SMi E&P Information and Data Management, London

Data managers hear more on Chevron’s data and IT standardization and from Repsol on keeping it ‘open’ and ‘people-focused.’ ENI warns of lost geodetic know-how. Aramco scales up its data effort to match its drilling hike. GDF Suez and Total offer insights into their data quality programs. AF Engineering and Preem Petroleum show how data quality is assured in the refinery.

Around 100 showed up for the 12th SMi E&P Information & Data Management Conference held last month in London. Chairman Floyd Broussard (Schlumberger) noted how the industry has evolved from the days of unmanaged data collections through a stage of conservancy and project building to data ‘curation’ and ‘shaping the information landscape for self service by users through quality tools and automation.’

Peter Breunig (Chevron) described data management as ‘not cool, but critical.’ Data was a key aspect of an IT transformation project kicked off by Chevron’s new CIO, Louie Ehrlich, in 2009. The project identified Chevron’s ‘top ten’ data types, which were targeted for a major clean-up program over the next decade along with the start of a data maintenance program. This connects with Chevron’s ITC ‘utility’ services spanning upstream, ERP and business units.

Chevron is working with an upstream IM ‘shaping curve’ that helps understand objectives and provides ‘swim lanes,’ with details of how to achieve and roll out projects. Chevron is working on standardized systems of record, search (text and spatial) and on IM career paths—there will be no more ‘dipping in and out’ of geotechnical and IM careers. The project sets out initially to answer the ‘15 or so’ top enterprise questions, ‘what’s my production’ etc. via data standards. Flexibility is needed to balance the ‘architectural review’ approach with the ‘killer application’ that does not conform.

Malcolm Fleming (Common Data Access) outlined a new UK data initiative: a National Data Repository (NDR). The NDR is to house petrotechnical, cultural data and metadata along with license and production sharing agreements, regulations on data preservation and reporting. A ‘virtual’ model is planned for the NDR leveraging existing vendor and archive sites. There will be no central location and there will be duplication—it will be a ‘chaotic’ system that will need to be fixed. CDA is working with the DECC on catalog standardization and exchange. The vision is of an ‘affordable science-driven selection policy for preservation.’ Fleming warned that ‘It is unlikely that the UK will ever allocate sufficient financial resources to preserve all North Sea data—there are hard decisions to come.’

Repsol’s Augustin Diz acknowledges that we need structured databases, but these need to be integrated by design, which is currently not the case. Managing ESRI Shapefiles, for instance, is complex and Repsol’s users have moved to the simplicity of Google Earth. There is more happening in operations than in G&G, but we are still not doing surveillance with business rules. We need interpretation tools that broadcast their models, and social tools should be built in to support access. We need to ask what users would like to say about their models and interpretations, to make it useful and easier to remember what they did. Diz advocates ‘embracing’ open publication. In Repsol, a team was using SharePoint to share official tops and integrate workflows. Well logs were dumped into SharePoint because this was seen to be better than ‘putting them into the database.’ IT came along behind and captured this material into databases without losing the SharePoint facility, and SharePoint was considered as a vehicle for IM improvement. Diz advocates thinking ‘people first’ and asking how best to create value. You should also be prepared to switch between learning and teaching modes, and G&G folks should spend a month or two in IT.

Tarun Chandrasekhar (Neuralog) and Steve Jaques (Laredo Energy) presented Laredo’s ‘iOps’ data environment. iOps comprises a NeuraDB log database, a GIS-based web portal and access to third party data from the Texas RRC, IHS and P2ES. The ‘glue’ behind the system is a PPDM 3.8 master data management system that collates data across the components. iOps provides integration with interpretation tools such as Petra and SMT Kingdom. The iOps system is understood to be one of the first Microsoft SQL Server-based PPDM implementations.

Mario Marco Fiorani (ENI) described the parlous state of coordinate reference system (CRS) information across the industry saying, ‘We are losing the knowledge of handling cartographic parameters. In ENI there are only a couple of people left who understand datum shift and CRS data management.’ To fix this and to support its GIS and business intelligence applications, ‘MyGIS Explorer’ and ‘InfoShop,’ ENI has performed a quality assessment of its coordinate data. ENI is working with OpenSpirit and is now ‘nurturing’ its remaining cartographic skills.

Ahmed Al-Otaibi described how a three-fold hike in drilling (146 wells in 2008, up from 44 in 2001) has created data management challenges for Saudi Aramco. Exploration and delineation drilling footage has risen from 4 million feet in 2004 to 8 million in 2009 with a similar hike in development. Aramco has developed a data governance framework for its well data with roles, responsibilities, and policies. The company has its own data model, ‘not Epicentre, not PPDM,’ and its own seismic standards.

Well data tracking has automated the process of assuring data completeness in drilling, wellsite geology and wells databases. Traffic lights and a dashboard track progress across diverse in-house and outside data producers. The new system can simultaneously track 400 wells with minimum manual intervention. Proactive data tracking from wellsite to corporate database has resulted in a 13-fold productivity gain. The system tracks well headers, well bores, deviation surveys, logs, cores, tops, tests, lithology, ROP and VSPs and supports document tracking and loading along with QC of document metadata and secure access control.

An in-house developed data quality management (DQM) process tabulates data quality metrics such as completeness, consistency, validity and uniqueness. A very successful data services system (DSS) was developed in collaboration with Petris that has now been commercialized. DSS acts as a hub to seismic data from processing, field acquisition, specialized processing and as a gateway to project databases.
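As an illustration of what such metrics look like when tabulated, here is a minimal Python sketch computing completeness and uniqueness over a toy well header table. It is not Aramco’s DQM; records and column names are invented:

    wells = [
        {"uwi": "W-001", "spud_date": "2009-03-01", "surface_lat": 26.1},
        {"uwi": "W-002", "spud_date": None,         "surface_lat": 26.3},
        {"uwi": "W-002", "spud_date": "2009-05-20", "surface_lat": None},
    ]

    def completeness(rows, column):
        # fraction of records with a non-null value in the column
        return sum(1 for r in rows if r.get(column) is not None) / len(rows)

    def uniqueness(rows, key):
        # fraction of values that are distinct (duplicates lower the score)
        values = [r[key] for r in rows]
        return len(set(values)) / len(values)

    print("spud_date completeness:", round(completeness(wells, "spud_date"), 2))  # 0.67
    print("uwi uniqueness:", round(uniqueness(wells, "uwi"), 2))                  # 0.67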

Another in-house developed system, ‘Geo-Knowledge Management’ (GKM) captures and manages knowledge of exploration assets—performing version control, providing connectivity to knowledge repositories and databases, and offering reporting and GIS-based search. The GKM is now enhanced with a master data catalog—further aiding search through ‘centralized and consistent master data across all repositories.’

Finally, and most important, come people and professional development. This is assured through training on G&G data and document management, data quality, GIS and database administration, delivered via in-house and vendor-supplied training and mentoring. There is also a professional development program for young talent, and others can enroll in more specialist programs. In answer to a question, Al-Otaibi said that data management is separate from IT; exploration data management is done by geoscientists.

Future data challenges include pre-stack seismic data and interpretations. Here Aramco is working on data governance and QC procedures before rolling out a system that supports data flows from acquisition to prestack interpretation and on.

David Lloyd observed that GDF Suez’ recent growth has seen its Aberdeen and London operations double for the Cygnus development, the largest Southern North Sea project in 20 years. GDF Suez’ vision is of tightly integrated information systems and robust processes running on state-of-the-art technology and databases, leveraging quality, data and GIS. Lloyd noted that ‘the technology deployed in schools and universities has not yet reached the oil company coal face.’

Technologies such as 2 terabyte disks, Intel Nehalem CPUs, solid state RAID, ‘Light Peak,’ Nvidia Tesla 20, faster RAM and USB3—are all coming ‘real soon now!’ But what are we doing about it?

Standard methodologies including PRINCE2, ITIL v3, MOF etc. can help—but they need a good ‘pitch’ to avoid analysis paralysis and rejection. Projects need to ‘fit’ with the E&P industry and avoid an ‘IT crowd’ label. Lloyd recommends going for a minimum number of clearly defined processes with demonstrable ROI. Business analysis provides ‘eyes and ears’ into the department to find out geologists’ and geophysicists’ requirements. Information systems project management can be tailored to the industry, leveraging the best bits of PRINCE2 rather than re-inventing the wheel. Documentation is required—at a minimum a project mandate, brief, initiation document, presentation, risk log, RFC, end stage and end project report. Everyone needs to know what is expected.

GDF Suez is currently three months into the project and has drafted high level definitions in Visio. An online business management system is being deployed using corporate standards. A risk assessment matrix of severity vs. likelihood has been established and people hired to fill roles. RPS/Paras’ Alan Smith gave a parallel talk on the use of psychometric testing to fill key roles in projects like this. In answer to a question, Lloyd described GDF Suez’ UK business as a ‘devolved environment, although this is starting to change. E&P data management processes are now emerging from Paris. We may end up somewhere in the middle of the central/devolved spectrum like BP.’

Pascal Colombani (Total E&P) asked, ‘What does a data quality program involve?’ The need for such a program became clear when an internal study revealed that ‘users were struggling to find reservoir data,’ in part because of different geographic coordinates across CDA, the survey department and the OpenWorks master database—up to six different sources of positioning information and four different locations for one well. Another driver was the need to avoid individual DIY data management in Excel. Data can also be key in HSE, as was the case in one unit where a ‘lost’ well had been sidetracked to avoid an abandoned nuclear tool—a potentially dangerous situation that was fixed by the quality effort.

The cornerstone of quality management is data ownership, which is also the big challenge. People will happily spend 50% of their time building their own data set—but will balk at the extra 10% effort required for validation. Colombani warned against regarding data management and quality as a ‘project.’ These initiatives need ongoing funding to be ‘sustainable.’ Data management has not always been given the consideration it deserves in the past. But Total is steadily building a compelling argument for doing it right. In Indonesia, a sustained data management and quality effort has seen the creation of a new, QC’d reference database which has contributed to a ‘huge production hike.’ This was achieved by visibility (for the first time) of a complete, quality dataset that has meant drilling better wells, enhancing injection, and optimization based on a better reservoir model.

Christer Öhbom (AF Engineering) cited Preem Petroleum’s Tore Carrick’s ‘data commandments’ as follows: 1) data is always wrong but decisions must be right, 2) data must be visualized to understand its origin and context and 3) data has endless life while systems come and go. Preem Petroleum has 20,000 measurement points in its refineries. With an MTBF of once in 30 years per point, that means around two faulty signals per day. Often these are hard to understand and may trigger inappropriate action by operators. To combat this, Preem recommends the ISO 9000 7.5.2 ‘special processes’ methodology, which states that ‘if results cannot be verified you need to QA the process.’ For Preem, this has been done by DNV for 800 flow meters and other tags. QA is now built into Preem’s Quality Information System (QIS). Data flows from historians, PLC/DCS and databases through QIS and on to users and applications. QIS leverages a ‘logical process model’ of input and output streams and values. Tag agents aggregate data from minutes to hours. When an anomaly is detected, a warning goes to the data owner.
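Carrick’s failure-rate arithmetic checks out, as a Python one-liner shows:

    points, mtbf_years = 20_000, 30
    # each point fails on average once in 30 years
    print(round(points / (mtbf_years * 365), 1), "faulty signals per day")   # ~1.8, call it two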

This article is an abstract from a Technology Watch Report produced by The Data Room. For more information on Technology Watch, visit oilit.com/tech or email tw@oilit.com.


Folks, facts, orgs ...

API, Aveva, EnergySolutions, Energistics, IBM, Fugro, Leica Geosystems, Knowledge Reservoir, Madison Williams, Petrosys, SensorTran, Terrapin, Tidewater, WorleyParsons, RigNet, TGS.

Deryck Spooner has joined API as Senior Director for ‘External Mobilization.’ Spooner hails from the Nature Conservancy. Why are we thinking poachers and gamekeepers now?

Aveva is building a ‘state-of-the-art’ visualization centre in its Americas HQ in Houston.

The Energy Institute has merged with the British Energy Association to form the UK Member Committee of the World Energy Council. UKWEC is chaired by Michael Gibbons.

Energy Solutions has promoted Rene Varon to VP of global sales.

Energistics has restored the link to its Epicentre data model that was inadvertently dropped in the move to a new website.

Izaskun Azpiritxaga is manager of Fugro-Jason’s new office in Bogota, Colombia.

IBM has opened an Energy & Utilities Solutions Lab in Beijing, China.

Knowledge Reservoir has appointed Larry Denver as President and BJ Crouse as General Manager, Middle and Far East. Both come from Ascend geo.

Jürgen Dold is president and CEO of Leica Geosystems.

Jeffrey Spittel and Sonny Randhawa have joined Madison Williams as oil services sector analysts.

Carrizo Oil & Gas, EnCore Oil and Wintershall E&P have joined UK Common Data Access.

The Kazakhstan-British Technical University has deployed a 10-TFlops IBM BladeCenter cluster running Paradigm software.

PetroSkills has announced a joint venture with the Southern Alberta Institute of Technology for instructor and e-learning in production, operations and drilling.

Khoa Van and Stephen Howard have joined Petrosys as software testers. Andrew Dunn has been appointed as support geoscientist.

RigNet has elected Ørjan Svanevik to its board. He hails from Cubera Private Equity. The company has also opened a service center in Williamsport, Pennsylvania headed-up by Ricky Begnaud.

Broker R.J. O’Brien & Associates has opened a Houston office. Steffen van Keppel and Tod Mitchell jointly head-up the new unit.

SensorTran has selected Lupatech S.A. as the exclusive distributor for its distributed temperature sensing systems in Brazil.

Javier Castrillo Penadés and Marta de Amusátegui y Vergara have been appointed to Telvent’s Board of Directors. Penadés is from Banco Santander and Vergara from consulting firm AILARA.

Terrapin is offering a ‘3 day MBA’ in Oil & Gas, in London next June. The course is presented by Saudi Aramco consultant M. A. Mian.

Kristian Johansen has been appointed as Chief Financial Officer of TGS. Johansen was formerly Executive VP and CFO of EDB Business Partner.

Tidewater has promoted Matthew A. Mancheski to CIO.

Marty Gaulin is to head up WorleyParsons’ expansion into Newfoundland and Labrador while Mike Paulin has been named operations director.


Mathcad in Oil and Gas virtual user group

Saipem, Maersk, Weatherford and Smith Rea feature in Parametric Technology Corporation webinar.

PTC held a virtual user group/webinar this month on the application of Mathcad in oil and gas. PTC claims that Mathcad is used by most major oils to leverage their intellectual property and to capture engineering ‘context and intent,’ something that ‘cannot be done with either Excel or CAD*.’ PTC divides the mathematical/engineering tool landscape as follows. Hand-held calculations are OK for ‘one-off’ engineering problems. Excel is ‘pervasive but imperfect’—formulae are hard to read, units of measure are not properly managed and spreadsheets are error prone. Domain-specific programming-based solutions require specialized capabilities and are ‘geared to simulation and modeling.’

Mathcad, on the other hand, offers human-readable ‘natural’ math and annotation, is ‘unit aware,’ can automate and capture workflows and offers interoperability, notably with other components of PTC’s product line.
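What ‘unit aware’ means in practice is that quantities carry their units through a calculation and convert on demand. A hedged illustration using the open-source Python pint library as a stand-in (Mathcad’s own unit engine is not involved), for a simple hydrostatic pressure calculation:

    import pint

    ureg = pint.UnitRegistry()
    mud_weight = 10.5 * ureg.pound / ureg.gallon        # drilling mud density
    tvd = 9_000 * ureg.foot                             # true vertical depth
    g = 9.80665 * ureg.meter / ureg.second ** 2
    hydrostatic = (mud_weight * g * tvd).to(ureg.psi)   # units tracked and converted
    print(hydrostatic)                                  # ~4,900 psi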

Offshore engineering contractor Saipem used Mathcad in a ‘dropped object study,’ a risk assessment of falling crane loads on subsea pipelines. A Mathcad worksheet has been developed for decision support in locating pipelines and subsea valves, replacing a prior ‘rule of thumb’ approach. Saipem is now working on impact consequence analysis.

Another user, ‘Drillers.com’ has used Mathcad to deploy an online electronic drilling manual. Here Mathcad worksheets are available to perform a variety of calculations for casing strength etc. and to assist in well planning.

Maersk Drilling has migrated from spreadsheets to Mathcad ‘to avoid inconsistent results and significant errors.’ The company has standardized on Mathcad for calculations and documentation. The tool is now considered a ‘best practice’ for drilling performance and production improvement—resulting from more effective calculations and communication.

Weatherford stated that Mathcad is an improvement on Excel and is used inter alia to visualize deviation survey data and perform conversions using inbuilt functionality.

Subsea 7 uses Mathcad to perform (and document) pipeline and structural calculations to ISO and DNV specifications.

Hydrocarbon accounting specialist Smith Rea was having similar ‘issues’ with Excel regarding reuse, transparency and audit capability. The company has now developed worksheets for meter data auditing according to the ISO 5167:2003 standard.

PTC wound up underlining Mathcad’s integration with CAD—notably its own Pro/Mechanica and Pro/Engineer tools. A rather long demo ensued showing Mathcad’s role as a bridge between Excel, Pro/Engineer and Mechanica for finite element analysis. Other use cases include alarm analysis, seismic signal processing and economic optimization of refinery product. More from links/1003_8.

* Computer-aided design.


Kongsberg rolls out SIM Reservoir

‘Next generation’ 3D reservoir simulation data analysis engine makes first sale.

Norway-based Kongsberg Oil & Gas Technologies has announced SIM Reservoir, a ‘next generation’ reservoir simulation post processor. SIM Reservoir uses 3D graphics to give engineers a better understanding of reservoir dynamics. The immersive 3D display is achieved without the need for special glasses and is claimed to provide a ‘realistic’ image of the subsurface.

SIM Reservoir addresses well path optimization with fast comparison of simulation runs, history matching and ‘smart’ cell filtering to identify new well locations.

SIM Reservoir displays streamlines, vector fields, time lapse animation and ‘intelligent’ property mapping to reveal ‘hidden’ simulation features that improve reservoir understanding.

At the Subsea UK tradeshow in Aberdeen last month, Kongsberg announced its first SIM Reservoir sale, a substantial 14 license deal with ‘one of the largest National Oil Companies in the Middle East,’ placed in December 2009. Sales manager Mark Baldwin said that the £2 million investment in 3D visualization represented a strategic milestone for Kongsberg. More from kongsberg.com/kogt.


Integrated Operations in the High North—mid term report

The four year Norwegian IT flagship expects proofs of concept and digital platform for 2010.

The Integrated Operations in the High North (IOHN) joint industry project kicked off in 2008 and is slated to run for four years. IOHN has just issued a mid term status report on this Norwegian IT flagship project, which sets out to ‘design, implement and demonstrate a reliable and robust ICT architecture for Arctic E&P.’ Here the requirement is for field development and operational concepts that include ‘heavily instrumented facilities.’ IOHN is to leverage open standards to ensure interoperability and data integration, building on ISO 15926, the POSC Caesar Association’s oil and gas ontology and semantic web technologies.

A drilling and completion use case focuses again on interoperability through open standards at the drilling control level. The idea is to close the loop between real-time data and its timely use during drilling operations. IOHN systems will automate real-time data analysis, perform ‘autonomous’ decision making and control the drill bit. Other use cases target reservoir, production and operations and maintenance.

In 2010, the first proofs-of-concept for the different pilots are expected, along with work on different parts of the digital platform. The project is also working on position papers on how semantic web technologies and autonomous systems may impact the current operational models.

Comment

There is no doubt that the high north is and will be a proving ground for many innovative technological solutions such as Statoil’s audacious Snohvit development. What is less clear is why the high north’s information technology needs are different from any other part of the globe. The reliance on immature semantic web technology and an emerging plant data protocol rather than existing automation standards is puzzling. More from iohn.org.


Microsoft announces Oil and Gas Reference Architecture

Whitepaper strong on vision, light on content. Is BizTalk up to the upstream?

Microsoft officially unveiled its Upstream Oil & Gas Reference Architecture (OGRA—OITJ December 2009) at CERA Week this month and followed up with a whitepaper*. The document, strong on vision and light on details, describes a reference ‘architecutre’ (sic) that is to ‘enable better integration of upstream applications through the composition of common application components.’ ‘Early adopter’ Landmark, hitherto something of a Java/Linux shop, heralded the OGRA claiming that its DecisionSpace for Production leverages OGRA. Schlumberger was conspicuously absent from the announcement.

OGRA is currently at a very early stage of development but an idea of its future shape can be had from an examination of the slightly more mature utilities architecture, SERA (OITJ Dec 09). SERA leverages a comprehensive Microsoft stack including BizTalk for automation and ‘orchestration.’ BizTalk has had a checkered history in the upstream. Total has successfully used the middleware for oil field monitoring (OITJ Nov 08). Elsewhere, ExxonMobil’s attempt to leverage BizTalk in its upstream enterprise architecture failed spectacularly. A source who did not want to be named told Oil IT Journal, ‘We lost two years!’ More from microsoft.com/oilandgas.

* links/1003_2.


Merak PEEP 2010—Office 2007 GUI

Schlumberger’s economics flagship refreshes core technology.

Schlumberger’s Merak oil and gas economics and reserves management software arm has released Merak Peep 2010. The new release ‘refreshes’ the application with updated core technology and a new user interface with an Office 2007 ‘look and feel,’ including the new ‘Fluent User Interface,’ a.k.a. the ribbon, and an Outlook-style navigation pane. The new GUI can be tailored to specific roles, exposing functionality on an as-needed basis. Peep can now be extended and customized with plug-ins and ‘helper panes’ for data entry. The Merak Fiscal Model Library has been extended with US and Canadian fiscal models. More from slb.com/merak.


Sales, contracts, partnerships and deployments

Aker, ASCI, Oniqua, Aveva, Dyadem, Energy Solutions, Flare Solutions, GX Technology, IFS, Ikon Science, GE Oil & Gas, Palantir Solutions.

Aker Solutions geo-business unit has signed a three year frame agreement with Statoil for the provision of consultancy services in production geology, exploration geology and reservoir technology. The contract, whose value was not disclosed, covers Statoil’s assets worldwide.

BP Exploration Alaska has renewed its contract with Advanced Supply Chain International for the provision of supply chain management services. Software deployed includes Oniqua’s OAS-Inventory software, following the signature of a distribution agreement covering North America and the Caribbean.

Aveva reports ‘surging’ sales of its Aveva Net IM solutions in Russia. New customers include oil sector engineering companies VNIPIneft and RusGazEngineering along with the INHP research institute.

BP Shipping is to deploy Dyadem’s ‘Stature Workgroup’ on ‘everything that floats in the BP group’ to conduct HAZOP & FMEA risk assessments.

Sonatrach is to deploy Energy Solutions’ PipelineManager and PipelineStudio tools on three gas pipelines in the Hassi R’Mel and Oued Saf Saf pipeline corridor.

Flare Solutions reports sales of its E&P Catalog to the North Caspian Operating Company and GDF SUEZ E&P UK.

Pemex has awarded a three year seismic data processing contract to ION Geophysical’s GX Technology unit.

Danish engineering contractor Semco Maritime has selected IFS’ Applications for Engineering, Procurement, Construction and Installation for its global projects in a deal worth SKr 20 million.

Statoil is to embed its ‘bespoke’ E&P and reservoir geophysics methodology into Ikon Science’s RokDoc. A selection of Statoil’s methods will complement existing RokDoc functionality.

ISS Group has signed a five year enterprise agreement with Woodside Energy for the continued use and expansion of its BabelFish suite.

GE Oil & Gas’ Nailsea, UK unit is to benefit from a $1.1 billion contract with Chevron Australia for the Greater Gorgon LNG Development. The deal includes subsea equipment and controls and will leverage the Nailsea SmartCentre for remote monitoring.

Palantir Solutions has built a planning, reporting and portfolio analysis system for Talisman Energy.

P2 Energy Solutions reports the installation of Tobin Enterprise Land and Tobin GIS Studio at one of ‘Canada’s largest integrated energy and energy-related companies.’ The client also signed up for a Western US map data subscription via Tobin’s All Access program.


GDF Suez’ ‘Actarus’ webMethods-based business intelligence

Logica develops business intelligence solution for EU utility at Software AG ‘Factory’ in Bordeaux, France.

GDF Suez has rolled out Actarus, a business intelligence solution that provides its accountants and managers with real-time sales information. As the EU gas market was opened to competition, GDF Suez’ accounting services needed more detailed information on upstream transactions enacted by an expanding supplier base. GDF elected to model its business with a combined business process management (BPM) and composite application framework (CAF) approach.

Actarus was developed by IT services provider Logica using Software AG’s webMethods BPM Suite. The solution takes advantage of GDF’s existing technology investments and integrates several key technologies, including a portal, EAI server, process engine, and task engine, all within a service-oriented architecture (SOA) framework.

Work on Actarus commenced in 2000 using webMethods and Logica’s project delivery capabilities for SOA and BPM. Software AG chief product officer Peter Kuerpick said, ‘Logica has built a SOA-based solution that can be leveraged to support additional integration and composite application development and that will deliver continued economic value for GDF Suez.’ Last year, Logica opened the ‘Software AG Factory’ in Bordeaux, France, to provide a ‘One-Stop-Shop’ for services and capabilities around webMethods products. More from softwareag.com.


Congressman asks Minerals Management Service for action

Congressman Raul Grijalva (D-AZ) asks for data checkup on BP's Atlantis platform.

Following the Food and Water Watch request to the US Minerals Management Service to suspend production on BP's flagship 'Atlantis' development in the Gulf of Mexico over alleged documentation deficiencies (OITJ July 2009), Congressman Raul Grijalva (D-AZ) has joined the fray, with a letter addressed to MMS director Ms. Elizabeth Birnbaum. The letter asks the MMS 'to verify the existence of a complete and accurate set of drawings and report its findings to Congress.' The letter, signed by 19 House Democrats, also expressed 'concern' that the MMS' interpretation of rule 30 C.F.R. § 250.903(a)(1) concerning approval to operate at Atlantis 'indicates a less than acceptable standard.' The letter cites 'communications between the MMS and congressional staff' suggesting that while the company must maintain 'as-built' documents, there is no requirement that such documents 'be complete or accurate.'

A report on a gas leak incident* on Atlantis last July cited 'the lack of written procedures [..]' as a contributing cause. Since then, BP has been working with its suppliers and the MMS to fix the problem and is having an 'Installation, Operating and Maintenance Manual' drafted covering the equipment at issue. More from grijalva.house.gov.

* links/1003_2.


EDSA Paladin Gateway gets SDK

Toolkit allows electrical power analytics to be incorporated in third party software.

San Diego, CA-based EDSA has released a software development kit (SDK), Paladin Gateway, for its Paladin Live electrical power analytics and monitoring solution. EDSA’s tools help facility owners understand how interruptions to the power supply affect production and refining, optimize power system design and reengineer their infrastructure. The Gateway SDK accepts input from SCADA and other data acquisition platforms and provides third party application developers with a power analytics toolkit.

EDSA CTO Kevin Meagher explained, ‘The benefit of power analytics lies in Paladin Live’s ability to diagnose power problems and inefficiencies at the earliest possible stage, isolating them at the detailed component level. This means communicating in real-time with hundreds or thousands of data-generating devices located throughout mission-critical infrastructure.’

Paladin unlocks data trapped in isolated, proprietary architectures and heralds a new ‘open’ systems architecture where data from multiple vendors’ solutions can be aggregated and delivered as an organization-wide power analytics solution. EDSA customers include ABB Offshore, Aker Kvaerner, BP, CNPC, CNOOC, ExxonMobil, Petronas and Shell. More from edsa.com.
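By way of illustration only, the following Python sketch shows the kind of gateway-style adapter the SDK is intended to support: SCADA readings come in, a simple analytic flags anomalies. The data model and function names are hypothetical and do not represent EDSA's Paladin Gateway API.

# Minimal sketch of a gateway-style adapter: SCADA readings in, analytics out.
# Names are hypothetical and do not represent EDSA's Paladin Gateway API.

from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class ScadaReading:
    tag: str          # e.g. 'BUSBAR_01.VOLTAGE'
    value: float      # measured value
    nominal: float    # design/nominal value for the same point

def flag_excursions(readings: Iterable[ScadaReading],
                    tolerance: float = 0.05) -> List[str]:
    """Return tags whose readings deviate from nominal by more than 'tolerance'."""
    flagged = []
    for r in readings:
        if abs(r.value - r.nominal) / r.nominal > tolerance:
            flagged.append(r.tag)
    return flagged

if __name__ == "__main__":
    sample = [
        ScadaReading("BUSBAR_01.VOLTAGE", 11.0, 11.0),
        ScadaReading("BUSBAR_02.VOLTAGE", 10.2, 11.0),   # ~7% low, gets flagged
    ]
    print(flag_excursions(sample))   # ['BUSBAR_02.VOLTAGE']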


Shell, IBM sign R&D outsourcing agreement

Joint program to investigate ‘predictive analytics’ application to oil and gas production.

Shell and IBM have launched a joint R&D program to investigate the application of ‘predictive analytics’ to oil and gas production. Shell expects to improve its reservoir modeling by ‘combining its subsurface and reservoir expertise with IBM’s analytics and simulation experience.’ The companies are to explore ‘advanced techniques for reconciling geophysical and reservoir engineering field data’ to ‘reduce the educated guesswork’ inherent in today’s workflows. Shell’s Gerald Schotman said, ‘This will not be done through expensive experimental facilities, but by bringing together a team and powerful computers so we can be smarter than before.’

The broad-reaching program includes time-lapse seismic, subsurface measurements, production, well and laboratory data. The partners are to ‘reformulate and automate’ the task of reconciling all of the above in an ‘enhanced, yet practical, mathematical model.’ The results will become part of Shell’s proprietary reservoir modeling tool kits for application in new oil and natural gas developments as well as existing assets. More from ibm.com.
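The 'reconciliation' problem alluded to is, in generic textbook terms, a weighted misfit minimization between model predictions and observations of differing quality. The toy Python sketch below shows the simplest possible case, an inverse-variance weighted estimate of one quantity from two noisy sources; it is purely illustrative and says nothing about Shell's or IBM's actual formulation.

# Toy illustration of data reconciliation as weighted least squares: estimate a
# single reservoir property (e.g. average pressure) from two noisy measurement
# types with different uncertainties. Generic textbook arithmetic only.

def reconcile(measurements, variances):
    """Inverse-variance weighted estimate of a common underlying quantity."""
    weights = [1.0 / v for v in variances]
    estimate = sum(w * m for w, m in zip(weights, measurements)) / sum(weights)
    return estimate

if __name__ == "__main__":
    # Pressure (bar) inferred from well tests vs. time-lapse seismic inversion,
    # the latter being the noisier of the two sources.
    well_test, seismic = 252.0, 246.0
    p = reconcile([well_test, seismic], variances=[4.0, 16.0])
    print(f"reconciled pressure: {p:.1f} bar")   # closer to the well-test value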


Skymira ‘Remote Information System’ for J. Irwin Co.

Solution targets automated field tickets, replacing paper-based forms.

Satellite and cellular technology solution provider Skymira is implementing a ‘comprehensive’ remote information system (RIS) for natural gas construction services contractor J. Irwin Company. Phase I of the RIS includes automated field tickets to replace paper-based forms.

Angela Schultz, CEO, J. Irwin Company said, ‘Our paper forms were cumbersome and error-prone. Initial installations are already showing improvements and a measurable reduction in errors. We’re looking ahead to the next phases of implementation, including time reporting and real-time inventory management.’

Skymira CEO Robert Landsfield added, ‘Implementing an RIS, rather than installing separate component applications, enables J. Irwin Company to refine its business processes and reduce costs, adding new applications that integrate with existing infrastructure and processes.’

Following the initial deployment, Skymira will develop additional applications including time reporting integration with the main office, field equipment tracking, real-time inventory management and engine diagnostics. More from skymira.com.


Merrick launches hosted production data solution

Production data management ‘Software as a Service’ formalized with new team and data center.

Merrick Systems has launched a hosted ‘Software as a Service’ (SaaS) version of its oil and gas production data management solution. Users can now access Merrick’s Field, Operations or Enterprise solutions served from a secure data center. Merrick’s packages include field data capture for volumes, inspection and maintenance, production operations, hydrocarbon accounting, regulatory filings and web-based management reporting.

Merrick VP Clara Fuge said, 'We have been supporting clients without the necessary IT resources with hosting services for several years. Responding to a growing demand, we recently expanded our infrastructure and assembled an experienced team to run our data center and hosted software. The repackaged services are backed by our support and consulting team that has 15 years of experience implementing, training and integrating our software with our clients' businesses.'

Along with the software and infrastructure provision, Merrick’s staff helps out with routine tasks such as adding new users, resetting passwords, ensuring weekend connectivity support and monitoring response time. Merrick’s Co-Founder and CFO Samina Farid added, ‘The hosted solution yields significant improvement to clients’ bottom lines and extends the reach of our technology to a greater segment of the oil and gas market.’ More from merricksystems.com.


Total’s approach to affiliates’ data management

GADAMA project heralds ‘professional’ data organization. Finder to PPDM migration mooted.

Speaking at the first EU Professional Petroleum Data Managers' (PPDM) Association user group meeting last month (more from this meeting in last month's Oil IT Journal), Claude Martin described how Total was promoting its data management activity and building a 'professional' data management organization via its affiliates' geosciences data management program, 'Gadama.' Total is essentially a Schlumberger shop; data managers use ProSource while end users work with DecisionPoint. Total's data landscape includes LogDB, Finder and eSearch. An Avocet-based production data management system connects to the Gadama infrastructure through Finder. Following a three year pilot in Indonesia, Gadama was rolled out to Nigeria and Angola. After a 'pause for evaluation,' deployment to most of Total's major affiliates followed.

Gadama comprises some 3,000 pages of best practice documentation. Affiliates now have a single entry point to multiple repositories that share taxonomies and definitions. Metrics show that despite escalating activity, growth in the data management community has been modest and productivity has risen significantly.

Martin stated that Total is ‘very happy with Schlumberger. Finder is a good database which provides the bricks—but you still need to build the house.’ Martin wondered if such a project would be possible with PPDM noting that ‘even with Schlumberger’s toolset, integration and deployment is not easy.’ Even now it takes over six weeks for an affiliate to install the system.

But the future is open as Schlumberger is migrating its Finder user base to the new Seabed database. Mappings will change, and adopters like Total will have to rewrite 30-40% of ProSource views and add new licenses. ‘When your provider changes policy you have to follow them.’

Gadama has brought Total a better understanding of how to run a data management organization. Most of the métier (business) re-engineering has been completed. A possible alternative to a Seabed port would be 'to develop our own corporate database.' Is PPDM a potential solution? Total's decision will likely be made this year. More from ppdm.org.


Marsh metrics for risk analysts

New report, ‘The Hundred Largest Losses’ published. Risk management now ‘applied science.’

Insurance broker and risk advisor Marsh has released a report titled 'The 100 Largest Losses*' that quantifies the most significant property damage losses in the hydrocarbon industries since 1972. The report finds that despite an increase in the size and scale of new energy infrastructure projects, national oil companies (NOCs) and other energy and chemical concerns are experiencing fewer and less-severe major losses than in previous years.

Speaking at Marsh's National Oil Companies Conference in Dubai last month, Jim Pierce, Chairman of Marsh's Global Energy Practice, said, 'Energy sector risk management has evolved into an applied science that is making a real difference to mitigating the catastrophic losses of previous years. Improved risk management techniques are even more critical in the age of the $50 billion mega project.'

Despite massive growth in the sector, the report shows that catastrophic losses at petrochemical plants, gas processing plants, upstream projects and terminal and other distribution points have declined over the last five years as companies enhance their risk management techniques.

Marsh predicts that NOCs could benefit from lower risk with a potential 20% reduction in insurance costs. Companies involved in construction projects also stand to benefit from current market conditions.

Of the 20 largest losses in the last 38 years, six occurred in the US, five in Europe, two each in South America, Africa, Australia and Asia, and one in the Middle East. The largest loss was the 1988 Piper Alpha explosion, with an estimated $1.6 billion in property damage and 167 lives lost. More from marsh.com.

* Download from links/1003_1 (registration required).


Supply chain automation meets wireless tracking

Wireless network specialist Enfora teams with Smart Management on oil and gas asset monitoring.

Richardson, TX-based wireless networking solutions provider Enfora is teaming with Norwegian supply chain automation specialist Smart Management to enable wireless tracking and monitoring of assets and equipment across the oil and gas supply chain. The deal combines Smart Management’s TAG-HUB supply chain management system with Enfora’s Spider AT asset monitoring system. TAG-HUB lets companies exchange asset information, providing services that facilitate inspection, documentation and data-capture.

Spider MT provides GSM/GPRS device location functionality for mobile tracking applications. GPS data is transmitted to operations centers, web pages, localized computers or mobile data terminals worldwide via cellular radio. The devices provide information such as asset location and movement patterns, based on user defined alerts, specific events or 'geofence' boundaries. Spider AT has been ATEX certified for operation in hazardous environments. Enfora's Services Gateway offers a GUI for tag management and device provisioning. The joint venture will initially focus on opportunities in Norway before extending the solution worldwide. More from lee.lanucha@enfora.com.
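As an illustration of the 'geofence' concept, the Python sketch below checks whether a GPS fix falls outside a circular boundary around a depot using the haversine formula. It is generic arithmetic only; nothing here reflects Enfora's Spider firmware or the Services Gateway.

# Sketch of a circular 'geofence' check of the kind used to trigger asset
# alerts: flag a GPS fix that falls outside a radius around a depot.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def outside_geofence(fix, centre, radius_km):
    """True if the GPS fix lies outside the circular fence."""
    return haversine_km(fix[0], fix[1], centre[0], centre[1]) > radius_km

if __name__ == "__main__":
    depot = (58.97, 5.73)            # Stavanger-area coordinates, for example
    fix = (59.05, 5.90)              # reported asset position
    print(outside_geofence(fix, depot, radius_km=10.0))  # True, so raise an alert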

