June 2010


Moebius and the Repsol Brain

Repsol SemTech 2010 presentation reveals three years of ‘practical application’ of semantic technology underpinning knowledge management initiatives including ‘Repsol Brain’ application.

Speaking at this month’s SemTech 2010 conference in San Francisco, Jesus Contreras of Intelligent Software Components (Isoco) presented Repsol’s ‘semantic technology’-based ‘Moebius’ corporate ontology and its ‘innovation capture’ knowledge management system a.k.a. the Repsol Brain.

Repsol management has recognized the value of its intangible knowledge assets for some time and has encouraged departmental-level KM projects such as communities of practice, talent management, search, innovation and reputation tracking.

The company wanted to break with the traditional ‘knowledge is power’ syndrome and leverage Web 2.0 tools to increase productivity through tagging, wikis and other social networking tools.

However this has led to a situation where over 40 KM vendors’ tools have been deployed across the enterprise: hence the decision to use ‘semantic technology’ to tie these disparate efforts together and build an enterprise-wide KM solution.

Successive attempts to achieve this have leveraged thesauri, business unit-based ontologies and latterly the Moebius Project, Repsol’s company-wide ontology and ‘virtual’ knowledge base.

The early adoption of semantic technology in Repsol has not been without problems. Building the ontologies is hard and the new technology has met with some resistance from IT. Repsol turned to Isoco, a spin out of the Barcelona Institute of Artificial Intelligence, to provide ‘semantic technologies.’

One application area of the Moebius ontology is the Repsol Brain—an ‘open innovation’ application for idea management. The Brain is a collaborative portal where employees and external staff can put forward new ideas. The expert system relates similar ideas and identifies suitable experts and evaluators. Repsol claims ‘increased quality of outcomes compared with traditional solutions’ for the Brain.

Neither Isoco nor Repsol was saying exactly what is meant by the ‘semantic technologies’ deployed under the Moebius hood. But Contreras gave a pointer to the EU-sponsored ‘Value-IT’ project in his SemTech presentation.

Value-IT, a component of the EU 7th Framework Program, sets out to ‘accelerate take-up of semantic technologies for the enterprise.’ Isoco manages the Value-IT project and has contributed the Repsol enterprise semantic technology use case to the initiative.

More from SemTech on www.oilit.com/links/1006_1, from www.isoco.com and www.value-it.eu and on semantics in our report from Semantic Days on page 5.

A possible reason for Isoco’s reticence to divulge more might be potential intellectual property issues weaving their way around the Moebius strip!


PPDM API coup

Custodianship of venerable American Petroleum Institute standard is transferred to the PPDM Association. Funding sought for revamp project.

The American Petroleum Institute has transferred custodianship of the authoritative API well numbering standard to the Calgary-based Professional Petroleum Data Management Association (PPDM). PPDM is now organizing a publicly available project to assess and update the well numbering standard. The revised standard will be made available free of charge to anyone. Industry and regulators will be encouraged to adapt their systems to use the revised standard.

Speaking at PNEC last month, PPDM CEO Trudy Curtis explained how confusion in well numbering has led to orphaned data and in some cases, data attached to the wrong well! Information is often scattered across systems while ‘official’ well identity rules may be inconsistently applied.

PPDM is engaged in researching global well naming conventions and tidying up issues such as junked wells and pilot holes. The API standard will be augmented with work coming from PPDM’s ‘What is a Well’ study group. PPDM is seeking funding from members to support participation from key experts and to cover communication, technical and administrative costs. Total project costs of $250,000 are being sought. More from www.ppdm.org.


Putt’s Law and the data managers

Editor Neil McNaughton reflects on Putt’s Law, on the DAMA Guide to the Data Management Body of Knowledge and on presentations that send him to sleep. He concludes that the case for a data management ‘profession’ like accountancy or law is flawed because of upstream data idiosyncrasy.

Statoil’s Lars Olav Grøvik wound up his keynote presentation at the Semantic Days conference (report on page 7) in Stavanger last month citing ‘Putt’s Law.’ This has it that, ‘technology is dominated by two types of people: those who understand what they do not manage, and those who manage what they do not understand.’ Having ploughed my way through the DAMA DMBOK Guide1 (review on page 3) I was left wondering, was this book written by the ‘understanders’ or the ‘managers’?

Travelling to conferences around the world I hear a lot of interesting and varied technical material. Unfortunately, not all presentations are created equal. Occasionally I think to myself, ‘what a lot of waffle; how can anyone get away with either a) such bland, motherhood and apple pie stuff or b) with such unsubstantiated claims?’ What is truly puzzling though is that, as I rant inwardly, or drift off into a jet-lag induced reverie, I may look around to see other people in the audience nodding enthusiastically in agreement!

I used to see this as evidence of the frailty of the human race. But it happens so often now that I am looking for alternative explanations. The most obvious one is that I have missed a crucial point, but I like to think that this is not always the case. I prefer to think that what is happening is that I am overly aligned with the ‘understanders’ side of the equation. My management credentials are more questionable. I wonder if the ‘nodders’ come from the other side of Putt’s duality, the non-understanding managers?

But why do these folks, wherever they are coming from, find stuff to agree with in vague talk? I think that this is due to the way management-speak has developed. Those who study and write on management are constantly trying to derive rules and commonalities across different subject areas. Occasionally, as in accounting or law, this works, and a trans-industry discipline is codified, taught and practiced.

Elsewhere it is harder to find cross industry commonality. So the management gurus adopt a language that is deliberately vague. Words like ‘asset’ and ‘resource’ are preferred over old-style terms such as ‘drilling rig’ and ‘employee.’ The implication is that it doesn’t make any difference whether a ‘resource’ is a time-served domain specialist, a new hire, or an ‘outsourced’ individual working for a third party. This is a process of abstraction that hides such awkward granularity.

With a high level view, the debate, instead of getting harder, as you might expect, gets easier. At a suitable level of abstraction, when manager A makes a statement of a sufficiently abstract nature, then manager B can immediately agree with it—without there necessarily being any alignment between the two inner trains of thought.

Another observation is that at the intersection of business and IT (or another domain), words are invented, repurposed and shared. Technical terms used in one field are recycled in a different context. Their meaning evolves with time as does their currency. You wouldn’t get very far today using buzzwords from the 1990s. But if discourse and meaning ‘evolve,’ what is driving their natural selection?

Beyond the desire of the speaker to be up to date and ‘smart,’ the managerial tendency towards abstraction ‘selects’ words that cross domain boundaries. It is comforting to think that the same ideas apply in different places. Thus a term is used in a different context at the risk of meaning something rather different.

This process is a kind of linguistic entropy: as words are re-purposed and used in a broader context, they move away from a precise meaning. Gaining nods, losing sense and fuelling games of buzzword bingo!

This process actually has quite a lot to do with IT, with master data, and, another great buzzword of today, ‘ontologies.’ I discussed some aspects of this in my ‘What is a turnip?’ editorial (February 2010). But I think that I missed a trick. The ‘solution’ to the turnip problem is not Wikipedia, it is Linnaeus! Instead of abstracting away, sharing words and sharing terminology, someone needs to tie the thing down with some decent definitions.

My intent was to review the DM-BOK Guide in this context—to see if it nailed down the body of knowledge in a Linnaean sort of way onto which a science of data management could be built. But I was wrong-footed by the fact that the DM-BOK is actually volume II in a series of which volume I is the ‘Dictionary.’ So plan B is to order the dictionary and then to report back in a later review.

Preliminary findings from our reading of the DM-BOK (page 3) show that the Putt dilemma is everywhere and is at the heart of the data management issue. Is enterprise data management about ‘data’ or about ‘management?’

In our report from PNEC Volker Hirsinger offers more insight into these issues. Managing seismic data actually involves the opposite of abstraction. As Hirsinger shows, intimate knowledge of observers’ logs, navigation data, velocities, processing workflows and coordinate reference systems is required. There is more to it than ‘managing’ the large volumes of field data.

The risk of a DAMA-like approach is that, as specialist data managers join the generalists and dialog at a suitable abstract level, more warm feelings and possibly hot air will be generated than useful insights into the nitty gritty of technical data management.

I would think differently if a DAMA-like body included other users of technical data such as meteorologists, nuclear and space scientists or microchip designers. Perhaps even some from the high performance computing brigade too—they manage seismic don’t they? That might make for a more relevant community. On which note it’s interesting that RESQML has adopted the NCSA/NASA HDF5 data standard for multi-dimensional data. I bet they didn’t learn that in the DM-BOK!

1 Data Management Association Guide to the Data Management Body of Knowledge. ISBN 978-1-935504-02-3, $74.95.


Book Review—The Data Management Body of Knowledge

DAMA Guide sets out to ‘professionalize’ data management—does it succeed?

The DAMA Guide to the Data Management Body of Knowledge1 (DM-BOK) sets out to provide a compilation of principles and best practices and to provide practitioners with a framework to manage data and to ‘mature’ their information infrastructure. These laudable aims can be judged at two levels. First, by how well the book achieves its stated aim and, second, by how appropriate the isolation of enterprise data management is as a discipline—given that there is, in every enterprise, a constellation of domain specialists, database managers, IT hardware and software experts who, to a greater or lesser extent, already occupy the ‘space’ delineated by the DM-BOK.

We were disappointed to find that the Guide does not contain a glossary or definitions—these were issued in a previous publication. Also, many interesting topics in the index—for instance ‘geospatial metadata standards’—are one-liner links to an external website. So to get the full benefit of the DM-BOK you need to acquire a) the Dictionary and b) a few hundred reference works. This would not be too bad if there were any indication as to which works are really essential. Another irritation is that, instead of a chapter on ‘master data’ or ‘data quality’ there are chapters on ‘master data management’ and ‘data quality management.’ This allows DM-BOK’s authors to speak from the management high ground rather than addressing how things get done.

It is not all bad though. The chapter on data quality introduces the Deming cycle and gets into tools and tricks for cleaning up names—although it would have been nicer to name some of the tools actually in use! Data ‘entropy’ gets a good treatment, as does the idea that it is better to fix data upstream, before it gets trashed—building quality into the data resource. There is also advocacy for a single data architecture as key to quality in enterprise data.

One problem with the Guide is that, because it is building a ‘profession,’ it is too abstract and offers too little in the way of concrete examples. This combines with a tendency to slip into database jargon and hampers understanding. To give an example, a ‘foreign key’ is described as ‘an attribute that provides another link to an entity.’ Rather opaque when compared with Wikipedia’s ‘a column that refers to a (..) column in another table.’ In a similar vein, DM-BOK ploughs through one-liner definitions of normal forms—up to number six—without offering any real insight as to what is going on. One can imagine DM tyros having to rote-learn this stuff, perhaps chanting it out like mid-20th century schoolchildren learning math (well at least they did learn it then!).
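
For readers new to the jargon, a five-line example arguably says more than either definition. The sketch below is ours, not the Guide’s; it uses Python’s built-in sqlite3 module and invented table names to show a foreign key doing its job: linking each well to its field and rejecting a reference to a field that does not exist.

```python
import sqlite3

# A minimal illustration (ours, not DM-BOK's) of a foreign key: each row in
# 'well' refers back to a row in 'field' via field_id.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE field (field_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE well (
    well_id  INTEGER PRIMARY KEY,
    name     TEXT,
    field_id INTEGER REFERENCES field(field_id))""")  # the foreign key

conn.execute("INSERT INTO field VALUES (1, 'Ekofisk')")
conn.execute("INSERT INTO well VALUES (10, '2/4-A-1', 1)")       # OK, field 1 exists
try:
    conn.execute("INSERT INTO well VALUES (11, '2/4-A-2', 99)")  # no such field
except sqlite3.IntegrityError as e:
    print("rejected:", e)   # FOREIGN KEY constraint failed
```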

But perhaps more important than any misgivings one might have about the DM-BOK Guide is the feeling that it fails to make as good a case as it could for a data management profession along the lines of, for instance, accountancy. Read this month’s editorial for more on this.

DM-BOK’s level of abstraction would imply that a DM professional could switch between say E&P and banking. To take an extreme example, an SME who understands geodetics is unlikely to be severely tested by the content of DM-BOK. But he or she might pick up some useful jargon and an understanding of the vast, overlapping collection of technologies and solutions that make up the field. A manager on the other hand may get the impression that this is easier than it really is.

1 Data Management Association Guide to the Data Management Body of Knowledge. ISBN 978-1-935504-02-3, $74.95.


The value of real time data

SPE Gulf Coast Section’s one day Digital Energy Workshop hears from Oxy, Baker Hughes and Shell.

John Kimberling related Oxy’s experience of using real time data to minimize non productive rig time by reducing waiting times for equipment and personnel. Previously it took hours to get activity data from rig to office. Now Oxy uses bar codes with job codes in different languages including Spanish and Arabic. This lets Oxy determine exactly what rig activity is ongoing and schedule personnel accordingly. The system includes real-time feeds to assess weather impact on rig activity. Oxy now knows how much production is associated with its maintenance backlog and can calculate the savings from reduced billable hours. The company is also benefiting from knowledge sharing in co-located control rooms and sees the next step as moving from monitoring to control.

Baker Hughes’ Neil Harrop suggested that industry has lost focus on the value of data and is instead ‘focusing on technology for its own sake.’ Baker Hughes has spotted some trends in ‘smart well’ technology such as nanotech and intelligent well diagnostics applied to modeling long-term CO2 injection cycles and aquifer displacement. But these increasingly complex models require high end computing and are moving engineers further away from their data. Ubiquitous data access means that there is a new generation of engineers who believe all data presented to them, regardless of source.

A survey of 122 case studies demonstrated that industry underutilizes data. Successful digital projects are those where data acquired relates to the decision making process, leveraging investments in computing and communications. One project had a total of 100,000 tags on the surface and downhole—but only 1,500 of these were providing ‘actionable data.’

Shell’s Ron Cramer is investigating whether historical catastrophic process incidents could have been avoided using real time data. His analysis shows that most incidents occurred during transient operations and were detectable and developing long before the accident. Many occurred at night during shift changes. Often catastrophic events were associated with a lack of training on start-up or shut-down. In an ideal world, such operations should be remotely operated; indeed, on some greenfield operations, plants are designed with operators outside the blast zone. Cramer suggests using remote collaboration centers as ‘control towers’ for safety monitoring, leveraging existing infrastructure, staffing and data. Safety officer shift changes should be staggered over operational shift changes to make sure there are always ‘fresh eyes’ on transient and dangerous processes. A data historian can usefully complement the alarm system, adding real time analytics of incidents and near misses—possibly combined with training simulators. In most incidents involving fatalities, there was no sign that such systems were being used.
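
By way of illustration only, the kind of historian analytics Cramer has in mind could be as simple as flagging alarm-level readings that coincide with transient operations or shift changes. The following Python sketch is ours, with invented limits, shift times and tag data, not Shell’s implementation.

```python
from datetime import datetime, time

# Illustrative only (not Shell's implementation): flag high-pressure readings
# that coincide with transient operations or shift change, when, per Cramer,
# most incidents develop. Limits, shift times and sample data are invented.
SHIFT_CHANGES = [time(6, 0), time(18, 0)]
HI_LIMIT = 95.0                                   # % of design pressure

def near_shift_change(ts, window_min=30):
    minutes = ts.hour * 60 + ts.minute
    return any(abs(minutes - (s.hour * 60 + s.minute)) <= window_min
               for s in SHIFT_CHANGES)

def near_misses(samples):
    """samples: (timestamp, pressure_pct, mode) with mode 'steady' or 'transient'."""
    return [(ts, p) for ts, p, mode in samples
            if p > HI_LIMIT and (mode == "transient" or near_shift_change(ts))]

demo = [(datetime(2010, 6, 1, 5, 50), 97.2, "transient"),
        (datetime(2010, 6, 1, 12, 0), 96.0, "steady")]
print(near_misses(demo))   # only the shift-change/transient excursion is flagged
```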


Ephesia’s Impala statistics embedded in GoCad/Skua and JewelSuite

Paradigm and JOA sign with multi-point statistics vendor for ‘geologically realistic’ simulator.

Geneva, Switzerland-based Ephesia Consult has signed significant sales of its ‘Impala’ 3D multi-point statistics (MPS) package to Paradigm and JOA Oil & Gas. Paradigm is to commercialize Impala as a stand-alone product and the companies are to collaborate on integrating Impala into GoCad and Skua workflows. The 3D MPS add-on will be available in a future GoCad/Skua release.

Impala is a new, fully parallelized 3D MPS simulator that optimizes CPU performance and reduces RAM requirements. Results are claimed to be ‘geologically realistic,’ even when derived from structurally complex training images.

In a separate announcement, Ephesia has signed a similar deal with JOA for the development of an Impala-based 3D MPS simulator that will be offered as a plug-in for JOA’s ‘JewelSuite’ integrated reservoir modeling solution. More from www.ephesia-consult.com/Table/Impala.


Upstream Professionals TAM links well master data with SOX

Well Framework Methodology extended to ‘Total Asset Management’ and business intelligence.

Houston-based Upstream Professionals has rolled out ‘Total Asset Manager’ (TAM), leveraging its framework methodology (Oil IT Journal—September 2007) to jointly determine clients’ well master strategy. TAM builds on UP’s well lifecycle strategy counseling to deliver an implementation strategy, providing an extensible framework that enables operators to manage and evaluate assets in a ‘central, trusted source.’

TAM collects information from drilling, production, economics, and other applications into a central repository. TAM was designed to support the addition of new wells into the environment and be the focal point for dashboards, scorecards and other business intelligence, visualization and analytics.

One client uses TAM to compare budgets with actual numbers, with immediate drill-down into problem areas. Reporting spans high-level annual reports to daily well KPIs. BI tools can access TAM for more powerful analytics and business simulation. More from www.upstreamprofessionals.com.


Spatial Energy, Intermap team on terrain model-as-a-service

NextMap service offers on-demand high resolution digital terrain models and cultural data.

Boulder, CO-based Spatial Energy has teamed with Intermap Technologies of Denver to offer on-demand spatial data to the energy and utilities verticals. Spatial is to provide access to Intermap’s NextMap high-resolution elevation data from its Spatial on Demand (SoD) portal—claimed to be the world’s largest energy-specific online database. The announcement was made at the 2010 Global Petroleum Show in Calgary. SoD is an application and data hosting service that provides access to corporate geospatial and other energy-specific datasets. Users can check availability of geodata for a particular area of interest and order it directly from their desktop.

NextMap datasets include cultural features, digital terrain models and orthorectified radar images. Intermap claims over 10 million square kilometers of data worldwide and has ‘proactively’ remapped entire countries and built uniform national databases. According to Spatial Energy, ‘four of the five Fortune 100 oil and gas companies trust Spatial on Demand with their geospatial imagery assets.’ More from www.spatialondemand.com.


Kelman reports i-Glass revenue success

As seismic processing business declines, data management sees ‘double digit growth’ in US.

In its annual report for 2009, Calgary-based Kelman Technologies reports growth in its seismic data management division in the face of declining revenues from its data processing division. Data management revenue totalled CDN $10.5 million compared to $9.8 million in 2008, driven by ‘double-digit growth’ in the US data management business. Overall, revenue in 2009 was $21.6 million compared to $29.6 million in 2008.

Kelman’s ‘iGlass’ seismic data management solution was introduced to industry at last year’s SEG (Oil IT Journal November 2009). iGlass client software is a fully functional data management tool that allows oil and gas companies to manage their seismic assets. The iGlass portal adds a map view of data assets.

According to president and CEO Rene VandenBrand, ‘Clients now recognize the need for increased efficiency in dealing with large volumes of seismic data and in avoiding the growing legacy data problem. We expect this business unit to continue to grow in 2010 and beyond.’ More from www.kelman.com.


Software, hardware short takes

Gedco Vista, PSE gPROMS, IDS DrillNet Junior, Paradigm CRAM, Wonderware Information Server.

Gedco has released V10 of its ‘Vista’ cross platform 2D/3D seismic processing package with multiple enhancements to interactivity, workflow and log data integration—www.gedco.com.

~

Release 3.3 of PSE’s gPROMS predictive process modeler speeds solution of complex batch and dynamic processes and offers protection of embedded intellectual property—www.psenterprise.com.

~

IDS has announced DrillNet Junior, an ‘off-the-peg’ drilling reporting service for smaller operators—www.idsdatanet.com.

~

Paradigm has teamed with Acceleware on ‘CRAM,’ a new full waveform RTM processing package—www.pdgm.com.

~

Invensys has released V4 of its Wonderware Information Server, adding enhanced operating system support and Microsoft Silverlight-based graphics—www.invensys.com.

~

More software short takes in next month’s Oil IT Journal.


Semantic Days 2010, Stavanger, Norway

Schlumberger, Statoil and Baker Hughes offer different slants on semantic technology’s impact.

Schlumberger Fellow Bertrand du Castel’s keynote, ‘Upstream ontologies: will we ever learn?’ described the industry’s long quest to overcome the barriers between oilfield sensor data and expert decision makers. Size, remoteness, and data ‘invisibility’ combine to make the accumulation of knowledge difficult. From the databases of the 1980s, through the networks of the nineties to the ontologies of the first decade of this century, the industry has come a long way. But the next challenge looms—that of more ‘human-centered’ automation and systems that can ‘learn.’

For du Castel, artificial intelligence (AI) is a means to automation. Expertise is enhanced by automation in data management, simulation, uncertainty management and prognostics. Experts make decisions and are part of the automation continuous improvement process. A multi-vendor asset is fully networked from down-hole to seabed and surface. Asset performance metrics and uncertainties in future performance are constantly updated. Automation plays a key role in a rolling simulation, uncertainty analysis, and optimization of asset exploitation.

Citing his own 2008 oeuvre ‘Computer Theology1’, du Castel claims ‘There is much to human beings, of which little has been decoded. Artificial intelligence is remote, but leveraging what’s known is within reach.’ The big new things are ‘description logic,’ a math breakthrough that has application across signal processing and control systems, along with ‘Bayesian reasoning.’ All of which is driven by a ‘reasoning engine’ and an ‘ontology’ of upstream terminology. ‘The ontology describes sensor fusion and control activities in a uniform manner so that reasoning can automatically process data input into commands.’

Du Castel’s presentation lays out somewhat obscure, patent-pending claims to ‘stochastic grammars’ which appear to be workflow patterns (the example shown is in the drilling domain) driven by a kind of truth table of drilling status elements. Input to the ‘reasoner’ is a real time feed of weight on bit, rate of penetration etc. Du Castel operates at a level that sets out to fly above that of ordinary mortals. Thus (apart from references to computer theology) we learn that ‘description logic ontologies are monotonic’ while ‘the brain is stochastic and learns.’
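
To make the idea concrete (and without claiming any insight into the patent applications), a rule that maps a real time feed of drilling parameters onto status terms might look something like the toy Python sketch below. State names and thresholds are invented.

```python
# A toy illustration (not du Castel's patented method) of how a 'reasoner'
# might map a real-time feed of drilling parameters onto status terms drawn
# from an ontology. Thresholds and state names are invented.
def drilling_state(weight_on_bit: float, rate_of_penetration: float,
                   rotary_speed: float) -> str:
    if weight_on_bit < 1.0 and rotary_speed < 5.0:
        return "out_of_hole_or_connection"
    if weight_on_bit >= 1.0 and rate_of_penetration > 0.5:
        return "drilling_ahead"
    if weight_on_bit >= 1.0 and rate_of_penetration <= 0.5:
        return "possible_stall_or_hard_stringer"
    return "unclassified"

feed = [(2.0, 12.0, 120.0), (8.5, 0.1, 110.0), (0.2, 0.0, 0.0)]  # (WOB, ROP, RPM)
for wob, rop, rpm in feed:
    print(drilling_state(wob, rop, rpm))
```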

Lars Olav Grøvik’s (Statoil) presentation was a bit more down to earth—although less ‘semantic!’ Statoil’s challenge today is information overload. The petrotechnical ‘wheel’ turns around data: seismics, well correlations, petrophysics and reservoir engineering—to name but a few of Statoil’s workflow elements. This picture hides a plethora of domain specific, data hungry applications including Petrobank, R5000, Recall, GeoFrame, Energy Components, Spotfire and many others. Statoil’s onshore operations centers are predicated on the existence of real time data streaming from the field. But things can go wrong—with unstable data streams, poor connectivity and programming/setup errors. Not only does the smooth running of the operations center depend on data, so does the future value of the business—along with data reporting to regulators and other stakeholders. Effective work processes mandate providing the right data to the right people at the right time—and with the right quality.

Quoting Chevron’s Jim Crompton (as reported in Oil IT Journal), Grøvik noted the ‘kink’ in the information pipeline. The kink is located between oilfield automation/real time systems and analytics and modeling. The kink is caused by disparate data formats, quality, poor master data, ‘shadow systems’ and system complexity. The situation does not appear to be improving any time soon. One data specialist in a large oil company estimated that half of all data used is not actually captured—and of that which is kept, 78% will never be looked at! Grøvik wound up citing Putt’s Law—‘Technology is dominated by two types of people: those who understand what they do not manage, and those who manage what they do not understand.’

Inge Svensson (Baker Hughes) enumerated no fewer than eight data integration strategies to conclude that ‘ontology-driven processes’ win out over traditional data integration as the number of data sources increases. One such is the AutoConRig (ACR) project—powered by semantic web technology. ACR sets out to automate drilling, open loop control and envelope protection, replacing verbal communication between service company and driller. The control can be extended beyond the drilling environment for integration with models, real-time surface and downhole data. BHI eats the semantic dogfood in its Sand Control domain taxonomy, a three level vocabulary of 1,400 terms developed in Protégé and Excel. This underpins the ‘Beacon’ knowledge management system and helps find the right information and/or the right people at the right time in an expertise and domain knowledge base. Svensson believes that domain taxonomies/vocabularies are extremely powerful, ‘We have found multiple uses in other application areas–but creating good taxonomies and getting consensus is hard.’ Partitioned taxonomies are probably required in technical domains. Few good tools exist for taxonomy management. Future work includes PPDM integration.
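
As an illustration of what such a taxonomy buys you, the toy Python fragment below (invented terms, not Baker Hughes’ Sand Control vocabulary) shows a three-level hierarchy being used to expand a search term to its narrower terms, which is the basic trick behind finding the right information and the right people.

```python
# Illustrative only: a fragment of a three-level domain taxonomy of the kind
# Svensson describes, used to expand a search term for retrieval. The terms
# below are invented, not Baker Hughes' Sand Control vocabulary.
TAXONOMY = {
    "sand control": {
        "screens": ["wire-wrapped screen", "premium mesh screen"],
        "gravel pack": ["open hole gravel pack", "frac pack"],
    },
}

def expand(term: str):
    """Return the term plus all narrower terms, for query expansion."""
    terms = [term]
    for level2, level3 in TAXONOMY.get(term, {}).items():
        terms.append(level2)
        terms.extend(level3)
    return terms

print(expand("sand control"))
```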

More from Semantic Days on www.oilit.com/links/1006_9.

1 Computer Theology: Intelligent Design of the World Wide Web, ISBN 978-0980182118—www.oilit.com/links/1006_2.


14th PNEC E&P Data Conference, Houston

Shell on seismic data management, Petrel’s evolving role and R5000 deployment—its ‘most complex project ever.’ Continental Resources deploys a PPDM/NeuraDB master data system. ExxonMobil struggles with global seismic inventory. Southwestern Energy teams with Petris on data quality.

According to Cora Poché, Shell has petabytes of data online and shelves of cartridges. But despite a global seismic data management policy and standards, sometimes projects are executed with incomplete data. Poor audit trails mean that data still can get lost and may be re-purchased. Shell has been working to improve this situation since 2008 with a global map-enabled index of all of Shell’s seismics. Company policy for data preservation, remastering and vectorization has been published as the Shell E&P Global Seismic Data Management Manual. The project started in the US and was going global when the oil price cratered in 2009 and the project was scaled back. Poché observed a strong inverse correlation between the oil price and discretionary spend on data management!

David McMahan outlined how Continental Resources got started with a PPDM-based master data system spanning G&G, vendor data, physical inventory and workflow improvement. Continental buys data from IHS, HPDI and MJ Systems, which is used inter alia in Geographix projects. NeuraDB, an ‘out-of-the-box’ PPDM database from Neuralog, was deployed in a SQL Server/ESRI ArcSDE environment. This was extended with data loading tools, an EDMS and web services updates from IHS Enerdeq and HPDI. The 5 million well dataset is now updated nightly. Continental is now planning for integration with a new accounting system and a transition to the PPDM 3.8 data model. All of the new data loaders are now available in the ‘shrink wrap’ version of NeuraDB, which will likewise be migrating to the 3.8 database later this year.
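
Under stated assumptions, the nightly refresh boils down to an ‘upsert’ of vendor well headers into the master table, keyed on a unique well identifier. The Python/SQLite sketch below is illustrative only; the column names are simplified and are not the PPDM model.

```python
import sqlite3

# Illustrative only: a nightly 'upsert' of vendor well headers into a master
# table keyed on UWI. Table and column names are simplified, not PPDM 3.7/3.8.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE well (
    uwi TEXT PRIMARY KEY, well_name TEXT, operator TEXT,
    latitude REAL, longitude REAL)""")

def nightly_refresh(conn, vendor_rows):
    """vendor_rows: iterable of (uwi, well_name, operator, latitude, longitude)."""
    conn.executemany(
        """INSERT INTO well (uwi, well_name, operator, latitude, longitude)
           VALUES (?, ?, ?, ?, ?)
           ON CONFLICT(uwi) DO UPDATE SET
               well_name=excluded.well_name, operator=excluded.operator,
               latitude=excluded.latitude, longitude=excluded.longitude""",
        vendor_rows)
    conn.commit()

nightly_refresh(conn, [("42-123-45678", "Smith 1H", "Continental", 35.5, -97.5)])
print(conn.execute("SELECT * FROM well").fetchall())
```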

Schlumberger’s data guru Steve Hawtin has been looking at the impact of the Data Management Association’s writings and activity on the oil and gas vertical. DAMA has published a set of data management best practices in the form of a Data Management Dictionary (2008) and the Guide to the Data Management Body of Knowledge (2009). DAMA has carved up data management into functions and elements and laid down guidelines for data governance—where Hawtin sees a pressing need in the upstream. But E&P differs from the main body of DAMA (financial services) in that it adopts a buy rather than build policy and is mostly confronted by integration issues. Here DAMA gives some good pointers although not all are applicable. Hawtin believes that attempts to leverage the DAMA approach of reference data and master data management failed in E&P in the 1990s with Epicentre. In respect of metadata, Hawtin believes ‘our definition is completely different!’ While the majority of the DM-BOK is a valuable resource, E&P data managers need to be aware of conflicts.

Like Shell, ExxonMobil is fretting over its global seismic data inventory and is working on improving access to its massive 3D datasets, as Jim Blackwell explained. The system will provide a world map-based front-end to a database of proprietary and vendor data with metadata links to other databases of field tapes, survey notes, legal rights etc. Following an evaluation of third party solutions, ExxonMobil decided to extend its own proprietary database in a major in-house development leveraging ArcGIS/SDE and the PetroWeb application.

John Deck described Southwestern Energy’s data quality framework, a joint development with Petris Technology. Prior to the project, Southwestern was managing its data using a combination of Excel, SQL Server and folders. Deck was brought in to ‘sort this out,’ and to improve data management of apps such as Petra, Kingdom, OpCenter and Property Master. The project centers on the creation of a single source of truth, common naming conventions and data governance. One early win was the use of Dynamic Drilling’s Apollo Dart WITSML aggregator to replace manual data entry from emailed deviation surveys. Petris’ DataVera has been deployed for quality assurance, currently profiling some 30 data types. Cleansed data is moved from app data stores into a new PPDM database. Other relevant standards include AOGC, API, and AAPG. ‘How do you know when quality is good enough?’ Deck suggests this is a balance ‘between perfection and a major screw up such as drilling a well in the wrong place!’

Hector Romero observed that Schlumberger’s Petrel has evolved from Shell’s tool of choice for static modeling to a complete workbench for petroleum geology. But Petrel data management is problematical with an ‘explosion’ of multiple copies with no audit trail. ‘Forgiving’ ascii import is prone to bad/incomplete data. Geodetic integrity is also a challenge and users risk ‘loss of context’ as data migrates. In 2007 senior data managers got together to introduce standards around Petrel in Shell. This led to the implementation of the Petrel reference project and better geodetic control. Shell’s mainstream interpretation environment is Landmark’s OpenWorks/SeisWorks. This links to geodata in ArcSDE via OpenSpirit and the Petrel reference project. Shell’s ‘Epicure’ middleware is also used to write back to OpenWorks along with context/attributes ‘as far as possible.’ Jeremy Eade joined Romero to describe how Landmark has built plug-ins for both Petrel and Landmark’s PowerHub/PowerExplorer for direct connectivity between the two environments. Landmark’s R5000 data environment is the Shell standard and PowerExplorer is the data managers’ tool of choice.

Petrosys’ Volker Hirsinger provided an overview of the state of the art of seismic master data management. Seismics has evolved from a 2D ‘frontier’ tool to the geologists’ equivalent of an MRI scan! But alongside the high-end, companies still need to handle legacy data. Keeping seismic data in perpetuity means managing media and metadata—and is not always allocated the required budget. Geoscience Australia was forced to abandon its ambitious transcription effort through lack of funds. Ancillary documents, navigation data, velocities, processing workflows and multiple coordinates make for diverse data sets and a serious knowledge management issue. There is often too much focus on ‘managing’ large volumes of field data—while other key data sets are neglected. Increasingly workstation vendors ‘corner’ segments of the marketplace—for instance, much subsalt data in the Gulf of Mexico is ‘locked’ to Paradigm. International oils now run multiple workstations to cater for such local differences—increasing the seismic data manager’s workload.

Peggy Groppell described R5000 deployment as ‘Shell’s largest and most complex technical project undertaken,’ particularly in the light of a directive to extend the OpenWorks/Linux environment to Windows. All Shell’s client software is now 64 bit Windows with Linux servers. Shell’s 123DI flagship subsurface interpretation tool was originally developed at Bellaire in 1985. This has evolved through ‘nDI’ and now ‘GeoSigns,’ Shell’s R5000-based ‘next generation’ system. GeoSigns comprises tens of millions of lines of C++ and Fortran—and would be too much work to port completely to Windows. Shell opted to leverage Nokia Qt to provide cross platform functionality between Linux and 64 bit Vista. Other porting problems included broken symbolic links in Windows and password issues in the Windows Wallet. Whatever the root cause of a port problem, Groppell observed, ‘the developers get the blame!’

This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech.


OSIsoft User Conference 2010

Real time—the ‘currency of the new decade’ for Pertamina, Alliance Pipeline and Marathon.

Toto Pranatyasto described Pertamina’s downstream operations spanning 6 refineries totaling 1 mmbopd capacity, 145 depots, 134 vessels and over 3,400 fuel stations. This vast infrastructure had grown over the years with no single point of accountability for downstream margins or supply coordination. Organizational silos and a lack of integration meant that the company lacked a unified view of operations. A multi-vendor environment (some refineries are on Yokogawa, others on Honeywell) further complicated the picture.

Pertamina formed an integrated supply chain unit in 2008, reporting to the CEO, to optimize downstream and trading operations. The result is a new downstream dashboard, a flashy single wall display located in a single operations room covering refining, supply, shipping and marketing. OSIsoft’s PI System was key to bringing real time data together across the multi-vendor environment and to providing connectivity to Pertamina’s ERP system built around SAP’s Oil Industry Solution. PI modules are used extensively to create a single view of the data. Specific SAP data is also integrated with PI in order to facilitate a real time view and historical data analysis. The system has been in operation for a year and already provides historical look-back analysis, a reduction in out-of-stock events and a faster response to supply chain disruptions.

The subtitle to Steven Kociuba’s presentation on Alliance Pipeline’s PI System deployment was ‘Excel is no longer the data historian!’ Alliance’s 3,000 km gas pipeline from Canada supplies some 2.5% of US consumption. Prior to Alliance’s PI project, data was calculated and stored in local Excel spreadsheets. This manual process led to issues with data integrity and problems of data access—key data was marooned on mission critical systems that few had access to and was proving increasingly hard to maintain. A decision was made to implement a real time data historian. Alliance’s PI System collects data from some 50,000 tags into a high availability redundant pair. An extensive PI software portfolio includes ProcessBook, DataLink and Data Access. These connect to data sources including Experion PKS, FlowCal and other databases. A comprehensive suite of workflows for operations and forecasting has been developed along with data backfilling of nine years of historical data. This leveraged PI’s Universal File and Stream Loader (UFL) and Perl scripts. The result is that there are no more performance impacts to mission critical systems and data visibility is improved from the operations dashboard. Alliance is now working on phase 2 of the project to add PI Analysis Framework (AF), Advanced Computing Engine and WebParts. The plan is to expand the user base and to automate reporting. AF is to add a tag data dictionary to simplify access for new users and support navigation of equipment hierarchies to locate relevant data. AF implementation time for the 40,000 tag system was 2 to 3 months. Business users approved the AF data structures. The project has resulted in cost savings of $70,000 over the lifetime of each facility. Had the system been applied to 60 ultrasonic metering sites at inception, a $4 million saving would have been achieved.
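
Alliance used PI’s UFL interface and Perl for the backfill; purely as an illustration of what that step involves, the Python sketch below turns a flat file of historical readings into (tag, timestamp, value) events for a caller-supplied historian write function. The CSV layout is assumed.

```python
import csv, io
from datetime import datetime

# Illustrative only: the backfill step boils down to turning flat files of
# historical readings into (tag, timestamp, value) events for the historian.
# Alliance used PI's UFL interface and Perl; this Python sketch assumes a
# simple CSV layout and a caller-supplied write_event() sink.
def backfill(lines, write_event):
    for tag, ts, value in csv.reader(lines):
        write_event(tag, datetime.fromisoformat(ts), float(value))

sample = io.StringIO("FLOW_1001,2001-06-01T00:00:00,512.3\n"
                     "FLOW_1001,2001-06-01T01:00:00,508.9\n")
backfill(sample, lambda tag, ts, v: print(tag, ts, v))
```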

Ken Startz, in a presentation co-authored with SAIC, described how Marathon uses PI to model and monitor its worldwide oil and gas assets. Marathon has production in nine countries and product sales in 18 US states. These are coordinated through a total of 12 PI systems spanning upstream, refining and distribution. Marathon has been using PI since Pat Kennedy sold the first system in 1988.

PI helps solve various business challenges such as time-series data collection from upstream assets, data blending from six different control systems, SharePoint integration and Marathon’s ‘Viewpoint’ digital oilfield initiative (Oil IT Journal March 2009). PI’s advanced computation capability is used to highlight problem areas and prioritize workflows. The system offers knowledge management and training and integrates with modeling applications such as Kappa Diamant, Kappa Saphir, HYSYS and ECLIPSE. Startz worked through three compelling use cases of PI covering modeling of its Equatorial Guinea LNG plant, operations on the Gulf of Mexico Droshky platform—tied in to Shell’s Bullwinkle PI System—and plunger lift diagnostics on the East Texas Mimms Creek field. The last shows how PI-based decision support is helping operators who were previously ‘overwhelmed’ with multiple tasks. PI-ACE now analyzes well state and identifies situations such as ‘flow control valve leaking,’ ‘insufficient plunger lift time...’ and emails suggested intervention plans to operators in a daily spreadsheet. More from www.oilit.com/links/1006_7.
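
The well-state logic is easy to picture, even if OSIsoft’s ACE code is not public. The toy Python rules below, with invented field names and thresholds, classify plunger-lift wells and assemble the kind of daily digest Startz describes.

```python
# Illustrative only, not OSIsoft ACE code: toy rules of the kind Startz
# describes, classifying plunger-lift well state from daily cycle data and
# building the operator's digest. Field names and thresholds are invented.
def diagnose(well):
    if well["casing_tubing_dp"] > 150 and well["plunger_arrivals"] == 0:
        return "possible flow control valve leaking"
    if well["avg_cycle_minutes"] < 20:
        return "insufficient plunger lift time"
    return None

wells = [
    {"name": "MIMMS 12", "casing_tubing_dp": 180, "plunger_arrivals": 0, "avg_cycle_minutes": 45},
    {"name": "MIMMS 07", "casing_tubing_dp": 60, "plunger_arrivals": 8, "avg_cycle_minutes": 12},
]
digest = [(w["name"], d) for w in wells if (d := diagnose(w))]
print(digest)   # would be emailed to operators as a daily spreadsheet
```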


Folks, facts, orgs ...

Simmons, CGGVeritas, Aker, Aramco, CiSoft, Cortex, Devon, Energistics, FMC, Fusion, PII, GE, Global, GSE, Halliburton, IBM, OFSPortal, Petrosys, PPDM, Ryder Scott, Westheimer, Wipro...

Mat Simmons has left Simmons & Co. to spend more time at his new home, The Ocean Energy Institute, following differences over the Macondo blowout and over the potential of US shale gas.

~

Lammert Hoeve is to head-up AuraPortal’s European HQ in Houten, Holland.

~

CGGVeritas has opened a Technology Center in Rio de Janeiro and has signed a three-year technology agreement with Petrobras for 4D seismic processing, imaging and reservoir geophysics.

~

Simen Lieungh is stepping down as president and CEO of Aker Solutions. CFO Leif Borge is interim president and Chairman Øyvind Eriksen CEO.

~

Khalid Abubshait is executive director of Saudi Aramco’s Affairs unit and Nasir Al-Naimi is executive director of Pipelines, Distribution and Terminals Operations.

~

USC CiSoft students Nelia Jafroodi and Brittany Daniels have been hired by Chevron.

~

Cortex Business Solutions has added John Hubinger of Oracle, Brian Frank, deputy COO of BP, and former chief of staff to the Premier of the State of Alberta Rod Love, to the Cortex Advisory Committee.

~

Devon Energy president, John Richels, has been appointed CEO. Co-founder and chairman Larry Nichols continues as executive chairman.

~

Energistics has named Executive VP of Business Development, Jerry Hubbard, COO. He will be responsible for operations, financial management, human resources, marketing and communications.

~

James Smith, Chairman of Shell UK, has been elected President of the UK Energy Institute.

~

FMC Technologies has appointed Johan F. Pfeiffer, Vice President, Global Surface Wellhead. Pfeiffer has been with FMC since 1993 and was most recently General Manager, Subsea Eastern Region.

~

Fusion Petroleum Technologies Inc. has appointed Richard Koseluk as President and COO with responsibility for the U.S. service operations.

~

PII Pipeline Solutions’ Calgary inspection services facility is now headed by Michel Hoyeck.

~

GE Oil & Gas has opened a $6 million technology center in the Republic of Azerbaijan.

~

Global Industries has appointed Ashit Jain COO and Jim Osborn (formerly with Worley Parsons’ IntecSea unit) as Chief Marketing Officer. President Pete Atkinson is to retire in March 2011.

~

GSE Systems has named former President of MXL Industries, James Eberle, as COO.

~

Abdallah S. Jum’ah has been named to Halliburton’s board. He is former president and CEO of Saudi Aramco, and serves on the JP Morgan Chase advisory council. Greg Culp heads-up the company’s new $15 million oilfield equipment test facility at its Duncan, Oklahoma Manufacturing Center.

~

IBM and the Brazilian government have engaged in a public private partnership to investigate oil and gas exploration, logistics and safety.

~

Michael Mueller, formerly with BP, has joined MicroSeismic as Chief Geophysicist.

~

Intelligent grid software provider Nexant has opened a new office in Bahrain headed up by Graham Hoar, Director-Middle East.

~

OFS Portal has elected Charles Currie (Schlumberger), Nicholas Gee (Weatherford), and Jerry Lummus (Cameron) to its board.

~

Nadene Sayer is the new head of marketing at 1Spatial.

~

CDP Inc. is offering free online safety training to smaller oil and gas companies and vendors.

~

Andrew Weller has joined Petrosys as a Support and Training Geoscientist. Banu Panjateharam and Siau Ch’ing have joined the Kuala Lumpur office.

~

Coreworx, RigData and Sinfic Venture IM have joined the PPDM Association.

~

Senior VP Herman Acuña has joined the Ryder Scott board. Miles Palke has joined as senior PE, Eleazar Benedetto-Padronas as petroleum geoscientist and Hugo Armando Ovalle as PE.

~

SeisWare has welcomed Jim Lingley and Oscar Skaer to its sales team.

~

Senergy has recruited Andrew Sutherland and Stuart Walley to its Middle East team. Both were formerly with Paradigm.

~

Tieto has established an Energy Components centre of excellence for oil and gas in Russia.

~

Westheimer Energy Consultants (WEC) has hired Chris Hughes, Ken Gauld, Gary Lundeen, Mike Dougherty, Tom Ripley, Jess Kozman, Ayana Redwood and Laurence Modeste.

~

James Lawnin and Mark Allen have joined Wipro Technologies’ global energy practice. Both hail from SAIC.


Done Deals

Aveva, Logimatic, ADB, Baker Hughes, TGS, P2ES, Energy Solutions, CGGVeritas, Stingray Geophysical.

Aveva has acquired the Mars line of business from Logimatic, adding plant engineering, information management, materials, project management design and construction planning. Aveva has also bagged ADB Systemer’s ‘WorkMate’ operations integrity management business.

~

Baker Hughes has bought Russian Oilpump Services, a Siberia-based electrical submersible pumping service company.

~

TGS’ Geological Products division has purchased the directional survey business of P2 Energy Solutions’ Tobin business line, including a database of over 38,000 directional surveys.

~

Energy Solutions has merged with Houston-based Entessa, a provider of logistics management software for the oil and gas supply chain.

~

The French Government’s Strategic Investment Fund has acquired a 5% stake in CGGVeritas, purchasing shares on the open market.

~

Stingray Geophysical has raised £3.15 million in ‘growth equity’ from existing shareholders Energy Ventures, Cody Gate Ventures, Chevron Technology Ventures and Statoil Venture and from execs Magne Sveen and Martin Bett.


ISACA Social Media in Business White Paper

Information Systems Audit and Control Association on emerging communications technology.

The Information Systems Audit and Control Association (ISACA) report ‘Social Media: Business Benefits and Security’ (SM-BBS) starts out with the premise that ‘the days of recommendations to keep social media usage out of the enterprise are gone.’ Social media use is now the rule not the exception. Does this mean that your company is doomed if it doesn’t have a Facebook page? Seemingly 65 of the Fortune 100 companies do. ‘Social media’ is defined as any communication channel that embraces user feedback.

SM-BBS homes in on the risks associated with enterprise use of SM. Because use does not require any special hardware or software, SM use may escape the normal risk assessment, exposing the enterprise to improper and/or insecure use. Vulnerabilities such as insecure applications on an employee’s personal social media page may cause unacceptable exposure on a corporate network. Moreover, ‘malicious outsiders could use employee social media pages to launch targeted attacks by gathering information to execute sophisticated social engineering campaigns.’ The report enumerates numerous IT and social risks and offers mitigation strategies. These should leverage structured approaches such as the Risk IT and CobiT methodologies promoted by ISACA.

The report concludes that ‘emerging communication technology offers great opportunities to interact with customers and business partners [but] there are significant risks to those who adopt this technology without a clear strategy that addresses both the benefits and the risks. Likewise there are risks and potential opportunity costs for those who think that ignoring this revolution in communication is the appropriate way to avoid the risks it presents.’ More from www.isaca.org.


Arthur D. Little on catastrophic risks in oil and gas industry

16 page whitepaper addresses exposure to ‘residual’ risk and long term balance sheet health.

A timely new report from consultants Arthur D. Little, ‘Improving management of potentially catastrophic risks in the oil and gas industry1’ notes that, ‘despite carefully planned and implemented risk management, residual risks can present significant damage to a company’s balance sheet.’ Recent events have turned attention to the assessment of Exposure to Risk (EtR), the maximum potential economic loss associated with a risk.

The 16 page report introduces ADL’s roadmap to an optimal EtR evaluation model with application to international and national oil companies and their contractors.

ADL’s thesis is that while companies have factored ‘initial risk’ into their strategies they need to improve management of ‘residual risks,’ i.e. the potentially catastrophic events.

EtR is claimed to be a new approach to evaluating such exposure with recommendations on balancing in-house and outside risk management expertise and tools. The technique also balances a company’s level of risk aversion and likelihood with the current value and profitability of each asset.

Alongside the need to adopt appropriate measures relating to an EtR portfolio, companies need to optimize risk retention and transfer strategies to assure long term stability of their balance sheets.

1 www.oilit.com/links/1006_3 (login required).


Hagenes Data announces TheGlobe for Petrel

New 3D viewer overcomes geodetic limitations of Petrel/Ocean.

Hagenes Data AB, a software house ‘located deep in the Swedish forest’ has rolled out ‘TheGlobe,’ a Petrel plug-in that addresses some of the cartographic limitations associated with Schlumberger’s interpretation flagship.
Currently Petrel projects and data are restricted to a single projection which can be a problem when dealing with data of large areal extent. TheGlobe brings a new 3D viewer to Petrel which overcomes the geodetic limitations in Petrel and Ocean and improves on scalability, levels of detail, interactivity and multi threading. TheGlobe is based on the OpenInventor ‘Examiner Viewer.’ Target users for TheGlobe include managers of large scale projects, geomatics professionals and users annotating and presenting Petrel results. An application programming interface (API) is available to offer Ocean programmers ellipsoidal renderers and to add data objects to TheGlobe environment.

Hagenes Data CTO Odd Hagenes told Oil IT Journal, ‘TheGlobe does not leverage Google Earth or Bing Maps. We have made a new application running on top of the OpenInventor graphics library. This uses the same technology that is embedded in Petrel and gives the same look and feel as other Petrel applications. While Petrel remains limited to a single projection, we can put data from other CRS’s on TheGlobe and view the data with Petrel’s native viewers. We use the Ocean coordinate service (based on ESRI’s projection engine) so CRS definitions come from the Petrel catalog. We have added some new ones through WKT and EPSG codes for vertical and geocentric projection systems.’ More from www.hdab.se/TheGlobe.html.
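
What ‘viewing data from other CRSs’ entails under the hood is a coordinate transformation keyed on the CRS definitions. The snippet below is illustrative only and uses the open-source pyproj library rather than the Ocean coordinate service; the EPSG codes are just examples.

```python
# Illustrative only: reprojecting a point between two CRSs identified by EPSG
# codes, using the open-source pyproj library (not the Ocean coordinate
# service). Codes and coordinates chosen here are just examples.
from pyproj import Transformer

# WGS 84 geographic (EPSG:4326) to ED50 / UTM zone 31N (EPSG:23031)
to_map = Transformer.from_crs("EPSG:4326", "EPSG:23031", always_xy=True)
easting, northing = to_map.transform(2.35, 57.0)   # lon, lat of a North Sea point
print(round(easting), round(northing))
```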


Sales, contracts, partnerships and deployments

Ipres, Exprodat, Alliance Geotechnical, ArkeX, Ark CLS, Cortex, Full Circle, CygNet, GlobaLogix, ADNOC, Fluor, Gazprom, Siemens, IDS, AGR, Ikon, Stingray, OPT and Technip.

Ipres has delivered its reserves management and reporting software ‘Ipresource’ to the Dutch state oil company Energie Beheer Nederland (EBN). Ipresource supports reserves, resource management and reporting and complies with SPE/WPC/AAPG/SPEE PRMS reporting guidelines.

~

Exprodat Consulting has teamed with Malaysia-based Alliance Geotechnical Services to expand the market reach of its Team-GIS E&P decision support tool and to offer ArcGIS consultancy and training services to the Asian oil and gas business. Exprodat also signed with ESRI UK to provide GIS training to the upstream.

~

ArkeX has signed with Ark CLS for the further development of its ‘ArkField’ geophysical potential field software. ArkField lets geophysicists view, model and interpret potential field data in conjunction with 2D/3D seismic data. An ArkField plug-in to dGB’s OpendTect system is planned.

~

Calgary based Cortex Business Solutions has partnered with Full Circle Systems to jointly develop and market an integrated solution spanning Full Circle’s DocVue workflow and document management solution and the Cortex Trading Partner Network. The partnership targets Cortex’ accelerated growth into the US market via Full Circle’s 100 strong US client base.

~

CygNet Software has launched the CygNet Integrator Program to expand its partner ecosystem. Early adopters of the program include GlobaLogix and Techneaux Technology Services. GlobaLogix VP Jim Fererro attributed GlobaLogix’ three-fold growth over the last two years in part to its use of CygNet’s Enterprise Operations Platform.

~

Following successful completion of front-end engineering on Abu Dhabi Gas Development Company’s (ADNOC) Shah gas development project, Fluor Corp. has been named as Program Management Consultant. Fluor booked the $160 million contract value in Q2 2010.

~

Gazprom has signed a memorandum of understanding with Siemens covering cooperation on liquefied natural gas (LNG) technology development.

~

Independent Data Services (IDS) is providing AGR Petroleum Services with drilling reporting services for its Falklands operations. AGR is using IDS DrillNet to link operations on the Ocean Guardian, drilling in the North Falklands Basin, to its Aberdeen, UK HQ.

~

Ikon Science has teamed with Stingray Geophysical to develop a system for 4D/time-lapse imaging of permanent reservoir monitoring deployments. The new technology and services offering will combine Stingray’s Fosar multi-component life-of-field seismic recording system with Ikon’s RokDoc-Chronoseis reservoir monitoring package and QIS modeling to track fluid movement and optimize field development.

~

Houston-based XCXP Operating has licensed the PEOffice suite from Optimization Petroleum Technologies for reservoir and production analysis.

~

Technip has been awarded a four-year term agreement by BG Group for the provision of pre-FEED1, FEED, full EPIC2 and IRM3 services in the UK and Norway. The agreement includes a possible three year extension.

1 Front end engineering design.

2 Engineering, procurement, installation and commissioning.

3 Inspection, repair and maintenance.


Standards Stuff

Last Call for MathML 3.0. EU Guide for standards writers. RDB2RDF first public working draft out.

The W3C Math Working Group has published a ‘last call’ Working Draft for Mathematical Markup Language (MathML) Version 3.0. MathML is an XML application for describing mathematical notation and capturing structure and content. The goal of MathML is to enable mathematics to be served, received, and processed on the World Wide Web, just as HTML has enabled this functionality for text. More from www.oilit.com/links/1006_6.

The European Committee for Electrotechnical Standardization (CENELEC) has issued a Guide for standards writers encouraging them to consider the special requirements of micro, small and medium-sized enterprises (SMEs). The Guide sets out to increase uptake of EU standards amongst this group, which makes up 99% of all EU corporations. The Guide is available in English and French and can be downloaded from www.cen.eu/go/SME.

The W3C’s Working Group on database to RDF mapping—a.k.a. ‘RDB2RDF’ is inviting comment on the first public working draft of its ‘Use Cases and Requirements for Mapping Relational Databases to RDF’ proposal. The document includes use cases from science and industry showing how relational data is exposed in patterns that conform to RDF.
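
The basic mapping pattern is simple enough to sketch. The example below uses the open-source rdflib package and an invented well table: each row becomes a resource, the table maps to a class and each column to a property. It illustrates the general idea behind the draft, not any specific mapping language the group may standardize.

```python
# Illustrative only: the kind of relational-to-RDF mapping the draft covers,
# sketched with the open-source rdflib package. The table, column and
# namespace choices are invented for the example.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/wells/")
row = {"well_id": "W1001", "name": "Snorre A-12", "spud_year": 1992}

g = Graph()
subject = EX[row["well_id"]]               # one resource per primary key
g.add((subject, RDF.type, EX.Well))        # table name becomes the class
g.add((subject, EX.name, Literal(row["name"])))
g.add((subject, EX.spudYear, Literal(row["spud_year"])))
print(g.serialize(format="turtle"))
```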

Curious that after nearly a decade of the semantic web this work is at such an early stage! More from www.oilit.com/links/1006_5.


Hindustani refiners keep secrets in Oracle Vault

Online tendering system backed by secure storage and authentication system.

Hindustan Petroleum Corp. Ltd. (HPCL), a Global Fortune 500 company, has deployed a solution from Oracle to secure its crude oil import tendering operations, protecting its trade secrets and competitive bid information from unauthorized access. The solution leverages Oracle’s Audit Vault and Database Vault solutions to strengthen security and prevent competing bid information from ‘leaking.’

With a 16 million tonnes/year throughput, HPCL’s refineries’ procurement requirements are significant and the timing and pricing of tenders play a critical role in HPCL’s profit. During tendering, all bid information is submitted online via the web and stored in an Oracle database. This has been augmented with Oracle’s ‘Vault’ solution set to safeguard classified information on material quality, quantity, loading port and pricing period as submitted by some 60 international vendors. The solution protects confidential data from unauthorized access by any user—even privileged users such as DBAs. Audit Vault automates the collection and consolidation of audit data from all database servers. The security solution also helps HPCL address data governance requirements and increases the security of existing applications.

Further protection can be obtained by implementing Vault’s multi-factor access control based on time of day, IP address, application name and authentication method, preventing unauthorized ad hoc access and application bypass. More from www.oracle.com.
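
The following Python sketch illustrates the kind of multi-factor check described above. It is a conceptual illustration only, not Oracle Database Vault's actual API; the policy values, subnet and application name are invented.

```python
# Conceptual sketch of multi-factor access control (not Oracle's API):
# grant access only when time of day, client IP and application all match policy.
from datetime import time
import ipaddress

POLICY = {
    "window": (time(8, 0), time(18, 0)),               # business hours only
    "network": ipaddress.ip_network("10.20.0.0/16"),   # trusted subnet (invented)
    "applications": {"TENDER_PORTAL"},                 # approved front-end (invented)
}

def access_allowed(now, client_ip, app_name):
    """True only if every factor is satisfied; one failed factor blocks access."""
    in_window = POLICY["window"][0] <= now <= POLICY["window"][1]
    ip_ok = ipaddress.ip_address(client_ip) in POLICY["network"]
    app_ok = app_name in POLICY["applications"]
    return in_window and ip_ok and app_ok

print(access_allowed(time(9, 30), "10.20.5.7", "TENDER_PORTAL"))  # True
print(access_allowed(time(23, 0), "10.20.5.7", "TENDER_PORTAL"))  # False: after hours
```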


Industrial Defender announces host intrusion prevention system

Application whitelisting security system avoids overhead of anti-virus/scanning solutions.

Foxborough, MA-based Industrial Defender (ID) has announced a Host Intrusion Prevention System (HIPS), an application whitelisting-based security solution that prevents malware attacks and non-trusted software modifications on control system servers and end-point devices. ID HIPS avoids the high overhead of software scanning operations common with traditional anti-virus technologies, as well as the administrative overhead of frequent software and anti-virus signature patch updates.

Walt Sikora, ID VP of security said, ‘HIPS embeds patented whitelisting technology from CoreTrace inside our Defense-in-Depth suite. This is supported with our professional services including host scrubbing, product deployment, whitelist initialization, and periodic system audits and refreshes.’

HIPS works by restricting execution on each automation system computer to a predetermined list of authorized applications. HIPS automatically blocks all unauthorized applications, including unknown malware and rogue applications installed by users, improving on traditional firewall and password protection. More from www.industrialdefender.com.
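
A minimal sketch of the whitelisting principle follows. It is not Industrial Defender's implementation; it simply shows the idea of allowing an executable to launch only if its cryptographic hash appears in a pre-approved list. Paths and hash values are placeholders.

```python
# Conceptual sketch of application whitelisting (not ID's product):
# an executable may launch only if its SHA-256 hash is on the approved list.
import hashlib

WHITELIST = {
    "<sha256 of historian.exe>",   # placeholder hash values
    "<sha256 of hmi_client.exe>",
}

def file_hash(path):
    """Compute the SHA-256 digest of an executable on disk."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def may_launch(path):
    """Block anything not on the whitelist, including unknown malware."""
    return file_hash(path) in WHITELIST
```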


Jim Soos, IBM, on master data management in E&P

SPE Digital Energy group hears how MDM ‘hub’ assures consistent data use, retaining app flexibility.

Speaking at an SPE Digital Energy Study Group meet in Houston this month, Jim Soos of IBM’s global business services unit spoke on master data management (MDM) in the E&P industry. Integrating information across the different E&P domains (subsurface, production, operations and finance) requires a trusted source of shared data. But ‘trusted’ data is likely to be scattered around the enterprise in local data sources and co-mingled with data of varied provenance and scope. Soos defined MDM as ‘a set of disciplines, technologies, and solutions to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders.’

A master data repository holds master, meta and reference data along with historical data. This feeds into a services bus supplying data quality, authoring, event management and relationship management services. The foregoing comprises an MDM framework, which is itself topped off with a data governance layer.

The MDM system becomes a hub for existing applications, assuring consistent data use and providing a flexible mechanism for the addition of new applications and services.
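
For illustration, here is a minimal sketch of the hub idea: applications resolve their own local identifiers against a single master record instead of keeping private copies. The class and field names are invented and do not represent IBM's MDM product.

```python
# Illustrative sketch of an MDM 'hub': one golden record per well,
# cross-referenced to the local identifiers used by source systems.
from dataclasses import dataclass, field

@dataclass
class MasterWell:
    master_id: str
    name: str
    operator: str
    source_ids: dict = field(default_factory=dict)  # e.g. {"finance": "F-778"}

class MdmHub:
    def __init__(self):
        self._wells = {}

    def register(self, well):
        self._wells[well.master_id] = well

    def resolve(self, system, local_id):
        """Let any application find the master record from its own local key."""
        for well in self._wells.values():
            if well.source_ids.get(system) == local_id:
                return well
        return None

hub = MdmHub()
hub.register(MasterWell("M-001", "Discovery-1", "Acme Oil",
                        {"finance": "F-778", "geoscience": "G-42"}))
print(hub.resolve("finance", "F-778").name)  # both apps see the same trusted record
```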

Challenges for MDM deployment include functional silos, many industry standards of varied quality and take-up, and inherently complex industry data types. Moreover, E&P data governance and ownership is ‘immature.’

Soos believes MDM can help improve the business, offering efficiencies and cost savings, and can reduce risk. While a business can expect a significant initial cost for implementing MDM, once the capability is in place, staff are trained and the data is cleaned up, the cash flow curve, plotted as a function of line-of-business adoption and the number of objects mastered, crosses over to show a sustained ROI.

Soos stepped through the process of building an MDM capability, covering information architecture, governance and deployment. There is some interesting granularity here as MDM can be tuned to operations, collaboration and/or analytics. But the reality is that all of the above will be used in a successful MDM roll-out.

Soos noted that in a typical project, MDM software only accounts for around 10% of overall project cost. 40% goes on establishing data governance and architectural planning while the lion’s share (50%) is devoted to data remediation and clean-up.

MDM is a component of IBM’s information reference architecture—an impressive piece of slideware covering the whole enterprise IT enchilada! More from www.ibm.com.


IBM Maximo in Smarter Planet, IIF and MDM

IBM triple prong attack on E&P IT—Maximo, Integrated Information Framework and MDM.

At a recent breakfast meet in Calgary, IBM’s Bill Ely, of the Maximo asset management unit and head of the Maximo oil and gas user group, presented IBM’s solution for ‘intelligent asset management and operational efficiency.’ Maximo is being retro-fitted into IBM’s ‘Smarter Planet’ initiative (TechWatch 1005), which promises an instrumented, interconnected and intelligent world. Oil and gas asset management is a strategic growth area for IBM, and the ‘reference semantic model’ developed in the Chemicals and Petroleum unit is now impacting Maximo and other IBM units such as turnaround and business analytics and optimization. Maximo is seen as an enabler for integrated operations and boasts a blue chip customer base including ADCO, BP, CNOOC, Chevron, KNOC, Repsol and others.

Those who have followed IBM’s ‘other’ E&P data framework, the Chemicals and Petroleum Integrated Information Framework (IIF), will have noted the absence of Maximo from this offering. This is because Norway, where the Chem & Pet IIF was developed for StatoilHydro, is primarily an SAP ‘shop.’

The Chemicals and Petroleum IIF, now in version 1.3, moved on from its joint industry project status last year with its first ‘sale’ to Statoil (the original versions were developed under the auspices of the Norwegian Integrated Operations initiative). Meanwhile in Houston, IBM was touting another solution to upstream data management; see our report on Jim Soos’ presentation on ‘Master Data Management in the E&P Industry’ on page 11 of this issue. More from www.oilit.com/links/1006_8.


Microsoft ambush marketing at EAGE

Microsoft Upstream IT Reference Architecture backed by IHS, ISS, Halliburton, OpenSpirit et al.

The Microsoft Upstream IT Reference Architecture (a.k.a. the Microsoft Oil and Gas Reference Architecture, MOGRA) juggernaut is shuddering into motion with a series of announcements made around the EAGE in Barcelona this month. We say ‘around’ because Microsoft was not officially ‘at’ the EAGE, but was instead stalking the exhibition floor, rather as Oil IT Journal likes to do in fact.

As of this month, the following have sworn allegiance to MOGRA—Accenture, EMC, Energistics, IHS, Infosys, ISS Group, Halliburton, Logica, Merrick Systems, OpenSpirit, PointCross, VRcontext, WellPoint Systems and Wipro.

Conspicuously absent from this initial list is Schlumberger’s Information Solutions unit, which has been backing Microsoft for a few years now and may be a bit peeved to see Halliburton get a front row seat so easily. As to what it all means, as far as we can tell, MOGRA is more bluster and FUD at the moment. But we have Ali Ferling on record as saying, of the promised new oil and gas-specific protocols under development, ‘We will publish what we come up with. You will be hearing more from us!’ We can’t wait! More from www.oilit.com/links/1006_4.


OpenSpirit wins support from Siemens, Kadme and Petris

Siemens Energy, Kadme, ffA and Petris add tools to OpenSpirit interoperability bus.

If Microsoft’s MOGRA ‘reference architecture’ above so far lacks substance, the same cannot be said for OpenSpirit (OS), which organized a well-attended Technical Symposium following the EAGE in Barcelona (report in next month’s Oil IT Journal) and announced a slew of new partners for its upstream interoperability ‘framework.’

A significant newcomer to upstream IT is Siemens Energy, which is to use OS to extend its ‘XHQ’ operations intelligence platform to geoscience applications and data stores. Siemens also ‘supports’ MOGRA. Kadme is to extend its Whereoil Enterprise to include OpenSpirit-enabled data sources. Likewise, Petris Technology has joined the OS business partner program to extend its PetrisWINDS Enterprise data management, interoperability and workflow solution. And Australian ISS Group is to add OS-based interoperability to its ‘BabelFish’ solution for real time data streams. Finally, Foster Findlay Associates (ffA) is adding OS connectivity to its SVI Pro and SEA 3D Pro seismic image processing applications. More from www.openspirit.com.


Shell’s seismologists back PGS ‘gamechanger’

OptoSeis marine system adapted for ‘ultra-high’ channel count onshore use.

Shell is teaming with Petroleum Geo-Services (PGS) on a ‘game changing’ fiber optic-based seismic acquisition system. Dirk Smit, VP, Exploration Technology at Shell explained, ‘Our technologists spotted the potential for applying PGS’ OptoSeis technology to onshore seismic. Given that most of this technology already exists, we expect to deploy the first system soon.’

Land-based OptoSeis promises an ‘ultra-high’ channel count, ‘far beyond what is currently available,’ high quality sensors and improved resolution. The system also targets permanent deployment for reservoir monitoring.

We quizzed Shell spokesperson Jaryl Strong as to how this development related to Shell’s previous announcement of a collaboration with HP on seismic sensors (Oil IT Journal March 2010). Strong replied, ‘These technologies are complementary and competitive in terms of function and effectiveness, but not necessarily contradictory. Shell is committed to having multiple options that can offer the best tailored strategy for each exploration site. Shell may use either technology or a combination of the two to obtain the greatest potential for success based on the needs of the specific application. Both technologies also represent complementary solutions for both testing and monitoring sites. Both are extremely effective, so it makes sense to develop complementary solutions with relative strengths based on individual project needs.’

