September 2012


Standards leadership?

The great and good of the oil and gas standards world gathered in Houston recently. But what are they actually going to do to square the IT circle of competing, overlapping data protocols?

Chevron’s Jim Crompton likened the inaugural meeting of the Standards leadership council (SLC) to the handshake between Queen Elizabeth II and former IRA chief Martin McGuinness! For Crompton, the meeting of the nine constituent bodies potentially heralds twenty years of standards peace.

BP’s Rusty Foreman thinks that standards are increasingly important in today’s digital business. But while standards development is a high priority, the landscape is hard to understand and there is a risk of duplication.

Halliburton’s Dave Savelle offered a nuanced view of past efforts. Standards succeed when they help automate or augment an existing business process, when their business value is clear and when they are perceived as open and neutral. For Savelle, SCADA/OPC and WITS are successful standards. Both evolved from proprietary technologies and are easy to deploy. The PIDX eInvoice was also successful and timely, with the advent of ‘game-changing’ B2B technology and ERP systems.

Savelle cited Epicentre, Wime and the Global unique well identifier as having failed because no clear business problem was being solved. Vendors saw these as an attempt to commoditize what were really differentiating technologies. Their scope crept and technology came to dominate business requirements. They were also wrongly perceived by sponsors as ‘too big to fail!’ Savelle asked, ‘Is Witsml a successful standard?’ Witsml ticks most of the boxes above but so far has failed to meet the acid test of a good standard—adoption.

Comment—With due respect to this illustrious assembly, a little humility may be needed as it confronts the herculean task of tying together so many radically different standards. Realism suggests that the likelihood of the upstream impacting, say, the OGC or OPC specs—which both serve much larger communities—is minimal. Another issue is that standards rely on such diverse technologies—XML, database, ‘semantics,’ not to mention Word documents and CSV/ascii files. Industry would benefit from clarification here. It might also be a good idea to revisit what problem the SLC is trying to solve. Is the quickest route from well to ERP really a ‘standard of standards?’ One outcome from a future SLC meeting might be a clear statement that such and such a standard or whole technology was to be deprecated and eventually abandoned.


BMC for Pemex

BMC’s ‘ERP for IT’ solution sees across-the-board take-up in Pemex’ drive to rationalize applications and data centers.

BMC Software’s Business service management (BSM) platform is to see widespread adoption by Pemex IT professionals. Pemex CIO Abraham Galan is using the tool to transform a disparate IT estate of over a thousand applications and multiple data centers into a connected, standardized infrastructure. Previously each Pemex division had its own proprietary IT. Galan saw this as an opportunity to implement a new strategy and platform, rationalizing applications and data centers.

Following trials with several other solutions, Pemex has now switched its IT management to the BSM. The result is improved levels of customer service and better visibility into overall work flows and processes. Pemex has implemented pretty much the whole of the BMC solution which Galan describes as ‘an ERP for IT.’ This is helping Pemex decide where to cut costs and where to invest. Galan is now looking outside of the IT department and is planning to use the platform to configure the rest of Pemex’ business processes. Pemex’ ‘Basica’ gas and chemicals unit deployed BMC software back in 2010 (Oil ITJ January 2010). Shell also reported use of BMC in its Malaysian ‘MegaCenter’ (Oil ITJ March 2002). More from BMC.


Modeling shale gas production—bonanza or bust?

Editor Neil McNaughton, with help from Kappa Engineering CEO Olivier Houzé, takes a skeptical look at shale gas. Behind the ‘game changer’ for world energy headlines is a complex picture of price and reserves. Anyone for a million year buildup test?

It usually takes a while for ideas to enter the public consciousness—but once they are there, they become immovable objects. Where I live in France, shale gas exploration is invariably associated with damage to the environment. Its exploitation has been banned under moratoriums from both right and left wing governments. But that is not the immovable idea I want to explore in this editorial.

The other universally accepted truth relating to shale gas is that it is a game changer for world energy. The US is going to be self-sufficient in gas (and maybe oil) at some not too distant date. Cheniere’s LNG import facilities planned for the southern US are now being turned into LNG export terminals. At least one major conventional gas project has been canceled (Shtokman). And as natural gas is perceived as environmentally green, there are moves to replace coal- and diesel-fired electricity generating plants.

The problem with this rosy scenario is that there are two big unanswered questions. One is the amount of gas that these new-type wells will deliver in the mid to long term. The other is the price at which such production can be said to be economic.

Let’s deal with the second question first. At the 2010 AAPG, Chesapeake’s Aubrey McClendon stated that money could be made at $5/mcf but McClendon may be seen as something of an optimist. In 2011, BG Group’s Malcolm Brown stated that natural gas was ‘developable’ at $5.

Given the propensity for oil and gas folks to paint a rosy picture and present something nearer to a best case than an average, I suggest that we retain $10 as a price that would bring a significant portion of the promised reserves into play.

Natural gas is currently selling for around $2 in what is clearly an unsustainable situation. A glut of production has hit the market, and as ExxonMobil CEO Rex Tillerson said recently, ‘We are all losing our shirts.’ For those of you who are more comfortable with crude oil prices, think of how things were when crude was at $20 compared to today’s $100.
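
For readers who prefer arithmetic to analogy, a quick back-of-envelope conversion (illustrative round numbers, assuming some 5.8 mcf of gas per barrel of oil equivalent) makes the same point in energy-parity terms.

```python
# Back-of-envelope energy-parity arithmetic (illustrative round numbers only).
MCF_PER_BOE = 5.8       # ~5.8 mcf of gas carries the energy of one barrel of oil

gas_price_mcf = 2.0     # approximate current US natural gas price, $/mcf
oil_price_bbl = 100.0   # approximate current crude price, $/bbl

gas_price_boe = gas_price_mcf * MCF_PER_BOE
print(f"gas on an energy-equivalent basis: ${gas_price_boe:.2f}/boe")
print(f"i.e. around {100 * gas_price_boe / oil_price_bbl:.0f}% of oil parity")
```

At $2/mcf, gas is selling for around 12% of its energy-equivalent oil price, which is the sense in which everybody is ‘losing their shirts.’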

Given that the wheels should have come off the shale gas game in North America, what is sustaining the activity? One factor is that in Europe, the natural gas price is around $11 and in Japan, nearly $17. So those folks at Cheniere are probably on the right track. Another factor is momentum, as industry drills its obligation wells.

To return to the first question as to ‘how much’ shale gas will be produced I bring in my first and only witness, Olivier Houzé, who is CEO of Kappa Engineering, a vendor of well test software that is used, inter alia, to evaluate shale gas wells. Houzé, speaking at a recent gathering of the French oil and gas technologists’ association (AFTP), provided an insight into the state of the art of evaluating shale gas wells. We summarize his conclusions as follows.

The shale gas boom presents an interesting problem to the reservoir engineer. There is a lot of science that has the potential for useful application. But the ‘factory drilling’ paradigm means that little money is devoted to research and there is no prior experience of how these novel reservoirs will perform. A ten year period from first production is required before we have the good data we need for a proper evaluation.

What we do know is that there are multiple modes of production and different scales of gas diffusion including desorption, flow through micro-pores and the fracture network itself. The concept of a ‘virtual’ well test is instructive. Calculations show that it might take a million years to reach pseudo-steady state in the reservoir itself! Even attaining pseudo-radial flow may take 100,000 days. In the early days of unconventional production, various decline curve models were tried. Although wells were far from the terminal decline phase, such calculations were accepted by the SEC and booked as ‘reserves.’
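
To put rough numbers on these timescales, here is a minimal back-of-envelope sketch using the standard field-units radius-of-investigation approximation, t (hours) ≈ 948 φμc_t r²/k. The nanodarcy-class inputs are our own illustrative assumptions, not Houzé’s figures.

```python
# Time for a pressure transient to reach distance r in a tight reservoir,
# from the standard field-units radius-of-investigation approximation:
#   t (hours) ~ 948 * phi * mu * ct * r**2 / k
# Inputs are illustrative assumptions, not figures from Houze's talk.
phi = 0.05      # porosity, fraction
mu = 0.02       # gas viscosity, cp
ct = 1.0e-4     # total compressibility, 1/psi
k = 1.0e-4      # permeability, md (i.e. 100 nanodarcies)

for r_ft in (100, 1_000, 10_000):   # distance to the 'boundary,' feet
    t_hours = 948 * phi * mu * ct * r_ft**2 / k
    print(f"r = {r_ft:>6} ft -> about {t_hours / 8760:,.0f} years")
```

At these permeabilities even modest inter-well distances take decades to centuries to ‘see,’ and pseudo-steady state across a full drainage area takes correspondingly longer.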

What should be used for a twenty year plus evaluation? Working from a straight line flow regime, the ‘ad-hoc’ Arps decline curve has begotten thousands of publications and various fiddle factors. But the problem is that 2-3 years of data is too short a time frame to evaluate such methods.
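
The non-uniqueness is easy to demonstrate. In the sketch below (synthetic data and parameters of our own invention, not from Houzé’s talk), Arps curves with quite different b exponents all match a 30 month record closely, yet diverge materially over a twenty year horizon.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, Di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*Di*t)**(1/b)."""
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

t = np.arange(1.0, 241.0)            # 20 years of monthly rates
early = t <= 30                      # ...of which only 2.5 years are 'observed'
q_obs = arps(t, 10_000, 0.15, 1.4)   # synthetic 'truth', strongly hyperbolic

for b in (0.5, 1.0, 1.4):            # refit qi and Di with b held fixed
    (qi, Di), _ = curve_fit(lambda tt, qi, Di, b=b: arps(tt, qi, Di, b),
                            t[early], q_obs[early], p0=(10_000, 0.1))
    fit = arps(t, qi, Di, b)
    rms = np.sqrt(np.mean((fit[early] - q_obs[early]) ** 2))
    print(f"b = {b:.1f}: early-time RMS misfit {rms:8.1f}, "
          f"20 year cumulative = {fit.sum() / q_obs.sum():.2f}x 'truth'")
```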

Analytical models suggest a less optimistic picture than decline curves, but they may be ‘right,’ or at least less wrong. Such models can be very complex including, for instance, fractal diffusion. Tools are being built today in anticipation of possible ‘bad news’ when the decline curve approach no longer works. These include numerical models with unstructured gridding and dynamic refinement. Some models may need millimeter cells!

Despite these sophisticated models we still lack data. Water flow-back modeling studies and stress analysis may be tried in the face of an unwelcome productivity drop. But here, empirical pressure-to-productivity relationships may be used to ‘explain’ just about anything.

All the models can match sub 10k hour data. One early study was recently updated with an extra ten months of data. Rerunning the model showed that the old straight line proxy was extremely optimistic and missed the fact that fractures ‘see’ each other. The analytical model turned out to be pessimistic, failing to account for formation compressibility. While an ‘improved’ numerical model was OK, this does not say much, if anything, about the next 5-10 years.

For both shale gas and shale oil the problem is that all kinds of curves can match the first three years of data. But what happens over the longer term? Several industry consortiums have been initiated to try to understand the problem.

So there you have it. A game changer for world energy that is predicated on a significant hike in US natural gas prices and whose capacity to deliver the promised reserves is the subject of under-funded research as industry ‘factory drills’ ahead!


Letter to the editor from Hasso Plattner

Hasso Plattner, co-founder of SAP and professor at Potsdam University, takes issue with our review (Oil ITJ June 2011) of his book ‘In Memory Data Management.’ No FUD was intended!

In the review of my book (Oil ITJ June 2011), you question the linkage between the in memory database (IMDB) and various seemingly unrelated topics—let me explain.

The Microsoft Surface device was mentioned because of its capability to share information across a group of users. For this to happen effectively, queries have to be answered within the human attention time span—i.e. with a maximum acceptable delay of about 8 seconds. Such a response time can only be achieved with a database of this type.

We also referred to the ‘cloud’ since the best way to run an on-demand application in the cloud is to use an in-memory database. Here, columnar data storage gives us a 5-20x compression factor for customer data. Multi-tenant use makes for very large tables and here again, speed is everything. The columnar database’s schema can be changed on the fly, adding new fields and simplifying upgrades.
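
A textbook illustration (our own sketch, not SAP’s actual algorithm) of why a sorted column compresses so dramatically combines dictionary and run-length encoding:

```python
from itertools import groupby

# Textbook columnar compression sketch (illustration only, not SAP's algorithm):
# dictionary-encode a sorted customer-country column, then run-length encode it.
column = ["DE"] * 500_000 + ["FR"] * 300_000 + ["US"] * 200_000

dictionary = sorted(set(column))                  # each distinct value stored once
codes = {value: i for i, value in enumerate(dictionary)}

runs = [(codes[value], sum(1 for _ in group))     # (code, run length) pairs
        for value, group in groupby(column)]
print(dictionary, runs)                           # 3 pairs stand in for 1M rows
```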

On the hardware side, we highlighted the Intel Nehalem architecture as the current leader in terms of CPUs per board, cores per CPU and RAM. A database server with 4,000 cores on 100 nodes with 50 terabytes of RAM is currently operating in San Jose at a sensationally low price point!

While everybody talks about virtualization, with the IMDB, there is no need for it. Performance is significantly higher running natively on the operating system.

Regarding big data, our experience with SAP Hana running real customer scenarios is sensational. We have replaced custom applications built with Oracle tools running on an Oracle database with the same application on Hana and seen huge run time improvements. 1,000x is normal, 10,000x frequent, and on three occasions we achieved a 250,000x speed-up. Now many applications can run on big data as a transaction—returning sub 8 second query results.

The book is intended to help students understand a new technology which will change enterprise computing profoundly. Any marketing flavor was unintended and is a mistake that the authors let slip in. There is definitely no FUD whatsoever. In fact the reality is even better than what we anticipated two years ago.

Best regards,
Hasso Plattner,
Hasso Plattner Institute,
Potsdam.


Interview with Jim Pritchett

Following Halliburton’s acquisition of Petris Technology, CEO Jim Pritchett tells of his plans for the future.

Why did Petris sell?

We were at a point in our development where we needed to expand to support growth. About a year ago, the board began to look at alternatives and started talking to Landmark who shared the same vision and who had a complementary product line.

What was most attractive to Landmark—your software? The customer base?

The installed base—in particular Recall—now a component of our Petris Winds Enterprise borehole data management solution. This has seen a lot of success—especially in larger tenders. Also we have been historical partners as Recall is a part of the Petrobank solution and Landmark is a reseller for our Zeh printing solution.

One area where we definitely add to the Landmark solution line-up is with Datavera, our data quality solution—this is particularly important with the huge growth of data volumes and master data complexity.

Why sell now?

As I said, exploding data volumes are mandating more comprehensive solutions and this at a time when Finder and other industry standard data management solutions are reaching their end-of-life. Petris and Landmark getting together will address these issues. Combining our services groups will offer significantly better capacity over a wider range of geographies.

The latest press release on the Landmark website dates back to January 2011. Did Halliburton take its eye off the ball in the software space?

I can’t comment on the Landmark website. But what I can say is that there is a lot going on as we work to combine our road maps. We are meeting with customers and so far have had a very positive reaction to the merger. The next few years will be a very exciting time for data management.


ISN moots FlexPod oil and gas data solution

Upstream potential seen in joint Cisco, NetApp, VMware hardware bundle.

ISN CTO Martin Kucharcik thinks that a new hardware bundle from NetApp, Cisco and VMware is likely to have a significant role as a server for upstream data and applications. The ‘FlexPod’ solution is a pre-integrated compute platform consisting of Cisco’s Unified Computing System blade servers, VMware’s virtualization system, storage from NetApp and networking, again from Cisco.

For Kucharcik, FlexPod appears to offer a lot to the IT department in terms of scalability and ease of management. FlexPod also offers an impressively specced Citrix server with 0.5 TB of memory, capable of running hundreds of virtual machines—ISN sees Citrix as key to its oil and gas clients’ IT infrastructure. FlexPod claims some 1100 installations and has a ‘starting price’ in the UK of £15,000. More from Kucharcik’s blog.


Earth sciences ontology in the spotlight

Ontolog Forum ‘mini sessions’ propose improved meta data, formats for geoscience and GIS.

The Ontolog Forum, a virtual community of practice dedicated to advancing the field of ontology and semantic technology in mainstream applications and standards, has just concluded two online ‘mini sessions’ devoted to establishing an earth sciences ontology.

Krzysztof Janowicz (UC Santa Barbara) sketched out the geo-ontology value proposition. Better meta data means ease of data discovery and reuse, improved reproducibility of scientific results and less misinterpretation.

Mike Dean, from Raytheon unit BBN Technologies, outlined prior ontological art likely to impact earth sciences. This includes GeoSparql for geospatial data and the RDF Data Cube for multi dimensional ‘linked data’ sets.

The USGS’ Dalia Varanka contrasted the ambiguity of current relational models with the ‘sharper scientific focus’ that a ‘semantically specified model’ promises. More earth sciences ontology mini sessions are scheduled for November and December.


‘Subsurfr’ powers North Dakota’s well data showroom

Wellstorm leverages open source tools in 2D/3D well and seismic decision support system.

Austin, TX-based well data specialist Wellstorm has just rolled out ‘Subsurfr,’ a new well data browser that offers map, table and 3D visualization of logs and wellbore trajectories. Subsurfr uses Wellstorm’s Witsml service platform (WSP—Oil ITJ April 2010) for its 3D view.

The Subsurfr demonstrator has been ‘primed’ with logs, wellbore trajectories, and formation tops from North Dakota’s public domain dataset. The data has been converted to Witsml by Wellstorm and is hosted on Wellstorm’s WSP server. Wellstorm is also to offer users their own private Witsml storage area on Subsurfr, which can be connected to live feeds from service companies for upload of LAS data and spreadsheets with trajectory data. Subsurfr also displays seismic data—although there is currently none available in the North Dakota public domain data set.

Subsurfr’s mapping uses AOL’s MapQuest—now an open source mapping engine. 3D visualization likewise leverages open source technology—notably WebGL. While many browsers support WebGL natively, Internet Explorer requires a plug-in for this functionality.


Future trends in GIS

UN geospatial experts’ crystal ball shows geospatial evolving to new data types, formats and usage.

The United Nations committee of experts on geospatial information management has just deliberated on ‘Strategic consideration of future trends in geospatial information management.’ While recognizing that ‘the future is difficult to predict,’ the UN experts have identified several trends that will be examined in depth in a future publication—slated for release next year. Trends of note include a growing number of sensors in everyday devices collecting dynamic geospatial information and real time data creation by citizens. Unmanned aerial vehicle use will increase as will 3D and 4D data. Location based services will increase citizens’ familiarity with spatial information and use of cloud-based services will rise.

Future geospatial data will be ‘linked’ (notably to social media) and, over the next five years, linked data technologies will replace current exchange standards like GML. Technology ‘will move faster than legal and governance structures.’ Free and open source software will grow as viable alternatives for mapping, analysis and geoprocessing. Machine to machine interaction will enable ‘fully-automated’ decision systems. ‘Big data’ technologies will enable use of raw data feeds and this will lead to the establishment of a geospatial infrastructure—which society will rely on as it does today on electrical grids and highway networks. Spatial literacy will not be about learning GIS in schools but will be more centred on increasing spatial awareness. All of which will make for a ‘clear dividing line’ between winning and losing nations in the geospatial sweepstakes. More wild speculation and editorializing from the UN here.


White paper on seismic processing storage solutions

Robert Lai explains NetApp’s big data, high bandwidth, single data container solution.

NetApp has just released a seismic processing solution guide, a 56 page white paper authored by Robert Lai. The NetApp seismic processing solution (SPS) offers a single, scalable file system exposing the entire data set with petabyte scalability. The SPS comprises the E5400 storage system and StorNext file system (SNFS). SNFS is described as a ‘single data container’ with SAN, IP, and NAS connectivity for Windows, Linux and Apple Mac clients. The SNFS metadata controller communicates with client workstations over a dedicated Ethernet network—a low latency ‘private’ communications channel that arbitrates overall access to the shared file system.

The metadata manager (which comes in both Linux and Microsoft Windows flavors) also provides configuration and management support. The system provides ‘big data’ bandwidth of ‘up to’ 4.4GB/sec from a single 4U rack unit and 1.8 petabytes of storage in a 40U rack. More on NetApp storage and oil and gas data management here. Read Lai’s white paper here.


Software, hardware short takes

Petris, Headwave, Industrial Defender, Kepware, NPD, Petrosys, Pegasus Vertex, SherWare, Meridium, On Track Technology, GE, Chart Industries, University of Missouri.

Petris (now part of Landmark) has released DataVera 10.0 with a new web dashboard providing remote access to HealthCheck and MasterSet data. Charts, tables, and maps can be embedded in a corporate portal or wiki for better visibility of data governance initiatives. A file loader wizard configures overnight data import and a web services API provides workflow integration.

Headwave 2012.1 adds a gather tracker, a new AVO crossplot and promises a ‘holistic approach’ to seismic data QC, conditioning, attribute extraction and analysis.

Industrial Defender’s Survive service extends backup and recovery solutions to the control system. The Survive service, provided from ID’s SSAE/16 type II certified data center facilities, assures automation system availability across clients, servers and industrial endpoints like IEDs and RTUs.

Kepware Technologies’ new KEPServer-EX 5.9 industrial automation communications platform targets the oil and gas vertical with electronic flow metering (EFM) and wellsite information transfer Standard (WITS) level 0 drivers. EFM drivers tap into stored data to monitor and distribute real-time data and historical information for export to flow analysis and production accounting. The WITS drivers provide access to real-time drilling, MWD and mud logging data across multiple locations.

The Norwegian Petroleum Directorate (NPD) has launched an app providing access to Norway’s production and license information from the NPD’s fact pages. Users can access information on fields, production licenses, companies, production and active exploration wells, as well as news from the ministry and NPD. Applaud helped develop the app.

A new plug-in for Esri’s ArcMap 10 launches Petrosys’ surface modeling software and allows grids, contours and faults to be displayed as native Esri objects. The plug-in handles 2D/3D interpretation data in Openworks, Seisworks, Kingdom, Petrel and others along with shapefiles and file geodatabases. The plug-in also allows access to E&P databases such as PPDM and Finder.

Pegasus Vertex’ new Cementing Suite comprises CemPro, a mud displacement modeler, CentraDesign, for placement of casing centralizers, CTemp, for computing temperature distribution in the wellbore and CemView, an engineering toolbox that calculates volumes, material and cost of cementing operations.

SherWare has announced ‘Well Profits,’ an application that allows investors and royalty owners to track revenue and expenses by operator and property and integrates with QuickBooks. A hosted version is planned later this year.

The 3.5 release of Meridium asset performance management (APM) includes new solutions to centrally manage asset health and automate corrective actions through monitoring policies. Also new are an API 581-based risk calculator for relief devices and support for the API 510, 570 and 653 standards. APM has been certified by Exida for API 580 compliance. APM received a strong endorsement from Jon Graham, VP EH&S for Apache.

Unity Management unit On Track Technology has completed a ‘green’ prototype hydraulic oil pump—the NG Pumping System. The ‘energy friendly’ pump targets new and stripper wells.

Researchers from GE, Chart Industries and the University of Missouri, working under an ARPA-E grant, are to develop an at-home CNG refueling station to retail at $500 and re-fuel in under an hour.


New Zealand NDR leverages open source mapping tools

GNS, developers of Petroleum Basin Explorer, encountered trouble near the international date line. The solution was the open source ‘Geoserver’ reference implementation of the OGC’s standards.

GNS Science has launched New Zealand’s Petroleum Basin Explorer (PBE), a free-to-access portal of petroleum exploration data and information. PBE is built on GNS’ extensive GIS databases and now serves ‘live’ maps via Geoserver, an open source ‘reference implementation’ of the Open Geospatial Consortium’s GIS standards suite. HTML/javascript web pages provide contextual information on New Zealand’s geology and the exploration history of its 18 sedimentary basins.

Proximity to the International Date Line posed a problem for much of the available web mapping technology, which performed poorly and failed to integrate with Google imagery. GNS solved the problem by adopting and customizing open-source technologies, fixing a dateline issue that could otherwise have delayed PBE’s release by many years.

GNS has deployed the open source Solr package, which provides full text search across documents and the database’s attribute data. GNS has also developed a web-based content management system which provides a fast track for geoscience publishing and manages multiple sites for other users such as minerals or geothermal.

GNS told Oil IT Journal, ‘We use Geoserver on the server end with the OpenLayers javascript library on the client. Clients also use the open source libraries Proj4js for projection support, ExtJS for GUI, and GeoExt to extend OpenLayers functionality. GNS has contributed code back to Geoserver, OpenLayers and GeoExt. The GIS layers used by Geoserver reside in a variety of formats. Internally, GNS uses ESRI ArcMap and stores GIS layers in ArcSDE/Oracle. Some project-level spatial data used in PBE resides in the open source PostGIS system which offers better flexibility and speed within Geoserver. We expect to make use of PostGIS in the future since the latest ESRI software can use data stored there. One of the big advantages of Geoserver is the very wide variety of spatial systems that it can fetch data from.’
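
For the curious, the traffic OpenLayers sends to Geoserver is mostly plain OGC WMS. The minimal sketch below (hypothetical endpoint and layer name, not GNS’ actual service) requests a map image whose bounding box spans the 180° meridian, the very case that trips up much web mapping software.

```python
import requests

# Minimal WMS GetMap request of the kind OpenLayers issues to Geoserver.
# Endpoint and layer name are hypothetical stand-ins, not GNS' actual service.
params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "pbe:sedimentary_basins",   # hypothetical layer name
    "styles": "", "srs": "EPSG:4326",
    "bbox": "160,-50,190,-30",            # crosses the 180 degree meridian
    "width": 800, "height": 533, "format": "image/png",
}
r = requests.get("http://example.org/geoserver/pbe/wms", params=params)
open("basins.png", "wb").write(r.content)
```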


ECIM 2012, Haugesund

NDB on application selection for Cairn. Shell’s enterprise architecture. Multi domain data management for Statoil’s injection/disposal well monitoring and BP’s seabed survey database. ExxonMobil’s ESRI/VoyagerGIS spatial data framework. ConocoPhillips’ Petrel project management. WesternGeco’s seismic data management. ISO ‘semiotic framework’ for data. Shell’s data quality guru.

Cairn Energy was confronted with a common operator’s problem—how to select an appropriate application portfolio for its exploration unit. NDB’s Jonathan Jenkins described a novel process, ‘application speed dating,’ which was developed to circumvent ‘longwinded and partisan vendor presentations’ and to be in a position to choose, if not Mr. Right, at least Mr. Goodenough. An earlier attempt at application selection failed as testers got bored, scope crept and vendors provided ‘mini demos’ rather than tests. Data flow was overlooked and multiple Oracle instances were a nightmare.

Cairn started over with a more structured approach, pre-screening vendors with a checklist and establishing key workflows for testing. Even then the going was not easy—Jenkins described geoscientists as ‘artistes’ who ‘combine art and science in a way that most of us find annoying!’ Enter the speed dating paradigm, with NDB as match maker. NDB provided a controlled environment for presenting results—and a facilitator (Jenkins) who was there to calm the passions and listen to the quieter voices. Scoring included data management and workflow tests. The outcome was that speed dating worked fast and excluded one vendor whose data management capability was lacking. Jenkins also cited the Aupec benchmark study as of use in determining which vendors’ tools are in the ascendant and which are on the wane.

Shell’s Lars Gaseby cited an Accenture study which found that ‘the cost of poor data is hidden in business processes and data maintenance and integration costs.’ Currently, reactive ‘band aid’ fixes are often applied to link reporting, finance, SCADA and geoscience systems. Moreover, data cleansing is often done in reporting systems—leaving the data source dirty. A significant part of people’s jobs remains reformatting and accessing data from foreign systems. Hence Shell’s interest in an enterprise architecture (EA). EA means defining a data architecture in a way that supports the business as a whole. Shell is building on a previous DAMA-derived data framework which defines data and data value owners. The new ‘data centric approach’ derives architecture from data and function—a different approach to the previous ‘systems driven’ architecture. EA requires strong business involvement—IM/IT ‘should be a follower.’

A presentation by Statoil’s Frode Uriansrud showed how monitoring of injection and disposal activities cuts across a wide range of disciplines. Injection, into suitable geologic horizons, of produced water, slop, cuttings, H2S water and CO2 is a cost effective and accepted technique. Developing a disposal well involves all the usual data sets from high resolution seismics, through logs to well tests. Injection likewise involves a plethora of measurements and data. Monitoring has led to the identification of issues such as direct hydraulic fracturing through the caprock to the sea bed, leakage along faults and well integrity problems. Data collected includes pressure and flow, visual inspection by ROV, bathymetry and environmental monitoring for hydrocarbons. Data is collected for each batch pumped and pumping stops if a pressure drop is observed. Data collected for activities such as pipeline and cable surveys is ‘re purposed’ for Frode’s team. All this needs good data management and cooperation across disciplines.

Max Gray’s presentation focused on ExxonMobil’s brand new enterprise spatial data framework. Over recent years, ESRI’s ArcGIS desktop has seen significant take-up in the upstream with up to 500 users, many occasional. But there is a huge gap between the richness of ArcGIS desktop and familiar tools like BingMaps and Google Earth. ExxonMobil set out to build a GIS infrastructure for both sets of users. This leverages ArcGIS Server alongside VoyagerGIS. VG is key for data management and discovery—cataloging and exposing data to users. The ArcGIS Silverlight API was used to develop web apps with some geoprocessing capability. Exxon distinguishes ‘foundational’ from non-foundational data. Foundational data has a wide audience and is used across the enterprise. Non-foundational data is reserved for local and or specialist use—users need a very good reason to classify data as such. The majority of GIS data is foundational. Geospatial metadata standards are used to classify data according to 14 themes (addresses, basemaps, cadastral, etc.). Exxon’s central GIS database is being used to wean users off Google Earth and on to a combination of ESRI data, Bing Maps and Exxon proprietary data.

Stein Sigbjoerensen explained how ConocoPhillips manages Petrel projects. CP began using Schlumberger’s Petrel some ten years ago as a stand alone tool with no data management as such. As usage grew, the company encountered problems with data sharing and with users knowing what was already available. Project size was growing and projects were very slow to open. CP, with help from Blueback Reservoir and Schlumberger, embarked on a project to create a new Petrel data environment to support its Ekofisk team—along with best practices and procedures for data population and a retrofit of current projects to the new environment. The result is that CP has now integrated Petrel into its data infrastructure which centers on Landmark’s OpenWorks/R5000 data store. OpenSpirit is used to get data into a Petrel reference project. The data management group maintains an asset master project for Ekofisk. Users’ projects can be created empty from a template or cloned from the master project. When work is done results can be fed back to the asset master and the user’s project deleted.

CP’s data environment also includes Schlumberger’s ProSource, GeoFrame and Techlog. The Petrel data environment is maintained via Blueback’s Project Tracker, which updates results, template and master databases at regular intervals. Care is required when using OpenSpirit to place data into Petrel—Sigbjoerensen recommends keeping the number of attributes transferred to a minimum. After a refresh, users are given a month’s grace before projects are deleted. To date CP has deleted 70 projects—with so far no complaints. While the system captures snapshot datasets at bid rounds and other stage gates, ‘full circle’ back population of the OpenWorks database appears to be work in progress.

Rick Johnston offered some insights as to how WesternGeco manages its huge seismic data library. This includes culling of old 2D data as surveys are reshot in 3D. But generally, old 3D data is kept since different recording geometries make for a variety of target illumination. WG has five processing hubs and moves data around the world for work load balancing. WG’s big data is getting bigger with a 3x hike for IsoMetrix data and a 12x increase for IsoGrid data. WG uses the new SEG-D Rev 3 tape standard. Cataloging data begins in the field before data delivery. A recent remastering/cleanup program has unified WG’s media and is now saving the company $4 million per year, a two year payback. WG has destroyed four million tapes in the last ten years. The company has 85 petabytes of ‘active’ data in its Houston hub which would take around 20,000 days to read! The library will grow by almost five petabytes this year. WG has some 15 petaflops of HPC capacity world wide (at the hubs and on its vessels). But the library is getting smaller with new high capacity media. Houston used to have a 12,000 sq. m tape store. This is down to 200 sq. m after the last ‘crunch.’ Interestingly, the largest prestack survey was a 150,000 channel onshore campaign for Saudi Aramco. WG is now offering clients ‘virtual’ data delivery—they get a delivery copy. WG maintains the original and manages entitlements—a kind of iTunes for seismics.

Shell’s global data quality guru Kishore Yedapalli observed that typical workflows involve checking and fixing errors after delivery. Folks do not in general reach out to the data supplier to get things fixed up front. Contract owners need to provide better requirements to suppliers and keep the communications channel open after delivery. Shell’s goal is a single version of the truth for data within and across its upstream businesses—and to avoid recourse to massive Excel spreadsheets/macros as a ‘solution.’ Yedapalli gave an enthusiastic endorsement to Exprodat’s data quality toolset. The big picture of a single, summary data quality KPI per organizational unit gets management attention. Exprodat also acts as a data quality dashboard for Shell’s 200 major OpenWorks projects—showing which are improving and which are getting worse.

Those interested in data quality may like to learn of the emerging ISO 8000-8 data quality standard. Tor Arne Irgens (Norwegian Defence Logistics) and colleagues Trine Hansen and Atle Kvalheim from DNV explained that the value of such standards for the military lies, inter alia, in avoiding ‘friendly fire’ incidents and targeting errors. Irgens cited the 1996 IFIP/Frisco Report as the foundation for the analysis. Frisco provides a terminological foundation for information technology, a.k.a. a ‘semiotic framework for information and data quality.’ This comprises three layers, ‘syntactic,’ ‘semantic’ and ‘pragmatic.’ The intent is to build ISO 8000-8 into data acquisition contracts and thereby achieve ‘trusted data.’ The ISO standard will be formalized by year end 2013.

Ole Christian Meldahl (Schlumberger Water Services) observed that when water is concerned, things quickly get emotional, to the point where it ‘may be hard to have a rational discussion.’ Water is key to shale gas operations, coal bed methane and heavy oil. Sourcing, spills and flow-back water all need managing and inventorying. This challenges traditional data management as many different specializations are involved—biology, well, injection, quality etc. Moreover there is ‘a complete lack of standards’ in water management, apart from ‘roll your own’ in Excel*! In the meantime, Meldahl suggests you make your own standards. Water management is deceptively similar to petroleum engineering. But it has a different history and culture. With water injection taking place near habitation, the ‘risks are increasing tremendously.’

Exprodat’s Ian Milligan and Walter Jardine (BP) described a real-world trial of the OGP’s new seabed survey data model (SSDM). The SSDM was published in April 2011 and is used for site and route surveys and to ‘de-risk’ drilling the tophole. These activities leverage high resolution seismic as the key data set to identify shallow geohazards (gas, boulders, faults to surface). Other unexpected stuff encountered includes a 100 year old telecoms cable that got wrapped around a drill bit and an unexploded WWII bomb within 40 m of the Forties pipeline. Site surveys include sparker surveys, sonar, high resolution seismics, coring and environment sampling. There are lots of different equipment and sensors involved. The data is valuable but it can be hard to access legacy information. GIS is an excellent medium for collating all of the above—hence the BP pilot of the SSDM format in the North Sea ETAP area.

The SSDM is a simple ESRI Geodatabase with subtypes and attribute domains for data validation and symbology. BP’s implementation was extended with an interface to the Pipeline open data standard spatial data model. Around 10 GB of ETAP legacy data acquired by several contractors had to be ingested. This included data as delivered from contractors, survey reports, charts, bathymetry, geotechnical logs in Excel and one GIS file. Each data type was converted to a geodatabase before consolidation to the master ETAP SSDM repository. The toolkit included ArcMap, ArcCatalog and ArcToolbox along with some Exprodat custom tools. ET GeoWizards and PetroGIS also ran. NitroPDF proved useful to extract tables. Loading ETAP’s 70 plus surveys was not without its problems—both from a software and a data quality standpoint. It was a ‘fiddle’ to get all the information together and much legacy source data is ‘not really amenable to GIS.’ All in all the project was a success and is now being deployed on the Clair field. Visit the ECIM home page here.

* Although the OGC’s WaterML may be of interest here.


Folks, facts, orgs ...

A.D. Little, Baker Hughes, eLynx, Emerson, Greater Yield, Helix Energy Solutions, IBM, Ikon Science, Infotechnics, Ipcos, Neuralog, Nvidia, OFS Portal, OSIsoft, FMC Technologies, Rajant, Wood Group, Petrosys, Reservoir Group, T.D. Williamson, Tata Consultancy, Murchison Law, SPE.

Arthur D. Little has published a whitepaper ‘The projects, technology and procurement organization’ reviewing Shell, Statoil and others’ best practices.

Baker Hughes has named Mario Ruscev as CTO. He joins from Geotech.

Cincinnati Bell unit CyrusOne is adding a third data center to its Houston West location. The 120,000 sq. ft. 24 MW facility will open Q1 2013.

Gary Tootle will head-up eLynx Technologies’ new Calgary office.

Emerson has appointed Mark Bulanda as executive VP industrial automation. He succeeds Emerson Europe president Jean-Paul Montupet who is now chairman of the Industrial Automation business.

Paul Rosenblum heads-up Greater Yield’s new energy industry practice.

Helix Energy Solutions has appointed Jan Rask to its board of directors.

IBM is opening a Natural Resources Industry Solutions Lab (NRIS Lab) in Sao Paulo, Brazil, focusing on automated solutions for mining and oil and gas.

Philip Neri has been appointed VP global marketing for Ikon Science. He was previously with TerraSpark Geosciences.

Infotechnics has opened a new office in Warwick, England, and recruited Gordon Duthie as a sales executive and Jim Ross and Nabil Kabbani to its technical team.

Gilles Simon is business development manager at Ipcos in Montpellier, France.

Steve Larson has joined Neuralog as Business Development Manager.

Guy Gueritz is EAME oil industry business development manager for Nvidia.

Baker Hughes IT director Andy Morley has joined the OFS Portal management board.

Jenny Linton has replaced Bernard Morneau as president of OSIsoft. Morneau is now OSIsoft’s chief strategy officer. Martin Otterson has been promoted to senior VP sales, marketing, and industry.

FMC Technologies has appointed Bob Potter as president. He was previously with Energy Systems. Doug Pferdehirt has joined as Executive VP and COO.

Rajant Corporation has recruited William ‘Rusty’ King as VP, Oil & Gas and Terry Wason as VP of business development. King hails from Hydratight and Wason from Smart Technologies.

Ian Wood is to retire as Wood Group chairman. CEO Allister Langlands takes his place as Bob Keiller becomes CEO.

Petrosys has recruited Richard Li, Peter Mullins, Nikhil Sobti, Zohreh Nejadian and Raymond Gahan.

Reservoir Group has appointed Simon Howes to head up its data management business, Interica. Howes was previously with Carl Zeiss.

Yerlan Andashev heads-up T.D. Williamson’s new office in Atyrau, Kazakhstan.

Erik Van Kuijk is now director, strategic innovations upstream oil and gas at Tata Consultancy Services.

Dallas energy attorney Vince Murchison is forming the Murchison Law Firm focusing on the pipeline industry.

Jeff Spath (Schlumberger) is the 2014 president of the Society of Petroleum Engineers.


Done deals

Fugro, AspenTech, Excellere, FMC, Pure, GE, Presens, Naxys, HII, AES, New Tech, Carr, Quest, Digital Insight, Tetra Technologies, Westward, DLM, Ikon, JRS, STI, Technip, FEI, VSG.

Fugro expressed ‘surprise’ to find itself on the Center for Financial Research & Analysis’ ‘risk list.’ The company is ‘unaware of any reason why Fugro would be put on such a list.’

The International Court of Arbitration of the International Chamber of Commerce has concluded that AspenTech acted lawfully in terminating the reseller agreement with AspenTech Middle East (now ATME). ATME is liable to pay approximately $25 million in damages, interest and costs.

Denver-based equity firm Excellere Partners has expanded its investment portfolio through an investment in Integrated Petroleum Technologies.

FMC Technologies has signed a definitive acquisition agreement to acquire Pure Energy Services for C$11.00 per share in cash, or approx. C$282 million (US$285 million).

GE has acquired Norwegian Presens, a provider of pressure, temperature and flow measurement solutions. GE is also to acquire Naxys, a Bergen, Norway-based provider of subsea leak detection and condition monitoring sensors.

HII Technologies has signed a Letter of Intent to acquire Dallas-based oilfield service company AES.

New Tech Global has acquired Carr Environmental Group, an environmental consulting firm specializing in compliance for the oil and gas industry.

Quest Integrity Group has acquired New Zealand-based Digital Insight, a specialist in remote digital video inspection.

Tetra Technologies has completed its acquisition of well test specialist Greywolf Production Systems for $55.5 million cash.

Westward Partners has acquired a majority interest in DLM Oilfield Enterprises. Generational Equity advised DLM on the transaction.

Ikon Science has acquired Australian JRS Petroleum Research, developer of integrated software for image log analysis and geomechanics. JRS’ software will form a RokDoc geomechanics module.

Statoil Technology Invest is to exit ShareCat. STI, along with ProVenture and Investinor, is to take part in a 21 million NOK share issue by HPC specialist Numascale.

Technip has finalized its € 225 million acquisition of Stone & Webster process technologies.

FEI has acquired Visualization Sciences Group (VSG) for €44.8 million.


Consortium corner

Subsea instrumentation interface standard plugfest. RSI floats reservoir anisotropy research.

The Subsea instrumentation interface standard (SIIS) is a joint industry project that aims to improve subsea equipment reliability by standardizing the interface between sensors and the subsea control system. SIIS held its second ‘Plugfest’ interoperability trial this month at Tronic’s subsea excellence centre in Ulverston, UK. The output from the SIIS workgroup is fed into ISO13628-6. The vision is for ROV-pluggable interfaces to subsea control systems. SIIS, open to oil companies and suppliers, now boasts 32 members.

Rock Solid Images (RSI) is asking for expressions of interest in research into resistivity anisotropy characterization. The joint industry project will build on RSI’s atlas of rock physics and seismic responses and will use well log, CSEM data and modeling to ‘develop a better understanding of regional resistivity and anisotropy trends, examine the underlying causes of these trends and to develop methods to predict anisotropy from common well log measurements.’


Gaming technology drives HARC’s virtual drilling rig

Coastal Impacts Technology Program and Epic Software roll-out ‘green’ drilling demonstrator.

The Houston Advanced Research Center (HARC), in conjunction with the Coastal Impacts Technology Program (CITP) and the Epic Software Group, is developing a multimedia web application, the 3D virtual rig tour (3DVRT), to demonstrate advances in environmentally friendly drilling technologies. CITP head Richard Haut said, ‘The application contrasts conventional drilling techniques with new methods with a suite of videos and 3D animations.’ The videos are presented by virtual roughnecks ‘Ralph and Rhonda.’

Epic Software has provided the animation, web graphics, video and programming for the project using a ‘powerful software engine’ that allows visitors to move around the rig, play videos and retrieve information on related ‘green’ products. The 3DVRT showcases closed loop mud systems, small footprint rigs, advanced hydraulic fracturing systems and high efficiency water handling and processing systems. The 3DVRT project was funded through a grant from the U.S. Department of the Interior, Coastal Impact Assistance Program. Take the 3D Virtual rig tour here.


Smalltalk facilitates PetroVR object retrieval and re-usability

Caesar Systems uses Smalltalk programming language to capture, modify and audit user actions.

Caesar Systems’ Leandro Caniglia and Carlos Ferro delivered presentations to the 2012 EU Smalltalk user conference in Belgium last month. Caniglia’s hands-on live demo showed how Smalltalk-coded software can be enhanced to handle multiple scenarios within a time-oriented ‘virtual world.’ Ferro’s presentation was titled ‘Saving and retrieving objects in a changing environment.’ He showed how Smalltalk addresses the problem of storing rich objects and adapting them later to an application that may have undergone many changes over a span of several years.

Such back-compatibility is a feature of Caesar Systems’ PetroVR software. The company is a long-time user of Smalltalk. Smalltalk addresses the challenges of changing environments in many ways. Its conceptual simplicity and economy make developers stick to a few simple principles, which remain stable under change.

Caniglia explained to Oil IT Journal that Smalltalk’s keystroke recording functionality meant that user scenarios could be captured on the fly and modified or re-run at will. This makes the environment well suited to sensitivity analysis and hypothesis testing. Smalltalk has been in use since the 1980s. One flagship Smalltalk deployment is JP Morgan’s ‘Kapital’ financial risk management and pricing application.

PetroVR is used by BOEM (formerly the MMS), BHP Billiton, BP, Chevron, ENI, Maersk, Murphy Oil, Shell, Talisman and Total. Visit Caesar Systems on www.petrovr.com and read the blog here.


Kongsberg IT for Pacific Drilling’s Santa Ana

New information management system for dynamic drill ship.

Kongsberg Maritime is to provide Pacific Drilling with a new information management system (IMS), a component of the automation system to be deployed on its dynamically-positioned drillship, Pacific Santa Ana. The IMS consolidates the rig’s control system data into a single, role-based secure web-portal. The IMS is said to enhance information sharing and offer real-time insights into operations for both offshore and onshore teams.

The IMS is a modular system with a scalable infrastructure that can be adapted to specific operational requirements. The Pacific Santa Ana delivery features a data logger that replicates onboard data sources to an onshore fleet database.


Sales, contracts, partnerships and deployments

Aveva, Citi, EST Enerji, MatrikonOPC, Expro, Fugro, IFS, Mahindra Satyam, Ikon, Petrotrace, Ingrain, ISN, M2M Data, SkyWave, Meridium, OSIsoft, Kongsberg, Petris, Intergraph, KSS Fuels, NRX, Bright Computing, EnergyNet, Schlumberger.

Saudi Aramco/Total Refinery and Petrochemical Company has selected Aveva NET as a ‘digital information hub’ for engineering document management. Aveva Data and Documents will manage engineering information at the Jubail joint venture.

Citi has been awarded a mandate from New York-based Kimmeridge Energy Fund to provide private equity fund and cash administration services.

Turkish control system integrator EST Enerji has joined the MatrikonOPC global partner network.

Expro has secured a contract for its clamp-on Sonar meters from Eni on its Zubair field, Iraq.

Fugro Seismic Imaging has adopted Paradigm Skua for modeling of salt bodies and velocity fields.

IFS has partnered with Mahindra Satyam for joint sales and marketing activities around the IFS Applications software suite, and staff training.

IFS Applications has been selected by Ontario-based Trans-Northern Pipelines as its enterprise asset management and resource planning solution.

Ikon Science and Petrotrace Global are partnering to service new arctic exploration and improve reservoir recovery in Russia and the CIS.

Ingrain has received a contract from the Colombian National Hydrocarbons Agency to digitize the ANH’s national core store.

ISN has deployed an ICT infrastructure for Afren’s Kurdistan, Iraq operations. The project includes regional headquarters complete with data centre, video conferencing, CCTV and a VHF radio link to the field.

M2M Data Corporation now offers its clients SkyWave’s new IsatData Pro satellite communications service.

Apache has selected Meridium as its global asset integrity management system. Meridium has also been chosen by Singapore Refining Company to support its mechanical integrity initiative.

OSIsoft and Kongsberg Maritime have announced a strategic partnership, enabling users of the new Kongsberg Information Management System to benefit from the PI System for processing data and events.

OMV has selected PetrisWindsRecall and Enterprise for its well log management. Petris’ borehole data management solution has also been selected by Petronas Carigali.

Intergraph has won a contract with Santos for the use of SmartPlant Enterprise for Owner Operators (SPO), along with other SmartPlant Enterprise solutions.

MRH Retail has implemented KSS Fuels fuel pricing application across its UK network of more than 300 retail locations.

Shell has expanded the use of NRX Asset Hub software to four additional sites in Malaysia, Iraq and New Zealand.

Chinese oil company Sinopec has chosen Bright Computing’s Bright Cluster Manager with the aim of reducing HPC-related overhead.

The State of North Dakota has chosen EnergyNet for a five-year period to provide web-based auction services for its oil, gas and other mineral leases.

Total is joining Schlumberger and Chevron to further develop the Intersect ‘next-generation’ reservoir simulator.


Standards stuff

OGC GeoSparql. CSIRO ‘secret’ borehole project. Oasis LegalXML. OGP shapes. PPDM UWI.

The Open Geospatial Consortium (OGC) has released its GeoSparql standard, a set of Sparql functions, RIF rules and a core RDF/OWL vocabulary for geographic information.
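
By way of illustration, here is a minimal GeoSparql query, wrapped in Python. The endpoint and data are hypothetical; the prefixes and the sfWithin filter function are from the OGC specification.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint and data; prefixes per the GeoSparql specification.
sparql = SPARQLWrapper("http://example.org/sparql")
sparql.setQuery("""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
SELECT ?feature ?wkt WHERE {
  ?feature geo:hasGeometry ?g .
  ?g geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
    "POLYGON((-104 31, -104 33, -101 33, -101 31, -104 31))"^^geo:wktLiteral))
}""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["feature"]["value"])
```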

Oasis is seeking nominations for two seats on its LegalXML steering committee. The group supports implementation of XML technology in electronic legal documents, records and information exchanges. Oasis has also launched a LinkedIn forum targeting standards for big data.

CSIRO researchers recently reported on the use of OGC standards for mapping borehole observations using a ‘true’ 1-D coordinate reference system. Recently adopted OGC standards CityGML 2.0 and GML 3.3 show promise in representing 3D objects and in deploying linear referencing as per borehole logging. CSIRO’s ‘secret Friday morning project’ leveraged GeoServer.

The OGP geomatics committee has added a ‘shapes’ component to its database of coordinate reference systems, a series of standard polygons used to accurately match coordinate reference systems to geographic locations.

The Energy Resources Conservation Board of Alberta, the Canadian Association of Petroleum Producers and other Western Canadian regulators have endorsed the Professional Petroleum Data Management Association’s (PPDM) proposed overhaul to Canada’s unique well identifier. The well identification Western Canada project follows on from PPDM’s overhaul of the USA API D12A well numbering system (Oil IT Journal June 2010).


Agip deploys Ubisense location-based personnel safety solution

‘World’s largest’ deployment of real time mustering solution leverages S3 ID’s personnel safety tags.

In what is claimed to be the world’s largest geographic deployment of a real-time personnel location and mustering system, AGIP has selected Cambridge, UK-based Ubisense, in partnership with Norwegian S3 ID for its Kazakhstan operations. In the event of an accident, S3 ID provides first responders with critical data on who was present at a facility and who has been accounted for at muster points. AGIP’s safety system shares such information across multiple locations, both on and off site. The technology combines RFID tags with Ubisense’s ‘ultra wide band’ communications technology. A small battery powered tag provides three-dimensional location with 50cm accuracy at up to 200m range.

S3 ID is used to control the maximum ‘people on board’ in a specific environment. Tags can also provide electronic control of permits to work and to restrict access to specific control systems through a central database of permissions. Ubisense’s geospatial solutions featured at a recent meeting of the FOSS4G community where Peter Batty presented his ‘myWorld’ app as an exemplar of ‘disruptive technology’ (Oil ITJ May 2012).


Real time location system tracks oil and gas personnel and assets

Mojix’ RFID tags track personnel and critical mobile assets for safety and supply chain transparency.

Los Angeles-based Mojix, a provider of wide-area RFID networks and passive RFID asset tracking solutions has announced an ‘increased focus’ on the oil and gas industry. The Mojix ‘space-time array reader’ (Star) passive real-time location system tracks critical assets and supports personnel safety applications. Star improves safety by eliminating processes where there is potential for accident or injury. The location of personnel is known at all times and the system ensures that employees are properly attired by reading RFID tags on helmets and safety vests.

Mojix provides ‘deep supply chain transparency’ by reading passive RFID tags on oil country tubular goods from a distance. Workers need not un-rack and examine pipes to check inventory. This is claimed to reduce the number of human touch points and the incidence of lost items. Passive RFID tags are available for containers, pipes, tools, cases and other asset classes from over 100 vendors. The tags implement the ISO-approved EPCGen2/GS1 UHF standard for devices operating in the 860MHz to 960MHz ISM band. The system can read tags at a distance of ‘up to’ 200 meters. Mojix was founded in 2004 by ex JPL/NASA scientists.


Shell, Neos presentation on remote sensing of stray gas

Airborne EM survey tracks Marcellus stray gas and identifies production sweet spots.

A joint Shell/Neos Geosolutions presentation at the US Groundwater Protection Council’s recent forum on stray gas incidence and response described the use of remote sensing technologies to detect surface and near-surface stray gas occurrence and migration pathways.

The presentation described the joint ‘Neoprospector’ survey in north-eastern Pennsylvania. Last year Shell contracted with Neos to conduct a remote sensing survey of its Tioga County acreage with the aim of detecting hydrocarbon seeps and surface indicators and identifying ‘orphaned’ wellbores that were no longer documented in state, county, or prior leaseholder records.

The non-seismic survey involved resistivity mapping and hyperspectral imaging of geo-hazards and floral surface variations. The airborne EM resistivity dataset was integrated with other newly acquired potential field datasets along with well, seismic, and production data, and used to geostatistically characterize production sweet spots in the Marcellus shale. Neos investors include a certain Bill Gates! More from a ‘narrated slideshow’ here.


‘Voice of the customer’ heard at Shell UK retail

Nice Systems’ ‘Fizzback’ provides Shell with real time feedback from gas stations.

Shell UK Oil Products is to deploy a real time customer feedback system from Israel-headquartered Nice Systems. Following a trial last year, Shell is using Nice’s ‘Fizzback’ voice of the customer solution to capture customer feedback at its retail fuel business. Members of Shell’s Driver Club loyalty program are invited to provide feedback via SMS or email immediately after filling up at the gas station. Fizzback enables Shell to assess the performance of its service stations and to ensure they meet its standards.

Shell’s 800 UK-located service stations have access to customer feedback and can address ‘customer issues’ in real time. Shell describes the UK fuels market as ‘highly competitive,’ hence the need to listen to its ‘numerous and disparate’ customers to drive improvement. Nice MD Rob Keve said, ‘Retail is a growth sector for our voice of the customer solution.’ UK-based Fizzback was acquired by Nice last year for approximately $80 million.


Control system virtualization

Invensys offers thin client support and cyber security best practices for DCS installation projects.

Virtualization, providing compute resources from a data center rather than from a local PC, is proving a good way to separate hardware from applications to improve IT flexibility and scalability. Now, Invensys’ Operations Management unit is extending virtualization to process control systems by providing thin client support for its Foxboro I/A Series distributed control system (DCS). Invensys offers DCS virtualization on the Microsoft Hyper-V and VMware platforms.

Invensys’ Gary Freburger explained, ‘Control and safety systems are often delivered as turnkey solutions which can take up to 18 months to implement. Virtualization of our Intelligent Marshalling and Intelligent Engineering workbench solutions reduces implementation costs and project risk.’

The offering includes a new range of servers qualified as optimized virtual machine-hosting appliances along with diskless terminals and thin client management software. Invensys is also providing advice on cyber-security best practices and approved virtualized architectures.

Virtual machines are accessible worldwide via terminal services so global teams can work on projects around the clock. Invensys claims that the virtualization route to deployment ‘could be made available to customers after commissioning by offering off-site archive services, as well as shadow-system support for customers willing to share their applications with Invensys support teams.’ More from Invensys here.


Pemex trials Chimera’s ‘non hydraulic’ helium frac

Mexico’s Chicontepec tight oil reservoir slated for test. But what does ‘non hydraulic’ mean?

Houston-based Chimera Energy has announced a novel ‘non-hydraulic’ method for shale oil production. Chimera fracs shales by injecting liquid helium into a well which expands 750 fold as it transitions to the gaseous state, creating the pressure required to open up existing fractures and form new ones. Chimera is currently working to industrialize its novel process for ‘mass production, relicensing and sales.’
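
The 750-fold figure is easy enough to check, at least at atmospheric pressure, from textbook densities, as in the following quick calculation.

```python
# Sanity-check the '750 fold' helium expansion claim (approximate textbook
# values, at surface conditions - downhole pressure would give a smaller ratio).
R = 8.314            # gas constant, J/(mol.K)
M_HE = 4.003e-3      # molar mass of helium, kg/mol
rho_liquid = 125.0   # liquid helium density at its 4.2 K boiling point, kg/m3

rho_gas = 101_325 * M_HE / (R * 288.15)   # ideal-gas density at 15 C, 1 atm
print(f"gas density {rho_gas:.3f} kg/m3, expansion {rho_liquid / rho_gas:.0f}x")
```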

Commenting on skeptical remarks in the blogosphere, Chimera president Charles Grob said, ‘There will always be naysayers in a new business venture, and we have ours. I and all those associated with Chimera are excited about our prospects and encouraged by our progress. We are moving forward at a deliberate pace!’

Grob reported on ‘fruitful meetings’ with Pemex which is to trial the technique on the Chicontepec tight oil formation. Chicontepec has been reported as holding some 140 billion barrels of reserves, ‘the equivalent of half of the reserves of Saudi Arabia.’

Comment: Apart from any discussion as to the practicality of the method, the claim that fracking with a gas is ‘non hydraulic’ is curious from an etymological standpoint. Our dictionary defines hydraulics as ‘the branch of science concerned with the practical applications of fluids in motion.’ As both states of helium are, sensu stricto, ‘fluid,’ the process would seem to be ‘hydraulic.’


eLynx announces GPS-enabled alarm for oil and gas

Enhanced ‘i4D’ device adds location information to alarm data for mobile asset monitoring.

Tulsa-based eLynx Technologies has announced a GPS-enabled alarming device for oil and gas. The device is an enhanced version of eLynx’ i4D, a four-input discrete alarm that uses the Iridium global satellite network to communicate from remote locations without cell phone coverage.

The i4D provides run time metrics for oilfield equipment such as compressors and generators. The GPS-enabled version adds real-time location information for mobile assets. Logs and location data are sent to eLynx’ data center and can be viewed in the SCADALynx mapping package. Notifications can also be sent to operators via text, email or voice callout. The device calls home once a week to say it is still alive and communicates battery level information with each data transmission.


Utilipoint leverages Hadoop in energy data analytics

Sister company StraTech and partner Cloudera team on ‘big data’ solution to energy data mining.

Energy research and advisory boutique UtiliPoint has expanded its data analytics practice with a Hadoop services offering. UtiliPoint COO Bob Bellemare said, ‘Expanding our analytics capabilities to include Hadoop will help customers identify previously unrecognizable associations and trends.’ The offering is backed by sister company StraTech and partner Cloudera, a Hadoop specialist.

UtiliPoint MD Rand Warsaw added, ‘When it comes to data analysis, size and speed are no longer issues. A terabyte of unstructured data previously took hours to process. With Hadoop, it can be mined in seconds. The issue now is what companies need to know and how best to use the data.’ UtiliPoint and StraTech are units of Midas Medici Group, an IT services and infrastructure provider. More on Hadoop in the July-August 2012 issue of Oil IT Journal.

