UK-headquartered natural language generation (NLG) specialist Arria’s Data2Text unit has signed a three-year extension to its deal with Shell for the use of NLG to report on equipment health on Gulf of Mexico assets. Following initial trials, Shell is to pay ‘$5-10 million’ over the period for Arria’s technology. Shell Oil rotating equipment specialist David Laux observed, ‘I have worked [with Arria] on the vision for the articulate oil field from inception. This brave new technology has massive potential to change the way that oil and gas platforms are run.’
Arria has its origins in a 2009 University of Aberdeen spin-out, Data2Text. A group of investors, under the Arria banner, acquired 20% of Data2Text in 2012 with an option (exercised in October 2013) on the remaining 80%. Data2Text’s technology has two components, an artificial intelligence toolset that extracts information from disparate data feeds and a natural language generation system that turns it into human-readable English. It’s rather like HAL, the computer in 2001: A Space Odyssey, which famously said, ‘I've picked up a fault in the AE35 unit. It’s going to 100% failure in 72 hours.’
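The data-to-text idea can be illustrated with a toy rule that maps a structured equipment reading to a human-readable sentence. This is a sketch of the concept only; Arria’s actual AI and NLG pipeline is far richer, and the unit names and thresholds below are invented.

```python
# Toy data-to-text rule: turn an equipment reading into English.
# All names, fields and thresholds are hypothetical.
def narrate(reading):
    if reading["vibration_mm_s"] > reading["alarm_level"]:
        return (f"Vibration on {reading['unit']} is "
                f"{reading['vibration_mm_s']:.1f} mm/s, above the "
                f"{reading['alarm_level']:.1f} mm/s alarm level. "
                "Inspection is recommended.")
    return f"{reading['unit']} is operating normally."

msg = narrate({"unit": "pump P-101", "vibration_mm_s": 7.2, "alarm_level": 4.5})
```

A real system would of course draw on many data feeds and historical context before generating narrative, which is where the AI component comes in.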
In a flagship demonstrator, Data2Text’s technology has been deployed at the UK Meteorological Office to generate fine-grained local five-day weather forecasts at some 5,000 locations around the UK.
Arria has backed the technology with a high-profile marketing campaign including keynote talks at SPE-sponsored events in Abu Dhabi and Utrecht. The technology was also presented at the Houston Pumps and Pipes event earlier this year (Aberdeen University’s ‘BabyTalk' NLG app assists neonatal carers).
In oil and gas, NLG is presented as a better way of communicating information to stakeholders than today’s human-machine interfaces. The system captures knowledge from about-to-retire subject matter experts and renders it as ‘meaningful natural language narratives.’ During operations, the NLG engine compares current and historical data from multiple sources, ‘combining data analytics and computational linguistics.'
In December 2013, Arria floated 36% of its share capital on the London stock market. The shares jumped from the £1 issue price to peak at £2.82 before falling back sharply in the first week of trading. Arria is currently at 55p. The company has been awarded two US patents for its technology, N°s 8,762,133 and 8,762,134.
Winding up his keynote at the SPE Abu Dhabi event on the 'articulate machine,’ Arria CTO Robert Dale asked, ‘What would your data tell you if it had a voice?’ Let’s hope it’s not ‘Dave, this conversation can serve no purpose anymore. Goodbye.’ More from Arria (and HAL).
Halliburton has acquired 100% of the share capital of UK-headquartered Neftex Petroleum Consultants. Neftex offers an extensive georeferenced database of public domain bibliographic information for explorers, exposed in a web map. A demo at the EAGE pulled together chrono-stratigraphical charts from both sides of the Atlantic to compare the development of Brazil’s Espírito Santo basin with Angola’s Kwanza basin. Information on palaeogeography, facies and play types can be pulled into the view. All information can be investigated palinspastically using PalaeoGIS reconstructions from the University of Lausanne.
The company’s GIS systems have evolved considerably over time. Neftex started out with a web mapping application using the Bing Maps API with tiled images produced by Safe’s FME. More recently Neftex has deployed the open source GeoServer API to provide polar projections, web map services and paleo-reconstructed maps. Also of interest (and intrigue) are Neftex’ ‘ChronoCubes,’ bundles of the Neftex products for direct load to Schlumberger’s Petrel. More from Neftex.
Let’s pretend that I am the exploration manager for a medium-sized oil and gas outfit and we have just drilled an expensive dry hole. I am feeling the heat already. During the drilling, as is customary - well, it used to be, I don't suppose it is now - the company’s top brass, WAGs* in tow (sorry for the sexism, but World Cup oblige), all traipsed out to the drill site, located in a woodland area, to witness the well test and (hopefully) celebrate. During the visit, the CEO and her husband (back in your good books?) wandered off (what could they have been up to?) into the woods and discovered an old abandoned well head only a couple of hundred yards from ours!
The CEO came back disconcerted, asking what was going on. I was on the spot as I hadn't a clue what this old well was about. It wasn't on any of our maps. Our well then failed spectacularly to flow and we all headed back home, me in a state of considerable embarrassment. Now this is a fiction, so let’s imagine that instead of, as is customary, sweeping such things under the carpet, we kicked off a post mortem investigation to find out why the hole was dry and what should be done to avoid it happening again.
This post mortem is an excellent example of a good business question that has some interesting attributes. Number one, it is easy to understand by all those involved. Number two, it is ring-fenced - why is the well dry? And number three, it has a chance of being answered with finite resources in a reasonable time frame. It is, in other words, a business question that has all the attributes necessary to turn it into a project! Note the order of things here. To give this more weight I propose a little flow chart viz.
OK I can hear you think, ‘that is pretty dumb as flow charts go’ but bear with me.
Actually answering both components of this question can be quite tricky which is why I introduced the deus ex machina of the nearby well. The investigation found that the old well drilled some 30 years ago had already tested the structure and proved it to be dry. How could we have re-drilled a dry hole**?
Although this scenario sounds improbable it does happen. Wells are ‘lost,’ or misplaced. Wells are drilled in the wrong place. Sidetracks run into old wells, logs may be wrongly depth registered and so on. Shit happens...
Such issues are of great interest from an IT perspective. We have identified a data issue. What can IT (I use IT in the broadest sense, including data management, governance and computer experts etc.) do to help? IT loves to ‘abstract’ and generalize problems. Data issues of this nature are currently considered to be ‘master data’ problems and as such amenable to being handled in a ‘master data management' (MDM) project. At this juncture I invite you to visit Wikipedia to see what a large bucket MDM is today - including data cleanup, consistency, ‘unicity’ and so on.
A presentation from Entrance Software’s Nate Richards at the 2014 PNEC conference last month on ‘Unexpected insights from an MDM failure’ included a citation from Gartner which forecast that, ‘Through 2016, only 33% of organizations that initiate an MDM program will succeed in demonstrating the value of information governance.’ Richards argued that ‘MDM initiatives are not just projects,’ and that ‘people are the primary drivers of success (or lack thereof).’
I think there is another problem with the MDM project in that its top-down approach only brings benefits when a project is complete. In North America, several million wells have been drilled for oil and gas. Large operators have thousands, perhaps hundreds of thousands of wells in their databases. As more complex wells are drilled and operators are confronted with trickier geometries, the burden of tracking everything goes through the roof. We are quickly into ‘boiling the ocean’ or ’solving world hunger’ territory, to use just two popular clichés.
If MDM projects really do fail as often as Gartner would have it***, it is probably because they are ‘MDM projects’ and as such are unconstrained and not addressing a sufficiently specific business issue for them to have a chance of success. In fact Richards’ presentation revolved around the implementation of ‘complex shale lease compliance' which is indeed a good ‘business question.’
How should my fictitious company have gone about ‘not letting it happen again?’ I don't know. Maybe they should have fired me (maybe they did, maybe I didn't make all this up). Perhaps a data blitz prior to drilling would be a good approach - checking all information within say a 10km radius of the well location for accuracy. ‘Just-in-time MDM' if you want to be fancy. At least take a walk in the woods before spudding to make sure nothing embarrassing will be seen by the CEO. Whatever approach you try, just remember the magic formula and put the business cart before the project horse.
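The ‘just-in-time MDM’ data blitz suggested above could be sketched as a simple proximity check: before spudding, flag any legacy wells within a given radius of the planned surface location. The well names and coordinates below are invented for illustration.

```python
# Hypothetical pre-spud check: list legacy wells within radius_km of a
# planned surface location. Data and names are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def wells_within(planned, legacy_wells, radius_km=10.0):
    """Return legacy wells closer than radius_km to the planned location."""
    return [w for w in legacy_wells
            if haversine_km(planned[0], planned[1], w["lat"], w["lon"]) <= radius_km]

legacy = [
    {"name": "OLD-001", "lat": 29.76, "lon": -95.37},  # a few hundred metres away
    {"name": "OLD-002", "lat": 30.50, "lon": -95.37},  # roughly 80 km away
]
nearby = wells_within((29.7582, -95.3698), legacy)
```

Of course, this only finds wells that are in the database in the first place - the walk in the woods still has its place.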
* WAGs - wives and girlfriends (in football parlance).
** I am not very happy about this intro and I don't suppose that you are either. You may like to replace it mentally with an unusual set of circumstances that have tested your information systems. The expression ‘the exception that proves the rule’ comes to mind. This is not, as is commonly believed, nonsensical. It refers to an older usage of the word ‘prove’ as in ‘puts to the test.'
*** I am a skeptic when it comes to Gartner-style analyses and I am not saying that MDM is a waste of time. Data cleanup etc. is an essential part of doing business. On the other hand, many data ‘issues’ would go away if the constraints available in the relational database were correctly implemented.
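The footnote’s point about relational constraints can be illustrated with a minimal sketch: a primary key blocks duplicate well identifiers and a foreign key blocks log data against unknown wells, catching bad records at load time. Table and column names here are invented, not from any real schema.

```python
# Sketch: relational constraints catching "data issues" at load time.
# Schema and identifiers are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
CREATE TABLE well (
    uwi TEXT PRIMARY KEY,                    -- no duplicate well identifiers
    lat REAL NOT NULL CHECK (lat BETWEEN -90 AND 90),
    lon REAL NOT NULL CHECK (lon BETWEEN -180 AND 180)
);
CREATE TABLE log_run (
    uwi TEXT NOT NULL REFERENCES well(uwi),  -- no logs for unknown wells
    top_depth_m REAL NOT NULL CHECK (top_depth_m >= 0)
);
""")
conn.execute("INSERT INTO well VALUES ('42-201-00001', 29.76, -95.37)")

# A log run against a well absent from the master table is rejected.
try:
    conn.execute("INSERT INTO log_run VALUES ('99-999-99999', 1500.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```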
ESRI Petroleum user group (PUG) president Andrea Le Pard announced a ‘rebranding’ of the PUG as a ‘professional industry organization.’ The new PUG is to promote GIS as a professional discipline in oil and gas companies and to help out with advice on career ladders, job descriptions and salary scales. The new PUG is to initiate geospatial workflow and technology research projects and to carry on with support for the ‘List’ of outstanding GIS-related issues. The List is now housed on the Esri Ideas website. The ‘professional’ proposal got a mixed reception on the GIS in Petroleum LinkedIn discussion board.
ESRI has fleshed-out the ArcGIS ‘platform’ strategy it introduced last year. This includes ‘WebGIS,’ described as a ‘new paradigm’ but actually a technology shift from the somewhat deprecated Flex/Flash and Silverlight endpoints to shiny new HTML5. A new web app builder provides a plug-in environment for developers and programming-free map customization for users. ESRI and third parties provide themes, widgets and ’stem’ apps that users can leverage in lightweight, fit-for-purpose mapping applications.
One demo from RPS ASA showed an oil spill web app that combines operational response smarts for spill tracking with a map of the user’s choosing. Another demo from Coler & Colantino (now Novara GeoSolutions) showed how C&C’s pipeline tools for high consequence analysis, pipeline class calculations and more are accessible from a map in the browser, all connecting to a PODS database back-end. Users can covisualize web services data from the database along with map data - and presumably with other providers’ widgets such as RPS’, although the demo did not show this.
ESRI is positioning the platform rather grandly as one of three pillars of enterprise IT as in ‘what, when and where,’ with the platform providing the ‘where.’ We're not sure what provides the ‘what' and ‘when’ but the demo of the ‘maps at work’ concept from ESRI’s Norwegian partner Geocap was compelling.
Here we have full-featured display and control of a 100 GB 4D seismic volume from inside ArcScene with roaming through cross lines, inlines and time slices, ‘just like in an interpretation package.' Next, log into ArcGIS, bring in key wells and publish your project as a web scene for visualization in a browser - with the same functionality as in the desktop app. Pretty fancy stuff. Maybe one day Schlumberger will be writing plug-ins for ArcGIS! More from the PUG in next month’s Oil IT Journal and from the PUG conference page.
Speaking at the 2014 PNEC conference last month Marc Nolte revealed that Total E&P Canada found its Schlumberger software stack lacked key functionality necessary for its Alberta oil sands development. Total’s data management environment, ‘Gadama’ includes Finder, ProSource, LogDB and Decision Point, along with extensive processes for data loading and QC. But these did not cater for the new mining data types. As Nolte puts it, ‘Oil Sands projects are very different. Data is vast and voluminous and projects are very complex.’ In the meanwhile, Total’s acquired affiliate was plowing ahead with ‘non standard’ applications like Minesight, Surpac, and GeoGraphix.
To bring some data management savvy to the table, Total turned to Acquire Technology Solutions, developer of GIM, a geoscientific information management solution for the minerals and coal industries. Acquire provided Total with software better suited to its mining activity. GIM could also be configured to provide headquarters with data in Gadama-compatible format. Nolte gave a strong endorsement of Acquire’s adaptability to different information management approaches: ‘Even though we're all in the same business, we all have our unique perspectives, ideas and processes.’
Acquire is used to QC and verify wireline information and surveys as well as preliminary field data loaded into SharePoint. Acquire is also deployed alongside ESRI technology to view rig status during drilling operations and to quality control laboratory and geological information from coring.
Today, Acquire is Total’s ’single source of truth,’ feeding HQ, partner, government and project databases. Total also uses SQL reporting and Microsoft SharePoint to publish and display data. Acquire is used in Canada by Shell, Imperial, CNRL and Suncor. More from email@example.com.
Speaking in the fascinating if somewhat inebriating Bols museum (just across the road from Amsterdam’s Van Gogh museum if you are ever in need of a restorative), Roxar Software Solutions MD Kjetil Fagervik unveiled plans to migrate the company’s RMS geomodeling solution to the cloud. The novel ‘RMS to Go’ solution is currently at the R&D stage but Fagervik showed a proof of concept version of the software from an iPad connected over the Bols WiFi to an Amazon EC2 cloud-based server located ’somewhere in Europe.’
All the calculations and graphics run in the cloud and the graphics are exposed over the internet using Calgary Scientific’s PureWeb embedded cross platform graphics library. Roxar is also working on a cloud-based edition of its DotRox API for remote plug-in development.
We also heard a rumor that Landmark is planning a ‘DecisionSpace to Go' cloud-based edition which might make for a little race to the US patents and trademarks office for the use of ‘to go’ in an upstream context.
Schlumberger’s Petrel 2014 will be released next month with a complete refresh of the user interface, the result of extensive usability testing with a couple of dozen key clients - to provide a ’single shared view of the subsurface for all disciplines.’ The new GUI represents a move to the Microsoft Windows ‘ribbon’ interface which in other contexts (like Microsoft Office) has proved a love-hate thing. Schlumberger claims the Petrel ribbon has been well received and it does seem to have had more thought put into its organization than its Office equivalent. A logical left-to-right organization reflects Petrel’s broad seismic-to-reservoir scope. For those not convinced, reverting to a classic WIMP mode is possible.
Different ribbons reflect workflow ‘perspectives,’ covering for example geology and geophysics, reservoir and production. A 'drilling and shale’ perspective is to launch at URTeC later this year. Extensive usability testing has enabled click counts per task and mouse mileage to be significantly reduced over the 2013 edition. New technology in 2014 includes quantitative interpretation with WesternGeco’s AVO, DepoSpace (a Skua clone), Visage geomechanics, Elsevier’s Geofacets and relief well planning with an embedded Olga flow simulator. In a parallel development, Schlumberger is to have its Techlog acquisition running on the Petrel Studio platform later this year and to offer an Ocean API for Techlog.
Halliburton’s Landmark Software unit has announced DecisionSpace G1 Personal Edition (DSG1), a head-on competitor for Schlumberger’s Petrel. To make its DecisionSpace interpretation flagship more deployable in smaller E&P shops, Landmark has decoupled it from the OpenWorks database and is offering the 'full-featured’ geosciences suite as a ‘zero-configuration’ (i.e. no Oracle installation) single user bundle at a named-user list price of $99k.
In an unveiling at the Amsterdam EAGE this month, Landmark showed a time-lapse screen capture movie of a DSG1 installation, showing that around half an hour of real time was all that was required to load the software (including an embedded ArcGIS mapping engine and the SQLite personal OpenWorks database) and a substantial data set of seismic and well data. Once loaded, the full DecisionSpace toolkit is available to the user.
Landmark is pretty fired-up about DSG1’s potential but, for some, the move from a central OpenWorks database to a personal project/file approach à la Schlumberger Petrel was seen as a step backwards - although managing data across different instances of OpenWorks will likely be easier than across multiple Petrel projects.
Two reports see trouble ahead for the oil company workforce. An analysis by Mercer’s energy consulting practice finds that a retirement wave and technical skills gap threaten oil and gas company growth, while another report from the American Geosciences Institute (AGI) predicts a shortage of 135,000 geoscientists by 2030. The Mercer study predicts a loss of ‘50-80%’ of the retirement-eligible workforce over the next five years. Half of the companies studied plan to ‘poach' talent to fill the gap, an approach that Mercer’s Philip Tenenbaum describes as ‘simply not viable or sustainable,’ advocating instead a strategic approach to talent management. Skills development is also required to address another ‘critical problem,’ the widely reported technical skills gap.
The AGI predicts a shortage despite rising enrolment in geosciences education and recruitment, especially since oil and gas and professional societies have increased outreach to student members. Also, most of the geoscience workforce is within 15 years of retirement age.
The American Petroleum Institute is also concerned about the greying workforce and has launched a website to provide information on careers, training and certification.
The functional mock-up interface for model exchange and co-simulation (FMI) was originally developed by Daimler (Mercedes-Benz) to enable the exchange of simulations with suppliers. The interface has had some take up in oil and gas as witnessed by a Statoil presentation at the 2012 Modelica conference of a case study on the estimation of the gas-oil ratio of an offshore production facility.
A candidate release of V2.0 of the FMI spec was released this month for prototype implementations to get feedback from the user base. The spec has two main fields of application, model exchange and co-simulation. FMI models are described by differential, algebraic and discrete equations and can generate C code for use in other modeling and simulation environments. Models can be very large for use offline or small for embedded control systems on micro-processors.
The interface standard allows for coupling of simulators in a co-simulation environment with data exchange at discrete communication points. A master algorithm controls data exchange between subsystems and the synchronization of all simulation solvers (slaves). More from FMI.
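The co-simulation scheme described above - a master stepping slave solvers and exchanging data only at discrete communication points - can be sketched as follows. The ‘slaves’ here are trivial stand-in integrators, not real FMUs, and the coupled system is invented for illustration.

```python
# Sketch of an FMI-style co-simulation master loop. The Slave class is a
# toy stand-in for an FMU; the coupled equations are invented.

class Slave:
    """Toy solver: integrates dy/dt = -k*y + u with explicit Euler."""
    def __init__(self, k, y0):
        self.k, self.y = k, y0
    def do_step(self, u, dt):
        self.y += dt * (-self.k * self.y + u)

a = Slave(k=1.0, y0=1.0)
b = Slave(k=0.5, y0=0.0)

t, dt, t_end = 0.0, 0.1, 1.0
while t < t_end - 1e-9:
    ua, ub = b.y, a.y    # master exchanges outputs at the communication point
    a.do_step(ua, dt)    # each slave then integrates its own subsystem
    b.do_step(ub, dt)    # independently until the next communication point
    t += dt
```

Real masters additionally handle solver initialization, variable step sizes and, in FMI 2.0, optional rollback of slaves to repeat a communication step.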
The 2.2 release of EnergyIQ’s Trusted data manager implements a full well hierarchy with aggregation and blending rules, attribute inheritance when creating child records of a well, a new IHS EnerdeqML loader and GUI enhancements.
Dynamic Graphics’ CoViz 4D V7.1’s new features include drag-and-drop loading from Schlumberger Petrel projects, LAS well log files and enhanced Sim2Seis and well designer workflows.
Widget specialist INT has announced GeoToolkit 4.3, a C++ library of visualization components, and INTViewer 5.0, its data visualization platform for fast-tracking the development of oil and gas applications on Windows or Linux. Versions are also available for Java and C#/.NET.
SGI partners GIS Federal and Nvidia have announced GPUdb a.k.a. Gaia, a highly scalable ‘in-memory’ geospatial database. The GPU-accelerated distributed database is used by the US Army intelligence and security command to render complex geospatial features and heat maps during military operations.
The 10.2 release of Interica’s iAsset E&P data management system includes new bulk data loaders, data audit trails and enhanced mapping. Other improvements cover logging, data cataloguing and selection.
Caesar Systems’ PetroVR 11.2 allows for on-the-fly switching between production decline regimes to model unconventional wells. New facility cloning functionality helps plan rig mob/demob. Sequential shut-in excess policy has been enhanced and the release includes project-level total fluid production roll-up.
Geoforce has announced GT0, 'the world’s smallest RFID/GPS asset tracking device.’ The ’slap-and-track’ device is used to remotely monitor assets ‘too challenging’ for other GPS devices.
Geovariances’ Isatis 2014 geostatistical package offers improved gridding with Bayesian drift and new facies simulation functionality. The new release also offers better seismic functionality and exploratory data analysis.
Oniqua’s ‘intelligent’ MRO streamlines oil and gas and mining companies’ maintenance, repair and operations with cloud-based software and analyst services.
Pronto Software’s Pronto Xi now supports oil and gas enterprise resource planning (ERP) with a single source of enterprise information and process automation.
The 2014 SMi Machine to machine (M2M) conference held in London earlier this year kicked off with a presentation from Shell’s Berry Mulder on ‘frontier automation,’ a.k.a. the ‘latest and greatest’ in process automation. Frontier means remote working, wireless, video and robotics applied to distant sites for situational awareness and operations. Robots are of particular interest to Shell as they improve safety and reduce production downtime. Use cases include flare stack inspection with drones, tank inspection with the EU-sponsored ‘Petrobot’ (see below) and robotic drill pipe handling. M2M requires connectivity across multiple devices from field to office leveraging WiFi, 4G, LTE, and internet in what Mulder dubs ‘X2X’ communications. Workable X2X means good bandwidth and low latency, sometimes across hundreds of kilometres, and reliable means of managing software licenses, authentication and security.
Tim Black provided more information on Petrobot, a project funded under the EU 7th Framework R&D program that seeks to ‘open up the oil and gas and petrochemical market for robotic inspection,’ (sic). The project is developing a semi-autonomous robot capable of achieving tank cleaning and inspection activities and negotiating obstacles with limited operator control. Different flavours of the robot are designed for work in storage tanks and pressure vessels. Consortium members include Shell, Chevron, Gassco and Vopak.
Eurotech’s Tim Taberner provided an introduction to the internet of things (IoT), claiming that the IoT is catching up with the oil and gas industry. In fact, Eurotech’s own IoT architecture was originally designed for the oil and gas industry, to bridge the vertical’s plethora of devices and protocols and enable connectivity between process and office systems. Eurotech advocates a ‘Twitter-like’ approach, decoupling data producers and consumers via a dedicated ‘enterprise service bus for machines.' Eurotech’s interpretation of the IoT is embodied in ‘ESF,’ its Everyware cloud and software framework, an M2M communications appliance that is programmable via the Eclipse Foundation’s Kura project. Everyware is positioned as an operating system for the IoT.
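The ‘Twitter-like’ decoupling of producers and consumers can be illustrated with a minimal in-process message bus: field devices publish to named topics and office consumers subscribe, neither knowing about the other. This sketches the pattern only, not Eurotech’s actual API; the topic name and payload are invented.

```python
# Minimal publish/subscribe bus illustrating producer/consumer decoupling.
# Topic names and payloads are hypothetical.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a consumer callback for a topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """Deliver a payload to every consumer of the topic."""
        for cb in self.subscribers[topic]:
            cb(payload)

bus = MessageBus()
readings = []
bus.subscribe("field/pump42/pressure", readings.append)  # office-side consumer
bus.publish("field/pump42/pressure", {"psi": 2150})      # field-side producer
```

In a real M2M deployment the bus would be a networked broker (e.g. MQTT-style) with authentication and store-and-forward, rather than an in-process dictionary.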
Chris Hook of Swire Oilfield Services painted a picture of an imperfect supply chain with M2M components currently spread across multiple, disparate systems with limited track and trace functionality. This leads to materials shortages or excesses due to lack of delivery status visibility and to the overuse of costly expedited delivery. The solution lies in M2M leveraging a customized commercial-off-the-shelf software platform. Enter Swire’s OverVu, a web-based track and trace solution that consumes data from enterprise IT systems and a range of auto-ID (M2M) systems.
Benoit Tournier modestly presented Sierra Wireless as the global leader in M2M devices and cloud services. Oil and gas M2M is growing rapidly with a forecast 30% annual growth and an estimated 400,000 units deployed by 2016. But technology choices confronting oil and gas deployments are complex. Energy efficiency, reliable and secure connectivity and scalability may be hard to achieve. Sierra advocates 'open and standard’ technologies as in Sierra’s ‘AirVantage’ M2M cloud - embedding standards from ISO 27001 and the Cloud security alliance. Sierra’s poster child is ConocoPhillips which has deployed the AirVantage cloud to enable pipeline and well monitoring. More from SMi.
The 18th PNEC conference on petroleum data integration, information and data management held last month in Houston appears to have succeeded in its move to new ownership under Pennwell, attracting a record 700 oil and gas industry professionals from 34 countries. Judging from the presentations which the PNEC organizers have kindly made available to Oil IT Journal, the conference has succeeded in maintaining the quality and focus of its presentations.
James Vazquez offered detailed advice on how ConocoPhillips has moved ‘from chaos to calm’ in its Petrel environments. But the journey was not easy. The first problem was that Blueback Reservoir’s Project Tracker failed to cope with CP’s large network. Three weeks after kicking off, the tool was still crawling! Enter Blueback’s Project Crawler which managed to find CP’s projects for further investigation with the Tracker. This let CP’s surveyor align project coordinate reference systems (CRS) with data in its OpenWorks database. With some basic data management standards in place, users could use the Petrel reference project tool and benefit from some ‘real collaboration.’ A spin off was an 80% reduction in Petrel project counts. CP’s ‘OneWiki' knowledge sharing environment got a plug.
Ivan Benavides presented BP North America Gas’ approach to assuring well integrity. The well integrity management project addresses BP’s mid to long term needs with a web based tool that helps capture and evaluate anomalies as they are detected. An anomaly ranking tool combines safety critical equipment data with the probability and impact of barrier failure. A systematic process for risk categorization and anomaly prioritization has provided BP with an enhanced capability to meet existing and emerging state regulatory requirements. The system has reduced the time engineers spend on integrity data management and has saved costs through automation.
ExxonMobil’s Kristen Poteet returned to the Petrel data problem. Petrel is Exxon’s tool of choice for integrating well logs and seismic but it presents 'recurring data management difficulties.’ These include the inability to find definitive data, duplicate or incomplete data sets and disk space problems. Exxon has deployed Petrel Studio to rationalize its data management. Exxon’s Studio implementation runs on an Oracle database with ArcSDE extensions and provides a true multi-user environment for Petrel. A ‘publish’ model captures data to Studio at project milestones. The Studio/Oracle combo allows Exxon to capture and search much larger amounts of data than in a Petrel project and the system can alert users when new data is available. Projects can be edited and rebuilt/reconfigured as required. The system was piloted through 2013 on an early release of Studio. Several problems were encountered en route but Exxon is persevering with Studio, which now provides a ’single source of trusted data for several of our business teams.’
Petrosys' Volker Hirsinger observed that, ‘There are currently no standards for how resource estimates should be managed in structured databases.’ He suggests using PPDM architectural principles to develop a maintainable resource estimates data structure that could be extended in the future. The PPDM audit history table provides a mechanism for tracking reserves estimates as they evolve when new data arrives or when interpretations change. Reserve estimates are after all opinions - and tracking them is a matter of considerable importance to regulated operators. Hirsinger was not selling anything but we imagine that the reserves cataloguing workflows that Petrosys was showing at last year’s SEG are not a million miles removed from the proposed approach.
Pradeep Vaswani unveiled BP’s plans for its upstream data integration platform, a suite of design patterns, integration technologies and tools designed to support, inter alia, BP’s ‘Chili’ subsurface modeling project. A key facet of the work is the ’socialization’ of BP’s strategy to ensure buy-in from key stakeholders. The project originated with the difficulty of cross-discipline application, where ‘enormous use cases’ are hampered by a lack of enterprise service bus/OpenSpirit technology. BP has proposed a 'common underlying integration framework’ to meet all current and future requirements. Vaswani went through a number of use cases including PI data integration (with iLogistics and SAP), upstream (Chili - with Petrel atop OpenWorks) and finance (SAP to Midas). Other tools involved are Tibco/Spotfire, Microsoft BizTalk and deployment on SharePoint in the Azure cloud. Vaswani’s presentation included detailed data flow charts, including a real time scenario feeding a 100 terabyte 'massively parallel processing database’ connected to a petabyte big data store.
Jay Hollingsworth embarked on a tour of standards that impact oil and gas in particular Energistics' embryonic common technical architecture (CTA), a technology base for the next generation of Energistics standards. The CTA builds on open standards to avoid licensing issues and to ensure that they are accessible to developers. These include some familiar building bricks like XML, HDF5 and Microsoft’s Open packaging convention. Energistics now has its own OPC profile used in the recent Resqml standard. The Object management group’s UML is also now key to model development and code generation. Also new is a move to the Websocket protocol for the next generation of Energistics standards that will support streaming data. A new Energistics transfer protocol has been proposed leveraging Websocket, Avro and JSON - to be rolled out in Witsml 2 real soon now!
A presentation by Chen XinRong provided some interesting metrics on the scope of Sinopec’s geological asset management effort. Sinopec has 900 employees working in its geology archive. It was proving hard to find an off-the-shelf solution that supported Chinese characters and fulfilled Sinopec business needs. In 2005 the company initiated a $2.5 million project to sort out its geology data with metadata standards, processes and a database and web portal. The in-house solution now boasts some 20,000 users!
Paul Richter (ENI) asked, ‘How do you maintain G&G data quality in a multi-vendor environment?’ adding that to date, ‘the question is largely unanswered.’ Enter ENI’s own data framework. Richter warns against embarking on an ‘unending data cleanup effort.’ A successful effort needs to balance methodology, technology and planning. While Richter downplayed the technology choice, his slides appeared to show a custom edition of OVS’ ‘One Virtual Source’ data solution.
Greg Hess (Halliburton/Landmark) described a cloud deployment performed for Cobalt International. Houston-based Cobalt has operated in a hosted environment since its inception in 2005. Cobalt’s cloud comprises Landmark’s application hosting cloud services, an ‘infrastructure-as-a-service’ (IaaS) model originally developed in 2000. Today over 100 Cobalt users access some fifty applications and a multi-terabyte data set in the cloud. Data management of the latter is a key component of the offering, as indeed is data loading - all provided as a service by Landmark under a service level agreement. While public cloud providers could handle portions of the hosted solution, the real challenge is to ‘bring all the pieces together for a positive user experience.’ This requires a combination of data processing, delivery, and storage in conjunction with configuring the proper networks and appropriate communication links. All relevant traffic concerning the system is tracked with an information technology service management tool for reporting.
EnergyIQ’s Steve Cooper showed how the PPDM ‘what is a well’ standard has been deployed in real world environments for Devon and Anadarko. Application of the detailed nomenclature for deviated, sidetracked and variously completed wellbores has made for easy access to integrated data across the full well lifecycle and across different data silos. The approach is said to be 'a robust alternative to the common, error prone approach of data exchange by spreadsheet.’
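The nomenclature’s value lies in separating concepts that spreadsheets habitually conflate. A minimal sketch (invented classes, not PPDM’s actual schema) of the kind of hierarchy ‘what is a well’ formalizes - one well origin, several wellbores, each with its own completions - might look like this:

```python
from dataclasses import dataclass, field

# Hypothetical illustration of a well/wellbore/completion hierarchy.
# PPDM's actual model is far richer; names here are invented.
@dataclass
class Wellbore:
    name: str
    completions: list = field(default_factory=list)

@dataclass
class Well:
    name: str
    wellbores: list = field(default_factory=list)

well = Well("Eagle 1")
well.wellbores.append(Wellbore("Eagle 1 ST1", completions=["upper", "lower"]))
well.wellbores.append(Wellbore("Eagle 1 ST2"))

# Data from any silo can now be attached at the correct level of the
# hierarchy rather than lumped under a single ambiguous 'well' record.
print(len(well.wellbores))
```

Once every system agrees on which level a data item belongs to, cross-silo integration stops depending on error-prone name matching.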
Tjan Tjwan Liang described Saudi Aramco’s workflow automated services for well logs, ‘Wasl,’ developed to automate log data management. Wasl provides a dashboard and key performance indicators to track data delivery and use. The tool has helped Aramco assure well data quality and completeness.
Scott Raphael outlined how Merrick’s toolset has adapted to the fast paced factory drilling paradigm of unconventional exploration and development. In this context, new wells and equipment need to be added to the production data management systems in a timely fashion, and the flood of data creates opportunities for error. Along with the usual gas, oil and water, operators now need to track how products are sold, flared, vented, used as compressor fuel and so on for regulatory reporting. Tracking production allocation and rapidly changing prices across a variety of liquids is also quite an issue. Operators run the risk of lawsuits if they mess up - or can't prove that their reporting is fair and accurate. Enter Merrick’s Production Manager, now customized for shale operators.
No less than three presentations (from PPDM, CDA and BP) revolved around the creation of a certification program for data management professionals. Trudy Curtis (PPDM) did a great job retrofitting the plethoric standards from the Standards leadership council into a single coherent PowerPoint. CDA, ECIM and PPDM plan to publish a joint ’statement of intent’ real soon now to ‘plot the route towards a single society.’ Those interested are invited to ‘watch this space and get involved - with an open, collaborative mind.’ More from PNEC Conferences.
Norway’s annual ‘Semantic Days’ conference gathers semantic web enthusiasts from oil and gas and other verticals. Kari Anne Haaland Thorsen (EPIM) opened the proceedings with a reminder that Norway’s costs are too high. Since 80% of Statoil’s costs are external, Norway’s prime minister has told Statoil, ‘make your suppliers competitive!’ Cost reductions are expected through the implementation of a generic information model (GIM), applicable across ‘industry, business and the public sector.’ Earlier Norwegian standards efforts went under the 'Integrated operations’ (IO) banner. This is now refocusing on integrated work processes across domains and suppliers, on semantics, big data and offshore logistics. Rather grand claims are made for GIM’s potential to provide a ‘unifying logic,’ ‘proof,’ and ‘trust’ across use cases and stakeholders.
Under GIM’s broad banner we have the Integrated life cycle assets planning (ILAP) standard, presented by Glen Worral (Bentley), which sets out to ‘improve HSE and increase asset value through a common standard for planning data.’ A second standard ‘Equipment data requirements and conformance’ (EDRC) leverages POSC/Caesar’s ISO 15926 standard to 'achieve plug and play interoperability and savings.’ One EDRC objective is to provide a compliance mechanism for ISO 15926. An earlier semantic project found in 2012 that ‘the lack of rigorous methods for assessing conformance to ISO 15926 is a major barrier to industry adoption.’ Another facet is Data exchange within process industry (Dexpi), under test between Bentley and Siemens.
Tore Christiansen (Christiansen Consulting) described how GIM is used in supply chain standards, providing ‘a systematic way to ensure common understanding.' These include the EqHub (equipment data exchange) and ILAP for planning. Both build on a semantic core but communicate with the outside world via the more accessible XML.
Øyvind Mydland (StepChange) returned to the earlier Integrated operations (IO) project. High capacity network connectivity now facilitates onshore-offshore collaboration and real-time is now a given. But the movement has lost momentum. We still can't integrate across domain silos and IO is considered a ‘cost’ in major projects. Mydland advocates a new focus on 'capability,’ as opposed to technology. IO needs to be owned by top management.
Using semweb technology requires semantic specialists that are few and far between. At one juncture in Norway’s semantic effort a cry for help went out to Top Quadrant, developer of the Sparql semantic toolset and holder of one set of the keys to the semantic web. CTO Ralph Hodgson described a ’semantic ecosystem for oil and gas’ a.k.a. an alliance of Sparql and ISO 15926. Interoperability comes from ’semantic transformation of XML data to instances of ISO 15926 ontologies.’
Thomas Røed explained how the Talisman/BP logistics centre supports marine operations, leveraging the EPIM Logistics Hub. The centre handles planning and scheduling activity across 11 vessels, 3 supply bases and 20 installations. A combination of Logistics Hub information and RFID keeps the schedule updated and automates tracking of cargo units.
The main successor to the IO project is the EU-sponsored Optique project, presented by Martin Skjæveland (Oslo University). Optique is essentially a reprise, at EU level, of IO with the promise of accessing data from different, incompatible sources through 'ontology-based data access.’ Only Statoil and Siemens remain in the Optique game. More from the Semantic Days website.
David Phillips is now head of industry and investor relations at Aker Solutions. He was previously with HSBC Global Equity Research.
John Bell is now Senior VP business development with natural language generation boutique Arria NLG.
Baker Hughes has appointed Greg Brenneman and Bill Easter to its board.
Jim Rogers, formerly with Duke Energy, is now senior advisor to energy investment advisors Broadscale Group.
Coler & Colantonio’s geospatial division is to re-launch as Novara GeoSolutions.
Arild Fotland heads-up Ceragon Networks’ new location in Bergen, Norway.
The European association of geoscientists and engineers (EAGE) has posted its e-lectures to YouTube (naff music alert!) where they can be viewed for free.
Randy Crath has joined Evercore investment banking’s oil and gas business as senior MD in Houston. He hails from Scotia Waterous.
Vimal Kapur, a 25 year Honeywell veteran, has been named president of the process solutions business.
Chris Niven is IDC Energy Insights’ director of oil and gas research. He was previously with MCSquared Consulting.
Angelica Lasso has joined Ikon Science as a senior geomechanics specialist.
Itasca International has appointed Shawn Maxwell as president and CTO of its ‘Image’ microseismic and geomechanical products and services. He was formerly with Schlumberger.
Midland Valley has recruited Alexis McCallum as office coordinator and Lydia Jagger as structural geologist.
Stuart Filler and John Sharp have joined Ryder Scott as senior petroleum engineers. Filler was formerly with High Mount, Sharp with Chesapeake.
Massimiliano Leopardi is now CFO of Saudi Aramco/Dow’s Sadara Chemical joint venture. He succeeds Luciano Poli who returns to Dow after his secondment to Sadara.
Vladimir Argeseanu has retired from Seisland.
Seventy Seven Energy, a Chesapeake Energy spin-off, has named Karl Blanchard as COO. He was formerly with Halliburton.
Jürgen Maier has been appointed CEO of Siemens UK, succeeding Roland Aurich who is retiring.
Todd Bradley has joined Tibco Software as president. Bradley hails from HP.
Gary Morris is now chairman of US Seismic Systems. He was previously with Paradigm.
Wellsite Rental Services has appointed Blair Faucheaux as controller.
Willbros Group has promoted Earl Collins to President and Brad MacLean to the position of ‘chief talent officer.'
Noumenon Consulting MD and ISO 15926 specialist Adrian Laud died last month. Read the Fiatech obituary.
Bureau Veritas has acquired DTI Diversi-Tech, oil and gas inspection and audit service provider.
Borehole imaging specialists Aberdeen-based Task Geoscience and Houston-based Fronterra Integrated Geosciences are to merge into ‘Task-Fronterra Group.' The deal is backed by a £3.8 million investment from the Business Growth Fund, an investment vehicle set up in 2011 by Barclays, HSBC, Lloyds, Royal Bank of Scotland and Standard Chartered.
Following its 2012 acquisition of 10% of North West Geomatics, Hexagon AB has purchased the remaining 90%. North West Geomatics is now a wholly-owned Hexagon subsidiary.
MRC Global is to acquire privately held Metron Holding, the parent holding company of Hypteck, a Norwegian provider of instrumentation and process control equipment. The new company will operate as MRC HypTeck.
Oceaneering International has acquired Airsis, developer of PortVision web-based service reporting on the location of commercial vessels.
Wellsite Rental Services has received an equity investment from NGP Energy Technology Partners and the Wellsite management team.
Speaking in a panel session at the Hexagon* Live event in Las Vegas this month, Shell’s business improvement manager for capital projects Egbert Stuit discussed the use of cloud computing in construction, handover and operations. A couple of years ago, Shell set out to fix its legacy manual processes for data and document management in capital projects. The company was also looking at ways of ‘getting data out of engineering and procurement contractors’ (EPC) tools into its own system.
The result is Shell’s ‘Data move’ program and ‘Sirius,' Shell’s integrated engineering environment (SIEE), the cloud solution that provides engineering tools for EPCs, plant owners and vendors. Stuit was asked if the SIEE concept made contractors ‘nervous.’ He responded that this used to be the case, but that Shell has been putting a lot of effort into convincing its suppliers of the merits of the new ways of working. Shell held its second EPC forum in The Hague this year with 15 global EPCs attending the event to hear Shell’s vision of where data and document handling is going and what new tools and processes are required. ‘A lot of the nervousness has gone, they are now keen supporters.’
Implementing the SIEE in new builds is one thing. Brownfield projects require a different approach. But the potential is perhaps even greater as there are more brownfields. Here Shell is migrating legacy data from its assets into a ‘clean’ Smart Plant cloud. Over time, when enough data has been moved, the plan is to retire the legacy world. The SIEE cloud is also suited to new data types such as laser point clouds and is great for supporting Shell’s replication philosophy - a ‘design once and build many’ approach. Several replication projects are already underway. These have required a change of mindset to a containerized/modular design approach that is helping lower capital project costs. More from Hexagon Live.
* Stockholm, Sweden-headquartered Hexagon brands include Intergraph, Hexagon Metrology and Leica Geosystems.
Hitachi Solutions’ Canadian unit has announced a Microsoft Dynamics AX-based solution for oil and gas joint venture accounting. The joint venture solution (JVS) offers project and cost center management, contracts to manage joint interest agreement details, joint venture billing, authorization for expenditure (AFE) management, and gross/net reporting.
The JVS module completes Hitachi Solutions’ Dynamics AX-based software offering in the oil and gas vertical. Other modules target enterprise resources planning (ERP), oilfield services management, EHS and ‘xRM,’ licensing and relationship management for oil and gas. The company’s oil and gas industry practice has implemented solutions for Canadian operators such as Ensign Energy, Access Pipelines, Petro-Canada/Suncor, Talisman Energy, Aux Sable and Gibson Energy. More from Hitachi.
A report from UK-based consultancy Frost & Sullivan, ‘Global oil and gas infrastructure security market assessment’ reveals ’sustained threats’ to oil and gas infrastructure and a need for better security solutions. The study covers security services, command and control, screening detection, surveillance, access control, perimeter security, and cyber security. Overall the market is said to be worth $20 billion in 2013 and this will likely rise to $25 billion by 2021. F&S analyst Katherine Evans observed, ‘While most opportunities are in the upstream, manufacturers must provide solutions across all segments, with particular focus on surveillance and intrusion detection.'
Cyber security is not currently a spending priority among oil and gas companies and has limited opportunities for vendors. Companies are more likely to spend on safety measures such as fire detection and prevention even though cyber attacks pose a greater threat. The situation will change with the passing of national and international legislation to protect oil and gas assets from cyber-attacks. The F&S study is delivered under F&S’ Growth partnership service program.
A new white paper from Microsoft, ‘How innovative oil and gas companies are using big data to outmanoeuvre the competition,’ subtitled, ‘Drilling for new business value’ is a buzzword laden editorial on ‘big data,’ the ‘digital oilfield’ and the 'internet of things.’ Big data comes from ‘previously untapped’ sources such as ’seismic input, weather patterns, or social media’ and combining such disparate data sources can lead to ‘interesting insights that empower decision-makers.’ Microsoft’s offering in the oil and gas big data stakes? It is our good old punching bag Mura, the Microsoft upstream reference architecture (Oil ITJ passim), now repurposed as a big data enabler.
BP Exploration (Alaska) has selected AeroVironment to provide mapping and GIS at its Prudhoe Bay oil field for a five-year period. The contract marks the first time drones will be used for commercial services over land in the USA. AeroVironment operates a Puma AE drone equipped with Lidar or optical/infrared sensors.
BP has awarded Aker Solutions framework agreement for engineering, modifications and maintenance services for its offshore Norwegian operated oil and gas fields.
Statoil has extended its contract with Altus Intervention for the provision of mechanical wireline services on the Norwegian continental shelf.
Anton Oilfield Services has been awarded reservoir production management projects on a shale gas well in Hubei and a tight oil well in the Subei Basin, China. Anton will be rewarded with service fees and performance incentives as the production reaches initial target.
Aveva and Amec have signed a new five-year global agreement for the supply of a ‘full portfolio’ of software for engineering, design and information management.
CB&I has been awarded a contract by Pieridae Energy (Canada) for the FEED* of the Goldboro LNG project in Guysborough County, Nova Scotia.
Petrobras has chosen Houston-based consultants Endeavor Management to investigate and document successful industry practices for decommissioning subsea infrastructure. Endeavor is to gather information from regulators, contractors, and engineering companies, as well as engage major operating companies in collaborative benchmarking to share knowledge and experiences.
FTS International has entered into a 15-year joint venture agreement with Sinopec. A jointly owned company, SinoFTS, will serve both Sinopec and other exploration and production companies throughout China.
A Foster Wheeler subsidiary along with affiliates of Chiyoda, Saipem and WorleyParsons have been awarded a contract by LNG Canada Development for the provision of FEED and project execution services for the proposed Kitimat LNG export project in British Columbia.
GE Oil & Gas and Devon Energy have signed a technology collaboration agreement to enhance the performance and economics of unconventional oil and natural gas projects.
GDF Suez’s Cofely Endel Europipe unit has selected Intergraph SmartPlant, Spoolgen and SmartPlant Isometrics for data management and handling of contractor specifications in PCF format.
AMEC is transferring its existing PDS perpetual licenses to Intergraph Smart 3D.
Ipcos has implemented its advanced process control solution on GASCO’s gas processing facilities at Habshan-0, 1, 2, Bab and ASAB-0, Abu Dhabi.
Kongsberg Maritime has been awarded a FEED contract (with a delivery option) covering safety and automation systems on Statoil’s four Johan Sverdrup field platforms. Kongsberg has also signed with Teekay Shipping unit Gina Krog for supply of an integrated control and safety system and power package to the Gina Krog FSO unit.
Supplier of frac sand Unimin Corporation has chosen IFS Applications to assist in optimizing its supply chain and global operations. Unimin will implement IFS Applications to some 600 users in a multi-phase process over the next three years, beginning with its energy business.
SCDM Energie has adopted Paradigm’s SeisEarth as part of its corporate standard for comprehensive seismic interpretation and visualization.
Suncor Energy has selected Honeywell to provide automation systems to a new multi-billion dollar oil sands project in Alberta. The deal includes control and safety systems, alarms and simulation software.
Petronas has awarded Tendeka a three-year contract for the provision of zonal isolation technologies. The contract was awarded through Tendeka’s local partner Dialog Systems.
Developer of mobile broadband network products and spectrum sharing technologies xG Technology has signed a letter of intent with Shoreline Energy International to ‘explore’ partnership and deployment opportunities for its xMax mobile broadband solution across a range of industries and countries throughout the sub-Saharan Africa region.
* Front-end engineering and design.
The Pipeline open data standards body PODS has posted a draft charter document for the new PipelineML work group, a joint initiative with the Open Geospatial Consortium (OGC). The group is to develop an ‘open extensible standard’ for the exchange of pipeline data. The standard will leverage existing OGC standards such as GML and OGC Web Services.
V3.9 of the Public petroleum data model (PPDM) has been released with support for ‘What is a well’ components, faceted taxonomies and sample management and analysis. The ’substances’ subject area has been extended with detailed descriptions of chemical characteristics.
A historical note. Back in 1997, Oil IT Journal reported on the joint POSC/PPDM ‘Discovery’ project, which was to become PPDM version 4.0! Sic transit...
US and EU trade sanctions are challenging the development of globally accepted international standards. The OGP and the American Petroleum Institute have created a task force to look into ‘a sustainable long-term solution.’ Meanwhile OGP has approved a process whereby new standards are sent to ISO for balloting or publication. More from OGP.
Bombay, India headquartered IT solutions provider Rolta has announced an oil and gas edition of its ‘OneView' enterprise business intelligence (BI) and analytics methodology and toolset. OneView claims to offer a ‘transparent’ information ecosystem to address the change management issues associated with business intelligence programs. The approach begins with a discovery phase using Rolta Data Advizer to define a baseline state, the future information landscape and blueprint a BI information model.
Companies may decide to consolidate their information platform to a ’standardized’ platform of data, analytics and content management in which case Rolta’s SmartMigrate provides solution accelerators, methodologies and expertise to enable complex data migration efforts. The solution includes connectors to operations, ERP, geospatial and engineering tools. Rolta’s iPerspective delivers ‘all-round analytical capabilities’ including BI, real time/big data and predictive analytics.
The Rolta release reads like a buzzword and technology Who’s Who with tips of the hat to scorecards, six sigma and TQM. On the technology front, Rolta claims compatibility with ‘best of breed’ BI, cloud and analytics platforms including SAP Hana and Oracle Exalytics. Geospatial ‘fusion’ integrates Rolta’s earth sciences capabilities with map services from ESRI, Google and Microsoft. Standards-based ‘open’ credentials derive from an ‘ISA-95 compatible platform.'
Speaking at the annual meeting of the Pipeline simulation interest group (PSIG) in Baltimore, MD last month, Energy Solutions International’s Jennifer Worthen used parametric studies to identify the best operating scenario for a given pipeline configuration. Traditional liquid pipeline optimization software seeks to find the pumping rates that minimize operating costs while fulfilling flow rate and other requirements. But in general, only a subset of all possible parameters is investigated in the simulation, leading to uncertainty as to whether the solution is a true global minimum in the overall solution space (similar problems crop up across many other optimization fields from seismic imaging to reservoir simulation).
The study looked into combinations of pump station location and drag reducing agent usage to minimize operating costs at the target flow rate. 2D and 3D plots (made with GNU Octave) of the ‘rich’ data set provide a ‘unique view’ into subsets of the objective function that are typically unseen. These provide users with ‘a broader knowledge than regular optimization software provides - without significant manual work.’ The study was performed with a pre-release version of ESI’s PipelineOptimizer tool. More from ESI.
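The idea of a parametric study, as opposed to a single optimizer run, can be sketched in a few lines: exhaustively evaluate the objective over a grid of parameter combinations and keep the global minimum. The cost model below is invented for illustration and has nothing to do with ESI’s actual formulation.

```python
# Toy parametric study: exhaustive evaluation of a hypothetical pipeline
# operating-cost surface over pump station position (km) and
# drag-reducing-agent dosage (ppm). The cost function is made up.
def operating_cost(station_km, dra_ppm):
    """Invented smooth cost surface with one global minimum at (60, 12)."""
    return (station_km - 60) ** 2 / 100 + (dra_ppm - 12) ** 2 / 4 + 50.0

# Full enumeration of the parameter grid - unlike a local optimizer,
# this cannot get stuck in a local minimum within the grid resolution.
grid = [(s, d) for s in range(0, 101, 10) for d in range(0, 21, 2)]
best = min(grid, key=lambda p: operating_cost(*p))
print(best)
```

The same enumerated grid is what feeds the 2D/3D surface plots the paper describes: the full objective landscape is retained, not just the single optimum.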
Honeywell has introduced ‘Leap,’ a.k.a. ‘Lean execution of automation projects,’ a cloud-based service for the design and execution of process plant automation for plants such as oil and gas production facilities. The lean approach ’separates physical from functional design,’ allowing parallel workflows with standardized designs and enabling engineering to be done from anywhere in the world. Leap combines Honeywell’s proprietary hardware and software, virtualization and cloud engineering to provide flexible scheduling and reduced project risk. The approach is claimed to reduce automation costs by ‘up to 30%.’
Traditional sequential workflows mean that automation and controls must be implemented before the rest of the plant can be completed. Leap’s parallel workflows keep automation systems off of these critical paths. Honeywell Process Solutions president Vimal Kapur explained, ‘Automation projects are increasingly difficult to manage as plants become larger and more complex. Leap turns project execution workflow on its head, simplifying what has traditionally been a long and expensive process.’ Moving projects into the cloud allows project engineering to take place from anywhere in the world. Leap embeds three core Experion PKS Orion components, universal channel technology, control room system virtualization and cloud engineering. More from Honeywell.
A spate of rail accidents, both in the US and Canada, involving dedicated trains or large blocks of flammable liquid tank cars, has highlighted the vulnerabilities of the US DOT-111 certified tank car and resulted in new legislation and the higher CPC-1232 safety specification. To help its clients easily identify tank cars that comply with the new spec, Bourque Logistics has teamed with Industrial Networks on an RFID-based solution for tank car identification.
The new solution is designed for use with Bourque’s YardMaster and Railtrac shipping and monitoring tools along with IN’s automated equipment scanning systems. The new CPC-1232 designation will be added to Railinc’s Umler system, the rail industry’s official, mission-critical source for rail equipment information. Bourque’s tools are tightly integrated with Umler’s daily updates. Canada recently accelerated its deadline to May 2017 for the exclusive use of CPC-1232 rated tank cars. More from Bourque and Industrial Networks.
French startup Terra3D’s PrinTerra software is a plug-in for Schlumberger’s Petrel that provides connectivity to a 3D printer. Geological and reservoir models can be printed as ‘full colour’ solid 3D representations. Terra3D claims that handling the 3D models provides insights and understanding into field geometries and helps communicate complex geology.
The PrinTerra interface allows for sizing and selection of 3D Petrel models which can feature any grid property such as facies, porosity, or fluid content.
Models are then exported in Collada or Vrml formats that can be read by 3D modelling software. The files can be printed on your local 3D printer or sent to off-site 3D print services such as Sculpteo. PrinTerra is available from the Schlumberger Ocean store. Pricing starts at $1,000 for a month’s rental. More from Terra3D.
In his presentation at the Nvidia GPGPU conference earlier this year, Garf Bowen of UK-based startup Ridgeway Kite unveiled its work-in-progress reservoir simulator. Ridgeway is working on a full physics toolset for characterizing unconventional shale reservoirs. The modeler computes multi-phase thermodynamic equilibrium in nanometer scale pore spaces including capillary pressure, vapor-liquid equilibrium, multiple porosities and a ‘pseudo-geomechanical’ approach to describe fracture growth.
Small scale modeling requires big compute resources - to provide the ‘maximum possible parallelism.’ Here Ridgeway has developed the 'extensible parallel library’ (XPL), a library for ‘massive’ code parallelization that runs across ‘all current hardware types.’ The intention is that it will also run on most future systems too.
However, coding its linear solver in XPL is a significant overhead. Ridgeway has been tempted from the pure device-independent approach by Nvidia’s AmgX library. This provides a 'free’ pathway to accelerated core solver technology on GPUs with an 'up to 10x’ acceleration for the computationally intense linear solver portion of simulations.
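The ‘computationally intense linear solver portion’ is, at heart, an iterative sparse solve. As a stand-in illustration only (AmgX uses far more sophisticated algebraic multigrid methods, and real reservoir systems have millions of unknowns), a basic Jacobi iteration shows the shape of the kernel being accelerated:

```python
# Illustrative Jacobi iteration on a tiny diagonally dominant system,
# standing in for the linear-solver stage that GPU libraries such as
# Nvidia's AmgX accelerate. Not representative of AmgX's actual algorithms.
def jacobi(A, b, iterations=100):
    """Solve A x = b by Jacobi iteration (converges for diagonally
    dominant A). Each sweep is embarrassingly parallel across rows,
    which is why this class of kernel maps well onto GPUs."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = jacobi(A, b)  # exact solution is x = [1/11, 7/11]
```

Each Jacobi sweep updates every row independently from the previous iterate, which is the data-parallel structure a GPU exploits; multigrid methods add coarse-grid corrections on top to accelerate convergence.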
Ridgeway’s software is tested on two systems housed by the UK Center for Innovation. One, Emerald, is a 114 teraflop machine with 372 Nvidia Tesla processors. The other, IRIDIS, is a 12,000 core Intel Westmere-based cluster with a 108 teraflop capacity. Ridgeway plans to release its reservoir simulator in 2015. More from Ridgeway Kite.
Westheimer Energy Consultants has teamed with Oracle Oil & Gas to port its ISDS seismic data management solution to Oracle’s 12c database and optionally, the Oracle Exadata appliance. The Integrated seismic data solution (OITJ November 2011), which embeds technology from ESRI, Troika and Fuse, offers ‘row level’ access to pre and post stack data.
ISDS was originally developed for SGI/FileTek’s StorHouse storage virtualization and data management solution. Integration with the Oracle stack, a three month effort by Westheimer’s subject matter experts and Oracle’s global technical oil and gas teams, promises ‘improved scale and performance.’ The solution can scale to multi-petabyte data stores leveraging Tier 1 and Tier 2 storage appliances. Westheimer MD Jeffrey Maskell observed that the new technical and business relationship with Oracle will ‘help Westheimer establish a commercial presence and critical mass in the sector.’ More from firstname.lastname@example.org (Houston) and email@example.com (UK).
Drillinginfo’s John Fierstien posted an amusing and provocative blog last month on 'why oil and gas software sucks.’ Fierstien, a geologist and long time user of oil and gas software allows that upstream software operates in a world of complex science, data and workflows. But does the software have to be complex too?
He tilts at the PhD boffins that create oil and gas software who often have math and computer science expertise but may lack ‘essential oil and gas experience.’ Too often, geology software is driven by young hires as opposed to leveraging expert oil and gas explorers to drive the software workflows.
Industry itself is at fault for being ‘too smart!’ Highly qualified individuals confronted with user unfriendly software will just spend a few months learning the tools before bragging about their new expertise. The result: workflows that ‘take days and hundreds of steps.'
One of Fierstien’s pet hates is the ubiquitous tree file manager which, for data selection in E&P projects, leads to interminable clicking and scrolling. He prefers the Linux Filelight file system GUI and, for Windows, WinDirStat. Software can be made better - as witnessed by the user friendly interface of mobile devices. More in the full post and on DI’s own ‘Transform’ E&P decision-making platform.