Oil IT Journal: Volume 21 Number 3


Hip hip Hadoop!

Rice University Oil and Gas HPC presentation by Prairie View A&M researcher adds machine learning to seismic interpretation. ML-driven fault extraction results ‘promising.’

Speaking at the 2016 Rice University Oil and Gas High Performance Computing conference in Houston this month, Lei Huang, assistant professor at Prairie View A&M University, showed how an open source software stack has been deployed to add artificial intelligence to seismic interpretation. Huang and co-author Ted Clee (TEC Applications Analysis) have used ‘deep’ machine learning (ML) to identify geological features in seismic data volumes. Huang’s latest work builds on the Prairie View seismic analytics cloud, a Spark/Hadoop-based seismic processing infrastructure. Huang observes that today’s massive 4D seismic datasets are of a different scale and nature to those commonly leveraged in social media-based big data work.

Huang’s SeismicRDD, a derivative of Berkeley’s resilient distributed datasets, brings parallel, in-memory computing to seismics and provides a mechanism for exposing multi-petabyte datasets to analysis. The AI/deep learning component is provided by the Deeplearning4j toolset. A constellation of other open source tools adds stream, batch and interactive capabilities, a NoSQL database and routines for seismic data loading and partitioning across the cluster. An earlier presentation from Huang’s team demonstrated the system’s capability in seismic processing, with tools for data management along with seismic transpositions and filters. The cloud-based system offers a web-based front end that can be programmed in Java, Python or Scala. Tests on a 288 core cluster show good scalability.
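To make the RDD idea concrete, here is a minimal PySpark sketch, not SeismicRDD itself, of trace-parallel processing of the kind described. The record length, file path and RMS attribute are our own assumptions (fixed-length, SEG-Y-style traces with IEEE float samples).

```python
# Hypothetical sketch of RDD-based seismic trace processing in PySpark.
# Record layout, paths and the RMS attribute are illustrative assumptions.
import numpy as np
from pyspark import SparkContext

sc = SparkContext(appName="seismic-rdd-sketch")

# Assume a 240-byte trace header plus 1500 four-byte samples per trace
TRACE_BYTES = 240 + 4 * 1500

# Each fixed-length record becomes one RDD element, partitioned across the cluster
traces = sc.binaryRecords("hdfs:///seismic/survey.sgy", TRACE_BYTES)

def samples(record):
    # Skip the trace header and decode big-endian IEEE floats
    # (real SEG-Y may use IBM floats, which would need extra conversion)
    return np.frombuffer(record[240:], dtype=">f4")

# A trivial per-trace attribute: RMS amplitude, computed in parallel
rms = traces.map(lambda rec: float(np.sqrt(np.mean(samples(rec) ** 2))))
print(rms.take(5))
```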

The machine learning component has been applied to the identification of geological faults. Here Huang uses Dave Hale’s seismic image processing for faults (IPF) package to calculate attributes of fault likelihood, strike and dip. IPF’s image thinning techniques smooth seismic data along reflectors, enhancing discontinuities. IPF is already in use in commercial packages.

Huang’s approach combines multiple data sets (curvature, amplitude envelope) along with a training data set of ‘known faults.’ The ML-derived fault detection is said to show ‘encouraging results,’ although the ML ‘meat’ of Huang’s exposé was rather glossed over. The techniques used included logistic regression and a support vector machine. Here, Apache Spark was key to speeding processing of the whole dataset, down from days on a sequential machine to hours. Visit the Cloud Lab where Huang is building a scalable big data analytics cloud with sponsorship from the US National Science Foundation. The lab focuses on Apache Spark and Hadoop big data with an emphasis on developer productivity and scalability.
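As a rough illustration of this kind of Spark ML workflow (not Huang’s actual code), the sketch below trains a logistic regression classifier on per-sample attribute vectors against a labeled ‘known faults’ set. Column names and storage layout are assumptions.

```python
# Sketch of Spark ML fault classification; schemas and paths are invented.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("fault-ml-sketch").getOrCreate()

# Assumed columns: curvature, envelope, fault_likelihood, is_fault (0/1 label)
train = spark.read.parquet("hdfs:///seismic/attributes_labeled")

assembler = VectorAssembler(
    inputCols=["curvature", "envelope", "fault_likelihood"],
    outputCol="features")

model = LogisticRegression(labelCol="is_fault").fit(assembler.transform(train))

# Score the full survey in parallel (the step Spark cut from days to hours)
full = spark.read.parquet("hdfs:///seismic/attributes_full")
scored = model.transform(assembler.transform(full))
scored.groupBy("prediction").count().show()
```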


TIM’s shale shock!

Refine researchers’ traffic impact computer modeling brings bad, and could-be-worse, news to communities affected by shale drilling.

A computer modeling study by the UK-based Refine research consortium on fracking investigated the environmental impacts of road traffic associated with drilling and fracking operations. The study used a new traffic impacts model (TIM) to assess the short-term local impact of an individual site, as well as the long-term impact of many sites operating over decades. The intent of the study is to inform the debate around the impact of increased traffic associated with fracking.

The study found that the traffic from a single well pad can create substantial increases in local air pollutants during activities such as the delivery of water and materials for fracking to the site. Averaging over the drilling and completion of all wells on a pad gives lower figures, but these may present a distorted picture of the actual impact on the local population. On the other hand, using TIM to explore a range of development scenarios spanning several decades showed that the overall impact on a region appears ‘somewhat negligible’ compared to general traffic or industrial activities. More from the informative GFZ/Potsdam Shale gas information center.


From linear programming in 1958 to winning at Go

Neil McNaughton traces the history of AI in oil and gas back to the dawn of the computer age, then wonders what DeepMind’s success at Go means for complex problems such as seismic interpretation. Can we learn from AI accidents like Tay or the Google car fender bender? If you can’t improve what you can’t measure, can you apply AI to a problem without knowing what ‘success’ is?

It is a pity that artificial intelligence (AI) guru Marvin Minsky died before witnessing Google’s DeepMind beating the human champ, Lee Se-dol, at the game of Go. On the other hand, he was probably turning in his grave as Microsoft’s Tay became a ‘Hitler-loving sexbot’ within minutes of entering the twittersphere.

DeepMind’s 4 to 1 thrashing of Se-dol came at a great time for the AI marketing brigade and journos the world over, as confirmation of the imminent job destructive power of AI. In oil and gas I have heard people who should know better claim that all we need to do now is dump our data, higgledy-piggledy, in a massive container and let the machine do the rest. It has been suggested that units of measure are no longer worth recording as the machine will figure out such trivia.

To trace AI’s history in oil and gas, I checked back through OnePetro to see how long it has been in use. It turns out that the first mention of AI goes back to 1973 and there have been some 1,500 papers that contain the term since then. Frequency of use of ‘AI’ has increased steadily with an approximate doubling (to 100 papers per year) in the last decade.

A search for ‘machine learning’ returns just over 600 results but pushes back the dawn of AI to 1969. If you allow that ‘linear programming’ is a piece of AI, this brings up 624 results going even further back, to 1958. I therefore nominate and simultaneously award Oil IT Journal’s AI starter prize to Messrs. Lee and Aronofsky of the Magnolia Petroleum Co. for their 1958 paper, ‘A linear programming model for scheduling crude oil production.’

But to get back to the present, it is clear that there is good AI and bad AI. How do you make sure that you (or the folks that are working for you) are developing the next DeepMind and not about to release another Tay?

Winning at Go is different from writing a chatbot for the simple reason that the outcome is clear; a win. This cannot be contested* and is unlikely to upset anybody except for Mr. Se-dol. But a chatbot? How do you define its ‘success?’ To paraphrase the old data dictum, if you don’t know what success looks like you can’t optimize your process.

Roadside cameras that capture your car number plate as you drive along are pretty good now and can be considered a modest success for AI. But what about Google’s other big AI adventure, the Google car? While this is a real gee-whiz use of AI, the recent fender bender is instructive. It seems like the car pulled out slowly into a gap in the traffic ‘expecting’ the bus to give way. It didn’t.

So what will Google’s programmers do next? They could change the program to allow all traffic to pass until there is a big enough space to pull out, which could take a while and drive the passengers crazy. They could make the indicator lights flash more brightly. Unfortunately, my experience of driving in Houston suggests that flashing indicator lights are taken by following traffic as a sign to accelerate and block your maneuver. A similar dilemma for ‘do no evil’ Google is what to do when the lights turn red. Accelerate like everyone else? Or brake legally and have the rush hour traffic tail-end your Noddy car?

Driving is imponderable with regard to outcomes. You can set a simple goal for driving like ‘no fender benders.’ But driving around at 50 kph, giving way to buses, would not be much of a satisfactory outcome for the F1 driver who recently expressed a preference for the older 1000hp motors over today’s wimpy 700hp V8s. Maybe Google should be looking to buy a racing car manufacturer, add in its AI and see if it could win an F1 race, which would at least be a definitive outcome. We might even see Amazon race Google and Facebook. How about a driverless race of 5000HP super-duper cars with Arria doing the commentary? Would anyone listen?

AI in oil and gas is definitely on the tricky side of the equation. On the one hand, ‘optimization’ is a motherhood and apple pie concept. I mean, you are not going to deliberately aim for un-optimization. But what are you to optimize?

Say we had complete control over every facet of an oilfield. What do we aim for? Ultimate recovery? Cash flow? NPV? ROI? A job for life? Deciding on your goal is just the start. Next you need to think about which course of action is most likely to bring ‘success.’ This is where you start down a road with multiple bifurcations. Choices might include a hypothetical new well, extra compressors, pipelines and so on. Before the real world ‘run’ is through, the oil price will probably have changed. And maybe so will the tax take. Optimizing for recovery may mean cutting back on production for months or years, which may be hard to explain to your bankers!

Another seemingly elusive target for AI is evidenced in this month’s lead where we report on an ML-based technique for identifying faults in seismic data. This uses a synthetic training data set to come up with a strategy for fault detection. But is it really getting more ‘successful’ at fault finding? How does the algorithm learn in the face of a nebulous outcome, where expert geoscientists may engage in heated arguments as to the origin of the seismic response? Go it ain’t.

* In fairness to Mr. Se-dol, it has been contested, on the basis that allotting equal time to both parties is unfair. Human thought proceeds at the speed of thought, no faster, while a computer can perform more calculations in a given time slot just by being bigger, which doesn’t really mean that it is getting more intelligent.

The true nature of DeepMind in the context of Minsky’s AI is also debated.

@neilmcn

Review - Building ontologies with basic formal ontology

Today’s linked open data has resulted in a ‘hairball’ of incompatible data sets (which was actually the problem the semantic web set out to solve!) To be interoperable, an ontology needs an overarching ‘basic formal ontology’ (the USGS has one) as is explained in this ambitious new book.

Building ontologies with basic formal ontology* (Bobfo) is a 200 page introduction to the subject. While ontology in its broad sense, the theory of what exists, is all-encompassing and philosophical in scope, Bobfo focuses on analyzing the ‘information domain’ with an intended application in IT and data modeling. The ontologist is therefore a journeyman modeler who can identify and extract the essence of data, relationships and, indeed, everything else in a field of activity. In the case of Bobfo’s authors this is medicine and the biological sciences, but the approach is intended to have application everywhere.

To understand the basic formal ontology itself we would recommend viewing co-author Smith’s video which succinctly explains the failure of the semantic web, linked open data (LOD) and ontology to date. Smith categorizes the much-vaunted constellation of semantic LOD as a ‘hairball’ from which information can only be extracted with considerable manual effort. The BFO sets out to fix linked data’s ‘anarchy and chaos’ with domain-neutral standards for building ontologies, shared by all.

Bobfo defines an ontology as follows: ‘a representational artefact, comprising a taxonomy as a proper part, whose representations are intended to designate some combination of universals, defined classes and certain relations between them.’ A taxonomy is then defined as a simple hierarchy, while ‘universals’ and its synonyms, ‘classes’ and ‘types,’ are defined as groups of the entities in the world being described.

As you will have gathered, this is a pretty dense oeuvre, as perhaps befits a field so close to philosophy. On page 6 we are plunged into a discussion of terminology research and how the ‘concept orientation’ of ISO, the international standards organization, was derived from the ‘phenomenalist’ ideas of the Vienna Circle. This approach has since been replaced by the ‘realist orientation.’ The realists downplay the ‘ideas in people’s heads’ to focus exclusively on ‘labels that represent entities in reality.’

Following two chapters on best principles for ontology design, the book gets down to business with the BFO itself. The BFO is a small, top-level (a.k.a. upper level) ontology designed to support data integration in scientific research. It addresses the time-dependent nature of measurement by distinguishing between a continuant and an occurrent. Other BFO concepts include role, disposition, boundary, spatial region and relation. These get rather thorough treatment, with erudite and interesting asides such as a digression on Arthur Eddington’s two tables.

The rubber hits the road in chapter 8, describing the BFO at work. Concretely this means the use of the web ontology language OWL, the W3C’s resource description framework RDF and the ontologist’s favorite tool of the trade, Protégé. A short section on ‘facilitating interoperability’ outlines the potential benefits, with a pointer to the work of the Open biological and biomedical ontologies (OBO) Foundry, an ‘expanding virtual framework for navigating massive amounts of biological and clinical data.’
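For the curious, here is a toy Python sketch (using the rdflib package rather than Protégé) that emits an OWL fragment organized along BFO’s continuant/occurrent split. The Well and DrillingProcess classes are our own illustrative additions, not from the book.

```python
# Toy BFO-flavored OWL fragment via rdflib; domain classes are invented.
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/oilgas#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Declare the classes
for cls in (EX.Continuant, EX.Occurrent, EX.Well, EX.DrillingProcess):
    g.add((cls, RDF.type, OWL.Class))

# A well persists through time (continuant); drilling unfolds in time (occurrent)
g.add((EX.Well, RDFS.subClassOf, EX.Continuant))
g.add((EX.DrillingProcess, RDFS.subClassOf, EX.Occurrent))

print(g.serialize(format="turtle"))
```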

BFO’s users are mostly in the biosciences, as shown on the Ifomis website. In our quick spin through the list we found ontologies for email, economics and petrochemicals. But not all appear to be maintained, far from it. One noteworthy use case is the USGS, which has leveraged the BFO in its surface water ontology.

Bobfo underscores the weakness of current domain-specific attempts at semantics and enumerates many of the modelers’ pitfalls. In the end, the success or otherwise of the BFO approach depends on whether the benefits that accrue from decomposing a specific domain into its ontological components exceed the considerable intellectual effort that this requires. Those interested in such matters may like to sign up for the upcoming International conference on formal ontology in information systems in the beautiful town of Annecy, France next July.

* By Robert Arp, Barry Smith and Andrew Spear. 2015 MIT Press. ISBN 978-0-262-52781-1.


Optique update - semantics in Statoil, Siemens

EU Optique researchers publish on ontology-based data access to Statoil’s disparate geodata.

As we reported in June 2014, the main successor to Norway’s integrated operations (IO) initiative was the EU-sponsored Optique project, which promised ‘ontology-based data access’ to disparate, incompatible sources. Statoil and Siemens were the only IO participants to carry on into Optique.

A 2015 paper from the Optique team described how ontology-based data access (Obda) uses an ontology to mediate between data users and data sources. The ontology exposes data in a conceptually clear manner by abstracting away from the underlying schema-level details. The ontology connects to relational data sources via Bootox mappings that translate queries posed in terms of the ontology into SQL. Users can thus request, for instance, all wellbores that penetrate a rock layer of a specific geological age.
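The contrast can be sketched as follows. The ontology-level query below is illustrative SPARQL over assumed classes; the mapping layer would expand it into schema-level SQL of the kind an expert must otherwise write by hand. Table, column and class names are all invented.

```python
# Illustration only: an ontology-level query vs. its hand-written SQL equivalent.
ONTOLOGY_QUERY = """
SELECT ?wb WHERE {
  ?wb a :Wellbore ; :penetrates ?unit .
  ?unit :hasGeologicalAge :Jurassic .
}
"""

# The sort of schema-level SQL the mappings generate (real queries against
# sources like Epds can be far larger):
EXPANDED_SQL = """
SELECT wb.wellbore_id
FROM wellbore wb
JOIN wellbore_formation wf ON wf.wellbore_id = wb.wellbore_id
JOIN stratigraphic_unit su ON su.unit_id = wf.unit_id
WHERE su.geo_age = 'JURASSIC';
"""
```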

The test data sources were Statoil’s Exploration and Production Data Store (Epds), and the NPD fact pages. The 700GB Epds database has 3,000 tables and 37,000 columns. Writing queries against the Epds requires a significant effort and it is common for a query to contain thousands of terms and up to 200 joins. The Obda solution has been tested at Statoil and the authors report that it provides good execution times. Another ‘preliminary deployment’ is reported chez Optique partner Siemens. More from Optique.


iRODS - big, open source data management for seismics?

Integrated rule-oriented data system could address ‘spectacular’ seismic data growth.

Speaking at a ‘Lunch and Learn’ session of the Society of HPC Professionals, Dan Bedard, executive director of the iRods* consortium, showed how it is addressing the challenge of multi-terabyte data sets.

iRods is a cross industry organization with backing from IBM, EMC, Seagate, Nasa and others. The consortium develops open source software for managing ‘big, important and complex’ unstructured data. iRods also provides data virtualization and discovery along with workflow automation.
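As a flavor of what this looks like in practice, here is a minimal sketch using the python-irodsclient; host, credentials, paths and metadata keys are all placeholders.

```python
# Minimal python-irodsclient sketch: register a file and tag it for discovery.
from irods.session import iRODSSession

with iRODSSession(host="irods.example.org", port=1247, user="geo",
                  password="secret", zone="seismicZone") as session:
    path = "/seismicZone/home/geo/survey_2016.sgy"
    session.data_objects.put("survey_2016.sgy", path)

    # Attach key-value metadata that iRods can later query for discovery
    obj = session.data_objects.get(path)
    obj.metadata.add("acquisition_year", "2016")
    obj.metadata.add("survey_type", "land_3d")
```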

Bedard considers seismic data as a prime candidate for the iRods approach. Land seismic data growth has been spectacular, up from 400,000 sensors per sq. km. in 2005 to as much as 36 million in 2009. Bedard also stressed the importance of open source software in data management to avoid vendor lock-in.

iRods is managed by Renci, the Renaissance Computing Institute, a research unit of UNC Chapel Hill, and the Data intensive cyber environments (Dice) group, a spin-out of the San Diego supercomputer center, now jointly housed, in ‘bi-coastal’ fashion, at Chapel Hill and UCSD.

* Integrated rule-oriented data system.


Tessella powers BP’s Well Advisor

Analytics engine and business analysts helped extend SiteCom-based drilling system.

UK-based analytics specialist Tessella (now a subsidiary of France’s Altran) has reported on its involvement in the development of BP’s Well Advisor (BPWA). The BPWA integrates data with predictive tools, processes and expertise. Tessella’s analytics are embedded in the tool to ensure that operations are efficient. The BPWA gives advanced warning of potential drilling issues such as stuck pipe. During development, Tessella’s business analysts acted as proxies for BP’s subject matter experts to help figure out what information was needed. Kongsberg was also involved in the development as provider of the underlying SiteCom platform. A bespoke extension to Energistics’ Witsml data transfer format was developed to handle make-up torque assessment data.
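To give a flavor of what consuming such data involves, here is a sketch that reads a make-up torque curve out of a Witsml-style XML document with the Python standard library. The element names and acceptance window are invented stand-ins, not the actual Energistics schema or BP’s bespoke extension.

```python
# Sketch only: element names and thresholds are invented, not Energistics schema.
import xml.etree.ElementTree as ET

doc = ET.parse("makeup_torque_log.xml")  # placeholder file
points = [(float(p.get("turns")), float(p.get("torque")))
          for p in doc.iterfind(".//torqueTurnCurve/point")]

# Flag a joint whose final make-up torque falls outside an acceptance window
LOW, HIGH = 20000.0, 30000.0  # ft-lbf, illustrative
final_torque = points[-1][1]
print("OK" if LOW <= final_torque <= HIGH else "Review joint")
```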

First trialed in Azerbaijan, BPWA is now live on ten rigs. Over 170 runs have been completed using the casing running console in the country. BP’s Mark Mundo reports that BPWA has saved the company $200 million by reducing non-productive time.


Baker Hughes first to join operational integrity program

Wollam Petroleum Advisory Group’s ‘ROIP’ builds on API Q2 quality management standard.

Baker Hughes is the first oilfield service company to enroll in Wollam Petroleum Advisory Group’s Gulf of Mexico regional operational integrity program (ROIP). The ROIP, which was jointly developed by Baker Hughes and Wollam, sets out to ‘drive consistency, service quality and risk mitigation across operations.’

ROIP builds on the API Q2 standard from the American Petroleum Institute. Q2, a.k.a. the ‘Specification for quality management system requirements for service supply organizations for the petroleum and natural gas industries,’ was first released in December 2011. The standard specifies the requirements of a quality management system for an organization to demonstrate its ability to consistently provide services that meet customer, legal and other applicable requirements.

The Wollam ROIP extends Q2 with region specific considerations for risk mitigation throughout the local supply chain. Controls include risk assessment, contingency planning, service design, quality plans and performance validation. Steve Ellison, VP quality with Baker Hughes said, ‘Our responsibility as an industry is to get things right first time. Moving the industry toward safer, more reliable operations requires that all aspects of quality programs are interrelated and consistent.’ More on the ROIP from Wollam. The Q2 standard is available at a cost of $80 from the API.


Accenture, Microsoft release 5th upstream digital trends survey

IT spend remains strong(ish), on mobile, IoT and the cloud. But where did cyber security go?

Microsoft and Accenture have released their 2016 survey (the fifth) of upstream oil and gas digital trends. Despite the downturn, the survey found that 80% of respondents plan to spend the same or more on digital technologies over the next three years. 53% said that digital is already adding ‘high to significant’ value to their businesses. Cost reduction was identified as the ‘biggest challenge that digital technologies can most address today’ (sic). Also key was digital’s enablement of ‘faster and better decision making.’ On the downside, the main barrier to realizing the value is ‘the lack of a clear strategy or business case’ rather than the technology. Today’s digital investments focus on mobility, with 57% reporting investment in mobile. Next up are the internet of things (44%) and the cloud (38%). The next three to five years ‘should see a shift to big data and analytics.’ Penn Energy Research, in partnership with the Oil & Gas Journal, carried out the survey of worldwide upstream professionals including engineers, geologists and mid-level and executive management.

We asked Accenture’s Rich Holsman why there was no mention of IT security in the study. Here is his response. ‘Clearly, we see cybersecurity as a major threat and as a priority. In fact 44% of the respondents told us that IoT/cyber security was something that they are investing in today. But this was reported as an IoT investment. So the IoT number actually includes cyber security. You make a good point and we will clarify the situation in the online results.’


Software, hardware short takes

HampsonRussell, Epsis, EssencePS, StoneRidge, Rock Flow Dynamics, Safe Software, Esri, Ikon Science, INT, Geovariances, 2H Offshore, Schlumberger, SimpleInfoApps, RockWare.

CGG has released HRS 10.1 of its HampsonRussell software, with new seismic azimuth analysis, automated inversion parameter testing and AVO quick look modeling.

V6.1 of Epsis’ TeamBox collaboration engine adds the ability to grant publishing rights to correspondents and automates publishing to users or machines when starting a workflow. The environment module now automatically adapts to a user’s screen configuration.

EssencePS’ EssRisk release 1.3 now reads binary Eclipse files, adding compatibility with StoneRidge’s Echelon and Rock Flow Dynamics’ tNavigator. EssRisk was recently used by Baker Hughes to optimize smart well design.

The 2016 edition of Safe Software’s geographical transformation engine FME adds support for Autodesk ReCap, AWS Aurora, Portal for ArcGIS and SAP Hana among others and claims ‘43 new ways to transform your data.’

Esri has named its new GIS-enabled pipeline model (Oil ITJ 2016 n°1) as ‘APR,’ for ArcGIS pipeline referencing.

Ikon’s new RokDoc module includes wavelet estimation and editing to improve seismic-to-well calibration. Ikon’s four research sponsors acquired the functionality in early 2016.

INT has published a suite of YouTube videos demonstrating INTViewer functionality including QC and analysis of P190 and SPS navigation files, resampling of SEGY files in Python, petrophysical analysis and synthetic seismogram generation.

The ‘Flumy’ module in Geovariances’ Isatis 2016 now offers realistic modeling of turbidite environments.

Acteon unit 2H Offshore has announced FlawIQ, an engineering critical assessment tool that automates BS7910 and API 579 procedures for fatigue assessments of offshore structures.

The 2016.1 release of Schlumberger’s ProSource Seabed data model-based seismic data manager includes protection from the latest cyber vulnerabilities, streamlined setup and deployment and an expanded data footprint.

CEO and founder of SimpleInfoApps, Fahad Al-Dhubaib tells us that his company is focused on building simple and effective software for the oil and gas industry. First out of the starting blocks is a ‘Simple oil and gas data manager’ application, currently in release candidate stage and ‘on course for launch in May.’

RockWare’s RockWorks 17 update adds contour animation, Azure cloud support and data dictionaries embedded in SQL Server. RockWare has also announced a free online coordinate converter covering a variety of NAD, UTM and WGS formats.


Energy Network Conference’s big data and IoT in oil and gas

Ambyint, ‘IoT is disruptive.’ Galdos’ geospatial semantic registry for oil and gas. Beaver Drilling, rigs are now an IoT in themselves. Streamline’s industrial IoT. Embark Innovations’ ArrowSync chemicals app.

Speaking at Energy Network Conferences’ Big Data & IoT in Oil & Gas Canada earlier this year, chairperson Nav Dhunay (Ambyint) likened the Internet of Things (IoT) to teenage sex: ‘everyone talks about it, nobody knows how to do it and all think that everyone else is doing it!’ Nonetheless, Dhunay believes that the IoT is disruptive and a quantum leap from today’s scada systems.

Ron Lake (Galdos) believes that the IoT needs a standard. Enter the light-weight geospatial semantic registry for IoT and analytics in oil and gas. The registry (specifically the OGC’s catalogue services for the web, CSW-ebRIM) can be used to store IoT device properties and metadata. The CSW uses a NoSQL database that tracks device history and health, and forms the basis of a ‘portable business information model.’ Lake presented a use case of the registry to monitor a pipeline network involving Galdos’ CSW-derived registry Indicio.

Beaver Drilling president Kevin Krausert traced the history of drilling rigs. Today these are pretty much an IoT in themselves, with wired drill pipe extending the network downhole. Automation, along with the experience gained during the shale boom years, has brought huge efficiency gains. Krausert centers a constellation of drilling apps, for torque, vibration, pump control and more, on an ‘integrated drilling operating system.’ Today this is independent of other corporate software environments (ERP, HSE) but the future should see these merge into a new technical environment for drilling, ‘driven by strategic alliances with other service providers.’

Gregory Tink (Streamline Control) and Jason Sawchuk (Plains Midstream) described a real world project that leveraged industrial IoT concepts. The IoT and business worlds are different but they are now colliding as operations technology meets IT. One route, for instance, leads from devices to ERP environments such as SAP. There are also demands to ‘control the plant from anywhere,’ so implementers need a ‘foothold’ on either side of the firewall. IT/OT convergence means that ‘protocol barriers are starting to break down.’ All of which makes for a perfect storm, albeit one that is changing corporate expectations. Plains’ IIoT is built on the MQTT messaging protocol, a lightweight publish/subscribe middleware spec that is said to be ideal for the constrained environment of the IIoT. Devices sit upstream of the MQTT broker, with business, operations, engineering and even scada all working downstream of it. The project is a component of Plains’ five core strategic initiatives mapped out through to 2017.
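For readers unfamiliar with MQTT, the sketch below shows the basic publish/subscribe pattern on either side of a broker, using the Python paho-mqtt client. Broker address, topic names and the payload are placeholders.

```python
# Bare-bones MQTT pub/sub with paho-mqtt; endpoints and topics are placeholders.
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Downstream consumers (business, operations, scada) receive via the broker
    print(f"{msg.topic}: {msg.payload.decode()}")

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect("broker.example.org", 1883)
subscriber.subscribe("field/pumps/+/pressure")
subscriber.loop_start()

# A field device publishes upstream of the broker
publisher = mqtt.Client()
publisher.connect("broker.example.org", 1883)
publisher.publish("field/pumps/p42/pressure", "347.5")  # kPa, illustrative
```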

Jane Glendon presented Embark Innovations’ ArrowSync platform for tracking chemicals used in oilfield operations. The flexible app tracks corrosion inhibitor, demulsifier and defoaming agent usage and costs, and is reported to have saved some CAD 850k per year in a 150 well deployment.


ABC Wellsite Automation, Houston

The second annual American Business Conferences event hears from Emerson on operational performance. Noble reveals its ‘hidden (onshore safety) agenda.’ Devon manages by exception, wirelessly. Chevron advocates object-oriented scada. Murphy Oil automates rod pump management.

Predicting 2016 to be a ‘tough year,’ Craig Llewellyn (Emerson) advocated investment in automation to address rising costs and to maximize ‘operational performance,’ a KPI that seeks to minimize downtime, lost production and costs. ‘Easy’ opportunities include tank management, custody transfer, enhancing separator performance, choke sizing and chemical use. Wireless scada offers an opportunity for a site-wide reduction in installation costs and enhanced process visibility. There are obstacles to automation, notably the inertia of a manual operations culture and a reluctance to invest in new kit. Llewellyn advocates starting with small projects to deliver quick wins, to retrain operators and also to ‘challenge your organization and automation partners’ to come up with ROI-generating scenarios.

Clint Boman (Noble Energy) has a ‘hidden agenda.’ Unconventional production and more stringent environmental considerations are making for increasingly complex production facilities. These need to cope with increased liquid volatility, higher gas volumes and operating pressures, and new emissions controls. To ensure operational safety in the new normal, Boman takes inspiration from offshore safety systems, notably API RP 14C. While originally written for offshore facilities, RP 14C makes a ‘very good basis’ for onshore facilities as well. The standard can be used to drive a ‘prescriptive’ approach to safety instrumented system (SIS) design and to take the guesswork out of Hazop with a simple framework for deciding which safety devices are required for each process component. The intent is to develop a new prescriptive standard for onshore facilities where regulation or risk levels mandate full SIS implementation. Lower risk facilities can be covered with performance-based standards such as ISA84, IEC 61511 or IEC 61508.

Brandon Davis (Devon Energy) presented on the use of scada data in optimization. Davis acknowledged service providers Weatherford/Cygnet, Theta and OSIsoft and showed how these data services have been connected over a wireless network and into a central production control room. The infrastructure enables Devon to manage by exception and perform role-based analysis and decision making remotely. Operational parameters on RTUs and PLCs can be changed from the control room and key data feeds can be monitored at high frequency for troubleshooting and optimization. Set points can be adjusted automatically based on the results of accepted engineering calculations such as Turner’s equation, Foss and Gaul and others. High frequency data logging and daily exception reports let operators see what is occurring without having to be on location or watching full time. Devon has built its own tools and dashboard for gas lift analysis, including components of Peloton’s WellView.
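By way of example, here is a worked sketch of Turner’s critical unloading velocity, one of the ‘accepted engineering calculations’ cited. The 1.92 coefficient is the commonly quoted Turner value (the basic droplet model plus Turner’s roughly 20% field adjustment); the inputs are in field units and are illustrative only.

```python
def turner_critical_velocity(sigma, rho_liquid, rho_gas):
    """Turner critical gas velocity (ft/s) below which liquids load the well.

    sigma      -- interfacial tension, dyne/cm (~60 for water)
    rho_liquid -- liquid density, lbm/ft3 (~67 for water)
    rho_gas    -- in-situ gas density, lbm/ft3
    """
    return 1.92 * (sigma * (rho_liquid - rho_gas)) ** 0.25 / rho_gas ** 0.5

# Example: water droplets against a low-pressure gas stream (values illustrative)
print(round(turner_critical_velocity(60.0, 67.0, 0.2), 1), "ft/s")  # ~34.2
```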

The production optimization dashboard was built with Theta Oilfield Services’ XSPOC/XDIAG and provides pump unit analytics and KPIs. Other tools offer support for liquid loading monitoring and ESP performance. Devon’s own-brand ScadaNOW application rolls up all of the above and adds an Esri map front end. The system captures some 150,000 values per day. Devon is currently working on an extension of ScadaNOW for its mobile workforce. Devon’s drillers likewise have their own tool, WellCON, which provides 24/7 data-driven decision support for geo-steering and fracking operations. Davis concluded that better communications and remote operations are key to Devon’s production effort and that the trend towards higher resolution data will continue and become more important across the board.

Chevron’s George Robertson has investigated the costs and benefits of object-oriented scada systems over conventional tag-based systems. The object approach allows meter behavior to be captured in a software object, and for objects to be assembled into larger component sub-systems that inherit constituent objects’ behaviors. There are downsides to the object approach, notably in development costs but these are generally outweighed by quicker and more robust deployment. In one case study, adding a new meter to the system took some 3 to 5 hours with a tag-based system but only 15 minutes using the object approach. This could easily amount to $700 per well per year in development cost savings. Robertson went on to discuss scada architecture and security to advocate segregating systems from the internet, from the business and even from neighboring scada systems. While it may seem attractive to deploy ‘one big system’ that is cheap and easy to maintain, that makes for one big single point of failure. On the other hand, hundreds of discrete systems make for a support nightmare. In between the two there has to be a sweet spot! Appropriate physical, logical and organizational barriers are needed to isolate segments and the enterprise should be able to survive the loss of a single segment. Segmentation is also important to avoid programming and configuration errors propagating too widely.
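A toy Python illustration of the difference (class and tag names are ours, not a real scada API): meter behavior lives in a class and is inherited, so adding a meter to a site is one line of composition rather than hours of per-tag configuration.

```python
# Toy object-vs-tag illustration; names are invented, not a real scada API.
class Meter:
    def __init__(self, tag_prefix):
        self.tag_prefix = tag_prefix

    def tags(self):
        # Behavior (process value, alarm limits) is defined once here...
        return [f"{self.tag_prefix}.{t}" for t in ("PV", "HI_ALM", "LO_ALM")]

class GasMeter(Meter):
    def tags(self):
        # ...and inherited, then extended, by specialized meter types
        return super().tags() + [f"{self.tag_prefix}.FLOW_TOTAL"]

class WellSite:
    def __init__(self, name, meters):
        self.name, self.meters = name, meters

    def all_tags(self):
        return [t for m in self.meters for t in m.tags()]

# Adding a new meter is one line, not per-tag configuration:
site = WellSite("WELL-042", [GasMeter("WELL-042.GM1"), Meter("WELL-042.PM1")])
print(site.all_tags())
```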

Fred Clarke showed how automating rod pumps is helping Murphy Oil maximize production, mitigate decline and cut costs. Murphy has 650 producers dispersed across the Eagle Ford shale area of Texas and some 500 beam pumps in operation. While automation has many benefits, it is often misunderstood by management, who may see it as a costly and complex addition that requires operator re-training. Clarke convinces management with plots of pump failure frequency against time that clearly show the benefits of installing controllers and variable speed devices (VSD) early in a well’s life. Automation can be carried out in stages, from ‘status only’ telemetry, through simple pump-off controllers, to intelligent VSDs that offer the added benefits of self-optimization, remote operation and control of other equipment such as chemical pumps. Murphy’s high-end, intelligent VSD units offer lots more, from tubing, flowline and casing temperature and pressure to leak and vibration detection and safety system monitoring. Automation also pays off: non-automated systems make for more site visits and slower decisions based on ‘dead’ data. Such considerations are especially important in shale wells, whose rapid decline makes for changing operating conditions. VSDs can adjust pump speed to keep pump fill near the target range. Real time data provides a movie rather than a postcard, allowing for optimization of every pump stroke.

More from American Business Conferences.


More from GE Oil and Gas, Florence

Technip - ‘crisis worst for a generation.’ BP - ‘industry got too comfy with $100 oil.’ Total - ‘engineers are concerned about costs too!’ BP, Statoil, ExxonMobil - ’standardization is good for business.’ Welltec - ’standardization is bad for innovation.’ Columbia Pipeline leverages big pipeline data.

Technip CEO Thierry Pilenko asked in his keynote, ‘are you sure it’s only about costs?’ The current crisis in the oil industry is the worst for a generation, probably worse even than 1986. Everyone knows what needs to be done. We hear, ‘you are all too expensive, do something now.’ Technip started with a 10% cut (i.e. its profit margin!) then turned to GE, which turned to its suppliers, and so on. But industry is asking for 30, 40 or 50%! Drilling is already there, but is what has been achieved sustainable? Meanwhile, megatrends are working the other way. Developing a cluster of fields is more costly than a single massive field. National labor content adds cost. Inflation raises salaries. There is a fantastic reservoir of talent in India, but salaries there are up 50% in the last four years. Pilenko then got into his swing on the burden of increasing norms and regulations. Here the time spent on a large piece of equipment is up from 1,500 man hours to 3,000, a curious observation in the face of IT productivity gains. Each operator adds its own norms and rules, some of which seem ridiculous and kill productivity.

BP head of global operations Fawaz Bitar agreed that the industry had gotten too comfortable at $100 oil and that there was too much over-specification. Workforce productivity has decreased to the extent that ‘wrench time’ may average only 4 hours out of a 12 hour shift. BP is addressing this with increased standardization, citing the IOGP’s initiative to standardize procurement specs for ball valves, subsea trees and low voltage switchgear. Bitar also mentioned GE’s Unified Operations, deployed in BP’s Plant operations advisor, as providing ‘holistic insight’ into the performance of (e.g.) gas compression systems.

Total’s Bernard Quoix (VP rotating equipment) opined that ‘engineers are as concerned by costs as finance.’ Total is addressing ‘exponentially increasing’ costs with fit-for-purpose design and process standardization. Quoix advocates standardizing before a contractor is assigned: ‘We don’t want the EPC to decide for us.’ Total works with GE on optimization and Quoix’s dream is that ‘we remain in direct contact, without having to go through the EPC, who is only interested in the bottom line.’ We expect that Quoix meant to add ‘present company excepted,’ as Thierry Pilenko was sitting only a few feet away! Total is an enthusiastic user of GE SmartSignal for predictive maintenance and controls all of its rotating machinery from Pau, France.

Just when everyone had gotten comfortable with the idea of standardization, Welltec president and CEO Jørgen Hallundbæk took the stage to argue that standardization levels everything down to a ‘lowest common denominator.’ Welltec performs the equivalent of keyhole surgery on wells and works to take people off the rig, operating a rig remotely like a drone and using downhole robots to fix issues like stuck pipe. Welltec likes to over-specify its robots so that they are ready for different tasks, creating completion systems that can be deployed by pipe handlers. Welltec even automates its own factory, where ‘robots build robots.’ Hallundbæk observed that the true uncertainty in field development is geology and therefore advocates a staged development process. A delineation well is thus designed so that it can become a ‘keeper’ if it produces.

Statoil CTO Elisabeth Birkeland Kvalheim’s keynote was a delicate balance between Statoil’s activity as an oil and gas producer and its desire for ‘innovative solutions to shape a low carbon future.’ Statoil is getting more competitive by simplifying and standardizing drilling. But standardization is not sufficient; more targeted technologies are needed. Statoil’s annual 3bn NOK R&D spend has delivered some 80 new technologies including subsea gas compression, the unmanned wellhead and, next (perhaps), ‘automatic drilling.’ On the flip side of the carbon coin, Statoil is working to reduce emissions and costs and to become the ‘lowest carbon oil producer in the world,’ in partnership with GE. Kvalheim invited other operators to join the project.

ExxonMobil Development Company president Neil Duffin compared and contrasted costs in routine and non-routine, complex projects. Some of the latter have gone horribly wrong, bringing tens of billions in cost overruns and running years behind schedule. And guess where the industry is going today? Even more non-routine projects, with more deepwater, FPSOs, extended laterals and LNG. Duffin categorized shale as a game changer but observed that the recent price collapse was not caused by a drop in global demand, which rose constantly through 2014. What happened was that the supply curve crossed the demand curve at the start of 2014. Major project costs can however be controlled: Exxon’s 6.9 million tonnes/year Papua New Guinea LNG plant came in early and on budget.

Duffin advocates one team working across operator, contractor and sub-contractor, bringing all parties close together up front and reducing the number of hand-offs, which are always an opportunity to ‘drop the baton.’ Behaviors need to change. Engineers need to focus on engineering, calculating and managing risk. ‘Don’t pile contingency on top of contingency.’ Today the industry really doesn’t understand standardization. We have heard here a lot of talk about customization, but even small tweaks cost more and take longer. We need to help the service industry standardize. A ‘little bit different’ is not standard. This way, the industry can and will weather the storm.

Ken Oostman, VP engineering with Columbia Pipeline Group, is an enthusiastic advocate of the use of big data and analytics in pipeline operations. Columbia’s wakeup call came when a line ruptured in Sissonville, West Virginia in 2012. The non-fatal but costly incident was down to a combination of corrosion, data management and scada/shutoff issues. Columbia turned to GE to develop a system to fix these issues. The solution, the Accenture/GE Intelligent Pipeline Solution (IPS), now provides a bird’s-eye view of operations, with drill down to individual segments. The system provides fast access to information even if there are still issues with the age and format of data. Columbia is also upgrading its workflow and integrating its GIS systems. One serendipitous benefit came in a meeting with the PHMSA regulator, who asked operators ‘what they were doing with big data.’ Oostman talked about IPS. The others had nothing to say!

This is our second report from GE Florence and we still haven’t got around to Predix. This will definitely come next month.


Folks, facts, orgs ...

Arria NLG, Bill Barrett Corp., ClassNK, ConocoPhillips, Lloyd’s Register Energy, DNV GL, eCORP, Express Energy Services, GE, Society of HPC Professionals, Intertek, Kongsberg Maritime, Maersk Training, Navigator Energy Services, Nimbix, Peak Group, Propell Technologies, UKGeoForum, Zycus, Datum360, Pioneer Consulting, PSE, Quantum Technology Sciences, Seatronics, Shell, TGS, Tipro, Yokogawa Electric, Israeli ministry of energy, Lloyd’s Register Foundation.

Simon Small has stepped down from his position as executive director of Arria NLG.

Following CFO Bob Howard’s departure, Bill Crawford is senior VP treasury and finance with Bill Barrett Corp.

ClassNK has promoted Koichi Fujiwara to chairman and president. Junichiro Iida is MD, Tetsushi Agata is executive auditor.

Jeff Sheets is to retire as ConocoPhillips’ executive VP finance and CFO. The position is now held by Don Wallette. Al Hirshberg is executive VP production, drilling and projects and Matt Fox is executive VP strategy, exploration and technology.

Lloyd’s Register Energy has launched a new equipment survey program to assess the integrity of land rigs.

Ditlev Engel is now CEO of DNV GL’s energy business area succeeding acting CEO Elisabeth Harstad.

Jerry Walker is President of eCORP. He hails from Dresser Rand Group.

Stuart Bodden has joined Express Energy Services as president and CEO. He was recently with Innovative Deepwater Exploration.

GE Power has appointed Clay Johnson as VP and CIO.

Shawana Johnson of Global Marketing Insights and Franz Deimbacher of GeosScale have joined the board of the Society of HPC Professionals.

Catherine Aitken is to lead Intertek’s new laboratory in Louisville, Colorado.

Egil Haugsdal is to assume the presidency of Kongsberg Maritime after a short overlap with Geir Håøy who will become CEO next June.

Maersk Training has opened an advanced simulation facility in Dubai.

Mark Hafner is VP of commercial operations at Navigator Energy Services. He hails from Occidental Energy Marketing.

Nimbix has appointed Chuck Kelly as senior VP of Sales. He hails from SunGard Availability Services.

Peak Group has named Neil Poxon as global business development director. He was previously CEO of ProSep.

Brian Boutte is now CEO at Propell Technologies. He succeeds John Huemoeller who remains on the board. David Ramsey, formerly of Tecpetrol, is COO.

Quarry One Eleven founder Alistair Maclenan is now chairman of the UKGeoForum.

Michael Taylor has joined Zycus as VP of ‘customer success management.’

Matt McKinley is now VP of global business development at Datum360. He hails from Siemens Oil and Gas.

Pioneer Consulting has named Gavin Tully as director of submarine solutions. He was previously with TE SubCom.

Paul Frey is to lead PSE’s new center of expertise for pressure relief, flare and blowdown. He hails from AMEC Foster Wheeler.

Stephen Trippe is now chairman of Quantum Technology Sciences succeeding company founder Freddie Garcia who remains a director.

Seatronics has appointed Jannelle Pence as VP for the USA region and Kevin Strachan as head of finance.

Marvin Odum is to leave Shell. He is succeeded as US country chair and president by Bruce Culpepper, currently executive VP HR, unconventional resources and regional coordination.

TGS COO Kristian Johansen is to succeed retiree Robert Hobbs as CEO.

Drillinginfo CEO Allen Gilmer is now chairman of the Texas independent producers and royalty owners association, Tipro.

Yokogawa Electric Corp. has appointed Masatoshi Nakahara as director and executive VP, and Junichi Anabuki as director and senior VP.

Tenders, jobs...

The Israeli ministry of energy has issued tender number 20A/16 for the establishment and maintenance of an oil and gas national data repository. The deadline for purchasing the bid documentation is 4th May, with final submissions due on 4th July 2016.

Lloyd’s Register Foundation is seeking a director and host institution for a £10M program to build resilience in critical infrastructure.


Done deals

3esi, Enersight. Motive Drilling Technologies, GE Ventures, Formation 8. Fluor Corp., Stork, Arle Capital Partners. IHS, Markit. Altran, Tessella. Quorum, ePrime, Retama Resources.

3esi and Enersight are to merge and form 3esi-Enersight. The new unit will provide strategy, planning and execution services to the upstream.

Motive Drilling Technologies has received $10.5 million in funding from Formation 8 and GE Ventures. The monies will be used to expand deployment of Motive’s artificial intelligence-driven directional drilling bit guidance system. The company is a Hunt Energy Enterprises spin-out.

Fluor Corp. has acquired Stork Holding BV from its private equity owners Arle Capital Partners.

IHS and Markit are to merge, forming a global information provider to the energy, financial services and transportation sectors.

France’s Altran has acquired UK-based analytics and data science consultant Tessella. The acquisition is a component of Altran’s ‘2020 Ignition’ strategic plan.

Quorum has acquired upstream business process outsourcing services provider ePrime from Retama Resources. ePrime delivers oil and gas accounting and administrative services using Quorum’s software.


Wireless world

Statoil’s WiNoS program. Ingenu, WellAware machine network. ITC Global, Satellogic comms news.

Yokogawa and Statoil are to jointly develop an ISA100-based field wireless system to map noise levels at Statoil’s plants and upstream facilities. Statoil’s wireless noise surveillance project (WiNoS) is designed to help minimize noise exposure of personnel and to achieve compliance with the OHSAS 18001 specification. The system displays noise levels on a map of the plant that can be viewed in the control room or remotely from a cloud service.

Ingenu and WellAware have announced an expansion of the ‘machine network,’ now with some 55,000 square miles of coverage in Texas. The network, powered by Ingenu’s RPMA communications technology, covers oil and gas fields representing more than 50% of US production. RPMA allows IoT and M2M devices to be connected ‘without the need for further network investment.’

Satellite communications specialist ITC Global has been awarded a ‘multimillion dollar’ contract to provide communications to two West African FPSOs. The seven-year, $6.5 million contract was awarded by Saipem as a component of Total’s Kaombo development, offshore Angola. Comms will be assured by dual C-band stabilized antennae delivering up to 10 Mbps data rates per vessel.

Satellogic is to launch a ‘real-time analytics engine,’ using data from its proprietary constellation of high resolution satellites to deliver ‘actionable information’ to oil and gas decision makers.


OspreyData’s Hadoop for the upstream

CTO Al Brown blogs on a Hortonworks-based ‘agile’ big data solution for oil and gas.

Al Brown, CTO of Hortonworks technology partner OspreyData, blogged recently about the role of ‘agile big data analytics solutions’ in oil and gas. Brown focuses on unplanned interruptions to production, a ‘$100 billion/year problem.’ With oil prices at near historic lows, production interruptions cannot be allowed to further erode margins.

Without saying which ones, Brown reports that oil and gas companies are already using big data technologies, such as the combined Hortonworks/OspreyData solution, to reduce the risk of unplanned interruptions. Today, analysis is performed by operators who may have hundreds of wells in their charge, and a workload that limits their ability to intervene proactively. Analysis takes months and may cover only a fraction of the data.

Enter OspreyData’s packaged Hadoop-based oil and gas application for production optimization and preventative maintenance. This captures streams of data generated by surface and downhole equipment, providing visibility into operations and pinpointing potential points of failure. The system embeds ‘best in class’ field practices to predict and prevent failure. All running on a modern data architecture with enterprise services for security, operations, and governance.
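Brown gives no implementation detail, but a minimal example of the kind of streaming screen such a pipeline applies to sensor feeds is a rolling z-score test. The window size, threshold and sample feed below are invented, not OspreyData’s.

```python
# Rolling z-score anomaly screen; parameters are illustrative, not OspreyData's.
from collections import deque
import math

def zscore_alerts(readings, window=60, threshold=3.0):
    """Yield (timestamp, value) pairs that deviate from the recent baseline."""
    buf = deque(maxlen=window)
    for t, value in readings:
        if len(buf) == window:
            mean = sum(buf) / window
            std = math.sqrt(sum((x - mean) ** 2 for x in buf) / window)
            if std > 0 and abs(value - mean) / std > threshold:
                yield t, value  # candidate failure precursor
        buf.append(value)

# Example: a steady pressure feed with a sudden excursion at t=99
feed = [(t, 100.0 + (t % 3)) for t in range(99)] + [(99, 160.0)]
print(list(zscore_alerts(feed)))  # [(99, 160.0)]
```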


DNV GL sees ‘new reality’ for industry

Outlook for industry report and survey finds some positive notes in a generally gloomy picture.

DNV GL has just issued its Outlook for the oil and gas industry in 2016, a 32 page summary of an annual survey of over 900 industry professionals and executives conducted by Longitude Research on behalf of DNV GL. Woodside COO Michael Utsler opined, ‘We all subscribe to the lower-for-longer view on pricing.’ But a ‘confident minority’ of companies claims to be ‘managing the downturn better than others’ and plans to ramp up spending. Statoil’s Eirik Wærness sees shale oil production declining and demand growth ‘much higher than it was 12 months ago.’ Petronas upstream CEO Dato’ Wee Yiaw Hin sees ‘some optimism for a possible recovery in 2016’ even though ‘there are still uncertainties.’ Confidence in financial performance was slightly up from last year: 34% were confident of achieving their profit targets in 2016. Total was the profits poster child for 2015. BP, curiously in the face of its record $5bn loss for the year, was complimented for ‘beating analyst forecasts for replacement cost profit.’


CMG reports CoFlow deployment hiccup

Novel flow modeler hits performance snag.

In September 2015, Computer Modelling Group (CMG) released revision 10 of CoFlow, its next-generation dynamic reservoir simulator, to partners Shell and Petrobras. The simulator was deployed in a target asset to assist with day-to-day business decisions. The solution is still in use but CMG reports that, due to ‘identified issues’ with software performance, further deployment has been suspended. CMG, Petrobras and Shell remain committed to the ongoing development and success of the project. Version R11 is scheduled for release later in 2016. CoFlow development has been underway since 2006 and is to continue ‘until ultimate delivery’ of the software. CMG puts its share of the costs at CAD 6.4 million for fiscal 2016.

Sales, deployments, partnerships ...

Arria, Inovx Solutions, Blackstone, Doxee, Emerson, Intergraph, JP3 Measurement, Insight Analytical, OSIsoft, Paradigm, Petrotechnics, Quorum, Rock Flow Dynamics, Sierra Monitor, CICC Automation Technologies, Arigo Software, Zycus.

Arria has partnered with Inovx Solutions to enhance the latter’s 3D asset modeling software with Arria’s natural language generation platform.

Blackstone has formed a partnership with Clarion to provide strategic solutions to the offshore oil and gas drilling and services sector and to offer private equity financing.

RWE is to deploy the cloud-based Doxee enterprise communications platform to manage customer communications and output management operations at its Czech Republic branch.

Maersk has awarded the automation contract for its North Sea Culzean project to Emerson.

Total and contractors Technip and Samsung Heavy Industries are to use Norway’s EqHub e-procurement solution on the Martin Linge project in the Norwegian sector of the North Sea.

Saipem is to consolidate and standardize its engineering operations on Intergraph SmartPlant Enterprise.

JP3 Measurement has chosen Insight Analytical as the exclusive reseller of its products and services into Canada’s oil and gas markets.

OSIsoft has been selected by a ‘major US intelligence community agency’ to deliver its PI System software as the information technology/operational technology infrastructure for situational awareness at several major national facilities.

Addax Petroleum has acquired the pore pressure prediction module to complement its Paradigm suite of products. The module has been embedded in Geolog as Addax Petroleum’s corporate tool for pore pressure prediction.

Petrotechnics has been selected by a major LNG operator to deploy its Proscient platform to its Gulf of Mexico facilities.

Quorum TIPS has been chosen by an unnamed master limited partnership to manage its customers, operations and assets in the midstream oil and gas industry.

YPF has acquired licenses of Rock Flow Dynamics’ tNavigator for enhanced oil recovery studies, uncertainty analysis and automated history matching.

Sierra Monitor Corp. has appointed CICC Automation Technologies as an authorized reseller of its FieldServer family of protocol gateways and routers in India. Arigo Software has been authorized as a reseller and local support team for German-speaking markets.

An unnamed ‘global oil and gas engineering organization’ is to implement Zycus’ procure–to-pay and strategic sourcing solution across all of its divisions’ headquarters and regional offices worldwide.


Standards stuff

OGC IOGP/IPIECA oil spill response common operating picture. Energistics’ Witsml 2.0. V8.9 of the EPSG dataset. Oasis UBL 2.1. EC OKs XBRL 2.1. OGC reports IMIS IoT success.

The Open Geospatial Consortium (OGC), the International Association of Oil & Gas Producers (IOGP), Ipieca (a global oil and gas industry association for environmental and social issues), in cooperation with Resource Data, Inc., have issued the OGC Iogp/Ipieca recommended practice for a common operating picture for oil spill response. The final deliverable from the IOGP/IPIECA JIP outlines the use of GIS/mapping technology in oil spill response management.

Energistics has released for comment V2.0 of its Witsml wellsite information transfer standard. Witsml 2.0 is the first standard to embed the new Energistics common technical architecture (CTA), a common technology foundation for the whole family of standards including Prodml and Resqml. The CTA is designed to aid standards implementation and to make it easier to share data between the standards as they are deployed in common ‘end-to-end’ workflows. The latest release sees the replacement of the old web services API with the new Energistics transfer protocol. Comments are requested by close of business June 30, 2016. More from the Witsml overview guide.

The geospatial committee of the IOGP has announced V8.9 of the authoritative European petroleum survey group (EPSG) dataset, now available as both an Access and SQL download.

ISO and IEC JTC1 have approved the Oasis universal business language standard, UBL 2.1, now also known as ‘ISO/IEC 19845:2015.’ UBL is said to be widely used around the world for procurement, sourcing, and inventory.

The European Commission has made XBRL (eXtensible business reporting language) version 2.1 eligible for referencing in public procurement. EU public administrations can now refer to the XBRL specification in their calls for tender. The XBRL standard is designed for exchanging business information, facilitating automatic retrieval of financial information and improving analysis of financial reporting. The EU promotes ICT* standards to maximize the ability of systems to work together. This is seen as essential to ensure that markets remain open and to ‘promote European competitiveness.’

The standards-bulimic OGC has also announced the successful completion of its incident management information sharing internet of things pilot (IMIS IoT). The pilot, sponsored by the US Department of homeland security, set out to demonstrate open system sensor integration for emergency and disaster response.

* Information and communications technology.


Cyber security round-up

SafePass Pro. Securing the C-Suite. Insider threat advisory. DeltaV secured. NISTIR cyber standard.

Harris CapRock has announced SafePass Pro, a cyber security solution for oil and gas comprising a firewall, 24/7 network monitoring and preventive threat protection. The tool enforces acceptable use and security policies and helps assess and eliminate network vulnerabilities.

The IBM Institute for Business Value has published a free, 20 page guide to ‘Securing the C-suite.’ While 65% of execs believe that their cyber security is OK, only 17% can demonstrate the highest level of preparedness and half think there is a 25% chance of a breach occurring that would have a material impact on their organization. Despite the title, the guide is actually about raising the profile of cybersecurity to boardroom level rather than the risks posed by powerful individuals and BYOD.

A new insider threat advisory service from Montréal-headquartered CGI, in partnership with Carnegie Mellon University’s Software Engineering Institute, delivers strategy advice and implementation services to address cybersecurity threats from trusted insiders such as current employees, contractors or business partners. The program helps organizations analyze and correlate disparate data sources to uncover potential risks and threats using ‘sophisticated methodologies’ and also creates a governance model for ongoing program management. The SEI-approved program was originally developed for the US National industrial security program operating manual (Nispom).

Emerson Process Management’s DeltaV distributed control system has passed the IEC 62443-2-4-based Achilles practices certification for cybersecurity. GE’s Wurldtech cybersecurity unit performed the audit. DeltaV will now undergo Wurldtech’s annual recertification.

An interagency report on Strategic US government engagement in international cybersecurity standardization, Nistir 8074, sets out strategic objectives for cybersecurity and makes recommendations on how to achieve them.


AGSync for Total UK E&P

Offshore process control software assets protected and synched with Asset Guardian’s tools.

Total UK E&P has selected Asset Guardian Solutions’ (AGSL) Asset Guardian to secure and manage its process control software across its North Sea oil and gas fields. Asset Guardian provides a centralized, secure repository for storing process control software files, along with disaster recovery services and software version control. Central management of software ensures that authorized onshore and offshore personnel have access to the same information. The system also prevents simultaneous changes being made to code, while the use of MD5 checksums provides integrity assurance of files during transfer.
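
By way of illustration, checksum-based transfer verification amounts to comparing a file’s digest before and after it crosses the link. The following minimal Python sketch is our own, with invented file names, and is not AGSL’s implementation:

    import hashlib

    def md5_digest(path, chunk_size=65536):
        # Compute the MD5 digest of a file, reading in chunks so that
        # large process control software images fit in memory.
        h = hashlib.md5()
        with open(path, 'rb') as f:
            for chunk in iter(lambda: f.read(chunk_size), b''):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical usage: digests of the onshore original and the
    # offshore copy must match, otherwise the transfer is suspect.
    # assert md5_digest('plc_config_v12.bak') == \
    #        md5_digest('/offshore/plc_config_v12.bak')

Note that MD5 suffices to detect accidental corruption in transit, though it is no longer considered a defense against deliberate tampering.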

AGSL is also providing Total with its AGSync application, which synchronizes files across onshore and offshore locations. Asset Guardian will first be deployed on Total’s Elgin-Franklin gas condensate fields. If the test deployment succeeds, the software will be rolled out to other Total E&P UK assets. AGSL is also to assist Total with installation of the software and provide user and administrator training.


Effisoft for YPF’s risk managers

WebRisk tool standardizes risk evaluation and management for Argentinian major.

Paris-based Effisoft reports completion of a two-year project to equip Argentina’s leading energy company YPF with its Webrisk risk management application. Effisoft’s Webrisk RMIS (risk management information system) targets risk and insurance managers. YPF is using the solution to gather and evaluate asset data for risk assessment and insurance premium estimation. YPF also uses Webrisk to evaluate its property, casualty, liability and transport insurance, and in planning site visits, monitoring and making recommendations.

YPF’s risk and insurance manager said, ‘Webrisk has helped us to standardize our data and simplify our reporting. Instead of using Excel files, we now have a single, reliable database which is adapted to our needs. Given the success of this first project, we are considering going a step further than the initial scope by including our fleet of 5,000 vehicles and our 1,500 gas stations.’ Webrisk is now used throughout YPF by some 100 employees, along with a manager and consulting engineer at head office.


SciQuest Spend Radar for Sunoco LP

‘Source-to-settle’ payment solution displaces legacy procurement tools.

Following a series of acquisitions, Houston-headquartered fuel retailer Sunoco LP decided to rationalize its ‘source-to-settle’ process to reduce spend and gain efficiency. After an evaluation, Sunoco has consolidated its supplier onboarding, buying, payment and sourcing functions under a single platform from Morrisville, NC-based SciQuest. SciQuest’s cloud-based business automation solution for spend management will replace Sunoco’s legacy procurement tools.

The deployed solution includes SciQuest’s Spend Radar, Sourcing Director, Supplier Manager, Spend Director and Accounts Payable Director. SciQuest president and CEO Stephen Wiehe said, ‘Sunoco views procurement as a strategic imperative. Our software will provide insight into Sunoco’s current spend and help it take strategic decisions on growth.’ Follow SciQuest’s blog.


Librestream, Cisco TelePresence video for PDO

Gulf Business Machines kits-out Petroleum Development Oman’s control rooms.

Dubai-headquartered systems integrator Gulf Business Machines (GBM) has supplied a video collaboration solution to Petroleum Development Oman (PDO) based on Librestream technology and Cisco networking.

Live video, images and audio from Librestream’s Onsight rugged cameras will transit through Inmarsat’s Bgan link from the field into the control room. Here remote experts can collaborate with field workers from their desktops or from Cisco TelePresence rooms.

Librestream’s Mike Murphy said, ‘Often field engineers need to stay a safe distance away from the equipment when diagnosing emergencies such as well head relief valve failures. Here engineers can use the camera’s digital zoom to help remote experts diagnose the issue.’

Cisco’s Chet Namboodri added that the PDO project was one of several such collaborations with Librestream in oil and gas and other verticals. Cisco partner GBM resells the Onsight range of Ex-certified video devices, collaboration software and management tools throughout the Middle East.


DigitalGlobe uses Software AG’s cloud-based API repository

CentraSite WebMethods API gives custom access to high resolution earth imagery.

DigitalGlobe is to offer users of its high resolution earth observation imagery programmatic access to its data via Software AG’s WebMethods API*. DigitalGlobe provides some 4 billion square kilometers of coverage (98% of the earth), much at 30cm resolution.

Software AG’s digital business platform has been used internally by DigitalGlobe for five years under the leadership of Steve Miller, DigitalGlobe’s manager of enterprise integration. Miller’s team relies on the platform for integration, message routing, process management and service virtualization. Having transformed DigitalGlobe’s IT, Miller is now reported to be ‘excited’ about what can be done next with APIs. DigitalGlobe’s clients can now access and integrate high-quality imagery into their business operations and applications.

Once a service is ready for deployment, developers can offer users ‘right button click’ access to services hosted on CentraSite, Software AG’s API catalog and registry. Miller added, ‘The new API/cloud capabilities, which include the hosting and integration of consistent, scalable APIs, mean that we can offer these services to our partners.’

* Application programming interface.
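
The announcement gives no detail of the API itself. As a flavor of what programmatic access to an imagery catalog typically looks like, here is a hypothetical REST query in Python; the endpoint, parameters and token are invented for illustration and are not DigitalGlobe’s published interface:

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical imagery search endpoint -- illustrative only.
    BASE = 'https://api.example.com/imagery/v1/search'
    params = urllib.parse.urlencode({
        'bbox': '-95.5,29.5,-95.0,30.0',  # lon/lat bounding box (Houston)
        'resolution': '0.3',              # 30cm imagery
    })
    req = urllib.request.Request(
        BASE + '?' + params,
        headers={'Authorization': 'Bearer <api-key>'},
    )
    with urllib.request.urlopen(req) as resp:
        scenes = json.load(resp)
    print(len(scenes.get('results', [])), 'scenes found')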


Ensyte reports first client for Gastar in the cloud

Software as a service edition of natural gas transaction manager developed with help from Air Desk.

Ensyte has announced its first client for Gastar Online, a hosted version of its Gastar line of applications for ‘managing natural gas business transactions from wellhead to burner tip.’ The hosted, software-as-a-service edition is provided in collaboration with Air Desk Solutions, a cloud technology supplier that offers a ‘best-in-class’ operational environment and state-of-the-art security. Gastar Online is offered at a monthly subscription fee, giving small and mid-size companies a ‘cost-effective alternative to traditional license-based pricing models.’

Ensyte also reports a new sale to an unnamed Houston-based company for the implementation of its producer gas marketing and well meter netback solutions, both also provided as hosted packages. The producer gas solution computes monthly supply forecasts, daily measurement updates and end-of-month settlement data for a large gathering network of producer supply and central delivery point (CDP) meters. Data can be collected from EFM interfaces, CSV/text files and manual input. The CDP measurement is updated daily with field estimates uploaded from IHS’ Production Explorer and with data from the pipeline supply pool.
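
To make the data flow concrete, the daily roll-up such a system performs can be sketched as follows. The CSV layout and column names are our own invention; Ensyte’s actual file formats are not documented in this announcement:

    import csv
    from collections import defaultdict

    # Sum meter readings by (meter, gas day) from a daily CSV export.
    # Hypothetical columns: meter_id, date, volume_mcf.
    def daily_totals(path):
        totals = defaultdict(float)
        with open(path, newline='') as f:
            for row in csv.DictReader(f):
                totals[(row['meter_id'], row['date'])] += float(row['volume_mcf'])
        return totals

    # Roll producer meters up to a central delivery point (CDP)
    # estimate for a single gas day.
    totals = daily_totals('meter_readings.csv')
    cdp_volume = sum(v for (meter, day), v in totals.items()
                     if day == '2016-03-01')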


Atek’s AssetScan for internet of things OEMs

Tank monitoring specialist offers its cloud-based IoT sensor connectivity to third parties.

Atek Access Technologies, developer of the TankScan remote tank monitoring solution, has announced AssetScan, a suite of hardware, software and data analytic services that allow original equipment manufacturers (OEMs) to connect products to the ‘industrial internet of things.’ AssetScan captures any variable and transmits it to the cloud, where it can be accessed via the web-based Atek Intelligence Platform.

The platform monitors system health remotely, including battery life, temperature and cellular data usage. Atek president Sherri McDaniel said, ‘We are building on years of remote asset management experience in tank monitoring and applying it to new markets. AssetScan delivers the power of analytics to our OEM partners, helping them to differentiate their products and create a competitive advantage.’

Customers can access the platform database via API calls to create and manipulate their own copy of the data. Access purviews can be configured by security level for each user type. Displays are customizable to different endpoints. AssetScan also functions as an alerting system, sending text and email messages to those users configured to receive them. Alerts can be set at different levels and triggered by any process variable.
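
The announcement does not document the API, but configuring such an alert typically amounts to posting a threshold rule. The following Python sketch uses invented endpoints and field names for illustration; it is not Atek’s published interface:

    import json
    import urllib.request

    # Hypothetical AssetScan-style alert rule: notify operators by SMS
    # and email when tank level drops below 20%.
    rule = {
        'asset_id': 'tank-042',
        'variable': 'level_pct',
        'operator': '<',
        'threshold': 20,
        'notify': ['sms:+15551234567', 'email:ops@example.com'],
    }
    req = urllib.request.Request(
        'https://platform.example.com/v1/alerts',
        data=json.dumps(rule).encode(),
        headers={'Authorization': 'Bearer <token>',
                 'Content-Type': 'application/json'},
        method='POST',
    )
    urllib.request.urlopen(req)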


© 1996-2021 The Data Room SARL. All rights reserved.