Speaking at the IQPC Oil and Gas Cybersecurity conference earlier this year, Noble Energy’s Stuart Bailey and Andrew Ginter of Waterfall Security Solutions showed how Noble is protecting its offshore assets from cyberattack using Waterfall’s unidirectional security gateway. Waterfall’s technology uses a one-way laser-to-photocell communications device to connect the rig to the onshore business network.
Sending control system data over a one-way link is harder than it sounds. The OPC-DA protocol is ‘intensely bi-directional’ with constant to-and-fro handshaking. To get around this type of constraint, Waterfall has developed device emulators that turn OPC traffic into a unidirectional data stream. Waterfall claims the ‘world’s largest’ collection of industrial server replications that includes several OPC flavors and historians from OSIsoft, Siemens and AspenTech.
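The replica pattern described above can be sketched in a few lines. This is a minimal illustration, not Waterfall's implementation: the `OpcSourceStub` class, tag names and JSON payload are all invented for the example. The point is that business-side clients talk only to a local replica, so no traffic ever needs to cross back over the one-way link.

```python
import json

class OpcSourceStub:
    """Stand-in for polling the plant-side OPC server (illustrative only)."""
    def read_tags(self):
        return {"PT-101": 42.7, "TT-204": 88.1}

def publish_snapshot(source, send):
    """Plant side: poll the live server and push a one-way snapshot."""
    snapshot = source.read_tags()
    send(json.dumps(snapshot).encode())

class ReplicaServer:
    """Business side: answers OPC-style reads from the last received
    snapshot, so clients never initiate traffic toward the plant."""
    def __init__(self):
        self.cache = {}
    def receive(self, payload):
        self.cache.update(json.loads(payload.decode()))
    def read(self, tag):
        return self.cache[tag]

# Simulate the one-way link with a single function call.
replica = ReplicaServer()
publish_snapshot(OpcSourceStub(), replica.receive)
print(replica.read("PT-101"))  # 42.7
```

In the real product the bi-directional OPC-DA handshaking happens twice, once on each side of the gateway, with only the resulting data stream crossing the optical gap.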
Noble’s offshore West Africa rigs deploy primarily OPC, WonderWare and PI, all of which are now replicated. The system provides Noble with ‘as much visibility of offshore platform data as from any other control system in the company,’ all with ‘absolute protection’ from attack. The system is fault tolerant with dual redundant data paths. Bandwidth is capable of handling the 4000 plus connected tags (many with sub-second data) and control system backups.
For applications that do require outbound data flow Noble has developed a manual process that writes data to removable media which is scanned on a cleansing workstation prior to connection to the control system. Much work has gone into designing policies and procedures to control data moving out to the control system. Noble has worked with Waterfall on its security framework that allows new services to be implemented in a structured manner such as the addition, late in the project, of GE’s on-site equipment health monitor.
Educating the workforce has been a key component of the project. Noble has found that educated end users help with compliance. ‘Operations guys are good at following logical procedures that make sense.’ Noble is also keen to ‘educate’ other companies regarding the use of the Waterfall solution that is considered to be ‘an important development that can improve security industry wide.’
In case you are not convinced of the risks that a connected control system can encounter, another presentation, from Chris Shipp of the US DoE’s Strategic Petroleum Reserve, told of a hack that caused ‘massive damage’ to a German steel plant last year. Shipp also recommended an article in Valve Magazine. Those interested in tracking oil country cyber security may be interested in an upcoming IQPC event in Houston.
Halliburton’s Landmark software business line is to team with Edinburgh, UK’s Petroleum Experts (Petex) on the development of advanced production solutions for the digital oilfield. Petex’s digital oil field, a.k.a. IPM Field Management (IFM) is an enterprise-level vendor neutral environment for production monitoring, optimization and forecasting. IFM is claimed to be ‘productized,’ avoiding the ‘costs and delays’ of alternatives.
IFM will now be the core platform for Landmark’s production solutions, extending its portfolio of drilling, economics, geoscience and simulation software in a ‘single, seamless real-time field management system.’ IFM and Landmark’s DecisionSpace will be integrated and extended as a new joint DOF solution.
Landmark VP Nagaraj Srinivasan said, ‘Customers will receive immediate benefit from the pre-integration of existing Landmark and Petex products as components of Landmark’s Intelligent Operations Solutions. We are also to develop a new class of production and reservoir workflows to meet existing and emerging industry challenges.’ More from Landmark and Petex.
As the actualité has crowded out my editorial space this month I thought I’d provide some succinct housekeeping news. As you probably noticed we have been ‘managing’ the news from our conference attendances such that it is only now (mid May) that we are reporting on the SPE Digital Energy conference that took place in March. The reason for this laggardly behavior is that we only have the space and resources to cover one major show each month. Moreover each of these shows merits our main ‘centerfold’ slot and there is only one of these per issue. So we stick to one comprehensive in-depth conference report in each issue.
So far this year that has meant on-the-spot reports from American Business Conferences’ Wellsite Automation (Houston), SMI E&P Data Management (London) and the SPE Digital Energy event (see page 6). Our next ‘live’ report will hail from the 2015 SAP in oil and gas event held in Berlin last month. After that we will report from the 2015 EAGE conference in Madrid. I expect to see some of you there. Looking ahead the crystal ball is a bit cloudy but we will likely be attending Ecim in Haugesund in September, the SPE ATCE in Houston and a couple of others to be announced.
There is something err.. noble about the sentiment reflected in Noble Energy’s presentation at the IQPC Oil and Gas Cyber Security conference (see this month’s lead). It is refreshing to hear a company presentation that unequivocally endorses a commercial product. This is in contrast with the game that is too often played by oils and societies whereby ‘commercialization’ is to be avoided at all costs. Some folk actually believe that software is ‘commodity’ and that the true value add comes from a savant’s manipulation of the mouse.
With PNEC kicking off next week (sorry I won’t be there, off on a biking holiday) I thought I’d check the program for endorsements. I am happy to report that at least five presentations clearly associate software with a client. It is possible to organize a conference where the presentations are genuine attempts to get an important and complete message across. It is as important to steer clear of blatant commerciality as it is to avoid obfuscation. We will of course do another ‘virtual’ report from PNEC after the show.
Our annual aunt Sally, the Accenture/Microsoft ‘survey’ of oil and gas digital technology and trends puts a happy face on an otherwise dire situation. ‘Despite the fall in crude oil prices, most companies … plan to invest the same amount or more in digital technologies.’ The trouble with surveys, as psychologists and politicos will confirm, is that they are devilishly difficult to design without revealing inconsistencies of a more or less embarrassing nature. Asked ‘Which digital technologies is your company investing in?’ 77% replied ‘None of the above’ and 71% replied ‘don’t know.’ Support for other technology options was all in the 60% plus range. Accenture MD Rich Holsman explained, ‘Respondents were allowed to select any or all of the options. They first selected technologies that their companies were investing in. Those who selected ‘don’t know’ possibly meant that they were not sure which other technologies were in use.’ Decipher the enigmatic responses yourselves by downloading the survey results.
Speaking at the 2015 Nvidia GPU technology conference earlier this year, Dominic Walsh explained how Schlumberger is introducing GPUs into its ‘next generation’ Intersect reservoir simulator. Since 2011, clients’ model sizes have risen from a few tens of millions of cells to as many as 2 billion cell models running on Linux clusters. Adding surface facilities, economics and uncertainty makes for ‘compute bound’ models, hence the interest in the potential of the graphics processing unit (GPU) to accelerate simulations. Over the next 5-10 years Schlumberger plans to migrate its user base from Eclipse to the Intersect ‘high fidelity’ simulator and is keen to add new GPU hardware to the mix. Parallelizing computations to the GPU is complex. Modeling in the vicinity of the well bore is ‘too small and complicated’ and will remain on the CPU.
Walsh’s talk was co-authored by Stone Ridge Technology whose Dave Dembeck presented on how GPU acceleration has been leveraged in its own Echelon reservoir simulator. Echelon was developed from the ground up for GPUs and claims ‘10x to 50x faster’ runtimes than leading commercial simulators. Dembeck observed that as simulations speed up, they can create a ‘downstream bottleneck’ as engineers are inundated by model results. Workflows group results into ensembles and visual tools help scenario and model triage.
Saber Feki presented KAUST’s experience with OpenACC, a higher level, cross platform route to parallelizing scientific applications to GPUs and other hardware. KAUST showed OpenACC ports of seismic migration and inversion codes for use on its Shaheen II Cray XC40, a 100 tonne monster with a 7.2 petaflop peak capacity. Shaheen II uses 6,192 x 16 core Intel Haswell CPUs. A GPU upgrade is planned as the machine has maxed-out on power usage at 2.8 megawatts!
Our ‘best pics of the show’ award goes to Michael Heck (FEI - Visualization Sciences Group) who showed how Schlumberger’s Petrel leverages Nvidia GPUs via FEI’s Open Inventor toolkit. Volume rendering and interaction with 220 giga voxel datasets is possible. Heck reported that the latest Quadro K6000 is around twice as fast as the K5000 with the same software.
Finally Jonathan Marbach’s (CGG Software) presentation on GPU acceleration of acquisition footprint removal in post-stack seismic data is replete with benchmarks and offers a good history of GPU evolution.
What’s behind Witsml certification?
Our implementation of the Witsml 1.4.1.1 release passed the Energistics automated software testing and certification program. This new approach replaces previous certification programs and brings consistency and objectivity to the testing process. We have a particular interest in this as I am a co-chair of the Witsml SIG.
What does this entail?
Witsml is an open industry community. We try to keep the herd moving in the right direction and figure out where we should go tomorrow. We recently released the new Energistics transfer protocol for public review. ETP has been a focus for several years as the community looks beyond simple SOAP and separates data and transport mechanics, making it easier to add new stuff.
Why was this not done before?
Hindsight is a great thing! ETP fixes this issue, adding just what we need. We are now focusing on data management and have added new objects to make it possible to share quality assessments and add context. This will also enable data audit and traceability. With the rise of deviated drilling, managing trajectory data is increasingly important.
What does Witsml bring to Petrolink?
Actually we use and need lots of standards but we find Witsml to be one of the most active and open communities solving oil and gas problems.
What others are there?
A lot! Wits is still widely used as are LIS, LAS, DLIS and even CSV files. DLIS and WITS do have communities but they are not very active. We also see a lot of standards coming in from process control. OPC is a big contender here. But Witsml was designed by and for oil and gas.
But OPC is at a lower level?
Yes and it has a different footprint. But you may want to integrate some data from the process world. For instance to monitor a BOP stack. This is easiest to implement at the OPC level and you can share summary data from service providers via Modbus/Profibus. We take results from OPC and ‘upscale’ to Witsml. There may be other ways of doing this, but I am biased! Petrolink also does data aggregation. Company X has some data, company Y has more data and we combine them, adding context.
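The ‘upscaling’ from flat OPC-style tag samples to a Witsml-flavoured document might look something like the following sketch. Element names are simplified stand-ins, not the full Witsml log schema, and the BOP-stack mnemonics are invented for the example.

```python
import xml.etree.ElementTree as ET

def opc_samples_to_witsml_log(uid, samples):
    """Wrap flat OPC-style (mnemonic, value) samples in a minimal
    Witsml-flavoured <log> document. The elements used here mimic the
    Witsml log layout (mnemonic list plus comma-separated data rows)
    but are simplified for illustration."""
    log = ET.Element("log", uid=uid)
    mnemonics = ET.SubElement(log, "mnemonicList")
    mnemonics.text = ",".join(m for m, _ in samples)
    data = ET.SubElement(log, "logData")
    ET.SubElement(data, "data").text = ",".join(str(v) for _, v in samples)
    return ET.tostring(log, encoding="unicode")

# Hypothetical BOP monitoring tags arriving from the process world.
xml_doc = opc_samples_to_witsml_log("bop-1", [("ACC_PRESS", 3012.5), ("POD_SEL", 1)])
print(xml_doc)
```

The real value add, as noted above, is the context an aggregator attaches when combining company X’s and company Y’s data, not the syntactic conversion itself.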
Witsml is the industry-favored format?
Yes. Our first thought is to use the data structures somehow, even if it was not designed for data management.
What is the situation now with validation of Witsml data? Oil IT Journal validates its newsfeed with the W3C’s validator. Why isn’t this facility offered by Witsml?
It is now, although it is true that the Witsml community resisted certification for years. But this did not stop users (like us) from using a parser to check conformity. With V1.4.1.1, we realized that an extra step was needed, one that would not be a barrier to use. Today, the free Energistics reference server gives an unbiased opinion on data structure and behavior, although this approach is not as rigorous as your RSS feed validation.
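The ‘parser to check conformity’ step mentioned above is the first line of defense and needs nothing beyond a standard XML parser. This sketch covers only well-formedness; validating against the actual Witsml XSDs would need a schema-aware library such as lxml.

```python
import xml.etree.ElementTree as ET

def check_well_formed(doc: str):
    """First-pass conformity check: is the document even well-formed XML?
    Returns (ok, error_message). Schema validation against the Witsml
    XSDs is a separate, later step not shown here."""
    try:
        ET.fromstring(doc)
        return True, None
    except ET.ParseError as err:
        return False, str(err)

ok, err = check_well_formed("<well uid='w1'><name>Test</name></well>")
print(ok)        # True
ok, err = check_well_formed("<well><name>Test</well>")
print(ok, err)   # False, plus the parser's mismatched-tag message
```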
Which is certified? Server or client?
For now it’s the server. But as we move to ETP and away from XML and with an order of magnitude more data, we will need a more structured process like a validator.
ETP is not XML?
No it is a binary encoding of data in say XML or Profibus. You can also use ETP as internal transfer mechanism for proprietary data. ETP leverages the HDF open packaging standard that was championed by the Resqml community. It also uses the Apache Avro framework.
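The win from a schema-driven binary encoding such as Avro is that field names and types live in the shared schema, so the wire carries values only. The following toy comparison uses the standard library's `struct` module as a stand-in for Avro's encoder; the sample fields (depth, ROP) and the two-double schema are invented for illustration.

```python
import struct

# One depth/ROP sample as verbose, self-describing XML...
xml_sample = "<sample><depth>1523.4</depth><rop>28.7</rop></sample>"

# ...versus a schema-driven binary record. Both ends agree on the
# schema (here: two little-endian doubles), so tags never hit the wire,
# which is the essential idea behind Avro's binary encoding.
schema = "<dd"  # hypothetical schema: depth, rop
packed = struct.pack(schema, 1523.4, 28.7)

print(len(xml_sample), len(packed))  # the binary record is 16 bytes
depth, rop = struct.unpack(schema, packed)
```

At an order of magnitude more data, that per-sample saving is what makes the move away from XML-on-the-wire worthwhile.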
Speaking of Resqml, is the Semantic Web getting take-up in the larger Energistics community?
The community is often asked by oils for more context and extra information types like the relationship between real time and static, reported data. A report is a snapshot of data at a point in time. This requires thinking about things at a corporate level – for instance tying Witsml to SAP. In this context, combining semantic metadata with Witsml objects could be a means of taking stuff along the data chain. The Energistics data management group is looking at this. In fact I am working on a paper for PNEC on Data Management Challenges and WITSML.
Who pays for all this?
We are 100% staffed by volunteers. There are 75 in Schlumberger and 4-5 companies working on the project. More from Petrolink.
Total has upgraded its SGI ICE X ‘Pangea’ supercomputer, adding 4,608 Intel Xeon E5-2680 v3 nodes (110,592 cores) and 589 terabytes of memory in 8 SGI M-Cells. The update includes an additional 9.2 petabytes of storage. Pangea is located at Total’s Jean Féger Scientific and Technical Centre in Pau, France and will, once the new upgrades are installed, take some 4.5 megawatts from the grid.
Mustafa Kara presented certification consultants DNV GL’s cloud strategy at the recent IDC HPC User Forum in Norfolk, Virginia. DNV GL uses a cloud-based edition of Nimbix’s Jarvice platform to perform computational fluid dynamics calculations on oil and gas structures. Nimbix’s API is used to access Ansys Fluent software and to create tailored workflows. Jarvice is described as a self-service, elastic supercomputing architecture for big data and HPC applications.
Schlumberger has signed two key deals for deployment of its flagship Petrel geoscience applications with BP and Chevron Energy Technology Company. Petrel is a component of BP’s new digital technology roll-out. The first deployment of the two-year global implementation has been successfully completed at BP’s business unit in Aberdeen, Scotland.
Schlumberger Information Solutions’ president Uwem Ukpong said, ‘Petrel replaced an existing software toolkit to help upstream technical staff optimize work processes, improve efficiency and reduce non-productive drilling time.’
Speaking at last year’s PNEC data management conference, BP’s Pradeep Vaswani outlined BP’s ‘Chili’ upstream digital platform which has Schlumberger’s Petrel running on top of Landmark’s R5000 data infrastructure. The Chevron deal covers a long-term software contract providing universal access to Petrel, Techlog, OFM and ProSource data management across Chevron’s entire earth sciences organization. More from Schlumberger.
Ikon Science has signed with California based Dynamic Graphics for the integration of the latter’s CoViz Lite 3D visualization platform with RokDoc seismic lithological modeler. CoViz Lite will be marketed as an add-on 3D visualization system for RokDoc featuring select capabilities derived from the original CoViz 4D package. Visualizing complex rock and fluid property maps and data volumes in full 3D will help geologists and engineers understand and analyze complex reservoirs and plan more efficient wells. Users of the new Ji-Fi facies-driven impedance inversion process will also benefit from the CoViz data representation toolset.
Ikon CEO Martyn Millwood Hargrave said, ‘We look forward to progressing this technical relationship as we share Dynamic Graphics’ passion for rigorous science and its culture of user-friendly software.’ The combined package is undergoing testing prior to a commercial launch in summer 2015. The software will be on show at the upcoming EAGE and AAPG tradeshows. More from Dynamic Graphics and from Ikon.
OneGeology, an international collaboration between geological science institutes, has outlined its work and funding plans for 2015-16. OneGeology (Oil IT Journal September 2008) was set up principally by the UK Geological Survey and France’s BRGM as a worldwide digital geological mapping system. Since then Earth and computer scientists have collaborated on the digital geological map of the world. This year’s program includes a large amount of ‘business as usual’ activity, in particular fundraising as the UK and France seek other parties capable of sharing the financial burden. New principal member subscriptions and/or other income sources are necessary for OneGeology to continue.
Assuming that the funding issues can be sorted, the OneGeology program includes registering new web map services (WMS) from around the world, especially from South America. OneGeology’s underlying GeoSciML protocol is to be extended with a web feature service, and a new metadata catalogue is planned, leveraging the imminent new release 3.0 of the open source Geonetwork spatial catalog. This will then include new ways of searching for the extended geoscience data types that OneGeology now encourages.
BRGM is to implement a new Geoserver service to provide the backdrops for the new multi-projection support system. Finally, support for the OGC web coverage service is planned. This will facilitate the visualization and exchange of large grid and raster data sets as a ‘small step’ towards full 3D support. Interested parties should sign up for the Technical Implementation Group (TIG) meeting in Lisbon on 26 May.
To celebrate the 200th anniversary of William ‘Strata’ Smith’s seminal 1815 ‘delineation of the strata of England and Wales,’ the UK Geological Society has announced a year-long program of events. At the launch event in Burlington House a new interactive Strata-Smith website was unveiled that allows users to view all the known surviving editions of the Map.
The development of this website was sponsored by UKOGL, and developed by Lynx Information Systems. Lynx’s Peter Wigley demoed the new website’s interactive map viewer. Along with the Smith maps, users can overlay modern geology, wells and seismic sections (using Lynx’s online SEG-Y viewer) along with topographic maps. More from Lynx.
The accolades are all the more fitting in that, back in the day, the Society gave Smith the cold shoulder, as Simon Winchester describes in his book, ‘The map that changed the world.’
Assai Software’s engineering document management solution now includes AssaiOnSite to enable up/downloads across low bandwidth connections into the Assai collaboration portal.
Badleys’ Move2015 introduces Trishear, a new 3D move-on-fault algorithm that restores complex deformation ahead of a fault tip. Badley’s flagship ‘uncertainty mitigator,’ Traptester has been rebranded as ‘T7.’
ClearEdge3D has released EdgeWise 5.0 with improved pipe extraction algorithms, a Plant 3D plug-in and ClearView, a photo-realistic point cloud visualization engine.
IBM researchers have recorded 123 billion bits of data on one square inch of magnetic tape. The breakthrough represents the equivalent of a 220 terabyte tape cartridge, an 88 fold improvement over today’s LTO6.
Ikon Science has released a seismic data conditioning plugin for Schlumberger’s Petrel, providing pre and post stack QC workflows as a precursor to quantitative interpretation.
INT’s GeoToolkit.JS 1.3 improves visualization of complex data on endpoints from desktops to mobile devices. The release includes new gauge types, a well log widget for common log displays and new log data event and log curve data source objects.
KBC Advanced Technologies has released new versions of its engineering software suite. Petro-SIM 6 now supports life of field modeling with the integration of Multiflash’s high resolution thermodynamics and Feesa Maximus’ thermal hydraulic production models.
New releases of Lynx’ Seismap, Clickrelate and Rasterlink include support for ESRI ArcGIS 10.3. The solution provides point-and-click access to linked documents, images, seismic, well logs and formation tops.
Newgate Instruments’ JT400 ultra-low power wireless multivariable transmitter measures differential and static pressure and process temperature. The explosion-proof unit targets custody-transfer applications with API 21.1-compliant data logging and a local USB interface for configuration and data download.
Release 4.0 of Oildex’ Spendworks ePayables includes a new GUI/dashboard, auto-email dispute resolution and enhanced administration functionality.
Pegasus Vertex has announced Bass, a new bridge agent size selector. The tool uses lost circulation material blends and particle size distribution to optimize selection based on the ‘ideal packing theory.’
Safran Software has announced Safran Integrator for SAP, providing ‘seamless’ information flow between Safran Project and SAP.
Schlumberger’s ‘brand new’ drilling optimization software, built on big data analytics from the 70 million feet it drills every year, is coupled to an upgraded rig control system, promising a ‘step-change’ in drilling performance. The 2015.1 edition of CoreDB delivers (another!) step-change in functionality and interoperability with Oracle and Microsoft. The Intersect simulator now runs in the cloud.
ZTR Control Systems has announced a heavy oil well site monitoring and control solution under its inReach brand. The system offers J1939 and Modbus connectivity and dual-mode (cellular or satellite) communications.
Oil IT Journal ventured downstream last month, attending a conference on Intelligent Operations organized by the Netherlands WIB process industry body. WIB provides a focus for peer group counseling, lobbying, functional safety and best practices exchanges to around 50 member companies including Shell and ExxonMobil. The debate polarized somewhat between advocates of people and process and those in favor of automation and robotics.
On the ‘people’ side, Alex van Delft (DSM and WIB chair) characterized intelligent operations (IO) as being about new ways of designing plant and about working together to close the different loops of process, production and the business. The key is to leverage measurement and to bring workers into a virtuous circle of transparency and self-regulation. ‘IO is mainly about people.’
Joachim Birk (BASF-SE and Namur) sees the people’s role as diminishing. In fact this trend is already well underway with for instance, Linde, which operates 15 air separation plants and 50 Ecovar units remotely from an operations center in Leuna. Production is tuned to demand and energy costs. BASF has a roadmap that aligns different levels of IO development with investment. This goes from low level DCS-controlled plant to a highly automated plant with no operator intervention in normal operations.
The ultimate is the ‘semi-autarkic’ plant which is unmanned. Centrally located personnel provide expertise and distribute tasks across the technology cluster. Unmanned plant integrity can be assured by video, thermal and noise surveillance. Sequence-based automation with ISA S88 for batch and ISA SP106 for continuous processes also got a mention.
Rob Everink (DSM) asked if it should be intelligent operations or an intelligent operator. He believes that a true vision for automation is lacking. Humans need to focus on highly skilled activities as robotics take over the ‘monkey business!’ Control systems, not operators, should control the plant. After 40 years, DCS control systems are far from mature; today they ‘resemble a computer game for operators.’ Modern PID controllers are very different, but we don’t use the difference! 100% automation should be possible. Inertia and conservatism are holding this back. Download the WIB presentations here.
Geoffrey Moore kicked off the 2015 edition of the Society of Petroleum Engineers’ Digital Energy conference in The Woodlands, Texas earlier this year asking if the digital oilfield has ‘crossed the chasm,’ a reference to his seminal work on innovation and disruption. Moore’s framework for analyzing technology adoption ‘plays out in every disruption.’ For instance, 3D printing today is in the chasm but will emerge sooner or later into take-up by the majority. The question for such innovation is ‘how do you light the fire on the other side of the chasm?’ Moore was better at citing examples (Amazon in cloud computing, Apple for smart phones) than providing matches, although Bezos’ and Jobs’ ‘charisma’ helped these companies ‘get ahead of the herd.’ Oil’s current ‘re-pricing’ may push companies to ‘go digital’ because of the pain. Moore recommends targeting a beachhead segment, a niche market with an intractable problem. What will oil and gas be missing in 2016?
Another problem is doing this from inside a large company or, ‘from the belly of the whale.’ Moore is on a ‘positive jihad’ to help large organizations allocate resources to ‘unwelcome’ next-generation projects that are ‘dilutive before accretive.’ But the digital oilfield is not an innovation challenge, it is a management challenge. Moore covered all bases in his conclusions observing that ‘Digitization is changing everything, it will change oil and gas’ but also that ‘the time horizon for venture capitalists and energy is not aligned,’ citing the ‘big smoking hole’ that green energy investment has left in VC finances.
BP’s Joe Anders presented on the application of analytics to well integrity. The Norsok D-010r4 standard was a starting point. BP’s well integrity workflow relies on copious data collection, even an operator’s smartphone photograph of a gauge which can be coded by a clerk back at base. For analytics, many different information sources come into play, accessed via a data integration layer and common data model. Anomalous conditions, such as a well’s annulus pressure breaching a limit, will email a notification to appropriate personnel who can use the system to investigate further and generate standardized reports for various conditions. Analytics make it possible to link precursor events, such as wells with 2 or more bleeds in the last month, to defects before they become serious. BP has evaluated various ad-hoc schematics packages and finally went for Well-Barrier. The tool now has 600 users.
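The precursor rule described above (wells with two or more bleeds in the last month) is a simple trailing-window count. This sketch is illustrative only: the function name, field layout and thresholds are invented, not BP's implementation.

```python
from datetime import date, timedelta
from collections import Counter

def wells_with_repeat_bleeds(bleed_events, today, min_count=2, window_days=30):
    """Flag wells with `min_count` or more annulus bleed-downs in the
    trailing window -- the kind of precursor rule described above.
    `bleed_events` is a list of (well_id, event_date) pairs."""
    cutoff = today - timedelta(days=window_days)
    recent = Counter(well for well, day in bleed_events if day >= cutoff)
    return sorted(w for w, n in recent.items() if n >= min_count)

# Hypothetical bleed-down log: W-12 has two recent events, W-07 does not.
events = [("W-12", date(2015, 5, 2)), ("W-12", date(2015, 5, 9)),
          ("W-07", date(2015, 4, 1)), ("W-07", date(2015, 2, 1))]
print(wells_with_repeat_bleeds(events, today=date(2015, 5, 15)))  # ['W-12']
```

In production such a rule would sit behind the data integration layer and trigger the email notifications mentioned above rather than a print statement.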
Amol Bakshi related some of Chevron’s lessons learned from model based real time production optimization. Ideally this concerns highly instrumented assets but this is not always the case. Early solutions used various software packages and integration platforms. Around 2010, Chevron recognized that it needed a coordinated enterprise-wide solution which is now operational. The first step in production optimization is to make sure that the asset is amenable to such an approach, not all are. Once a commitment has been made to model based decision making, an 80/20 solution that targets core objectives is designed, resisting scope creep. Key tools of the trade are Petex’ IPM suite and a ‘virtual field simulator’ that generates test data before first oil. We asked Bakshi what had come from his earlier work on semantic technology in Chevron. This appears to have been down-played as ‘finding people with semantic technology expertise is hard.’ SQL would seem to be the tool of choice at Chevron.
David Reed offered a retrospective of 20 years of managing BP’s drilling and completions application portfolio. BP made a move to Landmark’s EDM in 2005 and is currently migrating to the R5000 release as part of BP’s Chili project. The user community’s demands often far outstrip IT’s capability. User needs must align with architectural principles and BP’s bill of IT (buy not build and favor Windows over Linux). The IT architecture has evolved from flat file, through client-server and now data layer with pipes feeding five ‘pillar’ applications, Landmark’s Engineers’ Desktop, Well integrity, Kongsberg’s operations suite and GWET, a Sharepoint-based well engineering toolkit. BP has over 300 separate databases and many more stand-alone machines, a situation deemed unsustainable. The company is working to cut the database count by 60%. Chili combines IT discipline with new workflows for well delivery, CRS and deviation data management. The aim is to get the drilling target closer to geological target and reduce the ‘ping pong’ between engineers and geoscience. Safety-critical data is now a one way transfer via sanctioned applications. Witsml plays an increasing role in ‘making things less ambiguous.’
Big data and analytics proved popular themes this year. Minshen Hao (Chevron) has used population-based stochastic search to optimize steam generation in an EOR project. The object is to optimize the sum of multiple output vs efficiency curves. ‘Quantum-behaved’ particle swarm optimization is the technique of choice, a ‘simple, flexible cost minimizer that is entirely data-driven.’
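Particle swarm optimization is straightforward to sketch. The variant below is the plain PSO, not the ‘quantum-behaved’ flavor cited in the talk, and the toy cost function standing in for the summed generator-efficiency curves is invented for the example.

```python
import random

def pso(cost, dim, n_particles=30, iters=200, lo=0.0, hi=10.0, seed=1):
    """Plain particle swarm optimization: each particle is pulled toward
    its own best position and the swarm's global best. Entirely
    data-driven -- only the cost function is queried."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pcost = [cost(x) for x in X]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (0.7 * V[i][d] + 1.5 * r1 * (pbest[i][d] - X[i][d])
                           + 1.5 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i][:], c
                if c < gcost:
                    gbest, gcost = X[i][:], c
    return gbest, gcost

# Toy stand-in for summed steam-generator curves, with optimum at (3, 7).
best, best_cost = pso(lambda x: (x[0] - 3) ** 2 + (x[1] - 7) ** 2, dim=2)
print(round(best[0]), round(best[1]))
```

‘Simple and flexible’ here means the optimizer never needs gradients or a model of the steam plant, only evaluations of the measured output-versus-efficiency curves.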
Alireza Shahkarami (Saint Francis University) has been using artificial intelligence to model CO2 sequestration and EOR with a surrogate reservoir model (SRM) from Intelligent Solutions. This provided ‘fast track’ modeling of the complex Permian basin Sacroc field, with some 50 years of injection history. SRM uses ‘Latin hypercube’ experimental design. Results are claimed to be similar to those obtained with a CMG simulator, but while the simulator took 24 hours, the SRM takes ‘under a second.’
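A Latin hypercube design, as used above to pick the simulator runs that train the surrogate, stratifies each input dimension so every stratum is sampled exactly once. This is a generic sketch of the technique, not the Intelligent Solutions implementation.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube design on the unit cube: each dimension's range is
    split into n_samples equal strata, each sampled exactly once, giving
    better coverage per run than plain random sampling."""
    rng = random.Random(seed)
    design = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # independent stratum order per dimension
        design.append([(s + rng.random()) / n_samples for s in strata])
    # Transpose the per-dimension columns into a list of sample points.
    return list(zip(*design))

points = latin_hypercube(5, 2)
print(len(points))  # 5 points, one per stratum in each dimension
```

Each point would then be scaled to the physical parameter ranges (injection rates, permeabilities and so on) before being handed to the full simulator.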
Inge Svensson introduced Baker Hughes’ new big data platform for real time drilling data and rig performance optimization. Real time analytics shifts the focus from pure non productive time to other, ‘invisible’ causes of lost time including human factors. The system includes a NoSQL database, HTTPS connectivity and a Witsml server. KPIs show fine grained activity break-down and opportunities for performance improvement. Statoil is a user.
Peerapong Ekkawong showed how PTT E&P has used the Matlab linear optimization toolbox to increase production from a Gulf of Thailand gas field. The Excel to Matlab to Excel workflow improves on manual fine tuning. The work was performed in collaboration with Texas A&M’s model calibration and efficient reservoir imaging program Mceri.
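The kind of rate-allocation linear program such a workflow solves can be illustrated with a toy two-well example. The solver below enumerates constraint-intersection vertices, a stand-in for Matlab's linprog that is fine for toys but not for production use; the well limits and shared-capacity figures are invented.

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Tiny 2-variable LP by vertex enumeration: maximize c.x subject to
    A x <= b. An optimum of a feasible bounded LP lies at a vertex, so
    checking every pairwise constraint intersection suffices in 2D."""
    def feasible(x):
        return all(a[0]*x[0] + a[1]*x[1] <= bi + 1e-9 for a, bi in zip(A, b))
    verts = []
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0]*a2[1] - a1[1]*a2[0]
        if abs(det) < 1e-12:
            continue  # parallel constraints, no intersection
        x = ((b1*a2[1] - b2*a1[1]) / det, (a1[0]*b2 - a2[0]*b1) / det)
        if feasible(x):
            verts.append(x)
    return max(verts, key=lambda x: c[0]*x[0] + c[1]*x[1])

# Maximize total gas from two wells: per-well limits of 8 and 6 (say,
# MMscf/d) plus a shared processing capacity of 12; rates non-negative.
A = [[1, 0], [0, 1], [1, 1], [-1, 0], [0, -1]]
b = [8, 6, 12, 0, 0]
q1, q2 = solve_lp_2d([1, 1], A, b)
print(q1 + q2)  # 12.0 -- the shared capacity is the binding constraint
```

The Excel-in, Excel-out wrapper in the PTT E&P workflow would simply populate `A`, `b` and the objective from spreadsheet cells and write the optimal rates back.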
IBM’s Mike Brulé provided some evocative imagery of the evolving corporate data lake, ‘from data swamp to data reservoir.’ The big data reservoir (BDR) holds data that has been cleaned-up in the data refinery. The oil and gas industry is still trying to figure out how to use big data. Modeling with data driven methods is all very well but it is better still to understand cause and effect. Empirical ANN is gaining ground alongside traditional physics-based models. The BDR is a Hadoop/MapReduce ‘landing zone’ and analytics sandbox for all oil country data, both structured and semi structured. ‘Polyglot’ data repositories include SQL, NoSQL, Hadoop and GraphDB. Use cases include DTS analytics and real time pipeline leak detection. More on the BDR in the IBM Redbook ‘Governing and Managing Big Data.’
Frans van den Berg reviewed Shell’s experience of smart fields and the collaborative work environments (CWE). Smart and standard processes are used in surveillance, production optimization, maintenance and emergency response. There is ‘one team, one plan and one set of KPIs.’ Exception-based surveillance plugs into guided workflows for follow up and alerting to asset teams. Today the drive for a CWE comes from local assets who ‘want to work this way.’ The CWE increases efficient decision making through dialog.
ExxonMobil’s Robert Aydelotte provided an update on Energistics’ ProdML extension for PVT and fluid characterization data exchange. Modeling has now migrated from Excel to UML. The spec can capture an audit trail of sample chain of custody, adding context and ownership information along the way. Also included are fluid analytical tests and sample descriptions. The spec is currently in the test phase and should be available in ProdML 2.0 later this year. Energistics’ common technical architecture was leveraged such that fluid/PVT data can be included in simulator-ready Resqml decks.
In the ‘Future vision’ session, Mark Little described GE’s FastWorks initiative that seeks to foster a start-up mentality inside the whale that is GE. GE uses around 300 3D printers in various manufacturing contexts. A ‘brilliant’ 21st century factory leverages full physics modeling, model-based manufacturing and no more paper drawings. ‘Everything is becoming connected’ (including incidentally a young lady in the row in front of us who was checking out some shoes on Yahoo shopping).
NASA’s Steven Fredrickson offered insights from space flight. We tend to overestimate what can be done in the short term and underestimate what can be done in ten years or so. NASA was an early adopter of digital representations, from early electronics onwards. He mentioned ‘human-in-the-loop’ simulation, Monte Carlo modeling and the ‘digital double’ approach where SysML modeling enables a ‘computable representation of everything.’ On-board intelligence deploys Watson-inspired systems for exploration anomaly resolution a.k.a. ‘mission control in a box.’ NASA has migrated from paper through Adobe PDF to XML documentation in the international procedure viewer, IPV.
David Rossi spoke of Schlumberger’s ‘new’ early stage investment program (actually announced in 2012) in promising start-ups like Foro Energy, whose high powered lasers are used in well abandonment. Schlumberger is also working with Aramco on ‘electric completions’ in multi-lateral extreme contact wells and on a collaboration with Google to use its search technology in oil and gas. Rossi also revealed that Schlumberger now has a total of 26 petaflops of compute capacity in Houston, which would put it in the top five of the Top500 classification (if it entered). The Q&A revealed that nobody actually wanted to wear Google glasses, considered ‘a bit creepy.’ Rossi opined that ‘if you can write a flowchart for a job, it will disappear and be done by automation or robotics.’ Asked what job they would recommend to youngsters, the engineers recommended ‘engineering.’ Our cheeky question, ‘what would Watson say if he were on the panel?’ was elegantly fielded by IBM’s Curry Boyle who suggested, ‘stop watching those kitten videos!’
* A metaphor too far?
Speaking at the 2015 PPDM Houston Data management symposium and tradeshow, Jerrod Stutzman provided an in-depth review of Devon Energy’s evolving geographic information systems. Stutzman kicked off with a geospatial 101 along with some tips on geographic data QC. Once data is clean it needs synchronizing across enterprise applications with tools like FME, Informatica, ArcGIS, TIDAL Scheduler and cron jobs. Geoprocessing can be applied to datasets for instance to populate a PPDM well_area polygon or to generate a well pad schematic.
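The ‘populate a well_area polygon’ geoprocessing step can be illustrated with a stdlib-only sketch: a convex hull over hypothetical well surface locations stands in for the ArcGIS/FME tooling Devon actually uses:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o): >0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop each chain's last point (it starts the other chain)
    return lower[:-1] + upper[:-1]

# Hypothetical well surface locations (easting, northing) for one lease;
# the interior well is dropped from the boundary polygon.
wells = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]
print(convex_hull(wells))
```

The resulting vertex list is what would be written to the well_area polygon record, here in arbitrary projected coordinates.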
Devon deploys a heavyweight IT stack for GIS support. Geo data from domain specific applications is consolidated to storage on ArcSDE and PostGIS. ETL* jobs run overnight to feed data to applications such as ArcGIS, OpenGeo, Geocortex and FME.
Future development includes a Voyager GIS deployment for geospatial search, ‘virtualization’ of source data sets and real time capability. A sensor data infrastructure is also planned, leveraging technology from OSIsoft and Esri. Devon’s geospatial data management fits into Devon’s ongoing big data initiative which includes a Hadoop cluster, master data management and data virtualization with Informatica data services.
Jim Crompton from Noah Consulting elaborated on the big data theme. A mastery of multiple big data sets is required to combat the low ROI of unconventionals. This will come through a transformation of the oil and gas solutions model via an interoperable ‘system of systems,’ enabling a shift from high-maintenance custom coded development to an ‘open industrial interoperability ecosystem’ that allows for configurable, plug-and-play reuse of open systems. Enter the Mimosa-backed oil and gas interoperability pilot, a ‘systems and information (as opposed to application) centric architecture.’ Check out the working documents from the interoperability testbed on the Mimosa website.
Presentations from the PPDM Houston show are now online.
* Extract, transform, load.
Kevin Sullivan has been named CEO of Advanced Control Systems.
Executive VP and CFO Mark Sullivan is to retire from AspenTech.
COO Jim Teague will succeed Michael Creel as CEO of Enterprise Products Partners when Creel retires at the end of 2015.
George Kirkland is to retire as Chevron’s vice chairman and executive VP upstream. Joe Geagea is now executive VP technology, projects and services. Henry Swartzlander is also to retire from his position as unit manager, process automation.
Ryan Schneider is now CMG’s COO.
José Rivera is the new CEO of the Control system integrators association. He hails from Schneider Electric.
Scott Berkey has been appointed MD of Dassault Systèmes’ North American business. Bruno Latchague is now senior executive VP.
Mel Riggs has succeeded Clayton Williams as president of the eponymous company.
Andrew Slaughter is the new director of the Deloitte Center for energy solutions. He hails from IHS’ Energy Insight unit.
Régis Clot is now MD of Georex. He was previously with Schlumberger.
Golder Associates has appointed Hisham Mahmoud as President and CEO.
Paul Simons is the new deputy executive director of the International Energy Agency.
Didier Houssin is the new chairman of IFP Energies Nouvelles.
John Kerr is now a partner at New Digital Business.
Deirdre Michie has replaced retiree Malcolm Webb as Oil & Gas UK CEO.
The OPC Foundation named Stefan Hoppe as VP.
Petrotechnics has appointed Stuart Douglas as regional sales manager in its Abu Dhabi office and Suhas Jadhal as Middle East business consultant.
Sullexis’ Alejandro Del Palacio has been appointed as President of PIDX. Schneider Electric’s Scott Fleck is Executive committee chair. GE Oil & Gas CIO Angela Tritzo has been elected to the PIDX board.
Anthony Greer has been appointed Rock Solid Images’ COO.
Dean Rietz is the new president of Ryder Scott, replacing Fred Richoux, who will remain on the board. Guale Ramirez is executive VP.
Amin Nasser has been named as acting president and CEO at Saudi Aramco.
Smith Flow Control has hired Sunil Verma as regional sales manager for India. He was previously with Emerson.
Colin Graham heads-up TAM International’s new office in Stavanger, Norway.
TD Williamson has promoted Robert D. McGrew to the position of president and CEO.
Westbridge has appointed Kevin Broger as technical advisor. Broger is currently president of Out Front of the Curve.
OES Oilfield Services has promoted Jonathan Hall to country manager, Saudi Arabia and Bahrain.
Michael Hutchinson has joined the Oneok board of directors. He is a retiree from Deloitte & Touche.
Peterson has appointed Loek Sakkers as director of projects and Stephen McCrindle as supply chain manager, within its offshore business.
Tim Donaldson has been appointed as Progea’s North America Sales Director.
Bobby Bryan is now CEO of Schramm. He was previously with National Oilwell Varco.
The SPE and Oilpro are to launch a job search tool for SPE members.
Statoil has appointed Anders Opedal as executive VP and COO.
Saudi Aramco Energy Ventures has taken an equity stake in Target Intervention AS, a Norwegian downhole tool specialist developing next generation intelligent coiled tubing tool solutions. Target’s first main product is an electric isolation tool for stimulating unproductive zones.
Wood Mackenzie has purchased the Petroleum Service Group from Deloitte LLP.
ISN has acquired IT consultancy Virtual Stream to bolster its oil and gas expertise and presence, with an Aberdeen base. Virtual Stream also brings expertise in Citrix, VMware and Microsoft applications. ISN now employs 80 upstream oil and gas specialists.
Divestco has sold its Land Software assets (LandRite, iLand and MapQ) to Pandell Technology Corp. for cash which has been used to repay a $4.5 million short-term loan.
Quorum Business Solutions has acquired Fielding Systems, a
provider of cloud-based data applications for the oil and gas sector.
RPS has acquired Oslo-based project management specialist Metier Holdings AS. Metier will join up with OEC Group, acquired by RPS last year.
Start Scientific has acquired Quality Energy Solutions in a paper transaction. Start is looking to raise substantial capital to expand QES through additional contracts and by acquisitions of other small service companies.
Dell Computer’s annual threat report for 2014 has it that the high profile breach of retailer Target’s point of sale systems ‘came indirectly through the company’s HVAC vendor, who received deeper user permissions than needed.’ Dell also reports a twofold increase in attacks on Scada systems, up to 675,186 in January 2014, many down to buffer overflow vulnerabilities.
The Object Management Group, with backing from the White House, has instigated a ‘Threat and Risk community’ and has put presentations from its inaugural cross-domain threat and risk information exchange day online.
Honeywell Process Solutions has announced the Cyber Security Risk Manager, a ‘digital dashboard’ to proactively monitor, measure and manage cyber security in multi-vendor control systems.
NIST has just published guidance on security and privacy assessments of mobile apps. The Special Publication 800-163, ‘Vetting the security of mobile applications’ targets a government audience, but should also benefit private industry developers and enterprise security professionals. Another NIST publication looks at the impact of ‘defensive code’ on software performance and finds that hardening software does not in general negatively affect performance.
The Petroleum Industry Data Exchange has recommended a new practice for all PIDX users to enhance transaction security. PIDX members and trading partners need to migrate from SSL 3.0 to TLS ‘as soon as possible.’ More from the SSL V3 Best Practices white paper.
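In Python, for example, a client can enforce the recommended migration by pinning a minimum TLS version on its SSL context. The TLS 1.2 floor below is an illustrative choice; the PIDX white paper governs the actual requirement:

```python
import ssl

# Client-side context that refuses SSLv3 and early TLS handshakes outright.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSL 3.0, TLS 1.0 and 1.1
ctx.check_hostname = True                     # default for TLS_CLIENT, made explicit
ctx.verify_mode = ssl.CERT_REQUIRED           # require a valid peer certificate

print(ctx.minimum_version)
```

Any trading partner still offering only SSL 3.0 will then fail the handshake rather than silently downgrading.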
Olivestar LLC has filed a patent infringement suit against Chevron and 16 other US oil and gas companies, from Anadarko to Occidental, for alleged infringement of US patent 8,239,481 B2, ‘a method for controlling devices in a computer system.’
TDE Petroleum Data Solutions has filed against Moblize Inc. for alleged infringement of its US patent 6,892,812 B2, an automated method and system for determining the state of a well. Accused products include Moblize’s ‘Well at a Glance.’
The wheels seem to have come off Acacia’s attempt to sue Schlumberger (Oil IT Journal Vol. 20 N°1) over Petrel’s alleged infringement of Austin Geomodeling’s 3D geology modeling patent. Kathryn Rubino’s (AboveTheLaw) analysis makes for interesting reading.
The zeitgeist of the Schlumberger ruling was reflected in a TelecomTV blog report on the new Patent Act, a bi-partisan bill in the US Congress that sets out to end abusive litigation by ‘whacking the trolls right in the pocket book.’
A joint AppianWorld presentation from Austin Rosenfeld (Macedon Technologies) and Achal Augustine (Flowserve) showed how oil country pump and valve manufacturer Flowserve has created a global online service center portal to manage the pump repair workflow and documentation lifecycle, leveraging Appian’s mobile, social, cloud and data features.
Sales engineers can now initiate repair requests from mobile devices in the field, approve quotations and track and update the repair status. Pictures, voice and video recordings can be attached to jobs to provide ‘as found’ information on equipment items. The cloud-based software provides a ‘single source of truth’ via ‘on the glass’ integration of multiple ERP system data on one screen. More from Appian and systems integrator Macedon Technologies.
Netherlands-based Culgi has co-authored (with Shell) a book chapter on the ‘Application of computer simulations to surfactant chemical enhanced oil recovery.’ A new release, V9.0 of its eponymous modeling package enhances the NWChem quantum chemistry module to allow for ‘orbital visualization and the treatment of surfaces.’ The Monte Carlo engine can now handle free energy calculations. A database of 25,000 molecules has been added to the package. Culgi is currently in contact with a number of companies interested in applying its chemical modeling to shale, and recently signed with an unnamed US oil company to ‘enable IT solutions for digital chemical oil.’ More from Culgi.
ABB has added Kepware’s KEPServerEX communications platform to its flagship 800xA integrated control and safety system.
Cambridge Pixel has supplied its Secure-X radar modules to Sofresud’s vessel monitoring system for a West African oilfield security project. The software was developed for Total Nigeria.
OptaSense has partnered with Weatherford International to deliver integrated optical sensing solutions for drilling and production.
ABB and IBM have signed a multi-year ITC service contract with Wipro and BT.
E.ON has acquired the Allegro 8 platform to support its retail business in Spain. Implementation will be led by Indra.
Aptomar has been awarded a five-year contract by BG Group to provide field monitoring services on the Norwegian Knarr field. Aptomar has teamed with O.M. Rønning to provide spill detection on Eni Norge’s Goliat field.
Kentz will deploy Saft’s Sunica Plus batteries on Qatar Petroleum’s Dukhan oilfield, where they will provide energy storage and backup power for wellhead industrial control systems and corrosion protection systems.
Siteworx recently redesigned Cameron International’s website and assisted with digital and content strategy.
CB&I has been awarded an approximately $50 million contract to provide engineering, procurement, fabrication and construction of seven storage tanks near Fort McMurray, Alberta, Canada.
Core Specialist Services is to offer Perigon’s iPoint as a platform for core data visualization and reporting.
Dexter + Chaney has partnered with Pensacola to resell its Spectrum construction software.
GE and Pertamina Drilling Services are teaming on the development of drilling equipment and solutions for onshore and offshore applications.
IFPEN and INRIA have signed a framework research agreement to co-develop methodologies and algorithms from research in computational science.
Yinson Holdings has implemented IFS Applications across its onshore and offshore operations.
CH2M HILL is to implement Intergraph SmartPlant Enterprise across its Energy and Chemicals Division.
Kongsberg has partnered with ABT oil and gas on a proof of concept of unattended facilities for the development of marginal fields. Saipem has awarded Kongsberg a contract to deliver subsea structures for the gas export pipeline project of the INPEX-operated Ichthys LNG Project.
OTM Consulting has partnered with Rice University’s Alliance for technology and entrepreneurship to provide industry expertise and financial support to technology start-ups.
Rock Flow Dynamics reports sales of its tNavigator flagship to EON, Repsol, Vermilion Energy, Lundin, and Rocksource.
Turkish Petroleum has acquired Paradigm’s Reverse Time Migration (RTM), a solution for imaging seismic data in areas of complex wave phenomena. Eco-Engineering Solutions has selected Paradigm Sysdrill to provide drilling support services to clients.
ABB Benelux has acquired a license to Wish Software’s AutoChart. Statoil is to use Wish’s VisualGIS Server and HydroGIS to make digital video and accompanying survey data available across the Statoil Network.
Dana Petroleum has awarded Technip a brownfield subsea contract for its Triton FPSO. Technip and consortium partner COOEC have been awarded a FEED contract by CNOOC for topsides (including two drilling rigs), hulls, mooring and riser systems.
OneGeology has published a cookbook on how to serve map data using a web coverage service. WCS produces images with MIME types like png, jpeg or gif that can be displayed natively in common web browsers.
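A WCS GetCoverage request is just a parameterized HTTP GET; the sketch below assembles one with the Python standard library. The endpoint and coverage name are hypothetical placeholders, not from the OneGeology cookbook:

```python
from urllib.parse import urlencode

# Parameters for a WCS 1.0.0 GetCoverage request; endpoint and coverage
# name below are hypothetical examples.
params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "bedrock_geology_625k",   # hypothetical layer name
    "crs": "EPSG:4326",
    "bbox": "-6.5,49.8,1.8,55.9",         # minx, miny, maxx, maxy
    "width": 512,
    "height": 512,
    "format": "png",                       # browser-displayable MIME type
}
url = "https://example.org/wcs?" + urlencode(params)
print(url)
```

Pointing a browser at such a URL (against a real server) returns the coverage rendered as a png, per the MIME types the cookbook describes.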
There has been sparring in the US Congress over the mandatory use of the XBRL standard for company reporting. The debate centers on the cost of XBRL compliance and some organizations are mobilizing against this, the 4th attempt to pass the legislation.
The International Accounting Standards Board (IASB) has published the 2015 edition of IFRS as Global Standards: a Pocket Guide. Written by former IASB member Paul Pacter, the guide provides a summary of the use of IFRS in 138 countries around the world. The summaries are condensed versions of full jurisdiction profiles available on the ifrs.org website.
Nature has published the Nature Publishing Group’s core ontology, used in publishing its articles and journals and classifying article types and subjects. The ontology includes OWL classes and properties as well as a select number of SKOS taxonomies.
The OGC has published its ‘Guide for Software Acquisition’ as an official OGC white paper providing an overview of its compliance process and emphasizing the benefits of acquiring OGC-compliant products, as opposed to non-certified products that implement OGC standards.
Speaking at the 2015 IQPC Oil and Gas Communications summit in Houston, Shane Meche stated that the number one requirement for oil country telecommunications is predictability. Meche enumerated the many factors, from transmitter power through bandwidth, antenna gain, terrain and weather, that influence communications. He showed how consistent communications can be achieved even across a complex environment of a Canadian oil sands development. Large mobile mining and transportation units require constant communications to share data on machine health, autonomous operations, fatigue management and more.
Meche, like other presenters, sees the Long term evolution (LTE) spec as the future of oil country communications. Bob Prichard (Proactive Energy Services) gave a detailed backgrounder on LTE, which should provide high bandwidth to ‘nomadic’ applications but may be overkill for fixed sites where its cost may be hard to justify. LTE should also be a good option for ‘hybrid service applications.’
IBM unit Aspera was also presenting its fasp general purpose data transfer technology that promises maximum bandwidth use in long hop data transfers over the public internet. More from IQPC.
To date, PG&E has paid out more than $500 million in claims to the victims and families of the 2010 San Bruno pipeline explosion, established a $50 million trust for costs related to recovery and contributed $70 million to support recovery efforts. PG&E has also earned ISO 55001 and PAS 55-1 certification for its safety systems. These include software from Arcos to automatically call out crews in response to emergencies. Voice-activated technology encourages safe cellphone use.
The IOGP has released several new publications: ‘Guidelines for implementing well operations crew resource management training,’ ‘High integrity protection systems’ and ‘ESH, risk and impact management.’
The US Pipeline and Hazardous Materials Safety Administration has released Report 14, a Guide for communicating emergency response information for pipelines.
Asset management specialist Meridium has an interesting take on a range of new technologies that maybe aren’t so new after all. CEO and founder Bonz Hart opined at the Meridium user conference that ‘While everyone is talking about the potential of Big Data, M2M, the Cloud, Advanced Analytics and the internet of things, we have been using these technologies since 1993!’ Hart and CTO Eddie Amos demoed the next generation of Meridium APM with a new interface that can be configured by subject matter experts (not developers) and cloud functionality. Meridium also announced APM Connect, a service bus for integrating asset data, and an integration ‘center of excellence’ offering development, support and services.
The latest 4.0 release of Meridium Enterprise APM includes risk-based inline inspection and thickness monitoring management. The new functionality covers onshore pipeline systems enabling users to manage fixed, rotating, and linear assets within a single solution. A risk calculator now supports threat-based ‘recognized and generally accepted engineering practices.’ At the show Total/Aramco joint venture Sabic was awarded the 2015 APM best practice award for its integrity change management program.
A new white paper from completion specialist Tendeka covers autonomous inflow control technology. Managing inflow and reservoir sweep is often a limiting factor on the length of a horizontal well as the benefit of greater reservoir contact is offset by increased differential drawdown and a tendency to cut across formations of different permeability. Passive inflow control devices (ICD) are often used but cannot be adjusted in the event of water or gas breakthrough.
Autonomous ICDs (AICD) offer greater control of fluid influx, mitigating breakthrough. They can lead to greater recovery and lower water cut. To manage such devices, Tendeka has developed a comprehensive multi-phase flow model of AICD performance which can be used to model inflow distribution along the wellbore. The model has been validated by extensive testing in a multi-phase flow loop and real world results from over 10,000 Tendeka AICDs in some 50 wells around the world. Tendeka recommends checking flowing bottom hole pressures with a steady state computation such as provided by Landmark’s NETool for model verification. Tendeka’s AICD evaluation workflow was originally developed by Statoil.
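The behavior such a model captures can be sketched with the widely published RCP-valve equation, in which pressure drop scales with mixture density and inversely with viscosity, so water and gas are choked harder than oil. The calibration constants below are hypothetical illustrations, not Tendeka’s:

```python
def aicd_dp(q, rho_mix, mu_mix, rho_cal=1000.0, mu_cal=1.0,
            a=0.002, x=3.0, y=0.5):
    """Pressure drop (bar) across an autonomous ICD at volumetric rate q (m3/h),
    after the widely published RCP-valve form:
        dP = (rho_mix^2 / rho_cal) * (mu_cal / mu_mix)^y * a * q^x
    Denser, less viscous fluid (water, gas) sees a higher dP, choking back
    breakthrough zones. a, x, y are hypothetical valve-calibration constants."""
    return (rho_mix ** 2 / rho_cal) * (mu_cal / mu_mix) ** y * a * q ** x

# Same rate through the valve: oil vs. water after breakthrough
oil_dp = aicd_dp(q=5.0, rho_mix=850.0, mu_mix=5.0)
water_dp = aicd_dp(q=5.0, rho_mix=1000.0, mu_mix=0.5)
print(oil_dp, water_dp)  # water is choked harder than oil at the same rate
```

Summing such a per-valve relation along the completion gives the inflow distribution that Tendeka’s full multi-phase model resolves, and that a steady-state tool like NETool verifies.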
Speaking at the PPDM Oklahoma City data management luncheon, Jason McKittrick introduced a joint production analytics offering from Microsoft, OSIsoft and Neal Analytics. Following a quick tour of Excel’s pivot tables, graphics and (Bing) data maps, McKittrick showed how Microsoft’s Power BI can be used to link on site data with business intelligence functionality in the cloud via a ‘data management gateway.’
The Microsoft tools now link to the plant or platform via OSIsoft’s PI System, notably with a new ‘self-service BI for PI offering.’ Other novelties on the horizon include machine learning in the Microsoft Azure ‘data factory,’ a PI integrator for Azure and an ability to combine textual information with real time operational data for KPIs in-context.
Excel is presented as the client of choice for what proved to be a rather bewildering amount of slideware. But the machine learning self-service data science available in PI System 2015 looks pretty compelling with interfaces for R, Python and even open computer vision, OpenCV!
Bill Barna then showed how some of the above components have been leveraged in Neal Analytics’ oil and gas predictive analytics offering. This includes tank level monitoring and forecasting using Azure machine learning. Neal uses neural networks, Poisson and ‘decision forest regression’ along with model cross validation to ensure a model generalizes well to new tanks. Presentations are available from PPDM.
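The cross-validation step Barna described can be sketched in stdlib Python; the mean-value predictor below is a deliberately trivial stand-in for the neural network and forest regressors named in the talk, and the tank levels are hypothetical:

```python
import statistics

def kfold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross validation."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test

def cv_mae(y, k=5):
    """Mean absolute error of a mean-value predictor under k-fold CV.
    A low, stable held-out error suggests the model will generalize to
    tanks it was not trained on."""
    errs = []
    for train, test in kfold_indices(len(y), k):
        pred = statistics.mean(y[j] for j in train)  # 'model' fit on train fold
        errs.extend(abs(y[j] - pred) for j in test)  # scored on held-out fold
    return statistics.mean(errs)

# Hypothetical daily tank levels (bbl)
levels = [410, 415, 430, 428, 440, 455, 451, 462, 470, 480]
print(round(cv_mae(levels), 1))
```

Swapping a real regressor in for the mean predictor changes only the line marked ‘model’; the fold bookkeeping is the part that guards against overfitting.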
In an interesting announcement from a different slide deck, Microsoft reports that Rockwell Automation is using Microsoft Azure ‘Internet of Things’ to connect disparate systems in the petroleum supply chain. Rockwell client Hilcorp Energy monitors ‘cloud-connected’ submersible pumps from its control rooms.
OSIsoft systems integrator RoviSys has teamed with industrial internet solutions provider Bit Stew to bring utilities-style real time intelligence to the upstream. Bit Stew’s MIx platform provides integration, visualization and business intelligence across disparate data sources. RoviSys initially worked with Bit Stew on a contract to provide a pipeline integrity solution to utility Pacific Gas & Electric but realized that the technology could also be used to address upstream automation challenges.
Bit Stew’s Ron Pequette said, ‘MIx can manage billions of data points and provide actionable insights into operations. Oil and gas is a natural fit for both our companies.’ Bit Stew’s MIx Core platform automates data ingestion and uses machine learning to reveal actionable insights that optimize performance. The MIx Director (formerly Grid Director) client provides users with an in-context real-time view of operations and assets and, for utilities, customers.
San Diego, California-based Mtelligence Corp (Mtell) is to extend its equipment health management software with a ‘big data’ platform from Hadoop boutique MapR. The new Mtell Reservoir offering includes Hadoop, Mtell Previse and the open source time-series database OpenTSDB. The system ingests and analyzes real-time sensor and historical data alongside maintenance data generated by rotating machinery and other equipment on oil rigs and other industrial plants.
Mtell Reservoir is an enterprise historian that distributes disk access and CPU processing across MapR clusters. ‘Orders of magnitude’ improvements are claimed over current plant historians. Mtell reports proven loading of over 100 million data points per second on four servers and linear performance scaling with the number of servers.
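For a sense of the ingest path, an OpenTSDB data point is posted to its /api/put endpoint as a small JSON document; the metric and tag names below are hypothetical examples for a pump-vibration sensor:

```python
import json
import time

# One OpenTSDB data point: a metric, a Unix timestamp, a value and a set
# of tags that the database indexes for query-time filtering.
point = {
    "metric": "rig.pump.vibration",          # hypothetical metric name
    "timestamp": int(time.time()),
    "value": 2.37,                           # e.g. mm/s RMS
    "tags": {"rig": "platform-07", "pump": "P-101"},
}

# /api/put accepts a JSON list, so points can be batched per HTTP request
payload = json.dumps([point])
print(payload)
```

Batching many points per request, and sharding them across servers, is what underpins the per-second loading figures claimed above.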
Engineer WorleyParsons is to embed its understanding of EPC projects and brownfield asset improvement to provide a ‘digital asset’ service to clients. The service includes the creation and maintenance of digital plants leveraging Aveva’s flagship PDMS toolset. The service will support lifecycle engineering data including handover of information in a ‘consistent and validated format’ for brownfield ‘as-built’ and/or greenfield projects.
Flagship client for the digital asset is Abu Dhabi-based ADMA-OPCO whose aging, complex assets have seen some forty years of modifications and maintenance and whose documentation no longer reflected the true state of the facilities. The project used laser scanning to create as-built 3D models from point cloud data. Aveva NET has been customized to warehouse the facilities’ huge quantities of documents and data. The project features an ongoing progressive sequence of handovers and is scheduled for completion in 2015. A new white paper, ‘The Digital Asset Approach - Defining a new era of collaboration in capital projects and asset operations,’ is a free download from Aveva.