Hey-ho, another year comes to a close and so much that I planned or wanted to do remains undone. OK, we did manage to do the survey, which gave some pointers as to where we should be taking Oil IT Journal and www.oilit.com next. But as for implementing major changes? Zilch!
The problem is that the time taken to prepare Oil IT Journal has, over the last couple of years, expanded considerably. No major changes, but a gradual increase in the detail and scope of our coverage, which I hope you have noticed. I certainly have, as producing the Journal is taking longer and longer. Hold that thought…
Just before I left the SEG in Denver I came across the exhibit of the SEG Advanced Modeling Corp. (SEAM), which produces 3D synthetic seismic datasets of various geologies that are used to test imaging algorithms. The EAGE provides similar datasets with its Marmousi models.
SEAM is about to embark on a new ‘Life of Field’ (LoF) project, to build a static earth model, populated with rock physics and reservoir fluid parameters. This will be used to simulate seismics as before, but with repeat models run as oil and gas is extracted from the field, to produce a 4D time lapse seismic dataset. The SEAM prospectus states that ‘Most of the ingredients of such a [data driven production] strategy are already available. What is now needed is a productive way of linking [everything] together in an integrated system.’ SEAM’s starting point will be a ‘highly realistic earth model.’
Now when this was presented to me I had a eureka moment. Having tracked the progress of Energistics’ Resqml earth modeling initiative for a while, notably with a review of the project’s book, I thought, wow! This is tailor-made for Resqml.
The idea of combining SEAM and Resqml, I should say my idea of combining them (false modesty is not my strong suit), has many obvious benefits. On the one hand it is an ‘industrial strength’ test of Resqml’s adaptability to a ‘use case’ not a million miles removed from its original purpose. On the other hand, seismologists, instead of creating and using ‘yet another’ model format, will be using an industry standard. This is a no brainer, or so I thought.
Back in the office I fired off a few emails putting my suggestion to both the SEAM and Resqml teams. I waited. There was a deafening silence, the like of which I have not heard (sorry for the strange metaphor) since my March 2010 open letter to the SPE president. Eventually, after a couple more pings and prods, I got a lukewarm response from Resqml along the lines of, ‘it’s not a bad idea, maybe a couple of versions down the road we’ll be able to do something.’
But that of course means missing a great opportunity. I pinged some more and I have to say that I seem to have got some folks thinking about this possibility although I have not heard anything concrete back. I will keep you posted if and when I do.
There are plenty of good reasons why a SEAM/Resqml collaboration should go ahead. The spirit of collaboration is in the air—think of the Standards Leadership Council, for instance.
I can also think of a few bad reasons for not doing this. While people love to talk about collaboration and ‘breaking down the silo walls,’ actually doing something to achieve it is harder and usually outside folks’ comfort zones. It is easier to stay with the tried (and tired) old methodologies of the past than to adapt to meet someone else’s requirements.
It could be that some facets of Resqml do not lend themselves to the seismic test. It could be that it doesn’t work at all! So there is an element of a challenge in my innocent ‘suggestion’ too. Make it work guys!
During one exchange, my interlocutor expressed ‘intrigue’ at my interest and involvement. This got me thinking. I was intrigued too. Why should I care how the majors and standards bodies run their projects? The simple answer is that this is what editors do, inform and influence.
Actually we have been active in similar fields in the past. Our persistent questioning of the supposed benefits of the semantic web in engineering has contributed to, let’s say, a more moderate presentation of these. We were also, as I learned some time after the fact, influential in mitigating a hook-up between PPDM’s standard data model and Esri’s proprietary technology. Oil IT Journal is an activist as well as an information source.
Which brings me back to my workload. I want to keep traveling to and reporting from conferences, to give more attention to the web site and to be active and influential. Regarding the website I have a few projects in mind. First, I want to bring search in-house and make it smarter—think Lucene maybe. I also want to try to get something along the lines of IBM Watson running against our two million plus word information asset—think Uima. Finally, I want to create an ecosystem of websites of relevance to what we (and, I hope, you) are really interested in.
To facilitate all this, we are going to cut back slightly on the number of issues that we publish. Starting in 2015, there will be 10 issues (down from 11) spread out through the year. In keeping with the ‘activist’ spirit, I think that we will revive the French revolutionary calendar with its 10 decimal ‘months.’ Watch out for the inaugural ‘Pluviôse’ edition next year.
Speaking of which, all the best from Teresa and myself for 2015. @neilmcn
Communicating in real time or near real time, between drill rig and bottom hole, is something of a holy grail for the digital oilfield. At the low end of the downhole digital spectrum, conventional logging while drilling uses mud pulses to achieve bit per second bandwidth. Other applications record data to a memory chip downhole for later retrieval—offering high data volumes but no real time or control loop functionality.
Digital oilfield promoters have cited high bandwidth downhole communications as contributing to the ‘data deluge,’ so we thought it would be a good idea to take stock of current downhole bandwidth via some recent presentations and publications.
A paper in the Autumn 2014 issue of Schlumberger’s Oilfield Review describes an alternative approach to downhole telemetry. Schlumberger’s Muzic wireless (acoustic rather than radio) system is used for well testing. Muzic uses a string of acoustic repeaters clamped onto the tubing string to provide bidirectional communications between the test unit and the surface. Data rates are lower than those of memory devices but the ability to control gauges and valves during testing is a plus.
At the Amsterdam SPE ATCE earlier this year, National Oilwell Varco presented a field ‘premiere’ of along-string dynamic measurements for drilling optimization in the Eagle Ford shale. The system uses real time downhole data streaming from the bottom hole assembly via the IntelliPipe wired drill string. This offers a 57,600 bps bandwidth used to monitor and mitigate drilling dysfunctions such as stick/slip and bit bounce.
In a follow-up presentation, NOV teamed with Halliburton and ConocoPhillips Norge to show similar successes in a North Sea wired drillpipe deployment. Here the bi-directional data network was used for early pack-off detection and to mitigate drill string vibration. ‘Memory quality’ LWD measurement was transmitted uphole in real time and the downlink allowed for control of downhole tools. The system also produces high quality image logs, used for geosteering through fault zones.
So is the data deluge living up to its name? Well, if 57.6 kbps were acquired continuously, that would make for around 622 MB/day, or some 12 GB for a 20 day well. That is quite a lot of data, but not nearly as much as the terabytes of a seismic survey, nor indeed as much as all the Scada/DCS systems at the rig itself produce.
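The back-of-the-envelope arithmetic can be scripted as a sanity check (the 57,600 bps figure is from NOV’s presentation; the 20 day well duration is an assumption):

```python
# Back-of-the-envelope data volume for a wired drill string.
bits_per_second = 57_600       # IntelliPipe bandwidth cited by NOV
seconds_per_day = 86_400

bytes_per_day = bits_per_second / 8 * seconds_per_day
mb_per_day = bytes_per_day / 1e6         # decimal megabytes
gb_per_well = mb_per_day * 20 / 1e3      # assumed 20 day well

print(f"{mb_per_day:.0f} MB/day, {gb_per_well:.1f} GB per well")
# → 622 MB/day, 12.4 GB per well
```

Non-stop acquisition is of course an upper bound; real jobs stream intermittently.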
Those interested in learning more about digits and drilling might like to attend the LBCG well site automation for unconventionals conference in Houston next month. We’ll be there. More from the SPE ATCE on page 6 of this issue.
Lloyd’s Register (LR), through its charitable foundation, is to give the UK’s Alan Turing Institute £10 million over a five year period to support engineering applications of big data. The LR foundation has also just published a review showing how big data science can impact safety and asset performance.
The review was authored by a panel led by artificial intelligence expert Nigel Shadbolt of the University of Southampton and the Open Data Institute. One section addresses big data in the digital oilfield where ‘industry is increasingly looking to automate dangerous and expensive operations.’ Meanwhile techniques such as computational fluid dynamics (CFD) are used to improve production by modeling the complex downhole interactions between production equipment, the stressed reservoir rock and injected and produced fluids.
LRF MD Richard Clegg said, ‘We are going to see step changes in sensor technology, data-driven intelligent systems and data analytics impacting all aspects of business, from design to manufacturing, maintenance to decommissioning.’ Big data is going to bridge the gap from monitoring ‘what is’ to predicting ‘what if’. More from LR.
Hubbard obtained a Masters in science management from the University of Alaska, Fairbanks, and subsequently worked with OFS Portal, Oilcats and SparesFinder where he was active in the development and adoption of global oil and gas e-business standards, serving with API/PIDX and the UNSPSC. He joined Energistics as EVP business development in 2006. Named COO in 2010 and president and CEO in 2011, Hubbard built the organization from around 60 to some 125 active corporate members.
Along with his promotion of Energistics’ XML-based oil and gas data standards, Hubbard’s legacy includes his successful nurturing of the National Data Repository movement from a small informal gathering to its current influential position. He was also an instigator and promoter of the Standards Leadership Council, an ongoing attempt to foster cooperation between standards bodies in and close to oil and gas.
Hubbard was a tireless advocate for industry standards and was judged by some to be ‘the best CEO Energistics ever had.’ His affable personality and ability to listen and engage with folks on standards related matters or on life in general will be remembered by many. Colleagues at Energistics remember him as ‘always kind, with a funny story to tell’ and as having ‘a passion for great food and drink – especially if part of an international adventure.’ ‘He was a genuinely friendly man and a clear, astute thinker.’
Hubbard received the Philip C. Crouse Cornerstone Award in May 2014, cited for ‘fostering a global user community across standards organizations and dedication to the progression of open data exchange standards and data management practices.’
More in the Houston Chronicle obituary.
What sparked off Repsol’s use of IBM’s cognitive technology?
We believe that the technology is applicable in two areas, as an aid to decision making in bidding rounds and in helping with field planning and optimization—where to place injectors and producers. The idea is to make the three fields of complex math/analysis, geoscience and computing accessible and working together. We have already worked internally on technology to optimize field development plans, to increase production, keep costs down and assure safe operations. The cognitive initiative will build on this.
What was this previous project called?
This was our Excalibur project. There came a moment when, in discussion with IBM, we realized there was an opportunity to take this to the next level by adding a cognitive computing component to Excalibur. The idea is to leverage new computing possibilities offering more interaction, machine learning and even reasoning in an advisory context. We thought that this was a great opportunity to do something together.
Where is this today? Are you just starting out or have you already run a pilot?
We are at the early stages of developing the technology. But we started trials, as an extension of Excalibur, about a year ago. The first results are already there, and we are encouraged by this promising area of innovation.
So what exactly is involved? We have previously reported on IBM Watson’s Jeopardy success. Is this mainly concerned with natural language processing (NLP)?
Sure, NLP is a key component. But we are working in other directions too. We are looking at the behavioral and psychological side of decision making and also how we can leverage ‘big data’ access. In fact we want to go beyond data access, to see how these intelligent systems can help us regain control of our huge data sets, realizing their full potential and making serendipitous links and discoveries across data, text and people.
The IBM release makes much of ‘cogs’—is this a marketing term? What exactly are cogs?
Cogs is just a fancy name for the apps and tools we are developing to access and reason with natural language and/or big data resources.
In Jeopardy, Watson was fed with large public domain information assets including Shakespeare, the Bible and Wikipedia/dbPedia. What is your ‘feedstock’ for the project?
All public oil and gas related information sources and of course our own databases and information assets will go into the mix. One key idea is to be able to identify relevant analogs for current areas of interest by trawling large subsurface information assets.
Comment—Following our interview, Repsol announced the $8.3 billion acquisition of Canadian Talisman Energy. We don’t know if Watson advised on the transaction.
Wood Group’s Intetech well integrity management unit has developed a new software tool, ‘iQRA’ for analyzing well component reliability. iQRA will help operators select the best well and oilfield components, enhancing asset performance and safety.
iQRA includes a broad set of validated component reliability data and leverages the ISO 14224 standard for oil country reliability and maintenance data exchange. Users can benchmark their own reliability figures against a global database of safety critical element failure statistics and mean-time-to-failure data. The cloud-based tool allows users to build custom queries and reports.
Intetech MD Liane Smith said, ‘iQRA provides dependable information on the status of safety-critical well barrier components whose true performance must be known with confidence.’ iQRA data can be used to quantify the risk status of wells before and after proposed workovers and to decide whether to retain or plug and abandon marginal wells.
Speaking at the 2014 Argentina Smalltalk Conference last month, Caesar Systems developers described how they were using the interactive visual development environment (VEO) and Bee Smalltalk in their work on PetroVR*.
Adrián Soma introduced VEO with its interactive, on the fly, code writing capability and a new ‘live programming’ object paradigm. Javier Pimás demonstrated improvements to the Smalltalk ‘Bee’ implementation that can now be run in a performant, ‘self-hosted’ environment sans virtual machine.
Leandro Caniglia offered a sideways look at analytics, arguing that, for data, sometimes, small is beautiful. Mathematical techniques using small data combined with statistics and simulations can be used to generate a continuum of insightful models. Such techniques are computationally expensive but amenable to parallelization, making ‘excellent candidates for distributed multiprocessing.’
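Caniglia’s examples were in Smalltalk, but the ‘small data plus simulation’ idea is language-agnostic. Here is a minimal Monte Carlo sketch in Python, with all input ranges invented for illustration:

```python
import random

random.seed(42)

# Toy 'small data' inputs: triangular ranges elicited from a handful
# of analogs (all numbers hypothetical, for illustration only).
def one_scenario():
    reserves = random.triangular(20, 120, 50)      # MMbbl
    price = random.triangular(40, 110, 70)         # $/bbl
    cost_per_bbl = random.triangular(15, 45, 25)   # $/bbl
    return reserves * (price - cost_per_bbl)       # $MM, undiscounted

# A continuum of scenarios from three elicited ranges.
npvs = sorted(one_scenario() for _ in range(10_000))
print("P10 / P50 / P90 ($MM):",
      round(npvs[1000]), round(npvs[5000]), round(npvs[9000]))
```

Each scenario is independent, so the loop parallelizes trivially—Caniglia’s ‘excellent candidates for distributed multiprocessing.’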
PetroVR uses the small data approach to let analysts visualize a large number of different scenarios to identify risks and opportunities. PetroVR is built on Bee Smalltalk which Caniglia describes as ‘a breakthrough in simulation software.’
* Petroleum Ventures and Risk.
Paul Duller (Tribal Group) reports that the San Bruno gas pipeline case he has been working on for the last three years resulted in the pipeline operator, PG&E, being fined $1.4 billion, the largest fine ever imposed by a regulator on a gas company. As we reported earlier (Oil IT Journal April 2014), investigators found that the GIS system used for PG&E’s integrity management program contained ‘inadequate and misleading information.’ PG&E have admitted fault and apologized but are ‘respectfully asking that the commission ensures that the penalty is reasonable and proportionate and takes into consideration the company’s investments and actions to promote safety.’
Another piece of litigation we have reported on in earlier Oil IT Journals concerned the state of documentation on the BP Atlantis Gulf of Mexico production platform. This was judged earlier this year when US District Judge Lynn Hughes found the case to be ultimately about ‘paperwork wrinkles’ instead of engineering shortcuts. Abbott and the environmentalists ‘have not blown a whistle, they have blown their own horn.’
Notwithstanding the judgment, Congressman Raúl Grijalva was back on the warpath with a letter to US Secretary of the Interior, British-born Sally Jewell, asking her to ‘provide me your detailed plans for strengthening regulations to ensure that BP and other offshore companies maintain a comprehensive set of final engineering drawings.’
Beijing-based GridWorld Software Technology has announced DepthInsight 2014, a geomodeling package capable of fast modeling of complex structures with export to industry-standard geocellular model formats and/or as Resqml files. Seismic interpretations, models and numerical simulation data can be shared with third party interpretation and modeling packages.
DepthInsight handles complex structures such as thrusts, Y-faults, salt overhangs and recumbent folds. A smart geometric algorithm includes ‘intelligent’ human-computer interaction to speed modeling by automatically separating and renaming faults and creating and modifying surfaces on the fly.
Complex structures can be converted to stair-step grids for export in Resqml for ingestion by third party software such as Schlumberger’s Petrel. GridWorld’s performance benchmarks show a 150 km2 model with 85 faults taking 16 hours to build; a 2,000 km2 model with 1,500 faults took 45 hours.
Baker Hughes’ new Completion ArchiTex design software creates accurate, high-resolution 3D models of well trajectory, casing, tubing and subassemblies. The tool handles complex multilateral and dual string completions and supports supply chain workflows and process adherence.
PetroDE 5.0 promises cloud-based oil and gas intelligence with a choice of basemap layers and improved rendering of closely spaced wells. Users can choose between Esri, OpenStreetMap and various Google Maps options, adding industry-specific layers from IHS, Drillinginfo, US agencies and in-house systems.
Aveva NET 5.0 comes with a new HTML5-based GUI, enhanced 3D visualization, configurable tag and document content cards and a ‘cloud-ready’ architecture.
BPT has announced two new products. BPT ROX (for steady state and dynamic simulation of orifice and venturi flow meters in HySys) and BPT EXT (for Excel reporting).
Geosoft has released Valem (voxel assisted layered earth modeling), a cloud-powered inversion service for users of its GM-SYS 3D potential field modeler.
HP/Autonomy has released an ‘intelligent’ retention and content management solution, HP Records manager, to help ‘retain and manage high value and high risk business content in a fully compliant repository.’
Madagascar, the open source seismic imaging package, now comes pre-installed on a ‘Crunchbang’ (Debian) image for installation on Oracle’s VirtualBox.
MCAConnect has released an oil and gas industry-specific joint venture accounting option for its Microsoft Dynamics AX-based AX4energy suite.
Mentor Labs is offering a free no-install cloud-based trial of FloEFD, its computational fluid dynamics solution.
Schlumberger has announced the alpha release 5 of its Ocean framework for Petrel with a preview of features planned for 2015.
Pegasus Vertex has released PathView 3.0 with new multilateral 3D visualization function and aids for well path collision avoidance.
Recon Technology has developed a novel fracking system, FracBHD, claimed to be a low cost alternative to ‘expensive foreign systems’ currently in use. The solution for multi-stage, open-hole horizontal wells is said to maximize reservoir productivity and save completion time.
The latest R2.2 release of CGG/Hampson-Russell’s HRS-9 includes azimuthal analytics, volume processing of azimuth stacks, rose diagram and polar Fourier spectra.
Rock Flow Dynamics’ tNavigator 4.1.2 includes improved performance on compositional models and models with massive water zones, a new adaptive implicit method for compositional and black oil models and a capability for unstructured faults in geological models.
Schlumberger’s 2014.1 edition of its PipeSim steady-state multiphase flow simulator is described as ‘the second release of the 3rd generation user interface.’ An earlier 2012.2 edition is still current as functionality and connectivity moves to the new version.
The latest V8.4 edition of Stone Bond’s Enterprise Enabler data virtualization platform provides easier installation and faster start-up along with a new ‘agile integration interface’ and extended connectivity via AppComm to Microsoft Dynamics, Json and Excel.
A new release of Yokogawa’s ProSafe-RS safety instrumentation system claims improved control and connectivity with the Centum VP production control system. R3.02.20 offers open communication protocols to improve connectivity with third party Scada systems.
The industry e-commerce organization PIDX International held twin conferences in Houston and Paris earlier this year. Peter Black (EnergySys), a standards myth-buster, asked, ‘If standards are the answer, what’s the question?’ While collaboratively developed standards that are widely adopted have real business value, many myths and shortcomings cloud the picture. Black’s standards myths: that publishing something or releasing an XML encoding makes a standard, that vendors and customers are interested in standards, and that standards guarantee interoperability. He cited several (mis)use cases, all relating to superfluous use of Witsml, along with a few ‘possible’ use cases such as data migration, interoperability and reporting. Black proposes a new approach to standards: start with the identification of a ‘real’ problem, look for solutions from other verticals, provide a reference implementation and devkit, get buy-in from vendors and, finally, deprecate unimplemented standards.
Implico MD Kay-Peter Buhtz provided a more positive view, tracing a twenty-year track record of e-commerce standards use culminating in PIDX-based terminal data exchange in the Rhine area. Materials data exchange standardization in Germany dates back to 1992 when the MPKS standard was born. It is still in widespread use today. In 2006, BP and ConocoPhillips initiated a feasibility study on the use of XML to communicate between tank farms and trucks’ onboard computers. Today, the IFLEXX system is used between more than 20 terminals and refineries and has informed development of the flagship PIDX bill of lading standard, which went live in July 2014 in Germany, Austria and Luxembourg. Buhtz stressed the importance of a mutual understanding of wording (order, contract, shipment) between truckers and terminals. More from PIDX.
Around 7,500 attended the 90th Society of Petroleum Engineers Annual Technical Conference and Exhibition in Amsterdam in October. The opening plenary on ‘affordable energy’ was a rather lacklustre debate that touched on climate change, growth and sustainability, a ‘triple dilemma’ with no easy solution. The IEA’s Christian Besson observed that energy supply is increasing as demand is dropping, a situation that may last a while. ExxonMobil’s Neil Duffin spoke of a ‘chain reaction’ as a falling oil price impacts major projects. So far this is a short term phenomenon affecting mostly smaller companies, although all are seeking more efficiencies. Technip’s Philippe Barril sees integrating teams and relocating work away from high cost environments as one solution. On the question of standards, Duffin made a call for ‘standards that work,’ not ones where ‘changing one spec means that we are no longer within the standard!’ Barril concurred that standardized design can help cut costs. Reducing paperwork and bureaucracy would also help. Besson saw carbon capture and storage as essential to moderating climate change but Duffin warned that high costs and uncertain regulation were problematical.
Another session reflected on the oil and gas industry’s public image. According to Deborah Shields (Colorado State University), the lack of a ‘social license’ for oil and gas development (especially fracking) is making things difficult. This is a ‘wicked problem’ with no easy solution. Shields cited the work of Lausanne’s sustainable energy systems unit and Braden Allenby’s book. The oil and mining industries have ‘spent years turned inwards, speaking to themselves.’ What is needed is a change of tone in communications with government and the public. Pete Smith (Aberdeen University) pitched in with more climate doom and gloom. Since the failure in 2000 of the climate change agreement, greenhouse gas emissions have accelerated. We now need more investment in energy efficiency and in power plants that sequester CO2. Unabated emissions through 2030 will make for ‘overshoot’ and require an expensive negative transition. Abatement must start now with more nuclear and renewables and fossil fuels to be phased out by 2050. For Smith, the oil and gas industry is ‘alongside the tobacco and arms industries in its level of negative perception.’ Yes this is the SPE!
A special session on aging assets in the North Sea heard from ConocoPhillips’ John Hand on the venerable Ekofisk field that has been producing for 40 years and will likely go on for another 40 (that’s beyond 2050!). Recovery has risen from 15% to 70% and production has caused the sea floor to subside by 9 meters, sinking structures and buckling wells. Injection is necessary but causes ‘water weakening’ in the overburden. To see where the water is going, a permanent fiber optic/satellite link provides 4D seismic monitoring. More fiber provides downhole surveillance, ‘listening’ to wells. 3D geomechanical models help with understanding of the overburden and wellbore stresses. A culture of performance and continuous improvement means that high-end technologies are being applied to smaller and smaller drilling targets. The ‘integrated operations’ approach transfers easily to smaller fields.
Notwithstanding the politics, industry continues to advance on multiple technology fronts. Shell provided an update on assisted history matching with 4D seismics. Currently this frequently fails from a lack of information. Enter ‘model maturation,’ a way of including prior information such as faults and aquifers that are otherwise left out of the model. History matches on local gridblocks pinpoint model flaws and update the model. The approach is widely used by Shell in its worldwide operations.
A Decision Strategies presentation busted the ‘myth’ of sweet spot exploration. For high variability reservoirs, exploring for sweet spots is ‘inefficient and destroys value.’ While service companies are keen to sell techniques for identifying sweet spots, it is better to follow a methodology that proves a project is viable and to avoid ‘gaming’ exploration with unrepresentative wells. Later in the development cycle, operators can home in on sweet spots to assure early cash flow. But a thorough ‘value of information’ analysis should be applied to additional techniques because ‘some aren’t worth the expense.’
The thorny topic of reserves reporting was addressed in a joint SGS Horizon/University of Houston presentation analyzing recent SEC reporting guidance. In the last five years, most comments from the SEC revolve around the reporting of proven undeveloped reserves and how to interpret ‘materiality’ or ‘significance’ in reporting. PUDs must be developed in under five years from reporting—an issue of some importance in North American shale plays where some reserves will likely have to be de-booked. ‘Undeveloped’ acreage must have a realistic program for its development in the ‘near term’ i.e. within three years. Such ‘simple’ requirements are not met by many companies.
A joint presentation from BP, Shell, Total and the University of Houston described the use of reservoir simulation in estimating (and reporting) reserves. The 2009 SEC modernization of reserves reporting allowed for the use of ‘a grouping of technologies which may include computing,’ opening the door for reservoir simulators to be used in reserves estimation. Previously the SEC only allowed for a deterministic approach. Modeling includes volume calculations, selection of analogs and decline curve analysis, all combining to offer ‘reasonable certainty’. A new framework is proposed for ‘evidence-based reserves classification,’ a systematic approach to assuring that model-based reserves estimates meet standards of reasonable certainty. The framework includes determination of a production mechanism, evidence for static and dynamic reservoir performance, history match, analogs, sensitivities and documentation.
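Among the techniques cited, decline curve analysis is conventionally done with the Arps equations. A minimal sketch follows; the example numbers are hypothetical, not from the paper:

```python
import math

def arps_rate(qi, di, b, t):
    """Arps decline: rate at time t (years) from initial rate qi,
    initial decline di (1/yr) and hyperbolic exponent b."""
    if b == 0:                       # exponential special case
        return qi * math.exp(-di * t)
    return qi / (1 + b * di * t) ** (1 / b)

def cum_production(qi, di, b, t):
    """Cumulative production to time t (units of qi times years)."""
    if b == 0:                       # exponential
        return qi / di * (1 - math.exp(-di * t))
    if b == 1:                       # harmonic
        return qi / di * math.log(1 + di * t)
    q = arps_rate(qi, di, b, t)      # general hyperbolic
    return qi**b / (di * (1 - b)) * (qi**(1 - b) - q**(1 - b))

# Example: 1,000 bbl/d initial rate, 60%/yr initial decline, b = 0.5
qi, di, b = 1000 * 365, 0.6, 0.5     # qi expressed in bbl/yr
print(round(cum_production(qi, di, b, 5)))  # cumulative bbl after 5 years
```

Fitting qi, di and b to observed rates, then extrapolating to an economic limit, gives the estimated ultimate recovery that feeds a reserves booking.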
In the digital energy session, a Halliburton paper showed how data-driven predictive analytics can be used to estimate downhole temperatures while drilling. Here a software ‘support vector machine’ running atop a Hadoop file system was used to ‘disentangle’ the complex relationships between various drilling parameters (RPM, WOB, mud flow rate) and formation temperature. The approach works in deviated wells but was less successful on horizontal wells. For Halliburton, the oilfield’s digital revolution is unfinished; big data and analytics will be the next phase of the digital transformation.
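Halliburton’s implementation details are not public, but the core technique, regression under an epsilon-insensitive loss, can be sketched in plain numpy on synthetic data (the linear temperature model below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic drilling records: RPM, weight on bit (klbf), mud flow rate (gpm)
X = rng.uniform([50, 5, 300], [200, 40, 800], size=(400, 3))
# Hypothetical linear temperature response plus sensor noise
y = 60 + 0.1 * X[:, 0] + 0.8 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 1, 400)

# Standardize features, then fit a linear SVR by subgradient descent on
# the epsilon-insensitive loss (the core of support vector regression).
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), y.mean()
eps, lr, lam = 0.5, 0.02, 1e-4
for _ in range(5000):
    resid = Xs @ w + b - y
    # Subgradient: zero inside the epsilon tube, sign(residual) outside
    g = np.where(np.abs(resid) > eps, np.sign(resid), 0.0)
    w -= lr * (Xs.T @ g / len(y) + lam * w)
    b -= lr * g.mean()

pred = Xs @ w + b
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("in-sample R^2:", round(r2, 3))
```

A production system would add kernelization, cross-validation and, per the paper, distribution over Hadoop; the subgradient loop above just shows the loss being optimized.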
Another Halliburton presentation has it that digital energy is at a ‘strategic inflection point’ representing a challenging competitive environment. Factory drilling is a fundamental shift from exploration to production. Enter ‘operational intelligence’ that ‘tracks the small stuff and handles the operational graffiti,’ leaving professionals to do their stuff. OVS Group and Platts also got a plug.
Technical data management (TDM) now underpins Shell’s wells, reservoir and facilities management (Wrfm) program. Wrfm sets out to maximize production from existing assets. TDM addresses issues like data being hard to find and poor data ownership. Following a pilot in three assets, TDM is now being rolled out globally. Critical data catalogues and data quality standards also ran. Shell now employs TDM subject matter experts while a ‘Lean’ data management facility in Asia can be called on for peak load handling.
If everything was done sequentially it would take two years to drill and complete a typical 26 well pad in Shell Canada’s Groundbirch shale play. Enter Simops, with up to 8 frac jobs per day. All enabled by an acronym soup of Simops, Concops (concurrent) and Mopo, a matrix of permitted ops that leverages Wwims, a ‘wells worksite instructions manual!’
A ConocoPhillips study of 15 years of injection into shale formations on the Norwegian continental shelf (NCS) has implications for shale development. Norway has been injecting into low permeability shales for a long time to dispose of well cuttings. The good news is that fracs can be induced with relatively low injected volumes and that injecting increasingly large volumes, with periods of shut-in, can create secondary fracs around the primaries. The bad news is that a lot of fracturing is aseismic, meaning that microseismic monitoring may only give a very partial picture of frac formation.
While it was something of a sales-pitch masquerading as a paper, Thinklogical made a reasonable case for using a combination of fiber optic communications and its keyboard video mouse extender to facilitate remote operations of control rooms and real-time operating centers.
Kuwait Oil Co. presented results from its Sahala/Sabriyah digital oil field pilot. Here a model update and ranking methodology has been developed to optimize water flood using a 1.4 million cell model. The approach is said to simplify engineers’ workflows and facilitate onboarding of young professionals.
A joint presentation from Chevron and the University of Southern California/CiSoft showed how time series ‘shapelets,’ a ‘new kind of wavelet,’ are used to predict equipment failure from oilfield sensor data. Time series data from electric submersible pumps is used to predict failure using a ‘process-oriented event model.’ The approach (like many before it) faced problems with failed or failing meters. These were addressed by using timestamps of ‘last good (meter) scan.’ The approach is said to be ‘faster than machine learning.’
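The core of the shapelet idea can be sketched simply: slide a short candidate pattern (the ‘shapelet’) along a sensor time series and take the minimum distance to any subsequence; a small distance means the failure signature is present. The sketch below is illustrative only, with invented data, and is not the Chevron/CiSoft event model.

```python
# Illustrative shapelet matching for failure prediction.
# A shapelet is a short, discriminative subsequence; we score a series
# by its best (minimum) squared-distance match to the shapelet.
def shapelet_distance(series, shapelet):
    m = len(shapelet)
    best = float("inf")
    for i in range(len(series) - m + 1):
        d = sum((series[i + j] - shapelet[j]) ** 2 for j in range(m))
        best = min(best, d)
    return best

def predict_failure(series, shapelet, threshold):
    # Flag failure if the failure-signature shapelet matches closely.
    return shapelet_distance(series, shapelet) < threshold
```

In practice shapelets are learned from labeled run-to-failure data rather than hand-picked, and distances are computed on normalized subsequences.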
A group led by YPF presented on fault diagnosis of progressive cavity pumps. Real-time trends from pump sensors were compared with a fault database. The approach has been proved on simulated faults and will now be tested in the field.
At the 2014 Pipeline Open Data Standard association’s annual user conference in Houston earlier this year Rod Burden (Moore Resources) presented a Pods 101 introduction to using the data model. Pods is a moderately complex relational model with a table count of 201 (Ppdm 3.9 has 2,700). Pods 6.0 is delivered as 31 modules. These can be implemented independently, letting operators tailor a solution to company requirements.
Enbridge Pipelines’ Lorne Dmitruk reviewed Pods Esri spatial, a Pods 5.1 model embedded in an Esri geodatabase. The approach aligns Pods with the large Esri user base and provides ‘out of the box’ functionality and interoperability.
The Pods technical committee presented a new module for external corrosion direct assessment data interchange with the ECDA/Innga format. Work on modularization continues using Sparx Systems’ Enterprise Architect UML modeler and an upgrade of the spatial edition to Pods 6.0 is coming ‘real soon now!’ A survey showed that over half of respondents are planning to upgrade to Pods 6.0. Service providers currently support all versions. Several report that ‘heavy customizations hinder upgrade.’ Respondents also asked for an extension to offshore data.
A couple of presentations focused on the minutiae of real world usage. Michael Ortiz described a survey data validator developed for Plains All American Pipeline by New Century Software. The ArcGIS validator provides fast, robust validation of raw survey data prior to load along with tabular and graphical data QC. Nick Sines (Tallgrass Energy Partners) presented the Pony Express pipeline conversion project whereby a 432 mile pipe was converted from gas to crude oil, involving modifications to some 400 sites, 100 reroutes and 264 miles of new build. An innovative use of barcoding fixed numerous worksite issues during construction.
Keith Winning (CB&I) described ongoing attempts to close the gap between CAD and GIS and how a Pods-based GIS model could be used in the design and build phases of a project. This requires a change in perspective, from Pods as a repository of as-built data to its use across design and handover. Enter Aegis*, a means of storing Pods data in a CAD model.
John Tisdale (Enterprise Products) introduced PipelineML, a new spec from the Open geospatial consortium for the exchange of pipeline data between disparate systems. Will PipelineML compete or conflict with Pods? Not according to the authors, the ‘open and independent’ standard can be adapted to any data storage model (Pods, Spatial, APDM, Updm, Isat). Yes but will it compete with Pods?!
* Advanced engineering geographical system for pipelines.
Former Siemens smart grid division head Kevin O’Hara is now executive VP sales and marketing with Advanced Control Systems.
Felicia Harris has joined Burleson LLP as a partner in Houston. Adam Veltri has joined the firm’s Denver office. He hails from Steptoe & Johnson.
Tony Testolini has joined Cartasite as chief revenue officer. He hails from GeoGraphix.
CDA CEO Malcolm Fleming has been seconded, three days per week, to the UK DECC for a 7 month period.
Jim Barker has joined Cortex as interim president and CEO.
Charles Robertson has joined Cowen and Co.’s equity research department as a director, E&P. He joins Cowen from Millennium LP/Blue Arrow Capital.
Deep Blue Engineering has appointed Dieter Watelle to the newly-created position of Design Engineer.
Devon president and CEO John Richels is to retire next July 31. Dave Hager, currently Devon COO, is to take his place.
Divestco has promoted Dani Chiarastella to CFO.
Michael Jegen is now executive VP of Energy Navigator.
Carey Lowe has been promoted to executive VP Ensco plc. Steve Brady takes his place as SVP eastern hemisphere and Gilles Luca is now SVP western hemisphere.
Fritz Industries has hired Steve Almond as director R&D at its new Houston Technology Center. Almond was previously with Mead Westvaco.
Geospatial Corporation has opened a new district office in the Woodlands area of Houston, Texas and has hired Richard Nieman as VP sales.
Abdulaziz Al Khayyal has been named to the Halliburton board of directors. Al Khayyal retired from Saudi Aramco earlier this year.
Intertek has appointed Andy Duncan as lead consultant for its production and integrity assurance division. Duncan joins from the Health and Safety Executive’s Energy Division-Offshore.
Greg Kowalik and Todd Burns lead Noah Consulting’s new content enablement practice while Grant Hartwright heads-up the new facilities information management practice. Robert Best has joined Noah as senior principal and E&P subject-matter expert. Best hails from PetroWEB.
Howard Harary has been appointed director of the National Institute of Standards and Technology’s Engineering Laboratory.
Inger Lise Strømme has been named director, data management and organization at the Norwegian Petroleum Directorate.
Oildex and Quorum Business Solutions have launched an electronic gas plant data exchange.
Blayne Eversole has joined ShareCat as VP business development for the Gulf of Mexico and North America. He comes from Oracle’s cloud services business.
Team Oil Tools has appointed Adam Anderson as CEO. He joins from Baker Hughes.
Robin Watson will be appointed COO of Wood Group in 2015.
The XBRL International board has elected Deloitte’s Cees de Boer as chair.
The ABB board has nominated Peter Voser to succeed Hubertus von Grünberg as chairman.
Absoft has created a new upstream consulting business line.
Gregg Budoi has joined Kalibrate as CFO and executive VP. He hails from EZ Energy USA.
Torstein Sanness is to retire from his executive position in April 2015 and assume the role of Chairman of Lundin Norway. Kristin Færøvik will assume the role of MD. She was previously with Rosenberg WorleyParsons AS.
Noble Corporation has reached a final settlement with the Department of Justice, concluding a two-year investigation into operations and systems on the Noble Discoverer drill ship. Noble will pay $8.2 million in fines and $4 million to DOJ-designated community services.
QinetiQ has acquired SR2020’s seismic imaging capability for ‘up to’ $1.7 million, providing its OptaSense unit with ‘state of the art’ vertical seismic profiling and microseismic imaging services to complement its proprietary DAS-VSP acquisition capability.
Enterprise Products Partners is to acquire Oiltanking Partners. OTP is to merge with an Enterprise subsidiary. The all-paper deal represents a 5.6% premium over OTP’s closing price prior to the announcement, or about $6.0 billion on completion.
Emerson has acquired Cascade Technologies, a provider of gas analyzers and monitoring systems using quantum cascade laser technology. Cascade will be integrated into Emerson’s Rosemount Analytical gas analysis portfolio.
Well completion specialist FTS International has completed its acquisition of ‘substantially all’ of the assets of J-W Wireline. The deal includes some 70 active wireline units, operational locations, and manufacturing and training facilities. 400 J-W employees are joining FTSI.
Teradata has acquired RainStor, a privately held company specializing in online big data archiving on Hadoop, its fourth big data-related acquisition this year.
UK’s Common Data Access (CDA) has announced enhancements to its soon-to-be-released competency management system. A job profile mapping function will facilitate recruitment and the system will allow users to share their profiles with invited users. The CDA CMS is built with Lexonis’ Quick Assess technology.
The Society of Petroleum Engineers is developing a new competency management service for members based on IHRDC’s CMS Online software. The SPE CMS, a free web-based service, allows members to assess their capabilities against 22 competency models covering engineering, geosciences, project management and HSE disciplines.
Following a three month trial of UK-based Oilennium’s ConTrainer eLearning solution, Dolphin Geophysical reports a ‘dramatic improvement’ in its ability to deliver quality, consistent health, safety and technical training to its crews offshore. Progress of Dolphin’s 400 plus users was monitored from its Bergen HQ through an online dashboard.
A recent study by Accenture, ‘A new dimension of opportunity; 3D printing’s potential for the energy industry’ claims, without much conviction, that ‘the consumerization of three-dimensional printing technologies will bring new opportunities for value creation in upstream and downstream operations.’ Three dimensional printing (3DP) has seemingly ‘been around for more than 30 years’ but is only now ‘edging into mainstream manufacturing.’ A trend driven by a ‘convergence of increased technology sophistication, lower equipment costs and diversification of 3D-printable materials.’
Who is using 3DP in oil and gas today? One example is GE Oil and Gas which has used plastic and metal 3D printers to reduce the design loop for prototyping some parts from 12 weeks to just 12 hours. The company is also ‘reported to be considering’ using 3DP to produce electric submersible pumps.
So much for the evidence, but what of the potential? For Accenture’s researchers, ‘We are now at an inflection point with 3DP. By integrating 3DP into the fabric of their operations, oil and gas companies will be able to transform the effectiveness of upstream supply chains, as well as bringing new markets and new sources of revenue to their downstream businesses.’
According to Accenture, the 3DP market is expected to quadruple over the next decade to $12 billion as it moves from prototyping to manufacturing. But a report in the Financial Times, ‘3D printers still not printing money’ has it that shares in 3DP manufacturers have halved in value in 2014 as the technology appears to be failing to ‘consumerize.’
In a paper prepared for the recent Dome digital oilfield conference Mohamed Atia presented a structural integrity study of a North Sea platform performed by London, UK headquartered Atkins.
Atkins performs real time monitoring of large offshore structures as they twist and bend in response to sea states and loading. Real time monitoring can provide immediate detection of structural failure and can be used to design a structural integrity management program.
Structural monitoring is achieved by placing accelerometers at key places on a platform. Data is analysed in Atkins’ fleet management system (FMS). FMS allows operators to visualize deformation according to a structure’s natural frequencies and spot short term incidents and long term trends.
On one platform a jacket brace failure was immediately detected from a change in the structure’s natural frequency. More from Atkins1802.
Energy Solutions Intl. has named Houston-based Dyonyx as its worldwide partner for hosted editions of ESI’s GasStream, Synthesis and PipelineDashboard in the cloud. CMO Eric Johnson said, ‘Our web-based applications for oil and gas, including solutions for gathering systems, processing plants, pipelines and terminals, are now available from the cloud, providing customers with enterprise-class solutions at a lower total cost of ownership.’ The solutions are marketed either under a traditional license agreement or through a monthly subscription. Dyonyx clients include Pride, FMC Technologies and BHP Billiton.
Hart Energy and P2 Energy Solutions are to jointly offer access to pipeline and other oil country data, combining P2’s Tobin and Hart’s Rextag datasets.
Aveva is to supply an asset visualization and information management solution to Total E&P Norge.
Dolphin Drilling has contracted with Asset Guardian for the provision of a configurable process software management tool for its rig control and automation systems.
BP has awarded Emerson Process Management a $40 million main automation contract for its Shah Deniz Stage 2 project in Azerbaijan, part of a global agreement for the provision of greenfield automation services.
Allegro Development has announced a partnership with DHC Software to provide commodity trading solutions and consultancy to the Chinese market.
Norwegian consultant Subsurface is now a provider of equipment reliability data services leveraging ExproSoft’s WellMaster.
Deloitte Consulting has partnered with IFS to offer deployment of IFS Applications to the oil and gas, construction and other industries in South Korea.
Mott MacDonald is to deploy Intergraph SmartPlant solutions across its oil and gas business.
Intertek and Letton Hall Group are to collaborate on the provision of metering and allocation service capabilities to oil and gas clients in the USA and worldwide.
Abu Dhabi Marine Operating Co. has awarded Technip the project management of the Nasr Phase II field development.
OCS Group has announced the formation of a facilities services joint venture in Saudi Arabia with the Jeddah-based Zahid Group.
OpenLink and Tableau Software have announced a partnership to bring data analytics to the energy trading and risk management market.
Recon Technology has signed for the provision of automation solutions to Tangshan Jidong Petroleum, Sichuan Petroleum and Sinopec Southwest. Total contract value is RMB6.4 million.
Tall Oak Midstream has deployed Energy Solutions International’s GasStream transaction and volumetric accounting system.
Technip has signed a new offshore oilfield production R&D cooperation agreement with IFP Energies Nouvelles.
Tendeka has signed two sand and inflow control technology provision contracts with RN-Purneftegaz and Lukoil-Nizhnevolzhskneft.
Teradata and MapR Technologies have announced an expanded partnership covering technology integration, road map alignment and a unified go-to-market offering.
Wison Offshore & Marine has selected Aveva’s marine software for the engineering, design and construction of its offshore floating storage regasification unit projects.
Wood Group has been awarded a five year contract with an estimated value of $750 million by BP. The company’s Mustang unit has also signed a memorandum of understanding with Century 3 to collaborate on sourcing, bidding for and executing projects.
Yokogawa Electric’s Chinese unit has received an order from PetroChina Yunnan Petrochemical for the delivery of control systems for a new-build oil refinery in Yunnan Province.
The new Saudi Arabia advanced research alliance (Saara) has been launched to ‘drive commercialization and application of innovative research and development activities in the Kingdom.’
Energistics has floated two new Resqml-related workgroups. One adds graphical style information/symbology to Energyml data objects (a.k.a. Resqml objects) during a data transfer. The other addresses direct, application-to-application transfer of Energyml objects using the Energistics transfer protocol. This replaces file exchange with data transfer over a TCP socket.
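The shift from file exchange to direct socket transfer can be illustrated with a toy length-prefixed message exchange. To be clear, this is not the Energistics transfer protocol wire format, which defines its own message framing and binary encoding; the framing and object names below are invented for illustration.

```python
# Toy application-to-application object transfer over a TCP socket:
# each message is a 4-byte big-endian length prefix followed by a
# JSON-encoded payload. Invented framing, not the real ETP spec.
import json
import socket
import struct

def _recv_exact(sock, n):
    # Read exactly n bytes, looping over partial reads.
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        data += chunk
    return data

def send_object(sock, obj):
    payload = json.dumps(obj).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_object(sock):
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length).decode("utf-8"))
```

The design point is the same as the real protocol’s: sender and receiver exchange self-describing objects over a live connection rather than shipping whole files.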
The OPC Foundation reports successful trials of ‘virtual interoperability’ (V-IOP), whereby vendors in different continents interconnected products via a Microsoft Azure virtual network. A new OPC UA global discovery server from GE was also recently trialed, allowing large organizations to centralize management and provisioning of OPC UA servers and certificates and allowing clients to find and query servers on the network.
The Process automation users’ association (WIB) is offering a free self-assessment tool for compliance with the IEC 62443 process control cyber security standard. WIB also offers a guidance document, ‘Process control domain-security requirements for vendors.’ IEC 62443 was originally based on a specification from WIB. The tool is available from its developer WCK GRC.
The Global reporting initiative has announced the GRI Taxonomy 2014, developed in collaboration with Deloitte, an XBRL taxonomy for sustainability reporting. The release includes an updated architecture and implementation guide and a sample instance file.
XBRL US has announced a new CET (construction, energy and transportation) work group with an initial focus on surety bond processing for contractors. The protocol was developed in response to the US digital accountability and transparency (Data) act that was signed into law last May.
A new 14 page white paper from Emerson offers advice on corrosion monitoring in refineries and shows how wireless technology can be deployed economically. The paper discusses use of Emerson’s CorrLog wireless corrosion transmitter for ER and LPR probes along with a deployment case history from Reliance Industries.
In a separate announcement, Emerson introduced Smart Wireless Navigator to help users deploy and manage large wireless networks.
A data sheet from Freewave outlines wireless machine-to-machine (M2M) connectivity options available from its WavePoint multi-mode broadband wireless solution. WavePoint allows remote oil and gas production sites to communicate via Ethernet, WiFi, GSM and other available protocols.
OleumTech has been awarded a US patent for its WIO wireless input/output solution that replicates hardwired process signals for remote monitoring of ‘any industrial process or control application.’ WIO units operate in the 900 MHz or 2.4 GHz ISM band and are delivered factory-paired for plug and play deployment.
M2M Data is to provide Monnit’s wireless sensing solutions to users of its M2M solutions in the oil and gas and other verticals.
Stallion Oilfield Services signed with global satellite operator SES and Global Data Systems to double satellite capacity to connect ‘booming’ oil and gas operations across North America. Along with enterprise data, Stallion provides high-speed broadband, corporate VPN and MPLS solutions, Netflix, and Skype.
Targeting inter alia the oil and gas market, Yokogawa’s multi-function wireless adaptor allows wired field instruments to function as ISA100 wireless devices. Multiprotocol wireless adaptors are available for HART and Modbus devices.
Yokogawa also recently signed with Murata Manufacturing for joint development of ISA100 wireless devices.
Dong Energy has deployed an operational intelligence solution spanning process control and GIS at its growing offshore wind farm. The deployment is a poster child for the new PI Integrator for ArcGIS, an ‘out-of-the-box’ solution that connects real-time operational data streams to Esri’s mapping technology.
Dong will be operating 1,800 offshore wind turbines by 2020. Remote monitoring of turbine, wind and wave conditions along with the location of service personnel will help minimize downtime. A €20 million per year saving in operating costs is anticipated. Dong lead data architect Anders Røpke said, ‘Offshore is a challenging and expensive environment. Remote monitoring will maintain health and safety standards and reduce operating costs.’ PI Integrator for ArcGIS provides an integrated geography-time perspective that exposes patterns in operational performance that might otherwise go undetected.
Emerson Process Management has released OpenEnterprise field tools (OEFT) a software package for configuring multiple RTU* platforms and Hart transmitters across remote sites, helping improve operations and field personnel safety. The software enables configuration and real-time monitoring in remote oil and gas applications such as wellhead automation, flow measurement, and tank overflow protection.
OEFT works across Emerson’s family of RTUs and flow computers including products under the ROC, FloBoss, and ControlWave brands. Hart communications enable configuration, troubleshooting and maintenance of wired and wireless devices. Hart pass-through enables tunneling over complex Scada infrastructures with native protocols.
The system provides live-mode monitoring of connected devices and flags up those that require attention from field personnel. OEFT supports devices from Emerson and third-party manufacturers. The system can be extended to new devices by adding their device descriptions.
* Remote terminal unit.
The US Department of Energy has chosen a ‘data centric’ high performance computing system to address energy R&D and big data challenges. The new systems are the ‘Summit,’ at Oak Ridge National Laboratories and the ‘Subtractit*’ at Lawrence Livermore, both with 100 petaflop peak performance. Contracts with a total value of $325 million have been awarded to a group led by IBM whose technology ‘puts computing power everywhere data resides, minimizing data in motion and energy consumption.’
Under the hood of the new supercomputers is IBM’s attempt at a RedHat-style ‘freemium’ ecosystem around its Power architecture. OpenPower represents an open sourcing of Power hardware and software alongside a complementary ‘for profit’ shop front for added value services.
The DoE’s computer includes Nvidia’s NVLink pipe for data exchange between Power CPUs and next-generation Nvidia Volta GPUs. Key to the new machines is the Hadoop-like ‘data centric’ architecture which minimizes data movement.
* Only kidding, it’s called ‘Sierra.’
Engineering data handover from build to operations has been the subject of much handwringing in the past and has spawned a lengthy process of standards development that has so far failed to bear fruit. Norwegian independent oil company Det Norske Oljeselskap, operator of the NOK 24.7 billion Ivar Aasen field, has taken a more pragmatic approach, deploying a new ‘progressive handover solution’ (PHS) from UK-based Aveva.
PHS is said to ‘de-risk’ the transition of a new facility from construction to a client’s operations team. PHS is enabled by Aveva’s digital information hub, a portal based solution, where every physical asset has a corresponding digital representation. A reporting dashboard lets Det Norske’s EPC, Sembcorp Marine unit SOME, update and monitor information throughout the project. The result is a ‘streamlined handover from construction to operation with higher quality information.’
Aveva’s Ellinor Meling said, ‘The solution sees a shift from document-centered engineering to a focus on tags providing direct access to technical information. The solution is being extended notably with a link to SAP. The solution will also be used for ongoing operation and modifications’.
Speaking at Workboat 2014, Jason Tieman provided a briefing on AIS* and a new common operating picture (COP) capability resulting from Oceaneering’s acquisition of PortVision earlier this year. The COP is said to be of relevance to daily operations and oil spill incident response.
The COP provides collaborative situational analysis and operations management for multiple stakeholders, each with its own view of an incident. Cross-system interoperability provides a common picture of the crisis and associated response activities. Components include live video and data feeds with a replay capability for after action review and process improvement. With proper planning, a COP can be deployed quickly in any location to provide a ‘comprehensive tactical operational view.’
Oceaneering has collected 15 billion AIS locations in the past five years and records 50 million new location reports every day. Analysis of historical data from 1987 to 2007 shows multiple ‘interactions’ between vessels, anchors and pipelines. In the Gulf of Mexico alone these include 120 pipeline strikes, 25 fatalities, 17 injuries, 100,000 barrels of released product and over $100 million in property damage. More from PortVision2702.
* Automatic identification system. A mandatory VHF signal captured by a worldwide network of receivers.
Schneider Electric, which acquired Invensys earlier this year, has updated the Wonderware ‘SmartGlance’ mobile reporting solution to facilitate monitoring and analyzing real-time plant and process data from mobile devices. The latest SmartGlance release, 2014 R2, includes support for wearable technologies, a multi-platform browser-based interface and multiple time zones for round-the-clock working.
Product manager Saadi Kermani said, ‘Plant personnel require access to real-time operations information from smart phones, tablets and other mobile devices. SmartGlance provides personalized charts, reports and alerts and the flexibility to view and collaborate on operations for timely, effective decision making.’
A new MyAlerts app notifies users of process events based on configurable thresholds for tag reports. On smart watches, SmartGlance provides plant supervisors and managers with hands-free, real-time access to critical production and process information. An open interface allows data to be pushed to mobile devices from ‘virtually any’ data source. The new release extends connectivity to Schneider’s Citect SCADA, PRiSM predictive analytics and the eDNA historian.
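Threshold-based alerting on tag reports follows a simple pattern: compare each tag’s latest value against its configured limits and notify on any breach. The sketch below illustrates that pattern only; tag names and limits are invented and this is not Wonderware code.

```python
# Minimal threshold alerting on tag readings. Tags without configured
# limits default to an unbounded range and never alert.
def check_alerts(readings, thresholds):
    """Return (tag, value) pairs whose value breaches configured limits."""
    alerts = []
    for tag, value in readings.items():
        lo, hi = thresholds.get(tag, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alerts.append((tag, value))
    return alerts
```

A real system would add per-tag deadbands and alert acknowledgment to avoid re-notifying on values hovering around a limit.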
ExxonMobil’s new workplace evolution 3 (We3) office environment at its Houston campus goes beyond open plan in an effort to improve collaboration and productivity while reducing costs. We3 introduces a new ‘activity-based’ environment where employees are no longer assigned a fixed desk. Instead they can choose the best workspace for a particular task. To develop We3, ExxonMobil used employee surveys and seat sensors to track how space was utilized. The result is that collaboration space use is up, costs are down and access to outdoor views* is up. Key enabling technology for We3 is Barco’s ClickShare, a plug-in device that enables content sharing across laptops, smartphones and the large screen.
Without wanting to cast doubt on ExxonMobil’s brave new office, it behooves us to record that Lindsey Kaufman, writing in the Washington Post, cited a 2013 study that found workers in open offices ‘frustrated by distractions that lead to poorer work performance.’
* An earlier attempt (circa 1960) to hike office productivity at the then Elf Aquitaine’s location in Boussens, France placed the windows high up to prevent workers gazing out at the mountains!