Repsol and IBM have set out to ‘transform’ the oil industry through the use of ‘cognitive computing.’ The concept comes out of IBM Research’s Cognitive Environments Lab, which provides novel human-machine interfaces leveraging speech, gesture, robotics and advanced visualization. The idea is to ‘understand and compensate for human bias in decision-making.’ This will be enabled by a ‘society’ of software agents called ‘cogs,’ working in partnership with humans. In oil and gas, cogs support interaction and collaboration between geoscientists, engineers, economists and planners in a ‘single environment that leverages their individual skills.’
Specific use cases include mergers and acquisitions, where cogs can ‘naturally highlight value and synergies.’ Emergency planning is another target for cog-enablement where agents will ‘explore successes and failures from past data to recommend options and trade-offs for allocating funds and deploying emergency crews.’
Repsol believes that the approach will help it ‘compete with giants’ when acquiring new acreage and optimizing its production. Upstream CTO Santiago Quesada explained, ‘This collaboration with IBM opens up new possibilities. We believe that the blending of technology and talent will be the key driver for the industry in the 21st century.’
Cogs are backed up by IBM’s Watson artificial intelligence back end. Repsol’s Ruben Rodriguez explained, ‘We are trying to find synergies between people and Watson, building an oil industry knowledge base and expert system.’ The interface to the cognitive environment includes an operations room with mikes and video phones.
Speaking at the IBM cognitive systems colloquium last month, Quesada observed that, with offshore wells costing $400 million and with a 25% chance of success, ‘We have to go further, look at emerging technologies like cognitive systems. Our team of E&P experts will prototype these aids to decision making. It will be difficult but it has to be done.’
‘We have to deal with a huge amount of data, impossible for humans to handle. We want to not only handle the data but to control it. Today, we don’t control data. Data controls us and this has to change. We want to make this a reality, not just a vision. We’re combining the talents of the joint team. We have the passion for this. Hopefully we’ll see important results very soon.’ IBM’s marketing team has been in overdrive with its ‘society of cogs’ and ‘human-computer collaboration at the speed of thought.’ For a more measured view of the Repsol project, read our interview with Santiago Quesada in next month’s Oil IT Journal.
Megamerger sees Halliburton and Baker Hughes combine in a paper and cash transaction valued at $34.6 billion, putting a $78.62 per share valuation on BHI at the time of the deal. While the release speaks of ‘highly complementary’ products and services, it envisages some $2 billion in ‘annual cost synergies’ for the combined company, whose notional 2013 revenues were $51.8 billion. The new company will have 136,000 employees and operations in 80 countries.
The transaction represented a 40% premium on the BHI stock price just prior to the initial offer. Baker’s shares popped from $52 to $66 on the announcement before dropping back to $57. At the time of writing, Halliburton is down from $54 to $40, following a longer term trend. Halliburton has agreed to divest businesses generating up to $7.5 billion in revenues if required by regulators, although the actual divestments are expected to be significantly smaller. Halliburton will pay BHI $3.5 billion if the deal fails to get antitrust approval. The transaction is expected to close in the second half of 2015.
Writing this editorial has been like trying to catch a falling knife. Since I started it a month ago, actualité has caught up with its sentiments and, by the time you read it, will likely have overtaken them. Apologies in advance for stating what, by the time you read this, has become either blindingly obvious or obviously wrong!
Pundits and forecasters in the shale game tend to polarize. They are either pro-industry—think American Petroleum Institute, or watch Marathon Oil’s fabulous ‘Full tank of freedom’ ad promoting US domestic production. On the other side of the argument are folks of a greenish hue who are anti-fracking and often anti oil and gas in general. It would be nice to pretend to be objective in such matters, but we all have some prejudice or other, so I thought I should try to introspect my own.
A straw poll at ECIM asked attendees to hold up their hands if they had ten years of industry experience. Then twenty. I was getting quite excited as I realized that I could, as of this year, claim forty years. But the pollster quit at thirty! Seems like old farts like myself are beyond the pale.
When I was hired the oil price was riding high at around $50. It has been up and down quite a bit since then. Folks like to talk of a ‘cyclical’ industry, but that hides the fact that price movements usually come out of the blue and careers get buffeted. Sometimes it’s not just the oil price but the exchange rate too. French exploration in the early 1980s got a boost when the oil price hit $40 and the exchange rate was 10 francs to the dollar. A couple of years later, oil was $10 and the dollar was at 4 francs, a tenfold fall in local-currency revenue that had companies (except mine!) fleeing the country quicker than you could say Beaujolais nouveau.
To get back to my prejudices. Having spent years on conventional oil and gas, I have a problem with the sophistication and expense of multi-stage fracs and horizontal wells. I can’t see how these will produce more than a small fraction of in-place hydrocarbons. My next prejudice is more psychological in nature. Our industry, like others, is in general better at telling a good tale when speaking of the upside than it is at saying how much a project is going to cost. Oil and gas, like banking, also does quite a good job of divesting itself of risk—by farming out or issuing paper of (possibly) dubious value.
Given my skeptical starting point, how is shale looking in the face of a plunging oil price? Well, even before the price fall, as we reported last month, Schlumberger’s Patrick Schorn was warning that ‘40% of the shale wells drilled in the Eagle Ford of Texas are uneconomic.’ The fraught nature of shale was also highlighted in the October 9th edition of Business Week (BW) in a piece by Asjylyn Loder and Isaac Arnsdorf titled ‘Fracking’s funny numbers.’ BW compared shale reserves as reported in company presentations with what the frackers are telling the SEC. Many shale drillers report one figure in filings while advertising a higher number to the public. On average, the figure for resource ‘potential’ was 6.6 times higher than that reported to the SEC.
Industry insiders will understand the differences between reserves and the ‘resources’ cited in presentations to investors. Also, the industry has quite a few tools and techniques that ought to provide some reliable estimates of what is really there. But are these tools really being used?
At the SEG in Denver I sat in on a presentation from the CGG-Baker Hughes shale alliance that showed how seismics can pinpoint sweet spots in a shale play. The results were pretty impressive, showing a high-potential trend weaving its way across the tract. Not so long ago, the SPE managed to convince the SEC to allow geophysical techniques like seismics to supplement the traditional, more restrictive approach to reserves reporting. But the idea then was that seismics would be used to increase reportable reserves to the full extent of a (conventional) reservoir. Not, as the CGG presentation implied, that only 20% of the acreage is actually sweet!
Another shale caveat comes from a recent post on petroleum engineering software specialist Kappa Engineering’s website where CEO Olivier Houzé wrote, ‘In the dash to exploit unconventional resources it has been tempting [..] to bend traditional methods to fit the observed response rather than trying to understand the physics […] We have resisted this trend and, probably at the cost of some business, have not entered into such a Faustian pact, [preferring to] keep our engineering soul intact.’
And what of the geoscientists working in the shale arena? I have ‘anecdotal evidence’ i.e. whispers from a couple of buddies, that geoscientists are being ‘let go’ by some shale explorers. To save on costs? Maybe. Or perhaps to stifle the Cassandras!
Shale exploration in the US has proceeded at such a rate that it is hard to find out what is really going on. The US Energy Information Administration reports on the number of rigs and on ‘production growth’ but does not answer obvious questions like how many wells are producing and at what rate. The current drilling frenzy makes it hard to see, at the macro scale, how fast shale production really declines.
This is likely to change if the oil price stays low for a while and drilling slows. Maybe in a year or two we will know. Maybe we’ll know sooner than that. The FT recently reported that ‘energy debt’ accounts for 16% of the $1.3 trillion US junk bond market. That’s $200 billion of which one third is currently classified as ‘distressed.’ Hedging by the producers and ‘slicing and dicing’ by the lenders will save some. But others may end up holding yet more paper of doubtful value!
Last year we reported on Paradigm’s ‘Epic’ open data environment. How is this shaping up?
We have two linked initiatives. One, an open data connector, leverages the new Resqml 2.0 standard from Energistics. The technology is available in our 2014.1 release and is used in our Petrel connector. This plug-in outputs Resqml data for consumption in our Epos framework. The current version has been developed independently of Schlumberger so that we have control of the implementation on both sides. We may review this if and when Schlumberger comes up with a Resqml interface. The solution provides Petrel connectivity to Epos, Skua and SeisEarth.
How was this done before?
Reservoir models used to be exchanged via Eclipse data files.
Is Resqml your ‘official’ connectivity route or just an option?
This is our only connectivity solution! It is an industrial-strength product, used to connect all our Epos-based products.
Can Resqml data be validated as conforming to the standard?
Not currently although the Resqml developers are planning a validator.
Will you be sharing your link implementation with third parties?
The link is free to Paradigm clients who can use it to connect their own applications. We are working on our side too, to support Resqml data structures.
Will Resqml displace Epos?
No but Epos is undergoing a gradual transformation to a Resqml style. We already use HDF5 for seismic data.
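Resqml keeps its bulk arrays (seismic traces, grid properties) in HDF5, with the XML side carrying metadata. Here is a minimal sketch of what such a bulk-data layout looks like; the file and dataset names are our own invention, not anything from the Paradigm or Energistics implementations:

```python
import h5py
import numpy as np

# Hypothetical sketch: a small post-stack seismic cube
# (inline x crossline x time sample) stored as an HDF5 dataset.
# The in-memory 'core' driver keeps the demo self-contained;
# a real store would of course write to disk.
cube = np.random.rand(10, 20, 500).astype(np.float32)

with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
    dset = f.create_dataset("seismic/amplitude", data=cube,
                            chunks=(10, 20, 100), compression="gzip")
    dset.attrs["sample_interval_ms"] = 4.0  # illustrative metadata
    # chunked storage lets us read one trace without touching the rest
    trace = f["seismic/amplitude"][0, 0, :]

print(trace.shape)  # (500,)
```

Chunking and compression, as shown, are the usual reasons HDF5 gets picked for large seismic volumes: individual traces can be read back without decompressing the whole cube.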
What is happening with the ‘OpenDB’ project you announced last year?
This is our initiative to build an E&P database that blends PPDM and Resqml.
That sounds like squaring the circle!
Not really. We use concepts from both standards, improving the way PPDM tags objects by adding a fundamental UID. PPDM’s frame concept was used in Resqml. The models are closer than you might think.
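The ‘fundamental UID’ idea can be sketched in a few lines; the record fields below are illustrative stand-ins, not actual PPDM columns:

```python
import uuid

def tag_with_uid(record: dict) -> dict:
    """Return a copy of the record carrying a globally unique id.
    (Field names here are illustrative, not real PPDM columns.)"""
    tagged = dict(record)
    tagged.setdefault("uid", str(uuid.uuid4()))
    return tagged

well = {"well_name": "A-1", "country": "NO"}
tagged = tag_with_uid(well)
print(len(tagged["uid"]))  # 36: canonical UUID string length
```

With a stable UID on every object, the same well can be referenced from a PPDM-style relational store and a Resqml-style document alike, without relying on composite natural keys.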
PPDM to Resqml mapping sounds like a good project for the Standards Leadership Council?
Perhaps, but right now we are more interested in building EnergyML. We want to bring together the work done on Witsml, Prodml and Resqml and put them on a sound common footing. Then we can look to extend the core concepts.
That’s no small task!
Recently things have been very buoyant in the standards space with a lot of enthusiasm from vendors and strong support from several majors. Also there have been many real-world implementations. This is what drives success, implementation and debugging. There has been a cultural shift too, with more involvement from professional software engineers, unlike the enthusiasts of the early days. Now there are more folks with dual IT/domain specializations. Of course this momentum was helped by $120 oil. Things may change now!
Dome Exhibitions’ 4th international digital oilfield conference, held earlier this year in Abu Dhabi, was an interesting show, with more of a process engineering focus than the EU and US events. There was also good involvement from operators, particularly PDO*, with in-depth presentations on several ongoing projects. Hilal Al Harthy observed that, just as the ‘easy oil’ is running out for explorationists, the ‘easy plant’ is rapidly becoming a thing of the past for the process control community. In the old days of high profitability, there was little incentive to optimize. Things have changed with complex plants and automated control systems.
In most cases, poor performance can be traced back to poor plant design.
There is a major disconnect between plant design and process and business objectives. This in turn is down to the lack of in-house control systems expertise and a general failure to design for the big picture of overarching plant control. ‘Optimizing sub components of a system by definition creates a sub optimum system.’ PDO is practicing what it preaches by reviewing existing control schemes and modifying them such that they integrate more facets of the process. The result is less deferred or lost production, reduced operator intervention and automated startups.
Jamal Balushi and Abdulhameed Al-Habsy’s presentation began with an image of the futuristic control room planned for PDO’s gas network operations center (GNOC). GNOC was driven by a rapid growth in PDO’s gas business resulting in a complex of operations over a wide geographical area. GNOC includes three work streams, physical infrastructure, technical (HMI/control and automation) and business requirements. The many small fields currently use control systems from Honeywell, Yokogawa and Foxboro. But the plan is to move to a common HMI with 3rd party integration via Wonderware or Yokogawa. This is ‘not an easy choice.’ Wonderware is already used in PDO’s oil operations and Yokogawa is deployed at the Ju’aymah NGL plant in Saudi. The tradeoffs and vendor dependencies are under study. Final detailed design is slotted for 2015.
Salim Al-Mawali provided an entertaining exposé of ‘smart’ technology use at PDO. Robot and drone usage is being evaluated for use in dangerous environments including H2S, high temperatures and explosive atmospheres. A ‘smart mobile worker’ is equipped with an ‘e-buddy’ system comprising GPS/real time locating receiver, camera, voice headset, gas monitor and health sensor. Elsewhere Falcon-8 drones are used for asset inspections, engineering progress monitoring and virtual geology field trips. Drones are also effective in hazops such as tip inspections. More from Dome Exhibitions.
* Petroleum Development Oman.
Aberdeen University researcher Paul Cleverley, on a sabbatical from Flare Consultants, has published a paper investigating information retrieval using faceted classification and word ‘co-occurrence’ based search. Cleverley quizzed some 54 petroleum engineers as to the use they made of information from in-house document management systems, the wider world wide web and industry specific resources such as the SPE’s PetroWiki. The semantic web for earth and environmental terminology (Sweet) ontology was used to guide research.
Cleverley distinguished between the information resource itself and users’ problems and queries which are distinct and only exist in ‘social reality.’ Users’ needs differ. Groups and ‘sub-cultures’ with shared beliefs will use digital libraries and enterprise search for different purposes. Little research has been done on what enterprise searchers find useful.
Cleverley proposes a new information needs model better tuned to such inherent differences. The ‘Bridges’ (broad, rich, intriguing, descriptive, general, expert and situational) information model may help meet professionals’ information needs and provide a system capability that facilitates serendipitous discovery. The Bridges model ‘has implications for faceted search in enterprise search and digital library deployments.’
At the 2014 Pipeline Conference in Houston last month, Beyond Recognition (BR) announced a free trial of its ‘visual classification’ approach to the management of oil and gas documents. Visual classification (VC) performs a graphical analysis of documents, grouping scanned documents with native-format files. Documents are managed using a normalized schema, independent of their containers, that allows for a single workflow spanning both paper and electronic documents.
VC starts out with a completely automated first pass classification that places files and scans into groups of visually-similar documents. Knowledge workers then perform checks on a few documents to evaluate document importance and to verify processing workflows. After a few weeks, BR claims that almost all incoming documents are placed in appropriate clusters.
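BR’s algorithms are proprietary, but the grouping step can be caricatured with a toy ‘visual hash’ clusterer, assuming each page has already been rendered to a small binary bitmap:

```python
# Toy 'visual classification': each page is a tiny 0/1 bitmap;
# pages whose bitmaps are within a Hamming-distance threshold
# land in the same cluster. Real systems use far richer visual
# features than this.

def page_hash(bitmap):
    return tuple(px for row in bitmap for px in row)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cluster(pages, threshold=2):
    clusters = []  # (representative hash, [page ids])
    for page_id, bitmap in pages:
        h = page_hash(bitmap)
        for rep, members in clusters:
            if hamming(rep, h) <= threshold:
                members.append(page_id)
                break
        else:
            clusters.append((h, [page_id]))
    return [members for _, members in clusters]

form_a = [[1, 1, 0], [0, 1, 0]]   # two near-identical scans...
form_b = [[1, 1, 0], [0, 1, 1]]   # ...one pixel apart
memo = [[0, 0, 1], [1, 0, 1]]     # a visually different layout
print(cluster([("p1", form_a), ("p2", form_b), ("p3", memo)]))
# [['p1', 'p2'], ['p3']]
```

The human review described above then amounts to spot-checking a few members of each cluster rather than every document.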
A similar process is used to pull data elements from document clusters using information prepared by domain experts. Output formats are client-specific, but typical deliverables are CSV files of normalized attributes, image-over-text PDFs and a single instance of the original native file. The process results in a reduction of around 90% on a typical document collection. BR is offering a proof of concept trial on a terabyte of corporate data. More from Beyond Recognition.
Kepware has partnered with Splunk to combine ‘big data’ with the ‘internet of things.’ The latest 5.16 release of Kepware’s KEPServerEX includes a new industrial data forwarder for Splunk plug-in. The data forwarder brings Splunk’s enterprise big data aggregator into the industrial market, streaming industrial sensor data into Splunk’s real-time operational intelligence platform.
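Splunk ingests such streams through its HTTP Event Collector (HEC), which accepts JSON events over HTTPS. A sketch of the payload shape for one sensor reading follows; the tag and field names are our invention, not Kepware’s actual wire format:

```python
import json
import time

def make_hec_event(tag, value, source="kepserverex"):
    """Build one Splunk HTTP Event Collector payload for an
    industrial sensor reading (field names are illustrative)."""
    return {
        "time": time.time(),       # epoch timestamp
        "source": source,
        "sourcetype": "_json",
        "event": {"tag": tag, "value": value},
    }

payload = json.dumps(make_hec_event("Pump01.DischargePressure", 87.3))
# In production this would be POSTed to
#   https://<splunk-host>:8088/services/collector
# with an 'Authorization: Splunk <HEC token>' header.
print(payload)
```

Batching many such readings per POST is the usual way to keep up with high-frequency plant data.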
SysFera and Kitware have partnered to integrate their high performance computing technologies for scientific data analysis. Kitware’s ParaView, an open-source, multi-platform data analysis and visualization application is now available from the SysFera-DS web interface. Users can launch multiple simulation campaigns on an HPC infrastructure, using ParaView’s 2D and 3D data-analytics from any web browser. Remote simulations can be initiated and managed from a laptop or tablet without large data movement.
BMC has announced Control-M for Hadoop, a toolset that ‘simplifies and automates Hadoop application batch processing.’ Control-M schedules and manages Hadoop jobs and provides a ‘robust’ development environment for developing Pig, Hive, Sqoop, MapReduce and HDFS-based applications.
Distinguished lecturer Michael Franklin, speaking at the Ken Kennedy Institute for Information Technology at Rice University, Houston, introduced BDAS, the ‘Berkeley big data stack.’ BDAS provides an open source, ‘unified’ big data toolset. BDAS comes from the Berkeley AMPLab, where machine learning, HPC and data science specialists congregate.
BP’s massive seismic processing center is moonlighting as a general purpose big data ‘hot zone’ where new ideas are tested and concepts proven. BP’s biofuels division recently used the facility to develop yeasts for ethanol production, a component of the company’s Brazilian green business. The BP center for high-performance computing started out as a 2.2 petaflop facility last year and is expected to triple in capacity by 2016.
The V2.0 release of Blueback Reservoir’s Rock Physics Petrel plug-in sports the Microsoft ‘ribbon’ interface and offers direct model exchange with Petrel using IronPython.
Elsevier has added an overlay of C&C Reservoirs’ worldwide oil and gas field analogs knowledge system (Faks) to its Geofacets E&P map portal.
CGG’s new PowerLog Frac software connects petrophysics and fracture simulation.
Check-6 has announced Checklist Ops with Rigor, a mobile app that offers procedural discipline and verifiable compliance for oil and gas tasks ‘susceptible to lost time incidents.’
A new well formation tops module for Deloitte’s Petroview adds sub-surface geological content to its E&P geographic information system.
A free Eagle Ford mobile network app from Marco Flores provides access to information such as job listings, equipment sales, small business support and social media. The smartphone’s GPS is used to localize business listings.
The latest version of Exprodat’s Data Assistant provides data transfer between ArcGIS and exploration and production file formats such as Petrel, Kingdom, OpenWorks and SeisWorks. The software now automatically detects input file formats and provides batch transfer and on-the-fly geodetic transformations.
The new 7.20 version of geoLogic Systems’ geoScout enhances data transfer with DST Pro and offers improved LAS file creation and editing. Datasets include core gamma ray, completions and plugbacks and casing weight and grade.
Ikon Science has extended RokDoc with a geomechanics module, a toolset for 1D and 3D models of reservoirs and their surroundings. The new module computes safe mudweight windows and is used to plan well trajectories and reduce the likelihood of wellbore collapse. Ikon’s geomechanics expertise comes from JRS Petroleum Research, acquired in 2012.
The 4.1.1 release of Interica’s Project Resource Manager offers email alerts for ‘projects out of place,’ notification of low free space on file systems and charts for users and application disk space usage.
Intertec’s new process instrumentation enclosure, developed for a Russian oil refiner, is intended for use in extremely cold climates. Applications include differential pressure flowmeters and process transmitters.
The 2014.2 release of Perigon’s iPoint wellbore data management solution extends its generic query layer to map relational database models which can now be managed natively within iPoint. The release includes new adapters for Petrel, R5000 and Petra along with new ArcGIS-based mapping, Active Directory support and a command line interface.
Petrosys V17.6 introduces templates for consistent mapping, a new spatial editor for work across a selection of popular GIS data sources and a buffer tool for generating polygonal buffers around faults.
SeisWare 9.0 adds new workflows for velocity modeling and time/depth conversion. User roles can be defined for project collaboration and to increase security and data access. Data can be organized without the need for unique naming conventions.
Tendeka’s ‘FracRight’ integrated frac sleeve for selective multi-zone stimulation is now coupled with its real-time DTS monitoring and Quest software for collection and analysis of stimulation data in unconventionals.
Quantitech’s new Frog-4000 rugged, portable instrument performs on-site analysis of volatile organic compounds in air, soil and water for environmental, health and safety applications.
Airo Wireless’ Airo I-Safe 19000 is a 19" Windows 7 panel PC, purpose-built as a fixed-mount status alert system for hazardous environments such as oil and gas, petrochemical and pipeline.
The US Energy Information Administration has released ‘Crude oil import tracker’ to track trends in US crude oil imports. Coit-US features graphing and mapping capabilities.
Speaking at the recent Blue Marble Geographics user group in Calgary, LMKR’s Scott Oelfke stated that Geographic Information Systems (GIS) usage in oil and gas is ‘exploding,’ driven by GIS’ key role in shale development. Shale and other resource plays cover ‘vast areas’ where field planning and logistics are at a premium. GIS enables operators to track the thousands of wells drilled over hundreds of square miles, assuring collision avoidance, enabling rapid permitting and providing up-to-date information. GIS is ‘going viral.’ The downside is that the number of GIS practitioners who understand basic geographical concepts such as datums, projections and shifts is not keeping up with the booming deployment. Oelfke warns that bad geographical assumptions can have severe legal and financial repercussions.
To minimize GIS risk, LMKR has embedded Blue Marble’s coordinate transformation technology in its GeoAtlas mapping flagship. The use of ESRI ‘layer’ files (as opposed to the older shapefile) has simplified data management by linking maps to multiple original QC’d data sources. The arrival of Google Earth (GE) has been a blessing and a curse. GE presents major issues with accuracy and usefulness but it has ‘done more than perhaps even ESRI’ to advance GIS to the forefront of management’s agenda. GE makes everyone a geographer. ‘Resistance is futile, it’s the way industry is moving.’
Alonzo de la Cruz (Suncor Energy), a geomatics professional, showed how care was required during a major GIS revamp to avoid disaster. Suncor’s legacy GIS data included custom coordinate systems created decades ago to ‘simplify engineering calculations in a pre-computer world.’ The company also has projects that use lesser known coordinate systems. With help from Blue Marble, Suncor has created a custom XML format to import its legacy coordinate systems into Blue Marble desktop, at the same time making them easier to find and use.
Before we were thrown out (see last month’s editorial) we managed to capture the opening session of the 2014 Society of Exploration Geophysicists’ annual convention held in Denver last month. SEG president Don Steeples put current membership at around 30,000, with 37% in the US. Membership is currently ‘stagnant to falling slightly.’ While there have been over a million downloads of abstracts from Geophysics (the SEG’s flagship publication), its companion, The Leading Edge, is losing popularity.

Next up was Tom Petrie, banker and pundit who has appeared on both PBS and Fox. Petrie ran through the gist of his book, ‘Following Oil,’ which traces the rise of US shale production and its implications for US energy independence. Unconventionals are the ‘grand disruptor’ that has halted a four-decade-long production decline and put paid to shibboleths like peak oil. There are currently some 14 unconventional plays in N America with the potential for huge expansion in the resource base. Economic disruption is following, with a reduced trade deficit and restored manufacturing competitiveness—especially in petrochemicals. On the technical front, shale has led to dramatic evolution in drill rig design and improvements in geosteering, frac and proppant make-up and, for the geophysicists, the microseismic advances that have ‘transformed the Bakken.’ On the environmental front the US, despite not being a signatory to the Kyoto accord, is now ‘one of the most compliant countries in the world,’ and this has been achieved ‘thanks to private capital.’
DrillingInfo CEO Allen Gilmer took the mike, stating there was ‘no escape from analytics and big data!’ While these are ‘overused terms,’ seismics is the ‘granddaddy’ of big data, especially with passive monitoring. Microseismics allows operators to compute fractured rock volumes in real time, stopping or changing parameters to optimize treatment. Elsewhere, multivariate statistics on big data can bridge geology and engineering to identify engineering best practices … and also to identify operators that are not so good. One spent $100 million sans any added value, another spent $150 million to add 1.5 bopd of production. ‘The best operators are miles better than the worst.’ On the topic of costs, Gilmer ventured that $50 was the mid-range breakeven cost for shale operators but that for some, this could rise to the ‘$60-$100 range.’
Author Chunka Mui segued clumsily into driverless cars, now positioned as a solution to the 4 million US car crashes and 33,000 deaths per year. Worldwide there are 1.3 million deaths per year that could be avoided by ‘taking humans out of the loop.’ There is an ‘industry-wide arms race’ to develop driverless cars. Uber got a plug as another ‘killer app,’ but Mui sees even more potential in combining driverless with Uber. There would be losers of course: car dealerships, auto insurers and … oil and gas too! According to Google’s Sergey Brin, the timeline for all this is 2017/18, in other words, ‘real soon now.’ The SEG did not solicit questions from the floor, but chair Rutt Bridges did a great job of asking (and answering) more questions than were really necessary.
Speaking in the ‘Road Ahead’ session, Ion’s Christoph Stork described a perfect (seismic) world with million channel/shot surveys but with a correspondingly unworldly price tag. Compromise is necessary, through ‘custom’ acquisition. The new name of the game is ‘compressive sensing,’ tuning acquisition to geological objectives and making smart compromises. Stork made a call for help from academia for better tools for survey design risk reduction and understanding of noise. Synthetic model data is key to acquisition modeling, to ‘show where attributes can be trusted.’
The iconoclasts were out in force. Art Weglein argued against the ‘inclusive’ view that primaries and multiples are both signal and ‘should be migrated.’ All current reverse time methods fail Claerbout’s test of source/receiver coincidence. Multiples ‘are never involved in imaging and inversion.’ Sergey Fomel argued that the unpopular topic of time migration deserves further research as a means of avoiding the problem of velocity, the ‘elephant in the room’ for depth migration. Fomel is skeptical that the velocity problem can be solved. BP’s John Etgen was beyond skeptical. Despite ‘umpteen’ papers on full waveform inversion, and all the wide-azimuth, coil-shooting bells and whistles, subsalt imaging in the complex Gulf of Mexico still fails. Studies on synthetic data show that as little as 5% of the subsurface can be imaged. ‘It is the velocity model that is killing us,’ as small errors in the salt model degrade migration rapidly. Modern acquisition is fit for purpose, but our models are inadequate. Etgen, a keen amateur astronomer, sees hope in adaptive optics, a technique for correcting light’s passage through the earth’s turbulent atmosphere. His suggestion? Use a similar approach, numerically ‘reshaping’ the wavefield in the vicinity of a high-contrast interface, ‘riding along with the waves to see where they are having trouble.’
CGG’s Sam Gray offered a more measured view of current seismic methods. ‘Big’ (structural) imaging and ‘little’ (rock property) imaging are on convergent paths. Broadband ‘big’ may include a stratigraphic component, and ‘little’ imaging of unconventional targets may benefit from migration. But fractures occur on a centimeter scale, quite different from seismics. ‘We will never get the centimeter spacing required.’ Tomorrow’s rock property investigators ‘will need to know a lot of stuff and then some.’ Seismics needs help (from academia) because ‘the little problems are harder than the big ones.’ On the ‘little’ front, seismic attributes continue to fascinate, and multiply. Kurt Marfurt offered a historical overview, from Balch’s 1971 color sonogram to today’s high-end attribute analytics from companies like FFA/Geoteric and Marfurt’s own published work on multi-attribute cluster analysis. ‘Generative topographic mapping’ also ran, a neural net approach to finding a proxy for petrofacies. Headwave’s interactive pre-stack environment also got a plug.
OptaSense has carved itself quite a niche in the fiber optic monitoring arena and helped out with Shell’s evaluation of different kinds of fiber for use in distributed acoustic sensing. The problem with fiber is that it is sensitive to sound waves travelling along the axis of the fiber, less so for perpendicular arrivals. This precludes its use as a regular, horizontally-deployed surface cable. Optasense has extended fiber’s directional sensitivity with a helically wound arrangement which fared well in Shell’s tests. Fiber generates massive amounts of data in a short time. Current systems generate a terabyte in a few hours. This is processed in the field to a manageable SEG-D dataset. The raw data is then chucked!
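A back-of-envelope check on those volumes (our own arithmetic, taking ‘a few hours’ as three) shows why in-field reduction to SEG-D is unavoidable:

```python
# One decimal terabyte over three hours, as a sustained data rate.
TB = 1e12          # bytes
hours = 3
rate_mb_s = TB / (hours * 3600) / 1e6
print(round(rate_mb_s))  # 93 (MB/s) -- too much to ship raw from the field
```

Sustaining roughly 90 MB/s per interrogator, around the clock, quickly outruns any practical link from a remote wellsite, hence the decision to keep only the processed dataset.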
Simplivity’s pitch is to replace the whole storage/server/network stack with its OmniCube ‘in-house cloud.’ The system comes with tools for data rationalization reported to bring major storage savings through de-dupe across all locations. Petrobras has deployed dual cubes to replicate data from an FPSO to onshore HQ.
Nvidia Index technology has been used by Landmark to bring an 8x speed-up to full wavefield modeling. Complex deep water or unconventional plays can be modeled in ‘minutes or hours.’ Index can provide remote compute horsepower for tablet or laptop thin clients. Index came from Mental Ray/iRay, acquired by Nvidia. Shell is reported to be an enthusiastic user. Headwave/Hue uses the technology to provide management and exploratory analysis of terabyte prestack seismic datasets. Hue is now also marketing its proprietary compression technology (as used in HueSpace) to third party developers. This is said to be 25x faster than existing technology, offering smaller files and better signal to noise.
Fraunhofer’s Franz-Joseph Freund is skeptical of the GPU approach, ‘GPUs are limited by the size of the cards and by PCI bandwidth. Direct CPU to memory access is much faster.’ Fraunhofer’s PV4D data visualization engine uses this approach in a new hexahedron viewer that scales to terabyte data sets. PV4D is delivered as a toolkit for 3rd party developers.
According to Ikon Science, ‘today’s seismic inversion is not trusted by modelers.’ Ikon’s ‘Ji-Fi’ technology sets out to change this with Bayesian inversion that operates on a per-facies basis, using prior information from RokDoc’s rock properties library.
Schlumberger is introducing a cloud computing offering for power users of Petrel, Techlog and Omega. Algorithms are said to run up to 1000x faster than on a local machine. The Schlumberger cloud was originally developed for service use; the Geosphere geosteering service has been using it for a year or so. The petrotechnical cloud also provides remote virtual machines running Petrel for field offices. The multi-OS offering is now available for wider industry use.
On the esoteric hardware front we spotted Green Revolution that offers liquid cooled enclosures for compute clusters. The approach allows for 30% overclocking sans fans, air conditioning or raised floors. Supermicro is an OEM. GR’s biggest installation comes in at 10 megawatts. GR is used by CGG and Down Under Geo.
On the Wipro booth, Landmark founding father Royce Nelson, representing his startup Dynamic Measurement, showed us his ‘natural source’ electromagnetic technique. Where controlled source EM pumps around a thousand amps into the ground, ‘natural source’ EM (lightning) provides around 30,000 amps per strike. DM uses the North American lightning detection network to triangulate strikes and map peak current, claimed to relate to subsurface geology. The US Gulf coast is well endowed, with around 60 strikes/km²/year, and ‘90% of fields show up as lightning anomalies.’ DM was awarded a US patent last year for the technology. More from the SEG.
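For the curious, the principle behind triangulating a strike from a detection network can be sketched as a time-of-arrival inversion. The toy Python example below (sensor layout, grid and brute-force solver are our own invention, not DM’s or the network’s method) recovers a strike location from arrival-time differences at four stations:

```python
import itertools

C = 300.0  # propagation speed in km/ms (speed of light), illustrative units

# four hypothetical sensor positions on a 100 km square, in km
sensors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]

def toa(strike, sensor):
    """Time of arrival (ms) of a strike's signal at a sensor."""
    dx, dy = strike[0] - sensor[0], strike[1] - sensor[1]
    return (dx * dx + dy * dy) ** 0.5 / C

def locate(arrivals, step=1.0):
    """Grid-search the strike location minimizing the TDOA misfit."""
    best, best_err = None, float("inf")
    for x, y in itertools.product(range(0, 101), repeat=2):
        p = (x * step, y * step)
        # compare arrival-time differences relative to the first sensor
        pred = [toa(p, s) - toa(p, sensors[0]) for s in sensors]
        obs = [t - arrivals[0] for t in arrivals]
        err = sum((a - b) ** 2 for a, b in zip(pred, obs))
        if err < best_err:
            best, best_err = p, err
    return best

true_strike = (37.0, 62.0)
arrivals = [toa(true_strike, s) for s in sensors]
print(locate(arrivals))  # recovers (37.0, 62.0) on the 1 km grid
```

Real networks use many stations, timing corrections and far more refined solvers, but the geometry is the same.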
The Energistics-backed National Data Repository (NDR) group, an informal gathering of regulators and data managers from around the world, met last month in Baku, Azerbaijan. The meet heard from a range of data repositories at different stages of evolution and taking very different development paths. To give just a couple of examples from the dozen or so on show, we contrast the approaches taken by Venezuela and Kenya.
Victor Bruley from Scan Geofisica outlined Venezuela’s journey to an NDR. To date, some 150,000 (of an estimated total of around one million) magnetic tapes have been transcribed and stored in a new data center. Bruley observed that ‘classic artisanal’ transcription methods do not scale to such operations. Manual transcription operations have been shown to produce a significant error count, so an automated process has been developed. This involves ‘no human decisions,’ real time automatic QC and minimal tape handling. Transcription has been turned into a ‘manufacturing process’ with one operator running eight tape drives, transcribing 150 tapes/hour.
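Taking the figures quoted above at face value, a quick calculation shows the scale of the job remaining (assuming, optimistically, a single line running at the stated rate around the clock):

```python
# Remaining transcription time from the figures quoted in the article.
# Assumes one line at 150 tapes/hour running 24/7, which is optimistic.
total_tapes = 1_000_000
done = 150_000
rate_per_hour = 150

remaining_hours = (total_tapes - done) / rate_per_hour   # ≈ 5,667 hours
remaining_years = remaining_hours / (24 * 365)           # ≈ 0.65 years
```

So even a fully industrialized process has the best part of a year of continuous running ahead of it, and rather more with realistic shift patterns and downtime.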
The technique adopted is ‘blind copying,’ leveraging tape metadata where available but using pattern recognition and automation on tapes of unknown provenance. Data is encrypted and stored across multiple locations for redundancy and backup. Bruley reported that free/open source software has been used on the project, citing SeiSee from DMNG. Capacity in 2014 is around 48TB, a tenth of what will ultimately be required.
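The kind of ‘pattern recognition’ used on tapes of unknown provenance can be illustrated with a simple format-sniffing heuristic. The Python sketch below (the test and thresholds are ours, not Scan Geofisica’s process) guesses whether a block is the start of a SEG-Y dataset by looking for its 3200-byte EBCDIC textual header of 40 ‘card images,’ conventionally beginning with ‘C’:

```python
import codecs

def looks_like_segy_header(block: bytes) -> bool:
    """Heuristic: does this block start a SEG-Y textual header?"""
    if len(block) < 3200:
        return False
    try:
        text = codecs.decode(block[:3200], "cp500")  # EBCDIC code page
    except UnicodeDecodeError:
        return False
    cards = [text[i:i + 80] for i in range(0, 3200, 80)]
    # count the 80-character cards that begin with 'C', per SEG-Y convention
    c_cards = sum(1 for card in cards if card.startswith("C"))
    return c_cards >= 30  # tolerate a few non-conforming cards

# synthetic example: 40 EBCDIC card images, 80 characters each
header = codecs.encode(
    "".join(f"C{n:02d} ".ljust(80) for n in range(1, 41)), "cp500")
print(looks_like_segy_header(header))          # True
print(looks_like_segy_header(b"\x00" * 3200))  # False
```

A production pipeline would chain many such detectors (SEG-D, SEG-A, vendor formats) and fall back to raw blind copy when nothing matches.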
Eunice Kilonzo reported on Kenya’s ‘innovative’ funding solution for its early stage national data center (NDC). Kenya needed to move beyond basic data management and position itself to make informed decisions about its petroleum resource. The Kenyan National Oil Co. has contracted with Schlumberger for multi-client 2D data acquisition which will provide funding for the NDC. This will deploy a Schlumberger software stack of ProSource Data Services/DecisionPoint and ESRI ArcIMS on a Windows server alongside a Seabed database and ESRI ArcSDE running on Linux.
The NDR group has kicked off a data quality/business rules project that will leverage data quality work performed by Petronas in its ‘technical applications and data repository optimization’ (Tadro) initiative. An initial work group has been established and will report at the next meeting, to be held in Houston in 2016.
The physiognomists amongst you might like to check out slide 3 of the invitation to the 2016 NDR and enter our caption competition. More presentations from NDR 2014 on the Energistics/NDR home page.
Andrew Howell is now CEO of KBC.
ARMA International is currently looking for a CEO.
Gary Patterson heads up Aveva’s new office in Aberdeen.
Jeffry Haas has joined BellowsTech as technical support engineer. He is a recent graduate of Embry-Riddle University, Daytona Beach, Florida.
Ashtead Technology has recruited Alison Glover to its non-destructive testing center. She hails from Lavender Intl.
Mary Francis has been named corporate secretary and chief governance officer of Chevron, succeeding Lydia Beebe who is retiring next year.
CMG has named Rob Eastick VP of its CoFlow division.
Fereidoun Abbassian heads up BP’s new digital center of expertise and big data.
The UK Energy Institute has announced a new research and engagement program, the Energy Barometer.
Emerson Process Management is opening a $10 million ‘pervasive sensing center’ for the Asia-Pacific region.
BP’s Elinor Doubell has been elected to the Energistics board of directors.
ExxonMobil is a founding member of the MIT energy initiative to ‘advance and explore’ the future of energy.
FMC Technologies has appointed Fluor Corp. COO Peter Oosterveer to its board.
Carl Larry has been named director, business development for Frost & Sullivan. He was previously with Credit Suisse.
Terry Payne, former President of Platt Sparks, has joined FTI Consulting as senior MD.
Patricia Vega is head of GE Oil and Gas in Latin America.
Gustavo Usero is Geovariances’ new representative in Brazil.
Gary Whitsell is joining Inoapps in Houston as part of a $2 million investment which will add another 10 people to the company’s current 200+ employees.
Firdaus Hadi has joined IRM Systems as senior engineer, Derek Enhao Lee as engineering consultant and Saad Wahid as junior engineer.
Katalyst has opened an iGlass data center near London.
Kongsberg Oil & Gas Technologies has appointed Mike Branchflower as global sales manager for flow assurance. He hails from Schlumberger.
Leidos Holdings has appointed Mike Leiter to executive VP business development and strategy. He comes from Palantir Technologies.
David Shorey is now VP of Nanometrics’ oil and gas division. He was previously with Baker Hughes.
Navigant has named Randy Zwirn to its board. Zwirn is CEO, Energy Services and president of Siemens Energy.
Noah Consulting has named Grant Hartwright head of facilities IM and Greg Kowalik and Todd Burns co-heads of its content enablement practice.
Bob Bullwinkel (Schlumberger) has been elected to the OFS Portal board.
Ingo Simonis is director, interoperability programs and science at the Open geospatial consortium.
Stefan Papenfuss heads up Quest Integrity Group’s new flow loop testing facility in Houston.
Scott Harrison (Sales) and Theodoros Sketopoulos (Support) have joined Rock Flow Dynamics’ new UK office in Aberdeen.
Bill Wicker is now CEO of Venture Global LNG. He was previously with Morgan Stanley.
P2 Energy Solutions (formerly Petroleum Place) has acquired production management and hydrocarbon accounting software boutique Merrick Systems. HitecVision (for Merrick) and Advent International (for P2) backed the deal whose financial terms were not disclosed.
IHS has acquired PacWest Consulting Partners, a provider of market intelligence to the unconventional oil and gas industry, bringing PacWest’s ‘IQ’ products (PumpingIQ, WellIQ, and ProppantIQ) into the IHS fold.
GSE Systems has acquired staffing boutique Hyperspring along with a 50% interest in IntelliQlik, developer of an online learning and learning management system for the energy market. The deal involves an initial $3 million payment with up to $8.4 million more if certain earnings targets are met.
Intervale Capital is selling Proserv Group to Riverstone Holdings, a private equity group with over $27 billion of committed capital. Riverstone and Proserv management are acquiring the company from Intervale, Weatherford and minority shareholders.
Emerson Process Management has acquired Management Resources Group, a management consultancy specializing in manufacturing reliability.
CGG has turned down Technip’s unsolicited bid, rejecting the financial terms and the ‘industrial logic’ of integrating CGG’s activities within Technip.
FTI has acquired Platt Sparks & Assoc. consulting petroleum engineers.
Altair recently revealed the extent to which its PBS Professional workload manager and job scheduler for high-performance computing (HPC) has penetrated the oil and gas vertical. On the fluid flow front, Computer Modeling Group has integrated PBS with its Imex/Stars reservoir simulator line-up. PBS is also embedded in Emerson/Roxar’s Tempest simulator and reservoir engineering software suite. Rock Flow Dynamics’ tNavigator supports PBS and Schlumberger offers PBS Professional support to users of its Eclipse simulator flagship.
PBS has also seen take-up in seismics: Paradigm has integrated its Echos seismic processing and imaging system with PBS Professional. Finally, Petroleum Experts (Petex) now supports PBS Professional in Resolve, its master controller of simulations spanning multiple third party reservoir, process and pipeline transient simulators. Petex’ Integrated Production Modeling system also uses Altair’s HPC tools.
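For readers unfamiliar with PBS, the integrations above boil down to generating and submitting job scripts on the user’s behalf. The following Python sketch shows what such a generated script might look like; the directives are standard PBS, but the queue name, resource shape and simulator command line are invented for illustration:

```python
# Sketch of how a simulator front end might build a PBS Professional
# job script. Queue, resources and command are hypothetical examples.
def make_pbs_script(job_name, command, nodes=2, ncpus=16,
                    walltime="04:00:00", queue="workq"):
    return "\n".join([
        "#!/bin/bash",
        f"#PBS -N {job_name}",
        f"#PBS -q {queue}",
        f"#PBS -l select={nodes}:ncpus={ncpus}:mpiprocs={ncpus}",
        f"#PBS -l walltime={walltime}",
        "#PBS -j oe",            # merge stdout and stderr into one log
        "cd $PBS_O_WORKDIR",     # run from the submission directory
        command,
    ]) + "\n"

script = make_pbs_script("bakken-hm-042", "mpiexec ./simulator case.dat")
# submission would then be, e.g.:
#   subprocess.run(["qsub"], input=script, text=True)
```

The value the vendors cite is that the end user never sees any of this; the application picks resources and queues according to site policy.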
Altair CTO Sam Mahalingam observed, ‘Oil and gas operators rely on PBS Professional to gain more efficiencies from their computing investments and HPC end-user software developers recognize that supporting PBS Professional will improve their implementations.’ More from Altair.
Heightened exploration and development activity in North America has led to a boom in telecommunications and remote monitoring. In Alaska, WellAware has combined its RPMA technology with GCI’s cellular network to offer machine-to-machine (M2M) communications throughout the State.
Global satellite operator SES has signed a new capacity agreement with SageNet to help energy companies connect operations across North America. The solution uses Ku-band capacity on the SES-1 satellite to provide managed enterprise communications solutions for drilling rigs and electric and natural gas utilities.
FreeWave Technologies has announced WaveContact, a family of wireless I/O solutions for industrial M2M automation and control. WaveContact provides Class I, Division 1 wireless I/O and data aggregation, extending connectivity across wide area sensor networks and facilitating M2M setup for the oil and gas and other verticals. End points come as self-contained, explosion-proof enclosures with multiple analog voltage inputs and an RS485 interface.
Extronics and Aruba Networks have developed what is claimed as the industry’s first explosion-proof outdoor gigabit Wi-Fi access point, combining Aruba’s ruggedized AP-274 outdoor AP with Extronics’ iWAP107 explosion-proof enclosure. The 802.11ac AP targets the ‘industrial internet of things’ in the chemical, oil and gas and other verticals.
BATS Wireless has announced BTS for broadband connectivity to FPSOs and drillships. The stabilized microwave solution provides comms and data links between assets.
Folks like to pretend that ‘digital’ is new and that digital ‘natives’ are displacing old farts with pen and pencil. In the latest issue of the BP Magazine, CTO David Eyton is quoted as saying that, ‘Across BP, the application of digital technologies is changing the way in which we operate. The clock speed of digital technology evolution is very rapid.’
Elsewhere on the BP website, a short history of digital business at BP suggests that, although its speed may be high today, BP’s digital clock has been ticking away for quite a while. For BP, the dawn of digital came in 1956 with the acquisition of an English Electric ‘Deuce’ computer, used to optimize refinery operations. By 1967, there were 40 computers in the company and it was observed that ‘The time is fast approaching when it will be difficult to operate the company without [computers].’
Given that some of those who were involved in BP’s first digital work are probably dead by now, one has to wonder when ‘digital’ will stop being news and finally be accepted as an integral but unremarkable part of the business!
Spectra Logic has thwarted a patent infringement lawsuit brought in July 2012 by Overland Storage. The squabble concerns Overland’s patent on storage library partitioning. This month the patent office appeal board ruled that this was invalid since the claims made in the patent were disclosed as ‘prior art’ in an IBM 3494 tape library manual. The board also disallowed Overland’s attempt to amend its claim. Last year the International trade commission ruled the patent to be invalid. More from Spectra Logic.
BP has selected Asset Guardian Solutions’ software management platform to manage the process control software used in its Quad 204 West of Shetland FPSO.
Noah Consulting has expanded its alliance with EnergyIQ, combining EnergyIQ’s Trusted Data Manager (TDM) software with Noah Consulting’s IM services.
Turkish EPC Tekfen has migrated to Aveva’s Everything3D design software.
Capgemini has completed implementation of its cloud-based EnergyPath SAP solution at Excelerate Energy.
CGG’s PowerLog has been selected by Baker Hughes Incorporated as its petrophysical software application of choice.
BP has awarded Emerson Process Management a $40 million automation contract for its Shah Deniz Stage 2 project in Azerbaijan. The deal includes integrated control and safety systems for two new offshore platforms and an onshore gas processing plant.
Geovariances has partnered with Emzed Exploration Services in Canada to promote Isatis and organize joint consulting and training.
GSE Systems and Specialized Petroleum Technologies are to provide training products and services to Kasipkor Holding’s Atyrau Petroleum Education Center in Kazakhstan.
Harvey Gulf International Marine has deployed Advanced Logistics’ ‘Samm’ marine management and preventative maintenance system on its fleet.
GDF Suez E&P UK has chosen IHS’ risk assessment solution to enhance operational risk management in the UK.
Ikon Science has joined Phase 3 of the SEG Advanced Modeling (SEAM) consortium and will work to evaluate current methodologies for pre-drill pressure and hazard predictions.
Intertek has entered an agreement to provide analytical laboratory services to Statoil in Kalundborg, Denmark.
Petronas Carigali has awarded Kongsberg Oil & Gas a four year, $2.2 million contract for delivery of SiteCom real-time drilling operations software.
LMKR has partnered with Lumina to improve geoscience data interpretation and data integration. A new spectral decomposition-based tool, Predict 3D, is to be released early 2015. LMKR has also teamed with PetroWeb to provide connectivity between its GeoGraphix products and PetroWeb’s PPDM-based Enterprise DB.
Darren McLean Consulting has joined Merrick Systems’ partner list in Canada.
Paradigm has signed a long-term contract with Chevron Energy Technology providing access to an expanded Paradigm product portfolio. The additions include Paradigm solutions for formation evaluation, seismic imaging and drilling engineering.
Petrotechnics has teamed with Texas A&M’s Mary Kay O’Connor process safety center to support R&D into the HSE effects of chemical products and processes.
GDF Suez Norge has awarded RSI a $2 million, three year frame agreement for well log analysis, rock physics, reservoir characterization and CSEM processing.
WPX Energy has contracted Sigma Cubed to supply an integrated 3D reservoir model of the Bakken reservoir to identify sweet spots and predict performance of blind wells.
Wex Fuel Management has signed with Titan Cloud Software to ‘strategically promote’ their software solutions for fleet managers and petroleum retailers.
DrillingInfo blogger John Fierstien has followed up his earlier article on US well numbering with an authoritative post on Canadian well numbers and the Canadian land system.
The Modelica association reports that adoption of its functional mockup interface (FMI) standard has ‘exceeded expectations,’ with 60 vendor tools listed as supporting or planning support for FMI 1.0.
The 8.6 edition of the European Petroleum Survey Group’s geodetics library is available with new data for some 22 countries and revisions to EASE-Grids and the seismic bin grid transformation examples. The registry can now export valid CRSs and transformations in the ISO 19162 ‘well-known text’ format. Dataset entries may be exported individually, by user-defined groups, or through download of all valid CRSs and transformations. The EPSG has also introduced an open source web service, epsg.io, exposing data based on the ‘official’ EPSG database maintained by the OGP Geomatics Committee.
GeoGig, an open source spatial data processing tool (inspired by GitHub) is approaching its first major release. The GeoGig code base has been contributed to the LocationTech working group at the Eclipse Foundation. GeoGig offers an audit trail to changes in a spatial data repository. Watch a demo on the BoundlessGeo website.
Oasis has formed a new technical committee to work on content management interoperability services for digital asset management (CMIS4DAM) and is seeking submissions of technology for consideration.
The OGP Geomatics Committee has released SeabedML, a geography markup language application schema for its subsea data model (SSDM). SeabedML provides an ‘open’ alternative to the Esri ArcGIS/geodatabase SSDM delivery template.
The Standards leadership council has announced an ‘Operator advisory panel,’ to serve as a ‘standards sounding board’ and to be a ‘visible evangelist’ for standards development and promotion.
The US National institute for occupational safety and health (Niosh) of the Centers for disease control and prevention (CDC) is to publish a National agenda on total worker health (TWH). The draft agenda is meant to ‘stimulate research, applications and guide policy related to worker health.’ The initiative represents an attempt to create a national occupational health agenda (Nora) covering a variety of industrial segments including oil and gas. The overarching goal is to ‘prevent worker injury and illness and to advance the safety, health, and well-being of the workforce.’ To increase awareness and adoption of occupational health and safety in the workplace an ‘internet-based open source system for disseminating TWH best practices’ has been proposed. Stakeholders are invited to comment on the draft and provide input on top priority issues.
It may be something of a reprise, but the Chemical Safety Board’s video ‘Behind the Curve,’ detailing findings and recommendations on the 2010 Tesoro Anacortes refinery accident, has been updated to include interviews with the CSB’s investigators and with chairperson Rafael Moure-Eraso, who observed, ‘The CSB is seriously concerned by the number of deadly refinery accidents in recent years. We have concluded that extensive improvements must be made in how refineries are regulated at the state and federal level.’ The Tesoro investigation, one of the most extensive and complex of recent years, found a ‘substandard safety culture at Tesoro, which led to a complacent attitude toward flammable leaks and occasional fires over the years.’ The CSB has recommended that the American Petroleum Institute review its recommendations and guidelines and that the EPA use its authority under the Clean Air Act to promote safety.
Comment—Unfortunately the API is more likely to be battling the EPA from the opposite direction these days, on behalf of less regulation and laxer standards for industry.
Version 2 of Norway-based Billington Process Technology’s BPT Toolkit includes a process simulation optimizer that targets users of AspenTech’s HySys process simulation software. PSO is a license management tool that optimizes use of HySys licenses and token systems. The Toolkit also ensures that HySys users comply with corporate policy on license usage without oversight from IT or management, eliminating the ‘friction’ in policy enforcement.
BPT VP Gerry Lillie told Oil IT Journal, ‘We expect the new license and user management system to have a massive impact on the AspenTech/HySys user base and save millions of dollars in license fees.’
Comment—As we have observed before, license management is a double-edged sword. Money saved by the user community is money lost to development!
CGG’s Robertson unit has teamed with Wood Mackenzie on ‘EV2,’ a new exploration valuation tool combining Robertson’s geological knowledge with Woodmac’s commercial analysis expertise. Both companies have decades of experience providing clients with proprietary databases of the world’s oil and gas fields, wells, subsurface data and analysis. EV2 provides analysis combining play risk assessment with Monte Carlo economic simulations. New venture specialists can rank exploration opportunities in terms of subsurface risk, volume and value potential to identify the best opportunities.
Sophie Zurquiyah, head of CGG’s geophysics and reservoir unit, said, ‘Valuing acreage can be challenging as it requires extensive time and resources and global basin knowledge. EV2 brings rapid access to this information along with the risks, volumes and value of each block.’
EV2 is delivered with data on ten ‘hot’ basins. The full package of 200 basins around the world will be released in phases through 2015 and 2016. Early adopters will have the opportunity to influence the final product’s development and roll-out.
It is possibly the most geeky video ever made, but if you are really into data preservation, watch SpectrumData’s film of a 1/2” tape containing a ‘multimillion dollar data set’ that disintegrates as it is read. If you want to convince management of the need to look after your digital asset, get them to watch too.
SpectrumData MD Guy Holmes observed, ‘This phenomenon has been around for years but we are seeing it progress at an alarming rate. While media age and storage conditions are partly responsible, cheaper tape brands are far more susceptible. Tapes we thought would last 30 or 40 years, simply don’t!’
A dwindling pool of tape manufacturers means that ‘we see massive pools of media that don’t stand the test of time.’ Holmes advised, ‘There is no substitute for acting before the problem arises.’ Data transcription, before it is too late, is advised, possibly to SpectrumData’s new ‘virtual tape’ solution, that uses Amazon’s EC2 cloud to ‘break the offsite storage cycle’ into which companies have been ‘locked for years.’
Calgary-based Computer Modeling Group showed a teaser of a reservoir and production system modeling application at the Amsterdam SPE ATCE last month. CoFlow is the culmination of the dynamic reservoir modeling system (DRMS) joint industry project carried out by CMG and partners Petrobras and Shell.
CoFlow’s multi-discipline modeling environment spans the reservoir, gathering networks and production systems. The framework provides a collaboration environment and guided workflows for development of multiple assets. Workflows are customizable to corporate requirements and provide ‘end-to-end integrated uncertainty, optimization and history matching to support forecasting, planning and reserves updates.’
CoFlow was validated in a three-reservoir, deep water field producing different fluids to a single offshore platform. The current R10 CoFlow edition will be released to Shell and Petrobras in 2015. Others will have to wait ‘several years’ before the software is released commercially!
Interica has teamed with ETL Solutions to combine ETL’s Transformation Manager and DataHub with Interica’s PARS/iAsset. The combined solution will enhance support for ‘project aware’ archiving and backups of exploration and production data, allowing companies to map and move data from legacy applications to current versions.
DataHub captures complex digital content for long-term knowledge retention, compliance and storage management. Project metadata is stored within the archive for rapid identification, restoration or migration. The new offering extends to real time data which can now flow through DataHub and into the iAsset repository.
We asked Interica if the new ‘real time’ functionality meant that it would be competing with the ubiquitous PI System from OSIsoft. Interica’s Simon Kendall told us, ‘Our combined solution provides flexible data access and storage and does, to an extent, provide similar features to OSIsoft’s PI historian. However the flexibility we offer allows exploration specialists to use E&P applications of their choice.’ More from Interica and ETL.
Avere has also announced a ‘software-only’ version of its FXT Edge Filer that offers cloud-based storage on Amazon’s off-site public EC2 cloud. A white paper describes how the ‘real’ cloud can extend on-site storage, while ‘resolving otherwise deal-breaking performance, scalability, access, and management challenges.’ In this configuration, Amazon’s cloud can serve either as permanently provisioned IT infrastructure or provide overflow burst-compute resources to temporarily add compute horsepower for big data applications.
A new ‘Control of work’ offering from Aveva provides operators with increased control, efficiency and safety of maintenance work on high-risk assets. Control of Work covers routine maintenance and engineering operations with modules for risk assessment, permit and activity management and lessons learned reporting. Aveva VP Jan Edvin said, ‘Control of Work interoperates with enterprise asset management solutions, such as SAP, IBM Maximo and Aveva WorkMate, making engineering tasks safer and minimizing downtime.’
At its 2014 world summit last month Aveva announced an extension of customer trials of its ‘Everything 3D’ engineering design software running in the Amazon EC2 cloud. The cloud based platform was developed with support from poster children Shell and Lundin which have provided digital assets from the Gulf of Mexico and the Norwegian Continental Shelf. Aveva’s ‘digital asset’ approach was also on show, with case studies from Amec, WorleyParsons, Foster Wheeler, RasGas, Adma Opco and others.