It’s been a year or so since I shared my ‘industry at large’ folder, during which time a lot has happened. The oil price has halved, so this is not such cheery reading as last year, but here goes...
According to the API, US crude production ‘hit its highest level since March 1986 last November,’ up 15% to 9 million bopd and the highest monthly rate since ‘peak oil’ back in 1973.
Speaking at a recent Credit Suisse event, Schlumberger’s Patrick Schorn observed that while capex increased fivefold from 2001 to 2012, this only produced a 10% hike in production. This was ‘becoming less and less sustainable even before the dramatic fall in the oil price.’ Schorn criticized the industry’s lack of integration, ‘in many operations, a number of different companies are involved, each interfaces with the customer but looks after its own commercial interest.’ Schorn also revealed that 2014 North American revenues for a Halliburton/Baker Hughes combination were $8bn (double SLB’s). But internationally, SLB beats HAL/BHI with $7 billion (vs. $6 billion) revenue and a (commercially interesting) 24% return vs. 15% for HAL/BHI. Oh, and SLB has also announced 9,000 layoffs and HAL, 6,400.
A study by Gaffney Cline finds that a falling rig count may have a two million barrels/day impact on US unconventional oil production despite US Energy Information Administration forecasts to the contrary. The picture emerging is complex and nuanced by the time lag between drilling and production hook-up. Continental Resources is reportedly drilling but not completing wells, ‘seeking to defer flush production in the hope of capturing a price recovery or spike.’
IHS forecasts that the ‘stunning growth’ in US oil production may come to a halt mid-2015 as low oil prices start to bite. Around half of new wells in 2014 had a breakeven price of $60 or less while some 30% require $80 or higher. Another IHS study finds unsurprisingly that small US E&Ps are particularly exposed with a ‘worrisome’ median debt-to-appraised worth of assets ratio of 51% and anticipates ‘serious liquidity issues should prices remain depressed beyond 2015.’
Drillinginfo’s Allen Gilmer thinks that the rig count is old hat, a concept that has become abstracted from production. He proposes a new Drillinginfo Index that identifies actual drilling locations along with production data to provide an accurate indication of capacity. Gilmer estimates that of late, around 6 million barrels of new ‘produce-ability’ has been coming into the system every year, offset by ‘very high decline rates.’
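Gilmer’s point about ‘very high decline rates’ can be illustrated with a classic Arps exponential decline. The 1,000 bopd initial rate and 60% nominal annual decline below are illustrative assumptions typical of shale wells, not Drillinginfo figures.

```python
import math

def exponential_decline(q0, decline, t_years):
    """Arps exponential decline: q(t) = q0 * exp(-D * t)."""
    return q0 * math.exp(-decline * t_years)

# An assumed 1,000 bopd well with a 60% nominal annual decline,
# one year after coming on stream:
rate = exponential_decline(1000, 0.6, 1)  # ~549 bopd after one year
```

At such decline rates, roughly half of a new well’s capacity is gone within a year, which is why large volumes of new ‘produce-ability’ can translate into much smaller net production growth.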
Avere Systems’ co-founder and CTO Mike Kazar said that software patents should be axed adding that, ‘copyright protection is sufficient to prevent wholesale IP theft. This worked fine through the early 1990s, when software patents were first allowed in the US.’ Kazar was joining Pure Storage in its call to the government to cut the patent term to just five years and to introduce a ‘use it or lose it’ clause to target patent trolls.
President Obama has requested $842.1 million for fossil energy programs in the 2016 financial year budget. The funds will enable the Office of Fossil Energy to advance technologies related to the ‘efficient, affordable and environmentally sound use’ of fossil fuels and the management of strategic reserves. Some $560 million is allocated to fossil energy R&D into carbon capture and storage (CCS) and ‘advanced energy systems’ that blend fossil energy usage with CCS and ‘cross-cutting’ research. $44 million will go to the natural gas program to ensure that shale gas development is conducted in a manner that is ‘environmentally sound and protective of human health and safety.’
In a ‘public commitment to openness,’ PG&E has provided the California public utilities commission with copies of 65,000 emails exchanged between the company and its regulator over a nearly five-year period beginning in 2010. The emails were the subject of a ‘voluntary review’ PG&E undertook in 2014 as a result of which PG&E ‘self-reported apparent violations’ of rules governing ex parte communications. At issue are some inappropriate conversations between PG&E and CPUC relating to ongoing litigation over the 2010 San Bruno pipeline explosion.
On a more positive note, a recent report from Wood Mackenzie shows how improved well design in the Haynesville shale has transformed the play. The Haynesville had fallen out of favor but today the economics have shifted, with a new breed of massive gas wells said to be one of the best hedges against plummeting oil prices.
A recent position paper from the Railroad Commission of Texas’ David Porter argues that the US crude oil export ban is a ‘dated policy’ and contributes to low oil prices that are hurting the State’s economy. ‘If Congress and the Obama administration were to repeal the prohibition on crude oil exports, studies show that domestic energy production would increase along with GDP, job growth, and capital investment, all while reducing our national trade deficit.’ The continuing shale revolution is necessary to sustain the ‘Texas miracle’ and help cement the US’ status as a global energy superpower.
Finally, an interesting piece in the March 16th edition of Bloomberg/Business Week on how US oil in storage has risen at the Cushing hub, from 18 million barrels last summer to over 50 million. What’s more, some are reported to be paying over a dollar a month per barrel for storage (that’s worse than the Bundesbank!).
Now here’s a thought. I expect that some of the oil going into storage has been sold via hedging contracts. So what happens when the storage fills up? Do the hedgers run out of hedge?
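A back-of-envelope sketch of the storage economics: only the roughly $1/bbl/month rate and the 50 million barrel inventory come from the article; the six-month holding period is an assumption.

```python
# Only the ~$1/bbl/month rate and the 50 million barrel Cushing
# inventory are from the article; the holding period is assumed.
barrels_stored = 50_000_000   # Cushing inventory cited above
storage_rate = 1.0            # $ per barrel per month, as reported
months_held = 6               # assumed holding period

carry_cost_per_bbl = storage_rate * months_held   # $6/bbl over six months
monthly_bill = barrels_stored * storage_rate      # $50M per month across the hub
# A hedger's futures premium (contango) must exceed the carry cost
# per barrel over the holding period just to cover storage,
# before any financing costs.
```

This is why a filled-up Cushing matters: once the contango no longer covers the carry, the storage trade, and the hedge behind it, stops working.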
Petronas and service provider Schlumberger stole the show at the 2015 SPE Digital Energy conference in Houston with three back-to-back presentations on the digitization of the Samarang oilfield, Malaysia. Discovered in 1975, Samarang is currently the subject of a major redevelopment project following a 2010 alliance with Schlumberger that set out to leverage Schlumberger’s ‘renowned’ subsurface technology.
The integrated operations (IO) concept for Samarang was first presented at the last (2013) Digital Energy as a conceptual study (SPE 163724) that found opportunities for streamlined workflows, improved information management and increased production. In particular the concepts were to be integrated early on, during front-end engineering design of the revamp.
Petronas’ three presentations (SPE 173578, 173579, 173580) showed how the concepts developed in the initial study have evolved and been implemented, in particular with the deployment of a ‘tri-node’ collaborative work environment (CWE) that provides a real-time data sharing environment for workers at the field itself, the local operations team in Kota Kinabalu and petrotechnical experts at head office in Kuala Lumpur. Data tie-in and commissioning leverage a data quality funnel with multi-level checks, from field-based RTUs through to end-user applications.
These include a full stack of Schlumberger’s tools (OFM, Olga-online, Avocet, Eclipse and Peep) along with Petex’ Prosper and Gap and OSIsoft’s PI historian. So far five workflows have been implemented including well testing, back allocation and gas lift. Workflows are driven by operational guidelines that show process ‘swim lanes’ linking individuals to the tasks in hand. Workflows cover ‘fast loop’ activities such as reacting to a failed valve, medium loop processes such as production optimization and slow loop reservoir management activities that may see many specialists congregating in the CWEs.
Implementing the new systems meant upheaval for employees and so a structured change management process was used. This has leveraged the work of John Kotter and ProSci’s Adkar methodology. A ‘Sipoc’ six sigma process was also used to sketch out and implement workflows. Programming logic and source code was managed with Microsoft’s Visual Studio Team Foundation Server. The process has been standardized such that it can now be re-deployed at other Petronas assets.
The US department of energy has sold the ‘historic’ Teapot Dome oilfield (a.k.a. the Naval Petroleum Reserve Number 3) to Alleghany Capital unit Stranded Oil Resources Corp. The $45 million proceeds from the sale will end up in the US Treasury’s coffers. Teapot Dome was made famous in the 1920s as the subject of a political bribery scandal.
The field was run by the Rocky Mountain oilfield testing center (Rmotc), a government facility that was disbanded in 2014. The Rmotc provided a popular and widely used dataset of the field which appears to have been orphaned by the sale.
Oil IT Journal asked both the US Department of Energy (DoE) and Alleghany for the status of the public domain data. Spokesperson Namrata Kolachalam told us that the data was no longer available from the DoE. Alleghany did not respond to our emailed request. The question now for upstream software developers and testers is, where to get a hold of the Teapot Dome data. PPDM members might like to grab the data here before it’s too late.
UK-based Exprodat Consultants has just issued its second E&P geographic information system (GIS) benchmark survey of 18 worldwide oil and gas operators. The 2014 edition covers governance, usage, data, technology, support, skills and spend. Little has changed since the previous survey. Overall, companies score 2.8 out of a maximum 5 points, showing that there is still ‘room for improvement.’ While there are some signs of improvement, industry is still ‘struggling to overcome endemic issues.’ ‘Fresh thinking and radical strategies are needed.’
Data integration is the No. 1 support issue and will likely hold back analytics, but there is weakness in all other areas. Spend analysis confirms that technology is by far the cheapest element; people and data represent the main costs. On average, per-user GIS spend is $14k/year, with $2k on software and training and $12k on support and data.
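Scaled up, the survey’s per-user figures imply a substantial line item. The 50-user headcount below is a hypothetical shop, not a figure from the survey.

```python
# Per-user figures are from the Exprodat survey; the 50-user
# headcount is a hypothetical example.
users = 50                    # assumed headcount for an illustrative shop
software_training = 2_000     # $/user/year on software and training
support_and_data = 12_000     # $/user/year on support and data

per_user = software_training + support_and_data   # $14k/year, matching the survey
annual_spend = users * per_user                   # $700k/year for this shop
```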
GIS is now a core application in E&P and a potential game changer. One company scored 4/5, a ‘remarkable achievement.’ Companies seeking competitive advantage should take note!
Unsurprisingly, Esri dominates Exprodat’s sample. All list ArcGIS as their main GIS product, often supplemented with Safe FME. Other tools (Erdas, Google, K2Vi, Blue Marble, Geocortex) are used, but these are way behind.
Project data management remains a major issue. Some 70% of respondents report that it is ‘handled inconsistently in an ad hoc fashion.’ Under 30% have defined project close out/archival procedures. Data duplication is a widespread problem.
Integration with other applications is mainly (88%) via OpenSpirit with some use of Safe Software and generic web map services. Petrel is the main GIS consumer app.
Exprodat’s analysis suggests that GIS needs better governance and business representation. GIS events are usually congregations of specialists and service providers. The authors also express surprise that integration with a single product (Esri) is so hard. ‘How come we can’t join-up our information silos in the age of cloud computing, web services and APIs?’ More from Exprodat.
The UK chapter of the International Society for Knowledge Organization (ISKO) organized a Great Debate last month on the role of the ‘traditional thesaurus’ in modern information retrieval. The essential question for knowledge organizers is, do we need to index and classify documents or just leave retrieval up to search?
Arguing for search, Judi Vernau (Metataxis) observed that use of thesaurus standards such as ISO 25964 in document indexing and retrieval has declined although it remains ‘a small part of the story.’ There is less emphasis on a narrow, constrained thesaurus and a search for a broader set of relationships. Document classification used to be done by professionals but the world wide web has changed everything. Librarians trained in the art of information retrieval have been replaced by end users who just want to find something!
While the classical thesaurus may be overkill, tagging a corporate document archive will help retrieve stuff in context. The question is how much effort to put into tagging and how to do it. What is key is to make it easy, with a useful semantic structure that is easy to navigate. This is not the case for a traditional thesaurus which contains too many terms and constraints. While there are good reasons to conform to a standard, there are good reasons to not do so, to slacken off some constraints.
Automated entity extraction may or may not work but might be an improvement over the status quo. Dictionaries and whitelists can help – so long as your search is tuned to take advantage of them. Unfortunately, some commercial content management systems can’t even handle a three level deep hierarchy, let alone more complex relationships. There will always be a tension between the desire for smarts and the need to dumb down. Currently the thesaurus is a middle way that does not do the job. You need to be more flexible and more sophisticated.
Vanda Broughton (University College London) defended the whole concept of the thesaurus which is ‘alive and well and fighting back,’ especially in its looser, less constrained manifestation. The process of building a thesaurus teaches us about the domain, its concepts and relationships and can help build a first pass formal model of the domain. The information modeling that underlies a thesaurus should not be lightly abandoned. It is key to semantic web applications, machine reasoning and fuzzy logic. The controlled vocabulary, search tools, domain models and other terms ‘all point to the thesaurus.’
Consultant Helen Lippell, arguing for search, observed that it is hard today to make a business case for a thesaurus and that projects tend to mushroom out of control. Modern content management systems provide inbuilt tools such as automated suggestion lists and constraints, reducing the amount of expert’s time required compared to a full blown thesaurus. What’s needed is a balance between such simple tools for tagging and annotating against ontologies and billion triple stores and semantic web technology ‘that is really hard to implement.’
Leonard Will (Willpower) is a thesaurus believer, mentioning in particular the semantic web’s Skos tagging scheme. Wikipedia has a thesaurus as do hardnosed organizations like eBay and Amazon. Combining the thesaurus with geographic information systems brings even more benefits. The pro search, anti-thesaurus motion was defeated resoundingly.
References and extra reading – Everything is miscellaneous, Hasset, UK data archive, Information, a very short introduction. Visit ISKO-UK’s new website.
The Information Store (iStore) has announced ‘WellBitz’ a cloud-based workspace for E&P projects including drilling, operations, prospects and investments. WellBitz lets multiple users collaborate, upload and organize oilfield documents. Online tools enable mapping, production data visualization, decline curve analysis and partner interaction.
Applications are delivered via a pay-as-you-go, software-as-a-service model which iStore claims, ‘provides services at a much lower cost than traditional desktop or server applications.’
Early adopter Tom Divine of Oklahoma-based RKI E&P said, ‘WellBitz’ mapping and economic analysis functions are well thought out and simple to use. This should be a very attractive product for smaller operators as a lower cost alternative to traditional industry software packages.’ iStore plans regular enhancements for WellBitz. The current roadmap includes data room functionality, well log and cross section viewers, wellbore utility and mobile device access.
The WellBitz technology stack comprises Microsoft’s .NET framework, Amazon web services, the PPDM data model, Esri map controls, Open Street Map and Html 5. WellBitz pricing starts at $49 per month for a single seat. More from iStore.
Interviewed in the Q1 2015 issue of Ryder Scott’s Reservoir Solutions newsletter, Baker Hughes’ Randy LaFollette argues that data mining is set to play a critical role for shale plays in today’s low price environment. Public domain data from over 65,000 onshore US wells in IHS’ data library has been studied to reveal that several simplistic prior analyses fail to identify the best completion strategies. Cross-plotting peak gas rates vs. stimulated fluid volumes in the Barnett shale play, for instance, failed to account for reservoir-quality differences.
LaFollette recommends using ‘boosted-tree, multivariate analysis’ of injection rates, fluid volumes, well architecture and other parameters. Combining such analyses with GIS data, geology and geochemistry produced ‘strong predictors of well production,’ notably with the top 10% of wells drilled at some distance from faulting. The study also found that shorter laterals were more efficient in the Bakken, while stage count was far less influential.
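As an illustration of the ‘boosted-tree, multivariate analysis’ approach, here is a minimal hand-rolled gradient-boosted stump regressor on synthetic data. The feature names and data are invented for the sketch; this is not LaFollette’s actual model or the IHS well set.

```python
import numpy as np

def fit_stump(X, residual):
    """Find the best single-split regression stump (least squares)."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= thr
            if left.all() or not left.any():
                continue
            pred = np.where(left, residual[left].mean(), residual[~left].mean())
            err = ((residual - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, thr, residual[left].mean(), residual[~left].mean())
    return best[1:]  # (feature, threshold, left value, right value)

def boost(X, y, n_rounds=50, lr=0.1):
    """Gradient boosting on squared error: repeatedly fit stumps to residuals."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_rounds):
        j, thr, lv, rv = fit_stump(X, y - pred)
        pred = pred + lr * np.where(X[:, j] <= thr, lv, rv)
    return pred

rng = np.random.default_rng(0)
# Hypothetical predictors: fluid volume, injection rate, lateral length, fault distance
X = rng.uniform(size=(400, 4))
y = 2 * X[:, 0] - X[:, 2] + rng.normal(0, 0.05, 400)  # synthetic 'production'
pred = boost(X, y)
r = np.corrcoef(pred, y)[0, 1]  # the boosted fit should track the synthetic signal
```

In practice one would use a library implementation and interrogate feature importances to see which completion parameters drive predicted production; the point of the sketch is just the residual-fitting loop that makes boosted trees ‘multivariate.’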
Baker Hughes has now rolled-up its shale analysis into a Google Earth-based interactive map of the Eagle Ford and other shale plays that makes ‘mining for data as easy as looking at a computer screen.’
Speaking at a recent SPE event in Houston, Noah Consulting’s Curley Thomas offered some advice on establishing a system of record for wellbore schematics. Too often wellbore schematics are compiled from disparate spreadsheets, hand written notes and vendor data. For companies drilling large numbers of complex wells it is preferable to move to a wellbore schematic management system and a central repository of current information. Meetings with US regulators (BSEE and BOEM) confirmed that companies were reporting schematics in multiple formats with no consistency from engineer to engineer. Noah set out to see if some standards for schematics could be identified.
Peloton’s WellView application for well lifecycle data management was chosen for its ability to capture complete wellbore schematic information to a structured database for use by multiple stakeholders. A ten stage data cleanup approach was proposed, along with a change management process to adapt WellView to client-specific requirements.
A sample workover package including schematics was submitted to the BSEE which met with immediate approval. Noah advocates using SharePoint to collate documents and track data cleanup prior to loading to WellView. More from Noah and Peloton.
In a recent presentation to the Society of HPC Professionals, Dan Skinner presented Micron Technology’s ‘automata processor’ hardware for accelerated text search. Micron’s automata is ‘poised to fundamentally transform complex data analysis, just as hydraulic fracking has done for the oil and gas industry.’
Micron’s automata is a massively parallel device that performs high-speed search and analysis of complex, unstructured data streams. A single chip processes ‘up to 6.3 trillion decision paths per second.’ The automata is set to impact oil and gas which, like other verticals, is being ‘crushed under the weight of unstructured data that is far outstripping its analytic capability.’
Current use cases for the automata include network security where the hardware is claimed to scale linearly for malware pattern search and analysis. Snort network intrusion detection has been benchmarked at 1 Gbps per processor bandwidth. More from Micron.
Badleys’ ‘T7’ (formerly Traptester) provides a new 3D framework for stress analysis, fracture prediction and more. Watch the video.
Petrosys 17.6 introduces an interactive spatial editor for map data from a range of sources and formats. The tool allows for generation of buffer polygons surrounding shape and point data.
V2.7 of Integrated Informatics’ Geomancy decision engine adds impact reporting of well sites, pipeline planning and reporting and well lateral placement in large lease boundaries.
The V4.3 edition of ‘IP,’ LR Senergy’s formation evaluation and well data interpretation tool, adds a multi-user, multi-well functionality and an upgraded mineral solver. Also new is the use of 3D self organizing maps for rock typing.
Behavioral Recognition Systems has released V5 of its AISight artificial intelligence platform for Scada and information security. AISight is a multi-sensor data analytics platform with applications for video, intelligent alerting and remote operations.
Broadband Antenna Tracking Systems has announced a proprietary microwave communications system tailored to the needs of shale operators. The ruggedized system integrates with third party radios to form a multi-gigabit bandwidth communications network.
Golder Software’s FracMan 7.5 adds modeling capabilities for fractured conventional and shale reservoirs, 3D simulation of fracture interaction and simulation of hydrocarbon flow to well.
CoreLogic’s new SpatialRecord is a single-source, parcel-level information solution for oil and gas land record management. A comprehensive parcel database includes information on some 140 million US land parcels, around 99% of the nation’s properties.
Pegasus Vertex has announced MudPro+ with new functions for tracking mud volumes and concentration in both the active system and reserve pits.
Safe Software has released FME 2015 with expanded support for point cloud data, support for Esri ArcGIS 10.3 and enhanced web mapping services functionality. FME 2015 introduces support for Minecraft ‘to enable users to make their GIS and other data available through its interactively discoverable gaming platform.’
PetroWeb’s new data rules module tracks data quality across an asset, adding geospatial data agnostic functionality to geoscience workflows.
SCM has released its tips and tricks for Petrel 2014 with focus on the use of the Microsoft FluentUI/ribbon interface.
Tecplot RS 2014 R2 adds conditional expressions, custom 3D Views and grid comparison to the reservoir simulation visualization and analysis package.
‘Irish-based’ Weatherford has announced a new FracAdvisor service providing near real-time guidance to operators for optimized completion designs, reduced operational risks and increased fracking efficiency.
The 2014 release of Schlumberger’s Pipesim steady-state multiphase flow simulator delivers new functionality to enhance user experience and simulation capabilities, including GIS network canvas and interactive wellbore schematic.
The 5.2 release of LandWorks’ land management suite brings live integration with Esri ArcGIS Server. Users can open a lease or right-of-way agreement and access a map displaying the relevant polygons. LandWorks currently runs on a client’s hardware. A cloud–based edition is in preparation.
New Century Software’s Facility Manager 5.0 provides enhanced accuracy and efficiencies to manage pipeline attribute centerline data stored in a GIS. ArcMap integration allows users to insert station information and apply spatial filtering to views. Documents can be associated with pipeline events and work orders and vice versa. Data staging ensures data is verifiable, accurate and traceable.
Imation’s combo of HGST’s 8TB Ultrastar hard disk drives with its Nexsan E-Series family of data storage provides up to 144TB of data in two units – almost 5 petabytes in a standard rack.
The Chemical Safety Board has produced another no-punches-pulled report on an industrial accident, the August 2012 Chevron Richmond Refinery pipe rupture and fire. An 8” line flowing 10,800 bbls/day of light gasoil ruptured, releasing a vapor cloud that subsequently ignited.
While there were no serious injuries or fatalities, 15,000 people from the surrounding communities sought treatment at nearby medical facilities for breathing and other problems, ten of whom were admitted. Root cause of the rupture was found to be sulfidation corrosion, a damage mechanism that causes steel to thin over time when exposed to sulfur compounds at high temperature.
The CSB found that in the 10 years prior to the incident, Chevron’s own energy technology company (ETC) personnel with an understanding of sulfidation corrosion had recommended inspection and/or upgrade of the unit in a ‘sulfidation failure prevention initiative’ (SFPI), but this was not implemented effectively. Along with such chain-of-command shortcomings, the CSB found that Chevron’s ‘operational excellence and reliability intelligence’ (Oeri) online dashboard failed to catch the issue.
Oeri uses 26 different process safety indicators to track the implementation status of ETC recommendations and new industry guidance, displaying the status of each visually. However, Oeri did not track progress on implementing the recommendations of its own in-house specialists. The CSB believes that by monitoring a KPI of the SFPI’s status, the accident could have been avoided. Read the final CSB report.
The 2015 American Business Conferences’ Houston Wellsite Automation for unconventional oil and gas conference offered a great snapshot of what is, despite the fall in the oil price, a very buoyant sector. While the rig count may be dropping, dealing with the backlog of shale wells is keeping the engineers busy and the size and geographical extent of the new producing areas is providing huge scope for a more digital approach to operations. Although shale wells are kitted out with the same Scada systems that have been used for years, the amount of digital and wireless kit deployed has risen greatly. Operators are also looking at how data is handled downstream and how ‘virtual operations’ can be optimized.
Randall Wilkins (Vine Oil & Gas) observed that although accounting, geoscience and other industry sectors have embraced IT, operations are still often manual. In fact the electronics are already in place: flow and pressure measurement devices on the well site are already generating mountains of data. What is lacking are the right people and processes. An automation team has three components: measurement (devices), the field office (HMI, servers and reporting) and an operations center. The latter is often neglected. One problem stems from inconsistent use of standards; different sites may use different tank sticks, ‘a nightmare for support.’ Scada systems need to provide meaningful measurement to downstream systems. ‘Software developers need to visit the field, get to know what we are doing and be able to anticipate Scada requests.’
The operations center is the target user group that will enable ‘virtual operations’ (VO). VO is about ‘pumping from the computer,’ operating the field without pumpers writing down the wrong numbers! Scada data needs to be aggregated in the operations center for consistent analytics and reporting. Wilkins believes that maybe 80% of all wells can be pumped from an operations center. This implies rigorous alarm management with zero unacknowledged alarms. Person-specific alarm software that ‘knows who to call’ brings a ‘drastic reduction’ in downtime compared to broadcasting an alarm to everybody. Tank levels and flow rates need to be monitored consistently, ‘tanks cause more grief than anything else.’ Virtual ops enable better communications with haulers for route planning. Site visits can be captured with a $10 ‘last visit’ reset button and timer on the RTU. As operators migrate to the ops center companies need to collaborate with third party visitors, haulers, chemical guys, ‘get them to look around to see if something is unusual and tell us about it.’
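The ‘knows who to call’ alarm routing Wilkins describes can be sketched as a simple lookup with an escalation default. The alarm types and contact names below are hypothetical, not from any vendor’s product.

```python
# Hypothetical alarm types and contacts; the point is targeted call-out
# rather than broadcasting every alarm to everybody.
ROUTING = {
    "tank_high_level": ["lease_operator_A"],
    "compressor_down": ["mechanic_on_call", "lease_operator_A"],
    "comms_loss": ["scada_support"],
}

def route_alarm(alarm_type, acknowledged_by=None):
    """Return the people to call for an alarm; stop once acknowledged."""
    contacts = ROUTING.get(alarm_type, ["operations_center"])  # default escalation
    if acknowledged_by in contacts:
        return []      # acknowledged: zero unacknowledged alarms, no further calls
    return contacts    # a targeted list, not a broadcast
```

The ‘zero unacknowledged alarms’ discipline Wilkins calls for is what the acknowledgment check enforces: an alarm either reaches a named responder or escalates, it never just sits.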
Wilkins advocates one operations center, ‘owned by Scada and manned by lease operators,’ for each region. Wilkins’ previous employer Exco’s control center brought about a huge reduction in fuel use and improved safety (fewer trucks on the road and less night work). Virtual operations are cheap to implement although the concept is new and hard to swallow. But the time for the digital oilfield and virtual operations is now! Pumping can be run from head office, ‘I’ve done it. It’s not rocket science.’
Wilkins was pressed on the cost issue in the Q&A. Virtual ops need a significant level of instrumentation and more power to the Scada system, ‘At $40 oil it will be hard to convince management.’ His response was that each investment level will produce a measurable return. ‘As you reduce downtime, claim the money and use it to reduce more downtime.’ In the Haynesville, Exco cut 14 truck routes down to 2 saving $100 per day/truck.
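The Haynesville trucking saving quoted above is easy to sanity-check, assuming one truck per route (an assumption, not stated in the talk).

```python
# The route counts and $100/day/truck figure are from the talk;
# one truck per route is an assumption for the sketch.
routes_before, routes_after = 14, 2
saving_per_truck_day = 100   # $/day per truck, as quoted

daily_saving = (routes_before - routes_after) * saving_per_truck_day  # $1,200/day
annual_saving = daily_saving * 365                                    # $438k/year
```

Even at that modest per-truck rate, cutting twelve routes is worth several hundred thousand dollars a year, which supports Wilkins’ ‘claim the money and reinvest it’ argument.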
Kevin McDaniel (Marathon Oil) agrees that there is value in virtualized operations but this is ‘just the tip of the value iceberg.’ To use the data fully requires an ‘intelligent oilfield’ approach blending people, process and technology to address efficiency, production optimization and risk management. The goal is to get the right info to the right people at the right time. Data needs to be accessible without driving out to wells. Marathon has replaced its Eagle Ford manual ticketing process in an automation/integration project. This has eliminated the risk of theft and loss, bringing data straight into the accounting system. Automated valves allow haulers to punch in a security code, eliminating the need for a complex seal program. Optimizing plunger lift settings is now done remotely and constantly, so that ‘wells stay optimized.’ In all, 16 conditions are monitored and alarmed in Scada including flow control valve leakage, dump valve monitoring and plunger travel. Issues can be communicated to folks in the field as actionable items.
Shaun Derise observed that the highest risk to lease operators was ‘windshield time.’ BHP Billiton’s goal is to reduce driving time through exception-based intervention, focusing on preventive maintenance and high value fixes and increasing the well to pumper ratio. This is being achieved through better software, high performance HMI, best practices and exception-based reporting. The scale of modern shale operations can be scary; a comparison with a large offshore facility is instructive. Offshore, a 20k bopd production platform may have around 10k data points. Onshore, a small platform with three wells has around 1,500 data points. But things scale up quickly when an onshore region is considered, with maybe 48 pads, 144 wells and 72k data points. Moreover, design errors are very costly when you have 48 pads to fix! Derise echoed Wilkins’ comment in his introductory keynote that ‘it’s crazy trying to manage all of the data.’ Derise’s five-point plan starts with a list of daily activity at the well pad that is refined in the context of automation. Site surveys are carried out and plans updated. Production is comingled where possible to reduce equipment costs and installation time. Finally, operational philosophies need reworking to leverage the new paradigm. BHP’s push into automation includes IT and communications. Mesh radio connections bring ‘very fresh’ alarms into the surveillance center. The results? In six months, miles driven have been halved and downtime cut by 60%. BHP is to deploy the new system across all its liquid producing areas.
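Derise’s scaling numbers are internally consistent, as a quick check shows:

```python
# Figures as quoted: 48 pads, three wells and ~1,500 data points per pad.
pads = 48
wells_per_pad = 3
points_per_pad = 1500

total_wells = pads * wells_per_pad    # 144 wells, as quoted
total_points = pads * points_per_pad  # 72,000 data points, as quoted
```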
Rene Beck (WPX Energy) presented on the nitty-gritty of Scada deployment. WPX was spun out of Williams Energy in 2012. The company has had enterprise Scada deployed since 2009, with now around 7k meters in the system. Beck recommends shooting for a full-blown ‘enterprise’ Scada system with management backing. WPX’ Scada is vendor neutral, but ‘we do have Cygnet’ (Weatherford). Scada data goes to applications including Flow-Cal, TOW and Aries as ‘auditable, editable and repeatable data.’ No one size Scada fits all, ‘there will be gaps, make sure the options are there.’ Standardization is to be driven relentlessly, ‘Be as common as possible but as different as necessary.’ In the current climate it is necessary to consider how to handle new basin entries, bolt-on acquisitions and to ask, ‘how easy is it to divest?’ WPX was helped by Techneaux Technology Services and Cygnet. Deployment requires significant investment in project management and support and liaison between IT, SMEs, Scada and communications. ‘Go big or go home!’
Micah Northington showed how Whiting Petroleum has gone beyond Scada visualization into trending and workflow automation. Data is pushed out to more users in the company for operational awareness and is now linked to maintenance systems (Maximo/Avantis/Tabware), production accounting (ProCal/FloCount), finances (Nexus/ADP) and modeling (Aries/OFM). Ideally all these systems should be fed with data from the field, but operating across different localities and cultures can make things hard. Data has to cross the operations/IT frontier via tiered historians that replicate field data to enterprise IT. OPC data forwarding with Kepware also ran as did One Virtual Source. OVS provides a workflow library tying in to over 50 different databases including Scada. A schema metadata analyst standardizes variables and UOM and brings everything together in automated procedures that allow for surveillance by exception.
Joe Rodriquez (ZTR) showed how flexible telematics solutions can be deployed onsite to aggregate data from wellheads, tanks and generators and push it over whatever comms are available, cellular and/or satellite. $40 oil is causing industry to change tack with regard to efficiency, cutting fuel costs through automation. Clients are also going cloud/IP based, although ‘security is an issue that we all have to confront.’
Bevan Cox’s (Linn Energy) presentation on RTU selection sparked off an interesting Q&A on whether it was OK to perform a wireless shutdown. ‘Yes!’ opined Cox to cries of ‘no’ from the floor.
On the related topic of wireless availability Kelly Garrod discussed the different options available to remote operators from licensed and unlicensed spread spectrum, cellular and satellite. He recommended Golder’s Squid Pro tool for assessing local cellular coverage.
Heath Jenkins (Wika Instruments) observed that instrumentation seems easy until you have a big field with lots of stuff. There are many choices and options to consider regarding maintenance and complexity. There are also some instrumentation myths, like ‘everything is or should be digital.’ The reality is that Wika sells more analog meters today than ever before! Why? Because they are simple. ‘You don’t need a master’s degree to install or maintain a big dial gauge.’ And they are reliable. The mean time between failure for an analog stick gauge is 700 years. For a wireless/digital gauge it is 100 years, so if you have 10,000 tanks, that’s two failures per week!
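Jenkins’ tank-farm arithmetic works out as follows, assuming (our simplification) a constant failure rate implied by the quoted MTBF figures:

```python
def failures_per_week(fleet_size: int, mtbf_years: float) -> float:
    """Expected failures per week across a fleet of identical devices,
    assuming a constant failure rate (exponential lifetimes) with the
    quoted mean time between failures."""
    failures_per_year = fleet_size / mtbf_years
    return failures_per_year / 52

# Figures from the talk: 10,000 tanks
print(failures_per_week(10_000, 100))  # wireless/digital: ~1.9 per week
print(failures_per_week(10_000, 700))  # analog stick gauge: ~0.27 per week
```

So even a 100-year MTBF means a failed gauge to chase down roughly twice a week at fleet scale, versus one every few weeks for the analog gauges.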
Marathon Oil’s Kevin McDaniel explained why you have to ‘sweat the small stuff!’ in meter selection. There are many meter types, turbine, Coriolis, vortex, positive displacement, ultrasonic and magnetic each suitable for different uses. In the Eagle Ford where paraffin is a big problem, ‘some meters won’t work.’ Companies also need to decide what they are going to do with meter data, ‘this is your cash register, inaccuracies in measurement can be costly!’ Meters need a regular proving schedule and solids and water need careful measurement. ‘You may not be able to rely on truckers’ honesty!’ Temperature may vary depending on where a sample is taken. An error of a few barrels per load adds up to several million dollars per year. Use of an API compliant Lact* Coriolis meter supplied by Emerson has mitigated Marathon’s losses.
If you would like to see what kind of kit is used on a modern shale pad, check out the state-of-the-art RTU that was on display from Awc-Inc. This compact field solution includes a Siemens/Simatic real time unit box ready to deploy on a well pad. More from American Business Conferences.
* Lease automatic custody transfer.
Fiatech, the engineering and construction standards body, finalized its plans for accelerating industry adoption of reference data libraries (RDL) during a virtual meeting late last year. The work is a joint Fiatech, Mimosa and POSC/Caesar association collaboration.
Jen Garfield (ExxonMobil) and Alan Johnston (MIMOSA) introduced the Improving RDL content project. The intent is to improve the ISO 15926 RDL quality and structure to ensure it is fit for purpose and to publish templates ‘so that Part 8 implementation can become a reality.’ This is to be achieved by leveraging COTS semantic tools to use the RDL out of the box and by providing real world data to combat the anecdotal notion that ‘the RDL is not complete enough.’ The initiative also focuses on equipment/device data types, connectivity (P&ID) and geospatial layout of 3D plant models.
Current data sheet utility and usage is limited because sheets are in PDF or document formats. Contrary to popular belief, turning data sheet elements into structured, machine-readable documents is not an insurmountable task. Yet because of this misconception, industry operations and maintenance groups are constantly retranscribing large portions of data sheet information to perform calculations for critical performance and safety issues. The ‘immense cost’ of this work is passed on to the client.
In reality, most EPC and engineering organizations believe that only a small number of data elements are needed to satisfy the needs for engineering data handover to an owner operator. Hence the interest in consolidating industry standard datasheet definitions (including API, ISA, IEC, ISO and NORSOK) as machine interpretable, with UUIDs assigned to each element. This work will be aligned with ISO 15926 or IEC property classes, with Energistics’ units of measure and aligned with ECCMA material procurement codes.
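The consolidation idea above — each standard datasheet element carrying a stable UUID, a property-class reference and an explicit unit of measure — can be sketched as follows. All names, codes and references below are invented for illustration and are not taken from any published RDL or standard.

```python
# Hypothetical sketch of a machine-interpretable datasheet element.
# Property-class strings and UOM symbols here are illustrative only.
import uuid
from dataclasses import dataclass

@dataclass(frozen=True)
class DatasheetElement:
    element_id: str        # stable UUID assigned to the element definition
    label: str             # human-readable datasheet label
    property_class: str    # e.g. an ISO 15926 / IEC property-class reference
    uom: str               # unit-of-measure symbol (e.g. Energistics UOM)

design_pressure = DatasheetElement(
    element_id=str(uuid.uuid4()),
    label="Design pressure",
    property_class="iso15926:DesignPressure",   # illustrative reference
    uom="bar",
)
print(design_pressure.label, design_pressure.uom)
```

Once elements are identified this way, downstream tools can look up a value by UUID rather than retranscribing it from a PDF, which is the cost the project aims to eliminate.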
A first phase proof of concept will consider a mix of complex and simple equipment based on a realistic break down from recent oil and gas industry capital projects. The projected budget for the project is $1.15 mm over 2 years. Mimosa president Alan Johnston is to head-up the project team. More from Fiatech.
PG&E has engaged Ken Salazar to develop a ‘best-in-class’ regulatory compliance model.
Dave Lamolinara is now CEO of 1845 Oil Field Services. He hails from Ceva Logistics.
Geosciences consultant Aker Geo has been renamed First Geo.
Allegro has appointed Rod Fomby as senior VP global services. He was formerly with SunGard.
John Christmann has been appointed president and CEO of Apache. He was previously with ConocoPhillips.
Former Colorado state official Tracee Bentley is to lead the API’s new Colorado Petroleum Council.
BMT has opened an environmental testing facility in Jakarta, Indonesia.
CGG GeoSoftware has appointed Wanita Utama and Dan Skomorowski as business managers for continental and northern Europe respectively.
CGI is creating an onshore IT center at the University of Louisiana at Lafayette’s research park.
Frank Nothaft has joined CoreLogic as senior VP and chief economist. He hails from Freddie Mac.
Digerati Technologies has appointed Craig Clement as chairman.
The US Senate has confirmed Christopher Smith as the department of energy’s assistant secretary for fossil energy.
MD Sam Pitts heads-up EnCap Flatrock Midstream’s new Houston office. He comes over from Citigroup’s global energy investment bank.
Enservco has two new board members: Robert Herlin (Evolution Petroleum) and William Jolly (Scarsdale Equities).
Graham Wright is the new principal of Environ in its Melbourne office.
Gary Luquette is the new president and CEO of Frank’s International.
Leigh Cassidy has joined Gasfrac as director and chairman.
GE and KOC are to collaborate on R&D into ‘cost-effective solutions for oil and gas’ and on specialized training programs for Kuwaiti engineers.
The Induced seismicity monitoring network consortium has hired Alireza Babaie Mahani to monitor induced seismicity from natural gas development in northeast BC.
Hexagon Solutions’ president Claudio Simao is to lead the company’s new innovation hub.
Paul Tiao (cyber security), Kevin Jones (energy) and Linda Walsh (regulatory) head-up Hunton & Williams’ new energy sector security team.
Fatih Birol has been named executive director of the IEA starting September 2015.
Jonathan Gear is executive VP energy and industrial business lines with IHS.
Diah Noor is the new oil & gas advisor for Intsok in Indonesia.
Itasca International has appointed Mark Mack as general manager in Houston.
Scott Simmons is now executive director of the OGC’s standards program. George Percivall is CTO.
Stefan Hoppe has joined the OPC Foundation as VP.
Richard Longdon has been appointed as non-executive Chairman of PS Enterprise.
Amr El-Bakry (ExxonMobil) is the new chairman of the SPE’s PD2A technical section.
Eldar Sætre has been appointed president and CEO of Statoil.
Jimmy Staton has joined Venture Global LNG as executive VP. He hails from NiSource.
Stan Cena has joined Visage Information Solutions as VP business development. He hails from Merrick Systems.
Steve Nicol is now Wood Group PSN’s CFO.
Verisk Analytics is to acquire consultant Wood Mackenzie from Hellman & Friedman and other Wood Mackenzie shareholders in a £1.85 billion transaction.
Total’s Energy Ventures venture capital unit has taken a stake in Avenisense, a French company specialized in embedded fluid property sensors to measure liquid density and viscosity, gas density, fluid humidity and oil quality.
Oceaneering is to acquire C&C Technologies for approximately $230 million in cash. C&C provides deepwater ocean-bottom mapping services from ROVs and satellite-based positioning services for drilling rigs and seismic and construction vessels.
Gasfrac Energy Services has sold ‘substantially all’ of its assets and related technology to an unnamed oil and natural gas service industry purchaser. The sale was conducted under a court-approved process as part of bankruptcy proceedings under the supervision of Ernst & Young.
Flotek Industries has acquired ‘substantially all’ of the assets of International Artificial Lift for a consideration of $2.25 million.
Bentley Systems has acquired French boutique Acute3D, provider of Smart3DCapture software for reality modeling.
IHS has bought Scotland-based drilling and completions information provider Rushmore Reviews.
Oil sands operator Syncrude Canada is building a consolidated system to optimize its environmental information management. The ‘system of systems’ covers land and mineral rights, geomatics, tailings, water quality and biodiversity. The GIS-based system is being developed by Golder Software which supplied a system for the Base Mine Lake demonstrator.
GE and Statoil are to address the environmental challenges of oil and gas production, leveraging GE’s ‘CNG in a box’ technology, the use of liquefied CO2 in fracking and gas compressor optimization.
Dakota Software reports successful deployment of its environmental management platform by manufacturer McWane. Dakota’s ProActivity Suite provides ‘a 360-degree view’ of air, water, and waste compliance and tracks transporters, disposal facilities and management of waste profiles, shipments, and containers.
The US has sequestered almost 1 million metric tonnes of CO2 as of February, 2015 via the Department of energy’s clean coal program, the equivalent of the annual emissions from 210,000 vehicles.
The Oxygen Factory is ‘recycling’ carbon, burning hydrocarbons and garbage to generate electricity and capturing and separating exhaust gases. The CO2 is turned into biomass through ‘enhanced photosynthesis and photocatalysis.’
The Standards leadership council (SLC) has announced its public forums in Houston and Stavanger. These should be a good opportunity for members to report progress from the dozen or so workgroups announced on the SLC website. These include no less than eight bi-directional mapping initiatives between member societies’ standards, which might seem rather ambitious for a group sans finances.
While not directly involved in the mapping, the Object Management Group (OMG) has announced oil and gas as a current ‘hot topic’ with a revamp of its position paper and a reiteration of the claim that ‘little has been done since the adoption of the Witsml standard.’ And this despite three years of SLC activity! One hopes though that the OMG’s UML will serve the SLC’s massive mapping effort.
Yet another oil and gas ‘standard’ was announced at The Open Group’s (TOG) annual San Diego meet this month, which heard Leonardo Ramirez (Arca1) expound on the use of the TOG’s ‘Togaf’ methodology to ‘align corporate strategy with execution.’ The work was supported by a research program into an ‘oil and gas information framework reference architecture’ (Ogifra). Despite a long email exchange with Oil IT Journal, TOG has failed to provide any more information on the ‘reference architecture.’
A blog post by Ansys’ Jim Cashman introduces V16.0 of the engineering design simulator. Ansys has integrated technology from its SpaceClaim acquisition into the AIM immersive simulation environment, said to ‘blur the geometry–mesh continuum’ of engineering simulations.
AIM, an integrated solution for 3-D engineering simulation, now embeds multi-physics simulation in engineering design. Simulations span structures, fluids, thermal properties and electromagnetics, all from the same interface. The tool can be configured to automate best practices, create specialized applications, or provide designers and inexperienced engineers with access to relevant solver functionality. Watch the video or check out the graphic of fluid-structure interaction in a flow control valve. AIM is scalable to HPC systems with thousands of cores.
Auto-Dril has filed a claim in a Texas court against National Oilwell Varco and five other drillers for alleged infringement of its US Patent No.6,994,172 for a well drilling control system.
Auto-Dril claims that, inter alia, NOV’s e-Wildcat electronic autodrilling system infringes on the patent and has ‘actively promoted’ sales and use by third parties of its automatic drilling control systems.
Auto-Dril’s patent covers a ‘system for regulating the release of a drill string of a drilling rig during the drilling of a borehole’ leveraging a bit weight and programmable controller of the rig’s variable drive electric motor and drill stem brake. Other defendants include Canrig Drilling, Omron and Pason Systems.
BP has commissioned a $450,000 survey of its Gulf of Mexico Atlantis facility from Aberdeen-based Return To Scene. R2S’ visual asset management spherical photography has been used on nine BP assets to date.
Panorama Consulting Solutions announces a new oil and gas ERP consultancy and a marketing and sales strategy partnership with SalesWerks.
RPS’ new ‘Reality Check’ service leverages BetaZi’s deal screening engine to ‘increase confidence in the accuracy of production forecasts.’
ABB has been awarded a $50 million contract by JGC Corporation for the supply of the electrical system for Petronas’ PFLNG2 floating liquefied natural gas facility.
ASD Global and Intergraph are to develop interfaces between Intergraph’s Smart 3D and ASD’s OptiPlant, Pipe Router, and Pipe support Optimizer.
Barco and Vidyo are to integrate Vidyo’s HD and UHD video conferencing solutions into Barco’s TransForm C enterprise collaboration solution to enable high-quality universal VTC communications.
Global Investment has selected Entero Mosaic to manage its North American E&P portfolio.
Vizcom is to offer consulting and reseller services around Epsis Teambox.
Eversendai Offshore is to deploy Aveva Marine engineering and design applications across its Middle East and Asia units.
Statoil has awarded Expro a $200 million contract for integrated well testing and fluid sampling services on the Norwegian continental shelf.
Falconridge is to deploy and represent Tubestar’s Terraslicing well enhancement and oil recovery technology.
First Geo has been selected by Inpex Norge AS to provide full exploration teams and data hosting services.
The Edmonton Energy and Technology Park used Golder Associates’ GoldSet decision support tool to help screen sites for a new petrochemical facility. Golder is also working with Syncrude Canada to consolidate its environmental information management. Another deal covers engineering and procurement support for PG&E.
GSE (Beijing) has received ISO 9001:2008 certification for its simulation, engineering, design and consultation service for energy industries.
Intergraph SmartPlant Instrumentation (SPI) has been selected by DSE Oil and Gas.
Murphy Oil has licensed Ikon Science’s Joint Impedance and Facies Inversion tool JiFi.
Odfjell is to roll-out Navtor’s e-navigation solutions across its fleet.
Cepsa Trading, Cepsa’s worldwide oil and refined products supply and trading subsidiary, has chosen the Allegro 8 platform to manage its business.
Siteworx has redesigned Cameron International’s website and assisted with digital and content strategy.
Welltec’s annular barrier has been successfully run as a standalone, primary well barrier element for Statoil in the North Sea.
Yokogawa Electric is to provide automatic process control systems and field instrumentation to Bashneft’s Ufa refinery complex.
PIDX has posted an updated business process guideline and code list XML schema for public review and comment. The update unifies PIDX response documents (PO, Field Ticket, Pro Forma Invoice, and Invoice).
The OAGi’s data mining group has released the predictive model markup language (Pmml), an XML-based file format for exchange of neural network and other AI models.
The OASIS consortium and ProSTEP iViP are coordinating their work on standards for smart information flows in an industrial engineering environment. ProSTEP’s code of PLM openness addresses business requirements for agility in complex IT environments. More from Oasis.
ISO’s TC46/SC9 groups are to revise and consolidate their document indexing standards. More from ISO.
Kongsberg’s K-Spice now sports a Cape-Open compliant interface for thermodynamics and life-cycle simulation support. More from Cape-Open.
Shell has awarded a contract to Yokogawa and Cisco for the supply of a ‘comprehensive security management solution for plant control systems,’ a component of Shell’s SecurePlant initiative. SecurePlant was co-developed by the three companies and will now proceed over the next three years with implementation at around fifty Shell plants globally. SecurePlant includes OS patches and anti-virus pattern files for control systems and the provision of real time and proactive monitoring of solution delivery, as well as a help desk operation to manage this solution.
A report from the International association of IT asset management lambasts the US federal government for its ‘IT insecurity.’ The report observes that ‘by focusing largely on hacks and other breaches, elected officials and agency administrators are failing to take a bottom-up approach to the purchase, control, inventory, and proper destruction of such IT assets as software, computer hard drives and mobile devices.’ The government spends about $10 billion a year on IT security, ‘with no meaningful standards and controls in place,’ resulting in ‘huge vulnerabilities that can easily be exploited by those inside and outside of the system.’
Geovariances has announced an R&D consortium to investigate uncertainty in seismic depth conversion, described as a ‘classical although challenging task.’ The aim is to develop a dedicated software tool to overcome current issues.
Badleys has launched a carbonate fault rock group led by Quentin Fisher (University of Leeds) to examine the sealing potential and transmissibility of ‘carbonate-hosted’ faults. The results will be implemented in Badley’s TrapTester/TransGen. Badleys has also kicked off an investigation into the structure and tectonics of deepwater rifted margins in collaboration with ION Geophysical, applying quantitative geodynamic analysis to ION’s BasinSpan data.
Calsep has joined the Danish Technical University’s center for energy resources engineering (Cere) to further thermodynamic modeling and algorithm development for phase equilibrium simulations.
CGE is inviting partners in its ‘bow tie’ examples library initiative, a repository of information on risk analyses. The database will showcase bow ties and their linkage to risk management systems.
The three-year EU ‘Faster’ project is now complete. Faster built an experimental version of Maxeler’s dataflow engine which has not, however, been adopted in the production release. Faster received a €2.8 million contribution from the EU’s 7th Framework program.
BP is to deploy software from Houston-based PAS* to help manage critical operational limits in its refineries and petrochemicals assets. The PAS inBound system captures plant data from alarms and safety instrumentation and provides operators with real time alerts.
inBound lets operators explore ‘boundary compliance’ to identify areas of plant safety improvement. Such ‘boundary data’ includes process alarms, safety instrumentation and normal operating envelopes. A scorecard shows adherence to limits as well as detailed and trend information for decision-making. inBound integrates with a plant’s alert system to drive critical operator actions.
inBound is a component of PAS’ PlantState suite that is deployed at over 600 sites worldwide. PAS VP Mark Carrigan said, ‘Automated boundary management is considered an industry best practice today. BP joins other PAS customers who utilize these advanced methods of plant safety controls.’ More from PAS.
* Plant automation services.
Following the deaths of Noumenon/XMTools principals Adrian Laud and Chau Lee last year, Auckland, New Zealand-headquartered Nextspace has been appointed sole sales, distribution and support partner for the engineering software. Nextspace CEO Damian Swaffield told Oil IT Journal, ‘XMTools has provided Nextspace with XMpLant, XMpDE and related extractor source code and license generation facilities in support of our new partnership. Existing and new XMpLant customers now have a one stop shop for all sales, licensing and product support and development. Nextspace has eight years’ experience in standards based interoperability for engineering and construction.’
Nextspace also has partnerships with Dassault Systèmes and has an SAP partner center of expertise certification. XMpLant interfaces to the major process plant design systems, providing access to ‘intelligent’ plant information in the neutral, open ISO 15926 format. XMpLant supports the full structure, attributes and geometry for 2D schematics and 3D models. XMpLant also supports the ISO 16739 standard. XMpLant and XMpDE are deployed on over 100 projects and are data driven by the Proteus Schema.
Frost & Sullivan expects at least ten significant cyber security acquisitions in 2015 as threats rise and companies jostle for market position. By year end 2015 all the sectors that constitute critical national infrastructure will have been breached, including oil & gas, causing operational incidents and downtime.
Visiongain’s report on the oil and gas automation and control systems market 2015-2025 sees growth, thanks to robust demand for process optimisation, safety measures and remote-control solutions in every industry segment. The 2015 global A&C market will be worth $18.68bn.
Software AG forecasts energy market merger mania for 2015. In predictions for digital energy, Software AG has it that ‘mega mergers such as Halliburton and Baker Hughes may fail due to antitrust concerns.’ Moreover, the political debate over major projects like Keystone XL ‘will spark energy companies to view technology like predictive analytics as essential both to ensure project safety and to overcome perceptions of high risk among lawmakers.’ Lower oil prices will push oil exploration companies toward the internet of things, adding automation and sensors to enhance operational efficiency.
A report from AT Kearney forecasts increased oil and gas M&A in 2015 but warns that ‘the window of opportunity may be shorter than expected, and will be driven by oil price expectations. Those companies with strong cash flow and healthy balance sheets will be able to leverage opportunities, while others will need to define strategies just to survive.’
Susan and Marc Strausberg’s 9W Search start-up claims to provide a ‘Siri-like’ function targeting oil and gas users. Ask 9W answers detailed questions about US oil and gas in what is claimed to be a more informative fashion than Google.
Ask 9W is powered through a partnership with IBM Watson using the cognitive computing technology atop of 9W Search’s Edgar-style structured data of US companies’ financials. A demonstration video is available showing typical use cases. These include queries relating to mergers and acquisitions, companies’ latest cash flow numbers and the like.
In a blog posting Strausberg plugs the tool as providing opportunities for investors and traders who need to ‘pay close attention to current key metrics before they make any purchase of shares in oil companies.’ Often critical information, such as a company’s hedged position, is ‘buried in hard-to-find places in filings.’ The 9W/Watson combo answers complex questions like ‘What is the average price of oil in current derivative contracts and other hedging instruments that protect the company’s present and future cash flow?’
The answer may be salutary as small and medium-sized companies’ debt burdens often carry draconian terms which may not be obvious to the casual investor. The Strausbergs co-founded the Edgar Online company information service in 1993. The company was sold to RR Donnelley in 2012.
We learned at last year’s Ecim that ‘time was running out’ for the timely transfer of Norway’s seismic dataset to CGG’s Trango repository. A post on the Diskos home page reveals that the system is still not fully operational.
The well and production data modules are operational but ‘due to technical difficulties within CGG and the complexity and size of Norway’s seismic data’ the seismic module is still undergoing testing by a reference user group. Other functionality (including automated seismic data download) will be phased in and tested by a 130 strong enlarged group of ‘reference users’ over the next few weeks.
The project involves the migration of a petabyte of seismic data into an IBM ‘Elastic Storage’ unit, a combination of disk and tape robot that promises a ‘unified view of data irrespective of location.’ A data clean-up project is running in parallel with the migration. CGG asks Diskos users for patience while the system rolls-out.
Capgemini has announced a cloud-based version of its ReadyUpstream (RU) SAP-based solution for exploration and production. RU is claimed to streamline finance, accounting and operations processes. The solution is also available on the SAP Hana platform for on-premises or cloud-based deployment. RU was originally developed by Irving, Texas-based Strategic Systems & Products, which was acquired by Capgemini last year.
Capgemini North America upstream VP Carlos Martinez said, ‘Companies faced with the challenge of containing costs are limited by their infrastructure. Running RU in the cloud provides the agility needed to execute growth initiatives on a scalable platform.’
According to Capgemini, RU is one of the most widely used upstream ERP solutions. Capgemini, a qualified SAP business partner, delivers RU as a preconfigured, industry-specific version of SAP ERP. Capgemini has established a program to train consultants in areas such as production revenue and joint venture accounting, Council of petroleum accountants society (Copas) rules and other industry requirements. More from Capgemini.