The Institution of Mechanical Engineers (IME) is located in about as upscale a part of London as you could wish for—next to beautiful Green Park with its ducks and pelicans and within a stone’s throw of 10 Downing Street. Inside the Edwardian magnificence are brochures describing ‘sustainable’ engineering and various schemes for carbon mitigation. There is also a life-size mock-up of a V8 petrol engine from one of the latest Jaguar automobiles. Some may query the ‘sustainable’ nature of the V8. But the explanation is in the accompanying blurb. The monstrous motor’s cylinder head is made of recycled aluminum! So next time you squidge-up a beer can and lob it into the recycling, you will be doing your bit for the environment—thanks to Jaguar.
I was at the IME for a one-day conference on Process Safety, subtitled ‘are you doing enough to prevent major disasters?’ We will be reporting from this very informative event in next month’s Oil IT Journal—but to whet your appetites and to provide fodder for my editorial I will move on to the conclusions of the conference.
The first consensus seems to be that major accidents over the past few decades have led to a better understanding of risk management. Current thinking revolves around a combination of James Reason’s ‘Swiss cheese’ analysis (www.oilit.com/links/1110_51) of how defenses and barriers can be penetrated by an accident ‘trajectory’ and the development of a mitigation strategy based on a ‘best bang for the buck’ model.
Members of the pipeline ‘Geogathering’ community which assembled recently (Page 4) will recognize the approach as conforming broadly to the formula that evaluates risk as the product of the likelihood of occurrence and the impact of a subsequent event. This has led to the identification of high consequence areas as key to regulation-driven maintenance of the US pipeline system. See this month’s lead for a rather compelling examination of what can happen if such ‘data driven’ maintenance programs are not maintained themselves.
Using such techniques it is at least theoretically possible to develop a mitigation strategy that keeps the risk exposure down to a certain level. What that level is depends of course on how many ‘bucks’ you are prepared to throw at the problem.
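The likelihood-times-impact arithmetic, and the ‘best bang for the buck’ choice among mitigations, can be sketched in a few lines of Python. All hazards, probabilities and costs below are invented for illustration—real risk models are of course far richer.

```python
# Hedged sketch of 'risk = likelihood x impact' ranking plus a
# 'best bang for the buck' mitigation pick. All figures are invented.

hazards = {
    # name: (annual probability of occurrence, impact in $ millions)
    "pipeline rupture": (0.02, 50.0),
    "valve failure":    (0.10,  5.0),
    "corrosion leak":   (0.05, 12.0),
}

# Expected annual loss ($M) = likelihood x impact
exposure = {name: p * impact for name, (p, impact) in hazards.items()}

# Candidate mitigations: name -> (hazard addressed, cost $M, risk reduction)
mitigations = {
    "smart pigging":     ("pipeline rupture", 1.0, 0.5),
    "valve replacement": ("valve failure",    0.2, 0.8),
}

def bang_per_buck(m):
    """Expected loss avoided per dollar spent on mitigation m."""
    hazard, cost, reduction = mitigations[m]
    return exposure[hazard] * reduction / cost

# The mitigation that buys the most risk reduction per buck
best = max(mitigations, key=bang_per_buck)
```

On these (made-up) numbers, cheap valve replacement beats smart pigging—avoided loss per dollar, not absolute risk, drives the ranking, which is the whole point of the ‘bang for the buck’ model.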
The second consensus to emerge from the Process Safety conference was, logically enough, that if safety is a matter of spending money, then those in charge of the purse strings should be involved in the decision-making process. This has translated, in the UK at least, to calls for better education of company boards and C-Suites in risk management. Training plant and refinery workers in safety is well established, but training the board in risk management? Actually this is exactly what the Cogent sector skills council is doing with a one-day course for senior execs.
One can’t help feeling that the desire to engage the board in risk management is a bit of buck passing. Maybe it would be better just to present them with a bill, or a few alternatives with some headline differences—for $10 million we will have a major accident every 50 years, for $20 million, every 100—or some such.
I am also a little uncomfortable about calling in the silk suits specifically on ‘risk.’ Not because they will be unfamiliar with this kind of analysis—but because it is actually rather close to many other facets of board-level activity. Risk analysis is well entrenched in the financial end of the business. What used to be called reserve estimation is now termed prospect ‘risking.’ All sorts of fancy math is brought to bear on just about every link in the decision-making process—like calculating the value of information, keeping a portfolio on the ‘efficient frontier’ and Monte Carlo analysis for all.
You can see where this is leading: a humongous enterprise risk model that arbitrates every buck spent across every ‘opportunity.’ A new pressure gauge here? A new seismic survey? Hire a geologist? Or a safety engineer?
I really don’t know what the answer to all this is. If I did I probably wouldn’t tell you, I would set up the next big management consultancy since Andersen. But enterprise level IT systems attempt to provide a holistic view and the same risk analytical techniques are used in more or less all fields. So there is a kind of logical inevitability that something like this may be attempted sooner or later. Perhaps it is already an option in SAP. A good project for an Excel-savvy intern perhaps.
On the other hand, if every department—process, exploration, IT and whatever—is rushing to ‘educate’ the C-Suite before taking a decision, that puts rather a heavy load on the board’s shoulders.
In the upstream, current work practice tempers the outcomes of number crunching activities like ‘risking’ a prospect with a healthy skepticism. Ultimately decisions are made on the basis of managerial judgment, gut feeling, flair—whatever you want to call it. I expect that in process safety it is probably not all that different.
My 2 cents is that we may have a language problem. Words like ‘risk management’ and ‘mitigation’ tend to make people believe that by following a set of procedures, risk can be eliminated. Far from being a safe outcome, this may contribute to a dangerous situation of complacency. This is the paradox of risk management—that the more we mitigate, the more complacent we get—and expose ourselves to the ‘stuff’ that will inevitably happen. Alongside the safety system, we need a culture of fear—one that encourages observant, critical individual behavior. As Ian Travers of the UK Health and Safety Executive said in his keynote, managers need to ‘stay skeptical.’
Speaking at the influential 2011 Geogathering event earlier this year, held in Broomfield, Colorado, Chris Hoidal, Western Region Director of the Office of Pipeline Safety, described ‘numerous high profile incidents’ that highlight the need for better pipeline safety data integration. In June 2010, some 800 barrels of crude oil spilled from a Chevron pipeline in Salt Lake City into a city park and creek, with an estimated $5 million clean-up cost to the operator. Hoidal explained that poor data integration within the integrity management program was a contributory factor to the spill. Since the pipeline was built, rights of way had been encroached upon by buildings, structures and vegetation. An electrical substation and grounding grid were installed over the pipeline in 1984 but the risk posed by electric fault currents was not explicitly incorporated into the integrity management program. The spill was initiated when a storm-induced electric arc burned a half inch hole in the top of the pipeline, releasing the crude (full OPS report on www.oilit.com/links/1110_1).
Another high profile incident occurred in July 2011 when ExxonMobil’s Silvertip pipeline released some 1,000 barrels of crude into the Yellowstone River near Laurel, Montana, causing an estimated $42 million of damage. River scour was the root cause of the break. While ExxonMobil was aware of the flood risk and had numerous remotely actuated valves, it still took 56 minutes after the first alarm to close the valve adjacent to the river.
Hoidal attributed the delayed response to a failure to ‘explicitly integrate local river crossing information, particularly local stream information, into the integrity management program.’ A subsequent investigation found that ‘few pipeline companies incorporate river and geotechnical risks when determining prevention and mitigation measures.’ Another high profile incident, the explosion of Pacific Gas & Electric’s San Bruno pipeline in September 2010, caused eight fatalities. The operator failed to keep external corrosion direct assessment (ECDA) information current. Hoidal concluded that ‘large systems need to ensure all pipeline information is integrated, particularly when ECDA is used.’
Operators can expect to see heightened attention from the DOJ and EPA. Currently there is concern that states are ‘too close to industry and lack the will to enforce regulations.’ Operators are required to examine their data yearly and report changes in geospatial, attribute, metadata, or contact information. The regulator is now extending assessments of pipeline integrity in high consequence areas to include ‘low stress’ pipelines. Preventative and mitigation measures ‘must be constantly improved upon.’ Congress will mandate improvements. Read the full paper on www.oilit.com/links/1110_2.
At the 2011 Manufacturing Enterprise Solutions Association conference in Orlando, Florida this month, Microsoft announced Chemra, an IT ‘reference architecture’ for the chemical and oil refining industries. Chemra promises a ‘common, flexible IT environment for technology and data integration and interoperability across the value chain.’ Like its upstream predecessor Mura (Oil IT Journal September 2010), Chemra comprises a set of ‘guiding principles’ for users of Microsoft technology rather than a recipe for deployment. The publicly available ‘datasheet’ (www.oilit.com/links/1110_3) is conspicuously free of, err, data.
Microsoft’s partners in Chemra include Accenture, AspenTech, Invensys Operations Management, OSIsoft, PROS, and Siemens’ industrial automation systems unit. Chemra principles advocate a ‘natural’ user experience, interoperability, enhanced collaboration through social networking and cloud computing, business insights and a ‘solid infrastructure of high performing technology.’ More from www.oilit.com/links/1110_4.
The ISO 15926 standard for plant data exchange got a shot in the arm when it was endorsed in 2009 by the US-based Fiatech Association. Fiatech is about to release an ‘Introduction to ISO 15926’—a pre-publication version of which (‘the Primer’) was used for this review.
Oil IT Journal has been diligently following ISO 15926 since around the turn of the millennium and has observed its evolution into what is today probably the most significant industrial use of the W3C’s semantic web technology. Diligence in our coverage has been necessary since, while the case for an interoperability standard is easily made, the choice of relatively untried technology was a bold one. We studied the Primer with interest, particularly to see if it offered an intelligible halfway house between the ‘business benefits’ approach and the perplexing world of RDF graphs.
Author Gord Rachar, an engineering automation specialist, writes entertainingly and well. His starting point is that ISO 15926 means that, ‘Your computer can talk to my computer and we don’t have to know anything about each other’s system beforehand.’ This facility comes from using the various ‘Parts’ of ISO 15926. Part 2, the data model, contains the ‘rules of grammar’ to use Rachar’s ‘conversational analogy.’ Part 4, the reference data, is ‘like a dictionary or a thesaurus.’ And thus, ‘using the jargon of ISO 15926, when machines structure their data using the data model of Part 2, and when their terminology matches Part 4, they can communicate.’
Part 7, the templates, is ‘like a phrase book,’ which makes it easier to create meaningful sentences in a new language without having to learn the language. Part 7 was developed to allow ordinary engineers to use Part 2 without having to actually master it.
Rachar’s naming of the parts appears to run out of steam at Part 8 which is ‘made up of some semantic web tools called resource description framework (RDF) and web ontology language (OWL).’ Part 8 is ‘like paper in a book, or a computer file’ and Part 9, the facade, is ‘like the postal service.’
Well, we are not really sure if that will be of great help to someone coming into the field from say, a relational database background or from an engineering one for that matter. Rachar continues to thrash his metaphors to death with ‘students [ ] playing host at a local coffee shop, [ ] might create a website with a paywall where readers create an account and pay a fee.’ All of which is apparently ‘analogous to Part 9.’
More apropos perhaps is Rachar’s walk through the history of CAD data exchange—from the US Department of Defense’s Initial Graphics Exchange Specification (IGES), through the desire to render CAD drawings ‘intelligent,’ to the Standard for the Exchange of Product model data (STEP), the immediate parent of ISO 15926.
This is all very well but we are three quarters of the way through the Primer and still no wiser as to how to write the code!
Oh, this must be it—a ‘Practical Guide to ISO 15926 Modeling.’ Here we learn that modeling with Part 7 is ‘sufficiently different that it was becoming a barrier to the wide acceptance of ISO 15926.’ At this point we are referred to the iRING users group website for more.
Getting back to more comfortable ground, Rachar explains in detail how compliance colors are used to judge the quality of ISO 15926 implementations. He then, like many educators before him, rushes quickly through the hard stuff—‘bindings to OWL/RDF,’ SPARQL (‘a language similar to SQL, but for ontologies’), WSDL, SOAP and HTTPS. The whole W3C enchilada in one sentence!
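For readers wondering what machine-to-machine ‘sentences’ might actually look like, here is a minimal plain-Python sketch of the subject-predicate-object triple idea behind RDF, with a SPARQL-style wildcard pattern match. All identifiers and data are hypothetical; real implementations would use an RDF/OWL toolkit and the Part 4 reference data, not a Python list.

```python
# Minimal sketch of the RDF triple idea underpinning ISO 15926 Part 8.
# Identifiers and data are invented for illustration.

triples = [
    # (subject, predicate, object)
    ("pump-101", "rdf:type",          "CentrifugalPump"),
    ("pump-101", "hasDesignPressure", "16 bar"),
    ("tag-P101", "identifies",        "pump-101"),
]

def match(s=None, p=None, o=None):
    """SPARQL-style triple pattern match: None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# 'What type of thing is pump-101?'
result = match(s="pump-101", p="rdf:type")
# -> [('pump-101', 'rdf:type', 'CentrifugalPump')]
```

The appeal of the approach is that two systems sharing the triple structure (Part 2’s ‘grammar’) and the vocabulary (Part 4’s ‘dictionary’) can answer each other’s queries without prior knowledge of each other’s schemas.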
In summary, the Primer does a good job of setting out the ISO 15926 storefront. It does a not quite so good job of explaining what all the Parts are there for. And it makes no attempt at all to provide instruction as to how to get down and code all this stuff that will magically enable machine to machine communication. Rachar, as a volunteer, is to be complimented on the considerable effort involved in authoring the guide. But the definitive oeuvre on real-world use—complete with code samples—has yet to be written. Read the Primer on www.oilit.com/links/1110_7.
Speaking at last month’s ECIM data conference in Haugesund, Norway, Richard Wylde (ExxonMobil and chair of the Oil and Gas Producers’ Geomatics Committee) announced the publication of three reports containing the results of the Geospatial Integrity of Geoscience Software (GIGS) initiative (Oil IT Journal October 2010).
The GIGS joint industry project set out to address concerns stemming from documented evidence of geospatial integrity failures. The intent is to establish good industry practice with respect to geoscience data management and use. The study has revealed that current practices are sub-optimal. Geographic standards are insufficiently deployed. There is ‘conflicting and inappropriate’ use of terminology—Wylde recommends sticking with ISO definitions. There remains frequent use of incomplete or wrong coordinate reference system data and, when it is available, it is often poorly documented. A lack of geodetic metadata leads to positional ambiguity. Companies are not auditing their own and their contractors’ systems sufficiently.
The GIGS reports are now available from www.oilit.com/links/1110_5. The three reports cover GIGS guidelines, the software review process and a user guide to the test data set. Also available is a software review spreadsheet for companies wishing to see whether their tools merit bronze, silver or gold level certification.
Wylde warns that end users should not expect all software to pass all the tests—but it should be possible to ascertain how software handles geodetic information. This may all sound rather esoteric, but as Wylde explained, in one joint venture, an evaluation following a change of operatorship revealed that partners had been using different coordinate reference systems while drilling. Even for this relatively green field with ‘a few wells and a little bit of seismic,’ it took 18 months to sort out and bring everything to a common framework. More on the OGP Geomatics Committee from www.oilit.com/links/1110_6.
The current issue of Ryder Scott’s quarterly newsletter contains an informative analysis of the state of the art in shale gas reserve estimation. Focus is on decline curve analysis and the techniques and exponents that are appropriate for shale gas. The current favorite is Pete Valko’s ‘stretched exponential production decline’ (SEPD) method, which has been integrated into software from TRC, Fekete and Halliburton. Ryder Scott also looks into the way the DCA techniques interact with the new SEC reporting requirements. More on SEPD on www.oilit.com/links/1110_25.
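For the curious, the SEPD model expresses rate as q(t) = q0 * exp(-(t/tau)^n), a stretched exponential whose exponent n (between 0 and 1) slows the decline relative to a plain exponential. A minimal sketch follows; all parameter values are invented for illustration.

```python
import math

def sepd_rate(t, q0, tau, n):
    """Stretched exponential production decline (Valko):
    q(t) = q0 * exp(-(t/tau)**n). Parameter values here are illustrative."""
    return q0 * math.exp(-((t / tau) ** n))

# Hypothetical shale gas well: 10 MMcf/d initial rate,
# characteristic time 12 months, stretching exponent 0.5.
rates = [sepd_rate(t, q0=10.0, tau=12.0, n=0.5) for t in range(0, 61, 12)]
```

Fitting q0, tau and n to early production history, then integrating the curve, is what yields the estimated ultimate recovery figures that feed SEC reserves reporting.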
Ryder Scott has put papers from its Annual Reserves Conference online (www.oilit.com/links/1110_80). Topics include an analysis of recent SEC subpoenas to shale gas producers. The bottom line, according to presenter Jeffrey Elk of Porter Hedges is that the SEC is looking for discrepancies between what shale gas companies are telling investors about costs, profits and well performance and ‘reality’.
Andrew Barron of Rice University reviewed the use of nanotechnology in oil and gas shale production. One novel technique ‘fingerprints’ frac fluids with ‘uniquely detectable’ magnetic nanoparticles. Thus, if there are claims of frac fluids polluting water resources, companies can ensure ‘that it isn’t your frac that makes the news’. More from Rice’s ‘Connexions’ portal—www.oilit.com/links/1110_27.
Ryder Scott also announced the release of its ‘SEC Seeker’ freeware application. SEC Seeker searches publicly available datasets of oil and gas company filings within the US SEC’s Edgar database. SEC Seeker retrieves annual filings for US, Canadian and international companies, along with comment letters and other collateral. SEC Seeker is a web app that will be available early in 2012. Read Ryder Scott’s current newsletter on www.oilit.com/links/1110_26.
The 2011 UK Continental Shelf Demographics Report from Oil & Gas UK paints a ‘broadly optimistic’ picture of the industry. The average age of the 50,000 offshore workers is steady at around 41. Data collected over a five year period shows a growing number of people under thirty. While the majority of the workforce hails from the UK, the study found an overall total of 130 nationalities represented. There was a decrease in the number of workers in the 30-60 year age range—somewhat debunking the ‘greying workforce’ notion. According to the report, challenges remain in recruitment and retention of experienced workers in the face of strong competition from other offshore regions around the world and from other industrial sectors, including renewables. Download the report from www.oilit.com/links/1110_28.
The 2011 ‘major release’ of Paradigm’s Skua/Gocad geomodeler was unveiled at the SEG in San Antonio last month. Skua now occupies pole position in an Epos-connected constellation of upstream applications including Geolog, SeisEarth, ES 360 (now with added shale gas functionality), economics and well planning with SysDrill.
Skua has now been re-written with a Qt-based GUI and is, like the rest of Paradigm’s applications, available on Windows. The re-write also includes a port to the Epos database. Skua has a new object management approach. All E&P objects—wells, seismics, faults—are shared through the Epos database. It is possible to define usage scenarios and store results in the database. A macro recorder/player allows for re-run and edit of proven workflows. Paradigm also reports significant speedup with its CUDA/GPU-based solver. Ultimately the aim is for Skua to occupy center stage with other apps running as plug-ins.
On a related topic, a debate rages on the LinkedIn Geomodeling Network (www.oilit.com/links/1110_82) on the merits of different gridding styles and upscaling methods. More from www.oilit.com/links/1110_35.
Speaking at the Geogathering event earlier this year, Gerardo Chávez described Pemex’ ‘process-centric’ approach to pipeline integrity management. Pemex’ pipeline and facility integrity programs leverage Esri GIS, SQL Server, SharePoint and Pemex’ own risk and integrity applications. Pemex, with help from NRG Tech, developed the ‘process-centric’ approach to manage integrity utilizing cloud-based technology. Company and contractor cross-functional processes are orchestrated to ensure that work is properly executed. The system is part of a five year, $1 billion integrity management program.
The system leverages NRG’s ‘enhanced pipeline risk assessment’ methodology (www.oilit.com/links/1110_81) which relates risk levels to potential consequences, targeting resources to where they will have most impact. Risk mitigation is the product of integrity management and high consequence area analysis. A mitigation project is initiated in Microsoft SharePoint, with project-specific data read in from SAP into the process project folder. A SharePoint list is generated and runs the process.
Chávez recommends developing metrics up front and working on processes ‘as-is.’ It is also a good idea to develop a proof of concept in Excel before deployment. Read Chávez’ paper on www.oilit.com/links/1110_8 and more from Geogathering on www.oilit.com/links/1110_9.
Rock Flow Dynamics has announced ‘tNavigator,’ an interactive, graphical, black oil parallel finite difference reservoir simulator. tNavigator leverages thread parallelization for multicore workstations and has financial backing from Intel. Developed in Russia, tNavigator claims 300 license sales to TNK-BP, GazpromNeft, Lukoil, Rosneft, Gazprom and Sinopec—www.oilit.com/links/1110_38.
Barco has announced the OLS series of 3D LED video wall modules for the control room and simulation markets—www.oilit.com/links/1110_39.
The 3.0 release of Emerson/Roxar’s Fieldwatch adds sand erosion control reservoir management features to enhance production performance—www.oilit.com/links/1110_83.
Blueback Reservoir has announced new releases of its Earthworks seismic inversion and Tracker Petrel plug-ins—www.oilit.com/links/1110_40.
ClearEdge3D’s EdgeWise Plant 2.0 supports large projects with the ability to process up to 1,000 laser scans and unlimited points. EdgeWise automates the extraction of complex CAD pipe geometry from 3D laser scanned data into 3D industrial plant models—www.oilit.com/links/1110_41.
V 9.0 of the PetrisWINDS DataVera data quality tool adds support for 64-bit virtual servers with many gigabytes of RAM. A performance hike of 5-10x is claimed—www.oilit.com/links/1110_42.
V 5.0 adds quantitative analysis to the visualization functionality of Dynamic Graphics’ CoViz 4D. Geoscientists can compare 4D seismic with reservoir simulations and other time series data—www.oilit.com/links/1110_43.
EMC has released Isilon PetroVault—a storage system for pre and post stack seismic data—www.oilit.com/links/1110_44.
ESRI’s ArcGIS 10.1 release is to plug and play with IBM’s Netezza analytics appliances providing geospatial query and analysis of large datasets—www.oilit.com/links/1110_45.
ffA’s SVI Pro and SEA 3D Pro 2011.1 include data links to Petrel, SaltApp and Geoprobe—www.oilit.com/links/1110_46.
Geomodeling Technology Corp.’s AttributeStudio 7.1 includes tools for engineering and microseismic data analysis for unconventional plays—www.oilit.com/links/1110_47.
A new release of Lynx Seismap for ESRI ArcGIS Desktop adds import for common E&P data types into file geodatabase feature classes and raster grids— www.oilit.com/links/1110_48.
Pegasus Vertex’ 2.4 release of MudPro offers Windows 7 support and enhanced reporting, volume tracking and concentration recap—www.oilit.com/links/1110_49.
MatrikonOPC’s new Alarms and Events Historian improves preventative maintenance programs with analytics and diagnostics of real time alarm information using standard database queries—www.oilit.com/links/1110_50.
Schlumberger has released ‘Ocean Mobile,’ a Windows phone channel for Ocean and Petrel plug-in news—assuming anyone has a Windows phone—www.oilit.com/links/1110_60.
Senergy Software has announced Oilfield Data Manager 3.7 with true stratigraphic thickness capability, enhancements to the reservoir performance module and a TCP/IP link to Petrel—www.oilit.com/links/1110_61.
V 8.0 of Wood Group unit MCS Kenny’s Flexcom riser design software addresses multi-case extreme and fatigue sea state loading conditions—www.oilit.com/links/1110_62.
V 4.0 of Quest Integrity’s ‘Signal’ engineering fitness-for-service toolset adheres to the API 579 standards for fracture mechanics analytics on fixed and rotating equipment—www.oilit.com/links/1110_63.
The 2011 edition of Oracle’s OpenWorld conference took place this month in San Francisco. The oil and gas/utilities track included presentations from Southwestern Energy, Santos, Transocean and Oracle’s own Melinda McDade.
Following its acquisition of Sun Microsystems last year, Oracle is now a hardware vendor. Martin Paynter of ‘Oracle-centric’ consultants Enkitec unveiled Southwestern’s deployment of the Sun/Oracle ‘Exadata’ database appliance. Exadata offers database servers, storage and network in a single enclosure, all ‘optimized for data warehouse and transactional processing.’ Paynter presented some compelling benchmarks comparing Exadata execution times with the ‘traditional’ approach. He was joined by Southwestern’s lead Brad Salva who described the company’s e-business (EBS) project—a migration to Oracle’s E-business R12 and Business Intelligence, and P2ES Enterprise Land, Energy Upstream and Wellcore applications. Southwestern considers the EBS to provide a single source of truth that has been easy to deploy and offers ‘extreme’ performance. The presentation contains a lot of detail on the hardware, on application and backup benchmarking and system tuning.
Santos’ Steven Benn described another Exadata deployment—as a ‘private cloud’ for production allocation across Santos’ complex network of production and refining facilities in the Cooper Basin, Australia. The real challenges here are the business risks of missing delivery deadlines, invoicing issues and getting production data ‘right.’ Benchmarking the legacy production allocation system against Exadata showed a ten times speedup—folks were impressed. Santos is now working on improving performance on its Intergraph Smart Plant engineering applications via the Exadata cloud.
Oracle’s Melinda McDade and Stewart Levin (Stanford) returned to the ‘private cloud’ metaphor with a proposal for Exadata’s use in seismic workflows. Alongside its performance, an Exadata ‘cloud’ promises virtualized resources, scalability and data parallelism. Some jobs (Monte Carlo simulations, data mining and attribute analysis) are an ‘easy fit’ to the cloud. Others (seismic imaging, reservoir and refinery simulation) are more challenging. According to the authors, Exadata can rise to the challenge—for instance with seismic data processes running on metadata in the database and trace data in dedicated parallel storage. Proof of concept migration trials on the 60 terabyte SEG Advanced Modeling (SEAM) data set are underway. Download presentations from Oracle World on www.oilit.com/links/1110_36.
The 2011 Society of Exploration Geophysicists’ annual convention drew just over 8,000 attendees. It kicked off with a rather lackluster Forum on ‘Exploration Frontiers, geography, technology and business models.’ Shell’s David Lawrence reported that shale/tight gas reserves would provide 100 years of supply in the US and 250 years internationally. Shell and Hewlett Packard will field a million sensor wireless seismic system by 2015. Tim Dodson (Statoil) shared some spectacular seismics on the Skrugard discovery showing a massive double flat spot (gas on oil and oil on water), visualized thanks to ‘partially proprietary’ algorithms bringing ‘competitive advantage’ to Statoil. WesternGeco president Carl Trowell observed that seismic imaging is evolving faster than drilling. Depth imaging now constitutes over 50% of WesternGeco’s business. Every imaging project is bespoke—contractor and operator work ‘hand in hand.’ This is forcing organizational and technological change as software platforms need to share velocity and earth model data. A modern 1,000 sq. km survey with 120,000 full azimuth point receivers can generate 700 terabytes. The petaflop computing needed for this has been ‘piggy backing’ on gaming technology but today, WesternGeco is designing its own ‘chips of the future.’
Leila Gonzales reported on a study by the American Geological Institute that broadly confirmed the imminent ‘big crew change’ in geoscience professionals where a large cohort falls in the 50-60 age group. Around half of today’s 260,000 geoscientists will retire in the next decade. Even if all new graduates get hired, this will mean a 150,000 shortfall by 2021. Geoscience degree production is moving east, to the EU, Russia, China and Indonesia making for a global, mobile workforce—www.oilit.com/links/1110_10.
Parveneh Karimi of the University of Texas at Austin presented a novel approach to the ‘UVT’ system of stratigraphic coordinates as deployed in Paradigm’s Skua. The method allows interpreters to work in a pre-deformation frame of reference and is said to improve inversion. The new method leverages Sergey Fomel’s ‘predictive painting’ autopicking algorithm—www.oilit.com/links/1110_11.
Alex Martinez provided an insight into ExxonMobil’s approach to shale gas prospecting. Shale gas success involves matching geoscience challenges with drilling decisions. Rock physics is the ‘Rosetta stone’ at the interface of these disciplines and the key technique is forward modeling from hypothesis to seismic response. Rock physics intervenes across the acquisition, processing and interpretation cycle. Regulatory and environmental challenges impact survey design. It can be hard to untangle the influence of fractures from other effects. Decision making speed is of the essence as shale gas drilling may move on regardless! GPS and GIS are heavily used in acquisition. In a Piceance Basin survey, Exxon issued GPS transponders ‘to show the regulator where everyone was.’ The survey mobilized seven helicopters simultaneously. High performance computing is a big help in quantitative data analysis. While seismic costs are high relative to drilling, Martinez hopes that, ‘we can move from pattern drilling to seismically guided drilling.’
Landmark’s latest release of its DecisionSpace Desktop includes Permedia’s Mpath for petroleum system modeling (acquired by Halliburton last year). A new subfusc GUI is said to ‘draw the eye to the data.’ A ‘VelGen’ velocity tool works across the processing and interpretation domains—from ProMax/JavaSeis to complex multi-z structural frameworks in Geoprobe. The Lithotect add-in allows for detailed structural geological interpretation. All results are stored in the OpenWorks database and an API is available for third party plug-in development—underlining DSD’s role as challenger to Schlumberger’s Petrel. Intriguingly, DSD was also running on a Mac—showing gestural interaction with the data via the touch pad. This is currently a research platform but is considered as a ‘vindication of the flexibility provided by Landmark’s Java/Qt codebase.’
Rich Hermann and Jitesh Chanchani elaborated on IHS’ acquisition of Seismic Micro-Technology (SMT). The deal was struck on the back of SMT’s ‘market leading’ position in unconventionals. Short term plans include integration between SMT’s Kingdom and IHS’ Petra, GeoSyn and LogArc packages. Later this will evolve into a new ‘unified G&G workstation’ extending the Kingdom SDK with more specialized apps and close coupling with IHS’ 5.5 million global wells dataset.
OpenSpirit, now part of Tibco, was showing ‘Tibbr,’ social networking for the enterprise. Tibbr lets users ‘follow’ people, subjects and events by subscribing to, for instance, a ‘US Land’ or a ‘New Ventures’ stream. Streams can be expense reports or invoices and each rolls up different apps and data sources. Tibbr ‘removes the chatter’ by moving email ‘conversations’ to a database and reducing notification frequency. Apache is trialing the system on a drilling community of practice—www.oilit.com/links/1110_12.
OpenSpirit has been busy integrating Tibco’s BusinessWorks information hub to extend its coverage to include SAP and any Oracle database. Business process automation can now span geotechnical and financial data sources. A visual programming paradigm provides connectors to PeopleSoft, Siebel and Tibco’s ActiveMatrix orchestration engine. A use case might connect well information in SAP to OpenWorks, managing units of measure and coordinates en route.
Kris Pister (UC Berkeley) reviewed the state of the art in wireless networks. Modern spread spectrum channel hopping nodes run for years on batteries or on energy ‘harvesters.’ The ‘nasty’ standards battle has been won by the WirelessHART protocol. Chevron has deployed a 4,000 sensor mesh at its Richmond refinery—www.oilit.com/links/1110_13.
Rebecca Saltzer described how ExxonMobil has been using broadband seismometers to provide low frequency velocity information to stabilize migration. The trial was performed on the LaBarge field, Wyoming, with control from thousands of wells. 55 broadband Güralp seismometers (loaned by the National Science Foundation) were deployed at an unusually close 250 meter spacing and recorded background noise for six months. Data was recorded to 2 GB memory cards which were retrieved every two months. Nearly 600 teleseismic events were recorded, including the magnitude 5.7 Costa Rica event. Data was processed with the same code as used in global seismology. The results were very similar to the well derived velocities—www.oilit.com/links/1110_14.
Puneet Saraswat (Indian School of Mines) advocated the use of immune theory and self-organizing maps to classify seismic facies. Seismic patterns are ‘learned’ and, like vaccines, ‘stay in the system.’ The demo showed some compelling if rather glib signal to noise improvements—www.oilit.com/links/1110_15.
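For readers unfamiliar with the technique, a self-organizing map clusters feature vectors onto a small grid of ‘prototype’ nodes, each of which comes to represent a facies class. The sketch below is a generic, minimal 1-D SOM in plain Python—an illustration of the method only, not Saraswat’s implementation, and the seismic ‘attributes’ it clusters are hypothetical.

```python
import math
import random

def train_som(samples, n_nodes=4, epochs=50, lr0=0.5):
    """Train a tiny 1-D self-organizing map on feature vectors.
    Returns the trained node weight vectors (the learned facies prototypes)."""
    dim = len(samples[0])
    random.seed(42)  # deterministic initialization for reproducibility
    nodes = [[random.random() for _ in range(dim)] for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                     # decaying learning rate
        radius = max(1.0, n_nodes / 2 * (1 - epoch / epochs))  # shrinking neighborhood
        for x in samples:
            # best-matching unit: the node closest to the sample
            bmu = min(range(n_nodes),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
            # pull the BMU and its neighbors toward the sample
            for i in range(n_nodes):
                d = abs(i - bmu)
                if d <= radius:
                    h = math.exp(-d * d / (2 * radius * radius))  # neighborhood kernel
                    nodes[i] = [w + lr * h * (xv - w) for w, xv in zip(nodes[i], x)]
    return nodes

def classify(nodes, x):
    """Assign a sample to its nearest SOM node (its facies class)."""
    return min(range(len(nodes)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], x)))
```

Trained on, say, two populations of (amplitude, frequency) attribute pairs, samples from the two populations end up assigned to different nodes—the basis of an unsupervised facies classification.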
Sharp Reflections’ PreStack Pro (www.oilit.com/links/1110_53) is a lightweight system for interactive pre-stack seismic interpretation and processing. PreStack Pro embeds Fraunhofer’s global programming interface (GPI) and optionally the Fraunhofer parallel file system—www.oilit.com/links/1110_16.
Enthought’s business model centers on support services for scientific programming in Python and real time data visualization. CTO Travis Oliphant is the original author of NumPy, the numerical foundation of the SciPy stack. Three levels of support are available, from freeware to premium. Researchers at Shell and Conoco are users. Clients have developed Python code for pore pressure, AVO and microseismic analysis. Enthought is also repurposing its ‘big data/NoSQL’ approach, originally developed for tick level trade data analysis in financial services, to production data streams—www.oilit.com/links/1110_17.
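As a flavor of the kind of lightweight geophysical scripting described above—not Enthought’s or any client’s actual code—here is the classic two-term Shuey approximation to amplitude-versus-offset (AVO) reflectivity, R(θ) ≈ R₀ + G·sin²θ, with intercept R₀ and gradient G:

```python
import math

def shuey_reflectivity(r0, g, theta_deg):
    """Two-term Shuey approximation to P-wave reflectivity versus
    incidence angle: R(theta) ~ R0 + G * sin^2(theta)."""
    return r0 + g * math.sin(math.radians(theta_deg)) ** 2

# At normal incidence the reflectivity is just the intercept R0;
# a negative gradient dims the reflection with increasing angle.
near = shuey_reflectivity(0.1, -0.2, 0)    # 0.1
far = shuey_reflectivity(0.1, -0.2, 30)    # ~0.05
```

A few lines like these, wrapped around NumPy arrays of real gathers, is typical of the scripting workflows the Python stack enables.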
Reservoir Labs’ ‘RStream’ CUDA code generator for seismic processing produces optimized code for the GPU that ‘compares well’ with optimization performed by an expert hand coder—www.oilit.com/links/1110_18.
Janet Sinclair kicked off the 2011 user conference of the Pipeline Open Data Standard (PODS) association in Sugar Land, Texas this month. PODS is thriving: ten countries were represented by 199 attendees (50% up on last year), just over half from operators. PODS’ mission is to ‘develop and support open data storage and interchange standards for the pipeline industry.’ Growth in the data model, now in its 5.1 release, has been modest in recent years; the current table count stands at 678. PODS is reported as ‘widely adopted,’ with over 100 active member companies. The flagship deployment is the US National Pipeline Mapping System (NPMS), which has migrated its 510,000 mile database to PODS. PODS now also has a reference spatial implementation (a joint venture with ESRI).
Ron Brush (New Century Software) provided an update from the PODS technical committee. Along with the ESRI geodatabase in the 5.1 release, PODS is working on Oracle and SQL Server spatializations. A ‘OneCall’/damage prevention sub model was completed in 2011 and work has begun on ‘modularization’ of the data model. An analysis of table usage is underway to identify little-used parts of the model and a draft spec for a PODS API has been developed.
Jeff Cannedy (Chevron Pipe Line) spoke at a panel session on the use of GPS data in PODS. CPL has a central PODS implementation for static asset data. Transaction data is kept in SAP and control center applications. Field data is collected using Trimble hardware and SDT Cartopac/EIM software. CPL is still assessing handheld data gathering and working to eliminate functional governance silos. Pipeline data management requires both coordinate and stationed locations—PODS ‘has not made it easy to manage both.’
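The coordinate-versus-station tension Cannedy refers to is that pipeliners locate features by ‘stationing’ (distance measured along the line) while GPS and GIS deliver coordinates. Converting between the two is linear referencing, sketched below in a toy form; real PODS implementations must also handle re-routes, station equations and geodetic rather than planar distance.

```python
import math

def station_to_xy(centerline, station):
    """Interpolate the (x, y) position at a given station (distance along
    the line) on a pipeline centerline given as a list of (x, y) vertices.
    Toy planar linear-referencing sketch only."""
    acc = 0.0  # cumulative distance walked along the centerline
    for (x0, y0), (x1, y1) in zip(centerline, centerline[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if acc + seg >= station:
            # the station falls on this segment: interpolate linearly
            t = (station - acc) / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        acc += seg
    return centerline[-1]  # station beyond the end: clamp to the last vertex
```

For example, station 15 on a centerline running 10 units east then north lands half way up the northward leg, at (10, 5).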
Darrel Clarke reported on a complex GPS data chain at DCP Midstream, with little automation and a high degree of manual QC—deemed necessary to ‘maintain the high level of data quality.’
PODS president JW Lucas described Enterprise Products’ comprehensive deployment of PODS V5.0. A centrally deployed SQL Server instance manages EP’s 50,000 mile network, supporting mapping, accounting, reporting, integrity management and more. The key is ‘standardized, central database management, governed by business and supported by geospatial subject matter expertise.’
Total unit TIGF’s Jean-Alain Moreau showed how PODS has been adapted to an EU context. TIGF began work on PODS in 2006 and has since devoted over 65,000 person-hours to building data feeds into PODS from Bentley Microstation and Intergraph tools and integrating with Total’s own ‘OGIC’ pipeline information management tool. The project involved a migration from paper to georeferenced digital data. The PODS-based solution is now aligned with French regulatory requirements. Moreau observed that PODS ‘is powerful but complex’ and that it is better ‘to store the data you need rather than all the data you have!’ Presentations from the PODS User Group can be downloaded from www.oilit.com/links/1110_37.
Alliance Geotechnical Services and its Indonesian unit PT Aliansi Lintas Teknologi have received ISO 9001 certification from quality assurance specialist SGS.
Computer Modelling Group has appointed Ryan Schneider VP Marketing—he was previously CTO of Acceleware.
Alan Brunnen and Tove Røskaft now have seats on Aker Solutions’ executive management team and Jesper Ericsson heads up a new subsea engineering office at Gothenburg, Sweden.
Joe Drinon has been appointed VP Communications with Advanced Visual Systems.
David Blacklaw is the new CEO at Badger Explorer. Outgoing CEO Kjell Erik Drevdal becomes executive VP, market development.
Kevin Shaw, formerly with Wellington West has joined Casimir Capital as head of global energy research.
Jorge Machnizh is CEO and Kathy Ashmore is director of marketing with startup Sigma3 Integrated Reservoir. Machnizh was previously with Paradigm and Ashmore with OpenSpirit.
Will Coombs is the inaugural Ikon Science computational geomechanics lecturer at Durham University, UK. Ikon has also announced the move of Henry Morris to VP global corporate development. He is replaced as VP Asia Pacific by David Flett who joins the company from Searcher Seismic. Jacqueline Ming has joined Ikon from CGGVeritas as business development manager.
Yang Zupei (CNPC) is chairman of DNV’s new Greater China pipeline technical consultative committee.
Barry Chovanetz has joined East West Petroleum as VP engineering. Chovanetz hails from Hess.
Vickie White is now training and business development manager with Exprodat. She was previously with Esri UK.
Stephan Reimelt is now VP and CEO of GE Energy Germany.
Brian Sweeney is now senior VP global sales at IHS. He was previously with HP.
Jim Ducote has joined Joule Processing as business development consultant.
Beth Sellers is the deputy director of Los Alamos National Laboratory. Sellers was previously with Areva.
The Society for Maintenance and Reliability Professionals has announced the SMRP Library of Knowledge—www.oilit.com/links/1110_52.
Charles Weiss, James Hanson and Carl Miller make up the new energy investment banking team at MM Dillon & Co.
Peter Rourke has joined Midland Valley as software tester.
Rob Cox is to lead the new OGP/IPIECA oil spill response joint industry project.
OTC Global Holdings has appointed Suresh Dongre as CIO. Dongre was previously CIO of Saracen Energy Advisors.
Charles Goodman is president and COO of P2 Energy Solutions. He was previously CEO of Ventyx.
Tim Weller is CFO and an executive director of Petrofac. He succeeds Keith Roberts, who is to retire.
Kurt Bettenhausen is senior VP Siemens Corporate Research, the company’s U.S. research and development division.
Samantha Murray is now senior account representative with Petrosys in Houston. She recently joined the company from Deloitte’s Petroleum Services Group.
Brad Wells is now VP inspection services with PinnacleAIS. He was previously with Pro-Inspect Inspection Services.
Jean-Pierre Carsalade is France MENA director with SGI. He was previously with CS-SI.
Phil Brading is now operations manager at Venture Information Management and Simon Cushing is head of professional services. Brading hails from Landmark.
Aveva has acquired the software division of Z+F UK, developer of the LFM 3D laser point cloud processing package. Aveva is to improve integration of laser data with its Laser Modeller engineering tool and will establish a 3D data capture centre in Manchester, UK.
Flint Energy Services has completed its acquisition of Carson Energy Services in a deal comprising $112 million in cash, shares and an additional contingent payment.
Fugro is to acquire the ocean bottom nodes business of SeaBird Exploration for $125 million.
HP has acquired Autonomy Corp. for £25.50 per share cash.
NuTech Energy Alliance has acquired special core analysis boutique Poro-Labs.
Pipeline monitoring specialist Omnisens has received a ‘substantial’ investment from Credit Suisse’s wholly-owned investment vehicle SVC.
Chinese oil country ERP specialist Pansoft is to acquire Hefei Langji Technology and its Shanghai Zhongrui unit for RMB 10.8 million ($1.7 million).
PennWell has acquired the assets of energy mapping and database provider eMarket Software. eMarket’s technology will add web-based mapping to PennWell’s MAPSearch offering.
Nova Metrix has acquired UK-based Sensornet, provider of fiber-based distributed temperature and strain measuring systems. Sensornet’s former owner Tendeka is to retain the in-well oil and gas business.
Technip is to acquire the subsea engineering company Global Industries for $8 per share. The transaction values Global at $1,073 million including debt.
China National Offshore Oil Corporation (CNOOC) has deployed the Radwin 2000 high-capacity point-to-point system to connect its corporate offices to offshore oil platforms. The sub-6 GHz solution provides up to 200 Mbps at ranges of up to 120 km. A CNOOC spokesperson said, ‘Our previous VSAT links were too low bandwidth and didn’t satisfy our growing network data service demands.’ More from www.oilit.com/links/1110_29.
Earlier this year, Wireless Seismic contracted with Verif-i, an independent seismic instrumentation auditing company, to evaluate its RT 1000 system with a suite of industry standard tests. The tests included noise, impulse response, distortion, common mode rejection and gain accuracy. Verif-i CTO Chris Woodward concluded, ‘Following the three day test, we are satisfied that the RT 1000 system performs within the published technical specifications.’ Wireless Seismic is backed by Chesapeake and Norway-based VC, Energy Ventures. The full specs are available on www.oilit.com/links/1110_30.
FreeWave Technologies’ latest installations for Williams Field Services include the FGR and FGR2 series for serial communications. Williams is deploying FreeWave’s FGR-CP radios to monitor its cathodic protection system and safeguard its natural gas pipelines and pipe-to-soil test stations. More from www.oilit.com/links/1110_58.
Writing in the ABB Review, Sergio Casati described how ABB has leveraged the Pipeline Open Data Standard (PODS) association’s data model to deliver a geographic information system as a component of a $650 million engineering contract for a group of companies headed by the Sonatrach-Anadarko association. The project was carried out on the El Merk oil and gas fields in the Berkine Basin, Algeria. When completed, the fields’ 140 wells will be linked by a system of field pipelines to ten gathering stations and thence to a central processing facility.
To manage the $650 million contract, one of ABB’s largest ever, ABB is developing a GIS for the entire El Merk project. The Intergraph GIS-based infrastructure management system supports design, build and maintenance of the field, facilities and pipelines and provides all stakeholders with an ‘evergreen’ view of project progress. The system calculates construction material needs and tracks potential conflicts along pipeline corridors, such as differences in elevation, the angle of dune slopes, and so on. The solution leverages the PODS data model to automate delivery of alignment sheets and other documents. The PODS/GIS combo is proving a major contributor to the success of the El Merk project, where several EPCs, subcontractors and thousands of workers are on site at any one time. An estimated 12 months has been shaved from the project schedule thanks to the system. More from www.oilit.com/links/1110_33.
Speaking at the 2011 EU Congress of Chemical Engineering in Berlin last month, Martin Gainville (IFP Energies Nouvelles) presented a proposal for an industrial strength interface that will permit hydrodynamic point models to be swapped seamlessly between software tools. The CoLan Hydro special interest group is led by IFP with representation from Kongsberg, SPT Group, Total and Infochem. Hydro models are used to study multiphase flow in pipes and have applications in the nuclear industry and in oil and gas production (particularly in modeling slugging).
Prototyping has now progressed to the point where the partners are inviting expressions of interest in the development of a comprehensive software interface which will allow interaction between hydrodynamic components, unit operations and process models. CoLan has released the current spec for consideration by the community of researchers and practitioners involved in multiphase flow modeling and simulation. More from www.oilit.com/links/1110_34.
Rick Morneau’s startup, Morneau Consulting has signed with the TradeFair Group to co-locate a Virtual Reality summit with the upcoming Digital Plant 2011 event in Houston. The VR Summit will include a ‘serious gaming’ segment covering the ‘rapidly developing’ virtual reality/3D/industrial gaming markets. Morneau said, ‘The VR Summit will provide real world content, data and market intelligence on how virtual reality-based technologies open opportunities for plant owners and operators to improve performance and reduce risk across the plant lifecycle.’ Morneau, previously with Chevron, has set up his consulting outfit to apply VR-based technologies to complex work processes. More from www.oilit.com/links/1110_31 (Digital Plant) and www.oilit.com/links/1110_32 (Morneau).
Barco has partnered with Genetec to offer IP-based video surveillance to emergency operations centers (EOC). Barco’s Control Room Management Suite now interfaces with Genetec’s IP-based ‘Omnicast’ streaming video technology to provide a ‘single perspective’ overview of remote video feeds to the EOC display wall—www.oilit.com/links/1110_70.
A proposal from Petris has received funding from the Houston Advanced Research Center to develop a GIS system for optimizing well placement and sharing of environmental data types with stakeholders. The award was made in response to concern from land owners regarding unconventional exploration and production in the Eagle Ford area of South Texas—www.oilit.com/links/1110_71.
BG Norge has awarded FMC Technologies a $135 million contract for subsea equipment on its Knarr field. The deal includes subsea trees, control modules, wellheads, manifolds and related controls. Delivery is to start early 2013—www.oilit.com/links/1110_72.
Canrig Drilling Technology has licensed managed pressure drilling technologies from Managed Pressure Operations International (MPO). The agreement will also enable technological and engineering collaboration between Canrig and MPO for future MPD system development—www.oilit.com/links/1110_73.
E.ON has selected Industrial Defender to provide security and compliance technology at 40 of its sites. The solutions come as specific bundles to fulfil national and EU regulations governing critical infrastructure—www.oilit.com/links/1110_74.
WeatherBug operator Earth Networks has partnered with Borrasca Iniciativas Atmosféricas to offer weather networking, forecasting, data visualization and greenhouse gas monitoring services to the Iberian and Latin American markets—www.oilit.com/links/1110_75.
The Middle Tennessee natural gas utility district (MTNG) is availing itself of Esri’s small utility enterprise license agreement (SU-ELA) program to obtain ArcGIS technology for mapping and managing its network. Next year MTNG plans to expand into mobile mapping with ArcGIS for iOS, providing field access and reporting with Apple’s iPhone and iPad hardware—www.oilit.com/links/1110_76.
GE Oil & Gas has been awarded a contract from OGX Petróleo e Gás to supply drilling and production equipment for three offshore fixed production platforms to be deployed in the Waimea and Waikiki oil and gas fields of the Campos Basin, offshore Brazil. The contract is worth a potential $230 million over four years—www.oilit.com/links/1110_77.
IHS has won a contract to ‘standardize and enrich’ DuPont’s maintenance, repair and operations (MRO) materials catalog and implement new catalog management software for MRO data governance. DuPont conducted a comprehensive investigation of MRO data solutions, concluding that IHS’ ‘Intermat’ offering, the Standard Modifier Dictionary and Struxure software were best suited to its requirements—www.oilit.com/links/1110_78.
Ipcos has teamed with Abu Dhabi-based Golden Falcon Petroleum Services to extend its process control software and optimization offering to the Middle East market. Ipcos will also be actively marketing its Intelligent Operations solution for the upstream market—www.oilit.com/links/1110_79.
Brazilian E&P company OGX Oil and Gas is to deploy Landmark’s DecisionSpace for Production. The software is integrated with both Landmark’s OpenWorks and Engineer’s Data Model project databases—www.oilit.com/links/1110_90.
Volant has teamed with Neuralog to integrate EnerConnect and NeuraDB, enabling data movement between NeuraDB and geoscience applications such as Petra, Geographix and OpenWorks. The agreement kicks off a strategic partnership to deliver a commercial, integrated data management solution for the upstream. Additional geoscience applications, including Petrel and Paradigm, will later be integrated with the NeuraDB platform—www.oilit.com/links/1110_91.
Maersk Oil has awarded Lloyd’s Register’s Scandpower unit a contract for hazard identification and risk assessment (HIRA), accident and incident investigation, management of change and contractor management. Scandpower CEO Bjorn Inge Bakken said, ‘Ensuring consistent standards of risk management across a global network of high-value offshore assets is a complex task. Operators often face challenges that arise from differing legislative regimes, work cultures and levels of asset maturity. Lloyd’s Register is one of the few independent assurance providers with the global reach and diversity of technical expertise needed to support multinationals in this area’—www.oilit.com/links/1110_92.
Opsens and Lios Technology are partnering to provide an integrated solution to measure pressure and temperature in SAGD wells. Opsens’ OPP-W high-temperature optical pressure and temperature sensor will be combined with Lios’ high-temperature distributed temperature sensing system to provide a complete profile to optimize and improve SAGD well management. Sub-meter spatial resolution is claimed for pressure and temperature measurement along a horizontal well—www.oilit.com/links/1110_93.
Paradigm has announced new connectors for its Epos upstream integration architecture. TerraSpark is developing a link from its Insight Earth seismic interpretation system and Volant Solutions is doing the same for EnerConnect. Both links are being developed with the Epos OpenGeo dev kit. Paradigm also announced a sale of its EarthStudy 360 to seismic data specialist Seitel. Seitel will use the technology to ‘enrich’ its seismic offering in the shale gas/oil arena—www.oilit.com/links/1110_94.
Westheimer Energy Consultants and 3GiG have partnered on an upstream decision support solution leveraging 3GIG’s Prospect Director and Westheimer’s expertise in information and data management, business process and upstream domain expertise. The companies will offer ‘tailored, enterprise-wide business information management systems to support upstream oil & gas management teams.’—www.oilit.com/links/1110_95.
Cadac Organice has delivered its eponymous SharePoint-based solution for engineering document management to Weatherford. Danielle Gardner, document controller, Weatherford, said, ‘The Cadac Organice cloud solution has provided Weatherford with the functionality to track documents, transmit them, use the inherent SharePoint functionality and provides insightful reports for our projects’—www.oilit.com/links/1110_96.
A new Dutch/German consortium ‘Energy Network Optimisation in Europe’ is working on mathematical modeling of Europe’s ‘increasingly complex’ gas networks. According to the proposers, ‘liberalization of the gas network is a complicating factor which can potentially lead to the emergence of inconsistencies.’ These, it appears, can be ‘resolved’ through ‘mathematical insights.’ Traditional paths from producer to consumer are evolving into a gas ‘superhighway’ with a multiplicity of branches. Manager of the €12 million project, TU-Delft’s Kees Vuik said, ‘The mathematical models that we are developing can be used by gas suppliers to tune their gas lines much more precisely in terms of the amount and value of the gas flowing through them’. The research sets out to align the delivery of different types of gas with varying calorific values with the ‘dynamic gas market that has emerged since liberalization.’ Tuning is achieved by opening and closing valves and by switching compressors on and off. Consortium partners include 3TU Delft/AMI and Matheon—www.oilit.com/links/1110_19.
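One of the alignment problems the consortium describes—delivering gas of differing calorific values over a shared network—reduces at each mixing node to a flow-weighted blend. The sketch below is a generic illustration of that calculation, not the consortium’s models, and the figures in it are invented:

```python
def mixed_calorific_value(streams):
    """Flow-weighted calorific value of gas streams meeting at a network node.
    streams: list of (flow_rate, calorific_value) pairs, e.g. (Nm3/h, MJ/Nm3)."""
    total_flow = sum(q for q, _ in streams)
    return sum(q * cv for q, cv in streams) / total_flow

# Hypothetical example: 100 units of rich gas (40 MJ/Nm3) blended with
# 300 units of leaner gas (36 MJ/Nm3) yields a 37 MJ/Nm3 mixture.
blend = mixed_calorific_value([(100.0, 40.0), (300.0, 36.0)])
```

Tuning valves and compressors to keep such blended values within contractual bands, across a network with many branches, is what turns this arithmetic into the large-scale optimization problem the project addresses.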
EU researchers are working to develop ‘reconfigurable chips’ for real-world applications. The ‘facilitating analysis and synthesis technologies for effective reconfiguration’ (Faster) project is a €2.8 million, three year program targeting applications in computational earth science, 3D rendering and image processing, and network intrusion detection. Olivier Pell of project member Maxeler Systems expects that the project will lead to a ‘20% productivity improvement and a 50% total ownership cost reduction for applications such as reverse time migration’—www.oilit.com/links/1110_20.
RAE Systems has announced a new portable ‘biometrics harness’ that provides GPS monitoring and real-time visibility of a worker’s physiological status. The BioHarness is a chest-worn strap that incorporates electrocardiogram, breathing rate, temperature, posture and activity sensors for real-time monitoring. Safety officers can track a worker’s bio-readings and remove them from a fatigue situation if deemed necessary.
RAE Systems VP Thomas Negre said, ‘BioHarness allows site commanders to monitor the physiological readings of multiple workers and responders from anywhere around the world. It also allows them to assess and monitor hazardous chemical compounds and make appropriate decisions for rest, extraction and team insertion in real time.’ The harness connects to a base station through a long-range wireless radio.
RAE Systems also recently announced that RigMinder has chosen its ‘MeshGuard’ wireless detection system to protect oil-rig workers from toxic gases. MeshGuard has been integrated with RigMinder’s electronic drill recorder to provide real-time monitoring of up to 24 wireless gas-detection monitors. RigMinder VP Chris Dorris said, ‘MeshGuard easily integrated our EDR system and provides customers with the critical information they need to safely manage oil-rig operations. The standalone wireless solution saved us time and development costs, allowing us to meet our customer’s needs in a timely way.’ The system is currently deployed on four rigs in Venezuela to monitor for hydrogen sulfide and lower explosive limit gases and vapors. MeshGuard outputs data in standard formats including RS-485, Modbus and XML. More from www.oilit.com/links/1110_21.
Engineering behemoth Siemens has endorsed the latest V6.0 release of VRcontext’s ‘Walkinside’ virtual reality toolset. Walkinside has been architected around a web server and Citrix-based remote 3D visualization. A new ‘Seabed Builder’ function can be used to create VR models of subsea field layouts from ESRI ArcGIS data. A new software development kit allows for integration with enterprise applications.
Bernd Kokkelink, head of R&D and Comos product manager with Siemens said, ‘We have leveraged the new, more flexible architecture to integrate Walkinside with our Comos plant engineering software. It is now possible to implement comprehensive basic and detailed engineering processes in 3D across the whole plant lifecycle.’ Walkinside was used by Total E&P Angola to simulate real-life workflows and to train operators (Oil IT Journal July 2011). More from www.oilit.com/links/1110_22.
Teradata has entered the GIS market with a combo of a Teradata database appliance and GeoServer, a suite of open source geospatial and visualization applications. The solution offers high performance access and in-database analytical processing of location and business data, and is supported by OpenGeo, a specialist provider of open source geospatial solutions. Teradata’s GIS promises speedy access to real-time geospatial data delivered via a flexible architecture. GeoServer works with visualization and GIS tools such as Google Earth, Esri ArcGIS, Pitney Bowes MapInfo, and uDig. Users can access business data through web applications, mobile devices, and desktop GIS clients. GeoServer is the reference implementation of the Open Geospatial Consortium web feature and web mapping service standards. More from www.oilit.com/links/1110_23 (Teradata) and www.oilit.com/links/1110_24 (GeoServer).
Norway’s E&P Information Management association (EPIM), run by a group of Norwegian operators, addresses information flow between member companies and the authorities. Existing EPIM XML-based formats cover license titleholder information, environmental reporting and equipment data exchange. A new service, the EPIM ReportingHub (ERH) has been announced for a variety of drilling and production reports. ERH validates reports against the NPD’s Fact Pages and Posc/Caesar Association’s ISO 15926 reference data library (RDL). The ERH is powered by ‘semantic web’ technology from graph database specialist Franz Inc. and TopQuadrant, a semantic data integration service provider.
TopQuadrant used its ‘TopBraid’ semantic processing platform to integrate with the existing standards. Franz provided its AllegroGraph database. The W3C’s Sparql inferencing notation (SPIN) also ran. For now, the semantics are all under the EPIM hood: operators stay with their existing XML reporting schemas and ERH outputs are XML and PDF. Presumably this will change when the rest of the world upgrades to the semantic web’s RDF protocols. More from www.oilit.com/links/1110_54.
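The core of what ERH does—checking incoming XML reports against a controlled vocabulary such as the ISO 15926 reference data library—can be illustrated in a few lines. The sketch below is entirely hypothetical: the report schema, element names and activity codes are invented, and real validation runs over RDF via Sparql rather than a Python set.

```python
import xml.etree.ElementTree as ET

# Toy stand-in for a reference data library: the activity codes a report may use.
RDL_ACTIVITIES = {"DRILLING", "COMPLETION", "PRODUCTION"}

SAMPLE_REPORT = """<dailyReport well="34/10-A-1">
  <activity code="DRILLING" hours="12"/>
  <activity code="FISHING" hours="4"/>
</dailyReport>"""

def validate(xml_text):
    """Return the activity codes in a report that are absent from the RDL."""
    root = ET.fromstring(xml_text)
    return [a.get("code") for a in root.iter("activity")
            if a.get("code") not in RDL_ACTIVITIES]
```

Running `validate(SAMPLE_REPORT)` flags the unrecognized ‘FISHING’ code, the kind of terminology check a reporting hub performs before accepting a submission.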
At the Society of Exploration Geophysicists’ convention last month, HP was showing a mock-up of Intel’s new ‘Knights Corner’ high performance computing hardware based on its ‘many integrated cores’ (MIC) architecture. Intel has been taking its time to respond to the avalanche of Nvidia GPU-based HPC solutions and the proprietary Cuda parallel processing language.
Knights Corner targets, inter alia, seismic imaging applications. A novel 22-nanometer manufacturing process scales to over 50 cores per chip. Cores are small and low-powered with modest single-thread performance. But aggregate chip level performance is claimed to be much higher. The hardware on show was a very fat PCIE card, several of which can be rack deployed. A ‘Knights Ferry’ dev kit includes extended versions of Intel’s C, C++, and Fortran compilers along with Parallel Studio. An early adopter of the new kit is the Texas Advanced Computing Center whose ‘Stampede’ is a 10 petaflop Linux cluster. Stampede will include ‘hundreds of thousands’ of Intel Xeon and MIC cores and should be running in 2013 when it is expected to be among the most powerful computers in the world. More from www.oilit.com/links/1110_55.
Eni’s oil and gas engineering subsidiary, Saipem, has awarded Accenture a contract to deliver a data management solution to support its global onshore engineering projects. Accenture will supply a customized engineering, project and data management (EPDM) solution for work volume and requirement estimation, on site activity planning and project progress monitoring. EPDM automates engineering data flows and supports cross-departmental data use. The solution embeds Oracle’s Primavera P6 enterprise project portfolio management (EPPM) solution and an Oracle database.
Marco Montesano, head of engineering and construction management information systems with Saipem, said, ‘With complex projects in remote locations, quantity management is one of our most important challenges.’ Accenture’s Massimo Pagella added, ‘Our experience has shown that data integration and management is the key to the successful execution of engineering and construction projects. Data management improves performance in construction and helps optimize assets throughout their lifecycle.’ A prototype phase will go live in the third quarter of 2011 before being rolled out internationally in support of up to 5,000 construction workers. More from
At last month’s ECIM conference, Petrel’s data management capabilities were on show in a new ‘Studio for Petrel 2011’ offering. Studio now supports Google-like full text search across all Petrel objects. Arbitrary text, links or movies can be attached to a location in a project. Petrel Studio now offers a ‘real’ database with support for large data sets. It has been tested with 250,000 wells and terabytes of seismics. A demo merged data from a couple of online map services, with conversions handled on the fly by the new ‘GIS aware’ Petrel. Studio also offers new data selectors—polygons on the map and sliders for restricting search to depth ranges.
Petrel’s basin modeling capability was featured at the San Antonio SEG. ‘Petroleum Systems in Petrel’ leverages Petromod functionality to provide ‘first pass’ basin analysis. Interpreters can investigate vitrinite reflectivity and migration pathways. The toolset supports composite mapping of charge, reservoir, play fairways and traps—as a ‘chance of success.’ Petrel is now officially a ‘fully fledged tool for seismic interpretation—be in no doubt.’ If all this isn’t enough you can roll in plug-ins for fault seal analysis and for viewing pre-stack gathers, for a holistic investigation from ‘play adequacy to prospect adequacy.’ All you need is a renaissance individual conversant in all of the above. More from www.oilit.com/links/1110_57.