Tossing a few editorial ideas around this month, I thought, ‘why not write an editorial about how to write a newsletter?’ In the end I decided not to because a) the audience for such an editorial is vanishingly small and b) I would be giving too much away.
Information food chain
I realized also that what I really wanted to do was make my life easier by trying to ‘fix’ a problem that exists further up the information food chain. This also may be of more interest to the bulk of our readership. I will therefore address the much more interesting topic, in a sense the inverse of ‘how to write a newsletter,’ that is, ‘how to write a press release.’
Before we get into the ‘how to’ part, it is worth considering why you are writing your press release in the first place. A few obvious reasons spring to mind. First there are the ‘reporting’ requirement-related releases. ‘We are about to release financial information,’ ‘Here are our Q4 results for financial year X,’ or ‘We have just released a 32K’ (I made that one up).
These strictly choreographed releases follow a pattern that can be anything from a careful exercise in information retention to an opportunity for some wild puffery as a start-up reports that it has found some cash and avoided going down the tubes. Since Enron showed that the actual numbers companies produce can be completely meaningless, I expect that most editors, like me, skip the numbers and scroll down to the comments in the hope of finding an indiscretion or similar insight. But the pickings are usually rather thin in the regulatory release.
In the run-up to a major tradeshow, companies often issue a press release about a new partnering deal, a new software release or some great achievement or other. While some of these may be attempts to generate news ex nihilo, I have to say that the formulaic press release can be a great help to the hard-pressed editor. The combo of tradeshow presence, press release and interview, even if it is a bit hackneyed, remains the sure-fire route to publication. It’s a shame therefore that it is an activity in serious decline.
While I encourage a degree of inventiveness in a release, sometimes companies seem to be acting in desperation in their attempts to ‘entrap’ the unwitting editor. First on my pet hate list is the undated release. This is popular on some websites. No doubt the reasoning is that the news in an undated release stays permanently fresh. It doesn’t.
A related trick is the ‘rehash’ of an old story. Both techniques work occasionally—although I like to think that Oil IT Journal’s ‘stale news’ radar is pretty effective. But when the trick does work, the effect is not necessarily what was intended. Savvy readers of technical journals are very likely to pick up on the story’s age and spot the originator’s desperation. As for the editor in the middle, ever heard of ‘once bitten twice shy’?
Speaking of readers, writers of releases should be aware of a couple of obstacles between them and the recipients of their message. I like to think of these as impedance mismatches, but then I used to be a geophysicist! The first mismatch is in the perceived importance of the players. If you are LittleServiceCo (LSC) and you have sold a widget to BigOil (BO) then the news is not, ‘LSC sells widget to BO,’ but ‘BO buys widget from LSC!’
Another issue is that of the target audience. Often press releases are written with the main objective of not treading on any C-level toes! Releases like this offer ‘quotes’ from the COO, CFO, CTO, whatever—all saying the same thing that was already said in the preamble! The most amusing example of this is when a quote is attributed to two individuals. Or when the same quote is recycled in different releases—hilarious!
Another own goal is the indulgence of plethoric product references with ™ and © notices attached to biZZarrePRODUCTSM typographic atrocities. Apart from making the release look like spam for V1*gra, it’s clear that the intended audience for this stuff is not the reader. Quite who it is I’m not sure. The lawyers? If there are more than a couple of clearly distinct product names in your PR, then the likelihood is that your marketing is as confused as your readers will be.
I realize now that I have opened a can of worms. This topic is better suited to a PhD thesis than an editorial. I was going to say that I am too old for a PhD. But then I heard about Brian May*...
* Queen guitarist Brian May has just resumed work on his 1971 astronomy PhD at the tender age of 60!
Chevron has signed with UK-based engineering and plant information specialist Aveva for the implementation of its Aveva Net Portal solution at its Nigerian Agbami floating production, storage and offloading (FPSO) facility. Agbami, the world’s largest FPSO, is located 70 miles off the Nigerian coast and will produce 250,000 bopd when it comes onstream in 2008.
Aveva Net Portal (ANP) is a web-enabled solution for the integration and collaborative use of engineering information. 3D models, schematics, documents and application data can be accessed in context without needing the source application that created the data.
ANP leverages the ISO15926 plant information standard and XML technology in a ‘flexible and easy-to-implement’ solution for information management throughout the project lifecycle. ANP is claimed to reduce the time it takes to find information, to enhance data quality and to make information accessible to all stakeholders. Global access to information is said to reduce costs and risks in all phases of large capital engineering projects.
The rationale behind a plant information data model such as that provided by ANP has been the subject of several previous Oil IT Journal articles. Most recently (OITJ April 2007) the ISO 15926 plant data model got backing from the US Fiatech construction industry standards body. ISO 15926 had its origins in work done by the Norwegian POSC/CAESAR organization.
The plant information management system ensures that accurate data is available during the project’s construction phase. But the main benefits from such systems come during and after handover of the facility from the engineering contractor to the owner operator. This has been a pain point historically, with considerable inefficiencies and even data loss.
Information stored in ANP is continually updated on a master hub located in Chevron’s land base in Lekki, Nigeria. Another system onboard the FPSO is kept in synch with a replication mechanism. Data includes piping and instrumentation diagrams (P&ID), a 3D ship/facility model and other engineering documents.
Aveva’s local partner is Nigerian engineering services company Lonadek, which will be responsible for delivery of the Agbami Portal and for ongoing support when Agbami goes onstream in 2008. Design and engineering work is spread between several Nigerian and international contractors and on the FPSO itself. The topside was designed and built by KBR, a member of the Fiatech ADI ISO 15926 project. For more on Aveva’s activity in oil and gas, see our interview with Aveva’s North American operation head, Matthew McKinley (OITJ April 07).
Petroleum software boutiques Australian Petrosys and Norwegian GeoKnowledge are pooling their resources to offer ‘integrated access’ to their analytic and mapping tools. The teaming agreement covers Petrosys’ high-end mapping package and GeoKnowledge’s GeoX stochastic play and prospect modeling toolset.
The idea originated with a prospect and leads system developed for Santos, leveraging ‘interesting workflows’ across Petrosys and GeoX. Marketing the software combo will begin in the Australia/SE Asia region and, if successful, will extend worldwide.
We asked Petrosys CEO Volker Hirsinger if the companies might merge. ‘There are no plans at present but we are looking to integrate sales and support. We both share the problem of supporting a worldwide client base.’
GeoKnowledge CEO Charles Stabell added, ‘Petrosys’ toolset and its ability to integrate diverse GIS and IM strategies make it a logical choice for extending the spatial functionality of GeoX. Improved visualization will provide greater understanding of risk and uncertainty in exploration and exploitation opportunities.’
Following the integration of the authoritative European Petroleum Survey Group (EPSG) into the International Association of Oil & Gas Producers (OGP—OITJ Feb 06), OGP is now responsible for both the EPSG database and the UKOOA positioning formats.
Committee chair Jack Verouden (Shell) said, ‘The EPSG database is now the world’s standard reference source for positional data and it is appropriate that we are now maintaining and enhancing this valuable resource.’ The most recent addition is a guidance note titled ‘Geodetic Awareness,’ which should be essential reading for all non-geodesists working with different coordinate reference systems. The note, datasets and standards are available free from www.ogp.org.uk.
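For readers unfamiliar with how the EPSG database is used in practice, the sketch below shows the basic idea: every coordinate reference system is identified by a numeric code that software resolves to a full CRS definition. The three codes shown are genuine EPSG entries, but the hard-coded lookup is purely illustrative; real software queries the complete EPSG dataset, which also carries datum shift parameters.

```python
# Illustrative subset of the EPSG registry, hard-coded for demonstration.
# The real database holds thousands of entries.
EPSG_SAMPLE = {
    4326: "WGS 84 (geographic 2D)",
    32631: "WGS 84 / UTM zone 31N",
    27700: "OSGB 1936 / British National Grid",
}

def crs_name(code: int) -> str:
    """Resolve an EPSG code to its CRS name, or raise KeyError."""
    try:
        return EPSG_SAMPLE[code]
    except KeyError:
        raise KeyError(f"EPSG:{code} not in this sample subset")

name = crs_name(32631)
```

The point of the numeric code is that two applications exchanging positional data need only agree on ‘EPSG:27700’ rather than on a long list of ellipsoid and projection parameters.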
Government agencies on either side of the Atlantic are offering users of geological and topographical information high tech visualization and databasing solutions leveraging Adobe’s PDF format and viewers. The US Geological Survey is using TerraGo’s GeoPDF Adobe Reader toolbar to put topographical maps online.
TerraGo turns Adobe Reader into a geospatial application with viewing, sharing and update capabilities. Features include multiple coordinate displays, length and area measurement, layers, attribute search and ShapeFile export.
On the other side of the pond, the British Geological Survey (BGS) is using Adobe’s 3D visualization capability in its LithoFrame project, a 3D geological model of UK stratigraphic units, faults and volcanic features. The downloadable model extends to the Moho, a depth of 30 km.
LithoFrame was built in GoCad and leveraged the BGS’ Digital Geology Spatial Modeling project (DGSM). The DGSM infrastructure ‘links databases, standards, applications and procedures’ to enable multi-scale 3D model data to be accessed on demand. Data management strategies handle multiple data types including borehole logs, digital terrain models, geological and seismic cross-sections. Databases are linked using ISO19115-compliant metadata and XML mark-up. Sample datasets include the UK ‘surface to Moho’ model mentioned above and offshore basins west of Scotland.
A new DVD compiled by the Safety Users Group discusses current best practice for safety, exploring some common mistakes and confusion surrounding the IEC 61508 and IEC 61511 international standards for functional safety. The DVD, ‘Safety in the process industries,’ shares the real life experience of safety experts from Emerson Process Management, Invensys Process Systems and others.
Safety Users Group President, Didier Turcinovic said, ‘Increased awareness and understanding throughout the safety arena is essential – not just for specialists, but also managers, engineers and maintenance teams, even the legal department.’
In a top-down approach, the DVD handles technical details of Safety Instrumented Systems (SIS) and the IEC 61508/61511 standard and how safety best practice can support companies in a court of law. The 2 DVD set costs $115 from www.safetyusersgroup.com.
Calgary-based GeoLogic Systems announced Version 7 of its GeoScout product on the 7th of July 2007 (07/07/07). GeoScout provides comprehensive data visualization and analysis tools for ‘all disciplines within the oil and gas industry.’ GeoScout combines information management with presentation mapping and cross-section tools in one package. The software leverages connectivity to GeoLogic and other Canadian data sets.
A new project manager offers central project creation and preview and the ability to work on multiple projects simultaneously. An improved cross-section module allows for loading of an ‘unlimited’ number of raster and/or LAS logs.
Peep-formatted data can now be imported to the PetroCube module for production forecasting. Land contract data in Explorer Software’s CS Explorer can now be queried. More from www.geoscout.com.
GB Petroleum has deployed Interactive Net Mapping’s ‘OilElefant’ web-based geographic information system (GIS) to visualize its E&P operations in Europe and North Africa. The package will form the basis of a database of GBP’s license, well and seismic data, reducing the need for hardcopy and storage. Released last year, OilElefant displays oil and gas data in multiple projection systems on an interactive global map, with zoom and drag functionality. The package can be accessed by mobile users wherever an internet connection is available.
According to INM, OilElefant leverages a third party worldwide interactive base map complete with topography and bathymetry shading. The interactive world map offers fast response and conforms to the OGIS web map services WMS 1.1.1 standard. OilElefant embeds European Petroleum Survey Group (EPSG) geodetic codes and transformations.
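WMS 1.1.1 requests of the kind OilElefant’s base map conforms to are plain HTTP GETs with well-defined query parameters. The sketch below assembles a GetMap request; the server URL and layer name are placeholders for illustration, not INM’s actual endpoints.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=800, height=600):
    """Build a WMS 1.1.1 GetMap request URL. In version 1.1.1 the
    coordinate system is passed as SRS and BBOX is minx,miny,maxx,maxy
    in that system's units."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",                 # mandatory, may be empty
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Placeholder server and layer, not INM's actual service:
url = wms_getmap_url("http://example.com/wms", "basemap",
                     (-10.0, 35.0, 5.0, 60.0))
```

Because the interface is just a parameterized URL, any WMS-compliant client can pull map tiles from any WMS-compliant server, which is what makes the standard useful for embedding third-party base maps.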
Internet Information Services
Windows-based OilElefant was coded atop ColdFusion MX7 and Microsoft’s Internet Information Services (IIS). The underlying database is Microsoft’s Data Engine (MSDE) with an upgrade path to SQL Server for larger datasets. The database schema is based on PPDM Lite 1.0. More from firstname.lastname@example.org.
UK-based Hampton Data Services has released version 4.0 of its GeoScope GIS-based E&P data management system. The new release crawls network drives, building a ‘knowledge base’ of keywords and spatial metadata. The results can be browsed in a GIS viewer or searched by area of interest, data object, keyword or free text. New relationships can be recorded and data tagged, leaving the original data in situ. The crawler can be used to monitor the network, updating GeoScope in real time.
GeoScope leverages OpenSpirit and ODBC connectivity to access and tag data across the corporate data ‘archipelago’ in a collaborative, ‘Web 2.0’ approach to data management. The GeoScope portal allows mashing of shared folders, external database tables, ESRI GIS and other spatial data. According to Hampton Data, ‘Collaborative working becomes intuitive and possible. Knowledge capture is easy because the user is released from the constraint of structured database taxonomies and workflows.’ More from email@example.com.
Shell International E&P has commissioned French petrophysical boutique Techsia to develop and implement its Techlog petrophysical platform. Techlog supports analysis and interpretation of well log and core data. Shell will deploy the tools for both wellsite operations and office-based integrated studies.
Hans de Waal, Shell’s chief petrophysicist said, ‘Shell is a leader in petrophysics. The cooperation between Shell and Techsia endorses the capability that Shell has built over the past 60 years. Techlog will help us to develop and implement further new applications—in fact we are already seeing its impact on project cycle times.’
This development targets a ‘step change’ in performance with improved data viewing and uncertainty management. Shell reservoir staff and Techsia software engineers are to collaborate on software development.
The development will initially target basic petrophysical workflows, the data model, connectivity, user programming capabilities, multi-well layout, performance and parameter management. Phase two will extend Techlog with Shell’s ‘cutting edge’ petrophysical tools.
Long Beach, CA-based Earth Science Associates (ESA) is offering new functionality in its ‘GOM3’ Gulf of Mexico 3-D GIS package. ESA clients who also subscribe to GX Technology’s Gulfspan regional seismic stratigraphic and structural interpretation framework can now view 2-D depth-converted seismic data from within GOM3.
GOM3 already integrates with data supplied by the US Minerals Management Service (MMS) on wells, completions, reservoirs and paleontology for producing fields. Last year ESA added PaleoData Inc.’s dataset to its 1,000 3-D GIS field models.
Input-Output unit GXT’s Gulfspan dataset is a basin-wide framework comprising 76 long composite seismic depth-sections along with a consistent interpretation framework of faults, salt and regional stratigraphic boundaries. GOM3 wells, reservoirs and completions can be added to the seismic data for further analysis. Later this year, ESA plans to make tops, faults and salt ‘smart’ GIS features and to add a SEG-Y data import module.
NeuraMap 2007 introduces a ‘wizard’-driven interface, improved editing and help with map calibration. Volumetric and reserve reporting have also been added to the basic mapping capability.
Phillips Petroleum’s Tom Gamwell said, ‘NeuraMap 2007 is a milestone in map digitizing, increasing speed and accuracy. Its many features have made it a workflow shoo-in for Phillips Petroleum.’
NeuraMap is a data capture, transformation, and QC system that turns paper maps, cross sections, x-y plots and other data into a digital resource. Legacy seismic navigation data in SEG-P1 or UKOOA formats can be read by Neuralog and overlaid on the original image for QC and editing. The system embeds Blue Marble’s projection management with support for 12,000 mapping systems plus user-defined parameters. Output formats include GeoTIFF and vector formats for third party GIS systems.
Maps can be scanned with the NeuraScanner directly into NeuraMap. Multiple sources can be opened simultaneously with vectors displayed in all source documents. Vector and image datasources can be merged for quality checks.
Schlumberger has signed an OEM agreement with Isilon Systems. The companies are to optimize Isilon’s IQ clustered storage solution with Schlumberger’s E&P software and data management portfolio. Isilon has also just announced its IQ and EX 9000 series solutions providing ‘single file system scalability to 1.6 petabytes.’
Matrikon has released a new version of its OPC Tunneller for OPC networks. V3.0 of the Tunneller includes data encryption and lossless compression for satellite data transmission. The Tunneller enables secure, firewall-friendly access to control system data from a corporate network.
COADE has released CADWorx 2008 for producing AutoCAD 3D models from laser ‘as-built’ surveying. The new release links design information to live process and instrumentation diagrams (P&IDs). The solution leverages Leica Geosystems’ FieldPipe acquisition technology.
MetaCarta has announced geographic data modules (GDMs) that ‘identify, disambiguate, and resolve’ Arabic and Spanish geographic references. Documents can be made ‘location-aware,’ and geographically categorized, visualized and mined. The new GDMs enable MetaCarta GeoTagger to automatically identify the language and character set of each document and assign latitude and longitude coordinates and country code tags to each place name in the document.
W_Geosoft has released WinScanSeis, a new seismic paper section digitizer. The package turns Windows Bitmap files into SEG-Y or Seismic Unix formatted digital data for processing with W_Geosoft’s VisualSUNT package.
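For a sense of what a seismic digitizer must ultimately emit, here is a minimal sketch of the SEG-Y (rev 1) file layout: a 3200-byte textual header, a 400-byte binary header and fixed-length trace records. This is an illustration of the format only, not W_Geosoft’s implementation, and it omits the many header fields real processing systems expect.

```python
import io
import struct

def write_segy(f, traces, sample_interval_us=4000):
    """Write a minimal big-endian SEG-Y (rev 1) stream: a 3200-byte
    textual header (left blank here), a 400-byte binary header, then a
    240-byte header plus 4-byte IEEE float samples per trace."""
    ns = len(traces[0])
    f.write(b" " * 3200)                                    # textual header placeholder
    binhdr = bytearray(400)
    struct.pack_into(">h", binhdr, 16, sample_interval_us)  # file bytes 3217-3218
    struct.pack_into(">h", binhdr, 20, ns)                  # file bytes 3221-3222
    struct.pack_into(">h", binhdr, 24, 5)                   # format 5 = IEEE float
    f.write(bytes(binhdr))
    for seq, samples in enumerate(traces, 1):
        trchdr = bytearray(240)
        struct.pack_into(">i", trchdr, 0, seq)              # trace sequence number
        struct.pack_into(">h", trchdr, 114, ns)             # samples in this trace
        f.write(bytes(trchdr))
        f.write(struct.pack(f">{ns}f", *samples))

buf = io.BytesIO()
write_segy(buf, [[0.0] * 500, [0.0] * 500])
size = len(buf.getvalue())  # 3600 + 2 * (240 + 500 * 4) = 8080
```

The fixed byte offsets are the reason legacy SEG-Y remains machine-readable decades on, and also why digitizing a paper section back into the format is tractable at all.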
Interactive Network Technologies (INT) has released J/GeoToolkit 3.0, a portable E&P data visualization and analysis toolkit.
Systems integrator World Wide Technology is providing Cisco Systems-based products and services to help build out Pacific Gas & Electric’s secure wireless infrastructure, consolidating its field offices into ‘resource management centers.’
Petroleum Geo-Services (PGS) has deployed a novel four-component fiber optic seismic monitoring system in the deepwater Gulf of Mexico for Chevron.
TerraWave has developed Wi-Fi enclosures for hazardous environments. The enclosures house Cisco 1200 and 1500 series access points and other wireless products and are awaiting ATEX explosive atmosphere certification. Target deployment includes oil rigs, refineries and chemical plants.
Triple Point Technology has released the Commodity XL Management Dashboard, an add-on to its Commodity XL flagship. The Management Dashboard is a business intelligence solution that aggregates data and provides analytical tools and a real-time graphical depiction of the business. It ‘drives intelligent decision-making by supplying management with key performance indicators and real-time business insight.’
Apache Corp. has signed a software and services deal with UK-headquartered Ikon Science for the provision of its interpretation and modeling software. The global agreement covers Ikon’s RokDoc flagship and the new ‘Modeling While Picking’ link to Schlumberger’s Petrel. Apache is to integrate RokDoc into its standard reservoir workflow.
Apache’s senior geology advisor Kelly Haizlip said, ‘We believe that Modeling While Picking will enable us to bridge a gap in our subsurface workflow by reducing the risk of a one way guessing game between geophysics and geology.’
Modeling While Picking (MWP) leverages Schlumberger’s Ocean framework to provide ‘interpretational insight’ by incorporating rock physics methods into the standard interpretation workflow. Events picked in Petrel update the 2D RokDoc model in real time, and the impact of changes in the Petrel model is immediately visible in RokDoc. Global deployment follows a pilot implementation in Apache’s Aberdeen office. RokDoc is used in 50-plus oil and service companies. Ikon also provides quantitative interpretation services and training to over 100 clients throughout the world. More from firstname.lastname@example.org.
As the SEG deliberates on a future SEG-D Rev 3, and Energistics/POSC moots a ‘SeismicML,’ out of left field comes another geophysical standards initiative in the form of the EU-sponsored Geomind consortium. Geomind, unveiled at the EAGE in London last month, started with the observation that ‘no extensive metadata standard exists for geophysical data.’
The consortium of EU institutes and companies is to address this issue with an extension profile to the ISO19115 metadata standard. A data portal will leverage the new standard for data discovery and exchange. The open standard will be supported by free applications to help future users become data providers.
Geomind project manager László Sőrés of the Hungarian Eötvös Institute told Oil IT Journal, ‘We are analyzing existing data standards to see which to recommend. All SEG formats will be accepted and recommended but at present we have no connections to the SEG standards committee.’ More from www.geomind.eu.
The Amsterdam and Houston PNEC meetings totaled over fifty papers and panel discussions so the following is an eclectic report of some highlights. Sushma Bhan’s presentation covered Shell’s holistic approach to R&D knowledge management (KM). Collecting and serving information from Shell’s ‘walking encyclopedias’ is not always easy. But the effort is worth the trouble as R&D holds much of the company’s knowledge. The holistic approach to KM involves R&D portals, technical publishing, Livelink, the E&P catalog and Shell’s Sitescape community of practice (COP). The COP is managed by subject matter experts and global coordinators. Shell holds monthly seminars where researchers present the fruits of their studies. ‘Legacy knowledge’ is captured from ex-chief geologists co-opted to the R&D team. Technology support is provided by Autonomy, MetaCarta, the Flare Catalog and Invention Machine’s GoldFire semantic search.
Alessandro Allodi noted that Shell’s PDO unit produces more information in a day than a person can read in a lifetime. Shell is searching ... for the perfect search engine! The (somewhat intractable) problem involves a balance between recall (ensuring all documents are retrieved) and precision (the relevancy of what is returned). Allodi compared keyword-based and Boolean querying and ranking methods, where precision remains a problem. Tagging and metadata are an excellent though expensive approach. GIS is great, but Google’s ranking techniques may be less appropriate in the corporate environment. On the horizon (two to three years out) we can expect improvement as technology embeds heuristics and information theory. The heuristic approach is already deployed in MetaCarta’s GIS search engine.
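The recall/precision trade-off Allodi describes can be made concrete in a few lines; the document identifiers below are invented for illustration.

```python
def precision_recall(retrieved, relevant):
    """Precision = fraction of retrieved documents that are relevant;
    recall = fraction of relevant documents that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# A query returning four documents, two of which are among the
# eight actually relevant ones:
p, r = precision_recall({"d1", "d2", "d3", "d4"},
                        {"d1", "d2", "d5", "d6", "d7", "d8", "d9", "d10"})
# p = 0.5, r = 0.25: tolerable precision, poor recall
```

The tension is that broadening a query to raise recall usually drags in irrelevant documents and lowers precision, which is why no single tuning satisfies every corporate user.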
Irina Tucker described Chesapeake’s migration of its geoscientists’ role assignments and operational hierarchy from a plethora of Excel spreadsheets to a GIS-based information system. Chesapeake’s ‘Team Table’ now offers map-based management of teams, audit trails and historical data that shows how the company was organized in the past, offering interesting possibilities for correlating organizational strategy with the bottom line. Team Table applications include Hyperion System 9 master data management and ESRI GIS.
Dede Schwartz outlined ConocoPhillips’ Alaska unit’s integration of its geoscience, drilling and production processes into a ‘neutral’ Oracle database. The Alaska Technical Database (ATDB) holds 7,900 wells and is growing at around 100 wells per year. The vendor-neutral ATDB is a single point of access and a control point for well identifiers. Schwartz described the unique identifier as a huge benefit in bringing different systems together. The ATDB acts as a hub for data transfer between applications—allowing apps, which are ‘not eternal,’ to retire or upgrade gracefully. The ATDB also houses value added data, which otherwise is often isolated and hard to integrate. The ATDB was initially loosely based on Finder. Data extraction is performed by ‘the duct tape guy’ whose Perl scripts ‘can extract data from anything.’ According to Schwartz, the secret to good data management is, ‘Just say no to Excel!’
John Pomeroy described Petris’ work on streamlining Anadarko well data delivery. The initial Recall deployment in 2003 was a success, but a large ongoing drilling program swamped the operator, compromising data quality. Data was being delivered piecemeal to multiple stakeholders and it was hard to ensure that digital data sets were clearly identified. Direct delivery to asset teams ‘escaped’ corporate data management. Petris’ solution rationalized and automated data delivery to a unified central source. Service companies log onto Anadarko’s website for the correct well identifier before upload. Anadarko’s 3 TB database now holds data from 300,000 wells. The Recall Autoloader was used to merge huge data sets from acquisitions. The solution is deemed SOX-compliant as acquisition artifacts are captured. ‘Back door’ data management has been eliminated and consistent naming implemented.
Qatari LNG exporter RasGas has developed a production surveillance database as described by Brian Richardson. The project started badly with a vendor that failed to provide the goods, at which point RasGas opted for an in-house development with help from The Information Store (iStore). iStore leveraged the OSIsoft PI API to turn the raw historian data into something that was ‘understandable for users’. PI data now streams into Oracle tables with overnight UOM conversions etc. The system has replaced multiple Excel spreadsheets, introduced standard units and achieved a complex aggregation of tag information. RasGas is now planning ‘date-effective’ tag management for historical analysis.
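The overnight unit-of-measure normalization described above can be sketched as a simple mapping from source units to SI before database load. The conversion factors below are standard, but the tag names and the rule set itself are invented for illustration, not iStore’s implementation.

```python
# Illustrative unit-of-measure normalization of historian tag values.
TO_SI = {
    "psi": lambda v: v * 6.894757,       # pressure -> kPa
    "degF": lambda v: (v - 32) * 5 / 9,  # temperature -> degC
    "bbl": lambda v: v * 0.1589873,      # volume -> m3
}

def normalize(readings):
    """Convert (tag, value, unit) tuples to SI units before loading
    into the relational store."""
    return [(tag, TO_SI[unit](value)) for tag, value, unit in readings]

# Hypothetical tag names, chosen for the example:
rows = normalize([("FT-101", 100.0, "bbl"), ("TT-202", 212.0, "degF")])
```

Centralizing the conversion in one nightly job, rather than leaving each spreadsheet to do its own arithmetic, is exactly the kind of standardization the RasGas project achieved.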
The IM elephant
Duncan Stanners described Shell Canada’s first attempt to tame the IM ‘elephant’ as a complete failure. The early Livelink EDMS implementation led to multiple repositories and copies of data. The feeling was that users couldn’t find and didn’t know what was there. Shell’s information specialists recognized the problem but lacked a telling argument to sell an IM improvement project to management. With help from Flare consultant Alan Bayes, Stanners developed a ‘risk assessment matrix’ plotting the likelihood of an incident against the severity of the consequences. Rather than focusing on cost savings Shell looked at real events. In one oil sands project Shell failed to put the correct protocols in place governing its relationship with the engineering contractor, resulting in two truckloads of paper being delivered and ‘costing millions to sort out!’ The risk matrix persuaded management to devote $4 million to solve the problem. A Shell analysis supported the IM framework, putting people first, then process, then tools. Shell also noted that one size does not fit all. IM support for a deepwater well is different from that required in a Peace River CBM development involving 200 identical wells. Flare’s E&P Catalog was deployed to manage access to Livelink. The key learning? Standard reference data is king!
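The risk assessment matrix lends itself to a compact sketch. The 1–5 scales and banding thresholds below are illustrative conventions, not Shell’s actual scoring.

```python
# A minimal sketch of the likelihood-vs-severity matrix described above.
def risk_rating(likelihood: int, severity: int) -> str:
    """Score an incident scenario on a 5x5 matrix and band the result."""
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("both scores must be 1-5")
    score = likelihood * severity
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Two truckloads of unindexed contractor paper: unlikely but severe.
rating = risk_rating(likelihood=2, severity=5)
```

The appeal of the matrix as a management sell is that it converts a vague IM complaint into a ranked list of concrete exposures.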
Pioneer, according to Carol Tessier, is betting on web services for data delivery. Users want to pull more and more disparate data together. G&G want to see engineering, financial, land etc. Expectations are growing, driven by Web 2.0 consumer mash-ups and RSS feeds. With help from IHS and Schlumberger, Pioneer is moving to data services, applications and portal-based data search. Web services mean that ‘the IT standards wars are over.’ The POSC/Energistics ‘meta catalog’ is the key to integration. This approach offers ‘more flexibility than a data warehouse.’ Pioneer’s One Map application (developed by Schlumberger) plugs into IHS Enerdeq and PI data as well as collecting data from Landmark’s TOW and Pioneer’s financial system. Autonomy search is also provided as a web service. Tessier suggested that broader industry involvement in this activity would be beneficial, with an independent third party to guide the industry: in other words, an ‘OASIS for oil and gas.’
David Archer described Petris’ work for Saudi Aramco on information management for large, long-lived oil and gas assets. This leverages Petris’ Dynamic Common Model in an ‘Authoritative Data Store’ of application-independent business objects. The platform is accessible through web services which assure lossless data movement. Guy Holmes (SpectrumData) warned of the technology gap between the new Fiber Channel-based storage media and legacy SCSI. This is shaping up to be ‘a right royal mess.’ The issue of what data to keep, what to refresh and remaster, exercised speakers at the Amsterdam panel session. For some the cost of deciding what to keep was greater than the cost of just copying the lot! One major’s retention policy was signed-off by the CEO. Unused interpretation data is destroyed according to the retention policy.
This report is a short version of a Technology Watch report from The Data Room—more from email@example.com.
Tina Warner presented Dominion’s well data quality initiative. Dominion has a PIDM 2.5 well master database which feeds data to three regional master OpenWorks databases. These in turn are used to create interpretation projects. All transfers are controlled by InnerLogix (ILX) data quality management (DQM) rules. Similar rules govern back population of added value data from regional masters to PIDM. Business units determine area-specific naming standards which are ‘hardwired’ into ILX Synch jobs and DQM processes.
The PIDM datastore is updated nightly by IHS; other vendors’ data is loaded by Dominion’s IT group. A PIDM composite builder runs nightly to promote the highest quality well data. Project data loading is achieved with similar ILX rules controlling data movement and overwrite protection. Data managers are notified of exceptions by email.
Careful evaluation is required of ‘friction points’ such as missing values from priority or new wells from vendors, risks of overwriting or losing value added data, synchronizing deleted data across projects and master stores, and the usual issues of inconsistent naming and data quality such as coordinate reference systems and well elevations.
Dominion worked with ILX on procedures for merging multiple event wells and child objects into a single wellbore for working projects with approved log curves and horizon picks. These involve very detailed standards for master OpenWorks project data. The ILX DQM approach is now extending to other business units.
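A rules-based quality check of the kind the DQM jobs apply can be sketched as a list of named predicates run over each well record. The field names and thresholds here are invented for illustration, not ILX’s rule syntax.

```python
# A toy version of rule-based well data QC in the spirit of the
# DQM jobs described above.
RULES = [
    ("missing UWI", lambda w: bool(w.get("uwi"))),
    ("missing CRS", lambda w: bool(w.get("epsg_code"))),
    ("elevation out of range", lambda w: -500 <= w.get("kb_elev_m", 0) <= 5000),
]

def qc_well(well: dict) -> list:
    """Return the names of all rules a well record violates."""
    return [name for name, ok in RULES if not ok(well)]

bad = qc_well({"uwi": "100/01-01-001-01W5/0", "kb_elev_m": 9999})
# violates "missing CRS" and "elevation out of range"
```

Running the same rule set at every transfer point, rather than at ad hoc load time, is what keeps master and project stores from drifting apart.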
Jim Day described how Newfield is using ILX to assure quality during data movement between OpenWorks and, again, a PIDM master dataset of well and directional survey data. Newfield has implemented scripts for well naming conventions and other data standards.
ILX CEO Dag Heggelund’s presentation on geodesy, subtitled ‘geography without geodesy is a felony,’ traveled familiar ground for Oil IT Journal readers (see also the EPSG Guidance Note announcement on page 2). Heggelund’s presentation on the future development of ILX’s products outlined how the ILX Back Office tools are being extended to monitor data movement throughout the workflow. ILX is introducing new measurement categories for audit, data change and ‘relevancy rules.’ ‘Near real time’ data availability will be facilitated by ‘data event detection’ in QCMonitor, eliminating manual data loading to corporate datastores. This means that, for instance, a marker pick in Petra is broadcast to SMT and a PPDM corporate datastore in near real time, with on-the-fly quality checks assured by QCPro.
ILX is working on the development of ‘MetaLogix,’ a.k.a. ‘a roadmap to trustworthy data translations.’ This will extract and collect metadata, perform data translations and capture translation ‘context.’ MetaLogix will add search, traceability and comments. ‘Relevancy rules’ will allow AOI boundaries to be placed on data transfer to reduce bandwidth. ILX is working towards a ‘virtual data environment,’ making a heterogeneous data environment appear homogeneous. This will be achieved by capturing data context and model translations, and by embedding knowledge of corporate processes. ILX user group presentations can be downloaded from www.innerlogix.com. More from firstname.lastname@example.org.
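A ‘relevancy rule’ of the kind described amounts to filtering transferred objects against an area-of-interest boundary. The sketch below uses a simple bounding box; the names and coordinates are illustrative and are not ILX’s API.

```python
# Hypothetical AOI expressed as projected (easting/northing) bounds.
AOI = {"min_x": 500_000, "max_x": 600_000,
       "min_y": 3_200_000, "max_y": 3_300_000}

def relevant(obj, aoi=AOI):
    """True if the object's location falls inside the AOI bounding box."""
    return (aoi["min_x"] <= obj["x"] <= aoi["max_x"]
            and aoi["min_y"] <= obj["y"] <= aoi["max_y"])

objects = [
    {"id": "W-001", "x": 550_123, "y": 3_250_456},  # inside AOI
    {"id": "W-002", "x": 700_000, "y": 3_250_000},  # outside AOI
]
# Only relevant objects are queued for transfer, saving bandwidth.
to_transfer = [o for o in objects if relevant(o)]
print([o["id"] for o in to_transfer])  # ['W-001']
```

Real implementations would test against arbitrary polygons rather than a rectangle, but the bandwidth-saving principle is the same.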
ILX is expanding its compelling data quality offering to a veritable data infrastructure. But instead of interoperability at its center—now seemingly a given—it has quality. Is that smart or what?
Pål Helsing is now Executive VP of Aker Kvaerner’s field development unit.
GE Oil & Gas has signed a pipeline inspection alliance with AGR Pipetech of Norway.
Paradigm has appointed John Allen as executive VP and general counsel, Clay Miller as VP and CIO and Andrew Stein as chief marketing officer. Allen was previously with Halliburton, Miller with AMD and Stein with Geomagic.
Divestco has acquired Spectrum Seismic Processing and iLand Data for C$2.45 million.
Allen Nfonsam has joined Techsia’s US support team.
Advanced Resources has promoted James Caballero to VP.
After one year of existence, Madagascar, the open source seismic processing package, has seen over 1,200 copies downloaded.
SpectrumData has appointed Tony Duffy operations manager in New Zealand.
Chevron’s Oilfield Ontology Repository project (OITJ June 07) now has a website—o4oil.org.
Pinnacle Technologies has acquired reservoir monitoring boutique Applied Geomechanics.
A new website, www.oilfieldpatents.com is a shop window for Baker Donelson’s IP group.
UK Common Data Access has issued RFIs for seismic and scanned image repositories. CDA has also hired Raye McRitchie to its data team.
Chevron’s Energy Technology unit has awarded a $5 million grant to UT’s Center for Petroleum and Geosystems Engineering at Austin for EOR research.
Deloitte Petroleum Services has hired Craig Foot to its PetroView as GIS/ IT specialist and Markus Fischer who joins as a PetroScope developer.
Schlumberger VP marketing Mike Benjamin has been elected chairman of the Energistics board. Benjamin succeeds Halliburton’s Jonathan Lewis.
Geosoft has appointed Chris Berry and Nick Burchell to its Professional Services group.
ExxonMobil has joined the Pipeline Open Data Standards organization, PODS.
Sharon Bickford has joined FIATECH as project manager.
Schlumberger has acquired Italian electromagnetics specialist Geosystem.
Didier Houssin has been appointed Director at OECD’s International Energy Agency.
Marc Nathan is now IT development director at the Houston Technology Center.
The International Accounting Standards Committee is seeking candidates for the new XBRL Advisory Council and Quality Review team.
Jamey Webster is the new CEO of Knowledge Systems replacing Jim Bridges who is now chairman. Webster was previously COO.
‘Cost and complexity’ has caused PGS to delist from the New York Stock Exchange (NYSE).
French EPC Technip has likewise opted to delist from the NYSE. Technip’s accounts conform to the International Financial Reporting Standards and the company ‘no longer considers it necessary to publish its accounts under two accounting standards.’
John Kelly is now senior VP of IBM Research, replacing Paul Horn who is retiring. Horn was, inter alia, responsible for development of the Deep Blue and Blue Gene supercomputers.
The French Petroleum Institute (IFP) has signed an R&D and training agreement with Saudi Arabia’s King Abdullah University of Science and Technology.
Randall Luthi is the new director of the US Minerals Management Service. Luthi comes from the Fish and Wildlife Service.
Baker Hughes’ Inteq division is to commercialise Grant Prideco’s Intelliserv wired drill pipe telemetry system.
Michael Lamach has been appointed to Iron Mountain’s board of directors.
The ISA standards body is to kick-off the ISA Security Compliance Institute for testing and certification of industrial control systems.
Isilon has named Paul Rutherford VP engineering and Steve Fitz senior VP worldwide sales. Fitz hails from EMC Corp. Rutherford’s appointment is an internal promotion.
James Calaway has been appointed to the OpenSpirit board.
Two Norwegian service companies are to merge, creating a ‘next generation geophysical force.’ TGS-NOPEC and Wavefield Inseis will become ‘TGS Wavefield.’ After the paper transaction, Wavefield shareholders will hold 38% of the new company.
Expected synergies are to come as a portion of Wavefield’s fleet is deployed on TGS’ multi-client projects and as TGS captures additional data processing revenue from Wavefield’s acquisition business.
TGS CEO Hank Hamilton said, ‘We are entering a new business cycle characterized by rapid adoption of new technologies, such as wide azimuth seismic and fiber-optic based acquisition. The multi-client model continues to work in some markets. Others are better served by contract services. The transaction will increase our multi-client activity while we remain in the market for proprietary surveys.’
TGS Wavefield assets will include six 3D and seven 2D vessels, a high capacity processing center and new technologies for 4C/4D reservoir monitoring, electromagnetic and seismic data processing.
Toronto-based Geosoft has announced its intention to acquire the software and intellectual property assets of Northwest Geophysical Associates (NGA), based in Corvallis, Oregon. NGA’s assets include the GM-SYS range of gravity and magnetic modeling software.
Oil and gas
Geosoft and NGA have partnered since 1989 to provide the oil and gas industry with visualization and interpretation tools for gravity and magnetic data and its integration with seismics.
Geosoft CEO Tim Dobush said, ‘The proposed acquisition is a natural progression of Geosoft’s partnership with NGA. It provides an opportunity to advance our integrated solution, streamlining software development and our interactions with customers in the Oil and Gas and Geoscience markets.’ NGA’s software will be incorporated into Geosoft’s line of desktop mapping and processing software. NGA developers will become part of Geosoft’s R&D group.
Master data management specialist DataFlux has announced a major win with an unnamed global oil and gas company. The client has exploration, refining and petrochemical operations in some 150 countries around the world. DataFlux was deployed to cleanse and standardize product data prior to loading it into a new data warehouse.
Mergers and acquisitions have increased the number and complexity of business applications and information systems within the client’s business. To improve information reliability and accessibility, the company began to implement global data warehouses, huge information repositories of standardized and validated versions of critical business information.
According to DataFlux, product data is ‘notoriously difficult to standardize.’ Even within the same business, companies have different ways of describing products and brand names. The client’s worldwide presence made the problem harder still, adding language differences and other cultural variables to the already daunting task of building a global product data warehouse. Using DataFlux’ master data management technology, the client standardized product information in a global data warehouse, freeing up business analysts and providing a better assessment of product needs and availability.
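At its simplest, the standardization task described above is a matter of normalizing variant product descriptions to a canonical name before warehouse load. DataFlux’s actual matching technology is far more sophisticated (and proprietary); the product names and mappings below are invented for illustration.

```python
# Hypothetical canonical-name lookup, keyed on a normalized description.
CANONICAL = {
    "ulsd": "Ultra Low Sulfur Diesel",
    "ultra-low sulphur diesel": "Ultra Low Sulfur Diesel",  # UK spelling
    "avgas 100ll": "Aviation Gasoline 100LL",
}

def standardize(description):
    """Map a variant product description to its canonical form."""
    # Normalize case and collapse runs of whitespace before lookup.
    key = " ".join(description.lower().split())
    # Unknown descriptions pass through unchanged for manual review.
    return CANONICAL.get(key, description)

print(standardize("ULSD"))                       # Ultra Low Sulfur Diesel
print(standardize("Ultra-Low  Sulphur  Diesel")) # Ultra Low Sulfur Diesel
```

Real-world matching also has to cope with misspellings and word-order variants, which is where fuzzy-matching technology earns its keep.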
Announced last December (Oil ITJ Dec 06), the latest version of DataFlux includes ‘Accelerators,’ pre-built workflows, a new data quality integration platform and new metadata discovery and data monitoring capabilities. The package includes customizable reporting on data that does not meet corporate quality and integrity standards.
Houston-based life of field services and operating E&P company Helix Energy Solutions has licensed P2 Energy Solutions’ (P2ES) Enterprise Upstream (EU) and Land solutions. Following acquisition-fueled growth, Helix needed a single ERP solution to replace its disparate legacy systems and, following a competitive bidding process, chose P2ES’s Enterprise Upstream suite.
Helix VP and CIO Troy Matherne said, ‘We were impressed by EU’s robust functionality and the domain knowledge of P2ES staff. By standardizing on a single ERP platform, we can meet compliance regulations, streamline workflows and improve data and decision-making across geographic and business unit boundaries. EU’s modularity offered a flexible implementation process, accommodating the different cultures across our businesses and preserving our competitive advantage.’
P2ES claims its Enterprise Land solution, based on the Microsoft .NET platform, is the ‘first service-oriented architecture-enabled land management solution.’ The lease, property and agreement management system offers GIS integration, content management and oil and gas accounting.
P2ES president Tarig Anani added, ‘What sets us apart is our focus on the commercial and operational needs of the energy industry. One size does not fit all! We have the product range, knowledgeable staff and resources to interpret, guide and partner with companies as they embark upon large-scale business transformation projects.’
A joint venture between Pure Energy Services and Inversion Technologies has addressed a complex business problem involving reservoir monitoring, compliance reporting and e-commerce. The operator, a major Canadian coal bed methane producer, was required by the Albertan regulator to run a downhole pressure survey, simultaneously recording five zones, in one well per section of land – amounting to over 300 surveys a year. Three different companies were involved in the operation, each supplying data and invoices to the operator, which was left with ‘a data management nightmare’ and a big reporting headache carrying a huge management overhead.
Pure Energy and Inversion Technologies devised a hardware and software solution that not only solved the data management issue but also simplified operations, consolidating them into a single-sourced solution. An interface to the pressure recorder hardware was built for setup, management of field operations and download of acquired data. Following gauge calibration and processing, the system submits the results to the regulator and invoices the operator.
The application was programmed in ‘Dart,’ Inversion Technologies’ development environment, in two months. The self-contained, automated system requires little management overhead for the operator. To date, close to 100 wells have been monitored. Dart organizes any digital asset into an accessible information resource. This is the foundation for processing and visualization functionality. Oil and gas applications to date include job management, log analysis, and data visualization. Data loaders are available for industry formats including LAS, DLIS, OpenWorks, OpenSpirit and others. More from email@example.com.
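For readers unfamiliar with the formats mentioned, a ‘data loader’ for LAS is essentially a parser of the CWLS LAS log file sections. The minimal sketch below reads curve names and data rows from a LAS 2.0 string; real loaders (and libraries such as lasio) also handle wrapped lines, null values and units, which this sketch ignores.

```python
def read_las_curves(text):
    """Return (curve_names, rows) from an unwrapped LAS 2.0 string."""
    names, rows, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        if line.startswith("~"):              # section flag, e.g. ~Curve
            section = line[1].upper()
            continue
        if section == "C":                    # ~Curve: 'MNEM.UNIT : descr'
            names.append(line.split(".")[0].strip())
        elif section == "A":                  # ~ASCII data rows
            rows.append([float(v) for v in line.split()])
    return names, rows

sample = """~Curve
DEPT.M   : depth
GR  .API : gamma ray
~ASCII
1500.0  75.2
1500.5  80.1
"""
names, rows = read_las_curves(sample)
print(names, rows[0])  # ['DEPT', 'GR'] [1500.0, 75.2]
```

Binary formats such as DLIS are a different order of difficulty, which is one reason commercial loaders are a business in themselves.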
Kongsberg Maritime has been awarded a contract for the modeling and simulation of all of Kuwait Oil Company’s (KOC) pipeline network. Kongsberg’s Process Simulation Unit was engaged by Det Norske Veritas (DNV) Kuwait as a component of an ongoing risk assessment of KOC’s infrastructure.
Kongsberg will assess pipeline behavior using its D-SPICE simulator. Modeling will include oil, gas, condensate and multiphase pipelines and cover design review, optimization and transient analysis. The contract includes several D-SPICE licenses for use by KOC engineers.
KOC’s pipelines range from 3 to 52 inches in diameter, with an overall length exceeding 3,200 km, linking gathering centers to tank farms, refineries and export facilities. Risk assessment will be performed by DNV’s Kuwait-based HSE consultants.
GE’s PII Pipeline Solutions recently released ‘ThreatScan,’ an acoustic pipeline monitoring system that mitigates the risk of ‘third-party’ infrastructure damage, the leading cause of failure in gas and liquid pipelines. ThreatScan detects damage caused by construction, farming and other excavation, assuring transmission line integrity and security of supply.
Using ATEX (explosive atmospheres) certified sensors, solar power and satellite communications, ThreatScan can be retrofitted to existing above-ground pipelines without excavation and can be deployed in populated areas. ThreatScan’s intelligent signal processing filters out ambient noise, identifying even minor pipeline strikes and avoiding false alarms.
GE Oil & Gas president Claudi Santiago said, ‘Operators now can know immediately where and when a pipeline has been damaged and can investigate what happened, rapidly repair any immediate failures or damage that may cause future failure and take action to ensure it will not happen again.’
In addition to immediately notifying operators of specific impacts in a given pipeline, ThreatScan provides monthly summaries of dates, times and locations of impacts, further enhancing the operators’ risk analysis and ability to establish long-term damage mitigation plans.
Polish grid computing specialist GridwiseTech has sold a scalable, software-based data management system to seismic processing contractor, Geofizyka Krakow, part of PGNiG, Poland’s largest E&P company.
GridwiseTech (GWT) is a vendor-independent grid computing and virtualization specialist with an offering that spans data processing, application debottlenecking and the provision of turnkey, scalable computing systems.
GWT president Pawel Plaszczak was an early grid computing pioneer, working on the Argonne National Lab’s Globus project. Plaszczak also co-authored the book, ‘Grid Computing: The Savvy Manager’s Guide*.’ At the International Supercomputing Conference in Dresden last month, GWT demoed engineering and financial simulations run from a mobile device, leveraging remote grid computing resources.
* ISBN 0127425039, Elsevier.
Motiva, a refining and retail business jointly owned by Shell and Saudi Aramco has completed a major digital automation project at its Norco, Louisiana refinery. Motiva Norco, one of the largest refineries in the world, processes 225,000 bopd. Emerson Process Management was the prime automation contractor on the project which leveraged Emerson’s PlantWeb digital architecture. Project scope encompassed seven process units and consolidated five separate control rooms into a centralized, integrated control room.
Norco operations manager Jeff Funkhouser described the situation before the upgrade, ‘Modernization of instrumentation and controls had not kept pace with other improvements to key units at the refinery. Poor reliability and frequent shutdowns were undermining the refinery’s efforts to serve the market effectively. Our instrumentation was not up to the challenges of the 21st century.’ Emerson’s PlantWeb solution leverages Foundation Fieldbus communications to network Emerson’s DeltaV digital systems, Rosemount instruments, Micro Motion flowmeters and Fisher valves. Diagnostic information is collected by AMS Suite’s Intelligent Device Manager which delivers alarms and data to operations and maintenance personnel.
The re-instrumentation project has already yielded ‘solid results’ for the facility, including its best reliability utilization year on record, and increased output with no new units or major investment.
Industrial Defender (previously Verano) has released version 3.0 of its flagship risk mitigation suite for real-time process control and SCADA environments. New Industrial Defender (ID) features include support for Cisco and Secure Computing firewalls, intrusion detection for power substations and better scalability.
ID’s cyber risk protection program includes professional services, risk mitigation technology and co-managed security services. Such an approach is considered critical in guarding against the increased volume and sophistication of cyber security threats. ID also addresses North American Electric Reliability Corporation (NERC) standards, which became mandatory and enforceable on June 18, 2007; penalties of up to $1 million per day for non-compliance take effect one year later.
ID president and CEO Brian Ahern said, ‘As cyber security threats, industrial espionage and human error increase the risk to mission-critical infrastructures, we are enhancing our platform to fully protect against the potentially catastrophic impact on business operations and public safety. A successful cyber attack could halt business or cause shortages of vital commodities or environmental and human casualties.’
ID now monitors and manages both existing and newly installed ID firewalls from a single console. Support for Cisco and Secure Computing infrastructures eliminates the need for separate cyber security or control systems for each different brand of firewall. ID monitors and manages up to 200 host systems and helps companies meet regulatory compliance mandates and guidelines including ENISA, NERC CIP and NIST.
Wood Group pipeline and subsea engineering unit JP Kenny has deployed Simulia’s ‘Abaqus’ finite element analysis (FEA) package for design and installation engineering operations in several major Western Australian gas field projects.
JP Kenny pipeline business leader Gordon Cowper said, ‘To make the undersea infrastructure more secure, we have used Abaqus FEA software to plan for major event scenarios, including the impact of cyclones on pipeline dynamics. Abaqus has reduced simulation times and improved the efficiency and accuracy of pipeline design and route mapping.’
Abaqus FEA models static and dynamic loads, vibration, multibody systems and complex physics in a common data model and integrated solver. Simulia is a wholly-owned unit of Dassault Systèmes of France and Abaqus can cohabit with Dassault’s flagship Catia CAD package.
Simulia VP marketing Ken Short added, ‘Abaqus has a history of use in energy, petrochemicals and offshore exploration. We continuously develop new technology for customers like JP Kenny to address the challenging engineering problems associated with increasingly harsh operating environments.’
OpenSpirit has released a web server that offers E&P professionals a ‘new way’ of interacting with geotechnical data and applications. The OpenSpirit web server (OSWS) lets casual and experienced users navigate, select and view OpenSpirit-enabled data. Geoscientists view E&P data along with publicly available map data via 3D browsing tools, such as Google Earth, ESRI ArcGIS Explorer and NASA World Wind.
OSWS harnesses the power of Web 2.0 technology to display basic well, seismic, interpretation and culture data at the project level without initiating application licenses. The tool was designed for executives who need a quick spatial view of their areas of interest. Power users can preselect data in OSWS and transfer it to OpenSpirit-enabled applications. Data can be accessed from OpenSpirit-aware data stores including OpenWorks, GeoFrame, Finder, PPDM, Kingdom, Petra, Recall and ArcSDE.
OpenSpirit CTO Clay Harter said, ‘Users get a global view of the latest information and can browse data in a table format or best of all, get a spatial view of G&G data by overlaying 3D raster imagery from a geographic exploration system.’
OSWS operates behind the firewall and can limit access to subsets of an existing OpenSpirit framework. Developers can integrate OSWS with existing portals and web applications, exposing lists of data store types, installations and projects. OpenSpirit data attributes can be queried and Google Earth KML code is generated on the fly.
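Generating KML on the fly is straightforward, since KML is plain XML. The sketch below shows the kind of document a server like OSWS might emit for a well list; the well name and coordinates are made up, and this is an illustration of the KML format rather than OpenSpirit’s implementation.

```python
def wells_to_kml(wells):
    """Render a list of {name, lon, lat} dicts as a minimal KML document."""
    placemarks = "\n".join(
        f"  <Placemark><name>{w['name']}</name>"
        f"<Point><coordinates>{w['lon']},{w['lat']},0</coordinates></Point>"
        f"</Placemark>"
        for w in wells
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        f"{placemarks}\n</Document>\n</kml>"
    )

# Hypothetical well near Houston; KML coordinates are lon,lat,altitude.
kml = wells_to_kml([{"name": "Well A-1", "lon": -95.37, "lat": 29.76}])
print(kml)
```

Opening the resulting file in Google Earth would drop a placemark at each well location, which is all a ‘quick spatial view’ needs.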
Dan Piette, OpenSpirit president and CEO added, ‘Most think of Geographic Information Systems as a way to chart driving directions or zoom over satellite images. But GIS has enormous potential in the upstream to help knowledge workers interact more naturally with their data and geotechnical applications. The Web Server is all about improving efficiency and making faster decisions.’
Angolan state oil company Sonangol is to use Norchard Solutions’ ‘Succession Wizard’ package to support its succession planning and staff requirements forecasting. Succession Wizard (SW) is claimed to take the ‘gut instinct and guesswork’ out of staff appointment and promotions. SW identifies critical gaps in management and plans future staffing needs accordingly.
SW director Jon Lear said, ‘Organizations generally have a wealth of talent already working in different positions and levels throughout the business. This hidden resource should be identified and tapped as it is a key factor in effective succession planning. SW does this and shows you what skills and resources you already have at your disposal. SW identifies people who, with support and training, can climb your management structure and potentially become its future leaders.’ Sonangol is to use SW across its 30 worldwide subsidiary companies.
Statoil has upgraded its seismic processing cluster – now said to be Scandinavia’s largest high performance computer (HPC). The system comprises 264 nodes, each an HP ProLiant DL380 G5 with a 3 GHz quad-core Intel Xeon processor and 32 GB RAM.
Disk storage is provided by a Panasas ActiveStor 3000 Parallel Storage Cluster with 77 terabytes onboard and 4 GB/s bandwidth. The operating system is Red Hat Enterprise Linux 4.4 with Cisco OpenFabrics Enterprise Distribution (OFED) InfiniBand connectivity running across Gigabit Ethernet and Cisco switches.
Statoil senior VP Bill Maloney said, ‘E&P challenges have been the main driver for acquiring this kind of processing power. The machine is already providing seismic images of prospects in the Gulf of Mexico, where there are huge problems mapping beneath the salt layers.’ Processing GOM data is performed by geophysicists in the Statoil ‘Find’ project in Norway with input from geologists in Houston.
Staff geophysicist Roger Sollie added, ‘The machine plays a decisive role in providing better images of the geology several kilometers beneath the surface, especially in areas with complex geology.’ Performance is quoted as 12.7 (peak) and 10.4 (sustained) teraflops.
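The quoted 12.7 teraflop peak can be sanity-checked from the hardware specification, assuming (as was typical for Intel Core-architecture Xeons of the period) four double-precision floating point operations per core per cycle:

```python
# Peak = nodes x cores per node x clock (GHz) x flops per cycle per core.
nodes, cores, ghz, flops_per_cycle = 264, 4, 3.0, 4
peak_tflops = nodes * cores * ghz * flops_per_cycle / 1000
print(f"{peak_tflops:.1f} TFlops")  # 12.7 TFlops
```

This matches the published peak figure; the 10.4 TFlops sustained number reflects the usual gap between theoretical and achievable throughput.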
Saudi Aramco has awarded Perth, Australia-based ISS Group and its Middle East partner Naizak, an order for BabelFish, ISS’ well performance monitoring system. BabelFish will be deployed on 380 wells in Saudi Aramco’s iField project.
The order follows successful completion of a pilot. BabelFish monitors large numbers of wells in a single view and provides immediate visual alerts of non-performing assets. ISS continues work with Aramco on enhancements to the system.
ISS Group Asia/Middle East MD Richard Pang said, ‘BabelFish will provide Saudi Aramco with a web-based, real time well performance monitoring and data integration capability. We have been working with Saudi Aramco over the last two years to evaluate the suitability of the product.’
Though the incremental value of the contract is only A$300,000, Pang considers the deal ‘a significant international achievement for the ISS Group.’ ISS Group also reports orders from BG International, Hess Oil and other ‘leading players in the global energy, oil, gas and process markets’ in North America and Europe. Asian clients include ConocoPhillips, Petronas and Kodeco. More from firstname.lastname@example.org.
Houston-based EnterSys Group has just announced OFSxpress, claimed as ‘the first all-in-one SAP solution for the oilfield and drilling services industries.’ OFSxpress (OFSx) addresses the business needs of midsize oil field service companies with a ready-to-use, integrated package that can be implemented in twelve weeks.
An initial SAP solution developed for PowerWell Services (part of The Expro Group) was the template for OFSx. Jim Claunch, former PowerWell CFO said, ‘EnterSys solved the problem of getting 11 countries transacting business and 10 remote locations communicating, all within our initial 90 day timeframe. The ability to have everyone looking at the same information and not multiple Excel spreadsheets is a huge benefit.’
Michael Sotnick, senior VP SAP Americas added, ‘OFSx enables oilfield services and drilling companies to achieve their growth objectives, lower costs and gain a better understanding of business processes.’
OFSx comes pre-configured with best practices and baseline business processes for multiple subsidiary reporting and global operations, detailed costs and revenues, consumables and services management and Sarbanes-Oxley and other regulatory compliance.