You’ve heard of the semantic web, I expect—Berners-Lee’s pipe dream of a data-driven web, enabling machine-to-machine communications and pinpoint search. I use the term ‘pipe dream’ advisedly, because the semantic web at large has so far been as big a flop as the original world wide web was a success. The main reason is that while folks are happy to cobble together forgiving code like <h1>My Big Headline</h1>, the instructions for deploying the semantic web are hard to find, generally incomprehensible, and demand an effort well beyond the average web developer—who is busy coding for three different browsers, none of which support the semantic technology anyhow.
But, the argument goes, this is not so in the enterprise, where browsers can be standardized and extended, and where code can be written by a machine—hiding the ugliness. Moreover, at the enterprise level it should be possible to mandate compliance with a standard representation that facilitates data sharing—although industry has not been very good at this to date—without resorting to a ‘standard’ (i.e. proprietary) solution from a single vendor.
It would be even better if this happened at the national level—so that a region’s companies and suppliers were all singing from the same semantic hymn sheet. And this is exactly what might happen in Norway, if the POSC/Caesar Association (PCA) manages to pull it off (see our report from the PCA ‘Semantic Days’ conference on page 6). Semantic web technology has been adopted for deployment of the ISO 15926 standard for the description of plant and process hardware including offshore structures. Some now see the approach expanding to embrace more upstream protocols such as WITSML and ProdML.
ISO 15926 is also getting traction in the US through the Fiatech standards body (see page 3). Here BP, Chevron and Petrobras report on work that involved the standard, albeit not all at the enterprise scale. As usual in the standards game, some ‘deployments’ may be restricted to the use of a ‘compliant’ application. Nonetheless, it is pretty significant that a US standards body is getting in on the act.
In fact it is fair to say that the PCA/Fiatech activity, if it develops as planned, will be one of the first truly significant semantic web deployments in any vertical. Which is both praiseworthy and a bit scary!
~ ~ ~
Last month we reported from Phil Crouse’s PNEC data integration conference in Houston and the SPE Intelligent Energy show in Amsterdam. I was struck by the difference in tone of these tradeshows. As the Intelligent Energy show is about applying digital technologies to the oilfield and PNEC is about managing the data that comes, inter alia, from the oilfield, you would think that these shows reflected different sides of the same coin.
How is it then that Intelligent Energy is a razzmatazz celebration of ‘cool stuff’ and PNEC is, well, a bit tedious? How come Intelligent Energy (IE) portrays a world of successful deployment of projects of mind-boggling complexity and reach, while the poor old PNEC crowd is still complaining about under-resourced projects, poor quality data from vendors and so on? And most importantly, how can all these collaboration rooms, fields of the future and model-based decision support systems claim to work properly if the data that they depend on is in the parlous state that PNEC would suggest?
Part of the answer lies in the sneaky ‘inter alia’ of my first paragraph. PNEC, for historical reasons, is mostly about geological and geophysical data management. This is because the G&G community began looking at data management in a serious way a decade or so ago. The petroleum engineering community has only really been into data management for a couple of years.
But that’s not quite right either. The digital oilfield is lucky compared with the rest of the upstream in that it inherited a large installed base of process control instrumentation and systems. The engineers didn’t really need to manage data; it was already being done for them in the historian.
But that’s not the whole story either. In his keynote at IE, Schlumberger CEO Andrew Gould cautioned that although the concept of real time optimization has been around for a decade or so, along with the expectation that recovery rates would be boosted, this has not really happened. And this is despite a history of digital oilfields going back to Exxon’s Computerized Production Control in 1967!
So with such a digital history, why all the histrionics? There is a degree of commercialization at IE which is less evident at PNEC—something of a paradox when you think that IE emanates from the Society of Petroleum Engineers. The commercial razzmatazz means that even the more technical presentations get sucked into an excess of ‘wow’ factor. PNEC talks in general tell it like it is.
How is it?
And how is it, you may ask? Well, you can’t do the digital oilfield without sorting out the data issue. So one has to imagine that much of the singing and dancing at IE concerns somewhat early-stage development. It’s much easier to develop a skunk works solution that flies for the duration of a demo than to assure enterprise-wide success for a relatively modest project. Another issue impacting the upscale conferences these days is the skills shortage. If you are trying to recruit, you need to make it sound better than it is. If you are trying for more funds, make it sound worse. Perhaps PNEC should be considered a user group and Intelligent Energy a beauty contest!
At the 2008 American Association of Petroleum Geologists (AAPG) Annual Convention and Exhibition in San Antonio this month, TerraSpark Geosciences, a spin-out from the University of Colorado that was formerly the BP Center of Visualization, announced ‘Insight Earth,’ a new seismic interpretation suite.
Insight Earth’s technology and workflows derive from a five-year Geoscience Interpretation Visualization Consortium research program with sponsorship from BHP Billiton, BP, Chevron, ConocoPhillips, Paradigm Geophysical, and Shell. A portion of the research has also been supported by ExxonMobil and Anadarko.
An early version of Insight Earth, dubbed ‘Computer Aided Seismic Interpretation’ (CASI), was presented at last year’s SEG (OITJ Oct. 07). Insight Earth builds on the CASI approach with automated fault extraction, surface wrapping and domain transformation to enhance geoscience workflows.
A stratigraphic workflow leverages a 3-D transform to create a stratal-slice volume where all structural deformation has been removed from the 3-D data. Stratal slicing supports imaging and interpretation of the elements of depositional systems as complete 3-D surfaces, which are then transformed back as interpreted depositional surfaces in the original seismic volume.
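The domain transformation behind stratal slicing can be illustrated with a toy flattening routine. This is a minimal sketch, not TerraSpark’s algorithm: each trace is shifted so that a picked horizon lines up at a constant sample index (removing structural relief), and the inverse shift maps the interpretation back to the original volume. All names and data here are invented for illustration.

```python
def flatten(volume, horizon, datum):
    """Shift each trace so the picked horizon lands at sample index `datum`.

    volume:  list of traces (each a list of samples)
    horizon: picked sample index of the horizon on each trace
    """
    flat = []
    for trace, pick in zip(volume, horizon):
        shift = pick - datum
        if shift >= 0:
            # horizon is deeper than the datum: pull samples up, pad the bottom
            flat.append(trace[shift:] + [0.0] * shift)
        else:
            # horizon is shallower: push samples down, pad the top
            flat.append([0.0] * -shift + trace[:shift])
    return flat

def unflatten(flat, horizon, datum):
    """Inverse transform: restore each trace to its structural position."""
    return flatten(flat, [2 * datum - p for p in horizon], datum)
```

In the flattened (stratal-slice) volume, a constant-index slice follows a single depositional surface, which is what makes the subsequent interpretation geologically meaningful.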
TerraSpark CEO Geoff Dorn told Oil IT Journal, ‘Insight Earth is the first interpretation package that supports volume interpretation of complete 3-D surfaces throughout. Workflows in the structure and stratigraphy modules reduce interpretation time and effort while improving the quality of faults, horizons and geobodies. This provides an efficient interpretation of the seismic volume allowing the interpreter to recognize and extract subtle faults, channels and other structural and depositional features that might have been missed with conventional interpretation tools.’
Dorn explained, ‘We don’t compete with the full scale visualization and interpretation systems that are currently on the market. Insight Earth complements these with compatible 3D processes. Traditional packages impose manually picked faults in 2D slices. Insight Earth simultaneously renders more than four volumes with attributes, automatically picking horizons and faults.’ The package includes tools for data management, structural interpretation, stratigraphic feature analysis and a modeling tool for principal surface extraction. The package is also amenable to 4-D seismic volume interpretation. Dorn’s team presented four papers at the AAPG on various techniques of reservoir characterization and modeling using TerraSpark’s technology. More from firstname.lastname@example.org.
Halliburton’s Landmark unit has acquired the intellectual property, assets and business of Knowledge Systems Inc. (KSI). KSI provides geopressure and geomechanical analysis software and services for well placement and design.
Landmark VP Paul Koeller said, ‘Drilling deeper in complex reservoirs requires access to the real-time pore pressure and geomechanical information that KSI provides. Integrating Landmark’s well planning and engineering solutions with KSI’s pore pressure prediction and geomechanics will reduce the time needed to plan, drill and complete wells.’
KSI’s flagship Drillworks software addresses drilling performance and well path optimization. An integrated ‘Pressworks’ relational database stores and manages pore pressure and geomechanical information.
Schlumberger has also boosted its geomechanical offering with the acquisition of TerraTek of Salt Lake City (OITJ July 06) and, more recently, of UK-based boutique VIPS (OITJ May 07).
The Austin, TX-based Fiatech organization held its annual technology conference and showcase in New Orleans last month. Fiatech is a technology standards body that promotes integration and automation technologies in capital intensive projects. Fiatech director Ric Jackson outlined the consortium’s activity which includes automated procurement and supply networks, a global valve e-catalog, self-maintaining facilities and accelerating deployment of the ISO 15926 Standard (see also our report from the Norwegian ‘Semantic Days’ conference on page 6 of this issue). Fiatech is also working on permitting and regulatory standards and on harmonization with operations (MIMOSA) and geospatial (OGC) standards.
ISO 15926 WIP
Robin Benjamins (Bechtel) described the Fiatech ‘work in progress’ (WIP) implementation of ISO 15926—a.k.a. Accelerating Deployment of ISO 15926 (ADI). The business case for the standard is clear: reduced turnaround time for catalogue searches and information requests, less mapping between supplier, internal and owner classification systems, streamlined procurement and more information handed over. ISO 15926 can also form a baseline for a lifecycle enterprise data dictionary, deployable from front end engineering design, through handover and on to operations. A roadmap for deployment is under development with the formation of ADI special interest groups targeting suppliers, owners and EPCs.
Calgary-based asset master data management specialist NRX has provided an open source WIP browser proof-of-concept (available at iso15926.nrx.com). The plan is for a SQL Server database of the RDS/WIP feeding through an RDS-to-RDF converter into a SPARQL server. The NRX template browser leverages W3C and IETF protocols to read from the RDF server on a ‘Semantic Web’ framework (Pellet, Java, Jena). Benjamins concluded that ISO 15926 can be quickly and usefully implemented with available Semantic Web tools that cross platforms and implementations successfully—leveraging the strong foundation of W3C and IETF standards.
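The pipeline NRX describes ends in pattern queries against a set of (subject, predicate, object) statements. A minimal pure-Python sketch of that idea follows; the class name and sample reference data are invented, loosely in the spirit of an ISO 15926 reference data library, and are not NRX’s actual code.

```python
class TripleStore:
    """Minimal RDF-style store: facts are (subject, predicate, object) tuples."""
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def match(self, s=None, p=None, o=None):
        """Pattern query: a None argument acts like a SPARQL variable."""
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

# Illustrative reference data (hypothetical identifiers)
store = TripleStore()
store.add("rds:Pump101", "rdf:type", "rds:CentrifugalPump")
store.add("rds:Pump101", "rds:ratedPower", "75kW")
store.add("rds:CentrifugalPump", "rdfs:subClassOf", "rds:Pump")
```

A real deployment would of course use an RDF store queried over SPARQL (e.g. the Jena stack cited above); the point is that the underlying data model is just triples plus pattern matching.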
Hakan Sarbanoglu (Kalido) presented a paper on integrated asset information management, co-authored by Bill Nyström, manager of BP US’ pipelines and logistics program. BP has leveraged Kalido’s master data management solution in its pipelines and logistics data management initiative. A meta and reference data repository was built using a generic storage design based on ISO 15926 Part 2. Physical data is stored in a triple-store design which remains the same for any model and data. Changes are defined as incremental metadata (ISO 18876). BP transports 2.5 million barrel-miles per day of oil, refined products, natural gas liquids, carbon dioxide and chemicals through its 10,000 miles of pipe, 70 light-oil terminals and 500 trucks. The data improvement program began with the construction of a central asset repository and pipeline business model. Data cleansing was achieved using DataFlux’s data quality tool. Benefits included the discovery of missing assets and the improvement of key processes spanning in-line inspection, corrosion tracking, HSSE incident tracking and more. Today all BP’s pipelines systems and facilities have been loaded into Kalido MDM. The program enabled the decommissioning of multiple Excel spreadsheet ‘databases.’ Reporting is now performed with Business Objects, offering savings in compliance costs.
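The generic storage design described above, one physical schema that ‘remains the same for any model and data,’ can be sketched with stdlib SQLite. The table and column names and sample rows are illustrative, not Kalido’s schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE triples (subject TEXT, predicate TEXT, object TEXT)")

# The same three-column table holds any model: pipelines, terminals, incidents...
rows = [
    ("pipeline:P-17", "hasSegmentCount", "42"),
    ("pipeline:P-17", "carries", "refined products"),
    ("terminal:T-03", "locatedIn", "Texas"),
]
conn.executemany("INSERT INTO triples VALUES (?, ?, ?)", rows)

# Adding a new entity type is a data change, not a schema change
conn.execute("INSERT INTO triples VALUES ('truck:TRK-9', 'rdf:type', 'Truck')")

carried = conn.execute(
    "SELECT object FROM triples WHERE subject = 'pipeline:P-17' "
    "AND predicate = 'carries'").fetchone()[0]
```

The design choice is that extending the logical model never alters the physical schema, which is what makes the store model-agnostic across assets and data types.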
Edward Fry, manager of IM/IT on Chevron’s major capital projects, revealed that Chevron’s 2008 capital program, at $23 billion, is its largest ever. A high cost environment means that execution is critical. Chevron has a dozen worldwide projects with over $1 billion net share. Information management (IM) on major capital projects requires control of complex document routing workflows, frequent exchange of data with third parties and content organized to support both the project and subsequent operation—all in the context of increasing IM scale and complexity as the project moves from concept selection through front end engineering and detail design to execution.
Previously, gaps were exposed in Chevron’s major capital projects information management. With the growth in project number, size and complexity, demand for experienced IM personnel was outstripping supply. Stop-gap solutions led to extensive and costly rework in software and procedures. A lack of standard IM specifications made for inadequate sharing of best practices and lessons learned and challenged Chevron’s ability to find and retain personnel.
The solution was to implement Fiatech’s vision—going beyond building information management to ‘total asset lifecycle information modeling.’ Chevron is therefore a keen participant in the Accelerated Deployment of ISO 15926 (ADI) program. Chevron’s major capital projects IM (MCPIM) initiative leverages Software Innovations’ Coreworx document management system and business process automation. Data quality is being addressed as an essential precondition to successful MCPIM. Data governance has been implemented such that data is captured once at source and maintained throughout the life cycle. ISO 15926 is seen as the standard for data exchange. It was selected because it is the only industry standard that is close to Chevron’s requirements and it aligns with Chevron’s service oriented architecture (SOA). Current MCPIM deliverables include a project management handbook and a Chevron contract language template that includes the IM standards. Bentley’s ProjectWise Lifecycle Server and Aveva’s Vnet are also used (notably on the Agbami FPSO—Oil ITJ July 07). Going forward, Chevron is mapping its MCPIM classes to ISO 15926. But most significantly, Chevron is now mandating ISO 15926 compliance from its suppliers.
Alexandre Casalechi described Petrobras’ 3D virtual reality (VR) deployment for production optimization and asset management, which leverages VRcontext’s WalkInside and ProcessLife applications. These tools are said to fit the Fiatech ‘philosophy’ by allowing re-use of 3D design investments throughout the asset life cycle. Petrobras has a dozen refineries using VR for intuitive, real-time information management and to support a ‘knowledge-enabled’ workforce. Flagship deployment is at the Replan refinery, Brazil’s largest, with a capacity of 360,000 barrels of oil per day.
The 3D plant data management system has been in use for a decade, covering engineering, operations, maintenance and HSE. The model contains over a million objects and some 7,000 real-time monitoring points. The VR system provides access to real-time attributes such as pressure, temperature, volumes and alarms. WalkInside also displays computational fluid dynamics (CFD) results and satellite imagery, inter alia. More from www.fiatech.org.
Acceleware has just rolled out ‘AxK,’ a seismic processing software library that brings graphics processing unit (GPU)-based number crunching to 3D pre-stack time migration (PSTM) developers. Acceleware demoed the solution at the 2008 C3GEO Convention in Calgary this month. AxK is claimed to deliver increased computing performance while reducing power consumption, cooling needs and data center footprint.
Wilf Kruggel, director of Techco Geophysical added, ‘The use of GPUs for running large seismic migrations gives processors massive processing power while reducing the need for additional infrastructure.’
Acceleware investor and technology provider Nvidia announced it is a founding member of Stanford University’s new Pervasive Parallelism Lab (PPL). The PPL will develop new techniques to harness the parallelism of multiple processors. Bill Dally, chair of Stanford’s computer science department said, ‘Parallel programming is perhaps the largest problem in computer science today.’ Other partners in the PPL are AMD, Hewlett Packard, IBM, Intel, and Sun Microsystems.
Netherlands-based upstream software house JOA has teamed with Fusion to offer integrated geological modeling, reservoir engineering, reservoir simulation and field management services. JOA’s Jewel Suite and Fusion’s project management expertise will be available from a new company, ‘Fusion Reservoir Engineering Services.’ Fusion and JOA are also working to integrate their software in a new cross-platform offering.
Fusion CEO Alan Huffman said, ‘Our vision is of an integrated services and technology offering across the geoscience and engineering disciplines that will integrate the strengths of all of our proprietary software. The integration of the Fusion and JOA software and technology will provide a new concept for integrated services that will change the way oil and gas are discovered and produced.’
JOA has also teamed with Enres to develop a ‘next generation’ well correlation tool based on Enres’ ‘CycloLog’ technology. The tool targets areas where 3D seismic is unavailable and where modeling is ‘wellbore driven.’
Jewel Suite 2008
JOA has also announced Jewel Suite 2008—with OpenWorks and GeoGraphix file support, 3D visualization, speedup of seismic slices and volume display, uncertainty management and more.
Landmark has struck a ‘preferred-reseller’ and strategic development agreement with parallel storage solution provider Panasas Inc. The deal adds Panasas’ ActiveStor storage clusters to Landmark’s ProMAX SeisSpace seismic processing portfolio.
Landmark CTO Chris Usher said, ‘Seismic data processing can challenge high performance computing systems, particularly with conventional storage technologies, which can’t keep pace with the parallel processing functions required for most seismic processing jobs. Hardware is often waiting for storage systems to catch up, creating I/O bottlenecks.’ Panasas’ storage subsystems deploy a dedicated ‘PanFS’ parallel file system that eliminates I/O bottlenecks, improving application performance and cluster utilization.
Panasas delivers ActiveStor storage as an ‘appliance,’ suitable for deployment and administration in shops with limited IT staff. Landmark has also optimized its processing suite to take advantage of the high end storage systems. The software is available bundled with Panasas ActiveStor parallel storage clusters.
Venezuelan state oil company PDVSA has chosen NeuraDB as its corporate well log repository. The database will house some 500,000 well log images that PDVSA has acquired over the past 70 years. NeuraDB provides PDVSA personnel with immediate access to this information through query and GIS-based interfaces.
NeuraDB’s strength is that it manages scanned historical well logs (rasters) alongside structured data. PDVSA has undertaken a significant digitizing program to build a digital archive of wells drilled up to 2003. Around 80% of these paper and film well records have been digitized and some 40% of these have also been vectorized to log curve data.
PDVSA had previously developed its own web-based solution for accessing scanned log imagery over the internet. The system proved hard to maintain, so PDVSA decided to acquire NeuraDB as its new well log repository. Along with well logs, NeuraDB can house geologic maps, sections and seismic data. NeuraDB was also selected for its data management and organizational capabilities, its configurable document management, geo-referencing, security and ease of integration with other repositories. Neuralog is also engaged in training PDVSA’s data managers and IT personnel who will be operating the system. More from email@example.com.
3DGeo has developed a high-end imaging application using anisotropic reverse time migration (RTM) to image complex structures such as sub-salt plays.
Beicip-Franlab’s new CondorFlow 1.5 release integrates the geological model into the fluid flow history match workflow. The new release provides geostatistical and upscaling algorithms and links to third party tools such as Petrel, GoCad and Eclipse.
Earthworks Environment & Resources has now productized its ‘MPSI’ fast stochastic inversion tool, as well as ‘HIIP,’ a ‘portable’ volumetrics package.
PPDM member ETL Solutions’ ‘TM’ 4.6 data migration toolset offers improved discovery of XML models and improved database transforms. Flat file model editing gives fine-grained control over CSV file format transformations.
Engineering data management specialist Engineous has announced Fiper V3.0, a framework for managing data, models, applications and computing resources. Jobs can be run across the network and projects hosted in a version-controlled library with information checked in/out over the web.
Ikon Science has released RokDoc V5.3 with a new pore pressure calculator, interactive VSP interpretation and EM modeling. ChronoSeis, the 3D/4D reservoir characterization module now has direct access to the RokDoc database enabling well data, cross-plot polygons and rock physics models created in RokDoc to be available in ChronoSeis models. ChronoSeis also features a new interactive stochastic inversion module.
Neuralog has just released NeuraLaser, a continuous color laser printer for well logs. The high quality printer offers speeds of up to 18 cm/s, automated paper handling and high capacity consumables. NeuraLaser was developed through a partnership with Lexmark.
The latest 16.5 Release of Petrosys’ mapping package includes a plug-in for Schlumberger’s Petrel, direct access to GoCad ‘tri-surf’ and ZGF data, spatial indexes for PPDM data and an interactive georeferencing tool.
Laser scan specialist Quantapoint unveiled QuantaCAD 8.0 at this month’s Offshore Technology Conference in Houston. Enhancements include improved memory management, fast clash analysis of laser and CAD models and graphical views that can be exported and saved as JPG or TIFF images.
ConocoPhillips has initiated an open source software project, GeoCraft, a general purpose geoscience development platform. GeoCraft began as a way to encourage code reuse at ConocoPhillips which considers that most of its code does not need to be kept secret. Open sourcing the commodity parts of its code base makes for improved collaboration with vendors and universities and will help ConocoPhillips source its geoscience software from consortia, software startups and standards bodies.
Designed as a lightweight framework for rapid prototyping and deployment of new geoscience algorithms, GeoCraft tools include ‘ABavo’ for seismic amplitude versus offset (AVO) analysis, ‘GeoMath,’ a set of simple geoscience algorithms, and tools for visualization and data exploration. GeoCraft targets a wide user community including exploration geoscientists, research geophysicists, computer scientists and data managers.
The framework includes a standardized domain model based on geoscience objects such as wells, traces, logs, faults, horizons and seismic volumes. Effort has been made to reconcile the different types and levels of detail among proprietary formats to enable interoperability. Viewers send and receive data selection, GIS, and cursor tracking events and a right mouse click displays an object’s properties. More from www.geocraft.org.
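The linked-viewer behavior described above, with viewers sending and receiving selection and cursor-tracking events, is a classic publish/subscribe design. A minimal sketch follows, assuming a simple event bus rather than GeoCraft’s actual API; all names are invented.

```python
class EventBus:
    """Tiny publish/subscribe hub: viewers register for named event types."""
    def __init__(self):
        self.listeners = {}

    def subscribe(self, event_type, callback):
        self.listeners.setdefault(event_type, []).append(callback)

    def publish(self, event_type, payload):
        for cb in self.listeners.get(event_type, []):
            cb(payload)

class Viewer:
    """A viewer tracks the shared cursor so all linked views stay in sync."""
    def __init__(self, name, bus):
        self.name = name
        self.cursor = None
        bus.subscribe("cursor", self.on_cursor)

    def on_cursor(self, position):
        self.cursor = position

bus = EventBus()
map_view, section_view = Viewer("map", bus), Viewer("section", bus)
bus.publish("cursor", (1520.0, 3480.0, 2.1))  # x, y, two-way time
```

Decoupling viewers through an event bus is what lets new viewer types join the framework without any existing viewer knowing about them.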
Geoscience and engineering consultants Knowledge Reservoir (KR) and upstream software boutique 3GiG LP (3GiG), both located in Houston, have formed a strategic alliance to offer business process and well lifecycle management services. 3GiG provides software and consulting services focused on business process, knowledge management and decision support for the upstream.
KR president Ivor Ellul said, ‘KR will now be able to expand its service offering by providing asset team consultants and technology to streamline clients’ business processes around field, reservoir and well.’
The alliance coincides with the release of 3GiG’s new business process, knowledge and well lifecycle management system, Prospect Director 2.0. The web-based application addresses asset team workflows, business and decision processes, lifecycle-based data, information and knowledge management, well and well work planning, AFE and inventory management. Upstream companies will have access to 3GiG’s technology and expertise through KR asset consulting teams for use across the scope of their projects, from well planning (drilling to plug and abandonment) to field development planning, lead and prospect generation, workflow tracking and acquisition and divestiture packaging.
3GiG founder Tim Altum added, ‘For clients interested in streamlining their processes, the alliance brings together KR’s powerful team of asset team subject matter experts with 3GiG’s technology and business process experience to help our mutual clients solve their process and standardization issues.’ Knowledge Reservoir is a wholly owned subsidiary of Ziebel AS.
Bernt Helge Hansen (StatoilHydro) presented the common oil and gas ontology, a component of the Norwegian OLF trade body-sponsored integrated operations project. The aim of integrated operations (IO) is for all fields and facilities to be compliant with the program and to develop a global IO support network. IO promises a ‘revolution in our way of working.’ Enablers are the semantic web, ISO 15926 and other oil and gas ontologies and XML schemas. Project scope includes drilling, reservoir and production, operations and maintenance. Smart web services will move real time data from the historian to applications covering the whole range of oilfield activities from condition-based maintenance to fiscal metering.
Automation, automation ...
For Einar Landre (StatoilHydro), both the rationale for IO and the means are clear. An OLF study estimated that IO could produce savings of around $60 billion on the Norwegian continental shelf. The means are twofold. First-generation IO, which is well underway, involves moving work onshore, leveraging a high bandwidth infrastructure. Generation 2 IO involves more advanced solutions for information management, automation and ‘autonomy.’ Landre believes that, in general, people make inferior decisions and that automation is the way forward. But today’s software was not designed for this. ‘We are trying to solve a set of non-trivial problems on the boundary of research in IT.’ The answers lie in large scale networked systems, machine learning, Bayesian uncertainties and the Semantic Web.
Early work focuses on ‘socio-technical’ systems that combine humans and machines working to achieve common goals. Automation offloads the human element, making experts available to work on the ‘hard stuff.’ One persistent problem is the ‘buzzword compliant’ vendor. Here Landre suggests a standards based approach (using the IEEE 1471 recommended practice for architectural description of IT systems). The aim is to extend from the IO feasibility project, building on IIP, ISO 15926, ProdML, DISKOS etc. The semantic model has proved valuable in linking data sources—solving the legacy systems and heterogeneous data sources issues.
Nowhere is automation more important than in Norway’s high north. A new joint industry project is just kicking off to investigate second generation integrated operations—i.e. automation. The $18 million project is headed up by Nils Sandsmark. The high north was chosen because fields will be seabed-completed and operations may be thousands of kilometers from the shore. Fields will be highly instrumented and digital, with autonomous subsea controls and networks. Such remote operations will be challenging from the standpoint of security and regulations and will require ‘zero’ environmental impact. The JIP is to study the convergence of various XML formats (ProdML, daily and monthly production reporting etc.) into ISO 15926, which will then be extended to cover production optimization and real-time data quality assurance and readied for automated reasoning by ‘smart agents.’ Model-based prediction technology also ran.
Semantic Web guru Leo Sauermann of the German DFKI artificial intelligence R&D center offered a backgrounder on deployment outside of oil and gas. For Sauermann, the semantic web promises ‘a consistent data model for integration of content from different organizations, providers, departments, disparate data sources and legacy data.’ Multilingual taxonomies use the user’s terminology and are queried with the ‘expressive’ SPARQL language. Extensible metadata incorporates unanticipated data sets; new requirements do not change code. Current semweb projects include WordNet, US Census Data, Flickr Exporter, OntoWorld, RDF Book Mashup, DBpedia and ‘friend of a friend.’ Open linked data now includes some 2 billion ‘facts’ and 680 thousand ‘links.’ In a March 2008 announcement, Yahoo unveiled its Search Open Ecosystem, including support for semantic web standards. Other applications cited by Sauermann included Vodafone Live, which uses an RDF vocabulary for ringtones, metadata and content ratings. RDF has been used since 2000 by the Mozilla internet browser to configure its plugins.
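Sauermann’s point about multilingual taxonomies can be illustrated in a few lines: one concept identifier carries labels in several languages, so users can query in their own terminology and still land on the same concept. The concept URI and labels below are invented for illustration.

```python
# One concept, many labels: queries in any language resolve to the same URI
labels = {
    "ex:CentrifugalPump": {"en": "centrifugal pump",
                           "no": "sentrifugalpumpe",
                           "de": "Kreiselpumpe"},
}

# Invert the label table so a user's term, in any language, finds the concept
term_to_concept = {label.lower(): concept
                   for concept, langs in labels.items()
                   for label in langs.values()}

def lookup(term):
    """Return the concept URI for a term in any supported language."""
    return term_to_concept.get(term.lower())
```

In an RDF taxonomy the same effect is achieved with language-tagged literals on a single subject, so the data model carries the multilingualism rather than the application code.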
ISO 15926 and semantic technology are not restricted to Norway, as Wan Hassan Wan Mamat (Petronas) demonstrated. Petronas shares the vision of an industry standard data model based on ISO 15926. The Petronas Carigali information model (PCIM) builds on ALCIM’s ‘myCIS’ Integrated Engineering and Asset Management data model and an ISO 15926 reference data library. This covers front end engineering design, process control, electrical and instrumentation. 3D data model translations between applications go through ISO 15926 and an XMpLant interface. Compliance is mandatory for Petronas’ contractors. Like Norway, Petronas has an integrated operations center, ‘PViC,’ located in the Petronas twin towers, which controls operations on the Tangga Barat and Kumang fields.
National Oilwell Varco
Henning Jansen (National Oilwell Varco) believes that the Norwegian Continental Shelf is a world leader in high-tech drilling, with onshore drilling centers and real-time decision making. Protocols such as WITSML, WITS, OPC and ODBC allow for the integration of independent service providers, each with their own tools and platforms. While data exchange standards like WITSML are good, true integration remains elusive. So DNV created the PCA Drilling special interest group in 2007 and is working on an ISO 15926 ontology for drilling, with participation from StatoilHydro, BP and ConocoPhillips.
Johan Wilhelm Kluever reported on the progress of the modeling, methodology and technology (MMT) special interest group. The MMT SIG works on the establishment and maintenance of correct and reliable reference data. Ontologies are used to record factual statements from domain experts, which are machine-translated into precise statements in the ontology. The MMT SIG cuts across other PCA activities, focusing on ISO 15926 Part 2—i.e. the data model and upper ontology. Part 2 has a ‘canonical’ representation described in the Express data modeling language, with current implementations in OWL. A software development wiki is available at trac.posccaesar.org.
Mike Brulé kicked off the proceedings with an expansive plug for Microsoft’s capabilities in oil and gas. Slideware showed some 20 contributing Microsoft products and technologies, from .NET to Silverlight, SharePoint, Dynamics and Mobile Computing. A conceptual demo ensued of oilfield problem detection and resolution. PerformancePoint analytics (ProClarity) monitors asset performance and identifies problems. The theme is collaboration between field management, operations and service companies. A ‘Groove workshop’ and SharePoint portal make for a ‘role-based, secure information repository.’
Marty Henderson provided a review of Marathon’s Digital Oilfield proof of concept (POC). Marathon sees the Digital Oilfield as a one-stop shop where information on wells, fields and facilities can be accessed and where knowledge workers can find the information they need from an underlying application data store of record. ‘Role-based’ information access provides management reports on performance and drill-down to detailed information. Information is combined and presented in graphs, KPIs, tables, maps, and composite views in ‘mash ups’ from many sources. Tools deployed include Hyperion (business intelligence), OSIsoft’s PI System (data historian), Schlumberger’s Decision Point and the ubiquitous SharePoint. Developers included The Information Store of Houston, which was able to implement a set of deliverables in a ‘restricted timeframe.’ Microsoft’s Virtual Earth mapping tool also ran. Marathon was happy with the POC, which has increased its understanding of the potential of the digital oilfield and outlined requirements for a final solution.
Bill Gilmore presented Chevron’s upstream IT foundation, a ‘foundation for operational transformation.’ Previously Chevron’s upstream segment comprised four autonomous business units. Although these were ‘pervasive’ consumers of IT, this had produced multiple sub optimal solutions with little coordination. Businesses were ‘run by Excel’ and had insufficient tools to survive personnel changes and loss of experience. Chevron therefore decided to develop a ‘service oriented business intelligence solution set’ based on a common IT architecture, the Upstream Foundation (UF), the ‘plumbing’ that provides Chevron’s personnel and applications access to data from the authoritative System of Record (SoR). The UF leverages common coding practices, an ‘agile scrum’ development process and componentized design. But most importantly, the UF is being developed locally in the business units to ensure alignment with Chevron’s global goals.
The ambitious program sets out to enable accurate measurement of key performance indicators across all fields. Deployment starts at the business unit level with planned rollout across global upstream. Chevron is also standardizing its master data and information architecture to provide ‘one version of the truth.’ SOA is provided by the ‘Jupiter Blueprint,’ a data integration layer that exposes a web services facade to application data. The project leverages a substantial amount of Microsoft’s wares from .NET 3.0 and Visual Studio through the Windows Communication Foundation, Internet Information Services, SQL Server and SharePoint. Partners on the UF are Microsoft, Accenture and Avanade.
BP CIO Steve Fortune’s paper described how BP was leveraging high bandwidth connectivity with its Gulf of Mexico assets to monitor real time data from its OSIsoft PI System hub. BP’s Advanced Collaborative Environment (ACE) facilitates interaction between offshore assets and a remote extended support team through data analysis in real time, aggregation of operational data, video links and ‘enhanced business processes.’ BP’s ‘OneBusiness’ communications portal embeds technology from OneTouch Systems. More from www.oilit.com/links/0805_1.
Last year, the Microsoft-commissioned Gulf Research survey of high performance computing (HPC) in the oil and gas industry found that Microsoft ‘dominated’ this market—a notion that we pooh-poohed in an editorial (Oil ITJ March 07). Microsoft has watered down its stance in its 2008 ‘HPC in oil and gas’ survey, carried out by the Oil & Gas Journal Online Research Center. Responses were received from 212 individuals working in all industry segments from exploration through refining and marketing.
As last year, little attempt was made to ring-fence HPC and Microsoft’s survey includes questions on Microsoft Office—used by 100% of the respondents to manipulate and report technical data*. Notwithstanding such ambiguity, users consider that ‘there is continued need for development of HPC capability,’ that oils need to add compute power and that suppliers of HPC technology need to educate users on its capabilities. A majority of respondents indicate being ‘mostly or somewhat satisfied’ with the performance of their current technical computing capabilities for specific applications.
Of those power users of compute-intensive scientific applications using multiple iterations, 40% typically complete ‘two iterations’ during a 24-hour day, while a third believe that eight iterations represents an ‘optimal number.’ Another third ‘didn’t know how many iterations would be optimal.’ 57% of participants reported that they have technical or scientific computing applications that are unique to their company or department. Of these, 62.1% were developed in-house, indicating a ‘considerable increase in in-house development capabilities during the past year.’ Finally, spend forecasts for 2007 were roughly one third up by more than 10%, one third up by less than 10%, with no change for the remainder.
* Perhaps the Freudian message Microsoft is sending here is that its resource-hungry applications really need HPC to work—even if it’s just Excel on Vista!
Shell has hired Marcus Ridgway as senior systems analyst in its well planning and business performance team in Calgary. Ridgway was previously with Landmark.
Jack Angel has joined SensorTran as VP oil and gas. Angel was previously with Sabeus.
Halliburton has appointed Gasser El-Badrashini as VP for Middle East Region. El-Badrashini replaces Marc Edwards who is now VP production enhancement in Houston.
OFS Portal has named Tim Graney as director of operations. Graney hails from ExxonMobil Chemicals.
Ikon Science has named Nick Pillar as operations director and Andrew Paxton as finance director. The company has also hired Mick Lambert as its western hemisphere president in Houston. Lambert was formerly president and CEO of GX Technology.
Katie Zhao has joined GGS-Spectrum as office manager of its Beijing-based subsidiary.
The Pipeline Open Data Standards (PODS) body has appointed Mike King (BP) as president, Scott Moravec (Eagle Mapping) as vice president, JW Lucas (Enterprise) as secretary and Ken Greer (CenterPoint) as treasurer.
Beicip-Franlab has hired Anna Babicheva as reservoir modelling specialist, Ran Zhang as geologist and Rodrigo Giraldo as software business development manager.
Gerald Stein is heading up Foster Findlay Associates’ (ffA) new London office.
Steve Goodacre is to head up ZEH Software’s new far east regional office in Perth, Western Australia.
Caesar Systems has hired Alex Jok to handle Shell E&P’s implementation of its PetroVR flagship. Jok was previously with Murphy Oil.
Baker Hughes has just opened a Center for Technology Innovation in Houston, a $42 million showcase for completion and production technologies. The facility includes HPT test cells rated at 700°F and 40,000 psi.
Energistics has elected Peter Eilsø Nielsen to its Board of Directors. Nielsen is chief production geologist and ‘process owner’ for technology with StatoilHydro.
Fugro-Jason has named Edward Cherednik business manager for central and eastern Europe.
TengBeng Koid is to head up ION Geophysical’s (formerly Input-Output) ‘second headquarters’ located in Dubai.
The Information Store’s new Middle East unit is headed up by Ezat Zarasvand, formerly with Oracle Corp.
Aveva’s Derek Middlemas has been elected to the Fiatech board of directors.
Israeli seismic processor Geomage has appointed Ken Larner to its executive advisory board.
Aker Solutions has grouped its maintenance, modification and operations (MMO) and field development under the new business area of ‘Energy Development & Services.’
Tim Madden, senior instrumentation and controls consultant for ExxonMobil has been appointed to the Fieldbus Foundation’s board of directors.
Schlumberger has just opened a training center in Tyumen, West Siberia.
X-Change Corp. has appointed James Farr to its board of directors. Farr was previously with Dailey International.
Lack of space means that the ‘Done Deals’ section will appear in the next issue of Oil IT Journal.
Since its introduction last year (Oil ITJ June 07), IBM’s reference data model for chemical and petroleum has been the subject of several more presentations. Speaking at the SPE Intelligent Energy conference in Amsterdam earlier this year, IBM’s Ron Montgomery, considered the father of the technology, offered an update on the framework*.
IBM’s integrated information framework (IIF) builds on a reference data model ‘spanning well bore to export pipe.’ The IIF offers access for all users to all data in a single development environment. The IIF provides an open-standards-based portal view that marshals real time information and provides access to historian data, ERP, engineering and other data. Instead of creating ever more complex data warehouses, IBM advocates federating the relationships between existing databases such that users can access original data in situ, without the need for extensive interfaces or complex data models. IBM claims a ‘significant reduction in data management expenditure.’ The services-oriented architecture means that the system is ‘easily scalable’ and extensible across the enterprise.
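The federation idea can be sketched simply: a thin layer answers a single query by consulting each source system in place, rather than copying everything into a warehouse first. The source names and record shapes below are invented for illustration and do not reflect IBM’s actual IIF schema.

```python
# Illustrative sketch of data federation: two 'source systems'
# (a historian and an ERP system here, both invented) are queried
# in situ and their records merged on demand—no warehouse copy.
HISTORIAN = {"w-001": {"rate_bopd": 1200}}
ERP = {"w-001": {"cost_center": "CC-42"}}

def federated_lookup(well_id):
    """Combine records from each source without materializing a copy."""
    merged = {}
    for source in (HISTORIAN, ERP):
        merged.update(source.get(well_id, {}))
    return merged
```

The payoff claimed for this pattern is that sources keep their own data models and the integration layer carries only the relationships between them.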
Speaking at the IBM WebSphere user group last month**, Matthew Perrins of IBM’s software lab offered a different slant on the technology. Perrins shoehorned the IIF’s plethora of process standards into the context of trendy ‘Web 2.0’ technologies orchestrated by IBM’s WebSphere enterprise service bus (ESB). The software smorgasbord includes AJAX, REST, JSON, mashups and more—collectively making up the ‘intelligent web,’ an enterprise mirror of Web 2.0’s collaboration technologies.
Perrins describes the ‘problem statement’ of petrochemicals as a need to measure and analyze process performance in the context of installed equipment base. Multiple applications are deployed, each with its own data model. Cross work flow business processes, transactions and events are not captured, views are incomplete and analysis is sub-optimal.
Under the hood
Under the hood of the IIF is the IBM WebSphere 2.0 feature pack comprising a DOJO user interface, Comet*** for real time data and a JSON/REST SOA. Perrins concluded that ‘IBM is moving towards ‘Restful SOA’ allowing all middleware to offer consumable integration end points.’
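The ‘Restful SOA’ pattern Perrins describes boils down to resources that return JSON over HTTP for an AJAX/DOJO front end to consume. A minimal sketch follows; the resource path and field names are hypothetical, not IBM’s actual API.

```python
# Hedged sketch of a JSON/REST resource: the body a hypothetical
# GET /wells/{id} endpoint might return to a browser front end.
# Path and field names are invented for illustration.
import json

def well_resource(well_id, readings):
    """Build the JSON body for GET /wells/{well_id} (hypothetical path)."""
    return json.dumps({
        "id": well_id,
        "href": f"/wells/{well_id}",
        "readings": [{"tag": t, "value": v} for t, v in readings],
    })
```

The ‘consumable integration end point’ in this style is simply a URL plus an agreed JSON shape—no SOAP envelope, no WSDL contract.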
The IIF appears to be all things to all men—both in terms of the plethoric standards ‘supported’ by the technology and the equally numerous software development buzzwords deployed. The claim that SOA is a route to scalability is a bold one. It would be nice if some of IBM’s slideware were backed up with the publication of the underlying ‘open’ model.
* SPE 112134
Excelsior Energy of Calgary has chosen Paradigm’s strategic consulting (PSC) unit to accelerate development planning at its Hangingstone oil sands resource play in the Athabasca oil sands region of Alberta. Paradigm will provide seismic processing and reservoir characterization based on a 26 core well program. The technologies to be used include Paradigm’s Gocad modeling flagship and its new ‘CRAM’ seismic depth imaging technique.
Excelsior CEO David Winter said, ‘Paradigm’s technology and oil sands expertise will allow us to integrate all of our log, core and seismic data, characterize a complex reservoir and build a resource model. PSC’s experience in the Hangingstone area along with its application portfolio will help us choose the optimal development options, and accelerate development plans for a steam assisted gravity drainage (SAGD) pilot.’
Cortex Business Solutions and Spira Data Corp. (both of Calgary) have teamed to market an ‘open,’ industry-wide solution for e-commerce in oil and gas. The deal combines Spira’s field ticketing solutions with Cortex’s service for the secure electronic exchange of business critical documents. The new solution captures and translates field tickets, invoices and purchase orders electronically from Spira’s system and delivers them to any company on the Cortex network.
Ryan Lailey, VP Business Development at Cortex, said, ‘Cortex and Spira share the same target markets. By jointly marketing a complete solution, we provide a seamless system that automates processes from field to payment.’ Spira’s flagship Wireless Field Ticketing solution automates data flows from field to head office, supporting the capture and transfer of operational and financial oil field data including expenses, equipment, materials, and billing.
$5 million offering
In a separate announcement, Cortex has commissioned Standard Securities Capital to sell up to 25,000,000 units of the company for gross proceeds of up to $5 million CDN. Proceeds will be used for general working capital and expansion of its sales and delivery organizations.
Ventyx’s utility demand and price forecasting system has been selected for a NASA project targeting short-term load forecasting for energy utilities. The award was made under the NASA Research Opportunities in Space and Earth Science (ROSES) program. The project involves the application of weather-related earth science to optimize accuracy of Ventyx’s ‘Nostradamus’ short-term forecasts.
Nostradamus uses neural networks to ‘learn’ relationships among any number of data inputs to produce demand and price forecasts. The ROSES project will improve forecasts by using NASA’s high resolution data and models of air temperature, relative humidity, wind speed and more obscure items such as ocean temperatures, offshore winds and snow pack levels.
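The ‘learn relationships, then forecast’ loop can be illustrated with a deliberately simplified example. Nostradamus uses neural networks over many inputs; the sketch below fits a single-input linear relationship by least squares, on invented data, purely to show the shape of the workflow.

```python
# Toy sketch of a learned load forecast: fit demand against one
# weather input from historical pairs, then predict. Data and the
# linear model are invented—a stand-in for the multi-input neural
# networks Nostradamus actually uses.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

temps = [10, 15, 20, 25, 30]        # air temperature, degC (illustrative)
demand = [500, 550, 600, 650, 700]  # load, MW (illustrative)
a, b = fit_linear(temps, demand)

def forecast(temp):
    return a * temp + b
```

The ROSES project’s contribution sits upstream of the model: better-resolved NASA inputs (temperature, humidity, wind and so on) improve whatever relationship is learned from them.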
Ventyx partnered with independent R&D organization Battelle to bid on the project. Battelle’s Chris Pestak added, ‘Load forecasting systems such as Ventyx’s are powerful decision support tools for energy utilities, but we can enhance their performance even further by starting with better data.’
Schlumberger has introduced a new family of coiled-tubing services leveraging real-time downhole measurements. The new ‘ACTive’ range of coiled-tubing (CT) services uses a ruggedized high-bandwidth fiber-optic cable deployed inside the coiled-tubing string to link bottomhole sensors with surface monitors and controls.
CT Services VP Sherif Foda said, ‘When you know exactly what’s happening downhole, you can adjust job parameters in real time based on downhole measurements and make a difference to the results while the operation is still in progress.’ ACTive monitors injection rates and other downhole parameters, and provides formation damage mitigation, enhanced nitrogen lift and distributed temperature surveying (DTS) of the well to monitor treatment placement and production performance. The ACTive tool featured in two presentations at the SPE/ICoTA Coiled Tubing & Well Intervention Conference and Exhibition held last month at The Woodlands, Texas.
Anadarko has contracted with P2 Energy Solutions’ (P2ES) Tobin division to create ‘producing units’ for 14 east and south Texas counties. Producing units are unitized producing wells as described in the county records. The units have been derived from P2ES’ internal data, with no review of the county or Texas Railroad Commission records. The data is for use in Anadarko’s land management system, Tobin’s SuperBase. Tobin plans to leverage its network of scouts and its well data base to help Anadarko manage non-core properties, non-operated units and farm outs.
P2ES also reports new product sales as follows. Energy XXI USA has implemented P2ES’ Excalibur financial management package in support of its ‘aggressive’ acquisition program. Maritech Resources has selected Enterprise Land 2.5. Enterprise Land’s service-oriented architecture will be used to integrate Maritech’s legacy and future financial systems in what is the first Enterprise Land go-live project. Finally, Swift Energy Company has implemented Enterprise Upstream software and Oracle’s eBusiness Suite as its integrated asset management and Enterprise Resource Planning (ERP) platform.
At the Offshore Technology Conference this month, UK-based energy analyst Infield Systems launched ‘EnergyGateway,’ an online offshore projects mapping system. EnergyGateway covers operational and future worldwide developments using information from Infield’s Offshore Energy Database (OED). The OED holds details on fields, fixed and floating production systems, subsea equipment, pipelines and more.
Infield also launched several new market reports including Floating Production, Offshore Pipelines & Control Lines, and the Global Perspective Market Update. The latter forecasts a total global spend of $320 billion over the next five years, a 50% hike on the previous five years. EnergyGateway was developed with ESRI’s Web ADF Common API, which provides a somewhat clunky web mapping system.
Tulsa-headquartered eLynx Technologies has announced two new signings for its web-based monitoring and field automation services this month. Bachtell Oil and Gas is to deploy eLynx monitoring technology on a field in Tatum, Texas. Bachtell, an independent operator based in Longview, Texas, has been using eLynx’s services for the last four years to monitor two other fields.
David Batchell, VP operations said, ‘Automating saves us a tremendous amount of time because we can diagnose any potential production problems first thing in the morning before we head out to the field. We chose eLynx because of the reliability we have experienced in our prior working relationship over the years.’
Tempest Energy Resources has likewise selected eLynx to monitor facilities in state waters along the Gulf Coast of Texas. Tempest operations superintendent John Delaney said, ‘eLynx’s monitoring, alarm and call-out system plays a big role in allowing us to meet our operational goals. We can check our wells and other critical operations in real time, wherever internet access is available. Alarm notifications allow us to respond quickly to problems and keep downtime to a minimum.’ eLynx reports 215 E&P corporate users and has operations in 20 states.
StatoilHydro has awarded Kongsberg Maritime a contract for the delivery of operator training simulators for the Statfjord field in Norway, the oldest field in the Norwegian sector and the largest field in the North Sea. The simulator, due for delivery in April 2009, will be used on Statfjord’s A, B and C platforms.
A customized dynamic process model based on Kongsberg’s ‘Assett’ simulation platform will be coupled with Kongsberg’s AIM safety and automation system, extending the existing engineering and control system check-out simulator. The combined simulator will be deployed across the operations lifecycle from engineering studies, through control system check-out to training and support.
Kongsberg’s Chris Ruigrok said, ‘This contract confirms the success of the Statfjord control system check-out and engineering simulator. StatoilHydro will now have a state of the art dynamic simulation platform that supports operations throughout the installation’s lifetime. The dynamic simulation platform is ready for future online simulator applications.’
StatoilHydro has also awarded Kongsberg a two year contract to operate its automation and safety systems throughout the North Sea. A key facet of the deal is that much of the work will be performed onshore, reducing travel, costs and ‘logistical issues.’ Shore-based operations are increasingly a reality thanks to the Norwegian Integrated Operations program that has been supported by StatoilHydro and Kongsberg.
Shipcom Wireless has kicked off an oil and gas radio frequency identification (RFID) solutions group, the OGR, bringing together subject matter experts, academics and technology service providers to identify and develop RFID-based solutions for the upstream.
Shipcom’s flagship ‘Catamaran’ supply chain execution platform captures and routes data from multiple devices such as RFID readers and barcode scanners and integrates it with enterprise systems such as Oracle and SAP. Shipcom also provides development tools for customizing Catamaran. The offering includes an enterprise RFID server, an EPC compliance package and RFID tag testing services for EPC or ISO RFID tags.
The OGR is to create application systems and data standards and educate customers in the adoption of RFID solutions. OGR founder members include Texas A&M University, University of Houston, Motorola, Tyco Electronics, Avery Dennison, and Merlin Concepts & Technology.
The ISA100 standards committee on wireless systems for automation has created a new subcommittee to address convergence of the ISA100.11a and WirelessHART standards. The subcommittee will compare the two protocols, building on the experiences gained with industrial applications of both standards, with an ultimate goal of merging the best of both into a single converged release of the ISA standard.
The move towards wireless connectivity in plant and process control is considered a step change in deployment but is not without controversy. Last September, work on the multi-vendor WirelessHART standard was completed in advance of the ISA 100 wireless initiative. The WirelessHART standard is being promoted by ABB, Emerson, and Siemens.
The ISA subcommittee, which has representation from ExxonMobil and Shell Global Solutions, is concerned that there should be a single industry standard for process applications. The ISA’s current program is for approval and release of the ISA100.11a standard in 2008, followed by an evaluation of WirelessHART prior to convergence.
Pat Schweitzer of ExxonMobil, co-chair of the ISA100 committee, said, ‘Adoption of the ISA100.11a standard in 2008 will be an important step in fulfilling our ISA100 committee mission and of significant value to industry. This new subcommittee is the next logical step to helping industry fully achieve the significant benefits of wireless technology.’
Applied LNG Technologies (ALT) has selected Blue Wireless & Data as the exclusive bandwidth and phone service provider for its liquefied natural gas (LNG) processing facility in Topock, Arizona. Blue Wireless offers a range of IT services for multi-use commercial development areas including Ethernet, wireless internet, VoIP telephone, and IT relocation services.
Kevin Markey, Operations VP with ALT said, ‘The phone service Blue Wireless has put in place allows me to manage operations from Dallas, California, or at home with the simplicity of a 4-digit extension. We are also rolling out automation and monitoring services that will allow us to stay abreast of plant information remotely and to make informed business decisions in an industry that requires up-to-the-minute data at all times.’ Future services will include plant automation, monitoring, remote data access, and video surveillance. A customized online dashboard allows ALT to make business decisions on a real-time basis and eliminate delays in regularly receiving critical reports.
Schlumberger and BT have announced what is claimed as a ‘world first’ wireless broadband service installed on the Byford Dolphin rig in the North Sea for a three month trial. The WiFi service lets offshore workers communicate with friends and family using e-mail, instant messaging and web cam. The service is ‘public’ but not free. Users buy vouchers onboard or subscribe to the service online. The service was installed by Schlumberger Information Solutions using the rig’s main satellite link.
Following the successful trial, Schlumberger is to roll-out the wireless service globally for remote and offshore drilling and production operations. Demetrios Stellas, SIS VP for Digital Infrastructure said, ‘Increasingly, offshore workers expect to have access to the internet services that they can get at home. We believe that this new service will be recognized as a real benefit by oil workers who often work in harsh and difficult environments both offshore and in frontier locations.’
Seeing as all the technology required for rig site WiFi is ping, power and a $40 wireless access point, it is unlikely that this is strictly speaking a ‘world first’!
Halliburton’s Landmark unit has just announced updates to the production optimization components of its DecisionSpace for Production suite. V 2.0 of AssetObserver, a web based production management system, now embeds business intelligence functionality from California-based Incuity Software. The IncuityEMI platform extends AssetObserver’s existing capabilities, allowing it to seamlessly read, update and delete data from third-party or proprietary data sources.
Landmark CTO Chris Usher said, ‘With the addition of the IncuityEMI platform, AssetObserver can now integrate data from more sources and provide greater utility and efficiency to our clients. By introducing this technology in the production optimization space, Landmark is providing a tool for greater insight into production operations that can lead directly to more efficient execution at the asset level and improved returns.’
Incuity was recently acquired by Rockwell Automation (which has also acquired Pavilion Technologies, provider of the platform for Landmark’s AssetSolver). Product manager Kim Sharp told Oil IT Journal, ‘We believe that the acquisition of both of the companies by Rockwell will result in greater synergies between the platforms that will benefit all of the DecisionSpace for Production products.’
Landmark has also migrated one of Halliburton’s service-derived workflows into DecisionSpace. Halliburton’s Sigma process for designing stimulation (frac) jobs has been encapsulated into Sigma-Solver as a new workflow. More from firstname.lastname@example.org.
Jakarta, Indonesia-based upstream operator Medco Energi has deployed a wireless solution to link its field base with several drilling rigs in the Karim Small Fields area in the Sultanate of Oman. Medco uses the network for the transmission of drilling data, Internet and email, with provision for future voice and video. Increasing the challenge, the rigs move around an area of 250 square miles (650 square kilometers) typically every 15-20 days.
Muscat-based Hussam Technology Company designed and supplied the turnkey solution built around MeshDynamics’ MD4000 third generation Structured Mesh wireless solution. The mesh network consists of wireless hops every seven to nine kilometers, comprising both point-to-point and point-to-multipoint links. MeshDynamics MD4000 nodes operate in a wide variety of frequency ranges. Here the 5.8GHz band was used as local telco regulations allowed for higher transmit power in this band.
The MeshDynamics technology provides bandwidths of 36Mbps to 54Mbps for email, Internet and drilling data exchange. The MD4000 family supports up to four radios in a rugged weatherproof enclosure about the size of a hardbound novel, ideal for Medco’s demanding application. Solar power systems were installed at a number of locations to power the mesh nodes.
MeshDynamics CTO Francis daCosta said, ‘Our business continues to grow with natural resources enterprises; customers now include seismic exploration, mining, and petroleum exploration firms worldwide. These markets are growing rapidly with these industries’ increased focus on worker safety and efficiency.’ Other HTC communications solutions leverage Free Space Optics (FSO), millimeter-wave, microwave, WiMAX, WiFi mesh networks, and outdoor and indoor wireless LAN. More from www.meshdynamics.com.
Baker Hughes has signed a long term contract with Kongsberg Intellifield for its SiteCom, Discovery Wells and Real-Time Intelligence modules. Baker Hughes is to leverage the Kongsberg tools in web-based delivery of real-time visualization and analysis of drilling operations. Discovery Wells enables remote visualization of WITSML data streams from the well site. Multiple inputs of time or depth-based, real-time or historical data can be blended and viewed in a configurable graphical user interface. Other data sources such as MWD, LWD, mud, cement, weather, etc. can be incorporated in real-time.
SiteCom, one of the first testbed platforms for WITSML development, is a real-time data management platform built on W3C SOAP technology for cross-platform, web services-based data exchange. SiteCom provides Baker Hughes with an integrated, secure environment for gathering, distributing and managing drilling, formation evaluation, completion and production data.
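The kind of exchange this plumbing carries can be sketched as follows: a query goes to the store over SOAP, a WITSML XML data object comes back, and the client extracts values from it. The response document and helper below are invented for illustration (real WITSML stores expose operations such as WMLS_GetFromStore); the namespace shown follows WITSML 1.3.1 conventions.

```python
# Hedged sketch of the client side of a WITSML exchange: parse a
# well object of the kind a store query might return. The response
# content below is invented for illustration.
import xml.etree.ElementTree as ET

RESPONSE = """<wells xmlns="http://www.witsml.org/schemas/131">
  <well uid="w-001">
    <name>Demo Well 1</name>
    <operator>Demo Operator</operator>
  </well>
</wells>"""

def well_names(xml_text):
    """Pull well names out of a WITSML-style wells document."""
    ns = {"w": "http://www.witsml.org/schemas/131"}
    root = ET.fromstring(xml_text)
    return [w.findtext("w:name", namespaces=ns)
            for w in root.findall("w:well", ns)]
```

Because WITSML objects are plain namespaced XML, any SOAP-capable client can consume them—which is what makes the ‘model deployment’ ambition of interest beyond Baker Hughes.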
The infrastructure is planned as a ‘model’ WITSML deployment that will be of interest to both international and national oil and gas companies interested in deploying their own secure enterprise wide infrastructure. In addition to gathering, distributing, and managing data in real-time, it will allow the additional benefits of standardizing real-time and automated global processes including QC of data, seamless flow of data into interpretation packages, and performance of analytical calculations using real-time data. More from email@example.com.