Living in France, one of my regular reads is the venerable Le Monde newspaper. Something of an institution, Le Monde prides itself on reporting from different parts of the political spectrum. Its coverage of science and technology is less of a forte, and it often makes nonsense of energy-related comment, mixing up kilowatts and kilowatt hours on a regular basis. A recent letter to the editor from a concerned reader led to a response from the illustrious daily—that its journalists mostly had a literary training and found this sort of stuff rather hard. So the kilowatts and kilowatt hours go on being jumbled whenever energy is on the agenda.
It is unfortunate that such lack of technical know-how plagues journalism and gives it a bad name. It also leads to a gut reaction from many against journalists as a breed. Without making too much of a thing of it, I like to think that what sets Oil IT Journal apart from some of the other trade publications is that, in general, we write about stuff that we understand. That is at least what we are striving to achieve here. A ‘stretch goal’ perhaps...
But the journalist is only one part of the information processing food chain. Quality of the output is a function not just of the quality of the processor, but also of the information coming in. As Phil Crouse notes in his eloquent contribution to the ‘two conferences’ debate on page three of this issue, even our learned society conferences are not immune from poor quality information—in the form of ‘blatant promotion of vendor products,’ and this under the guise of a ‘scientific paper.’ Even information emanating from non vendor sources is often suspect—the piece we ran last month on an IDC study of ‘high performance computing’ is a case in point as it was based on the most flimsy evidence and imprecise ring fencing of HPC.
This leads me to another reason that some folks don’t like us ‘journalists.’ This is not because we get it wrong, but sometimes because we get it right. Especially when ‘right’ is not exactly what the marketing department had in mind. How about this for a bold claim from a major upstream software vendor’s website ...
‘Openness is the key to next-generation interpretation workflows. With true openness, there is opportunity for all to derive more value through open technology and leveling the playing field of integration and innovation.’
A bold claim indeed that positively cries out for scrutiny! But before I get down to the payload of this editorial I have to tell you about ... a fairground in a remote part of rural France...
On holiday in the Lot a couple of years back, I visited a fair with a buddy where I came across a big tent packed with folks sitting in front of a long table piled high with cuddly toys, cheap electrical goods, pots, pans and whatever. Neither my buddy nor I was really in the market for this trash, but curiosity made us venture in and take our seats for the show. It was not to be. A couple of big guys sidled up to us to intimate that this was not our place and that we should leave right away! And that if we didn’t like it, we should still leave. We left.
I can only suppose that the bouncers felt that somehow our presence would have adversely affected the pressurized selling of their junk to the old punters. Scrutiny of the proceedings was not on their agenda.
Rejection is a peculiar thing. Even when you know that it really doesn’t matter, it leaves a trace. I can’t say that I have since lost much sleep over this ancient trauma, but I had a couple of flashbacks to this event this year as Oil IT Journal was refused entry to two vendor tradeshows* on the trot. One of the vendors was home to the ‘openness’ snippet I cite above. The other has the following on its website...
[ ... ] announced a set of broad-reaching changes to its technology and business practices to increase the openness of its products and drive greater interoperability, opportunity and choice. These changes are codified into four new interoperability principles and corresponding actions: 1) ensuring open connections; 2) promoting data portability; 3) enhancing support for industry standards; and 4) fostering more open engagement with customers and the industry, including open source communities.
This would be nice if it were true but, like the other vendor quote, it is in need of some critical scrutiny. The reality is that communication with customers and ‘communities’ is usually the subject of restrictive non disclosure agreements and, we have found at least, the tradeshows are not ‘open’ for us.
We really would like to get back in to the closed trade shows. We would like to think that there is still technical merit in what is being said inside the tent. But if the companies that run them prefer to run them like a fairground for country bumpkins that’s OK by us too!
But there is an issue here—even for the marketing department—and that is the extent to which you can maintain a position as a purveyor of scientific learning and at the same time, ‘sell, sell, sell’ your ‘science’ like soap powder. It is a complex equation that neither the vendors nor even the learned societies have solved. What makes a debate scientific—i.e. scrutiny, debate and dissent—is just what marketing departments spend their waking hours trying to avoid.
Writing this while reading of the aftermath of Hurricane Ike, of $5 gas in Houston, with the radio talking about a stock market crash akin to 1929 and with the oil price collapsing to a level near to the newly-inflated cost of production of some fields makes all this seem rather trivial. But to return to the topic of kilowatt hours and the like. If any of you out there relate to the notion that there is a place for informed journalistic comment on other facets of oil and gas technology—or indeed in other verticals, please get in touch. We are hiring...
* Squeamishness, rather than anything else, prevents us from naming them. But nothing is stopping you from googling (inside quotation marks) the first half dozen words from the two citations...
Speaking at the International Geological Congress in Oslo last month, Ian Jackson of the British Geological Survey announced the official roll out of a worldwide digital geological mapping system ‘OneGeology.’ Earth and computer scientists from all over the world have collaborated on the OneGeology project to produce the first digital geological map of the world. OneGeology is the flagship project for the UN International Year of Planet Earth 2008.
OneGeology was made possible thanks to the development of a new geological data standard, GeoSciML, a geology-specific application schema for the Open Geospatial Consortium’s (OGC) Geography Markup Language (GML). This has been combined with another OGC standard, the Web Map Service (WMS), to allow portal access to distributed geological data around the world.
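For readers unfamiliar with the WMS protocol, a client retrieves a map layer by issuing a GetMap request with standard key-value parameters. The sketch below builds such a request in Python; the endpoint URL and layer name are placeholders for illustration, not actual OneGeology service addresses.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint for illustration only.
BASE_URL = "http://example.org/geoserver/wms"

def getmap_url(layer, bbox, width=800, height=600, fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap URL for the given layer and
    bounding box (min lon, min lat, max lon, max lat in EPSG:4326)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return BASE_URL + "?" + urlencode(params)

# Request a (hypothetical) national geological map layer over the UK.
url = getmap_url("GBR_GEOLOGY_625K", (-8.0, 49.0, 2.0, 61.0))
```

Fetching the resulting URL with any HTTP client returns a rendered map image, which is what allows the portal to composite layers served from geological surveys around the world.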
The portal, a battery of virtualized servers, is located at the Bureau de Recherches Géologiques et Minières (BRGM) in Orleans, France. François Robida, Deputy Head of Division, Information Systems and Technologies at the BRGM said, ‘Today you can go to the OneGeology website and get geological maps from across the globe; from an overview of the planet, to larger scale maps of individual nations. You can hop to higher resolution applied maps and data on national web sites.’
OneGeology is also a knowledge transfer exercise, accelerating the development and uptake of the new standard for interoperable geological data. A OneGeology ‘cookbook’ is available to help organizations deliver data, register maps and test conformance. Thirty countries are already serving maps along with metadata. Map layers can be viewed with tools such as Google Earth.
Participating geological surveys can host data on their own servers or on a ‘buddy’ server of an associated geological survey. WMS-produced maps are generally rendered in an image format such as PNG, GIF or JPEG, or occasionally as vector-based graphical elements in Scalable Vector Graphics (SVG) or Web Computer Graphics Metafile (WebCGM) formats.
The OneGeology team is currently working on a Web Feature Service for cross platform geographical query. BGS’s own online geological ‘DiGMapGB’ is served through GeoSciML as a web map service in the OneGeology portal.
BGS is also developing a GeoSciML interface to its GSI3D geological field mapping toolset—and on using the new protocol to transfer borehole information and digital terrain models. We will be reporting from the BGS’ GSI3D conference in next month’s Oil IT Journal. OneGeology went live in August 2008. More from www.onegeology.org and www.youtube.com/OneGeology.
BP presented the results of early tests of Netezza’s ‘Data Intensive Supercomputer’ at an invitation-only event in Houston last month. The benchmark squared off a Netezza Performance Server (NPS) against competing solutions from Teradata, SGI and Dell/Oracle. A Netezza rep intimated to Oil IT Journal that the BP Field of the Future (FotF) test showed a ‘15 to 30 fold’ speedup over the next closest competitor at ‘a fraction of the cost and footprint.’ BP now has two NPS systems running data warehouse applications in support of its FotF operations in the Gulf of Mexico. A Netezza blade comprises a field programmable gate array (FPGA) along with disk, memory and a Power PC. Data is striped across blades such that a query runs simultaneously across hundreds of FPGAs. Filtering data as it streams from storage means that only a small subset of data has to be processed.
The technology is said to be suited to 4D seismic time lapse studies as very large datasets can be compared on the fly. Netezza is currently in talks with the University of Houston’s Mission-Oriented Seismic Research Program (M-OSRP) to set up a seismic test. Netezza has also seen application in other verticals including telcos and homeland security. Netezza’s biggest customer is the NYSE with a 500 TB real time dataset.
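The ‘filter as it streams’ idea behind the architecture can be illustrated in a few lines of Python. This is an illustrative sketch only, not Netezza’s actual FPGA implementation: a predicate is applied to each record as it is read from storage, so only the matching subset ever reaches the downstream query engine.

```python
import csv
import io

# Toy 'storage' holding well pressure records, read as a stream.
SAMPLE = """well,pressure
A-1,3500
A-2,5100
B-7,4900
"""

def stream_filter(reader, predicate):
    """Yield only rows satisfying the predicate, one at a time,
    without ever materializing the full dataset in memory."""
    for row in reader:
        if predicate(row):
            yield row

rows = csv.DictReader(io.StringIO(SAMPLE))
# Only records with pressure above 4000 flow downstream.
high = list(stream_filter(rows, lambda r: int(r["pressure"]) > 4000))
```

In Netezza’s hardware the equivalent filtering happens in the FPGA sitting next to each disk, which is why the host processors see only a small fraction of the raw data.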
Neil McNaughton’s May 2008 editorial (Oil ITJ May 2008) compared PNEC with the Society of Petroleum Engineers’ ‘Intelligent Energy’ tradeshow. I write to respond to the charges of ‘dullness’ and ‘tediousness’ at our PNEC data conference. PNEC works hard on delivering a quality-content driven event. For instance the conference provides a complete set of proceedings—a contrast to the recent SPE Digital Energy Conference in Houston where there were no proceedings, and most papers were commercial.
I know the SPE very well and see it as currently falling in value to its membership. In my opinion it is lacking in in-depth and non-commercial technical presentations at most of its events. This is not only the fault of the vendor/service companies, but also of the large operator companies, who have used the SPE to forward ‘new technology’ when in fact they were blatantly promoting their own products.
Attendees tell us that they want to hear about solutions, developments, and best practices. They know that this sometimes involves talking about specific products. But they don’t want to hear the sales pitch they already got in their own offices!
Our data integration and management events have covered the full upstream, most of which is outside of the domain of the petroleum engineering profession. That being said, PNEC papers do come from petroleum engineers directly involved in data and information management. Recent papers have addressed the need for integration of reservoir and production data. Most engineers are just beginning to realize that they need to be part of a larger integrated effort to achieve good information and knowledge management. I agree with Neil’s analysis that the G&G community has already recognized these problems and that this is where most of the activity in data management really is. But management tends to see data management groups as an expense without an immediate return on investment. This means that data management is experiencing rising demands and relatively little management support—although the tide is changing on this to an extent, driven by $100+ oil and $10+ natural gas. Management in every major company realizes that its tapes are in bad shape and that archiving was not what it should have been (no money, no attention). Storage has been poor, and not all of the data can be accessed immediately. Now companies want to use their data to develop information and knowledge strategies for decision support. Many companies have not even looked at standards and catalog processes to the level and extent they should in the enterprise and the going will be tough!
Just this week, I was visiting with a close friend who is a PE with a major. My friend has been hunting through engineering data and analyses with very limited luck so far - wasting time looking for data and earlier engineering analyses. Such problems point to the integrated data and architecture issue. In practice, the engineering community is really at square one on this.
How does all this stack up in the face of the big ‘crew change?’ Most educated young people are great at Google, computers, PDAs, slick internet communities, and looking for ‘free’ information. They, like the rest of us, would rather do that than trawl through archives of paper, hunt for tapes and so on—all the hard work that must be done. The new generation may be computer literate, but they do not like to rummage through the old stuff for information and knowledge. This is where the ‘wow’ and ‘razzmatazz’ hits the real world—which could be described as ‘dull’ and certainly ‘mundane.’
Senior managements are now feeling the pain of neglected data management policies within the enterprise. Data management has targeted ‘low hanging fruit’ to cut costs and not hurt anything short term. Despite efforts by vendors who tried to sell services to provide full data management, or at least step improvement solutions to most of the operators, only ‘bandage solutions’ were implemented, with significant lay-offs of data management personnel through 2002. All of a sudden, everyone needs people that are no longer out there. The long term cost of those actions is now viciously apparent to our industry.
From within the majors there has been a steady chorus from the front lines that things were not good. Within their own companies, data managers felt disenfranchised. Personnel coming to our events realized they were not alone. Neil is right, PNEC is a generic petroleum data managers user group with no specific agenda except to share best practices and experiences. A core group of attendees has been contributing to the PNEC effort over thirteen years – the group which forms the underpinnings of the petroleum data management community.
Recent comments from Norwegian attendees suggest that Norway has achieved the equivalent of data management’s Holy Grail. I guess if a government-run enterprise forces a single solution then you can claim this. But the reality is that Norway has the same issues of data quality and management as anyone else. Some victories can be claimed, but there are always obstacles out there. Each country has unique issues which complicate global visioning. There is no Holy Grail—which is what makes the sharing of best practices and experiences so beneficial to the industry as a whole.
No, data management is not ‘exciting stuff,’ but it is necessary to achieve the goals of many of the ‘in’ buzzwords like ‘digital oilfield,’ ‘eField’, etc. These concepts pose some of the most complex systems problems that the industry has tried to solve. The stakes run to multiple billions of dollars for the industry. Meanwhile we will continue taking the ‘petroleum data management community’ direction to help show vision and value to all the upstream oil industry. PNEC’s aim has never been ‘razzmatazz’ but providing valuable substance and yes, sometimes substance is ‘dull.’
Zeh Software has signed an agreement with Kazeon Systems, Inc. concerning the resale of Kazeon’s ‘intelligent e-Discovery’ solutions to legal teams in the oil and gas vertical. Kazeon’s Information Access Platform streamlines legal discovery tasks including early case assessment, analytics, processing and document preservation.
The advent of regulations such as the US Federal Rules of Civil Procedure (FRCP) and the new Canadian National Instrument (NI) 31-103 guidelines has forced legal professionals to shorten the time it takes to produce relevant information for e-Discovery and to identify relevant and sensitive data residing on a company’s network. Legal teams in the oil and gas industry worldwide are now required to sift through enormous volumes of electronic information scattered around the enterprise. Kazeon provides ‘defensible and auditable’ eDiscovery capabilities, automating collection, processing, analysis and review.
Jerry Martin, Zeh president and CEO said, ‘Leveraging Kazeon’s technology and Zeh’s expertise in the oil and gas industry will automate and greatly simplify our customers’ e-Discovery processes, reducing the manual tasks and burden that exists today.’
Kazeon’s search, indexing, analysis and workflow automation is used by clients including Fujitsu Siemens, Google, Network Appliance, Oracle and Symantec, increasing visibility and control over electronically stored information.
P2 Energy Solutions’ (P2ES) Tobin unit is to offer a ‘Data on Demand’ service whereby companies can view and order Tobin’s ‘hard copy’ maps. Users can preview maps on the P2ES ‘Data On Demand’ website and place orders for paper, film or Adobe PDF format. PDF maps can be downloaded by ftp while hard copy is shipped overnight. Coverage includes Louisiana, Texas, Mississippi and Oklahoma, with information on legal boundaries, state and county lines, roads, pipelines, land/water forms, subdivisions, current leases and property ownership as recorded at the time of mapping. Wells are posted regularly for the most up-to-date and cost effective maps available.
Petris Technology has announced the launch of PetrisWINDS iShare, its new secure collaboration solution for energy teams. iShare is built on Microsoft’s SharePoint Server and is said to be scalable to multiple companies, users and large datasets. iShare lets teams manage energy-related information assets in the context of online data rooms, due diligence, post merger data transfer and sharing of joint venture data. iShare is said to eliminate ‘inefficient and random processes’ with a simple, feature-rich solution to managing data collections.
iShare can be supplied as a hosted solution, with the data and application residing on Petris’ servers. Alternatively, companies can license the software for in-house deployment. According to Petris, no technical skills or IT expertise is required for the user friendly application. Site setup and support is provided by Petris’ data management experts who can customize the solution to suit company branding and special ‘look and feel’ requirements.
Det Norske Veritas (DNV) is to head up an ambitious automation project for ‘next generation’ capability for oil and gas production from Norway’s high North exploration frontier. Integrated operations in the high north (IOHN) is a collaboration between Norwegian IT contractors, the defense and oil and gas sectors. IOHN is to develop a digital platform for ‘safe and sustainable’ operations in remote, vulnerable and hazardous areas.
Automation is at the heart of IOHN, which is based on the premise that ‘human and organizational’ aspects pose a great challenge to integrated operations. Other challenges include information quality and the poor integration capabilities of both software and business processes.
Thore Langeland, manager of Integrated Operations at the Norwegian Oil Industry Association (OLF) said, ‘Integrated operations are a key element in the future of the oil and gas industry. New technology and new work processes will lead to safer, faster and better decisions. There is potential for considerable value creation and opportunities in new prospective areas.’
The project’s main deliverable is a ‘robust digital infrastructure’ for information exchange. Other components include remote and distributed control of assets and ‘heavily instrumented’ facilities. Pilots include unmanned drilling rigs, HSE in the Arctic regions, sub-ice operations and support for sensor data, information validation and web services. The project will extend and improve the quality of the ISO 15926 based oil and gas ontology and develop an information validation methodology.
The four-year, 90 million NOK ($16 million) program is supported by OLF, the Business Association of Norwegian knowledge- and technology-based enterprises (Abelia), and the Norwegian Defense and Security Industries Association (FSi). Financing is to come from the partners and the Research Council of Norway.
Quorum Business Solutions has announced the release of Quorum TIPS Gathering (QTG). QTG supports the whole business process, from gathering gas at the wellhead, through the complete processing cycle. Quorum’s TIPS flagship software is installed at over 350 plants in North America and accounts for 80% of the gas processed and settled in the US. QTG manages nominations, scheduling, imbalance and invoicing as well as allocation and settlement.
Invensys Process Systems has released a new version of its SimSci-Esscor unit’s InPlant simulator for multiphase flow. InPlant V4.1 incorporates the new SIM4ME Portal that exposes InPlant through a Microsoft Excel interface. SIM4ME provides a bidirectional link between InPlant, the PipePhase and PRO/II simulators, and Excel, allowing developers to create an Excel spreadsheet that is ‘InPlant aware.’ Users can drag and drop multiphase flow model parameters, such as relief valve discharge coefficients and pipe diameters, directly into Excel spreadsheets.
de Groot-Bril (dGB) is to add an open source mapping component to its OpendTect package. OpendTect is now available with embedded mapping functionality using the Generic Mapping Tools (GMT) package from the University of Hawaii.
Calsep has released the first module of its ‘Flowasta’ fully compositional flow assurance simulator for the design and operation of pipelines with possible solids precipitation and deposition. The first Flowasta module simulates hydrate growth in oil and gas pipelines. The next module to be released will be an upgraded version of the ‘Depowax’ wax deposition module.
Blue Marble’s new ‘Geospatial Desktop’ bundles the latest version of its Geographic Transformer and Calculator. The Desktop is driven by ‘GeoCalcXML,’ said to be the largest coordinate conversion library available, and offers tools to streamline data transformation and assure accuracy and data quality. An audit trail function tracks edits to the geodetic parameter data source.
Coade has released CADWorx Plant Design Suite 2009 (CWPD) for process plant design with performance-enhancing features for the AutoCAD-based plant designer. CWPD provides ‘intelligent’ drawing-to-database connectivity and design automation, along with ‘true’ bi-directional links between CAD and engineering analysis tools, linking CADWorx with Coade’s pipe and pressure vessel design tools.
Oildex has announced Spendworks 3 with enhancements to its ePayable platform for energy companies. Spendworks 3 provides ‘one-click’ access to invoice processing tools and access to business intelligence. TransZap’s Oildex e-commerce exchange serves over 4,200 companies and 44,000 users. See our interview with Oildex CEO Peter Flanagan in the February 2006 issue of Oil IT Journal.
Scandpower Petroleum Technology (SPT Group) has released V6.0 of its Olga flow simulator. The new Olga RocX module allows for dynamic interaction between the near-wellbore and the well to be included in the simulation.
P2 Energy Solutions (P2ES) has released Enterprise Land Lease Acquisition V 1.0, a new data capture and reporting package for lease, mineral and surface ownership information based on legal land descriptions. The broker management system is said to focus on managing field operations from a company’s perspective. Brokers connect to the centralized database over a secure connection and input data in a standard format. Information is immediately available to land managers.
Meridium has announced ProAct logic tree knowledge management templates, a new functionality in its Asset Performance Management (APM) suite. The templates were developed by Reliability Center, a Meridium integration partner and are now available for root cause analysis. The templates reflect experience garnered from some 800 field investigations that have been reviewed and compiled into a single RCA knowledge base. Meridium clients include Chevron, Marathon Oil and Xcel Energy.
The Research Partnership to Secure Energy for America (RPSEA), a consortium of some 130 US oils, suppliers and universities has selected the Society of Exploration Geophysicists’ Advanced Modeling Corp. (SEAM) for ‘project 2007DW2001’ a.k.a ‘geophysical modeling for studying acquisition and processing methods in the deepwater Gulf of Mexico.’ Funds of ‘up to’ $2 million will be awarded on a cost-share basis.
SEAM was announced at last year’s SEG Convention (OITJ October 2007) with the intent of creating synthetic data sets for algorithm testing—with the results of the modeling to be eventually available to the SEG membership ‘at a nominal cost.’ SEAM Phase I concerns the generation of a synthetic data set from subsurface geological models that are ‘at a level of complexity and size that cannot be practicably computed by any single company.’ SEAM is chaired by Kevin Bishop of BHP Billiton.
RPSEA’s mission is to provide stewardship in ensuring the focused research, development and deployment of ‘safe, environmentally sensitive technology that can effectively deliver hydrocarbons from domestic resources to US citizens.’ RPSEA President Mike Ming said, ‘Accelerating the time to first production and building the intellectual capability in the research community for these strategically important resources is vital to meet the nation’s energy needs.’
Funding for the projects is provided through the Ultra-Deepwater and Unconventional Natural Gas and Other Petroleum Resources Research and Development Program authorized by the Energy Policy Act of 2005 with funding from lease bonus and royalties paid by industry to produce oil and gas on federal lands. RPSEA is under contract with the US Department of Energy’s National Energy Technology Laboratory to administer the program.
The API’s e-Commerce Committee (PIDX) International North Sea Forum was held in Aberdeen, the largest regional forum to date with some 50 attendees from operators and suppliers. Chris Welsh, Eirô Consulting and Chairman of PIDX Europe outlined PIDX history and achievements since its inception in 1987. Since then, PIDX has evolved from Electronic Data Interchange (EDI) to support XML-based exchange of commercial data via the API Recommended Practice 3901 e-commerce transaction standards. These allow upstream petroleum industry trading partners to interoperate using XML. PIDX 2007 milestones include the approval of the downstream inventory standard which covers product inventory balances exchange between shipping facilities. The petroleum industry data dictionary (PIDD) now sports a ‘wiki’ on www.oilit.com/links/0809_3.
Jana Schey (Energistics) reported on PIDX standards and guidelines activity. An HSE reporting workgroup has been formed to streamline safety data reporting which is to develop an XML format for training performance data used in vendor qualification and reporting. The API RP 76 standard was published last year. Since then PIDX XML messaging and an Excel template have been approved. The PIDX global business practices workgroup is documenting best practices for global e-commerce on a country-specific basis to overcome global implementation challenges such as variations in legal, fiscal and business practices. A PIDX ‘REGS’ regulatory user group has been formed to develop e-Regulatory data interchange standards with involvement from Energistics, the US Ground Water Protection Council and other regulators. E-permitting with the new WITSML regulatory standards will be ready in 2009.
Paul O’Shaughnessy provided an update on Chevron’s eProcurement implementation which in 2008 supported a $20 billion spend on almost 2 million transactions and 17,000 users. eProcurement helps Chevron redirect spend to preferred suppliers and ensures that contract pricing with early pay discounts is adhered to. PIDX standards reduced implementation effort. Now 98% of supplier invoices are processed automatically and transaction integrity is assured ‘from delivery through acknowledgement.’ The transaction count has grown steadily from 4k in 2003 to 400k in 2007. In the same period, PIDX-supported spend has grown from $30 million to around $2.4 billion.
John Boardman (Hubwoo) described the PIDX Complex Service Procurement (CSP) initiative involving OFS Portal, Hubwoo and major operators. The current solution allows supplier invoices to be processed into SAP. Hubwoo’s CSP solution streamlines the purchase-to-pay process by synchronizing service entries and invoices, ensuring contract compliance. CSP provides connectivity to a large number of suppliers’ systems with ‘end-to-end’ exchange of electronic orders, responses, field tickets and invoices. Centralized event management and monitoring is provided by Hubwoo’s ‘Commerster.’ Service entries to SAP are generated via SAPConnect (Xi connector). A pilot between Shell and Schlumberger targets unplanned services, leveraging PIDX XML documents with attachment handling and display in SAP R/3 Service Entry with discrepancy checking of price, part number, units of measure, currency and quantity. Boardman noted that contract analysis needs to be done in order to determine the most suitable implementation targets – not all contracts are candidates for complex services.
Harald Gellein told how Statoil invited Halliburton to participate in the ‘Elektronisk Boring og Brønn’ (eBob) project in May 2005. The aim was for automated exchange of commercial documents between Statoil and suppliers to reduce costs on both sides. The first electronic invoice went through the system in September 2006 and the first service purchase order in January 2008. Problems encountered included constant SAP upgrades requiring design freezes, evolving contracts and pricing models and mergers and acquisitions. Today, 90% of all invoices are electronic, representing some 200 invoices and 1,500 pages of scanned support documents per month. StatoilHydro is Halliburton’s 2nd largest e-business customer globally.
Guy Brouaux noted that Total’s eSourcing spend totaled €4.8 billion in 2007, about 20% of Total’s overall spend. eSourcing concerns 2,000 active users on five continents working in all purchasing categories: E&P, raw materials, logistics, services, maintenance, IT, general services, intellectual services etc. Total is targeting a €6.5 billion e-spend by 2010 (25% of group spend). Spend analysis is performed in the ‘RE@CT’ project with a database of some 50,000 suppliers.
Jean-Pierre Foehn described operational improvement using PIDX Standards with a case history of converting a supplier’s CSV* invoices to PIDX format. The supplier was keen to align with the standards that its main clients were using but did not want disruption to its business and had no PIDX expertise. Amalto’s ‘B2Box’ was deployed to convert in-house generated field tickets in CSV to PIDX for consumption in client systems such as ConocoPhillips AS2, Chevron’s RNIF and third party exchanges such as Digital Oilfield. Foehn noted that there is ‘more to this than meets the eye.’ Challenges include managing networks and firewalls, managing security certificates and document transport protocols. According to Foehn, operators also benefit from using the Amalto B2Box as hub. For Foehn, ‘PIDX Standards and end to end automation of exchanges lead to greater procurement efficiencies and improved compliance and reporting.’
* Comma separated values—a standard way of exporting data from Excel in ASCII format.
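The conversion step Foehn describes can be sketched in a few lines: read each row of a CSV field ticket and emit it as an XML line item. This is a minimal illustration only; the element names below are hypothetical, and the real PIDX invoice schema defines its own namespaces, element names and validation rules.

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_ticket_to_xml(csv_text):
    """Convert a CSV field ticket to a PIDX-style XML invoice.

    Element names are illustrative only -- the actual PIDX schema
    is namespaced and considerably richer than this sketch.
    """
    root = ET.Element("Invoice")
    lines = ET.SubElement(root, "InvoiceLineItems")
    for row in csv.DictReader(io.StringIO(csv_text)):
        item = ET.SubElement(lines, "InvoiceLineItem")
        for field, value in row.items():
            # each CSV column becomes a child element of the line item
            ET.SubElement(item, field).text = value
    return ET.tostring(root, encoding="unicode")

ticket = "Description,Quantity,UnitPrice\nCasing crew,8,120.00\n"
print(csv_ticket_to_xml(ticket))
```

In a production exchange the output would then be validated against the PIDX schema and shipped over a transport such as AS2, which is where the firewall and certificate challenges Foehn mentions come in.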
Independent Data Services has inked a three year extension to its contract with Premier Oil’s drilling division. Premier has used IDS’ rigsite data communications infrastructure since 1998. The new contract sees the implementation of IDS’ DataNet2 ‘rich internet applications.’ Tim Hanson, Premier’s group drilling manager, said, ‘IDS has set the corporate standard for drilling data reporting for Premier for the past decade. Adopting DataNet2 is a logical step. IDS has the best supported software in the industry, with technical back-up available 24/7.’ Initially Premier is to deploy IDS’ DrillNet (drilling) and GeoNet (geological) reporting services while trialing the StockNet inventory and asset management tool in Vietnam.
Qatar Petroleum (QP) is expanding its use of EnergySolutions’ software with a network license for PipelineStudio, the pipeline design and off-line simulation package. QP has been using PipelineStudio for gas applications for a while and has added the liquids module. PipelineStudio combines graphical configuration and reporting tools with a simulation engine that provides ‘robust and reliable’ solutions to steady-state and transient problems.
Offshore Hydrocarbon Mapping Group’s Rock Solid Images (RSI) unit reports service deals with Vanco Energy Company covering rock physics modeling, well and seismic data conditioning and prestack reservoir property inversion. Ron Wallace, Vanco VP exploration, said, ‘RSI’s rigorous rock physics analysis can reduce risk in rank exploration areas such as our deepwater West African and Ukrainian portfolios.’ RSI has also signed with a Kosmos Energy/Tullow joint venture to provide rock physics-driven reservoir property inversion services on the Jubilee Field, offshore Ghana.
Transnet Ltd. has awarded Siemens Energy Sector a €40 million contract to supply automation equipment for the ‘NMPP’ multiproduct pipeline in South Africa. The 700km pipeline will transport gasoline, diesel and jet fuel from Durban to the country’s industrial and business center, Gauteng. The project includes master and backup control centers and automation solutions for pumping stations. The deal includes SCADA, telecoms, automation and management information and security systems. The NMPP is expected to become operational in 2010.
SpectrumData has signed a contract with an unnamed Malaysian client to transcribe some 10,000 legacy magnetic tapes and migrate the data to modern high density media. The contract means that SpectrumData will expand its Malaysia operation with a new facility in Kuala Lumpur. Project scope includes recovery and transcription of legacy media such as 9-track tape and 3480 and 3590 cartridges. Seismic data currently stored in the RODE format will be ‘de-encapsulated’ and converted to the ‘more readily accessible’ SEGY format.
William Sellick, a student at Robert Gordon University in Aberdeen, has completed a dissertation on ‘a software viewer for WITSML real-time data.’ The project, sponsored by Independent Data Services (IDS), evaluated the viability of selected technologies for future development by IDS with a prototype web-based system for the collection and display of real-time data.
Technologies under investigation included Adobe’s ‘Flex’ rich internet application framework and Red Hat’s open source ‘JBoss’ application server. These were used to develop a test application accessing the real-time data elements in Energistics’ WITSML drilling data standard. The paper provides a good general introduction to programming the WITSML API and the SOAP web services infrastructure, but the main topic of Sellick’s investigation is the use of ‘rich internet applications’ (RIA). RIA in this context means ‘Web 2.0’ style live data tables and graphing tools running in a web browser.
Although Sellick mentions the DHTML approach to RIA, the study focused on the use of Adobe’s Flash runtime. Flex 2 is an API for Flash development that runs in the Eclipse integrated development environment (Java was used throughout the project). Flex Builder uses the MXML markup language which, along with ‘ActionScript’ code, is compiled to executable Flash content. JBoss and JAXB (Java Architecture for XML Binding) were used to process WITSML documents.
The pilot resulted in 2D and 3D representation of drilling parameters updated in real time via WITSML’s publish/subscribe mechanism. Tabbed controls act as data selectors and a ‘knowledge bubble’ provides contextual information in tabular form. An XML configuration file contains business logic and allows updates and new features to be added as required.
Graph rendering in Flex automatically updates when new data is added to its data source. Once the user has subscribed to a specific topic the system automatically connects all the components required for the stream and waits for new data to arrive.
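The publish/subscribe flow described above, in which every chart or table bound to a topic refreshes automatically as new data arrives, is essentially the observer pattern. A minimal sketch follows; the topic name and payload are hypothetical, and the actual WITSML mechanism delivers data over SOAP rather than in-process callbacks.

```python
from collections import defaultdict

class RealTimeHub:
    """Observer-pattern sketch of a publish/subscribe data stream.

    In Sellick's pilot the subscriber components are Flex charts and
    tables; here they are simply Python callables.
    """

    def __init__(self):
        self._listeners = defaultdict(list)

    def subscribe(self, topic, component):
        # a chart or table registers its update callback for a topic
        self._listeners[topic].append(component)

    def publish(self, topic, payload):
        # server push: every component bound to the topic refreshes
        for component in self._listeners[topic]:
            component(payload)

hub = RealTimeHub()
readings = []                       # stand-in for a live data table
hub.subscribe("depth-vs-rop", readings.append)
hub.publish("depth-vs-rop", {"depth": 1520.5, "rop": 32.1})
```

This also illustrates the scalability result below: adding topics is cheap (another dictionary key), while every extra client adds a connection the server must service on each publish.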
Sellick investigated system response times and scalability by publishing ten topics of data on the server and allowing a set of clients to subscribe to a single topic. While the addition of multiple topics did not stress the server, the number of clients did, with the system becoming unstable with over a thousand connections.
Sellick reported a ‘steep learning curve’ for the JBoss server, although it did allow all the requirements to be successfully implemented, providing an extendable system for future development. Adobe Flex was deemed a powerful tool for the development of rich internet applications, albeit limited by the requirement for a Flash browser plug-in. WITSML development also proved challenging, but once the infrastructure was created the system performed as advertised. Feedback from IDS deemed the prototype interface ‘rough and ready,’ but noted that interaction was good and that the user interface showed the kind of RIA features IDS was looking for. IDS plans to use the real-time data extraction and handling components in its DataNet reporting application and is looking further at 3D visualization. Sellick received the Francis Morrison award for the best individual software project and the SAIC Prize for academic achievement. Read the full text of Sellick’s paper at www.oilit.com/papers/sellick.pdf.
Stephen Bumgardner has been promoted to project manager with Advanced Reservoir International. Prior to joining ARI, Bumgardner was with Texaco E&P.
Guillermo García has been appointed business development manager of ArkEx’ new Houston-based Latin America unit. García was previously with Halliburton.
Guy Johnson has been appointed sales manager, Western Canada with BJ Services process and pipeline unit.
Mark Collins is to head up CGGVeritas’ new UK ‘Center of Excellence.’ The center hosts R&D teams and seismic processing groups working on the ‘fast and efficient deployment’ of new technologies.
Computer Modelling Group has appointed John Kalman VP Finance. Kalman hails from Grand Banks Energy.
Wim van Loon is to manage Emerson Process Management’s new engineering center in Oslo, Norway. The center will provide offshore front end engineering services and project management services.
Geoservices has appointed Nicolas Malgrain VP Mergers & Acquisitions and to its executive committee. Malgrain was previously with Deloitte & Touche.
Jean-Paul Roux has joined Geovariances as commercial and marketing director.
The University of Houston is to create a Digital Oilfield degree.
Geophysicist Elizabeth Diaz has joined Ingrain to work on computational methods and Patricia Pastana de Lugão is to head up the company’s new office in Brazil.
Calvin Treacy is to be replaced by Chris Staples as CEO of Intellection. Chairman Barry Hilson has also resigned and is to be replaced by Richard Osborne.
Ezat Zarasvand has been named general manager of The Information Store’s (iStore) new Middle East office in Abu Dhabi. Zarasvand was previously with Oracle Corp.
Paradigm has appointed Richard Ward as regional VP, China. Ward was previously with the Chinese geophysical supplier BGP.
John Wearing has joined Petris as VP with responsibility for product management and marketing. Wearing was previously with Paradigm.
Dan Colby has resigned from the Pipeline Open Data Standards Association. He is replaced by Keith Chambless, GeoFields.
Quorum Business Solutions has hired Bruce Wallace to its PGAS Measurement practice. Wallace was previously director of measurement for Regency Gas Services.
Ryder Scott has hired petroleum engineers Allan Chen (previously with Formosa Petrochemical), Bob Paradiso (Devon) and Steve Hudson (Chevron).
Seismic Micro Technology has opened new offices in Calgary (with Brian Kulbaba as account manager) and Moscow (Nikolay Kutsenko as country manager).
Ron Silva has joined Oslo-based Spectrum ASA group as technical manager.
WellPoint Systems has appointed Carrie Manion as SVP sales and services and Mike Weiss as SVP software and technology. Manion was previously with Bolo Systems. Weiss was previously R&D director with Halliburton’s Digital & Consulting Solutions unit.
Schlumberger’s Oilfield Glossary (www.glossary.oilfield.slb.com) celebrates its 10th anniversary this month. In the past decade it has grown from around 500 terms to its current authoritative 4,600 definitions.
Avatar Systems has completed its acquisition of Questa Software Systems, Inc. in a $2.2 million cash and paper deal. The acquisition adds 256 oil and gas companies to Avatar’s current customer base of 360 companies and will be immediately accretive to fiscal 2006 earnings. It is expected to increase Avatar’s Earnings per Share 300% to 400% over the next 12 months. Avatar will retain Questa employees and offices located in Midland, TX.
Aker’s Solutions and Well Services units have signed a master service agreement with Sensornet for the provision of an enhanced well intervention flow profiling service. The deal is ‘potentially’ worth over 2 million. Aker’s well service unit will use its logging, tractor and well control expertise to deploy Sensornet’s ‘FibreDip’ system, which provides real-time flow profiling in multilayered reservoirs.
Divestco has released information about several 2007 acquisitions viz: BlueGrouse Seismic ($CDN 38.5 million), Veritas Energy Services’ Geomatics business unit ($CDN 3.2 million) and Spectrum Seismic Processing ($CDN 1.9 million). Software sales amounted to approximately 7% of Divestco’s 2007 $CDN 116 million revenue.
Meanwhile IHS has acquired Divestco USA’s product portfolio for $3 million in cash. The portfolio, essentially the old Petro Data Source assets, includes drilling, land and production data.
Det Norske Veritas’ (DNV) software unit has acquired Jardine Technology. Target of the acquisition is Jardine’s performance forecasting and optimization product line, which will be sold to the oil and gas, refining and petrochemical verticals. Flagship products MAROS (upstream oil and gas) and TARO (refining and petrochemicals) will complement DNV’s Safeti package for risk assessment, consequence modeling and management systems assessment. Founder Iain Jardine is leaving the company to pursue opportunities in other markets.
Fugro has acquired UK-based Phoenix Data Solutions, an upstream data management boutique with 2007 turnover of €1.4 million. Phoenix, which provides digital reconstruction of seismic images, will be integrated into Fugro’s Data Solutions division. Fugro has also bought NexTerra Geophysical Solutions of Kolkata, India and SureSpek ISS Pty Ltd. of Australia.
Seismic specialist Geokinetics Inc. has raised $30.0 million from the sale of additional shares of its existing Series B Senior Convertible Preferred Stock to fund growth initiatives and provide working capital. The buyers were Avista Capital Partners. Proceeds will be invested in capital equipment in response to ‘strong customer demand’ for seismic data acquisition and processing services.
TGS-Nopec has terminated its plan to merge with Wavefield-Inseis and is claiming compensation following a protracted dispute.
Telvent has joined the ‘Bandolier’ project that is researching SCADA system security. Bandolier, spearheaded by control system security specialist Digital Bond, is a component of the US Department of Energy’s National Energy Technology Laboratory’s cyber security audit and attack detection program. The project is documenting best security practice configurations for control system application components—such as HMIs, historians, and real-time servers. An alpha version of the security audit template for Telvent’s OASyS DNA SCADA system has already been released.
While traditional ‘active penetration’ security scanning techniques can result in a system crash, Bandolier’s signature files check the system against a known configuration, identifying any variance in settings. ‘Non-invasive’ mechanisms determine if the target system meets the supplied standard. Asset owners can safely use the audit file at initial deployment to verify a secure installation and periodically over time to determine if the security posture of the control system has been modified.
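The non-invasive audit idea can be sketched as a simple comparison of a system’s live settings against a known-good baseline, with no active probing of the target. The setting names below are made up for illustration; this is not the actual Bandolier/Nessus audit file format.

```python
def audit_config(baseline, current):
    """Report every setting that deviates from the known-good baseline.

    Returns a list of (setting, expected, actual) tuples. Unlike an
    active penetration scan, this only reads configuration values, so
    it cannot crash the system under audit.
    """
    findings = []
    for key, expected in baseline.items():
        actual = current.get(key, "<missing>")
        if actual != expected:
            findings.append((key, expected, actual))
    return findings

# hypothetical SCADA host settings
baseline = {"telnet_enabled": "no", "min_password_len": "8"}
current = {"telnet_enabled": "yes", "min_password_len": "8"}
print(audit_config(baseline, current))  # -> [('telnet_enabled', 'no', 'yes')]
```

Run at commissioning, an empty findings list verifies a secure installation; run periodically, any findings flag drift in the control system’s security posture.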
Digital Bond’s Jason Holcomb said, ‘Vendor support for the audit templates is key not only to developing effective files but also to adoption. Telvent takes security very seriously and has provided resources and is sharing its lab, a Windows domain controller and all the system components under test.’ Bandolier is also to generate audit files that can be used with Tenable Network Security’s ‘Nessus’ vulnerability scanners.
BP Exploration Angola has awarded Yokogawa Electric Corp. a frame agreement in connection with its Floating Production Storage and Offloading (FPSO) Program in Angola. The award covers project management, engineering, operation and maintenance services for an integrated control and safety system (ICSS). The ICSS will be built around Yokogawa’s Centum VP integrated production control system, the ProSafe-RS safety instrumented system, the Exaquantum integrated information system and the OmegaLand operator training system.
The ICSS will provide fully integrated and seamless control and safety functions for the subsea, marine, hull, and topside facilities of FPSO vessels along with a single interface allowing operators to start, control, and monitor all facilities from a central control room. Foundation Fieldbus technology will be used for advanced diagnostics. BP Angola is currently considering the development of up to four new offshore oil fields.
The PetroTrek event management solution (EMS) from The Information Store (iStore) provides a ‘complete top-to-bottom view’ of people, assets and events that impact petroleum exploration and production. EMS protects people and assets during routine operations and in the face of a storm. A web-based map improves situational awareness by placing information in context, supporting a wide range of event response and planning scenarios.
EMS connects diverse data sources including GIS data, wells and facilities, HR records, weather tracking and satellite imagery. The EMS map visualizes fields, pipelines, platforms and the operational context of people and assets. Users can drill down to production bubble plots for a whole field or individual wells.
EMS’ hurricane analysis tools let users track storm paths and impact probability regions. The system targets response planning and training with scenarios generated from archived weather data. EMS leverages iStore’s ‘PetroTrek’ service oriented architecture to combine commercial or proprietary data sources, including SQL Server, Oracle, PeopleSoft and Impact Weather.
N4 Systems, a provider of automated inspection and real-time safety compliance management software, has announced the ‘Field ID Safety Network.’ The Field ID Safety Network (FIDSN) connects stakeholders including manufacturers, distributors, inspectors and end users. N4 Systems CEO Somen Mondal said, ‘Traceability is vital in safety compliance. The FIDSN eliminates guesswork, reduces errors and liability inherent with paper-based compliance and inspection management. The Network will create safer workplaces and prevent accidents.’
Traditional safety compliance relies on an ‘unmanageable’ paper trail of data from manufacturers, third party inspectors, distributors and end users. The FIDSN connects all parties in an automated, electronic process. PeakWorks, a personal protection equipment (PPE) provider, has joined the FIDSN, adding safety traceability to its fall protection equipment. PeakWorks’ equipment will embed N4 Systems’ Field ID RFID tags and achieve ANSI Z359.1-2007 compliance.
Apache Corp. has selected OpenLink Financial’s Endur to manage its US marketing activities. Endur is a ‘front, middle, and back-office’ solution for trading, risk management and operations for the energy sector. Apache is implementing Endur for end-to-end management of physical crude oil, natural gas, and natural gas liquid transactions and will be working with OpenLink to extend Endur’s upstream functionality into contract administration, producer services, scheduling and midstream.
Apache VP Janine McArdle said, ‘We expect OpenLink’s system will enable us to process transactions related to our increasing production volumes, analyze and manage data more efficiently and provide controls that improve our economics and return value to our shareholders.’ OpenLink CEO Kevin Hesselbirg added, ‘We are pleased to be working with Apache to address the specific needs of the E&P community with the addition of wellhead producer services.’
The Oil Price Information Service (Opis), a ‘comprehensive’ source for petroleum pricing and news information has launched ‘iGas’ an iPhone application that is claimed to ‘help millions of users shop for the cheapest fuel.’ Opis CIO Michael Sinsky said, ‘The iPhone changes the game for consumers when shopping for the best prices on goods and services. The immediacy of being able to use the iPhone to perform targeted internet queries anywhere allows people to optimize their purchasing dollar.’
Users needing to fill-up simply touch the iGas icon on their iPhone and the phone’s GPS system locates their position and returns the 10 cheapest fueling stations in the area. The actual prices are displayed in low-to-high order, along with the brand and address. Users can also enter a zip code and the phone will return the 10 cheapest fueling stations in that particular area. Customers have the option to pre-select their fuel search criteria: Regular Unleaded Gasoline, Premium Unleaded or Diesel. iGas also provides route maps to a chosen gas station. iGas is a download from the Apple Store.
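The lookup iGas performs can be sketched as a radius filter around the user’s GPS position followed by a price sort. The station tuples below are invented for illustration; the real Opis price feed is proprietary and structured differently.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance between two GPS points, in kilometres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def cheapest_nearby(stations, lat, lon, radius_km=10, limit=10):
    """Return up to `limit` stations within `radius_km`, cheapest first.

    Each station is a hypothetical (name, lat, lon, price) tuple.
    """
    nearby = [s for s in stations
              if haversine_km(lat, lon, s[1], s[2]) <= radius_km]
    return sorted(nearby, key=lambda s: s[3])[:limit]

stations = [("A", 29.76, -95.37, 3.45), ("B", 29.77, -95.36, 3.29),
            ("C", 30.50, -95.00, 2.99)]  # C is cheap but far away
print(cheapest_nearby(stations, 29.76, -95.37))
```

Station C is excluded despite its low price because it falls outside the search radius; the remainder are listed in low-to-high price order, as the app displays them.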
The SEG technical standards committee has released the final draft of the SEG-D Rev 3 acquisition standard. The revision supports sample resolution beyond 24 bits and passive ‘interferometer’ recording, with an ‘extended’ recording mode of up to 1,628 days of data in one record. Records can now carry microsecond-accurate GPS time stamps. An appendix includes examples of how to store SEG-D Rev 3 files on disk. The committee is currently looking at a ‘roadmap’ for electromagnetic (EM) recording standards development and further tweaks to the SEG-Y format. The SEG committee is also working with OGP to align positional data standards from acquisition to interpretation, leveraging the EPSG positional database. More from www.oilit.com/links/0809_1.
The International Standards Organization has released a new standard for the quality management of IT systems and software engineering. The ISO/IEC TR 90005:2008 standard for systems engineering, a.k.a. ‘guidelines for the application of ISO 9001 to system life cycle processes,’ is designed to extend the quality management methodology of ISO 9001:2000 to the acquisition, supply, development, operation and maintenance of IT systems and related support services. Work group lead Shigenobu Katoh said, ‘ISO/IEC 15288:2002 is a starting point for system development, operation or maintenance. This portfolio of technology-independent, generic processes optimizes management of the whole product lifecycle in any sector.’ More from www.iso.org.
The Public Petroleum Data Model Association is considering extending its data model to include oil sands. Another workgroup has been set up to ‘develop information about well components.’ The new workgroup is to operate under the enigmatic banner of ‘What is a Well?’ More from www.ppdm.org.
Felix Herrmann, director of the University of British Columbia’s Seismic Laboratory for Imaging and Modeling (SLIM), has announced the availability of the ‘SLIMpy’ Python interface to seismic data processing packages such as Madagascar. SLIMpy is a scripting language for programming iterative algorithms from numerical linear algebra on top of tools originally designed for batch processing. The current implementation supports a plugin for Madagascar’s out-of-core Unix pipe-based applications and is extendable to pipe-based collections of programs such as Seismic Un*x, SEPLib and FreeUSP. SLIMpy is academic research code that UBC is releasing under the GNU Lesser General Public License. Herrmann hopes that the release as open source code will help create an active community to further develop the software. More from www.oilit.com/links/0809_2.
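The composition style SLIMpy wraps, chaining out-of-core, pipe-based programs so that each stage streams into the next without holding the dataset in memory, can be sketched with Python’s subprocess module. The Madagascar-style stage commands in the usage comment are placeholders, not real invocations.

```python
import subprocess

def run_pipeline(stages, infile, outfile):
    """Chain pipe-based command-line programs, Unix-pipe style.

    Each stage reads the previous stage's stdout; the first reads
    from `infile` and the last writes to `outfile`. Data streams
    through the pipeline rather than being materialized in memory.
    """
    procs = []
    with open(infile, "rb") as src, open(outfile, "wb") as dst:
        stdin = src
        for i, cmd in enumerate(stages):
            # last stage writes the output file, others feed a pipe
            stdout = dst if i == len(stages) - 1 else subprocess.PIPE
            p = subprocess.Popen(cmd, stdin=stdin, stdout=stdout)
            stdin = p.stdout
            procs.append(p)
        for p in procs:
            p.wait()

# hypothetical Madagascar-style usage:
# run_pipeline([["sfbandpass", "fhi=60"], ["sfsort"]], "in.rsf", "out.rsf")
```

SLIMpy’s contribution is to hide this plumbing behind linear-algebra operator syntax, so an iterative solver can treat a chain of such programs as a single matrix-vector operation.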
The ‘Information Overload Research Group’ has been formed to ‘make the business case for fighting information overload’ considered a ‘growing productivity problem.’ Founding members include Microsoft, Google, IBM, Intel, and Xerox. More from www.iorg.org.
A new division of League City, TX-based ERF Wireless has been created to address ‘growing demand’ in the oil and gas industry for wireless products and services. The new unit will leverage ERF’s expanding wireless broadband networks in Texas, New Mexico and Louisiana to provide specialized products and services to oil and gas customers.
ERF CEO Dean Cubley said, ‘Oil and gas is looking for a way to move from its traditional low bandwidth, high cost satellite-based connectivity to true high-speed broadband for field operations at a reasonable cost. ERF’s acquisition of wireless internet service provider networks created an extensive wireless footprint covering a large percentage of the most active oil and gas exploration, drilling and production regions in Texas, New Mexico and Louisiana.’
New unit head John Nagel added, ‘Our specialized products and services provide a communication pipeline large enough to accommodate massive exploration and production data feeds at great savings compared to satellite. Oil and gas wireless connectivity supports drilling, remote field offices, video surveillance for production facilities and pipelines and HSE monitoring. We also provide radio tower construction and tower climbing, wireless network design, remote network monitoring, and surveillance equipment installation.’
A new release of Aveva Net (Version 3.6) adds real time data access to Aveva’s Portal. The new technology allows for real-time video feeds, instrumentation monitoring and production data to be viewed alongside associated data such as equipment maintenance records and manufacturer’s specifications. The real time extensions were developed through collaboration with partner companies Data Systems and Solutions and ISS Group which has added management and monitoring of plant information and KPIs. Process information from SCADA and DCS sources is now visible thanks to ISS’ ‘Babelfish’ gateway.
The new release adds collaboration and work process management capabilities such that clients can develop their own solutions integrating processes and information flows. Use cases include workflow and configuration management, impact assessment, management of change, and a range of other features designed to ‘ensure information integrity and optimize business processes.’
Executive VP Derek Middlemas said, ‘Aveva Net applications can now interpret and enforce change management, flagging-up the consequences of failing components or challenging the assignment of a work package to an under-qualified engineer.’
IBM has developed a ‘state-of-the-art’ information exchange for the European Chemical Industry’s ‘Cefic’ trade body. The substance information exchange forum (SIEFreach) has been developed to facilitate the exchange of information between chemical companies submitting registration dossiers required under the EU ‘Registration, evaluation, authorization and restriction of chemicals’ (REACH) regulations. Around 30,000 substances have to be registered with the European Chemicals Agency by 2018. SIEF lets companies exchange tests and other data on identical chemical substances, saving time and cutting costs.
SIEFreach is a Web 2.0/SOA development based on Lotus Quickr (the collaboration tool) and a Websphere Portal. The system uses an SOA based web services implementation of IBM’s ePayment system for online transaction processing.
Development tools included Struts-based portlet applications, Lotus scripting and Java 2. The system also leverages a DOM XML validation and parsing framework for REACH IT XML data integration. SIEFreach went live on 21 August. More from www.siefreach.com.
Oil and gas logistics management solution provider Entessa, Inc. has delivered a major upgrade to the marine modules of its Synthesis supply chain management solution to Hovensa, a joint venture of Hess Corp. and Petroleos de Venezuela, (PDVSA). The upgrade streamlines the process of vessel, cargo, and dock scheduling at the US Virgin Islands-based Hovensa refinery marine terminal, one of the largest refineries in the world with a 525,000 bopd capacity.
Synthesis underpins Hovensa’s vessel positioning system. Release 4.0 brings enhanced functionality in loss control monitoring, dock utilization tracking, vessel turnaround analysis and demurrage, ensuring that marine operations are fully integrated with the refinery’s production planning process. State-of-the-art oil measurement tools, data formulation and reporting capabilities are provided through an improved cargo dashboard with real-time metrics. Hovensa’s Bob Williams said, ‘Entessa helps us run our cargo operations more efficiently. This new release combines functionality with management capability and use of the system’s analytical tools.’
Wyoming-based Aerial Coalition Technologies (ACT) has boosted its airborne corridor mapping capabilities with the acquisition of a FLIR U8000e GasFind infra red camera coupled to Red Hen Systems’ (RHS) GPS-aware recording system and MediaMapper Server (MMS). ACT provides turnkey services for disaster response and pipeline right of way inspection and leak detection. Deliverables include geo-referenced video, audio and high resolution still imagery—all stored on the RHS MediaMapper Server.
MediaMapper Server integrates geospatial multimedia libraries and relational data to be served through ESRI GIS products—making it available throughout the enterprise and over the internet. According to RHS, the enterprise access model provides a more scalable solution to multimedia distribution than desktop solutions and ensures that information stays current.
The intuitive connection between map features and site-specific multimedia aids route discovery, reconnaissance, asset condition assessment and coordination of response efforts. When required, GPS-tagged photos can be emailed in from the field for instant update using Red Hen’s ‘Blue2CAN’ geo-tagging accessory. Blue2CAN provides a Bluetooth link from camera to GPS, laptop or cell phone and embeds location information into digital photos’ EXIF tags. An API is available for custom applications. RHS systems are in use at BP Pipelines and Duke Energy for gas leak detection.
SAIC’s Tom Schultz has kindly provided an update on the Energy Service-Oriented Architecture (SOA) Interoperability Collaboration Forum’s activity since we first reported on the initiative last year (OITJ December 2007). The SOA for Energy kick-off event was held in February where companies including BP, Chevron, ExxonMobil, Occidental, Saudi Aramco and Shell agreed to take part in the interoperability standards pilot. This six month project began in March, using the standards document provided by BP as a starting point, and is soon to release a draft standard for review by the broader community.
Work is now in progress on a ‘one voice’ edited version of the document and on identifying and documenting use cases. The plan is to vet the standards in a more public forum, establish a means for publishing the document and a plan for extending and keeping the standards up-to-date. The group has completed reviews and first pass edits of what will become the ‘Web Service Interoperability Technical Standards document’ and has started the process of transitioning ownership of the standards to Energistics. This transition includes the formation of a technical architecture work group under the auspices of the Energistics industry services special interest group as well as leveraging Energistics capabilities for guiding a public vetting of the standards document. More on SOA for Energy from www.oilit.com/links/1003.
The new V7 release of AspenTech’s AspenOne marks a significant development for the developer of process control and engineering software. AspenTech is offering ‘best practices,’ developed in collaboration with flagship clients like BP, Fluor Corp. and Technip, that cross the silo barrier that previously separated electronics and digital process control system design from the construction engineering phase. The change has been enabled by the development of a construction-friendly engineering data model embedding the ISO 15926 standard. A patented master data model enables owner-operators to manage assets across the entire lifecycle, from design through operations, and an ISO 15926-compliant, vendor-neutral interface links basic to detailed engineering. What is claimed as ‘the most comprehensive physical property database’ has been developed in collaboration with the National Institute of Standards & Technology (NIST) to support complex process modeling and optimization.
FIATECH standards body director Ric Jackson commented, ‘We are pleased to see a major vendor of tools for conceptual and basic design embrace ISO 15926. Now automated design and operation of process facilities can be extended back into the early stages of the asset lifecycle for the first time.’
Jerry Gipson, Director of Engineering Solutions, Dow Chemical added, ‘The integration of process simulation with model-based economic analysis and equipment design improves our engineering efficiency, and reduces the potential for error in data transfer.’
Finally, Ashish Shah, Project Automation Director with Fluor Corp. said, ‘The ability to collaborate with licensors, owner-operators and subcontractors through data exchange is an important element to executing projects globally.’
‘AspenTech is on the right track with aspenONE V7 in adding important usability and functionality to help project teams be more productive. Aspen Basic Engineering, formerly known as Aspen Zyqad, is a global reference software at Fluor.’