In a well-received keynote to the Society of Petroleum Engineers’ Digital Energy Conference this month (full report next month), David Payne, VP of Drilling and Completions at Chevron, observed that a year had passed since the Deepwater Horizon/Macondo blowout. Since then there have been ‘conversations’ around well design, blowout preventer design and so on. But there has been little discussion of human factors.
Payne argues that a major factor in Macondo was that the human interface to data failed the people on the rig. The human brain can only handle 6-8 data points at a time and has a limited capacity to manage data. On Macondo, the data was there but operators were not capable of handling and interpreting it.
A modern deepwater rig produces masses of data, ‘We used to have a little Geolograph and there was a lot of manual data entry, now we have measurement and logging while drilling (MWD/LWD) data coming in in near real time.’
A deepwater Gulf of Mexico rig has thousands of sensors. One rig today can collect more data in a year than Chevron D&C has collected in the whole of its history. How is this handled? It isn’t! There is a big ‘data exhaust’ on the back of the rig, and the data is gone! It scares folks how much it would cost to keep and manage.
What are we doing about all this? We have proliferating drilling decision support centers (DSC). Chevron’s DSCs provide a lot of help in geosteering, but they don’t really change the way we do business.
They do however offer huge opportunity for analytics, doing stuff we have not thought of yet. We would like to be able to compare real time data with a global database—the laws of physics are the same worldwide! We need to connect earth science with drilling data, so that we know, for instance, when to change drilling parameters and when to pull out of hole. We also need to use IT—data mining, linking multiple data streams and databases—to find out why connection times are too long.
There is also a demand for open systems. The very idea of proprietary data is unacceptable. We need to set expectations around industry standards. Software standards have limited success to date. WITSML is good but is just the very beginning. We need to change D&C workflows and to move beyond traditional process.
Initially it was suggested that information from the DSC should go through the usual ‘chain of command.’ This is ridiculous! We need to change the paradigm; leveraging video links to the rig and ‘face to face’ contact from DSC to rig. We will see dramatic IT-driven change. IT represents the single biggest opportunity for improvement in D&C.
P2 Energy Solutions went shopping in Calgary this month, bagging both Explorer Software Solutions and the assets of Wellpoint Systems. Wellpoint’s products include Bolo financials and the Ideas international venture management system. Explorer develops land management software for the Canadian market.
The acquisitions will create a consolidated independent technology provider with a portfolio of solutions for financial, production, and operations software, land management tools, mid-stream marketing solutions and outsourcing services. The combined unit has over 1,200 clients in 65 countries, and employs nearly 700 people. P2ES anticipates synergies from ‘scalable global resource allocation across R&D, services, support and account management.’
P2ES president and CEO Bret Bolin observed, ‘These acquisitions represent a further step toward delivering on our vision of global leadership in enterprise solutions for oil and gas. They add proven technology and talented professionals, extending our capabilities to an international client base.’ Upon completion, 119 of the top 150 US energy companies, 91 of the top 100 Canadian operators and nearly 200 international operators will use P2ES solutions. More from www.oilit.com/links/1104_1.
On my way to the SPE Digital Energy conference (DEC—report in next month’s Journal) I found myself in a duty free shop where a chunky little mobile phone caught my eye. It was the antithesis of the iPhone. While not quite the whip-antennaed monster that Tommy Lee Jones brandishes briefly in Mars Attacks, it had something of the same anachronistic pizzazz. Upon further investigation, it turned out to be an explosion-proof Sonim cell phone1 with ATEX certification for use in oil refineries and offshore platforms. If you doubt the capacity of a cell phone to start a fire (and if you are not squeamish) you should check out the video2 on Atex Systems’ website. As I rarely visit refineries these days, I was not about to spend €400 on such a device, but I’ll get back to the Sonim later.
In last month’s Journal we reported on Steve Ballmer’s address at CERA Week where he argued that today’s IT paradigm is of consumer technology, such as instant messaging, migrating into the business environment. Ballmer opined, ‘Opportunities open up with these technologies in business and specifically, in energy.’ He managed to get in a plug for the X-Box 360 Kinect controller, whose gesture capture ‘can be harnessed and applied anywhere, from a classroom to an offshore platform.’ Maybe he has something. Perhaps a smart programmer will one day turn geoscientists’ arm-waving into a well plan.
On the other hand, in the interests of disclosure and to underline that no simple statement should ever be taken at face value in IT, it behooves me to note that in his new book3, Paul Allen categorizes Microsoft as a company whose ‘core strength is software for business,’ and which ‘is not positioned well for the move to a consumer-focused mobile platform.’
In any case, the arguments for and against ‘consumer’ technology stand whether or not such is to be sourced from Microsoft, Apple or Google—so I will plug on with my investigation.
At one level, the ‘consumer to business’ shift has already happened. We now all use ‘commodity’ if not quite ‘consumer’ hardware. Windows has effectively displaced many (but not all) ‘business’ systems, as companies like DEC, HP, IBM, SGI and Sun can testify—well, those that are still around can, at least.
At the (other) DEC and elsewhere many speakers have argued that social networking, for instance, should be used more in the enterprise. Another trend is the advent of multiple devices—read iPad, Blackberry et al. that are eroding Microsoft’s hegemony in the enterprise, and causing IT managers to rethink their provisioning. IT itself is opening up to more ‘consumer’ provision, with ‘cloud’ offerings and software as a service.
Another given is that ‘office’ IT is inevitably seeing more and more connectivity with plant IT. Of course this has to be weighed against the security risks. All of which makes for a perfect storm of ‘fear, uncertainty and doubt’ (FUD). On the one hand we tout the yet-unrealized benefits of desktop access to every tag in the plant. On the other hand we give ourselves nightmares about a new Stuxnet destroying the facility.
We have been here before. In fact we reported from an SPE Security conference back in 2005—and I have just put the full report4 in the public domain. At the time ‘deperimeterization’ was all the rage—tinged with a little FUD from the ‘Slammer’ worm that had just infected a US nuclear plant. There was also a call for an industry-wide security standards body which as far as I know has come to nothing. In our 2009 report5 from the API Security Conference (also now in the public domain) we learned that ‘mobile devices can mean really rough gaps in security,’ and that, ‘management involvement in security appears to be work in progress.’ There were also suggestions that some ‘reperimeterization’ of the facility might be in order. Judging from the 2011 DEC, it appears that little progress has been made.
I take it that the brave new multi device world is populated by Blackberry and iPhone/iPad users. Instead of the IT department laying down the law about what devices are approved for use, we are now asked to cater for anything. It is not all that long ago that we were talking about diskless workstations sans USB slots. Now it seems that the latest ‘always on’ gizmo that ‘calls home’ with positional information every few minutes is fine. Of course I am not privy to such matters but I have a sneaking feeling that the boardroom crowd is responsible for this ‘deregulation’ of the communications space. I notice that the boardroom applications de rigueur are Diligent’s Boardbooks6 or ICSA’s Board Pad7. I have a mind’s eye view of the board all traipsing into the oak paneled boardroom with their iPads and Blackberries, going online and checking their stock quotes and generally bypassing anything that IT in its wisdom has put in place.
Beyond the security risks, perhaps the worst aspect of consumer IT is the objectionable agendas that providers have. Their business model is to extract personal information from individuals and sell it to advertisers who then target you. Do you really want your providers to go around sniffing your wireless networks and recording passwords? Is it OK for them to store your location information in secret files on your mobile devices? Even if the information is not secret—is it OK for it to be stored in a proprietary format to which only they have the key? Do you want them to share this with others? And what happens when they get it wrong and lose your personal/business information to hackers? Is it OK just to say ‘sorry?’ Will this be enough for your shareholders?
I was going to conclude with an accolade for ATEX-compliant devices like the Sonim and how we really do need an IT-ATEX spec for enterprise IT. Unfortunately, while the Sonim may not cause a conflagration, since it has a USB port it is unlikely to pass muster in terms of IT security! If this has whetted your appetite for more, you may like to consider attending the WRG Oil & Gas Cyber Security Conference to be held in Houston in June8.
3 ‘Idea Man’—www.oilit.com/links/1104_35.
Pete Pacheco’s new book, An Introduction to Parallel Programming1 (I2PP) is a well-written, accessible introduction to the subject that is now synonymous with high performance computing. Parallel programming is also increasingly key to performant desktop applications, as even commodity PCs come with multiple cores which require re-tooled software to deliver their full potential. I2PP is a textbook, not a cheapo ‘How to’ guide. It is accompanied by a website2 and each chapter includes a set of quite hard questions. It is good to see that not all IT courses are ‘dumbed down.’
I2PP should be read by just about anybody with a serious interest in computing. Those involved in marketing technology may learn what acronyms like SIMD, NUMA and concepts like threads and processes actually mean. Programmers get a hands-on tutorial in building parallel programs of considerable scope. Pacheco works through hardware architectures, network topologies and three approaches to parallel programming, MPI, Pthreads and OpenMP—all called from C. He does not cover Nvidia’s CUDA or ‘other options’ for making the programmer’s life easier. It would have been nice at least to hear what these were.
Pacheco is prof of computing and math at the University of San Francisco and is well placed to provide insight into algorithm crafting and tuning to hardware. His two main use cases are solving the n-body and the travelling salesman problems. As he shows, there are many ways of skinning the parallel cat and indeed, there may be no ‘best way.’
But what is perhaps most surprising in I2PP is the revelation that parallel programs display ‘non determinism.’ In other words, depending on some hardware niceties, the same program can produce different results! This has to be fixed by the programmer using ‘mutex’ constructs rather like a database lock, adding an extra layer of complexity.
It seems like, after years of trying to protect programmers from themselves, compiler engineering has reverted to a previous era where nothing could be taken for granted. Back in the day, memory leaks and out of range pointers and indices were the problem. Then languages and compilers got savvy with built-in protection. Hardware abstraction offered more progress in code portability and longevity. Parallelism blows a lot of this away. We have to manage non determinism and retool the algorithm for each hardware nuance. It appears that the chances of writing a bulletproof program are receding. It would be nice to think that we are at a provisional low point in compiler development and that the future will bring more sophisticated tools that will relieve the programmer of the huge burden that current technology imposes. But this unfortunately does not appear to be the way the world is going.
1 ISBN 978-0-12-374260-5 and www.oilit.com/links/1104_45.
Mark Wakefield, Eclipse development manager, Schlumberger, with help from Fortran compiler developers at Polyhedron Software, described tests with graphics processing units (GPUs) for fluid flow simulation. Schlumberger’s Eclipse flagship already has an established Fortran/MPI code base while the new Chevron-backed Intersect simulator leverages C++/MPI, targeting scalability and very large models. Schlumberger has been evaluating GPUs on 10 megacell models. Nvidia’s Cuda code library gives promising scalability for up to 500 threads. But overall, the GPU-based solver showed a 2-3x speedup for up to a million grid blocks, with a ‘modest’ effort. Schlumberger is now working on ‘massively parallel recursive’ algorithms that it hopes will provide scalability on larger datasets. Wakefield noted that scientific code has a long lifetime and that working with non-proprietary standards has enabled Eclipse to adapt to operating system, hardware and interconnect evolution. Along with Cuda, Schlumberger is investigating solutions such as Intel ArBB1, the Oxford Parallel Library2 and Nvidia’s Thrust3.
Kathy Yelick of the Lawrence Berkeley National Laboratory looked ahead to the ‘new world order’ of ‘exascale’ computing. Exascale is the projected compute power that will be available over the next 5 to 10 years. But the path to exascale will likely be a rough road as pure compute horsepower is limited by power considerations and performance will increasingly come from parallelism and ‘heterogeneity,’ i.e. many small, energy efficient cores running under control from (at least) one fat core for the operating system. Such systems may be efficient but they are getting harder and harder to program. In part this is because we like to use general purpose hardware. One answer may be to co-design hardware along with software à la Green Flash/Tensilica4 or the FPGA Verilog/ASIC emulator. Early intervention in hardware design means optimizing for what is important—energy, data movement etc. The approach allows for the use of languages that support machine abstractions. Code generators and autotuners will optimize communications.
Henri Calandra (Total), John Etgen (BP) and Scott Morton (Hess) made a valiant attempt to answer the question, ‘What can we do with an exaflop system?’ It seems like seismic depth imaging has already entered something of a nirvana realm of full wave equation migration on complex geology at (quite) high resolution. The speedup from Petaflop (2010) to Exaflop (2020) will see more of the same trends and the introduction of full elastic WEM by circa 2015 (50 Hz, and 100 Hz by 2020). Seismics is lucky in that it is ‘naturally’ parallel. While the goal of a single compute node per shot is desirable, seismics does not present the same scalability ‘gotchas’ as other Department of Energy simulations. More of a concern is programmability and reliability/fault-tolerance. There is also a desire to continue to leverage ‘commodity’ components.
The debate5 on Exascale computing highlighted the shortcomings of current systems. For seismics, one node per shot is desirable but as the notion of a ‘core’ is shrinking, this objective is receding. In fact, parallelism is still under-utilized as most routines are still ‘compute bound.’ Elsewhere, I/O is the bottleneck—today’s GB/s disk bandwidth is totally inadequate for some tasks—Etgen wants terabyte storage bandwidth. Seismics is still ‘all about resorting, shuffling data.’ Presentations available on www.oilit.com/links/1104_7.
Seismic Micro Technology (SMT) previewed the 8.7 release of its ‘Kingdom’ geoscience flagship application at the American Association of Petroleum Geologists (AAPG) Annual Convention and Exhibition this month. The new release includes field development functionality, integrating engineering, geological and geophysical capabilities to optimize drilling and production operations. The release also covers geosteering optimization, microseismic interpretation and well path planning—targeting, inter alia, the development of ‘unconventional reservoirs with thin stratigraphic sections and laterally changing geology.’
Kingdom received endorsement from Southwestern Energy senior VP John Thaeler who said, ‘We have used SMT for our seismic interpretation for many years. The enhancements in geology and engineering, along with the tight integration, convinced us to extend this solution across all of our asset teams.’ Kingdom well path planner supports industry standard methods to design the wellbore and calculate inclinations, azimuths and offsets. Users can create a cross-section of the proposed path and quickly share it with drillers and asset team members. Real-time geosteering optimization overlays planned and actual well paths on a vertical display or seismic section allowing geologists to track and adjust drilling in relation to the target.
SMT has also expanded its service and training offering with new data and systems services, onsite services, consulting and training from seven global offices. Kingdom 8.7 is scheduled for release this summer. More from www.oilit.com/links/1104_24.
Studies by the University of Oklahoma, in collaboration with Devon Energy, have shed light on the rapid decline in production observed in some non conventional gas reservoirs. 3D, nanometer-scale imagery produced by a new ‘dual beam’ scanner from FEI Company resolves kerogen, porosity and microstructure of gas shales. FEI’s Helios NanoLab dual beam system combines a scanning electron micrograph with an ion beam imager. The device has allowed researchers to determine production potential and build a simulator of nanoscale pore structure.
Prof Carl Sondergeld commented, ‘Organic [matter] is more porous than previously imagined—but pores are so small that they impose novel physical constraints on gas behavior. This hitherto un-imaged pore space explains why there is so much producible gas. The images also explain why production declines so rapidly in some unconventional reservoirs and are revising previously held beliefs about unconventional resources.’ More from www.oilit.com/links/1104_25.
Advanced Visual Systems (AVS) has announced the first deliverables resulting from a joint development in the field of high performance computing with Microsoft. The collaboration sees enhancements to the AVS/Express data visualization system that leverage Microsoft Windows HPC Server 2008 R2. New features include integration with Microsoft’s job scheduler, added security to Microsoft MPI communication and access to cluster resources from within the AVS/Express data visualization environment.
AVS CTO Anoop Chatterjee said, ‘Our relationship with Microsoft reflects the needs of the hands-on HPC customer. For developers this means a faster and easier approach to creating clusters. For engineers it means easier access to state-of-the-art 3D data visualization. We look forward to testing our applications in Microsoft’s performance lab.’
Bill Hamilton, Microsoft’s director of technical computing added, ‘The Windows HPC Server and AVS/Express combo provides scalable compute power and visualization resources.’ More from www.oilit.com/links/1104_26.
Shell and HP have provided an update on their joint development of a novel geophone (Oil IT Journal March 2010). The companies are developing an ‘ultra high resolution’ wireless land seismic recording system leveraging sensing technology from HP’s ‘Central Nervous System for the Earth’ (CeNSE). Testing in the seismic testing vault of the USGS’ Albuquerque Seismological Laboratory in New Mexico showed a noise ‘floor’ (the smallest detectable acceleration) of 10 nano-g per square root Hertz. The test also demonstrated the fidelity of the new sensor at frequencies ‘as low as 25 mHz’ [which we take to be millihertz].
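For readers puzzled by a noise floor quoted ‘per square root Hertz,’ the figure is a spectral density: multiply by the square root of the recording bandwidth to get an RMS noise level. A back-of-envelope illustration of our own (the 100 Hz bandwidth is an assumed example, not from the announcement):

```latex
a_{\mathrm{rms}} = 10\ \mathrm{nG}/\sqrt{\mathrm{Hz}} \times \sqrt{100\ \mathrm{Hz}} = 100\ \mathrm{nG}
```

In other words, over a 100 Hz band the sensor’s self-noise would amount to some 100 nano-g RMS.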
Dirk Smit, chief scientist for Geophysics and VP Exploration Technology, Shell said, ‘Our collaboration with HP demonstrates Shell’s strategic approach to driving innovative technology solutions through active partnering.’ HP’s Rich Duncombe added, ‘We are on track to produce a leap forward in onshore seismic data quality.’ The system will be delivered by HP Enterprise Services and the company’s Imaging and Printing Group. More from www.oilit.com/links/1104_27.
Neuralog’s NeuraJet17 provides 17” wide presentation quality color well log prints on continuous fanfold paper at up to 1200 dpi. NeuraJet17 comes with the NeuraViewPE imaging and LAS log handling software—www.oilit.com/links/1104_49.
Aveva has announced a new ‘Laser Modeler’ solution to transform laser scan point cloud data into as-built 3D plant models—www.oilit.com/links/1104_50.
Cortona3D’s RapidAuthor 6.0 subsumes previous products into a suite of tools for authoring support documentation—manuals, parts catalogs, training materials and instructions. The tool integrates with Just Systems’ ‘XMax’ XML editor—www.oilit.com/links/1104_51.
R4.0 of Nvidia’s CUDA toolkit includes GPUDirect 2.0 for peer-to-peer communication among GPUs in a single server or workstation. Unified Virtual Addressing provides a single merged-memory address space spanning main and GPU memories. Thrust libraries provide open source C++ parallel algorithms and data structures. A 5 to 100x speedup is claimed—www.oilit.com/links/1104_52.
Invensys’ ‘Eyesim’ virtual reality-based gas plant operator training system is now available for the iPhone and iPad. The new app offers mobile workers access to process simulation and a ‘walkthrough’ virtual plant environment—www.oilit.com/links/1104_53.
INT has released GeoToolkit.NET 3.1. Highlights include log display in time or depth, added support for deviated logs, new tornado and box plot charts and management of component assemblies for a subsurface completion in Well Schematic—www.oilit.com/links/1104_54.
The 2011 release of Baker Hughes’ JewelSuite offers improved collaboration and decision making, direct connectivity with Computer Modeling Group’s flow simulator, parallel processing with new multi-threading and multicore functionality and micro-seismic visualization for non conventional workflows—www.oilit.com/links/1104_55.
Landmark’s new DecisionSpace Desktop Well Planning module can plan an entire field of platforms and well locations, including multilateral drilling programs for unconventional resource plays—www.oilit.com/links/1104_56.
A new white paper from IBM, ‘Linux and the storage ecosystem,’ provides a good introduction to DAS, NAS, SAN and volume management—www.oilit.com/links/1104_57.
LMKR has acquired Velocity Manager, a seismic depth conversion toolkit, from Cambridge Petroleum Software—www.oilit.com/links/1104_58.
Petrofac’s SPD unit has released WellAtlas project management software for the oil & gas sector—www.oilit.com/links/1104_59.
Ensyte Energy Software is offering software development and testing services to the oil and gas industry targeting the port of legacy VB6 applications to Visual Studio/SQL Server or Oracle. The service offering follows on from the successful port of Ensyte’s Gastar application—www.oilit.com/links/1104_60.
The 8.5 release of VSG’s Open Inventor includes an optimized C++ STL, a new .NET framework and improved support for ATI graphics boards. Improvements to VolumeViz and VolumeViz LDM optimize rendering performance, reduce memory use and speed large volume data loading—www.oilit.com/links/1104_61.
Gas lift (GL) has regularly been cited as an exemplar of the ‘digital oilfield.’ For the digital brigade, GL is seen as an opportunity for digital optimization. Presentations from the ALRDC/ASME 34th Gas-Lift Workshop held earlier this year in Singapore appear to support the idea that GL optimization (GLO) is an under utilized technology with huge potential, but that the impact of ‘digital’ is perhaps secondary to the organizational, engineering and people challenges.
Two presentations described how ExxonMobil’s global artificial lift group (GALG) was created to ‘focus key resources on evaluation and optimization of artificial lift systems.’ The business case is rather compelling. ExxonMobil has around 1,000 operated gas lift wells worldwide. These represent some 10% of its total well count but make up around 1/3 of overall production. Gas lift deployment is expected to rise significantly in the next few years. The GALG is made up of a body of qualified technicians. A structured process has been put into place for review and prioritization of interventions. A standard toolkit comprises Echometer’s WellAnalyzer, digital thermal monitoring, clamp-on flowmeters and AppSmith’s WellTracer surveillance tool. Early tests in a SE Asia offshore GLO project evaluated 29 wells, coming up with 12 optimization recommendations. This resulted in a 24% hike in production—and pointed the way to considerable upside from further work. The process is one of continuous learning—hence the use of rotating technicians to share technology and best practices.
A Shell presentation leveraged Petex’ IPM open server macros to connect the Prosper and Gap simulators with AppSmith’s WinGlue gas lift software, Shell’s tool of choice. Open server macros developed in Excel VBA code have allowed the tools to interoperate on tasks such as an equilibrium curve module that determines the maximum depth of injection in gas lift design and analysis. Another use case involves combining many individual well optimizations from WinGlue in GAP for global, field-wide optimization. Petex is now considering adding these extensions to Prosper.
A break-out session debated new technology, noting that although the world has changed a lot over the last 40 years, gas lift remains the same. In fact, some technology ‘tweaks’ (electric, hydraulic or wireless gaslift) have mostly died quietly. The group outlined the attributes of an ideal gas-lift system as follows: a ‘dynamic’ system that continuously optimizes gas-lifted production, both down-hole and at surface, without the need for workover interventions; a system that can inject at any rate at any depth; software-driven field-wide optimization; and virtual metering. For green field developments, the added cost of a smart gas-lift string should be a no-brainer when the prize is a 5% production hike. Presentations available on www.oilit.com/links/1104_47.
Alan Thompson (Production Services Network) uses Microsoft SharePoint/My Site1 to transform the way PSN conducts its oil and gas consulting business. In the past, the company recruited to fulfill most of its new contracts, using the manager’s ‘little book of highly reliable people!’ Now that the company has built a knowledge base of good practice and ideas through its network of consultants, it is in a position to ‘in-source’ work instead. Client requests are now addressed by the whole organization. Thompson warned however of the dangers of resistance to change, quoting Machiavelli, ‘There is nothing more difficult to take in hand than to take the lead in a new order of things, because the innovator has for enemies all those who have done well under the old conditions and lukewarm defenders in those who may do well under the new.’ Another perennial problem for the KM advocate is the fact that knowledge workers may be unwilling to share their ‘prime asset.’ The answer is to develop a culture conducive to collaboration and to use champions. These are not necessarily SMEs themselves, but they will need some chutzpah to ‘approach and engage’ the experts.
Alf Michaelsen (Logica) described KM as the ‘art of transforming intellectual assets into business value.’ Despite much talk of the talent shortage and the case for KM, 80% of KM initiatives fail to deliver all their business objectives. This is because of a lack of visible leadership, a lack of the desire or ability to change, lack of resources, IT limitations and budget. Michaelsen recommends a KM assessment and warns against implementing a technology-led KM strategy. Enter Logica’s KM framework and methodology, an eight stage process of creating change. The framework includes creating a ‘guiding coalition,’ developing a strategy and communicating and generating quick wins. Finally the new approach needs to be anchored in the company culture.
Shell’s Andy Boyd, also visiting professor at the London School of Economics, described the ‘great crew change’ as a huge problem for oil and gas and other mature industries where the boomer generation is retiring. Shell is addressing the problem with a ‘retention of critical knowledge’ (ROCK) program of interviews with knowledge workers that capture and record their skills in the Shell Wiki (the largest corporate Wiki according to Accenture) and ‘Mind Maps2’ to visualize knowledge. Shell has over 70 ‘very active’ KM communities that have been running for the last 15 years. These are generating ‘audited savings of $300-400 million per year.’
Colin Balchin and Nigel Barnes (Adept KM) recommend using 3D visualization to accelerate the transformation of data into information and knowledge for asset management. They distinguish between ‘entropic’ and ‘strategic’ asset management processes. The former are concerned with the disorder inherent in any system and affect safety, integrity and performance. Problems can occur at any stage of the design and build process. Adept recommends a ‘stage gate’ process to manage uncertainty of outcomes and the increasing volume of information. The use of risk and contingency analysis was shown for a project involving adding gas compression to an FPSO. This showed for instance that while non-critical stuff was running two weeks ahead of schedule, critical work was two weeks late. It was time to re-allocate resources! Such information management needs to be done closer to real time with more qualitative analysis. Planners need to see trends for reliable forecasting. Current systems are not always up to the job because of high data volumes and poor systems integration. This is where 3D comes in, repurposing CAD models, which already contain a lot of technical information, to extend their use across the asset lifecycle. The use of 3D is already a given in reservoir management. Here, CAD models can be the basis for engineering information, equipment codes—the master tag register for the asset that ensures information is highly visible in one system.
Flavio Bretanha outlined Petrobras’ lessons learned programs. These were designed to address KM challenges in operations and to develop a collaborative approach to the management of complex projects. The KM unit is involved in Petrobras’ standardization effort and is publishing an engineering management manual. Other activities span communities of practice (CoP) and a lessons learned program. The standardization program includes participation in international and national standards bodies to evaluate and align Petrobras’ standards and specifications. CoPs are built around groups of experts in specific areas and are responsible for the incorporation of technical advances and new methodologies. Current focus areas include a revamp of Petrobras’ inspection routines and procedures for dealing with construction contractors. The lessons learned initiative includes knowledge taxonomy development. A single corporate taxonomy improves document coherence and clarity. This in turn will ensure adherence to the project and help define responsibilities. Petrobras is collaborating with the Brazilian engineering trade organization, Abemi3, on the development of best practices and construction checklists for employees. Approved technical routines and executive procedures are being developed—again leveraging the standard taxonomy. Bretanha concluded that, ‘If your company is in an initial level of maturity in KM, strengthen the organization of the technical documentation and information management before implementing a lessons learned program.’ Once that is in place, it should be extended with a unified taxonomy which will form the basis of a solid lessons learned program.
Susan Rosenbaum offered some interesting ‘musings’ on a decade of KM at Schlumberger. KM got off to a flying start in 1997 with strong endorsement from the then chairman and CEO Euan Baird, who told analysts, ‘Knowledge is an asset that can be reused rather than something that has to be reinvented over and over again. We need to create companies that learn quickly and do not forget.’ The position was subsequently endorsed by current president and CEO Andrew Gould. As Rosenbaum remarks, ‘KM starts with management commitment.’ At the top level, KM in Schlumberger connects people to information, communities and business solutions. Initiatives include the Career Network Profile, an internal, self-authored résumé that is linked to the corporate LDAP directory and the internal competency management system. A key finding was that people will keep their own information up-to-date.
Next up was InTouch, a 24/7 knowledge and operational support portal for accessing expertise, best practices, ticket and document management. InTouch is now embedded in all of Schlumberger’s work processes. There are some 150 full time InTouch Engineers and another 250 part-time contributors. Around 4,000 experts assist the InTouch project with specialist input and QC. InTouch Engineers answer questions from the field, capture solutions to the support portal and conduct root cause analysis of why assistance was needed. The latter is used to drive improvement in hardware, software, documentation and training updates. InTouch has over a million attached files, 56,000 users and 2.8 million logins per year.
Another initiative is the ‘Speedia’ community glossary and encyclopedia. This is modeled after Wikipedia and is an internal source of definitions for acronyms and ‘Schlumberger speak.’ Speedia also contains articles on Schlumberger’s technologies and services. The Wiki is proving popular, people like to share, especially when a contest is being held.
Schlumberger has around 160 ‘Eureka’ communities of practice with over 300 leaders and 25,000-plus members. These center on discussion forums (bulletin boards or ‘BBs’ in Schlumberger parlance), which allow for in-depth discussion of a problem by experts from around the world. The BBs have proved their worth in sharing technical knowledge and are used to produce technology roadmaps, white papers and innovation ideas. Results are discussed in over 250 technical webinars annually. People link to and learn from others in their field. Communities have a life cycle—they emerge, they live, and they can eventually die.
Rosenbaum concluded by noting that while people are the key to all KM initiatives, KM systems are now an essential part of the way Schlumberger works. Notwithstanding this, KM needs constant push and encouragement. ‘You can never declare victory and feel that the KM problem is solved.’
Matthieu Lamy (Talengi Document Control) enumerated the risks implicit in information management. These are broken down as follows. Operational risks—e.g. a wrong document is used in design. Financial risks—e.g. a poor supplier relationship means that penalties are incurred. Information security risks—e.g. losses during document exchange. Legal risks—e.g. non availability of key documents in the event of litigation or for regulatory compliance.
Typical problem situations are encountered when there is doubt as to which is the current version of a document, or whether it is complete with the latest annotations. Such worries create issues for personal productivity, operations and knowledge management. Enter Document Control which Lamy defines as ‘the methods and tools used to manage and secure document exchange between an operator and a contractor.’
Talengi’s mission is to assist managers in monitoring projects and organizing information. This includes recommendations on storage, document identification and QC, compliance and tracking. Document management is essential to keep projects on schedule and to ensure a proper handover to operations. Document control can also be viewed as a vector for knowledge management, providing much of the taxonomy and metadata that helps circulate and contextualize information. Lamy believes that there is considerable unrealized potential for its use.
Invensys Operations Management is now the exclusive provider of a process safety decision-support solution from ACM Facility Safety (a division of ACM Automation). Invensys will now offer ACM’s ‘Machu Picchu’ safety decision-support software to the global process industry.
Invensys principal Steve Elliott said, ‘Invensys and ACM are making industrial plants safer. Invensys delivers new solutions that let customers continuously track deviations between the plant’s designed risk profile and the actual operating risk profile. Dashboard displays will allow operators to identify and potentially eliminate process risks. This capability enhances our Triconex safety systems to help clients achieve sustainable environment and safety excellence.’
Machu Picchu works alongside automation, control and safety systems, providing visibility into process risks. It leverages expert knowledge of a plant’s risk parameters, diagnostic alarms and historical operator observations to monitor integrity. If a deviation is identified, risk levels are computed and operators receive a set of pre-engineered contingency plans on how to address the abnormal situation.
ACM general manager Murray Macza said, ‘Today’s end users are under tremendous pressure to run their plants at maximum throughput without limiting their commitment to safety and the environment. By working with Invensys, we can accelerate the adoption of new technology that will help process manufacturers improve their business and achieve safety and operational excellence.’ More from www.oilit.com/links/1104_30 (Invensys) and www.oilit.com/links/1104_31 (ACM).
Absoft has appointed Eric McAdam head of professional services.
BP Canada president and CEO Anne Drinkwater joins Aker Solutions’ board.
Peter Carlile is to head-up Amphora’s new EU HQ in Zug, Switzerland.
The API is backing a new Center for Offshore Safety in Houston.
Hervé Oheix is sales manager for Bull’s industrial HPC unit. He hails from SGI.
Bill Henry heads-up CGGVeritas’ new processing center in Muscat, Oman.
Chris Hughes leads NuTech Energy Alliance’s UK operations. He was formerly with Kestrel.
Dahlman Rose & Co. has appointed Rome Arnold as MD and head of energy investment banking. Arnold was formerly consultant to Energy Ventures.
Tony Klapcia heads-up Eurotech Computer Services’ new Aberdeen office.
Chuck McConnell is now COO of the US Office of Fossil Energy. He was formerly VP at Battelle Energy.
Ebrahim Zadeh has joined Ikon Science in Perth and Llion Pritchard in London. Alexander Edwards, Jakob Heller and Asisat Lamina have joined Ikon GeoPressure. Steve Hunt is now COO.
Carl-Magnus Adamsson heads-up IFS’ new oil and gas centre of excellence in Norway. Geir Arne Wilberg and Kristin Husby have joined the team.
Intertek has opened a new petroleum laboratory in Cushing, Oklahoma, providing crude oil quality testing and inspection services.
Aymeric Caroff has joined Kadme’s development team.
Roy Oelking has been appointed president of KBR Hydrocarbons. Dennis Calton is president of KBR Oil & Gas.
Morten Tønnesen is VP business development with Roxar Software Solutions. Khong Kheng Ting replaces him as regional manager, Asia Pacific.
Jim Thomas is VP global resourcing with Knowledge Reservoir. Thomas was a co-founder of OilExec International.
Wes Baird is now Chief Product Architect at geoLOGIC Systems.
Matrix Service Co. has appointed John Hewitt president and CEO. Hewitt hails from Aker Solutions.
Former MD of GeoDZ Todd Porter has joined New Century Software as VP business development in Houston, Texas. Andy Florence joins as business development manager.
Maria Claudia Borras (Baker Hughes), Jacqueline Knight (Halliburton), and Paul Krueger (GE Oil & Gas) have joined the OFS Portal board of managers.
Chris Ring replaces Jeff Pferd as VP development with Petris. Pferd now heads up a new Strategic Consulting service. Ring hails from Landmark.
Bill Lawson is now the PTTC’s strategic manager. He was formerly with the DOE’s National Energy Technology Laboratory.
Ray Harlow is CEO of Latitude Solutions’ new Energy Services unit.
James Willis has joined Subsurface Consultants & Associates as an instructor.
Bob Rabalais heads-up Simpson Thacher & Bartlett’s new Houston location.
Alistair Birnie is stepping down as Subsea UK chief executive. The position is currently vacant.
Mike Benjamin has been appointed VP—Offshore Pipeline Solutions, for T.D. Williamson. He was previously with Schlumberger.
David Hicks has resigned as Senior VP of Africa, Middle East and Far East with TGS-Nopec.
Anton Leemhuis heads-up TNO’s new Middle East office in the Qatar Science & Technology Park, Doha.
The UK Technology Strategy Board’s new ‘grants for R&D’ scheme targets small and medium enterprises engaged in technology-based R&D. Grants of £25,000 to £250,000 are available—www.oilit.com/links/1104_44.
In our DecisionSpace article last month we incorrectly called Gene Minnich ‘product manager.’ He is, of course, VP Landmark Software and Services. Our apologies.
Wipro Technologies is to acquire SAIC’s Global Oil and Gas Information Technology Services Business for a cash consideration of $150 million. Some 1,450 employees will transition to Wipro around the world.
CGGVeritas has purchased Moscow-based static and dynamic reservoir modeling company Petrodata Consulting.
Clariant has acquired Saskatchewan, Canada-based oil services company Prairie Petro-Chem and will integrate it into the Clariant Oil Services business.
Dawson Geophysical has acquired TGC Industries in a $157 million paper deal.
Oil and gas remote camera technology specialist EV has secured £6 million investment from energy-focused private equity firm Lime Rock Partners.
A consortium including Maju Investments and RRJ Capital has acquired 70% of Frac Tech Holdings. Chesapeake holds the remaining 30%.
Harris Corp. has acquired Schlumberger’s Global Connectivity Services in a $397.5 million cash deal.
IHS has acquired ODS-Petrodata. Financial terms were not disclosed.
KSS Fuels is to acquire Market Planning Solutions.
TGS-Nopec has acquired Stingray Geophysical for approx. $80 million.
Total Safety has acquired ‘substantially all’ of the assets and business of Houston-based fire protection and safety services company Webb, Murray & Associates.
Mathematica developer Wolfram Research has acquired MathCore Engineering.
The Houston Technology Future Fund is to raise $100 million in venture funding for selected early-stage technologies in the fields of energy, IT and nanotech.
I/O has raised $105 million of growth capital. The investment was led by existing investor Sterling Partners, along with new investors managed by J.P. Morgan Asset Management and I/O’s own management.
John Bergman presented a Spotfire application that allows Chesapeake to investigate well performance from neighboring wells. A map interface lets users click on a well and bring up production numbers for wells within a defined range. The application is used to investigate recovery from wells in the Fayetteville shale play by assessing well offsets and for reserve booking.
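The offset-well lookup Bergman describes boils down to a radius query around the clicked well. Here is a minimal Python sketch of the idea; the well names, coordinates, production figures and the 2 km radius are illustrative assumptions, not Chesapeake data.

```python
import math

# Hypothetical well locations (km) and production rates for illustration.
wells = {
    "A-1": {"x": 0.0, "y": 0.0, "boe_per_day": 420.0},
    "A-2": {"x": 1.2, "y": 0.5, "boe_per_day": 310.0},
    "B-7": {"x": 9.0, "y": 9.5, "boe_per_day": 150.0},
}

def offset_production(wells, picked, radius_km):
    """Return production figures for all wells within radius_km of the picked well."""
    cx, cy = wells[picked]["x"], wells[picked]["y"]
    return {
        name: w["boe_per_day"]
        for name, w in wells.items()
        if name != picked
        and math.hypot(w["x"] - cx, w["y"] - cy) <= radius_km
    }

# Clicking well A-1 with a 2 km search radius returns only its near offset.
nearby = offset_production(wells, "A-1", 2.0)
print(nearby)  # → {'A-2': 310.0}
```

In Spotfire the same filter would be driven from the map marking, but the underlying distance test is the same.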
Kevin Konopko (Oxy) has been analyzing drill bit performance with Spotfire. The idea was to use data already captured in Landmark’s EDM database and OpenWells application for bit performance analysis and to allow ‘head-to-head’ comparison of bits from different vendors. Data gathered from bit runs is blended with cost information and summarized as a ‘bit performance index’ (BPI). Despite various gotchas such as non-representative runs, missing data and inconsistent nomenclature, Spotfire has proved a good tool for visualization and BPI comparison. BPI helps Oxy select the best drill bit for a particular hole section based on local experience. While rate of penetration is the obvious candidate for top weighting in the BPI calculation, Oxy is looking at improving the index by applying different formulae for cost per foot, rate of penetration and drilled footage, derived from a business unit’s own statistics.
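A bit performance index of the kind Konopko describes can be pictured as a weighted blend of normalized run metrics. The sketch below is a hedged illustration only: the metric names, weights and sample runs are assumptions, not Oxy's actual formula.

```python
def normalize(values):
    """Scale a list of metrics to the 0..1 range (higher is better)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def bit_performance_index(runs, weights=None):
    """Score bit runs on rate of penetration (ft/hr), drilled footage (ft)
    and cost per foot (USD, lower is better). Weights are illustrative."""
    weights = weights or {"rop": 0.5, "footage": 0.3, "cost": 0.2}
    rop_n = normalize([r["rop"] for r in runs])
    ftg_n = normalize([r["footage"] for r in runs])
    # Invert cost so that cheaper runs score higher.
    cost_n = [1.0 - c for c in normalize([r["usd_per_ft"] for r in runs])]
    return [
        weights["rop"] * rop_n[i]
        + weights["footage"] * ftg_n[i]
        + weights["cost"] * cost_n[i]
        for i in range(len(runs))
    ]

# Hypothetical head-to-head comparison of three vendors' bit runs.
runs = [
    {"bit": "vendor_A", "rop": 85.0, "footage": 2400.0, "usd_per_ft": 32.0},
    {"bit": "vendor_B", "rop": 60.0, "footage": 3100.0, "usd_per_ft": 28.0},
    {"bit": "vendor_C", "rop": 78.0, "footage": 2900.0, "usd_per_ft": 26.0},
]
scores = bit_performance_index(runs)
best = runs[scores.index(max(scores))]["bit"]
print(best)  # → vendor_C
```

Tuning the weights per business unit, as the article suggests, is then a one-line change to the `weights` dictionary.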
Ryan Keel showed how Petrohawk uses Spotfire to produce lease operating statements (LOS) from corporate down to well level. LOSs combine data from the P2ES Bolo accounting system of record, adding analytics and drill down capability. Petrohawk can rotate data through different property groupings, filter and show or hide accounts as needed. More from www.oilit.com/links/1104_43.
UK based oil country tubular goods supplier Petroleum Pipe Co. has deployed Talari Networks’ ‘Mercury’ adaptive private networking (APN) appliances. APN technology is suited to developing world regions where broadband may be inconsistent or cost-prohibitive. Talari’s appliances aggregate available bandwidth (DSL, cable or wireless) into a ‘virtual network’ offering reliable wide area infrastructure connecting remote offices.
Virtualization continuously monitors all network paths between remote sites and PPC’s UK datacenter, ensuring a more reliable and predictable data and VoIP network. APN monitors internet connectivity for loss, latency and jitter, detecting and responding to network traffic congestion in under a second. PPC has deployed seven Talari APN appliances: one in London, two in Dubai, and the remainder at smaller remote offices. More from www.oilit.com/links/1104_28.
Petronas has awarded Netherlands-based Spirit Innovative Technologies (Spirit IT) a million Euro contract for the provision of software development, installation and maintenance services. These will initially include the supply of software that will ‘enhance the quality and intelligence’ of Petronas’ onshore and offshore metering systems and centralize data capture. Spirit was selected following an evaluation of five ‘renowned’ competitors.
CEO Harry Kok said, ‘This is an international breakthrough for our company. We are among the first to partner with Petronas on software development and international marketing. More products are likely to follow.’ Spirit expects to up its head count following the deal and is opening an office in Malaysia, where local employees will kick-off the project under supervision from Eindhoven. The office is expected to develop into an expertise centre for the South-East Asian region. In its own right, Spirit develops automation products and metering solutions for the oil and gas industry. Other Spirit IT clients include Saudi Aramco, Shell, NAM, Petrom, ExxonMobil and Petrofac. More from www.oilit.com/links/1104_29.
IBM and Shell report a ‘transformation’ of employee training on the use of Shell’s new global terminal automation system (GTAS). GTAS is helping Shell standardize applications across more than 150 terminals and depots around the world. GTAS offers terminal staff and drivers enhanced safety controls and lowers maintenance and support costs. The system interacts with Shell’s business and finance systems, increasing the efficiency of invoicing and financial processes.
Some 1,000 employees in 13 countries are being trained on the new system using a new, on-demand, interactive environment that re-creates typical fuelling scenarios in a safe learning environment. The e-learning tool was built through collaboration between Shell experts, a team of IBM consultants and e-learning developers. GTAS is localized for Brazilian, German, Italian and Turkish users. More from www.oilit.com/links/1104_32.
Absoft has registered with Achilles JQS, a supplier database used by Norwegian oil and gas operators and contractors.
PAA Natural Gas Storage has selected Allegro Development Corp.’s Allegro 8 platform to manage hedging activities at its natural gas unit.
Blue Horseshoe has implemented the Microsoft Dynamics AX ERP system for Amalie Oil Co. The solution includes components for warehousing, transportation, business intelligence and a vendor portal.
Eurotech Computer Services has signed a partnership agreement with WAN optimisation and ISP traffic shaping provider Opteq Systems International.
Real-time retail gas and diesel prices tracker GasBuddy.com has teamed with online provider of fuel and risk management services Pricelock, to help businesses control their fuel prices.
Intertek has been awarded a contract for the provision of Fuel Quality Monitoring System services by the French Ministry of Ecology and Sustainability.
Invensys Operations Management has implemented its SimSci-Esscor/ROMeo optimization software at CNPC’s Jilin Petrochemical unit. Invensys has also signed a sub-licensing deal with France-based Axens for its SimSci-Esscor PRO/II process simulator.
Itron has kicked off a four year deployment of a million 100G Datalogging gas ERT modules for Alberta-based ATCO Gas.
KBR’s new Middle East-based unit has been awarded an engineering and project management services contract by Saudi Aramco.
LMKR has entered a strategic partnership with Object Reservoir to ‘close workflow and data integration gaps’ in geology, geophysics and engineering.
Brazilian exploration and production company OGX Oil and Gas is to implement Landmark’s DecisionSpace environment.
Suncor Energy has selected Optimized Systems and Solutions’ ‘OSyS’ solution for enterprise-wide management of change and incident management.
Total has selected Energy Solutions International’s PipelineManager as a component of its Frigg UK (FUKA) pipeline management system.
The Norwegian Center of Excellence in Physics of Geological Processes has chosen Visualization Sciences Group’s Avizo as its main 3D analysis and visualization software.
An unnamed ‘leading multinational energy company’ has ‘invested’ €350,000 in ReadSoft’s SAP-certified Accounts Payable Automation Solution.
Citation Technologies and Antea Group are to offer the Inogen Global Environmental Health and Safety audit protocols, originally developed by Antea. The Inogen protocols provide real-time access to current EHS regulations.
ShipConstructor Software and IFS North America are partnering to enhance the integration between their respective applications.
Total has adopted Paradigm’s SKUA 3D modeling software as a stand-alone application and is to integrate the tool with its proprietary geosciences platform.
The Global Reporting Initiative is calling for comments on the final draft of the Oil & Gas Sector Supplement. GRI’s sector-specific reporting guidance will enable companies to make their sustainability reports more relevant and easier to produce. Comments close on 20th July 2011—www.oilit.com/links/1104_62.
Drilling regulators around the world have formed a working group to develop global offshore drilling standards. US offshore drilling supremo Michael Bromwich is leading the initiative.
The network common data form (netCDF) is now an official Open Geospatial Consortium (OGC) standard. netCDF was originally developed for the earth science community for the management of multidimensional array data. The self-documenting format embeds semantic metadata and is developed and supported by the University Corporation for Atmospheric Research (UCAR)—www.oilit.com/links/1104_63.
The Pipeline open data standards organization has approved release 5.1 of its PODS data model. The release includes new ‘one call’ ticket and boundary tables, damage tables and other model enhancements—www.oilit.com/links/1104_64.
PPDM has announced the Petroleum Education Task Force, an international roster of experts who will work with PPDM on the development of an accredited PPDM Data Management Education program—www.oilit.com/links/1104_65.
The SEG-D Rev 3.0 committee met earlier this year and has issued a revised standard document available on www.oilit.com/links/1104_66.
V1.3 of Cisco’s Physical Access Manager (PAM) includes an upgrade to its IP-based dispatch and incident response solution. PAM ensures that if a system fails, access identity and operations still function. Video cameras are used as motion-detection sensors, consolidating alarm management. VP Bill Stuntz said, ‘New security threats and regulations on critical asset protection are driving demand for more efficient physical security technologies’ (www.oilit.com/links/1104_19).
A 47 page report from the European Network and Information Security Agency (ENISA), ‘Cyber Europe 2010—Evaluation Report’ (www.oilit.com/links/1104_20) reports on the first pan-European exercise on Critical Information Infrastructure Protection (CIIP). The exercise, carried out in November 2010, concluded that member states need contingency plans and a ‘roadmap for pan-European exercises and preparedness.’
A 28 page report (www.oilit.com/links/1104_21) from Intel unit McAfee and the Center for Strategic and International Studies reveals that despite a ‘dramatic increase’ in cyber attacks, critical infrastructure (including oil and gas) is unprepared.
Amongst those whose preparedness has been called into question is McAfee itself, whose website was found to be ‘full of security holes’ according to the YGN Ethical Hacker Group, reported in Network World (www.oilit.com/links/1104_22).
CNN reported (www.oilit.com/links/1104_23) that BP lost a laptop with personal details of claimants to the Macondo incident.
Royal Dutch Shell has awarded Johnson Controls’ Global WorkPlace Solutions unit a facilities management contract covering 12,000 Shell retail outlets in 27 countries. Shell’s gas stations are staffed by 150,000 people and serve approximately 2.8 million customers per day. The agreement builds on an existing five-year relationship between the two companies and is one of the largest single deals in the fuel retail sector.
Emma Fitzgerald, VP of Shell’s global retail network said, ‘The extension of this contract represents an important step in ensuring a joint outcome-based approach targeting significant and sustained improvements in health and safety, security and the environment (HSSE), operational excellence and innovation.’
Guy Holden, VP and general manager of Johnson Controls’ Global WorkPlace Solutions added, ‘The global portfolio approach to managing facilities and resources is a growing trend. This agreement demonstrates the competitive advantage our focus on continuous improvement provides and underscores the expertise and best practice that we have developed in the oil and gas market, particularly in HSSE.’ The contract covers 27 countries in EAME, Asia-Pacific and the Americas. More from www.oilit.com/links/1104_8.
Fluor Corp. is to team with Azima DLI on a ‘comprehensive’ solution for predictive maintenance. The solution targets, inter alia, the oil, gas and petrochemicals verticals with focus on emerging economies where operations and maintenance (O&M) needs have been identified. Fluor is to contribute O&M resources and expertise including site-based data collection, analysis and corrective action planning. Azima DLI is to provide its ‘Watchman’ Reliability Portal analysis software and instrumentation.
Kirk Grimes, president of Fluor’s Global Services Business Group said, ‘Fluor and Azima will share remote analysis and planning responsibilities based on geography, size of the project and applicable execution capabilities. We expect the results to meet client needs for improved reliability and maintenance.’
Randy Johnson, VP sales and marketing with Azima added, ‘Our software and data collectors are designed to perform under harsh conditions and to ensure that programs deliver on the promise of machine reliability and uptime.’ More from www.oilit.com/links/1104_8 (Azima) and www.oilit.com/links/1104_16 (Fluor).
Verian Technologies and Cortex Business Solutions have partnered to offer a paperless invoicing solution to E&P companies. Verian is to integrate Cortex’ Trading Partner Network within its procurement automation solution, enabling E&P clients to eliminate paper invoices, speed invoice approval, and capture early-pay discounts. One of Verian’s clients, an independent oil and natural gas company based in Houston, will go live with the joint technology solution later this year. The integration will provide the E&P firm with paperless invoicing to around 80% of its supplier base.
The (anonymous) client’s VP of IT said, ‘Using Verian’s procurement automation system, we’ve been able to move most of our requisitions onto purchase orders. However, the nature of our business makes it nearly impossible to create an order for every purchase. We needed to find an easier way to process non-PO invoices. After looking at several options, it made the most sense to augment the Verian system we’ve been using for several years with Cortex’s supplier network.’ Cortex’ buy-side clients include Husky Energy, Apache Canada, Bonavista Petroleum, Murphy Petro-Hunt and Energen. More from www.oilit.com/links/1104_17 (Cortex) and www.oilit.com/links/1104_18 (Verian).
E.ON Ruhrgas is to deploy Infotechnics’ ‘Opralog’ operations reporting and logging system at its Southern North Sea Babbage gas production platform. Babbage, the first platform operated by E.ON’s UK unit, will have a 2 million cu. m/day capacity when fully on stream. Opralog is an operations logging tool that captures real time data from sources such as SAP, IBM/Maximo and OSIsoft’s PI System, making such data accessible to a variety of users. Opralog consolidates information from paper logs, spreadsheets and legacy databases into a single source of business intelligence. Key status information on safety, environment and plant issues is usable in support of shift handover and production meetings.
E.ON controls engineer John Newton said, ‘Control room operators have to log events, report on them and communicate any decisions and actions taken. Opralog allows us to do this in a structured manner—improving reporting and communication of key data.’ Infotechnics’ Jon Howard added, ‘Opralog can be extended by E.ON staff for use as a repository for handover check lists, safety meeting information and daily reports and safe cards.’ Opralog customers include Scottish and Southern Energy, BP and National Grid. More from www.oilit.com/links/1104_10.
Shell Europe has deployed server cabinets from Kell Systems at ‘virtual reality’ collaboration centers, a.k.a. ‘Viz rooms’ in Aberdeen and Stavanger. The Viz rooms provide communications and 3D modeling on two high-performance HP XW9400 workstations. Shell found the noise generated by the workstations was a problem, as Viz room coordinator Simon Green explained, ‘The rooms are quite small, so the noise of the PC towers working hard was distracting—especially for videoconferencing.’ Green also observed, ‘The PCs also experienced overheating problems as they are top-end systems—the heat they generated was incredible.’
Shell chose Kell’s PSE24 ComputerVault Pro model—a 24 rack space model. One cabinet was installed in each of the five Viz rooms in Aberdeen. The cabinets solved both the noise and overheating issues. Ultra-low-noise exhaust fan modules keep the systems cool enabling the units to run with the cabinet doors closed—reducing noise by some 18.5dBA. At 38 Watts per cabinet, power consumption is ‘considerably lower’ than an air conditioned server room. Kell clients include Microsoft, BP, NASA, the US Navy and Google. More from www.oilit.com/links/1104_11.
At this month’s AAPG, LMKR unveiled a social networking plug-in for the GeoGraphix Discovery Suite, a geoscience workstation formerly marketed by Halliburton/Landmark. The ‘Convofy’ service was developed by ‘Scrybe,’ an LMKR/Adobe joint venture and is described as a ‘private social network for your company.’ Convofy adds real-time, ‘in-context’ collaboration to GeoGraphix, allowing users to push images and files, sharing information relating to their exploration, drilling, production and investment options.
Users across remote locations can connect with each other and collaborate by sharing project status updates, holding discussions around cloud-based information and researching topics in the context of their company. Convofy is claimed to increase subject matter visibility and transparency and to avoid time wasted traveling or in meetings.
Companies can build their own internal knowledgebase from discussions posted on the Convofy company network. This information can be used to audit decisions made by workgroups, evaluate their performance and productivity or to rapidly educate newer members of a team or project. Convofy also ensures the security of proprietary E&P information with hosting on an SSL encrypted link to the Amazon data cloud. More from www.oilit.com/links/1104_12 (Geographix), www.oilit.com/links/1104_13 (Convofy) and www.oilit.com/links/1104_14 (LMKR).
Wood MacKenzie has teamed with Elsevier to augment its upstream information offering with ‘Geofacets,’ a collection of indexed geological maps and other information sourced from Elsevier’s earth sciences journals. Geofacets (Oil IT Journal, October 2010) is a web-based database of georeferenced geological maps. Users can search via keyword or map interface and refine results with search ‘facets’ such as map type, surface area and geological basin. The combined offering targets geoscientists involved in assessing a geological basin’s characteristics and potential.
Woodmac’s success rates and fiscal regime information are now accessible through map overlays that users can switch on as they view maps within Geofacets, enhancing Geofacets’ database of 125,000 geological maps. Geofacets product manager Phoebe McMellon said, ‘This technology integrates well with existing workflows and lets users place scientific information in a commercial context.’ The Woodmac overlays are now available to all Geofacets users. More from www.oilit.com/links/1104_15.