I seem to be writing a lot about cloud computing lately, as in this month’s lead from Chevron, Shell’s keynote at the Microsoft Global Energy Forum (page 6) and a few more cloudy announcements on page 8. As an editor it behooves me to follow the crowd as it were, to get excited along with the rest of the industry, and to track the ‘disruptive’ new technology. But what is strange is that the announcements from Amalto, Ariba and Modulo come from companies that have been providing what seem to be the same hosted services since before the ‘cloud’ was invented. So what is the cloud? What’s the buzz?
If you define the cloud as off-site IT, then arguably, Norway’s Diskos, the UK’s CDA and various commercial data hosting environments in the US are ‘in the cloud’ and have been since the 1990s. Even if you extend the definition to include ‘application service provision,’ most of these examples of prior art pass muster in that they have some sort of back end serving up the data. Many other companies offered hosted applications before the cloud came along; many went belly-up in the dot com bust of 2000. A search for ‘application service provision’ on www.oilit.com returns 46 references from 2000 onwards, with offerings from Landmark, Paradigm, Petris, Schlumberger and others. Hosting is so commonplace that hosters use third party hosts themselves—a thriving business for generic data center providers such as CyrusOne, acquired last year for a cool $525 million.
So if hosting has been going on for a decade, what does the ‘cloud’ actually bring to the table? The answer is an API on your desktop. Amazon’s Web Services, Google’s App Engine and Microsoft’s Azure offer the same potential for hosting as the previous ‘ASP’ based offerings, with one important difference. The new cloud offers the skunk worker—sorry, I should have said your in-house developer—the opportunity to build their own hosted applications and deploy them ‘at greatly reduced cost’ on cloud-based hardware. You may argue, as I would, that there is more to an application than a compile, and that the economics of running it on hosted hardware may prove illusory when the true cost of ownership is evaluated (cf. Bill McKenzie’s ‘chump change’ in this month’s lead). But that is not really where I want to go right now. My main point here is that behind each of the new ‘clouds’ is a heavy-hitting company that wants your business. They are going after it by offering your programmers seductively easy development environments that are locked in to their hosting services. A perfect storm indeed.
A while back I reported on our acquisition of a holiday home in the south of France. That editorial bifurcated in the direction of ‘air geothermal’ home heating and thermodynamics—setting a challenge1 that as yet has not been solved. I would encourage those of you with some recollection of high school physics to read this and respond. Heck, I may even give a prize to anyone who can explain this conundrum in a way I can understand.
Anyhow, we spend as much time as we can down in our new residence, hidden away in an area of great natural beauty: a pretty village on a river that cuts its way through the limestone plateau of the Larzac. In one of those ‘what I will do when I retire’ fantasies, I even imagined taking Texan geologists around on tours of the limestones and dolomites, while sampling the local fare—sausages and ‘aligot,’ a healthy blend of cheese and potatoes.
I was shaken from such reverie last year when I spotted that several exploration permits had been awarded in the south of France—from Provence around to the Cevennes and even up to the area where our holiday home is located. This, as you can imagine, put me on the spot. I have written copiously over the last couple of years on the subject of shale gas exploration, almost all of it in a positive light; indeed the technology, as presented at gatherings of the AAPG and SPE, is quite remarkable and the results, if you believe the promoters, are rather amazing. Shale gas has transformed the gas supply equation in the US. Sure, there were some environmental concerns—but things were going fine until the gassy geological provinces extended north into the ‘Nimby2’ country of the north east.
But now I am confronted with the same problem as the northeastern Nimbies. Did I really want shale gas exploration on my beautiful doorstep? Such musings were cut short by events and in particular by an environmental activist called José Bové (a very rough equivalent to Al Gore) who has taken up shale gas as a ‘cause célèbre.’ I can assure you that the Journal de Millau and Monsieur Bové paint shale gas exploration in a very different light from the SPE. The outcome to date is that the French government has declared a moratorium on shale gas exploration in France. My soul searching is likewise suspended.
I have been reading the excellent book “Deep Water3” from the National Commission on the Deepwater Horizon. I am sure that you are aware of its main conclusions. But there is also an interesting section (page 225 onwards) covering the evolving role of the American Petroleum Institute. The API has traditionally been tasked with establishing standards and best practices for operations. The Commission notes however that ‘It is clear that the API’s ability to serve as a standard-setter for drilling safety is compromised by its role as the industry’s principal lobbyist and public policy advocate,’ and later that ‘API-proposed safety standards have increasingly failed to reflect “best industry practices” and instead express the “lowest common denominator.”’ API shortfalls have ‘undermined the entire federal regulatory system.’ Moreover, ‘Inadequacies in the resulting federal standards are evident in the decisions that led to the Macondo well blowout.’ Strong stuff indeed!
2 Not in my back yard.
3 ISBN 9780160873713— www.oilit.com/links/1102_1.
Bill McKenzie (Chevron) speaking at the Microsoft Global Energy Forum last month in Houston described a ‘proof of concept’ (PoC) development to investigate use of cloud computing in support of oil and gas joint venture collaboration. Initially, Chevron was skeptical about the merits of the cloud. It seemed like nothing new and the touted savings in terms of reduced server count and ‘storage by the drink’ amounted to ‘chump change.’ But maybe the situation was different for a joint venture (JV).
JVs benefit from stable, strong tools and practices. But in today’s high-end projects with MWD, decision rooms etc., more complex data is being shared. JV partners may want to see a cellular reservoir model or cooperate on incident response. Here, Energistics’ EnergyML data sharing standards and the new ‘big data’ movement could combine with cloud computing to make for a ‘perfect storm’ of new technologies.
The facet of cloud computing of most interest to a JV is the Platform as a Service offering, using internal applications and the cloud’s data services. Amazon Web Services and Google’s App Engine got a brief mention, but in deference to his hosts, McKenzie described the PoC using Microsoft’s Azure.
For technical computing to work in the cloud, a ‘canonical’ view of data is required that all agree on. Standards for units of measure and CRSs in the cloud will be a major step forward. On top of this, services for query and management can be built.
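The canonicalization idea above can be made concrete with a small sketch. This is purely illustrative: the unit names, factors and function below are my own invention, not an Energistics or cloud-vendor API, but they show how partner data in mixed units might be normalized to an agreed canonical form before being served from a shared store.

```python
# Hypothetical sketch: normalizing JV partner data to a canonical
# unit set before sharing. Unit names and the factor table are
# illustrative assumptions, not any published standard.

CANONICAL_FACTORS = {
    # unit -> multiplier to the canonical unit (metres, cubic metres)
    "m": 1.0,
    "ft": 0.3048,
    "bbl": 0.158987294928,  # US oil barrel to cubic metres
    "m3": 1.0,
}

def to_canonical(value, unit):
    """Convert a measurement to its canonical unit, or fail loudly."""
    try:
        return value * CANONICAL_FACTORS[unit]
    except KeyError:
        raise ValueError(f"no canonical mapping for unit '{unit}'")

# A depth reported in feet and a volume in barrels, canonicalized:
depth_m = to_canonical(8200, "ft")     # 2499.36 m
volume_m3 = to_canonical(1000, "bbl")  # ~158.99 m3
```

The same pattern extends to coordinate reference systems: data tagged with a CRS code would be re-projected to an agreed canonical CRS on ingest, so that query and management services can be built on a consistent view.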
For security, SAML 2 ‘may not be a panacea,’ but it is getting there. The ‘big data,’ NoSQL approach implies a shift away from ‘third normal form’ relational databases such as PPDM. Instead of seeking agreement on a rigid data model, the NoSQL approach (as exemplified by Google’s Bigtable) involves storing data in an unstructured store combining full text and key-value pair lookup, an approach which appears promising for a JV. Under the EnergyML hood are WitsML, ProdML and the ‘new child in the family,’ ResqML. These can combine to transport the essential elements of the shared earth model, although there is ‘still a lot to do!’
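The key-value plus full-text combination described above can be sketched in a few lines. This toy store is an assumption of mine for illustration only; it mimics the shape of the idea (schema-free documents, lookup by key, lookup by word) rather than Bigtable’s actual API.

```python
# Minimal sketch of the 'NoSQL' idea: documents go into an
# unstructured store keyed by ID, with a simple inverted index
# providing full-text lookup. Illustrative only.

from collections import defaultdict

class DocumentStore:
    def __init__(self):
        self.docs = {}                  # key -> document (dict of attributes)
        self.index = defaultdict(set)   # word -> set of document keys

    def put(self, key, doc):
        self.docs[key] = doc
        for value in doc.values():
            for word in str(value).lower().split():
                self.index[word].add(key)

    def get(self, key):
        # key-value lookup
        return self.docs.get(key)

    def search(self, word):
        # full-text lookup via the inverted index
        return [self.docs[k] for k in self.index.get(word.lower(), set())]

store = DocumentStore()
store.put("well-001", {"name": "Alpha 1", "status": "drilling ahead"})
store.put("well-002", {"name": "Beta 2", "status": "suspended"})
```

Note that documents with different attributes coexist in the same store, which is exactly what makes the approach attractive when JV partners cannot agree on a single rigid data model up front.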
Generic create/read/update/delete services have been built atop the EnergyML data services. Other key technologies include Microsoft’s ADFS in the cloud and (for Chevron) identity services from Ping Identity. The PoC showed that the SAML 2 token provider works along with the EnergyML data services.
McKenzie wound up with a demo of the PoC in action on a shared asset model, adding users, permissions and data to the store. Clients drill down through the asset hierarchy for reporting information. Current PoC partners are Pioneer, Chevron, Atman Consulting and Microsoft. Other interested parties are invited to join up. More from David Barret, email@example.com.
PumpScout.com, a new service from Scout Directories, is an online pump supplier search engine connecting buyers of industrial pumps, including those used in the oil and gas vertical, with suppliers. The e-commerce website lets engineers, maintenance professionals and contractors enter their pump criteria and identifies the right suppliers.
PumpScout has data from 80 pump suppliers and their product lines including Blackmer, AMT, Ruhrpumpen, Griswold, Wilden, and Cat Pumps. Product lines cover industries including oil and gas, chemical process, water and wastewater and more.
PumpScout CEO Justin Johnson said, ‘Buyers know the type of pump they are looking for, but getting quotes from multiple suppliers can mean hours trolling the Internet. Our system saves them time and provides a quick and easy way to compare price quotes.’
Featured pump manufacturers and distributors receive qualified sales leads, increased web traffic and improved search engine rankings. The website offers tips and tools from experts in fluid handling. A pump types guide helps buyers identify the right pumps for their application. More from www.PumpScout.com.
John Hofmeister (former president of Shell Oil Co.) wound up the Microsoft Global Energy Forum this year with an inspired rant against successive US administrations’ performance on energy policy. Hofmeister is not one to pull his punches. Senators who defended biofuels as an alternative to increased oil production will be ‘kneeling come 2012.’ The gas price is the single most important pocketbook issue to the US voter. While US consumption has dipped, Asian demand is surging and diesel shortages are reported in China.
The 111th Congress’ plan was to ‘move beyond oil to wind, solar and biofuels.’ Unfortunately, 98% of the US’ energy does not come from renewables and never will! Back in 1973, President Nixon proclaimed ‘energy independence by 1990.’ A similar claim has been made by all subsequent presidents. Back then, the US produced 70% of its oil; today, it imports 70%.
Plugging the growing shortfall now means competing with Asia. We will soon be back to the $147 peak and beyond. The average age of electricity plants (fossil fuel and nuclear) is 30 years. The Yucca Mountain nuclear waste repository was canceled after $20 billion was spent. Without waste management, nuclear is dead.
Today, politics determines energy outcomes to the benefit of those who manipulate the system as a function of the electoral cycle. High energy prices are a serious problem. In 2008 they contributed directly to the meltdown as folks filled up their cars and stopped paying the mortgage. Meanwhile the oil industry is subject to ‘strangulation by regulation’ and no new permits for coal production have been granted since 2009. Coal produces 49% of US electricity.
We need more energy from all available sources. Energy efficiency is a good thing too, as is environmental protection. We need new infrastructure, witness the recent explosion of a 50-year-old pipeline in California. Instead we just defer and defer. Government is broken, unfixable—even the court system blocks progress. We need a different governance model or we will price 50% of Americans out of the market.
Hofmeister thinks we should learn from history. In the early 20th century, another financial meltdown led to the establishment of the Federal Reserve System. Like it or not, the Fed has saved the day many times and sees beyond the electoral cycle. We need to establish an independent Board of Energy Governors with 14 year terms, who will make decisions in the interest of the nation.
We also need to develop superconducting transmission, the smart grid and to plan for the management of all ‘gaseous waste,’ including CO2, thereby taking care of global warming. If ‘nimbys’ get in the way, the good of the nation takes precedence.
Hofmeister has set up the Citizens for Affordable Energy to lobby for this and to ensure that ‘supply is greater than demand.’ This is to be achieved by removing or adapting regulation to make business work. ‘We need a new oil production target of say 10 million bopd, up from today’s 7.’ This will create three million new jobs and will take about a decade. More from www.oilit.com/links/1102_23.
Quest for Data Quality1 (QDQ) is not about ‘data quality’ as perceived by the IT community. It is about recording good logging data and understanding the pitfalls. QDQ sets out to ‘express real concerns’ about the data that is going from the logging truck into the ‘digital oilfield.’ After a disappointing first chapter on metrology, Theys gets into his subject, working through the logging tool suite, enumerating various gotchas. One of Theys’ hobby-horses is the repeat section. This should be an opportunity for on-the-spot QC. Unfortunately the trend today is for a ‘head in the sand’ approach, with no repeat section recorded in case it confuses the logger! Even worse, some LWD data is streamed straight into the corporate database, leaving potentially useful memory data behind.
An important job for the logger is the bringing of bad news—a dry hole or a poor cement job. The temptation to prefer ‘pleasant’ information needs to be avoided. One chapter compares logging company ‘brochure’ specs with reality to conclude that the evaluation of logging uncertainty is possible, but that a dialog with vendors is required to get behind their ‘simplistic’ public specs. Theys takes a swipe at remote supervision of the logging process noting that ‘reality’ can prove evasive to the remote observer. Theys suggests that logging engineers sign up to a ‘Hippocratic oath’ not to modify, hide or obfuscate the data they acquire. All in all, QDQ is an interesting collection of anecdotes, observations and entreaties. It does suffer from the absence of an index, from intrusive heading numbering, and rather idiosyncratic chapter titles that did not simplify the reviewer’s task!
1 Editions Technip 2011—www.oilit.com/links/1102_22.
The latest release (4.2) of INT’s INTViewer consolidates the toolset’s move from a set of E&P-specific ‘widgets’ to a fully-featured toolset for data visualization and seismic analysis. The new release adds interactive slicing of a 3D volume, well trajectory plots and improved performance for large data volumes. New plug-ins provide connection to Microsoft’s Bing Map server for map data or satellite imagery. A seismic trace header editor for SEG-Y and Seismic Unix formats, and a data browsing window into open datasets add to the viewer’s data management capability. A ‘propagate’ mode is now available for semi-automated seismic horizon tracking. Enhanced EPSG coordinate reference system code handling and on-the-fly re-projection for disparate datasets is available in the map view.
INTViewer’s ‘standards-based,’ extendable architecture is based on the Netbeans Rich Client Platform. INTViewer’s API allows for customization of menus, data and displays. The interpretation environment can be enhanced with proprietary plug-ins and utilities. More from www.oilit.com/links/1102_21.
Speaking at the IQPC Knowledge and Information Management for Oil and Gas conference in Houston last December, Monica Chhina outlined Nexen’s Lessons Learned program. For Chhina, lessons should be considered as assets. Significant value comes from managing corporate lessons so that successes are repeated and mistakes are not! The need for an improved lesson-managing culture stemmed from the observations that expensive mistakes were being repeated and past lessons learned could not be found and shared. The situation in Drilling and Completions (D&C) was better, and Nexen’s first move was to share D&C lessons corporately. D&C teamed with facilities on the initial project, leveraging D&C’s process and database. Nexen now operates a cyclical process starting before a project begins with a review of what learnings apply to the project. A lessons log is maintained to ensure that all applicable information is leveraged. As a project proceeds, an ongoing review process captures new lessons which are entered in the log. After a project is completed, learnings are vetted, recommendations are made and adjustments to Nexen’s business processes are rolled out.
Chhina was joined by Charlotte Holmlund for a presentation on Nexen’s organizational network analysis1 (ONA). ONA visualizes an organization through a network, rather than a static hierarchy, showing social network-type relationships across the organization. Nexen uses ONA to optimize networks and identify employees with expertise and knowledge. ONA provided clues as to who the key ‘connectors,’ influencers and information brokers are. The ONA database was built by surveying personnel as to their own self-assessment of their skill sets and competency level along with their view of co-workers’ skills and proficiency. Results translate to a proficiency required vs. proficiency acquired map of the organization that can be used to identify gaps and locate the ‘go to guy,’ the subject matter expert. This allows for interesting comparisons of self assessments and crowd-sourced (peer nominated) SMEs.
Using the ONA begins with a specific business problem statement. Potential participants are identified, questions are designed and objectives set. Survey results are collated in ONA software (Ucinet/Netdraw2) for analysis. Subjects investigated using this approach include SME qualification for SAP-related activities such as business intelligence, joint venture accounting and finance.
Christian Liipfert, former program director of BP’s global records and information management effort has now set up Liipfert Consulting to advise on risk and compliance. His presentation focused on identifying opportunities to improve productivity, decision support and risk mitigation and establishing organizational requirements for global information governance. Liipfert’s approach addresses policies, roles and responsibilities, processes and technology—and dealing with multiple stakeholders across business, legal and IT. More from www.oilit.com/links/1102_11.
Dwight Cates, Knowledge Manager, Shell Projects & Technology (Americas) described how Shell is ‘easing the pain’ of the great crew change (GCC). The GCC results from decades of a volatile business environment that has seen staff reductions during downturns which are not wholly compensated for in the upturns. The result is an age gap, with a large cohort of ‘grizzled veterans’ due to retire in the next few years. There is an urgent need to get new staff up to speed and productive as quickly as possible. Shell’s solution is to capture and share more information and to foster an ‘open conversation’ between those who know and those who need to know. This is achieved through a simple framework of standard tools, ‘high-touch’ staff engagement and ongoing coaching.
The toolkit consists of ‘Peer Assist’ meetings, Wikis, Blogs and a ‘retention of critical knowledge’ (ROCK) program. Peer Assists target ‘non standard’ operations involving new technologies. Peer Assist facilitators assemble project teams to review plans before work starts. The Wiki is used by 70,000 Shell professionals—connecting communities across the organization. The Wiki provides content management and search. The Blogs offer a less formal approach to knowledge sharing—on the ‘most discussed’ tab a tantalizing post was titled ‘So your boss sucks, what do you do now?’ But the most popular was the more prosaic, ‘How do you reduce calendar size in Outlook?’ ROCK is a structured interview process that involves peers and their successors to ensure knowledge is passed on to the next generation.
Shell is throwing a lot into its knowledge retention program, with progress monitored via KPIs at VP level and a global ‘hearts and minds’ campaign to win folks to the cause. This ‘low tech high touch’ program involves ‘relentless’ coaching. KPIs show a 340% increase in community membership and a 400+% hike in traffic. There has also been a large increase in Wiki and LiveLink-based knowledge capture.
Rich Schmidt, VP and CIO Projects & Technology (P&T) for Royal Dutch Shell, described another facet of Shell’s knowledge management. The P&T unit is Shell’s ‘delivery arm’ for capital projects and technology. P&T is a major contributor to the ROCK. Collaboration tools include Microsoft Office Communicator, LiveLink, HALO TelePresence, with pilots on the ‘Yammer’ enterprise social network and IBM’s ‘Jam’ collaboration environment. Capital projects also leverage the ASSAI document workflow template. Shell, along with its EPC contractors, makes extensive use of the EP Catalogue. There has been a huge growth in the use of the Shell Wiki by teams and individuals. The future will see a major SAP upgrade and deployment of Microsoft’s ‘next generation’ technology—SharePoint 2010 in the cloud, Fast enterprise search and active archival. Blending the new with legacy systems remains problematic. A new ‘structured information management approach’ is being built into the SharePoint roll-out. This will match metadata with work processes and is to be a component of the templated SharePoint 2010 solution.
Schmidt noted that when a wiki page is added to documents in the repository, retrieval rates soar by at least 700%. These are often built by younger team members but are soon used by all. Previous HTML pages that tried to do the same were quickly abandoned. This approach often revitalizes the traditional electronic document management system and assures proper document control. More from www.oilit.com/links/1102_10.
1 See for instance www.oilit.com/links/1102_8.
Geomodeling Technology’s AttributeStudio 7.0 includes microseismics, hydraulic fracture modeling and well-to-seismic correlation. The company has also announced Sbed 4.1 adding automatic core plug conditioning and direct import of data from Numerical Rocks e-Core pore-scale modeling software—www.geomodeling.com.
Meridium’s Tablet Application Framework allows for the rapid development and deployment of mobile, tablet-based applications that now support elements of Rockwell Automation’s Plant Baseline—www.meridium.com.
Safe Software’s FME 2011 offers ArcGIS users access to cloud data formats from Microsoft and Google along with point cloud datasets—www.safe.com.
Barco’s NSL-5521 video wall display targets control room applications with space-saving LCD technology—www.barco.com.
Blueback Reservoir’s Project Tracker plug-in for Petrel gives data managers visibility and control of project files across the network—www.bluebackreservoir.com.
Ensyte has upgraded its Gastar natural gas management solution with an enhanced Windows GUI. A new version of the oil and gas prospect economics software Prophet has also been released—www.ensyte.com.
ExperTune’s PlantTriage can now share information with maintenance planning and other enterprise resource planning (ERP) tools via a flexible XML intermediary language—www.expertune.com.
The 3.2 release of Fugro-Jason’s PowerLog includes a new electrofacies interpretation module—www.fugro-jason.com.
Landmark is throwing in the towel on Solaris. Future operating systems are Red Hat Enterprise Linux and Microsoft Windows—www.lgc.com.
SPT Group’s Mepo 3.4 release adds a GUI workflow, new proxy modeling features and improved QA—www.sptgroup.com.
Oil Price Information Service has launched the OPIS Retail DataHouse—an online database of current and historical US retail gasoline and diesel prices and margins—www.opisnet.com.
Permedia Research Group’s MPath 4.18 adds ‘genetic tracers’ for visualizing 3D fetch volumes, calculation tools and techniques for deriving pore pressure from 3D seismic data—www.permedia.ca.
The 8.6 release of Seismic Micro-Technology’s Kingdom Suite includes microseismic data integration for unconventional interpretation workflows—www.seismicmicro.com.
Flare Solutions has included metadata of the SPE’s OnePetro library in its E&P Catalog—www.flare-solutions.com.
Speaking at the RokDoc user meet this month, Harald Flesche reported on an ongoing collaboration between Ikon Science and Statoil that is turning RokDoc into an ‘open-ended’ platform for rock physics. The target interface for the developers is the RokDoc physics module (RPM), a set of equations and statistics that characterizes a particular facies.
RPMs are developed in either Matlab or C and delivered as a DLL, along with an XML metadata file.
The RPM is consumed by RokDoc where it can be selected from a drop-down list, inheriting the RokDoc look and feel. Flesche concluded that ‘RokDoc is now a truly open ended platform that will enable quick implementation of internal methods and new industry standards for quantitative seismic analysis.’
Another client-supported development is ‘GraviSeis,’ a new module for reservoir-scale airborne full tensor gravity gradiometry (FTG) data integration. GraviSeis was developed in support of Tullow Oil’s development of the Albertine graben in Uganda. The idea is to develop a high resolution reservoir model ‘at 5-10% of the cost’ using 2D land seismics. GraviSeis blends inverted FTG data with 2D seismics, targeting areas where 3D data is considered uneconomical. More from Lauren King, firstname.lastname@example.org.
A study commissioned by the UK’s Common Data Access, performed by Schlumberger, investigates the value of data and data management to oil and gas companies. Survey authors Steve Hawtin and David Lecore use valuation methodologies suggested by the International Valuation Standards Council. A parallel approach involved quizzing 20 senior executives as to the contributions they believe accrue from their own work, their tools, processes and the data. Data was considered, on average, to contribute around 38% to the overall value pot, with people’s worth at a modest 32%1.
Further discussions with oil company staff established that data contributes ‘between 25 and 33%’ of value in a project—a lot of money for an asset team that is generating millions of dollars in value per year. But this is just the value that accrues through normal day to day business. Data’s contribution is even more important when its ‘unexpected’ value is taken into account. The report includes examples from the North Sea and elsewhere where smart re-use of old data made a major impact on the bottom line.
The report also provides insights into the cost of data in the context of decision tree analysis—using the value of information approach. A rather long-winded ‘round table’ discussion by data management professionals is included, as is a historical investigation by word count of data management trends using the OnePetro text database from the SPE and others. The survey is available as a 50 page download from www.oilit.com/links/1102_12.
1 Given that a decent 3D survey can cost several million dollars, such modesty is relative. Nobody asked the data what it thought of the interpreters!
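For readers unfamiliar with the value of information approach mentioned above, a minimal sketch follows. All the numbers are invented for illustration: the value of (here, perfect) information about a prospect is the expected value of deciding after seeing the data, minus the expected value of the best decision made blind.

```python
# Hedged sketch of a 'value of information' calculation on a
# one-branch decision tree. Probabilities and payoffs are
# invented for illustration, not from the CDA report.

p_success = 0.3          # chance the prospect works
payoff_success = 100.0   # $MM if it does
cost_dry = -20.0         # $MM lost drilling a dry hole

# Without information: drill only if the expected value is positive.
ev_drill = p_success * payoff_success + (1 - p_success) * cost_dry
ev_without = max(ev_drill, 0.0)          # here: drill, EV = 16 $MM

# With perfect information: drill on good news, walk away otherwise.
ev_with = p_success * payoff_success + (1 - p_success) * 0.0  # 30 $MM

value_of_information = ev_with - ev_without  # 14 $MM
```

On these numbers, data that perfectly resolved the uncertainty would be worth $14MM, which is the sense in which old seismic or well data pulled from an archive can make ‘a major impact on the bottom line.’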
With a turnout of around 800, albeit with a considerable number of Microsoftees in attendance, the Microsoft Global Energy Forum1 has grown considerably since Oil IT Journal’s last visit back in 2009. Shell VP IT Services Jay Crotts kicked off the proceedings with a keynote on ‘Combining IT and business skills to generate value.’ Crotts noted the power of modern consumer technology. Today, a mere graphics card costing a few hundred dollars packs the power of what used to be a multi-million dollar supercomputer. We are on the cusp of a similar evolution with cloud computing.
What does this mean for Shell with its 140,000 desktops, 14,500 servers, 140,000 phones and 10 million customer interactions per day? Shell is not ready to abandon its own servers and move to a cloud-only environment. Current thinking is along the lines of ‘hybrid cloud computing’ (HCC) that lets Shell pay for what is used, enables innovation and provides variable capacity. HCC can be carved up as follows—1) infrastructure as a service, 2) platform as a service and 3) software as a service (I/P/SaaS). Such a tiered approach lets IT ‘migrate’ from self-managed to supplier-managed. As an example, conventional wisdom has it that the average life of a server in Shell is around two years. But HCC dispenses with this notion. In Houston, servers can be optimized in four hours.
Security is still perceived as a big issue—though this may be less of a concern with a reliable supplier. For Crotts the key is how to extend Shell’s security model to the cloud. Global regulation and export compliance may restrict what can run outside of a given jurisdiction.
The HCC has allowed Shell to ‘seamlessly’ integrate with Microsoft’s own solutions including SharePoint 2010 and Windows Azure, offering IaaS, PaaS and SaaS. Shell has partnerships with Microsoft on SharePoint, Azure and telecoms. Crotts believes that Microsoft is leading the world in the breakthrough technology of the cloud.
Craig Hodges led a Microsoft team demonstrating the ‘Contoso Oil’ proof of concept scenario. Here real time tracking of a compressor spots vibration trouble and initiates a timely intervention from the field engineers. The PoC included a bewildering number of Microsoft products—StreamInsight, Dynamic CRM, Office 365, HyperV and Windows 7. OSIsoft RT web parts and ‘Project DaVinci’ also ran. An intriguing new use of customer relationship management (CRM) involved extending the definition of ‘customer’ to include compressors and pumps! And of course, heavy duty number crunching was performed in the Azure cloud.
Ken Startz recalled that Marathon’s first OSIsoft PI System was installed in a refinery in 1988. The company now has four in upstream, seven in downstream, and another for supply and distribution. The systems are now a lot easier to deploy and manage on today’s Microsoft platform than back in the days of VMS. The systems integrate with Marathon’s ‘ViewPoint’ digital oilfield for data mashups through the SharePoint standard. This supports integration with other commercial software from Kappa Engineering, Halliburton, AspenTech and Schlumberger—blending data from six different control systems. Knowledge management is simplified as new hires only have to learn one technology. The ubiquitous RT Webparts are used by engineers, but also in HR and HSE. Marathon’s flagship Alba field in Equatorial Guinea is now supported by a single PI System, with all data mirrored back to Plano. The huge GE compressors are monitored and debottlenecked remotely. Marathon puts the value of its digital oilfield effort at around 0.1% of its worldwide production. The system has enthusiastic backing from Marathon’s executive VP upstream Dave Roberts. Startz concluded, ‘If you want to be successful, use Microsoft and OSIsoft tools.’
CIO Abraham Galan described how Pemex is hoping to leverage IT to help reverse the significant decline in output it has seen in the last decade during which Mexico’s production has decreased from 3.3 to 2.6 million barrels per day. Pemex is structured as four separate companies with great autonomy and little top-level governance. Data is scattered all over the place and it is a challenge to realize the value—particularly in the context of the constraints imposed by Pemex’ cost reduction initiative. IT investment optimization has led to the adoption of a common platform (SharePoint), common standards—PPDM and ‘MURA’ and lower infrastructure costs through use of the cloud. Pemex’ deployment focuses on applications that leverage all of the above, notably iStore’s ‘PetroTrek,’ used to create a single source of the truth across multiple data sources. Data is the key to unlocking the value in Pemex’ assets, ‘If people can’t find data they will either make it up or rework it!’ Galan noted the knowledge gap between different stakeholders. Data gatherers, interpreters, users and curators each have their own, different versions of the truth. Data sharing is a ‘challenge,’ ‘it is not part of our culture.’ Data can be isolated and invisible. Both centralized and local data stores have their own problems. Galan advocates a middle road—building a single source of the truth by accessing the closest original source—although this is often hampered by data formats, tools and ‘irreconcilable’ technologies.
Data will be key to Pemex’ new incentive-based mature fields program. This heralds a new era for Mexico with new players invited in and represents a low risk opportunity for foreign companies to get revenue by applying new technologies. Pemex’ National Data Repository leverages Microsoft SharePoint with encouraging results. In one field, certified reserves have doubled. Data-driven business intelligence helps discover new fields and increase recovery from existing ones. Pemex is very pleased with iStore’s know-how and technology. In the Q&A, Galan was asked what SharePoint has brought to the table. He noted that building a data repository required a lot of configuration to access existing data sources. This is easy in SharePoint. The decision was also cost driven—a Microsoft frame agreement means that new SharePoint instances are essentially free.
Eric Rearwin outlined Chevron’s work with Invensys/IntelaTrac on a mobile system to support its operator routine duties (ORD) compliance and monitoring effort. This monitors operator data collection, maintenance and greasing activity at Chevron’s 22,000 retail outlets across 6 continents. The activity was previously monitored with paper run sheets and logs. But a lot of exception-based information was either un-recorded or ‘stranded’ and unavailable to other stakeholders. Half the assets in a refinery are un-instrumented—and clipboard data is not timely. There is great potential for rotating equipment monitoring and condition-based maintenance.
Chevron’s solution is to plug vibration monitors into IntelaTrac and to deploy a standard electronic mobile system for ORD. This feeds systems of record in laboratories and data historians. Handheld-based data collection brings ‘proactive’ data analysis, higher equipment availability and faster reaction to changing plant conditions. When Katrina struck, email instructions and alerts meant that operators from other plants could be rushed into action. Tank monitoring makes operators aware of heavy rain—avoiding tank collapse. Other applications have been deployed in the rotating equipment arena. These are ‘$10,000 fixes’ as opposed to $100,000 problems. IntelaTrac is now working its way upstream into Chevron’s E&P units and the ‘field of the future.’ All runs on a Microsoft stack—including Windows Mobile. While technology is key, behavior also needs consideration. It is easier to puzzle over how much memory to load into a handheld device than to check that operators are behaving appropriately. Sites still operate independently—sharing best practices is challenging—especially in the upstream. It is also key that data is actionable—a big complaint is that collected data is not acted on.
Evan Bauman from Shell’s CRI/Criterion catalyst unit likened the way Shell monitors its catalyst performance to how GE monitors its rotating equipment. But you can’t put a sensor into a 1/16th inch pellet, so catalyst monitoring is done by observing process variables. The value of a catalyst is largely determined by how customers use it. Shell has hundreds of users across multiple time zones. Data was previously stored on local drives and collaboration was based on email. Stuff was easily lost in .pst files and there were disconnects between customers and catalyst information. Then Shell deployed its IT ‘standards’ (i.e. SharePoint), first in a 2009 pilot with Logica which built the CatCheck Portal. CatCheck plugs into SAP and offers a discussion board, search and a customer zone—all linked with business process. While there have been observable business benefits, it is ‘hard to put a number on them...’
Also new at the GEF was Idea Integration’s ‘Constellation,’ a SharePoint/ESRI ArcGIS Server combo that targets the upstream. Constellation GIS includes integration with geoscience desktop applications through an OpenSpirit data link.
More from David Barret, email@example.com.
In 2009, we described SharePoint as the centerpiece of the GEF. This time we were expecting to hear more of MURA2. This did not happen. The GEF remains essentially a group meet of very enthusiastic oil country SharePoint users. Maybe MURA will get a session or two in 2012—if anyone can figure out exactly what it is!
1 Download presentations from www.oilit.com/links/1102_3.
2 Microsoft Upstream Reference Architecture.
According to the US Energy Information Administration, fossil fuel-fired power plants will account for 70% of US electricity generation right out to 2035. According to George Pau of the Lawrence Berkeley Lab’s Center for Computational Sciences and Engineering (CCSE), ‘Underground carbon capture and sequestration (CCS) will be key to reducing atmospheric CO2.’ The CCSE has been studying the processes involved in CCS, particularly the mixing that occurs at the CO2/brine interface1.
When CO2 is injected into an aquifer, it collects in a layer above the brine. Over time CO2 diffuses into the brine, causing its density to rise. This denser fluid begins to sink, generating a convective process that increases the rate of CO2 uptake. The CCSE’s Cray XT4 ‘Franklin’ supercomputer was used to figure out the time scale of the process, modeling at a much finer resolution than is usual in geological studies. New parallelized porous media adaptive mesh refinement (PMAMR) code was used for the simulation which took 2 million processor-hours running on up to 2,048 cores. Current PMAMR models are on the scale of meters, but the plan is to scale up the results to characterize a real-world scale aquifer.
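The onset mechanism is easy to reproduce in miniature. The sketch below is a toy 1D explicit finite-difference model—not the CCSE’s PMAMR code—that diffuses CO2 from the interface into a brine column over a simulated day. The diffusivity, column dimensions and the roughly 1% density increase at saturation are assumed, illustrative values only.

```python
# Toy 1D diffusion of CO2 from an overlying layer into brine.
# Illustrative only -- not the CCSE's PMAMR code. Assumed values:
# diffusivity ~2e-9 m^2/s, ~1% brine density increase at saturation.
D = 2e-9                 # molecular diffusivity of CO2 in brine, m^2/s
dz = 0.001               # grid spacing, m (1 mm cells, 10 cm column)
nz = 100
dt = 0.2 * dz**2 / D     # explicit-scheme stable time step (factor < 0.5)
c = [0.0] * nz           # CO2 concentration (0..1); interface at index 0

steps = int(3600 * 24 / dt)   # simulate one day
for _ in range(steps):
    new = c[:]
    new[0] = 1.0         # interface held at saturation
    for i in range(1, nz - 1):
        new[i] = c[i] + D * dt / dz**2 * (c[i+1] - 2*c[i] + c[i-1])
    c = new

# Dissolved CO2 raises brine density (~1% at saturation, assumed),
# so the near-interface layer becomes gravitationally unstable.
rho0 = 1100.0            # brine density, kg/m^3 (assumed)
density = [rho0 * (1 + 0.01 * ci) for ci in c]
print(f"density at interface {density[0]:.1f}, at depth {density[-1]:.1f} kg/m^3")
```

After a day the denser, CO2-charged layer extends only a centimeter or so into the column—which is why the CCSE needed two million processor-hours to resolve the resulting convective fingering at geological scale.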
LBL researcher Karsten Pruess said, ‘The CO2 from a large coal-fired power plant operating for 30 years will generate a 10 kilometer wide plume which could eventually migrate into aquifers hundreds of kilometers away. We are very interested in the long-term fate of these processes.’
Franklin comprises 9,572 compute nodes, each with a 2.3 GHz single-socket quad-core AMD Opteron processor and 8 GB of memory, for a total peak performance of 352 teraflops. The system deploys dual operating systems—SuSE Linux on service nodes and a lightweight ‘compute node Linux’ (CNL) on compute nodes. CNL reduces system overhead and assures scalability. A Lustre parallel file system provides some 436 TB of user disk space. The CCSE thinks that PMAMR simulations will help recover more oil from existing wells. More from www.lbl.gov/cs.
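For the record, the quoted peak figure checks out arithmetically, assuming the usual four double-precision flops per core per clock cycle for that generation of Opteron:

```python
# Sanity check on Franklin's quoted 352 teraflop peak. The flops-per-cycle
# figure (4, i.e. two SSE2 multiply-adds) is the commonly assumed value
# for quad-core Opterons of this vintage.
nodes = 9572
cores_per_node = 4
clock_hz = 2.3e9
flops_per_cycle = 4
peak = nodes * cores_per_node * clock_hz * flops_per_cycle
print(f"{peak / 1e12:.0f} teraflops")  # ~352, matching the quoted figure
```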
Full paper on www.oilit.com/links/1102_7 (behind a pay wall).
Marv LeBlanc heads-up Knowledge Reservoir’s new Real-Time Systems division. LeBlanc was formerly with Cimarron Software.
ABB has appointed Frank Duggan to its group executive committee as head of global markets.
Steve Nadelman has been appointed president and COO of Acme Lift Co. He was previously senior VP of United Rentals.
Aker Solutions has recruited Wolfgang Puennel as head of its new Well Intervention Services business. Former Schlumberger executive Mark Riding is head of corporate strategic marketing and Valborg Lundegaard has been promoted to lead the new Engineering business.
Ametek has appointed Ken Weirman as VP and CIO replacing retiree Bill Lawson.
Argus has appointed former ExxonMobil executive Denny Houston as a non-executive board director.
Hendrik Lombard has joined Cortex Business Solutions as VP finance and corporate planning. He was previously CFO with Genoil Inc.
Dahlman Rose & Co. has named former Barclays Capital MD James Crandell as MD and global head of oilfield services research.
Ron Emerson has been appointed chairman of Fairfield Energy. Chris Wright is now CEO.
Anne McEntee has been appointed VP of GE Energy Services’ Dresser unit. Former Cisco VP Bill Ruh is now VP and global technology director for GE’s Software Sciences and Analytics division.
Bart Heijermans has resigned as Executive VP and COO of Helix Energy Solutions.
Cosme Peruzzolo heads-up ION Geophysical’s new seismic processing joint venture with Brazilian consultancy Bratexco.
Kadme has appointed Nithya Mohan to its service delivery team.
Julie Reid has left Kelman Technologies for NEOS GeoSolutions.
Former Executive VP and CFO of Seagate Technology, Charles Pope, has been elected to the LSI board.
Brady Parish is MD of Moelis & Co.’s new Houston office. Parish hails from Goldman Sachs.
Navigant Economics has recruited George Schink, Cliff Hamal, Julie Carey and Kathleen Rodenrys. Joe Pace and Bruce McConihe have joined as affiliates.
George Intille has joined Nexant’s Energy & Chemicals Consulting business unit. He comes from SRI Consulting.
Gary Bauer has joined Northern Offshore as senior VP operations. He was previously with Transocean.
Former president and CEO of OpenSpirit Corp., Dan Piette, has been appointed CEO of Object Reservoir.
APEX Solutions division OGRE Systems has appointed former Federation Software CEO Kirk Hanes as president and CEO.
Romanian geophysical services company Prospectiuni has appointed Andy Clark as VP business development, marketing and strategy. He hails from Geokinetics. Tim Branch is to become VP, international operations.
Blake Moret is now senior VP Control Products and Solutions and Frank Kulaszewicz senior VP Architecture and Software of Rockwell Automation.
Denise Stone has joined Rose & Associates as director of business development.
Earthworks has appointed Phil Beale as business development director.
Former president of Geographix, Robert Stevenson, is now COO of TerraSpark Geosciences.
Tami Browning has joined Variance Reduction International’s Bakersfield, CA office.
Forum Energy Technologies has acquired Wood Flowline Products. Terms of the transaction were not disclosed.
GE Austria has made a cash offer to acquire Wellstream Holdings.
Wood Group is to sell its Well Support Division to GE for $2.8 billion cash. Following the deal, Wood Group will return ‘not less than’ $1.7 billion to shareholders.
GE has closed its $3 billion acquisition of energy infrastructure technology and service provider Dresser, from funds managed by Riverstone Holdings and First Reserve Corporation.
ConocoPhillips, NRG Energy and GE have committed $300 million in capital to a new joint venture, Energy Technology Ventures, to fund approximately 30 venture- and growth-stage companies over the next four years.
M2M Data Corporation has acquired the operating assets of Implicit Monitoring Solutions, including its flagship Intellisite application and Right Smart hardware.
Following opposition from its users, UCG has decided not to sell its Oil Price Information Service (OPIS) to McGraw-Hill’s Platts unit according to OPIS CEO Brian Crotty.
Reservoir Group is joining forces with Houston-based surface logging specialist The Mudlogging Company, which will join Reservoir Group’s RG Geo business unit focusing specifically on formation evaluation.
Rockwell Automation has agreed to purchase process control and automation systems integrator Hiprom, headquartered in Johannesburg, South Africa.
Technical outsourcing solutions provider System One has acquired vendor management system provider Link2Consult. Link2Consult’s technology business line will operate as a separate legal entity under System One Holdings, LLC. Peter McCree will remain President.
Total Safety has acquired Pacific Environmental Consulting, a British Columbia-based industrial hygiene, occupational safety and environmental consulting firm. Financial terms of the acquisition were not disclosed.
Amalto has announced the Amalto e-Business Cloud for trading partner collaboration, a business to business (B2B) electronic document exchange solution. The transactional platform offers multiple connectivity solutions for cost-effective exchange of invoices, field tickets, purchase orders, service requests and RFQs. CEO Jean-Pierre Foehn said, ‘Our new e-transaction platform leverages leading-edge technologies and our experience in deploying B2B projects for customers like Chevron, GE Oil and Gas, Total and Alstom Grid (formerly Areva T&D).’ More from www.amalto.com.
Enterprise governance, risk and compliance (GRC) specialist Modulo is now offering its Risk Manager GRC package from the cloud. Modulo president and CEO Alvaro Lima said, ‘The new release offers compliance cross-referencing and vendor risk management and adds support for the iPhone and mobile device collectors to the industry’s first open source GRC data collection platform.’ Modulo has also released an eBook titled ‘ISO 27001 and 27002: A Practical View’ to help adoption and certification with ISO, SOX and HIPAA standards. More from www.modulo.com.
EU gas infrastructure provider Gasunie is moving its trading and spend management to the Ariba Commerce Cloud. Gasunie operates one of the largest high-pressure gas pipeline grids in Europe and is seeking to reduce costs through e-commerce. The agreement covers Ariba’s Spend Visibility, Procure-to-Pay, Sourcing and Contract Management solutions. Ariba’s Commerce Cloud is used by more than 340,000 companies around the world. More from www.ariba.com/commercecloud.
French computer maker Bull is heading up an EU-sponsored R&D consortium, ‘Cool-IT,’ to investigate data center power usage. Current data centers can consume twice as much electricity as is needed to power their servers. Cool-IT is looking at new cooling techniques and ‘smart grid’ energy management for global optimization. A 20% energy saving is targeted. More from www.oilit.com/links/1102_18 (in French).
Kappa Engineering has just announced a new consortium focused on the analysis and forecast of production from shale gas and other unconventional resources. The consortium will kick-off on July 1st 2011. More from www.kappaeng.com.
Three EU universities (Barcelona, Stuttgart and Versailles-St. Quentin) report on results from the Parallel Programming for Multi-core Architectures (ParMA) project, which has produced an open source parallel compiler, ‘Unite,’ for optimizing parallel computer simulation. Unite was used by Recom Services to optimize fossil-fueled power stations and in Bull’s ‘Bull-X’ supercomputer range. More from www.oilit.com/links/1102_19.
Petrobras has joined phase 2 of Reaction Design’s Model Fuels Consortium. MFC II builds on the work of the original Consortium and continues the innovative design work targeting cleaner burning, more efficient transportation engines, power turbines and fuels. MFC started in 2005, addressing engine design challenges such as emission reduction and novel fuels. The consortium develops ‘surrogate’ fuel models and combustion simulation tools for digital modeling of vehicle engines and power generation plants. Petrobras’ Alipio Ferreira Pinto said, ‘Fuel producers must consider a widening product range, from cleaner, high-performance petroleum blends to alternative fuels that vary in quality. The MFC helps by enabling accurate, industry-validated computer models of complex fuels.’ Other consortium members include ConocoPhillips, GE Energy, the French Petroleum Institute (IFP-Energies Nouvelles), Oak Ridge National Lab., Saudi Aramco and several automobile manufacturers. More from www.oilit.com/links/1102_20.
Active Navigation (AN) has just rolled out a suite of Microsoft SharePoint-based solutions targeting oil and gas operations, compliance, collaboration and risk mitigation. AN trawls unstructured data sources in the organization, identifying high value content in information silos. Cleansed and correctly tagged documents have proved critical during mergers and asset acquisition where business-critical content must be identified and quickly ingested. Clean data means that good intelligence can be derived from ‘mountains’ of legacy content and new information sources to spot trends, perform analysis, and drive sound business decisions.
Another AN service is personally identifiable information (PII) remediation. PII is information that can be used to identify or contact an individual, potentially exposing them or the organization to threats. AN’s flagship oil and gas client is BP, whose Olivier Legendre addressed the Active Navigation User Focus Group last year with a talk on how BP has combined different Active Navigation analyses to compile custom reports that ‘drive user buy-in for cleansing projects.’ More from www.activenav.com.
Statoil has selected Petris Technology’s borehole data management solution, PetrisWINDS Recall to automate data quality control and publishing workflows. Statoil is also to deploy PetrisWINDS Enterprise to integrate data from internal and third-party petrophysical applications.
Invensys Operations Management has implemented an InFusion Enterprise Control System for ExxonMobil Lubricants & Specialties Company.
Kongsberg Oil & Gas Technologies is now a shareholder of Norwegian visual solutions provider Viju and has transferred its Integrated Collaborative Environment (ICE) unit to Viju.
dGB Earth Sciences has teamed with Surface and Subsurface Resources to offer depositional framework creation services using OpendTect and SSR’s Depositional Trend Analysis tool.
Fugro group member Interaction is to provide Rock Solid Images with a ‘next generation’ CSEM processing and quality control system.
Accenture and NetApp have signed a technology agreement on data center optimization, cloud computing and virtualization. The companies are also to extend NetApp’s Professional Services organization with a dedicated team of Accenture consultants.
AGR Field Operations has been awarded a contract by Hyundai Heavy Industries for the provision of Maintenance Engineering and Inspection Planning services for the Goliat FPSO. These services are a component of HHI’s contract with ENI Norge.
Statoil has awarded Aker a NOK 500 million contract for engineering, procurement and construction (EPC) of a subsea workover system for the Vigdis North East development.
Saudi Aramco has selected SK Engineering & Construction to perform EPC on its Wasit Gas Program onshore facilities.
Energen Resource Corp. has joined the Cortex trading partner network.
FMC Technologies has signed an $85 million contract for the supply of subsea production equipment on CNOOC’s Liuhua 4-1 development.
Statoil has awarded Fugro GEOS two ‘metocean’ (meteorological and oceanographic) studies in the Norwegian and North Seas.
Argentinean gas distributor Gasnor has selected Oracle’s JD Edwards EnterpriseOne Financials and Oracle Partner Info Consulting for its administrative and financial processes and to standardize gas purchasing processes and data.
GE Oil & Gas and Al Shaheen joint venture PII Pipeline Solutions is to provide inspection technology and engineering services to monitor the integrity of two of Australia’s major natural gas pipeline infrastructure networks.
GE Oil & Gas also received contracts worth $50 million to supply subsea wellhead and installation systems to Petrobras.
TerraSpark is to provide its InsightEarth seismic interpretation software to Fugro Robertson for use on North Sea projects.
Chevron has selected KBR for the detailed design of its ‘Big Foot’ topsides. KBR was also awarded a contract by the Republic of Equatorial Guinea to provide a conceptual study and project management for a new refinery.
Houston-based E&P start-up Gulf Coast Energy Resources has selected P2 Energy Solutions’ accounting, financial, and land management software.
Petrofac has been awarded a US$1.2 billion EPC contract by the Sonatrach/BP/Statoil joint venture In Salah Gas for field development and a new central production and gas gathering facility.
PGS and SeaBird Exploration are teaming on the development of ocean bottom node solutions for deep water. PGS gains exclusive rights to resell SeaBird’s technology in Brazil.
SAIC has received a $20 million ‘task order’ from the US Environmental Protection Agency for software engineering and technical support.
Siemens and Saudi Aramco have signed a corporate procurement agreement to ‘strengthen cooperation’ between the two companies. The agreement grants Saudi Aramco improved access to Siemens Oil and Gas Division’s rotating equipment and services.
The Fiatech board has endorsed the ongoing iRING Tools Interfacing Project that sets out to deliver native ISO 15926 and iRING interfaces to key commercial engineering design tools.
The PPDM Well Identification Standards Work Group puts the cost of assessing and updating the vendor neutral, publicly available well numbering standard at around $250,000. The Association is soliciting funds so that the project can be launched and completed in a timely manner.
Energistics has announced the following updates to its EnergyML standards. A new (1.4.1) release of WitsML is slotted for publication by March 31, 2011, adding a StimJob object for fracture job reporting, an error model object and ‘tightened’ schemas for better interoperability. A five year roadmap is under development.
ProdML V 2.1 is on track for a Q2 2011 release with wireline formation testing, facility parameters for reporting, improved documentation and enhancements to the shared asset model (SAM). SAM provides generic services to maintain a hierarchical organization of assets and XML data objects.
Energistics is mooting the deployment of open source Darwin Information Typing Architecture (DITA) for future documentation. DITA offers a flexible approach to documentation that caters to different user needs and preferences. It is adaptable to online browsing, provides graphical navigation of XML layouts and can also serve content in book format. More (on DITA) from www.oilit.com/links/1102_24.
A whitepaper from McAfee describes the ‘Night Dragon’ cyber attacks on global oil, energy and petrochemical companies. The attacks began in November 2009 and used social engineering, phishing, exploitation of Microsoft Windows vulnerabilities, Active Directory compromises and remote administration tools to harvest sensitive, proprietary information on oil and gas field bids, operations and project financing. The Night Dragon attacks originated ‘primarily’ from an individual based in Heze City, Shandong Province, China, who appears to have provided the internet command and control infrastructure to the attackers—www.oilit.com/links/1102_15.
Pending legislation is to update the US Dept. of Homeland Security Chemical Facility Anti-Terrorism Standards (CFATS) program. CFATS includes security vulnerability assessments (SVAs), site security plans (SSPs) and other protective measures. A Chemical Security Assessment Tool (CSAT) has been developed to identify high-risk facilities and provide help on SVAs and SSPs. CSAT, an online application, includes CFATS compliance and ‘Top-Screen,’ a consequence-based screening tool—www.oilit.com/links/1102_13.
A new whitepaper from Intel invites CIOs to ‘rethink’ information security. A strategy is proposed based on the four pillars of trust, security zones, controls and perimeters—along with a security model for the cloud—www.oilit.com/links/1102_14.
CygNet Software has extended its enterprise operations platform (EOP) for the oil and gas industry with a new energy load forecasting (ELF) module for natural gas pipeline operators. ELF uses a ‘self learning’ neural network to provide accurate predictions of future gas demand. ELF’s neural nets compare user-defined variance with actual values to compute load calculations. Neural net training can be farmed out to a server farm for better performance.
ELF maximizes profitability while respecting regulatory constraints and contractual obligations. ELF adjusts line pack and stored gas automatically, preparing pipelines for anticipated demand—taking the ‘guess work and grunt’ out of daily load planning. ELF extends CygNet EOP with transparent sharing of operational data. CygNet VP Steve Robb said, ‘ELF offers an out-of-the-box solution that lets pipeline operators deploy tools to automate daily operations and efficiently capture, manage and distribute data.’ EOP provides unified data management with a specific schema for gas pipelines and a gateway to leading enterprise service buses including TIBCO, BEA and Oracle. More from firstname.lastname@example.org.
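The ‘self-learning’ idea can be illustrated with a toy example. The sketch below bears no relation to CygNet’s actual ELF implementation—the network size, synthetic training data and learning rate are all assumed—but it shows the basic mechanism: a small feedforward network learns that gas demand rises as temperature falls.

```python
# Illustrative neural-net load forecasting sketch, in the spirit of
# (but unrelated to) CygNet's ELF. All parameters are assumptions.
import math, random

random.seed(0)
# Synthetic training data: scaled temperature (0..1) -> scaled demand.
# Demand is high on cold days, low on warm days.
data = [(t / 30.0, (30 - t) / 30.0) for t in range(0, 31)]

# One hidden layer of 4 tanh units, trained by plain stochastic
# gradient descent on squared error.
H = 4
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

lr = 0.1
for _ in range(2000):
    for x, y in data:
        h, out = forward(x)
        err = out - y               # forecast minus actual
        b2 -= lr * err
        for j in range(H):
            grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h

cold = forward(0.0)[1]   # 0 C -> expect high demand
warm = forward(1.0)[1]   # 30 C -> expect low demand
print(f"forecast demand: cold day {cold:.2f}, warm day {warm:.2f}")
```

A production forecaster would of course feed in many more variables—day of week, weather forecasts, historical line pack—and, as the CygNet announcement notes, training at that scale is what gets farmed out to a server farm.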
Nashua, New Hampshire-based S3 Development Corporation has selected GE’s Proficy work process management solution for incorporation within its intelligent control room management (CRM) package for its natural gas and hazardous liquids customers. The solution will be piloted at three East Coast natural gas utilities, starting in Q1 2011.
The CRM was developed in response to the CRM/Human Factors mandate from the US Pipeline and Hazardous Materials Safety Administration (PHMSA) that imposes new minimum safety standards on gas pipeline operators. PHMSA regulations 49 CFR Parts 192 and 195 lay down procedural guidelines for access to information, alarm and change management, fatigue mitigation, operator training and more. The deadline for compliance has been brought forward by 18 months: companies must have plans in place by August 2011, with implementation by February 2012.
GE’s Sheila Kester observed ‘Companies are scrambling to determine how to comply with this mandate. The S3 solution takes all parts of the mandate into account, allowing companies to capture events and trigger standard operating procedures. This application plays to the strengths of our Proficy workflow solution and will provide a global framework for risk reduction.’ More from www.oilit.com/links/1102_16 (GE) and www.oilit.com/links/1102_17 (S3).
Yokogawa Electric Corp. has released new modules for its Stardom network-based control system, including an upgraded module for upstream oil and gas processes that improves reliability and increases memory capacity. The new field control node (FCN) autonomous controllers are designed for small- to medium-size turbomachinery, while the FCN-RTU low-power autonomous controller targets oil and gas wellheads distributed over a wide area.
New functionality includes a servo module to control rotational speed and a module to detect excessive speed. Upstream oil and gas customers now benefit from the same error check and correct (ECC) memory found in high-end PCs. Memory capacity has been doubled to facilitate the long-term storage of gas volume data and other on-site information. Stardom now also supports Windows 7. More from www.oilit.com/links/1102_18.
The Applied Geodynamics Laboratory (AGL) of the Bureau of Economic Geology at the University of Texas at Austin has released a summary1 of its achievements in over 20 years of sandbox experimentation. The report makes salutary reading in this digital age, showing how AGL researcher Tim Dooley bests the computer in modeling complex geological situations. Instead of teraflops, AGL’s ‘Super Models’ are used to investigate the formation of geological structures in deepwater salt basins such as the Gulf of Mexico. The silicone and sand models are built inside sheets of plywood and Plexiglas. Pumps and hand cranks apply regional tectonic stresses. Digital cameras make time-lapse movies for kinematic analysis as a laser tracks the evolving terrain model with sub-millimeter accuracy. After the experiment the model is sliced up to build a 3D computer image. Imagery is spectacular and insightful for seismic interpreters whose view of subsalt structures can be ‘band limited and obscured by imaging problems.’ Similar insight was offered to space scientists puzzling over imagery of the Hebes area on Mars.
The results of the Bureau’s research will be published later this year as ‘The Salt Mine: An Interactive Atlas of Salt Tectonics.’ The book and DVD combo contains 1,400 images of salt structures around the world, including outcrop data, seismics, geologic cross-sections and animations. AGL has received $13.7 million for research from consortium members since its founding in 1988. The AGL’s 2010 annual Industrial Associates Review meet hosted a record 325 delegates from 35 companies. More from www.oilit.com/links/1102_4.
Somewhat counter-intuitively, oil and gas workers are most at risk of death or serious injury when driving to work. Shell’s Petroleum Development Oman (PDO) unit is taking action, kitting-out its vehicles with an integrated driver safety solution from MiX Telematics. MiX (formerly marketed under Siemens’ VDO brand) supplies an in-vehicle monitoring system (IVMS) that targets safety, reduced costs and compliance with company rules.
PDO MD Raoul Restucci said, ‘I am pleased to report that I am the first employee to have my vehicle installed with this technology. I am now driving knowing that my behavior is monitored with the IVMS and that it will be evaluated by an independent assessor.’ The four year project was executed by MiX Telematics’ Middle East partner FMSi. Other MiX clients include Schlumberger and Chevron. More from www.oilit.com/links/1102_5.
A separate announcement from SAE International covers a new publication, Performance Metrics for Assessing Driver Distraction: The Quest for Improved Road Safety. The 266 page book summarizes the results of the 4th International Driver Metrics Workshop and provides ‘vital information’ for designers of in-vehicle information and communication systems. More from www.oilit.com/links/1102_6.
Speaking at the French Petroleum Institute (now IFP Energies Nouvelles) this month, Jean-Paul Rolando described how Total is raising its geomodeling game. Geomodeling now underpins the whole E&P workflow, from early stage volumetric estimation and reservoir characterization through development scenario planning to fluid flow modeling and monitoring with input from 4D seismics.
Total uses commercial software from Paradigm, Schlumberger and Halliburton alongside bespoke tools for internal use. Total’s flagship is Sismage, an integrated interpretation platform that is steadily expanding in scope and capabilities. Sismage has been under development for 20 years, leveraging Peter Vail’s research on seismic stratigraphy. Sismage integrates with Paradigm’s Skua and Schlumberger’s Petrel.
Rolando’s ‘workflow of the future’ envisages a single platform that embraces multiple domain skill sets, along with data integration and federating technologies.
In the Q&A, representatives from IFP seized the opportunity to plug their integrated interpretation environment ‘Open Flow’ which includes geomodeling and flow simulation. Rolando wound up the event noting that ‘Proper use of these tools adds significant value to a development—without them you increase the risk of failure.’
A three year study of the social and organizational impact of the Statoil-Hydro merger has concluded that it has been ‘efficient’ and is now ‘seen by most as a success.’ The study, conducted by a team from three Norwegian institutes, noted that the merger ‘has not gone completely without a hitch.’ A partnership model involved employees and trade unions in the decision making processes.
Project leader Helene Loe Coleman from the Institute of Labor and Social Research (FAFO) said, ‘The new operational model was controversial and things did not go as planned.’ Unions tried to secure employee positions; Statoil countered with a strategy that avoided layoffs and compulsory redundancies.
Statoil Business Director, Tor Egil Sunderø added, ‘The merger gave us a unique opportunity to gather new knowledge which has been used in the implementation of the Statoil 2011 program and in aspects of our improvement process.’
Sunderø believes that Statoil has now acquired a competitive advantage in the mergers and acquisitions arena. The team now plans to publish its findings in a book. More from www.statoil.com.