SEG 2012—Las Vegas (December 2012)

Plenary ‘social responsibility’ session. M-OSRP and seismic ‘group therapy.’ Cableless acquisition surveyed. Automated and touch-enabled interpretation. Microseismic monitoring of frac jobs. SpiceRack autonomous marine sensors. Fractal methods. Nvidia’s Index. Petrel on SQL Server.

President Bob Hardage gave his state of the society address at the SEG council meeting. Hardage opined that the SEG’s focus should be technology and the advancement of applied geophysics. Geology and engineering should be the bailiwick of the AAPG and SPE respectively. The SEG is working to implement new governance and improve communications between its executive council and the 33,000-strong rank and file. A new ‘Interpretation’ publication is to hit the newsstands mid-2013. Corporate sponsorship is at a historic high and overall revenues are nearly $20 million. A new unconventional resources technology conference, ‘URTeC,’ will kick off, in partnership with AAPG and SPE, in Denver next summer.

Five years ago, the SEG ‘almost came to extinction’ following a constitutional crisis. This was solved with a ‘great compromise’ as the board replaced the council as the SEG’s governing body. Accordingly, Hardage wound up by ‘retiring’ the old SEG gavel and handing a brand new one to incoming ‘first ever’ council chair, Mike Graul.

Peter Pangman reported from the SEG’s first ‘corporation,’ SEAM, the SEG Advanced Modeling Corp. SEAM fills the need for industrial-strength earth models on which to test and collaborate on novel technologies. SEAM Phase I is now in its 6th year, funded by the US Govt and 24 corporate sponsors. Phase I has produced a 220 terabyte dataset of a deepwater target. 22 members have already signed up for Phase II, which addresses onshore exploration including unconventional targets.

Perhaps the new organization explains why the SEG missed a trick or two in how the annual conference proceeded. The Honors and Awards session was well attended, but the opportunity for ‘communication’ was passed over as there was nothing in the way of discourse or address. Another disappointment came in the Monday plenary session, the worst-attended event we have ever seen at a major exhibition, with under a hundred present in the cavernous ballroom for the 10am kick-off. The reason? Well, the subject could hardly have been further from Hardage’s exhortation to focus on geophysics. The BP-sponsored event elected to investigate ‘corporate and academic social responsibility; engagement or estrangement?’

Jonathan Nyquist (Temple University) observed that the ‘e-word’ (environment) is being replaced by the ‘s-word,’ sustainability (not shale!). The ‘Geoscientists without borders’ (GWB) program is facing a funding problem. Nyquist encourages corporations to get involved to enhance their reputation and ‘build the workforce’ in the face of a geoscience demographic deficit—forecast to be around 150,000 by 2020.

CGGVeritas was the only corporation that responded to the SEG’s invitation, although the event was sponsored by BP. While Isabelle Lombert joked that she might get the Pinocchio ‘greenwashing’ prize, she made a good case for CGG’s efforts to limit the impact of seismics with narrow, meandering lines and minimal clearance. Elsewhere, in streamer design and high performance computing, ‘green’ equates with efficiency. CGG claims the largest oil immersion HPC data center in the world—with ‘90% energy savings.’ Employees have a corporate citizenship program and get ‘solidarity leave’ to work with NGOs. CGG also supports GWB and is a member of the UN Global Compact reporting initiative. This involves 90 key performance indicators and is a ‘daunting task.’

Mike Oxman, of consultants Business for Social Responsibility, offered a new industry paradigm for transparency, social and legal impact and human rights. All of which requires navigation through a complex landscape of stakeholders (UN, OECD, ISO 14001, GRI, FCPA and SEC) and national laws. ‘Studies show’ that CSR has a positive return on investment. In the debate, one speaker mentioned the irony of a social responsibility discussion taking place in a ‘totally unsustainable city dedicated to conspicuous consumption.’

Art Weglein (M-OSRP) presented the first field data examples of direct depth imaging without a velocity model. Weglein believes that research is prejudiced in favor of the ‘velocity field’ and that if you claim, as he does, to have developed a method to process sans velocity, ‘everyone breaks out in hives!’ All processing objectives, multiple removal, attenuation, depth imaging etc., can be achieved without subsurface information. Weglein’s ‘group therapy,’ a.k.a. the inverse scattering series (ISS) has layers ‘chatting amongst themselves’ until they output correct depths and flattened gathers. The latest M-OSRP tests, Weglein claims, demonstrate the viability of ISS. The overburden reflections avoided in conventional imaging are precisely what ISS imaging leverages. More from M-OSRP.

Total’s Henri Houllevigue reviewed the state of play in ‘cableless’ acquisition, observing that there is probably only one true cableless system in use today. Power and communications requirements mean that most are a combination of cable and wireless systems. But there is a general recognition that cableless is enabling better resolution with point receivers.

As indeed was borne out by presentations from Saudi Aramco (covered in last month’s editorial) with a 100,000 trace land trial that heralds a revolution in seismic quality and a corresponding boom in data volumes. The quality enhancement is driving new automated workflows for both processing and interpretation. As Brian Wallick observed, horizon autotracking on the old data was a ‘70% solution.’ Point source has brought this to ‘nearly 100%.’
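The mechanics of such autotracking can be sketched as a seed-and-correlate loop: cross-correlate a window around the current pick with the neighboring trace, then follow the best-matching lag. Below is a minimal Python illustration (our own toy, not Saudi Aramco’s or any vendor’s algorithm; the function and parameter names are invented):

```python
import numpy as np

def autotrack_horizon(section, seed_trace, seed_time, half_win=10, max_shift=5):
    """Track a horizon rightward from a seed pick across a 2D section
    (traces x samples). At each step, a window around the current pick
    is cross-correlated with the next trace over a small range of lags,
    and the best-matching lag becomes the next pick.
    Assumes picks stay clear of the trace edges."""
    n_traces, n_samples = section.shape
    picks = np.full(n_traces, -1, dtype=int)  # -1 marks untracked traces
    picks[seed_trace] = seed_time
    t = seed_time
    for i in range(seed_trace + 1, n_traces):
        ref = section[i - 1, t - half_win:t + half_win + 1]
        best, best_corr = t, -np.inf
        for shift in range(-max_shift, max_shift + 1):
            s = t + shift
            win = section[i, s - half_win:s + half_win + 1]
            if win.shape != ref.shape:
                continue  # window ran off the trace edge
            corr = np.dot(ref, win)
            if corr > best_corr:
                best_corr, best = corr, s
        t = best
        picks[i] = t
    return picks
```

Real trackers add confidence thresholds, multi-directional growth and snapping to peaks or troughs; the point here is only the correlate-and-step idea.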

Several vendors presented work in this space. Paradigm has announced new ‘constrained autopicking’ in Skua—with the ability to track ‘hundreds’ of horizons in a depositional context. Eliis’ Paleoscan adds a ‘seismic stratigraphic’ element to a semi-automated interpretation workflow.

Xinming Wu (Colorado School of Mines) presented a three-step process that goes from seismics to the seismic stratigraphic Wheeler diagram, using instantaneous phase and a cost minimization algorithm to transform the image into relative geological time. The results were displayed as a Wheeler volume movie.
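The instantaneous phase attribute that underpins such workflows is conventionally derived from the analytic signal. Here is a minimal sketch (a generic illustration using numpy and scipy, not Wu’s code):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase(trace):
    """Instantaneous phase (radians) of a trace from its analytic
    signal: phase = angle(trace + i * Hilbert(trace))."""
    return np.angle(hilbert(trace))

# A 30 Hz test 'trace' sampled at 1 ms
dt = 0.001
t = np.arange(0, 0.5, dt)
trace = np.cos(2 * np.pi * 30 * t)
phase = instantaneous_phase(trace)

# The unwrapped phase of a monochromatic trace climbs at 2*pi*f rad/s,
# which is one sanity check on the attribute
rate = np.diff(np.unwrap(phase)) / dt
```

In the full workflow the phase volume feeds a cost minimization that flattens events into relative geological time; only the attribute step is shown here.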

More compelling was the presentation by Saverio Damiani of Schlumberger’s ‘Extrema/seismic geology.’ Extrema promises automated seismic stratigraphic interpretation by identifying events such as horizon terminations and delimiting sequences. Output is again the Wheeler diagram along with seismic facies. Today’s interpreters are confronted with numerous subtle reflector terminations, unclear chronostratigraphy and problematic surface extraction. Extrema’s objective is to standardize interpretations. Curiously, the ‘automated’ technique is currently only available from Schlumberger’s petrotechnical service unit using an ‘internal’ Petrel plug-in. The results were shown using offshore Angola data and are said to be useful in jump correlation from one basin to the next and to create a detailed static model for input to Schlumberger’s Visage geomechanical modeller. The tool is claimed to result in a ‘huge increase in interpreter productivity.’

Terraspark’s ‘Turbo AFE’ accelerated fault extraction leverages GPU technology for automatic fault extraction. It works as a background task on an undecimated 3D volume. The tool runs on a ‘desktop,’ actually a rather chunky four-unit box on the floor!

Halliburton/Landmark is blurring the processing/interpretation boundary with SeisSpace/ProMAX, now a part of a ‘greater DecisionSpace and OpenWorks framework.’ SeisSpace now offers high volume parallel processing on distributed memory machines. The SeisSpace API has been used to good effect by poster children Crestone Seismic and Canonical Geoscience. The DecisionSpace interpretation tool was being shown with a new touch-enabled interface using a hardware overlay from Perceptive Pixel (now bought by Microsoft). Touch-enablement is seen as key in the development of new interpretation workflows and for the mitigation of repetitive strain injury—said to be ‘a huge challenge for clients.’ In the same vein, Halliburton now offers its pore pressure prediction app on the iPad—including access to the cloud for historical data. Halliburton also uses iPads in the field with OpenWells mobile and for data entry to EDM.

The geophysical profession is throwing all it has into the unconventional boom with a variety of specialist seismic acquisition and monitoring technologies on offer. Peter Duncan, founder and CEO of MicroSeismic, gave a spirited account of how surface monitoring is key to understanding what is happening during hydraulic fracturing. Currently, ‘only 4%’ of fracked wells are monitored. Additionally, monitoring ‘confirms containment,’ i.e. can be used to demonstrate to environmentalists and others that fracs do not affect the water table. Monitoring may also be used to provide alerts when surface motion exceeds a threshold, as is now mandatory in the UK. MicroSeismic has opened an online ‘Reservoir intelligence’ portal of monitoring information and offers real time communication of frac data to Houston and an engineer’s iPhone ‘so he can be golfing as he fracs!’

CGGVeritas CEO Jean-Georges Malcor and Baker Hughes VP Andy O’Donnel announced a collaboration on shale gas operations. Again, seismic monitoring is seen as key to ‘show environmentalists that we don’t affect the water table or activate faults.’

In another teaming, CGG has hooked up with Saudi Aramco on ‘SpiceRack,’ a very high-end autonomous, cableless node for full component sea-bed acquisition. SpiceRack gets a check-out while on the mother vessel before sliding down the launcher to the ocean bottom. After the mission completes, it swims back under its own steam to a ‘dream catcher’ on the support vessel. Despite the sexy technology, it is hard to equate the deployment of a very limited number of high-end nodes with the push for throw-away geophones for million-trace deployment. Whatever, vive la différence!

There was considerable buzz around Chris Green’s (Getech) poster presentation on Cryosat-2, a new high resolution geodetic altimetry satellite that promises a ‘renaissance’ of satellite gravity processing and ‘imaging.’ Some potential field ‘forgotten truths, myths and sacred cows’ were visited by Alan Reid (University of Leeds) including the likelihood that magnetic data is self similar i.e. ‘fractal.’ Self similarity ‘pervades geology’ and is ‘where the real profit is to be made in the next 20 years.’ Reid says ‘put your best graduate students onto it.’ The fractal concept should be recognized as a paradigm shift although ‘few of us [oldies?] can make it, we can start.’
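Self-similarity of this kind is commonly diagnosed via the power spectrum: a fractal profile has P(k) proportional to k^(-beta), a straight line on log-log axes. A hedged sketch on synthetic data (a generic illustration, with parameter choices of our own, not Reid’s method):

```python
import numpy as np

def spectral_slope(profile, dx=1.0):
    """Estimate beta in P(k) ~ k**(-beta) by a log-log least-squares
    fit to the profile's power spectrum. Self-similar (fractal) data
    plots as a straight line, so the fitted slope recovers beta."""
    n = len(profile)
    spec = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    k = np.fft.rfftfreq(n, d=dx)
    keep = (k > 0) & (k < k.max())   # drop the DC and Nyquist bins
    slope, _ = np.polyfit(np.log(k[keep]), np.log(spec[keep]), 1)
    return -slope

# Spectral synthesis of a self-similar test profile with beta = 2:
# amplitudes follow k**(-beta/2), phases are random
rng = np.random.default_rng(0)
n = 4096
k = np.fft.rfftfreq(n)
amp = np.zeros(len(k))
amp[1:] = k[1:] ** -1.0
profile = np.fft.irfft(amp * np.exp(1j * rng.uniform(0, 2 * np.pi, len(k))), n)
beta = spectral_slope(profile)
```

The same slope-fitting idea underlies fractal depth-to-source estimates in magnetics, where the assumed self-similarity of the susceptibility distribution changes the spectral correction applied.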

On the exhibition floor, Trimble was showing ‘GateWing,’ a small photogrammetry drone with pinpoint GPS. The exchangeable polystyrene wings and body are good for ‘3 normal landings or one bad one.’ The $70k device was used to provide spectacular imagery of Easter Island in a recent documentary. Fraunhofer’s global address space programming interface, GPI-Space, provides a virtual memory layer across a cluster and a generic workflow engine. GPI-Space is used by Statoil to parallelize Seismic Unix and legacy Fortran/C codes. Fraunhofer may consider open sourcing GPI-Space ‘if we can find the right business model.’ On the Paradigm booth, Nvidia showed work done for a major oil company using its ‘interactive and scalable subsurface data visualization framework,’ Index. Index is a product of Nvidia’s advanced rendering center in Berlin, whose ‘Mental Ray’ is used in film computer graphics (The Hobbit, Superman). Index is a specialization of the technology for seismic feature extraction and imaging. The toolkit includes a distributed computing library for parallelizing across heterogeneous CPU/GPU architectures, and a ‘geospatial’ library for ray tracing and visualization. The ‘simple’ API hides details of the cluster, allowing interaction with huge datasets from a ‘web browser or iPad.’ Paradigm is working to leverage the technology to provide compute intensive capability to e.g. Barnett Shale field workers.

Finally, word on the exhibition floor was that Schlumberger will be rolling out a version of Petrel running on a Microsoft SQL Server database ‘real soon now.’ We would like to tell you more but Big Blue is too successful at ‘navigating the media maze’ for us!


Copyright © 2012 The Data Room - all rights reserved.