Don Paul (Chevron), Washington Salles (Petrobras), Don Moore (OXY) and Steve Fortune (BP)
The Society of Petroleum Engineers’ Houston Digital Oilfield conference and exhibition is an altogether
lower key affair compared to its European ‘Intelligent Energy’ counterpart – reflecting its origins with the
SPE’s local Gulf Coast chapter.
Previous SPE-supported events have often touted the idea that the oil industry is a technology ‘laggard.’ Not
so according to Don Paul (Chevron) who cited a recent study by the US Council on Competitiveness2 that
found that Tier 1 US energy companies outpaced other sectors. Oil and gas has embedded IT ‘all the way to
the front line of the business.’ Paul also analyzed the current supply and demand situation to conclude that
demand growth is likely to continue apace – and that for the industry, ‘this is as good as it gets.’ The
technology leadership theme was largely supported in Donna Crawford’s (LLNL3) keynote on high performance
computing.
A couple of interesting panel sessions debated the state of the art. Tony Edwards (BG) saw ‘confusion’ as to
what should be standardized, outsourced and automated. Steve Fortune (BP) ventured that information
management was more important than the digital oilfield. An interesting observation in so far as others, Don
Paul included, considered the data management problem as more or less intractable. Fortune also noted that
the digital oilfield was moving from pilots into ‘full scale, value generating deployments.’
Otherwise the papers presented showed considerable diversity around the digital oilfield theme. Topics
included Shell’s Gulf of Mexico production operations management center, AspenTech’s work on
optimizing production on BP’s Azeri field, Shell’s gas lift optimization on Brunei’s Champion field and
rock mechanics modeling from Chevron. An interesting gathering of smaller vendors was showing software
for workflow management, collaboration and visualization.
Many exhibitors were showing off Chevron deployments. We’re not sure if this means that Chevron is
ahead in the digital oilfield stakes or just more communicative. Either way, Chevron’s openness is to be
encouraged.
Highlights
1 Gulf Coast Section.
2 www.compete.org.
3 Lawrence Livermore National Laboratory.
© June 2008
The Data Room
7 rue des Verrieres
F-92310 Sevres France
Tel (USA) 281 968 0752
Tel (UK) 020 7193 1489
Tel (France) +33 1 4623 9596
Fax +33 1 4623 0652
Technology Watch Home Page info@oilit.com
TW0808_1 Don Paul, Chevron
Many outside the industry don’t understand that oil and gas is one of the most ‘digitally intensive’ industries as
a recent study by the US Council on Competitiveness shows. Upstream technology is pulled by business
opportunities such as the deepwater. The industry has been successful because it never got confused –
business drives technology rather than the other way round. The report studied technical computing in the
automobile, aerospace, pharmaceutical and oil and gas industries finding that Tier 1 US energy companies
outpaced other sectors. The big difference is that oil and gas has embedded IT ‘all the way to the front line
of the business.’ In other words, opportunity pulls technology. The study can be downloaded from
www.compete.org/publications.
Paul then turned to the oil price (currently around $135 per barrel) to ask ‘What’s happened to supply and
demand?’ Demand (consumption) is forecast to increase 50-60% by 2030. A 30% increase alone equates to
adding a Saudi Arabia plus a Russia. Oil, gas and coal will dominate as far as the eye can see although,
‘We need to do as much alternatives as possible ...’ While we are not running out of molecules (of oil),
supply is constrained by politics and costs. ‘Conventional’ may not be able to deliver. According to the
IEA, some $20 trillion in energy capital expenditure is required over the next 25 years. This is significantly
higher than current rates which are clearly insufficient to meet the above needs. Total energy demand
forecast supports the above 35% growth even though the economy may slow down. Demand is bolstered by
the shift from OECD to non-OECD countries – with a crossover happening around now. Today growth is
outside the OECD – the whole demand train is being pulled by new actors. Paul cited the Tata Motors
‘Nano’ low-cost auto, expected to be produced in hundreds of millions, all gasoline powered – which will
have a major impact on consumption.
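Paul’s demand figures are easy to sanity-check. A quick back-of-the-envelope calculation (our arithmetic, not from the talk, assuming growth runs from 2008 to 2030) shows what a 50-60% cumulative increase means per year:

```python
# Back-of-the-envelope check of the quoted demand figures (our own
# arithmetic, not from Paul's talk): what compound annual rate turns
# 2008 demand into 150-160% of itself by 2030?
def implied_cagr(total_growth: float, years: int) -> float:
    """Annual growth rate implied by a cumulative growth factor over `years`."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

years = 2030 - 2008  # 22 years, assuming 2008 as the baseline
low = implied_cagr(0.50, years)   # ~1.9% per year
high = implied_cagr(0.60, years)  # ~2.2% per year
print(f"Implied demand growth: {low:.1%} to {high:.1%} per year")
```

A cumulative 50-60% thus implies only around 2% per year – modest-sounding annual growth that nonetheless compounds into ‘a Saudi Arabia plus a Russia’ of new supply.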
Both conventional and non-conventional resources rely on ‘digital energy.’ The biggest resource in the
world is arguably the Athabasca tar sands. These need better designed processes and systems. As computers
get faster new applications come along. Not long ago subsalt imaging was considered impossible. It is
feasible now thanks to ‘smart people plus computers that are 1,000 times faster.’
Paul stressed that for the industry, ‘this is as good as it gets.’ An FPSO can be viewed as the most complex
mobile manufacturing facility ever. This can’t be designed with pencil and paper or even CAD/CAM – it
needs high performance computing (HPC). The Tengiz sour gas injection plant was needed so as not to
produce another cubic kilometer of sulfur and is equally complex. CERA estimated development costs at
$20/bbl. There was a time when this would have been considered outrageous. The digital oilfield is also
about improving decisions with decision support centers, operations centers and collaboration centers. Other
industries (defense, intelligence) have similar complex integration issues. Chevron’s 100-year-old Kern
River field in California has been completely revived by technology. Today there are 6,000 trucks moving
around the field. Chevron uses ‘integrated’ 4D movies for real time planning (work from USC). This has
been used to mitigate risks, spills and safety incidents.
Paul sees current trends centered on connectivity, computing, engineering and manufacturing, human-digital
relationships, virtual worlds, automation and robotics (well controls). According to Ray Kurzweil, today
$1,000 buys a gigaflop of compute power while the fastest machines approach the petaflop. In ten years, we
will have a petaflop for $1,000. Chevron had 5,000 terabytes of storage in 2007 – growing at an annual rate of
80% for technical and 60% for business data. ‘Intensity’ is rising constantly and requires advances in data
management and access technology. SCADA systems were not built with security in mind! An issue that is
being addressed by the US Department of Homeland Security’s ‘Linking the Oil and Gas Industry to Improve
Cyber Security’ (LOGIIC – www.logiic.org) program. Chevron, like the US Army, has a Second Life
simulation game to educate about energy.
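The quoted growth rates compound quickly. A minimal projection of the storage figures (the 80/20 split between technical and business data is our hypothetical assumption for illustration; only the 5,000 TB base and the 80%/60% rates come from the talk):

```python
# Project Chevron's quoted 5,000 TB (2007) forward at the quoted annual
# growth rates. The 4,000/1,000 TB split between technical and business
# data is a hypothetical assumption for illustration only.
def project(base_tb: float, rate: float, years: int) -> float:
    """Storage after compounding `rate` annually for `years` years."""
    return base_tb * (1.0 + rate) ** years

technical = project(4000, 0.80, 5)  # technical data at 80%/yr
business = project(1000, 0.60, 5)   # business data at 60%/yr
print(f"Hypothetical 2012 total: {technical + business:,.0f} TB")
```

At those rates the hypothetical total passes 85,000 TB within five years – which is why Paul flags data management and access technology as the binding constraint.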
Q&A
Shell – Could you elaborate on digital technology and safety?
Technology Watch Report 3 © 2008 The Data Room
Digital Energy 2008, Houston TW0808
Fields such as predictive spatial intelligence can be used to monitor when things are going out of
spec. More measurement and integration produce better predictive models that allow us to
anticipate ‘bad stuff.’ Here refining is further along the curve.
OpenSpirit – What about the McKinsey study that portrayed the industry as reluctant to use new
technology?
I don’t buy the story. I’ve been a contrarian on this for a long time. We are not talking about
consumer software. Bad technology in process can be catastrophic. So we implement scalable and
sure technology. In view of the consequence of failure, the time frame for take-up is about what you
would expect. We don’t use the latest and greatest stuff and fix it when it breaks. The time frame may
be 15 years, yes, but this is a hundred-year business. We experiment fast but don’t implement till it
works.
What is the impact of the ‘younging’ of the workforce?
This is the ‘bridge dilemma.’ The young have fantastic digital skills but no experience. Older folks
have the experience but don’t use an iPod! But there is no substitute for experience. Simulations
help people realize the consequences of bad decisions. If you blow up the plant in the simulator you
learn something. This may even be better than hanging out in Angola for a couple of years.
TW0808_2 Next Generation Production Surveillance - Tom Moroney, Shell
Shell’s deepwater Gulf of Mexico (GOM) asset portfolio is growing in size and complexity. More data has
to be processed routinely and efficiently. Much of the workforce is close to retirement and Shell is
mentoring a wave of young talent that is coming into the organization. One facet of this is ‘exception based’
surveillance workflows and tools feeding into a knowledge repository. This is allowing Shell to ‘routinize’
elements of its activity so as to avoid ‘firefighting.’ Shell’s old surveillance system in New Orleans was not
considered good enough. A new system was needed to offer centralized, consistent cross-asset surveillance
for the GOM and maybe Brazil and Alaska. The key to the new Production Operations Management Center
(POMC) is that it is ‘exception based,’ providing operating limits, drawdowns, performance curves for
topsides and subsea equipment. The idea is to understand conditions as they occur. The appropriate time
frame may be days, weeks or months. A virtual asset team can be constituted from POMC staff, operations
and process engineers which can apply ‘lean’ principles and consistent definitions across wells, facilities and
subsea. Technology in the POMC includes a portal, advanced alarm tool, workflow automation, dynamic
reporting and the knowledge repository. Alarm notifications pop up PI Historian trends and well test data for
analysis and ‘situational awareness’ – all managed in the workflow engine. Once the analysis is done, this is
handed off to engineering. After an intervention, the system captures new parameters and performs short
cycle optimization before issuing recommendations to operations. Upon which the surveillance cycle is
complete, captured and documented for later reference.
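The ‘exception based’ idea is simple to sketch: compare each incoming measurement against its operating envelope and surface only the breaches as alarms. A minimal illustration (tag names and limits are invented; Shell’s actual POMC is built on PI Historian, an advanced alarm tool and a workflow engine, none of which is shown here):

```python
# Minimal sketch of exception-based surveillance: readings are checked
# against per-tag operating limits, and only breaches surface as alarms.
# Tag names and limit values below are invented for illustration.
from dataclasses import dataclass

@dataclass
class OperatingLimit:
    low: float
    high: float

LIMITS = {
    "wellhead_pressure_psi": OperatingLimit(1500.0, 4500.0),
    "drawdown_psi": OperatingLimit(0.0, 800.0),
}

def exceptions(readings: dict) -> list:
    """Return alarm messages for readings outside their operating envelope."""
    alarms = []
    for tag, value in readings.items():
        limit = LIMITS.get(tag)
        if limit and not (limit.low <= value <= limit.high):
            alarms.append(f"{tag}={value} outside [{limit.low}, {limit.high}]")
    return alarms

# In-spec readings produce no alarms; the out-of-spec pressure does.
print(exceptions({"wellhead_pressure_psi": 5200.0, "drawdown_psi": 400.0}))
```

The point of the pattern is that engineers only see the exceptions list, not the full data stream – which is what lets a small virtual asset team cover many wells consistently.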
TW0808_3 Panel Session
Tony Edwards (BG) attended the EAGE workshop on Integrated Operations in Stresa last year and observed
that there was confusion as to what should be standardized, outsourced and automated. Edwards believes
standards should be limited to HSSE, integrity, procurement etc., but that well placement and other
‘creative’ processes should be approached differently. If you put too much in the standardized box you can
inhibit creativity. Similar issues arise with rigid process workflows. There is an approach that says, ‘don’t
standardize but go for a continuous improvement approach: start with the status quo and improve.’ Some
think the process never stops. Another aspect is that standardizing on how, rather than on what, may inhibit
innovation. These issues are particularly important if you have a diverse portfolio. One size will not fit all!
ExxonMobil – There is excitement about IT enablement and removing the mundane stuff so that people do
what they were trained to do. Exxon does internal videos for presentations to management. Everyone says
the new IT is wonderful. Exxon also advocates continuous improvement of its workflows and best practices.
TW0808_4 Workflows in digital oilfields – Anil Pande, Infosys
This talk concerns office automation rather than real time/process control workflows. Such workflows might
concern a workover, AFE, travel request etc. A workflow could be anything from a Perl script for a particular
task to something considerably more complex. Workflow is the glue that holds all these together. Standardization
is ‘desirable,’ allowing for standard processes that can be integrated with other workflows. There are no standard
tools – workflows can be mapped out in Word, Excel or PowerPoint, but there is ‘no best practice.’
Workflow modeling building blocks include participant, activity, transition and swim-lane.
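Pande’s building blocks map naturally onto a small data model: participants own swim-lanes of activities, and transitions link activities together. A hedged sketch, not Infosys code – the AFE approval flow below is invented for illustration:

```python
# Sketch of the workflow building blocks named in the talk: participant,
# activity, transition and swim-lane. The AFE approval flow is invented.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    participant: str  # the swim-lane owner

@dataclass
class Workflow:
    activities: dict = field(default_factory=dict)
    transitions: list = field(default_factory=list)

    def add(self, name: str, participant: str) -> None:
        self.activities[name] = Activity(name, participant)

    def link(self, src: str, dst: str) -> None:
        self.transitions.append((src, dst))

    def swim_lane(self, participant: str) -> list:
        """Activities belonging to one participant's swim-lane."""
        return [a.name for a in self.activities.values()
                if a.participant == participant]

afe = Workflow()
afe.add("draft_afe", "engineer")
afe.add("review_afe", "asset_manager")
afe.add("approve_afe", "asset_manager")
afe.link("draft_afe", "review_afe")
afe.link("review_afe", "approve_afe")
print(afe.swim_lane("asset_manager"))  # the manager's swim-lane
```

Mapping the same flow in Word or PowerPoint, as the talk suggests, captures the picture but not the executable glue – which is exactly the standardization gap Pande describes.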
Q&A
Shell – Are there any standards for workflow?
4 It’s surprising that no mention was made of the Workflow Management Coalition (http://www.wfmc.org/) in this context, or of the Web Services Business Process Execution Language (http://docs.oasis-open.org/wsbpel/2.0/OS/wsbpel-v2.0-OS.pdf), now supported by Microsoft’s Windows Workflow Foundation and BizTalk.
5 https://ipo.llnl.gov/.
6 This led to the invention of Wikinomics (http://www.usatoday.com/money/books/reviews/2007-01-02-wikinomics_x.htm).
7 A case could be made for these problems remaining as major ‘issues’ today.
Panel – Don Paul (Chevron), Washington Salles (Petrobras), Don Moore (OXY) and Steve Fortune (BP)
Moore – The oil and gas industry doesn’t get as much credit as it should for getting to where it is today. A
decade or so ago I was at a CERA conference where all these young folks from the technology side were
telling us ‘you oil and gas guys just don’t get it!’ Then the tech bubble burst. Today you should be
encouraged by how fast the gap has closed – and with far fewer people.
Fortune – We are also encouraged by the take up of technology. A couple of years ago we were in pilots
but now these are ‘full scale, value generating deployments.’
Paul – The data management issue is a good news/bad news story. The problem is now accepted – even
though it might be unsolvable – we now know how to live with it.
Royal Strategies – How do you leverage holistic sensing of an asset to adapt to emerging risk?
Fortune – Risk involves integrity and safety. We use predictive analytics – for instance in our GOM hubs it
gets hard to manage each piece of equipment to the same level. So we try to put the monitoring equipment at
optimum locations9.
Moore – You have to keep operations operating – it’s about making our numbers! And it’s going to get
harder and harder to do. We need to fix pump jacks before they are down. Saudi Aramco is still putting a lot
of effort into efficient operations. Everyone has to make their reserve numbers or it will be very painful!
Paul – We also need to constantly feed the financial community which requires lots of information. They
now can see a 2% products shortfall. For Chevron, every barrel not produced must be bought. At
$130/barrel, this means direct financial consequence for every shortfall. Environmental liabilities are no
small deal. We are constantly observed by the government and NGOs. Hence the need to predict and
prevent.
OpenSpirit – Referring to Don Paul’s remark about unmanageable data – is this a Chevron specific
problem?
8 Non-uniform rational basis splines.
9 Fortune mentioned GE’s turbine monitoring and perhaps SolArc.