When it comes to understanding the world underneath the earth’s surface, computers are turning out to be an oil company’s best friend.
Total is the latest of the Big Oil elite to announce record-setting supercomputing. The French integrated company introduced the world last week to its new Pangea supercomputer, which will process geological data to help Total geologists decide more accurately how and where to drill wells.
And it will help them do it faster, as the oil industry once again pushes the frontiers of computing in the search for oil.
“It is absolutely true that the seismic industry, probably next to the Defense Department, has always been the biggest user of computer power,” said Peter Duncan, CEO of Microseismic, a Houston-based geophysical service company. “The original post-World War II computer at MIT, the Whirlwind – one of its original tasks was to do seismic data processing.”
As with smartphones and other technology, the race for the best supercomputer is perpetual.
Total’s Pangea has a computing capacity of 2.3 quadrillion operations per second – the equivalent of 27,000 office computers – and ranks among the world’s top 10 in computing power, according to the company.
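The company’s comparison can be sanity-checked with simple arithmetic. The sketch below uses only the two figures in the article; the implied per-machine rate is derived, not an official spec.

```python
# Back-of-envelope check of Total's comparison. The 2.3 quadrillion
# operations/second and 27,000-office-computer figures come from the
# article; the per-PC rate is simply their quotient.

PANGEA_OPS_PER_SEC = 2.3e15   # 2.3 quadrillion operations per second
OFFICE_PCS = 27_000           # Total's stated equivalent

per_pc = PANGEA_OPS_PER_SEC / OFFICE_PCS
print(f"Implied per-PC rate: {per_pc / 1e9:.0f} gigaflops")  # ~85 gigaflops
```

That works out to roughly 85 billion operations per second per office machine, a plausible order of magnitude for a desktop of the era.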
BP has announced plans to open a new computing facility later this year in Houston, with central processing units that can process data at up to two quadrillion calculations per second – up more than 60 percent from its current computing systems.
Exxon Mobil Corp. has also touted its supercomputer in an in-house magazine, saying that its calculating abilities reach “next-generation” quadrillions of operations per second. The company declined to discuss its exact speed with the Houston Chronicle.
“Each company likes to get the best computer, and they stay there about six months,” said Ramanan Krishnamoorti, a professor and chief energy officer at the University of Houston. “What people in the industry really care about is the number of calculations it can do per second and the amount of capacity of data storage.”
The power of Pangea and its speedy peers is critical for grinding through the reams of data geologists collect in seismic imaging, which uses sound waves to map underground geology.
When the seismic data industry started a few decades ago, it measured only two dimensions, but improvements in computer processing have allowed the addition of a third and even a fourth dimension, with time now used as one of the variables.
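One way to picture the jump from 3-D to 4-D seismic is as a stack of 3-D survey volumes re-shot over the life of a field. The sketch below uses made-up survey dimensions purely to show how quickly even a toy time-lapse dataset grows.

```python
import numpy as np

# Hypothetical survey sizes, chosen only for illustration: a 3-D cube of
# inlines x crosslines x depth samples, re-shot several times to form a
# 4-D (time-lapse) dataset in which calendar time is the fourth axis.
INLINES, XLINES, SAMPLES = 200, 200, 1000
SHOOTS = 4  # survey repeated four times over the field's life (assumed)

survey_4d = np.zeros((SHOOTS, INLINES, XLINES, SAMPLES), dtype=np.float32)
print(survey_4d.nbytes / 1e9, "GB")  # 0.64 GB even at this toy scale
```

Real surveys are orders of magnitude larger, which is why comparing repeat shoots to track a changing reservoir demands supercomputer-class processing.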
“What you want to tell in real time is how the reservoir is changing,” Krishnamoorti said. “That is the big challenge – is there enough information to describe the changes? They are looking at sound waves propagating through the reservoirs, and from that, they start to interpret how the reservoir is changing with time.”
The industry practice of collecting data – and its use of computers – traces back decades, albeit with less sophisticated technology.
“When seismic began to be used around the Gulf Coast of Texas to find oil back in the 1920s and 1930s, there was a technician on the crew whose job it was to develop the films that captured the analog records, with all those wiggles,” Duncan said. “That guy was called ‘the computer’.”
The seismic industry has moved beyond that, but geologists have still been limited in how much of the seismic data they can use, because computers have not been able to handle the sheer quantity of the information.
“We send a sound signal down to the earth and capture it when it comes back to the surface, and all the time it is traveling down and coming back up, it is collecting information about the earth it passes through,” Duncan said. “The earth is a very complicated structure – it’s a lot of data.”
The seismic imaging process may involve a surface grid covering several hundred square miles, divided into 25-by-25-foot squares.
A series of sound waves generated by explosives, air guns or mechanical vibrators penetrates the grid vertically, often several miles deep, and the behavior of the sound waves underground is sampled 500 times a second to provide clues about what lies beneath.
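Those numbers imply an enormous raw dataset. The rough estimate below combines the article’s figures (the 25-by-25-foot cells and the 500-samples-per-second rate) with assumed values for survey area and listening time, both marked in the comments.

```python
# Rough estimate of raw samples in one survey. Grid cell size and sample
# rate are from the article; survey area and record length per shot are
# assumptions for illustration only.

SQFT_PER_SQMILE = 5280 ** 2      # feet per mile, squared
CELL_AREA_SQFT = 25 * 25         # 25-by-25-foot squares (from article)
AREA_SQMILES = 300               # "several hundred square miles" (assumed)
RECORD_SECONDS = 6               # listening time per shot (assumed)
SAMPLES_PER_SECOND = 500         # from article

cells = AREA_SQMILES * SQFT_PER_SQMILE / CELL_AREA_SQFT
samples = cells * RECORD_SECONDS * SAMPLES_PER_SECOND
print(f"{cells:.2e} grid cells, {samples:.2e} samples")
# 1.34e+07 grid cells, 4.01e+10 samples
```

Tens of billions of samples per survey, before any processing, is why month-long turnaround times were once the norm.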
Because all this data could take months to process, more guesswork has been required on the geology of reservoirs, making some of the most complex formations extremely difficult to tap.
The new supercomputers allow geologists to clear up some of these geological mysteries, better arming them to tell stories about the treasures that lie below with a specificity never previously imagined.
“You can think of a faster computer as like having a bigger lens, so that you get a sharper picture or look at more area faster,” Duncan said. “We cannot fully model or image the data exactly – we have to make some assumptions, and those assumptions make the picture just a little fuzzy. If you get a better computer, you make fewer assumptions, and in the same amount of time, get a more complete solution.”
The faster processors open the door for bigger projects and allow those collecting the data and the geologists who interpret it to exchange information and make needed adjustments.
“They are now doing in weeks what back in the early 1990s we didn’t think would be possible in 30 years,” said Kenny Laughlin, manager of geophysics for Halliburton’s Landmark Software and Services.
“Projects that used to take two years now take six months. You are processing the data faster, which allows for more interaction with interpretation – it allows for a shorter cycle – and it allows us to improve the image. This has been a goal in the industry for years.”
These improvements have been a key factor in why operators have been able to move into the increasingly complex plays in the deep water.
“The recent discoveries in the Gulf would not have been possible 20 years ago,” Laughlin said. “In deep water, you wouldn’t know where to go without this seismic data – the wells are so expensive and the structures are masked, you can’t tell where to drill by surface geography like you can onshore. Seismic has become critical.”