By Geoffrey Cann
There is no better candidate industry for edge computing than oil and gas, and it’s a wonder that the edge isn’t already closer than it is.
I was part of a virtual think tank this past week on the power and promise of edge computing for the oil and gas industry. We covered a lot of ground in a scant hour, owing to the clever composition of the thought partners (service companies, operators, technology providers, and researchers). Like many think tank events, the discussions were held under a kind of Chatham House Rule, closed to all but a few invitees, and unlikely to be broadcast (although it was recorded). As a community service, I am pleased to share my perspectives from the discussion.
What Is Edge Computing?
If you’re not familiar with edge computing, don’t worry. Edge computing is one of those terms invented by nerds in the computing industry to try to carve out a new product category amidst the background noise of the technology sector.
The “edge” in a business is where real operational work happens. For upstream oil and gas, the edge is a drilling rig that is grinding through rock, or a storage tank that holds fluids, or a well that is producing hydrocarbons. For midstream companies, the edge is a pump pushing fluids or gases, a generator producing power, or a truck hauling a tank. For a downstream company, the edge might be an unmanned retail site, or an airport in the bush.
Edge computing refers to the design of a computing environment in which a proper computer, with all the inherent accoutrements of our modern computer age, is located at these various edges for the purpose of providing automation support directly where the work happens.
An edge computer must be powerful enough to be useful, connected to the world through a network, able to run without the network, fitted out with the ability to share its data with other similar edge devices, upgradable or fixable on the fly, secure from cyber activity, and reliable where there might not be power, light, heat, dryness, or human supervision. A tall order.
The cynical, or possibly the perceptive, might note that a smart phone with the computing horsepower of a 1970s supercomputer is a nice example of edge computing. People do real work, like inspecting an asset, and their phones provide network connectivity, apps for recording observations, storage of the data, continuous upgrades, security from hackers, and tons of other useful features. Some are even waterproof.
Throughout my long career, we used terms like coal face or front office to refer to the place in the business where operational work happened. But in a world where coal is fast fading as a commodity for energy supply in favour of the sun, and the office has fallen victim to the celebrity micro-terrorist we call COVID-19, the edge will do.
The Old Edge
Oil and gas, and indeed the whole of the energy sector, has long relied on a kind of edge computing, except it has been called SCADA, or Supervisory Control and Data Acquisition. SCADA systems are exceptionally reliable (no blue screens of death), rugged, and run around the clock without fail. They are among the true workhorses of the energy world.
But they’re getting a little long in the tooth:
- Integrating with a SCADA system is expensive, which is a problem in a digital world which is all about integration.
- Older versions of SCADA are not enabled for the internet, which limits the ability to manage them remotely.
- SCADA systems rely on their obscurity for their security, which is a flawed strategy in a world with robotic hackers.
- They were designed for control, not for analysis, which makes them unsuited for a machine learning world.
Adding a new digital sensor to a running SCADA system is like adding a new joystick to a flying aircraft. You can do it, but no one wants to be on the plane with you at 30,000 feet while you’re poking around the cockpit with a screwdriver.
In much the same way that computing has evolved from mainframes to departmental servers to client server to cloud and mobile, SCADA systems are overdue for an upgrade, an overhaul or a full replacement.
Think Tank Observations
The think tank highlighted why oil and gas, particularly the upstream, is a sound candidate for edge computing:
- Much activity in the upstream takes place beyond the reach of modern telecoms networks, which limits the ability to use cloud computing. For some unexplained reason, oil and gas deposits are all found in dark, cold and remote places. Sometimes I think that oil and gas is just God’s way to teach humans geography.
- Remote sites are frequently already wired up with SCADA systems. The business case to supervise and control the edge with computers is already sound.
- Modern analytic tools, like machine learning, thrive on the volumes of data that real time sensors generate in operations. Oil and gas companies are furiously innovating around machine learning.
- Edge computing gives front line managers real forensic information about what is actually going on in the field, much closer to real time than the usual six-week reporting delay.
- Operators have already seen real performance gains — more production from existing assets, fewer personnel per operating well, greater leverage for staff, enhanced staff decision making, and an objective view of the asset that is unfiltered by spreadsheets and human bias.
But despite the fit of edge computing concepts with this business reality, edge computing in oil and gas faces lots of hurdles:
Even though an edge computer can happily compute away while not on a network, eventually it needs to phone home. Solving for the absence of always-on, reliable, low cost networks to make that call is a must to accelerate interest in edge computing.
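The store-and-forward pattern behind that “phone home” requirement can be sketched in a few lines. This is a minimal illustration, not any vendor’s API; the class and field names are invented for the example:

```python
import json
import queue


class EdgeBuffer:
    """Store-and-forward sketch: the edge device keeps working while
    offline, queueing readings locally and flushing them only when a
    connection becomes available. All names here are illustrative."""

    def __init__(self):
        self.pending = queue.Queue()

    def record(self, reading):
        # Always buffer locally first; the edge must run without the network.
        self.pending.put(json.dumps(reading))

    def flush(self, uplink_available):
        # "Phone home" only when the (often intermittent) network is up.
        sent = []
        while uplink_available and not self.pending.empty():
            sent.append(self.pending.get())
        return sent


buffer = EdgeBuffer()
for pressure in (101.3, 101.9, 102.4):
    buffer.record({"sensor": "wellhead-psi", "value": pressure})

print(buffer.flush(uplink_available=False))      # prints [] (offline, nothing leaves)
print(len(buffer.flush(uplink_available=True)))  # prints 3 (back online, queue drains)
```

A real field device would use durable local storage (flash, an embedded database) rather than an in-memory queue, so that buffered readings survive a power loss, but the principle is the same.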
Oil and gas is deeply concerned about the security of its business, since hydrocarbons are so dangerous. The industry has enough misguided adversaries who claim to be actively attempting industrial sabotage. A few companies have even been hacked. And SCADA environments are notoriously hackable. Edge has to be great at security.
Keeping a watchful eye on what happens at the edge is both necessary and unresolved. Is there an edge device that monitors the edge computer?
While the simplest edge computing structure is a single isolated edge computer running quietly, eventually the device needs to be updated (perhaps frequently, given the need to keep current with security patches). The way this is done is via a cloud connection. Operations might think that cloud issues are really a commercial IT question, but that is no longer correct. And which cloud to use? One of the think tank parties pointed out that while an edge device will certainly add value, it’s when the data from multiple edge devices are combined and analysed that machine learning becomes very powerful. Again, a cloud strategy becomes key.
It’s also increasingly clear that industry will need to embrace multiple cloud offerings. Amazon, Google, Microsoft, IBM and other niche players will duke it out over the market, leading to a level of unavoidable fragmentation. The future is looking cloudy. How will multiple cloud environments be connected, integrated, and supported? As importantly, sharing data from many companies can improve machine learning, but only if the companies agree to share data (and oil and gas companies still have huge hang ups about sharing data).
Management of the Edge
A proliferation of edge devices will require a new layer of management services to calibrate devices, carry out repairs, provide installation assistance, among other things, and provision such changes while the edge device is controlling the process. What will be the management protocols and practices for providing these services? Is there an emerging services industry that provides edge computer support?
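One such management practice, applying an update to a device that must keep controlling the process, can be sketched in miniature. The names and the health check below are hypothetical, offered only to illustrate the update-and-rollback discipline:

```python
class EdgeDevice:
    """Illustrative sketch of managed updates on an edge device: apply
    a new software version, verify it, and roll back automatically if
    the health check fails. Names and checks are hypothetical."""

    def __init__(self, version):
        self.version = version
        self.previous = None

    def apply_update(self, new_version, health_check):
        self.previous = self.version
        self.version = new_version
        if not health_check(self.version):
            # A bad update reverts rather than bricking the unit: the
            # device must keep controlling the process throughout.
            self.version = self.previous
            return False
        return True


device = EdgeDevice("1.0")
print(device.apply_update("1.1", health_check=lambda v: True))   # prints True; now on 1.1
print(device.apply_update("2.0", health_check=lambda v: False))  # prints False; rolled back
print(device.version)                                            # prints 1.1
```

Production systems layer far more on top of this (staged rollouts, signed images, watchdog timers), but the core protocol question is exactly the one above: who verifies the update, and what happens when verification fails at a site with no one on hand?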
The industry is short of the capital necessary to invest heavily in edge computing, despite the advantages of higher production, lower cost, and optimised emissions. This could create demand for new capital-light business models where a third party finances the edge computing in exchange for a share of improved performance.
The absence of an industry-level architecture for edge devices forces every edge computing service to provision all the layers of the edge stack: telecoms, bus, power, security, apps, sensors, data structure, dashboards, user interface, support tools. Provisioning includes operating system updates, app updates, rollbacks, backups, unit testing, and security validation. This usually results in the kinds of walled gardens that made SCADA historically costly. An open architecture would accelerate edge proliferation, but will the industry commission the creation of edge computing industrial standards or open protocols?
The jobs preservation gene has been vigorously triggered among many front line staff, and the “do nothing before retirement” gene is now active among many in management. Overcoming cultural barriers to adoption must be placed high on the agenda for enlightened managers intent on improving the business through edge computing.
The industrial logic of edge computing in the oil and gas industry is very compelling. The industry knows and understands the power of automation at the very front of the business, and pressures to lower cost, improve productivity, and provide for social distancing create strong interest in solving the many design and management hurdles facing this technology.
Check out my book, ‘Bits, Bytes, and Barrels: The Digital Transformation of Oil and Gas’, available on Amazon and other on-line bookshops.
Take Digital Oil and Gas, the one-day on-line digital oil and gas awareness course on Udemy.