Energy & Power Shifts from IoT Cloud to Edge Computing
Distributed energy resources can easily generate over one terabyte (TB) of data a day, which is pushed back to a centralized cloud platform to be processed, analyzed and ultimately acted upon. Assets now generate more useful data than the infrastructure can efficiently move. Furthermore, as more assets are connected, the cloud-platform framework is at risk of serious bandwidth problems (let's see what 5G delivers), as well as cybersecurity concerns.
In edge computing, computation is performed on distributed device nodes, reducing latency and security concerns. Batch uploads are sent at scheduled intervals for wider trend analysis across assets.
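To make the pattern concrete, here is a minimal sketch of that loop, assuming a hypothetical sensor feed and upload endpoint (both are stand-ins, not any vendor's API): the node acts on each reading immediately and ships only periodic summaries upstream.

```python
import random
import statistics
import time

UPLOAD_INTERVAL_S = 3600  # one batch upload per hour (illustrative choice)
buffer = []

def read_sensor() -> float:
    """Stand-in for real asset I/O, e.g. a turbine vibration reading."""
    return random.gauss(50.0, 5.0)

def upload_batch(summary: dict) -> None:
    """Stand-in for the call that pushes a summary to the cloud platform."""
    print("batch upload:", summary)

last_upload = time.monotonic()
while True:  # device loop: runs for the life of the node
    reading = read_sensor()
    buffer.append(reading)
    if reading > 65.0:
        # Real-time decisioning happens here, on the device itself
        print("local alert: threshold breached, act immediately")
    if time.monotonic() - last_upload >= UPLOAD_INTERVAL_S:
        # Only an aggregate leaves the site; raw readings stay at the edge
        upload_batch({"mean": statistics.fmean(buffer),
                      "max": max(buffer), "n": len(buffer)})
        buffer.clear()
        last_upload = time.monotonic()
    time.sleep(1)
```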
Step closer to the edge…
Software applications, data and services are moving back toward the asset, on either existing or additional hardware. Innovator Swim AI told us that handling data at the edge can be up to 1,000 times faster than round-tripping it to the cloud for processing, which can have a major impact on AI-based learning models and other advanced analytics.
The textbook definition of edge computing covers everything outside the cloud: all computing that happens at the edge of the network. In truth, many solutions take a hybrid approach, batch-processing some of the data at the edge before sending it either to an intermediary processing server sitting between the assets and the centralized cloud system (otherwise known as fog computing) or back to the cloud for larger activities such as big data processing or data warehousing. A side-by-side comparison of cloud and edge computing is shown in Figure 1.
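A hybrid deployment then needs a rule for where each workload runs. The toy dispatcher below is my own illustration (the tiers and thresholds are assumptions, not any vendor's logic): latency-critical work stays on the device, site-level aggregation goes to a fog server, and bulk jobs go to the cloud.

```python
from enum import Enum

class Tier(Enum):
    EDGE = "edge"    # on-device: real-time control and alerts
    FOG = "fog"      # intermediary server near the assets
    CLOUD = "cloud"  # centralized: big data processing, warehousing

def route(max_latency_ms: float, batch_size_mb: float) -> Tier:
    """Pick a processing tier from a workload's latency and data volume."""
    if max_latency_ms < 50:   # control loops cannot wait for a round trip
        return Tier.EDGE
    if batch_size_mb < 100:   # modest batches aggregate nearby
        return Tier.FOG
    return Tier.CLOUD         # everything heavy goes upstream

print(route(10, 0.1))      # turbine control loop   -> Tier.EDGE
print(route(500, 20))      # site trend analysis    -> Tier.FOG
print(route(60000, 5000))  # fleet data warehouse   -> Tier.CLOUD
```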
For oil & gas, power utilities and other energy players across the value chain, edge computing can enhance production capabilities, improve processes, extend asset life and create opportunities to deploy additional capabilities.
Many of the players with critical infrastructure (grids, power plants) have limited connectivity due to cyber-security concerns, which edge computing can address by removing the need to connect to the web. By 2021, 75% of enterprise-generated data will be processed at the edge, up from under 10% in 2018. Valued at $2.2 billion today, the global edge computing market is predicted to reach $9.2 billion by 2023.
Innovators enabling edge computation
Many of the edge innovators act as the orchestration layer within the software framework, enabling applications and services to run at the edge. Players such as Zededa, founded in 2016, have developed edge virtualization software platforms, providing a standardized way to interact with and create applications for the edge. The startup told us that, through applications built into its platform, it has supported a wind farm operator with energy management and asset health programs.
Pixeom, founded in 2014, raised $19.6 million in a January Series A round from Intel and National Grid, and has created an edge computing platform that can run multiple applications. By removing the need to buy dedicated hardware to run software such as Microsoft’s Azure Stack, the startup has lowered the entry point for new edge users. The ability to run multiple applications in isolation on the same edge device also reduces costs through server consolidation. CEO Sam Nagar explained that the energy and industrial sectors represent 30% of revenue and 50% of growth, due to the volume of legacy data and the general market apprehension about modernizing. Energy customers have reported a 20% reduction in maintenance, repair and operations costs, as well as reductions in safety stock and in the administrative costs of managing and replenishing inventory.
Innovators creating edge applications
The real value created through edge computing is the ability to run advanced asset optimization (see my previous blog for a fuller explanation) in real time. Core cloud-based analytics are now being reduced in size to run efficiently at the edge. Swim AI has created a 2 MB software package for ingesting and handling edge data, assisting with data flow and using digital twin solutions to listen to assets and create digital duplicates of them, enabling real-time predictive maintenance and machine learning. The company currently operates in the oil & gas, utility, smart city, manufacturing and logistics sectors, with over 34 million connected assets.
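As a rough sketch of the digital-twin idea (my own illustration, not Swim AI's actual API), each asset gets a small in-memory twin that learns its normal behaviour from the stream and raises a maintenance flag when readings drift:

```python
from dataclasses import dataclass

@dataclass
class AssetTwin:
    """Lightweight in-memory twin of one field asset (hypothetical)."""
    asset_id: str
    mean: float = 0.0
    m2: float = 0.0   # running sum of squared deviations (Welford)
    n: int = 0

    def ingest(self, reading: float) -> bool:
        """Update running statistics; return True if the reading is anomalous."""
        self.n += 1
        delta = reading - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (reading - self.mean)
        if self.n < 30:                      # warm-up before flagging
            return False
        std = (self.m2 / self.n) ** 0.5
        return std > 0 and abs(reading - self.mean) > 3 * std

twins: dict[str, AssetTwin] = {}

def on_reading(asset_id: str, reading: float) -> None:
    twin = twins.setdefault(asset_id, AssetTwin(asset_id))
    if twin.ingest(reading):
        print(f"maintenance alert: {asset_id} reading {reading:.1f} looks anomalous")
```

Because each twin carries only a few numbers of state, millions of them can coexist on modest hardware, which is what makes per-asset modelling plausible at the edge.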
Some more established players have created platforms that enable edge processing and also provide in-house analytics. FogHorn developed an edge intelligence platform for industrial and commercial IoT applications and has now raised almost $50 million in equity investment from investors including Saudi Aramco, Honeywell, General Electric and Bosch. The platform acts as an edge enabler while bringing analytics and machine learning to the on-premises edge environment, enabling machine performance optimization, proactive maintenance and operational intelligence.
As the edge computing market has emerged, more niche software solutions have also grown up around it, with new business value created by novel solutions catering specifically to edge infrastructure. Deeplite, founded last year, is looking to replicate deep-learning neural network capabilities at the edge; today these workloads run on centralized, high-powered graphics processing units, using highly complex processing and consuming, on average, 3,000 hours of electricity per run cycle. By redesigning high-performance algorithms, the optimized neural networks can forecast against a high throughput of distributed data while remaining only a few MB in size. As more advanced analytics shrink to edge scale, a broader set of opportunities will emerge to capture value and reduce operational expenditure. Figure 2 outlines some of the investments made into edge computing innovators.
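As a rough illustration of this kind of shrinking (one common technique, post-training dynamic quantization in PyTorch; Deeplite's actual methods are proprietary and may differ), the sketch below stores a small network's weights as 8-bit integers instead of 32-bit floats:

```python
import io

import torch
import torch.nn as nn

# A small stand-in network; a real edge model would be trained first.
model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

# Post-training dynamic quantization: Linear weights are kept as int8
# instead of float32, roughly a 4x reduction in weight storage.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_kb(m: nn.Module) -> float:
    """Serialized size of a module's parameters, in kilobytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e3

print(f"fp32: {size_kb(model):.0f} kB -> int8: {size_kb(quantized):.0f} kB")
```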
Applications
Since business-specific drivers for energy companies depend on where a player sits on the digitization journey, many begin with single applications, typically around asset instrumentation, before running predictive analytics and asset process optimization in real time:
- Saudi Aramco, a customer of and investor in FogHorn, used edge computing for flare stack monitoring in a gas refinery, addressing issues with limited communications and compute resources, environmental and regulatory compliance, and maintenance costs.
- National Grid, which utilizes Pixeom’s solution, is running prescriptive analytics at the edge for real-time closed-loop optimization of some of its assets.
Challenges to overcome
- Unlike cloud-based processing systems, edge computing systems cannot support heavy-weight software due to hardware constraints. Even as the market shrinks analytics down to edge scale, big data analysis and data warehousing will remain infeasible within the existing framework of edge systems.
- The lack of a standardized framework among edge providers makes it difficult for customers and users to find an edge provider they can trust for the long term.
- The ability to develop software and hardware that can handle computational offloading from the cloud is critical.
Keep an eye on…
Evolving use cases for edge computing in the energy sector include machine-to-machine security validation, blockchain for enhanced ledger-based communication between edge devices, and the use of 5G to enhance advanced edge computing capabilities (e.g. deep learning and prescriptive optimization). Watch how edge computing becomes ubiquitous across distributed energy resources: from asset health monitoring of substations to country-scale grid transmission modelling, edge in energy is yet to have its biggest impact.
Join us in Stockholm for more insight into the future of Energy & Power.