Data Abundance – Opportunities in Planet Management
Author’s note: In January 2019, I will be moderating a panel on the topic of space-based data utilization at the Cleantech Forum San Francisco. In this blog, we look at market dynamics in this segment and highlight some of the innovators who will be joining us in January to discuss these topics in more detail. I hope you will be joining us too!
Modern satellites and sensors are capturing huge amounts of data – around 2.5 billion gigabytes per day. Converting this data into actionable insight will create value by helping us understand how to mitigate climate change caused by industrial activity. Two areas of interest are the use of existing capacity through "space as a service" and advances in data analytics.
Data Collection
Hundreds of satellites were scheduled for launch in 2018; however, launch bottlenecks are causing delays, extending the first-mover advantage of existing constellation owners. Private-sector operators Spire and Planet are two examples, with over 300 satellites launched between them. These satellites capture data for customers across multiple sectors, including agriculture, transportation, energy & power and resource monitoring.
We spoke to Nick Allain, Head of Brand at Spire, who highlighted the rapid iteration of hardware and software innovation that is enabling greater data capture. Spire's satellites can now adjust their optics and tune relay frequencies via software-defined radios, leading to a 10x increase in usable data received from its constellations.
Technical advances have also shrunk the sensor footprint on satellites, enabling new business models. Spire is moving to capture value from this spare payload space, recently announcing a space-as-a-service (SpaaS) offering that allows third-party sensors to be packaged onto existing Spire payloads.
Space-as-a-Service
This ‘SpaaS’ business model will be the first service that lets prospective satellite operators get to space in under a year. Nick believes this cost reduction and re-use of Spire’s existing infrastructure, know-how and supply chain will allow further expansion of applications. Examples he gave include specialized environmental monitoring and detailed real-time tracking for industrial markets – for example, energy companies tracking IIoT sensors on their pipelines. The model also generates additional revenue for Spire – the service is priced at around $10 million per launch.
Spire is also developing solutions to terrestrial data-storage issues. The huge volumes of data generated from satellite imagery are fed into cloud-based analytics platforms such as AWS. Storing and rapidly retrieving that much data is extremely costly for real-time cloud-based analytics – Planet alone generates six terabytes of data per day. Spire has been exploring attaching NVIDIA Tegra GPUs to satellites for on-board processing, so that less data needs to be beamed down and stored on cloud servers.
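The bandwidth savings from on-board processing can be illustrated with a toy calculation (the frame sizes, detection counts and record sizes below are hypothetical, not Spire's actual figures): instead of downlinking full raw frames, the satellite transmits only compact detection records.

```python
# Toy illustration of on-board processing: downlink only detection
# records, not raw frames. All numbers are hypothetical.

RAW_FRAME_BYTES = 50 * 1024 * 1024   # assume 50 MB per raw image frame
DETECTION_BYTES = 256                # assume one compact record per detection

def downlink_volume(frames, detections_per_frame, onboard_processing):
    """Bytes that must be transmitted to the ground station."""
    if onboard_processing:
        # Only the extracted detections are beamed down.
        return frames * detections_per_frame * DETECTION_BYTES
    # Every raw frame is beamed down for processing on the ground.
    return frames * RAW_FRAME_BYTES

raw = downlink_volume(frames=1000, detections_per_frame=20, onboard_processing=False)
processed = downlink_volume(frames=1000, detections_per_frame=20, onboard_processing=True)
print(f"raw downlink:       {raw / 1e9:.1f} GB")
print(f"processed downlink: {processed / 1e6:.1f} MB")
print(f"reduction factor:   {raw / processed:,.0f}x")
```

Even with generous assumptions about how many detections each frame yields, summarizing on board cuts the downlink (and downstream cloud-storage bill) by several orders of magnitude.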
Data Analytics
Advances in sensor technology and reductions in launch costs are driving hardware innovation, but data is still the only monetizable commodity currently brought back from space. Planet’s corporate restructuring in July reflects this reality, with less than 10% of staff let go as the company refocused to “develop commercial products and build a successful business.” The restructuring coincided with the launch of Planet Analytics, a geospatial platform moving the company downstream from imagery to insights.
To date, the low-hanging fruit for geospatial analytics services like Planet Analytics, Orbital Insight and Descartes Labs has been government and defense contracts, along with clients in industry segments including finance and insurance.
Applications for Data
With improved sensor technology, industrial applications for this data are emerging. The Oil & Gas Climate Initiative (OGCI) has announced that its primary goal is to reduce industry methane emissions by one-fifth by 2025, and it has invested in two data-enabled startups addressing methane emissions – GHGSat and Kairos Aerospace.
Another use case example is natural asset management and renewable energy optimization. Glasgow-based Global Surface Intelligence (GSI) recently completed the Seraphim Space Accelerator and closed around $1 million in seed funding to build out its platform, which is focused on asset management for renewable energy producers. GSI turns raw imagery from satellite, LiDAR and other data sets into analytics to better manage natural assets. This practice is crucial for future performance of FTSE (Financial Times Stock Exchange) companies, which a recent report estimates could lose $1.6 trillion in value if they do not factor in ‘natural capital assets’ within their operations.
Descartes Labs is creating a digital twin of large global supply chains, with customers including the agribusiness Cargill. James Orsulak, Descartes’ Senior Director of Business & Sales, told us that Descartes has evolved beyond satellite imagery, integrating various data sets into a “data refinery.” This data refinery fuses large disparate data sets – including satellite imagery, internal customer data, RFID tags and widely available data such as shipping and rail arrival times – to provide a competitive advantage.
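At its core, this kind of fusion means joining records from very different sources on a shared key, such as a facility and a date. The sketch below is purely illustrative – the field names, records and `refine` helper are invented for this post and are not Descartes Labs' actual schema or API.

```python
# Illustrative "data refinery" style join: merge satellite observations
# and rail arrival times keyed by (facility, date). All names and
# values are invented for illustration.

satellite_obs = {
    ("elevator_7", "2018-11-01"): {"stockpile_m3": 12_400},
    ("elevator_7", "2018-11-02"): {"stockpile_m3": 11_900},
}
rail_arrivals = {
    ("elevator_7", "2018-11-02"): {"railcars": 18},
}

def refine(keys, *sources):
    """Merge the records from each source for every (facility, date) key."""
    fused = {}
    for key in keys:
        record = {}
        for source in sources:
            record.update(source.get(key, {}))
        fused[key] = record
    return fused

keys = set(satellite_obs) | set(rail_arrivals)
for key, record in sorted(refine(keys, satellite_obs, rail_arrivals).items()):
    print(key, record)
```

A production system adds schema normalization, geospatial alignment and quality checks on top, but the value proposition is the same: one fused record per asset and time period, drawn from sources no single provider holds.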
Initially, the company’s product focused on agricultural and energy supply chains, but the approach has since broadened, with Descartes moving into large physical supply chains such as ocean, rail and mining. The new product is delivered via a cloud-based platform that provides clients with a tailored real-time API.
In the future, we see abundant space-based data providing a complement to macro-trends such as autonomous driving and smart city integrated hubs. These applications will require huge amounts of data, but also interoperable systems which can integrate varied data sets. These markets represent big growth opportunities for data providers and integrators between 2020 and 2025.