Predicting Climate Change: Resilience, Risk and… Revenue?
Extreme weather events have caused significant economic damage over the past 24 months, with total economic losses from natural disasters in 2017 reaching $337 billion (source: Swiss Re). Wildfires in California, drought in Australia and storms across the U.S. and Asia have devastated communities, infrastructure and assets that are ill-equipped to deal with a changing environment.
In response to these challenges, innovators are developing solutions that address climate-related weather volatility, while city and municipal planners explore new ways to build resiliency into infrastructure.
Jupiter Intelligence is one example of an innovator creating value from access to huge amounts of data and processing power in response to climate change. Our conversation with its CEO provides insight into new services for an interrelated set of customers.
Building climate resiliency
In the US, Europe and most major cities in Asia, infrastructure was designed and built decades ago, with the vast majority constructed for a less volatile environment. Today’s higher-frequency extreme weather events are putting critical infrastructure – such as highways, water facilities and the power grid (an issue in the US more than in Europe, according to a recent study) – under growing pressure.
Cities are adapting to these new conditions by building resiliency into the planning and construction of new infrastructure. For example, the Government of Malaysia commissioned a PPP construction project in Kuala Lumpur to develop smart stormwater-management systems for traffic tunnels in the event of extreme flooding. In the US, Miami Beach is introducing resiliency measures to combat climate-change-related flooding, with a $400 million plan to upgrade infrastructure and raise roads along the oceanfront.
Cities can also adopt digital solutions to improve resiliency. For example, CrowdHydrology is a platform that turns crowd-sourced water-level readings submitted by citizens into real-time flooding insights.
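To make the idea concrete, here is a minimal sketch of how crowd-sourced readings might be aggregated into a simple flood alert. The data format, site names and threshold are illustrative assumptions, not CrowdHydrology’s actual interface.

```python
# Minimal sketch (hypothetical data format): aggregating crowd-sourced
# water-level reports into a simple per-site flood alert.
from statistics import median

# Each report: (site_id, water_level_m) as a citizen might submit it.
reports = [
    ("site_042", 1.2), ("site_042", 1.3), ("site_042", 1.25),
    ("site_107", 0.4), ("site_107", 0.5),
]

FLOOD_THRESHOLD_M = 1.0  # assumed threshold; real sites would each have their own

def flood_alerts(reports, threshold=FLOOD_THRESHOLD_M):
    """Return sites whose median reported water level exceeds the threshold."""
    by_site = {}
    for site, level in reports:
        by_site.setdefault(site, []).append(level)
    return {site: median(levels) for site, levels in by_site.items()
            if median(levels) > threshold}

print(flood_alerts(reports))  # {'site_042': 1.25}
```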
Cloud-based climate monitoring: from macro- to micro-scope
As well as building long-term resiliency into new infrastructure at the city and municipal level, improved data on the changing climate is allowing innovators to adapt and respond to change over much shorter time horizons than traditional methods allow.
Climate models have been refined for decades, producing sophisticated algorithms that explain historic weather patterns and predict future change. Running them has traditionally required supercomputers – the UK Met Office’s three Cray XC40 supercomputers can perform 14,000 trillion calculations per second.
More recent developments in computational processing speed and cloud-based infrastructure have given startups access to the tools needed for climate modelling, creating opportunities for innovators to deliver data insights to an audience beyond government and academia.
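As a flavour of the kind of analysis that commodity compute now makes cheap, the sketch below counts extreme-heat days in a daily temperature series with xarray. The data here is synthetic; a real workflow would open a reanalysis or model-output NetCDF/Zarr store instead of generating random values.

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic daily maximum temperature (deg C) with a seasonal cycle.
time = pd.date_range("2015-01-01", "2018-12-31", freq="D")
rng = np.random.default_rng(0)
tmax = 22 + 12 * np.sin(2 * np.pi * time.dayofyear / 365) + rng.normal(0, 3, len(time))
da = xr.DataArray(tmax, coords={"time": time}, dims="time", name="tmax")

# Days per year exceeding a 35 deg C extreme-heat threshold.
hot_days_per_year = (da > 35).groupby("time.year").sum()
print(hot_days_per_year.to_series())
```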
Innovator Profile: Jupiter Intelligence
Jupiter Intelligence is one of the innovators delivering climate insights to new sectors. The San Mateo-based company predicts risk from severe weather and climate change, focusing on specific perils such as flood, fire, extreme heat, wind and drought.
We recently spoke to Rich Sorkin, CEO of Jupiter, who highlighted the shift under way from low spatiotemporal-resolution monitoring at a regional level towards asset-specific monitoring at much finer resolution (down to 1 meter).
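The step that makes monitoring “asset-specific” is essentially sampling a high-resolution hazard surface at each asset’s location. The sketch below illustrates the idea with a synthetic grid and nearest-neighbour lookup; it is not Jupiter’s actual method, and the grid, coordinates and asset names are assumptions.

```python
# Illustrative sketch: sampling a gridded hazard layer at asset coordinates,
# turning regional-scale data into asset-specific risk values.
import numpy as np

# Synthetic 1 m-resolution flood-depth grid covering a small area.
# Rows/cols map to northing/easting offsets (metres) from a local origin.
grid = np.random.default_rng(1).random((500, 500))  # flood depth in metres

def sample_hazard(grid, easting_m, northing_m, cell_size_m=1.0):
    """Nearest-neighbour lookup of the hazard value at an asset location."""
    row = int(round(northing_m / cell_size_m))
    col = int(round(easting_m / cell_size_m))
    return float(grid[row, col])

assets = {"substation_A": (120.3, 410.7), "pump_station_B": (333.0, 48.2)}
for name, (e, n) in assets.items():
    print(name, round(sample_hazard(grid, e, n), 2))
```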
Recent advances enable higher-quality satellite imagery (for more, read my blogs on geospatial trends and opportunities), and cloud-based data repositories allow the ingestion of heterogeneous datasets – cell phone data, IoT sensor readings, existing climate records, and network infrastructure records held by utilities – all of which can be processed into data insights.
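In practice, ingestion of this kind usually comes down to joining disparate sources on a shared key. The sketch below is a hedged example using pandas; the datasets, column names and scores are illustrative, not a real customer’s schema.

```python
# Sketch of heterogeneous-data ingestion: joining a utility's asset register
# with IoT sensor readings and a climate-hazard score into one asset-level view.
import pandas as pd

assets = pd.DataFrame({
    "asset_id": ["A1", "A2", "A3"],
    "asset_type": ["transformer", "pump", "transformer"],
})
iot_readings = pd.DataFrame({
    "asset_id": ["A1", "A1", "A2", "A3"],
    "temp_c": [61.0, 64.5, 38.2, 71.3],
})
hazard_scores = pd.DataFrame({
    "asset_id": ["A1", "A2", "A3"],
    "flood_score": [0.7, 0.2, 0.9],
})

# Aggregate the sensor stream, then join everything on asset_id.
peak_temp = iot_readings.groupby("asset_id", as_index=False)["temp_c"].max()
asset_view = assets.merge(peak_temp, on="asset_id").merge(hazard_scores, on="asset_id")
print(asset_view)
```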
In Jupiter’s case, experts and executives have been recruited from climate, weather, insurance, finance, utilities, AI, satellites, IoT, and cloud computing to focus on three specific target customer segments:
- Physical asset owners – such as utilities and energy companies
- Financial services companies – such as insurance providers
- Public sector – such as government organizations and NGOs
An important part of this go-to-market strategy is the synergy between these three customer sets. Serving overlapping customers in the same locations reduces the risk of excessive opex from storing huge volumes of data on AWS or Google Cloud (costs that add up quickly when the data is spread over a large geographic area). This synergistic early customer set also allows Jupiter to focus on deepening relationships and market penetration in a smaller segment before expanding to more sectors and geographies.
On the O&M side, asset-level analytics offer greater granularity in understanding how complex systems work together and which parts of a network are most threatened by extreme weather. Operationally, organizations can also use this asset-level risk data to make quicker decisions on resiliency and adaptation plans.
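One simple way to act on that granularity is to rank assets by a combined exposure-and-criticality score when prioritising resiliency work. The scoring scheme below is an assumption for illustration only.

```python
# Minimal sketch (assumed scoring scheme): ranking network assets by
# hazard probability x criticality to prioritise resiliency investment.
assets = [
    # (asset_id, annual hazard probability, criticality weight 0-1)
    ("substation_A", 0.30, 0.9),
    ("feeder_12", 0.55, 0.4),
    ("pump_station_B", 0.10, 0.8),
]

ranked = sorted(assets, key=lambda a: a[1] * a[2], reverse=True)
for asset_id, hazard, criticality in ranked:
    print(f"{asset_id}: priority score = {hazard * criticality:.2f}")
```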
For service providers such as insurers, an asset-level understanding of climate risk can improve risk modeling for underwriting. It also opens the door to new forms of insurance for previously uninsurable parts of the market.
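A hedged example of how asset-level hazard data can feed underwriting: combining an annual event probability with an estimated damage gives an expected annual loss, a common starting point for a technical premium. All numbers below are made up.

```python
# Expected annual loss = annual event probability x damage if the event occurs.
def expected_annual_loss(annual_event_probability, damage_if_event):
    return annual_event_probability * damage_if_event

eal = expected_annual_loss(annual_event_probability=0.02,  # 1-in-50-year flood
                           damage_if_event=500_000)        # USD damage estimate
loading = 1.4  # illustrative expense and risk loading
print(f"Expected annual loss: ${eal:,.0f}, indicative premium: ${eal * loading:,.0f}")
```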
Keep an eye on …
WorldCover, a Y Combinator graduate that is developing parametric insurance products based on geospatial imagery of crop fields in emerging economies. By understanding climate-related risk at a micro scale (the asset, in this case, is a farmer’s field), WorldCover can pay out parametric insurance to farmers – a payout triggered automatically by a pre-agreed index when flood or drought occurs, rather than by a claims assessment (a simple trigger is sketched below). This creates a new insurance product and provides a safety net to those most exposed to climate-related risks.
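For illustration, here is a minimal sketch of a drought-style parametric trigger. The index, thresholds and payout schedule are assumptions, not WorldCover’s actual product terms: the payout scales linearly as seasonal rainfall falls between a pre-agreed trigger and exit level, with no loss assessment involved.

```python
# Linear parametric payout between a rainfall trigger and an exit level.
def parametric_payout(seasonal_rainfall_mm, trigger_mm=250, exit_mm=100, max_payout=1000):
    if seasonal_rainfall_mm >= trigger_mm:
        return 0.0
    if seasonal_rainfall_mm <= exit_mm:
        return float(max_payout)
    shortfall = (trigger_mm - seasonal_rainfall_mm) / (trigger_mm - exit_mm)
    return round(max_payout * shortfall, 2)

print(parametric_payout(300))  # 0.0    -- adequate rain, no payout
print(parametric_payout(180))  # 466.67 -- partial payout
print(parametric_payout(80))   # 1000.0 -- full payout
```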