Earth Observation Big Data - The Hammer Mindset Lacks the Edge - 08/06/2018
An old adage tells us that if all we have is a hammer, we see every problem as a nail. So for bigger problems we get a bigger hammer...
The European Sentinels generate more than 10TB of free scientific-grade imagery every day. Over the past few years I’ve heard about truly amazing new approaches to handling output at this scale. These have ranged from bigger storage and faster processors to better fibre and wholesale use of the Cloud. Sounds like Big Data needs a Big Hammer... right?
So here’s a thought. Only a small proportion of commercial satellite data is ever monetised. It follows that we can’t afford to treat all data the same way. With that in mind, are we sure the hammer is up to the job?
Data transport, processing and storage drive costs, especially at these volumes. I mean, why even move data from the ground station if it can never be sold? Meanwhile, processing usually means storing the same data over and over again in multiple versions. This is great if you sell storage.
But let’s say you have all these products ready to go. What are the chances you have exactly what any specific customer actually wants? This is a case of “one size fits no one”. The Sentinels are great in many ways, but their standard products don’t quite work for anyone straight out of the box. So where’s the flexibility?
There are other perspectives out there too – what I would cautiously call “smarter” approaches. “Just-in-time” or “lean” methods have been around for a long time, and they can have a positive impact on the bottom line, in terms of both investment and operational costs (see the sketch after this list):
- Only move the data you really need
- Don’t process the data until you need to
- Only process the right data.
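To make these three principles concrete, here is a minimal sketch of what a “just-in-time” chain could look like. It is an illustration only: the catalogue, fetch_raw and process are hypothetical stand-ins, not any real ground-station or Sentinel API. The point is the shape of the flow: transport and processing are deferred until an order actually arrives, and repeat orders reuse earlier work instead of touching the upstream chain again.

```python
from functools import lru_cache

# Hypothetical catalogue of raw scenes still sitting at the ground station.
# In the traditional chain, every one of these would be transported,
# processed into several product levels and stored before any order arrived.
RAW_CATALOGUE = {
    "SCENE_001": "groundstation://antenna-1/SCENE_001.raw",
}

def fetch_raw(scene_id: str) -> bytes:
    """Principle 1: only move the data you really need.
    Transport happens here, and only for scenes a customer has ordered."""
    print(f"transporting {RAW_CATALOGUE[scene_id]}")
    return b"raw sensor data for " + scene_id.encode()

@lru_cache(maxsize=128)
def process(scene_id: str, product_type: str) -> bytes:
    """Principles 2 and 3: nothing is processed until it is needed,
    and each (scene, product) pair is processed at most once."""
    raw = fetch_raw(scene_id)
    print(f"processing {scene_id} as {product_type}")
    return raw + b" -> " + product_type.encode()

# Up to this point, no transport, processing or storage cost has been incurred.
# The first order triggers exactly the work needed to fulfil it...
process("SCENE_001", "orthorectified")
# ...and a repeat of the same order is served from cache,
# touching nothing upstream.
process("SCENE_001", "orthorectified")
```

The caching here is just Python’s built-in lru_cache; in a real system it would be an object store or product cache, but the economics are the same: cost follows demand instead of preceding it.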
I doubt anyone reading this thinks these are controversial ideas. So why are we stuck in the old paradigm of “move/process/store everything”? After all, it came down to us from the early days of satellite imaging, when the technology gave us no alternative. Today we have other options. And yes, everyone now talks about processing “on the fly”, but typically this happens at the end of the chain, only after significant money has already been handed over to our three old friends: data transport, processing and storage.
I believe the root of the issue is the misconception that flexibility is expensive. That is, the flexibility to select, process and serve up just the data needed to meet each customer’s specific needs. Today the whole chain can be built this way - from the ground station (or even the satellite itself) all the way to the user. And building for flexibility doesn’t need to cost any more than a traditional stovepipe. Over the lifetime of a constellation it will pay dividends through operational savings on transport, processing and storage. So I would argue that “smarter” approaches are advantageous, and perhaps even essential, to a healthy satellite imaging business. Focus on rapid access and processing of the precise data needed to create exactly the product your customer wants - without spending money and effort on the rest.
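As an illustration of what that flexibility might look like in code (again, a hypothetical sketch: ProductSpec and serve are my names, not any existing system’s), each order can carry a per-customer specification, so that selection, processing and delivery all key off what was actually asked for:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProductSpec:
    """One customer's exact requirements, instead of a fixed standard product."""
    scene_id: str
    bands: tuple[str, ...]                   # e.g. just red + NIR for an NDVI user
    bbox: tuple[float, float, float, float]  # area of interest (lon/lat degrees)
    resolution_m: int                        # target ground resolution

def serve(spec: ProductSpec) -> dict:
    """Select, process and serve only what the spec asks for.
    Bands and footprint outside the spec are never moved or computed."""
    return {
        "scene": spec.scene_id,
        "bands": spec.bands,
        "footprint": spec.bbox,
        "resolution_m": spec.resolution_m,
    }

# Two customers, two different products from the same scene - and neither
# requires the full standard product to be generated or stored first.
print(serve(ProductSpec("SCENE_001", ("B04", "B08"), (11.2, 46.4, 11.5, 46.7), 10)))
print(serve(ProductSpec("SCENE_001", ("B02", "B03", "B04"), (11.0, 46.0, 12.0, 47.0), 20)))
```

The band names follow the Sentinel-2 convention purely for familiarity; the mechanism is the same for any sensor.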
Innovative ways to keep CAPEX and OPEX under control sharpen the competitive edge. And I think we can all agree that hammers don’t have much of an edge. The technologies are out there, but first we need to change our perspective. Once we’ve let go of the hammer, the whole data chain can be lean and flexible. We won’t try to deliver all the data to everyone (or indeed, most of it to no one) - instead we’ll deliver the right data to the right people just when they need it.
This article was published in GIS Professional, June 2018.