Have you ever heard of “real-time data streaming”, “data lakes”, or “big data”? These are some of the terms you might encounter when it comes to handling data. Even though technology has changed over the years, the needs of our customers stay the same: making business decisions based on data. In this blog post, we shed light on the technological shifts in data storage and transmission over the last forty years that opened the door to the technologies Zeppelin uses in its data platforms today.
Data storage then & now
How has data storage developed over the last forty years? Let’s make it concrete with an example: what does a Volkswagen Golf VIII Style 1.5 TSI ACT have in common with a tiny screw? Each represents the cost of storing 1 GB of data, just in different decades! In 1981, storing 1 GB cost approximately EUR 40,000, whereas today it costs only around EUR 0.02.
The evolution of data transmission
And how has data transmission developed over the years? Let’s look at another example: what does a 200 km round-trip hike from Garching to Friedrichshafen (Germany) have in common with taking five deep breaths? Both take as long as downloading a 1 GB file, again just in different decades! In 1991, using a 56k modem, the download took about 80 hours, whereas today it takes only 30 seconds. High-speed networks allow data transfer in near real time.
The best time is: NOW! More and more project managers are looking for an instant data flow to obtain timely insights. This calls for data streaming, which delivers small increments of data continuously rather than running one massive bulk upload once a day. A streaming pipeline continuously monitors for new data and dispatches processing as soon as the data arrives.
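The contrast between a daily bulk upload and streaming can be sketched in a few lines. This is a minimal illustration, not Zeppelin’s actual pipeline: the event fields and the `stream`/`process` helpers are hypothetical stand-ins for a real message broker and handler.

```python
def process(event):
    """Handle a single increment of data as soon as it is available."""
    return f"{event['sensor']}={event['value']}"

def stream(events):
    """Mock stream source: yields events one at a time, as a broker would deliver them."""
    yield from events

# Two hypothetical sensor readings arriving on the stream.
incoming = [{"sensor": "fuel_rate", "value": 12.1},
            {"sensor": "tank_level", "value": 87.5}]

# Streaming: dispatch processing per event, rather than collecting a full
# day's worth of data and processing it in one nightly bulk job.
results = [process(e) for e in stream(incoming)]
```

The key point is that `process` runs per event; with a daily batch job, the first reading would sit unused for up to 24 hours before anyone could act on it.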
Event processing with Zeppelin’s data platform
Our interdisciplinary project, a cooperation between Zeppelin’s venture Z Lab and Zeppelin’s Strategic Business Unit Power Systems, is a great example of how we benefit from these changes in processing time and data storage. But how does it work? Sensors installed on a vessel collect data from the surrounding environment; measure points include the vessel’s GPS position, the water tank level, and the fuel rate. One vessel produces more than 1,000 data points within 60 seconds. To give the vessel’s data analyst a full picture of the status quo, the data is delivered within 30 seconds of being generated.
Technology has changed over the years, but the basic needs of our customers have stayed the same: making decisions based on facts. Today, Zeppelin can provide customers with fast, high-level insights and multiple backups to avoid data loss.
With the accelerating pace of technology, one thing is certain: digital solutions will be a constant component of every industry. At Zeppelin, our Strategic Management Center Zeppelin Digit provides data architecture and data processing chains in a modern, software-development-driven way to connect data between systems, visualize the past, and predict the future for our customers.
2019: FOC – the way towards predictive maintenance