Creating value from ship data

Dr Serena Lim presents at VPO Global’s forum in London

Choosing performance monitoring tools that enable valuable data to be created at each stage of a vessel’s performance evaluation process is challenging, but fundamental to improving fleet performance, says Dr Serena Lim, chief scientific officer, Ascenz Solutions

There are numerous digital technologies and monitoring tools that are capable of generating vast quantities of data to help shipping companies maximise their competitive advantage when it comes to ship performance. Creating true value from this data at each stage depends on the type of data management platform in place, an understanding of the boundaries of reliable and less reliable data, and a sufficient data validation process.

Managing data and the importance of investing in a robust platform

The first step is to determine what systems you want to invest in, which depends on what you want to see from the data and whether you have the right skills to analyse it, Dr Lim told delegates at VPO Global’s forum held in London last month.

This includes investing in a robust data management platform. “If you don’t have a platform, you don’t have data, and you can’t create value from that data.” She notes the importance of establishing early on what kind of digital infrastructure a company wants to invest in, to help decide what kind of data, and how much, is required.

It is also fundamental for a company to determine how often it wants its data to be collected and fed back for analysis. “If you speak to a computer scientist, they will say monitor as many points as you can, but if you speak to someone using noon reports they may say just noon reports and one other is enough. You need to really understand what kind of value you want to create out of the data that you have.”

Following this, Dr Lim believes that a company should start to establish how much data it can really deal with. “If you’re collecting data at every second, you have about 12.5 million data points in a month – but are you going to process all that data? How do you decide if you are collecting in 10-second intervals or one-minute intervals?”

She confirms that it all depends on what kind of value you are trying to create. “If I’m trying to predict my engine maintenance, I will need quite a small frequency, every second. And if I’m looking at emissions, noon data is probably enough.”
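As a rough illustration of the trade-off Dr Lim describes, the sketch below (an assumption for illustration, not from her presentation) computes how many points a single monitored signal produces over a 30-day month at different sampling intervals. Note that per-second sampling of one channel yields about 2.6 million points a month, so a figure in the millions accumulates very quickly once several channels are monitored.

```python
# Illustrative sketch: how the sampling interval drives monthly data
# volume for monitored signals. Figures assume a 30-day month.

SECONDS_PER_MONTH = 60 * 60 * 24 * 30  # 2,592,000 seconds

def points_per_month(interval_seconds: float, channels: int = 1) -> int:
    """Data points collected per month at a given sampling interval."""
    return int(SECONDS_PER_MONTH / interval_seconds) * channels

for interval in (1, 10, 60):
    print(f"{interval:>3}s interval: "
          f"{points_per_month(interval):,} points/month per channel")
```

Running this shows the scale of the decision: moving from one-second to one-minute sampling cuts the volume per channel from about 2.6 million points to about 43,000.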

Understanding data sources

Data quality is of absolute importance, especially if only one or two points are being collected a day. Part of ensuring good data quality is making sure the source is solid and reliable. Dr Lim points out: “A lot of us download data from the cloud and look at the numbers, but where are the data coming from?” Most of us don’t tend to question it.

In one example, Dr Lim explains how sensors can be used to determine speed-over-water and speed-over-ground data. There are different types of sensors that can provide this information, but, “Which of the sensors is the correct option to choose, or how much you can believe in the quality of a sensor, is something you should ask yourselves,” she says. “Each has pros and cons, and it depends on where you install a sensor. With speed-through-water sensor data, at lower speeds it might not be so accurate, and at some speeds it is more accurate – this is because of the physical boundaries around it.”
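The point about physical boundaries can be made concrete with a small sketch. The thresholds and function below are illustrative assumptions, not values from Ascenz: the idea is simply that a speed-through-water reading is only trusted inside the band where the sensor is known to be accurate, and is flagged as suspect outside it.

```python
# Hypothetical sketch of boundary-aware sensor handling: a reading is
# classified by whether it falls inside the band in which this sensor
# type is known to be reliable. Thresholds are illustrative only.

def flag_stw_reading(speed_through_water_kn: float,
                     min_reliable_kn: float = 4.0,
                     max_reliable_kn: float = 25.0) -> str:
    """Classify a speed-through-water reading by its reliability band."""
    if speed_through_water_kn < min_reliable_kn:
        return "suspect: below reliable low-speed boundary"
    if speed_through_water_kn > max_reliable_kn:
        return "suspect: above reliable high-speed boundary"
    return "reliable"

for reading in (2.1, 12.5, 27.0):
    print(reading, "->", flag_stw_reading(reading))
```

A downstream analysis can then weight or exclude the suspect readings rather than treating every number from the cloud as equally trustworthy.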

Companies that have a basic understanding of where and how the data have been generated are more likely to acknowledge that not all data are reliable. “Data directly impacts analysis, so we need to know how reliable it is. The dataset does not have to be perfect, but we need to know the boundary and extent of what we can believe.” One example she gives is weather data, which is based on an external database. “Weather providers give us different spectrums, nice overlays for graphs for instance, but what is the actual value behind it?” Dr Lim believes that understanding where the data have come from, and at what point they should be questioned, is vital to obtaining a true analysis.

Then comes the conversation about whether more data or better data is needed. If greater insight is required, one has to be prepared to invest more in sensors to get it. “If you are looking at energy management, you need to invest money into looking at the energy. How can you do your energy management if you are not willing to invest money into assessing the energy?”

Dr Lim says that it is important to understand where your data have come from and how reliable the source is. Figure courtesy of Serena Lim, Ascenz

Data validation

A core element of delivering high-quality, valuable data is checking the data, and then checking them again. Dr Lim says that when data are passed on to further parties, their quality is often questioned. “Accuracy will vary and you need to know that quality checks are being done to ensure what data you have is accurate,” she explains.

Data also need to be validated, and an understanding of the physical boundary conditions should be sought. This is about more than just checking the data coming in and going out; it includes understanding the physical set-up, which is “absolutely important to pass the data on to the next party.” Dr Lim explains that each step of validation is paramount, and while it may be tedious because it needs to be customised to each vessel, it is a vital step to make sure the next party can see exactly where and how the data have been validated, as they will not be able to do this themselves.
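One way to picture this kind of per-vessel validation is a pipeline that not only applies checks but records which checks each record passed, so the next party can see exactly how the data were validated. The sketch below is an illustrative assumption, not the Ascenz pipeline; the field names, limits, and checks are hypothetical.

```python
# Illustrative sketch of validation with provenance: each record carries
# a list of the checks it passed or failed, customised to one vessel's
# physical limits. All names and limits here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ValidatedRecord:
    value: float
    checks_passed: list = field(default_factory=list)
    checks_failed: list = field(default_factory=list)

def validate(value, bounds, prev_value=None, max_step=None):
    """Apply range and rate-of-change checks against vessel-specific limits."""
    rec = ValidatedRecord(value)
    lo, hi = bounds
    (rec.checks_passed if lo <= value <= hi else rec.checks_failed).append("range")
    if prev_value is not None and max_step is not None:
        ok = abs(value - prev_value) <= max_step
        (rec.checks_passed if ok else rec.checks_failed).append("rate_of_change")
    return rec

# Example: a main-engine power reading in kW, checked against one
# vessel's customised range and maximum step between readings.
rec = validate(9500.0, bounds=(0.0, 12000.0), prev_value=9300.0, max_step=500.0)
print(rec.checks_passed, rec.checks_failed)
```

Because the validation trail travels with the data, a recipient who cannot re-run the checks can still see where and how each value was validated.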

This also helps when putting data into context. Dr Lim explains that data are just data and can look like a lot of numbers without much context, but if they are broken down to see what they actually represent in the wider context, it becomes clear how much time or energy is spent on each task. In doing so, more value is created from the data.

Validating the data for quality is also central to integrating them with data from the port and databases from the market. “This allows proper sense to come out of it. Otherwise, garbage in = garbage out.”

Different routes to creating true value from ship data according to Serena Lim and Ascenz

Dr Lim believes that the current bottleneck is how the industry can make proper and effective use of data that contain too much information. She believes the most important aspects are ensuring that systems are carefully developed and optimised for ship management, and that finding suitable partnerships is a must. “If you stand alone then it can be difficult to provide a holistic approach, but by finding reliable, active partners, then it is possible.”

Furthermore, it’s about taking one step at a time. “We are not trying to solve everything but starting with one thing,” she concludes.

Dr Serena Lim spoke about creating value from ship data at Digital Ship’s Vessel Performance Optimisation forum held in London in June.