As shipowners and operators begin to embrace the use of operational data to identify areas for optimization, building a comprehensive network infrastructure is the next challenge to overcome. This opinion piece describes NYK Line's approach to big data.
When fuel prices were on a steep rise in 2005 and 2006, NYK Line began looking for new ways to lower bunker costs. Various technical solutions were considered, and fuel consumption was analyzed in the context of vessel speed, location, sea region and other factors – weather conditions in particular turned out to have a major influence on ship performance.
Over time, NYK Line built a database of operational information and discovered various areas where we needed to change our operating patterns. For example, we found that the NYK operations center in Singapore needed a better communication link with the ships to interact with the ship masters more effectively. We needed the ability to present the same data to fleet management and the captains simultaneously and improve the dialogue between ship and shore.
A dedicated research institute
In 2004, NYK Line established its Monohakobi Technology Institute, Inc. (MTI) as a strategic subsidiary for technology research and development. Tasked with investigating fleet operations and researching optimization potential, MTI began collecting and analyzing data from ships around 2005, starting with a single vessel to test and confirm the data collection methodology.
We then expanded our scientific work to include all vessels operating between Tokyo and the U.S. West Coast. The vessel-specific data was used to compare ships with each other so as to distinguish good and bad operating practices and identify improvement potential. The data allowed our business unit to ask specific questions about the reasons behind differences in fuel efficiency, while accounting for natural causes, such as seasonal differences.
The number one cause of deviations in fuel efficiency is bad weather. While we can’t control the weather, analyzing the ship data helped MTI develop ways of driving down fuel costs, for example through smart routing and addressing machinery issues proactively. Overall, about half a dozen different causes of increased fuel costs were identified.
Technical advances have since enabled NYK Line to refine its data collection and evaluation processes. Today every NYK Line vessel has a data collection system on board and optimizes operational efficiency and fuel consumption using the information it provides.
The next step is to combine the output of voyage data recorders, engine data loggers, the ECDIS and the ballast control system and feed it into shore-based Internet-of-Things (IoT) applications. This will enable services such as trim optimization, condition and health monitoring, diagnostics, environmental compliance, safe operation and collision prevention, as well as fleet and schedule management and service planning. With powerful data mining technology in place, the possibilities are nearly unlimited; eventually, even autonomous ship control may become a commercial reality.
A common data platform
Many companies are already collecting their own onboard data, and some engine manufacturers equip their engines with devices capable of measuring engine performance information and transmitting it to shore automatically. Smart shipping applications will soon be commonplace across the industry. What is still needed is a common ship-to-shore platform enabling all stakeholders, from cargo owners to shipyards and equipment manufacturers, to utilize non-sensitive shipping information.
The maritime community could benefit from such an open platform system in many ways, from comparative performance evaluation to technical and environmental research and development. Classification societies could use the data to improve ship and equipment design and detect new needs and issues as a basis for developing new, custom-tailored solutions for their customers.
All this would require not only a powerful computing and data warehouse environment but also significant IT and ship engineering expertise, and the big question is whom to entrust with the operation of such a data center. Providers of cloud services have offered assistance and may be up to the challenge in terms of data processing. But in general, IT providers will want to make use of the data for their own purposes, which is not in the best interest of the shipping industry.
The right partner to trust
Classification societies such as DNV GL have been handling confidential information, including drawings and accident records, for a long time and are trusted by the industry. They have both IT capabilities and domain-specific expertise, and they are neutral international organizations, something few other candidates can offer. Classification societies are therefore in a unique position to operate a common data platform.
Naturally, some of the operational data collected by shipowners and operators is highly valuable and should not be in the public domain. The transfer of data should therefore be governed by agreements ensuring strict confidentiality. Furthermore, the quality of data received from vessels is not always consistent, which is why NYK filters its data prior to analysis. Data quality and integrity must be guaranteed so that those who pay for the privilege of utilizing the data get the quality they expect.

Another concern is cybersecurity: the emerging satellite-based data transmission technology must be protected against intrusion and abuse, such as hacker attacks.
A classification society would be well equipped to address both concerns. An open data platform offers multiple benefits to all partners in the value chain. It will accelerate the transformation of the shipping industry as it learns to embrace data intelligence to streamline operations and maintenance while facilitating compliance and enhancing safety.
Yasuo Tanaka is President of the Monohakobi Technology Institute, Inc.
The opinions expressed herein are the author's and not necessarily those of The Maritime Executive.