Instead of just leaving you there with this platitude of data being the “digital screws” of the future, we at INFORM have decided to put together a three-part series on effective data strategy. It will have a good deal of meat to it that will assist readers in actually understanding the foundation of data strategy and enable them to start down the road to establishing one. In our series, we will give you an introduction to what cutting-edge technology enables us to do, what the implications for maritime operations are, and, most importantly, give you an idea of how to access these benefits. In Part 1, we kick it off by having a look at data management.
Data Management is a term that can be interpreted in various ways. It is used to cover technical topics such as data sources, storage, connectivity, transfer, transformation, and modeling, as well as less technical ones like governance, security, and cataloging. So, what is it? Well, all of the above. Data management describes the collection, refinement, and provisioning of data. It includes everything that has to happen between the creation of a data point and that data being made available in an appropriate form for consumption by data analytics, data science, artificial intelligence, operations research, and other advanced computing practices.
Spoiler alert: Speed is what is important here.
One key component of data management is data integration. The value of datasets is vastly increased if they are enriched with context, especially across system boundaries. If we can, for example, integrate information about inbound shipments with metadata from the port of origin, shipping line, container master data, handling equipment parameters, etc., we can start to form the much-discussed digital twin that is quickly becoming a focal point of many ports and terminals around the world.
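To make this enrichment idea concrete, here is a minimal sketch of joining shipment records with container master data from a separate system. All dataset and column names (`container_id`, `port_of_origin`, `size_ft`, `reefer`) are illustrative assumptions, not part of any actual INFORM product or schema:

```python
import pandas as pd

# Hypothetical inbound shipment records from a terminal operating system
shipments = pd.DataFrame({
    "container_id": ["MSCU1234567", "TGHU7654321"],
    "port_of_origin": ["CNSHA", "SGSIN"],
    "eta": ["2024-05-01", "2024-05-03"],
})

# Hypothetical container master data maintained in a different system
containers = pd.DataFrame({
    "container_id": ["MSCU1234567", "TGHU7654321"],
    "size_ft": [40, 20],
    "reefer": [True, False],
})

# Enrich shipments with container attributes across the system boundary
enriched = shipments.merge(containers, on="container_id", how="left")
print(enriched[["container_id", "port_of_origin", "size_ft", "reefer"]])
```

Each joined record now carries context from both systems, which is exactly the kind of linked view a digital twin builds on.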
The availability of associated information about every part of port and terminal operations would dramatically increase transparency and control, and provide the best foundation for making insight-based, split-second decisions. Speed is crucial. Insights gained in retrospect can be helpful in refining processes, but they do not help you identify problems in real time and certainly do not give you data-based options for resolving them.
Modern Tools Make Data Integration Straightforward
Up until recently, the entry cost and effort required to implement solutions for Extract, Transform, and Load (ETL), as well as for storage (data warehouses in particular), were a significant deterrent to making data integration part of your data strategy. Add to this the lack of qualified personnel, and the challenges typically outweighed the ROI potential.
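The three ETL stages named above can be sketched in a few lines. This is an illustrative toy only, using Python's standard library in place of a real ETL tool; the field names and the in-memory CSV input are invented for the example:

```python
import csv
import io
import sqlite3

# Extract: read raw records (an in-memory CSV stands in for a source system)
raw = io.StringIO("container_id,weight_kg\nMSCU1234567,18000\nTGHU7654321,9500\n")
rows = list(csv.DictReader(raw))

# Transform: normalize types and derive a field (kilograms to metric tons)
for r in rows:
    r["weight_t"] = int(r["weight_kg"]) / 1000

# Load: write into a lightweight warehouse-style table and query it
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE shipments (container_id TEXT, weight_t REAL)")
db.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [(r["container_id"], r["weight_t"]) for r in rows],
)
total = db.execute("SELECT SUM(weight_t) FROM shipments").fetchone()[0]
print(total)  # 27.5
```

Production ETL adds scheduling, error handling, and scale on top of this pattern, which is where the historical cost and effort came from.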
However, since 2012, many things have changed. “Big Data” has simply become “data.” No one really bats an eye at millions of records anymore. Machine learning is ubiquitous in our everyday lives, be it in navigation, shopping, meal recommendations, or smart assistants. Computing power and storage have been made highly accessible and extremely affordable through the spread of cloud-based computing business and service models. Previously, the scope of data-driven projects was limited by the horsepower available in one’s on-premises servers. Nowadays, fully scalable resources are available through cloud providers like Amazon, Google, and Microsoft, to name just a few of the prominent players.