This repo presents two early examples of standards-based digital twins. The first one tracks the motion of a toy boat with a web camera, estimating its position in 3 degrees of freedom (DOF): heave, surge and pitch. The second one monitors and controls an experiment with a model PSV in a wave basin, covering 6 DOF (surge, sway, heave, roll, pitch, yaw), dynamic positioning (azimuth angles, propeller rotations) and wave characteristics. In the future, we expect this standards-based approach to support the development of digital twins for complex maritime systems while ensuring compatibility with a wide range of other systems.

Data content

You will notice that the data from both examples follows a similar organization: it is stored in JSON packages containing descriptive metadata, content and links to external files when necessary. At this stage the files are stored in a local folder, but in future examples a database might be used. The content of a digital twin can be organized in four groups: digital representation of the asset, measured states from the asset, measured states from its operating environment and the simulations or analyses making use of such data. These categories are listed in the Schema.json files inside each folder. The schemas group the packages according to their domain, following a flat hierarchy. Each data package contains a few identification tags which can be used to map it to different taxonomies, e.g., functional, physical or spatial. This modular approach aims to enable gradual development of digital twins with flexibility for customization of taxonomies.
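
As an illustration, a parsed data package could look roughly like the Python dictionary below; the keys are placeholders rather than the exact schema used in the repository.

# Illustrative data package (keys are placeholders, not the repo's exact schema).
package = {
    "metadata": {
        "name": "Hull geometry",
        "description": "Tessellated hull model used for visualization",
        "tags": {"group": "Asset representation", "function": "Hull"},
    },
    "content": {"designDraft": 0.05, "unit": "m"},   # small values stored inline
    "links": ["models/hull.glb"],                    # larger content kept in external files
}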

The main taxonomy adopted in the examples is the Vessel Information System (VIS), provided by DNV. VIS covers both asset representation and states. It follows an operational view and organizes the data primarily according to a functional hierarchy. The tags contained in the data packages allow them to be mapped to the exact functional leaf inside VIS, and the final hierarchy linking all the packages is stored in a JSON file. On the other hand, simulation domains and environment data are out of scope for VIS, so they need to be covered by complementary taxonomies. Thus, the wave data and the web simulation are identified with the labels Wave and Motion response, respectively. These categorizations might be further expanded in future examples.

Asset representation

The asset representation comprises metadata providing textual description about the ship and 3D models for visualization.

Ship specification

The specifications were written as sketches of how general ship data could be stored according to the standards used here. They contain the IMO number, main dimensions and other design parameters. When creating the visualizations, such parameters are used to position the 3D models according to the design draft and to apply the measured motions about the center of gravity.
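
For instance, a specification could carry parameters along the following lines; the values and keys below are placeholders, not the actual data of the examples.

# Placeholder ship specification; only the kinds of fields are meant to be representative.
specification = {
    "imo_number": "0000000",
    "length_pp": 1.6,                       # main dimensions, model scale [m]
    "breadth": 0.4,
    "design_draft": 0.08,
    "center_of_gravity": [0.8, 0.0, 0.06],
}

# The visualization sinks the hull to the design draft and applies the
# measured motions about the center of gravity.
hull_offset_z = -specification["design_draft"]
motion_origin = specification["center_of_gravity"]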

3D models

Each ship component or relevant geometry is represented by a JSON package with metadata and the corresponding 3D model. The metadata contains the identification tags, coordinate positions and links to the 3D files. The examples store 3D models in two different formats: STL and glTF (GL Transmission Format). While STL supports only the geometry in tessellated form, glTF supports entire scenes with assemblies, colors, materials and lights. Both formats can be stored as binary or text. The glTF format uses two extensions: .gltf for text, which follows a JSON schema, and .glb for binary. We must note that the glTF format takes the upwards direction of the 3D model as aligned with the y axis, i.e., y up. This is inconsistent with the dominant convention in naval architecture, z up, but can be adjusted by applying a rotation to the 3D model once it is imported into the visualization.
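
As an illustration of that adjustment, a +90° rotation about the x axis maps the glTF up axis (+y) onto +z; the snippet below applies it to raw vertex coordinates and is not taken from the repository code.

import numpy as np

# +90 deg rotation about x: glTF y-up coordinates -> naval-architecture z-up coordinates.
ROT_X_90 = np.array([
    [1, 0, 0],
    [0, 0, -1],
    [0, 1, 0],
])

def gltf_to_zup(points):
    """Convert an (N, 3) array of y-up points to z-up points."""
    return points @ ROT_X_90.T

# Example: the glTF up vector (0, 1, 0) becomes (0, 0, 1).
print(gltf_to_zup(np.array([[0.0, 1.0, 0.0]])))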

Toy boat: the toy boat is represented by a 3D model of its overall hull shape.

Hull:

Model PSV: the glTF format allows inclusion of assemblies with articulations for movement. This functionality has been used to create a 3D model of the azimuth thruster with one axis centered on the casing bearing and a second on the propeller's center. This allows animation of both propeller and azimuth rotations with a single GLB file.

Hull:

Azimuth thrusters:

Tunnel thruster:

Sensor states

The examples used ISO 19848: Ships and marine technology — Standard data for shipboard machinery and equipment to model the data collected by the various sensors in the experiments. The standard defines two types of packages: data channel list and time series data. The data channel list contains the metadata about the sensor setup. It describes the meaning and physical units of the variables being measured, the expected range of the values and their identification tags, among other properties. A time series data package stores a collection of readings in one of two forms: tabular data for sensor readings taken at regular intervals and event data for irregular records such as alarms and setpoint commands.
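
A tabular time series package in the spirit of ISO 19848 could be sketched roughly as follows; the structure is simplified and the keys are not copied verbatim from the standard.

# Simplified sketch of a tabular time series package; keys and values are illustrative only.
time_series_package = {
    "Header": {"ShipID": "ToyBoat", "DateCreated": "2022-09-17"},
    "TabularData": {
        "DataChannelID": ["Surge", "Heave", "Pitch"],   # defined in the data channel list
        "DataSet": [
            {"TimeStamp": "2022-09-17T10:00:00.00Z", "Value": [0.12, -0.03, 1.8]},
            {"TimeStamp": "2022-09-17T10:00:00.04Z", "Value": [0.10, -0.01, 1.5]},
        ],
    },
}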

Aquarium: the example collects the boat positions in 3 DOF: surge, heave and pitch. The ISO 19848 text specifies that regular measuring intervals are expected for consecutive sensor readings in tabular data. This is not the case in this example, since the tracking algorithm converges at different intervals throughout the video, depending on the number of iterations it requires to identify the boat in a given frame. In any case, the standard template was able to accommodate the variable time step without any modifications. A remark indicating this was added to the demo log itself.

Motion tracking with positions from the demo video:

Wave basin: the wave basin measures the PSV position in 6 DOF, the water elevation, the propulsion RPM, azimuth angles and the target setpoint of the dynamic positioning system. In order to reduce storage and streaming bandwidth, the wave characteristics were calculated and saved instead of the entire water elevation log. Furthermore, while the RPM and azimuth data are functional in the streaming application, they were not saved during the experiments, so the corresponding log is unfortunately not available.
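
One common way to reduce an elevation record to wave characteristics is a zero up-crossing analysis; the sketch below estimates a mean period and mean height and is not necessarily the method used in the experiments.

import numpy as np

def wave_characteristics(t, elevation):
    """Estimate mean zero up-crossing period and mean wave height from arrays t and elevation."""
    eta = elevation - np.mean(elevation)                  # remove the mean water level
    up = np.where((eta[:-1] < 0) & (eta[1:] >= 0))[0]     # indices of zero up-crossings
    if len(up) < 2:
        return None, None                                 # no complete wave in the record
    period = float(np.mean(np.diff(t[up])))               # mean time between up-crossings
    heights = [eta[a:b].max() - eta[a:b].min() for a, b in zip(up[:-1], up[1:])]
    return period, float(np.mean(heights))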

Motion tracking and DP setpoint from the demo video:

Wave characteristics:

Analysis results

The wave basin applications use the results of a motion response analysis performed with boundary element method (BEM) software. The results list the wave period, heading, amplitude response and phase response. In order to ease the understanding and use of such results, they have been converted to a JSON file containing the same operators organized first by period and then by heading. While both files list the same results, the latter version is easier to interpret and manipulate for both humans and web-based systems.
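
The conversion amounts to grouping the flat result rows first by period and then by heading, roughly as sketched below with placeholder values and field names.

import json
from collections import defaultdict

# Flat BEM output rows: (period [s], heading [deg], amplitude response, phase response).
rows = [
    (8.0, 0.0, 0.95, -12.0),
    (8.0, 45.0, 0.88, -9.5),
    (10.0, 0.0, 1.02, -20.1),
]

# Nest the same operators first by period and then by heading.
nested = defaultdict(dict)
for period, heading, amplitude, phase in rows:
    nested[period][heading] = {"amplitude": amplitude, "phase": phase}

print(json.dumps(nested, indent=2))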

Web applications

The web applications provide a dashboard for monitoring and control of the digital twin based on the data collected during the experiments. The dashboard displays a 3D visualization showing the current state of the experiment (ship position, waves, propulsion system) created with Three.js. It also provides a few charts created with ZingChart showing the positions as time series.

Aquarium (monitoring dashboard): the application receives a video showing the boat and extracts its positions in 3 DOF with the Camshift computer vision algorithm (a minimal tracking sketch follows the list below). The app also shows warnings in case any of the positions exceeds the threshold ranges specified in the motion tracking metadata. Three different modes are available in the application:

  • Demo: plays a video excerpt recorded from the aquarium setup and shows the corresponding boat positions stored in a demo log. The purpose of this mode is to illustrate the app functionality. It does not execute the Camshift algorithm, as the positions have already been stored in a JSON log.
  • Capture from webcam: processes the video captured by a camera plugged into the computer.
  • Connect to stream: fetches the video from a remote server connected to the aquarium setup. This functionality currently works only locally on the Eduroam network at NTNU in Ålesund.
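
For reference, the core of a Camshift tracking loop is sketched below in Python/OpenCV; the dashboard itself runs the algorithm in the browser, and the video source and initial window are placeholders.

import cv2

cap = cv2.VideoCapture("aquarium_demo.mp4")          # placeholder video source
ok, frame = cap.read()
x, y, w, h = 300, 200, 80, 40                        # assumed initial window around the boat
roi = frame[y:y + h, x:x + w]

# Hue histogram of the boat region, used for back-projection on later frames.
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
window = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    # Camshift returns a rotated box; its center and angle give surge, heave and
    # pitch after calibration from pixels to model coordinates.
    rot_box, window = cv2.CamShift(back_proj, window, criteria)
    (cx, cy), _, angle = rot_box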

Wave basin (monitoring and control dashboard): the interface is used to monitor and control the model PSV during an experiment. It shows a 3D visualization with ship motions and propulsion system parameters in real time and also allows the user to define a setpoint for station keeping with the DP system. This interface proved very useful for debugging and testing the experimental setup. For example, it helped to identify when the model was drifting towards an undesired position outside the field of view of the motion tracking system, which would break the DP feedback loop and risk loss of control. Such situations could then be quickly addressed from the application itself.

During experiments, the digital twin app can be used to perform some basic optimization of motion response. The user selects a decoupled motion mode to minimize, and the application maneuvers the ship towards the heading which minimizes that motion mode based on the stored BEM results. In the future, the optimization could be improved to minimize the coupled motions at a given position on the ship, for example, the point where a crane is being operated.
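
A sketch of that heading selection, assuming the stored operators can be indexed by motion mode, wave period and heading (the data below are placeholders):

# Placeholder RAOs: rao[mode][period][heading] = amplitude response.
rao = {
    "heave": {8.0: {0.0: 1.10, 45.0: 0.95, 90.0: 0.60}},
    "roll":  {8.0: {0.0: 0.20, 45.0: 1.40, 90.0: 2.10}},
}

def best_heading(mode, wave_period):
    """Heading (deg) minimizing the amplitude response of one motion mode."""
    by_heading = rao[mode][wave_period]
    return min(by_heading, key=by_heading.get)

# E.g. with the placeholder data, heave is minimized in beam seas.
print(best_heading("heave", 8.0))   # -> 90.0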

Wave basin (experiment replay): shows a demo with a video recording of an experiment, the corresponding data and some validation plots. The page briefly introduces the experimental setup and simplifications.

Appendix: streaming configuration

Aquarium: the application supports streaming of the aquarium video, so the monitoring can be accessed from different machines. The VLC media player was configured to capture and stream the video according to the instructions on this page. However, by default web browsers do not allow a client to process the streamed video with the computer vision algorithm if it comes from a source outside the localhost (see Cross-Origin Resource Sharing, or CORS). We used NGINX to set up a reverse proxy which adds the CORS headers, allowing the video to be processed by the monitoring application. You can access the NGINX configuration file here.

It is possible to set up a Raspberry Pi as a streaming server. First, connect it to NTNU's Eduroam by following this tutorial. In short, edit /etc/wpa_supplicant/wpa_supplicant.conf and append the following lines:

network={
  identity="username@ntnu.no"
  password="password"
  eap=PEAP
  phase1="peaplabel=0"
  phase2="auth=MSCHAPV2"
  priority=999
  disabled=0
  ssid="eduroam"
  scan_ssid=0
  mode=0
  auth_alg=OPEN
  proto=RSN
  pairwise=CCMP
  key_mgmt=WPA-EAP
  proactive_key_caching=1
}

You might also configure the computer to start streaming the webcam automatically at boot. First, enable NGINX as a systemd service so that it starts automatically. Then, create a new service which starts streaming the webcam through VLC once the network connection is online, and enable it for automatic start as well.

Wave basin: a Python bridge was developed to establish the communication between the measuring systems and the digital twin interface. Communication with the measuring systems takes place over UDP, while communication with the web interface uses WebSockets. The bridge receives the DP and water elevation measurements in different threads and retransmits them to the digital twin interface. When receiving the water elevation signal, it calculates the wave characteristics and sends to the client only the period, height and phase, thus reducing the necessary communication bandwidth. The DP setpoint defined by the user is also handled by the bridge, which receives it from the web interface and forwards it to the DP control system.
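
A minimal skeleton of such a bridge, assuming asyncio together with the third-party websockets package (recent one-argument handler style); the skeleton uses asyncio tasks instead of threads for brevity, and the ports and message formats are placeholders rather than the actual protocol of the experiment.

import asyncio
import json

import websockets

clients = set()

class UdpListener(asyncio.DatagramProtocol):
    def datagram_received(self, data, addr):
        # Forward each UDP measurement to every connected dashboard client.
        message = data.decode()
        for ws in list(clients):
            asyncio.create_task(ws.send(message))

async def handle_client(ws):
    clients.add(ws)
    try:
        async for msg in ws:
            # A DP setpoint from the web interface would be relayed to the
            # control system here, e.g. over another UDP socket.
            print("setpoint received:", json.loads(msg))
    finally:
        clients.discard(ws)

async def main():
    loop = asyncio.get_running_loop()
    await loop.create_datagram_endpoint(UdpListener, local_addr=("0.0.0.0", 9000))
    async with websockets.serve(handle_client, "0.0.0.0", 8765):
        await asyncio.Future()   # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())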

Academic publications

  1. Link to final version. Reference:
@Article{Fonseca2022,
  author    = {{\'{I}}caro Arag{\~{a}}o Fonseca and Henrique Murilo Gaspar and Pedro Cardozo de Mello and Humberto Akira Uehara Sasaki},
  journal   = {Computer-Aided Design},
  title     = {A Standards-Based Digital Twin of an Experiment with a Scale Model Ship},
  year      = {2022},
  month     = {apr},
  pages     = {103191},
  volume    = {145},
  doi       = {10.1016/j.cad.2021.103191},
  publisher = {Elsevier {BV}},
}
  2. Link to final version. Reference:
@InProceedings{Fonseca2020,
  author    = {Icaro A. Fonseca and Henrique M. Gaspar},
  booktitle = {{ECMS} 2020 Proceedings edited by Mike Steglich, Christian Mueller, Gaby Neumann, Mathias Walther},
  title     = {Fundamentals Of Digital Twins Applied To A Plastic Toy Boat And A Ship Scale Model},
  year      = {2020},
  month     = {jun},
  publisher = {{ECMS}},
  doi       = {10.7148/2020-0207},
}

References

Standards

Motion tracking

Visualizations