Those who, like me, have been COMPIT regulars since the first edition in 2000 will easily spot the unmistakable signs of industry acceptance and success: big industry sponsorship, international attendance, and a selection of visionary yet practical and directly implementable papers.
At COMPIT 2016 in Lecce, the organiser’s leitmotif and vision that the marine industry should exploit and build upon the wealth of leading commercial technology commonly available today started to take form. It all began to gel: Big Data is becoming a key support for predictive algorithms and real-life operations, open source keeps its head high, and even generative design and system-agnostic software mark (faint) blips on the radar screen. In a nutshell, the exploitation of computer power in innovative ways and the sharing of Big Data were the underlying common denominators of a milestone COMPIT.
Refreshingly, in Open Source and Web Based Ship 3D Virtual Simulator, Olivia Chaves and Henrique Gaspar proposed a forward-looking combination of high-performance yet commonly available technology to build a real-time ship motion 3D virtual simulator in a web browser environment, which deservedly earned them the COMPIT 2016 DNV GL Award.
As an engineer and avid reader of Waveform, Denis Morais’ blog, and of various non-marine technology forums, I treasure out-of-the-box vision, but I also like to see research and the ensuing technology applied, which in the Big Data context at hand requires sharing the appropriate and relevant data with all the stakeholders concerned. This subject was approached by Denis’ Open Architecture Applications: The Key to Best-of-Breed Solutions, in which “Platformisation” is proposed for efficient data sharing, and by Cloud Computing for CFD Based on Novel CAE Software Containers [Gentzsch et al.], which made me think of system-agnostic applications, for example Docker technology.
Kjetil Nordby and Snorre Hjelseth also look to the future in Efficient Use of Mixed Reality in Conceptual Design of Maritime Work Places. They discuss applying human-to-human communication in virtual reality (VR) spaces to the design of ship bridges, perhaps a first step towards exploiting augmented multi-player environments in other scenarios, for example remote rescue.
While the surface of how to practically exploit Big Data was merely scratched, ideas on what to do with it abounded. Design, in its various liveries, will benefit vastly from Big Data and VR, particularly by aligning design parameters with field metrics collected during operation and using the underlying framework to optimise for future mission profiles. Improving Early OSV Design Robustness by Applying Adaptive Distributive Clustering in Ship Lifecycle Big Data exploits lifecycle data to improve the ship’s performance across several different operational areas. Coraddu et al. develop realistic operating profiles to assess different design solutions and to predict and evaluate performance decay in a statistical fashion. Applications of Network Science in Ship Design by Pawling et al. discusses the exploitation of networks, another promising piece of the computational puzzle.
Virtual reality also contributes to design, as we read in A General Arrangement Visualisation Approach to Improve Ship Design and Optimise Operator Performance by Lundh et al. The proposed linking of general arrangement drawings to work procedures and task execution in VR allows work spaces for crew operations to be improved at the design stage, generating overall cost savings.
Perhaps comparable to learning from Big Data is the ‘existing ship’ (or offshore platform) side of the same coin, seen in Virtual Reality Based Training Improves Mustering Performance by Scott MacKinnon et al. Many COMPITs ago, a researcher from NAVSEA presented a Navisworks-based application that could be asked, via a microphone, to show the way from one location to another, which it did (with practical limitations). Scott’s VR work showed how such VR-based familiarisation may save lives and make evacuation training fun.
Hull shape manipulation was discussed by Ang, Goh and Li, while wind resistance and propulsion assessment using CFD was reviewed in A Novel Way to Harness Wind Energy on Ships: How CFD Helps Foster Innovation. Wind propulsion seems to have evolved slowly, from the wing sails first presented at SMM, Hamburg, some eight years ago, to more recently proposed wing-like ships. The related COMPIT 2016 papers might just offer a glimpse of tools soon to be used to produce widely encompassing design concepts that include active and/or passive wind powering.
Contributing to both design and operations, simulation fills more and more of the room. It was interesting to see good old fuzzy logic and pre-generative design techniques side by side, against the background of the ever-present optimisation overlord. Design Optimisation using Fluid-Structure Interaction and Kinematic Analyses takes another look at FSI, applying the ANSA (BETA CAE Systems) environment to ensure the structural integrity of a free-fall lifeboat upon impact with water, which must take into account the different fall trajectories that result in different impact dynamics. Fuzzy logic is used by Fireman et al. to determine the operability of the ship from a human-performance standpoint.
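For readers new to the fuzzy-logic idea, it can be caricatured in a few lines: crisp motion measurements are mapped to overlapping membership functions, rules fire to a degree, and the result is defuzzified into an operability score. This is not Fireman et al.’s rule base; the membership functions, roll thresholds, and rule outputs below are all invented for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def operability(roll_deg):
    """Fuzzy operability of a workstation given roll amplitude (degrees).
    Invented rule base:
      roll LOW  -> operability 1.0
      roll MID  -> operability 0.5
      roll HIGH -> operability 0.0
    Defuzzified as the membership-weighted average of rule outputs."""
    mu_low = tri(roll_deg, -4.0, 0.0, 4.0)
    mu_mid = tri(roll_deg, 2.0, 6.0, 10.0)
    mu_high = tri(roll_deg, 8.0, 14.0, 20.0)
    total = mu_low + mu_mid + mu_high
    if total == 0.0:
        return 0.0  # outside the modelled range: treat as not operable
    return (mu_low * 1.0 + mu_mid * 0.5 + mu_high * 0.0) / total

print(operability(1.0))   # gentle rolling: fully operable
print(operability(12.0))  # heavy rolling: not operable
```

The appeal for human-performance work is that the smooth, overlapping categories avoid the cliff edge of a single crisp limit.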
Surrogate models (or meta-models) are a promising strategy for overcoming unavoidable computational limitations, as discussed by Prebeg et al. in Evaluation of Surrogate Models of Internal Energy Absorbed by Oil Tanker Structure During Collision. Essentially a logical representation of the physical model and of its relevant characteristics (crash behaviour in this case), meta-models have long existed in the electronics industry, e.g. for testing CPUs, boards, etc. by behavioural simulation.
The surrogate model at hand carries added complexity: there is still little experience with the relative merits of the several models available, and the relationship between the underlying constraints, the model, and its training curve must be carefully tailored, all while keeping an eye on the ultimate goal, here structural optimisation. The clear conclusion that meta-models are effective in replacing the top-heavy environment of several design objectives, hundreds of design variables, and tens of thousands of design constraints raises the question of whether generative design techniques would further alleviate the problem or prove too computationally intensive in the context at hand.
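The surrogate pattern itself is simple to sketch: sample the expensive model at a handful of design points offline, then answer optimisation queries from a cheap interpolant. The toy below uses inverse-distance weighting purely for brevity (the paper compares far more capable surrogates), and the “expensive” collision-energy function is invented.

```python
import math

def expensive_simulation(thickness_mm):
    """Stand-in for a costly FE collision run: absorbed energy as a
    function of side-shell thickness. Invented for illustration."""
    return 0.5 * thickness_mm ** 2 + math.sin(thickness_mm)

# Offline phase: run the expensive model at a few sampled designs (0.0 .. 4.0 mm).
samples = [(x / 2, expensive_simulation(x / 2)) for x in range(0, 9)]

def surrogate(x, power=2, eps=1e-9):
    """Cheap inverse-distance-weighted prediction from the stored samples."""
    num = den = 0.0
    for xi, yi in samples:
        w = 1.0 / (abs(x - xi) ** power + eps)
        num += w * yi
        den += w
    return num / den

# Online phase: inexpensive queries during optimisation.
print(surrogate(1.25), expensive_simulation(1.25))
```

Even this crude interpolant reproduces the trend between sample points; the engineering questions the paper tackles are precisely which surrogate family, which sampling plan, and how much training the real problem demands.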
Generative design also comes to mind when reading the very interesting Multi-Objective Design Study for Future U.S. Coast Guard Icebreakers, where genetic algorithms are employed. First-hand human experience is used directly as an input parameter to compensate for a relatively shallow learning curve, due to the somewhat niche nature of icebreaker technology. Another distinctive aspect of the research is the use of a combinatorial, catalogue-based engine selection algorithm. More than 170 objects were used to describe the 2.5D CAD model, subject to a number of non-negotiable constraints, with weight and cost as the driving factors.
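As a deliberately minimal sketch of the pattern the paper builds on, consider a chromosome pairing a continuous hull parameter with an index into an engine catalogue, under a hard power constraint. Everything below (the catalogue, the toy weight model, the weighted-sum scalarisation of the weight and cost objectives) is invented; the actual study is properly multi-objective and far richer.

```python
import random

random.seed(42)

# Invented engine catalogue: (power_kW, weight_t, cost_M$).
CATALOGUE = [(5000, 60, 4.0), (7500, 85, 5.5), (10000, 110, 7.0), (13000, 140, 9.0)]

REQUIRED_POWER_KW = 9000  # non-negotiable constraint (invented figure)

def fitness(ind):
    """Lower is better: weighted sum of weight and cost, with a large
    penalty when the selected engine cannot meet the required power."""
    beam_m, engine_idx = ind
    power, weight, cost = CATALOGUE[engine_idx]
    hull_weight = 30.0 * beam_m  # toy structural weight model
    penalty = 1e6 if power < REQUIRED_POWER_KW else 0.0
    return 0.5 * (hull_weight + weight) + 0.5 * (10 * cost) + penalty

def random_ind():
    return [random.uniform(10.0, 30.0), random.randrange(len(CATALOGUE))]

def evolve(generations=40, pop_size=20):
    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]  # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [a[0], b[1]]  # crossover: hull parameter from a, engine from b
            if random.random() < 0.3:  # mutation
                child[0] = min(30.0, max(10.0, child[0] + random.gauss(0.0, 1.0)))
                child[1] = random.randrange(len(CATALOGUE))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The catalogue index makes the search partly combinatorial, which is exactly where genetic algorithms earn their keep over gradient methods.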
Coming closer to everyday considerations such as cost and ROI, Learning Curve and Return of Investment in the Implementation of a CAD System in a Generic Shipbuilding Environment, by Rodrigo Fernandez, discusses an ROI formula based on data collected by SENER over the 2007-2015 period. Morais and Waldie also discuss ROI, from a data sharing and best-of-breed system architecture perspective.
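The paper’s formula is grounded in nearly a decade of SENER field data; in its generic textbook form, ROI over a horizon is simply cumulative net gain over investment, with the learning curve delaying when savings reach steady state. All figures, and the exponential ramp standing in for the learning curve, are invented for illustration and are not from the paper.

```python
import math

def yearly_saving(year, full_saving, ramp_years):
    """Savings ramp up as users climb the learning curve; the exponential
    ramp and its time constant are illustrative assumptions."""
    return full_saving * (1.0 - math.exp(-year / ramp_years))

def roi(investment, full_saving, ramp_years, horizon_years):
    """Generic ROI = (cumulative gain - investment) / investment."""
    gain = sum(yearly_saving(y, full_saving, ramp_years)
               for y in range(1, horizon_years + 1))
    return (gain - investment) / investment

# A hypothetical CAD migration costing 1.0 M that eventually saves 0.4 M/year:
for horizon in (2, 5, 8):
    print(horizon, round(roi(1.0, 0.4, 2.0, horizon), 2))
```

The learning-curve term is what makes the horizon matter: the same investment looks like a loss at two years and a solid gain at eight.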
Also dealing with data sharing and cost, MacPherson et al. present work carried out using the off-the-shelf coupled NavCad-CAESES solution to troubleshoot and remedy an endangered design project. CAESES (Friendship Systems) was used as the hull modeller and optimisation platform, while NavCad (HydroComp) contributed a novel linear wave-theory code for bare-hull resistance prediction of high-speed transom-stern craft. Also on everyday grounds, Digital Twins for Design, Testing and Verification throughout a Vessel’s Life Cycle suggests a very appealing exploitation of Big Data, and ties in with several other papers on simulation, ship and fleet operation, PLM, and cost presented at COMPIT 2016.
Human-mimicking robotics (the CADDY project), presented by Marco Bibuli et al., deserves a mention too, as the work closely relates to multi-presence VR environments and to context-sensitive Big Data collection.
Closing the loop on the underlying common denominators of this milestone COMPIT, I will note the Internet of Things presentation by Mary Etienne and Anthony Sayers, a harbinger of more Big Data to come and of the exploitation requirements it will bring.
Last but not least, justice is due to all the papers that could not be mentioned in this review, every one of them valid, relevant, and well worth reading, as well as to the several deserving abstracts that will hopefully be presented at a future COMPIT.
It is always a little sad to leave COMPIT, and the memorable closing dinner at the Masseria Melcarne made it no easier this year, but the stage that has been set for COMPIT 2017 in Cardiff will make short work of a year’s wait.