Article

A Model to Evaluate the Effectiveness of the Maritime Shipping Risk Mitigation System by Entropy-Based Capability Degradation Analysis

1 School of Maritime Economics and Management, Dalian Maritime University, Dalian 116026, China
2 Marine Engineering College, Dalian Maritime University, Dalian 116026, China
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2022, 19(15), 9338; https://0-doi-org.brum.beds.ac.uk/10.3390/ijerph19159338
Submission received: 9 July 2022 / Revised: 24 July 2022 / Accepted: 28 July 2022 / Published: 30 July 2022

Abstract: Accurate evaluation of the risk mitigation status of navigating ships is essential for guaranteeing navigational safety. This research mainly focuses on the feasibility and accuracy of evaluating the real effectiveness of a risk mitigation system for navigating ships, including addressing the problem that risk mitigation capability cannot be measured directly and determining the degradation regulation of risk mitigation capability over time. The proposed method to solve the problem is an effectiveness evaluation model based on the capability perspective, composed of a capability measurement algorithm based on entropy theory and capability degradation regulation analysis based on numerical process fitting. First, combined with the theoretical framework of a comprehensive defence system, the risk mitigation system designed for navigating ships is reconstructed based on capability building. Second, using a numerical fitting method, the degradation regulation of risk mitigation capability with time is obtained to improve the accuracy of the dynamic evaluation. Finally, referring to entropy theory, the uncertainty of capability is calculated, and then the model is constructed based on this uncertainty to realize the effectiveness evaluation from a capability perspective. The results obtained in an application test of the proposed model indicate that using the entropy of capability can realize an accurate effectiveness evaluation of a risk mitigation system for navigating ships, with a 9% improvement in accuracy, and that Weibull curve fitting is more consistent with capability degradation regulation, at a significance level of 2.5%. The proposed model provides a new path for evaluating the effectiveness of a risk mitigation system for navigating ships based on the entropy of capability, and compared with the traditional probabilistic method, the model is more realistic and accurate in actual applications.

1. Introduction

1.1. Evaluation of Ship Risk Mitigation Status

With the rapid growth of the shipping industry and continuous changes in the navigational environment, risk mitigation for navigating ships, as an important guarantee of navigation safety, has received more attention from the shipping industry and academia, especially the evaluation of the overall operation status of the ship risk mitigation system (SRMS) in the navigational state. Limited by sailing scenario variability and the time sequence degradation of risk mitigation capability, traditional methods, including qualitative description and equipment function indicator measurement, cannot accurately represent the real level of navigational safety [1]. Additionally, the risk mitigation status shows certain degradation characteristics (performance, adaptability, reliability) over time. Therefore, it is necessary to analyse and consider this problem in an evaluation of overall effectiveness.
As ships are the main means of water transport in international trade, their navigational safety has always been the focus of the shipping industry. In recent years, ship risk mitigation has gradually faced new challenges: on the one hand, ships are developing towards larger scale, higher speed, and greater intelligence, and the difficulty of ship risk mitigation has further increased; on the other hand, with the increase in traffic density for inland and coastal ships, the difficulty of identifying ship navigational risk has further increased. Therefore, how to accurately evaluate risk mitigation status has become a core focus of ensuring ship navigation safety.
Similarly, with the development of navigation technology, risk mitigation techniques are being gradually interconnected to jointly complete risk mitigation action, showing some systematic characteristics. The interactions between these techniques and the resulting system uncertainty have gradually highlighted the systemic safety issues related to ship navigation. Therefore, in the evaluation process of ship risk mitigation status in the new era, the impact and role of ship technology participation need to be fully considered to solve systemic security problems on the basis of considering traditional human defence and physical defence.
Based on the above analysis, it is found that the current SRMS is a complex multidimensional system composed of human–ship interactions, ship–shore interactions, and ship–ship interactions, including crews, equipment, technology, management, and other elements. These elements are interrelated to form a comprehensive risk mitigation capability. Generated by the systematic development of ship risk mitigation measures, a comprehensive risk mitigation capability not only reduces the high consumption of traditional multipoint decentralized risk mitigation but also produces overall system effectiveness. Therefore, ensuring the comprehensive capability of SRMSs is particularly important for marine navigation safety. However, the application of various information technologies will inevitably result in new technological risks and operational uncertainty for navigating activities. For instance, the risk of network hijacking may be caused by the introduction of data transmission technology. Therefore, how to correctly identify the impact of these uncertainties on system capabilities and accurately evaluate the effectiveness of SRMSs has become a key issue to ensure the navigational safety of ships.

1.2. Related Work

With widespread concern about maritime safety around the world, a large number of evaluative studies on the status of ship risk mitigation have emerged in recent years. Since human factors have always been an important cause of ship accidents, most of the current research in the ship risk mitigation domain has focused on the assessment and analysis of risk mitigation status with respect to avoiding human error [2]. In practical applications, a quantitative method based on the analytic hierarchy process has been mainly used to analyse the failure indicators of ship personnel risk mitigation status [3,4]. This method has been successfully applied to human accident analysis, crew capability building, and crew comprehensive quality evaluation [5,6]. Additionally, in recent years, with the continuous development of science and technology, risk mitigation auxiliary technologies such as video surveillance, risk perception, and information communication in ship risk mitigation have also been widely applied [7,8,9]. The introduction of new technologies has improved the level of risk mitigation, but at the same time, it has also brought new security risks. Current research on the failure of new technologies and the risks associated with uncertainties has mainly used fuzzy comprehensive evaluations and Bayesian networks to evaluate and analyse the relevant performance indicators of SRMSs. Among them, the fuzzy comprehensive evaluation method has been successfully used for the safety assessment of ship security video monitoring, data interconnection, and packet loss [10,11,12]. However, the fuzzy comprehensive evaluation method is greatly affected by the subjectivity of the scorer and lacks a certain degree of objectivity in SRMS analysis. Similarly, the Bayesian network analysis method has also been successfully applied in the fields of risk mitigation strategy evaluation, risk mitigation system optimization, and risk mitigation information fusion [13,14,15]. However, this method relies on the regression analysis of past experience while evaluating the effectiveness of an SRMS and lacks applicability for the analysis of the relationship between discontinuous complex systems. In recent years, with the continuous enrichment of risk mitigation methods, the overall evaluation of SRMSs has gradually received attention in the navigation safety domain. Some scholars have carried out research on evaluating SRMSs from multiple perspectives and a full-chain perspective, focusing on analysing improvements to risk mitigation methods after an accident and evaluations of the cohesion between the risk mitigation links [16,17]. This “patch-style” optimization method has promoted the improvement of ship risk mitigation capability and the filling of the risk mitigation vacuum. However, a navigating ship is faced with complex and changeable external environments and various uncertain challenges all the time, so the “patch” can never be completed, and this method cannot fundamentally solve the problem. Therefore, a capability construction method for building a “universal” system for ship risk mitigation is more effective than a traditional “patch” scattered risk mitigation method. In a recent research report, it was proposed to analyse the relationship between the composition of the system’s capabilities and the performance of the system [18]. On this basis, the capability measurement method was used to calculate the effectiveness of a complex system [19].
However, this method equates effectiveness with capability and does not fully consider the time-series system capability change attributes and navigation scenario change attributes of SRMSs, which affects the feasibility and accuracy of the evaluation.
Based on previous research, this paper proposes a new effectiveness evaluation methodology for SRMSs, including entropy-based capability calculation and capability degradation analysis. The methodology considers a system’s scenario variability and time sequence degradation. A quasi-normal distribution of statistical parameters is combined with a Weibull distribution based on degradation analysis theory to achieve an accurate effectiveness evaluation. An improved calculation method for capability uncertainty, constructed from entropy theory, is used to address the problem that capability cannot be measured directly. The contributions of this study are summarized as follows.
(1)
An entropy-based model is explored to investigate the effectiveness of the risk mitigation system when defending against various risks, the results of which can greatly benefit safety management.
(2)
A complete solution for the effectiveness assessment model of the ship risk mitigation system is established, which can be used to improve the risk management of maritime transportation.
(3)
The performance of the proposed model is verified by a case study, which indicates the great application potential in the field of risk management.

1.3. Organization

The remainder of this paper is organized as follows. Section 2 presents an overview of the research methodology and the employed techniques. Section 3 presents the model establishment process. Section 4 verifies the proposed model with an actual application case. Section 5 concludes the study and presents future directions. Additionally, an explanation of all the collected data used in this study is given in Appendix A.

2. Methodology

An overview of the methodology proposed in this study is illustrated in Figure 1, in which the integration of a comprehensive security system theory, a capability degradation analysis method, and the entropy of capability leads to the formation of a comprehensive model to analyse the risk mitigation status of ships under navigation. As shown in Figure 1, there are 3 parts to this study:
Part 1—evaluation framework construction. An evaluation indicator framework is obtained based on structure reorganization using a comprehensive defence framework.
Part 2—indicator degradation analysis. The capability degradation regulation is obtained through numerical curve fitting.
Part 3—system effectiveness evaluation. The system effectiveness is obtained by calculating the degree of capability uncertainty by entropy theory.

2.1. Comprehensive Defence System

The comprehensive defence system provides theoretical support for the reorganization of SRMSs from a traditional decentralized prevention framework based on risk mitigation subject to the capability building framework based on risk mitigation tasks. The comprehensive defence system originated from the physical protection system (PPS), which was first proposed by Sandia Laboratories in the 1980s and was widely used in the field of nuclear safety protection [20]. For the first time, this theory proposes a “triad” safety protection theoretical framework composed of humans, material objects, and technology from the perspective of improving overall risk mitigation capabilities [21]. The principle is shown in Figure 2.
In Figure 2, the theoretical framework of risk mitigation emphasizes the coordination, interdependence, complementarity, and cooperation of multiple risk prevention and control methods. Different from the traditional ergodic risk-responsive safety protection theory, this theory emphasizes the diversification of risk mitigation methods, the systematization of risk mitigation organizations, and the integrity of risk mitigation capabilities and strives to transform the construction of discrete risk mitigation points into the construction of integrated capability. It is more advantageous to use the risk mitigation framework based on comprehensive capability building to address changeable risks in a complex navigational environment, especially in terms of flexible response and timely control.

2.2. Capability Degradation Analysis

The capability degradation analysis method provides an effective means for quantitatively grasping the regulation of system capability changing with time. With the basic understanding that an SRMS has a prescribed life cycle, we infer that the performance of its system components is continuously degraded with time, which will lead to a degradation of the system capability. Therefore, this study adopts the capability degradation analysis method to analyse the degradation regulation of the component units, which can accurately identify the dynamic characteristics and real-time state of risk mitigation capability [22].

2.2.1. Theoretical Basis of Degradation Analysis

Degradation theory is the core foundation of degradation analysis and the main theoretical basis for exploring degradation laws. Degradation theory was first put forward in biology, where it mainly refers to the trend and process by which a material system transforms from order to disorder, or from high order to low order, during natural evolution, and it is used to express the analysis of the continuous decline of biological performance. The core of degradation theory is that, as time goes on, a certain function of anything will gradually decay, which is consistent with the laws of the real world. The introduction of degradation theory makes it possible to master the law of performance change with time. On the one hand, for many devices in our research, degradation is a natural attribute: whether a failure occurs or not, their performance data can be monitored to obtain degradation data. On the other hand, because the modelling of degradation data is closer to the results of failure physics, the introduction of degradation theory can provide more information about the degradation process, and the degradation process can also be used to find the correlation between equipment performance degradation and system state.

2.2.2. Degradation Form Judgment and Indicator Selection

Effective degradation form judgement is the basis for analysing the regulation of ship risk mitigation capability change. According to the way in which a component unit loses its prescribed function, the failure of a component unit in an SRMS can be divided into two forms: sudden failure and degenerative failure. Since sudden failures are inevitable in actual risk mitigation, they need to be removed in the degradation analysis to reduce their interference with the analysis of routine degradation regulation.
Unlike sudden failures, degenerative failures refer to a slow decline in performance over time. They are usually represented by a degradation function that changes with time. Considering that the performance of the component units affecting the system risk mitigation capability needs to reserve a certain margin of insurance, the performance standard ensuring the normal operation of the system is set as a fixed value $R$, and its corresponding equipment effective life threshold is $T$, as shown in Figure 3:
The accurate selection of the degradation indicator is a core task for mastering the degradation regulation of ship risk mitigation capability. Clarifying the characteristic indicators and failure criterion of a component unit is the basic goal when analysing the system effectiveness. The performance characteristic quantity of a component unit is a parameter value reflecting the capability change of a risk mitigation subject, and it needs to be selected according to the function setting by a specific risk mitigation task [23]. In this study, the criterion was usually determined according to the requirements of risk mitigation tasks and called the failure threshold [24]. The failure threshold can be a determined value or an interval.
The selection of performance failure characteristic quantity and data acquisition usually have the following considerations: (1) The performance failure characteristic quantity can be an output parameter specified by the risk mitigation subject [25]. (2) Measurability is the basic requirement for the setting of the failure characteristic quantity. (3) Considering the regularity of measurement data and the convenience of experimental development, in this study, the time interval of each sample measurement is set to be the same.
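To make the threshold logic above concrete, the following minimal Python sketch estimates the effective life threshold $T$ as the first time a hypothetical degradation curve falls below a fixed performance standard $R$; the exponential decay form and all numerical values here are illustrative assumptions, not data from this study.

import numpy as np

# Hypothetical degradation curve: performance decays exponentially with time (assumed form).
def performance(t, p0=1.0, decay=0.05):
    return p0 * np.exp(-decay * t)

R = 0.6                               # assumed fixed performance standard R
t_grid = np.linspace(0, 60, 601)      # assumed rated life cycle, in months

below = t_grid[performance(t_grid) < R]
T = below[0] if below.size else None  # effective life threshold T: first crossing below R
print(T)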

2.2.3. Degradation Regulation Analysis

After selecting the corresponding performance characteristic quantity, exploring the capability degradation regulation of a component unit becomes the top priority. According to the natural degradation phenomenon, the component units of an SRMS can be classified as components whose performance is lost over time. Based on reliability theory, the data distribution of the failure characteristic quantity can be approximated as following some known empirical distribution function, such as a normal, exponential or Weibull distribution [26]. Combined with probability theory, the failure probability at a certain moment can be obtained through the distribution function of the failure characteristic quantity at that moment to finally realize the effectiveness evaluation of the SRMS. The degradation regulation analysis can be roughly divided into the following two steps:
The first step is extracting the true value of the failure characteristic quantity. The main goal is to process the corresponding measurement data by mathematical statistics on the basis of the above selected methods. Assuming that the component unit stays in the normal state, the failure degree of its performance at a certain point can be determined. However, due to the existence of measurement error, there will be some differences between the measured value and the real value. Therefore, the values of multiple measured samples at the measurement point obey the normal distribution form as follows:
$x \sim N(\mu, \sigma^{2})$
where x represents the measured value, μ represents the true value, and σ represents the standard deviation of the measurement, as shown in Figure 4.
The second step is the numerical curve fitting of the characteristic quantity. An appropriate distribution function is typically selected to fit the curve of the characteristic variable with time. Since the risk mitigation unit is assumed to follow the regulation of capability degradation, its characteristic value should also follow the corresponding degradation regulation. By comparing and analysing the exponential, logarithmic, and Weibull forms, the optimized distribution form can be selected to fit the corresponding failure characteristic values, as shown in Figure 5.
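The two steps above can be illustrated with the following Python sketch: replicate measurements at each test point are averaged to estimate the true characteristic value, and exponential, logarithmic, and Weibull-type forms are then compared by their residual sum of squares. All measurement values, test times, and initial parameter guesses are hypothetical.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical replicate measurements of one failure characteristic quantity
# (rows = five test points, columns = repeated samples at that point).
t = np.array([6., 12., 18., 24., 30.])            # assumed test times (months)
samples = np.array([[0.97, 0.95, 0.96],
                    [0.90, 0.92, 0.91],
                    [0.83, 0.85, 0.84],
                    [0.72, 0.75, 0.74],
                    [0.60, 0.63, 0.62]])

# Step 1: estimate the true value as the sample mean, since replicates are
# assumed to scatter around it as N(mu, sigma^2).
mu = samples.mean(axis=1)

# Step 2: compare candidate degradation forms by residual sum of squares.
forms = {
    "exponential":  (lambda x, a, b: a * np.exp(-b * x),        [1.0, 0.02]),
    "logarithmic":  (lambda x, a, b: a - b * np.log(x),         [1.0, 0.10]),
    "weibull-type": (lambda x, lam, k: np.exp(-(x / lam) ** k), [40.0, 1.5]),
}
for name, (f, p0) in forms.items():
    params, _ = curve_fit(f, t, mu, p0=p0, maxfev=10000)
    rss = np.sum((mu - f(t, *params)) ** 2)
    print(name, np.round(params, 3), round(rss, 5))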

2.3. Entropy of Capability Calculation

The method of entropy measurement provides an effective means for quantitatively analysing the indirect measured quantity of capability. Based on the basic belief that a change in risk mitigation status for navigating ships is equivalent to a change in risk mitigation effectiveness caused by the uncertainty of a system’s risk mitigation capability, the effectiveness value can be obtained by measuring the uncertainty of risk mitigation capability [27]. This approach fundamentally addresses the difficult problem that risk mitigation capability cannot be measured.
According to the concept of information entropy in entropy theory, designed for measuring the uncertainty of an information source, the measurement of information is characterized by the uncertainty of the information source [28]. By analogy, the method of entropy of capability can be constructed as the uncertainty measure of a capability source; that is, the measure of capability is characterized by the uncertainty of its component units. Based on this analogy, it is assumed that the component factor of a certain type of capability is $A_i$. The influence degree of the factor on this capability is $p_i$, expressed by the basic probability of causing capability failure. Therefore, the uncertainty degree $H_i$ caused by this factor to the capability can be expressed by the self-information quantity function:
$H_i = \ln \frac{1}{p_i}$
Since the sum of $p_i$ in traditional entropy theory is 1, the measurement of information will show a convergence trend [29]. Therefore, the uncertainty of the system is the uncertainty of the value of the system factors. Assuming that there are $n$ factors that affect the protection capability, according to the nature of the entropy function in information theory, when $p_i = \frac{1}{n}$, the entropy value is the largest [30]. However, the result of this assumption is inconsistent with the objective understanding of protection effectiveness in practice. In reality, the smaller the failure probability of the factor affecting the protection capability, the smaller the instability it brings to the protection ability, the greater the protection capability, and the higher the protection effectiveness. Therefore, the traditional concept of information entropy cannot be directly used here as an analogy for risk mitigation effectiveness.
Considering that the capability of SRMS is affected by the component units and has a positive correlation, the entropy of capability in the SRMS is not the probability of the system capability value but the strength of the uncertainty of conduction correlation between the units and the system. Therefore, the entropy of capability fundamentally solves the uncertainty of the capability fluctuation rather than the uncertainty of the capability value. On this basis, the entropy of capability based on conduction correlation is constructed as follows:
$I_i = (1 - p_i)\ln \frac{1}{p_i}$
Furthermore, it can be concluded that the entropy of capability does not have the characteristics of convergence, which is consistent with reality. Therefore, the greater the uncertainty of a system risk mitigation capability, the greater the uncertainty of its system risk mitigation effectiveness, and then the greater the entropy of capability.
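As a minimal numerical illustration of the entropy of capability defined above (the failure probabilities below are hypothetical), the construction $I_i = (1 - p_i)\ln(1/p_i)$ can be computed as follows; note that, unlike classical information entropy, the value does not converge and decreases monotonically as the failure probability grows.

import numpy as np

def capability_entropy(p):
    """Entropy of capability I = (1 - p) * ln(1 / p) for a failure probability p."""
    p = np.asarray(p, dtype=float)
    return (1.0 - p) * np.log(1.0 / p)

# Hypothetical failure probabilities of component factors of one capability (assumed values).
p = np.array([0.01, 0.05, 0.10, 0.50, 0.90])
print(np.round(capability_entropy(p), 3))   # decreases as p grows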

3. Model Establishment

3.1. Framework Reorganization of SRMS

3.1.1. System Analysis and Reorganization

With the continuous enrichment of risk mitigation means and technology, the composition of SRMSs is constantly deepening and developing, from single components to integrated subsystems and then to the current comprehensive risk mitigation system.
Most of the common SRMS frameworks have been designed based on the spatial distribution of risk mitigation points, mainly including bridges, engine rooms, cargo holds, decks, and living areas [8]. Each risk mitigation point is composed of multiple elements, as shown in Figure 6. The element details are as follows:
  • The bridge subsystem includes elements such as navigation operators, navigation monitoring and warning systems, and navigation communication equipment.
  • The engine room subsystem includes elements such as marine engineers, engine room monitoring platforms, and engine room maintenance equipment.
  • The cargo hold subsystem includes elements such as operators, monitoring systems, and emergency equipment.
  • The deck subsystem mainly includes elements such as staff, protection equipment and ship rescue equipment.
  • The living cabin mainly includes elements such as personal protective equipment and personnel health protection systems.
Due to continuous changes in the navigation environments of ships and the continuous improvement of shipping safety requirements, the complexity and suddenness of the current ship risk mitigation situation have become increasingly serious [31]. The risk mitigation effectiveness of this common SRMS, which adopts single-point strict prevention means, is worsening. Combined with the capability system architecture design of human, physical and technical defences proposed in the comprehensive defence system, the multiple elements in the five subsystems are classified and reorganized, as shown in Figure 6. The system forms the following three parts:
  • Risk mitigation methods, including human-based behaviours, strategies, and measures.
  • Risk mitigation equipment, including hardware-based equipment, signs, and facilities.
  • Risk mitigation platforms, including technology-based perception models, monitoring software, and early warning systems.
Based on the risk mitigation capability construction, the reorganized framework of the SRMS has a clear boundary in structural decomposition and a relatively independent carrier in capability measurement. It provides a practical system network architecture for effectively mastering and analysing the actual effectiveness of SRMSs.

3.1.2. System Description

The component units of SRMS are independent of each other in the risk mitigation carrier, but they are related to each other in terms of function. Through interaction, a comprehensive capability of risk mitigation is jointly formed. According to the composition, functions and characteristics of each subsystem in the SRMS, its basic architecture and constituent units are analysed and integrated as follows:
(1)
Human defence subsystem
Human defence is the oldest method in the risk mitigation domain, and it is also an indispensable protective measure used since ancient times. In the risk mitigation system, human defence mainly includes all related personnel involved in the risk mitigation subjects [32]. Its core role is to use human sensors (eyes, hands, ears, etc.) for detection. When an attacker is found to be dangerous to the protection target, it will rely on its own identification or information transmission to obtain risk signals, make judgements about it and then take corresponding measures. Based on this understanding, the human defence subsystem of the SRMS can be defined as a collection of artificial behaviours for risk mitigation established by the risk mitigation awareness, consciousness and capability of the behavioural associates to realize ship navigation safety. Its basic composition, objectives, functions and characteristics are shown in Table 1.
In Table 1, one of the cores of the risk mitigation system established for navigating ships is humans. Therefore, the related organizations and personnel involved in ships need to put humans in the first place, and education on risk mitigation is essential, covering all links and focusing on the education and training of personnel in risk mitigation awareness, knowledge, and skills. Additionally, there must be a focus on improving human risk mitigation capability, especially enabling ship managers, operators, and risk mitigation specialists who participate in actual risk mitigation to strengthen safety responsibilities ideologically, improve professional ethics conceptually, and develop risk mitigation habits behaviourally. To strengthen human defence, the key is to improve and implement the responsibility system for risk mitigation, clarify the functions and responsibilities of all personnel, and strengthen the supervision, inspection and assessment of the implementation of the risk mitigation responsibility system.
Based on a statistical analysis of ship accident reports and on-site investigations and research, combined with the above definition of the human defence subsystem, we divide it into the following units:
  • Ship management unit. This is mainly composed of ship owners and ship controllers. It is responsible for the maintenance plan, personnel arrangement, financial support and other aspects of ship risk mitigation. This is the top-level design unit of the human defence subsystem of the SRMS.
  • Post operation unit. This is mainly composed of operators and supplementary personnel in various positions, such as the bridge, engine room, and deck of the ship. They are responsible for the safe operation of specific positions and the handling and response of direct risks under the ship’s sailing state. This is the core response unit of the human defence subsystem of the SRMS.
  • Shore-based assistance unit. This is mainly composed of shore-based ship dispatching, supervision and piloting personnel. It is responsible for real-time monitoring, regular inspections, and assistance in response to the ship navigational risk state from the shore. This is an important guaranteed part of the human defence subsystem of SRMS.
(2)
Physical defence subsystem
Physical defence is the core of early systematic protection, and it is also one of the oldest risk mitigation methods used to deal with safety risk. In the risk mitigation system, the physical defence subsystem generally refers to the hardware, physical objects and other barriers that protect the safety of the object [33]. Based on this, the physical defence subsystem of an SRMS can be defined as a collection of physical risk mitigation objects relying on the attributes, configuration and functions of the physical risk mitigation system to realize ship navigation safety. Its main risk mitigation objects include installing protections, facilities, and tools while strengthening protective barriers in key parts, important areas, and key places; setting up hardware facilities such as fire rescue and safe operation; equipping the post personnel with necessary duty, protection, inspection, and other equipment; and setting signs such as prompt board, safety board and restricted access signs. Its basic composition, objectives, functions and characteristics are shown in Table 2.
In Table 2, one of the cores of the safety risk physical defence subsystem established for ships in navigation is hardware. This means that the related parties of ship safety navigation need to increase the resource investment of hardware to provide strong hardware guarantees for the smooth development of safety protection and management of ships during a voyage. It is also an important material basis for building a comprehensive risk mitigation system for safety risk responses. On the premise that the configuration of risk mitigation hardware covers all links of the whole shipping chain, the function and combination of risk mitigation hardware shall be determined according to the actual risk mitigation task requirements. The maintenance and updating of risk mitigation hardware require the support of corresponding funds. Under the overall requirements of reducing ship operation costs, relevant departments related to navigational safety must scientifically plan the budget of hardware funds related to ship navigation safety to effectively ensure the normal and orderly operation of the physical defence subsystem.
Based on a statistical analysis of ship accident reports and on-site investigation and research, combined with the above definition of a physical defence subsystem, we divide it into the following units:
  • Safety facility unit. This is mainly composed of protective equipment related to navigation safety and ship fire protection. It is the basic unit of passive ship risk mitigation.
  • Prevention barrier unit. This is mainly composed of a safety valve, protective net, and explosion-proof door to delay and hinder risk diffusion. It is a supplementary unit for passive risk mitigation.
  • Prompt identification unit. This is mainly composed of indicative signs such as reminder boards, safety boards, and restricted access signs set up at a fixed position. It is an important unit for ship safety protection.
  • Personnel equipment unit. This is mainly composed of the necessary protection and inspection equipment for post personnel. It is the basic material guarantee to support post personnel to effectively deal with navigational risk.
(3)
Technical defence subsystem
Technical defence is a new risk mitigation method in the information age and is the main technological core of the current risk mitigation system. Technical defence refers to protection means using modern science and technology, especially modern information technology means, with strong technical characteristics such as various detection systems, early warning and alarm systems, video monitoring systems, and access control systems [34]. This is an extension and strengthening of human defence and physical defence in protection means and the supplementation and improvement of human defence and physical defence. Based on this, the technical defence subsystem can be defined as a collection of risk mitigation technologies for navigational ships with the characteristics of digitization, networking and intelligence, which rely on emerging technologies in the fields of sensing, computing, communication and processing.
Technical defence subsystems mainly refer to the design and development of corresponding information platforms and application software based on computer communication technology according to the actual risk mitigation requirements to realize the corresponding risk mitigation functions. When it is found that an attack object is dangerous to the protection target, the technical defence subsystem relies on its own information perception or processing capability to obtain potential hazard signals, effectively compensating for the inefficiency of the human defence subsystem and the solidification of the physical defence subsystem. Its basic composition, objectives, functions and characteristics are shown in Table 3.
In Table 3, one of the cores of the technical defence subsystem for ship navigational risk is the information processing system. With the rapid development of modern information technology, it is being widely used in the process of storing, processing and transmitting information in SRMSs. Especially in the construction of ship navigation information platforms, the construction of information systems and the construction of safety risk technology protection systems should be synchronously planned, designed, and constructed. The production, transmission and use of vessel navigation information and its carriers must be equipped with technical equipment that meets the standards for information safety. Key positions for risk mitigation must take measures to prevent and eliminate hazards and be equipped with perfect technical equipment.
Based on a statistical analysis of ship accident reports and on-site investigation and research, combined with the above definition of a technical defence subsystem, we divide it into the following units:
  • Navigation monitoring unit. This is mainly composed of modern information protection platforms such as bridge information monitoring, radar monitoring, and weather monitoring. It is the basic functional unit for the ship technical risk mitigation subsystem.
  • Information assurance unit. This is mainly composed of information technology protection methods such as emergency communication platforms, network protection means, and ship–shore cooperative communication guarantees. It is the basic guarantee unit for the ship technical risk mitigation subsystem.
  • Risk warning unit. This is mainly composed of specific risk mitigation technologies such as the identification of unsafe behaviours, the alarm of abnormal routes, and the fault tolerance of the warning system. It is the core application unit for the ship technical risk mitigation subsystem.
  • Decision management unit. This is mainly composed of the specific safety risk treatment means of automatic collision avoidance route planning, ship automatic navigation and safety risk autonomous response systems. It is an important response unit for the realization of ship technical risk mitigation.

3.2. Indicator Design of SRMS

Based on the human defence, physical defence, and technical defence framework architecture in the above comprehensive defence system and considering the transformation of the risk mitigation concept from discrete risk response to system capability construction in SRMSs, a comprehensive capability indicator framework is proposed in this study, combined with the actual understanding of SRMSs.

3.2.1. Human Defence Subsystem

Combined with human protection characteristics such as the human initiative and flexibility proposed in system safety protection, the human defence subsystem is actually a collection of risk mitigation standards, measures and behaviours that realize ship navigation safety, relying on the risk mitigation awareness, consciousness and capability of behaviour-related persons and human subjects [35]. Based on an analysis of the SRMS structure and the compilation of questionnaires for on-board personnel, a framework for evaluating the capability of the human defence subsystem is constructed. It is mainly divided into three parts: human-oriented attributes, responsibility planning, and system guarantees, as shown in Figure 7.
In Figure 7, human-oriented attributes are used to measure whether the basic capability attributes of associated personnel meet the needs of risk mitigation capabilities, mainly including skill level (age, experience, training, communication, emergency), safety awareness (awareness of compliance with regulations, self-protection, and group cooperation), and personnel health (physical and psychological).
In Figure 7, responsibility planning is used to measure whether the manpower allocation of the human defence subsystem meets the needs of risk mitigation capability, mainly including whether the distribution of power and responsibility is reasonable (rationality of post setting, applicability of staffing requirements, accuracy of staffing quantity), whether job training is in place (pertinence of training content, compliance rate of training frequency, accuracy of training objects), and post-assessment accuracy (assessment content, assessment method and assessment result).
In Figure 7, system support is used to measure whether the support force meets the requirements of supporting the human defence system to give full play to the risk mitigation ability, mainly including team building (personnel age structure, personnel education structure, personnel salary structure), incentive mechanisms (whether the incentive strength is enough and the incentive scale is accurate), and material support (living materials, work materials and emergency materials).
Based on the abovementioned human defence subsystem measurement framework and considering the ergodicity and effectiveness of capability indicators, we selected 11 sets of core capability indicators from historical data and risk mitigation node analysis on the basis of ensuring measurability. Measurable metrics are then set for the corresponding indicators, as shown in Table 4.

3.2.2. Physical Defence Subsystem

The physical defence subsystem is regarded as object protection in safety protection. It is a collection of instruments, signs and facilities for ship risk mitigation, with hardware as the main body, for the purpose of realizing ship navigation safety, giving full play to the passive damage resistance of objects to reduce and delay potential risk [36]. Based on tracing safety risk to its source and an investigation of ship-related personnel, a framework for evaluating the capability of the physical defence subsystem is constructed, which is mainly divided into three parts: entity attributes, combination configurations, and system guarantees, as shown in Figure 8.
In Figure 8, entity attributes are mainly used to measure whether the basic performance quality of the entity meets the requirements of physical defence, mainly including core function (damage resistance, replaceability) and basic quality (service life, consistency and failure rate).
In Figure 8, the combined configuration is mainly used to measure whether the physical configuration of the physical prevention subsystem meets the requirements of risk mitigation capability, mainly including physical selection (the rationality of physical risk mitigation settings, the applicability of physical allocation requirements, and the accuracy of physical configuration quantity), physical response design (whether physical use training is available, and whether the cooperation between physical objects is good), and the actual effect feedback of the material object (feedback content, feedback method and effect after feedback).
In Figure 8, system support is mainly used to measure whether the support force can meet the requirements of supporting the risk mitigation capability of the physical defence system, mainly including physical maintenance (whether the maintenance frequency is sufficient, whether the maintenance personnel are competent, and whether the maintenance content is accurate), quality control (whether equipment acceptance is properly controlled, whether hidden equipment dangers are dealt with, and whether the operating environment is optimized), and funding guarantee (equipment purchases, equipment maintenance, upgrade replacements).
Based on the abovementioned physical defence subsystem measurement framework and considering the ergodicity and effectiveness of capability indicators, we selected 13 sets of core capability indicators from historical data and risk mitigation node analysis on the basis of ensuring measurability. Measurable metrics are then set for the corresponding indicators, as shown in Table 5.

3.2.3. Technical Defence Subsystem

Technical defence is regarded as the extension and enhancement of human defence and physical defence in risk mitigation. It is a collection of risk mitigation methods, application software and integrated systems, with technology as the main body, for the purpose of realizing ship navigation safety, giving full play to the effectiveness of technical activities from the perspective of technical prevention and using information technology to control, discover, analyse and deal with potential risk. Based on tracing safety risk to its source and an investigation of ship-related personnel, a framework for evaluating the capability of the technical defence subsystem is constructed, which is mainly divided into three parts: ontology attributes, function settings, and system guarantees, as shown in Figure 9.
In Figure 9, the ontology attribute is used to measure whether the basic function module design of the system meets the calibre requirements of technical defence, mainly including system functionality (information integration and function expansion), system reliability (environmental adaptability, compatibility between systems), and system portability.
In Figure 9, the function setting is used to measure whether the configuration of the technical defence system meets the requirements of risk mitigation capability, mainly including the integrity of the system function (coverage, target type, monitoring accuracy probability), accuracy (measurement accuracy of technical defence, information fusion accuracy, communication efficiency) and timeliness.
In Figure 9, the system support is used to measure whether the support force meets the requirements of supporting the defence system to give full play to the risk mitigation capability, mainly including the maintenance mechanism (whether the maintenance frequency is sufficient, whether the maintenance personnel are competent, and whether the maintenance content is accurate), the emergency mechanism (emergency response to system failure), and the rescue mechanism.
Based on the abovementioned technical defence subsystem measurement framework and considering the ergodicity and effectiveness of capability indicators, we selected 12 sets of core capability indicators from historical data and risk mitigation node analysis on the basis of ensuring measurability. Measurable metrics are then set for the corresponding indicators, as shown in Table 6.

3.3. Capability Degradation Analysis at Subsystem Level

Based on the above characteristic parameter setting and degradation regulation analysis and considering the scene volatility and time series degradation characteristics of SRMSs, this study proposes a capability degradation methodology based on failure characteristic quantity analysis. It is mainly divided into the following two parts:

3.3.1. Parameters Characterizing Degradation Process

According to mathematical statistics theory, the characteristic parameter data should be obtained through a certain measurement experiment, and the measured data should be recorded at each time point in time order in the specific test [37]. To reduce the single measurement error of the component unit, the measurement of multiple similar samples is selected at one time point instead of multipoint measurement for a single sample at the same time point. Additionally, a certain number of test points are selected in the rated life cycle partition interval to reduce test numbers.
Assuming that a component unit of the SRMS is selected as a sample for the capability degradation analysis, the degradation characteristic parameter data of the corresponding unit are recorded in chronological order at each time point. For the evaluation objects, we define the sample set $A = \{1, 2, \ldots, i, \ldots, n-1, n\}$, and for the evaluation indicators, the sample set $B = \{1, 2, \ldots, j, \ldots, m-1, m\}$. Considering the different types of measurement data, the numerical polarization technique is introduced to normalize the degradation characteristic parameters [38]. Then, under the assumption that $x_j^i(t)$ represents the value of the $j$th indicator for the $i$th evaluation object at test time point $t$, we can obtain the evaluation sample result as follows:
$\{ x_j^i(t) \mid i \in A,\ j \in B \}$
$x_j^i(t) \in (0, 1)$
Due to the interference of equipment factors, measurement errors will inevitably exist in the test. Assuming that the theoretical value for this kind of sample at a certain time is $z_j(t)$ and, according to the theory of the quasi-normal distribution of statistical parameters, that it is also represented as $y_j(t)$ by the expected value comparison method, we can obtain:
$x_j^i(t) = z_j(t) + \varepsilon_j^i(t) = y_j(t) + \varepsilon_j^i(t), \quad \varepsilon_j^i(t) \sim N(0, \sigma^{2})$
$y_j(t < t_1) \geq y_j(t < t_2), \quad t_1 \leq t_2$
where $\varepsilon_j^i(t)$ represents the measurement error of the $j$th indicator for the $i$th evaluation object at test time point $t$.
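The pre-processing described above can be sketched as follows in Python; the raw measurement matrix is hypothetical, min-max scaling stands in for the numerical polarization technique, and the replicate mean and standard deviation play the roles of $y_j(t)$ and the measurement error scale $\sigma$.

import numpy as np

# Hypothetical raw degradation measurements x_j^i(t): 5 test points x 4 evaluation objects.
raw = np.array([[98., 97., 99., 98.],
                [93., 94., 92., 93.],
                [88., 87., 89., 88.],
                [80., 81., 79., 80.],
                [70., 71., 69., 70.]])

# Min-max scaling as a stand-in for the numerical polarization normalization to (0, 1).
x = (raw - raw.min()) / (raw.max() - raw.min())

# Expected value y_j(t): mean over the replicate objects at each test point;
# the scatter around it is treated as measurement error with standard deviation sigma.
y = x.mean(axis=1)
sigma = x.std(axis=1, ddof=1)
print(np.round(y, 3), np.round(sigma, 3))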

3.3.2. Degradation Regulation Analysis

The purpose of building the degradation analysis method is to construct a suitable curve function to fit the trend of degradation characteristic parameters over time. It is assumed that the SRMS follows the capability degradation regulation, so its effectiveness characteristic value should also follow the corresponding degradation regulation. Combined with the theory of equipment reliability analysis and life tests, the optimized Weibull distribution is selected to fit the degradation characteristic curve [39]. The statistical value of the degradation characteristic parameters corresponds to a Weibull distribution, so its probability density function can be expressed as:
$f(x; \lambda, \kappa) = \frac{\kappa}{\lambda} \left( \frac{x}{\lambda} \right)^{\kappa - 1} e^{-\left( \frac{x}{\lambda} \right)^{\kappa}}$
where $x$ represents the effectiveness characteristic value, $\lambda$ represents the scale parameter, and $\kappa$ represents the shape parameter.
Applying Equation (11), it can be determined that the value of $y_j(t)$ follows the optimized Weibull distribution. Additionally, the effectiveness characteristic value is expressed as the effective life value of the system component unit, so its change function can be represented by the probability density function $f(x; \lambda, \kappa)$. On this basis, the trend of degradation characteristic parameters over time can be calculated as:
$y_j(t) = f_j(t; \lambda, \kappa) = \frac{\kappa}{\lambda} \left( \frac{t}{\lambda} \right)^{\kappa - 1} e^{-\left( \frac{t}{\lambda} \right)^{\kappa}}$
Then, based on the expected measurement value $y_j(t)$ and the fitted curve function $f_j(t; \lambda, \kappa)$, the corresponding scale parameter $\lambda$ and shape parameter $\kappa$ can be obtained by statistical analysis and maximum likelihood estimation.
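A minimal Python sketch of this fitting step is given below; the test times and expected characteristic values $y_j(t)$ are hypothetical, and scipy.optimize.curve_fit is used as a least-squares stand-in for the statistical estimation of the scale parameter $\lambda$ and shape parameter $\kappa$.

import numpy as np
from scipy.optimize import curve_fit

def weibull_pdf(t, lam, kappa):
    """Weibull probability density f(t; lambda, kappa) used as the fitting curve."""
    return (kappa / lam) * (t / lam) ** (kappa - 1) * np.exp(-(t / lam) ** kappa)

# Hypothetical expected characteristic values y_j(t) at five test points (assumed data).
t = np.array([6., 12., 18., 24., 30.])
y = np.array([0.020, 0.031, 0.034, 0.030, 0.022])

(lam, kappa), _ = curve_fit(weibull_pdf, t, y, p0=[20.0, 2.0], maxfev=10000)
print(round(lam, 2), round(kappa, 2))   # fitted scale and shape parameters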

3.4. System Effectiveness Measurement Based on Entropy of Capability

3.4.1. Measurement Methodology Description

The effectiveness measurement methodology based on entropy theory provides an effective theoretical basis for solving the accurate measurement of the actual status of ship risk mitigation. The specific derivation logic is shown in Figure 10.
In Figure 10, it can be seen that to realize the accurate measurement of the actual effect of ship risk mitigation, the traditional evaluation method of ship risk mitigation capability is faced with the inherent dilemma that the capability cannot be measured directly [40]. In addition, the specific application scenario characteristics of the SRMS cannot be considered in the evaluation of capability, so the actual status of risk mitigation under the specific task scenario cannot be truly reflected in it. Therefore, combined with the effectiveness definition of the completion degree of the capability to specific tasks, the measurement of the actual status of ship risk mitigation can be transformed into the solution of ship risk mitigation effectiveness.
However, according to the probabilistic attribute of effectiveness, namely the degree to which a capability completes its task, capability itself is not directly measurable and therefore cannot be treated probabilistically. Considering this, we assume that the initial capability can fully complete the task requirements, but uncertainty in the capability arises from scenario changes, which alters the degree to which the capability can complete the task.
According to the aforementioned principle, the measurement of risk mitigation effectiveness can be converted into the uncertainty measurement of risk mitigation capability.
Combined with the definition and formula of entropy of capability in Section 2.3, the effectiveness measurement equation can be constructed as follows:
$M_i = I_i = (1 - p_i)\ln \frac{1}{p_i}$
where $p_i$ represents the failure characteristic value of the $i$th element affecting capability.
In addition, according to capability degradation Equation (12) in Section 3.3, the time-series failure probability measurement value of the system components can be calculated as follows:
$p_i = \frac{F_i(t)}{F_{fixed}(t)} = \frac{f_i(t; \lambda, \kappa)}{T_{fixed}} = \frac{\frac{\kappa}{\lambda} \left( \frac{t}{\lambda} \right)^{\kappa - 1} e^{-\left( \frac{t}{\lambda} \right)^{\kappa}}}{T_{fixed}}$
where $T_{fixed}$ represents the rated service life value.
Therefore, the calculation equation of the system risk mitigation effectiveness based on capability degradation can be obtained as follows:
$M_i = \left( 1 - \frac{f_i(t; \lambda, \kappa)}{T_{fixed}} \right) \ln \frac{T_{fixed}}{f_i(t; \lambda, \kappa)} = \left( 1 - \frac{\frac{\kappa}{\lambda} \left( \frac{t}{\lambda} \right)^{\kappa - 1} e^{-\left( \frac{t}{\lambda} \right)^{\kappa}}}{T_{fixed}} \right) \ln \frac{T_{fixed}}{\frac{\kappa}{\lambda} \left( \frac{t}{\lambda} \right)^{\kappa - 1} e^{-\left( \frac{t}{\lambda} \right)^{\kappa}}}$
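Combining the two equations above, the following Python sketch evaluates the time-series effectiveness $M_i(t)$ for one component unit; the fitted Weibull parameters and the rated service life value $T_{fixed}$ are assumed for illustration.

import numpy as np

def weibull_pdf(t, lam, kappa):
    return (kappa / lam) * (t / lam) ** (kappa - 1) * np.exp(-(t / lam) ** kappa)

def effectiveness(t, lam, kappa, t_fixed):
    """M_i(t) = (1 - p_i) * ln(1 / p_i) with p_i = f_i(t; lambda, kappa) / T_fixed."""
    p = weibull_pdf(t, lam, kappa) / t_fixed
    return (1.0 - p) * np.log(1.0 / p)

# Assumed fitted parameters and rated service life value (illustrative only).
lam, kappa, t_fixed = 25.0, 2.0, 1.0
t = np.array([6., 12., 18., 24., 30.])
print(np.round(effectiveness(t, lam, kappa, t_fixed), 3))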

3.4.2. Comprehensive Effectiveness Aggregation

Based on the effectiveness measurement framework of “performance–capability–effectiveness” and combined with the above capability degradation methodology, the effectiveness measurement process is systematically integrated [41]. It is mainly divided into the following steps:
(1)
Indicator data pre-processing
Based on the indicator design of the SRMS, it is assumed that the expected value of each indicator is the characteristic quantity of the constituent elements affecting the capability. According to the rated performance requirements of the constituent elements, the classification values of each indicator are set as $y_{low}$ for unqualified, $y_{mid}$ for qualified, and $y_{high}$ for excellent. Combined with the algorithm of data extremum optimization, data cleaning is carried out on the performance characteristic quantity. Therefore, the improved performance characteristic quantity can be obtained as follows:
$s_i = 0.8 + \frac{y_i - y_{mid}}{y_{high} - y_{low}}$
where $y_i$ represents the performance characteristic value of the $i$th component unit and $s_i$ represents the improved performance characteristic value of the $i$th component unit.
Then, a logical judgement is made for the performance characteristic indicators under each capability, and it is determined whether to use the following parallel algorithm or series algorithm for fusion:
$\text{parallel algorithm: } q_y = \prod_{i=1}^{n} s_i, \qquad \text{series algorithm: } q_y = \sum_{i=1}^{n} \omega_i s_i$
where $q_y$ represents the performance characteristic fusion value of the $y$th capability.
(2)
Effectiveness calculation based on entropy of capability
Combined with the performance measurement methodology constructed in Section 3.4.1, the probability value of the characteristics is transformed into a single performance value as follows:
$e_y = (1 - q_y)\ln \frac{1}{q_y}$
where $e_y$ represents the effectiveness of the $y$th capability.
Then, the unit effectiveness value under the parallel algorithm can be calculated as:
$e_y = (1 - q_y)\ln \frac{1}{q_y} = \left( 1 - \prod_{i=1}^{n} \left( 0.8 + \frac{y_i - y_{mid}}{y_{high} - y_{low}} \right) \right) \ln \frac{1}{\prod_{i=1}^{n} \left( 0.8 + \frac{y_i - y_{mid}}{y_{high} - y_{low}} \right)}$
(3)
Subsystem performance integration
To facilitate the analysis and comparison of each system, based on the indicator system framework constructed above, this study separately calculates the human, physical, and technical defences and then integrates the three subsystems as follows:
$I_y = e_y \cdot \omega_y, \qquad \omega_y = \frac{v_y}{\sum_{1}^{n} v_y}$
where $\omega_y$ represents the weight of the $y$th capability, obtained by normalizing $v_y$, and $v_y$ represents the loss value of risk accidents caused by the $y$th capability under safety out of control over the past 10 years.
Then, the effectiveness value of the $j$th subsystem is
$E_j = \sum_{y=1}^{n} e_{jy} \cdot \omega_{jy} = \sum_{y=1}^{n} \frac{v_{jy}}{\sum_{1}^{n} v_{jy}} \left( 1 - \prod_{i=1}^{n} \left( 0.8 + \frac{y_{ji} - y_{j,mid}}{y_{j,high} - y_{j,low}} \right) \right) \ln \frac{1}{\prod_{i=1}^{n} \left( 0.8 + \frac{y_{ji} - y_{j,mid}}{y_{j,high} - y_{j,low}} \right)}$
Considering the weight ratio among subsystems, the indicator weight is adjusted according to the evaluation value of ship informatization degree and route safety value [42]. The evaluation value of informatization degree is used for the distribution of system strength, and the route safety value is adjusted as a whole. Thus, the total effectiveness can be obtained as follows:
$E_{total} = \theta \sum_{j=1}^{m} E_j \cdot f_j(t), \qquad \theta = \frac{r}{L}$
where $f_j(t)$ represents the function of the degree of ship informatization with the age of the ship, $\theta$ represents the safety value of the route, $r$ represents the number of annual accidents, and $L$ represents the total annual route flow.
Combined with Equation (21), the total effectiveness value based on the performance characteristic quantity can be obtained as follows:
$E_{total} = \frac{r}{L} \sum_{j=1}^{m} \sum_{y=1}^{n} f_j(t) \cdot \frac{v_{jy}}{\sum_{1}^{n} v_{jy}} \left( 1 - \prod_{i=1}^{n} \left( 0.8 + \frac{y_{ji} - y_{j,mid}}{y_{j,high} - y_{j,low}} \right) \right) \ln \frac{1}{\prod_{i=1}^{n} \left( 0.8 + \frac{y_{ji} - y_{j,mid}}{y_{j,high} - y_{j,low}} \right)}$
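To show how the three aggregation steps fit together, the sketch below chains them for one subsystem with two capabilities; all indicator values, classification thresholds, loss values, the informatization function value, and the route safety ratio are hypothetical, and the parallel fusion is interpreted here as a product of the improved performance values, which is an assumption about the reconstructed Equation (17).

import numpy as np

def improved_performance(y, y_low, y_mid, y_high):
    """s_i = 0.8 + (y_i - y_mid) / (y_high - y_low)."""
    return 0.8 + (y - y_mid) / (y_high - y_low)

def capability_effectiveness(q):
    """e_y = (1 - q_y) * ln(1 / q_y)."""
    return (1.0 - q) * np.log(1.0 / q)

# Hypothetical subsystem with two capabilities:
# (indicator values, y_low, y_mid, y_high, loss value v_y).
caps = [
    (np.array([0.55, 0.60]),       0.0, 0.5, 1.0, 3.0),
    (np.array([0.45, 0.52, 0.50]), 0.0, 0.5, 1.0, 1.0),
]
v = np.array([c[4] for c in caps])
omega = v / v.sum()                       # capability weights omega_y = v_y / sum(v_y)

E_j = 0.0
for (y, lo, mid, hi, _), w in zip(caps, omega):
    s = improved_performance(y, lo, mid, hi)
    q = np.prod(s)                        # parallel fusion (assumed product form)
    E_j += w * capability_effectiveness(q)

theta = 4 / 5000.0                        # assumed route safety value r / L
f_info = 0.9                              # assumed informatization-degree value f_j(t)
E_total = theta * E_j * f_info
print(round(E_j, 4), round(E_total, 6))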

4. Case Study

4.1. Case Selection and Results Output

Accurate scenario definition is the basic guarantee for the evaluation preparation of system effectiveness. Considering that the research object is the sailing state of the ship, the verification scenarios need to be restricted before case selection and indicator data collection.
Generally, a ship’s transportation cycle is the process from the previous port to the next port, including loading, unberthing, leaving port, sailing, entering port, berthing, and unloading [43]. Since this study mainly focuses on the effectiveness evaluation of the risk mitigation system for navigating ships, the scenario is limited to the processes of unberthing, leaving port, sailing, entering port, and berthing, as shown in Figure 11.
In Figure 11, it can be seen that ship navigational tasks can be reorganized into three main stages:
  • Ship departure. In the departure stage, ships go through the process of unberthing and leaving port. Ship risk mitigation mainly involves unberthing safety, sailing in narrow waters, and route planning.
  • Ship sailing. In the sailing stage, ships go through the process of multiple navigation areas and intersection navigation. Ship risk mitigation mainly involves navigation monitoring, ship–shore communication, and collision avoidance decisions.
  • Ship arrival. In the arrival stage, ships go through the process of entering port and berthing. Ship risk mitigation mainly involves navigation monitoring while sailing in narrow waters, ship–shore collaboration, and berthing safety.
Based on the ship navigational information platform of the test ship Yukun, we carried out an exploratory test of an SRMS to verify whether the methodology could accurately evaluate the effectiveness of each subsystem proposed in Section 3.1. The specific parameters of the test ship Yukun are shown in Table 7 below. The test was based on information collected from an actual ship. The main data types collected were navigation records generated in completing the risk mitigation tasks, which were then tested and verified through changes in interaction frequency, volume, and complexity. We took these data as the input and the indicator results under the test as the output, and we then monitored the change in system effectiveness and the performance differences under different indicator data. To improve data consistency, the same set of data measurement standards was used in all tests.
Considering that the research involved 3 subsystems comprising 36 capabilities and 82 groups of influencing elements, we selected one indicator out of the 82 elements to analyse the model results for verification of the degradation methodology and the 36 capabilities to analyse the entropy-of-capability calculation process for verification of the effectiveness evaluation methodology. The test data and the corresponding effectiveness evaluation values are shown in Appendix A. Based on the inspection practice for the risk mitigation system on board Yukun, five points were selected chronologically to measure the degradation of the risk mitigation system. The values measured at these five points for the indicator of “basic personnel support capability” in the human defence subsystem were obtained by a multiple-cycle test [44] and are listed in Appendix A; the measured results are shown in Figure 12, and the corresponding statistical analysis data are summarized in Table 8.
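For reference, the descriptive statistics reported in Table 8 correspond to the output of a pandas describe() call; the sketch below illustrates this with hypothetical placeholder arrays standing in for the Appendix A measurements.

```python
import numpy as np
import pandas as pd

# Hypothetical placeholder arrays standing in for the 100 Appendix A measurements
# at each of the five measuring points A-E (these are NOT the measured values).
rng = np.random.default_rng(0)
spec = {"A": (0.86, 0.05), "B": (0.79, 0.08), "C": (0.65, 0.10),
        "D": (0.51, 0.13), "E": (0.43, 0.18)}
data = {point: rng.normal(mean, std, 100) for point, (mean, std) in spec.items()}

# describe() returns count, mean, std, min, 25%, 50%, 75% and max,
# i.e., exactly the rows reported in Table 8.
print(pd.DataFrame(data).describe())
```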
In addition, based on the effectiveness measurement method, we calculated the statistics of relevant indicators under the three subsystems of human defence, physical defence, and technical defence, as shown in Table 9.

4.2. Degradation Regulation Analysis

4.2.1. Volatility Analysis

The Anderson–Darling test is a statistical method for testing whether collected data follow a given distribution (such as the normal, exponential, or Weibull distribution). For given data and a candidate distribution, the better the fit, the smaller the test statistic. The statistic is compared with critical values at specified significance levels: if the statistic exceeds the critical value (e.g., at the 5% level), the null hypothesis is rejected and the data are judged not to follow the given distribution. For the data at points A–E, the Anderson–Darling test was used to identify the distribution form, and the test results are shown in Table 10 [45].
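As an illustration of the test procedure, the sketch below applies SciPy's Anderson–Darling implementation to a single point's data for the two distribution forms used in Table 10; the sample array is a hypothetical placeholder rather than the measured data.

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder for the 100 measurements at one measuring point.
rng = np.random.default_rng(0)
point_a = rng.normal(loc=0.86, scale=0.05, size=100)

for dist in ("norm", "expon"):
    result = stats.anderson(point_a, dist=dist)
    # Reject the candidate distribution when the statistic exceeds the
    # critical value at the 5% significance level (cf. Table 10).
    i = list(result.significance_level).index(5.0)
    verdict = "reject" if result.statistic > result.critical_values[i] else "do not reject"
    print(dist, round(result.statistic, 3), result.critical_values[i], verdict)
```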
In Table 10, based on the comparison between the statistical values and the critical values, it is found that the data follow the normal distribution at the 95% confidence level but do not follow the exponential distribution.
In Table 8, the test results at the five points were $E_j^A = 0.863755$, $E_j^B = 0.788881$, $E_j^C = 0.653696$, $E_j^D = 0.513926$, and $E_j^E = 0.431334$. As shown in Figure 13, the results are in accordance with the regulation of a gradual decline.
In Figure 13, it can be seen that the measured values at each point fluctuate considerably, which verifies that using the expected (average) value can effectively improve the measurement accuracy compared with a single measurement.
Additionally, Figure 13 shows that the value of the effectiveness characteristic parameter gradually degenerates as the system life cycle develops. The results are consistent with the actual capability degradation trend and verify the effectiveness and applicability of the methodology.
Based on the standard deviation (std) of the measured values in Table 8 and the dispersion of the points in Figure 14, it can be determined that as the capability of the element degrades, the standard deviation presents a gradual upwards trend, and this trend accelerates in the later stage. The results show that the system performance fluctuates more strongly after degradation, which is consistent with the increasing uncertainty of system capability over time.

4.2.2. Degradation Trend Analysis

To characterize the effectiveness degradation regulation of the SRMS over the whole life cycle, we arranged 250 measurement points according to the time series and expressed them in the form of a scatter diagram. The data were fitted by exponential, logarithmic, and Weibull curves, as shown in Figure 14.
The three common curve forms were tested against the data, and the results are shown in Table 11. In Table 11, it is found that, compared with the exponential and logarithmic curve fittings, the data follow the Weibull curve.
Through calculation on the Python platform, the fitting functions of the curves were as follows:
Weibull: $E(t) = \dfrac{\kappa}{\lambda} \left( \dfrac{t}{\lambda} \right)^{\kappa - 1} e^{-\left( t / \lambda \right)^{\kappa}}$, $\kappa = 99.49$, $\lambda = 1.7$
Exponential: $E(t) = a e^{-bt} + c$, $a = 0.91$, $b = 0.004$, $c = 0.1$
Logarithmic: $E(t) = a \log(t - b) + c$, $a = -0.2$, $b = 50$, $c = 1.48$
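As a sketch of how such fits can be obtained, the snippet below uses SciPy's curve_fit with model functions that follow the fitted forms above; the time axis and noisy series are synthetic placeholders rather than the 250 measured points.

```python
import numpy as np
from scipy.optimize import curve_fit

# Candidate degradation forms following the fitted functions above.
def weibull_pdf(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-(t / lam) ** k)

def exponential(t, a, b, c):
    return a * np.exp(-b * t) + c

def logarithmic(t, a, b, c):
    # t - b is clipped to keep the logarithm defined while the optimizer explores b.
    return a * np.log(np.maximum(t - b, 1e-9)) + c

# Placeholder series standing in for the 250 time-ordered effectiveness measurements.
rng = np.random.default_rng(42)
t = np.linspace(1, 250, 250)
e = exponential(t, 0.91, 0.004, 0.10) + rng.normal(0.0, 0.02, t.size)

# Fit one candidate form; the same call pattern applies to the other two.
params, _ = curve_fit(exponential, t, e, p0=(1.0, 0.01, 0.1))
print("exponential fit (a, b, c):", np.round(params, 4))
```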
In Figure 14, it can be determined that the fitted curve shows a downwards trend with time, which corresponds to the five groups of measured points above, indicating that the curve conforms to the corresponding degradation regulation. Additionally, it is found that the degradation curve in the first 25% of the life cycle belongs to a relatively stable stage, the degradation rate gradually increases at 25–50% of the life cycle and decreases at 50–75% of the life cycle, and the system performance reaches a relatively low stable state at 75–85% of the life cycle. These results are consistent with the phenomenon that an SRMS can still operate, although inefficiently, in reality. The fact that the value at 75–85% of the life cycle ($E_j^{250} \approx 0.4$) is not 0 also shows that there is a large error in the traditional practice of using a detection value of 0 to represent the failure of the risk mitigation system [46]. By comparing the fitting results of the three common curves in Figure 14, it can be seen that the Weibull curve fitting better expresses the effectiveness degradation regulation of the SRMS.

4.3. Evaluation Accuracy Analysis

Based on the collection and sorting of the risk mitigation indicator data, the effectiveness measurement algorithm based on the entropy of capability was compared with the traditional accident probability algorithm to verify the accuracy of the effectiveness evaluation model. Considering the uncertainty introduced by the subjective scoring used in many traditional accident probability algorithms, prior experience from historical data was selected as the basis for setting the weight ratios to reduce the impact of subjective judgement on the analysis results.

4.3.1. Traditional Effectiveness Evaluation Method Based on the Accident Probability Algorithm

(1)
Priori data generation
Based on the common framework of the ship risk mitigation system, we collected accident statistics from the IHS Sea-web database for the past 10 years and conducted a statistical analysis of the main accident types and causes, as shown in Figure 15.
Combining the specific accident cause classification with the statistical reports and following the five common ship risk mitigation subjects, the cause statistics are constructed as shown in Table 12.
(2)
Effectiveness calculation
Since the traditional accident probability algorithm is based on actual ship accidents, we compiled the maintenance records of the tested ship over the past year and sorted the relevant data, as shown in Table 13.
Based on the failure probability of each element, we use the weight ratio of a priori experience to calculate the effectiveness value as follows:
$E = \sum_{j=1}^{m} \sum_{i=1}^{n} \left( 1 - \dfrac{x_{ij}}{x_{ej} \, m} \right)^{1 - \delta} \left( 1 - p_{ij} \right) \varpi_j$
where $x_{ij}$ represents the $i$th causal indicator value of the $j$th risk mitigation subject, $x_{ej}$ represents the reference value of the $j$th risk mitigation subject, $m$ represents the number of sailings, $\delta$ represents the sailing time correction factor, $p_{ij}$ represents the a priori failure probability value of the $i$th cause indicator of the $j$th risk mitigation subject, and $\varpi_j$ represents the weight factor of the $j$th risk mitigation subject.
Based on the failure probability of each element, we calculated the effectiveness of the SRMS, as shown in Table 14.

4.3.2. Effectiveness Evaluation Based on Entropy of Capability

Based on the data collection in the technical defence, human defence, and physical defence subsystems, using the effectiveness measurement methodology in Section 3.4, the corresponding subsystem and system effectiveness values can be obtained as shown in Table 15.
Then, based on the calculation standard for casualties and property loss of the International Labour Organization, the real navigational effectiveness value of the ship was calculated. Combined with the actual sailing accident records of the test ship over the past year, we conducted a comparative analysis of the two methods, as shown in Figure 16.
Figure 16 shows that the deviation between the effectiveness value based on the entropy of capability and the actual value is approximately 4.93%, whereas the deviation of the traditional effectiveness value based on accident probability from the actual value reaches 13.58%. Therefore, the effectiveness model based on the entropy of capability can more accurately represent the real state of ship risk mitigation, with an accuracy improvement of approximately 9% (13.58% − 4.93% ≈ 8.7%).
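For clarity, the deviations quoted here are relative deviations of each model's total effectiveness from the loss-based actual value. The sketch below reproduces them approximately; the actual value of roughly 0.81 is inferred from the reported percentages and is not a figure taken from the data.

```python
def relative_deviation(predicted: float, actual: float) -> float:
    # |predicted - actual| / actual, expressed as a percentage.
    return abs(predicted - actual) / actual * 100.0

# E_total from the entropy-of-capability model (Table 15) and from the traditional
# accident-probability model (Table 14); 0.81 is an approximate actual value
# inferred from the deviations reported above, not a value taken from the paper.
actual = 0.81
print(round(relative_deviation(0.8485, actual), 2))  # ≈ 4.8%, cf. the reported 4.93%
print(round(relative_deviation(0.9202, actual), 2))  # ≈ 13.6%, cf. the reported 13.58%
```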
Based on the definition of performance in Section 2.1, the effectiveness defined in this research refers to the degree to which the vessel risk mitigation system has completed risk control, not to the operational efficiency of the system itself. The traditional effectiveness calculation methodology is mainly based on statistical historical accident data and can only be applied after an accident has occurred; it ignores the gradual degradation of capability that leads from quantitative change to qualitative change before the accident. Therefore, there is a certain lag in the effectiveness evaluation of an SRMS, and dangerous phenomena that do not result in accidents may not be captured. Together, these factors lead to a large distortion of the evaluation value, which may cause a misunderstanding of the risk mitigation status and seriously affect the safety of ship navigation. In contrast, the effectiveness model based on the entropy of capability proposed in this study measures the uncertainty of capability and, at the same time, considers the capability change caused by degradation over time. Therefore, the effectiveness evaluation based on the entropy of capability is more accurate and applicable.
Figure 16 also shows that the calculated effectiveness value based on the entropy of capability is slightly larger than the true value. Decomposition of the model indicates that the complexity and uncertainty of the internal relationships in the SRMS may introduce certain systematic failures, so the real effectiveness is slightly lower than the indicated status. This further confirms that the calculation method for the SRMS based on the entropy of capability is closer to the real state.

4.4. Model Comprehensive Analysis

From the comparative analysis in Section 4.2 and Section 4.3, the effectiveness evaluation model based on the entropy of capability has clear advantages in coverage, timeliness, accuracy and operability. The evaluation framework of the SRMS based on the theoretical perspective of the comprehensive defence system is more scientific and provides a better network architecture for accurately measuring the effectiveness of the SRMS. In addition, compared with the traditional ergodic indicator design, the architecture built from the capability perspective is more systematic and offers broader coverage. Through the volatility analysis of system capability degradation, the model considers and analyses the dynamic variability of system capability and improves the timeliness of the effectiveness value. Additionally, the construction of the entropy of capability in this study solves the problem that capability is difficult to measure directly, and the corresponding effectiveness measurement methodology is constructed based on the risk mitigation tasks. Therefore, the model improves the accuracy of the evaluation of the actual risk mitigation state and clarifies how risk mitigation capability can be improved in an operable way.
However, since the model has been verified with only a limited number of samples, its versatility needs to be further confirmed. In addition, ship safety risks are characterized by suddenness and complexity, and the corresponding causal relationships need to be further studied.

5. Conclusions

Sustaining ship safety during navigation is highly reliant on the effectiveness of ship risk mitigation systems. To comprehensively consider the degradation characteristics of risk mitigation capability and the scenario attributes of the risk mitigation task, this study proposed an effectiveness evaluation model based on the entropy of capability and capability degradation analysis. Beginning with the transformation of the risk mitigation concept in the SRMS and referring to the three-dimensional integrated risk mitigation framework of comprehensive defence theory (human defence, physical defence and technical defence), the SRMS indicator system is established based on capability building, which breaks through the traditional evaluation method based on accident probability analysis. On this basis, a capability degradation analysis methodology was constructed to address the change in capability over time, which makes dynamic capability analysis of an SRMS possible and improves the accuracy of its effectiveness evaluation. In addition, because risk mitigation capability cannot be directly measured, the entropy of capability was constructed based on entropy theory, and an effectiveness evaluation model based on the entropy of capability was then proposed. From the perspective of capability uncertainty measurement, the effectiveness evaluation of the risk mitigation status of an SRMS was realized by the proposed model.
The case study results show that using the entropy of capability can realize an accurate effectiveness evaluation of a risk mitigation system for navigating ships, with an accuracy improvement of 9%, and that the Weibull curve fitting is more consistent with the capability degradation regulation, with a significance level of 2.5%. Our research provides a more accurate and comprehensive dynamic analysis method for the effectiveness evaluation of SRMSs. Instead of relying on simple accident probability statistics or substituting the measurement of capability indicators, it comprehensively analyses capability uncertainty and its degradation over time. The framework, designed on the basis of capability building, ensures good applicability for evaluating the effectiveness of SRMSs on ships of different levels. The present study verified the proposed model for one case. For application in practical operation, the procedure needs to be further standardized to facilitate its use by engineers, and third-party testing may also be required. Further research will focus on the practical application of the proposed model to different ship fleets, on the basis of which a standardized application procedure will be developed.

Author Contributions

Conceptualization, X.M.; methodology, J.S.; validation, J.S.; formal analysis, J.S.; investigation, J.S.; resources, J.S.; data curation, J.S.; writing—original draft preparation, J.S.; writing—review and editing, W.Q.; visualization, W.Q.; supervision, X.M.; project administration, X.M.; funding acquisition, X.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Key Research and Development Program of China (Grant No. 2019YFB1600601), the Fundamental Research Funds for the Central Universities (Grant No. 3132022223) and the Program of Innovative Talents of Dalian (Grant No. 2021RQ043).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. The measured values of basic personnel support capability in the human defence subsystem at different measuring points.
No. | A | B | C | D | E
10.83910.69740.65940.55920.2120
20.93840.84780.56490.52940.4729
30.83790.85220.71030.62030.4478
40.87030.73150.44120.71270.4023
50.87070.86630.62540.75120.4270
60.87110.70330.77730.34600.0617
70.90210.61020.54020.27640.0851
80.90360.79630.69850.69600.1030
90.87450.85650.59820.54200.1075
100.86440.81000.46760.73940.1164
110.76680.75930.70840.73790.1424
120.82910.74090.65030.46610.1439
130.86090.84980.56870.39490.1590
140.93090.91900.56680.50980.1792
150.82370.77340.71430.66330.2009
160.84580.81080.62830.63210.2117
170.89370.76540.67250.71230.2146
180.95650.78690.61780.42230.2302
190.81400.76850.66970.29760.2306
200.79420.81260.61270.56420.2336
210.92100.73440.51430.35870.2527
220.88040.87900.58530.21020.2693
230.85210.80020.74390.35120.2759
240.82850.81940.57190.55760.2782
250.90260.72110.67950.52680.2833
260.95490.64610.69490.63610.2948
270.78400.83110.66780.56090.2968
280.86030.69320.49040.72780.3077
290.95080.69280.66410.57840.3135
300.79700.76550.62600.53130.3248
310.89020.77200.70520.61780.3249
320.79860.71890.62040.51890.3260
330.92040.80610.67920.52500.3265
340.77680.91340.81750.68660.3279
350.93280.67300.63230.63040.3359
360.80060.75060.68280.56810.3371
370.87760.81540.76870.51830.3427
380.82010.85350.78140.41630.3482
390.86821.06050.68470.56780.3505
400.85920.81090.66530.39090.3520
410.91980.90390.76920.69950.3529
420.78370.93660.61590.51110.3670
430.93350.85830.61480.49310.3701
440.85440.70280.61980.50980.3713
450.89850.77820.86600.38800.3802
460.87010.75490.82220.41280.3905
470.92180.88470.64880.41430.3957
480.79110.74330.64930.42410.3977
490.89580.93770.58840.63640.4097
500.91060.76700.74410.59960.4245
510.86110.80960.72910.54500.4284
520.91740.59320.58760.50700.4293
530.84350.75790.77650.56080.4393
540.91450.57240.48390.40980.4403
550.80940.73190.66380.46180.4411
560.95750.82810.53040.49350.4439
570.75590.81130.56840.61820.4490
580.84910.73900.58510.46980.4528
590.90120.77000.46910.59800.4536
600.77320.87820.80940.72570.4611
610.97420.82530.76330.58250.4639
620.81260.81500.75080.45870.4673
630.80380.88050.70910.51910.4736
640.86070.67040.66760.57930.4742
650.82520.68520.74920.50110.4807
660.87940.77960.56350.55500.4807
670.84480.72110.42400.27330.4840
680.86430.77000.65420.87110.4916
690.89400.83250.82800.58510.5045
700.85800.79250.69380.50800.5071
710.87760.85640.56440.60170.5106
720.89490.94810.51970.54760.5310
730.86350.80400.68290.42940.5335
740.80960.72570.68600.31970.5387
750.84170.65560.77150.52320.5434
760.78940.80580.63800.49890.5436
770.86880.87850.69340.35640.5641
780.81770.82320.57920.35760.5670
790.83030.70660.43590.60650.5690
800.87420.88310.85360.56930.5848
810.86070.82430.65200.54350.6022
820.86310.66880.63090.49110.6053
830.82380.78590.52670.50560.6219
840.87690.93130.70720.51730.6440
850.90910.71170.60570.45140.6468
860.82470.83430.82500.57210.6491
870.87490.75080.67820.32120.6532
880.85450.83600.67170.49380.6721
890.87690.75960.46610.38980.6785
900.87610.79570.86120.35390.6888
910.92070.78860.68080.63700.6984
920.87610.69750.65610.63240.7075
930.91560.74780.66180.43530.7398
940.86620.95230.63460.57290.7544
950.85370.77910.63190.38260.7801
960.85510.82670.68050.41720.7900
970.84890.79480.67030.31070.7915
980.89020.73210.75720.30150.4525
990.83920.76040.63290.25700.8633
1000.89350.77720.60090.43200.8319

References

  1. Xue, J.; Wu, C.; Chen, Z.; van Gelder, P.H.A.J.M.; Yan, X. Modeling Human-like Decision-Making for Inbound Smart Ships Based on Fuzzy Decision Trees. Expert. Syst. Appl. 2019, 115, 172–188. [Google Scholar] [CrossRef]
  2. Galieriková, A. The Human Factor and Maritime Safety. Transp. Res. Procedia 2019, 40, 1319–1326. [Google Scholar] [CrossRef]
  3. Qiao, W.; Liu, Y.; Ma, X.; Liu, Y. A Methodology to Evaluate Human Factors Contributed to Maritime Accident by Mapping Fuzzy FT into ANN Based on HFACS. Ocean. Eng. 2020, 197, 106892. [Google Scholar] [CrossRef]
  4. Kaptan, M.; Sarıalïoğlu, S.; Uğurlu, Ö.; Wang, J. The Evolution of the HFACS Method Used in Analysis of Marine Accidents: A Review. Int. J. Ind. Ergonom. 2021, 86, 103225. [Google Scholar] [CrossRef]
  5. Kayisoglu, G.; Gunes, B.; Besikci, E.B. SLIM Based Methodology for Human Error Probability Calculation of Bunker Spills in Maritime Operations. Reliab. Eng. Syst. Saf. 2022, 217, 108052. [Google Scholar] [CrossRef]
  6. Shi, X.; Zhuang, H.; Xu, D. Structured Survey of Human Factor-Related Maritime Accident Research. Ocean. Eng. 2021, 237, 109561. [Google Scholar] [CrossRef]
  7. Guo, Y.; Lu, Y.; Guo, Y.; Liu, R.W.; Chui, K.T. Intelligent Vision-Enabled Detection of Water-Surface Targets for Video Surveillance in Maritime Transportation. J. Adv. Transp. 2021, 2021, 9470895. [Google Scholar] [CrossRef]
  8. Olba, X.B.; Daamen, W.; Vellinga, T.; Hoogendoorn, S.P. Risk Assessment Methodology for Vessel Traffic in Ports by Defining the Nautical Port Risk Index. J. Mar. Sci. Eng. 2020, 8, 10. [Google Scholar] [CrossRef] [Green Version]
  9. Zolich, A.; Palma, D.; Kansanen, K.; Fjørtoft, K.; Sousa, J.; Johansson, K.H.; Jiang, Y.; Dong, H.; Johansen, T.A. Survey on Communication and Networks for Autonomous Marine Systems. J. Intell. Robot. Syst. 2019, 95, 789–813. [Google Scholar] [CrossRef] [Green Version]
  10. Wang, D.; Li, D.; Sun, L.; Zhang, Z.; Cheng, J.; Yu, M. Fuzzy Comprehensive Evaluation of Maritime Boundary Delimitation Schemes Based on AHP-Entropy Weight Method. J. Intell. Fuzzy Syst. 2022, 2022, 2757779. [Google Scholar] [CrossRef]
  11. Sur, J.M.; Kim, D.J. Comprehensive Risk Estimation of Maritime Accident Using Fuzzy Evaluation Method—Focusing on Fishing Vessel Accident in Korean Waters. Asian J. Shipp. Log. 2020, 36, 127–135. [Google Scholar] [CrossRef]
  12. Wang, L.; Liu, Q.; Dong, S.; Guedes Soares, C. Effectiveness Assessment of Ship Navigation Safety Countermeasures Using Fuzzy Cognitive Maps. Saf. Sci. 2019, 117, 352–364. [Google Scholar] [CrossRef]
  13. Fan, S.; Yang, Z.; Blanco-Davis, E.; Zhang, J.; Yan, X. Analysis of Maritime Transport Accidents Using Bayesian Networks. Proc. Inst. Mech. Eng. Part O J. Risk Reliab. 2020, 234, 439–454. [Google Scholar] [CrossRef]
  14. Xie, T.H.; Lin, Y.; Chi, W.; Yang, Z.Y. Intelligent Identification of Fires in Ship Compartments Using a Bayesian Network. J. Mar. Sci. Tech.-Jpn. 2016, 24, 842–850. [Google Scholar] [CrossRef]
  15. Fan, S.; Blanco-Davis, E.; Yang, Z.; Zhang, J.; Yan, X. Incorporation of Human Factors into Maritime Accident Analysis Using a Data-Driven Bayesian Network. Reliab. Eng. Syst. Saf. 2020, 203, 107070. [Google Scholar] [CrossRef]
  16. Jia, X.; Zhang, D. Prediction of Maritime Logistics Service Risks Applying Soft Set Based Association Rule: An Early Warning Model. Reliab. Eng. Syst. Saf. 2021, 207, 107339. [Google Scholar] [CrossRef]
  17. Yu, Q.; Teixeira, Â.P.; Liu, K.; Rong, H.; Guedes Soares, C. An Integrated Dynamic Ship Risk Model Based on Bayesian Networks and Evidential Reasoning. Reliab. Eng. Syst. Saf. 2021, 216, 107993. [Google Scholar] [CrossRef]
  18. Nasiri, M.; Ukko, J.; Saunila, M.; Rantala, T.; Rantanen, H. Digital-Related Capabilities and Financial Performance: The Mediating Effect of Performance Measurement Systems. Technol. Anal. Strateg. 2020, 32, 1393–1406. [Google Scholar] [CrossRef]
  19. Li, X.; Lin, M.; Shu, Z.; Liu, G.; Wang, T.; Wang, W. Synthesized Capability-Effectiveness Evaluation Method of Contribution Ratio to System-of-Systems. J. Simul. 2018, 12, 4520–4528. [Google Scholar] [CrossRef]
  20. Bettauer, R.J. The Convention on the Physical Protection of Nuclear Material. Am. J. Int. Law. 1980, 74, 205–206. [Google Scholar] [CrossRef]
  21. Garcia, M.L. Design and Evaluation of Physical Protection Systems; Butterworth-Heinemann: Oxford, UK, 2007. [Google Scholar]
  22. Chatzimichailidou, M.M.; Dokas, I.M. The Risk Situation Awareness Provision Capability and Its Degradation in the Überlingen Accident over Time. Procedia Eng. 2015, 128, 44–53. [Google Scholar] [CrossRef] [Green Version]
  23. Chen, C.H.; Chou, C.Y.; Kan, C.C.; Khoo, M.B.C. Production Quantity and Specification Limits Settings by Considering Specified Process Capability Value. J. Ind. Eng. Chem. 2014, 31, 229–237. [Google Scholar] [CrossRef]
  24. Gao, H.; Cui, L.; Kong, D. Reliability Analysis for a Wiener Degradation Process Model under Changing Failure Thresholds. Reliab. Eng. Syst. Saf. 2018, 171, 1–8. [Google Scholar] [CrossRef]
  25. Fotr, J.; Špaček, M.; Souček, I.; Vacík, E. Scenarios, Their Concept, Elaboration and Application. Balt. J. Manag. 2015, 10, 73–97. [Google Scholar] [CrossRef]
  26. Aslett, L.J.M.; Nagapetyan, T.; Vollmer, S.J. Multilevel Monte Carlo for Reliability Theory. Reliab. Eng. Syst. Saf. 2017, 165, 188–196. [Google Scholar] [CrossRef] [Green Version]
  27. Paté-Cornell, M.E. Organizational Aspects of Engineering System Safety: The Case of Offshore Platforms. Science 1990, 250, 1210–1217. [Google Scholar] [CrossRef] [Green Version]
  28. Shannon, C.E. A Mathematical Theory of Communication. Bell. System. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  29. Ornstein, D.S. Ergodic Theory, Randomness, and “Chaos”. Science 1989, 243, 182–187. [Google Scholar] [CrossRef] [Green Version]
  30. König, R.; Renner, R.; Schaffner, C. The Operational Meaning of Min- and Max-Entropy. IEEE Trans. Inf. Theory 2009, 55, 4337–4347. [Google Scholar] [CrossRef] [Green Version]
  31. Łosiewicz, Z.; Nikończuk, P.; Pielka, D. Application of Artificial Intelligence in the Process of Supporting the Ship Owner’s Decision in the Management of Ship Machinery Crew in the Aspect of Shipping Safety. Procedia Comput. Sci. 2019, 159, 2197–2205. [Google Scholar] [CrossRef]
  32. Knox, B.J.; Lugo, R.G.; Sütterlin, S. Cognisance as a Human Factor in Military Cyber Defence Education. IFAC-Pap. OnLine 2019, 52, 163–168. [Google Scholar] [CrossRef]
  33. Li, J.; Liu, Y.; Chen, T.; Xiao, Z.; Li, Z.; Wang, J. Adversarial Attacks and Defenses on Cyber-Physical Systems: A Survey. IEEE Int. Things J. 2020, 7, 5103–5115. [Google Scholar] [CrossRef]
  34. Tam, K.; Moara-Nkwe, K.; Jones, K. The Use of Cyber Ranges in the Maritime Context: Assessing Maritime-Cyber Risks, Raising Awareness, and Providing Training. Marit. Technol. Res. 2020, 3, 16–30. [Google Scholar] [CrossRef]
  35. Roesler, E.; Manzey, D.; Onnasch, L. A Meta-Analysis on the Effectiveness of Anthropomorphism in Human-Robot Interaction. Sci. Robot. 2021, 6, 5425. [Google Scholar] [CrossRef]
  36. Wang, W.; di Maio, F.; Zio, E. Adversarial Risk Analysis to Allocate Optimal Defense Resources for Protecting Cyber–Physical Systems from Cyber Attacks. Risk. Anal. 2019, 39, 2766–2785. [Google Scholar] [CrossRef] [Green Version]
  37. Perera, L.P.; Ferrari, V.; Santos, F.P.; Hinostroza, M.A.; Soares Guedes, C. Experimental Evaluations on Ship Autonomous Navigation and Collision Avoidance by Intelligent Guidance. IEEE J. Ocean. Eng. 2014, 40, 374–387. [Google Scholar] [CrossRef]
  38. Ma, X.; Shen, J.; Liu, Y.; Qiao, W. A Methodology to Evaluate the Effectiveness of Intelligent Ship Navigational Information Monitoring System. IEEE. Access. 2020, 8, 193544–193559. [Google Scholar] [CrossRef]
  39. Song, X.; Wang, S.; Hu, Z.; Li, H. A Hybrid Rayleigh and Weibull Distribution Model for the Short-Term Motion Response Prediction of Moored Floating Structures. Ocean. Eng. 2019, 182, 126–136. [Google Scholar] [CrossRef]
  40. Khamkanya, A.; Cho, B.R.; Goethals, P.L. Integrating Customer Perception into Process Capability Measures. Qual. Reliab. Eng. Int. 2016, 32, 1331–1345. [Google Scholar] [CrossRef]
  41. Lewis, H.F.; Lock, K.A.; Sexton, T.R. Organizational Capability, Efficiency, and Effectiveness in Major League Baseball: 1901-2002. Eur. J. Oper. Res. 2009, 197, 731–740. [Google Scholar] [CrossRef]
  42. Liu, J.; Zhang, H.; Zhen, L. Blockchain Technology in Maritime Supply Chains: Applications, Architecture and Challenges. Int. J. Prod. Res. 2021, 2021, 1–17. [Google Scholar] [CrossRef]
  43. Akyuz, E. Quantitative Human Error Assessment during Abandon Ship Procedures in Maritime Transportation. Ocean. Eng. 2016, 120, 21–29. [Google Scholar] [CrossRef]
  44. Dong, D.T.; Cai, W. A Comparative Study of Life Cycle Assessment of a Panamax Bulk Carrier in Consideration of Lightship Weight. Ocean. Eng. 2019, 172, 583–598. [Google Scholar] [CrossRef]
  45. Razali, N.M.; Wah, Y.B. Power Comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling Tests. J. Stat. Modeling Anal. 2011, 2, 21–33. [Google Scholar] [CrossRef] [Green Version]
  46. Fam, M.L.; Konovessis, D.; He, X.H.; Ong, L.S. Data Learning and Expert Judgment in a Bayesian Belief Network for Aiding Human Reliability Assessment in Offshore Decommissioning Risk Assessment. IEEE J. Ocean. Eng. 2021, 6, 170–184. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of the effectiveness evaluation model based on entropy theory.
Figure 2. The comprehensive security system theory core principle.
Figure 3. Performance change trends of sudden failures and degraded failures.
Figure 4. The distribution form selected for extracting the true value.
Figure 5. Comparison of various curve fitting forms of characteristic quantities.
Figure 6. The common and reorganized framework of an SRMS.
Figure 7. Measurement framework of the human defence system capability indicator.
Figure 8. Measurement framework of the physical defence system capability indicator.
Figure 9. Measurement framework of the technical defence system capability indicator.
Figure 10. Derivation logic of effectiveness measurement based on entropy of capability.
Figure 11. Scenario settings for the effectiveness evaluation of SRMS.
Figure 12. Measured results: distribution at points (A–E).
Figure 13. Degradation trend of the data at points A–E.
Figure 14. Fitting curves of the statistical analysis data at 250 measurement points.
Figure 15. The main accident type and cause from the IHS database in 2011–2021.
Figure 16. Comparison of the results from the two models.
Table 1. Details of the human defence subsystem of the SRMS.
Composition | A collection of human-centred risk mitigation standards, measures and behaviours.
Objective | From the perspective of human risk mitigation, fully mobilize people’s subjective initiative, timely identify, accurately judge and efficiently deal with potential safety risks to ensure the normal operation of risk mitigation subjects.
Function | Based on people’s subjective initiative, give play to the role of independent decision-making, flexible interaction, and experience dependence of human defence.
Characteristic | Advantage: strong individual dependence (experience, capability), strong autonomy (self-judgement), strong interaction (flexible), strong adaptability (changing with the environment). Disadvantage: poor uniformity (judgement criteria, operation mode), poor driver (unable to standardize the program), poor persistence (human characteristics).
Table 2. Details of the physical defence subsystem of the SRMS.
Composition | A collection of hardware-based risk mitigation instruments, signs and facilities.
Objective | From the perspective of physical risk mitigation, giving full play to the passive resistance of objects, reducing, delaying and avoiding possible risks, and then improving the effectiveness of water traffic risk mitigation.
Function | Based on the characteristics of a physical object, playing the role of barrier, identification, and damage resistance of physical prevention.
Characteristic | Advantage: good damage resistance (compared to humans), high permanence (24 h on duty), excellent economy (compared with personal injury), strong replaceability (replaceable after damage), strong objectivity (fixed attributes). Disadvantage: poor autonomy (no self-awareness), poor flexibility (almost no interaction), poor lifting performance (cured performance).
Table 3. Details of the technical defence subsystem of the SRMS.
Composition | A collection of technology-based risk mitigation methods, application software, and integrated systems.
Objective | From the perspective of technical risk mitigation, giving full play to the effectiveness of technical activities, using information technology to discover, analyse, and deal with potential risks, then improving the effectiveness of water traffic risk mitigation.
Function | Based on the inherent characteristics of technology, playing the role of identification, early warning, analysis, and monitoring in technical prevention.
Characteristic | Advantage: strong methodology (with technical support), strong adaptability (fast perception speed), strong interactivity (link people and things), high integration (diversified constituent elements), good relevance and responsiveness (completely from actual demand). Disadvantage: high external dependence (cannot run alone), high technical threshold (not easy to achieve), faster stacking speed (as demand changes).
Table 4. Capability indicators of human defence.
Basic Personnel Support CapabilityThe Certificate Status
Ship Manning Situation
Stability of personnel on boardaverage time on board
proportion of leader layer in half a year
proportion of operation layer in half a year
proportion of support layer in half a year
Familiarity of decision-makers with shipsthe ship’s captain continuous on-board time
the ship’s chief mate continuous on-board time
average cumulative time of officers on similar ships
Risk mitigation exercise situationcompleteness of types of ship exercises
frequency of key project exercises
record times of basic exercise training
Working language on boardproportion of native language crew
proportion of crew nationality differentiation
Crew safety trainingaverage annual class hours of organizations
average annual training hours for crew enterprises
Crew handover situationproportion of crew handover records
Implementation of pre shift meeting systempre shift meeting record
statistics of on-board operation accidents
Discussion on on-board safety risk eventsnumber of participants
frequency of discussion
Health status of on-board personnelfrequency of crew medical examination
proportion of chronic occupational diseases
frequency of psychological relief
Operation conditions on boardcontinuous monitoring under closed operation
frequency of on-board operation
Table 5. Capability indicators of physical defence.
Anti-piracy capabilitysafe house
frequency of safety house inspections
Mobile firefighting capabilityquota quantity
instrument pressure (bar)
inspection cycle
Closed space gas detection capabilityalarm concentration
number of false-positives
Video collection capabilitydefinition (image resolution)
signal-to-noise ratio (dB)
Bilge emergency pump discharge capabilitylift (m)
flow (m3/h)
cavitation indicator
Safety warning capabilityvent prompt bar
smoking warning signs
warehouse warning signs
Safe operation support capabilitytag and lock off
Fuel safety protection capabilityinspection frequency of quick closing valve
Personal protective equipment configuration capabilityprotective rope
protective cap
protective clothing
gas protection equipment
protective earplugs
Fixed fire extinguishing capabilitytrigger response value (mg/l)
number of false-positives (times/month)
gas emission rate (l/min)
Ship self-rescue capabilitynumber of lifeboats
number of life rafts
number of lifebuoys
number of life jackets
Fire isolation capabilityairtightness of fire door
alarm device
Water inlet alarm capabilityalarm value of water inlet (mm)
number of false-positives
Table 6. Capability indicators of technical defence.
Ship’s automatic navigation capabilityelectronic chart update (times/month)
GPS accuracy
Bridge information monitoring capabilitycoverage of monitoring indicators
effective information fusion rate
Radar monitoring capabilitydetermination accuracy
anti-interference rate
Emergency communication capabilityinformation fidelity
channel capability (kb)
communication delay (ms)
Route abnormal alarm capabilityalarm value (deviation degree)
number of alarms (times/day)
Automatic collision avoidance capabilityaccuracy of generation
probability of scheme adoption
Meteorological monitoring capabilityaccuracy within 1 year
Fault tolerance of risk alarm systemfault tolerance degree
Network protection capabilityprotection software and hardware
network paralysis response plan
Video surveillance coverage capabilitycoverage
Unsafe behaviour recognition capabilityintelligent recognition rate
Ship–shore cooperative monitoring capabilityVTS visibility in the jurisdiction
visualization degree of remote sensing information
sum of GNSS delay error and inherent error (ms)
LRIT information protection mechanism
GMDSS false alarm rate
Table 7. The specific parameters of the test ship Yukun.
Total length: 116 m | Gross tonnage: 6000 t
Width: 18 m | Speed: 18 kn
Depth: 8.35 m | Voyage (range): 10,000 nm
Design draft: 5.4 m | Construction date: 2008
Table 8. Statistical analysis data at five points of the proposed life cycle.
Statistic | A | B | C | D | E
Count | 100 | 100 | 100 | 100 | 100
Mean | 0.863755 | 0.788881 | 0.653696 | 0.513926 | 0.431334
Std | 0.054391 | 0.081878 | 0.097234 | 0.126745 | 0.180557
Min | 0.75591 | 0.572405 | 0.424012 | 0.210193 | 0.061739
25% | 0.81945 | 0.737842 | 0.59573 | 0.421013 | 0.321971
50% | 0.862128 | 0.78775 | 0.660608 | 0.518999 | 0.434303
75% | 0.899203 | 0.832973 | 0.707531 | 0.588304 | 0.539868
Max | 0.974195 | 1.060506 | 0.86603 | 0.87106 | 0.863276
Table 9. Statistical analysis data for the 36 capability indicators under the three subsystems.
Capability indicator | Characteristic value q_y | Improved characteristic value e_y | Loss factor ω_y | Entropy of capability I_y
Physical defence subsystem (subsystem effectiveness E_j = 0.9421; informatization degree f_j(t) = 39%; route safety factor θ = 0.96; system effectiveness E_total = 0.8485):
Anti-piracy capability | 0.8000 | 1.288 | 1.92% | 0.0248
Mobile firefighting capability | 0.4705 | 0.299 | 3.85% | 0.0115
Closed space gas detection capability | 0.8333 | 1.493 | 5.77% | 0.0862
Video collection capability | 0.7680 | 1.122 | 7.70% | 0.0864
Bilge emergency pump discharge capability | 0.7147 | 0.896 | 9.62% | 0.0862
Safety warning capability | 1.0000 | 2.000 | 11.54% | 0.2309
Safe operation support capability | 1.0000 | 2.000 | 13.47% | 0.2694
Fuel safety protection capability | 0.9333 | 2.528 | 15.39% | 0.3891
Personal protective equipment configuration capability | 1.0000 | 2.000 | 17.32% | 0.3463
Fixed fire extinguishing capability | 0.5457 | 0.431 | 1.92% | 0.0083
Ship self-rescue capability | 0.7680 | 1.122 | 1.92% | 0.0216
Fire isolation capability | 0.8000 | 1.288 | 3.85% | 0.0495
Water inlet alarm capability | 0.5514 | 0.442 | 5.77% | 0.0255
Technical defence subsystem (subsystem effectiveness E_j = 0.8294; informatization degree f_j(t) = 27%):
Ship’s automatic navigation capability | 0.6267 | 0.617 | 6.25% | 0.0386
Bridge information monitoring capability | 0.6618 | 0.717 | 7.81% | 0.0560
Radar monitoring capability | 0.6333 | 0.635 | 9.38% | 0.0596
Emergency communication capability | 0.7467 | 1.025 | 10.94% | 0.1121
Automatic collision avoidance capability | 0.4267 | 0.237 | 12.50% | 0.0297
Meteorological monitoring capability | 0.5973 | 0.5434 | 14.06% | 0.0764
Fault tolerance of risk alarm system | 0.8360 | 1.5111 | 15.63% | 0.2361
Network protection capability | 0.8000 | 1.2876 | 1.56% | 0.0201
Video surveillance coverage capability | 1.0000 | 2.0000 | 3.13% | 0.0625
Unsafe behaviour recognition capability | 0.8360 | 1.5111 | 4.69% | 0.0708
Ship–shore cooperative monitoring capability | 0.8100 | 1.3452 | 6.25% | 0.0841
Ship’s automatic navigation capability | 0.1960 | 0.0428 | 7.81% | 0.0033
Human defence subsystem (subsystem effectiveness E_j = 0.8604; informatization degree f_j(t) = 34%):
Basic personnel support capability | 1.0000 | 2.0000 | 8.86% | 0.1772
Familiarity of decision-makers with ships | 0.1851 | 0.0379 | 10.34% | 0.0039
Risk mitigation exercise situation | 0.4642 | 0.2897 | 11.82% | 0.0342
Working language on board | 0.3698 | 0.1707 | 13.29% | 0.0227
Crew safety training | 0.4949 | 0.3381 | 1.48% | 0.0050
Crew handover situation | 0.7200 | 0.9165 | 2.95% | 0.0271
Implementation of pre-shift meeting system | 0.8000 | 1.2876 | 4.43% | 0.0570
Discussion on on-board safety risk events | 0.8000 | 1.2876 | 5.91% | 0.0761
Health status of on-board personnel | 0.8000 | 1.2876 | 7.38% | 0.0951
Operation conditions on board | 0.2880 | 0.0978 | 8.86% | 0.0087
Stability of personnel on board | 0.5333 | 0.4065 | 23.63% | 0.0961
Table 10. Distribution pattern test results based on the Anderson–Darling test.
Distribution form | Measuring point | Flag (0.05) | Statistic | Critical values | Significance levels (%)
Normal distribution | A | + | 0.262144011 | [0.555, 0.632, 0.759, 0.885, 1.053] | [15, 10, 5, 2.5, 1]
Normal distribution | B | + | 0.240336575 | [0.555, 0.632, 0.759, 0.885, 1.053] | [15, 10, 5, 2.5, 1]
Normal distribution | C | + | 0.447665853 | [0.555, 0.632, 0.759, 0.885, 1.053] | [15, 10, 5, 2.5, 1]
Normal distribution | D | + | 0.341666108 | [0.555, 0.632, 0.759, 0.885, 1.053] | [15, 10, 5, 2.5, 1]
Normal distribution | E | + | 0.30482838 | [0.555, 0.632, 0.759, 0.885, 1.053] | [15, 10, 5, 2.5, 1]
Exponential distribution | A | - | 41.12834048 | [0.917, 1.072, 1.333, 1.596, 1.945] | [15, 10, 5, 2.5, 1]
Exponential distribution | B | - | 37.26900374 | [0.917, 1.072, 1.333, 1.596, 1.945] | [15, 10, 5, 2.5, 1]
Exponential distribution | C | - | 33.61289887 | [0.917, 1.072, 1.333, 1.596, 1.945] | [15, 10, 5, 2.5, 1]
Exponential distribution | D | - | 26.24084953 | [0.917, 1.072, 1.333, 1.596, 1.945] | [15, 10, 5, 2.5, 1]
Exponential distribution | E | - | 14.97244258 | [0.917, 1.072, 1.333, 1.596, 1.945] | [15, 10, 5, 2.5, 1]
Table 11. Test results of the three curve-shape fittings.
Curve form | Flag | Statistic | Critical values | Significance levels (%)
Exponential | - | 61.50365 | [0.921, 1.075, 1.338, 1.602, 1.952] | [15, 10, 5, 2.5, 1]
Logarithmic | - | 12.30670 | [0.426, 0.562, 0.659, 0.768, 0.905, 1.009] | [25, 10, 5, 2.5, 1, 0.5]
Weibull | + | 0.447665 | [0.362, 0.395, 0.427, 0.462, 0.506] | [15, 10, 5, 2.5, 1]
Table 12. Traditional accident statistical data from the IHS database, 2011–2021.
Subject | Distribution type | Number | Subject total | Overall total | Weight p_ij
bridge subsystem | navigation operators | 2694 | 5786 | 16,146 | 16.69%
bridge subsystem | navigation monitoring and warning | 2076 | | | 12.86%
bridge subsystem | navigation communication equipment | 1016 | | | 6.29%
engine room subsystem | marine engineer | 1519 | 3461 | | 9.41%
engine room subsystem | engine room monitoring platform | 777 | | | 4.81%
engine room subsystem | maintenance equipment | 1165 | | | 7.22%
cargo hold subsystem | operators | 1032 | 2643 | | 6.39%
cargo hold subsystem | monitoring system | 1032 | | | 6.39%
cargo hold subsystem | emergency equipment | 579 | | | 3.59%
deck subsystem | staff | 2321 | 3256 | | 14.37%
deck subsystem | protection equipment | 381 | | | 2.36%
deck subsystem | ship rescue equipment | 554 | | | 3.43%
living cabin subsystem | personal protective equipment | 536 | 1000 | | 3.32%
living cabin subsystem | personnel health protection unit | 464 | | | 2.88%
Table 13. Statistical data from the maintenance records of the tested ship over the past year.
Subject | Distribution type | Failure x_ij | Criterion x_ej | Sailing times m | Sailing time correction factor δ | Probability
bridge subsystemnavigation operators16655100/2700.9695
navigation monitoring and warning740.9800
navigation communication equipment420.9771
engine room subsystemmarine engineer1240.9657
engine room monitoring platform920.9485
maintenance equipment530.9809
cargo hold subsystemoperators640.9828
monitoring system1340.9628
emergency equipment620.9657
deck subsystemstaff1260.9771
protection equipment110.9828
ship rescue equipment310.9657
living cabin subsystempersonal protective equipment310.9657
personnel health protection unit210.9771
Table 14. Effectiveness value based on the traditional accident probability algorithm.
Common framework of the SRMS based on spatial distribution:
Subject | Effectiveness | Weight | Total E
bridge subsystem | 0.9283 | 35.84% | 0.9202
engine room subsystem | 0.8984 | 21.44% |
cargo hold subsystem | 0.9138 | 16.37% |
deck subsystem | 0.9273 | 20.17% |
living cabin subsystem | 0.9435 | 6.19% |
Table 15. Effectiveness value based on the entropy of capability construction.
SRMS based on capability construction:
Subject | Effectiveness | Total
human defence subsystem | 0.8604 | 0.8485
physical defence subsystem | 0.9421 |
technical defence subsystem | 0.8294 |

