Abstract

The Intelligent Cabin Management System (ICMS), a solution for the aviation and rail industries, has been undergoing continuous advancement as new technologies and capabilities emerge.

Thanks to ever-improving facial recognition algorithms and their implementations, artificial intelligence (AI) has vastly improved efficiency and reliability in domains such as user authentication, behavior analysis, safety, protection, threat detection, and object tracking. In addition, onboard monitoring of passengers' vital signs is gaining prominence.

Introduction

We, at Cyient Limited, have developed an Intelligent Cabin Management Solution (ICMS) capable of carrying out cabin operations with a focus on passenger safety, security, and health. While this application is suitable for all kinds of passenger cabins in aircraft and rail, it is especially suited for Urban Air Mobility (UAM) applications, where four- to six-passenger aircraft would incrementally graduate from initial single-pilot-operated platforms to completely autonomous ones, thus gradually replacing the human element inside passenger cabins.

Functionalities

Cyient’s ICMS can perform the following tasks during flight operations:

  • Assure cabin entry by authorized crew and passengers during boarding
  • Guide and ensure that passengers occupy their allocated seats, assigned during booking 
  • Ensure passengers keep hand luggage inside specified overhead/under-seat bins 
  • Ensure passengers comply with air travel advisories such as using seat belts and raising/lowering window curtains during take-off/landing
  • Continuously monitor passengers to distinguish between offensive and non-offensive gestures, and preempt unexpected behavior, arguments, or untoward incidents by raising a cautionary flag for timely intervention
  • Identify permitted objects carried by passengers, and classify and continuously track those which may be used with offensive intent, such as walking sticks, umbrellas, sports racquets, cricket bats, hockey sticks, etc.
  • Raise flags for left/unclaimed luggage on seats
  • Detect vital health parameters such as heart rate and breathing rate of passengers in a contactless manner, and make the information available for onboard display as well as to the ground control center.

To meet the above objectives, the ICMS application has been designed to integrate software modules for face detection, face identification, gesture recognition and object detection, identification and tracking, and radar-based vital sign detection.
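
A minimal sketch of how such modules could be composed behind a common interface is shown below; the class and method names are illustrative assumptions rather than the actual ICMS design.

    # Illustrative only: independent analysis modules plug into a cabin
    # manager that runs them on each camera frame and collects their alerts.
    from dataclasses import dataclass, field
    from typing import List, Protocol

    @dataclass
    class Alert:
        module: str      # e.g. "face_id", "object_tracking", "vitals"
        severity: str    # "info", "caution", or "alarm"
        message: str

    class CabinModule(Protocol):
        def process(self, frame) -> List[Alert]:
            ...

    @dataclass
    class CabinManager:
        modules: List[CabinModule] = field(default_factory=list)

        def on_frame(self, frame) -> List[Alert]:
            alerts: List[Alert] = []
            for module in self.modules:
                alerts.extend(module.process(frame))  # each module reports its own flags
            return alerts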

How it works

A passenger's credentials are captured as soon as a booking is made for a journey by aircraft/train/UAM, using appropriate identification documents and pictures. These credentials are stored in the customer database and used to identify the passenger when she/he reports for boarding. At the time of boarding, the passenger's credentials and facial captures from onboard cameras are matched to authenticate entry into the cabin.

The application maps all passenger-occupied seats against the seat allocations made during the reservation process and raises cautionary flags if wrong seating is detected. It detects objects carried by passengers and raises prohibitory alarms to dissuade the carrying of banned items into the cabin. It also identifies items that are permitted but could be used aggressively to harm fellow passengers, categorizes them appropriately for continuous tracking during the journey, and raises cautionary flags when such objects are taken out during the flight. In addition, the application continuously monitors passengers' facial gestures to detect those which can be classified as offensive and raise timely cautionary flags to the pilot/operations control.
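
As an illustration of the seat-allocation check described above, the hypothetical helper below compares the seat each recognized passenger occupies with the seat assigned at booking and returns cautionary flags for any mismatch; the identifiers and seat labels are assumptions for the example.

    # Sketch only: seat-allocation cross-check against booking data.
    def check_seating(assigned: dict, observed: dict) -> list:
        """assigned/observed map passenger_id -> seat label, e.g. {"P123": "2A"}."""
        flags = []
        for passenger_id, seat in observed.items():
            booked = assigned.get(passenger_id)
            if booked is None:
                flags.append(f"Caution: unidentified passenger in seat {seat}")
            elif seat != booked:
                flags.append(f"Caution: {passenger_id} seated in {seat}, booked {booked}")
        return flags

    # Example: one passenger detected in the wrong seat
    print(check_seating({"P123": "2A", "P456": "2B"}, {"P123": "2B"}))
    # -> ['Caution: P123 seated in 2B, booked 2A']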

Solution Implementation

Images or image sequences captured during the boarding process and inside the passenger cabin are fed into the system in real time through a set of high-resolution cameras and processed by the face detection module. The detected facial images are then processed by the face recognition module to extract unique identification features for matching against the image library in the customer database. Figure 1 below depicts the architecture of the entire process.

Figure 1: ICMS architecture
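
The control flow behind Figure 1 can be sketched roughly as follows, with detect_faces(), extract_features(), and match() as hypothetical stand-ins for the face detection module, the face recognition module, and the database lookup; only the orchestration is shown.

    # Sketch of the per-frame authentication flow (helper functions assumed).
    def authenticate_frame(frame, customer_db, detect_faces, extract_features, match):
        """Return (passenger_id, score) per detected face; passenger_id is None if unmatched."""
        results = []
        for face_crop in detect_faces(frame):                    # face detection module
            features = extract_features(face_crop)               # face recognition module
            passenger_id, score = match(features, customer_db)   # lookup in customer database
            results.append((passenger_id, score))
        return results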

The sequence of processes followed for solution implementation is as follows: 

Verification 

This involves one-to-one matching of the individual's captured image with the reference images in the library, which were provided by the customer during booking.
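
Assuming the recognition module reduces each face to a fixed-length feature vector (an embedding), one-to-one verification can be sketched as a similarity check against the stored reference; the cosine measure and the 0.6 threshold below are illustrative assumptions.

    # Sketch only: 1:1 verification of a captured face against the booking reference.
    import numpy as np

    def verify(captured: np.ndarray, reference: np.ndarray, threshold: float = 0.6) -> bool:
        similarity = float(np.dot(captured, reference) /
                           (np.linalg.norm(captured) * np.linalg.norm(reference)))
        return similarity >= threshold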

Video indexing 

This involves labeling faces in a video/image. 

Identification

At this stage, noise and other irrelevant aspects are removed from the captured image to let the algorithm match and recognize an individual out of multiple photographs in the database.

Mask detection 

An individual's face capture is processed at this stage to determine whether they are wearing a mask. This is applicable when COVID-appropriate behavior monitoring is to be implemented inside the passenger cabin.

Object detection 

In this stage, each passenger is scanned on entry into the cabin to detect all objects being carried in. The system identifies objects that are permitted but fall into the likely-offensive category, continuously tracks their location, and raises cautionary flags if they are lifted during the flight.
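
One simple way to realize the "lifted during the flight" flag is to compare each tracked object's current position with the position recorded when it was stowed; the detector output format and the pixel threshold below are assumptions for illustration.

    # Sketch only: flag likely-offensive objects that have moved from their stowed position.
    def flag_lifted_objects(stowed: dict, current: dict, max_shift_px: float = 50.0) -> list:
        """stowed/current map object_id -> (x, y) centre of its bounding box in the frame."""
        flags = []
        for object_id, (x0, y0) in stowed.items():
            x1, y1 = current.get(object_id, (x0, y0))
            shift = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            if shift > max_shift_px:
                flags.append(f"Caution: {object_id} moved {shift:.0f}px from stowed position")
        return flags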

Person behavior 

Here, facial expressions of an individual are processed to determine whether their gestures are aggressive.
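
Because a single misclassified frame should not trigger an alert, the per-frame expression labels can be debounced over a short window before a flag is raised; the window size, hit count, and label names below are illustrative assumptions.

    # Sketch only: raise a behavior flag only if "aggressive" persists across frames.
    from collections import deque

    class BehaviorFlagger:
        def __init__(self, window: int = 30, min_hits: int = 20):
            self.history = deque(maxlen=window)   # most recent per-frame labels
            self.min_hits = min_hits

        def update(self, label: str) -> bool:
            """label is the per-frame expression class, e.g. 'neutral' or 'aggressive'."""
            self.history.append(label)
            return self.history.count("aggressive") >= self.min_hits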

Radar-Based Vital Signs Monitoring

This is a contactless process and maintains complete privacy of individuals. Radar sensors are used to detect the heart rate and breathing rate of passengers.
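
In simplified terms, the heart and breathing rates can be read off as the dominant spectral peaks of the chest-displacement signal measured by the radar within each physiological frequency band; the sampling rate and band limits below are illustrative assumptions, not the values used in the ICMS.

    # Sketch only: rate extraction from a radar chest-displacement signal via FFT.
    import numpy as np

    def dominant_rate_per_min(displacement: np.ndarray, fs: float, band_hz: tuple) -> float:
        """Return the strongest frequency within band_hz, expressed per minute."""
        spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
        freqs = np.fft.rfftfreq(len(displacement), d=1.0 / fs)
        mask = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
        return float(freqs[mask][np.argmax(spectrum[mask])] * 60.0)

    # Typical bands: breathing ~0.1-0.5 Hz (6-30 breaths/min), heart ~0.8-2.0 Hz (48-120 bpm).
    # breathing_rate = dominant_rate_per_min(signal, fs=20.0, band_hz=(0.1, 0.5))
    # heart_rate     = dominant_rate_per_min(signal, fs=20.0, band_hz=(0.8, 2.0))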

Further analysis of the passenger's health condition based on heart rate and breathing rate can be carried out using application software.
The concept of radar health monitoring is explained in Figures 2 and 3 below.

Figure 2: Heart rate monitoring using radar

Figure 3: Typical variation of heart rate and breathing pattern based on different health conditions

Challenges and Mitigations 

Face detection systems are becoming increasingly vital for implementing security measures in all walks of life. The challenge lies in accurately detecting faces in a captured image, as even small environmental variations such as changes in ambient light, occlusions, shifts in position or posture, changes in backdrop, and aging can severely degrade detection, to the point of missing faces entirely in a captured shot.

A reliable and accurate face recognition system should be able to detect and identify a face irrespective of prevailing conditions and ambience. The challenge lies not only in accurate identification but also in recognizing facial gestures and categorizing them appropriately. As the range of applications expands, the system's complexity increases manifold, making accurate detection even harder.

Factors that affect accurate detection are of two broad types:

  • Intrinsic factors - These arise from the physical nature of the face itself, independent of the observer. They are further divided into intrapersonal and interpersonal factors, where intrapersonal factors stem from variations in an individual's facial appearance, for example, aging, expression, and attributes like facial hair, cosmetics, and glasses. 
  • Extrinsic factors - These are attributable to variations in facial appearance caused by the interaction of incident light with the target face and the observer. They include illumination, pose, scale, and imaging parameters such as camera resolution, focus, noise, and the presence of similar faces.

Developing a highly reliable face recognition system requires incorporating the latest technological developments, with special attention to every detail captured by the monitoring systems. As real-world data is limited, expensive, and time-consuming to collect, sourcing the right data to curate, label, train, test, and validate such systems becomes even more challenging.
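
One common mitigation for limited real-world data is synthetic augmentation of the images that are available; the sketch below applies random illumination shifts and occlusion patches, two of the variations discussed above, and its parameters are illustrative assumptions rather than a production recipe.

    # Sketch only: simple augmentations to enlarge a limited face-image dataset.
    import numpy as np

    def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
        """image is an HxWxC uint8 array; returns a randomly perturbed copy."""
        out = image.astype(np.float32)
        out *= rng.uniform(0.6, 1.4)                      # illumination variation
        h, w = out.shape[:2]
        ph, pw = h // 4, w // 4                           # occlusion patch size
        y, x = rng.integers(0, h - ph), rng.integers(0, w - pw)
        out[y:y + ph, x:x + pw] = 0                       # synthetic occlusion
        return np.clip(out, 0, 255).astype(np.uint8)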


Another challenge is maintaining the accuracy of heart rate and breathing rate measurements in the presence of other moving objects in a complex cabin scenario. A detailed model for training the algorithms is essential for greater accuracy and, hence, the usefulness of the radar data.

Conclusion

Facial recognition systems are in great demand and require a high level of accuracy and reliability. As facial images are often captured in their natural environment, backgrounds can be complex, with drastic variations in illumination. These systems therefore necessitate careful mitigation of challenges due to aging, occlusions, degree of illumination, variation in resolution, expression, and pose. Also, with increasingly complex lifestyles, temporal changes in health parameters become critical and need to be monitored onboard so that autonomous decisions can be taken on UAM emergency landing or rerouting to the nearest healthcare facility.
All these challenges can be addressed through appropriate technology and algorithms, opening up opportunities for innovation in the future.

About the Authors


Ajay Kumar Lohany

Ajay Kumar Lohany is an aeronautical engineer with specialization in avionics systems. He holds a master’s degree in computer science and modeling and simulation. He has served in the Indian Air Force as a flight test and instrumentation engineer. With over 32 years of industry experience, he takes keen interest in building technological solutions that help solve problems in the aerospace and rail domains.

 

Ranadeep Saha

Ranadeep Saha is an electronics and communication engineer with specialization in microwave and radar technologies and systems. He has over 20 years of industry experience in microwave and radar system research, design, and development. He has primarily worked on the design and development of radars and other microwave systems for detection, sensing, and tracking, progressing from design engineer to function lead to R&D head for radars. He has extensive experience in the application of microwave and radar technology in the defence, space, government, and commercial markets.

About Cyient

Cyient (Estd: 1991, NSE: CYIENT) is a consulting-led, industry-centric, global Technology Solutions company. We enable our customers to apply technology imaginatively across their value chain to solve problems that matter. We are committed to designing tomorrow together with our stakeholders and being a culturally inclusive, socially responsible, and environmentally sustainable organization.

For more information, please visit www.cyient.com