How will behaviour detection technology transform the future of airports? Aviation security stakeholders are making substantial investments in new technologies in order to predict criminality and get ahead of those with malicious intent. Behaviour detection technology, therefore, has the potential to revolutionise our approach to security. The Transportation Security Administration (TSA) is in the process of developing the Behaviour Detection and Analysis Program (BDA), and a range of private companies are racing to deliver the best digital solution to the market. However, from civil liberties groups to government auditors, some suggest that the approach is ‘unscientific’ and therefore susceptible to bias. Yet, despite the surrounding controversy, these technologies are expected to transform the future of aviation security. Selma Blum discusses the future of aviation security – the era of emotional machines.
We should start by explaining the main concepts of behaviour technology and its implications for security. We need to explore how it works, what controversies exist and why artificial intelligence and machine learning are so important. Are machines reliable in terms of understanding human behaviour? Will they be capable of keeping passengers’ information safe and preserving our privacy? Can those machines predict intentions better than human security agents? Why is it taking so long for behavioural technology to be implemented? The main objective of this article is to clarify these issues and help decision-makers to take action. Additionally, it will examine how these new technologies will transform and improve customers’ experiences.
The era of huge queues and shoe removal, of false alarms, unfriendly agents, delays and uncomfortable situations will, most likely, be consigned to the past. Behaviour detection technology will help us provide passengers with a seamless and safe journey through our airport terminals and onto the aircraft. Unfortunately, all major changes to security processes take time and serious consideration due to the potential impact on people’s lives. Decisions relating to sensitive issues like privacy, consistency, data storage location, safety and IT infrastructure must be well defined and justified in advance in order for a new security measure to work properly and deliver the best results.
Aviation Industry and Behaviour Technology Challenges
The aviation industry is growing fast. The sector is experiencing hyper-competition, increasing numbers of passengers, new players and routes. However, crowded places are vulnerable targets and levels of criminality have risen in line with industry growth. It is essential to create methods and technological solutions capable of detecting potential threats in advance. Travellers now have much higher expectations regarding airline and airport service than ever before. Therefore, it is not only necessary to offer a pleasant experience but a safe and secure one as well, and airports must invest in technological infrastructure in order to survive in this frenetic environment. There is a direct correlation between a passenger’s experience at the security checkpoint and their propensity to spend money in the airport. In the digital era, customers’ needs can be more readily met through faster and non-invasive solutions on the go. Digital behaviour detection technology has the potential to satisfy those needs and promises to be more efficient than traditional methods.
The ‘digital revolution’ is the result of significant growth in computational power and data. There has been enormous improvement in capabilities, resulting in the intensive application of Artificial Intelligence (AI) and robotics to security-sensitive environments, providing non-invasive threat identification, predicting criminal actions, detecting lies, tracking illicit intentions and even identifying diseases through thermal video analysis. Consequently, stakeholders in both the public and private sectors have begun to actively invest in technologies in order to gain the power of prediction, improve security intelligence, reduce costs and generate new business opportunities. As a result, these technologies are becoming more advanced and widely available at lower costs. However, this rise in availability simultaneously raises the chances of mismanagement, representing a potential threat. This is now being pointed to as one of the main reasons the technology is not yet widely implemented.
How does it work?
What kind of information is processed by behaviour detection technology? The idea behind behaviour detection is that internal, emotional processes produce physical, detectable changes, which may be captured by sensors, processed and interpreted as an alert, depending on the chosen algorithms. It is a multi-modal approach in which everything can be taken into consideration. Behaviour detection in the field of civil aviation relies on both verbal and non-verbal indicators. It is based on the premise that a criminal exhibits indicators that deviate from the norm. Those parameters are divided into two categories: micro and macro. Facial expressions, biometrics, temperature, perspiration, lack of eye contact or unusual possessiveness of hand luggage belong to the micro group. Macro behaviours include larger movements, such as those strategically aimed at hiding the face or entire body, or conducting surveillance, etc.
“…whichever indicators are selected for analysis, developers must ensure that their algorithms are free from bias…”
Airport terminals can be fitted with 360-degree cameras with high definition graphical interfaces, geo-tracking maps, temperature sensors, biophysiological sensors and emotion feedback. This allows security agents to identify suspicious individuals based on several parameters. Various types of data may be collected, clustered, stored and processed, and activity may be analysed in real time to reveal malicious patterns in the earliest stages of execution. Other data may also be tracked and taken into consideration; for example, social demographic statistics, criminal records or personal data may contribute to more detailed analysis. TSA’s Optimized Behaviour Detection (OBD) comprises 36 behavioural parameters, pared down from an initial list of 96 indicators. However, whichever indicators are selected for analysis, developers must ensure that their algorithms are free from bias. To mitigate bias, it is necessary to ensure that the data informing the algorithms are sufficiently diverse. This is achieved by drawing data from people of different cultural groups, age groups, educational backgrounds, etc. This helps developers to identify and minimise biases, making systems more adaptive, consistent and reliable.
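One simple way to surface bias of this kind is to compare a system’s flag rate across demographic groups in historical data. The sketch below is purely illustrative – the group labels, the data and the 0.8 audit rule of thumb are assumptions, not TSA practice:

```python
from collections import defaultdict

def flag_rates(records):
    """Fraction of individuals flagged per demographic group.

    records: iterable of (group_label, was_flagged) pairs.
    Returns {group: flag_rate}.
    """
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, was_flagged in records:
        total[group] += 1
        flagged[group] += int(was_flagged)
    return {g: flagged[g] / total[g] for g in total}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group flag rate.

    Values near 1.0 suggest even treatment; a common (hypothetical
    here) audit rule of thumb sends ratios below 0.8 for review.
    """
    return min(rates.values()) / max(rates.values())

# Illustrative data only: two made-up groups, A and B.
records = [("A", True), ("A", False), ("A", False), ("A", False),
           ("B", True), ("B", True), ("B", False), ("B", False)]
rates = flag_rates(records)
print(rates)                    # {'A': 0.25, 'B': 0.5}
print(disparate_impact(rates))  # 0.5 -> would warrant an audit
```

An audit of this sort says nothing about why the rates differ; it only tells developers where to look more closely at their training data and indicator choices.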
“…humans are unable to accurately determine things like temperature, heart rate, blood density, movement and other precision parameters…”
The Era of Intelligent Environments
In the old days, people needed to learn how to use machines; now, however, machines are expected to learn, adapt to and even anticipate our preferences and behaviours. Usually, this is designed to happen in unobtrusive and transparent ways in order to provide personalised services. Sensors are used to perceive different situations, allowing machines to understand the context, infer actions and make decisions without requiring human input. The main objective is to combine the strengths of different techniques, developing a domain of independent computer learning. However, this is complicated by the fact that the system comprises a combination of hardware, software, networks and humans, all of which must cooperate harmoniously to transform raw data into logical decisions.
Humans are poor observers compared with machines because they cannot process large amounts of data as quickly as machines. Also, humans are unable to accurately determine things like temperature, heart rate, blood density, movement and other precision parameters. Independently, none of these can produce reliable results; however, when combined, they can be excellent parameters of analysis for security enforcement agents. This is the main difference between data and information. Figures can only mean something when we understand their context. How do we differentiate a passenger with a phobia of flying from someone who is preparing to commit a crime on board the flight? Both display indicators of stress but stress levels tell us nothing when analysed independently. The value of behaviour detection tools is their ability to combine parameters in order to arrive at a reasonable conclusion.
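A minimal sketch of this combining idea follows. The indicator names, weights and corroboration rule are invented for illustration and do not describe any deployed system; the point is simply that stress signals are only escalated when corroborated by behavioural context:

```python
def risk_score(indicators):
    """Combine micro and macro indicators into a single score.

    `indicators` maps indicator names to values in [0, 1].
    Stress-type signals (heart rate, perspiration) only count in
    proportion to corroborating behavioural signals -- so a nervous
    flyer does not score like a genuine threat. All names and the
    weighting rule here are illustrative assumptions.
    """
    stress = (indicators.get("elevated_heart_rate", 0.0)
              + indicators.get("perspiration", 0.0)) / 2
    behaviour = (indicators.get("surveillance_activity", 0.0)
                 + indicators.get("face_concealment", 0.0)) / 2
    # Stress is scaled by behaviour: no corroboration, no escalation.
    return behaviour + stress * behaviour

nervous_flyer = {"elevated_heart_rate": 0.9, "perspiration": 0.8}
surveilling = {"elevated_heart_rate": 0.9, "perspiration": 0.8,
               "surveillance_activity": 0.7, "face_concealment": 0.6}
print(risk_score(nervous_flyer))  # 0.0 -- stress alone is not escalated
print(risk_score(surveilling))    # 1.2025 -- stress corroborated by behaviour
```

Under this toy rule the anxious passenger with a phobia of flying scores zero, despite an elevated heart rate, because no macro behaviour corroborates the stress signal.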
Additionally, if a computer produces a false positive, it has the potential to learn from this and avoid making the same mistake again. Once predictive analytics identify a high-risk individual, the security team can instigate an investigation; for example, by conducting an interview or luggage search to find out if the machine outcome is true or false, or to decide if further measures are necessary.
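One way to picture this feedback loop is a toy perceptron-style weight update – not a description of any vendor’s actual learning method. After each investigation resolves an alert as true or false, the weights of the indicators that fired are nudged accordingly:

```python
def update_weights(weights, indicators, predicted_alert, confirmed, lr=0.1):
    """Nudge indicator weights after an investigation resolves an alert.

    weights / indicators: dicts keyed by indicator name.
    predicted_alert: True if the system raised an alert.
    confirmed: True if the investigation confirmed a genuine threat.
    A false positive (alert raised, not confirmed) lowers the weights
    of the indicators that fired; a miss would raise them. The names,
    rule and learning rate are illustrative assumptions.
    """
    error = int(confirmed) - int(predicted_alert)  # -1, 0 or +1
    return {name: weights.get(name, 0.0) + lr * error * value
            for name, value in indicators.items()}

weights = {"perspiration": 0.5, "face_concealment": 0.5}
fired = {"perspiration": 1.0, "face_concealment": 0.0}
# System alerted on perspiration alone; a search found nothing (false positive).
weights = update_weights(weights, fired, predicted_alert=True, confirmed=False)
print(weights)  # {'perspiration': 0.4, 'face_concealment': 0.5}
```

After the false positive, the weight on perspiration drops, so the same signal alone is less likely to trigger the next alert – the “learning from its mistakes” described above.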
Technology is now capable of detecting delinquent behaviour and of giving direction to an investigation by correctly identifying suspects without disturbance. However, real-time analyses depend on the co-operation of several organisations. The International Air Transport Association (IATA) and Airports Council International (ACI) announced a collaborative approach to improving passengers’ journeys with the launch of the New Experience in Travel and Technologies (NEXTT) initiative. The concept relies on early validation of passengers’ identities, and controlled access to this information by various public and private stakeholders on an authorised-to-know basis, so that each passenger can be recognised and processed in the most efficient way. In Brazil, Enterprise Consulting formed an association with other companies under the supervision of the Chinese government in order to provide AI solutions. Their products are now deployed at several airports, including Shanghai Pudong, Jinan, Hong Kong and Singapore Changi.
“…how do we differentiate a passenger with a phobia of flying from someone who is preparing to commit a crime on board the flight? Both display indicators of stress but stress levels tell us nothing when analysed independently…”
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), a peer-reviewed journal, published research led by the University of Manchester’s Dr. Omar Costilla Reyes, in which a gait analysis system operating through floor sensors was developed to assess the unique patterns associated with individuals’ footsteps. The researchers trained their AI system on a database of 20,000 footsteps from 127 individuals and found that it could identify individuals by their footsteps with an accuracy of about 99.3 percent.
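The published system uses a learned deep model; as a much simpler illustration of the underlying idea (all data, feature lengths and names below are invented), footstep pressure profiles can be matched to enrolled individuals by nearest-centroid comparison:

```python
import math

def centroid(samples):
    """Average several footstep pressure profiles into one template."""
    return [sum(vals) / len(vals) for vals in zip(*samples)]

def identify(footstep, templates):
    """Return the enrolled identity whose template is closest.

    footstep: a pressure profile (list of floats from floor sensors).
    templates: {identity: template_profile}.
    Plain Euclidean distance; the real system learns far richer
    spatio-temporal features than this sketch.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda ident: dist(footstep, templates[ident]))

# Toy enrolment: three pressure readings per footstep, two people.
templates = {
    "passenger_a": centroid([[0.9, 0.4, 0.1], [0.8, 0.5, 0.2]]),
    "passenger_b": centroid([[0.2, 0.6, 0.9], [0.3, 0.5, 0.8]]),
}
print(identify([0.85, 0.45, 0.15], templates))  # passenger_a
```

The appeal for airports is that, unlike face recognition, nothing is asked of the passenger: identification happens passively as they walk.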
Meanwhile, thermal analysis can be used to detect diseases identifiable by a high body temperature, such as the Coronavirus. Ironically, the Chinese company Wuhan Guide Infrared is one of the biggest and most respected producers of thermal devices in the world.
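At its simplest, this kind of screening is a calibrated threshold check. The cut-off below is an assumption for illustration, not a clinical standard; real deployments calibrate per device and per environment:

```python
FEVER_THRESHOLD_C = 38.0  # assumed cut-off, illustrative only

def screen(readings_c):
    """Return the indices of thermal readings at or above the threshold."""
    return [i for i, temp in enumerate(readings_c) if temp >= FEVER_THRESHOLD_C]

print(screen([36.6, 38.4, 37.1, 39.0]))  # [1, 3] -- two passengers flagged
```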
“…a gait analysis system was developed and which operates through floor sensors, assessing unique patterns associated with individuals’ footsteps. The researchers trained their AI system on a 20,000-footstep database based on 127 individuals and found that it could identify individuals based on their footsteps with an accuracy of about 99.3%…”
In the field of voice analysis and polygraph machines, the Israeli company Nemesysco presents an innovative and detailed voice analysis and emotion detection solution, based on patterns in voice fragments and powered by artificial intelligence algorithms. It claims to be able to detect voice manipulations in real time, as well as emotions such as excitement, happiness, rage, confusion, embarrassment, fear and sadness, producing a visual display of the emotions being experienced by the subject.
All such solutions, working together, potentially hold inestimable value for airports in the near future.
Implementation and Opinions Worldwide
Patricia Cogswell, Acting Deputy Administrator of the Transportation Security Administration (TSA), has said that the technology is not yet where she would like it to be and that TSA will use its ‘convening authority’ to help advance this and other technologies. Similarly, Dr Peter Collings, a forensic psychiatrist and member of the Canadian Association of Threat Assessment, also claims that, “We are not there yet. Human behaviour is complex and the solution must be a combination of police, neurosciences and machines to work well”.
Paul Ekman, a ‘micro-expressions’ researcher, is more optimistic: “Usually, studies related to behavioural analysis are based on the expected physiological response caused by emotions such as fear, shame, or fear of being discovered in those people who hide something. So, machines can be perfect to identify those patterns”. In a similar vein, Philip Mulley, CEO of SecureWorx in Australia, argues that their technology, “looks at and identifies behaviour through a series of skeletal movements” and that they can also snapshot the person’s heartbeat at the same time. “By correlating both the movements and an elevated heart rate to confirm our interest in certain persons, we have the technology to enable drug and alcohol testing through CCTV.” However, this is in the early stages of development.
Amir Liberman, the CEO of the aforementioned Nemesysco from Israel, says that, “We need an emotion microscope. We are in the age of emotional machines. Layered Voice Analysis (LVA) is the advent of this microscope of feelings.” Liberman correctly notes that, “We observe the poor accuracy of magnetometers, and yet no one is considering ceasing using them.” In Brazil, Rafael Cavalcante and Francisco Linhares, partners of Enterprise Consulting, design several AI detection devices. Recently, in association with a pool of companies supervised by the Chinese government, they developed a very interesting pair of smart glasses for police use. Reflecting a client-led mentality, they claim that their “company executes exactly what the client wants. It is all adaptable to clients’ needs. Whatever they want, we project, find partners and deliver it. We are Brazilians and Brazil is recognised for its adaptability. So, that’s how we work at Enterprise Consulting.”
Behaviour technology is a very controversial topic; however, the impact it could have on how airports work in the future is significant. We are already seeing marked improvements in the availability, affordability and power of current digital solutions, and these are only going to become more intelligent, adaptable and reliable. However, some challenges must still be overcome before these products are more widely deployed and relied upon by those responsible for keeping passengers safe and secure. First, data privacy issues must be identified and resolved. Second, investments must be made in airport infrastructure and personnel training in order to safely and efficiently accommodate any changes in security processes, according to relevant legislation. Third, the issue of bias must be taken seriously when implementing new behaviour technology. No passenger should be discriminated against based on their appearance, race, religion, gender or social background. The new era of behaviour analysis must have the same parameters, rules, and ethics towards all travellers, always respecting universal human rights.
Selma Blum is a forensic psychologist and a partner in a technology start-up in Brazil, Spain and Israel. She also has several years of experience in behaviour analysis and criminal profiling.