Intention recognition aims to infer the actions and goals of one or more agents from a series of observations of the agents' actions and of the environmental conditions. The field has proven remarkably effective at providing personalised support across many applications, and it connects to many fields of study, such as medicine, human-computer interaction, and sociology.
One broad and extremely useful application domain is undoubtedly wearable technology: smart devices that can be used in an unobtrusive and comfortable way. Among the obvious benefits are real-time data collection (activity levels, sleep, heart rate, and more), continuous monitoring (which improves the quality of the analysis), prediction and alerting (an increased risk of a potentially fatal health event can be predicted and healthcare providers alerted), and patient empowerment (making users more aware of the consequences of their lifestyle changes).
Support for elderly people, patients, and the disabled has become increasingly advanced (and at the same time interesting) in recent years thanks to a wide variety of new devices. These devices can understand a user's implicit intentions, anticipate how their condition may evolve, and communicate the various data they store, informing not only the owners but, most importantly, any doctor or nurse who must decide what to do next.
Wearable Sensor-Based Hand Gesture Recognition
Human-robot interaction (HRI) involves artificial agents (robots) capable of both perceiving and acting in the physical world. In this setting, gesture spotting can be implemented with a neural network trained on gesture patterns, while the interpretation of the given context can be handled by a hierarchical hidden Markov model (HHMM), a statistical model of sequential data used for recognition; the neural network determines whether the HHMM-based recognition module should be applied at all.
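The two-stage scheme can be sketched as follows. This is an illustrative toy, not the paper's implementation: a single logistic unit stands in for the spotting network, and a plain discrete HMM forward pass stands in for the HHMM scorer; all model parameters and gesture names are assumptions.

```python
import numpy as np

def gate(window, w, b):
    """Logistic unit standing in for the spotting network:
    returns 1 if the feature window is judged to contain a gesture."""
    score = 1.0 / (1.0 + np.exp(-(np.dot(w, window) + b)))
    return 1 if score > 0.5 else 0

def hmm_log_likelihood(obs, start, trans, emit):
    """Forward algorithm on a discrete HMM: log P(obs | model)."""
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return np.log(alpha.sum())

def recognise(window, obs_symbols, w, b, models):
    """Run the HMM-based module only when the gate fires."""
    if gate(window, w, b) == 0:
        return None  # not judged to be a gesture: skip recognition
    # pick the gesture model with the highest likelihood
    return max(models, key=lambda name: hmm_log_likelihood(obs_symbols, *models[name]))
```

The key design point carried over from the text is the gating: the (cheaper) discriminative spotter runs on every window, and the sequential model is only consulted for windows flagged as gestures.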
As shown in the figure above, the main components of the smart assisted living system (SAIL) are a body sensor network (BSN), a companion robot, a smartphone (or PC), and a remote health provider. The inertial sensors worn by the human subject collect three-dimensional angular velocity and three-dimensional acceleration from different body parts, such as the foot, hand, and chest, capturing vital signs and motion, and send them to the companion robot. The data is transferred to and stored on a mobile device, such as a smartphone or PDA carried by the subject, which forwards it to a PC over WiFi. The PC is the component that interprets the gestures the person makes, and it also sends the corresponding commands to control the robot. The remote health provider, a nurse or physician, evaluates the subject's health status and, if necessary, remotely controls the companion robot through a web-based interface and a joystick to observe and help the subject.
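The data flow above can be made concrete with a minimal, assumed packet shape; the paper does not specify a format, so the field names and the store-and-forward hops below are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One BSN reading, as relayed sensor -> robot -> smartphone -> PC."""
    body_part: str   # e.g. "foot", "hand", "chest"
    gyro: tuple      # three-dimensional angular velocity
    accel: tuple     # three-dimensional acceleration
    timestamp_ms: int

def relay(sample, hops=("robot", "smartphone", "pc")):
    """Trace the store-and-forward path described in the text."""
    return [(hop, sample.body_part) for hop in hops]
```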
The most fascinating part of the process is the correlation between basic hand gestures and the way they are interpreted. The recognition algorithm consists of two modules: neural network-based segmentation, which detects the start and end points of a gesture and decides whether the segment should be considered a gesture (output 1) or not (output 0); and the recognition module, which comprises data pre-processing (raw data is clustered into observation symbols), individual hand gesture recognition without knowledge of the context (which may cause classification errors), and a Bayesian filtering procedure (to produce more accurate decisions given the context).
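Two of these steps can be sketched briefly: the pre-processing step that clusters raw readings into observation symbols, and the Bayesian filtering step that combines the context-free likelihoods with a context-dependent prior. The centroids, gesture names, and probability values below are illustrative assumptions, not the paper's parameters.

```python
def quantise(sample, centroids):
    """Pre-processing: map a raw reading to the symbol (index)
    of its nearest cluster centroid."""
    return min(range(len(centroids)), key=lambda i: abs(sample - centroids[i]))

def bayes_filter(likelihoods, context_prior):
    """Bayesian filtering: P(g | obs, ctx) proportional to
    P(obs | g) * P(g | ctx), normalised over all gestures g."""
    posterior = {g: likelihoods[g] * context_prior.get(g, 0.0) for g in likelihoods}
    total = sum(posterior.values())
    return {g: p / total for g, p in posterior.items()}
```

The point of the filtering step is visible in the formula: a gesture that is marginally more likely under the context-free recogniser can still lose to one that the context strongly favours, which is how classification errors get corrected.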
CodeBlue: Emergency Medical Care
CodeBlue is a platform that combines wireless sensing and communication, supporting a broad range of applications with a primary focus on medicine. Its leading application is emergency medical care: the central idea is to share patient data among caregivers and thereby enable efficient allocation of hospital resources according to the health status of individual patients.
Wireless sensor networks are an emerging technology consisting of spatially distributed autonomous devices that use sensors to monitor physical or environmental conditions. In particular, this approach can efficiently bridge the gap between patient load and available resources: continuous, real-time vital signs, locations, and identities are captured from a large number of patients and sent to the computers of emergency medical technicians (EMTs), physicians, and nurses. A decision support system handles the case in which there are many casualties and insufficient space in hospitals: by analysing the data from the network, it becomes straightforward to decide which patients have the most urgent and severe problems and should be given priority.
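A decision-support component of this kind can be sketched as a priority ordering over patients derived from their streamed vital signs. The scoring rule and thresholds below are assumptions made for illustration; they are not taken from CodeBlue.

```python
import heapq

def severity(heart_rate, spo2):
    """Assumed scoring rule: higher score = more urgent."""
    score = 0
    if heart_rate > 120 or heart_rate < 50:
        score += 2   # tachycardia / bradycardia
    if spo2 < 90:
        score += 3   # low blood-oxygen saturation
    return score

def triage(patients):
    """Given (patient_id, heart_rate, spo2) tuples, return ids
    ordered from most to least urgent using a max-priority queue."""
    heap = [(-severity(hr, spo2), pid) for pid, hr, spo2 in patients]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]
```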
An extremely useful property of CodeBlue is that, even though it handles an enormous amount of information, it supports filtering to avoid network congestion and overload. This allows the data to be managed efficiently and rapidly, decisions to be taken sensibly and in very little time, and any other changes or urgent events to be monitored in the meantime. A smart implementation like this not only helps a large number of doctors stay well organised, but can also save many lives.
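One simple form such filtering can take is a dead-band filter: a reading is forwarded only when it differs from the last transmitted value by more than a threshold, which keeps traffic low while still propagating clinically meaningful changes. This is a generic sketch of the idea, not CodeBlue's actual mechanism, and the threshold is an assumed value.

```python
def filter_stream(readings, threshold=5.0):
    """Forward a reading only when it moves past the dead-band
    around the last transmitted value."""
    sent, last = [], None
    for r in readings:
        if last is None or abs(r - last) > threshold:
            sent.append(r)  # significant change: transmit
            last = r
    return sent
```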