In 2020, surveillance technology that can identify people from the way they walk will lead to an outbreak of Monty Python-style silly walks, predict Laurie Smith and Hannah Picton.
With attempts to block facial recognition in certain jurisdictions and imaginative countermeasures being used by protesters elsewhere, gait technology will identify people even when their faces are concealed. In 2020, those wishing to avoid being identified by their gait may adopt Monty Python-style silly walks.
While conversations around these technologies have been dominated by fears of an ever-growing surveillance society, that focus can obscure other ways their capabilities could be used, such as monitoring the health of older people.
With so many potential applications and ethical questions to take into consideration, good governance will be key.
The use of facial recognition technologies for surveillance received considerable public attention in 2019. High-profile examples included legal action against use on the King’s Cross development in London, while protesters in Hong Kong donned face masks and wielded lasers to counter this sort of monitoring.
Despite privacy concerns and both formal and informal attempts to curtail the power of surveillance, ever more intrusive technologies are being developed. Many of these cannot be easily countered, and they might even be more widely deployed if facial recognition were to become more heavily regulated.
One of these developing technologies is behavioural biometrics, which can identify people by how they do things, rather than what they look like. This includes gait recognition that singles people out by the way they walk. Already police in China are conducting trials, but they are not the only ones interested in taking surveillance to the next level.
Gait recognition has received less recent attention than facial recognition, yet it can be more intrusive. In some forms the technology is harder to spot, since pressure pads can be used instead of cameras; a pad might be hidden under a carpet or look like part of the floor. When combined with artificial intelligence (AI) software, gait recognition can identify people from ordinary surveillance camera footage up to 50 metres away, even with their face concealed or back turned.
While gait recognition, like facial recognition, is not a new technology, advances in AI make it much more effective, enabling people to be recognised in real time. Researchers from the University of Manchester used data from 20,000 footsteps to train an AI to recognise distinctive factors such as the rhythm of a person's stride. As these technologies become more advanced and cheaper to develop, they could also become more commercially available.
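To give a flavour of the underlying idea (this is an illustrative toy sketch, not the Manchester researchers' actual system): footstep timings from something like a pressure sensor can be reduced to simple "rhythm" features, which are then matched against enrolled walkers. The names and numbers below are entirely hypothetical.

```python
# Toy illustration of stride rhythm as a biometric: footstep timestamps
# are reduced to a small feature vector (average stride interval and its
# variability), then matched to the nearest enrolled walker.
# Real systems use far richer features and learned models.
from statistics import mean, stdev

def stride_features(timestamps):
    """Reduce footstep times (in seconds) to (mean interval, variability)."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return (mean(intervals), stdev(intervals))

def identify(sample, profiles):
    """Return the enrolled walker whose stride rhythm is closest to the sample."""
    fm, fs = stride_features(sample)
    return min(profiles,
               key=lambda name: (profiles[name][0] - fm) ** 2 +
                                (profiles[name][1] - fs) ** 2)

# Hypothetical enrolled walkers, each described by their footstep timings.
profiles = {
    "alice": stride_features([0.0, 0.52, 1.05, 1.56, 2.10, 2.61]),
    "bob":   stride_features([0.0, 0.71, 1.40, 2.12, 2.80, 3.52]),
}

# A new, unlabelled footstep sequence is matched by rhythm alone.
print(identify([0.0, 0.53, 1.04, 1.57, 2.09], profiles))  # prints "alice"
```

Even this crude nearest-neighbour match hints at why a disguised face offers no protection: the identifying signal is in the timing of the steps themselves.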
Developers of the technology claim that their gait recognition cannot be fooled by people limping, hunching over or walking with splayed feet. But this may not deter people wishing to avoid surveillance. In addition to the anti-surveillance clothing and make-up already on the market, perhaps counter-surveillance techniques in 2020 will also look like an outbreak of Monty Python-style silly walks.
Gait recognition is just one example of a range of surveillance technologies currently under development – there are also emerging technologies that can recognise individuals through household wifi, vital signs, scent and microbiomes. Far from being on their last legs, facial recognition and CCTV might actually be more powerful when combined with gait recognition and other techniques.
Based on the way facial recognition has been adopted, it seems likely that we’ll become more and more familiar with these other techniques too. This of course brings potential dangers, and if we want to avoid entering a new age of hyper surveillance, it’s vital that we ensure effective governance of these technologies.
Most technology is neither inherently good nor evil – it is up to us to govern how it is used. For example, gait recognition could also be used to remotely monitor signs of illness in vulnerable older people, which might allow them to remain independent for longer. Governance is therefore not only important for protecting privacy; it could also support the technology being used responsibly for social good.
How we maximise the benefits and minimise the risks of these sorts of technologies is one of our most important challenges as a society.
Laurie Smith is Nesta's Principal Researcher in the Explorations team; Hannah Picton is Assistant Programme Manager in Nesta's International Development and Education & Skills team.