Biometric technology has become entwined with our everyday lives, from facial and fingerprint recognition on our mobile phones, to passing through airport security when we travel. But did you know that the biometric technology used across the globe today can be traced back thousands of years? In this blog, we’ll explore the history of biometrics from its early developments, to now.
While the term ‘biometrics’ didn’t appear until the 1880s, there is evidence that early humans used physical characteristics to verify a person’s identity as early as 6000 B.C. The first known reference is a prehistoric handprint with visible ridge patterns, discovered in Nova Scotia.
Fast-forward to 500 B.C., where we find evidence that humans used biometrics for identification when signing legal documents and conducting business transactions. This is apparent in civilisations across the world, including the ancient Assyrians, Japanese, Babylonians and Chinese. The explorer Joao de Barros recorded that merchants in China used an early form of fingerprinting, stamping children’s hand and foot prints on paper with ink to identify them. In Babylon, clay tablets bearing fingerprints used in business transactions have been discovered.
So, how did we develop from using biometrics as a rough method of classification, to an advanced technology with wide-ranging applications that we recognise today?
Early developments: 1800 – 1900
By the time we reach the 1800s, human population growth had exploded following the Industrial Revolution, and as cities expanded, reliably identifying people became a pressing issue. Local knowledge was no longer sufficient to keep criminal activity under control, and this need drove innovation during the period.
1823: The first ever system for the classification of fingerprints was proposed by Czech physiologist and Professor of Anatomy, Jan Evangelista Purkyně (Purkinje).
1858: Sir William Herschel, a British administrator in India, began taking handprints from those signing documents at the Magistrate’s Office in Jangipur. He later moved from full handprints to prints of the right index and middle fingers, using them to verify genuine employees on payday.
1881: Parisian policeman Alphonse Bertillon developed techniques for measuring individual physical features in an attempt to identify repeat offenders, who would use a different alias each time they were arrested. He recorded eye colour, the shape and angle of the ears, brow and nose, as well as any tattoos. By 1884 he had successfully identified 241 repeat offenders using this system, which became known as ‘Bertillonage’.
1892: Argentinian police official Juan Vucetich created his own fingerprint identification system, pioneering the first use of fingerprint evidence in a murder case.
1892: Sir Francis Galton published a detailed study presenting a new fingerprint classification system covering prints from all ten fingers. The characteristics it identified are still in use today, often referred to as Galton’s details.
20th century breakthroughs: 1900 – 1999
The 20th century saw the use of biometrics boom, with major breakthroughs including iris pattern identification and the birth of facial recognition.
1903: The New York Civil Service Commission started fingerprinting applicants to prevent fraud. The New York state prison adopted the practice to identify criminals.
1936: Frank Burch proposed the idea of using iris patterns as a method of identification.
1964-1966: Woodrow W. Bledsoe researched programming computers to detect human faces. He then developed the first semi-automatic facial recognition system.
1969: The FBI started funding research into the development of automated fingerprint and facial recognition. This funding helped produce much more sophisticated sensors for biometric capture.
1974: The first commercial hand geometry recognition systems became available, used for time and attendance management, employee identification and physical access control.
1996: In Atlanta, USA, the Olympic Games used hand geometry systems for secure access to the Olympic Village with 65,000 people enrolled.
By the early 2000s, biometric technology had become more efficient and more socially accepted, and solutions were no longer used exclusively by governments and large corporations.
The biometric boom: 2000 – 2022
2001: The Super Bowl in Tampa, Florida had a facial recognition system installed in an attempt to identify ‘wanted’ individuals in the stadium.
2003: The US Government’s National Science &amp; Technology Council established a Subcommittee on Biometrics, responsible for research, development, policy and international collaboration on biometric systems.
2008: Google added voice search to the BlackBerry version of its Google Maps mobile app.
2010: US national security agencies used biometrics to identify a terrorist who had taken part in planning the 9/11 attacks.
2011: Biometric facial identification was used by the CIA to help identify the body of Osama bin Laden.
2013: Apple launched its ‘Touch ID’ fingerprint scanner on the iPhone.
2018: The first Mastercard biometric card was released, combining chip technology with fingerprints to verify purchases.
The future of biometrics
Looking back at the history of biometrics, we can see that the technology has come a long way since the first humans left their handprints on cave walls. Biometric identification has made rapid advances in the last few years alone: a 2020 study found that the best facial recognition algorithms can be up to 99.97% accurate.
Governments across the world are adopting new biometric solutions for law enforcement and border control, and developing countries are beginning to close the identity gap with new identification technology. With the global biometrics market projected to reach $45 billion by 2027, innovation is only likely to accelerate, and where we stand now will itself become just a small step in the history of biometrics.
At Arana Security, we always make sure we stay up to date with the latest developments in biometrics. Find out more about our products, from CCTV solutions, Biometric ID and Access Control to ANPR and Payment Card Solutions.