Everyday Science Notes of Fingerprinting

Fingerprinting, method of identification using the impression made by the minute ridge formations or patterns found on the fingertips. No two persons have exactly the same arrangement of ridge patterns, and the patterns of any one individual remain unchanged through life. To obtain a set of fingerprints, the ends of the fingers are inked and then pressed or rolled one by one on some receiving surface. Fingerprints may be classified and filed on the basis of the ridge patterns, setting up an identification system that is almost infallible.

The first recorded use of fingerprints was by the ancient Assyrians and Chinese for the signing of legal documents. Probably the first modern study of fingerprints was made by the Czech physiologist Jan Evangelista Purkinje, who in 1823 proposed a system of classification that attracted little attention. The use of fingerprints for identification purposes was proposed late in the 19th century by the British scientist Sir Francis Galton, who wrote a detailed study of fingerprints in which he presented a new classification system using prints of all ten fingers, which is the basis of identification systems still in use. In the 1890s the police in Bengal, India, under the British police official Sir Edward Richard Henry, began using fingerprints to identify criminals. As assistant commissioner of the Metropolitan Police, Henry established the first British fingerprint files in London in 1901. Subsequently, the use of fingerprinting as a means for identifying criminals spread rapidly throughout Europe and the United States, superseding the old Bertillon system of identification by means of body measurements.

As crime-detection methods improved, law enforcement officers found that any smooth, hard surface touched by a human hand would yield fingerprints made by the oily secretion present on the skin. When these so-called latent prints were dusted with powder or chemically treated, the identifying fingerprint pattern could be seen and photographed or otherwise preserved. Today, law enforcement agencies can also use computers to digitally record fingerprints and to transmit them electronically to other agencies for comparison. By comparing fingerprints at the scene of a crime with the fingerprint record of suspected persons, officials can establish absolute proof of the presence or identity of a person.
The confusion and inefficiency caused by the establishment of many separate fingerprint archives in the United States led the federal government to set up a central agency in 1924, the Identification Division of the Federal Bureau of Investigation (FBI). This division was absorbed in 1993 by the FBI’s Criminal Justice Information Services Division, which now maintains the world’s largest fingerprint collection. Currently the FBI has a library of more than 234 million civil and criminal fingerprint cards, representing 81 million people. In 1999 the FBI began full operation of the Integrated Automated Fingerprint Identification System (IAFIS), a computerized system that stores digital images of fingerprints for more than 36 million individuals, along with each individual’s criminal history if one exists. Using IAFIS, authorities can conduct automated searches to identify people from their fingerprints and determine whether they have a criminal record. The system also gives state and local law enforcement agencies the ability to electronically transmit fingerprint information to the FBI. The implementation of IAFIS represented a breakthrough in crime fighting by reducing the time needed for fingerprint identification from weeks to minutes or hours.

Infrared Radiation
Infrared Radiation, emission of energy as electromagnetic waves in the portion of the spectrum just beyond the limit of the red portion of visible radiation (see Electromagnetic Radiation). The wavelengths of infrared radiation are shorter than those of radio waves and longer than those of light waves. They range between approximately 10⁻⁶ and 10⁻³ m (about 0.00004 and 0.04 in).
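These limits can be checked with a quick unit conversion, using only the metre-to-inch factor and the relation c = λf (a sketch; the printed formatting is arbitrary):

```python
# Convert the infrared wavelength limits from metres to inches
# and compute the corresponding frequencies from c = lambda * f.

C = 2.998e8          # speed of light in vacuum, m/s
M_TO_IN = 39.3701    # inches per metre

for wavelength_m in (1e-6, 1e-3):
    inches = wavelength_m * M_TO_IN
    freq_hz = C / wavelength_m
    print(f"{wavelength_m:g} m ~ {inches:.5f} in, f ~ {freq_hz:.3g} Hz")
```

The short-wavelength limit (10⁻⁶ m) works out to about 0.00004 in at a frequency near 3 × 10¹⁴ Hz, just below visible red light, while the long limit (10⁻³ m) is about 0.04 in, bordering the microwave region.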
Infrared radiation may be detected as heat, and instruments such as bolometers are used to detect it. See Radiation; Spectrum.
Infrared radiation is used to obtain pictures of distant objects obscured by atmospheric haze, because visible light is scattered by haze but infrared radiation is not. The detection of infrared radiation is used by astronomers to observe stars and nebulas that are invisible in ordinary light or that emit radiation in the infrared portion of the spectrum.
An opaque filter that admits only infrared radiation is used for very precise infrared photographs, but an ordinary orange or light-red filter, which will absorb blue and violet light, is usually sufficient for most infrared pictures. Developed about 1880, infrared photography has today become an important diagnostic tool in medical science as well as in agriculture and industry. Use of infrared techniques reveals pathogenic conditions that are not visible to the eye or recorded on X-ray plates. Remote sensing by means of aerial and orbital infrared photography has been used to monitor crop conditions and insect and disease damage to large agricultural areas, and to locate mineral deposits. See Aerial Survey; Satellite, Artificial. In industry, infrared spectroscopy forms an increasingly important part of metal and alloy research, and infrared photography is used to monitor the quality of products. See also Photography: Photographic Films.
Infrared devices such as those used during World War II enable sharpshooters to see their targets in total visual darkness. These instruments consist essentially of an infrared lamp that sends out a beam of infrared radiation, often referred to as black light, and a telescope receiver that picks up returned radiation from the object and converts it to a visible image.
