The Evolution of the Science of Fingerprint Identification: Meeting the Daubert Challenge

Written by Aischa S. Prudhomme, CLPE – May 2012

Abstract

Fingerprint identification has risen from humble beginnings to become one of the most definitive and reliable disciplines within the criminal justice system. Although it has recently been popularized by primetime television shows like “CSI”, fingerprint identification has been considered the gold standard of forensic science for over a hundred years. But what of the science itself: how has it evolved, and, most importantly, can it overcome the judicial challenges of the past two decades? Have these challenges truly hindered the forensic community, or have they instead helped to strengthen the science?

The first part of this paper sets the foundation by examining the origins of fingerprint identification, including a timeline of its many applications throughout history and an introduction to the pioneers whose scientific research helped set the foundation for its application in forensic science. It also discusses the biological processes involved in the growth of friction ridge skin during the critical stages of fetal development, focusing on the premises of permanence and uniqueness, the fundamental scientific bases of fingerprint identification. It then presents a comprehensive overview of the methodology involved in fingerprint identification and a comparative analysis of the traditional versus modern techniques in use today. Additionally, it discusses the demanding role of today’s fingerprint examiner and the challenges that now face forensic scientists in the modern judicial system.

The second part of the paper discusses Daubert v. Merrell Dow Pharmaceuticals, Inc., the Supreme Court case that set a new precedent for the admissibility of scientific evidence and expert testimony, and its impact on forensic science, specifically within the discipline of latent print examination. The Court’s ruling outlines a prescribed standard comprising four prongs: (1) empirical testing of the theory/technique and its peer review and publication, (2) the known or potential error rate, (3) standards controlling the technique’s operation, and (4) general acceptance within the scientific community. Each of the four prongs, known collectively as the “Daubert standard”, is deconstructed and analyzed, then correlated with the corresponding data drawn from a literature review of the relevant fingerprint science, with the intent to (A) demonstrate the method’s admissibility under the Daubert requirements and (B) substantiate its status as a valid scientific method for distinguishing individual human identity and for identifying the source of latent print impressions.

Introduction

What is the science of fingerprint identification? It seems like a simple enough question, but to understand the science one has to first look at its history. Fingerprints can be traced as far back as prehistoric and ancient times, when they first appeared in primitive carvings and drawings as a form of artistic expression; and later, during early Chinese civilization BCE, they were used primarily in civil applications such as the endorsement of contract deeds, document seals and land titles.

It was only later, during the late 1700s, that the uniqueness of fingerprints became the focus of scientific study; and by the late 1800s, fingerprint identification had become the preferred method for human identification, particularly within the criminal justice system. It was from this criminal application that fingerprint identification further evolved into an early investigative tool for solving crimes; and so it remains today in the modern field of forensics, as the science of fingerprint/latent print identification (Ashbaugh, 1999).

History

In 1880, a British surgeon by the name of Dr. Henry Faulds became one of the first scientists to propose the use of fingerprints as a means of personal identification, as well as their value in solving crimes. His findings were documented in letters and writings published in the scientific journal Nature. His work would later be passed on to British anthropologist Sir Francis Galton, a cousin of Charles Darwin, whose interest in genetics led him into the field of anthropometry and, later, to research on the significance of the distinct variations in minutiae found in fingerprints (Ashbaugh, 1999).

Through his anthropometrical research and studies focusing on the friction ridge detail of the skin of the hands and feet, Galton was the first to scientifically demonstrate that fingerprints are both permanent and unique; in 1892, his findings were published in the book Finger Prints. In the book, Galton discusses permanence and uniqueness and their correlation to the presence of the distinct minutiae that are vital to the identification process. Referring to the papillary ridges, Galton states:

We shall see that they form patterns, considerable in size and of a curious variety of shape, whose boundaries can be firmly outlined, and which are little worlds in themselves. They have the unique merit of retaining all their peculiarities unchanged throughout life, and afford in consequence an incomparably surer criterion of identity than any other bodily feature. (1892, p. 2).

His name survives in the term still used today for these minutiae: Galton “points” of identification.

Once fingerprints had been established as a means of identification, they became an integral part of the criminal justice system. Not only were they used in the recording, classification and identification of criminal arrest records, but they were also used to solve criminal cases in which latent prints were recovered. Seeing the potential crime-solving benefits of latent fingerprint identifications, scientists around this time began experimenting with chemical and powder techniques for developing latent fingerprints at crime scenes (Barnes, 2011). The first homicide case to be solved using fingerprint evidence occurred in Argentina in 1892 and involved the identification of a bloody thumbprint found at the scene of the crime (Ashbaugh, 1999). Since that time, latent print evidence has become one of the most valuable forms of evidence used in criminal trials.

In 1891 Juan Vucetich, a statistician and Argentine police official, used data from Galton’s research to design what many consider the first workable system for recording, classifying and identifying the fingerprints of criminals; incidentally, this system is still in use today in many Spanish-speaking countries. In 1892, Vucetich’s training became instrumental in identifying a bloody thumbprint taken from the door post of a house in Buenos Aires, the scene of a gruesome homicide involving the murder of two boys. The bloody print was matched to the thumb of Francisca Rojas, the woman accused of murdering her two sons. This work has earned Vucetich credit as the first to apply fingerprint identification specifically as a function of the criminal justice system (Ashbaugh, 1999).

Another figure essential to the adoption of fingerprint classification for criminal applications was Sir Edward Henry. In 1894, in collaboration with Sir Francis Galton, Henry devised a formula using the nine fingerprint pattern types originally named by Dr. Johannes E. Purkinje in 1823 in his published thesis “Commentary on the Physiological Examination of the Organs of Vision and the Cutaneous System” (Ashbaugh, 1999). Henry’s formula provided a workable method for recording, classifying, categorizing and filing criminal fingerprint records, which allowed for greater efficiency, reliability and ease of use (Barnes, 2011).

Henry’s classification system was developed in India and first employed by the Bengal Police in the Lower Provinces, where Henry was appointed Inspector General; by 1901, the system had been officially adopted in England by Scotland Yard (Barnes, 2011). In 1904 in the United States, the fingerprints of all inmates housed at Leavenworth Federal Prison began to be systematically recorded; these became the first federal fingerprint records, commencing the U.S. Government’s official collection (Barnes, 2011).

Today, the FBI’s fingerprint records have grown to over 100 million civilian and criminal records, maintained within a computer database called IAFIS (Integrated Automated Fingerprint Identification System), which is recognized as the largest biometric database in the world (U.S. Dept. of Justice – FBI, 2011). Although automated systems such as IAFIS and local AFIS (Automated Fingerprint Identification System) databases are most commonly used today, many law enforcement agencies continue to use Henry classification in conjunction with these newer automated systems (Evidence Technology Magazine, 2008).

Around the time Henry was developing his classification system, other discoveries were being made within the biological study of fingerprint pattern development. Building on primate research conducted in 1883 by Dr. Arthur Kollman of Germany on the embryological growth of friction ridge skin and its correlation to volar pad development, other primate studies of friction ridge development were getting underway (Barnes, 2011).

In 1897 a zoologist named Harris Hawthorne Wilder published his first paper, “On the Disposition of the Epidermic Folds Upon the Palms and Soles of Primates”, and began what would become 30 years of research into the morphology of friction ridge skin, focusing on palmar and plantar dermatoglyphics and variations across genetics and race. During this research, Wilder proposed that friction ridge patterns in primates were associated with volar pad development, setting the foundation for further research into the evolution of friction ridge development (Barnes, 2011).

One scientist particularly influenced by Wilder’s research was Inez Whipple, a co-worker who later became Mrs. Inez Whipple Wilder. In 1904 she published what is considered a landmark study in the fields of genetics and ridgeology. Her paper, “The Ventral Surface of the Mammalian Chiridium”, is a dissertation on the evolutionary processes between reptiles and mammals that focuses on how friction ridge skin developed in response to evolutionary needs for survival. She describes an evolutionary process in which reptilian volar scales fused into rows over time to become epidermal ridges as certain species evolved into mammals, thereby providing the friction needed for grasping and climbing (Barnes, 2011).

As a result of these advancements in scientific research and the many contributions of both the scientific and law enforcement communities, fingerprint identification became generally accepted as a valid scientific method for human identification. Most importantly, improved techniques for developing and identifying latent prints of evidentiary value helped propel this general acceptance within the judicial system as well. In the first U.S. appellate case involving the admissibility of fingerprint testimony, People v. Jennings (1911), the conviction was upheld on the basis that the court recognized fingerprint identification as a science requiring careful study and the expertise of those trained in its methods; under those circumstances, expert testimony was deemed appropriate to support the jury’s understanding of fingerprint evidence. People v. Jennings became a landmark case in the United States because it was the first appellate case to challenge the admissibility of expert fingerprint testimony (Barnes, 2011).

Another legal case concerning the use of latent fingerprint evidence, People v. Crispi (1911), became the first U.S. conviction determined solely from the identification of a latent fingerprint. During trial, the fingerprint expert’s testimony and presentation of the evidence were so convincing that the defendant tried to change his plea to guilty (Barnes, 2011).

By 1914, the science of fingerprint identification had evolved again to include the use of poroscopy, a method first proposed by Dr. Edmond Locard. In “The Legal Evidence by the Fingerprints”, a published article on his research, Locard explains how the sweat pores visible within friction ridge impressions can also be used to support fingerprint comparisons by providing supplemental data toward the weight of the conclusion (Ashbaugh, 1999). Further scientific research into the permanence and uniqueness of third level detail, conducted by Harris Hawthorne Wilder and Bert Wentworth, would also support Locard’s theory (Ashbaugh, 1999).

As the momentum of fingerprint identification increased, so did the need for a governing body to oversee it. In 1915 Harry Caldwell, an inspector in the Oakland, California Police Department’s Bureau of Identification, began to assemble those in the profession of fingerprint identification. Twenty-two members convened to form what is known today as the International Association for Identification (IAI). The IAI has set the benchmark for those in the field of forensic identification and is responsible for establishing the certification of fingerprint examiners (theiai.org, 2011).

Through its professional journal, the Journal of Forensic Identification (JFI), the IAI has encouraged research toward the progress of forensic identification and has consistently published peer-reviewed studies read not only within the field of forensic science but throughout the wider scientific community (theiai.org, 2011). Other organizations and technical working groups are also dedicated to identification research and were specifically formed to develop best-practice guidelines within the forensic identification community (SWGFAST, 2011); however, the IAI remains the foremost authority in the field (James & Nordby, 2005).

SWGFAST (the Scientific Working Group on Friction Ridge Analysis, Study and Technology), formerly a Technical Working Group (TWGFAST), was initially formed by the FBI as a short-term project to address the need for quality standards and protocols within the latent print discipline. Beginning on June 10, 1995, fifteen prominent figures within the latent print community collaborated in discussions held over a total of eleven days to establish consensus guidelines for friction ridge examinations. The effort was so successful that the FBI decided to re-establish the group as an ongoing scientific working group, and in 1998 the name SWGFAST was officially adopted. Since that time, SWGFAST has been instrumental in establishing the standard operating procedures, quality practices and protocols that have been adopted by forensic practitioners and generally accepted within the forensic community and judicial arenas alike (SWGFAST, 2011).

Throughout the course of the mid to late 20th century there were many more important scientific breakthroughs and landmark judicial rulings concerning fingerprint identification and its general acceptance as a valid science. Listed below is a brief timeline and summary of these historical contributions.

  • July 1924- Establishment of the Identification Division which maintained the federal criminal fingerprint records under the U.S. Justice Department’s Bureau of Investigation (Barnes, 2011).
  • April 1939- The Washington County Supreme Court upheld the decision on a habitual offender conviction in State v. Johnson, 1939. The lower court’s verdict was based on a fingerprint identification using certified copies of the defendant’s prior conviction records (Barnes, 2011).
  • May 1939- The sinking of the USS Squalus was the first U.S. disaster in which fingerprints were used to identify the victims (Barnes, 2011).
  • 1940- An appellate judge in Hamilton, Texas, upheld a conviction based on the identification of a latent fingerprint. In his decision, the judge proclaimed that the classification and identification of thousands of fingerprints in the U.S. provided sufficient proof that fingerprints are unique, and he added that the burden lay with the defense to show two individuals who share the same fingerprints (Barnes, 2011).
  • 1940- The FBI Disaster Squad is formed in response to the Pan Am Airliner crash in Lovettsville, Virginia. Members of the FBI’s Identification Division are dispatched to assist in identifying the bodies of the victims (Barnes, 2011).
  • 1943- Dr. Harold Cummins of Tulane University publishes a book he coauthored with Charles Midlo entitled Fingerprints, Palms and Soles – An Introduction to Dermatoglyphics, based on his extensive research on the fetal development of volar pads and their relationship to the morphology of friction ridge skin (Ashbaugh, 1999).
  • 1952- Dr. Alfred Hale, also from Tulane University and an associate of Cummins, publishes his thesis “Morphogenesis of the Volar Skin in the Human Fetus”, which is based on his extensive research into the differential growth of friction ridges during fetal development (Ashbaugh, 1999).
  • 1953- Salil Kumar Chatterjee, a scientist from Calcutta, India, publishes his book Finger, Palm and Sole Prints. In 1962 Chatterjee also publishes an article titled “Edgeoscopy”, for which he is most recognized and in which he describes his theory of using ridge edge shapes to supplement fingerprint identifications (Ashbaugh, 1999).
  • 1976- Dr. Michio Okajima of Japan publishes his paper “Dermal and Epidermal Structures of the Volar Skin”. Okajima is most notable for his contributions in the study of incipient ridges (Okajima, 1975).
  • 1991- Dr. William Babler of Marquette University in Wisconsin publishes his paper “Embryologic Development of Epidermal Ridges and Their Configurations”, based on his work involving the prenatal relationship between the epidermal ridges and bone dimensions in the hand. The paper also reviews the literature of prior research involving embryologic ridge development (Ashbaugh, 1999).

Another prominent modern figure responsible for the advancement of the science of fingerprint identification is David R. Ashbaugh. Ashbaugh, a Certified Forensic Identification Specialist and retired Staff Sergeant with the Royal Canadian Mounted Police, has spent 28 years conducting extensive research in the field of fingerprint identification, specifically in the development, formation and analysis of the friction ridges of the skin (Evidence Technology Magazine, 2007). Through his research, he has expanded the analysis and comparison processes involved in the evaluation of friction ridge characteristics to include the ridge units themselves; i.e., the intrinsic ridge formations relating to the size and shape of individual ridges and the positioning of pores, which are unique in their own right (Ashbaugh, 1999).

The term “ridgeology”, coined by Ashbaugh in 1982, encompasses the analysis of these details, also known as third level detail. He discusses this research in his book Quantitative-Qualitative Friction Ridge Analysis – An Introduction to Basic and Advanced Ridgeology, published in 1999. Ashbaugh is also credited with expanding ACE (Analysis, Comparison, Evaluation) to include a fourth and final step, verification. ACE-V is the scientific methodology used by fingerprint examiners today (Evidence Technology Magazine, 2007).

Theory

It has been stated many times that fingerprints are unique; this premise is what makes fingerprint identification superior to other methods of identification (FBI, 1984). If that is the case, how do we know that fingerprints are in fact unique? In practice, support comes from the thousands of fingerprint examiners who effectively conduct a “scientific experiment” every time they perform an analysis in casework, and from the millions of searches conducted through AFIS and IAFIS (Champod & Evett, 2001).

In order to further understand the theory of uniqueness, scientists have been conducting extensive research for years, specifically involving the biological development of friction ridges in fetuses; and indeed, this type of research is ongoing today. David Ashbaugh is one of the researchers who has been instrumental in bringing this information to the forefront of the fingerprint community. His research has advanced forensic science by building on previous fingerprint studies conducted within the biological sciences (Ashbaugh & Houck, 2005).

In his book Quantitative-Qualitative Friction Ridge Analysis – An Introduction to Basic and Advanced Ridgeology, Ashbaugh discusses the growth and recession of the volar pads (bulbous protrusions) on the hands of the developing fetus. The volar pads appear at about the sixth week of gestation and continue to swell until around the twelfth week, when they begin to recede and the friction ridges begin to develop in the basal layer of the skin. The timing of this volar pad swelling, along with genetics, determines the pattern type (Ashbaugh, 1999). Other factors, including maternal diet, developmental stressors and the movements of the fetus, affect the ridge path deviations, or breaks, in the friction ridges. All of these random biological events, occurring during this critical stage of fetal development, contribute to the formation of the individual features that make fingerprints unique (Wertheim & Maceo, 2002).

Validation studies establishing the individuality of fingerprints between twins further exemplify the uniqueness of fingerprints. In one such study, 3,000 pairs of fingers from identical and fraternal twins, covering all finger types, were compared for level 1 (pattern type) and level 2 (minutiae) detail. The conclusion held that although pattern types were often similar among twins, the minutiae, or individual characteristics within the patterns, were indeed unique (Srihari, Srinivasan & Fang, 2008). Other fingerprint studies on genetic variability, focusing on heritability across race and gender (Singh, Chattopadhyay, & Garg, 2005), as well as on the determinacy of certain genetic disease risk factors, have also been published (Kahn, Ravindranath, Valdez, & Narayan, 2001).

Methodology

ACE-V is the accepted scientific method practiced by fingerprint examiners today and is currently the only method recognized and endorsed by the International Association for Identification and the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST) as a valid scientific standard for fingerprint examination. ACE-V is an acronym for analysis, comparison, evaluation and verification, the four steps of the examination process.

Analysis (A), the initial stage, is the assessment of an impression to determine its suitability for comparison. This assessment draws on three levels of friction ridge detail: the first level consists of the general class features or pattern classification of the impression (loop, arch or whorl) and the overall ridge flow; the second level consists of the individual characteristics (minutiae) present within the impression; and the third level encompasses the aspects of “ridgeology”, i.e., the shape and size of the individual ridges, the positioning of the pores, and so on. It is at the analysis stage that the value and sufficiency of an impression is determined.

Once sufficiency and/or classification is established, the examiner moves on to the comparison (C) stage, which involves the visual observation of similarities and differences in the sequence and spatial relationship of the characteristics present within the two friction ridge impressions. At the evaluation (E) stage, a conclusion is reached from the cumulative data gathered during analysis and comparison; three conclusions can be drawn: (1) Individualization, the determination that “two friction ridge impressions originated from the same source”; (2) Exclusion, the determination that “two friction ridge impressions originated from different sources”; and (3) Inconclusive, the determination which “result[s] in the inability to reach either an individualization or exclusion decision”. Verification (V), the final stage, is the replication of the ACE process by a second examiner who is qualified to objectively confirm or reject the conclusion reached by the primary examiner (SWGFAST, 2002).
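
The structure of the method can be pictured schematically. The minimal Python sketch below simply mirrors the four ACE-V stages and the three permitted conclusions described above; the data structures, function names and stubbed decision logic are hypothetical illustrations, since in practice each stage is an expert judgment rather than a computation.

```python
# Illustrative sketch only: a schematic model of the ACE-V workflow described above.
# All names and the decision logic are hypothetical stand-ins for expert judgment.
from enum import Enum


class Conclusion(Enum):
    INDIVIDUALIZATION = "impressions originated from the same source"
    EXCLUSION = "impressions originated from different sources"
    INCONCLUSIVE = "neither an individualization nor an exclusion can be reached"


def analyze(impression: dict) -> bool:
    """Analysis: assess level 1-3 detail and decide whether the impression
    is suitable (of value) for comparison."""
    return impression["suitable"]


def compare(latent: dict, exemplar: dict) -> dict:
    """Comparison: note features in agreement and in disagreement between the two
    impressions (sequence and spatial relationships are ignored in this sketch)."""
    latent_minutiae, exemplar_minutiae = set(latent["minutiae"]), set(exemplar["minutiae"])
    return {
        "agreement": len(latent_minutiae & exemplar_minutiae),
        "disagreement": len(latent_minutiae - exemplar_minutiae),
    }


def evaluate(findings: dict, sufficient_agreement: int) -> Conclusion:
    """Evaluation: weigh the cumulative findings and reach one of three conclusions.
    The numeric sufficiency value is purely illustrative; as discussed below, the
    discipline rejects fixed point standards."""
    if findings["disagreement"] > 0:
        return Conclusion.EXCLUSION
    if findings["agreement"] >= sufficient_agreement:
        return Conclusion.INDIVIDUALIZATION
    return Conclusion.INCONCLUSIVE


def ace_v(latent: dict, exemplar: dict, sufficient_agreement: int = 8) -> Conclusion:
    """Verification: a second qualified examiner independently repeats ACE and must
    reach the same conclusion (modeled here as a second call to the same functions)."""
    if not (analyze(latent) and analyze(exemplar)):
        return Conclusion.INCONCLUSIVE
    first = evaluate(compare(latent, exemplar), sufficient_agreement)
    second = evaluate(compare(latent, exemplar), sufficient_agreement)
    return first if first == second else Conclusion.INCONCLUSIVE


if __name__ == "__main__":
    latent = {"suitable": True, "minutiae": {"ending_a", "bifurcation_b", "dot_c"}}
    exemplar = {"suitable": True, "minutiae": {"ending_a", "bifurcation_b", "dot_c", "ending_d"}}
    print(ace_v(latent, exemplar, sufficient_agreement=3))  # -> Conclusion.INDIVIDUALIZATION
```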

Prior to the use of ACE-V, the only standard in practice was the “twelve point rule”. This standard required that all conclusions of identification be predicated on a minimum numerical threshold of twelve identifying ridge characteristics in sequence within two friction ridge impressions. It was falsely believed that the absence of a numerical threshold somehow invalidated the process and rendered it “unscientific”; however, the rule did not take into consideration the totality of the analytical, comparative and evaluative processes involved, but instead placed unsubstantiated limitations on the examiner’s evaluative conclusion (Champod & Evett, 2001).

William F. Leo, a certified latent print examiner and author, states: “In the guise of quality control, standards based solely upon the quantity of characteristics reduce an examiner to a technician more adept at ciphering than exercising scientific judgment” (1994). For example, under a minimum point standard, a latent print containing only an area of “open fields”, an area with no ridge path deviation, would technically be deemed “unidentifiable”; yet an examiner with specific knowledge and training can logically deduce that the absence of ridge path deviation is as unique as its presence. Given the depth of the quantitative-qualitative method of analysis and the varying complexity of each impression, it is unreasonable to force every identification into the same cookie-cutter process (Champod & Evett, 2001). “The flaw in the use of any numeric standard for friction ridge identification is its inability to account for all the observations made and the examiner’s ability to evaluate this information” (Leo, 1994).

Another argument against the “point standard” is that not all agencies used the same numerical threshold of twelve points; some mandated that as few as eight, or even six, points were sufficient (Champod & Evett, 2001). Where is the valid scientific basis for such a predetermined number? It was also theorized that the practice would prevent errors in identification. That rationale is unconvincing, since in any process involving a human the possibility of error will always exist; it is through verification that errors, if any are made, may be identified and corrected (Ashbaugh & Houck, 2005).

To resolve the inconsistencies the point standard created within the fingerprint community, the IAI formed a committee, which reached a decision after a three-year study. Its 1973 report states: “[There is] no valid basis [that] exists at this time for requiring that a pre-determined minimum number of friction ridge characteristics must be present in two impressions in order to establish positive identification” (Ashbaugh & Houck, 2005). Nonetheless, some agencies still adhere to a point standard, an issue that will probably continue to be argued for years to come.

However, more recently there has been a focus on quantitative research involving the development of statistical probability models and likelihood ratios for the identification of latent prints. One such study, “Computation of Likelihood Ratios in Fingerprint Identification for Configurations of Any Number of Minutiæ”, seeks a more robust and quantifiable method for assigning values to friction ridge minutiae, known as level 2 characteristics. The model takes a more objective approach to validating the latent print examination process by using level 2 characteristics (ridge endings, bifurcations and dots) and assigning each a numerical value based on triangular positioning, feature vector, frequency and pattern type. A numerical score is then calculated from the sum of these values within a particular minutiae configuration containing n features. Variability is also assessed by assigning values for distortion and background noise and summarizing them as a mean µ and a standard deviation σ. Lastly, an overall likelihood ratio (LR) is calculated as a ratio in which the numerator reflects the value obtained for the observed configuration and the denominator reflects the corresponding value obtained from random sources containing close but non-matching configurations of n minutiae (Neumann, et al., 2007).
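
At its core, any such model reduces to the standard likelihood-ratio form, illustrated in the toy sketch below. The probabilities are invented placeholders, not figures from Neumann et al. (2007), whose model estimates them from feature vectors, distortion and minutia-frequency data.

```python
# Toy sketch of the general likelihood-ratio form underlying models of this kind.
# The probabilities below are fabricated for illustration only.

def likelihood_ratio(p_given_same_source: float, p_given_different_source: float) -> float:
    """LR = P(observed minutiae configuration | mark and print share a source)
           / P(observed minutiae configuration | mark comes from a random source)."""
    return p_given_same_source / p_given_different_source


# Hypothetical example: a configuration that is quite probable if the suspect left the
# mark, but rare among close but non-matching configurations from random sources.
lr = likelihood_ratio(p_given_same_source=0.6, p_given_different_source=0.0003)
print(f"LR = {lr:,.0f}")  # LR = 2,000 -> support for the same-source proposition
```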

The authors proposed this probability model in light of the current push to strengthen the validity of the latent print identification process, which followed a recent influx of court challenges and federal legislation calling for reforms in forensic science. Up to this point, the latent print discipline had been seriously lacking in statistical research, specifically studies committed to developing computational models that could determine probability ratios for minutiae configurations in latent print impressions. The authors tested their model on a sample of 686 ulnar loops and 204 arches. The results were promising, showing a low false positive rate, and the likelihood ratio (LR) power of the model increased significantly with the number of minutiae. Additionally, in a significant percentage of cases the LRs indicated the correct source, even where configurations contained few minutiae. This model improves on previous studies in that it takes finger distortion into account; and by incorporating the radial triangulation method, the model is also more efficient because it can measure the spatial relationships (orientation) of minutiae without imposing probabilistic assumptions (Neumann, et al., 2007).

Another study, “Pilot-study: A statistical analysis of the ACE-V methodology – analysis stage”, conducted by Glen Langenburg of the Minnesota State Crime Lab, takes a quantitative approach toward validating the ACE-V methodology. Langenburg designed this pilot study as the prototype for a series of studies intended to deconstruct and examine the various aspects of ACE-V, the examination standard used by latent print examiners today. The goal of this long-term research is to gain insight into each stage of the methodological procedure by collecting data and gauging the effectiveness of the methodology, with the intention of improving examiner training and possibly the method itself (Langenburg, 2004).

This first study examines the Analysis stage of the ACE-V methodology. It is designed to examine only the quantitation of minutiae during the analysis phase of fingerprint examination. A test group consisting of 24 latent print examiners and trainees with varying levels of experience and a control group consisting of 50 participants with no training or experience in latent print examinations participated in the study. Both groups were given a survey packet which included a worksheet consisting of 12 print impressions (2 inked control prints and 10 latent prints), an instruction sheet for counting minutiae, an enlarged copy of one of the 10 latent prints and a four-page survey consisting of 30 questions (Langenburg, 2004).

The objectives for the study were to determine: (1) the mean responses and variances between the test group and the control group, (2) whether significant variances between the latent print examiners and the trainees (if any were observed), could be attributed to factors involving information obtained in the survey, (3) which of the minutiae was most frequently observed and (4) whether participants in both groups observed features which were not minutiae (Langenburg, 2004).

The results of the study drew the following conclusions: (1) there was no significant variance in the mean response of minutiae reported within the test group, (2) there was a significant variance between the test group and the control group in the mean response of minutiae reported, (3) the mean response of minutiae reported by the test group was nearly double that of the control group, and (4) the test group marked more minutiae correctly than the control group, which tended to mark more false minutiae (Langenburg, 2004).
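
To make the kind of comparison behind conclusions (2) and (3) concrete, the brief sketch below runs a two-sample test on minutiae counts; the numbers are fabricated for illustration and are not Langenburg’s data.

```python
# Illustrative sketch: comparing mean reported minutiae between a trained test group
# and an untrained control group. The counts are fabricated, not study data.
from statistics import mean
from scipy import stats

examiner_counts = [18, 22, 19, 25, 21, 17, 23, 20]   # hypothetical test group
control_counts = [9, 12, 8, 11, 14, 10, 9, 13]        # hypothetical control group

t_stat, p_value = stats.ttest_ind(examiner_counts, control_counts, equal_var=False)
print(f"examiner mean = {mean(examiner_counts):.1f}, "
      f"control mean = {mean(control_counts):.1f}, p = {p_value:.4g}")
# A small p-value indicates a significant difference in mean reported minutiae between
# the groups, analogous to the between-group variance described in conclusion (2).
```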

Tools of the Trade

For years the only tools fingerprint examiners had at their disposal were their eyes, the trusted magnifying loupe and pointers. Searches were conducted manually using the Henry classification system, which meant tireless searching through hundreds, even thousands, of inked fingerprint cards. In recent decades, technology has allowed the science to flourish, beginning with the first computerized databases capable of searching through millions of fingerprints. This capability significantly increased the number of fingerprint identifications and helped solve numerous criminal cases that might otherwise have gone unsolved (U.S. Dept. of Justice – FBI, 2011).

These computer systems, known as AFIS (Automated Fingerprint Identification System) and the FBI’s IAFIS (Integrated Automated Fingerprint Identification System), made their debut in the early 1990s, allowing the fingerprint examiner to search digitally scanned images. Known exemplar images are captured using livescan devices that scan the surfaces of the fingers and palms and then transfer the images to an automated file (U.S. Dept. of Justice – FBI, 2011).

This became a welcome tool for the fingerprint examiner, who could now view scanned fingerprints on a computer screen without excess strain on the eyes. Integrated features allow for the enhancement of latent prints, which are often fragmented and distorted: images can be magnified as well as enhanced in quality. For an agency that acquired such a system, there would be no more straining through magnifying loupes or time-consuming searches through manual files (Evidence Technology Magazine, 2008).

There have been several upgrades over the past few years, each adding new features and search capabilities. One capability currently in the works is interoperability between local and state databases. At present, a local operating system within a state can search only that state’s database or the FBI’s IAFIS; state X cannot search state Y’s database, and vice versa. With interoperability, the systems could be linked to search one another’s databases, further expanding the examiner’s search and increasing the probability of an identification being made (Evidence Technology Magazine, 2008).

The Fingerprint Examiner

The role of the fingerprint examiner is one of great importance. Without fingerprint examiners many criminals would go unidentified, criminal cases would possibly go unsolved and the missing and the dead could remain anonymous. Although these are just a few of the functions performed by fingerprint examiners, their work does not stop there. Not only does the examiner have to be highly skilled and expertly trained, he/she has to be proficient. There is no room for error in this discipline (Jones, 2007).

The fingerprint examiner today is more knowledgeable and professional than in past decades. Examiners must continually strive to increase their skill, knowledge, training and experience and are invariably exposed to rigorous testing and peer review. Examiners entering the field today have more formal education and are expected to have knowledge and understanding of other sciences such as chemistry and physics. Groups such as the IAI and SWGFAST are strong forces in setting the guidelines that examiners should follow, and they continue to encourage research and professional growth (James & Nordby, 2005).

More than ever before, today’s fingerprint examiners are increasingly being challenged in the courtroom and must remain current with the latest research and training in order to withstand cross-examination of their expert testimony. In an article published in the Journal of Forensic Identification, Christophe Champod and Ian Evett address the ability of the fingerprint community and its practitioners to uphold the discipline’s “scientific status”. The article reflects what is happening in the field today when it suggests that it is the responsibility of practitioners to seek greater training in the sciences and in scientific testing protocols in order for the discipline to regain credibility within the science world (Champod & Evett, 2001).

New Precedents Create New Challenges

There were many early pioneers in the study of fingerprints, most of them doctors and scientists whose studies of the biological development of friction ridge skin led to the discovery of fingerprint identification. From these early discoveries, and through continued research and innovations in science and technology, fingerprint identification has evolved to become one of the most recognizable and efficient forms of human identification in forensic science today. Along with these modern advancements, however, have come new challenges.

In 1993 one of those challenges came in the form of a Supreme Court ruling, the result not of an appealed criminal conviction but of an appeal of a summary judgment granted against the plaintiffs in a civil suit involving the pharmaceutical manufacturer Merrell Dow Pharmaceuticals (U.S. Supreme Court, 1993). The appeal arose from conflicting expert scientific and medical testimony on both sides, which raised the obvious question of what criteria determine whether a method or practice is (a) scientific and (b) valid; the question eventually made its way to the Supreme Court. With its landmark ruling in the case known as Daubert v. Merrell Dow Pharmaceuticals (1993), the Supreme Court set a new precedent for the admissibility of expert scientific testimony and also raised the bar in forensic criminal cases. Though the standard applies to all expert scientific testimony, its implementation has since elicited a stream of attacks by critics who question the validity of forensic science in general, and of forensic identification in particular, most specifically fingerprint and latent print identification (Jones, 2007).

Meeting the Daubert Challenge

The criteria established by this Supreme Court ruling impose four specific guidelines for measuring the admissibility of scientific or technical expert testimony: (1) Has the theory or technique been tested, and has it been subjected to peer review, publication and validation? (2) Is there a known or potential error rate? (3) Are standards maintained that govern the operation of the method or technique? (4) Is it generally accepted within the relevant scientific community? The judge, acting in the role of gatekeeper, must make this determination using these guidelines (U.S. Supreme Court, 1993).

It is evident from over 100 years of documented history and research, which have served as a cornerstone of the science’s validity, that fingerprint identification more than sufficiently meets the Daubert requirements; the goal of this paper, however, is to further substantiate that conclusion by deconstructing each of the four prongs and citing the supporting documentation relevant to the discipline and the science.

(1) Theory- permanence and uniqueness: the basic premise of fingerprint identification, validation studies. Technique– the scientific methodology: ACE-V.

  • 1883- Dr. Arthur Kollman conducts primate studies on the embryological development of friction ridge skin, forming a link between volar pad development and friction ridge growth (Barnes, 2011).
  • 1892- Galton publishes his book Finger Prints, establishing the theory of permanence and uniqueness (Galton, 1892).
  • 1897- Harris Hawthorne Wilder publishes his article “On the Disposition of the Epidermic Folds Upon the Palms and Soles of Primates”, proposing that friction ridge patterns in primates are associated with volar pad development and setting the foundation for the premise of evolutionary development (Barnes, 2011).
  • 1904- Inez Whipple Wilder publishes her paper “The Ventral Surface of the Mammalian Chiridium”, which establishes the theory of evolutionary development in friction ridge skin (Barnes, 2011).
  • 1914- Edmond Locard publishes his paper “The Legal Evidence by the Fingerprints” which focused on the use of pores in the identification process (Barnes, 2011).
  • 1943- Dr. Harold Cummins publishes his book Fingerprints, Palms and Soles – An Introduction to Dermatoglyphics, based on the fetal development of volar pads and their relationship to the morphology of friction ridge skin (Ashbaugh, 1999).
  • 1952- Dr. Alfred Hale publishes his thesis “Morphogenesis of the Volar Skin in the Human Fetus”, based on differential growth of friction ridges during fetal development (Ashbaugh, 1999).
  • 1953- Salil Kumar Chatterjee publishes his book Finger, Palm and Sole Prints. In 1962 Chatterjee also publishes an article titled “Edgeoscopy”, which describes his theory of using ridge edge shapes to supplement fingerprint identifications (Ashbaugh, 1999).
  • 1976- Dr. Michio Okajima publishes his paper “Dermal and Epidermal Structures of the Volar Skin”. Notable for his study of incipient ridges (Okajima, 1975).
  • 1991- Dr. William Babler publishes his paper “Embryologic Development of Epidermal Ridges and Their Configurations” based on his work involving the prenatal relationship between the epidermal ridges and bone dimensions in the hand (Ashbaugh, 1999).
  • 1999- David Ashbaugh publishes his book Quantitative-Qualitative Friction Ridge Analysis – An Introduction to Basic and Advanced Ridgeology. Discusses ridgeology and the ACE-V methodology.
  • 2001- Genetic studies on the determinacy of certain genetic disease risk factors (Kahn, Ravindranath, Valdez, & Narayan, 2001).
  • 2005- Studies on genetic variability, focusing on the heritability between variables of race and gender (Singh, Chattopadhyay, & Garg, 2005).
  • 2008- Twin studies conducted to establish the individuality of fingerprints in twins (Srihari, Srinivasan, & Fang, 2008).

(2) Error rate- CTS (Collaborative Testing Services) proficiency testing and scientific studies.

  • 2004- Glen Langenburg designs a pilot study for the statistical analysis of the ACE-V methodology (Langenburg, 2004).
  • 2007- Statistical research on developing likelihood ratios in fingerprint identification; the research design takes a more objective approach to validating the latent print examination process (Neumann, et al., 2007).

(3) Standards- SWGFAST, International Association for Identification (IAI).

  • 1915- Harry Caldwell forms the International Association for Identification (IAI) (theiai.org, 2011).
  • 1973- After a three-year study, an IAI committee reports that “[There is] no valid basis [that] exists at this time for requiring that a pre-determined minimum number of friction ridge characteristics must be present in two impressions in order to establish positive identification” (Ashbaugh & Houck, 2005).
  • 1982- David Ashbaugh coins the term “ridgeology” to describe the use of third level detail in fingerprint identifications (Ashbaugh, 1999).
  • 1998- The FBI re-establishes its friction ridge technical working group as an ongoing scientific working group under the name SWGFAST (SWGFAST, 2011).

(4) General acceptance- history and timeline: over 100 years of research in the scientific community. Court precedents.

  • 1911- People v. Jennings. The appellate court recognized fingerprint identification as a science and held that expert testimony was appropriate to support the jury’s understanding of fingerprint evidence (Barnes, 2011).
  • 1911- People v. Crispi. The first conviction in the U.S. determined solely from the identification of a latent fingerprint (Barnes, 2011).
  • 1939- The Washington County Supreme Court upheld the decision on a habitual offender conviction in State v. Johnson, 1939. Upheld the practice of fingerprint identification using certified copies of the defendant’s prior conviction records (Barnes, 2011).
  • 1939- The sinking of the USS Squalus. Fingerprints were used to identify disaster victims (Barnes, 2011).
  • 1940- An appellate judge in Hamilton, Texas, upheld a conviction based on the identification of a latent fingerprint. The judge proclaimed there was sufficient proof, in the classification and identification of thousands of fingerprints conducted in the U.S., to hold that fingerprints are unique (Barnes, 2011).
  • 1940- The FBI Disaster Squad is formed in response to the Pan Am Airliner crash in Lovettsville, Virginia. Fingerprints were used to identify disaster victims (Barnes, 2011).

Conclusion

Today, the science of fingerprint identification is one of the most formidable assets in the field of forensics. It has developed over time from a mere curiosity into a vast and dynamic discipline; and though it is one of the oldest and most reliable methods of personal identification, its validity as a scientific application has continually come under attack. Issues arising from Supreme Court cases such as Daubert v. Merrell Dow Pharmaceuticals, Inc. call into question the reliability of fingerprint identification and cast doubt on the scientific validity of forensics as a whole. Yet throughout its history, fingerprint identification has been established time and again as a tried and true practice; and in the end, so too shall forensic science prevail. Through its role as a pioneer in the field of forensics, the dedicated research of educated professionals and the continued advancement of technology, fingerprint identification will continue to evolve, to persevere, and to gain credibility within the science world.

Bibliography

Anthonioz, A., Egli, N., Champod, C., Neumann, C., Puch-Solis, R., & Bromage-Griffiths, A. (2008). Level 3 details and their role in fingerprint identification: A survey among practitioners. Journal of Forensic Identification, 58(5), 562-589.

Anthonioz, A., Egli, N., Champod, C., Neumann, C., Puch-Solis, R., & Bromage-Griffiths, A. (2011). Investigation of the reproducibility of third-level characteristics. Journal of Forensic Identification, 61(2), 171-192.

Ashbaugh, D. R. (1999). Quantitative-Qualitative Friction Ridge Analysis – An Introduction to Basic and Advanced Ridgeology. Boca Raton: CRC Press.

Ashbaugh, D. R., & Houck, M. M. (2005). Fingerprints and Admissibility: Friction Ridges and Science. The Canadian Journal of Police & Security Services, 3(2), 107-108.

Barnes, J. G. (2011). History. In SWGFAST, The Fingerprint Sourcebook (pp. 1-18). Washington, D.C.: National Institute of Justice (NIJ).

Champod, C., & Evett, I. W. (2001). A probabilistic approach to fingerprint evidence. Journal of Forensic Identification, 51(2), 101-122. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194791354?accountid=12168

Evidence Technology Magazine. (2007). Interview with David R. Ashbaugh. Evidence Technology Magazine, 5(3).

Evidence Technology Magazine. (2008). AFIS Interoperability. Evidence Technology Magazine, 6(1).

Evidence Technology Magazine. (2008). Getting out of the Loupe. Evidence Technology Magazine, 6(4).

Galton, F. (1892, March). Finger Prints. “Holy Grail” Reference Library for Latent Print Examiners(3.7).

Gill, K. W., & Lock, D. (2000). Cloned sheep of Roslin: Muzzle prints. Journal of Forensic Identification, 50(3), 276-288. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194828343?accountid=12168

James, S. H., & Nordby, J. J. (2005). Forensic Science An Introduction to Scientific and Investigative Techniques. Boca Raton: CRC Press.

Jones, G. W. (2007). Courtroom Testimony for the Fingerprint Expert (2 ed.). Wildomar: Staggs Publishing.

Kahn, H. S., Ravindranath, R., Valdez, R., & Narayan, K. M. (2001). Fingerprint ridge-count difference between adjacent fingertips (dR45) predicts upper-body tissue distribution: evidence for early gestational programming. American Journal of Epidemiology, 153(4), 338-344. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/11207151

Langenburg, G. M. (2004). Pilot-study: A statistical analysis of the ACE-V methodology – analysis stage. Journal of Forensic Identification, 54(1), 64-79. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194790119?accountid=12168

Leo, W. F. (1994). Friction Skin Identification A Scientific Approach. The Print, 10(3), pp. 1-3.

Medland, S. E., Loesch, D. Z., Mdzewski, B., Zhu, G., & Montgomery, G. W. (2007). Linkage Analysis of a Model Quantitative Trait in Humans: Finger Ridge Count Shows Significant Multivariate Linkage to 5q14.1. PLoS Genetics, 3(9), 1736-1744. doi:10.1371/journal.pgen.0030165

Neumann, C., Mateos-Garcia, I., Langenburg, G., Kostroski, J., Skerrett, J. E., & Koolen, M. (2011). Operational benefits and challenges of the use of fingerprint statistical models: A field study. Forensic Science International, 212(1-3), 32-46. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194794262?accountid=12168

Neumann, C., Champod, C., Puch-Solis, R., Egli, N., Anthonioz, A., & Bromage-Griffiths, A. (2007). Computation of Likelihood Ratios in Fingerprint Identification for Configurations of Any Number of Minutiae. Journal of Forensic Sciences (Blackwell Publishing Limited), 52(1), 54-64. doi:10.1111/j.1556-4029.2006.00327

Neumann, C., Evett, I. W., Skerrett, J. E., & Mateos-Garcia, I. (2011). Quantitative assessment of evidential weight for a fingerprint comparison I: generalisation to the comparison of a mark with a set of ten prints from a suspect. Forensic Science International, 207(1-3), 101-105. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/866350042?accountid=12168

Neumann, C., Evett, I. W., Skerrett, J. E., & Mateos-Garcia, I. (2012). Quantitative assessment of evidential weight for a fingerprint comparison part II: A generalization to take account of the general pattern. Forensic Science International, 214(1-3), 195-199. doi:10.1016/j.forsciint.2011.08.008

Oates, R. T. (2000). Elbow print identification. Journal of Forensic Identification, 50(2), 132-137. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194794262?accountid=12168

Okajima, M. (1975). Development of dermal ridges in the fetus. Journal of Medical Genetics, 12(3), 243-250. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1013284/?tool=pubmed

Olsen, R. D. (1978). Scott’s Fingerprint Mechanics. Springfield: Bannerstone House.

Seidenberg-Kajabova, H., Pospisilova, V., Vranakova, V., & Varga, I. (2010). An original histological method for studying the volar skin of the fetal hands and feet. Biomedical Papers of the Medical Faculty of the University Palacky Olomouc Czechoslovakia, 154(3), 211-218. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/21048806

Singh, I., Chattopadhyay, P. K., & Garg, R. K. (2005). Determination of the hand from single digit fingerprint: a study of whorls. Forensic Science International, 152(2-3), 205-208. Retrieved from http://www.sciencedirect.com/science/article/B6T6W-4F9SY62-1/2/6f4d8b103ee8fbf02740472fc7f79a33

Srihari, S. N., Srinivasan, H., & Fang, G. (2008). Discriminability of fingerprints of twins. Journal of Forensic Identification, 58(1), 109-127.

SWGFAST. (2002). Friction Ridge Examination Methodology for Latent Print Examiners. Retrieved from SWGFAST: http://www.swgfast.org/index.html

SWGFAST. (2011, August). SWGFAST Origin and Growth. Retrieved from SWGFAST.org: http://www.swgfast.org/Resources/SWGFAST-Origin-and-Growth.pdf

Swofford, H. J. (2005). Fingerprint patterns: A study on the finger and ethnicity prioritized order of occurrence. Journal of Forensic Identification, 55(4), 480-488. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194791539?accountid=12168

Swofford, H. J. (2008). The ontogeny of the friction ridge: A unified explanation of epidermal ridge development with descriptive detail of individuality. Journal of Forensic Identification, 58(6), 682-695.

theiai.org. (2011, August 13). IAI History. Retrieved from The International Association for Identification: http://www.theiai.org/history/

Turner, J. M., & Weightman, A. S. (2007). Focus on pores. Journal of Forensic Identification, 57(6), 874-882.

U.S. Court of Appeals for the Fourth Circuit. (2003, March 31). U.S. v. Crisp. Retrieved from NLADA: http://www.nlada.org

U.S. Department of Justice Federal Bureau of Investigation. (1984). The Science of Fingerprints. Washington, D.C.: Government Printing Office.

U.S. Dept. of Justice – FBI. (2011). Integrated Automated Fingerprint Identification System. Retrieved from FBI – The Federal Bureau of Investigation: http://www.fbi.gov/about-us/cjis/fingerprints_biometrics/iafis/iafis

U.S. District Court Southern District of Indiana Indianapolis Division. (2000, October 5). U.S. V. Havvard. Retrieved from Federal Evidence: http://federalevidence.com

U.S. Supreme Court. (1993, June 28). Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 US 579 – Supreme Court 1993. Retrieved from Google Scholar: http://scholar.google.com

Ulery, B. T., Hicklin, R. A., Buscaglia, J., & Roberts, M. A. (2011, March 31). Accuracy and reliability of forensic latent fingerprint decisions. Retrieved from PNAS: http://www.pnas.org

Wertheim, K., & Maceo, A. (2002). The critical stage of friction ridge and pattern formation. Journal of Forensic Identification, 52(1), 35-85. Retrieved from http://ezproxy.loyno.edu/login?url=http://search.proquest.com/docview/194801682?accountid=12168

Forensic Evidence Solves 29-Year Cold Case Murder: A Case Study of State of Connecticut v. Edward Grant

Written by: Aischa S. Prudhomme, May 2012

The Murder of Penney Serra

The Crime Scene

On July 16th, 1973, the body of 22-year-old Concetta “Penney” Serra was found in the stairway at the 9th level of a parking garage in New Haven, Connecticut. She was lying shoeless and curled into a fetal position, wearing a blue knit mini dress. At the bodice of the dress was a large red stain which appeared to be blood. A hairpiece belonging to the victim was also found lying inside the garage on the 9th level.

On level 8, investigators located a 1972 Buick registered to the victim’s father. The vehicle appeared to have been abandoned hastily. Investigators found blood on both the exterior and interior of the vehicle, including the exterior driver’s side door, window and door handle, and the interior near the left side of the driver’s seat. A box of tissues smeared with bloody finger marks lay in the foot well directly behind the driver’s seat, and the victim’s purse and other possessions were discovered untouched inside the vehicle.

On the concrete floor next to the driver’s side of the vehicle was a bloody trail, which led away from the car and down to the 5th level of the garage, then back up to a parking area on the 7th level, where it stopped near a set of keys and a man’s bloody handkerchief.

The Eyewitnesses

Shortly after 1:00 pm, eyewitnesses Timothy Woodstock and Frederick Petzold, III return from the shopping mall to their vehicle parked on level 9. They report seeing a man chasing a woman toward the stairs leading to the 10th level of the garage. They state that the man appeared to be grabbing at the woman’s back as she let out one long scream, then disappeared behind the elevator shaft and into the stairwell. Shortly afterward, the man reappears and is seen running with what looks like a shiny object in his hand. He then enters the Buick and drives the length of the garage in an apparent attempt to find the down ramp, but misses it and drives back up to level 10.

Witnesses on level 9, Jane Merold, Gary Hyrb and Ivan Hodes, report seeing a man in a Buick drive past them and up to level 10 in a fast and erratic manner. They describe the man as being either white or Puerto Rican, between the ages of 18-35 and most probably suffering from a wound on his hand or lower arm.

During that time, parking garage attendant Christopher Fagan is working the booth at the Frontage Road exit. Fagan reports that a man, driving either a blue or green Chrysler or GM model car, possibly with a Connecticut license plate beginning with the letters “MR” or “MH” and ending in four digits, hands him a bloody parking ticket with his right hand crossing his body. He further states that the man appeared to be injured and that Fagan asked him if he needed help. Though Fagan did not get a look at the man’s face, he states that the man spoke either in broken English or with a Hispanic accent.

The Autopsy

Serra’s body is taken to the Yale-New Haven Hospital to be medically examined. The autopsy determined that Serra’s blood type was Type A and that she suffered a single stab wound to the left side of her chest, which pierced the lower tip of her heart and caused her to bleed to death within a minute. The examination also revealed that the body had not been sexually molested and that the weapon was most probably a knife with a blade of at least 3 inches. The body also bore several bruises and abrasions, particularly to the right temple, right forehead, right knee and left ankle, as well as a cut on one of the fingers.

Police Theory of the Crime

Serra drove the 1972 Buick to the garage to go shopping at the mall; she often drove without shoes. From the time stamped on the parking ticket, investigators speculate that the assailant arrived approximately four minutes after Serra and entered the garage from the opposite entrance. Serra was attacked by her assailant on level 9 and then attempted to escape by running up the stairwell to level 10, where she died. During the attack the assailant was most likely injured, leaving a trail of blood that traces his movements inside the garage.

The Suspects

Suspect #1 Phil DeLieto, Serra’s former fiancé, was known to have been involved in a domestic dispute with Serra shortly before her death. He is identified in a lineup by one of the witnesses but was later cleared based on his blood type and alibi.

Suspect #2 Anthony Golino, a former classmate of Serra’s at Wilbur Cross High School, was arrested in 1984 for her murder. The arrest was based on probable cause stemming from incriminating statements made by Golino’s wife, who stated that her husband threatened to kill her and do to her “what [he] did to Penney Serra”, and the one inch scar on Golino’s left hand which he could not explain. Charges against Golino were later dismissed when samples of his blood were tested and found to be Type A, the same type as Serra’s and not the assailant’s.

Suspect #3, Selman Topicu, was a patient at the dental office where Serra worked but was also a cook at a diner which Serra frequented on a daily basis. Topicu also spoke with an accent, had Type O blood and drove a Buick with a license plate of HN5533; however, police could not get judges to sign a warrant for Topicu’s arrest.

Forensic Evidence (1973-1988)

Blood

In 1973, DNA testing was not available, as it had not yet been developed; instead, the technique of blood typing was used. During the initial investigation into Serra’s murder, blood samples collected from the handkerchief, from the interior and exterior of Serra’s car, and from the blood trail all tested positive for Type O human blood.

Then in 1988, another type of blood testing, called DQ Alpha testing, was used to analyze some of the blood samples collected from the interior of Serra’s car. Results of this test further revealed that the blood was Type O human blood with a DQ Alpha type of 1.2,3, which exists in only roughly five percent of the general population.
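To put that five percent figure in perspective, the short sketch below works through the arithmetic of how such a marker narrows, but cannot individualize, a pool of possible blood donors. It is purely illustrative; the population size used is an assumption for demonstration, not a figure from the case record.

```python
# Illustrative arithmetic only: how a serological marker narrows a suspect pool.
# The population figure below is an assumed example, not a number from the case record.

dq_alpha_123_freq = 0.05           # "roughly five percent" of the general population (from the text)
hypothetical_population = 500_000  # assumed size of a regional pool of potential donors

# Expected number of people in that pool who would share the DQ Alpha 1.2,3 type:
expected_matches = hypothetical_population * dq_alpha_123_freq
print(f"Expected individuals sharing DQ Alpha 1.2,3: {expected_matches:,.0f}")
# A marker this common can exclude suspects, but it cannot individualize a source.
```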

Latent Prints

In 1973, one latent print developed on the exterior of the tissue box and two latent prints collected from the interior of Serra’s car were compared to the 70,000 fingerprints on file with Connecticut State Police; however, no match was made during this time. Additionally, Dr. Henry Lee, who was the chief criminalist at the Connecticut State Police Forensic Laboratory, determined that blood on the tissue box had been deposited on top of the latent print and concluded there was no way to determine the time interval between the placement of the latent print and the deposit of blood.

Genetic Material on Serra’s Clothing

Testing of Serra’s clothing worn at the time of her murder revealed the presence of genetic material on the exterior of Serra’s panties and slip. In 1987, Dr. Lee concluded that the genetic material was semen; due to degradation, however, it could not be determined whether the semen was already on the garments when Serra put them on. Dr. Lee did infer that the semen was most likely deposited onto the outside of the garments.

Edward Grant is Arrested

The Latent Print

In 1997, Christopher Grice, a latent print examiner with the Connecticut State Police Forensic Laboratory, conducts a search of the latent print from the tissue box using the State’s Automated Fingerprint Identification System (AFIS) and identifies the print to Edward Grant’s left thumb. Grant’s fingerprints had not previously been on file; however, after an arrest for domestic violence in 1994, his fingerprints were scanned into the AFIS database.
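For readers unfamiliar with how such a database search works in broad strokes: an AFIS search does not identify anyone by itself; it returns a ranked list of candidate prints whose features most resemble the searched latent, and the examiner then performs the actual comparison. The toy sketch below only illustrates the general idea of scoring and ranking candidates by crude minutiae overlap; every name, coordinate and threshold in it is invented for illustration, and real AFIS matchers are proprietary and far more sophisticated.

```python
# Toy illustration of ranking candidate prints by crude minutiae overlap.
# All data and thresholds here are invented; this is not how a real AFIS works internally.

from math import hypot

Minutia = tuple[float, float]  # (x, y) position only, ignoring ridge angle and type

def similarity(latent: list[Minutia], candidate: list[Minutia], tol: float = 3.0) -> int:
    """Count latent minutiae that have a candidate minutia within `tol` units."""
    score = 0
    for lx, ly in latent:
        if any(hypot(lx - cx, ly - cy) <= tol for cx, cy in candidate):
            score += 1
    return score

# Invented example data: one latent print and two database entries
latent_print = [(10.2, 14.8), (22.5, 30.1), (35.0, 12.4), (40.7, 44.9)]
database = {
    "candidate_A": [(10.0, 15.0), (22.0, 30.0), (35.5, 12.0), (41.0, 45.0)],
    "candidate_B": [(5.0, 5.0), (50.0, 50.0), (12.0, 40.0), (30.0, 8.0)],
}

# The system only produces a ranked candidate list; a human examiner makes the identification.
ranked = sorted(database, key=lambda k: similarity(latent_print, database[k]), reverse=True)
print("Candidates for examiner review, best first:", ranked)
```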

Police Interview Grant

On September 3rd, 1997, armed with the fingerprint match, Inspectors Gerald Hanahan and John Torento from the Chief State’s Attorney’s Office pay a visit to Edward Grant at his place of employment in Waterbury, Connecticut. They inform Grant that his left thumb print has been identified to a latent print on a tissue box found inside the vehicle of Penney Serra, who was murdered in New Haven in 1973. When questioned about his relationship to the victim, Grant states that he occasionally travels to New Haven for business but claims he did not know Penney Serra or her family. Grant does admit, however, that he has Type O blood.

On September 4th, 1997, Hanahan and Torento return to Grant’s place of employment for a second interview. Grant cannot explain why or how his fingerprint got on the tissue box and admits that he has memory problems as the result of a head injury, which he incurred in a Jeep accident while he was in the military prior to 1973. Additionally, he admits that he is currently undergoing treatment for this injury with prescription medication. Grant is then asked to voluntarily submit a sample of his blood, to which he initially agrees but then changes his mind the next day on the advice of his attorney.

On September 19th, based on probable cause from the fingerprint match along with the totality of information obtained during the two interviews, including corroboration of the Type O blood, a search warrant to obtain Grant’s blood is signed and executed. The testing of Grant’s blood confirms that it is Type O with a DQ Alpha type of 1.2,3.

Pursuant to probable cause obtained from both the fingerprint match and the blood test results, a warrant for Edward Grant’s arrest is signed and executed. On June 24th, 1999, the arrest warrant is served on Grant at his home in Waterbury, Connecticut by Inspectors James Rovella and Peter Fearon of the Chief State’s Attorney’s Office, accompanied by officers from the Waterbury Police Department. Edward Grant, 59 years old at the time of his arrest, is charged with the 1973 murder of Penney Serra.

The Trial

In May of 2002, Edward Grant is tried for the murder of Penney Serra. The New Haven cold case had gone unsolved for 29 years; however, with advances in forensic technology involving two key pieces of evidence, the case is finally brought to trial. Those same pieces of forensic evidence, the prosecution will argue, point to the only person responsible for the death of Penney Serra: Edward Grant. The defense does not challenge the validity of the forensic analysis that has seemingly implicated Grant; instead, it focuses on faulty evidence handling and even possible tampering by the police. The defense also argues that there is neither a motive nor a murder weapon. Both sides, however, will use each crucial piece of evidence to their advantage in arguing their case to the jury.

The Prosecution’s Case

#1) The Latent Print – Probably the single most important piece of evidence in the prosecution’s case is the latent print that was identified to Edward Grant. The AFIS hit in 1997 resolved a latent print that had gone unidentified since 1973, and it led directly to the cold case being reopened and to Grant’s arrest. Furthermore, the identification raises the obvious question of why Grant’s thumb print would be on a tissue box inside the victim’s vehicle when, by his own admission, Grant did not even know the victim.

#2) Blood – Several pieces of blood evidence were collected from the crime scene, from both the victim and the assailant; there was even a trail of the assailant’s blood leading from the victim’s car through three levels of the garage. There was blood on the front and back of the victim’s dress, blood on several objects and surfaces within the victim’s car, blood on the parking ticket and blood on the handkerchief. In the absence of DNA testing at the time, however, the blood could only be analyzed by blood typing.

In 1973, basic blood typing confirmed that only the blood on the front of the victim’s dress was Type A (Penney Serra’s blood type) and that all of the rest was Type O (the assailant’s blood type). This corroborates statements made by eyewitnesses Woodstock and Petzold that the assailant was seen grabbing at the back of Serra’s dress and that, in all likelihood, he left his Type O blood on her dress with his injured hand in the process.

Then in 1988, a more advanced method of blood typing called DQ Alpha typing established that the Type O blood obtained from the interior of the victim’s car could be further narrowed to a DQ Alpha of 1.2,3. Statistically, only 5 percent of the general population has that particular DQ Alpha type. From this information, along with the 1997 testing of Edward Grant’s blood, which was also positive for Type O with a DQ Alpha of 1.2,3, investigators could eliminate the two previous suspects who merely shared Type O blood. This now leaves Grant with two strikes against him.

#3) DNA – In 2000, DNA testing conducted on the handkerchief positively confirmed a match to Edward Grant’s DNA profile. In fact, DNA expert Dr. Ladd testified that the expected statistical frequency of this particular profile within the general population is one in 6.9 trillion (see the illustrative sketch after this list for how such figures are typically calculated). The DNA results corroborated what the other evidence could not: that Edward Grant was in that parking garage at the time Penney Serra was murdered. This evidence was the clincher in the case against Grant.

#4) Paint Stains – Upon initial examination of the handkerchief, it was determined that, along with the blood, there were spots consistent with some type of paint. Dr. Maxwell testified that these spots were in fact paint, and particularly consistent with a type of aftermarket paint used in auto body shops. At the time of the murder, Grant worked in an auto repair shop owned by his parents, where he would have had ready access to this type of paint.

#5) Composite Sketches – In 1973, several eyewitnesses gave descriptions of a man who was seen chasing Penney Serra and later driving erratically inside the garage. As a result, several composite sketches were rendered, which bear a striking resemblance to Edward Grant’s appearance at that time.

#6) Grant’s Spontaneous Statements to Police – After his arrest, Grant was taken into custody and transported to the Bethany Barracks by Inspectors Rovella and Fearon. Along the way, Grant is given his Miranda rights and asked if he would like to waive those rights and speak to the investigators, to which Grant replies, “Maybe I should wait.” At this time there is no further questioning; however, Grant, of his own volition, expresses confusion as to why he is being charged with Serra’s murder and how his fingerprint could have been inside the victim’s car. Rovella then explains to Grant that along with the fingerprint, Grant’s blood was also found inside the car and in the garage.

Once Grant arrives at the barracks, he is once again given his Miranda Rights. Grant again indicates that he is aware of these rights. After Grant is booked and fingerprinted, he is asked once again if he would like to waive his rights and speak to the investigators. Grant then states that he would like to speak to the investigators but would first like to speak with his attorney. Again, at this point, all questioning of Grant ceases; however, Grant again initiates a conversation with Rovella by stating, “Did you read about the guy in Texas that killed all those people? They got him on a fingerprint too.” Rovella then acknowledges that he has read about that particular case; and without any questioning by Rovella, Grant continues to explain that he wouldn’t even be able to recall where he was or who he was with at the time of the murder. He goes on to explain that he used to go into New Haven on business as an insurance adjuster and also to purchase paint for his family’s auto repair business where he worked. He further states that he often suffers from blackouts and memory loss due to an accident he incurred when he was in the Army during the 60’s; and as a result of this injury, metal plates were placed inside his head. Grant then adds that he has been treated for these injuries at the Veteran’s Hospital in West Haven. To this, Rovella nods his head in response but asks no questions.

#7) Autopsy Results- The medical examination of Penney Serra’s body was conducted at the Yale-New Haven Hospital. Results from the autopsy reveal that Serra had Type A blood and that she suffered from a single stab wound that pierced the lower tip of her heart, causing her to bleed to death. Examination also revealed that the wound was made using a knife with a blade of at least 3 inches in length. Results of the autopsy also showed that Serra had not been sexually molested; and that her body, in addition to the stab wound, had several bruises and abrasions.

#8) Possible Murder Weapon – Though no murder weapon was ever found, results of the autopsy confirm that Serra died from a single stab wound and that the weapon used was most probably a knife with a blade of at least 3 inches. Testimony also revealed that a knife fitting that description, borrowed from a family friend, was missing from the vehicle at the time of Serra’s death. The prosecution displays a replica of this knife to the jury.
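As noted in item #3 above, the “one in 6.9 trillion” figure is the kind of number produced by multiplying the estimated population frequencies of the genotypes observed at each tested locus (the product rule). The sketch below only illustrates that multiplication with invented per-locus frequencies; it is not a reconstruction of Dr. Ladd’s actual calculation or of the loci tested in this case.

```python
# Illustrative product-rule calculation for a multi-locus DNA profile frequency.
# The per-locus genotype frequencies below are invented for demonstration;
# they are NOT the figures from the Grant case or from Dr. Ladd's testimony.

from functools import reduce

locus_genotype_freqs = [0.063, 0.081, 0.048, 0.102, 0.059, 0.071, 0.044, 0.089]

# Assuming the loci are independent, the estimated profile frequency is the
# product of the individual genotype frequencies at each locus.
profile_freq = reduce(lambda a, b: a * b, locus_genotype_freqs, 1.0)

print(f"Estimated profile frequency: about 1 in {1 / profile_freq:,.0f}")
```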

The Defense’s Case

#1) The Latent Print – The defense argues that the tissue box is a movable object which could have been touched by Grant in the Pathmark store where it was purchased before being given to the victim. Dr. Henry Lee supports this possibility by confirming that the bloody finger marks on the tissue box were deposited on top of the latent print, and he further states there is no way to determine the interval of time between the placement of the latent print and the bloody finger marks. Latent print expert Christopher Grice corroborates Lee’s testimony by stating that there is no way to determine when a latent print was deposited.

The defense further argues that there were three other latent prints taken from Serra’s vehicle which do not match Grant’s and have not been identified. The prosecution’s rebuttal argues that none of the stock boys’ prints from the Pathmark were left on the tissue box and that the only person’s fingerprint that has been identified is Grant’s.

#2) Blood – The defense argues that the blood found inside Serra’s car (on the seat, the tissue box, a tissue next to the tissue box, the steering column, the floor pedal, the exterior door handle, a small white envelope, a pink rag on the floor, the metal trim of the left door and the debris on the floor), as well as the blood trail from three different locations and the blood on the back of Serra’s dress, all tested positive for human Type O blood; therefore, former suspects Phillip DeLieto and Selman Topicu, who both have Type O blood, cannot be excluded.

#3) DNA – The defense argues that all but one spot of genetic material on the handkerchief had degraded beyond the point of testing, and states that “something is not right with the handkerchief”, implying that Grant’s DNA had been planted by the police. Testimony from DNA expert Dr. Ladd confirms there is no way to determine how, when or under what circumstances the DNA was deposited.

The defense also raises questions about why the bloody parking ticket took two days to dry while the handkerchief was “bone dry” when it was collected. The defense draws further attention to the mishandling and improper storage of evidence by the police, pointing out that the parking ticket with bloody fingerprints has gone missing along with the car keys, which precludes any DNA testing or fingerprint comparison of those items. The defense also notes that the blood evidence was stored next to a furnace in a basement for almost 30 years while in police custody, which both bolsters its evidence-planting theory and explains the degradation of biological fluids that precluded DNA testing of those items.

The defense also adds that photos taken at the crime scene in 1973 do not show a close-up of the handkerchief and argues that the object shown in the crime scene photos looks more like a shirt or a towel than a handkerchief. In rebuttal, the prosecution argues that witnesses have testified under oath to the authenticity of the handkerchief in those photos.

#4) Composite Sketches – The defense argues that eyewitness descriptions of the assailant are conflicting and that their testimony should therefore be discredited. Examples of conflicting eyewitness information include: Hispanic vs. white male, dark skinned vs. light skinned, foreign accent vs. English, injury to the assailant’s left hand or arm vs. Grant having no cuts or scars on his left arm or hand, Polo shirt vs. work uniform, dark blue pants vs. light green dressier pants, short sleeved shirt vs. long sleeved work shirt bearing a logo, and well trimmed mustache vs. no facial hair. The defense also draws attention to admissions made by eyewitnesses Merold and Hyrb that they had been smoking marijuana shortly before they saw the assailant, and to the fact that Hyrb identified former suspect Phil DeLieto in a lineup.

#5) Paint Stains – The defense notes that Dr. Maxwell also stated in her testimony that this type of paint could be used in kitchens and bathrooms or on exterior trim and other surfaces that undergo a high level of traffic or activity.

#6) Murder Weapon- In reference to the prosecution’s exhibit of a replica of the type of knife which was missing from the victim’s car and fits the description of the autopsy results, the defense simply argues that there was no murder weapon found.

#7) Lack of Motive – The defense argues that the prosecution has established no motive and offers the theory that Penney Serra knew her attacker and that the meeting was arranged. The defense points out that Edward Grant did not know the victim or her family, whereas three former suspects who knew Serra did have motive. The defense further supports its theory of an arranged meeting by noting that the entire incident, from the meeting with the victim to her murder, took only between two and six minutes. Questions were also raised as to why Serra got “dressed up” just to go shopping for patio furniture, and why she parked on the 9th floor when she usually parked between the 3rd and 5th floors, even though the garage was not full. And lastly, why did Serra have traces of semen on her undergarments, and how did it get there? Was it related to attempted sexual activity with the assailant before she was murdered? The absence of motive, the defense argues, creates reasonable doubt.

The prosecution argues in rebuttal that motive need not be proven and that the jury is not required to determine one; but if a motive is wanted, then perhaps the perpetrator was trying to steal Serra’s car. The defense counters that Edward Grant had no reason to steal Serra’s car when he had his own.

A 29-Year Old Cold Case is Solved

The Verdict

On May 28th, 2002, Edward Grant was found guilty of the murder of Penney Serra, and on September 27th, 2002, he was sentenced to 20 years to life. With the identification of the latent print and the DNA match, it is apparent that the jury could not deny the overwhelming role forensic evidence played in the strength of the prosecution’s case. The defense did raise some valid arguments with regard to improper evidence handling and possible tampering by the police. Even so, given the totality of inculpatory evidence, from the forensic analysis to the spontaneous statements made by the defendant, the jury’s doubt fell well short of reasonable doubt and approached no doubt at all.

I tend to agree with the jury’s verdict. There was substantially more evidence pointing toward Grant’s guilt than toward his innocence. The defense did raise some valid arguments; however, those arguments were much weaker than the prosecution’s and failed to hold up against the strength of the forensic science, and for those reasons I believe Grant is guilty.

 

Bibliography

Associated Press. (2002, February 11). Dead men tell tales in trial for 28-year old murder. Yale Daily News. Retrieved from http://www.yaledailynews.com

Connecticut Superior Court. (2002, February 6). 2002 Case Law Superior Court of Connecticut: State of Connecticut v. Edward R. Grant. Retrieved from Lexis Nexis : http://webservices.lexisnexis.com

Connecticut Superior Court. (2002, February 19). State of Connecticut v. Edward R. Grant: No. CR99-0481390. Retrieved from Find Law.com: http://caselaw.findlaw.com

Keating, C. (2002, May 23). Closing Arguments Presented in Serra Case. Hartford Courant. Retrieved from http://articles.courant.com

Keating, C. (2002, May 17). Trying To Grasp Penney Serra’s Last Day: Court Visits Scene Of July 1973 Killing. Hartford Courant. Retrieved from http://articles.courant.com

Kovner, J. (2002, September 28). Sentencing Ends 1973 Serra Case: Grant Proclaims His Innocence To The End Gets 20 Years To Life For Murder. Hartford Courant. Retrieved from http://articles.courant.com

Lissitzyn, C. B. (2007). Forensic Evidence in Court: A Case Study Approach. Durham: Carolina Academic Press.

Supreme Court of Connecticut. (2008, April 22). State v. Grant, 944 A. 2d 947 – Conn: Supreme Court 2008. Retrieved from Google Scholar: http://scholar.google.com

 

Do You Know What’s in Your Medicine Cabinet? A Look at the Rise in Prescription and Over-the-Counter Drug Abuse Among Teens

Standard

Written by: Aischa S. Prudhomme February 2009

Abstract

Prescription drugs have increasingly become the drugs of choice for many in the United States. However, with their popularity rising among teens, only recently has it become a growing concern for parents. This paper will examine the escalation of prescription and over the counter drug abuse among teens. It will describe the several types of commonly abused prescription drugs, illustrate their effects as well as discuss the signs of abuse. In addition, it will highlight the rates of abuse and reveal the reasons indicated by teens for deciding to abuse prescription drugs. Lastly, the measures and guidelines for the prevention of future drug abuse will be presented. 

Do you know what’s in your medicine cabinet? Your teen probably does. That is but one of many questions parents may find themselves asking. National studies and published statistics suggest that not only do teens know what types of prescription drugs are in their parents’ medicine cabinets, but that these drugs are fast becoming their drugs of choice. In fact, teens abuse prescription drugs more than any other illicit drug, with the exception of marijuana (Office of National Drug Control Policy, Jan. 2008). Specifically, powerful painkillers such as Vicodin and OxyContin are among the leading drugs of abuse for youths between the ages of 12 and 17 (Maxwell, 2006).

But what are these drugs and what effects do they have on the body? To better understand why teens choose to use prescription drugs, one must first understand how these drugs work. The most commonly abused prescription drugs fall into three classes: opioids, depressants and stimulants (Office of National Drug Control Policy, Jan. 2008).

Opioids

Probably the most commonly abused prescription drugs are opioids, also referred to as painkillers or narcotics. Because of their analgesic properties, they are commonly prescribed to alleviate pain (Office of National Drug Control Policy, Jan. 2008). These drugs are produced synthetically from morphine, which is extracted from opium, a derivative of the poppy plant (Saferstein, 2007). Some of the most common examples of opioids include oxycodone (OxyContin, Percocet, Percodan) and hydrocodone (Vicodin). On the street, these drugs may be referred to as “Oxy”, “O.C.”, “Percs”, “Vikes” and “Vikings”. They are typically ingested orally in pill form, but they can also be injected intravenously or crushed and smoked, and they are said to produce a sense of well-being and feelings of euphoria. Side effects include drowsiness, nausea and possible respiratory depression, while withdrawal symptoms can include restlessness, insomnia, muscle and bone pain, diarrhea and vomiting (www.nida.nih.gov).

Depressants

Another class of commonly abused prescription drugs is depressants. Depressants, also referred to as “downers” or sedatives, are frequently prescribed to treat anxiety and sleep disorders (Office of National Drug Control Policy, Jan. 2008).

One example of prescribed depressants is barbiturates (Amytal, Nembutal, Seconal, phenobarbital), which are derived from barbituric acid (Office of National Drug Control Policy, Jan. 2008). Barbituric acid was first synthesized over a hundred years ago by German chemist Adolf von Baeyer. On the street, barbiturates are known as “yellow jackets”, “blue devils” and “reds”, names usually derived from the colors of the pills (Saferstein, 2007). Another example of prescribed depressants is benzodiazepines, such as diazepam (Valium) and alprazolam (Xanax), which are known on the street as “candy”, “sleeping pills”, “xanbars”, “roofies” and “benzos” (www.nida.nih.gov).

Both barbiturates and benzodiazepines can be ingested orally in pill form or injected intravenously and are said to produce a drowsy or calming effect (www.nida.nih.gov). Some of the side effects attributed to these drugs include depression, fatigue, confusion and irritability. Withdrawal symptoms can include anxiety, insomnia, muscle tremors and loss of appetite (www.theantidrug.com). Long-term use can lead to tolerance; as a result, larger doses are required to achieve the initial effects (www.nida.nih.gov).

Stimulants

The final class of commonly abused prescription drugs is stimulants. Also known as “uppers” or “speed”, stimulants were at one time prescribed as appetite suppressants for the treatment of obesity (www.nida.nih.gov). Today, they are prescribed for the treatment of Attention Deficit/Hyperactivity Disorder (ADHD), narcolepsy and depression, and are said to increase alertness, attention and energy (Office of National Drug Control Policy, Jan. 2008).

Some examples of prescribed stimulants are dextroamphetamines (Dexedrine, Adderall) and methylphenidate (Ritalin, Concerta), which are commonly known on the street as “bennies”, “black beauties”, “speed”, “truck drivers” and “uppers”. They can be ingested orally in pill form or crushed and smoked or snorted, and they are said to produce feelings of euphoria due to an increase of dopamine in the brain (www.nida.nih.gov). Some of the side effects associated with these drugs are feelings of hostility or paranoia; high doses may result in an elevated body temperature and irregular heartbeat, which can lead to cardiovascular failure or seizures. Withdrawal symptoms include fatigue, depression and disturbance of sleep patterns (www.nida.nih.gov).

Over-the-Counter (OTC) Drug Abuse

Another substance that is often abused by teens, and is commonly found in just about every medicine cabinet in the United States, is cough and cold medicine. Many parents are unaware of the dangers involved in the abuse of these OTC drugs. Because they are over-the-counter medications, available without a prescription, abuse frequently goes undetected (www.theantidrug.com).

The drug dextromethorphan, more commonly known as DXM, is the active ingredient found in several OTC cough and cold medications, including Coricidin, NyQuil and Robitussin (Office of National Drug Control, Jan. 2008). DXM, known on the street as “Orange Crush”, “Triple C’s”, “Skittles”, “Robo” and “Robo-trippin’” (www.streetdrugs.org), is a cough suppressant available in either caplet or liquid form and is ingested orally (www.theantidrug.com).

When taken in large doses, DXM can produce feelings of euphoria and mild hallucinogenic effects, with adverse side effects including impaired judgment, loss of coordination, dizziness and nausea (www.theantidrug.com). In some cases of extreme abuse, overdoses resulting in coma and/or death have been reported. Some of the withdrawal symptoms associated with DXM use are restlessness, insomnia, muscle or bone aches, diarrhea and vomiting; with prolonged use, tolerance and physical dependence may also develop (www.streetdrugs.org).

Reasons for Abuse

So, how did something as seemingly innocent as a little pill become one of the most highly addictive and widely abused substances among teens? There are several answers to that question. The first is that many parents are simply unaware of the abuse and are not discussing the dangers of prescription and over-the-counter drug use with their children. In addition, under the myth that prescription drug use is safe, many teens and parents believe that these drugs “provide a safe high” because they are readily accessible in the home (Office of National Drug Control Policy, Jan. 2008).

But why do teens choose to use prescription and over the counter drugs over other illicit “street” drugs and what are the rates of abuse? In a 2005 study conducted by the Partnership for a Drug-Free America, 7,218 adolescents nationwide, in grades 7-12, completed self-administered questionnaires which were then compiled into a survey. This self-report survey is an annual survey that tracks the attitudes of teens towards drug use. Included in that survey were questions focusing specifically on prescription pain reliever use. At the end of the study, the data was compiled in the Partnership Attitude Tracking Study (PATS) (PATS, 2006).

According to the survey, there were several reasons given for prescription drug use, but as the study reveals, easy access and availability coupled with the belief that the drugs are not illegal, ranked among the highest. The table below lists the outcome of the portion of the survey that focused on prescription drug use (PATS, 2006).

Table 1. Reasons Reported for Using Prescription Pain Relievers: PATS Attitude Tracking Study – 2005

Reason                                                     %
Easy to get from parents’ medicine cabinet                62
Are available everywhere                                  52
They are not illegal drugs                                51
Easy to get through other people’s prescriptions          50
Teens can claim to have a prescription if caught          49
They are cheap                                            43
Safer to use than illegal drugs                           35
Less shame attached to using                              33
Easy to purchase over the internet                        32
Fewer side effects than street drugs                      32
Can be used as study aids                                 25
Parents don’t care as much if you get caught              21

Note. The data in this table was taken from the Partnership Attitude Tracking Study (PATS) – “Prescription Pain Reliever Abuse”; a study conducted and published by The Partnership for a Drug-Free America, May 16, 2006, p. 20.

Rates of Abuse 

In another study, conducted by the University of Michigan’s Institute for Social Research and funded by the National Institute on Drug Abuse (NIDA), 8th-, 10th- and 12th-graders participated in the Monitoring the Future (MTF) survey. This survey tracks the trends and attitudes in illicit drug use among teens (Maxwell, 2006).

The results of the 2008 study, which monitored trends of abuse between the years 2004 and 2007, reveal that Vicodin, OxyContin, Valium and Xanax use was highest among 12th-graders (Johnston, O’Malley, Bachman & Schulenberg, 2008). The table below shows the annual trends in the use of Vicodin, OxyContin and tranquilizers (Valium and Xanax).

Table 2. Annual Trends in the Use of Vicodin, OxyContin and Tranquilizers
Monitoring the Future Survey, 2004–2007

                     2004     2005     2006     2007
Vicodin
  8th Grade          2.5%     2.6%     3.0%     2.7%
  10th Grade         6.2%     5.9%     7.0%     7.2%
  12th Grade         9.3%     9.5%     9.7%     9.6%
OxyContin
  8th Grade          1.7%     1.8%     2.6%     1.8%
  10th Grade         3.5%     3.2%     3.8%     3.9%
  12th Grade         5.0%     5.5%     4.3%     5.2%
Tranquilizers
  8th Grade          2.5%     2.8%     2.6%     2.4%
  10th Grade         5.1%     4.8%     5.2%     5.3%
  12th Grade         7.3%     6.8%     6.6%     6.2%

% = percentage who used in the last 12 months

Note. The data in this table was taken from data compiled in the Monitoring the Future Survey Occasional Paper No. 69, a study conducted and published by the University of Michigan Institute for Social Research, (2008), available at www.monitoringthefuture.org.
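For anyone who wants to work with these MTF figures rather than eyeball them, the short sketch below simply encodes Table 2 and prints the percentage-point change from 2004 to 2007 for each drug and grade level. The values are copied from the table above; the code itself is only a convenience for exploring them.

```python
# Annual prevalence (%) from Table 2 (MTF, 2004-2007), keyed by drug and grade.
table2 = {
    "Vicodin":       {"8th": [2.5, 2.6, 3.0, 2.7],
                      "10th": [6.2, 5.9, 7.0, 7.2],
                      "12th": [9.3, 9.5, 9.7, 9.6]},
    "OxyContin":     {"8th": [1.7, 1.8, 2.6, 1.8],
                      "10th": [3.5, 3.2, 3.8, 3.9],
                      "12th": [5.0, 5.5, 4.3, 5.2]},
    "Tranquilizers": {"8th": [2.5, 2.8, 2.6, 2.4],
                      "10th": [5.1, 4.8, 5.2, 5.3],
                      "12th": [7.3, 6.8, 6.6, 6.2]},
}

years = [2004, 2005, 2006, 2007]

# Percentage-point change from 2004 to 2007 for each drug/grade combination.
for drug, grades in table2.items():
    for grade, values in grades.items():
        change = values[-1] - values[0]
        print(f"{drug:13s} {grade:>4s} grade: {values[0]}% -> {values[-1]}% "
              f"({change:+.1f} pts, {years[0]}-{years[-1]})")
```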

Signs of Abuse

How can a parent tell if their child has a problem with drugs? Many parents are unaware that their child even has a drug problem at all. However, there are many signs that may give clues to abuse. In adolescents, drug abuse should be suspected if there is a sudden increase in school absences, a decline in classroom performance and an overall lack of interest in school, as well as decreased participation in extracurricular activities. Other signs may include slurring of speech, puffiness of the face and temper tantrums (Harish, Sharma & Chavali, 2008).

Addicts usually become self-absorbed in the need to obtain drugs and often become inactive and non-committal in their day to day lives. Although there are many signs indicating drug abuse, it is difficult to recognize when abuse elevates to addiction (Harish et al., 2008). It should be noted that these signs are only an indication that there may be abuse and not a determination. While it may seem that all users are considered “hopeless addicts” or social “misfits”, they should not be labeled as such since most users present a facade of normality and appear to lead conventional lives (Saferstein, 2007).

Prevention

So, how can the problem of prescription and OTC drug abuse be addressed? Many parents say that they simply do not have the proper information or resources available to successfully aid in the prevention of teen drug use (PATS, 2008). This could explain why most parents are not having conversations with their kids about prescription and OTC drug abuse. It may also be due, in part, to the fact that many parents say they are more inclined to discuss the dangers of heroin and cocaine with their kids than those of prescription and OTC drugs, even though studies reveal that teens are less likely to use those street drugs (Office of National Drug Control Policy, Jan. 2008).

Besides proper communication, there are other precautions parents should take when safeguarding their child against prescription and OTC drug abuse. Parents should take note of and monitor the amounts and types of medication available within the home and should properly secure these medications at all times. In addition, any unused or expired medications should be properly disposed of; but most importantly, parents should set a good example for their kids by not sharing prescription drugs with them or abusing prescription drugs themselves. By following these simple guidelines, parents can make a difference in their child’s decision not to abuse prescription and OTC drugs (www.theantidrug.com).

Conclusion

As previously discussed, studies and trends reveal a dramatic rise in the abuse of prescription and OTC drugs among teens. The rise can be attributed to several factors, but the main one revealed is an absence of communication between parents and their kids. This can be associated, in part, with a lack of education about the dangers and risks involved in the abuse of these drugs, as well as a general reluctance by parents to discuss these issues. Whatever the case may be, clearly a problem exists. The problem of drug abuse is a concern that will likely remain constant; but with proper education and awareness, it can be diminished somewhat.

References

Drugs of Abuse – DXM. (n.d.). Over-the-Counter Drugs: Abuse Where You Least Expect It. Retrieved January 30, 2009, from Parents: The Anti-Drug, Web Site: http://www.theantidrug.com/DRUG_INFO/drug_info_dxm.asp

Harish, D., Sharma, B. R., & Chavali, K. H. (2008). Substance Abuse and its Medico-legal Implications: an Overview, Indian Journal of Forensic Medicine, Vol. 2, No. 1. Retrieved from IndMedica & Toxicology database.

Johnston, L. D., O’ Malley, P. M., Bachman, J. G., & Schulenberg, J. E. (2008). Demographic subgroup trends for various licit and illicit drugs, 1975-2007(Monitoring the Future Occasional Paper No. 69). [Online] Ann Arbor, MI: Institute for Social Research, pp. 325-346. Available at: http://www.monitoringthefuture.org/

Maxwell, J. C. (2006). Trends in the Abuse of Prescription Drugs. The Center for Excellence in Drug Epidemiology, The Gulf Coast Addiction Technology Transfer Center, The University of Texas at Austin, pp. 2-7.

National Institute on Drug Abuse (NIDA). (2008). Commonly Abused Drugs. Retrieved January 30, 2009, from the National Institute on Drug Abuse: Official Site, Web site: http://www.drugabuse.gov/DrugPages/DrugsofAbuse.html

National Institute on Drug Abuse (NIDA). (2008). Opioids. Retrieved January 30, 2009, from the National Institute on Drug Abuse: Official Site, Web site: http://www.nida.nih.gov/Researchreports/Prescription/prescription2.html

National Institute on Drug Abuse (NIDA). (2008). CNS Depressants. Retrieved January 30, 2009, from the National Institute on Drug Abuse: Official Site, Web site: http://www.nida.nih.gov/Researchreports/Prescription/prescription3.html

National Institute on Drug Abuse (NIDA). (2008). Stimulants. Retrieved January 30, 2009, from the National Institute on Drug Abuse: Official Site, Web site: http://www.nida.nih.gov/Researchreports/Prescription/prescription4.html

National Institute on Drug Abuse (NIDA). (2008). Trends in prescription drug abuse. Retrieved January 30, 2009, from the National Institute on Drug Abuse: Official Site, Web site: http://www.nida.nih.gov/Researchreports/Prescription/prescription5.html

Office of National Drug Control Policy, Executive Office of the President. (2008, January). Prescription for Danger: A Report on the Troubling Trend of Prescription and Over-the-Counter Drug Abuse Among the Nation’s Teens. pp. 1-7.

Partnership for a Drug Free America. (2006, May 16). The Partnership Attitude Tracking Study (PATS) Teens in grades 7 through 12, pp.4-6, and “Prescription Medicine Abuse”, p. 20.

Partnership for a Drug Free America. (2008, June 11). The Partnership Attitude Tracking Study (PATS) Parents 2007 Report, pp.4-6.

Prescription Drug Abuse. (n.d.). What can you do? Tips for preventing Rx abuse. Retrieved February 2, 2009, from Parents: The Anti-Drug, Web Site: http://www.theantidrug.com/drug_info/prescription_what_can_you_do.asp

Saferstein, R. (2007). Criminalistics: An Introduction to Forensic Science (9th ed.), Drugs (pp. 249-261). Upper Saddle River, New Jersey: Pearson Prentice Hall.

Streetdrugs.org. (2008). Dextromethorphan (DXM). Retrieved January 30, 2009, from http://www.streetdrugs.org/dxm.htm


Living in the “Big Sleazy”: Why Corruption is to Blame for New Orleans’ Violent Crime Rate and its Impact on Recovery

Standard

Written by Aischa S. Prudhomme – November 23, 2010

For the longest time New Orleans has been known as the “Big Easy” because of its easy living and slow-paced, laissez-faire atmosphere. But since the mid-Nineties, the “Big Easy” has also become known as the “murder capital” of the United States, a title it has consistently held even post-Katrina, and all anyone has to do is turn on the news or pick up a copy of the Times-Picayune to see why. The headlines are constantly riddled with stories of the city’s ever increasing murder rate and underlying corruption. This underlying corruption has pervaded New Orleans’ society and indirectly affected the city’s spiraling violent crime rate, turning it from the “Big Easy” into the “Big Sleazy”, and it continues to impede the city’s recovery efforts nearly six years after Hurricane Katrina.

New Orleans was first given the dubious title of “murder capital” in 1994, when its murder rate reached epic proportions. With an alarming 421 murders that year, the equivalent of 88 murders per 100,000 people, the city’s violent crime rate gained national attention as the highest and deadliest on record (Shapiro, n.d.). Although the murder rate had decreased somewhat since that year, by 2003 the numbers had once again begun to rise. Since Hurricane Katrina in 2005, and with a vastly decreased population, New Orleans has continued to rank as number one among the most violent cities in the U.S. With an average of 63.5 to 72.6 murders per 100,000 people in 2006 (depending on population estimates), the city had nearly reached its 1994 pre-Katrina record.

Today the numbers are not much better: with 57 murders for every 100,000 people in 2009, New Orleans beat out second-ranked St. Louis, at 47 murders per 100,000, and once again claims the title of “murder capital” (McCarthy, 2008). And just recently, in a list compiled by CNN, New Orleans was ranked among the top ten most dangerous cities in the world, alongside Baghdad, Iraq; Beirut, Lebanon; Cape Town, South Africa; and Juarez, Mexico. The only other U.S. city to make the list was Detroit, Michigan (2010). Though these alarming figures have long been attributed to the city’s festering drug trade, which has pervaded its blighted housing projects, in reality there is a more disturbing and profound cause: corruption.

It is no myth that New Orleans has held a reputation for corruption that long predates Katrina and spans many decades; only recently, however, with the efforts of U.S. Attorney Jim Letten and the media coverage during the aftermath of Hurricane Katrina, has it been fully exposed. In fact, according to an article in the Chicago Tribune, Louisiana ranks 3rd in the country among the most corrupt states, even beating out Illinois, which ranked 19th and has had its own notorious history of corruption (Witt, 2009).

Before the appointment of U.S. Attorney Jim Letten in 2001, corruption was perpetuated and accepted as “business as usual” in New Orleans; since that time, Letten has secured the indictment of 213 state, local and private individuals with a near 100 percent conviction rate (Carey, n.d.). Included in that figure is the 2007 conviction of former New Orleans City Council president Oliver Thomas, who was given a 3-year sentence for taking $15,000 in bribes and kickbacks in a parking garage contract (Dubos, 2010). And in 2009, Letten was instrumental in the convictions of former Congressman Bill Jefferson, his brother Mose Jefferson and former City Councilwoman Renee Gill-Pratt, as well as the indictments of several other Jefferson family members, all part of the infamous New Orleans political machine.

But Letten is probably most recognized for the prosecution and conviction of former Louisiana Governor Edwin Edwards back in 2000, when Letten was the assistant U.S. Attorney and lead prosecutor in the highly publicized political corruption case. Edwards, who was once quoted as saying, “They have not made a prosecutor who could ambush me,” is presently serving the last of a 10-year sentence in a federal prison on numerous counts of racketeering and extortion in that case (Dubos, 2010). Noting the state’s penchant for corruption, Letten states: “Over the course of many decades Louisiana and New Orleans have earned a reputation as being exceptionally tolerant of corruption” (Carey, n.d.). He further explains that this iniquitous atmosphere has contributed not only to a decline in education and health care, but also to the city’s rising crime rate. What is even more disturbing is that much of the city’s corruption exists within the very entity designed to protect and serve its citizens – the criminal justice system.

For over a decade violent crime has run rampant in this once charming but crumbling “city that care forgot”. The city, now dependent on tourism, was once a thriving and economically viable community, but since the oil bust of the mid-Eighties its economy has been on a steady decline. Add in urban flight, a failing educational system and a teeming illicit drug trade, combine that with rampant corruption, and you have a recipe for violent crime (Maxwell, 2005). The New Orleans Police Department was not immune to this decline. It was during this period that the department was depleted of its most experienced officers, who, seeking to move out of the city, took better paying jobs with other agencies. Because of a new city hiring requirement enacted by then-mayor Marc Morial, which stipulated that all new officers must reside within the city, the department was forced to lower its acceptance standards.

In The Desire Terrorist, an exposé about former NOPD officer and convicted killer Len Davis, author Dean Shapiro states:

Recruits with criminal records, DWIs, unfavorable employment records and dishonorable discharges from the Armed Forces were allowed to enter the Police Academy, whereas they had previously been excluded. A number of these new recruits had been charged with violent crimes as serious as armed robbery and rape. Some were openly recruited from the projects and off the street with no prior experience with the law, other than being on the receiving end of its consequences. Their records were expunged and, on completion of their training, they were issued badges, guns and patrol cars and turned loose on the street (n.d.).

The influx of these new recruits, coupled with the city’s already burgeoning crack epidemic, became the catalyst for the NOPD’s descent into corruption and police brutality. Over the next two decades the department would face a slew of disciplinary and criminal misconduct charges, including police brutality, as well as convictions for murder, rape and fraud, to name a few (Shapiro, n.d.).

More recently, through a federal investigation led by U.S. Attorney Jim Letten, at least six NOPD officers have been charged in the unlawful police shooting of six people on the Danziger Bridge, and in its cover-up, during the aftermath of Hurricane Katrina (Maggi, 2010). That is only one of eight federal investigations launched against the department since the storm. Another high-profile, Katrina-related investigation involves five more New Orleans police officers recently charged in the shooting death of 31-year-old Henry Glover, whose body was burned inside a car in an alleged cover-up scheme (The Economist, 2010). These indictments are part of the U.S. Justice Department’s aggressive plan to force changes in the NOPD and to restore public confidence in the criminal justice system. According to Assistant Attorney General Thomas Perez, who earlier this year met with city leaders to discuss the possibility of seeking a court order that would impose these changes, “The problems in the New Orleans Police Department are frankly deep-seated” and the NOPD “is among the most troubled departments in the country” (USA Today, 2010).

In the days following Hurricane Katrina, the whole world was exposed to the shocking state of lawlessness, poverty and racial disparity that had plagued the city for decades. While many were quick to attribute the city’s skyrocketing violence to these social ills, a grim portrait of police corruption was also beginning to emerge. Breaking news images of the unlawful police looting of several stores for electronics and other non-essential merchandise, amid allegations of payroll fraud by the city and the police department, were examples of just some of the police corruption taking place during the aftermath of Katrina. And almost six years later, with the indictments of NOPD officers in the Danziger Bridge and Henry Glover incidents, more information about police corruption during that time is being brought to the surface.

Police corruption is nothing new. In fact, studies have shown that, among job-related criminal activities, corruption is more widespread in policing than in any other occupation. Similarly, nearly every major metropolitan police department in the United States has been involved in some form of criminal misconduct or organized corruption. Organized criminal activities, including “shakedowns”, kickbacks, case fixing, thefts of confiscated items and complicit partnerships with known criminals, such as in the trafficking and sale of drugs, are among the common examples of police corruption (Inciardi, 2010). But what is the basis for police corruption?

There are several theories as to the causation of police corruption and why it is so prevalent. One theory, called the “Society-at-Large” Explanation, originates from the practice of giving small gratuities for services rendered within the community, which eventually made its way into the police department in the form of small bribes and ticket fixing. These petty offenses then led to more serious crimes. This type of progression has been termed the slippery slope hypothesis, wherein corruption begins with good intentions and harmless practices but subsequently leads to involvement in criminal profit. The level of participation may range from that of the individual officer to groups of officers or to the department as a whole (Inciardi, 2010).

Another theory, known as the “Structural” Explanation, hypothesizes that because police officers, especially those in large cities, are continuously surrounded by an environment of crime, they become more or less desensitized. Through their experiences they grow cynical in their views of the criminal justice system and begin to rationalize their actions as normative and justified behavior. They consider corruption to be just “a game in which every person is out to get a share” (Inciardi, p. 242). Meanwhile, the younger, less experienced “rookie” is gradually indoctrinated into this seemingly harmless behavior by veteran officers, with whom he must establish mutual trust and fellowship for his own protection (Inciardi, 2010).

The final theory, the “Rotten-Apple” Explanation, considered to be the most popular, is regularly used by police chiefs and commissioners to explain acts of police misconduct within the ranks of their own departments. Taken from the adage “one rotten apple spoils the bunch”, this theory suggests that corruption results from the moral collapse of a few officers, which eventually proliferates to other officers and throughout the organization. Although this theory is the most popular, it is also the most criticized. Critics argue that this view fails to explain why certain officers become corrupt in the first place and why certain departments have a disproportionate number of corrupt officers over long periods of time (Inciardi, 2010). Other factors that contribute to police corruption include the nature of police work itself, the criminal clientele, public attitudes and a lack of strong organizational leadership within the department (Gleick & Epperson, 1995).

One other factor that is rarely discussed is the “Code of Silence”, which exists within virtually all police agencies. Although many in the field are reluctant to speak of its existence to outsiders, it nevertheless does exist. In fact, in a 2000 survey of 1,000 law enforcement academy recruits conducted by the National Institute of Ethics, 79 percent said that “a Code of Silence exists and is fairly common throughout the nation”, and more than half said they did not find a problem with that. What is even more alarming is that in another survey of 1,116 law enforcement officers, nearly half admitted that they had personally “witnessed misconduct by another officer” and had concealed what they knew. Over half of those who kept the Code of Silence said they did so because they felt pressure from others, and in 73 percent of those situations it was the department leaders who were applying the pressure. This unspoken allegiance among law enforcement officers has become the glue that keeps the many pieces of police corruption intact. It is only when the code is broken that corrupt officers will refrain from misconduct and the cycle of corruption can begin to diminish (Salter, 2003).

Police corruption is among the worst kinds of corruption because it contributes to a lawless environment and, most importantly, erodes the public’s respect for the law and creates a disconnection from the community. When those ties are broken, the entire criminal justice system fails. In an article that associates high crime rates with police corruption, author C. Stone Brown states:

Criminologists are quick to link poverty and crime, but conveniently overlook the correlation between systemic police corruption and high crime rates. Consider the broadest negative results that corrupt police officers have on urban communities. When police are abusive, it undermines legitimate attempts at curbing urban crime, such as “community policing.” What community wants an alliance with corrupt police officers? (1997)

This cycle extends to the criminal court system as well. Without the public’s active participation and assistance in police investigations, many violent crimes cannot be prosecuted. Violent offenders are then returned to the community, where they continue their criminal activity, driving up the crime rate and renewing the cycle. The New Orleans District Attorney’s Office has certainly had its share of controversy as a result, to say nothing of the corruption that exists within the New Orleans criminal courts.

According to the New Orleans Metropolitan Crime Commission (MCC), in its 2005 independent study of the New Orleans Criminal District Court, “the criminal justice system is little more than a revolving door back to the street.” With magistrate judges such as Charles Elloie significantly reducing the amounts of commercial surety bonds (CSBs) for arrested individuals 83 percent of the time and granting release on recognizance bonds (RORs) 48 percent of the time, it is no wonder why (MCC, 2005). The study also revealed that in a 12-month period, only 7 percent of those arrested by the NOPD were sentenced to serve prison time. The reason is that, more often than not, violent crime charges end up being dropped by the District Attorney’s Office for lack of witness cooperation. In an interview with Time magazine, MCC president Rafael Goyeneche explained that the number one reason these charges get dropped is that witnesses and victims who are initially cooperative become fearful and change their minds; they know that most of these perpetrators will receive early releases and wind up back on the street (Hylton/Austin, 2006).

It is crucial to the moral fabric of society to maintain a judicial system free from corruption; the alternative is the betrayal of the public’s trust and the collapse of law and order. In his book Ethics and Criminal Justice: An Introduction, John Kleinig argues that a judicial system must be independent, impartial and characterized by integrity if it is to hold authority in the governance of society. He states:

Judicial corruption is almost always more troubling than other forms of corruption because, at least for liberal democratic societies, an honest and impartial judiciary is necessary to ensure the rule of law that sustains those societies. The judiciary is the final arbiter of the general framework for the conduct of public life. As the final arbiter that brings social closure, the judiciary bears a heavy burden of responsibility. Moreover, even if a judiciary’s basic structures remain sound, judicial misconduct and corruption are deeply problematic because they undermine the authoritativeness of judicial judgments and also the social trust that must be placed in them to be effective guarantors of the justice of the law and the fairness of its application (2008, pp. 157-158).

It is apparent that the cycle of corruption within the New Orleans criminal justice system has contributed to the prevalence of violent crime in the city. In doing so, it has brought shame to the city and has hampered post-Katrina recovery efforts by limiting its ability to attract residents and businesses back to the area. In the months after Katrina, the goal was to rebuild bigger, better and stronger levees so that the city would be protected from future hurricanes like Katrina. But with so much emphasis placed on levee reconstruction, New Orleans’ other inherent problems were simply pushed aside. Federal funding poured into the city, and with it came more allegations of fraud and corruption, leading many to believe that New Orleans’ weakest link lay not with the levee system but with its violent crime and rampant corruption.

The levees have now been repaired, yet New Orleans’ population has not returned to its pre-Katrina numbers, violent crime continues to rise, and charges of city and police corruption continue to make headlines. Although residents feel that the federal government has made some progress in attacking corruption, they also feel that nothing is being done to overhaul the New Orleans Police Department or to combat the rising levels of violent crime. According to a 2009 study conducted by CommonHealth ACTION, a program subsidized by the Bush Clinton Katrina Fund, data collected from two focus groups indicated that “violence is the most unrecognized recovery issue and the basic issues behind the problem which are not being dealt with effectively” and that “violence is a primary reason many residents have either not returned, or have moved away and will not return, especially if they have children.” Although the groups did indicate that gangs and drugs were common factors in the increased post-Katrina violence, other issues such as politics and police corruption were especially emphasized (CommonHealth ACTION, 2009).

New Orleans’ image of economic prosperity and commercial industry has long been tarnished by its history of corruption, replaced instead by a reputation for tourism and violent crime. For a city still struggling after Katrina and so economically dependent on tourism, the latter may yet take away its only means of recovery. Since Katrina, the government has pledged assistance with federal funds, but unless New Orleans can clean up its “sleazy” image and rehabilitate its failing criminal justice system, the money will do nothing to end the vicious cycle that perpetuates violence. Until this cycle is broken, no amount of levee protection will convince displaced residents and businesses that New Orleans is a safe place to live and that it can once again prosper.

References:

Brown, C. S. (1997, December). Police corruption: The crime that’s not going down. New Crisis (15591603), 104(3), 48. Retrieved from Professional Development Collection database.

Carey, N. (n.d., August 14). Fighting corruption is hard going in New Orleans. Reuters: http://us.mobile.reuters.com/m/FullArticle/p.rdt/CTOP/ntopNews_uUSN0448037120080814

CNN. (2010, April 13). The world’s most dangerous cities? CNN World: http://www.cnn.com/2010/WORLD/americas/04/10/dangerous.cities.world/index.html

CommonHealth ACTION. (2009, Fall). New Orleans: Community Perspectives on the Root Causes of Violence. commonhealthaction.org: http://www.commonhealthaction.org/CHA_BCKFreport.pdf

Dubos, C. (2010, January 4). Jim Letten leads a team of Modern-Day Untouchables. The Gambit: http://bestofneworleans.com/gyrobase/Content?oid=oid%3A66994

Gleick, E., & Epperson, S. (1995). The crooked blue line. Time, 146(11), 38. Retrieved from Academic Search Complete database.

Hylton/Austin, H. (2006, May 14). The Gangs of New Orleans. TIME Magazine: http://www.time.com/time/magazine/article/0,9171,1194016-2,00.html

Inciardi, J. A. (2010). Criminal Justice (9th ed.). New York: McGraw Hill.

Kleinig, J. (2008). Ethics and Criminal Justice: An Introduction. New York: Cambridge University Press.

Maggi, L. (2010, July 14). Six New Orleans police officers indicted in Danziger Bridge shootings. The Times-Picayune: http://www.nola.com/crime/index.ssf/2010/07/prosecutors_will_seek_detentio.html

Maxwell, C. W. (2005, December 30). Crime and Economic Disparity in Pre-Katrina New Orleans. neworleans.indymedia.org: http://neworleans.indymedia.org/news/2005/12/6695.php

McCarthy, B. (2008, November 24). New Orleans has highest crime rate in country, national study says. The Times-Picayune: http://www.nola.com/news/index.ssf/2008/11/new_orleans_has_highest_crime.html

Rick, J. (n.d.). Feds may force changes at NOPD. USA Today. Retrieved from Academic Search Complete database.

Salter, S. (2003, March 9). Cracking the Code of Silence. Insight: http://articles.sfgate.com/2003-03-09/opinion/17481764_1_chief-earl-sanders-off-duty-san-francisco-officer

Shapiro, D. M. (n.d.). The Desire Terrorist. truTV Crime Library: http://www.trutv.com/library/crime/gangsters_outlaws/cops_others/len_davis/index.html

The Metropolitan Crime Commission (MCC). (2005). Performance of the New Orleans Criminal Justice System 2003-2004. metropolitancrimecommission.org: http://www.metropolitancrimecommission.org/html/2Perf_of_the_NO_Criminal_Justice_System_2003-20041.pdf

Witt, H. (2009, March 27). Most corrupt state: Louisiana ranked higher than Illinois. The Chicago Tribune: http://www.chicagotribune.com/news/chi-corruption-louisiana_wittmar27,0,2957672.story

Whitewash and ham sandwiches. (2010). The Economist, 396(8692), 32-34. Retrieved from Academic Search Complete database.