301

Mineral absorption by submerged bone in marine environments as a potential PMSI indicator

Mammano, Kristina Lynn 24 February 2021 (has links)
Human remains enter marine environments in a number of ways, including homicides, suicides, accidental drownings, shipwrecks, and burials at sea. Once the remains are discovered, a legal and forensic investigation begins. A key component of this investigation is the postmortem submergence interval (PMSI). Determining this interval for skeletonized remains is a complicated process for which no accurate test exists; although barnacle growth data have previously been used to estimate PMSI, that method still has limitations. Therefore, a more reliable component of bone, such as its elemental composition, needs to be evaluated as a potential PMSI indicator. Diagenesis starts affecting bones immediately and continues for thousands of years. Although diagenesis is a slow process, an exchange of elements between bone and the marine environment occurs continually. The purpose of the present study is to determine whether an increase in marine elements is found within the composition of bone after submersion in a marine environment for up to 20 months. The present study will also determine whether bones submerged in different aquatic environments have significantly different elemental concentrations. For the time trials, pig femora were submerged in lobster cages off the coast of the University of Massachusetts Boston for 2-20 months. For the salinity trials, pig femora were submerged in a freshwater pond (Holliston, MA), the Inner Boston Harbor, and an ocean inlet near Woods Hole, MA for 18 months. All bone samples were dried, milled, homogenized, and analyzed by energy-dispersive X-ray fluorescence (ED-XRF) under He purge. The initially produced mass percentages of the identified elements were corrected against certified values of standard reference materials (NIST 1486, 1646a, and 2702). A Pearson's correlation test determined that the concentrations of K, Fe, Zn, Sr, Si, S, Cr, Mn, Cl, Br, Ta, and W were significantly correlated with the amount of time submerged in the water. An ANCOVA was then applied to the significant elements noted above. After adjusting for the amount of time submerged, the concentrations of K, Fe, Sr, Si, S, Cl, Br, and Ta were significantly different between the control samples (never submerged) and the submerged samples (submerged for 2-20 months). K was the only element with greater concentrations in the control samples than in the submerged samples, most likely because its mass percent decreased as other environmental elements were incorporated into the bone. S and W were significantly related to the number of months submerged, with S positively influenced and W negatively influenced. A multivariable linear regression was run to identify a means of predicting the amount of time submerged from the elemental concentrations of an unknown bone from a marine environment. The regression produced an equation that uses the concentrations of K, Sr, Si, S, Cr, Cl, and Br to predict the PMSI in months. For the salinity trials, a one-way ANOVA was performed on all the elemental concentrations from the different salinity environments. Post hoc tests determined significant differences in the concentrations of K, Fe, Si, S, Al, Ti, Cr, Ni, Mn, Cl, and Br among the different submergence locations; concentrations of S, Fe, Mn, Cl, K, and Br were either significantly different among the fresh, brackish, and salt waters or between the freshwater and some form of marine water (brackish and salt).
The trends in the other elemental concentrations were less obvious due to the impact of pollution within the surrounding environments. The linear regression equation created in the present study accounted for the majority of the variance in the outcome (R² = 80.2%); however, this equation should not currently be applied in forensic investigations. The study needs to be repeated a number of times with other bone samples from the same and different submergence locations in order to determine the accuracy and usefulness of the equation. Although not yet verified, this regression equation may be useful in analyzing samples from brackish and saltwater environments, because the majority of the variables within the equation (K, Sr, S, Cl, Br) were consistent among the fresh, brackish, and saltwater samples. Time constraints, small sample sizes, and variance among samples were the major limitations of the present study. Even with these limitations, the ED-XRF analysis produced significant results. Future research should expand upon the methodologies of XRF analyses of bones, especially those from marine environments. Because of their relevance to forensic investigations and PMSI estimation, future studies should include longer experimental periods, more salinity locations, more information on the surrounding water components, and more comparisons among instrumentation.
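As a rough illustration of the kind of model this abstract describes, the sketch below fits a multivariable linear regression that predicts months of submergence from elemental mass percentages. The predictor set follows the elements named in the abstract (K, Sr, Si, S, Cr, Cl, Br), but every numeric value is an invented placeholder rather than the study's data, and the fitted coefficients carry no forensic meaning.

```python
# Hypothetical sketch: predicting PMSI (months submerged) from elemental
# mass percentages with a multivariable linear regression.
# The element list follows the abstract; all numbers below are invented
# placeholders, NOT the study's measurements.
import numpy as np

elements = ["K", "Sr", "Si", "S", "Cr", "Cl", "Br"]

# Rows = bone samples, columns = corrected mass percentages for each element.
X = np.array([
    [0.60, 0.030, 0.10, 0.25, 0.0010, 0.30, 0.002],
    [0.58, 0.032, 0.12, 0.27, 0.0012, 0.38, 0.003],
    [0.55, 0.035, 0.14, 0.29, 0.0014, 0.45, 0.004],
    [0.53, 0.038, 0.16, 0.31, 0.0016, 0.52, 0.005],
    [0.50, 0.040, 0.18, 0.33, 0.0018, 0.60, 0.006],
    [0.48, 0.043, 0.20, 0.36, 0.0021, 0.68, 0.007],
    [0.46, 0.046, 0.22, 0.38, 0.0023, 0.76, 0.008],
    [0.44, 0.049, 0.25, 0.40, 0.0026, 0.83, 0.009],
    [0.43, 0.051, 0.27, 0.42, 0.0028, 0.91, 0.010],
    [0.40, 0.057, 0.31, 0.47, 0.0032, 1.05, 0.012],
])
months = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20], dtype=float)

# Ordinary least squares via a design matrix with an intercept column.
A = np.column_stack([np.ones(len(months)), X])
coef, *_ = np.linalg.lstsq(A, months, rcond=None)

predicted = A @ coef
ss_res = np.sum((months - predicted) ** 2)
ss_tot = np.sum((months - months.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print("intercept:", round(coef[0], 3))
for name, b in zip(elements, coef[1:]):
    print(f"slope for {name}: {b:.3f}")
print("R^2 on the training data:", round(r_squared, 3))
```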
302

The forensic utility of photogrammetry in surface scene documentation

Church, Elizabeth 09 October 2019 (has links)
In current forensic practice, there are few standards for outdoor crime scene documentation, despite the need for such documentation to be accurate and precise in order to preserve evidence. A potential solution is the implementation of image-based photogrammetry. Structure from Motion (SfM) reconstructs models through image point comparisons. A 3D model is produced from a reference photoset that captures a 360-degree view of the subject, and the software employs triangulation to match specific points (datums) across individual photos. The datums are arranged into a point cloud that is then transformed into the final model. Modifying the point cloud into a final product requires algorithms that adjust the points by building a textured mesh from them. One of the disadvantages of SfM is that the point cloud can be "noisy," meaning that the program is unable to distinguish the features of one datum from another due to similarities, creating coverage gaps within the meshed images. To compensate for this, the software can smooth portions of the model in a best-guess process during meshing. Because commercial software does not disclose the adjustment algorithms, this documentation technique, while very useful in other disciplines that regularly apply SfM such as archaeology, would fail to meet the standards of the Daubert and Kumho criteria in a forensic setting. A potential solution to this problem is to use open-source software, which discloses the adjustment algorithms to the user. It was hypothesized that the output of open-source software would be as accurate as the models produced with commercial software and with total station mapping techniques. To evaluate this hypothesis, a series of mock outdoor crime scenes were documented using SfM and traditional mapping techniques. The scenes included large surface scatter and small surface scatter scenes. The large surface scatter scenes contained a dispersed set of plastic human remains and various objects that might reasonably be associated with a crime scene. Ten of these scenes were laid out in 10 x 10 m units in a New England forested environment, each grid with a slightly different composition, and then documented using an electronic total station, data logger, and digital camera. The small surface scatter scenes consisted of a pig mandible placed in different environments across two days of data collection. The resulting models were built using PhotoScan by AgiSoft as the commercial software and MicMac for Mac OSX as the open-source comparison software. Accuracy is only part of the concern, however; the full utility of any one of the workflows is additionally defined by its overall cost-effectiveness (affordability and accessibility) and the visual quality of the final model. Accuracy was measured by the amount of variance in fixed-datum measurements that remained consistent across scenes, whereas the visual quality of the photogrammetric models was determined by cloud comparison histograms, which allow for comparison of models between software types and across different days of data collection. Histograms were generated using CloudCompare. Not all rendered models were usable: 90% of large surface scatter models and 87.5% of small surface scatter models were usable. While there was variance in the metric outputs between the total station and photogrammetric models, the average total variance in fixed-datum lengths for individual scenes was below 0.635 cm for six of the ten scenes.
However, only one of the large surface scatter scenes produced measurements that were significantly different between the total station and the photogrammetric software. The maximum differences between the total station and software measurements were 0.0917 m (PhotoScan) and 0.178 m (MicMac). The minimum difference found for either software was 0.000 m, indicating exact agreement. The histograms for the large scatter scenes were comparable, with the commercial and open-source software-derived models having low standard deviations and mean distances between points. For the small surface scatter scenes, the histograms between software types varied depending on the environment and the lighting conditions on the day of data collection. Conditions such as light, ground foliage, and topography, as well as the amount of available computing power, significantly affect model quality. No such issues of losing objects or limitations of computing power were encountered when mapping by total station and processing the data in AutoCAD. This research shows that SfM has the potential to be a rapid, accurate, and low-cost resource for forensic investigation. SfM methodology for outdoor crime scene documentation can be adapted to fit within evidentiary criteria through the use of open-source software and transparent processing, but there are limitations that must be taken into consideration.
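A minimal sketch of how a fixed-datum accuracy check like the one described could be computed: given 3D coordinates for the same datums from a total station survey and from a scaled photogrammetric model, compute the datum-to-datum lengths in each and summarize the differences. The coordinates below are invented placeholders, and CloudCompare's cloud-to-cloud histograms are not reproduced here.

```python
# Hypothetical sketch: comparing fixed-datum lengths between a total-station
# survey and a photogrammetric (SfM) model of the same scene.
# All coordinates are invented placeholders, not data from the study.
import itertools
import numpy as np

# Datum coordinates (metres) as measured by each method.
total_station = {
    "D1": (0.000, 0.000, 0.000),
    "D2": (4.982, 0.013, 0.021),
    "D3": (5.010, 4.995, -0.008),
    "D4": (0.018, 5.021, 0.015),
}
sfm_model = {
    "D1": (0.002, -0.004, 0.001),
    "D2": (4.978, 0.010, 0.018),
    "D3": (5.016, 4.990, -0.011),
    "D4": (0.022, 5.026, 0.012),
}

def pair_lengths(points):
    """Euclidean length between every pair of datums."""
    return {
        (a, b): float(np.linalg.norm(np.subtract(points[a], points[b])))
        for a, b in itertools.combinations(sorted(points), 2)
    }

ts_len = pair_lengths(total_station)
sfm_len = pair_lengths(sfm_model)

diffs = np.array([abs(ts_len[k] - sfm_len[k]) for k in ts_len])
for k in ts_len:
    print(f"{k[0]}-{k[1]}: total station {ts_len[k]:.3f} m, "
          f"SfM {sfm_len[k]:.3f} m, |diff| {abs(ts_len[k] - sfm_len[k]) * 100:.2f} cm")
print(f"mean |diff|: {diffs.mean() * 100:.2f} cm, max |diff|: {diffs.max() * 100:.2f} cm")
```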
303

Using SPME and GC-MS to detect nicotine from teeth as an indicator for tobacco use

Muschal, Alexis 10 March 2022 (has links)
In the United States, 19.3% of adults use some form of tobacco. Nicotine is a compound present in tobacco and has also been detected in the dental calculus of archaeological specimens. Teeth are often well preserved in unidentified human remains and are resilient to physical and chemical degradation. Dental plaque and calculus form on teeth and can entrap chemical traces of many compounds to which teeth are exposed. Many studies have examined calculus microscopically and chemically using various techniques to establish patterns in diet, behavior, and health. Forensic odontologists can assist in the identification of individuals using dental records, but some individuals may not have recent or relevant records. In addition, some skeletal remains might include only a partial mandible or maxilla, or may consist of only disarticulated teeth. However, given their representation in the anthropological record and the persistence of dental calculus on teeth postmortem, analyzing teeth for behavioral indicators can provide anthropologists with significant information to assist with identifying an individual. For example, detecting the presence of nicotine from teeth can provide information about an individual's tobacco use habits. This study tested the hypothesis that nicotine can be detected from teeth. It was also hypothesized that different patterns of use would yield different quantities of nicotine in calculus, and that lower but detectable quantities of nicotine may still be present on the teeth of non-users. Eighty-two teeth from the Boston University Goldman School of Dentistry were collected from tobacco users and non-users, as determined by anonymous survey. All teeth were analyzed using gas chromatography-mass spectrometry (GC-MS). Three types of sample extraction were tested. Solid-phase microextraction (SPME) was used with two subgroups of the overall sample. For sample groups A (n=41) and B (n=18), teeth were heated at 80ºC for an hour in a 20 mL SPME vial. An 85 μm polyacrylate SPME fiber was exposed to the headspace of the vials for 15 minutes for teeth in group A and 30 minutes for teeth in group B. For sample group C (n=24), teeth were sonicated in 5 mL ethanol for 30 minutes. After sonication, the ethanol was pipetted into autosampler GC vials. All samples were analyzed on an Agilent Technologies 7890A gas chromatography system connected to an Agilent Technologies 5975C inert XL EI/CI mass spectrometer, and the data produced were analyzed using Agilent ChemStation software. GC-MS parameters were consistent across all samples, with a solvent delay added for the group C samples. Results indicated that while SPME is sensitive enough to detect spiked nicotine from prepared samples, that extraction method did not yield significant data that would identify users. For the samples extracted by sonication, there was an increased likelihood that a tooth belonging to a user produced a positive identification for nicotine: 63.6% of teeth from tobacco users showed a positive hit after sonication, while none of the non-user teeth had positive nicotine hits. However, the amount of nicotine detected from each sample does not necessarily reflect tobacco use behavior such as daily frequency, years of use, or type of tobacco.
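The sketch below illustrates, in simplified form, the kind of screening decision described above: checking an extracted-ion chromatogram for a nicotine peak above a signal-to-noise threshold near an expected retention time. The retention time, signal values, and threshold used here are assumptions for illustration, not parameters of the study's validated Agilent method.

```python
# Hypothetical sketch: flagging a "positive nicotine hit" from an
# extracted-ion chromatogram (EIC). The retention-time window, simulated
# signal, and 3:1 signal-to-noise rule are illustrative assumptions, not
# the study's method parameters.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
time_min = np.linspace(0.0, 12.0, 1200)           # retention time axis (min)
baseline = rng.normal(100.0, 8.0, time_min.size)  # detector noise

# Simulated EIC: a Gaussian peak near an assumed nicotine retention time.
assumed_rt = 7.4                                  # minutes (assumption)
eic = baseline + 600.0 * np.exp(-((time_min - assumed_rt) / 0.05) ** 2)

def nicotine_hit(signal, t, rt=assumed_rt, window=0.2, snr_min=3.0):
    """Return True if a peak within rt +/- window exceeds snr_min."""
    noise_region = signal[t < 2.0]                # early, peak-free region
    noise_sd = noise_region.std()
    base = np.median(noise_region)
    peaks, _ = find_peaks(signal, height=base + snr_min * noise_sd)
    return any(abs(t[p] - rt) <= window for p in peaks)

print("positive nicotine hit:", nicotine_hit(eic, time_min))
```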
304

Analysis of ketamine and xylazine in fur and bones using multidimensional liquid chromatography tandem mass spectrometry

Karanth, Neesha Claire 21 February 2019 (has links)
While ketamine is traditionally administered for anesthesia or pain management, illicit use is often seen in forensic cases, either as a recreational drug or as a tool in drug-facilitated sexual assault. Xylazine is an anesthetic agent used in veterinary medicine and does not have FDA approval for use in humans; however, it has recently been observed as a cutting agent in heroin. Postmortem specimens present many challenges for toxicological analysis. Due to compound degradation and decomposition factors, analytes present at trace levels may be missed in blood and urine. Hair, bone, and insects have recently been investigated as alternative matrices for postmortem analysis due to their increased durability compared to more traditional matrices. However, this durability increases the difficulty of extracting and isolating compounds of interest from these matrices via traditional extraction and chromatography methods, which require lengthy extraction times and extensive cleanup steps in order to obtain samples suitable for analysis. Utilizing multiple instrumentation combinations, analysts are able to detect compounds at trace levels. Through the use of multidimensional chromatography, several time-consuming extraction steps can be eliminated while retaining the ability to detect and quantify at trace levels. Using Waters Oasis® HLB PRiME solid phase extraction cartridges with a methanol pH 10 loading and an acetonitrile pH 3 elution, the solvent extraction yielded linear dynamic ranges of 2 pg/mL to 1 ng/mL for xylazine and 5 pg/mL to 1 ng/mL for ketamine. Rat specimens utilized in this project were treated as per an Institutional Animal Care and Use Committee (IACUC) protocol. The test rodents received an acute dose of 2 mg/mL of xylazine and 24 mg/mL of ketamine approximately half an hour prior to death. The 14 test samples were placed outside directly on the ground at the Boston University Forensic Anthropology Outdoor Research Facility (Holliston, MA, U.S.A.) for a period of 6 months. A 15th rat was kept at -20°C until analysis to serve as a Time=0 sample. The outdoor samples, along with the Time=0 sample, were recovered and manually de-fleshed. Drug-free hair samples were donated anonymously as per Institutional Review Board (IRB) protocols.
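As a simplified illustration of quantitation over a linear dynamic range like the ones reported above, the sketch below fits a calibration line to spiked standards and back-calculates an unknown extract's concentration from its peak-area ratio. The concentrations and area ratios are invented placeholders, not the study's calibration data, and no weighting scheme is assumed.

```python
# Hypothetical sketch: a calibration curve for quantitation over a linear
# dynamic range (e.g., 2 pg/mL to 1 ng/mL). All concentrations and
# peak-area ratios below are invented placeholders.
import numpy as np

# Spiked calibrator concentrations (pg/mL) and analyte/internal-standard
# peak-area ratios (illustrative values only).
conc = np.array([2, 5, 10, 50, 100, 250, 500, 1000], dtype=float)
area_ratio = np.array([0.004, 0.010, 0.021, 0.098, 0.20, 0.51, 1.02, 2.05])

# Ordinary least-squares line: area_ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area_ratio, 1)
fit = slope * conc + intercept
r2 = 1 - np.sum((area_ratio - fit) ** 2) / np.sum((area_ratio - area_ratio.mean()) ** 2)
print(f"slope = {slope:.5f}, intercept = {intercept:.4f}, R^2 = {r2:.4f}")

# Back-calculate an unknown extract from its measured area ratio.
unknown_ratio = 0.30
unknown_conc = (unknown_ratio - intercept) / slope
print(f"estimated concentration: {unknown_conc:.1f} pg/mL")
```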
305

Forensic DNA phenotyping and massive parallel sequencing

Breslin, Krystal 04 December 2017 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / In the forensic science community, there is an immense need for tools to assist investigations where conventional DNA profiling methods have been non-informative. Forensic DNA Phenotyping (FDP) aims to bridge that gap and aid investigations by providing physical appearance information when other investigative methods have been exhausted. To create a "biological eyewitness," it becomes necessary to constantly improve these methods in order to develop a complete and accurate image of the individual who left the sample. To add to our previous prediction systems, IrisPlex and HIrisPlex, we have developed the HIrisPlex-S system for the all-in-one combined prediction of eye, hair, and skin color from DNA. The skin color prediction model uses 36 variants that were recently proposed for the accurate prediction of categorical skin color on a global scale, and the system is completed by the developmental validation of a 17-plex capillary electrophoresis (CE) genotyping assay that is run in conjunction with the HIrisPlex assay to generate these genotypes. The predicted skin color output includes Very Pale, Pale, Intermediate, Dark, and Dark-to-Black categories in addition to categorical eye (Blue, Intermediate, and Brown) and hair (Black, Brown, Blond, and Red) color predictions. We demonstrate that the HIrisPlex-S assay performs in full agreement with guidelines from the Scientific Working Group on DNA Analysis Methods (SWGDAM), achieving high sensitivity levels with a minimum 63 pg DNA input. In addition to adding skin color to complete the pigmentation prediction system, termed HIrisPlex-S, we successfully designed a Massively Parallel Sequencing (MPS) assay to complement the system and bring Next Generation Sequencing (NGS) to the forefront of forensic DNA analysis methods. Using Illumina's MiSeq system enables the generation of HIrisPlex-S's 41 variants from sequencing data that has the capacity to better deconvolute mixtures and perform with even more sensitivity and accuracy. This transition opens the door for many new ways in which this physical appearance assay can grow, as sequencing technology is not limited by variant number; therefore, many more traits have the potential to be included in this one assay design. For now, the HIrisPlex-S design of 41 variants using MPS is being fully assessed according to SWGDAM validation guidelines; this design paves the way for Forensic DNA Phenotyping to be used in any forensic laboratory. The new HIrisPlex-S system will have a profound impact on casework, missing persons cases, and anthropological cases, as it is relatively inexpensive to run, easy to use, developmentally validated, and one of the largest systems for physical appearance prediction from DNA freely available online, through the web tool found at https://hirisplex.erasmusmc.nl/. Lastly, moving forward in our aim to include additional traits for prediction from DNA, we contributed to a large-scale research collaboration to unearth variants associated with hair morphology. 1026 samples were successfully sequenced using an in-house MPS design at 91 proposed hair morphology loci. From this work, we were able to contribute to the identification of significant correlations between the SNPs rs2219783, rs310642, and rs80293268 and categorical hair morphology: straight, wavy, or curly.
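The sketch below shows, in generic form, how a categorical appearance trait could be predicted from SNP genotypes with a multinomial model. It is emphatically not the HIrisPlex-S model or its published parameters: the SNP count, genotype codings, training data, and category labels are random placeholders, and the real system should be used through the web tool cited above.

```python
# Hypothetical sketch: predicting a categorical trait (e.g., eye colour)
# from SNP genotypes coded as allele counts 0/1/2 with multinomial
# logistic regression. This is NOT the HIrisPlex-S model; the data and
# parameters here are random placeholders for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_samples, n_snps = 200, 6
labels = np.array(["Blue", "Intermediate", "Brown"])

# Placeholder genotype matrix: allele counts 0, 1, or 2 at each SNP.
X = rng.integers(0, 3, size=(n_samples, n_snps))
# Placeholder phenotypes loosely tied to the first two SNPs.
y = labels[np.clip((X[:, 0] + X[:, 1]) // 2, 0, 2)]

model = LogisticRegression(max_iter=1000)
model.fit(X, y)

new_profile = np.array([[2, 1, 0, 2, 1, 0]])   # genotypes for one individual
probs = model.predict_proba(new_profile)[0]
for cls, p in zip(model.classes_, probs):
    print(f"P({cls}) = {p:.2f}")
print("predicted category:", model.predict(new_profile)[0])
```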
306

An investigation of genetic variability in Lucilia cuprina and Musca domestica utilizing phylogenetic and population genetic approaches

Doll, Laura Catherine 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Forensic entomology is a subdiscipline of entomology that involves the use of insect behavior and developmental data to aid in criminal investigations. Genetic data have become increasingly important to the field as there has been a push for DNA-based species identification methods for forensically relevant insects. Genetic data can also elucidate the population structure and relatedness of these insects, and such knowledge can contribute to the development of more specific datasets for insects in different regions. The first study presented here investigated the phylogenetics of the sister species Lucilia cuprina and Lucilia sericata to identify possible subspecies divisions and issues with DNA-based identifications in the United States. The initial aim of this study was to identify genetic differences between specimens of L. cuprina that preferred live versus carrion flesh. Flies collected from Indiana, USA and South Africa were sequenced and analyzed. Upon sequencing of the genes COI, Period, and 28S, our results indicated that L. cuprina from Indiana possess a unique combination of nuclear and mitochondrial haplotypes that suggests a distinct lineage, possibly indicating modern hybridization with L. sericata. The inability of both nuclear and mitochondrial genes to distinguish between L. cuprina and L. sericata raises questions about the capabilities of DNA-based species identifications within this genus. Additionally, the inability of these genes to distinguish between specimens that preferred live versus carrion flesh highlights a need for continued research into these behavioral differences. The second study presented here investigated the population structure and relatedness of house flies in the American southwest in relation to a civil lawsuit in which neighbors of a poultry farm alleged that flies were emanating from the farm to their homes. Musca domestica (house fly) specimens were collected from the chicken farm and from locations in varying directions and distances from the farm. Amplified fragment length polymorphism (AFLP) analysis was performed and the data were used in a number of analyses. Population reallocation simulations generally indicated that samples from different locations were not genetically distinct enough to be allocated to their true origin population over others. Kinship analysis showed differences in samples collected in a later season that indicate a genetic bottleneck over time. Population structure analysis indicated the presence of two intermixing genetic populations in the dataset. AMOVA revealed that the majority of genetic variation lay within, rather than among, populations. A Mantel test revealed no significant correlation between genetic and geographic distances. These results indicate that the M. domestica population in this region of the American southwest is large and intermixing, with no clear genetic distinctions between specimens collected at the poultry farm and those from the surrounding locations. With regard to the civil lawsuit, it was not possible to conclude that the flies did not emanate from the poultry farm. From a broader perspective, these data can be utilized to develop pest management strategies in this region. Overall, the data from both studies presented here will be useful for forensic investigations, the development of more specific and detailed data and identification techniques, and pest control measures.
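To illustrate the final analysis named in the abstract, the sketch below runs a simple Mantel test: a permutation test of the correlation between a genetic distance matrix and a geographic distance matrix. Both matrices are invented placeholders rather than the study's AFLP-derived distances.

```python
# Hypothetical sketch: a Mantel test correlating genetic and geographic
# distance matrices via permutation. The matrices below are invented
# placeholders, not the study's AFLP data.
import numpy as np

def mantel_test(d1, d2, n_perm=999, seed=0):
    """Pearson correlation of the upper triangles of two distance
    matrices, with a permutation-based two-sided p-value."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)
    observed = np.corrcoef(d1[iu], d2[iu])[0, 1]
    count = 0
    n = d1.shape[0]
    for _ in range(n_perm):
        perm = rng.permutation(n)
        d1_perm = d1[np.ix_(perm, perm)]     # permute rows and columns together
        r = np.corrcoef(d1_perm[iu], d2[iu])[0, 1]
        if abs(r) >= abs(observed):
            count += 1
    p_value = (count + 1) / (n_perm + 1)
    return observed, p_value

# Placeholder symmetric distance matrices for five sampling locations.
geo = np.array([[0, 1, 2, 3, 4],
                [1, 0, 1, 2, 3],
                [2, 1, 0, 1, 2],
                [3, 2, 1, 0, 1],
                [4, 3, 2, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
noise = rng.normal(0, 0.5, size=geo.shape)
gen = np.abs(0.1 * geo + (noise + noise.T) / 2)   # symmetric, noisy "genetic" distances
np.fill_diagonal(gen, 0.0)

r, p = mantel_test(gen, geo)
print(f"Mantel r = {r:.3f}, permutation p = {p:.3f}")
```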
307

Analysis of Lubricants at Trace Levels Using Infrared Spectroscopy

Bandarupalli, Tanmai 01 January 2021 (has links)
Analysis of trace evidence involved in sexual assault investigations holds considerable potential as a newer avenue of identification when bulk evidence is not found or is unreliable. Trace analysis of forensic materials involves common findings such as strands of hair, residues left on clothing, and shards of paint or glass. Recent research focused on the analysis of trace materials found as evidence in sexual assaults has shown promise in classifying condom and bottled lubricants based on their chemical profiles, which can provide an associative link in an investigation. Few studies have considered the examination of lubricant evidence at the trace level, as it may be found at a crime scene or on a victim. In this study, a new protocol will be tested and established for analyzing trace lubricant evidence recovered from a fabric substrate, such as underwear, after a sexual assault using Fourier transform infrared (FTIR) spectroscopy. An experiment is proposed to compare the spectra resulting from FTIR analysis of bulk and trace-level lubricants recovered from a cotton substrate. The resulting spectra will be compared for similarity using multivariate statistical techniques to test the viability of the approach.
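As an illustration of the multivariate comparison the proposal describes, the sketch below normalizes a set of simulated spectra and projects them with principal component analysis, one technique that could be used to score similarity between bulk and trace-level lubricant spectra. The spectra are synthetic Gaussian bands with arbitrary positions, not measured FTIR data, and PCA is offered only as an example of a multivariate approach.

```python
# Hypothetical sketch: comparing simulated "bulk" and "trace" spectra with
# principal component analysis (PCA). Spectra here are synthetic Gaussians,
# not measured FTIR data, and the band positions are arbitrary.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavenumbers = np.linspace(650, 4000, 800)         # cm^-1 axis

def spectrum(bands, noise_sd):
    """Sum of Gaussian absorbance bands plus noise (placeholder spectrum)."""
    s = sum(h * np.exp(-((wavenumbers - c) / w) ** 2) for c, h, w in bands)
    return s + rng.normal(0, noise_sd, wavenumbers.size)

silicone_bands = [(1260, 1.0, 15), (1020, 0.9, 40), (800, 0.8, 20)]
glycol_bands = [(3350, 0.7, 120), (1080, 1.0, 40), (2870, 0.5, 40)]

# Replicates of two lubricant types at bulk (low noise) and trace (high noise).
spectra, labels = [], []
for bands, name in [(silicone_bands, "silicone"), (glycol_bands, "glycol")]:
    for level, sd in [("bulk", 0.02), ("trace", 0.10)]:
        for _ in range(3):
            spectra.append(spectrum(bands, sd))
            labels.append(f"{name}-{level}")

X = np.array(spectra)
# Row-wise standard normal variate (SNV)-style normalisation.
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(X)
for label, (pc1, pc2) in zip(labels, scores):
    print(f"{label:>15s}: PC1 = {pc1:7.2f}, PC2 = {pc2:7.2f}")
```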
308

A Fatal Drug Interaction Between Clozapine and Fluoxetine

Ferslew, Kenneth E., Hagardorn, Andrea N., Harlan, Gretel C., McCormick, William F. 01 January 1998 (has links)
A case is presented of a fatal drug interaction caused by ingestion of clozapine (Clozaril™) and fluoxetine (Prozac™). Clozapine is a tricyclic dibenzodiazepine derivative used as an 'atypical antipsychotic' in the treatment of severe paranoid schizophrenia. Fluoxetine is a selective serotonin reuptake inhibitor used for the treatment of major depression. Clinical studies have shown that concomitant administration of fluoxetine and clozapine produces increased plasma concentrations of clozapine and enhances clozapine's pharmacological effects, due to suspected inhibition of clozapine metabolism by fluoxetine. Blood, gastric, and urine specimens were analyzed for fluoxetine by gas chromatography/mass spectrometry (GC/MS) and for clozapine by gas-liquid chromatography (GLC). Clozapine concentrations were: plasma, 4.9 μg/mL; gastric contents, 265 mg; and urine, 51.5 μg/mL. Fluoxetine concentrations were: blood, 0.7 μg/mL; gastric contents, 3.7 mg; and urine, 1.6 μg/mL. Norfluoxetine concentrations were: blood, 0.6 μg/mL; none was detected in the gastric contents or urine. Analysis of the biological specimens for other drugs revealed the presence of ethanol (blood, 35 mg/dL; vitreous, 56 mg/dL; and urine, 153 mg/dL) and caffeine (present in all specimens). The combination of these drugs produced lethal concentrations of clozapine and high therapeutic to toxic concentrations of fluoxetine. The deceased had pulmonary edema, visceral vascular congestion, paralytic ileus, gastroenteritis, and eosinophilia, conditions associated with clozapine toxicity. The combined central nervous system, respiratory, and cardiovascular depression produced by these drugs was sufficient to cause death. The death was determined to be a clozapine overdose due to a fatal drug interaction.
309

Cranial Thickness in American Females and Males

Ross, Ann H., Jantz, Richard L., McCormick, William F. 01 January 1998 (has links)
To date, numerous studies have examined the range of cranial thickness variation in modern humans. The purpose of this investigation is to present a new method that would be easier to replicate, and to examine sex and age variation in cranial thickness in a white sample. The method consists of excising four cranial segments from the frontal and parietal regions. The sample consists of 165 specimens collected at autopsy and 15 calvarial specimens. An increase in cranial thickness with age was observed. The results suggest that cranial thickness is not sexually dimorphic outside the onset of hyperostosis frontalis interna (HFI).
310

Comparison of results using temperature controlled differential extraction and differential extraction using the QIAGEN EZ1 advanced

Nicholas, Emily Leona 10 February 2022 (has links)
The sexual assault kit backlog in the United States has become an increasing problem over the years. Between the number of kits laboratories receive and the time it takes to extract the deoxyribonucleic acid (DNA) from the cells, it is hard for labs to keep up with the demand. The extraction method used is called differential extraction, in which the epithelial cells from the victim are separated from the sperm cells from the perpetrator into different fractions. The Temperature Controlled Differential Extraction (TCDE) method is a novel procedure developed by the Cotton Lab at the Boston University School of Medicine and designed to decrease the extraction time while performing as well as, if not better than, traditional differential extraction methods. The TCDE method uses a series of temperature-controlled enzymes to lyse cells and purify the DNA extract. The purpose of this study is to compare the TCDE method to a method implemented by QIAGEN using the EZ1® Advanced biorobot for purification, which is used in many forensic laboratories. Ten female donors each received ten cotton swabs for vaginal cell collection; cotton swabs are typically found in sexual assault kits. Each swab then received either 5 ng, 25 ng, or 50 ng of male DNA in the form of sperm cells. One half of each swab was processed using the TCDE procedure while the other half was processed using the EZ1® method. The TCDE method results in three fractions: the Epithelial Fraction (EF), the Material Fraction (MF), and the Sperm Fraction (SF). The EZ1® protocol was modified to include the additional MF. Results of both the quantitation data and the electropherograms (EPGs) produced were compared between the two methods. The quantitation data for the EF show a variable amount of female DNA recovered, due to the uncontrolled amount of female epithelial cells added to the swabs by the donors. The MF results show that large amounts of female epithelial DNA remain in this fraction for the EZ1® protocol but not the TCDE protocol, because of the nuclease activity of one of the TCDE enzymes. The male DNA remaining in the MF can be compared to a known male profile, showing that potentially valuable data are left behind. Regarding the SF, the EZ1® protocol resulted in a higher yield of DNA than the TCDE; however, the TCDE SF electropherograms can still be used for comparisons against known male profiles. The TCDE protocol cuts extraction time by almost half, and the quantitation results and EPGs show that this method has the potential to become the new standard method of differential extraction.
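The sketch below illustrates one simple way paired quantitation results from the two extraction workflows could be compared: a Wilcoxon signed-rank test on sperm-fraction yields from the two halves of the same swab. The yield values are invented placeholders, not the study's quantitation data, and the test choice is an assumption rather than the analysis actually used.

```python
# Hypothetical sketch: paired comparison of sperm-fraction DNA yields from
# the same swab processed by two extraction workflows. The yields below are
# invented placeholders, not the study's quantitation results.
import numpy as np
from scipy.stats import wilcoxon

# Male DNA recovered (ng) per swab half, one pair per donor swab.
tcde_yield = np.array([1.8, 4.1, 9.5, 2.2, 5.0, 11.3, 1.5, 3.8, 8.9, 2.0])
ez1_yield = np.array([2.4, 5.6, 12.1, 2.9, 6.3, 14.0, 2.1, 4.9, 11.2, 2.6])

stat, p = wilcoxon(tcde_yield, ez1_yield)
print(f"median TCDE yield: {np.median(tcde_yield):.1f} ng")
print(f"median EZ1 yield:  {np.median(ez1_yield):.1f} ng")
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p = {p:.4f}")
```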
