61

Surface and topography metrology in firearm evidence identification and engineering surface quality control

Song, Junfeng January 2017 (has links)
This thesis is a topical review on the application of surface and topography metrology in firearm evidence identification and engineering surface quality control. It summarizes my research work at the National Institute of Standards and Technology (NIST) from 1987 to the present, where I have been Project Lead for Forensic Topography and Surface Metrology since 1997. I started my research in surface metrology in 1982, after my MS study at the Harbin Institute of Technology (HIT, Harbin, China) from 1978 to 1981. In 1985, I designed, manufactured and patented the Precision Random Profile Roughness Specimens in Beijing, aimed at providing a reference standard for quality control of smooth engineering surfaces [1]. These specimens were manufactured with Ra values ranging from 0.015 μm to 0.1 μm -- less than 1/10 of those of the similar specimens developed by PTB (Physikalisch-Technische Bundesanstalt) in Germany. They were successfully used by U.S. manufacturers for measurement unification and quality control of smooth engineering surfaces, and were included in the ASME B46 surface standard in 1995. Microform metrology is a subfield of surface metrology that involves surface measurements of complex geometry features on the micrometer scale. In 1995, I led a team at NIST which established a Microform Calibration System with the lowest calibration uncertainty in the world for calibration of Rockwell hardness (HR) diamond indenters. Based on the precision calibration of HR indenters and the control of other influencing quantities, I proposed a “Metrological Approach” to unifying international HRC scales with metrological traceability. I led an international HRC comparison among five National Metrology Institutes (NMIs), whose results strongly supported the proposed Metrological Approach. I drafted a joint paper for the five NMIs entitled “Establishing a worldwide unified Rockwell hardness scale with metrological traceability”, published in Metrologia 34 (Paris, 1997) [4]. Surface and topography metrology also provides strong support to firearm evidence identification. Based on my experience in developing surface standards, measurement systems, and uncertainty and traceability procedures, I led a research team which developed the NIST Standard Reference Material (SRM) Bullets and Cartridge Cases and the NIST 2D/3D Topography Measurement System [5]. We formulated a National Traceability and Quality System using the SRM Bullets and Cartridge Cases to support ballistics identifications within the National Integrated Ballistic Information Network (NIBIN) in the United States [6]. I have recently invented a Congruent Matching Cells (CMC) method for accurate ballistics identification and error rate estimation [7], which can serve as a statistical foundation for estimating error rates in firearm evidence identification, thus emulating methods used for forensic identification of DNA evidence [8].
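For readers unfamiliar with the roughness parameter Ra cited above, it is simply the arithmetic mean of the absolute profile deviations from the mean line. A minimal sketch in Python, using a hypothetical profile trace rather than the actual specimen data:

```python
import numpy as np

def roughness_ra(profile_um):
    """Arithmetic-mean roughness Ra: mean absolute deviation from the mean line."""
    z = np.asarray(profile_um, dtype=float)
    return np.mean(np.abs(z - z.mean()))

# Hypothetical smooth-surface profile (heights in micrometres) sampled along one trace.
rng = np.random.default_rng(1)
profile = 0.05 * np.sin(np.linspace(0, 20 * np.pi, 2000)) + rng.normal(0, 0.01, 2000)
print(f"Ra = {roughness_ra(profile):.3f} um")  # a few hundredths of a micrometre, as for the specimens above
```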
62

Automated digital forensics and computer crime profiling

Al Fahdi, Mahmood January 2016 (has links)
Over the past two decades, technology has developed tremendously, at an almost exponential rate. While this development has served the nation in numerous positive ways, negatives have also emerged, one of which is computer crime. This criminality has grown so fast that current digital forensic tools lag behind in both their development and their capability to manage such increasingly sophisticated crime. In essence, the time taken to analyse a case is large and increasing, and cases are not fully or properly investigated. This results in an ever-increasing number of pending and unsolved cases pertaining to computer crime. Digital forensics has become an essential tool in the fight against computer crime, providing both procedures and tools for the acquisition, examination and analysis of digital evidence. However, the use of technology is expanding at an ever-increasing rate, with the number of devices a single user might engage with growing from one to three or more, the data capacity of those devices reaching far into the terabytes, and the nature of the underlying technology evolving (for example, the use of cloud services). This presents an enormous challenge for forensic examiners seeking to process and analyse cases in an efficient and effective manner. This thesis focuses upon the examination and analysis phases of the investigative process and considers whether automation of the process is possible. The investigation begins by researching the current state of the art and illustrates a wide range of challenges facing digital forensics investigators when analysing a case. Supported by a survey of forensic researchers and practitioners, key challenges were identified and prioritised. It was found that 95% of participants believed that the number of forensic investigations would increase in the coming years, and 75% believed that the time consumed by such cases would increase. With regard to sophistication, 95% of participants expected a rise in the complexity and sophistication of digital forensic cases. To this end, an automated intelligent system that could be used to reduce the investigator’s time and cognitive load was identified as a promising solution. A series of experiments is devised around the use of Self-Organising Maps (SOMs) – a technique well known for unsupervised clustering of objects. The analysis is performed on a range of file system and application-level objects (e.g. email, internet activity) across four forensic cases. The experimental evaluations revealed that SOMs are able to successfully cluster forensic artefacts apart from the remaining files. Having established that SOMs are capable of separating the wanted artefacts in a case, a novel algorithm, referred to as the Automated Evidence Profiler (AEP), is proposed to encapsulate the process and further refine the artefact identification process. The algorithm achieved identification rates of 100% in two of the examined cases and 94% in a third. A novel architecture is proposed to support the algorithm in an operational capacity, incorporating standard forensic techniques such as hashing of known files, file signature analysis and application-level analysis. This provides a mechanism capable of combining the AEP with several other components that filter, prioritise and visualise artefacts of interest to the investigator.
The overall approach, known as the Automated Forensic Examiner (AFE), is capable of identifying potential evidence in a more efficient and effective manner. It was evaluated by a number of experts in the field, who unanimously agreed that the chosen research problem was a valid one. Further, the experts all expressed support for the Automated Forensic Examiner based on the results of the cases analysed.
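As an illustration of the clustering technique this thesis builds on, the following is a minimal Self-Organising Map sketch in plain NumPy, applied to hypothetical artefact feature vectors. It is not the AEP/AFE pipeline itself, only the underlying SOM idea of mapping similar objects to nearby grid cells:

```python
import numpy as np

def train_som(data, grid=(8, 8), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal Self-Organising Map: returns a grid of weight vectors fitted to `data`."""
    rng = np.random.default_rng(seed)
    n_rows, n_cols = grid
    weights = rng.random((n_rows, n_cols, data.shape[1]))
    # coordinates of every grid node, used for the neighbourhood function
    coords = np.stack(np.meshgrid(np.arange(n_rows), np.arange(n_cols), indexing="ij"), axis=-1)
    for t in range(iters):
        frac = t / iters
        lr = lr0 * (1.0 - frac)               # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighbourhood radius
        x = data[rng.integers(len(data))]
        # best-matching unit: node whose weight vector is closest to the sample
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), grid)
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # Gaussian neighbourhood weighting
        weights += lr * h * (x - weights)
    return weights

def map_to_bmu(weights, x):
    """Return the grid cell a single feature vector falls into."""
    return np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=2)), weights.shape[:2])

# Hypothetical artefact feature vectors (e.g. normalised timestamp, size, type flags).
rng = np.random.default_rng(1)
artefacts = np.vstack([rng.normal(0.2, 0.05, (100, 4)), rng.normal(0.8, 0.05, (100, 4))])
som = train_som(artefacts, grid=(6, 6))
print(map_to_bmu(som, artefacts[0]), map_to_bmu(som, artefacts[150]))  # the two groups land in different regions
```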
63

Human identification using soft biometrics

Reid, Daniel January 2013 (has links)
Humans naturally use descriptions to verbally convey the appearance of an individual. Eyewitness descriptions are an important resource for many criminal investigations; however, they cannot be used to automatically search databases of video or biometric data, reducing the utility of human descriptions in the search for a suspect. Soft biometrics are a new form of biometric identification which uses physical or behavioural traits that can be naturally described by humans. This thesis explores how soft biometrics can be used alongside traditional biometrics, allowing video footage and biometric data to be searched using a description. To permit soft biometric identification the human description must be accurate, yet conventional descriptions comprising absolute labels and estimations are often unreliable. A novel method of obtaining human descriptions is introduced which utilizes comparative categorical labels to describe the differences between subjects. A database of facial and bodily comparative labels is introduced and analysed. Prior to use as a biometric feature, comparative descriptions must be anchored. Several techniques to convert multiple comparative labels into a single relative measurement are explored. Recognition experiments were conducted to assess the discriminative capability of relative measurements as a biometric. Relative measurements can also be obtained from other forms of human representation; this is demonstrated using several machine learning techniques to determine relative measurements from gait biometric signatures. Retrieval results are presented showing the ability to automatically search video footage using comparative descriptions.
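One simple way to anchor comparative labels into a single relative measurement per subject -- offered here only as an illustrative sketch, not necessarily the technique used in the thesis -- is to treat each comparative label as a noisy pairwise difference and solve for per-subject scores by least squares:

```python
import numpy as np

def relative_scores(comparisons, n_subjects):
    """Convert pairwise comparative labels into one relative measurement per subject.

    comparisons: list of (i, j, d) meaning "subject i exceeds subject j by d" on an
    ordinal scale, e.g. -2 (much less) .. +2 (much more). Solves s_i - s_j ~= d in the
    least-squares sense; scores are relative, so the solution is centred to mean zero.
    """
    A = np.zeros((len(comparisons), n_subjects))
    b = np.zeros(len(comparisons))
    for row, (i, j, d) in enumerate(comparisons):
        A[row, i], A[row, j], b[row] = 1.0, -1.0, d
    s, *_ = np.linalg.lstsq(A, b, rcond=None)
    return s - s.mean()

# Hypothetical comparative height labels collected from annotators.
comps = [(0, 1, +2), (1, 2, +1), (0, 2, +2), (2, 3, -1), (1, 3, 0)]
print(relative_scores(comps, n_subjects=4).round(2))
```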
64

Eyewitness identification performance on lineups for distinctive suspects

Colloff, Melissa F. January 2016 (has links)
When constructing lineups for suspects with distinctive facial features (e.g., scars, tattoos, piercings), current police guidelines in several countries state that the distinctive suspect must not stand out. To this end, police officers sometimes artificially replicate a suspect’s distinctive feature across the other lineup members (replication); other times, they conceal the feature on the suspect and conceal a similar area on the other members by pixelating the area (pixelation), or covering the area with a solid rectangle (block). Although these three techniques are used frequently, little research has examined their efficacy. This thesis investigates how the lineup techniques for distinctive suspects influence eyewitness identification performance and, in doing so, tests the predictions of a new model of eyewitness decision-making—the diagnostic-feature-detection model (Wixted & Mickes, 2014). The research uses a standard eyewitness identification paradigm and signal detection statistics to examine how replication, pixelation, and block techniques influence identification performance: [1] compared to doing nothing to stop the distinctive suspect from standing out; [2] in young, middle-aged and older adults; and [3] when the culprit does not have the feature during the crime. It also examines [4] how variation in the way the suspect’s feature is replicated influences identification performance. The results converge to suggest that all three lineup techniques currently used by the police to accommodate distinctive suspects are equally effective and, when the culprit has the feature at the time of the crime, all enhance people’s ability to discriminate between innocent and guilty suspects more than doing nothing to prevent a distinctive suspect from standing out. All three lineup techniques enable people of all ages to make highly confident decisions when they are likely to be accurate. These findings align with the predictions of the diagnostic-feature-detection model, which suggests that the model remains a viable theory of eyewitness decision-making.
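The signal-detection statistic behind the discriminability claims above can be illustrated with the standard d' formula, z(hit rate) minus z(false-alarm rate). A minimal sketch with hypothetical rates, not the thesis's data:

```python
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection discriminability: z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Hypothetical lineup data: proportion of guilty suspects identified (hits)
# vs. innocent suspects identified (false alarms) under two lineup techniques.
print(f"replication: d' = {d_prime(0.60, 0.10):.2f}")
print(f"do-nothing:  d' = {d_prime(0.55, 0.25):.2f}")
```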
65

Paying attention to the evidence : a comparison of perception and decision making processes in novice and experienced 'scene of crime' officers using eye tracking in simulated crime scene scenarios

Ozger, Murat January 2016 (has links)
Research on crime scene investigation has focused strongly on the technical aspects of the process, while cognitive aspects (searching, reasoning and perception) have often been overlooked. Textbooks on the forensic sciences tend to focus on identifying and processing evidence and on the use of equipment, while it can be argued that the cognitive factors involved in processing such evidence and using that equipment are equally important. This thesis studies the cognitive aspects of crime scene investigation by comparing eye movement patterns in experts and novices. Studies in various domains, including surgery, sports and chess, have shown that eye movements differ between experts and novices, providing a tool towards a more objective assessment of skill than is possible with peer assessment. In four experiments, the eye movements of experts and novices were examined during (1) inspection of photographs of crime scenes on a computer screen, (2) a change blindness task on crime and non-crime scene images, (3) active exploration of a simulated crime scene, and (4) the assessment of emotional crime and natural scenes. While some trends in eye movement differences could be found in experts compared to novices, such as a tendency towards longer fixation durations and a broader focus on the overall scene rather than on the direct evidence, the differences between experts and novices were considerably smaller than in other domains, despite the broad range of measures extracted from the data. This lack of clear expertise effects may relate to the rather diverse perceptual layouts of crime scenes, reducing possible top-down effects of expertise on the deployment of attention. The results are discussed with a view to possible directions for future research in this domain.
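A minimal sketch of the kind of group comparison described above (e.g. fixation durations in experts versus novices), using simulated data rather than the thesis's eye-tracking recordings:

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical fixation durations (ms) pooled across trials for each group.
rng = np.random.default_rng(2)
experts = rng.gamma(shape=4.0, scale=70.0, size=300)  # slightly longer fixations
novices = rng.gamma(shape=4.0, scale=62.0, size=300)

t, p = ttest_ind(experts, novices, equal_var=False)   # Welch's t-test
print(f"expert mean {experts.mean():.0f} ms vs novice mean {novices.mean():.0f} ms, t={t:.2f}, p={p:.3f}")
```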
66

The importance of DNA as an investigation tool

Maharaj, Udesh 10 September 2013 (has links)
This study has a twofold purpose: to identify how knowledgeable investigators are about the collection and use of DNA in building a criminal case, and to establish how optimally DNA evidence is utilised. The study revealed several shortcomings which render the use of DNA evidence inadmissible in criminal proceedings. The researcher also analysed other aspects relating to DNA evidence, namely identification, individualisation, criminal investigation, forensic investigation, and the objectives of criminal investigation. For criminal investigators to be successful in their investigation of cases involving DNA, it is imperative that they have a clear understanding of the basic concepts surrounding DNA investigations and the value of DNA evidence. It is submitted that, because of detectives' lack of knowledge of DNA-related investigations, a lack of training in DNA-related cases, and delays in the collection of DNA evidence, valuable evidence is often lost and/or contaminated. This causes such evidence to become inadmissible in criminal proceedings and has a negative impact on the conviction rate for such crimes. / Preface and abstract in English and Afrikaans / Criminology / M.Tech. (Forensic Investigation)
67

Validating digital forensic evidence

Shanmugam, Karthikeyan January 2011 (has links)
This dissertation focuses on the forensic validation of computer evidence. It is, by necessity, a burgeoning field, and there have been significant advances in the detection and gathering of evidence related to electronic crimes. What makes computer forensics similar to other forensic fields is that considerable emphasis is placed on the validity of the digital evidence. It is not just the methods used to collect the evidence that are a concern; a further problem is that perpetrators of digital crimes may engage in what is called anti-forensics, whereby digital forensic techniques are deliberately thwarted and the evidence corrupted by those under investigation. In traditional forensics the link between evidence and a perpetrator's actions is often straightforward: a fingerprint on an object indicates that someone has touched the object. Anti-forensic activity would be the equivalent of having the ability to change the nature of the fingerprint before or during the investigation, thus making the forensic evidence collected invalid or less reliable. This thesis reviews existing security models and digital forensics, paying particular attention to anti-forensic activity that affects the validity of data collected in the form of digital evidence. It builds on the current models in this field and suggests a tentative first-step model to manage and detect the possibility of anti-forensic activity. Because the model is concerned with stopping anti-forensic activity, it is not a forensic model in the normal sense; it is what will be called a “meta-forensic” model, that is, an approach intended to stop attempts to invalidate digital forensic evidence. This thesis proposes a formal procedure and guides forensic examiners to look at evidence in a meta-forensic way.
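The meta-forensic model itself is procedural, but one standard building block for detecting tampering with collected evidence is cryptographic hashing of the acquired image and later re-verification. A minimal sketch (file path and digest are hypothetical):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_evidence(path, recorded_digest):
    """Compare a freshly computed digest against the one recorded at acquisition time."""
    return sha256_of(path) == recorded_digest.lower()

# Hypothetical usage: digest recorded when the disk image was acquired.
# print(verify_evidence("evidence/disk.img", "9f86d081884c7d65..."))
```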
68

Evaluation of evidence for autocorrelated data, with an example relating to traces of cocaine on banknotes

Wilson, Amy Louise January 2014 (has links)
Much research in recent years on evidence evaluation in forensic science has focused on methods for determining the likelihood ratio in various scenarios. One proposition concerning the evidence is put forward by the prosecution and another by the defence. The likelihood of each of these two propositions, given the evidence, is calculated. The likelihood ratio, or value of the evidence, is then given by the ratio of the likelihoods associated with these two propositions. The aim of this research is twofold. Firstly, it is intended to provide methodology for the evaluation of the likelihood ratio for continuous autocorrelated data. The likelihood ratio is evaluated for two such scenarios. The first is when the evidence consists of data which are autocorrelated at lag one. The second, an extension to this, is when the observed evidential data are also believed to be driven by an underlying latent Markov chain. Two models have been developed to take these attributes into account: an autoregressive model of order one and a hidden Markov model which does not assume independence of adjacent data points conditional on the hidden states. A nonparametric model, which does not make a parametric assumption about the data and which accounts for lag-one autocorrelation, is also developed. The performance of these three models is compared to that of a model which assumes independence of the data. The second aim of the research is to develop models to evaluate evidence relating to traces of cocaine on banknotes, as measured by the log peak area of the ion count for cocaine product ion m/z 105, obtained using tandem mass spectrometry. Here, the prosecution proposition is that the banknotes are associated with a person who is involved in criminal activity relating to cocaine, and the defence proposition is the converse, that the banknotes are associated with a person who is not involved in criminal activity relating to cocaine. Two data sets are available: one of banknotes seized in criminal investigations and associated with crime involving cocaine, and one of banknotes from general circulation. Previous methods for the evaluation of this evidence were concerned with the percentage of banknotes contaminated, or assumed independence of measurements of quantities of cocaine on adjacent banknotes. It is known that nearly all banknotes have traces of cocaine on them, and it was found that there was autocorrelation within samples of banknotes, so these methods are not appropriate. The models developed for autocorrelated data are applied to evidence relating to traces of cocaine on banknotes, and the results obtained for each of the models are compared using rates of misleading evidence, Tippett plots and scatter plots. It is found that the hidden Markov model is the best choice for the modelling of cocaine traces on banknotes because it has the lowest rate of misleading evidence and it also results in likelihood ratios which are large enough to give support to the prosecution proposition for some samples of banknotes seized from crime scenes. Comparison of the results obtained for models which take autocorrelation into account with those obtained from the model which assumes independence indicates that failing to account for autocorrelation can result in overstating the likelihood ratio.
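A minimal sketch of a likelihood-ratio computation under lag-one autocorrelation, using a Gaussian AR(1) model fitted separately to prosecution and defence training data. The data below are simulated stand-ins rather than the banknote measurements, and the moment-based fit is illustrative only, not the thesis's estimator:

```python
import numpy as np

def fit_ar1(x):
    """Crude moment-based Gaussian AR(1) fit (illustrative only)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    r = x - mu
    phi = np.clip(np.sum(r[1:] * r[:-1]) / np.sum(r[:-1] ** 2), -0.99, 0.99)
    sigma = (r[1:] - phi * r[:-1]).std(ddof=1)
    return mu, phi, sigma

def ar1_loglik(x, mu, phi, sigma):
    """Log-likelihood of x under a stationary Gaussian AR(1) model."""
    r = np.asarray(x, dtype=float) - mu
    var0 = sigma ** 2 / (1.0 - phi ** 2)                  # stationary variance for x[0]
    ll = -0.5 * (np.log(2 * np.pi * var0) + r[0] ** 2 / var0)
    resid = r[1:] - phi * r[:-1]                          # one-step prediction errors
    ll += np.sum(-0.5 * (np.log(2 * np.pi * sigma ** 2) + resid ** 2 / sigma ** 2))
    return ll

def simulate_ar1(mu, phi, sigma, n, rng):
    x = np.empty(n)
    x[0] = mu + rng.normal(0, sigma / np.sqrt(1 - phi ** 2))
    for t in range(1, n):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(0, sigma)
    return x

rng = np.random.default_rng(0)
crime_train = simulate_ar1(mu=6.0, phi=0.6, sigma=0.5, n=500, rng=rng)    # hypothetical "crime" notes
general_train = simulate_ar1(mu=4.5, phi=0.6, sigma=0.5, n=500, rng=rng)  # hypothetical circulation notes
questioned = simulate_ar1(mu=5.9, phi=0.6, sigma=0.5, n=40, rng=rng)      # sample under evaluation

log_lr = ar1_loglik(questioned, *fit_ar1(crime_train)) - ar1_loglik(questioned, *fit_ar1(general_train))
print(f"log likelihood ratio = {log_lr:.1f}")  # > 0 supports the prosecution proposition
```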
69

Telomerase : a prognostic marker and therapeutic target

Thakkar, Dipti S. January 2010 (has links)
Malignant glioma is the most common and aggressive form of primary brain tumour and is usually refractory to therapy. Telomerase, whose altered activity distinguishes cancer cells, is an attractive molecular target in glioma therapeutics. The aim of this thesis was to silence telomerase at the genetic level with a view to highlighting the changes caused in the cancer proteome and identifying the potential downstream pathways controlled by telomerase in tumour progression and maintenance. A comprehensive proteomic approach utilizing 2D-DIGE and MALDI-TOF was used to assess the effect of inhibiting two different regulatory mechanisms of telomerase in glioma. RNAi was used to target hTERT and Hsp90α. Inhibition of telomerase activity resulted in down-regulation of various cytoskeletal proteins, with correlative evidence of the involvement of telomerase in regulating the expression of vimentin. Vimentin plays an important role in tumour metastasis and is used as an indicator of glioma metastasis. Inhibition of telomerase via sihTERT resulted in the down-regulation of vimentin expression in glioma cell lines in a grade-specific manner. While 9 of 12 glioblastoma (grade IV) tissues showed vimentin to be highly expressed, its expression was absent in lower grades and normal tissues. This suggests that vimentin could potentially be used as a marker of glioma progression. This is the first study to report the potential involvement of telomerase in the regulation of vimentin expression. The study also found that combination therapy, comprising siRNA targeted towards telomerase regulatory mechanisms and the natural product epigallocatechin-3-gallate (EGCG), results in decreased cell viability, producing results comparable to those of other chemotherapeutic drugs.
70

Hsp90 as a molecular target

Munje, Chinmay January 2011 (has links)
Heat shock protein 90 (Hsp90), a highly conserved molecular chaperone, has been proposed to play a vital role in tumorigenesis. Hsp90 has two isoforms, of which Hsp90α is the major isoform of the Hsp90 complex and has an inducible expression profile. The molecular chaperone Hsp90α has been recognized in different cancers and is implicated in cell cycle progression, apoptosis, invasion, angiogenesis and metastasis. It is being recognized as a promising target in cancer treatment. Previous studies in our laboratory have demonstrated hsp90α expression in both primary glioma tissue and cell lines, but not in normal healthy brain tissues and cell lines. Enhanced chemosensitivity was observed upon specific inhibition of hsp90α expression by siRNA, suggesting that inhibiting hsp90α expression could be a more favourable therapeutic approach than conventional chemotherapies. In this novel study, Hsp90 was inhibited by either treatment with 17AAG or an shRNA oligonucleotide targeting hsp90α (shhsp90α) in the U87-MG glioma cell line. The inhibition of Hsp90α was quantified at the protein level in control and treated cells by flow cytometry (FACS) and an Hsp90α ELISA kit. The results demonstrated a significant reduction of Hsp90α protein levels after treatment with 17AAG and shhsp90α. The activity of Hsp90α was assayed by quantifying the levels of Akt/PKB in the samples; significant reductions (>50%) of Akt/PKB levels were observed following hsp90α inhibition. Cell cycle analysis showed S and G2 phase arrest following Hsp90 inhibition by either 17AAG or shhsp90α. Interestingly, 17AAG showed a stronger inhibition profile than shhsp90α. To analyse the downstream effects of Hsp90 inhibition and to determine the client proteins affected, proteomic analysis was performed. This identified several proteins that were either upregulated or downregulated following Hsp90 inhibition, and IPA analysis further identified “cancer” as the top network significantly altered after Hsp90 inhibition. Upregulated proteins included Hsp70 family members, Hsp27 and gp96, suggesting a role for Hsp90 co-chaperones in compensating for Hsp90 function after its inhibition. Moreover, members of the glycolysis/gluconeogenesis pathway were also upregulated, demonstrating an increased dependency of the treated glioma cells on glycolysis for energy supply. Considering Hsp70 and its anti-apoptotic role, it was postulated that a combination therapy involving a multi-target approach could be pursued. Subsequent inhibition of both Hsp90 and Hsp70 in the U87-MG glioma cell line resulted in 60% cell death along with S and G2 phase arrest. Thus, in the effective treatment of glioma, the inhibition of multiple targets needs to be taken into consideration. It can therefore be concluded that combination therapy involving silencing of Hsp90 and Hsp70 could be of significance in glioma therapy.
