641

Quasi-Static and Creep Behavior of Enhanced SiC/SiC Ceramic Matrix Composites

Pandey, Vinayak 17 July 2000 (has links)
Continuous Fiber Reinforced Ceramic Composites (CFCCs) are currently being investigated as potential materials for high-temperature applications such as combustor liners in stationary gas turbines. The creep behavior of woven Enhanced SiC/SiC composites was studied at temperatures from 600 to 1200 °C and at stress levels from 140 to 220 MPa. Most researchers studying the creep behavior of ceramic matrix composites (CMCs) use the time hardening model and rate equations to express the dependence of creep strain on time, temperature and stress. Such laws, although simple and easy to use, are inadequate to represent the creep behavior over a range of stress levels and temperatures and cannot be used to quantify the pest phenomenon commonly observed in CMCs. Hence, these laws were modified to include the pest phenomenon, and an empirical equation was developed that can represent the creep behavior at various stresses and temperatures. The modified equation was used in the finite element analysis and the results were compared with the time and strain hardening models. Microscopic observations of the fractured surfaces revealed the pseudo-ductile behavior of the material at high temperatures. A quasi-static test was conducted at 1200 °C to determine the unloading response of the material. The stress-strain response of the composite demonstrates a hysteresis loop and a small amount of permanent strain, which are characteristic of CMCs [3]. Finally, a test was conducted at 1200 °C to investigate the recovery behavior of the material. The material exhibits a tendency to recover the accumulated creep strain as well as the small permanent strain upon unloading, if sufficient time is allowed for recovery. The creep data were also modeled using representations such as the Monkman-Grant and Larson-Miller equations. A modified Monkman-Grant equation was used to model the stratification of the creep strain rate data with temperature. A finite element model based on plasticity theory was developed to simulate the quasi-static cyclic behavior of the material. Though the loading behavior of CMCs can be modeled using bilinear or multilinear kinematic hardening plasticity models, the unloading behavior predicted by these models is entirely different from the experimentally observed behavior. Hence, these models were modified to correctly predict the stress-strain behavior. The model, which was input into the ANSYS finite element program via a user-defined subroutine, uses the concept of state (internal) variables to define the unloading portion of the stress-strain curve. The results were compared with the test data and show very good agreement. The model was then used to predict the stress-strain response of a plate with a notch. The results from the analysis were compared with the experimental data and show good agreement when average values of strains are considered. / Master of Science
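For reference, the unmodified textbook forms of the creep relations named in this abstract are sketched below; the thesis's own modified equations (which add the pest effect) are not reproduced here:

```latex
% Time-hardening creep law: strain as a power law in stress and time,
% with Arrhenius temperature dependence
\varepsilon_{cr} = A \, \sigma^{n} \, t^{m} \, e^{-Q/RT}

% Monkman-Grant: minimum creep rate times rupture life is roughly constant
\dot{\varepsilon}_{\min} \, t_{r} = C_{MG}

% Larson-Miller parameter: collapses rupture-time data across temperatures
P_{LM} = T \left( C + \log_{10} t_{r} \right)
```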
642

Fast Static Learning and Inductive Reasoning with Applications to ATPG Problems

Dsouza, Michael Dylan 03 March 2015 (has links)
Relations among various nodes in a circuit, as captured by static and inductive invariants, have been shown to have a positive impact on a wide range of EDA applications. Techniques such as Boolean constraint propagation for static learning and the assume-then-verify approach to reasoning about inductive invariants have been made possible by efficient SAT solvers. Although a significant amount of research effort has been dedicated to the development of effective invariant learning techniques over the years, the computation time for deriving powerful multi-node invariants is still a bottleneck for large circuits. Fast computation of static and inductive invariants is the primary focus of this thesis. We present a novel technique to reduce the cost of static learning by intelligently identifying redundant computations that may not yield new invariants, thereby achieving significant speedup. The process of inductive invariant reasoning relies on the assume-then-verify framework, which requires multiple iterations to complete, making it infeasible for cases with a large set of multi-node invariants. We present filtering techniques that can be applied to a diverse set of multi-node invariants to achieve a significant boost in the performance of the invariant checker. Mining and reasoning about all possible potential multi-node invariants is simply infeasible. To alleviate this problem, strategies that narrow the focus to specific types of powerful multi-node invariants are also presented. Experimental results reflect the promise of these techniques. As a measure of quality, the invariants are utilized for untestable fault identification and to constrain ATPG for path delay fault testing, with positive results. / Master of Science
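A minimal sketch of the mine-then-verify flow described above, on a toy two-gate circuit (pure Python stands in for the SAT-based checker; all names are illustrative, not the thesis's implementation):

```python
from itertools import product

def simulate(a, b):
    """Toy combinational circuit: returns values of all nodes."""
    g1 = a and b          # AND gate
    g2 = a or b           # OR gate
    return {"a": a, "b": b, "g1": g1, "g2": g2}

nodes = ["a", "b", "g1", "g2"]

# 1. Mining: propose two-node implication invariants (x=1 => y=1)
#    that hold on a handful of simulated patterns.
samples = [simulate(a, b) for a, b in [(0, 0), (1, 1), (0, 1)]]
candidates = {(x, y) for x in nodes for y in nodes if x != y
              if all(not s[x] or s[y] for s in samples)}

# 2. Verification: keep only candidates that hold on *all* inputs.
#    (On real circuits a SAT solver proves this; exhaustive
#    enumeration plays that role here.)
proven = {(x, y) for (x, y) in candidates
          if all(not s[x] or s[y]
                 for s in (simulate(a, b)
                           for a, b in product([0, 1], repeat=2)))}

print(proven)   # includes ('g1', 'a'): the AND output being 1 forces a=1
```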
643

Design Validation of RTL Circuits using Binary Particle Swarm Optimization and Symbolic Execution

Puri, Prateek 05 August 2015 (has links)
Over the last two decades, chip design has been conducted at the register transfer level (RTL) using Hardware Description Languages (HDLs) such as VHDL and Verilog. Modeling at the behavioral level not only allows for better representation and understanding of the design, but also allows for encapsulation of the sub-modules, thus increasing productivity. Despite these benefits, validating an RTL design is not necessarily easier. Today, design validation is considered one of the most time- and resource-consuming aspects of hardware design. The costs associated with late detection of bugs can be enormous. Together with stringent time-to-market factors, the need to guarantee the correct functionality of the design is more critical than ever. The work done in this thesis tackles the problem of RTL design validation and presents new frameworks for functional test generation. We use branch coverage as our metric to evaluate the quality of the generated test stimuli. The initial effort for test generation utilized simulation-based techniques because of their scalability with design size and ease of use. However, simulation-based methods work on input spaces rather than the DUT's state space and often fail to traverse very narrow search paths in large input spaces. To counter this problem and enhance the test generation framework, subsequent work in this thesis statically extracts certain design semantics and mines recurrence relationships between different variables. Information such as relations among variables and loops can be extremely valuable from a test generation point of view. The simulation-based method is hybridized with a Z3-based symbolic backward execution engine, with feedback among the different stages. The hybridized method performs loop abstraction and is able to traverse narrow design paths without performing costly circuit analysis or explicit loop unrolling. Structurally and functionally unreachable branches are also identified during test generation. Experimental results show that the proposed techniques achieve high branch coverage on several ITC'99 benchmark circuits and their modified variants, with significant speedup and reduction in sequence length. / Master of Science
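To illustrate why a constraint solver helps with narrow paths, here is a minimal Z3 sketch (using the z3-solver Python bindings; the guard value and constraints are made up, not taken from the thesis) that solves a branch condition random simulation would almost never hit:

```python
from z3 import BitVec, Solver, sat  # pip install z3-solver

a, b = BitVec("a", 32), BitVec("b", 32)

# A narrow branch guard: random 32-bit inputs satisfy it with
# probability ~2^-32, but a solver finds witnesses directly.
s = Solver()
s.add((a ^ b) == 0x0000DEAD)
s.add(a > 1000)            # an extra path constraint from earlier branches

if s.check() == sat:
    m = s.model()
    print("stimulus:", m[a], m[b])  # values to feed the RTL testbench
```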
644

Greenhouse Gas Production and Nutrient Reductions in Denitrifying Bioreactors

Bock, Emily 11 June 2014 (has links)
The global nitrogen cycle has been disrupted by large anthropogenic inputs of reactive nitrogen to the environment. Excess nitrogen underlies environmental problems such as eutrophication, and can negatively affect human health. Managing the natural microbial process of denitrification is advocated as a promising avenue to reduce excess nitrogen, and denitrifying bioreactors (DNBRs) are an emerging technology harnessing this biochemical process. Previous DNBR research has established successful nitrate removal, whereas this study examines the potential to expand DNBR functionality to address excess phosphorus and mitigate the production of nitrous oxide, a potent greenhouse gas. Results from a laboratory experiment supported the hypothesis that the addition of biochar, a charcoal-like soil amendment and novel organic carbon source in DNBR research, would increase nitrate and phosphorus removal as well as decrease the accumulation of nitrous oxide, an intermediate product of microbial denitrification. In order to more closely examine the ratio of the products nitrous oxide and inert dinitrogen, a novel analytical method was developed to quantify dissolved gases in environmental water samples using gas chromatography-mass spectrometry. Although static headspace analysis is a common technique for quantifying dissolved volatiles, variation in sample preparation has recently been shown to affect the determination of dissolved concentrations of permanent gases and to complicate comparison between studies. This work demonstrates the viability of internal calibration with gaseous standard addition to make dissolved gas analysis more robust to variable sample processing and to correct for matrix effects on gas partitioning that may occur in environmental samples. / Master of Science
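As background for the headspace method, the closed-vial mass balance with Henry's-law partitioning at equilibrium is sketched below (a standard textbook relation, not the thesis's specific calibration model):

```latex
% Analyte initially dissolved in liquid volume V_l partitions into
% headspace volume V_g; H_cc is the dimensionless Henry constant.
C_{0} V_{l} = C_{l} V_{l} + C_{g} V_{g}, \qquad C_{g} = H_{cc} \, C_{l}

% Hence the original dissolved concentration from the measured headspace:
C_{0} = C_{g} \left( \frac{1}{H_{cc}} + \frac{V_{g}}{V_{l}} \right)
```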
645

From Theory to Practice: Deployment-grade Tools and Methodologies for Software Security

Rahaman, Sazzadur 25 August 2020 (has links)
Following proper guidelines and recommendations is crucial in software security, but is often obstructed by accidental human error. Automatic screening tools have great potential to reduce the gap between theory and practice. However, the goal of scalable automated code screening is largely hindered by the practical difficulty of reducing false positives without compromising analysis quality. To enable compile-time security checking of cryptographic vulnerabilities, I developed highly precise static analysis tools (CryptoGuard and TaintCrypt) that developers can use routinely. The main technical enabler for CryptoGuard is a set of detection algorithms that refine program slices by leveraging language-specific insights, while TaintCrypt relies on symbolic execution-based path-sensitive analysis to reduce false positives. Both CryptoGuard and TaintCrypt uncovered numerous vulnerabilities in real-world software, which demonstrates their effectiveness. Oracle has implemented our cryptographic code screening algorithms for Java in its internal code analysis platform, Parfait, and detected numerous previously unknown vulnerabilities. I also designed a specification language named SpanL to easily express rules for automated code screening. SpanL enables domain experts to create domain-specific security checks. Unfortunately, tools and guidelines are not sufficient to ensure baseline security in internet-wide ecosystems. I found that the lack of proper compliance checking has induced a huge gap in the payment card industry (PCI) ecosystem. I showed that none of the six PCI scanners we tested are fully compliant with the guidelines, issuing certificates to merchants that still have major vulnerabilities. Consequently, 86% of the 1,203 e-commerce websites we tested are non-compliant. To improve the testbeds in light of our work, the PCI Security Council shared a copy of our PCI measurement paper with the companies that host, manage, and maintain the PCI certification testbeds. / Doctor of Philosophy / Automatic screening tools have great potential to reduce the gap between the theory and the practice of software security. However, the goal of scalable automated code screening is largely hindered by the practical difficulty of reducing false positives without compromising analysis quality. To enable compile-time security checking of cryptographic vulnerabilities, I developed highly precise static analysis tools (CryptoGuard and TaintCrypt) that developers can use routinely. Both CryptoGuard and TaintCrypt uncovered numerous vulnerabilities in real-world software, which demonstrates their effectiveness. Oracle has implemented our cryptographic code screening algorithms for Java in its internal code analysis platform, Parfait, and detected numerous previously unknown vulnerabilities. I also designed a specification language named SpanL to easily express rules for automated code screening. SpanL enables domain experts to create domain-specific security checks. Unfortunately, tools and guidelines are not sufficient to ensure baseline security in internet-wide ecosystems. I found that the lack of proper compliance checking induced a huge gap in the payment card industry (PCI) ecosystem. I showed that none of the six PCI scanners we tested are fully compliant with the guidelines, issuing certificates to merchants that still have major vulnerabilities. Consequently, 86% of the 1,203 e-commerce websites we tested are non-compliant. To improve the testbeds in light of our work, the PCI Security Council shared a copy of our PCI measurement paper with the companies that host the PCI certification testbeds.
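To make the idea of static cryptographic screening concrete, here is a toy sketch in the same spirit. CryptoGuard itself analyzes Java with slicing-based algorithms; this standard-library `ast` scan of Python source, and every name in it (`encrypt`, `some_crypto_lib`), is purely illustrative:

```python
import ast

SOURCE = '''
from some_crypto_lib import encrypt
KEY = b"hardcoded-secret"
encrypt(KEY, b"payload")
'''

class HardcodedKeyFinder(ast.NodeVisitor):
    """Flag calls to encrypt() whose key argument traces to a constant."""
    def __init__(self):
        self.constants, self.findings = set(), []

    def visit_Assign(self, node):
        # Record names bound directly to literal values.
        if isinstance(node.value, ast.Constant):
            self.constants.update(t.id for t in node.targets
                                  if isinstance(t, ast.Name))
        self.generic_visit(node)

    def visit_Call(self, node):
        if isinstance(node.func, ast.Name) and node.func.id == "encrypt":
            key = node.args[0] if node.args else None
            if isinstance(key, ast.Constant) or (
                    isinstance(key, ast.Name) and key.id in self.constants):
                self.findings.append(f"line {node.lineno}: constant key")
        self.generic_visit(node)

finder = HardcodedKeyFinder()
finder.visit(ast.parse(SOURCE))
print(finder.findings)   # ['line 4: constant key']
```

Real tools must also handle inter-procedural flows and bytecode, which is where the slice-refinement algorithms mentioned above come in.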
646

Muscle Activation Patterns and Chronic Neck-Shoulder Pain in Computer Work

Kelson, Denean M. 20 April 2018 (has links)
Prolonged computer work is associated with high rates of neck and shoulder pain symptoms, and as computers have become increasingly common, it is critical that we develop sustainable interventions targeting this issue. Static muscle contractions of prolonged duration often occur in the neck/shoulder during computer work and may underlie muscle pain development in spite of rather low relative muscle load levels. Causal mechanisms may include a stereotypical recruitment of low-threshold motor units (activating type I muscle fibers), characterized by a lack of temporal as well as spatial variation in motor unit recruitment. Based on this theory, studies have postulated that individuals with chronic neck-shoulder pain will show less variation in muscle activity than healthy individuals when engaged in repetitive/monotonous work, but this has seldom been verified in empirical studies of actual computer work. Studies have rarely addressed temporal patterns in muscle activation, even though there is a consensus that temporal activation patterns are important for understanding fatigue and perhaps even the risk of subsequent musculoskeletal disorders. This study applied exposure variation analysis (EVA) to study differences in temporal patterns of trapezius muscle activity as individuals with and without pain performed computer work. The aims of this study were to: (1) assess the reliability of EVA for measuring variation in trapezius muscle activity in healthy individuals during the performance of computer work; and (2) determine the extent to which healthy subjects differ from those with chronic pain in trapezius muscle activity patterns during computer work, measured using EVA. Thirteen touch-typing, right-handed participants were recruited for this study (8 healthy; 5 with chronic pain). The participants were asked to complete three 10-minute computer tasks (TYPE, CLICK and FORM) in two pacing conditions (self-paced, control-paced), with the healthy group completing two sessions and the pain group completing one. Activation of the upper trapezius muscle was measured using surface electromyography (EMG). EMG data were organized into 5x5 EVA matrices with five amplitude classes (0-6.67, 6.67-20, 20-46.67, 46.67-100, >100% Reference Voluntary Exertion) and five duration classes (0-1, 1-3, 3-7, 7-15, >15 seconds). EVA marginal distributions (along both the amplitude and duration axes), as well as summary measures (mean and SD) of the marginal sums along each axis, were computed. Finally, the "resultant" mean and SD across all EVA cells were computed. The reliability of the EVA indices was estimated using intra-class correlation coefficients (ICC), the coefficient of variation (CV) and the standard error of measurement (SEM), computed from repeated measurements of healthy individuals (aim 1), and EVA indices were compared between groups (aim 2). Reliability of the EVA amplitude marginal sums ranged from moderate to high in the self-paced condition and low to moderate in the control-paced condition. Reliability of the duration marginal sums was moderate in the self-paced condition and moderate to high in the control-paced condition. Reliability of the summary measures (means and SDs) was moderate to high in both the self-paced and control-paced conditions.
Group comparisons revealed that individuals with chronic pain spent longer durations of work time in the higher EVA duration categories, exhibited larger means along the amplitude axis, the duration axis and in the resultant, and showed higher EVA SDs along the amplitude and duration axes than the healthy group. To our knowledge, this is the first study to report on the reliability of EVA applied specifically to computer work. Furthermore, EVA was used to assess differences in muscle activation patterns as individuals with and without chronic pain engaged in computer work. Individuals in the pain group exhibited prolonged sustained activation of the trapezius muscle to a significantly greater extent than controls, even though they did not experience pain during the performance of the computer tasks (as obtained through self-reports). Thus, the altered muscle recruitment patterns observed in the pain subjects, even in the absence of task-based pain/discomfort, are suggestive of chronic motor control changes occurring in adaptation to pain, and may have implications for the etiology of neck and upper-limb musculoskeletal disorders. / Master of Science / This study aims to assess the reliability of exposure variation analysis (EVA) for measuring variation in trapezius muscle activity in healthy individuals during the performance of computer work, and to determine the extent to which healthy subjects differ from those with chronic pain in trapezius muscle activity patterns during computer work, measured using EVA. Muscle activation was recorded for eight healthy individuals and five suffering from chronic neck-shoulder pain. The data were then categorized into amplitude and continuous-duration categories, and summary measures of the resulting distributions were calculated. These measures were used to assess the reliability of healthy individuals' responses to computer work, as well as to quantify differences between those with and without chronic pain. We found that individuals with pain activated their neck-shoulder muscles for longer continuous durations than healthy individuals, showing an inability to relax their muscles when performing work.
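A minimal numpy sketch of the EVA computation described above, using the amplitude and duration class boundaries stated in the abstract (function and variable names are illustrative; the thesis's analysis was not necessarily implemented this way):

```python
import numpy as np

AMP_EDGES = [6.67, 20, 46.67, 100]   # %RVE class boundaries (from the thesis)
DUR_EDGES = [1, 3, 7, 15]            # seconds class boundaries (from the thesis)

def eva_matrix(emg_amp, fs):
    """5x5 EVA matrix: fraction of total time spent at each combination
    of amplitude class and uninterrupted-duration class."""
    amp_cls = np.digitize(emg_amp, AMP_EDGES)   # class 0..4 per sample
    eva = np.zeros((5, 5))
    start = 0
    for i in range(1, len(amp_cls) + 1):
        # Close out a run when the amplitude class changes (or at the end).
        if i == len(amp_cls) or amp_cls[i] != amp_cls[start]:
            run_sec = (i - start) / fs
            eva[amp_cls[start], np.digitize(run_sec, DUR_EDGES)] += run_sec
            start = i
    return eva / eva.sum()

# Example: 100 s of synthetic rectified EMG at 10 Hz, in %RVE
rng = np.random.default_rng(0)
emg = rng.gamma(2.0, 10.0, size=1000)
m = eva_matrix(emg, fs=10)
amp_marginal = m.sum(axis=1)   # time fraction per amplitude class
dur_marginal = m.sum(axis=0)   # time fraction per duration class
```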
647

Im2Vid: Future Video Prediction for Static Image Action Recognition

AlBahar, Badour A Sh A. 20 June 2018 (has links)
Static image action recognition aims to identify the action performed in a given image. Most existing static image action recognition approaches use high-level cues present in the image, such as objects, human-object interaction, or human pose, to better capture the action performed. Unlike images, videos have temporal information that greatly improves action recognition by resolving potential ambiguity. We propose to leverage a large amount of readily available unlabeled videos to transfer temporal information from the video domain to the static image domain and hence improve static image action recognition. Specifically, we propose a video prediction model that predicts the future video of a static image, and we use the predicted future video to improve static image action recognition. Our experimental results on four datasets validate that the idea of transferring temporal information from videos to static images is promising and can enhance static image action recognition performance. / Master of Science
648

Enhancing CryptoGuard's Deployability for Continuous Software Security Scanning

Frantz, Miles Eugene 21 May 2020 (has links)
The increasing development speed brought by Agile may lead to overlooked security steps in the process, the Iowa Caucus application being one example. Verifying the protection of confidential information such as social security numbers requires security at all levels, providing protection through any connected applications. CryptoGuard is a static code analyzer for Java that verifies that developers do not leave cryptographic vulnerabilities in their applications. The program aids the developer by identifying cryptographic misuses such as hard-coded keys, weak hashes, and insecure protocols. In my Master's thesis work, I made several important contributions to improving the deployability, accessibility, and usability of CryptoGuard. I extended CryptoGuard to scan source as well as compiled code, created live documentation, and supported a dual cloud and local tool-suite. I also created build tool plugins and a program aid for CryptoGuard. In addition, I analyzed several Java-related surveys encompassing more than 50,000 developers and reported interesting current practices of real-world software developers. / Master of Science / Throughout the rise of software development, there has been an increase in development speed, with developers embracing methodologies that use higher rates of change, such as Agile. Since Agile naturally addresses "problems of rapid change", this also increases the likelihood of insecure and vulnerable coding practices. Though consumers depend on various public applications, there can still be failures throughout the development process, as in the Iowa caucus application. It was determined that the Iowa caucus application development team's repository credential (an API key) was left within the application itself. API keys provide the credentials to interact directly with server systems, and if left unguarded they can be easily exploited. Since the Iowa caucus application was released publicly, malicious actors (other people looking to exploit the application) may have already discovered this credential. Within our team we have created CryptoGuard, a program that analyzes applications to detect cryptographic issues such as an exposed API key. Created with scalability in mind, it can scan enterprise code at a reasonable speed. To ensure its use within companies, we have been extending and enhancing the work to meet the current needs of Java developers. To survey the current Java landscape, we investigated three different companies and their publicly available developer ecosystem surveys: JetBrains, known for their Integrated Development Environments (IDEs, or applications that help write applications) and their own programming language; Snyk, known for their public security platform and anti-virus capability; and Jakarta EE, the new platform for the enterprise version of Java. Across these surveys, we accumulated more than 50,000 developers' responses, spanning various countries, levels of company experience, and ages. With their responses amalgamated, we enhanced CryptoGuard to be available to as many developers, and to meet as many of their requests, as possible. First, CryptoGuard was enhanced to scan a project's source code. After that, to ensure the project is hosted by a cloud service, we are actively extending it to the Software Assurance Marketplace (SWAMP). Funded by the DHS, SWAMP not only supplies a public cloud for developers to use, but also a local download option to scan a program within the user's own computer. Next, we created plugins for the two most used build tools, Gradle and Maven. Then, to give CryptoGuard a reactive aid, CryptoSoule was created to provide a minimal interface. Finally, utilizing a live documentation service, an open-source documentation website was created to provide working examples to the community.
649

Secure Coding Practice in Java: Automatic Detection, Repair, and Vulnerability Demonstration

Zhang, Ying 12 October 2023 (has links)
The Java platform and third-party open-source libraries provide various Application Programming Interfaces (APIs) to facilitate secure coding. However, using these APIs securely is challenging for developers who lack cybersecurity training. Prior studies show that many developers use APIs insecurely, thereby introducing vulnerabilities in their software. Despite the availability of various tools designed to identify insecure API usage, their effectiveness in helping developers with secure coding practices remains unclear. This dissertation focuses on two main objectives: (1) exploring the strengths and weaknesses of the existing automated detection tools for API-related vulnerabilities, and (2) creating better tools that detect, repair, and demonstrate these vulnerabilities. Our research started with investigating the effectiveness of current tools in helping with developers' secure coding practices. We systematically explored the strengths and weaknesses of existing automated tools for detecting API-related vulnerabilities. Through comprehensive analysis, we observed that most existing tools merely report misuses, without suggesting any customized fixes. Moreover, developers often rejected tool-generated vulnerability reports due to concerns about the correctness of detection and the exploitability of the reported issues. To address these limitations, the second work proposed SEADER, an example-based approach to detect and repair security-API misuses. Given an exemplar ⟨insecure, secure⟩ code pair, SEADER compares the snippets to infer an API-misuse template and the corresponding fixing edit. Based on the inferred information, given a program, SEADER performs inter-procedural static analysis to search for security-API misuses and to propose customized fixes. The third work leverages ChatGPT-4.0 to automatically generate security test cases. These test cases can demonstrate how vulnerable API usage facilitates supply chain attacks on specific software applications. By running such test cases during software development and maintenance, developers can gain more relevant information about exposed vulnerabilities, and may better create secure-by-design and secure-by-default software. / Doctor of Philosophy / The Java platform and third-party open-source libraries provide various Application Programming Interfaces (APIs) to facilitate secure coding. However, using these APIs securely can be challenging, especially for developers who aren't trained in cybersecurity. Prior work shows that many developers use APIs insecurely, consequently introducing vulnerabilities in their software. Despite the availability of various tools designed to identify insecure API usage, it is still unclear how well they help developers with secure coding practices. This dissertation focuses on (1) exploring the strengths and weaknesses of the existing automated detection tools for API-related vulnerabilities, and (2) creating better tools that detect, repair, and demonstrate these vulnerabilities. We first systematically evaluated the strengths and weaknesses of the existing automated API-related vulnerability detection tools. We observed that most existing tools merely report misuses, without suggesting any customized fixes. Additionally, developers often reject tool-generated vulnerability reports due to concerns about the correctness of detection and whether the reported vulnerabilities are truly exploitable. To address the limitations found in our study, the second work proposed a novel example-based approach, SEADER, to detect and repair insecure API usage. The third work leverages ChatGPT-4.0 to automatically generate security test cases and to demonstrate how vulnerable API usage facilitates supply chain attacks on given software applications.
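To illustrate the example-based detect-and-repair idea at toy scale (SEADER itself infers templates from abstracted program slices via inter-procedural analysis; this string-level Python sketch and its exemplar pair are purely illustrative):

```python
import re

# Toy <insecure, secure> exemplar pair for a cipher-selection misuse.
INSECURE = 'Cipher.getInstance("DES")'
SECURE   = 'Cipher.getInstance("AES/GCM/NoPadding")'

def infer_rule(bad, good):
    """Derive a (pattern, fix) edit template from one exemplar pair."""
    return re.compile(re.escape(bad)), good

def scan_and_repair(source, rule):
    """Report each misuse site and return the repaired source."""
    pattern, fix = rule
    sites = [m.start() for m in pattern.finditer(source)]
    return sites, pattern.sub(fix, source)

target = 'Cipher c = Cipher.getInstance("DES"); // legacy code'
sites, repaired = scan_and_repair(target, infer_rule(INSECURE, SECURE))
print(sites)     # [11] -> one misuse found
print(repaired)  # Cipher c = Cipher.getInstance("AES/GCM/NoPadding"); ...
```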
650

Self-Assembled Multilayered Dielectric Spectral Filters

Chandran, Ashwin 11 January 2002 (has links)
Thin film optical filters are made by depositing thin films of optical materials on a substrate in such a way as to produce the required optical and mechanical properties. The Electrostatic Self-Assembly (ESA) process is accomplished by the alternate adsorption of polyanionic and polycationic molecules onto successively oppositely charged surfaces. This technique offers several advantages, such as ease of fabrication, molecular-level uniformity, stable multilayer synthesis and avoidance of the need for a vacuum environment. The ESA process is an excellent choice for manufacturing optical thin film coatings due to its capability to incorporate multiple properties into films at the molecular level and its speed and low cost. The ESA process as a method for manufacturing optical thin film filters has been investigated in detail in this thesis. A specific design was made and analyzed using TFCalc, a commercial thin film design software package. Sensitivity analyses detailing the changes in filter response due to errors in thickness and refractive index produced by the ESA process were performed. These showed that, with a high level of quality control, highly reliable and accurate optical thin films can be made by the ESA process. / Master of Science
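For background, multilayer filter design of the kind TFCalc automates rests on the standard characteristic-matrix method (this is textbook thin-film optics, not the thesis's specific design). At normal incidence, each layer j with refractive index n_j and physical thickness d_j contributes:

```latex
M_j =
\begin{pmatrix}
\cos\delta_j & \dfrac{i}{n_j}\sin\delta_j \\[4pt]
i\,n_j \sin\delta_j & \cos\delta_j
\end{pmatrix},
\qquad
\delta_j = \frac{2\pi n_j d_j}{\lambda}
```

The response of the whole stack follows from the ordered product M = M_1 M_2 ... M_N combined with the substrate admittance, which is how design software evaluates the sensitivity of the filter response to thickness and index errors.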
