51

Efficient Knot Optimization for Accurate B-spline-based Data Approximation

Yo-Sing Yeh (9757565) 14 December 2020
Many practical applications benefit from the reconstruction of a smooth multivariate function from discrete data, for purposes such as reducing file size or improving analytic and visualization performance. Among the different reconstruction methods, the tensor-product B-spline has a number of advantageous properties over alternative data representations. However, effectively constructing a best-fit B-spline approximation involves many obstacles. Among the many free parameters of the B-spline model, the choice of knot vectors, which defines the separation of the piecewise polynomial patches in a B-spline construction, has a major influence on the resulting reconstruction quality. Yet existing knot placement methods remain ineffective, computationally expensive, or restricted in the dataset format or B-spline order they support. Moving beyond the 1D case (curves) to higher-dimensional datasets (surfaces, volumes, hypervolumes) introduces additional computational challenges. Further complications arise for undersampled data, where the approximation problem can become ill-posed and existing regularization proves unsatisfactory.

This dissertation is concerned with improving the efficiency and accuracy of constructing a B-spline approximation of discrete data. Specifically, we present a novel B-spline knot placement approach for accurate reconstruction of discretely sampled data, first in 1D and then extended to higher dimensions for both structured and unstructured formats. Our knot placement methods take into account the features and complexity of the input data by estimating its high-order derivatives, so that the resulting approximation is highly accurate with a low number of control points. We demonstrate our method on various 1D to 3D structured and unstructured datasets, including synthetic, simulation, and captured data. We compare our method with state-of-the-art knot placement methods and show that our approach achieves higher accuracy while requiring fewer B-spline control points. We discuss a regression approach to selecting the number of knots for multivariate data given a target error threshold. For the reconstruction of irregularly sampled data, where the linear system often becomes ill-posed, we propose a locally varying regularization scheme to address cases in which straightforward regularization fails to produce a satisfactory reconstruction.
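As a rough illustration of the idea of derivative-aware knot placement, the sketch below fits a 1D least-squares B-spline with SciPy, distributing interior knots according to the cumulative magnitude of a finite-difference estimate of a higher-order derivative so that knots concentrate where the data varies quickly. The heuristic, the test signal, and every parameter here are illustrative assumptions, not the dissertation's algorithm.

```python
# A minimal sketch, assuming a simple derivative-guided heuristic (not the
# dissertation's method): place interior knots by inverting the CDF of the
# estimated higher-order derivative magnitude, then fit a least-squares B-spline.
import numpy as np
from scipy.interpolate import make_lsq_spline

def derivative_guided_knots(x, y, n_interior, order=4):
    """Interior knot locations from the cumulative magnitude of d^order y / dx^order."""
    d = y.copy()
    for _ in range(order):              # repeated finite differences
        d = np.gradient(d, x)
    density = np.abs(d) + 1e-12         # keep the density strictly positive
    cdf = np.cumsum(density)
    cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])
    levels = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    return np.interp(levels, cdf, x)    # invert the CDF at uniform levels

def fit_bspline(x, y, n_interior=20, k=3):
    t_int = derivative_guided_knots(x, y, n_interior, order=k + 1)
    # clamped knot vector: k+1 repeated boundary knots plus the interior knots
    t = np.r_[[x[0]] * (k + 1), t_int, [x[-1]] * (k + 1)]
    return make_lsq_spline(x, y, t, k=k)

x = np.linspace(0.0, 1.0, 500)
y = np.sin(20.0 * x**2)                 # sharply varying synthetic test signal
spl = fit_bspline(x, y)
print("max abs fit error:", np.max(np.abs(spl(x) - y)))
```

With a signal like this, most interior knots land near the right end of the interval, where the oscillations are fastest, which is the qualitative behavior a feature-aware placement is after.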
52

JOKE RECOMMENDER SYSTEM USING HUMOR THEORY

Soumya Agrawal (9183053) 29 July 2020 (has links)
The fact that every individual has a different sense of humor, and that it varies greatly from one person to another, means that learning any individual's humor preferences is a challenge. Humor is much more than a source of entertainment; it is an essential tool that aids communication. Understanding humor preferences can lead to improved social interactions and bridge existing social or economic gaps.

In this study, we propose a methodology that aims to develop a recommendation system for jokes by analyzing their text. Researchers have proposed different theories of humor depending on their area of focus. This exploratory study focuses mainly on Attardo and Raskin's (1991) General Theory of Verbal Humor and uses the knowledge resources it defines to annotate the jokes. These annotations capture the characteristics of the jokes and play an important role in determining how alike the jokes are. We use Lin's similarity metric (Lin, 1998) to computationally capture this similarity. The jokes are clustered hierarchically based on their similarity values, and the resulting clusters are used for recommendation. We also compare our joke recommendations to those produced by the Eigentaste algorithm (Goldberg, Roeder, Gupta, & Perkins, 2001), an existing joke recommendation system that does not consider the content of the jokes.
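To make the similarity and clustering steps concrete, the hedged sketch below computes a Lin-style similarity over hypothetical GTVH knowledge-resource annotations (the joke IDs and labels are invented for illustration) and then groups the jokes with average-linkage hierarchical clustering; it is a toy stand-in, not the thesis implementation.

```python
# Illustrative sketch: Lin-style similarity over assumed GTVH annotations,
# followed by hierarchical clustering of the resulting similarity matrix.
import numpy as np
from collections import Counter
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# hypothetical annotations: joke id -> set of knowledge-resource labels
jokes = {
    "j1": {"SO:doctor/patient", "LM:garden-path", "TA:professions"},
    "j2": {"SO:doctor/patient", "LM:exaggeration", "TA:professions"},
    "j3": {"SO:animal/human",   "LM:garden-path", "TA:animals"},
    "j4": {"SO:animal/human",   "LM:pun",          "TA:animals"},
}

# information content of each label from its frequency in the collection: IC = -log p
counts = Counter(lbl for labels in jokes.values() for lbl in labels)
total = sum(counts.values())
ic = {lbl: -np.log(c / total) for lbl, c in counts.items()}

def lin_similarity(a, b):
    """Lin (1998): twice the shared information over the total information of both items."""
    shared = sum(ic[l] for l in a & b)
    denom = sum(ic[l] for l in a) + sum(ic[l] for l in b)
    return 2.0 * shared / denom if denom else 0.0

ids = sorted(jokes)
sim = np.array([[lin_similarity(jokes[i], jokes[j]) for j in ids] for i in ids])
dist = 1.0 - sim
np.fill_diagonal(dist, 0.0)

# average-linkage clustering on the condensed distance matrix
Z = linkage(squareform(dist, checks=False), method="average")
print(dict(zip(ids, fcluster(Z, t=2, criterion="maxclust"))))
```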
53

IMPROVING THE FIDELITY OF AGENT-BASED ACTIVE SHOOTER SIMULATIONS THROUGH MODELING BLOOD LOSS AND INJURY MANAGEMENT

Krassimir Tzvetanov (11818304) 09 December 2021 (has links)
Simulation modeling has proven beneficial in gathering insights that may aid safety policy considerations for schools, offices, and outdoor events. This is especially true when conducting a drill is not practical or possible, as with active shooter response. However, current modeling practices can be improved with high-fidelity simulation logic reflecting a victim's well-being. Currently, victims are modeled either as "killed" or as continuing their normal movement. This binary approach is suitable for many simulations developed to understand coarse trends in an event space, but it does not allow the finer-grained insights that may be beneficial when developing a safety and response protocol for a specific facility or event. Additional victim characteristics, such as the location of a victim's wound and the rate of physiological decline, can be added to a model to improve realism and lead to an improved response protocol. The increased fidelity will be helpful when simulating and assessing the effects of volunteer response, critical care transport for medical intervention, and other first-responder interventions.

While some think it is not possible or necessary to simulate how fast gunshot victims lose blood, we show that a high-fidelity simulation is possible. The main counterarguments are that sufficient data do not exist and that implementing this process as it occurs would be challenging. However, we found enough data, or were able to extrapolate the missing pieces, to develop a consistent and realistic blood loss model. In addition, current simulation packages, such as AnyLogic, have advanced to the point where we can model liquid system dynamics within an agent-based model. Furthermore, there is an acute benefit to conducting this type of research, as it can help us develop better response policies, which result in more lives saved.

This research aims to improve emergency-response simulation fidelity by developing a model that simulates gunshot wounds and the subsequent blood loss while accounting for a victim's age, weight, gender, and the affected area. The model also accounts for the body's compensatory response and for medical interventions such as tourniquet application, wound packing, and direct pressure. The work presents an analytical model and its implementation using agent-based modeling in AnyLogic. The resulting AnyLogic module can be inserted into active shooter simulations and integrates easily with their existing logic. This integration happens through a high-level application programming interface (API) exposed to the user, which allows injury and its mitigation to be applied automatically. An extensive literature review and case studies provide a sound foundation for the model. AnyLogic was chosen for its common usage and its versatility with other systems and programming languages.
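To illustrate the kind of per-victim bookkeeping such a model performs, the sketch below steps a simulated victim's blood loss forward in time, with a site-dependent bleed rate, a crude compensatory response, and a tourniquet intervention. All rates and thresholds are illustrative placeholders (only the roughly 70 ml/kg blood-volume rule of thumb and the roughly 40% loss threshold are standard approximations), and this is not the dissertation's AnyLogic module.

```python
# A heavily simplified, assumed model of per-tick blood-loss bookkeeping for a
# simulated victim; numbers are illustrative, not clinical values.
from dataclasses import dataclass

@dataclass
class Victim:
    weight_kg: float
    lost_ml: float = 0.0

    @property
    def blood_ml(self):
        # rough rule of thumb: about 70 ml of circulating blood per kg of body weight
        return 70.0 * self.weight_kg

# purely illustrative bleed rates (ml/min) by wound site
WOUND_BLEED_ML_PER_MIN = {"extremity": 60.0, "torso": 150.0, "junctional": 120.0}

def simulate(victim, wound, minutes, tourniquet_at=None, dt=1.0):
    """Advance time in dt-minute steps; stop at the horizon or at ~40% of blood
    volume lost (roughly the class IV hemorrhage threshold)."""
    t = 0.0
    while t < minutes and victim.lost_ml < 0.4 * victim.blood_ml:
        rate = WOUND_BLEED_ML_PER_MIN[wound]
        # crude compensatory response: loss slows as the fraction lost grows
        rate *= max(0.3, 1.0 - victim.lost_ml / victim.blood_ml)
        # intervention: a tourniquet on an extremity wound nearly stops the bleed
        if tourniquet_at is not None and t >= tourniquet_at and wound == "extremity":
            rate *= 0.05
        victim.lost_ml += rate * dt
        t += dt
    return victim.lost_ml, t

print(simulate(Victim(weight_kg=80.0), "extremity", minutes=60, tourniquet_at=5))
```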
54

Data Driven Dense 3D Facial Reconstruction From 3D Skull Shape

Anusha Gorrila (7023152) 13 August 2019 (has links)
This thesis explores a data-driven, machine-learning-based solution for facial reconstruction from three-dimensional (3D) skull shape, for recognizing or identifying unknown subjects during forensic investigation. With over 8,000 bodies left unidentified over the past three decades, facial reconstruction of disintegrated bodies to aid identification has been a critical issue for forensic practitioners. Historically, clay modelling has been used for facial reconstruction; it not only requires an expert in the field but also demands a substantial amount of time even after the skull model has been acquired. Such manual reconstruction typically takes from one month to over three months of time and effort. The solution presented in this thesis uses 3D Cone Beam Computed Tomography (CBCT) data collected from many people to build a model of the relationship between facial skin and skull bone over a dense set of locations on the face. It then uses this skin-to-bone relationship learned from the data to predict a face model from the skull shape of an unknown subject. The thesis also extends the algorithm so that the reconstructed face model can be modified interactively to account for the effects of age or weight. This uses the predicted face model as a starting point and creates different hypotheses of facial appearance for different physical attributes. Attributes such as age and body mass index (BMI) are used to show changes in physical facial appearance with the help of a tool we constructed, which could improve the identification process. The thesis also presents methods designed for testing and validating the facial reconstruction algorithm.
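The core idea of learning a skin-to-bone relationship from paired data can be sketched as a simple regression, shown below with synthetic stand-in arrays: per-landmark tissue offsets are regressed on subject covariates such as age and BMI, and a new face is predicted by adding the modeled offsets to the skull landmarks. The data shapes, covariates, and ridge model here are illustrative assumptions, not the thesis pipeline.

```python
# A toy sketch of data-driven face-from-skull prediction with attribute control.
import numpy as np

rng = np.random.default_rng(0)

# hypothetical training data: n subjects, m landmarks, 3D offsets (skin - bone),
# plus per-subject covariates [1, age, BMI] used as a linear predictor
n, m = 200, 500
covariates = np.c_[np.ones(n), rng.uniform(20, 80, n), rng.uniform(18, 35, n)]
offsets = rng.normal(0.0, 1.0, (n, m * 3))      # stand-in for CBCT-derived offsets

# ridge-regularized least squares: offsets ≈ covariates @ W
lam = 1e-2
A = covariates.T @ covariates + lam * np.eye(covariates.shape[1])
W = np.linalg.solve(A, covariates.T @ offsets)   # shape (3, m*3)

def predict_face(skull_landmarks, age, bmi):
    """Predicted skin landmarks = bone landmarks + modeled tissue offsets."""
    pred_offsets = np.array([1.0, age, bmi]) @ W
    return skull_landmarks + pred_offsets.reshape(m, 3)

skull = rng.normal(0.0, 50.0, (m, 3))            # stand-in for a segmented skull mesh
face_young = predict_face(skull, age=25, bmi=22)
face_older = predict_face(skull, age=60, bmi=30)
print("mean landmark shift (mm, toy units):",
      np.mean(np.linalg.norm(face_older - face_young, axis=1)))
```

Varying the age and BMI inputs is the regression analogue of the interactive attribute adjustment described above: the same skull yields a family of plausible faces.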
55

Generating Evidence for COPD Clinical Guidelines Using EHRs

Amber M Johnson (7023350) 14 August 2019 (has links)
The Global Initiative for Chronic Obstructive Lung Disease (GOLD) guidelines are used to guide clinical practice for treating Chronic Obstructive Pulmonary Disease (COPD). GOLD focuses heavily on stable COPD patients, limiting its use for non-stable COPD patients such as those with severe, acute exacerbations of COPD (AECOPD) that require hospitalization. Although AECOPD can be heterogeneous, it can lead to deterioration of health and early death. Electronic health records (EHRs) can be used to analyze patient data for understanding disease progression and generating guideline evidence for AECOPD patients. However, because of their structure and representation, EHR data can be challenging to retrieve, analyze, and properly interpret, and existing tools do not provide granular analytic capabilities for these data.

This dissertation presents, develops, and implements a novel approach that systematically captures the effect of interventions during patient medical encounters, and hence may support evidence generation for clinical guidelines in a systematic and principled way. A conceptual framework is introduced that structures components such as data storage, aggregation, extraction, and visualization to support granular EHR data analytics. Based on these components, we develop a software framework in Python to create longitudinal representations of raw medical data extracted from the Medical Information Mart for Intensive Care (MIMIC-III) clinical database. The software framework consists of two tools: Patient Aggregated Care Events (PACE), a novel tool for constructing and visualizing the entire medical histories of both individual patients and patient cohorts, and Mark SIM, a Markov Chain Monte Carlo modeling and simulation tool for predicting clinical outcomes through probabilistic analysis that captures granular temporal aspects of aggregated clinical data.

As an application of probabilistic modeling, we assess the efficacy of antibiotic treatment and the optimal time of its initiation for hospitalized AECOPD patients. We identify 697 AECOPD patients, of whom 26.0% were administered antibiotics. Our model simulations show a 50% decrease in mortality rate as the number of patients administered antibiotics increases, and an estimated 5.5% mortality rate when antibiotics are first administered after 48 hours versus 1.8% when antibiotics are first administered between 24 and 48 hours. Our findings suggest that there may be a mortality benefit in early initiation of antibiotics for ICU patients with severe AECOPD and acute respiratory failure.

Thus, we show that it is feasible to enhance the representation of EHRs to aggregate patients' entire medical histories with temporal trends and to support complex clinical questions that drive clinical guidelines for COPD.
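In the spirit of the Mark SIM component, the sketch below runs a small Monte Carlo simulation of AECOPD encounters as a Markov chain whose daily transition probabilities change once antibiotics are started; all states and probabilities are illustrative placeholders, not estimates from MIMIC-III.

```python
# Minimal Monte Carlo sketch of outcome simulation under different antibiotic
# start times; transition probabilities are assumed for illustration only.
import numpy as np

STATES = ["unstable", "stable", "discharged", "deceased"]

def daily_transitions(on_antibiotics):
    # rows: current state, columns: next state, ordered as in STATES
    if on_antibiotics:
        return np.array([[0.55, 0.38, 0.00, 0.07],
                         [0.10, 0.60, 0.29, 0.01],
                         [0.00, 0.00, 1.00, 0.00],
                         [0.00, 0.00, 0.00, 1.00]])
    return np.array([[0.62, 0.25, 0.00, 0.13],
                     [0.15, 0.60, 0.23, 0.02],
                     [0.00, 0.00, 1.00, 0.00],
                     [0.00, 0.00, 0.00, 1.00]])

def simulate_cohort(n_patients, antibiotics_start_day, horizon_days=14, seed=0):
    rng = np.random.default_rng(seed)
    deaths = 0
    for _ in range(n_patients):
        state = 0                                    # every encounter starts unstable
        for day in range(horizon_days):
            P = daily_transitions(day >= antibiotics_start_day)
            state = rng.choice(4, p=P[state])
        deaths += state == 3
    return deaths / n_patients

for start in (1, 2, 3):
    print(f"antibiotics from day {start}: simulated mortality ~ {simulate_cohort(5000, start):.3f}")
```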
56

Protecting Bare-metal Systems from Remote Exploitation

Abraham Anthony Clements (6618926) 15 May 2019 (has links)
The Internet of Things is deploying large numbers of bare-metal systems that have no protection against memory corruption and control-flow hijacking attacks. These attacks have enabled unauthorized entry to hotel rooms, malicious control of unmanned aerial vehicles, and invasions of privacy. Using static and dynamic analysis, these systems can utilize state-of-the-art testing techniques to identify and prevent memory-corruption errors, and can employ defenses against memory corruption and control-flow hijacking attacks that match or exceed those currently employed on desktop systems. This is shown using three case studies.

(1) EPOXY, which automatically applies data execution prevention, diversity, stack defenses, and the separation of privileged from unprivileged code using a novel technique called privileged overlaying. These protections prevent code injection attacks and reduce the number of privileged instructions to 0.06% versus an unprotected application.

(2) Automatic Compartments for Embedded Systems (ACES), which automatically creates compartments that enforce data integrity and code isolation within bare-metal applications. ACES enables exploring policies to best meet the security and performance requirements of individual applications. Results show that ACES can form tens of compartments within a single thread with a 15% runtime overhead on average.

(3) HALucinator, which removes the requirement for specialized hardware to perform bare-metal system testing. This enables state-of-the-art testing techniques (e.g., coverage-based fuzzing) to scale with the availability of commodity computers, leading to the discovery of exploitable vulnerabilities in bare-metal systems.

Combined, these case studies advance the security of embedded systems by several decades and provide essential protections for today's connected devices.
57

Sistemas especialistas modulados e abrangentes para a gestão de operações / Modular and comprehensive expert systems for operations management

Barrella, Wagner Däumichen 12 December 2000 (has links)
New economic and working conditions have been driving companies to modernize the techniques and methodologies they use to solve problems in Production Engineering. Although new computing tools appear rapidly and there is great interest in using computing to support decision-making, in practice companies do not use computers to their full potential: users (especially engineers) run expert programs to reach isolated decisions and then transfer the results to other applications to carry out their analyses. This work developed multidisciplinary studies involving the new tools offered by advances in Computer Science and the modern concepts of management and process optimization studied in Production Engineering. These studies were directed toward practical results that can be applied quickly in Brazilian companies, that is, within the national financial and technological context. The research sought to document what form such a system should take so that it can facilitate and automate production planning for a manufacturing or service company. The aim was thus to record the concepts and philosophies needed to build a tool for optimizing production processes that solves, or mitigates, problems caused by restrictions on the availability of resources (equipment, labor, material, and time) or by restrictions of other kinds.
59

ALGORITHMS FOR DEGREE-CONSTRAINED SUBGRAPHS AND APPLICATIONS

S M Ferdous (11804924) 19 December 2021 (has links)
A degree-constrained subgraph construction (DCS) problem aims to find a spanning subgraph that is optimal with respect to an objective function, subject to certain degree constraints on the vertices. DCS generalizes many combinatorial optimization problems, such as matching and edge cover, and has many practical, real-world applications. This thesis focuses on DCS problems with only upper bounds or only lower bounds on the degrees, known as b-matching and b-edge cover problems, respectively. We explore both linear and submodular objective functions for the subgraph construction.

The contributions of this thesis involve both the design of new approximation algorithms for these DCS problems and their application to real-world contexts. We designed, developed, and implemented several approximation algorithms for DCS problems. Although some of these problems can be solved exactly in polynomial time, the exact algorithms are often expensive, tedious to implement, and offer little to no concurrency. In contrast, many of the approximation algorithms developed here run in nearly linear time, are simple to implement, and are concurrent. Using the local dominance framework, we developed the first parallel algorithm for submodular b-matching. For weighted b-edge cover, we improved the classic greedy algorithm using the lazy evaluation technique. We also propose and analyze several approximation algorithms using the primal-dual linear programming framework and reductions to matching. We evaluate the practical performance of these algorithms through extensive experiments.

The second contribution of the thesis is to apply the new algorithms in real-world applications. We employ submodular b-matching to generate a balanced task assignment across processors for building Fock matrices in the NWChemEx quantum chemistry software. Our load-balanced assignment results in a four-fold speedup per iteration of the Fock matrix computation and scales to 14,000 cores of the Summit supercomputer at Oak Ridge National Laboratory. Using approximate b-edge cover, we propose the first shared-memory and distributed-memory parallel algorithms for the adaptive anonymity problem. Minimum weighted b-edge cover and maximum weight b-matching are also shown to be applicable to constructing graphs from datasets for machine learning tasks, and we provide a mathematical optimization framework connecting this graph construction problem to the DCS problem.
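As a concrete reference point for the b-matching side of these problems, the sketch below implements the classical greedy 1/2-approximation for maximum-weight b-matching: take edges in non-increasing weight order while both endpoints still have spare capacity. It is a simple baseline, not the parallel local-dominance or lazy-evaluation algorithms developed in the thesis, and the example graph is invented.

```python
# Greedy 1/2-approximation for maximum-weight b-matching: each vertex v may be
# matched by at most b(v) of its incident edges.
def greedy_b_matching(edges, b):
    """edges: iterable of (u, v, weight); b: dict mapping vertex -> capacity."""
    remaining = dict(b)
    chosen = []
    # consider edges by non-increasing weight; keep an edge only if both
    # endpoints still have spare capacity
    for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
        if u != v and remaining.get(u, 0) > 0 and remaining.get(v, 0) > 0:
            chosen.append((u, v, w))
            remaining[u] -= 1
            remaining[v] -= 1
    return chosen

edges = [("a", "b", 5.0), ("a", "c", 4.0), ("b", "c", 3.0), ("c", "d", 2.0)]
b = {"a": 1, "b": 2, "c": 2, "d": 1}
matching = greedy_b_matching(edges, b)
print(matching, "total weight:", sum(w for *_, w in matching))
```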
60

Human-AI Teaming for Dynamic Interpersonal Skill Training

Ogletree, Xavian Alexander 26 May 2021 (has links)
No description available.
