About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

Distributed Decision Tree Induction Using Multi-agent Based Negotiation Protocol

Chattopadhyay, Dipayan 10 October 2014 (has links)
No description available.
122

A Model to Predict Ohio University Student Attrition from Admissions and Involvement Data

Roth, Sadie E. 05 August 2008 (has links)
No description available.
123

Structural classification of glaucomatous optic neuropathy

Twa, Michael Duane 13 September 2006 (has links)
No description available.
124

Applying Systems Thinking and Machine Learning Techniques to Identify Leverage Points for Intervening in Perioperative Opioid Use and Developing Risk Score Tools to Guide Perioperative Opioid Prescription

Huang, Yongmei January 2024 (has links)
Study Background and Objectives: Excessive perioperative opioid prescribing has been detrimental to public health, contributing to the elevated prevalence of opioid use disorder. Since 2016, rigorous regulation of opioid prescribing has reduced over-prescription, but has also led to opioid-phobia. The 2022 CDC guideline promotes person-centered decisions on pain management by relaxing restrictions on opioid prescription. The determination of opioid requirements for surgical pain management is influenced by various factors and stakeholders. Despite extensive research, the mechanisms underlying perioperative pain management and the persistence of opioid use after surgery remain unclear. Clinicians currently lack tools to guide opioid prescription in clinical settings, and patients often face a dearth of information regarding expected pain levels, proper opioid use, and options for surgical pain management. The main objective of my doctoral project is to disentangle the intricate relationships among patients, healthcare providers, and policy changes in perioperative opioid prescription for pain management, and to identify key intervention points that balance the beneficial effects of proper opioid use against the risks of addiction. Another objective is to develop a risk score algorithm for perioperative opioid requirements to support decision-making in clinical practice. Materials and Methods: In chapter 1, I undertook a systematic review and meta-analysis investigating the percentage of adult patients scheduled for general surgeries who received opioid analgesia for perioperative pain management, the quantities of opioids prescribed to patients, the actual quantities consumed, the percentage of patients without prior opioid exposure experiencing prolonged opioid use, and the evolution of perioperative opioid prescription patterns since the policy changes. 
A causal loop diagram was used to visualize the complex conceptual framework of perioperative pain management and post-surgical prolonged opioid use, based on insights derived from the systematic review and meta-analysis. In chapters 2 and 3, data from patients aged 18-64 years undergoing one of 12 commonly performed procedures (e.g., laparoscopic cholecystectomy) from 2015 to 2018 at a single institution were analyzed. Perioperative opioid requirements (none/low, medium, high) were determined based on patients’ self-reported pain scores and opioid prescription/administration from 30 days before to 2 weeks after surgery. Patients’ clinical and procedure-related factors were collected as potential predictors. Random forest, the Least Absolute Shrinkage and Selection Operator (LASSO), and multinomial logistic regression were used to develop prediction models. Model performance, including discrimination, calibration, and classification measures, was evaluated. A nomogram based on multinomial logistic regression was generated as a score tool, and decision curve analysis was used to examine the clinical utility of the final prediction model, dichotomizing the opioid prescription as none/sparing versus medium/high requirements. Results: My systematic review and meta-analysis revealed that around 85% of surgical patients received opioids perioperatively. The pooled mean total amount of opioids dispensed was 210 MME per patient per surgical procedure. Notably, only approximately 44% of the prescribed opioids were consumed. Among opioid-naïve patients who initiated opioid use perioperatively, 7.1% persisted in opioid use beyond the conventional three-month postoperative recovery timeframe. 
Intervention programs (such as setting maximum limits on opioid prescriptions, providing training to health providers, monitoring opioid prescribing behavior, and providing health education to patients) reduced perioperative opioid prescription by 38% and opioid consumption by 63.2%. The causal loop diagram illustrates a balancing feedback loop between policy and over-prescription, highlighting the pivotal role of a decision tool in reducing the over-prescription of perioperative opioids while ensuring that opioid needs for effective perioperative pain management are met. To develop a decision-aid tool based on prediction models, I included 2733 patients in the training dataset and 1081 in the testing dataset, all of whom underwent general surgeries. All prediction models demonstrated moderate discrimination in the testing dataset. The null hypothesis of perfect calibration intercepts and calibration slopes was rejected. In analyses restricted to patients undergoing laparoscopic cholecystectomy, model discrimination remained similar while model calibration improved. The revised LASSO model had an accuracy of around 65% in the testing dataset, classifying future cases correctly into opioid requirement groups in the laparoscopic cholecystectomy cohort. Features in the final laparoscopic cholecystectomy model included the use of opioids/NSAIDs/anti-depressants before surgery, emergency surgery, anesthesia type, and surgical indication for cholelithiasis/cholecystitis. A nomogram was created to guide perioperative opioid use among laparoscopic cholecystectomy patients, and the decision curve analysis demonstrated the clinical utility of the prediction model: it generated higher net benefits than both the strategy of prescribing no or sparing opioids to all surgical patients and the strategy of prescribing medium or high opioid doses to all patients, across a broad threshold probability range from 18% to 92%. 
Conclusions: In summary, this dissertation described the historically high levels of perioperative opioid prescriptions and highlighted their adverse impacts: persistent opioid use and community diversion. Although the implementation of guidance and policies has significantly reduced over-prescription of opioids nationwide, it is essential to recognize the potential benefits of appropriate opioid use in perioperative pain management. The incorporation of a machine-learning approach with subject-matter knowledge may achieve more accurate predictions of opioid requirements than employing machine-learning techniques alone, and may increase the interpretability of the prediction model. Notably, the surgery-specific model demonstrated superior performance to the model for general surgeries. Future studies should further validate the conceptual model of perioperative opioid prescription and misuse in real-world scenarios, enhance model discrimination, extend external validation efforts, and develop electronic applications tailored to contemporary medical practices.
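The nomogram-style score tool described above can be sketched as a multinomial logistic model with "none/low" as the reference class. All coefficients and feature names below are illustrative placeholders, not the fitted values from the dissertation:

```python
import math

# Hypothetical coefficients for a 3-class (none/low, medium, high) multinomial
# logistic model. The features mirror those named in the final laparoscopic
# cholecystectomy model, but every weight here is invented for illustration.
COEFS = {
    "medium": {"intercept": -0.8, "preop_opioid": 0.9, "preop_nsaid": 0.3,
               "preop_antidepressant": 0.4, "emergency": 0.5,
               "general_anesthesia": 0.6, "cholelithiasis": 0.2},
    "high":   {"intercept": -1.6, "preop_opioid": 1.5, "preop_nsaid": 0.4,
               "preop_antidepressant": 0.7, "emergency": 0.9,
               "general_anesthesia": 0.8, "cholelithiasis": 0.3},
}

def predict_requirement(patient):
    """Return class probabilities via softmax, 'none/low' as reference class."""
    logits = {"none/low": 0.0}
    for cls, w in COEFS.items():
        logits[cls] = w["intercept"] + sum(
            w[f] * patient.get(f, 0) for f in w if f != "intercept")
    z = max(logits.values())  # subtract max for numerical stability
    exps = {c: math.exp(v - z) for c, v in logits.items()}
    total = sum(exps.values())
    return {c: e / total for c, e in exps.items()}

# A patient with pre-operative opioid use undergoing emergency surgery
# under general anesthesia shifts probability toward higher requirements.
patient = {"preop_opioid": 1, "emergency": 1, "general_anesthesia": 1}
probs = predict_requirement(patient)
```

A nomogram is essentially a graphical rendering of such a linear predictor: each feature contributes points, and the point total maps to class probabilities.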
125

Methodology for a Security-Dependability Adaptive Protection Scheme based on Data Mining

Bernabeu, Emanuel 21 January 2010 (has links)
The power industry is currently in the process of re-inventing itself. The unbundling of the traditional monopolistic structure that gave birth to a deregulated electricity market, the mass tendency towards a greener use of energy, the new emphasis on distributed generation and alternative renewable resources, and new emerging technologies have revolutionized the century-old industry. Recent blackouts offer testimony to the crucial role played by protection relays in a reliable power system. It is argued that embracing the paradigm shift of adaptive protection is a fundamental step towards a reliable power grid. The adaptive philosophy of protection systems acknowledges that relays may change their characteristics in order to tailor their operation to prevailing system conditions. The purpose of this dissertation is to present a methodology for implementing a security/dependability adaptive protection scheme. It is argued that the likelihood of hidden failures and potential cascading events can be significantly reduced by adjusting the security/dependability balance of protection systems to better suit prevailing system conditions. The proposed methodology is based on Wide Area Measurements (WAMs) obtained with the aid of Phasor Measurement Units (PMUs). A data mining algorithm known as decision trees is used to classify the power system state and to predict the optimal security/dependability bias of a critical protection scheme. / Ph. D.
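The decision-tree classification idea can be illustrated with a minimal, self-contained sketch: from labeled operating conditions, find the single threshold split that best separates the two security/dependability biases. The features, data, and labels below are invented for illustration; a real implementation would grow a full CART-style tree on WAM data:

```python
# Minimal decision-stump sketch of the DT idea: given labeled operating
# conditions (feature vectors), find the split minimizing Gini impurity.

def gini(labels):
    """Gini impurity of a list of binary labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(X, y):
    """Return (feature_index, threshold) minimizing weighted Gini impurity."""
    best = (None, None, float("inf"))
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[j] <= t]
            right = [y[i] for i, row in enumerate(X) if row[j] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (j, t, score)
    return best[0], best[1]

# Toy operating conditions: [line loading (%), voltage deviation (p.u.)]
X = [[60, 0.01], [65, 0.02], [95, 0.08], [98, 0.10]]
y = [0, 0, 1, 1]  # 0 = bias toward dependability, 1 = bias toward security
feature, threshold = best_split(X, y)
```

Recursing on the two sides of the split yields a full tree; the dissertation's scheme applies such rules to PMU-derived system states.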
126

Hoeffding-Tree-Based Learning from Data Streams and Its Application in Online Voltage Security Assessment

Nie, Zhijie 05 September 2017 (has links)
According to the definition and classification of power system stability proposed by the IEEE and CIGRE Task Force, voltage stability refers to the ability of a power system to maintain steady voltage magnitudes at all buses when the system is subjected to a disturbance from a given operating condition (OC). Cascading outages due to voltage collapse are a probable consequence of insecure voltage situations. In this regard, fast-responding and reliable voltage security assessment (VSA) is effective and indispensable for the system to survive conceivable contingencies. This thesis aims at establishing an online systematic framework for voltage security assessment with high-speed data streams from synchrophasors and phasor data concentrators (PDCs). Periodically updated decision trees (DTs) have been applied to different subjects of security assessment in power systems. However, with a training data set of operating conditions that grows rapidly, re-training and restructuring a decision tree becomes a time-consuming process. The Hoeffding-tree-based method constructs a learner that is capable of memory management, processes streaming data in real time without retaining the complete data set for training purposes, and guarantees the accuracy of the learner. The proposed approach of voltage security assessment based on the Very Fast Decision Tree (VFDT) system is tested and evaluated on the IEEE 118-bus standard system. / Master of Science
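The key mechanism that lets a Hoeffding tree (VFDT) split without retaining the full stream is the Hoeffding bound: after n observations, the true mean of a variable with range R lies within epsilon of the observed mean with probability 1 - delta, so a split can be committed as soon as the observed gain gap between the two best attributes exceeds epsilon. A small sketch with illustrative numbers:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound: epsilon = sqrt(R^2 * ln(1/delta) / (2n))."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# Information gain for a binary secure/insecure label is bounded by
# log2(2) = 1, so R = 1. The delta and observed gain gap are illustrative.
R, delta = 1.0, 1e-7
gain_gap = 0.05  # observed gain(best attribute) - gain(second best)
n = 200
while hoeffding_bound(R, delta, n) > gain_gap:
    n += 200  # keep observing the stream before committing to a split
```

Once the loop exits, the split on the best attribute is statistically safe at confidence 1 - delta, and the examples seen so far can be discarded; this is what keeps the learner's memory bounded on an unbounded stream.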
127

Transient Stability Prediction based on Synchronized Phasor Measurements and Controlled Islanding

Li, Meiyan 20 June 2013 (has links)
Traditional methods for predicting transient stability of power systems, such as the direct method, the time domain approach, and the energy function methods, do not work well for online transient stability prediction. With the advent of Phasor Measurement Units (PMUs) in power systems, it is now possible to monitor the behavior of the system in real time and provide important information for transient stability assessment and enhancement. Techniques such as the rotor oscillation prediction method based on time series have made the prediction of system stability possible for real-time applications. However, methods of this type require more than 300 milliseconds after the start of a transient event to make reliable predictions. This dissertation provides an alternate prediction method for transient stability by taking advantage of the available PMU data. It predicts transient stability using apparent impedance trajectories obtained from PMUs, decision trees, and the FLDSD method. This method makes it possible to identify strategic locations for PMU installation in the power system to rapidly predict transient stability. The simulations performed show that system stability can be predicted in approximately 200 milliseconds (12 cycles). The main advantage of this method is its simplicity, as the PMUs can record the apparent impedance trajectories in real time without any previous calculations. Moreover, using decision trees built in CART, transient stability prediction becomes straightforward and computationally very fast. The optimum locations for PMU placement can also be determined using this technique. After transient instability is predicted from the apparent impedance trajectories, a slow-coherency based intelligent controlled islanding scheme is also developed to restore the stability of the system. It enables the generators in the same island to stay in synchronism while minimizing the imbalance between generation and load demand. / Ph. D.
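The apparent-impedance input to such a classifier can be sketched from phasor arithmetic alone: a PMU location sees Z = V/I from its local voltage and current phasors, and the trajectory of |Z| over time is what feeds the decision tree. The phasors and the shrinking-trajectory heuristic below are illustrative, not the dissertation's FLDSD/CART pipeline:

```python
import cmath

def apparent_impedance(v_phasor, i_phasor):
    """Apparent impedance seen at a measurement point: Z = V / I."""
    return v_phasor / i_phasor

# Synthetic trajectory: during a power swing, the voltage magnitude sags
# while current holds, so |Z| collapses toward the protected line.
voltages = [cmath.rect(1.0, 0.0), cmath.rect(0.8, -0.2), cmath.rect(0.4, -0.5)]
currents = [cmath.rect(0.5, -0.3)] * 3
trajectory = [abs(apparent_impedance(v, i)) for v, i in zip(voltages, currents)]

# A monotonically shrinking |Z| trajectory is one simple feature a decision
# tree could split on when classifying stable versus unstable swings.
shrinking = all(a > b for a, b in zip(trajectory, trajectory[1:]))
```

In the dissertation these trajectories, sampled at PMU reporting rates, become the input features from which the CART trees predict stability within roughly 12 cycles.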
128

Practical Implementation of a Security-Dependability Adaptive Voting Scheme Using Decision Trees

Quint, Ryan David 06 December 2011 (has links)
Today's electric power system is operated under increasingly stressed conditions. As electrical demand increases, the existing grid is operated closer to its stable operating limits while maintaining high reliability of electric power delivery to its customers. Protective schemes are designed to account for pressures towards unstable operation, but there is always a tradeoff between the security and dependability of this protection. Adaptive relaying schemes that can change or modify their operation based on prevailing system conditions are an example of a protective scheme increasing the reliability of the power system. The purpose of this thesis is to validate and analyze an implementation of the security-dependability adaptive voting scheme. It is demonstrated that this scheme can be implemented with a select few Phasor Measurement Units (PMUs) reporting positive-sequence currents to a Phasor Data Concentrator (PDC). At the PDC, the state of the power system is classified as Stressed or Safe, and a set of relays either vote or perform normal operation, respectively. The adaptive voting scheme was implemented using two configurations: hardware- and software-based PDC solutions. Each was shown to be functional, effective, and practical for implementation. Practicality was assessed based on the latency of Wide Area Measurement (WAM) devices and the added latency of relay voting operation during Stressed conditions. PMU, PDC, and relay operation delays were quantified to determine the benefits and limitations of WAMS protection and implementation of the voting scheme. It is proposed that the delays injected into the existing protection schemes would have minimal effect on the voting scheme but must be accounted for when implementing power system controls due to the real-time requirements of the data. / Master of Science
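The vote-versus-normal-operation logic described above can be sketched as follows. The stress threshold, the 2-of-3 majority vote, and the PMU inputs are illustrative assumptions, not the thesis's exact settings:

```python
# Sketch of the adaptive voting logic: under Safe conditions each relay trips
# independently (dependability-biased); under Stressed conditions a majority
# vote is required before tripping (security-biased). The stress-detection
# rule and threshold below are invented for illustration.

STRESS_THRESHOLD = 0.9  # p.u. positive-sequence current, assumed

def system_state(pmu_currents):
    """Classify the system as 'Stressed' or 'Safe' from PMU measurements."""
    return "Stressed" if max(pmu_currents) > STRESS_THRESHOLD else "Safe"

def trip_decision(relay_trips, state):
    """relay_trips: booleans from three redundant relays watching one zone."""
    if state == "Safe":
        return any(relay_trips)   # dependability: any single relay may trip
    return sum(relay_trips) >= 2  # security: at least 2 of 3 must agree

state = system_state([0.5, 0.95])
decision = trip_decision([True, False, False], state)
```

The point of the design is that a single relay misoperation (a hidden failure) cannot trip a line during stressed conditions, when an unnecessary trip is most likely to trigger cascading.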
129

POLYNOMIAL CURVE FITTING INDICES FOR DYNAMIC EVENT DETECTION IN WIDE-AREA MEASUREMENT SYSTEMS

Longbottom, Daniel W. 14 August 2013 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / In a wide-area power system, detecting dynamic events is critical to maintaining system stability. Large events, such as the loss of a generator or a fault on a transmission line, can compromise the stability of the system by causing the generator rotor angles to diverge and lose synchronism with the rest of the system. If these events can be detected as they happen, controls can be applied to the system to prevent it from losing synchronous stability. In order to detect these events, pattern recognition tools can be applied to system measurements. In this thesis, decision trees (DTs), a pattern recognition tool, were used for event detection. A single DT produced rules distinguishing between the event and no-event cases by learning on a training set of simulations of a power system model. The rules were then applied to test cases to determine the accuracy of the event detection. To use a DT to detect events, the variables used to produce the rules must be chosen. These variables can be direct system measurements, such as the phase angle of bus voltages, or indices created by a combination of system measurements. One index used in this thesis was the integral square bus angle (ISBA) index, which provided a measure of the overall activity of the bus angles in the system. Other indices used were the variance and rate of change of the ISBA. Fitting a polynomial curve to a sliding window of these indices and then taking the difference between the polynomial and the actual index was found to produce a new index that was non-zero during an event and zero at all other times for most simulations. After the index to detect events was chosen to be the error between the curve and the ISBA indices, a set of power system cases was created to be used as the training data set for the DT. All of these cases contained one event, either a small or large power injection at a load bus in the system model. 
The DT was then trained to detect the large power injection but not the small one. This was done so that the rules produced would detect large events on the system that could potentially cause the system to lose synchronous stability but ignore small events that have no effect on the overall system. This DT was then combined with a second DT that predicted instability such that the second DT made the decision whether or not to apply controls only for a short time after the end of every event, when controls would be most effective in stabilizing the system.
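The sliding-window curve-fitting index described above can be sketched with a degree-1 least-squares fit on a synthetic signal. The window length, polynomial order, and data are illustrative; the thesis applies the same idea to ISBA-type indices:

```python
# Fit a low-order polynomial to a sliding window of the index and use the
# error between the fit and the newest sample as the event indicator: near
# zero while the index evolves smoothly, large when an event perturbs it.

def linear_fit_residual(window):
    """Least-squares fit y = a*t + b over the window; return the absolute
    error between the fitted line and the actual value at the last sample."""
    n = len(window)
    t = list(range(n))
    t_mean = sum(t) / n
    y_mean = sum(window) / n
    a = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, window)) / \
        sum((ti - t_mean) ** 2 for ti in t)
    b = y_mean - a * t_mean
    return abs((a * (n - 1) + b) - window[-1])

# Smooth ramp: the residual stays near zero.
smooth = [0.1 * k for k in range(10)]
quiet = linear_fit_residual(smooth)

# Same ramp with a sudden step at the end (an "event"): the residual jumps.
step = smooth[:-1] + [5.0]
event = linear_fit_residual(step)
```

Thresholding this residual (or feeding it to the DT alongside the other indices) is what turns the curve-fitting error into an event-detection rule.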
130

New Physics Probes at Present/Future Hadron Colliders via Vh Production

Englert, Philipp 26 April 2023 (has links)
In this thesis, we utilise the framework of Effective Field Theories, more specifically the Standard Model Effective Field Theory (SMEFT), to parameterise New-Physics effects in a model-independent way. We demonstrate the relevance of precision measurements both at current and future hadron colliders by studying Vh-diboson-production processes. These processes allow us to probe a set of dimension-6 operators that generate BSM effects growing with the center-of-mass energy. More specifically, we consider the leptonic decay channels of the vector bosons and two different decay modes of the Higgs boson, the diphoton channel and the hadronic h->bb channel. The diphoton channel is characterised by a clean signature that can be separated very well from the relevant backgrounds with relatively simple methods. However, due to the small rate of this Higgs-decay channel, these processes will only become viable to probe New-Physics effects at the FCC-hh. Thanks to the large h->bb branching ratio, the Vh(->bb) channel already provides competitive sensitivity to BSM effects at the LHC. However, it suffers from large QCD-induced backgrounds that require us to use more sophisticated analysis techniques to achieve this level of BSM sensitivity. We derive the expected bounds on the previously mentioned dimension-6 operators from the Vh(->gamma gamma) channel at the FCC-hh and from the Vh(->bb) channel at the LHC Run 3, HL-LHC and FCC-hh. Our study of the Vh(->bb) channel demonstrates that extracting bounds on BSM operators at hadron colliders can be a highly non-trivial task. Machine-Learning algorithms can potentially be useful for the analysis of such complex event structures. We derive bounds using Boosted Decision Trees for the signal-background classification and compare them with the ones from the previously discussed cut-and-count analysis. 
We find a mild improvement of O(few %) across the different operators.
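The boosted-decision-tree idea behind the signal-background classification can be illustrated with a minimal AdaBoost-with-stumps toy. The 1-D "events" below are synthetic stand-ins for kinematic features, and AdaBoost with depth-1 trees is a simplified proxy for whatever BDT implementation a real collider analysis would use:

```python
# Toy boosting sketch: weighted decision stumps combined into an ensemble,
# reweighting misclassified events each round (AdaBoost). Labels: +1 = signal,
# -1 = background. All data here are synthetic and purely illustrative.
import math

def stump_predict(x, threshold, polarity):
    return polarity if x > threshold else -polarity

def train_adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n          # start with uniform event weights
    ensemble = []
    for _ in range(rounds):
        best = None            # exhaustive search for the best weighted stump
        for t in sorted(set(xs)):
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, t, pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)  # guard against log(0) on perfect stumps
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Upweight misclassified events, downweight correct ones, renormalize.
        w = [wi * math.exp(-alpha * y * stump_predict(x, t, pol))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def classify(ensemble, x):
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score > 0 else -1

xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]   # one synthetic feature per event
ys = [-1, -1, -1, 1, 1, 1]            # background below, signal above
model = train_adaboost(xs, ys)
```

In a realistic analysis each event carries many features and the trees are deeper, but the principle is the same: the boosted score replaces the hand-tuned cuts of a cut-and-count analysis.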
