1 |
A Probabilistic Inventory Analysis of Biomass for the State of Texas for Cellulosic Ethanol
Gleinser, Matthew A. 16 January 2010 (has links)
Agricultural and forestry wastes available for producing cellulosic ethanol were inventoried for each county in Texas. A simple forecast was created for each of the agricultural wastes, and a multivariate empirical distribution was then used to simulate the range of biomass available by county and district. The probability that a district could support a 25, 50, 75, or 100 million gallon cellulosic ethanol plant was estimated from the Monte Carlo simulation results.
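To make the probability estimate concrete, here is a minimal Python sketch of the Monte Carlo step (not the thesis's actual model): district biomass is resampled from a hypothetical empirical series, and the probability of supporting each plant size is the fraction of draws that meet the feedstock requirement. The tonnage series and the gallons-per-dry-ton yield are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of the Monte Carlo step: resample district biomass from an assumed
# empirical series and count how often each plant size could be supplied.
rng = np.random.default_rng(42)

historical_tons = np.array([1.9e6, 2.3e6, 2.1e6, 1.7e6, 2.5e6, 2.0e6])  # dry tons/year (hypothetical)
gallons_per_ton = 70.0                    # assumed ethanol yield per dry ton
plant_sizes = [25e6, 50e6, 75e6, 100e6]   # plant capacities in gallons/year

draws = rng.choice(historical_tons, size=10_000, replace=True)  # empirical resampling

for size in plant_sizes:
    required_tons = size / gallons_per_ton
    prob = np.mean(draws >= required_tons)
    print(f"{size / 1e6:.0f} million gallon plant: P(enough biomass) = {prob:.2f}")
```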
Biomass in Texas is concentrated in the northern and eastern areas of the state. South and West Texas have little to no biomass available for cellulosic ethanol. The North East, South East, and Upper Coast districts include forestry waste that increases the amount of available biomass. With 100 percent certainty, the North East and South East districts can each support four 100 million gallon cellulosic ethanol plants. The research found that there is more than enough biomass to support numerous cellulosic ethanol plants in Texas, and decision makers can use the results of this study to identify regions of low and high risk for biomass availability from agricultural and forestry waste.
|
2 |
Fast and Scalable Power System Learning, Analysis, and Planning
Taheri Hosseinabadi, Sayedsina 01 February 2022 (has links)
With the integration of renewable and distributed energy resources (DERs) and advances in metering infrastructure, power systems are undergoing rapid modernization that brings forward new challenges and possibilities, which call for more advanced learning, analysis, and planning tools. While there are numerous problems present in the modern power grid, this work addresses four of the most prominent challenges and shows how new advances in generation and metering can be leveraged to address the challenges they introduce.

With regard to learning in power systems, we first tackle power distribution system topology identification, since knowing the topology of the power grid is a crucial input to any meaningful optimization and control task. The topology identification presented in this work is based on the idea of "prob-to-learn": perturbing the power grid with small power injections and using the metered response to learn the topology. Using maximum-likelihood estimation, we formulate the topology identification problem as a mixed-integer linear program. We next tackle the challenge of finding the optimal flexibility of aggregators in distribution systems, a crucial step in utilizing the capacity of distributed energy resources and flexible loads and in helping transmission systems operate more efficiently and reliably. We show that the aggregate flexibility of a group of devices with uncertainties and non-convex models can be captured with a quadratic classifier, and that this classifier can be used to design a virtual battery model that best describes the aggregate flexibility.

For power system analysis and planning, we address fast probabilistic hosting capacity analysis (PHCA), which studies how DERs and the intermittency they bring can impact power grid operation in the long term. We show that interconnection studies can be sped up by a factor of 20 without losing accuracy. By formulating a penalized optimal power flow (OPF), we pose PHCA as an instance of multiparametric programming (MPP) and then leverage the properties of MPP to efficiently solve a large number of OPFs.

Regarding planning in power systems, we tackle the problem of strategic investment in energy markets, utilizing the toolbox of multiparametric programming to develop two algorithms for strategic investment. Our MPP-aided grid search algorithm is useful when the investor is considering only a few locations, while our MPP-aided gradient descent algorithm is useful for investing in a large number of locations. We next present a data-driven approach to finding the flexibility of aggregators in power systems. Finding aggregate flexibility is an important step in utilizing the full potential of smart and controllable loads in the power grid, and it is challenging because an aggregator controls a large group of time-coupled devices that operate with non-convex models and are subject to random externalities. We show that the aggregate flexibility can be accurately captured with an ellipsoid, and then use Farkas' lemma to fit a maximal-volume polytope inside that ellipsoid. Numerical tests show that this approach captures 10 times the volume that conventional virtual generator models capture.
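As a concrete illustration of the virtual battery abstraction mentioned above (a sketch under assumed limits, not the quadratic-classifier fitting procedure developed in the dissertation), an aggregate power trajectory can be screened against a single power limit and a single energy limit:

```python
import numpy as np

# Hedged sketch: a "virtual battery" summarises an aggregator's flexibility by a power
# limit p_max (kW) and an energy limit e_max (kWh); a trajectory is feasible if it
# respects both at every step. The limits and profile below are assumed values.
def virtual_battery_feasible(p, p_max, e_max, dt=1.0):
    """p: aggregate power profile in kW (positive = charging), dt: step length in hours."""
    energy = np.cumsum(p) * dt  # running stored energy in kWh
    return bool(np.all(np.abs(p) <= p_max) and np.all((energy >= 0.0) & (energy <= e_max)))

# Example: a 4-step aggregate profile tested against a 50 kW / 100 kWh virtual battery.
profile = np.array([30.0, 40.0, -20.0, 10.0])
print(virtual_battery_feasible(profile, p_max=50.0, e_max=100.0))  # True
```

An aggregator could then report the pair (p_max, e_max) upstream as a compact, tractable description of the flexibility of its devices.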
/ Doctor of Philosophy / With the integration of renewable and distributed energy resources (DERs) and advances in metering infrastructure, power systems are undergoing rapid modernization that brings forward new challenges and possibilities, which call for more advanced learning, analysis, and planning tools. While there are numerous problems present in the modern power grid, this work addresses four of the most prominent challenges and shows how new advances in generation and metering can be leveraged to address the challenges they introduce. With regard to learning in power systems, we first tackle power distribution system topology identification, since knowing the topology of the power grid is a crucial input to any meaningful optimization and control task. We next tackle the challenge of finding the optimal flexibility of aggregators in distribution systems, a crucial step in utilizing the capacity of distributed energy resources and flexible loads and in helping transmission systems operate more efficiently and reliably. For power system analysis and planning, we address fast probabilistic hosting capacity analysis (PHCA), which studies how DERs and the intermittency they bring can impact power grid operation in the long term. We show that interconnection studies can be sped up by a factor of 20 without losing accuracy. Regarding planning in power systems, we tackle the problem of strategic investment in energy markets, utilizing the toolbox of multiparametric programming to develop two algorithms for strategic investment. We next present a data-driven approach to finding the flexibility of aggregators in power systems. Finding aggregate flexibility is an important step in utilizing the full potential of smart and controllable loads in the power grid, and it is challenging because an aggregator controls a large group of time-coupled devices that operate with non-convex models and are subject to random externalities.
|
3 |
Transient stability-constrained load dispatch, ancillary services allocation and transient stability assessment procedures for secure power system operation
Karimishad, Amir January 2008 (has links)
[Truncated abstract] The present thesis is devoted to the development of new methods for transient stability-constrained optimal power flow, probabilistic transient stability assessment and security-constrained ancillary services allocation. The key objective of the thesis is to develop novel dispatch and assessment methods for power system operation in the new environment of electricity markets, to ensure power system security and in particular transient stability. A new method is developed for economic dispatch together with nodal price calculations; it includes transient stability constraints and, at the same time, optimises the reference inputs to Flexible AC Transmission System (FACTS) devices to maintain power system transient stability and reduce nodal prices. The method draws on sensitivity analysis of time-domain transient stability simulation results to derive a set of linearised stability constraints expressed in terms of generator active powers and FACTS device input references. '...' The low computing time requirement of the two-point estimate method allows online applications, while the use of a detailed power system dynamic model for the time-domain simulation offers high accuracy. The two-point estimate method is integrated in a straightforward manner with existing transient stability analysis tools. The integrated software facility has potential applications in control rooms to assist the system operator in the decision-making process based on instability risks. The software system, when implemented on a cluster of processors, also makes it feasible to re-assess transient stability online for any change in system configuration arising from switching control. The proposed method has been tested on a representative power system and validated using Monte Carlo simulation. In conjunction with the energy market, in which forecasted load demand is met by generator dispatch, ancillary services are required to provide the control needed for secure system operation and power quality. The final part of the thesis focuses on the key aspect of allocating these ancillary services, subject to the important constraint that the dispatch of the ancillary services must not impair the system security achieved in the load dispatch. With this focus and requirement, the thesis develops a new dispatch formulation in which the network security constraints are represented in the optimal determination of the generator active power schedule and the allocation of ancillary services. Contingencies considered include power demand variations at individual load nodes from the values specified for the current dispatch calculation. The required changes in generator active powers to meet the new load demands are represented by additional control variables in the new dispatch formulation, which augment the variables in the traditional OPF dispatch calculation. Based on the Lagrange function, which includes the extended set of security constraints, the formulation derives the optimality condition to be satisfied by the dispatch solution, together with the marginal prices for individual ancillary service providers and the LMPs. The effects of the security constraints are investigated and discussed. Case studies for representative power systems are presented to verify the new dispatch calculation procedure.
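For readers unfamiliar with the two-point estimate method, the sketch below shows the symmetric (zero-skewness) form of Hong's 2m-point scheme applied to a placeholder stability index; the index function and the input statistics are assumptions standing in for the thesis's detailed time-domain simulations.

```python
import numpy as np

# Hedged sketch of the symmetric two-point estimate method: for m uncertain inputs,
# evaluate the model at the two points mu_i +/- sqrt(m)*sigma_i per input (2m runs in
# total, each with weight 1/(2m)) and accumulate the first two moments of the output.
def two_point_estimate(func, means, stds):
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    m = len(means)
    weight = 1.0 / (2 * m)
    e1 = e2 = 0.0
    for i in range(m):
        for sign in (+1.0, -1.0):
            x = means.copy()
            x[i] += sign * np.sqrt(m) * stds[i]  # concentration point for input i
            y = func(x)
            e1 += weight * y        # first moment
            e2 += weight * y ** 2   # second moment
    return e1, e2 - e1 ** 2         # approximate mean and variance of the output

# Placeholder stability index (hypothetical): a smooth function of two uncertain loads in MW.
stability_index = lambda loads: 0.35 - 0.001 * loads[0] - 0.0005 * loads[1]
mean_idx, var_idx = two_point_estimate(stability_index, means=[120.0, 80.0], stds=[10.0, 8.0])
print(mean_idx, var_idx)
```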
|
4 |
Transparent Decision Support Using Statistical Evidence
Hamilton-Wright, Andrew January 2005 (has links)
An automatically trained, statistically based, fuzzy inference system that functions as a classifier is produced. The hybrid system is designed specifically to be used as a decision support system. This hybrid system has several features which are of direct and immediate utility in the field of decision support, including a mechanism for the discovery of domain knowledge in the form of explanatory rules through the examination of training data; the evaluation of such rules using a simple probabilistic weighting mechanism; the incorporation of input uncertainty using the vagueness abstraction of fuzzy systems; and the provision of a strong confidence measure to predict the probability of system failure.

Analysis of the hybrid fuzzy system and its constituent parts allows commentary on the weighting scheme and performance of the "Pattern Discovery" system on which it is based.

Comparisons against other well known classifiers provide a benchmark of the performance of the hybrid system as well as insight into the relative strengths and weaknesses of the compared systems when functioning within continuous and mixed data domains.

Classifier reliability and confidence in each labelling are examined, using a selection of both synthetic data sets as well as some standard real-world examples.

An implementation of the work-flow of the system when used in a decision support context is presented, and the means by which the user interacts with the system is evaluated.

The final system performs, when measured as a classifier, comparably well or better than other classifiers. This provides a robust basis for making suggestions in the context of decision support.

The adaptation of the underlying statistical reasoning made by casting it into a fuzzy inference context provides a level of transparency which is difficult to match in decision support. The resulting linguistic support and decision exploration abilities make the system useful in a variety of decision support contexts.

Included in the analysis are case studies of heart and thyroid disease data, both drawn from the University of California, Irvine Machine Learning repository.
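As an illustration of a simple probabilistic rule weight of the kind described (the exact Pattern Discovery weighting used in the thesis may differ), the sketch below computes a smoothed weight of evidence that a discovered pattern lends to a class label, estimated from co-occurrence counts.

```python
import math

# Hedged sketch: weight of evidence = log posterior odds of the class given the pattern
# minus log prior odds of the class, with additive smoothing eps to keep the logs finite.
# The counts in the example are made up for illustration.
def weight_of_evidence(n_xy, n_x, n_y, n, eps=0.5):
    """n_xy: records with pattern x and class y, n_x: records with x,
    n_y: records with y, n: total records."""
    p_y_given_x = (n_xy + eps) / (n_x + 2 * eps)  # P(y | x)
    p_y = (n_y + eps) / (n + 2 * eps)             # P(y)
    posterior_odds = p_y_given_x / (1.0 - p_y_given_x)
    prior_odds = p_y / (1.0 - p_y)
    return math.log(posterior_odds / prior_odds)

# Example: the pattern occurs in 40 of 200 training records, 30 of those with class y,
# and class y occurs 70 times overall; a positive weight means x supports y.
print(weight_of_evidence(n_xy=30, n_x=40, n_y=70, n=200))
```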
|
5 |
Genetics, drugs, and cognitive control: uncovering individual differences in substance dependence
Baker, Travis Edward 11 September 2012 (has links)
Why is it that only some people who use drugs actually become addicted? In fact, addiction depends on a complicated process involving a confluence of risk factors related to biology, cognition, behaviour, and personality. Notably, all addictive drugs act on a neural system for reinforcement learning called the midbrain dopamine system, which projects to and regulates the brain's system for cognitive control, namely the frontal cortex and basal ganglia. Further, the development and expression of the dopamine system are determined in part by genetic factors that vary across individuals, such that dopamine-related genes are partly responsible for addiction proneness. Taken together, these observations suggest that the cognitive and behavioural impairments associated with substance abuse result from the impact of disrupted dopamine signals on frontal brain areas involved in cognitive control: by acting on the abnormal reinforcement learning system of the genetically vulnerable, addictive drugs hijack the control system to reinforce maladaptive drug-taking behaviours.
The goal of this research was to investigate this hypothesis by conducting a series of experiments that assayed the integrity of the dopamine system and its neural targets involved in cognitive control and decision making in young adults, using a combination of electrophysiological, behavioural, and genetic assays together with surveys of substance use and personality. First, this research demonstrated that substance-dependent individuals produce an abnormal reward positivity, an electrophysiological measure of a cortical mechanism for dopamine-dependent reward processing and cognitive control, and behave abnormally on a decision-making task that is diagnostic of dopamine dysfunction. Second, several dopamine-related neural pathways underlying individual differences in substance dependence were identified and modeled, providing a theoretical framework for bridging the gap between genes and behaviour in drug addiction. Third, the neural mechanisms that underlie individual differences in decision-making function and dysfunction were identified, revealing possible risk factors in the decision-making system. In sum, these results illustrate how future interventions might be individually tailored to specific genetic, cognitive, and personality profiles. / Graduate
|