521

Three material decomposition in dual energy CT for brachytherapy using the iterative image reconstruction algorithm DIRA : Performance of the method for an anthropomorphic phantom

Westin, Robin January 2013 (has links)
Brachytherapy is radiation therapy performed by placing a radiation source near or inside a tumor. Doses calculated with the current water-based brachytherapy dose formalism (TG-43) and with new model-based dose calculation algorithms (MBDCAs) can differ by more than a factor of 10. There is therefore a need for voxel-by-voxel cross-section assignment; ideally, both the tissue composition and the mass density of every voxel should be known for individual patients. A method for determining tissue composition via three-material decomposition (3MD) from dual-energy CT scans was developed at Linköping University. The method, named DIRA, is a model-based iterative reconstruction algorithm that uses two photon energies for image reconstruction and 3MD for quantitative tissue classification of the reconstructed volumetric dataset. This thesis investigated the accuracy of the 3MD method applied to prostate tissue in an anthropomorphic phantom when two different approximations of soft tissues were used in DIRA. The distributions of CT numbers for soft tissues in a contemporary dual-energy CT scanner were also determined, and it was investigated whether these distributions can be used for tissue classification of soft tissues via thresholding. It was found that the relative errors of the mass energy absorption coefficient (MEAC) and the linear attenuation coefficient (LAC) of the approximated mixture, as functions of photon energy, were less than 6% in the energy region from 1 keV to 1 MeV. DIRA thus performed well for the selected anthropomorphic phantom and was relatively insensitive to the choice of base materials for the approximation of soft tissues. The distributions of CT numbers of liver, muscle and kidney tissues overlapped; for example, a voxel containing muscle could be misclassified as liver in 42 cases out of 100. This suggests that pure thresholding is insufficient as a method for tissue classification of soft tissues and that more advanced methods should be used.
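The core of a three-material decomposition can be illustrated with a small, self-contained sketch (not DIRA's actual implementation; the attenuation values used below are hypothetical): given the measured linear attenuation coefficients of a voxel at the two energies and the known LACs of three base materials, the three fractions follow from a 3x3 linear system whose last equation forces the fractions to sum to one.

```python
def decompose_three_materials(mu_low, mu_high, base_low, base_high):
    """Solve for the three base-material fractions (w1, w2, w3) that
    reproduce the measured LACs at both energies and sum to one.
    base_low / base_high are 3-tuples of base-material LACs at the
    low and high scan energies."""
    # Linear system A w = b, solved with Cramer's rule (3x3).
    A = [list(base_low), list(base_high), [1.0, 1.0, 1.0]]
    b = [mu_low, mu_high, 1.0]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    weights = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]          # replace one column with b
        weights.append(det3(Ac) / d)
    return tuple(weights)
```

With well-chosen base materials the system is well-conditioned; base materials with near-proportional attenuation at the two energies make it ill-conditioned, which is one reason the choice of base materials matters for the decomposition.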
522

Service Offering Uncertainty Analysis Tool

Gomolka, Beth January 2009 (has links)
Companies that seek to provide services in addition to products face many business issues, as service offerings differ from product offerings in important ways. One factor that must be considered in service offerings is time: services are delivered over an extended period, creating a distinctive relationship between the customer and the service provider, whereas for product offerings the point of sale is usually the end of the provider-customer relationship. The added time dimension brings uncertainty, since service contracts cover a future period in which conditions are unknown. This thesis examined the types of uncertainty that are important to service offerings, especially in the manufacturing industry. These uncertainties shape how service offering contracts are constructed, as they affect the profit and costs of the service provider. The three types of uncertainty that were examined were product malfunction uncertainty, service delivery uncertainty, and customer requirement uncertainty. Using these three types of uncertainty, mathematical models were constructed to represent the cost and revenue of different contract types, which were identified through a case study with a product manufacturer in Sweden. Probability distributions for modelling the three types of uncertainty were selected based on a literature review. The mathematical models were then used to construct a software program, the uncertainty simulator tool, which service contract designers can use to model how uncertainties affect cost and revenue in their contracts.
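As a hypothetical illustration of how the three uncertainty types can be combined in a contract model — the distributions, parameters, and function below are invented for this sketch, not taken from the thesis or its tool — a Monte Carlo estimate of expected profit for a fixed-fee service contract might look like:

```python
import random

def simulate_contract_profit(fixed_fee, n_years, failure_rate_per_year,
                             repair_cost_mean, repair_cost_sd,
                             extra_request_prob, extra_request_cost,
                             n_trials=10_000, seed=42):
    """Monte Carlo sketch: mean profit of a fixed annual-fee service
    contract under three uncertainties (assumed distributions)."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        cost = 0.0
        for _ in range(n_years):
            # product malfunction uncertainty: binomial thinning as a
            # Poisson-like count of failures per year
            failures = sum(1 for _ in range(100)
                           if rng.random() < failure_rate_per_year / 100)
            # service delivery uncertainty: normally distributed repair
            # cost per visit, truncated at zero
            cost += sum(max(0.0, rng.gauss(repair_cost_mean, repair_cost_sd))
                        for _ in range(failures))
            # customer requirement uncertainty: occasional extra requests
            if rng.random() < extra_request_prob:
                cost += extra_request_cost
        profits.append(fixed_fee * n_years - cost)
    return sum(profits) / n_trials
```

Running many trials like this lets a contract designer compare fee levels or contract lengths under the same uncertainty assumptions, which is the role the thesis assigns to the uncertainty simulator tool.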
523

Parameter Estimation and Uncertainty Analysis of Contaminant First Arrival Times at Household Drinking Water Wells

Kang, Mary January 2007 (has links)
Exposure assessment, which is an investigation of the extent of human exposure to a specific contaminant, must include estimates of the duration and frequency of exposure. For a groundwater system, the duration of exposure is controlled largely by the arrival time of the contaminant of concern at a drinking water well. This arrival time, which is normally estimated using groundwater flow and transport models, can have a range of possible values due to the uncertainties that are typically present in real problems. Earlier arrival times generally represent low-likelihood events, but they play a crucial role in a decision-making process that must be conservative and precautionary, especially when evaluating the potential for adverse health impacts. Therefore, an emphasis must be placed on the accuracy of the leading tail region of the likelihood distribution of possible arrival times. To demonstrate an approach to quantifying the uncertainty of arrival times, a real contaminant transport problem is used: TCE contamination due to releases from the Lockformer Company facility in Lisle, Illinois. The approach used in this research consists of two major components: inverse modelling (parameter estimation) and uncertainty analysis. The parameter estimation process for this case study was designed around the insufficiencies of the model and the observational data, which stem from errors, biases, and limitations. Because the purpose of the process is to aid in characterising uncertainty, it includes many possible variations in an attempt to minimize assumptions. A preliminary investigation was conducted using a well-accepted parameter estimation tool, PEST, and the corresponding findings were used to define the characteristics of the parameter estimation process applied to this case study.
Numerous objective functions, including the well-known L2-estimator, robust estimators (L1-estimators and M-estimators), penalty functions, and deadzones, were incorporated in the parameter estimation process to treat specific insufficiencies. The concept of equifinality was adopted, and multiple maximum likelihood parameter sets were accepted if pre-defined physical criteria were met. For each objective function, three procedures were implemented as part of the parameter estimation approach for the given case study: a multistart procedure, a stochastic search using the Dynamically Dimensioned Search (DDS) algorithm, and a test for acceptance based on the predefined physical criteria. The best performance, in terms of the ability of parameter sets to satisfy the physical criteria, was achieved using a Cauchy M-estimator that was modified for this study and designated the LRS1 M-estimator. Due to uncertainties, multiple parameter sets obtained with the LRS1 M-estimator, the L1-estimator, and the L2-estimator are recommended for use in uncertainty analysis. Penalty functions had to be incorporated into the objective function definitions to generate a sufficient number of acceptable parameter sets; in contrast, deadzones proved to produce negligible benefits. The characteristics of the parameter sets were examined in terms of frequency histograms and plots of parameter value versus objective function value to infer the nature of the likelihood distributions of the parameters. The correlation structure was estimated using Pearson's product-moment correlation coefficient. After the multistart procedure, the parameters are generally distributed uniformly or appear to vary randomly, with few correlations in the parameter space. The execution of the search procedure introduces many correlations and produces parameter distributions that appear to follow lognormal, normal, or uniform distributions.
The application of the physical criteria refines the parameter characteristics in the parameter space resulting from the search procedure by reducing anomalies. The combined effect of optimization and the application of the physical criteria acts as a behavioural threshold by removing parameter sets with high objective function values. Uncertainty analysis is performed with parameter sets obtained through two different sampling methodologies: Monte Carlo sampling, which draws randomly and independently from user-defined distributions, and the physically-based DDS-AU (P-DDS-AU) sampling methodology, which is developed from the multiple parameter sets acquired during the parameter estimation process. Monte Carlo samples are found to be inadequate for uncertainty analysis of this case study due to the method's inability to find parameter sets that meet the predefined physical criteria. Successful results are achieved using the P-DDS-AU sampling methodology, which inherently accounts for parameter correlations and does not require assumptions regarding parameter distributions. For the P-DDS-AU samples, uncertainty representation is performed using four definitions based on pseudo-likelihoods: two based on the Nash-Sutcliffe efficiency criterion, and two based on inverse error or residual variance. The definitions contain shaping factors that strongly affect the resulting likelihood distribution, and some definitions are also affected by the objective function definition. Therefore, all variations are considered in the development of likelihood distribution envelopes, which are designed to maximize the amount of information available to decision-makers. The considerations that are important to the creation of an uncertainty envelope are outlined in this thesis. In general, greater uncertainty appears to be present at the tails of the distribution.
For refinement of the uncertainty envelopes, the application of additional physical criteria is recommended. The selection of likelihood and objective function definitions and their properties should be made based on the needs of the problem; therefore, preliminary investigations should always be conducted to provide a basis for selecting appropriate methods and definitions. It is imperative to remember that communicating the assumptions and definitions used in both parameter estimation and uncertainty analysis is crucial in decision-making scenarios.
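The DDS algorithm used in the stochastic search step can be sketched in a few lines (a minimal reading of the published Dynamically Dimensioned Search of Tolson and Shoemaker, not the thesis code): a greedy single-solution search that perturbs the current best parameter set, including fewer and fewer dimensions in each perturbation as the evaluation budget is spent.

```python
import math
import random

def dds(objective, bounds, max_iter=1000, r=0.2, seed=1):
    """Minimal Dynamically Dimensioned Search (DDS) sketch.
    bounds is a list of (low, high) pairs; r scales the perturbation
    standard deviation relative to each parameter's range."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    best = [rng.uniform(l, h) for l, h in zip(lo, hi)]
    best_f = objective(best)
    for i in range(1, max_iter):
        # probability of perturbing each dimension shrinks over time
        p = 1.0 - math.log(i) / math.log(max_iter)
        dims = [d for d in range(len(bounds)) if rng.random() < p]
        if not dims:
            dims = [rng.randrange(len(bounds))]   # always move something
        cand = best[:]
        for d in dims:
            cand[d] += rng.gauss(0.0, r * (hi[d] - lo[d]))
            # reflect at the bounds, then clamp as a safeguard
            if cand[d] < lo[d]:
                cand[d] = lo[d] + (lo[d] - cand[d])
            if cand[d] > hi[d]:
                cand[d] = hi[d] + (hi[d] - cand[d])
            cand[d] = min(max(cand[d], lo[d]), hi[d])
        f = objective(cand)
        if f <= best_f:                            # greedy acceptance
            best, best_f = cand, f
    return best, best_f
```

In the thesis workflow, `objective` would be one of the estimators described above (L1, L2, or the LRS1 M-estimator) evaluated through the transport model, and accepted solutions would still have to pass the physical criteria.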
524

Preference Uncertainty and Trust in Decision Making

Al-Mutairi, Mubarak 23 March 2007 (has links)
A fuzzy approach for handling uncertain preferences is developed within the paradigm of the Graph Model for Conflict Resolution, and new advances in trust modeling and assessment are put forward for permitting decision makers (DMs) to decide with whom to cooperate and whom to trust in order to move from a potential resolution to a more preferred one that is not attainable on an individual basis. The applicability and usefulness of the fuzzy preference and trust research for providing an enhanced strategic understanding of a dispute and its possible resolution are demonstrated by employing a real-world environmental conflict as well as two generic games that represent a wide range of real-life encounters involving trust and cooperation dilemmas. The introduction of the uncertain preference representation extends the applicability of the Graph Model for Conflict Resolution to conflicts with missing or incomplete preference information. Assessing the presence of trust helps to compensate for the missing information and bridge the gap between a desired outcome and a feared betrayal. These advances in the areas of uncertain preferences and trust have potential applications in engineering decision making, electronic commerce, multiagent systems, international trade, and many other areas where conflict is present. In order to model a conflict, it is assumed that the decision makers, their options, and their preferences over the possible states are known. However, it is often the case that the preferences are not known for certain. This could be due to lack of information, impreciseness, or misinformation intentionally supplied by a competitor. Fuzzy logic is applied to handle this type of information: in particular, it allows a decision maker to express preferences using linguistic terms rather than exact values.
It also makes use of data intervals rather than crisp values, which accommodates minor shifts in values without drastically changing the overall results. The four solution concepts of Nash stability, general metarationality, symmetric metarationality, and sequential stability, used for determining stability and potential resolutions of a conflict, are extended to accommodate the new fuzzy preference representation. The newly proposed solution concepts are designed to work for conflicts with two or more decision makers. Hypothetical and real-life conflicts are used to demonstrate the applicability of the proposed procedure. Upon reaching a resolution, it might be in the best interests of some of the decision makers to cooperate and form a coalition in order to move from the current resolution to a better one that is not achievable on an individual basis. This may require moving through an intermediate state or states that may be less preferred by some coalition members and more preferred by others, compared with the original or the final state. When the move is irreversible, as is the case in most real-life situations, this requires a minimum level of trust to remove any fear of betrayal. The development of trust modeling and assessment techniques allows decision makers to decide with whom to cooperate and whom to trust. Illustrative examples are developed to show how this modeling works in practice. The new theoretical developments presented in this research enhance the applicability of the Graph Model for Conflict Resolution. The proposed trust modeling provides a reasonable way of analyzing and predicting the formation of coalitions in conflict analysis and cooperative game theory. It also opens doors for further research and development in trust modeling in areas such as electronic commerce and multiagent systems.
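The crisp (non-fuzzy) starting point of the stability analysis is easy to sketch. The example below checks Nash stability for a hypothetical two-decision-maker conflict in Prisoner's Dilemma form; the fuzzy extension described above would replace the exact preference comparison with a fuzzy preference degree, which is not shown here.

```python
def nash_stable_states(states, moves, prefs):
    """Crisp Nash stability sketch for a graph model of conflict.
    moves[dm][s] -> list of states DM can reach unilaterally from s
    prefs[dm][s] -> ordinal payoff for DM at state s (higher = better)
    A state is Nash stable for a DM if no unilateral move improves it;
    an equilibrium is a state that is stable for every DM."""
    def stable_for(dm, s):
        return all(prefs[dm][t] <= prefs[dm][s] for t in moves[dm][s])

    return [s for s in states
            if all(stable_for(dm, s) for dm in moves)]
```

For a Prisoner's Dilemma encoded this way (states CC, CD, DC, DD, each DM able to flip only its own action), the only Nash-stable equilibrium is mutual defection (DD), which is exactly the kind of individually stable but jointly inferior outcome that the trust and coalition analysis above aims to move away from.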
525

Petroleum Refining and Petrochemical Industry Integration and Coordination under Uncertainty

Alqahtani, Khalid January 2009 (has links)
Petroleum refining and the petrochemical industry account for a major share of the world energy and industrial markets; in many situations, they represent the economic backbone of industrial countries. Today, the volatile market environment and continuous changes in customer requirements create constant pressure to seek opportunities that properly align and coordinate the different components of the industry. In particular, the coordination and integration of petroleum refining and the petrochemical industry are gaining a great deal of interest. However, previous research in the field has either studied the two systems in isolation or assumed limited interactions between them. The aim of this thesis is to provide a framework for the planning, integration and coordination of multisite refinery and petrochemical networks using appropriate deterministic, stochastic and robust optimization techniques. The contributions of this dissertation fall into three categories: a) multisite refinery planning, b) petrochemical industry planning, and c) integration and coordination of multisite refinery and petrochemical networks. The first part of this thesis tackles the integration and coordination of a multisite refinery network. We first address the design and analysis of multisite integration and coordination strategies within a network of petroleum refineries through a mixed-integer linear programming (MILP) technique. The integrated network design specifically addresses intermediate material transfer between processing units at each site. The proposed model is then extended to account for model uncertainty by means of two-stage stochastic programming. Parameter uncertainty was considered in the coefficients of the objective function and the right-hand-side parameters of the inequality constraints.
Robustness is analyzed in terms of both model robustness and solution robustness, where each measure is assigned a scaling factor to analyze the sensitivity of the refinery plan and the integration network to variations. The proposed technique makes use of the sample average approximation (SAA) method with statistical bounding techniques to give insight into the sample size required for an adequate approximation of the problem. The second part of the thesis addresses the strategic planning, design and optimization of a network of petrochemical processes. We first set up and give an overview of the deterministic version of the petrochemical industry planning model adopted in this thesis. We then extend the model to address the strategic planning, design and optimization of a network of petrochemical processes under uncertainty and robust considerations. As in the previous part, robustness is analyzed in terms of both model robustness and solution robustness. Parameter uncertainty considered in this part includes process yield, raw material and product prices, and lower-level product market demand. The Expected Value of Perfect Information (EVPI) and the Value of the Stochastic Solution (VSS) are also investigated to numerically illustrate the value of including the randomness of the different model parameters. The final part of this dissertation addresses the integration between the multisite refinery system and the petrochemical industry. We first develop a framework for the design and analysis of possible integration and coordination strategies of multisite refinery and petrochemical networks to satisfy given petroleum and chemical product demand. The main feature of the work is the development of a methodology for the simultaneous analysis of process network integration within a multisite refinery and petrochemical system. We then extend the integration problem to consider different sources of uncertainty in the model parameters: imported crude oil price, refinery product price, petrochemical product price, refinery market demand, and petrochemical lower-level product demand. We apply the SAA method within an iterative scheme to generate the required scenarios and assess solution quality by measuring the optimality gap of the final solution.
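The SAA idea can be illustrated on a deliberately tiny stand-in problem (a one-variable capacity choice, nothing like the full multisite MILP; all numbers are hypothetical): replace the expectation in the objective with an average over sampled demand scenarios, then optimize that approximation.

```python
import random

def saa_capacity_choice(price, cost, demand_sampler, n_scenarios=2000,
                        candidates=range(0, 201), seed=7):
    """Sample average approximation (SAA) sketch for a two-stage
    decision: fix capacity x here-and-now, observe demand d, then
    sell min(x, d).  The true expected profit is replaced by an
    average over n_scenarios sampled demands."""
    rng = random.Random(seed)
    scenarios = [demand_sampler(rng) for _ in range(n_scenarios)]

    def avg_profit(x):
        # sample-average of second-stage profit for first-stage choice x
        return sum(price * min(x, d) - cost * x for d in scenarios) / n_scenarios

    return max(candidates, key=avg_profit)
```

In the full SAA scheme described above, this sample-average problem is solved repeatedly with independent scenario batches, and statistical lower/upper bounds on the optimality gap indicate whether the sample size is adequate.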
526

Uncertainty Analysis and the Identification of the Contaminant Transport and Source Parameters for a Computationally Intensive Groundwater Simulation

Yin, Yong January 2009 (has links)
Transport parameter estimation and contaminant source identification are critical steps in the development of a physically based groundwater contaminant transport model. Due to the irreversibility of the dispersion process, the calibration of a transport model is inherently ill-posed and very sensitive to the simplifications employed in the development of lumped models. In this research, a methodology for the calibration of physically based, computationally intensive transport models was developed and applied to a case study, the Reich Farm Superfund site in Toms River, New Jersey. Using HydroGeoSphere, a physically based, transient, three-dimensional, computationally intensive groundwater flow model with spatially and temporally varying recharge was developed. Because of convergence issues in implementing the saturation versus permeability curve (the van Genuchten equation) for large-scale models with coarse discretization, a novel flux-based method was developed to determine solutions for the unsaturated zone in soil-water-retention models. The parameters for the flow system were determined separately from the parameters for the contaminant transport model. The contaminant transport and source parameters were estimated using both approximately 15 years of TCE concentration data from continuous well records and approximately 30 years of data from traditional monitoring wells, and the estimates were compared using optimization with two heuristic search algorithms (DDS and MicroGA) and a gradient-based multi-start PEST. The calibration results indicate that, overall, multi-start PEST performs best in terms of the final best objective function values for an equal number of function evaluations. Multi-start PEST was also employed to identify contaminant transport and source parameters under different scenarios, including spatially and temporally varying recharge and averaged recharge. For the detailed, transient flow model with spatially and temporally varying recharge, the transverse dispersivity coefficients were estimated to be significantly smaller than those reported in the literature for the more traditional approach that uses steady-state flow with averaged, less physically based recharge values. Finally, a methodology for comprehensive uncertainty analysis based on Latin Hypercube sampling, which accounts for multiple parameter sets and the associated correlations, was developed and applied to the case study.
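Latin Hypercube sampling, the basis of the uncertainty analysis above, can be sketched in pure Python (a generic implementation, not the thesis code): each parameter range is split into equal-probability strata, one value is drawn per stratum, and the strata are shuffled independently for each parameter so that every one-dimensional projection is evenly covered.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin Hypercube sampling sketch: returns n_samples points, each a
    list with one value per (low, high) pair in bounds, such that each
    parameter's range is stratified into n_samples equal bins with
    exactly one draw per bin."""
    rng = random.Random(seed)
    samples = [[0.0] * len(bounds) for _ in range(n_samples)]
    for j, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)                 # random bin order per parameter
        for i, k in enumerate(strata):
            u = (k + rng.random()) / n_samples   # uniform draw inside bin k
            samples[i][j] = lo + u * (hi - lo)
    return samples
```

This basic form assumes independent parameters; handling the parameter correlations emphasized in the thesis requires an extra step (e.g. rank-correlation induction) on top of the stratified draws.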
527

Stability and Performance for Two Classes of Time-Varying Uncertain Plants

Vale, Julie January 2009 (has links)
In this thesis, we consider plants with uncertain, possibly time-varying parameters; we show that, under reasonable assumptions, we can design a controller that stabilizes such systems while providing near-optimal performance in the face of persistent discontinuities in the time-varying parameters. We consider two classes of uncertainty. The first class is modeled via a (possibly infinite) set of linear time-invariant plants: the uncertain time variation consists of unpredictable (but sufficiently slow) switches between those plants. We consider standard LQR performance and, in the case of a finite set of plants, the more complicated problem of LQR step tracking. Our second class is a time-varying gain margin problem: we consider a reasonably general, uncertain, time-varying function at the input of an otherwise linear time-invariant nominal plant. In this second context, we consider the tracking problem wherein the signal to be tracked is modeled by a (stable) filter at the exogenous input, and we measure performance via a weighted sensitivity function. The controllers are periodic and mildly nonlinear, except that the controller for the second class is linear.
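A scalar toy version of the first problem class conveys the flavour of the setup (this is only an illustration under simplified assumptions; the thesis controllers are periodic and more sophisticated): design an LQR gain for a nominal plant and check that the same gain keeps the closed loop contractive for a family of nearby plants, so switches among them are tolerated.

```python
def scalar_dlqr(a, b, q, r, iters=500):
    """Discrete-time LQR sketch for the scalar plant x[k+1] = a*x + b*u
    with cost sum(q*x^2 + r*u^2): iterate the Riccati recursion to a
    fixed point p, then return the feedback gain k (control u = -k*x)."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

def stabilizes_family(k, a_values, b):
    """Check one gain k against a family of uncertain plants: the closed
    loop x[k+1] = (a - b*k)*x must be a contraction for every a."""
    return all(abs(a - b * k) < 1.0 for a in a_values)
```

In the scalar case a common contraction immediately handles switching; the substance of the thesis is that comparable stability and near-optimal performance can be guaranteed in far more general settings, where no such simple common argument is available.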
528

Quantifying the Groundwater Component within the Water Balance of a Large Lake in a Glaciated Watershed: Lake Pyhäjärvi, SW Finland

Wiebe, Andrew James January 2012 (has links)
Accurate estimates of the amount of groundwater entering a lake on a yearly basis may provide valuable information for assessing contaminant loadings, such as nutrient mass fluxes and the subsequent contribution of groundwater to eutrophication. Groundwater exchange with lakes is often a critical component of a lake's water balance, yet its quantification has often proven problematic: large component uncertainties can preclude accurate estimation of the groundwater flux, upon which the assessment of contaminant loadings may depend. In this study, water balance techniques for lake systems were assessed at Lake Pyhäjärvi (near Säkylä, SW Finland), a relatively large lake in a long-established agricultural area. A water balance was conducted over 38 water years to estimate the net groundwater discharge into the lake. This was compared with groundwater flux estimates obtained via Darcy's Law for the adjacent Honkala Aquifer in the Kuivalahti-Säkylä tributary esker (a potential conduit for groundwater impacted by agricultural practices). Direct runoff estimates were initially made using an average of the flow-per-unit-area ratios of the two rivers that flow into the lake. These estimates were then adjusted using PART (Rutledge, 2007) hydrograph separation results from the larger river. The adjustments changed the mean net groundwater discharge from -73 to +38 mm per unit lake area (-4.8 to +2.5% of average total inflow) and yielded a better qualitative match with observations at the lake (e.g., Rautio, 2009; Rautio and Korkka-Niemi, 2011). Uncertainty analysis for the water balance indicated that the relative uncertainty in the net groundwater flux ranged from 40 to 2900%, while the average absolute uncertainty was 118 mm per unit lake area. Groundwater discharge estimates based on Darcy's Law were ≤ 22 mm per unit lake area (≤ 1.4% of average total inflow), with sizeable uncertainty (± one order of magnitude). Most of the uncertainty in the net groundwater discharge estimates came from the evaporation, precipitation, and direct runoff components; the esker flux uncertainty was essentially due to error in the hydraulic conductivity estimate. The resolution of the water balance method suggests that it is better suited to lakes with relatively large net groundwater contributions (>5% of average total inflow). The results highlight the following needs for large-lake water balances: improvements in the accuracy of the evaporation, precipitation, and direct runoff estimates, and systematic uncertainty analysis. In the context of understanding nutrient loadings to large lakes, groundwater contributions to inflowing rivers may be more important than direct discharge from highly permeable subsurface materials adjacent to the lake.
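The Darcy's Law estimate for the esker flux reduces to Q = K · i · A (hydraulic conductivity times hydraulic gradient times cross-sectional area). The sketch below, with hypothetical numbers rather than the study's values, shows how an order-of-magnitude range in K propagates directly to the discharge bounds, mirroring the ± one order of magnitude uncertainty noted above.

```python
def darcy_discharge(K, gradient, area):
    """Darcy's Law sketch: volumetric discharge Q = K * i * A,
    with K in m/s, gradient dimensionless, area in m^2 -> Q in m^3/s."""
    return K * gradient * area

def discharge_bounds(K_low, K_high, gradient, area):
    """Propagate a hydraulic-conductivity range (the dominant source of
    esker-flux uncertainty) to lower/upper bounds on Q; gradient and
    area are treated as known for this sketch."""
    return (darcy_discharge(K_low, gradient, area),
            darcy_discharge(K_high, gradient, area))
```

Because Q is linear in K, a K range spanning a factor of 100 yields a Q range spanning the same factor, which is why the conductivity estimate dominates the esker-flux uncertainty.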
529

Hydrologic prediction using pattern recognition and soft-computing techniques

Parasuraman, Kamban 20 August 2007 (has links)
Several studies indicate that data-driven models are potentially useful tools in hydrological modeling. Nevertheless, it is a common perception among researchers and practitioners that the usefulness of system-theoretic models is limited to forecast applications and that they cannot be used as tools for scientific investigation. System-theoretic models are also believed to be less reliable, as they characterize hydrological processes by learning the input-output patterns embedded in the dataset rather than from a strong physical understanding of the system. These concerns need to be addressed before data-driven models can gain wider acceptance among researchers and practitioners.

In this research, different methods and tools that can promote transparency in data-driven models are probed, with the objective of extending the usefulness of data-driven models beyond forecasting to scientific investigation by providing additional insight into the underlying input-output patterns on which such models base their decisions. In this regard, the utility of self-organizing networks (competitive learning and self-organizing maps) in learning the patterns in the input space is evaluated by developing a novel neural network model called the spiking modular neural network (SMNN). The performance of the SMNN is evaluated based on its ability to characterize streamflows and the actual evapotranspiration process. The utility of self-organizing algorithms, namely genetic programming (GP), is also evaluated with regard to promoting transparency in data-driven models. The ability of GP to evolve its own model structure with relevant parameters is illustrated by applying it to characterize the actual evapotranspiration process. The results from this research indicate that self-organization in learning, both in terms of self-organizing networks and self-organizing algorithms, can be adopted to promote transparency in data-driven models.

In pursuit of improving the reliability of data-driven models, different methods for incorporating uncertainty estimates into the model-building exercise are evaluated. Local-scale models are shown to be more reliable than global-scale models in characterizing the saturated hydraulic conductivity of soils. In addition, the importance of model structure uncertainty in geophysical modeling is emphasized by developing a framework that accounts for it. The contribution of model structure uncertainty to the predictive uncertainty of the model is shown to be larger than the uncertainty associated with the model parameters. It is also demonstrated that increasing model complexity may lead to a better fit of the function, but at the cost of an increasing level of uncertainty. It is recommended that the effect of model structure uncertainty be considered when developing reliable hydrological models.
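The competitive-learning idea behind self-organizing input layers can be sketched generically (this is not the SMNN itself): winner-take-all updates move only the prototype closest to each input, so the units self-organize into clusters of the input space without any supervision.

```python
import random

def competitive_learning(data, n_units, lr=0.1, epochs=50, seed=3):
    """Winner-take-all competitive learning sketch: returns n_units
    prototype vectors that self-organize onto clusters in data.
    data is a list of equal-length numeric tuples."""
    rng = random.Random(seed)
    # initialize prototypes on distinct training samples
    protos = [list(p) for p in rng.sample(data, n_units)]
    for _ in range(epochs):
        for x in data:
            # find the winning (closest) prototype ...
            w = min(range(n_units),
                    key=lambda u: sum((a - p) ** 2
                                      for a, p in zip(x, protos[u])))
            # ... and move only that prototype toward the input
            protos[w] = [p + lr * (a - p) for p, a in zip(protos[w], x)]
    return protos
```

Once the prototypes have settled, each one can index a local expert model, which is the sense in which the partitioning makes the overall model's decisions easier to inspect than a single global black box.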
530

Foster courage and equalize power : An action research project in which working tools were designed together with nursing students to help ease living with uncertainty (original Swedish title: Uppamma mod och utjämna makt : Ett aktionsforskningsprojekt där arbetsredskap utformades tillsammans med sjuksköterskestudenter för att kunna lindra för personer att leva med ovisshet)

Norén, Johanna January 2008 (has links)
Uncertainty is a well-known phenomenon that recurs in biographies, imaginative literature and science, but it is less studied in nursing science. The aim of this participatory action research project was to explore, in collaboration with nursing students, nursing activities for relieving patients' uncertainty. A qualitative approach was chosen, using focus groups, and the resulting text was content analyzed. The students were in the third semester of a nursing program. The findings showed two different kinds of uncertainty: existential uncertainty and care-evoked uncertainty. Relieving uncertainty proved to be a process in three steps. The first step is to reach a deeper understanding of what uncertainty is and how it affects patients; the second is to develop sensitivity for discovering uncertainty; and the third is to use uncertainty-relieving strategies. Doing this requires courage and a willingness to share power with the patient. The strategies for relieving existential and care-evoked uncertainty differed.
