201

An optimal framework of investment strategy in brownfields redevelopment by integrating site-specific hydrogeological and financial uncertainties

Yu, Soonyoung January 2009 (has links)
Brownfields redevelopment has been encouraged by governments and the real estate market because of its economic, social and environmental benefits. However, uncertainties in contaminated land redevelopment create substantial investment risk and must be managed if redevelopment is to proceed. This study addresses hydrogeological and economic uncertainty in a hypothetical contaminated land redevelopment project and manages the resulting risk by integrating the two. Hydrogeological uncertainty derives from incomplete site information, including aquifer heterogeneity, and must be assessed with scientific expertise, given the short history of redevelopment projects and the unique hydrogeological characteristics of each site. It has not yet been incorporated into a single framework with economic uncertainty, which is comparatively well observed in financial markets. Two cases of Non-Aqueous Phase Liquid (NAPL) contamination were simulated using a physically based hydrogeological model to address hydrogeological uncertainty: one concerns the effect of an ethanol spill on a light NAPL (LNAPL) contaminated area in the vadose zone; the other concerns the vapour-phase intrusion of volatile organic compounds, in particular trichloroethylene (TCE), a dense NAPL (DNAPL), into indoor air through a variably saturated heterogeneous aquifer. The first simulation replicated experimental observations in the laboratory, such as the capillary fringe depressing and the NAPL pool remobilizing and collecting in a reduced area at higher saturations than observed before the ethanol injection. However, a data gap between the model and the experiment, particularly in the chemical properties, introduced uncertainty into the simulation. The second NAPL simulation was based on a hypothetical scenario in which new dwellings in a redeveloped area are at risk of vapour-phase intrusion from a subsurface source into indoor air because remediation or foundation design might fail. The results indicated that aquifer heterogeneity was the most significant factor controlling the indoor air exposure risk from a TCE source in the saturated zone. The exposure risk was then quantified using Monte Carlo simulations with 50 statistically equivalent heterogeneous aquifer permeability fields. The quantified risk (probability) represents the hydrogeological uncertainty in the scenario and provides information on the loss occurrence intensity of redevelopment failure. The probability of failure (loss occurrence intensity) was integrated with the cost of failure (loss magnitude) to evaluate the risk capital in the hypothetical brownfields redevelopment project. The term "risk capital" is adopted from the financial literature and refers to the capital that can be lost in a high-risk investment. The cost of failure involves economic uncertainty and can be defined by a developer's financial agreement with new dwellers to prevent litigation in the case of certain events, such as indoor air concentrations of pollutants exceeding regulatory limits during periodic inspections. The developer makes such an agreement because the new dwellings were built on incomplete site information, and municipalities may require it as a condition of land use planning approval.
The presumed agreement was that the developer would immediately repurchase the affected houses from the dwellers if indoor air contamination exceeded the regulatory limit, then remediate any remaining contamination, demolish the affected houses and build new ones if doing so were worth the investment. Under this financial plan, stochastic housing prices, inflation rates and interest rates were taken to drive the uncertainty in the cost of failure, with information on these stochastic variables obtained from the financial market, given its long history of observations. This research reviewed risk capital valuation methods that hydrogeologists can apply directly to their projects by integrating the probability of failure (hydrogeological uncertainty) with the cost of failure (economic uncertainty). The risk capital is essentially the probability of failure times the cost of failure, with a safety loading added to compensate investors for hydrogeological and financial uncertainty. Fair market prices of risk capital were valued using financial mathematics and actuarial premium calculations, each method carrying its own safety loading term to reflect investors' level of risk aversion. The results indicated that the price of the risk capital was much more sensitive to hydrogeological uncertainty than to financial uncertainty. Developers can manage the risk capital by setting aside a contingency fund or paying an insurance premium, since the price of this risk capital is the price of a contingent claim, triggered by failure in remediation or foundation design, and is equivalent to an environmental insurance premium if an insurer indemnifies the developer's liability. The optimal framework of investment strategy in brownfields redevelopment is built by linking the assessment and integration of uncertainties with the valuation of the risk capital arising from them. This framework balances the costs associated with each step while maximizing the net profit from land redevelopment. The optimal investment strategy, such as whether or when to remediate or redevelop and to what degree, is the one for which the future price of the land, less time and material costs and the contingency fund or insurance premium, maximizes the net profit.
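As a rough illustration of how the pieces fit together, the following minimal Monte Carlo sketch combines a probability of failure (from simulated permeability fields) with a discounted, stochastic cost of failure and a proportional safety loading. Every number and name in it (exceedance rate, prices, drift, volatility, loading) is a hypothetical assumption for illustration, not a value or method from the dissertation.

```python
# Minimal sketch (not the dissertation's model): risk capital ~ P(failure) x cost of failure,
# plus a proportional safety loading. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n_fields = 50            # statistically equivalent permeability fields, as in the abstract
n_years = 30             # hypothetical planning horizon
house_price_0 = 300_000  # hypothetical current house price
n_houses = 20            # hypothetical number of affected dwellings

# Hydrogeological uncertainty: does indoor-air TCE exceed the regulatory limit for this field?
# A coin flip stands in for the physically based simulation results.
failure = rng.random(n_fields) < 0.2                 # hypothetical 20% exceedance rate
p_fail = failure.mean()

# Economic uncertainty: stochastic housing price path (geometric Brownian motion stand-in),
# discounted back to today to give the repurchase cost if failure occurs.
mu, sigma, r = 0.03, 0.15, 0.04                      # hypothetical drift, volatility, discount rate
t_fail = rng.integers(1, n_years, size=n_fields)     # hypothetical failure year per field
z = rng.standard_normal(n_fields)
price_at_fail = house_price_0 * np.exp((mu - 0.5 * sigma**2) * t_fail + sigma * np.sqrt(t_fail) * z)
cost_of_failure = n_houses * price_at_fail * np.exp(-r * t_fail)

expected_loss = np.mean(failure * cost_of_failure)   # probability-weighted, discounted loss
safety_loading = 0.25                                # hypothetical risk-aversion loading
risk_capital = expected_loss * (1 + safety_loading)

print(f"P(failure) ~ {p_fail:.2f}, risk capital ~ {risk_capital:,.0f}")
```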
202

Contingency Learning and Unlearning in the Blink of an Eye: A Resource Dependent Process

Schmidt, James R January 2009 (has links)
Recent studies show that when words are correlated with the colours they are printed in (e.g., MOVE is presented 75% of the time in blue), colour identification is faster when the word is presented in its expected colour (MOVE in blue) than in an unexpected colour (MOVE in green). The present series of experiments explored the possible mechanisms involved in this colour-word contingency learning effect. Experiment 1 demonstrated that the effect was already present after 18 learning trials. During subsequent unlearning, the effect extinguished equally rapidly, suggesting that only a handful of the most recently encountered trials are used to predict responses. Two reanalyses of data from Schmidt, Crump, Cheesman, and Besner (2007) ruled out an account of the effect in terms of stimulus repetitions. Experiments 2 and 3 demonstrated that participants who carry a memory load do not show a contingency effect, supporting the hypothesis that limited-capacity resources are used to retrieve a small number of trial memories in order to prepare a response. Experiment 4 demonstrated that memory resources are required for both storage and retrieval processes.
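One way to make the "handful of most recently encountered trials" account concrete is a toy resource-limited retrieval model: the predicted colour for a word is whichever colour dominated its last few presentations, so the contingency is both learned and unlearned within a small number of trials. The window size, stimuli, and block structure below are illustrative assumptions, not the authors' model or stimuli.

```python
# Toy sketch of recency-limited contingency learning: predict each word's colour
# from its most recent presentations only. Parameters are illustrative assumptions.
import random
from collections import deque, Counter

random.seed(1)
WINDOW = 8                                           # assumed: only a few recent trials are retrieved
words = {"MOVE": "blue", "SENT": "green"}            # each word's high-contingency colour
colours = ["blue", "green", "yellow"]
history = {w: deque(maxlen=WINDOW) for w in words}   # memory persists across blocks

def run_block(n_trials, contingency):
    """Return proportion of trials on which the retrieved prediction matched the colour."""
    hits = 0
    for _ in range(n_trials):
        word = random.choice(list(words))
        colour = words[word] if random.random() < contingency else random.choice(colours)
        memory = history[word]
        if memory:
            predicted = Counter(memory).most_common(1)[0][0]
            hits += (predicted == colour)
        memory.append(colour)                        # store this trial for later retrieval
    return hits / n_trials

print("learning  :", run_block(40, contingency=0.75))  # accuracy climbs within a few trials
print("unlearning:", run_block(40, contingency=0.0))   # and extinguishes just as quickly
```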
203

Reading the Book of Life: Contingency and Convergence in Macroevolution

Powell, Russell 01 January 2008 (has links)
This dissertation explores philosophical problems in biology, particularly those relating to macroevolutionary theory. It comprises a series of three papers drawn from work that is currently at the publication, re-submission, and review stage of the journal refereeing process, respectively. The first two chapters concern the overarching contours of complex life, while the third zeroes in on the short- and long-term prospects of human evolution.

The rhetorical journey begins with a thought experiment proposed by the late paleontologist Stephen Jay Gould. Gould hypothesized that replaying the "tape of life" would result in radically different evolutionary outcomes, both with respect to animal life in general and the human species in particular. Increasingly, however, biologists and philosophers are pointing to convergent evolution as evidence for replicability and predictability in macroevolution. Chapters 1 and 2 are dedicated to fleshing out the Gouldian view of life and its antithesis, clarifying core concepts of the debate (including contingency, convergence, constraint and causation), and interpreting the empirical data in light of these conceptual clarifications. Chapter 3 examines the evolutionary biological future of the human species, and the ways in which powerful new biotechnologies can shape it, for better and for worse. More detailed chapter summaries are provided below.

In Chapter 1, I critique a book-length excoriation of Gould's contingency theory written by the paleobiologist Simon Conway Morris, in which he amasses and marshals a good bulk of the homoplasy literature in the service of promoting a more robust, counterfactually stable account of macroevolution. I show that there are serious conceptual and empirical difficulties in broadly appealing to the frequency of homoplasy as evidence for robustness in the history of life. Most important is Conway Morris's failure to distinguish between convergent ('externally' constrained) and parallel ('internally' constrained) evolution, and to consider the respective implications of these significantly different sources of homoplasy for a strong adaptationist view of life.

In so doing, I propose a new definition of parallel evolution, one intended to rebut the common charge that parallelism differs from convergence merely in degree and not in kind. I argue that although organisms sharing a homoplastic trait will also share varying degrees of homology (given common descent), it is the underlying developmental homology with respect to the generators directly causally responsible for the homoplastic event that defines parallel evolution and non-arbitrarily distinguishes it from convergence. I make use of the philosophical concept of 'screening-off' to distinguish the proximate generators of a homoplastic trait from its more distal genetic causes (such as conserved master control genes).

In Chapter 2, I critically examine a recent assessment of the contingency debate by the philosopher John Beatty, in which he offers an interpretation of Gould's thesis and argues that it is undermined by iterative ecomorphological evolution. I develop and defend alternative concepts of contingency and convergence, and show how much of the evidence generally held to negate the contingency thesis not only fails to do so, but in fact militates in favor of the Gouldian view of life. My argument once again rests heavily on the distinction between parallelism and convergence, which I elaborate on and defend against a recent assault by developmental biologists, in part by recourse to philosophical work on the ontological prioritization of biological causes.

In Chapter 3, I explore the probable (and improbable) evolutionary biological consequences of intentional germ-line modification, particularly in relation to human beings. A common worry about genetic engineering is that it will reduce the pool of genetic diversity, creating a biological monoculture that could not only increase our susceptibility to disease, but even hasten the extinction of our species. Thus far, however, the evolutionary implications of human genetic modification have remained largely unexplored. In this chapter, I consider whether the widespread use of genetic engineering technology is likely to narrow the present range of genetic variation and, if so, whether this would in fact lead to the evolutionary harms that some authors envision. By examining the nature of biological variation and its relation to population immunity and evolvability, I show not only that genetic engineering will have a negligible impact on human genetic diversity, but that it is more likely to ensure than to undermine the health and longevity of the human species. To this end, I analyze the relationship between genotypic and phenotypic variation, consider process asymmetries between micro- and macroevolution, and investigate the relevance of evolvability to clade-level persistence and extinction.
204

Bayesian Semi-parametric Factor Models

Bhattacharya, Anirban January 2012 (has links)
Identifying a lower-dimensional latent space for representation of high-dimensional observations is of significant importance in numerous biomedical and machine learning applications. In many such applications, it is now routine to collect data where the dimensionality of the outcomes is comparable to or even larger than the number of available observations. Motivated in particular by the problem of predicting the risk of impending diseases from massive gene expression and single nucleotide polymorphism profiles, this dissertation focuses on building parsimonious models and computational schemes for high-dimensional continuous and unordered categorical data, while also studying theoretical properties of the proposed methods. Sparse factor modeling is fast becoming a standard tool for parsimonious modeling of such massive-dimensional data, and the content of this thesis is specifically directed towards methodological and theoretical developments in Bayesian sparse factor models.

The first three chapters of the thesis study sparse factor models for high-dimensional continuous data. A class of shrinkage priors on factor loadings is introduced with attractive computational properties, and its operating characteristics are explored through a number of simulated and real data examples. In spite of the methodological advances over the past decade, theoretical justifications for high-dimensional factor models are scarce in the Bayesian literature. Part of the dissertation therefore focuses on estimating high-dimensional covariance matrices using a factor model and studying the rate of posterior contraction as both the sample size and the dimensionality increase. To relax the usual assumption of a linear relationship between the latent and observed variables in a standard factor model, extensions to a non-linear latent factor model are also considered.

Although Gaussian latent factor models are routinely used for modeling dependence in continuous, binary and ordered categorical data, they lead to challenging computation and complex modeling structures for unordered categorical variables. As an alternative, a novel class of simplex factor models for massive-dimensional and enormously sparse contingency table data is proposed in the second part of the thesis. An efficient MCMC scheme is developed for posterior computation, and the methods are applied to modeling dependence in nucleotide sequences and prediction from high-dimensional categorical features. Building on a connection between the proposed model and sparse tensor decompositions, new classes of nonparametric Bayesian models are proposed for testing associations between a massive-dimensional vector of genetic markers and a phenotypic outcome.
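The abstract does not spell out the particular shrinkage prior, so as a hedged illustration of the setup, the sketch below simulates high-dimensional data from a sparse Gaussian factor model whose loadings are drawn from a generic column-wise multiplicative shrinkage prior (an assumed choice), and forms the implied low-rank-plus-diagonal covariance.

```python
# Minimal sketch of a sparse Gaussian factor model with column-wise multiplicative shrinkage.
# The prior, hyperparameters, and dimensions are illustrative assumptions, not the dissertation's.
import numpy as np

rng = np.random.default_rng(42)
p, n, k = 200, 100, 5                                   # dimension p exceeds sample size n; k latent factors

# Column-wise shrinkage: later factor columns are increasingly shrunk toward zero.
a1, a2 = 2.0, 3.0
delta = np.concatenate([rng.gamma(a1, 1.0, 1), rng.gamma(a2, 1.0, k - 1)])
tau = np.cumprod(delta)                                 # stochastically increasing precision per column
Lambda = rng.normal(0.0, 1.0, (p, k)) / np.sqrt(tau)    # p x k factor loadings

sigma2 = 0.5
eta = rng.normal(size=(n, k))                           # latent factors
Y = eta @ Lambda.T + rng.normal(0.0, np.sqrt(sigma2), (n, p))

# Implied covariance of the observations: Lambda Lambda' + sigma2 * I
Sigma = Lambda @ Lambda.T + sigma2 * np.eye(p)
print("Frobenius gap between sample and model covariance:",
      np.linalg.norm(np.cov(Y, rowvar=False) - Sigma))
```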
205

Data Mining in Tree-Based Models and Large-Scale Contingency Tables

Kim, Seoung Bum 11 January 2005 (has links)
This thesis is composed of two parts. The first part pertains to tree-based models; the second deals with multiple testing in large-scale contingency tables. Tree-based models have gained enormous popularity in statistical modeling and data mining. We propose a novel tree-pruning algorithm called the frontier-based pruning algorithm (FBP). The new method has computational complexity comparable to that of cost-complexity pruning (CCP) while providing a full spectrum of pruning information. A numerical study on real data sets reveals a surprise: in the complexity-penalization approach, most tree sizes are inadmissible. FBP facilitates a more faithful implementation of cross-validation, an advantage supported by simulations. One of the most common test procedures for two-way contingency tables is the test of independence between two categorizations. Current procedures such as the chi-square and likelihood ratio tests assess overall independence but provide limited information about the nature of the association in the table. We propose an approach for testing the independence of categories in individual cells of a contingency table based on a multiple testing framework. We then employ the proposed method to identify patterns of pair-wise association between amino acids involved in beta-sheet bridges of proteins. We identify a number of amino acid pairs that exhibit either strong or weak association. These patterns provide useful information for algorithms that predict secondary and tertiary structures of proteins.
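The thesis's exact cell-level procedure is not given in the abstract; as a rough sketch of the general idea, the code below computes standardized Pearson residuals for each cell of a two-way table, converts them to per-cell p-values, and screens them with a Benjamini-Hochberg correction (an assumed choice of multiplicity control). The counts are hypothetical.

```python
# Rough illustration of cell-wise independence testing in a contingency table:
# adjusted residuals -> per-cell p-values -> Benjamini-Hochberg screening.
import numpy as np
from scipy.stats import norm

table = np.array([[30, 10,  5],
                  [12, 40,  8],
                  [ 6,  9, 35]], dtype=float)           # hypothetical counts

n = table.sum()
row = table.sum(axis=1, keepdims=True)
col = table.sum(axis=0, keepdims=True)
expected = row @ col / n

# Adjusted (standardized) residuals are approximately N(0,1) under independence.
resid = (table - expected) / np.sqrt(expected * (1 - row / n) * (1 - col / n))
pvals = 2 * norm.sf(np.abs(resid)).ravel()

# Benjamini-Hochberg step-up procedure at FDR level 0.05.
order = np.argsort(pvals)
m, alpha = pvals.size, 0.05
thresh = alpha * np.arange(1, m + 1) / m
passed = pvals[order] <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant = np.zeros(m, dtype=bool)
significant[order[:k]] = True
print("cells with significant association:\n", significant.reshape(table.shape))
```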
206

Using graph theory to resolve state estimator issues faced by deregulated power systems

Lei, Jiansheng 15 May 2009 (has links)
The power industry is undergoing a transition from the traditional regulated environment to a competitive power market. Two major challenges are emerging for reliable state estimation (SE) in the power market environment: keeping the SE running reliably even under a contingency, and running the SE over a grid of extremely large size. The objective of this dissertation is to use graph theory to address these two challenges. To keep the SE running reliably under a contingency, a novel topological approach is first proposed to identify critical measurements and examine network observability under a contingency. To advance classical topological observability analysis, a new concept of contingency observability graph (COG) is introduced, and it is proven that a power system network maintains its observability under a contingency if and only if its COG satisfies certain conditions. As an application of the COG, a two-stage heuristic topological approach is further developed, based on the new concept of qualified COG (QCOG), to minimize the number of measurements and RTUs under the constraint that the system remains observable under any single contingency. To overcome the disadvantages of existing SE over extremely large networks, a textured distributed state estimator (DSE), consisting of an off-line textured architecture design and an on-line textured computation, is proposed based on the COG and a new concept of Bus Credibility Index (BCI). The textured DSE is non-recursive, asynchronous, and avoids a central controlling node. Numerical tests verify that the new textured DSE algorithm greatly outperforms existing DSE algorithms with respect to bad data detection and identification. Furthermore, the software implementation of the DSE is formulated as an information integration problem over regional power markets, which is very challenging because of its size and complexity. A new concept of semantic knowledge warehouse (SKW), together with the proposed concepts of semantic reasoning software component (SRSC) and deduction credibility, is developed to implement such an information integration system.
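The abstract does not describe how the COG is constructed, so the sketch below uses a much simpler stand-in for topological observability: treat each branch-flow measurement as an edge of a measurement graph over buses, call the system observable if that graph stays connected, and re-test after dropping each measurement in turn to flag critical measurements. The network data, measurement set, and observability criterion are all simplified illustrative assumptions, not the dissertation's method.

```python
# Loose sketch (not the COG construction): single-contingency observability screening
# on a toy measurement graph. Connectivity stands in for topological observability.
import networkx as nx

buses = [1, 2, 3, 4, 5]
# (from_bus, to_bus, measurement_id) -- hypothetical branch-flow measurements
measurements = [(1, 2, "m12"), (2, 3, "m23"), (3, 4, "m34"),
                (4, 5, "m45"), (1, 3, "m13"), (2, 4, "m24")]

def observable(meas):
    """Simplified proxy: the measurement graph must span and connect all buses."""
    g = nx.Graph()
    g.add_nodes_from(buses)
    g.add_edges_from((a, b) for a, b, _ in meas)
    return nx.is_connected(g)

print("base case observable:", observable(measurements))
for i, (_, _, mid) in enumerate(measurements):
    remaining = measurements[:i] + measurements[i + 1:]
    if not observable(remaining):
        print(f"losing {mid} makes the system unobservable -> {mid} is critical")
```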
207

A Discussion On How To Formulate The Question Of Contingency In Leibniz's System: A Logical Approach

Besler, Arman 01 June 2008 (has links) (PDF)
The main objective of this study is to shed light on some difficulties involved in the formulation of the problem of contingency in Leibniz's philosophical system. Leibniz's mature philosophy is characterized by the solutions he proposes for this problem and by the ontological ideas underlying or assisting them. 'The problem of contingency' refers to the tension between his conceptual containment theory of truth and his claim that true existential propositions – that is, propositions which concern actual individuals – are all contingent. Though Leibniz does not seem to have one definite theory of contingency, two general lines of thought can nevertheless be discerned from his fragments on propositions and propositional truth. The first is the infinite analysis theory, generally regarded as Leibniz's real theory of contingency; the other is a theory of necessity, providing a division between absolute and hypothetical modalities. This thesis is not a study of whether Leibniz really managed to solve the problem, but rather an attempt to trace the problem to its logical and ontological origins and to redefine it in a relatively simple form. It is first shown that Leibniz's theory of propositions relies heavily on his ontological conception of modalities, which covers the idea of a division between pure possibility and actuality; this idea is then shown to be reflected on the logical level as a division between essential and existential truths. Finally, it is argued that the two lines of thought, together with some peculiar characteristics of Leibniz's conception of modalities, lead to the conclusion that his real problem is the (deliberate) inability of his propositional calculus to express the difference between truths of reason (essential truths) and truths of fact (existential truths) as a logical, structural one.
208

Fact of Disaster Recovery Plan within Information Security Management System – Example of CTS Corporation

Lee, Chun-Lung 18 July 2003 (has links)
The motivation of this thesis is to examine the role of information system security through the disaster recovery plan of the case company, CTS Corporation, and to present a model for discussing it. The research found that most information security systems worldwide follow BS 7799-2 and ISO 17799, or the equivalent CNS 17799. The questionnaire survey shows that constructing and certifying an information security system is the next issue and trend for enterprises after ISO 9000 and QS 9000. Since 1978, the case study company, CTS Corporation, has had fire and evacuation procedures that were informally supported by various committees. An emergency plan issued on March 24, 1980 incorporated the functions of these committees and provided more formalized procedures for responding to emergencies. The emergency plan was later reissued as the "CTS Corporation Disaster Recovery Plan", one of the policies to be followed. The thesis broadly discusses information security risk and evaluation, reviews major domestic and international information security incidents in chronological order, and presents overviews from industry, government and academia on how enterprises face information security. It then compares the 10 control sets, 36 control objectives and 127 controls, together with the 8 introductory sections of ISO 17799, against BS 7799-1 and BS 7799-2. Comparing the variance between CTS Corporation's practice and the international standards, four steps for developing an information security system are proposed: information security policy; documentation and implementation; risk management; and an Information Security Management System (ISMS). Questionnaire evidence from the case study company is provided. Finally, three dimensions covering five issues of "Environment and Infrastructure", seven issues of "Disaster Recovery Planning" and six issues of "Business Contingency Planning" are summarized to construct a Disaster Recovery Planning Deployment Model with seven steps across four scenarios.
209

Contingency theory of group communication effectiveness in Korean organizations: influence of fit between organizational structural variables and group relational climate on communication effectiveness

Cho, WoonYoung 30 October 2006 (has links)
This study developed and tested a contingency model of group communication in Korean workgroups, which posits that the communication effectiveness and group performance of workgroups are determined by the "fit" of communication practices with organizational structure and group relational climate. A contingency model incorporates three kinds of variables: contingency variables, response variables, and performance variables. Based on a review of the literature on Korean organizations and groups, the model incorporated two contingency variables: organizational structure and group relational climate. Organizational structure was indexed by the level of centralization and formalization in the organization; group relational climate was indexed by the level of closeness and group conformity among members. The response variable, the workgroups' communication practices, was measured in terms of the frequency of formal and informal meetings held by the workgroups. Two types of performance were measured: communication effectiveness and performance level. The contingency model hypothesized that the communication effectiveness and group performance of a workgroup whose communication practices fit the requirements of organizational structure and group relational climate will be higher than those of a group whose communication practices do not fit these requirements. It also hypothesized that communication effectiveness and group performance would be lower in groups facing conflicting contingencies than in groups facing consistent contingencies. A survey of 409 members of 84 workgroups in 37 Korean organizations was conducted. The results supported the predictions of the contingency model; in particular, centralization, formalization, and closeness were significant contingency variables. The hypothesis regarding conflicting contingencies was not supported. Implications of the study for contingency theory, group communication and group effectiveness, and the nature of Korean groups and organizations are discussed.
210

Cooperation, communication and contingencies: the relationships of corporate public relations practitioners, lawyers and their external public

Reber, Bryan H. January 2001 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 2001. Typescript. Vita. Includes bibliographical references (leaves 144-152). Also available on the Internet.
