
Adaptive Reliability Analysis of Excavation Problems

Park, Jun Kyung 2011 August (has links)
Excavation activities such as open cutting and tunneling may cause ground movements. Many of these activities are performed in urban areas where many structures and facilities already exist, close enough for those structures to be affected. It is therefore important to understand how ground movements due to excavations influence nearby structures. The goal of the proposed research is to investigate and develop analytical methods for addressing uncertainty during observation-based, adaptive design of deep excavation and tunneling projects. Computational procedures based on a Bayesian probabilistic framework are developed for comparative analysis between observed and predicted soil and structure response during construction phases. This analysis couples the adaptive design capabilities of the observational method with updated reliability indices, to be used in risk-based design decisions. A probabilistic framework is developed to predict three-dimensional deformation profiles due to supported excavations using a semi-empirical approach. The key advantage of this approach for practicing engineers is that an already common semi-empirical chart can be used, together with a few additional simple calculations, to better evaluate three-dimensional displacement profiles. A reliability analysis framework is also developed to assess the fragility of excavation-induced infrastructure damage across multiple serviceability limit states. Finally, a reliability analysis of a shallow circular tunnel driven by a pressurized shield in a frictional and cohesive soil is developed to account for the inherent uncertainty in the input parameters and the proposed model. The ultimate limit state for face stability is considered in the analysis, and the probability of failure, defined as exceeding a specified applied pressure at the tunnel face, is estimated. Sensitivity and importance measures are computed to identify the key parameters and random variables in the model.
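
As an illustration of the kind of calculation such a reliability framework performs, the sketch below estimates a probability of failure and the corresponding reliability index by Monte Carlo simulation. The limit-state function, parameter distributions, and all numbers are invented for illustration; they are not the dissertation's model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200_000

# Assumed, illustrative uncertain soil parameters (not the study's inputs):
# friction angle (degrees) and cohesion (kPa), both lognormal.
phi = rng.lognormal(np.log(30.0), 0.10, n)
c = rng.lognormal(np.log(5.0), 0.30, n)

applied = 45.0  # kPa, assumed pressure applied at the tunnel face

def required_pressure(phi_deg, c_kpa):
    # Toy stand-in for a face-stability model: stronger soil (higher
    # friction and cohesion) needs less support pressure.
    return 90.0 - 1.3 * phi_deg - 2.5 * c_kpa

g = required_pressure(phi, c) - applied  # g > 0 means the face fails
pf = np.mean(g > 0)                      # Monte Carlo probability of failure
beta = -norm.ppf(pf)                     # corresponding reliability index
print(f"P(failure) ~ {pf:.3f}, reliability index beta ~ {beta:.2f}")
```

In an observational-method setting, the input distributions would be updated from construction-phase measurements and the index recomputed, which is the adaptive loop the abstract describes.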

Pretreatment and hydrolysis of recovered fibre for ethanol production

Ruffell, John 11 1900 (has links)
Energy utilization is a determining factor in standards of living around the world, and the current primary source of energy is fossil fuels. A potential source of liquid fuels that could ease the strain caused by diminishing petroleum resources is bioethanol. Effective exploitation of biomass materials requires a pretreatment to disrupt the lignin and cellulose matrix. The pretreatment utilized for this research was oxygen delignification, which is a standard process stage in the production of bleached chemical pulp. The model substrate utilized as a feedstock for bioethanol was recovered fibre. An analysis of the substrate's digestibility resulted in a hexose yield of approximately 23%, which justified the need for an effective pretreatment. An experimental design was used to optimize the delignification conditions, with experiments over a range of temperatures, caustic loadings, and reaction times. Equations were developed that outline the dependence of various response parameters on the experimental variables. An empirical model was also developed that predicts sugar concentrations from enzymatic hydrolysis based on the Kappa number, enzyme loading, and initial fibre concentration. A study of hydrolysis feeding regimes for untreated recovered fibre (87 Kappa), pretreated recovered fibre (17 Kappa), and bleached pulp (6 Kappa) showed that the batch feeding regime offers reduced complexity and high sugar yields for lower-Kappa substrates. To evaluate the possibility of lignin recovery, the pH of the delignification liquor was reduced by the addition of CO₂ and H₂SO₄, resulting in up to 25% lignin yield. An experiment examining the effect of post-delignification fibre washing on downstream hydrolysis found that a washing efficiency of approximately 90% is required to achieve a hexose sugar yield of 85%.
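
A minimal sketch of how such an empirical model might be fitted, assuming a linear form in Kappa number, enzyme loading, and initial fibre concentration. All data points and coefficients below are invented placeholders, not the thesis results.

```python
import numpy as np

# Hypothetical hydrolysis runs: [Kappa number, enzyme loading (FPU/g),
# initial fibre concentration (% w/v)] -> hexose sugar concentration (g/L).
X = np.array([[87, 10, 4], [17, 10, 4], [6, 10, 4],
              [87, 20, 8], [17, 20, 8], [6, 20, 8]], dtype=float)
y = np.array([9.0, 22.0, 25.0, 18.0, 45.0, 50.0])  # invented responses

# Least-squares fit of: sugar = b0 + b1*kappa + b2*enzyme + b3*fibre
A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_sugar(kappa, enzyme, fibre):
    """Predicted sugar concentration under the assumed linear model."""
    return float(b @ np.array([1.0, kappa, enzyme, fibre]))

print(predict_sugar(17, 15, 6))
```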

Analytical and empirical models of online auctions

Ødegaard, Fredrik 11 1900 (has links)
This thesis provides a discussion of some analytical and empirical models of online auctions. The objective is to provide an alternative framework for analyzing online auctions and to characterize the distribution of intermediate prices. Chapter 1 provides a mathematical formulation of the eBay auction format and background to the data used in the empirical analysis. Chapter 2 analyzes policies for optimally disposing of inventory using online auctions. It is assumed that a seller has a fixed number of items to sell using a sequence of, possibly overlapping, single-item auctions. The decision the seller must make is when to start each auction. The decision involves a trade-off between a holding cost for each period an item remains unsold and a cannibalization effect among competing auctions. Consequently, the seller must weigh the expected marginal gain of the ongoing auctions against the expected marginal cost of further deferring the release of the unreleased items. The problem is formulated as a discrete-time Markov Decision Problem. Conditions are derived to ensure that the optimal release policy is a control-limit policy in the current price of the ongoing auctions. Chapter 2 focuses on the two-item case, which has sufficient complexity to raise challenging questions. An underlying assumption in Chapter 2 is that the auction dynamics can be captured by a set of transition probabilities. Chapter 3 shows, for two fixed bidding strategies, how the transition probabilities can be derived for a given auction format and bidder arrival process. The two bidding strategies analyzed are when bidders bid: 1) a minimal increment, and 2) their true valuation. Chapters 4 and 5 provide empirical analyses of 4,000 eBay auctions conducted by Dell. Chapter 4 provides a statistical model in which, over discrete time periods, prices of online auctions follow a zero-inflated gamma distribution. Chapter 5 provides an analysis of the 44,000 bids placed in the auctions, based on bids following a gamma distribution. Both models presented in Chapters 4 and 5 are based on conditional probabilities given the price and elapsed time of an auction, together with certain parameters of the competing auctions. Chapter 6 concludes the thesis with a discussion of the main results and possible extensions.
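
To make the Chapter 4 model concrete: a zero-inflated gamma places probability mass at zero (auctions still without a bid) and a gamma density on positive prices. The sketch below samples from and scores such a distribution; the parameter values are illustrative, not the fitted Dell estimates.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

p0, shape, scale = 0.3, 2.0, 50.0  # illustrative parameters

def sample_prices(n):
    """Draw n intermediate prices: zero with prob p0, gamma otherwise."""
    prices = rng.gamma(shape, scale, size=n)
    prices[rng.random(n) < p0] = 0.0
    return prices

def log_likelihood(prices, p0, shape, scale):
    zero = prices == 0
    return (zero.sum() * np.log(p0)
            + (~zero).sum() * np.log1p(-p0)
            + stats.gamma.logpdf(prices[~zero], a=shape, scale=scale).sum())

prices = sample_prices(4000)
print(log_likelihood(prices, p0, shape, scale))
```

A full analysis would estimate p0, shape, and scale by maximizing this likelihood, conditioning them on the auction's price, elapsed time, and competing-auction covariates as the thesis describes.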

Empirical Correlates of the Personality Assessment Inventory (PAI) in an Outpatient Sample: A Replication and Extension

Pan, Minqi 05 1900 (has links)
The Personality Assessment Inventory (PAI) has gained widespread favor since its publication. However, validation studies of its interpretive descriptors have remained limited to date. As such, the primary goal of the current study was to validate the interpretive descriptors through the lens of empirical correlates, using the PDSQ as the external criterion. It also served as a replication and extension of the 2018 study conducted by Rogers and colleagues. The final archival sample included 204 clients from the UNT Psychology Clinic who were administered the PAI between May 2016 and December 2020. Overall, reliability and construct validity were strongly supported for the PAI clinical scales. Further, the current study replicated the large majority of the correlates identified by Rogers and colleagues, which boosts confidence in reproducible interpretations based on empirical correlates. Importantly, investigation of item-level and gendered correlates provided crucial interpretive implications that were otherwise obscured. For example, item-level correlates refined interpretation by clarifying the nature of scale-level correlates, particularly those of moderate strength. On the other hand, notable gender differences were identified for certain scales, which led to drastic differences in the patterns of gendered versus non-gendered correlates. Finally, several methodological considerations are proposed in the hope of facilitating empirical research on measurement validity and combating the current replication crisis; among them, the need to adopt more stringent standards for effect size and the instability of correlates of moderate strength are discussed. Implications for clinical practice and future directions for research are also discussed.
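
As a sketch of the correlate-screening logic described here, the snippet below computes a scale-criterion correlation and applies a stringent effect-size cutoff. The data are simulated, and the |r| ≥ .30 threshold is an assumed stand-in for the study's actual standards.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 204  # sample size matching the archival sample

# Simulated stand-ins: a PAI clinical scale T-score and a PDSQ criterion.
pai = rng.normal(60, 10, n)
pdsq = 0.05 * pai + rng.normal(0, 1.2, n)

r, p = stats.pearsonr(pai, pdsq)
# Retain as an interpretable correlate only if significant AND at least
# moderate in magnitude (cutoff assumed for illustration).
is_correlate = (p < .05) and (abs(r) >= .30)
print(f"r = {r:.2f}, p = {p:.4f}, correlate: {is_correlate}")
```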

An empirical study of the usefulness of accounting ratios to describe levels of insolvency risk

Lincoln, Mervyn George January 1982 (has links) (PDF)
This study aims to add a new dimension to research in Australia on the use of accounting ratios to predict corporate failure. Previous studies have used the statistical technique of discriminant analysis to derive models for predicting whether a firm will or will not fail. This study uses the same statistical technique but with three differences:
(a) the ratios used in the discriminant analysis are selected by a method which ensures that no arbitrary limit is placed on their number;
(b) because the significance of accounting ratios can vary from industry to industry, four industries are analysed separately: manufacturing, retail, property, and finance;
(c) the statistical probabilities yielded by the analysis are used to measure a firm's current level of insolvency risk (illustrated in the sketch below).
The extra dimension is added by interpreting the characteristic patterns of insolvency risk which emerge: an analysis of the factors causing the differences in these patterns throws new light on the causes, symptoms, and remedies of financial distress.
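
A minimal sketch of the discriminant-analysis step, using scikit-learn in place of the study's 1982 software, with invented ratios and failure labels. The posterior probability of the "failed" class plays the role of the firm's current level of insolvency risk in the study's sense.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical accounting ratios per firm:
# [current ratio, debt-to-equity, return on assets]
X = np.array([[2.1, 0.4, 0.12], [1.8, 0.6, 0.08], [1.5, 0.8, 0.04],
              [0.9, 1.5, -0.05], [0.7, 2.1, -0.11], [0.6, 2.6, -0.15]])
y = np.array([0, 0, 0, 1, 1, 1])  # 1 = firm later failed (invented labels)

lda = LinearDiscriminantAnalysis().fit(X, y)

# Posterior probability of failure as a continuous insolvency-risk level,
# rather than a binary fail / not-fail prediction.
risk = lda.predict_proba([[1.0, 1.2, 0.01]])[0, 1]
print(f"estimated insolvency risk: {risk:.2f}")
```

Tracking this posterior over successive reporting periods yields the risk patterns whose interpretation the study treats as its added dimension.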

Reality and Myth: The New Zealand Media and the Chilling Effect of Defamation Law

Cheer, Ursula Jan January 2008 (has links)
In 2001, I began the fieldwork for an empirical study of the laws of defamation in New Zealand. This study involved a comprehensive mail-out survey of the New Zealand media and an adapted survey of defamation lawyers, designed to discover how the laws of defamation affected both groups and what the respondents thought about those laws. The survey was augmented by an extensive search of defamation court files in the most important New Zealand High Court registries. The question behind the survey was essentially whether New Zealand's defamation laws have a chilling effect on the media, to the extent that stories which should be told do not see the light of day. In this thesis, I contextualise and report on the results of the survey. I first describe and analyse the sources of and trends in current defamation law, the other forms of regulation of the media in New Zealand, and the patterns of media ownership. I go on to use background data from the survey to present a character and business profile of the media who responded, and find that the data confirm the representative nature of those respondents. I then complete the contextualisation of the survey by analysing the nature of the chilling effect doctrine itself, a canon which began as a predictive theory importing sociological concepts into legal analysis, but which is now a doctrine applied by the courts somewhat inconsistently, though with substantive effects. In the following chapters I present the results of the media survey, the court file search and the survey of defamation lawyers, in narrative and in graph or tabular form. My tentative initial finding, that New Zealand's defamation laws do not have an excessive chilling effect on our media, although they do have some, is progressively confirmed, with each set of data appearing to mirror and corroborate that which went before. In the final chapters, I take this somewhat surprising finding and augment it by theorising about future developments in defamation law. I suggest that increased constitutionalisation of this area of private law, in the form of full incorporation of a Bill of Rights methodology, is both desirable and necessary to protect against any chilling effects, such as they are. I conclude by posing a question about a possible joint future for defamation and privacy claims.

Engineering Approaches for Improving Cortical Interfacing and Algorithms for the Evaluation of Treatment Resistant Epilepsy

January 2015 (has links)
Epilepsy is a group of disorders that cause seizures in approximately 2.2 million people in the United States. Over 30% of these patients have epilepsies that do not respond to treatment with anti-epileptic drugs. For this population, focal resection surgery could offer long-term seizure freedom. Surgery candidates undergo a myriad of tests and monitoring to determine where and when seizures occur. The “gold standard” method for focus identification involves the placement of electrocorticography (ECoG) grids in the sub-dural space, followed by continual monitoring and visual inspection of the patient’s cortical activity. This process, however, is highly subjective and uses dated technology. Multiple studies were performed to investigate how the evaluation process could benefit from an algorithmic adjunct using current ECoG technology, and how the use of new microECoG technology could further improve the process. Computational algorithms can quickly and objectively find signal characteristics that may not be detectable by visual inspection, but many assume the data are stationary and/or linear, which biological data are not. An empirical mode decomposition (EMD) based algorithm was developed to detect potential seizures and tested on data collected from eight patients undergoing monitoring for focal resection surgery. EMD does not require linearity or stationarity and is data driven. The results suggest that a biologically data-driven algorithm could serve as a useful tool to objectively identify changes in cortical activity associated with seizures. Next, the use of microECoG technology was investigated. Though both ECoG and microECoG grids are composed of electrodes resting on the surface of the cortex, changing the diameter of the electrodes creates non-trivial changes in the physics of the electrode-tissue interface that need to be accounted for. Experimenting with different recording configurations showed that proper grounding, referencing, and amplification are critical to obtaining high-quality neural signals from microECoG grids. Finally, the relationship between data collected from the cortical surface with micro and macro electrodes was studied. Simultaneous recordings with the two electrode types showed differences in power spectra that suggest macroelectrodes capture activity, possibly from deep structures, that is not accessible to microelectrodes. / Doctoral Dissertation, Bioengineering, 2015
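
A sketch of the EMD step on a synthetic channel. It uses the open-source PyEMD package as an assumed library choice (the dissertation's implementation is not specified), and a simple energy-jump flag stands in for the actual detection rule, which the abstract does not detail.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal (assumed library choice)

fs = 500                      # Hz, assumed ECoG sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)

# Synthetic stand-in for one ECoG channel: 8 Hz background plus noise,
# with a higher-frequency burst mimicking a seizure-like change.
sig = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.standard_normal(t.size)
sig[3000:3500] += 2.0 * np.sin(2 * np.pi * 25 * t[3000:3500])

imfs = EMD()(sig)  # data-driven decomposition; no linearity/stationarity assumed

# Flag half-second windows where the fastest IMF's energy jumps well
# above its median level (threshold invented for illustration).
imf, win = imfs[0], fs // 2
energy = np.array([np.sum(imf[i:i + win] ** 2)
                   for i in range(0, imf.size - win + 1, win)])
flags = np.where(energy > 5 * np.median(energy))[0]
print("flagged windows:", flags)
```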

Research on the Relationship between Integrity and Market Economy in China

January 2017 (has links)
The question of integrity has a long history of discussion and is regarded as an important research proposition in Chinese and Western philosophy, law, sociology, ethics, management, and economics. Even though the research results of these fields inform one another, there is still a lack of comprehensive analysis and empirical study of integrity. In the process of transition from a planned economy to a market economy, a series of problems has arisen, and the problem of dishonesty has caused huge losses to society as a whole. It is therefore of great theoretical and practical importance to conduct a quantitative analysis of trustworthiness and to investigate the relationship between trustworthiness and the economy. This paper adopts a game theory model to analyze the reasons for corporate dishonesty and the problem faced by regulators. Based on the results from the model, we clarify the duties of the various subjects (enterprises, governments, and credit organizations) in the construction of a society with integrity. Then, we analyze different cases of dishonesty and point out the channels through which the economy is affected by the problem of dishonesty: (1) as an institution, integrity helps reduce transaction costs and promote market efficiency; (2) integrity serves as a production factor that influences the economy; and (3) integrity affects the economy by influencing the ability of small and medium enterprises to borrow. Finally, after establishing the relationship between integrity and the market economy, we use survey data to conduct an empirical analysis of the development of integrity in China. The survey data allow us to build a cycle of integrity risk and identify the current position in the cycle. We also compare regional differences in integrity, which supports the idea that integrity matters for economic development. Because questionnaires are at present the only way to obtain analyzable data, the paper not only fills a research gap caused by the lack of data but also moves beyond existing research methods and enriches the empirical work on the study of integrity. / Doctoral Dissertation, Business Administration, 2017
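
To illustrate the game-theoretic reasoning, the sketch below solves a standard 2×2 inspection game between a firm (honest/cheat) and a regulator (inspect/ignore) for its mixed-strategy equilibrium. The payoff numbers are invented for illustration and are not taken from the dissertation's model.

```python
import numpy as np

# Rows: firm plays honest (0) or cheat (1).
# Cols: regulator plays inspect (0) or ignore (1). Payoffs are invented.
firm = np.array([[3.0, 3.0],    # honesty pays the same either way
                 [-2.0, 6.0]])  # cheating pays only if not inspected
reg = np.array([[-1.0, 0.0],    # inspecting an honest firm wastes cost
                [2.0, -4.0]])   # catching a cheat beats missing one

# Regulator inspects with prob q that leaves the firm indifferent:
#   firm[0,0] = q*firm[1,0] + (1-q)*firm[1,1]
q = (firm[1, 1] - firm[0, 0]) / (firm[1, 1] - firm[1, 0])

# Firm cheats with prob p that leaves the regulator indifferent:
#   (1-p)*reg[0,0] + p*reg[1,0] = (1-p)*reg[0,1] + p*reg[1,1]
p = (reg[0, 1] - reg[0, 0]) / ((reg[0, 1] - reg[0, 0]) + (reg[1, 0] - reg[1, 1]))

print(f"equilibrium: firm cheats with p = {p:.3f}, "
      f"regulator inspects with q = {q:.3f}")
```

In this kind of equilibrium, some dishonesty persists and inspection is probabilistic, which matches the paper's point that the duties and incentives of firms, governments, and credit organizations jointly determine the level of trustworthiness.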

Modularity analysis of use case implementations

Rodrigues dos Santos d'Amorim, Fernanda 31 January 2010 (has links)
Component-based architecture is currently the most widely used approach to developing complex software; its main goal is the assignment of application requirements to components. One of the most widespread techniques for requirements specification is the use of Use Cases. In general, component-based software architectures result in implementations where the code related to a use case is scattered across and tangled with several components of the system, characterizing a crosscutting concern. This occurs because traditional techniques, such as Object Orientation (OO), do not offer mechanisms capable of modularizing this kind of concern. Recently, new modularization techniques such as aspects, mixins, and virtual classes have been proposed to address this problem. These techniques can be used to group the code related to a single use case into a new unit of modularization. This work analyzes, qualitatively and quantitatively, the impact caused by this kind of use case modularization. We explore two techniques based on Aspect Orientation (AO): (i) Use Cases as Aspects, where we use AspectJ constructs to isolate all code related to the implementation of a use case in an aspect; and (ii) Use Cases as Pluggable Collaborations, where we use CaesarJ constructs to modularize use case implementations through a hierarchical composition of collaborations. We carried out two case studies comparing the AO implementations of use cases with their OO implementations. In the evaluation process we extracted traditional and contemporary metrics, including cohesion, coupling, and separation of concerns, and analyzed modularity in terms of software quality attributes such as pluggability, traceability, and support for parallel development. Our results indicate that modularity is a relative concept and that its analysis depends on factors beyond the target system, the metrics, and the technique applied.

Business Process Variability: A Systematic Literature Review

SANTOS, George Augusto Valença 2012 March (has links)
Business processes have facilitated and enhanced management activities, being considered an instrument capable of connecting strategic guidance with the people who execute the work to achieve organizational goals. In this scenario, continuous evaluation procedures, compliance with government regulations and industry standards, evolutions in the business domain, stakeholders’ needs, new technologies, and economic factors related to globalization pressure are examples of aspects that can foster changes in business processes. The impact of this changing environment is the variation of business processes, a phenomenon called business process variability. The objective of this research is, therefore, to aggregate relevant studies which address this phenomenon. Studies were selected through a Systematic Literature Review, conducting automatic searches in a set of digital libraries and manual searches in leading conferences and journals in the fields of Business Process Management and Computer Science. In total, 13,619 studies were retrieved, of which 80 were classified as relevant. This set of primary studies acted as sources of evidence for answering 3 research questions and their respective subquestions. From the analysis performed, the study concludes that, despite efforts in the literature to manage business process variability, the concept is neither clear nor well delimited, involves additional aspects, and hence lacks a structured taxonomy. The contributions of the current work are: to provide valuable information on the main notions in the business process variability field, along with the possible types and drivers of process variability; to identify the main challenges faced by organizations when dealing with this phenomenon; and to examine a set of proposals for process variability management, investigating the existence of tool support and of empirical evaluations carried out.
