621

Markowitz-style Quartic Optimization for the Improvement of Leveraged ETF Trading

DeWeese, Jackson Paul 25 April 2013 (has links)
This paper seeks, unconventionally, to maximize the volatility of a portfolio through a quartic optimization based on Markowitz's modern portfolio theory, which generally seeks to do exactly the opposite. It shows that this method significantly improves the Sharpe ratio of a daily leveraged exchange-traded fund (ETF) strategy investigated by Posterro. The original strategy uses a combination of momentum trading and tracking error in leveraged ETFs to trade during the last half hour of the trading day, but it suffers in a low-volatility market. Maximizing volatility to take better advantage of tracking error and momentum addresses this problem by both increasing the mean daily return and significantly decreasing the variance of the strategy's daily returns. GARCH forecasting is also implemented to assist in maximizing the daily portfolios' variances, though this does not make a statistically significant difference in the strategy's performance.
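To make the mechanics concrete, the following minimal sketch (not taken from the thesis) forecasts each ETF's next-day variance with a GARCH(1,1) recursion and then picks the weights that maximize the forecast portfolio variance w'Σw under a simple long-only budget constraint. The GARCH parameters, the constant-correlation covariance construction, and the constraint set are all illustrative assumptions; the thesis's actual quartic objective and trading rules are not reproduced here.

```python
# Illustrative sketch only: variance-maximizing weights for a basket of
# leveraged ETFs, using a hand-rolled GARCH(1,1) one-step variance forecast.
# All parameter values, the long-only constraint, and the constant-correlation
# covariance construction are assumptions for the example, not the thesis's model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.02, size=(250, 3))  # stand-in daily returns, 3 ETFs

def garch11_forecast(r, omega=1e-6, alpha=0.08, beta=0.90):
    """One-step-ahead GARCH(1,1) variance forecast with fixed (assumed) parameters."""
    sigma2 = np.var(r)
    for x in r:
        sigma2 = omega + alpha * x**2 + beta * sigma2
    return sigma2

# Forecast per-asset variances, then build a covariance matrix that reuses the
# historical correlation structure (constant-conditional-correlation style).
var_fcst = np.array([garch11_forecast(returns[:, i]) for i in range(returns.shape[1])])
corr = np.corrcoef(returns, rowvar=False)
D = np.diag(np.sqrt(var_fcst))
Sigma = D @ corr @ D

# Maximize forecast portfolio variance w' Sigma w on the long-only simplex
# (the thesis's actual problem is a quartic with different constraints).
n = Sigma.shape[0]
res = minimize(lambda w: -w @ Sigma @ w,
               x0=np.full(n, 1.0 / n),
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
               method="SLSQP")
print("variance-maximizing weights:", np.round(res.x, 3))
```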
622

Medium-chain Acyl-CoA dehydrogenase deficiency: a characterization of the most common variant and current and future therapeutics

Barbera, Gabrielle 01 November 2017 (has links)
Medium-Chain Acyl-CoA Dehydrogenase Deficiency (MCADD) is the most common inborn error of metabolism affecting the fatty acid oxidation pathway. The deficiency is caused by a defect in the medium-chain acyl-CoA dehydrogenase enzyme, which catalyzes the first step in the oxidation of medium-chain fatty acids. Long-chain fatty acids, after being transported into the mitochondria and activated into long-chain acyl-CoAs, are sequentially broken down until they become medium-chain acyl-CoAs. Medium-chain acyl-CoAs are then broken down until they become short-chain acyl-CoAs. Short-chain acyl-CoAs are broken down until only acetyl-CoA remains. The block in the oxidation of fatty acids in those with MCADD happens once the long-chain acyl-CoAs have been oxidized to medium-chain acyl-CoAs. The medium-chain acyl-CoAs cannot be further oxidized and build up. Without the breakdown of fatty acids, individuals with MCADD cannot produce enough energy during times of increased metabolic demand. Thus, prolonged exercise, fasting, or fever can precipitate clinical symptoms once the body enters a hypoketotic hypoglycemic state. Those with MCADD typically present in the early months of life with fasting intolerance, vomiting, lethargy, and, in more serious cases, seizures. Adult presentation is rare, but should not be ruled out of a differential diagnosis, because early detection and intervention can prevent permanent brain damage and death. Because early detection can prevent the serious effects of metabolic decompensation, MCADD was added to the Newborn Screen and is tested by measuring levels of medium-chain acylcarnitines in dried blood smears by tandem mass spectrometry. Metabolic decompensation is manifested clinically through dehydration, vomiting, and acidosis. In serious cases, metabolic decompensation can progress to seizures, coma, and death. Introduction of the Newborn Screen has reduced the morbidity of the deficiency, but has not eliminated it. Those with MCADD need to be closely monitored, and emergency glucose needs to be available to them in case of a hypoglycemic emergency. The Newborn Screen has been effective in finding mutations in the ACADM gene that produce a mild phenotype of MCADD. Before the Newborn Screen, the most common variant, K329E, was detected in clinically diagnosed patients. However, the screen has shown that there are about 150 variants leading to MCADD. The most common variant of the MCAD protein, K329E, has been studied and characterized in order to further understand the pathogenesis of MCADD. This mutation replaces a lysine with a glutamic acid, introducing hindrance that prevents the protein from forming its fully functional tetramer. The mutant protein also has an increased sensitivity to heat denaturation. Currently, there are no pharmacological treatments for MCADD. The idea of pharmacological chaperones is explored by using the example of tetrahydrobiopterin and phenylketonuria. Future studies will need to be done to find a treatment for MCADD that is curative rather than treating the symptoms of the deficiency; however, curative therapies which target the mutant enzyme may be problematic since there is a wide array of mutations that result in a defective enzyme in affected individuals.
623

The contributory factors in drug errors and their reporting

Armitage, Gerry R. January 2008 (has links)
The aim of this thesis is to examine the contributory factors in drug errors and their reporting so as to design an enhanced reporting scheme to improve the quality of reporting in an acute hospital trust. The related research questions are: 1. What are the contributory factors in drug errors? 2. How effective is the reporting of drug errors? 3. Can an enhanced reporting scheme, predicated on the analysis of local documentary and interview data, identify the contributory factors in drug errors and improve the quality of their reporting in an acute hospital trust? The study aim and research questions reflect a growing consensus, articulated by Boaden and Walshe (2006), that patient safety research should focus on understanding the causes of adverse events and developing interventions to improve safety. Although there are concerns about the value of incident reporting (Wald & Shojania 2003, Armitage & Chapman 2007), it would appear that error reporting systems remain a high priority in advancing patient safety (Kohn et al 2000, Department of Health 2000a, National Patient Safety Agency 2004, WHO & World Alliance for Patient Safety 2004), and consequently it is the area chosen for intervention in this study. Enhancement of the existing scheme is based on a greater understanding of drug errors, their causation, and their reporting.
624

On merit functions, error bounds, minimizing and stationary sequences for nonsmooth variational inequality problems

January 2005 (has links)
In this thesis, we investigate a nonsmooth variational inequality problem (VIP) defined by a locally Lipschitz function F that is not necessarily differentiable or monotone on its domain, a closed convex set in a Euclidean space. First, we study the associated regularized gap functions and D-gap functions and compute their Clarke-Rockafellar directional derivatives and Clarke generalized gradients. Second, using these tools and extending the work of Fukushima and Pang (who studied the case when F is smooth), we present results on the relationship between minimizing sequences and stationary sequences of the D-gap functions, regardless of the existence of solutions of the VIP. Finally, as another application, we show that, under a strong monotonicity assumption, the regularized gap functions have fractional-exponent error bounds, and we thereby provide an algorithm of Armijo type to solve the VIP. / Tan Lulin. / Adviser: Kung Fu Ng. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2005. / Includes bibliographical references (p. 79-84) and index. / Abstracts in English and Chinese.
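For readers unfamiliar with the merit functions named above, a standard formulation (following Fukushima's construction; the notation here is assumed, not quoted from the thesis) for the variational inequality of finding x in K with <F(x), y - x> >= 0 for all y in K is:

```latex
% Regularized gap function for VI(F,K) with parameter \alpha > 0 (Fukushima):
\theta_{\alpha}(x) \;=\; \max_{y \in K} \left\{ \langle F(x),\, x - y \rangle \;-\; \tfrac{\alpha}{2}\,\| x - y \|^{2} \right\},
\qquad \theta_{\alpha}(x) \ge 0 \ \text{on } K, \ \text{with } \theta_{\alpha}(x) = 0 \ \text{iff } x \ \text{solves the VIP}.
% D-gap function for 0 < \alpha < \beta, finite-valued and nonnegative on the whole space:
g_{\alpha\beta}(x) \;=\; \theta_{\alpha}(x) \;-\; \theta_{\beta}(x).
```

Minimizing g_{αβ} over the whole space, rather than θ_α over K, is what turns the VIP into an unconstrained problem and makes the relationship between minimizing sequences and stationary sequences meaningful in the nonsmooth setting studied here.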
625

Architectures for fault-tolerant quantum computation

O'Gorman, Joe January 2017 (has links)
Quantum computing has enormous potential, but this can only be realised if quantum errors can be controlled sufficiently to allow quantum algorithms to be completed reliably. However, quantum-error-corrected logical quantum bits (qubits) which can be said to have achieved meaningful error suppression have not yet been demonstrated. This thesis reports research on several topics related to the challenge of designing fault-tolerant quantum computers. The first topic is a proposal for achieving large-scale error correction with the surface code in a silicon-donor-based quantum computing architecture. This proposal relaxes some of the stringent requirements in donor placement precision set by previous ideas, from the single-atom level to the order of 10 nm in some regimes. This is shown by means of numerical simulation of the surface code threshold. The second topic is the development of a method for benchmarking and assessing the performance of small error correcting codes in few-qubit systems, introducing a metric called 'integrity', closely linked to the trace distance, and a proposal for experiments to demonstrate various stepping stones on the way to 'strictly superior' quantum error correction. Most quantum error correcting codes, including the surface code, do not allow for fault-tolerant universal computation without the addition of extra gadgets. One method of achieving universality is through a process of distilling and then consuming high-quality 'magic states'. This process adds additional overhead to quantum computation over and above that incurred by the use of the base-level quantum error correction. The latter parts of this thesis report an investigation into how many physical qubits are needed in a 'magic state factory' within a surface code quantum computer and introduce a number of techniques to reduce the overhead of leading magic state techniques. It is found that universal quantum computing is achievable with approximately 16 million qubits if error rates across a device are kept below 10^-4. In addition, the thesis introduces improved methods of achieving magic state distillation for unconventional magic states that allow for logical small-angle rotations, and shows that this can be more efficient than synthesising these operations from the gates provided by traditional magic states.
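The scale of a multi-million-qubit figure can be appreciated with a generic back-of-the-envelope surface-code estimate. The sketch below is not the thesis's factory analysis; the scaling prefactor, threshold, target error budget, logical-qubit count and factory multiplier are all assumed round numbers chosen only to show how such a count is assembled.

```python
# Back-of-the-envelope surface-code overhead estimate (illustrative assumptions
# only: the scaling constant, threshold, and qubit counts are generic values,
# not the thesis's detailed magic-state-factory analysis).
p_phys = 1e-4          # assumed physical error rate
p_th = 1e-2            # assumed surface-code threshold
A = 0.1                # assumed scaling prefactor
target_logical = 1e-14 # assumed per-logical-qubit, per-round error budget

d = 3
while A * (p_phys / p_th) ** ((d + 1) / 2) > target_logical:
    d += 2             # surface-code distance is odd

qubits_per_logical = 2 * d ** 2      # ~d^2 data qubits + ~d^2 ancilla qubits
n_logical = 1000                     # assumed number of logical data qubits
factory_overhead = 5.0               # assumed multiplier for magic-state factories

total = n_logical * qubits_per_logical * factory_overhead
print(f"distance d = {d}, physical qubits per logical qubit = {qubits_per_logical}")
print(f"rough total physical qubits ~ {total:,.0f}")
```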
626

Asymptotic performance of multiuser massive MIMO systems

Hburi, Ismail Sh. Baqer January 2017 (has links)
This thesis addresses and identifies outstanding challenges associated with multi-user massive Multiple-Input Multiple-Output (MU massive MIMO) transmission, and various system scenarios are considered to tackle these challenges. First, for a single-cell scenario, the uplink effective capacity under statistical exponent constraints and the asymptotic error and outage probabilities in a multi-user massive MIMO system are provided. The proposed approach establishes closed-form expressions for the aforementioned metrics under both perfect and imperfect channel state information (CSI) scenarios. In addition, expressions for the asymptotically high signal-to-interference ratio (SIR) regimes are established. Second, the statistical queueing constraints, the pilot contamination phenomenon and fractional power control in random or irregular cellular massive MIMO systems are investigated, where base station locations are modelled by a Poisson point process. Specifically, tractable analytical expressions are developed for the asymptotic SIR coverage, the rate coverage and the effective capacity under a quality-of-service statistical exponent constraint. The Laplace transform of the interference is derived with the aid of mathematical tools from stochastic geometry. Simulation outcomes demonstrate that pilot reuse impairments can be alleviated by employing a cellular frequency reuse scheme. For example, with a unity frequency reuse factor, 40% of the total users have SIR above −10.5 dB, whereas, with a reuse factor of 7, the same fraction of users have SIR above 20.5 dB. In addition, for a certain parameter setting, the coverage probability in the lower 50th percentile can be maximized by adjusting the power compensation fraction between 0.2 and 0.5. Also, for an SIR threshold of 0 dB, allocating a 0.25 fraction of uplink transmit power can achieve approximately a 6% improvement in coverage probability in the cell-edge area compared to a constant power policy and about a 14% improvement compared to the full channel-inversion policy. Third and last, motivated by the powerful gains of incorporating small cells with macro cells, a massive MIMO aided heterogeneous cloud radio access network (H-CRAN) is investigated. More specifically, based on a Toeplitz matrix tool, tractable formulas for the link reliability and rate coverage of a typical user in the H-CRAN are derived. Numerical outcomes confirm the powerful gain of massive MIMO for enhancing the throughput of the H-CRAN, while small remote radio heads (RRH cells) are capable of achieving higher energy efficiency.
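Stochastic-geometry claims of this kind (the fraction of users whose SIR exceeds a threshold under a given deployment) can be sanity-checked with a short Monte Carlo experiment. The sketch below simulates only a simplified downlink with single-antenna base stations drawn from a Poisson point process, Rayleigh fading and a path-loss exponent of 4; it does not model the thesis's uplink analysis, pilot contamination or fractional power control, and the density and threshold values are assumptions.

```python
# Monte Carlo sketch of downlink SIR coverage when base stations form a
# Poisson point process and the user attaches to the nearest one.  This is a
# simplified stand-in (single antenna, Rayleigh fading, path-loss exponent 4),
# not the thesis's MU massive MIMO uplink model.
import numpy as np

rng = np.random.default_rng(1)
lam = 1e-5           # assumed BS density per square metre
radius = 5000.0      # simulation disc radius in metres
alpha = 4.0          # assumed path-loss exponent
threshold_db = 0.0   # SIR threshold
trials = 2000

covered = 0
for _ in range(trials):
    n_bs = rng.poisson(lam * np.pi * radius**2)
    if n_bs < 2:
        continue
    # Drop base stations uniformly on a disc; the typical user sits at the origin.
    dist = np.sort(radius * np.sqrt(rng.random(n_bs)))
    fading = rng.exponential(1.0, n_bs)          # Rayleigh fading -> exponential power
    power = fading * dist ** (-alpha)
    signal = power[0]                            # nearest BS serves the user
    interference = power[1:].sum()
    if signal / interference > 10 ** (threshold_db / 10):
        covered += 1

print(f"empirical SIR coverage at {threshold_db} dB: {covered / trials:.2f}")
```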
627

Dos intentos de escrita à escrita convencional : algumas manifestações / From writing intents to conventional writing : some manifestations

Barbosa, Maria Jose Landivar de Figueiredo, 1963- 18 September 2018 (has links)
Advisor: Rosa Attié Figueira / Doctoral thesis (PhD), 2013 - Universidade Estadual de Campinas, Instituto de Estudos da Linguagem / Abstract: In this work, based on the perspective opened by Interactionism (De Lemos), we propose to analyze some manifestations of the process of writing acquisition by children attending preschool and children in the early years of elementary school, with ages ranging from 2 to 9 years. Methodologically, two different methods of collection were used: data derived from audio and video recordings made in the school environment during the 2011 school year, and data recovered from Barbosa (2000) and Rodrigues (2012). We pursued an analytical course with Figueira (1984, 1985, 1995, 1996, 2001, 2002, 2003, 2005, 2008, 2010, 2011, 2012) as the main interlocutor. This author's postulations on "error" in oral language acquisition and on spontaneous or elicited self-corrections opened significant possibilities of interpretation in the field of writing acquisition. Resumptions, reformulations and erasures are acts that, within the frame of orality or writing, focus mostly on the error - an occurrence that, departing from what is deemed correct or expected in both speech and writing, may be led to a repair or correction. Furthermore, language acquisition is related to the entry of the subject into symbolic functioning. In this position, the subject-child shows a facet of his/her place in the world and, in engaging with the culture of oral and written language, exposes his/her singularity. In this context, we understand that the passage from the natural register to the symbolic, evaluative and cultural register takes place in the interplay of relations between the subject, the other and the language / Doctorate / Linguistics / Doctor in Linguistics
628

Avaliação de métodos estatísticos na análise de dados de consumo alimentar / Evaluation of statistical methods in the analysis of food intake data

Paschoalinotte, Eloisa Elena [UNESP] 17 December 2009 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Universidade Estadual Paulista (UNESP) / Evaluating an individual's or a population's food intake has been a challenge for both nutrition professionals and statisticians. This is so because the main characteristic of food intake is diet variety, which may generate large between- and within-person variability. In order to overcome this problem, appropriate statistical methods have been developed based on the measurement-error regression model so as to obtain the estimated distribution of usual intake. Among the intake evaluation methods are the Iowa State University (ISU), the Iowa State University for Foods (ISUF) and the National Cancer Institute (NCI) methods. All of them are based on the measurement-error model, incorporating the issue of episodic intake (ISUF method) and the possibility of including covariates that can interfere in the estimated intake distribution (NCI method). In order to use the ISU method, a software package referred to as PC-SIDE (Software for Intake Distribution Estimate) was designed. It provides the usual intake distribution as well as the probability of inadequacy for certain nutrients according to nutritional recommendations. By using the same software, it is possible to obtain the distribution of episodic usual intake given by the ISUF method. For the NCI method, macros were developed in the SAS (Statistical Analysis System) software which enable the inclusion of covariates and the estimation of the usual intake distribution based on the measurement-error model. Hence, this study aimed at evaluating these statistical methodologies in the analysis of food intake data and at applying them to a data set from a nutritional assessment of elderly individuals. The fitting methodologies for the models proposed to obtain the estimated intake distribution based on the ISU, ISUF and NCI methods were studied. The ISU and NCI methods were applied to data from three 24-hour recalls obtained from a study... (Complete abstract: click electronic access below)
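As a drastically simplified illustration of the measurement-error idea behind these methods (this is not the PC-SIDE software or the NCI SAS macros; the simulated intake values and variance parameters are invented), the snippet below estimates between- and within-person variance components from repeated 24-hour recalls and rescales each person's mean so that the adjusted distribution approximates the usual-intake (between-person) distribution.

```python
# Simplified illustration of the measurement-error adjustment that underlies
# usual-intake estimation (not the PC-SIDE software or the NCI SAS macros;
# all numbers are simulated for the example).
import numpy as np

rng = np.random.default_rng(2)
n_people, n_recalls = 500, 3
true_usual = rng.normal(2000, 300, n_people)                # usual energy intake (kcal/day)
recalls = true_usual[:, None] + rng.normal(0, 600, (n_people, n_recalls))  # day-to-day noise

person_mean = recalls.mean(axis=1)
grand_mean = person_mean.mean()

# Balanced one-way ANOVA variance components (between vs. within person).
ms_within = recalls.var(axis=1, ddof=1).mean()
ms_between = n_recalls * person_mean.var(ddof=1)
var_between = max((ms_between - ms_within) / n_recalls, 0.0)

# Rescale deviations from the grand mean so the adjusted distribution has
# approximately the between-person variance: a classic NRC/ISU-style
# adjustment for estimating a usual-intake distribution.
scale = np.sqrt(var_between / (var_between + ms_within / n_recalls))
usual_estimate = grand_mean + scale * (person_mean - grand_mean)

print(f"SD of observed person means: {person_mean.std(ddof=1):6.1f}")
print(f"SD of adjusted estimates:    {usual_estimate.std(ddof=1):6.1f}")
print(f"SD of true usual intakes:    {true_usual.std(ddof=1):6.1f}")
```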
629

Lexical and sublexical analysis of single-word reading and writing errors

Ross, Katrina 07 July 2016 (has links)
Within a dual-route neuropsychological model, two distinct but interrelated pathways are used to read and write, known as the lexical and sublexical routes. Individuals with reading and writing deficits often exhibit impairments in one or both of these routes, and therefore must rely on the combined power of the integrated system in print processing tasks. The resultant errors reflect varying degrees of lexical and sublexical accuracy in a single production. However, no system presently exists to analyze bimodal errors robustly in both routes. The goal of this project was to develop a system that simultaneously, quantitatively, and qualitatively captures lexical and sublexical errors for single-word reading and writing tasks. This system evaluates responses hierarchically in both routes according to proximity to a target. Each response earns a bivariate score [sublexical, lexical], which is plotted along x and y axes. This scoring system was developed using data from a novel treatment study for patients with acquired alexia/agraphia. Repeated-measures multivariate analyses of variance and post hoc analyses revealed a significant treatment effect in both the lexical and sublexical systems. Qualitative analyses were also conducted to evaluate patterns of change in both the trained and untrained modalities, in the sublexical and lexical systems. Overall, the results of this study indicate that treatment-induced evolution of reading/writing responses can be comprehensively represented by this novel scoring system.
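Because the abstract does not spell out the scoring rubric, the following snippet is only a hypothetical illustration of what a bivariate [sublexical, lexical] score for a single-word response might look like: sublexical proximity approximated by a string-similarity ratio to the target spelling, and lexical status approximated by membership in a toy word list. The function names, the 0-1 scales, and the lexicon are invented for the example and are not the thesis's system.

```python
# Hypothetical illustration of a bivariate [sublexical, lexical] response score.
# The real scoring system in the thesis is hierarchical and far richer; the
# lexicon, the 0-1 scales, and the similarity-ratio proxy here are invented.
from difflib import SequenceMatcher

LEXICON = {"castle", "cattle", "cast", "candle", "cable"}   # toy word list

def sublexical_score(response: str, target: str) -> float:
    """Proximity of the response's letter string to the target (0 = unrelated, 1 = exact)."""
    return SequenceMatcher(None, response.lower(), target.lower()).ratio()

def lexical_score(response: str, target: str) -> float:
    """Crude lexical-route score: 1.0 for the target word, 0.5 for another real word, 0 otherwise."""
    r = response.lower()
    if r == target.lower():
        return 1.0
    return 0.5 if r in LEXICON else 0.0

def score_response(response: str, target: str) -> tuple[float, float]:
    """Return the bivariate (sublexical, lexical) score, suitable for plotting on x and y axes."""
    return (round(sublexical_score(response, target), 2), lexical_score(response, target))

for attempt in ("castle", "cattle", "casle"):
    print(attempt, "->", score_response(attempt, "castle"))
```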
630

Error control for descriptor systems

Mann, George Robert January 2011 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
