About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Rainfall Variation and Food Security in Malawi: A Panel Data Study with Valuable Insights from the Field

Elzvik Nyström, Klara January 2019 (has links)
This study addresses the question of how climate variability, in terms of seasonal rainfall variation, might affect food security in Malawi. It hypothesizes that seasonal rainfall variation could cause food insecurity and that the consequences of weather hazards may differ within the country. An additional aim of this study is therefore to map local resilience in Malawi and to estimate adaptation ability by analyzing two subsamples. The hypothesis is tested using a two-way fixed-effects regression analysis and panel data for 28 districts in Malawi covering the years 2000, 2004, 2010 and 2015. This study finds no statistically significant effect of seasonal rainfall variation on children's health for the examined years.
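The two-way fixed-effects design described above can be sketched on synthetic data. Everything below except the district and year counts is invented for illustration; this is not the thesis's dataset or exact specification:

```python
import numpy as np

rng = np.random.default_rng(0)
districts, years = 28, 4           # 28 districts, 4 survey years, as in the study
n = districts * years

# Synthetic balanced panel: a rainfall-variation regressor and an outcome
# built from district effects, year effects, a true slope, and noise.
d = np.repeat(np.arange(districts), years)
t = np.tile(np.arange(years), districts)
rain = rng.normal(size=n)
alpha = rng.normal(size=districts)          # district fixed effects
gamma = rng.normal(size=years)              # year fixed effects
beta_true = -0.5
y = alpha[d] + gamma[t] + beta_true * rain + rng.normal(scale=0.1, size=n)

def within_transform(x, d, t):
    """Two-way within transformation for a balanced panel:
    subtract unit and time means, add back the grand mean."""
    xd = np.bincount(d, weights=x) / np.bincount(d)
    xt = np.bincount(t, weights=x) / np.bincount(t)
    return x - xd[d] - xt[t] + x.mean()

# OLS on the demeaned data recovers the slope without estimating the effects.
y_w = within_transform(y, d, t)
x_w = within_transform(rain, d, t)
beta_hat = (x_w @ y_w) / (x_w @ x_w)
```

On a balanced panel, the within transformation exactly annihilates the additive district and year effects, so `beta_hat` estimates only the rainfall coefficient.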
52

Estudo do efeito de suavização da krigagem ordinária em diferentes distribuições estatísticas / A study of the ordinary kriging smoothing effect using different statistical distributions

Souza, Anelise de Lima 12 July 2007 (has links)
Esta dissertação apresenta os resultados da investigação quanto à eficácia do algoritmo de pós-processamento para a correção do efeito de suavização nas estimativas da krigagem ordinária. Foram consideradas três distribuições estatísticas distintas: gaussiana, lognormal e lognormal invertida. Como se sabe, dentre estas distribuições, a distribuição lognormal é a mais difícil de trabalhar, já que este tipo de distribuição apresenta um grande número de valores baixos e um pequeno número de valores altos, sendo estes responsáveis pela grande variabilidade do conjunto de dados. Além da distribuição estatística, outros parâmetros foram considerados: a influência do tamanho da amostra e o número de pontos da vizinhança. Para distribuições gaussianas e lognormais invertidas o algoritmo de pós-processamento funcionou bem em todas as situações. Porém, para a distribuição lognormal, foi observada a perda de precisão global. Desta forma, aplicou-se a krigagem ordinária lognormal para este tipo de distribuição; na realidade, também foi aplicado um método recém-proposto de transformada reversa de estimativas por krigagem lognormal. Esta técnica é baseada na correção do histograma das estimativas da krigagem lognormal e, então, faz-se a transformada reversa dos dados. Os resultados desta transformada reversa sempre se mostraram melhores do que os resultados da técnica clássica. Além disso, as estimativas de krigagem lognormal se provaram superiores às estimativas por krigagem ordinária. / This dissertation presents the results of an investigation into the effectiveness of the post-processing algorithm for correcting the smoothing effect of ordinary kriging estimates. Three different statistical distributions have been considered in this study: gaussian, lognormal and inverted lognormal.
As is well known, among these distributions the lognormal is the most difficult one to handle, because it presents a great number of low values and a few high values, and these high values are responsible for the great variability of the data set. Besides the statistical distribution, other parameters have been considered in this study: the influence of the sample size and of the number of neighboring data points. For gaussian and inverted lognormal distributions the post-processing algorithm worked well in all situations. However, a loss of local accuracy was observed for lognormal data. Thus, for these data the technique of ordinary lognormal kriging was applied; in fact, a recently proposed approach for back-transforming lognormal kriging estimates was also applied. This approach is based on correcting the histogram of lognormal kriging estimates and then back-transforming it to the original scale of measurement. Results of back-transformed lognormal kriging estimates were always better than those of the traditional approach. Furthermore, lognormal kriging estimates provided better results than the ordinary kriging ones.
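The smoothing effect under discussion can be illustrated with a minimal 1D ordinary kriging sketch on skewed (lognormal) data. The covariance model and all parameters are invented for illustration, and no post-processing correction is included:

```python
import numpy as np

rng = np.random.default_rng(1)

# Exponential covariance model (sill and correlation length are illustrative).
def cov(h, sill=1.0, corr_len=10.0):
    return sill * np.exp(-np.abs(h) / corr_len)

x_data = np.linspace(0.0, 100.0, 20)
z_data = rng.lognormal(mean=0.0, sigma=1.0, size=x_data.size)  # skewed sample
n = x_data.size

# Ordinary kriging system: data covariance matrix bordered by the
# unbiasedness constraint (Lagrange multiplier in the last row/column).
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(x_data[:, None] - x_data[None, :])
A[n, n] = 0.0

x_grid = np.linspace(0.0, 100.0, 200)
est = np.empty(x_grid.size)
for j, x0 in enumerate(x_grid):
    b = np.append(cov(x_data - x0), 1.0)    # data-to-target covariances, plus 1
    w = np.linalg.solve(A, b)[:n]           # kriging weights
    est[j] = w @ z_data

# The smoothing effect: the kriged map is less variable than the data.
smoothing_ratio = est.var() / z_data.var()
```

A `smoothing_ratio` below 1 is exactly the variance deficit that the post-processing algorithm studied in the dissertation is meant to correct.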
53

Confidence and Prediction under Covariates and Prior Information / Konfidenz- und Prognoseintervalle unter Kovariaten und Vorinformation

Lurz, Kristina January 2015 (has links) (PDF)
The purpose of confidence and prediction intervals is to provide an interval estimation for an unknown distribution parameter or the future value of a phenomenon. In many applications, prior knowledge about the distribution parameter is available, but rarely made use of, except in a Bayesian framework. This thesis provides exact frequentist confidence intervals of minimal volume that exploit prior information. The scheme is applied to distribution parameters of the binomial and the Poisson distribution. The Bayesian approach to obtaining intervals on a distribution parameter in the form of credibility intervals is considered, with particular emphasis on the binomial distribution. An application of interval estimation is found in auditing, where two-sided intervals of Stringer type are meant to contain the mean of a zero-inflated population. In the context of time series analysis, covariates are supposed to improve the prediction of future values. This thesis considers exponential smoothing with covariates, an extension of the popular forecasting method of exponential smoothing. A double-seasonality version of it is applied to forecast hourly electricity load under the use of meteorological covariates. Different kinds of prediction intervals for exponential smoothing with covariates are formulated. / Konfidenz- und Prognoseintervalle dienen der Intervallschätzung unbekannter Verteilungsparameter und künftiger Werte eines Phänomens. In vielen Anwendungen steht Vorinformation über einen Verteilungsparameter zur Verfügung, doch nur selten wird außerhalb von bayesscher Statistik davon Gebrauch gemacht. In dieser Dissertation werden exakte frequentistische Konfidenzintervalle unter Vorinformation kleinsten Volumens dargelegt. Das Schema wird auf Verteilungsparameter für die Binomial- und die Poissonverteilung angewandt. Der bayessche Ansatz von Intervallen für Verteilungsparameter wird in Form von Vertrauensintervallen behandelt, mit Fokus auf die Binomialverteilung.
Anwendung findet Intervallschätzung in der Wirtschaftsprüfung, wo zweiseitige Intervalle vom Stringer-Typ den Mittelwert in Grundgesamtheiten mit vielen Nullern enthalten sollen. Im Zusammenhang mit Zeitreihenanalyse dienen Kovariaten der Verbesserung von Vorhersagen zukünftiger Werte. Diese Arbeit beschäftigt sich mit exponentieller Glättung mit Kovariaten als eine Erweiterung der gängigen Prognosemethode der exponentiellen Glättung. Eine Version des Modells, welche doppelte Saison berücksichtigt, wird in der Prognose des stündlichen Elektrizitätsbedarfs unter Zuhilfenahme von meteorologischen Variablen eingesetzt. Verschiedene Arten von Prognoseintervallen für exponentielle Glättung mit Kovariaten werden beschrieben.
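A heavily simplified, hypothetical version of exponential smoothing with one covariate (level only, no seasonality, known coefficient) can illustrate the model structure the abstract refers to:

```python
import numpy as np

rng = np.random.default_rng(2)

def es_with_covariate(y, x, alpha=0.3, b=1.5):
    """Level-only exponential smoothing with one covariate:
    forecast_t = level_{t-1} + b * x_t,
    level_t   = alpha * (y_t - b * x_t) + (1 - alpha) * level_{t-1}."""
    level = y[0] - b * x[0]
    fitted = np.empty_like(y)
    fitted[0] = y[0]
    for t in range(1, len(y)):
        fitted[t] = level + b * x[t]        # one-step-ahead forecast
        level = alpha * (y[t] - b * x[t]) + (1 - alpha) * level
    return fitted

# Synthetic load-like series driven by a smooth covariate (e.g. temperature).
x = np.sin(np.linspace(0.0, 8.0 * np.pi, 200))
y = 10.0 + 1.5 * x + rng.normal(scale=0.2, size=200)

fitted = es_with_covariate(y, x)
rmse = np.sqrt(np.mean((y[1:] - fitted[1:]) ** 2))
```

The thesis's model additionally handles double seasonality, estimated coefficients, and prediction intervals; this sketch only shows how the covariate term is subtracted before the level update and added back when forecasting.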
54

Resultatutjämning: en jämförelsestudie efter införandet av IFRS / Income smoothing: a comparative study following the introduction of IFRS

Ardenstedt, Therese, Friberg, Jessica January 2007 (has links)
No description available.
55

Nonlinear Analog Networks for Image Smoothing and Segmentation

Lumsdaine, A., Wyatt, J.L., Jr., Elfadel, I.M. 01 January 1991 (has links)
Image smoothing and segmentation algorithms are frequently formulated as optimization problems. Linear and nonlinear (reciprocal) resistive networks have solutions characterized by an extremum principle. Thus, appropriately designed networks can automatically solve certain smoothing and segmentation problems in robot vision. This paper considers switched linear resistive networks and nonlinear resistive networks for such tasks. The latter network type is derived from the former via an intermediate stochastic formulation, and a new result relating the solution sets of the two is given for the "zero temperature" limit. We then present simulation studies of several continuation methods that can be gracefully implemented in analog VLSI and that seem to give "good" results for these non-convex optimization problems.
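A loose discrete analogue of the nonlinear-resistive-network idea is edge-preserving smoothing in which the "conductance" between neighboring nodes switches off beyond a threshold (a fuse). This sketch is illustrative only; it is not the paper's circuit or its continuation schedule:

```python
import numpy as np

rng = np.random.default_rng(3)

def nonlinear_smooth(d, lam=2.0, T=0.5, iters=500, step=0.1):
    """Gradient descent on E(u) = 0.5*sum((u - d)^2) + 0.5*lam*sum(rho(du)),
    where rho is a truncated quadratic: the inter-node 'conductance'
    switches off once a neighbor difference exceeds T, so edges survive."""
    u = d.copy()
    for _ in range(iters):
        diff = np.diff(u)
        g = np.where(np.abs(diff) < T, diff, 0.0)   # fuse opens beyond T
        grad = (u - d) + lam * (np.concatenate(([0.0], g))
                                - np.concatenate((g, [0.0])))
        u = u - step * grad
    return u

# Noisy step signal: the noise is smoothed while the height-2 edge is kept.
d = np.concatenate((np.zeros(50), 2.0 * np.ones(50))) + rng.normal(scale=0.1, size=100)
u = nonlinear_smooth(d)
```

The truncated-quadratic energy is non-convex, which is why the paper resorts to continuation methods; here a single fixed threshold suffices because the example's edge is far larger than its noise.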
57

Rao-Blackwellized particle smoothers for mixed linear/nonlinear state-space models

Lindsten, Fredrik, Bunch, Pete, Godsill, Simon J., Schön, Thomas B. January 2013 (has links)
We consider the smoothing problem for a class of conditionally linear Gaussian state-space (CLGSS) models, referred to as mixed linear/nonlinear models. In contrast to the better studied hierarchical CLGSS models, these allow for an intricate cross dependence between the linear and the nonlinear parts of the state vector. We derive a Rao-Blackwellized particle smoother (RBPS) for this model class by exploiting its tractable substructure. The smoother is of the forward filtering/backward simulation type. A key feature of the proposed method is that, unlike existing RBPS for this model class, the linear part of the state vector is marginalized out in both the forward direction and in the backward direction. / CNDM / CADICS
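The forward filtering/backward simulation structure can be sketched for a toy scalar model with a plain (non-Rao-Blackwellized) particle smoother; the model and all parameters below are invented for illustration and are much simpler than the CLGSS class the paper treats:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy model: x_{t+1} = 0.9 x_t + v_t,  y_t = x_t + e_t.
T, N, M = 100, 300, 50          # time steps, forward particles, backward trajectories
q, r = 0.5, 0.5                 # process / measurement noise std

x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + q * rng.normal()
y = x_true + r * rng.normal(size=T)

# Forward pass: bootstrap particle filter, storing particles and log-weights.
parts = np.zeros((T, N))
logw = np.zeros((T, N))
parts[0] = rng.normal(size=N)
for t in range(T):
    if t > 0:
        idx = rng.choice(N, size=N, p=w)                    # resample
        parts[t] = 0.9 * parts[t - 1, idx] + q * rng.normal(size=N)
    lw = -0.5 * ((y[t] - parts[t]) / r) ** 2
    w = np.exp(lw - lw.max())
    w /= w.sum()
    logw[t] = np.log(w)

# Backward pass: simulate trajectories with weights w_t * f(x_{t+1} | x_t).
traj = np.zeros((T, M))
traj[-1] = parts[-1, rng.choice(N, size=M, p=w)]
for t in range(T - 2, -1, -1):
    for m in range(M):
        lb = logw[t] - 0.5 * ((traj[t + 1, m] - 0.9 * parts[t]) / q) ** 2
        wb = np.exp(lb - lb.max())
        wb /= wb.sum()
        traj[t, m] = parts[t, rng.choice(N, p=wb)]

x_smooth = traj.mean(axis=1)
rmse_smooth = np.sqrt(np.mean((x_smooth - x_true) ** 2))
rmse_obs = np.sqrt(np.mean((y - x_true) ** 2))
```

The Rao-Blackwellized smoother of the paper additionally marginalizes the linear substate analytically in both passes, which this generic sketch does not attempt.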
58

Variational based analysis and modelling using B-splines

Sherar, P. A. January 2004 (has links)
The use of energy methods and variational principles is widespread in many fields of engineering of which structural mechanics and curve and surface design are two prominent examples. In principle many different types of function can be used as possible trial solutions to a given variational problem but where piecewise polynomial behaviour and user controlled cross segment continuity is either required or desirable, B-splines serve as a natural choice. Although there are many examples of the use of B-splines in such situations there is no common thread running through existing formulations that generalises from the one dimensional case through to two and three dimensions. We develop a unified approach to the representation of the minimisation equations for B-spline based functionals in tensor product form and apply these results to solving specific problems in geometric smoothing and finite element analysis using the Rayleigh-Ritz method. We focus on the development of algorithms for the exact computation of the minimisation matrices generated by finding stationary values of functionals involving integrals of squares and products of derivatives, and then use these to seek new variational based solutions to problems in the above fields. By using tensor notation we are able to generalise the methods and the algorithms from curves through to surfaces and volumes. The algorithms developed can be applied to other fields where a variational form of the problem exists and where such tensor product B-spline functions can be specified as potential solutions.
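The idea of minimising a fidelity-plus-smoothness functional over a B-spline basis can be sketched with a penalized least-squares fit. Note that the penalty below is a discrete second difference on the coefficients (a P-spline shortcut), not the exact integrals of squared derivatives that the thesis computes:

```python
import numpy as np
from scipy.interpolate import splev

def bspline_design(x, t, k=3):
    """Design matrix B[i, j] = B_j(x_i), built column by column via splev."""
    n = len(t) - k - 1
    B = np.empty((len(x), n))
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        B[:, j] = splev(x, (t, c, k))
    return B

rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.1, size=x.size)

# Clamped cubic knot vector on [0, 1] with 11 uniform interior knots.
knots = np.concatenate((np.zeros(3), np.linspace(0.0, 1.0, 11), np.ones(3)))
B = bspline_design(x, knots)

# Minimise ||y - B c||^2 + lam * ||D2 c||^2, D2 = second-difference matrix.
n = B.shape[1]
D2 = np.diff(np.eye(n), n=2, axis=0)
lam = 1.0
coef = np.linalg.solve(B.T @ B + lam * D2.T @ D2, B.T @ y)
y_hat = B @ coef
rmse = np.sqrt(np.mean((y_hat - np.sin(2.0 * np.pi * x)) ** 2))
```

The normal equations here are the Rayleigh-Ritz stationarity conditions for the discretized functional; the thesis's contribution is computing the analogous minimisation matrices exactly, in tensor product form, for curves, surfaces, and volumes.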
59

Chinese input method based on reduced phonetic transcription

Hsu, Feng-Ho 22 May 2012 (has links)
This paper investigates a highly efficient Chinese input method. In the traditional Mandarin phonetic input method, users have to type the complete Mandarin phonetic symbols for each character. The proposed method instead transforms a sequence of first Mandarin phonetic symbols into a character sequence: users type only the first phonetic symbol of each character and insert spaces between words, and the system outputs candidate character sequence hypotheses. A bigram model describes the relation between words, and dynamic programming is used for decoding. We evaluate the feasibility of the new input method and also evaluate the Stanford segmenter. In the experiments, we first evaluate the Stanford segmenter on Simplified and Traditional Chinese: precision and recall on Simplified Chinese are 84.52% and 85.20%, better than the 68.43% and 63.43% obtained on Traditional Chinese. We then evaluate system performance with language models trained separately on the WIKI corpus and the ASBC corpus. Sentence and word accuracy for the ASBC corpus are 39.8% and 70.3%, while word and character accuracy for the WIKI corpus are 20.3% and 53.3%. Finally, we examine the number of candidate hypotheses and find that sentence accuracy with 10 hypotheses is close to that with 20.
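The bigram-plus-dynamic-programming decoding step can be sketched with a toy Viterbi search. The candidate table and probabilities below are invented for illustration, whereas the thesis trains its models on the WIKI and ASBC corpora:

```python
import math

# Each abbreviated input symbol maps to candidate characters (toy lexicon).
candidates = {"j": ["今", "就"], "t": ["天", "他"]}

# Toy bigram probabilities P(curr | prev); unseen pairs get a small floor.
bigram = {
    ("<s>", "今"): 0.4, ("<s>", "就"): 0.3,
    ("今", "天"): 0.6, ("就", "他"): 0.2,
}
FLOOR = 1e-4

def viterbi(symbols):
    """Dynamic-programming search for the most probable character sequence."""
    paths = {"<s>": (0.0, [])}              # state -> (log-prob, history)
    for s in symbols:
        new_paths = {}
        for cand in candidates[s]:
            best = max(
                (lp + math.log(bigram.get((prev, cand), FLOOR)), hist)
                for prev, (lp, hist) in paths.items()
            )
            new_paths[cand] = (best[0], best[1] + [cand])
        paths = new_paths
    return max(paths.values())[1]

result = viterbi(["j", "t"])
```

Typing the two initial symbols "j t" decodes to 今天 ("today") because the bigram (今, 天) dominates the alternatives; a real system would score tens of candidates per symbol with corpus-estimated probabilities.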
60

Stability Analysis of Voxel-based Cortical Thickness Measurement of Human Brain

Chung, Run-Hong 04 September 2012 (has links)
The cerebral cortex is the gray matter tissue that covers the cerebral hemispheres. In recent years, many studies have reported abnormal cortical thickness in several diseases of the central nervous system, such as multiple sclerosis, Alzheimer's disease, and schizophrenia. Whole-brain measurement of cortical thickness using non-invasive magnetic resonance imaging has therefore become important. However, few algorithms have been reported, owing to the extremely complex folding structure of the human cortex. In this thesis, the voxel-based cortical thickness method proposed by Hutton et al. was implemented in MATLAB to achieve automated measurement. Several crucial implementation factors were discussed, including the definition of the boundary condition, the interpolation method, the step size used to develop each streamline, and the spatial resolution of the imaging space. In addition, the stability, or precision, of our self-developed program was evaluated. Sixteen reproducibility experiments were performed over two months on the same healthy 24-year-old volunteer to obtain whole-brain 3D T1WI. Cortical thickness maps were calculated independently and normalized to the same coordinate space. The mean, standard deviation, and normalized standard deviation of the 16 measurements were calculated at every cortical voxel, along with the whole-brain mean cortical thickness. Various sizes of 3D smoothing kernel were applied, and the results showed that stronger smoothing may improve precision at the cost of spatial resolution.
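The reported precision-versus-resolution trade-off can be reproduced qualitatively on synthetic data: repeated noisy "thickness maps" smoothed with a wider Gaussian kernel show a lower voxelwise standard deviation across repetitions. All sizes and noise levels below are invented for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(6)

# 16 repeated "thickness maps": a fixed spatial pattern plus independent
# measurement noise, standing in for the 16 reproducibility scans.
shape = (32, 32, 32)
truth = 2.5 + 0.5 * rng.random(shape)                    # "true" thickness, mm
scans = truth + rng.normal(scale=0.3, size=(16,) + shape)

def mean_voxelwise_sd(scans, sigma):
    """Smooth each repetition, then average the across-repetition std map."""
    smoothed = np.stack([gaussian_filter(s, sigma) for s in scans])
    return smoothed.std(axis=0).mean()

sd_narrow = mean_voxelwise_sd(scans, sigma=1.0)
sd_wide = mean_voxelwise_sd(scans, sigma=3.0)
```

Averaging independent noise over a wider neighborhood shrinks its variance, so `sd_wide` comes out below `sd_narrow`, but each smoothed voxel now mixes thickness values from a larger region, which is exactly the loss of spatial resolution the thesis notes.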
