  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
371

Implementering av en användbar e-butik för tygmärken : En praktisk fallstudie av webbapplikationen Festing / Implementing a useable e-shop for textile patches : A practical case study of the web application Festing

Barrdahl, Adam, Berg, Josefin, Fridberg, Pontus, Gunnarsson, David, Lidberg, Sara, Lindblom, Martin, Melin Wenström, Peter, Olsson, Gustav, Ström, Johan January 2016 (has links)
This report covers the development of the web application Festing and the results and experiences gained during the process. The purpose of the study was to examine how the selling of patches could be simplified and improved by an advanced web application designed with a focus on usability. The patches are currently sold in a large but geographically and temporally limited market at Linköping University. The project comprised a case study and the development of an application following the agile framework Scrum: the work was split into four sprints, at the end of each of which a working product was delivered. The result is a usable web application with an intuitive design and multiple functions linked to the selling and buying of patches. Based on interviews, surveys and usability tests, this web application is considered to have the long-term potential to take over the current physical market for textile patches.
372

Mätning av svarstider på grafiskt tunga webbutiker : Med användning av AngularJS, Twitter Bootstrap och Foundation / Measuring response time of heavy graphic e-commerce : Using AngularJS, Twitter Bootstrap and Foundation

Saleh, Jango January 2015 (has links)
No description available.
373

Estimating the Local False Discovery Rate via a Bootstrap Solution to the Reference Class Problem: Application to Genetic Association Data

Abbas Aghababazadeh, Farnoosh January 2015 (has links)
Modern scientific technology such as microarrays, imaging devices, genome-wide association studies and social science surveys provides statisticians with hundreds or even thousands of tests to consider simultaneously. Testing many thousands of null hypotheses increases the number of Type I errors. In large-scale hypothesis testing, researchers can use techniques such as family-wise error rates, false discovery rates, permutation methods and the local false discovery rate, where all available data are usually analyzed together. In applications, the thousands of tests are often related by a scientifically meaningful structure; ignoring that structure can be misleading, as it may increase the number of false positives and false negatives. In genome-wide association studies, for example, each test corresponds to a specific genetic marker, and the scientific structure of each marker can be its minor allele frequency. In this research, the local false discovery rate is the statistical approach used to analyze the thousands of tests together. We present a model for multiple hypothesis testing in which the scientific structure of each test is incorporated as a covariate, with the aim of improving the performance of the testing procedure. The method considered yields different estimates depending on a tuning parameter, whose optimal value we estimate from the observed statistics: among the candidate estimators, the one that minimizes the estimated errors due to bias and variance is chosen by applying the bootstrap. This estimation method is called the adaptive reference class method. Under the combined reference class method, the effect of the covariate is ignored and all null hypotheses are analyzed together. Under some assumptions on the covariates and the prior probabilities, the proposed adaptive reference class method shows smaller error than the combined reference class method in estimating the local false discovery rate as the number of tests grows large. We apply the adaptive reference class method to coronary artery disease data and use simulated data to evaluate the performance of the associated estimator.
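As a hedged illustration of the two-groups model underlying the local false discovery rate discussed above (a minimal Python sketch; the kernel-density marginal, the fixed null proportion pi0 and the synthetic z-scores are illustrative assumptions, not the thesis's adaptive estimator):

    import numpy as np
    from scipy.stats import norm, gaussian_kde

    def local_fdr(z, pi0=0.9):
        # Two-groups model: lfdr(z) = pi0 * f0(z) / f(z), where f0 is the
        # theoretical null density and f the marginal density of all tests.
        f0 = norm.pdf(z)
        f = gaussian_kde(z)(z)
        return np.clip(pi0 * f0 / f, 0.0, 1.0)

    # Example: 10,000 tests, 5% of them non-null with shifted means.
    rng = np.random.default_rng(0)
    z = np.concatenate([rng.normal(0, 1, 9500), rng.normal(3, 1, 500)])
    lfdr = local_fdr(z)
    print("discoveries at lfdr < 0.2:", int((lfdr < 0.2).sum()))

The adaptive reference class method goes further: it partitions the tests into classes according to the covariate (e.g., minor allele frequency) and uses the bootstrap to pick the partition that minimizes the estimated bias-plus-variance error.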
374

Analyse des états de surface en science des matériaux : caractérisation multi-échelles par ondelette et détermination de l'anisotropie des surfaces / Analysis of surface states in materials science : multi-scale wavelet characterization and determination of the anisotropy of the surfaces

Khawaja, Zahra 21 January 2014 (has links)
Monitoring and control of surface states is a major need in industry. Numerous studies of the interactions between surface morphology and physical, chemical or mechanical mechanisms have been conducted; however, a more precise characterization adapted to particular domains and needs is necessary. It consists in finding the most relevant roughness parameters connecting the topography of a surface to the physical phenomena it undergoes or to the properties of the material of which it is composed. In this work, a software tool for characterizing surface states was developed. This tool, named "MesRug", computes roughness parameters, extracts the most relevant ones and determines the most appropriate scale for a given application. The search for the most relevant parameters is carried out by a statistical approach (analysis of variance, ANOVA, combined with the bootstrap). A characterization was performed using 2D measurement data on abraded surfaces, and the influence of the shape of discrete and continuous wavelets on the detection of the relevant scale of the abrasion mechanism was tested. We conclude that the wavelet decomposition makes it possible to quantify and localize the scales of abrasion of the machining process for all process parameters; however, the ability to identify the relevant scales of abrasion does not depend on the shape of the wavelet. In addition, a new 3D roughness parameter is proposed to quantify the regularity of a surface independently of the amplitude and of the scanning-length units. The efficiency of this parameter is tested on noisy periodic surfaces with varying degrees of anisotropy. Its value lies between zero (pure noise) and 100% (a perfect sinusoidal surface), and it allows the directions of anisotropy of regularity to be identified for a given surface.
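A minimal sketch of a multi-scale wavelet decomposition of a roughness profile of the kind described above, using the PyWavelets package; the Daubechies-4 wavelet, the synthetic profile and the energy-per-level summary are illustrative choices, not those of the thesis:

    import numpy as np
    import pywt

    # Synthetic roughness profile: periodic waviness plus fine-scale noise.
    rng = np.random.default_rng(1)
    x = np.linspace(0, 10, 1024)
    profile = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=x.size)

    # Multi-level discrete wavelet decomposition of the profile.
    coeffs = pywt.wavedec(profile, "db4", level=5)

    # Detail energy per level: a large detail energy flags the scales at
    # which a mechanism such as abrasion leaves its signature.
    for lvl, detail in enumerate(coeffs[1:], start=1):
        print(f"level {lvl}: detail energy = {np.sum(detail ** 2):.2f}")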
375

Kvantilová regrese / Quantile Regression

Procházka, Jiří January 2015 (has links)
The thesis gives a brief introduction to quantile regression theory and is divided into three thematic parts. The first part provides a general introduction to quantile regression, its theoretical aspects and the basic approaches to estimating quantile regression parameters. The second part focuses on the general and asymptotic properties of quantile regression; its goal is to compare quantile regression with traditional OLS regression and to outline possible applications. The third part describes statistical inference: the construction of confidence intervals and the testing of statistical hypotheses about quantile regression parameters. Its goal is to introduce both the traditional approach and an approach based on resampling procedures, and finally to compare the different approaches and, where appropriate, propose partial modifications.
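As a hedged illustration of the estimation problem introduced above: the τ-th regression quantile minimizes the asymmetric check ("pinball") loss, which the statsmodels package solves directly. The heteroscedastic data below are synthetic, chosen so that different quantiles have visibly different slopes:

    import numpy as np
    import statsmodels.api as sm

    # Heteroscedastic data: the spread of y grows with x.
    rng = np.random.default_rng(2)
    x = rng.uniform(0, 10, 500)
    y = 1.0 + 0.5 * x + (0.2 + 0.3 * x) * rng.normal(size=x.size)

    X = sm.add_constant(x)
    for tau in (0.1, 0.5, 0.9):
        fit = sm.QuantReg(y, X).fit(q=tau)
        print(f"tau={tau}: intercept={fit.params[0]:.3f}, slope={fit.params[1]:.3f}")

Unlike OLS, which models only the conditional mean, the fitted slopes differ across τ, recovering the growing spread.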
376

Analýza vlastností robustních odhadů / Analysis of Properties of Robust Estimates

Sládek, Václav January 2014 (has links)
The aim of this thesis is to analyze the properties of robust estimates and to compare the estimates with one another with respect to these properties. The analysis depends on the type of variable (continuous or discrete), its probability distribution, the size of the random sample and the proportion of outliers in the sample. Comparing the properties in different situations yields guidance on which estimate is preferable in a specific situation and which should rather be avoided. An adjusted bootstrap method is used to estimate the properties of the estimates. The thesis is divided into two parts. The first part reviews parameter estimation, the types and design of robust estimators, and the bootstrap method. In the second, practical part we assess the suitability of the bootstrap for estimating the properties of robust estimators, then obtain those estimates and compare them; at its conclusion we compute and compare bootstrap confidence intervals on real data on household income. The results show that the bootstrap method does not provide good estimates of the properties of robust estimators in all cases. They also lead to the conclusion that, from a certain sample size onward and regardless of the number of outliers, a robust estimate can be chosen solely on the basis of its value, since the properties of the robust estimates are then very similar. The robust estimates of variability considered here are not suitable in most cases.
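A minimal sketch of the kind of bootstrap assessment of estimator properties described above, assuming a contaminated continuous sample; the estimators (median and 10% trimmed mean) and the contamination model are illustrative, not the thesis's full design:

    import numpy as np
    from scipy.stats import trim_mean

    rng = np.random.default_rng(3)

    # Contaminated sample: 90% N(0, 1), 10% gross outliers from N(0, 10).
    n = 200
    sample = np.where(rng.random(n) < 0.9,
                      rng.normal(0, 1, n), rng.normal(0, 10, n))

    def bootstrap_properties(data, estimator, B=2000):
        # Bootstrap estimates of an estimator's bias and standard error.
        stats = np.array([estimator(rng.choice(data, data.size, replace=True))
                          for _ in range(B)])
        return stats.mean() - estimator(data), stats.std(ddof=1)

    for name, est in [("median", np.median),
                      ("10% trimmed mean", lambda d: trim_mean(d, 0.1))]:
        bias, se = bootstrap_properties(sample, est)
        print(f"{name}: bootstrap bias = {bias:+.4f}, std. error = {se:.4f}")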
377

Optimal Sampling Designs for Functional Data Analysis

January 2020 (has links)
Functional regression models are widely used in practice. To understand an underlying functional mechanism precisely, a good sampling schedule for collecting informative functional data is necessary, especially when data collection is limited; however, little research has so far been conducted on optimal sampling schedule design for functional regression models. To address this design issue, efficient approaches are proposed for generating the best sampling plan in the functional regression setting. First, three optimal experimental designs are considered under a function-on-function linear model: the schedule that maximizes the relative efficiency for recovering the predictor function, the schedule that maximizes the relative efficiency for predicting the response function, and the schedule that maximizes a mixture of the two. The obtained sampling plan allows a precise recovery of the predictor function and a precise prediction of the response function, and the approach also reduces to identifying the optimal sampling plan for a scalar-on-function linear regression model. In addition, the optimality criterion for predicting a scalar response from a functional predictor is derived when the relationship between the two variables is quadratic, and proofs of important properties of the derived criterion are provided. To find such designs, a comparatively fast algorithm that generates nearly optimal designs is proposed. Because the optimality criterion includes quantities that must be estimated from prior knowledge (e.g., a pilot study), the effectiveness of the suggested design depends strongly on the quality of those estimates; in many situations the estimates are unreliable, so a bootstrap aggregating (bagging) approach is employed to enhance their quality and to find sampling schedules that are stable under misspecification of the estimates. Case studies demonstrate that the proposed designs outperform other designs in accurately predicting the response and recovering the predictor, and that the bagging-enhanced design is more robust to misspecification of the estimated quantities. / Dissertation/Thesis / Doctoral Dissertation Statistics 2020
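A hedged sketch of the general idea rather than the thesis's actual criterion: estimate the covariance of the process from pilot curves, stabilize the estimate by bagging over bootstrap resamples, then greedily pick k time points maximizing the log-determinant of the selected covariance submatrix (a standard D-optimality-style heuristic):

    import numpy as np

    rng = np.random.default_rng(4)

    # Pilot data: 30 curves observed on a dense grid of 50 time points.
    grid = np.linspace(0, 1, 50)
    pilot = np.sin(2 * np.pi * np.outer(rng.uniform(0.8, 1.2, 30), grid)) \
            + 0.1 * rng.normal(size=(30, 50))

    # Bagging: average covariance estimates over bootstrap resamples.
    B = 200
    cov = np.zeros((50, 50))
    for _ in range(B):
        idx = rng.integers(0, 30, 30)
        cov += np.cov(pilot[idx], rowvar=False)
    cov /= B

    def greedy_design(cov, k):
        # Greedily add the point that most increases the log-determinant
        # of the covariance submatrix restricted to the chosen points.
        chosen = []
        for _ in range(k):
            best, best_val = None, -np.inf
            for j in range(cov.shape[0]):
                if j in chosen:
                    continue
                sub = cov[np.ix_(chosen + [j], chosen + [j])]
                val = np.linalg.slogdet(sub + 1e-9 * np.eye(len(sub)))[1]
                if val > best_val:
                    best, best_val = j, val
            chosen.append(best)
        return sorted(chosen)

    print("selected time points:", grid[greedy_design(cov, 6)])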
378

Svavelhaltsmätning av bränd kalk från Rättvik / Measurement of the sulphur content of quicklime from Rättvik

Makhmour, Salim, Thunström, Robert January 2016 (has links)
This thesis project was carried out by two students on behalf of SMA Mineral AB, owner of the lime plant in Rättvik, where a sampling method for the local quicklime product needed to be established. The aim was to ensure a maximum concentration of impurities in the product, primarily carbon and sulphur. The mean sulphur content of the input material varied over time, so a suitable statistical method was needed to ensure product quality for the prospective customer, who required that the sulphur content of the product never exceed 500 ppm. The aim was, on the one hand, to process and compile the sampling results with a suitable statistical method enabling reasonable conclusions about product quality and, on the other hand, to answer three key questions posed by SMA Mineral AB:
• whether the product's sulphur content was affected during conveyance through the lime plant;
• whether sampling at various time intervals may have affected the product's sulphur content;
• whether there were any particular times of day at which the sulphur content always remained at the correct level.
A number of phases were required to answer these questions. The planning phase began with a visit to Rättvik to gain an overall picture of how work at the plant was conducted and of the guidelines and regulations in effect; a project plan was then drawn up to support the further work. The sampling campaign took place on 13–16 April 2015, and the collected material was analyzed the following week at the company's laboratory in Persberg, Sweden. The results from the sampling campaign alone did not provide a sufficient basis for answering the company's questions, so data from SMA Mineral AB's own data collection were also used. The data collected during the campaign proved to follow a normal distribution, so analysis of variance (ANOVA) was applied to investigate whether the sulphur content changed with respect to the time interval and the sampling site. The tests yielded p-values above 0.05, meaning that neither the sampling site nor the sampling time interval had a demonstrable effect on the product's sulphur content. The question of whether there were daily time intervals with acceptable sulphur content was answered using the company's own data collection, which did not follow a normal distribution; for that reason, the bootstrap method was used to create confidence intervals for the different points in time. The result showed that there were no points in time at which acceptable material was consistently produced, partly because of a set of deviating observations with sulphur content exceeding 1,000 ppm. The report presents recommendations for various measures, independently of any views SMA Mineral AB may hold on the source of these values and whether they can be avoided.
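A minimal sketch of the percentile-bootstrap confidence intervals described above, assuming hourly sulphur measurements in ppm; the data, the skewed distribution and the 500 ppm limit check are illustrative, not the company's records:

    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical hourly sulphur measurements (ppm), right-skewed with
    # occasional deviating values above 1,000 ppm, as in the report.
    sulphur = np.concatenate([rng.lognormal(6.0, 0.3, 95),
                              rng.uniform(1000, 1500, 5)])

    def percentile_ci(data, stat=np.mean, B=10_000, alpha=0.05):
        # Percentile bootstrap confidence interval for a statistic.
        boot = np.array([stat(rng.choice(data, data.size, replace=True))
                         for _ in range(B)])
        return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

    lo, hi = percentile_ci(sulphur)
    print(f"95% CI for mean sulphur content: [{lo:.0f}, {hi:.0f}] ppm")
    print("meets 500 ppm requirement:", hi < 500)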
379

Posouzení informačního systému firmy a návrh změn / Information System Assessment and Proposal for ICT Modification

Záklasník, Martin January 2017 (has links)
The master's thesis focuses on the information system of a brokerage company. It analyzes the system's strengths and weaknesses, describes suggested changes, including improvements in project management, and presents recommendations and a vision for the future.
380

Kaon to two-pion decay and pion-pion scattering from lattice QCD

Wang, Tianle January 2021 (has links)
In this work, we present a lattice QCD calculation of two closely related quantities: 1) the 𝜋𝜋 scattering phase shift for both the 𝑰=0 and 𝑰=2 channels at seven energies in total, and 2) the 𝜟𝑰=1/2, 𝛫 → 𝜋𝜋 decay amplitude 𝐴₀ and 𝜖′, the measure of direct CP violation. These two results improve on our earlier calculation presented in 2015 [1]. The calculation is performed on a 32³ × 64 lattice ensemble with 𝑎⁻¹ = 1.3784(68) GeV. This is a physical calculation: chiral symmetry breaking is controlled by the 2+1-flavor Möbius domain wall fermion action, and both the kaon and the pion take their physical values. The G-parity boundary condition is used and carefully tuned so that the ground-state energy of the 𝑰=0 𝜋𝜋 state matches the kaon mass. Three sets of 𝜋𝜋 interpolating operators are used, including a scalar bilinear "σ" operator and paired single-pion bilinear operators with the constituent pions carrying various relative momenta. Several techniques, including correlated fits and a bootstrap determination of the 𝑝-value, have been used, and a detailed analysis of all major systematic errors is performed. The 𝜋𝜋 scattering phase shift results are presented in Fig. 5.10 and Tab. 5.12. For the kaon decay amplitude, we obtain Re(𝐴₀) = 2.99(0.32)(0.59) × 10⁻⁷ GeV, consistent with the experimental value Re(𝐴₀) = 3.3201(18) × 10⁻⁷ GeV, and Im(𝐴₀) = −6.98(0.62)(1.44) × 10⁻¹¹ GeV. Combined with our earlier lattice calculation of 𝐴₂ [2], we obtain Re(𝜖′/𝜖) = 21.7(2.6)(6.2)(5.0) × 10⁻⁴, which agrees well with the experimental value Re(𝜖′/𝜖) = 16.6(2.3) × 10⁻⁴, and Re(𝐴₀)/Re(𝐴₂) = 19.9(2.3)(4.4), consistent with the experimental value Re(𝐴₀)/Re(𝐴₂) = 22.45(6), known as the 𝜟𝑰=1/2 rule.
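For reference, the standard phenomenological relation used to combine the two isospin amplitudes into 𝜖′ (quoted here from the general literature rather than from the thesis itself), with δ₀ and δ₂ the s-wave 𝜋𝜋 scattering phase shifts:

    \epsilon' = \frac{i\, e^{\,i(\delta_2 - \delta_0)}}{\sqrt{2}}\, \omega
                \left( \frac{\operatorname{Im} A_2}{\operatorname{Re} A_2}
                     - \frac{\operatorname{Im} A_0}{\operatorname{Re} A_0} \right),
    \qquad \omega = \frac{\operatorname{Re} A_2}{\operatorname{Re} A_0}

The smallness of ω (about 1/22.45, the 𝜟𝑰=1/2 rule quoted above) is part of what makes 𝜖′, and hence direct CP violation, so small.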
