621 |
Log-selection strategies in a real-time system (Gillström, Niklas, January 2014)
This thesis presents and evaluates how to select the data to be logged in an embedded real-time system so as to give confidence that the fault(s) that caused any runtime errors can be accurately identified. Several log-selection strategies were evaluated by injecting random faults into a simulated real-time system. An instrument was created to perform accurate detection and identification of these faults by evaluating log data, and its output was compared to ground truth to determine its accuracy. Three strategies for selecting the log entries to keep in limited permanent memory were created and evaluated using log data from the simulated real-time system. One of the log-selection strategies performed much better than the other two: it minimized processing time and stored the maximum amount of useful log data in the available storage space. / This thesis illustrates how it was determined what should be logged in an embedded real-time system in order to give confidence that an accurate identification of the fault(s) that caused runtime errors is possible. A number of log-selection strategies were evaluated by injecting random faults into a simulated real-time system. An instrument was constructed to perform accurate detection and identification of these faults by evaluating log data, and the instrument's output was compared with a reference value to determine its accuracy. Three strategies were created for deciding which log entries to keep in the limited permanent storage space, and they were evaluated using log data from the simulated real-time system. One of the log-selection strategies performed clearly better than the other two: it minimized processing time and stored the maximum amount of useful log data in the permanent storage space.
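The abstract does not spell out the three retention strategies, so the sketch below is only a generic illustration of what a log-selection policy for bounded permanent storage can look like: a priority-ordered buffer that evicts the least useful entry once capacity is reached. The priority weighting, the capacity and the messages are invented for the example, not taken from the thesis.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class LogEntry:
    priority: int                 # assumed score, e.g. severity weighted by recency
    payload: str = field(compare=False)

class SelectiveLog:
    """Illustrative retention policy (not one of the thesis's three strategies):
    keep only the highest-priority entries that fit in fixed permanent storage."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []            # min-heap: the lowest-priority kept entry sits on top

    def record(self, priority, payload):
        entry = LogEntry(priority, payload)
        if len(self.heap) < self.capacity:
            heapq.heappush(self.heap, entry)
        elif priority > self.heap[0].priority:
            heapq.heapreplace(self.heap, entry)   # evict the least useful entry

log = SelectiveLog(capacity=3)
for prio, msg in [(1, "heartbeat"), (5, "task overrun"), (2, "queue 80% full"),
                  (9, "watchdog reset"), (4, "deadline miss")]:
    log.record(prio, msg)
print(sorted((e.priority, e.payload) for e in log.heap))
```

An embedded implementation would of course use a statically allocated buffer rather than a dynamically growing heap, but the selection logic stays the same.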
|
622 |
On families of distributions with a unimodal hazard rate function: applications in reliability and survival analysis / Sur les familles des lois de fonction de hasard unimodale : applications en fiabilité et analyse de survie (Saaidia, Noureddine, 24 June 2013)
In reliability and survival analysis, distributions with a unimodal hazard rate function are not numerous; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponentiated Weibull and generalized Weibull distributions. In this thesis, we develop modified chi-squared tests for these distributions and compare the inverse Gaussian distribution with the others. We then construct the AFT model based on the inverse Gaussian distribution, as well as redundant systems based on distributions with a unimodal hazard rate function. / In reliability and survival analysis, distributions that have a unimodal or $\cap$-shaped hazard rate function are not numerous; they include the inverse Gaussian, log-normal, log-logistic, Birnbaum-Saunders, exponentiated Weibull and power generalized Weibull distributions. In this thesis, we develop modified chi-squared tests for these distributions, give a comparative study between the inverse Gaussian distribution and the other distributions, and then carry out simulations. We also construct the AFT model based on the inverse Gaussian distribution and redundant systems based on distributions having a unimodal hazard rate function.
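For reference, the inverse Gaussian distribution that the thesis takes as its benchmark has a hazard rate that rises to a single peak and then decays, which is what places it in this family. The sketch below evaluates that hazard numerically using the common (mean mu, shape lambda) parameterization; the thesis may use different notation, so treat this only as a reminder of the shape involved, with parameter values chosen purely for illustration.

```python
import numpy as np
from scipy.stats import norm

def ig_pdf(x, mu, lam):
    """Inverse Gaussian density, standard (mean mu, shape lam) parameterization."""
    return np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu)**2 / (2 * mu**2 * x))

def ig_cdf(x, mu, lam):
    a = np.sqrt(lam / x)
    return norm.cdf(a * (x / mu - 1)) + np.exp(2 * lam / mu) * norm.cdf(-a * (x / mu + 1))

def ig_hazard(x, mu, lam):
    # hazard = density / survival
    return ig_pdf(x, mu, lam) / (1.0 - ig_cdf(x, mu, lam))

x = np.linspace(0.05, 10, 400)
h = ig_hazard(x, mu=1.0, lam=2.0)                 # assumed parameter values
print("hazard peaks near x =", x[np.argmax(h)])   # rises, peaks, then decays: unimodal
```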
|
623 |
Estimation of energy detection thresholds and error probability for amplitude-modulated short-range communication radios (Anttonen, A. (Antti), 30 November 2011)
Abstract
In this thesis, novel data and channel estimation methods are proposed and analyzed for low-complexity short-range communication (SRC) radios. Low complexity is challenging to achieve especially in very wideband or millimeter-wave SRC radios where phase recovery and energy capture from numerous multipaths easily become a bottleneck for system design. A specific type of transceiver is selected using pulse amplitude modulation (PAM) at the transmitter and energy detection (ED) at the receiver, and it is thus called an ED-PAM system. Nonnegative PAM alphabets allow using an ED structure which enables a phase-unaware detection method for avoiding complicated phase recovery at the receiver. Moreover, the ED-PAM approach results in a simple multipath energy capture, and only one real decision variable, whose dimension is independent of the symbol alphabet size, is needed.
In comparison with optimal phase-aware detection, the appealing simplicity of suboptimal ED-PAM systems is achieved at the cost of the need for a higher transmitted signal energy or shorter link distance for obtaining a sufficient signal-to-noise ratio (SNR) at the receiver, as ED-PAM systems are more vulnerable to the effects of noise and interference. On the other hand, the consequences of requiring a higher SNR may not be severe in the type of SRC scenarios where a sufficient received SNR is readily available due to a short link distance. Furthermore, significant interference can be avoided by signal design. However, what has slowed down the development of ED-PAM systems is that efficient symbol decision threshold estimation and related error probability analysis in multipath fading channels have remained as unsolved problems.
Based on the above observations, this thesis contributes to the state of the art in the design and analysis of ED-PAM systems as follows. Firstly, a closed-form near-optimal decision threshold selection method is proposed, which adapts to a time-varying channel gain and enables an arbitrary choice of the PAM alphabet size and an integer time-bandwidth product of the receiver filters. Secondly, two blind estimation schemes for the parameters of the threshold estimation are introduced. Thirdly, analytical error probability evaluation in frequency-selective multipath fading channels is addressed. Special attention is given to lognormal fading channels, which are typically used to model very wideband SRC multipath channels. Finally, analytical error probability evaluation with nonideal parameter estimation is presented. The results can be used in designing low-complexity transceivers for very wideband and millimeter-wave wireless SRC devices of the future.
/ This thesis presents and analyzes new data and channel estimation methods aimed at simplifying short-range communication (SRC) between wireless devices. A simple implementation of SRC radios is exceptionally challenging when a very wide bandwidth or millimeter-wave transmission is used; in that case, a simple receiver implementation can be prevented by, for example, carrier phase estimation and the collection of signal energy from the numerous multipath components of the channel. From these starting points, the system model chosen for the SRC radio consists of a transmitter based on nonnegative pulse amplitude modulation (PAM) and a receiver based on energy detection (ED). An ED-PAM system does not need to know the phase of the received signal, and signal energy is collected with a simple diversity combining technique. In addition, only one real decision variable is needed for detection, whose dimension is independent of the number of PAM levels.
The simplicity of the ED-PAM technique compared with optimal phase-aware detection does not come for free. One limitation is the inherent tendency of the suboptimal ED-PAM technique to amplify the effect of noise and interference in symbol decisions. Noise enhancement is not necessarily a major problem in SRC radios in which, thanks to the short link distance, a sufficient signal-to-noise ratio can be achieved at the receiver despite the enhancement, and the effect of interference enhancement can be reduced effectively by signal design. In any case, the adoption of the ED-PAM technique has been slowed down by the lack of efficient methods for estimating and analyzing symbol decision thresholds.
These observations have motivated the search for new design and analysis methods for ED-PAM systems, as follows. For symbol decision threshold estimation, a nearly optimal closed-form method is derived that adapts to a time-varying channel and allows an arbitrary integer choice both for the number of PAM levels and for the time-bandwidth product of the receiver filters. In addition, two blind decision-threshold estimation methods are presented that do not require a redundant training signal. In the second part of the work, the symbol error rate of an ED-PAM system is analyzed in a frequency-selective multipath channel, with the analysis focused on lognormally fading channels. The analysis is then extended to include the effect of nonideal threshold estimation. The results can be exploited in the design of very wideband and millimeter-wave SRC devices.
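As an illustration of the decision problem the thesis addresses, and not of its closed-form near-optimal rule, the sketch below simulates a nonnegative 4-PAM energy-detection receiver and simply places each threshold halfway between the mean received energies of adjacent amplitude levels. The alphabet, channel gain, noise power and samples per symbol are assumed values, and the naive midpoint rule is exactly the kind of heuristic the thesis improves upon.

```python
import numpy as np

def midpoint_thresholds(levels, gain, noise_var, n_samples):
    """Naive energy thresholds for nonnegative PAM with energy detection:
    halfway between the mean energies of adjacent amplitude levels."""
    mean_energy = (gain * levels) ** 2 * n_samples + n_samples * noise_var
    return (mean_energy[:-1] + mean_energy[1:]) / 2.0

rng = np.random.default_rng(0)
levels = np.array([0.0, 1.0, 2.0, 3.0])        # assumed nonnegative 4-PAM alphabet
gain, noise_var, n = 0.8, 0.1, 16              # assumed channel gain, noise power, samples/symbol

symbols = rng.integers(0, len(levels), 10_000)
received = gain * levels[symbols, None] + rng.normal(0.0, np.sqrt(noise_var), (symbols.size, n))
energy = np.sum(received ** 2, axis=1)         # single real decision variable per symbol

thr = midpoint_thresholds(levels, gain, noise_var, n)
decisions = np.searchsorted(thr, energy)       # map each energy to an amplitude bin
print("symbol error rate:", np.mean(decisions != symbols))
```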
|
624 |
Fault tolerance for stream programs on parallel platforms (Sanz-Marco, Vicent, January 2015)
A distributed system is defined as a collection of autonomous computers connected by a network, together with distributed software that lets users see the system as a single entity capable of providing computing facilities. A distributed system with centralised control has a distinguished control node, called the leader node, whose main role is to distribute and manage shared resources in a resource-efficient manner. Such a system can use stream processing networks for communication. In a stream processing system, applications typically act as continuous queries: they ingest data continuously, analyse and correlate the data, and generate a stream of results. Fault tolerance is the ability of a system to continue processing information even when failures or anomalies occur. It has become an important requirement for distributed systems, because the probability of failure rises with the number of nodes and with the runtime of applications. It is therefore important to add fault tolerance mechanisms that preserve the execution of tasks despite the occurrence of faults. If the leader of a centralised control system fails, a new leader must be elected. While leader election has received a lot of attention in message-passing systems, very few solutions have been proposed for shared-memory systems such as the one we target. In addition, rollback-recovery strategies are important fault tolerance mechanisms for distributed systems: information is stored in stable storage during failure-free operation, and when a failure affects a node, the stored information is used to restore the node to its state before the failure. In this thesis we focus on creating two fault tolerance mechanisms for distributed systems with centralised control that use stream processing for communication: leader election and log-based rollback-recovery, both implemented using LPEL. The proposed leader election method is based on an atomic compare-and-swap (CAS) instruction, which is directly available on many processors. It works with idle nodes, meaning that only non-busy nodes compete to become the new leader, while busy nodes continue with their tasks and later update their leader reference. The method also has a short completion time and low space complexity. The proposed log-based rollback-recovery method for distributed systems with stream processing networks is a novel approach that is free from the domino effect and does not generate orphan messages, satisfying the always-no-orphans consistency condition. It also imposes lower overhead on the system than comparable approaches and scales well, because it is insensitive to the number of nodes in the system.
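The abstract describes the election mechanism only at a high level, so the following is a minimal sketch of the idea of CAS-based election among idle nodes rather than the LPEL implementation. The CAS cell is emulated with a lock here; a real shared-memory implementation would rely on the processor's atomic compare-and-swap instruction.

```python
import threading

class CasCell:
    """Emulates an atomic compare-and-swap cell, for illustration only."""
    def __init__(self, value=None):
        self._value = value
        self._lock = threading.Lock()

    def compare_and_swap(self, expected, new):
        with self._lock:
            if self._value == expected:
                self._value = new
                return True
            return False

    def read(self):
        with self._lock:
            return self._value

leader = CasCell(value=None)          # shared cell holding the current leader id

def try_elect(node_id, busy):
    """Only idle nodes race for leadership; busy nodes keep working and simply
    re-read the leader reference later."""
    if busy:
        return leader.read()
    if leader.compare_and_swap(None, node_id):
        print(f"node {node_id} became leader")
    return leader.read()

threads = [threading.Thread(target=try_elect, args=(i, i % 2 == 0)) for i in range(6)]
for t in threads: t.start()
for t in threads: t.join()
print("elected leader:", leader.read())
```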
|
625 |
Computational support for learners of Arabic (Al-Liabi, Majda Majeed, January 2012)
This thesis documents the use of Natural Language Processing (NLP) in Computer Assisted Language Learning (CALL) and its contribution to the learning experience of students studying Arabic as a foreign language. The goal of this project is to build an Intelligent Computer Assisted Language Learning (ICALL) system that provides computational assistance to learners of Arabic by teaching grammar, generating homework and giving students immediate feedback. To build this system we use the Parasite system, which produces morphological, syntactic and semantic analyses of textual input, and extend it to provide error detection and diagnosis. The methodology we adopt involves relaxing constraints on unification so that correct information contained in a badly formed sentence may still be used to obtain a coherent overall analysis. We look at a range of errors, drawn from experience with learners at various levels, covering word-internal problems (addition of inappropriate affixes, failure to apply morphotactic rules properly) and problems with relations between words (local constraints on features, and word-order problems). As feedback is an important factor in learning, we also examine different types of feedback in order to evaluate which is the most appropriate for the aims of our system.
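The constraint-relaxation idea can be pictured with a toy feature-structure unifier that, instead of failing outright on an agreement clash, records the clash as a diagnosable learner error. This is only a schematic illustration, not the Parasite system's machinery, and the feature names and example are invented.

```python
def relaxed_unify(expected, observed, soft=("gender", "number", "definite")):
    """Unify two feature structures; clashes in 'soft' features are recorded
    for feedback instead of causing outright failure."""
    result, errors = {}, []
    for feat in set(expected) | set(observed):
        e, o = expected.get(feat), observed.get(feat)
        if e is None or o is None or e == o:
            result[feat] = e if e is not None else o
        elif feat in soft:
            result[feat] = e                 # keep the grammar's value...
            errors.append((feat, e, o))      # ...but log the mismatch for feedback
        else:
            return None, [(feat, e, o)]      # hard clash: no coherent analysis
    return result, errors

# A learner pairs a feminine adjective with a masculine noun (hypothetical example):
noun = {"gender": "masc", "number": "sg", "definite": "def"}
adj  = {"gender": "fem",  "number": "sg", "definite": "def"}
analysis, diagnostics = relaxed_unify(noun, adj)
print(diagnostics)   # [('gender', 'masc', 'fem')] -> basis for immediate feedback
```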
|
626 |
Visual Tracking of Deformation and Classification of Object Elasticity with Robotic Hand Probing (Hui, Fei, January 2017)
Performing tasks with a robotic hand often requires complete knowledge of the manipulated object, including its properties (shape, rigidity, surface texture) and its location in the environment, in order to ensure safe and efficient manipulation. While well-established procedures exist for the manipulation of rigid objects, as well as several approaches for the manipulation of linear or planar deformable objects such as ropes or fabric, research addressing the characterization of deformable objects occupying a volume remains relatively limited. The fundamental objectives of this research are to track the deformation of non-rigid objects under robotic hand manipulation using RGB-D data, and to automatically classify deformable objects as rigid, elastic, plastic or elasto-plastic according to the material they are made of, supporting recognition of the object category through a robotic probing process in order to enhance manipulation capabilities. The goal is not to formally model the material of the object, but rather to employ a data-driven approach that makes decisions based on the observed properties of the object, captures its deformation behavior implicitly, and supports adaptive control of a robotic hand in future research. The proposed approach advantageously combines color image and point cloud processing techniques, and proposes a novel combination of the fast level set method with a log-polar mapping of the visual data to robustly detect and track the contour of a deformable object in an RGB-D data stream. Dynamic time warping is employed to characterize the object properties independently of the varying length of the detected contour as the object deforms. The results demonstrate that a recognition rate of up to 98.3% over all material categories is achieved based on the detected contour. When integrated in the control loop of a robotic hand, the approach can contribute to ensuring a stable grasp and safe manipulation that preserves the physical integrity of the object.
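The role dynamic time warping plays here, comparing contour signatures whose length changes as the object deforms, can be shown with the textbook DTW recurrence below; the two signatures are synthetic stand-ins rather than data from the thesis.

```python
import numpy as np

def dtw_distance(a, b):
    """Plain dynamic time warping between two 1-D sequences of different lengths."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two contour signatures sampled at different lengths as the object deforms:
sig_a = np.sin(np.linspace(0, 2 * np.pi, 80))
sig_b = 1.3 * np.sin(np.linspace(0, 2 * np.pi, 110))
print(dtw_distance(sig_a, sig_b))
```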
|
627 |
Extensions of the normal distribution using the odd log-logistic family: theory and applications / Extensões do normal distribuição utilizando a família odd log-logística: teoria e aplicações (Altemir da Silva Braga, 23 June 2017)
In this study we propose three new distributions and a fourth study with longitudinal data. The first is the odd log-logistic normal distribution: theory and applications in the analysis of experiments; the second is the odd log-logistic Student-t distribution: theory and applications; the third is the odd log-logistic skew-normal distribution, a new skew-bimodal distribution with applications in the analysis of experiments; and the fourth is a regression model with a random effect based on the odd log-logistic skew-normal distribution, applied to longitudinal data. Properties such as symmetry, the quantile function, some expansions, the ordinary incomplete moments, mean deviations and the moment generating function are derived. The model parameters are estimated by the method of maximum likelihood. In the applications, regression models are fitted to data from completely randomized designs (CRD) or randomized block designs (DBC). The models can therefore be used in practical situations involving completely randomized or randomized block designs, especially when there is evidence of asymmetry, kurtosis and bimodality. / The normal distribution is one of the most important in statistics. However, it is not adequate for fitting data that exhibit asymmetry or bimodality, since this distribution has only its first two moments different from zero, namely the mean and the standard deviation. For this reason, many studies aim to create new families of distributions that can model the asymmetry, the kurtosis or the bimodality of the data. In this sense, it is important that these new distributions have good mathematical properties and also contain the normal distribution as a submodel. However, there are still few classes of distributions that include the normal distribution as an embedded model; among these proposals, the skew-normal, the beta-normal, the Kumaraswamy-normal and the gamma-normal stand out. In 2013 the new odd log-logistic-G family of distributions was proposed with the objective of creating new probability distributions. Thus, using the normal and skew-normal distributions as baseline functions, three new distributions and a fourth study with longitudinal data were proposed. The first was the odd log-logistic normal distribution: theory and applications to experimental data; the second was the odd log-logistic Student-t distribution: theory and applications; the third was the odd log-logistic skew-bimodal distribution with applications to experimental data; and the fourth study was the regression model with a random effect for the odd log-logistic skew-bimodal distribution: an application to longitudinal data. These distributions exhibit desirable features such as asymmetry, kurtosis and bimodality. Some of their properties were derived, such as symmetry, the quantile function, some expansions, the ordinary incomplete moments, mean deviations and the moment generating function. The flexibility of the new distributions was compared with the skew-normal, beta-normal, Kumaraswamy-normal and gamma-normal models. The model parameter estimates were obtained by the method of maximum likelihood. In the applications, regression models were used for data from completely randomized designs (DIC) or randomized block designs (DBC). In addition, simulation studies were carried out for the new models to verify the asymptotic properties of the parameter estimates.
To check for the presence of extreme values and the quality of the fits, quantile residuals and a sensitivity analysis were proposed. The new models are therefore grounded in mathematical properties, computational simulation studies and applications to data from designed experiments. They can be used in completely randomized or randomized block experiments, mainly with data showing evidence of asymmetry, kurtosis and bimodality.
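The odd log-logistic-G construction these models build on is usually written as F(x) = G(x)^a / {G(x)^a + [1 - G(x)]^a} for a baseline cdf G and shape parameter a. The sketch below evaluates this form with a normal baseline; the formula is the standard one from the odd log-logistic literature, the thesis's exact parameterization may differ, and the parameter value is chosen only for illustration.

```python
import numpy as np
from scipy.stats import norm

def oll_normal_cdf(x, alpha, mu=0.0, sigma=1.0):
    """Odd log-logistic normal CDF, F = G^a / (G^a + (1-G)^a) with G the normal CDF."""
    G = norm.cdf(x, mu, sigma)
    return G**alpha / (G**alpha + (1.0 - G)**alpha)

def oll_normal_pdf(x, alpha, mu=0.0, sigma=1.0):
    g, G = norm.pdf(x, mu, sigma), norm.cdf(x, mu, sigma)
    return alpha * g * (G * (1.0 - G))**(alpha - 1.0) / (G**alpha + (1.0 - G)**alpha)**2

x = np.linspace(-4, 4, 9)
print(oll_normal_cdf(x, alpha=0.35))   # small alpha can yield the bimodal shapes discussed above
```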
|
628 |
Living conditions in periods of transition: changing working conditions of the log-driver during the 1930s / Försörjningsvillkor i brytningstid : Flottarens arbetsvillkor i förändring under 1930-talet (Forsgren, Petrus, January 2020)
Living conditions in periods of transition: changing working conditions of the log-driver during the 1930s. Petrus Forsgren, Economic History Bachelor Degree, Umeå University, Autumn 2020. The working terms, wages and conflicts between the workers and the Umeå flottningsförening in the log-driving district of Umeåälven have been studied for the period 1928-1939. Although relatively few years were studied, changes were observed, especially in the working terms. The contracts went from hand-written to standardized, and during the 1930s the district moved towards the new consensus on collective agreements in the workplace; a collective bargaining agreement was signed in 1935. By the end of the period the employers were expected to take more responsibility for the workers. A specialization of workers and foremen occurred, although no major changes happened to the log-driving process itself: working conditions in log driving were still the same as in early industrialization. Wages diversified towards hourly pay, although wage levels remained similar over the studied period. Conflicts occurred during the years 1933-34. Log drivers usually combined the work with farming, forestry or, later, road construction. Other trades underwent the same transition at other times: railway workers earlier, during the 1910s-1920s, and forest workers later, during the 1960s-1970s. This work gives a clue to why the changes happened relatively late in the log-driving industry in Umeälven. The internal drivers of change were weak, owing to the background of the workers, the pull from other industries in the area and the limited extent of rationalization within the organization, but the external drivers became more and more apparent during the studied period. The new institutional changes ignored the old industries, which also lost influence, but they were nevertheless affected indirectly by the changes and by the Swedish economy as a whole.
|
629 |
Imagem tridimensional da deformação da musculatura extraocular na orbitopatia de Graves: implicações do efeito de volume parcial / Tridimensional image of the extraocular muscles deformations in Graves' orbitopathy: implications of partial volume effects (Souza, André Domingos Araújo, 22 March 2002)
The extraocular muscles (EOM), responsible for eye rotations, are enlarged in Graves' orbitopathy, which can lead the patient to blindness (optic neuropathy). In clinical practice, the diameter of these muscles is usually measured manually in each coronal X-ray computed tomography (CT) image to assess whether they are enlarged. The subjectivity and the time consumed in acquiring these measurements are the main deficiencies of such manual methods. We therefore present an EOM segmentation method (MSEG) that overcomes these shortcomings. The proposed MSEG is based on the Laplacian-of-Gaussian (LoG) edge detector combined with mathematical morphology, and the effects of truncation and sampling were taken into account when determining the size of the LoG mask. The accuracy of measurements on three-dimensional (3D) models is affected by the partial volume effect (PVE). In CT, for example, false soft-tissue structures appear at bone-to-fat and bone-to-air interfaces; in addition, the skin, whose CT number (Hounsfield value) is identical to that of soft tissue, obscures the soft-tissue rendering. In order to produce 3D images of bone and soft tissue that are more reliable for measurement and of better quality, two methods for classifying voxels affected by PVE (MCLA) were developed, based on a new mixture model, and skin removal is performed by means of mathematical morphology. Volume renderings were created before and after applying the MCLA methods. Qualitative and quantitative experiments were conducted using mathematical phantoms that simulated different levels of PVE through added noise and blurring, and on clinical CT data. The result for 218 pairs of EOM area measurements performed on coronal CT images (3 normal and 2 Graves) showed a good correlation (R = 0.92) between the MSEG method and manual tracing. The EOM occupation ratio in the orbit (TO), measured in 33 patients (5 normal and 28 Graves), was highest in the Graves group with optic neuropathy, TO = 34.3%, almost five times higher than in the normal group, TO = 7.3%. All results demonstrated an improvement in the quality of the 3D images after applying the MCLA methods. The quantitative analysis indicates that more than 98% of the voxels with PVE were removed by both MCLA methods, with the second performing slightly better than the first. In addition, skin removal reveals fine details in the muscular structures. Measurements on 3D models must be taken with care in radiology in view of the artifacts demonstrated in this work, which arise mainly from PVE; in our experiments, the errors in EOM volume measurements were above 25% of the value estimated as ground truth. PVE-resolved volumetric images are presented, thereby ensuring more accurate measurements. / The extraocular muscles (EOM), which are responsible for eye movements, are enlarged in Graves' orbitopathy, and these deformations can lead patients to blindness. In clinical routine, physicians normally evaluate the diameter of the EOM by manual tracing in computed tomography (CT) images to check whether they are enlarged. However, the accuracy of the EOM measurements is impaired by the subjectivity of these manual methods, and the time they consume is another of their main drawbacks. We therefore present an EOM segmentation method (MSEG) that overcomes these difficulties.
The MSEG method is based on the Laplacian-of-Gaussian (LoG) operator combined with mathematical morphology, and the effects of discretization and numerical truncation are taken into account in the LoG implementation. In CT, partial volume effects (PVE) cause several artifacts in volume rendering. In order to create 3D renditions that are more reliable for anatomical measurements and give a superior display of both soft tissue and bone, we introduce two methods (MCLA) for detecting and classifying voxels with PVE, based on a new approach. A method is described to automatically peel the skin so that PVE-resolved renditions of bone and soft tissue reveal considerably more detail. We have conducted experiments to evaluate all the methods proposed here both quantitatively and qualitatively. The MSEG method correlates well with manual tracing in our experiments (R = 0.92). Surface renditions were created from EOM CT datasets segmented with the MSEG method. We also conducted a quantitative evaluation in patients with Graves' orbitopathy, in which the EOM volume ratio in the orbit (TO) was 34.3%, about five times higher than in normal patients (TO = 7.3%). Volume renditions were created before and after applying the methods for several patient CT datasets. A mathematical phantom experiment involving different levels of PVE was conducted by adding different degrees of noise and blurring. A quantitative evaluation was performed using the mathematical phantom and clinical CT data, in which an operator carefully masked out voxels with PVE in the segmented images. All results demonstrate the enhanced display quality of bone and soft tissue after applying the proposed methods. The quantitative evaluations indicate that more than 98% of the voxels with PVE are removed by the two methods, with the second method performing slightly better than the first. Further, skin peeling vividly reveals fine details in the soft-tissue structures. 3D renditions should be used with care in radiology in view of the artifacts, coming mainly from PVE, demonstrated in this work. Finally, we estimated volume errors in the EOM models higher than 25% when PVE is not properly handled.
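The LoG operator at the heart of the MSEG method has a standard closed form; the sketch below builds a discrete LoG mask with the common rule of truncating the support at a few standard deviations and then re-centres it to zero mean. The thesis derives its own mask size from truncation and sampling error considerations, which this illustration does not reproduce.

```python
import numpy as np

def log_kernel(sigma, truncate=4.0):
    """Discrete Laplacian-of-Gaussian mask truncated at `truncate` * sigma."""
    half = int(np.ceil(truncate * sigma))
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x**2 + y**2
    kernel = (-1.0 / (np.pi * sigma**4)) * (1.0 - r2 / (2.0 * sigma**2)) \
             * np.exp(-r2 / (2.0 * sigma**2))
    return kernel - kernel.mean()    # restore the zero DC response lost to truncation

k = log_kernel(sigma=2.0)
print(k.shape, k.sum())              # e.g. (17, 17), ~0 -> no response on flat regions
```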
|
630 |
Detekce útoků cílených na webové aplikace / Detection of attacks targeted at web applications (Jégrová, Eliška, January 2018)
This thesis deals with vulnerabilities of web applications. The aim of the work is to create tools for detecting certain attacks, specifically Same Origin Method Execution (SOME), XML Signature Wrapping, XPath Injection, HTTP Response Smuggling and Server-Side Includes (SSI) injection, and to create logs that record the detected attacks. In the first part, the theory is analyzed and the vulnerabilities behind the chosen attacks are described, including how they are exploited. In the next section, web applications are implemented that contain the vulnerabilities needed to carry out these attacks successfully. Furthermore, detection methods for these attacks are designed and developed in Python, each accompanied by a log entry.
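To make the shape of such a detector concrete, the sketch below checks one request parameter against a few signature patterns for XPath injection and writes a log entry on a hit. It is only an illustration of the pattern-check-plus-log structure; the signatures are simplified placeholders, not the rules developed in the thesis.

```python
import logging
import re

logging.basicConfig(filename="attack.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

# Hypothetical, simplified signatures for XPath injection payloads:
XPATH_PATTERNS = [
    re.compile(r"'\s*or\s+'?1'?\s*=\s*'?1", re.IGNORECASE),   # e.g. ' or '1'='1
    re.compile(r"\bor\s+name\s*\(", re.IGNORECASE),           # e.g. or name(/*[1])=...
    re.compile(r"\)\s*or\s*\(", re.IGNORECASE),               # e.g. ) or (
]

def check_xpath_injection(param_name, value, client_ip):
    """Flag and log request parameters that match the signature patterns."""
    for pattern in XPATH_PATTERNS:
        if pattern.search(value):
            logging.warning("XPath injection suspected from %s: %s=%r",
                            client_ip, param_name, value)
            return True
    return False

check_xpath_injection("username", "admin' or '1'='1", "203.0.113.7")
```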
|