  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
351

Frequency based efficiency evaluation - from pattern recognition via backwards simulation to purposeful drive design

Starke, Martin, Beck, Benjamin, Ritz, Denis, Will, Frank, Weber, Jürgen 23 June 2020 (has links)
The efficiency of hydraulic drive systems in mobile machines is influenced by several factors, such as the operator's guidance, weather conditions, material and loading properties, and above all the working cycle. This leads to varying operating points that the drive system has to serve. For efficiency analysis, the use of standardized working cycles, either obtained from measurements or synthetically generated, is state of the art; however, only a small extract of the real usage profile is thereby taken into account. This contribution deals with process pattern recognition (PPR) and frequency-based efficiency evaluation to obtain more precise information and conclusions for the drive design of mobile machines. Using an 18 t mobile excavator as an example, the recognition system based on Hidden Markov Models (HMM) and the efficiency evaluation process by means of backwards simulation of measured operation points are described.
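The recognition step can be illustrated with a minimal sketch: score a quantized sequence of operating points against competing discrete HMMs (one per process pattern) using the scaled forward algorithm, and pick the most likely pattern. The two toy models, their parameters, and the three-symbol load alphabet below are illustrative assumptions, not the actual excavator models from the paper.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete HMM."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    log_lik = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                     for j in range(n)]
        c = sum(alpha)              # scaling constant, accumulated in log space
        log_lik += math.log(c)
        alpha = [a / c for a in alpha]
    return log_lik

# Hypothetical patterns over a 3-symbol load alphabet: 0=idle, 1=low, 2=high.
DIGGING = ([0.5, 0.5],                          # alternating high/low loads
           [[0.5, 0.5], [0.5, 0.5]],
           [[0.1, 0.2, 0.7], [0.1, 0.7, 0.2]])
DRIVING = ([0.5, 0.5],                          # sustained low load, some idling
           [[0.9, 0.1], [0.3, 0.7]],
           [[0.1, 0.8, 0.1], [0.8, 0.1, 0.1]])

def recognize(obs):
    scores = {name: forward_log_likelihood(obs, *model)
              for name, model in [("digging", DIGGING), ("driving", DRIVING)]}
    return max(scores, key=scores.get)
```

In a full pipeline the recognized pattern frequencies would then weight the backwards-simulated efficiency of each operating point.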
352

Évaluation économique de l'ajustement des antihypertenseurs par le pharmacien en milieu communautaire

Houde, François-Xavier 07 1900 (has links)
OBJECTIVE: Hypertension is a chronic disease for which only 68% of treated patients were controlled in Canada in 2013. Pharmacists in the province of Québec recently received legislative authority to adjust the dosage of antihypertensive drugs when a therapeutic target has been agreed upon with the prescriber. This study aims to estimate the incremental cost-utility ratio (ICUR) of this new model of care in Québec. METHODS: A Markov model was developed to extrapolate the impact of this pharmacy practice on strokes, acute myocardial infarctions, and mortality. The Framingham risk equations were used to derive the impact of blood-pressure control on strokes and myocardial infarctions. The efficacy of the intervention was taken from the RxAction clinical trial, conducted in Alberta, and expressed as the proportion of patients with controlled blood pressure. The model used the public payer perspective and included direct healthcare costs only. The main outcome was the number of quality-adjusted life years (QALYs) gained. Both QALYs and costs were discounted at a 1.5% annual rate. The model simulated a cohort of 1,000 patients entering at age 65 over a 20-year time horizon with a 1-year cycle length. RESULTS: The model yielded 0.051 additional QALYs per patient in the intervention group for an incremental expense of $268 per patient, giving an ICUR of $5,223/QALY gained. At a willingness-to-pay threshold of $50,000/QALY, the intervention is cost-effective. The results were most sensitive to the comparative efficacy of the pharmacist intervention against usual care and to the utility assigned to hypertension.
CONCLUSION: Allowing pharmacists to adjust the dosage of antihypertensive treatments within the current remuneration model appears cost-effective. Efficacy data from a clinical trial conducted in Québec would improve the external validity of this economic evaluation.
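As a hedged sketch of the kind of cohort model described, a three-state Markov cohort with discounting fits in a few lines; the pharmacist arm simply lowers the per-cycle cardiovascular event probability and adds a service cost. The states, probabilities, utilities, and costs below are invented placeholders, not the thesis's calibrated inputs.

```python
def run_cohort(p_event, cycles=20, disc=0.015,
               u_well=0.85, u_post=0.65,
               c_cycle=500.0, c_event=10000.0,
               p_die_well=0.02, p_die_post=0.06):
    """Three-state Markov cohort (Well -> Post-event -> Dead), yearly cycles.
    Returns discounted QALYs and costs per patient."""
    well, post = 1.0, 0.0
    qalys = costs = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + disc) ** t                    # discount factor
        qalys += d * (well * u_well + post * u_post)
        costs += d * (well * c_cycle + post * 2 * c_cycle
                      + well * p_event * c_event)
        # transitions (event and death assumed mutually exclusive in a cycle)
        well, post = (well * (1 - p_event - p_die_well),
                      post * (1 - p_die_post) + well * p_event)
    return qalys, costs

q_usual, c_usual = run_cohort(p_event=0.035)   # usual care arm
q_pharm, c_pharm = run_cohort(p_event=0.025)   # pharmacist-adjusted arm
c_pharm += 80.0                                # hypothetical service cost
icur = (c_pharm - c_usual) / (q_pharm - q_usual)
```

With better blood-pressure control expressed as a lower event probability, the intervention arm accrues more QALYs, and the ICUR is the ratio of the discounted cost difference to the discounted QALY difference.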
353

Analyse du contenu expressif des gestes corporels / Analysis of gestures expressive content

Truong, Arthur 21 September 2016 (has links)
Research on gesture analysis currently suffers from a lack of unified models. On the one hand, gesture formalizations from the human sciences remain purely theoretical and do not lend themselves to quantification; on the other, commonly used motion descriptors are generally intuitive and limited to the visual aspects of the gesture. In this work, we adopt Laban Movement Analysis (LMA), originally designed for the study of dance movements, as a framework for building gesture descriptors based on expressivity. Two 3D gesture datasets are introduced. The first, ORCHESTRE-3D, is composed of pre-segmented orchestra conductors' gestures recorded in rehearsal and annotated with a lexicon of musical emotions, for the study of the emotional content of musical conducting. The second, HTI 2014-2015, comprises sequences of varied daily actions. In a first, "global" approach, we define a feature vector based on the expressive indices of our model that characterizes the gesture as a whole; this descriptor discriminates diverse actions and recognizes the different musical emotions carried by the conductors' gestures in ORCHESTRE-3D. In a second, "dynamic" approach, the elements of our expressive model are used as a frame descriptor (i.e., defined at every instant of the gesture). The feature space provided by these local characteristics is used to extract key poses of the motion, yielding a simplified per-frame representation of body movements that supports on-the-fly action recognition. We evaluate our approach on several gesture datasets, including our own HTI 2014-2015 corpus.
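The key-pose idea in the "dynamic" approach can be sketched with a deliberately simple rule (the thesis's actual selection over the expressive feature space is richer): keep a frame as a key pose whenever its descriptor moves far enough from the last retained one.

```python
def extract_key_poses(frames, threshold):
    """Greedy key-pose selection: retain a frame when its descriptor is
    far (Euclidean distance) from the last retained key pose."""
    keys = [0]                                  # first frame is always a key pose
    for t in range(1, len(frames)):
        ref = frames[keys[-1]]
        dist = sum((a - b) ** 2 for a, b in zip(frames[t], ref)) ** 0.5
        if dist > threshold:
            keys.append(t)
    return keys
```

On a synthetic motion that holds three distinct poses, the rule recovers one key pose per plateau; real frame descriptors would be the per-frame expressive features of the model.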
354

Leveraging formal concept analysis and pattern mining for moving object trajectory analysis / Exploitation de l'analyse formelle de concepts et de l'extraction de motifs pour l'analyse de trajectoires d'objets mobiles

Almuhisen, Feda 10 December 2018 (has links)
This dissertation presents a trajectory analysis framework that includes both a preprocessing phase and a trajectory mining process, together with visual functions that reflect the evolution behavior of trajectory patterns. The originality of the mining process is to combine frequent pattern mining, emerging pattern mining, and formal concept analysis for moving object trajectories. These methods detect and characterize time-bound pattern evolution behaviors in trajectory data. Three contributions are proposed: (1) a trajectory analysis method based on frequent formal concepts detects the different evolution behaviors of trajectory patterns over time. These behaviors are "latent", "emerging", "decreasing", "lost", and "jumping"; they characterize the dynamics of mobility with respect to urban space and time. The detected behaviors are visualized on automatically generated maps at different spatio-temporal levels to refine the analysis of mobility in a given area of the city. (2) A second method, based on the extraction of frequent sequential formal concepts, exploits movement direction in the evolution detection process. (3) A prediction method based on Markov chains forecasts the evolution behavior of a region in the following period. The three methods are evaluated on two real-world datasets; the experimental results confirm the relevance of the proposal and the utility of the generated maps.
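The five evolution behaviors can be illustrated with a deliberately simplified two-period rule on pattern support. The dissertation defines them over sequences of formal concepts, so this is only an assumed approximation with a hypothetical frequency threshold.

```python
def evolution_behavior(prev_support, curr_support, min_sup):
    """Classify a pattern's evolution between two consecutive periods
    (simplified reading of the five behaviors named in the abstract)."""
    prev_frequent = prev_support >= min_sup
    curr_frequent = curr_support >= min_sup
    if not prev_frequent and curr_frequent:
        return "jumping"        # infrequent before, suddenly frequent
    if prev_frequent and not curr_frequent:
        return "lost"           # frequent before, dropped below threshold
    if not prev_frequent and not curr_frequent:
        return "latent"         # present but never frequent
    return "emerging" if curr_support > prev_support else "decreasing"
```

Applied over a sliding sequence of time windows, such labels drive the colored map layers described above.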
355

Models of Discrete-Time Stochastic Processes and Associated Complexity Measures

Löhr, Wolfgang 12 May 2010 (has links)
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
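A standard finite example makes the definition of statistical complexity concrete: it is the Shannon entropy of the stationary distribution over causal states. For the golden mean process (binary sequences with no two consecutive 1s), the two causal states have stationary probabilities 2/3 and 1/3, giving C_mu ≈ 0.918 bits. A sketch (the thesis itself works measure-theoretically, not with this finite case):

```python
import math

def stationary(P, iters=200):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p

def entropy_bits(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Causal-state dynamics of the golden mean process: state A emits 0 or 1
# with probability 1/2 (a 1 leads to B); state B must emit 0 (back to A).
P = [[0.5, 0.5],
     [1.0, 0.0]]
C_mu = entropy_bits(stationary(P))   # statistical complexity, ~0.918 bits
```

The same two functions compute statistical complexity for any finite epsilon-machine; the thesis's contribution is to extend such notions rigorously to the full space of stationary processes.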
356

Predictive Modeling and Statistical Inference for CTA returns : A Hidden Markov Approach with Sparse Logistic Regression

Fransson, Oskar January 2023 (has links)
This thesis focuses on predicting trends in the returns of Commodity Trading Advisors (CTAs), also known as trend-following hedge funds. It applies a Hidden Markov Model (HMM) to classify trends and, by incorporating additional features, uses a regularized logistic regression model to enhance prediction capability. The model demonstrates success in identifying positive trends in CTA funds, with particular emphasis on precision and risk-adjusted return metrics. In the context of regularized regression models, statistical inference techniques such as bootstrap resampling and Markov Chain Monte Carlo are applied to estimate the distribution of parameters. The findings suggest the model is effective in predicting favorable CTA performance and mitigating equity market drawdowns. Future research could explore alternative classification models and extend the methodology to different markets and datasets.
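A minimal sketch of the sparse (L1-regularized) logistic regression component, fit with proximal gradient descent (ISTA) on synthetic data: the features, penalty weight, and learning rate below are placeholders, not the thesis's specification.

```python
import math, random

def soft_threshold(x, t):
    """Proximal map of the L1 penalty."""
    return x - t if x > t else x + t if x < -t else 0.0

def l1_logistic(X, y, lam=0.05, lr=0.5, iters=300):
    """ISTA: gradient step on the mean log-loss, then soft-thresholding
    of the weights; the intercept is left unpenalized."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            gb += (p - yi) / n
            for j in range(d):
                gw[j] += (p - yi) * xi[j] / n
        b -= lr * gb
        w = [soft_threshold(wj - lr * gj, lr * lam) for wj, gj in zip(w, gw)]
    return w, b

random.seed(0)
X = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
y = [1 if x0 > 0 else 0 for x0, _ in X]     # only feature 0 is informative
w, b = l1_logistic(X, y)
```

The L1 penalty drives the weight of the uninformative feature toward zero while keeping the informative one, which is the variable-selection behavior the thesis relies on when adding many candidate features.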
357

Detecting Anomalous Behavior in Radar Data

Rook, Jayson Carr 01 June 2021 (has links)
No description available.
358

Scalable Detection and Extraction of Data in Lists in OCRed Text for Ontology Population Using Semi-Supervised and Unsupervised Active Wrapper Induction

Packer, Thomas L 01 October 2014 (has links) (PDF)
Lists of records in machine-printed documents contain much useful information. As one example, the thousands of family history books scanned, OCRed, and placed on-line by FamilySearch.org probably contain hundreds of millions of fact assertions about people, places, family relationships, and life events. Data like this cannot be fully utilized until a person or process locates the data in the document text, extracts it, and structures it with respect to an ontology or database schema. Yet, in the family history industry and other industries, data in lists goes largely unused because no known approach adequately addresses all of the costs, challenges, and requirements of a complete end-to-end solution to this task. The diverse information is costly to extract because many kinds of lists appear even within a single document, differing from each other in both structure and content. The lists' records and component data fields are usually not set apart explicitly from the rest of the text, especially in a corpus of OCRed historical documents. OCR errors and the lack of document structure (e.g. HTML tags) make list content hard to recognize by a software tool developed without a substantial amount of highly specialized, hand-coded knowledge or machine learning supervision. Making an approach that is not only accurate but also sufficiently scalable in terms of time and space complexity to process a large corpus efficiently is especially challenging. In this dissertation, we introduce a novel family of scalable approaches to list discovery and ontology population. Its contributions include the following. We introduce the first general-purpose methods of which we are aware for both list detection and wrapper induction for lists in OCRed or other plain text.
We formally outline a mapping between in-line labeled text and populated ontologies, effectively reducing the ontology population problem to a sequence labeling problem, opening the door to applying sequence labelers and other common text tools to the goal of populating a richly structured ontology from text. We provide a novel admissible heuristic for inducing regular expression wrappers using an A* search. We introduce two ways of modeling list-structured text with a hidden Markov model. We present two query strategies for active learning in a list-wrapper induction setting. Our primary contributions are two complete and scalable wrapper-induction-based solutions to the end-to-end challenge of finding lists, extracting data, and populating an ontology. The first has linear time and space complexity and extracts highly accurate information at a low cost in terms of user involvement. The second has time and space complexity that are linear in the size of the input text and quadratic in the length of an output record and achieves higher F1-measures for extracted information as a function of supervision cost. We measure the performance of each of these approaches and show that they perform better than strong baselines, including variations of our own approaches and a conditional random field-based approach.
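In the regex variant, the end product of wrapper induction for a list is a pattern like the hand-written one below; applying it to OCRed lines populates a flat record structure. The record format and field names are invented for illustration, not taken from the dissertation's corpora.

```python
import re

# A wrapper of the kind the induction search would output, for hypothetical
# records shaped like "Smith, John b. 1852 d. 1920" (death year optional).
WRAPPER = re.compile(
    r"(?P<surname>[A-Z][a-z]+),\s+(?P<given>[A-Z][a-z]+)"
    r"\s+b\.\s+(?P<birth>\d{4})(?:\s+d\.\s+(?P<death>\d{4}))?")

def extract_records(lines):
    """Populate a flat record structure (list of dicts) from every line
    the wrapper matches; non-matching OCR noise is skipped."""
    records = []
    for line in lines:
        m = WRAPPER.search(line)
        if m:
            records.append(m.groupdict())
    return records
```

The induced wrapper plays the role of the in-line labeler: its named groups correspond to ontology properties, so matched lines map directly to populated fact assertions.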
359

Particle-based Stochastic Volatility in Mean model / Partikel-baserad stokastisk volatilitet medelvärdes model

Kövamees, Gustav January 2019 (has links)
This thesis presents a Stochastic Volatility in Mean (SVM) model estimated using sequential Monte Carlo methods. The SVM model, first introduced by Koopman, provides an opportunity to study the intertemporal relationship between stock returns and their volatility by including volatility itself as an explanatory variable in the mean equation. Sequential Monte Carlo methods allow a non-linear estimation procedure at the cost of extra computational complexity. The recently developed PaRIS algorithm, introduced by Olsson and Westerborn, drastically decreases the computational complexity of smoothing relative to previous algorithms and allows for efficient estimation of parameters. The main purpose of this thesis is to investigate the volatility feedback effect, i.e. the relation between expected return and unexpected volatility, in an empirical study. The results show that unanticipated shocks to the return process do not explain expected returns.
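A bootstrap particle filter for the plain stochastic-volatility special case (without the in-mean term, and with made-up parameter values) conveys the sequential Monte Carlo machinery; an online smoother such as PaRIS builds on a filter of exactly this shape.

```python
import math, random

def particle_filter_sv(y, mu=-1.0, phi=0.95, sigma=0.3, n_part=500, seed=1):
    """Bootstrap particle filter for a basic stochastic volatility model:
        h_t = mu + phi*(h_{t-1} - mu) + sigma*eta_t,   y_t ~ N(0, exp(h_t)).
    Returns the filtered mean of the log-volatility h_t at each time step."""
    rng = random.Random(seed)
    # initialize particles from the stationary distribution of h
    parts = [mu + sigma / math.sqrt(1 - phi ** 2) * rng.gauss(0, 1)
             for _ in range(n_part)]
    means = []
    for yt in y:
        # propagate through the AR(1) state equation
        parts = [mu + phi * (h - mu) + sigma * rng.gauss(0, 1) for h in parts]
        # weight by the observation density N(yt; 0, exp(h)), up to a constant
        w = [math.exp(-0.5 * yt ** 2 * math.exp(-h) - 0.5 * h) for h in parts]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * h for wi, h in zip(w, parts)))
        # multinomial resampling
        parts = rng.choices(parts, weights=w, k=n_part)
    return means

y = [2.0, -2.0] * 10 + [0.1, -0.1] * 10     # high- then low-volatility stretch
filtered = particle_filter_sv(y)
```

On this synthetic series the filtered log-volatility is clearly higher during the large-return stretch than during the quiet one, which is the behavior parameter estimation then exploits.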
360

A Markovian Approach to Financial Market Forecasting / En Markovisk ansats för finansiell marknadsprognostisering

Sun Wang, Kevin, Borin, William January 2023 (has links)
This thesis aims to investigate the feasibility of using a Markovian approach to forecast short-term stock market movements. To assist traders in making sound trading decisions, this study proposes a Markov model built from a selection of the latest closing prices. Assuming that each time step in the one-minute time frame of the stock market is stochastically independent, the model eliminates the impact of fundamental analysis and yields a tractable Markov model. The model treats the stock price's movement as entirely randomly generated, which allows for a simplified model that can be implemented with ease. It is intended to serve as a starting point for more advanced technical trading strategies and to act as useful guidance for a short-term trader when combined with other resources. Building the model involves Laplace smoothing, to ensure there are no zero probabilities, and calculating the steady-state probability vector of the smoothed matrix to determine the predicted direction of the next time step. The model resets daily, reducing the impact of fundamental factors occurring outside trading hours and the risk of carrying over bias from the previous trading day; any open positions are therefore closed at the end of the day. The purpose of the study is to research and test whether a simple forecasting model based on Markov chains can serve as a useful tool for forecasting stock prices at short time intervals. The results show that a Markov-based trading strategy is more profitable than a simple buy-and-hold strategy and that the prediction accuracy of the Markov model is relatively high.
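The model-building steps just described — count one-minute up/down transitions, Laplace-smooth the counts, and take the steady-state vector of the smoothed matrix as the forecast — fit in a few lines. The two-state up/down encoding (ties counted as down) and the pseudo-count of 1 are assumed simplifications.

```python
def markov_direction_forecast(closes, alpha=1.0):
    """Build a 2-state (0=down, 1=up) transition matrix from successive
    closing prices, apply Laplace smoothing with pseudo-count alpha, and
    return the steady-state probability of an 'up' move."""
    moves = [1 if b > a else 0 for a, b in zip(closes, closes[1:])]
    counts = [[alpha, alpha], [alpha, alpha]]          # Laplace smoothing
    for s, t in zip(moves, moves[1:]):
        counts[s][t] += 1
    P = [[c / sum(row) for c in row] for row in counts]
    p = [0.5, 0.5]
    for _ in range(100):                               # power iteration
        p = [p[0] * P[0][j] + p[1] * P[1][j] for j in range(2)]
    return p[1]   # long-run probability of an upward move
```

A trader following the sketch would go long when the returned probability exceeds 0.5 and flat (or short) otherwise, resetting the counts at the start of each trading day as the thesis prescribes.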
