881

Judging the Behavior of People We Know: Objective Assessment, Confirmation of Preexisting Views, or Both?

Leising, Daniel, Gallrein, Anne-Marie B., Dufner, Michael 17 September 2019
The present study investigates the relative extent to which judgments of people’s behavior are influenced by “truth” (as measured by averaged observer-judgments) and by systematic bias (i.e., perceivers’ preexisting views of target persons). Using data from online questionnaires and laboratory sessions (N = 155), we demonstrate that self- and peer-judgments of people’s actual behavior in specific situations are somewhat accurate but are also affected by what perceivers thought of the targets before observing their behavior. The latter effect comprises a general evaluative component (generally positive or negative views of targets) and a content-specific component (views of targets in terms of specific characteristics, for example, “restrained”). We also found that friends, but not targets themselves, tend to judge targets’ behaviors more positively than unacquainted observers do. The relevance of these findings for person perception in everyday life and in research contexts is discussed.
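The truth-plus-bias decomposition described above can be sketched as a simple regression: each perceiver's behavior judgment is predicted from the averaged observer judgment (the "truth" criterion) and the perceiver's preexisting view of the target. The data below are simulated and the effect sizes are illustrative assumptions, not the study's actual measures.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 155  # sample size borrowed from the abstract; the data themselves are simulated

truth = rng.normal(size=n)                      # averaged observer-judgments of behavior
prior_view = 0.5 * truth + rng.normal(size=n)   # preexisting view, partly accurate itself
judgment = 0.6 * truth + 0.3 * prior_view + 0.2 * rng.normal(size=n)

# Regressing judgments on truth and prior views separates accuracy from bias
X = np.column_stack([np.ones(n), truth, prior_view])
beta, *_ = np.linalg.lstsq(X, judgment, rcond=None)
accuracy_weight, bias_weight = beta[1], beta[2]
```

With this framing, a nonzero `bias_weight` after controlling for `truth` is the signature of confirmation of preexisting views.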
882

Fractional Flow Reserve Using Computed Tomography for Assessing Coronary Artery Disease: A Meta-Analysis

Panchal, Hemang B., Veeranki, Sreenivas P., Bhatheja, Samit, Barry, Neil, Mahmud, Ehtisham, Budoff, Matthew, Lavine, Steven J., Mamudu, Hadii M., Paul, Timir K. 01 January 2016
Aims: Noninvasive fractional flow reserve (FFR) measurement with computed tomography (FFRCT) is a newly described method for assessing the functional significance of coronary disease. The objective of this meta-analysis is to determine the diagnostic performance of FFRCT in the assessment of hemodynamically significant coronary artery stenosis. Methods: PubMed and the Cochrane Central Register of Controlled Trials were searched from January 2000 through February 2015. Six original studies were found comparing FFRCT to invasive FFR in evaluating the hemodynamic significance of coronary lesions (1354 vessels; 812 patients). Lesions were considered hemodynamically significant if invasive FFR was 0.80 or less; FFRCT used the same cutoff to define a positive test. Sensitivity, specificity, positive and negative likelihood ratios, and diagnostic odds ratio were calculated. Results: One-third of the lesions (n=443) were hemodynamically significant. The pooled per-vessel analysis showed that the sensitivity, specificity, negative and positive likelihood ratios, and diagnostic odds ratio of FFRCT for diagnosing hemodynamically significant coronary disease were 0.84 [95% confidence interval (CI): 0.80-0.87], 0.76 (95% CI: 0.73-0.79), 0.22 (95% CI: 0.17-0.29), 3.48 (95% CI: 2.21-5.47), and 16.82 (95% CI: 8.20-34.49), respectively. Conclusion: The results of this meta-analysis demonstrate that FFRCT results correlate closely with invasive coronary angiography and FFR measurement. FFRCT is a feasible noninvasive method for assessing the hemodynamic significance of coronary lesions in patients with stable coronary artery disease.
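The pooled statistics reported above all derive from a 2×2 table of test results against the invasive-FFR reference standard. A minimal sketch of the per-vessel computation is shown below; the counts are hypothetical, chosen only to mirror the pooled sensitivity and specificity, not the meta-analysis data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios and diagnostic odds ratio
    for a binary test judged against a reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # how much a positive test raises the odds of disease
    lr_neg = (1 - sens) / spec   # how much a negative test lowers them
    dor = lr_pos / lr_neg        # overall discriminative power
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical per-vessel counts (illustrative only)
sens, spec, lr_pos, lr_neg, dor = diagnostic_metrics(tp=84, fp=24, fn=16, tn=76)
```

Note that the diagnostic odds ratio collapses sensitivity and specificity into a single figure, which is why meta-analyses such as this one report it alongside the likelihood ratios.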
883

Selection on Intra-Individual Variation in Stigma–Anther Distance in the Tropical Tree Ipomoea Wolcottiana (Convolvulaceae)

Arceo-Gómez, G., Vargas, C. F., Parra-Tabla, V. 01 May 2017
It is well known that animals can exert strong selective pressures on plant traits. However, studies on the evolutionary consequences of plant–animal interactions have mainly focused on understanding how these interactions shape trait means, while overlooking their potential direct effect on the variability among structures within a plant (e.g. flowers and fruits). The degree of within-plant variability can have strong fitness effects, but few studies have evaluated its role as a potential target of selection. Here we reanalysed data on Ipomoea wolcottiana stigma–anther distance to test alternative mechanisms driving selection on the mean as well as on intra-individual variance in two years. We found strong negative selection acting on intra-individual variation but not on mean stigma–anther distance, suggesting that the variability itself, rather than the mean, was a direct target of selection. Our results suggest that intra-individual variance has the potential to be an important target of selection in nature, and that ignoring it could lead to a wrong characterisation of the selection regime. We highlight the need for future studies to consider patterns of selection on the mean as well as on intra-individual variance if we want to understand the full extent of plant–animal interactions as an evolutionary force in nature.
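Selection on a trait mean versus its within-plant variance, as analysed above, is conventionally estimated by regressing relative fitness on standardized predictors (a Lande–Arnold-style analysis). The simulated data and effect sizes below are illustrative assumptions, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n_plants = 200

mean_sad = rng.normal(10.0, 1.0, n_plants)   # per-plant mean stigma-anther distance (mm)
var_sad = rng.gamma(2.0, 0.5, n_plants)      # within-plant variance of the trait
# Simulated fitness declines with intra-individual variance, independent of the mean
fitness = np.maximum(0.1, 5.0 - 1.5 * var_sad + rng.normal(0.0, 0.5, n_plants))

w = fitness / fitness.mean()                 # relative fitness
z = lambda x: (x - x.mean()) / x.std()       # standardize each predictor
X = np.column_stack([np.ones(n_plants), z(mean_sad), z(var_sad)])
grad, *_ = np.linalg.lstsq(X, w, rcond=None)
beta_mean, beta_var = grad[1], grad[2]       # directional selection gradients
```

A negative `beta_var` with `beta_mean` near zero reproduces the qualitative pattern the abstract describes: selection against variability without selection on the mean.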
884

Diagnostic Value of Noninvasive Computed Tomography Perfusion Imaging and Coronary Computed Tomography Angiography for Assessing Hemodynamically Significant Native Coronary Artery Lesions

Sethi, Pooja, Panchal, Hemang B., Veeranki, Sreenivas P., Ur Rahman, Zia, Mamudu, Hadii, Paul, Timir K. 01 September 2017
The objective of this study is to determine the diagnostic performance of computed tomography perfusion (CTP), with and without computed tomography angiography (CTA), in the assessment of hemodynamically significant coronary artery lesions, using invasive fractional flow reserve (FFR) as the reference. Materials and Methods: PubMed and the Cochrane Central Register of Controlled Trials were searched from January 2010 through December 2014. Nine original studies (n = 951) were selected that evaluated the diagnostic performance of CTP, with and without CTA, against invasive measurements in determining the hemodynamic significance of coronary lesions. Results: The sensitivity, specificity, positive and negative likelihood ratios (LR+ and LR-), and diagnostic odds ratio (DOR) of CTA+CTP were 0.85 [95% confidence interval (CI): 0.79-0.89], 0.94 (CI: 0.91-0.97), 15.8 (CI: 7.99-31.39), 0.146 (CI: 0.08-0.26), and 147.2 (CI: 69.77-310.66), respectively. Summary receiver operating characteristic (SROC) analysis showed an area under the curve (AUC) of 0.97, indicating that CTA+CTP may detect hemodynamically significant coronary artery lesions with high accuracy. The sensitivity, specificity, LR+, LR-, and DOR of CTP alone were 0.83 (CI: 0.78-0.87), 0.84 (CI: 0.80-0.87), 5.26 (CI: 2.93-9.43), 0.209 (CI: 0.12-0.36), and 31.97 (CI: 11.59-88.20), respectively. Conclusions: These results suggest that CTP combined with CTA significantly improves the diagnostic performance for coronary artery lesions compared with CTA alone and is closely comparable with invasive FFR.
885

Investigating techniques for improving accuracy and limiting overfitting for YOLO and real-time object detection on iOS

Güven, Jakup January 2019
This paper describes the development of a real-time object detection system for iOS using YOLO, a state-of-the-art one-stage object detector based on a convolutional neural network that surpasses other real-time detectors in speed and accuracy. As part of a system development process, a YOLO-based model is trained to detect doors. The machine learning workflow is outlined, and practices for avoiding overfitting and for increasing accuracy and speed are discussed and applied. A series of experiments is conducted, the results of which suggest that data augmentation, the inclusion of negative examples in the dataset, hyperparameter optimisation, and transfer learning are viable techniques for improving the performance of an object detection model. Based on the results of these experiments, the author increases the model's mAP, a standard accuracy measure for object detectors, from 63.76% to 86.73%. The model's tendency to overfit is also explored, with results suggesting that training beyond 300 epochs is likely to produce an overfitted model.
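One of the augmentation techniques discussed above, horizontal flipping, is easy to get wrong because the bounding-box annotations must be transformed together with the image. A minimal sketch for YOLO-format labels (class, x-center, y-center, width, height, all normalised to [0, 1]) is shown below; it is a generic illustration, not the thesis' actual pipeline.

```python
def hflip_yolo_boxes(boxes):
    """Mirror YOLO-format boxes for a horizontally flipped image.

    Each box is (class_id, x_center, y_center, width, height) with
    coordinates normalised to [0, 1]; only x_center changes under a flip.
    """
    return [(c, 1.0 - x, y, w, h) for (c, x, y, w, h) in boxes]

# A door centred at x = 0.2 ends up centred at x = 0.8 after the flip
flipped = hflip_yolo_boxes([(0, 0.2, 0.5, 0.1, 0.3)])
```

Because the coordinates are normalised, the same transform works at any image resolution, which is one reason the YOLO label format is convenient for augmentation.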
886

Equity analysts' accuracy : Do differences in the accuracy of consensus target prices depend on company size?

Delic, Inas, Bergman, Oliver January 2020
Background: Equity research analysts publish reports containing recommendations and target prices for stocks. Much research has examined the accuracy of earnings-per-share forecasts, and studies have also addressed target price accuracy on various markets, mostly for larger companies. This study examines target price accuracy on the Swedish market for companies from the OMX Stockholm Small Cap, Mid Cap, and Large Cap lists; its contribution relative to earlier work is the focus on differences in target price accuracy between large and small companies. Aim: The aim of this thesis is to analyse target price accuracy on the Swedish market and to compare this accuracy between large and small companies, to see whether there is any connection between company size and target price accuracy. Method: The study was conducted with a quantitative method and a deductive approach. Data for 98 companies on the Swedish market were collected to compute the absolute forecasting error over three-, six-, and twelve-month horizons. The absolute forecasting error was then analysed through regressions to test for a connection with the variables beta, the spread between the highest and lowest target price, the difference between target price and actual price, market capitalisation, sales, trading volume, and volatility; these variables were chosen because they differ significantly between large and small companies. Accuracy was also studied by checking whether the stock price reached the target price within the same horizons. Results: The results show that the proportions of achieved buy and hold recommendations are higher for constituents of OMX Stockholm Large Cap, whereas sell recommendations are more often achieved for OMX Stockholm Small Cap and Mid Cap constituents. Most differences are statistically significant at least at the 5% level. The regressions show that, at three- and six-month horizons, a higher beta and trading volume lead to a smaller absolute forecasting error, while increased discrepancy, absolute target price potential, market capitalisation, sales, and volatility lead to a higher absolute forecasting error. At a twelve-month horizon, the absolute forecasting error has a positive relationship with discrepancy, absolute target price potential, market capitalisation, trading volume, and volatility, while beta and sales show a negative relationship.
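The two accuracy measures used above can be stated compactly: the absolute forecasting error compares the target price with the realised price at the horizon, and target achievement checks whether the price path ever reached the target within the horizon. A minimal sketch under those definitions (function names and the toy numbers are the editor's own, not the thesis'):

```python
def absolute_forecast_error(target_price, realised_price):
    """Relative absolute deviation of the target price from the realised price."""
    return abs(target_price - realised_price) / realised_price

def target_achieved(price_path, target_price, recommendation="buy"):
    """True if the price ever reached the target within the horizon.

    Buy/hold targets are reached from below, sell targets from above.
    """
    if recommendation == "sell":
        return min(price_path) <= target_price
    return max(price_path) >= target_price

err = absolute_forecast_error(target_price=110.0, realised_price=100.0)  # 10% error
hit = target_achieved([95.0, 103.0, 99.0], target_price=100.0)           # reached
```

Separating the two measures matters: a target can be briefly reached during the horizon (achievement) while the realised price at the horizon still deviates substantially (forecasting error).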
887

Modeling the speed-accuracy tradeoff using the tools of information theory

Gori, Julien 20 December 2018
Fitts' law, which relates movement time MT in a pointing task to the target's distance D and width W, is usually expressed by mimicking Shannon's capacity formula: MT = a + b log2(1 + D/W). Yet the currently received analysis is incomplete and unsatisfactory: it stems from a vague analogy between the pointing task and the transmission of a signal over a noisy channel, with no explicit communication model for pointing. I first develop a transmission model for pointing tasks in which the index of difficulty ID = log2(1 + D/W) is the expression of both a source entropy and a channel capacity, thereby reconciling Fitts' approach with Shannon's information theory. This model is then leveraged to analyse pointing data gathered from controlled experiments as well as from field studies. I then develop a second model, built on the strong variability characteristic of human movement, which accounts for the great diversity of movement control mechanisms: with or without feedback, intermittent or continuous. From a chronometry of the positional variance, evaluated over a set of trajectories, it is observed that movement can be separated into two phases: a first phase, in which the variance increases and most of the distance to the target is covered, is followed by a second phase, in which the variance decreases until it satisfies the accuracy constraints required by the task. During this second phase, the problem of aiming can be reduced to a Shannon-like communication problem in which information is transmitted from a "source" (the variance at the end of the first phase) to a "destination" (the limb extremity) over a Gaussian channel with a feedback link. I show that the optimal solution to this transmission problem amounts to a scheme first suggested by Elias, that the variance can at best decrease exponentially during the second phase, and that this result directly implies Fitts' law.
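The central quantity above, the index of difficulty ID = log2(1 + D/W), and the resulting Fitts'-law prediction can be sketched directly. The intercept and slope below are made-up values for illustration, since the fitted coefficients depend on the dataset and device.

```python
import math

def index_of_difficulty(D, W):
    """Fitts' index of difficulty (bits) for target distance D and width W."""
    return math.log2(1 + D / W)

def movement_time(D, W, a=0.1, b=0.15):
    """Predicted movement time (s): MT = a + b * ID.  a and b are illustrative."""
    return a + b * index_of_difficulty(D, W)

# Distance 7 units, width 1 unit: ID = log2(8) = 3 bits
id_bits = index_of_difficulty(D=7, W=1)
```

Note that halving W at fixed D adds roughly one bit of difficulty, so under the law each halving of target width costs a constant extra b seconds of movement time.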
888

Precision and dynamics of linear piezo motors

Ilichman, Ondřej January 2014
The thesis deals with the design, control, precision, and dynamics of linear piezoelectric motors from PiezoMotor Uppsala AB. Its aim is to evaluate the applicability of a loaned motor in a mass spectrometer. The introductory section discusses design principles, functionality, materials used, and the main types of piezoelectric motors. The thesis then focuses on the choice of hardware and the design of mechanical assemblies that could replace the current mechanism based on electromagnetic motors. A test rig with the loaned piezoelectric motor was designed and fabricated, and a test application for control and measurement was created in the NI LabVIEW development environment. The final section summarises the positive and negative features of the loaned motor and evaluates its accuracy and dynamics.
889

Stock Market Forecasting: A Comparative Study of LSTM and SVR Models Across Different Datasets and Epochs

Nørklit Johansen, Mads, Sidhu, Jagtej January 2023
Predicting stock market trends is a complex task due to the inherent volatility and unpredictability of financial markets. Nevertheless, accurate forecasts are of critical importance to investors, financial analysts, and stakeholders, as they directly inform decision-making processes and risk management strategies associated with financial investments. Inaccurate forecasts can lead to notable financial consequences, emphasizing the crucial and demanding task of developing models that provide accurate and trustworthy predictions. This article addresses this challenging problem by utilizing a long short-term memory (LSTM) model to predict stock market developments. The study undertakes a thorough analysis of the LSTM model's performance across multiple datasets, critically examining the impact of different timespans and epochs on the accuracy of its predictions. Additionally, a comparison is made with a support vector regression (SVR) model using the same datasets and timespans, which allows for a comprehensive evaluation of the relative strengths of the two techniques. The findings offer insights into the capabilities and limitations of both models, thus paving the way for future research in stock market prediction methodologies. Crucially, the study reveals that larger datasets and an increased number of epochs can significantly enhance the LSTM model's performance. Conversely, the SVR model exhibits significant challenges with overfitting. Overall, this research contributes to ongoing efforts to improve financial prediction models and provides potential solutions for individuals and organizations seeking to make accurate and reliable forecasts of stock market trends.
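Both model families compared above consume the same supervised framing of a price series: a sliding lookback window as input and the next value as the target. A minimal, library-free sketch of that framing is shown below; the window length and toy series are arbitrary, not taken from the study.

```python
import numpy as np

def make_windows(series, lookback):
    """Turn a 1-D series into (X, y) pairs: each row of X holds `lookback`
    past values, and y holds the value immediately after each window."""
    series = np.asarray(series, dtype=float)
    X = np.stack([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X, y

# Toy closing prices framed with a 3-step lookback
X, y = make_windows([101, 102, 104, 103, 105, 107], lookback=3)
```

For an LSTM the rows of `X` would additionally be reshaped to (samples, timesteps, features), while an SVR consumes them flat; the windowing itself, and hence the information available per prediction, is identical for both.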
890

Evaluating the use of Brush and Tooltip for Time Series visualizations: A comparative study

Helin, Sebastian, Eklund, André January 2023
This study uses a combination of user testing and analysis to evaluate the impact of brush and tooltip interactions on the comprehension of time series visualizations. It employs a sequential mixed-methods approach: qualitative data from semi-structured interviews informed the design of a visualization tool, which was then validated in a quantitative user study. Sixteen (16) participants from various fields of study, predominantly computer science, took part. A MANOVA test was conducted, with results indicating a statistically significant difference between the groups. The results show that the use of brush and tooltip increases user accuracy in detecting outliers, as well as in perceiving trends and patterns. The study's context was limited to desktop usage, and all participants were treated as a homogeneous group, which limits the applicability of these findings to other devices or more diverse user groups. The results provide guidance on improving time series visualizations for more efficient and effective understanding, which is relevant in particular to data analysts and academic researchers.
