241

Detection and Analysis of Anomalies in Tactical Sensor Systems through Structured Hypothesis Testing / Detektion och analys av avikelser i taktiska sensorsystem genom strukturerad hypotesprövning

Ohlson, Fredrik January 2023 (has links)
The project explores the domain of tactical sensor systems, focusing on SAAB Gripen’s sensor technologies such as radar, RWR (Radar Warning Receiver), and IRST (InfraRed Search and Track). The study employs structured hypothesis testing and model-based diagnostics to examine the effectiveness of identifying and isolating deviations within these systems. The central question addressed is whether structured hypothesis testing reliably detects and isolates anomalies in a tactical sensor system. The research employs a framework involving sensor modeling of radar, RWR, and IRST, alongside a sensor fusion model, applied to a linear target-tracking model as well as a real target flight track obtained from SAAB Test Flight and Verification. Test quantities are derived from the modeled data, and synthetic faults are intentionally introduced into the system. These test quantities are then compared to predefined thresholds, thereby facilitating structured hypothesis testing. The robustness and reliability of the diagnostics model are established through a series of simulations. Multiple scenarios with varied fault introductions across different sensor measurements are examined. Key results include the successful creation of a tactical sensor model and sensor fusion environment, showcasing the ability to introduce and detect faults. The thesis provides arguments supporting the advantages of model-based diagnosis through structured hypothesis testing for assessing sensor fusion data. The results of this research are applicable beyond this specific context, facilitating improved sensor data analysis across diverse tracking scenarios, including applications beyond SAAB Gripen. As sensor technologies continue to evolve, the insights gained from this thesis could offer guidance for refining sensor models and hypothesis testing techniques, ultimately enhancing the efficiency and accuracy of sensor data analysis in various domains.
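The core mechanism described above (deriving test quantities from modeled sensor data and comparing them to thresholds) can be illustrated with a minimal sketch of structured hypothesis testing over redundant range measurements: pairwise residuals serve as test quantities, each insensitive to exactly one sensor, so the pattern of threshold crossings isolates the faulty channel. The sensor noise levels, fault magnitude, and linear track below are invented for illustration and are not the thesis's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear target track: range in meters over 100 time steps.
true_range = np.linspace(10_000.0, 8_000.0, 100)

# Three redundant sensor channels with individual (assumed) noise levels.
noise = {"radar": 15.0, "rwr": 40.0, "irst": 25.0}
z = {k: true_range + rng.normal(0.0, s, true_range.size) for k, s in noise.items()}

# Synthetic fault: a constant bias on the radar channel from step 60 onward.
z["radar"][60:] += 300.0

# Structured test quantities: pairwise differences, each insensitive to
# exactly one sensor, normalized by the std of the difference.
pairs = [("radar", "rwr"), ("radar", "irst"), ("rwr", "irst")]
fired = []
for a, b in pairs:
    r = (z[a] - z[b]) / np.hypot(noise[a], noise[b])
    t = np.convolve(np.abs(r), np.ones(8) / 8, mode="same")  # short smoothing window
    fired.append(bool(t.max() > 3.0))  # ~3-sigma threshold on the test quantity

# Fault signature: a sensor is isolated when every test involving it fires
# and the test excluding it stays quiet.
for s in noise:
    involved = [f for (a, b), f in zip(pairs, fired) if s in (a, b)]
    excluded = [f for (a, b), f in zip(pairs, fired) if s not in (a, b)]
    if all(involved) and not any(excluded):
        print("fault isolated to:", s)
```

With the injected bias, the two tests involving the radar fire while the RWR/IRST test stays quiet, which is exactly the fault signature that isolates the radar channel.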
242

Re-sampling in instrumental variables regression

Koziuk, Andzhey 13 July 2020 (has links)
Instrumental variables regression in the context of re-sampling is considered. The work builds a framework to identify the target of inference; it attempts to approach the idea of non-parametric regression and to motivate instrumental variables regression from a new perspective. The framework assumes the target of estimation to be formed by two factors: an environment and an internal, model-specific structure. Aside from the framework, the work develops a re-sampling method suited to testing a linear hypothesis on the target. The particular technical environment and procedure are given and explained in the introduction and in the body of the work. Specifically, following the work of Spokoiny, Zhilova 2015, the thesis justifies and numerically applies the 'multiplier bootstrap' procedure to construct non-asymptotic confidence intervals for the testing problem. The procedure and the underlying statistical toolbox were chosen to account for an issue appearing in the model and overlooked by asymptotic analysis, namely the weakness of instrumental variables. The issue, however, is addressed by the finite-sample approach of Spokoiny 2014.
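As a rough illustration of the multiplier-bootstrap idea in an instrumental-variables setting: perturb the score contributions z_i(y_i - beta_hat * x_i) with i.i.d. normal weights and read confidence bounds off the simulated fluctuations. The data-generating process and the single-instrument estimator below are illustrative assumptions, not the thesis's construction (which develops finite-sample guarantees in the spirit of Spokoiny 2014):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simulated IV model: z instruments x; u and v are correlated errors.
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.6 * v + rng.normal(size=n)   # endogeneity: u correlated with v
x = 0.8 * z + v                    # first stage (0.8 = instrument strength)
y = 1.5 * x + u                    # structural equation, true beta = 1.5

# Simple IV estimator for a single instrument.
beta_hat = (z @ y) / (z @ x)

# Multiplier bootstrap: perturb the score contributions z_i * residual_i
# with i.i.d. standard normal weights; each draw mimics beta_hat - beta.
score = z * (y - beta_hat * x)
B = 2000
w = rng.normal(size=(B, n))
beta_star = (w @ score) / (z @ x)

# Symmetric 95% confidence interval from bootstrap quantiles.
q = np.quantile(np.abs(beta_star), 0.95)
print(f"beta_hat = {beta_hat:.3f}, 95% CI = [{beta_hat - q:.3f}, {beta_hat + q:.3f}]")
```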
243

Corrected LM goodness-of-fit tests with application to stock returns

Percy, Edward Richard, Jr. 05 January 2006 (has links)
No description available.
244

Validation and Inferential Methods for Distributional Form and Shape

Mayorov, Kirill January 2017 (has links)
This thesis investigates some problems related to the form and shape of statistical distributions with the main focus on goodness of fit and bump hunting. A bump is a distinctive characteristic of distributional shape. A search for bumps, or bump hunting, in a probability density function (PDF) has long been an important topic in statistical research. We introduce a new definition of a bump which relies on the notion of the curvature of a planar curve. We then propose a new method for bump hunting which is based on a kernel density estimator of the unknown PDF. The method gives not only the number of bumps but also the location of their centers and base points. In quantitative risk applications, the selection of distributions that properly capture upper tail behavior is essential for accurate modeling. We study tests of distributional form, or goodness-of-fit (GoF) tests, that assess simple hypotheses, i.e., when the parameters of the hypothesized distribution are completely specified. From theoretical and practical perspectives, we analyze the limiting properties of a family of weighted Cramér-von Mises GoF statistics W² with weight function ψ(t) = 1/(1-t)^β (for β ≤ 2) which focus on the upper tail. We demonstrate that W² has no limiting distribution. For this reason, we provide a normalization of W² that leads to a non-degenerate limiting distribution. Further, we study W² for composite hypotheses, i.e., when distributional parameters must be estimated from a sample at hand. When the hypothesized distribution is heavy-tailed, we examine the finite sample properties of W² under the Chen-Balakrishnan transformation that reduces the original GoF test (the direct test) to a test for normality (the indirect test). In particular, we compare the statistical level and power of the pairs of direct and indirect tests. We observe that decisions made by the direct and indirect tests agree well, and in many cases they become independent as sample size grows. / Thesis / Doctor of Philosophy (PhD)
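A sketch of how such an upper-tail-weighted statistic can be computed for a fully specified (simple) hypothesis, with a Monte Carlo null distribution standing in for the asymptotic theory the thesis develops. The grid-based integration, the choice β = 1, and the Exp(1) example are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def w2_stat(u, beta=1.0, grid=2000):
    """Weighted Cramer-von Mises statistic with psi(t) = 1/(1-t)**beta,
    emphasizing the upper tail. u is the data transformed by the
    hypothesized CDF, so u should be Uniform(0,1) under H0."""
    n = u.size
    t = (np.arange(grid) + 0.5) / grid                       # midpoints on (0, 1)
    fn = np.searchsorted(np.sort(u), t, side="right") / n    # empirical CDF
    psi = 1.0 / (1.0 - t) ** beta
    return n * np.mean((fn - t) ** 2 * psi)                  # numerical integral

# Simple hypothesis H0: sample ~ Exp(1). True scale differs, so H0 is false.
x = rng.exponential(scale=1.2, size=200)
u = 1.0 - np.exp(-x)                  # probability integral transform under H0
obs = w2_stat(u, beta=1.0)

# Monte Carlo null distribution (valid because H0 is fully specified).
null = np.array([w2_stat(rng.uniform(size=200)) for _ in range(1000)])
print(f"W2 = {obs:.3f}, p-value = {(null >= obs).mean():.3f}")
```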
245

Ambient Backscatter Communication Systems: Design, Signal Detection and Bit Error Rate Analysis

Devineni, Jaya Kartheek 21 September 2021 (has links)
The success of the Internet-of-Things (IoT) paradigm relies on, among other things, developing energy-efficient communication techniques that can enable information exchange among billions of battery-operated IoT devices. With its technological capability of simultaneous information and energy transfer, ambient backscatter is quickly emerging as an appealing solution for this communication paradigm, especially for the links with low data rate requirements. However, many challenges and limitations of ambient backscatter have to be overcome for widespread adoption of the technology in future wireless networks. Motivated by this, we study the design and implementation of ambient backscatter systems, including non-coherent detection and encoding schemes, and investigate techniques such as multiple antenna interference cancellation and frequency-shift backscatter to improve the bit error rate performance of the designed ambient backscatter systems. First, the problem of coherent and semi-coherent ambient backscatter is investigated by evaluating the exact bit error rate (BER) of the system. The test statistic used for the signal detection is based on averaging the energy of the received signal samples. It is important to highlight that the conditional distributions of this test statistic are derived using the central limit theorem (CLT) approximation in the literature. The characterization of the exact conditional distributions of the test statistic as a non-central chi-squared random variable for the binary hypothesis testing problem is first handled in our study, which is a key contribution of this particular work. The evaluation of the maximum likelihood (ML) detection threshold is also explored and is found to be intractable. To overcome this, alternate strategies to approximate the ML threshold are proposed. In addition, several insights for system design and implementation are provided both from analytical and numerical standpoints. Second, the highly appealing non-coherent signal detection is explored in the context of ambient backscatter for a time-selective channel. Modeling the time-selective fading as a first-order autoregressive (AR) process, we implement a new detection architecture at the receiver based on the direct averaging of the received signal samples, which departs significantly from the energy averaging-based receivers considered in the literature. For the proposed setup, we characterize the exact asymptotic BER for both single-antenna (SA) and multi-antenna (MA) receivers, and demonstrate the robustness of the new architecture to timing errors. Our results demonstrate that the direct-link (DL) interference from the ambient power source leads to a BER floor in the SA receiver, which the MA receiver can avoid by estimating the angle of arrival (AoA) of the DL. The analysis further quantifies the effect of improved angular resolution on the BER as a function of the number of receive antennas. Third, the advantages of utilizing Manchester encoding for the data transmission in the context of non-coherent ambient backscatter have been explored. Specifically, encoding is shown to simplify the detection procedure at the receiver since the optimal decision rule is found to be independent of the system parameters.
Through extensive numerical results, it is further shown that a backscatter system with Manchester encoding can achieve a signal-to-noise ratio (SNR) gain compared to the commonly used uncoded direct on-off keying (OOK) modulation, when used in conjunction with a multi-antenna receiver employing the direct-link cancellation. Fourth, the BER performance of frequency-shift ambient backscatter, which achieves self-interference mitigation by spectrally separating the reflected backscatter signal from the impinging source signal, is investigated. The performance of the system is evaluated for a non-coherent receiver under slow fading in two different network setups: 1) a single interfering link coming from the ambient transmission occurring in the shifted frequency region, and 2) a large-scale network with multiple interfering signals coming from the backscatter nodes and ambient source devices transmitting in the band of interest. Modeling the interfering devices as a two-dimensional Poisson point process (PPP), tools from stochastic geometry are utilized to evaluate the bit error rate for the large-scale network setup. / Doctor of Philosophy / The emerging paradigm of Internet-of-Things (IoT) has the capability of radically transforming the human experience. At the heart of this technology are the smart edge devices that will monitor everyday physical processes, communicate regularly with the other nodes in the network chain, and automatically take appropriate actions when necessary. Naturally, many challenges need to be tackled in order to realize the true potential of this technology. Most relevant to this dissertation are the problems of powering potentially billions of such devices and enabling low-power communication among them. Ambient backscatter has emerged as a useful technology to handle the aforementioned challenges of the IoT networks due to its capability to support the simultaneous transfer of information and energy. This technology allows devices to harvest energy from the ambient signals in the environment, thereby making them self-sustainable, while the same ambient signals provide carriers for information exchange. Using these attributes of ambient backscatter, the devices can operate at very low power, which is an important feature when considering the reliability requirements of the IoT networks. That said, the ambient backscatter technology needs to overcome many challenges before its widespread adoption in IoT networks. For example, the range of backscatter is limited in comparison to conventional communication systems due to self-interference from the power source at a receiver. In addition, the probability of detecting the data in error at the receiver, characterized by the bit error rate (BER) metric, is generally poor in ambient backscatter in the presence of wireless multipath, due to the double path loss and fading effects observed on the backscatter link. Inspired by this, the aim of this dissertation is to come up with new architecture designs for the transmitter and receiver devices that can improve the BER performance. The key contributions of the dissertation include analytical derivations of the BER, which provide insights into the system design and the main parameters impacting the system performance. The exact design of the optimal detection technique for a communication system is dependent on the channel behavior, mainly its time-varying nature in the case of a flat-fading channel.
Depending on the mobility of devices and scatterers present in the wireless channel, it can be described as either time-selective or time-nonselective. In time-nonselective channels, coherent detection, which requires channel state information (CSI) estimation using pilot signals, can be implemented for ambient backscatter. On the other hand, non-coherent detection is preferred when the channel is time-selective, since CSI estimation is not feasible in such scenarios. In the first part of this dissertation, we analyze the performance of ambient backscatter in a point-to-point single-link system for both time-nonselective and time-selective channels. In particular, we determine the BER performance of coherent and non-coherent detection techniques for ambient backscatter systems in this line of work. In addition, we investigate the possibility of improving the BER performance using multi-antenna and coding techniques. Our analyses demonstrate that the use of multiple antennas and coding can result in tremendous performance improvement and simplification of the detection procedure, respectively. In the second part of the dissertation, we study the performance of ambient backscatter in a large-scale network and compare it to that of the point-to-point single-link system. By leveraging tools from stochastic geometry, we analytically characterize the BER performance of ambient backscatter in a field of interfering devices modeled as a Poisson point process.
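The energy-averaging detector from the first contribution can be sketched as follows: scaling the summed energy by N/sigma^2 gives the non-central chi-squared conditional distributions mentioned above, and a numerical search for the density crossing stands in for the intractable closed-form ML threshold. The amplitudes, sample count, and absence of fading below are simplifying assumptions, not the dissertation's full model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

N = 64                   # received samples averaged per backscatter bit
sigma2 = 1.0             # noise power
mu = {0: 0.4, 1: 0.9}    # assumed per-sample amplitude for bits 0 / 1
                         # (bit 1 reflects more of the ambient signal)

# Scaled by N/sigma2, the summed energy under each bit is non-central
# chi-squared with N degrees of freedom and non-centrality N*mu^2/sigma2.
lam = {b: N * m ** 2 / sigma2 for b, m in mu.items()}
g = np.linspace(0.2, 3.0, 3000)                      # candidate energy levels
pdf0 = stats.ncx2.pdf(g * N / sigma2, df=N, nc=lam[0])
pdf1 = stats.ncx2.pdf(g * N / sigma2, df=N, nc=lam[1])
thr = g[np.argmax(pdf1 > pdf0)]   # likelihood-ratio rule: decide 1 past the crossing

# Simulate the energy-averaging receiver and measure the bit error rate.
def avg_energy(bit, trials):
    s = rng.normal(mu[bit], np.sqrt(sigma2), size=(trials, N))
    return (s ** 2).mean(axis=1)

e0, e1 = avg_energy(0, 50_000), avg_energy(1, 50_000)
ber = 0.5 * ((e0 > thr).mean() + (e1 <= thr).mean())
print(f"threshold = {thr:.3f}, simulated BER = {ber:.3e}")
```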
246

Performance Analysis of Detection System Design Algorithms

Nyberg, Karl-Johan 11 April 2003 (has links)
Detection systems are widely used in industry. Designers, operators and users of these systems need to choose an appropriate design, based on the intended usage and the operating environment. The purpose of this research is to analyze the effect of various system design variables (controllable) and system parameters (uncontrollable) on the performance of detection systems. To optimize system performance one must manage the tradeoff between two errors that can occur. A False Alarm occurs if the detection system falsely indicates a target is present and a False Clear occurs if the detection system falsely fails to indicate a target is present. Given a particular detection system and a pre-specified false clear (or false alarm) rate, there is a minimal false alarm (or false clear) rate that can be achieved. Earlier research has developed methods that address this false alarm, false clear tradeoff problem (FAFCT) by formulating a Neyman-Pearson hypothesis problem, which can be solved as a Knapsack problem. The objective of this research is to develop guidelines that can be of help in designing detection systems. For example, what system design variables must be implemented to achieve a certain false clear standard for a parallel 2-sensor detection system for Salmonella detection? To meet this objective, an experimental design is constructed and an analysis of variance is performed. Computational results are obtained using the FAFCT-methodology and the results are presented and analyzed using ROC (Receiver Operating Characteristic) curves and an analysis of variance. The research shows that sample size (i.e., size of test data set used to estimate the distribution of sensor responses) has very little effect on the FAFCT compared to other factors. The analysis clearly shows that correlation has the most influence on the FAFCT. Negatively correlated sensor responses outperform uncorrelated and positively correlated sensor responses with large margins, especially for strict FC-standards (FC-standard is defined as the maximum allowed False Clear rate). The FC-standard is the second most influential design variable, followed by grid size. Suggestions for future research are also included. / Master of Science
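The Neyman-Pearson construction behind the FAFCT can be sketched for a discretized two-sensor system: sort the joint response cells by likelihood ratio and grow the alarm region until the FC-standard is met, which is the knapsack-style selection mentioned above. The sensor response pmfs and the independence assumption below are illustrative (the research's key finding is precisely that correlation matters), and randomized rules are ignored:

```python
import numpy as np
from itertools import product

# Discretized responses of two sensors (8 quantization levels each).
levels = 8
bins = np.arange(levels)

def pmf(center):
    """Toy discrete response pmf centered at `center`."""
    w = np.exp(-0.5 * ((bins - center) / 1.5) ** 2)
    return w / w.sum()

p_clear = [pmf(2.0), pmf(2.5)]    # response pmfs with no target present
p_target = [pmf(5.0), pmf(4.5)]   # response pmfs with a target present

# Joint cell probabilities under each hypothesis (independent sensors here).
P0 = np.outer(p_clear[0], p_clear[1])     # no target
P1 = np.outer(p_target[0], p_target[1])   # target present

# Neyman-Pearson selection: add joint cells to the "alarm" region in
# decreasing likelihood-ratio order until the False Clear standard is met.
cells = sorted(product(bins, bins), key=lambda c: P1[c] / P0[c], reverse=True)
fc_standard = 0.01    # maximum allowed False Clear rate
detect, fa = 0.0, 0.0
for c in cells:
    if 1.0 - detect <= fc_standard:
        break
    detect += P1[c]
    fa += P0[c]
print(f"False Clear = {1 - detect:.4f}, minimal False Alarm = {fa:.4f}")
```

Sweeping fc_standard through a range of values traces out the ROC curve used in the analysis above.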
247

Medborgarforskning inom biologisk mångfald på kommunal nivå : En fallstudie i Tierps kommun / Citizen science within biodiversity at a municipal level : A case study in Tierps kommun

Ekroth, Tobias, Sanne, Tom, Wennergren, Oliver January 2024 (has links)
The rate of global extinction is significantly higher today than in prehuman times, and without action the extinction will accelerate. This places great pressure on decision-makers to act. It is therefore important to monitor biodiversity in order to make strategic decisions, something that municipalities in Sweden only do at a limited scale. At the same time, there is a wealth of citizen-science data on species richness, where volunteer citizens report their findings in the database Artportalen. The opportunity to utilize this type of data is explored in this case study of Tierps kommun, where the trend of biodiversity is examined. Additionally, hypothesis testing was used to quantify the uncertainty in the results. However, several scientists argue that there are challenges in using such data, so this study also aimed to explore these. To do this, complementary information regarding time, space, and species was gathered and analyzed. The results indicated a general negative trend in biodiversity for the municipality of Tierp, which was validated by the hypothesis testing. Regarding the problems that were examined, it was found that most of the reports were concentrated around particular places, months, and species, which limits the possibility of drawing firm conclusions about the municipality's biodiversity as a whole. In conclusion, the results produced would need to be complemented by other data or by other methods.
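A minimal sketch of the kind of trend hypothesis test involved, on hypothetical yearly species-richness figures (the study's actual data come from Artportalen, and its analysis must contend with the reporting biases discussed above):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical yearly species counts distilled from occurrence reports
# (in practice: unique species reported per year in Artportalen).
years = np.arange(2000, 2024)
richness = 180 - 1.2 * (years - 2000) + rng.normal(0, 8, years.size)

# H0: no trend in species richness; test the regression slope.
res = stats.linregress(years, richness)
print(f"slope = {res.slope:.2f} species/year, p = {res.pvalue:.4f}")

# Caveat mirrored from the study: reporting effort varies by year, so a
# naive count per year conflates observer effort with the actual trend.
```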
248

Spectrum Sensing in Cognitive Radios using Distributed Sequential Detection

Jithin, K S January 2013 (has links) (PDF)
Cognitive Radios are emerging communication systems which efficiently utilize the unused licensed radio spectrum, known as spectral holes. They run spectrum sensing algorithms to identify these spectral holes. These holes need to be identified at very low SNR (≤ −20 dB) under multipath fading, unknown channel gains, and noise power. Cooperative spectrum sensing, which exploits spatial diversity, has been found to be particularly effective in this rather daunting endeavor. However, despite many recent studies, several open issues need to be addressed for such algorithms. In this thesis we provide some novel cooperative distributed algorithms and study their performance. We develop an energy-efficient detector with low detection delay using decentralized sequential hypothesis testing. Our algorithm at the Cognitive Radios employs an asynchronous transmission scheme which takes into account the noise at the fusion center. We have developed a distributed algorithm, DualSPRT, in which Cognitive Radios (secondary users) sequentially collect the observations, make local decisions and send them to the fusion center. The fusion center sequentially processes these received local decisions, corrupted by Gaussian noise, to arrive at a final decision. Asymptotically, this algorithm is shown to achieve the performance of the optimal centralized test, which does not consider fusion center noise. We also theoretically analyze its probability of error and average detection delay. Even though DualSPRT performs asymptotically well, a modification at the fusion node provides more control over the design of the algorithm parameters and performs better at the usual operating probabilities of error in Cognitive Radio systems. We also analyze the modified algorithm theoretically. DualSPRT requires full knowledge of channel gains; thus we extend the algorithm to account for imperfections in the channel gain estimates. We also consider the case when knowledge about the noise power and the channel gain statistics is not available at the Cognitive Radios. This problem is framed as a universal sequential hypothesis testing problem, and we use easily implementable universal lossless source codes to propose simple algorithms for such a setup. The asymptotic performance of the algorithm is presented. A cooperative algorithm is also designed for such a scenario. Finally, decentralized multihypothesis sequential tests, which are relevant when the interest is in detecting not only the presence of primary users but also their identity among multiple primary users, are also considered. Using the insight gained from the binary hypothesis case, two new algorithms are proposed.
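The sequential test running at each Cognitive Radio can be illustrated with a single-node Wald SPRT; DualSPRT additionally forwards local decisions to a fusion center that runs its own sequential test over a noisy channel, which this sketch omits. The Gaussian-shift signal model and the parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

# SPRT for H0: X ~ N(0, 1) (channel free) vs H1: X ~ N(mu, 1) (primary present).
mu = 0.3                       # weak primary signal (low-SNR regime)
alpha, beta = 1e-3, 1e-3       # target false-alarm / missed-detection probabilities
A = np.log((1 - beta) / alpha) # Wald's upper threshold
B = np.log(beta / (1 - alpha)) # Wald's lower threshold

def sprt(signal_present):
    """Accumulate the log-likelihood ratio until it exits (B, A)."""
    llr, n = 0.0, 0
    while B < llr < A:
        x = rng.normal(mu if signal_present else 0.0)
        llr += mu * x - mu ** 2 / 2    # LLR increment for N(mu,1) vs N(0,1)
        n += 1
    return llr >= A, n                 # (decided H1?, samples used)

decisions, delays = zip(*(sprt(True) for _ in range(200)))
print(f"detection rate = {np.mean(decisions):.3f}, "
      f"mean detection delay = {np.mean(delays):.1f} samples")
```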
249

A comparative study of permutation procedures

Van Heerden, Liske 30 November 1994 (has links)
The unique problems encountered when analyzing weather data sets (that is, measurements taken while conducting a meteorological experiment) have forced statisticians to reconsider conventional analysis methods and investigate permutation test procedures. The problems encountered when analyzing weather data sets are simulated in a Monte Carlo study, and the results of the parametric and permutation t-tests are compared with regard to significance level, power, and average confidence interval length. Seven population distributions are considered: three are variations of the normal distribution, and the others are the gamma, lognormal, rectangular, and empirical distributions. The normal distribution contaminated with zero measurements is also simulated. In those simulated situations in which the variances are unequal, the permutation test procedure was also performed using other test statistics, namely the Scheffé, Welch, and Behrens-Fisher test statistics. / Mathematical Sciences / M. Sc. (Statistics)
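A minimal sketch of the two-sample permutation test being compared against the parametric t-test, on skewed data of the kind that motivates the study (the sample sizes and lognormal choice are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Two small samples from a skewed distribution, where the t-test's
# normality assumption is doubtful (e.g. rainfall measurements).
x = rng.lognormal(mean=0.0, sigma=1.0, size=15)
y = rng.lognormal(mean=0.5, sigma=1.0, size=15)

obs = x.mean() - y.mean()
pooled = np.concatenate([x, y])

# Permutation test: reshuffle group labels, rebuild the null distribution
# of the mean difference, and locate the observed statistic within it.
B = 10_000
null = np.empty(B)
for b in range(B):
    perm = rng.permutation(pooled)
    null[b] = perm[:15].mean() - perm[15:].mean()

p = (np.abs(null) >= abs(obs)).mean()
print(f"mean difference = {obs:.3f}, two-sided permutation p = {p:.4f}")
```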
250

L'électrophysiologie temps-réel en neuroscience cognitive : vers des paradigmes adaptatifs pour l'étude de l'apprentissage et de la prise de décision perceptive chez l'homme / Real-time electrophysiology in cognitive neuroscience : towards adaptive paradigms to study perceptual learning and decision making in humans

Sanchez, Gaëtan 27 June 2014 (has links)
Today, psychological as well as physiological models of perceptual learning and decision-making have become more refined and biologically plausible, leading to more realistic (and more complex) generative models of psychophysiological observations. In parallel, the young but exponentially growing field of Brain-Computer Interfaces (BCI) provides new tools and methods to analyze (mostly electrophysiological) data online. The main objective of this PhD thesis was to explore how the real-time electrophysiology paradigm could contribute to a better understanding of perceptual learning and decision-making processes in humans. At the empirical level, I studied decisions based on tactile stimuli, namely somatosensory frequency discrimination. More specifically, I showed how an implicit sensory context biases our decisions. Using magnetoencephalography (MEG), I was able to decipher some of the neural correlates of these perceptual adaptation mechanisms. These findings support the hypothesis that an internal perceptual reference builds up over the course of the experiment. At the theoretical and methodological levels, I propose a generic view and method of how real-time electrophysiology could be used to optimize hypothesis testing by adapting the experimental design online. I provide a first validation of this online adaptive design optimization (ADO) approach for maximizing design efficiency at the individual level, and I discuss the implications of this work for basic and clinical neuroscience as well as for BCI itself.
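A toy version of the online design-optimization loop described above: maintain a posterior over a perceptual threshold and, on each trial, present the stimulus level that minimizes the expected posterior entropy. The psychometric model, grids, and parameter values below are illustrative and far simpler than the generative models used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(8)

theta_grid = np.linspace(0.0, 1.0, 101)    # candidate perceptual thresholds
designs = np.linspace(0.0, 1.0, 21)        # available stimulus levels
post = np.full(theta_grid.size, 1 / theta_grid.size)   # uniform prior

def p_yes(design, theta, slope=10.0):
    """Logistic psychometric function: P(respond 'yes' | design, theta)."""
    return 1.0 / (1.0 + np.exp(-slope * (design - theta)))

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

true_theta = 0.62
for trial in range(30):
    # Expected posterior entropy for each candidate stimulus level.
    scores = []
    for d in designs:
        like_yes = p_yes(d, theta_grid)
        m_yes = (post * like_yes).sum()                  # marginal P(yes)
        post_yes = post * like_yes / m_yes
        post_no = post * (1 - like_yes) / (1 - m_yes)
        scores.append(m_yes * entropy(post_yes) + (1 - m_yes) * entropy(post_no))
    d = designs[int(np.argmin(scores))]                  # most informative stimulus
    y = rng.random() < p_yes(d, true_theta)              # simulated response
    like = p_yes(d, theta_grid) if y else 1 - p_yes(d, theta_grid)
    post = post * like
    post /= post.sum()

print(f"posterior mean theta = {(post * theta_grid).sum():.3f} (true {true_theta})")
```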
