71. Realizace monitorovacího systému pokojových rostlin v prostředí IoT / Implementation of monitoring system of house plants in IoT environment. Mach, Sebastián, January 2020 (has links)
This master's thesis covers the design and development of a flower-pot sensor that monitors data related to the cultivation of houseplants. The sensor sends the data to the cloud, where the data are analyzed and the evaluated living conditions of the monitored plant are displayed to the user.
72. The Gourmet Guide to Statistics: For an Instructional Strategy That Makes Teaching and Learning Statistics a Piece of Cake. Edirisooriya, Gunapala, 01 January 2003 (has links)
This article draws analogies between the activities of statisticians and those of chefs. It suggests how these analogies can be used in teaching, both to aid understanding of what statistics is about and to increase motivation to learn the subject.
73. Intraday Algorithmic Trading using Momentum and Long Short-Term Memory Network Strategies. Whitinger, Andrew R, II; Wallace, Chris; Trainor, William, 07 April 2022 (has links)
Intraday stock trading is an infamously difficult and risky endeavor. Momentum and reversal strategies and long short-term memory (LSTM) neural networks have been shown to be effective for selecting stocks to buy and sell over periods of multiple days. To explore whether these strategies can be effective for intraday trading, their implementations were simulated using intraday price data for stocks in the S&P 500 index, collected at 1-second intervals between February 11, 2021 and March 9, 2021, inclusive. The study tested 160 variations of momentum and reversal strategies for profitability in long, short, and market-neutral portfolios, totaling 480 portfolios. Long and short portfolios for each strategy were also compared to the market to observe excess returns. Eight reversal portfolios yielded statistically significant profits, and 16 yielded significant excess returns. Tests of these strategies on another set of 16 days failed to yield statistically significant returns, though average returns remained positive. Four LSTM network configurations were tested on the same original set of days, with no strategy yielding statistically significant returns. Close examination of the stocks chosen by the LSTM networks suggests that the networks expect stocks to exhibit a momentum effect. Further studies may explore whether an intraday reversal effect can be observed during different market conditions and whether different configurations of LSTM networks can generate significant returns.
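As a rough illustration of the strategy family under test, the following is a minimal sketch of a cross-sectional momentum/reversal ranking on intraday price data. The lookback window, holding period, portfolio size, and data layout are illustrative assumptions, not the study's actual configuration.

```python
# Minimal sketch of a cross-sectional momentum/reversal ranking strategy.
# All parameters (lookback, holding period, portfolio size) are illustrative
# assumptions, not the study's tested configurations.
import numpy as np
import pandas as pd

def rank_portfolio(prices: pd.DataFrame, lookback: int = 300,
                   n_stocks: int = 10, reversal: bool = False) -> pd.Index:
    """Rank stocks by trailing return over `lookback` ticks.

    prices: rows = 1-second timestamps, columns = tickers.
    Momentum holds past winners; reversal holds past losers.
    """
    trailing_ret = prices.iloc[-1] / prices.iloc[-lookback] - 1.0
    ordered = trailing_ret.sort_values(ascending=reversal)
    return ordered.index[:n_stocks]  # tickers to hold

def market_neutral_returns(prices: pd.DataFrame, lookback: int = 300,
                           hold: int = 60, n_stocks: int = 10) -> list:
    """Simulate long-short (market-neutral) reversal returns on a rolling basis."""
    out = []
    for t in range(lookback, len(prices) - hold, hold):
        window = prices.iloc[:t]
        longs = rank_portfolio(window, lookback, n_stocks, reversal=True)
        shorts = rank_portfolio(window, lookback, n_stocks, reversal=False)
        fwd = prices.iloc[t + hold] / prices.iloc[t] - 1.0  # forward returns
        out.append(fwd[longs].mean() - fwd[shorts].mean())
    return out
```

Varying the lookback, holding period, portfolio size, and long/short/neutral composition is what generates the many strategy variations of the kind the study evaluates.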
74. Spectral Density Function Estimation with Applications in Clustering and Classification. Chen, Tianbo, 03 March 2019 (has links)
The spectral density function (SDF) plays a critical role in spatio-temporal data analysis, where the data are analyzed in the frequency domain. Although many methods have been proposed for SDF estimation, real-world applications in many research fields, such as neuroscience and environmental science, call for better methodologies. In this thesis, we focus on spectral density functions for time series and spatial data, develop new estimation algorithms, and use the estimators as features for clustering and classification.
The first topic is motivated by clustering electroencephalogram (EEG) data in the spectral domain. To identify synchronized brain regions that share similar oscillations and waveforms, we develop two robust clustering methods based on the functional data ranking of the estimated SDFs. The two proposed clustering methods use different dissimilarity measures, and their performance is examined in simulation studies that include two types of contamination to demonstrate robustness. We apply the methods to two sets of resting-state EEG data collected from a male college student.
Then, we propose an efficient collective estimation algorithm for a group of SDFs. We use two sets of basis functions to represent the SDFs for dimension reduction; the scores (the basis coefficients), estimated by maximizing the penalized Whittle likelihood, are then used to cluster the SDFs in a much lower dimension. For spatial data, an additional penalty is applied to the likelihood to encourage spatial homogeneity of the clusters. The proposed methods are applied to cluster the EEG data and soil moisture data.
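As a rough illustration of the estimation idea, the sketch below fits basis scores for one series by maximizing a penalized Whittle likelihood. The log-spectrum basis, the ridge-style penalty, and the optimizer are assumptions chosen for simplicity; the thesis's formulation (two basis sets, collective estimation across series, spatial penalty) is richer.

```python
# Hedged sketch: penalized Whittle likelihood with the SDF represented on a
# user-supplied basis (one row per periodogram ordinate, len(x)//2 + 1 rows).
# Basis choice, penalty form, and optimizer are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.signal import periodogram

def penalized_whittle(scores, basis, pgram, lam):
    """Negative penalized Whittle log-likelihood.

    log f(w_k) = basis @ scores, so f is positive by construction.
    Penalty: ridge on the scores (a stand-in for a smoothness penalty).
    """
    log_f = basis @ scores
    f = np.exp(log_f)
    nll = np.sum(log_f + pgram / f)  # Whittle's approximation
    return nll + lam * np.sum(scores ** 2)

def fit_sdf(x, basis, lam=1.0):
    """Estimate basis scores for one series x; the scores feed a clusterer."""
    _, pgram = periodogram(x, detrend="constant")
    # drop the zero frequency, where the periodogram is degenerate
    pgram, B = pgram[1:], basis[1:, :]
    res = minimize(penalized_whittle, x0=np.zeros(basis.shape[1]),
                   args=(B, pgram, lam), method="L-BFGS-B")
    return res.x  # low-dimensional representation of the SDF
```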
Finally, we propose a parametric estimation method for the quantile spectrum. We approximate the quantile spectrum by the ordinary spectral density of an AR process at each quantile level. The AR coefficients are estimated by solving the Yule-Walker equations using the Levinson algorithm. Numerical results from simulation studies show that the proposed method outperforms conventional smoothing techniques. We build a convolutional neural network (CNN) to classify the estimated quantile spectra of earthquake data from Oklahoma and achieve 99.25% accuracy on the test sets, 1.25 percentage points higher than with ordinary periodograms.
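A minimal sketch of the Yule-Walker/Levinson step follows, assuming the autocovariances of the series at a given quantile level are already available; the construction of those quantile-level series is omitted here.

```python
# Sketch of fitting AR coefficients via the Yule-Walker equations with the
# Levinson recursion, then evaluating the ordinary AR spectral density.
# r must contain autocovariances r[0..order].
import numpy as np

def levinson(r, order):
    """Solve the Yule-Walker equations for an AR(order) model."""
    phi = np.zeros(order + 1)
    prev = np.zeros(order + 1)
    err = r[0]
    for k in range(1, order + 1):
        acc = r[k] - np.dot(phi[1:k], r[k - 1:0:-1])
        ref = acc / err              # reflection coefficient
        prev[:] = phi
        phi[k] = ref
        phi[1:k] = prev[1:k] - ref * prev[k - 1:0:-1]
        err *= (1.0 - ref ** 2)      # prediction error variance update
    return phi[1:], err              # AR coefficients, noise variance

def ar_spectrum(phi, sigma2, freqs):
    """Ordinary spectral density of the fitted AR process (freqs in cycles)."""
    k = np.arange(1, len(phi) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ phi) ** 2
    return sigma2 / (2 * np.pi * denom)
```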
75. Analytical and Numerical Techniques for the Optimal Design of Mineral Separation Circuits. Noble, Christopher Aaron, 13 June 2013 (has links)
The design of mineral processing circuits is a complex, open-ended process. While several tools and methodologies are available, extensive data collection accompanied by trial-and-error simulation is often the predominant technical approach throughout the process. Unfortunately, this approach often produces sub-optimal solutions while squandering time and financial resources. This work proposes several new and refined methodologies intended to assist during all stages of circuit design. First, an algorithm has been developed to automatically determine a circuit's analytical solution from a user-defined circuit configuration. This analytical solution may then be used to rank circuits by traditional derivative-based linear circuit analysis or by one of several newly proposed objective functions, including a yield indicator (the yield score) and a value-based indicator (the moment of inertia). Second, this work presents a four-reactor flotation model that considers both process kinetics and machine carrying capacity. The simulator is suitable for scaling laboratory data to predict full-scale performance. By first using circuit analysis to reduce the number of design alternatives, experimental and simulation efforts may be focused on the configurations most likely to deliver enhanced performance while meeting secondary process objectives. Finally, this work verifies the circuit analysis methodology through a virtual experimental analysis of 17 circuit configurations. A hypothetical electrostatic separator was implemented in a dynamic physics-based discrete element modeling environment. The virtual experiment was used to quantify the selectivity of each circuit configuration, and the final results validate the initial circuit analysis projections. / Ph.D.
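As a toy illustration of derivative-based linear circuit analysis, the sketch below models each separator as a splitter sending a fraction P of its feed to concentrate, solves the recycle mass balance for a rougher-cleaner-scavenger circuit, and evaluates the sharpness indicator dC/dP at P = 0.5. The topology is a common textbook example, not necessarily one of the 17 configurations studied.

```python
# Hedged sketch of linear circuit analysis for a rougher-cleaner-scavenger
# circuit: cleaner tails and scavenger concentrate recycle to the rougher,
# cleaner concentrate is the final concentrate. Topology is illustrative.
import numpy as np

def circuit_recovery(P: float) -> float:
    """Steady-state circuit recovery C(P) from a linear mass balance.

    Unknowns: feed rates to rougher (R), cleaner (C), scavenger (S),
    for one unit of new feed entering the rougher.
    """
    A = np.array([
        [1.0, -(1 - P), -P],    # f_R - (1-P) f_C - P f_S = 1 (new feed)
        [-P, 1.0, 0.0],         # f_C = P f_R
        [-(1 - P), 0.0, 1.0],   # f_S = (1-P) f_R
    ])
    f = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))
    return P * f[1]             # final concentrate = cleaner concentrate

def circuit_sharpness(h: float = 1e-6) -> float:
    """dC/dP at P = 0.5: the derivative-based selectivity indicator."""
    return (circuit_recovery(0.5 + h) - circuit_recovery(0.5 - h)) / (2 * h)
```

For this topology the recovery at P = 0.5 is 0.5 and the derivative is 2, versus 1 for a single unit, which is how circuit analysis flags configurations with sharper separation before any simulation effort is spent.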
76. Parametric Projection Pursuits for Dimensionality Reduction of Hyperspectral Signals in Target Recognition Applications. Lin, Huang-De Hennessy, 08 May 2004 (has links)
The improved spectral resolution of modern hyperspectral sensors provides a means for discriminating subtly different classes of ground materials in remotely sensed images. However, to obtain statistically reliable classification results, the number of necessary training samples can increase exponentially as the number of spectral bands increases. Obtaining the necessary number of training signals for these high-dimensional datasets may not be feasible. The problem can be overcome by preprocessing the data to reduce its dimensionality and thus the number of required training samples. In this thesis, three dimensionality reduction methods, all based on parametric projection pursuits, are investigated: Sequential Parametric Projection Pursuits (SPPP), Parallel Parametric Projection Pursuits (PPPP), and Projection Pursuits Best Band Selection (PPBBS). The methods are applied to very high spectral resolution data to transform the hyperspectral data to a lower-dimensional subspace. Feature extractors and classifiers are then applied to the lower-dimensional data to obtain target detection accuracies. The three projection pursuit methods are compared to each other and to the case of no dimensionality reduction preprocessing. When applied to hyperspectral data in a precision agriculture application, discriminating sicklepod and cocklebur weeds, the SPPP method was optimal in terms of accuracy, yielding a classification accuracy of >95% with a nearest mean, maximum likelihood, or nearest neighbor classifier. The PPPP method encountered optimization problems when the hyperspectral dimensionality was very high, e.g., in the thousands. The PPBBS method yielded high classification accuracies, >95%, when the maximum likelihood classifier was used; however, it produced lower accuracies with the nearest mean or nearest neighbor classifiers. With no projection pursuit preprocessing, classification accuracies ranged between ~50% and 95%, depending strongly on the type of classifier used.
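For orientation, here is a hedged sketch of a sequential projection pursuit: each stage finds one direction maximizing a two-class separability index, then deflates the data before searching for the next. The Fisher-ratio index and the unconstrained search are simplifications; the thesis's parametric methods structure the projections over the spectral bands.

```python
# Hedged sketch of sequential projection pursuit for dimensionality
# reduction: maximize a two-class Fisher separability index one direction
# at a time, deflating the data after each stage. Illustrative only.
import numpy as np
from scipy.optimize import minimize

def neg_fisher_index(w, X0, X1):
    """Negative Fisher ratio of the two classes along direction w."""
    w = w / np.linalg.norm(w)
    p0, p1 = X0 @ w, X1 @ w
    between = (p0.mean() - p1.mean()) ** 2
    within = p0.var() + p1.var() + 1e-12
    return -between / within  # negated so a minimizer maximizes the index

def sequential_pp(X0, X1, n_dims):
    """Extract n_dims projection directions sequentially."""
    d = X0.shape[1]
    rng = np.random.default_rng(0)
    W = []
    for _ in range(n_dims):
        res = minimize(neg_fisher_index, x0=rng.normal(size=d),
                       args=(X0, X1), method="BFGS")
        w = res.x / np.linalg.norm(res.x)
        W.append(w)
        X0 = X0 - np.outer(X0 @ w, w)  # deflate: project out direction w
        X1 = X1 - np.outer(X1 @ w, w)
    return np.column_stack(W)          # d x n_dims projection matrix
```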
77. AN INTERNSHIP WITH THE OHIO EVALUATION & ASSESSMENT CENTER. Marks, Pamela Anne, 28 November 2005 (has links)
No description available.
78. Burn-in with mixed populations. Pan, Un-Quei Winkey, January 1987 (has links)
No description available.
79. Variable screening and graphical modeling for ultra-high dimensional longitudinal data. Zhang, Yafei, 02 July 2019 (links)
Ultrahigh-dimensional variable selection is of great importance in statistical research, and independence screening is a powerful tool for selecting important variables when there are massive numbers of candidates. Some commonly used independence screening procedures are based on single-replicate data and are not applicable to longitudinal data. This motivates us to propose a new Sure Independence Screening (SIS) procedure to bring the dimension from ultra-high down to a relatively large scale that is similar to or smaller than the sample size. In Chapter 2, we provide two types of SIS, along with their iterative extensions (iterative SIS), to enhance finite-sample performance. An upper bound on the number of variables to be included is derived, and assumptions are given under which sure screening holds. The proposed procedures are assessed by simulations, and an application to a study on systemic lupus erythematosus illustrates their practical use. After the variable screening step, we then explore the relationships among the retained variables. Graphical models are commonly used to explore the association network for a set of variables, which could be genes or other objects under study. However, the graphical models currently in use are designed only for single-replicate data rather than longitudinal data. In Chapter 3, we propose a penalized likelihood approach to identify the edges in a conditional independence graph for longitudinal data. We use pairwise coordinate descent combined with second-order cone programming to optimize the penalized likelihood and estimate the parameters. Furthermore, we extend the nodewise regression method to the longitudinal data case. Simulation and real data analysis exhibit the competitive performance of the penalized likelihood method. / Doctor of Philosophy / Longitudinal data have received a considerable amount of attention in the health sciences. The information in this type of data can help with disease detection and control, and a graph of disease-related factors can be built to represent their relationships with each other. In this dissertation, we develop a framework to find, among thousands of factors measured in longitudinal data, the ones related to a disease. In addition, we develop a graphical method that shows the relationships among the important factors identified by the screening. In practice, combining these two methods identifies important factors for a disease as well as the relationships among those factors, providing a deeper understanding of the disease.
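As a rough illustration of the screening step, the sketch below implements basic marginal-correlation SIS for single-replicate data; the thesis's procedures extend this idea to longitudinal (repeated-measure) data, which is not handled here.

```python
# Minimal sketch of Sure Independence Screening: rank predictors by
# marginal correlation with the response and keep the top d. Default
# d = n / log(n) is a common sure-screening choice, used here as an
# illustrative assumption rather than the thesis's derived bound.
import numpy as np

def sis(X, y, d=None):
    """Return indices of the d predictors most correlated with y.

    X: n x p design matrix (p may vastly exceed n); y: length-n response.
    """
    n, p = X.shape
    if d is None:
        d = int(n / np.log(n))
    Xc = (X - X.mean(0)) / (X.std(0) + 1e-12)   # standardize columns
    yc = (y - y.mean()) / (y.std() + 1e-12)
    score = np.abs(Xc.T @ yc) / n               # |marginal correlation|
    return np.argsort(score)[::-1][:d]          # indices of top-d predictors
```

An iterative variant would alternate this screening with a model fit on the retained variables, re-screening against the residuals to recover predictors that are jointly, but not marginally, important.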
80. Modular GC: A Fully Integrated Micro Gas Chromatography System. Manurkar, Shaunak Sudhir, 22 September 2021 (has links)
Gas Chromatography (GC) is one of the most important and widely used tools in analytical chemistry. However, conventional GC systems are bulky, have long measurement cycles, and consume a large amount of power. Micro-Gas Chromatography (µGC) is portable and energy-efficient, which allows onsite, real-time biological, forensic, and environmental analyses. This thesis presents a ready-to-deploy implementation of a microfabricated gas chromatography (µGC) system capable of separating complex samples. We describe a robust, modular, and scalable hardware and software architecture, based on a Real-Time Operating System (RTOS) and a Python Graphical User Interface (GUI), integrated with various microfabricated devices to realize a fully functional µGC system. A sample heater for headspace injection, a microfabricated separation column (µSC), a photoionization detector (PID), and a flow controller unit are integrated with the modular hardware and software to realize a fully functional vacuum-outlet µGC system. We have designed a novel auto-calibration method for temperature calibration of the microfabricated devices that requires neither changes to the electronic circuitry nor reprogramming of the device. The vacuum-outlet µGC setup is tested with various mixtures of analytes. For these experiments, an average relative standard deviation (RSD) of 2.5% for retention time repeatability is achieved. Data processing techniques for raw chromatograms, including baseline correction and peak detection, are implemented on a microcontroller board and tested extensively as part of this work. A novel algorithm for multidimensional analysis to identify co-eluting compounds in complex samples is implemented with a prediction accuracy of 94%. / Master of Science / Toxic volatile organic compounds (VOCs) such as benzene and toluene, found in gasoline, and xylene, used in the ink, rubber, and leather industries, are of concern because their high vapor pressures lead to elevated concentrations. Sufficient exposure to these toxicants, even at concentrations as low as 100 parts per billion by volume (ppbv), may cause adverse health effects. Gas Chromatography (GC) has been the established method for assessing the presence and concentration of VOCs in the environment. Traditional GC systems are bulky, power-hungry, and expensive, and require expert supervision for analysis. Recent research in microelectromechanical systems (MEMS) has reduced the size of GC components, yielding micro-GC (µGC), while improving performance. The majority of µGC research and development is aimed at advancing microfabricated components such as preconcentrators, separation columns, and gas detectors. However, the integration of these different components is an important topic that requires more investigation. In this thesis, we present a robust and scalable software and hardware architecture that can be used to develop a portable, modular µGC system. The thesis discusses experiments to calibrate the various microfabricated devices, which are then used to build a fully modular µGC system. We demonstrate the separation capacity of the modular µGC system on complex mixtures such as kerosene and diesel. Because the raw chromatogram from the µGC system is noisy, the second part of the thesis explores data analysis techniques such as baseline correction and peak detection. These tools are used to filter noise, detect relevant peaks in the chromatograms, and identify the compounds in a complex sample.
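To illustrate the chromatogram post-processing described above, the sketch below pairs a rolling-minimum baseline estimate with prominence-based peak detection. The specific filters, window sizes, and thresholds are stand-ins; the thesis's microcontroller algorithms are not reproduced here.

```python
# Hedged sketch of chromatogram post-processing: a smoothed rolling-minimum
# baseline estimate followed by prominence-based peak detection. Window
# size and prominence threshold are illustrative assumptions.
import numpy as np
from scipy.ndimage import minimum_filter1d, uniform_filter1d
from scipy.signal import find_peaks

def correct_baseline(signal: np.ndarray, window: int = 201) -> np.ndarray:
    """Subtract a smoothed rolling-minimum estimate of the drifting baseline."""
    baseline = uniform_filter1d(minimum_filter1d(signal, window), window)
    return signal - baseline

def detect_peaks(signal: np.ndarray, min_prominence: float = 0.05):
    """Return peak sample indices and prominences on the corrected signal.

    Retention times follow from the sampling rate; peak areas can be
    obtained by integrating around each returned index.
    """
    corrected = correct_baseline(signal)
    peaks, props = find_peaks(corrected, prominence=min_prominence)
    return peaks, props["prominences"]
```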