321 |
Comparing Building Energy Benchmarking Metrics using Dimension Reduction TechniquesAgale, Ketaki 21 October 2019 (has links)
No description available.
|
322 |
Characterization of a light petroleum fraction produced from automotive shredder residuesTipler, Steven 20 May 2021 (has links) (PDF)
Wastes have real potential to become players in tomorrow's energy mix. They can have a high heating value depending on their composition, which makes them good candidates to be converted into liquid fuel via pyrolysis. Among the different types of waste, automotive residues are expected to grow sharply due to the increasing number of cars and the trend toward building cars with more and more polymers. Moreover, the existing regulations concerning the recycling of end-of-life vehicles are becoming increasingly stringent. Unconventional fuels such as those derived from automotive shredder residues (ASR) have a particular composition which tends to increase the amount of pollutants compared with conventional fuels. Relying on alternative combustion modes, such as reactivity controlled compression ignition (RCCI), is one solution to cope with these pollutants. In RCCI, two types of fuel are burned simultaneously: a light fraction with low reactivity and a heavy fraction with high reactivity. The heavy fraction governs the ignition, as it is injected directly into the cylinder close to the end of compression. A variation of its ignition delay could affect the quality of the combustion; nevertheless, this issue can be tackled by adjusting the injection timing. As far as the low-reactivity fuel is concerned, such a solution cannot be adopted, as its reactivity depends on the initial parameters (equivalence ratio, inlet temperature, exhaust gas recirculation ratio). However, if the fuel is too reactive, it could create knock, which has a dramatic impact on the engine and can lead to damage. Thus, being able to predict its features is a key aspect of safe usage. Prediction methods exist but have never been tested with fuels derived from automotive residues. With petroleum products, the usual prediction methods operate at three different levels: the chemical composition, the properties, and the reactivity in an appliance. The fuel is studied at these three levels. 
First, the structure gives a good overview of the fuel's auto-ignition behaviour. For instance, aromatics tend to have a higher ignition delay time (IDT) than paraffins. Second, the octane numbers are good indicators of the fuel's IDT and of its resistance toward knock; precisely, the octane numbers depict the resistance of a fuel towards end-gas auto-ignition. Last, the IDT was studied in a rapid compression machine and a surrogate fuel was formulated. Surrogate fuels substitute real fuels during simulations because real fuels cannot be modelled by kinetic mechanisms due to their complexity. The existing methods to estimate the composition were updated to predict the n-paraffin, iso-paraffin, olefin, naphthene, aromatic and oxygenate (PIONAOx) fractions. A good accuracy was achieved compared with the literature. This new method requires the measurement of the specific gravity, the distillation cut points, the CHO atom fractions, the kinematic viscosity and the refractive index. Two methods to predict the octane numbers were developed based on Bayesian inference, principal component analysis (PCA) and artificial neural networks (ANN). The first is a Bayesian method which modifies the pseudocomponent (PC) method: it introduces a correcting factor to the existing formulation of the PC method to increase its accuracy, and a precision better than 2% is achieved. The second method is based on PCA and ANN: 41 properties are studied, among which a reduced set of principal variables is selected to predict the octane numbers. Ten properties, calculated only from the distillation cut points, the CHO atom fractions and the specific gravity, were selected to accurately predict the octane numbers. Measurements of the IDT of a fuel produced from ASR were realized in a rapid compression machine (RCM). They are the first measurements ever made in such a machine, providing new experimental data to the literature. Moreover, these experimental data were used to formulate a surrogate fuel. 
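The PCA-plus-regression pipeline described above can be sketched in a few lines. This is a minimal stand-in with synthetic data: the property values, the 95% variance cutoff, and the least-squares fit on the principal scores are illustrative assumptions (the thesis pairs the PCA scores with an artificial neural network rather than the linear fit used here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the thesis data: 100 fuels, 41 measured properties.
# (Hypothetical values -- the real work used distillation cut points,
#  CHO atom fractions, specific gravity, etc.)
X = rng.normal(size=(100, 41))
w_true = rng.normal(size=41)
y = 90 + X @ w_true * 0.5 + rng.normal(scale=0.5, size=100)  # octane-like target

# --- PCA via SVD on standardized properties ---
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)  # keep 95% of variance
scores = Xs @ Vt[:k].T            # reduced set of principal variables

# --- Simple regression on the principal scores ---
# (a minimal stand-in for the ANN trained on the same scores)
A = np.hstack([scores, np.ones((len(y), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
rel_err = np.mean(np.abs(pred - y) / y)
print(k, round(rel_err * 100, 2))
```

The point of the sketch is the shape of the workflow: standardize the measured properties, reduce them to a handful of principal variables, then regress the octane number on those scores.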
Surrogate fuels can be used to run simulations under specific conditions. The current thesis investigates fuels derived from ASR. It was shown that this fuel can be burnt in engines as long as its properties are carefully monitored; among others, the IDT is particularly important. Nevertheless, additional experimental campaigns and engine simulations are required in order to correctly assess all of the combustion features of such a fuel in an engine. / Doctorat en Sciences de l'ingénieur et technologie / info:eu-repo/semantics/nonPublished
|
323 |
Development and Optimization of Near-infrared spectroscopyHahlin, Amanda January 2023 (has links)
With the growing demand for sustainable options, the existing sorting capacities are limiting the potential for fiber-to-fiber recycling. With the help of near-infrared spectroscopy (NIRS), automated sorting of textiles with high accuracy is possible thanks to easy access to polymer identification. Despite the effectiveness of NIRS, some limitations still prevent the process from reaching its full potential: possible disruptors may interfere with the identification of polymer identities and compositions in different ways. This thesis examines additives, treatments, and other environmental factors that may hinder fiber identification. The key results state that stains and factors due to wear and tear are the most common possible disruptors that could be identified in pre-sorted post-consumer end-of-life textiles. Furthermore, stains of ketchup, deodorant, and oil affect polymer recognition by lowering the recognized fiber content. Water-repellent coatings on 100% polyamide woven fabric were not detected correctly by the NIR scanner, as the stated polymer composition was >90%. Even though some investigated factors, e.g., material structures, were correctly identified by the NIR scanner, the internal deviation of the knitted polyester structure indicates that porous and loose structures can interfere with the detection of polymers. How far the operating software has been developed is highly relevant to how accurate textile sorting can be.
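The effect described above, a disruptor lowering the recognized fiber content, can be illustrated with a toy spectral-matching sketch. Everything here is an assumption for illustration: the band positions are invented (not real polymer absorption data), and normalized correlation against a library is just one simple matching rule, not the scanner's actual algorithm.

```python
import numpy as np

wn = np.linspace(1100, 2500, 300)          # NIR wavelength grid in nm

def band(center, width):
    """Gaussian absorption band on the wavelength grid."""
    return np.exp(-((wn - center) / width) ** 2)

# Hypothetical reference library (illustrative band positions only).
library = {
    "polyester": band(1660, 40) + 0.6 * band(2130, 60),
    "polyamide": band(1500, 40) + 0.8 * band(2050, 60),
    "cotton":    band(1450, 50) + 0.5 * band(2100, 80),
}

def identify(spectrum):
    """Return (best_match, score) by normalized correlation against the library."""
    def corr(ref):
        a = spectrum - spectrum.mean()
        b = ref - ref.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {name: corr(ref) for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

clean = library["polyester"].copy()
# A broad "stain" band overlapping the spectrum degrades the match score.
stained = clean + 0.5 * band(1800, 300)

name_clean, s_clean = identify(clean)
name_stained, s_stained = identify(stained)
print(name_clean, round(s_clean, 3), round(s_stained, 3))
```

The stained spectrum always scores lower than the clean one, mirroring the thesis finding that stains lower the recognized fiber content.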
|
324 |
Functional Principal Component Analysis of Vibrational Signal Data: A Functional Data Analytics Approach for Fault Detection and Diagnosis of Internal Combustion EnginesMcMahan, Justin Blake 14 December 2018 (has links)
Fault detection and diagnosis (FDD) is a critical component of operations management systems. The goal of FDD is to identify the occurrence and causes of abnormal events. While many approaches are available, data-driven approaches to FDD have proven to be robust and reliable. Exploiting these advantages, the present study applied functional principal component analysis (FPCA) to carry out feature extraction for fault detection in internal combustion engines. Furthermore, a feature subset that explained 95% of the variance of the original vibrational sensor signal was used in a multilayer perceptron to carry out prediction for fault diagnosis. Across the engine states studied in the present work, the resulting diagnostic performance shows that the proposed approach achieved an overall prediction accuracy of 99.72%. These results are encouraging because they show the feasibility of applying FPCA for feature extraction, which had not been discussed previously in the literature on fault detection and diagnosis.
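The FPCA feature-extraction step can be sketched as follows, under stated assumptions: the vibration traces are simulated stand-ins (a clean tone versus one with an extra harmonic), FPCA is approximated as PCA on densely sampled curves, and a nearest-centroid rule replaces the multilayer perceptron the study actually trained on the scores.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)

def signal(state, n):
    """Simulated vibration traces: hypothetical stand-ins for sensor data."""
    base = np.sin(2 * np.pi * 30 * t)
    if state == "fault":
        base = base + 0.8 * np.sin(2 * np.pi * 90 * t)  # extra harmonic
    return base + rng.normal(scale=0.3, size=(n, t.size))

X = np.vstack([signal("normal", 40), signal("fault", 40)])
y = np.array([0] * 40 + [1] * 40)

# Functional PCA, approximated here as PCA on the sampled curves.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(explained, 0.95) + 1)   # components for 95% variance
scores = Xc @ Vt[:k].T

# Nearest-centroid classification in the FPCA score space
# (the study trained a multilayer perceptron on the same features).
c0, c1 = scores[y == 0].mean(axis=0), scores[y == 1].mean(axis=0)
pred = (np.linalg.norm(scores - c1, axis=1) <
        np.linalg.norm(scores - c0, axis=1)).astype(int)
acc = float(np.mean(pred == y))
print(k, acc)
```

Even this crude classifier separates the two engine states, because the FPCA scores concentrate the discriminating harmonic content into a few coordinates.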
|
325 |
Development of statistical shape and intensity models of eroded scapulae to improve shoulder arthroplastySharif Ahmadian, Azita 22 December 2021 (has links)
Reverse total shoulder arthroplasty (RTSA) is an effective treatment and a surgical alternative to conventional total shoulder arthroplasty for patients with severe rotator cuff tears and glenoid erosion. To help optimize RTSA design, it is necessary to gain insight into the geometry of glenoid erosions and consider their unique morphology across the entire bone. One of the most powerful tools to systematically quantify and visualize the variation of bone geometry throughout a population is Statistical Shape Modeling (SSM); this method can assess the variation in the full shape of a bone, rather than of discrete anatomical features, which is very useful in identifying abnormalities, planning surgeries, and improving implant designs. Recently, many scapula SSMs have been presented in the literature; however, each has been created using normal and healthy bones. Therefore, creating a scapula SSM derived exclusively from patients exhibiting complex glenoid bone erosions is both critical and significantly challenging.
In addition, several studies have quantified scapular bone properties in patients with complex glenoid erosion. However, because of their discrete nature, these analyses cannot be used as the basis for Finite Element Modeling (FEM). Thus, a need exists to systematically quantify the variation of bone properties in a glenoid erosion patient population using a method that captures variation across the entire bone. This can be achieved using Statistical Intensity Modeling (SIM), which can then generate scapula FEMs with realistic bone properties for the evaluation of orthopaedic implants. Using an SIM enables researchers to generate models with bone properties that represent a specific, known portion of the population variation, which makes the findings more generalizable. Accordingly, the main purpose of this research is to develop an SSM that mathematically quantifies, in a systematic manner, the variation in the complex geometry of scapulae with severe glenoid erosion, and an SIM that determines the main modes of variation in bone property distribution, which could be used for future FEM studies.
To draw meaningful statistical conclusions from the dataset, we need to compare and relate corresponding parts of the scapula. To achieve this correspondence, 3D triangulated mesh models of 61 scapulae were created from pre-operative CT scans of patients treated with RTSA, and a Non-Rigid (NR) registration method was then used to morph one atlas point cloud to the shapes of all other bones. However, the more complex the shape, the more difficult it is to maintain good correspondence. To overcome this challenge, we adapted and optimized an NR Iterative Closest Point (ICP) method and applied it to the 61 eroded scapulae, resulting in each bone shape having an identical mesh structure (i.e., the same number and anatomical location of points). To assess the quality of the proposed algorithm, the resulting correspondence error was evaluated by comparing the positions of ground-truth points with the corresponding point locations produced by the algorithm. The average correspondence error of all anatomical landmarks across the two observers was 2.74 mm, with inter- and intra-observer reliability of ±0.31 and ±0.06 mm. Moreover, the Root-Mean-Square (RMS) and Hausdorff errors of geometric registration between the original and deformed models were calculated as 0.25±0.04 mm and 0.76±0.14 mm, respectively.
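The two registration error metrics quoted above, RMS error over corresponding points and the Hausdorff distance, can be computed as sketched below. The toy point sets are illustrative (random points with a known uniform shift, not scapula meshes), so the printed numbers only demonstrate the computation.

```python
import numpy as np

def registration_errors(A, B):
    """RMS over corresponding points and symmetric Hausdorff distance.

    A, B: (n, 3) arrays of corresponding points (same ordering).
    """
    d = np.linalg.norm(A - B, axis=1)           # per-point distances
    rms = float(np.sqrt(np.mean(d**2)))
    # Hausdorff: worst nearest-neighbour distance, taken in both directions.
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    hausdorff = float(max(D.min(axis=1).max(), D.min(axis=0).max()))
    return rms, hausdorff

# Toy check with a known displacement (illustrative points, not mesh data).
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 3))
B = A + np.array([0.1, 0.0, 0.0])               # uniform 0.1 mm shift
rms, hd = registration_errors(A, B)
print(round(rms, 3), round(hd, 3))
```

With a uniform shift, every corresponding pair is exactly 0.1 apart, so the RMS recovers the shift and the Hausdorff distance cannot exceed it.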
After registration, Principal Component Analysis (PCA) was applied to the deformed models as a group to describe independent modes of variation in the dataset. The robustness of the SSM was also evaluated using three standard metrics: compactness, generality, and specificity. Regarding compactness, the first 9 principal modes of variation accounted for 95% of the variability, while the model's generality error and the specificity calculated over 10,000 instances were found to be 2.6 mm and 2.99 mm, respectively.
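The SSM construction itself, PCA over corresponded shape vectors, a compactness count, and generation of a new plausible instance from the leading modes, can be sketched as follows. The "shapes" are hypothetical scaled point sets standing in for the corresponded scapula meshes, and the mode weights are arbitrary example values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Corresponded training shapes: n_shapes x (n_points * 3) flattened vectors.
# Hypothetical ellipsoid-like point sets standing in for scapula meshes.
n_shapes, n_pts = 61, 100
theta = rng.uniform(0, 2 * np.pi, n_pts)
phi = rng.uniform(0, np.pi, n_pts)
base = np.stack([np.sin(phi) * np.cos(theta),
                 np.sin(phi) * np.sin(theta),
                 np.cos(phi)], axis=1)          # unit-sphere template

shapes = np.empty((n_shapes, n_pts * 3))
for i in range(n_shapes):
    scale = 1 + 0.1 * rng.normal(size=3)        # anisotropic size variation
    shapes[i] = (base * scale).ravel()

# --- SSM: PCA over the shape vectors ---
mean_shape = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
n_modes_95 = int(np.searchsorted(explained, 0.95) + 1)  # compactness

# Generate a new plausible instance: mean + sum_k b_k * sd_k * mode_k
sd_modes = s / np.sqrt(n_shapes - 1)
b = np.array([2.0, -1.0])                       # example weights within +/-3 SD
new_shape = mean_shape + (b * sd_modes[:2]) @ Vt[:2]
print(n_modes_95, new_shape.shape)
```

Because the toy variation is generated from three latent scale factors, only a few modes are needed for 95% variability, which is the same compactness property the thesis reports with 9 modes on real data.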
The SIM results showed that the first mode of variation accounts for overall changes in intensity across the entire bone, while the second mode represented localized changes in the glenoid vault bone quality. The third mode showed changes in intensity at the posterior and inferior glenoid rim associated with posteroinferior glenoid rim erosion which suggests avoiding fixation in this region and preferentially placing screws in the anterosuperior region of the glenoid to improve implant fixation. / Graduate
|
326 |
A CORRELATION OF WESTERN ARCTIC OCEAN SEDIMENTATION DURING THE LATE HOLOCENE WITH AN ATMOSPHERIC TEMPERATURE PROXY RECORD FROM A GLACIAL LAKE IN THE BROOKS RANGE, ALASKAHarrison, Jeffrey Michael 22 April 2013 (has links)
No description available.
|
327 |
A Principal Component Regression Analysis for Detection of the Onset of Nocturnal Hypoglycemia in Type 1 Diabetic PatientsZuzarte, Ian Jeromino January 2008 (has links)
No description available.
|
328 |
[en] A NOVEL SEMIPARAMETRIC STRUCTURAL MODEL FOR ELECTRICITY FORWARD CURVES / [pt] MODELO ESTRUTURAL SEMI-PARAMÉTRICO PARA CURVAS FORWARD DE ELETRICIDADEMARINA DIETZE MONTEIRO 23 February 2021 (has links)
[pt] A proteção contra a volatilidade dos preços spot torna-se cada vez mais
importante nos mercados de energia desverticalizados. Portanto, ser capaz de
modelar preços forward e futuros de eletricidade é crucial em um ambiente
competitivo. A eletricidade difere de outras commodities devido à sua capacidade
de armazenamento e transporte limitados. Além disso, seus derivativos
estão associados a um período de entrega durante o qual a energia é concedida
continuamente, o que implica em muitas vezes os contratos de eletricidades
serem denominados swaps. Tais peculiaridades tornam a modelagem de preços
de contratos de energia elétrica uma tarefa não trivial, onde os modelos tradicionais
devem ser adaptados para atender às características mencionadas. Neste
contexto, foi proposto um modelo estrutural semi-paramétrico para obtenção
de uma curva forward de eletricidade contínua e diária através de critérios de
máxima suavidade. Ademais, os contratos forward elementares podem ser representados
por qualquer estrutura paramétrica para sazonalidade ou mesmo
para variáveis exógenas. Nossa estrutura reconhece a sobreposição dos swaps
e permite uma análise das oportunidades de arbitragem observadas nos mercados
de energia. A curva forward é calculada por um problema de otimização
hierárquico capaz de lidar com conjuntos de dados escassos de mercados com
baixa liquidez. Os resultados do PCA corroboram a capacidade do modelo em
explicar uma alta porcentagem da variância com apenas alguns fatores. / [en] Hedging against spot price volatilities becomes increasingly important in
deregulated power markets. Therefore, being able to model electricity forward
prices is crucial in a competitive environment. Electricity differs from other
commodities due to its limited storability and transportability. Furthermore,
its derivatives are associated with a delivery period during which electricity is
continuously delivered, implying on referring to power forwards as swaps. These
peculiarities make the modeling of electricity contract prices a non-trivial
task, where traditional models must be adapted to address the mentioned
characteristics. In this context, we propose a novel semiparametric structural
model to compute a continuous daily forward curve of electricity through
maximum smoothness criterion. In addition, elementary forward contracts
can be represented by any parametric structure for seasonality or even for
exogenous variables. Our framework acknowledges the overlapped swaps and
allows an analysis of arbitrage opportunities observed in power markets. The
smooth forward curve is computed by a hierarchical optimization problem able
to handle scarce data sets from low-liquidity markets. PCA results corroborate
our framework's capability to explain a high percentage of variance with only
a few factors.
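The maximum-smoothness construction described in the abstract can be sketched as a small equality-constrained quadratic program: minimize the curvature of the daily curve subject to each swap's delivery-period average matching its quoted price. The grid size, window layout, and prices below are hypothetical, and non-overlapping monthly windows are used for simplicity (the thesis's hierarchical formulation additionally handles overlapped swaps and scarce quotes).

```python
import numpy as np

n = 90                                   # daily grid: one quarter
# Hypothetical monthly swaps: (start_day, end_day, price in currency/MWh)
windows = [(0, 30, 50.0), (30, 60, 55.0), (60, 90, 53.0)]

# Second-difference operator: smoothness penalty ||D f||^2
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

# Constraints: average of f over each delivery window equals the swap price
A = np.zeros((len(windows), n))
p = np.zeros(len(windows))
for j, (a, b, price) in enumerate(windows):
    A[j, a:b] = 1.0 / (b - a)
    p[j] = price

# KKT system for: min ||D f||^2  subject to  A f = p
H = 2.0 * D.T @ D
K = np.block([[H, A.T], [A, np.zeros((len(windows), len(windows)))]])
rhs = np.concatenate([np.zeros(n), p])
f = np.linalg.solve(K, rhs)[:n]          # smooth daily forward curve
print(np.round(A @ f, 4))                # recovers the swap prices
```

The solved curve interpolates smoothly across month boundaries while repricing every input swap exactly, which is the elementary version of the consistency property the semiparametric model enforces.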
|
329 |
Assessing Crash Occurrence On Urban Freeways Using Static And Dynamic Factors By Applying A System Of Interrelated EquationsPemmanaboina, Rajashekar 01 January 2005 (has links)
Traffic crashes have been identified as one of the main causes of death in the US, making road safety a high-priority issue that needs urgent attention. Recognizing that more, and more effective, research has to be done in this area, this thesis aims mainly at developing different statistical models related to road safety. The thesis includes three main sections: 1) overall crash frequency analysis using negative binomial models, 2) seemingly unrelated negative binomial (SUNB) models for different categories of crashes divided based on the type of crash or the conditions in which they occur, and 3) safety models to determine the probability of crash occurrence, including a rainfall index estimated using a logistic regression model. The study corridor is a 36.25-mile stretch of Interstate 4 in Central Florida. For the first two sections, crashes from 1999 through 2002 were considered. Conventionally, most crash frequency analyses model all crashes together, instead of dividing them based on type of crash, peaking conditions, availability of light, severity, pavement condition, etc. Researchers have also traditionally used AADT to represent traffic volumes in their models. These two cases are examples of macroscopic crash frequency modeling. To investigate microscopic models, and to identify the significant factors related to crash occurrence, a preliminary study (first analysis) explored the use of microscopic traffic volumes by comparing AADT/VMT with five- to twenty-minute volumes immediately preceding the crash. It was found that the volumes just before the time of crash occurrence proved to be a better predictor of crash frequency than AADT. The results also showed that road curvature, median type, number of lanes, pavement surface type and presence of on/off-ramps are among the significant factors that contribute to crash occurrence. 
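The count-regression backbone of the crash frequency analysis can be sketched with a Poisson model fitted by Newton-Raphson; the negative binomial models the thesis uses extend this with an overdispersion parameter. All data here are synthetic and the covariates (a log short-term volume, curvature and ramp indicators) are hypothetical stand-ins for the study's roadway variables.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic segment data (hypothetical covariates).
n = 500
log_vol = rng.normal(8.0, 0.5, n)            # e.g. log of a 5-20 min volume
curve = rng.integers(0, 2, n).astype(float)  # road curvature indicator
ramp = rng.integers(0, 2, n).astype(float)   # on/off-ramp indicator
X = np.column_stack([np.ones(n), log_vol, curve, ramp])
beta_true = np.array([-6.0, 0.7, 0.4, 0.3])
y = rng.poisson(np.exp(X @ beta_true))       # crash counts per segment

# Poisson regression (log link) by Newton-Raphson.
beta = np.zeros(4)
for _ in range(50):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                    # score
    hess = X.T @ (X * mu[:, None])           # observed information
    step = np.linalg.solve(hess, grad)
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

print(np.round(beta, 2))
```

On this synthetic data the fitted coefficients recover the generating values to within sampling error, which is the sanity check one would run before moving to the negative binomial and SUNB specifications.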
In the second analysis, various possible crash categories were prepared to identify exactly the factors related to them, using various roadway, geometric, and microscopic traffic variables. Five different categories were prepared, each based on a common platform, e.g., type of crash: 1) multiple- and single-vehicle crashes, 2) peak and off-peak crashes, 3) dry- and wet-pavement crashes, 4) daytime and dark-hour crashes, and 5) Property Damage Only (PDO) and injury crashes. Each of the above-mentioned models in each category was estimated separately. To account for the correlation between the disturbance terms arising from omitted variables between any two models in a category, seemingly unrelated negative binomial (SUNB) regression was used, and the models in each category were then estimated simultaneously. SUNB estimation proved to be advantageous for two categories: Category 1 and Category 4. Road curvature and the presence of on-ramps/off-ramps were found to be important factors related to every crash category. AADT was also found to be significant in all the models except the single-vehicle crash model. Median type and pavement surface type were among the other important factors causing crashes. It can be stated that the group of factors found in the model considering all crashes is a superset of the factors found in the individual crash categories. The third analysis dealt with the development of a logistic regression model to obtain the weather condition at a given time and location on I-4 in Central Florida, so that this information can be used in traffic safety analyses given the lack of weather monitoring stations in the study area. To demonstrate the worth of the weather information obtained from the analysis, it was used in a safety model developed by Abdel-Aty et al., 2004. It was also shown that the inclusion of weather information actually improved the safety model, yielding better prediction accuracy.
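A logistic regression of the kind used for the rainfall index can be sketched as below. The predictors (humidity and pressure anomalies) and their coefficients are invented for illustration; the thesis's actual rain model used its own site-specific inputs.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical standardized predictors for a rain/no-rain indicator.
n = 400
humidity = rng.normal(size=n)
pressure = rng.normal(size=n)
X = np.column_stack([np.ones(n), humidity, pressure])
logit_true = -0.5 + 2.0 * humidity - 1.5 * pressure
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_true))).astype(float)

# Logistic regression fitted by Newton-Raphson (IRLS).
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-(X @ beta)))
    grad = X.T @ (y - p)
    W = p * (1 - p)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)

p_hat = 1 / (1 + np.exp(-(X @ beta)))        # rainfall-index-style probability
acc = float(np.mean((p_hat > 0.5) == (y == 1)))
print(np.round(beta, 2), round(acc, 3))
```

The fitted probabilities play the role of the rainfall index: a continuous weather likelihood that can be fed into a downstream safety model.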
|
330 |
Selective Multivariate Applications In Forensic ScienceRinke, Caitlin 01 January 2012 (has links)
A 2009 report published by the National Research Council addressed the need for improvements in the field of forensic science. In the report, emphasis was placed on the need for more rigorous scientific analysis within many forensic science disciplines and for established limitations and determination of error rates from statistical analysis. This research focused on multivariate statistical techniques for the analysis of spectral data obtained for multiple forensic applications, including samples from: automobile float glasses and paints, bones, metal transfers, ignitable liquids and fire debris, and organic compounds including explosives. The statistical techniques were used for two types of data analysis: classification and discrimination. Statistical methods including linear discriminant analysis and a novel soft classification method were used to classify forensic samples based on a compiled library. The novel soft classification method combined three statistical steps: Principal Component Analysis (PCA), Target Factor Analysis (TFA), and Bayesian Decision Theory (BDT) to provide classification based on posterior probabilities of class membership. The posterior probabilities provide a statistical probability of classification which can aid a forensic analyst in reaching a conclusion. The second analytical approach applied nonparametric methods to provide the means for discrimination between samples. Nonparametric methods are performed as hypothesis tests and do not assume a normal distribution of the analytical figures of merit. The nonparametric permutation test was applied to forensic applications to determine the similarity between two samples and provide discrimination rates. Both the classification method and the discrimination method were applied to data acquired from multiple instrumental methods. 
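The idea of soft classification via posterior probabilities can be sketched as PCA scores plus class-conditional Gaussians combined through Bayes' rule. This is a simplified stand-in for the PCA/TFA/BDT pipeline (TFA is omitted), and the "spectra" are synthetic Gaussian bands, not real forensic data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "spectra" for two reference classes (illustrative only).
x = np.linspace(0, 1, 80)
def make(center, n):
    return (np.exp(-((x - center) / 0.08) ** 2)
            + rng.normal(scale=0.05, size=(n, x.size)))

train = np.vstack([make(0.3, 30), make(0.6, 30)])
labels = np.array([0] * 30 + [1] * 30)

# PCA scores for the library
mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
scores = (train - mean) @ Vt[:2].T

# Class-conditional Gaussians + Bayes' rule -> posterior class probabilities
def gauss_logpdf(z, mu, var):
    return float(-0.5 * np.sum((z - mu) ** 2 / var + np.log(2 * np.pi * var)))

stats = []
for c in (0, 1):
    sc = scores[labels == c]
    stats.append((sc.mean(axis=0), sc.var(axis=0) + 1e-6))

query = (make(0.3, 1) - mean) @ Vt[:2].T      # unknown sample from class 0
logp = np.array([gauss_logpdf(query[0], mu, var) for mu, var in stats])
post = np.exp(logp - logp.max())
post /= post.sum()                             # posterior probabilities
print(np.round(post, 3))
```

Rather than a hard label, the output is a probability of membership in each library class, which is the quantity the abstract argues an analyst can weigh when reaching a conclusion.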
The instrumental methods included: Laser-Induced Breakdown Spectroscopy (LIBS), Fourier Transform Infrared Spectroscopy (FTIR), Raman spectroscopy, and Gas Chromatography-Mass Spectrometry (GC-MS). Some of these instrumental methods are currently applied to forensic applications, such as GC-MS for the analysis of ignitable liquid and fire debris samples, while others bring new instrumental methods to areas of forensic science that currently lack instrumental analysis techniques, such as LIBS for the analysis of metal transfers. The combination of instrumental techniques and multivariate statistical techniques is investigated in this research through new approaches to forensic applications, to assist in improving the field of forensic science.
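The nonparametric permutation test mentioned above can be sketched as follows. The figures of merit here are invented (e.g. a peak ratio from two glass samples), and the difference of means is just one convenient test statistic.

```python
import numpy as np

rng = np.random.default_rng(7)

def permutation_test(a, b, n_perm=2000, rng=rng):
    """Two-sample permutation test on the absolute difference of means.

    Returns the p-value: the fraction of label shufflings whose statistic
    is at least as extreme as the observed one.
    """
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = abs(perm[:len(a)].mean() - perm[len(a):].mean())
        if stat >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

# Illustrative figures of merit from hypothetical glass samples.
same = rng.normal(1.00, 0.05, 20)
diff = rng.normal(1.20, 0.05, 20)
p_same_vs_same = permutation_test(same, rng.normal(1.00, 0.05, 20))
p_same_vs_diff = permutation_test(same, diff)
print(round(p_same_vs_same, 3), round(p_same_vs_diff, 4))
```

No normality assumption is made: the null distribution of the statistic is built directly from relabelings of the pooled data, which is why such tests suit analytical figures of merit with unknown distributions.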
|