About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
441

Model Complexity in Linear Regression: Extensions for Prediction and Heteroscedasticity

Luan, Bo 18 August 2022 (has links)
No description available.
442

Gauss-Newton Based Learning for Fully Recurrent Neural Networks

Vartak, Aniket Arun 01 January 2004 (has links)
The thesis discusses a novel off-line and on-line learning approach for Fully Recurrent Neural Networks (FRNNs). The most popular algorithm for training FRNNs, the Real Time Recurrent Learning (RTRL) algorithm, employs the gradient descent technique for finding the optimum weight vectors in the recurrent neural network. Within the framework of the research presented, a new off-line and on-line variation of RTRL is presented that is based on the Gauss-Newton method. The method itself is an approximate Newton's method tailored to the specific optimization problem (non-linear least squares), and it aims to speed up the process of FRNN training. The new approach stands as a robust and effective compromise between the original gradient-based RTRL (low computational complexity, slow convergence) and Newton-based variants of RTRL (high computational complexity, fast convergence). By gathering information over time to form Gauss-Newton search vectors, the new learning algorithm, GN-RTRL, is capable of converging faster to a better quality solution than the original algorithm. Experimental results reflect these qualities of GN-RTRL, as well as the fact that GN-RTRL may in practice have lower computational cost than the original RTRL.
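As a rough illustration of the Gauss-Newton step that underlies GN-RTRL (not the thesis's recurrent-network algorithm itself), the sketch below applies a damped Gauss-Newton iteration to a generic non-linear least squares problem; the toy model, data, and damping constant are illustrative assumptions.

```python
import numpy as np

def gauss_newton(residual, jacobian, w0, iters=20, damping=1e-6):
    """Damped Gauss-Newton loop for minimizing ||r(w)||^2."""
    w = w0.astype(float)
    for _ in range(iters):
        r = residual(w)          # residual vector r(w)
        J = jacobian(w)          # Jacobian dr/dw
        # Solve (J^T J + damping*I) dw = -J^T r (normal equations).
        A = J.T @ J + damping * np.eye(w.size)
        w = w + np.linalg.solve(A, -J.T @ r)
    return w

# Toy example: fit y = exp(a*x) + b to noisy data.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.exp(1.5 * x) + 0.3 + 0.01 * rng.standard_normal(x.size)

res = lambda w: np.exp(w[0] * x) + w[1] - y
jac = lambda w: np.column_stack([x * np.exp(w[0] * x), np.ones_like(x)])

print(gauss_newton(res, jac, np.array([1.0, 0.0])))  # ~[1.5, 0.3]
```

The J^T J term is the Gauss-Newton approximation to the Hessian, which is what buys faster convergence than plain gradient descent at moderate extra cost.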
443

Data-true Characterization Of Neuronal Models

Suarez, Jose 01 January 2011 (has links)
In this thesis, a weighted least squares approach is initially presented to estimate the parameters of an adaptive quadratic neuronal model. By casting the discontinuities in the state variables at the spiking instants as an impulse train driving the system dynamics, the neuronal output is represented as a linearly parameterized model that depends on filtered versions of the input current and the output voltage at the cell membrane. A prediction error-based weighted least squares method is formulated for the model. This method allows for rapid estimation of model parameters under a persistently exciting input current injection. Simulation results show the feasibility of this approach to predict multiple neuronal firing patterns. Results of the method using data from a detailed ion-channel based model revealed issues that served as the basis for the more robust resonate-and-fire model presented next. A second method is proposed to overcome some of the issues found in the adaptive quadratic model. The original quadratic model is replaced by a linear resonate-and-fire model (with stochastic threshold) that is both computationally efficient and suitable for larger network simulations. The parameter estimation method presented here consists of different stages where the set of parameters is divided into two. The first set of parameters is assumed to represent the subthreshold dynamics of the model, and it is estimated using a nonlinear least squares algorithm, while the second set is associated with the threshold and reset parameters and is estimated using maximum likelihood formulations. The validity of the estimation method is then tested using detailed Hodgkin-Huxley model data as well as experimental voltage recordings from rat motoneurons.
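For a linearly parameterized model y ≈ Φθ, the weighted least squares estimate has a closed form; the sketch below shows that generic step, not the thesis's specific regressor construction. The regressor matrix, weights, and coefficient values are placeholders (the coefficients echo a common quadratic-model parameterization purely for illustration).

```python
import numpy as np

def weighted_least_squares(Phi, y, w):
    """Closed-form WLS: theta = (Phi^T W Phi)^{-1} Phi^T W y."""
    W = np.diag(w)
    return np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)

# Illustrative regressor matrix standing in for filtered versions of the
# input current and membrane voltage.
rng = np.random.default_rng(1)
n = 200
Phi = rng.standard_normal((n, 3))
theta_true = np.array([0.04, 5.0, 140.0])   # illustrative coefficients
y = Phi @ theta_true + 0.1 * rng.standard_normal(n)
weights = np.ones(n)                        # e.g. down-weight samples near spikes

print(weighted_least_squares(Phi, y, weights))  # ~theta_true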
444

Factors Affecting the Thai Natural Rubber Market Equilibrium: Demand and Supply Response Analysis Using Two-Stage Least Squares Approach

Chawananon, Chadapa 01 June 2014 (has links) (PDF)
Natural rubber is a major export crop and the sector is an important source of employment in Thailand. Very few rubber studies in the past have examined the demand and supply equations simultaneously, and the results of previous work are dated. The objectives of this study were to estimate an econometric model of the demand and supply of natural rubber in Thailand and to determine whether a relationship exists between the supply of rubber and its determinants. The data used in the study are secondary annual time series data from 1977 to 2012. Instrumental variables estimation by two-stage least squares was used to estimate and analyze the demand and supply of rubber. Results were statistically significant at the 0.01 level, showing that U.S. GDP per capita, the estimated price, rainfall and the rice price have a significant effect on the quantity of rubber production in Thailand, with estimated elasticities of 1.4, 3.3, -3.6 and -2.6, respectively. The implications of the results are assessed through the lens of rubber producers, rubber consumers and agricultural policy makers.
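Two-stage least squares can be written as two ordinary least squares passes: regress the endogenous regressors on the instruments, then regress the outcome on the fitted values. The sketch below is a minimal version of that logic; the variables are stand-ins, not the study's actual rubber data.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS: project regressors onto instruments, then OLS on fitted values."""
    # Stage 1: fitted values of X from the instrument matrix Z.
    beta1, *_ = np.linalg.lstsq(Z, X, rcond=None)
    X_hat = Z @ beta1
    # Stage 2: OLS of y on the instrumented regressors.
    beta2, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta2

# Illustrative system: price is treated as endogenous; two exogenous
# variables (stand-ins for, say, rainfall and income) serve as instruments.
rng = np.random.default_rng(2)
n = 36                                       # ~annual observations, 1977-2012
Z = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
price = Z @ np.array([1.0, 0.8, -0.5]) + 0.2 * rng.standard_normal(n)
X = np.column_stack([np.ones(n), price])     # constant + endogenous price
supply = X @ np.array([2.0, 1.4]) + 0.3 * rng.standard_normal(n)

print(two_stage_least_squares(supply, X, Z))  # ~[2.0, 1.4]
```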
445

Development of CPANEL, an Unstructured Panel Code, Using a Modified TLS Velocity Formulation

Satterwhite, Christopher R 01 September 2015 (has links) (PDF)
The use of panel codes in the aerospace industry dates back many decades. Recent advances in computer capability have allowed them to evolve, both in speed and complexity, to provide very quick solutions to complex flow fields. By requiring only surface discretization, panel codes offer a faster alternative to volume-based methods, delivering a solution in minutes, as opposed to hours or days. Despite their utility, the availability of these codes is very limited due to either cost or rights restrictions. This work incorporates modern software development practices, such as unit-level testing and version control, into the development of an unstructured panel code, CPanel, with an object-oriented approach in C++. CPanel utilizes constant source and doublet panels to define the geometry and a vortex sheet wake representation. An octree data structure is employed to enhance the speed of geometrical queries and lay a framework for the application of a fast tree method. The challenge of accurately calculating surface velocities on an unstructured discretization is addressed with a constrained Hermite Taylor least-squares (CHTLS) velocity formulation. Future enhancement was anticipated throughout development, leaving a strong framework from which to perform research on methods to more accurately predict the physical flow field with a tool based in potential flow theory. Program results are verified against the analytical solution for flow around an ellipsoid, vortex lattice method solutions for simple planforms, and an established panel code, CBAERO. CPanel solutions show strong agreement with these methods and programs. Additionally, aerodynamic coefficients calculated via surface integration are consistent with those calculated from a Trefftz plane analysis in CPanel. This consistency is not demonstrated in solutions from CBAERO, suggesting the CHTLS velocity formulation is more accurate than the more commonly used vortex core methods.
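The core of a Taylor least-squares velocity reconstruction is a local first-order fit of the potential over neighboring panel control points. The sketch below shows an unconstrained version of that idea (the thesis's CHTLS formulation additionally imposes Hermite constraints); the geometry and potential field are made up for illustration.

```python
import numpy as np

def taylor_ls_gradient(x0, neighbors, phi0, phi_neighbors):
    """Fit phi(x) ~ phi0 + g.(x - x0) to neighbor values; return gradient g."""
    dX = neighbors - x0              # neighbor displacements (n x 3)
    dphi = phi_neighbors - phi0      # potential differences
    g, *_ = np.linalg.lstsq(dX, dphi, rcond=None)
    return g                         # least-squares velocity estimate

# Sanity check: a potential phi = U.x should return the uniform velocity U.
rng = np.random.default_rng(3)
U = np.array([1.0, 0.2, 0.0])
x0 = np.zeros(3)
pts = 0.1 * rng.standard_normal((8, 3))   # nearby panel control points
print(taylor_ls_gradient(x0, pts, U @ x0, pts @ U))   # ~U
```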
446

Partial least squares structural equation modelling with incomplete data. An investigation of the impact of imputation methods.

Mohd Jamil, J.B. January 2012 (has links)
Despite considerable advances in missing data imputation methods over the last three decades, the problem of missing data remains largely unsolved. Many techniques have emerged in the literature as candidate solutions. These techniques can be categorised into two classes: statistical methods of data imputation and computational intelligence methods of data imputation. Due to the longstanding use of statistical methods in handling missing data problems, computational intelligence methods have taken some time to gain attention, even though they offer comparable accuracy. The merits of both classes have been discussed at length in the literature, but only limited studies make a significant comparison between them. This thesis contributes to knowledge by, firstly, conducting a comprehensive comparison of standard statistical methods of data imputation, namely mean substitution (MS), regression imputation (RI), expectation maximization (EM), tree imputation (TI) and multiple imputation (MI), on missing completely at random (MCAR) data sets. Secondly, this study compares the efficacy of these methods with a computational intelligence method of data imputation, namely a neural network (NN), on missing not at random (MNAR) data sets. The significance of the differences in performance of the methods is presented. Thirdly, a novel procedure for handling missing data is presented. A hybrid combination of each of these statistical methods with a NN, known here as the post-processing procedure, was adopted to approximate MNAR data sets. Simulation studies for each of these imputation approaches have been conducted to assess the impact of missing values on partial least squares structural equation modelling (PLS-SEM), based on the estimated accuracy of both structural and measurement parameters. The best method to deal with each particular missing data mechanism is identified. Several significant insights were deduced from the simulation results. For the problem of MCAR, among the statistical methods of data imputation, MI performs better than the other methods for all percentages of missing data. Another unique contribution is found when comparing the results before and after the NN post-processing procedure. This improvement in accuracy may result from the neural network's ability to derive meaning from the imputed data sets produced by the statistical methods. Based on these results, the NN post-processing procedure is capable of assisting MS in producing significant improvement in the accuracy of the approximated values. This is a promising result, as MS is the weakest method in this study. This evidence is also informative, as MS is often the default method available to users of PLS-SEM software. / Minister of Higher Education Malaysia and University Utara Malaysia
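Mean substitution, the weakest baseline in the study, is also the simplest to state: each missing entry is replaced by its column's observed mean. A minimal sketch under an MCAR deletion pattern (the data and missingness rate are illustrative):

```python
import numpy as np

def mean_substitution(X):
    """Replace NaNs in each column with that column's observed mean (MS)."""
    X = X.astype(float).copy()
    col_means = np.nanmean(X, axis=0)            # means over observed entries only
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_means[cols]
    return X

# Illustrative MCAR mechanism: delete ~20% of entries completely at random.
rng = np.random.default_rng(4)
X = rng.standard_normal((10, 4))
X[rng.random(X.shape) < 0.2] = np.nan
print(mean_substitution(X))
```

Because MS ignores relationships between variables, it shrinks variances and distorts covariances, which is why it benefits most from the thesis's NN post-processing step.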
447

Analysis of Order Strategies for Alternating Algorithms in Optimization

Ntiamoah, Daniel 05 June 2023 (has links)
No description available.
448

Non-invasive estimation of skin chromophores using Hyperspectral Imaging

Karambor Chakravarty, Sriya 21 August 2023 (has links)
Melanomas account for more than 1.7% of global cancer diagnoses and about 1% of all skin cancer diagnoses in the United States. This type of cancer occurs in the melanin-producing cells in the epidermis and exhibits distinctive variations in melanin and blood concentration values in the form of skin lesions. The current approach for evaluating skin cancer lesions involves visual inspection with a dermatoscope, typically followed by biopsy and histopathological analysis. However, to decrease the risk of misdiagnosis, this process results in unnecessary biopsies, contributing to the emotional and financial distress of patients. The implementation of a non-invasive imaging technique to aid the analysis of skin lesions in the early stages can potentially mitigate these consequences. Hyperspectral imaging (HSI) has shown promise as a non-invasive technique to analyze skin lesions. Images taken of human skin using a hyperspectral camera are the result of numerous elements in the skin. Being a turbid, inhomogeneous material, the skin has chromophores and scattering agents, which interact with light and produce characteristic back-scattered energy that can be harnessed and examined with an HSI camera. In this study, a mathematical model of the skin is used to extract meaningful information from the hyperspectral data in the form of melanin concentration, blood volume fraction and blood oxygen saturation in the skin. The human skin is modelled as a bi-layer planar system, whose surface reflectance is theoretically calculated using the Kubelka-Munk theory and the absorption laws of Beer and Lambert. Hyperspectral images of the dorsal portion of three volunteer subjects' hands, in the 400-1000 nm range, were used to estimate the contributing parameters. The mean and standard deviation of these estimates are reported and compared with theoretical values from the literature. The model is also evaluated for its sensitivity with respect to these parameters, and then fitted to measured hyperspectral data of three volunteer subjects in different conditions. The wavelengths and wavelength groups identified as producing the maximum change in the percentage reflectance calculated from the model were 450 and 660 nm for melanin, 500-520 nm and 590-625 nm for blood volume fraction, and 606, 646 and 750 nm for blood oxygen saturation. / Master of Science / Melanoma, the most serious type of skin cancer, develops in the melanin-producing cells in the epidermis. A characteristic marker of skin lesions is the abrupt variations in melanin and blood concentration in areas of the lesion. The present technique to inspect skin cancer lesions involves dermatoscopy, which is a qualitative visual analysis of the lesion's features using a few standardized techniques such as the 7-point checklist and the ABCDE rule. Typically, dermatoscopy is followed by a biopsy and then a histopathological analysis of the biopsy. To reduce the possibility of misdiagnosing actual melanomas, a considerable number of dermoscopically unclear lesions are biopsied, increasing emotional, financial, and medical consequences. A non-invasive imaging technique to analyze skin lesions during the dermoscopic stage can help alleviate some of these consequences. Hyperspectral imaging (HSI) is a promising methodology to non-invasively analyze skin lesions. Images taken of human skin using a hyperspectral camera are the result of numerous elements in the skin.
Being a turbid, inhomogeneous material, the skin has chromophores and scattering agents, which interact with light and produce characteristic back-scattered energy that can be harnessed and analyzed with an HSI camera. In this study, a mathematical model of the skin is used to extract meaningful information from the hyperspectral data in the form of melanin concentration, blood volume fraction and blood oxygen saturation. The mean and standard deviation of these estimates are reported and compared with theoretical values from the literature. The model is also evaluated for its sensitivity with respect to these parameters, and then fitted to measured hyperspectral data of six volunteer subjects in different conditions. The wavelengths which capture the most influential changes in the model response are identified as 450 and 660 nm for melanin, 500-520 nm and 590-625 nm for blood volume fraction, and 606, 646 and 750 nm for blood oxygen saturation.
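To give a sense of one ingredient of such a forward model, the Beer-Lambert law combines chromophore concentrations linearly in optical density. The sketch below computes transmittance from that law alone; the extinction values and concentrations are placeholders, not the thesis's coefficients, and the full model additionally involves Kubelka-Munk scattering in a bi-layer geometry.

```python
import numpy as np

def beer_lambert_transmittance(extinction, concentrations, path_length):
    """T(lambda) = 10^(-sum_i eps_i(lambda) * c_i * d)  (Beer-Lambert law)."""
    optical_density = (extinction @ concentrations) * path_length
    return 10.0 ** (-optical_density)

# Placeholder extinction spectra for melanin, oxy- and deoxy-hemoglobin at
# a few of the wavelengths singled out above (values illustrative only).
wavelengths = np.array([450.0, 520.0, 606.0, 660.0, 750.0])   # nm
eps = np.array([[2.0, 0.3, 0.4],
                [1.2, 0.9, 0.8],
                [0.8, 0.2, 0.6],
                [0.6, 0.1, 0.5],
                [0.4, 0.2, 0.3]])
c = np.array([0.02, 0.01, 0.005])   # melanin, HbO2, Hb concentrations
print(beer_lambert_transmittance(eps, c, path_length=0.1))
```

Inverting such a model (fitting c to measured reflectance at many wavelengths) is what turns a hyperspectral cube into chromophore maps.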
449

The effect of sampling error on the interpretation of a least squares regression relating phosphorus and chlorophyll

Beedell, David C. (David Charles) January 1995 (has links)
No description available.
450

Continuous Antedependence Models for Sparse Longitudinal Data

Cheruvu, Vinay Kumar 30 January 2012 (has links)
No description available.
