511

Development of CPANEL, an Unstructured Panel Code, Using a Modified TLS Velocity Formulation

Satterwhite, Christopher R 01 September 2015
The use of panel codes in the aerospace industry dates back many decades. Recent advances in computer capability have allowed them to evolve, both in speed and complexity, to provide very quick solutions to complex flow fields. By only requiring surface discretization, panel codes offer a faster alternative to volume-based methods, delivering a solution in minutes as opposed to hours or days. Despite their utility, the availability of these codes is very limited due to either cost or rights restrictions. This work incorporates modern software development practices, such as unit-level testing and version control, into the development of an unstructured panel code, CPanel, with an object-oriented approach in C++. CPanel utilizes constant source and doublet panels to define the geometry and a vortex sheet wake representation. An octree data structure is employed to enhance the speed of geometrical queries and lay a framework for the application of a fast tree method. The challenge of accurately calculating surface velocities on an unstructured discretization is addressed with a constrained Hermite Taylor least-squares (CHTLS) velocity formulation. Future enhancement was anticipated throughout development, leaving a strong framework from which to perform research on methods to more accurately predict the physical flow field with a tool based on potential flow theory. Program results are verified using the analytical solution for flow around an ellipsoid, vortex lattice method solutions for simple planforms, as well as an anchored panel code, CBAERO. CPanel solutions show strong agreement with these methods and programs. Additionally, aerodynamic coefficients calculated via surface integration are consistent with those calculated from a Trefftz plane analysis in CPanel. This consistency is not demonstrated in solutions from CBAERO, suggesting the CHTLS velocity formulation is more accurate than the more commonly used vortex core methods.
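As a rough illustration of the kind of geometric query the octree accelerates, the sketch below stores panel centroid points in an octree and retrieves all neighbours within a given radius. It is a generic, hypothetical Python implementation written for this summary, not CPanel's C++ code; the class, method and parameter names are invented for the example.

```python
import numpy as np

class Octree:
    """Toy point octree: leaves hold points, internal nodes hold eight children."""
    def __init__(self, center, half_size, capacity=8):
        self.center = np.asarray(center, dtype=float)
        self.half_size = float(half_size)   # half the edge length of this node's cube
        self.capacity = capacity            # points a leaf may hold before it splits
        self.points = []
        self.children = None

    def insert(self, p):
        p = np.asarray(p, dtype=float)
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity:
                self._subdivide()
        else:
            self._child_for(p).insert(p)

    def _subdivide(self):
        h = self.half_size / 2.0
        offsets = [np.array([sx, sy, sz]) for sx in (-h, h)
                                          for sy in (-h, h)
                                          for sz in (-h, h)]
        self.children = [Octree(self.center + o, h, self.capacity) for o in offsets]
        for p in self.points:
            self._child_for(p).insert(p)
        self.points = []

    def _child_for(self, p):
        # Child index encodes which side of the centre the point lies on, per axis.
        idx = 4 * (p[0] >= self.center[0]) + 2 * (p[1] >= self.center[1]) + (p[2] >= self.center[2])
        return self.children[int(idx)]

    def query_radius(self, q, r):
        """Return all stored points within distance r of q."""
        q = np.asarray(q, dtype=float)
        # Prune whole subtrees whose cube cannot intersect the query sphere.
        if np.any(np.abs(q - self.center) > self.half_size + r):
            return []
        found = [p for p in self.points if np.linalg.norm(p - q) <= r]
        if self.children is not None:
            for child in self.children:
                found.extend(child.query_radius(q, r))
        return found

# Usage: index a cloud of panel centroids, then find local neighbours quickly.
rng = np.random.default_rng(0)
tree = Octree(center=[0, 0, 0], half_size=1.0)
for centroid in rng.uniform(-1, 1, size=(5000, 3)):
    tree.insert(centroid)
print(len(tree.query_radius(np.zeros(3), 0.1)), "centroids within r = 0.1 of the origin")
```

Pruning entire subtrees whose bounding cubes cannot touch the query sphere turns repeated geometric queries from a linear scan over all panels into a roughly logarithmic search, and the same hierarchy is the natural starting point for a fast tree (multipole-style) method.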
512

Metamodel-based collaborative optimization framework

Zadeh, Parviz M., Toropov, V.V., Wood, Alastair S. January 2009
This paper focuses on metamodel-based collaborative optimization (CO). The objective is to improve the computational efficiency of CO in order to handle multidisciplinary design optimization problems utilising high-fidelity models. To this end, two levels of metamodel building techniques are proposed: metamodels in the disciplinary optimization are based on multi-fidelity modelling (the interaction of low- and high-fidelity models), while for the system-level optimization a combination of a global metamodel based on the moving least squares method and a trust region strategy is introduced. The proposed method is demonstrated on a continuous fiber-reinforced composite beam test problem. Results show that the methods introduced in this paper provide an effective way of improving the computational efficiency of CO based on high-fidelity simulation models.
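A minimal sketch of a moving least squares (MLS) surrogate of the kind used at the system level is shown below. The Gaussian weight function, the linear basis and the parameter values are illustrative assumptions, not the exact formulation of the paper, and the sample response is a stand-in for an expensive high-fidelity analysis.

```python
import numpy as np

def mls_predict(x_query, X, y, theta=5.0):
    """Predict the response at x_query from samples (X, y) with a linear MLS fit.

    X : (n, d) sample locations, y : (n,) responses. Weights decay with distance
    from the query point, so nearby samples dominate the local fit.
    """
    dist = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-theta * dist**2)                             # Gaussian weights (one common choice)
    A = np.hstack([np.ones((X.shape[0], 1)), X - x_query])   # linear basis centred at the query
    sw = np.sqrt(w)[:, None]                                 # scale rows so lstsq solves weighted LS
    coeff, *_ = np.linalg.lstsq(sw * A, sw[:, 0] * y, rcond=None)
    return coeff[0]                                          # basis at the query point is [1, 0, ..., 0]

# Usage: cheap global surrogate over a 2-D design space.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2                       # stand-in for a high-fidelity response
print(mls_predict(np.array([0.2, -0.3]), X, y))
```

Because the local fit is recomputed around each query point, an MLS surrogate is cheap to refresh as new high-fidelity samples arrive inside a trust region, which is what makes the combination with a trust region strategy attractive.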
513

Partial least squares structural equation modelling with incomplete data. An investigation of the impact of imputation methods.

Mohd Jamil, J.B. January 2012
Despite considerable advances in missing data imputation methods over the last three decades, the problem of missing data remains largely unsolved. Many techniques have emerged in the literature as candidate solutions. These techniques can be categorised into two classes: statistical methods of data imputation and computational intelligence methods of data imputation. Due to the longstanding use of statistical methods in handling missing data problems, computational intelligence methods have taken some time to gain attention, even though their accuracy is comparable to that of other approaches. The merits of both these classes have been discussed at length in the literature, but only a limited number of studies make a significant comparison between them. This thesis contributes to knowledge by, firstly, conducting a comprehensive comparison of standard statistical methods of data imputation, namely mean substitution (MS), regression imputation (RI), expectation maximization (EM), tree imputation (TI) and multiple imputation (MI), on missing completely at random (MCAR) data sets. Secondly, this study compares the efficacy of these methods with a computational intelligence method of data imputation, namely a neural network (NN), on missing not at random (MNAR) data sets. The significant differences in performance between the methods are presented. Thirdly, a novel procedure for handling missing data is presented. A hybrid combination of each of these statistical methods with a NN, known here as the post-processing procedure, was adopted to approximate MNAR data sets. Simulation studies for each of these imputation approaches have been conducted to assess the impact of missing values on partial least squares structural equation modelling (PLS-SEM), based on the estimated accuracy of both structural and measurement parameters. The best method for dealing with each missing data mechanism is identified. Several significant insights were deduced from the simulation results. For the MCAR problem, among the statistical methods of data imputation, MI performs better than the other methods for all percentages of missing data. Another unique contribution is found when comparing the results before and after the NN post-processing procedure. The improvement in accuracy may result from the neural network's ability to derive meaning from the imputed data set produced by the statistical methods. Based on these results, the NN post-processing procedure is capable of assisting MS in producing a significant improvement in the accuracy of the approximated values. This is a promising result, as MS is the weakest method in this study. This evidence is also informative as MS is often used as the default method available to users of PLS-SEM software. / Minister of Higher Education Malaysia and University Utara Malaysia
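The following toy simulation illustrates two of the statistical methods compared in the thesis, mean substitution (MS) and regression imputation (RI), on data made missing completely at random. It is a simplified sketch for intuition only; the data-generating model, the missingness rate and the error measure are arbitrary choices, not the thesis's PLS-SEM simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.5, size=n)      # y depends on x, so RI has structure to exploit

mask = rng.random(n) < 0.2                       # 20% of y is missing completely at random (MCAR)
y_obs = np.where(mask, np.nan, y)

# Mean substitution (MS): replace every missing value with the observed mean.
y_ms = np.where(mask, np.nanmean(y_obs), y_obs)

# Regression imputation (RI): predict missing y from x using the complete cases.
slope, intercept = np.polyfit(x[~mask], y[~mask], 1)
y_ri = np.where(mask, intercept + slope * x, y_obs)

for name, y_imp in [("MS", y_ms), ("RI", y_ri)]:
    rmse = np.sqrt(np.mean((y_imp[mask] - y[mask]) ** 2))
    print(f"{name} RMSE on imputed cells: {rmse:.3f}")
```

RI typically recovers the missing values more accurately than MS here because it exploits the relationship between the variables, which mirrors the thesis's finding that MS is the weakest statistical method; the post-processing idea is to let a neural network refine exactly those weak imputations.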
514

Convnet features for age estimation

Bukar, Ali M., Ugail, Hassan 07 1900
Research in facial age estimation has been active for over a decade, owing to its numerous applications. Recently, convolutional neural networks (CNNs) have been used in an attempt to solve this age-old problem, and researchers have proposed various CNN architectures for the purpose. Unfortunately, most of the proposed techniques have been based on relatively ‘shallow’ networks. In this work, we leverage the capability of an off-the-shelf deep CNN model, namely the VGG-Face model, which has been trained on millions of face images. Interestingly, despite being a simple approach, features extracted from the VGG-Face model, when reduced and fed into linear regressors, outperform most of the state-of-the-art CNNs on both the FGNET-AD and Morph II benchmark databases. Furthermore, rather than using only the last fully connected (FC) layer of the trained model, we evaluate the activations from different layers of the architecture. In fact, our experiments show that generic features learnt from intermediate-layer activations carry more ageing information than the FC layers.
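A sketch of the final stage described above, reducing CNN activations and feeding them into a linear regressor, might look as follows. The activation matrix is random placeholder data rather than real VGG-Face features, the age labels are synthetic, and PCA plus ridge regression are assumed choices for the reduction and regression steps rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Stand-in for activations of one intermediate VGG-Face layer (one row per face image).
rng = np.random.default_rng(0)
activations = rng.normal(size=(1000, 4096))
ages = rng.uniform(1, 70, size=1000)             # placeholder labels, not FG-NET/MORPH data

X_tr, X_te, y_tr, y_te = train_test_split(activations, ages, test_size=0.2, random_state=0)

pca = PCA(n_components=200).fit(X_tr)            # reduce the high-dimensional activations
reg = Ridge(alpha=1.0).fit(pca.transform(X_tr), y_tr)

pred = reg.predict(pca.transform(X_te))
print("MAE (years):", mean_absolute_error(y_te, pred))
```

Swapping in activations taken from an earlier convolutional block instead of the FC layer only changes the shape of the feature matrix, which is what makes the layer-by-layer comparison reported above straightforward to run.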
515

Analysis of Order Strategies for Alternating Algorithms in Optimization

Ntiamoah, Daniel 05 June 2023
No description available.
516

On linear Reaction-Diffusion systems and Network Controllability

Aulin, Rebecka, Hage, Felicia January 2023
In 1952 Alan Turing published his paper "The Chemical Basis of Morphogenesis", which described a model for how naturally occurring patterns, such as the stripes of a zebra and the spots of a leopard, can arise from a spatially homogeneous steady state through diffusion. Turing suggested that the concentration of the substances producing the patterns is determined by the reaction kinetics, that is, how the substances interact, together with diffusion. In this project, Turing's model with linear reaction kinetics was studied. The model was first solved using two different numerical methods: the finite difference method (FDM) and the finite element method (FEM), with different boundary conditions. A parameter study was then conducted, investigating the effect of changing the model parameters on the patterns. Lastly, the controllability of the model and the least-energy control were considered. The simulations were found to produce patterns provided the right parameters were chosen, as expected. The parameter study showed that the size and tightness of the pattern, as well as the similarity of the substance concentration distributions, depended on the choice of parameters. As for controllability, a desired final state could be produced through simulations using boundary control, and the energy cost of producing the pattern increased as the number of controls decreased.
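As context for the FDM approach mentioned above, here is a minimal explicit finite-difference sketch of a two-species linear reaction-diffusion system, u_t = Du*Lap(u) + a*u + b*v and v_t = Dv*Lap(v) + c*u + d*v. The kinetic coefficients, the diffusion ratio and the periodic boundary conditions are illustrative assumptions made for this example; the project itself compared FDM and FEM under several boundary conditions.

```python
import numpy as np

N, h, dt, steps = 100, 1.0, 0.01, 2000
Du, Dv = 1.0, 20.0                        # slowly diffusing activator, fast diffusing inhibitor
a, b, c, d = 1.0, -1.0, 2.0, -1.5         # linearised kinetics chosen to satisfy the Turing conditions

rng = np.random.default_rng(0)
u = 0.01 * rng.standard_normal((N, N))    # small random perturbation of the homogeneous state
v = 0.01 * rng.standard_normal((N, N))

def lap(f):
    """5-point Laplacian with periodic boundaries."""
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f) / h**2

for _ in range(steps):                    # explicit Euler; dt respects the diffusion stability limit
    u, v = (u + dt * (Du * lap(u) + a * u + b * v),
            v + dt * (Dv * lap(v) + c * u + d * v))

print("u range after integration:", float(u.min()), float(u.max()))
```

Because the kinetics are linear, the unstable Turing modes grow exponentially rather than saturating into a fixed-amplitude pattern, so the run is stopped after a fixed number of steps; the spatial structure of u at that point already shows the characteristic wavelength selected by the parameters.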
517

Non-invasive estimation of skin chromophores using Hyperspectral Imaging

Karambor Chakravarty, Sriya 21 August 2023
Melanomas account for more than 1.7% of global cancer diagnoses and about 1% of all skin cancer diagnoses in the United States. This type of cancer occurs in the melanin-producing cells in the epidermis and exhibits distinctive variations in melanin and blood concentration values in the form of skin lesions. The current approach for evaluating skin cancer lesions involves visual inspection with a dermatoscope, typically followed by biopsy and histopathological analysis. However, to decrease the risk of misdiagnosis, this process results in unnecessary biopsies, contributing to the emotional and financial distress of patients. The implementation of a non-invasive imaging technique to aid the analysis of skin lesions in the early stages can potentially mitigate these consequences. Hyperspectral imaging (HSI) has shown promise as a non-invasive technique to analyze skin lesions. Images taken of human skin using a hyperspectral camera are a result of numerous elements in the skin. Being a turbid, inhomogeneous material, the skin has chromophores and scattering agents, which interact with light and produce characteristic back-scattered energy that can be harnessed and examined with an HSI camera. In this study, a mathematical model of the skin is used to extract meaningful information from the hyperspectral data in the form of melanin concentration, blood volume fraction and blood oxygen saturation in the skin. The human skin is modelled as a bi-layer planar system, whose surface reflectance is theoretically calculated using the Kubelka-Munk theory and the absorption laws of Beer and Lambert. Hyperspectral images of the dorsal portion of three volunteer subjects' hands in the 400-1000 nm range were used to estimate the contributing parameters. The mean and standard deviation of these estimates are reported and compared with theoretical values from the literature. The model is also evaluated for its sensitivity with respect to these parameters, and then fitted to measured hyperspectral data of three volunteer subjects in different conditions. The wavelengths and wavelength groups identified as producing the maximum change in the percentage reflectance calculated from the model were 450 and 660 nm for melanin, 500-520 nm and 590-625 nm for blood volume fraction, and 606, 646 and 750 nm for blood oxygen saturation. / Master of Science / Melanoma, the most serious type of skin cancer, develops in the melanin-producing cells in the epidermis. A characteristic marker of skin lesions is the abrupt variations in melanin and blood concentration in areas of the lesion. The present technique to inspect skin cancer lesions involves dermatoscopy, which is a qualitative visual analysis of the lesion's features using a few standardized techniques such as the 7-point checklist and the ABCDE rule. Typically, dermatoscopy is followed by a biopsy and then a histopathological analysis of the biopsy. To reduce the possibility of misdiagnosing actual melanomas, a considerable number of dermoscopically unclear lesions are biopsied, increasing emotional, financial, and medical consequences. A non-invasive imaging technique to analyze skin lesions during the dermoscopic stage can help alleviate some of these consequences. Hyperspectral imaging (HSI) is a promising methodology to non-invasively analyze skin lesions. Images taken of human skin using a hyperspectral camera are a result of numerous elements in the skin.
Being a turbid, inhomogeneous material, the skin has chromophores and scattering agents, which interact with light and produce characteristic back-scattered energy that can be harnessed and analyzed with an HSI camera. In this study, a mathematical model of the skin is used to extract meaningful information from the hyperspectral data in the form of melanin concentration, blood volume fraction and blood oxygen saturation. The mean and standard deviation of these estimates are reported and compared with theoretical values from the literature. The model is also evaluated for its sensitivity with respect to these parameters, and then fitted to measured hyperspectral data of six volunteer subjects in different conditions. The wavelengths which capture the most influential changes in the model response are identified as 450 and 660 nm for melanin, 500-520 nm and 590-625 nm for blood volume fraction, and 606, 646 and 750 nm for blood oxygen saturation.
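As a rough illustration of how chromophore parameters can be recovered from a reflectance spectrum, the sketch below performs a single-layer Beer-Lambert unmixing with non-negative least squares. This is a deliberate simplification of the bi-layer Kubelka-Munk model used in the study, and the extinction spectra are synthetic placeholders rather than published chromophore data.

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.arange(450, 751, 10)                         # nm
# Placeholder extinction spectra for melanin, oxy- and deoxy-haemoglobin.
eps_mel  = np.exp(-(wavelengths - 450) / 150.0)
eps_hbo2 = np.exp(-((wavelengths - 575) / 40.0) ** 2)
eps_hb   = np.exp(-((wavelengths - 555) / 40.0) ** 2)
E = np.column_stack([eps_mel, eps_hbo2, eps_hb])

# Synthetic "measured" reflectance generated from known concentrations plus noise.
c_true = np.array([0.8, 0.3, 0.1])
reflectance = np.exp(-(E @ c_true)) + 0.01 * np.random.default_rng(0).normal(size=wavelengths.size)

# Invert: absorbance = -ln(R) is linear in the concentrations, so each pixel of the
# hyperspectral cube reduces to a small non-negative least-squares problem.
absorbance = -np.log(np.clip(reflectance, 1e-6, None))
c_est, _ = nnls(E, absorbance)
print("true:", c_true, " estimated:", c_est)
```

In a model of this kind, the blood oxygen saturation estimate would then follow as the ratio of the oxy-haemoglobin coefficient to total haemoglobin, while the bi-layer Kubelka-Munk formulation additionally accounts for scattering and the layered structure of the skin.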
518

The effect of sampling error on the interpretation of a least squares regression relating phosphorus and chlorophyll

Beedell, David C. (David Charles) January 1995
No description available.
519

CONTINUOUS ANTEDEPENDENCE MODELS FOR SPARSE LONGITUDINAL DATA

CHERUVU, VINAY KUMAR 30 January 2012
No description available.
520

A NOVEL SYNERGISTIC MODEL FUSING ELECTROENCEPHALOGRAPHY AND FUNCTIONAL MAGNETIC RESONANCE IMAGING FOR MODELING BRAIN ACTIVITIES.

Michalopoulos, Konstantinos 26 August 2014
No description available.
