411 |
MEASURING COMMERCIAL BANK PERFORMANCE AND EFFICIENCY IN SUB-SAHARAN AFRICA. Ngu, Bryan; Mesfin, Tsegaye. January 2009 (has links)
<p>This paper measures the efficiency of banks in Sub-Saharan Africa and its determining input and output factors on two fronts. First, we apply Data Envelopment Analysis (DEA) to assess efficiency levels; the results show the actual and target levels of inputs/outputs needed to foster efficiency. Second, bank ratio analysis measures bank performance through return volatility for each bank, asset utilization, and provision for bad and doubtful debts over the study period. Our results suggest that Sub-Saharan African banks are about 98.35% efficient. We are aware that this level of efficiency could swing up or down if environmental factors influencing bank efficiency were taken into consideration. Finally, our DEA result is most sensitive to loans, other liabilities, other non-interest expense, securities, and deposits.</p>
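The DEA measurement described above can be illustrated in miniature. Full DEA solves one linear program per bank; the sketch below covers only the single-input, single-output case, where the CCR efficiency score reduces to each bank's output/input ratio relative to the best ratio observed in the sample. The bank names and figures are invented for illustration, not taken from the study.

```python
def dea_efficiency(units):
    """Single-input/single-output CCR-style efficiency: each unit's
    output/input ratio divided by the best ratio in the sample.
    A score of 1.0 means the unit lies on the efficient frontier."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# hypothetical banks: (input = deposits, output = loans)
banks = {"A": (100.0, 80.0), "B": (120.0, 90.0), "C": (90.0, 81.0)}
scores = dea_efficiency(banks)
```

With multiple inputs and outputs, the weights on each input and output become decision variables and the score is found by linear programming, which is what a real DEA solver does.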
|
412 |
Parametric study of manifolds using finite element analysis. Bäckström, Kristoffer. January 2008 (has links)
<p>Volvo Aero Corporation takes part in a development project called the Future Launchers Preparatory Program (FLPP), which aims to develop Next Generation Launchers (NGL) for future space flights. FLPP involves several projects, one of which is focused on the development of the next generation of rocket engines for the NGL.</p><p>The environment of a rocket engine is extremely hostile, characterized by high pressure levels and rapid thermal transients. Even though the components are manufactured from superalloys, their life is measured in seconds. In light of these facts, it is obvious that all components have to be optimized to the last detail. This thesis work is part of that optimization procedure, with the objective of performing a parametric study of manifolds that will be particularly useful during the concept work on the turbines for the FLPP program.</p><p>The methods of probabilistic analysis have been employed in this study. This approach involves Ishikawa (cause-and-effect) analysis as well as deriving transfer functions by defining and performing simulations in a structured manner according to a Design of Experiments model. The transfer functions, derived through a series of finite element analyses, describe the relation between design parameters and stress levels. A transfer function can be considered a simplified physical model that is applicable only within the range of the design parameters used. Transfer functions are especially powerful when performing Monte Carlo simulations to determine the likelihood of plasticity.</p><p>One shortcoming of transfer functions is that only the parameters included from the beginning can be altered and assessed. One also has to consider the simplifications introduced through the modelling; for example, transfer functions derived from linear elastic simulations cannot be used to assess plastic deformations.
The method developed in this thesis will be further developed in subsequent studies. This report is therefore meant to serve as a guide for the next investigator at Volvo Aero Corporation.</p>
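The transfer-function-plus-Monte-Carlo workflow described in this abstract can be sketched as follows: a linear transfer function (with invented coefficients standing in for one fitted to a small Design of Experiments of finite element runs) maps design parameters to a stress level, and Monte Carlo sampling of parameter scatter estimates the likelihood of exceeding yield. None of the numbers below are Volvo data.

```python
import random

def stress_mpa(thickness_mm, pressure_mpa):
    # Hypothetical linear transfer function, standing in for one fitted
    # to finite element results via Design of Experiments.
    return 150.0 - 40.0 * (thickness_mm - 3.0) + 25.0 * (pressure_mpa - 10.0)

def probability_of_yield(yield_mpa=220.0, n=50_000, seed=1):
    """Monte Carlo estimate of the likelihood of plasticity: sample
    manufacturing and operating scatter, count yield exceedances."""
    rng = random.Random(seed)
    over = sum(
        1 for _ in range(n)
        if stress_mpa(rng.gauss(3.0, 0.1), rng.gauss(10.0, 1.0)) > yield_mpa
    )
    return over / n
```

The transfer function is thousands of times cheaper to evaluate than a finite element run, which is exactly what makes this kind of Monte Carlo loop affordable; the caveat from the abstract still applies, since a linear elastic fit says nothing about behaviour beyond yield.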
|
413 |
Photonic microwave processor based on fiber optical parametric amplifier. Li, Jia. January 2009 (has links)
Thesis (M. Phil.)--University of Hong Kong, 2009. / Includes bibliographical references. Also available in print.
|
414 |
Sensor Validation Using Linear Parametric Models, Artificial Neural Networks and CUSUM / Sensorvalidering medelst linjära parametriska modeller, artificiella neurala nätverk och CUSUM. Norman, Gustaf. January 2015 (has links)
Siemens gas turbines are monitored and controlled by a large number of sensors and actuators. Process information is stored in a database and used for offline calculations and analyses. Before storing the sensor readings, a compression algorithm checks the signal and skips values that show no significant change; compression of 90% is not unusual. Since data from the database is used for analyses, and decisions are made based on the results of these analyses, it is important to have a system for validating the data in the database. Decisions made on false information can result in large economic losses. When this project was initiated, no sensor validation system was available. In this thesis the uncertainties in measurement chains are revealed, methods for fault detection are investigated, and the most promising methods are put to the test. Linear relationships between redundant sensors are derived, and the residuals form an influence structure allowing the faulty sensor to be isolated. Where redundant sensors are not available, a gas turbine model is utilized to state the input-output relationships so that estimates of the sensor outputs can be formed. Linear parametric models and an ANN (Artificial Neural Network) are developed to produce the estimates. Two techniques for the linear parametric models are evaluated: prediction and simulation. The residuals are also evaluated in two ways: direct evaluation against a threshold, and evaluation with the CUSUM (CUmulative SUM) algorithm. The results show that sensor validation using compressed data is feasible; faults as small as 1% of the measuring range can be detected in many cases.
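The CUSUM evaluation of residuals mentioned in the abstract can be sketched in a few lines. The one-sided form below accumulates positive residual drift and raises an alarm when the sum crosses a threshold; the drift and threshold values here are illustrative tuning parameters, not those used in the thesis.

```python
def cusum(residuals, drift=0.5, threshold=5.0):
    """One-sided CUSUM on sensor residuals (measured - estimated).
    Returns the index of the first alarm, or None if no alarm fires."""
    g = 0.0
    for i, r in enumerate(residuals):
        g = max(0.0, g + r - drift)  # accumulate excess above the drift term
        if g > threshold:
            return i
    return None

# fault-free residuals hover near zero; a bias fault appears at sample 50
clean = [0.1, -0.2, 0.0, 0.15] * 25      # 100 samples, no alarm expected
faulty = clean[:50] + [1.5] * 50         # 1.5-unit sensor bias from sample 50
```

A practical detector runs a mirrored copy on the negated residuals as well, so that both positive and negative sensor biases are caught; the drift term is what lets small faults accumulate slowly while ignoring noise.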
|
415 |
Modeling the Behavior of an Electronically Switchable Directional Antenna for Wireless Sensor Networks. Silase, Geletu Biruk. January 2011 (has links)
Reducing power consumption is among the top concerns in Wireless Sensor Networks, since the lifetime of a Wireless Sensor Network depends on its power consumption. Directional antennas help achieve this goal: in contrast to the commonly used omnidirectional antennas, which radiate electromagnetic power equally in all directions, they concentrate the radiated power in particular directions. This enables increased communication range at no additional energy cost and reduces contention on the wireless medium. The SPIDA (SICS Parasitic Interference Directional Antenna) prototype is one of the few real-world prototypes of electronically switchable directional antennas for Wireless Sensor Networks. However, building several SPIDA prototypes and conducting real-world experiments with them may be expensive and impractical. Modeling SPIDA based on real-world experiments avoids these expenses by enabling simulation of large networks equipped with SPIDA. Such a model allows researchers to develop new algorithms and protocols that take advantage of directional communication in existing Wireless Sensor Network simulators. In this thesis, a model of SPIDA for Wireless Sensor Networks is built from thoroughly designed real-world experiments. The model is probabilistic, accounting for variations in measurements, imperfections in the prototype construction, and fluctuations in experimental settings that affect the values of the measured metrics. Given the two-dimensional distance coordinates of a point and the configuration of SPIDA as inputs, the model returns the signal strength and packet reception rate from a node equipped with SPIDA at that point. The model can be integrated into existing Wireless Sensor Network simulators to foster research on new algorithms and protocols that exploit directional communication.
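The shape of such a model, a function from distance and antenna configuration to signal strength and packet reception rate, can be sketched as below. This is a toy stand-in, not the thesis model: the path-loss exponent, gain pattern, noise level, and reception threshold are all invented assumptions, chosen only to show how a probabilistic directional model plugs into a simulator.

```python
import math
import random

def spida_model(distance_m, angle_deg, pointing_deg, seed=None):
    """Toy probabilistic directional-antenna model: log-distance path
    loss plus a directional gain that peaks when the antenna points at
    the receiver, with Gaussian measurement scatter. Every constant is
    an illustrative assumption, not measured SPIDA data."""
    rng = random.Random(seed)
    gain_db = 6.0 * math.cos(math.radians(angle_deg - pointing_deg))
    rssi = -40.0 - 25.0 * math.log10(max(distance_m, 1.0)) + gain_db
    rssi += rng.gauss(0.0, 2.0)  # experiment-to-experiment fluctuation
    prr = 1.0 / (1.0 + math.exp(-(rssi + 85.0)))  # soft cutoff near -85 dBm
    return rssi, prr
```

In the real model these quantities would come from the fitted experimental data rather than closed-form expressions, but the interface (position and configuration in, signal strength and packet reception rate out) matches what the abstract describes.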
|
416 |
Statistical environmental models: Hurricanes, lightning, rainfall, floods, red tide and volcanoes. Wooten, Rebecca Dyanne. 01 June 2006
This study consists of developing descriptive, parametric, linear and non-linear statistical models for such natural phenomena as hurricanes, lightning, flooding, red tide and volcanic fallout. The focus of the research is determining the stochastic nature of phenomena in the environment. These statistical models are necessary to address the variability of nature and the shortcomings of deterministic models, particularly given the need to estimate the occurrence of such events and prepare for their aftermath.
The relationship between statistics and physics, examined through the correlation of wind speed with pressure versus wind speed with temperature, plays a significant role in hurricane prediction. Contrary to previous studies, this study indicates that a drop in pressure is a result of the storm rather than a cause, and shows that temperature, in conjunction with a drop in pressure, is a key indicator that a storm will form.
This study demonstrates a model that estimates the wind speed within a storm with a high degree of accuracy. With the verified model, we can perform response surface analysis to estimate the conditions under which the wind speed is maximized or minimized. Additional studies introduce a model that estimates the number of lightning strikes as a function of significantly contributing factors such as precipitable water, the temperatures within a column of air, and the temperature range. Using extreme value distributions and historical data, we can fit flood stages and obtain profile estimates of return periods. The natural logarithm of Karenia brevis counts was used to homogenize the variance and create the base for an index of the magnitude of red tide outbreaks. We introduce a logistic growth model that describes outbreak behavior as a function of time and characterizes the growth rate of red tide.
This information can be used to develop strategic plans with respect to public health and to minimize the economic impact. Studying the bivariate nature of tephra fallout from volcanoes, we analyze the correlation between the northern and eastern directions of a topographic map to find the best possible probabilistic characterization of the subject data.
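The logistic growth model mentioned for red tide has a standard closed form, K / (1 + exp(-r (t - t0))), where K is the saturation level, r the growth rate, and t0 the midpoint of the outbreak. The sketch below evaluates that curve; the parameter values in the example are invented, not fitted to the study's Karenia brevis data.

```python
import math

def log_count(t, carrying_capacity, growth_rate, midpoint):
    """Logistic growth curve K / (1 + exp(-r * (t - t0))), here standing
    in for a model of the natural-log cell count over time. The curve
    equals K/2 at the midpoint and approaches K as t grows."""
    return carrying_capacity / (1.0 + math.exp(-growth_rate * (t - midpoint)))
```

The growth rate is steepest at the midpoint, which is what makes the fitted r a natural single-number characterization of how fast an outbreak develops.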
|
417 |
Ranking-Based Methods for Gene Selection in Microarray Data. Chen, Li. 21 March 2006
DNA microarrays are used to monitor the expression levels of thousands of genes simultaneously and to identify those genes that are differentially expressed. One of the major goals of microarray data analysis is the detection of genes that are differentially expressed across two kinds of tissue samples, or across samples obtained under two experimental conditions. A large number of gene detection methods have been developed, most of them based on statistical analysis. However, statistical methods are limited by the small sample sizes and the unknown distribution and error structure of microarray data. This thesis studies ranking-based gene selection methods, which make weak assumptions about the data. Three approaches are proposed to integrate the individual ranks and select differentially expressed genes in microarray data. Experiments on simulated and biological microarray data show that the ranking-based methods outperform the t-test and SAM in selecting differentially expressed genes, especially when the sample size is small.
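One simple way to integrate individual ranks, shown below, is to average each gene's rank across replicates and re-sort. The abstract does not specify which three integration schemes the thesis evaluates, so this is a generic illustration with invented gene names and ranks.

```python
def aggregate_ranks(rank_lists):
    """Combine per-replicate gene ranks by averaging, then order genes by
    aggregate rank. Lower aggregate rank = stronger, more consistent
    evidence of differential expression across replicates."""
    genes = list(rank_lists[0].keys())
    avg = {g: sum(r[g] for r in rank_lists) / len(rank_lists) for g in genes}
    return sorted(genes, key=lambda g: avg[g])

# hypothetical ranks from three replicates (1 = most differentially expressed)
reps = [
    {"g1": 1, "g2": 3, "g3": 2},
    {"g1": 2, "g2": 3, "g3": 1},
    {"g1": 1, "g2": 2, "g3": 3},
]
ordered = aggregate_ranks(reps)
```

Because only the ordering of expression values enters the computation, this kind of method needs no assumption about the distribution or error structure of the raw intensities, which is the property the abstract highlights.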
|
418 |
Parametric study of LCROSS impact plume. Lamb, Justin Meredith. 04 April 2014
In 2009, NASA's LCROSS mission impacted Cabeus Crater near the lunar south pole with the spent Centaur upper-stage rocket. The impact was observed by the trailing shepherding spacecraft (S-S/C), which impacted the Moon 250 seconds after the Centaur. The main objective of the LCROSS mission was to verify the existence of water ice in the lunar regolith, and the subsequent analysis of the data confirmed water ice present in the crater. Analysis of the S-S/C instrument data suggested that the plume consisted of two components: a central "spike" component and a thin, outward "cone" component. A model has been developed at The University of Texas at Austin to improve the analysis of the data obtained by the S-S/C. The model is built on a free-molecular ballistic grain code that simulates individual regolith grains in the debris plume through grain-heating and grain-movement models, and then models the spectral radiance properties of the grains as observed by the S-S/C. Mie scattering theory is used to model the scattering and absorption of incoming solar radiation by the particles in the plume, assuming they are perfect spheres. The UT LCROSS code was used in a parametric study that evaluated the effect of variations in assumed plume parameters on the modeling of the S-S/C UV-VIS instrument observations. The plume parameters were chosen based on the assumption that the dust plume was split into two components: a central spike and a surrounding high-angle cone. The following parameters were varied: the spike and cone angles, the spike and cone grain radius distributions, and the spike mass fraction. The following parameters could be varied but were given fixed values: the ice fraction between plume components, ice grain purity, albedo, and the ice fraction in the plume. The impact of these plume parameters on plume brightness and blue/red color ratio was determined. Two grain models were used: in the initial grain species model, all grains have a soil core surrounded by a thin ice shell; in the second, two-species model, two grain types were used: a pure ice grain component and a pure soil grain component.
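The "free-molecular ballistic" treatment means each grain, once launched, follows a drag-free trajectory under lunar gravity alone. A minimal flat-surface sketch of that kinematics is below; it ignores grain heating, surface curvature, and the crater geometry, and the launch values in the test are invented, so it only illustrates the ballistic building block of such a code.

```python
import math

G_MOON = 1.62  # lunar surface gravity, m/s^2

def grain_flight(v0_ms, angle_deg):
    """Flight time and downrange distance of one regolith grain on an
    airless body (flat-surface approximation, no drag). Returns
    (time_of_flight_s, range_m)."""
    vz = v0_ms * math.sin(math.radians(angle_deg))  # vertical launch speed
    vx = v0_ms * math.cos(math.radians(angle_deg))  # horizontal launch speed
    t_flight = 2.0 * vz / G_MOON                    # up and back down
    return t_flight, vx * t_flight
```

A plume code repeats this for large ensembles of grains sampled from the assumed angle and size distributions, then integrates the sunlight each grain scatters toward the observing instrument.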
|
419 |
Optical parametric processes in biophotonics and microwave photonics applications. Cheung, Ka-yi (張嘉兒). January 2010 (has links)
Published or final version / Electrical and Electronic Engineering / Master of Philosophy
|
420 |
Topology optimization for additive manufacturing of customized meso-structures using homogenization and parametric smoothing functions. Sundararajan, Vikram Gopalakrishnan. 16 February 2011
Topology optimization tools are useful for distributing material in a geometric domain to match targets for mass, displacement, structural stiffness, and other characteristics as closely as possible. They are especially applicable to additive manufacturing, which provides nearly unlimited freedom for customizing the internal and external architecture of a part. Existing topology optimization tools, however, do not take full advantage of the capabilities of additive manufacturing. Prominent tools use micro- or meso-scale voids or artificial materials to parameterize the topology optimization problem, but they apply filters, penalization functions, and other schemes to force convergence to regions of fully dense (solid) material and fully void (open) space in the final structure, as a means of accommodating conventional manufacturing processes. Since additive manufacturing processes are capable of fabricating intermediate densities (e.g., via porous mesostructures), significant performance advantages could be achieved by preserving and exploiting those features during topology optimization. Towards this goal, a topology optimization tool has been created by combining homogenization with parametric smoothing functions. Rectangular mesoscale voids represent the material topology, and homogenization is used to analyze its properties. B-spline-based parametric smoothing functions control the size of the voids throughout the design domain, smoothing the topology and reducing the number of required design variables relative to homogenization-based approaches. The resulting designs are fabricated with selective laser sintering, and their geometric and elastic properties are evaluated experimentally.
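The B-spline smoothing idea can be illustrated in one dimension. A uniform cubic B-spline blends four neighboring control values into a smooth field, so a handful of control points can set the void size everywhere in the domain. The blending weights below are the standard uniform cubic B-spline basis; the use of the evaluated value as a "void size" is an illustrative reading of the approach, not code from the thesis.

```python
def cubic_bspline_weights(u):
    """Uniform cubic B-spline blending weights for local parameter
    u in [0, 1). Returns weights for the four surrounding control
    points; the weights always sum to 1."""
    w0 = (1.0 - u) ** 3 / 6.0
    w1 = (3.0 * u**3 - 6.0 * u**2 + 4.0) / 6.0
    w2 = (-3.0 * u**3 + 3.0 * u**2 + 3.0 * u + 1.0) / 6.0
    w3 = u**3 / 6.0
    return w0, w1, w2, w3

def void_size(u, c0, c1, c2, c3):
    """Smoothed void-size field from four control values: a few design
    variables yield a continuously varying size across the span."""
    w = cubic_bspline_weights(u)
    return w[0] * c0 + w[1] * c1 + w[2] * c2 + w[3] * c3
```

Because the weights are smooth in u and sum to one, neighboring voids change size gradually rather than jumping between solid and open, which is exactly the intermediate-density behavior the tool aims to preserve for additive manufacturing.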
|