391 |
Mantis The Black-Box Scanner: Finding XSS vulnerabilities through parse errors. Liljebjörn, Johan; Broman, Hugo. January 2020 (has links)
Abstract [en] Background. Penetration testing is a good technique for finding web vulnerabilities, and vulnerability scanners are often used to aid with security testing. As the scope of scans increases, it is becoming more difficult for scanners to finish in a reasonable amount of time. The problem is that vulnerability scanners rely on fuzzing to find vulnerabilities, and fuzzing has several drawbacks: it generates a lot of network traffic; scans can be excruciatingly slow; and detection is limited when the reflected string is modified by filtering or sanitization. Objectives. This thesis aims to investigate whether an XSS vulnerability scanner can be made more scalable than the current state-of-the-art. The idea is to examine how reflected parameters can be detected, and whether a different methodology can improve the detection of XSS vulnerabilities. The proposed vulnerability scanner is named Mantis. Methods. The research methods used in this thesis are a literature review and an experiment. In the literature review, we collected information about the investigated problem to help us analyze the identified research gaps. The experiment compared the proposed vulnerability scanner with the current state-of-the-art on the OWASP Benchmark dataset. Results. The results show that reflected parameters can be reliably detected using approximate string matching. Using the parameter mapping, it was possible to detect reflected XSS vulnerabilities to a great extent. Mantis had an average scan time of 78 seconds, OWASP ZAP 95 seconds, and Arachni 17 minutes. The dataset contained a total of 246 XSS vulnerabilities. Mantis detected the most at 213 vulnerabilities, Arachni detected 183, and OWASP ZAP 137. None of the scanners produced any false positives. Conclusions. Mantis has proven to be an efficient vulnerability scanner for detecting XSS vulnerabilities. Focusing on the set of characters that may lead to the exploitation of XSS has proven to be a strong alternative to fuzzing.
More testing of Mantis is needed to determine the usability of the vulnerability scanner in a real-world scenario. We believe the scanner has the potential to be a great asset for penetration testers in their work.
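The core idea behind detecting reflected parameters with approximate string matching can be sketched in a few lines. The snippet below is an illustrative reconstruction, not Mantis's actual code: it uses Python's difflib to find near-matches of an injected probe string in a response body and then checks which XSS-relevant characters survived filtering.

```python
import difflib

# Characters whose unfiltered reflection can enable XSS injection.
PROBE_CHARS = '<>"\'`'

def find_reflections(payload, body, threshold=0.8):
    """Locate (possibly modified) reflections of `payload` in `body`.

    A matching block counts as a reflection when it covers at least
    `threshold` of the payload, so partially filtered echoes still match.
    """
    matcher = difflib.SequenceMatcher(None, payload, body, autojunk=False)
    hits = []
    for block in matcher.get_matching_blocks():
        if block.size / max(len(payload), 1) >= threshold:
            hits.append(body[block.b:block.b + block.size])
    return hits

def surviving_probe_chars(reflection):
    """Report which dangerous characters survived filtering/sanitization."""
    return {c for c in PROBE_CHARS if c in reflection}

# Intact reflection: the probe string comes back unmodified.
hits = find_reflections("zqx7<w>", "<html><body>v=zqx7<w> ok</body></html>")
# HTML-encoded reflection: '<' and '>' were sanitized, so no full match.
filtered = find_reflections("zqx7<w>", "<html><body>v=zqx7&lt;w&gt;</body></html>")
```

A scanner built on this idea sends one unique probe per parameter, maps which parameters reflect, and only then tests the small set of dangerous characters, avoiding blind fuzzing.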
|
392 |
Online Parameter Learning for Structural Condition Monitoring System. Unknown Date (has links)
The purpose of online parameter learning and modeling is to validate and recover the properties of a structure from legitimate observations. Online parameter learning assists in determining the unidentified characteristics of a structure by offering enhanced predictions of the vibration responses of the system. Through modeling, predicted outcomes can be produced from a minimal number of measurements and compared to the true response of the system. In this simulation study, the Kalman filter technique is used to produce sets of predictions and to infer the stiffness parameter from noisy measurements. From this, the performance of online parameter identification can be tested with respect to different noise levels. This research is based on simulation work showcasing how effective Kalman filtering techniques are in dealing with analytical uncertainties of data. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2020. / FAU Electronic Theses and Dissertations Collection
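As a concrete illustration of the approach described above (a sketch under assumed values, not the thesis's code), a scalar Kalman filter can infer an unknown stiffness treated as a random-walk parameter, observed through noisy force measurements at known displacements. The stiffness value, noise levels, and displacement sequence below are all hypothetical.

```python
import random

def kalman_parameter_estimate(displacements, forces, r, q=1e-6, k0=0.0, p0=1e6):
    """Scalar Kalman filter treating the stiffness k as a (nearly constant)
    random-walk state, observed through force = k * displacement + noise."""
    k, p = k0, p0
    for x, y in zip(displacements, forces):
        p += q                        # predict: parameter random walk
        h = x                         # measurement matrix (force = k * x)
        s = h * p * h + r             # innovation covariance
        gain = p * h / s              # Kalman gain
        k += gain * (y - h * k)       # update with measurement residual
        p *= (1 - gain * h)           # covariance update
    return k

random.seed(1)
true_k = 2500.0                                   # N/m, hypothetical stiffness
xs = [0.01 * (i % 10 + 1) for i in range(200)]    # known displacements (m)
ys = [true_k * x + random.gauss(0, 5.0) for x in xs]  # noisy force readings (N)
estimate = kalman_parameter_estimate(xs, ys, r=25.0)
```

Raising the measurement noise variance `r` slows convergence, which is one way to study performance under different noise levels as the abstract describes.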
|
393 |
Parameter Estimation for Generalized Pareto Distribution. Lin, Der-Chen. 01 May 1988 (has links)
The generalized Pareto distribution was introduced by Pickands (1975). Three methods of estimating its parameters were compared by Hosking and Wallis (1987): maximum likelihood, the method of moments, and probability-weighted moments.
An alternative method of estimation for the generalized Pareto distribution, based on least squares regression of expected order statistics (REOS), is developed and evaluated in this thesis. A Monte Carlo comparison is made between this method and the estimating methods considered by Hosking and Wallis (1987). The REOS method is shown to be generally superior to maximum likelihood, the method of moments, and probability-weighted moments.
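For reference, the method-of-moments baseline mentioned above follows directly from the GPD mean and variance relations (valid for shape xi < 1/2). The sketch below samples from the GPD by inverting its CDF and recovers the parameters; it illustrates the standard moment estimators, not the thesis's REOS method.

```python
import random

def gpd_sample(n, xi, sigma, rng):
    """Inverse-CDF sampling from the generalized Pareto distribution
    (xi != 0): x = sigma/xi * ((1 - u)^(-xi) - 1)."""
    return [sigma / xi * ((1 - rng.random()) ** -xi - 1) for _ in range(n)]

def gpd_fit_moments(data):
    """Method-of-moments estimates of the GPD shape (xi) and scale (sigma).

    Uses mean = sigma/(1-xi) and mean^2/var = 1 - 2*xi, so it requires
    xi < 1/2 for the variance to exist.
    """
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    ratio = mean * mean / var          # equals 1 - 2*xi in expectation
    xi = 0.5 * (1 - ratio)
    sigma = 0.5 * mean * (1 + ratio)
    return xi, sigma

rng = random.Random(42)
data = gpd_sample(20_000, xi=0.2, sigma=1.0, rng=rng)
xi_hat, sigma_hat = gpd_fit_moments(data)
```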
|
394 |
Modeling, Identification, and Control of an Unmanned Surface Vehicle. Sonnenburg, Christian R. 16 January 2013 (has links)
This dissertation addresses the modeling, identification, and control of an automated planing vessel. To provide motion models for trajectory generation and to enable model-based control design for trajectory tracking, several experimentally identified models are compared over a wide range of speed and planing conditions for the Virginia Tech Ribcraft Unmanned Surface Vehicle. The modeling and identification objective is to determine a model which is sufficiently rich to enable effective model-based control design and trajectory optimization, sufficiently simple to allow parameter identification, and sufficiently general to describe a variety of hull forms and actuator configurations. Beginning with a 6 degree of freedom nonlinear dynamic model, several linear steering and speed models are obtained as well as a thruster model.
The Ribcraft USV tracks trajectories generated with the selected maneuvering models by using a backstepping trajectory controller. A PD cascade trajectory control law is also developed, and the performance of the two controllers is compared on aggressive trajectories. The backstepping control law compares favorably to the PD cascade controller. The backstepping control law is then further modified to account for nonlinear sternward dynamics and for a constant or slowly varying fluid flow. / Ph. D.
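The flavor of the linear steering models identified in the dissertation can be sketched with a first-order Nomoto model under PD heading control. The gains, time constant, and saturation limit below are hypothetical choices for illustration, not identified Ribcraft parameters.

```python
import math

# First-order Nomoto steering model: T * r_dot + r = K * delta,
# where r is the yaw rate and delta the steering actuator angle (rad).
K, T = 0.5, 1.2        # hypothetical gain and time constant
KP, KD = 4.0, 2.5      # PD gains, hand-tuned for this sketch
DT = 0.01              # integration step (s)

def simulate_heading_step(psi_ref, steps=2000):
    """Forward-Euler simulation of a heading step response under PD control."""
    psi, r = 0.0, 0.0
    for _ in range(steps):
        err = psi_ref - psi
        delta = KP * err - KD * r            # PD law on heading error
        delta = max(-0.6, min(0.6, delta))   # actuator saturation (rad)
        r += DT * (K * delta - r) / T        # Nomoto yaw dynamics
        psi += DT * r                        # heading integrates yaw rate
    return psi

final_heading = simulate_heading_step(math.radians(30))
```

Because the heading integrates the yaw rate, the loop has zero steady-state error to a step command; trajectory tracking in the dissertation wraps richer dynamics and control laws around this same structure.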
|
395 |
Image Based Computational Hemodynamics for Non-Invasive and Patient-Specific Assessment of Arterial Stenosis. Khan, Md Monsurul Islam. 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / While computed tomographic angiography (CTA) has emerged as a powerful noninvasive option that allows for direct visualization of an arterial stenosis (AS), it cannot assess the hemodynamic abnormality caused by an AS. Alternatively, the trans-stenotic pressure gradient (TSPG) and fractional flow reserve (FFR) are well-validated hemodynamic indices for assessing the ischemic severity of an AS. However, they have significant restrictions in practice due to invasiveness and high cost. To fill the gap, a new computational modality, called InVascular, has been developed for non-invasive quantification of TSPG and/or FFR based on the patient's CTA, aiming to quantify the hemodynamic abnormality of the stenosis and help assess the therapeutic/surgical benefits of treatment for the patient. Such a new capability gives rise to the potential of computation-aided diagnostics and therapeutics in a patient-specific environment for ASs, which is expected to contribute to precision planning for cardiovascular disease treatment. InVascular integrates computational modeling of diseased arteries based on CTA and Doppler ultrasonography data with cutting-edge Graphics Processing Unit (GPU) parallel-computing technology. Revolutionarily fast computing speed enables noninvasive quantification of TSPG and/or FFR for an AS within a clinically permissible time frame. In this work, we focus on the implementation of inlet and outlet boundary conditions (BCs) based on physiological image data, a 3-element Windkessel model, and a lumped parameter network in the volumetric lattice Boltzmann method. An application study in real human coronary and renal arterial systems demonstrates the reliability of the in vivo pressure quantification through comparisons of pressure waves between noninvasive computation and invasive measurement.
In addition, parametrization of worsening renal arterial stenosis (RAS) and coronary arterial stenosis (CAS), characterized by volumetric lumen reduction (S), enables establishing the correlation between TSPG/FFR and S, from which the ischemic severity of the AS (mild, moderate, or severe) can be identified. In this study, we quantify TSPG and/or FFR for five patient cases with visualized stenoses in coronary and renal arteries and compare the non-invasive computational results with invasive measurements obtained through catheterization. The ischemic severity of each AS is predicted. The results of this study demonstrate the reliability and clinical applicability of InVascular.
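The 3-element (RCR) Windkessel outlet boundary condition mentioned above can be illustrated with a minimal lumped model: a proximal resistance in series with a compliance-distal-resistance pair. The sketch below integrates it with forward Euler under a toy pulsatile inflow; all parameter values are hypothetical, not patient-derived.

```python
import math

# Three-element Windkessel: proximal resistance Rp, compliance C, distal
# resistance Rd.  Dynamics: C * dPc/dt = Q(t) - Pc/Rd;  P_in = Pc + Rp * Q(t).
Rp, C, Rd = 0.05, 1.0, 1.0     # hypothetical units (mmHg*s/mL, mL/mmHg)
PERIOD, DT = 0.8, 1e-4         # cardiac period (s), time step (s)

def inflow(t):
    """Toy pulsatile inflow: half-sine systole (0.3 s), zero in diastole."""
    tau = t % PERIOD
    return 300.0 * math.sin(math.pi * tau / 0.3) if tau < 0.3 else 0.0

def simulate(n_beats=20):
    """Return (diastolic, systolic) inlet pressure over the final beat."""
    pc, t = 80.0, 0.0
    p_in = []
    for _ in range(int(n_beats * PERIOD / DT)):
        q = inflow(t)
        pc += DT * (q - pc / Rd) / C      # forward-Euler compliance update
        t += DT
        p_in.append(pc + Rp * q)
    last_beat = p_in[-int(PERIOD / DT):]  # after start-up transients decay
    return min(last_beat), max(last_beat)

dias, syst = simulate()
```

In a solver like the one described, the compliance pressure `pc` is advanced each step from the computed outlet flow and fed back as the outlet pressure boundary condition.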
|
396 |
Algorithms For Low-Distortion Embeddings Into Geometrically Restricted Spaces. Carpenter, Timothy E. 30 August 2019 (has links)
No description available.
|
397 |
Modeling and Uncertainty Analysis of CCHP systems. Smith, Joshua Aaron. 15 December 2012 (has links)
Combined Cooling Heating and Power (CCHP) systems have been recognized as a viable alternative to conventional electrical and thermal energy generation in buildings because of their high efficiency, low environmental impact, and power grid independence. Many researchers have presented models for comparing CCHP systems to conventional systems and for optimizing CCHP systems. However, many of the errors and uncertainties that affect these modeling efforts have not been adequately addressed in the literature. This dissertation will focus on the following key issues related to errors and uncertainty in CCHP system modeling: (a) detailed uncertainty analysis of a CCHP system model with novel characterization of weather patterns, fuel prices and component efficiencies; (b) sensitivity analysis of a method for estimating the hourly energy demands of a building using Department of Energy (DOE) reference building models in combination with monthly utility bills; (c) development of a practical technique for selecting the optimal Power Generation Unit (PGU) for a given building that is robust with respect to fuel cost and weather uncertainty; (d) development of a systematic method for integrated calibration and parameter estimation of thermal system models. The results from the detailed uncertainty analysis show that CCHP operational strategies can effectively be assessed using steady state models with typical year weather data. The results of the sensitivity analysis reveal that the DOE reference buildings can be adjusted using monthly utility bills to represent the hourly energy demands of actual buildings. The optimal PGU sizing study illustrates that the PGU can be selected for a given building in consideration of weather and fuel cost uncertainty. The results of the integrated parameter estimation study reveal that using the integrated approach can reduce the effect of measurement error on the accuracy of predictive thermal system models.
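The kind of uncertainty propagation described in item (a) can be sketched with a toy Monte Carlo: sample uncertain fuel prices and PGU efficiencies, push each draw through a simple operating-cost model, and summarize the resulting cost distribution. The cost model and every numerical value below are illustrative assumptions, not the dissertation's models or data.

```python
import random
import statistics

def annual_fuel_cost(fuel_price, pgu_eff, elec_demand=500_000.0):
    """Toy cost model: the PGU meets the building's electric demand (kWh);
    fuel burned scales with 1/efficiency.  All values are illustrative."""
    return (elec_demand / pgu_eff) * fuel_price

def monte_carlo_cost(n=20_000, seed=7):
    """Propagate fuel-price and PGU-efficiency uncertainty to annual cost."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n):
        price = rng.gauss(0.04, 0.006)                     # $/kWh of fuel
        eff = min(max(rng.gauss(0.30, 0.02), 0.20), 0.40)  # PGU efficiency
        costs.append(annual_fuel_cost(price, eff))
    return statistics.mean(costs), statistics.stdev(costs)

mean_cost, sd_cost = monte_carlo_cost()
```

A full analysis like the dissertation's would replace the toy cost model with hourly building demands and weather-driven loads, but the sampling structure stays the same.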
|
398 |
Parameter identifiability of ARX models via discrete time nonlinear system controllability. Özbay, Hitay. January 1987 (has links)
No description available.
|
399 |
STRUCTURAL UNCERTAINTY IN HYDROLOGICAL MODELS. Abhinav Gupta (11185086). 28 July 2021 (has links)
All hydrological models incur various uncertainties that can be broadly classified into three categories: measurement, structural, and parametric uncertainties. Measurement uncertainty exists due to error in measurements of properties and variables (e.g. streamflow, which is typically an output, and rainfall, which serves as an input to hydrological models). Structural uncertainty exists due to errors in the mathematical representation of real-world hydrological processes. Parametric uncertainty exists due to structural and measurement uncertainty and the limited amount of data available for calibration.

Several studies have addressed the problem of measurement and parametric uncertainties, but studies on structural uncertainty are lacking. Specifically, there does not exist any model that can be used to quantify structural uncertainties at an ungauged location. This was the first objective of the study: to develop a model of structural uncertainty that can be used to quantify total uncertainty (including structural uncertainty) in streamflow estimates at ungauged locations in a watershed. The proposed model is based on the idea that, since the effect of structural uncertainty is to introduce a bias into the parameter estimation, one way to accommodate structural uncertainty is to compensate for this bias. The developed model was applied to two watersheds: the Upper Wabash Busseron Watershed (UWBW) and the Lower Des Plaines Watershed (LDPW). For UWBW, mean daily streamflow data were used, while for LDPW mean hourly streamflow data were used. The proposed model worked well for mean daily data but failed to capture the total uncertainties for hourly data, likely due to higher measurement uncertainties in hourly streamflow data than were assumed in the study.

Once a hydrological and error model is specified, the next step is to estimate model and error parameters.
Parameter estimation in hydrological modeling may be carried out using either formal or informal Bayesian methodology. In formal Bayesian methodology, a likelihood function, motivated from probability theory, is specified over a space of models (or residuals), and a prior probability distribution is assigned over the space of models. There has been significant debate on whether the likelihood functions used in Bayesian theory are justified in hydrological modeling, but relatively little attention has been given to the justification of prior probabilities. In most hydrological modeling studies, a uniform prior over hydrological model parameters is used to reflect a modeler's complete lack of knowledge about model parameters before calibration. Such a prior is also known as a non-informative prior. The second objective of this study was to scrutinize the assumption that a uniform prior is non-informative, using the principle of maximum information gain. This principle was used to derive non-informative priors for several hydrological models, and it was found that the obtained priors were significantly different from a uniform prior. Further, the posterior distributions obtained by using these priors were significantly different from those obtained by using uniform priors.

The information about uncertainty in a modeling exercise is typically obtained from the residual time series (the difference between observed and simulated streamflows), which is an aggregate of structural and measurement uncertainties for a fixed model parameter set. Using this residual time series, an estimate of total uncertainty may be obtained, but it is impossible to separate structural and measurement uncertainties. The separation of these two uncertainties is, however, required to facilitate the rejection of deficient model structures, and to identify whether the model structure or the measurements need to be improved to reduce the total uncertainty.
The only way to achieve this goal is to obtain an estimate of measurement uncertainty before model calibration. An estimate of measurement uncertainty in streamflow can be obtained by rating-curve analysis, but it is difficult to obtain an estimate of measurement uncertainty in rainfall. In this study, the classic idea of repeated sampling is used to estimate measurement uncertainty in rainfall and streamflow. In the repeated sampling scheme, an experiment is performed several times under identical conditions to obtain an estimate of measurement uncertainty. This kind of repeated sampling, however, is not strictly possible for environmental observations; therefore, repeated sampling was applied in an approximate manner using a machine learning algorithm called random forest (RF). The main idea is to identify rainfall-runoff events across several different watersheds that are similar enough to each other that they can be thought of as different realizations of the same experiment performed under identical conditions. The uncertainty bounds obtained by RF were compared against the uncertainty bands obtained by rating-curve analysis and the runoff-coefficient method. Overall, the results of this study are encouraging for the use of RF as a pseudo repeated sampler.

In the fourth objective, the importance of uncertainty in estimated streamflows at ungauged locations and of uncertainty in measured streamflows at gauged locations is illustrated for water quality modeling. The results showed that it is not enough to obtain an uncertainty bound that envelops the true streamflows; the individual realizations obtained by the model of uncertainty should also emulate the shape of the true streamflow time series for water quality modeling.
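The formal Bayesian machinery discussed above (a likelihood over residuals plus a prior over parameters) can be made concrete with a toy example: calibrating the recession constant of a linear-reservoir runoff model by grid approximation, under both a uniform prior and a 1/k prior. The model, data, and priors below are illustrative assumptions, not the dissertation's watershed models or maximum-information-gain priors.

```python
import math
import random

def simulate_runoff(k, storages):
    """Toy linear-reservoir model: discharge Q = k * storage."""
    return [k * s for s in storages]

def grid_posterior_mean(storages, observed, sigma, prior):
    """Grid-approximate Bayesian posterior mean of the recession constant k,
    using a Gaussian likelihood on the residuals."""
    ks = [0.01 + 0.001 * i for i in range(500)]
    logpost = []
    for k in ks:
        loglik = sum(-0.5 * ((q - k * s) / sigma) ** 2
                     for s, q in zip(storages, observed))
        logpost.append(loglik + math.log(prior(k)))
    m = max(logpost)                          # stabilize the exponentials
    w = [math.exp(lp - m) for lp in logpost]
    z = sum(w)
    return sum(k * wi for k, wi in zip(ks, w)) / z

rng = random.Random(0)
true_k, sigma = 0.2, 0.5
storages = [rng.uniform(5, 50) for _ in range(40)]
observed = [q + rng.gauss(0, sigma) for q in simulate_runoff(true_k, storages)]

mean_uniform = grid_posterior_mean(storages, observed, sigma, lambda k: 1.0)
mean_inv_k = grid_posterior_mean(storages, observed, sigma, lambda k: 1.0 / k)
```

With this much informative data the two priors give nearly identical posteriors; the dissertation's point is that with short records and real hydrological models, the choice of prior matters far more.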
|
400 |
A Survey of Applications of Spline Functions to Statistics. Mawk, Russell Lynn. 01 August 2001 (has links) (PDF)
This thesis provides a survey study on applications of spline functions to statistics. We start with a brief history of splines. Then, we discuss the application of splines to statistics as they are applied today. Several topics included in the discussion are splines, spline regression, spline smoothing, and estimating the smoothing parameter for spline regression. Also, we give a very brief discussion of multivariate splines in statistics and wavelets in statistics. Both of these topics are currently subjects for continuing research by many mathematicians.
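Spline regression, one of the topics surveyed, can be illustrated in its simplest form: a linear spline with one interior knot, fit by ordinary least squares in the truncated-power basis {1, x, (x - knot)+}. The sketch below solves the 3x3 normal equations directly; the data and knot location are hypothetical.

```python
import random

def truncated_basis(x, knot):
    """Truncated-power basis for a one-knot linear spline."""
    return [1.0, x, max(0.0, x - knot)]

def fit_linear_spline(xs, ys, knot):
    """Least-squares fit via the normal equations A^T A b = A^T y,
    solved with Gaussian elimination and partial pivoting."""
    rows = [truncated_basis(x, knot) for x in xs]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, 3):
            f = ata[r][col] / ata[col][col]
            for c in range(col, 3):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back-substitution
        beta[r] = (aty[r] - sum(ata[r][c] * beta[c]
                                for c in range(r + 1, 3))) / ata[r][r]
    return beta

def predict(beta, x, knot):
    return sum(b * f for b, f in zip(beta, truncated_basis(x, knot)))

rng = random.Random(3)
xs = [i / 10 for i in range(100)]             # x in [0, 9.9], kink at x = 5
ys = [(2 * x if x < 5 else 10 - 3 * (x - 5)) + rng.gauss(0, 0.1) for x in xs]
beta = fit_linear_spline(xs, ys, knot=5.0)
```

Here `beta[2]` is the change in slope at the knot; cubic smoothing splines generalize this by using higher-order pieces and penalizing roughness via the smoothing parameter the thesis discusses.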
|