  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
141

Rotate and Hold and Scan (RAHAS): Structured Light Illumination for Use in Remote Areas

Crane, Eli Ross 01 January 2011 (has links)
After discovering material culture in the field, archaeologists must document their findings with numerous physical measurements and photographs from varying perspectives. 3-D imaging is increasingly popular as a primary documentation method that can replace this battery of tests and measurements, but in remote areas 3-D imaging becomes cumbersome because of physical and environmental constraints. The RAHAS technique drastically reduces the difficulty of using a 3-D imaging system in such environments, since it acquires scans untethered from a computer. This thesis presents the RAHAS Structured Light Illumination technique for 3-D image acquisition and evaluates its performance as a measurement tool for documenting material culture during a field trip to the Rio Platano Biosphere in Honduras.
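RAHAS builds on Structured Light Illumination. As a hedged illustration only (the classic N-step phase-shifting decoder, not necessarily the exact pattern scheme RAHAS uses), the wrapped phase that encodes depth can be recovered from N shifted fringe images like so:

```python
import numpy as np

def recover_phase(images):
    # Wrapped phase from N equally phase-shifted fringe patterns
    # I_k = A + B*cos(phi - 2*pi*k/N): the standard N-step SLI decoder.
    n = len(images)
    shifts = 2 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(s) for img, s in zip(images, shifts))
    den = sum(img * np.cos(s) for img, s in zip(images, shifts))
    return np.arctan2(num, den)  # wrapped to (-pi, pi]

# Synthetic check: encode a known phase ramp into three shifted patterns,
# then decode it back.
phi_true = np.linspace(-3.0, 3.0, 200)
patterns = [0.5 + 0.4 * np.cos(phi_true - 2 * np.pi * k / 3) for k in range(3)]
phi_rec = recover_phase(patterns)
```

With three or more equally spaced shifts, the ambient term A and fringe contrast B cancel out of the arctangent, which is what makes the decoding robust to uneven field lighting.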
142

Apport des technologies d'imagerie non invasives dans l'évaluation du pronostic des pathologies cardiovasculaires. / Utility of non-invasive imaging techniques in evaluating the prognosis of cardiovascular disease

Chopard dit Jean, Romain 17 June 2014 (has links)
In this doctoral thesis, we report on five original studies that use three different non-invasive cardiovascular imaging techniques:
- In an ex vivo study of human coronary arteries, we show that 64-slice computed tomography (CT) is not capable of distinguishing between the different components of plaques; in particular, fibrous and lipid plaques cannot be differentiated. Our study also showed that intravascular ultrasound (IVUS) should not be used as the reference method in studies of plaque composition, since this technique also suffers from numerous limitations.
- Our study of the efficacy of thrombo-aspiration showed a significant benefit with effective extraction of thrombus during thrombo-aspiration at the acute phase of ST-elevation myocardial infarction (STEMI), notably a reduction of the extent of no-reflow and of infarct size as evaluated by magnetic resonance imaging (MRI). Productive thrombo-aspiration was shown in our study to be an independent predictor of final infarct size. Effective extraction of thrombotic material could be considered in the cath lab as a criterion for evaluating the success of the thrombo-aspiration procedure.
- Our study of acute coronary syndromes with angiographically normal coronary arteries confirmed the utility of MRI in establishing the etiology of this clinical presentation, making an etiological diagnosis possible in two-thirds of patients. We also observed excellent outcomes in the third of patients in whom MRI did not find any myocardial anomalies. Larger studies are warranted to confirm these findings.
- Based on cardiac MRI performed in patients presenting a first episode of STEMI, we established a threshold value of troponin that predicts the occurrence of no-reflow.
- Lastly, using speckle-tracking analysis, we demonstrated impaired right ventricular systolic function, evaluated by alterations in right ventricular longitudinal strain values, in patients with intermediate- to high-risk pulmonary embolism (PE), compared to a group of patients with low-risk PE.
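A predictive biomarker cut-off like the troponin threshold mentioned above is typically derived from a ROC-style analysis. A minimal sketch using the Youden index, with made-up example data (not values from the thesis):

```python
import numpy as np

def youden_threshold(values, labels):
    # Cut-off maximising the Youden index J = sensitivity + specificity - 1,
    # a common way to turn a continuous biomarker into a single
    # predictive threshold.
    best_t, best_j = None, -1.0
    for t in np.unique(values):
        pred = values >= t
        sens = np.sum(pred & (labels == 1)) / np.sum(labels == 1)
        spec = np.sum(~pred & (labels == 0)) / np.sum(labels == 0)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Illustrative troponin values (ng/mL) and outcomes; 1 = no-reflow occurred.
troponin = np.array([1.2, 2.5, 3.1, 4.8, 5.5, 6.0, 7.2, 8.9])
no_reflow = np.array([0, 0, 0, 0, 1, 1, 1, 1])
```

On this toy data the two groups separate perfectly, so the chosen cut-off sits at the lowest value of the positive group; real clinical data would trade sensitivity against specificity.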
143

Stores and consumers : two perspectives on food purchasing

Holmberg, Carina January 1996 (has links)
What characterises the grocery store-customer relation? Can grocery stores expect to have an active role in households' management of everyday life? Can households, on the other hand, demand active participation and help from the stores? By which methods can we learn more both about food-related consumer behaviour and about the relation between stores and consumers? Based on two empirical studies of consumer behaviour, this book deals with these issues. The first part of the book presents an empirical study using 165 weeks of POS scanner data for fourteen product categories. These data were used to analyse the effects of different promotional activities, and the results indicated that in-store activities, particularly in-store displays, were important. The results from the first study brought the store-customer relation into focus, which is why this relation was studied next. The second part of the book consists of an exploratory empirical study applying participant interviewing, in this case talking to grocery shoppers while observing them. Some ten households with small children were accompanied to the grocery store, providing a context-based consumer perspective. Different aspects of food and purchasing, such as interest in food and the social role of the meal, were treated. Tentatively, planning and involvement are suggested as dimensions by which to distinguish households or shopping trips from one another. If validated, these dimensions might be useful instruments for retailers interested in adapting to their customers' different needs and purchasing behaviours. The concluding part discusses the two empirical studies, both in terms of method and in terms of their contribution to knowledge on consumer behaviour. One important issue here is the value, for a single researcher, of combining two studies with such different methods. / Diss. Stockholm : Handelshögsk.
144

Data post-processing and hardware architecture of an electromagnetic near-field scanner

Tankielun, Adam January 2007 (has links)
Also published as: Hannover, Univ., dissertation, 2007
145

OSCAR - the opportunistic scanner

Griesser, Andreas January 2006 (has links)
Also published as: Zürich, Techn. Hochsch., dissertation, 2006
146

Tomógrafo em nível de simulação utilizando micro-ondas em banda ultra larga (UWB) com transmissor em tecnologia CMOS para detecção precoce de câncer de mama. / Simulation-level tomograph using ultra-wideband (UWB) microwaves with a CMOS transmitter for early breast cancer detection.

Stelvio Henrique Ignácio Barboza 29 May 2014 (has links)
The developed system achieved good results in detecting numerical tumor models of 5 mm and larger, correctly localizing them and determining their size in simulations involving the specified block models. The main contribution of this work is the design, fabrication, and test results of an integrated pulse-generator circuit whose output is shaped as the fifth-order derivative of the Gaussian pulse (the UWB transmitter), fabricated in IBM 0.18 CMOS technology. The main blocks of the pulse generator are: a square-wave generator, a delay generator, a phase detector, and an output (pulse-shaping) stage. The square-wave generator was implemented as an RF buffer followed by an impedance-matched inverter at the output. The delay generator was implemented as a cascade of inverters. The phase-detector circuit consists of a dynamic n-block, an n-latch, and a static inverter for high-speed pulse shaping. The transistor dimensions were chosen to obtain the proper characteristics of a fifth-order Gaussian pulse, considering the specifications required by the breast cancer detection system. The layout was implemented full custom with the minimum dimensions of the technology. Five different chips were tested; the supply voltage was varied among 1.62 V, 1.80 V, and 1.98 V, and the peak-to-peak output amplitude and pulse width were measured for each chip. The measured power consumption was 244 uW, with an output pulse amplitude of 115.2 mV peak-to-peak and a pulse width of 407.8 ps for a sinusoidal input signal of 806 mVp amplitude at 100 MHz. The generated pulse yielded a power spectral density (PSD) with a bandwidth from 0.6 GHz to 7.8 GHz, which is suitable for UWB breast cancer detection applications.
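The fifth-order Gaussian derivative pulse and its spectrum are easy to sketch numerically. In the sketch below the pulse width sigma is an assumed value chosen to place the -10 dB band roughly inside the 3.1-10.6 GHz FCC UWB mask; it is not a parameter taken from the thesis, whose reported PSD bandwidth differs:

```python
import numpy as np

def gaussian_5th_derivative(t, sigma):
    # Fifth derivative of exp(-t^2/(2*sigma^2)), written via the
    # Hermite polynomial H5(x) = 32x^5 - 160x^3 + 120x with
    # x = t/(sigma*sqrt(2)); amplitude left unnormalised.
    x = t / (sigma * np.sqrt(2))
    h5 = 32 * x**5 - 160 * x**3 + 120 * x
    return -h5 * np.exp(-x**2) / (sigma * np.sqrt(2)) ** 5

sigma = 51e-12                              # assumed pulse width, 51 ps
t = np.arange(-1e-9, 1e-9, 1e-12)           # 1 ps steps over a 2 ns window
p = gaussian_5th_derivative(t, sigma)

# Estimate the -10 dB (power) bandwidth from the pulse spectrum.
spec = np.abs(np.fft.rfft(p))
freq = np.fft.rfftfreq(t.size, d=1e-12)
mask = spec >= spec.max() / np.sqrt(10)     # within 10 dB of the peak
f_lo, f_hi = freq[mask][0], freq[mask][-1]
```

The spectrum of the n-th Gaussian derivative is proportional to |f|^n exp(-(2*pi*f*sigma)^2/2), so increasing the derivative order pushes energy away from DC, which is why high-order derivatives are popular UWB pulse shapes.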
147

Processos erosivos em áreas úmidas, APA do Banhado Grande - RS / Erosive processes in wetlands, Banhado Grande EPA - RS

Etchelar, Cecilia Balsamo January 2017 (has links)
The Banhado Grande Environmental Protection Area, in the Gravataí river basin within the metropolitan region of Porto Alegre, contains an extensive wetland in its central portion. Using remote sensing and geoprocessing techniques to analyse physical variables such as geology, elevation, pedology, and vegetation indices such as NDVI and NDWI, it was possible to classify the landscape-unit compartments associated with the wetlands into a Peatland Landscape Unit and a Floodplain Landscape Unit. The delimitation of these areas supports planning for the rational use of these environments and for their maintenance and preservation. In the early 1970s a stretch of the Gravataí river was channelled to drain the wetlands and allow the expansion of rice cultivation; this intervention triggered an erosive process in the form of a gully in the Banhado Grande, in the municipality of Glorinha. From a time series of satellite images it was possible to map and quantify the evolution of the gully between 2003 and 2015. The mapping showed a fourfold increase in the gully area, from 2,909.62 m² in 2003 to 12,097.70 m² in 2015. The gully mapping and the monitoring of erosive processes in the wetland used the following techniques: a) stake-based monitoring; b) mapping from satellite images; c) terrestrial laser scanner surveys. The terrestrial laser scanner proved to be a powerful tool for mapping the gully because of its high precision and fast field data collection, generating high-accuracy digital elevation models. The model made it possible to identify areas of erosion and of sediment deposition from the topographic profile. Continuous monitoring of the erosive process in the Banhado Grande area is needed, combining methods to model its dynamics. These support tools are fundamental for drawing up the Management Plan and for studies aimed at restoring the wetlands of the Gravataí river floodplain.
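The NDVI and NDWI indices used above to delimit the landscape units are simple normalised band ratios. A minimal sketch with illustrative reflectance values (not data from the study):

```python
import numpy as np

def ndvi(nir, red):
    # Normalised Difference Vegetation Index: high over dense
    # vegetation, near zero over bare soil.
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    # McFeeters' NDWI: positive over open water and wet surfaces,
    # negative over vegetation.
    return (green - nir) / (green + nir)

# Illustrative reflectances for three pixels:
# dense vegetation, sparse vegetation, water.
nir = np.array([0.45, 0.30, 0.10])
red = np.array([0.08, 0.12, 0.09])
green = np.array([0.10, 0.11, 0.15])
v = ndvi(nir, red)
w = ndwi(green, nir)
```

Thresholding the two indices together is a common first pass for separating standing water, wet peat, and vegetated floodplain before a more careful classification.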
148

Análise de amostragem e interpolação na geração de MDE / Sampling and interpolation analysis in DEM generation

Miranda, Gisele Horta Barroso 16 February 2017 (has links)
Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES). / This work analyses sampling and interpolation methods for terrain modelling using data from a terrestrial laser scanner survey. The analysis is motivated by the goal of obtaining a Digital Elevation Model (DEM) that represents the terrain as faithfully as possible. The difficulty lies in handling the point cloud collected with this technology: a dense set of three-dimensional coordinates that places heavy demands on computational resources. The research therefore evaluates different samplings of the point cloud in order to find a sample size that can still represent the phenomenon faithfully, applies different interpolators to the sample sets, and analyses the influence of the interpolation methods on the quality of the resulting terrain models. Efficiency is assessed by evaluating the positional accuracy of the generated DEMs, according to the ET-CQDG standard and Decree-Law 89.817/1984, against a reference DEM built from the original data set (58,000 points) surveyed in the field over an area of 0.97 ha. Among the Random, Square Systematic, Triangular Systematic, and Hexagonal Systematic samplings, the best sampling grid for these data was the hexagonal systematic one, which presented lower RMS values than the others. In the analyses of the cartographic quality of the IDW, Spline, TIN, Natural Neighbour, and Kriging interpolators, the best interpolators were TIN and Natural Neighbour, which produced identical results, and Kriging, which classified the 250-point Hexagonal Systematic sampling differently from the others. Regarding sample size, the results showed that DEMs rated Class A at the 1:1,000 scale were generated with as few as 250 points in some cases. The computational effort for samples of 250 and 3,000 points is roughly the same when compared with samples of 50,000 and 30,000 points; therefore, for this study area, it is better to use samplings with a 94% reduction than with a 99% reduction, since both produced good-quality DEMs.
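Of the interpolators compared above, IDW is the simplest to sketch. A minimal implementation, without the neighbourhood-search and search-radius optimisations a production GIS tool would use:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2):
    # Inverse Distance Weighting: each queried elevation is a
    # distance-weighted mean of the sampled points, with weight 1/d^power.
    z_out = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_known - q, axis=1)
        hit = d == 0
        if np.any(hit):                  # query coincides with a sample
            z_out[i] = z_known[hit][0]
            continue
        w = 1.0 / d ** power
        z_out[i] = np.sum(w * z_known) / np.sum(w)
    return z_out

# Tiny illustrative sample (not survey data): corners of the plane z = x + y.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([0.0, 1.0, 1.0, 2.0])
z_centre = idw(pts, z, np.array([[0.5, 0.5]]))
```

Because IDW is an exact interpolator, it reproduces the sampled elevations at the sample locations; the `power` parameter controls how local the influence of each point is.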
149

Laser scanner terrestre na caracterização de alvos florestais

Bordin, Fabiane January 2015 (has links)
A terrestrial laser scanner (TLS) survey produces a point cloud with geometric coordinates (X, Y, Z), colour information (R, G, B) from a camera attached to the equipment, and the return intensity of the laser pulse (I). These scanning systems have characteristics that are valuable in forestry analysis, such as fast data acquisition, 3D scene recording, and contactless measurement. The main advantage of using a TLS in forestry is the ability to characterise targets remotely in a fast and non-destructive way. This work therefore evaluated the laser return intensity data from a TLS system for the characterisation of forest targets. Controlled experiments covered the following steps: radiometric calibration of the TLS; evaluation of the influence of distance on the laser return intensity data; and analysis of the edge effect in the imaging of forest targets (considered one of the main problems affecting return intensity data acquired with a TLS). The equipment used was an Optech Ilris 3D laser scanner, which operates in the mid-infrared at a wavelength of 1535 nm. The results showed that at this wavelength forest targets should be imaged at a distance of at least 5 m, and that processing the data at 8-bit radiometric resolution was more suitable, as it produced a geometric characterisation of the target with better visual quality than 16-bit processing. The edge-effect experiments identified two types of distortion in point clouds acquired with a TLS: the first affects the laser return intensity values, and the second displaces points in space. To minimise this effect, an algorithm called IRA (Intensity Recovery Algorithm) was developed, which automatically recovers the laser return intensity values and reduced the edge effect by up to 35.7% in the imaging of the target studied. For the geometric characterisation of forest targets with a TLS, calibration models of the return pulse intensity must be developed, since TLS systems differ in the region of the electromagnetic spectrum in which they operate. Finally, regarding the edge effect, it was concluded that the IRA algorithm needs to be improved with other computational and mathematical approaches in future studies.
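Return intensity depends on range as well as on the target, which is why per-scanner calibration matters. A minimal sketch of range normalisation assuming the usual 1/R² fall-off of the laser range equation; the 5 m reference echoes the minimum imaging distance reported above, but a real calibration model, like the one the thesis argues for, would be fitted empirically for the specific instrument:

```python
def normalise_intensity(intensity, distance, ref_distance=5.0):
    # Range-normalise a TLS return intensity to a reference distance,
    # assuming intensity falls off as 1/R^2 (diffuse target, full
    # beam footprint on the surface).
    return intensity * (distance / ref_distance) ** 2

# Same diffuse target scanned at 5 m and at 10 m: the raw return drops
# roughly fourfold, so the normalised values should agree
# (illustrative numbers, not measurements from the thesis).
i_at_5m = 0.80
i_at_10m = 0.20
```

The 1/R² model breaks down at very short range and at edges, where only part of the beam footprint hits the target; those edge returns are exactly the distortions the IRA algorithm addresses.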
150

Next Generation Black-Box Web Application Vulnerability Analysis Framework

January 2017 (has links)
abstract: Web applications are an incredibly important aspect of our modern lives. Organizations and developers use automated vulnerability analysis tools, also known as scanners, to find vulnerabilities in their web applications during development. Scanners have traditionally followed one of two approaches: black-box, in which the scanner has no access to the source code of the web application, and white-box, in which it does. Today's state-of-the-art black-box vulnerability scanners employ various methods to fuzz a web application and detect vulnerabilities: they send a number of known payloads and try to trigger a vulnerability. This technique is simple but does not understand the web application it is testing. This thesis presents a new approach to vulnerability analysis. The vulnerability analysis module presented here uses a novel approach, Inductive Reverse Engineering (IRE), to understand and model the web application. IRE first attempts to understand the behavior of the web application by giving a number of input/output pairs to it. The IRE module then hypothesizes a set of programs, in a limited language specific to web applications called AWL, that satisfy the input/output pairs; these hypotheses take the form of a directed acyclic graph (DAG). The AWL vulnerability analysis module can then attempt to detect vulnerabilities in this DAG. Further, it generates a payload based on the DAG, so the payload is precisely crafted to trigger the potential vulnerability (based on our understanding of the program). Finally, it tests the potential vulnerability by sending the generated payload to the actual web application and applies a verification procedure to the application's response to determine whether the potential vulnerability is real.
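The "known payloads" style of traditional black-box scanning that the thesis improves on can be sketched in a few lines. Everything here is illustrative: two toy payload strings, a `render` callback standing in for an HTTP round trip, and two toy applications (one reflecting input unescaped, one escaping it):

```python
import html

# Classic reflected-XSS probe strings; real scanners carry much larger,
# context-aware payload lists (these two are illustrative).
PAYLOADS = ['<script>alert(1)</script>', '"onmouseover="alert(1)']

def scan_param(render, param, payloads=PAYLOADS):
    # Minimal black-box check in the spirit of the traditional scanners
    # described above: inject each payload into one parameter and flag
    # a potential XSS when it comes back reflected unencoded.
    return [(param, p) for p in payloads if p in render({param: p})]

# Toy application that echoes the query into HTML unescaped (vulnerable)...
def vulnerable_app(params):
    return '<p>You searched for %s</p>' % params.get('q', '')

# ...and one that HTML-escapes it first (safe).
def safe_app(params):
    return '<p>You searched for %s</p>' % html.escape(params.get('q', ''))
```

The sketch makes the thesis's criticism concrete: the scanner fires fixed strings and string-matches the response, with no model of what the application does with the input, which is the gap the IRE approach is designed to close.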
/ Dissertation/Thesis / Master's Thesis, Computer Science, 2017
