31 |
Effective Analysis, Characterization, and Detection of Malicious Activities on the Web. Eshete, Birhanu Mekuria. January 2013.
The Web has evolved from a handful of static web pages to billions of dynamic and interactive web pages. This evolution has positively transformed the paradigm of communication, trading, and collaboration for the benefit of humanity. However, these invaluable benefits of the Web are shadowed by cyber-criminals who use the Web as a medium for malicious activities motivated by illegitimate gains. Cyber-criminals often lure victims to visit malicious web pages, exploit vulnerabilities on victims' devices, and then launch attacks that can lead to stealing victims' credentials, downloading and installing malware on victims' devices, or completely compromising victims' devices to mount future attacks. While the current state of the art in detecting malicious web pages is promising, it remains limited in addressing the following three problems. First, for the sake of focused detection of certain classes of malicious web pages, existing techniques are limited to partial analysis and characterization of attack payloads. Second, attacker-motivated and benign evolution of web page artifacts has challenged the resilience of existing detection techniques. Third, the Exploit Kits used in spreading web-borne malware are both prevalent and evolving. In this dissertation, we present the approaches and tools we developed to address these problems. To address the partial analysis and characterization of attack payloads, we propose a holistic and lightweight approach that combines static analysis and minimalistic emulation to analyze and detect malicious web pages. This approach leverages features from URL structure, HTML content, JavaScript executed on the client, and the reputation of URLs on social networking websites to train multiple models, which are then combined in a confidence-weighted majority-vote classifier to detect unknown web pages.
Evaluation of the approach on a large corpus of web pages shows that it not only is precise enough to detect malicious web pages with very few false signals but also does so with a minimal performance penalty. To address the evolution of web page artifacts, we propose an evolution-aware approach that tunes detection models in line with the evolution of web page artifacts. Our approach uses evolutionary search and optimization with a genetic algorithm to decide the best combination of features and learning algorithms, i.e., models, as a function of detection accuracy and false signals. Evaluation of our approach suggests that it reduces false negatives by about 10% on a fairly large testing corpus of web pages. To tackle the prevalence of Exploit Kits on the Web, we first analyze the source code and runtime behavior of several Exploit Kits in a contained setting. In addition, we analyze the behavior of live Exploit Kits on the Web in a contained environment. Combining the analysis results, we characterize Exploit Kits in terms of their attack-centric and self-defense behaviors. Based on these behaviors, we derive distinguishing features to train classifiers that detect URLs hosted by Exploit Kits. The evaluation of our classifiers on an independent testing dataset shows that our approach precisely detects malicious URLs linked with Exploit Kits with very few false positives.
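The combination step described above can be sketched as follows. This is a hypothetical illustration of a confidence-weighted majority-vote classifier, not the thesis implementation; the model names and weights are invented for the example.

```python
# Hypothetical sketch of a confidence-weighted majority-vote classifier:
# each per-feature-class model votes with a weight learned from its
# validation accuracy, and the label with the most accumulated
# confidence wins. Model names and weights are illustrative only.

def weighted_majority_vote(predictions, weights):
    """predictions: dict model_name -> label ('malicious' or 'benign');
    weights: dict model_name -> confidence weight of that model."""
    scores = {}
    for model, label in predictions.items():
        scores[label] = scores.get(label, 0.0) + weights[model]
    # The label with the highest accumulated confidence wins.
    return max(scores, key=scores.get)

preds = {"url_model": "malicious", "html_model": "benign",
         "js_model": "malicious", "reputation_model": "malicious"}
weights = {"url_model": 0.7, "html_model": 0.6,
           "js_model": 0.9, "reputation_model": 0.5}
print(weighted_majority_vote(preds, weights))  # -> malicious
```

A single confident model can thus be outvoted only by a coalition of others, which is the usual motivation for weighting votes by confidence rather than counting them equally.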
|
32 |
Efficient Automated Security Analysis of Complex Authorization Policies. Truong, Anh. January 2015.
Access control is becoming increasingly important for today's ubiquitous systems. Sophisticated security requirements need to be ensured by authorization policies for increasingly complex and large applications. As a consequence, designers need to understand such policies and ensure that they meet the desired security constraints, while administrators must also maintain them so as to comply with the evolving needs of systems and applications. These tasks are greatly complicated by the expressiveness and the dimensions of the authorization policies. It is thus necessary to provide policy designers and administrators with automated analysis techniques capable of foreseeing if, and under what conditions, security properties may be violated. For example, some analysis techniques have already been proposed in the literature for Role-Based Access Control (RBAC) policies. RBAC is a security model for access control that has been widely adopted in real-world applications. Although RBAC simplifies the design and management of policies, modifying RBAC policies in complex organizations is a difficult and error-prone activity due to the limited expressiveness of the basic RBAC model. For this reason, RBAC has been extended in several directions to accommodate various real-world needs, such as Administrative RBAC (ARBAC) and Temporal RBAC (TRBAC). This dissertation presents our research efforts to find the best trade-off between scalability and expressiveness in the design and benchmarking of analysis techniques for authorization policies. We review the state of the art in automated analysis of authorization policies, identify limitations of available techniques, and then describe our approach, which is based on recently developed symbolic model checking techniques built on Satisfiability Modulo Theories (SMT) solving (for expressiveness) and carefully tuned heuristics (for scalability).
In particular, we present the implementation of these techniques for the automated analysis of ARBAC and ATRBAC policies and discuss extensive experiments showing that the proposed approach outperforms other state-of-the-art analysis techniques. Finally, we discuss directions for extensions.
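The question these analyses answer, role reachability, can be illustrated with a deliberately naive explicit-state search. The thesis uses symbolic SMT-based model checking precisely because this brute-force enumeration does not scale; the rules below are invented for the example.

```python
# Naive explicit-state sketch of a user-role reachability question for an
# ARBAC policy: can some sequence of administrative role assignments lead
# a user to acquire a target role? Rules and role names are illustrative.

def reachable(initial_roles, can_assign, target):
    """can_assign: list of (precondition_role_set, new_role) rules for one
    user. Returns True if a state containing `target` is reachable."""
    frontier = [frozenset(initial_roles)]
    seen = set(frontier)
    while frontier:
        state = frontier.pop()
        if target in state:
            return True
        for pre, role in can_assign:
            # a rule fires when its preconditions hold and adds a new role
            if pre <= state and role not in state:
                nxt = state | {role}
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return False

rules = [(frozenset({"employee"}), "manager"),
         (frozenset({"manager"}), "auditor")]
print(reachable({"employee"}, rules, "auditor"))  # -> True
```

The state space here is exponential in the number of roles, which is why symbolic techniques that reason over sets of states at once are needed for realistic policies.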
|
33 |
Interaction among vegetation and morphology in channelized rivers. Marofi Fathpour, Navid. January 2016.
The presence of aquatic vegetation on riverbeds and embankments influences flow structure and, consequently, flow resistance, sediment transport, morphology, and ecology. These influences lead to hydraulic diversity, a key ingredient of physical habitat in streams. For this reason, vegetation is commonly incorporated into stream restoration. Although the significance of vegetation as an inseparable part of riverine systems is recognized, it is still difficult to predict how the associated processes will respond to the introduction of vegetation and how the advantages of vegetation can be optimized across a multitude of different processes. The primary impact of vegetation is slower flow velocity and, thus, a reduction in conveyance capacity. In addition to affecting the velocity profile over the full depth, vegetation affects turbulence intensity and diffusion. As mean kinetic energy converts to turbulent kinetic energy within the area planted with stems, turbulence intensity begins to increase. As a result of these velocity and turbulence changes, aquatic vegetation can affect sediment movement, and consequently bed forms can stabilize in new patterns. Bed form characteristics (length, shape, structure, dimensions, stability, etc.) also depend on flow structure and can be divided into different categories according to the bed load materials. The location and extent of vegetation in river channels is a fundamental factor that should be considered alongside the general impacts of vegetation in rivers. Isolated patches of vegetation are more common in practical applications than uniformly vegetated channels in which the mean properties of the vegetation canopy are independent of location. The present study considers the changes in bed forms induced by semi-circular patches of emergent vegetation located at the banks of the channel.
One of the goals is to find out how an island of vegetation modifies river morphology and mass transport. In particular, this research focuses on the physics of sediment transport and its effect on bed forms and flow resistance in the presence of a patch of vegetation, using experimental data and numerical modeling. By providing a physically based model for estimating the effects of vegetation on flow parameters, turbulence dispersion, and sediment transport, the results of the present study extend our knowledge of morphology and mass transport in vegetated streams.
|
34 |
Downstream suspended sediment dynamics of reservoir sediment flushing. Tarekegn, Tesfaye Haimanot. January 2015.
Reservoir sediment flushing is increasingly considered beneficial to reduce reservoir sedimentation and maintain sediment supply downstream of impounded rivers. Nevertheless, flushing the accumulated sediments downstream of the dam also has numerous negative impacts. In this study, the most important downstream impacts of the fine sediment released by flushing were first identified based on previously published research covering twenty case studies in eleven countries. The results showed that both the long-term and short-term biological and physical impacts decrease with distance from the dam. The temporal scale of impacts on macro-invertebrates can span from a few weeks to several months, while the effect on fish can last for a number of years. The impact on downstream vegetation dynamics is driven by many years of flushing activities. The study also enabled us to propose generic management strategies aimed at reducing these impacts. Second, fine sediment transport over a coarse immobile bed, a common phenomenon downstream of dams during flushing releases and dam removal, and also in many mountain and canyon rivers, was investigated. In particular, the dynamics of the downstream erosion and transport of fine sediments released during flushing was investigated through a series of flume experiments carried out on an immobile gravel bed and using a one-dimensional (1-D) suspended sediment transport model developed in the present study. In the framework of the flume experiments, gravel bed roughness, porosity, and roughness density were first extracted exclusively from gravel surface elevation data, for which a spatial filter was developed to overcome elevation errors. Second, a new technique was developed to measure fine sediment erosion in an immobile coarse bed under running water.
The method proved to be the backbone of all fine sediment erosion experiments conducted in the present study and could be used for similar studies. This study presents a first direct measurement of erosion rate and a characterization of its spatial heterogeneity in a gravel bed. The experimental erosion-rate data for fine sediments showed spatial variation, with high erosion rates on the stoss side of gravels and lower rates on the lee side, consistent with the sweep and ejection characteristics of coherent flow structures in gravel bed flows. Erosion rate was significantly affected by the increase in roughness of the immobile gravel bed, with high erosion rates observed when the sand level was reduced, although the effect on stream-wise velocity was not significant. The vertical profile of erosion rate was found to decrease linearly, and erosion rate decayed exponentially in time within the gravel matrix.
Third, a new non-equilibrium erosion rate relation is proposed. The drag force profile in the interfacial sublayer of a clean gravel bed was found to scale well with roughness density, allowing the effective shear stress distribution available for fine sediment entrainment to be predicted with an empirical equation. The new relation is a modified version of the pick-up rate function of van Rijn (1984b) in which the predicted shear stress in the roughness layer is implemented. The most important finding is that, once the shear stress distribution in the interfacial sublayer is predicted, a relation developed for sand bed conditions can be applied to predict the fine sediment erosion rate in an immobile gravel bed. This approach is conceptually superior to previous approaches in which the erosion rate under sand bed conditions was scaled empirically for various fine sediment bed levels within the interfacial sublayer. Finally, the effect of the interaction between the hydrodynamics and the sediment wave dynamics of flushing on the spatial pattern of sediment deposition was investigated. The 1-D model was developed to include the major processes observed in sediment flushing: sediment wave celerity correction, variable bed roughness, bed exchange in an immobile bed, hindered settling velocity, and rough bed porosity. The proposed erosion rate relation showed encouraging results when implemented in the 1-D model. The wave celerity factor did not show a significant effect on the spatial lag under immobile bed conditions, although it was significant under sand bed conditions. Variable bed roughness modified both the flow field and the sediment deposition, producing a longer sediment deposit. The immobile bed porosity allowed the clogged depth of fine sediments to be modelled. The model also proved very valuable for investigating flushing scenarios that reduce significant deposition, through analysis of the dependence of deposition on the peak-to-base flow ratio and the intermittence of releases.
The highest peak-to-base flow ratios produced the longest and thickest region of deposition, while the lowest produced the shortest and thinnest. A single flushing release followed by a clear water release reduced the area and length of sediment deposition more than intermittent flushing followed by inter- and post-flushing clear water releases. In the latter case, the peak concentration was lower but remained elevated for a longer duration than in the former, which suggests that a large quantity of clear water must be available for release.
Overall, the present research represents a step forward in understanding the relevant processes involved in the downstream transport of fine sediments released during flushing and the associated impacts, which can help the development of better management strategies and predictive tools.
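The baseline that the proposed relation modifies can be sketched as follows. This is a hedged rendering of the van Rijn (1984) pick-up rate function from memory, not the thesis formulation; in particular, passing the predicted effective shear stress in the roughness layer as `tau_eff` is an assumption standing in for the modification described above, and all numeric inputs are illustrative.

```python
# Hedged sketch of the van Rijn (1984) pick-up rate function. The `tau_eff`
# argument stands in for the effective shear stress predicted in the
# roughness layer; the standard formula would use the bed shear stress.
import math

def van_rijn_pickup(tau_eff, d50, rho_s=2650.0, rho=1000.0,
                    g=9.81, nu=1.0e-6, theta_cr=0.047):
    s = rho_s / rho
    d_star = d50 * ((s - 1) * g / nu**2) ** (1.0 / 3.0)   # particle parameter
    theta = tau_eff / ((rho_s - rho) * g * d50)           # Shields stress
    T = max((theta - theta_cr) / theta_cr, 0.0)           # transport stage
    # dimensionless pick-up rate, then scaled to a mass flux [kg/m^2/s]
    phi = 0.00033 * d_star**0.3 * T**1.5
    return phi * rho_s * math.sqrt((s - 1) * g * d50)

# fine sand (d50 = 0.2 mm) entrained above the critical Shields stress
print(van_rijn_pickup(tau_eff=0.5, d50=0.2e-3) > 0.0)  # -> True
```

Below the critical Shields stress the transport stage is clipped to zero and no pick-up occurs, which mirrors the threshold behaviour of fine sediment erosion in the gravel matrix.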
|
35 |
Cyclic Codes: Low-Weight Codewords and Locators. Tinnirello, Claudia. January 2016.
Error correcting codes have become an integral part of the design of reliable data transmission and storage systems. They also play an increasingly important role in other applications, such as the analysis of pseudorandom sequences and the design of secure cryptosystems. Cyclic codes form a large class of widely used error correcting codes, including important families such as Bose-Chaudhuri-Hocquenghem (BCH) codes, quadratic residue (QR) codes, and Golay codes. In this thesis I tackle two problems related to cyclic codes: finding low-weight codewords and decoding.
Efficiently computing low-weight codewords of a cyclic code is often a key ingredient of correlation attacks on LFSR-based stream ciphers. The best general-purpose algorithm is based on the generalized birthday problem. In the first part of the thesis I present an alternative approach based on discrete logarithms which has, in some cases relevant for applications, much lower memory requirements and comparable time complexity. The second part of the thesis is devoted to results on efficient bounded-distance decoding of cyclic codes.
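The low-weight-codeword problem can be made concrete on a toy code. The sketch below brute-forces the minimum weight of the binary (7,4) cyclic code generated by g(x) = x^3 + x + 1 (a cyclic presentation of the Hamming code); the thesis targets far larger codes, for which exhaustive search like this is infeasible and the birthday-problem and discrete-logarithm approaches are needed.

```python
# Brute-force minimum-weight search in a small binary cyclic code: the
# (7,4) code generated by g(x) = x^3 + x + 1. Codewords are the multiples
# of g(x) modulo x^7 - 1. Illustrative only; real parameters are too
# large for exhaustive search.
from itertools import product

def poly_mul_mod(a, b, n):
    """Multiply binary polynomials (coefficient lists, low degree first)
    modulo x^n - 1, i.e. with cyclic wrap-around of exponents."""
    out = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if ai and bj:
                out[(i + j) % n] ^= 1
    return out

g = [1, 1, 0, 1]  # 1 + x + x^3
codewords = [poly_mul_mod(list(m), g, 7) for m in product([0, 1], repeat=4)]
min_wt = min(sum(c) for c in codewords if any(c))
print(min_wt)  # -> 3
```

The minimum nonzero weight found, 3, is the minimum distance of the Hamming code, so the code corrects one error; correlation attacks instead exploit the existence of many low-weight codewords, not just the minimum.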
|
36 |
Modelling the morphodynamics of tidal channels. Vignoli, Gianluca. January 2004.
The present study deals with the morphodynamics of rivers and estuaries. The morphodynamic behaviour of natural systems has been investigated using numerical tools. As a first step, the hydrodynamics and morphodynamics of both convergent and non-convergent tidal channels have been studied. The analysis uses a second-order numerical scheme that solves the 1D flow equations and the continuity equation for bed evolution, in order to point out the main characteristics of the phenomenon. Numerical results show that the behaviour is non-linear even for relatively small values of the ratio between tidal amplitude and mean flow depth.
The morphodynamic behaviour of tide-dominated estuaries is characterised by the formation of a rising landward bed profile. This trend is due to the flood-dominated character of convergent estuaries with a horizontal bed profile, which induces a landward net sediment flux. Owing to this sediment flux, the channel fills with sediment and a beach can form, whose position depends on the geometrical characteristics (channel length, flow depth, degree of convergence) and on the hydrodynamic characteristics (tidal amplitude, friction factor). The equilibrium conditions are characterised by symmetrical flood and ebb phases. The second step is the development of a three-dimensional numerical model to understand the altimetric behaviour of almost straight channels. Bed forms can emerge spontaneously from a configuration with a plane bed profile. The research activity concerns in particular the geometrical characterisation of the bed forms: wavelength, Fourier composition, mean celerity, maximum scour and deposition, and the time scale of the formation process. The investigation is oriented towards the characterisation of the equilibrium geometry and the description of the dynamical behaviour, the flow field, and the concentration field. This goal has been achieved using a fully non-linear 3D numerical model that accounts for sediment transported both as bed load and as suspended load. The numerical scheme is that proposed by Casulli & Cattani (1994), suitably modified for this particular problem; the advection-diffusion equation for the sediment transported in suspension is solved using an original semi-analytical conservative scheme.
Results are in agreement with those obtained through analytical linearised theories (see Tubino et al., 1999 for a review). Numerical results suggest that in fine-sediment channels, when suspended load is dominant over bed load, the instability process differs from the gravel bed case. When suspended load is dominant, the model predicts the tendency of free bars to emerge from the free surface, forming islands. A similar behaviour is observed in gravel bed rivers at relatively high values of the width-to-depth ratio, while under suspended-load-dominated conditions the maximum deposits are relatively large even for values of the aspect ratio close to the threshold for bar formation. Finally, the analysis and results for the formation of alternate bars in uniform flow have been extended to the tidal context by adopting a suitable basic flow. In the tidal case the bed forms show vanishing celerity; therefore the altimetric and planimetric morphological responses may interact. Numerical simulations under tidal conditions are very long, requiring up to a month of computational time, because the time step must be much shorter than the tidal period. We have therefore looked for a suitable 2D approximate formulation for the suspended flux, in order to reduce the number of computational nodes and thus the computational time. In the literature there are many approaches for evaluating the suspended load through analytical perturbative methods. Here a comparison has been made between the numerical solution and an analytical solution, showing that the latter can be applied only over a relatively small range of the parameters.
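The coupling of flow and bed evolution mentioned above rests on the sediment continuity (Exner) equation. The sketch below shows a minimal 1D bed-update step of the kind such morphodynamic models iterate; the upwind differencing and the flux values are illustrative assumptions, not the scheme of Casulli & Cattani (1994).

```python
# Minimal sketch of a 1D bed-evolution (Exner) update: bed elevation
# changes with the divergence of the sediment flux,
#   dz/dt = -1/(1-p) * dqs/dx,
# discretised here with first-order upwind differences (flow in +x).

def exner_step(z, qs, dx, dt, porosity=0.4):
    """z: bed elevations [m]; qs: sediment flux [m^2/s] at the same nodes.
    Returns the updated bed after one time step."""
    z_new = z[:]
    for i in range(1, len(z)):
        dqs_dx = (qs[i] - qs[i - 1]) / dx
        z_new[i] = z[i] - dt / (1.0 - porosity) * dqs_dx
    return z_new

z0 = [0.0] * 5
qs = [0.0, 1.0e-4, 1.0e-4, 5.0e-5, 0.0]   # flux rises, then decays downstream
z1 = exner_step(z0, qs, dx=1.0, dt=10.0)
# where the flux decreases downstream, sediment deposits (bed rises)
print(z1[3] > 0.0)  # -> True
```

Erosion occurs where the flux increases downstream and deposition where it decreases, which is the mechanism behind the landward sediment accumulation and beach formation described in the abstract.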
|
37 |
On Efficient Algorithms for Stochastic Simulation of Biochemical Reaction Systems. Vo, Hong Thanh. January 2013.
Computational techniques provide invaluable tools for developing a quantitative understanding of the complexity of biological systems. The knowledge of the biological system under study is formalized in a precise form by a model. A simulation algorithm realizes the dynamic interactions encoded in the model. The simulation can uncover biological implications and suggest further predictive experiments. Several successful approaches, at different levels of detail, have been introduced to deal with various biological pathways, including regulatory networks, metabolic pathways, and signaling pathways. The stochastic simulation algorithm (SSA), in particular, is an exact method for realizing the time evolution of a well-mixed biochemical reaction network. It treats the inherent randomness of biological reactions and the discrete nature of the molecular species involved as the main sources of variability when sampling a reaction event. SSA is useful for reaction networks with low populations of molecular species, especially key species: the macroscopic response can be significantly affected, both quantitatively and qualitatively, when such species are involved in reactions. Even though the underlying assumptions of SSA are clearly a simplification of real biological networks, it has been shown to be capable of reproducing the stochastic effects in biological behaviour.
Essentially, SSA uses a Monte Carlo simulation technique to realize the temporal behaviour of a biochemical network. At each step a reaction is randomly selected to fire, with probability proportional to its propensity, by conducting a search procedure. The fired reaction leads the system to a new configuration, in which the affected reactions must update their propensities to reflect the changes.
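This select-fire-update loop is the classic Gillespie direct method, which can be sketched as follows. The toy network (a reversible isomerization A <-> B) and its rate constants are illustrative assumptions.

```python
# Sketch of one SSA (Gillespie direct method) step: draw an exponential
# waiting time from the total propensity, then pick the firing reaction
# by a linear search weighted by the individual propensities.
import random

def ssa_step(state, reactions, rng):
    """reactions: list of (propensity_fn, state_change) pairs."""
    props = [p(state) for p, _ in reactions]
    a0 = sum(props)
    if a0 == 0.0:
        return state, float("inf")      # no reaction can fire
    tau = rng.expovariate(a0)           # time to the next reaction
    r = rng.random() * a0               # linear search for the firing
    acc = 0.0
    for a, (_, change) in zip(props, reactions):
        acc += a
        if r < acc:
            return {k: state[k] + change.get(k, 0) for k in state}, tau
    return state, tau

rng = random.Random(42)
reactions = [(lambda s: 0.1 * s["A"], {"A": -1, "B": +1}),   # A -> B
             (lambda s: 0.05 * s["B"], {"A": +1, "B": -1})]  # B -> A
state, t = {"A": 100, "B": 0}, 0.0
while t < 10.0:
    state, tau = ssa_step(state, reactions, rng)
    t += tau
print(state["A"] + state["B"])  # -> 100 (molecules are conserved)
```

The linear search over propensities is the step the thesis targets first: it costs O(n) per firing, which the tree-based search below the abstract's first contribution reduces to O(log n).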
In this thesis we investigate new algorithms for improving the performance of SSA. First, we study the application of tree-based search for selecting the next reaction firing, and devise a solution that optimizes the average search length. We prove that with a tree-based search the performance of SSA can be noticeably improved, moving the search from linear to logarithmic time complexity. We combine this idea with others from the literature and compare the performance of our algorithm with previous ones. Our experiments show that our algorithm is faster, especially on large models.
Second, we focus on reducing the cost of propensity updates. Although the computational cost of evaluating one reaction propensity is small, the cumulative cost for a large number of reactions contributes a significant portion of the simulation time. Typical experiments show that propensity updates account for 65% to 85%, and in some special cases up to 99%, of the total simulation time even when a dependency graph is applied. Moreover, kinetics are sometimes modelled with complex propensity formulas, further increasing the cost of propensity updates. We propose a new exact simulation algorithm, called RSSA (Rejection-based SSA), to reduce the cost of propensity updates. The principle of RSSA is to use an over-approximation of the propensities to select a reaction firing; the exact propensity value is evaluated only when needed. Thus, propensity updates are postponed and collapsed as much as possible. We show through experiments that propensity updates are significantly reduced by our algorithm, substantially improving the simulation time.
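The rejection idea behind RSSA can be sketched in isolation. This is a minimal illustration of selecting a reaction via propensity upper bounds, not the thesis implementation; the bounds and exact values are invented for the example.

```python
# Sketch of rejection-based reaction selection: candidates are drawn
# proportionally to cheap, precomputed propensity upper bounds, and the
# exact (expensive) propensity is evaluated only to accept or reject.
# Bounds and exact values below are illustrative assumptions.
import random

def rssa_select(upper_bounds, exact_propensity, rng):
    """upper_bounds: list of propensity upper bounds.
    exact_propensity(i): exact propensity of reaction i, evaluated lazily.
    Returns the index of the accepted reaction."""
    b0 = sum(upper_bounds)
    while True:
        # pick a candidate proportionally to its upper bound
        r = rng.random() * b0
        acc, cand = 0.0, len(upper_bounds) - 1
        for i, b in enumerate(upper_bounds):
            acc += b
            if r < acc:
                cand = i
                break
        # accept with probability exact/bound; otherwise retry
        if rng.random() * upper_bounds[cand] <= exact_propensity(cand):
            return cand

rng = random.Random(7)
bounds = [2.0, 1.0, 4.0]
exact = [1.5, 0.8, 3.9]          # true propensities, each below its bound
idx = rssa_select(bounds, lambda i: exact[i], rng)
print(idx in (0, 1, 2))  # -> True
```

As long as each exact propensity stays below its bound, the accepted reaction is distributed exactly as in plain SSA, so the bounds, and not the exact propensities, are what must be kept up to date between firings.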
Third, we extend our study to reaction-diffusion processes, in which the simulation must explicitly account for the diffusion of species in space. Compartment-based reaction-diffusion simulation divides the space into subvolumes small enough that each subvolume is well-mixed; the diffusion of a species between subvolumes is modelled as an additional unimolecular reaction. We propose a new algorithm, called Rejection-based Reaction Diffusion (RRD), to efficiently simulate such reaction-diffusion systems. RRD combines tree-based search with the idea of RSSA to select the next reaction firing in a subvolume. The highlight of RRD compared with previous algorithms is that the selection of both the subvolume and the reaction uses only the over-approximation of the propensities. We prove the correctness of RRD and experimentally show its performance improvement over other compartment-based approaches in the literature.
Finally, we focus on the statistical analysis of a target event by stochastic simulation. A direct application of SSA is to generate trajectories and count the successful ones. Rare events, which occur only with very small probability, make this approach infeasible, since a prohibitively large number of trajectories would need to be generated before the estimate becomes reasonably accurate. We propose a new method, called splitting SSA (sSSA), to improve the accuracy and efficiency of stochastic simulation for this problem. Essentially, sSSA is a form of biased simulation that encourages the evolution of the system towards the target event, yet in a way that allows one to recover an unbiased probability estimate. We compare the performance and accuracy of sSSA and SSA in several concrete scenarios. Experimental results show that sSSA is more efficient than the naive SSA approach.
|
38 |
Formal failure analyses for effective fault management: an aerospace perspective. Bittner, Benjamin. January 2016.
The possibility of failure is a reality that all modern complex engineering systems need to deal with. In this dissertation we consider two techniques for analyzing the nature and impact of faults on system dynamics, which is fundamental to managing them reliably. Timed failure propagation analysis studies how, and how fast, faults propagate through the physical and logical parts of a system. We develop formal techniques to validate and automatically generate representations of such behavior from a more detailed model of the system under analysis.
Diagnosability analysis studies the impact of faults on observable parameters and tries to understand whether the presence of faults can be inferred from the observations within a useful time frame. We extend a recently developed framework for specifying diagnosis requirements, develop efficient algorithms to assess diagnosability under a fixed set of observables, and propose an automated technique to select optimal subsets of observables. The techniques have been implemented and evaluated on realistic models and case studies developed in collaboration with engineers from the European Space Agency, demonstrating the practicality of the contributions.
|
39 |
Automatic Population of Structured Knowledge Bases via Natural Language Processing. Fossati, Marco. January 2017.
The Web has evolved into a huge mine of knowledge carved in different forms, the predominant one still being the free-text document. This motivates the need for intelligent Web-reading agents: hypothetically, they would skim through corpora of disparate Web sources and generate meaningful structured assertions to fuel Knowledge Bases (KBs).
Ultimately, comprehensive KBs like Wikidata and DBpedia play a fundamental role in coping with information overload. In line with this vision, this thesis describes a set of systems based on Natural Language Processing (NLP) that take unstructured or semi-structured information sources as input and produce machine-readable statements for a target KB. We make four main research contributions: (1) a one-step methodology for crowdsourcing Frame Semantics annotation; (2) an NLP technique implementing the above contribution to perform N-ary Relation Extraction from Wikipedia, thus enriching the target KB with properties; (3) a taxonomy learning strategy to produce an intuitive and exhaustive class hierarchy from the Wikipedia category graph, thus augmenting the target KB with classes; and (4) a recommender system that leverages a KB network to yield atypical suggestions with detailed explanations, serving as a proof of concept for real-world end users. The outcomes are incorporated into the Italian DBpedia chapter, can be queried through its public endpoint, and can be downloaded as standalone data dumps.
|
40 |
Nomos 3: legal compliance of software requirements. Ingolfo, Silvia. January 2015.
Laws and regulations are increasingly impacting the design and development of software systems, as legislations around the world attempt to control the impact of software on social and private life. Software systems need to be designed from the beginning in a law-aware fashion to ensure compliance with applicable laws. Moreover, they need to evolve over time as new laws pass and existing ones are amended. In this interdisciplinary field many challenges remain open. For any given norm, there are alternative ways for a system-to-be to comply with it. Moreover, revising some requirements or adding new ones can have an important impact on which norms apply. To complicate matters, there is a sizeable knowledge gap between technical and legal experts, and this hampers requirements analysts in dealing with the problem on their own. This thesis proposes to use conceptual models of law and requirements to help requirements engineers address these problems by answering questions such as "Given this set of requirements, which norms are applicable?", "Which norms are complied with?", and "What are the alternative ways to comply with a norm?". The thesis proposes the Nomos 3 framework, which includes a modeling language for law and requirements, reasoning support for Nomos 3 models, and a systematic process for establishing compliance. The proposed framework is evaluated by means of illustrative case studies, a scalability study of the reasoning mechanism, and other specific studies intended to assess the effectiveness of the proposed concepts, tools, and process.
|