  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali Unknown Date (has links)
The role of electric power systems has grown steadily in both scope and importance over time, and electricity is increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion of electric power systems, together with increased energy demand, makes power systems more and more complex. Such complexity introduces considerable uncertainty, which demands comprehensive reliability and security assessment to ensure reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase their computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities were reformed into competitive markets, with the initial goals of improving market efficiency, minimizing production costs and reducing the electricity price. Alongside the benefits achieved by deregulation, several new challenges have also emerged in the market. Owing to these fundamental changes to the electric power industry, traditional management and analysis methods cannot deal with the new challenges. Deterministic reliability assessment criteria still exist, but they do not capture the probabilistic nature of power systems, and the worst-case analysis used in the deterministic approach results in excess operating costs. Probabilistic methods, on the other hand, are now widely accepted. The analytical method uses mathematical formulas for reliability evaluation and generates results quickly, but it requires many restrictive assumptions and is not suitable for large, complex systems. Simulation-based techniques capture much of the uncertainty by simulating the random behavior of the system; however, they require substantial computing power, memory and other computing resources. Power engineers have to run thousands of time-domain simulations to determine stability for a set of credible disturbances before dispatching. Security analysis, for example, is concerned with the steady-state and dynamic response of the power system to various disturbances, and real-time security assessment is highly desirable, especially in the market environment. Therefore, novel analysis methods that can provide comprehensive results are required for power system reliability and security in the deregulated environment, together with high-performance computing (HPC) power to carry out such analysis within a limited time. Further, with deregulation of the power industry, operational control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources, including nuclear, fossil and renewable resources, and many operational levels and layers, including control centers, power plants and transmission and distribution systems. These energy resources are managed by different organizations in the electricity market, and all participants (including producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as a collaborative task that demands cooperation among electricity market participants to ensure reliable energy supply.
Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties faced by the power industry. A grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. Grid computing technology therefore offers a feasible basis for the design and development of grid-based infrastructure for power system reliability and security analysis: it can provide a high-performance computing and collaborative environment and offer an optimal trade-off between cost and efficiency. While power system analysis is a vast topic, only a limited amount of research has so far investigated the applications of grid computing in power systems. This thesis investigates probability-based reliability and security analysis of complex power systems in order to develop new techniques that provide comprehensive results with high efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of a computing grid and special grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid computing based techniques are proposed for probabilistic load flow analysis, probabilistic small-signal analysis, probabilistic transient stability analysis and probabilistic contingency analysis of power systems. Moreover, a grid computing based system is designed and developed for the monitoring and control of distributed generation systems. As a part of this research, a detailed review is presented of possible applications of this technology in other aspects of power systems. It is proposed that these grid-based techniques will provide comprehensive results with high efficiency and ultimately enhance the existing computing capabilities of power companies in a cost-effective manner. As part of this research, a small-scale computing grid is developed, consisting of grid services for probabilistic reliability and security assessment techniques. A significant outcome of this research is improved performance, accuracy, and security of data sharing and collaboration. More importantly, grid-based computing will improve the capability of power system analysis in a deregulated environment, where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
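The probabilistic techniques described above are dominated by Monte Carlo style simulation, whose samples are independent and therefore parallelize naturally across a computing grid. The following minimal Python sketch is not taken from the thesis: the generator capacities, outage rate and load level are invented, and a local process pool stands in for grid middleware. It illustrates the pattern for a toy adequacy calculation, estimating loss-of-load probability by splitting samples across workers.

```python
# Minimal sketch (illustrative, not the thesis's code): Monte Carlo adequacy
# assessment parallelized across workers, showing why probabilistic reliability
# analysis maps naturally onto a computing grid. All parameters are hypothetical.
import random
from multiprocessing import Pool

GEN_CAPACITY_MW = [200, 200, 150, 100, 100]   # unit capacities (hypothetical)
FORCED_OUTAGE_RATE = 0.05                     # probability a unit is unavailable
PEAK_LOAD_MW = 600                            # system peak load (hypothetical)

def count_loss_of_load(n_samples: int, seed: int) -> int:
    """Count samples in which available generation falls below the load."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_samples):
        available = sum(c for c in GEN_CAPACITY_MW
                        if rng.random() > FORCED_OUTAGE_RATE)
        if available < PEAK_LOAD_MW:
            shortfalls += 1
    return shortfalls

if __name__ == "__main__":
    n_workers, samples_per_worker = 4, 250_000
    with Pool(n_workers) as pool:             # local stand-in for grid worker nodes
        counts = pool.starmap(count_loss_of_load,
                              [(samples_per_worker, s) for s in range(n_workers)])
    lolp = sum(counts) / (n_workers * samples_per_worker)
    print(f"Estimated loss-of-load probability: {lolp:.4f}")
```

Because each sample is independent, the same split-and-aggregate pattern scales from local cores to geographically distributed grid services, which is the efficiency argument the abstract makes.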
83

Bezpečnostní metriky platformy SAP / Security Metrics of SAP Platform

Třeštíková, Lenka January 2017 (has links)
The main goal of this thesis is to analyze potential security risks of the SAP NetWeaver platform and to identify various vulnerabilities that result from poor system configuration, incorrect segregation of duties or insufficient patch management. A methodology for platform evaluation, defined in terms of these vulnerabilities, security requirements and controls, will be created.
84

Utvärdering av den upplevda användbarheten hos CySeMoL och EAAT med hjälp av ramverk för ändamålet och ISO/IEC 25010:2011

Frost, Per January 2013 (has links)
This report describes a study aimed at uncovering flaws and finding potential improvements when the modelling tool EAAT is used in conjunction with the modelling language CySeMoL. The study was performed by developing a framework and applying it to CySeMoL and EAAT on networks in a real-life context. The framework was developed in order to increase the number of flaws uncovered as well as to gather potential improvements to both EAAT and CySeMoL. The basis of the framework is a modified version of the Quality in Use model from the ISO/IEC 25010:2011 standard. To the characteristics and sub-characteristics of this modified model, different values for measuring usability were attached. The purpose of these values is to measure usability from the perspectives of both creating and interpreting models. Furthermore, these values are based on several different sources on how to measure usability. The complete contents of the framework, and the underlying ideas upon which it is based, are presented in this report. The framework was designed so that it can be used universally with any modelling language in conjunction with a modelling tool. Its design is also not limited to the field of computer security and computer networks, although that is the intended context of CySeMoL as well as the context described in this report. However, utilization outside the intended area of usage will most likely require some modifications in order to work in a fully satisfying way. Several flaws were uncovered regarding the usability of CySeMoL and EAAT, and these are accompanied by several recommendations on how to improve both CySeMoL and EAAT. Because of the design of the framework, the most severe flaws have been identified and recommendations on how to rectify these shortcomings have been suggested.
85

Effects of Behavioral Decision-Making in Game-theoretic Frameworks for Security Resource Allocation in Networked Systems

Mustafa Abdallah (13150149) 26 July 2022 (has links)
Facing increasingly sophisticated attacks from external adversaries, interdependent systems owners have to judiciously allocate their (often limited) security budget in order to reduce their cyber risks. However, when modeling human decision-making, behavioral economics has shown that humans consistently deviate from classical models of decision-making. Most notably, prospect theory, for which Kahneman won the 2002 Nobel memorial prize in economics, argues that humans perceive gains, losses and probabilities in a skewed manner. While there is a rich literature on prospect theory in economics and psychology, most of the existing work studying the security of interdependent systems does not take into account the aforementioned biases.

In this thesis, we propose novel mathematical behavioral security game models for the study of human decision-making in interdependent systems modeled by directed attack graphs. We show that behavioral biases lead to suboptimal resource allocation patterns. We also analyze the outcomes of protecting multiple isolated assets with heterogeneous valuations via decision- and game-theoretic frameworks, including simultaneous and sequential games. We show that behavioral defenders over-invest in higher-valued assets compared to rational defenders. We then propose different learning-based techniques and adapt two different tax-based mechanisms for guiding behavioral decision-makers towards optimal security investment decisions. In particular, we show the outcomes of such learning and mechanisms on four realistic interdependent systems. In total, our research establishes rigorous frameworks to analyze the security of both large-scale interdependent systems and heterogeneous isolated assets managed by human decision makers, and provides new and important insights into security vulnerabilities that arise in such settings.
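The behavioral models discussed above rest on prospect theory's skewed perception of probabilities, and one standard way to encode that skew is a probability weighting function. The Python sketch below is purely illustrative: the Prelec weighting form, the exponential-decay success probability and every parameter are assumptions, not the thesis's exact model. It shows how a behavioral defender's perceived probability of a successful attack diverges from the true probability as security investment grows.

```python
# Illustrative sketch (not the thesis's exact model): prospect-theoretic
# probability weighting, which over-weights small probabilities and under-weights
# large ones, skewing how a "behavioral" defender perceives attack success.
import math

def prelec_weight(p: float, alpha: float = 0.6) -> float:
    """Prelec probability weighting; alpha < 1 gives the classic inverse-S shape."""
    if p in (0.0, 1.0):
        return p
    return math.exp(-(-math.log(p)) ** alpha)

def perceived_attack_probability(base_prob: float, investment: float,
                                 alpha: float = 0.6, decay: float = 1.0) -> float:
    """True success probability decays with security investment (hypothetical form);
    a behavioral defender then perceives it through the weighting function."""
    true_prob = base_prob * math.exp(-decay * investment)
    return prelec_weight(true_prob, alpha)

if __name__ == "__main__":
    for x in [0.0, 0.5, 1.0, 2.0]:
        true_p = 0.8 * math.exp(-x)
        print(f"investment={x:.1f}  true={true_p:.3f}  "
              f"perceived={perceived_attack_probability(0.8, x):.3f}")
```

Over-weighting small probabilities and under-weighting large ones is the kind of distortion that, applied to attack-graph edge probabilities, produces the suboptimal investment patterns the thesis analyzes.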
86

Advanced metering infrastructure reference model with automated cyber security analysis

Blom, Rikard January 2017 (has links)
The European Union has set a target to install nearly 200 million smart meters across Europe before 2020. This leads to a vast increase in the flow of sensitive information for Distribution System Operators (DSOs) and, at the same time, to raised cyber security threats. The DSO's incoming and outgoing information needs to be processed and stored by different Information Technology (IT) and Operational Technology (OT) systems, depending on the information. High demands are therefore placed on enterprise cyber security to protect the enterprise IT and OT systems. Sensitive customer information and a variety of services and functionality are examples of assets whose compromise could be fatal to a DSO. For instance, someone with bad intentions might tamper with your electricity while you are away on holiday; if they succeeded in shutting down the house electricity, the food stored in your fridge and freezer would most likely rot, and leaking defrost water could additionally cause severe damage to walls and floors. In this thesis, a detailed reference model of the advanced metering infrastructure (AMI) has been produced to support enterprises involved in the process of implementing a smart meter architecture and adapting to new requirements regarding cyber security. This has been conducted using foreseeti's tool securiCAD; foreseeti is a proactive cyber security company working with architecture management. SecuriCAD is a modelling tool that can conduct cyber security analysis, where the user can see how long it would take a professional penetration tester to penetrate the systems in the model, depending on the setup and defense attributes of the architecture. By varying the defense mechanisms of the systems, four scenarios have been defined and used to formulate recommendations based on calculations over the advanced metering architecture. Recommendations in brief: use small and distinct network zones with strict communication rules between them; apply diligent security arrangements to the system administrator's PC; and use an Intrusion Protection System (IPS) in the right fashion, which can delay the attacker by 46% or more. / The European Union has set a target of installing nearly 200 million smart electricity meters before 2020, spread across Europe. The rollout leads to a substantial increase in sensitive data flows for electricity distributors, and the interest in cyber attacks grows. Both incoming and outgoing information needs to be processed and stored on different IT and OT systems depending on the information. High IT security requirements apply to protecting, for example, sensitive customer information and the wide range of services and functions implemented in the systems. One example of an attack is if someone managed to gain control over electricity availability and were to shut off the electricity to households, which could lead, for instance, to serious moisture damage as a result of leakage from the freezer. In this thesis, a sufficiently detailed reference model for smart meter architecture has been developed to enable security analyses and to support companies in a potential implementation of a smart meter architecture. A tool called securiCAD, developed by foreseeti, has been used to model the architecture.
securiCAD is a modelling tool that uses advanced computational algorithms to calculate how long it would take a professional penetration tester to succeed in penetrating the various systems with different kinds of attacks, depending on the defense mechanisms and how the architecture is built. By varying the systems' defenses and processes, four scenarios have been defined. Based on the results of the four scenarios, recommendations have been produced. Recommendations in brief: use small and distinct network zones with clear rules, for example about which systems may communicate with each other and in which direction communication is allowed; apply careful security measures to the system administrator's computer; and use IPSs, since by placing and using IPSs in the right way attacks can be delayed by more than 46%, according to comparisons between the different scenarios.
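securiCAD's actual calculation engine is proprietary, so as a rough illustration of the kind of comparison the four scenarios make, the following Python sketch runs a toy Monte Carlo estimate of attacker time-to-compromise for a simple attack path with and without an IPS. The attack steps, the exponential effort distributions and the 46% delay factor applied here are all assumptions for illustration, not securiCAD's model.

```python
# Toy illustration (not securiCAD's actual model): Monte Carlo estimate of
# attacker time-to-compromise for a simple attack path, with and without an
# IPS step. All distributions and parameters are invented for illustration.
import random
import statistics

def time_to_compromise(rng: random.Random, ips_enabled: bool) -> float:
    """Sum exponentially distributed effort (days) over a simple attack path."""
    steps = {
        "phish_admin_pc": 2.0,      # mean effort in days (hypothetical)
        "pivot_to_ot_zone": 5.0,
        "reach_meter_headend": 3.0,
    }
    total = sum(rng.expovariate(1.0 / mean) for mean in steps.values())
    if ips_enabled:
        total *= 1.46               # IPS assumed to add >=46% delay, per the abstract
    return total

if __name__ == "__main__":
    rng = random.Random(1)
    base = [time_to_compromise(rng, ips_enabled=False) for _ in range(20_000)]
    ips = [time_to_compromise(rng, ips_enabled=True) for _ in range(20_000)]
    print(f"median days without IPS: {statistics.median(base):.1f}")
    print(f"median days with IPS:    {statistics.median(ips):.1f}")
```

Comparing such time-to-compromise distributions across architecture variants is the general shape of the scenario analysis the abstract describes.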
87

Portfolio Optimization under Value at Risk, Average Value at Risk and Limited Expected Loss Constraints

Gambrah, Priscilla S.N January 2014 (has links)
In this thesis we investigate portfolio optimization under Value at Risk, Average Value at Risk and Limited Expected Loss constraints in a framework where stocks follow a geometric Brownian motion. We solve the problem of minimizing Value at Risk and Average Value at Risk, and the problem of finding maximal expected wealth with Value at Risk, Average Value at Risk, Limited Expected Loss and Variance constraints. Furthermore, in a model where the stocks follow an exponential Ornstein-Uhlenbeck process, we examine portfolio selection under Value at Risk and Average Value at Risk constraints. In both the geometric Brownian motion (GBM) and exponential Ornstein-Uhlenbeck (O-U) models, the risk-reward criterion is employed and the optimal strategy is found. Secondly, the Value at Risk, Average Value at Risk and Variance are minimized subject to an expected return constraint. By running numerical experiments we illustrate the effect of Value at Risk, Average Value at Risk, Limited Expected Loss and Variance constraints on the optimal portfolios. Furthermore, in the exponential O-U model we study the effect of mean reversion on the optimal strategies. Lastly, we compare the leverage in a portfolio where the stocks follow a GBM model to that of a portfolio where the stocks follow the exponential O-U model. / Master of Science (MSc)
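As a simple numerical illustration of the two risk measures at the centre of this abstract (not the thesis's optimization itself), the Python sketch below simulates the terminal wealth of a fixed-weight portfolio under geometric Brownian motion and estimates Value at Risk and Average Value at Risk, the expected loss beyond the VaR level, from the simulated losses. The drifts, volatilities, weights and the assumption of independent stocks are all invented for the example.

```python
# Illustrative sketch (not the thesis's optimizer): Monte Carlo estimation of
# Value at Risk (VaR) and Average Value at Risk (AVaR / expected shortfall)
# for a fixed-weight GBM portfolio. All parameters are hypothetical.
import numpy as np

def simulate_losses(n_paths: int = 100_000, horizon: float = 1.0,
                    w0: float = 1.0, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    mu = np.array([0.08, 0.05])          # drifts (hypothetical)
    sigma = np.array([0.25, 0.15])       # volatilities (hypothetical)
    weights = np.array([0.6, 0.4])       # fixed portfolio weights
    z = rng.standard_normal((n_paths, 2))
    terminal = np.exp((mu - 0.5 * sigma**2) * horizon
                      + sigma * np.sqrt(horizon) * z)   # GBM terminal prices, S0 = 1
    wealth = w0 * terminal @ weights
    return w0 - wealth                   # loss = initial wealth minus terminal wealth

def var_avar(losses: np.ndarray, alpha: float = 0.95) -> tuple[float, float]:
    var = np.quantile(losses, alpha)                 # VaR_alpha: loss quantile
    avar = losses[losses >= var].mean()              # AVaR: mean loss beyond VaR
    return float(var), float(avar)

if __name__ == "__main__":
    losses = simulate_losses()
    var, avar = var_avar(losses, alpha=0.95)
    print(f"95% VaR:  {var:.3f}")
    print(f"95% AVaR: {avar:.3f}")
```

Constraining or minimizing these quantities over the portfolio weights is, in essence, the optimization problem the thesis studies analytically in the GBM and exponential O-U settings.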
88

Bezpečnostní analýza virtuální reality a její dopady / Security Analysis of Immersive Virtual Reality and Its Implications

Vondráček, Martin January 2019 (has links)
Virtual reality is currently used not only for entertainment but also for work and social interaction, where privacy and the confidentiality of information are a high priority. Unfortunately, the security measures applied by software vendors are often insufficient. This thesis presents an extensive security analysis of the popular virtual reality application Bigscreen, which has more than 500,000 users. Techniques of network traffic analysis, penetration testing, reverse engineering and even application crippling were employed. The research led to the discovery of critical vulnerabilities that directly violated users' privacy and allowed an attacker to take full control of a victim's computer. The security flaws found made it possible to distribute malware and to create a botnet using a computer worm spreading through virtual environments. A novel virtual reality cyber attack called Man-in-the-Room was devised. In addition, a security flaw in the Unity engine was discovered. Responsible disclosure of the discovered flaws helped mitigate the risks for more than half a million users of the Bigscreen application and for users of all affected Unity applications worldwide.
89

Informational Efficiency and the Reaction to Terrorism: A Financial Perspective

Roland, Nicholas 01 January 2016 (has links)
The purpose of this study is to measure the message terror organizations hope to convey, using the financial markets as a proxy, in order to identify patterns within the marketplace and to determine how the increased use of digital devices and the access to instantaneous news seen over the past decade have affected terrorists' ability to deliver a desired message. Using death count, geographic location, and event type, this study identified 109 attacks between 1985 and 2015 to be analyzed against 5 market indices and 5 securities. Effects were measured within a 10-day sample window around the time of each attack (5 days before to 5 days after) using average abnormal returns, standard deviation, the Sharpe ratio, and the initial market reaction as a percentage of total attacks. The effects on the average abnormal returns of the market proxies were measured at three levels: the entire sample period from 1985 to 2015; the first half of the sample period, 1985-1999; and the second half of the sample period, 2000-2015. Analyzing trends in abnormal returns and standard deviation, the results of the study were inconclusive.
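The measurement described above follows the standard event-study recipe: estimate a benchmark return model before each attack, then compute abnormal returns inside the event window and average them across events. The Python sketch below is illustrative only; the market-model benchmark, the 120-day estimation window and the simulated return series are assumptions, not the study's data or code.

```python
# Illustrative event-study sketch (not the study's actual data or code):
# average abnormal returns in a +/-5-day window around event dates, using a
# simple market-model benchmark estimated over a pre-event window.
import numpy as np

def abnormal_returns(security: np.ndarray, market: np.ndarray,
                     event_idx: int, est_len: int = 120, half_window: int = 5) -> np.ndarray:
    """Market-model abnormal returns AR_t = R_t - (alpha + beta * R_m,t)."""
    est = slice(event_idx - est_len - half_window, event_idx - half_window)
    beta, alpha = np.polyfit(market[est], security[est], 1)   # OLS on estimation window
    win = slice(event_idx - half_window, event_idx + half_window + 1)
    return security[win] - (alpha + beta * market[win])

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n_days, events = 500, [300, 350, 420]        # hypothetical trading days / event indices
    market = rng.normal(0.0004, 0.01, n_days)    # made-up daily market returns
    security = 0.0002 + 1.1 * market + rng.normal(0, 0.008, n_days)
    ars = np.array([abnormal_returns(security, market, e) for e in events])
    aar = ars.mean(axis=0)                       # average abnormal return per event day
    for offset, value in zip(range(-5, 6), aar):
        print(f"day {offset:+d}: AAR = {value:+.5f}")
```

Averaging these per-day abnormal returns across the 109 attacks, and comparing them across the two sub-periods, is the core of the analysis the abstract summarizes.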
