11

Entropický generátor náhodných čísel / Entropic generator of random numbers

Kolář, Michal January 2015 (has links)
This thesis focuses on generating random numbers via entropy generators. It describes sources of entropy on a personal computer and methods for estimating the entropy they yield. It also describes the basic operations used to construct a concept generator for a personal computer following NIST recommendations, and methods for testing it.
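As a concrete illustration of the kind of entropy estimation such a design calls for (a sketch in the spirit of NIST SP 800-90B, not code from the thesis), the most-common-value estimator gives a conservative per-sample min-entropy bound:

```python
import math
from collections import Counter

def mcv_min_entropy(samples):
    """Most-common-value min-entropy estimate (NIST SP 800-90B, sec. 6.3.1)."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # upper confidence bound on the probability of the most common value
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_u)          # bits of min-entropy per sample

# hypothetical raw samples: low nibble of interrupt-timing readings
samples = [t & 0xF for t in [137, 142, 139, 151, 144, 138, 149, 140] * 125]
print(f"{mcv_min_entropy(samples):.2f} bits/sample")
```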
12

Analysis of cyber security in smart grid systems

Masonganye, James January 2017 (has links)
Cyber security is a major concern due to global incidents of intrusion. Attacks on the electricity grid can have a significant impact, potentially collapsing the national economy, since banks, government security agencies, hospitals and telecommunication operators all depend on the electricity network. The purpose of this research is to investigate the various types of cyber security threats and the ICT technologies required for the safe operation of the smart grid, in order to protect against and mitigate the impact of cyber attacks. Cyber security is modelled on a Matlab/SimPowerSystem simulation of the City of Tshwane power system: the Eskom components used to produce energy and their interconnections to the City of Tshwane power distribution substations are simulated using Simulink SimPowerSystem. / Dissertation (MEng)--University of Pretoria, 2017. / Electrical, Electronic and Computer Engineering / MEng / Unrestricted
13

A quantitative measure of the security risk level of enterprise networks

Munir, Rashid, Pagna Disso, Jules F., Awan, Irfan U., Mufti, Muhammad R. January 2013 (has links)
Along with the tremendous expansion of information technology and networking, the number of malicious attacks that cause disruption to business processes has increased concurrently. Despite such attacks, the aim for network administrators is to enable these systems to continue delivering the services they are intended for. Currently, many research efforts are directed towards securing networks further, whereas little attention has been given to the quantification of network security, which involves assessing the vulnerability of these systems to attacks. In this paper, a method is devised to quantify the security level of IT networks. This is achieved by electronically scanning the network using a vulnerability scanning tool (Nexpose) to identify the vulnerability level at each node, classified according to the Common Vulnerability Scoring System standards (critical, severe and moderate). A probabilistic approach is then applied to calculate an overall security risk level for subnetworks and for the entire network. It is hoped that these metrics will be valuable for any network administrator seeking an absolute risk assessment value for the network. The suggested methodology has been applied to a computer network of an existing UK organization with 16 nodes and a switch.
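A minimal sketch of such a probabilistic roll-up (the severity-to-probability weights below are illustrative placeholders, not the paper's values):

```python
from math import prod

# illustrative exploitation probabilities per CVSS-derived severity class
WEIGHTS = {"critical": 0.9, "severe": 0.5, "moderate": 0.2}

def node_risk(vuln_counts):
    """P(node compromised) = 1 - P(no vulnerability exploited),
    assuming independent exploitation events."""
    return 1 - prod((1 - WEIGHTS[sev]) ** n for sev, n in vuln_counts.items())

def network_risk(nodes):
    """P(at least one node in the (sub)network is compromised)."""
    return 1 - prod(1 - node_risk(v) for v in nodes)

subnet = [{"critical": 1, "moderate": 3}, {"severe": 2}, {}]
print(f"subnet risk level: {network_risk(subnet):.3f}")
```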
14

Fair Comparison of ASIC Performance for SHA-3 Finalists

Zuo, Yongbo 22 June 2012 (has links)
In the last few decades, secure algorithms have played an irreplaceable role in the protection of private information, from applications of AES in modems to online bank transactions. The increasing deployment of secure algorithms in hardware has made ASIC benchmarking extremely important. Although all kinds of secure algorithms have been implemented in various devices, the effects of different constraints on ASIC implementation performance have never been explored before. To analyze these effects, the SHA-3 finalists (Blake, Groestl, Keccak, JH, and Skein) were chosen for implementation in the experiments of this thesis. The thesis first explores the effects of different synthesis constraints on ASIC performance, such as performance when constrained for frequency or for maximum area. It then tests the effects of choosing various standard-cell libraries, for instance comparing the performance of the UMC 130 nm and IBM 130 nm standard libraries. Additionally, the effects of different technology nodes are analyzed, namely the 65 nm, 90 nm, 130 nm and 180 nm UMC libraries. Finally, to further understand these effects, post-layout analysis is explored. While some algorithms remain unaffected by floorplan shape, others show a preference for a specific shape, such as JH, which shows a 12% increase in throughput/area with a 1:2 rectangle compared to a square. Throughout this thesis, the effects of different ASIC implementation factors are comprehensively explored, along with the methodology, metrics, and framework of the experiments. Detailed experimental results and analysis are discussed in the following chapters. / Master of Science
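The throughput/area figure of merit behind such comparisons is straightforward; a minimal sketch (the clock frequency and area numbers below are hypothetical, not results from the thesis):

```python
def throughput_mbps(block_bits, f_mhz, cycles_per_block):
    # one message block is digested every `cycles_per_block` clock cycles
    return block_bits * f_mhz / cycles_per_block

def throughput_per_area(block_bits, f_mhz, cycles_per_block, area_kge):
    # Mbit/s per kilo-gate-equivalent: the usual ASIC figure of merit
    return throughput_mbps(block_bits, f_mhz, cycles_per_block) / area_kge

# e.g. a Keccak-like core absorbing a 1088-bit block in 24 cycles at 500 MHz,
# occupying a hypothetical 48 kGE
print(f"{throughput_per_area(1088, 500, 24, 48.0):.1f} Mbit/s per kGE")
```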
15

Padronização do Y-90 pelo método CIEMAT/NIST em sistema de cintilação líquida e pelo método do traçador em sistema de coincidência 4πβ-γ / Standardization of Y-90 by CIEMAT/NIST method in scintillation counting system and by tracing method in 4πβ-γ coincidence system

Sales, Tatiane da Silva Nascimento 30 May 2014 (has links)
90Y has a half-life of 2.7 days, decaying 99.98% by beta emission to the ground state of 90Zr. In this work, two methodologies for the standardization of 90Y were applied. One was the tracing method performed in a 4πβ-γ coincidence system, measuring the pure beta emitter mixed with a beta-gamma emitter, which provides the beta detection efficiency. For this method the radionuclide 24Na, which decays with a half-life of 0.623 day by beta emission with an end-point energy of 1393 keV, followed by two gamma rays, was used as the tracer; the efficiency was obtained by selecting the 1369 keV total-energy absorption peak in the gamma channel. Known aliquots of the tracer, previously standardized by 4πβ-γ coincidence, were mixed with known aliquots of 90Y. The activity was calculated by means of a Software Coincidence System (SCS), using electronic discrimination to change the beta efficiency. The behavior of the extrapolation curve was predicted by means of the Esquema code, which uses the Monte Carlo technique. The other was the CIEMAT/NIST method developed for liquid scintillation counting (LSC) systems, for which a 3H standard solution was used. A TRICARB 2100TR system was used for the measurements; it operates with two photomultipliers in coincidence, and an external source placed near the measurement system determines the quenching parameter. Ultima Gold was the liquid scintillation cocktail, and the quenching-parameter curve was obtained with a nitromethane carrier solution. The radioactive samples were prepared in glass vials with low potassium concentration. The activities determined by the two methods were compared and agree within the experimental uncertainties. This work made it possible to evaluate the performance of the CIEMAT/NIST method, which presents several advantages over the tracer method, among them the ease of source preparation and simple, fast measurements without the need to determine extrapolation curves.
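The extrapolation step at the heart of the coincidence method can be sketched as follows (synthetic rates for a single nuclide, assumed corrected for background and dead time; the tracer bookkeeping required for a pure beta emitter is omitted):

```python
import numpy as np

# hypothetical rates (s^-1) at four electronic-discrimination settings
n_beta  = np.array([9125.0, 8740.0, 8230.0, 7610.0])   # beta channel
n_gamma = np.array([ 521.0,  520.0,  522.0,  519.0])   # gamma channel
n_coinc = np.array([ 471.0,  448.0,  421.0,  388.0])   # coincidences

eff   = n_coinc / n_gamma            # beta detection efficiency
ratio = n_beta * n_gamma / n_coinc   # tends to the activity as eff -> 1
x     = (1.0 - eff) / eff            # inefficiency parameter

slope, n0 = np.polyfit(x, ratio, 1)  # linear extrapolation to x = 0
print(f"extrapolated activity: {n0:.0f} Bq")
```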
17

Geração de números pseudo-aleatórios empregando mapas caóticos / Pseudo-random number generation using chaotic maps

ARTILES, José Antonio Pérez de Morales 26 February 2016 (has links)
Pseudo-random number generators are widely used in scientific and technological applications. In cryptography in particular, they are used in secret-key systems as keystream generators. In this work we present two methodologies for designing these generators from chaotic maps. The first is based on two techniques: sample skipping and time-varying coded discretization. We show that the proposed method has a higher bit-generation rate per chaotic sample than fixed-time coded discretization and dispenses with post-processing to improve its randomness properties. The other methodology uses m-sequences to eliminate the residual correlation of the coded sequence. Time-varying coded discretization has a well-defined correlation characteristic that is exploited by a new post-processing block using m-sequences, which requires less memory than the previous methodology. The effectiveness of the proposed methods is verified with the NIST test suite.
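A minimal sketch of the skipping idea (the logistic map and the fixed one-bit threshold coder here are illustrative simplifications; the thesis itself uses time-varying coded discretization):

```python
def chaotic_bits(seed, n_bits, skip=16, r=4.0):
    """Bits from a chaotic map, discarding `skip` iterates per output bit
    to reduce the residual correlation between successive samples."""
    x, bits = seed, []
    while len(bits) < n_bits:
        for _ in range(skip):
            x = r * x * (1.0 - x)          # logistic map iterate
        bits.append(1 if x >= 0.5 else 0)  # one-bit threshold coding
    return bits

print("".join(map(str, chaotic_bits(0.41, 64))))
```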
18

The Risk Assessment based on international standards, a credibility evaluation: A case study on international standards of Risk Assessment and Management in the Information Security context

Hedian, Daniel, Silva Neto, Gil January 2015 (has links)
Organizations face risks regardless of the type of industry or government. Historically, risks have arisen in many processes and been coped with differently by society. An appropriate application of risk management is widely acknowledged as one of the most critical aspects of undertaking business activities across all sectors of society, public and private. To help organizations carry out this activity as part of their culture, many standards have been developed at the international level. These standards provide the groundwork for entities to start implementing risk-management processes and to reduce the risks they face with a standardized set of procedures across sectors. Risk assessment, however, faces abundant arguments that cast doubt on the credibility of the standards implemented by different organizations, as no single method or definition is agreed upon across cultural and sectoral barriers; the credibility of the standardized assessment is therefore in doubt. This study aims to evaluate the credibility of standardized risk assessments, focusing on information security risk assessment standards, in particular ISO 27005 and NIST 800-30, in collaboration with the Swedish Armed Forces. The research adapts the frameworks available in the literature for evaluating the credibility of risk assessments to the international standardized assessment procedure. The credibility of the standards is evaluated against criteria divided into five categories considered applicable to the standardized risk assessment procedure, together with input from security professionals in organizations currently employing the standards and from academic experts in the field. The study follows a qualitative case-study approach. The two international standards perform similarly in the credibility evaluation; the only category in which NIST 800-30 performs significantly better concerns the final risk assessment results (the report). NIST provides a further step in the process, as well as guidelines and templates for developing different parts of the assessment process, including the report, which is considered a best practice of a standardized risk assessment. The findings contradict four criteria of the framework found in the literature: what can be learned from past risk assessments, the wide range of the required scope of a risk assessment, the relevance of disclosing the composition of the assessment group in the final risk assessment report, and the procedure for finding consensus among stakeholders. The research question "How credible are standardized risk assessments?" yields a holistic understanding of the credibility of the standards mentioned above, determining that they provide a solid framework for companies to start assessing risks in a regulated and standardized procedure. The standards, however, overlook the problems embedded in the subjectivity of a risk assessment and the ever-changing (intrinsic and extrinsic) aspects of stakeholder behaviour, lacking a systematic approach to these issues, including the proper handling of risk uncertainty and transparency in the final risk assessment report. The study provides groundwork for future research, as well as a grounded framework that entities using the standards can apply to reflect on their risk assessment procedures. Keywords: credibility, risk assessment, risk management, international standards, risk, information security, ISO 27005, NIST 800-30.
19

Fully Digital Chaotic Oscillators Applied to Pseudo Random Number Generation

Mansingka, Abhinav S. 05 1900 (has links)
This thesis presents a generalized approach for the fully digital design and implementation of chaos generators through the numerical solution of chaotic ordinary differential equations. In particular, the implementations use the Euler approximation with a fixed-point two's-complement number representation for optimal hardware and performance. In general, digital design enables significant benefits in power, area, throughput, reliability, repeatability and portability over analog implementations of chaos, owing to lower process, voltage and temperature sensitivities and easy compatibility with other digital systems such as microprocessors, digital signal processing units, communication systems and encryption systems. Furthermore, this thesis introduces the idea of implementing multidimensional chaotic systems rather than 1-D chaotic maps, enabling wider throughputs and multiplier-free architectures that provide significant performance and area benefits. The work focuses on the well-understood family of autonomous 3rd-order "jerk" chaotic systems. The effects of implementation precision, internal delay cycles and external delay cycles on the chaotic response are assessed. Multiplexing of parameters is implemented to enable switching between chaotic and periodic modes of operation. Enhanced chaos generators that exploit long-term divergence between two identical systems of different precision are also explored. Digital design is shown to enable real-time controllability of 1-D multiscroll systems and 4th-order hyperchaotic systems, essentially creating non-autonomous chaos that has thus far been difficult to implement in the analog domain. Seven different systems are mathematically assessed for chaotic properties, implemented at the register-transfer level in Verilog HDL and experimentally verified on a Xilinx Virtex 4 FPGA. The statistical properties of the output are rigorously studied using the NIST SP 800-22 statistical test suite. The output is adapted for pseudo-random number generation by truncating statistically defective bits. Finally, a novel post-processing technique using the Fibonacci series is proposed and implemented with a non-autonomous driven hyperchaotic system to provide pseudo-random number generators with high nonlinear complexity and controllable period length, enabling full utilization of all branches of the chaotic output as statistically secure pseudo-random output.
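A minimal software sketch of the core idea (Euler integration of a jerk system in fixed-point arithmetic); the concrete system, step size, word length and output bit tap below are illustrative assumptions, not the thesis's parameters:

```python
FRAC = 16                                  # Q16 fixed-point fraction bits
def fx(v): return int(round(v * (1 << FRAC)))
def fmul(a, b): return (a * b) >> FRAC     # fixed-point multiply

# Sprott-style jerk system  x''' = -A*x'' - x' + |x| - 1  with A = 0.6;
# Python ints are unbounded, so hardware two's-complement wrap is not modeled
A, ONE = fx(0.6), fx(1.0)
H = 7                                      # Euler step h = 2^-7: "*h" == ">>H"

def euler_step(x, y, z):
    """One Euler update of (x, x', x'') using shifts instead of multipliers."""
    dz = -fmul(A, z) - y + abs(x) - ONE
    return x + (y >> H), y + (z >> H), z + (dz >> H)

x, y, z = fx(0.1), 0, 0
bits = []
for _ in range(1 << 12):
    x, y, z = euler_step(x, y, z)
    bits.append((x >> 2) & 1)  # tap one bit; defective bits would be truncated
```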
20

Compliance & Standards - The Journey To Security

Boström, Johan January 2021 (has links)
We are in the age of Information Technology (IT), and amazing innovations are being developed. Management systems are now completely digitalized, which has enabled people to continue working remotely in the midst of a pandemic. With great innovations come those who seek to misuse or destroy systems for personal gain. IT & information security are therefore paramount for both organisations and products. To offer an international approach to common security practices and provide the best results for IT & information security, standards and frameworks exist. In this thesis, the general impact and value of these standard frameworks is evaluated and assessed from both an organisational and a vendor perspective. To answer the research questions, standards and supporting theory were analysed and interviews with security professionals were held. Standards provide organisational goals for developing well-functioning and resilient security. They also provide a common baseline between customers and vendors, minimising the need for tailoring products' security requirements. Furthermore, certification against a standard can increase confidence in the organisation or product and generate business value. While the benefits are many, standards offer a structure for how security can be built; an organisation still needs to understand and develop security adapted to itself. In addition to setting up a security framework and implementing controls, organisations need to create security assurance processes to continuously review and evaluate measures and so ascertain their security posture.
