611

A framework for the application of network telescope sensors in a global IP network

Irwin, Barry Vivian William January 2011
The use of network telescope systems has become increasingly popular amongst security researchers in recent years. This study provides a framework for the utilisation of the data such systems collect. The research is based on a primary dataset of 40 million events spanning 50 months, collected using a small (/24) passive network telescope located in African IP space. This research presents a number of differing ways in which the data can be analysed, ranging from low-level protocol-based analysis to higher-level analysis at the geopolitical and network-topology level. Anomalous traffic and illustrative anecdotes are explored in detail and highlighted. A discussion relating to the bogon traffic observed is also presented. Two novel visualisation tools are presented, which were developed to aid in the analysis of large network telescope datasets. The first is a three-dimensional visualisation tool which allows for live, near-real-time analysis, and the second is a two-dimensional fractal-based plotting scheme which allows plots of the entire IPv4 address space to be produced and manipulated. Using the techniques and tools developed for the analysis of this dataset, a detailed analysis of traffic recorded as destined for port 445/tcp is presented. This includes the evaluation of traffic surrounding the outbreak of the Conficker worm in November 2008. A number of metrics relating to the description and quantification of network telescope configuration and the resultant traffic captures are described, the use of which, it is hoped, will facilitate greater and easier collaboration among researchers utilising this network security technology. The research concludes with suggestions relating to other applications of the data and intelligence that can be extracted from network telescopes, and their use as part of an organisation's integrated network security systems.
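The abstract does not specify how the fractal plotting scheme lays out the IPv4 space, so the following is only a minimal sketch of one common approach, a Hilbert-curve layout that keeps numerically adjacent prefixes visually adjacent. The function names and the choice of one cell per /16 prefix are illustrative assumptions, not the thesis's actual tool.

```python
# Minimal sketch, not the thesis's tool: one common "fractal" layout for address-space
# plots is a Hilbert curve, which keeps numerically adjacent prefixes visually adjacent.
# The choice of one cell per /16 prefix (a 256 x 256 grid) is an illustrative assumption.
import ipaddress

def hilbert_d2xy(order, d):
    """Map index d on a Hilbert curve covering a 2^order x 2^order grid to (x, y)."""
    x = y = 0
    s = 1
    n = 1 << order
    while s < n:
        rx = 1 & (d // 2)
        ry = 1 & (d ^ rx)
        if ry == 0:                      # rotate/flip the quadrant when needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        d //= 4
        s *= 2
    return x, y

def prefix_cell(addr, order=8):
    """Place an address's /16 prefix on the Hilbert grid (one cell per /16)."""
    prefix16 = int(ipaddress.ip_address(addr)) >> 16
    return hilbert_d2xy(order, prefix16)

# Hypothetical telescope events: in a real plot each cell would be coloured by event count.
for src in ("192.0.2.1", "198.51.100.7", "203.0.113.99"):
    print(src, "->", prefix_cell(src))
```

In a real telescope plot each cell would typically be shaded by the number of events observed for addresses in that prefix, which is what makes scanning campaigns stand out as contiguous regions.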
612

Towards a unified fraud management and digital forensic framework for mobile applications

Bopape, Rudy Katlego 06 1900
Historically, progress in technology development has continually created new opportunities for criminal activities which, in turn, have triggered the need for the development of new security-sensitive systems. Organisations are now adopting mobile technologies for numerous applications to capitalise on the mobile revolution. They are able to increase their operational efficiency, responsiveness and competitiveness and, most importantly, can now meet new and growing customer demands. However, although mobile technologies and applications present many new opportunities, they also present challenges. Threats to mobile phone applications are constantly on the rise and therefore compel organisations to invest money and time, alongside other technical controls, in an attempt to protect these applications and avoid incurring losses. The computerisation of core activities (such as mobile banking in the banking industry) has effectively exposed organisations to a host of complex fraud challenges that they have to deal with in addition to their core business of providing services to their end consumers. Fraudsters are able to use mobile devices to remotely access enterprise applications and subsequently perform fraudulent transactions. When this occurs, it is important to investigate and manage the cause and findings effectively, as well as to prevent similar attacks in future. Unfortunately, clients and consumers of these organisations are often ignorant of the risks to their assets and the consequences of the compromises that might occur. Organisations are therefore obliged, at the very least, to put in place measures that will not only minimise fraud but also be capable of detecting and preventing further similar incidents. The goal of this research was to develop a unified fraud management and digital forensic framework to improve the security of Information Technology (IT) processes and operations in organisations that make mobile phone applications available to their clients for business purposes. The research was motivated not only by the increasing reliance of organisations on mobile applications to service their customers but also by the fact that digital forensics and fraud management are often considered to be separate entities at an organisational level. This study proposes a unified approach to fraud management and digital forensic analysis to simultaneously manage and investigate fraud that occurs through the use of mobile phone applications. The unified Fraud Management and Digital Forensic (FMDF) framework is designed to (a) determine the degree to which transactions are suspicious and (b) simultaneously feed into a process that facilitates the investigation of incidents. A survey was conducted with subject matter experts in the banking environment. Data was generated through a participatory, self-administered online questionnaire, and the collected data was then presented, analysed and interpreted quantitatively and qualitatively. The study found a general understanding, and use, of common fraud management methodologies and approaches throughout the banking industry. However, while many of the respondents indicated that fraud detection was an integral part of their processes, they took a rather reactive approach to fraud management and digital forensics. Part of the reason for this reactive approach is that many investigations are conducted in silos, with no central knowledge repository where previous cases can be retrieved for comparative purposes. Confidentiality, integrity and availability of data are therefore critical for continued business operations. To mitigate these risks, the study proposed a new way of thinking that combines the components of fraud management and digital forensics into an optimised approach to managing security in mobile applications. The research concluded that the unified FMDF approach was considered helpful and valuable by the professionals who participated in the survey. Although the case study focused on the banking industry, the findings can also inform other types of organisations that make mobile applications available to their clients about fraud risk awareness and risk management in general. / Computing / M. Sc. (Computing)
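The abstract does not detail how the FMDF framework scores suspicion or preserves evidence, so the sketch below is purely illustrative: a toy rule-based scorer paired with a hash-chained forensic record. The rules, thresholds, field names and chaining scheme are assumptions, not the framework's actual design.

```python
# Illustrative only: the FMDF framework's internals are not specified in the abstract.
# This sketch pairs a toy rule-based suspicion score with a hash-chained forensic
# record; the rules, thresholds, field names and chaining scheme are all assumptions.
import hashlib, json, time
from dataclasses import dataclass, asdict

@dataclass
class Transaction:
    account: str
    amount: float
    device_id: str
    country: str

def suspicion_score(tx, usual_country, daily_limit):
    """Toy score in [0, 1]; real fraud engines use far richer features and models."""
    score = 0.0
    if tx.country != usual_country:
        score += 0.5                     # transaction from an unfamiliar country
    if tx.amount > daily_limit:
        score += 0.4                     # unusually large amount
    return min(score, 1.0)

def forensic_record(tx, score, prev_hash):
    """Chain each record to the previous one so later tampering is detectable."""
    body = {"tx": asdict(tx), "score": score, "ts": time.time(), "prev": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

tx = Transaction("ACC-001", 25_000.0, "dev-9f3", "US")
score = suspicion_score(tx, usual_country="ZA", daily_limit=10_000.0)
record = forensic_record(tx, score, prev_hash="GENESIS")
print(score, record["hash"][:16])
```

The pairing is the point the study argues for: the same event that raises a fraud alert also leaves an investigable, tamper-evident artefact rather than being handled in a silo.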
613

Medida do Kerma ar na superfície de entrada em tomografia computadorizada odontológica / Measurement of the entrance surface air kerma in dental computed tomography

Legnani, Adriano 26 August 2011
O presente trabalho tem por objetivo mensurar o Kerma Ar na Superfície de Entrada, Ka,e, em Tomografia Computadorizada Odontológica (TCO), também conhecida como CBCT (Cone Beam Computed Tomography). A coleta de dados foi realizada em seis clínicas de radiodiagnóstico odontológico. Destas, cinco possuem o aparelho de TCO e uma possui o aparelho de raios X panorâmico. O arranjo experimental é composto de um simulador humanóide preenchido com água. Este simulador tem dimensões semelhantes às de uma criança, com o diâmetro lateral da cabeça igual a 14,5 cm. O Ka,e foi medido com dosímetros termoluminescentes sobre a superfície do simulador. As localizações dos dosímetros correspondem aos olhos e glândulas salivares (parótida). A técnica empregada engloba toda a região da cabeça, como um exame craniofacial. A configuração de FOV (Field of View) estendido oferece os maiores valores de dose. Os parâmetros técnicos (kVp e mAs) e os diferentes modos de execução do FOV influenciam os resultados da dose. Além disso, há a diferença entre os resultados de três tomógrafos do mesmo modelo. Há também a falta de simetria na distribuição do Ka,e entre as duas faces do simulador. O tomógrafo de menor Ka,e, neste trabalho, pertence à mesma marca daqueles com os menores resultados descritos na literatura. / The aim of this work is to measure the entrance surface air kerma, Ka,e, in dental computed tomography, also known as CBCT (cone beam computed tomography). Data collection was conducted in six dental radiology clinics: five of them have a dental CBCT scanner and one has a panoramic X-ray unit. The experimental setup consists of a humanoid phantom filled with water, with dimensions similar to those of a child and a lateral head diameter of 14.5 cm. The Ka,e was measured with thermoluminescent dosimeters placed on the surface of the phantom, at locations corresponding to the eyes and the salivary (parotid) glands. The technique used covers the whole head region, like a craniofacial examination. The extended FOV (field of view) configuration yields the highest Ka,e values. The technical parameters (kVp and mAs) and the different FOV modes influence the Ka,e results. Furthermore, there were differences between the results of three scanners of the same model, and a lack of symmetry in the Ka,e distribution between the two sides of the phantom. The CBCT scanner with the lowest Ka,e in this work is of the same brand as those with the lowest doses reported in the literature.
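The abstract does not describe the calibration procedure, so the following is only a minimal sketch of how a TLD reading is typically converted into an entrance surface air kerma value; the background subtraction, the calibration coefficient and all numbers are assumptions for illustration.

```python
# Minimal sketch only: the thesis's calibration procedure is not given in the abstract,
# so the background subtraction, the calibration coefficient and all numbers below are
# illustrative assumptions about how a TLD reading becomes an air-kerma value.
def entrance_surface_air_kerma(readings_nC, background_nC, cal_factor_mGy_per_nC):
    """Convert net TLD readings (nC) to entrance surface air kerma Ka,e in mGy."""
    net = [r - background_nC for r in readings_nC]
    mean_net = sum(net) / len(net)
    return mean_net * cal_factor_mGy_per_nC

# Example with made-up numbers: three dosimeters placed over the parotid region.
print(entrance_surface_air_kerma([12.4, 12.9, 12.1],
                                 background_nC=0.3,
                                 cal_factor_mGy_per_nC=0.35))
```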
614

Avaliação da dose de radiação ocupacional em medicina nuclear nos exames de cintilografia de perfusão miocárdica / Evaluation of occupational radiation dose in nuclear medicine during myocardial perfusion scintigraphy exams

Komatsu, Cássio Vilela 29 November 2013
Em medicina nuclear, os trabalhadores diretamente envolvidos nos exames são frequentemente expostos à radiação ionizante. Neste estudo, utilizou-se um detector Geiger-Mueller (GM) para medir as doses da radiação ocupacional durante a realização de algumas das etapas mais críticas para a exposição à radiação em exames de cintilografia de perfusão miocárdica (CPM), são elas: 1) fracionamento das atividades no preparo das seringas; 2) administração do radiofármaco Tecnécio-99m-sestamibi nas etapas de repouso e estresse; e 3) aquisição das imagens diagnósticas na sala de exames. Na avaliação, procurou-se discriminar e relacionar o tempo de experiência profissional às doses medidas. Para isso, foi acompanhado um total de 494 procedimentos entre os meses de outubro e dezembro de 2012, sendo 229 seringas preparadas no fracionamento das atividades, 165 administrações de radiofármaco (55 na etapa de repouso realizadas por profissionais com tempo de experiência superior a 2 anos, 55 na etapa de repouso realizadas por profissionais com tempo de experiência inferior a 1 ano, e 55 na etapa de estresse), e 100 aquisições de imagem (50 na etapa de repouso e 50 na etapa de estresse). Foram avaliados também os registros das doses obtidas na monitoração individual por dosimetria termoluminescente (TLD), realizada entre julho de 2010 e dezembro de 2012. Os resultados obtidos com o detector GM, quando extrapolados para o acúmulo de doses no período de um ano, mostraram-se significantes em relação ao limite anual de 20 mSv determinado pela legislação brasileira para uma média em cinco anos consecutivos. As doses médias acumuladas nos procedimentos avaliados corresponderam aos seguintes percentuais em relação a esse limite: 1) 13%, no fracionamento das atividades; 2) 8% e 35%, na administração dos radiofármacos das etapas de repouso e estresse, respectivamente; e 3) 4% e 10%, na aquisição das imagens das etapas de repouso e estresse, respectivamente. Esses valores foram compatíveis com os resultados da monitoração individual por TLD, cujos valores registrados foram superiores (34,6% a 63,2% do limite de 20 mSv) pelo fato de não discriminar as doses em cada procedimento. Em virtude dos valores de dose encontrados, o uso de equipamentos de proteção individual e a agilidade na realização dos procedimentos, ligada à experiência profissional, contribuem de forma efetiva para a redução destes valores de dose. / In nuclear medicine, workers directly involved in the exams are frequently exposed to ionizing radiation. In this study, a Geiger-Mueller (GM) detector was used to measure occupational radiation doses during some of the steps most critical for radiation exposure in myocardial perfusion scintigraphy exams, namely: 1) fractionation of the radiopharmaceutical activities when preparing the syringes; 2) administration of the radiopharmaceutical Technetium-99m-sestamibi during the rest and stress steps; and 3) acquisition of the diagnostic images in the exam room. The evaluation sought to distinguish the procedures and relate the length of professional experience to the measured doses. To this end, a total of 494 procedures were followed between October and December 2012: 229 syringe preparations (activity fractionation); 165 radiopharmaceutical administrations (55 in the rest step performed by professionals with more than two years of experience, 55 in the rest step performed by professionals with less than one year of experience, and 55 in the stress step); and 100 image acquisitions (50 in the rest step and 50 in the stress step). Dose records obtained through individual monitoring by thermoluminescent dosimetry (TLD), conducted between July 2010 and December 2012, were also evaluated. The results obtained with the GM detector, when extrapolated to the dose accumulated over one year, proved significant in relation to the annual limit of 20 mSv established by Brazilian regulations as an average over five consecutive years. The mean accumulated doses in the evaluated procedures corresponded to the following percentages of that limit: 1) 13% for the fractionation of activities; 2) 8% and 35% for radiopharmaceutical administration in the rest and stress steps, respectively; and 3) 4% and 10% for image acquisition in the rest and stress steps, respectively. These values were consistent with the results of the individual TLD monitoring, whose recorded values were higher (34.6% to 63.2% of the 20 mSv limit) because they do not discriminate the dose received in each procedure. Given the dose values found, the use of personal protective equipment and speed in performing the procedures, which comes with professional experience, contribute effectively to reducing these doses.
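As a rough illustration of the extrapolation described above (the abstract reports only the resulting percentages), the annual fraction of the 20 mSv limit can be estimated as below; the per-procedure dose and weekly workload are made-up numbers.

```python
# Back-of-the-envelope version of the extrapolation described above. The abstract
# reports only the resulting percentages, so the per-procedure dose and the weekly
# workload used here are made-up numbers for illustration.
ANNUAL_LIMIT_mSv = 20.0   # Brazilian occupational limit, averaged over five consecutive years

def annual_fraction(dose_per_procedure_uSv, procedures_per_week, weeks_per_year=48):
    """Fraction of the annual limit reached if this procedure is repeated all year."""
    annual_mSv = dose_per_procedure_uSv * procedures_per_week * weeks_per_year / 1000.0
    return annual_mSv / ANNUAL_LIMIT_mSv

# e.g. a hypothetical stress-step administration of ~1.5 uSv done 20 times per week
print(f"{annual_fraction(1.5, 20):.0%} of the 20 mSv annual limit")
```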
617

Analýza bezpečnosti práce a pracovní úrazovosti v oblasti pozemní dopravy a manipulace s materiálem. / Analysis of work safety and work injuries in the field of land transport and material handling.

KAINZ, Aleš January 2010
This diploma thesis describes the issues of safety and occupational accidents in agriculture, looking in particular at the handling of agricultural products. An accident at work results from an aggregation of several interacting factors, and these factors are considered the main sources and causes of occupational accidents. The most important element in protecting health and safety is prevention. The obligation to assess risk is one of the fundamental principles of a preventive occupational health and safety policy, which is enforced in all countries that declare the principles of a safe company. The basic need to protect the health of individuals is enshrined in the Constitution of the Czech Republic. The primary objective of this work is to analyse the factors involved in causing accidents at work and, on the basis of this analysis, to formulate rules and recommendations for farms that can support them in eliminating such accidents.
618

A model to enhance the perceived trustworthiness of Eastern Cape essential oil producers selling through electronic marketplaces

Gcora, Nozibele January 2016
Eastern Cape Province farmers in the natural essential oils industry have yet to fully realise the use of electronic commerce (e-commerce) platforms, such as electronic marketplaces (e-marketplaces), for business purposes. This is due to issues associated with e-marketplaces, including lack of awareness, poor product quality, untrusted payment gateways and unsuccessful delivery. As a result, farmers do not trust e-marketplaces and therefore hesitate to engage in them for business purposes. This is further complicated by natural essential oil buyers' tendency to prefer face-to-face interaction with a supplier rather than online interaction, as they need quality assurance. As such, this research proposes a model to enhance the perceived trustworthiness of natural essential oil producers in the Eastern Cape Province selling through e-marketplaces. The model comprises the factors that could be considered in assisting essential oil producers to create a perception of trustworthiness among buyers in e-marketplaces. These factors were evaluated amongst five organisations involved in the production, retail or processing of essential oils using a multiple-case study methodology. The multiple-case study was conducted within the interpretivist paradigm and considered five cases, with interviews, document analysis and observations used for data collection. Data analysis was done using within-case analysis followed by cross-case analysis to establish factors of trust. The essential oil producers based in the Gauteng, KwaZulu-Natal and Western Cape provinces were cases that had been successfully using e-marketplaces for a notable period of time. Accordingly, the factors that contributed to their successful use of e-marketplaces informed the proposed model of this research. The model proposes that the perceived trustworthiness of enterprises in e-marketplaces can be achieved by following the uncertainty reduction stages (entry, personal and exit) and applying uncertainty reduction strategies (passive, active and interactive).
619

Framework de kernel para um sistema de segurança imunológica / A kernel framework for an immunological security system

Carbone, Martim d'Orey Posser de Andrade 23 June 2006
Advisor: Paulo Licio de Geus / Master's dissertation - Universidade Estadual de Campinas, Instituto de Computação / Resumo: O crescimento alarmante da quantidade e da sofisticação dos ataques aos quais estão sujeitos os sistemas computacionais modernos traz à tona a necessidade por novos sistemas de segurança mais eficientes. Na natureza, há um sistema biológico que realiza esta tarefa com notável eficácia: o sistema imunológico humano. Este sistema é capaz de garantir a sobrevivência de um ser humano por décadas, além de ser capaz de aprender sobre novas ameaças e criar defesas para combatê-las. Sua eficácia, somada à semelhança entre o cenário da segurança biológica e o da segurança computacional, motivou a criação do projeto Imuno, cujo objetivo é a construção de um sistema de segurança para computadores baseado nos princípios do sistema imunológico humano. Após o estudo inicial, a modelagem conceitual do sistema e a implementação de protótipos restritos de certas funcionalidades do sistema Imuno, este trabalho tem como objetivo avançar rumo à construção de um sistema de segurança imunológico completo, de escopo geral. Para isso, torna-se necessária a implementação de uma framework em nível de sistema operacional, que suporte as funcionalidades relacionadas à prevenção, detecção e resposta que serão utilizadas por este sistema de segurança. Projetada para o kernel Linux 2.6, esta framework é composta por algumas frameworks pré-existentes, como Linux Security Modules (LSM), Netfilter, Class-based Kernel Resource Management (CKRM), BSD Secure Levels (SEClvl) e UndoFS, ajustadas de acordo com os requisitos levantados para a framework, e somadas a uma nova arquitetura de ganchos multifuncionais. Esta arquitetura expande a infraestrutura nativa dos ganchos LSM, tornando-os flexíveis e genéricos o bastante para serem utilizados com outras funcionalidades de segurança além de controle de acesso, como detecção e resposta, além de poderem ser controlados do espaço de usuário em tempo real. Um protótipo foi implementado para a versão 2.6.12 do Linux e submetido a testes, visando avaliar tanto o impacto de desempenho gerado como também o seu comportamento em um cenário de ataque simulado. Os resultados destes testes são expostos no final deste trabalho, junto com as conclusões gerais sobre o projeto e propostas de extensão. / Abstract: The alarming growth in the quantity and the sophistication of the attacks that threaten modern computer systems shows the need for new, more efficient security systems. In nature, there is a biological system that accomplishes this task with remarkable efficiency: the human immune system. Not only is this system capable of assuring the survival of a human being for decades; it is also capable of learning about new threats and creating defenses to fight them. Its efficiency, combined with the similarity between the biological and the computer security problems, has motivated the creation of the Imuno project, whose goal is the construction of a computer security system based on the principles of the human immune system. After the initial studies, the system's conceptual modeling and the implementation of prototypes of certain Imuno functionalities, this project's goal is to advance towards the construction of a complete, general-scope immune security system. In order to accomplish that, it is necessary to implement an operating-system-level framework that supports the prevention, detection and response security functionalities to be used by such a system. Designed for the 2.6 Linux kernel, this framework is composed of several pre-existing frameworks, such as Linux Security Modules (LSM), Netfilter, Class-based Kernel Resource Management (CKRM), BSD Secure Levels (SEClvl) and UndoFS, adjusted according to the framework requirements and supplemented by a new multifunctional hook architecture. This architecture expands LSM's native hook infrastructure, making the hooks flexible and generic enough to be used by security functionalities beyond access control, such as detection and response, and capable of being controlled from userspace in real time. A prototype has been implemented for Linux version 2.6.12 and submitted to various tests, aiming to evaluate the performance overhead it creates and its behavior in a simulated attack situation. These tests' results are shown at the end of this document, along with a general conclusion about the project and extension proposals. / Master's / Master in Computer Science
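No kernel code is reproduced here; the snippet below is only a user-space analogy of the multifunctional hook idea the abstract describes, where a single hook site dispatches to access-control, detection and response handlers. The class, method and hook names are invented for illustration and do not correspond to the framework's actual kernel API.

```python
# User-space analogy of a multifunctional hook site (the real framework extends the
# LSM hook infrastructure inside the Linux 2.6 kernel; nothing here is kernel code).
from collections import defaultdict

class HookDispatcher:
    def __init__(self):
        self.handlers = defaultdict(list)    # hook name -> [(kind, callback), ...]

    def register(self, hook, kind, callback):
        """kind is one of 'access_control', 'detection' or 'response'."""
        self.handlers[hook].append((kind, callback))

    def dispatch(self, hook, **event):
        allowed = True
        for kind, callback in self.handlers[hook]:
            verdict = callback(event)
            if kind == "access_control" and verdict is False:
                allowed = False              # deny, but still let detectors/responders run
        return allowed

hooks = HookDispatcher()
hooks.register("file_open", "access_control",
               lambda e: not e["path"].startswith("/etc/shadow"))
hooks.register("file_open", "detection",
               lambda e: print("observed open:", e["path"]))
print(hooks.dispatch("file_open", path="/etc/passwd"))   # logs the event, returns True
```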
620

Gerenciamento baseado em modelos da configuração de sistemas de segurança em ambientes de redes complexos / Model-based configuration management of security systems in complex network environments

Pereira, João Porto de Albuquerque 24 May 2006
Advisor: Paulo Licio de Geus / Doctoral thesis - Universidade Estadual de Campinas, Instituto de Computação / Resumo: Os mecanismos de segurança empregados em ambientes de redes atuais têm complexidade crescente e o gerenciamento de suas configurações adquire um papel fundamental para proteção desses ambientes. Particularmente em redes de computadores de larga escala, os administradores de segurança se vêem confrontados com o desafio de projetar, implementar, manter e monitorar um elevado número de mecanismos, os quais possuem sintaxes de configuração heterogêneas e complicadas. Uma conseqüência dessa situação é que erros de configuração são causas freqüentes de vulnerabilidades de segurança. O presente trabalho oferece uma sistemática para o gerenciamento da configuração de sistemas de segurança de redes que corresponde especialmente às necessidades dos ambientes complexos encontrados em organizações atuais. A abordagem, construída segundo o paradigma de Gerenciamento Baseado em Modelos, inclui uma técnica de modelagem que trata uniformemente diferentes tipos de mecanismos e permite que o projeto de suas configurações seja executado de forma modular, mediante um modelo orientado a objetos. Esse modelo é segmentado em Subsistemas Abstratos, os quais encerram um grupo de mecanismos de segurança e outras entidades relevantes do sistema, incluindo seus diferentes tipos de mecanismo e as inter-relações recíprocas entre eles. Uma ferramenta de software apóia a abordagem, oferecendo um diagrama para edição de modelos que inclui técnicas de visualização de foco e contexto. Essas técnicas são particularmente adaptadas para cenários de larga escala, possibilitando ao usuário a especificação de certa parte do sistema sem perder de vista o contexto maior no qual essa parte se encaixa. Após a conclusão da modelagem, a ferramenta deriva automaticamente parâmetros de configuração para cada mecanismo de segurança do sistema, em um processo denominado refinamento de políticas. Os principais resultados deste trabalho podem ser sumarizados nos seguintes pontos: (i) uma técnica de modelagem uniforme e escalável para o gerenciamento de sistemas de segurança em ambientes complexos e de larga escala; (ii) um processo para o projeto de configurações apoiado por uma ferramenta que inclui técnicas de foco e contexto para melhor visualização e manipulação de grandes modelos; (iii) uma abordagem formal para a validação do processo de refinamento de políticas. / Abstract: The security mechanisms employed in current networked environments are increasingly complex, and their configuration management has an important role in the protection of these environments. Especially in large-scale networks, security administrators are faced with the challenge of designing, deploying, maintaining and monitoring a huge number of mechanisms, most of which have complicated and heterogeneous configuration syntaxes. Consequently, configuration errors are nowadays a frequent cause of security vulnerabilities. This work offers an approach to the configuration management of network security systems especially suited to the needs of the complex environments of today's organizations. The approach relies upon the Model-Based Management (MBM) paradigm and includes a modelling framework that allows the design of security systems to be performed in a modular fashion, by means of an object-oriented model. This model is segmented into logical units (so-called Abstract Subsystems) that enclose a group of security mechanisms and other relevant system entities, offering a more abstract representation of them. In this manner, the administrator is able to design a security system, including its different mechanism types and their mutual relations, by means of an abstract and uniform modelling technique. A software tool supports the approach, offering a diagram editor for models which includes focus-and-context visualization techniques. These techniques are particularly suitable for large-scale scenarios, enabling a designer to precisely specify a given part of the system without losing the picture of the context to which this part belongs. After the model is complete, the tool automatically derives configuration parameters for each security mechanism in the system, in a process called policy refinement. The major results of this work can be summarised as follows: (i) the definition of a uniform and scalable object-oriented modelling framework for the configuration management of large, complex network security systems; (ii) the development of a configuration design process assisted by a tool that implements focus-and-context techniques to improve the visualization and manipulation of large models; (iii) a formal approach to the validation of the policy refinement process. / Doctorate / Doctor in Computer Science
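Policy refinement, as described above, turns the abstract model into concrete mechanism configurations. The toy sketch below only gives a flavour of that step: a tiny object model of services and permitted flows is flattened into iptables-style filter rules. The classes, the rule template and the default-deny backstop are illustrative assumptions, not the actual tool's model or output.

```python
# Toy policy refinement in the spirit of Model-Based Management: an abstract model of
# permitted flows is flattened into packet-filter rules (illustrative only).
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    proto: str
    port: int

@dataclass
class Flow:                 # "hosts in src_net may reach dst_host via this service"
    src_net: str
    dst_host: str
    service: Service

def refine(flows):
    """Derive iptables-style filter rules from the abstract flow model."""
    rules = [
        f"iptables -A FORWARD -p {f.service.proto} -s {f.src_net} "
        f"-d {f.dst_host} --dport {f.service.port} -j ACCEPT"
        for f in flows
    ]
    rules.append("iptables -P FORWARD DROP")    # default-deny backstop
    return rules

https_service = Service("https", "tcp", 443)
print("\n".join(refine([Flow("10.0.0.0/24", "10.0.1.5", https_service)])))
```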
