491 |
[en] CHARACTERIZATION OF ACOUSTIC EMISSION (AE) SIGNAL GRAPH FOR EVALUATION OF STRUCTURAL INTEGRITY OF STEEL ASTM A 516 / A 516 M - GRADE 60 / [pt] CARACTERIZAÇÃO DO SINAL GRÁFICO DE EMISSÃO ACÚSTICA (EA) PARA AVALIAÇÃO DE INTEGRIDADE ESTRUTURAL DO AÇO ASTM A 516 / A 516 M - GRAU 60
MARCO ANTONIO COSTA DA SILVA, 21 June 2011 (has links)
[pt] A manutenção não é mais vista apenas como uma geradora de custos para as
empresas. Cresce a necessidade das empresas em redução de custos, aumento de
disponibilidade e confiabilidade dos equipamentos e instalações, e é neste sentido que a
manutenção tem se destacado, cada vez mais, como uma área fundamental para o sucesso
das empresas que têm forte base industrial. No entanto, a manutenção teve uma grande
evolução tecnológica, principalmente dentro dos conceitos da manutenção preditiva, onde
se busca o desenvolvimento de novas técnicas para avaliação de integridade dos
equipamentos, na sua condição de operação. A emissão acústica é uma destas técnicas. No
presente trabalho utilizou-se o ensaio de tração para gerar estímulo estrutural em corpos
de prova de aço ASTM A 516 / A 516 M - Grau 60 e gerar sinais de emissão acústica.
Durante os níveis de carregamentos em patamares previamente estabelecidos no
procedimento experimental, os corpos de prova foram monitorados através do ensaio de
emissão acústica, para a avaliação de sua integridade estrutural. Os resultados foram
então digitalizados, armazenados e processados para posterior análise. Concluiu-se que é
possível utilizar o ensaio de emissão acústica para avaliação de integridade estrutural dos
equipamentos em serviço, sendo o mesmo capaz de perceber, detectar e localizar fontes
ativas que possam comprometer a integridade estrutural do equipamento. Desta forma é
possível programar uma intervenção da equipe de manutenção no equipamento, sem
atingir um nível de danos que possam provocar sua parada e, conseqüentemente, do
processo produtivo. / [en] Maintenance is no longer seen by companies around the world merely as a generator of costs. As its importance has grown, new investments in technologies and tools have been made. Companies increasingly need to reduce costs and to improve the availability and reliability of their equipment and installations, and in this respect maintenance has stood out, more and more, as an area that is fundamental to the success of companies with a strong industrial base. Maintenance has also undergone major technological development, mainly within the concepts of predictive maintenance, a field that continually pursues new technologies, methodologies and techniques for evaluating the integrity of equipment under operating conditions. In this dissertation, tensile tests were therefore carried out on several specimens of ASTM A 516 / A 516 M - Grade 60 steel, using the tensile loading to stimulate the specimens structurally and, consequently, to generate acoustic emission signals. The specimens were monitored by acoustic emission testing at each previously established loading step in order to evaluate their structural integrity. The results were then digitized, stored and processed in the acoustic emission equipment for further analysis; all results are presented graphically and were studied, analyzed and compared. It can be concluded that acoustic emission testing can be used to evaluate the structural integrity of equipment in service: it is able to detect and locate active sources that may compromise the structural integrity of the equipment, and it therefore makes it possible to schedule a maintenance intervention before the damage reaches a level that would cause the equipment, and consequently the production process, to shut down.
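As a simple illustration of how an acoustic emission system can locate an active source (a textbook one-dimensional example with assumed numbers, not the instrumentation or geometry used in this work), the sketch below places the source along a line between two sensors from the known sensor spacing, the wave velocity in the steel, and the measured difference in arrival times.

```python
def locate_ae_source(sensor_spacing_m: float, wave_velocity_mps: float,
                     arrival_time_diff_s: float) -> float:
    """One-dimensional (linear) AE source location.

    arrival_time_diff_s is t1 - t2: the arrival time at sensor 1 minus the
    arrival time at sensor 2. Sensors sit at positions 0 and sensor_spacing_m;
    the return value is the estimated source position measured from sensor 1.
    """
    return (sensor_spacing_m + wave_velocity_mps * arrival_time_diff_s) / 2.0

# Hypothetical numbers for illustration: sensors 0.5 m apart on the specimen,
# a longitudinal wave velocity of roughly 5900 m/s in carbon steel, and a hit
# reaching sensor 1 twenty microseconds before sensor 2.
x = locate_ae_source(0.5, 5900.0, -20e-6)
print(f"estimated source position: {x:.3f} m from sensor 1")  # about 0.191 m
```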
492 |
Protection des systèmes informatiques vis-à-vis des malveillances : un hyperviseur de sécurité assisté par le matériel / Protection of computer systems against malicious attacks: a hardware-assisted security hypervisor
Morgan, Benoît, 06 December 2016 (has links)
L'utilisation des systèmes informatiques est aujourd'hui en pleine évolution. Le modèle classique qui consiste à associer à chaque utilisateur une machine physique qu'il possède et dont il va exploiter les ressources devient de plus en plus obsolète. Aujourd'hui, les ressources informatiques que l'on utilise peuvent être distribuées n'importe où dans l'Internet et les postes de travail du quotidien ne sont plus systématiquement des machines réelles. Cette constatation met en avant deux phénomènes importants qui sont à l'origine de l'évolution de notre utilisation de l'informatique : le Cloud computing et la virtualisation. Le Cloud computing (ou informatique en nuage en français) permet à un utilisateur d'exploiter des ressources informatiques, de granularités potentiellement très différentes, pendant une durée variable, qui sont à disposition dans un nuage de ressources. L'utilisation de ces ressources est ensuite facturée à l'utilisateur. Ce modèle peut être bien sûr avantageux pour une entreprise qui peut s'appuyer sur des ressources informatiques potentiellement illimitées, qu'elle n'a pas nécessairement à administrer et gérer elle-même. Elle peut ainsi en tirer un gain de productivité et un gain financier. Du point de vue du propriétaire des machines physiques, le gain financier lié à la location des puissances de calcul est accentué par une maximisation de l'exploitation de ces machines par différents clients.L'informatique en nuage doit donc pouvoir s'adapter à la demande et facilement se reconfigurer. Une manière d'atteindre ces objectifs nécessite notamment l'utilisation de machines virtuelles et des techniques de virtualisation associées. Même si la virtualisation de ressources informatiques n'est pas née avec le Cloud, l'avènement du Cloud a considérablement augmenté son utilisation. L'ensemble des fournisseurs d'informatique en nuage s'appuient aujourd'hui sur des machines virtuelles, qui sont beaucoup plus facilement déployables et migrables que des machines réelles.Ainsi, s'il est indéniable que l'utilisation de la virtualisation apporte un véritable intérêt pour l'informatique d'aujourd'hui, il est par ailleurs évident que sa mise en œuvre ajoute une complexité aux systèmes informatiques, complexité à la fois logicielle (gestionnaire de machines virtuelles) et matérielle (nouveaux mécanismes d'assistance à la virtualisation intégrés dans les processeurs). A partir de ce constat, il est légitime de se poser la question de la sécurité informatique dans ce contexte où l'architecture des processeurs devient de plus en plus complexe, avec des modes de plus en plus privilégiés. Etant donné la complexité des systèmes informatiques, l'exploitation de vulnérabilités présentes dans les couches privilégiées ne risque-t-elle pas d'être très sérieuse pour le système global ? Étant donné la présence de plusieurs machines virtuelles, qui ne se font pas mutuellement confiance, au sein d'une même machine physique, est-il possible que l'exploitation d'une vulnérabilité soit réalisée par une machine virtuelle compromise ? N'est-il pas nécessaire d'envisager de nouvelles architectures de sécurité prenant en compte ces risques ?C'est à ces questions que cette thèse propose de répondre. En particulier, nous présentons un panorama des différents problèmes de sécurité dans des environnements virtualisés et des architectures matérielles actuelles. 
A partir de ce panorama, nous proposons dans nos travaux une architecture originale permettant de s'assurer de l'intégrité d'un logiciel s'exécutant sur un système informatique, quel que soit son niveau de privilège. Cette architecture est basée sur une utilisation mixte de logiciel (un hyperviseur de sécurité développé par nos soins, s'exécutant sur le processeur) et de matériel (un périphérique de confiance, autonome, que nous avons également développé). / Computer systems are evolving quickly today. The classical model, in which each user owns a physical machine whose resources they exploit, is becoming obsolete. The computing resources we use can now be distributed anywhere on the Internet, and everyday workstations are no longer systematically physical machines. This observation highlights two important phenomena driving the evolution of how we use computers: cloud computing and hardware virtualization. Cloud computing lets users exploit computing resources, at potentially very different granularities and for a variable amount of time, that are made available in a cloud of resources; the use of these resources is then billed to the user. This model is clearly attractive for a company, which can rely on a potentially unlimited pool of computing resources that it does not have to administer and manage itself, gaining both productivity and money. From the point of view of the owner of the physical machines, the financial gain from leasing computing power is amplified by maximizing the utilization of those machines across different clients. Cloud computing must therefore be able to adapt to fluctuating demand and to reconfigure itself quickly. One way to reach these goals relies on the use of virtual machines and the associated virtualization techniques. Even though the virtualization of computing resources did not originate with the cloud, the advent of the cloud has considerably increased its use: all cloud providers today rely on virtual machines, which are much easier to deploy and migrate than physical machines. Virtualization of computing resources was previously based mostly on software techniques, but the growing use of virtual machines, in particular in cloud computing, has led microprocessor manufacturers to include hardware virtualization-assistance mechanisms. These hardware extensions both make the virtualization process easier and improve performance; technologies such as Intel VT-x and VT-d, AMD-V from AMD, and the ARM virtualization extensions have been created for this purpose. In addition, virtualization requires extra functionality to manage the different virtual machines, schedule them, isolate them, and share hardware resources such as memory and peripherals. These functions are generally handled by a virtual machine monitor, whose work can be more or less eased by the characteristics of the processor on which it runs. In general, these technologies introduce new execution modes on the processors that are increasingly privileged and complex. Thus, even though virtualization is of real interest for modern computing, it is also clear that its implementation adds complexity to computer systems, both software complexity (the virtual machine monitor) and hardware complexity (new virtualization-assistance mechanisms integrated into the processors).
From this observation, it is legitimate to ask about computer security in a context where processor architectures are becoming more and more complex, with more and more privileged execution modes. Given the presence of several virtual machines that do not trust each other within the same physical machine, is it possible for the exploitation of a vulnerability to be carried out by a compromised virtual machine? Is it not necessary to consider new security architectures that take these risks into account? This thesis sets out to answer these questions. In particular, we present a survey of the security issues in virtualized environments and in current hardware architectures. Building on this survey, we propose an original architecture that ensures the integrity of software running on a computer system, regardless of its privilege level. The architecture combines software (a security hypervisor of our own design, running on the processor) and hardware (an autonomous trusted peripheral), both of which we have designed and implemented.
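As a rough illustration of the kind of check such an architecture performs (a minimal sketch only; the region contents and byte-string stand-in are hypothetical, and the real system measures code in memory from a security hypervisor and a trusted peripheral rather than from Python), the current measurement of a protected region is compared against a reference digest taken in a known-good state, so any modification is detected regardless of the privilege level of the code that made it.

```python
import hashlib

def measure(region: bytes) -> str:
    """Return the SHA-256 measurement of a protected region's contents."""
    return hashlib.sha256(region).hexdigest()

# Reference measurement taken while the region is known to be in a good state.
# Placeholder contents: the first bytes of an x86-64 function prologue.
good_region = bytes.fromhex("554889e5")
reference_digest = measure(good_region)

def integrity_ok(current: bytes) -> bool:
    """True if the region still matches its reference measurement."""
    return measure(current) == reference_digest

print(integrity_ok(good_region))            # True: region untouched
print(integrity_ok(good_region + b"\x90"))  # False: any modification is detected
```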
493 |
Integrity-Based Kernel Malware Detection
Zhu, Feng, 05 June 2014 (has links)
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need to detect it. The most popular detection approach is misuse-based detection, but it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware.
We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests.
We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from the Linux 2.4.32 kernel and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, the Invariant Monitor detects ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample.
We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned requests, we build KQguard to protect the KQs: at runtime, KQguard rejects any KQ request that cannot be validated against the learned set. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that it is efficient (up to 5.6% overhead) and effective (zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks).
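A rough sketch of the validation step described above (the request fields, learned signatures and example values are hypothetical; the real KQguard runs inside the kernel and validates actual callback and parameter values): each request is reduced to a signature, and any request whose signature was not observed during the learning phase is rejected.

```python
from typing import NamedTuple

class KQRequest(NamedTuple):
    queue: str     # which kernel queue the request targets (e.g. timer, work queue)
    callback: str  # identifier of the callback function the request installs
    module: str    # module that owns the callback

# Signatures of legitimate requests collected during the learning phase
# (static and dynamic analysis of the kernel and device drivers).
learned_signatures: set[KQRequest] = {
    KQRequest("timer", "flush_to_ldisc", "tty"),
    KQRequest("work", "vmstat_update", "mm"),
}

def validate(request: KQRequest) -> bool:
    """Accept a KQ request only if it matches a learned legitimate signature."""
    return request in learned_signatures

# A request installing an unknown callback (say, planted by a rootkit) is rejected.
print(validate(KQRequest("work", "vmstat_update", "mm")))            # True
print(validate(KQRequest("timer", "hide_process_tick", "rootkit")))  # False
```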
In our system, the Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through the violations it causes during execution.
494 |
The Effect of a Multi-Component Consultation Intervention on the Pragmatic Language Skills of Students with Symptoms of Attention Deficit Hyperactivity Disorder
Cooper, Brittney M, 03 July 2014 (has links)
The purpose of this research was to determine whether a multi-component consultation (MCC) intervention was effective in improving pragmatic performance in students with ADHD. Participants were 7 children for whom 3 data points were obtained by a parent or 2 data points by a teacher. Changes in pragmatic performance were measured by comparing reports provided by parents or teachers pre- and post-intervention. Descriptive analysis procedures were completed to summarize changes in pragmatic behavior.
Results revealed that the mean overall change in pragmatic behavior for children in the MCC condition (mean = 1.133) was greater than the change seen in the CAU condition (0.334) after 2 months of intervention, as per parent-reported data. The data indicated improvement in each behavior, but incongruence between teacher and parent reports was found. The results support the hypothesis that the multi-component consultation intervention is effective in improving the pragmatic language performance of children with ADHD.
495 |
Evaluating ecological integrity and social equity in national parks : case studies from Canada and South Africa
Timko, Joleen Allison, 05 1900 (has links)
There are concerns that many national parks worldwide are ineffective at conserving biological diversity and ecosystem processes, are socially unjust in their relations with Indigenous communities, or both. This dissertation asks: can national parks protect ecological integrity and concurrently address social equity issues? It presents empirical results of a systematic evaluation of six case study national parks in Canada and South Africa. Purposive sampling was used to select the six case study national parks. Data sources included State of the Park Reports; park ecological monitoring data; archival data; and semi-structured interviews with park biologists, managers, and Indigenous members of park co-management boards.
Status and trend assessments and effectiveness evaluations of park ecological monitoring data were used to evaluate how effectively the parks addressed three ecological integrity criteria. Results show that all six parks effectively addressed the priority indicators for which they had monitoring data. However, the effectiveness rating of each park decreased when all indicators, including those identified as priorities but lacking monitoring data, were analysed. This indicates that the parks had generally identified more priority indicators than they were actually able to address (for reasons including lack of budget or trained staff and managerial challenges). Thematic coding of semi-structured interview and archival data, and the assignment of numerical ratings to these data, were used to evaluate how effectively the parks addressed three equity criteria. Results show that all but one of the case study parks were equitable; that parks with more comprehensive co-management and support from neighbouring Indigenous groups were more equitable than parks with lower levels of co-management; that parks with settled land claims were not necessarily more equitable overall; and that a few parks were co-managed in name only. The overall results of this evaluation demonstrate that parks effective at protecting ecological integrity can also successfully address social equity, but that further efforts to integrate these two realms are both possible and necessary. A logical starting point would be to build upon those integrative processes already institutionalised in many parks and protected areas: the co-management and integrated conservation and development efforts. / Faculty of Forestry / Graduate
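A small numerical sketch of the effect described above (the indicator counts are hypothetical, not taken from the parks' data): if an effectiveness rating is computed as the share of priority indicators a park addresses, then priority indicators that lack monitoring data pull the rating down.

```python
def effectiveness_rating(addressed: int, monitored: int, priority_total: int) -> tuple[float, float]:
    """Rating over monitored indicators only versus over all priority indicators."""
    return addressed / monitored, addressed / priority_total

# Hypothetical park: 12 priority indicators, 9 with monitoring data, 8 addressed effectively.
over_monitored, over_all = effectiveness_rating(8, 9, 12)
print(f"{over_monitored:.0%} of monitored indicators, {over_all:.0%} of all priority indicators")
# 89% of monitored indicators, 67% of all priority indicators
```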
496 |
Assessing symptoms of eutrophication in estuaries
Lemley, Daniel Alan, January 2015 (has links)
Water quality and ecological integrity of estuaries reflect activities within the entire upstream catchment. Much emphasis has been placed on the response of estuaries to anthropogenic stressors through the use of monitoring programmes. Key to the success of these programmes is the use of indicators as they transform data into useful information. The ultimate aim of this study was to identify eutrophic symptoms in selected estuaries in the Gouritz Water Management Area (WMA) using a multi-metric classification method. The study sites included the permanently open Duiwenhoks, Goukou, Gouritz, and Kaaimans estuaries, as well as the temporarily open/closed Hartenbos, Klein Brak, Great Brak, Gwaing, and Goukamma estuaries. Initially, the “pressure” associated with all the estuaries in the Gouritz WMA was determined (e.g. total daily nutrient loads and changes in river inflow affecting flushing time) using long-term flow and water quality monitoring data provided by the Department of Water Affairs (DWA). Subsequently, the “state” of the selected estuaries was assessed using a variety of indicators, including: inorganic nutrients (~ N and P), phytoplankton, epiphytes and microphytobenthos. Estuaries with longer flushing time suggested greater vulnerability to eutrophication. For example, the Gwaing (281.11 kg DIN d-1; 78.85 kg DIP d-1) and Hartenbos (38.33 kg DIN d-1; 21.51 kg DIP d-1) estuaries generally received the highest daily inorganic nutrient loads. However, at the time of sampling, the Hartenbos Estuary had a longer flushing time (i.e. weeks) compared to the Gwaing Estuary (i.e. couple of days). Field data confirmed the greater vulnerability to eutrophication associated with longer flushing times, as the indicators measured in the Hartenbos Estuary exceeded all the proposed ‘thresholds’ of a eutrophic system (~ overall ‘Poor’ condition). The Great Brak Estuary (~ overall ‘Fair’ condition) provided an interesting example where eutrophic symptoms were only detected when assessing the microalgal primary producers (i.e. high biomass and low diversity). Furthermore, the need to distinguish between naturally and anthropogenically induced symptoms was highlighted. One such scenario was noted in the Kaaimans Estuary, where its natural hydro-morphological characteristics (i.e. steep banks, low river inflow, and deep/narrow channel) led to the possible exaggeration of its overall eutrophic condition, i.e. received a ‘Fair’ rather than ‘Good’ rating, arising from ‘Poor’ ratings for dissolved oxygen and benthic diatom diversity. Overall, this study demonstrated the importance of adopting a holistic approach when assessing the condition and trophic status of estuaries. Studies such as these allow for the detection of vulnerable and degraded systems, which can provide important information with regards to the identification of management priorities.
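As a rough sketch of how a multi-metric classification of this kind can be assembled (the indicator names, threshold values and the median-based aggregation are assumptions for illustration only, not the metrics or thresholds used in the study), each indicator is rated against thresholds and the per-indicator ratings are then combined into an overall condition.

```python
# Hypothetical thresholds separating Good / Fair / Poor for a few indicators.
THRESHOLDS = {                                   # (fair_above, poor_above)
    "chlorophyll_a_ug_per_l": (10.0, 20.0),
    "din_mg_per_l": (0.3, 0.6),
    "oxygen_deficit_mg_per_l": (2.0, 4.0),
}
RANK = {"Good": 0, "Fair": 1, "Poor": 2}
NAME = {rank: name for name, rank in RANK.items()}

def rate(indicator: str, value: float) -> str:
    fair, poor = THRESHOLDS[indicator]
    return "Poor" if value >= poor else "Fair" if value >= fair else "Good"

def overall_condition(measurements: dict[str, float]) -> str:
    """Combine per-indicator ratings; here the median rating is taken as the overall condition."""
    ranks = sorted(RANK[rate(k, v)] for k, v in measurements.items())
    return NAME[ranks[len(ranks) // 2]]

print(overall_condition({"chlorophyll_a_ug_per_l": 25.0,    # Poor
                         "din_mg_per_l": 0.5,               # Fair
                         "oxygen_deficit_mg_per_l": 1.0}))  # Good -> overall "Fair"
```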
497 |
An assessment of personnel accountability within manufacturing at Continental Tyre South Africa
Liebenberg, Johann, January 2007 (has links)
South Africa, as a commodity-producing country, is experiencing profound changes as a result of globalisation. Globalisation refers to the sourcing of goods from locations around the world to take advantage of national differences in the cost and quality of factors of production, such as labour, energy, land and capital. By doing this, companies hope to lower their overall cost structure and improve the quality or functionality of their product offering, thereby allowing them to compete more effectively. In a global environment, having personnel who are accountable for what they control can help revitalise the character of the business, strengthen the global competitiveness of corporations, heighten innovation, improve the quality of products and services produced by companies worldwide, and increase the responsiveness of organisations to the needs and wants of customers. Continental Tyre South Africa (CTSA) is a local and global supplier of tyres, and the global sourcing of tyres by sales divisions requires CTSA to remain competitive on price in the global market. The overall purpose of the research was to assess the current level of personnel accountability at all levels within manufacturing at CTSA and to formulate recommendations to address the shortfalls identified. The research methodology comprised the following steps. Firstly, the contextual elements that define accountability were researched. Secondly, the appropriate research methodology techniques were researched and applied; the current level of accountability was assessed in an empirical study in which respondents completed a questionnaire during structured interviews. Thirdly, the data obtained from the questionnaires were analysed, showing accountability in some areas but shortfalls in others. The final step entailed formulating recommendations to address the shortfalls identified in the different levels and elements of accountability, namely Responsibility, Ability, Means, Authority and Measurement.
498 |
A Model for Managing Data Integrity
Mallur, Vikram, January 2011 (has links)
Consistent, accurate and timely data are essential to the functioning of a modern organization. Managing the integrity of an organization’s data assets in a systematic manner is a challenging task in the face of continuous update, transformation and processing to support business operations. Classic approaches to constraint-based integrity focus on logical consistency within a database and reject any transaction that violates consistency, but leave unresolved how to fix or manage violations. More ad hoc approaches focus on the accuracy of the data and attempt to clean data assets after the fact, using queries to flag records with potential violations and using manual efforts to repair. Neither approach satisfactorily addresses the problem from an organizational point of view.
In this thesis, we provide a conceptual model of constraint-based integrity management (CBIM) that flexibly combines both approaches in a systematic manner to provide improved integrity management. We perform a gap analysis that examines the criteria that are desirable for efficient management of data integrity. Our approach involves creating a Data Integrity Zone and an On Deck Zone in the database for separating the clean data from data that violates integrity constraints. We provide tool support for specifying constraints in a tabular form and generating triggers that flag violations of dependencies. We validate this by performing case studies on two systems used to manage healthcare data: PAL-IS and iMED-Learn. Our case studies show that using views to implement the zones does not cause any significant increase in the running time of a process.
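The zoning idea lends itself to a small illustration. The sketch below (using SQLite from Python; the table, constraint and column names are hypothetical, and the thesis's tool generates such triggers from tabular constraint specifications rather than hard-coding them) diverts rows that violate a dependency into an On Deck table instead of rejecting them, so the clean zone stays consistent while violations remain recorded for later repair.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient_zone (id INTEGER PRIMARY KEY, mrn TEXT, birth_date TEXT);
CREATE TABLE on_deck (id INTEGER, mrn TEXT, birth_date TEXT, violation TEXT);

-- Constraint: every record in the Data Integrity Zone must carry a medical
-- record number. Violating inserts are diverted to the On Deck Zone.
CREATE TRIGGER mrn_required
BEFORE INSERT ON patient_zone
WHEN NEW.mrn IS NULL OR NEW.mrn = ''
BEGIN
    INSERT INTO on_deck VALUES (NEW.id, NEW.mrn, NEW.birth_date, 'missing mrn');
    SELECT RAISE(IGNORE);   -- abandon the insert, keep the clean zone untouched
END;
""")

conn.execute("INSERT INTO patient_zone VALUES (1, 'A1001', '1980-02-01')")
conn.execute("INSERT INTO patient_zone VALUES (2, '', '1975-07-19')")   # diverted

print(conn.execute("SELECT COUNT(*) FROM patient_zone").fetchone()[0])  # 1
print(conn.execute("SELECT * FROM on_deck").fetchall())  # [(2, '', '1975-07-19', 'missing mrn')]
```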
499 |
An exploration of stakeholder perceptions of academic dishonesty and approaches used to promote academic integrity in nursing students
Harrison, Nigel, January 2013 (has links)
An increased number of investigations of academic dishonesty involving nursing students was the catalyst for this research. The aim was to explore stakeholder perceptions of academic dishonesty and of the approaches used to promote academic integrity. The literature reviewed was largely anecdotal, focusing on accounts of incidents and concern over nurses' fitness to practise, and recognising a need to enhance understanding and strategic solutions. A single case study design was utilised, capturing the views of expert witnesses, including nursing students, academic staff, practice mentors and administrative and support staff, using individual interviews and nominal groups. Documentary evidence of incidents occurring between 2004 and 2010 was also analysed. An integrated definition of Academic and Practice Misconduct specific to nursing was developed, and a range of contributing factors influencing students was identified. Incidence within the school was found to have gradually declined; collusion and plagiarism were the most common types, occurring most often at academic level five and in essays. Almost half of academic staff had reported an alleged incident. A hierarchy of Academic and Practice Misconduct emerged, indicating a range of severity and degrees of deliberateness. A self-assessment tool has been developed to enable students to measure their level of risk of Academic and Practice Misconduct. Five themes emerged from thematic analysis of data on approaches used to promote academic integrity: devising strategies, policies and procedures; educating stakeholders; implementing holistic preventative processes and deterrents; detecting and managing alleged incidents; and ongoing monitoring and enhancement. This was synthesised into a collaborative cycle with four phases for use by stakeholders, listing activities undertaken at course, school and university level and in practice settings. A self-assessment tool has also been developed for academic staff to measure their level of involvement in promoting Academic and Practice Integrity. The concepts of risk and person-centred approaches are utilised as theoretical frameworks to underpin the research findings. The study is presented as an integration of research, education and practice.
500 |
Řešení integrity digitálního archivu pomocí relativní časové autentizace / The solution of digital archive integrity using relative time authentication
Vohnout, Rudolf, January 2008 (has links)
The research objective of this dissertation is the long-term archiving of electronic documents, with a focus on their security and integrity. The work begins by surveying current approaches and solutions in this area and examines how long-term archiving relates to the restrictive conditions set by currently valid legislation. The main focus is on archive integrity ensured by alternative methods that do not rely on absolute archival time. The design part deals with the creation of a universal system solution for ensuring the non-repudiation of objects entrusted to and managed by the archive. This design was verified empirically, and the results were processed and evaluated using statistical methods. The proposal presented in this work aims to strengthen the legal and evidentiary standing of long-term archived electronic documents to the level of their paper counterparts.
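As a generic illustration of integrity protection that does not depend on absolute archival time (a minimal linked-hashing sketch, not the specific solution designed in the dissertation), each archive record carries the hash of its predecessor, so the relative order of archiving and the integrity of earlier records can be verified without trusting a clock.

```python
import hashlib
import json

def add_record(chain: list[dict], document: bytes) -> dict:
    """Append an archive record whose hash links it to the previous record."""
    prev_hash = chain[-1]["record_hash"] if chain else "0" * 64
    body = {"index": len(chain),
            "doc_hash": hashlib.sha256(document).hexdigest(),
            "prev_hash": prev_hash}
    record = dict(body, record_hash=hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest())
    chain.append(record)
    return record

def verify(chain: list[dict]) -> bool:
    """Check that every record still links correctly to its predecessor."""
    prev_hash = "0" * 64
    for record in chain:
        body = {k: record[k] for k in ("index", "doc_hash", "prev_hash")}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["prev_hash"] != prev_hash or record["record_hash"] != expected:
            return False
        prev_hash = record["record_hash"]
    return True

archive: list[dict] = []
add_record(archive, b"contract, version 1")
add_record(archive, b"contract, version 2")
print(verify(archive))               # True
archive[0]["doc_hash"] = "tampered"  # any later change breaks the chain
print(verify(archive))               # False
```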