81

Hardware Architectures for Software Security

Edmison, Joshua Nathaniel 20 October 2006 (has links)
The need for hardware-based software protection stems primarily from the increasing value of software coupled with the inability to trust software that utilizes or manages shared resources. By correctly utilizing security functions in hardware, trust can be removed from software. Existing hardware-based software protection solutions generally suffer from utilization of trusted software, lack of implementation, and/or extreme measures such as processor redesign. In contrast, the research outlined in this document proposes that substantial, hardware-based software protection can be achieved, without trusting software or redesigning the processor, by augmenting existing processors with security management hardware placed outside of the processor boundary. Benefits of this approach include the ability to add security features to nearly any processor, update security features without redesigning the processor, and provide maximum transparency to the software development and distribution processes. The major contributions of this research include the augmentation methodology, design principles, and a graph-based method for analyzing hardware-based security systems. / Ph. D.
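The graph-based analysis method mentioned above is not spelled out in the abstract. Purely as an illustration of the general idea, the sketch below (all component names and flows are hypothetical, not taken from the thesis) models hardware components as nodes and possible data flows as edges, and asks whether untrusted software can reach protected memory without passing through the external security management hardware.

# Hypothetical sketch: nodes are hardware components, edges are possible data flows.
# We ask whether a flow from untrusted software to protected memory can bypass the
# security management hardware placed outside the processor boundary.
from collections import deque

flows = {
    "untrusted_sw": ["cpu"],
    "cpu": ["bus"],
    "bus": ["security_mgmt_hw", "memory"],
    "security_mgmt_hw": ["memory"],
    "memory": [],
}

def reaches_without(src, dst, blocked, graph):
    """Breadth-first search from src to dst that refuses to traverse 'blocked'."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, []):
            if nxt != blocked and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reaches_without("untrusted_sw", "memory", "security_mgmt_hw", flows))  # True: a bypass exists

A result of True flags a flow that bypasses the security hardware and would deserve closer inspection in such an analysis.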
82

Situation Awareness: A Network Centric Approach

Ojha, Ananya 19 November 2008 (has links)
Situation(al) awareness (SA) is critical to analyze, predict and perform tasks effectively in a dynamic environment. Many studies on SA have ignored network dynamism and its effect on SA, focusing on simple environments. Many studies involving the network and SA have refrained from attempting to model information space dynamism (i.e. dynamic scenarios which may have more than one probable outcome). Few studies have identified the need for a flexible, robust and overarching framework which could model both the network and information space dynamisms and provide for analysis of different types of networks (heterogeneous/homogeneous) at multiple scales. We utilize NCOPP (Network Centric Operations Performance & Prediction), a uniform framework with "plug-&-play" capabilities, to provide analysis and performance prediction of networked information systems. In this work, we demonstrate the flexibility of the NCOPP framework and its ability to model a hierarchical sensor system satisfactorily. We model the network and information space dynamisms using probability and statistics theory (e.g. Bayesian prediction, probability distribution curves). We model the behavior of entities/nodes involved in the process of sharing information to achieve greatly improved situation awareness about a dynamic environment within hierarchical information network systems. Our behavior model mathematically represents how successful and unsuccessful predictions critically impact the achievement of effective situation awareness. In the behavior model, we tie together the cost of considering predictions, which accounts for limited resources, and the indirect effect of unsuccessful predictions. We show how the NCOPP framework can model real-world networked information systems at different levels of granularity. We leverage the framework's capabilities to perform experiments that not only assist in an objective comparison of distributed information filtering and central data processing paradigms but also provide important insights into the effect of network dynamism on the quality and completeness of information in the system. We demonstrate that incorporating key network information in the process of achieving SA improves the performance of the system, and we show the performance gains obtained when network characteristics are included during dynamic allocation of resources. We were able to show that simple hierarchical filtering (via distributed processing) results in a significant reduction in "false alarm" information when compared to systems employing central information processing. Experimental results show a direct positive impact on the completeness of SA when information sharing in hierarchical systems is supplemented by network delay information. Overall, we demonstrated the ability of the NCOPP framework to provide meaningful insights into the interactions of key factors involved in the operation of networked information systems, with a particular emphasis on SA. / Master of Science
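The abstract mentions modelling information-space dynamism with probability and statistics, for example Bayesian prediction. The following is only a minimal illustration of a single Bayesian update step, with invented numbers and labels rather than anything from the NCOPP framework: a node revises its belief over two possible outcomes of a dynamic scenario after receiving a sensor report of known reliability.

# Hypothetical sketch of a single Bayesian prediction step for situation awareness.
# prior: belief over two possible outcomes of a dynamic scenario.
# likelihood[outcome]: probability of receiving this sensor report given the outcome.
prior = {"threat": 0.2, "no_threat": 0.8}
likelihood = {"threat": 0.9, "no_threat": 0.1}

evidence = sum(prior[o] * likelihood[o] for o in prior)
posterior = {o: prior[o] * likelihood[o] / evidence for o in prior}
print(posterior)  # roughly {'threat': 0.69, 'no_threat': 0.31}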
83

A simulation approach for modelling and investigation of inventory inaccuracy in warehouse operation

Kamaludin, Adzhar January 2010 (has links)
This thesis is focused on a simulation modelling approach to address the inventory inaccuracy problems in a warehouse operation. The main motivation which led to this research was a desire to investigate the inventory inaccuracy issues that have been highlighted by a logistics company. Previous and current research into inventory inaccuracy issues is largely related to the development of RFID technology as a possible solution to inventory problems. Since that research focuses on overall measures of inventory management and the retail business, it differs from the research presented in this thesis, which focuses on issues of inventory inaccuracy within a warehouse operation. In this thesis, warehouse operation is studied as a detailed sequence of processes in which items flow physically while the related information is stored in the computer system. In these processes there are many places where errors can occur in counting or recording details of inventory, or in physically moving, storing or picking items incorrectly. These details of a warehouse operation are used to develop a conceptual model of inventory inaccuracy in warehouse operations. The study also found that a product typically needs to be considered differently at different stages of its progress through a warehouse (and therefore within different sections of the conceptual model). This is because batches of a product are initially delivered from a supplier; therefore, if errors occur soon after the product is delivered to the warehouse, the error might involve the whole batch (for example, the batch may be misplaced and put in an incorrect storage location) or just part of the batch (for example, poor transportation by forklift truck may damage the packaging carton and some of the items within it). When the product is stored ready for meeting customer orders, it needs to be considered as individual items (and errors can occur in counting of individual items, or individual items may be misplaced or stolen). Finally, when a customer order is received, the product will be picked and grouped to meet the requirements of the order (for example, one order may require 10 of the product whilst another order may require 20 of the product). Errors might again occur to the whole group or to just part of the group. (Continued ...)
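The conceptual model above distinguishes errors affecting a whole batch from errors affecting only part of it. As a minimal, hypothetical sketch of that distinction (not the author's simulation model), the snippet below lets a batch arrive, draws random error events, and tracks how the recorded stock diverges from the physical stock.

import random

# Hypothetical sketch: track how the stock recorded in the warehouse computer system
# diverges from the physical stock after random error events during receipt and storage.
random.seed(1)
batch_size = 100
physical = batch_size   # items actually on the shelf
recorded = batch_size   # items the computer system believes are on the shelf

# Whole-batch error, e.g. the batch is put away in the wrong storage location.
if random.random() < 0.05:
    physical = 0        # nothing is where the system says it is

# Part-batch error, e.g. transport damage to some items in the carton.
damaged = sum(random.random() < 0.02 for _ in range(batch_size))
physical = max(physical - damaged, 0)

# Picking error: an order for 10 items is recorded as 12 picked.
recorded -= 12
physical = max(physical - 10, 0)

print("recorded:", recorded, "physical:", physical, "inaccuracy:", recorded - physical)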
84

Investigating absorptive capacity in boards, corporate governance and the value creating board

Schonning, Aud Randi January 2013 (has links)
Within corporate governance research, boards of directors constitute an essential part and are described as "the apex of the internal control system" (Jensen, 1993, p.862). Several strands of research have investigated whether, and to what degree, boards' composition, structure and processes have an impact on board task performance, but board processes, and specifically the use of knowledge and skills, have not yet been thoroughly researched. Simultaneously, there is a gap within organisational behaviour research on how knowledge is explored, transformed and exploited, which is conceptualised as absorptive capacity. Further, the concept of absorptive capacity has to date not been researched in a board context. In this thesis board processes are studied by exploring the impact of absorptive capacity on board task performance. Three dimensions of absorptive capacity, exploratory learning, transformative learning and exploitative learning, are used in the analyses. The research is conducted using mixed methods (based on a survey and a case study). The quantitative analysis is based on the Norwegian Value Creating Board Survey, and a case study is conducted based on records, observations from board meetings and interviews in the Norwegian health company Healthy. The findings show that the three dimensions of absorptive capacity positively and significantly mediate the relation between the presence of knowledge and skills and board task performance. Complementarities between the three learning processes exist, with the result that the three learning processes together are a stronger mediator than a single process. The qualitative findings show 1) that information flows have an impact on absorptive capacity, 2) that the role and power of the CEO and the division of labour between the CEO and the chair might have an impact on board task performance, 3) that a comprehensive utilisation of consensus has an impact on transformative and exploitative learning, 4) that effort norms are positively correlated with the use of knowledge and skills and 5) that activation triggers have impacts on the learning processes. The research contributes to theory with an extended application of the concept of absorptive capacity to boards, responding to calls from researchers to conduct new and more extensive research to analyse and integrate the concept. The thesis further contributes by shedding new light on learning processes in boards, underpinning former conceptual models. In the case study several findings are reported which are presented in an extended and modified model of determinants of board tasks. Finally, this thesis contributes to mixed methods research in boards. The findings have implications for board practice with regard to board selection, board evaluation and learning processes in boards. Corporate governance codes should be aligned with these findings.
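The quantitative analysis reports that the three learning dimensions mediate the relation between knowledge and skills and board task performance. Purely as an illustration of a simple product-of-coefficients mediation check on synthetic data (not the thesis's actual analysis of the Norwegian Value Creating Board Survey), such a computation might look as follows.

import numpy as np

# Hypothetical sketch of a product-of-coefficients mediation check on synthetic data:
# knowledge and skills (x) -> learning process (m) -> board task performance (y).
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                          # presence of knowledge and skills
m = 0.6 * x + rng.normal(scale=0.5, size=n)     # e.g. exploratory learning
y = 0.5 * m + 0.1 * x + rng.normal(scale=0.5, size=n)

def ols(target, *cols):
    """Least-squares coefficients of target on an intercept plus the given columns."""
    design = np.column_stack([np.ones(len(target)), *cols])
    return np.linalg.lstsq(design, target, rcond=None)[0]

a = ols(m, x)[1]       # effect of x on the mediator
b = ols(y, m, x)[1]    # effect of the mediator on y, controlling for x
print("indirect effect a*b:", round(float(a * b), 3))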
85

Managing Logistical Complexity: Agility and Quality in Newspaper Distribution : An Empirical Study of Herenco Distribution AB

NKume-Kwene, Samuel Ngole, Besong, Fred Tanyi January 2009 (has links)
Introduction
Over time, the execution and control of business activities to meet and even exceed customer satisfaction has become an absolute top priority. This is because, with an increase in the demand for diverse products and services of quality in unprecedented numbers, complexity is automatically injected into the activities and processes which companies undertake in order to fulfill customer satisfaction. This complexity, which may be logistical in nature, is usually centered on the provision of quality products and services on a timely basis for customer satisfaction. In order to keep this complexity in check while fulfilling customer satisfaction, the different facets of complexity that relate to quality and agility need to be managed.
Purpose
The purpose of this study is to understand the managerial actions on the logistical challenges of quality and agility in a newspaper distribution company.
Method
In order to fulfill the purpose, the authors undertook a qualitative single-case study following an inductive approach. Interviews were conducted with two managers; these were mainly face-to-face interviews, though some were also conducted by phone.
Findings
Managing the complexity challenges of quality and agility requires the utilization of Total Quality Management (TQM), Just-in-Time (JIT) and Information Flow (IF). Through the utilization of TQM, quality standards are enhanced through continuous improvement and the pursuit of excellence in the activities of the company. JIT as a philosophy helps in the elimination of waste and in the speeding up of processes within a company's supply chain, resulting in the timely delivery of goods and services to customers in order to enhance customer satisfaction. Information flow, aided by diverse technologies such as mobile phones, radio phones, the internet, the World Wide Web, Customer Relationship Management systems and Structured Query Language relational databases, as well as word-of-mouth transmission, facilitates decision making in the company relating to the delivery of quality products and services in an agile or responsive manner for customer satisfaction.
Practical and Theoretical Implications
The attainment of the requisites of agility while maintaining delivery quality may not be sufficient to enhance customer satisfaction. The information in the model provides management with a pathway to follow in solving logistical challenges towards enhancing customer satisfaction. The study offers theory development opportunities.
Originality
A model of logistical complexity management was designed for the attainment of customer satisfaction.
86

Causal pattern inference from neural spike train data

Echtermeyer, Christoph January 2009 (has links)
Electrophysiological recordings are a valuable tool for neuroscience, as they make it possible to monitor the activity of multiple or even single neurons. Significant insights into the nervous system have been gained from analyses of the resulting data; in particular, many findings stem from spike trains, whose correlations can give valuable indications about neural interplay. But detecting, specifying, and representing neural interactions is mathematically challenging. Further, recent advances in recording techniques have led to an increase in the volume of collected data, which often poses additional computational problems. These developments call for new, improved methods in order to extract crucial information. The contribution of this thesis is twofold: it presents a novel method for the analysis of neural spike train data, as well as a generic framework for assessing the new and related techniques. The new computational method, the Snap Shot Score, can be used to inspect spike trains with respect to temporal dependencies, which are visualised as an information flow network. These networks can specify the relationships in the data, indicate changes in dependencies, and point to causal interactions. The Snap Shot Score is demonstrated to reveal plausible networks both in a variety of simulations and for real data, which indicates its value for understanding neural dynamics. In addition to the Snap Shot Score, a neural simulation framework is suggested, which facilitates the assessment of neural network inference techniques in a highly automated fashion. Thanks to a new formal concept for rating learned networks, the framework can be used to test techniques under partial-observability conditions. In the presence of hidden units, quantification of results has been a tedious task that had to be done by hand, but it can now be automated. Thereby high-throughput assessments become possible, which facilitate a comprehensive simulation-based characterisation of new methods.
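The Snap Shot Score itself is defined in the thesis and is not reproduced here. Only to illustrate the kind of pairwise temporal-dependency scoring that spike-train analyses perform, the sketch below counts how often a spike in one train is followed within a short window by a spike in another, on made-up spike times.

# Hypothetical sketch: a naive directed dependency score between two spike trains,
# counting spikes in train_a that are followed within `window` seconds by a spike
# in train_b. This is an illustration only, not the Snap Shot Score.
def follow_score(train_a, train_b, window=0.005):
    hits = sum(1 for t in train_a if any(t < u <= t + window for u in train_b))
    return hits / len(train_a) if train_a else 0.0

a = [0.010, 0.052, 0.090, 0.143]   # spike times of neuron A (seconds)
b = [0.012, 0.055, 0.120, 0.146]   # spike times of neuron B (seconds)
print(follow_score(a, b))          # 0.75; high values hint at a directed dependency A -> B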
87

Adaptive constraint solving for information flow analysis

Dash, Santanu Kumar January 2015 (has links)
In program analysis, unknown properties of terms are typically represented symbolically as variables. Bound constraints on these variables can then specify multiple optimisation goals for computer programs and find application in areas such as type theory, security, alias analysis and resource reasoning. Resolution of bound constraints is a problem steeped in graph theory; interdependencies between the variables are represented as a constraint graph. Additionally, constants are introduced into the system as concrete bounds over these variables, and the constants themselves are ordered over a lattice which is, once again, represented as a graph. Despite graph algorithms being central to bound constraint solving, most approaches to program optimisation that use bound constraint solving have treated their graph-theoretic foundations as a black box. Little has been done to investigate the computational costs or design efficient graph algorithms for constraint resolution. Emerging examples of these lattices and bound constraint graphs, particularly from the domain of language-based security, are showing that these graphs and lattices are structurally diverse and could be arbitrarily large. Therefore, there is a pressing need to investigate the graph-theoretic foundations of bound constraint solving. In this thesis, we investigate the computational costs of bound constraint solving from a graph-theoretic perspective for Information Flow Analysis (IFA); IFA is a sub-field of language-based security which verifies whether confidentiality and integrity of classified information are preserved as it is manipulated by a program. We present a novel framework based on graph decomposition for solving the (atomic) bound constraint problem for IFA. Our approach enables us to abstract away from connections between individual vertices to those between sets of vertices in both the constraint graph and an accompanying security lattice which defines the ordering over constants. Thereby, we are able to achieve significant speedups compared to state-of-the-art graph algorithms applied to bound constraint solving. More importantly, our algorithms are highly adaptive in nature and seamlessly adapt to the structure of the constraint graph and the lattice. The computational cost of our approach is a function of the latent scope of decomposition in the constraint graph and the lattice; therefore, we enjoy the fastest runtime for every point in the structure spectrum of these graphs and lattices. While the techniques in this dissertation are developed with IFA in mind, they can be extended to other applications of the bound constraint problem, such as type inference and program analysis frameworks which use annotated type systems, where constants are ordered over a lattice.
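The abstract describes resolving atomic bound constraints over a constraint graph, with constants ordered in a security lattice. As a generic illustration of the underlying problem (not the decomposition-based algorithm contributed by the thesis), the sketch below propagates lower bounds through a two-level lattice Low < High until a fixpoint is reached.

# Hypothetical sketch: naive fixpoint propagation of lower bounds over a constraint
# graph, with constants ordered in a two-level security lattice Low < High.
LOW, HIGH = 0, 1
join = max   # lattice join for the two-point lattice

# Each edge (x, y) encodes the atomic constraint bound(x) <= bound(y).
edges = [("secret", "tmp"), ("tmp", "out"), ("public", "out")]
bounds = {"secret": HIGH, "public": LOW, "tmp": LOW, "out": LOW}

changed = True
while changed:
    changed = False
    for src, dst in edges:
        new = join(bounds[dst], bounds[src])
        if new != bounds[dst]:
            bounds[dst] = new
            changed = True

print(bounds)   # {'secret': 1, 'public': 0, 'tmp': 1, 'out': 1}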
88

Effektivisering av 3d-projektering / Streamline of 3d-modelling

Andersson, Johannes January 2016 (has links)
Purpose: Sweden's construction industry is facing great challenges in the form of increased housing construction combined with growing demands on the environment and the economy, together with currently low unemployment and many upcoming retirements. Meeting these challenges requires development and change, which BIM is considered able to deliver if it is performed correctly. 3d planning is an important part of BIM, laying the ground for the following two steps of the building process: production and management. Method: The methods selected for this thesis are a literature review, semi-structured interviews and document analysis. In the literature review, writings about, or related to, 3d planning were studied. The interviews gave more detailed information about the problems that planners experience in practice. The document analysis created a deeper understanding of how 3d design works. Findings: The results show that, with simple means, it is possible to avoid common problems, which streamlines 3d design and the entire construction process, with benefits for the economy, the environment and sustainable construction. Implications: Development and improvements in 3d design lead to more effective planning, which in turn contributes to more effective construction. The increase in efficiency can already be justified by savings in the economy and the environment, but there are also visions of further development potential; one step further could, for example, be monitoring the working environment already in the planning phase. More education in BIM at all levels of the construction industry should create a better mutual understanding and alleviate the communication gaps that currently prevail. Limitations: This thesis is limited to the design phase of the building process conducted with 3d modeling. / Purpose: Sweden's construction industry today faces great challenges in the form of increased housing construction with ever higher demands on the environment and the economy, combined with already low unemployment and many upcoming retirements. To meet these challenges, development and change are required, which BIM is considered capable of if performed correctly. 3d design is an important building block of BIM, laying the ground for further work in the following two steps of the building process: production and management. Method: The methods chosen for this thesis are a literature study, semi-structured interviews and document analysis. The literature study examined literature about, or related to, 3d design. The interview method gave deeper information about the problems that designers experience in practice. The document analysis created a deeper understanding of how 3d design works. Results: The results of the study show that simple means can be used to avoid commonly occurring problems in 3d design and thus make it more efficient. These simple means are primarily improved communication, competence and mutual understanding. Streamlining 3d design increases efficiency in the entire construction process, with savings in, among other things, the economy, the environment, work safety and sustainable construction. Implications: Development and improvements in 3d design lead to more efficient design, which in turn contributes to a more efficient construction process. The increase in efficiency can today be motivated by savings in the economy and the environment, but there are also visions of development potential in further steps; one step further could, for example, be checking the working environment already during design. More education in BIM at all levels of the construction industry should create a better mutual understanding and mitigate the communication deficiencies that currently prevail. Limitations: This thesis is limited to the design phase of the building process carried out with 3d modelling.
89

No Hypervisor Is an Island : System-wide Isolation Guarantees for Low Level Code

Schwarz, Oliver January 2016 (has links)
The times when malware was mostly written by curious teenagers are long gone. Nowadays, threats come from criminals, competitors, and government agencies. Some of them are very skilled and very targeted in their attacks. At the same time, our devices – for instance mobile phones and TVs – have become more complex, connected, and open for the execution of third-party software. Operating systems should separate untrusted software from confidential data and critical services. But their vulnerabilities often allow malware to break the separation and isolation they are designed to provide. To strengthen the protection of select assets, security research has started to create complementary machinery such as security hypervisors and separation kernels, whose sole task is separation and isolation. The reduced size of these solutions allows for thorough inspection, both manual and automated. In some cases, formal methods are applied to create mathematical proofs of the security of these systems. The isolation solutions themselves are carefully analyzed, and the included software is often even verified at binary level. The role of other software and hardware for the overall system security has received less attention so far. The subject of this thesis is to shed light on these aspects, mainly on (i) unprivileged third-party code and its ability to influence security, (ii) peripheral devices with direct access to memory, and (iii) boot code and how we can selectively enable and disable isolation services without compromising security. The papers included in this thesis are both design- and verification-oriented, with an emphasis on the analysis of instruction set architectures. With the help of a theorem prover, we implemented various types of machinery for the automated information flow analysis of several processor architectures. The analysis is guaranteed to be both sound and accurate. / In the past, malware was mostly written by curious teenagers. Today, our computers are under constant threat from government organisations, criminal groups, and perhaps even our business competitors. Some possess great expertise and can carry out focused attacks. At the same time, the technology around us (such as mobile phones and TV sets) has become more complex, connected and open to executing software from third parties. Operating systems ought to isolate sensitive data and critical services from software that is not trustworthy, but their vulnerabilities usually make it possible for malware to get past the operating systems' security mechanisms. This has led to the development of complementary tools whose only function is to improve the isolation of selected sensitive resources. Dedicated virtualisation software and separation kernels are examples of such tools. Since such solutions can be developed with relatively little source code, it is possible to analyse them thoroughly, both manually and automatically. In some cases, formal methods are used to generate mathematical proofs that the system is secure. The isolation software itself is usually verified extensively, sometimes even at assembly level. However, the influence of other components on the security of the system has so far received less attention, both with regard to hardware and to other software. This thesis attempts to shed light on these aspects, mainly (i) unprivileged third-party code and how it can affect security, (ii) peripheral devices with direct access to memory, and (iii) the boot code, as well as how isolation services can be enabled and disabled in a secure way without restarting the system. The thesis is based on six earlier publications dealing with both design and verification aspects, but mostly with the security analysis of instruction sets. Based on a theorem prover, we have developed various tools for the automated information flow analysis of processors. We have used these tools to clarify which registers unprivileged software has access to on ARM and MIPS machines. This analysis is guaranteed to be both sound and precise. To the best of our knowledge, we are the first to have published a solution for the automated analysis and proof of information flow properties of standard instruction sets. / PROSPER / HASPOC
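The thesis analyses, with a theorem prover, which resources unprivileged code can observe or influence on real instruction set architectures. Outside any prover, the flavour of that question can be sketched on a toy machine model: run two states that differ only in a privileged register and compare what unprivileged code can observe afterwards. This is an assumption-laden illustration, not the verified machinery referred to above.

# Hypothetical sketch: check noninterference on a toy machine model by running two
# states that differ only in a privileged register and comparing what unprivileged
# code can observe. Real analyses do this symbolically inside a theorem prover.
def step(state):
    new = dict(state)
    new["r0"] = state["r1"] + state["r2"]   # toy instruction: r0 := r1 + r2
    return new

def observable(state):
    return {reg: val for reg, val in state.items() if reg.startswith("r")}  # user registers

s1 = {"r0": 0, "r1": 1, "r2": 2, "priv_cfg": 0}
s2 = {"r0": 0, "r1": 1, "r2": 2, "priv_cfg": 99}   # differs only in privileged state

leak = observable(step(s1)) != observable(step(s2))
print("observable difference caused by privileged state:", leak)   # False here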
90

Arquitetura de aplicativos móveis com fluxo seguro de informação. / Architecture of mobile applications with information flow control.

Paiva, Oscar Zibordi de 17 May 2016 (has links)
The adoption of application stores and Open APIs by a growing number of companies, many of them not even operating in the technology sector, reveals their interest in externalizing the conception and development of corporate software. In doing so, these companies aim to multiply the functionalities available to their customers while spending a fraction of the cost and time that would traditionally be required. At the same time, access to corporate data and systems by software from potentially unknown developers raises security concerns, making it imperative to guarantee that such software conforms to institutional security policies. However, there is a lack of automatic means capable of guaranteeing this conformance on mobile platforms, whether in their runtime environments or in their software development kits. This work, drawing on recent ideas from the Information Flow Control area, proposes the architecture of a runtime environment for mobile applications that guarantees by construction their conformance to certain data confidentiality and integrity policies, even in the presence of malicious code. The practicality of this architecture is validated through the implementation of an example application. This implementation illustrates the operation of the proposed security mechanisms and their compatibility with a set of functionalities suited to the scenario of corporate data manipulation. / The adoption of application stores and Open APIs by a growing number of companies, many of them not even related to the technology business, reveals their interest in externalizing the conception and development of corporate software. By doing so, these companies expect to multiply the number of functionalities available to their customers, spending a fraction of the traditionally required time and cost. On the other hand, access to corporate data and services by software developed by potentially unknown parties raises security concerns, making it imperative to ensure the adequacy of the mentioned software to the institutional security policies. Nevertheless, there is a lack of automatic tools capable of guaranteeing the mentioned adequacy in mobile platforms, either in their runtime environments or in their software development kits. This work, using recent ideas from the Information Flow Control area, proposes the architecture of a run-time environment for mobile applications that guarantees by construction their adequacy to some confidentiality and integrity policies, even in the presence of malicious code. The practicality of this architecture is validated by the implementation of an example application. This implementation illustrates the working of the proposed security mechanisms and their compatibility with a set of functionalities relevant to the scenario of corporate data manipulation.
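The proposed runtime environment enforces confidentiality and integrity policies by construction. As a language-agnostic illustration of the underlying Information Flow Control idea (not the architecture proposed in the thesis), the sketch below attaches labels to values, joins labels when values are combined, and refuses to release data whose label exceeds the sink's clearance.

# Hypothetical sketch of label-based information flow control: values carry a
# confidentiality label, labels join when values are combined, and a sink may only
# receive data at or below its clearance.
PUBLIC, CORPORATE = 0, 1

class Labeled:
    def __init__(self, value, label):
        self.value, self.label = value, label

    def combine(self, other, op):
        # The result is at least as confidential as either input.
        return Labeled(op(self.value, other.value), max(self.label, other.label))

def release(data, sink_clearance):
    if data.label > sink_clearance:
        raise PermissionError("policy violation: label exceeds sink clearance")
    return data.value

salary = Labeled(4200, CORPORATE)
bonus_rate = Labeled(0.1, PUBLIC)
total = salary.combine(bonus_rate, lambda a, b: a * (1 + b))

print(release(total, CORPORATE))   # allowed
# release(total, PUBLIC)           # would raise PermissionError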
