  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Interoperability for electronic ID

Zygadlo, Zuzanna January 2009 (has links)
<p>Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in the implementations of those solutions introduce a lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It is not only the lack of technical interoperability of electronic IDs which today makes many electronic transactions impossible. The main obstacle to global interoperability of electronic IDs is the lack of trust in digital certificates issued by various Certification Authorities (CAs). Relying Parties (RPs) need to trust digital certificates in order to be able to validate them. We observe that the multi-vendor PKI community lacks an independent source of electronic ID quality information, which could make digital certificate validation possible on a global scale. Therefore, this thesis presents the concept of a new PKI trust model which introduces a Validation Authority (VA) as a single trust point in the global PKI trust network. The main goal of this thesis is to develop a model of the Global Validation Service (GVS), which uses a Global Validation Authority (GVA), based on the VA concept, to provide digital certificate validation and signature verification to Relying Parties. The presented research focuses on investigating the technical, legal and business issues which would enable RPs to delegate certificate validation to an independent Validation Authority.</p>
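The delegation idea this abstract describes can be sketched in outline. Everything below is illustrative: the `ValidationAuthority` interface, the CA names and the certificate fields are invented for this sketch, not taken from the thesis.

```python
# Hypothetical sketch of a Relying Party delegating certificate validation
# to a single Validation Authority (VA) instead of maintaining trust in
# many individual CAs. All names and fields are illustrative.

class ValidationAuthority:
    """Single trust point that tracks certificate status across many CAs."""
    def __init__(self):
        self._known_cas = {"CA-A", "CA-B"}   # CAs the VA has assessed
        self._revoked = {("CA-B", 7)}        # (issuer, serial) pairs reported revoked

    def validate(self, cert):
        if cert["issuer"] not in self._known_cas:
            return "unknown_issuer"
        if (cert["issuer"], cert["serial"]) in self._revoked:
            return "revoked"
        return "valid"

class RelyingParty:
    """Trusts only the VA; never evaluates CA trust anchors itself."""
    def __init__(self, va):
        self.va = va

    def accept(self, cert):
        return self.va.validate(cert) == "valid"

va = ValidationAuthority()
rp = RelyingParty(va)
print(rp.accept({"issuer": "CA-A", "serial": 42}))  # True
print(rp.accept({"issuer": "CA-B", "serial": 7}))   # False (revoked)
```

The point of the proposed model is visible in `RelyingParty`: it holds one trust relationship (to the VA), regardless of how many CAs issue certificates.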
2

Forensic analysis of an unknown embedded device

Eide, Jarle, Olsen, Jan Ove Skogheim January 2006 (has links)
<p>Every year thousands of new digital consumer device models come on the market. These devices include video cameras, photo cameras, computers, mobile phones and a multitude of different combinations. Most of these devices have the ability to store information in one form or another. This is a problem for law enforcement agencies, as they need access to all these new kinds of devices and the information on them in investigations. Forensic analysis of electronic and digital equipment has become much more complex lately because of the sheer number of new devices and their increasing internal technological sophistication. This thesis tries to help the situation by reverse engineering a Qtek S110 device. More specifically, we analyze how the storage system of this device, called the object store, is implemented on the device’s operating system, Windows Mobile. We hope to figure out how the device stores user data and what happens to this data when it is "deleted". We further try to define a generalized methodology for such forensic analysis of unknown digital devices. The methodology takes into account that such analysis will have to be performed by teams of reverse engineers rather than single individuals. Based on prior external research we constructed and tested the methodology successfully. We were able to figure out more or less the entire internal workings of the object store and constructed a software tool called BlobExtractor that can extract data, including "deleted" data, from the device without using the operating system API. The main reverse engineering strategies utilized were black box testing and disassembly. We believe our results can be the basis for future advanced recovery tools for Windows Mobile devices and that our generalized reverse engineering methodology can be utilized on many kinds of unknown digital devices.</p>
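The kind of API-bypassing extraction a tool like BlobExtractor performs can be illustrated with a toy carving routine over a raw dump. The record layout below (two-byte magic, a length byte, a deleted flag) is invented for illustration; the real object store format reverse-engineered in the thesis is far more involved.

```python
# Illustrative carving sketch: scan a raw dump for length-prefixed records,
# recovering entries even when their "deleted" flag is set. The record
# layout is invented; it only mirrors the general idea of carving records
# directly from storage instead of asking the operating system API.

MAGIC = b"\xAA\x55"  # invented record signature

def carve(dump):
    records, i = [], 0
    while (i := dump.find(MAGIC, i)) != -1:
        if i + 4 > len(dump):
            break                      # truncated header at end of dump
        length, deleted = dump[i + 2], dump[i + 3]
        payload = dump[i + 4:i + 4 + length]
        records.append({"data": payload, "deleted": bool(deleted)})
        i += 4 + length

    return records

# A dump with one live record and one "deleted" record still present on disk:
dump = b"junk" + MAGIC + bytes([5, 0]) + b"hello" + MAGIC + bytes([3, 1]) + b"old"
for record in carve(dump):
    print(record)
```

Because the carver walks the raw bytes, the record flagged as deleted is recovered just like the live one, which is the essence of the recovery claim in the abstract.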
3

Bluetooth broadcasting

Ho, Johan January 2006 (has links)
<p>Background: The wireless technology Bluetooth has rapidly become more commonly supported by electronic devices like mobile phones and PDAs. Several companies are currently developing Bluetooth broadcasting systems to use for marketing. This report is a result of researching the use of Bluetooth broadcasting for delivering information for more general purposes, how well Bluetooth actually works for broadcasting, and the topic of user privacy. Results: Broadcasting with Bluetooth did work with a few devices at the same time, since Bluetooth allows up to seven connections to one Bluetooth radio simultaneously. By making a passive system where the user is the one who requests information, it also will not affect users' privacy. However, my research also found a few issues with Bluetooth which might affect a broadcast, the most noticeable being the somewhat low transfer rate and the fact that device discovery does not always work when many users are performing device discovery at the same time. The fact that it only supports seven connections is also a limitation. Basically, while it is possible to use Bluetooth for broadcasting, it might be problematic to use it for targeting a large audience. Conclusions: Even with the problems mentioned, Bluetooth broadcasting provides quite a unique way of broadcasting, and with the newer versions of Bluetooth the issues mentioned might be less of a problem. Bluetooth broadcasting definitely has some potential.</p>
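The seven-connection ceiling mentioned above (one Bluetooth master plus at most seven active slaves in a piconet) is the core scalability constraint, and the passive, pull-based model is the privacy measure. Both can be sketched without a real Bluetooth stack; the class and device names below are illustrative.

```python
# Sketch of a passive, pull-based broadcast point that enforces Bluetooth's
# piconet limit of seven simultaneous connections per radio. No Bluetooth
# stack is used; the model only illustrates the constraint discussed above.

MAX_SLAVES = 7  # one master, at most seven active slaves per piconet

class BroadcastPoint:
    def __init__(self, content):
        self.content = content
        self.active = set()          # device ids currently connected

    def request(self, device_id):
        """A device asks for the content (passive model: the user initiates)."""
        if len(self.active) >= MAX_SLAVES:
            return None              # radio saturated; the client must retry
        self.active.add(device_id)
        return self.content

    def disconnect(self, device_id):
        self.active.discard(device_id)

bp = BroadcastPoint(b"event programme")
served = [bp.request(f"phone-{n}") for n in range(9)]
print(sum(x is not None for x in served))  # 7: two of nine requests are refused
```

Once a device disconnects, its slot frees up, which is why the thesis concludes the approach works for small audiences but scales poorly to large crowds.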
4

• Applicability and Identified Benefits of Agent Technology : Implementation and Evaluation of an Agent System

Haug, Mari Torgersrud, Kristensen, Elin Marie January 2006 (has links)
<p>Agent-oriented approaches are introduced with the intention of facilitating software development in situations where other methods have shortcomings. Agents offer new possibilities and solutions to problems due to their properties and characteristics. Agent technology offers a high abstraction level and is therefore a more appropriate tool for making intelligent systems. Multi-agent systems are well suited to application areas with dynamic and challenging environments, and are advantageous in supporting decision making and automating tasks. Reduced coupling, encapsulation of functionality and a high abstraction level are some of the claimed benefits related to agent technology. Empirical studies are needed to investigate whether agent technology is as good as asserted. This master thesis will give a deeper understanding of agent technology and the benefits related to it. To investigate different aspects, an experiment was designed to reveal the applicability and the benefits. Two multi-agent systems were implemented and used as objects of study in the empirical investigation. As part of the investigation, a proper application area was chosen. The application area can be characterized as a scheduling problem with a dynamic and complex environment. Prometheus and JACK were used as development and modeling tools. Experiences gained during the development process are presented in this report. The findings of the empirical study indicate reduced coupling and increased encapsulation of functionality. To achieve these benefits, the number of entities and functions had to be increased, and thus the number of code lines. Further, the results indicate that more entities and lines of code do not have a significant influence on the development effort, due to the high abstraction level of agent technology.</p>
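The reduced-coupling finding above comes from agents interacting only through messages rather than direct references. A loose sketch of that style follows; it only echoes the message-passing idea of agent platforms such as JACK, and all names and the message format are invented.

```python
# Illustrative message-passing sketch: agents exchange typed messages via a
# shared board, so neither holds a reference to the other's implementation.
# This loosely mirrors the decoupling claimed for agent platforms; the
# message format and all names are invented for illustration.

import queue

class Agent:
    def __init__(self, name, board):
        self.name, self.board = name, board

    def post(self, performative, content):
        self.board.put({"from": self.name, "type": performative, "content": content})

class Scheduler(Agent):
    """Reacts to scheduling requests without knowing who sent them."""
    def handle(self, msg):
        if msg["type"] == "request":
            self.post("assign", {"task": msg["content"], "to": "worker-1"})

board = queue.Queue()
scheduler = Scheduler("scheduler", board)
Agent("client", board).post("request", "night-shift")
scheduler.handle(board.get())
print(board.get()["type"])  # assign
```

Swapping in a different scheduler only requires honouring the message types, which is the sense in which coupling is reduced, at the cost of the extra entities the study observed.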
5

• WebSys - Robustness Assessment and Testing

Pham, Thuy Hue Thi January 2006 (has links)
<p>In recent years, the World Wide Web (WWW) has become a popular platform for system development. Several factors make Web development special. There is a large number of quality requirements on Web-based systems. Web projects involve people with diverse backgrounds, such as technical people with a background in programming and non-technical people with a background in graphical design. In addition, Web-based systems are often not developed separately, but integrate existing subsystems. The time-to-market requirement is strong. Web-based systems must tolerate errors and abnormal situations caused by internal component failures or user mistakes. Therefore, robustness is considered a critical factor for Web-based systems. Building a robust Web-based system is never an easy task. Furthermore, the end users of Web-based systems have different backgrounds: many have knowledge of the Web, others have little or none. Since Web-based systems are used by people with rather diverse backgrounds, it is important that they tolerate errors and are able to survive user mistakes. The main focus of this project is analyzing the robustness of Web-based systems. In order to analyze the robustness of a Web-based system, it is necessary to carry out a robustness assessment. Assessment methods are used to evaluate robustness and give an estimate of the system's robustness. Further, robustness testing of a Web-based system has to be performed to get an idea of the system's current robustness. The estimate and the test results will also be discussed, compared and evaluated. The Automatic Acceptance Testing of Web Applications (AutAT) tool will be used to test the robustness of a Web-based system. DAIM (Norwegian: Digital Arkivering og Innlevering av Masteroppgaver) is the target system whose robustness will be tested.
Keywords: Robustness, testing, robustness assessment, robustness estimation, Web-based system, AutAT, DAIM.</p>
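Robustness testing of the kind described amounts to feeding a system invalid and abnormal inputs and requiring graceful errors rather than crashes. The toy handler and the abnormal cases below are invented for illustration; they are not taken from AutAT or DAIM.

```python
# Illustrative robustness test: a toy submission handler must survive user
# mistakes (empty fields, wrong types, out-of-range values) by returning a
# graceful error, never by raising. The handler and inputs are invented.

def submit_thesis(title, year):
    """A toy submission handler in the spirit of a DAIM-like form."""
    if not isinstance(title, str) or not title.strip():
        return ("error", "title is required")
    if not isinstance(year, int) or not 1900 <= year <= 2100:
        return ("error", "invalid year")
    return ("ok", f"{title.strip()} ({year})")

abnormal_cases = [("", 2006), (None, 2006), ("Thesis", "06"), ("Thesis", -1)]
results = [submit_thesis(t, y) for t, y in abnormal_cases]
print(all(status == "error" for status, _ in results))  # True: all rejected gracefully
print(submit_thesis("  WebSys  ", 2006))
```

A robustness assessment would then estimate, over a catalogue of such abnormal cases, what fraction the system handles gracefully; testing checks that estimate against observed behaviour.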
6

BUCS: Patterns and Robustness : Experimentation with Safety Patterns in Safety-Critical Software Systems

Ljosland, Ingvar January 2006 (has links)
<p>In modern society, we rely on safely working software systems. This is the final report in a master's degree project to reveal key issues in the field of software architecture and design of safety-critical software systems. A pre-study of a navigation system implied that functionality-related problems and safety-critical problems do not map one to one, but are rather solved in different layers. This means that changes in a software system's functionality do not necessarily require changes in its safety-critical modules, and vice versa. To further support the findings of the pre-study, an experiment was created to investigate these matters. A group of twenty-three computer science students from the Norwegian University of Science and Technology (NTNU) participated as subjects in the experiment. They were asked to make two functional additions and two safety-critical additions to a software robot emulator. A dynamic web tool was created to present information to the subjects, where they could answer surveys and upload their task solutions. The results of the experiment show no evidence that the quality attributes were affected by the design approaches. The findings of this study thus suggest that it is difficult to create safety-critical versions of software architectural design patterns, because every design pattern brings a set of additions and consequences to a system, and all implications of a design pattern should be discussed by the system architects before it is used in a safety-critical system.</p>
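The layering suggested by the pre-study can be sketched as a safety monitor separated from the functional controller: the functional layer can change freely while the safety layer stays untouched. The robot, its commands and the invariant below are invented for illustration, not taken from the experiment's emulator.

```python
# Illustrative layering sketch: a safety-critical monitor vetoes commands
# that violate an invariant, while the functional controller can evolve
# independently. All names, commands and limits are invented.

class SafetyMonitor:
    """Safety-critical layer: enforces invariants, changes rarely."""
    def __init__(self, max_speed):
        self.max_speed = max_speed

    def check(self, command):
        return abs(command["speed"]) <= self.max_speed

class RobotController:
    """Functional layer: proposes commands, never bypasses the monitor."""
    def __init__(self, monitor):
        self.monitor = monitor

    def drive(self, speed):
        command = {"speed": speed}
        if not self.monitor.check(command):
            return {"speed": 0}   # safe fallback on a vetoed command
        return command

ctrl = RobotController(SafetyMonitor(max_speed=10))
print(ctrl.drive(5))    # {'speed': 5}
print(ctrl.drive(50))   # {'speed': 0}
```

Adding functionality means adding methods to `RobotController`; the monitor, and with it the safety argument, is unchanged, which mirrors the "different layers" observation in the abstract.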
7

eGovernment Services in a Mobile Environment

Olaussen, Gunn, Torgersen, Kirsti Nordseth January 2006 (has links)
<p>This report was written as part of our thesis, based on an assignment provided by Kantega. It deals with the use of mobile phones to access eGovernment services using the Liberty Identity Web Services Framework (ID-WSF). Additionally, it explores the use of strong authentication mechanisms on mobile phones while using the phone as the terminal to access such services. The first part of the report describes the project and its goals. In this process, it defines three research questions that will be answered in the course of the thesis. Furthermore, it outlines how these questions should be answered. This part also includes a presentation of the prototype that was developed later in the project. The second part of the report concentrates on the theoretical basis of the assignment. Existing standards and technologies for strong authentication and Liberty-enabled products are explored and evaluated. The focus of the evaluation is on whether the technologies could be used in the prototype. The third part of the report contains the requirements specification, design, implementation and testing documentation for the prototype. This part aims to describe all aspects of the prototype development and enables us to show that it is a valid proof of concept. Requirements and design incorporating strong authentication into the prototype are also provided, although this functionality was not implemented as specified in the assignment. The last part of the report evaluates the results obtained in the course of the thesis, especially the resulting prototype. The prototype fulfills our requirements well, but there were some reservations about the security of the strong authentication design. This part also looks at what can be done in the future to further explore the topic and improve the results. 
Finally, it shows how the report has answered the research questions we defined in the beginning of the thesis by completing a prototype that accesses eGovernment services using Liberty ID-WSF.</p>
8

• Financial News Mining : Extracting useful Information from Continuous Streams of Text

Lægreid, Tarjei, Sandal, Paul Christian January 2006 (has links)
<p>Online financial news sources continuously publish information about actors involved in the Norwegian financial market. These are often short messages describing temporal relations. However, the amount of information is overwhelming, and it requires a great effort to stay up to date on both the latest news and historical relations. It would therefore be advantageous to analyse the information automatically. In this report we present a framework for identifying actors and the relations between them. Text mining techniques are employed to extract the relations and how they evolve over time. Techniques such as part-of-speech tagging and named entity identification, along with traditional information retrieval and information extraction methods, are employed. Features extracted from the news articles are represented as vectors in a vector space. The framework employs the feature vectors to identify and describe relations between entities in the financial market. A qualitative evaluation of the framework shows that the approach has promising results. Our main finding is that vector representations of features have potential for detecting relations between actors and how these relations evolve. We also found that the approach taken depends on accurate identification of named entities.</p>
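One simple instance of the vector-space idea above is to represent each actor by its co-occurrence counts with other actors across news items and compare actors by cosine similarity. The toy corpus, the actor names and the choice of plain co-occurrence counts are illustrative assumptions, not the thesis's actual feature set.

```python
# Illustrative vector-space sketch: actors become co-occurrence vectors over
# a toy news corpus, and relations are scored by cosine similarity. The
# corpus, actor names and feature choice are invented for illustration.

import math

news = [
    {"Statoil", "Hydro"},
    {"Statoil", "Hydro", "DnB"},
    {"DnB", "Storebrand"},
]
actors = sorted(set().union(*news))

def vector(actor):
    """Count how often `actor` appears in the same item as each other actor."""
    return [sum(1 for item in news if actor in item and other in item and other != actor)
            for other in actors]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

print(round(cosine(vector("Statoil"), vector("Hydro")), 3))  # 0.2
```

Recomputing the vectors over a sliding time window would show relations strengthening or fading, which is the temporal evolution the framework tracks; and since the vectors are built from recognized actor names, a named-entity mistake corrupts them directly, matching the dependence noted in the abstract.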
9

Model Driven Enterprise Analysis : A model-driven tool-assisted process for criticality and availability analysis of enterprise systems

Hermansen, Thomas January 2006 (has links)
<p>Today more and more companies acquire enterprise-scale solutions for their organization. Enterprise-scale solutions connect departments and business functions in the organization in order to facilitate the coordination, communication, and work flow between them. However, when systems get more interconnected and complex, they also become more prone to faults. If business-critical parts of the system are affected, this can be devastating for a company. When designing large enterprise-scale systems, one uses a wide range of specialized models with different viewpoints and applications. This fragmentation and specialization of the representation of the system decreases the clarity of the total enterprise model and makes it difficult to analyze the enterprise as a whole. To overcome this problem, specialized software tools that integrate the sub-models into a total model can be developed. This thesis develops a tool-assisted process, extending the Rational Unified Process, that helps analyze the design of enterprise solutions by integrating the behavioral and structural models of the system into a unified representation. The tools are based on digitized models represented in UML, the industry-standard language for modeling software systems. We will focus on the two quality attributes availability and criticality.</p>
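One calculation such an availability analysis could automate over an integrated model is composing component availabilities: components in series must all be up, while redundant (parallel) components fail only if every replica fails. The topology and the availability figures below are invented for illustration.

```python
# Illustrative availability composition: series components must all be up;
# parallel (redundant) components fail only if all replicas fail. The
# topology and figures are invented, not from the thesis.

def series(*avail):
    p = 1.0
    for a in avail:
        p *= a          # all components must be available simultaneously
    return p

def parallel(*avail):
    p_fail = 1.0
    for a in avail:
        p_fail *= (1 - a)   # the group fails only if every replica fails
    return 1 - p_fail

# A web tier in series with two redundant database replicas:
web = 0.99
db = parallel(0.95, 0.95)   # 1 - 0.05 * 0.05 = 0.9975
system = series(web, db)
print(round(system, 6))     # 0.987525
```

Annotating each UML component with an availability figure and a criticality level would let a tool walk the integrated model, apply these formulas along each business-critical path, and flag the paths whose composed availability falls below a threshold.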
10

• Redesign and optimization of the Peer2Me Framework : A Framework for developing Applications supporting mobile collaboration using J2ME

Hestnes, Steinar A., Vatn, Torbjørn January 2006 (has links)
<p>This project was started to develop a new, improved version of the Peer2Me framework. After evaluating the first version of the Peer2Me framework in our depth study project in the fall of 2005, quite a few possible improvements came up. This report starts with an introduction to Computer Supported Cooperative Work (CSCW), wireless networking, Peer-to-Peer (P2P) computing, and mobile ad hoc networking. It also introduces some central concepts concerning the design of a software architecture, and technology relevant to the development of the Peer2Me framework. The redesign of the framework started by eliciting a set of new requirements, constituting the basis for designing the new Peer2Me architecture. Through an iterative and incremental development process, Peer2Me framework v2.0 was developed with several new features. An instant messenger application has been developed using both versions of the framework in order to compare them. A thorough comparison of Peer2Me v1.0 and Peer2Me v2.0 shows that the redesign has resulted in a reduced framework footprint and complexity, a simplified interface towards the MIDlets, and a considerable increase in transfer rate.</p>
