The use of the Network Layer Packet Redundancy Scheme in the OpMiGua hybrid network

Skavdal, Arnt Erling January 2007
The OpMiGua hybrid network has been proposed as a viable migration approach from circuit-switched all-optical networks to 3rd generation packet-based optical networks. A special feature of the OpMiGua network, compared with other hybrid optical network architecture proposals, is the combination of service guarantees with high performance. To achieve this, the OpMiGua network features two different service classes: Guaranteed Service (GS) and Best-effort service (BE). A crucial issue in this context is how the performance of these service classes, in particular the BE class, may be improved. To this aim, the Network Layer Packet Redundancy Scheme (NLPRS) is a viable candidate for reducing the packet loss rate in such networks. The NLPRS has previously been studied in Optical Packet Switched (OPS) networks, but has never been considered for use in hybrid optical network architectures.
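The intuition behind packet redundancy can be made concrete with a toy model. Assuming, purely for illustration, that each copy of a packet is lost independently with probability p, sending k copies drops the residual loss probability to p^k. The sketch below is a minimal illustration of that arithmetic, not the NLPRS algorithm itself, whose analysis in the thesis concerns OPS and OpMiGua networks.

```python
# Toy model: residual packet loss with k independent redundant copies.
# Independence is an illustrative assumption; real optical networks
# exhibit correlated losses, which an actual NLPRS analysis must handle.

def residual_loss(p: float, k: int) -> float:
    """Probability that all k copies of a packet are lost."""
    return p ** k

for p in (0.01, 0.05, 0.10):
    print(f"p={p:.2f}: 1 copy -> {residual_loss(p, 1):.4f}, "
          f"2 copies -> {residual_loss(p, 2):.6f}")
```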

A Routing Protocol for MANETs

Gironés Quesada, Luis January 2007
This master thesis describes what MANETs are and why they are interesting. Because of their characteristics, the traditional routing protocols for wired networks are not advisable for them; a routing protocol designed specifically for MANETs is necessary. This thesis explains the main groups of such protocols and studies some of the most commonly used ones. Each protocol performs best in a specific environment: none of them is ideal across all ranges of node mobility, traffic, number of nodes, etc. The two main groups of protocols studied are the proactive and the reactive ones. The main characteristic of the proactive protocols is that each node maintains a route to every node in the network and periodically updates this information, whether or not there is communication between the nodes. OLSR and DSDV are described here as representative examples of proactive protocols. In the reactive protocols, on the other hand, nodes only calculate routes between those nodes that want to communicate. This kind of protocol makes more efficient use of the bandwidth (which is very limited in the MANET medium) and of the nodes' resources. As a drawback, however, when a route is not yet available, the delay in obtaining one can be large. The reactive protocols chosen for study were AODV and DSR. In the reactive protocols, the main problem is the delay in obtaining a new route; in the proactive ones, it is the high usage of resources and bandwidth when this is not necessary. Both reactive and proactive protocols also have a scalability problem. To solve these problems, a new kind of protocol appeared: the hybrid ones. A hybrid routing protocol combines the proactive and reactive approaches to achieve better performance. The most popular of them is ZRP, and its operation is described here too. None of the existing protocols is suitable for a MANET with a large number of nodes, each with a different velocity and traffic load. ZRP partly solves the scalability problem, but under varying patterns of traffic and node velocity it performs worse than OLSR, DSR and AODV. Based on an understanding of the strengths and weaknesses of each protocol, a new one is proposed. The objective of this new protocol is to be suitable for MANETs whose nodes move freely, with different ranges of speed and traffic, and to improve on the scalability of the reactive and proactive protocols. The proposed protocol is called Penaguila. Like ZRP and other hybrid routing protocols, it is based on having some nodes work in proactive mode, creating areas, and connecting these areas through other nodes working in reactive mode. The difference between Penaguila and ZRP is that Penaguila takes into account the speed and traffic of each node, and thus tries to have each node work in the mode most suitable for it. An evaluation of OLSR, AODV, DSR, ZRP and Penaguila has also been carried out. Since it was not possible to implement Penaguila in NS-2 within the time available for the thesis, only a qualitative study was feasible. This study exposes the advantages and disadvantages of each protocol, and the conclusion is that Penaguila can outperform the existing protocols when: A) the network is large, since Penaguila is a hierarchical routing protocol; and B) the nodes have very different speeds and amounts of traffic.
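Penaguila's per-node mode selection can be sketched as follows. The thresholds and decision rule here are invented for illustration; the abstract does not give concrete criteria, so treat this as a hypothetical reading of the idea that slow, high-traffic nodes benefit from proactive routing while fast or quiet nodes are better served reactively.

```python
# Hypothetical sketch of the idea behind Penaguila: each node picks a
# routing mode from its own mobility and traffic. Thresholds and the
# rule itself are illustrative assumptions, not the thesis's criteria.

from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    speed_mps: float      # current speed in metres per second
    pkts_per_s: float     # locally generated traffic rate

def choose_mode(node: Node, speed_limit: float = 5.0,
                traffic_limit: float = 10.0) -> str:
    # Slow, chatty nodes amortise the cost of proactive table upkeep;
    # fast or quiet nodes avoid it and discover routes on demand.
    if node.speed_mps <= speed_limit and node.pkts_per_s >= traffic_limit:
        return "proactive"   # maintain routes continuously (OLSR/DSDV style)
    return "reactive"        # discover routes on demand (AODV/DSR style)

for n in (Node(1, 1.0, 25.0), Node(2, 12.0, 2.0)):
    print(n.node_id, choose_mode(n))
```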

Value of Investing in Information Security: A metastudy initiated by NorSIS

Johansen, Marie Kristin January 2007
The proportion of companies and organizations in Norway with between 5 and 9 employees and Internet access increased from 66% to 86% over the five-year period from 2001 to 2006. This increased use of the Internet puts small companies in a vulnerable position with regard to information security, and they are known to be markedly less willing to pay for information security than companies with more employees and more revenue. No two organizations are identical: every single one has its own assets, weaknesses, employees and fundamental strategies. This makes each company's requirements for ICT systems and information security unique as well; one solution might be good for one company but not for others. The differences in organizational structure and mentality are important variables in the process of building a good and secure infrastructure for an organization. The Australian Computer Crime Surveys present four readiness-to-protect factors: technology, policies, training and standards. These factors are used as a template for this thesis. If companies focus on these four aspects of information security, and succeed in combining them in an optimal manner, they are said to have security in depth. There is no use in investing great amounts of money in technology if it is not used in a justifiable manner; there may be several reasons for improper use of the technology, among them lack of knowledge, laziness and carelessness. The companies' continued inability to calculate their own risk of adverse events and their total losses due to computer crime makes it difficult to perform investment analysis on information security. Smaller companies often have a very limited amount of money to spend in general, and therefore also on information security; the investment analysis model chosen here therefore takes the maximum amount of spendable money into account. The accuracy of the model presented relies on the companies' ability to provide trustworthy data. The thesis uses both willingness-to-pay calculations and cost/benefit investment analysis methods, resulting in a thorough presentation of an ALE/ROI method, demonstrated in a proof of concept using estimated data based on surveys, professionals' experiences, and prices used by a Norwegian ICT operations company.
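The ALE/ROI reasoning mentioned in the abstract rests on two standard formulas: Annualized Loss Expectancy, ALE = ARO × SLE, and a return-on-security-investment ratio comparing the ALE reduction to the cost of the countermeasure. The sketch below uses these standard definitions with made-up figures; none of the numbers come from the thesis.

```python
# Sketch of the standard ALE/ROSI calculation. All inputs below are
# invented illustrative figures, not data from the thesis.

def ale(aro: float, sle: float) -> float:
    """Annualized Loss Expectancy = Annual Rate of Occurrence * Single Loss Expectancy."""
    return aro * sle

def rosi(ale_before: float, ale_after: float, annual_cost: float) -> float:
    """Return on Security Investment: net risk reduction per unit of cost."""
    return (ale_before - ale_after - annual_cost) / annual_cost

before = ale(aro=4.0, sle=50_000.0)   # e.g. four incidents/year at 50k each
after  = ale(aro=1.0, sle=50_000.0)   # mitigation cuts occurrence to one/year
print(f"ALE before: {before:,.0f}, after: {after:,.0f}, "
      f"ROSI at 30k/year: {rosi(before, after, 30_000.0):.2f}")
```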

Experiences with Linux Mobile

Sivertsen, Frode January 2007
Mobile phones are becoming more and more complex in terms of both hardware and software. Linux Mobile, as a term covering both the kernel and the surrounding components that together form the operating system, is said to have the potential to become the de facto standard operating system for mobile phones and an enabler of advanced future mobile services. This master thesis evaluates key aspects and central mechanisms of the Linux kernel and how it supports its surrounding hardware and software components in a flexible manner. The main work consisted of investigating the necessary kernel subsystems, with a focus on whether the latest major kernel release is able to provide the real-time responsiveness demanded by mobile phones. Further, the typical hardware architecture for this form factor is examined and discussed, with a focus on the important aspects of responsiveness, power management, and memory. The combination of the hardware and the flexibility of Linux is demonstrated through the booting process. Both major commercial and open-source development platforms are investigated to elaborate on the opportunities for employing Linux as an enabler of advanced mobile services. An attempt to build a cross-platform toolchain as the basis for a development platform was carried out with only partial success; it is described along with the results achieved and the steps planned. Based on the topics discussed and the results achieved, the thesis concludes with a discussion of whether Linux Mobile has the potential to become the de facto standard mobile operating system, and of the challenges and opportunities it brings with it.

Field Test of Mobile Terminals in a Wireless City

Stray, Petter January 2007
A rising question today is whether wireless networks and terminals are on the verge of being able to compete with the cellular phone service. A variety of terminals are available, and city-wide wireless networks are either already deployed or under deployment in many cities worldwide. Although voice over wireless networks is already up and running in many office buildings and hospitals, few have experimented with and tested the use of voice over IP in public wireless networks. In this thesis, a series of field experiments is conducted in "Wireless Trondheim", a city-sized wireless network. Through tests gathering information on voice quality, network capacity and other metrics critical for a voice service, differences between terminals and the state of the technology are presented. Using the network analysis tool IxChariot, a selection of Wi-Fi-enabled mobile terminals from Qtek and HTC are tested under different conditions and network loads. The tests unveil vast differences among the tested terminals: while some are capable of handling multiple conversations at once (e.g. call waiting and teleconference functionality), others have trouble keeping the quality of a single conversation good enough for it to be of any value. Overall, the achieved voice quality for the tested terminals in the Wireless Trondheim network lies well below the quality of the GSM service. The radios of the mobile Wi-Fi terminals are strongly affected by interference in a densely populated outdoor environment, which makes it difficult to maintain good voice quality. The results obtained in the thesis indicate that the tested mobile terminals are not yet ready to deliver telephony with sufficient voice quality over a shared outdoor wireless network.
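Voice-quality results of this kind are usually reported as MOS estimates. One common reference is the ITU-T G.107 E-model mapping from the transmission rating factor R to an estimated MOS, sketched below. Whether IxChariot uses exactly this mapping internally is an assumption on our part, not something the abstract states.

```python
# Standard ITU-T G.107 E-model mapping from rating factor R to MOS.
# Shown as a reference point for MOS-based voice-quality comparisons;
# it is assumed, not confirmed, that the tool's scores follow it.

def r_to_mos(r: float) -> float:
    """Map an E-model R factor (0..100) to an estimated MOS (1.0..4.5)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7.0e-6

for r in (50, 70, 80, 90):
    print(f"R={r}: MOS ~ {r_to_mos(r):.2f}")
```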

Detection of Hidden Software Functionality

Jensen, Jostein January 2007
Downloading software from unknown sources constitutes a great risk. Studies have described file-sharing networks where the probability of downloading infected files is as high as 70% [1] under certain circumstances. This work presents theory on malicious software, with emphasis on code that turns computers into bots and thereby, possibly, botnets. It is observed that malware authors have started using more advanced techniques to deceive the owners of compromised computers. To evade detection, stealth techniques known from rootkits are more and more commonly adopted. Rootkit technology is therefore studied in order to determine how bots, and other forms of malicious software, can be hidden from both automated anti-virus detection mechanisms and human inspection of computers. The mechanisms used to evade detection by traditional anti-virus tools are in many cases effective. Dynamic behavioural analysis of software during installation is therefore suggested as a strategy to supplement the traditional tools. Several detection strategies that can be used to determine the behaviour of software during installation are presented. This knowledge is used to design a laboratory environment capable of detecting the mentioned categories of malicious code. An implementation of the laboratory is provided, and experiments are performed to determine the usefulness of the setup. The software used to set up the laboratory environment is all distributed free of license cost. An evaluation is made and improvements to the system are proposed. The value of behavioural analysis has been demonstrated, and the functionality of the laboratory environment has proved to be extremely useful. Advanced users will find the functionality of the laboratory setup powerful; however, future work is needed to automate the behavioural detection processes so that the general public can benefit from this work.
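One behavioural-analysis strategy of the kind the thesis proposes is a before/after snapshot diff of the file system around an installation. The sketch below is a minimal illustration of that idea; the paths and the choice of SHA-256 hashing are illustrative, and a real laboratory would also monitor processes, the registry, and network traffic.

```python
# Minimal sketch of snapshot-diff behavioural analysis: hash every
# file before and after installing the suspect software, then report
# what was added, removed, or modified.

import hashlib
from pathlib import Path

def snapshot(root: str) -> dict[str, str]:
    """Map every file under root to the SHA-256 of its contents."""
    state = {}
    for p in Path(root).rglob("*"):
        if p.is_file():
            state[str(p)] = hashlib.sha256(p.read_bytes()).hexdigest()
    return state

def diff(before: dict[str, str], after: dict[str, str]) -> dict[str, list[str]]:
    return {
        "added":    sorted(set(after) - set(before)),
        "removed":  sorted(set(before) - set(after)),
        "modified": sorted(k for k in before.keys() & after.keys()
                           if before[k] != after[k]),
    }

# Usage: pre = snapshot("/tmp/sandbox"); install the suspect software;
# then inspect diff(pre, snapshot("/tmp/sandbox")).
```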

Integration or 'Patchwork' for Communication and Messages in a Hospital

Årving, Alexander Braadvig January 2007
Most research studies concerning the implementation and use of ICT in hospitals relate to information systems. Various technologies and information systems are implemented in modern hospitals, such as EPRs, administrative systems, diagnostic tools, and lab systems. Common sense would suggest that collaboration, for maximum efficiency, would benefit from "seamless" integration of information systems and centralized control. Previous research argues, however, that tighter integration in human-to-system communication could produce additional work, or that it merely relocates the workload instead of improving efficiency, and that "patchworks" are sometimes more beneficial than tight integration. The motivation for my research was to examine whether this would apply similarly to human-to-human communication and messaging in hospitals, with a human receiver or "callee". This thesis presents some previous research on ICT systems in health care and explains some relevant concepts from theory. It also provides a description of the health care domain, with information about the Norwegian health sector and hospitals. My field research was conducted at different sections of the Children's Clinic at Rikshospitalet, the Norwegian national hospital. I used techniques from qualitative research (interviews and observations) to examine and evaluate communication routines and practices in the hospital. This thesis describes the most important types and technologies of communication, and discusses hospital workers' thoughts and opinions on today's practices as well as their requests for future improvements. Many hospital workers found some of today's systems and practices for human-to-human communication interruptive, rigid, and cumbersome, so some improvements and changes could be desirable. An integration of voice and text messaging in handheld devices could improve the quality of clinical communication and reduce interruptions from pager use. Context-aware communication systems, utilizing information about location and availability, could also be beneficial; such information could be made available through integrated functionality such as a positioning system, a dynamic address book, and calendar applications. Due to the different professions and roles in a hospital, though, it is hard to find common opinions. My research indicates that administrative and office personnel seemed more interested than clinical personnel in a more tightly integrated communication and coordination system. Clinicians, however, tended to be somewhat more skeptical of too much integration and new technology in human-to-human communication. Important aspects of human-to-human work are flexibility, responsibility, professional judgment, and assessment of importance; and human contact and face-to-face communication will always be necessary. When introducing new systems, it is necessary to take into consideration the actual and differing needs of hospital workers, information security and privacy, user-friendliness, robustness and backup, and user involvement. It is also worth bearing in mind that a hospital is an old and rigid type of organization with deep-rooted traditions, routines, and professions consisting of individualists. Improvements to communication practices and new functionality are desired and will most certainly occur as part of a modernization and digitalization process. However, hospital changes take time and must be made gradually, in small steps.

Cost comparison studies of different network platforms: OpEx modeling and analysis

Moe, Øyvind Solberg January 2007
The importance of Operational Expenditures (OpEx) for the overall network cost has been shown in recent studies. This master thesis is part of a series of techno-economic studies focused on the evaluation of OpEx and the provision of a general tool. The thesis proposes a network model which is integrated with a recently proposed service model and service migration model, combined into an OpEx tool. This tool allows network operators to compare different network platforms with different technologies, to study the impact on cost of new control and/or management capabilities, and so on. It allows the evaluation of the OpEx cost of different network platforms and the identification of the most important cost factors. The thesis has focused on the most costly process and on finding an optimal solution that minimizes it. Hence, this master thesis forms a complete cost reduction study of an optical backbone network.
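The kind of comparison such an OpEx tool enables can be sketched with a toy cost model: sum per-process annual costs for each platform and flag the dominant factor. The process names and figures below are invented for illustration and are not the thesis's model or data.

```python
# Toy OpEx comparison: total annual cost per platform plus the
# dominant cost factor. All names and figures are illustrative.

platforms = {
    "platform_A": {"fault_repair": 120_000, "provisioning": 40_000, "energy": 30_000},
    "platform_B": {"fault_repair": 60_000,  "provisioning": 70_000, "energy": 25_000},
}

for name, costs in platforms.items():
    total = sum(costs.values())
    dominant = max(costs, key=costs.get)
    print(f"{name}: total OpEx {total:,}/year, dominant factor: {dominant}")
```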

Storm Worm: A P2P Botnet

Mukamurenzi, Nelly Marylise January 2008
In this thesis, P2P botnets are studied and analysed using Storm Worm as the case study. A theoretical honeypot experiment is described for the purpose of observing the attack method, behaviour and pattern of Storm Worm, and of potentially collecting forensic evidence to help in investigations of malware attacks of this kind.

Biometric Solutions for Personal Identification

Larsen, Tormod Emsell January 2008
With a high level of accuracy and ease of use, biometric technology has in recent years gained popularity and in many cases replaced traditional identification methods based on passwords or tokens. While fingerprint matching is the most mature and most widely used technique today, several others exist. Among these is ear recognition, which has so far received scant attention but has nevertheless shown good performance results. This thesis gives a general presentation of biometric technology, with its advantages and challenges. In addition, the new and novel technique of ear recognition using thermal imagery is elaborated on and discussed. A small-scale experiment, aiming to test the suitability of thermal ear recognition as a method for identification, was performed. The test also considers the effect on performance when the ear temperature varies. An EER of 20.7% with a corresponding detection rate of 78% was achieved when considering only ears at the same temperature. When the applied temperature changes were included, the EER increased to 31.5% with a corresponding detection rate of 72%. The results indicate that thermal images of the ears are not sufficiently distinguishable to establish identity on their own, but they might be suitable as a supplement to other biometric techniques.
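An EER like the reported 20.7% is found by sweeping a decision threshold over genuine and impostor match scores until the false accept rate and false reject rate coincide. The sketch below shows this procedure on invented toy scores; it is not the matcher used in the thesis.

```python
# Sketch of equal-error-rate (EER) computation: sweep a threshold over
# genuine and impostor scores and find where FAR and FRR cross.
# Higher score = better match; the score lists are invented toy data.

def far_frr(threshold, genuine, impostor):
    frr = sum(s < threshold for s in genuine) / len(genuine)     # genuine rejected
    far = sum(s >= threshold for s in impostor) / len(impostor)  # impostors accepted
    return far, frr

def eer(genuine, impostor):
    best_t, best_gap = None, float("inf")
    for t in sorted(set(genuine + impostor)):
        far, frr = far_frr(t, genuine, impostor)
        if abs(far - frr) < best_gap:
            best_t, best_gap = t, abs(far - frr)
    far, frr = far_frr(best_t, genuine, impostor)
    return (far + frr) / 2, best_t

genuine  = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4]
impostor = [0.7, 0.5, 0.45, 0.3, 0.2, 0.1]
rate, t = eer(genuine, impostor)
print(f"EER ~ {rate:.2%} at threshold {t}")
```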
