About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
231

Capability Assessment of Indoor Positioning Systems

Landmark, Andreas Dypvik January 2009 (has links)
Location systems are seen as a promising technology for tracking people and objects to improve efficiency and quality in the healthcare domain. To increase the chances of success when introducing this new technology, certain operational capabilities need to be understood. The purpose of this thesis is to explore how these operational capabilities can be assessed by experiment. The thesis proposes a method for describing the operational capabilities of a location system using a two-dimensional matrix of purposes of location systems in the healthcare domain, as found in the literature. Using this matrix it is possible to assess and predict the requirements for a location system based on a classification of the purpose of the installation. Conversely, it is possible to use the same matrix to find purposes that can be solved with a given location system. Using the Sonitor Indoor Positioning System, it was also demonstrated how the operational capabilities of a location system could be found through a series of small, low-cost, low-effort experiments. In conclusion, three dimensions relating to operational capabilities were identified: granularity, resolution, and concurrency. Granularity and concurrency were shown to be successfully assessed through experiment, while resolution was found analytically. We also found a method to predict the impact of infrastructure size on the operational capability of the location system based on the same small experiments.
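The purpose/capability matrix the thesis describes can be pictured as a simple lookup table. The sketch below is a hypothetical illustration, not the thesis's actual matrix: the purposes, granularity levels, and numeric requirements are all invented for the example.

```python
# Hypothetical purpose/capability matrix: rows are healthcare purposes,
# columns are the three capability dimensions the thesis identifies.
# All names and numbers here are invented for illustration.
CAPABILITY_MATRIX = {
    # purpose: (required granularity, max resolution in metres, min concurrent tags)
    "asset_tracking":   ("room", 5.0, 100),
    "patient_tracking": ("room", 5.0, 500),
    "staff_alerting":   ("bed",  1.0, 50),
}

# Coarse-to-fine ordering of granularity levels (also invented).
GRANULARITY_RANK = {"building": 0, "floor": 1, "room": 2, "bed": 3}

def purposes_supported(granularity, resolution_m, concurrency):
    """The 'converse' lookup: which purposes can a given system serve?"""
    return [purpose
            for purpose, (g, r, c) in CAPABILITY_MATRIX.items()
            if GRANULARITY_RANK[granularity] >= GRANULARITY_RANK[g]
            and resolution_m <= r
            and concurrency >= c]

print(purposes_supported("room", 3.0, 200))  # ['asset_tracking']
```

The forward direction (classifying an installation's purpose and reading off its requirements) is then a plain dictionary lookup on `CAPABILITY_MATRIX`.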
232

Security Testing of Web Based Applications

Erdogan, Gencer January 2009 (has links)
Web applications are becoming more and more popular as a means of modern information interaction, which leads to a growing demand for Web applications. At the same time, Web application vulnerabilities are increasing drastically. This will inevitably expose more Web application users to malicious attacks, causing them to lose valuable information or be harmed in other ways. One of the most important software security practices used to mitigate the increasing number of vulnerabilities is security testing. The most commonly applied security testing methodologies today are extensive and sometimes too complicated, with their many activities and phases. Because of this complexity, developers often tend to neglect the security testing process. Today, there are only a few security testing methodologies developed especially for Web applications and their agile development environment. It is therefore necessary to give attention to security testing methodologies for Web applications. A survey of state-of-the-art security testing methodologies for Web applications is performed. Based on some predefined criteria, Agile Security Testing is selected as the most adequate security testing methodology for Web applications, and is further extended to support all the predefined criteria. Furthermore, the extended Agile Security Testing methodology (EAST) is integrated into the Software Development Life Cycle applied by the Administrative Information Services (AIS) group at the Department of General Infrastructure Services at CERN, the European Organization for Nuclear Research. Finally, using both the EAST methodology and the security testing methodology applied by the AIS group (an ad hoc way of performing security tests), the EAST methodology is evaluated against existing ad hoc ways of performing security tests. The security testing process is carried out twice using the EAST methodology and twice using the ad hoc approach.
In total, 9 vulnerability classes are tested. The factors used to measure efficiency are: (1) the amount of time spent on the security testing process, (2) the number of vulnerabilities found during the security testing process, and (3) the ability to mitigate false positives during the security testing process. The results show that the EAST methodology is approximately 21% more effective on average regarding time spent, approximately 95% more effective regarding the number of vulnerabilities found, and has the ability to mitigate false positives, compared to existing ad hoc ways of performing security tests. These results show that structured security testing of Web applications is possible without being overly complicated or involving too many activities and phases. Furthermore, it mitigates three important factors that are often used as grounds to neglect the security testing process: the complexity of the testing process, the "too time-consuming" attitude towards security testing of Web applications, and the perception that it lacks a significant payoff.
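The two relative-efficiency figures reported above are plain percentage improvements. A minimal sketch, using invented raw numbers chosen only so the outputs match the reported 21% and 95%; the thesis's actual measurements are not reproduced here:

```python
def compare(time_ad_hoc_h, time_east_h, vulns_ad_hoc, vulns_east):
    """Relative improvement of EAST over the ad hoc approach, in percent."""
    time_saving = round((time_ad_hoc_h - time_east_h) / time_ad_hoc_h * 100, 1)
    vuln_gain = round((vulns_east - vulns_ad_hoc) / vulns_ad_hoc * 100, 1)
    return time_saving, vuln_gain

# Hypothetical raw data (hours spent, vulnerabilities found per approach).
print(compare(10.0, 7.9, 20, 39))  # (21.0, 95.0)
```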
233

Practices of Agile Software Product-Line Engineering : A qualitative assessment of empirical studies

Gylterud, Snorre January 2009 (has links)
This thesis examined how Software Product-Line Engineering is combined with Agile Software Development to improve software engineering, by investigating published case studies and performing interviews in several companies. This combination is often described as Agile Software Product-Line Engineering, and our study aimed to describe what agility means for software product lines and find out more about how this approach could be realized. Agile Software Product-Line Engineering could reap the benefits of the best of the two software engineering approaches, combining long-term strategic efforts with short-term agility. By following a specified research method that combines qualitative research methods, we were able to ensure validity in our analysis and generalize the findings of this study. We used both semi-structured interviews and textual analysis techniques. The companies under study seem to combine Software Product-Line Engineering and Agile Software Development with success, reducing initial investment and exploiting reuse, and we found several practices that are interesting for further study. Based on these practices we present our view of a top-down approach to Agile Software Product-Line Engineering, starting with several characteristics and a proposed definition of the field. Further, a framework for implementing the approach based on our research is presented, before we describe our thoughts on how the practice areas of Software Product-Line Engineering can be combined with Agile Software Development practices. We think that this thesis could be used as a guideline for further study and implementation of agile software product lines. We believe that the data we cover is comprehensive, given the small existing research field, and covers the general ideas of both fields included in the combination.
234

A Mobile Guide for geographically displaying estates listed for sale.

Romundstad, Rune January 2007 (has links)
Instead of displaying a simple list of the houses and apartments that are listed for sale, an alternative is to display them geographically on a map to give an overview of where they are located. To make this simple, the map also shows an icon of where you are located, so you know where you are relative to the houses. A prototype of such a system with the most important functionality has been implemented. The system gives the user the option to choose which other facilities he or she wants to show on the map together with the apartments, such as schools and parking spaces. The system also gives short information about the different apartments when clicking on them, and the possibility to open the prospectus of a specific apartment. The prototype has been evaluated by some "experts" from one of the largest estate agencies, together with a user evaluation and a technical evaluation. The feedback from the "experts" has been very positive, and they are convinced that such a system will be very helpful for potential users. The response regarding user-friendliness has also been good, apart from some comments about too many confirmations being needed when running the application. During the technical evaluation we confirmed that a J2ME application is not directly portable between different devices without slight adjustments.
235

Real-Time Online Multiplayer Mobile Gaming

Jarrett, Martin, Sorteberg, Eivind January 2007 (has links)
Gaming on mobile phones is a business with great growth potential in both profit and popularity. In today's modern world, the number of potential users of online multiplayer mobile games is enormous. This is because of the wide deployment of mobile phones and the increasing general interest in gaming. For game developers, this is an interesting business area, since mobile games are faster and easier to develop than console or computer games, due to the mobile games' smaller size and reduced complexity. Telecom companies, on the other hand, may profit from this both by attracting users through exclusive content only available to their subscribers, and through the potential network traffic generated by online multiplayer games. Some multiplayer mobile games are available on the market today. However, few of these can be played in real time, which often makes for more entertaining and attractive gameplay than slower, turn-based games. This project has focused on two main areas. Firstly, different network technologies and transport protocols have been tested to evaluate whether they are suitable for real-time multiplayer mobile games. This was done by testing the different networks' response times and transfer speeds. Secondly, a framework for developing this kind of game has been developed. Also, a game prototype has been implemented based on this framework, and the experience from this development has been recorded to provide assistance for future development projects within the same scope. The results from the tests show that, among the widely available mobile networks today, only UMTS (3G) and EDGE offer performance sufficient for a fast and stable real-time multiplayer mobile game. GPRS is too slow and unstable, and using this technology for real-time game communication is likely to lead to lag and an incoherent gameplay.
Furthermore, the tests have clearly shown that UDP is far better suited for in-game communication than TCP, because of UDP's superior response time. For developers of such games, there are several challenges that have to be closely considered. Synchronization of clients is a very difficult task because of high network latencies. Furthermore, mobile phones are weak in terms of available resources. Managing these problems requires distribution of calculations and efficient algorithms. The game framework developed in this project has proved to provide a good basis for developing different game concepts within real-time multiplayer mobile gaming. Common functionality for such games is implemented in the framework, thus helping game developers avoid having to reinvent the wheel. This project has shown that successful real-time multiplayer mobile games are definitely possible to implement. However, doing this is a great challenge, both for developers, distributors, and telecom companies offering such games to their subscribers. A middle way has to be found between the complexity of the game, the need for frequent network updates, and the user cost involved with playing the game. If this middle way is found, it is very likely that such a game could be a great success.
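The UDP response-time measurements described above can be reproduced in miniature with a loopback echo test. This is a simplified stand-in for the project's setup (loopback only, Python rather than J2ME, no packet-loss handling), not its actual test harness:

```python
import socket
import threading
import time

def echo(server):
    # Echo datagrams back to the sender until the socket is closed.
    while True:
        try:
            data, addr = server.recvfrom(1024)
        except OSError:
            return  # socket closed, stop the thread
        server.sendto(data, addr)

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # let the OS pick a free port
port = server.getsockname()[1]
threading.Thread(target=echo, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(1.0)
rounds = 20
start = time.perf_counter()
for _ in range(rounds):
    client.sendto(b"player-state", ("127.0.0.1", port))  # a stand-in game update
    client.recvfrom(1024)
rtt_ms = (time.perf_counter() - start) / rounds * 1000
client.close()
server.close()
print(f"avg UDP round trip: {rtt_ms:.3f} ms")
```

Over a real GPRS, EDGE, or UMTS link the same loop would show the latency differences the project measured; on loopback it only demonstrates the measurement method.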
236

An Evaluation of Message-based Systems Integration with respect to Performance : A case study

Smaavik, Trond Fallan, Øvstetun, Nils Torstein January 2007 (has links)
This report describes a case-study evaluation of two integration strategies with particular focus on performance. The study is motivated by integration challenges within a company we have cooperated with, and by our wish to gain insight into systems integration. The goal of the study has been to evaluate the performance of two message-based system integration strategies. We have evaluated this by implementing several applications that are integrated using either Web services technologies or an integration technology provided by our cooperating company. Our research questions have been as follows: Q1: Which integration solution has the best performance in a publish-subscribe scenario? Q2: Which integration solution has the best performance in a request-response scenario? The results show that the Web service applications have the best performance when sending small messages (up to 160 kB). For large messages, the applications based on the integration technology from the cooperating company perform better. The contributions of the study may be split in two. The contribution to the company is the performance evaluation of their technology, along with collected data for response time and throughput and performance models for our test applications. In a broader context, we contribute a performance evaluation of Web services technologies. Data is collected for response time and throughput on test applications, and performance models are made. The comparisons of integrations based on Web services and MIP also serve as an example of the performance of Web services versus other middleware.
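A response-time and throughput measurement of the kind used in the study can be sketched as a small harness that times a transport callable over varying message sizes. The `send` function here is a trivial local stand-in, not a real Web-service or MIP call, and the sizes are chosen to bracket the 160 kB crossover point mentioned above:

```python
import time

def benchmark(send, sizes_kb, repeats=50):
    """Time `send` over messages of each size; report response time and throughput."""
    results = {}
    for kb in sizes_kb:
        msg = b"x" * (kb * 1024)
        start = time.perf_counter()
        for _ in range(repeats):
            send(msg)
        elapsed = time.perf_counter() - start
        results[kb] = {
            "avg_response_s": elapsed / repeats,
            "throughput_msgs_per_s": repeats / elapsed,
        }
    return results

# Stand-in transport: copy the message locally instead of sending it anywhere.
stats = benchmark(lambda m: bytes(m), [1, 160, 1024])
for kb, r in stats.items():
    print(f"{kb:5d} kB: {r['avg_response_s']*1e6:8.1f} us/msg")
```

Swapping the lambda for a real client call (and running sender and receiver on separate machines) turns this into the kind of measurement the study collected.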
237

Semantic Relations in Yahoo! News Search

Evensen, Øyvind Arne January 2007 (has links)
In this thesis we propose a novel approach where three days of raw Yahoo! News search query logs are analyzed to find semantic relations among queries. The analysis is based on two independent contributions. The first uses session data extracted from the query logs. By finding the term that best describes each session, we get a vocabulary of queries related to that term. Sessions with similar terms are merged to create larger groups of queries with one common term or phrase. The second contribution is the use of temporal correlation to give a measure of frequency variation similarity. Queries that show similar variation over time have a high chance of either being semantically related or appearing in the same situations. These two contributions are then merged into related term groups, based on their session group label and the most prominent term or phrase of the correlated query. With non-strict parameter settings for the contribution calculations, a great number of queries are found. Taking the intersection of the results leaves high-accuracy groups of related queries with a term or phrase as the group label. A prototype search application was developed to use the created term groups in a search environment. The groups of queries were converted into a tree structure with their group label as the main node. This navigation tree structure lets the user navigate up and down in the tree or click directly on a tree node to view its results. When a user's search matches one of the generated groups, he or she is presented with the first search results of the tree's main node together with its children.
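The temporal-correlation contribution can be illustrated with a plain Pearson correlation over per-interval query frequencies. The query terms, counts, and the 0.8 threshold below are all invented for the example; the thesis's actual similarity measure and parameters may differ:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length frequency series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical query counts per time interval over three days:
freq = {
    "world cup":       [2, 5, 40, 38, 6, 3, 1, 4, 42, 35, 5, 2],
    "football scores": [1, 4, 35, 33, 5, 2, 2, 3, 37, 31, 4, 1],
    "stock quotes":    [20, 22, 3, 2, 21, 24, 19, 23, 2, 3, 22, 20],
}

# Pairs whose frequency variation over time is strongly correlated:
related = [(a, b) for a in freq for b in freq
           if a < b and pearson(freq[a], freq[b]) > 0.8]
print(related)  # [('football scores', 'world cup')]
```

Intersecting such correlation pairs with the session-based groups then yields the high-accuracy related-term groups described above.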
238

Automated verification of design adherence in software implementation

Flobakk, Rune January 2007 (has links)
Software design and architecture specify how a system should be implemented to achieve the required quality attributes. Being able to automatically verify design adherence during implementation will continuously assure that the system realizes the quality attributes and does not drift away from them over time. This thesis investigates how a software design can be used to automatically verify and enforce rules for implementation. The current tool support for automatic design enforcement is assessed and reviewed. In addition, a prototype contribution to this practice, a plug-in for the Maven software project management system, is presented.
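As a toy illustration of automated design-adherence checking: a common rule of this kind is that one architectural layer must not depend directly on another. The thesis's prototype is a Maven plug-in for Java; the sketch below instead uses Python's `ast` module and an invented layering rule, purely to show the idea of deriving checks from the design:

```python
import ast

# Hypothetical design rule: modules in the "ui" layer must not import the
# "db" layer directly (they should go through a service layer instead).
FORBIDDEN = {"ui": {"db"}}

def imported_modules(source):
    """Top-level package names imported anywhere in a piece of source code."""
    mods = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            mods.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            mods.add(node.module.split(".")[0])
    return mods

def violations(layer, source):
    """Imports in `source` that the design forbids for modules of `layer`."""
    banned = FORBIDDEN.get(layer, set())
    return sorted(imported_modules(source) & banned)

print(violations("ui", "import db.session\nfrom http import client"))  # ['db']
```

Run over every source file in a build step, such a check fails the build when the implementation drifts from the designed layering, which is the role the Maven plug-in plays.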
239

Finding Security Patterns to Countermeasure Software Vulnerabilities

Borstad, Ole Gunnar January 2008 (has links)
Software security is an increasingly important part of software development as the risk from attackers is constantly evolving through increased exposure, threats and economic impact of security breaches. Emerging security literature describes expert knowledge such as secure development best practices. This knowledge is often not applied by software developers because they lack security awareness, security training and secure development methods and tools. Existing methods and tools require too much effort and security is often given less priority in the trade-off between functionality and security. This thesis defines a tool supported approach to secure software analysis and design. Possible vulnerabilities and their causes are identified through analysis of software specifications and designs, resulting in vulnerability cause graphs. The security modelling tool SeaMonster is extended to include security activity graphs; this technique is used with vulnerability cause graphs to model vulnerabilities and security improvement activities. A security activity graph is created to identify activities that keep the vulnerabilities from instantiating in the final software product. The activities in the security activity graph can be the use of security patterns. This way the above approach is used to find a security pattern as a countermeasure to a vulnerability, and can be used with the security pattern design templates implemented in a preliminary project. This is a way of providing coupling between security expertise and software developers to apply security knowledge in software development practice. The approach and tools are tested and demonstrated through a development case study of a medical patient journal system. 
The main contributions of this thesis are an approach to secure software analysis and design, an extension of the security modelling tool SeaMonster, a case study of the approach and tools that show how security can be incorporated in early stages of software development. The contributions are intended to improve availability of security knowledge, to increase security awareness and bridge the gap between software experts and software developers.
240

Patient friendly Presentation of Electronic Patient Records

Stallemo, Kjetil January 2008 (has links)
Reading an electronic patient record (EPR) is a very challenging task because of the medical jargon, which is almost impossible for a layman to understand. This challenge is highly relevant because of the increasingly extensive use of the internet to get medical information. Norwegian law also states that patients have the right to read their own EPR. A master's thesis completed in 2006 and a specialization project in 2007 addressed this subject and developed a prototype for adapting EPRs to a patient-friendly presentation. This thesis continues that work and aims to extend the system with more functionality and improve the translation of the EPRs. The main issues discussed in the thesis are how to disambiguate between Norwegian words and medical terms, how to provide summaries of EPRs, and how to supply the patient with external information about his or her health condition. In addition, the refined user interface from the specialization project was implemented. The conclusion of this thesis is that the Support Vector Machine classifier with character bigrams provides good and accurate disambiguation between Norwegian words and medical terms. The external information functionality provides correct and quality-assured information from the patient handbook. There are still some issues and possible improvements in providing only precise and relevant articles. Summarizing of EPRs is achieved through named entity extraction of ICD codes, and then presenting the codes together with their corresponding descriptions. This implementation appears to be accurate, correct, and precise.
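The character-bigram representation used for the disambiguation can be sketched as follows. Note that the classifier below is a simple nearest-centroid stand-in rather than the thesis's actual SVM, and the word lists and boundary markers are invented for illustration:

```python
from collections import Counter
from math import sqrt

def bigrams(word):
    """Character-bigram counts; '^'/'$' boundary markers are an assumption."""
    w = f"^{word.lower()}$"
    return Counter(w[i:i + 2] for i in range(len(w) - 1))

def cosine(a, b):
    """Cosine similarity between two sparse bigram-count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def centroid(words):
    """Summed bigram counts for a class of training words."""
    c = Counter()
    for w in words:
        c.update(bigrams(w))
    return c

# Tiny hypothetical training sets (the thesis trains a real SVM instead):
MED = centroid(["dyspnoe", "oedem", "stenose", "trombose"])
NOR = centroid(["dagen", "huset", "gaten", "tiden"])

def classify(word):
    f = bigrams(word)
    return "medical" if cosine(f, MED) > cosine(f, NOR) else "norwegian"
```

The point of the bigram features is that Latinate medical terms and everyday Norwegian words have quite different character statistics, so even short unseen words carry a usable signal; the SVM then learns the decision boundary over these features.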
