331

Motif discovery in biological sequences

Sandve, Geir Kjetil January 2005 (has links)
This Master's thesis is a Ph.D. research plan for motif discovery in biological sequences, and consists of three main parts. Chapter 2 is a survey of methods for motif discovery in DNA regulatory regions, with a special emphasis on computational models. The survey presents an integrated model of the problem that allows systematic and coherent treatment of the surveyed methods. Chapter 3 presents a new algorithm for composite motif discovery in biological sequences. This algorithm has been used successfully for motif discovery in protein sequences, and will in future work be extended to explore properties of the DNA regulatory mechanism. Finally, chapter 4 describes several current research projects, as well as some more general future directions of research. The research focuses on the development of new algorithms for the discovery of composite motifs in DNA. These algorithms will partly be used for systematic exploration of the DNA regulatory mechanism. An increased understanding of this mechanism may lead to more accurate computational models, and hence more sensitive motif discovery methods.
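As a rough illustration of the underlying problem (not the thesis's algorithm, which handles composite motifs), the simplest form of motif discovery counts overrepresented k-mers across a set of sequences. The toy promoter sequences and the planted TATAAT box below are invented for the example.

```python
def most_common_kmers(sequences, k=6, top=5):
    """Count every k-mer across a set of DNA sequences and return the
    most frequent ones -- the simplest form of single-motif discovery."""
    counts = {}
    for seq in sequences:
        for i in range(len(seq) - k + 1):
            kmer = seq[i:i + k]
            counts[kmer] = counts.get(kmer, 0) + 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:top]

# Toy promoter sequences; the TATAAT box planted in each should rank first.
promoters = ["ATGCTATAATGCCG", "GGTATAATCCTTAG", "CCTATAATGGAACT"]
print(most_common_kmers(promoters))
```

Real methods replace raw counts with statistical overrepresentation against a background model, which is what makes the problem computationally hard.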
332

HPC File Server Monitoring and Tuning

Andresen, Rune Johan January 2005 (has links)
As HPC systems grow, the distributed file systems serving them need to handle an increased load of data. In order to maintain performance, the underlying file servers need to distribute the load of data volumes efficiently over the available disks. This is particularly true at CERN, the European Organization for Nuclear Research, which expects to be handling petabytes of data in the near future. In this thesis, new utilities are developed that analyze file server data, which is then used to semi-automatically tune the file system. This is achieved using a commercial database to store the data and integrating it with the file server, which requires a database and a system design that can handle a large amount of data. File server data collections, known as "volumes", can vary in size and be accessed at any time. To increase overall system performance, volume history data is analyzed to locate volumes that may be grouped together for load balancing. For instance, using the volume history data, it is possible to detect volumes that are accessed mostly during the day and place them on the same file server as volumes that are accessed mostly during the night, thus optimizing file server capacity. As part of this work, a user interface that visualizes the history data for volumes and partitions is designed and implemented on top of the AFS file system at CERN. Our initial results presented in this thesis reveal that it is possible to locate volumes that have a repeating access period and thus gather them on the same partition. Other analyses and suggestions for future work are also discussed.
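A minimal sketch of the pairing idea described above: volumes that peak during the day are co-located with volumes that peak at night, so a partition's total load stays flat. The volume names and access counts are invented; the thesis works against real AFS volume history data.

```python
# Hypothetical day-hot and night-hot volumes as (name, peak access count).
day_hot   = [("vol.users.alice", 9400), ("vol.batch.logs", 8100)]
night_hot = [("vol.backup.db", 8800), ("vol.cron.stats", 7600)]

# Greedy pairing: busiest daytime volume with busiest nighttime volume,
# one pair per partition, so peak loads complement each other.
pairs = list(zip(
    sorted(day_hot, key=lambda v: v[1], reverse=True),
    sorted(night_hot, key=lambda v: v[1], reverse=True),
))
for partition, (day_vol, night_vol) in enumerate(pairs):
    print(f"partition {partition}: {day_vol[0]} + {night_vol[0]}")
```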
333

Use of GPU Functionality in Volume Rendering

Eide, Kristian Edvard Nigar January 2005 (has links)
Volume rendering describes the process of creating a 2D projection of a 3D discretely sampled data set. This field has a number of applications, most notably within medical imaging, where the output of CT and MRI scanners is a volume data set, as well as geology, where seismic surveys are visualized as an aid in the search for oil and gas. Rendering a volume is a computationally intensive task due to the large amount of data that needs to be processed, and it is only recently, with the advent of commodity 3D accelerator cards, that interactive rendering of volumes has become possible. The latest versions of 3D graphics cards include a Graphics Processing Unit, or GPU, which is capable of executing small code fragments at very high speed. These small programs, while not as flexible as traditional programming, still represent a significant improvement in what it is possible to achieve with the added computational ability provided by the graphics card. This thesis explores how volume rendering can be enhanced by the use of a GPU. In particular, it shows an improvement to the GPU-based raycasting approach presented in [1] and also a method for integrating the “depth peeling” technique [6] with a volume renderer for correctly rendering transparent geometry embedded in the volume. In addition, an introduction to volume rendering and GPU programming is given, and a rendering of a volume with the Phong illumination model is shown.
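The Phong illumination model the abstract mentions is standard: intensity is an ambient term plus a diffuse term proportional to N·L plus a specular term proportional to (R·V)^n. A minimal per-sample sketch follows; the material constants are chosen arbitrarily for illustration, and in a volume renderer the normal would come from the gradient of the scalar field.

```python
import numpy as np

def phong(normal, light, view, ka=0.1, kd=0.7, ks=0.2, shininess=32):
    """Phong shading for one volume sample (all vectors unit length)."""
    n, l, v = (np.asarray(x, float) for x in (normal, light, view))
    diffuse = max(np.dot(n, l), 0.0)
    r = 2.0 * np.dot(n, l) * n - l            # light vector reflected about n
    specular = max(np.dot(r, v), 0.0) ** shininess
    return ka + kd * diffuse + ks * specular

# Head-on lighting and viewing: 0.1 + 0.7 + 0.2 = 1.0
print(phong([0, 0, 1], [0, 0, 1], [0, 0, 1]))
```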
334

A Multivariate Image Analysis Toolbox

Hagen, Reidar Strand January 2005 (has links)
The toolkit has been implemented as planned: the groundwork for visualisation mappings and relationships between datasets has been finished. Wavelet transforms have been used to compress datasets in order to reduce computational time. Principal Component Analysis and other transforms are working. Examples of use have been provided, along with several ways of visualizing them. Multivariate Image Analysis is viable on regular workstations.
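As an illustration of the kind of transform such a toolbox applies (a generic PCA sketch, not the toolkit's own API), a multivariate image can be flattened so each pixel is a row and each spectral band a column, then projected onto its principal components. The data below is random; real use would load an image cube.

```python
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.normal(size=(1000, 8))           # 1000 pixels, 8 spectral bands

# PCA via SVD of the mean-centered data matrix.
centered = pixels - pixels.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T                  # project onto first 2 components

explained = (s**2 / (s**2).sum())[:2]
print("scores shape:", scores.shape)
print("variance explained by first two PCs:", explained.round(3))
```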
335

Access Control in Heterogenous Health Care Systems : A comparison of Role Based Access Control Versus Decision Based Access Control

Magnussen, Gaute, Stavik, Stig January 2006 (has links)
Role based access control (RBAC) is widely used in health care systems today. Some of the biggest systems in use at Norwegian hospitals utilize role based integration. The basic concept of RBAC is that users are assigned to roles, permissions are assigned to roles, and users acquire permissions by being members of roles. An alternative approach to role based access distribution is that information should be available only to those who are taking active part in a patient’s treatment. This approach is called decision based access control (DBAC). While some RBAC implementations grant access to groups of people by ward, DBAC ensures that access to the relevant parts of the patient’s medical record is given for treatment purposes, regardless of which department the health care worker belongs to. Until now, the granularity that the legal framework describes has been difficult to follow. The practical approach has been to grant access to the entire ward or organizational unit in which the patient currently resides. Due to the protection of personal privacy, it is not acceptable that any medical record is available to every clinician at all times. The most important reason to implement DBAC where RBAC exists today is to get an access control model that is more dynamic. Users should have the access they need to perform their job at all times, but no more access than needed. With RBAC, practice has shown that it is very hard to make dynamic access rules when properties such as the time and tasks of an employee’s work change. This study reveals that nearly all security measures in the RBAC systems can be overridden by the use of emergency access features. These features are used extensively in everyday work at the hospitals and thereby create a security risk; at the same time, conformance with the legal framework is not maintained. Two scenarios are simulated in a fictional RBAC and DBAC environment in this report. The results of the simulation show that a complete audit of the logs containing access right enhancements in the RBAC environment is unfeasible at a large hospital, and even checking a few percent of the entries is a very large job. Changing from RBAC to DBAC would probably change this situation for the better. Some economic advantages are also pointed out: if a change is made, a considerable amount of the time health care workers currently spend unblocking access to information they need in their everyday work would be saved.
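The contrast between the two models can be reduced to where the access decision looks: the user's role, or the patient's care relationship. A minimal sketch, with invented names and record fields, purely to illustrate the distinction the thesis draws:

```python
def rbac_allows(user, record, role_permissions):
    """RBAC: access follows from the user's role alone."""
    return "read_record" in role_permissions.get(user["role"], set())

def dbac_allows(user, record):
    """DBAC: access follows from taking active part in the treatment."""
    return user["id"] in record["treatment_team"]

roles = {"physician": {"read_record"}}
clinician = {"id": "u17", "role": "physician"}
record = {"patient": "p1", "treatment_team": {"u42"}}

print(rbac_allows(clinician, record, roles))  # True: the role is enough
print(dbac_allows(clinician, record))         # False: not on the care team
```

Under RBAC the clinician reads any record their role covers; under DBAC the same clinician is denied because they are not on this patient's treatment team, which is exactly the dynamic, per-patient granularity the abstract argues for.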
336

Scenario testing in a real environment : Key card Administration System at the University Hospital in North Norway

Halmø, Yngve, Jenssen, Geir-Arne January 2006 (has links)
Software is gradually replacing paper based administration systems. The migration to electronic systems is supposed to make life easier for the users. If this is to be the case, then these software systems must be created in such a way that the end users are able to use them effectively. To achieve usable systems, software testing must be utilized. There are many ways to test a program, with or without involving real users. Scenario testing is a somewhat poorly documented discipline within software testing, with ambiguous definitions. It does, however, seem well suited, in combination with users, to testing the external parts of a software system at a late stage of development. This project is based on the work done in the software engineering depth study [12]. There we conducted empirical work and internal testing of the software system KAS, and laid the foundation for this Master’s thesis. In this report we have continued the work with this software and concentrated on its external characteristics and user testing. We have analyzed scenario testing further through a software test of this system involving its future users. The users have been given tasks to complete through stories that explain what to do but not how to do it. We have observed the test subjects closely throughout the tests and collected important data. The results have been evaluated in order to assess their usefulness, which in turn points to the quality of scenario testing as a testing method. The results have also spawned functional requirements which have been implemented into the KAS. Through this project we have gained experience that can be useful to others conducting scenario tests or doing research in software testing in the future.
337

Software Architecture of the Algorithmic Music System ImproSculpt

Semb, Thor Arne Gald, Småge, Audun January 2006 (has links)
This document investigates how real-time algorithmic music composition software constrains and shapes software architecture. To accomplish this, we have employed a method known as Action Research on the software system ImproSculpt. ImproSculpt is a real-time algorithmic music composition system for use in both live performance and studio contexts, created by Øyvind Brandtsegg. Our role was to improve the software architecture of ImproSculpt while gathering data for our research goal. To get an overview of architecture and architectural tactics we could use to improve the structure of the system, a literature study was first conducted on this subject. A design phase followed, where the old architecture was analyzed and a new system architecture was proposed. After the design phase was completed, we performed four iterations of the action research cyclical process model, implementing our new architecture step by step, evaluating and learning from the process as we went along. This project is a follow-up to our previous research project, “Artistic Software” [3], which investigated how algorithmic composition was influenced by software.
338

BUCS Implementing safety : An approach as to how to implement safety concerns

Vindegg, Ole-Johan Sikkeland January 2005 (has links)
BUCS Implementing safety : An approach as to how to implement safety concerns
339

An empirical study of component-based software engineering in Statoil

Ha, Vu, Tran, Kiet Ve January 2006 (has links)
Our Master's thesis is an extension of our thesis written in the autumn of 2005.
340

Component Based System Development in the Norwegian Software Industry

Sommerseth, Marius January 2006 (has links)
Today it has become common practice to apply systematic reuse during software development. Through reuse, the gain from creating a piece of software can be multiplied: instead of creating a new component each time, old ones can be reused. This increases productivity (shorter time-to-market, lower cost) and also software quality, as the components become well tested through use in different systems. There are, however, many ways of applying reuse, and different types of components that can be applied in systematic reuse. The most common ones are internally developed, OSS, COTS, or outsourced components. There are also many different ways to share and access the components among the developers. Today, all companies that apply reuse have some sort of distributed way of sharing. Using product families is also one way of applying reuse. This can take reuse to another level, as the reused parts can be vast, but it can also be used for branding a line of products. The main part of this thesis is a quantitative survey conducted with a questionnaire, in which 32 Norwegian software companies participated. The questionnaire asked who applied reuse and product families, how they applied them, and what the respondents thought was important when applying them. The collected data is used to answer three research questions and is discussed against related research. The data is also used to see whether there are differences in how reuse is applied in companies of different sizes, both internally in departments and across whole companies. The impact of different programming languages and development processes/methods on reuse is also explored. This survey builds upon the pre-study “Reuse through product-families and framework” [MS00]. In the pre-study, subjects from 12 Norwegian software development companies were interviewed about how they utilized reuse and product families. This was a qualitative survey with open questions, used to discover trends in Norwegian software development companies; these trends are examined in this thesis. The data from another survey, done by IKT-Norge, is also used in this thesis, but only the extra questions added for NTNU. These concerned process improvement as well as reuse. A total of 142 Norwegian companies responded, and 60 answered the extra questions. The IKT-Norge survey is also compared against the thesis survey.
