461

Model-Based Test Case Generation for Real-Time Systems

Hessel, Anders January 2007
Testing is the dominant verification technique used in the software industry today. The use of automatic test case execution is increasing, but the creation of test cases remains manual and thus error-prone and expensive. To automate the generation and selection of test cases, model-based testing techniques have been suggested.

In this thesis two central problems in model-based testing are addressed: how to formally specify coverage criteria, and how to generate a test suite from a formal timed system model such that the test suite satisfies a given coverage criterion. We use model checking techniques to explore the state space of a model until a set of traces is found that together satisfy the coverage criterion. A key observation is that a coverage criterion can be viewed as consisting of a set of items, which we call coverage items. Each coverage item can be treated as a separate reachability problem.

Based on this view of coverage items we define a language, in the form of parameterized observer automata, to formally describe coverage criteria. We show that the language is expressive enough to describe a variety of common coverage criteria from the literature. Two algorithms for test case generation with observer automata are presented. The first returns a trace that satisfies all coverage items at minimum cost; we use it to generate a test suite with minimal execution time. The second explores only states that may add to the already found set of coverage items, and works well together with observer automata.

The developed techniques have been implemented in the tool CoVer. The tool has been used in a case study with Ericsson in which a WAP gateway was tested. The case study shows that the techniques have industrial strength.
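The abstract's key observation — that a coverage criterion decomposes into coverage items, each a separate reachability problem — can be sketched outside the timed-automata setting. The toy model and the edge-coverage criterion below are illustrative stand-ins, not CoVer's actual input format:

```python
from collections import deque

# Toy model: state -> [(transition label, next state)]. In CoVer the model
# is a network of timed automata; this plain finite transition system is an
# illustrative stand-in, as is the edge-coverage criterion below.
MODEL = {
    "idle":  [("connect", "open")],
    "open":  [("send", "open"), ("close", "idle"), ("timeout", "error")],
    "error": [("reset", "idle")],
}

def trace_covering(item, start="idle"):
    """Treat one coverage item (here: a transition label) as a separate
    reachability problem and BFS until a trace exercising it is found."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, trace = queue.popleft()
        for label, nxt in MODEL.get(state, []):
            if label == item:
                return trace + [label]        # coverage item reached
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [label]))
    return None                               # item unreachable in the model

# One reachability query per coverage item; the answers together form a
# suite satisfying the criterion.
for item in ["send", "timeout", "reset"]:
    print(f"{item:8} covered by {trace_covering(item)}")
```

In the thesis the per-item queries run over timed models, with observer automata tracking which items a trace has already satisfied; a minimum-cost variant then yields a suite with minimal execution time.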
462

Knowledge Technology Applications for Knowledge Management

Andersson, Kent January 2000
We investigate how the knowledge in knowledge technology applications for knowledge management can be represented so that the user can directly manage the knowledge content of the applications.

In paper I we design a representation of diagnosis knowledge that allows the user to add new components and inspect the function of the device. The representation integrates model-based knowledge with compiled and heuristic knowledge, so that the device and its function can be represented at a suitable level of abstraction while other parts are represented as non-model-based knowledge.

In paper II we use simplified rules for describing the time, resources, activities and amounts required in a tunnelling project, and a simulation engine for estimating the time and amounts consumed in the tunnelling process. The rules are designed to allow a user to change the facts and computations of the system.

In paper III we present the constraint modelling language CML and show how to model a nurse scheduling problem and a train scheduling problem without programming. The idea is to preserve the problem structure of the domain, allowing constraint formulations that reflect natural language expressions familiar to the users. CML problem specifications are transformed automatically into standard constraint programs.

In paper IV we investigate the use of decision tables for representing requirements on staff scheduling explicitly, providing structure, decision support and overview to the user. The requirements are compiled automatically into a program that uses hand-written procedures for efficient scheduling.

It seems possible to let the user modify central parts of the knowledge content in these applications with such representations, using various illustration techniques: object-based graphics for manipulating device components and connections in diagnosis, simplified rules for simulation of tunnelling activities, text-based query-language specification of scheduling problems, and decision tables for constraint problems and decision support.
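Paper IV's idea — scheduling requirements kept explicit as a decision table rather than buried in code — can be sketched in miniature. The shift attributes and rules below are hypothetical, for illustration only:

```python
# A decision table as plain data: each row pairs condition predicates with
# an action. Shift attributes and rules are hypothetical stand-ins.
RULES = [
    {"when": [lambda s: s["hours"] > 40],
     "then": "reject: exceeds weekly hours"},
    {"when": [lambda s: s["night"] and s["next_day_early"]],
     "then": "reject: no early shift after a night shift"},
    {"when": [],                            # default row: no conditions
     "then": "accept"},
]

def evaluate(shift):
    """First matching row wins -- the row order is itself part of the
    requirement, just as in an explicit decision table."""
    for row in RULES:
        if all(cond(shift) for cond in row["when"]):
            return row["then"]

print(evaluate({"hours": 38, "night": True, "next_day_early": True}))
print(evaluate({"hours": 44, "night": False, "next_day_early": False}))
```

Because the table is data, a user can add or reorder rows without touching the hand-written scheduling procedures the requirements are compiled against.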
463

Test case generation using symbolic grammars and quasirandom sequences

Felix Reyes, Alejandro 06 1900
This work presents a new test case generation methodology that offers a high degree of automation (reducing cost) while providing increased power in terms of defect detection (increasing benefit). Our solution is a variation of model-based testing that takes advantage of symbolic grammars (context-free grammars in which terminals are replaced by regular expressions representing their solution spaces) and quasi-random sequences to generate test cases. Previous test case generation techniques are enhanced with adaptive random testing to maximize input space coverage, and with selective and directed sentence generation techniques to optimize sentence generation. Our solution was tested by generating 200 firewall policies containing up to 20 000 rules from a generic firewall grammar. Our results show that the system generates test cases with superior coverage of the input space, increasing the probability of defect detection while considerably reducing the number of test cases needed compared with previously used approaches. / Software Engineering and Intelligent Systems
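A minimal sketch of the two ingredients named above — a grammar whose terminals denote value spaces, sampled with a quasi-random (low-discrepancy) sequence — assuming a toy firewall-rule grammar; the Halton sequence stands in for whatever quasi-random generator the thesis actually uses:

```python
def halton(i, base):
    """i-th element of the Halton low-discrepancy sequence in [0, 1)."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

# A toy symbolic grammar for firewall rules. Terminals carry a value space
# (lists and ranges standing in for a regular expression's solution space).
# Grammar, spaces and bases are illustrative, not the thesis's own.
GRAMMAR = {"rule": ["action", "proto", "port"]}
SPACES = {
    "action": ["allow", "deny"],
    "proto":  ["tcp", "udp", "icmp"],
    "port":   list(range(1, 65536)),
}
BASES = {"action": 2, "proto": 3, "port": 5}   # one Halton base per terminal

def generate(symbol, i):
    """Expand a symbol; terminals are sampled quasi-randomly so the i-th
    test case lands at a low-discrepancy point of the input space."""
    if symbol in SPACES:
        space = SPACES[symbol]
        return [space[int(halton(i + 1, BASES[symbol]) * len(space))]]
    tokens = []
    for child in GRAMMAR[symbol]:
        tokens += generate(child, i)
    return tokens

for i in range(5):
    print(generate("rule", i))
```

Successive Halton points fill [0, 1) far more evenly than independent pseudo-random draws, so the generated rules spread across the terminals' solution spaces — the input-space coverage property the methodology relies on.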
464

A Concurrency and Time Centered Framework for Certification of Autonomous Space Systems

Dechev, Damian December 2009
Future space missions, such as the Mars Science Laboratory, suggest the engineering of some of the most complex man-rated autonomous software systems. The present process-oriented certification methodologies are becoming prohibitively expensive and do not reach the level of detail needed to provide guidelines for the development and validation of concurrent software. Time and concurrency are the most critical notions in an autonomous space system. In this work we present the design and implementation of the first concurrency- and time-centered framework for product-oriented software certification of autonomous space systems. To achieve fast and reliable concurrent interactions, we define and apply the notion of Semantically Enhanced Containers (SECs). SECs are data structures designed to provide the flexibility and usability of the popular ISO C++ STL containers, while at the same time being hand-crafted to guarantee domain-specific policies, such as conformance to a given concurrency model. The application of nonblocking programming techniques is critical to the implementation of our SEC containers. Lock-free algorithms help avoid the hazards of deadlock, livelock, and priority inversion, and at the same time deliver fast and scalable performance. Practical lock-free algorithms are notoriously difficult to design and implement, posing a number of hard problems such as ABA avoidance, high complexity, portability, and meeting the linearizability correctness requirements. This dissertation presents the design of the first lock-free dynamically resizable array. Our approach offers a set of practical, portable, lock-free, and linearizable STL vector operations, and a fast and space-efficient implementation compared to the alternative lock- and STM-based techniques. Currently, the literature does not offer an explicit analysis of the ABA problem, its relation to the most commonly applied nonblocking programming techniques, or the possibilities for its detection and avoidance; eliminating the hazards of ABA is left to the ingenuity of the software designer. We present a generic and practical solution to the fundamental ABA problem for lock-free descriptor-based designs. To enable our SEC containers with the property of validating domain-specific invariants, we present Basic Query, our expression-template-based library for statically extracting semantic information from C++ source code. The use of static analysis allows for a far more efficient implementation of our nonblocking containers than would otherwise have been possible with traditional run-time techniques. Shared data in a real-time cyber-physical system can often be polymorphic (as is the case with a number of components of the Mission Data System's Data Management Services). The use of dynamic cast is important in the design of autonomous real-time systems, since the operation allows for a direct representation of the management and behavior of polymorphic data. To allow the application of dynamic cast in mission-critical code, we validate and improve a methodology for constant-time dynamic cast that shifts the complexity of the operation to the compiler's static checker. In a case study that demonstrates the applicability of the programming and validation techniques of our certification framework, we show the process of verification and semantic parallelization of the Mission Data System's (MDS) Goal Networks. MDS provides an experimental platform for testing and development of autonomous real-time flight applications.
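The ABA problem the dissertation tackles is easiest to see in a simulated interleaving. The sketch below fakes compare-and-swap in plain Python purely to expose the hazard; real lock-free code relies on hardware CAS (e.g. C++ std::atomic), and the stack, node names, and schedule here are illustrative:

```python
# The ABA hazard in a simulated interleaving on a lock-free stack.
class Node:
    def __init__(self, value, nxt=None):
        self.value, self.next = value, nxt

class Stack:
    def __init__(self, top=None):
        self.top = top

def cas(obj, field, expected, new):
    """Simulated compare-and-swap on obj.field -- atomic on real hardware."""
    if getattr(obj, field) is expected:
        setattr(obj, field, new)
        return True
    return False

B = Node("B")
A = Node("A", B)
stack = Stack(A)

# Thread 1 begins pop(): reads top (A) and its successor (B), then stalls.
t1_top, t1_next = stack.top, stack.top.next

# Thread 2 runs to completion: pops A, pops B, then pushes the *same*
# node object A back onto an otherwise empty stack.
stack.top = B          # pop A
stack.top = None       # pop B
A.next = None
stack.top = A          # push recycled A

# Thread 1 resumes. Its CAS compares only the pointer: "top is still A",
# so it succeeds -- and installs B, a node that is no longer in the stack.
print(cas(stack, "top", t1_top, t1_next))   # True: the ABA hazard fires
print(stack.top.value)                       # 'B' -- a stale, dangling node
```

One standard family of fixes pairs the compared word with a version tag so the recycled "A" no longer compares equal; the dissertation presents its own generic solution for lock-free descriptor-based designs.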
466

Model Based Coding: Initialization, Parameter Extraction and Evaluation

Yao, Zhengrong January 2005
This thesis covers topics relevant to model-based coding, a promising very low bit rate video coding technique. The idea behind this technique is to parameterize a talking head and to extract and transmit the parameters describing facial movements. At the receiver, the parameters are used to reconstruct the talking head. Since only high-level animation parameters are transmitted, very high compression can be achieved with this coding scheme. The thesis covers three key problems. Although fundamental, the initialization problem has been neglected to some extent in the literature. We pay particular attention to this problem and propose a pseudo-automatic initialization scheme: an Analysis-by-Synthesis scheme based on Simulated Annealing, which has proved to be efficient. Owing to today's technical advances and the newly emerged MPEG-4 standard, new schemes for texture mapping and motion estimation are suggested that use sample-based direct texture mapping; the feasibility of active motion estimation is explored, which proves able to give more than ten times higher tracking resolution. Building on mature face detection techniques, Dynamic Programming is introduced into the face detection module to support face tracking. Another important problem addressed in this thesis is how to evaluate face tracking techniques. We study the evaluation problem by examining the commonly used method, which employs a physical magnetic sensor to provide "ground truth", and point out that it is quite misleading to use such a method.
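The Analysis-by-Synthesis initialization scheme can be sketched generically: synthesize from candidate parameters, compare against the observation, and let simulated annealing drive the search. The trivial "renderer" below stands in for an actual head model, and every constant is illustrative:

```python
import math, random

random.seed(0)

# Analysis-by-synthesis in miniature: find the model parameters whose
# synthesized output best matches the observation.
TRUE_PARAMS = [0.3, -1.2, 0.8]                # hidden "facial" parameters

def render(params):
    return list(params)                       # stand-in for a head renderer

def cost(params, observed):
    synthesized = render(params)
    return sum((a - b) ** 2 for a, b in zip(synthesized, observed))

def anneal(observed, steps=20000):
    params = [0.0, 0.0, 0.0]                  # crude initial guess
    best = params[:]
    for k in range(steps):
        temp = max(1e-6, 1.0 * (1 - k / steps))       # cooling schedule
        cand = [p + random.gauss(0, 0.1) for p in params]
        delta = cost(cand, observed) - cost(params, observed)
        # Accept downhill moves always, uphill moves with prob e^(-delta/T),
        # which lets early iterations escape poor local minima.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            params = cand
        if cost(params, observed) < cost(best, observed):
            best = params[:]
    return best

observed = render(TRUE_PARAMS)
print([round(p, 2) for p in anneal(observed)])   # close to TRUE_PARAMS
```

The annealing acceptance rule is what makes the scheme attractive for initialization, where the image-difference cost surface is riddled with local minima.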
467

Integrated information management in complex product development

Malvius, Diana January 2009
Where do companies begin their efforts when trying to improve information management in product development? In large companies involving many people, multiple processes and highly technological products, several factors have an impact on efficiency. Interdisciplinary integration and structured information are two overall key factors that have been identified as important for efficient information management. Measurement of the satisfaction level among information system users is proposed as an angle of approach to identify key improvement areas from an operative perspective, areas that are argued to be strategic for management to address. However, the need for adjustments to contextual prerequisites and a changing environment makes evaluation necessary prior to measurement. An evaluation framework is proposed to identify metrics that are tailored and kept in line with business and development strategies to ensure their relevancy. This research has aimed at taking a holistic approach to information management in complex product development. The research focus has been on the integration between engineering disciplines, where software and electrical R&D departments at automotive companies have been the main source of the analysis material. Integrated information management entails support for activities within the engineering domain. Several strategies are discussed for managing trade-offs in organizations in order to succeed with integrated information management. A needs-based balance is one important approach proposed to resolve changing and conflicting needs. Furthermore, it is argued that operative and strategic goals should be allowed to co-exist. Providing the right infrastructure to support designers in their everyday work does not necessarily mean additional functionality in existing information systems or automated work activities by improved document templates. Rather, it is suggested that a shift in focus (from addressing detailed requirements management to reflecting on interrelationships between information objects and system inter-dependencies) would be a strong mechanism for succeeding with information management. The transition into model-based development is argued to be a much needed change for organizations to obtain integrated information management, since a model-based approach is considered an important basis for structured information. Anticipated benefits of integrated information management are increased information availability, reduced information overflow, and enhanced communication and understanding of critical system dependencies.
469

Comparison of Kinematic Results Between METU-KISS and Ankara University-VICON Gait Analysis Systems

Civek, Ezgi 01 December 2006
KISS (Kinematic Support System) is a locally developed gait analysis system at Middle East Technical University (METU), and its performance has previously been evaluated only as a whole. Such evaluations do not differentiate between the efficacy of the data acquisition system and that of the model-based gait analysis methodology. In this thesis, the kinematic results of the KISS system are compared with those of the Ankara University based commercial VICON (Oxford Metrics Ltd., Oxford, UK) system, with a view to evaluating the performance of the data acquisition system and the gait analysis methodology separately. This study is expected to provide guidelines for future developments of the KISS system.
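One conventional way to compare kinematic outputs from two systems is to summarize the difference between joint-angle curves over the normalized gait cycle. A sketch with synthetic curves — not METU-KISS or VICON data — might look like:

```python
import math

# Compare two systems' joint-angle curves over a normalized gait cycle
# (0-100%) using RMS and peak differences. The curves are synthetic.
def knee_flexion(percent, phase_shift=0.0):
    """Crude synthetic knee-flexion curve in degrees."""
    return 25 + 20 * math.sin(math.radians(3.6 * percent + phase_shift))

system_a = [knee_flexion(p) for p in range(101)]
system_b = [knee_flexion(p, phase_shift=4.0) for p in range(101)]  # offset

diffs = [a - b for a, b in zip(system_a, system_b)]
rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
peak = max(abs(d) for d in diffs)
print(f"RMS difference {rms:.2f} deg, peak difference {peak:.2f} deg")
```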
470

Risk-conscious design of off-grid solar energy houses

Hu, Huafen 16 November 2009
Zero energy houses and (near) zero energy buildings are among the most ambitious targets of a society moving towards an energy efficient built environment. The "zero" energy consumption is most often judged on a yearly basis and should thus be interpreted as yearly net zero energy. The fully self-sustainable, i.e. off-grid, home poses a major challenge due to the dynamic nature of building load profiles, ambient weather conditions and occupant needs. In current practice, off-grid status is achievable only by relying on backup generators or a large energy storage system. This research develops a risk-based holistic system design method to guarantee a match between onsite sustainable energy generation and the energy demand of systems and occupants. Energy self-sufficiency is the essential constraint that drives the design process. It starts with collecting information about occupants' needs in terms of lifestyle, risk perception, and budget planning. These inputs are stated as probabilistic risk constraints that are applied during design evolution. Risk expressions are developed based on the relationships between power unavailability criteria and "damages" as perceived by occupants. A power reliability assessment algorithm is developed to aggregate the causes of system underperformance and estimate all possible power availability outcomes of an off-grid house design. On these foundations, the design problem of an off-grid house is formulated as a stochastic programming problem with probabilistic constraints. The results show that inherent risks in weather patterns dominate the risk level of off-grid houses if current power unavailability criteria are used. It is concluded that a realistic and economic design of an off-grid house can only be achieved once an appropriate design weather file is developed for risk-conscious design methods. The second stage of the research deals with the potential for risk mitigation when an intelligent energy management system is installed. A stochastic model-based predictive controller is implemented to manage energy allocation to individual functions in the off-grid house during operation. The controller determines in real time the priority of energy-consuming activities and functions. Re-evaluation of the risk indices shows that the proposed controller helps occupants reduce damages related to power unavailability and increases the thermal comfort performance of the house. The research provides a risk-oriented view of the energy self-sufficiency of off-grid solar houses. Uncertainty analysis is used to verify the match between onsite sustainable energy supply and demand under dynamic ambient conditions, in a manner that reveals the risks induced by the fact that new technologies may not perform as well as expected. Furthermore, taking occupants' needs, based on their risk perception, as constraints in design evolution provides better guarantees for a right-sized system design.
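The probabilistic risk constraint at the heart of the method — keep the probability of unmet load below a level the occupants accept — can be checked by Monte Carlo simulation over weather and load scenarios. Everything below (system sizes, distributions, the risk limit) is a hypothetical stand-in for the thesis's actual models:

```python
import random

random.seed(1)

# Monte Carlo check of a chance constraint for an off-grid house:
# P(unmet load) must stay below a risk level accepted by the occupants.
PV_KW, BATTERY_KWH, RISK_LIMIT = 6.0, 12.0, 0.05

def simulate_day():
    """One day: solar yield and household load from toy distributions."""
    sun_hours = max(0.0, random.gauss(4.5, 2.0))      # weather uncertainty
    load_kwh = random.uniform(8.0, 14.0)              # occupant behaviour
    return PV_KW * sun_hours, load_kwh

def unmet_probability(days=100_000):
    soc, failures = BATTERY_KWH / 2, 0                # battery state of charge
    for _ in range(days):
        gen, load = simulate_day()
        soc = min(BATTERY_KWH, soc + gen) - load      # charge, then discharge
        if soc < 0:
            failures += 1                             # power unavailable today
            soc = 0.0                                 # load shed, battery empty
    return failures / days

p = unmet_probability()
print(f"P(unmet load) ~ {p:.3f}  (constraint: <= {RISK_LIMIT})")
```

In a design loop, PV_KW and BATTERY_KWH would be the decision variables and this estimate the probabilistic constraint that the stochastic program must satisfy.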
