21

Proceedings of the 19th Bilateral Student Workshop CTU Prague

21 July 2017
This technical report publishes the proceedings of the 19th Prague Workshop, which was held from 27th to 28th November 2015. The workshop offers young scientists an opportunity to present their current research work in the fields of computer graphics, human-computer interaction, robotics and usability. It is meant as a platform to bring together researchers from the Czech Technical University in Prague (CTU) and the University of Applied Sciences Dresden (HTW). The German Academic Exchange Service provides financial support that enables the bilateral exchange of student participants between Prague and Dresden.
22

Proceedings of the 20th Bilateral Student Workshop CTU Prague

21 July 2017
This technical report publishes the proceedings of the 20th Prague Workshop, which was held from 25th to 26th November 2016. The workshop offers young scientists an opportunity to present their current research work in the fields of computer graphics, human-computer interaction, robotics and usability. It is meant as a platform to bring together researchers from the Czech Technical University in Prague (CTU) and the University of Applied Sciences Dresden (HTW). The German Academic Exchange Service provides financial support that enables the bilateral exchange of student participants between Prague and Dresden.
23

Computing Quantiles in Markov Reward Models

Ummels, Michael, Baier, Christel 10 July 2014
Probabilistic model checking mainly concentrates on techniques for reasoning about the probabilities of certain path properties or the expected values of certain random variables. For quantitative system analysis, however, another type of performance measure is also of interest, namely quantiles. A typical quantile query takes as input a lower probability bound p ∈ ]0,1] and a reachability property. The task is then to compute the minimal reward bound r such that, with probability at least p, the target set is reached before the accumulated reward exceeds r. Quantiles are well known from mathematical statistics, but to the best of our knowledge they have not been addressed by the model-checking community so far. In this paper, we study the complexity of quantile queries for until properties in discrete-time finite-state Markov decision processes with nonnegative rewards on states. We show that qualitative quantile queries can be evaluated in polynomial time and present an exponential algorithm for the evaluation of quantitative quantile queries. For the special case of Markov chains, we show that quantitative quantile queries can be evaluated in pseudo-polynomial time.
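To make the quantile semantics concrete, here is a minimal sketch that evaluates such a query for a small Markov chain by dynamic programming over reward budgets, in the spirit of the pseudo-polynomial result mentioned above. The example chain, the state names, and the restriction to strictly positive rewards on non-target states (which makes the recursion terminate) are assumptions for illustration, not the paper's algorithm.

```python
from functools import lru_cache

# Illustrative Markov chain: successor distributions and nonnegative state
# rewards. Non-target states carry strictly positive rewards here, which
# guarantees the recursion terminates (an assumption of this sketch).
P = {
    "s0": [("s1", 0.5), ("s2", 0.5)],
    "s1": [("goal", 0.7), ("s0", 0.3)],
    "s2": [("goal", 0.2), ("s2", 0.8)],
}
reward = {"s0": 1, "s1": 2, "s2": 1, "goal": 0}
TARGET = {"goal"}

@lru_cache(maxsize=None)
def reach_prob(state: str, budget: int) -> float:
    """Probability of reaching TARGET from `state` with accumulated reward <= budget."""
    if state in TARGET:
        return 1.0
    if reward[state] > budget:
        return 0.0  # cannot even pay this state's reward
    rest = budget - reward[state]
    return sum(p * reach_prob(succ, rest) for succ, p in P[state])

def quantile(start: str, p: float, max_budget: int = 10_000) -> int:
    """Smallest reward bound r with Pr(reach TARGET within r) >= p."""
    for r in range(max_budget + 1):
        if reach_prob(start, r) >= p:
            return r
    raise ValueError("probability bound not met within max_budget")

print(quantile("s0", 0.8))
```

Since reach_prob is monotone in the budget, scanning budgets upward finds the minimal reward bound; a binary search over a known upper bound would work as well.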
24

Creating and Maintaining Consistent Documents with Elucidative Development

Bartho, Andreas 20 September 2016
Software systems usually consist of multiple artefacts, such as requirements, class diagrams, or source code. Documents, such as specifications and documentation, can also be viewed as artefacts. In practice, however, writing and updating documents is often neglected because it is expensive and brings no immediate benefit. Consequently, documents are often outdated and communicate wrong information about the software. The price is paid later, when a software system must be maintained and much of the implicit knowledge that existed at the time of the original development has been lost. A simple way to keep documents up to date is generation. However, not all documents can be fully generated; usually, at least some content must be written by a human author, and this handwritten content is lost if the documents must be regenerated. In this thesis, Elucidative Development is introduced. It is an approach to create documents by partial generation, meaning that some parts of the document are generated whereas others are handwritten. Elucidative Development retains manually written content when the document is regenerated. An integral part of Elucidative Development is a guidance system, which informs the author about changes in the generated content and helps with updating the handwritten content.
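A common way to realize the partial generation described above is to protect handwritten regions with markers, harvest them before regeneration, and re-insert them afterwards. The sketch below illustrates this protected-region idea; the marker syntax and function names are assumptions for illustration, not the tooling of the thesis.

```python
import re

# Hypothetical markers delimiting handwritten regions in a generated document.
BLOCK = re.compile(
    r"<!-- BEGIN HANDWRITTEN (\w+) -->(.*?)<!-- END HANDWRITTEN \1 -->",
    re.DOTALL,
)

def harvest(document: str) -> dict[str, str]:
    """Collect handwritten regions keyed by their id before regeneration."""
    return {m.group(1): m.group(2) for m in BLOCK.finditer(document)}

def reinject(generated: str, regions: dict[str, str]) -> str:
    """Put preserved handwritten content back into a freshly generated document."""
    def fill(match: re.Match) -> str:
        rid, body = match.group(1), match.group(2)
        kept = regions.get(rid, body)  # fall back to the generated placeholder
        return f"<!-- BEGIN HANDWRITTEN {rid} -->{kept}<!-- END HANDWRITTEN {rid} -->"
    return BLOCK.sub(fill, generated)

old = "<h1>API</h1>\n<!-- BEGIN HANDWRITTEN intro -->My notes.<!-- END HANDWRITTEN intro -->"
new = "<h1>API v2</h1>\n<!-- BEGIN HANDWRITTEN intro --><!-- END HANDWRITTEN intro -->"
print(reinject(new, harvest(old)))
```

Real elucidative tooling goes further, in particular the guidance system that flags which generated parts changed; the point here is only that handwritten regions survive regeneration.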
25

Towards a Maturity Model to Measure Organizational Agility in the Software and IT Services Industry

Wendler, Roy 05 October 2016
Agile software development methods reduce project costs and development time while simultaneously enhancing quality. But despite these advantages, agile principles are rarely adopted by the whole organization. In fact, it seems difficult to describe what distinguishes an agile organization from others. Therefore, the objective of this thesis is to gain an understanding of what factors constitute agility at an enterprise level and to develop a maturity model as a measurement tool. To fulfill this aim, the thesis is based on a multi-paradigmatic approach combining behavioral and design science and utilizes a pluralistic set of research methods belonging to both paradigms. A comprehensive analysis of agility-related frameworks showed that, despite partial similarities, there is no consensus about what constitutes an “agile organization”. Hence, the thesis identified the structure behind the concept of organizational agility using an exploratory research approach. A survey among organizations in the software and IT services industry was conducted and showed that organizational agility can be described by six interrelated factors, which can be further aggregated into the three basic dimensions of “Agility Prerequisites,” “Agility of People,” and “Structures Enhancing Agility.” Based on these results, the Organizational Agility Maturity Model has been developed, providing a theoretically and empirically grounded structure of organizational agility that supports the efforts of developing a common understanding of the concept. The application of the maturity model furthermore creates useful benefits for organizations and underscores the strategic character of organizational agility. It generates an awareness of the complexity of organizational agility. Furthermore, it may serve as a reference frame for implementing a systematic and well-directed approach to improvements and continuous assessment of the actions taken.
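As a rough illustration of how such a factor-based assessment could be operationalized, the sketch below averages per-factor survey scores into the three dimensions named in the abstract and derives an overall level. The factor names, the 1-5 scale, the factor-to-dimension mapping, and the "weakest dimension" level rule are all assumptions for illustration, not the model's actual calibration.

```python
# Illustrative aggregation of survey scores (1-5 scale assumed) into the
# three dimensions named in the abstract. The six factor names and their
# mapping to dimensions are assumptions for this sketch.
DIMENSIONS = {
    "Agility Prerequisites": ["technology", "organizational culture"],
    "Agility of People": ["workforce", "management of change"],
    "Structures Enhancing Agility": ["collaboration", "flexible structures"],
}

def assess(scores: dict[str, float]) -> dict[str, float]:
    """Average factor scores per dimension."""
    return {
        dim: sum(scores[f] for f in factors) / len(factors)
        for dim, factors in DIMENSIONS.items()
    }

def maturity_level(dimension_scores: dict[str, float]) -> int:
    """Assumed rule: a level is attained only if every dimension attains it."""
    return int(min(dimension_scores.values()))

scores = {"technology": 4.2, "organizational culture": 3.1,
          "workforce": 3.8, "management of change": 2.9,
          "collaboration": 4.0, "flexible structures": 3.3}
dims = assess(scores)
print(dims, maturity_level(dims))
```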
26

Skalierbare Ausführung von Prozessanwendungen in dienstorientierten Umgebungen

Preißler, Steffen 19 November 2012
The structuring and use of company-internal IT infrastructures on the basis of service-oriented architectures (SOA) and established XML technologies has grown steadily in recent years. While the focus of early SOA realizations was on the flexible execution of classic, business-relevant processes, timely data analyses and the monitoring of business-relevant events nowadays form further important application classes, serving both to identify problems in business operations in the short term and to recognize medium- and long-term changes in the market so that the company's business processes can be flexibly adapted to them. Because the three application classes historically evolved independently of one another, the respective application processes are currently modeled and executed in separate systems. This, however, results in a number of drawbacks, which this thesis identifies and discusses in detail. Against this background, the present thesis is concerned with deriving a consolidated execution platform that makes it possible to model processes of all three application classes together and to execute them efficiently in an SOA-based infrastructure. The thesis addresses the problems of such a consolidated execution platform at the three levels of service communication, process execution, and the optimal distribution of SOA components in an infrastructure.
27

Measuring energy consumption for short code paths using RAPL

Hähnel, Marcus, Döbel, Björn, Völp, Marcus, Härtig, Hermann 28 May 2013
Measuring the energy consumption of software components is a major building block for generating models that allow for energy-aware scheduling, accounting and budgeting. Current measurement techniques focus on coarse-grained measurements of application or system events. However, fine-grained adjustments, in particular in the operating-system kernel and in application-level servers, require power profiles at the level of a single software function. Until recently, this appeared to be impossible due to the lack of fine-grained resolution and the high cost of measurement equipment. In this paper we report on our experience in using the Running Average Power Limit (RAPL) energy sensors available in recent Intel CPUs for measuring the energy consumption of short code paths. We investigate the granularity at which RAPL measurements can be performed and discuss practical obstacles that occur when performing these measurements on complex modern CPUs. Furthermore, we demonstrate how to use the RAPL infrastructure to characterize the energy costs of decoding video slices.
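For readers who want to try this themselves, the sketch below reads the package-level RAPL counter that Linux exposes through the powercap sysfs interface and differences it around a code path. The sysfs location is the standard Linux powercap path, but domain numbering and availability vary by machine; treat the path and the wraparound handling as assumptions to adapt.

```python
from pathlib import Path

# Linux exposes RAPL domains via the powercap framework; intel-rapl:0 is
# typically the package-0 domain. Adjust for your machine; reading these
# files usually requires root.
DOMAIN = Path("/sys/class/powercap/intel-rapl:0")

def read_uj(name: str) -> int:
    return int((DOMAIN / name).read_text())

def measure(fn, *args):
    """Return (result, consumed energy in microjoules) for one call to fn."""
    max_range = read_uj("max_energy_range_uj")
    before = read_uj("energy_uj")
    result = fn(*args)
    after = read_uj("energy_uj")
    # The counter wraps around at max_energy_range_uj; compensate once.
    consumed = after - before if after >= before else after + max_range - before
    return result, consumed

def busy_work(n: int) -> int:
    return sum(i * i for i in range(n))

_, uj = measure(busy_work, 10_000_000)
print(f"~{uj / 1_000_000:.3f} J")
```

Note that the RAPL counters only update at intervals on the order of a millisecond, so a single read pair like this is meaningful only for paths spanning several updates; handling genuinely short code paths is exactly the methodological problem the paper addresses.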
28

Multi-Quality Auto-Tuning by Contract Negotiation

Götz, Sebastian 13 August 2013
A characteristic challenge of software development is the management of omnipresent change. Classically, this constant change is driven by customers changing their requirements. The wish to optimally leverage available resources opens another source of change: the software system's environment. Software is tailored to specific platforms (e.g., hardware architectures), resulting in many variants of the same software optimized for different environments. If the environment changes, a different variant should be used, i.e., the system has to reconfigure to the variant optimized for the new situation. The automation of such adjustments is the subject of the research community on self-adaptive systems. The basic principle is a control loop, as known from control theory: the system (and environment) is continuously monitored, the collected data is analyzed, and decisions for or against a reconfiguration are computed and realized. Central problems in this field, which are addressed in this thesis, are the management of interdependencies between non-functional properties of the system, the handling of multiple decision criteria, and scalability. In this thesis, a novel approach to self-adaptive software, Multi-Quality Auto-Tuning (MQuAT), is presented, which provides design and operation principles for software systems that automatically provide the best possible utility to the user at the least possible cost. For this purpose, a component model has been developed, enabling the software developer to design and implement self-optimizing software systems in a model-driven way. This component model allows for the specification of the structure as well as the behavior of the system and is capable of covering the runtime state of the system. The notion of quality contracts is utilized to cover the non-functional behavior and, especially, the dependencies between non-functional properties of the system. At runtime, this runtime model is used in combination with the contracts to generate optimization problems in different formalisms: Integer Linear Programming (ILP), Pseudo-Boolean Optimization (PBO), Ant Colony Optimization (ACO) and Multi-Objective Integer Linear Programming (MOILP). Standard solvers are applied to derive solutions to these problems, which represent reconfiguration decisions if the identified configuration differs from the current one. Each approach is empirically evaluated in terms of its scalability, showing the feasibility of all approaches except ACO, the superiority of ILP over PBO, and the limits of all approaches: 100 component types for ILP, 30 for PBO, 10 for ACO and 30 for 2-objective MOILP. In the presence of more than two objective functions, the MOILP approach is shown to be infeasible.
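To give a flavor of the reconfiguration decision that such a runtime model feeds into, the sketch below enumerates component-variant assignments and picks the one with the best utility-to-cost trade-off under a resource budget. It is a brute-force stand-in for the ILP/PBO formulations mentioned above; the variant data, the scalar objective, and the single resource constraint are assumptions for illustration.

```python
from itertools import product

# Hypothetical variants per component: (name, utility, cost, resource demand).
VARIANTS = {
    "decoder": [("sw", 3.0, 1.0, 2), ("gpu", 8.0, 4.0, 5)],
    "scaler":  [("naive", 2.0, 0.5, 1), ("simd", 5.0, 1.5, 3)],
}
RESOURCE_BUDGET = 7

def best_configuration():
    """Exhaustively search variant combinations (an ILP solver scales better)."""
    best, best_score = None, float("-inf")
    for combo in product(*VARIANTS.values()):
        demand = sum(v[3] for v in combo)
        if demand > RESOURCE_BUDGET:
            continue  # infeasible under the resource constraint
        score = sum(v[1] for v in combo) - sum(v[2] for v in combo)
        if score > best_score:
            best, best_score = combo, score
    return dict(zip(VARIANTS, (v[0] for v in best))), best_score

config, score = best_configuration()
print(config, score)
```

A reconfiguration would then be triggered only if the chosen configuration differs from the one currently running, mirroring the decision step of the control loop described above.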
29

Entwurf eines Frameworks für CTI-Lösungen im Call Center

Bauer, Nikolai 16 December 2002
Computer Telephony Integration (CTI), the integration of IT systems and telephone systems, plays an important role especially in call centers. While this integration is usually solved satisfactorily at the technical level, a look at software development in this area still reveals room for improvement. This thesis takes up that problem and attempts to map the CTI approach onto the level of distributed application development. The goal is to determine to what extent a general base model can be defined as a framework for the development of CTI applications and what added value it brings. In parallel, the thesis examines to what extent proven methods and technologies of distributed systems can be applied in this special field. To this end, a general application model for CTI solutions and, building on it, an object-oriented, distributed framework are designed. The framework itself is implemented as a prototype and subjected to various performance measurements, through which the practicability of the concept is examined and verified.
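To illustrate the kind of abstraction such a CTI framework might offer application developers, the sketch below models telephony events with a small observer interface, so call-center applications can react to call events without touching the telephony layer. The event fields and class names are illustrative assumptions, not the framework designed in the thesis.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass(frozen=True)
class CallEvent:
    """A telephony event as an application-level object (illustrative fields)."""
    kind: str          # e.g. "ringing", "connected", "disconnected"
    caller: str
    agent_queue: str

@dataclass
class CtiDispatcher:
    """Decouples telephony signalling from call-center applications."""
    _handlers: dict[str, list[Callable[[CallEvent], None]]] = field(default_factory=dict)

    def subscribe(self, kind: str, handler: Callable[[CallEvent], None]) -> None:
        self._handlers.setdefault(kind, []).append(handler)

    def publish(self, event: CallEvent) -> None:
        for handler in self._handlers.get(event.kind, []):
            handler(event)

dispatcher = CtiDispatcher()
dispatcher.subscribe("ringing", lambda e: print(f"pop customer record for {e.caller}"))
dispatcher.publish(CallEvent("ringing", "+49-351-123456", "support"))
```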
30

Object-Oriented Development for Reconfigurable Architectures

Fröhlich, Dominik 30 November 2009
Reconfigurable hardware architectures have been available for several years now. Yet application development for such architectures remains a challenging and error-prone task, since the methods, languages, and tools being used for development are inappropriate to handle the complexity of the problem. This thesis introduces a novel approach that tackles the complexity challenge by raising the level of abstraction to system level and increasing the degree of automation. The approach is centered around the paradigms of object-orientation, platforms, and modeling. An application and all platforms being used for its design, implementation, and deployment are modeled with objects using UML and an action language. The application model is then transformed into an implementation, whereby the transformation is steered by the platform models. In this thesis, solutions for the relevant problems behind this approach are discussed. It is shown how UML can be used for complete and precise modeling of applications and platforms. Application development is done at the system level using a set of well-defined, orthogonal platform models. Thereby the core features of object-orientation (data abstraction, encapsulation, inheritance, and polymorphism) are fully supported. Novel algorithms are presented that allow for an automatic mapping of such application models to the target architecture; the problems of platform mapping, estimation of implementation characteristics, and synthesis of UML models are discussed. The thesis explores the utilization of platform models for the generation of highly optimized implementations in an automatic yet adaptable way. The approach is evaluated on a number of relevant applications. The execution of the generated implementations is supported by a run-time service. This service manages the hardware configurations and objects comprising the application. Moreover, it serves as a broker for hardware objects. The efficient management of configurations and objects at run time is discussed and optimized life cycles for these entities are proposed. Mechanisms are presented that make the approach portable among different physical hardware architectures. Further, this thesis presents UML profiles and example platforms that support system-level design. These extensions are embodied in a novel type of model compiler. The compiler is accompanied by an implementation of the run-time service. Both have been used to evaluate and improve the presented concepts and algorithms.
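As a rough illustration of the run-time service's brokering role, the sketch below hands out hardware objects and loads a reconfigurable region with the needed configuration only when it is not already present. The class names, the single-region model, and the load-count bookkeeping are assumptions for illustration; the thesis's actual life-cycle management is more elaborate.

```python
from dataclasses import dataclass, field

@dataclass
class ReconfigurableRegion:
    """One partially reconfigurable area of the device (illustrative model)."""
    loaded_bitstream: str | None = None
    load_count: int = 0

    def load(self, bitstream: str) -> None:
        self.loaded_bitstream = bitstream
        self.load_count += 1  # reconfigurations are expensive; track them

@dataclass
class HardwareObjectBroker:
    """Hands out hardware objects, reusing an already loaded configuration."""
    region: ReconfigurableRegion = field(default_factory=ReconfigurableRegion)
    bitstream_for: dict[str, str] = field(default_factory=dict)

    def request(self, object_type: str) -> str:
        needed = self.bitstream_for[object_type]
        if self.region.loaded_bitstream != needed:
            self.region.load(needed)  # reconfigure only on a miss
        return f"{object_type}@{needed}"

broker = HardwareObjectBroker(bitstream_for={"fir": "fir.bit", "fft": "fft.bit"})
print(broker.request("fir"), broker.request("fir"), broker.request("fft"))
print("reconfigurations:", broker.region.load_count)  # 2, not 3
```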
