241

VS2DRT: Variably saturated two dimensional reactive transport modeling in the vadose zone

Haile, Sosina Shimeles, 19 March 2013
Contaminant transport in the vadose zone is a major concern, since the vadose zone is the main pathway for groundwater recharge. Understanding this process is crucial for preventing contamination and for protecting and rehabilitating groundwater resources. Reactive transport models are instrumental for such purposes, and numerous solute transport simulation programs exist for both groundwater and the vadose zone. Most of these models, however, are limited to simple linear, Langmuir, and Freundlich sorption models and first-order decay, and fail to simulate the more complex geochemical reactions common in the vadose zone, such as cation exchange, surface complexation, redox reactions, and biodegradation. It is therefore necessary to enhance the capabilities of solute transport models by incorporating well-tested hydrogeochemical models such as PHREEQC, so that they can closely approximate geochemical transport processes in the subsurface. In this PhD research, a new reactive transport model called VS2DRT was created by coupling the existing public-domain solute and heat transport models VS2DT and VS2DH with the hydrochemical model PHREEQC, using a non-iterative operator-splitting technique. VS2DRT was compiled with the MinGW compiler using tools such as autotools and automake. A graphical user interface was also created using Qt Creator and the Argus ONE numerical development tools. The new model was tested for one-dimensional conservative Cl transport, surface complexation, cation exchange, dissolution of calcite and gypsum, and coupled heat and solute transport, as well as for two-dimensional cation exchange cases. The results were compared with the VS2DT, VS2DH, HP1, and HP2 models and are in good agreement.
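The non-iterative operator-splitting scheme used for the coupling can be sketched in a few lines: within each time step, transport is advanced first and the resulting concentrations are then handed to the chemistry step, with no iteration between the two operators. The sketch below is purely illustrative and not taken from VS2DRT; a simple first-order decay stands in for the PHREEQC reaction step, and all names and parameter values are hypothetical.

```python
import numpy as np

def transport_step(c, v, D, dx, dt):
    """One explicit step of 1-D advection (upwind) plus dispersion (central)."""
    c_new = c.copy()
    c_new[1:-1] = (c[1:-1]
                   - v * dt / dx * (c[1:-1] - c[:-2])                  # advection
                   + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2]))  # dispersion
    return c_new

def chemistry_step(c, k, dt):
    """Stand-in for the geochemical solver (PHREEQC in VS2DRT):
    first-order decay, integrated exactly over dt."""
    return c * np.exp(-k * dt)

def simulate(n=50, steps=200, v=0.5, D=0.01, dx=0.1, dt=0.05, k=0.1):
    c = np.zeros(n)
    c[0] = 1.0                                 # constant-concentration inflow
    for _ in range(steps):
        c = transport_step(c, v, D, dx, dt)    # operator 1: transport
        c = chemistry_step(c, k, dt)           # operator 2: reactions
        c[0] = 1.0                             # re-apply boundary condition
    return c
```

Because the split is non-iterative, the splitting error grows with the time step, which is why such schemes rely on keeping the time step small.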
242

Proceedings of the 9th Workshop on Aspects, Components, and Patterns for Infrastructure Software (ACP4IS '10)

January 2010
Aspect-oriented programming, component models, and design patterns are modern, actively evolving techniques for improving the modularization of complex software. In particular, these techniques hold great promise for the development of "systems infrastructure" software, e.g., application servers, middleware, virtual machines, compilers, operating systems, and other software that provides general services for higher-level applications. The developers of infrastructure software face increasing demands from application programmers needing higher-level support for application development. Meeting these demands requires careful use of software modularization techniques, since infrastructural concerns are notoriously hard to modularize. Aspects, components, and patterns provide very different means of dealing with infrastructure software, but despite their differences they have much in common. For instance, component models try to free the developer from the need to deal directly with services like security or transactions. These are prime examples of crosscutting concerns, and modularizing such concerns is the main target of aspect-oriented languages. Similarly, design patterns like Visitor and Interceptor facilitate the clean modularization of otherwise tangled concerns. Building on the ACP4IS meetings at AOSD 2002-2009, this workshop aims to provide a highly interactive forum for researchers and developers to discuss the application of, and relationships between, aspects, components, and patterns within modern infrastructure software. The goal is to put aspects, components, and patterns into a common reference frame and to build connections between the software engineering and systems communities.
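As a minimal illustration of the crosscutting concerns mentioned above (an illustration of the general idea, not an example from the proceedings), an interceptor-style decorator can pull an auditing concern out of the business logic it would otherwise be tangled with:

```python
import functools
import time

AUDIT_LOG = []  # the auditing concern's own state, kept out of business code

def audited(func):
    """Interceptor-style decorator: the auditing concern lives here,
    not scattered through every business method."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        AUDIT_LOG.append((func.__name__, time.perf_counter() - start))
        return result
    return wrapper

@audited
def transfer(amount):
    # pure business logic; knows nothing about auditing
    return f"transferred {amount}"
```

Calling `transfer(100)` returns the business result while the interceptor records the call, the same separation that component models achieve for services like security or transactions.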
243

Preface

January 2010
244

Modell-basierte Verifikation von vernetzten mechatronischen Systemen (Model-based verification of networked mechatronic systems)

Hirsch, Martin, January 2008
Also: Paderborn, Univ., dissertation, 2008
245

Modell-basierte Verifikation von vernetzten mechatronischen Systemen (Model-based verification of networked mechatronic systems)

Hirsch, Martin. Unknown date
Paderborn, Universität, dissertation, 2008.
246

RAfEG: Reference system architecture and prototype implementation -- excerpt from the final report of the project "Referenzarchitektur für E-Government" (RAfEG) --

Kunis, Raphael, Rünger, Gudula, 7 December 2007
The goal of the RAfEG project was to develop a reference architecture for e-government providing the components needed to realise information and communication technology (ICT) systems for typical processes in the subordinate authorities of the state ministries of the interior. The RAfEG architecture is a holistic approach covering many essential aspects, from the formal description of the domain-specific relationships to the development of distributed software components for administrative business processes. Taking hardware-side prerequisites into account, the architecture defines the structure of the software components for administrative automation. RAfEG was designed as a spatially distributed, component-based software system. This required developing concepts for the efficient use of heterogeneous systems for interactive e-government applications. The architecture was prototypically implemented for plan approval and plan authorisation procedures, using the regional administrative authority (Regierungspräsidium) of Leipzig as an example. The project was characterised by the development of an end-to-end concept for optimal ICT support of administrative processes, leading from the modelling of the domain-specific relationships (domain concept) through the development-oriented, methodical mapping of the matters to be implemented (data-processing concept) to component-based software development (implementation concept). This concept resulted in a reference architecture for typical e-government processes. Besides the purely domain-specific, task-related aspects, security aspects as well as technical and organisational interfaces were examined in detail.
The consistent use of open-source software leads to a cost-efficient, flexible reference solution that, thanks to its component-based structure, can also be adapted very well to special requirements.
247

Automatic Generation of Trace Links in Model-driven Software Development

Grammel, Birgit, 17 February 2014
Traceability data provides knowledge of the dependencies and logical relations that exist among the artefacts created during software development. By reasoning over traceability data, conclusions can be drawn that increase the quality of software. The paradigm of Model-driven Software Engineering (MDSD) promotes the generation of software out of models, which are specified in different modelling languages. In subsequent model transformations, these models are used to generate programming code automatically. Traceability data for the artefacts involved in an MDSD process can be used to increase software quality by providing the necessary knowledge described above. Existing traceability solutions in MDSD generate traceability data from the model mapping performed during transformation execution. Yet these solutions still entail a wide range of open challenges. One challenge is that the collected traceability data does not adhere to a unified formal definition, which leads to poorly integrated traceability data and aggravates reasoning over it. Furthermore, these traceability solutions all depend on access to a transformation engine. Such access is not available in all MDSD settings, however, for instance with proprietary transformation engines or manually implemented transformations. In these cases the transformation engine cannot be instrumented to generate traceability data, resulting in a lack of such data. In this work we address these shortcomings. We propose a generic traceability framework for augmenting arbitrary transformation approaches with a traceability mechanism. To integrate traceability data from different transformation approaches, our approach features a methodology for identifying augmentation possibilities based on a design pattern.
The design pattern supplies the engineer with recommendations for designing the traceability mechanism and for modelling traceability data. Additionally, to provide a traceability mechanism for inaccessible transformation engines, we leverage parallel model matching to generate traceability data for arbitrary source and target models. This approach is based on a language-agnostic concept of three similarity measures for matching. To realise the similarity measures, we exploit metamodel matching techniques for graph-based model matching. Finally, we evaluate our approach on a set of transformations from an SAP business application and from the domain of MDSD.
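The general shape of such a traceability mechanism, a transformation that reports each rule application to a uniform trace recorder, can be sketched as follows; the class names, rule name, and mapping are purely illustrative and are not taken from the dissertation's framework:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class TraceLink:
    """A uniform trace record, independent of any particular transformation engine."""
    source: str   # id of the source model element
    target: str   # id of the generated artefact
    rule: str     # name of the transformation rule that produced the link

@dataclass
class TraceContext:
    links: list = field(default_factory=list)

    def record(self, source, target, rule):
        self.links.append(TraceLink(source, target, rule))

    def targets_of(self, source):
        return [link.target for link in self.links if link.source == source]

def class_to_table(model_classes, trace):
    """A toy model-to-model rule, augmented to report every mapping it performs."""
    tables = []
    for cls in model_classes:
        table = f"TBL_{cls.upper()}"
        tables.append(table)
        trace.record(cls, table, rule="Class2Table")
    return tables
```

Because the recorder's interface is independent of the transformation, the same `TraceContext` can collect links from very different transformation approaches, which is the integration property the framework aims at.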
248

Model Driven Development and Maintenance of Business Logic for Information Systems

Brückmann, Tobias, 17 December 2010
Since information systems are becoming more and more important in today's society, business firms, organizations, and individuals rely on them to manage their daily business and social activities. The dependency of possibly critical business processes on complex IT systems requires a strategy that supports IT departments in reducing the time needed to implement changed or new domain requirements from functional departments. In this context, software models help to manage system complexity and provide a tool for communication and documentation purposes. Moreover, software engineers use automated software model processing, such as code generation, to improve development and maintenance processes. Particularly in the context of web-based information systems, a number of model driven approaches have been developed. However, compared to the user interface layer and the persistency layer, the business logic layer lacks consistent approaches that provide a suitable architecture for consistent model driven development. To ameliorate this situation, we developed an architectural blueprint consisting of meta models, tools, and method support for the model driven development and maintenance of business logic, from analysis through system maintenance. This blueprint, which we call the Amabulo infrastructure, consists of five layers and provides concepts and tools for setting up and applying concrete infrastructures for model driven development projects. Modeling languages can be applied as needed. In this thesis we focus on the business logic layers of J2EE applications; however, the concrete code generation rules can easily be adapted for different target platforms. After providing a high-level overview of our Amabulo infrastructure, we describe its layers in detail. The Visual Model Layer is responsible for all visual modeling tasks. For this purpose, we discuss requirements for visual software models for business logic, analyze several visual modeling languages concerning their usefulness, and provide a UML profile for business logic models. The Abstract Model Layer provides an abstract view of the business logic model in the form of a domain specific model, which we call the Amabulo model. An Amabulo model is reduced to purely logical information concerning business logic aspects and focuses on the information that is relevant for code generation. For this purpose, an Amabulo model integrates model elements for process modeling, state modeling, and structural modeling. It is used as a common interface between visual modeling languages and code generators; visual models of the Visual Model Layer are automatically transformed into an Amabulo model. The Abstract System Layer provides a formal view of the system in the form of a Coloured Petri Net (CPN). A Coloured Petri Net representation of the modeled business logic is a formal structure that is independent of the actual business logic implementation. After an Amabulo model is automatically transformed into a CPN, it can be analyzed and simulated before any line of code is generated. The Code Generation Layer is responsible for code generation. To support the design and implementation of project-specific code generators, we discuss several aspects of code integration issues and provide object-oriented design approaches to tackle them. We then provide a conceptual mapping of Amabulo model elements onto architectural elements of a J2EE infrastructure. This mapping explicitly considers robustness features, which support the later manual integration of generated critical code artifacts and external systems. The Application Layer is the target layer of an Amabulo infrastructure and comprises the generated code artifacts. These artifacts are instances of a specific target platform specification, and they can be modified for integration purposes with development tools. Through the contributions of this thesis, we aim to provide an integrated set of solutions supporting an efficient model driven development and maintenance process for the business logic of information systems. We therefore provide a consistent infrastructure blueprint that covers modeling tasks, model analysis tasks, and code generation tasks. As a result, we see potential for reducing the development and maintenance effort for changed domain requirements while simultaneously guaranteeing robustness and maintainability even after several changes.
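The code generation step, mapping abstract model elements onto target-platform artefacts, can be illustrated with a toy template-based generator. The element names and template are hypothetical, and the output is simplified Java rather than a real J2EE artefact; the empty-state check hints at the kind of robustness feature such a mapping has to consider:

```python
STATE_ENUM_TEMPLATE = "public enum {entity}State {{ {states} }}"

def generate_state_enum(entity, states):
    """Map a state-model element of an abstract model onto a (simplified) Java artefact."""
    if not states:
        # robustness: refuse to emit an artefact from an incomplete model element
        raise ValueError(f"state model for {entity!r} defines no states")
    return STATE_ENUM_TEMPLATE.format(entity=entity, states=", ".join(states))
```

For example, a hypothetical `Order` element with states `NEW` and `PAID` would generate `public enum OrderState { NEW, PAID }`.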
249

Computing Quantiles in Markov Reward Models

Ummels, Michael, Baier, Christel, 10 July 2014
Probabilistic model checking mainly concentrates on techniques for reasoning about the probabilities of certain path properties or the expected values of certain random variables. For quantitative system analysis, however, another type of performance measure is also of interest, namely quantiles. A typical quantile query takes as input a lower probability bound p ∈ ]0,1] and a reachability property. The task is then to compute the minimal reward bound r such that, with probability at least p, the target set is reached before the accumulated reward exceeds r. Quantiles are well known from mathematical statistics, but to the best of our knowledge they have not been addressed by the model checking community so far. In this paper, we study the complexity of quantile queries for until properties in discrete-time finite-state Markov decision processes with nonnegative rewards on states. We show that qualitative quantile queries can be evaluated in polynomial time, and we present an exponential algorithm for the evaluation of quantitative quantile queries. For the special case of Markov chains, we show that quantitative quantile queries can be evaluated in pseudo-polynomial time.
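For the Markov chain case, a quantile query of the kind defined above can be evaluated by dynamic programming over (state, remaining budget) pairs, scanning reward bounds upward. The sketch below is illustrative rather than the paper's algorithm: it assumes positive integer rewards on all non-target states, so the recursion terminates.

```python
def reach_prob(P, rew, target, s, r, memo=None):
    """x_r(s): probability of reaching `target` from state s before the
    accumulated reward exceeds the budget r."""
    if memo is None:
        memo = {}
    if s in target:
        return 1.0
    if rew[s] > r:
        return 0.0          # leaving s already exceeds the budget
    if (s, r) not in memo:
        memo[(s, r)] = sum(p * reach_prob(P, rew, target, t, r - rew[s], memo)
                           for t, p in P[s].items())
    return memo[(s, r)]

def quantile(P, rew, target, s, p, r_max=10_000):
    """Minimal reward bound r with x_r(s) >= p. A plain linear scan;
    a real implementation would exploit the monotonicity of r -> x_r(s)."""
    for r in range(r_max + 1):
        if reach_prob(P, rew, target, s, r) >= p:
            return r
    return None
```

For a toy chain with P(a→goal) = P(a→b) = 0.5 and P(b→goal) = 1, and rewards rew(a) = 1 and rew(b) = 2, the 0.5-quantile from a is 1 (the direct step suffices half the time), while the 0.9-quantile is 3 (the detour via b must also fit within the budget).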
250

Motives for Participation in Open-Source Software Projects: A Survey among R Package Authors

Mair, Patrick, Hofmann, Eva, Gruber, Kathrin, Hatzinger, Reinhold, Zeileis, Achim, Hornik, Kurt, 04 1900
One of the cornerstones of the R system for statistical computing is the multitude of contributed packages, which make an extremely broad range of statistical techniques and other quantitative methods freely available. This study investigates which factors are the crucial determinants of the package authors' participation in the R project. For this purpose, a survey was conducted among R package authors, collecting data on different types of participation in the R project, three psychometric scales (hybrid forms of motivation, work design characteristics, and values), and various socio-demographic factors. These data are analyzed using item response theory and generalized linear models, showing that the most important determinants of participation are a hybrid form of motivation and the knowledge characteristics of the work design. Other factors are found to have less impact or to influence only specific aspects of participation. (authors' abstract) / Series: Research Report Series / Department of Statistics and Mathematics
