111

Domain-specific modeling and verification language EDOLA

Zhang, Hehua 19 December 2009 (has links) (PDF)
With the wide use of software in everyday applications, the correctness of software has become increasingly important. Formal verification is an important method for improving software correctness, but its modeling languages are formal languages based on mathematical logic, automata, or graph theory, which are hard to learn and poorly suited to domain description. This hinders the industrial application of formal verification. This dissertation investigates the design and practice of the domain modeling and verification language EDOLA, which aims to combine usability for domain description, reusability, and automatic verification. It proposes a three-level design method comprising a domain knowledge level, a common module level, and a verification support level. The main contributions are as follows: 1. At the domain knowledge level, methods for extracting and representing domain knowledge are proposed for both job-shop scheduling and PLC control software. Domain-specific operators, such as those of the job-shop scheduling problem and timed Petri nets, are defined for describing job-shop scheduling; operators for the scan cycle pattern, the complete environment pattern, and five kinds of verification requests are defined for describing the PLC domain. The formal semantics of the defined operators is presented, underpinning the EDOLA definition and its automatic verification. 2. At the common module level, a method for defining common operators is presented, taking real time as the example of common knowledge. Two basic and four advanced time operators are proposed, which let EDOLA describe real-time features easily and make the EDOLA design reusable across time-sensitive domains. 3. At the verification support level, a property-oriented abstraction strategy is presented, which reduces the state space explored during automatic verification. Encoding rules from EDOLA to first-order logic are then formulated, enabling the verification of models with infinite state spaces with the help of first-order automatic theorem provers. 4. A prototype PLC domain modeling and verification language, EDOLA-PLC, is developed and its tools are implemented. The tools provide an EDOLA-PLC editor and a compiler with syntax checking, semantics checking, and translation-based automatic verification. 5. A case study applying EDOLA-PLC to a dock fire-fighting control system is presented. It shows that EDOLA-PLC easily describes both the PLC domain knowledge and the properties to be verified, as well as the common real-time knowledge, and that the resulting models can be verified automatically. The results show that the abstraction strategy adopted at the verification support level of EDOLA-PLC improves the efficiency of automatic verification.
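The verification idea above, checking a safety property over all states a scan-cycle controller can reach, can be illustrated with a toy example. This is purely a sketch: the controller, its two Boolean variables, and the property are invented for illustration, and EDOLA's actual encoding into first-order logic is not reproduced here.

```python
from itertools import product

# Toy scan-cycle model: each cycle reads inputs, then computes outputs.
# State is (motor_on, alarm); inputs are (start, fault).
def scan_cycle(state, inputs):
    motor_on, alarm = state
    start, fault = inputs
    alarm = fault
    motor_on = (motor_on or start) and not alarm
    return (motor_on, alarm)

def reachable_states(initial):
    """Explicit-state exploration over all input valuations per cycle."""
    seen, frontier = {initial}, [initial]
    while frontier:
        s = frontier.pop()
        for inputs in product([False, True], repeat=2):
            t = scan_cycle(s, inputs)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

# Safety property: the motor is never running while the alarm is raised.
def holds(state):
    motor_on, alarm = state
    return not (motor_on and alarm)

states = reachable_states((False, False))
assert all(holds(s) for s in states)
```

Explicit enumeration like this only works for finite (and small) state spaces, which is why the dissertation's encoding into first-order logic, handled by automatic theorem provers, matters for infinite-state models.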
112

Developing and Testing Pervasive Computing Applications: A Tool-Based Methodology

Bruneau, Julien 16 May 2012 (has links) (PDF)
Despite recent progress, developing a pervasive computing application remains a challenge because of a lack of conceptual frameworks and supporting tools. This challenge involves handling heterogeneous communicating objects, overcoming the complexity of distributed systems technologies, defining an application's architecture, and encoding all of this in a program. Moreover, testing pervasive computing applications is problematic because it involves acquiring, testing, and interfacing a variety of software and hardware entities. This process can quickly become costly in money and time when the target environment involves many entities. This thesis proposes a tool-based methodology for developing and testing pervasive computing applications. Our methodology first provides the DiaSpec design language. This language makes it possible to define a taxonomy of entities specific to an application domain, thereby abstracting over their heterogeneity. It also includes a layer for defining an application's architecture. Our tool suite provides a compiler that, from DiaSpec descriptions, generates a programming framework guiding the implementation and testing phases. To support the testing phase, we propose a simulation approach and a tool integrated into our methodology: the DiaSim tool. Our approach uses the testing support generated by DiaSpec to test applications transparently in a simulated physical environment. The simulation of an application is rendered graphically in a 2D visualization tool. We have combined DiaSim with a domain-specific language for describing physical phenomena as differential equations, enabling realistic simulations. DiaSim has been used to simulate applications in a variety of application domains. Our simulation approach has also been applied to an avionics system, demonstrating its generality.
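The key idea of writing application code against a domain-specific entity taxonomy, so that simulated implementations can stand in for physical devices, can be sketched as follows. The device classes, their methods, and the thermostat logic are invented for illustration and are not DiaSpec's actual generated API.

```python
from abc import ABC, abstractmethod

# A tiny entity taxonomy in the spirit of DiaSpec: application code is
# written against abstract device classes, so a simulated implementation
# (as in DiaSim) can replace a physical one without any code changes.
class TemperatureSensor(ABC):
    @abstractmethod
    def read_celsius(self) -> float: ...

class Heater(ABC):
    @abstractmethod
    def set_on(self, on: bool) -> None: ...

class SimulatedSensor(TemperatureSensor):
    def __init__(self, value: float):
        self.value = value
    def read_celsius(self) -> float:
        return self.value

class SimulatedHeater(Heater):
    def __init__(self):
        self.on = False
    def set_on(self, on: bool) -> None:
        self.on = on

def thermostat(sensor: TemperatureSensor, heater: Heater, target: float):
    """Application logic, unaware of simulated vs. real devices."""
    heater.set_on(sensor.read_celsius() < target)

sensor, heater = SimulatedSensor(17.0), SimulatedHeater()
thermostat(sensor, heater, target=20.0)
assert heater.on  # below target, so heating is engaged
```

Because `thermostat` depends only on the abstract classes, swapping `SimulatedSensor` for a driver talking to real hardware requires no change to the application, which is what makes transparent testing in a simulated environment possible.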
113

Der Wortassoziationsversuch als wissensdiagnostisches Instrument im arbeitspsychologischen Kontext : eine Befundintegration zur Verfahrensvalidierung / Word associations as a knowledge elicitation instrument in the context of occupational psychology

Ceglarek, Petra January 2008 (has links)
Knowledge analyses are relevant to occupational psychology, since competent work performance presupposes the command of a solid base of knowledge. In occupational-psychology practice, knowledge diagnoses are used, for example, in knowledge management processes, for evaluating training measures, or for developing knowledge-based systems. The word association test, as a method for verbalizing domain-specific knowledge, can contribute to this. Subjects are presented with stimuli from a circumscribed domain of specialist knowledge, to which they are asked to list, in note form, all associations that come to mind. Under the assumption of a network-analogous representation, the more someone associates, the greater their knowledge. Since the psychometric quality of the word association test had so far been unresolved, its primary and secondary quality criteria were determined on the basis of a total of 17 field studies. The word association test proved able to elicit subjects' explicit, declarative domain knowledge and thus constitutes a useful knowledge-diagnostic instrument. Its reliability was demonstrated, providing an important precondition for assessing validity and sensitivity to change. Validity, tested against the external criteria of managing directors' appraisals and exam performance, yielded satisfactory coefficients and can likewise be regarded as established. In the sense of discriminant validation, it was also shown that the association technique indeed captures the construct of domain knowledge rather than general word fluency. Overall, the word association test thus proves to be a valid, reliable, objective, change-sensitive, economical method, accepted by subjects, and therefore useful for occupational-psychology practice.
/ Providing methods and instruments for eliciting domain-specific knowledge from (working) persons is of major relevance for occupational psychology, since basic knowledge is a precondition for competent work performance. In occupational practice, knowledge elicitation methods are used in organisational knowledge management processes, for training evaluations, or for developing knowledge-based systems. Free term entry (FTE), which helps to verbalise domain-specific knowledge, can contribute greatly in this context. The method involves presenting subjects with stimuli from a specific domain; the subjects then have to list, in note form, all associations that come to their minds. The more the subject associates, the greater his or her knowledge, assuming a network-analog representation. Since the psychometric quality of FTE tests had as yet been inconclusive, I identified primary and secondary quality criteria using a total of 17 field studies. I was able to show that FTE can elicit explicit, declarative domain-specific knowledge and thus is a useful tool for this purpose. Its reliability, an important precondition for validity and sensitivity, was proved. An assessment of validity on the basis of two external criteria (an appraisal of the subject's vocational expertise by the managing director as a performance measure, and the subject's exam performance as a measure of individual domain-specific knowledge) yields good coefficients. Assessment of discriminant validity shows that the FTE method captures the construct of domain-specific knowledge rather than general word fluency. Overall, the mean frequency of associations is a sensitive measure of both the extent of individual domain-specific knowledge and the extent of vocational expertise: the FTE method is a valid, reliable, objective, economical instrument accepted by the subjects, and is therefore useful for the practice of occupational psychology.
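The basic FTE scoring idea, counting a subject's distinct associations and validating the score against an external criterion, can be sketched numerically. The subjects, stimuli, and scores below are invented illustration data, not results from the 17 field studies.

```python
# Hypothetical data: each subject's free-term-entry associations for one
# stimulus, plus an external criterion (exam score). The FTE score is
# simply the number of distinct associations named.
associations = {
    "s1": ["valve", "pressure", "pump", "seal", "flow"],
    "s2": ["valve", "pump"],
    "s3": ["pressure", "flow", "pump", "gauge"],
}
exam_scores = {"s1": 88.0, "s2": 61.0, "s3": 79.0}

fte = {s: len(set(terms)) for s, terms in associations.items()}

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

subjects = sorted(fte)
r = pearson([fte[s] for s in subjects], [exam_scores[s] for s in subjects])
assert r > 0.99  # in this toy data, more associations track higher exam marks
```

A validity coefficient in a real study is exactly such a correlation between the FTE score and an external criterion, computed over far more subjects than this toy sample.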
114

A Domain-Specific Conceptual Query System

Shen, Xiuyun 02 August 2007 (has links)
This thesis presents the architecture and implementation of a query system resulting from a domain-specific conceptual data modeling and querying methodology. The query system is built for a high-level conceptual query language that supports dynamically user-defined domain-specific and application-specific functions. It is DBMS-independent and can be translated to SQL and OQL through a normal form. It has so far been implemented for the neuroscience domain, but can be applied to any other domain.
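The translation pipeline described above, conceptual query to normal form to SQL, can be sketched minimally. The shape of the conceptual query, the normal form, and the entity/attribute names here are all invented for illustration; the thesis's actual language and normal form are richer.

```python
# A minimal sketch of DBMS-independent query translation: a conceptual
# query is first reduced to a simple normal form (entity, projections,
# conditions), from which SQL text is generated. An OQL generator could
# consume the same normal form.
def to_normal_form(conceptual):
    return {
        "entity": conceptual["concept"],
        "project": conceptual.get("show", ["*"]),
        "where": [(a, op, v) for a, op, v in conceptual.get("filters", [])],
    }

def to_sql(nf):
    cols = ", ".join(nf["project"])
    sql = f"SELECT {cols} FROM {nf['entity']}"
    if nf["where"]:
        conds = " AND ".join(f"{a} {op} {v!r}" for a, op, v in nf["where"])
        sql += f" WHERE {conds}"
    return sql

q = {"concept": "Neuron", "show": ["name"],
     "filters": [("region", "=", "hippocampus")]}
assert to_sql(to_normal_form(q)) == \
    "SELECT name FROM Neuron WHERE region = 'hippocampus'"
```

Routing every backend through one normal form is what makes the system DBMS-independent: adding OQL output means adding one more generator over the same intermediate structure.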
115

A Distributed Approach to Crawl Domain Specific Hidden Web

Desai, Lovekeshkumar 03 August 2007 (has links)
A large amount of online information resides on the invisible web: web pages generated dynamically from databases and other data sources hidden from current crawlers, which retrieve content only from the publicly indexable Web. In particular, such crawlers ignore the tremendous amount of high-quality content "hidden" behind search forms, as well as pages in large searchable electronic databases that require authorization or prior registration. To extract data from the hidden web, it is necessary to find the search forms and fill them with appropriate information to retrieve the maximum amount of relevant information. To meet the complex challenges that arise when searching the hidden web, which require extensive analysis of both the search forms and the retrieved content, it becomes essential to design and implement a distributed web crawler that runs on a network of workstations. We describe the software architecture of this distributed and scalable system and present a number of novel techniques that went into its design and implementation to extract the maximum amount of relevant data from the hidden web while achieving high performance.
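The overall work distribution of such a crawler, a shared queue of discovered search forms drained by parallel workers, each of which fills a form and collects results, can be sketched as follows. The URLs are placeholders and `fill_form` merely simulates a form submission; a real crawler would issue HTTP requests and parse the returned pages.

```python
import queue
import threading

# Skeleton of a distributed hidden-web crawl: a shared queue of
# (form URL, query term) tasks is drained by worker threads. In a real
# deployment the workers would run on separate workstations.
def fill_form(form_url, term):
    return f"{form_url}?q={term}"  # stand-in for an actual HTTP submission

def worker(tasks, results, lock):
    while True:
        try:
            form_url, term = tasks.get_nowait()
        except queue.Empty:
            return
        page = fill_form(form_url, term)
        with lock:
            results.append(page)
        tasks.task_done()

tasks = queue.Queue()
for url in ["http://db.example/search", "http://lib.example/find"]:
    for term in ["thesis", "dissertation"]:
        tasks.put((url, term))

results, lock = [], threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(3)]
for t in threads: t.start()
for t in threads: t.join()
assert len(results) == 4  # every form/term pair was processed exactly once
```

The queue gives natural load balancing: fast workers simply pull more tasks, which is the property that lets the crawl scale across a network of workstations.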
116

Metamodeling For The HLA Federation Architectures

Topcu, Okan 01 December 2007 (has links) (PDF)
This study proposes a metamodel, named Federation Architecture Metamodel (FAMM), for describing the architecture of a High Level Architecture (HLA) compliant federation. The metamodel provides a domain specific language and a formal representation for the federation adopting Domain Specific Metamodeling approach to HLA-compliant federations. The metamodel supports the definitions of transformations both as source and as target. Specifically, it supports federate base code generation from a described federate behavior, and it supports transformations from a simulation conceptual model. A salient feature of FAMM is the behavioral description of federates based on live sequence charts (LSCs). It is formulated in metaGME, the meta-metamodel for the Generic Modeling Environment (GME). This thesis discusses specifically the following points: the approach to building the metamodel, metamodel extension from Message Sequence Chart (MSC) to LSC, support for model-based code generation, and action model and domain-specific data model integration. Lastly, this thesis presents, through a series of modeling case studies, the Federation Architecture Modeling Environment (FAME), which is a domain-specific model-building environment provided by GME once FAMM is invoked as the base paradigm.
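The core metamodeling idea, a model instance being checked for conformance against a metamodel, can be illustrated in miniature. The element kinds and containment rules below are a drastic simplification invented for illustration; FAMM itself is formulated in metaGME and is far richer.

```python
# An illustrative metamodel-conformance check: the metamodel lists which
# child element kinds each element kind may contain; a model instance
# (nested (kind, children) tuples) is validated against it.
METAMODEL = {
    "Federation": {"Federate"},
    "Federate": {"Behavior"},
    "Behavior": set(),  # leaf: e.g. an LSC-based behavioral description
}

def conforms(node, metamodel):
    kind, children = node
    if kind not in metamodel:
        return False
    return all(child[0] in metamodel[kind] and conforms(child, metamodel)
               for child in children)

model = ("Federation", [
    ("Federate", [("Behavior", [])]),
    ("Federate", [("Behavior", [])]),
])
assert conforms(model, METAMODEL)
assert not conforms(("Federation", [("Behavior", [])]), METAMODEL)
```

A model-building environment such as FAME enforces exactly this kind of rule interactively: elements that the metamodel forbids in a given container simply cannot be created there.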
117

Feature Oriented Domain Specific Language For Dependency Injection In Dynamic Software Product Lines

Dayibas, Orcun 01 September 2009 (has links) (PDF)
Although the common basis of Software Product Line (SPL) Engineering processes is the analysis of the commonality and variability of the product family, SPLE defines many different processes at different levels of abstraction. In this thesis, a new approach is proposed for configuring components, as the building blocks of the architecture, according to requirements. The main objective of this approach is to support the domain design and application design processes in the SPL context. Configuring the products is made a semi-automatic operation by defining a Domain Specific Language (DSL) built on the notions of a domain and a feature-component binding model. To accomplish this goal, the dependencies of the components are extracted from the software using the dependency injection method, and these dependencies are made definable in the CASE tools developed in this work.
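The dependency injection mechanism underlying this approach, reading a component's dependencies from its constructor and resolving them from configurable bindings, can be sketched as follows. The container, class names, and binding API are invented for illustration; they are not the thesis's actual tooling.

```python
import inspect

# A minimal constructor-based dependency injection container: a
# component's dependencies are read from its constructor's type
# annotations and resolved recursively from the registered bindings.
# Rebinding an interface is how a product variant selects a feature.
class Container:
    def __init__(self):
        self.bindings = {}
    def bind(self, iface, impl):
        self.bindings[iface] = impl
    def resolve(self, iface):
        impl = self.bindings.get(iface, iface)
        sig = inspect.signature(impl.__init__)
        deps = {name: self.resolve(p.annotation)
                for name, p in sig.parameters.items()
                if name != "self"
                and p.annotation is not inspect.Parameter.empty}
        return impl(**deps)

class Logger:
    def log(self, msg):
        return f"[log] {msg}"

class FileLogger(Logger):
    pass

class Service:
    def __init__(self, logger: Logger):
        self.logger = logger

c = Container()
c.bind(Logger, FileLogger)  # the product configuration picks this variant
svc = c.resolve(Service)
assert isinstance(svc.logger, FileLogger)
```

Because the binding table is the only thing that changes between products, a DSL over such bindings is enough to make product configuration semi-automatic, which is the thesis's central point.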
118

Comparison Of Domain-Independent And Domain-Specific Location Predictors With Campus-Wide Wi-Fi Mobility Data

Karakoc, Mucahit 01 September 2010 (has links) (PDF)
In mobile computing systems, predicting the next location of a mobile wireless user has gained interest over the past decade. Location prediction has a wide range of application areas such as network load balancing, advertising, and web page prefetching. In the literature, many location predictors exist, divided into two main classes: domain-independent and domain-specific. Song et al. compare the prediction accuracy of domain-independent predictors from four major families, namely Markov-based, compression-based, PPM, and SPM predictors, on Dartmouth's campus-wide Wi-Fi mobility data, finding the low-order Markov predictors to be the best. In another work, Bayir et al. propose a domain-specific location predictor (LPMP) as an application of a framework for discovering mobile phone users' profiles. In this thesis, we evaluate LPMP and the best Markov predictor on Dartmouth's campus-wide Wi-Fi mobility data in terms of accuracy. We also propose a simple method that slightly improves the accuracy of the location prediction part of LPMP. Our results show that, overall, the best Markov predictor is more accurate than LPMP. Interestingly, however, LPMP yields more accurate results than the best Markov predictor for users with low prediction accuracy.
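A low-order Markov predictor of the kind the comparison builds on can be sketched in a few lines: it predicts the next access point as the most frequent successor of the current length-k context in the user's movement history. The location names and history below are invented illustration data.

```python
from collections import Counter, defaultdict

# An order-2 Markov next-location predictor: count, for every length-2
# context in the history, how often each location follows it, then
# predict the most frequent successor of the current context.
def train(history, order=2):
    model = defaultdict(Counter)
    for i in range(len(history) - order):
        context = tuple(history[i:i + order])
        model[context][history[i + order]] += 1
    return model

def predict(model, context):
    counts = model.get(tuple(context))
    return counts.most_common(1)[0][0] if counts else None

history = ["dorm", "lib", "cafe", "dorm", "lib", "cafe",
           "dorm", "lib", "gym"]
model = train(history)
assert predict(model, ["dorm", "lib"]) == "cafe"  # seen twice vs. gym once
assert predict(model, ["gym", "dorm"]) is None    # unseen context
```

The `None` case for unseen contexts is exactly where such predictors fall back (e.g. to a lower order), and the users for whom contexts are rarely repeated are the low-accuracy users on whom, per the thesis, the domain-specific LPMP does comparatively better.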
119

Transforming Mission Space Models To Executable Simulation Models

Ozhan, Gurkan 01 September 2011 (has links) (PDF)
This thesis presents a two-step automatic transformation: from Field Artillery Mission Space Conceptual Models (ACMs) to High Level Architecture (HLA) Federation Architecture Models (FAMs), and from FAMs to executable distributed simulation code. The approach followed in this thesis adheres to the Model-Driven Engineering (MDE) philosophy. Both ACMs and FAMs are formally defined, conforming to their respective metamodels, ACMM and FAMM. ACMM comprises a behavioral component, based on Live Sequence Charts (LSCs), and a data component, based on UML class diagrams. Using ACMM, the Adjustment Followed by Fire For Effect (AdjFFE) mission, which serves as the source model for the model transformation case study, is constructed. The ACM-to-FAM transformation, defined over metamodel-level graph patterns, is carried out with the Graph Rewriting and Transformation (GReAT) tool. Code generation from a FAM is accomplished by a model interpreter that produces Java/AspectJ code; the resulting code can then be executed on an HLA Run-Time Infrastructure (RTI). Bringing a fully fledged transformation approach to conceptual modeling is a distinguishing feature of this thesis. The thesis also aims to bring chart notations to the attention of the mission space modeling community for describing military tasks, particularly their communication aspects. With the experience gained, a set of guidelines is offered for a domain-independent transformer from any metamodel-based conceptual model to a FAM.
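The second step, a model interpreter walking a model and emitting source text, can be illustrated in miniature. The model shape and the emitted skeleton are invented for illustration; the thesis's interpreter consumes full FAMs and generates Java/AspectJ rather than the stub text below.

```python
# A toy model-to-text interpreter in the spirit of the FAM-to-code step:
# it walks a (much simplified) federate model and emits a class skeleton
# with one stub method per published and subscribed interaction.
def generate(federate):
    lines = [f"public class {federate['name']} {{"]
    for msg in federate["publishes"]:
        lines.append(f"    void send{msg}() {{ /* publish {msg} */ }}")
    for msg in federate["subscribes"]:
        lines.append(f"    void on{msg}() {{ /* reflect {msg} */ }}")
    lines.append("}")
    return "\n".join(lines)

model = {"name": "Battery",
         "publishes": ["FireReport"],
         "subscribes": ["AdjustOrder"]}
code = generate(model)
assert "public class Battery {" in code
assert "void sendFireReport()" in code
assert "void onAdjustOrder()" in code
```

Keeping generation as an interpretation over the model, rather than hand-written glue code, is what lets the same FAM drive regeneration whenever the mission model changes upstream.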
120

Individualized Health Related Quality of Life Measures: their use in children and their psychometric properties

Ishaque, Sana Unknown Date
No description available.
