201

A Conformance And Interoperability Test Suite For Turkey

Sinaci, Ali Anil 01 June 2009 (has links) (PDF)
Conformance to standards and interoperability is a major challenge for today's applications in all domains. Several standards have been developed, and some are still under development, to address the various layers of the interoperability stack. Conformance and interoperability testing involves checking whether applications conform to these standards so that they can interoperate with other conformant systems. Only through testing can correct information exchange among applications be guaranteed. The National Health Information System (NHIS) of Turkey aims to provide a nation-wide infrastructure for sharing Electronic Health Records (EHRs). In order to guarantee interoperability, the Ministry of Health (MoH) of Turkey developed an Implementation/Integration/Interoperability Profile based on HL7 standards. TestBATN (Testing Business Process, Application, Transport and Network Layers) is a domain- and standards-independent set of tools which can be used to test all of the layers of the interoperability stack, namely the Communication Layer, the Document Content Layer and the Business Process Layer. In this thesis work, the requirements for conformance and interoperability testing of the NHIS are analyzed, a testing approach is designated, test cases for several NHIS services are developed and deployed, and a test execution control and monitoring environment within TestBATN is designed and implemented according to the identified testing requirements. The work presented in this thesis is part of the TestBATN system supported by TÜBİTAK TEYDEB Project No. 7070191 and by the Ministry of Health, Turkey.
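As a rough illustration of the document-content-layer conformance checking described above, the hedged sketch below validates a message against a simple profile of required fields and allowed code values. The profile structure, field names and the `check_conformance` helper are illustrative assumptions, not part of TestBATN or the NHIS specification.

```python
# Minimal sketch of a document-content conformance check: a "profile" lists
# required fields and allowed code values; a message conforms if every
# required field is present and every coded field uses an allowed value.
# All field names and the profile itself are illustrative assumptions.

PROFILE = {
    "required": ["patient_id", "visit_date", "diagnosis_code"],
    "allowed_values": {"diagnosis_code": {"A00", "B01", "J45"}},
}

def check_conformance(message: dict, profile: dict) -> list:
    """Return a list of conformance errors (an empty list means conformant)."""
    errors = []
    for field in profile["required"]:
        if field not in message:
            errors.append(f"missing required field: {field}")
    for field, allowed in profile["allowed_values"].items():
        if field in message and message[field] not in allowed:
            errors.append(f"invalid value for {field}: {message[field]!r}")
    return errors

if __name__ == "__main__":
    msg = {"patient_id": "12345", "diagnosis_code": "Z99"}
    print(check_conformance(msg, PROFILE))
    # ['missing required field: visit_date', "invalid value for diagnosis_code: 'Z99'"]
```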
202

An Ilp-based Concept Discovery System For Multi-relational Data Mining

Kavurucu, Yusuf 01 July 2009 (has links) (PDF)
Multi-relational data mining has become popular due to the limitations of propositional problem definitions in structured domains and the tendency to store data in relational databases. However, as patterns involve multiple relations, the search space of possible hypotheses becomes intractably complex. In order to cope with this problem, several relational knowledge discovery systems have been developed employing various search strategies, heuristics and language pattern limitations. In this thesis, Inductive Logic Programming (ILP) based concept discovery is studied and two systems based on a hybrid methodology employing ILP and APRIORI, namely Confidence-based Concept Discovery and Concept Rule Induction System, are proposed. In both systems, the main aim is to relax the strong declarative biases and user-defined specifications. Moreover, the new method works directly on relational databases. In addition, the traditional definition of confidence from the relational database perspective is modified to express the Closed World Assumption in first-order logic. A new confidence-based pruning method based on the improved definition is applied in the APRIORI lattice. Moreover, a new hypothesis evaluation criterion is used for expressing the quality of patterns in the search space. In addition, in the Concept Rule Induction System, the quality of the constructed rules is further improved by using an improved generalization method. Finally, a set of experiments is conducted on real-world problems to compare the performance of the proposed method with similar systems in terms of support and confidence.
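To make the role of support and confidence in such a concept discovery search concrete, here is a hedged sketch that scores one candidate rule over toy relational facts and applies a simple confidence-based pruning test, in the spirit of an APRIORI-style lattice search. The relation names, facts and threshold are illustrative assumptions, not the implementation of the proposed systems.

```python
# Hedged sketch: support/confidence of the candidate rule
#   daughter(X, Y) :- parent(Y, X), female(X)
# over toy relational data, with a confidence-based pruning test.
# Relations, facts and the threshold are illustrative assumptions.

parent = {("mary", "ann"), ("mary", "tom"), ("eve", "sue")}
female = {"ann", "sue", "mary", "eve"}
daughter = {("ann", "mary"), ("sue", "eve")}      # target concept instances

def covered_by_rule():
    """Instances (X, Y) covered by the body: parent(Y, X) AND female(X)."""
    return {(x, y) for (y, x) in parent if x in female}

def support_and_confidence(target, covered):
    true_positives = len(target & covered)
    support = true_positives / len(target) if target else 0.0
    confidence = true_positives / len(covered) if covered else 0.0
    return support, confidence

MIN_CONFIDENCE = 0.7   # assumed pruning threshold

if __name__ == "__main__":
    covered = covered_by_rule()
    sup, conf = support_and_confidence(daughter, covered)
    print(f"support={sup:.2f} confidence={conf:.2f}")
    if conf < MIN_CONFIDENCE:
        print("candidate falls below the threshold; a confidence-based pruning step would discard it")
    else:
        print("keep candidate; refine further in the lattice")
```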
203

A Comparison Of Predator Teams With Distinct Genetic Similarity Levels In Single Prey Hunting Problem

Yalcin, Cagri 01 August 2009 (has links) (PDF)
In the domain of complex control problems for agents, neuroevolution methods, i.e. the artificial evolution of neural networks, have repeatedly been shown to offer high-performance solutions that may not be anticipated by external controller design. Recent studies have shown that these methods can also be successfully applied to cooperative multi-agent systems to evolve the desired team behavior. For a given task which may benefit from both cooperation and behavioral specialization, the genetic diversity of the team members may have important effects on team performance. In this thesis, the single prey hunting problem is chosen as the case study, and the performance of evolved predator teams with distinct genetic similarity levels is systematically examined. For this purpose, three similarity levels, namely homogeneous, partially heterogeneous and heterogeneous, are adopted and analyzed in various problem-specific and algorithmic settings. These similarity levels differ from each other in the number of groups of identical agents in a single predator team, where two agents are identical if they have the same synaptic weight vector in their neural network controllers. The problem-specific conditions comprise three different fields of vision for the predators, whereas the algorithmic settings refer to varying numbers of individuals in the populations, as well as two different selection levels, namely team-level and group-level selection. Based on experimental results in a simulated grid environment, we show that different combinations of genetic similarity level, field of vision and algorithmic setting yield different performance results.
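The distinction between the three genetic similarity levels can be illustrated with a small hedged sketch of how genomes (synaptic weight vectors) might be assigned to the members of a predator team; the team size, group counts and genome length below are illustrative assumptions, not the settings used in the thesis.

```python
# Hedged sketch: mapping genomes (weight vectors) to a predator team for the
# three genetic similarity levels. Team size, group sizes and genome length
# are illustrative assumptions.
import random

TEAM_SIZE = 4
GENOME_LENGTH = 20          # number of synaptic weights per controller

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LENGTH)]

def build_team(similarity_level: str):
    """Return a list of TEAM_SIZE genomes, one per predator."""
    if similarity_level == "homogeneous":
        g = random_genome()
        return [g] * TEAM_SIZE                       # one group: all agents identical
    if similarity_level == "partially_heterogeneous":
        g1, g2 = random_genome(), random_genome()
        return [g1, g1, g2, g2]                      # two groups of identical agents
    if similarity_level == "heterogeneous":
        return [random_genome() for _ in range(TEAM_SIZE)]  # every agent distinct
    raise ValueError(similarity_level)

if __name__ == "__main__":
    for level in ("homogeneous", "partially_heterogeneous", "heterogeneous"):
        team = build_team(level)
        distinct = len({tuple(g) for g in team})
        print(f"{level}: {distinct} distinct controller(s) in a team of {TEAM_SIZE}")
```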
204

Combined Attacks On Block Ciphers

Oztop, Nese 01 August 2009 (has links) (PDF)
Cryptanalytic methods are very important tools for evaluating the security of block ciphers in an accurate and reliable way. Differential and linear attacks have been the most effective cryptanalysis methods since the early 1990s. However, as technology has developed and more secure ciphers have been designed, these fundamental methods have become less effective. In order to analyze such ciphers, new methods need to be introduced. One approach is to invent new techniques that are different from the existing ones. Another approach is to extend or combine known cryptanalytic methods to analyze the cipher in a different way. This thesis is a survey of attacks generated by combining existing techniques and of their applications to specific block ciphers. The attacks covered are the differential-linear, differential-bilinear, higher-order differential-linear, differential-nonlinear, square-nonlinear, impossible differential and boomerang type attacks.
205

Variability Modeling In Software Product Lines

Kasikci, Baris Can Cengiz 01 September 2009 (has links) (PDF)
Software product lines provide enhanced means for systematic reuse when constructing systems within a particular domain. In order to achieve this, systems in a product line are expected to have a significant amount of commonality. Variability is what distinguishes these systems from one another and is spread across various product line artifacts. This thesis focuses on modeling and managing product line variability. The concept of concerns is proposed as a means of variability modeling. Another proposal is the use of context-free grammars to represent product line variability and to guarantee that any application derived according to the variability framework thus defined is a valid one. This approach is evaluated for an example domain, in the light of novel evaluation criteria that are also introduced in the scope of this thesis.
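As a hedged illustration of using a context-free grammar to constrain valid products, the sketch below encodes a tiny feature grammar and enumerates the feature combinations it derives; a derived configuration is considered valid only if it belongs to that language. The grammar, feature names and helper functions are assumptions for illustration, not the grammar formalism defined in the thesis.

```python
# Hedged sketch: a tiny feature grammar for a product line. Each nonterminal
# maps to alternative right-hand sides; terminals are concrete features.
# Grammar and feature names are illustrative assumptions.

GRAMMAR = {
    "Product":  [["Editor", "Storage", "Export"]],
    "Editor":   [["basic_editor"], ["rich_editor"]],
    "Storage":  [["local_store"], ["cloud_store"]],
    "Export":   [[], ["pdf_export"]],            # optional feature
}

def derive(symbol):
    """Yield every feature tuple derivable from `symbol`."""
    if symbol not in GRAMMAR:                    # terminal feature
        yield (symbol,)
        return
    for rhs in GRAMMAR[symbol]:
        results = [()]
        for part in rhs:
            results = [r + d for r in results for d in derive(part)]
        yield from results

def is_valid_product(features):
    return tuple(features) in set(derive("Product"))

if __name__ == "__main__":
    print(is_valid_product(["rich_editor", "cloud_store", "pdf_export"]))  # True
    print(is_valid_product(["rich_editor", "pdf_export"]))                 # False: no Storage feature
```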
206

Feature Oriented Domain Specific Language For Dependency Injection In Dynamic Software Product Lines

Dayibas, Orcun 01 September 2009 (has links) (PDF)
Although Software Product Line Engineering (SPLE) defines many different processes at various abstraction levels, their common basis is the analysis of the commonality and variability of the product family. In this thesis, a new approach for configuring components, as the building blocks of the architecture, according to requirements is proposed. The main objective of this approach is to support the domain design and application design processes in the SPL context. Configuring the products is made a semi-automatic operation by defining a Domain Specific Language (DSL) built on top of the notions of domain and feature-component binding models. In order to accomplish this goal, the dependencies of the components are extracted from the software using the dependency injection method, and these dependencies are made definable in the CASE tools developed in this work.
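A hedged sketch of the feature-to-component binding idea, wired with constructor-style dependency injection, is given below; the component classes, feature names and binding table are illustrative assumptions rather than the DSL defined in the thesis.

```python
# Hedged sketch: selecting component implementations per feature and wiring
# them with constructor injection. Class names, features and the binding
# table are illustrative assumptions.

class LocalStore:
    def save(self, data): print("saved locally:", data)

class CloudStore:
    def save(self, data): print("saved to cloud:", data)

class ReportService:
    def __init__(self, store):            # dependency injected via constructor
        self.store = store
    def publish(self, report): self.store.save(report)

# Feature-component binding model: feature name -> concrete component class.
BINDINGS = {
    "local_storage": LocalStore,
    "cloud_storage": CloudStore,
}

def configure_product(selected_feature: str) -> ReportService:
    """Semi-automatic configuration: resolve the binding and inject it."""
    store_cls = BINDINGS[selected_feature]
    return ReportService(store_cls())

if __name__ == "__main__":
    product = configure_product("cloud_storage")
    product.publish("monthly report")      # saved to cloud: monthly report
```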
207

A Monolithic Approach To Automated Composition Of Semantic Web Services With The Event Calculus

Okutan, Cagla 01 September 2009 (has links) (PDF)
In this thesis, a web service composition and execution framework is presented for semantically annotated web services. A monolithic approach to the automated web service composition and execution problem is chosen, which provides some benefits by separating the composition and execution phases. An AI planning method using a logical formalism called the Event Calculus is chosen for the composition phase. This formalism allows one to generate a narrative of actions and temporal orderings for a given goal using abductive planning techniques. Functional properties of services, namely inputs, outputs, preconditions and effects (IOPE), are taken into consideration in the composition phase, and non-functional properties, namely quality of service (QoS) parameters, are used in selecting the most appropriate solution to be executed. The repository of OWL-S semantic Web services is translated into Event Calculus axioms, and the resulting plans found by the Abductive Event Calculus Planner are converted to graphs. These graphs can be ranked according to a score calculated from the defined quality of service parameters of the atomic services in the composition in order to determine the optimal solution. The selected graph is converted to an OWL-S file which is subsequently executed.
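To illustrate how QoS parameters might rank alternative composition plans, here is a hedged sketch that scores each candidate plan by aggregating the response time and reliability of its atomic services and selects the best one; the aggregation formula, weights and service values are illustrative assumptions, not those used in the thesis.

```python
# Hedged sketch: ranking candidate composition plans by aggregated QoS.
# Response times add along a plan, reliabilities multiply; the weights and
# service values below are illustrative assumptions.

SERVICES = {                       # name -> (response_time_ms, reliability)
    "geocode_A": (120, 0.99), "geocode_B": (80, 0.95),
    "weather_A": (200, 0.98), "weather_B": (150, 0.90),
}

PLANS = [                          # each plan is a sequence of atomic services
    ["geocode_A", "weather_A"],
    ["geocode_B", "weather_B"],
]

W_TIME, W_REL = 0.5, 0.5           # assumed weights for the two criteria

def score(plan):
    total_time = sum(SERVICES[s][0] for s in plan)
    reliability = 1.0
    for s in plan:
        reliability *= SERVICES[s][1]
    # Lower total time is better, higher reliability is better; time is crudely normalized.
    return W_REL * reliability - W_TIME * (total_time / 1000.0)

if __name__ == "__main__":
    for p in PLANS:
        print(p, round(score(p), 4))
    print("selected plan:", max(PLANS, key=score))
```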
208

Crossing: A Framework To Develop Knowledge-based Recommenders In Cross Domains

Azak, Mustafa 01 February 2010 (has links) (PDF)
Over the last decade, an excessive amount of information has become available on the web, and information filtering systems such as recommender systems have become one of the most important technologies for overcoming the "Information Overload" problem by providing personalized services to users. Several studies have been conducted to improve the quality of recommendations and provide maximum user satisfaction within a single domain based on domain-specific knowledge. However, the current infrastructures of recommender systems do not provide complete mechanisms to meet user needs across several domains, and recommender systems show poor performance in cross-domain item recommendations. In this thesis work, a dynamic framework is proposed which differs from previous works in that it focuses on the easy development of knowledge-based recommenders and provides an intensive cross-domain capability with the help of domain knowledge. The framework has a generic and flexible structure in which data models and user interfaces are generated based on ontologies. New recommendation domains can be integrated into the framework easily in order to improve recommendation diversity. Cross-domain recommendation is accomplished via an abstraction of domain features when direct matching of the domain features is not possible because the domains are not very close to each other.
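The cross-domain step via feature abstraction can be sketched roughly as follows: when items in two domains share no direct features, their features are lifted to a common abstract level and matched there. The domain features, abstraction mapping and similarity measure below are illustrative assumptions, not the ontology-based mechanism of the framework.

```python
# Hedged sketch: cross-domain matching through feature abstraction. When a
# movie and a book share no direct features, both feature sets are lifted to
# abstract concepts and compared there. All mappings are assumptions.

ABSTRACTION = {                     # concrete feature -> abstract concept
    "thriller": "suspenseful", "noir": "dark",
    "crime_novel": "suspenseful", "gothic": "dark", "romance": "light",
}

def abstract(features):
    return {ABSTRACTION[f] for f in features if f in ABSTRACTION}

def similarity(features_a, features_b):
    """Jaccard similarity, computed on abstract concepts when no direct match exists."""
    direct = set(features_a) & set(features_b)
    a, b = (set(features_a), set(features_b)) if direct else (abstract(features_a), abstract(features_b))
    return len(a & b) / len(a | b) if (a | b) else 0.0

if __name__ == "__main__":
    liked_movie = ["thriller", "noir"]              # user profile from the movie domain
    candidate_books = {"book1": ["crime_novel", "gothic"], "book2": ["romance"]}
    ranked = sorted(candidate_books, key=lambda b: similarity(liked_movie, candidate_books[b]), reverse=True)
    print(ranked)                                    # ['book1', 'book2']
```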
209

Quality Of Service Aware Dynamic Admission Control In Ieee 802.16j Non-transparent Relay Networks

Kilic, Eda 01 February 2010 (has links) (PDF)
Today, telecommunication is improving rapidly, and people are online anywhere, anytime. Due to the increasing demand for communication, wireless technologies are progressing quickly, trying to provide more services over a wide range. In order to address the mobility and connectivity requirements of users in wide areas, Worldwide Interoperability for Microwave Access (WiMAX) has been introduced as a fourth-generation telecommunication technology. WiMAX, which is also called a Metropolitan Area Network (MAN) technology, is based on the IEEE 802.16 standard, where a Base Station (BS) provides last-mile broadband wireless access to the end users, known as Mobile Stations (MSs). However, in places with tall constructions, the signal rate between the MS and the BS decreases, or the signal can even be lost completely due to shadow fading. In response to this issue, an intermediate node, namely the Relay Station, has recently been defined in the IEEE 802.16j standard for relaying, which provides both throughput enhancement and coverage extension. However, this update has introduced a new problem: call admission control in non-transparent relay networks that support coverage extension. In this thesis, a Quality of Service (QoS) aware dynamic admission control algorithm for IEEE 802.16j non-transparent relay networks is introduced. Our objectives are admitting more service flows, utilizing the bandwidth, giving each relay station (RS) individual control over call acceptance and rejection, and not degrading the quality of ongoing service flows in an RS due to the dense population of service flows in other RSs. The simulation results show that the proposed algorithm outperforms other existing call admission control algorithms. Moreover, this algorithm can be regarded as a pioneering call admission control algorithm for IEEE 802.16j non-transparent networks.
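A hedged sketch of a per-relay-station, QoS-aware admission decision is given below: a new service flow is admitted only if its requested bandwidth fits within the relay station's remaining capacity for its QoS class, which keeps flows at one RS from crowding out flows at another. The class names, capacity split and requests are illustrative assumptions, not the algorithm proposed in the thesis.

```python
# Hedged sketch: per-RS admission control with per-QoS-class bandwidth budgets.
# Class names, capacities and requests are illustrative assumptions.

class RelayStation:
    def __init__(self, name, capacity_kbps):
        self.name = name
        # Assumed static split of capacity between real-time and best-effort flows.
        self.budget = {"real_time": 0.6 * capacity_kbps, "best_effort": 0.4 * capacity_kbps}
        self.allocated = {"real_time": 0.0, "best_effort": 0.0}

    def admit(self, qos_class, requested_kbps):
        """Admit the flow only if it fits the remaining budget of its class."""
        remaining = self.budget[qos_class] - self.allocated[qos_class]
        if requested_kbps <= remaining:
            self.allocated[qos_class] += requested_kbps
            return True
        return False            # reject: protects ongoing flows in this RS

if __name__ == "__main__":
    rs = RelayStation("RS1", capacity_kbps=10_000)
    print(rs.admit("real_time", 4_000))    # True
    print(rs.admit("real_time", 3_000))    # False: only 2000 kbps left in this class
    print(rs.admit("best_effort", 3_000))  # True
```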
210

A Novel Report Generation Approach For Medical Applications: The Sisds Methodology And Its Applications

Kuru, Kaya 01 February 2010 (has links) (PDF)
In medicine, reliable data are available in only a few areas, and the necessary information on prognostic implications is generally missing. Despite the fact that a great amount of money has been invested to ease the process, an effective solution has yet to be found. Unfortunately, existing data collection approaches in medicine seem inadequate to provide accurate and high quality data, which is a prerequisite for building a robust and effective DDSS. In this thesis, many different medical reporting methodologies and systems which have been used up to now are evaluated; their strengths and deficiencies are revealed to shed light on how to set up an ideal medical reporting type. This thesis presents a new medical reporting method, namely the "Structured, Interactive, Standardized and Decision Supporting" (SISDS) method, that encompasses most of the favorable features of the existing medical reporting methods while removing most of their deficiencies, such as inefficiency and cognitive overload, as well as introducing and promising new advantages. The method enables professionals to produce multilingual medical reports much more efficiently than the existing approaches, in a novel way, by allowing free-text-like data entry in a structured form. The proposed method is shown to be more effective from many perspectives, such as facilitating complete and accurate data collection and providing opportunities to build a DDSS without tedious pre-processing and data preparation steps, mainly helping health care professionals practice better medicine.
