31

Variable Shaped Detector: A Negative Selection Algorithm

Ataser, Zafer 01 February 2013 (has links) (PDF)
Artificial Immune Systems (AIS) are a class of computational intelligence methods developed from the principles and processes of the biological immune system. AIS methods fall mainly into four categories according to the immune principles and processes they draw on: clonal selection, negative selection, immune network, and danger theory. The negative selection algorithm (NSA) is one of the major AIS models. NSA is a supervised learning algorithm that imitates the maturation of T cells in the thymus: detectors mimic the cells, and the maturation process is simulated to generate detectors. NSA then classifies given data as either normal (self) or anomalous (non-self). In this classification task, NSA methods can make two kinds of error: self data classified as anomalous, and non-self data classified as normal. In this thesis, a novel negative selection method, the variable shaped detector (V-shaped detector), is proposed to increase classification accuracy, in other words to decrease both kinds of classification error. The V-shaped detector introduces new approaches to defining self and representing detectors. It combines the Local Outlier Factor (LOF) with the k-th nearest neighbor (k-NN) distance to determine a different radius for each self sample, which makes it possible to model the self space from the self samples and their radii. In addition, cubic B-splines are proposed to generate variable shaped detectors; the spline representation is meaningful when fitted through edge points, so an Edge Detection (ED) algorithm is developed to find the edge points of the given self samples. The V-shaped detector was tested on different data sets and compared with a well-known one-class classification method, SVM, and with a similar, popular negative selection method, the NSA with variable-sized detectors termed V-detector. The experiments show that the proposed method generates reasonable and comparable results.
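As a rough illustration of the per-sample radius idea, the sketch below assigns each self sample a radius derived from its k-NN distance shrunk by its LOF score. The combination rule, the parameter k, and the function name are hypothetical; the abstract does not give the thesis's exact formula.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor, NearestNeighbors

def self_radii(X, k=5):
    # Distance from each self sample to its k-th nearest neighbour
    # (column 0 of the result is the point itself, hence k + 1).
    dists, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    kth_dist = dists[:, -1]
    # LOF score per sample: roughly 1 in dense regions, larger for outliers.
    lof = LocalOutlierFactor(n_neighbors=k).fit(X)
    scores = -lof.negative_outlier_factor_
    # Hypothetical combination rule: shrink the radius of outlying samples
    # so the modelled self space does not overreach into non-self regions.
    return kth_dist / scores

# The self space is then the union of balls centred at the samples with
# these radii; a point falling inside any ball is classified as self.
```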
32

A Pre-enactment Model For Measuring Process Quality

Guceglioglu, A. Selcuk 01 June 2006 (has links) (PDF)
Most process measurement studies are concerned with time- and cost-based models. Although quality is the other conventional aspect, there is no widely used model for measuring process quality in the literature. To provide complementary information about quality, a process quality measurement model was chosen to be developed in the scope of this thesis, and studies on process characteristics were surveyed. Moreover, by exploiting the similarities between processes and software, studies in software quality were investigated. In the light of this research, a model is built on the basis of the ISO/IEC 9126 Software Product Quality Model. Some of the quality attributes are redefined in the model according to process characteristics, and new attributes unique to processes are developed. A case study is performed and its results are discussed from the perspectives of applicability, understandability and suitability.
33

3D Face Reconstruction Using Stereo Vision

Dikmen, Mehmet 01 September 2006 (has links) (PDF)
3D face modeling is currently a popular area in computer graphics and computer vision. Many techniques have been introduced for this purpose, such as using one or more cameras, 3D scanners, and other systems of sophisticated hardware with related software, but the main goal is to find a good balance between visual realism and the cost of the system. In this thesis, the reconstruction of a 3D human face from a pair of stereo cameras is studied. Unlike many other systems, facial feature points are obtained automatically from two photographs with the help of a dot pattern projected on the subject's face. The projected pattern also proved to provide enough feature points to derive a rough 3D face. These points are then used to fit a generic face mesh for a more realistic model. To texture this 3D model, a single texture image is generated from the initial stereo photographs.
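For a rectified stereo pair, the 3D position of each projected dot follows from its disparity between the two images. The sketch below shows the standard pinhole triangulation under that assumption; the names and parameters are illustrative and make no claim about the thesis's calibration details.

```python
import numpy as np

def triangulate_dot(x_left, x_right, y, focal_px, baseline):
    """Back-project one matched dot from a rectified stereo pair.

    x_left, x_right: pixel x-coordinates of the same projected dot in the
    left and right images, measured from the principal point; y: its
    (shared) pixel y-coordinate; focal_px: focal length in pixels;
    baseline: distance between the two camera centres.
    """
    disparity = x_left - x_right          # larger for nearer points
    Z = focal_px * baseline / disparity   # depth along the optical axis
    X = x_left * Z / focal_px
    Y = y * Z / focal_px
    return np.array([X, Y, Z])

# Repeating this for every dot of the projected pattern yields the sparse
# 3D point cloud to which the generic face mesh is then fitted.
```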
34

Finite Element Simulation Of Crack Propagation For Steel Fiber Reinforced Concrete

Ozenc, Kaan 01 August 2009 (has links) (PDF)
Steel fibers, and fibers in general, are utilized in concrete to control tensile cracking and to increase toughness. In the literature, the effects of fiber geometry, mechanical properties, and volume on the properties of fiber reinforced concrete have often been investigated experimentally. Those experiments have shown that useful improvements in the mechanical behavior of brittle concrete are achieved by incorporating steel fibers. This study proposes a simulation platform to determine the influence of fibers on crack propagation and fracture behavior of fiber reinforced concrete. For this purpose, a finite element (FE) simulation tool is developed for the fracture process of fiber reinforced concrete beam specimens subjected to a flexural bending test. Within this context, the objective of the study is twofold. The first is to investigate the effects of finite element mesh size and element type on stress intensity factor (SIF) calculation through finite element analysis. The second is to simulate the fracture process of fiber reinforced concrete beam specimens. The material properties, obtained from the literature, and the numerical simulation procedure are explained. The effect of fibers on the SIF is included through unidirectional elements with nonlinear generalized force-deflection capability; the distribution and orientation of fibers and the possibility of anchorage failure are also added to the simulation. It was observed that, with the adopted simulation tool, the load-deflection relation obtained in experimental studies is predicted reasonably well.
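SIF values computed from an FE mesh are typically checked against a handbook expression. Below is a sketch using the standard ASTM E399 geometry function for a single-edge-notched beam in three-point bending with span-to-width ratio 4 — a textbook formula, not the thesis's own derivation.

```python
import math

def sif_three_point_bend(P, S, B, W, a):
    """Mode-I stress intensity factor K_I for an SE(B) specimen.

    P: applied load, S: support span (S = 4W assumed), B: thickness,
    W: beam depth, a: crack length. Consistent units (e.g. N and mm)
    give K_I in N/mm^(3/2); divide by 31.62 for MPa*sqrt(m).
    """
    x = a / W
    f = (3.0 * math.sqrt(x)
         * (1.99 - x * (1.0 - x) * (2.15 - 3.93 * x + 2.7 * x * x))
         / (2.0 * (1.0 + 2.0 * x) * (1.0 - x) ** 1.5))
    return P * S / (B * W ** 1.5) * f

# Sanity check: at a/W = 0.5 the geometry function evaluates to about 2.66,
# matching tabulated values, so a converged FE mesh should reproduce it.
```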
35

A Measurement Framework For Component Oriented Software Systems

Salman, Nael 01 November 2006 (has links) (PDF)
A measurement framework is presented for component oriented (CO) software systems. Fundamental concepts in component orientation are defined, the factors that influence CO systems' structural complexity are identified, and metrics quantifying and characterizing these factors are defined. A set of properties that a CO complexity metric must satisfy is also defined. The metrics are evaluated first against the set of properties defined in this thesis and then against the set of properties defined by Tian and Zelkowitz in [84]; the evaluation revealed that the metrics satisfy all properties in both sets. Empirical validation of the metrics is performed using data collected from graduate students' projects. Validation results revealed that CO complexity metrics can be used as predictors of development effort, design effort, integration effort (characterizing system integrability), correction effort (characterizing system maintainability), function point count (characterizing system functionality), and programmer productivity. An automated metrics collection tool is implemented and integrated with a dedicated CO modeling tool; it automatically collects complexity metrics from system models and performs prediction estimations accordingly.
36

Realizing The Specification And Execution Of Workflows Through The Event Calculus

Yilmaz, Huseyin 01 December 2006 (has links) (PDF)
Workflow management promises a solution to an age-old problem: controlling, monitoring, optimizing and supporting business processes. What is new about workflow management is the explicit representation of the business process logic, which allows for computerized support. In the light of this support, many researchers have developed different approaches to model new systems with different capabilities to solve this age-old problem. One such approach uses a logic-based methodology for the specification and execution of workflows. Here, the event calculus, a logic programming formalism for representing events and their effects especially in database applications, is used for this purpose. It is shown that the control flow graph of a workflow specification can be expressed as a set of logical formulas, and that the event calculus can specify the role of a workflow manager through a set of rules for the execution dependencies of activities. The workflow formalization constructed through the event calculus is realized using recent technologies; the resulting product, named EventFlow, includes administrative interfaces to manage the system and the workflow engine. The thesis describes the architecture and implementation details of EventFlow, an editor developed for the graphical representation of the control flow graph, and the technologies used in the implementation. An example application is built to demonstrate the usability and execution of the implemented system.
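A minimal sketch of the event-calculus idea behind such an engine: a fluent holds at time t if some earlier event initiated it and no intervening event terminated it. The rule tables and workflow vocabulary below are illustrative, not EventFlow's actual formalization.

```python
def holds_at(fluent, t, history, initiates, terminates):
    """history: list of (time, event) pairs; initiates/terminates map
    each event to the set of fluents it switches on/off."""
    state = False
    for time, event in sorted(history):
        if time >= t:
            break
        if fluent in initiates.get(event, set()):
            state = True
        if fluent in terminates.get(event, set()):
            state = False
    return state

# Illustrative execution dependency: approval is enabled only after the
# review activity finishes, and disabled again if the document is rejected.
initiates = {"finish_review": {"approval_enabled"}}
terminates = {"reject_document": {"approval_enabled"}}
history = [(1, "finish_review"), (5, "reject_document")]
assert holds_at("approval_enabled", 3, history, initiates, terminates)
assert not holds_at("approval_enabled", 6, history, initiates, terminates)
```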
37

A Clustering Method For The Problem Of Protein Subcellular Localization

Bezek, Perit 01 December 2006 (has links) (PDF)
In this study, the focus is on predicting the subcellular localization of a protein, since subcellular localization is helpful in understanding a protein's functions. The function of a protein may be estimated from its sequence; motifs, or conserved subsequences, are strong indicators of function. In a given sample set of protein sequences known to perform the same function, a certain subsequence or group of subsequences should be common; that is, the occurrence (frequency) of common subsequences should be high. Our idea is to find the common subsequences through clustering and to use these common groups (implicit motifs) to classify proteins. To calculate the distance between two subsequences, the traditional string edit distance is modified so that only replacement is allowed and the cost of replacement is related to an amino acid substitution matrix. Based on the modified string edit distance, spectral clustering embeds the subsequences into a transformed space in which the clustering problem is expected to become easier to solve. For a given protein sequence, the distribution of its subsequences over the clusters is the feature vector, which is subsequently fed to a classifier. The most important aspect of this approach is the use of spectral clustering based on the modified string edit distance.
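With only replacement allowed, the edit distance between equal-length subsequences reduces to a sum of per-position replacement costs. The sketch below derives those costs from a similarity matrix; the two-letter cost table is made up for illustration and stands in for a real amino acid matrix such as BLOSUM62.

```python
def replacement_distance(a, b, cost):
    """Edit distance with only replacements allowed, so both subsequences
    must have the same length; cost[(x, y)] is the price of replacing
    residue x with residue y."""
    if len(a) != len(b):
        raise ValueError("replacement-only distance needs equal lengths")
    return sum(cost[(x, y)] for x, y in zip(a, b) if x != y)

# Toy cost table, e.g. derived from a similarity matrix by
# cost(x, y) = max_similarity - similarity(x, y).
cost = {("A", "G"): 1.0, ("G", "A"): 1.0, ("A", "V"): 4.0, ("V", "A"): 4.0,
        ("G", "V"): 5.0, ("V", "G"): 5.0}
print(replacement_distance("AGA", "AVA"))  # 5.0: one G->V replacement
```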
38

A Java Toolbox For Wavelet Based Image Denoising

Tuncer, Guney 01 August 2005 (has links) (PDF)
Wavelet methods for image denoising have become widespread over the last decade. The effectiveness of this denoising scheme is influenced by many factors, chief among them the choice of wavelet, the determination of the threshold, and the selection of the transform level at which to threshold. One classical solution for threshold calculation is the Wiener filter, a linear estimator; another is VisuShrink, which applies a global threshold in the nonlinear setting. The purpose of this work is to develop a Java toolbox used to find the best denoising schemes for distinct image types, particularly Synthetic Aperture Radar (SAR) images. This is accomplished by comparing these basic methods with well-known data-adaptive thresholding methods such as SureShrink, BayesShrink, Generalized Cross Validation and Hypothesis Testing. Some non-wavelet denoising processes are also introduced: along with simple mean and median filters, the more statistically adaptive median, Lee, Kuan and Frost filtering techniques are also tested to complement the wavelet-based denoising scheme. All of these wavelet-based methods and some traditional methods are implemented in pure Java code using the plug-in concept of ImageJ, a popular image processing tool written in Java.
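As a concrete instance of the global thresholding compared in such a toolbox, the sketch below applies the VisuShrink universal threshold λ = σ√(2 ln N), with σ estimated from the finest diagonal detail coefficients. It uses the Python library PyWavelets rather than the thesis's Java/ImageJ code, purely for illustration.

```python
import numpy as np
import pywt

def visushrink_denoise(img, wavelet="db4", level=3):
    """img: 2D numpy array (grayscale image)."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Donoho's robust noise estimate from the finest diagonal subband.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(img.size))   # universal threshold
    # Soft-threshold every detail subband; keep the approximation intact.
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, lam, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)
```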
39

An Assessment And Analysis Tool For Statistical Process Control Of Software Processes

Kirbas, Serkan 01 February 2007 (has links) (PDF)
Statistical process control (SPC), which includes very powerful techniques used in other mature engineering disciplines for providing process control, is not used by many software organizations. In the software engineering domain, SPC is currently utilized only by organizations with high maturity levels according to process improvement models like CMM, ISO/IEC 15504 and CMMI. Guidelines and software tools to implement SPC techniques should be developed for effective use and dissemination of SPC, especially in low-maturity organizations. In this thesis, a software tool (SPC-AAT), developed to assess the suitability of software processes and metrics for SPC and to support the use of SPC tools, is presented. With SPC-AAT, we aim to ease and enhance the application of SPC, especially for emergent and low-maturity organizations. Control charts, histograms, bar charts and Pareto charts are the SPC tools supported for this purpose. We also explain the validation of the tool on two processes of a software organization in three case studies.
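For individual software-process measurements, control charts commonly use 3-sigma limits estimated from the average moving range. A minimal sketch of the standard individuals (XmR) chart formulas follows; it illustrates the textbook technique, not SPC-AAT's internals.

```python
import numpy as np

def xmr_limits(x):
    """Individuals (XmR) control chart limits for a series of process
    measurements, e.g. review effort per work product."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range
    sigma_hat = mr_bar / 1.128           # d2 constant for subgroups of 2
    center = x.mean()
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

lcl, cl, ucl = xmr_limits([4.2, 5.1, 3.8, 4.6, 5.0, 4.4])
# Points outside (lcl, ucl) signal assignable causes worth investigating.
```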
40

Configuring Electronic Relationship Among The IHE Actors

Gulderen, Ozgur 01 January 2007 (has links) (PDF)
Today, healthcare IT vendors develop interfaces for their products to comply with the IHE Profiles and test them with other vendors at annual one-week interoperability testing events called IHE Connectathons. Network configuration, including hostnames, IP addresses and port numbers for healthcare services and URLs for the various web services used in IHE profiles, is stored in online databases accessible from the IHE website. When a company registers for participation in the connectathon, a default configuration is created in the database, assigning default values to healthcare-service-specific parameters; the company is then expected to use the configuration parameters from that database during the connectathon. Outside the connectathons, vendors cannot test their products against other vendors'. In this thesis, research on an IHE Configuration Management Tool based on the IHE profiles and the ebXML ebBP, CPP and CPA specifications is presented. The intention behind developing such a tool is to automate the creation and exchange of configuration information between IHE Actors in a Virtual Connectathon Portal. The results of this research enable IHE product vendors to test their products in interactions with other vendors over the web, anytime and anywhere, through automatic business service interface configuration, and enable healthcare systems to engage with many trading partners at the same time automatically. The work presented in this thesis was realized as a part of the IST-027065 RIDE project funded by the European Commission.
