101

A Tabu Search Approach to Multiple Sequence Alignment

Lightner, Carin Ann 05 August 2008 (has links)
Sequence alignment methods are used to detect and quantify similarities between DNA and protein sequences that may have evolved from a common ancestor. Effective sequence alignment methodologies also provide insight into the structure and function of a sequence and are the first step in constructing evolutionary trees. In this dissertation, we use a tabu search approach to multiple sequence alignment. Tabu search is a heuristic that uses adaptive memory features to guide the alignment of multiple sequences. The adaptive memory feature, a tabu list, helps the search process escape locally optimal solutions and explore the solution space efficiently. We develop two main tabu searches that progressively align sequences, with a randomly generated bifurcating tree guiding the alignment. The objective is to optimize the alignment score computed using either the sum-of-pairs or the parsimony scoring function; the parsimony scoring function provides insight into the homology between sequences in the alignment. We also explore iterative refinement techniques, such as a hidden Markov model and an intensification heuristic, to further improve the alignment. This approach provides improved alignments compared to several other methods.
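The tabu-list mechanism described above can be sketched generically. The following is a minimal illustration with a toy objective and neighborhood, not the dissertation's alignment moves:

```python
from collections import deque

def tabu_search(initial, neighbors, score, tabu_size=10, iters=200):
    """Generic tabu search: keep a short-term memory (tabu list) of
    recently visited solutions to avoid cycling back to local optima."""
    current = best = initial
    tabu = deque(maxlen=tabu_size)            # the adaptive memory feature
    tabu.append(initial)
    for _ in range(iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = max(candidates, key=score)  # best admissible neighbor
        tabu.append(current)
        if score(current) > score(best):
            best = current
    return best

# Toy demo: maximize f(x) = -(x - 7)^2 over the integers via +/-1 moves.
result = tabu_search(0, lambda x: [x - 1, x + 1], lambda x: -(x - 7) ** 2)
```

Because recently visited solutions are forbidden, the search can walk through and past a local optimum instead of oscillating around it.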
102

Tabu Search and Genetic Algorithm for Phylogeny Inference

Lin, Yu-Min 21 October 2008 (has links)
Phylogenetics is the study of evolutionary relations between different organisms, and phylogenetic trees are the representations of these relations. Researchers have been working on fast and systematic approaches to reconstruct phylogenetic trees from observed data for over 40 years. It has been shown that, given a certain criterion to evaluate each tree, finding the best-fitting phylogenetic tree among all possible trees is an NP-hard problem. In this study, we focus on topology searching techniques for maximum-parsimony and maximum-likelihood phylogeny inference, and we propose two search methods based on tabu search and genetic algorithms. We first explore the feasibility of using tabu search for finding maximum-parsimony trees; the performance of the proposed algorithm is evaluated in terms of efficiency and accuracy. We then propose a hybrid of the tabu search and the genetic algorithm. The experimental results indicate that the hybrid method can provide maximum-parsimony trees with a good level of accuracy and efficiency. The hybrid method is also implemented for finding maximum-likelihood trees. The experimental results show that the proposed hybrid method produces better maximum-likelihood trees than the default-setting dnaml program on average on the tested data sets. On a much larger data set, the hybrid method outperforms the default-setting dnaml program and performs as well as the dnaml program with the selected jumble option.
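As an illustration of the parsimony criterion used above, Fitch's small-parsimony algorithm scores a single character on a fixed tree. This is a standard textbook routine, not the thesis code:

```python
def fitch_score(tree, leaf_states):
    """Small-parsimony (Fitch) score of one character on a rooted binary
    tree given as nested tuples, e.g. (('a', 'b'), ('c', 'd'))."""
    def walk(node):
        if isinstance(node, str):                 # leaf: singleton state set
            return {leaf_states[node]}, 0
        left, right = node
        ls, lc = walk(left)
        rs, rc = walk(right)
        common = ls & rs
        if common:                                # agreement: no extra change
            return common, lc + rc
        return ls | rs, lc + rc + 1               # disagreement: one mutation
    return walk(tree)[1]

# Example: parsimony score of one aligned column over four taxa.
score = fitch_score((('t1', 't2'), ('t3', 't4')),
                    {'t1': 'A', 't2': 'A', 't3': 'C', 't4': 'A'})
```

A topology search (tabu, genetic, or hybrid) would call a scorer like this for every character on every candidate tree, which is why efficient evaluation matters.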
103

Long-Term Spatial Load Forecasting Using Human-Machine Co-construct Intelligence Framework

Hong, Tao 28 October 2008 (has links)
This thesis presents a formal study of the long-term spatial load forecasting problem: given the small-area-based electric load history of the service territory and current and future land use information, return the forecast load for the next 20 years. A hierarchical S-curve trending method is developed to conduct the basic forecast. Due to uncertainties in the electric load data, the results from the computerized program may conflict with the nature of the load growth. Sometimes the computerized program is not aware of local development because the land use data lacks such information. A human-machine co-construct intelligence framework is proposed to improve the robustness and reasonableness of the purely computerized load forecasting program. The proposed algorithm has been implemented and applied at several utility companies to forecast long-term electric load growth in their service territories, with satisfactory results.
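A minimal sketch of S-curve trending, assuming the saturation level is known (e.g. from land-use capacity) and using a simple linearized least-squares fit; the thesis's hierarchical method is more elaborate, and the numbers below are made up:

```python
import math

def fit_s_curve(years, loads, saturation):
    """Fit a logistic trend L(t) = K / (1 + a*exp(-b*t)) by linearizing
    ln(K/L - 1) = ln(a) - b*t and solving ordinary least squares.
    `saturation` (K) is assumed known in this toy version."""
    t = [y - years[0] for y in years]
    z = [math.log(saturation / l - 1.0) for l in loads]
    n = len(t)
    tbar = sum(t) / n
    zbar = sum(z) / n
    slope = sum((ti - tbar) * (zi - zbar) for ti, zi in zip(t, z)) / \
            sum((ti - tbar) ** 2 for ti in t)
    a = math.exp(zbar - slope * tbar)
    b = -slope
    return lambda year: saturation / (1 + a * math.exp(-b * (year - years[0])))

# Forecast 20 years ahead from a short synthetic history (MW).
f = fit_s_curve([2000, 2002, 2004, 2006], [10.0, 18.0, 30.0, 44.0], 80.0)
forecast = f(2026)
```

The human-in-the-loop step would then review forecasts like this one against local knowledge the data cannot capture.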
104

Soft Computing Approaches to Routing and Wavelength Assignment in Wavelength-Routed Optical Networks

Lea, Djuana 30 November 2004 (has links)
The routing and wavelength assignment (RWA) problem is essential for achieving efficient performance in wavelength-routed optical networks. For a network without wavelength conversion capabilities, the RWA problem consists of selecting an appropriate path and wavelength for each connection request while ensuring that paths that share common links are not assigned the same wavelength. The purpose of this research is to develop efficient adaptive methods for routing and wavelength assignment in wavelength-routed optical networks with dynamic traffic. The proposed methods utilize soft computing techniques including genetic algorithms, fuzzy control theory, simulated annealing, and tabu search. All four algorithms consider the current availability of network resources before making a routing decision. Simulations for each algorithm show that each method outperforms fixed and alternate routing strategies. The fuzzy-controlled algorithm achieved the lowest blocking rates and the shortest running times in most cases.
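A sketch of one common wavelength-assignment rule, first-fit under the wavelength-continuity constraint. This is illustrative only; the thesis develops adaptive soft-computing strategies rather than this fixed rule:

```python
def first_fit_rwa(path_links, used, num_wavelengths):
    """First-fit wavelength assignment under the wavelength-continuity
    constraint: the same wavelength must be free on every link of the path.
    `used` maps link -> set of occupied wavelength indices."""
    for w in range(num_wavelengths):
        if all(w not in used.get(link, set()) for link in path_links):
            for link in path_links:               # reserve it end to end
                used.setdefault(link, set()).add(w)
            return w
    return None                                    # blocked request

# Two-link path; wavelength 0 is busy on link ('a', 'b'), so 1 is chosen.
occupancy = {('a', 'b'): {0}}
chosen = first_fit_rwa([('a', 'b'), ('b', 'c')], occupancy, 4)
```

Adaptive methods improve on this by considering current resource availability when choosing both the path and the wavelength, which is what drives the lower blocking rates reported above.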
105

Ad-hoc Wireless Routing for Wildlife Tracking with Environmental Power Constraint

McClusky, Douglas 18 December 2006 (has links)
The purpose of this paper is to suggest an algorithm by which mica motes can organize themselves into a network to relay packets as quickly as possible under the energy constraints of environmental harvesting. This problem is part of a larger project to develop a means of monitoring red wolves using a mica mote network. The network has three parts: sensor motes attached to collars on the wolves; a base station or base stations that receive packets and display them in usable form for scientists; and relay motes that forward packets from the sensor motes to a base station. The proposed algorithm adapts Hohlt et al.'s Flexible Power Scheduling to work under Kansal et al.'s environmental harvesting power constraint. Employing this strategy changes energy consumption from a performance objective to a constraint, allowing me to add my own throughput-maximizing piece to the algorithm, based on dynamic programming and microeconomics. I also discuss the ongoing development of a simulation of this algorithm, designed to test its performance and to solve implementation problems.
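The harvesting constraint can be illustrated with a toy schedule that never lets stored energy go negative. This is a simplification for intuition, not the proposed algorithm, and the numbers are invented:

```python
def max_packets(harvest, capacity, cost_per_packet):
    """Greedy energy-neutral schedule: in each time slot, forward as many
    packets as the current battery allows, never letting stored energy
    go negative (a simplified harvesting constraint)."""
    battery = 0.0
    sent = []
    for e in harvest:
        battery = min(capacity, battery + e)      # harvest, clip at capacity
        n = int(battery // cost_per_packet)       # packets affordable now
        battery -= n * cost_per_packet
        sent.append(n)
    return sent

# Harvested energy per slot (J), 5 J battery, 2 J per forwarded packet.
schedule = max_packets([3.0, 1.0, 4.0], 5.0, 2.0)
```

Treating energy as a hard constraint like this, rather than a cost to minimize, is what frees the remaining design space for throughput maximization.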
106

ASSESSING THE IMPACT OF STRATEGIC SAFETY STOCK PLACEMENT IN A MULTI-ECHELON SUPPLY CHAIN

Bryksina, Elena Alexandrovna 16 December 2005 (has links)
The objective of this study is to develop prescriptions for strategically placing safety stocks in an arborescent supply chain in which there are moderate to severe risks of disruptions in supply. Our work builds on recently published work by Graves and Willems (2003), which demonstrates that a simple-to-compute, congestion-based adjustment to supply lead times, first developed by Ettl et al. (2000), can be embedded in a non-linear optimization problem to minimize total investment in safety stock across the entire supply chain. We are interested in investigating how the Graves and Willems (GW) model performs under uncertainty in supply. We first propose an adjustment to the model (Mod-GW) that considers two types of fulfillment times, a normal fulfillment time and a worst-possible fulfillment time, which allows us to account for supply uncertainty, or disruptions in supply. We evaluate the performance of GW and Mod-GW using Monte Carlo simulation and, drawing motivation from Timed-Petri Net analysis, develop an Informed Safety Stock Adjustment (ISSA) algorithm to compute the additional buffer stock levels necessary to improve downstream service performance to the target level. We find that the service performance of the Mod-GW solution is most sensitive to the probability of disruption at any node in the supply chain, requiring higher safety stock adjustments through ISSA as this probability increases. In particular, the relative value of the holding costs for components and finished goods, and the resulting impact on where safety stock is held in the network, is an important moderating factor in determining the level of service performance degradation of the Mod-GW solution as either the probability of disruption at node j or the ratio of the disrupted and normal lead times increases (i.e., as disruptions exert more impact on the network). The Informed Safety Stock Adjustment algorithm generally suggests a sufficient complementary amount to the safety stock.
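As a rough illustration of why disruption risk inflates buffers, the toy below blends normal and worst-case fulfillment times into a textbook safety-stock formula. It is a stand-in for intuition only, not the Mod-GW optimization, and the parameters are invented:

```python
import math

def safety_stock(z, sigma_demand, normal_lt, disrupted_lt, p_disrupt):
    """Illustrative safety stock with a disruption-adjusted lead time:
    blend the normal and worst-case fulfillment times by the disruption
    probability, then apply the standard z * sigma * sqrt(LT) buffer."""
    effective_lt = (1 - p_disrupt) * normal_lt + p_disrupt * disrupted_lt
    return z * sigma_demand * math.sqrt(effective_lt)

# 97.5% service (z ~ 1.96), sigma = 10 units/week, 2-week normal lead
# time, 8-week disrupted lead time, 10% chance of disruption.
buffer = safety_stock(1.96, 10.0, 2.0, 8.0, 0.1)
```

Even a 10% disruption probability raises the effective lead time from 2.0 to 2.6 weeks here, and the buffer grows with the square root of that inflation.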
107

Optimization in Job Shop Scheduling Using Alternative Routes

Davenport, Catherine Elizabeth 09 January 2003 (has links)
The ability of a production system to complete orders on time is a critical measure of customer service. While there is typically a preferred routing for a job through the processing machines, often an alternative route is available that can be used to avoid bottleneck operations and improve due-date performance. In this paper a heuristic approach is given to dynamically select routing alternatives for a set of jobs to be processed in a job shop. The approach is coupled with a job shop scheduling algorithm developed by Hodgson et al. (1998, 2000) to minimize the maximum job lateness (Lmax).
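A minimal sketch of dynamic route selection by bottleneck avoidance. This is an illustrative greedy rule, not Hodgson et al.'s algorithm, and the machine names and times are made up:

```python
def assign_routes(jobs):
    """Greedy route selection: for each job, pick the alternative route
    that keeps the most heavily loaded machine as lightly loaded as
    possible. Each route is a list of (machine, processing_time)."""
    load = {}
    choices = []
    for routes in jobs:
        def bottleneck(route):
            trial = dict(load)
            for m, p in route:
                trial[m] = trial.get(m, 0) + p
            return max(trial.values())
        best = min(range(len(routes)), key=lambda i: bottleneck(routes[i]))
        for m, p in routes[best]:
            load[m] = load.get(m, 0) + p
        choices.append(best)
    return choices, load

# Job 2 has two routes; the second avoids piling more work onto 'M1'.
picked, final_load = assign_routes([
    [[('M1', 5)], [('M2', 6)]],
    [[('M1', 4)], [('M2', 3)]],
])
```

Spreading load away from the bottleneck is the mechanism by which alternative routings improve due-date performance.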
108

Heuristic Methods for Gang-Rip Saw Arbor Design and Scheduling

AKSAKALLI, VURAL 14 November 1999 (has links)
(Under the direction of Dr. Yahya Fathi.)

This research considers the problem of designing and scheduling arbors for gang-rip saw systems. Such systems are typically used within the furniture manufacturing industry for processing lumber, where lumber boards are first ripped lengthwise into strips of different widths and then cut to the required lengths to be used in manufacturing. A saw with multiple cutting channels is used to perform this operation. This saw has fixed blades at specific positions on a rotating shaft which rip incoming lumber boards into the required finished widths. The pattern of cutting channels (i.e., the setting of the blades) along the saw shaft is referred to as an "arbor".

A typical instance of the problem consists of (1) a set of required finished widths and their corresponding demands, (2) a frequency distribution of lumber boards in the uncut stock, (3) a shaft length, and (4) a blade width. The objective is to design a set of (one or more) arbors and the corresponding quantity of lumber to run through each arbor, such that the total amount of waste generated is minimized while the demand is satisfied.

In this research, we focus on solving the problem using only one arbor. First, we discuss the computational complexity of the problem and propose a total enumeration procedure which can be used to solve relatively small instances. Then, we develop algorithms based on heuristic approaches such as local improvement procedures, simulated annealing, and genetic algorithms. Our computational experiments indicate that a local improvement procedure with two nested loops, performing local search with a different neighborhood structure within each loop, gives very high quality solutions within very short execution times.
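The total-enumeration step can be sketched for a single arbor. The widths, kerf, and shaft length below are invented example numbers, and demand satisfaction is ignored to keep the sketch short:

```python
from itertools import combinations_with_replacement

def best_arbor(widths, blade, shaft, max_channels=6):
    """Enumerate candidate arbors (multisets of finished widths separated
    by blade kerfs) that fit on the shaft, and return the one leaving the
    least unused shaft length. A toy version of the total-enumeration step."""
    best, best_waste = None, shaft + 1
    for k in range(1, max_channels + 1):
        for combo in combinations_with_replacement(widths, k):
            used = sum(combo) + blade * (k - 1)    # widths plus kerf losses
            if used <= shaft and shaft - used < best_waste:
                best, best_waste = combo, shaft - used
    return best, best_waste

# Finished widths 2.5" and 3.25", 0.25" kerf, 10" usable shaft.
arbor, waste = best_arbor([2.5, 3.25], 0.25, 10.0)
```

Enumeration like this blows up quickly with more widths and channels, which is exactly why the heuristic approaches above are needed for realistic instances.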
109

A WAVELET-BASED PROCEDURE FOR PROCESS FAULT DETECTION

Lada, Emily Kate 06 April 2000 (has links)
The objective of this research is to develop a procedure for detecting faults in a particular process by analyzing data generated by the process on a compressed wavelet scale. In order to compress the data, three different methods are compared for reducing the number of wavelet coefficients needed to obtain an accurate representation of the data. One method is ultimately chosen to compress all data used in this study. This method, which involves minimizing the relative reconstruction error, effectively balances model parsimony against reconstruction error. The compressed data from seven in-control runs is used to define a standard, or baseline, by which to compare new data sets. The compressed baseline process is defined by forming the union set of the top, or coarsest, coefficient positions selected by the relative reconstruction error method for each of the seven runs. In order to determine if a new, compressed data set differs significantly from the baseline, a variant of Hotelling's T²-statistic for two-sample problems is formulated. This statistic is tested by applying it to four induced-fault data sets, as well as to 21 in-control data sets. The results provide substantial evidence of the statistic's effectiveness in detecting process faults from compressed data. It is also shown that traditional bootstrapping methods cannot be implemented in this case.
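A sketch of the compression idea using the Haar wavelet and simple magnitude thresholding. Note the thesis selects coefficients by minimizing relative reconstruction error rather than by a fixed count, so this is an assumption-laden simplification:

```python
import math

def haar(signal):
    """One full Haar wavelet decomposition of a length-2^k signal."""
    coeffs = []
    approx = list(signal)
    while len(approx) > 1:
        avg = [(approx[i] + approx[i + 1]) / math.sqrt(2)
               for i in range(0, len(approx), 2)]
        det = [(approx[i] - approx[i + 1]) / math.sqrt(2)
               for i in range(0, len(approx), 2)]
        coeffs = det + coeffs          # coarser details go to the front
        approx = avg
    return approx + coeffs

def compress(signal, keep):
    """Zero out all but the `keep` largest-magnitude coefficients
    (the data-reduction idea, with a fixed count instead of the
    thesis's relative-reconstruction-error criterion)."""
    c = haar(signal)
    ranked = sorted(range(len(c)), key=lambda i: abs(c[i]), reverse=True)
    kept = set(ranked[:keep])
    return [c[i] if i in kept else 0.0 for i in range(len(c))]

# A step signal compresses to two coefficients: overall level and the jump.
coeffs = compress([4.0, 4.0, 4.0, 4.0, 8.0, 8.0, 8.0, 8.0], 2)
```

Monitoring statistics such as the T² variant are then computed on the retained coefficient positions rather than on the raw data.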
110

Simulation Optimization Using Soft Computing

Medaglia, Andres L. 25 January 2001 (has links)
To date, most of the research in simulation optimization has focused on single-response optimization over a continuous space of input parameters. However, the optimization of more complex systems does not fit this framework. Decision makers often face the problem of optimizing multiple performance measures of systems with both continuous and discrete input parameters. Previously acquired expert knowledge of the system is seldom incorporated into the simulation optimization engine. Furthermore, when the goals of the system design are stated in natural language or vague terms, current techniques are unable to deal with this situation. For these reasons, we define and study the fuzzy single response simulation optimization (FSO) and fuzzy multiple response simulation optimization (FMSO) problems.

The primary objective of this research is to develop an efficient and robust method for simulation optimization of complex systems with multiple vague goals. This method uses a fuzzy controller to incorporate existing knowledge to generate high quality approximate Pareto optimal solutions in a minimum number of simulation runs.

For comparison purposes, we also propose an evolutionary method for solving the FMSO problem. Extensive computational experiments on the design of a flow line manufacturing system (in terms of tandem queues with blocking) have been conducted. Both methods are able to generate high quality solutions in terms of Zitzler and Thiele's "dominated space" metric. Both methods are also able to generate an even sample of the Pareto front. However, the fuzzy controlled method is more efficient, requiring fewer simulation runs than the evolutionary method to achieve the same solution quality.

To accommodate the complexity of natural language, this research also provides a new Bezier curve-based mechanism to elicit knowledge and express complex vague concepts. To date, this is perhaps the most flexible and efficient mechanism for both automatic and interactive generation of membership functions for convex fuzzy sets.
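The Pareto-front bookkeeping that both methods rely on can be sketched as a simple non-dominance filter (all objectives minimized; illustrative only, with made-up response values):

```python
def pareto_front(points):
    """Filter a set of objective vectors (all to be minimized) down to
    the non-dominated ones, as needed when sampling the Pareto front of
    a multiple-response simulation optimization run."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, better in at least one
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Two responses to minimize, e.g. mean cycle time and work-in-process.
front = pareto_front([(1, 5), (2, 2), (3, 4), (5, 1), (4, 4)])
```

Metrics like the "dominated space" then measure how much of the objective space a front such as this one covers.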
