
TOP-K AND SKYLINE QUERY PROCESSING OVER RELATIONAL DATABASE

Samara, Rafat January 2012
Top-k and skyline queries are long-studied topics in the database and information retrieval communities, and they are two popular operations for preference retrieval. A top-k query returns a subset of the most relevant answers instead of all answers; efficient top-k processing retrieves the k objects with the highest overall scores. In this paper, algorithms used for efficient top-k processing in different scenarios are presented, together with a framework, based on existing algorithms and incorporating cost-based optimization, that works across these scenarios. This framework applies when the user can specify a ranking function, and a real-life scenario is worked through it step by step. A skyline query returns the set of points that are not dominated by any other point in the given dataset (a record x dominates another record y if x is as good as y in all attributes and strictly better in at least one attribute). The paper introduces algorithms for evaluating skyline queries and discusses one of their problems, the so-called curse of dimensionality. A new strategy based on existing skyline algorithms, skyline frequency, and a binary-tree strategy is presented as a solution to this problem; it applies when the user cannot specify a ranking function, and a real-life scenario applying it step by step is also presented. Finally, the advantages of top-k queries are carried over to skyline queries in order to retrieve results quickly and efficiently.
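The dominance definition quoted in the abstract translates directly into code. A minimal sketch (not from the thesis; the tuple representation and the "smaller is better" convention are assumptions) of top-k selection and a naive block-nested-loop skyline:

```python
import heapq

def top_k(points, k, score):
    """Top-k: the k objects with the highest overall score."""
    return heapq.nlargest(k, points, key=score)

def dominates(x, y):
    """x dominates y: x is as good as y in all attributes and strictly
    better in at least one (here, smaller values are better)."""
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def skyline(points):
    """Naive block-nested-loop skyline: keep every point that no other
    point dominates. O(n^2) comparisons; the algorithms surveyed in the
    thesis aim to do better."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

For hotels described by (price, distance) tuples, `skyline([(50, 8), (60, 2), (80, 1), (55, 9), (70, 3)])` keeps the three undominated trade-offs `(50, 8)`, `(60, 2)`, and `(80, 1)`.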

Les autres Métis : the English Métis of the Prince Albert settlement 1862-1886

Code, Paget James 14 January 2008
In the mid-nineteenth century, Métis society re-established itself west of Red River in the Saskatchewan country. This thesis tells the long-overlooked story of the English Métis of the Prince Albert Settlement, beginning with James Isbister's initial farm in 1862 and the wave of Métis who followed him west in search of a better life. Questions of identity, politics, and religion are addressed to place the English Métis in the historical context of the Métis nation and of the Canadian state's institutional expansion onto the western prairies. The place of the English Métis vis-à-vis their French, First Nations, and Euro-Canadian neighbours is examined, as are their attempts to secure a land base and a continued collective identity under pressure from hostile state and economic forces. Their importance in the events of the period, which would have long-lasting national and local significance, is also examined. A survey of the community and the changes it went through is given, from the initial settlement period to the dissolution of the English Métis as a recognizable collective force following Louis Riel's uprising.

Inventory Routing Investigations

Song, Jin-Hwa 08 July 2004
The elimination of distribution inefficiencies that occur due to the timing of customers' orders is an important reason for companies to introduce vendor-managed inventory programs. By managing their customers' inventories, suppliers may be able to reduce demand variability and therefore distribution costs. We develop technology to measure the effectiveness of distribution strategies. First, we develop a methodology that allows the computation of tight lower bounds on the total mileage required to satisfy customer demand over a period of time. As a result, companies will be able to gain insight into the effectiveness of their distribution strategy. This technology can also be used to suggest desirable delivery patterns and to analyze tactical and strategic decisions. Second, we study the inventory routing problem with continuous moves (IRP-CM). The typical inventory routing problem deals with the repeated distribution of a single product, from a single facility with unlimited supply, to a set of customers that can all be reached with out-and-back trips. Unfortunately, this is not always the reality. We introduce the IRP-CM to study two important real-life complexities: limited product availability at facilities, and customers that cannot be served using out-and-back tours. We need to design delivery tours spanning several days, covering huge geographic areas, and involving product pickups at different facilities. We develop a heuristic and an optimization algorithm to construct distribution plans. The heuristic is an innovative randomized greedy algorithm that includes linear-programming-based postprocessing. To solve the IRP-CM to optimality, we give a time-discretized integer programming model and develop a branch-and-cut algorithm. As instances of time-discretized models tend to be large, we discuss several possibilities for reducing the problem size. We introduce a set of valid inequalities, called delivery cover inequalities, to tighten the bounds given by the LP relaxation of the time-discretized model, and we introduce branching schemes exploiting the underlying structure of the IRP-CM. An extensive computational study demonstrates the effectiveness of the optimization algorithm. Finally, we present an integrated approach using heuristics and optimization algorithms that provides effective and efficient technology for solving inventory problems with continuous moves.

Computer vision and machine learning methods for the analysis of brain and cardiac imagery

Mohan, Vandana 06 December 2010
Medical imagery is evolving towards ever higher resolution and throughput. The increasing volume of data and the use of multiple, often novel, imaging modalities necessitate mathematical and computational techniques for quicker, more accurate, and more robust analysis of medical imagery. The fields of computer vision and machine learning provide a rich set of techniques useful in medical image analysis, in tasks ranging from segmentation to classification and population analysis, notably by integrating the qualitative knowledge of experts in anatomy and in the pathologies of various disorders, and making it applicable to the analysis of medical imagery going forward. The objective of the proposed research is to explore various computer vision and machine learning methods with a view to improving the analysis of multiple modalities of brain and cardiac imagery, towards the clinical goals of studying schizophrenia, brain tumors (meningiomas and gliomas in particular), and cardiovascular disorders. In the first project, a framework is proposed for the segmentation of tubular, branched anatomical structures. The framework uses a tubular surface model, which yields computational advantages, and further incorporates a novel automatic branch-detection algorithm; it is successfully applied to the segmentation of neural fiber bundles and blood vessels. In the second project, a novel population analysis framework is built using the shape model proposed in the first project and is applied to the analysis of neural fiber bundles towards the detection and understanding of schizophrenia. In the third and final project, the use of mass spectrometry imaging for the analysis of brain tumors is motivated on two fronts: the offline classification analysis of the data, and the end application of intraoperative detection of tumor boundaries. SVMs are applied to classify gliomas into one of four subtypes, towards building appropriate treatment plans, and multiple statistical measures are studied with a view to feature extraction (or biomarker detection). The problem of intraoperative tumor boundary detection is formulated as the detection of local minima of the spatial map of tumor cell concentration, which in turn is modeled as a function of the mass spectra via regression techniques.
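The abstract names SVMs for the glioma subtype classification but gives no implementation details. As an illustrative stand-in only (not the thesis code: the features, labels, and training scheme are all assumptions), here is a pure-Python Pegasos-style stochastic subgradient trainer for a binary linear SVM; the four-subtype setting described above would wrap something like this in a one-vs-rest scheme over mass-spectrum feature vectors:

```python
import random

def train_linear_svm(data, lam=0.01, epochs=200, seed=0):
    """Pegasos-style training of a binary linear SVM.

    data: list of (features, label) pairs with label in {-1, +1}.
    Minimizes the regularized hinge loss by stochastic subgradient steps
    with the classic 1/(lam*t) step size."""
    rng = random.Random(seed)
    dim = len(data[0][0])
    w, b, t = [0.0] * dim, 0.0, 0
    for _ in range(epochs):
        for x, y in rng.sample(data, len(data)):  # shuffled pass
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            w = [(1.0 - eta * lam) * wi for wi in w]  # regularization shrink
            if margin < 1:  # hinge-loss subgradient step on violations
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    """Sign of the learned linear decision function."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

A one-vs-rest multiclass wrapper would train one such classifier per subtype and pick the class with the largest decision value.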

Engineering nanomaterials with enhanced functionality

Li, Shanghua January 2006
This thesis deals with the engineering of novel nanomaterials, particularly nanocomposites and nanostructured surfaces with enhanced functionalities. The study comprises two parts: in the first, an in situ sol-gel polymerization approach is used for the synthesis of a polymer-inorganic hybrid material, and its exceptional transparent UV-shielding effect is investigated; in the second, an electrodeposition process is adapted to engineer surfaces, and the boiling performance of the fabricated nanostructured surfaces is evaluated.

In the first part of the work, polymer-inorganic hybrid materials composed of poly(methyl methacrylate) (PMMA) and zinc compounds were prepared by in situ sol-gel transition polymerization of a zinc complex in a PMMA matrix. The immiscibility of the heterophase of solid organic and inorganic constituents was largely resolved by the in situ sol-gel transition polymerization of ZnO nanofillers within PMMA in the presence of a dual-functional agent, monoethanolamine, which provided strong secondary interfacial interactions for both complexing and crosslinking the constituents.

In the second part of the work, nanoengineering of the surface of copper plates was performed in order to enhance the boiling heat transfer coefficient. Micro-porous surfaces with a dendritic network of copper nanoparticles were obtained by electrodeposition with dynamic templates. To further alter the grain size of the dendritic branches, the nanostructured surfaces underwent a high-temperature annealing treatment.

Comprehensive characterization of the polymer-inorganic hybrid materials and nanoengineered surfaces was undertaken. XRD, 1H NMR, FT-IR, TGA, DSC, UV-Vis, ED, SEM, TEM, and HRTEM were used to determine basic physical properties, and pool boiling tests were performed to evaluate the boiling performance of the electrodeposited nanostructured micro-porous surfaces.

The homogeneous PZHM exhibited enhanced UV-shielding effects over the entire UV range even at a very low ZnO content of 0.02 wt%. Moreover, the relationship between band gap and particle size of the ZnO incorporated by the sol-gel process was in good agreement with the values calculated from the effective mass model. The fabricated enhanced surface showed excellent performance in nucleate boiling: at a heat flux of 1 W/cm2, the heat transfer coefficient is enhanced more than 15-fold compared to a plain reference surface. A model is presented to explain the enhancement based on the structural characteristics.

Διαστηματική ανάλυση και ολική βελτιστοποίηση / Interval analysis and global optimization

Σωτηρόπουλος, Δημήτριος 24 June 2007

BC digitization survey results

Hives, Chris 09 February 2009
This BC Digitization Survey Results presentation was one of several presentations delivered at the BC Digitization Symposium 2008 held on December 1 & 2, 2008. For more information, please visit the BC Digitization Symposium 2008 website at: http://symposium.westbeyondthewest.ca.

Étude des étoiles de la branche horizontale extrême par l'astérosismologie / A study of extreme horizontal branch stars through asteroseismology

Van Grootel, Valérie January 2008
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.

Développement d'un algorithme de branch-and-price-and-cut pour le problème de conception de réseau avec coûts fixes et capacités / Development of a branch-and-price-and-cut algorithm for the fixed-charge capacitated network design problem

Larose, Mathieu 12 1900
Many problems in transportation and logistics can be formulated as network design models. They generally require transporting commodities, passengers, or data through a network to satisfy a certain demand while minimizing costs. In this thesis, we focus on the multicommodity capacitated fixed-charge network design problem, which consists of opening a subset of the links in a network so as to satisfy the demand while respecting the capacity constraints on the links. The objective is to minimize the fixed costs of opening links and the costs of transporting the commodities. We present an exact method for solving this problem based on mixed-integer programming techniques. Our method is a variant of the branch-and-bound algorithm, called branch-and-price-and-cut, in which we exploit both column generation and cutting planes to solve large-scale instances, in particular those with a large number of commodities. Compared with CPLEX, currently one of the best mathematical optimization solvers, our method is competitive on medium-scale instances and superior on large-scale instances with many commodities, even though it uses only one type of valid inequality.
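The problem described above has a standard arc-based formulation in the network design literature; the notation below is assumed here rather than taken from the thesis (binary \(y_{ij}\) for opening arc \((i,j)\), \(x_{ij}^{k}\) for the flow of commodity \(k\), \(f_{ij}\) and \(c_{ij}^{k}\) for fixed and per-unit flow costs, \(u_{ij}\) for capacities, \(d_{i}^{k}\) for net demand at node \(i\)):

```latex
\begin{align*}
\min\ & \sum_{(i,j)\in A} f_{ij}\, y_{ij} \;+\; \sum_{k\in K}\sum_{(i,j)\in A} c_{ij}^{k}\, x_{ij}^{k} \\
\text{s.t.}\ & \sum_{j:(i,j)\in A} x_{ij}^{k} \;-\; \sum_{j:(j,i)\in A} x_{ji}^{k} \;=\; d_{i}^{k}
  && \forall\, i\in N,\ k\in K \quad \text{(flow conservation)} \\
& \sum_{k\in K} x_{ij}^{k} \;\le\; u_{ij}\, y_{ij}
  && \forall\, (i,j)\in A \quad \text{(capacity and forcing)} \\
& x_{ij}^{k} \ge 0, \qquad y_{ij}\in\{0,1\}.
\end{align*}
```

In a branch-and-price-and-cut scheme of the kind the abstract describes, column generation prices out the flow variables \(x_{ij}^{k}\), while cutting planes strengthen the LP relaxation of the constraints linking \(x\) and \(y\).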

Planification de trajectoires avion : approche par analogie lumineuse / Aircraft trajectory planning: a light-propagation-analogy approach

Dougui, Nour Elhouda 15 December 2011
Within the European SESAR project, the need to increase air traffic capacity has motivated the planning of 4D (space + time) aircraft trajectories. In order to support pre-tactical planning (routing an aircraft around zones with bad weather or congestion) and tactical planning (generating sets of conflict-free 4D trajectories), we introduce a new algorithm: the light propagation algorithm (APL). This algorithm is based on a wavefront-propagation method inspired by the analogy with the propagation of light, adapted to the trajectory planning problem. The APL gives satisfactory results on a full day of real traffic over France while satisfying the constraints specific to air traffic management. The APL was then adapted to take into account uncertainty in the actual speed of the aircraft. Thus adapted, it was tested on the same day of traffic with RTA (Real Time Arrival) points, which reduce uncertainty in cases where the APL cannot resolve conflicts. The results obtained are very encouraging.
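The abstract describes the light propagation algorithm only at a high level. A minimal discrete analogue of wavefront propagation (an illustration under assumptions, not the thesis's algorithm) is Dijkstra-style front expansion over a cost grid, where expensive cells model zones to avoid, followed by backtracking the "ray" from destination to source:

```python
import heapq

def wavefront(grid_cost, source):
    """Propagate a front from source over a 2D cost grid.

    grid_cost[r][c] is the time to traverse cell (r, c); high costs model
    zones to avoid (bad weather, congestion). Returns the matrix of
    earliest arrival times, a discrete analogue of an advancing light front."""
    rows, cols = len(grid_cost), len(grid_cost[0])
    arrival = [[float("inf")] * cols for _ in range(rows)]
    arrival[source[0]][source[1]] = 0.0
    heap = [(0.0, source)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > arrival[r][c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + grid_cost[nr][nc]
                if nt < arrival[nr][nc]:
                    arrival[nr][nc] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return arrival

def backtrack(arrival, target):
    """Trace a shortest path by following strictly decreasing arrival
    times from target back to the source."""
    path, (r, c) = [target], target
    while arrival[r][c] > 0:
        r, c = min(((r + dr, c + dc)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < len(arrival)
                    and 0 <= c + dc < len(arrival[0])),
                   key=lambda p: arrival[p[0]][p[1]])
        path.append((r, c))
    return path[::-1]
```

On a grid with a prohibitively expensive central cell, the backtracked path routes around it, mirroring the pre-tactical avoidance described above.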
