751 |
Cracking the code of 3' ss selection in S. cerevisiae. Meyer, Markus. 26 March 2010.
The informational content of 3' splice sites (3' ss) is low, and the mechanisms whereby they are selected are not clear. Here we enunciate a set of rules that govern their selection. For many introns, secondary structures are a key factor, because they occlude alternative 3' ss from the spliceosome and reduce the effective distance between the branch site (BS) and the 3' ss to a maximum of 45 nucleotides. Other alternative 3' ss are disregarded by the spliceosome because they lie 9 nucleotides or less from the branch site, or because they are weak splice sites. With these rules, we are able to explain the splicing pattern of the vast majority of introns in Saccharomyces cerevisiae. When in excess, L30 blocks the splicing of its own transcript by interfering with a critical rearrangement that is required for the proper recognition of the intron 3' end, and thus for splicing to proceed. We show that the protein Cbp80 has a role in promoting this rearrangement and therefore antagonizes splicing regulation by L30.
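The selection rules stated in the abstract can be restated as a toy predicate. The 45 nt effective-distance cap and the 9 nt minimum come from the abstract itself; the function name, the boolean inputs, and the way occlusion is handled are purely illustrative, not the authors' formalization:

```python
def candidate_3ss_usable(distance_from_bs, occluded_by_structure, is_weak_site):
    """Apply the stated rules to one candidate 3' splice site.

    distance_from_bs: effective distance (nt) between the branch site (BS)
    and the candidate 3' ss, after secondary structure is accounted for.
    """
    if occluded_by_structure:
        return False  # hidden from the spliceosome by a secondary structure
    if distance_from_bs <= 9:
        return False  # too close to the branch site
    if distance_from_bs > 45:
        return False  # beyond the effective BS-to-3'ss window
    if is_weak_site:
        return False  # weak splice sites are disregarded
    return True

print(candidate_3ss_usable(30, False, False))  # True
print(candidate_3ss_usable(8, False, False))   # False: within 9 nt of the BS
```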
|
752 |
TOP-K AND SKYLINE QUERY PROCESSING OVER RELATIONAL DATABASE. Samara, Rafat. January 2012.
Top-k and skyline queries are long-studied topics in the database and information retrieval communities, and they are two popular operations for preference retrieval. A top-k query returns a subset of the most relevant answers instead of all answers; efficient top-k processing retrieves the k objects with the highest overall score. In this paper, several algorithms for efficient top-k processing in different scenarios are presented, together with a framework based on existing algorithms and on cost-based optimization that covers these scenarios. This framework applies when the user can specify a ranking function, and a real-life scenario is worked through it step by step. A skyline query returns the set of points that are not dominated by any other point in the given dataset (a record x dominates another record y if x is as good as y in all attributes and strictly better in at least one attribute). Several algorithms for evaluating skyline queries are introduced, and one of the problems of the skyline query, the curse of dimensionality, is discussed. A new strategy, based on existing skyline algorithms, skyline frequency, and a binary tree strategy, is presented as a solution to this problem; it applies when the user cannot specify a ranking function, and a real-life scenario is again worked through step by step. Finally, the advantages of the top-k query are applied to the skyline query in order to retrieve results quickly and efficiently.
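The dominance relation defined in the abstract lends itself to a compact illustration. The naive O(n²) skyline below is a minimal sketch (assuming smaller values are better in every attribute), not one of the efficient algorithms the thesis surveys:

```python
def dominates(x, y):
    """x dominates y if x is at least as good as y in every attribute
    and strictly better in at least one (here: smaller is better)."""
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

def skyline(points):
    """Naive O(n^2) skyline: keep the points no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# toy hotel data as (price, distance-to-beach) pairs
hotels = [(50, 2.0), (80, 0.5), (60, 1.5), (90, 2.5)]
print(skyline(hotels))  # (90, 2.5) is dominated by (50, 2.0), so it drops out
```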
|
753 |
Les autres Métis : the English Métis of the Prince Albert settlement, 1862-1886. Code, Paget James. 14 January 2008.
In the mid-nineteenth century, Métis society re-established itself west of Red River in the Saskatchewan country. This thesis tells the long-overlooked story of the English Métis of the Prince Albert Settlement, beginning with James Isbister's initial farm in 1862 and the wave of Métis who followed him west in search of a better life. Questions of identity, politics, and religion are addressed to place the English Métis in the historical context of the Métis nation and of the Canadian state's institutional expansion onto the western prairies. The place of the English Métis vis-à-vis their French, First Nations, and Euro-Canadian neighbours is examined, as are their attempts to secure a land base and a continued collective identity under pressure from hostile state and economic forces. Their importance in the events of the period, which would have long-lasting national and local significance, is also examined. A survey of the community and the changes it went through is given, from the initial settlement period to the dissolution of the English Métis as a recognizable collective force following Louis Riel's uprising.
|
754 |
Inventory Routing Investigations. Song, Jin-Hwa. 8 July 2004.
The elimination of distribution inefficiencies arising from the timing of customers' orders is an important reason for companies to introduce vendor-managed inventory programs. By managing their customers' inventories, suppliers may be able to reduce demand variability and therefore distribution costs. We develop technology to measure the effectiveness of distribution strategies: a methodology that allows the computation of tight lower bounds on the total mileage required to satisfy customer demand over a period of time. As a result, companies gain insight into the effectiveness of their distribution strategy. This technology can also be used to suggest desirable delivery patterns and to analyze tactical and strategic decisions.
Secondly, we study the inventory routing problem with continuous moves (IRP-CM). The typical inventory routing problem deals with the repeated distribution of a single product, from a single facility, with an unlimited supply, to a set of customers that can all be reached with out-and-back trips. Unfortunately, this is not always the reality. We introduce the IRP-CM to study two important real-life complexities: limited product
availabilities at facilities and customers that cannot be served using out-and-back tours. We need to design delivery tours spanning several days, covering huge geographic areas, and involving product pickups at different facilities. We develop a heuristic and an optimization algorithm to construct distribution
plans. The heuristic is an innovative randomized greedy algorithm, which includes linear-programming-based postprocessing. To solve the IRP-CM to optimality, we give a time-discretized integer programming model and develop a branch-and-cut algorithm. As instances of time-discretized models tend to be large, we discuss several possibilities for reducing the problem size. We introduce a set of valid inequalities, called delivery cover inequalities, to tighten the bounds given by the LP relaxation of the time-discretized model. We also introduce
branching schemes exploiting the underlying structure of the IRP-CM. An extensive computational study demonstrates the effectiveness of the optimization algorithm. Finally, we present an integrated approach using heuristics and optimization algorithms providing effective and efficient technology for solving inventory problems with continuous moves.
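As a much simpler illustration of the kind of mileage lower bound discussed above (the thesis's LP-based bounds are tighter and more general), consider the case where every delivery is an out-and-back trip from a single depot: each customer then needs at least ceil(demand / vehicle capacity) round trips, which already bounds the total mileage from below. The function and data are illustrative, not taken from the thesis:

```python
import math

def out_and_back_lower_bound(customers, capacity):
    """Lower bound on total mileage when every delivery is an
    out-and-back trip from the depot.

    customers: list of (distance_from_depot, total_demand) pairs.
    Each customer requires at least ceil(demand / capacity) round trips.
    """
    return sum(2 * dist * math.ceil(demand / capacity)
               for dist, demand in customers)

# customer A: 10 miles away, demand 30 -> 2 trips; customer B: 25 miles, 1 trip
print(out_and_back_lower_bound([(10, 30), (25, 12)], capacity=15))  # 90
```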
|
755 |
Computer vision and machine learning methods for the analysis of brain and cardiac imagery. Mohan, Vandana. 6 December 2010.
Medical imagery is evolving towards ever higher resolution and throughput. The increasing volume of data and the use of multiple, often novel, imaging modalities necessitate mathematical and computational techniques for quicker, more accurate, and more robust analysis of medical imagery. The fields of computer vision and machine learning provide a rich set of techniques useful in medical image analysis, in tasks ranging from segmentation to classification and population analysis, notably by integrating the qualitative knowledge of experts in anatomy and in the pathologies of various disorders. The object of the proposed research is to explore various computer vision and machine learning methods with a view to improved analysis of multiple modalities of brain and cardiac imagery, towards enabling the clinical goals of studying schizophrenia, brain tumors (meningiomas and gliomas in particular), and cardiovascular disorders.
In the first project, a framework is proposed for the segmentation of tubular, branched anatomical structures. The framework uses the tubular surface model which yields computational advantages and further incorporates a novel automatic branch detection algorithm. It is successfully applied to the segmentation of neural fiber bundles and blood vessels.
In the second project, a novel population analysis framework is built using the shape model proposed as part of the first project. This framework is applied to the analysis of neural fiber bundles towards the detection and understanding of schizophrenia.
In the third and final project, the use of mass spectrometry imaging for the analysis of brain tumors is motivated on two fronts: the offline classification analysis of the data, and the end application of intraoperative detection of tumor boundaries. Support vector machines (SVMs) are applied to classify gliomas into one of four subtypes, towards building appropriate treatment plans, and multiple statistical measures are studied with a view to feature extraction (or biomarker detection). The problem of intraoperative tumor boundary detection is formulated as the detection of local minima of the spatial map of tumor cell concentration, which in turn is modeled as a function of the mass spectra via regression techniques.
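In its simplest discrete form, the boundary-detection formulation above reduces to finding local minima on a grid. The sketch below (toy data, 4-neighbourhood, strict minima, interior cells only) illustrates that reduction; it is not the thesis's actual method, which models the concentration map from mass spectra via regression:

```python
def local_minima(grid):
    """Return coordinates of strict local minima of a 2D map
    (4-neighbourhood); interior cells only, for simplicity."""
    rows, cols = len(grid), len(grid[0])
    mins = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            v = grid[i][j]
            if all(v < grid[i + di][j + dj]
                   for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))):
                mins.append((i, j))
    return mins

# toy "tumor cell concentration" map with one interior dip at (2, 2)
conc = [[9, 9, 9, 9, 9],
        [9, 5, 4, 5, 9],
        [9, 4, 1, 4, 9],
        [9, 5, 4, 5, 9],
        [9, 9, 9, 9, 9]]
print(local_minima(conc))  # [(2, 2)]
```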
|
756 |
Engineering nanomaterials with enhanced functionality. Li, Shanghua. January 2006.
This thesis deals with the engineering of novel nanomaterials, particularly nanocomposites and nanostructured surfaces with enhanced functionalities. The study comprises two parts: in the first, an in situ sol-gel polymerization approach is used to synthesize a polymer-inorganic hybrid material, and its exceptional transparent UV-shielding effect is investigated; in the second, an electrodeposition process is adapted to engineer surfaces, and the boiling performance of the fabricated nanostructured surfaces is evaluated.

In the first part of the work, polymer-inorganic hybrid materials composed of poly(methyl methacrylate) (PMMA) and zinc compounds were prepared by in situ sol-gel transition polymerization of a zinc complex in the PMMA matrix. The immiscibility of the solid organic and inorganic constituents was largely resolved by in situ sol-gel transition polymerization of ZnO nanofillers within PMMA in the presence of a dual-functional agent, monoethanolamine, which provided strong secondary interfacial interactions for both complexing and crosslinking of the constituents.

In the second part of the work, nanoengineering of the surface of copper plates was performed to enhance the boiling heat transfer coefficient. Micro-porous surfaces with a dendritic network of copper nanoparticles were obtained by electrodeposition with dynamic templates. To further alter the grain size of the dendritic branches, the nanostructured surfaces underwent a high-temperature annealing treatment.

Comprehensive characterization of the polymer-inorganic hybrid materials and nanoengineered surfaces was undertaken: XRD, 1H NMR, FT-IR, TGA, DSC, UV-Vis, ED, SEM, TEM, and HRTEM were used to establish basic physical properties, and pool boiling tests were performed to evaluate the boiling performance of the electrodeposited nanostructured micro-porous surfaces.

The homogeneous PZHM exhibited enhanced UV-shielding effects across the entire UV range even at a ZnO content as low as 0.02 wt%. Moreover, the relationship between band gap and particle size of the ZnO incorporated by the sol-gel process was in good agreement with the predictions of the effective mass model. The fabricated enhanced surface showed excellent performance in nucleate boiling: at a heat flux of 1 W/cm2, the heat transfer coefficient was enhanced over 15-fold compared to a plain reference surface. A model is presented to explain the enhancement based on the structural characteristics.
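The size dependence of the ZnO band gap mentioned above is commonly described with the Brus effective-mass approximation; the standard form below is quoted from the general literature, not from the thesis:

```latex
E_g(r) \;\approx\; E_g^{\mathrm{bulk}}
  \;+\; \frac{\hbar^{2}\pi^{2}}{2r^{2}}
        \left(\frac{1}{m_e^{*}} + \frac{1}{m_h^{*}}\right)
  \;-\; \frac{1.8\,e^{2}}{4\pi\varepsilon\varepsilon_{0}\,r}
```

Here r is the particle radius, m_e* and m_h* are the effective electron and hole masses, and ε is the relative permittivity. The quantum-confinement term grows as r shrinks, widening the gap and blue-shifting absorption, which is consistent with the size-tunable UV shielding reported above.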
|
757 |
Διαστηματική ανάλυση και ολική βελτιστοποίηση / Interval analysis and global optimization. Σωτηρόπουλος, Δημήτριος. 24 June 2007.
- / -
|
758 |
BC digitization survey results. Hives, Chris. 9 February 2009.
This BC Digitization Survey Results presentation was one of several presentations delivered at the BC Digitization Symposium 2008 held on December 1 & 2, 2008. For more information, please visit the BC Digitization Symposium 2008 website at: http://symposium.westbeyondthewest.ca.
|
759 |
Étude des étoiles de la branche horizontale extrême par l'astérosismologie (A study of extreme horizontal branch stars through asteroseismology). Van Grootel, Valérie. January 2008.
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
|
760 |
Développement d'un algorithme de branch-and-price-and-cut pour le problème de conception de réseau avec coûts fixes et capacités (Development of a branch-and-price-and-cut algorithm for the fixed-charge, capacitated network design problem). Larose, Mathieu. 12 1900.
Many problems in transportation and logistics can be formulated as network design models. They usually require transporting commodities, passengers, or data through a network to satisfy a certain demand while minimizing costs. In this work, we focus on the multicommodity capacitated fixed-charge network design problem, which consists of opening a subset of the links in the network to satisfy the demand. Each link has a capacity and a fixed cost that is paid if it is opened. The objective is to minimize the fixed costs of the opened links and the transportation costs of the commodities.
We present an exact method to solve this problem based on mixed-integer programming techniques. Our method is a variant of the branch-and-bound algorithm, called branch-and-price-and-cut, in which we exploit both column generation and cutting planes to solve large-scale instances, in particular those with a large number of commodities.
We compare our method with CPLEX, currently one of the best mathematical optimization solvers. Numerical results show that our method is competitive on medium-scale instances and better on large-scale instances with a large number of commodities, even though it uses only one type of valid inequality.
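For reference, a standard arc-based formulation of the multicommodity capacitated fixed-charge network design problem described above (the notation is ours, not quoted from the mémoire): with binary design variables y_ij, flow variables x_ij^k, fixed costs f_ij, unit flow costs c_ij^k, capacities u_ij, and node balances d_i^k,

```latex
\begin{aligned}
\min\ & \sum_{(i,j)\in A} f_{ij}\, y_{ij}
        \;+\; \sum_{k\in K} \sum_{(i,j)\in A} c_{ij}^{k}\, x_{ij}^{k} \\
\text{s.t.}\ & \sum_{j:(i,j)\in A} x_{ij}^{k} - \sum_{j:(j,i)\in A} x_{ji}^{k} = d_{i}^{k}
        && \forall i \in N,\ \forall k \in K \\
& \sum_{k\in K} x_{ij}^{k} \le u_{ij}\, y_{ij}
        && \forall (i,j) \in A \\
& x_{ij}^{k} \ge 0, \qquad y_{ij} \in \{0,1\}.
\end{aligned}
```

The linking constraint both enforces the capacity and ensures the fixed cost is paid on every link that carries flow; in a branch-and-price-and-cut scheme, the flow variables are generated as columns and valid inequalities (the abstract does not name the family used) strengthen the LP relaxation.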
|