101
Obchvat obcí Želechovice nad Dřevnicí a Lípa / Bypass of Želechovice nad Dřevnicí and Lípa
Škrabal, Jiří (January 2014)
The thesis focuses on the design of a northern bypass of the villages Želechovice nad Dřevnicí and Lípa. The study is motivated by excessive traffic intensity and the noise associated with it. Its objective is to reduce traffic in the villages and to provide a faster and smoother passage for transit traffic continuing toward road I/49. Seven alignment options were designed, one of which was selected for further development and worked out in two variants.
102
Vyhledávací studie silnice II/152 mezi Modřicemi a Ivančicemi / Location Study of The Road II/152 in Stage Modřice - Ivančice
Konečná, Tereza (January 2015)
The diploma thesis presents a location study for the modernisation of road II/152 between the towns of Modřice and Ivančice, including the connection of the existing infrastructure to the new transport solution. The aims were to improve the quality of transport, to reduce traffic in the municipalities and the noise related to it, and to improve the environment. Four variants were designed and the most suitable one was developed further.
103
Erfolgreich gegen Kürzungen: Förderer der Stadtbibliothek Chemnitz protestieren / Successful against cuts: supporters of the Stadtbibliothek Chemnitz protest
Straube, Barbara (19 April 2010)
Because of municipal austerity measures, the Stadtbibliothek Chemnitz was to lose 42% of its media budget for 2010, even though it is the most visited cultural institution in Chemnitz. The library's friends' association, the Förderverein der Stadtbibliothek e.V., protested in many forms and motivated thousands of citizens to protest as well, with success: the cuts were partially withdrawn.
104
Visual Analytics of Cascaded Bottlenecks in Planar Flow Networks
Post, Tobias; Gillmann, Christina; Wischgoll, Thomas; Hamann, Bernd; Hagen, Hans (25 January 2019)
Finding bottlenecks and eliminating them to increase the overall flow of a network is a task that appears in many real-world applications, such as production planning, factory layout, flow-related physical problems, and even cyber security. In many cases, several edges together form a bottleneck (a cascaded bottleneck). This work presents a visual analytics methodology to analyze such cascaded bottlenecks. The methodology consists of multiple steps: identification of bottlenecks, identification of potential improvements, communication of bottlenecks, interactive adaptation of bottlenecks, and a feedback loop that allows users to adapt flow networks and their resulting bottlenecks until they are satisfied with the flow network configuration. To achieve this, the definition of a minimal cut is extended to identify the network edges that form a (cascaded) bottleneck. To show the effectiveness of the presented approach, we apply the methodology to two flow network setups and show how the overall flow of these networks can be improved.
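To make the minimum-cut reading of a bottleneck concrete (a generic illustration on an invented toy network, not the authors' visual-analytics tool), a minimal sketch using NetworkX:

```python
import networkx as nx

# Toy flow network: capacities in arbitrary flow units.
G = nx.DiGraph()
G.add_edge("source", "a", capacity=10)
G.add_edge("source", "b", capacity=10)
G.add_edge("a", "c", capacity=3)   # these two edges together form a
G.add_edge("b", "c", capacity=2)   # cascaded bottleneck of capacity 5
G.add_edge("c", "sink", capacity=20)

cut_value, (reachable, non_reachable) = nx.minimum_cut(G, "source", "sink")

# Edges crossing from the source side to the sink side are the bottleneck.
bottleneck = [(u, v) for u in reachable for v in G[u] if v in non_reachable]
print("max flow / cut value:", cut_value)   # 5
print("bottleneck edges:", bottleneck)      # [('a', 'c'), ('b', 'c')]
```

Raising the capacity of either reported edge alone would not increase the overall flow, which is exactly why the paper argues for reasoning about the cut as a whole rather than about individual edges.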
105
Algorithmes pour voyager sur un graphe contenant des blocages / A guide book for the traveller on graphs full of blockages
Bergé, Pierre (3 December 2019)
We study NP-hard problems on graphs with blockages, seen as models of networks exposed to the risk of failures. We treat cut problems via the parameterized complexity framework, taking the cutset size p as the parameter. Given a set of sources {s1,...,sk} and a target t, we propose an algorithm which builds an edge cut of size at most p separating at least r sources from t. This NP-complete problem is called Partial One-Target Cut; it belongs to the family of multiterminal cut problems. Our algorithm is fixed-parameter tractable (FPT), as it runs in time $2^{O(p^2)}n^{O(1)}$. We prove that the vertex version of this problem, in which cuts are composed of vertices instead of edges, is W[1]-hard. We then design an FPT algorithm which counts the minimum vertex (S,T)-cuts of an undirected graph in time $2^{O(p \log p)}n^{O(1)}$. We also provide numerous results on the competitive ratio of both deterministic and randomized strategies for the Canadian Traveller Problem. The optimal ratio for deterministic strategies on general graphs is 2k+1, where k is a given upper bound on the number of blockages. We show that randomized strategies which do not use memory cannot improve the bound 2k+1. In addition, we discuss the tightness of lower bounds on the competitiveness of randomized strategies. The distance competitive ratio for a group of travellers, possibly equipped with telecommunication devices, is studied. Finally, a strategy dedicated to equal-weight chordal graphs is proposed, while another is built for graphs whose maximum (s,t)-cuts are small; both achieve a ratio below 2k+1.
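As a toy illustration of the online setting behind the Canadian Traveller Problem (this sketch is the simple replan-on-discovery heuristic, not one of the thesis's strategies; the graph and blockages are invented for the example):

```python
import networkx as nx

def replan_traveller(G, blocked, s, t):
    """Greedy online strategy: walk a shortest path and recompute
    whenever an incident edge turns out to be blocked."""
    known = G.copy()
    pos, travelled = s, 0.0
    while pos != t:
        path = nx.shortest_path(known, pos, t, weight="weight")
        nxt = path[1]
        if (pos, nxt) in blocked or (nxt, pos) in blocked:
            known.remove_edge(pos, nxt)   # discovery: this edge is blocked
            continue
        travelled += G[pos][nxt]["weight"]
        pos = nxt
    return travelled

G = nx.Graph()
G.add_weighted_edges_from([("s", "a", 1), ("a", "t", 1),
                           ("s", "b", 2), ("b", "t", 2)])
blocked = {("a", "t")}                    # k = 1 blockage
online = replan_traveller(G, blocked, "s", "t")
offline = 4                               # best route avoiding the blockage: s-b-t
print(online, offline, online / offline)  # 6.0 4 1.5, within the 2k+1 = 3 bound
```

The traveller only learns that an edge is blocked when standing at one of its endpoints, which is what forces backtracking and drives the competitive ratio studied in the thesis.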
106
Missed Opportunity: Three Baseline Evaluations of Federal Opportunity Zones Policy
Snidal, Michael (January 2023)
The 2017 Tax Cuts and Jobs Act contained the largest federal initiative for place-based investment in over half a century. Opportunity Zones ("OZs") are expected to cost the US government over $15 billion in forgone tax revenue through 2026, exceeding both the Clinton-era Empowerment Zones and the Great Society programs of Lyndon Johnson. Have OZs increased neighborhood investment and, if so, what types of projects and neighborhoods have benefited? This dissertation presents three baseline evaluations of OZs.
The first essay discusses the findings from 76 interviews with community and government officials, program managers, developers, businesses, and fund managers about OZ outcomes in West Baltimore. The second essay uses a difference-in-differences (DID) event study framework, an adjusted interrupted time series analysis, and census-tract matching techniques to compare small business and residential lending outcomes in OZs with those in areas that were eligible but not designated. The final essay combines an online search for OZ-supported affordable housing projects, a DID design that examines Low-Income Housing Tax Credit outcomes, and 16 interviews with community development experts to evaluate whether and how OZ is having an impact on affordable housing production.
These three analyses show that OZ is a missed opportunity. OZ is stimulating investment conversations and local government capacity, but it is failing at oversight and community engagement, and it is not changing outcomes for distressed community development or affordable housing. OZ is failing because it provides weak incentives for capital-gains investors seeking market-rate returns, because it does not support investors and developers already active in distressed neighborhoods, and because of several related design flaws that inhibit mission-driven development. The essays propose specific policy changes necessary for OZ to encourage investment in highly distressed neighborhoods and to support affordable housing production.
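For readers unfamiliar with the difference-in-differences design used in the second essay, a minimal sketch with invented tract-level data (not the dissertation's dataset or specification):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: lending volume before/after OZ designation, for
# designated tracts ("treated") and eligible-but-not-designated tracts.
df = pd.DataFrame({
    "lending": [5.0, 5.2, 4.8, 5.1, 6.0, 6.4, 5.5, 5.6],
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],
})

# The coefficient on treated:post is the DID estimate of the designation
# effect, valid under the parallel-trends assumption.
model = smf.ols("lending ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```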
107
Implementation and Field Testing of Improved Bridge Parapet Designs
Kalabon, Amy Elizabeth (30 May 2014)
No description available.
108
Aerodynamic instability of tall structures with complex corner shapes / 複雑な角部形状を持つ塔状構造物の空力不安定性
Thinzar, Hnin (25 March 2024)
Kyoto University / new-system course doctorate / Doctor of Engineering (博士(工学)) / thesis no. 甲第25248号 / 工博第5207号 / call no. 新制||工||1994 (University Library) / Department of Civil and Earth Resources Engineering (社会基盤工学専攻), Graduate School of Engineering, Kyoto University / Examiners: Prof. Tomomi Yagi (chief), Prof. Kunitomo Sugiura, Prof. Yoshikazu Takahashi / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Kyoto University / DFAM
109
From interactive to semantic image segmentation
Gulshan, Varun (January 2011)
This thesis investigates two well-defined problems in image segmentation, viz. interactive and semantic image segmentation. Interactive segmentation involves power-assisting a user in cutting objects out of an image, whereas semantic segmentation involves partitioning the pixels of an image into object categories. We investigate various models and energy formulations for both problems in this thesis.

In order to improve the performance of interactive systems, low-level texture features are introduced as a replacement for the more commonly used RGB features. To quantify the improvement obtained with these texture features, two annotated image datasets are introduced (one consisting of natural images, the other of camouflaged objects). A significant improvement in performance is observed when using texture features for monochrome images and images containing camouflaged objects.

We also explore adding mid-level cues such as shape constraints into interactive segmentation by introducing the idea of geodesic star convexity, which extends the existing notion of a star-convexity prior in two important ways: (i) it allows multiple star centres as opposed to the single star of the original prior, and (ii) it generalises the shape constraint by allowing geodesic paths rather than Euclidean rays. Global minima of our energy function can be obtained subject to these new constraints. We also introduce geodesic forests, which exploit the structure of shortest paths in implementing the extended constraints. These extensions to star convexity allow us to use such constraints in a practical segmentation system. The system is evaluated by means of a "robot user" that measures the amount of interaction required in a precise way, and we show that shape constraints reduce user effort significantly compared to existing interactive systems. We also introduce a new and harder dataset which augments the existing GrabCut dataset with more realistic images and ground truth taken from the PASCAL VOC segmentation challenge.

In the latter part of the thesis, we bring in object-category-level information in order to make interactive segmentation tasks easier and to move towards fully automated semantic segmentation. An algorithm to automatically segment humans from cluttered images given their bounding boxes is presented. A top-down segmentation of the human is obtained using classifiers trained to predict segmentation masks from local HOG descriptors. These masks are then combined with bottom-up image information in a local GrabCut-like procedure. This algorithm is later fully automated to segment humans without requiring a bounding box, and is quantitatively compared with other semantic segmentation methods. We also introduce a novel way to acquire large quantities of segmented training data relatively effortlessly using the Kinect.

In the final part of this work, we explore various semantic segmentation methods based on learning from bottom-up super-pixelisations. Different methods of combining multiple super-pixelisations are discussed and quantitatively evaluated on two segmentation datasets. We observe that simple combinations of independently trained classifiers on single super-pixelisations perform almost as well as complex methods based on joint learning across multiple super-pixelisations.
We also explore CRF-based formulations for semantic segmentation, and introduce a novel visual-words-based description of object boundaries in the energy formulation. The object appearance and boundary parameters are trained jointly using structured output learning methods, and the benefit of adding pairwise terms is quantified on two different datasets.
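For context on the GrabCut-style pipeline the thesis builds on, a minimal sketch of plain bounding-box GrabCut in OpenCV (without the thesis's texture features or geodesic star-convexity constraints; the file name and rectangle below are placeholders):

```python
import cv2
import numpy as np

img = cv2.imread("photo.jpg")                # placeholder image path
mask = np.zeros(img.shape[:2], np.uint8)
bgd_model = np.zeros((1, 65), np.float64)    # internal GMM state
fgd_model = np.zeros((1, 65), np.float64)
rect = (50, 50, 200, 300)                    # user-drawn box (x, y, w, h)

# Alternates GMM colour-model fitting with a graph cut for 5 iterations.
cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Definite or probable foreground pixels form the cut-out object.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype("uint8")
cv2.imwrite("cutout.png", img * fg[:, :, None])
```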
110
Formalisation de la cohérence et calcul des séquences de coupe minimales pour les systèmes binaires dynamiques et réparables / Formal definition of coherency and computation of minimal cut sequences for binary dynamic and repairable systems
Chaux, Pierre-Yves (15 April 2013)
Preventive risk assessment of a complex system relies on dynamic models that describe the link between system failure and the scenarios of failure and repair events of its components. The qualitative analysis of a binary dynamic and repairable system aims at computing and analysing the scenarios that lead to system failure. Since such systems describe a very large set of scenarios, only the most representative ones, called Minimal Cut Sequences (MCS), are of interest to the safety engineer. The lack of a formal definition of MCS has produced multiple definitions, either specific to a given model (and thus not generic) or informal. This work proposes: i) a formal framework and a formal definition of MCS that are independent of the reliability model used; ii) a method to compute them based on properties derived from this definition; iii) an extension of the framework to multi-state components, enabling the qualitative analysis of Boolean logic Driven Markov Processes (BDMP) models. Under the hypothesis that the scenarios implicitly described by a reliability model can always be represented by a finite automaton, this work defines coherency for dynamic and repairable systems and shows how to obtain a minimal representation of all scenarios leading to system failure.
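As a toy illustration of what distinguishes a minimal cut sequence from an unordered cut set (a brute-force sketch on an invented primary/backup/switch system, not the BDMP algorithm of the thesis):

```python
from itertools import permutations

COMPONENTS = ["P", "B", "S"]   # primary, cold-spare backup, switch

def system_failed(seq):
    """Toy dynamic failure condition: the system is lost when primary P and
    backup B are both down, or when switch S fails *before* P does (so the
    backup is never activated).  The order of events matters."""
    failed = set(seq)
    if {"P", "B"} <= failed:
        return True
    if {"P", "S"} <= failed:
        return seq.index("S") < seq.index("P")
    return False

def is_cut_sequence(seq):
    """A cut sequence reaches system failure exactly at its last event."""
    for i in range(1, len(seq) + 1):
        if system_failed(seq[:i]):
            return i == len(seq)
    return False

cut_sequences = [seq for r in range(1, len(COMPONENTS) + 1)
                 for seq in permutations(COMPONENTS, r)
                 if is_cut_sequence(seq)]

def is_subsequence(short, long):
    it = iter(long)
    return all(e in it for e in short)

# Minimal cut sequences: cut sequences with no shorter cut sequence embedded.
mcs = [s for s in cut_sequences
       if not any(len(o) < len(s) and is_subsequence(o, s) for o in cut_sequences)]
print(mcs)   # [('P', 'B'), ('B', 'P'), ('S', 'P')] -- note ('P', 'S') is not one
```

The brute force is exponential in the number of components, which is precisely why the thesis develops a formal, model-independent definition and dedicated computation methods instead.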