  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

Les lotissements d'Orléans et la formation d'une périphérie urbaine (1875-1958) : processus d'extension, formes et règlements / The subdivisions of Orléans and the formation of an urban periphery (1875-1958): processes of extension, forms and regulations

Launay, Yann 29 September 2015 (has links)
This dissertation examines the formation of the urban periphery of Orléans (France) from the 1870s to the post-war period, studied at a particular scale of analysis: the subdivision (lotissement). The analysis of extension processes situates the subdivisions within the city more broadly. It first shows the close relationship between the subdivisions and the quartier Dunois, laid out in 1879-1880. While the town-planning, embellishment and extension plans introduced by the Cornudet Act (1919) had little impact on the construction of subdivisions, these subdivisions reflect, to varying degrees, the implementation of the municipal reconstruction and planning project of 1949. Understanding the regulatory and legal framework illuminates the city's road-development policy and its subdivision projects over the long term, and sheds light on the formation of the urban landscape. The study thus offers new keys to reading the territory of Orléans, informing us not only about the actors and their practices, but also about the urban and architectural forms they produced.
222

Analýza mluvených projevů / Analysis of Spoken Discourse

NOVÁKOVÁ, Lucie January 2013 (has links)
The aim of the diploma thesis is to give a partial characterization of the speech of several age groups (ninth-grade students, people aged 19-24, people aged 44-50, and people aged 65 or older) and to compare them in selected areas. The theoretical part focuses on the simple functional communication style in general, non-verbal elements of communication, suprasegmental features of speech, and the range of vocabulary. The practical part is divided into two chapters: the first describes the prerequisites and areas of the research; the second summarizes the speech characteristics of the individual age groups and compares them with one another. The practical part focuses on the quality of expression, i.e. accuracy and linguistic correctness, and on the periphery of the vocabulary range.
223

Aide au tolérancement tridimensionnel : modèle des domaines / Three-dimensional tolerancing assistance: the domains model

Mansuy, Mathieu 25 June 2012 (has links)
As demands on the quality and manufacturing cost of products grow more exacting, the optimal qualification and quantification of acceptable defects becomes essential. Tolerancing is the means of communication that defines the geometric variations allowed between the different trades involved in the product's manufacturing cycle; an optimal tolerancing is the right compromise between manufacturing cost and the quality of the final product. Tolerancing rests on three major issues: specification (standardization of a complete and unambiguous language), synthesis, and analysis of tolerances. This thesis proposes new methods for the analysis and synthesis of three-dimensional tolerancing, based on a geometric model built with the clearance and deviation domains developed in the laboratory. The first step determines the elementary topologies that compose a three-dimensional mechanism; for each topology, a resolution method is defined. In worst-case analysis, the conditions for meeting the functional requirements translate into existence and inclusion conditions on the domains, which can then be expressed as systems of scalar inequalities. The statistical analysis relies on Monte-Carlo sampling: the random variables are the small-displacement components of the deviation torsors, defined inside their tolerance zones (modelled by deviation domains), and the geometric dimensions that set the extent of the clearances (the size of the associated clearance domains). From these statistical simulations, the non-quality rate and the residual clearances can be estimated for a given tolerancing. 
A new, better-suited representation of the clearance and deviation domains simplifies the computations involved in tolerancing problems, and the local treatment of each elementary topology enables the global treatment of complex three-dimensional mechanisms with clearances taken into account.
224

Frequency Synthesis for Cognitive Radio Receivers and Other Wideband Applications

Zahir, Zaira January 2017 (has links) (PDF)
The radio frequency (RF) spectrum, as a natural resource, is severely under-utilized over time and space owing to an inefficient licensing framework. As a result, increasing cellular and wireless network usage is placing significant demands on the licensed spectrum. This has led to the development of cognitive radios, software-defined radios and mm-wave radios. Cognitive radios (CRs) enable more efficient spectrum usage over a wide range of frequencies and have therefore emerged as an effective solution to huge network demands. They promise versatility, flexibility and cognition that could revolutionize communication systems, but they also pose greater challenges to the design of RF front-ends: instead of a narrow-band front-end optimized and tuned to the carrier frequency of interest, cognitive radios demand front-ends that are versatile, configurable, tunable, and capable of transmitting and receiving signals with different bandwidths and modulation schemes. The primary purpose of this thesis is to design a reconfigurable, wide-band, low-phase-noise, fast-settling frequency synthesizer for cognitive radio applications. Alongside frequency generation, an area-efficient multi-band low noise amplifier (LNA) with integrated built-in self-test (BIST) and strong immunity to interferers has also been proposed and implemented; this LNA relaxes the specification on harmonic content at the synthesizer output. Finally, some preliminary work has been done on mm-wave (V-band) frequency synthesis. The key contributions of this thesis are:
A frequency synthesizer based on a type-2, third-order phase-locked loop (PLL), covering a frequency range of 0.1-5.4 GHz, implemented in a 0.13 µm CMOS technology. The PLL uses three voltage-controlled oscillators (VCOs) to cover the whole range. It can switch between any two frequencies in less than 3 µs and has phase-noise values compatible with most communication standards. The PLL settles on the desired state in dynamic multiple steps rather than the traditional single step; together with other circuit techniques, such as a DAC-based discriminator-aided charge pump, a fast-acquisition pulse-clocked PFD, and timing synchronization, this yields a significantly reduced settling time.
A single voltage-controlled LC oscillator (LC-VCO) covering a wide frequency range (2.0-4.1 GHz) using an area-efficient switchable multi-tap inductor and a capacitor bank. The multi-tap inductor is switched in the most favorable configuration so as to obtain good phase noise at the output; it provides a significant area advantage and, in spite of a degraded Q, an acceptable phase noise of -123 dBc/Hz and -113 dBc/Hz at a 1 MHz offset from carrier frequencies of 2 and 4 GHz, respectively. Implemented in a 0.13 µm CMOS technology, the oscillator, with a tuning range of about 69 %, occupies an active area of only 0.095 mm2. An active-inductor-based noise filter is proposed to improve the oscillator's phase-noise performance with little increase in area.
A variable-gain multi-band LNA operating over a wide range of frequencies (0.8 GHz to 2.4 GHz) using an area-efficient switchable-π network. The LNA can be tuned to different gain and linearity combinations for different band settings; depending on the location of the interferers, a specific band can be selected to provide optimum gain and the best signal-to-intermodulation ratio, accomplished through an on-chip built-in self-test (BIST) circuit. The maximum power gain of the amplifier is 19 dB with a return loss better than 10 dB for 7 mW of power consumption. The noise figure is 3.2 dB at 1 GHz and the third-order intercept point (IIP3) ranges from -15 dBm to 0 dBm. Implemented in a 0.13 µm CMOS technology, the LNA occupies an active area of about 0.29 mm2.
Three different types of VCOs for generating mm-wave frequencies (a stand-alone LC VCO, a push-push VCO, and a ring-oscillator-based VCO), implemented in a 65 nm CMOS technology, with their measured results analyzed.
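As a quick sanity check on the figure quoted above, the roughly 69 % tuning range of the 2.0-4.1 GHz LC-VCO follows from the usual definition of fractional tuning range relative to the center frequency (the function name below is illustrative):

```python
def tuning_range(f_min_ghz, f_max_ghz):
    """Fractional tuning range: span divided by center frequency."""
    center = (f_min_ghz + f_max_ghz) / 2
    return (f_max_ghz - f_min_ghz) / center

# (4.1 - 2.0) / 3.05 ≈ 0.689, i.e. about 69 %
tr = tuning_range(2.0, 4.1)
```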
225

Problem-Based SRS: método para especificação de requisitos de software baseado em problemas / Problem-Based SRS: a method for software requirements specification based on problems

Souza, Rafael Gorski Moreno 23 August 2016 (has links)
Requirements specification has long been recognized as a critical activity in software development processes because of its impact on project risk when poorly performed. A large body of work addresses theoretical aspects, proposed techniques, and recommended practices for Requirements Engineering (RE). To succeed, RE has to ensure that the specified requirements are complete and correct, meaning that all stakeholder intents in a given business context are covered by the requirements and that no unnecessary requirement is introduced. However, accurately capturing the stakeholders' business intents remains a challenge and is a major factor in software project failures. This master's dissertation presents a novel method referred to as "Problem-Based SRS", which aims to improve the quality of the Software Requirements Specification (SRS) in the sense that the stated requirements provide suitable answers to the customer's real business problems. In this approach, knowledge about the software requirements is constructed from knowledge about the customer's problems. Problem-Based SRS organizes activities and outcome artifacts in a process with five main steps. It supports the requirements engineering team in systematically analyzing the business context and specifying the software requirements, taking into account an initial glance and vision of the software. The quality of the specifications is evaluated using traceability techniques and axiomatic design principles. The case studies conducted and presented in this dissertation indicate that the proposed method can contribute significantly to better software requirements specifications.
227

Fundamentos de lógica, conjuntos e números naturais / Foundations of logic, sets, and natural numbers

Santos, Rafael Messias 28 August 2015 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / The main objective of this work is to approach the foundations of logic and the notions of sets in a close and elementary way, culminating in the construction of the natural numbers. We present, as naturally and intuitively as possible, the concepts of propositions and open propositions, and their use in the specification of sets in accordance with the axiom of specification. We also present the logical connectives of open propositions and logical equivalences, relating them to sets. We introduce the concept of a theorem, along with some forms of writing and proving statements about sets, and we use properties and relations of sets in the proof techniques. The study ends with the construction of the natural numbers and some of their main properties, such as the order relation.
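The construction of the natural numbers mentioned in the abstract can be illustrated with a minimal Peano-style encoding: a natural is either zero or the successor of another natural, addition is defined by recursion, and the order relation falls out of it. The names `Nat`, `succ`, and `add` below are illustrative, not taken from the thesis.

```python
class Nat:
    """Peano natural number: zero, or the successor of another natural."""
    def __init__(self, pred=None):
        self.pred = pred  # None encodes zero

ZERO = Nat()

def succ(n):
    """Successor: S(n)."""
    return Nat(n)

def add(m, n):
    """Addition by recursion on the second argument:
    m + 0 = m, and m + S(n) = S(m + n)."""
    if n.pred is None:
        return m
    return succ(add(m, n.pred))

def to_int(n):
    """Collapse a Peano natural to a Python int, for inspection."""
    return 0 if n.pred is None else 1 + to_int(n.pred)

def le(m, n):
    """Order relation: m <= n iff some k satisfies m + k = n
    (checked here via the integer images, for brevity)."""
    return to_int(m) <= to_int(n)

two = succ(succ(ZERO))
three = succ(two)
```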
228

Intégration des méthodes de sensibilité d'ordre élevé dans un processus de conception optimale des turbomachines : développement de méta-modèles / Integration of high-order sensitivity methods into an optimal turbomachinery design process: development of meta-models

Zhang, Zebin 15 December 2014 (has links)
Optimal turbomachinery design usually relies on iterative methods with either experimental or numerical evaluations, which can incur high costs due to numerous manipulations and intensive CPU usage. To limit these costs and shorten development time, this work integrates a parameterization method and meta-modelling into the design cycle of a low-speed axial turbomachine. The parameterization, based on a high-order sensitivity study of the Navier-Stokes equations, builds a parameterized database containing not only the evaluation results but also the first and cross derivatives of the objectives with respect to the parameters. The richer information carried by the derivatives is exploited when constructing meta-models, in particular with a Co-Kriging method used to couple several databases. Compared with a classical derivative-free method, the economic benefit lies in the reduced number of evaluation points. When this number is very small, a single reference value may be available along one or more dimensions, which requires an assumption about the error distribution; for those dimensions, Co-Kriging behaves like a Taylor extrapolation from the reference point and its derivatives. The approach was tested by building a meta-model for a fan with a conical hub, coupling databases from two geometries and two operating points. 
The accuracy of the response surface allowed an optimization with the NSGA-2 genetic algorithm; of the two selected optima, one maximizes efficiency and the other widens the operating range. The optimization results are finally validated by additional numerical simulations.
229

Polyfunkční dům Prudký - stavebně technologický projekt / Multifunctional house Prudký - construction technology project

Kaláb, Ondřej January 2018 (has links)
This diploma thesis presents the construction-technology design for the structural shell of a multifunctional building on the corner of Bánskobystrická and Medlánecká streets in Brno-Řečkovice. The thesis covers the work procedure for executing the roof construction layers, the machine assemblies, an itemized budget, the construction schedule, the construction site equipment, an inspection and test plan, ecology, and the transport relations related to the construction. A particular complication is the lack of space, as the site lies within the built-up area of Brno-Řečkovice; planning the construction was made more difficult by the shortage of space for storing materials and handling machinery.
230

Produkce digitálních obrazových dat a jejich kontrola / Digital Images Production and Quality Control

Vychodil, Bedřich January 2013 (has links)
(EN) This dissertation provides a broad understanding of the fundamental terminology and an inside view of the digitization workflow and quality control processes. The main foci are quality assurance during and outside the digitization processes, and the identification of the most suitable formats for access, long-term preservation, rendering, and reformatting. An analysis of selected digitization centers is also included. An application called DIFFER (Determinator of Image File Format propERties), subsequently The Image Data Validator - DIFFER, was developed using results from previously conducted research. The application employs new methods and technologies to help accelerate processing and quality control. The goal was to develop a quality control application for a select group of relevant still-image file formats, capable of performing identification, characterization, validation, and visual/mathematical comparison, integrated into an operational digital preservation framework. The application comes with a well-structured graphical user interface that helps the end user understand the relationships between various file format properties, detect visual and non-visual errors, and simplify decision-making. Additional comprehensive annexes describe the most crucial still image...
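The identification step such a tool performs can be illustrated in miniature: many still-image formats are recognizable from their leading "magic bytes". The sketch below covers a tiny subset of formats and is not DIFFER's actual code.

```python
# Magic-byte signatures for a few common still-image formats.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "PNG",
    b"\xff\xd8\xff": "JPEG",
    b"II*\x00": "TIFF (little-endian)",
    b"MM\x00*": "TIFF (big-endian)",
    b"GIF87a": "GIF",
    b"GIF89a": "GIF",
}

def identify(header: bytes) -> str:
    """Identify a still-image format from its leading bytes."""
    for magic, name in SIGNATURES.items():
        if header.startswith(magic):
            return name
    return "unknown"

fmt = identify(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8)  # → "PNG"
```

Real identification tools go further, cross-checking the declared internal structure against the format specification (characterization and validation), which is where most quality control value lies.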
