151

Análise de consistência de traçado de uma rodovia de múltiplas faixas / Geometric design consistency analysis of a multilane highway

Torres, Ana Luísa Martins 22 October 2015
In Brazil, the number of traffic accidents has been rising for decades, and speeding, one of the main causes of accidents on Brazilian highways, contributes to the likelihood of a crash. The speeds drivers adopt are also a function of the geometric elements of the road (radius, grade, lane width, etc.). A consistent geometric design does not violate drivers' expectations and thus supports safe operation. Most drivers can perceive coordination failures but, technically, they are unaware of their source. This research performs a consistency analysis of a stretch of a Brazilian multilane highway with a high accident rate and a heavy flow of commercial vehicles. The sections with the highest accident occurrence were identified, and speeds were measured to develop a model for predicting the operating speed (V85) along the study stretch. With this model, design consistency was analyzed using the safety-criteria method, which identified two sections with consistency problems. Finally, it was verified whether these sections correspond to the sites with the most accidents: tangent T5 precedes a curve with a high accident count (km 511+000), and the site with the highest concentration of accidents (km 514) was rated FAIR.
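A minimal sketch of the kind of check the safety-criteria method performs, assuming Lamm-style thresholds for Criterion I (operating speed versus design speed); the thesis's actual V85 regression is not given in the abstract, so the model form and coefficients below are invented for illustration:

```python
# Hypothetical design-consistency check in the spirit of the safety-criteria
# method. The V85 model coefficients are illustrative, not from the thesis.

def v85_from_radius(radius_m, a=105.0, b=3000.0):
    """Toy operating-speed model: V85 falls as the curve radius tightens."""
    return a - b / radius_m

def safety_criterion_1(v85_kmh, design_speed_kmh):
    """Lamm-style Criterion I: compare operating speed to design speed."""
    diff = abs(v85_kmh - design_speed_kmh)
    if diff <= 10:
        return "GOOD"
    elif diff <= 20:
        return "FAIR"
    return "POOR"

for radius in (200, 400, 800):
    v85 = v85_from_radius(radius)
    print(radius, round(v85, 1), safety_criterion_1(v85, design_speed_kmh=80))
```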
152

An analysis of continuous consistency models in real-time peer-to-peer fighting games

Huynh, Martin, Valarino, Fernando January 2019
This study analyses different methods of maintaining a consistent state between two peers in a real-time fighting game played over a network. Current methods of state management are explored in a comprehensive literature review, which establishes baseline knowledge and a theoretical comparison of use cases for the two most common models: delay and rollback. These results were then explored further in a practical case study: a test fighting game implementing both delay and rollback networking was built in Unity3D. The networking strategies were tested by a group of ten users under different simulated network conditions, and their experiences were documented with a Likert-style questionnaire at each stage of testing. Based on user feedback, the rollback strategy provided the better overall user experience: it remained more responsive and stable than the delay implementation as network latency increased, suggesting that rollback is also more fault-tolerant than delay.
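A minimal sketch of the rollback idea the study tests, assuming a toy deterministic simulation: inputs and state snapshots are kept per frame, and a late remote input triggers a resimulation from the frame it belongs to (all names here are illustrative, not from the thesis):

```python
def simulate(state, inputs):
    """Advance the game state one frame; trivial stand-in for real game logic."""
    return state + sum(inputs)

class RollbackSession:
    def __init__(self, initial_state=0):
        self.states = {0: initial_state}   # frame -> state snapshot
        self.inputs = {}                   # frame -> [local, remote] inputs
        self.frame = 0

    def advance(self, local_input, predicted_remote=0):
        """Run a frame immediately, predicting the missing remote input."""
        self.inputs[self.frame] = [local_input, predicted_remote]
        self.states[self.frame + 1] = simulate(
            self.states[self.frame], self.inputs[self.frame])
        self.frame += 1

    def receive_remote(self, frame, remote_input):
        """Late remote input: correct history and resimulate to the present."""
        self.inputs[frame][1] = remote_input
        for f in range(frame, self.frame):
            self.states[f + 1] = simulate(self.states[f], self.inputs[f])

session = RollbackSession()
session.advance(local_input=1)                   # frame 0, remote predicted as 0
session.advance(local_input=2)                   # frame 1
session.receive_remote(frame=0, remote_input=3)  # roll back and resimulate
print(session.states[session.frame])             # state reflects the true inputs
```

A delay implementation would instead hold frame 0 until the remote input arrived, which is why rollback stays responsive as latency grows.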
153

Seleção de modelos para segmentação de sequências simbólicas usando máxima verossimilhança penalizada / A model selection criterion for the segmentation of symbolic sequences using penalized maximum likelihood

Castro, Bruno Monte de 20 February 2013
The sequence segmentation problem aims to partition a sequence, or a set of sequences, into a finite number of segments that are as homogeneous as possible. In this work we consider the problem of segmenting a set of random sequences with values in a finite alphabet $\mathcal{A}$ into a finite number of independent blocks. We suppose that we have $m$ independent sequences of length $n$, constructed by the concatenation of $s$ segments of lengths $l^{*}_j$, where each block is drawn from a distribution $\mathbb{P}_j$ over $\mathcal{A}^{l^{*}_j}$, $j=1,\cdots,s$. We denote the true cut points by the vector ${\bf k}^{*}=(k^{*}_1,\cdots,k^{*}_{s-1})$, with $k^{*}_i=\sum_{j=1}^{i} l^{*}_j$, $i=1,\cdots,s-1$; these points mark the changes of segment. We propose a penalized maximum likelihood criterion to infer simultaneously the number of cut points and the position of each of those points. We also present a segmentation algorithm and report simulations that show how it works and its speed of convergence. Our main result is a proof of the strong consistency of the cut-point estimator as $m$ grows to infinity.
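A toy sketch of penalized maximum likelihood segmentation under a strong simplifying assumption that is ours, not the thesis's (symbols independent within each block); it exhaustively scores cut-point vectors with a BIC-style penalty:

```python
# Toy penalized-ML segmentation of a symbolic sequence. Exhaustive search is
# fine at this scale; the thesis proves consistency for a proper estimator.

import math
from itertools import combinations

def block_loglik(block):
    """Log-likelihood of a block under its empirical symbol distribution."""
    counts = {c: block.count(c) for c in set(block)}
    n = len(block)
    return sum(k * math.log(k / n) for k in counts.values())

def best_segmentation(seq, penalty):
    """Search cut-point vectors, maximizing penalized log-likelihood."""
    n = len(seq)
    best, best_score = (), block_loglik(seq)   # baseline: no cut at all
    for s in range(2, 4):                      # try 2 or 3 segments here
        for cuts in combinations(range(1, n), s - 1):
            bounds = (0,) + cuts + (n,)
            ll = sum(block_loglik(seq[a:b]) for a, b in zip(bounds, bounds[1:]))
            score = ll - penalty * (s - 1) * math.log(n)
            if score > best_score:
                best, best_score = cuts, score
    return best, best_score

seq = "aaaaaaabbbbbbb"
print(best_segmentation(seq, penalty=1.0))  # expect a cut at position 7
```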
154

Análise da influência e características das vias no número e na severidade dos acidentes: estudo de caso na autoestrada Grajaú-Jacarepaguá / Analysis of the influence of road characteristics on the number and severity of accidents: a case study on the Grajaú-Jacarepaguá Highway

Leal, Bruno Alexandre Brandimarte 27 April 2017
The objective of this work is to analyze the relationship between road characteristics and traffic accidents. The main motivation is the wide variety of characteristics that can influence accidents and the complexity of the relationships among them, which can make the analysis distinct in each case. The advantages of actions based on risk management and on the preventive treatment of road safety favor interventions on vehicles and, especially, on the road system as a viable way to reduce the weight of the human factor in accidents. To this end, an extensive theoretical review was compiled, enumerating potential characteristics and relating them to accidents, and stressing the importance of evaluating and treating aspects linked to the vehicle and to the road, in order to analyze safety conditions more effectively and to contribute to the academic and professional fields by providing data and information that help identify the characteristics that most influence safety. A case study was also carried out on the Grajaú-Jacarepaguá Highway (Menezes Cortes Avenue) in the city of Rio de Janeiro. From two accident databases provided by the Rio de Janeiro Traffic Engineering Company (CET-Rio) and a field investigation on the highway, relationships between accidents and road characteristics were identified. It was concluded that road characteristics are highly relevant to the accidents that occurred. The results indicate that geometric consistency is related to traffic accidents, although no predictive model yet exists to help explain this behavior. In general, road reconstruction and maintenance activities also support traffic safety. Despite its limitations, the study presented here is expected to serve as a reference for future road interventions and improvements.
155

[en] APPLICATIONS OF THE FIRST CONSISTENCY PROOF PRESENTED BY GENTZEN FOR PEANO ARITHMETIC / [pt] APLICAÇÕES DA PRIMEIRA PROVA DE CONSISTÊNCIA APRESENTADA POR GENTZEN PARA A ARITMÉTICA DE PEANO

MARIA FERNANDA PALLARES COLOMAR 14 November 2003
In the anthology of Gentzen's works edited by M. E. Szabo and published in 1969, an appendix transcribes some passages, presented by Bernays to the editor, that belong to a first consistency proof for Peano Arithmetic by Gentzen which had not been published until then. Unlike Gentzen's other consistency proofs, already known in the thirties, this proof does not use transfinite induction up to ε₀. Instead, it is based on the definition of a reduction process for sequents that is systematically associated with every derivable sequent, allowing it to be recognized as true. We reconstruct this proof with some variations and study how its main technique (the definition of the reduction process) can be viewed in relation to results of classical first-order logic, such as completeness proofs. The central part of our dissertation is a version of this consistency proof for a formal system for Heyting Arithmetic.
156

XEM: XML Evolution Management

Kramer, Diane S. 21 July 2001
"As information on the World Wide Web continues to proliferate at an astounding rate, the Extensible Markup Language (XML) has been emerging as a standard format for data representation on the web. In many application domains, specific document type definitions (DTDs) are designed to enforce a semantically agreed-upon structure of the XML documents. In XML context, these structural definitions serve as schemata. However, both the data and the structure (schema) of XML documents tend to change over time for a multitude of reasons, including to correct design errors in the DTD, to allow expansion of the application scope over time, or to account for the merging of several businesses into one. Most of the current software tools that enable the use of XML do not provide explicit support for such data or schema changes. Using these tools in a changing environment entails making manual edits to DTDs and XML data and reloading them from scratch. In this vein, we put forth the first solution framework, called XML Evolution Manager (XEM), to manage the evolution of DTDs and XML documents. XEM provides a minimal yet complete taxonomy of basic change primitives. These primitives, classified as either data or schema changes, are consistency-preserving. For a data change, they ensure that the modified XML document conforms to its DTD both in structure and constraints. For a schema change, they ensure that the new DTD is well-formed, and all existing XML documents are transformed also to conform to the modified DTD. We prove both the completeness of our evolution taxonomy, as well as its consistency-preserving nature. To verify the feasibility of our XEM approach we have implemented a working prototype system in Java, using the XML4J parser from IBM and PSE Pro as our backend storage system. We present an experimental study run on this system where we compare the relative efficiencies of the primitive operations in terms of their execution times. We then contrast these execution times against the time to reload the data, which would be required in a manual system. Based on the results of these experiments we conclude that our approach improves upon the previous method of making manual changes and reloading data from scratch by providing automated evolution management facilities for DTDs and XML documents."
157

The Schroedinger-Poisson Selfconsistency in Layered Quantum Semiconductor Structures

Moussa, Jonathan Edward 24 November 2003
"We develop a selfconsistent solution of the Schroedinger and Poisson equations in semiconductor heterostructures with arbitrary doping profiles and layer geometries. An algorithm for this nonlinear problem is presented in a multiband k.P framework for the electronic band structure using the finite element method. The discretized functional integrals associated with the Schroedinger and Poisson equations are used in a variational approach. The finite element formulation allows us to evaluate functional derivatives needed to linearize Poisson’s equation in a natural manner. Illustrative examples are presented using a number of heterostructures including single quantum wells, an asymmetric double quantum well, p-i-n-i superlattices and trilayer superlattices."
158

Waste management in Ericsson: a method for choosing, on sustainability grounds, the most appropriate treatment for the waste items produced at Ericsson AB in Borås

Salimi, Mahdi January 2010
This report deals analytically with waste management at the Ericsson site in Borås. Based on the concept and aspects of sustainability, the company wants a method for checking whether its waste management is headed in the right direction. Among the methods studied, the Analytic Hierarchy Process (AHP) was chosen. AHP follows a mathematical algorithm: build a hierarchy, make pairwise comparisons between corresponding items, carry out the calculations, and finally have an eligible team of decision makers check and review the whole process for correctness. In spite of some scientifically accepted critiques, it remains reliable for this purpose. The method is applied to several of Ericsson's waste items: wooden boxes, pallets, and hard plastics. Two controversial issues of the chosen method, consistency and rank reversal, are then investigated and discussed for these waste items, and the method's application for future use is outlined.
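A minimal sketch of the AHP consistency check discussed above, using Saaty's standard formulas (CI = (λmax − n)/(n − 1), CR = CI/RI); the pairwise judgments are invented for illustration:

```python
# AHP priority weights from the principal eigenvector, plus the consistency
# ratio. CR < 0.1 is conventionally taken as acceptably consistent.

import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices

def ahp_weights_and_cr(matrix):
    """Return normalized priority weights and the consistency ratio."""
    a = np.asarray(matrix, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)          # principal eigenvalue (lambda_max)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)
    return w, ci / RI[n]

# Pairwise comparison of three waste options (e.g. reuse, recycle, landfill)
judgments = [[1,   3,   5],
             [1/3, 1,   3],
             [1/5, 1/3, 1]]
weights, cr = ahp_weights_and_cr(judgments)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```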
160

Rhéologie des boues résiduaires : rôle de la thixotropie et de la composition sur les propriétés liquides et solides / Rheology of sewage sludge : Role of thixotropy and composition on liquid and solid properties

Thiene, Oumar 30 November 2018
Sludge management is a major issue: production increases year after year while the options for reuse remain limited. In the context of sustainable management and the circular economy, treatment processes need to be well controlled. Since these processes are essentially based on flows, improving their efficiency requires measuring and controlling sludge rheology, including viscous properties. However, sludge is a complex material and its rheological properties depend strongly on its composition. The solid content, the parameter most commonly used to evaluate consistency, is of limited value, since the most consistent sludge is not necessarily the one with the highest solid content. An in-depth state of the art on the rheology of non-Newtonian materials, with a focus on sewage sludge, underlined the role of organic matter and thixotropy in the variability of the material's rheological properties. To study this, an experimental methodology was first developed to ensure reliable results. This highlighted the influence of experimental procedures on rheological measurements and led to a robust procedure guaranteeing the repeatability and reproducibility of the results. Using this procedure, it was shown that thixotropic effects concern only the solid-like properties of sludge, the liquid-like properties being independent of material history. The approach also demonstrated that thixotropic behavior is intrinsically linked to the organic matter contained in the sludge: the higher the organic matter content, the more thixotropic the sludge. Then, from the solid content and the proportion of organic matter in the solids, a criterion more relevant than the solid content alone was established to compare the rheological properties of sludges of different origins. This criterion, called the equivalent consistency, proves sufficient to account for the influence of composition on sludge rheology. Finally, depending on the value of the equivalent consistency, the rheological behavior of sewage sludge can be treated as Newtonian, simply non-Newtonian (negligible thixotropic effects), or non-Newtonian with thixotropic effects.
