41

Solving dense linear systems on accelerated multicore architectures / Résoudre des systèmes linéaires denses sur des architectures composées de processeurs multicœurs et d’accélérateurs

Rémy, Adrien 08 July 2015
In this PhD thesis, we study algorithms and implementations to accelerate the solution of dense linear systems using hybrid architectures with multicore processors and accelerators. We focus on methods based on the LU factorization, and our code development takes place in the context of the MAGMA library. We first study different hybrid CPU/GPU solvers based on the LU factorization which aim at reducing the communication overhead due to pivoting. The first one is based on a communication-avoiding pivoting strategy (CALU), while the second uses a random preconditioning of the original system to avoid pivoting (RBT). We show that both of these methods outperform the solver using LU factorization with partial pivoting when implemented on hybrid multicore/GPU architectures. We also present new solvers based on randomization for hybrid architectures with Nvidia GPUs or Intel Xeon Phi coprocessors. With this method, we can avoid the high cost of pivoting while remaining numerically stable in most cases. The highly parallel architecture of these accelerators allows us to perform the randomization of our linear system at a very low computational cost compared to the time of the factorization. Finally, we investigate the impact of non-uniform memory accesses (NUMA) on the solution of dense general linear systems using an LU factorization algorithm. In particular, we illustrate how an appropriate placement of threads and data on a NUMA architecture can improve the performance of the panel factorization and consequently accelerate the global LU factorization. We show how these placements can improve performance when applied to hybrid multicore/GPU solvers.
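As a rough illustration of the RBT idea described above — randomize the system so that LU without pivoting becomes safe — the following Python sketch applies depth-1 random butterfly matrices on both sides of the system before a pivot-free factorization. This is a minimal sketch, not the MAGMA implementation: the butterfly depth, the scaling of the random diagonals, the helper names and the assumption that n is even are all illustrative choices.

```python
import numpy as np

def butterfly(n, rng):
    """Depth-1 random butterfly B = (1/sqrt(2)) [[R0, R1], [R0, -R1]], R0/R1 random diagonal (n even)."""
    h = n // 2
    r0 = np.diag(np.exp(rng.uniform(-0.05, 0.05, h)))
    r1 = np.diag(np.exp(rng.uniform(-0.05, 0.05, h)))
    return np.block([[r0, r1], [r0, -r1]]) / np.sqrt(2.0)

def lu_solve_no_pivot(A, b):
    """Gaussian elimination without pivoting; acceptable here because A has been randomized."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k + 1:] -= m * A[k, k + 1:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

def rbt_solve(A, b, seed=0):
    """Solve Ax = b: randomize with two butterflies, factorize without pivoting, map back."""
    rng = np.random.default_rng(seed)
    U, V = butterfly(len(b), rng), butterfly(len(b), rng)
    y = lu_solve_no_pivot(U.T @ A @ V, U.T @ b)   # LU on the randomized matrix U^T A V
    return V @ y                                   # x = V y solves the original system
```

In the setting of the thesis, the butterfly multiplications are performed on the accelerator, where, as the abstract notes, their cost is negligible compared with the factorization itself.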
42

Análise de textura em imagens baseado em medidas de complexidade / Image Texture Analysis based on complexity measures

Condori, Rayner Harold Montes 30 November 2015
Texture analysis is one of the most basic and popular computer vision research areas. It is also of great importance in many other disciplines, such as the medical and biological sciences. For example, the detection of non-healthy tissue in lung Magnetic Resonance images is a common texture analysis task. We propose a novel method for texture characterization based on complexity measures such as the Lyapunov exponent, the Hurst exponent and the Lempel-Ziv complexity. These measures were applied to samples taken from images in the frequency domain. Three sampling methods were proposed: radial sampling, circular sampling and sampling by partially self-avoiding deterministic walks (CDPA sampling). Each sampling method produces one feature vector per complexity measure, containing a set of descriptors that characterize the processed image. Each image is therefore represented by nine feature vectors (three complexity measures over samples from three sampling methods), which are compared on texture classification tasks. Finally, we concatenate each Lempel-Ziv feature vector obtained from the radial and circular samplings with descriptors obtained through traditional texture analysis techniques, such as Local Binary Patterns (LBP), Gabor Wavelets (GW), Gray Level Co-occurrence Matrices (GLCM) and partially self-avoiding deterministic walks on graphs (CDPAg). This approach was tested on three datasets: Brodatz, USPtex and UIUC, each with its own well-known challenges. The success rates of all traditional methods were increased by adding relatively few Lempel-Ziv descriptors. For example, in the LBP case the success rate went from 84.25% to 89.09% with the addition of only five descriptors. In fact, adding just five Lempel-Ziv descriptors is enough to see an increase in the success rate of every traditional method studied. However, adding too many Lempel-Ziv descriptors (for example, more than 40) generally does not produce better results. Given the similar results obtained on all three databases, we conclude that this approach can be used to increase the success rate in many other texture classification tasks. Finally, the CDPA sampling also obtains promising results that can be improved in future work.
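To make the descriptor pipeline concrete, here is a hedged Python sketch of one of the three sampling schemes: radial sampling of the Fourier spectrum followed by a Lempel-Ziv phrase count on a binarized profile. The LZ78-style parse, the median-threshold binarization and the number of radial lines are illustrative assumptions, not the exact choices made in the dissertation.

```python
import numpy as np

def lz_complexity(bits):
    """Count distinct phrases in a simple LZ78-style parse of a binary string."""
    phrases, current = set(), ""
    for b in bits:
        current += b
        if current not in phrases:
            phrases.add(current)
            current = ""
    return len(phrases) + (1 if current else 0)

def radial_lz_descriptor(image, n_angles=8):
    """One Lempel-Ziv value per radial line sampled in the frequency domain of a 2-D grayscale image."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = np.array(spectrum.shape) // 2
    radius = min(cy, cx) - 1
    feats = []
    for theta in np.linspace(0, np.pi, n_angles, endpoint=False):
        r = np.arange(radius)
        ys = (cy + r * np.sin(theta)).astype(int)
        xs = (cx + r * np.cos(theta)).astype(int)
        profile = spectrum[ys, xs]                     # magnitude values along one radial line
        med = np.median(profile)
        bits = "".join('1' if v > med else '0' for v in profile)
        feats.append(lz_complexity(bits))
    return np.array(feats, dtype=float)
```

In the dissertation these complexity values are then concatenated with LBP, GW or GLCM descriptors before classification; the sketch above only covers the sampling and complexity step.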
43

Missbruk av skatteavtal : Kan de föreslagna reglerna i BEPS åtgärdspunkt 6 motverka förfaranden som missbrukar skatteavtal. Om inte, kan Sverige motverka sådana förfaranden genom att tillämpa generalklausulen mot skatteflykt? / Treaty abuse : Can the proposed rules in BEPS Action 6 counteract actions that may lead to treaty abuse. If not, is it possible for Sweden to apply the Swedish general anti-avoidance rule, that is applicable against tax evasion?

Persson, Anna, Tedenhag, Jessica January 2015
The BEPS project was started in 2012 with the purpose of preventing tax subjects from using loopholes in tax treaties and national tax laws in order to obtain tax benefits. In 2013, the OECD published an action plan identifying 15 actions to be taken on this issue. Action 6 addresses treaty abuse through treaty shopping, in which a tax subject searches for the jurisdiction whose tax treaty leads to the most beneficial taxation. To prevent this, Action 6 suggests that a specific LOB rule and a general anti-avoidance rule, the PPT rule, be included in the OECD Model Convention. The LOB rule regulates in which specific situations a treaty benefit can be granted, whereas the PPT rule is more general and contains broad wordings to cover situations that are difficult to foresee. The purpose of this thesis is to examine whether the proposed rules can fulfil the purpose of Action 6, namely to prevent treaty abuse. If not, it will be determined whether Sweden can prevent treaty abuse by applying the general anti-avoidance rule against tax evasion stated in the Swedish skatteflyktslagen. The authors of the thesis are of the opinion that the LOB rule is too complex in its current wording, which makes it difficult to apply. Since the PPT rule is vague, there is a wide scope for arbitrary assessments, which leads to an unpredictable outcome when the guidance is unclear. Therefore, the authors find that the rules in their current wording cannot satisfy the purpose of Action 6. According to the Peru judgement, the skatteflyktslagen can in principle be applied to arrangements that are covered by a tax treaty; the arrangement should then be tested against the general anti-avoidance rule. The fourth prerequisite, contravention of the purpose of the legislation, is however difficult to apply, and the judgements of the courts vary. The authors believe that if the treaty preamble includes a statement clarifying that the intention of the treaty is not to allow treaty abuse, the Swedish general anti-avoidance rule can counteract such arrangements and thereby protect the Swedish tax base against erosion through treaty abuse.
44

Walks, Transitions and Geometric Distances in Graphs / Marches, Transitions et Distances Géométriques dans les Graphes

Bellitto, Thomas 27 August 2018
This thesis studies combinatorial, algorithmic and complexity aspects of graph theory problems, and especially of problems related to the notions of walks, transitions and distances in graphs. We first study the problem of traffic monitoring, in which we have to place as few sensors as possible on the arcs of a graph in order to be able to retrace the walks of objects. The characterization of instances of practical interest brings us to the notion of forbidden transitions, which strengthens the graph model. Our work on forbidden-transition graphs also includes the study of connecting transition sets, which can be seen as the analogue, in terms of transitions, of spanning trees. A large part of this thesis focuses on geometric graphs, which are graphs whose vertices are points of real space and whose edges are determined by the geometric distances between the vertices. These graphs are at the core of the famous Hadwiger-Nelson problem and are of great help in our study of the density of sets avoiding distance 1 in various normed spaces. We develop new tools to study these problems and use them to prove the Bachoc-Robins conjecture on several parallelohedra. We also investigate the case of the Euclidean plane and improve the bounds on the density of sets avoiding distance 1 and on its fractional chromatic number. Finally, we study the complexity of graph homomorphism problems and establish dichotomy theorems for the complexity of locally-injective homomorphisms to reflexive tournaments.
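The forbidden-transition model mentioned above can be made concrete with a small sketch: a breadth-first search whose state is the pair (vertex, arc used to enter it), so that each step can be checked against a set of forbidden transitions. The data layout and function name below are illustrative assumptions, not the algorithms of the thesis.

```python
from collections import deque

def compatible_walk(adj, forbidden, start, goal):
    """adj: dict vertex -> iterable of neighbours; forbidden: set of ((u, v), (v, w)) transitions.
    Returns a walk from start to goal that avoids all forbidden transitions, or None."""
    queue = deque([(start, None, [start])])      # (current vertex, arc used to enter it, walk so far)
    seen = {(start, None)}
    while queue:
        v, in_arc, walk = queue.popleft()
        if v == goal:
            return walk
        for w in adj.get(v, ()):
            out_arc = (v, w)
            if in_arc is not None and (in_arc, out_arc) in forbidden:
                continue                         # this turn is not allowed
            state = (w, out_arc)
            if state not in seen:
                seen.add(state)
                queue.append((w, out_arc, walk + [w]))
    return None
```

Expanding states over (vertex, entering-arc) pairs is a standard way to reduce walk questions in forbidden-transition graphs to ordinary reachability; the combinatorial difficulties studied in the thesis (connecting transition sets, complexity results) go well beyond this basic search.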
46

Representações de gestores e profissionais sobre o trabalho da rede de enfrentamento ao tráfico de pessoas com fins de exploração sexual em Sergipe / Representations of managers and professionals regarding the work of the network combating trafficking in persons for sexual exploitation in Sergipe

Lisboa, Lucivânia de Oliveira 26 February 2015
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Trafficking in human beings has grown to enormous proportions in recent decades, driven by socio-economic inequalities combined with inequalities of gender and race/ethnicity, which are structuring dimensions of the phenomenon. From a gender perspective, this research analyses the representations of managers and professionals regarding the work of the Rede de Enfrentamento ao Tráfico de Pessoas para fins de Exploração Sexual in Sergipe (the network for combating trafficking in persons for sexual exploitation), pointing out tendencies and the challenges faced in everyday practice. It is guided by the historical-dialectical method, which seeks to understand trafficking in persons for sexual exploitation beyond its phenomenal expression, highlighting the singular and universal aspects as well as the particularities of the object. Different sources of information were used: bibliographies, documents, and oral sources obtained through semi-structured interviews with women and men, managers and professionals of the executing institutions, along the axes of promotion, defence and accountability. The results show, among other issues, the fragility and lack of articulation of intersectoral management in the actions and bodies of the government departments, a lack of communication between different policies, the hierarchisation of services, and a lack of knowledge of the legislation and of studies on the phenomenon.
47

Lateral Control of Heavy Vehicles / Sidostyrning av tunga fordon

Jawahar, Aravind, Palla, Lokesh January 2023
Over the past decade, the automotive industry has been rapidly working to make vehicles autonomous to varying levels. In the commercial vehicle market in particular, there is a significant need to give trucks a certain level of automation in order to reduce dependence on human driving effort, which could help prevent many accidents caused by human error. There are, however, several challenges and solutions involved in achieving and implementing autonomous driving for trucks. First, a benchmark of different control architectures that can make a truck drive autonomously is explored. The chosen controllers (Pure Pursuit, Stanley, Linear Quadratic Regulator, Sliding Mode Control and Model Predictive Control) vary in their simplicity of implementation and their versatility in handling different vehicle parameters and constraints. A thorough comparison of these path-tracking controllers is performed using several metrics. Second, a collision avoidance system based on cubic polynomials, inspired by the rapidly exploring random tree (RRT), is presented. Some of the path-tracking controllers are limited in their capabilities, and hence a standalone collision avoidance system is needed to provide safe manoeuvring. Simulations are performed for different test cases with and without obstacles. These simulations help compare the safety margin and driving comfort of each path-tracking controller when integrated with the collision avoidance system. Third, different performance metrics are presented, such as the change in acceleration input, the change in steering input, the path-tracking error, the deviation from the base frame of the track file, and the lateral and longitudinal margins between the ego and target vehicles. To conclude, a set of suitable controllers for heavy articulated vehicles is developed and benchmarked.
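Of the benchmarked controllers, Pure Pursuit is the simplest to state, and the Python sketch below shows the kinematic steering law it is built on. The wheelbase and look-ahead distance are illustrative assumptions rather than the thesis parameters, and a real tractor-trailer controller would use the articulated-vehicle model and constraints discussed in the thesis.

```python
import numpy as np

def pure_pursuit_steering(pose, path, wheelbase=3.8, lookahead=8.0):
    """Kinematic pure-pursuit law. pose = (x, y, yaw) in map frame; path = Nx2 array of waypoints.
    Returns the front-wheel steering angle in radians."""
    x, y, yaw = pose
    d = np.hypot(path[:, 0] - x, path[:, 1] - y)
    # pick the first waypoint at least one look-ahead distance away (or the last one)
    idx = np.argmax(d >= lookahead) if np.any(d >= lookahead) else len(path) - 1
    tx, ty = path[idx]
    alpha = np.arctan2(ty - y, tx - x) - yaw          # heading error to the look-ahead point
    return np.arctan2(2.0 * wheelbase * np.sin(alpha), lookahead)
```

The law simply aims the vehicle along a circular arc through the look-ahead point; its lack of dynamics is one reason the abstract pairs such controllers with a separate collision avoidance layer.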
48

Constructing a psycho-social model for team cohesion at a financial institution

Moerane, Elias Mochabo 06 1900
The purpose of the study was to construct a psycho-social model for team cohesion at a financial institution. The financial institution had been in existence for 127 years, and had faced significant challenges throughout its history of acquisitions and mergers to establish working teams that would give it a competitive edge in global financial markets. The research objective was to develop a psycho-social model for team cohesion by investigating the interrelationships and overall relationships amongst the independent constructs (self-worth, personality preferences and conflict resolution styles) and the relevant outcome (team cohesion). Furthermore, the study also scientifically tested the possible moderating effect of the employees’ socio-demographic characteristics (race, gender, age, level of education, job level and tenure) on the fostering of team cohesiveness. A quantitative cross-sectional survey design approach was selected and applied to a simple probability sample (N = 463) using standardised, valid and reliable measuring instruments. The population consisted of permanent employees, and the results revealed significant relationships between the construct variables. The canonical correlation indicated a significant overall relationship between the contingencies of self-worth domains, personality preferences and conflict resolution styles, and the team cohesion-related dispositions of cohesiveness and engaged. The structural equation modelling indicated a good fit of the data between the individuals’ contingencies of self-worth domains (family support, God’s love, virtues, competition, work competence, physical appearance and pleasing others), the accommodating conflict resolution style, an extraversion personality preference, and team cohesion. Hierarchical moderated regression showed that race, age, educational level and job tenure significantly moderated the relationship between the participants’ psycho-social attributes and team cohesion. Tests for significant mean differences revealed significant differences in terms of the socio-biographical variables. On a theoretical level, the study deepened understanding of the antecedent constructs (self-worth, personality preferences and conflict resolution styles) and the team cohesion construct. On an empirical level, the study produced an empirically tested psycho-social model for team cohesion. This study will add significant practical, valuable knowledge to the organisation in managing the future establishment and enhancement of team cohesion, and when integrating new team members into the environment during organisational restructuring and re-alignment after acquisitions and mergers, without negatively affecting organisational effectiveness. These findings invariably provided new insight into managing and understanding inherent interpersonal conflict among employees in the workplace and the enhancement of team cohesion practices, thus adding to the existing body of knowledge in the fields of Consulting Psychology and Industrial and Organisational Psychology, more specifically in financial organisations. / Psychology / D. Phil. (Consulting Psychology)
49

Statistical physics of constraint satisfaction problems

Lamouchi, Elyes 10 1900
The replica trick is a powerful analytic technique originating from statistical physics as a way to compute the expectation of the logarithm of the normalization constant of a high-dimensional probability distribution known as the Gibbs measure. In physics jargon this quantity is known as the free energy, and all kinds of useful quantities, such as the entropy, can be obtained from it using simple derivatives. The computation of this normalization constant is, however, an NP-hard problem that a large part of computational statistics attempts to deal with, and which shows up everywhere from coding theory to high-dimensional statistics, compressed sensing, protein-folding analysis and constraint satisfaction problems. In each of these cases, the replica trick, and its extension by (Parisi et al., 1987), have proven incredibly successful at shedding light on key aspects of the correlation structure of the Gibbs measure and of the highly non-convex nature of its negative logarithm. Algorithmically speaking, there exist two main methodologies addressing the intractability of the normalization constant: a) Statics: in this approach, one casts the system as a graphical model whose vertices represent individual variables and whose edges reflect the dependencies between them. When the underlying graph is locally tree-like, local message-passing procedures are guaranteed to yield near-exact marginal probabilities or, equivalently, to compute the normalization constant. The physics prediction of vanishing long-range correlations in the Gibbs measure then translates into the associated graph being locally tree-like, hence permitting the use of message-passing procedures. This is the focus of chapter 4. b) Dynamics: in an orthogonal direction, we can altogether bypass the issue of computing the normalization constant by defining a Markov chain along which sampling converges to the Gibbs measure, such that after a number of iterations known as the relaxation time, samples are guaranteed to be approximately distributed according to it. To understand the conditions in which each of the two approaches is likely to fail (strong long-range correlations, high energy barriers, etc.), it is very helpful to be familiar with the so-called replica symmetry breaking picture of Parisi. The computations involved are, however, quite intricate, and come with a number of prescriptions and prerequisite notions (such as large deviation principles and saddle-point approximations) that are typically foreign to those without a statistical physics background. The purpose of this thesis is then twofold: i) to provide a self-contained introduction to replica theory, its predictions, and its algorithmic implications for constraint satisfaction problems, and ii) to give an account of state-of-the-art methods for addressing the predicted phase transitions in the case of k-SAT, from both the statics and the dynamics points of view, and to propose a new algorithm that takes these into consideration.
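As a minimal illustration of the "dynamics" route (b) described above, the sketch below runs a Metropolis chain targeting the Gibbs measure exp(-β · number of violated clauses) of a random 3-SAT instance. The instance generator, the inverse temperature and the problem sizes are illustrative assumptions; near the phase transitions predicted by the replica analysis, this is exactly the regime in which such a chain suffers long relaxation times.

```python
import numpy as np

def violated(clauses, x):
    """Number of clauses not satisfied by the assignment x (values in {0, 1})."""
    count = 0
    for clause in clauses:                       # clause: list of signed, 1-indexed literals
        if not any((x[abs(l) - 1] == 1) == (l > 0) for l in clause):
            count += 1
    return count

def metropolis_ksat(n=50, alpha=3.0, beta=2.0, steps=20000, seed=0):
    """Metropolis sampling from exp(-beta * energy) for a random 3-SAT instance with m = alpha*n clauses."""
    rng = np.random.default_rng(seed)
    m = int(alpha * n)
    clauses = [[int(v + 1) * (1 if rng.random() < 0.5 else -1)
                for v in rng.choice(n, 3, replace=False)] for _ in range(m)]
    x = rng.integers(0, 2, n)
    energy = violated(clauses, x)
    for _ in range(steps):
        i = rng.integers(n)
        x[i] ^= 1                                # propose a single variable flip
        new_energy = violated(clauses, x)
        if rng.random() < np.exp(-beta * (new_energy - energy)):
            energy = new_energy                  # accept the move
        else:
            x[i] ^= 1                            # reject: undo the flip
    return x, energy
```

Recomputing the full energy at every step keeps the sketch short; an efficient sampler would update only the clauses containing the flipped variable, and the statics route of the thesis replaces sampling altogether with message passing on the factor graph.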
50

Counter Revolutionary Programs: Social Catholicism and the Cristeros

Newcomer, Daniel 20 April 2011
No description available.
