• About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world.
231

Matematický popis trajektorie pohybu vozidla / Mathematical description of vehicle motion trajectory

Lorenczyk, Jiří January 2020 (has links)
The goal of this thesis is to find types of curves which allow for the construction of a path that could be traversed by a vehicle. A minimal constraint for such a path is the continuity of the curve's curvature. This leads to a closer look at three types of curves: clothoids, which are able to smoothly connect straight segments with arcs of constant curvature; interpolation quintic splines, which are C2-smooth at the interpolation nodes; and -splines, which also belong to the family of quintic polynomial curves but are characterised by a vector of parameters that modifies the shape of the curve. The thesis is accompanied by an application allowing manual construction of a path composed of spline curves.
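The C2 requirement on interpolation quintic splines can be illustrated with a small sketch: a single quintic segment on [0, 1] is fully determined by position, first derivative and second derivative at both endpoints, so chaining segments that share these boundary values yields a curvature-continuous path. This is a generic illustration of quintic Hermite interpolation, not the thesis's own construction; the function names are ours, but the closed-form coefficients are the standard ones.

```python
def quintic_coeffs(p0, v0, a0, p1, v1, a1):
    """Coefficients c[0..5] of p(t) = sum c[i] * t**i on [0, 1] matching
    position, first and second derivative at both ends (C2 continuity)."""
    c0, c1, c2 = p0, v0, a0 / 2.0
    A = p1 - c0 - c1 - c2        # residual end position
    B = v1 - c1 - 2.0 * c2       # residual end velocity
    C = a1 - 2.0 * c2            # residual end acceleration
    return [c0, c1, c2,
            10.0 * A - 4.0 * B + 0.5 * C,
            -15.0 * A + 7.0 * B - C,
            6.0 * A - 3.0 * B + 0.5 * C]

def eval_poly(c, t):
    """Evaluate a polynomial with coefficients c[0..n] at t (Horner scheme)."""
    r = 0.0
    for coeff in reversed(c):
        r = r * t + coeff
    return r
```

With zero boundary derivatives the segment reduces to the classic smoothstep quintic 10t³ - 15t⁴ + 6t⁵, which is a quick sanity check on the formulas.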
232

Sestavení technologie součásti "cage" ve firmě CCI Brno / Technology assembling of a part "cage" in a company CCI

Tkáčová, Alena January 2011 (has links)
This thesis focuses on the technology of manufacturing a "cage" component at the CCI Brno company, specifically the drilling of holes on its outside diameter. The first part introduces the company CCI; the second part analyses the component and its material. The next part analyses the present manufacturing procedure and proposes a change to it. The thesis closes with a techno-economic comparison of the present and proposed technologies.
233

A Generalized Acceptance Urn Model

Wagner, Kevin P 05 April 2010 (has links)
An urn contains two types of balls: p "+t" balls and m "-s" balls, where t and s are positive real numbers. The balls are drawn from the urn uniformly at random without replacement until the urn is empty. Before each ball is drawn, the player decides whether to accept the ball or not. If the player opts to accept the ball, then the payoff is the weight of the ball drawn, gaining t dollars if a "+t" ball is drawn, or losing s dollars if a "-s" ball is drawn. We wish to maximize the expected gain for the player. We find that the optimal acceptance policies are similar to that of the original acceptance urn of Chen et al. with s=t=1. We show that the expected gain function also shares similar properties to those shown in that work, and note the important properties that have geometric interpretations. We then calculate the expected gain for the urns with t/s rational, using various methods, including rotation and reflection. For the case when t/s is irrational, we use rational approximation to calculate the expected gain. We then give the asymptotic value of the expected gain under various conditions. The problem of minimal gain is then considered, which is a version of the ballot problem. We then consider a Bayesian approach for the general urn, for which the number of balls n is known while the number of "+t" balls, p, is unknown. We find formulas for the expected gain for the random acceptance urn when the urns with n balls are distributed uniformly, and find the asymptotic value of the expected gain for any s and t. Finally, we discuss the probability of ruin when an optimal strategy is used for the (m,p;s,t) urn, solving the problem with s=t=1. We also show that in general, when the initial capital is large, ruin is unlikely. We then examine the same problem with the random version of the urn, solving the problem with s=t=1 and an initial prior distribution of the urns containing n balls that is uniform.
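The optimal acceptance policy described above can be sketched as a dynamic program over the remaining ball counts: before each draw, the player compares the expected value of accepting the next (random) ball with that of rejecting it. This is a minimal numerical illustration of the urn's value function, not the closed-form analysis of the thesis; the function name and interface are ours.

```python
from functools import lru_cache

def expected_gain(p, m, t=1.0, s=1.0):
    """Optimal expected gain for the (m, p; s, t) acceptance urn, computed
    by recursion over the remaining "+t" (p_rem) and "-s" (m_rem) counts."""
    @lru_cache(maxsize=None)
    def V(p_rem, m_rem):
        if p_rem == 0 and m_rem == 0:
            return 0.0
        pr_plus = p_rem / (p_rem + m_rem)   # chance the next ball is a "+t"
        cont_plus = V(p_rem - 1, m_rem) if p_rem else 0.0
        cont_minus = V(p_rem, m_rem - 1) if m_rem else 0.0
        # decide before seeing the ball: accept it, or let it pass unseen
        accept = pr_plus * (t + cont_plus) + (1 - pr_plus) * (-s + cont_minus)
        reject = pr_plus * cont_plus + (1 - pr_plus) * cont_minus
        return max(accept, reject)
    return V(p, m)
```

For the (1, 1; 1, 1) urn this recursion gives an expected gain of 0.5, matching the behaviour described for the original s = t = 1 urn of Chen et al.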
234

Pre-Straining Operation : Prediction of Strain Paths Within a Forming Limit Diagram

Olofsson, Elin, Al-Fadhli, Mohammed January 2022 (has links)
In Sheet Metal Forming (SMF), complex geometries are commonly produced in multi-stage forming processes. When a blank is formed, major and minor straining occur; following the straining of the blank elements over the forming process yields their strain paths. These strain paths can be visualized in a Forming Limit Diagram (FLD) together with a Forming Limit Curve (FLC) for the strained material, in which it is determined whether an element is critical with respect to fracture or necking. The FLD thus defines the formability of a material, but the elements and their paths are assumed to be linear. Manufacturing a sheet metal part in a multi-stage forming process produces Non-Linear Strain Paths (NLSP), for which the FLD is no longer valid. Given a tool from the company RISE IVF AB to be used for pre-straining operations, the objective of this thesis is to investigate and enhance the possibility of generating the three main strain paths (uniaxial tension, plane strain and equibiaxial tension) in the dual-phase steel DP800. The study is conducted in collaboration with Volvo Cars Body Components (VCBC) in Olofström, where the pre-straining will be used in a future study of non-linear SMF behaviour. This numerical study is carried out in the finite element software AutoForm, which is specialized in SMF operations. The three main strain paths were to be generated by modifying the blank geometries and the provided tooling: by changing the dimensions of the punch and the draw beads, critical regions and forced concentrated straining were expected to be achieved, with the intention of fulfilling the straining criterion in terms of magnitude and gradient. The simulations show that the modifications affect the straining level and the gradient differently. The modifications of the draw bead and the punch were not of significant use, while the blank dimension was the most important factor in generating sufficient strain paths. Hence, the tooling modifications in this thesis did not enhance the prediction of the three strain paths.
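On proportional paths, the three main strain paths correspond to standard ratios beta = minor strain / major strain: roughly -0.5 for uniaxial tension, 0 for plane strain, and 1 for equibiaxial tension. A minimal sketch of classifying a simulated strain point by this ratio follows; the tolerance value and function name are our assumptions, not taken from the thesis.

```python
def classify_strain_path(major, minor, tol=0.1):
    """Classify a (major, minor) strain point by the path ratio
    beta = minor / major, using the standard proportional-path values:
    uniaxial tension beta = -0.5, plane strain beta = 0, equibiaxial beta = 1."""
    if major <= 0:
        raise ValueError("major strain must be positive on an FLD path")
    beta = minor / major
    targets = {"uniaxial": -0.5, "plane strain": 0.0, "equibiaxial": 1.0}
    # pick the nearest canonical path; fall back when none is within tolerance
    name, b = min(targets.items(), key=lambda kv: abs(kv[1] - beta))
    return name if abs(b - beta) <= tol else "intermediate"
```

In a real FLD study the ratio would be tracked per element over the forming increments, so a path whose beta drifts between classes is exactly the non-linear case the abstract describes.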
235

Challenging the Link Between Early Childhood Television Exposure and Later Attention Problems: A Multiverse Approach

McBee, Matthew T., Brand, Rebecca J., Dixon, Wallace E. 01 April 2021 (has links)
In 2004, Christakis and colleagues published findings that he and others used to argue for a link between early childhood television exposure and later attention problems, a claim that continues to be frequently promoted by the popular media. Using the same National Longitudinal Survey of Youth 1979 data set (N = 2,108), we conducted two multiverse analyses to examine whether the finding reported by Christakis and colleagues was robust to different analytic choices. We evaluated 848 models, including logistic regression models, linear regression models, and two forms of propensity-score analysis. If the claim were true, we would expect most of the justifiable analyses to produce significant results in the predicted direction. However, only 166 models (19.6%) yielded a statistically significant relationship, and most of these employed questionable analytic choices. We concluded that these data do not provide compelling evidence of a harmful effect of TV exposure on attention.
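The multiverse logic above — run every justifiable specification and count how many reach significance — can be sketched with a toy example. The sketch below uses simple linear regression with a normal approximation to the slope's t-test, and a handful of subsample rules as "specifications"; it is a generic illustration, not the authors' 848-model analysis, and all names are ours.

```python
import math
import random

def slope_pvalue(xs, ys):
    """Two-sided p-value for the slope of a simple linear regression,
    using a normal approximation to the t statistic (fine for large n)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    rss = sum((y - my - b * (x - mx)) ** 2 for x, y in zip(xs, ys))
    se = math.sqrt(rss / (n - 2) / sxx)
    if se == 0.0:                      # perfect fit: slope exactly determined
        return 0.0
    z = abs(b / se)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def multiverse_share_significant(xs, ys, alpha=0.05):
    """Fit one regression per specification (here: subsample rules) and
    return the share of specifications that are significant at level alpha."""
    specs = {
        "full sample":  lambda i, x: True,
        "first half":   lambda i, x: i < len(xs) // 2,
        "second half":  lambda i, x: i >= len(xs) // 2,
        "moderate |x|": lambda i, x: abs(x) < 1.0,
    }
    pvals = []
    for keep in specs.values():
        sub = [(x, y) for i, (x, y) in enumerate(zip(xs, ys)) if keep(i, x)]
        sx, sy = zip(*sub)
        pvals.append(slope_pvalue(list(sx), list(sy)))
    return sum(p < alpha for p in pvals) / len(pvals)
```

The paper's point is the converse of this toy: when a true effect exists, most justifiable specifications agree; when only a small, analysis-dependent minority is significant, the evidence is weak.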
236

A test for non-Gaussian distributions on the Johannesburg Stock Exchange and its implications for forecasting models based on historical growth rates

Corker, Lloyd A January 2002 (has links)
Masters of Commerce / If share price fluctuations follow a simple random walk, then forecasting models based on historical growth rates have little ability to forecast share price movements over a given period. The simple random walk description of share price dynamics arises when a large number of investors have equal probability to buy or sell based on their own opinion. This description of the stock market is in essence the Efficient Market Hypothesis (EMH). The EMH is the central concept around which financial modelling is built, including the Black-Scholes model and other important theoretical underpinnings of capital market theory such as mean-variance portfolio selection, arbitrage pricing theory (APT), the security market line and the capital asset pricing model (CAPM). These theories, which postulate that risk can be reduced to zero, set the foundation for option pricing and are a key component in financial software packages used for pricing and forecasting in the financial industry. The Black-Scholes model and the other models mentioned above are Gaussian, i.e. they exhibit a random nature. This Gaussian property, together with the existence of expected returns and continuous time paths (also Gaussian properties), allows the use of stochastic calculus to solve complex Black-Scholes models. However, if the markets are not Gaussian, then the idea that risk can be reduced to zero can lead to a misleading and potentially disastrous sense of security in the financial markets. This study tests the null hypothesis that share prices on the JSE follow a random walk, using graphical techniques such as symmetry plots and quantile-quantile plots to analyse the test distributions. Both graphical techniques yielded evidence for the rejection of normality. Evidence leading to the rejection of the hypothesis was also found through nonparametric (distribution-free) methods, the Anderson-Darling and runs tests, at a 1% level of significance.
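One of the nonparametric checks mentioned above, the runs test, is easy to sketch: count the runs of observations above and below the median and compare that count with what randomness would predict. Below is a minimal implementation under the usual normal approximation; it illustrates the test generically and is not the thesis's exact procedure.

```python
import math
import statistics

def runs_test_pvalue(xs):
    """Wald-Wolfowitz runs test for randomness about the median.
    Returns a two-sided p-value under the normal approximation."""
    med = statistics.median(xs)
    signs = [x > med for x in xs if x != med]   # drop ties with the median
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
    mean = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    z = (runs - mean) / math.sqrt(var)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
```

Applied to a return series, a small p-value rejects randomness: too few runs indicates trending (clustering of signs), too many indicates mean reversion.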
237

Compétences et pratiques langagières en situation transculturelle : parcours langagiers des enfants bilingues soninké-français / Skills and language practices in transcultural situations : language learning paths of soninke-french bilingual children

Camara, Hawa 17 November 2014 (has links)
Migrant children in France grow up in a bilingual or multilingual environment and are often considered allophones. Children's relationship to their languages changes over the course of their development, but few master both languages fully. We focused on children growing up in migrant families speaking Soninke, a West African language of oral tradition. Using the ELAL d'Avicenne, a tool created and developed by a multidisciplinary team of the child and adolescent psychopathology department at Avicenne hospital, we assessed children aged 4 to 6 in Mauritania (Soninke) and in France (French and Soninke). Linking these language assessments to the children's family histories and their parents' migration histories allowed us to identify factors involved in the transmission and acquisition of the Soninke language in France, and to show the importance of bilingualism, whatever its degree, in transcultural situations.
238

Infeasible Path Detection : a Formal Model and an Algorithm / Détection de chemins infaisables : un modèle formel et un algorithme

Aïssat, Romain 30 January 2017 (has links)
White-box, path-based testing is widely used for program validation. Given the control-flow graph (CFG) of the program under test, a test suite is generated by selecting a collection of paths of interest, then trying to provide, for each path, concrete input values that will make the program follow that path during a run. For the first step, there are various ways to define the paths of interest: structural testing methods select a set of paths that fulfills coverage criteria related to elements of the graph; in random-based techniques, paths are drawn according to a given probability distribution over these elements (for instance, a uniform probability over all paths of length less than a given bound). Both approaches can be combined, as in structural statistical testing. The random-based methods have the advantage of providing a way to assess the quality of a test set as the minimal probability of covering an element of the criterion. The second step requires producing, for each path, its path predicate, i.e. the conjunction of the constraints over the input parameters that must hold for the system to run along that path. This is done using symbolic execution; constraint solving is then used to compute the test data. If there are no input values for which the path predicate evaluates to true, the path is infeasible. It is very common for a program to have infeasible paths, and such paths can largely outnumber the feasible ones. Infeasible paths selected during the first step do not contribute to the final test suite, and there is no better choice than to select another path, hoping for its feasibility. Handling infeasible paths is a serious limitation of structural methods, since most of the time is spent selecting useless paths. It is also a major challenge for all static analysis techniques, since the quality of the approximations they provide is lowered by data computed along paths that do not correspond to actual program runs. To overcome this problem, different methods have been proposed, such as concolic testing or random testing based on the input domain. In path-biased random testing, paths are drawn according to a given distribution and their feasibility is checked in a second step. We present an algorithm that builds a better approximation of the behavior of a program than its CFG, producing a transformed CFG that still over-approximates the set of feasible paths but contains fewer infeasible ones. Paths are then drawn at random in this transformed graph. We modeled our graph transformations and formally proved, using the interactive theorem prover Isabelle/HOL, the key properties that establish the correctness of our approach. Our algorithm uses symbolic execution and constraint solving to detect whether some paths are infeasible. Since programs can contain loops, their graphs can contain cycles; to avoid following a cyclic path indefinitely, we enrich symbolic execution with the detection of subsumptions. A subsumption can be interpreted as the fact that some node met during the analysis is a particular case of another node met previously: there is no need to explore the successors of the subsumed node, as they are subsumed by the successors of the subsumer. Our algorithm has been implemented in a prototype whose design closely follows the formalization, giving a good level of confidence in its correctness. In this thesis, we introduce the theoretical concepts on which our approach relies, its formalization in Isabelle/HOL, the algorithms our prototype implements, and the various experiments done and results obtained with it.
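The core notion of an infeasible path can be illustrated on a toy CFG: a path is infeasible when the conjunction of its branch guards is unsatisfiable. In the sketch below, brute force over a small finite input domain stands in for the constraint solving used in the thesis; the two-branch program, the guard names and the domain are invented for illustration.

```python
from itertools import product

# Toy program with two successive branches on one integer input x:
#   branch 1: edge "A" if x > 0, edge "B" otherwise
#   branch 2: edge "C" if x < 0, edge "D" otherwise
GUARDS = {
    "A": lambda x: x > 0,
    "B": lambda x: x <= 0,
    "C": lambda x: x < 0,
    "D": lambda x: x >= 0,
}

def feasible(path, domain=range(-5, 6)):
    """A path is feasible iff some input satisfies every guard along it.
    Brute force over a finite domain stands in for constraint solving."""
    return any(all(GUARDS[edge](x) for edge in path) for x in domain)

all_paths = list(product("AB", "CD"))
feasible_paths = [p for p in all_paths if feasible(p)]
```

Of the four syntactic paths in this toy CFG, only three are feasible: the path A then C would require x > 0 and x < 0 simultaneously, which is exactly the kind of path a transformed CFG should no longer contain.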
239

Kostant principal filtration and paths in weight lattice / Filtration principale de Kostant et chemins dans des réseaux de poids

Kusumastuti, Nilamsari 24 October 2019 (has links)
Several interesting filtrations on the Cartan subalgebra of a complex simple Lie algebra arise from very different contexts: the principal filtration coming from the Langlands dual; one coming from the Clifford algebra associated with a non-degenerate invariant bilinear form; one coming from the symmetric algebra and the Chevalley projection; and two more coming from the enveloping algebra and the Harish-Chandra projections. It is known that all these filtrations coincide; this results from a combination of works of Rohr, Joseph and Alekseev-Moreau. The remarkable connection between the principal filtration and the Clifford filtration was essentially conjectured by Kostant. The purpose of this thesis is to give a new proof of the equality between the enveloping filtration and the symmetric filtration for a simple Lie algebra of type A or C. Together with Rohr's result and the Alekseev-Moreau theorem, this provides another proof of Kostant's conjecture for these types, that is, a new proof of Joseph's theorem. Our proof is very different from his approach. The starting point is an explicit description of the invariants via the standard representation, which is possible in types A and C. We then describe the images of their differentials under the generalised Chevalley and Harish-Chandra projections in terms of combinatorial objects, called weighted paths, in the crystal graph of the standard representation. The proofs for types A and C are quite similar, but new phenomena appear in type C, which makes the proof much more delicate in that case.
240

Algorithmes de recherche d'itinéraires en transport multimodal / Shortest path Algorithms in multimodal transportation

Gueye, Fallou 14 December 2010 (has links)
This thesis deals with urban passenger transportation in a multimodal context, in which several modes of transport coexist. In practice, a multimodal transportation problem requires taking into account several objectives and specific constraints related to the modes or to the sequence of modes used; such constraints are called viability constraints. This CIFRE thesis was carried out in collaboration with MobiGIS, a company specialized in consulting and in the development of applications based on Geographic Information Systems. The problem studied is the bi-objective point-to-point multimodal viable shortest path problem, which minimizes both the total travel time and the number of mode changes. Given the objectives considered, this problem has polynomial complexity. On the basis of a multi-layered model of the multimodal transportation network and of a finite-state-automaton model of the viability constraints, we propose several algorithms for solving this problem, based on the principle of label setting and extension. We also propose a dominance rule based on the states of the viability automaton, which prunes the number of labels explored by our algorithms; bidirectional and A*-based variants are also proposed. The algorithms were evaluated on a part of the transportation network of the city of Toulouse, and the experiments demonstrate the interest of the state-based dominance rule and of the bidirectional approach. A software prototype implementing different features of the shortest path algorithms has been developed; it notably enables point-to-point route calculations, accessibility computations and the calculation of origin-destination matrices.
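The bi-objective label-setting idea — extend labels carrying (travel time, mode changes) and discard dominated ones — can be sketched on a toy network. This is a generic illustration, not MobiGIS's implementation nor the thesis's viability-automaton algorithm; the network, mode names and interface are invented.

```python
import heapq
from itertools import count

def pareto_routes(edges, source, target):
    """Bi-objective label-setting search minimizing (travel time, mode changes).
    edges: dict node -> list of (neighbor, travel_time, mode).
    Returns the sorted Pareto-optimal (time, changes) pairs at target."""
    tiebreak = count()                 # keeps heap comparisons on numbers only
    pq = [(0.0, 0, next(tiebreak), source, None)]
    settled = {}                       # (node, last_mode) -> nondominated labels
    best = []                          # Pareto front at the target
    while pq:
        time, chg, _, node, mode = heapq.heappop(pq)
        key = (node, mode)
        if any(t <= time and c <= chg for t, c in settled.get(key, [])):
            continue                   # dominated by an already-settled label
        settled.setdefault(key, []).append((time, chg))
        if node == target:
            if not any(t <= time and c <= chg for t, c in best):
                best = [(t, c) for t, c in best if not (time <= t and chg <= c)]
                best.append((time, chg))
            continue                   # no need to extend past the target
        for nxt, dt, m in edges.get(node, []):
            extra = 1 if (mode is not None and m != mode) else 0
            heapq.heappush(pq, (time + dt, chg + extra, next(tiebreak), nxt, m))
    return sorted(best)
```

On a network where a fast route needs a bus-to-metro transfer and a slower one stays on the bus, the search returns both labels, since neither dominates the other on (time, changes).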
