641 |
Cálculo variacional: aspectos teóricos e aplicações / Flores, Ana Paula Ximenes [UNESP] 03 February 2011 (has links) (PDF)
The main purpose of this work is the study of the theory of the Calculus of Variations, with emphasis on the Euler equation, a necessary condition for a function to be an extremum of a functional. There is a large variety of problems, but in this work we treat problems with fixed boundaries, free final time, free final state, functionals depending on more than one function, and problems with certain types of constraints. Two problems from single-variable Calculus and an example of optimal control are studied to illustrate the applicability of the Calculus of Variations.
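As an illustration of the Euler equation emphasized above (the standard statement, not a result specific to this thesis): for a functional
\[ J[y] = \int_{x_0}^{x_1} F(x, y(x), y'(x)) \, dx, \qquad y(x_0) = y_0, \; y(x_1) = y_1, \]
a necessary condition for y to be an extremum of J is the Euler (Euler-Lagrange) equation
\[ \frac{\partial F}{\partial y} - \frac{d}{dx}\left( \frac{\partial F}{\partial y'} \right) = 0. \]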
|
642 |
Construção de gráficos de funções polinomiais de grau 3 / Bonjovani, Evandro Luís [UNESP] 25 September 2015 (has links) (PDF)
This work aims to offer High School students a guide for sketching graphs of third-degree polynomial functions using basic concepts of limits and derivatives, which are not usually covered in basic education. The proposed approach stems from the need to broaden the understanding of content that students use without exploring its origin, often memorizing techniques without really understanding them. To this end, we analyse the main definitions and theorems related to sequences, limits and derivatives; we then study the variation of functions and, finally, propose a guide for constructing the graphs.
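A worked example of the kind of analysis such a guide relies on (illustrative; not taken from the thesis): for
\[ f(x) = x^3 - 3x, \qquad f'(x) = 3x^2 - 3 = 3(x - 1)(x + 1), \qquad f''(x) = 6x, \]
the critical points are x = -1 (local maximum, f(-1) = 2) and x = 1 (local minimum, f(1) = -2), the inflection point is x = 0, and since f(x) tends to -\infty as x \to -\infty and to +\infty as x \to +\infty, these few values already determine the shape of the graph.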
|
643 |
O cálculo variacional e o problema da braquistócrona / Sousa Júnior, José Ribamar Alves de. January 2010 (has links)
Advisor: Suzinei Aparecida Siqueira Marconato / Committee: Renata Zotin Gomes de Oliveira / Committee: Sueli Mieko Tanaki Aki / Abstract: In this work we study the Brachistochrone problem in two different ways: through the theory of the Calculus of Variations for problems with fixed boundaries, and through the argument originally made by Johann Bernoulli, using concepts from Optics and Geometry. A computational simulation of the results obtained is also presented. / Master
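For reference, the standard formulation of the problem (not taken from the thesis): with the y-axis pointing downward and the bead released from rest at the origin, the descent time along a curve y(x) joining the two fixed endpoints is
\[ T[y] = \frac{1}{\sqrt{2g}} \int_0^{x_1} \sqrt{\frac{1 + y'(x)^2}{y(x)}} \, dx, \]
and the Euler equation for this functional is solved by the cycloid
\[ x(\theta) = a(\theta - \sin\theta), \qquad y(\theta) = a(1 - \cos\theta), \]
the curve of fastest descent.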
|
644 |
Transport optimal et ondelettes : nouveaux algorithmes et applications à l'image / Optimal transportation and wavelets : new algorithms and application to image / Henry, Morgane 08 April 2016 (has links)
Optimal transport has a growing number of applications, among them the one studied in this work, image interpolation. Despite this growth, the numerical resolution of optimal transport remains challenging, and the development of efficient algorithms is still an open problem, in particular for large images such as those found in fields like meteorology. We focus on the formulation of Benamou and Brenier, who placed the problem in a fluid-mechanics setting by adding a time dimension. Their formulation minimizes a functional over a constraint space that includes a zero-divergence condition, and existing algorithms project onto this space at each iteration. In this thesis, by contrast, we define and implement algorithms that work directly in that constraint space. Indeed, we show that the functional has better convexity properties there. To work in this space, we consider three representations of divergence-free vector fields. The first is a divergence-free wavelet basis; this formulation was implemented numerically with periodic wavelets and a gradient descent, leading to a slowly converging algorithm but validating the feasibility of the method. The second approach represents divergence-free vector fields by their stream function equipped with a lifting of the boundary conditions, and the third uses the Helmholtz-Hodge decomposition. We further show that, in one space dimension plus time, either of these last two representations reduces the problem to solving a minimal-surface-type equation on each level set of the potential, with appropriate Dirichlet boundary conditions. The functional is then minimized with the primal-dual algorithm for convex problems of Chambolle and Pock, which adapts easily to our formulations and parallelizes well, leading to a simple and efficient implementation. Finally, we demonstrate significant gains of our algorithms over the state of the art and their applicability to real-sized images.
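For context, the Benamou-Brenier formulation referred to above can be sketched as follows (a standard statement, not the thesis's own notation): to interpolate between densities \rho_0 and \rho_1 on a domain \Omega, one minimizes, over densities \rho(t, x) and momenta m(t, x) = \rho v,
\[ \int_0^1 \int_\Omega \frac{|m(t, x)|^2}{2 \, \rho(t, x)} \, dx \, dt \quad \text{subject to} \quad \partial_t \rho + \nabla_x \cdot m = 0, \qquad \rho(0, \cdot) = \rho_0, \quad \rho(1, \cdot) = \rho_1. \]
The continuity equation is precisely the zero-divergence condition on the space-time field (\rho, m), which is the constraint space in which the thesis proposes to work directly.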
|
645 |
Sémantique et implantation d'une extension de ML pour la preuve de programmes / Semantics and implementation of an extension of ML for proving programs / Lepigre, Rodolphe 18 July 2017 (has links)
In recent years, proof assistants have reached an impressive level of maturity. They have made it possible to certify complex programs such as compilers and even operating systems. Yet using a proof assistant requires highly specialised skills, very different from those required for everyday programming. To bridge this gap, we aim at designing an ML-style programming language with support for proofs of programs, combining in a single tool the flexibility of ML and the fine specification features of a proof assistant. In other words, the system should be suitable both for programming (in the strongly typed, functional sense) and for gradually increasing the level of guarantees met by programs, on an as-needed basis. We thus define and study a call-by-value language whose type system extends higher-order logic with an equality type over untyped programs, a dependent function type, classical logic and subtyping. The combination of call-by-value evaluation, dependent functions and classical logic is known to raise consistency issues. To ensure the correctness of the system (logical consistency and runtime safety), we design a theoretical framework based on Krivine's classical realisability. The construction of the model relies on an essential property linking the different levels of interpretation of types in a novel way. Finally, we demonstrate the expressive power of our system using our prototype implementation, by proving properties of standard programs such as the map function on lists or insertion sort.
|
646 |
An Inverse Lambda Calculus Algorithm for Natural Language Processing / January 2010 (has links)
abstract: Natural Language Processing is a subject that combines computer science and linguistics, aiming to provide computers with the ability to understand natural language and to develop a more intuitive human-computer interaction. The research community has developed ways to translate natural language to mathematical formalisms. It has not yet been shown, however, how to automatically translate different kinds of knowledge in English to distinct formal languages. Most recent work has the limitation that the translation method targets a specific formal language or is hard to generalize. In this research, I take a first step to overcome this difficulty and present two algorithms which take as input two lambda-calculus expressions G and H and compute a lambda-calculus expression F. The expression F returned by the first algorithm satisfies F@G=H and, in the case of the second algorithm, we obtain G@F=H. The lambda expressions represent the meanings of words and sentences. Each formal language that one desires to use with the algorithms must be defined in terms of the lambda calculus, and some additional concepts must be included. Given a sentence, its representation, and the representations of several words in the sentence, the algorithms can then be used to obtain the representations of the remaining words in that sentence. In this work, I define two languages and show examples of their use with the algorithms. The algorithms are illustrated along with soundness and completeness proofs, the latter with respect to typed lambda-calculus formulas up to the second order. These algorithms are a core part of a natural language semantics system that translates sentences from English to formulas in different formal languages. / Dissertation/Thesis / M.S. Computer Science 2010
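A hypothetical example, in the Montague style, of what the two algorithms compute (not taken from the thesis): suppose the sentence "John plays football" has representation H = plays(john, football) and the phrase "plays football" has representation G = \lambda x. plays(x, football). The first algorithm, given G and H, can return F = \lambda u. (u @ john), since then F@G = (\lambda x. plays(x, football)) @ john = H; the second algorithm, given G and H with G@F = H, returns F = john. In this way the meaning of the remaining word "John" is recovered from the meanings of the sentence and of the other words.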
|
647 |
Students' Ways of Thinking about Two-Variable Functions and Rate of Change in Space / January 2012 (has links)
abstract: This dissertation describes an investigation of four students' ways of thinking about functions of two variables and the rate of change of those two-variable functions. Most secondary, introductory algebra, pre-calculus, and first and second semester calculus courses do not require students to think about functions of more than one variable. Yet vector calculus, calculus on manifolds, linear algebra, and differential equations all rest upon the idea of functions of two (or more) variables. This dissertation contributes to understanding productive ways of thinking that can support students in thinking about functions of two or more variables as they describe complex systems with multiple interacting variables. The dissertation focuses on modeling the ways of thinking of four students who participated in a specific instructional sequence designed to explore the limits of their ways of thinking and, in turn, to develop a robust model that could explain, describe, and predict students' actions relative to specific tasks. The data were collected using a teaching experiment methodology, and the tasks within the teaching experiment leveraged quantitative reasoning and covariation as foundations for students developing a coherent understanding of two-variable functions and their rates of change. The findings of this study indicated that students' ways of thinking about two-variable functions could be characterized by focusing on their use of novice and/or expert shape thinking, and their ways of thinking about rate of change by focusing on their quantitative reasoning. The findings suggested that quantitative and covariational reasoning were foundational to a student's ability to generalize their understanding of a single-variable function to two or more variables, and their conception of rate of change to rate of change at a point in space. These results created a need to better understand how experts in the field, such as mathematicians and mathematics educators, think about multivariable functions and their rates of change. / Dissertation/Thesis / Ph.D. Mathematics 2012
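For reference, the mathematical object behind "rate of change at a point in space" (a standard definition, not taken from the dissertation): for a two-variable function f(x, y), the rate of change at a point (a, b) in the direction of a unit vector u = (u_1, u_2) is the directional derivative
\[ D_u f(a, b) = \nabla f(a, b) \cdot u = f_x(a, b) \, u_1 + f_y(a, b) \, u_2, \]
so that, unlike in the single-variable case, the rate of change at a point depends on the direction considered.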
|
648 |
Completeness and the ZX-calculus / Backens, Miriam K. January 2015 (has links)
Graphical languages offer intuitive and rigorous formalisms for quantum physics. They can be used to simplify expressions, derive equalities, and do computations. Yet in order to replace conventional formalisms, rigour alone is not sufficient: the new formalisms also need to have equivalent deductive power. This requirement is captured by the property of completeness, which means that any equality that can be derived using some standard formalism can also be derived graphically. In this thesis, I consider the ZX-calculus, a graphical language for pure state qubit quantum mechanics. I show that it is complete for pure state stabilizer quantum mechanics, so any problem within this fragment of quantum theory can be fully analysed using graphical methods. This includes questions of central importance in areas such as error-correcting codes or measurement-based quantum computation. Furthermore, I show that the ZX-calculus is complete for the single-qubit Clifford+T group, which is approximately universal: any single-qubit unitary can be approximated to arbitrary accuracy using only Clifford gates and the T-gate. In experimental realisations of quantum computers, operations have to be approximated using some such finite gate set. Therefore this result implies that a wide range of realistic scenarios in quantum computation can be analysed graphically without loss of deductive power. Lastly, I extend the use of rigorous graphical languages outside quantum theory to Spekkens' toy theory, a local hidden variable model that nevertheless exhibits some features commonly associated with quantum mechanics. The toy theory for the simplest possible underlying system closely resembles stabilizer quantum mechanics, which is non-local; it thus offers insights into the similarities and differences between classical and quantum theories. I develop a graphical calculus similar to the ZX-calculus that fully describes Spekkens' toy theory, and show that it is complete. Hence, stabilizer quantum mechanics and Spekkens' toy theory can be fully analysed and compared using graphical formalisms. Intuitive graphical languages can replace conventional formalisms for the analysis of many questions in quantum computation and foundations without loss of mathematical rigour or deductive power.
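For concreteness (standard definitions, not taken from the thesis), the single-qubit Clifford+T group mentioned above is generated by the gates
\[ H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad S = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}, \qquad T = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi/4} \end{pmatrix}, \]
with S = T^2; this finite gate set approximates any single-qubit unitary to arbitrary accuracy, which is the sense in which it is approximately universal.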
|
649 |
The noncommutative torus as a minimal submanifold of the noncommutative 3-sphere / Tiger Norkvist, Axel January 2018 (has links)
In this thesis an algebraic structure, called a real calculus, is used as a way to represent noncommutative manifolds in an algebraic setting. Several classical geometric concepts, such as metrics and affine connections, are defined for real calculi, and real calculus homomorphisms are introduced. These homomorphisms are then used to define embeddings of real calculi representing manifolds, and a notion of minimal embedding is introduced. The motivating example of the thesis is the noncommutative torus embedded into a localization of the noncommutative 3-sphere, where it is shown that the noncommutative torus is minimally embedded into the noncommutative 3-sphere for certain perturbations of the standard metric.
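For reference (a standard definition, not taken from the thesis), the noncommutative torus appearing here is the *-algebra generated by two unitaries U and V subject to the relation
\[ V U = e^{2\pi i \theta} \, U V, \qquad \theta \in \mathbb{R}, \]
which reduces to (a dense subalgebra of) the algebra of functions on the ordinary torus when \theta = 0; the noncommutative 3-sphere is a deformation of the 3-sphere in the same spirit.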
|
650 |
Avaliação do estado de saúde bucal de pacientes com fibrose cística / Chapper, Ana January 2010 (has links)
This cross-sectional study assessed the dental health of 36 cystic fibrosis (CF) patients. A questionnaire was applied to obtain information about self-care and other aspects that could influence the results. One trained and calibrated examiner evaluated the presence of visible plaque (VIP), gingival bleeding (GB), bleeding on probing (BOP), pocket depth (PD), clinical attachment loss (CAL), the presence of supragingival calculus, the dmf/DMF-T index modified by the presence of active white spots, and the presence of enamel defects. The results of the clinical variables were compared (P<0.05) according to two age groups and according to caries experience (CE), positive (+) or not (-), by Student's t test or the Mann-Whitney U test. Spearman's non-parametric test was used to identify possible correlations between findings. The results for the groups ≤ 12 years old and > 12 years old were: VIP (65.40±35.13% and 58.90±34.51%), GB (14.00±21.61% and 26.51±27.09%) and presence of calculus (21.59±28.96% and 11.97±9.29%), with no statistically significant differences between the groups. The mean PD values were 1.30±0.32 mm and 1.70±0.30 mm (P=0.00), and BOP was 3.89±9.92% and 7.72±20.04% (P=0.09). CAL, present only in the > 12 years old group, averaged 0.15±0.35 mm. The values of the modified dmf/DMF-T index were 1.74±3.33 / 1.20±1.74 and 5.14±4.26, respectively. The number of teeth with enamel defects per individual ranged from none to all teeth with signs, in both groups. The comparison of the CE+ and CE- groups revealed that VIP (70.52±30.92% and 52.00±37.50%) and GB (25.13±29.40% and 10.09±10.04%) did not differ significantly, whereas the percentages of dental calculus did (9.97±10.06% and 28.86±32.05%, respectively). Few significant correlations were observed among the findings. The presence of visible plaque (r=0.50, P=0.02) and a preference for sweet foods (r=0.48, P=0.02) correlated positively with caries experience in the primary dentition, as did a history of pain (r=0.48, P=0.03), which also correlated positively with caries experience in the permanent dentition (r=0.52, P=0.02). A positive correlation was also found between attachment loss and the patient's perception of previous gingival bleeding (r=0.51, P=0.00). The mother's and father's levels of education correlated negatively with caries experience in the CF children (r=-0.57; r=-0.71). Conclusions: In this sample, manifestations of caries were observed starting from incipient signs of mineral loss, and the caries experience was similar to that observed in the population of southern Brazil. In cystic fibrosis patients with caries activity, the removal of dental calculus for the treatment of gingivitis should not take priority in the planning of dental care; dental care should be directed at the prevention and treatment of caries.
|