  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
141

Efficient computational approach to identifying overlapping documents in large digital collections

Monostori, Krisztian, 1975- January 2002 (has links)
Abstract not available
142

Ergodic and Combinatorial Proofs of van der Waerden's Theorem

Rothlisberger, Matthew Samuel 01 January 2010 (has links)
This thesis follows two different proofs of van der Waerden's theorem and finds that the two proofs yield important information about arithmetic progressions and about the theorem itself. Van der Waerden's theorem explains the occurrence of arithmetic progressions, which has been used to explain phenomena such as the Bible Code.
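For reference (added to this listing, not part of the abstract), a standard statement of van der Waerden's theorem:

```latex
% Van der Waerden's theorem: for all k, r there is N = W(k, r) such that every
% r-colouring of {1, ..., N} contains a monochromatic arithmetic progression
% of length k.
\[
  \forall k, r \in \mathbb{N}\ \exists N = W(k,r):\quad
  \text{for every } c \colon \{1,\dots,N\} \to \{1,\dots,r\}\
  \exists\, a, d \in \mathbb{N}\ \text{with}\
  c(a) = c(a+d) = \cdots = c\bigl(a+(k-1)d\bigr).
\]
```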
143

Multi-tree algorithms for computational statistics and physics

March, William B. 20 September 2013 (has links)
The Fast Multipole Method of Greengard and Rokhlin does the seemingly impossible: it approximates the quadratic scaling N-body problem in linear time. The key is to avoid explicitly computing the interactions between all pairs of N points. Instead, by organizing the data in a space-partitioning tree, distant interactions are quickly and efficiently approximated. Similarly, dual-tree algorithms, which approximate or eliminate parts of a computation using distance bounds, are the fastest algorithms for several fundamental problems in statistics and machine learning -- including all nearest neighbors, kernel density estimation, and Euclidean minimum spanning tree construction.

We show that this overarching principle -- that by organizing points spatially, we can solve a seemingly quadratic problem in linear time -- can be generalized to problems involving interactions between sets of three or more points and can provide orders-of-magnitude speedups and guarantee runtimes that are asymptotically better than existing algorithms. We describe a family of algorithms, multi-tree algorithms, which can be viewed as generalizations of dual-tree algorithms. We support this thesis by developing and implementing multi-tree algorithms for two fundamental scientific applications: n-point correlation function estimation and Hartree-Fock theory.

First, we demonstrate multi-tree algorithms for n-point correlation function estimation. The n-point correlation functions are a family of fundamental spatial statistics and are widely used for understanding large-scale astronomical surveys, characterizing the properties of new materials at the microscopic level, and for segmenting and processing images. We present three new algorithms which will reduce the dependence of the computation on the size of the data, increase the resolution in the result without additional time, and allow probabilistic estimates independent of the problem size through sampling. We provide both empirical evidence to support our claim of massive speedups and a theoretical analysis showing linear scaling in the fundamental computational task. We demonstrate the impact of a carefully optimized base case on this computation and describe our distributed, scalable, open-source implementation of our algorithms.

Second, we explore multi-tree algorithms as a framework for understanding the bottleneck computation in Hartree-Fock theory, a fundamental model in computational chemistry. We analyze existing fast algorithms for this problem, and show how they fit in our multi-tree framework. We also show new multi-tree methods, demonstrate that they are competitive with existing methods, and provide the first rigorous guarantees for the runtimes of all of these methods. Our algorithms will appear as part of the PSI4 computational chemistry library.
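The abstract turns on one idea: organize points in space-partitioning trees and use distance bounds between tree nodes to prune or bulk-approximate whole blocks of pairwise interactions. The sketch below illustrates that pruning pattern for a dual-tree range count (counting pairs within distance h). It is an illustration only, under assumed names (`Node`, `dual_tree_count`), not the thesis's implementation.

```python
import numpy as np

class Node:
    """A node of a simple binary space-partitioning tree over a point set."""
    def __init__(self, points):
        self.points = points                      # (n, d) array owned by this node
        self.lo = points.min(axis=0)              # bounding-box corners
        self.hi = points.max(axis=0)
        self.left = self.right = None
        if len(points) > 16:                      # split on the widest dimension
            dim = np.argmax(self.hi - self.lo)
            order = np.argsort(points[:, dim])
            mid = len(points) // 2
            self.left = Node(points[order[:mid]])
            self.right = Node(points[order[mid:]])

def min_max_dist(a, b):
    """Lower and upper bounds on distances between any point of a and any point of b."""
    gap = np.maximum(0.0, np.maximum(a.lo - b.hi, b.lo - a.hi))
    span = np.maximum(a.hi - b.lo, b.hi - a.lo)
    return np.linalg.norm(gap), np.linalg.norm(span)

def dual_tree_count(a, b, h):
    """Count pairs (p in a, q in b) with ||p - q|| <= h, pruning whole node pairs."""
    dmin, dmax = min_max_dist(a, b)
    if dmin > h:                                  # every pair is too far: prune
        return 0
    if dmax <= h:                                 # every pair is within range: count in bulk
        return len(a.points) * len(b.points)
    if a.left is None and b.left is None:         # base case: brute force on small leaves
        d = np.linalg.norm(a.points[:, None, :] - b.points[None, :, :], axis=-1)
        return int((d <= h).sum())
    if a.left is None:                            # recurse on whichever node has children
        return dual_tree_count(a, b.left, h) + dual_tree_count(a, b.right, h)
    if b.left is None:
        return dual_tree_count(a.left, b, h) + dual_tree_count(a.right, b, h)
    return sum(dual_tree_count(x, y, h)
               for x in (a.left, a.right) for y in (b.left, b.right))

# Example: count close pairs between two random point sets.
rng = np.random.default_rng(0)
A, B = Node(rng.random((500, 3))), Node(rng.random((500, 3)))
print(dual_tree_count(A, B, h=0.1))
```

Multi-tree algorithms extend the same bound-and-prune recursion to tuples of three or more nodes, which is what makes n-point correlation estimation tractable.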
144

Additive structure, rich lines, and exponential set-expansion

Borenstein, Evan 19 May 2009 (has links)
We will survey some of the major directions of research in arithmetic combinatorics and their connections to other fields. We will then discuss three new results. The first result will generalize a structural theorem from Balog and Szemerédi. The second result will establish a new tool in incidence geometry, which should prove useful in attacking combinatorial estimates. The third result evolved from the famous sum-product problem, by providing a partial categorization of bivariate polynomial set functions which induce exponential expansion on all finite sets of real numbers.
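For context (an addition to this listing, not part of the abstract), the sum-product problem mentioned above asks how small the sumset and product set of a finite set of reals can simultaneously be; the Erdős–Szemerédi conjecture asserts that at least one of them is nearly quadratically large:

```latex
% Erdős–Szemerédi sum-product conjecture, with
% A + A = {a + b : a, b in A} and A.A = {ab : a, b in A}.
\[
  \max\bigl(|A+A|,\ |A \cdot A|\bigr) \;\ge\; c_\varepsilon\, |A|^{\,2-\varepsilon}
  \quad \text{for every } \varepsilon > 0 \text{ and every finite } A \subset \mathbb{R}.
\]
```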
145

Mixed order hyper-networks for function approximation and optimisation

Swingler, Kevin January 2016 (has links)
Many systems take inputs, which can be measured and sometimes controlled, and outputs, which can also be measured and which depend on the inputs. Taking numerous measurements from such systems produces data, which may be used either to model the system with the goal of predicting the output associated with a given input (function approximation, or regression) or to find the input settings required to produce a desired output (optimisation, or search). Approximating or optimising a function is central to the field of computational intelligence.

There are many existing methods for performing regression and optimisation based on samples of data, but they all have limitations. Multi-layer perceptrons (MLPs) are universal approximators, but they suffer from the black-box problem, which means their structure and the function they implement are opaque to the user. They also have a propensity to become trapped in local minima or large plateaux in the error function during learning. A regression method whose structure allows models to be compared, human knowledge to be extracted, optimisation searches to be guided and model complexity to be controlled is desirable. This thesis presents such a method.

This thesis presents a single framework for both regression and optimisation: the mixed order hyper network (MOHN). A MOHN implements a function f: {-1,1}^n → R to arbitrary precision. The structure of a MOHN makes explicit the ways in which input variables interact to determine the function output, which allows human insight and complexity control that are very difficult in neural networks with hidden units. The explicit structure representation also allows efficient algorithms for searching for an input pattern that leads to a desired output. A number of learning rules for estimating the weights from a sample of data are presented, along with a heuristic method for choosing which connections to include in a model. Several methods for searching a MOHN for inputs that lead to a desired output are compared.

Experiments compare a MOHN to an MLP on regression tasks. The MOHN is found to achieve a level of accuracy comparable to an MLP, but suffers less from local minima in the error function and shows less variance across multiple training trials. It is also easier to interpret and to combine into an ensemble. The trade-off between the fit of a model to its training data and its fit to an independent set of test data is shown to be easier to control in a MOHN than in an MLP. A MOHN is also compared to a number of existing optimisation methods, including estimation of distribution algorithms, genetic algorithms and simulated annealing. The MOHN is able to find optimal solutions in far fewer function evaluations than these methods on tasks selected from the literature.
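The abstract describes a MOHN as a model of f: {-1,1}^n → R whose variable interactions are explicit in its structure. A minimal sketch consistent with that description is a weighted sum of products over chosen subsets of inputs (Walsh-style terms), as below. The structure and the least-squares learning rule are assumptions made for illustration, and names such as `Mohn` are hypothetical; the thesis presents its own learning rules.

```python
import numpy as np

class Mohn:
    """Toy mixed-order hyper-network: f(x) = sum_S w_S * prod_{i in S} x_i for x in {-1, 1}^n."""
    def __init__(self, terms):
        # terms maps a tuple of input indices (one explicit connection) to its weight.
        self.terms = dict(terms)

    def __call__(self, x):
        return sum(w * np.prod([x[i] for i in idx]) for idx, w in self.terms.items())

    def fit_least_squares(self, X, y):
        # Estimate weights from data by ordinary least squares on the term products
        # (just one simple choice of learning rule, not necessarily the thesis's).
        Phi = np.array([[np.prod([row[i] for i in idx]) for idx in self.terms] for row in X])
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        self.terms = dict(zip(self.terms, w))

# Example: a bias term, all first-order terms and one explicit second-order interaction.
n = 4
structure = {(): 0.0, **{(i,): 0.0 for i in range(n)}, (0, 2): 0.0}
model = Mohn(structure)

rng = np.random.default_rng(1)
X = rng.choice([-1, 1], size=(200, n))
y = 0.5 + X[:, 1] - 2.0 * X[:, 0] * X[:, 2]       # a target with an explicit interaction
model.fit_least_squares(X, y)
print({k: round(v, 2) for k, v in model.terms.items()})
```

Because each weight is attached to a named subset of inputs, reading the fitted weights directly exposes which interactions matter, which is the interpretability advantage the abstract claims over hidden-unit networks.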
146

Tópicos em combinatória / Topics in combinatorics

Domingues, Deborah Pereira 16 August 2018 (has links)
Advisor: José Plínio de Oliveira Santos / Master's dissertation - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Abstract: This dissertation studies two important topics in combinatorics. The first is the Pólya Enumeration Theorem: chapter 2 gives a proof of this theorem using Burnside's Theorem, together with several of its applications. The second topic is the Theory of Partitions, of which the dissertation treats several objects of study. The first is Partition Analysis, a method used to find generating functions for various kinds of interesting partition functions; chapter 3 also covers some material on q-series, which are likewise related to generating functions. The second is the graphical method, which uses the Ferrers graphical representation of a partition; the concepts of the Durfee square and the Frobenius symbol are then used to prove some identities. / Master's degree in Mathematics
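Since the proof in the dissertation goes through Burnside's Theorem, its statement may be worth recalling (added here for reference, not quoted from the dissertation): the number of orbits of a finite group acting on a finite set equals the average number of fixed points.

```latex
% Burnside's lemma (orbit-counting theorem): for a finite group G acting on a
% finite set X, the number of orbits equals the average number of fixed points.
\[
  |X/G| \;=\; \frac{1}{|G|} \sum_{g \in G} \bigl|X^{g}\bigr|,
  \qquad X^{g} = \{\, x \in X : g \cdot x = x \,\}.
\]
```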
147

Análise combinatória e probabilidade para o ensino médio / Combinatorial analysis and probability for secondary school

Oliveira, Rígel Alves Rabelo de 29 May 2015 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / This work aims to build a solid theoretical foundation for Combinatorial Analysis and Probability at the secondary-school level and, by presenting the most common problems on these topics, to serve as a study guide both for students and for teachers who need guidance on how the content can best be presented. The examples were chosen from the main topics covered, and exhaustive repetition of similar problems was avoided, so the material is compact in its number of examples yet thorough: the problems are divided into groups, and for each group the various existing ways of solving them are presented. To solve the problems, the work relies only on the principles that underlie Combinatorial Analysis, almost entirely avoiding the ready-made formulas that so often hinder and confuse students (and some teachers) in their studies.
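As a tiny illustration of the "principles rather than formulas" approach the abstract describes (an example added for this listing, not taken from the thesis), a typical problem solved directly by the multiplicative principle:

```latex
% How many three-digit codes use pairwise distinct digits from 0-9?
% Multiplicative principle: choose each position in turn.
\[
  10 \times 9 \times 8 = 720
  \quad\text{(10 choices for the first digit, 9 for the second, 8 for the third).}
\]
```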
148

Ferramentas de contagem e o estudo de partições de inteiros / Counting tools and the study of integer partitions

Pereira, Emerson Campos 23 July 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / This work presents some counting tools, such as graphs and generating functions. It first discusses the basic principles of counting, namely the additive and multiplicative principles. Finally, it presents an algorithm that calculates the number of integer partitions and that intrinsically uses the idea of graphs.
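The algorithm mentioned in the abstract is not reproduced in this listing. For context, a standard way to compute the number of integer partitions p(n) is the dynamic program that mirrors the generating function ∏_{k≥1} 1/(1 − x^k); the sketch below is that standard method, not necessarily the one in the thesis.

```python
def partition_counts(n):
    """Return [p(0), p(1), ..., p(n)], the number of integer partitions of each m <= n."""
    # p[m] accumulates, part size by part size, the coefficient of x^m in
    # prod_{k=1..n} 1/(1 - x^k) -- the generating function for partitions.
    p = [0] * (n + 1)
    p[0] = 1                       # the empty partition
    for k in range(1, n + 1):      # allow parts of size k
        for m in range(k, n + 1):
            p[m] += p[m - k]       # add a part of size k to a partition of m - k
    return p

print(partition_counts(10))  # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30, 42]
```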
149

Análise combinatória e probabilidade: atividades pautadas com foco nos PCNs e no currículo da rede estadual / Combinatorial analysis and probability: activities focused on the PCNs and on the state-network curriculum

Oliveira, Gildo Gouveia de 02 March 2018 (has links)
This work aims to present, in detail, the teaching of Combinatorial Analysis and Probability, taking into account the National Curricular Parameters (PCN) and the Curricular Framework of the Sergipe state school network. We present questions formulated from these contents and drawn from news media, using mathematical concepts in problem situations, so that teachers can broaden and diversify their teaching strategies while considering the skills required by the National High School Examination (ENEM). / Itabaiana, SE
150

A modelagem matemática como metodologia para o estudo de análise combinatória / Mathematical modelling as a methodology for the study of combinatorial analysis

Brumano, Cleuza Eunice Pereira 31 October 2014 (has links)
This work addresses mathematical modelling as a teaching alternative, based on a conception that lets the educator seek interaction through mathematics contextualized in the students' reality and that prioritizes the students' own construction of knowledge. The mathematical content is thus studied starting from a topic suggested by the students, which fosters greater interaction between them and the teacher. In this context, the aim is to analyse the application of this strategy as an effective proposal to support the teaching of Combinatorial Analysis. The study points to the possibility of understanding this content based on the results obtained with a group of four second-year high-school students, using Mathematical Modelling as the methodology. A bibliographical survey was carried out to identify how modelling can be applied to the teaching of the subject. The research approach adopted was qualitative, and the data were collected through audio/video recordings and notes. The information obtained is organized as follows: first, the concepts of model, modelling and mathematical modelling are presented; then the teaching of Combinatorial Analysis is discussed; and finally the field research is presented, with comments and conclusions.
