271

Incompletude e auto-organização: sobre a determinação de verdades lógicas e matemáticas

Tassinari, Ricardo Pereira 12 December 2003
Orientador: Itala Maria Loffredo D'Ottaviano / Tese (doutorado) - Universidade Estadual de Campinas, Instituto de Filosofia e Ciências Humanas / Resumo: Os Teoremas da Incompletude de Gödel têm sido, recorrentemente, citados nos estudos sobre auto-organização, como propiciando exemplos de processos não-mecânicos e verdadeiramente auto-organizados. Um dos fundamentos desses estudos está relacionado às análises que afirmam que os resultados obtidos por Gödel, associados à Tese/Definição de Church sobre calculabilidade, implicam na impossibilidade de uma modelagem mecânica completa de processos relativos à cognição humana. Dois desses processos que podem ser citados como auto-organizados, e cuja não-mecanicidade decorreria dos teoremas de Gödel, seriam os processos de determinação de fórmulas verdadeiras de teorias aritméticas de primeira ordem e de determinação de fórmulas verdadeiras de lógicas de ordens superiores, já que existem resultados lógico-matemáticos de incompletude desses sistemas formais. O objetivo central desta Tese consiste em analisar esses processos de determinação de verdades aritméticas e de verdades de lógicas de ordens superiores, a partir de uma análise dos resultados decorrentes dos teoremas de Gödel e da Teoria da Auto-Organização de Debrun, para mostrar que eles constituem processos não-mecânicos, segundo a acepção da Tese/Definição de Church, e auto-organizados, segundo Debrun. Apresentamos, preliminarmente, uma demonstração cuidadosa do Segundo Teorema da Incompletude de Gödel e uma introdução à Teoria da Auto-Organização de Debrun; bem como realizamos uma análise detalhada de como os resultados obtidos a partir do Segundo Teorema de Gödel permitem concluir que existem processos não-mecânicos, no sentido da Tese/Definição de Church, por argumentos distintos dos utilizados em alguns trabalhos da literatura. Mostramos que sempre existe um sistema formal cujo conjunto de teoremas é exatamente o conjunto de fórmulas determinadas como verdadeiras por qualquer função recursiva parcial que simule a capacidade humana de determinação de verdades aritméticas de primeira ordem e de verdades de lógicas de ordens superiores, enquanto, segundo o Segundo Teorema da Incompletude de Gödel, não existem sistemas formais cujos teoremas sejam todas as fórmulas que conseguimos identificar como verdadeiras / Abstract: Gödel's Incompleteness Theorems have been recurrently mentioned in studies on self-organization as providing examples of non-mechanical and truly self-organized processes. One of the foundations of these studies is related to analyses asserting that Gödel's results, associated with Church's Thesis/Definition of calculability, imply the impossibility of a complete mechanical modeling of processes related to human cognition. Two of these processes that can be mentioned as self-organized, and whose non-mechanicity would follow from Gödel's theorems, are the process of determining the true formulae of first-order arithmetical theories and the process of determining the true formulae of higher-order logics, since there are logical-mathematical results on the incompleteness of these formal systems.
The central aim of this Thesis is to analyze these processes of determination of first-order arithmetical truths and higher-order logical truths, based on an analysis of the results following from Gödel's theorems and on Debrun's Self-Organization Theory, in order to show that these processes are non-mechanical, in the sense of Church's Thesis/Definition, and self-organized, in Debrun's sense. Preliminarily, we present a careful proof of Gödel's Second Incompleteness Theorem and an introduction to Debrun's Self-Organization Theory, and we analyze in detail how the results obtained from Gödel's theorems allow us to conclude that non-mechanical processes exist, in the sense of Church's Thesis/Definition, via arguments distinct from those used in some works in the literature. We show that there is always a formal system whose set of theorems is exactly the set of formulae determined as true by any partial recursive function that simulates the human capability of determining first-order arithmetical truths and higher-order logical truths, whereas, by Gödel's Second Incompleteness Theorem, there is no formal system whose theorems are all the formulae that we can identify as true / Doutorado / Doutor em Filosofia
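As context for the record above: the standard textbook statement of the Second Incompleteness Theorem on which the abstract's argument rests (not quoted from the thesis) can be put as follows.

```latex
% Standard statement of Gödel's Second Incompleteness Theorem
% (textbook form, not quoted from the thesis above).
\begin{theorem}[Second Incompleteness Theorem]
Let $T$ be a consistent, recursively axiomatizable theory that
interprets elementary arithmetic, and let $\mathrm{Con}(T)$ be the
arithmetical sentence canonically expressing the consistency of $T$.
Then $T \nvdash \mathrm{Con}(T)$; that is, $T$ cannot prove its own
consistency.
\end{theorem}
```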
272

Hierarquias de sistemas de dedução natural e de sistemas de tableaux analíticos para os sistemas Cn de da Costa

Castro, Milton Augustinis de 29 June 2004
Orientador: Itala Maria Loffredo D'Ottaviano / Tese (doutorado) - Universidade Estadual de Campinas, Instituto de Filosofia e Ciências Humanas / Doutorado / Filosofia / Doutor em Filosofia
273

[en] TRAFFIC CONTROL THROUGH FUZZY LOGIC AND NEURAL NETWORKS / [pt] CONTROLE DE SEMÁFOROS POR LÓGICA FUZZY E REDES NEURAIS

ALEXANDRE ROBERTO RENTERIA 17 June 2002
[pt] Este trabalho apresenta a utilização de lógica fuzzy e de redes neurais no desenvolvimento de um controlador de semáforos - o FUNNCON. O trabalho realizado consiste em quatro etapas principais: estudo dos fundamentos de engenharia de tráfego; definição de uma metodologia para a avaliação de cruzamentos sinalizados; definição do modelo do controlador proposto; e implementação com dados reais em um estudo de caso. O estudo sobre os fundamentos de engenharia de tráfego aborda a definição de termos, os parâmetros utilizados na descrição dos fluxos de tráfego, os tipos de cruzamentos e seus semáforos, os sistemas de controle de tráfego mais utilizados e as diversas medidas de desempenho. Para se efetuar a análise dos resultados do FUNNCON, é definida uma metodologia para a avaliação de controladores. Apresenta-se, também, uma investigação sobre simuladores de tráfego existentes, de modo a permitir a escolha do mais adequado para o presente estudo. A definição do modelo do FUNNCON compreende uma descrição geral dos diversos módulos que o compõem. Em seguida, cada um destes módulos é estudado separadamente: o uso de redes neurais para a predição de tráfego futuro; a elaboração de um banco de cenários ótimos através de um otimizador; e a criação de regras fuzzy a partir deste banco. No estudo de caso, o FUNNCON é implementado com dados reais fornecidos pela CET-Rio em um cruzamento do Rio de Janeiro e comparado com o controlador existente. É constatado que redes neurais são capazes de fornecer bons resultados na predição do tráfego futuro. Também pode ser observado que as regras fuzzy criadas a partir do banco de cenários ótimos proporcionam um controle efetivo do tráfego no cruzamento estudado. Uma comparação entre o desempenho do FUNNCON e o do sistema atualmente em operação é amplamente favorável ao primeiro. / [en] This work presents the use of fuzzy logic and neural networks in the development of a traffic signal controller - FUNNCON. The work consists of four main stages: study of traffic engineering fundamentals; definition of a methodology for the evaluation of signalized intersections; definition of the proposed controller model; and implementation on a case study using real data. The study of traffic engineering fundamentals covers definitions of terms, the parameters used for traffic flow description, types of intersections and their traffic signals, commonly used traffic control systems, and performance measures. In order to analyse the results provided by FUNNCON, a methodology for the evaluation of controllers is defined. Existing traffic simulators are also investigated, in order to select the most suitable one for the present study. The definition of the FUNNCON model includes a general description of its modules. Thereafter, each module is studied separately: the use of neural networks for future traffic prediction; the construction of a database of optimal scenarios using an optimizer; and the extraction of fuzzy rules from this database. In the case study, FUNNCON is implemented with real data supplied by CET-Rio for an intersection in Rio de Janeiro, and its performance is compared with that of the existing controller. It is observed that neural networks can give good results in the prediction of future traffic, and that the fuzzy rules created from the database of optimal scenarios lead to effective traffic control at the intersection studied. A comparison between the performance of FUNNCON and that of the system currently in operation is strongly in favour of the former.
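The record does not reproduce FUNNCON's rule base; as a minimal sketch of the kind of Mamdani-style fuzzy inference such a controller performs, the Python fragment below maps queue length and arrival rate to a green-time extension. All membership functions, rule outputs, and ranges are invented for illustration and are not taken from the thesis.

```python
# Minimal Mamdani-style fuzzy inference sketch for a traffic signal.
# Illustrative only: memberships, rules and ranges are invented,
# not taken from FUNNCON.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def green_extension(queue, arrivals):
    # Fuzzify inputs (queue in vehicles, arrivals in veh/min);
    # left shoulders are approximated by feet just below zero.
    q_short, q_long = tri(queue, -1, 0, 10), tri(queue, 5, 15, 16)
    a_low, a_high = tri(arrivals, -1, 0, 12), tri(arrivals, 6, 18, 19)

    # Rule base: each rule yields (firing strength, output seconds).
    rules = [
        (min(q_long, a_high), 20.0),   # long queue, heavy arrivals
        (min(q_long, a_low), 10.0),    # long queue, light arrivals
        (min(q_short, a_high), 8.0),   # short queue, heavy arrivals
        (min(q_short, a_low), 0.0),    # short queue, light arrivals
    ]
    # Defuzzify with a weighted average of rule outputs.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(green_extension(queue=12, arrivals=10))  # ~16.7 s extension
```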
274

[en] ALFRED TARSKI: LOGICAL CONSEQUENCE, LOGICAL NOTIONS, AND LOGICAL FORMS / [pt] ALFRED TARSKI: CONSEQÜÊNCIA LÓGICA, NOÇÕES LÓGICAS E FORMAS LÓGICAS

STEFANO DOMINGUES STIVAL 17 September 2004
[pt] O tema da presente dissertação é o problema da demarcação entre os termos lógicos e extralógicos no âmbito das ciências formais, anunciado primeiramente por Alfred Tarski em seu artigo de 1936, On the Concept of Logical Consequence. Depois de expor e discutir o problema em questão, mostrando seu surgimento a partir da necessidade de uma definição materialmente adequada do conceito de conseqüência lógica, analisamos a solução formulada por Tarski em um artigo publicado postumamente, intitulado What Are Logical Notions? Algumas discussões subsidiárias, igualmente importantes para o trabalho como um todo, dizem respeito à concepção dos conceitos de modelo e interpretação que se podem depreender dos artigos supracitados, e de como ela difere da assim chamada concepção standard em teoria de modelos. Nosso objetivo principal é mostrar o lugar ocupado pelo conceito de forma lógica na obra de Tarski, e de como sua concepção acerca deste conceito implica uma visão ampliada do conceito de conseqüência lógica, cuja caracterização correta torna necessária a estratificação das formas lógicas numa hierarquia de tipos. / [en] The subject of this dissertation is the problem of demarcation between the logical and extra-logical terms of formal languages, as formulated for the first time by Tarski in his 1936 paper On the Concept of Logical Consequence. After presenting and discussing the demarcation problem, pointing out how it arises from the need for a materially adequate definition of the concept of logical consequence, we analyze the solution presented by Tarski in his posthumously published paper, entitled What Are Logical Notions? Some subsidiary issues, which are also important for the work as a whole, concern the conception of model and interpretation that springs from the two papers mentioned, and how this conception differs from the standard conception in model theory. Our main goal is to show the place occupied by the concept of logical form in Tarski's work, and how his conception of it implies a broader view of the related concept of logical consequence, whose correct characterization requires the stratification of logical forms into a hierarchy of types.
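For reference, the definition at the heart of the 1936 paper discussed above can be stated in one line (paraphrased from On the Concept of Logical Consequence, not quoted from the dissertation):

```latex
% Tarski's model-theoretic definition of logical consequence (1936):
\[
  K \models X
  \quad\Longleftrightarrow\quad
  \text{every model of the class of sentences } K
  \text{ is also a model of the sentence } X.
\]
```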
275

[pt] DESENVOLVIMENTO DE MÉTODO MULTIRESÍDUO EM GC-MS/MS E LÓGICA FUZZY NO ESTUDO DE PAHS, PCBS, PBDES E PESTICIDAS ORGANOCLORADOS E SUA APLICAÇÃO EM SISTEMAS COSTEIROS DO BRASIL / [en] MULTIRESIDUE METHOD DEVELOPMENT IN GC-MS/MS AND FUZZY LOGIC IN THE STUDY OF PAHS, PCBS, PBDES AND ORGANOCHLORINE PESTICIDES AND THEIR APPLICATION IN COASTAL SYSTEMS IN BRAZIL

LEONARDO GRIPP BOM AMORIM 31 August 2023
[pt] O atual trabalho teve como objetivo o desenvolvimento de um método multiresíduo e sua aplicação em testemunhos sedimentares para uma análise geocronológica da contaminação da Baía de Sepetiba por quatro classes de contaminantes orgânicos: (i) hidrocarbonetos policíclicos aromáticos (HPAs), (ii) bifenilos policlorados (PCBs); (iii) éteres difenílicos polibromados (PBDEs) e (iv) pesticidas organoclorados (OCPs). Além dos analitos contemplados no método desenvolvido, foram analisados outros parâmetros (HPAs alquilados, biomarcadores de petróleo, hidrocarbonetos alifáticos, carbono orgânico total e nitrogênio total) para que fosse possível uma análise mais aprofundada das fontes e da origem da MO e de compostos associados ao petróleo. Os testemunhos sedimentares foram coletados entre julho e outubro de 2016 e foram datados pela análise da atividade do decaimento de 210Pb e validados usando perfis de Zn e Cd. O método desenvolvido é baseado na extração ultrassônica com uma mistura de diclorometano:metanol (9:1 v/v) e cromatografia gasosa acoplada com análise de espectrometria de massa em tandem (GC-MS/MS) no modo de monitoramento de reações múltiplas (MRM). Um total de 89 compostos, dentre HPAs, PCBs, OCPs e PBDEs, foram identificados usando dois padrões de íon produto/precursor para cada analito. O limite de detecção do método (MDL; 0,001 – 0,055 ng g-1) e o limite de quantificação do método (MQL; 0,002 – 0,184 ng g-1) estão abaixo dos níveis de poluição aceitáveis adotados pelas diretrizes internacionais de qualidade de sedimentos. O método, que foi publicado na revista Analytical and Bioanalytical Chemistry em maio de 2022, mostrou-se seletivo, sensível, preciso e linear, com a vantagem de reduzir o tempo de manuseio da amostra e a quantidade utilizada de materiais como solventes e adsorventes. A fim de analisar de forma mais aprofundada as fontes e origens dos HPAs nos testemunhos analisados, foi aplicada uma análise de dados baseada na Lógica Fuzzy, mais especificamente no algoritmo Fuzzy C-Means (FCM). Para que tivéssemos um melhor embasamento teórico da aplicação dessa ferramenta, um artigo foi publicado na revista Ocean and Coastal Research em fevereiro de 2022, com base na reavaliação estatística dos dados de dois trabalhos sobre contaminação do sedimento de baías costeiras por HPAs. Resultados obtidos por meio de ferramentas de avaliação tradicionais foram comparados com aqueles obtidos por FCM, que apresentaram maior detalhamento qualitativo. Embora a Lógica Fuzzy não produza interpretações quantitativas, sua aplicação gera dados adequados para que se evite uma inferência enviesada de fontes de contaminação de HPAs. A aplicação do método na análise dos testemunhos sedimentares foi bem-sucedida e a maioria das camadas apresentou valores de OCPs e PBDEs abaixo do MQL. Os valores de PCBs variaram de <MQL a 1,47 ng g-1 (média de 0,24 ± 0,38 ng g-1) em T18 e de 0,35 a 6,24 ng g-1 (média de 1,82 ± 1,37 ng g-1) em T26. Os HPAs variaram de 67,97 a 404,61 ng g-1 (média de 192,38 ± 98,48 ng g-1) em T18 e de 154,59 a 685,47 ng g-1 (média de 382,55 ± 168,45 ng g-1) em T26. / [en] The present work aimed at developing a multiresidue method and applying it to sediment cores for a geochronological analysis of the contamination of Sepetiba Bay by four classes of organic contaminants: (i) polycyclic aromatic hydrocarbons (PAHs), (ii) polychlorinated biphenyls (PCBs); (iii) polybrominated diphenyl ethers (PBDEs) and (iv) organochlorine pesticides (OCPs).
In addition to the analytes contemplated in the developed method, other parameters were analyzed (alkylated PAHs, petroleum biomarkers, aliphatic hydrocarbons, total organic carbon and total nitrogen) to allow a more in-depth analysis of the sources and origin of organic matter and compounds associated with petroleum. Sediment cores were collected between July and October 2016 by Gonçalvez et al. (2020) and were dated by analysis of 210Pb decay activity and validated using Zn and Cd profiles. The developed method is based on ultrasonic extraction with a mixture of dichloromethane:methanol (9:1 v/v) and gas chromatography coupled with tandem mass spectrometry (GC-MS/MS) in multiple reaction monitoring (MRM) mode. A total of 89 compounds, among PAHs, PCBs, OCPs, and PBDEs, were identified using two product/precursor ion transitions for each analyte. The method detection limit (MDL; 0.001 - 0.055 ng g-1) and method quantification limit (MQL; 0.002 - 0.184 ng g-1) are below the acceptable pollution levels adopted by international sediment quality guidelines. The method, which was published in the journal Analytical and Bioanalytical Chemistry in May 2022, was shown to be selective, sensitive, accurate, and linear, with the advantage of reducing sample handling time and the amount of materials used, such as solvents and adsorbents. In order to further analyze the sources and origins of PAHs in the analyzed cores, we applied a data analysis based on Fuzzy Logic, more specifically the Fuzzy C-Means (FCM) algorithm. In order to have a better theoretical basis for the application of this tool, an article was published in the journal Ocean and Coastal Research in February 2022, based on the statistical re-evaluation of data from two studies on sediment contamination of coastal bays by PAHs. Results obtained through traditional assessment tools were compared with those obtained by FCM, which showed greater qualitative detail. Although Fuzzy Logic does not produce quantitative interpretations, its application generates adequate data to avoid biased inference of PAH contamination sources. The application of the method to the analysis of the sediment cores was successful, and most layers showed OCP and PBDE values below the MQL. PCB values ranged from <MQL to 1.47 ng g-1 (mean 0.24 ± 0.38 ng g-1) in T18 and from 0.35 to 6.24 ng g-1 (mean 1.82 ± 1.37 ng g-1) in T26. PAHs ranged from 67.97 to 404.61 ng g-1 (mean 192.38 ± 98.48 ng g-1) in T18 and from 154.59 to 685.47 ng g-1 (mean 382.55 ± 168.45 ng g-1) in T26.
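The abstract names the Fuzzy C-Means algorithm without giving it; below is a generic textbook FCM sketch in Python (NumPy) with the standard membership and center updates. The toy data, number of clusters, and fuzzifier m are illustrative assumptions, not the thesis' actual settings.

```python
# Generic Fuzzy C-Means (FCM): alternate center and membership
# updates until the membership matrix stabilizes.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """X: (n_samples, n_features). Returns (memberships U, centers V)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                       # columns sum to 1
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)     # centers
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)
        D = np.fmax(D, 1e-12)                # avoid division by zero
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U_new = 1.0 / ((D[:, None, :] / D[None, :, :])
                       ** (2.0 / (m - 1))).sum(axis=1)
        if np.abs(U_new - U).max() < tol:
            return U_new, V
        U = U_new
    return U, V

# Toy usage: two well-separated blobs.
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5])
U, V = fuzzy_c_means(X, c=2)
print(U.argmax(axis=0))  # hard labels recovered from fuzzy memberships
```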
276

Verification of Hybrid Systems using Satisfiability Modulo Theories

Mover, Sergio January 2014
Embedded systems are formed by hardware and software components that interact with the physical environment and thus may be modeled as Hybrid Systems. Due to the complexity of these systems, there is an increasing need for automatic techniques to support the design phase, ensuring that a system behaves as expected in all possible operating conditions. In this thesis, we propose novel techniques for the verification and validation of hybrid systems using Satisfiability Modulo Theories (SMT). SMT is an established technique that has been used successfully in many verification approaches, targeted at both hardware and software systems. The use of SMT to verify hybrid systems has been limited, due to the restricted support for complex continuous dynamics and the lack of scalability. The contribution of the thesis is twofold. First, we propose novel encoding techniques, which widen the applicability and improve the effectiveness of SMT-based approaches. Second, we propose novel SMT-based algorithms that improve the performance of existing state-of-the-art approaches. In particular, we show algorithms to solve problems such as invariant verification, scenario verification and parameter synthesis. The algorithms fully exploit the underlying structure of a network of hybrid systems and the functionalities of modern SMT solvers. We show and discuss the effectiveness of the proposed techniques when applied to benchmarks from the hybrid systems domain.
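As a taste of the kind of SMT encoding the abstract refers to, the sketch below checks bounded reachability of a bad state for a toy discretized thermostat using the z3 solver. The model, thresholds, and bound are invented for illustration; the thesis' actual encodings of continuous dynamics are considerably richer.

```python
# SMT-based bounded reachability for a toy hybrid-style system (z3).
# Thermostat: heater ON gains 1.5 deg/step, OFF loses 1.0 deg/step;
# the controller switches modes at thresholds 18 and 22.
from z3 import Real, Bool, Solver, Or, If, sat

K = 10                                     # unrolling bound
temp = [Real(f"temp_{i}") for i in range(K + 1)]
on = [Bool(f"on_{i}") for i in range(K + 1)]

s = Solver()
s.add(temp[0] == 20, on[0] == True)        # initial state
for i in range(K):
    # Mode-dependent dynamics (crude time discretization of the flow).
    s.add(temp[i + 1] == temp[i] + If(on[i], 1.5, -1.0))
    # Guarded mode switches.
    s.add(on[i + 1] == If(temp[i + 1] >= 22, False,
                          If(temp[i + 1] <= 18, True, on[i])))
# Bad state: temperature leaves the safe band [15, 25] within K steps.
s.add(Or(*[Or(temp[i] < 15, temp[i] > 25) for i in range(K + 1)]))
print("bad state reachable" if s.check() == sat
      else f"safe up to {K} steps")        # this toy model is safe
```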
277

Existential completion and pseudo-distributive laws: an algebraic approach to the completion of doctrines

Trotta, Davide 17 December 2019
The main purpose of this thesis is to combine the categorical approach to logic given by the study of doctrines with the universal-algebraic techniques given by the theory of pseudo-monads and pseudo-distributive laws. Every completion of doctrines is then formalized by a pseudo-monad, and combinations of completions are studied via the analysis of pseudo-distributive laws. The starting point is the work of Maietti and Rosolini, in which they describe three completions for elementary doctrines: the first adds full comprehensions, the second comprehensive diagonals, and the third quotients. We then determine the existential completion of a primary doctrine, and we prove that the 2-monad obtained from it is lax-idempotent and that the 2-category of existential doctrines is isomorphic to the 2-category of algebras for this 2-monad. We also show that the existential completion of an elementary doctrine is again elementary, and we extend the notion of exact completion of an elementary existential doctrine to an arbitrary elementary doctrine. Finally, we present the elementary completion for a primary doctrine whose base category has finite limits. In particular, we prove that, using a general result about unification for first-order languages, we can easily add finite limits to a syntactic category and then apply the elementary completion for syntactic doctrines. We conclude with a complete description of the elementary completion for a primary doctrine whose base category is the free product completion of a discrete category, and we show that the 2-monad constructed from the 2-adjunction is lax-idempotent.
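For orientation, the basic object the abstract manipulates (following the Maietti-Rosolini tradition it cites) is a doctrine, i.e. a contravariant poset-valued functor; a primary doctrine on a category C with finite products can be displayed as:

```latex
% A primary doctrine, in the style of Maietti and Rosolini:
\[
  P \colon \mathcal{C}^{\mathrm{op}} \longrightarrow \mathbf{InfSL},
\]
% assigning to each object A the inf-semilattice P(A) of predicates
% over A, and to each arrow f : B -> A the reindexing map
% P(f) = f^* : P(A) -> P(B).
```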
278

A Formal Foundation of FDI Design via Temporal Epistemic Logic

Gario, Marco January 2016
Autonomous systems must be able to detect and promptly react to faults. Fault Detection and Identification (FDI) components are in charge of detecting the occurrence of faults. The FDI depends on the concrete design of the system, needs to take into account how faults might interact, and can only have a partial view of the run-time state through sensors. For these reasons, the development of the FDI and the certification of its correctness and quality are difficult tasks. This difficulty is compounded by the fact that current approaches for verification of the FDI rely on manual inspection and testing. Motivated by industrial case studies from the European Space Agency, this thesis proposes a formal foundation for FDI design that covers specification, validation, verification, and synthesis. The Alarm Specification Language (ASLk) is introduced in order to formally capture a set of interesting and desirable properties of FDI components. ASLk is equipped with a semantics based on Temporal Epistemic Logic, thus enabling reasoning about partially observable systems. Automated reasoning techniques can then be applied to perform validation, verification, and synthesis of the FDI. This formal process guarantees that the generated FDI satisfies the designer's expectations. The problems deriving from this process were out of reach for existing model-checking techniques; therefore, we develop novel and efficient techniques for model checking temporal epistemic logic over infinite-state systems.
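The record does not spell out an ASLk property; a representative alarm-correctness/completeness pair of the kind temporal epistemic logic can express (schematic form, an illustrative assumption rather than the thesis' own definitions) is:

```latex
% Schematic alarm properties in temporal epistemic logic, where G is
% "globally", O is "once in the past", and K is the knowledge modality
% of the FDI (relative to its observations). Illustrative only.
\[
  G\bigl(A \rightarrow K\,O\,f\bigr)
  \quad\text{(correctness: the alarm $A$ is raised only if the FDI knows the fault $f$ occurred)}
\]
\[
  G\bigl(K\,O\,f \rightarrow A\bigr)
  \quad\text{(completeness: whenever the fault is certain from the observations, the alarm is raised)}
\]
```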
279

Planning and Scheduling in Temporally Uncertain Domains

Micheli, Andrea January 2016
Any form of model-based reasoning is limited by the adherence of the model to the actual reality. Scheduling is the problem of finding a suitable timing to execute a given set of activities while accommodating complex temporal constraints. Planning is the problem of finding a strategy for an agent to achieve a desired goal, given a formal model of the system and the environment it is immersed in. When time and temporal constraints are considered, the problem takes the name of temporal planning. A common assumption in existing techniques for planning and scheduling is the controllability of activities: the agent is assumed to be able to control the timing of the start and end of each activity. In several practical applications, however, the actual timing of actions is not under the direct control of the plan executor. In this thesis, we focus on this temporal uncertainty issue in scheduling and in temporal planning: we propose to natively express temporal uncertainty in the model used for reasoning. We first analyze the state of the art on the subject, presenting a rationalization of existing works. Second, we show how Satisfiability Modulo Theories (SMT) solvers can be exploited to quickly solve different kinds of queries in the realm of scheduling under uncertainty. Finally, we address the problem of temporal planning in domains featuring real-time constraints and actions whose durations are not under the control of the planning agent.
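One way to make the abstract's SMT connection concrete: strong controllability of a temporal network with uncertain durations is naturally a quantified SMT problem (schematic form, with notation simplified from the literature rather than taken from the thesis):

```latex
% Strong controllability, schematically: fix the controllable time
% points t_c so that the constraints Gamma hold for every admissible
% choice of the uncontrollable durations d_i.
\[
  \exists\, \vec{t}_c\;\forall\, \vec{d}\;
  \Bigl( \bigwedge_i d_i \in [l_i, u_i]
         \;\rightarrow\; \Gamma(\vec{t}_c, \vec{d}) \Bigr)
\]
% Eliminating the universal quantifier yields a quantifier-free
% problem that an SMT solver can handle directly.
```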
280

An Effective SMT Engine for Formal Verification

Griggio, Alberto January 2009
Formal methods are becoming increasingly important for debugging and verifying hardware and software systems, whose current complexity makes the traditional approaches based on testing increasingly less adequate. One of the most promising research directions in formal verification is based on the exploitation of Satisfiability Modulo Theories (SMT) solvers. In this thesis, we present MathSAT, a modern, efficient SMT solver that provides several important functionalities and can be used as a workhorse engine in formal verification. We develop novel algorithms for two functionalities that are very important in verification -- the extraction of unsatisfiable cores and the generation of Craig interpolants in SMT -- which significantly advance the state of the art, taking full advantage of modern SMT techniques. Moreover, in order to demonstrate the usefulness and potential of SMT in verification, we develop a novel technique for software model checking that fully exploits the power and functionalities of the SMT engine, showing that this leads to significant improvements in performance.
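For readers unfamiliar with the second functionality mentioned above, the standard definition of a Craig interpolant in SMT-based verification (textbook form, not specific to MathSAT) is:

```latex
% Given a pair (A, B) of formulas with A and B jointly unsatisfiable,
% a Craig interpolant I satisfies:
\[
  A \models I, \qquad
  I \wedge B \models \bot, \qquad
  \mathrm{symbols}(I) \subseteq \mathrm{symbols}(A) \cap \mathrm{symbols}(B).
\]
```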
