  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

Topics in Soft Computing

Keukelaar, J. H. D. January 2002 (has links)
No description available.
182

Interval Neutrosophic Sets and Logic: Theory and Applications in Computing

Wang, Haibin 12 January 2006 (has links)
The neutrosophic set is part of neutrosophy, which studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra. The neutrosophic set is a powerful general formal framework that has been proposed recently. However, it needs to be specified from a technical point of view. Here, we define the set-theoretic operators on an instance of a neutrosophic set and call it an Interval Neutrosophic Set (INS). We prove various properties of INSs concerning operations and relations over them. We also introduce a new logic system based on interval neutrosophic sets, studying the interval neutrosophic propositional calculus and the interval neutrosophic predicate calculus, and we build a neutrosophic logic inference system based on interval neutrosophic logic. Within the interval neutrosophic set framework, we propose a data model, the Neutrosophic Data Model, based on a special case of interval neutrosophic sets. This data model extends the fuzzy data model and the paraconsistent data model. We generalize the set-theoretic and relation-theoretic operators of fuzzy relations and paraconsistent relations to neutrosophic relations, and propose generalized SQL query constructs and a tuple relational calculus for the Neutrosophic Data Model. We also design an architecture for a Semantic Web Services agent based on interval neutrosophic logic and conduct a simulation study.
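
As a concrete illustration of the interval neutrosophic set described above: each element carries interval-valued truth, indeterminacy, and falsity memberships, and set operators act on the interval endpoints. The Python sketch below is a minimal rendering under one common convention for union and complement; the operator definitions and the sample values are assumptions for the example, not taken from the dissertation.

    # Minimal sketch of interval neutrosophic set (INS) values, assuming one common
    # convention for the operators; the dissertation's exact definitions may differ.
    from dataclasses import dataclass
    from typing import Tuple

    Interval = Tuple[float, float]  # (inf, sup) with 0 <= inf <= sup <= 1

    @dataclass
    class INSValue:
        T: Interval  # interval truth-membership
        I: Interval  # interval indeterminacy-membership
        F: Interval  # interval falsity-membership

        def union(self, other: "INSValue") -> "INSValue":
            # Endpoint-wise: truth takes the max, indeterminacy and falsity take the min.
            return INSValue(
                T=(max(self.T[0], other.T[0]), max(self.T[1], other.T[1])),
                I=(min(self.I[0], other.I[0]), min(self.I[1], other.I[1])),
                F=(min(self.F[0], other.F[0]), min(self.F[1], other.F[1])),
            )

        def complement(self) -> "INSValue":
            # Swap truth and falsity; map the indeterminacy interval to [1 - sup, 1 - inf].
            return INSValue(T=self.F, I=(1 - self.I[1], 1 - self.I[0]), F=self.T)

    a = INSValue(T=(0.2, 0.4), I=(0.25, 0.5), F=(0.3, 0.6))
    b = INSValue(T=(0.5, 0.7), I=(0.0, 0.2), F=(0.1, 0.3))
    print(a.union(b))      # INSValue(T=(0.5, 0.7), I=(0.0, 0.2), F=(0.1, 0.3))
    print(a.complement())  # INSValue(T=(0.3, 0.6), I=(0.5, 0.75), F=(0.2, 0.4))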
183

The Dilemma of the Choice between Dolphin Sets and Non-Dolphin Sets: The Case of the Eastern Tropical Pacific Purse-Seine Fishery

Huang, Yi-ting 25 June 2010 (has links)
In the Eastern Tropical Pacific (ETP), purse-seine fishing operations can be divided into dolphin sets and non-dolphin sets. Dolphin sets catch more yellowfin tuna but cause many dolphin deaths; non-dolphin sets, by contrast, catch more skipjack tuna but discard large amounts of tuna. The choice of operation mode is therefore highly influential for the ETP purse-seine fishery. If fishermen use more dolphin sets, they can catch larger quantities of the more valuable yellowfin tuna. Conversely, if they use more non-dolphin sets, they do not harm dolphins, but the tuna caught are smaller and the discard rate is extremely high, which affects the fishermen's overall profits. This research investigates how fishermen choose an operation mode in the dilemma between dolphin sets, which cause dolphin bycatch, and non-dolphin sets, which result in tuna discards, and whether managers can ban dolphin sets or non-dolphin sets so as to maximize fishermen's profit while maintaining the ecological balance between tuna and dolphins.
184

Topics in Soft Computing

Keukelaar, J. H. D. January 2002 (has links)
No description available.
185

Iterative Decoding of Codes on Graphs

Sankaranarayanan, Sundararajan January 2006 (has links)
The growing popularity of a class of linear block codes called low-density parity-check (LDPC) codes can be attributed to the low complexity of their iterative decoders and their potential to achieve performance very close to the Shannon capacity. This makes them an attractive candidate for ECC applications in communication systems. This report proposes methods to systematically construct regular and irregular LDPC codes. A class of regular LDPC codes is constructed from incidence structures in finite geometries such as projective and affine geometry. A class of irregular LDPC codes is constructed by systematically splitting blocks of balanced incomplete block designs to achieve desired weight distributions. These codes are decoded iteratively using message-passing algorithms, and their performance over various channels is presented in this report. The application of iterative decoders is generally limited to codes whose graph representations are free of small cycles. Unfortunately, the large class of conventional algebraic codes, such as Reed-Solomon (RS) codes, has many four-cycles in their graph representations. This report proposes an algorithm that aims to alleviate this drawback by constructing an equivalent graph representation that is free of four-cycles. It is shown theoretically that the four-cycle-free representation is better suited to iterative erasure decoding than the conventional representation. The new representation is also exploited to realize, with limited success, iterative decoding of Reed-Solomon codes over the additive white Gaussian noise channel. Wiberg, Forney, Richardson, Koetter, and Vontobel have made significant contributions to developing theoretical frameworks that facilitate finite-length analysis of codes. With the exception of Richardson's, most of these frameworks are best suited to the analysis of short codes. In this report, we further the understanding of failures of iterative decoders on the binary symmetric channel. Decoder failures are classified into two categories by defining trapping sets and propagating sets. This classification leads to a successful estimation of the performance of codes under the Gallager B decoder. In particular, the estimation techniques show great promise in the high signal-to-noise ratio regime, where simulation techniques are less feasible.
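
As a loose illustration of hard-decision iterative decoding on a Tanner graph, the sketch below implements a simple bit-flipping decoder for the binary symmetric channel: it repeatedly flips the bits involved in the most unsatisfied parity checks. This is a simpler relative of the Gallager B message-passing decoder analyzed in the report, not the report's own construction, and the small regular parity-check matrix is invented for the example.

    import numpy as np

    def bit_flip_decode(H, y, max_iters=20):
        """Iterative hard-decision bit-flipping decoding on parity-check matrix H.

        Illustrative only: repeatedly flip the bit(s) involved in the largest
        number of unsatisfied parity checks until the syndrome vanishes.
        """
        x = y.copy()
        for _ in range(max_iters):
            syndrome = H @ x % 2                  # which parity checks are violated
            if not syndrome.any():
                return x, True                    # valid codeword found
            counts = H.T @ syndrome               # unsatisfied checks touching each bit
            x[counts == counts.max()] ^= 1        # flip the most "suspicious" bits
        return x, False                           # failed to converge

    # A tiny (3,4)-regular parity-check matrix (8 bits, 6 checks), made up for the demo.
    H = np.array([
        [1, 1, 1, 1, 0, 0, 0, 0],
        [0, 0, 0, 0, 1, 1, 1, 1],
        [1, 1, 0, 0, 1, 1, 0, 0],
        [0, 0, 1, 1, 0, 0, 1, 1],
        [1, 0, 1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 1, 0, 1],
    ])
    codeword = np.zeros(8, dtype=int)             # the all-zero word is always a codeword
    received = codeword.copy()
    received[0] ^= 1                              # one bit flipped by the binary symmetric channel
    decoded, ok = bit_flip_decode(H, received)
    print(ok, decoded)                            # True [0 0 0 0 0 0 0 0]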
186

Order Sets in the Clinical Setting

Hall, Susan 27 August 2013 (has links)
Clinicians and hospital administrators are increasingly challenged to deliver efficient, evidence-based care. Clinical decision support (CDS) tools are being introduced into the clinical setting to help bridge knowledge gaps at the point of care, and order sets are one of the tools used to facilitate this knowledge translation. Using the realist review methodology and a focus group of interview participants, this thesis retrospectively explored some of the causal relationships that lead to effective and successful order set adoption. Findings demonstrate the need for in-depth and regular review of context and order set adoption. Technology can offer some enhancements in the form of delivery tools, but it also introduces new and complex challenges for development and implementation. Ongoing software development is needed to improve delivery formats and to incorporate effective tools that support efficient continuous quality improvement.
187

Refinamento de consultas em lógicas de descrição utilizando teoria dos Rough sets / Query refinement in description logics using the Rough set theory

Oliveira, Henrique Viana January 2012 (has links)
OLIVEIRA, Henrique Viana. Refinamento de consultas em lógicas de descrição utilizando teoria dos Rough sets. 2012. 111 f. Dissertação (Mestrado em Ciência da Computação), Universidade Federal do Ceará, Fortaleza-CE, 2012. / Query refinement consists of techniques that modify the terms of a query in order to change the results obtained initially. Refinements can be carried out in several ways, and different approaches can be applied. This work proposes query refinement methods based on rough set theory, using it as an alternative solution to the refinement problem. The proposed methods are grounded in description logic languages, which are commonly used in problems involving the representation of knowledge bases and ontologies. Two extensions of description logics with rough set theory are introduced in this dissertation. We prove the complexity of the satisfiability problem for these logics, as well as the complexity of the query refinement methods applied to them. Finally, we present quality measures that help choose among the refinement results obtained.
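
For background, the rough set approximations underlying such refinement methods can be pictured as follows: given an indiscernibility (equivalence) relation over the objects, a query answer can be tightened to its lower approximation or loosened to its upper approximation. The Python sketch below illustrates only these two standard set-level operations on invented toy data; it is not the dissertation's description-logic machinery.

    from collections import defaultdict

    def indiscernibility_classes(objects, attributes):
        """Group objects that share the same values on the chosen attributes."""
        classes = defaultdict(set)
        for name, attrs in objects.items():
            classes[tuple(attrs[a] for a in attributes)].add(name)
        return list(classes.values())

    def approximations(target, classes):
        """Rough-set lower/upper approximations of `target` w.r.t. the classes."""
        lower = {x for c in classes if c <= target for x in c}   # certainly in the target
        upper = {x for c in classes if c & target for x in c}    # possibly in the target
        return lower, upper

    # Hypothetical toy data: four papers described by two attributes.
    papers = {
        "p1": {"topic": "logic", "venue": "journal"},
        "p2": {"topic": "logic", "venue": "journal"},
        "p3": {"topic": "logic", "venue": "conference"},
        "p4": {"topic": "databases", "venue": "conference"},
    }
    classes = indiscernibility_classes(papers, ["topic", "venue"])
    relevant = {"p1", "p3"}                      # answers of some initial query
    lower, upper = approximations(relevant, classes)
    print(sorted(lower))   # ['p3']              -- tightened ("certain") answer set
    print(sorted(upper))   # ['p1', 'p2', 'p3']  -- loosened ("possible") answer set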
188

Soft-clustering in static and dynamic environments

Saltos Atiencia, Ramiro Javier January 2016 (has links)
Doctor en Sistemas de Ingeniería / Today, the macro- and micro-economic environment in which companies operate is changing constantly and at great speed, mainly because of the rise of new technologies, the information revolution, and the ease with which we can now communicate with any part of the world. Because of the enormous amount of information generated second after second, together with the ease with which it can be accessed, data mining and operations research have become some of the most important tools for building decision-support systems at every level. The importance of combining both disciplines lies in the relative ease with which they extract knowledge from huge databases and use it to optimize the relevant organizational processes. For data mining in particular, these facts have made databases dynamic: the amount of available data grows every second, shortening the useful life of models and making it necessary to update them periodically so that they match current reality. This demands new algorithms capable of handling such changes, a need that has gained importance in recent years. In addition, the uncertainty, ambiguity, and imprecision present in real-life problems are important factors to consider when developing data mining algorithms. The most common way of dealing with uncertainty is probability theory, whereas ambiguity and imprecision were largely set aside until new ways of treating them appeared, two of which are fuzzy set theory and rough set theory. With these ideas in mind, this thesis studies a classical support-vector-based clustering algorithm in depth and extends it to a rough-fuzzy version in order to give it the ability to handle the ambiguity and imprecision present in the real world. This novel soft-computing algorithm is then generalized to a dynamic version capable of processing databases that receive new information over time. Finally, building on these two algorithms, an outlier detection method is proposed as one of the many possible applications arising from this research. Each chapter provides computational experiments, results, and discussion, emphasizing the contribution of this work to society and pointing out future developments and possible fields of application.
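
To give a flavor of the rough-fuzzy idea independently of the support-vector machinery developed in the thesis, the generic sketch below assigns fuzzy c-means style membership degrees from distances to given cluster centers and then splits each cluster into a lower region of confident members and a boundary region of ambiguous points. The data, centers, and thresholds are invented for the illustration, and this is not the thesis's algorithm.

    import numpy as np

    def rough_fuzzy_assign(points, centers, m=2.0, alpha=0.7):
        """Generic rough-fuzzy assignment sketch (not the thesis's support-vector method).

        Fuzzy memberships follow the usual fuzzy c-means formula; a point whose top
        membership exceeds `alpha` joins that cluster's lower region, otherwise it is
        placed in the boundary regions of its two closest clusters.
        """
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        lower = [set() for _ in centers]
        boundary = [set() for _ in centers]
        for i, memberships in enumerate(u):
            best = int(np.argmax(memberships))
            if memberships[best] >= alpha:
                lower[best].add(i)                      # confidently in one cluster
            else:
                for k in np.argsort(memberships)[-2:]:  # ambiguous: boundary of top-2 clusters
                    boundary[int(k)].add(i)
        return u, lower, boundary

    points = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [2.6, 2.4]])
    centers = np.array([[0.0, 0.0], [5.0, 5.0]])
    u, lower, boundary = rough_fuzzy_assign(points, centers)
    print(lower)     # e.g. [{0, 1}, {2}] -- clear members of each cluster
    print(boundary)  # e.g. [{3}, {3}]    -- the mid-way point lands in both boundary regions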
189

Refinamento de Consultas em Lógicas de Descrição Utilizando Teoria dos Rough Sets / Query Refinement in Description Logics Using the Rough Set Theory

Henrique Viana Oliveira 14 August 2012 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Query refinement consists of techniques that modify the terms of a query in order to change the results obtained initially. Refinements can be carried out in several ways, and different approaches can be applied. This work proposes query refinement methods based on rough set theory, using it as an alternative solution to the refinement problem. The proposed methods are grounded in description logic languages, which are commonly used in problems involving the representation of knowledge bases and ontologies. Two extensions of description logics with rough set theory are introduced in this dissertation. We prove the complexity of the satisfiability problem for these logics, as well as the complexity of the query refinement methods applied to them. Finally, we present quality measures that help choose among the refinement results obtained.
190

Um background na teoria dos conjuntos / A background in set theory

Francisco Fagner Portela Aguiar 29 September 2015 (has links)
Set theory, sometimes left aside in some high schools, is a key element for understanding functions in particular. Failing to address it, or addressing it only superficially, leaves students with a gap that is hard to fill in later studies; indeed, this gap can hinder their performance in higher education. In view of this, the main objective of this dissertation is to revisit the main topics of high-school set theory while building a bridge between them and other equally important points, treating sets in a more academic language. It covers properties and theorems related to finite sets up to their generalization to infinite sets, culminating in the Cantor-Schroeder-Bernstein theorem, the Axiom of Choice, and Zorn's Lemma. To this end, bibliographic research was carried out in a variety of sources.
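
For reference, the two classical results named in the abstract can be stated as follows; these are the standard textbook formulations, not excerpts from the dissertation.

    % Standard statements of the two classical results named in the abstract.
    \textbf{Theorem (Cantor--Schroeder--Bernstein).}
    If there exist injections $f \colon A \to B$ and $g \colon B \to A$,
    then there exists a bijection $h \colon A \to B$; equivalently,
    $|A| \leq |B|$ and $|B| \leq |A|$ together imply $|A| = |B|$.

    \textbf{Lemma (Zorn).}
    If every chain (totally ordered subset) of a nonempty partially ordered
    set $(P, \leq)$ has an upper bound in $P$, then $P$ has a maximal element.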
