111

Portfolio selection based on minmax rule and fuzzy set theory.

January 2011 (has links)
Yang, Fan. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2011. / Includes bibliographical references (p. 100-106). / Abstracts in English and Chinese. / Abstract --- p.i / Acknowledgement --- p.iii / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Literature review --- p.1 / Chapter 1.2 --- The main contribution of this thesis --- p.5 / Chapter 1.3 --- Relations between the above three models --- p.7 / Chapter 2 --- Model 1 --- p.9 / Chapter 2.1 --- Introduction --- p.9 / Chapter 2.2 --- Minimax rule risk function --- p.11 / Chapter 2.3 --- Fuzzy liquidity of asset --- p.12 / Chapter 2.4 --- Notations --- p.15 / Chapter 2.5 --- Model formulation --- p.16 / Chapter 2.6 --- Numerical example and result --- p.25 / Chapter 3 --- Model 2 --- p.36 / Chapter 3.1 --- Introduction --- p.36 / Chapter 3.2 --- Notations --- p.39 / Chapter 3.3 --- Model formulation --- p.41 / Chapter 3.4 --- Numerical example and result --- p.45 / Chapter 4 --- Model 3 --- p.51 / Chapter 4.1 --- Introduction --- p.51 / Chapter 4.2 --- Notations --- p.52 / Chapter 4.3 --- Model formulation --- p.54 / Chapter 4.4 --- Numerical example and result --- p.62 / Chapter 5 --- Conclusion --- p.68 / Chapter A --- Source Data for Model 1 --- p.71 / Chapter B --- Source Data for Model 2 --- p.80 / Chapter C --- Source Data for Model 3 --- p.90 / Bibliography --- p.100
112

The fuzzification of Choquet integral and its applications. / CUHK electronic theses & dissertations collection

January 2005 (has links)
As the most essential feature in problem solving and decision making by humans, uncertainty information occurs frequently in business, scientific and engineering disciplines. The explosive growth and diverse forms of uncertainty information in stored data have generated an urgent requirement for new techniques and tools that can intelligently and automatically assist us in eliciting valuable knowledge from raw data. / The DCIFI is defined based on the Choquet extension of a signed fuzzy measure. A numerical calculation algorithm is implemented to derive the integration result of the DCIFI. A DCIFI regression model is designed to handle the regression problem where heterogeneous fuzzy data are involved. We propose a GA-based Double Optimization Algorithm (GDOA) to retrieve the internal coefficients of the DCIFI regression model. Besides that, a DCIFI projection classifier, which is capable of classifying heterogeneous fuzzy data efficiently and effectively, is established. We propose a GA-based Classifier-learning Algorithm (GACA) to search the relevant internal parameters of the DCIFI projection classifier. Both the DCIFI regression model and the projection classifier are very informative and powerful in dealing with heterogeneous fuzzy data sets with strong interaction. Their performances are validated by a series of experiments on both synthetic and real data. (Abstract shortened by UMI.) / This thesis is mainly devoted to a comprehensive investigation of innovative data mining methodologies which merge the advantages of the nonlinear (Choquet) integral in the representation of nonlinear relationships and of fuzzy set theory in the description of the uncertainty existing in practical databases. It proposes two fuzzifications of the classical Choquet integral: one is the Defuzzified Choquet Integral with Fuzzy-valued Integrand (DCIFI), and the other is the Fuzzified Choquet Integral with Fuzzy-valued Integrand (FCIFI).
The DCIFI and the FCIFI are regarded as generalizations of Choquet integral since both of them allow their integrands to be fuzzy-valued. The difference lies in that the DCIFI has its integration result non-fuzzified while the FCIFI has its integration result fuzzified. Due to the different forms of integration results, the DCIFI and the FCIFI have their distinct theoretic analyses, implementation algorithms, and application scopes, respectively. / by Rong Yang. / "April 2005." / Advisers: Kwong-Sak Leung; Pheng-Ann Heng. / Source: Dissertation Abstracts International, Volume: 67-01, Section: B, page: 0371. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2005. / Includes bibliographical references (p. 187-199). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstract in English and Chinese. / School code: 1307.
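Both fuzzifications above generalize the classical discrete Choquet integral, which (as a point of reference, not taken from the thesis itself) sorts the integrand values and weights each increment by the fuzzy measure of the corresponding level set. A minimal sketch, with a hypothetical two-element example:

```python
def choquet(f, mu):
    """Discrete Choquet integral of f (dict: element -> value, values >= 0)
    with respect to a fuzzy measure mu (dict: frozenset -> measure)."""
    elems = sorted(f, key=f.get)            # elements in ascending value order
    total, prev = 0.0, 0.0
    for i, e in enumerate(elems):
        level = frozenset(elems[i:])        # level set {x : f(x) >= f(e)}
        total += (f[e] - prev) * mu[level]  # increment weighted by mu(level)
        prev = f[e]
    return total

# Hypothetical example: mu is non-additive (mu({a}) + mu({b}) != mu({a,b})),
# which is exactly where the Choquet integral differs from a weighted sum.
mu = {frozenset({'a'}): 0.3, frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 1.0}
f = {'a': 0.2, 'b': 0.6}
```

Here choquet(f, mu) = (0.2 − 0)·μ({a,b}) + (0.6 − 0.2)·μ({b}) = 0.2 + 0.2 = 0.4. The DCIFI and FCIFI replace the crisp values of f with fuzzy numbers.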
113

Generalized Julia Sets: An Extension of Cayley's Problem

Lewis, Owen 01 May 2005 (has links)
There are many iterative techniques to find a root or zero of a given function. For any iterative technique, it is often of interest to know which initial seeds lead to which roots. When the iterative technique used is Newton’s Method, this is known as Cayley’s Problem. In this thesis, I investigate two extensions of Cayley’s Problem. In particular, I study generalizations of Newton’s Method, in both ℂ and ℝ², and the associated fractal structures that arise from using more sophisticated numerical approximation techniques.
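Cayley's Problem can be illustrated with the standard example f(z) = z³ − 1 (a generic sketch, not code from the thesis): iterating Newton's method from each seed and recording which cube root of unity it converges to; coloring a grid of seeds by that index draws the familiar fractal basin boundaries.

```python
import numpy as np

# The three cube roots of unity, i.e. the roots of f(z) = z^3 - 1.
ROOTS = np.array([1.0, np.exp(2j * np.pi / 3), np.exp(-2j * np.pi / 3)])

def newton_basin(z, max_iter=50, tol=1e-10):
    """Return the index of the root that seed z converges to, or -1."""
    for _ in range(max_iter):
        dz = 3 * z**2
        if abs(dz) < tol:           # derivative vanishes; Newton step undefined
            return -1
        z = z - (z**3 - 1) / dz     # Newton step for f(z) = z^3 - 1
        d = np.abs(ROOTS - z)
        if d.min() < tol:
            return int(d.argmin())
    return -1                       # did not converge within max_iter
```

Evaluating newton_basin over a grid in the complex plane and plotting the returned indices reproduces the three intertwined basins whose common boundary is the Julia set of the Newton map.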
114

An Algorithm for Computing the Symmetry Point of a Polytope

Belloni, Alexandre, Freund, Robert M. 01 1900 (has links)
Given a closed convex set C and a point x in C, let sym(x,C) denote the symmetry value of x in C, which essentially measures how symmetric C is about the point x. Denote by sym(C) the largest value of sym(x,C) among all x in C, and let x* denote the most symmetric point in C. These symmetry measures are all invariant under linear transformation, change in inner product, etc., and so are of interest in the study of the geometry of convex sets and arise naturally in the evaluation of the complexity of interior-point methods in particular. Herein we show that when C is given by the intersection of halfspaces, i.e., C={x | Ax <= b}, then x* as well as the symmetry value of C can be computed by using linear programming. Furthermore, given an approximate analytic center of C, there is a strongly polynomial-time algorithm for approximating sym(C) to any given relative tolerance. / Singapore-MIT Alliance (SMA)
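The reduction to linear programming can be sketched as follows (my own numpy illustration, not the authors' code): for C = {x : Ax ≤ b}, sym(x,C) = min_i (b_i − aᵢ·x)/(aᵢ·x − δ_i), where δ_i = min_{y∈C} aᵢ·y is one LP per constraint row. For small bounded polytopes the LPs can be solved by brute-force vertex enumeration, as here:

```python
import itertools
import numpy as np

def polytope_vertices(A, b, tol=1e-9):
    """Vertices of {x : Ax <= b} (assumed bounded) by intersecting every
    choice of n constraint hyperplanes and keeping the feasible points."""
    m, n = A.shape
    verts = []
    for idx in itertools.combinations(range(m), n):
        try:
            v = np.linalg.solve(A[list(idx)], b[list(idx)])
        except np.linalg.LinAlgError:
            continue                      # chosen rows are not independent
        if np.all(A @ v <= b + tol):
            verts.append(v)
    return np.array(verts)

def sym_value(x, A, b):
    """sym(x, C) = max{t >= 0 : x + t*(x - y) in C for all y in C}."""
    V = polytope_vertices(A, b)
    t = np.inf
    for a_i, b_i in zip(A, b):
        delta_i = min(a_i @ v for v in V)  # LP optimum attained at a vertex
        spread = a_i @ x - delta_i
        if spread > 1e-12:
            t = min(t, (b_i - a_i @ x) / spread)
    return t
```

For the unit square, the center attains sym = 1 (perfect symmetry), while the point (0.25, 0.5) has sym = 1/3; a production implementation would of course solve the m linear programs with an LP solver rather than enumerate vertices.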
115

Proactive Detection and Recovery of Lost Mobile Phones

Ong, Chen Hui, Kasim, Nelly, Jayasena, Sajindra Kolitha Bandara, Rudolph, Larry, Cham, Tat Jen 01 1900 (has links)
This paper describes the successful implementation of a prototype software application that independently and proactively detects whether a mobile phone is lost or misused. When the mobile phone is detected as being lost or misused, the application takes steps to mitigate the impact of loss and to gather evidence. The goal is to aid in the recovery of the mobile phone. The prototype works regardless of the cellular infrastructure the mobile phone is operating in and makes minimum demands on the owner of the mobile phone. The prototype was developed on Nokia 6600 mobile phones that run Symbian Operating System 7.0s. Development was done using Nokia’s Series 60 Developer’s Platform 2.0. / Singapore-MIT Alliance (SMA)
116

Network Bargaining: Creating Stability Using Blocking Sets

Steiner, David January 2012 (has links)
Bargaining theory seeks to answer the question of how to divide a jointly generated surplus between multiple agents. John Nash proposed the Nash Bargaining Solution to answer this question for the special case of two agents. Kleinberg and Tardos extended this idea to network games, and introduced a model they call the Bargaining Game. They search for surplus divisions with a notion of fairness, defined as balanced solutions, that follow the Nash Bargaining Solution for all contracting agents. Unfortunately, there exist many networks in which no balanced solution can be found; we call such networks unstable. In this thesis, we explore methods of changing unstable network structures to find fair bargaining solutions. We define the concept of Blocking Sets, introduced by Biro, Kern and Paulusma, and use them to create stability. We show that by removing a blocking set from an unstable network, we can find a balanced bargaining division in polynomial time. This motivates the search for minimal blocking sets. Unfortunately this problem is NP-hard, and hence no known efficient algorithm exists for solving it. To overcome this hardness, we consider the problem when restricted to special graph classes. We introduce an O(1)-factor approximation algorithm for the problem on planar graphs with unit edge weights. We then provide an algorithm to solve the problem optimally in graphs of bounded treewidth, which generalize trees.
117

Integrating environmental criteria into the supplier selection process

Wong, Yin-king. January 2000 (has links)
Thesis (M. Phil.)--University of Hong Kong, 2001. / Includes bibliographical references (leaves 163-177).
118

Eigen Fuzzy Sets of Fuzzy Relation with Applications

Naman, Saleem Muhammad January 2010 (has links)
Eigen fuzzy sets of a fuzzy relation can be used for the estimation of the highest and lowest levels of the involved variables when applying max-min composition on fuzzy relations. By the greatest eigen fuzzy set (a set which cannot be made any greater), the maximum membership degrees of any fuzzy set can be found; with the help of the least eigen fuzzy set (a set which cannot be made any less), the minimum membership degrees of any fuzzy set can be found as well. The lowest and highest level, impact or effect of anything can be found by applying eigen fuzzy set theory. This study applies the theory to measuring levels of medical cure effectiveness and customer satisfaction. By applying methods of eigen fuzzy set theory, the effectiveness of medical cure and customer satisfaction can be found with high precision.
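The greatest eigen fuzzy set mentioned above is the largest A satisfying A ∘ R = A under max-min composition. A minimal numpy sketch (my own, following Sanchez's classical iterative scheme rather than any code from the thesis) starts from the column maxima of R and iterates the composition until a fixed point is reached:

```python
import numpy as np

def maxmin(A, R):
    """Max-min composition: (A o R)(j) = max_i min(A(i), R(i, j))."""
    return np.max(np.minimum(A[:, None], R), axis=0)

def greatest_eigen_fuzzy_set(R, max_iter=100):
    """Greatest A with maxmin(A, R) == A, i.e. the greatest eigen fuzzy set."""
    A = R.max(axis=0)                  # start from the column maxima of R
    for _ in range(max_iter):
        nxt = maxmin(A, R)
        if np.array_equal(nxt, A):     # fixed point reached: A o R = A
            return A
        A = nxt
    return A
```

The sequence is non-increasing and, for a finite relation matrix, reaches its fixed point in finitely many steps; the least eigen fuzzy set is obtained by an analogous dual iteration.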
119

Does providing a subtle reasoning hint remedy the conjunction fallacy?

Stergiadis, Dimitris January 2015 (has links)
Humans are in general poor at making judgments that adhere to the logical principles of probability theory. One demonstration of this is termed the “conjunction fallacy”: judging a conjunction (A&B) as being more probable than its constituent (A). Systematic commitment of the conjunction fallacy has been shown in numerous studies on probability judgments. Different actions to remedy the fallacy have been suggested. According to the nested-sets hypothesis, when the nested-set structure of a problem becomes clear (i.e. the relation between categories and subcategories), then the conjunction fallacy is remedied. However, previous demonstrations of this remediation have provided very explicit task-related information and it can be questioned whether it is trivial that such information leads to more correct judgments. The primary aim of this study was to test the nested-sets hypothesis in two different formats of a probability judgment task, by more subtly hinting about the nested-set structure. Twenty-nine participants were randomly divided into two groups, one Probability condition and one Informed probability condition, where participants in the latter condition were provided with the hint. The second aim was to investigate whether the Informed probability condition was performed more slowly, potentially due to the time-cost of more elaborated judgments. The results show that a subtle hint about the nested-set structure was able to remedy the conjunction fallacy in a forced-choice probability judgment task but not statistically reliably in a probability estimation task. No response-time differences were observed between the conditions. The results support the nested-sets hypothesis and imply that even a subtle reasoning hint clarifying the relation between categories and subcategories might remedy one of the most robust probability judgment fallacies.
120

Continuities and discontinuities in working memory representations of collections over ontogeny

Tuerk, Arin Samantha 23 October 2014 (has links)
Working memory, or the ability to maintain and manipulate information such that it can be used to guide behavior, is known to be severely capacity limited, in most circumstances, to about 3-4 objects. Both infants and adults have the ability to surpass these limits by encoding to-be-remembered items in groups or collections, exploiting statistical regularities or conceptual information to devise more efficient coding schema. Despite progress made toward understanding continuities in working memory, little is known about how changes over development interact with the ability to employ maximally efficient mnemonic data structures. Paper 1 demonstrates that although adults can encode at most three mutually exclusive collections that accrue sequentially over time, they can circumvent this limit when items overlap in features (e.g. red and blue circles and triangles) and statistical regularities are introduced among collections defined by a single visual feature (e.g. most red items are triangular and not circular). Adults' performance suggests they are able to encode items from intersecting collections hierarchically and exploit statistical regularities among collections to reconstruct the numerosities of up to six collections in parallel, exemplifying how efficient coding can radically enhance working memory. Paper 2 demonstrates that young preschoolers can also represent three mutually exclusive collections that accrue in an intermixed fashion over time. Results show that the ability to surpass this capacity limit by hierarchically reorganizing collections and exploiting statistical regularities among them develops between the ages of three and seven. These results are discussed in the context of executive function development. Paper 3 provides evidence that computations of average size and orientation rely on qualitatively different processes with distinct developmental trajectories. 
Experiment 1 demonstrates that while the presence of additional identical elements in an array detrimentally impacts 6-month-olds' representations of element size, it improves the precision with which infants represent orientation. Experiment 2 demonstrates that performance is not affected when infants' attention is cued to a single item within arrays. These results are discussed in the context of the development of controlled attention. / Psychology
