171.
Improved recurrent neural networks for convex optimization. / CUHK electronic theses & dissertations collection, January 2008 (has links)
Constrained optimization problems arise widely in scientific research and engineering applications. In the past two decades, solving optimization problems with recurrent neural networks has been extensively investigated because of the advantages of massively parallel operation and rapid convergence. In real applications, neural networks with simple architectures and good performance are desired. However, most existing neural networks have limitations and disadvantages in their convergence conditions or architectural complexity. This thesis concentrates on the analysis and design of recurrent neural networks with simplified architectures for solving more general convex optimization problems. Some improved recurrent neural networks are proposed for solving smooth and non-smooth convex optimization problems and are applied to selected applications. / In Part I, we first propose a one-layer recurrent neural network for solving linear programming problems. Compared with other neural networks for linear programming, the proposed network has a simpler architecture and better convergence properties. Second, a one-layer recurrent neural network is proposed for solving quadratic programming problems. Global convergence of the network can be guaranteed provided the objective function is convex on the equality constraint set; it need not be convex everywhere. Compared with other neural networks for quadratic programming, such as the Lagrangian network and the projection neural network, the proposed network has a simpler architecture, with a number of neurons equal to the number of decision variables of the optimization problem. Third, by combining the projection and penalty-parameter methods, a one-layer recurrent neural network is proposed for solving general convex optimization problems with linear constraints.
/ In Part II, improved recurrent neural networks are proposed for solving non-smooth convex optimization problems. We first propose a one-layer recurrent neural network for non-smooth convex programming problems with only equality constraints. This network simplifies the Lagrangian network and extends it to non-smooth convex optimization problems. Then, a two-layer recurrent neural network is proposed for non-smooth convex optimization subject to linear equality and bound constraints. / In Part III, selected applications of the proposed neural networks are discussed. The k-winners-take-all (kWTA) operation is first converted to equivalent linear and quadratic optimization problems, and two kWTA network models are tailored to perform the kWTA operation. The proposed neural networks are then applied to further problems, such as linear assignment, support vector machine learning, and curve fitting. / Liu, Qingshan. / Source: Dissertation Abstracts International, Volume: 70-06, Section: B, page: 3606. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2008. / Includes bibliographical references (leaves 133-145). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstracts in English and Chinese. / School code: 1307.
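The projection-type dynamics described in this abstract can be illustrated with a small numerical sketch. The problem data (Q, c, box bounds), step sizes, and function names below are illustrative assumptions, not the thesis's exact model: a projection neural network for a box-constrained quadratic program, integrated with forward Euler.

```python
import numpy as np

# Illustrative QP: minimize 0.5*x^T Q x + c^T x  subject to 0 <= x <= 1.
# Q is positive definite, so the problem is strictly convex.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])

def project_box(x, lo=0.0, hi=1.0):
    """Project onto the feasible box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def solve_qp_projection_network(x0, alpha=0.5, h=0.1, steps=2000):
    """Euler-integrate the projection dynamics
       dx/dt = -x + P(x - alpha * (Q x + c)),
    whose equilibria are exactly the KKT points of the QP."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + h * (-x + project_box(x - alpha * (Q @ x + c)))
    return x

x_star = solve_qp_projection_network([0.0, 0.0])
# The unconstrained minimizer is (1, 2); clipping the second coordinate
# to the box gives the constrained optimum (1, 1), an equilibrium of
# the dynamics.
print(x_star)
```

Note that the state dimension equals the number of decision variables, which is the one-layer, one-neuron-per-variable property the abstract emphasizes.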
172.
Functions of bounded variation and the isoperimetric inequality. / CUHK electronic theses & dissertations collection, January 2013 (has links)
Lin, Jessey. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2013. / Includes bibliographical references (leaves 79-80). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
173.
Parameter Continuation with Secant Approximation for Deep Neural Networks / Pathak, Harsh Nilesh, 03 December 2018 (has links)
Non-convex optimization of deep neural networks is a well-researched problem. We present a novel application of continuation methods to deep learning optimization that can potentially arrive at a better solution. In our method, we first decompose the original optimization problem into a sequence of problems using a homotopy method. To achieve this in neural networks, we derive the Continuation (C)-Activation function. First, the C-Activation is a homotopic formulation of existing activation functions such as Sigmoid, ReLU, or Tanh. Second, we apply a method that is standard in the parameter-continuation domain but, to the best of our knowledge, novel to the deep learning domain. In particular, we use Natural Parameter Continuation with Secant approximation (NPCS), an effective training strategy that may find a superior local minimum for a non-convex optimization problem. Additionally, we extend our work on Step-up GANs, a data-continuation approach, by deriving a method called Continuous (C)-SMOTE, an extension of standard oversampling algorithms. We demonstrate the improvements made by our methods and establish a categorization of recent work on continuation methods in the context of deep learning.
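The homotopy idea behind a C-Activation can be sketched as follows. The exact parameterization used in the thesis may differ; the linear-to-tanh blend and the lambda schedule below are illustrative assumptions.

```python
import math

def c_activation(x, lam):
    """Homotopic activation: at lam = 0 it is the identity (an 'easy',
    nearly-linear network); at lam = 1 it is plain tanh. Continuation
    sweeps lam from 0 to 1, producing a sequence of gradually harder
    training problems whose solutions warm-start one another."""
    return (1.0 - lam) * x + lam * math.tanh(x)

# Natural parameter continuation: evaluate (in practice, train) at each
# lambda on a schedule, carrying the previous solution forward.
for lam in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(lam, c_activation(2.0, lam))
```

At lam = 0 the function returns its input unchanged, and at lam = 1 it coincides with tanh, so the deformation interpolates continuously between the two regimes.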
174.
Maximum Likelihood Identification of an Information Matrix Under Constraints in a Corresponding Graphical Model / Li, Nan, 22 January 2017 (has links)
We address the problem of identifying the neighborhood structure of an undirected graph whose nodes are labeled with the elements of a multivariate normal (MVN) random vector. A semi-definite program is given for estimating the information matrix under arbitrary constraints on its elements. More importantly, a closed-form expression is given for the maximum likelihood (ML) estimator of the information matrix under the constraint that the information matrix has pre-specified elements in a given pattern (e.g., in a principal submatrix). The results apply to the identification of dependency labels in a graphical model with neighborhood constraints. This neighborhood structure excludes nodes that are conditionally independent of a given node, and the graph is determined by the non-zero elements of the information matrix for the random vector. A cross-validation principle is given for determining whether the constrained information matrix returned by this procedure is an acceptable model for the information matrix and, as a consequence, for the neighborhood structure of the Markov random field (MRF) identified with the MVN random vector.
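The link between zeros of the information (precision) matrix and conditional independence, which underlies the neighborhood structure above, can be checked numerically. The 3-by-3 matrix below is an illustrative assumption, not data from the dissertation:

```python
import numpy as np

# Information (precision) matrix K for a 3-node MVN graphical model.
# K[0, 2] = 0 encodes that X0 and X2 are conditionally independent
# given X1, i.e. the edge (0, 2) is absent from the graph.
K = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
Sigma = np.linalg.inv(K)  # covariance matrix

# Conditional covariance of (X0, X2) given X1 via the Schur complement.
A, B = [0, 2], [1]
cond_cov = (Sigma[np.ix_(A, A)]
            - Sigma[np.ix_(A, B)]
            @ np.linalg.inv(Sigma[np.ix_(B, B)])
            @ Sigma[np.ix_(B, A)])

# The off-diagonal entry vanishes: a zero in K is equivalent to
# conditional independence of the corresponding pair of nodes.
print(cond_cov)
```

The conditional covariance also equals the inverse of the principal submatrix K[{0,2},{0,2}], which is the standard identity relating precision-matrix blocks to conditional distributions.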
175.
Sobre medidas unicamente maximizantes e outras questões em otimização ergódica / Spier, Thomás Jung, January 2016 (has links)
In this thesis we study dynamical systems from the viewpoint of ergodic optimization. We analyze the problem of maximizing integrals of potentials with respect to probabilities invariant under the dynamics. We show that every ergodic measure is uniquely maximizing for some potential. We also verify that the set of potentials with exactly one maximizing measure is residual. These results are obtained through techniques of ergodic theory and convex analysis.
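A concrete instance of the maximization problem studied here (an illustration, not an example from the thesis): for the doubling map T(x) = 2x mod 1 and the potential f(x) = cos(2πx), Birkhoff averages over periodic-orbit measures can be compared directly, and the maximizing measure is the Dirac mass at the fixed point 0.

```python
import math

def birkhoff_average(x0, period, f):
    """Average of f along the period-'period' orbit of the doubling map
    T(x) = 2x mod 1 starting at x0; the invariant measure supported on
    the orbit integrates f to exactly this value."""
    x, total = x0, 0.0
    for _ in range(period):
        total += f(x)
        x = (2.0 * x) % 1.0
    return total / period

f = lambda x: math.cos(2.0 * math.pi * x)

# Periodic points of period p are x = k / (2^p - 1); scan small periods.
best = -float("inf")
for p in range(1, 8):
    for k in range(2**p - 1):
        best = max(best, birkhoff_average(k / (2**p - 1), p, f))

# The fixed point x = 0 attains f(0) = 1, the maximum of f itself, so
# the Dirac measure at 0 maximizes over all these periodic-orbit measures.
print(best)  # 1.0
```

Since f attains its global maximum at the fixed point, no invariant measure can do better, which makes the delta measure at 0 the unique maximizing measure for this potential.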
176.
Linear regularity of closed sets in Banach spaces. / CUHK electronic theses & dissertations collection, January 2004 (has links)
by Zang Rui. / "Nov 2004." / Thesis (Ph.D.)--Chinese University of Hong Kong, 2004. / Includes bibliographical references (p. 78-82) / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Mode of access: World Wide Web. / Abstracts in English and Chinese.
177.
Lagrangian duality in convex optimization. January 2009 (has links)
Li, Xing. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. / Includes bibliographical references (leaves 76-80). / Abstract also in Chinese. / Introduction --- p.4 / Chapter 1 --- Preliminary --- p.6 / Chapter 1.1 --- Notations --- p.6 / Chapter 1.2 --- On Properties of Epigraphs --- p.10 / Chapter 1.3 --- Subdifferential Calculus --- p.14 / Chapter 1.4 --- Conical Approximations --- p.16 / Chapter 2 --- Duality in the Cone-convex System --- p.20 / Chapter 2.1 --- Introduction --- p.20 / Chapter 2.2 --- Various Constraint Qualifications --- p.28 / Chapter 2.2.1 --- Slater's Condition Revisited --- p.28 / Chapter 2.2.2 --- The Closed Cone Constraint Qualification --- p.31 / Chapter 2.2.3 --- The Basic Constraint Qualification --- p.38 / Chapter 2.3 --- The Lagrange Multiplier and the Geometric Multiplier --- p.45 / Chapter 3 --- Stable Lagrangian Duality --- p.48 / Chapter 3.1 --- Introduction --- p.48 / Chapter 3.2 --- Stable Farkas Lemma --- p.48 / Chapter 3.3 --- Stable Duality --- p.57 / Chapter 4 --- Sequential Lagrange Multiplier Conditions --- p.63 / Chapter 4.1 --- Introduction --- p.63 / Chapter 4.2 --- The Sequential Lagrange Multiplier --- p.64 / Chapter 4.3 --- Application in Semi-Infinite Programs --- p.71 / Bibliography --- p.76 / List of Symbols --- p.80
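The duality machinery this table of contents covers can be illustrated on a toy convex program (an assumption chosen for illustration, not an example from the thesis): minimize x² subject to x ≥ 1. Slater's condition holds, so strong duality gives equal primal and dual values.

```python
# Primal: minimize f(x) = x^2  subject to g(x) = 1 - x <= 0.
# Lagrangian: L(x, mu) = x^2 + mu * (1 - x),  mu >= 0.
# Dual function: q(mu) = min_x L(x, mu) = mu - mu^2 / 4  (attained at x = mu / 2).

def dual(mu):
    assert mu >= 0.0
    return mu - mu**2 / 4.0

# Primal optimum: x* = 1 with value 1 (x^2 is increasing on [1, inf)).
primal_value = 1.0**2

# Dual optimum: maximize mu - mu^2/4 over mu >= 0 gives mu* = 2, value 1.
mu_star = 2.0
dual_value = dual(mu_star)

# Slater's condition (e.g. x = 2 is strictly feasible) implies a zero
# duality gap, so the geometric (Lagrange) multiplier mu* = 2 exists.
print(primal_value, dual_value)  # 1.0 1.0
```

Weak duality is also visible directly: dual(mu) ≤ 1 for every mu ≥ 0, with equality exactly at mu* = 2.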
178.
Convexity and duality in optimization theory / Young, Stephen K, January 1977 (has links)
Thesis. 1977. Ph.D.--Massachusetts Institute of Technology. Dept. of Mathematics. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND SCIENCE. / Bibliography: leaves 270-272. / by Stephen Kinyon Young. / Ph.D.
179.
Theory and Poetry: John Ashbery's "Self-portrait in a Convex Mirror" / Timmons, Jeffrey Wayne, 20 May 1994 (has links)
This thesis examines John Ashbery's poem "Self-Portrait in a Convex Mirror" and its revision of the traditional distinction between theory and poetry. Drawing a relationship between the poem's subject and the practices of postmodern theoretical discourse, the thesis posits the poem as an artifact of these changes. Creating a context for the poem, these developments not only inform the climate in which Ashbery's poem takes on significance but also explain the changing nature of literary study. Historical in its approach to the pressures and impulses within this climate of aesthetic production, the thesis traces the distinction between science and literature and how it has influenced the creation of the literary discipline. Demonstrating that the disciplinary study of literature has always been the subject of debate and discussion, it uses this understanding to place present disagreements about the need or usefulness of theory in the context of historical disagreements over the difference of literature from science or philosophy. Explaining that postmodern theory has largely worked to foreground the arbitrary nature of distinctions such as that between theory and poetry, the thesis elaborates on how poststructuralism undoes these distinctions to show how they are always the result of particular political and ideological views of representation. Using this critical insight, the thesis then reads closely the poem's relationship to postmodern theory and how it works to undo the distinction between theory and poetry. Having undone this traditional distinction, however, leaves the poem in an ambivalent and unstable position. Since it passes between extant categorical definitions, its nature remains undecided, maintaining an engagement with, and resistance to, tradition. It remains caught between the need for the aesthetic past and the need for freedom from that past.
Chapter four, therefore, explores this ambivalence, particularly as it relates to the inheritance of romanticism and modernism. Finally, in chapter five, the thesis revises the main critical perception of Ashbery as postmodern, making a case for his closer affiliation with a late version of modernism. Because of Ashbery's preoccupation with the aesthetic past, his use of the imagery, insights, and idealism of our aesthetic history, he appears to re-create a distinction between high and popular art that is more consonant with a version of modernism.
180.
A numerical approach to Tammes' problem in Euclidean n-space / Adams, Patrick Guy, 09 June 1997 (has links)
Graduation date: 1998