About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
271

La foi en l’obscur : le sacré sale et le mysticisme du jeu chez Georges Bataille / Faith in the Obscure: The Dirty Sacred and the Mysticism of Play in Georges Bataille

Lavoie, Vincent
It was during the first half of the twentieth century, as a contemporary of the two world wars and in a secularizing France, that Georges Bataille (1897-1962) constructed a scandalous, deeply Nietzschean body of thought. Conceptually, the body is not only at the centre of Bataille's preoccupations; it is now thought in its excessive reality: eroticism, intoxication and defilement. We witness a sacralization of debauchery of every kind, and this is how Bataille's philosophical singularity operates: for him, the sacred no longer resides in a celestial sublime, in a deity or, in other words, in the heights, but in its exact opposite, that is, in the low, in excess, in eroticism and in the dirty. In this inversion of the order of the world, what is sacred is profanation itself. Drawing on Mary Douglas, I was therefore able to link two ideas that appear contradictory: defilement and the sacred. Since Bataille was also a writer (Sade was one of his major influences), I was able to include stories by the author that perfectly illustrate these two antinomies. This points to one of the most fundamental traits of his work, which I also highlight: the whole of Bataille's oeuvre is paradoxical and antinomic. It is precisely from this observation that I approach the question of mysticism in Bataille. As with the question of the sacred, his conception of mysticism involves an implicit critique of Christianity (the opposition of heaven and earth, for example). Necessarily, it is the instant and its chance, the absence of purpose, that open the way to experience, while writing communicates its essence and renders it more intelligible. From the arbitrary character of experience arises the idea of chance that Bataille develops in tune with Nietzsche. 
Nietzsche thus becomes very present in Bataillean mysticism: think of the figure of the overman, which can be associated with that of the sovereign man, or of the will to power, which can be linked to the will to chance. Bataille thereby puts forward a mysticism of play, a radical putting-into-play of oneself from which emerges the chance he evokes, which is nothing other than the possibility of experience itself. All in all, the mental and physical fragility of Bataillean mysticism also conceals a self-sacrifice in the name of enjoyment, certainly, but also in the name of the text.
272

Der Erste Weltkrieg und das ‚Ostjudentum‘. Westeuropäische Perspektiven am Beispiel von Arnold Zweig, Sammy Gronemann und Max Brod / The First World War and 'Eastern Jewry': Western European Perspectives through the Examples of Arnold Zweig, Sammy Gronemann and Max Brod

Schneider, Ulrike 07 August 2019
No description available.
273

Mnohost bytí: Ontologie Alaina Badioua / The Multiplicity of Being: The Ontology of Alain Badiou

Pivoda, Tomáš January 2012
Tomáš Pivoda, The Multiplicity of Being: The Ontology of Alain Badiou. PhD thesis abstract. The thesis introduces, for the first time in the Czech philosophical context, the ontology of the French philosopher Alain Badiou as he set it out in his fundamental work Being and Event (L'être et l'événement, 1988). It first presents the starting point of Badiou's philosophy and the reasons for his identification of ontology with set theory, and it points out Badiou's importance for contemporary philosophy, especially for the so-called speculative realism around Quentin Meillassoux. The main axis of the exposition is then built around Badiou's four fundamental "Ideas": the multiplicity, the event, the truths and the subject, in connection with which it is shown how Badiou constructs his conceptual apparatus out of the individual axioms of set theory, following the basic formal definition of multiplicity based on the membership operator ∈. In connection with the first Idea of multiplicity, the thesis exposes, with references to Martin Heidegger and Plato, Badiou's conceptual transposition of the couple one/multiple onto the couple existence/being, and defines the fundamental concepts of his ontology: the situation, the presentation, the representation and the void, with the help of which Badiou interprets...
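For orientation only (these formulas are standard ZF set theory, not quotations from the thesis), the two notions the exposition leans on, pure multiplicity via the membership operator ∈ and the void, can be written as:

```latex
% Extensionality: a multiple is nothing but what belongs to it
\forall x \,\forall y \,\bigl(\forall z\,(z \in x \leftrightarrow z \in y)\rightarrow x = y\bigr)

% The void: the unique set to which nothing belongs, written \varnothing
\exists x \,\forall y\, \neg(y \in x)
```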
274

Opérations de proximité en orbite : évaluation du risque de collision et calcul de manoeuvres optimales pour l'évitement et le rendez-vous / Orbital proximity operations : evaluation of collision risk and computation of optimal maneuvers for avoidance and rendezvous

Serra, Romain 10 December 2015
This thesis is about collision avoidance for a pair of spherical orbiting objects. The primary object, the operational satellite, is active in the sense that it can use its thrusters to change its trajectory, while the secondary object is a piece of space debris that cannot be controlled in any way. 
Ground radars and other means make it possible to foresee a conjunction involving an operational spacecraft, leading to the production of a collision alert. The latter contains statistical data on the positions and velocities of the two objects, enabling the construction of a probabilistic collision model. The work is divided into two parts: the computation of collision probabilities and the design of maneuvers to lower the collision risk. In the first part, two kinds of probabilities, which can be written as integrals of a Gaussian distribution over a Euclidean ball in 2 and 3 dimensions, are expanded in convergent power series with positive terms, using the theories of the Laplace transform and holonomic functions. In the second part, the question of collision avoidance is formulated as a chance-constrained optimization problem. Depending on the collision model, namely short- or long-term encounters, it is respectively tackled via the scenario approach or relaxed using polyhedral collision sets. For the latter, two methods are proposed: the first directly tackles the joint chance constraints, while the second uses another relaxation, called risk selection, to obtain a mixed-integer program. Additionally, the solution to the problem of fixed-time, fuel-minimizing, out-of-plane proximity maneuvers is derived. This optimal control problem is solved via the primer vector theory.
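As a concrete illustration of the short-term encounter model described above, the collision probability reduces to the integral of a Gaussian density over a disk in the encounter plane. The sketch below estimates such an integral by plain Monte Carlo; the function name and parameters are illustrative, and the thesis itself develops an exact power-series method rather than sampling:

```python
import math
import random

def collision_probability_mc(mean, var, radius, n=200_000, seed=0):
    """Monte Carlo estimate of P(||X|| <= radius) for a 2-D Gaussian with
    independent components: mean = (mx, my), var = (vx, vy).
    This is the integral of the Gaussian density over a disk, i.e. the
    short-term-encounter collision probability for a combined spherical object."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(mean[0], math.sqrt(var[0]))
        y = rng.gauss(mean[1], math.sqrt(var[1]))
        if x * x + y * y <= radius * radius:
            hits += 1
    return hits / n

# Sanity check: for a standard isotropic Gaussian centred on the disk,
# the exact value is 1 - exp(-R^2 / 2) (Rayleigh distribution).
```

For mean (0, 0), unit variances and radius 1, the exact probability is 1 - exp(-1/2), roughly 0.393, which the estimate approaches as n grows.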
275

In the Company of Cheaters (16th-Century Aristocrats and 20th-Century Gangsters)

Murdock, Mark Cammeron 24 June 2009
This document contains a meta-commentary on the article I co-authored with Dr. Corry Cropper, entitled Breaking the Duel's Rules: Brantôme, Mérimée, and Melville, which will be published in the next issue of Essays in French Literature and Culture, together with an annotated bibliography of primary and secondary sources featuring summaries and important quotes dealing with duels, honor, honor codes, cheating, historical causality, chance, and sexuality. Several examples of film noir are also cited, with brief summaries and key events noted. The article studies two instances of cheating in duels, one found in Brantôme's Discours sur les duels and the other in Prosper Mérimée's Chronique du règne de Charles IX, and the traditional, as well as anti-causal, repercussions they had. Melville's Le Deuxième souffle is also analyzed with regard to the Gaullist Gu Minda and the end of the aristocratic codes of honor that his generation dearly respected but that were overcome by the commercial world of republican law and order.
276

Situace surrealistického subjektu / Situation of the Surrealist Subject

Svěrák, Šimon January 2013
Univerzita Karlova v Praze, Filosofická fakulta, katedra estetiky. Diploma thesis, Šimon Svěrák, Situation of the Surrealist Subject (abstract), 2012. Thesis supervisor: prof. PhDr. Vlastimil Zuska, CSc. Abstract: This thesis focuses on the situation of a substantial subject in the historical development of the surrealist experience and confronts it with our original postmodern interpretation of the thought of the early Marx. The surrealist consciousness is based on a dialectical opposition between the rational and irrational elements of cognitive processes. André Breton apprehends this dialectics from the perspective of love life and relates it to the values of love, freedom and poetry. Nevertheless, this conception changes in the immanent development of the surrealist consciousness from Breton through the work and thought of Salvador Dalí and Mikuláš Medek to Vratislav Effenberger. Effenberger removes positive values from surrealism and puts emphasis on the critical functions of the irrational. In the psychological field, all these ideas are based on the conception of the unconscious, which means there is a substantialist approach in them. Our critical interpretation of Marx shows that the surrealist concept of the subject is in contradiction with its substantial determination. The subject has to be perceived as the essential...
277

台灣地區醫院效率與生產力變動之研究-非參數DEA方法之應用 / Efficiency and Productivity Growth of Hospitals in Taiwan: Nonparametric Data Envelopment Analysis

王媛慧
This dissertation is focused on the 
productivity studies of hospitals in Taiwan. It includes two independent academic papers. The primary intention is to introduce newly developed ideas in the measurement of efficiency and productivity rather than to create new ones; the application of these ideas to this setting has not, however, been discussed in print, and some of the arguments we bring together are new with respect to the literature on hospital efficiency and productivity measurement. Utilizing non-parametric data envelopment analysis (DEA) approaches, efficiency scores and productivity change indexes are estimated, and efforts are made to explain the differences in productivity performance among individual hospitals. The methods we use, and the economic approach behind them, distinguish this study from other empirical studies of the medical market. Part I: Market Uncertainty and Hospital Efficiency. This part of the dissertation is focused on the measurement of hospital efficiency under uncertainty. There are stochastic variations in production relationships for hospitals. Generally speaking, the uncertainty facing hospitals comes from two major sources: the natural uncertainty of medical care outcomes, and the uncertainty of demand for medical care (Arrow, 1963). Given the uncertainty in the medical market, the efficiency of hospitals hinges on how decision-makers deal with it; undoubtedly, optimal planning of output buffers improves efficiency performance. Using hospital survey data for 1993 and 1994, and employing the chance-constrained DEA model (Land, Lovell and Thore, 1993), stochastic efficiency indexes of public and private medical centers and regional hospitals are estimated. Whereas a deterministic frontier envelops a given set of sample observations all the time, the chance-constrained frontier envelops them most of the time; that is, chance-constrained DEA allows for the possibility of output failure. 
Imposing different values of the output failure probability, the estimation results are compared with those of traditional (deterministic) DEA models. The empirical evidence from the chance-constrained DEA model shows that, on average, private hospitals performed significantly better than public hospitals, which matches the result of the traditional DEA model. With the Mann-Whitney U test, we compare the distributions of efficiency indexes under the chance-constrained and deterministic DEA models; the difference between the two models is statistically significant given a higher probability of output failure. These results imply that the nature of risk, and the way it is handled, differ between public and private hospitals. We also find that the efficiency performance of public hospitals could be improved by increasing their reserve capacity. Part II: National Health Insurance and Hospital Productivity Change. In this part of the dissertation, we examine the impact of NHI on hospitals and trace the sources of hospital productivity growth in Taiwan. To pursue this goal, we employ a data set consisting of 157 medical centers, regional hospitals and district hospitals over the period 1993 to 1997, and resort to the Malmquist productivity index to measure total factor productivity change. The index can be decomposed into three components: technical change, pure technical efficiency change and scale efficiency change. The estimation technique used in the study is the deterministic non-parametric DEA approach. The results are revealing and suggestive for the public and the government in promoting and assuring the efficient delivery of quality health care. The average efficiency scores are 66.00% (74.87%) for CRS (VRS) technology, which means that there are substantial efficiency losses for the sample hospitals during the study period. 
The efficiency score of the hospitals as a whole in 1995 (the first year of NHI) was much lower than in the other four years. A censored Tobit regression analysis identifies NHI policy, ownership, bed occupancy rate, average length of stay and the output-specific concentration level as significant determinants of technical efficiency. Empirical results indicate that most medical care regions became more output-specific concentrated. Total factor productivity on average deteriorated at an annual rate of -3.1%, dominated by substantial technical regress at an annual rate of -2.74%. The small hospitals were severely affected by NHI. Furthermore, within the large and small hospital groups, the difference in technical change was statistically significant, but the differences in TFP and its components between ownership types were not. Special attention was paid to comparing the Färe et al. (1994), Ray and Desli (1997) and Grifell and Lovell (1998) approaches to decomposing the Malmquist productivity index. Empirical results indicate that the first two approaches yield accurate productivity changes, while the Grifell and Lovell approach does not; however, they produce almost the same magnitude of average TFP. In addition, no significant differences in the measured technical change and efficiency change were found among the three approaches.
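To make the DEA machinery underlying both parts concrete, the following sketch computes input-oriented CCR (constant-returns-to-scale) efficiency scores by solving one small linear program per decision-making unit. The data and function name are hypothetical; the thesis's chance-constrained variant additionally allows the frontier to be crossed with a given output-failure probability:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA. X: (n, m) inputs, Y: (n, s) outputs for n DMUs.
    For each DMU o, solve  min theta  s.t.  sum_j lambda_j x_j <= theta * x_o,
    sum_j lambda_j y_j >= y_o,  lambda >= 0.  Returns one score per DMU."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    thetas = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                    # minimise theta
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])   # X.T @ lam - theta * x_o <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -Y.T @ lam <= -y_o
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1), method="highs")
        thetas.append(res.x[0])
    return np.array(thetas)
```

With two toy hospitals using one input for one output, where the second consumes twice the input for the same output, `dea_ccr_input([[2], [4]], [[2], [2]])` gives CRS scores of 1.0 and 0.5.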
278

Second Chance Recovery Centre : the experiences of caregivers of Nyaope addicts

Mokutu, Kgothatso Selloane Lydia
Background: Drug rehabilitation is crucial for drug addicts, and drug rehabilitation (rehab) centres help in dealing with drug addiction; however, some drug addicts may find that certain rehabs do not meet their needs. The study therefore explored the experiences of caregivers caring for nyaope addicts. Method: This study adopted a qualitative research approach and a case study design. Purposive sampling was employed to select the sample, which comprised six caregivers. A structured interview and an open-ended questionnaire were employed to collect data. An interview questionnaire was designed to allow the participants to respond at home and provide feedback. When the responses provided through this process proved insufficient, participants were asked to take part in face-to-face interviews, and they agreed. Results: One of the main findings of this study was that caregiving affects the caregivers negatively, with both psychological and physical effects. Conclusion: A need was identified for support and awareness for caregivers and rehabilitation centres in South Africa. This might reduce relapse into substance abuse and help reduce the number of substance abusers in South Africa. / Psychology / M.A. (Psychology (Research Consultation))
279

Scenario-Based Model Predictive Control for Systems with Correlated Uncertainties

González Querubín, Edwin Alonso 26 April 2024
The vast majority of real-world processes have inherent uncertainties which, when considered in the modelling process, yield a representation that describes the behaviour of the real process as accurately as possible. In most practical cases, these uncertainties are considered to behave stochastically, and their descriptions as probability distributions are known. Stochastic model predictive control algorithms are developed to control processes with uncertainties of a stochastic nature, where knowledge of the statistical properties of the uncertainties is exploited by including it in the optimal control problem (OCP) statement. Contrary to other model predictive control (MPC) schemes, hard constraints are relaxed by reformulating them as probabilistic constraints in order to reduce conservatism. That is, violations of the original hard constraints are allowed, but such violations must not exceed a permitted level of risk. The non-convexity of such probabilistic constraints renders the optimisation problem computationally unmanageable, so most stochastic MPC strategies in the literature differ in how they deal with such constraints and uncertainties to make the problem computationally tractable. On the one hand, there are deterministic strategies that, offline, convert the probabilistic constraints into new deterministic ones, using the propagation of uncertainties along the prediction horizon to tighten the original hard constraints. 
Scenario-based approaches, on the other hand, use the uncertainty information to randomly generate, at each sampling instant, a set of possible evolutions of the uncertainties over the prediction horizon. In this fashion, they convert the probabilistic constraints into a set of deterministic constraints that must be fulfilled for all the scenarios generated. These strategies stand out for their ability to include real-time updated uncertainty information. However, this advantage comes with drawbacks such as the computational effort, which grows with the number of scenarios, and the undesired effect on the optimisation problem caused by scenarios with a low probability of occurrence when a small set of scenarios is used. The aforementioned challenges steered this thesis toward stochastic scenario-based MPC approaches and yielded three main contributions. The first consists of a comparative study of an algorithm from the deterministic group with another one from the scenario-based group, with special emphasis on how each of them deals with the uncertainties, transforms the probabilistic constraints, and structures its optimisation problem, as well as on their most outstanding aspects and challenges. The second contribution is a new MPC algorithm, based on conditional scenarios, developed for linear systems with correlated uncertainties. This scheme exploits the existence of such correlation to convert a large initial set of scenarios into a smaller one with its probabilities of occurrence, which preserves the characteristics of the initial set. The reduced set is used in an OCP in which the predictions of the system states and inputs are penalised according to the probabilities of the scenarios that compose them, giving less importance to the scenarios with lower probabilities of occurrence.
The third contribution consists of a procedure for the implementation of the new MPC algorithm as an energy manager in a microgrid in which the forecasts of renewables and loads are correlated. / González Querubín, EA. (2024). Scenario-Based Model Predictive Control for Systems with Correlated Uncertainties [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/203887
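The scenario-based machinery described in this abstract can be sketched in a few lines. This is a toy sketch assuming a scalar linear system; the binning-based reduction is a crude stand-in for the conditional-scenario method of the thesis, and all numeric values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar linear system x+ = a*x + u + w over horizon N
a, N, n_scen = 0.9, 5, 200

# Sample a large set of disturbance scenarios over the horizon
W = rng.normal(0.0, 0.1, size=(n_scen, N))

# Crude scenario reduction (stand-in for the thesis's conditional
# scenarios): bin scenarios by their mean and keep each bin's
# centroid together with its empirical probability of occurrence.
keys = np.digitize(W.mean(axis=1), bins=np.linspace(-0.15, 0.15, 7))
centroids, probs = [], []
for k in np.unique(keys):
    members = W[keys == k]
    centroids.append(members.mean(axis=0))
    probs.append(len(members) / n_scen)
W_red, p = np.array(centroids), np.array(probs)

def weighted_cost(u, x0=1.0):
    """Quadratic cost over the reduced set, weighted by scenario
    probability so low-probability scenarios matter less."""
    total = 0.0
    for w_seq, pr in zip(W_red, p):
        x, cost = x0, 0.0
        for t in range(N):
            cost += x**2 + u[t]**2
            x = a * x + u[t] + w_seq[t]
        total += pr * cost
    return total

# Evaluate a candidate input sequence; in a full OCP, the state
# constraints would be imposed for every reduced scenario as well.
u = np.full(N, -0.3)
print(round(weighted_cost(u), 3))
```

The probability weighting in the cost mirrors the abstract's point: predictions composed from low-probability scenarios are penalised less, damping their distorting effect on the optimisation.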

Application of the Duality Theory

Lorenz, Nicole 15 August 2012 (has links) (PDF)
The aim of this thesis is to present new results concerning duality in scalar optimization. We show how the theory can be applied to optimization problems arising in the theory of risk measures, portfolio optimization and machine learning. First we give some notations and preliminaries needed within the thesis. After that we recall how the well-known Lagrange dual problem can be derived via the general perturbation theory and give some generalized interior-point regularity conditions used in the literature. Using these facts we consider some special scalar optimization problems having a composed objective function and geometric (and cone) constraints. We derive their duals, give strong duality results and optimality conditions using some regularity conditions. Thus we complete and/or extend some results in the literature, especially by using the mentioned regularity conditions, which are weaker than the classical ones. We further consider a scalar optimization problem having single chance constraints and a convex objective function. We also derive its dual, give a strong duality result and further consider a special case of this problem. Thus we show how the conjugate duality theory can be used for stochastic programming problems and extend some results given in the literature. In the third chapter of this thesis we consider convex risk and deviation measures. We present some more general measures than the ones given in the literature and derive formulas for their conjugate functions. Using these we calculate some dual representation formulas for the risk and deviation measures and correct some formulas in the literature. Finally we prove some subdifferential formulas for measures and risk functions by using the facts above. The generalized deviation measures introduced in the previous chapter can be used to formulate the portfolio optimization problems we consider in the fourth chapter. 
Their duals, strong duality results and optimality conditions are derived by using the general theory and the conjugate functions, respectively, given in the second and third chapters. Analogous calculations are done for a portfolio optimization problem having single chance constraints, using the general theory given in the second chapter. Thus we give an application of the duality theory in the well-developed field of portfolio optimization. We close this thesis by considering a general Support Vector Machines problem and deriving its dual using the conjugate duality theory. We give a strong duality result and necessary as well as sufficient optimality conditions. By considering different cost functions we obtain problems for Support Vector Regression and Support Vector Classification. We extend the results given in the literature by dropping the assumption of invertibility of the kernel matrix. We use a cost function that generalizes the well-known Vapnik's ε-insensitive loss and consider the optimization problems that arise from it. We show how the general theory can be applied to a real data set; in particular, we predict concrete compressive strength by using a special Support Vector Regression problem.
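The conjugate machinery behind these Support Vector results can be sketched numerically for the plain ε-insensitive loss. This is a minimal illustration with an assumed ε and grid, not the thesis's generalized cost function:

```python
import numpy as np

eps = 0.5                              # illustrative epsilon
x = np.linspace(-10.0, 10.0, 20001)    # grid for the supremum

def eps_insensitive(r):
    """Vapnik's epsilon-insensitive loss: max(0, |r| - eps)."""
    return np.maximum(0.0, np.abs(r) - eps)

def conjugate(y):
    """Numerical Fenchel conjugate f*(y) = sup_x (x*y - f(x))."""
    return float(np.max(x * y - eps_insensitive(x)))

# The closed form is f*(y) = eps*|y| on |y| <= 1 (and +inf outside);
# this bounded dual domain is what makes the SVR dual tractable.
vals = {y: conjugate(y) for y in (-1.0, -0.5, 0.0, 0.5, 1.0)}
print(vals)
```

Outside |y| ≤ 1 the supremum is unbounded, which is why the dual variables of the regression problem end up box-constrained.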
