31 |
Regulace finančních trhů v mezinárodních souvislostech / Regulation of financial markets in international context. Nistorenco, Taisia. January 2015 (has links)
This diploma thesis comprises a theoretical, descriptive, comparative and econometric analysis of financial market regulation in an international context. In the first chapter I describe and analyse the characteristics of modern financial markets, the reasons for their regulation and the institutional models of financial regulation and supervision. The second chapter offers an overview of the historical development of financial regulation and evaluates the role of the regulatory factor in the outbreak of the 2008 world financial crisis. In the third chapter I compare the regulatory responses to the financial crisis in the USA and the EU. The fourth chapter presents a regression analysis of the relationship between Financial Soundness Indicators of three selected countries. The conclusions drawn from this diploma thesis demonstrate that regulation is generally effective as a remedy for market failures, but in other respects its effects are ambiguous. The intensity of regulation is a secondary factor in the formation of financial crises; in fact, it is more reasonable to say that crises occur because of failures of supervision rather than of regulation. Owing to international attempts to harmonise the remediation of the consequences of the financial crisis, the regulatory responses in the USA and the EU were very similar. The statistical analysis did not confirm the common idea that regulation represents a significant barrier to increasing the profitability of credit institutions.
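As an illustration only (the thesis does not publish its data or model), a regression of this kind can be sketched as follows; the indicator names, the three-country setup and the synthetic numbers are hypothetical:

```python
# Hypothetical sketch of a regression between two Financial Soundness Indicators
# for three countries; the data below is synthetic, not from the thesis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
rows = []
for country in ["A", "B", "C"]:                 # placeholders for the three selected countries
    capital_ratio = rng.normal(15.0, 2.0, 40)   # e.g. regulatory capital to risk-weighted assets, %
    roa = 0.05 * capital_ratio + rng.normal(0.5, 0.3, 40)  # e.g. return on assets, %
    rows.append(pd.DataFrame({"country": country,
                              "capital_ratio": capital_ratio,
                              "roa": roa}))
panel = pd.concat(rows, ignore_index=True)

# Pooled OLS of one Financial Soundness Indicator on another.
X = sm.add_constant(panel["capital_ratio"])
model = sm.OLS(panel["roa"], X).fit()
print(model.params)
```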
|
32 |
Managing and Consuming Completeness Information for RDF Data Sources. Darari, Fariz. 04 July 2017 (has links) (PDF)
The ever-increasing amount of Semantic Web data gives rise to the question: How complete is the data? Though data on the Semantic Web is generally incomplete, many parts of it are indeed complete, such as the children of Barack Obama and the crew of Apollo 11. This thesis studies how to manage and consume completeness information about Semantic Web data. In particular, we first discuss how completeness information can guarantee the completeness of query answering. Next, we propose optimization techniques for completeness reasoning and conduct experimental evaluations to show the feasibility of our approaches. We also provide a technique to check the soundness of queries with negation via reduction to query completeness checking. We further enrich completeness information with timestamps, enabling us to check up to when query answers are complete. We then introduce two demonstrators, CORNER and COOL-WD, to show how our completeness framework can be realized. Finally, we investigate an automated method to generate completeness statements from text on the Web via relation cardinality extraction.
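A minimal sketch of the underlying idea, using plain Python sets rather than the thesis's RDF/SPARQL machinery (the CORNER and COOL-WD tools are not reproduced here): a completeness statement declares that the available data already contains every true triple matching a pattern, so any query covered by such a pattern is guaranteed a complete answer.

```python
# Toy illustration of completeness statements; a simplification, not the
# thesis's formal framework.

# Available (possibly incomplete) RDF-like graph as (subject, predicate, object).
graph = {
    ("Barack_Obama", "hasChild", "Malia_Obama"),
    ("Barack_Obama", "hasChild", "Sasha_Obama"),
    ("Apollo_11", "crewMember", "Neil_Armstrong"),
    ("Apollo_11", "crewMember", "Buzz_Aldrin"),
    ("Apollo_11", "crewMember", "Michael_Collins"),
}

# Completeness statements: (subject, predicate) patterns for which the graph
# is declared to contain every true triple.
completeness_statements = {
    ("Barack_Obama", "hasChild"),
    ("Apollo_11", "crewMember"),
}

def query(subject, predicate):
    """Return all objects for (subject, predicate) plus a completeness guarantee flag."""
    answers = {o for (s, p, o) in graph if s == subject and p == predicate}
    complete = (subject, predicate) in completeness_statements
    return answers, complete

print(query("Barack_Obama", "hasChild"))   # answers guaranteed complete
print(query("Barack_Obama", "livesIn"))    # no statement covers this: may be incomplete
```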
|
34 |
A JavaScript Backend for the Miking Compiler. Rågstad, William. January 2022 (has links)
This thesis presents the design and implementation of an extension of the self-hosted Miking compiler that enables the generation of JavaScript code for different runtime environments and web browsers. Miking is a framework for developing domain-specific and general-purpose programming languages through sound composition of language fragments, among other things, to create efficient compilers. Miking Core encapsulates a meta-language called Miking Lang and a fundamental functional Miking Expression language to which Miking Lang itself is lowered; lowering here essentially means rewriting more complex semantic constructs in terms of simpler ones. The presented backend translates Miking expressions into semantically equivalent JavaScript programs that are efficient, readable, sound, and correct. The report also discusses and evaluates development challenges, implementation methods, and techniques, together with a review of the final compiler backend. Likewise, details on JavaScript-specific optimizations and pattern-matching compilation are presented, including how tail recursion is handled with a trampoline technique so that complex Miking programs can be compiled into flexible and efficient JavaScript.
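A minimal sketch of the trampoline pattern mentioned above, shown in Python rather than in the generated JavaScript (this is not the backend's actual output): a tail call returns a thunk instead of growing the stack, and a small driver loop keeps invoking thunks until a plain value is produced.

```python
# Illustrative trampoline for tail calls in a language without guaranteed
# tail-call optimization.

def trampoline(result):
    """Repeatedly run returned thunks until a non-callable value is produced."""
    while callable(result):
        result = result()
    return result

def countdown(n):
    # Tail-recursive function rewritten in trampolined style:
    # the recursive call is wrapped in a zero-argument thunk.
    if n == 0:
        return "done"
    return lambda: countdown(n - 1)

# Runs in constant stack space even at depths that would overflow
# a naively recursive version.
print(trampoline(countdown(100_000)))
```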
|
35 |
Performance and soundness of European banking systems / Performance et stabilité des systèmes bancaires européens. Tomuleasa, Ioana Iuliana. 14 July 2017 (links)
The financial system plays a crucial role in modern society and has become indispensable to the economic development of a nation. An optimal financial system and a well-functioning banking sector are commonly considered to be among the most important conditions for sustainable economic development. Reflecting the importance of the banking sector, particularly over the last century, this thesis discusses the main indicators, methodological designs and determinants of bank performance and soundness, exploring in detail the impact of bank-specific, industry-specific and macroeconomic factors on the evolution of European bank performance and soundness during the international financial crisis. It also emphasizes the substantial efforts made by policy-makers in recent years to provide an effective and sufficient stimulus to the economic sectors and to ensure the sound functioning of financial systems.
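The thesis surveys many such indicators without prescribing a single one; purely as an illustration, one widely used soundness measure, the Z-score (a distance-to-insolvency measure), can be computed from basic balance-sheet series as in the hypothetical sketch below.

```python
# Illustrative Z-score computation; the input figures are made up and are not
# taken from the thesis.
import statistics

def z_score(roa_series, equity_to_assets):
    """Z = (mean ROA + equity/assets) / std(ROA); higher values indicate a sounder bank."""
    mean_roa = statistics.mean(roa_series)
    sd_roa = statistics.stdev(roa_series)
    return (mean_roa + equity_to_assets) / sd_roa

roa_history = [0.012, 0.009, 0.011, 0.004, 0.010]   # yearly return on assets
print(round(z_score(roa_history, equity_to_assets=0.08), 2))
```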
|
36 |
Geostatistical three-dimensional modeling of the subsurface unconsolidated materials in the Göttingen area / The transitional-probability Markov chain versus traditional indicator methods for modeling the geotechnical categories in a test site. Ranjineh Khojasteh, Enayatollah. 27 June 2013 (links)
The aim of the present work was to build a three-dimensional subsurface model of the Göttingen region based on a geotechnical classification of the unconsolidated sediments. The investigated materials range from loose sediments to solid rock, but are referred to in this work as soil, soil classes or soil categories.
This study evaluates different ways of capturing heterogeneous subsurfaces with geostatistical methods and simulations. Such models are a fundamental tool in geotechnical engineering, mining, oil prospecting and hydrogeology, among other fields.
Detailed modelling of the required continuous subsurface parameters, such as porosity, permeability or hydraulic conductivity, presupposes an exact determination of the boundaries of facies and soil categories. The focus of this work is on the three-dimensional modelling of unconsolidated materials and their classification based on geostatistically derived characteristic values. Conventional pixel-based methods as well as transition-probability-based Markov chain models were used.
After a general statistical evaluation of the parameters, the presence or absence of a soil category along the boreholes is described by indicator parameters. The indicator of a category at a sample point is one if the category is present and zero if it is not. Intermediate states can also be defined; for example, a value of 0.5 is assigned if two categories are present but their exact proportions are unknown. To improve the stationarity of the indicator variables, the initial coordinates are transformed into a new system proportional to the top and bottom of the corresponding model layer. In the new coordinate space, the indicator variograms for each category are computed for different spatial directions. For readability, semi-variograms are also referred to as variograms in this work.
Indicator kriging is then used to compute the probability of each category at a model node. Based on the probabilities computed in the previous step for the existence of each model category, the most probable category is assigned to the node. The indicator variogram models and indicator kriging parameters used were validated and optimized. The reduction of model nodes and its effect on model precision were also investigated. To resolve small-scale variations of the categories, the developed methods were applied and compared. Sequential Indicator Simulation (SISIM) and the Transition Probability Markov Chain (TP/MC) approach were used as simulation methods. The studies carried out show that the TP/MC method generally delivers good results, in particular in comparison with the SISIM method. Alternative methods for similar problems are evaluated for comparison and their inefficiency is demonstrated.
An improvement of the TP/MC methods is also described and supported with results, and further suggestions for modifying the methods are given. Based on the results, the method is recommended for similar problems; for this purpose, simulation selection, tests and evaluation schemes are proposed and further research priorities are highlighted.
A computer-assisted workflow covering all simulation steps could be developed in the future to increase efficiency.
The results of this study and subsequent investigations could be relevant to a wide range of problems in mining, the petroleum industry, geotechnical engineering and hydrogeology.
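A small sketch (not the thesis's implementation) of the indicator coding and relative-depth transform described above; the category names and figures are hypothetical.

```python
# Indicator coding per borehole sample: 1 if a category is present, 0 if absent,
# 0.5 for each category in a mixed sample with unknown proportions. Depths are
# rescaled to a 0-1 coordinate between the top and bottom of the model layer to
# improve stationarity before variogram computation.

CATEGORIES = ["gravel", "sand", "clay"]   # illustrative soil categories

def indicators(observed_categories):
    """Indicator vector for one sample given the categories logged in it."""
    if len(observed_categories) == 1:
        return {c: 1.0 if c in observed_categories else 0.0 for c in CATEGORIES}
    # Mixed sample with unknown proportions.
    return {c: 0.5 if c in observed_categories else 0.0 for c in CATEGORIES}

def relative_depth(z, layer_top, layer_bottom):
    """Transform absolute elevation z to a position proportional to the layer."""
    return (layer_top - z) / (layer_top - layer_bottom)

sample = {"z": 147.2, "categories": {"sand", "clay"}}
print(indicators(sample["categories"]))
print(round(relative_depth(sample["z"], layer_top=150.0, layer_bottom=140.0), 2))
```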
|
37 |
Formalização de workflow nets utilizando lógica linear: análise qualitativa e quantitativa / Formalization of WorkFlow nets using linear logic: qualitative and quantitative analysis. Passos, Lígia Maria Soares. 27 May 2009 (links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This work presents a method for the qualitative and quantitative analysis of WorkFlow nets based on the canonical proof trees of linear logic, together with an approach for verifying workflow specifications in UML through the transformation of UML Activity Diagrams into WorkFlow nets. The qualitative analysis concerns the proof of the soundness correctness criterion defined for WorkFlow nets. The quantitative analysis is based on the computation of symbolic dates for planning the resources used to handle each task of a workflow process modelled by a t-Time WorkFlow net. For the verification of workflow process specifications mapped into UML Activity Diagrams, formal rules are presented for transforming these diagrams into WorkFlow nets; in this context, the analysis and correction of critical points in UML Activity Diagrams through the analysis of linear logic proof trees is also proposed. The advantages of this approach are diverse. Working with linear logic makes it possible to prove the soundness correctness criterion in linear time without building the reachability graph, reasoning directly on the structure of the WorkFlow net rather than on the corresponding automaton. Moreover, the computation of symbolic dates for the execution of each task mapped into the t-Time WorkFlow net makes it possible to plan the use of the resources involved in the activities of the workflow process, through formulas that apply to any case handled by the corresponding workflow process, without traversing the whole process again to recalculate, for each new case, the start and completion dates of the activities involved. Regarding the verification of workflow processes mapped into UML Activity Diagrams, the major advantage of this approach is the transformation of a semi-formal model into a formal model, so that properties such as soundness can be formally verified. / Master's degree in Computer Science
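For contrast with the linear-logic approach, which avoids building the reachability graph, the following toy sketch checks the classical soundness conditions for a small, hypothetical WorkFlow net by brute-force exploration of its markings (assuming a safe net); it illustrates the criterion itself, not the thesis's method.

```python
# Naive soundness check for a tiny WorkFlow net: every reachable marking can
# reach the final marking, the net completes properly, and no transition is
# dead. Markings are sets of places, so the sketch assumes a safe (1-bounded) net.
from collections import deque

# WorkFlow net: transition -> (input places, output places), source "i", sink "o".
TRANSITIONS = {
    "register": ({"i"}, {"p1"}),
    "approve":  ({"p1"}, {"o"}),
    "reject":   ({"p1"}, {"o"}),
}

def fire(marking, inputs, outputs):
    return frozenset((marking - inputs) | outputs)

def enabled(marking):
    return [(ins, outs) for ins, outs in TRANSITIONS.values() if ins <= marking]

def reachable(start):
    seen, queue = {start}, deque([start])
    while queue:
        m = queue.popleft()
        for ins, outs in enabled(m):
            nxt = fire(m, ins, outs)
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def is_sound():
    initial, final = frozenset({"i"}), frozenset({"o"})
    marks = reachable(initial)
    option_to_complete = all(final in reachable(m) for m in marks)
    proper_completion = all(m == final for m in marks if "o" in m)
    no_dead_transition = all(any(ins <= m for m in marks)
                             for ins, _ in TRANSITIONS.values())
    return option_to_complete and proper_completion and no_dead_transition

print(is_sound())  # True for this toy net
```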
|