161
Cinderella - Adaptive Online Partitioning of Irregularly Structured Data. Herrmann, Kai; Voigt, Hannes; Lehner, Wolfgang. 01 July 2021
In an increasing number of use cases, databases face the challenge of managing irregularly structured data. Irregularly structured data is characterized by a quickly evolving variety of entities without a common set of attributes. These entities do not show enough regularity to be captured in a traditional database schema. A common solution is to centralize the diverse entities in a universal table, which usually leads to a very sparse table. Although today's techniques allow efficient storage of sparse universal tables, query efficiency remains a problem: queries that reference only a subset of attributes have to read the whole universal table, including many irrelevant entities. One possible solution is to partition the table, which allows pruning partitions of irrelevant entities before they are touched. Creating and maintaining such a partitioning manually is very laborious or even infeasible due to the enormous complexity, so an autonomous solution is desirable. In this paper, we define the Online Partitioning Problem for irregularly structured data and present Cinderella, an autonomous online algorithm for horizontal partitioning of irregularly structured entities in universal tables. It is designed to keep its overhead low by incrementally assigning entities to partitions while they are touched anyway during modifications. The resulting partitioning allows queries that retrieve only entities with a given subset of attributes to easily prune partitions of irrelevant entities; Cinderella thus increases the locality of queries and reduces query execution cost.
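The abstract describes the idea only at a high level; the sketch below is a rough, hypothetical illustration of attribute-based online partition assignment, not Cinderella's actual algorithm. It greedily assigns each incoming sparse entity to the partition whose attribute signature overlaps most with the entity's attributes, and lets queries skip partitions whose signature cannot contain the requested attributes. The class name AttributePartitioner, the Jaccard criterion, and the min_overlap threshold are illustrative assumptions, not details from the thesis.

```python
# Hypothetical sketch of attribute-based online partitioning of a sparse
# universal table. This is NOT the Cinderella algorithm from the thesis;
# it only illustrates the general idea: entities are assigned to
# partitions incrementally as they arrive, and queries prune partitions
# whose attribute signature cannot contain the requested attributes.

class AttributePartitioner:
    def __init__(self, min_overlap=0.5):
        # Each partition is a pair (signature: set of attributes, entities: list).
        self.partitions = []
        self.min_overlap = min_overlap   # assumed threshold, not from the thesis

    def insert(self, entity):
        """entity: dict mapping attribute name -> value (a sparse row)."""
        attrs = set(entity)
        best, best_score = None, 0.0
        for part in self.partitions:
            sig, _ = part
            score = len(attrs & sig) / (len(attrs | sig) or 1)  # Jaccard similarity
            if score > best_score:
                best, best_score = part, score
        if best is None or best_score < self.min_overlap:
            self.partitions.append((set(attrs), [entity]))  # open a new partition
        else:
            best[0].update(attrs)   # widen the partition's attribute signature
            best[1].append(entity)

    def query(self, needed_attrs):
        """Return entities carrying all requested attributes, skipping
        (pruning) partitions whose signature lacks one of them."""
        needed = set(needed_attrs)
        hits = []
        for sig, entities in self.partitions:
            if not needed <= sig:        # partition cannot contain a match
                continue
            hits.extend(e for e in entities if needed <= set(e))
        return hits
```

A call such as query(["isbn"]) would then touch only partitions whose signature contains isbn, which is the pruning effect the abstract describes.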
162
Preprocessing to Deal with Hard Problems. Hols, Eva-Maria Christiana. 22 May 2020
In classical complexity theory, we distinguish between the class P of polynomial-time solvable problems and the class of NP-hard problems, which are widely believed not to be solvable in polynomial time. Unfortunately, many of the problems we want to solve are NP-hard. At the same time, there is a large discrepancy between empirically observed running times and the established worst-case bounds. Preprocessing or data reduction on real-world instances is known to lead to huge improvements in running time. Here we reach the limits of classical complexity theory.
In this thesis, we focus on preprocessing algorithms for NP-hard problems. Our goal is to find ways to preprocess certain instances of an NP-hard problem by considering the structure of the input instance. More precisely, given an instance and an additional parameter l, we want to compute in polynomial time an equivalent instance whose size and parameter value are bounded by a function of the parameter l alone.
In the field of parameterized complexity, these algorithms are called kernelizations.
We will consider three NP-hard graph problems, namely Vertex Cover, Edge Dominating Set, and Subset Feedback Vertex Set. For Vertex Cover, we unify known kernelization results for the parameterization by the size of a deletion set to a specified graph class. Afterwards, we study the existence of polynomial kernelizations for Edge Dominating Set when parameterized by the size of a deletion set to a graph class; the picture turns out to be considerably more complicated than for Vertex Cover. Nevertheless, we fully classify the existence of polynomial kernelizations when every graph in the graph class is a disjoint union of constant-size components. Finally, we consider graph cut problems, in particular the Subset Feedback Vertex Set problem, and show that it admits a randomized polynomial kernelization when the parameter is the solution size.
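For readers unfamiliar with kernelization, the sketch below shows the textbook Buss reduction for Vertex Cover parameterized by the solution size k. It is included only to make concrete what "computing an equivalent instance whose size is bounded by a function of the parameter" means; it is not one of the results of this thesis, which parameterizes by the size of a deletion set to a graph class. The rules are standard: a vertex of degree greater than k must belong to every cover of size at most k, and a reduced YES-instance can have at most k^2 edges.

```python
# Textbook Buss kernelization for Vertex Cover parameterized by the
# solution size k (shown for illustration only; not a result of this
# thesis). Input: a simple undirected graph given as an edge list.

def buss_kernel(edge_list, k):
    """Return (kernel_edges, k') of an equivalent instance, or None if the
    instance is already known to have no vertex cover of size <= k."""
    edges = {frozenset(e) for e in edge_list}
    while True:
        # Compute vertex degrees in the current (reduced) graph.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Rule: a vertex of degree > k must be in every cover of size <= k,
        # so put it into the cover and decrease the remaining budget.
        high = next((v for v, d in deg.items() if d > k), None)
        if high is None:
            break
        edges = {e for e in edges if high not in e}
        k -= 1
        if k < 0:
            return None
    # A YES-instance with maximum degree <= k has at most k*k edges,
    # since each of the <= k cover vertices covers at most k edges.
    if len(edges) > k * k:
        return None
    return edges, k
```

For a triangle with k = 1, for example, these rules alone already certify a NO-instance; in general the returned kernel has at most k^2 edges and therefore at most 2k^2 vertices.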
163
On Cognitive Aspects of Human-Level Artificial Intelligence. Besold, Tarek R. 26 January 2015
Following an introduction to the context of Human-Level Artificial Intelligence (HLAI) and (computational) analogy research, a formal analysis assessing and qualifying the suitability of the Heuristic-Driven Theory Projection (HDTP) analogy-making framework for HLAI purposes is presented. An account of the application of HDTP (and analogy-based approaches in general) to the study and computational modeling of conceptual blending is outlined, before a proposal and initial proofs of concept for the application of computational analogy engines to modeling and analysis questions in education studies, teaching research, and the learning sciences are described.
Subsequently, the focus is changed from analogy-related aspects in learning and concept generation to rationality as another HLAI-relevant cognitive capacity. After outlining the relation between AI and rationality research, a new conceptual proposal for understanding and modeling rationality in a more human-adequate way is presented, together with a more specific analogy-centered account and an architectural sketch for the (re)implementation of certain aspects of rationality using HDTP.
The methods and formal framework used for the initial analysis of HDTP are then applied to propose general guiding principles for models and approaches in HLAI, together with a proposal for a formal characterization grounding the notion of heuristics as used in cognitive and HLAI systems, as an additional application example.
Finally, work is reported that aims to clarify the scientific status of HLAI and that contributes to the debate about (in)adequate means for assessing the progress of a computational system towards reaching (human-level) intelligence.
Two main objectives are achieved. First, using analogy as a starting point, examples are given as inductive evidence for how a cognitively-inspired approach to questions in HLAI can be fruitful in and of itself. Second, several advantages of this approach, also with respect to overcoming certain intrinsic problems currently characterizing HLAI research as a whole, are set out. Concerning individual outcomes, an analogy-based proposal for theory blending as a special form of conceptual blending is exemplified; the usefulness of computational analogy frameworks for understanding learning and education is shown and a corresponding research program is suggested; a subject-centered notion of rationality is proposed, together with a sketch of how the resulting theory could be modeled computationally using an analogy framework; computational complexity and approximability considerations are introduced as guiding principles for work in HLAI; and the scientific status of HLAI, as well as two possible tests for assessing progress in HLAI, are addressed.
164
K-Separator problem. Mohamed Sidi, Mohamed Ahmed. 04 December 2014
Let G be a vertex-weighted undirected graph and let k be a positive integer. We aim to compute a minimum-weight subset of vertices, called a k-separator, whose removal leads to a graph in which every connected component has at most k vertices. If k = 1, we get the classical vertex cover problem (a minimum set of vertices intersecting every edge of the graph). Several formulations are proposed for the problem, and their linear relaxations are compared theoretically. A polyhedral study is carried out (valid inequalities, facets, separation algorithms). The problem is shown to be solvable in polynomial time for many special cases, including paths, cycles and trees, and for graphs not containing certain induced subgraphs. Some (k + 1)-approximation algorithms are also exhibited. Most of the algorithms are implemented and compared. The k-separator problem has many applications. If all vertex weights are equal to 1, the size of a minimum k-separator can be used to evaluate the robustness of a graph or network. Another application is partitioning a graph or network into subgraphs with respect to different criteria. For example, in the context of social networks, many approaches aim to detect communities: solving a minimum k-separator problem yields connected components that may represent communities, with the k-separator vertices representing the people making connections between communities. The k-separator problem can then be seen as a special graph partitioning/clustering problem.
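As a minimal illustration of the problem definition itself (not of the formulations, polyhedral results or approximation algorithms studied in the thesis), the sketch below checks whether a candidate vertex set is a feasible k-separator: after removing it, every connected component of the remaining graph must have at most k vertices. The graph representation and function names are assumptions made for the example.

```python
# Minimal checker for the k-separator definition: after deleting the
# candidate vertex set S from G, every connected component of G - S must
# have at most k vertices. This only restates the problem definition; it
# is not one of the formulations or algorithms proposed in the thesis.

from collections import deque

def is_k_separator(adj, weights, separator, k):
    """adj: dict vertex -> set of neighbours (undirected graph).
    weights: dict vertex -> weight.  separator: iterable of vertices.
    Returns (feasible, total_weight_of_separator)."""
    sep = set(separator)
    remaining = set(adj) - sep
    total_weight = sum(weights[v] for v in sep)
    seen = set()
    for start in remaining:
        if start in seen:
            continue
        # BFS over the component of `start` in the graph without the separator.
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for u in adj[v]:
                if u in remaining and u not in seen:
                    seen.add(u)
                    queue.append(u)
        if size > k:
            return False, total_weight
    return True, total_weight
```

For k = 1, a feasible separator must leave only isolated vertices, so the separator is exactly a vertex cover, matching the special case mentioned above.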
165
Complexity Theory of Leadership and Management Information. Simpson, Mark Aloysius. 01 January 2018
Implementing effective leadership strategies in the management of information systems (MIS) can positively influence overall organizational performance. This study was an exploration of the general problem of failure to lead effectively in the current knowledge-based economy and the resulting deleterious effects on organizational performance and threats to continuing organizational viability. The specific problem was the lack of understanding regarding the interaction of leadership processes with MIS functions and the impact on organizational success. Managers' and employees' lived experiences of leadership in small- to medium-sized enterprises were explored, as well as how those experiences influenced the organization's adaptive responses regarding technology and performance in the knowledge-based economy. The complexity theory of leadership was applied as the theoretical foundation for this study. A phenomenological methodology was used. Data were collected through semi-structured interviews and analyzed through open coding to identify emergent themes. The themes were that leaders motivate employees' positive work-related behaviors, that effective communication skills ensure the accessibility and efficiency of the organizational information system, and that leadership practices influence business productivity. This study contributes to social change by providing insights for managers and employees regarding effective strategies for working as teams and networks via the use of nontraditional leadership theory, which promotes company sustainability by demonstrating the benefits of responding to the changing economy.
166
Design of High Performance Flanges and its Influence on Manufacturing Costs, Structural Performance and Weight. Alcocer Bonifaz, Joaquin. January 2019
This project researches the manufacturing cost, with an emphasis on machining, of high performance flanges for Turbine Rear Structure (TRS) applications, as well as the tradeoffs with structural performance and weight. Traditional cost modelling techniques from the literature are combined with a non-conventional manufacturing complexity index as a cost indicator. A multidisciplinary study is carried out with the aid of ANSYS Workbench, in the form of computer-simulated experiments, to investigate tradeoffs in flange design. It is concluded that the multidisciplinary studies of cost, performance and weight lacked the model robustness needed to draw sound conclusions about flange design. However, after partial validation with experienced engineers, the manufacturing complexity index shows promising results and could be a way forward for estimating the final machining operation cost of flanges in the future.
167
Leading Change in Complex Systems: A Paradigm Shift. LeMaster, Cheryl Faye. 29 August 2017
No description available.
168
An Inquiry into Language Use in Multilinguals’ Writing: A Study of Third-Language Learners. Tanova, Nadya. 25 June 2012
No description available.
169
Being, eating and being eaten: deconstructing the ethical subject. Vrba, Minka. 12 1900
Thesis (MPhil (Philosophy))--University of Stellenbosch, 2006. / This study constitutes a conceptual analysis and critique of the notion of the subject, and the concomitant notion of responsibility, as it has developed through the philosophical history of the modern subject. The aim of this study is to present the reader with a critical notion of responsibility. This study seeks to divorce such a position from the traditional, normative view of the subject, as typified by the Cartesian position. Following Derrida, a deconstructive reading of the subject’s conceptual development since Descartes is presented. What emerges from this reading is that, despite various re-conceptualisations of the subject by philosophers as influential and diverse as Nietzsche, Heidegger and Levinas, their respective positions continue to affirm the subject as human. The position presented in this study challenges this notion of the subject as human, with the goal of opening up and displacing the ethical frontier between human and non-human. It is argued that displacing this ethical frontier introduces complex responsibilities. These complex responsibilities resist the violence inherent to normative positions that typically exclude the non-human – particularly the animal – from the sphere of responsibility.
170
Constructing “Climate Change Knowledge”. de Ruijter, Susann. Unknown Date
During the last decades, “Climate Change” has become a vital topic on national and international political agendas, where it is presented as an irrevocable fact of global impact and thus of universal relevance. What has often been neglected are the local discourses of marginalized groups and their specific contextualization of “Climate Change” phenomena. The aim of this project, to develop another perspective alongside these dominant narratives, has resulted in the research question: How is social reality reconstructed around the phenomenon of “Climate Change” among the “Emerging Black Farmers” in the Swartland region of the Western Cape, South Africa?
As an example, “Climate Change Knowledge” is reconstructed through a case study of the information exchange between the NGO Goedgedacht Trust and local small-scale farmers in the post-Apartheid context of ongoing political, social, economic and educational transition in South Africa.
Following a constructivist approach, “Climate Change Knowledge” is understood not as an objective given but as a socially constructed “reality” based on the interdependency of socio-economic conditions and individual assets, including language skills and language practice, sets of social norms and values, and strategies of knowledge transfer.
The data set consists of qualitative data sources, such as application forms and interview material, which are triangulated. The rationale of a multi-layered data analysis includes a discursive perspective as well as linguistic and ethical “side perspectives”.
Epistemologically, the thesis is guided by assumptions from complexity theory, framing knowledge around “Climate Change” as a fluid, constantly changing system that is shaped by ongoing intra- and inter-systemic exchange processes and characterized by non-linearity, self-organization and representation of its constituents. From this point of departure, a theoretical terminology has been developed that differentiates between symbols, interrelations, contents and content clusters. These elements are located in a system of spatio-temporal orientation and embedded in a broader (socio-economic) context of “historicity”. Content clusters are remodelled with the help of concept maps. On this basis, a local perspective on “Climate Change” is developed, adding an experiential notion to the global narratives.
The thesis concludes that there is no single reality about “Climate Change” and that the farmers’ “Climate Change Knowledge” depends strongly on experiential relativity and spatio-temporal immediacy. Furthermore, the analysis shows that the system’s historicity and social manifestations can be traced in the scope and emphasis of the content clusters discussed. Finally, the thesis demonstrates that the characteristics of symbols, interconnections and contents range between the dichotomies of direct and indirect, predictable and unpredictable, awareness and negligence, and threat and danger, all coexisting and creating a continuum of knowledge production.