81

Le problème des universaux dans l'Isagoge de Porphyre selon quelques commentateurs latins du XIIIe siècle (Pseudo-Robertus Anglicus, Jean le Page, Nicolas de Paris et Robert Kilwardby) : édition critique sélective, traduction française, analyses structurelle et formelle et étude historico-philosophique

Piché, David 09 April 2021
Our knowledge of the history of the problem of universals, as it was initially formulated by Porphyry, suffers from a significant gap: we know strictly nothing about the treatment this problem received from those who professed philosophy at the University of Paris between 1230 and 1260. Our thesis addresses this deficit of knowledge through a heuristic approach in three phases: first, we produce a critical edition, following the rules of ecdotics, of a substantial part of the Pseudo-Robertus Anglicus commentary on Porphyry's Isagoge, together with a French translation; second, we examine the architectonics of this text and of parallel writings composed by contemporaries, namely Jean le Page, Nicolas de Paris and Robert Kilwardby; third and finally, through a comparative study drawing on the previously unedited documents that make up our corpus, we carry out a historico-philosophical study of the isagogic commentary of the Pseudo-Robertus Anglicus.
82

L'influence du stoïcisme sur le De Abstinentia de Porphyre

Gingras, Delphine 13 December 2024
The treatise On Abstinence was written by Porphyry to convince a friend to return to the practice of vegetarianism, which he had recently abandoned. In the text, Porphyry presents a series of anti-vegetarian arguments, which he refutes while defending the relevance of this way of life for the philosopher. Among the opponents, the Stoics occupy an important place: the third book of the treatise is devoted almost entirely to them. In refuting the Stoics' anti-vegetarian arguments, Porphyry develops his own positions in a vocabulary borrowed from them, and in doing so gives his treatise a Stoic flavour. This dissertation analyzes how the dialogue between Porphyry and the Stoics influences the author of the De abstinentia. The anti-vegetarian argument attributed to the Stoics holds that humans cannot be asked to spare the lives of animals, since animals lack reason and are therefore not akin (oikeion) to us. Because, in Stoic theory, justice is rooted in the relations of familiarity that bind rational beings to one another, killing animals in order to eat their flesh cannot be considered unjust, let alone impious, as Porphyry argues it is. One chapter is dedicated to each of the three terms of this debate: oikeiôsis, justice and logos. These three notions deepen the analysis of Porphyry's disagreement with the Stoics and show how this Neoplatonist reappropriates Stoic vocabulary to push it toward conclusions consistent with his own metaphysics. Behind the question of vegetarianism, it is the more complex theme of the way of life that animates the debate.
83

The influence of some ancient philosophical and religious traditions on the soteriology of early Christianity

Gibson, Jan Albert 31 August 2002
When reading the Bible in an independent way, i.e., not through the lens of any official Church dogma, one is amazed by the many voices that come through to us. Add to this variety the literary finds from Nag Hammadi, as well as the Dead Sea Scrolls, and the question now confronting many spiritual pilgrims is how it came about that these obviously diverse theologies, represented in the so-called Old and New Testaments, were moulded into only one "orthodox" result. In what way and to what degree were the many Christian groups different and distinctive from one another, as well as from other Jewish groups? Furthermore, what was the influence of other religions, Judaism, the Mysteries, the Gnostics and the Philosophers on the development, the variety of groups and ultimately on the consolidation of "orthodox" soteriology? / Systematic Theology and Theological Ethics / M.Th. (Systematic Theology)
84

Building a Data Mining Framework for Target Marketing

March, Nicolas 05 1900
Most retailers and scientists agree that supporting the buying decisions of individual customers or customer groups with specific product recommendations holds great promise. Target-oriented promotional campaigns are more profitable than uniform methods of sales promotion such as discount pricing campaigns, particularly if the promoted products are well matched to the preferences of the customers or customer groups. But how can retailers identify customer groups and determine which products to offer them? To answer this question, this dissertation describes an algorithmic procedure that identifies, in recorded transaction data, customer groups with similar preferences for specific product combinations. In addition, for each customer group it recommends products that promise higher sales through cross-selling if appropriate promotion techniques are applied. To illustrate the approach, an analysis is performed on the transaction database of a supermarket, and the identified customer groups are used in a simulation. The results show that promotional campaigns based on this algorithmic approach can increase profit by 15% to as much as 191% compared with uniform discounts on the purchase price of bestsellers. (author's abstract)
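The abstract does not spell out the algorithm itself; as a rough, hypothetical illustration of the underlying idea, mining frequent product pairs from transaction data and deriving cross-sell recommendations for customers who own only one half of a strong pair might be sketched like this (all product and customer names are invented):

```python
from collections import Counter, defaultdict
from itertools import combinations

def frequent_pairs(baskets, min_support):
    """Count product pairs that co-occur in at least min_support baskets."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

def cross_sell(customer_baskets, pairs):
    """Recommend the missing half of a frequent pair to each customer."""
    recs = defaultdict(set)
    for cust, baskets in customer_baskets.items():
        owned = set().union(*baskets)
        for a, b in pairs:
            if a in owned and b not in owned:
                recs[cust].add(b)
            elif b in owned and a not in owned:
                recs[cust].add(a)
    return dict(recs)

# Toy transaction log: customer id -> list of recorded baskets
log = {
    "c1": [["beer", "chips"], ["beer", "chips", "salsa"]],
    "c2": [["beer", "chips"], ["wine"]],
    "c3": [["beer"], ["bread"]],
}
pairs = frequent_pairs([b for bs in log.values() for b in bs], min_support=2)
recs = cross_sell(log, pairs)
print(pairs)  # {('beer', 'chips'): 3}
print(recs)   # {'c3': {'chips'}}
```

A real implementation would additionally score recommendations by expected profit lift, which is where the simulation described above comes in.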
85

Secure Virtualization of Latency-Constrained Systems

Lackorzynski, Adam 16 April 2015
Virtualization is a mature technology in server and desktop environments, where multiple systems are consolidated onto a single physical hardware platform, increasing the utilization of today's multi-core systems and saving resources such as energy, space and cost compared to multiple single systems. Embedded environments, by contrast, often contain many separate computing systems with real-time and isolation requirements; modern high-comfort cars, for example, use up to a hundred embedded computing systems. Consolidating such diverse configurations promises to save resources such as energy and weight. In my work I propose a secure software architecture that allows consolidating multiple embedded software systems with timing constraints. The architecture is based on a microkernel operating system that supports a variety of virtualization approaches through a generic interface, covering hardware-assisted virtualization and paravirtualization on multiple architectures. Studying guest systems with latency constraints under virtualization showed that standard techniques such as high-frequency time-slicing are not a viable approach. Generally, guest systems combine best-effort and real-time work and thus form a mixed-criticality system. Further analysis showed that such systems need to export relevant internal scheduling information to the hypervisor in order to support multiple guests with latency constraints. I propose a mechanism for exporting those relevant events that is secure, flexible, performs well and is easy to use. The thesis concludes with an evaluation covering the virtualization approach on the ARM and x86 architectures and two guest operating systems, Linux and FreeRTOS, as well as an evaluation of the export mechanism.
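The export mechanism itself is not detailed in the abstract; a toy scheduling sketch (hypothetical names, Python only for brevity) can illustrate why exported deadline events help a hypervisor: guests that announce a pending real-time deadline are picked earliest-deadline-first, while purely best-effort guests are round-robined:

```python
class Guest:
    def __init__(self, name):
        self.name = name
        self.next_deadline = None  # exported to the hypervisor; None = best effort

def pick_next(guests, rr_index):
    """Earliest-deadline-first among guests that exported a deadline,
    falling back to round-robin over the best-effort guests."""
    realtime = [g for g in guests if g.next_deadline is not None]
    if realtime:
        return min(realtime, key=lambda g: g.next_deadline), rr_index
    return guests[rr_index % len(guests)], rr_index + 1

guests = [Guest("linux"), Guest("freertos")]
guests[1].next_deadline = 5          # FreeRTOS exported a timer deadline
chosen, rr = pick_next(guests, 0)
print(chosen.name)                   # freertos
guests[1].next_deadline = None       # deadline served; back to best effort
chosen, rr = pick_next(guests, rr)
print(chosen.name)                   # linux
```

Without the exported `next_deadline`, the hypervisor would have to time-slice blindly at high frequency, which is exactly the approach the thesis found not viable.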
86

Speculation in Parallel and Distributed Event Processing Systems

Brito, Andrey 09 August 2010
Event stream processing (ESP) applications enable the real-time processing of continuous flows of data. Algorithmic trading, network monitoring, and processing data from sensor networks are good examples of applications that traditionally rely upon ESP systems. In addition, technological advances are resulting in an increasing number of network-enabled devices, producing information that can be automatically collected and processed. This increasing availability of online data motivates the development of new and more sophisticated applications that require low-latency processing of large volumes of data. ESP applications are composed of an acyclic graph of operators that is traversed by the data. Inside each operator, events can be transformed, aggregated, enriched, or filtered out. Some of these operations depend only on the current input events; such operations are called stateless. Other operations depend not only on the current event but also on state built up during the processing of previous events; such operations are therefore named stateful. As the number of ESP applications grows, they face increasingly strong requirements, which are often difficult to satisfy. In this dissertation, we address two challenges created by the use of stateful operations in an ESP application: (i) stateful operators can become bottlenecks because they are sensitive to the order of events and cannot be trivially parallelized by replication; and (ii) if failures are to be tolerated, the accumulated state of a stateful operator needs to be saved, and saving this state traditionally imposes considerable performance costs. Our approach is to evaluate the use of speculation to address these two issues. For the ordering and parallelization issues in a stateful operator, we propose a speculative approach that both reduces latency when the operator must wait for the correct ordering of events and improves throughput when the operation at hand is parallelizable. In addition, our approach does not require that users understand concurrent programming or consider out-of-order execution when writing the operations. For fault-tolerant applications, traditional approaches have imposed prohibitive performance costs due to pessimistic schemes. We extend such approaches, using speculation to mask the cost of fault tolerance.
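The dissertation's actual mechanisms are not given in the abstract; as a minimal, hypothetical illustration of the speculation idea, an order-sensitive operator can emit results optimistically and roll back to a consistent state when a late event arrives:

```python
class SpeculativeConcat:
    """Order-sensitive stateful operator: concatenates values in timestamp
    order. Results are emitted speculatively; a late (out-of-order) event
    triggers rollback and deterministic re-execution."""
    def __init__(self):
        self.events = []  # (timestamp, value)
        self.state = ""

    def on_event(self, ts, value):
        in_order = not self.events or ts >= self.events[-1][0]
        self.events.append((ts, value))
        if in_order:
            self.state += value                        # fast path: speculate
        else:
            self.events.sort()                         # rollback + replay
            self.state = "".join(v for _, v in self.events)
        return self.state

op = SpeculativeConcat()
print(op.on_event(1, "a"))  # a
print(op.on_event(3, "c"))  # ac   (speculative: event 2 may still arrive)
print(op.on_event(2, "b"))  # abc  (late event corrected by re-execution)
```

The payoff is that downstream consumers see low-latency speculative output in the common in-order case and pay the replay cost only when speculation turns out to be wrong.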
88

Efficiency Analysis of European Freight Villages-Three Peers for Benchmarking

Yang, Congcong, Taudes, Alfred, Dong, Guozhi January 2015
Measuring the performance of Freight Villages (FVs) has important implications for logistics companies and other related companies as well as governments. In this paper we apply Data Envelopment Analysis (DEA) to measure the performance of European FVs in a purely data-driven way, reflecting the nature of FVs as complex operations that use multiple inputs and produce several outputs. We employ several DEA models, perform a complete sensitivity analysis of the appropriateness of the chosen input and output variables, and assess the robustness of the efficiency scores. It turns out that about half of the 20 FVs analyzed are inefficient, with the utilization of the intermodal area, warehouse capacity and the level of goods handled being the most important areas for improvement. While we find no significant differences in efficiency between FVs of different sizes or in different countries, the FVs Eurocentre Toulouse, Interporto Quadrante Europa and GVZ Nürnberg constitute more than 90% of the benchmark share. / Series: Working Papers on Information Systems, Information Business and Operations
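In the general multi-input/multi-output case, DEA solves one linear program per unit; in the degenerate single-input, single-output case the CCR efficiency score reduces to each unit's output/input ratio relative to the best such ratio. A minimal sketch with invented freight-village figures (not data from the paper):

```python
def dea_ccr_single(inputs, outputs):
    """CCR efficiency for one input and one output: each unit's
    output/input ratio divided by the best observed ratio.
    (Multiple inputs/outputs require solving a linear program per unit.)"""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Invented data: input = terminal area, output = handled tonnage
area = [100, 80, 120, 60]
tonnage = [200, 200, 180, 90]
scores = dea_ccr_single(area, tonnage)
print([round(s, 2) for s in scores])  # [0.8, 1.0, 0.6, 0.6]
```

Units scoring 1.0 lie on the efficient frontier and serve as benchmark peers for the inefficient ones, which is the role the three named FVs play in the paper.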
89

Approximate Data Analytics Systems

Le Quoc, Do 22 March 2018
Today, most modern online services make use of big data analytics systems to extract useful information from raw digital data. The data normally arrives as a continuous stream, at high speed and in huge volumes, and the cost of handling this massive data can be significant. Providing interactive latency in processing the data is often impractical because the data is growing exponentially, even faster than Moore's law predicts. To overcome this problem, approximate computing has recently emerged as a promising solution. Approximate computing is based on the observation that many modern applications are amenable to an approximate, rather than an exact, output. Unlike traditional computing, approximate computing tolerates lower accuracy to achieve lower latency by computing over a partial subset instead of the entire input data. Unfortunately, the advancements in approximate computing are primarily geared towards batch analytics and cannot provide low-latency guarantees in the context of stream processing, where new data continuously arrives as an unbounded stream. In this thesis, we design and implement approximate computing techniques for processing and interacting with high-speed and large-scale stream data to achieve low latency and efficient utilization of resources. To achieve these goals, we have designed and built the following approximate data analytics systems:
• StreamApprox: a data stream analytics system for approximate computing. It supports approximate computing for low-latency stream analytics in a transparent way and can adapt to rapid fluctuations of input data streams. For this system, we designed an online adaptive stratified reservoir sampling algorithm that produces approximate output with bounded error.
• IncApprox: a data analytics system for incremental approximate computing. It combines approximate and incremental computing in stream processing to achieve high throughput and low latency with efficient resource utilization. For this system, we designed an online stratified sampling algorithm that uses self-adjusting computation to produce an incrementally updated approximate output with bounded error.
• PrivApprox: a data stream analytics system for privacy-preserving and approximate computing. It supports high-utility, low-latency data analytics while preserving users' privacy, based on the combination of privacy-preserving data analytics and approximate computing.
• ApproxJoin: an approximate distributed join system that improves the performance of joins, critical but expensive operations in big data systems. Here we employed a sketching technique (a Bloom filter) to avoid shuffling non-joinable data items through the network, and proposed a novel sampling mechanism that executes during the join to obtain an unbiased representative sample of the join output.
Our evaluation, based on micro-benchmarks and real-world case studies, shows that these systems achieve significant performance speedups compared to state-of-the-art systems while tolerating negligible accuracy loss in the analytics output. In addition, our systems allow users to systematically trade accuracy against throughput and latency, and require no or only minor modifications to existing applications.
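The online adaptive algorithm with bounded error is specific to the thesis; its basic building block, per-stratum reservoir sampling (Algorithm R run independently on each sub-stream), can be sketched as follows (event data is invented):

```python
import random

def stratified_reservoir(stream, key, k, rng):
    """Keep an independent size-k reservoir (Algorithm R) per stratum,
    so rare sub-streams are not drowned out by frequent ones."""
    reservoirs, counts = {}, {}
    for item in stream:
        s = key(item)
        counts[s] = counts.get(s, 0) + 1
        r = reservoirs.setdefault(s, [])
        i = counts[s] - 1          # 0-based position within this stratum
        if i < k:
            r.append(item)
        else:
            j = rng.randint(0, i)  # replace an entry with probability k/(i+1)
            if j < k:
                r[j] = item
    return reservoirs

rng = random.Random(42)
events = [("ad", i) for i in range(1000)] + [("click", i) for i in range(10)]
res = stratified_reservoir(events, key=lambda e: e[0], k=5, rng=rng)
print({s: len(r) for s, r in res.items()})  # {'ad': 5, 'click': 5}
```

A plain (non-stratified) reservoir of size 10 over this stream would almost certainly contain no "click" events at all; stratification guarantees the rare sub-stream is represented.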
90

Abnahmetestgetriebene Entwicklung von ereignisbasierten Anwendungen

Weiß, Johannes 16 June 2017
The amount of electronically analyzable information is growing steadily. Mobile phones with a wide range of sensors, social networks and the Internet of Things are examples of producers of potentially interesting and usable data. The field of event processing (EP) offers technologies and tools for processing incoming data, so-called events, in near real time. Patterns in the events can be detected, for example, and by generating derived events, other systems can react to this pattern detection. Time-based functionality can thus be realized, such as monitoring stock prices over a defined time window. In contrast to a message-oriented communication system, EP applications can implement domain-relevant application functionality, which makes the validation of these applications by domain experts all the more important. Acceptance Test Driven Development (ATDD) is an agile software development method that focuses on involving domain experts in the creation and evaluation of automatable test cases. Besides the potential to automate manual regression tests, the method offers a way to improve knowledge transfer between developers and domain experts. This thesis makes several contributions to the study of ATDD in EP application development. First, requirements for appropriate tool support were derived from the properties of EP applications and assigned to the product quality categories of functional suitability, modularity and usability. A systematic literature review analyzed approaches from the literature as well as the tool support offered by existing products, and made clear that the related solutions do not sufficiently meet the identified requirements. Motivated by this, a test description language and an executing, distributed test system were designed and formally described. The test description language offers commands for the product-independent specification of test cases; the test system makes it possible to execute these test cases against EP products. The approach was validated through selected case studies and a prototype implementation, showing that it exceeds the current state of the art in this application area with respect to functional suitability and modularity. Usability was examined more deeply in two user studies, which yielded first insights into the practical use of the test description language as well as questions for future work. The first study examined the comprehension of test cases, comparing the automatable test description language with a classical test description template; a significant effect in favor of the automatable language was found with respect to completion time. The second study considered the specification of test cases and likewise revealed advantages with respect to completion time.
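The thesis's test description language is not reproduced in the abstract; a hypothetical sketch of the idea, product-independent test commands executed against an adapter (here a trivial stub engine), might look like this:

```python
class StubEngine:
    """Stand-in for an EP product: derives a 'HighPrice' event whenever
    a 'Price' event exceeds a threshold."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.derived = []

    def send(self, event):
        if event["type"] == "Price" and event["value"] > self.threshold:
            self.derived.append({"type": "HighPrice", "value": event["value"]})

class TestRunner:
    """Product-independent test commands; an adapter maps them onto a
    concrete EP engine, keeping the test cases reusable across products."""
    def __init__(self, engine):
        self.engine = engine

    def send_event(self, type_, **fields):
        self.engine.send({"type": type_, **fields})

    def expect_event(self, type_):
        assert any(e["type"] == type_ for e in self.engine.derived), \
            f"expected derived event {type_}"

t = TestRunner(StubEngine(threshold=100))
t.send_event("Price", value=90)    # below threshold: nothing derived
t.send_event("Price", value=120)   # above threshold: HighPrice derived
t.expect_event("HighPrice")
print("test passed")
```

Because the test case only uses the generic `send_event`/`expect_event` commands, the same test could be executed against different EP products by swapping the adapter, which is the portability goal the thesis describes.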
