1

Optimized Nested Complex Event Processing Using Continuous Caching

Ray, Medhabi 12 October 2011 (has links)
"Complex Event Processing (CEP) has become increasingly important for tracking and monitoring anomalies and trends in event streams emitted from business processes such as supply chain management to online stores in e-commerce. These monitoring applications submit complex event queries to track sequences of events that match a given pattern. While the state-of-the-art CEP systems mostly focus on the execution of flat sequence queries, we instead support the execution of nested CEP queries specified by the (NEsted Event Language) NEEL. However the iterative execution often results in the repeated recomputation of similar or even identical results for nested sub- expressions as the window slides over the event stream. This work proposes to optimize NEEL execution performance by caching intermediate results. In particular a method of applying selective caching of intermediate results called Continuous Sliding Caching technique has been designed. Then a further optimization of the previous technique which we call the Semantic Caching and the Continuous Semantic Caching have been proposed. Techniques for incrementally loading, purging and exploiting the cache content are described. Our experimental study using real- world stock trades evaluates the performance of our proposed caching strategies for different query types."
2

Investigating the Role of Post-Event Processing in the Maintenance of Social Anxiety Symptoms

Kane, Leanne 03 August 2022 (has links)
Individuals experience varying levels of anxiety in social situations. When intense and enduring, this anxiety can lead to difficulties in daily functioning. Considering the often-central roles that relationships and interactions play in people's lives, it becomes crucial to understand how unhelpful levels of social anxiety are maintained over time. According to cognitive theories of social anxiety disorder, post-event processing (PEP; e.g., the review of the negative aspects of past social situations) is one of the factors that can perpetuate symptoms of social anxiety. The objective of this dissertation was to investigate PEP and its relationship with other important cognitive and affective factors across two studies to better understand its role in social anxiety. In Study 1, I assessed the temporal links between PEP, anticipatory processing (AP), anxiety, performance appraisals, and memory. Participants (n = 101) completed two speeches, four days apart. In between the two speeches, they answered ecological momentary assessment alerts to measure PEP about the first speech and AP about the second speech. I found that both PEP and AP decreased over the two-day assessment period. Feeling more anxious during the first speech also triggered a cascade of negative thinking and affect, including worse performance appraisals, increased PEP and AP, and higher anxiety levels in anticipation of the second speech. Contrary to expectations, PEP was unrelated to change in performance appraisals over time. There was also preliminary evidence that PEP might be linked to the phenomenological memory qualities of the first speech, namely its valence and emotional intensity. In Study 2, I extended these findings by examining positive PEP and pleasant social interactions in addition to the typically studied negative PEP and stressful social interactions. Participants (n = 411) brought back to mind a recent stressful or pleasant social interaction, completed self-report measures, and wrote a description of the recalled interaction. Participants who recalled a stressful interaction reported engaging in more negative PEP, and less positive PEP, compared to those who recalled a pleasant interaction. I also observed that higher social anxiety was linked with more negative and less positive PEP irrespective of whether the PEP followed a stressful or a pleasant interaction. Moreover, participants' descriptions of the interactions contained more negative words when they also reported having engaged in more negative PEP. Negative PEP was also associated with a more negative and emotionally intense self-reported memory of the interaction. In addition, descriptions contained more positive and less negative words when participants reported engaging in more positive PEP. Positive PEP's relationship with memory depended on whether the interaction was stressful or pleasant. For the former, positive PEP was related to a more positive memory; for the latter, it was related to increased emotional intensity. Both studies help elucidate the complex nature of PEP. Their conclusions have many theoretical and clinical implications for the PEP and social anxiety field (e.g., how negative PEP evolves over time, how high social anxiety may be characterized by both more negative and less positive PEP). Considering methodological strengths and limitations provides additional questions and directions for future research examining negative and positive PEP.
3

Extending Event Sequence Processing: New Models and Optimization Techniques

Liu, Mo 25 April 2012 (has links)
Many modern applications, including online financial feeds, tag-based mass transit systems and RFID-based supply chain management systems, transmit real-time data streams. There is a need for event stream processing technology to analyze this vast amount of sequential data and enable online operational decision making. This dissertation focuses on innovating several techniques at the core of a scalable E-Analytic system to achieve efficient, scalable and robust methods for in-memory multi-dimensional nested pattern analysis over high-speed event streams. First, I address the problem of processing flat pattern queries on event streams with out-of-order data arrival. I design two alternative solutions: an aggressive and a conservative strategy. The aggressive strategy produces maximal output under the optimistic assumption that out-of-order event arrival is rare. The conservative method works under the assumption that out-of-order data may be common, and thus produces output only when its correctness can be guaranteed. Second, I design the integration of CEP and OLAP techniques (the E-Cube model) for efficient multi-dimensional event pattern analysis at different abstraction levels. Strategies of drill-down (refinement from abstract to specific patterns) and of roll-up (generalization from specific to abstract patterns) are developed for efficient workload evaluation. I design a cost-driven adaptive optimizer called Chase that exploits reuse strategies for optimal E-Cube hierarchy execution. Then, I explore novel optimization techniques to support the high-performance processing of powerful nested CEP patterns. A CEP query language called NEEL is designed to express nested CEP pattern queries composed of sequence, negation, AND and OR operators. To allow flexible execution ordering, I devise a normalization procedure that employs rewriting rules for flattening a nested complex event expression. To conserve CPU and memory consumption, I propose several strategies for efficient shared processing of groups of normalized NEEL subexpressions. Our comprehensive experimental studies, using both synthetic and real data streams, demonstrate the superiority of the proposed strategies over alternative methods in the literature in both effectiveness and efficiency.
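
Of the two out-of-order strategies mentioned above, the conservative one lends itself to a short sketch: events are buffered until a progress guarantee (e.g., a punctuation) certifies that nothing earlier can still arrive, and only then released in timestamp order. The buffering mechanism and names below are illustrative assumptions, not the dissertation's actual algorithm.

```python
import heapq

class ConservativeBuffer:
    """Sketch of the conservative strategy: hold back events until a
    progress guarantee certifies that no earlier event can still arrive,
    so that the emitted output is always correct."""

    def __init__(self):
        self.pending = []  # min-heap ordered by event timestamp

    def insert(self, ts, event):
        heapq.heappush(self.pending, (ts, event))

    def on_progress(self, guarantee_ts):
        # Release, in timestamp order, every event at or before the guarantee.
        released = []
        while self.pending and self.pending[0][0] <= guarantee_ts:
            released.append(heapq.heappop(self.pending)[1])
        return released


buf = ConservativeBuffer()
buf.insert(5, "a")
buf.insert(2, "b")          # arrives out of order
print(buf.on_progress(4))   # ['b'] -- only events known to be complete
print(buf.on_progress(6))   # ['a']
```
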
4

Distributed complex event detection for pervasive computing

O'Keeffe, Daniel Brendan January 2010 (has links)
No description available.
5

Event stream analytics

Poppe, Olga 05 January 2018 (has links)
Advances in hardware, software and communication networks have enabled applications to generate data at unprecedented volume and velocity. An important class of such data is event streams generated from financial transactions, health sensors, web logs, social media, mobile devices, and vehicles. The world is thus poised for a sea-change in time-critical applications, from financial fraud detection to health care analytics, empowered by inferring insights from event streams in real time. Event processing systems continuously evaluate massive workloads of Kleene queries to detect and aggregate event trends of interest. Examples of these trends include check kites in financial fraud detection, irregular heartbeats in health care analytics, and vehicle trajectories in traffic control. These trends can be of any length. Worse yet, their number may grow exponentially in the number of events. State-of-the-art systems do not offer practical solutions for trend analytics and thus suffer from long delays and high memory costs. In this dissertation, we propose the following event trend detection and aggregation techniques. First, we resolve the trade-off between CPU processing time and memory usage while computing event trends over high-rate event streams: our event trend detection approach guarantees minimal CPU processing time given limited memory. Second, we compute online event trend aggregation at multiple granularity levels, from fine (per matched event), to medium (per event type), to coarse (per pattern). Thus, we minimize the number of aggregates, reducing both time and space complexity compared to the state-of-the-art approaches. Third, we share intermediate aggregates among multiple event sequence queries while avoiding the expensive construction of matched event sequences. In several comprehensive experimental studies, we demonstrate the superiority of the proposed strategies over the state-of-the-art techniques with respect to latency, throughput, and memory costs.
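
A hedged sketch of the core aggregation idea, i.e. aggregating event trends without constructing them: the number of trends ending at an event is one plus the number of trends ending at any compatible earlier event, so a COUNT can be maintained without enumerating the (possibly exponentially many) trends. The compatibility predicate and the example data are assumptions for illustration, not the dissertation's implementation.

```python
def count_trends(events, compatible):
    """Count all (possibly exponentially many) event trends without
    materializing them: the number of trends ending at event e is one
    (e alone) plus the number of trends ending at any compatible
    earlier event that e can extend."""
    ending_at = []
    for i, e in enumerate(events):
        count = 1
        for j in range(i):
            if compatible(events[j], e):
                count += ending_at[j]
        ending_at.append(count)
    return sum(ending_at)


# Example: trends are subsequences of strictly increasing stock prices.
prices = [10, 12, 11, 13]
print(count_trends(prices, lambda a, b: a < b))  # 11 trends, never enumerated
```
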
6

High Performance Analytics in Complex Event Processing

Qi, Yingmei 02 January 2013 (has links)
Complex Event Processing (CEP) is the technology of choice for high-performance analytics in time-critical decision-making applications. Although current CEP systems support sequence pattern detection on continuous event streams, they do not support the computation of aggregated values over the matched sequences of a query pattern. Instead, aggregation is typically applied as a post-processing step after CEP pattern detection, leading to an extremely inefficient solution for sequence aggregation. Meanwhile, state-of-the-art aggregation techniques over traditional stream data are not directly applicable in the context of the sequence semantics of CEP. In this paper, we propose an approach, called A-Seq, that successfully pushes the aggregation computation into the sequence pattern detection process. A-Seq computes aggregation online by dynamically recording compact partial sequence aggregates without ever constructing the to-be-aggregated matched sequences. Techniques are devised to tackle all the key CEP-specific challenges for aggregation, including sliding window semantics, event purging, and sequence negation. For scalability, we further introduce the Chop-Connect methodology, which enables sequence aggregation sharing among queries with arbitrary substring relationships. Lastly, our cost-driven optimizer selects a shared execution plan for effectively processing a workload of CEP aggregation queries. Our experimental study using real data sets demonstrates over four orders of magnitude efficiency improvement, across a wide range of tested scenarios, of our proposed A-Seq approach compared to state-of-the-art solutions, thus achieving high-performance CEP aggregation analytics.
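
A minimal sketch of the idea of pushing aggregation into sequence detection, here for a COUNT over SEQ(A, B): every incoming B contributes one match per A seen so far, so a running A-count is the only state needed and no matched sequence is ever built. Window semantics, predicates, and negation, which A-Seq also handles, are omitted; this is not the paper's implementation.

```python
def count_seq_ab(stream):
    """Count matches of SEQ(A, B) online without materializing the
    matched pairs: each incoming B contributes one match per A seen
    so far, so a running A-count is all the state we need."""
    a_count = 0
    matches = 0
    for event_type in stream:
        if event_type == "A":
            a_count += 1
        elif event_type == "B":
            matches += a_count   # every earlier A pairs with this B
    return matches


print(count_seq_ab(["A", "A", "B", "A", "B"]))  # 2 + 3 = 5 matches
```
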
7

Fault management of web services

Alam, Sazedul 27 August 2009 (has links)
The use of service-oriented (SO) distributed systems is increasing. Within service orientation, web services (WS) are the de facto standard for implementing service-oriented systems. The consumers of WS want uninterrupted and reliable service from the service providers, but WS providers cannot always provide services at the expected level due to faults and failures in the system. As a result, fault management of these systems is becoming crucial. This work presents a distributed event-driven architecture for the fault management of web services. In this architecture, the managed WS report different events to event databases. From the event databases, these events are sent to event processors, which are distributed over the network. They process the events, detect fault scenarios in the event stream, and manage faults in the WS.
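
A small sketch of the kind of fault-scenario detection an event processor in this architecture might perform, here flagging repeated timeout events from the same service within a time window. The event fields, the threshold, and the window size are assumptions for illustration, not part of the thesis.

```python
from collections import defaultdict, deque

class FaultDetector:
    """Illustrative event processor: scans the event stream reported by
    managed web services and flags a fault scenario, here three or more
    'timeout' events from the same service within a 60-second window."""

    WINDOW = 60
    THRESHOLD = 3

    def __init__(self):
        self.timeouts = defaultdict(deque)  # service -> recent timeout timestamps

    def process(self, event):
        if event["type"] != "timeout":
            return None
        ts_list = self.timeouts[event["service"]]
        ts_list.append(event["ts"])
        # Purge timestamps that have slid out of the window.
        while ts_list and ts_list[0] < event["ts"] - self.WINDOW:
            ts_list.popleft()
        if len(ts_list) >= self.THRESHOLD:
            return f"fault: repeated timeouts on {event['service']}"
        return None


detector = FaultDetector()
for ts in (0, 20, 40):
    alert = detector.process({"type": "timeout", "service": "orders", "ts": ts})
print(alert)  # fault: repeated timeouts on orders
```
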
8

FAST flexible allocation for sensing tasks

Le, Thao P. January 2013 (has links)
The allocation of resources to tasks in a computationally efficient manner is a key problem in computer science. One important application domain for solutions to this class of problem is the allocation of sensing resources for environmental monitoring, surveillance, or similar sensing tasks. Within this domain, however, the complexity of the problem is compounded by a number of factors: new tasks may arrive at any time, resources may be shared between tasks under some conditions, tasks may be composed of inter-dependent sub-tasks, and tasks may compete for sensor resources. These factors, combined with the dynamic nature of the topology of sensor networks (e.g. sensors may move out of range or become damaged), mean that it is extremely difficult or impossible to find a solution using existing techniques. In this thesis, we propose an efficient, agent-based solution (FAST, for Flexible Allocation for Sensing Tasks) to this complex dynamic problem. The sensing resources in FAST can be static, mobile, or a mixture of both. In particular, each resource is managed by a task leader agent (i.e. the actual sensor that is closest to the task's central point). The problem is then modelled as a coordination problem in which the task agents employ a novel multi-round Knapsack-based algorithm (GAP-E) to obtain a solution. If there are dependencies between sub-tasks, such relationships are resolved prior to the actual allocation. At execution time, if an environment change affects a task's sensing-type requirements, the previously determined sensor types for that task are revised. When applicable, the agents cooperate by exchanging and sharing resources to maximise their profits. In addition, FAST addresses the situation where sensor resource sharing is not possible and there is no incentive for sensor resources to be exchanged. In such situations, an additional post-processing step, underpinned by a mechanism for exchanging resources through negotiation, is introduced. Through these mechanisms, agents may, in a decentralized manner, decide how to deliver on a sensing task given local conditions, and alleviate the impact of task arrival time on the quality of the global solution. Empirical evaluation shows that these steps significantly improve the number of sensing tasks that can be successfully completed, with only a minor impact on execution time.
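
A plain 0/1-knapsack sketch of the allocation step described above: choose the subset of candidate sensors whose combined cost fits the task's capacity and whose combined utility is maximal. GAP-E itself is multi-round and agent-based; the code below only illustrates the underlying knapsack formulation, with invented sensor data.

```python
def allocate_sensors(sensors, capacity):
    """0/1 knapsack over candidate sensors: maximize total sensing utility
    subject to the task's capacity budget. Returns (utility, chosen ids)."""
    best = {0: (0, [])}  # used cost -> (best utility, chosen sensor ids)
    for sensor_id, cost, utility in sensors:
        # Snapshot so each sensor is considered at most once per state.
        for used, (val, chosen) in list(best.items()):
            new_used = used + cost
            if new_used <= capacity and (new_used not in best
                                         or best[new_used][0] < val + utility):
                best[new_used] = (val + utility, chosen + [sensor_id])
    return max(best.values())


# Candidate sensors: (id, energy cost, expected sensing utility) -- invented values.
candidates = [("s1", 3, 5), ("s2", 2, 3), ("s3", 4, 6)]
print(allocate_sensors(candidates, capacity=6))  # (9, ['s2', 's3'])
```
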
9

CPR: Complex Pattern Ranking for Evaluating Top-k Pattern Queries over Event Streams

January 2011 (has links)
Most existing approaches to complex event processing over streaming data rely on the assumption that matches to the queries are rare and that the goal of the system is to identify these few matches within the incoming deluge of data. In many applications, however, such as stock market analysis and user credit card purchase pattern monitoring, the matches to user queries are in fact plentiful, and the system has to efficiently sift through these many matches to locate only the few most preferable ones. In this work, we propose a complex pattern ranking (CPR) framework for specifying top-k pattern queries over streaming data, present new algorithms to support top-k pattern queries in data streaming environments, and verify the effectiveness and efficiency of the proposed algorithms. The developed algorithms identify top-k matching results satisfying both the patterns and additional criteria. To support real-time processing of the data streams, instead of computing top-k results from scratch for each time window, we maintain the top-k results dynamically as new events arrive and old ones expire. We also develop new top-k join execution strategies that are able to adapt to changing conditions (e.g., sorted and random access costs, join rates) without having to assume a priori presence of data statistics. Experiments show significant improvements over existing approaches. / Dissertation/Thesis / M.S. Computer Science 2011
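
A hedged sketch of maintaining the top-k matches incrementally as the window slides: new matches are inserted, expired ones dropped, and the current top-k read off without re-evaluating the pattern from scratch. The scoring, expiry policy, and names are assumptions for illustration, not the CPR framework's actual algorithms.

```python
import heapq

class TopKMatches:
    """Keeps pattern matches live within a sliding window and reports the
    k highest-scoring ones without recomputing matches from scratch."""

    def __init__(self, k, window):
        self.k = k
        self.window = window
        self.live = {}  # match id -> (score, timestamp)

    def add(self, match_id, score, ts):
        self.live[match_id] = (score, ts)

    def expire(self, now):
        # Drop matches whose timestamp has slid out of the window.
        self.live = {m: (s, t) for m, (s, t) in self.live.items()
                     if t >= now - self.window}

    def top_k(self):
        return heapq.nlargest(self.k, self.live.items(), key=lambda kv: kv[1][0])


topk = TopKMatches(k=2, window=10)
topk.add("m1", 0.9, ts=1)
topk.add("m2", 0.5, ts=3)
topk.add("m3", 0.7, ts=12)
topk.expire(now=13)                   # m1 falls out of the window
print([m for m, _ in topk.top_k()])   # ['m3', 'm2']
```
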
