1. Event Modeling in Social Media with Application to Disaster Damage Assessment

Liang, Yuan. 16 December 2013.
This thesis addresses the modeling of events in social media, with an emphasis on the detection, tracking, and analysis of disaster-related events like the 2011 Tohoku Earthquake in Japan. Successful event modeling is critical for many applications including information search, entity extraction, disaster assessment, and emergency monitoring. However, modeling events in social media is challenging since: (i) social media is noisy and oftentimes incomplete, in the sense that users provide only partial evidence of their participation in an event; (ii) messages in social media are usually short, providing little textual narrative (thereby making event detection difficult); and (iii) the size of short-lived events typically changes rapidly, growing and shrinking in sharp bursts. With these challenges in mind, this thesis proposes a framework for event modeling in social media and makes three major contributions.

The first contribution is a signal processing-inspired approach for event detection from social media. Concretely, this research proposes an iterative spatial-temporal event mining algorithm for identifying and extracting topics from social media. A key aspect of the proposed algorithm is viewing spatial-temporal term occurrences as signals, analyzing the noise contained in those signals, and applying noise filters to improve the quality of event extraction from these signals.

The second contribution is a new model of the population dynamics of event-related crowds in social media as they first form, evolve, and eventually dissolve. Toward robust population modeling, a duration model is proposed to predict the time users spend in a particular crowd, and a time-evolving population model is then designed to estimate the number of people departing a crowd, which enables prediction of the total population remaining in the crowd.

The third contribution is a set of event analytics methods for leveraging social media in an earthquake damage assessment scenario. First, the difference between text tweets and image tweets is investigated, and then three features – tweet density, re-tweet density, and user tweeting count – are extracted to model the intensity attenuation of earthquakes. The observed relationship between social media activity and loss/damage attenuation suggests that social media following a catastrophic event can provide rapid insight into the extent of damage.
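The signal-processing view in the first contribution can be sketched in a few lines. This is only an assumed illustration, not the thesis' algorithm: hourly counts of an event-related term in one spatial cell are treated as a discrete signal, smoothed with a moving-average (low-pass) filter to suppress noise, and thresholded to flag burst periods. All names and numbers below are hypothetical.

```python
import numpy as np

def moving_average_filter(signal: np.ndarray, window: int = 3) -> np.ndarray:
    """Smooth a noisy term-occurrence signal with a simple low-pass filter."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def detect_bursts(signal: np.ndarray, num_std: float = 2.0) -> np.ndarray:
    """Return indices of time bins whose filtered value exceeds mean + num_std * std."""
    filtered = moving_average_filter(signal)
    threshold = filtered.mean() + num_std * filtered.std()
    return np.flatnonzero(filtered > threshold)

# Hypothetical hourly counts of the term "earthquake" in one spatial grid cell.
counts = np.array([2, 3, 1, 2, 4, 3, 40, 55, 48, 12, 5, 3, 2], dtype=float)
print(detect_bursts(counts))  # -> [7], the centre of the burst
```

The thesis' iterative spatial-temporal mining goes well beyond this, but the sketch shows the basic step of filtering a noisy term-occurrence signal before extracting events from it.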
2. TELEMETRY SYSTEMS SUSTAINMENT

Trimble, Michael L.; Wells, John E.; Wurth, Timothy J. October 2007.
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada

Tactical training ranges provide an opportunity for all of the armed forces to assess operational readiness. To perform this task, the various training ranges have deployed numerous telemetry systems. The current design efforts to upgrade capabilities and unify the ranges under one telemetry system do not address the ranges' need to maintain their training capability with the legacy systems already deployed until the new systems are ready. Two systems that have recently undergone sustainment efforts are the Player and Event Tracking System (TAPETS) and the Large Area Tracking Range (LATR).

TAPETS is a telemetry system operated by the U.S. Army Operational Test Command. The TAPETS system comprises the ground mobile station Standard Range Unit (SRU) and the aircraft Inertial Global Positioning System (GPS) Integration (IGI) Pod. Both require a transponder for the wireless communications link. LATR is an over-the-horizon telemetry system operated by the U.S. Navy at various test ranges to track ground-based, ship-based, and airborne participants in training exercises. The LATR system comprises Rotary Wing (RW), Fixed Wing (FW) Pods, Fixed Wing Internal (FWI), Ship, and Ground Participant Instrumentation Packages (PIPs), as well as a Ground Interrogation Station (GIS) and relay stations. Like the TAPETS system, each of these packages and stations also requires a transponder for the wireless communications link.

Both telemetry systems have developed additional capabilities in order to better support and train the Armed Forces, which consequently requires more transponders. In addition, some ranges were experiencing failures in transponders that had been deployed for many years. The available spares for some systems had been depleted, and the sustainment requirements, along with the increased demand for assets, were beginning to impact the systems' ability to successfully monitor the training ranges during exercises.

The path to maintaining operational capability chosen for the TAPETS system was a mixed approach: identifying a depot-level repair facility for its transponders and funding the development of new transponder printed circuit boards (PCBs) where obsolescence prevented a sufficient number of repairable units. In the case of LATR, the decision was made to create new transponders to take advantage of cost-effective, state-of-the-art RF design and manufacturing processes. The result of this effort is a new transponder that is operationally indistinguishable from the legacy transponder in all installation environments.

The purpose of this paper is to present two successful system sustainment efforts with different approaches, to serve as models for preserving the current level of training range capability until the next generation of telemetry systems is deployed. While the two programs illustrated here deal primarily with the transponder components of the systems, the same methods can be applied to other aspects of legacy telemetry system sustainment.
3. A generic predictive information system for resource planning and optimisation

Tavakoli, Siamak. January 2010.
The purpose of this research is to demonstrate the feasibility of creating a quick-response decision platform for middle management in industry. It builds on the strengths of current practice but, more importantly, aims for a leap forward in the theory and practice of Supervisory Control and Data Acquisition (SCADA) systems and Discrete Event Simulation and Modelling (DESM). The proposed research platform uses real-time data and provides an automatic platform for real-time and predictive system analysis, giving current and ahead-of-time information on the performance of the system in an efficient manner.

Data acquisition, as the back-end connection between the data integration system and the shop floor, faces both hardware and software challenges in coping with large-scale real-time data collection. The limited scope of SCADA systems makes them unsuitable candidates for this, and the cost, complexity, and narrow efficiency focus of proprietary solutions pose further challenges. A Flexible Data Input Layer Architecture (FDILA) is therefore proposed as a generic data integration platform so that a multitude of data sources can be connected to the data processing unit. The efficiency of the proposed integration architecture lies in decentralising and distributing services between different layers.

A novel Sensitivity Analysis (SA) method called EvenTracker is proposed as an effective tool to measure the importance and priority of inputs to the system. The EvenTracker method is introduced to deal with complex systems in real time. The approach takes advantage of an event-based definition of the data involved in the process flow. The underpinning logic behind the EvenTracker SA method is capturing the cause-effect relationships between triggers (input variables) and events (output variables) over a period of time specified by an expert. The approach does not require estimating data distributions of any kind, nor does the performance model require execution beyond real time. The proposed EvenTracker sensitivity analysis method has the lowest computational complexity compared with other popular sensitivity analysis methods.

For proof of concept, a three-tier data integration system was designed and developed using National Instruments' LabVIEW programming language, Rockwell Automation's Arena simulation and modelling software, and OPC data communication software. A laboratory-based conveyor system with 29 sensors was installed to simulate a typical shop floor production line. In addition, the EvenTracker SA method was applied to data extracted from 28 sensors of one manufacturing line in a real factory. The experiment found 14% of the input variables to be unimportant for evaluating the model outputs, and the method achieved a time-efficiency gain of 52% in the analysis of the filtered system once the unimportant input variables were no longer sampled. Compared with the entropy-based SA technique, the only other method suitable for real-time use, the EvenTracker SA method is quicker, more accurate, and less computationally burdensome. Additionally, a theoretical estimation of the computational complexity of SA methods, based on both structural complexity and energy-time analysis, came out in favour of the proposed EvenTracker SA method. Both the laboratory and factory-based experiments demonstrated the flexibility and efficiency of the proposed solution.
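As a rough illustration of the event-based idea behind EvenTracker (not the thesis implementation; the data, names, and scoring rule below are assumptions), one could count, for each input variable, how often one of its trigger events is followed within an expert-chosen time window by an output event, and rank inputs by that count; inputs with low counts become candidates for no longer being sampled.

```python
def rank_triggers(trigger_events, output_events, window=5.0):
    """Rank input variables by how often one of their trigger events is
    followed, within `window` seconds, by an output event.

    trigger_events: dict mapping input-variable name -> list of timestamps
    output_events:  list of output-event timestamps
    """
    scores = {name: 0 for name in trigger_events}
    for name, times in trigger_events.items():
        for t in times:
            if any(t < o <= t + window for o in output_events):
                scores[name] += 1
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical event streams from two sensors on a conveyor line.
triggers = {"sensor_entry": [0.0, 10.0, 20.0], "sensor_idle": [3.0, 33.0]}
outputs = [2.5, 12.0, 22.1]
print(rank_triggers(triggers, outputs))  # [('sensor_entry', 3), ('sensor_idle', 0)]
```

Variables that rarely precede an output event within the window would be the kind the thesis found could be dropped from sampling with little loss of model fidelity.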
4. Event Episode Discovery from Document Sequences: A Temporal-based Approach

Chiang, Yu-Sheng. 7 September 2005.
Recent advances in information and networking technologies have contributed significantly to global connectivity and greatly facilitated and fostered information creation, distribution, and access. The resultant ever-increasing volume of online textual documents creates an urgent need for new text mining techniques that can intelligently and automatically extract implicit and potentially useful knowledge from these documents for decision support. This research focuses on identifying and discovering event episodes together with their temporal relationships that occur frequently (referred to as evolution patterns in this study) in sequences of documents. The discovery of such evolution patterns can be applied in such domains as knowledge management and used to facilitate existing document management and retrieval techniques (e.g., event tracking). Specifically, we propose and design an evolution pattern (EP) discovery technique for mining evolution patterns from sequences of documents. We experimentally evaluate our proposed EP technique in the context of facilitating event tracking. Measured by miss and false alarm rates, the evolution-pattern supported event-tracking (EPET) technique exhibits better tracking effectiveness than a traditional event-tracking technique. The encouraging performance of the EPET technique demonstrates the potential usefulness of evolution patterns in supporting event tracking and suggests that the proposed EP technique could effectively discover event episodes and evolution patterns in sequences of documents.
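Miss and false-alarm rates are the standard event-tracking measures mentioned here. A small sketch (with hypothetical labels, not the EPET evaluation code) shows how the two rates are computed from on-topic ground truth and a tracker's binary decisions.

```python
def miss_and_false_alarm(ground_truth, predictions):
    """Compute (miss rate, false-alarm rate) for binary on-topic decisions.

    ground_truth: list of bools, True if the document is actually on-topic
    predictions:  list of bools, True if the tracker flagged it as on-topic
    """
    misses = sum(1 for gt, p in zip(ground_truth, predictions) if gt and not p)
    false_alarms = sum(1 for gt, p in zip(ground_truth, predictions) if not gt and p)
    on_topic = sum(ground_truth)
    off_topic = len(ground_truth) - on_topic
    miss_rate = misses / on_topic if on_topic else 0.0
    false_alarm_rate = false_alarms / off_topic if off_topic else 0.0
    return miss_rate, false_alarm_rate

# Hypothetical tracking decisions over eight documents.
truth = [True, True, False, False, True, False, True, False]
decisions = [True, False, False, True, True, False, True, False]
print(miss_and_false_alarm(truth, decisions))  # (0.25, 0.25)
```

Lower values of both rates correspond to the better tracking effectiveness reported for the EPET technique.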
5. LOGGNING AV INTERAKTION MED DATAINSAMLINGSMETODER FÖR WEBBEVENTLOGGNINGSVERKTYG : Experiment om påverkan i svarstider vid loggning av interaktionsdata / LOGGING OF INTERACTION WITH DATA COLLECTION METHODS FOR WEB EVENT LOGGING TOOLS : An experiment on the effect on response times when logging interaction data

Henriksson, William. January 2018.
This study investigates the possible impact of web event logging tools for automated usability testing of user interaction. In an experiment, response times are measured while recorded interaction from the test participants is replayed on the web application under test by web event logging tools using different data collection methods. The experiment consists of four groups built around three logging tools that were implemented according to the sub-goals that were set up. The implementation of the web event logging tools is inspired by the study's pre-study, and in their numbered order they log progressively more of the user's interaction, leading to an increasing amount of logged data in bytes. The study's results supported the hypothesis: the response time of the web application when a user interacts with the page did not increase noticeably, nor was there a statistically significant difference when logging was performed compared with the current website.
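The significance claim in the result can be sketched as a two-sample comparison of response times, for example with an independent-samples t-test; the numbers below are hypothetical, and the study's actual statistical procedure and tooling may differ.

```python
from scipy import stats

# Hypothetical response times in milliseconds: the current website versus the
# same pages with a web event logging tool active.
baseline_ms = [112, 108, 120, 115, 109, 118, 111, 114]
with_logging_ms = [113, 110, 121, 117, 112, 119, 113, 115]

t_stat, p_value = stats.ttest_ind(baseline_ms, with_logging_ms)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A p-value above the chosen significance level (e.g. 0.05) would be consistent
# with the study's finding of no statistically significant difference.
```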
