31

Natural User Interface Design using Multiple Displays for Courier Dispatch Operations.

Clifford, Rory January 2013 (has links)
This thesis explores how Natural User Interface (NUI) interaction and Multiple Display Technology (MDT) can be applied to an existing Freight Management System (FMS) to improve the command and control interface used by dispatch operators. Situational Awareness (SA) and Task Efficiency (TE) are identified as the main requirements for dispatchers. Based on studies of SA and TE in other time-critical occupations such as Emergency Medical Dispatch (EMD) and Air Traffic Control (ATC), a substitute dispatch display system was designed with a focus on courier driver and freight management and monitoring. This system aims to alleviate cognitive overheads without disrupting the flow of the existing CFMS by providing extended screen area matched with a natural input mechanism for command and control functionality. This Master's thesis investigates which of the commercial state-of-the-art interface tools is best suited to a wide Field-of-View (FOV) multiple-screen display, and aims to discern what practical impact a proposed NUI system would have on courier dispatching. To assess the efficacy of such a hypothetical system, the author developed an experimental prototype that combines three monitors in a multi-monitor system to create the overall display, accompanied by two traditional and two advanced NUI direct and indirect interaction techniques (mouse, trackpad, touch screen and gesture controller). Experiments using the prototype were conducted to determine the optimum control/display configuration based upon task effectiveness, bandwidth and overall user desirability of these methods in supporting the behavioural requirements of dispatch workstation task handling. The author uses the well-studied and robust Fitts' Law for measuring and analysing user behaviour with NUIs. Evaluation of the prototype finds that the multi-touch system paired with the multi-monitor system was the most responsive of the interaction techniques, direct or indirect. Based on these findings, employing such an interaction system is a viable option for deployment in FMSs. However, for optimal efficiency, the firmware that supports the interactivity dynamics should be redesigned so that it is optimised for touch interaction, allowing the multi-touch system to be used effectively as an affordance technology. Although the gesture interaction approach has a lot of potential as an alternative NUI device, gesture input performed worst of all conditions in this experimental setting, largely because of interface device limitations within the wide FOV display range of the multi-monitor system. Further design improvements and experimentation are proposed to alleviate this problem for both the gesture tracking and touchscreen modalities of interaction.
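For readers unfamiliar with the metric, the sketch below shows how Fitts' Law is typically used to compare pointing devices: each trial's index of difficulty is divided by its movement time to obtain a throughput in bits per second. It is not taken from the thesis; all distances, widths and movement times are hypothetical.

```python
# A minimal Fitts' Law sketch for comparing pointing devices.
# Trial data (distance, width, movement time) is hypothetical.
import math
import statistics

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(trials):
    """Mean of per-trial ID/MT ratios, in bits per second."""
    return statistics.mean(index_of_difficulty(d, w) / mt for d, w, mt in trials)

# Hypothetical trials: (distance px, target width px, movement time s)
mouse_trials = [(400, 40, 0.62), (800, 40, 0.81), (400, 20, 0.74)]
touch_trials = [(400, 40, 0.55), (800, 40, 0.70), (400, 20, 0.69)]

print(f"mouse throughput: {throughput(mouse_trials):.2f} bit/s")
print(f"touch throughput: {throughput(touch_trials):.2f} bit/s")
```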
32

Evaluation of Storage Systems for Big Data Analytics

January 2017 (has links)
abstract: Recent trends in big data storage systems show a shift from disk-centric models to memory-centric models. The primary challenges faced by these systems are speed, scalability, and fault tolerance. It is therefore interesting to investigate the performance of the two models with respect to big data applications. This thesis studies the performance of Ceph (a disk-centric model) and Alluxio (a memory-centric model) and evaluates whether a hybrid model provides any performance benefits for big data applications. To this end, an application, TechTalk, is created that uses Ceph to store data and Alluxio to perform data analytics. The functionalities of the application include offline lecture storage, live recording of classes, content analysis and reference generation. The knowledge base of videos is constructed by analyzing the offline data using machine learning techniques. This training dataset provides the knowledge needed to construct the index of an online stream, and the indexed metadata enables students to search, view and access the relevant content. The performance of the application is benchmarked in different use cases to demonstrate the benefits of the hybrid model. / Dissertation/Thesis / Masters Thesis Computer Science 2017
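As a rough illustration of the kind of comparison such a benchmark involves, the sketch below times a sequential read of the same lecture file from a disk-backed store and from a memory-centric cache. It is not the TechTalk code; the mount points and file name are hypothetical and assume CephFS and an Alluxio FUSE mount are exposed as ordinary directories on the benchmark host.

```python
# A minimal read-latency comparison sketch; paths are hypothetical mounts.
import time
from pathlib import Path

CEPH_MOUNT = Path("/mnt/cephfs/lectures")      # assumed CephFS mount (disk-backed)
ALLUXIO_MOUNT = Path("/mnt/alluxio/lectures")  # assumed Alluxio FUSE mount (memory tier)

def time_read(path: Path, chunk_size: int = 1 << 20) -> float:
    """Return seconds taken to stream the whole file."""
    start = time.perf_counter()
    with path.open("rb") as f:
        while f.read(chunk_size):
            pass
    return time.perf_counter() - start

video = "lecture01.mp4"  # hypothetical file name
cold = time_read(CEPH_MOUNT / video)
warm = time_read(ALLUXIO_MOUNT / video)
print(f"Ceph read: {cold:.2f}s, Alluxio read: {warm:.2f}s")
```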
33

Des organisations centrées processus aux organisations centrées connaissance : la cartographie de connaissances comme levier de transformation des organisations. Le cas de la démarche de "Transfert de Savoir-Faire" chez Total. / From process-centric to knowledge-centric organizations: knowledge maps for organization transformation. The case of a "Know-How Transfer" approach in Total.

Sellin, Kelly 17 May 2011 (has links)
Depuis une quinzaine d’années, des démarches de knowledge management sont mises en place dans les organisations pour optimiser l’utilisation et le partage des connaissances entre les individus. D’une logique essentiellement « outil » dans un premier temps, ces démarches ont progressivement évolué vers une logique davantage centrée sur les interactions entre ces individus et leurs connaissances : c’est l’Organisation 2.0, dite « centrée connaissance ». Cette recherche propose d’identifier des leviers de transformation des organisations centrées processus – qui associent la notion de facteur humain à celle de risque – en organisations centrées connaissance, en considérant cette dernière comme une opportunité de développement. Elle s’appuie pour cela sur le transfert de connaissances, au coeur du knowledge centricity, en introduisant des cartographies de connaissances individuelles pour faire émerger des réseaux d’acteurs. Réalisée sous forme d’une recherche-intervention chez Total, elle s’appuie sur le projet de « Transfert de Savoir-Faire » du Groupe. Cette recherche met en avant les forces des cartographies comme outils d’accompagnement au transfert de connaissances de par : leur capacité à identifier et à valoriser un patrimoine de connaissances à travers une approche centrée sur l’individu, leur capacité d’ouverture sur un réseau d’acteurs (et de connaissances), à créer une dynamique collective de partage. Enfin, elles constituent un outil de management intervenant notamment dans l’aide à la décision et la gestion des risques (associés à un management des connaissances), et sont une opportunité de dialogue entre les différents acteurs de l’organisation. La combinaison de ces résultats permet à cette recherche de contribuer aux travaux menés sur le knowledge centricity, via l’introduction des cartographies de connaissances comme levier de transformation par la voie du transfert de connaissances. Cette recherche montre également les limites de l’introduction de ces cartographies de connaissances dans des organisations centrées processus : une structure matricielle forte associée à des stratégies d’acteurs en environnement changeant sont autant de freins au développement d’une démarche collective et globale pouvant aboutir à une organisation centrée connaissance. / For several years, knowledge management approaches, mainly based on information systems, have been implemented to optimize the use and sharing of knowledge between individuals. More recently, this technical view has evolved towards a more human-centric approach that enhances the value created from human interactions: this is the Organization 2.0, called "knowledge centric". This research proposes to identify levers of organization transformation, from a process-centric view - which associates the human factor with the concept of risk - to a knowledge-centric organization, where knowledge appears as an opportunity. Thus, the focus is set on knowledge transfer, at the heart of knowledge centricity, by introducing individual knowledge maps into the organization in order to make actor networks emerge. This research is based on an intervention-research carried out in Total to deploy a know-how transfer approach.
The main results show the strength of combining knowledge maps and knowledge transfer for organizational transformation: the identification of the intellectual resources of an organization through a human-centric approach, the identification of actor (and knowledge) networks, the creation of a collective sharing dynamic, a management tool for decision support and risk management, and an opportunity for dialogue between the actors. Through these results, this research contributes to work on knowledge centricity by introducing individual knowledge maps as a means of promoting change through knowledge transfer. This research also shows the limits of this introduction in process-centric organizations: a strong matrix structure, associated with actors' strategies in a changing environment, hampers the development of the collective and global approach needed to reach knowledge centricity.
34

Abstractions to Support Dynamic Adaptation of Communication Frameworks for User-Centric Communication

Allen, Andrew A 29 March 2011 (has links)
The convergence of data, audio and video on IP networks is changing the way individuals, groups and organizations communicate. This diversity of communication media presents opportunities for creating synergistic collaborative communications. This form of collaborative communication is, however, not without its challenges. The increasing number of communication service providers, coupled with a combinatorial mix of offered services, varying Quality-of-Service and oscillating pricing, increases the complexity for users of managing and maintaining 'always best' priced or best-performing services. Consumers have to manually manage and adapt their communication in line with differences in services across devices, networks and media while ensuring that usage remains consistent with their intended goals. This dissertation proposes a novel user-centric approach to address this problem. The proposed approach aims to reduce this complexity for the user by (1) providing high-level abstractions and a policy-based methodology for the automated selection of communication services guided by high-level user policies, and (2) providing services through the seamless integration of multiple communication service providers, supported by an extensible framework for integrating additional providers. The approach was implemented in the Communication Virtual Machine (CVM), a model-driven technology for realizing communication applications. The CVM includes the Network Communication Broker (NCB), the layer responsible for providing a network-independent API to the upper layers of the CVM. The initial prototype of the NCB supported only a single communication framework, which limited the number, quality and types of services available. Experimental evaluation shows that the additional overhead of the approach is minimal compared to the individual communication service frameworks. Additionally, the proposed automated approach outperformed the individual communication service frameworks for cross-framework switching.
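The sketch below illustrates the flavour of policy-guided provider selection described above. It is not the CVM/NCB implementation; the provider attributes, policy fields and selection rule are hypothetical.

```python
# A minimal policy-based provider selection sketch; all data is hypothetical.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    price_per_min: float   # cost of the service
    quality: float         # 0..1 measured Quality-of-Service score
    supports_video: bool

def select_provider(providers, policy):
    """Pick the cheapest provider that satisfies the user's policy."""
    candidates = [
        p for p in providers
        if p.quality >= policy["min_quality"]
        and (p.supports_video or not policy["needs_video"])
    ]
    if not candidates:
        raise ValueError("no provider satisfies the policy")
    return min(candidates, key=lambda p: p.price_per_min)

providers = [
    Provider("ProviderA", 0.04, 0.9, True),
    Provider("ProviderB", 0.02, 0.7, False),
    Provider("ProviderC", 0.03, 0.8, True),
]
policy = {"min_quality": 0.75, "needs_video": True}
print(select_provider(providers, policy).name)  # -> ProviderC
```

A real broker would re-evaluate such a policy whenever prices or measured quality change, which is what enables the "always best" switching behaviour the abstract describes.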
35

Information retention for disaster-stricken networks using Content Centric Networking / Informationsbevarande för katastrofdrabbade nätverk genom Content Centric Networking

Andersson, Elias January 2017 (has links)
The underlying architecture of the Internet has remained mostly the same since its beginning in the 1960s, and the TCP/IP protocol stack remains ubiquitous. However, the Internet is today used for much wider purposes than originally intended, and its most common use is now the distribution of various forms of content. Information Centric Networking (ICN) is an alternative architecture responding to this change in usage, intended to be better prepared to handle the new requirements of the Internet, not only today but also in the future. The primary concern in ICN is the secure and efficient distribution of content. Current ICN research often concerns applications in various disaster scenarios, as ICN is believed to have properties that match the requirements of such scenarios. This thesis continues that research by developing a purpose-designed information retention solution using the existing ICN implementation Content Centric Networking (CCN). The aim is to maximise and prolong the availability of as much content as possible in disaster-stricken networks by preemptively replicating content across the network topology. The solution is then evaluated against a scenario set in a network topology consisting of virtual machines. The final result is that the solution performs satisfactorily and thus demonstrates the potential of ICN when applied to such scenarios. / Internets underliggande arkitektur har varit i stort sett oförändrad sedan sin begynnelse på 1960-talet, och TCP/IP-protokollstacken är fortsatt universell. Dock så används Internet idag för betydligt bredare ändamål än de ursprungliga syftena, och nu används Internet främst för att distribuera olika former av innehåll. Information Centric Networking (ICN) är en alternativ arkitektur som svarar på denna förändring i användning, avsedd att vara mer förberedd att hantera de nya kraven på Internet inte bara idag men också i framtiden. Den största angelägenheten i ICN är att distribuera innehåll på ett säkert och effektivt vis. Nuvarande forskning inom ICN handlar ofta om tillämpningar på olika sorters katastrofscenarier då tron är att ICN har egenskaper som motsvarar kraven hos sådana scenarier. I den här uppsatsen fortsätts denna forskning genom att en speciellt formgiven informationsbevaringslösning utvecklas, som nyttjar den existerande ICN-implementationen Content Centric Networking (CCN). Målet är att maximera och förlänga tillgängligheten av så mycket innehåll som möjligt i katastrofdrabbade nätverk genom att i förebyggande syfte replikera innehåll genom nätverkstopologin. Lösningen evalueras sedan mot ett scenario som utspelas i en nätverkstopologi utav virtuella maskiner. Det slutgiltiga resultatet är att lösningen presterar tillfredsställande och på så vis demonstrerar potentialen hos ICN vid tillämpning på sådana scenarion.
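As an illustration of preemptive replication, the sketch below greedily places content replicas on routers that are far, in hop count, from nodes that already hold a copy, so that some copy is likely to survive a partition. This is not the thesis implementation; the topology and replica count are hypothetical.

```python
# A minimal replica-placement sketch over a hypothetical router topology.
from collections import deque

topology = {  # hypothetical adjacency list of CCN routers
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D"],
}

def hops_from(source):
    """BFS hop distances from `source` to every reachable node."""
    dist, queue = {source: 0}, deque([source])
    while queue:
        node = queue.popleft()
        for nbr in topology[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def place_replicas(origin, k):
    """Greedy farthest-point placement of k replicas, starting at the origin."""
    placed = [origin]
    while len(placed) < k:
        dists = [hops_from(p) for p in placed]
        # pick the node whose nearest existing replica is farthest away
        best = max(
            (n for n in topology if n not in placed),
            key=lambda n: min(d.get(n, 0) for d in dists),
        )
        placed.append(best)
    return placed

print(place_replicas("A", 3))  # e.g. ['A', 'E', 'B']
```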
36

THE EFFECT OF NETWORK CENTRIC OPERATIONS IN TELEMETRY FOR AIR FORCE FLIGHT TEST AND EVALUATION

Santos, Eunice E., Jones, Charles H., Harris, Charles 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / We discuss how aspects of telemetry can be effectively incorporated and modeled as a component within network-centric operations and warfare paradigms. Telemetry is particularly vital in Air Force Flight Test and Evaluation; as such, this paper places specific emphasis on and provides discussion within this domain. We also present how an existing framework for network-centric operations and warfare can be particularly beneficial to telemetry modeling, and discuss the potential insights and utility within this context.
37

Integration Issues in Network-Based Flight Test Systems

Smith, Rachel, Newton, Todd, Moodie, Myron 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / The current paradigm for data acquisition and recording systems for flight test applications does not meet today's demands for high reliability and timing performance. Such systems are better served by a network-based approach that can provide the capacity needed to acquire, record, process, and telemeter data. As with any complex system, this approach has its own challenges. This paper describes the methods used to develop a network-centric flight test system, including simulators, IEEE 1588 time synchronization, and network message protocols, and addresses the integration issues involved, such as network topology and reliable latency-bounded throughput. Solutions used to overcome these integration issues in previous system designs are also presented.
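For context on the time-synchronization component mentioned above, the sketch below shows the standard IEEE 1588 (PTP) offset and path-delay calculation from the four Sync/Delay_Req timestamps. The timestamp values are hypothetical examples, and this is not code from the system described in the paper.

```python
# A minimal IEEE 1588 (PTP) offset/delay calculation; timestamps are hypothetical.
def ptp_offset_and_delay(t1, t2, t3, t4):
    """
    t1: master sends Sync          (master clock)
    t2: slave receives Sync        (slave clock)
    t3: slave sends Delay_Req      (slave clock)
    t4: master receives Delay_Req  (master clock)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay estimate
    return offset, delay

offset, delay = ptp_offset_and_delay(t1=100.000000, t2=100.000150,
                                     t3=100.000500, t4=100.000550)
print(f"offset: {offset*1e6:.1f} us, path delay: {delay*1e6:.1f} us")
```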
38

NET-CENTRIFYING THE GOULD TA6000 OSCILLOGRAPH

Guadiana, Juan, Benitez, Jesus, Tiqui, Dwight 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Migrating analog architectures and equipment to network architectures is underway all across the globe. There is no doubt that a modern instrument must fit the network environment or it simply will not be procured. Yet funding constraints temper wholesale changes to net-centric technologies. The last analog stronghold in our data center is the oscillograph. Over 50 Gould TA6000 oscillographs reside at White Sands Missile Range. These are digital implementations of analog recorders and hence require analog signaling. Digital telemetry data (the most common format) must be converted to analog to drive an oscillograph, which converts the analog back to digital to plot the data. The oscillograph's interface board may be "hacked" by removing the Analog-to-Digital Converter (ADC), gaining direct access to the digital signal path. This idea was worth attempting, as the prospect of replacing that many recorders with newer network-driven oscillographs is costly and hence remote. This paper's topic is the conversion of the hardware and a discussion of the software issues. Though not pretty, the conversion does preserve the large recorder investment for the time being. Issues with analog signaling, such as noise, drift and ground loops, are gone. A commercial Ethernet-to-digital adapter drives the new digital interface and transforms the recorder into a net-centric instrument.
39

NETGEN: A MODEL-DRIVEN TOOL FOR RAPID PROTOTYPING AND SIMULATION OF NETWORK-BASED FLIGHT TEST SYSTEMS

Price, Jeremy C., Moore, Michael S. 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / When network-centric flight test system components are developed concurrently, it is necessary to produce relevant simulated network traffic for exercising the network devices and other processing subsystems prior to system integration. Having an accurate and repeatable pattern of simulated network traffic is extremely important for debugging and subsystem integration. The simulated network traffic must be both representative of the real system and repeatable to aid test efforts. Our solution to this problem was to develop a model-driven network traffic generator – NETGEN. Using NETGEN to resolve errors, stress test, and verify requirements, we have achieved otherwise unattainable correctness, reliability, and success in our systems.
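The sketch below illustrates the repeatability requirement that motivates a tool like this: with a fixed random seed, every run emits an identical sequence of simulated UDP packets, so a test can be replayed exactly while debugging. It is not NETGEN; the address, port, rate and packet layout are hypothetical.

```python
# A minimal repeatable traffic-generation sketch; target and format are hypothetical.
import random
import socket
import time

def generate_traffic(host="127.0.0.1", port=9000, packets=100,
                     rate_hz=50, seed=1234):
    rng = random.Random(seed)                  # fixed seed -> identical runs
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 1.0 / rate_hz
    for seq in range(packets):
        size = rng.randint(64, 1400)           # repeatable payload length
        payload = seq.to_bytes(4, "big") + bytes(rng.getrandbits(8)
                                                 for _ in range(size))
        sock.sendto(payload, (host, port))
        time.sleep(interval)
    sock.close()

if __name__ == "__main__":
    generate_traffic()
```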
40

IMPACT OF NETWORKED DATA ACQUISITION SYSTEMS ON TRANSDUCERS

Eccles, Lee H. 10 1900 (has links)
International Telemetering Conference Proceedings / October 21, 2002 / Town & Country Hotel and Conference Center, San Diego, California / Transducers have traditionally been incorporated into data systems by connecting the transducer to a signal conditioner that is then connected to a multiplexer with an Analog-to-Digital Converter (ADC). The signal conditioning, multiplexer and ADC are usually included within the same assembly, called a Data Acquisition Unit (DAU) or an encoder. A network-centric data system allows the same architecture to be used if the interface to the encoder is changed to a network interface. However, a network-centric architecture allows other options as well. The signal conditioning and ADC can be included within the same package as the transducer, and the assembly can be interfaced to the network. When this is combined with the processing capability now available, a whole new range of possibilities presents itself. The transducer output can now be digitally processed to provide a linear output, converted to engineering units, digitally filtered, or have a host of other functions performed within the housing that contains the transducer. However, the network-centric approach does not produce these advantages without some disadvantages. The major problem that needs to be solved is how to time-stamp the data. With an encoder, we could time-stamp the PCM frame and determine from that information the time a sample was taken. Even in systems that convert the encoder to have a network interface, the time stamp needs to be affixed to the data in the encoder. With a network-centric approach, the sample can be taken in the transducer, and how to time-stamp it becomes a real problem. This problem must be considered at the system level: some method of making time available at a low enough level in the system to allow transducer outputs to be time-stamped is either a network issue or requires a separate interface.
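As a rough illustration of the smart-transducer idea described above, the sketch below converts a raw ADC count to engineering units and stamps it with time from a locally available clock before packing it into a network message. This is not code from the paper; the calibration constants, message layout and clock source are hypothetical, and a real device would need a clock disciplined by the network (the problem the paper raises) rather than the host's local time.

```python
# A minimal smart-transducer sample-message sketch; constants and layout are hypothetical.
import struct
import time

SCALE = 0.0125      # engineering units per ADC count (hypothetical calibration)
OFFSET = -25.0      # engineering-unit offset (hypothetical)

def sample_to_message(channel_id, raw_count):
    """Pack one time-stamped, engineering-unit sample into a network message."""
    value = raw_count * SCALE + OFFSET   # convert raw count to engineering units
    timestamp_ns = time.time_ns()        # stand-in for a network-disciplined clock
    # layout: channel (uint16), timestamp (uint64 ns), value (float64)
    return struct.pack("!HQd", channel_id, timestamp_ns, value)

msg = sample_to_message(channel_id=3, raw_count=2048)
channel, ts, value = struct.unpack("!HQd", msg)
print(channel, ts, round(value, 3))
```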
