  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Design of a data acquisition system to control and monitor a velocity probe in a fluid flow field

Herwig, Nancy Lou January 1982 (has links)
A data acquisition system to control the position of a velocity probe and to acquire digital voltages as an indication of fluid velocity is presented. This system replaces a similar manually operated traverse system: it relieves the operator of control and acquisition tasks while providing a more consistent and systematic approach to the acquisition process. The design includes the TRS-80 microcomputer, with external interfacing accomplished using the STD-based bus design. / Master of Science
242

Real time data acquisition for load management

Ghosh, Sushmita 15 November 2013 (has links)
Demand for data transfer between computers has increased ever since the introduction of the personal computer (PC). Data communication on a PC is much more productive because the PC is an intelligent terminal: it can connect to various hosts over the same I/O hardware circuit as well as execute processes on its own as an isolated system. Yet the PC on its own is useless for data communication. It requires a hardware interface circuit and software for controlling the handshaking signals and setting up communication parameters. Often the data are distorted by noise on the line; such transmission errors are embedded in the data and require careful filtering. The thesis deals with the development of a data acquisition system that collects real-time load and weather data and stores them in a historical database for use by a load forecast algorithm in a load management system. A filtering technique has been developed that checks for transmission errors in the raw data. The microcomputers used in this development are the IBM PC/XT and the AT&T 3B2 supermicrocomputer. / Master of Science
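The abstract above describes filtering transmission errors out of raw load data but does not give the technique. As a hedged illustration only (the function name, thresholds, and sample values below are assumptions, not the thesis's actual method), one simple family of checks rejects readings that fall outside a plausible range or jump implausibly between consecutive samples:

```python
# Illustrative sketch of transmission-error filtering on raw load readings.
# Thresholds (lo, hi, max_jump) are made-up placeholders, not values from the thesis.

def filter_load_data(samples, lo=0.0, hi=5000.0, max_jump=500.0):
    """Return (clean, rejected) lists. A sample is rejected if it is out of
    the [lo, hi] range or differs from the previous accepted sample by more
    than max_jump (a crude spike/noise detector)."""
    clean, rejected = [], []
    prev = None
    for s in samples:
        if not (lo <= s <= hi) or (prev is not None and abs(s - prev) > max_jump):
            rejected.append(s)  # likely corrupted by line noise
        else:
            clean.append(s)
            prev = s
    return clean, rejected

# A noise spike (9999.0) and a negative reading (-3.0) are filtered out.
clean, bad = filter_load_data([1200.0, 1225.0, 9999.0, 1250.0, -3.0, 1260.0])
```

In a real acquisition pipeline such checks would sit alongside link-level error detection (parity, checksums); this sketch only shows the value-level screening idea.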
243

The Economic Cost of Privacy in Global Governance : The normative study of Association of Southeast Asian Nations (ASEAN) response to the mass data collection.

Nilsson Punthapong, Sheena January 2024 (has links)
A normative study of a regional organisation exercising governance, using global governance as a guiding theory. The Association of Southeast Asian Nations (ASEAN) is one of the largest regional organisations, often compared to the European Union (EU) in terms of efficacy, its non-legally-binding approach, and its non-conformity with Western liberal ideology. This thesis conducts a case study of ASEAN through the lens of interpretivist ontology and epistemology, using critical discourse analysis while taking into account the region's distinctive history, experience, and identity. The near-total reliance on technology that runs today's societal and political infrastructure has led many states and regions to develop their own privacy law or internet governance. The thesis analyses frameworks, publications, and dialogues among ASEAN member states as well as their dialogue partners. The texts are placed within the discursive practices through which ASEAN functions as a collective entity in international relations, where governance no longer requires an official body of government. ASEAN's long record of cooperation has always been motivated by economic prosperity. There is a notable and growing concern for privacy, and with it a need for data protection. ASEAN has displayed awareness of this, as well as a potential and gradual shift towards a mindset in which the digital footprint can move from a nascent norm to what other communities might take for granted: a universal right of the general public and a basic obligation of government.
244

Asset Management Data Collection for Supporting Decision Processes

Pantelias, Aristeidis 23 August 2005 (has links)
Transportation agencies engage in extensive data collection activities in order to support their decision processes at various levels. However, not all of the data collected supply transportation officials with useful information for efficient and effective decision-making. This thesis presents research aimed at formally identifying links between data collection and the decision processes it supports. The research identifies existing relationships between asset management data collection and the decision processes to be supported, particularly at the project-selection level, and proposes a framework for effective and efficient data collection. The motivation of the project was to help transportation agencies optimize their data collection processes and cut data collection and management costs. The methodology entailed two parts: a comprehensive literature review that collected information from academic and industrial sources around the world (mostly from Europe, Australia, and Canada) and a web survey e-mailed to expert individuals within the 50 U.S. Departments of Transportation (DOTs) and Puerto Rico. The electronic questionnaire was designed to capture state officials' experience and practice on: asset management endorsement and implementation; data collection, management, and integration; decision-making levels and decision processes; and identified relations between decision processes and data collection. The responses obtained from the web survey were analyzed statistically and combined with the literature findings in order to develop the proposed framework and recommendations. The results of this research are expected to help transportation agencies and organizations not only reduce data collection costs but also make more effective project-selection decisions. / Master of Science
245

Microscopic Control Delay Modeling at Signalized Arterials Using Bluetooth Technology

Rajasekhar, Lakshmi 10 January 2012 (has links)
Real-time control delay estimation is an important performance measure for any intersection: it allows signal timing plans to be improved dynamically in real time and hence improves overall system performance. Control delay estimates help determine the level-of-service (LOS) characteristics of the various approaches at an intersection and account for deceleration delay, stopped delay, and acceleration delay. Traffic delay calculation, and control delay calculation in particular, has always been complicated and laborious, since no low-cost, direct method has existed to obtain it in real time from the field. A recently validated technology, Bluetooth Media Access Control (MAC) ID matching, holds promise for continuous and cost-effective traffic data collection. Bluetooth traffic data, synchronized with vehicle trajectory plots generated from GPS probe vehicle runs, were used to develop control delay models that have the potential to predict control delays in real time based on Bluetooth detection-error parameters in the field. Incorporating control delay estimates into real-time traffic control management would result in significant improvement in overall system performance. / Master of Science
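The core of MAC-ID matching is simple enough to sketch: each roadside detector logs (MAC address, timestamp) pairs, a MAC seen at both an upstream and a downstream station yields a travel time, and control delay can be estimated as travel time minus free-flow time. The sketch below is illustrative only; the function, data shapes, and numbers are assumptions, not the thesis's models, which additionally account for detection-error parameters:

```python
# Hedged sketch of Bluetooth MAC-ID matching for control delay estimation.
# upstream/downstream map anonymised MAC IDs to detection times in seconds.

def control_delays(upstream, downstream, free_flow_s):
    """Return {mac: control_delay_s} for MACs detected at both stations.
    Control delay is approximated as observed travel time minus free-flow
    travel time, floored at zero."""
    delays = {}
    for mac, t_up in upstream.items():
        t_down = downstream.get(mac)
        if t_down is not None and t_down > t_up:  # plausible direction of travel
            delays[mac] = max(0.0, (t_down - t_up) - free_flow_s)
    return delays

# Only "aa:01" is matched at both stations: 45 s travel time, 30 s free flow.
d = control_delays({"aa:01": 10.0, "bb:02": 12.0},
                   {"aa:01": 55.0, "cc:03": 60.0},
                   free_flow_s=30.0)
```

Real deployments must also handle re-detections of the same MAC, missed detections, and the detection-zone uncertainty the abstract alludes to; this sketch omits all of that.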
246

Hidden labour: The skilful work of clinical audit data collection and its implications for secondary use of data via integrated health IT

McVey, Lynn, Alvarado, Natasha, Greenhalgh, J., Elshehaly, Mai, Gale, C.P., Lake, J., Ruddle, R.A., Dowding, D., Mamas, M., Feltbower, R., Randell, Rebecca 26 July 2021 (has links)
Secondary use of data via integrated health information technology is fundamental to many healthcare policies and processes worldwide. However, repurposing data can be problematic, and little research has been undertaken into the everyday practicalities of inter-system data sharing that helps explain why this is so, especially within (as opposed to between) organisations. In response, this article reports one of the most detailed empirical examinations undertaken to date of the work involved in repurposing healthcare data for National Clinical Audits. Methods: Fifty-four semi-structured, qualitative interviews were carried out with staff in five English National Health Service hospitals about their audit work, including 20 staff involved substantively with audit data collection. In addition, ethnographic observations took place on wards, in ‘back offices’ and meetings (102 hours). Findings were analysed thematically and synthesised in narratives. Results: Although data were available within hospital applications for secondary use in some audit fields, which could, in theory, have been auto-populated, in practice staff regularly negotiated multiple, unintegrated systems to generate audit records. This work was complex and skilful, and involved cross-checking and double data entry, often using paper forms, to assure data quality and inform quality improvements. Conclusions: If technology is to facilitate the secondary use of healthcare data, the skilled but largely hidden labour of those who collect and recontextualise those data must be recognised. Their detailed understandings of what it takes to produce high quality data in specific contexts should inform the further development of integrated systems within organisations.
247

Komunikační strategie podniku / Company Communication Strategy

Škrlantová, Hana January 2015 (has links)
The diploma thesis deals with the optimization of the marketing communication strategy of the hospital institution SurGal Clinic s.r.o. The goal of this thesis is to formulate a more effective strategy for marketing communication with the hospital's clients on the basis of the results obtained from the conducted analyses of the inner and outer environment and from the author's own marketing research. The suggested solutions were processed according to project management principles.
248

Data Acquisition and Processing Pipeline for E-Scooter Tracking Using 3D LIDAR and Multi-Camera Setup

Siddhant Srinath Betrabet (9708467) 07 January 2021 (has links)
Analyzing the behavior of objects on the road is a complex task that requires data from various sensors, fused to recreate the movement of objects with a high degree of accuracy. A data collection and processing system is thus needed to track objects accurately, in order to build a clear map of object trajectories relative to the coordinate frame(s) of interest. Detection and tracking of moving objects (DATMO) and simultaneous localization and mapping (SLAM) are the tasks that must be solved in conjunction to create a clear map of the road comprising the moving and static objects.

These computational problems are commonly solved to aid scenario reconstruction for the objects of interest. Tracking can be done in various ways, utilizing sensors such as monocular or stereo cameras, Light Detection and Ranging (LIDAR) sensors, and inertial navigation systems (INS). One relatively common method for solving DATMO and SLAM utilizes a 3D LIDAR and multiple monocular cameras in conjunction with an inertial measurement unit (IMU). This redundancy, exploited through sensor fusion, maintains object classification and tracking in cases where sensor-specific traditional algorithms prove ineffectual because a sensor runs into its limitations. The use of an IMU together with sensor-fusion methods largely eliminates the need for an expensive INS rig. Fusing these sensors allows each sensor to be used to its full potential while increasing perceptual accuracy.

The focus of this thesis is the dockless e-scooter, and the primary goal is to track its movements effectively and accurately with respect to cars on the road and the world. Since cars are far more commonly observed on the road than e-scooters, we propose a data collection system that can be built on top of an e-scooter, together with an offline processing pipeline, in order to collect data for understanding the behavior of the e-scooters themselves. In this thesis, we explore a data collection system involving a 3D LIDAR sensor, multiple monocular cameras, and an IMU mounted on an e-scooter, as well as an offline method for processing the collected data to aid scenario reconstruction.
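A basic building block of any such multi-sensor pipeline is expressing a LIDAR point in a camera's coordinate frame via a rigid-body (extrinsic) transform. The sketch below is a minimal illustration of that step under assumed, made-up calibration values; it is not code or calibration from the thesis:

```python
# Minimal sketch: apply a 4x4 homogeneous transform to express a LIDAR point
# in a camera frame. The rotation/translation values are placeholders, not
# calibration results from the thesis.

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists) to a
    3D point p, returning the transformed 3D point."""
    x, y, z = p
    v = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Hypothetical extrinsics: camera frame rotated 90 degrees about z relative
# to the LIDAR, and offset by 0.2 m along x.
T_cam_from_lidar = [
    [0.0, -1.0, 0.0, 0.2],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.0],
    [0.0,  0.0, 0.0, 1.0],
]
p_cam = transform_point(T_cam_from_lidar, (1.0, 2.0, 0.5))
```

In a full pipeline, chains of such transforms (LIDAR-to-IMU, IMU-to-world, world-to-camera) place every detection in a common frame before fusion; libraries like NumPy or ROS's tf2 would normally handle this, but the arithmetic is exactly what is shown here.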
249

The design of a low cost ad-hoc network for short distance data acquisition

Rossouw, Cornelius Marais 12 1900 (has links)
Thesis (MScEng (Electrical and Electronic Engineering))--Stellenbosch University, 2008. / In this thesis the design of a low-cost ad hoc network for short-distance data acquisition applications with low data arrival intervals will be presented. The focus is on cost reduction by replacing the traditional high-power radios with low-power RF transceivers. The conventional way of using multiple stationary repeater towers (depending on the network) is also replaced by an ad hoc configuration, in which each individual station also serves as a repeater station for adjacent stations. This approach reduces network design time enormously, seeing that the network is able to configure itself. By using this auto-routing multi-hop approach, data acquisition points are no longer restricted to the reception areas of base stations. A CSMA contention protocol is used for the data communication. Current models of this protocol depend on various assumptions. In the research reported in this thesis, a statistical study of the collision probability is performed and the results are used to expand the current CSMA models. Inter-dependent characteristics of this model are further enhanced to provide a more realistic model. A Simulink model of the particular CSMA protocol is also designed. Both the mathematical and the Simulink models provide relatively good predictions when compared to actual measured results.
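The collision-probability study mentioned above can be motivated with a toy contention model: n stations each transmit in a slot with probability p, and a collision occurs when two or more transmit at once. The Monte Carlo sketch below illustrates that idea only; the thesis's actual CSMA model (with carrier sensing and inter-dependent characteristics) is considerably more elaborate, and all parameters here are assumptions:

```python
# Rough Monte Carlo sketch of a slotted contention model: n stations, each
# transmitting independently with probability p per slot; a collision means
# two or more simultaneous transmissions. Illustrative only, not the thesis model.

import random

def collision_prob(n, p, trials=100_000, seed=42):
    """Estimate the per-slot collision probability by simulation."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(trials):
        transmitters = sum(rng.random() < p for _ in range(n))
        if transmitters >= 2:
            collisions += 1
    return collisions / trials

est = collision_prob(n=10, p=0.1)
# Analytic check for this toy model: P(collision) = 1 - P(0 tx) - P(1 tx)
exact = 1 - (1 - 0.1) ** 10 - 10 * 0.1 * (1 - 0.1) ** 9
```

Comparing the simulated estimate against the closed-form binomial expression is the same kind of sanity check the thesis performs between its statistical study and its analytical CSMA models, just on a far simpler model.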
250

Möjligheter med ett IoT-baserat system för automatisk datainsamling inom byggindustrin : En fallstudie hos JM AB

Sadat, Yasman, Wännerdahl, Kristoffer January 2017 (has links)
Technological developments have enabled a more connected society with easier access to information, creating opportunities for more advanced information management. This has been the basis for the development of the umbrella term Internet of Things, where automated data collection systems are significant for the collection of quantitative data sets. Companies in various industries are embracing the integration of this type of technology in their businesses to generate data that can be processed to improve their processes. The interest in, and especially the need for, this type of technology is great in the construction industry.

For construction companies that want to work with digital solutions, it is important to create an understanding of why this type of solution is to be used and how it can best be utilized. When implementing and adapting a new technology, it is necessary for companies to identify the opportunities and obstacles that may arise. Hence, the purpose of this study is to examine the feasibility of collecting real-time data through an automated collection process in the production phase at a construction company, and how it contributes to quality assurance and continuous improvement in construction projects. In this way, awareness among construction companies of a new technique can facilitate their work and prevent upcoming problems. A case study has been carried out at the construction company JM AB, which has long experience of construction processes and construction projects. To create an understanding of the study topic and conclusions, a theoretical framework was compiled, including theories about quality assurance and data collection. An empirical study was compiled from ten interviews with employees in different positions, two observations at one of the case company's construction projects, and one interview with an external company. In this way, insights were obtained into how the case company works with data collection processes and what techniques are available on the market. Through analysis of the case study, it clearly appeared that the case company does not have well-developed data collection processes in several areas of building production. One important conclusion of the study is that the use of a process innovation framework can create conditions for an efficient and automated data collection process. The framework supports assessing whether an automated data collection process is profitable, and it creates an understanding of the process prior to its implementation.

Another important conclusion is that an automated data collection process allows quality improvements with the help of a quality control framework that can increase quality by reducing problems and solving them faster. Quality controls thus become systematic, and, based on generated data, comparisons can be made to identify defects in order to work systematically toward faster corrective measures. Based on the findings, four recommendations were developed. The first is to use a process innovation framework when examining a new automated process. The second is to test automated data collection processes in smaller projects to obtain greater understanding. The third is to work systematically with a quality control framework based on generated data. The last recommendation is to use the collected data for fact-based decisions and to strive for continuous improvement.
