121

Key Components in a Networked Data Acquisition System

Corry, Diarmuid, October 2008
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / With the growing interest in networked data acquisition there has been a lot of focus on networked data acquisition systems. However, the requirements of a flight test instrumentation system go beyond networked DAUs. For example, an FTI network fabric has particular requirements for switches, time grandmasters, recorders, data servers and network terminals to the ground. This paper discusses these components, how they inter-operate in a single, fully networked system, and some FTI-oriented requirements for them. Where relevant, we discuss the results of some experiments with network latencies and packet losses, and discuss some enhancements that can contribute to improved efficiency for flight test programs.
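The latency and packet-loss experiments mentioned above boil down to simple statistics over timestamped, sequence-numbered packets. The sketch below is illustrative only: the field names (`seq`, `t_sent`, `t_recv`) and the analysis are assumptions, not the paper's actual instrumentation.

```python
# Hypothetical sketch: estimating packet loss and mean latency from a trace
# of sequence-numbered acquisition packets. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class Packet:
    seq: int        # monotonically increasing sequence number
    t_sent: float   # timestamp at the DAU (seconds)
    t_recv: float   # timestamp at the recorder or ground terminal (seconds)

def link_stats(packets):
    """Return (loss_ratio, mean_latency_s) for a trace sorted by seq."""
    if not packets:
        return 0.0, 0.0
    expected = packets[-1].seq - packets[0].seq + 1
    loss = 1.0 - len(packets) / expected
    mean_latency = sum(p.t_recv - p.t_sent for p in packets) / len(packets)
    return loss, mean_latency

# One packet (seq 2) is missing out of 4 expected.
trace = [Packet(0, 0.000, 0.002), Packet(1, 0.001, 0.003), Packet(3, 0.003, 0.006)]
loss, lat = link_stats(trace)
```

Computing loss from sequence-number gaps rather than counting retransmissions keeps the receiver stateless, which suits a one-way telemetry link.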
122

THE USE OF IEEE P1451.3 SMART SENSORS IN A DATA ACQUISITION NETWORK

Eccles, Lee H., October 2003
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the use of an IEEE P1451.3 smart sensor bus as part of a network-centric data acquisition system. IEEE P1451.3 provides for synchronized data acquisition from a number of transducers on a bus. The standard provides for Transducer Electronic Data Sheets (TEDS) that the manufacturer can use to describe the function and capabilities of the sensor module. The standard also provides for TEDS where the user can store information relevant to a particular application. The information in these TEDS can be used to generate much of the information required to process the data during or after a test. The use of this information to configure and operate a network-based data acquisition system is described.
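The manufacturer/user TEDS split described above can be sketched as follows. This is an invented, simplified record layout shown purely for illustration; it is not the IEEE 1451.3 binary TEDS format, and the field names and the `configure_channel` helper are assumptions.

```python
# Illustrative only: a toy TEDS pair used to auto-configure a channel.
# The layout is invented; real IEEE 1451.3 TEDS are binary-encoded.
MANUFACTURER_TEDS = {
    "manufacturer": "Acme Sensors",   # hypothetical vendor
    "model": "AX-100",
    "units": "m/s^2",
    "full_scale": 50.0,               # +/- range in engineering units
    "sensitivity": 0.04,              # volts per engineering unit
}

USER_TEDS = {
    "location": "left wing, station 42",
    "channel_name": "accel_lw_42",
}

def configure_channel(mfr_teds, user_teds, adc_full_scale_volts=5.0):
    """Derive a channel configuration from the TEDS contents."""
    return {
        "name": user_teds["channel_name"],
        # scale factor to convert volts back to engineering units
        "eu_per_volt": 1.0 / mfr_teds["sensitivity"],
        # check the sensor's output fits the ADC input range
        "range_ok": mfr_teds["full_scale"] * mfr_teds["sensitivity"]
                    <= adc_full_scale_volts,
    }

cfg = configure_channel(MANUFACTURER_TEDS, USER_TEDS)
```

The point the abstract makes is visible here: everything needed to configure and later process the channel comes from the TEDS, not from a hand-maintained setup file.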
123

Kan e-tjänster förenklas och bli mer motiverande med gamification och öppna data? : En kvalitativ studie / Can e-services be simplified and made more motivating with gamification and open data? : A qualitative study

Cederberg, Petter, January 2016
Ahead of the summer of 2015, 40% of all building permit applications submitted to Karlstad municipality were either incorrect or incomplete, which leads to long delays between submission and the decision made by the municipality's case officers. This bachelor's thesis investigates whether, and in what way, e-services can be simplified with the help of gamification and open data, using the building permit application as an example of a municipal e-service at Karlstad municipality. Gamification means applying game mechanics or game experiences in non-game contexts to increase users' motivation and engagement. Open data is data that was previously unavailable being made available for the public to use, reuse and redistribute with no restriction beyond attribution. The thesis was carried out with a literature review as its basis, together with qualitative interviews with people who work with gamification, open data or municipal e-services. The conclusions of the study show that e-services may need to become more self-explanatory and simpler, so that more citizens can manage to use them. By applying simple game mechanics suited to the context of an e-service, together with the information that can be obtained through open data, it is possible to simplify e-services and make them more motivating for the end user. Gamification can make e-services simpler and more motivating through, for example:
  • giving the user more feedback
  • a save function, so the user can store their progress
  • a progress indicator, so the user can see how far along the process they are
Open data can make e-services simpler and more motivating through, for example:
  • geographic data
  • cross-linking of different open data sets
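Two of the mechanisms listed above (a progress indicator and a save function) can be sketched in a few lines. This is a minimal illustration under invented assumptions: the step names and the `Application` class are hypothetical, not part of the thesis or of Karlstad municipality's e-service.

```python
# Minimal sketch of a multi-step e-service form with progress feedback
# and a resumable save snapshot. Step names are invented for illustration.
STEPS = ["applicant details", "property selection", "drawings upload", "review"]

class Application:
    def __init__(self):
        self.completed = set()

    def complete(self, step):
        if step in STEPS:
            self.completed.add(step)

    def progress(self):
        """Fraction of steps done -- the feedback a progress bar displays."""
        return len(self.completed) / len(STEPS)

    def save(self):
        """Snapshot of completed steps, so the user can resume later."""
        return sorted(self.completed)

app = Application()
app.complete("applicant details")
app.complete("property selection")
```

Even this much gives the user the two things the interviews pointed to: visible progress and no fear of losing work mid-application.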
124

AN INTEGRATED APPROACH TO DATA ACQUISITION AND DATA PROCESSING FOR INDUSTRIAL APPLICATIONS

Schmalz, Axel, October 1984
International Telemetering Conference Proceedings / October 22-25, 1984 / Riviera Hotel, Las Vegas, Nevada / The requirements for data acquisition systems grow rapidly with the progress of technology. Increasingly complex test instruments are becoming available. Integration of instruments and computers into an operational measurement system, however, becomes more difficult and expensive as requirements increase. A family of instruments was developed that can perform complex measurement tasks without a large integration effort, since it provides a large number of compatible hardware and software modules for the conditioning and conversion of signals into digital form, and for data storage, data transmission, and data pre-processing.
125

IN FLIGHT DATA REDUCTION FOR REDUCED BANDWIDTH TRANSMISSION

Hicks, William T., November 1995
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / The desire to acquire large quantities of high-speed vibration and acoustic data during aircraft testing is usually satisfied through on-board high-speed recording methods. However, there is often a need to have some of this data instantly available at the ground station for flight safety and other purposes. A Data Processor (DP) has been developed that allows an airborne data acquisition system to acquire large amounts of wideband analog data, process the data in real time, and derive reduced-bandwidth information from high-bandwidth channels. The reduced data can be inserted into a Pulse Code Modulation (PCM) stream and telemetered via a Radio Frequency (RF) link, with a potential for a 2000:1 reduction in bandwidth. This on-board processing capability also lends itself to additional tasks such as the generation of a reduced-bandwidth marker channel that can flag critical time periods of data activity. This flagging technique can be used to facilitate ground station analysis of specific segments of data, resulting in significant cost and time savings.
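The two ideas above (bandwidth reduction plus a marker channel) can be sketched together. The abstract does not specify the DP's actual processing; per-window RMS with a fixed threshold is one standard reduction, shown here under invented parameters purely for illustration.

```python
# Hedged sketch: collapse a wideband channel into per-window RMS values
# plus a one-bit "activity" marker channel. Window size and threshold
# are illustrative assumptions, not the paper's DP algorithm.
import math

def reduce_channel(samples, window=4, threshold=1.0):
    """Return (rms_values, marker_bits) for consecutive fixed-size windows."""
    rms, marker = [], []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        r = math.sqrt(sum(x * x for x in w) / window)
        rms.append(r)
        marker.append(1 if r > threshold else 0)  # flag active periods
    return rms, marker

# Quiet first window, high-activity second window.
rms, flags = reduce_channel([0.1, -0.1, 0.1, -0.1, 3.0, -3.0, 3.0, -3.0])
```

Each window of raw samples becomes one RMS value and one marker bit, which is where the large reduction ratio comes from; the marker bits alone are enough for the ground station to locate segments worth pulling from the on-board recording.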
126

Improving data quality : data consistency, deduplication, currency and accuracy

Yu, Wenyuan, January 2013
Data quality is one of the key problems in data management. An unprecedented amount of data has been accumulated and has become a valuable asset of an organization, and the value of the data relies greatly on its quality. However, data is often dirty in real life: it may be inconsistent, duplicated, stale, inaccurate or incomplete, which can reduce its usability and increase the cost of businesses. Consequently the need for improving data quality arises, which comprises five central issues: data consistency, data deduplication, data currency, data accuracy and information completeness. This thesis presents the results of our work on the first four issues: data consistency, deduplication, currency and accuracy.

The first part of the thesis investigates incremental verification of data consistency in distributed data. Given a distributed database D, a set S of conditional functional dependencies (CFDs), the set V of violations of the CFDs in D, and updates ΔD to D, the problem is to find, with minimum data shipment, the changes ΔV to V in response to ΔD. Although the problems are intractable, we show that they are bounded: there exist algorithms to detect errors such that their computational cost and data shipment are both linear in the size of ΔD and ΔV, independent of the size of the database D. Such incremental algorithms are provided for both vertically and horizontally partitioned data, and we show that the algorithms are optimal.

The second part of the thesis studies the interaction between record matching and data repairing. Record matching, the main technique underlying data deduplication, aims to identify tuples that refer to the same real-world object; repairing makes a database consistent by fixing errors in the data using constraints. These are treated as separate processes in most data cleaning systems, based on heuristic solutions. However, our studies show that repairing can effectively help us identify matches, and vice versa. To capture the interaction, we propose a uniform framework that seamlessly unifies repairing and matching operations to clean a database based on integrity constraints, matching rules and master data.

The third part of the thesis presents our study of finding certain fixes that are absolutely correct for data repairing. Data repairing methods based on integrity constraints are normally heuristic: they may not find certain fixes and, worse still, may even introduce new errors when attempting to repair the data. This is unacceptable for critical data such as medical records, where a seemingly minor error can have disastrous consequences. We propose a framework and an algorithm to find certain fixes, based on master data, a class of editing rules and user interactions. A prototype system is also developed.

The fourth part of the thesis introduces inferring data currency and consistency for conflict resolution, where data currency aims to identify the current values of entities, and conflict resolution combines tuples that pertain to the same real-world entity into a single tuple and resolves conflicts, which is also an important issue for data deduplication. We show that data currency and consistency help each other in resolving conflicts. We study a number of associated fundamental problems, and develop an approach to conflict resolution by inferring data currency and consistency.

The last part of the thesis reports our study of data accuracy, on the longstanding relative accuracy problem: given tuples t1 and t2 that refer to the same entity e, determine whether t1[A] is more accurate than t2[A], i.e., closer to the true value of the A attribute of e. We introduce a class of accuracy rules and an inference system with a chase procedure to deduce relative accuracy, and study the related fundamental problems. We also propose a framework and algorithms for inferring accurate values with user interaction.
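A conditional functional dependency of the kind the first part builds on can be made concrete with a toy example. The relation, the CFD ("if country = 'UK' then zip determines city") and the batch checker below are invented illustrations; the thesis's contribution is the incremental, distributed version of this check, which this sketch does not attempt.

```python
# Toy batch check for one conditional functional dependency (CFD):
# on rows where `condition` holds, the `lhs` attribute must determine `rhs`.
def cfd_violations(rows, condition, lhs, rhs):
    """Return pairs of row indices that violate the CFD."""
    seen = {}          # lhs value -> (first row index, its rhs value)
    violations = []
    for i, row in enumerate(rows):
        if not all(row.get(k) == v for k, v in condition.items()):
            continue   # the CFD's pattern condition does not apply
        key = row[lhs]
        if key in seen and seen[key][1] != row[rhs]:
            violations.append((seen[key][0], i))
        else:
            seen.setdefault(key, (i, row[rhs]))
    return violations

rows = [
    {"country": "UK", "zip": "EH8 9AB", "city": "Edinburgh"},
    {"country": "UK", "zip": "EH8 9AB", "city": "London"},    # violates the CFD
    {"country": "US", "zip": "EH8 9AB", "city": "Anywhere"},  # condition fails
]
bad = cfd_violations(rows, {"country": "UK"}, "zip", "city")
```

Rerunning this whole scan after every update is what the incremental algorithms avoid: their cost depends only on the sizes of the update ΔD and the change in violations ΔV, not on the database size.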
127

ACQUISITION AND TRANSMISSION OF SEISMIC DATA OVER PACKET RADIO

Hinterseer, Martin; Wegscheider, Christoph, October 2005
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Research on earthquakes and volcanoes is of particular importance in seismically unstable regions. To improve opportunities for research and civil defence, continuous coverage of the seismic activity of a large area at a given time is necessary. This paper deals with the acquisition and preprocessing of seismic and environmental data by a number of autonomous field stations, and its transmission to a central station over a cost-effective, low-bandwidth packet radio data network. A prototype system to be developed at the University of Salzburg, Austria, is discussed.
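The abstract does not say how the prototype's preprocessing works, but a standard way to fit continuous seismic data into a low-bandwidth link is an STA/LTA (short-term average over long-term average) trigger that transmits only event windows. The sketch below shows that standard technique purely for illustration, with invented window lengths.

```python
# Standard STA/LTA trigger, shown as a hedged illustration of
# event-triggered transmission; window lengths are invented.
def sta_lta(samples, short=2, long=6):
    """Return STA/LTA ratios; a high ratio suggests an event onset."""
    ratios = []
    for i in range(long, len(samples) + 1):
        sta = sum(abs(x) for x in samples[i - short:i]) / short
        lta = sum(abs(x) for x in samples[i - long:i]) / long
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Steady background noise followed by a sudden high-amplitude arrival.
quiet = [1.0] * 6
ratios = sta_lta(quiet + [8.0, 8.0])
```

A field station would transmit a window of raw samples only while the ratio exceeds a threshold, and otherwise send compact summaries, which is what makes a shared packet radio channel viable.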
128

The Implications for Network Recorder Design in a Networked Flight Test Instrumentation Data Acquisition System

Cranley, Nikki, October 2011
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / The higher bandwidth capacities available with the adoption of Ethernet technology for networked FTI data acquisition systems enable more data to be acquired. However, this puts increased demands on the network recorder, which must be able to sustain such data rates. During any given flight, the network recorder may log hundreds of gigabytes of data, which must be processed and analyzed in real time or post-flight. This paper describes several approaches that may be adopted to facilitate on-demand data mining and data reduction operations. In particular, it describes filtering and indexing techniques that may be adopted to address this challenge.
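The indexing idea above can be made concrete: while logging, the recorder keeps (timestamp, file offset) entries so post-flight tools can seek straight to a time window instead of scanning the whole log. The class below is a minimal illustration under invented assumptions, not the paper's recorder design.

```python
# Minimal sketch of a time index over a packet log file. A binary search
# over the timestamp list finds a time window without scanning the log.
import bisect

class RecorderIndex:
    def __init__(self):
        self.times = []     # packet timestamps, ascending
        self.offsets = []   # byte offset of each packet in the log file

    def add(self, timestamp, offset):
        """Called as each packet is written to the log."""
        self.times.append(timestamp)
        self.offsets.append(offset)

    def window(self, t_start, t_end):
        """Byte offsets of all packets with t_start <= t < t_end."""
        lo = bisect.bisect_left(self.times, t_start)
        hi = bisect.bisect_left(self.times, t_end)
        return self.offsets[lo:hi]

idx = RecorderIndex()
for i in range(10):                      # packets every 0.5 s, 1 KiB apart
    idx.add(timestamp=i * 0.5, offset=i * 1024)
hits = idx.window(1.0, 2.5)
```

With hundreds of gigabytes per flight, the win is that the index stays small (two numbers per packet, or per block of packets) while queries become logarithmic in the number of entries.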
129

The development and application of informatics-based systems for the analysis of the human transcriptome.

Kelso, Janet, January 2003
Despite the fact that the sequence of the human genome is now complete, it has become clear that the elucidation of the transcriptome is more complicated than previously expected. There is mounting evidence for unexpected and previously underestimated phenomena such as alternative splicing in the transcriptome. As a result, the identification of novel transcripts arising from the genome continues. Furthermore, as the volume of transcript data grows, it is becoming increasingly difficult to integrate expression information that comes from different sources, is stored in disparate locations, and is described using differing terminologies. Determining the function of translated transcripts also remains a complex task. Information about the expression profile, the location and timing of transcript expression, provides evidence that can be used to understand the role of the expressed transcript in the organ or tissue under study, or in the developmental pathways or disease phenotype observed.

In this dissertation I present novel computational approaches, with direct biological applications, to two distinct but increasingly important areas of gene expression research. The first addresses the detection and characterisation of alternatively spliced transcripts. The second is the construction of a hierarchical controlled vocabulary for gene expression data and the annotation of expression libraries with controlled terms from the hierarchies. In the final chapter, the biological questions that can be approached, and the discoveries that can be made using these systems, are illustrated with a view to demonstrating how the application of informatics can both enable and accelerate biological insight into the human transcriptome.
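The value of a hierarchical controlled vocabulary is that annotating a library with a specific term implicitly annotates it with every ancestor term, so queries at any level of the hierarchy find it. The sketch below illustrates that core idea only; the anatomy terms and data structure are invented, not the dissertation's vocabulary.

```python
# Toy hierarchical vocabulary: each term points to its parent term.
# Terms are invented for illustration.
PARENT = {
    "hippocampus": "brain",
    "brain": "nervous system",
    "nervous system": None,
}

def ancestors(term):
    """Return the term plus all of its ancestors, most specific first."""
    out = []
    while term is not None:
        out.append(term)
        term = PARENT.get(term)
    return out

def matches(library_term, query_term):
    """True if a library annotated with library_term satisfies query_term."""
    return query_term in ancestors(library_term)

hit = matches("hippocampus", "brain")   # a brain-level query finds it
```

Note the asymmetry: a library annotated "hippocampus" matches a query for "brain", but a library annotated only "brain" does not match a query for "hippocampus", since the more specific claim was never made.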
130

Data Management and Curation: Services and Resources

Kollen, Christine; Bell, Mary, 18 October 2016
Poster from University of Arizona 2016 IT Summit / Are you or the researchers you work with writing a grant proposal that requires a data management plan? Are you working on a research project and have questions about how to effectively and efficiently manage your research data? Are you interested in sharing your data with other researchers? We can help! For the past several years, the University of Arizona (UA) Libraries, in collaboration with the Office of Research and Discovery and the University Information Technology Services, has been providing data management services and resources to the campus. We are interested in tailoring our services and resources to what you need. We conducted a research data management survey in 2014 and are currently working on the Data Management and Data Curation and Publication (DMDC) pilot. This poster will describe what data management and curation services we are currently providing, and ask for your feedback on potential new data management services and resources.
