  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Power Quality and Unbalanced Conditions Assessment Based on Digital Fault Recorders

Huang, Huiying 22 January 2018 (has links)
With the rapid development of power systems, more and more smart devices are installed in the power industry, each gathering large volumes of data every day. Due to this data explosion and the difficulty of processing it, data visualization, a big data technology, has become a trend. With the help of information technology, visualization of real-time data has been achieved in the power industry, with successful examples such as one-line diagrams, load-flow dashboards and equipment dashboards. In fault analysis groups, digital fault recorders (DFRs) are essential to record and report an event. They are triggered when a fault occurs, and a corresponding report is generated instantly. However, the historical data from DFRs are seldom used to analyze power quality issues. Therefore, this thesis presents the development of a power quality dashboard using data collected from DFRs. Three related power quality analyses are accomplished: voltage and current variation, harmonics, and unbalanced components. A recursive algorithm is applied to compute the phasors and errors; the Discrete Fourier Transform is used to extract harmonics from the samples; and the symmetrical components are calculated by the "A"-matrix transformation. The start page of the dashboard is a Google map with markers for all the DFRs; double-clicking a marker opens the report page. With the reports, engineers can not only monitor an event but also analyze the possible causes and characteristics of a fault. For renewable energy substations, the harmonic content can be supervised, so that damage and losses can be significantly reduced by identifying high harmonics. Ultimately, the goal is to extend the dashboard with warning statuses and harmonic gradient mapping. / Master of Science
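The symmetrical-component step this abstract names can be sketched in a few lines. This is a minimal illustration, not the thesis's dashboard code; the function name and the balanced test phasors are assumptions:

```python
import cmath

A = cmath.exp(2j * cmath.pi / 3)  # the 120-degree rotation operator "a"

def symmetrical_components(va, vb, vc):
    """Return (zero, positive, negative) sequence phasors from three
    phase phasors via the inverse "A"-matrix transformation."""
    v0 = (va + vb + vc) / 3
    v1 = (va + A * vb + A * A * vc) / 3
    v2 = (va + A * A * vb + A * vc) / 3
    return v0, v1, v2

# A perfectly balanced three-phase set has only a positive-sequence part.
va = cmath.rect(1.0, 0.0)
vb = cmath.rect(1.0, -2 * cmath.pi / 3)
vc = cmath.rect(1.0, 2 * cmath.pi / 3)
v0, v1, v2 = symmetrical_components(va, vb, vc)
print(abs(v0), abs(v1), abs(v2))  # near 0, 1, 0
```

A nonzero zero- or negative-sequence magnitude then signals the unbalanced conditions the dashboard reports.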
22

Designing for Interaction and Insight: Experimental Techniques For Visualizing Building Energy Consumption Data

Cao, Hetian 01 December 2017 (has links)
While more efficient use of energy is increasingly vital to the development of the modern industrialized world, emerging visualization tools and approaches to telling data stories provide an opportunity to explore a wide range of topics related to energy consumption and conservation (Olsen, 2017). Telling energy stories with data visualization has generated great interest among journalists, designers and scientific researchers, and over time it has proven effective at providing knowledge and insight (Holmes, 2007). This thesis proposes a new angle on the challenge of designing visualization experiences for building energy data. By investigating and evaluating the efficacy of existing interactive energy data visualization projects, and by experimenting with a user-centric interactive interface and unconventional visual expressions in the development of a data visualization prototype, it aims to invite users to think beyond established data narratives, to augment their knowledge and insight into energy-related issues, and potentially to trigger ecologically responsible behaviors.
23

Data Triage and Visual Analytics for Scientific Visualization

Lee, Teng-Yok 15 December 2011 (has links)
No description available.
24

Using C# and WPF to Create a Dynamic 3D Telemetry Theater and Trajectory Visualization Tool

Reff, Mark, O'Neal, John 10 1900 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / Telemetry data from flight tests are normally plotted using MatLab™ or other third party software. For most of the trajectory and flight parameters, a static 2D or 3D line graph does not provide the proper data visualization that can be accomplished with 3D software. Commercial 3D software can be expensive and difficult to customize, and writing custom software using Direct3D or OpenGL can be complex and time consuming. These problems were overcome using C# and Windows Presentation Foundation (WPF) to quickly and easily create a 3D Telemetry Theater and Trajectory Visualization Tool to dynamically display actual and simulated flight tests.
25

Vícedimensionální přechodové funkce pro vizualizaci skalárních objemových dat / Multidimensional transfer functions for scalar volumetric data visualization

Mach, Pavel January 2015 (has links)
Direct volume rendering is an algorithm for displaying three-dimensional scalar data, such as images from computed tomography. The algorithm uses the concept of transfer functions to assign optical properties to data values. We studied two-dimensional transfer functions, which take an additional dataset as input besides the primary values. In particular, we studied the computation of this secondary dataset with respect to the shape of the primary image function, by analysing the eigenvalues of the Hessian matrix at each image point. We proposed one formula, and implemented several others, for computing the probability that an image point belongs to a blood vessel.
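The per-point Hessian eigenvalue analysis the abstract describes can be sketched for the 2-D case (the thesis works on 3-D volumes; the function name and the toy quadratic image below are illustrative assumptions, not the thesis's code):

```python
import math

def hessian_eigenvalues(img, x, y):
    """Eigenvalues of the 2x2 Hessian of a scalar image at (x, y),
    estimated with central finite differences (unit grid spacing)."""
    hxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    hyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    hxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    mean = (hxx + hyy) / 2
    delta = math.hypot((hxx - hyy) / 2, hxy)
    return mean - delta, mean + delta

# Synthetic image f(x, y) = x^2 + 3*y^2: exact Hessian eigenvalues are 2 and 6.
img = [[x * x + 3 * y * y for x in range(5)] for y in range(5)]
lo, hi = hessian_eigenvalues(img, 2, 2)
print(lo, hi)  # -> 2.0 6.0
```

Vesselness measures in the Frangi style then classify a point by the relative magnitudes and signs of these eigenvalues, e.g. one small and one (in 3-D, two) large negative eigenvalue for a bright tubular structure.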
26

Vizualizace kvality dat v Business Intelligence / Visualization of Data Quality in Business Intelligence

Pohořelý, Radovan January 2009 (has links)
This thesis deals with the area of Business Intelligence, especially data quality. The goal is to provide an overview of data quality issues and of ways the data can be presented to have better and more engaging informative value. Another goal was to propose a solution for visualizing system state, particularly data quality, at a specific enterprise. The output of this thesis should provide a guideline for the implementation of the proposed solution.
27

AN ITERATIVE METHOD OF SENTIMENT ANALYSIS FOR RELIABLE USER EVALUATION

Jingyi Hui (7023500) 16 August 2019 (has links)
Benefiting from booming social networks, reading posts from other users over the internet is becoming one of the most common ways for people to take in information. One may also have noticed that we tend to focus on users who provide well-founded analysis rather than those who merely vent their emotions. This thesis aims at finding a simple and efficient way to recognize reliable information sources among countless internet users by examining the sentiments of their past posts.

To achieve this goal, the research utilized a dataset of tweets about Apple's stock price retrieved from Twitter. Key features studied include post date, user name, number of followers of the user, and the sentiment of the tweet. Prior to further use of the dataset, tweets from users who do not have sufficient posts are filtered out. To compare user sentiments with the derivative of Apple's stock price, we use the Pearson correlation between them to describe how well each user performs. We then iteratively increase the weight of reliable users and lower the weight of untrustworthy users, until the correlation between the overall sentiment and the derivative of the stock price converges. The final correlations for individual users are their performance scores. Due to the noise in real-world data, manual segmentation via data visualization is also proposed as a denoising method to improve performance. Besides our method, other metrics can be considered as a user trust index, such as the number of followers of each user. Experiments are conducted to show that our method outperforms the others. With simple input, this method can be applied to a wide range of topics including elections, the economy, and the job market.
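The correlation-based reweighting described above can be sketched as follows. The abstract does not give the exact update rule, so the damped update, the learning rate, and the toy users below are all assumptions for illustration:

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def score_users(user_sents, dprice, rounds=20, lr=0.5):
    """Iteratively pull each user's weight toward their own correlation
    with the price derivative; the correlations are the final scores."""
    weights = {u: 1.0 for u in user_sents}
    for _ in range(rounds):
        for u, sents in user_sents.items():
            r = pearson(sents, dprice)
            # Raise the weight of users who correlate, damp the others.
            weights[u] = (1 - lr) * weights[u] + lr * max(r, 0.0)
    scores = {u: pearson(s, dprice) for u, s in user_sents.items()}
    return scores, weights

dprice = [1.0, -1.0, 2.0, -2.0, 1.0]          # toy stock-price derivative
user_sents = {
    "analyst": [1.0, -1.0, 2.0, -2.0, 1.0],   # tracks the price perfectly
    "ranter":  [1.0, 1.0, -1.0, -1.0, 1.0],   # mostly uncorrelated venting
}
scores, weights = score_users(user_sents, dprice)
print(weights["analyst"] > weights["ranter"])  # -> True
```

A weighted aggregate sentiment, with these weights, is then what the thesis compares against the stock-price derivative for convergence.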
28

VRMol - um ambiente virtual distribuído para visualização e análise de moléculas de proteínas. / VRMol - a distributed virtual enviroment to visualize and analyze molecules of proteins.

Rodello, Ildeberto Aparecido 12 February 2003 (has links)
Este trabalho utiliza conceitos de Realidade Virtual e Sistemas Distribuídos para desenvolver um Ambiente Virtual Distribuído para visualização e análise de moléculas de proteínas, denominado VRMol. O sistema foi implementado com a linguagem Java, incluindo as APIs Java 3D e Java RMI, visando permitir que pesquisadores geograficamente dispersos troquem informações de uma maneira rápida e eficiente, acelerando a pesquisa e discussão remotas. Assim, foram desenvolvidos uma interface gráfica com Java 3D e um conjunto de métodos para troca de mensagens de acordo com o modelo de comunicação cliente/servidor, com Java RMI. Além disso, o sistema também permite a utilização de alguns dispositivos de entrada não convencionais como joystick e luvas. / This work uses Virtual Reality and Distributed Systems concepts to develop a Distributed Virtual Environment for visualizing and analyzing protein molecules, called VRMol. The system was implemented in the Java programming language, using the Java 3D and Java RMI APIs, with the aim of allowing geographically dispersed researchers to exchange information quickly and efficiently, speeding up remote research and discussion. Thus, a graphical interface was developed with Java 3D, along with a set of methods for exchanging messages according to a client/server communication model with Java RMI. Furthermore, the system also supports some non-conventional input devices such as joysticks and gloves.
29

Performance driven design systems in practice

Joyce, Sam January 2016 (has links)
This thesis is concerned with the application of computation in professional architectural practice, specifically towards defining complex buildings that are highly integrated with respect to design and engineering performance. The thesis represents applied research undertaken whilst in practice at Foster + Partners. It reviews the current state of the art of computational design techniques for quickly but flexibly modelling and analysing building options. The application of parametric design tools to active design projects is discussed with respect to real examples, as well as methods to link the geometric definitions to structural engineering analysis and so provide performance data in near real time. The practical interoperability between design software and engineering tools is also examined. The role of performance data in design decision-making is analysed by comparing manual work-flows with methods assisted by computation. This extends to optimisation methods which, by making use of design automation, actively make design decisions to return optimised results. The challenges and drawbacks of using these methods effectively in real design situations are discussed, especially their limitations with respect to incomplete problem definitions, and design exploration resulting in modified performance requirements. To counter these issues, a performance-driven design work-flow is proposed: a mixed initiative whereby designer-centric understanding and decisions are computer-assisted. Flexible meta-design descriptions that encapsulate the variability of the design space under consideration are explored and compared with existing optimisation approaches. Computation is used to produce and visualise the performance data from the large design spaces generated by parametric design descriptions and associated engineering analysis.
Novel methods are introduced that define a design and performance space, using cluster computing to speed up the generation of large numbers of options. Data visualisation is applied to design problems, showing how in real situations it can aid design orientation and decision-making using the large amount of data produced. Strategies to enable these work-flows are discussed and implemented, focusing on re-appropriating existing web design paradigms with a modular approach concentrating on scalable data creation and information display.
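The meta-design idea above — a parametric description that enumerates a design space, with engineering analysis scoring each option — can be sketched as follows. Everything here is a stand-in: the parameter names, the surrogate `analyse` function, and the deflection limit are assumptions, not the thesis's models or analysis tools:

```python
import itertools

# A "meta-design description": each parameter and its candidate values.
design_space = {
    "span_m": [20, 30, 40],
    "depth_m": [1.0, 1.5, 2.0],
    "n_columns": [2, 3, 4],
}

def analyse(option):
    """Toy surrogate playing the role of the structural analysis:
    deflection falls with depth and columns, rises with span."""
    deflection = option["span_m"] ** 2 / (1000 * option["depth_m"] * option["n_columns"])
    material = option["span_m"] * option["depth_m"] * option["n_columns"]
    return {"deflection": deflection, "material": material}

def generate_options(space):
    """Enumerate every combination in the parametric design space."""
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

# Evaluate the whole space, keep options meeting a performance limit,
# and pick the cheapest feasible one.
results = [(opt, analyse(opt)) for opt in generate_options(design_space)]
feasible = [(o, r) for o, r in results if r["deflection"] <= 0.2]
best = min(feasible, key=lambda pair: pair[1]["material"])
print(len(results), best[0])
```

In practice each `analyse` call is an expensive engineering simulation, which is why the thesis distributes the evaluation of the enumerated options across a cluster and visualises the resulting performance data rather than returning a single optimum.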
30

Charting Contagions: Data Visualization of Disease in Late 19th-Century San Francisco Chinatown

Pashby, Michele 01 January 2019 (has links)
In the late 1800s in San Francisco, Chinese immigrants faced racism and were blamed for the city’s public health crisis. To the rest of San Francisco, disease originated from Chinese people. However, through data visualization we can see that this was not the case. This paper maps cases of disease against the city’s sanitation system and shows how the lack of adequate infrastructure contributed to high rates of disease. Data visualization is an increasingly important tool that historians need to utilize to uncover new insights.
