11

Bridging the Gap Between Space-Filling and Optimal Designs

January 2013 (has links)
abstract: This dissertation explores different methodologies for combining two popular design paradigms in the field of computer experiments. Space-filling designs are commonly used to ensure good coverage of the design space, but they may not have good properties for model fitting. Optimal designs traditionally perform very well in terms of model fitting, particularly when a polynomial model is intended, but can result in problematic replication when factors are insignificant. By bringing these two design types together, the positive properties of each can be retained while mitigating potential weaknesses. Hybrid space-filling designs, generated as Latin hypercubes augmented with I-optimal points, are compared to designs of each contributing component. A second design type, called a bridge design, is also evaluated, which further integrates the two design types. Bridge designs are the result of a Latin hypercube undergoing coordinate exchange to reach constrained D-optimality, ensuring zero replication of factor levels in any one-dimensional projection. Lastly, bridge designs were augmented with I-optimal points with two goals in mind. Augmentation with candidate points generated under the same underlying analysis model serves to reduce the prediction variance without greatly compromising the space-filling property of the design, while augmentation with candidate points generated under a different underlying analysis model can greatly reduce the impact of model misspecification during the design phase. Each of these composite designs is compared to pure space-filling and optimal designs. They typically outperform pure space-filling designs in terms of prediction variance and alphabetic efficiency, while remaining comparable to pure optimal designs at small sample sizes. This makes them excellent candidates for initial experimentation. / Dissertation/Thesis / Ph.D. Industrial Engineering 2013
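As a concrete illustration of the space-filling component used here, a minimal Latin hypercube sampler is sketched below. This is an assumed illustration, not code from the dissertation; the augmentation with I-optimal points (which requires a model matrix and a candidate-point search) is omitted.

```python
import numpy as np

def latin_hypercube(n, k, rng=None):
    """Draw an n-point Latin hypercube in [0, 1]^k: each factor's range is
    split into n equal bins, and every bin is used exactly once per factor."""
    rng = np.random.default_rng(rng)
    # One random point per bin per column, then shuffle each column's bin order.
    samples = (rng.random((n, k)) + np.arange(n)[:, None]) / n
    for j in range(k):
        samples[:, j] = rng.permutation(samples[:, j])
    return samples

design = latin_hypercube(10, 3, rng=0)
print(design.shape)  # (10, 3)
# Every one-dimensional projection hits each of the 10 bins exactly once:
print(np.all(np.sort((design * 10).astype(int), axis=0) == np.arange(10)[:, None]))
```

The zero-replication-in-projection property checked at the end is exactly what bridge designs are constructed to preserve under coordinate exchange.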
12

A Space-Filling Technique for the Visualization of Planar st-graph

Wang, Yuanmao January 2012 (has links)
Graphs currently attract an increasing number of computer scientists due to their wide adoption in different areas. However, when people perform graph drawing, one of the most critical issues they need to consider is aesthetics, i.e., making the graph more suitable for human perception. In this work, we aim at exploring one specific kind of graph, planar st-graphs, with a space-filling technique in the information visualization (InfoVis) area. We cover edge crossing elimination, layer assignment, graph drawing algorithms, and new developments of space-filling techniques in planar st-graph drawing. The final aim of this project is to develop a new algorithm to draw planar st-graphs based on a space-filling visualization approach with minimum edge crossings and maximum space usage.
13

Courbes remplissant l'espace et leur application en traitement d'images / Space-filling curves and their application in image processing

Nguyen, Giap 14 November 2013 (has links)
Space-filling curves are known for their ability to order multidimensional points on a line while preserving locality, i.e., close points remain close on the line. Locality preservation is desirable in many applications. The Hilbert curve is the space-filling curve that best preserves locality. It was originally proposed in 2D, i.e., it applies only to points in a 2D space. With multidimensional applications in view, we propose in this thesis a generalization of the Hilbert curve. The generalized curve is defined from the essential property of the Hilbert curve that produces its level of locality preservation: adjacency. It thus avoids dependence on the RBG primitive pattern, which is the only pattern of the curve extended by previous research. The result is a family of curves that preserve locality well. The optimization of locality preservation is also addressed, in order to find the curve that preserves locality best. For this purpose, we propose a measure of locality preservation. Through its parameters, this measure can adapt to different application settings, such as a change of metric or of locality size. The curve construction is an important part of the thesis: it is the basis of the index calculation used in the application. For fast index calculation, self-similar Hilbert curves are used; the Hilbert curves satisfying the self-similarity conditions are the subject of Chapter 4. The generalized curve is finally applied to content-based image retrieval (CBIR), where each image is characterized by a multidimensional feature vector. Images are ordered by the curve on a line, so the search is simplified to a search on an ordered list. Given an input image, similar images are those whose indexes neighbor the input image's index. Locality preservation ensures that these indexes correspond to similar images.
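For the 2D base case, the Hilbert index calculation that such retrieval schemes rely on can be sketched with the classic rotate-and-fold algorithm. This is the standard 2D construction, not the thesis's multidimensional generalization:

```python
def hilbert_index(order, x, y):
    """Map a grid point (x, y) on a 2**order x 2**order grid to its position
    along the 2-D Hilbert curve (classic rotate-and-fold quadrant scheme)."""
    n = 2 ** order
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)   # quadrant's contribution to the index
        if ry == 0:                    # rotate/flip so the sub-curve lines up
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Locality in action: consecutive curve positions are adjacent grid cells.
pts = sorted((hilbert_index(3, x, y), x, y) for x in range(8) for y in range(8))
print(all(abs(a[1] - b[1]) + abs(a[2] - b[2]) == 1 for a, b in zip(pts, pts[1:])))
```

Sorting multidimensional feature vectors by such an index is what reduces a similarity search to a lookup in an ordered list.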
14

Timages: Enhancing Time Graphs with Iconographic Information

Jänicke, Stefan 25 January 2019 (has links)
Various time-based visualization techniques have been designed to support the temporal analysis of data collections. While quantities play a secondary role in traditional timelines, which reserve space for each individual data item to be observed, time graphs instead display quantitative information and provide interaction means to filter for a subset of the data. Timages is a hybrid approach that enhances quantitative time graphs with qualitative information in an infographic style. By (1) scaling thumbnails of data items depending on their relevance to the observed topic and (2) positioning these thumbnails time-dependently inside a temporally aligned area with a novel space-filling strategy, the most relevant items in the entire data collection, as well as the predominant data items of particular time ranges, can be grasped instantly without the need to interact with the time graph.
15

Some Advances in Local Approximate Gaussian Processes

Sun, Furong 03 October 2019 (has links)
Nowadays, the Gaussian process (GP) is recognized as an indispensable statistical tool in computer experiments. Due to its computational complexity and storage demands, its application to real-world problems, especially in "big data" settings, is quite limited. Among many strategies to tailor GPs to such settings, Gramacy and Apley (2015) proposed the local approximate GP (laGP), which builds approximate predictive equations from small local designs constructed around the predictive location under a chosen criterion. In this dissertation, several methodological extensions of laGP are proposed. One contribution is multilevel global/local modeling, which deploys global hyperparameter estimates to perform local prediction. The second contribution extends the laGP notion of "locale" from a single predictive location to a set of predictive locations along paths in the input space. These two contributions have been applied to satellite drag emulation, as illustrated in Chapter 3. Furthermore, the multilevel GP modeling strategy, combined with inverse-variance weighting, has also been applied to synthesize field data and computer model outputs of solar irradiance across the continental United States, as detailed in Chapter 4. Last but not least, in Chapter 5, laGP's performance is tested on emulating daytime land surface temperatures estimated via satellites, in a setting with irregular grid locations. / Doctor of Philosophy / In many real-life settings, we want to understand a physical relationship or phenomenon. Due to limited resources and/or ethical reasons, it is sometimes impossible to perform physical experiments to collect data, and we therefore rely on computer experiments, whose evaluation usually requires expensive simulation involving complex mathematical equations. To reduce the computational effort, we look for a relatively cheap alternative, called an emulator, to serve as a surrogate model.
A Gaussian process (GP) is such an emulator, and it has been very popular due to excellent out-of-sample predictive performance and appropriate uncertainty quantification. However, due to computational complexity, full GP modeling is not suitable for "big data" settings. Gramacy and Apley (2015) proposed the local approximate GP (laGP), the core idea of which is to use a subset of the data for inference and subsequent prediction at unobserved inputs. This dissertation provides several extensions of laGP, which are applied to several real-life "big data" settings. The first application, detailed in Chapter 3, is emulating satellite drag from large simulation experiments. A method is devised to capture global input information comprehensively using a small subset of the data, after which local prediction is performed. This approach, called "multilevel GP modeling", is also deployed to synthesize field measurements and computational outputs of solar irradiance across the continental United States, illustrated in Chapter 4, and to emulate daytime land surface temperatures estimated by satellites, discussed in Chapter 5.
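The core laGP idea, conditioning a GP only on a small local design near each prediction site, can be sketched as follows. This is a simplified illustration under stated assumptions: plain nearest-neighbor subsetting with a fixed Gaussian kernel, whereas laGP proper uses a greedy ALC/MSPE design search and local hyperparameter inference; all names and parameter values here are illustrative.

```python
import numpy as np

def local_gp_predict(X, y, xstar, k=30, lengthscale=0.2, nugget=1e-6):
    """Predict at xstar by conditioning a GP on only the k nearest training
    points instead of all n, giving O(k^3) cost per prediction site."""
    idx = np.argsort(np.linalg.norm(X - xstar, axis=1))[:k]
    Xl, yl = X[idx], y[idx]
    def kern(A, B):  # squared-exponential kernel, unit variance
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * lengthscale ** 2))
    K = kern(Xl, Xl) + nugget * np.eye(k)
    kstar = kern(Xl, xstar[None, :]).ravel()
    mean = kstar @ np.linalg.solve(K, yl)
    var = 1.0 + nugget - kstar @ np.linalg.solve(K, kstar)
    return mean, var

rng = np.random.default_rng(1)
X = rng.random((2000, 2))
y = np.sin(4 * X[:, 0]) * np.cos(3 * X[:, 1])   # cheap stand-in simulator
m, v = local_gp_predict(X, y, np.array([0.5, 0.5]))
# m should be close to the truth sin(2.0)*cos(1.5) ~= 0.064
```

With 2000 training points, each prediction touches only a 30-point kernel matrix, which is what makes the approach viable in "big data" settings.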
16

Design & Analysis of a Computer Experiment for an Aerospace Conformance Simulation Study

Gryder, Ryan W 01 January 2016 (has links)
Within NASA's Air Traffic Management Technology Demonstration #1 (ATD-1), Interval Management (IM) is a flight deck tool that enables pilots to achieve or maintain precise in-trail spacing behind a target aircraft. Previous research has shown that violations of aircraft spacing requirements can occur between an IM aircraft and surrounding non-IM aircraft when the IM aircraft is following a target on a separate route. This research focused on the experimental design and analysis of a deterministic computer simulation that models the airspace configuration of interest. Using an original space-filling design and Gaussian process modeling, we found that aircraft delay assignments and wind profiles significantly impact the likelihood of spacing violations and the interruption of IM operations. However, we also found that implementing two theoretical advancements in IM technologies can potentially lead to promising results.
17

3D Object Recognition From Range Images

Izciler, Fatih 01 September 2012 (has links) (PDF)
Recognizing generic objects from single- or multi-view range images is a popular contemporary problem in the 3D object recognition area, driven by the developing technology of scanning devices such as laser range scanners. This problem is vital to current and future vision systems performing shape-based matching and classification of objects in an arbitrary scene. Despite improvements in scanners, range scans still suffer from imperfections such as holes or unconnected parts. This study aims at proposing and comparing algorithms that match a range image to complete 3D models in a target database. The study started with a baseline algorithm that uses a statistical representation of 3D shapes based on 4D geometric features, namely SURFLET-pair relations. This feature describes the geometric relation of a surface-point pair and reflects both local and global characteristics of the object. Building toward a solution, another algorithm that interprets SURFLET-pairs as in the baseline algorithm, in which histograms of the features are used, is considered. Moreover, two other methods are proposed by applying 2D space-filling curves on range images and 4D space-filling curves on histograms of SURFLET-pairs. Wavelet transforms are used for filtering in these algorithms. These methods are designed to be compact, robust, independent of a global coordinate frame, and descriptive enough to distinguish queries' categories. The baseline and proposed algorithms are evaluated on a database in which range scans of real objects with imperfections serve as queries, while generic 3D objects from various categories form the target dataset.
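The 4D SURFLET-pair feature mentioned above can be made concrete. The sketch below uses one common formulation (the three angles among the two surface normals and the connecting segment, plus the pair distance); the function name and exact angle conventions are illustrative assumptions, as conventions vary between papers.

```python
import numpy as np

def surflet_pair_feature(p1, n1, p2, n2):
    """4-D feature for an oriented surface-point pair (p1, n1), (p2, n2):
    angle(n1, line), angle(n2, line), angle(n1, n2), and the distance.
    Invariant to rigid motions, hence independent of a global frame."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    u = d / dist
    ang = lambda a, b: np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
    return np.array([ang(n1, u), ang(n2, u), ang(n1, n2), dist])

# Two coplanar points with parallel upward normals, one unit apart:
f = surflet_pair_feature(np.array([0., 0., 0.]), np.array([0., 0., 1.]),
                         np.array([1., 0., 0.]), np.array([0., 0., 1.]))
# f = [pi/2, pi/2, 0, 1]: both normals perpendicular to the line, parallel to each other
```

A shape descriptor is then typically a histogram over many such sampled pairs, which is the representation the baseline algorithm matches against.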
18

Analyse d'une base de données pour la calibration d'un code de calcul / Analysis of a database for the calibration of a simulation code

Feuillard, Vincent 21 May 2007 (has links) (PDF)
This research falls within the general context of calibration, with a view to industrial applications. Its objective is to assess the quality of a database, i.e., how well it covers its domain of variation with respect to the intended objectives. The work carried out here provides a synthesis of the mathematical and algorithmic tools for performing such an assessment. We further propose techniques for selecting or importing new observations in order to improve the overall quality of the database. Among other things, the methods developed make it possible to identify flaws in the structure of the data. Their application is illustrated in the evaluation of functional parameters, in a context of estimation by orthogonal functions.
19

Multi-layer designs and composite Gaussian process models with engineering applications

Ba, Shan 21 May 2012 (has links)
This thesis consists of three chapters, covering topics in both the design and modeling aspects of computer experiments as well as their engineering applications. The first chapter systematically develops a new class of space-filling designs for computer experiments by splitting two-level factorial designs into multiple layers. The new design is easy to generate, and our numerical study shows that it can have better space-filling properties than the optimal Latin hypercube design. The second chapter proposes a novel modeling approach for approximating computationally expensive functions that are not second-order stationary. The new model is a composite of two Gaussian processes, where the first one captures the smooth global trend and the second one models local details. The new predictor also incorporates a flexible variance model, which makes it more capable of approximating surfaces with varying volatility. The third chapter is devoted to a two-stage sequential strategy which integrates analytical models with finite element simulations for a micromachining process.
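Comparisons of space-filling properties like the one in the first chapter presuppose a numeric criterion. Below is a minimal sketch of the maximin (minimum pairwise distance) measure, one common choice; this is an assumed illustration, and the thesis may use other criteria.

```python
import numpy as np

def maximin_score(design):
    """Smallest pairwise distance in a design; larger values mean no two
    runs are crowded together, i.e., better maximin space-filling."""
    d2 = ((design[:, None, :] - design[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)   # ignore each point's distance to itself
    return np.sqrt(d2.min())

rng = np.random.default_rng(0)
random_pts = rng.random((20, 2))
grid = np.array([[i / 4, j / 4] for i in range(5) for j in range(4)])
# The regular 5x4 grid scores exactly 0.25; i.i.d. random points score far lower.
print(maximin_score(grid), maximin_score(random_pts))
```

Scoring candidate designs this way is how "better space-filling properties than the optimal Latin hypercube design" can be demonstrated numerically.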
20

Explorando dados provindos da internet em dispositivos móveis: uma abordagem baseada em visualização de informação / Exploring web data on mobile devices: an approach based on information visualization

Duarte, Felipe Simões Lage Gomes 12 February 2015 (has links)
With the development of computers and the increasing popularity of the Internet, our society has entered the information age. This era is marked by the way we produce and deal with information. Every day, thousands of gigabytes are stored, but their value is reduced if the data cannot be transformed into knowledge. Concomitantly, computing is moving toward the miniaturization and affordability of mobile devices, which are changing user behavior: passive readers are becoming content generators. In this context, in this master's thesis we propose and develop a data visualization technique, called NMap, and a web data visualization tool for mobile devices, called SPCloud. NMap uses all available visual space to convey information while preserving the distance-similarity metaphor between elements. Comparative evaluations against state-of-the-art techniques showed that NMap produces better neighborhood preservation with a significant improvement in processing time, placing NMap among the leading space-filling techniques. SPCloud, a tool that uses NMap to present news available on the web, was developed taking into account the inherent characteristics of mobile devices, which makes it usable on such equipment. Informal user tests revealed that SPCloud performs well in summarizing large amounts of news in a small visual space.