62

Proof validation in Euclidean geometry: a comparison of novices and experts using eye tracking

Flesher, Paul Michael January 1900 (has links)
Doctor of Philosophy / Department of Mathematics / Andrew G. Bennett / This dissertation investigates and compares the proof-validation methods of novice and expert mathematicians within the realm of Euclidean geometry. Using eye-tracking technology, our study presents empirical evidence supporting claims previously studied only through verbal protocols. Our investigation settles a series of contentious results surrounding the practical implementation of the generalized validation strategy called zooming out (Inglis and Alcock, 2012; Weber, Mejia-Ramos, Inglis, and Alcock, 2013). This strategy analyzes the overall structure of a proof as an application of methods or logical chunks. Settling the debate with longer and more complicated proofs devoid of blatant errors, we found that validators do not initially skim-read proofs to gain structural insight. We did, however, confirm the practical implementation of zooming-out strategies. The literature identifies specific differences between novices and experts within the proof validation process. We are interested in a holistic understanding of novice and expert validations, and we therefore present a direct comparison of entire validation processes that assesses the similarity of novice and expert validation attempts. We found that the validation processes of novices and experts share a certain degree of similarity; in fact, novices tend to be closer to experts than to other novices, and when validations are clustered, the groups are heterogeneous with regard to mathematical maturity. Our investigation expands the proof validation literature by including diagrams in the proof validation process. We found that experts tend to spend proportionally more time on the diagram than novices, while novices spend more time on the text. Furthermore, experts tend to draw more connections within the diagram than novices, as indicated by a higher proportion of attentional changes within the diagrams. Experts seem to draw on the power of visualizations within the mathematics itself, spending more time on conceptual understanding and intended connections.
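As an illustration of the eye-tracking measures mentioned in this abstract, the sketch below computes the proportion of dwell time spent on the diagram and the proportion of attentional changes that stay within the diagram, given a labeled fixation sequence. The data format and values are hypothetical, not the dissertation's.

```python
# Hypothetical fixation data: (area_of_interest, duration_ms), in viewing order.
fixations = [
    ("text", 220), ("text", 180), ("diagram", 310),
    ("diagram", 250), ("text", 200), ("diagram", 400),
]

# Proportion of total dwell time spent on the diagram.
total = sum(d for _, d in fixations)
diagram_time = sum(d for aoi, d in fixations if aoi == "diagram")
print(f"proportion of time on diagram: {diagram_time / total:.2f}")

# Attentional changes: transitions between consecutive fixations.
transitions = list(zip(fixations, fixations[1:]))
within_diagram = sum(
    1 for (a, _), (b, _) in transitions if a == "diagram" and b == "diagram"
)
print(f"proportion of within-diagram transitions: {within_diagram / len(transitions):.2f}")
```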
63

Cinchonains: chromatographic method and production of standards for quality control of polar extracts of catuaba (Trichilia catigua Adr. Juss.)

Martinelli, Fernanda Rodrigues [UNESP] 17 August 2010 (has links) (PDF)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Trichilia catigua is a tree 3 to 5 meters tall, widely distributed across South America. Known as catuaba, catiguá, or angelim rosa, it is popularly used as a mental and physical tonic and especially as a sexual stimulant. Cinchonains A and B were chosen as chemical markers for standardizing the hydroalcoholic extract of catuaba bark because they are the major constituents of the extract and also have antioxidant and antibacterial activity. Since cinchonain standards are not commercially available, this work carried out the isolation, identification, absolute purity determination, and stability study of these compounds so that they can be used properly as working standards. The cinchonains were purified by preparative HPLC on a C18 reversed-phase column with a linear gradient of CH3COOH (0.1%) in H2O/MeOH, and they were identified by high-resolution mass spectrometry and 1H and 13C NMR. Stability studies in solution were also carried out for cinchonain A, cinchonain B, and the hydroalcoholic extract of catuaba bark under different storage conditions. The accelerated stability studies were performed according to ANVISA resolution RE 398 (November 12, 2004) at 40 °C and 75% relative humidity. These studies indicated that the cinchonain A and B standards are stable when kept dry under ambient conditions, with an estimated shelf life of two years. In hydromethanolic solution (MeOH:H2O 1:1), however, rapid oxidation occurs, forming di-ortho-quinones, which indicates that stock solutions of these standards cannot be stored for more than one day. The solution stability study showed that the chromatographic profile of the plant's hydroalcoholic solutions did not change within 7 days, indicating that samples can be prepared and stored during this period. The parameters... (Complete abstract: click electronic access below)
64

Bayesian cluster validation

Koepke, Hoyt Adam 11 1900 (has links)
We propose a novel framework based on Bayesian principles for validating clusterings and present efficient algorithms for use with centroid- or exemplar-based clustering solutions. Our framework treats the data as fixed and introduces perturbations into the clustering procedure. In our algorithms, we scale the distances between points by a random variable whose distribution is tuned against a baseline null dataset. The random variable is integrated out, yielding a soft assignment matrix that gives the behavior under perturbation of the points relative to each of the clusters. From this soft assignment matrix, we are able to visualize inter-cluster behavior, rank clusters, and give a scalar index of the clustering stability. In a large test on synthetic data, our method matches or outperforms other leading methods at predicting the correct number of clusters. We also present a theoretical analysis of our approach, which suggests that it is useful for high-dimensional data. / Science, Faculty of / Computer Science, Department of / Graduate
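The sketch below illustrates the general idea this abstract describes: point-to-centroid distances are scaled by random draws, and the induced assignments are averaged into a soft assignment matrix. It is a rough illustration under stated assumptions, not the thesis's algorithm; the Monte Carlo loop stands in for the analytic integration, and the gamma perturbation and the particular stability index are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_assignment(X, centroids, n_draws=500, shape=20.0):
    """Average hard assignments under random scaling of distances."""
    n, k = len(X), len(centroids)
    counts = np.zeros((n, k))
    # Base point-to-centroid distance matrix, shape (n, k).
    base = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    for _ in range(n_draws):
        # Perturb each distance by a gamma draw with mean 1 (an assumption).
        scale = rng.gamma(shape, 1.0 / shape, size=(n, k))
        counts[np.arange(n), np.argmin(base * scale, axis=1)] += 1
    return counts / n_draws  # rows approximate membership probabilities

def stability_index(P):
    # Scalar stability: average probability of each point's modal cluster.
    return P.max(axis=1).mean()

# Two well-separated synthetic clusters.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centroids = np.array([[0.0, 0.0], [3.0, 3.0]])
P = soft_assignment(X, centroids)
print(f"stability: {stability_index(P):.3f}")  # near 1 for well-separated clusters
```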
65

On an implementation of abstract interpretation

Westcott, Doug January 1988 (has links)
This thesis describes an implementation of abstract interpretation and its application to strictness analysis and termination analysis. The abstract interpretation is based on a lattice-theoretical model of abstraction, or translation, of functions expressed in lambda-calculus notation and defined over a concrete domain into functions defined over a user-specified, application-dependent abstract domain. The functions thus obtained are then analyzed to find their least fixed points in the lattice that forms the abstract domain, using a simplification of the frontiers algorithm of Chris Clack and Simon Peyton Jones. To achieve the required efficiency, this method is implemented using lattice annotations, along with constraints upon those annotations. The implementation is then applied to the problems of strictness analysis and termination analysis, deriving useful pre-compilation information for many functions. The concrete domains over which the functions are defined may or may not include lists. / Science, Faculty of / Computer Science, Department of / Graduate
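As a concrete illustration of the kind of analysis described, the sketch below performs strictness analysis by abstract interpretation over the two-point domain {0, 1}, where 0 means "definitely undefined" and 1 means "possibly defined": a function is strict in an argument if its abstract version maps 0 in that position (with 1 elsewhere) to 0, and the least fixed point of a recursive definition is reached by iterating from the bottom table. This is a minimal sketch of the classical technique, not the thesis's annotated-lattice implementation.

```python
from itertools import product

def lfp_strictness(abstract_body, arity, max_iter=20):
    """Least fixed point of an abstract function over {0,1}^arity -> {0,1}."""
    # Start at bottom: the constant-0 table.
    table = {args: 0 for args in product((0, 1), repeat=arity)}
    for _ in range(max_iter):
        new = {args: abstract_body(table, *args) for args in table}
        if new == table:  # fixed point reached
            break
        table = new
    return table

# Example: f x y = if x == 0 then y else f (x - 1) y
# Abstractly: f#(x, y) = x & (y | f#(x, y)), with & = min and | = max.
def f_abs(f, x, y):
    return min(x, max(y, f[(x, y)]))

table = lfp_strictness(f_abs, 2)
for i, name in enumerate("xy"):
    probe = tuple(0 if j == i else 1 for j in range(2))
    print(f"strict in {name}: {table[probe] == 0}")  # True for both arguments
```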
66

Lie Acceptability: A Construct and Measure

Oliveira, Carrie, Levine, Timothy R. 01 January 2008 (has links)
People likely hold different opinions about deceptive communication. Lie acceptability refers to an individual's attitude toward deceptive communication. A self-report measure of lie acceptability was updated and refined, and preliminary data (N = 312) consistent with validity were reported. Confirmatory factor analysis indicated acceptable fit to the a priori unidimensional model. As anticipated, the lie acceptability scale was positively related to narcissism and negatively associated with religiosity. The scale also predicted likelihood-of-use ratings for equivocal messages, deceptive omissions, and outright lies. An anticipated sex difference, however, was not replicated. The potential utility of the scale in communication research is discussed.
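A minimal sketch of the kind of criterion-validity check reported above: score a multi-item scale and correlate it with criterion measures. All data here are simulated, and the item count is an assumption; this only illustrates the form of the analysis, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 312, 11  # N from the abstract; the item count is assumed

# Simulate unidimensional item responses driven by a latent attitude.
trait = rng.normal(size=n)
items = trait[:, None] + rng.normal(0, 1.0, size=(n, k))
score = items.mean(axis=1)  # scale score = mean of items

# Hypothetical criterion measures with the directions the abstract reports.
narcissism = 0.4 * trait + rng.normal(0, 1.0, size=n)
religiosity = -0.3 * trait + rng.normal(0, 1.0, size=n)

print(f"r(scale, narcissism)  = {np.corrcoef(score, narcissism)[0, 1]:+.2f}")
print(f"r(scale, religiosity) = {np.corrcoef(score, religiosity)[0, 1]:+.2f}")
```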
67

Leave-Group-Out Cross-Validation for Latent Gaussian Models

Liu, Zhedong 04 1900 (has links)
Cross-validation is a widely used technique in statistics and machine learning for predictive performance assessment and model selection. It involves dividing the available data into multiple sets, training the model on some of the data and testing it on the rest, and repeating this process multiple times; the goal is to assess the model's predictive performance on unseen data. Two standard methods are leave-one-out cross-validation and K-fold cross-validation. These methods may not be suitable for structured models with many potential prediction tasks, however, because they do not take the structure of the data into account. Leave-group-out cross-validation is an extension of cross-validation in which the construction of the left-out groups, training sets, and testing points adapts to the prediction task at hand. In this dissertation, we propose an automatic group construction procedure for leave-group-out cross-validation that estimates the predictive performance of the model when the prediction task is not specified. We also propose an efficient approximation of leave-group-out cross-validation for latent Gaussian models. Both procedures are implemented in the R-INLA software. We demonstrate the usefulness of the proposed method through its application to the joint modeling of survival and longitudinal data, which shows its effectiveness in real-world scenarios.
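A generic illustration of leave-group-out cross-validation (not the R-INLA approximation the dissertation proposes): each group is held out in turn, a model is fit on the remaining groups, and predictions are scored on the held-out group. The grouping, model, and data below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120
group = np.repeat(np.arange(6), n // 6)  # e.g., six subjects or regions
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(0, 0.5, size=n)

scores = []
for g in np.unique(group):
    train, test = group != g, group == g
    # Fit a simple least-squares line on the training groups only.
    slope, intercept = np.polyfit(x[train], y[train], 1)
    pred = slope * x[test] + intercept
    scores.append(np.mean((y[test] - pred) ** 2))

print(f"leave-group-out MSE: {np.mean(scores):.3f}")
```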
68

Graphics function standard specification validation with GKS

Fraser, Steven D. January 1987 (has links)
No description available.
69

Knowledge manager for hybrid object-rule systems

Es-salihe, Mustapha January 2000 (has links)
Master's thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
70

A General Observational Strategy for Validation of Satellite NO₂ Retrievals using Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS)

Earley, Jeffrey D. 21 June 2022 (has links)
This thesis analyzes the effectiveness of spatially averaged Multi-AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) measurements, taken at regular azimuth-angle intervals and averaged hourly, for validating satellite-based DOAS measurements. Off-axis MAX-DOAS measurements taken in Blacksburg, Virginia, between November 2021 and April 2022, with an evenly distributed set of azimuth angles, were averaged every hour and compared to direct-sun measurements averaged over the same hours. Comparisons of the difference in hourly averages between the two measuring strategies, as well as the standard deviations of the hourly measurements, suggest that the NO₂ distribution around Blacksburg is homogeneous. To test the effectiveness of this sampling strategy in an inhomogeneous location, the LOTOS-EUROS high-resolution (1 km × 1 km) chemical transport model was used to simulate vertical profiles and vertical column densities for real measurements taken during the TROLIX'19 field campaign, based on actual MAX-DOAS measurements and TROPOMI viewing geometry. While the individual ground measurements did not equal the TROPOMI profile, the TROPOMI profile is approximately the average of the profiles of measurements made within the hour of the TROPOMI overpass. / M.S. / This thesis analyzes the effectiveness of spatially averaged Multi-AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) measurements, taken at regular angular intervals offset from due north and averaged hourly, for validating satellite-based DOAS measurements. MAX-DOAS measurements were taken in Blacksburg, Virginia, a low-NO₂ location, between November 2021 and April 2022 to evaluate a generalized measuring strategy for satellite validation in low-pollution environments. An evenly distributed set of measurements was averaged every hour and compared to hourly averaged measurements taken in the direction of the sun, to determine whether the variability of NO₂ around Blacksburg is high enough to require a generalized sampling strategy, or whether the NO₂ distribution is homogeneous enough to be validated accurately with direct-sun measurements alone. Comparisons of the difference in hourly averages between the two strategies, as well as the standard deviations of the hourly measurements, suggest that the variability of NO₂ around Blacksburg is low. To test the effectiveness of this sampling strategy in a more polluted location with many sources and sinks of NO₂, output from the LOTOS-EUROS high-resolution (1 km × 1 km) chemical transport model, run by the Royal Dutch Meteorological Institute for the TROLIX'19 field campaign, was used to simulate vertical distributions of NO₂ and vertical column densities for measurements taken during the campaign, based on real MAX-DOAS measurements and the viewing geometry of the TROPOspheric Monitoring Instrument (TROPOMI). While the individual ground measurements did not equal the vertical distribution seen by TROPOMI, the TROPOMI distribution is approximately the average of the distributions of measurements made within an hour of the TROPOMI overpass over Rotterdam.
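A sketch of the comparison strategy described above: hourly means of off-axis (azimuth-scan) columns against hourly means of direct-sun columns, with hourly standard deviations available as a homogeneity check. The column names and synthetic values are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
times = pd.date_range("2021-11-01 08:00", periods=240, freq="15min")
df = pd.DataFrame({
    "time": times,
    # Assume one direct-sun measurement per hour among off-axis scans.
    "mode": np.where(np.arange(240) % 4 == 0, "direct_sun", "off_axis"),
    "no2_vcd": rng.normal(2.0e15, 2.0e14, size=240),  # molecules/cm^2
})

# Hourly mean and standard deviation per measurement mode.
hourly = (df.groupby([df["time"].dt.floor("h"), "mode"])["no2_vcd"]
            .agg(["mean", "std"])
            .unstack("mode"))

# A small mean difference between modes, with small hourly spread,
# would suggest a horizontally homogeneous NO2 field.
diff = hourly[("mean", "off_axis")] - hourly[("mean", "direct_sun")]
print(diff.describe())
```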
