  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Use of Assessments in College Chemistry Courses: Examining Students' Prior Conceptual Knowledge, Chemistry Self-efficacy, and Attitude

Villafañe-García, Sachel M. 10 April 2015 (has links)
Students' retention in STEM-related careers is of great concern for educators and researchers, especially the retention of underrepresented groups such as females, Hispanics, and Blacks. It is therefore important to study factors that could influence students' decisions to stay in STEM. The work described in this dissertation comprises three research studies in which assessments were used in college chemistry courses to measure students' prior content knowledge, chemistry self-efficacy, and attitude toward science. These three factors have been suggested to influence students' performance in a course and could eventually become retention factors. The first study involved the development and use of an instrument to measure prior knowledge of foundational concepts from chemistry and biology that are considered important for biochemistry learning. The instrument has a parallel structure in which three items measure each concept and common incorrect ideas serve as distractors. This structure allows the identification of common incorrect ideas that students hold when entering biochemistry and that can hinder their learning of biochemistry concepts. The instrument was given as a pre/posttest to students enrolled in introductory biochemistry courses. The findings indicated that some incorrect ideas persist even after instruction, as is the case for the bond energy and alpha-helix structure concepts. This study highlights the importance of measuring prior conceptual knowledge so that instructors can plan interventions to help students overcome their incorrect ideas. For the second study, students' chemistry self-efficacy was measured five times during a semester of preparatory college chemistry.
Chemistry self-efficacy beliefs have been linked to students' achievement: students with stronger self-efficacy are more likely to attempt challenging tasks and persist at them, which helps them stay in STEM. Using multilevel modeling to examine potential differences in students' self-efficacy beliefs by sex and race/ethnicity, we found some differences in the trends by race/ethnicity; in particular, the trends for Hispanic and Black males were negative compared with those for White males. This study highlights the importance of measuring self-efficacy at several time points in the semester, and of instructors being aware of potential differences in their students' confidence when working on a chemistry task. The third study involves the use of the Test of Science Related Attitudes (TOSRA) in an introductory chemistry course. A shortened version of the instrument was used, comprising three scales: normality of scientists, attitude toward inquiry, and career interest in science. The first purpose of this study was to gather validity evidence for the internal structure of the instrument with college chemistry students. Using measurement invariance analysis by sex and race/ethnicity, we found that the internal structure holds across sex but did not hold for Blacks in our sample; further analysis revealed problems with the normality of scientists scale for this group. The second purpose was to examine the relationship between the TOSRA scales, achievement in chemistry, and prior math knowledge. Using structural equation modeling (SEM), we found that two of the TOSRA scales, attitude toward inquiry and career interest in science, have a small but significant influence on students' achievement in chemistry.
This study highlights the importance of examining whether scores function similarly for different groups of students in a population, since scores on these assessments could be used to make decisions that affect students. The research studies presented in this work are a step toward better understanding the factors that can influence students' decisions to stay in or leave STEM-related careers. Each study has provided psychometric evidence for the use of three different assessments in college chemistry courses. Instructors can use these assessments in large and small lecture classrooms, and the information obtained can be used to design targeted interventions that help students learn and/or become more confident in a given task. The work also highlights the importance of looking at different groups of students, such as underrepresented groups, since response trends may differ. Being aware of students' diverse needs will help us understand some of the challenges that students face in the chemistry classroom, and understanding those challenges will help instructors be better prepared for teaching.
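The distractor-persistence analysis described for the first study can be sketched in a few lines. This is an illustrative reconstruction, not the dissertation's code: the response data, option labels, and the 25% threshold are invented for the example.

```python
# Sketch: flagging persistent incorrect ideas from pre/post responses.
# Data and threshold are illustrative, not from the dissertation.
from collections import Counter

def persistent_distractors(pre, post, threshold=0.25):
    """Return distractors chosen by at least `threshold` of students on
    BOTH the pretest and the posttest (i.e. ideas that instruction did
    not dislodge). `pre`/`post` are lists of chosen options, one per
    student."""
    n_pre, n_post = len(pre), len(post)
    pre_counts, post_counts = Counter(pre), Counter(post)
    return sorted(
        opt for opt in pre_counts
        if opt != "correct"
        and pre_counts[opt] / n_pre >= threshold
        and post_counts.get(opt, 0) / n_post >= threshold
    )

# Hypothetical responses to one bond-energy item:
pre  = ["breaking_releases_energy"] * 6 + ["correct"] * 4
post = ["breaking_releases_energy"] * 4 + ["correct"] * 6
print(persistent_distractors(pre, post))  # ['breaking_releases_energy']
```

A distractor that survives instruction at this rate would be a candidate for a targeted intervention.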
52

On the nature of some types of environmental time series: consequences for their statistical treatment.

Manté, Claude 06 June 2008 (has links) (PDF)
This dissertation consists of a synthesis document and 4 articles published in international journals; it comprises 5 chapters, of which 4 are core chapters.

Chapter 1, entitled "Geometric aspects of Principal Component Analysis", is essentially devoted to currently popular techniques for functional data analysis. The first section concerns frequency-domain analysis of a multidimensional signal, re-encoded by its discrete Fourier transform and then submitted to Principal Component Analysis (PCA). The value of this method for filtering non-stationary signals is shown in underwater acoustics and image analysis. In the second section, a geometric point of view is introduced, in the sense that "geometric properties are characterized by their invariance with respect to a group of transformations" (F. Klein). This leads to a PCA that is invariant under the action of a group of isometries (translations, for example), which proves effective for processing acoustic data.

Chapter 2, entitled "A small statistical theory of rarity", is directly tied to the concerns of ecologists. It is motivated by difficulties encountered when applying factorial methods to long series of observations of benthic species. The discussion is organized around an attached article entitled "Methods for selecting dominant species in ecological series. Application to marine macrobenthic communities from the English Channel". It is well known that the classical methods of Principal Component Analysis and Correspondence Analysis applied to count tables are very sensitive to the many small counts observed. Part of this chapter is devoted to precise definitions of rarity, so as to separate what is rare from what is common or dominant. This simplifies the interpretation of PCA on long observation series by removing rare species. An automatic species-selection method for PCA is proposed, along with tests of the global rarity (that is, across all samples) of the recorded species.

Chapter 3 is entitled "Principal component analysis of absolutely continuous measures: applications in sedimentology and ecology". It presents theoretical statistical results obtained in two recently published (attached) articles, illustrated by applications to ecology and sedimentology. The first article, entitled "The use of regularization methods in computing Radon-Nikodym derivatives. Application to grain-size distributions", addresses a density-approximation problem in order to solve a problem in sedimentology. The second article, entitled "Principal components analysis of measures, with special emphasis on grain-size curves", proposes an original methodology for carrying out PCA on grain-size distributions, using the functional framework established in the previous article. This work is illustrated by several sedimentary maps of the Étang de Berre, which vary with the chosen reference measure.

As its title "Turbulence and plankton" indicates, Chapter 4 connects ecology and physical oceanography. At the heart of this chapter lies the "plankton paradox": why is there such biodiversity in the oceans when resources there are so scarce? For some researchers, the explanation lies in turbulence, which is thought to promote the supply of nutrients to phytoplankton cells. To validate numerically a phytoplankton growth model based on this hypothesis and incorporating turbulence, the turbulence itself must be simulated. We propose to model it as a power of a long-range-dependent stochastic process. To estimate the main parameter of this process (its Hurst exponent) from available data, we proposed methodological improvements, which were the subject of an attached article: "Application of resampling and linear spline methods to spectral and dispersional analyses of long-memory processes". The expected improvements are tested on simulated data, on the classical record of Nile water levels, and on the North Atlantic Oscillation (NAO) series.
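Dispersional analysis, one of the Hurst-exponent estimators the attached article works with, can be sketched compactly: block-average the series at several scales and regress the log standard deviation of the block means on the log scale. A minimal illustration on synthetic white noise (for which H ≈ 0.5); this is not the thesis code:

```python
import numpy as np

def hurst_dispersional(x, scales=(1, 2, 4, 8, 16)):
    """Estimate the Hurst exponent by dispersional analysis: the
    standard deviation of block means of size m scales as m**(H - 1)."""
    x = np.asarray(x, dtype=float)
    log_m, log_sd = [], []
    for m in scales:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_sd.append(np.log(means.std(ddof=1)))
    slope, _ = np.polyfit(log_m, log_sd, 1)
    return slope + 1.0

rng = np.random.default_rng(0)
white = rng.normal(size=4096)   # uncorrelated noise, so H should be near 0.5
print(hurst_dispersional(white))
```

Long-memory processes (H > 0.5), such as the Nile record mentioned above, would show a shallower decay of the block-mean dispersion.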
53

Learning Object-Independent Modes of Variation with Feature Flow Fields

Miller, Erik G., Tieu, Kinh, Stauffer, Chris P. 01 September 2001 (has links)
We present a unifying framework in which "object-independent" modes of variation are learned from continuous-time data such as video sequences. These modes of variation can be used as "generators" to produce a manifold of images of a new object from a single example of that object. We develop the framework in the context of a well-known example: analyzing the modes of spatial deformations of a scene under camera movement. Our method learns a close approximation to the standard affine deformations that are expected from the geometry of the situation, and does so in a completely unsupervised (i.e. ignorant of the geometry of the situation) fashion. We stress that it is learning a "parameterization", not just the parameter values, of the data. We then demonstrate how we have used the same framework to derive a novel data-driven model of joint color change in images due to common lighting variations. The model is superior to previous models of color change in describing non-linear color changes due to lighting.
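The core of the learning step is a PCA over observed flow fields: each frame-to-frame deformation is a vector of per-pixel displacements, and the leading principal components are the learned modes. The sketch below illustrates only that step, on synthetic affine flow fields rather than flows estimated from video, so it is an assumption-laden toy rather than the paper's method:

```python
import numpy as np

# Each sample is a flattened flow field (dx, dy at every pixel).
h = w = 16
ys, xs = np.mgrid[0:h, 0:w].astype(float)

def affine_flow(a, b, tx, ty):
    """Flow field of a small affine (scale/rotation/translation) motion,
    flattened to one row vector."""
    dx = a * xs + b * ys + tx
    dy = -b * xs + a * ys + ty
    return np.concatenate([dx.ravel(), dy.ravel()])

rng = np.random.default_rng(1)
flows = np.stack([affine_flow(*rng.normal(scale=0.1, size=4))
                  for _ in range(200)])
flows -= flows.mean(axis=0)
_, s, _ = np.linalg.svd(flows, full_matrices=False)
variance = s**2 / (s**2).sum()
# Four affine parameters generated the data, so about 4 principal
# components should capture essentially all the variance.
print(variance[:4].sum())
```

With real video, the recovered components approximate the affine modes without being told the imaging geometry, which is the unsupervised aspect stressed in the abstract.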
54

Biologically Plausible Neural Circuits for Realization of Maximum Operations

Yu, Angela J., Giese, Martin A., Poggio, Tomaso A. 01 September 2001 (has links)
Object recognition in the visual cortex is based on a hierarchical architecture in which specialized brain regions along the ventral pathway extract object features of increasing complexity, accompanied by greater invariance to stimulus size, position, and orientation. Recent theoretical studies postulate that a non-linear pooling function, such as the maximum (MAX) operation, could be fundamental in achieving such invariance. In this paper, we are concerned with neurally plausible mechanisms that may be involved in realizing the MAX operation. Four canonical circuits are proposed, each based on neural mechanisms that have been previously discussed in the context of cortical processing. Through simulations and mathematical analysis, we examine the relative performance and robustness of these mechanisms. We derive experimentally verifiable predictions for each circuit and discuss their respective physiological considerations.
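A common mathematical idealization of such pooling is a softmax: a weighted mean that interpolates between plain averaging and the hard MAX as a gain parameter grows. The sketch below is a generic illustration of that idea, not one of the paper's four circuits:

```python
import numpy as np

def softmax_pool(responses, q):
    """Soft maximum of afferent responses: a mean weighted by exp(q*r).
    q = 0 gives the plain mean; as q grows the output approaches
    max(responses). Divisive-normalization circuits with a response
    exponent behave similarly."""
    r = np.asarray(responses, dtype=float)
    w = np.exp(q * (r - r.max()))   # shift for numerical stability
    return float((w * r).sum() / w.sum())

r = [0.2, 0.9, 0.4]
print(softmax_pool(r, 0))    # the plain mean of the responses
print(softmax_pool(r, 50))   # close to the maximum response
```

The gain q is the kind of parameter whose physiological realization (e.g. via lateral inhibition strength) distinguishes candidate circuits.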
55

Generalization over contrast and mirror reversal, but not figure-ground reversal, in an "edge-based

Riesenhuber, Maximilian 10 December 2001 (has links)
Baylis & Driver (Nature Neuroscience, 2001) have recently presented data on the responses of neurons in macaque inferotemporal cortex (IT) to various stimulus transformations. They report that neurons can generalize over contrast and mirror reversal, but not over figure-ground reversal. This finding is taken to demonstrate that "the selectivity of IT neurons is not determined simply by the distinctive contours in a display, contrary to simple edge-based models of shape recognition", citing our recently presented model of object recognition in cortex (Riesenhuber & Poggio, Nature Neuroscience, 1999). In this memo, I show that the main effects of the experiment can be obtained by performing the appropriate simulations in our simple feedforward model. This suggests that the contributions to IT cell tuning of the explicit edge-assignment processes postulated by Baylis & Driver (2001) might be smaller than expected.
56

Evaluation of sets of oriented and non-oriented receptive fields as local descriptors

Yokono, Jerry Jun, Poggio, Tomaso 24 March 2004 (has links)
Local descriptors are increasingly used for object recognition because of their perceived robustness to occlusions and to global geometric deformations. We propose a performance criterion for a local descriptor based on the tradeoff between selectivity and invariance. In this paper, we evaluate several local descriptors with respect to these two properties. The descriptors evaluated are Gaussian derivatives up to third order, gray-level image patches, and Laplacian-based descriptors with filters at either three scales or a single scale. We compare selectivity and invariance under several image changes, such as rotation, scale, brightness, and viewpoint. Comparisons were made keeping the dimensionality of the descriptors roughly constant. The overall results indicate good performance by the descriptor based on a set of oriented Gaussian filters. It is interesting that oriented receptive fields similar to Gaussian derivatives, as well as receptive fields similar to the Laplacian, are found in primate visual cortex.
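A descriptor in the Gaussian-derivative family collects the responses of derivative-of-Gaussian filters at a point (a "local jet"). The sketch below builds such a descriptor with numerical derivative kernels; it is an illustration of the filter family evaluated here, not the authors' implementation, and the patch size and sigma are invented:

```python
import numpy as np

def gauss_deriv_kernel(order_y, order_x, sigma=2.0, size=21):
    """2-D Gaussian derivative kernel as an outer product of 1-D
    derivatives (numerical differentiation is fine for a sketch)."""
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    def deriv(n):
        k = g.copy()
        for _ in range(n):
            k = np.gradient(k, x)
        return k
    return np.outer(deriv(order_y), deriv(order_x))

def gaussian_jet(patch, max_order=3):
    """Descriptor = patch responses to all Gaussian derivatives with
    order_y + order_x <= max_order."""
    desc = [float((patch * gauss_deriv_kernel(oy, t - oy)).sum())
            for t in range(max_order + 1) for oy in range(t + 1)]
    return np.array(desc)

rng = np.random.default_rng(2)
patch = rng.normal(size=(21, 21))
print(gaussian_jet(patch).shape)   # orders 0..3 give 1+2+3+4 = 10 responses
```

Steering such oriented responses to a reference orientation is one way the rotation-invariance side of the tradeoff is handled.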
57

Invariance Properties and Performance Evaluation of Bit Decoding Algorithms

Abedi, Ali January 2004 (has links)
Certain properties of optimal bitwise APP (A Posteriori Probability) decoding of binary linear block codes are studied. The focus is on the probability density function (pdf) of the bit log-likelihood ratio (LLR). A general channel model with discrete (not necessarily binary) input and discrete or continuous output is considered. It is proved that, under a set of mild conditions on the channel, the pdf of the bit LLR at a specific bit position is independent of the transmitted codeword. It is also shown that the pdfs of a given bit LLR, when the corresponding bit takes the values zero and one, are symmetric with respect to each other (reflections of one another about the vertical axis). For channels with binary inputs, a sufficient condition for two bit positions to have the same pdf is presented. An analytical method is proposed for approximate performance evaluation of binary linear block codes over an Additive White Gaussian Noise (AWGN) channel with Binary Phase Shift Keying (BPSK) modulation. The pdf of the bit LLR is expressed in terms of a Gram-Charlier series expansion, which requires knowledge of the statistical moments of the bit LLR. An analytical method for calculating these moments, based on recursive calculations involving certain weight-enumerating functions of the code, is introduced. It is proved that the approximation can be made as accurate as desired by using enough terms of the Gram-Charlier series expansion. A new method for the performance evaluation of turbo-like codes is also presented, based on estimating the pdf of the bit LLR with an exponential model; moment matching is combined with the maximum-entropy principle to estimate the model parameters. A simple method is developed for computing the Probabilities of the Point Estimates (PPE) for the estimated parameters, as well as for the Bit Error Rate (BER). This method is shown to require significantly fewer samples than conventional Monte Carlo (MC) simulation.
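The Gram-Charlier step admits a compact sketch: a Gaussian reference density corrected by Hermite-polynomial terms weighted by higher-order cumulants. The version below is a generic truncation at third and fourth order with illustrative inputs; it is not code from the thesis:

```python
import numpy as np
from math import factorial

def gram_charlier_pdf(x, mean, var, cumulants):
    """Gram-Charlier A-series approximation of a pdf: a Gaussian
    reference corrected by probabilists' Hermite polynomials He3/He4,
    weighted by the 3rd and 4th cumulants."""
    s = np.sqrt(var)
    z = (x - mean) / s
    phi = np.exp(-z**2 / 2) / (s * np.sqrt(2 * np.pi))
    he = {3: z**3 - 3*z, 4: z**4 - 6*z**2 + 3}
    corr = np.ones_like(z)
    for n, kappa in cumulants.items():
        corr += kappa / (factorial(n) * s**n) * he[n]
    return phi * corr

x = np.linspace(-6, 6, 2001)
p = gram_charlier_pdf(x, 0.0, 1.0, {3: 0.0, 4: 0.0})  # reduces to N(0, 1)
print(p.sum() * (x[1] - x[0]))   # numerically integrates to about 1
```

In the thesis's setting, the mean, variance, and higher cumulants would come from the recursive moment calculations on the code's weight enumerators rather than being supplied by hand.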
58

Dynamics and complexity of plastic deformation: a study by acoustic emission

Richeton, Thiebaud 11 September 2006 (has links) (PDF)
Acoustic emission is a unique means of studying dislocation avalanches in terms of energy, time, and space. Acoustic emission measured during creep tests on ice single crystals had revealed strong intermittency in the plastic deformation process, associated with power-law distributions of avalanche sizes. This thesis shows that this scale-invariant critical dynamics is disturbed in ice polycrystals, where a non-trivial finite-size effect related to grain size was brought to light. The thesis also reveals that the criticality observed in ice single crystals carries over to the deformation of Cd, Zn, and Cu single crystals. In particular, temperature, forest hardening, twinning, and multiple slip did not affect the value of the critical exponent. These results suggest that single-crystal plasticity may be governed by universal dynamics.
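Power-law exponents for such avalanche-size distributions are often fitted by maximum likelihood. The sketch below shows the generic continuous-case estimator on synthetic data; the exponent value and cutoff are illustrative, not the thesis's acoustic-emission measurements:

```python
import numpy as np

def powerlaw_mle(sizes, s_min):
    """Maximum-likelihood exponent for P(s) ~ s**(-tau), s >= s_min
    (continuous case, the Hill/Clauset estimator)."""
    s = np.asarray([v for v in sizes if v >= s_min], dtype=float)
    return 1.0 + len(s) / np.log(s / s_min).sum()

# Synthetic avalanche sizes from a pure power law with tau = 1.6,
# sampled by inverting the CDF (s_min = 1):
rng = np.random.default_rng(3)
u = rng.uniform(size=50_000)
sizes = (1 - u) ** (-1 / (1.6 - 1))
print(powerlaw_mle(sizes, 1.0))   # recovers an exponent near 1.6
```

Checking whether this exponent shifts with temperature, grain size, or material is the kind of comparison the thesis makes across its Cd, Zn, Cu, and ice data.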
59

Interest Curves : Concept, Evaluation, Implementation and Applications

Li, Bo January 2015 (has links)
Image features play important roles in a wide range of computer vision applications, such as image registration, 3D reconstruction, object detection, and video understanding. These features include edges, contours, corners, regions, lines, curves, interest points, etc. However, research in these areas is fragmented, especially when it comes to line and curve detection. In this thesis, we aim to discover, integrate, evaluate, and summarize past research, as well as our own contributions, in the area of image features. The thesis provides a comprehensive framework of concept, evaluation, implementation, and applications for image features. Firstly, it proposes the novel concept of interest curves. Interest curves are a concept derived from, and extending, interest points: significant lines and arcs in an image that are repeatable under various image transformations. Interest curves bring clear guidelines and structure for future curve and line detection algorithms and related applications. Secondly, the thesis presents an evaluation framework for detecting and describing interest curves, providing a new paradigm for comparing the performance of state-of-the-art line and curve detectors under image perturbations and transformations. Thirdly, the thesis proposes an interest curve detector (Distinctive Curves, DICU), which unifies the detection of edges, corners, lines, and curves and represents our state-of-the-art contribution in these areas. Our research efforts cover the most important attributes required of these features with respect to robustness and efficiency. Interest curves preserve richer geometric information than interest points, which opens new ways of solving computer vision problems. We propose a simple description method for curve matching applications.
We have found that our proposed interest curve descriptor outperforms state-of-the-art interest point descriptors (SIFT, SURF, BRISK, ORB, FREAK). Furthermore, we design a novel object detection algorithm that uses only DICU geometry, without local feature appearance. We organize image objects as curve chains; to detect an object, we search for its curve chain in the target image using dynamic programming. The curve chain matching is scale- and rotation-invariant as well as robust to image deformations, which makes it possible to resolve the rotation-variance problem in object detection applications. In our face detection experiments, the curve chain matching method proves to be scale- and rotation-invariant and very computationally efficient. / INTRO – INteractive RObotics research network
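Matching one chain of curves against another by dynamic programming can be sketched as an edit-distance-style alignment. The code below is a generic illustration of that idea with invented descriptors and gap cost, not the DICU implementation:

```python
import numpy as np

def match_chain(query, target, gap=1.0):
    """Align one chain of curve descriptors against another by dynamic
    programming. Descriptors are vectors; the match cost is their
    Euclidean distance, and skipping a curve on either side costs `gap`."""
    q, t = len(query), len(target)
    D = np.zeros((q + 1, t + 1))
    D[:, 0] = np.arange(q + 1) * gap
    D[0, :] = np.arange(t + 1) * gap
    for i in range(1, q + 1):
        for j in range(1, t + 1):
            cost = np.linalg.norm(np.asarray(query[i-1]) -
                                  np.asarray(target[j-1]))
            D[i, j] = min(D[i-1, j-1] + cost,   # match/substitute
                          D[i-1, j] + gap,      # skip a query curve
                          D[i, j-1] + gap)      # skip a target curve
    return D[q, t]

a = [(0, 0), (1, 1), (2, 0)]
b = [(0, 0), (1, 1), (2, 0)]
print(match_chain(a, b))   # identical chains align at zero cost
```

Using descriptors that are themselves scale- and rotation-normalized is what would give the full pipeline the invariances claimed in the abstract.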
60

On the Invariance of Size Distribution of Establishments

Kamanina, Polina January 2012 (has links)
The thesis examines the establishment size distribution over time and across groups of regions, using data on Swedish establishments over the period 1994-2009. The size distribution of establishments is highly skewed and approximates the Pareto distribution. The shape of the size distribution is invariant over time and across groups of regions. The distribution of the total number of establishments and the incumbent distribution are found to arise from the same distribution. Moreover, the invariance of the establishment size distribution is largely determined by the invariance of the incumbent, entry, and exit distributions. Larger establishments are more likely to survive and to remain in their current size group than smaller ones, whereas smaller establishments have higher probabilities of growth.
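A standard visual check for the Pareto shape is the rank-size (Zipf) plot: for Pareto-distributed sizes, log rank against log size is linear with slope -alpha. The sketch below applies this to synthetic sizes; the data and the alpha value are illustrative, not the Swedish establishment data:

```python
import numpy as np

def pareto_tail_slope(sizes):
    """Slope of log(rank) vs log(size): for Pareto-distributed sizes
    with shape alpha, the points fall on a line of slope -alpha."""
    s = np.sort(np.asarray(sizes, dtype=float))[::-1]
    ranks = np.arange(1, len(s) + 1)
    slope, _ = np.polyfit(np.log(s), np.log(ranks), 1)
    return -slope

# Synthetic sizes: 1/U is Pareto with alpha = 1 (the "Zipf" case).
rng = np.random.default_rng(4)
sizes = 1.0 / rng.uniform(size=10_000)
print(pareto_tail_slope(sizes))   # recovers a slope near 1
```

Comparing such fitted slopes across years and regional groups is one way to operationalize the invariance claim made in the abstract.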
