101

TWO-SURFACE OPTICAL SYSTEMS WITH ZERO THIRD-ORDER SPHERICAL ABERRATION

Stavroudis, O. N. 15 April 1969 (has links)
QC 351 A7 no. 37 / This paper derives four one-parameter families of two-surface optical systems having the property that, relative to a well-defined pair of conjugate points, one finite and the other infinite, third-order spherical aberration is zero. The two surfaces can be either refracting or reflecting. Aperture planes are defined for which third-order astigmatism is zero. An expression for coma is also derived. Assuming that the systems will be constructible, a means of defining domains for the free parameter is indicated. Possible applications of these results to optical design are included.
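Zero third-order spherical aberration is a statement about how exact ray traces behave near the axis. A quick numerical sketch of the quantity involved, for a single spherical refracting surface (not one of the paper's four families; the radius, indices, and function names below are illustrative), traces a ray parallel to the axis and returns its axis crossing; the drift of that crossing with ray height is the longitudinal spherical aberration.

```python
import math

def refract(d, n, n1, n2):
    """Vector form of Snell's law; d is the unit incident direction and
    n the unit surface normal pointing against d."""
    cos_i = -(d[0] * n[0] + d[1] * n[1])
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection
    c = r * cos_i - math.sqrt(k)
    return (r * d[0] + c * n[0], r * d[1] + c * n[1])

def axial_crossing(R, h, n1=1.0, n2=1.5):
    """Trace a ray parallel to the axis at height h through one spherical
    refracting surface (vertex at the origin, centre at x = R) and return
    the distance from the vertex at which it crosses the axis."""
    x = R - math.sqrt(R * R - h * h)      # intersection point is (x, h)
    normal = ((x - R) / R, h / R)         # unit normal, opposing the ray
    t = refract((1.0, 0.0), normal, n1, n2)
    return x - h * t[0] / t[1]            # where the refracted ray hits y = 0

# The paraxial focus sits at n2*R/(n2 - n1); marginal rays cross short of
# it, and the gap (longitudinal spherical aberration) grows roughly as h**2.
```

A system of the kind the paper derives would keep this crossing fixed, to third order, as h varies.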
102

Sledování přepravovaných zásilek v kombinované přepravě firmy ČSKD Intrans / Consignment tracking in combined transport of company ČSKD Intrans

Frajman, Tomáš January 2010 (has links)
The master's thesis discusses the purpose and objectives of consignment tracking in combined transport. The aim is to characterize the means and methods used by the Czech combined transport operator ČSKD Intrans. This is explained using the particular example of consignments transported via the import route Hamburg-Prague. The next part of the thesis focuses on analyzing the current situation and suggesting proposals for improvement. These include the implementation of satellite navigation (GPS), status codes, and the possibilities enabled by integrating external information systems with the company's own internal operations system.
103

Responding to Moments of Learning

Goldstein, Adam B 03 May 2011 (has links)
In the field of Artificial Intelligence in Education, many contributions have been made toward estimating student proficiency in Intelligent Tutoring Systems (cf. Corbett & Anderson, 1995). Although the community is increasingly capable of estimating how much a student knows, this does not shed much light on when the knowledge was acquired. In recent research (Baker, Goldstein, & Heffernan, 2010), we created a model that attempts to answer that exact question. We call the model P(J), for the probability that a student just learned from the last problem they answered. We demonstrated an analysis of changes in P(J) that we call “spikiness”, defined as the maximum value of P(J) for a student/knowledge component (KC) pair divided by the average value of P(J) for that same student/KC pair. Spikiness is directly correlated with final student knowledge, meaning that spikes can be an early predictor of success. It has been shown that both over-practice and under-practice can be detrimental to student learning, so using this model can potentially help bias tutors toward ideal practice schedules. After demonstrating the validity of the P(J) model in both CMU's Cognitive Tutor and WPI's ASSISTments Tutoring System, we conducted a pilot study to test the utility of our model. The experiment included a balanced pre/post-test and three conditions for proficiency assessment, tested across 6 knowledge components. In the first condition, students are considered to have mastered a KC after correctly answering 3 questions in a row. The second condition uses Bayesian Knowledge Tracing and accepts a student as proficient once they earn a current knowledge probability (Ln) of 0.95 or higher. Finally, we test P(J), which accepts mastery if a student's P(J) value spikes on one problem and the first response to the next problem is correct.
In this work, we will discuss the details of deriving P(J), our experiment and its results, as well as potential ways this model could be utilized to improve the effectiveness of cognitive mastery learning.
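The spikiness statistic defined in the abstract (maximum P(J) over a student/KC pair divided by its mean) can be sketched in a few lines; the function name is illustrative, and the input is assumed to be the sequence of P(J) estimates for one student/KC pair.

```python
def spikiness(pj_values):
    """Spikiness of one student/KC pair: max P(J) over the sequence of
    problems, divided by the mean P(J) over that same sequence."""
    if not pj_values:
        raise ValueError("need at least one P(J) value")
    return max(pj_values) / (sum(pj_values) / len(pj_values))
```

A flat sequence gives spikiness 1.0; a sequence dominated by one sharp spike gives a value well above 1, which is the early predictor of success described above.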
104

Leveraging Influential Factors into Bayesian Knowledge Tracing

Qiu, Yumeng 10 January 2013 (has links)
Predicting student performance is an important part of the student modeling task in Intelligent Tutoring Systems (ITS). The state-of-the-art model for predicting student performance, Bayesian Knowledge Tracing (KT), has several critical limitations. One specific limitation is that KT has no underlying mechanism for memory decay, which means the model assumes no forgetting happens during the learning process. In addition, we notice that numerous modifications to the KT model have been proposed and evaluated; however, many of these are based on a combination of intuition and domain experience, leading to models without performance improvement. Moreover, KT is computationally expensive: model-fitting procedures can take hours or days to run on large datasets. The goal of this research is to improve the accuracy of student performance prediction by incorporating the memory decay factor that standard Bayesian Knowledge Tracing ignores. We also propose a completely data-driven and inexpensive approach to model improvement. This alternative allows researchers to evaluate which aspects of a model are most likely to result in performance improvements based purely on dataset features computed from ITS system logs.
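A minimal sketch of the standard KT update, with a hypothetical `forget` parameter of the kind the abstract argues for bolted on (this is one simple way to model decay, not the thesis's actual formulation; all parameter values are illustrative):

```python
def bkt_update(p_know, correct, guess=0.2, slip=0.1, learn=0.3, forget=0.0):
    """One step of Bayesian Knowledge Tracing.

    `forget` is a hypothetical memory-decay parameter; standard BKT
    fixes forget = 0, i.e. mastery is never lost.
    """
    if correct:
        evidence = p_know * (1 - slip)
        posterior = evidence / (evidence + (1 - p_know) * guess)
    else:
        evidence = p_know * slip
        posterior = evidence / (evidence + (1 - p_know) * (1 - guess))
    # Bayesian update on the observation, then the learning transition,
    # then (one simple choice) a multiplicative decay toward forgetting.
    return (posterior + (1 - posterior) * learn) * (1 - forget)
```

With `forget = 0` this reduces to the standard Corbett & Anderson update; any positive decay pulls the knowledge estimate back down between opportunities.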
105

Trying to Reduce Gaming Behavior by Students in Intelligent Tutoring Systems

Forbes-Summers, Elijah 03 May 2010 (has links)
Student gaming behavior in intelligent tutoring systems (ITS) has been correlated with lower learning rates. The goal of this work is to identify such behavior, produce interventions to discourage this behavior, and by doing so hopefully improve the learning rate of students who would normally display gaming behavior. Detectors have been built to identify gaming behavior. Interventions have been designed to discourage the behavior and their evaluation is discussed.
106

Developing a Cognitive Rule-Based Tutor for the ASSISTment System

Rasmussen, Kai 09 January 2007 (has links)
The ASSISTment system is a web-based tutor that is currently being used as an eighth- and tenth-grade mathematics tutor in both Massachusetts and Pennsylvania. This system represents its tutors as state-based "pseudo-tutors" which mimic a more complex cognitive tutor based on a set of production rules. It has been shown that building pseudo-tutors significantly decreases the time spent authoring content. This is an advantage for authoring systems such as the ASSISTment builder, though it sacrifices greater expressive power and flexibility. A cognitive tutor models a student's behavior with general logical rules. Through model-tracing of a cognitive tutor's rule space, a system can find the reasons behind a student action and give better tutoring. In addition, these cognitive rules are general and can be used for many different tutors. It is the goal of this thesis to provide the architecture for using cognitive rule-based tutors in the ASSISTment system. A final requirement is that running these computationally intensive model-tracing tutors does not slow down students using the pseudo-tutors, which represents the majority of ASSISTment usage. This can be achieved with remote computation, realized with SOAP web services. The system was further extended to allow the creation and implementation of user-level experiments within the system. These experiments allow the testing of pedagogical choices. We implemented a hint-dissuasion experiment to test this experimental framework and report those results.
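The core idea of model tracing, matching an observed student step against the firings of general production rules, can be illustrated in miniature (this is a toy example, not the ASSISTment architecture; the rules and state encoding for an equation a*x + b = c are invented for illustration):

```python
# Toy production rules for solving a*x + b = c; state is the tuple (a, b, c).
def subtract_both_sides(state):
    a, b, c = state
    return (a, 0, c - b)

def divide_both_sides(state):
    a, b, c = state
    if b == 0 and a != 0:
        return (1, 0, c / a)
    return None  # rule does not apply in this state

RULES = {
    "subtract-both-sides": subtract_both_sides,
    "divide-both-sides": divide_both_sides,
}

def model_trace(state, observed):
    """Return the name of the rule whose firing reproduces the student's
    observed next state, or None if the step is off-path (triggering
    tutoring feedback in a real system)."""
    for name, rule in RULES.items():
        if rule(state) == observed:
            return name
    return None
```

Because the rules are general, the same trace loop serves any tutor built on them; the thesis's contribution is running this kind of computation remotely (via SOAP) so pseudo-tutor users are unaffected.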
107

Boredom and student modeling in intelligent tutoring systems

Hawkins, William J 25 April 2014 (has links)
Over the past couple of decades, intelligent tutoring systems (ITSs) have become popular in education. ITSs are effective at helping students learn (VanLehn, 2011; Razzaq, Mendicino & Heffernan, 2008; Koedinger et al, 1997) and help researchers understand how students learn. Such research has included modeling how students learn (Corbett & Anderson, 1995), the effectiveness of help given within an ITS (Beck et al, 2008), the difficulty of different problems (Pardos & Heffernan, 2011), and predicting long-term outcomes like college attendance (San Pedro et al, 2013a), among many other studies. While most studies have focused on ITSs from a cognitive perspective, a growing number of researchers are paying attention to the motivational and affective aspects of tutoring, which have been recognized as important components of human tutoring (Lepper et al, 1993). Recent work has shown that student affect within an ITS can be detected, even without physical sensors or cameras (D’Mello et al, 2008; Conati & Maclaren, 2009; Sabourin et al, 2011; San Pedro et al, 2013b). Initial studies with these sensor-less affect detectors have shown that certain problematic affective states, such as boredom, confusion and frustration, are prevalent within ITSs (Baker et al, 2010b). Boredom in particular has been linked to negative learning outcomes (Pekrun et al, 2010; Farmer & Sundberg, 1986) and long-term disengagement (Farrell, 1988). Therefore, reducing or responding effectively to these affective states within ITSs may improve both short- and long-term learning outcomes. This work is an initial attempt to determine what causes boredom in ITSs. First, we determine which is more responsible for boredom in ITSs: the content in the system, or the students themselves. Based on the findings of that analysis, we conduct a randomized controlled trial to determine the effects of monotony on student boredom.
In addition to the work on boredom, we also perform analyses that concern student modeling, specifically how to improve Knowledge Tracing (Corbett & Anderson, 1995), a popular student model used extensively in real systems like the Cognitive Tutors (Koedinger et al, 1997) and in educational research.
108

Risk Aversion and Information Acquisition Across Real and Hypothetical Settings

Taylor, Matthew January 2012 (has links)
I collect data on subjects' information acquisition during real and hypothetical risky choices using process-tracing software called Mouselab. I also measure subjects' cognitive ability using the Cognitive Reflection Test (CRT). On average, measured risk preferences are not significantly different across real and hypothetical settings. However, cognitive ability is inversely related to risk aversion when choices are hypothetical, but it is unrelated when the choices are real. This interaction between cognitive ability and hypothetical setting is consistent with the notion that some individuals, specifically higher-ability individuals, treat hypothetical choices as "puzzles" and may help explain why some studies find that subjects indicate that they are more tolerant of risk when they make hypothetical choices than when they make real choices. On average, subjects demonstrate a similar degree of consistency across settings, and there are also no significant differences across settings in the amount of time subjects take to make a choice, the amount of information they acquire, or how they distribute their attention. I also find evidence to suggest that subjects acquire information in a manner consistent with the implicit calculation of expected utility. Specifically, individuals do not merely make choices "as if" they are integrating probabilities and outcomes; it appears that they actually are. Moreover, as they progress through a series of choices in a commonly used risk preference elicitation method, their information acquisition becomes progressively more consistent with integration models. Finally, on average, individuals appear to acquire information in real and hypothetical settings in similar ways.
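The "integration" benchmark the abstract refers to is expected utility: weighting each outcome's utility by its probability and summing. A minimal sketch (the CRRA utility form and all parameter values are illustrative choices, not taken from the dissertation):

```python
import math

def crra_utility(x, rho):
    """Constant relative risk aversion utility, a standard functional
    form; rho > 0 is risk-averse, rho = 0 is risk-neutral."""
    if rho == 1.0:
        return math.log(x)
    return (x ** (1.0 - rho)) / (1.0 - rho)

def expected_utility(lottery, rho):
    """lottery: list of (probability, payoff) pairs summing to 1."""
    return sum(p * crra_utility(x, rho) for p, x in lottery)

def choose(lottery_a, lottery_b, rho):
    # An expected-utility integrator picks whichever lottery has higher EU.
    if expected_utility(lottery_a, rho) >= expected_utility(lottery_b, rho):
        return "A"
    return "B"
```

The process-tracing claim is that subjects' attention patterns in Mouselab look like the probability-times-outcome lookups this calculation requires, not just choices consistent with its output.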
109

Síntese de fenômenos naturais através do traçado de raios usando "height fields" / Synthesis of natural phenomena through ray tracing using height fields

Silva, Franz Josef Figueroa Ferreira da January 1996 (has links)
A síntese de imagens é uma ferramenta valiosa na compreensão de diversos fenômenos da natureza. Nos últimos anos várias abordagens têm sido propostas para sintetizar tais fenômenos. A grande maioria de tais abordagens tem se centralizado no desenvolvimento de modelos procedurais. Porém, cada uma destas técnicas simula exclusivamente um fenômeno natural. Um dos métodos de síntese de imagens fotorrealistas mais proeminente é denominado de Traçado de Raios (Ray Tracing). Contudo, apesar de produzir imagens de excelente qualidade, este método é computacionalmente muito oneroso. A síntese de fenômenos naturais utilizando-se o traçado de raios é um desafio. É importante que este problema seja abordado, apesar da sua complexidade, pois a simulação fotorrealista da natureza é muito importante para os cientistas e pesquisadores desde o surgimento dos computadores. Um algoritmo versátil e rápido para a síntese de fenômenos da natureza através do traçado de raios utilizando campos de altitude é proposto. O algoritmo utiliza uma modificação do algoritmo do Analisador Diferencial Digital de Bresenham para atravessar uma matriz bidimensional de valores de altitude. A determinação das primitivas geométricas a serem interseccionadas por um raio é obtida em tempo O(√N), sendo N o número de altitudes no campo de altitude. Este trabalho faz uma comparação em termos de velocidade e realismo deste método com outras abordagens convencionais; e discute as implicações que a implementação deste método traz. Finalmente, destaca-se a simplicidade e versatilidade que este método proporciona devido à pequena quantidade de parâmetros necessária para a síntese de fenômenos naturais utilizando o traçado de raios. Para a criação de animações basta a especificação de novos parâmetros num intervalo de tempo diferente. / Visualization is a powerful tool for better understanding of several natural phenomena. In recent years, several techniques have been proposed. 
Considerable interest in natural scene synthesis has focused on procedural models. However, these techniques produce synthetic scenes of only one natural phenomenon. Ray tracing is one of the most photorealistic methods of image synthesis. While providing images of excellent quality, ray tracing is a computationally intensive task. Natural scene synthesis is a challenging problem within the realm of ray tracing. It is important to tackle this problem, despite its complexity, because photorealistic simulation of nature has been important to the scientific community since the appearance of computers. A fast and versatile algorithm for ray tracing natural scenes through height fields is presented. The algorithm employs a modified Bresenham DDA to traverse a two-dimensional array of height values. The objects tested for intersection are located in O(√N) time, where N is the number of values in the field. This work compares the speed-up and photorealism achieved in natural scene synthesis using this method with other algorithms and discusses the implications of implementing this approach. As a final point, the simplicity and versatility of synthesizing complex natural scenes from a few parameters and data is especially attractive. Animated sequences require only the additional specification of time-modified parameters or data.
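The DDA grid traversal at the heart of this kind of height-field ray tracer can be sketched as follows: walk a ray through the cells of a 2D grid, visiting only the cells the ray actually crosses (hence the sub-linear intersection cost). This is a generic grid-marching sketch, not the thesis's Bresenham variant; names are illustrative.

```python
def traverse(grid_w, grid_h, ox, oy, dx, dy):
    """Return the (ix, iy) grid cells crossed by a ray starting at
    (ox, oy) with direction (dx, dy), in the order they are entered.
    In a height-field ray tracer, only these cells' surface patches
    need be tested for ray intersection."""
    ix, iy = int(ox), int(oy)
    step_x = 1 if dx > 0 else -1
    step_y = 1 if dy > 0 else -1
    # Ray parameter t at which the next vertical / horizontal cell
    # boundary is crossed, and the t increment per whole cell.
    t_max_x = ((ix + (step_x > 0)) - ox) / dx if dx != 0 else float("inf")
    t_max_y = ((iy + (step_y > 0)) - oy) / dy if dy != 0 else float("inf")
    t_delta_x = abs(1.0 / dx) if dx != 0 else float("inf")
    t_delta_y = abs(1.0 / dy) if dy != 0 else float("inf")
    cells = []
    while 0 <= ix < grid_w and 0 <= iy < grid_h:
        cells.append((ix, iy))
        if t_max_x < t_max_y:          # vertical boundary comes first
            t_max_x += t_delta_x
            ix += step_x
        else:                          # horizontal boundary comes first
            t_max_y += t_delta_y
            iy += step_y
    return cells
```

A ray across an n-by-n grid visits O(n) cells while the field holds N = n² height values, which is where the O(√N) intersection cost comes from.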
110

Ray-traced radiative transfer on massively threaded architectures

Thomson, Samuel Paul January 2018 (has links)
In this thesis, I apply techniques from the field of computer graphics to ray tracing in astrophysical simulations, and introduce the grace software library. This is combined with an extant radiative transfer solver to produce a new package, taranis. It allows for fully-parallel particle updates via per-particle accumulation of rates, followed by a forward Euler integration step, and is manifestly photon-conserving. To my knowledge, taranis is the first ray-traced radiative transfer code to run on graphics processing units and target cosmological-scale smoothed particle hydrodynamics (SPH) datasets. A significant optimization effort is undertaken in developing grace. Contrary to typical results in computer graphics, it is found that the bounding volume hierarchies (BVHs) used to accelerate the ray tracing procedure need not be of high quality; as a result, extremely fast BVH construction times are possible (< 0.02 microseconds per particle in an SPH dataset). I show that this exceeds the performance researchers might expect from CPU codes by at least an order of magnitude, and compares favourably to a state-of-the-art ray tracing solution. Similar results are found for the ray tracing itself, where again techniques from computer graphics are examined for effectiveness with SPH datasets, and new optimizations proposed. For high per-source ray counts (≳ 10⁴), grace can reduce ray tracing run times by up to two orders of magnitude compared to extant CPU solutions developed within the astrophysics community, and by a factor of a few compared to a state-of-the-art solution. taranis is shown to produce expected results in a suite of de facto standard cosmological radiative transfer test cases. For some cases, it currently outperforms a serial, CPU-based alternative by a factor of a few. Unfortunately, for the most realistic test its performance is extremely poor, making the current taranis code unsuitable for cosmological radiative transfer. 
The primary reason for this failing is found to be a small minority of particles which always dominate the timestep criteria. Several plausible routes to mitigate this problem, while retaining parallelism, are put forward.
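The very fast, quality-insensitive BVH builds described above are characteristic of linear BVH (LBVH) construction, whose core step is sorting primitives by Morton code so spatially nearby particles become adjacent in memory. A sketch of that step (a generic illustration, not the grace implementation; grace targets GPUs, whereas this is plain Python):

```python
def expand_bits(v):
    # Spread the low 10 bits of v so consecutive bits land 3 apart
    # (the standard bit-interleaving trick for 30-bit 3D Morton codes).
    v = (v * 0x00010001) & 0xFF0000FF
    v = (v * 0x00000101) & 0x0F00F00F
    v = (v * 0x00000011) & 0xC30C30C3
    v = (v * 0x00000005) & 0x49249249
    return v

def morton3d(x, y, z):
    # Coordinates assumed normalised to [0, 1); quantise each to 10 bits
    # and interleave them into a single 30-bit code.
    def quantise(c):
        return min(max(int(c * 1024.0), 0), 1023)
    return (expand_bits(quantise(x)) << 2) | \
           (expand_bits(quantise(y)) << 1) | \
            expand_bits(quantise(z))

def lbvh_order(points):
    """Return particle indices sorted by Morton code; nearby particles end
    up adjacent, so BVH nodes can then be emitted in a single linear pass."""
    return sorted(range(len(points)), key=lambda i: morton3d(*points[i]))
```

On a GPU the sort is a parallel radix sort, which is why construction cost can drop to fractions of a microsecond per particle even when the resulting tree is of modest quality.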
