61

Complexité d'ordre supérieur et analyse récursive / Higher order complexity and computable analysis

Férée, Hugo 10 December 2014 (has links)
While first-order complexity is well defined and studied, there is no satisfactory notion of complexity at higher orders. Such a theory already exists at order 2 and provides a complexity class analogous to the usual polynomial-time computable functions. This is especially interesting in the domain of computable analysis, where real numbers and real functions, among other objects, can be represented by first-order functions. In particular, there is a clear link between computability and continuity, and we also illustrate, in the case of real operators, that complexity can be related to certain analytical properties. However, we prove that, from a complexity point of view, some mathematical spaces cannot be faithfully represented by order-1 functions and require higher-order ones. This result underlines the need for a notion of complexity at higher types, which will be useful in particular, but not only, to computable analysis. We have developed a computational model for higher-type sequential computation based on a game-semantics approach, where the application of one function to another is represented by a game opposing two strategies. By defining the size of such strategies, we obtain a robust and meaningful notion of complexity at all types, together with a class of polynomial-time computable higher-order functionals which seems to be a good candidate for a class of feasible functionals at higher types.
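For readers less familiar with computable analysis, the following is a sketch of the standard type-two representation alluded to above (a common textbook convention; the thesis may use a different but equivalent encoding). A real number is named by a first-order function, so a real function is already a type-2 object and a real operator a type-3 one:

```latex
% A name of a real x is a function phi : N -> Q converging rapidly to x.
% A real function f : R -> R is then computed by a type-2 functional F
% sending names to names, and a real operator (e.g. integration on C[0,1])
% is computed one type higher still -- which is why a complexity theory
% at all finite types is needed.
\forall n\in\mathbb{N}:\; |\varphi(n)-x|\le 2^{-n},
\qquad
F:(\mathbb{N}\to\mathbb{Q})\to(\mathbb{N}\to\mathbb{Q}),\quad
F(\varphi_x)=\varphi_{f(x)}.
```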
62

Uma introdução à lógica de segunda ordem / An Introduction to Second-Order Logic

Júnior, Enéas Alves Nogueira 26 April 2013 (has links)
In this work we investigate some aspects of Second-Order Logic, splitting the theme into three chapters. In the first one, we discuss the basic concepts of this Logic, such as sets of formulas, deductive systems, and semantics. We also make a contrast with First-Order Logic, which is better known, in order to have a kind of model from which we are differentiating. We prove the completeness theorem for Second-Order Logic, due to L. Henkin in Henkin (1950). In the second chapter we try to understand what happens with the semantics of the ZFC set theory (which is a first-order theory) if we add some second-order axioms, creating a theory that we call ZF2. We prove a theorem due to Zermelo (Zermelo (1930)) which says that the models of this theory are essentially the same. We also investigate the question of the Continuum Hypothesis in relation to ZF2 and, through a forcing method for this theory, we show that CH remains unanswered. In the third chapter, we write about three different themes: the first is the relation between the completeness property, compactness, and Henkin semantics. Lindström's theorem, which we prove in this section, says essentially that we cannot have completeness and compactness for Second-Order Logic unless we use this semantics. In the second section, we investigate the Hanf number of Second-Order Logic with standard semantics and, in the third section, we show that it is possible to reduce logics of order higher than the second to second order, and that the set of valid formulas of Second-Order Logic is not definable in the structure of the natural numbers.
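As a small, standard illustration of the expressive power at stake (a textbook example, not taken from the thesis itself), the second-order induction axiom quantifies over all subsets of the domain:

```latex
% Second-order induction: X ranges over all subsets of the domain.
\forall X \bigl( X(0) \wedge \forall n\,(X(n) \rightarrow X(S(n)))
                 \rightarrow \forall n\, X(n) \bigr)
```

Under standard semantics this axiom, together with the other Peano axioms, determines the natural numbers up to isomorphism, which is why compactness and a complete proof system fail; under Henkin semantics the set variables range only over a designated collection of subsets and the familiar first-order metatheorems are recovered — the trade-off made precise by the Lindström-style result discussed in the third chapter.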
63

Higher-Order Thought and Borderline Cases of Consciousness: An Objection to HOT

Beach, Francesca Karin 01 January 2019 (has links)
David Rosenthal, in his Higher-Order Thought (HOT) theory of consciousness, argues that it is a higher-order thought to the effect that the subject is in a given state that makes one conscious of his or her own mental states. In this paper, I argue that since phenomenal consciousness can be vague and Rosenthal's HOT cannot, HOT is not a necessary condition of phenomenal consciousness. I use primarily Ned Block's refrigerator-hum case and Sartre's example of non-positional awareness to argue that the threshold determining the degree of first-person awareness necessary for a mental state to be conscious is itself vague, and therefore consciousness is a vague concept. HOT cannot accommodate borderline cases of phenomenal consciousness, and therefore it cannot be a necessary condition of all conscious mental states. This is especially relevant to the discussion of non-human animal consciousness, as HOT theories such as Carruthers's have been used to deny non-human animal consciousness on the basis of the on/off character of such representational theories.
64

Teacher Perception of Technology as a Conduit to Acquiring Critical Thinking Skills

Patrick, Wanda Pearl 01 January 2016 (has links)
Seventh-grade and eighth-grade special education students struggle to learn higher-order thinking skills in pre-algebra and algebra, a problem that can be addressed by using technology. However, little is known about science, technology, engineering, and math (STEM) teachers' attitudes toward, and actual use of, calculators and technology to assess students' development of higher-order thinking skills. The purpose of this qualitative case study was to explore the perceptions of rural middle school Grade 7 and 8 STEM teachers in one Western state. This study used Gardner's multiple intelligences and Armstrong's neurodiversity theories as a framework. Participants were 10 Grade 7 and 8 STEM teachers in a Western state. Data sources included interviews, surveys, and teacher journals. Open coding allowed the identification of similar threads, common words, or expressions, which were then examined for themes and patterns. The emergent themes included a need for training, teachers' technological expectations, and whether teachers could meet grade-level standards and help students succeed. This study supports social change by informing school administrators and teachers how technology is and is not being used in the classroom and how its use can be facilitated in the future.
65

Getting the HOTS with what's in the box: Developing higher order thinking skills within a technology-rich learning environment

McMahon, Graham January 2007 (has links)
Educators are divided with regard to the value of computer technology as a learning tool. Some maintain that computers have had little impact on students' learning; others suggest that computers have the potential to enhance learning. Within this second group there are those who believe that computers are having a significant impact, while others believe that their potential is yet to be realised. The purpose of this study was to examine the relationship between students working in a technology-rich environment and their development of higher-order (critical and creative) thinking skills. Staff and students from one school participated in this case study. Data were collected by teachers as part of the normal teaching-learning program, supplemented by classroom observations and teacher interviews. In addition, data pertaining to the technology infrastructure were collated from school databases. The data were used to determine the degree of correlation between factors of the learning environment and the extent to which higher-order thinking skills (HOTS) were demonstrated by the students. Collating the statistically significant and statistically non-significant correlations allowed relationships between environmental factors and HOTS to be established.

The results indicate that studying within a technology-rich learning environment improves students' higher-order thinking skills, as determined by measuring their critical and creative thinking. Factors such as length of time spent in the environment have a positive, non-linear effect on the development of critical thinking skills. These factors have no significant correlation with the development of creative thinking skills. The interaction between students' computer skills and classroom environmental factors was shown to be complex. Three-dimensional correlations were performed to derive equations that explain these interactions. Students with better-developed computing skills scored higher on critical and creative thinking activities. This was most significant for students with better computer programming skills and the ability to competently manipulate Boolean logic. The most significant factors in developing higher-order thinking skills were the students' levels of computer skills, tempered by their attitudes towards computers and computer classes, and the teacher-student relationships within the technology-rich learning environment. The research suggests that, in order to develop students' higher-order thinking skills, schools should endeavour to integrate technology across all of the learning areas. This will allow students to apply technology to the attainment of higher levels of cognition within specific contexts, and it will need to be paralleled by giving students the opportunity to develop appropriate computer skills.
66

Optimal Algorithmic Techniques of LASIK Procedures

Yi, Fan, n/a January 2006 (has links)
Clinical wavefront-guided corneal ablation is now the most technologically advanced method of reducing dependence on glasses and contact lenses. It has the potential not only to eliminate spherocylindrical errors but also to reduce higher-order aberrations (HOA). Recent statistics show that more than 96% of patients who received laser in situ keratomileusis (LASIK) treatment reported satisfaction with the improvement in their vision six months after surgery. However, some patients still complain that their visual performance did not meet expectations or was even worse than before surgery. The causes of unexpected post-surgical outcomes include undercorrection, overcorrection, induced HOA, and other postoperative conditions, most of which stem from inaccurate ablation, alongside other pathological factors. Finding methods to optimize LASIK procedures and provide higher surgical precision has therefore become increasingly important. A proper method of calculating the ablation profile and an effective way of controlling the laser beam size and shape are the key aspects addressed in this research. In this Master of Philosophy thesis, the author performs a detailed study of existing methods of ablation-profile calculation and investigates the efficiency of wavefront-only ablation through a computer simulation applied to real patient data. Finally, the concept of a refractive surgery system with a dynamic beam-shaping function is sketched, which can theoretically overcome the disadvantages of traditional procedures that use a finite laser beam size.
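As a rough indication of what an ablation-profile calculation involves, the sketch below uses the classical Munnerlyn approximation for a myopic correction (a textbook first-order formula, not the algorithm developed in this thesis; the function names are illustrative only):

```python
import numpy as np

def munnerlyn_central_depth(diopters: float, zone_mm: float) -> float:
    """Approximate central ablation depth (micrometres) for a myopic
    correction, using the classical Munnerlyn approximation
    depth ~= zone^2 * |D| / 3."""
    return (zone_mm ** 2) * abs(diopters) / 3.0

def myopic_profile(r_mm: np.ndarray, diopters: float, zone_mm: float) -> np.ndarray:
    """Parabolic approximation of the ablation depth (micrometres) at
    radial distance r from the centre of the ablation zone; zero at the edge."""
    t0 = munnerlyn_central_depth(diopters, zone_mm)
    r_edge = zone_mm / 2.0
    profile = t0 * (1.0 - (r_mm / r_edge) ** 2)
    return np.clip(profile, 0.0, None)

# Example: a -4 D correction over a 6 mm optical zone removes roughly
# 48 micrometres of tissue at the centre.
r = np.linspace(0.0, 3.0, 7)
print(munnerlyn_central_depth(-4.0, 6.0))   # ~48.0
print(myopic_profile(r, -4.0, 6.0))
```

A wavefront-guided profile refines this idea by deriving the depth map from the measured wavefront error rather than from the spherocylindrical prescription alone.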
67

EEG based Macro-Sleep-Architecture and Apnea Severity Measures

Vinayak Swarnkar Unknown Date (has links)
Obstructive Sleep Apnea-Hypopnea Syndrome (OSAHS) is a serious sleep disorder affecting up to 24% of men and 9% of women in the middle-aged population. The current standard for OSAHS diagnosis is polysomnography (PSG), which refers to the continuous monitoring of multiple physiological variables over the course of a night. The main outcomes of the PSG test are the OSAHS severity measures, such as the Respiratory Disturbance Index (RDI), arousal index, latencies, and other information used to determine the macro sleep architecture (MSA), which is defined by the Wake, Rapid-Eye-Movement (REM) and non-REM states of sleep. The MSA results are essential for computing the diagnostic measures reported in a PSG. Existing methods of MSA analysis require the recording of 5-7 electrophysiological signals, including the Electroencephalogram (EEG), Electrooculogram (EOG), and Electromyogram (EMG). Sleep clinicians have to depend on manual scoring of the overnight data records using the criteria given by Rechtschaffen and Kales (R&K, 1968). Manual MSA analysis is tedious and subjective, and suffers from inter- and intra-scorer variability. Additionally, the RDI and Apnea-Hypopnea Index (AHI) parameters, although used as the primary measures of OSAHS severity, suffer from subjectivity, low reproducibility and a poor correlation with the symptoms of OSAHS. Sleep is essentially a neuropsychological phenomenon, and the EEG remains the best technique for functional imaging of the brain during sleep. The EEG is the direct result of the neuronal activity of the brain. However, despite this potential, the wealth of information available in the EEG signal remains virtually untapped in current OSAHS diagnosis. Although the EEG is extensively used in traditional sleep analysis, its usage is mainly limited to staging sleep based on the four-decade-old R&K criteria. This thesis addresses these issues plaguing the PSG. We develop a novel, fully automated algorithm (Higher-order Estimated Sleep States, the HESS algorithm) for MSA analysis, which requires only one channel of EEG data. We also develop an objective MSA analysis technique that uses a single, one-dimensional slice of the bispectrum of the EEG, representing a nonlinear transformation of a system function that can be considered the EEG generator. The agreement between the human scorers and the proposed technology was found to be in the range of 70%-87%, similar to that possible between expert human scorers. The ability of the HESS algorithm to compute the MSA parameters reliably and objectively will make a dramatic impact on the diagnosis and treatment of OSAHS and other sleep disorders, such as insomnia. The proposed technology uses low-computation-load bispectrum techniques independent of the R&K criteria (1968), making real-time automated analysis a reality. In the thesis we also propose a new index (the IHSI) to characterise the severity of sleep apnea. The new index is based on the hemispherical asymmetry of the brain and is computed from EEG coherence analysis. We achieved a significant (p=0.0001) accuracy of up to 91% in classifying patients into apneic and non-apneic groups. Our statistical analysis results show that the IHSI has the potential to provide a reproducible measure to assist in the diagnosis of OSAHS.
With the methods proposed in this thesis, it may be possible to develop technology that will not only screen OSAHS patients but also provide OSAHS diagnosis with detailed sleep architecture via a home-based test. These technologies will simplify the instrumentation dramatically and will make it possible to extend EEG/MSA analysis to portable systems as well.
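For orientation, the sketch below shows a direct, FFT-based estimate of the diagonal slice of the bispectrum, B(f, f) = E[X(f) X(f) X*(2f)] (a generic illustration of the quantity involved; it is not the HESS algorithm or the specific slice used in the thesis):

```python
import numpy as np

def bispectrum_diagonal_slice(x: np.ndarray, fs: float, seg_len: int = 256):
    """Direct (FFT-based) estimate of the diagonal bispectrum slice
    B(f, f) = E[ X(f) X(f) conj(X(2f)) ], averaged over non-overlapping
    segments.  Returns frequencies (Hz) and the complex slice."""
    n_seg = len(x) // seg_len
    acc = np.zeros(seg_len // 4, dtype=complex)   # keep 2f below Nyquist
    for i in range(n_seg):
        seg = x[i * seg_len:(i + 1) * seg_len]
        seg = (seg - seg.mean()) * np.hanning(seg_len)
        X = np.fft.rfft(seg)
        k = np.arange(seg_len // 4)
        acc += X[k] * X[k] * np.conj(X[2 * k])
    freqs = np.arange(seg_len // 4) * fs / seg_len
    return freqs, acc / max(n_seg, 1)
```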
68

Designing for eAssessment of higher order thinking : An undergraduate IT online distance education course in Sri Lanka / Att designa IT-stödd bedömning av studenters förmåga till kritiskt tänkande, reflektion och problemlösning : distansutbildning i Sri Lanka

Usoof, Hakim January 2012 (has links)
Distance education has seen rapid growth over recent decades, and the rapid development of Information and Communication Technology [ICT] has been one of its main drivers. However, distance education and ICT themselves pose challenges to students and educators alike. This thesis finds its basis in the problem of high failure rates and quality-assurance issues in the Bachelor of Information Technology [BIT] distance degree programme conducted by the University of Colombo School of Computing in Sri Lanka. A Formative Assessment for Distance Education [FADE] model that promotes the development of, and assesses, higher-order skills in a collaborative online distance-learning environment was designed using a design-based research approach. The main study focused on two main problems: plagiarism in distance education [part A] and the use of technology to address issues of learning and assessment [part B]. Research questions arising from different aspects of the design required the use of multiple methodologies. Issues of plagiarism in technology-aided assessment in distance education raised questions that required a quasi-experiment and a literature survey; the empirical material of this phase of the study consisted of keystroke logs and questionnaire data. The design and evaluation of the FADE model employed a mixed-method, two-phase sequential explanatory strategy; the empirical material of this phase consisted of questionnaires, observation coding, interviews, and examination and registry data. The quasi-experimental data were analysed using a fuzzy logic engine. The questionnaire, observation-coding, and examination and registry data were analysed statistically, and interviews were used to interpret and explain the findings. The results of part A of the study indicate that there are keystroke patterns for individuals that are stable within and across different tasks. However, the literature review on plagiarism suggested using both technological and pedagogical approaches to plagiarism. Part B of the study showed relationships between the higher-order thinking demonstrated by students and their course results and attitudes. The collaborative learning skills demonstrated by students were related to their purpose in using the FADE forum and their experience of the social web. This study finds that technological tools and pedagogical practices have to be used in conjunction to limit the possibility of plagiarism. With reference to assessment focused on the development of higher-order thinking, the study indicates that assessment should be based on the student's perspective, the purpose and aim of the assessment, and the assessment environment. Furthermore, the study finds that collaboration seems particularly important in distance education.
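To indicate the kind of features keystroke-log analysis typically rests on (a generic sketch; the thesis's fuzzy-logic engine and actual feature set are not specified here), dwell and flight times can be extracted from timestamped key-down/key-up events:

```python
def keystroke_features(events):
    """Compute dwell times (key-down to key-up of the same key) and
    flight times (key-up of one key to key-down of the next) from a list
    of (timestamp_ms, key, action) tuples, action in {"down", "up"}."""
    downs, dwells, flights = {}, [], []
    last_up = None
    for t, key, action in sorted(events):
        if action == "down":
            downs[key] = t
            if last_up is not None:
                flights.append(t - last_up)
        elif action == "up" and key in downs:
            dwells.append(t - downs.pop(key))
            last_up = t
    return dwells, flights

# Hypothetical log of typing "hi": dwell ~80 ms per key, flight ~40 ms.
log = [(0, "h", "down"), (80, "h", "up"), (120, "i", "down"), (200, "i", "up")]
print(keystroke_features(log))   # ([80, 80], [40])
```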
69

Roots of Polynomials: Developing p-adic Numbers and Drawing Newton Polygons

Ogburn, Julia J 15 March 2013 (has links)
Newton polygons are constructions over the p-adic numbers used to obtain information about the roots of a polynomial or power series. In this thesis, we first investigate the construction of the field Qp of p-adic numbers. Then, we use theorems such as Eisenstein's Irreducibility Criterion, Newton's Method, Hensel's Lemma, and Strassman's Theorem to build and justify Newton polygons.
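The following sketch illustrates the basic construction the thesis builds on: plot the points (i, v_p(a_i)) for a polynomial's coefficients and take their lower convex hull; the negatives of the segment slopes give the p-adic valuations of the roots (a standard textbook construction; the helper names are illustrative):

```python
def vp(n: int, p: int) -> int:
    """p-adic valuation of a nonzero integer."""
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def newton_polygon(coeffs, p):
    """Lower convex hull of the points (i, v_p(a_i)) for a polynomial
    a_0 + a_1 x + ... + a_n x^n; returns the hull vertices.  The negatives
    of the segment slopes are the valuations of the roots."""
    pts = [(i, vp(a, p)) for i, a in enumerate(coeffs) if a != 0]
    hull = []
    for pt in pts:                      # left-to-right lower-hull scan
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # drop the middle point if it lies on or above the new edge
            if (y2 - y1) * (pt[0] - x1) >= (pt[1] - y1) * (x2 - x1):
                hull.pop()
            else:
                break
        hull.append(pt)
    return hull

# Example: x^2 + 5x + 25 over Q_5 -> vertices (0, 2), (2, 0); one segment
# of slope -1, so both roots have 5-adic valuation 1.
print(newton_polygon([25, 5, 1], 5))
```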
70

Aspects of Composite Likelihood Inference

Jin, Zi 07 March 2011 (has links)
A composite likelihood consists of a combination of valid likelihood objects; in particular, it is typically of interest to adopt lower-dimensional marginal likelihoods. Composite marginal likelihood appears to be an attractive alternative for modeling complex data, and has received increasing attention for handling high-dimensional data sets when the joint distribution is computationally difficult to evaluate, or intractable due to a complex dependence structure. We present some aspects of methodological development in composite likelihood inference. The resulting estimator enjoys desirable asymptotic properties such as consistency and asymptotic normality. Composite-likelihood-based test statistics and their asymptotic distributions are summarized. Higher-order asymptotic properties of the signed composite likelihood root statistic are explored. Moreover, we compare the accuracy and efficiency of composite likelihood estimation relative to estimation based on the ordinary likelihood. Analytical and simulation results are presented for different models, including multivariate normal distributions, time series models, and correlated binary data.
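As a concrete instance of a lower-dimensional marginal composite likelihood (an illustrative sketch under an assumed equicorrelated normal model, not code from the thesis), a pairwise log-likelihood sums bivariate normal log-densities over all coordinate pairs:

```python
import numpy as np
from itertools import combinations
from scipy.stats import multivariate_normal

def pairwise_composite_loglik(data: np.ndarray, mu: float, sigma2: float, rho: float) -> float:
    """Pairwise (composite marginal) log-likelihood for an equicorrelated
    multivariate normal: sum over all coordinate pairs of the bivariate
    normal log-density.  data has shape (n_obs, dim)."""
    cov2 = sigma2 * np.array([[1.0, rho], [rho, 1.0]])
    mean2 = np.array([mu, mu])
    total = 0.0
    for j, k in combinations(range(data.shape[1]), 2):
        total += multivariate_normal.logpdf(data[:, [j, k]], mean=mean2, cov=cov2).sum()
    return total

# Example with simulated equicorrelated data (dim 5, rho = 0.4):
rng = np.random.default_rng(0)
true_cov = 0.6 * np.eye(5) + 0.4 * np.ones((5, 5))
x = rng.multivariate_normal(np.zeros(5), true_cov, size=200)
print(pairwise_composite_loglik(x, mu=0.0, sigma2=1.0, rho=0.4))
```

Maximising such an objective gives a composite likelihood estimator of the kind whose consistency and asymptotic normality are discussed above; the full joint density is never evaluated.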
