  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

The role of "normative rationality" in the Wason selection task / An investigation from the perspectives of the logicality of the selection process and dual-process theory

陳宏道, Chen, Hung-Dao Unknown Date (has links)
The Wason selection task is a classic task in the psychology of reasoning. Because the rate of correct responses on the original version is remarkably low, many researchers have studied it and a variety of theories have emerged, such as pragmatic reasoning schema theory, social contract theory, matching bias theory, information gain theory, and relevance theory. After some forty years of research, many researchers have come to question whether the Wason selection task, classic as it is, is an appropriate task for studying reasoning processes; some hold that it should be regarded as a decision-making task rather than a reasoning task, and that participants do not exhibit logically sound reasoning on it. The present research re-examines participants' performance on the task from the standpoint of normative rationality. Study 1, using a proposition-interpretation task as an auxiliary measure, found that at least thirty percent of participants made selections on the traditional abstract Wason selection task that corresponded logically to their reading of the rule, a proportion no lower than the "correct" rate on the proposition-interpretation task itself. Since a "correct" response on the traditional task requires interpreting the proposition correctly, avoiding other misreadings of the instructions, and then selecting logically, the low rate of correct responses may simply be the product of the "correct" rates of these component processes, rather than evidence that the selection process itself is illogical. Study 2, adopting the perspective of dual-process theory, designed an evaluation-format Wason selection task favouring System 2 (the analytic system); roughly forty percent of participants in Experiment 1 and fifty-seven percent in Experiment 2 performed as normative rationality predicts. We argue that most earlier research on the Wason selection task focused on System 1 (the associative system) processes and therefore failed to observe participants' high degree of logical performance. The results indicate that normative rationality still plays an important role in the Wason selection task. We close by proposing a new research orientation and discussing the rationality debate.
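For readers unfamiliar with the task, the normative (falsification-based) solution can be sketched mechanically. This is our own illustration, not the thesis's materials; the faces A, K, 4, 7 are the classic abstract version. A card must be turned over exactly when some possible hidden face could falsify the conditional rule.

```python
# Illustrative sketch of the normative solution to the abstract Wason selection
# task. Rule: "if a card shows a vowel on one side, it has an even number on
# the other side". Visible faces: A, K, 4, 7. A card must be turned iff some
# hidden face could falsify the rule.

VOWELS = set("AEIOU")

def is_vowel(face):
    return face in VOWELS

def is_even_number(face):
    return face.isdigit() and int(face) % 2 == 0

def could_falsify(visible, hidden):
    """The rule is falsified only by a vowel paired with an odd number."""
    letter_faces = [f for f in (visible, hidden) if f.isalpha()]
    number_faces = [f for f in (visible, hidden) if f.isdigit()]
    return any(is_vowel(l) for l in letter_faces) and \
           any(not is_even_number(n) for n in number_faces)

def must_turn(visible, possible_hidden):
    return any(could_falsify(visible, h) for h in possible_hidden)

letters = ["A", "K"]   # possible hidden letter faces
numbers = ["4", "7"]   # possible hidden number faces

# Letter cards hide a number; number cards hide a letter.
selection = [v for v in ["A", "K", "4", "7"]
             if must_turn(v, numbers if v.isalpha() else letters)]
print(selection)  # ['A', '7'] — the vowel and the odd number
```

The modal error the task literature discusses is choosing A and 4 (matching), whereas only A and 7 can falsify the rule.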
112

A geometria dedutiva em livros didáticos das escolas públicas do Estado de São Paulo para o 3° e 4° ciclos do Ensino Fundamental

Carlovich, Marisa 08 November 2005 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / ABSTRACT: This study analyses the teaching-learning process of deductive geometry in the textbooks prepared for the 3rd and 4th cycles of Ensino Fundamental (the 1st to 8th grades, students usually from 7 to 15 years old) that have been used in state-run schools in São Paulo State from the 1990s to the present day. According to Chervel (1990), a trend in the approach taken by pedagogical handbooks establishes itself after an important change in the history of education. In the past decade, the significant change in the history of Brazilian mathematics education was the implementation of the Programa Nacional do Livro Didático (PNLD, the National Programme for School Textbooks) in 1995. This led us to define two periods of analysis for our study: the early 1990s and the early 2000s, respectively before and after this implementation. Our research questions concern how, in each period, the textbook collections followed the debates in the didactics of mathematics about the teaching-learning of deductive geometry, and how these appropriations differed between the two periods. Results for the selected collections printed in the 1990s show signs of an approach to deductive geometry in which demonstrations are presented to students followed only by application exercises, typifying a practical teaching of mathematics. Results for the collections printed in the 2000s point to a more optimistic view of the teaching of deductive geometry: to study geometric properties, besides application exercises, students are asked to carry out empirical and deductive validations, which typifies a heuristic approach in the sense of Lakatos (1976).
However, it is fundamental to show students the ways to master deductive reasoning in Geometry, as recommended by the theoretical studies of Mathematical Didactics, only partially attended to in the book sets analysed for the two given periods. / Esta pesquisa tem o objetivo de analisar o ensino-aprendizagem da Geometria dedutiva nos livros didáticos do 3o. e 4o ciclos do Ensino Fundamental mais utilizados nas escolas públicas do Estado de São Paulo, desde a década de 1990 até os dias atuais. Segundo Chervel (1990), uma tendência de abordagem apresentada nos manuais pedagógicos se estabelece após mudanças importantes na história da educação. Nesta última década, a mudança significativa na história da Educação Matemática brasileira foi a implantação do Programa Nacional do Livro Didático (PNLD), em 1995. Assim, definimos para a nossa pesquisa dois períodos de análise: o início dos anos 1990 e o início dos anos 2000, períodos anterior e posterior, respectivamente, a essa implantação. Nossas questões de pesquisa versam sobre como, em cada época, as coleções de livros didáticos acompanharam as discussões da Didática da Matemática no que se refere ao ensino-aprendizagem da Geometria dedutiva e sobre as diferenças dessas apropriações nas duas épocas. Os resultados de nossa pesquisa sobre as coleções analisadas dos anos 1990 fornecem indícios de uma abordagem para a Geometria dedutiva em que as demonstrações são apresentadas aos alunos seguidas de exercícios apenas de aplicação, revelando um ensino prático para a Matemática. Os resultados da análise das coleções dos anos 2000 indicam otimismo em relação ao ensino da Geometria dedutiva. Para estudar as propriedades geométricas, além dos exercícios de aplicação, solicitam-se aos alunos validações empíricas e dedutivas, o que caracteriza um enfoque heurístico, conforme definição de Lakatos (1976). 
Entretanto, cumpre fornecer caminhos para que os alunos se apropriem do raciocínio dedutivo em Geometria, segundo recomendações baseadas em estudos de teóricos da Didática da Matemática, atendidas apenas parcialmente nas coleções analisadas das duas épocas.
113

Preuves par raffinement de programmes avec pointeurs / Proofs by refinement of programs with pointers

Tafat, Asma 06 September 2013 (has links)
Le but de cette thèse est de spécifier et prouver des programmes avec pointeurs, tels que des programmes C, en utilisant des techniques de raffinement. L’approche proposée permet de faire un compromis entre les techniques complexes qui existent dans la littérature et ce qui est utilisable dans l’industrie, en conciliant légèreté des annotations et restrictions sur les alias. Nous définissons, dans un premier temps, un langage d’étude, qui s’inspire du langage C, et dans lequel le seul type de données mutable possible est le type des structures, auquel on accède uniquement à travers des pointeurs. Afin de structurer nos programmes, nous munissons notre langage d’une notion de module et des concepts issus de la théorie du raffinement tels que les variables abstraites que nous formalisons par des champs modèle, et les invariants de collage. Ceci nous permet d’écrire des programmes structurés en composants. L’introduction des invariants de données dans notre langage soulève des problématiques liées au partage de pointeurs. En effet, en cas d’alias, on risque de ne plus pouvoir garantir la validité de l’invariant de données d’une structure. Nous interdisons, alors l’aliasing (le partage de référence) dans notre langage. Pour contrôler les accès à la mémoire, nous définissons un système de type, basé sur la notion de régions. Cette contribution s’inspire de la théorie du raffinement et a pour but, de rendre les programmes les plus modulaires possible et leurs preuves les plus automatiques possible. Nous définissons, sur ce langage, un mécanisme de génération d’obligations de preuve en proposant un calcul de plus faible précondition incorporant du raffinement. Nous prouvons ensuite, la correction de ce mécanisme de génération d’obligations de preuve par une méthode originale, fondée sur la notion de sémantique bloquante, qui s’apparente à une preuve de type soundness et qui consiste donc, à prouver la préservation puis le progrès de ce calcul. 
Nous étendons, dans un deuxième temps, notre langage en levant partiellement la restriction liée au partage de références. Nous permettons, notamment, le partage de références lorsqu’aucun invariant de données n’est associé au type structure référencé. De plus, nous introduisons le type des tableaux, ainsi que les variables globales et l’affectation qui ne font pas partie du langage noyau. Pour chacune des extensions citées ci-dessus, nous étendons la définition et la preuve de correction du calcul de plus faible précondition en conséquence. Nous proposons enfin, une implantation de cette approche sous forme d’un greffon de Frama-C (http://frama-c.com/). Nous expérimentons notre implantation sur des exemples de modules implantant des structures de données complexes, en particulier des défis issus du challenge VACID0 (http://vacid.codeplex.com/), à savoir les tableaux creux (Sparse Array) et les tas binaires. / The purpose of this thesis is to specify and prove programs with pointers, such as C programs, using refinement techniques. The proposed approach strikes a compromise between the complex techniques found in the literature and what is usable in industry, reconciling lightweight annotations with restrictions on aliasing. We first define a study language, inspired by C, in which the only mutable data type is that of structures, accessed solely through pointers. To structure our programs, we equip the language with a notion of module and with concepts from refinement theory, such as abstract variables, which we formalize as model fields, and gluing invariants. This lets us write programs structured into components. Introducing data invariants into our language raises issues related to pointer sharing: in the presence of aliases, we may no longer be able to guarantee that a structure's data invariant holds. We therefore forbid aliasing (reference sharing) in our language.
To control memory accesses, we define a type system based on the notion of regions. This contribution draws on refinement theory and aims to make programs as modular as possible and their proofs as automatic as possible. On this language, we define a mechanism for generating proof obligations by proposing a weakest-precondition calculus incorporating refinement. We then prove the correctness of this proof-obligation generation mechanism by an original method based on the notion of blocking semantics, which is akin to a type-soundness proof and thus consists in proving preservation and progress for the calculus. Secondly, we extend our language by partially lifting the restriction on reference sharing: in particular, sharing is allowed when no data invariant is attached to the referenced structure type. In addition, we introduce arrays, global variables, and assignment, which are not part of the core language. For each of these extensions, we extend the definition and the correctness proof of the weakest-precondition calculus accordingly. Finally, we propose an implementation of this approach as a Frama-C plugin (http://frama-c.com/). We experiment with our implementation on modules implementing complex data structures, in particular challenges from the VACID-0 benchmark (http://vacid.codeplex.com/), namely sparse arrays and binary heaps.
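The weakest-precondition calculus at the heart of such a mechanism can be illustrated with a toy semantic sketch. This is our own illustration, far simpler than the thesis's syntactic calculus: there is no refinement and no region typing, and predicates are executable Python functions over a state dictionary rather than logical formulas.

```python
# Toy weakest-precondition computation for a tiny deterministic language,
# using the classical rules: wp(skip, Q) = Q; wp(x := e, Q) = Q[e/x];
# wp(c1; c2, Q) = wp(c1, wp(c2, Q)); wp(if b then c1 else c2, Q) splits on b.

def wp(cmd, post):
    """Return the weakest precondition of `cmd` w.r.t. postcondition `post`."""
    kind = cmd[0]
    if kind == "skip":
        return post
    if kind == "assign":                       # ("assign", var, expr_fn)
        _, var, expr = cmd
        return lambda s: post({**s, var: expr(s)})
    if kind == "seq":                          # ("seq", c1, c2)
        _, c1, c2 = cmd
        return wp(c1, wp(c2, post))
    if kind == "if":                           # ("if", cond_fn, c_then, c_else)
        _, cond, ct, ce = cmd
        return lambda s: wp(ct, post)(s) if cond(s) else wp(ce, post)(s)
    raise ValueError(f"unknown command: {kind}")

# Example: { ? }  x := x + 1; if x > 0 then y := x else y := -x  { y > 0 }
prog = ("seq",
        ("assign", "x", lambda s: s["x"] + 1),
        ("if", lambda s: s["x"] > 0,
               ("assign", "y", lambda s: s["x"]),
               ("assign", "y", lambda s: -s["x"])))
pre = wp(prog, lambda s: s["y"] > 0)
print(pre({"x": 5}), pre({"x": -1}))  # True False (x = -1 gives x+1 = 0, y = 0)
```

A real generator emits the precondition as a formula to be discharged by a prover; representing predicates semantically keeps the sketch short while preserving the structure of the rules.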
114

Explanation and deduction : a defence of deductive chauvinism

Hållsten, Henrik January 2001 (has links)
In this essay I defend the notion of deductive explanation mainly against two types of putative counterexamples: those found in genuinely indeterministic systems and those found in complex dynamic systems. Using Railton's notions of explanatory information and ideal explanatory text, deductivism is defended in an indeterministic setting. Furthermore, an argument against non-deductivism that hinges on peculiarities of probabilistic causality is presented. The use of the notion of an ideal explanatory text gives rise to problems in accounting for explanations in complex dynamic systems, regardless of whether they are deterministic or not. These problems are considered in the essay and a solution is suggested. This solution forces the deductivist to abandon the requirement that an explanation consists of a deductive argument, but it is argued that the core of deductivism is saved in so far as we, for full explanations, can still adhere to the fundamental requirement: If A explains B, then A is inconsistent with anything inconsistent with B.
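The closing requirement lends itself to a small mechanical check. This is our own illustration, not from the essay: reading "A explains B" deductively as "A entails B", a brute-force truth-table search confirms that anything inconsistent with B is then inconsistent with A. The particular formulas and atoms below are arbitrary choices for the demonstration.

```python
# Propositional check of: if A entails B and C is inconsistent with B,
# then A is inconsistent with C. Formulas are predicates over assignments.
from itertools import product

ATOMS = ("p", "q", "r")

def models(formula):
    """All truth assignments (as dicts) satisfying `formula`."""
    return [dict(zip(ATOMS, vals))
            for vals in product([False, True], repeat=len(ATOMS))
            if formula(dict(zip(ATOMS, vals)))]

def entails(a, b):
    return all(b(s) for s in models(a))

def inconsistent(a, b):
    return not any(b(s) for s in models(a))

# Example instance: A = p ∧ q, B = q, C = ¬q.
A = lambda s: s["p"] and s["q"]
B = lambda s: s["q"]
C = lambda s: not s["q"]

print(entails(A, B), inconsistent(B, C), inconsistent(A, C))  # True True True
```

The general fact is immediate: every model of A is a model of B, and C has no model in common with B, hence none in common with A. The essay's point is that this requirement can survive even when the explanation itself is no longer packaged as a deductive argument.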
115

Em direção a uma representação para equações algébricas: uma lógica equacional local / Towards a representation for algebraic equations: a local equational logic

Santos, José Medeiros dos 17 July 2001 (has links)
The interval arithmetic known as Moore arithmetic does not enjoy the same properties as the real numbers, and for this reason it faces an operational problem when one wants to solve interval equations as extensions of real equations via the usual equality and interval arithmetic: interval addition has no additive inverse, and distributivity of multiplication over addition does not hold for arbitrary triples of intervals. The lack of these properties prevents the use of equational logic, whether for solving an interval equation, for representing a real equation, or for algebraically verifying properties of a computational system whose data are real numbers represented by intervals. However, with the notions of information order and approximation on intervals introduced by Acióly [6] in 1991, the idea arises that an interval equation can satisfactorily represent a real equation, since the terms of the interval equation carry information about the solution of the real equation. In 1999, Santiago proposed the notion of simple equality and, later, of local equality for intervals [8], [33]. Building on that idea, this dissertation extends Santiago's local sets to local algebras, following the treatment of Σ-algebras in (Hennessy [31], 1988) and (Santiago [7], 1995). One contribution of this dissertation is Theorem 5.1.3.2, which guarantees that, when a local Σ-equation E ⊢ t ≈ t′ is deduced in the proposed system SDedLoc(E), the interpretations of t and t′ are locally equal in any local Σ-algebra A that satisfies the fixed set of local equations E, whenever t and t′ have meaning in A.
This ensures a kind of soundness between the local equational logic and the local algebras. / A aritmética intervalar conhecida como aritmética de Moore não possui as mesmas propriedades dos números reais e, por este motivo, defronta-se com um problema de natureza operatória quando se deseja resolver equações intervalares como extensão de equações reais através da igualdade usual e da aritmética intervalar, por esta não possuir o inverso aditivo, como também a propriedade da distributividade da multiplicação pela soma não ser válida para qualquer terno de intervalos. A falta dessas propriedades impossibilita a utilização da lógica equacional, tanto para a resolução de uma equação intervalar usando a mesma, como para uma representação de uma equação real, e ainda, para a verificação algébrica de propriedades de um sistema computacional, cujos dados sejam números reais representados através de intervalos. Entretanto, com a noção de ordem de informação e de aproximação sobre intervalos, introduzida por Acióly [6] em 1991, surge a ideia de uma equação intervalar representar satisfatoriamente uma equação real, já que os termos da equação intervalar carregam a informação sobre a solução da equação real. Em 1999, Santiago propôs a noção de igualdade simples e, posteriormente, igualdade local para intervalos [8] e [33]. Baseada nessa ideia, esta dissertação estende os conjuntos locais de Santiago para álgebras locais, seguindo a ideia de Σ-álgebras contidas em (Hennessy [31], 1988) e (Santiago [7], 1995). Uma das contribuições desta dissertação é o Teorema 5.1.3.2, que garante que, ao se deduzir uma Σ-equação local E ⊢ t ≈ t′ no sistema SDedLoc(E) proposto, as interpretações de t e t′ serão localmente iguais em qualquer Σ-álgebra local que satisfaça o conjunto de equações locais E fixado, sempre que t e t′ tiverem significado em A. Isto garante um tipo de segurança entre a lógica equacional local e as álgebras locais.
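The two missing properties, the absence of additive inverses and the failure of distributivity, are easy to observe concretely. The following sketch is ours, not from the dissertation: it implements Moore's addition and multiplication on closed intervals represented as pairs and exhibits both failures.

```python
# Moore interval arithmetic on closed intervals (lo, hi): addition has no
# additive inverse, and multiplication only sub-distributes over addition.

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def neg(x):
    return (-x[1], -x[0])

def mul(x, y):
    prods = [x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1]]
    return (min(prods), max(prods))

a = (1, 2)
# No additive inverse: a + (-a) is not the degenerate zero interval.
print(add(a, neg(a)))                 # (-1, 1), not (0, 0)

# Distributivity fails: x*(y+z) may be strictly contained in x*y + x*z.
x, y, z = (1, 2), (1, 1), (-1, -1)
lhs = mul(x, add(y, z))               # x * (y + z) = (1,2) * (0,0) = (0, 0)
rhs = add(mul(x, y), mul(x, z))       # (1, 2) + (-2, -1) = (-1, 1)
print(lhs, rhs)                       # (0, 0) (-1, 1)
```

Both defects are exactly what blocks the naive transfer of equational reasoning from the reals to intervals, motivating the local-equality approach.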
116

Static analysis of functional programs with an application to the frame problem in deductive verification / Analyse statique de programmes fonctionnels avec une application au problème du frame dans le domaine de la vérification déductive

Andreescu, Oana Fabiana 29 May 2017 (has links)
Dans le domaine de la vérification formelle de logiciels, il est impératif d'identifier les limites au sein desquelles les éléments ou fonctions opèrent. Ces limites constituent les propriétés de frame (frame properties en anglais). Elles sont habituellement spécifiées manuellement par le programmeur et leur validité doit être vérifiée: il est nécessaire de prouver que les opérations du programme n'outrepassent pas les limites ainsi déclarées. Dans le contexte de la vérification formelle interactive de systèmes complexes, comme les systèmes d'exploitation, un effort considérable est investi dans la spécification et la preuve des propriétés de frame. Cependant, la plupart des opérations ont un effet très localisé et ne menacent donc qu'un nombre limité d'invariants. Étant donné que la spécification et la preuve de propriétés de frame est une tache fastidieuse, il est judicieux d'automatiser l'identification des invariants qui ne sont pas affectés par une opération donnée. Nous présentons dans cette thèse une solution inférant automatiquement leur préservation. Notre solution a pour but de réduire le nombre de preuves à la charge du programmeur. Elle est basée sur l'analyse statique, et ne nécessite aucune annotation de frame. Notre stratégie consiste à combiner une analyse de dépendances avec une analyse de corrélations. Nous avons conçu et implémenté ces deux analyses statiques pour un langage fonctionnel fortement typé qui manipule structures, variants et tableaux. Typiquement, une propriété fonctionnelle ne dépend que de quelques fragments de l'état du programme. L'analyse de dépendances détermine quelles parties de cet état influent sur le résultat de la propriété fonctionnelle. De même, une fonction ne modifiera que certaines parties de ses arguments, copiant le reste à l'identique. L'analyse de corrélations détecte quelles parties de l'entrée d'une fonction se retrouvent copiées directement (i.e. non modifiés) dans son résultat. 
Ces deux analyses calculent une approximation conservatrice. Grâce aux résultats de ces deux analyses statiques, un prouveur de théorèmes interactif peut inférer automatiquement la préservation des invariants qui portent sur la partie non affectée par l’opération concernée. Nous avons appliqué ces deux analyses statiques à la spécification fonctionnelle d'un micro-noyau, et obtenu des résultats non seulement d'une précision adéquate, mais qui montrent par ailleurs que notre approche peut passer à l'échelle. / In the field of software verification, the frame problem refers to establishing the boundaries within which program elements operate. It has notoriously tedious consequences on the specification of frame properties, which indicate the parts of the program state that an operation is allowed to modify, as well as on their verification, i.e. proving that operations modify only what is specified by their frame properties. In the context of interactive formal verification of complex systems, such as operating systems, much effort is spent addressing these consequences and proving the preservation of the systems' invariants. However, most operations have a localized effect on the system and impact only a limited number of invariants at the same time. In this thesis we address the issue of identifying those invariants that are unaffected by an operation and we present a solution for automatically inferring their preservation. Our solution is meant to ease the proof burden for the programmer. It is based on static analysis and does not require any additional frame annotations. Our strategy consists in combining a dependency analysis and a correlation analysis. We have designed and implemented both static analyses for a strongly-typed, functional language that handles structures, variants and arrays. The dependency analysis computes a conservative approximation of the input fragments on which functional properties and operations depend. 
The correlation analysis computes a safe approximation of the parts of a function's input state that are copied to the output state. It summarizes not only what is modified but also how it is modified and to what extent. By employing these two static analyses and reasoning on their combined results, an interactive theorem prover can automate the discharging of proof obligations for unmodified parts of the state. We have applied both of our static analyses to a functional specification of a micro-kernel, and the obtained results demonstrate both their precision and their scalability.
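The way the two summaries combine can be sketched schematically. This is our own toy illustration; the field names and the hand-written one-line summaries below are invented stand-ins for what the dependency and correlation analyses would infer over the real specification language.

```python
# Schematic combination of a dependency summary (which fields an invariant
# reads) with a correlation summary (which fields an operation copies
# unchanged): if every field the invariant reads is passed through unchanged,
# the invariant is preserved without a dedicated proof.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Account:
    owner: str
    balance: int

def rename(acc: Account, new_owner: str) -> Account:
    # Modifies only `owner`; `balance` is copied to the output unchanged.
    return replace(acc, owner=new_owner)

# Hand-written stand-ins for the inferred summaries:
DEPENDS_ON = {"non_negative_balance": {"balance"}}   # fields the invariant reads
COPIED_UNCHANGED = {"rename": {"balance"}}           # fields the op passes through

def preserved_without_proof(invariant: str, operation: str) -> bool:
    """Preservation is immediate if all fields read are copied unchanged."""
    return DEPENDS_ON[invariant] <= COPIED_UNCHANGED[operation]

print(preserved_without_proof("non_negative_balance", "rename"))  # True
```

The real analyses are of course richer (they track fragments of structures, variants, and arrays), but the subset check captures the inference step that removes the proof obligation from the programmer's workload.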
117

El uso de la música como estrategia didáctica en la enseñanza del pretérito perfecto compuesto en la sala escolar de ELE en Suecia / The use of music in the teaching of the past perfect tense in the classroom of Spanish as a foreign language in Sweden

Lepp, Susanne January 2013 (has links)
Este estudio investiga la enseñanza del pretérito perfecto compuesto en el aula de español como lengua extranjera a nivel escolar en Suecia. El objetivo de esta monografía es ver si puede ser beneficioso enseñar el pretérito perfecto compuesto mediante canciones en español, que incluyen el pretérito perfecto compuesto en sus versos, como recurso didáctico. Los participantes son treinta y cuatro alumnos de ELE en el octavo curso, divididos en dos grupos, y ocho profesores de español. Este estudio se realiza mediante observaciones en clase, dos pruebas para los alumnos y una encuesta entre los profesores. Los resultados del estudio muestran que los profesores que participan en el estudio consideran que la mayoría de sus alumnos tienen problemas para aprender el pretérito perfecto compuesto. Los resultados de las pruebas indican que los alumnos que hacen parte del primer grupo de estudio, que tuvieron una enseñanza inductiva con canciones en español, aumentaron su destreza para utilizar el pretérito perfecto compuesto tanto con los verbos regulares como con los irregulares. Los alumnos que hicieron parte del segundo grupo, quienes recibieron una enseñanza deductiva, aumentaron su destreza en usar el pretérito perfecto compuesto menos que el primer grupo cuando se trata de los verbos irregulares. Sin embargo, estos alumnos aprendieron a utilizar el pretérito perfecto compuesto con los verbos regulares mejor que los del primer grupo. Además, indagamos si se presentaba una posible diferencia en relación con el interés y la motivación para aprender entre chicos y chicas, dado el tema romántico de las canciones. Los resultados muestran una ligera mayoría en los resultados de las alumnas. / This study investigates the teaching of the past perfect tense in the Swedish classroom of Spanish as a foreign language.
The purpose of this essay is to see if it is profitable to teach the past perfect tense with songs in Spanish, which include the past perfect tense in the lyrics, as an educational resource. The participants are thirty-four eighth-grade students of Spanish as a foreign language, divided into two groups, and eight Spanish teachers. The study was made through observations in class, two tests taken by the students and a survey among the teachers. The results of the study show that the teachers who participated consider that the majority of their students have problems learning the past perfect tense. Furthermore, the results show that the students in the first group, who received inductive teaching including Spanish music, increased their ability to use the past perfect tense with both regular and irregular verbs. The students in the second group, who were taught by means of the deductive approach, increased their ability to use the past perfect tense less than the first group when it came to the irregular verbs. However, they learnt to use the past perfect tense with regular verbs better than the first group. Furthermore, we also investigated whether there were any differences in interest and motivation to learn between the boys and the girls, given the romantic theme of the songs. The results were slightly higher among the girls.
118

Utilisation de la géométrie dynamique avec de futurs enseignants de mathématiques au secondaire pour repenser le développement du raisonnement

Damboise, Caroline 10 1900 (has links)
Les outils technologiques sont omniprésents dans la société et leur utilisation est de plus en plus grande dans les salles de classe. Du côté de l'enseignement et de l'apprentissage des mathématiques, ces outils se sont vu attribuer des rôles qui ont évolué avec les années. Les rôles de soutien, de visualisation et d'enrichissement des contenus en sont des exemples. Une utilisation des outils technologiques dans l'enseignement s'accompagne d'apports pragmatiques et épistémiques potentiels, mais comporte également des limites et des risques. Il s’avère important d’examiner le rôle accordé à l’outil technologique dans les activités qui le mobilisent. Puisque le raisonnement mathématique fait partie d'une des compétences visées à l’école (MELS, 2006) et que les futurs enseignants semblent accorder moins d'importance à la validation et la preuve comme composantes de ce raisonnement (Mary, 1999), nous émettons l'hypothèse qu'une séquence d'activités montrant la complémentarité de la preuve et des explorations tirant parti de la technologie pourrait aider les futurs enseignants à mieux saisir ces enjeux. La présente recherche s’appuie sur l'ingénierie didactique pour développer et valider une séquence d'activités intégrant le logiciel GeoGebra. Cette séquence d'activités a été conçue dans les buts suivants : initier les futurs enseignants en mathématiques au secondaire à un logiciel de géométrie dynamique et leur donner l'occasion de voir des activités mathématiques utilisant la technologie et visant le développement du raisonnement, par l’articulation de l’exploration et de la preuve. Le cadre théorique sur lequel repose cette recherche intègre des éléments de l'approche anthropologique (Chevallard, 1992, 1998, 2003) et de l'approche instrumentale (Vérillon et Rabardel, 1995; Trouche, 2000, 2003, 2007; Guin et Trouche, 2002). 
Certaines idées sur les constructions robustes et molles (Soury-Lavergne, 2011), la distinction figure/dessin (Laborde et Capponi, 1994) et le réseau déductif (Tanguay, 2006) ont servi de repères dans la construction de la séquence d'activités. Cette recherche s'est déroulée au cours de l'hiver 2016 dans une université québécoise, dans le cadre d’un cours de didactique de la géométrie auprès de futurs enseignants au secondaire en mathématiques. Un questionnaire pré-expérimentation a été rempli par les participants afin de voir leurs connaissances préalables sur les programmes, les outils technologiques ainsi que leurs conceptions au sujet de l'enseignement et de l'apprentissage des mathématiques. Par la suite, les étudiants ont expérimenté la séquence d'activités et ont eu à se prononcer sur les connaissances mises en jeu dans chacune des activités, l’opportunité de son utilisation avec des élèves du secondaire, et les adaptations perçues nécessaires pour sa réalisation (s'il y a lieu). Des traces écrites de leur travail ont été conservées ainsi qu'un journal de bord au fur et à mesure du déroulement de la séquence. En triangulant les diverses données recueillies, il a été constaté que la séquence, tout en contribuant à l’instrumentation des participants au regard du logiciel utilisé, a eu chez certains d’entre eux un impact sur leur vision du développement du raisonnement mathématique dans l’enseignement des mathématiques au secondaire. L’analyse des données a mis en lumière la place accordée au raisonnement par les futurs enseignants, les raisonnements mobilisés par les étudiants dans les diverses activités ainsi que des indices sur les genèses instrumentales accompagnant ces raisonnements ou les induisant. Suite à l’analyse de ces données et aux constats qui en découlent, des modifications sont proposées pour améliorer la séquence d’activités. / Technological tools are ubiquitous in society and their use is growing in the classroom. 
In mathematics education, these tools have been assigned roles that have evolved over the years: support, visualization, content enrichment. The use of technological tools in education brings potential pragmatic and epistemic contributions, but also has limitations and risks. We must therefore examine, at the level of each activity, the role technology is given. Mathematical reasoning is one of the competencies targeted by the school curriculum (MELS, 2006), and future teachers seem to place less emphasis on validation and proving processes as components of this reasoning (Mary, 1999). We hypothesize that a sequence of activities showing the complementarity of proving processes with explorations leveraging technology could help future teachers better understand these issues. This research draws on didactical engineering to develop and validate a sequence of activities with the GeoGebra software. The sequence of activities was designed to introduce pre-service secondary mathematics teachers to dynamic geometry software and to give them the opportunity to see mathematical activities that use technology and aim at developing mathematical reasoning and proof. The theoretical framework on which this research is based integrates elements of the anthropological theory of the didactic (Chevallard, 1992, 1998, 2003) and of the instrumental approach (Vérillon and Rabardel, 1995; Trouche, 2000, 2003, 2007; Guin and Trouche, 2002). Some ideas on robust and soft constructions (Soury-Lavergne, 2011), the distinction between figure and drawing (Laborde and Capponi, 1994) and the deductive network (Tanguay, 2006) served as benchmarks in the construction of the sequence of activities. This research took place at a Quebec university during the winter of 2016, in a geometry didactics course for pre-service secondary mathematics teachers.
A preliminary questionnaire was given to the participants to capture their prior knowledge of the programs and technological tools, as well as their conceptions about mathematics teaching and learning. Subsequently, the students worked through the sequence of activities and were asked to comment on the knowledge involved in each activity, the relevance of its use with high school students, and any adaptations they deemed necessary for its realization. Written traces of their work were kept, along with a research diary, as the sequence unfolded. By triangulating the various data collected, it was found that the sequence, while contributing to the participants' instrumentation with regard to the software used, had for some of them an impact on their vision of the development of mathematical reasoning in secondary mathematics education. The analysis of the data highlighted the place the future teachers give to reasoning, the kinds of reasoning the students mobilized in the various activities, and signs of the instrumental geneses accompanying or inducing this reasoning. Based on this analysis and its findings, modifications are proposed to improve the sequence of activities.
119

An Analysis of the way Grammar is Presented in two Coursebooks for English as a Second Language : A Qualitative Conceptual Analysis of Grammar in Swedish Coursebooks for Teaching English

From, Malcolm January 2021 (has links)
This essay investigates theoretically how two coursebooks currently used in a local area of southern Sweden, What's Up 9 and Solid Gold 1, present (introduce and cover) grammar. The overall aim is to investigate how grammar is presented, using the present simple and the present continuous as examples. The findings are also mapped to teaching approaches, as well as to SLA (Second Language Acquisition) research, to see which approaches are favoured for teaching grammar in the first decades of the 21st century. To investigate the coursebooks, a qualitative content analysis and conceptual analysis was chosen, with the presentation of grammar mapped into different categories using concepts for teaching and approaches used in SLA. The results show that the two coursebooks favoured a FoFs (Focus on Forms) approach for presenting grammar. Furthermore, the results show that grammar is presented explicitly and that, if teachers follow the structures proposed in the coursebook rigidly, they automatically follow a deductive teaching procedure. Given the FoFs approach, the explicit instructions and the deductive teaching procedure, the coursebooks may be regarded as suggesting a grammar-translation approach as well. However, when observing other exercises connected to the reading texts, it was detected that both coursebooks favoured a text-based approach to teaching, where learners are supposed to learn the structure of different texts. In doing so, the grammatical structures are learned subconsciously and implicitly, which indicates that grammar is, in general, taught implicitly in the coursebooks but presented (introduced and covered) explicitly.
120

Exploiting Graphic Card Processor Technology to Accelerate Data Mining Queries in SAP NetWeaver BIA

Lehner, Wolfgang, Weyerhaeuser, Christoph, Mindnich, Tobias, Faerber, Franz 15 June 2022 (has links)
Within business intelligence contexts, the importance of data mining algorithms is continuously increasing, particularly from the perspective of applications and users that demand novel algorithms on the one hand and efficient implementations exploiting novel system architectures on the other. Within this paper, we focus on the latter issue and report our experience with the exploitation of graphics card processor technology within the SAP NetWeaver Business Intelligence Accelerator (BIA). The BIA is a highly distributed analytical engine that supports OLAP and data mining processing primitives. The system organizes data entities in column-wise fashion, and its operation is completely main-memory-based. Since case studies have shown that classic data mining queries spend a large portion of their runtime scanning and filtering the data as a necessary prerequisite to the actual mining step, our main goal was to speed up this expensive scanning and filtering process. In a first step, the paper outlines the basic data mining processing techniques within SAP NetWeaver BIA and illustrates the implementation of scans and filters. In a second step, we give insight into the main features of a hybrid system architecture design exploiting graphics card processor technology. Finally, we sketch the implementation and give details of our extensive evaluations.
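The scan-and-filter primitive being accelerated can be sketched on the CPU. This is our own minimal illustration, not SAP code, with invented sample data: columns are stored as separate arrays, and a filter is a scan producing the qualifying row ids, a data-parallel pattern that maps naturally onto graphics processors.

```python
# Minimal column-scan-and-filter sketch over column-wise (one array per
# attribute) in-memory storage. Each scan is independent per row, which is
# what makes the kernel a good fit for GPU offloading.

columns = {
    "region":  ["EU", "US", "EU", "APJ", "EU"],
    "revenue": [120,  340,  90,   210,   400],
}

def scan_filter(columns, column, predicate):
    """Return the row ids whose value in `column` satisfies `predicate`."""
    return [i for i, v in enumerate(columns[column]) if predicate(v)]

# A conjunctive filter, as in a typical pre-mining selection step, is the
# intersection of per-column scans.
eu_rows  = scan_filter(columns, "region",  lambda v: v == "EU")
big_rows = scan_filter(columns, "revenue", lambda v: v >= 100)
selected = sorted(set(eu_rows) & set(big_rows))
print(selected)  # [0, 4]
```

On a GPU the per-row predicate evaluations run in parallel and the row-id list is produced with a parallel compaction, but the logical operation is the same.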
