121

Grammar in the English Language Classroom : Teachers’ perspectives on grammar knowledge and instruction / Grammatik i det engelskspråkiga klassrummet : Lärares perspektiv om grammatikkunskap och undervisning

Papalexi, Stavroula January 2023 (has links)
The purpose of this work is to examine the teaching methods secondary and upper secondary teachers apply to teach grammar in the EFL classroom, as well as their perceptions of the benefits of grammar knowledge and of its value for students' writing. Previous research reveals that grammar teaching is an integral part of language teaching; explicit and implicit methods, along with deductive and inductive instruction, are the main practices teachers use to transfer knowledge to their students. Apart from that, it is beneficial when teachers have good grammar knowledge, since this helps students develop good metalinguistic awareness and writing ability in the target language. The theoretical framework is based on teacher cognition theory, as teachers' personal experiences and knowledge affect their decisions about grammar practices in the classroom. A qualitative analysis of semi-structured interviews with six secondary and upper secondary teachers who teach English as a foreign language in different schools and municipalities across Sweden is used to conduct the study. The results demonstrate that grammar teaching is indeed needed when teaching a new language; however, factors such as age and type of group influence teachers' decisions. Above all else, teachers' highest goal is to help students become good language users.
122

[en] A LABELLED NATURAL DEDUCTION LOGICAL FRAMEWORK / [pt] UM FRAMEWORK LÓGICO PARA DEDUÇÃO NATURAL ROTULADA

BRUNO CUCONATO CLARO 27 November 2023 (has links)
[en] We propose a Logical Framework for labelled Natural Deduction systems. Its meta-language is based on a generalization of the rule schemas proposed by Prawitz, and the use of labels allows the uniform definition of intensional logics, such as Modal Logic and Description Logic, as well as of quantifiers such as Keisler's "for non-denumerably many individuals, property P holds", the ultrafilter quantifier "for almost all individuals, P holds", or "generically, P holds", not to mention the standard first-order quantifiers. We also show an implementation of this framework as a freely available web-based proof assistant, and we compare the definition of logical systems in our implementation with the same task in other proof assistants — Agda, Isabelle, Lean, Metamath. As a by-product of this comparison experiment, we contribute a formal proof (in Lean) of De Zolt's postulate for three dimensions, using the Zp system proposed by Giovaninni et al.
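As background for readers unfamiliar with labelled deduction, here is a minimal sketch of labelled natural deduction rules for a modal operator in the common Simpson-style presentation, where "x : A" reads "A holds at world x" and "x R y" is an accessibility assumption. This is a generic illustration only, not the meta-language defined in the thesis.

```latex
% Labelled natural deduction rules for \Box (generic Simpson-style presentation).
\[
\frac{\begin{array}{c}[x\,R\,y]\\ \vdots\\ y : A\end{array}}
     {x : \Box A}\;\Box\text{-I}\ (y\ \text{fresh})
\qquad\qquad
\frac{x : \Box A \qquad x\,R\,y}
     {y : A}\;\Box\text{-E}
\]
```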
123

DJ: Bridging Java and Deductive Databases

Hall, Andrew Brian 07 July 2008 (has links)
Modern society is intrinsically dependent on the ability to manage data effectively. While relational databases have been the industry standard for the past quarter century, recent growth in data volumes and complexity requires novel data management solutions. These trends revitalized the interest in deductive databases and highlighted the need for column-oriented data storage. However, programming technologies for enterprise computing were designed for the relational data management model (i.e., row-oriented data storage). Therefore, developers cannot easily incorporate emerging data management solutions into enterprise systems. To address the problem above, this thesis presents Deductive Java (DJ), a system that enables enterprise programmers to use a column-oriented deductive database in their Java applications. DJ does so without requiring that the programmer become proficient in deductive databases and their non-standardized, vendor-specific APIs. The design of DJ incorporates three novel features: (1) tailoring orthogonal persistence technology to the needs of a deductive database with column-oriented storage; (2) using Java interfaces as a primary mapping construct, thereby simplifying method call interception; (3) providing facilities to deploy lightweight business rules. DJ was developed in partnership with LogicBlox Inc., an Atlanta-based technology startup. / Master of Science
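The second design feature above, using Java interfaces as the mapping construct so that method calls can be intercepted, can be illustrated with a plain java.lang.reflect.Proxy. The interface, predicate name and query text below are hypothetical and do not reproduce DJ's actual API; this is only a sketch of the interception idea.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Hypothetical domain interface: in an interface-based mapping, calls against it
// are intercepted and translated into queries on the deductive database.
interface EmployeeFacts {
    int salaryOf(String name);
}

public class InterfaceMappingSketch {
    public static void main(String[] args) {
        InvocationHandler handler = (proxy, method, methodArgs) -> {
            // A real mapping layer would translate the intercepted call into a Datalog
            // query (e.g. "salary(Name, S)?") against the column-oriented store; here we
            // only demonstrate the interception point that Java interfaces make possible.
            System.out.println("intercepted " + method.getName() + "(" + methodArgs[0] + ")");
            return 42; // placeholder standing in for the query answer
        };
        EmployeeFacts facts = (EmployeeFacts) Proxy.newProxyInstance(
                EmployeeFacts.class.getClassLoader(),
                new Class<?>[] { EmployeeFacts.class },
                handler);
        System.out.println("salary = " + facts.salaryOf("alice"));
    }
}
```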
124

An analysis of teacher competencies in a problem-centred approach to dynamic Geometry teaching

Ndlovu, Mdutshekelwa 11 1900 (has links)
The subject of teacher competencies or knowledge has been a key issue in mathematics education reform. This study attempts to identify and analyze teacher competencies necessary in the orchestration of a problem-centred approach to dynamic geometry teaching and learning. The advent of dynamic geometry environments into classrooms has placed new demands and expectations on mathematics teachers. In this study the Teacher Development Experiment was used as the main method of investigation. Twenty third-year mathematics major teachers participated in workshop and microteaching sessions involving the use of the Geometer's Sketchpad dynamic geometry software in the teaching and learning of the geometry of triangles and quadrilaterals. Five intersecting categories of teacher competencies were identified: mathematical/geometrical competencies, pedagogical competencies, computer and software competencies, language competencies, and assessment competencies. / Mathematical Sciences / M. Ed. (Mathematical Education)
125

The effectiveness of dynamic assessment as an alternative aptitude testing strategy

Zolezzi, Stefano Alberto 06 1900 (has links)
The present study sets out to evaluate the effectiveness of a dynamic approach to aptitude testing. It was proposed that it is not always appropriate to use conventional aptitude tests to predict future academic success in the South African context. The study posited that an alternative testing format could be facilitated by using a test-train-test procedure within a learning potential paradigm. The learning potential paradigm as formulated through Vygotskian and Feuersteinian theory is operationalised in the form of a Newtest Battery. The Newtest procedure is in direct contrast to traditional approaches to aptitude testing. The latter approaches both implicitly and explicitly adopt a static view of ability, whereas the Newtest approach focuses on the learning potential of the testee, as well as consequent performance. However, the assessment of learning potential poses problems of its own. Modifications were introduced to ensure that the Newtest format is both appropriate and psychometrically defensible. The construction and evaluation of the Newtest Battery is described. A sample of both advantaged and disadvantaged students was tested on a battery of traditional aptitude tests. This group of students was contrasted with another sample of both advantaged and disadvantaged students who undertook the Newtest Battery in the modified dynamic testing format. The traditional measures of aptitude were found to be invalid predictors of university success. Matric results showed a relationship with academic success for both groups. The Newtest measures enhanced the prediction of academic success for both advantaged and disadvantaged students. The Deductive Reasoning dynamic measure was found to be a valid predictor of university success for the disadvantaged students. The results thus successfully extend the learning potential paradigm into the realm of group aptitude testing. The validity of traditional aptitude test measures has been brought into question by the findings of the study. The study points the way forward to a more equitable and relevant aptitude testing procedure. Finally, it was shown that the testing environment forms part of the socio-educational context. Personnel involved in the administration of aptitude tests are given guidelines with the aim of equalising the test process. / Psychology of Education / D. Ed. (Psychology of Education)
126

Analyse de la structure logique des inférences légales et modélisation du discours juridique

Peterson, Clayton 05 1900 (has links)
Thesis by articles. / The present thesis reports on advances in deontic logic and develops formal tools relevant to the analysis of the validity of legal inferences. When applied to legal reasoning, logic can be used to model the structure of legal inferences and, as such, provides a criterion to discriminate between good and bad reasonings. Using logic to model normative reasoning, however, comes with problems of its own, as shown by the various paradoxes found in the literature. From a historical point of view, these paradoxes led to the introduction of different approaches, such as those that emphasize the notion of action and those that try to model conditional normative reasoning. In the first part of this thesis, we provide a review of the literature that is complementary to the one in Peterson (2011). The second part presents our theoretical contribution. First, we propose a monadic deontic logic as an alternative to the standard system, answering many objections that can be made against it; this system is then adapted to model unconditional normative inferences and test their validity. Second, we look at deontic logic from the proof-theoretical perspective of category theory. We begin with a categorical analysis of action logics, and then show that many of the problems that arise when modelling conditional normative reasoning stem from the structural properties of the logic used. Modelling normative reasoning within the framework of monoidal categories thus allows us to answer many objections raised in favour of dyadic and non-monotonic foundations for deontic logic. Finally, we propose a typed deductive system to model legal inferences.
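For orientation, the "standard system" referred to above is usually standard deontic logic (SDL), the normal modal system KD read with an obligation operator O. A compact statement of it, together with Ross's paradox as one of the classic objections, is sketched below; this is a generic summary, not the alternative system proposed in the thesis.

```latex
% Standard deontic logic (SDL): the normal modal system KD with obligation operator O.
\begin{align*}
\text{(K)}\quad  & O(A \rightarrow B) \rightarrow (OA \rightarrow OB)\\
\text{(D)}\quad  & OA \rightarrow \neg O\neg A\\
\text{(Nec)}\quad& \text{from } \vdash A \text{ infer } \vdash OA\\[4pt]
\text{Ross's paradox:}\quad & OA \vdash O(A \lor B)
  \quad\text{(``mail the letter'' yields ``mail it or burn it'')}
\end{align*}
```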
127

Förbättrad tidsuppskattning för IT-projekt

Safa, Amir, Dehmer, Linus January 2010 (has links)
There are different strategies, and many different methods within those strategies, for estimating the time of an IT project. These strategies and methods differ in several respects, and companies often have their own methods for time estimation. They nevertheless share the same objective: to make time estimates as accurate as possible in order to avoid delayed projects. This thesis aims to show how LexiConsult AB can improve its time estimates for incoming IT projects by following the study's recommendations, leading to better time estimates and more satisfied customers. The study was conducted through a literature review in which the various components of IT projects are examined. The theoretical chapter then forms the basis for the empirical investigation carried out at the company, giving insight and perspective into how the company works on its projects. The theory showed that a good time estimation method includes all the elements of a project, and that these must be examined before a good time estimation method can be developed. Because of the complexity of the topic, a deductive approach was chosen for the study, with the theory as its starting point. The thorough theory review conducted by the authors is reflected in the qualitative research method chosen for the study, whose purpose is to gain deeper knowledge through analysis and interpretation of theory and practice. The authors have tried to ensure the quality of the work by considering the validity, reliability and relevance of the collected material. In the analysis, LexiConsult's working methods are compared against the theory, which leads to a number of recommendations in the conclusion of the study. The company's shortcomings in information structure lead to a recommendation to establish a database with categories for the projects the company carries out. The importance of good requirements specifications for time estimation leads, in the conclusion, to an improved specification in which the phases of a project are separated, each with its own time estimate and comments. The importance of good follow-up of projects is highlighted, and recommendations are given for a follow-up template, resulting in the design of one. The company's working methods and project history are drawn on, and an analogy-based time estimation method is recommended for future projects. The drawbacks of the analogical method are mitigated by the new requirements specification, in which the project is divided into phases that are estimated separately according to the micro-strategy for time estimation. According to the study's conclusion, the follow-up document, the new requirements specification and the new information structure will lead to several positive changes within the company: greater responsibility for individuals within projects, better knowledge sharing in the company where past experience forms the basis for new time estimates, a better basis for decisions for the consulting manager, and an easier way for the customer to follow up the company's time estimates and find weaknesses.
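To make the recommended analogy-based estimation strategy concrete, here is a minimal sketch of nearest-neighbour estimation over historical projects. The features, weights and figures are invented for illustration and are not LexiConsult's data or the template designed in the thesis.

```java
import java.util.Comparator;
import java.util.List;

// Minimal sketch of analogy-based effort estimation: a new project is compared with
// historical projects on a few numeric features, and the estimate is taken from the
// most similar past project. Feature names and numbers are illustrative only.
public class AnalogyEstimator {

    record Project(String name, double sizeInFeatures, double teamSize, double actualHours) {}

    static double distance(Project a, Project b) {
        double df = a.sizeInFeatures() - b.sizeInFeatures();
        double dt = a.teamSize() - b.teamSize();
        return Math.sqrt(df * df + dt * dt); // plain Euclidean distance over the features
    }

    static double estimate(Project newProject, List<Project> history) {
        // Use the actual hours of the closest historical analogue as the estimate.
        return history.stream()
                .min(Comparator.comparingDouble((Project p) -> distance(newProject, p)))
                .map(Project::actualHours)
                .orElseThrow();
    }

    public static void main(String[] args) {
        List<Project> history = List.of(
                new Project("webshop", 12, 3, 420),
                new Project("intranet", 5, 2, 150),
                new Project("integration", 20, 4, 900));
        Project incoming = new Project("new-portal", 14, 3, 0);
        System.out.println("estimated hours: " + estimate(incoming, history));
    }
}
```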
128

Παραμετροποίηση στοχαστικών μεθόδων εξόρυξης γνώσης από δεδομένα, μετασχηματισμού συμβολοσειρών και τεχνικών συμπερασματικού λογικού προγραμματισμού / Parameterization of stochastic data mining methods, string conversion algorithms and deductive logic programming techniques

Λύρας, Δημήτριος 02 February 2011 (has links)
The present dissertation deals with the problem of learning from two different perspectives: inductive and deductive learning. Initially, we present parameterizations of stochastic data mining methods in the form of four personalized treatment-support services for patients suffering from anxiety disorders. Three of these services focus on discovering possible associations between the patients' environmental (contextual) parameters and their anxiety levels, while the fourth uses a Bayesian model to predict the stress level a patient is likely to experience in a given environmental context. Our proposals in the wider area of text mining and string conversion include: i) decision-tree-based models for the automatic transcription of Greek text into its equivalent phonetic (CPA) representation, ii) the stochastic modelling of all existing transliteration norms for Greek-to-Greeklish conversion in the form of a robust transcriber, and iii) a novel algorithm that combines two string distance metrics, both well known for their satisfactory performance, in order to address the problem of automatic word lemmatization. With regard to systems that facilitate automatic information retrieval, we propose employing the aforementioned lemmatization algorithm to reduce the ambiguity posed by the many morphological variants of the processed language, together with probabilistic Bayesian networks, aiming at a robust and competitive modern information retrieval system. Finally, our proposals regarding logical deduction and satisfiability checking include: i) a novel mathematical formalization of the analytic tableaux method, named AnaLog (Analytic Tableaux Logic), which allows us to efficiently simulate the structure and properties of a complete clausal tableau for an input CNF formula; via the AnaLog calculus, all the closed branches of the equivalent complete Smullyan tree can be computed without fully constructing the tree, and ii) a practical application of the AnaLog calculus within an interval-arithmetic framework that can decide the satisfiability of propositional formulas in CNF. Besides serving as an illustrative demonstration of the AnaLog calculus, this framework may also be employed as an alternative to conventional SAT systems.
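As a rough illustration of the combined-metric idea behind the lemmatization algorithm (point iii in the text-mining list above), the sketch below scores lemma candidates with an edit distance softened by a shared-prefix bonus. The particular measures, the 0.5 weight and the toy word list are assumptions for illustration, not the algorithm proposed in the dissertation.

```java
import java.util.Comparator;
import java.util.List;

// Illustrative combination of two string similarity measures for ranking lemma
// candidates of an inflected word; lower score means a better candidate.
public class LemmaRanker {

    // Classic Levenshtein edit distance.
    static int editDistance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int sub = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + sub);
            }
        }
        return d[a.length()][b.length()];
    }

    // Length of the shared prefix, useful because inflection usually changes endings.
    static int commonPrefix(String a, String b) {
        int n = Math.min(a.length(), b.length()), i = 0;
        while (i < n && a.charAt(i) == b.charAt(i)) i++;
        return i;
    }

    // Edit distance penalised, shared prefix rewarded (weight is illustrative).
    static double score(String word, String candidate) {
        return editDistance(word, candidate) - 0.5 * commonPrefix(word, candidate);
    }

    public static void main(String[] args) {
        String word = "cats";
        List<String> candidates = List.of("cat", "cast", "act");
        String best = candidates.stream()
                .min(Comparator.comparingDouble((String c) -> score(word, c)))
                .orElseThrow();
        System.out.println("best lemma candidate: " + best);
    }
}
```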
129

An analysis of teacher competences in a problem-centred approach to dynamic geometry teaching

Ndlovu, Mdutshekelwa 04 1900 (has links)
The subject of teacher competences or knowledge has been a key issue in mathematics education reform. This study attempts to identify and analyze teacher competences necessary in the orchestration of a problem-centred approach to dynamic geometry teaching and learning. The advent of dynamic geometry environments into classrooms has placed new demands and expectations on mathematics teachers. In this study the Teacher Development Experiment was used as the main method of investigation. Twenty third-year mathematics major teachers participated in workshop and microteaching sessions involving the use of the Geometer’s Sketchpad dynamic geometry software in the teaching and learning of the geometry of triangles and quadrilaterals. Five intersecting categories of teacher competences were identified: mathematical/geometrical competences, pedagogical competences, computer and software competences, language competences, and assessment competences. / Mathematics Education / M. Ed. (Mathematics Education)
