1 |
How teachers think about the role of digital technologies in student assessment in mathematics. Venturini, Marta <1987> 07 July 2015 (has links)
This study concerns teachers’ use of digital technologies in student assessment, and how the learning that is developed through the use of technology in mathematics can be evaluated. Nowadays math teachers use digital technologies in their teaching, but not in student assessment. The activities carried out with technology are seen as ‘extra-curricular’ (by both teachers and students), thus students do not learn what they can do in mathematics with digital technologies. I was interested in knowing the reasons teachers do not use digital technology to assess students’ competencies, and what they would need to be able to design innovative and appropriate tasks to assess students’ learning through digital technology.
This dissertation is built on two main components: teachers and task design. I analyze teachers’ practices involving digital technologies with Ruthven’s Structuring Features of Classroom Practice, and what relation these practices have to the types of assessment they use. I study the kinds of assessment tasks teachers design with a DGE (Dynamic Geometry Environment), using Laborde’s categorization of DGE tasks. I consider the competencies teachers aim to assess with these tasks, and how their goals relate to the learning outcomes of the curriculum.
This study also develops new directions for designing suitable tasks for student mathematical assessment in a DGE, and it is driven by the desire to know what kinds of questions teachers might be more interested in using. I investigate the kinds of technology-based assessment tasks teachers value, and the type of feedback they give to students. Finally, I argue that the curriculum should include a range of mathematical and technological competencies that involve the use of digital technologies in mathematics, and I evaluate the possibility of taking advantage of technology feedback to allow students to continue learning while they are taking a test.
|
2 |
L’effetto “età della Terra”. Contratto didattico e principi regolativi dell’azione degli studenti in matematica. / “The age of the Earth” effect. Didactic contract and regulative principles for the action of students facing mathematical tasks. Ferretti, Federica <1985> 09 July 2015 (has links)
The object of this thesis is a didactic phenomenon first observed in two INVALSI national standardized assessments, related to students’ behaviour when performing mathematical tasks. The effect, which we call the “age of the Earth” effect, is interpreted in this research through the comparison and interaction of different theoretical constructs, which explain how an effect that stems from the teacher-student relation, and can be described as a typical didactic contract situation, may come to affect the relationship between the student and mathematics.
We first conducted a study of the statistical results of the national assessments (Rasch analysis). The first step of the experimental design then consisted of the preparation, validation and administration of 612 questionnaires to students of different school grades; based on the results of these questionnaires, we conducted group interviews. This quantitative and qualitative analysis confirmed the presence of the “age of the Earth” effect and showed that it is independent of the students’ age and school level, of the mathematical content and of the context of the proposed tasks. The second part of the research investigated possible causes of this effect: we identified a regulative principle conditioning the action of students doing mathematics, and we conducted many individual interviews in order to study it. The behaviour of the interviewed students was then analysed and classified using the constructs of the theoretical framework.
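The abstract above relies on Rasch analysis of the INVALSI data. As an illustration (not code or data from the thesis), the dichotomous Rasch model expresses the probability of a correct answer as a logistic function of the difference between student ability and item difficulty:

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: probability of a correct answer as a
    logistic function of (ability - difficulty), both on the same scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability matches the item difficulty succeeds half the time.
p_equal = rasch_probability(0.0, 0.0)   # 0.5
# Higher ability on the same item raises the success probability.
p_strong = rasch_probability(2.0, 0.0)
p_weak = rasch_probability(-2.0, 0.0)
```

A convenient property of the model is the symmetry of the logistic curve: the success probabilities of two students whose abilities lie symmetrically around the item difficulty sum to one.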
|
3 |
Funzioni e potenzialità dell'analisi statistica di test su larga scala in didattica della matematica / Functions and potential of the statistical analysis of large-scale tests in mathematics education. Giberti, Chiara January 2017 (has links)
This thesis aims to show how the statistical analysis of standardized mathematics tests can have important implications for the study of didactic phenomena. In particular, the research presented concerns the INVALSI tests, studied through the Rasch model. We show how this approach can bring out macro-phenomena already observed in mathematics education, studying them also from a quantitative point of view. For these cases we use graphs called distractor plots, which show how the correct answer and the other response options vary as a function of student ability. Applied to the whole population or to subsets of it, these graphs make it possible to detect whether a given student response, linked to a didactic construct, has a stronger influence at particular ability levels.
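The data behind a distractor plot can be sketched in a few lines: bin students by an ability proxy and compute, per bin, the proportion choosing each answer option. This is a generic illustration, not the thesis's pipeline; the thesis uses Rasch ability estimates, while the sketch below uses total score, and the data layout is hypothetical:

```python
from collections import Counter, defaultdict

def distractor_curves(students, item, n_bins=4):
    """For one multiple-choice item, compute the proportion of students
    choosing each answer option within each ability bin, where ability
    is proxied by the total test score (equal-count binning by rank)."""
    scores = sorted(s["score"] for s in students)
    def bin_of(score):
        rank = scores.index(score)  # ties share the lowest rank
        return min(rank * n_bins // len(scores), n_bins - 1)
    counts = defaultdict(Counter)
    for s in students:
        counts[bin_of(s["score"])][s["answers"][item]] += 1
    return {b: {opt: c / sum(ctr.values()) for opt, c in ctr.items()}
            for b, ctr in counts.items()}

# Hypothetical mini-dataset: option "A" is correct on item "Q1",
# and only the stronger students choose it.
students = [{"score": i, "answers": {"Q1": "A" if i > 4 else "B"}}
            for i in range(1, 9)]
curves = distractor_curves(students, "Q1", n_bins=2)
```

Plotting each option's proportion against the bin index gives the distractor plot described in the abstract; a distractor that peaks at a specific ability level signals the kind of construct-linked behaviour the thesis investigates.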
|
4 |
New proposals for the popularization of braid theory. Dalvit, Ester January 2011 (has links)
Braid theory is a very active research field. Braids can be studied from various points of view and have applications in different fields, e.g. in mathematical physics and in biology. In this thesis we provide a formal introduction to some topics in the mathematical theory of braids and two possible approaches to this field at a popular level: a movie and a workshop. The scientific movie, addressed to a non-specialist audience, was realized using the free ray-tracer POV-Ray. It is divided into four parts, each about 15 minutes long. The content ranges from the introduction of basic concepts to deep results. The workshop activity is based on the action of braids on loops and aims to invite and lead the audience to a mathematical formalization of the principal concepts involved: braids, curves and group actions.
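The workshop's central idea, braids acting on objects, can be glimpsed through the permutation each braid induces on its strand endpoints. This is a generic illustration, not the thesis's formalization: the braid relation σ₁σ₂σ₁ = σ₂σ₁σ₂ survives in the induced permutations, because the symmetric group is a quotient of the braid group.

```python
def sigma(i, n):
    """Permutation of n strand endpoints induced by the braid generator
    sigma_i, which crosses strand i over strand i+1 (0-indexed)."""
    p = list(range(n))
    p[i], p[i + 1] = p[i + 1], p[i]
    return tuple(p)

def compose(p, q):
    """Apply permutation p first, then q."""
    return tuple(q[p[k]] for k in range(len(p)))

n = 3
s1, s2 = sigma(0, n), sigma(1, n)
# The braid relation sigma1 sigma2 sigma1 = sigma2 sigma1 sigma2 holds
# for the induced permutations. The converse direction is lost: sigma_i
# has infinite order in the braid group but order 2 as a permutation,
# which is exactly why braids carry more information than permutations.
lhs = compose(compose(s1, s2), s1)
rhs = compose(compose(s2, s1), s2)
```

The permutation picture is what the workshop audience can verify with physical strings, while the movie can show the extra information (over/under crossings) that permutations forget.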
|
5 |
Geogebrizzazione di testi matematici come processo di oggettivazione / Geogebrization of mathematical texts as a process of objectification. Del Zozzo, Agnese 03 June 2024 (has links)
We define, analyse and characterize the geogebrization (GGBZ) of a mathematical text: an activity in which one or more individuals transform a printed mathematical text into a suitable combination of resources (an expression whose operational characterization is grounded in the literature) of the GeoGebra Service Platform (PSG, https://www.geogebra.org/). The final aim of this process is to create a product, published on the PSG, that can be used for communication, popularization or teaching purposes. To study this process, we focus on a particular case study whose starting point is Guido Castelnuovo's 1904 text "Lezioni di geometria analitica e proiettiva" and whose end point is a GeoGebra Book. The research therefore belongs to the field in which the history of mathematics, mathematics education and digital technologies interact, and the theoretical framework chosen for studying the GGBZ process is Luis Radford's Theory of Objectification (TO). From a methodological point of view, we designed a sequence of activities trialled with four pairs of participants with different mathematical backgrounds. Each pair consists of experts in topics considered relevant for exploring and characterizing how the GGBZ of a mathematical text contributes to what the TO calls the "process of domestication of the eye", in the particular case study of the foundations of projective geometry as presented in Castelnuovo (1904). The general result is that the GGBZ of a mathematical text is a teaching/learning activity in the sense of the TO.
|
6 |
Theoretical and Algorithmic Solutions for Null models in Network Theory. Gobbi, Andrea January 2013 (has links)
The graph-theoretical formulation for representing the data-driven structure and dynamics of complex systems is rapidly imposing itself as the paramount paradigm [1] across a variety of disciplines, from economics to neuroscience, with biological -omics as a major example. In this framework, the concept of null model, borrowed from the statistical sciences, identifies the elective strategy for obtaining a baseline point of comparison for modelling [2]. Hereafter, a null model is a graph which matches one specific graph in terms of some structural features, but which is otherwise taken to be generated as an instance of a random network. In this view, the network model introduced by Erdős & Rényi [3], where random edges are generated as independent and identically distributed Bernoulli trials, can be considered the simplest possible null model. In the following years, other null models were developed in the framework of graph theory, with the detection of community structure as one of the most important targets [4]. In particular, the model described in [5] introduces the concept of a randomized version of the original graph: edges are rewired at random, with each expected vertex degree matching the degree of the vertex in the original graph. Although aimed at building a reference for community detection, this approach plays a key role in one of the models considered in this thesis. Note that, despite being the first problem to be considered, designing null models for community structure detection is still an open problem [6, 7]. Real-world applications of null models in graph theory have also gained popularity in many different scientific areas, with ecology as the first example: see [8] for a comprehensive overview. More recently, interest in network null models has also arisen in computational biology [9, 10], geosciences [11] and economics [12, 13], to name just a few. In the present work the theoretical design and practical implementation of a series of algorithms for the construction of null models are introduced, with applications ranging from functional genomics to game theory for social studies.
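The Erdős & Rényi model mentioned above, with edges as i.i.d. Bernoulli trials, can be sketched directly. This is a generic illustration of the simplest null model, not code from the thesis:

```python
import random

def erdos_renyi(n, p, seed=None):
    """Simplest null model: each of the n*(n-1)/2 possible undirected
    edges is an independent, identically distributed Bernoulli(p) trial."""
    rng = random.Random(seed)
    return {(u, v) for u in range(n) for v in range(u + 1, n)
            if rng.random() < p}
```

Any structural feature measured on a real network can then be compared against its distribution over many such random instances, which is the baseline-comparison strategy the abstract describes.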
The four chapters devoted to the presentation of examples of null models are preceded by an introductory chapter with a quick overview of graph theory, together with all the required notation.
The first null model is the topic of the second chapter, where a suite of novel algorithms is presented, aimed at the efficient generation of complex networks under different constraints on the node degrees. Although not the most important example in the thesis, the prominent position dedicated to this topic is due to its close relationship with the aforementioned classical null models for random graph construction. Together with the definition of the algorithms and examples, a thorough theoretical analysis of the proposed solutions is given, highlighting the improvements with respect to the state of the art and the remaining limitations. Apart from their intrinsic mathematical value, the interest of the systems biology community in these algorithms lies in the need for benchmark graphs resembling real biological networks. Such graphs are of utmost importance when testing novel inference methods, and as testbeds for network reconstruction challenges such as the DREAM series [14, 15, 16].
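A classical way to sample networks under fixed node degrees, related to the randomized-version null model of [5] but not one of the thesis's own algorithms, is the double edge swap; a minimal sketch:

```python
import random

def double_edge_swap(edges, n_swaps, seed=None):
    """Degree-preserving randomization: repeatedly pick two edges (a, b)
    and (c, d) and rewire them to (a, d) and (c, b), rejecting any swap
    that would create a self-loop or a duplicate edge. Every node keeps
    its original degree, since each endpoint loses and gains one edge."""
    rng = random.Random(seed)
    edge_set = {tuple(sorted(e)) for e in edges}
    done, attempts = 0, 0
    while done < n_swaps and attempts < 100 * n_swaps:
        attempts += 1
        (a, b), (c, d) = rng.sample(sorted(edge_set), 2)
        new1, new2 = tuple(sorted((a, d))), tuple(sorted((c, b)))
        if a == d or c == b or new1 in edge_set or new2 in edge_set:
            continue  # rejected swap
        edge_set -= {(a, b), (c, d)}
        edge_set |= {new1, new2}
        done += 1
    return edge_set
```

Run long enough, the swap chain mixes toward a uniform sample over graphs with the prescribed degree sequence, which is the kind of constrained benchmark-graph generation the chapter studies.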
Chapter three includes the most complex application of null models presented in this thesis. The scientific field is again functional genomics, namely the combinatorial approach to modelling patterns of mutations in cancer as detected by next-generation sequencing exome data. This problem has a natural mathematical representation in terms of rewiring of bipartite networks and mutually exclusively mutated modules [17, 18], to which Markov chain updates (switching steps) are applied through a switching algorithm (SA). Here we present some crucial improvements to the SA, analytically derive an approximate lower bound for the number of steps required, introduce BiRewire, an R package implementing the improved SA, and demonstrate the effectiveness of the novel solution on a breast cancer dataset.
A novel threshold-selection method for the construction of co-expression networks based on the Pearson coefficient is the third and last biological example of a null model, and it is outlined in Chapter four. Gene co-expression networks inferred by correlation from high-throughput profiling such as microarray data represent a simple but effective technique for discovering and interpreting linear gene relationships. In recent years several approaches have been proposed to tackle the problem of deciding when the resulting correlation values are statistically significant. This is most crucial when the number of samples is small, yielding a non-negligible chance that even high correlation values are due to random effects. Here we introduce a novel hard-thresholding solution based on the assumption that a co-expression network inferred from randomly generated data is expected to be empty. The theoretical derivation of the new bound by geometrical methods is presented together with two applications in oncogenomics.
The last two chapters of the thesis are devoted to the presentation of null models in non-biological contexts. In Chapter 5 a novel dynamic simulation model is introduced, mimicking a random market in which sellers and buyers follow different price distributions and matching functions. The random market is mathematically formulated as a dynamic bipartite graph, and the analytical formula for the evolution over time of the mean exchange price is derived, together with a global likelihood function for retrieving the initial parameters under different assumptions. Finally, in Chapter 6 we describe how graph tools can be used to model abstraction and strategy (see [19, 20, 21]) for a class of games, in particular the TTT solitaire. We show that in this solitaire it is not possible to build an optimal strategy (in the sense of a minimum number of moves) by dividing the big problem into smaller subproblems. Nevertheless, we find some subproblems and strategies for solving the TTT solitaire with a negligible increment in the number of moves. Although quite simple and far from simulating highly complex real-world decision-making situations, the TTT solitaire is an important tool for starting the exploration of the social analysis of the trajectories of the implementation of winning strategies through different learning procedures [22].
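The Chapter-four idea, that a co-expression network inferred from random data should come out empty, can be illustrated with a Monte Carlo stand-in. The thesis derives its bound analytically by geometrical methods, so the following is only a conceptual sketch of the null hypothesis being thresholded against:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def null_threshold(n_samples, n_trials=200, seed=0):
    """Empirical hard threshold: the largest |correlation| seen between
    pairs of randomly generated expression profiles. Cutting edges below
    this value leaves a network built from pure noise (empirically) empty."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(n_trials):
        x = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        y = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        best = max(best, abs(pearson(x, y)))
    return best
```

With few samples the returned threshold is high, matching the abstract's point that small sample sizes give a non-negligible chance of large correlations arising from random effects alone.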
|
7 |
Analysis of 3D scanning data for optimal custom footwear manufacture. Ture Savadkoohi, Bita January 2011 (has links)
Very few standards exist for fitting products to people. Footwear fit is a noteworthy example of consumer consideration when purchasing shoes. To achieve commercial success, the footwear manufacturing industry faces the problem of developing footwear which fulfils consumers' requirements better than its competitors'. Mass customization starts with understanding an individual customer's requirements and finishes with the fulfilment process of satisfying the target customer with near mass-production efficiency. Unlike other consumer products, personalizing footwear, that is, matching footwear to feet, is not easy: discomfort is predominantly caused by pressure induced by a shoe whose design is unsuitable for that particular shape of foot. Footwear fitters have long relied on manual measurement, but combining 3D scanning systems with mathematical techniques makes it possible to develop systems that can help in the selection of good footwear for a given customer. This thesis provides a new approach to the problem of computerized footwear fit customization in industry. The design of new shoes starts with the design of a new shoe last, a wooden or metal model of the human foot on which shoes are shaped. Despite the steady increase in accuracy, most available scanning techniques leave deficiencies in the point cloud and a set of holes in the triangle meshes. Moreover, data resulting from 3D scanning are given in an arbitrary position and orientation in 3D space, so substantial post-processing is usually required before sophisticated modelling operations can be applied to these data sets. We describe a robust algorithm for filling holes in a triangle mesh. First, the advancing front mesh technique is used to generate a new triangular mesh covering the hole. Next, the triangles in the initial patch mesh are modified by estimating desirable normals instead of relocating the vertices directly.
Finally, the Poisson equation is applied to optimize the new mesh. After a complete 3D model is obtained, the resulting data must be generated and aligned before the models are used for shape analysis, such as measuring the similarity between a foot and the shoe lasts in a database to evaluate footwear fit. Principal Component Analysis (PCA) aligns a model by taking its centre of mass as the coordinate-system origin and its principal axes as the coordinate axes. The purpose of applying PCA to a 3D model is to make the resulting shape as independent as possible of translation and rotation. In our analysis, we apply a "weighted" PCA, instead of the classical PCA on sets of 3D point clouds, for the alignment of 3D models; this approach is based on weights associated with the centres of gravity of the triangles. Once all the models are aligned, an efficient algorithm cuts each model into several sections from heel to toe to extract contours. The area of each contour is then calculated and compared with the corresponding sections in the shoe-last database to find the best footwear fit within the database.
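The weighted-PCA alignment step can be sketched in two dimensions. The thesis works with 3D meshes and weights derived from triangle centroids; the simplified illustration below takes arbitrary positive weights and 2D points (assumptions), translating the weighted centroid to the origin and rotating the principal axis onto x:

```python
import math

def principal_axes_2d(points, weights=None):
    """Weighted PCA alignment in 2D: move the weighted centroid to the
    origin, then rotate so the principal axis of the weighted covariance
    lies along the x axis."""
    if weights is None:
        weights = [1.0] * len(points)
    W = sum(weights)
    cx = sum(w * x for w, (x, y) in zip(weights, points)) / W
    cy = sum(w * y for w, (x, y) in zip(weights, points)) / W
    centered = [(x - cx, y - cy) for x, y in points]
    # Entries of the 2x2 weighted covariance matrix.
    sxx = sum(w * x * x for w, (x, y) in zip(weights, centered)) / W
    syy = sum(w * y * y for w, (x, y) in zip(weights, centered)) / W
    sxy = sum(w * x * y for w, (x, y) in zip(weights, centered)) / W
    # Closed-form angle of the principal eigenvector for a 2x2 matrix.
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    c, s = math.cos(theta), math.sin(theta)
    # Rotate every point by -theta.
    return [(c * x + s * y, -s * x + c * y) for x, y in centered]
```

After this normalization, two scans of the same shape land in (approximately) the same pose regardless of how they sat in the scanner, which is what makes the subsequent foot-to-last contour comparison meaningful.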
|