1 |
An evaluation of the 'Into Science' programme and materials designed by the Open University, using perceptions of South African Colleges of Education students taking this programme. Sokhela, Nompumelelo Kitty Hellen. January 1998 (has links)
The aim of this study was to evaluate a distance education programme and materials called 'Into
Science', designed by the Open University in the United Kingdom. The perceptions of selected
KwaZulu-Natal college students taking the course were used for this evaluation.
The trialling took place in three KwaZulu-Natal colleges of education from February to June
1997. A total of 120 students were involved, mostly third-year primary teacher diploma students.
Students' and lecturers' views were obtained through open-ended questionnaires, 5-point
Likert-type questionnaires, focus group interviews, individual interviews with lecturers/tutors,
and participant observation during the tutorial sessions.
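The 5-point Likert-type responses described above are typically summarised per item. The following is a minimal illustrative sketch only; the item wording and scores are invented, not taken from the study.

```python
# Hedged illustration (items and data are invented): summarising
# 5-point Likert-type questionnaire responses per item.
import statistics

# 1 = strongly disagree ... 5 = strongly agree
responses = {
    "The language of the materials was clear": [4, 5, 4, 3, 5, 4, 4],
    "I valued the practical activities": [2, 3, 2, 1, 3, 2, 2],
}

for item, scores in responses.items():
    mean = statistics.fmean(scores)
    median = statistics.median(scores)
    agree = sum(s >= 4 for s in scores) / len(scores)  # share choosing 4 or 5
    print(f"{item}: mean={mean:.2f}, median={median}, agree={agree:.0%}")
```

Reporting the median and the share agreeing alongside the mean guards against treating ordinal Likert categories as if they were interval data.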
The results show that 'Into Science' materials can be used for South African students, but with
some recommended modifications. The language used in 'Into Science' was not a problem for
most of the students who took part in the trialling; students' reactions to the materials and course
were very positive; their confidence in handling the subject matter increased markedly; most
students did not read everything contained in the study materials in the time specified; students
did not say that their learning styles changed as a result of using these materials; students placed
a low value on the practicals; earth science was not recognised as one of the fields of science;
lecturers had low expectations of their students; and finally, students and tutors or course
providers will need extensive support in a variety of ways in order for the course to run
successfully and to achieve desired outcomes in South Africa. / Thesis (M.Ed.) - University of Natal, Pietermaritzburg, 1998.
|
2 |
A framework for validation of the use of performance assessment in science. Bartley, Anthony William 05 1900 (has links)
The assessment of learning in school science is important to the students,
educators, policy makers, and the general public. Changes in curriculum and instruction
in science have led to greater emphasis upon alternative modes of assessment. Most
significant of these newer approaches is “performance assessment”, where students
manipulate materials in experimental situations. Only recently has the development of
performance assessment procedures, and the appropriate strategies for interpreting their
results, received substantial research attention.
In this study, educational measurement and science education perspectives are
synthesized into an integrated analysis of the validity of procedures, inferences and
consequences arising from the use of performance assessment. The Student Performance
Component of the 1991 B.C. Science Assessment is offered as an example. A framework
for the design, implementation, and interpretation of hands-on assessment in school
science is presented, with validity and feasibility considered at every stage. Particular
attention is given to a discussion of the influence of construct labels upon assessment
design. A model for the description of performance assessment tasks is proposed. This
model has the advantage of including both the science content and the science skill
demands for each task. The model is then expanded to show how simultaneous
representation of multiple tasks enhances the ability to ensure adequate sampling from
appropriate content domains.
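The content-by-skill task-description idea might be sketched as below. This is a hypothetical illustration only; the thesis's actual model and the 1991 B.C. assessment's categories may differ, and the domain/skill tags here are invented.

```python
# Hypothetical sketch of a content-by-skill task-description model:
# each task is tagged with the content domain and skill it demands, and
# representing all tasks together as a grid exposes gaps in sampling.
from collections import Counter

# Illustrative tags only (not taken from the 1991 B.C. assessment).
tasks = [
    ("life science", "observing"),
    ("life science", "measuring"),
    ("physical science", "measuring"),
    ("physical science", "inferring"),
    ("earth science", "observing"),
]

coverage = Counter(tasks)
domains = sorted({d for d, _ in tasks})
skills = sorted({s for _, s in tasks})

# Print the simultaneous representation of all tasks as a grid;
# zero cells reveal under-sampled content/skill combinations.
print(" " * 18 + "".join(f"{s:>12}" for s in skills))
for d in domains:
    row = "".join(f"{coverage[(d, s)]:>12}" for s in skills)
    print(f"{d:<18}{row}")
```

Here, for instance, no task pairs earth science with inferring, which is exactly the kind of sampling gap a simultaneous representation makes visible.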
The main conclusion of this validation inquiry is that every aspect of performance
assessment in science is influenced by the perspective towards learning in science that
permeates the assessment, and that this influence must be considered at all times.
Recommendations are made for those carrying out practical assessments, as well as
suggestions of areas that invite further research. / Education, Faculty of / Curriculum and Pedagogy (EDCP), Department of / Graduate
|
4 |
A study of pedagogical approaches to teaching problem solving. Snyder, Brian Lyn January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries / Department: Computer Science.
|
5 |
Comparability of science assessment across languages : the case of PISA science 2006. El Masri, Yasmine Hachem January 2015 (has links)
In this research, I investigated the extent to which language versions (English, French and Arabic) of the same science test were comparable in terms of item difficulty and demands. I used PISA science 2006 data from three countries (respectively, the UK, France and Jordan). I argued that language was an intrinsic part of the scientific literacy construct, whether or not the examiner intended it. The tight relationship between the language element and the scientific knowledge makes the language variable inextricable from the construct. This argument has considerable implications for the methodologies used to address this question. I also argued that none of the available statistical or qualitative techniques was capable of teasing out the language variable and answering the research question. In this thesis, I adopted both critical-evaluation and empirical methods, drawing on literature from various fields (cognitive linguistics, psychology, measurement and science education) to analyse the test development and design procedures. In addition, I illustrated my claims with evidence from the technical reports and examples of released items. To address my question empirically, I adopted the same class of models employed in PISA, the Rasch model, as well as differential item functioning (DIF) techniques. General tests of fit suggested an overall good fit of the data to the model, with 11 items out of 103 showing strong evidence of misfit. Various violations of the requirements of the Rasch model were highlighted. The DIF analysis indicated that 22% of the items showed bias in the selected countries, but the bias balanced out at test level. Limitations of the DIF analysis in identifying the source of bias were discussed. Qualitative approaches to investigating question demands were examined, and issues with their usefulness in international settings were discussed. A way forward incorporating cognitive load theory and computational linguistics is proposed.
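The DIF idea underlying this kind of cross-language comparison can be sketched with a Mantel-Haenszel check on simulated data. This is a generic illustration, not the thesis's actual analysis or PISA's pipeline: the data are simulated from a Rasch-style model, and the group labels, thresholds, and item count are invented for the example.

```python
# Illustrative sketch (not PISA's actual procedure): Mantel-Haenszel DIF
# detection on simulated dichotomous responses for two language groups.
import numpy as np

rng = np.random.default_rng(0)
n, items = 500, 10
# Simulate Rasch-style responses: P(correct) = 1 / (1 + exp(-(theta - b)))
theta = rng.normal(0, 1, size=(2 * n, 1))   # person abilities
b = np.linspace(-1.5, 1.5, items)           # baseline item difficulties
b_groups = np.vstack([b, b])                # one difficulty row per group
b_groups[1, 4] += 0.8                       # item 4 is harder for group 1 (DIF)
group = np.repeat([0, 1], n)
p = 1 / (1 + np.exp(-(theta - b_groups[group])))
resp = (rng.random(p.shape) < p).astype(int)

def mantel_haenszel_dif(resp, group, item):
    """Common odds ratio for one item, stratified by rest-score."""
    rest = resp.sum(axis=1) - resp[:, item]  # matching variable
    num = den = 0.0
    for s in np.unique(rest):
        m = rest == s
        a = np.sum(resp[m & (group == 0), item] == 1)   # reference correct
        b_ = np.sum(resp[m & (group == 0), item] == 0)  # reference wrong
        c = np.sum(resp[m & (group == 1), item] == 1)   # focal correct
        d = np.sum(resp[m & (group == 1), item] == 0)   # focal wrong
        t = a + b_ + c + d
        if t:
            num += a * d / t
            den += b_ * c / t
    return num / den if den else float("nan")

for i in range(items):
    or_mh = mantel_haenszel_dif(resp, group, i)
    flag = "  <-- possible DIF" if abs(np.log(or_mh)) > 0.4 else ""
    print(f"item {i}: MH odds ratio = {or_mh:.2f}{flag}")
```

Matching on the rest-score holds ability roughly constant, so an odds ratio far from 1 flags an item that behaves differently across groups; as the abstract notes, such flags locate biased items but not the source of the bias.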
|