11. Impact of a California Community College's General Education Information Literacy Requirement
Usina, Phyllis, 01 January 2015
Budget cuts at a California community college prompted stakeholders to consider dropping the college's general education information literacy (IL) requirement. Broad institutional outcomes data showed learning gains, but no targeted assessment existed regarding the IL requirement's impact on those gains. This quantitative study used Astin and Antonio's Inputs-Environment-Outcomes (I-E-O) assessment model to address relationships among student demographic characteristics and prior preparation (Inputs), the IL requirement (Environment), and student reports of information critical analysis behavior and confidence (Outcomes). Study participants were 525 students aged 18 years and older who had completed the IL course with a grade of 2.0 or better and volunteered to complete an anonymous survey. The majority of participants reported that the IL requirement had a positive impact on subsequent coursework, with 87% stating that taking it in the first or second term would be most helpful. Less preparedness for information critical analysis prior to the IL course was significantly correlated (r = -.35, r = -.38, p < .001) with higher reported frequency of 2 measures of information evaluation changes following completion of the course. The 3 hierarchical multiple regression analyses revealed that the predictors of student demographic characteristics, prior student preparation, and IL course format contributed significantly to reported information critical analysis and confidence. The study's outcome was a white paper with recommendations to support early completion of the IL course requirement, continue the IL requirement, and repeat the study's survey in the future. Effective IL education promotes the information evaluation behaviors essential to informed members of society.
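The I-E-O design above corresponds to a blockwise (hierarchical) regression: fit the Inputs block first, then add the Environment block and read its contribution off the increase in R². The sketch below uses entirely synthetic data and illustrative variable names (none are drawn from the study's actual instruments):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for the study's predictor blocks (illustrative only):
inputs = rng.normal(size=(n, 2))       # e.g. demographics, prior preparation
environment = rng.normal(size=(n, 1))  # e.g. IL course format
y = inputs @ np.array([0.4, 0.3]) + 0.5 * environment[:, 0] + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept term."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: Inputs block only; Step 2: Inputs + Environment blocks.
r2_step1 = r_squared(inputs, y)
r2_step2 = r_squared(np.column_stack([inputs, environment]), y)
delta_r2 = r2_step2 - r2_step1  # variance uniquely explained by the added block
```

Because the step-1 model is nested in the step-2 model, `delta_r2` is non-negative by construction; the study's significance tests assess whether such increments are larger than chance.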
12. Les compétences informationnelles des futurs enseignants québécois sur le Web [The information literacy of Québec preservice teachers on the Web]
Dumouchel, Gabriel, 06 1900
Although most Québec students, from elementary school to university, now turn to the Web to find information for their studies, many have difficulty obtaining the information they need. Faced with this new reality, current and future teachers are expected to develop students' information literacy, that is, their ability to search for, evaluate, and use information; but are they themselves well prepared to do so? This study therefore examines whether Québec preservice teachers have the competencies needed to search for, evaluate, and use information to meet their own information needs, and whether they are adequately trained to teach these competencies to their pupils. To achieve this goal, the thesis draws on a conceptual framework built around three complementary elements: the concept of information literacy, the information seeking and processing process, and methods for teaching information literacy in schools. To meet the specific research objectives derived from this framework, a mixed methodology combined a survey, interviews, and observations conducted with 353 preservice teachers at a Québec university.
First, results on the preservice teachers' reported and actual Web information-seeking practices show that most of them are novice searchers: they plan their searches little or not at all, use basic search strategies, and diversify their search tools very little, with Google dominating by far. Second, results on information processing show that most preservice teachers evaluate the information they find according to several criteria, including its verifiability and its potential use for teaching and learning; and while a majority of them can synthesize information found on the Web, only a minority cite their sources. Third, results indicate that the preservice teachers consider their initial training in information literacy clearly insufficient, as it is concentrated at the beginning of the bachelor's program in the form of workshops offered by the university library. In parallel, the results show that most preservice teachers intend to put more emphasis on teaching the evaluation and use of information than on teaching how to search for it on the Web: pupils will be taught to evaluate the information they find against multiple criteria and to guard against plagiarism. Finally, the study's results are synthesized and analyzed in light of the existing literature, and recommendations are offered to improve initial teacher training in information literacy.
13. Lying, deception and strategic omission: definition and evaluation
Icard, Benjamin, 04 February 2019
This thesis aims to improve the definition and evaluation of deceptive strategies for manipulating information. Conceptual, formal, and experimental resources are combined to analyze standard cases of deception, in particular lying, as well as non-standard cases, in particular misleading inferences and strategic omission. Definitional aspects are considered first. I analyze the traditional definition of lying and present new empirical data supporting this classical account (the 'subjective definition') against recent arguments for an 'objective definition' obtained by adding a falsity clause. I then use qualitative belief revision to examine a logic puzzle due to R. Smullyan concerning a limiting case of deception, in which a default inference rule is triggered to deceive an addressee by omission. Evaluative aspects are considered next. Starting from the existing framework for military intelligence evaluation, I propose a typology of informational messages based on the descriptive dimensions of truth (for message contents) and honesty (for message sources). I then present a numerical procedure for evaluating these messages based on the evaluative dimensions of credibility (for truth) and reliability (for honesty): quantitative plausibility models capture the prior credibility of messages, and dynamic numerical rules update these degrees according to the reliability of the source.
14. Vilse i ett flöde av digital information: En metastudie om skolans roll i elevernas källkritiska utvecklingsförmåga [Lost in a stream of digital information: a meta-study of the school's role in students' development of source-critical ability]
Svensson, Annie; Malmqvist, Julia, January 2023
Our curiosity about source criticism was sparked during our internships, where we discovered that students, even at an early age, frequently use digital sources and social media to find information and news. The purpose of this meta-study is to examine how the Swedish school system teaches younger students critical thinking and source criticism. The current Swedish curriculum states, in its introductory chapters and in the core content for Swedish and civics education, that students should be given the opportunity during compulsory school to develop a source-critical approach. The research question posed was therefore: "Are the younger students in compulsory school given the right tools and conditions to develop their critical thinking in civics education, and thus understand contemporary society?" To answer this question, a qualitative textual analysis was conducted on a selection of related literature. The results showed that schools lack practical work on critical thinking and source criticism; as a consequence, teachers find it difficult to determine how it should be incorporated and which subject is responsible for it. Studies have shown confusion mainly among Swedish and civics teachers, since each considered source criticism to be the other's area of responsibility. The results also identified the abilities a student needs in order to develop a source-critical approach, and showed how a phenomenon can be interpreted differently depending on the experiences of the observer. The primary conclusion drawn was that there is a significant lack of research focusing on elementary-school students and how they relate to the media and other information they encounter every day.
It was also concluded that the Swedish school system lacks communication on this topic, despite how important it is for young students to develop the critical-thinking skills needed to participate in our society as democratic citizens.
15. EEG Data Acquisition and Automatic Seizure Detection Using Wavelet Transforms in the Newborn EEG
Zarjam, Pega, January 2003
This thesis deals with the problem of newborn seizure detection from electroencephalogram (EEG) signals. The ultimate goal is to design an automated seizure detection system to assist medical personnel in timely seizure detection. Seizure detection is vital because neurological diseases or dysfunctions in newborn infants are often first manifested by seizures, and prolonged seizures can result in impaired neuro-development or even fatality. The EEG has proved superior to clinical examination of newborns for early detection and prognostication of brain dysfunction. However, long-term newborn EEG acquisition is considerably more difficult than that of adults and children: the number of electrodes attached to the skin is limited by the size of the head, newborns' EEGs vary from day to day, and newborns tolerate the recording situation poorly. Movement of the newborn can also create artifacts in the recording and, as a result, strongly affect electrical seizure recognition. Most existing methods for neonates are either time- or frequency-based and therefore do not consider the non-stationary nature of the EEG signal. Thus, notwithstanding the plethora of existing methods, this thesis applies the discrete wavelet transform (DWT) to account for the non-stationarity of the EEG signals. First, two methods for seizure detection in neonates are proposed. Both detection schemes are based on observing the changing behaviour of a number of statistical quantities of the wavelet coefficients (WC) of the EEG signal at different scales. In the first method, the variance and mean of the WC are used as a feature set to classify the EEG data into seizure and non-seizure; the test results give an average seizure detection rate (SDR) of 97.4%. In the second method, the number of zero-crossings and the average distance between adjacent extrema of the WC at certain scales are extracted to form a feature set.
The test obtains an average SDR of 95.2%. The proposed feature sets are both simple to implement and combine a high detection rate with a low false-alarm rate. Then, in order to reduce the complexity of the proposed schemes, two optimization methods are used to reduce the number of selected features. First, the mutual information feature selection (MIFS) algorithm is applied to select the optimum feature subset: an optimal subset of 9 features provides an SDR of 94%, which, compared to the full feature set, significantly reduces the system complexity. The drawback of the MIFS algorithm is that it ignores the interaction between features. To overcome this drawback, an alternative algorithm, the mutual information evaluation function (MIEF), is then used. The MIEF evaluates a set of candidate features extracted from the WC to select an informative feature subset; it is based on measuring the information gain and takes the interaction between features into consideration. The performance of the resulting features is evaluated and compared to that of the features obtained using the MIFS algorithm. The MIEF algorithm selected an optimal set of 10 features, resulting in an average SDR of 96.3%; it is also shown that an average SDR of 93.5% can be obtained with only 4 features. In comparison with the results of the first two methods, the optimal feature subsets improve system performance while significantly reducing the system complexity for implementation purposes.
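The two wavelet-based feature sets described above can be sketched in a few lines. The Haar wavelet is used here only to keep the example self-contained (the abstract does not specify the wavelet family), and the classifier stage is omitted:

```python
import numpy as np

def haar_details(x, levels=3):
    """1-D Haar DWT: return the detail coefficients at each scale.
    (An illustrative stand-in for the DWT used in the thesis.)"""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        approx = approx[: (len(approx) // 2) * 2]  # truncate to even length
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    return details

def feature_vector(x, levels=3):
    """First feature set: mean and variance of the wavelet coefficients
    at each decomposition scale."""
    feats = []
    for d in haar_details(x, levels):
        feats.extend([np.mean(d), np.var(d)])
    return np.array(feats)

def zero_crossings(d):
    """Part of the second feature set: zero-crossing count of the
    coefficients at one scale."""
    return int(np.sum(np.signbit(d[:-1]) != np.signbit(d[1:])))
```

Each EEG epoch would be reduced to such a feature vector and passed to a seizure/non-seizure classifier; the feature-selection stage (MIFS or MIEF) then prunes this vector to the most informative components.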