
Constructing Provably Secure Identity-Based Signature Schemes

Chethan Kamath, H. January 2013
An identity-based cryptosystem (IBC) is a public-key system in which the public key can be any arbitrary string, such as an e-mail address. The notion was introduced by Shamir with the primary goal of simplifying certificate management. An identity-based signature (IBS) is the identity-based counterpart of a digital signature. In the first (and primary) part of the work, we take a closer look at an IBS due to Galindo and Garcia, GG-IBS for short. GG-IBS is derived through a simple and elegant concatenation of two Schnorr signatures and, importantly, does not rely on pairing. Its security is established through two algorithms, both of which use the Multiple-Forking (MF) Algorithm to reduce the problem of computing the discrete logarithm to breaking the IBS. Our focus is on the security argument: it turns out that the argument is flawed and, as a remedy, we sketch a new one. However, the resulting security bound is still quite loose, chiefly due to the use of the MF Algorithm. We explore possible avenues for improving this bound and, to this end, introduce two notions pertaining to random oracles, termed dependency and independency. Incorporating (in)dependency allows us to launch the nested replay attack far more effectively than in the MF Algorithm, leading to a cleaner, significantly tighter security argument for GG-IBS and completing the final piece of the GG-IBS jigsaw. The second part of the work pertains to the notion of selective identity (sID) for IBCs. The focus is on the problem of constructing a fully secure IBS from an sID-secure IBS without using random oracles and with reasonable security degradation.
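GG-IBS is built by concatenating ordinary Schnorr signatures, so a minimal Schnorr sign/verify sketch helps fix the notation. This is not the GG-IBS construction itself, and the group parameters below are deliberately tiny toys (an 11-bit subgroup offers no real security); real instantiations use large prime-order groups or elliptic curves.

```python
import hashlib
import secrets

# Toy Schnorr group: p = 2q + 1 with q prime, g generating the
# order-q subgroup. Illustration only -- NOT cryptographically secure.
q = 1019
p = 2 * q + 1          # 2039, prime
g = 4                  # 2^2 mod p, has order q

def H(*parts) -> int:
    """Random-oracle stand-in: hash the parts to a challenge in Z_q."""
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1   # secret key in Z_q*
    return x, pow(g, x, p)             # (sk, pk = g^x)

def sign(x, msg):
    r = secrets.randbelow(q - 1) + 1
    R = pow(g, r, p)                   # commitment
    c = H(R, msg)                      # challenge
    s = (r + c * x) % q                # response
    return R, s

def verify(y, msg, sig):
    R, s = sig
    c = H(R, msg)
    # g^s = g^(r + c*x) = R * y^c must hold for a valid signature.
    return pow(g, s, p) == (R * pow(y, c, p)) % p

x, y = keygen()
sig = sign(x, "hello")
assert verify(y, "hello", sig)
```

The reduction discussed in the abstract rewinds (forks) a forger at the point where the challenge `c` is issued to extract the discrete logarithm `x` from two related forgeries; the MF Algorithm does this nested rewinding multiple times, which is where the loose bound comes from.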

Real-time Assessment, Prediction, and Scaffolding of Middle School Students’ Data Collection Skills within Physical Science Simulations

Sao Pedro, Michael A. 25 April 2013
Despite widespread recognition by science educators, researchers, and K-12 frameworks that scientific inquiry should be an essential part of science education, typical classrooms and assessments still emphasize rote vocabulary, facts, and formulas. One reason among several is that the rigorous assessment of complex inquiry skills is still in its infancy. Though progress has been made, many challenges still hinder inquiry from being assessed in a meaningful, scalable, reliable, and timely manner. To address some of these challenges and to realize the possibility of formative assessment of inquiry, we describe a novel approach for evaluating, tracking, and scaffolding inquiry process skills. These skills are demonstrated as students experiment with computer-based simulations. In this work, we focus on two skills related to data collection: designing controlled experiments and testing stated hypotheses. Central to this approach is the use and extension of techniques developed in the Intelligent Tutoring Systems and Educational Data Mining communities to handle the variety of ways in which students can demonstrate skills. To evaluate students' skills, we iteratively developed data-mined models (detectors) that can discern when students test their articulated hypotheses and design controlled experiments. To aggregate and track students' developing latent skill across activities, we use and extend the Bayesian Knowledge-Tracing framework (Corbett & Anderson, 1995). As part of this work, we directly address the scalability and reliability of these models' predictions by testing how well they predict for student data not used to build them. In doing so, we found that these models demonstrate the potential to scale because they can correctly evaluate and track students' inquiry skills. The ability to evaluate students' inquiry also enables the system to provide automated, individualized feedback to students as they experiment.
As part of this work, we also describe an approach to provide such scaffolding to students. We tested the efficacy of these scaffolds by conducting a study to determine how scaffolding impacts the acquisition and transfer of skill across science topics. We found that students who received scaffolding, compared with those who did not, were better able to acquire skills in the topic in which they practiced and to transfer those skills to a second topic once scaffolding was removed. Our overall findings suggest that computer-based simulations augmented with real-time feedback can reliably measure the inquiry skills of interest and can help students learn how to demonstrate these skills. As such, our assessment approach and system as a whole show promise as a way to formatively assess students' inquiry.
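The Bayesian Knowledge-Tracing update that the tracking layer extends can be sketched in a few lines. The parameter values below (prior, slip, guess, learn) are illustrative placeholders, not the fitted values from the study:

```python
# Minimal Bayesian Knowledge Tracing step (Corbett & Anderson, 1995).
# p_know  : current estimate that the latent skill is known, P(L)
# correct : whether the student's demonstration was judged correct
# p_slip  : P(wrong answer | skill known)
# p_guess : P(right answer | skill not known)
# p_learn : P(skill transitions to known between opportunities)
def bkt_update(p_know, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    if correct:
        num = p_know * (1 - p_slip)
        den = num + (1 - p_know) * p_guess
    else:
        num = p_know * p_slip
        den = num + (1 - p_know) * (1 - p_guess)
    posterior = num / den              # Bayes update on the observation
    return posterior + (1 - posterior) * p_learn   # learning transition

# Track a skill across a sequence of detector judgments.
p = 0.25   # P(L0): prior probability the skill is already known
for outcome in [True, True, False, True]:
    p = bkt_update(p, outcome)
print(round(p, 3))
```

In the thesis's setting the `correct` flag would come from the data-mined detectors (e.g., "this run was a controlled experiment"), and the framework is extended beyond this vanilla form; the sketch shows only the core recurrence.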

Repeatability of the Adaptation of Pseudomonas fluorescens to Low Glucose

Teselkin, Oleksiy. 30 April 2014
Inspired by Gould, who claimed that life would arrive at a different outcome each time it was allowed to replay from the same beginning, I have attempted to determine the repeatability of the adaptive course of one Pseudomonas fluorescens lineage. In addition, my study aimed to establish whether the likelihood of parallel evolution of two synonymous single-nucleotide substitutions was contingent upon a prior motility-impairing deletion or a prior increase in fitness. Further, the study was designed to provide empirical data on the long-standing question of the effect of starting fitness on the ensuing rate of adaptation. Although no exact replay of the initial evolutionary trajectory was observed, I have demonstrated that the gtsB gene, but not gtsC, is likely to be a mutational hotspot under low glucose, with the recovery of two previously undescribed mutations in gtsB. My data are consistent with the notion that substitutions in gtsB may be contingent upon the Δ35kB (fliJ-PFLU4466) motility-impairing deletion, but not upon the fitness increase associated with it. Finally, the features of the adaptive landscape of P. fluorescens in minimal glucose provide only weak support for Fisher's hypothesis of a decrease in adaptation rate as starting fitness rises. Taken together, these original results reinforce the non-negligible role of history in shaping the outcomes of biological evolution and call for caution in attempting to formulate rigid predictive models of evolutionary change.

Real-world Exploitation and Vulnerability Mitigation of Google/Apple Exposure Notification Contact Tracing

Ellis, Christopher Jordan. January 2021
No description available.

La télévision, média de masse ou média individuel ? De la télévision traditionnelle à la e-télévision / Is television a mass media or an individual media? From traditional TV to e-television

Martin, Valérie. 16 December 2015
Traditional television, the "voice of France," is a mass medium. With a household equipment rate of over 98%, average viewing of more than 3 hours 50 minutes per day, advertising revenue of over 4 billion euros, and appointment-viewing content drawing record audiences, this television rules the living room and brings the family together. Until the 1980s it remained under the control of the political authorities. Under the presidency of François Mitterrand, television was liberalized: new private, commercial channels financed by audiences and advertising appeared, while the public sector remained funded mainly by the licence fee. In the 1990s, the arrival of cable and satellite, followed by digital terrestrial television (TNT) in 2005, brought a considerable increase in the number of channels and of television offerings (subscriptions to cable, to satellite, and, from 1984, to Canal +). Digital technology has revolutionized the television industry, its technologies, and its uses. The number of channels keeps growing exponentially, user-generated content (UGC) is expanding, and social networks foster interactivity. Evolving equipment is changing viewing habits around the small screen, especially among the young: the computer, the tablet, the smartphone, and the connected TV make it possible to watch "television as I want, where I want, when I want." Traditional television, long considered a mass medium, thus tends to become individualized, adapting to each viewer's tastes and desires. The hitherto French ecosystem is literally exploding in the face of the internationalization of content and of audiovisual players that are mostly North American, while regulation, now obsolete, struggles to find a new legal framework at the French and European levels.
Faced with this globalized supply and with new viewing habits, notably delinearized viewing (catch-up TV and video on demand), television nevertheless continues to unite the public around major events (sporting events, political events, the 8 p.m. evening news...). The profound and rapid changes now under way leave great uncertainty over the future of traditional television; unless it can adapt, it could sooner or later disappear.

Anomaly Detection in RFID Networks

Alkadi, Alaa. 1 January 2017
Available security standards for RFID networks (e.g. ISO/IEC 29167) are designed to secure individual tag-reader sessions and do not protect against active attacks that could also compromise the system as a whole (e.g. tag cloning or replay attacks). Proper traffic characterization models of the communication within an RFID network can lead to better understanding of operation under “normal” system state conditions and can consequently help identify security breaches not addressed by current standards. This study of RFID traffic characterization considers two piecewise-constant data smoothing techniques, namely Bayesian blocks and Knuth’s algorithms, over time-tagged events and compares them in the context of rate-based anomaly detection. This was accomplished using data from experimental RFID readings and comparing (1) the event counts versus time if using the smoothed curves versus empirical histograms of the raw data and (2) the threshold-dependent alert-rates based on inter-arrival times obtained if using the smoothed curves versus that of the raw data itself. Results indicate that both algorithms adequately model RFID traffic in which inter-event time statistics are stationary but that Bayesian blocks become superior for traffic in which such statistics experience abrupt changes.
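The rate-based alerting idea can be illustrated with a deliberately simplified sketch: learn baseline inter-arrival statistics from "normal" RFID traffic, then flag events whose gaps deviate sharply. This substitutes a plain mean-and-deviation threshold for the thesis's piecewise-constant smoothing (Bayesian blocks / Knuth's rule), and the timestamps and function names below are made-up illustrations:

```python
# Rate-based anomaly sketch over time-tagged read events (toy data).
from statistics import mean, stdev

def interarrivals(timestamps):
    """Gaps between consecutive time-tagged events."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def alerts(timestamps, baseline_gaps, k=3.0):
    """Indices of events whose inter-arrival time lies more than
    k standard deviations from the baseline mean gap."""
    mu, sigma = mean(baseline_gaps), stdev(baseline_gaps)
    flagged = []
    for i, gap in enumerate(interarrivals(timestamps), start=1):
        if abs(gap - mu) > k * sigma:
            flagged.append(i)
    return flagged

# Baseline gaps observed under "normal" system state (~1 s apart);
# the monitored trace then contains a burst of rapid reads, as a
# tag-cloning or replay attack might produce.
normal_gaps = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]
ts = [0.0, 1.0, 2.1, 3.0, 3.02, 3.04, 4.1]
print(alerts(ts, normal_gaps))   # → [4, 5]
```

A piecewise-constant model such as Bayesian blocks improves on this by letting the "normal" rate itself change in segments, so that legitimate abrupt-but-persistent rate shifts are absorbed into a new block rather than flagged, which matches the thesis's finding that Bayesian blocks handle non-stationary traffic better.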
