91. Enhancing the Internet of Things Architecture with Flow Semantics. DeSerranno, Allen Ronald, 02 December 2017.
Internet of Things (‘IoT’) systems are complex, asynchronous solutions often composed of software and hardware components developed in isolation from each other. These components function with varying degrees of reliability and performance over an inherently unreliable network, the Internet. Many IoT systems are developed in silos and are unable to communicate or interoperate with other systems and platforms. Literature exists on how these systems should be designed, how they should interoperate, and how they could be improved, but practice does not always consult literature.

This work brings together a proposed reference architecture for the IoT and engineering practices for flow semantics found in existing literature with a commercial implementation of an IoT platform. It demonstrates that the proposed IoT reference architecture and flow-service-quality engineering practices, when integrated, can produce a more robust system with increased functionality and interoperability. It shows how such practices can be implemented in a commercial solution and explores the value they provide to the system. This work contributes to the current understanding of how complex IoT systems can be made more reliable and interoperable using reference architectures and flow semantics, and highlights the value of integrating academic solutions with commercial implementations of complex systems.
92. A Behavioral Biometrics User Authentication Study Using Motion Data from Android Smartphones. Maghsoudi, Javid, 03 January 2018.
This is a study of the behavioral biometric of smartphone motion to determine the potential accuracy of authenticating users on smartphone devices. The study used the application Sensor Kinetics Pro and the Weka machine-learning library to analyze accelerometer and gyroscope data. Three experiments were conducted, in spring 2015, fall 2015, and spring 2016. The final experiment in spring 2016 used six Android-based smartphones to capture data from 60 participants; each participant performed 20 trials of two motions, bringing the phone up to eye level for review and then bringing the phone to the ear, resulting in 1,200 runs. The resulting sensor datasets were used for machine-learning training and testing. The data were filtered to remove noise, then aggregated and used as inputs to the Weka machine-learning tool. The study used four classification algorithms: Multilayer Perceptron (MLP), k-Nearest Neighbor (k-NN), Naïve Bayes (N-B), and Support Vector Machine (SVM). The study reached authentication accuracies of up to 93%, supporting the use of behavioral motion biometrics for user authentication. Preliminary studies with smaller numbers of participants in spring 2015 and fall 2015 also produced authentication accuracies above 90%.
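As a rough illustration of the kind of pipeline the abstract describes, the sketch below extracts simple per-axis summary features from motion trials and cross-validates the same four classifier families named above. The study used Weka; scikit-learn is substituted here purely for illustration, and the feature set and hyperparameters are assumptions rather than the study's protocol.

```python
# Hedged sketch: authenticate users from smartphone-motion trials with the four
# classifier families the study names (MLP, k-NN, Naive Bayes, SVM).
# Assumptions: scikit-learn instead of Weka; per-axis summary statistics as features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

def extract_features(trial):
    """trial: (n_samples, 6) array of accelerometer x/y/z and gyroscope x/y/z readings."""
    return np.concatenate([trial.mean(axis=0), trial.std(axis=0),
                           trial.min(axis=0), trial.max(axis=0)])

def evaluate(trials, user_labels):
    """trials: list of (n_samples, 6) arrays; user_labels: one user id per trial."""
    X = np.array([extract_features(t) for t in trials])
    y = np.array(user_labels)
    classifiers = {
        "MLP": MLPClassifier(max_iter=2000),
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "Naive Bayes": GaussianNB(),
        "SVM": SVC(kernel="rbf"),
    }
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")
```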
93. Contention Alleviation in Network-on-Chips. Xiang, Xiyue, 21 December 2017.
In a network-on-chip (NoC) based system, the NoC is a shared resource among multiple processor cores. Requests generated by different applications running on different cores can create severe contention in NoCs, and this contention can jeopardize system performance and power efficiency in several ways. First and foremost, we find that contention in NoCs can induce inter-application interference, degrading overall system performance, preventing the fair progress of different applications, and causing starvation of unfairly treated applications. We propose the NoC Application Slowdown (NAS) Model, the first online model that accurately estimates how much network delays due to interference contribute to the overall stall time of each application. We use NAS to develop Fairness-Aware Source Throttling (FAST), a mechanism that employs slowdown predictions to control the network injection rates of applications in a way that minimizes system unfairness. Furthermore, although removing buffers from the constituent routers can reduce power consumption and hardware complexity, a bufferless NoC is subject to growing deflection caused by contention, leading to severe performance degradation and squandering its power-saving potential. We then propose Deflection Containment (DeC) for the bufferless NoC to address its notorious shortcoming of excessive deflection, improving performance and reducing power. With a link added to each router for bridging subnetworks (whose aggregated link width equals a given value, say, 128 bits), DeC lets a contending flit in one subnetwork be forwarded to another subnetwork instead of being deflected, yielding substantial deflection reduction and greatly enriching path diversity. In addition, the router microarchitecture under DeC is restructured to shorten the critical path and raise network bandwidth. Last but not least, besides 1-to-1 flows, growing core counts urgently require effective hardware support to alleviate the contention caused by 1-to-many and many-to-1 flows. We propose Carpool, the first bufferless NoC optimized for 1-to-many and many-to-1 traffic. Carpool adaptively forks new flit replicas and performs traffic aggregation at appropriate intermediate routers to lessen bandwidth demands and reduce contention. We propose the microarchitecture of Carpool routers and develop parallel port allocation, which supports multicast and reduces critical paths to improve network bandwidth.
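To make the fairness-aware throttling idea concrete, the toy sketch below adjusts per-application injection rates from slowdown estimates of the kind NAS would supply. It is a simplified software analogue, not the FAST hardware controller: the averaging rule, step size, and rate bounds are all assumptions for illustration.

```python
# Toy sketch in the spirit of FAST: given per-application slowdown estimates
# (as a NAS-like model would provide), throttle applications experiencing
# below-average slowdown and let unfairly slowed applications inject more.
# The thresholds and rate steps here are assumptions, not the published design.
def throttle_step(slowdowns, injection_rates, step=0.05,
                  min_rate=0.1, max_rate=1.0):
    """slowdowns, injection_rates: dicts keyed by application id."""
    avg = sum(slowdowns.values()) / len(slowdowns)
    new_rates = {}
    for app, sd in slowdowns.items():
        rate = injection_rates[app]
        if sd < avg:      # app is doing comparatively well: throttle it
            rate -= step
        else:             # app is unfairly slowed: allow more injection
            rate += step
        new_rates[app] = min(max_rate, max(min_rate, rate))
    return new_rates

# Example: app B suffers the largest slowdown, so its rate rises while A and C are throttled.
rates = throttle_step({"A": 1.2, "B": 2.8, "C": 1.5},
                      {"A": 1.0, "B": 0.6, "C": 1.0})
```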
94. Computational methods for multi-omic models of cell metabolism and their importance for theoretical computer science. Angione, Claudio, January 2015.
To paraphrase Stan Ulam, a Polish mathematician who became a leading figure in the Manhattan Project, in this dissertation I focus not only on how computer science can help biologists, but also on how biology can inspire computer scientists. On one hand, computer science provides powerful abstraction tools for metabolic networks. Cell metabolism is the set of chemical reactions taking place in a cell, with the aim of maintaining the living state of the cell. Due to the intrinsic complexity of metabolic networks, predicting the phenotypic traits resulting from a given genotype and metabolic structure is a challenging task. To this end, mathematical models of metabolic networks, called genome-scale metabolic models, contain all known metabolic reactions in an organism and can be analyzed with computational methods. In this dissertation, I propose a set of methods to investigate models of metabolic networks. These include multi-objective optimization, sensitivity, robustness and identifiability analysis, and are applied to a set of genome-scale models. Then, I augment the framework to predict metabolic adaptation to a changing environment. The adaptation of a microorganism to new environmental conditions involves shifts in its biochemical network and in the gene expression level. However, gene expression profiles do not provide a comprehensive understanding of the cellular behavior. Examples are the cases in which similar profiles may cause different phenotypic outcomes, while different profiles may give rise to similar behaviors. In fact, my idea is to study the metabolic response to diverse environmental conditions by predicting and analyzing changes in the internal molecular environment and in the underlying multi-omic networks. I also adapt statistical and mathematical methods (including principal component analysis and hypervolume) to evaluate short term metabolic evolution and perform comparative analysis of metabolic conditions. On the other hand, my vision is that a biomolecular system can be cast as a 'biological computer', therefore providing insights into computational processes. I therefore study how computation can be performed in a biological system by proposing a map between a biological organism and the von Neumann architecture, where metabolism executes reactions mapped to instructions of a Turing machine. A Boolean string represents the genetic knockout strategy and also the executable program stored in the 'memory' of the organism. I use this framework to investigate scenarios of communication among cells, gene duplication, and lateral gene transfer. Remarkably, this mapping allows estimating the computational capability of an organism, taking into account also transmission events and communication outcomes.
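Genome-scale metabolic models of the kind described here are commonly analysed with constraint-based methods such as flux balance analysis, in which an objective flux is maximised subject to steady-state mass balance and flux bounds. The sketch below shows that core linear program on a deliberately tiny, made-up network; it illustrates the general analysis approach only and is not the dissertation's specific multi-omic or multi-objective pipeline.

```python
# Minimal flux-balance-analysis sketch: maximise an objective flux subject to
# steady-state mass balance S·v = 0 and per-reaction flux bounds.
# The stoichiometric matrix below is a toy example; real genome-scale models
# have thousands of reactions and are handled with dedicated COBRA-style tools.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0],      # metabolite A: produced by r1, consumed by r2
              [0,  1, -1]])     # metabolite B: produced by r2, consumed by r3
bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for reactions r1..r3
objective = np.array([0, 0, 1])        # maximise flux through r3

# linprog minimises, so negate the objective to maximise it.
res = linprog(-objective, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
optimal_fluxes = res.x                  # [10, 10, 10] for this toy network
```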
95. Measurement-driven characterization of the mobile environment. Soroush, Hamed, 01 January 2013.
The concurrent deployment of high-quality wireless networks and large-scale cloud services offers the promise of secure, ubiquitous access to a seemingly limitless amount of content. However, as users' expectations have grown more demanding, the performance and connectivity failures endemic to the existing networking infrastructure have become more apparent. These problems are in general exacerbated by user mobility. The work presented in this dissertation demonstrates that the performance of services for mobile users is significantly affected by environmental factors that are hard to characterize ahead of deployment. This work includes the development and evaluation of large-scale mobile experimentation infrastructures (DOME, GENI) that facilitate longitudinal studies of today's technologically diverse mobile environment over a period of four years. Based on the insights gained from these studies, a mechanism called Spider is presented that efficiently utilizes Wi-Fi deployments in highly mobile scenarios to achieve higher throughput and improved connectivity. This work presents the first in-depth analysis of the performance of attempting concurrent AP connections from highly mobile clients. Spider provides a 400% improvement in throughput and a 54% improvement in connectivity over stock Wi-Fi implementations. The last part of this dissertation demonstrates that a cellular network's performance differs predictably across geographical locations within a town. Consequently, patterns of data transmission between a server on the Internet and a moving cell phone can reveal the geographic travel path of that phone. While the GPS and location-awareness features on phones explicitly share this information, phone users will likely be surprised to learn that disabling these features does not suffice to prevent a remote server from determining their travel path. We showed that a simple HMM-based classifier can discover and exploit features of the geography surrounding possible travel paths to determine the path a phone took, using only data visible at the remote server on the Internet. Having gathered hundreds of traces over a large geographic area, we showed that the HMM-based technique is able to distinguish mobile phones from stationary phones with up to 94.7% accuracy. Routes taken by each mobile phone could be distinguished with up to 75.9% accuracy using the same technique. This dissertation proposes new tools and techniques for characterizing the impact of the environment on the performance of mobile networks. The concrete set of results and insights gained from this work demonstrates mechanisms for improving connectivity and throughput in highly mobile scenarios, while at the same time raising new challenges for maintaining the privacy of mobile users.
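As a hedged sketch of what an HMM-based route classifier of this kind might look like, the code below fits one Gaussian HMM per candidate route from server-side throughput traces and assigns a new trace to the route whose model gives it the highest likelihood. The use of hmmlearn, the single throughput feature, and the number of hidden states are all assumptions for illustration, not details taken from the dissertation.

```python
# Hedged sketch: per-route Gaussian HMMs trained on server-visible throughput
# traces; a new trace is classified by maximum log-likelihood across routes.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_route_models(traces_by_route, n_states=4):
    """traces_by_route: dict route_name -> list of 1-D throughput arrays."""
    models = {}
    for route, traces in traces_by_route.items():
        X = np.concatenate(traces).reshape(-1, 1)   # stack observations
        lengths = [len(t) for t in traces]          # per-trace sequence lengths
        models[route] = GaussianHMM(n_components=n_states).fit(X, lengths)
    return models

def classify_trace(models, trace):
    """Return the route whose HMM assigns the trace the highest log-likelihood."""
    obs = np.asarray(trace).reshape(-1, 1)
    return max(models, key=lambda route: models[route].score(obs))
```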
96. A study of e-learning technology integration by preservice science teachers. Olugbara, Cecilia Temilola, January 2017.
A thesis submitted to the Faculty of Education in fulfillment of the requirements for the Degree of Doctor of Education in Science Education in the Department of Mathematics, Science and Technology Education at the University of Zululand, 2017 / This study investigated possible factors predicting e-learning technology integration into the teaching and learning of science subjects by preservice science teachers. An e-learning technology integration model was developed in which factors such as intention (INT), attitude (ATT), skill (SKL) and flow experience (FLW) served as possible precursors of e-learning technology integration. This was done in response to the persistent gap between the intention to integrate e-learning technology and the actual integration of e-learning technologies. To close this gap, the study developed a model to predict e-learning technology integration by the research sample. More specifically, the model hypothesised that quality consciousness and innovation consciousness moderated the intention-integration gap. The proposed model was first pilot-tested on a sample of 30 preservice science teachers (PSSTs) before it was applied to the main study, which comprised a research sample of 100 final-year PSSTs at the University of Zululand, KwaZulu-Natal Province, South Africa. The study was located within the mixed-methods research paradigm, based on a survey research design. Data collection was carried out using a semi-structured questionnaire, which allowed for the collection of both quantitative and qualitative data. Quantitative data were analysed using Partial Least Squares (PLS) Structural Equation Modelling (SEM), while qualitative data were analysed using a hermeneutic content analysis approach. The results of the study were, firstly, that the proposed model explained 44% of the PSSTs' integration of e-learning technologies into the teaching and learning of science subjects, and that skill was the most significant and strongest factor predicting the PSSTs' integration of e-learning technologies; flow experience was the second most important factor, followed by intention and, lastly, attitude. Secondly, the study revealed that quality consciousness and innovation consciousness significantly moderated the gap between the intention to integrate e-learning technologies and the actual integration of e-learning technologies, with quality consciousness having the stronger moderating effect. Thirdly, the study revealed that some preservice science teachers were able to utilise e-learning technologies during the period of teaching practice for instructional preparation, instructional delivery, and to facilitate learning. However, some PSSTs were unable to utilise e-learning technologies during teaching practice, ostensibly because of a lack of e-learning facilities in the schools. Some recommendations are made based on the findings of the study. These relate to the management of e-learning at the university and in schools, and to implications for policy.
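The moderation hypothesis described above (quality consciousness and innovation consciousness moderating the intention-integration link) is the kind of effect that, in a simplified setting, can be probed with an interaction term. The study itself used PLS-SEM; the ordinary-least-squares sketch below, with hypothetical column names, is only a rough stand-in for that analysis.

```python
# Simplified stand-in for the study's PLS-SEM moderation test: an OLS model
# with an interaction term. Column names are hypothetical, not the study's items.
import pandas as pd
import statsmodels.formula.api as smf

def test_moderation(df: pd.DataFrame):
    """df columns (hypothetical): integration, intention, quality_consciousness."""
    model = smf.ols("integration ~ intention * quality_consciousness", data=df).fit()
    # A significant intention:quality_consciousness coefficient indicates moderation.
    return model.summary()
```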
97. Facilitating teacher participation in intelligent computer tutor design: Tools and design methods. Murray, Thomas Joseph, 01 January 1992.
This work addresses the widening gap between research in intelligent tutoring systems (ITSs) and practical use of this technology by the educational community. In order to ensure that ITSs are effective, teachers must be involved in their design and evaluation. We have followed a user-participatory design process to build a set of ITS knowledge acquisition tools that facilitate rapid prototyping and testing of curriculum, and are tailored for usability by teachers. The system (called KAFITS) also serves as a test-bed for experimentation with multiple tutoring strategies. The design includes novel methodologies for tutoring strategy representation (Parameterized Action Networks) and overlay student modeling (a "layered" student model), and incorporates considerations from instructional design theory. It also allows for considerable student control over the content and style of the information presented. Highly interactive graphics-based tools were built to facilitate design, inspection, and modification of curriculum and tutoring strategies, and to monitor the progress of the tutoring session. Evaluation of the system includes a sixteen-month case study of three educators (one being the domain expert) using the system to build a tutor for statics (forty topics representing about four hours of on-line instruction), testing the tutor on a dozen students, and using test results to iteratively improve the tutor. Detailed throughput analysis indicates that the amount of effort to build the statics tutor was, surprisingly, comparable to similar figures for building (non-intelligent) conventional computer-aided instructional systems. Few ITS projects focus on educator participation, and this work is the first to empirically study knowledge acquisition for ITSs. Results of the study also include: a recommended "design process" for building ITSs with educator participation; guidelines for training educators; recommendations for conducting knowledge acquisition sessions; and design tradeoffs for knowledge representation architectures and knowledge acquisition interfaces.
98. Virtual Fetal Pig Dissection As An Agent Of Knowledge Acquisition And Attitudinal Change In Female High School Biology Students. Maloney, Rebecca, 20 December 2002.
One way to determine if all students can learn through the use of computers is to introduce a lesson taught completely via computers and compare the results with those gained when the same lesson is taught in a traditional manner. This study attempted to determine if a virtual fetal pig dissection can be used as a viable alternative to an actual dissection for females enrolled in high school biology classes by comparing the knowledge acquisition and attitudinal change between the experimental (virtual dissection) and control (actual dissection) groups. Two hundred and twenty-four students enrolled in biology classes in a suburban all-girl parochial high school participated in this study. Female students in an all-girl high school were chosen because research shows differences in science competency and computer usage between the genders that may mask the performance of females on computer-based tasks in a science laboratory exercise. Students who completed the virtual dissection scored significantly higher on the practical and objective tests that were used to measure knowledge acquisition. Attitudinal change was measured by examining the students' attitudes toward dissections, toward computer usage in the classroom, and toward biology both before and after the dissections, using pre- and post-surveys. The virtual dissection group showed a significant positive gain score in attitude toward dissections and a significant negative gain score in attitude toward virtual dissections. Attitudinal changes toward computers and biology were not significant. A purposefully selected sample of the students was interviewed, and a sample of the students' daily dissection journals was collected, as data highlighting their thoughts and feelings about the dissection experience. Further research is suggested to determine whether a virtual laboratory experience can substitute for an actual dissection, or may serve as an enhancement to one.
99. Development of superconducting thin films for use in SRF cavity applications. Wilde, Stuart, January 2017.
Superconducting thin films are a possible alternative to bulk niobium for superconducting radio frequency cavity applications. Thin film cavities have produced larger Q0 than bulk niobium at low accelerating voltages [1], are less susceptible to external magnetic fields and therefore require less magnetic shielding than bulk niobium cavities [2], and can benefit from substrates which conduct heat more effectively than bulk niobium [3]. The major drawback for current thin film cavity technology is the large Q slope which is observed above accelerating gradients of 6-7 MV/m. The mechanism for the Q slope is not yet fully understood. Theories have been suggested but are not accepted by everyone within the scientific community [2, 4, 5, 6, 7]. It is assumed that a better understanding of the physical properties of superconducting films is required before the origins of the sharp Q slope can be elucidated. This study has been conducted to better understand the physical properties of superconducting thin films deposited by the magnetron sputtering process. In particular, superconducting niobium films have been deposited by high power impulse magnetron sputtering (HiPIMS) and tested by a wide range of analytical techniques as a function of the substrate temperature and applied bias during deposition. Analytical techniques which have been used include x-ray diffraction crystallography, Rutherford backscattering spectroscopy, scanning electron microscopy, residual resistance ratio, DC magnetometry and RF surface resistance measurements. Results showed that applying a bias during deposition increased the energy of bombarding ions and enhanced the rates of surface diffusion and defect annihilation within the microstructure of a growing niobium film. However, large numbers of random complex defects formed once the energy of bombarding ions became too large. The systematic approach that was described to investigate the changing morphological and DC superconducting properties of deposited films, as a function of the applied bias, allowed the identification of which process conditions produce the fewest random complex defects. The same systematic investigations could be applied to any HiPIMS deposition facility to provide similar results. An important observation during the study is that the initial substrate conditions have a large influence on the properties of a deposited niobium film. Niobium films deposited onto polycrystalline copper substrate that was pre-annealed at 700 °C prior to deposition displayed more stable magnetic flux pinning, larger RRR, and an enhanced resistance to the onset of flux penetration than was observed for films deposited with a wide range of process conditions onto as-received copper substrate. Superconductors other than niobium have been successfully deposited by HiPIMS and tested. Niobium titanium nitride thin films displayed a superconducting transition temperature up to 16.7 K, with a normal-state resistivity as small as 45±7 μΩcm. The findings suggest that similar niobium titanium nitride thin films could produce smaller RF surface resistance than bulk niobium cavities at 4.2 K.
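For context on one of the metrics quoted above, the residual resistance ratio (RRR) is conventionally defined as the ratio of the room-temperature resistance to the residual normal-state resistance just above the superconducting transition. The reference temperatures below are the common convention for niobium and are assumed here rather than taken from the thesis.

```latex
% Conventional definition, assumed here; the thesis may adopt slightly
% different reference temperatures.
\mathrm{RRR} = \frac{R(300\,\mathrm{K})}{R(10\,\mathrm{K})},
\qquad \text{with } R(10\,\mathrm{K}) \text{ taken just above } T_c \approx 9.2\,\mathrm{K} \text{ for Nb.}
```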
100. Evaluation of a professional development program on integrating technology into middle schools: classroom environment and student attitudes. Biggs, Ellyn M., January 2008.
The Alliance+ project is a teacher professional development program that integrates technology into mathematics and science lessons. The effectiveness of this innovative program was evaluated in terms of students' perceptions of the classroom learning environment and their attitudes towards science/mathematics. The sample consisted of 759 students of seven mathematics/science teachers (four Alliance+ participants and three non-participants) in one middle school in Miami-Dade County, Florida. The students responded to learning environment scales based on the Constructivist Learning Environment Survey (CLES) and the What Is Happening In this Class? (WIHIC) questionnaires to assess their perceptions of the classroom learning environment. Additionally, they responded to an attitude scale modeled on the Test of Science-Related Attitudes (TOSRA) to assess their attitudes towards mathematics/science. It was found that Alliance+ teachers were more successful than the non-Alliance+ teachers in promoting a classroom environment with more cooperation among students during the science/mathematics lessons. Additionally, the Alliance+ professional development model was differentially effective for mathematics and science teachers in terms of three learning environment scales (namely, Teacher Support, Cooperation, and Critical Voice), but not in terms of students' attitudes to science. In terms of Cooperation, Alliance+ teachers were more effective than non-Alliance+ teachers for mathematics, but comparable in effectiveness to non-Alliance+ teachers for science. For Critical Voice, Alliance+ teachers were slightly more effective than non-Alliance+ teachers for mathematics, but considerably less effective than non-Alliance+ teachers for science. In terms of Teacher Support, Alliance+ teachers were less effective than non-Alliance+ teachers for science, but comparable in effectiveness to non-Alliance+ teachers for mathematics. However, teachers who did not participate in the Alliance+ project were more effective than the teachers who participated in the Alliance+ project in providing a positive learning environment in which the students perceived more teacher support, and in promoting positive attitudes towards science/mathematics. Qualitative data revealed that the Alliance+ teachers had not received sufficient support from their school administrators and Alliance+ trainers and lacked the resources that were necessary for them to implement the project successfully, which could possibly explain the quantitative results in favor of the non-Alliance+ teachers. This study also investigated outcome-environment associations. It was found that associations existed between students' attitudes towards science/mathematics and their perceptions of the classroom learning environment (especially personal relevance, teacher support, and cooperation).