661

Explicating a Biological Basis for Chronic Fatigue Syndrome

Abou-Gouda, Samar A. 18 December 2007
In the absence of clinical markers for Chronic Fatigue Syndrome (CFS), the search for a biological basis for the disease remains open. Data-mining techniques have been widely employed to analyze biomedical data describing different aspects of CFS, but the inconsistency of the results of these studies reflects the uncertainty regarding the real basis of the disease. In this thesis, we show that CFS has a biological basis that is more readily detectable in gene expression data than in blood profile or Single Nucleotide Polymorphism (SNP) data. Using random forests, the analysis of gene expression data achieves a prediction accuracy of approximately 89%. We also identify sets of differentially expressed candidate genes that might contribute to CFS. We show that integrating data spanning multiple levels of the biological scale might reveal further insights into CFS; using integrated data, we achieve a prediction accuracy of approximately 91%. We find that Singular Value Decomposition (SVD) is a useful technique for visualizing the performance of random forests.
Thesis (Master, Computing) -- Queen's University, 2007.
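
For readers unfamiliar with the methods named in this abstract, here is a minimal sketch of random-forest classification plus an SVD projection of the samples. The data, shapes, and parameters are hypothetical placeholders (random data scores near chance), not the thesis's actual pipeline.

```python
# Sketch: random-forest prediction of CFS status from gene expression data,
# with candidate-gene ranking and an SVD projection for visualization.
# X and y are random placeholders, NOT real CFS data.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))   # 100 samples x 500 expression features
y = rng.integers(0, 2, size=100)  # binary CFS / non-CFS labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Rank candidate genes by the forest's impurity-based feature importances.
clf.fit(X, y)
top_genes = np.argsort(clf.feature_importances_)[::-1][:20]

# Project samples onto the top two singular directions for a 2-D view.
coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
```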
662

Various stabbing problems in computational geometry

Doskas, Michael. January 1987
No description available.
663

Circuits and programs for handling nuclear data.

Kuchela, Kavudavadi Seshagiriah January 1968
No description available.
664

Computer Go-Muku

Yuen, Jeanne Y. Y. January 1988
No description available.
665

Data compression systems.

Husson, Georges Eugene. January 1970
No description available.
666

Education and technology : a critical study of introduction of computers in Pakistani public schools

Arshad-Ayaz, Adeela. January 2006
The importance of technology in education cannot be overstated, and there are compelling reasons for developing nations like Pakistan to introduce technology into their educational systems. Nevertheless, the approach and methods used to introduce technology in schools are premised on an economic ideology and based on a techno-centric curriculum that leads to new forms of dependency by keeping individuals from controlling the decisions that significantly shape their lives.

The introduction of technology does not automatically guarantee enhanced learning or effective teaching. Technology in education should be used as a tool to increase communication, create awareness, break down existing hierarchies, develop new styles of creating knowledge, and make schooling and education more inclusive. A merely technical use of computers in education does nothing to empower students.

The techno-centric introduction of technology in Pakistani public schools is likely to produce inequality. A number of practices in Pakistan's educational and social structure will have to change for the potential of technology to be fully realized: a shift is needed from 'learning about computers' to 'using computers in learning', from 'acquisition of limited skills' to 'construction of knowledge', from 'teacher dependency' to 'independent inquiry', and from 'teacher-centered' to 'student-centered' teaching methods.

However, such a change can only take place within a critical framework of education. The critical model, based on an integrated curriculum, treats the computer not as an isolated subject but as a tool that helps learners enhance their critical-thinking skills and seek alternatives for solving problems.

It is therefore important for educational policy-makers to realize that any effort to introduce technology in the educational realm requires theoretical discussion and societal dialogue to arrive at a framework for technology's place in socio-educational contexts. Pakistan needs to develop and introduce educational technology that addresses its unique economic, social, cultural, and human-development requirements, given its present level of development.
667

TCP Adaptation Framework in Data Centers

Ghobadi, Monia 09 January 2014
Congestion control has been studied extensively for many years. Today, the Transmission Control Protocol (TCP) is used in a wide range of networks (LAN, WAN, data center, campus network, enterprise network, etc.) as the de facto congestion control mechanism. Despite its common usage, TCP operates in these networks with little knowledge of the underlying network or traffic characteristics. As a result, it is doomed to continuously increase or decrease its congestion window size in order to handle changes in network or traffic conditions. Thus, TCP frequently overshoots or undershoots the ideal rate, making it a "Jack of all trades, master of none" congestion control protocol. In light of the emerging popularity of centrally controlled Software-Defined Networks (SDNs), we ask whether we can take advantage of the information available at the central controller to improve TCP. Specifically, in this thesis, we examine the design and implementation of OpenTCP, a dynamic and programmable TCP adaptation framework for SDN-enabled data centers. OpenTCP gathers global information about the status of the network and traffic conditions through the SDN controller and uses this information to adapt TCP. OpenTCP periodically sends updates to end-hosts, which, in turn, update their behaviour using a simple kernel module. We first present two real-world TCP adaptation experiments in depth: (1) using TCP pacing in inter-data-center communications with shallow buffers, and (2) using Trickle to rate-limit TCP video streaming. We explain the design, implementation, limitations, and benefits of each TCP adaptation to highlight the potential power of having a TCP adaptation framework in today's networks. We then discuss the architectural design of OpenTCP, as well as its implementation and deployment at SciNet, Canada's largest supercomputer center. Furthermore, we study use cases of OpenTCP using the ns-2 network simulator. We conclude that OpenTCP-based congestion control simplifies the process of adapting TCP to network conditions, leads to improvements in TCP performance, and is practical in real-world settings.
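
The overshoot/undershoot behaviour described above comes from TCP's additive-increase/multiplicative-decrease (AIMD) control loop. A toy sketch of that dynamic against a fixed link capacity (parameters hypothetical; this is not OpenTCP's mechanism):

```python
# Toy AIMD loop: the congestion window creeps up additively until it
# exceeds capacity (overshoot), then is cut multiplicatively (undershoot),
# so it oscillates around the ideal rate instead of settling on it.
def aimd(capacity, rounds=50, alpha=1.0, beta=0.5):
    cwnd, trace = 1.0, []
    for _ in range(rounds):
        if cwnd > capacity:   # a loss signals that capacity was exceeded
            cwnd *= beta      # multiplicative decrease
        else:
            cwnd += alpha     # additive increase
        trace.append(cwnd)
    return trace

print(aimd(capacity=20.0)[-10:])  # oscillates around 20, never settles
```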
669

Mining frequent itemsets from uncertain data: extensions to constrained mining and stream mining

Hao, Boyu 19 July 2010
Most studies on frequent itemset mining focus on mining precise data. However, there are situations in which the data are uncertain. This leads to the mining of uncertain data. There are also situations in which users are only interested in frequent itemsets that satisfy user-specified aggregate constraints. This leads to constrained mining of uncertain data. Moreover, floods of uncertain data can be produced in many other situations. This leads to stream mining of uncertain data. In this M.Sc. thesis, we propose algorithms to deal with all these situations. We first design a tree-based mining algorithm to find all frequent itemsets from databases of uncertain data. We then extend it to mine databases of uncertain data for only those frequent itemsets that satisfy user-specified aggregate constraints and to mine streams of uncertain data for all frequent itemsets. Experimental results show the effectiveness of all these algorithms.
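
A standard way to make "frequent" precise over uncertain data, used throughout this line of work, is expected support under independent item-existence probabilities. A minimal sketch with hypothetical transactions (the thesis's tree-based algorithms compute this without enumerating candidates as done here):

```python
# Expected support of an itemset over an uncertain database: sum, over
# transactions containing the itemset, of the product of its items'
# existence probabilities. Transactions below are hypothetical.
from itertools import combinations
from math import prod

db = [  # each transaction maps item -> probability that it is present
    {"a": 0.9, "b": 0.7},
    {"a": 0.5, "b": 0.8, "c": 0.4},
    {"a": 1.0, "c": 0.6},
]

def expected_support(itemset, db):
    return sum(prod(t[i] for i in itemset)
               for t in db if all(i in t for i in itemset))

minsup = 1.0  # expected-support threshold
items = sorted({i for t in db for i in t})
frequent = [s for k in (1, 2) for s in combinations(items, k)
            if expected_support(s, db) >= minsup]
print(frequent)  # [('a',), ('b',), ('c',), ('a', 'b')]
```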
670

Frequent pattern mining of uncertain data streams

Jiang, Fan January 2011
When dealing with uncertain data, users may not be certain about the presence of an item in the database. For example, due to inherent instrumental imprecision or errors, data collected by sensors are usually uncertain. In various real-life applications, uncertain databases are not necessarily static; new data may arrive continuously and at a rapid rate. These uncertain data can arrive in batches, forming a data stream. To discover useful knowledge in the form of frequent patterns from streams of uncertain data, algorithms have been developed that use the sliding window model for processing and mining data streams. However, for some applications, the landmark window model and the time-fading model are more appropriate. In this M.Sc. thesis, I propose tree-based algorithms that use the landmark window model or the time-fading model to mine frequent patterns from streams of uncertain data. Experimental results show the effectiveness of our algorithms.
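
A minimal sketch of the time-fading model mentioned above: contributions of older batches to an item's expected support are discounted by a decay factor each time a new batch arrives, so recent data weighs more. The decay value and batches are hypothetical:

```python
# Time-fading stream summary: fade all accumulated expected supports by
# `decay` per batch, then add the new batch's expected supports.
def faded_supports(batches, decay=0.8):
    support = {}
    for batch in batches:             # each batch: item -> expected support
        for item in support:
            support[item] *= decay    # older batches count for less
        for item, es in batch.items():
            support[item] = support.get(item, 0.0) + es
    return support

print(faded_supports([{"a": 2.1, "b": 0.9}, {"a": 1.4}, {"b": 2.0}]))
```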
