201

Aggregation, dissemination and filtering : controlling complex information flows in networks

Banerjee, Siddhartha 25 October 2013
Modern-day networks, both physical and virtual, are designed to support increasingly sophisticated applications based on complex manipulation of information flows. On the flip side, the ever-growing scale of the underlying networks necessitates the use of low-complexity algorithms. Exploring this tension requires an understanding of the relation between these flows and the network structure. In this thesis, we undertake a study of three such processes: aggregation, dissemination and filtering. In each case, we characterize how the network topology imposes limits on these processes, and how one can use knowledge of the topology to design simple yet efficient control algorithms. Aggregation: We study data aggregation in sensor networks via in-network computation, i.e., via combining packets at intermediate nodes. In particular, we are interested in maximizing the refresh rate of repeated/streaming aggregation. For a particular class of functions, we characterize the maximum achievable refresh rate in terms of the underlying graph structure; furthermore, we develop optimal algorithms for general networks, as well as a simple distributed algorithm for acyclic wired networks. Dissemination: We consider dissemination processes on networks via intrinsic peer-to-peer transmissions aided by external agents: sources with bounded spreading power, but unconstrained by the network. Such a model captures many static (e.g., long-range links) and dynamic/controlled (e.g., mobile nodes, broadcasting) models for long-range dissemination. We explore the effect of external sources for two dissemination models: spreading processes, wherein nodes once infected remain so forever, and epidemic processes, in which nodes can recover from the infection. Our main results demonstrate (i) the role of graph structure, and (ii) the power of random strategies.
In spreading processes, we show that external agents dramatically reduce the spreading time in networks that are spatially constrained; furthermore, random policies are order-wise optimal. In epidemic processes, we show that to cause long-lasting epidemics, external sources must scale with the number of nodes; however, the strategies can be random. Filtering: A common phenomenon in modern recommendation systems is the use of user feedback to infer the 'value' of an item to other users, resulting in an exploration vs. exploitation trade-off. We study this in a simple, natural model, where an 'access graph' constrains which user is allowed to see which item, and the number of items and the number of item-views are of the same order. We want algorithms that recommend relevant content in an online manner (i.e., instantaneously on user arrival). To this end, we consider both finite-population (i.e., with a fixed set of users and items) and infinite-horizon settings (i.e., with user/item arrivals and departures); in each case, we design algorithms with guarantees on the competitive ratio for an arbitrary user. Conversely, we also present upper bounds on the competitive ratio, which show that in many settings our algorithms are order-wise optimal.
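The in-network computation described in the aggregation part can be illustrated with a toy sketch: each sensor combines its children's partial aggregates with its own reading and forwards a single packet upstream. The tree, the readings, and the choice of the max-aggregate are illustrative assumptions, not the thesis's algorithm.

```python
# Toy in-network aggregation on a sensor tree: each node merges its
# children's partial results with its own reading before forwarding,
# so every link carries one packet instead of one per source.
# Topology, readings, and the max-aggregate are illustrative assumptions.

tree = {            # parent -> children; node 0 is the sink
    0: [1, 2],
    1: [3, 4],
    2: [],
    3: [],
    4: [],
}
readings = {0: 5, 1: 9, 2: 3, 3: 7, 4: 12}

def aggregate(node):
    """Return the max over the subtree rooted at `node`."""
    value = readings[node]
    for child in tree[node]:
        value = max(value, aggregate(child))
    return value

print(aggregate(0))  # the sink learns the network-wide max: 12
```

Any associative, commutative function (sum, max, mean with counts) can be aggregated this way; the refresh rate studied in the thesis concerns how often such a computation can be repeated on a given topology.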
202

Social network influence on relational termination and renewal

Hansen, Ryan Nicholas 24 March 2014
This study’s focus was on individuals who are currently in romantic relationships or who were in one within the past six months. The goal of the study was to learn more about the relationship between the perceived difficulty of renewing a recently terminated relationship, or terminating a current romantic relationship, and social network closeness, integration, and levels of perceived support for the relationship from network members. The findings support the prediction that an individual’s perceived difficulty of terminating an intact relationship was positively related to the individual’s closeness with their partner’s social network members, the partner’s closeness with the individual’s social network members, and the perceived support for the relationship from network members.
203

Bowling online : smartphones, mobile messengers, and mobile social games for Korean teen girls

Seo, Hogeun 16 February 2015
Due to their arduous schedules, Korean high school students have little time to socialize with their peers face-to-face. Because of this, socializing in online environments is important to them. Using smartphone applications, Korean high school girls are creating their own cultural practices as they socialize. However, media repeatedly report concerns about adolescents’ excessive use of smartphones, and the public has begun to worry about children’s media dependence. In exploring these phenomena, I pose four research questions: 1) What do smartphones mean to South Korean high school girls? 2) How do South Korean high school girls socialize through mobile messengers, such as Kakao Talk, and how are these activities related to their social capital and social networks? 3) How do South Korean high school girls socialize through mobile social games connected to mobile messengers, and how are these activities related to their social capital and social networks? 4) How is Korean high school girls’ attachment to smartphones related to smartphone addiction? For this research, I conducted focus group interviews with 23 Korean high school girls about their smartphone use. The findings revealed that 1) South Korean high school girls established an exceptional attachment to smartphones; 2) interviewees were compensating for the deficiency of offline socializing by establishing alternative online communities through smartphone messengers, and these social behaviors were increasing their bonding social capital; 3) Korean high school girls were interacting with their strong ties through mobile social games, while interactions with their weak ties were limited and superficial; and 4) Korean high school girls were at risk for smartphone addiction according to the existing criteria for media addiction.
204

To investigate and evaluate a prototype for a remote database access protocol

Haughton, Howard January 1990
In the past, techniques for specifying, verifying and implementing protocols have taken on a somewhat ad hoc (non-uniform) and informal nature. This lack of uniformity has resulted in an abundance of techniques and methodologies for analysing protocols, most of which are applicable only to protocols of small complexity. Typically, different techniques are applied to various stages of a protocol development without an underlying formal basis for their integrated application. As a result, there may be no way to guarantee that subsequent stages of a development represent correct realisations of earlier ones. This thesis aims to address this problem by describing unified frameworks within which: 1) a formal theoretical foundation is laid for specifying, verifying and implementing protocols; 2) a knowledge-based system is used for the formal development of a certain class of protocols. A number of limitations have been identified in the approach taken for developing the frameworks: a) the lack of 'compositional' expressiveness of the algebraic specification language, which makes it difficult to effectively analyse concurrently executing processes of protocols; b) the lack of support for addressing performance-related issues, which makes it difficult to compare protocols with respect to how long they take to achieve some data-processing task; c) the protocol derivation algorithm can prove cumbersome in its application and may require a significant amount of domain knowledge (about types of 'primitives') in order to be machine-automated; d) the knowledge-based framework is currently limited to supporting the development of end-to-end protocols. This last point is not a serious problem, however, as the ideas and principles applied in developing these protocols form the basis for work in analysing other types.
The above limitations form the basis for future work, which will aim to address the problems stated. The thesis is in five main parts: i) a description of various formalisms used in the past to specify protocols; from this analysis, criteria are developed for assessing the relative merits of these formalisms, with a view towards choosing one such technique to be employed in specifying protocols; ii) a formal development of the protocol, which includes a discussion of automatic theorem proving via a syntactic measure known as a trace; iii) a description of a notation with operational semantics developed for specifying and verifying protocols and services, together with a method, utilising the notation, whereby a service can be derived from a protocol; iv) a description of a framework within which a protocol may be verified in respect of the service it provides; v) a description of an interactive program (environment) allowing the formal development of a certain class of distributed protocols, such as the ECMA application-layer protocol. The originality of this work lies in: i) the development of a methodology for automatically deriving and proving invariant properties of a specification; ii) the development of a notation with operational semantics, capable of specifying and verifying distributed protocols and services; iii) the identification of a means by which the correspondence between a specification and its implementation may be used as rewrite rules (not necessarily preserving all semantic information) in developing communication protocols; iv) the numerous algorithms described for addressing safety, liveness and conformity issues, as part of the interactive environment.
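The idea of checking a protocol property over traces, mentioned in part ii) above, can be sketched in miniature: model the protocol as a labelled transition system, enumerate all action sequences up to a bounded depth, and check a safety invariant on each. The two-state transition system and the invariant ("no ack without a prior send") are illustrative assumptions, not the thesis's formalism or the ECMA protocol.

```python
# Minimal sketch of trace-based safety checking: a protocol is a
# labelled transition system, and an invariant is verified over every
# trace up to a bounded depth. States, actions, and the invariant
# are illustrative assumptions.

transitions = {          # state -> [(action, next_state)]
    "idle":    [("send", "waiting")],
    "waiting": [("ack", "idle"), ("timeout", "idle")],
}

def traces(state, depth):
    """Enumerate all action sequences of length <= depth from `state`."""
    yield []
    if depth == 0:
        return
    for action, nxt in transitions.get(state, []):
        for rest in traces(nxt, depth - 1):
            yield [action] + rest

def invariant(trace):
    """Safety: at no point may the 'ack' count exceed the 'send' count."""
    sends = acks = 0
    for action in trace:
        if action == "send":
            sends += 1
        elif action == "ack":
            acks += 1
            if acks > sends:
                return False
    return True

print(all(invariant(t) for t in traces("idle", 6)))  # True
```

Real trace-based verification proves such invariants for traces of arbitrary length (e.g. by induction over the transition rules) rather than by bounded enumeration; the sketch only shows the shape of the check.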
205

Channel Based Sampling in a Network Based Data Acquisition System

Sulewski, Joseph; Dehmelt, Chris 10 1900
ITC/USA 2013 Conference Proceedings / The Forty-Ninth Annual International Telemetering Conference and Technical Exhibition / October 21-24, 2013 / Bally's Hotel & Convention Center, Las Vegas, NV / Over the last few years, PCM-based data acquisition systems have become known as "Traditional PCM" systems. This change in terminology is a sign of the evolution of the next generation of telemetry/data-acquisition systems based on network topologies. It has come about because users have been clamoring for functionality that has not been available in traditional systems, such as support for increased data rates, access to onboard archived data, on-the-fly reconfiguration, and simplified data distribution and delivery. The iNET standard uses standard network technology to improve device interoperability and data acquisition. To minimize the impact on existing data acquisition devices, the initial effort of this approach has included the transmission of "Traditional" fixed PCM frames within a network-message-based structure. This approach, however, squanders network bandwidth, as a PCM frame includes all samples of all channels, and it requires significant processing power for even simple tasks. Delivering on the promise of a more flexible transmission method requires a change in how data is acquired in the data acquisition devices. The iNET standard defines such a packet-based transport system, which supports channel-based packet formats in addition to "Traditional PCM" to deliver data products efficiently. This paper provides background on the benefits of these methods and an overview of the ways these formats can be implemented.
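The bandwidth argument above can be made concrete with a back-of-the-envelope sketch: a fixed PCM frame carries every channel at the fastest channel's rate, while channel-based packets carry each channel only at its own sample rate. The channel names, rates, and sample size are illustrative assumptions, and the sketch ignores packet/header overhead, which a real iNET deployment would have to account for.

```python
# Why channel-based packets can save bandwidth over a fixed PCM frame:
# the frame repeats every channel at the fastest channel's rate, while
# channel packets send each channel only at its own rate.
# Channel set, rates (Hz), and sample size are illustrative assumptions;
# packet/header overhead is deliberately ignored.

SAMPLE_BYTES = 2
channels = {"pressure": 1000, "vibration": 8000, "temperature": 10}  # Hz

def frame_bytes_per_sec(chans):
    """Fixed frame: every channel is sampled at the fastest rate."""
    frame_rate = max(chans.values())
    return frame_rate * len(chans) * SAMPLE_BYTES

def channel_bytes_per_sec(chans):
    """Channel packets: each channel is carried only at its own rate."""
    return sum(rate * SAMPLE_BYTES for rate in chans.values())

print(frame_bytes_per_sec(channels))    # 48000
print(channel_bytes_per_sec(channels))  # 18020
```

With these assumed rates the fixed frame spends more than twice the bandwidth of channel-based delivery; the gap widens as channel rates become more heterogeneous.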
206

Optimisation techniques for telecommunication networks

Grout, V. M. January 1988
This thesis deals with various facets of the optimisation problem for telecommunication networks and proposes a number of new techniques for their solution. The necessary essentials, Graph Theory, Complexity Theory and Telecommunication Principles, are investigated. The relevant graphs are enumerated and the requirements of suitable optimisation algorithms for certain graphical problems are established. The Private Automatic Branch Exchange (PABX) is introduced, and the variety of telecommunications traffic as well as the practical requirements of a connection topology are discussed. The fundamental Network Optimisation Problem (NOP) is defined and analysed. Simple exhaustive methods of solution are considered together with partial solution algorithms and simplification methods. Centralised networks with and without concentrators are introduced. Extensions and modifications are proposed for some techniques, and existing practical methods of dealing with the NOP are investigated. A number of new ideas are proposed for the practical solution of the NOP. Reduction methods are presented for replacing large unmanageable networks with smaller ones, on which optimisation can take place. Fixed-topology techniques are introduced for initial tandem switch selection, and perturbation methods are considered which can be applied to such an initial solution. Lookahead methods of link removal are introduced for determining the tandem interconnection network together with the traffic routeing strategy. A composite method incorporating all of these concepts is proposed, and the results of a number of numerical experiments on actual network problems are presented. The extension of the proposed techniques to other areas of problem solving and optimisation is considered. In particular, a new method for the solution of the Euclidean Travelling Salesman Problem (ETSP) is presented.
In conclusion, a brief discussion is undertaken concerning the practical difficulties of the NOP and the restrictions these place upon solution algorithms of various types.
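To give a flavour of the ETSP mentioned above, here is the standard nearest-neighbour construction heuristic: start at a city and repeatedly visit the closest unvisited one. This is a textbook baseline for comparison, not the new method the thesis proposes; the city coordinates are assumed for illustration.

```python
# Nearest-neighbour construction heuristic for the Euclidean TSP --
# a standard baseline, not the thesis's new method.
import math

cities = [(0, 0), (3, 0), (3, 4), (0, 4)]  # assumed coordinates

def nearest_neighbour_tour(points, start=0):
    """Greedily visit the closest unvisited city, implicitly closing
    the tour back to the start."""
    unvisited = set(range(len(points)))
    tour = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    """Total length of the closed tour."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = nearest_neighbour_tour(cities)
print(tour, tour_length(cities, tour))  # [0, 1, 2, 3] 14.0
```

Construction heuristics like this are typically followed by perturbation or local-search improvement, echoing the initial-solution-plus-perturbation structure the abstract describes for the NOP.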
207

Some results on the location problem

鄭國榮, Cheng, Kwok-wing, Philip. January 1998
Master of Philosophy thesis, Computer Science.
208

Properties of a distributed-lumped-active low-pass network

Tong, Ronald Matthew, 1946- January 1971
No description available.
209

An investigation of sensitivity in active RC networks

Workman, Kenneth, 1938- January 1963
No description available.
210

Neural networks predict well inflow performance

Alrumah, Muhammad K. 30 September 2004
Predicting the well inflow performance relationship (IPR) accurately is very important for production engineers. From these predictions, future plans for handling and improving well performance can be established. One method of predicting well inflow performance is to use artificial neural networks. Vogel's reference curve, which was produced from a series of simulation runs for a reservoir model proposed by Weller, is typically used to predict the IPR for solution-gas-drive reservoirs. In this study, I reproduced Vogel's work, but instead of producing one curve by conventional regression, I built three neural network models. Two of the models predict the IPR efficiently, with higher overall accuracy than Vogel's reference curve.
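Vogel's reference curve mentioned above is the well-known dimensionless IPR for solution-gas-drive reservoirs, q/q_max = 1 - 0.2(pwf/pr) - 0.8(pwf/pr)^2. A short sketch of evaluating it (the reservoir pressure and maximum rate below are assumed example values; the study's neural-network models replace this single curve, not reproduce it):

```python
# Vogel's reference IPR curve for solution-gas-drive reservoirs:
#   q / q_max = 1 - 0.2*(pwf/pr) - 0.8*(pwf/pr)**2
# pr and q_max below are assumed example values.

def vogel_rate(pwf, pr, q_max):
    """Oil rate at flowing bottomhole pressure pwf (same units as pr)."""
    ratio = pwf / pr
    return q_max * (1.0 - 0.2 * ratio - 0.8 * ratio ** 2)

pr, q_max = 3000.0, 1500.0   # psia, STB/d (assumed)
for pwf in (3000.0, 1500.0, 0.0):
    print(pwf, round(vogel_rate(pwf, pr, q_max), 1))
# 3000.0 0.0   (no drawdown, no flow)
# 1500.0 1050.0
# 0.0 1500.0   (maximum rate at zero bottomhole pressure)
```

A single curve like this cannot capture reservoir-to-reservoir variation, which is the motivation for training data-driven models on the underlying simulation runs instead.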
