1

A COMPUTER-AIDED METHODOLOGY FOR THE DESIGN AND ANALYSIS OF DATA COMMUNICATION NETWORKS

Bracker, William Edward January 1981 (has links)
The increase in the interconnection of computers has led to a corresponding increase in the complexity of effective network design. This increase is due partly to the size and diversity of computer networks, but also to the proliferation of available network hardware and software. As with any system, computer networks are made up of various interrelated components, all of which are essential to the network design process. Some of these components are physical in nature, that is, they specify a piece of hardware or software with certain performance properties. Other components are considered to be logical elements of network design. The problem of designing, implementing and controlling present and planned data communication networks is rapidly exceeding manual planning and design capabilities. Network managers and designers are looking more and more to computer resources to manage the volume of traffic information, to develop and evaluate network configurations, and to assist in allocating and placing control functions among network processors. The number of network alternatives is increasing so fast that the demand for network optimization--both in private network design and in the interface to, and use of, public network facilities--can be satisfied only through computer aids. These aids range from simple scratch-pad systems that store and compile network statistics to complete traffic emulators, design configurators, and software for data-distribution modelling. Network design generates a set of performance/cost tradeoffs for an existing or proposed network. The network design process generates and regenerates these performance curves in response to changes in critical design parameters such as protocols, line capacity, concentration points, host processor capability and message routing techniques. This dissertation considers the development of a generalized model of the network design process. This model proposes that the physical and logical network components can be described using a set of objects, properties and relationships. Objects are used to describe physical and logical network components. These components are given properties, thus associating with each object a set of network descriptions. Linkages between objects or classes of objects are provided by object-object relationships. The prototype system provides a generalized network design tool that allows a concise and explicit statement of network requirements, providing a network planning and design model that is independent of any particular network or application.
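A minimal sketch of the object/property/relationship idea the abstract describes, not the dissertation's actual system: objects stand for physical or logical network components, properties attach descriptions to them, and relationships link objects together. All class names, field names, and values below are illustrative assumptions.

```python
# Sketch of an object/property/relationship network-design description.
# Names and values are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class NetworkObject:
    name: str                      # e.g. "host-A", "line-1"
    kind: str                      # physical ("line", "processor") or logical ("protocol")
    properties: dict = field(default_factory=dict)  # performance/cost descriptions


@dataclass
class Relationship:
    source: NetworkObject
    target: NetworkObject
    relation: str                  # e.g. "connected-to", "routed-over"


# A tiny design description: a host, a 9600 bit/s line, and their linkage.
host = NetworkObject("host-A", "processor", {"capacity_msgs_per_s": 120})
line = NetworkObject("line-1", "line", {"capacity_bps": 9600, "cost_per_month": 400})
model = [Relationship(host, line, "connected-to")]

# A design tool could iterate over objects and relationships to generate
# performance/cost trade-offs for alternative configurations.
for rel in model:
    print(rel.source.name, rel.relation, rel.target.name, rel.target.properties)
```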
2

Second derivative algorithms for minimum delay distributed routing in networks

January 1981 (has links)
Dimitri P. Bertsekas, Eli M. Gafni, Robert G. Gallager. / Bibliography: p. 47. / "March 1981" / "ARPA Grant No. ONR-N00014-75-C-1183" / "National Science Foundation Grant NSF/ECS 79-19880"
3

Robust synchronization plan for SDH network.

Mpele, Jeremy Rodrigue. January 2010 (has links)
M. Tech. Electrical Engineering. / High-speed transmission networks require adequate synchronization planning to maintain quality of service. Based on ITU-T synchronization network guidelines, this dissertation proposes a Synchronous Digital Hierarchy (SDH) synchronization model that caters for synchronization loops and is robust to communication link faults. From the candidate schemes for providing synchronization information, we have adopted the master-slave strategy. Heuristics have been developed to assist in the allocation of primary and back-up reference clocks. The contribution lies in the systematic selection of master node placement, the distribution of the reference clock from the master node to all slave nodes using shortest paths, and the allocation of system redundancy by means of a clock priority table for each nodal clock. To cater for clock stability and accuracy, the use of atomic clocks (cesium, rubidium) as long-term frequency standards has shown that clock characteristics, namely stability and accuracy, may be controlled in the midst of jitter/wander. The fiber-optic transmission medium has proven adequate for optimal clock dissemination, with negligible frequency deviation from the nominal positional instant, traceable to the Primary Reference Clock.
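A hedged sketch of the master-slave distribution idea described above: place a primary reference clock at one node, distribute the reference to every slave along shortest paths, and derive a per-node priority table from a primary and a backup master. The topology, link weights, and node placements below are invented; this is not the dissertation's actual heuristic.

```python
# Illustrative master-slave clock distribution using shortest paths.
import heapq


def shortest_path_parents(graph, source):
    """Dijkstra: return each node's upstream (parent) node toward `source`."""
    dist = {source: 0.0}
    parent = {source: None}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return parent


# Link "costs" could be hop count, delay, or accumulated wander (assumed values).
graph = {
    "A": {"B": 1, "C": 2},
    "B": {"A": 1, "C": 1, "D": 3},
    "C": {"A": 2, "B": 1, "D": 1},
    "D": {"B": 3, "C": 1},
}

primary, backup = "A", "D"          # assumed reference-clock placement
p1 = shortest_path_parents(graph, primary)
p2 = shortest_path_parents(graph, backup)

# Priority table: each slave first tracks the clock arriving from the primary's
# direction, then falls back to the backup's direction on failure.
for node in graph:
    if node != primary:
        print(node, "-> priority 1:", p1[node], " priority 2:", p2[node])
```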
4

Second derivative algorithms for minimum delay distributed routing in networks

January 1983 (has links)
by Dimitri P. Bertsekas, Eli M. Gafni, Robert G. Gallager. / Bibliography: p. 34-35. / "October, 1983." / Defense Advanced Research Projects Agency Grant No. ONR-N00014-75-C-1183 / National Science Foundation Grant NSF/ECS 79-19880
5

In-situ S-Parameter Analysis and Applications

Hershberger, Kyle M 22 May 2014 (has links)
This thesis will begin with an investigation of the limitations associated with the predominant two-port stability analysis techniques with respect to multi-stage RF amplifier design. The primary focus will be to investigate and develop network analysis techniques that allow internal ports to be created within an RF circuit. This technique will facilitate the application of existing stability analysis techniques in ways that are not commonly known. Examples of situations where traditional network and stability analysis is insufficient will be presented, and the application of the newly developed techniques will be examined.
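For context, a sketch of the conventional two-port stability test the thesis identifies as limited: Rollett's K-factor computed from a single frequency point's S-parameters. The formula is standard; the numeric S-parameter values are made up, and the internal-port technique itself is not shown here.

```python
# Standard two-port stability check from S-parameters (illustrative values).
def rollett_k(s11, s12, s21, s22):
    """Return (K, |delta|); unconditional stability requires K > 1 and |delta| < 1."""
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11) ** 2 - abs(s22) ** 2 + abs(delta) ** 2) / (2 * abs(s12 * s21))
    return k, abs(delta)


# Example S-parameters (complex, linear magnitude) at one frequency point.
k, mag_delta = rollett_k(
    s11=0.38 - 0.25j, s12=0.05 + 0.02j, s21=4.2 + 1.1j, s22=0.45 - 0.30j
)
print(f"K = {k:.3f}, |delta| = {mag_delta:.3f}")
# The internal-port approach described above would expose nodes inside a
# multi-stage amplifier so that this kind of test can be applied stage by stage.
```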
6

Network Analysis of the Symmetric and Asymmetric Patterns of Conflict in an Organization

Helt, Kimberly M. (Kimberly Mae) 05 1900 (has links)
Missing from extant conflict literature is an examination of both symmetric and asymmetric conflict ties. To address this void, network analysis was utilized to examine the responses (both symmetric and asymmetric conflict ties) of 140 employees and managers in four divisions of a large agency of the Federal Government. The study was limited to conflict over scarce resources. Conflict management methods were examined as well as the perceptions of how respondents both cope with and feel about conflict. The results indicate that when two people in a conflict setting are structurally equivalent, they both report actions and feelings that are opposite from those of the other person. This finding, an inverse contagion effect, has been termed diffusion resistance.
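A hedged illustration of the structural-equivalence notion the study draws on: two actors are (approximately) structurally equivalent when they have the same pattern of ties to everyone else, often measured as the distance between their rows in the tie matrix. The tiny conflict-tie matrix below is invented; the actual study used 140 respondents.

```python
# Structural equivalence as distance between actors' tie profiles.
import math


def structural_distance(adj, i, j):
    """Euclidean distance between actors i and j's tie profiles (self-ties ignored)."""
    return math.sqrt(
        sum((adj[i][k] - adj[j][k]) ** 2 for k in range(len(adj)) if k not in (i, j))
    )


# Rows = actors; 1 = reports a conflict tie with the column actor (made-up data).
ties = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 0, 1],   # actor 2 ...
    [0, 1, 0, 0],   # ... and actor 3 have matching ties toward actors 0 and 1
]

# Large distance -> dissimilar positions; small distance -> near-equivalent positions.
print(structural_distance(ties, 0, 1), structural_distance(ties, 2, 3))
```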
7

Cultural Diffusion through Language: How Communication Networks Influence Culture in the Age of Digitization

Yeaton, Matthew Richard January 2021 (has links)
My dissertation focuses on the strategic implications of the link between organizational culture and social network structure. I study their role in the process of knowledge transfer and diffusion, organizational memory, and organizational design. More broadly, I examine the way that social structure influences the information environment, and what effect this has on organizational learning. I focus in particular on the process of cultural evolution. My dissertation leverages digitization as a phenomenon of inherent interest and as an empirical setting that can improve our theoretical understanding of both digital and non-digital communities. I have developed an expertise in computational methods, especially in machine learning techniques related to text and other unstructured data, and in the analysis of "big data," especially pertaining to large-scale networks. By combining these computational tools with organizational theory and the rich relational data generated by the explosion of digital records, my research grants insight into the dynamic process of learning in organizations and the implications for innovation and competitive advantage. I explore how digitization informs and develops our understanding of organizational culture, knowledge transfer, and the labor market. Specifically, I investigate how digitization has opened a window to observe network structure and language, providing a lasting record of these changes through time. Using these digital records to observe the structure of social relations and the language used to communicate can help deepen our theory of knowledge transfer for a wide range of organizations, not just those that operate in the digital sphere. This means that these studies also have implications for understanding organizations in non-digital settings. My dissertation contributes both theoretically and empirically to the knowledge theory of the firm. However, the mechanisms underlying knowledge transfer remain underdeveloped. I contribute by disentangling the related mechanisms of language and organizational structure, and I propose that common language directly impacts what knowledge may be efficiently transferred. Next, my dissertation contributes to the growing field of digitization. Digitization is salient for researchers both as a unique phenomenon and as an ever-expanding source of accessible data to test theory. Moreover, since one of the central contributions of digitization is to reduce the cost of information gathering, it is well-suited to my theoretical setting of knowledge transmission and organizational memory. Finally, my dissertation contributes to our understanding of culture in organizations. The focus on language as an aspect of culture allows both additional formalization as well as more specific empirical tests of the contribution of culture to organizational outcomes. In particular, a focus on dynamic settings in each of the chapters reveals the interplay between organizational structure, memory, and change. This helps us to understand how language evolves, how it is learned, and how it changes in response to information shocks.
8

Calibration of time domain network analyzers

Su, Wansheng 23 September 2008 (has links)
A calibration technique for time domain reflectometry and transmission (TDR and TDT) measurement systems as applied to network analysis is presented. The calibration corrects for the errors caused by the response of the measurement system. A complete physically-based model has been established for the system. A set of calculable standards has been developed to satisfy the time domain requirements for calibration. The calibration technique was applied to determining the model parameters of a commercial TDR and TDT system. The errors of modeling and de-embedding are analyzed. The calibration enhanced the system bandwidth from 8 GHz to about 20 GHz. Experimental verification is given to demonstrate the validity and accuracy of the calibration technique. / Ph. D.
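A minimal sketch of the general de-embedding idea, not the thesis's actual error model: the system response measured on a known standard is divided out of the raw measurement in the frequency domain. The waveforms, sample rate, and "known standard" below are synthetic placeholders.

```python
# Frequency-domain de-embedding of a TDR-style measurement (synthetic data).
import numpy as np

fs = 100e9                              # assumed sample rate, 100 GS/s
t = np.arange(2048) / fs

# Synthetic step responses: reference (known standard) and device under test.
reference = 1.0 - np.exp(-t / 50e-12)          # system rise time ~50 ps
measured = 0.8 * (1.0 - np.exp(-t / 80e-12))   # DUT slows and attenuates the edge

# Differentiate the steps to impulse-like responses, then go to the frequency domain.
ref_spec = np.fft.rfft(np.diff(reference))
dut_spec = np.fft.rfft(np.diff(measured))
freqs = np.fft.rfftfreq(len(t) - 1, d=1 / fs)

# De-embed: the corrected transfer function is the ratio of spectra.
corrected = dut_spec / ref_spec

# Report magnitude at a few frequencies inside the usable bandwidth.
for f_ghz in (1, 5, 10):
    idx = np.argmin(np.abs(freqs - f_ghz * 1e9))
    print(f"{f_ghz} GHz: |H| = {abs(corrected[idx]):.3f}")
```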
9

The significance of transients following failures and repairs in packet-switched networks

Kobza, John E. 24 October 2005 (has links)
A system composed of unreliable components can experience different levels of performance as its configuration changes due to failures and repairs. One approach used to measure overall system performance is to weight the level of performance measured for each system state by the probability that the system is in that state and then sum across all system states. Many performance measures have a transient behavior following a change in the state of the system. Because of the difficulty associated with transient analysis, the system is often assumed to be in steady state when measuring the performance for each system state. When this approach is used to analyze packet-switched communication networks, which consist of highly reliable high-speed links and switching nodes, it is argued that the steady-state assumption is justified on the basis of the large difference in rates of traffic-related events, such as call completions and packet transmissions, compared to component-related events, such as failures and repairs. To investigate the validity of this assumption, we define lower bounds for the length of the transient phase following link failures and repairs. For both cases, we obtain a distribution for the length of the lower bound. The transient phase is significant when its length exceeds a given fraction of the time until the next change in network state. Using the distributions for these lengths, we derive an expression for the probability that the transient phase is significant in terms of the amount of traffic on the link and the ratio of the rates for traffic-related events and network state changes. These results show that the difference in rates between traffic-related events and component-related events is not enough by itself to justify the steady-state assumption. The amount of traffic carried on the link and the size of the network must also be considered. These results indicate some situations where the steady-state assumption is inappropriate. We also obtain sufficient conditions for transient-phase significance following link failures. Although these results do not indicate when it is safe to use the steady-state assumption, they provide a measure of the risk associated with using it. / Ph. D.
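A heavily simplified Monte Carlo sketch of the question the abstract poses: how often does the post-failure transient last longer than a given fraction of the time until the next network state change? The exponential models and every numeric value below are assumptions made for illustration; they are not the dissertation's model or its derived expression.

```python
# Toy Monte Carlo estimate of P(transient phase is "significant").
import random

random.seed(1)

traffic_rate = 1_000.0      # assumed traffic-related events per second
component_rate = 1e-4       # assumed failures/repairs per second across the network
relax_events = 5_000        # assumed: transient settles after this many traffic events
fraction = 0.1              # "significant" if transient covers >10% of the interval

trials, significant = 100_000, 0
for _ in range(trials):
    transient = random.expovariate(traffic_rate / relax_events)   # mean ~5 s here
    next_change = random.expovariate(component_rate)              # time to next failure/repair
    if transient > fraction * next_change:
        significant += 1

print(f"P(transient significant) ~ {significant / trials:.4f}")
# Even with component events vastly rarer than traffic events, the answer still
# depends on how much traffic the link carries and how large the network is,
# which is the point made above about the steady-state assumption.
```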
10

A metaphoric cluster analysis of the rhetoric of digital technology

Marse, Michael Eugene, Negroponte, Nicholas 01 January 2005 (has links)
This thesis seeks to identify and explain some of the rhetoric of digital technology in order to more fully understand modern communication. This study makes use of metaphoric cluster analysis to examine the technological rhetoric of Nicholas Negroponte.
