31

NETWORK DESIGN UNDER DEMAND UNCERTAINTY

Meesublak, Koonlachat 27 September 2007
A methodology for network design under demand uncertainty is proposed in this dissertation. The uncertainty is caused by the dynamic nature of IP-based traffic, which is expected to be transported directly over the optical layer in the future; there is thus a need to incorporate the uncertainty explicitly into a design model. We assume that each demand can be represented as a random variable, and then develop an optimization model to minimize the cost of routing and bandwidth provisioning. The optimization problem is formulated as a nonlinear Multicommodity Flow problem using Chance-Constrained Programming to capture both the demand variability and the level of the uncertainty guarantee. Numerical work is presented based on a heuristic solution approach that uses a linear approximation to transform the nonlinear problem into a simpler linear programming problem. In addition, the impact of the uncertainty on a two-layer network is investigated, to determine how the Chance-Constrained Programming scheme can be implemented in practice. Finally, implementation guidelines for developing an updating process are provided.
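To make the chance-constraint idea concrete, here is a minimal sketch of the standard deterministic equivalent used in such formulations: if a demand is Gaussian, the constraint P(demand <= capacity) >= alpha reduces to capacity >= mu + z_alpha * sigma. The Gaussian model, the independence of demands, and all numbers are illustrative assumptions, not the dissertation's actual data or algorithm.

```python
# Sketch: deterministic equivalent of a Gaussian chance constraint.
# P(demand <= capacity) >= alpha  <=>  capacity >= mu + z_alpha * sigma
from math import sqrt
from statistics import NormalDist

def provisioned_bandwidth(mus, sigmas, alpha=0.95):
    """Capacity meeting aggregate Gaussian demand with probability >= alpha."""
    mu_total = sum(mus)
    sigma_total = sqrt(sum(s * s for s in sigmas))  # independence assumed
    z = NormalDist().inv_cdf(alpha)                 # uncertainty-guarantee level
    return mu_total + z * sigma_total

# Three demands with means 10, 20, 5 Mb/s and std devs 2, 4, 1 Mb/s
print(provisioned_bandwidth([10.0, 20.0, 5.0], [2.0, 4.0, 1.0]))  # ~42.5 Mb/s
```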
32

Time-Synchronized Optical Burst Switching

Rugsachart, Artprecha 27 September 2007
Optical Burst Switching was recently introduced as a protocol for next-generation optical Wavelength Division Multiplexing (WDM) networks. Legacy Optical Circuit Switching over WDM cannot achieve the highest possible bandwidth utilization. Optical Packet Switching, meanwhile, is difficult to implement because of physical complexities and technical obstacles such as the lack of an optical buffer and the inefficiency of optical processing. Optical Burst Switching (OBS) is a compromise between Optical Circuit Switching and Optical Packet Switching, designed to solve these problems and support the unique characteristics of an optical network. Since OBS works on all-optical switching techniques, two major challenges must be taken into consideration when designing an effective OBS system: the cost and complexity of implementation, and the performance of the system in terms of blocking probability. This research proposes a variation of OBS called Time-Synchronized Optical Burst Switching, which employs a synchronized timeslot-based mechanism that allows a less complex physical switching fabric to be implemented and provides an opportunity for better resource utilization than traditional OBS.
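A minimal sketch of what a synchronized timeslot-based admission check might look like: bursts are aligned to slot boundaries, and a wavelength is granted only if every slot the burst spans is free. The slot duration, channel count, and scheduling rule are assumptions made for illustration, not the dissertation's design.

```python
# Sketch: slotted burst admission on a single link.
import math

SLOT = 10e-6          # assumed slot duration (s)
N_WAVELENGTHS = 4     # assumed channels per link

# occupancy[w] is the set of busy slot indices on wavelength w
occupancy = [set() for _ in range(N_WAVELENGTHS)]

def schedule_burst(arrival_time, duration):
    """Return (wavelength, start_slot) or None if the burst is blocked."""
    start = math.ceil(arrival_time / SLOT)      # align to next slot boundary
    n_slots = math.ceil(duration / SLOT)
    needed = set(range(start, start + n_slots))
    for w in range(N_WAVELENGTHS):
        if not (occupancy[w] & needed):         # every needed slot free?
            occupancy[w] |= needed
            return w, start
    return None                                 # blocked

print(schedule_burst(23e-6, 35e-6))             # e.g. (0, 3)
```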
33

HUMAN CONTROL OF COOPERATING ROBOTS

Wang, Jijun 31 January 2008
Advances in robotic technologies and artificial intelligence are allowing robots to emerge from research laboratories into our lives. Experience with field applications shows that we have underestimated the importance of human-robot interaction (HRI) and that new problems arise in HRI as robotic technologies expand. This thesis classifies HRI along four dimensions (human, robot, task, and world) and illustrates that previous HRI classifications can be interpreted as addressing either one of these elements or the relationship between two or more of them. Current HRI studies of single-operator single-robot (SOSR) control and single-operator multiple-robot (SOMR) control are reviewed using this approach. Human control of multiple robots has been suggested as a way to improve effectiveness in robot control. Unlike previous studies that investigated human interaction either in low-fidelity simulations or on simple tasks, this thesis investigates human interaction with cooperating robot teams within a realistically complex environment. USARSim, a high-fidelity game-engine-based robot simulator, and MrCS, a distributed multirobot control system, were developed for this purpose. In the pilot experiment, we studied the impact of autonomy level: mixed-initiative control yielded performance superior to fully autonomous and manual control. To avoid limitation to particular application fields, the thesis focuses on common HRI evaluations that enable us to analyze HRI effectiveness and guide HRI design independently of the robotic system or application domain. We introduce the interaction episode (IEP), inspired by our pilot human-multirobot control experiment, to extend the Neglect Tolerance model to support control of multiple robots on complex tasks in general. Cooperation Effort (CE), Cooperation Demand (CD), and Team Attention Demand (TAD) are defined to measure cooperation in SOMR control. Two validation experiments were conducted to validate the CD measurement under tight and weak cooperation conditions in a high-fidelity virtual environment. The results show that CD, as a generic HRI metric, is able to account for the various factors that affect HRI and can be used in HRI evaluation and analysis.
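As a rough illustration of one of these metrics, the sketch below computes a Cooperation Demand-style ratio: the fraction of task time an operator spends servicing cooperative interaction episodes. This reading of CD and the episode format are assumptions made for illustration; the thesis's formal definitions may differ.

```python
# Sketch: a CD-style ratio over logged interaction episodes (IEPs).
def cooperation_demand(episodes, task_time):
    """episodes: list of (start, end) operator-interaction intervals in seconds."""
    interaction_time = sum(end - start for start, end in episodes)
    return interaction_time / task_time

episodes = [(0.0, 3.5), (10.0, 12.0), (30.0, 31.5)]   # illustrative log
print(f"CD = {cooperation_demand(episodes, task_time=60.0):.3f}")  # CD = 0.117
```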
34

ONTOLOGY MAPPING: TOWARDS SEMANTIC INTEROPERABILITY IN DISTRIBUTED AND HETEROGENEOUS ENVIRONMENTS

Mao, Ming 03 June 2008
The World Wide Web (WWW) is now widely used as a universal medium for information exchange. Semantic interoperability among different information systems in the WWW is limited due to information heterogeneity and the non-semantic nature of HTML and URLs. Ontologies have been suggested as a way to solve the problem of information heterogeneity by providing formal, explicit definitions of data and the ability to reason over related concepts. Given that no universal ontology exists for the WWW, work has focused on finding semantic correspondences between similar elements of different ontologies, i.e., ontology mapping. Ontology mapping can be done either by hand or using automated tools. Manual mapping becomes impractical as the size and complexity of ontologies increase, so full or semi-automated mapping approaches have been examined in several research studies. Previous approaches include analyzing linguistic information of elements in ontologies, treating ontologies as structural graphs, applying heuristic rules and machine learning techniques, and using probabilistic and reasoning methods. In this work, two generic ontology mapping approaches are proposed. One is the PRIOR+ approach, which utilizes both information retrieval and artificial intelligence techniques in the context of ontology mapping. The other is a non-instance learning based approach, which experimentally explores machine learning algorithms to solve the ontology mapping problem without requiring any instances. The results of PRIOR+ on different tests at the OAEI Ontology Matching Campaign 2007 are encouraging. The non-instance learning based approach has shown potential for solving the ontology mapping problem on the OAEI benchmark tests.
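As a toy illustration of the information-retrieval flavor of label-based matching (in the spirit of, but not identical to, PRIOR+), the sketch below tokenizes concept names and scores pairs by cosine similarity. The tokenizer, the threshold, and the example labels are all assumptions.

```python
# Sketch: IR-style label matching between two ontologies' concept names.
import re
from collections import Counter
from math import sqrt

def tokens(label):
    # split camelCase and underscores, then lowercase
    return Counter(w.lower() for w in re.findall(r"[A-Z]?[a-z]+", label))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def match(onto1, onto2, threshold=0.6):
    """Return candidate correspondences between two lists of concept labels."""
    pairs = []
    for c1 in onto1:
        for c2 in onto2:
            s = cosine(tokens(c1), tokens(c2))
            if s >= threshold:
                pairs.append((c1, c2, round(s, 2)))
    return pairs

print(match(["AcceptedPaper", "ProgramCommittee"],
            ["Paper_Accepted", "Committee"]))
# [('AcceptedPaper', 'Paper_Accepted', 1.0), ('ProgramCommittee', 'Committee', 0.71)]
```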
35

An Agent-Based Model for Secondary Use of Radio Spectrum

Tonmukayakul, Arnon 31 January 2008
Wireless communications rely on access to radio spectrum. With the continuing proliferation of wireless applications and services, the spectrum resource is becoming scarce. Measurement studies of spectrum usage, however, reveal that spectrum is used only sporadically in many geographical areas and at many times. In an attempt to promote efficient spectrum usage, the Federal Communications Commission has supported the use of market mechanisms to allocate and assign radio spectrum. We focus on the secondary use of spectrum, defined as temporary access to existing licensed spectrum by a user who does not own a spectrum license. Secondary use raises numerous technical, institutional, economic, and strategic issues that merit investigation. Central to these issues are the effects of transaction costs associated with the use of market mechanisms and the uncertainties due to potential interference. The research objective is to identify the pre-conditions for when, why, and in what form secondary use would emerge. We use transaction cost economics as the theoretical framework in this study, and we propose a novel use of agent-based computational economics to model the development of the secondary use of spectrum. The agent-based model allows economic and technical considerations to be integrated in studying the pre-conditions for secondary use. The approach aims to observe the aggregate outcomes of interactions among agents and to understand the process that leads to secondary use, which can then inform policy instruments designed to obtain favorable spectrum-management outcomes.
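A minimal sketch of the kind of decision rule an agent in such a model might apply: a prospective secondary user trades only when the value of access exceeds the license price plus transaction costs and an expected interference penalty. The rule and all parameters are illustrative assumptions, not the dissertation's model.

```python
# Sketch: transaction-cost-aware decision rule for a secondary-use agent.
from dataclasses import dataclass

@dataclass
class SecondaryUseOffer:
    access_value: float         # value of temporary spectrum access
    price: float                # payment to the license holder
    transaction_cost: float     # search, negotiation, enforcement costs
    interference_prob: float    # chance the primary user is disturbed
    interference_penalty: float # cost if interference occurs

    def worthwhile(self):
        expected_cost = (self.price + self.transaction_cost
                         + self.interference_prob * self.interference_penalty)
        return self.access_value > expected_cost

offer = SecondaryUseOffer(100.0, 60.0, 15.0, 0.1, 200.0)
print(offer.worthwhile())   # True: 100 > 60 + 15 + 0.1*200 = 95
```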
36

Demand-Based Wireless Network Design by Test Point Reduction

Pongthaipat, Natthapol 31 January 2008
The problem of locating the minimum number of Base Stations (BSs) to provide sufficient signal coverage and data rate capacity is often formulated in a manner that results in a mixed-integer NP-Hard (Non-deterministic Polynomial-time Hard) problem. Solving a large NP-Hard problem is time-prohibitive because the search space grows exponentially, in this case as a function of the number of BSs. This research presents a method to generate a set of Test Points (TPs) for BS locations that always includes the optimal solution(s). A sweep-and-merge algorithm then reduces the number of TPs while maintaining the optimal solution. The coverage solution is computed by applying the minimum branching algorithm, which is similar to branch-and-bound search. Data rate demand is assigned to BSs so as to maximize total network capacity. For cases when the coverage solution cannot meet the capacity requirement, an algorithm based on Tabu Search is developed to place additional BSs. Results show that the design algorithm searches the space and converges to the optimal solution in a computationally efficient manner. Using demand nodes to represent traffic, network design with the TP reduction algorithm supports both voice and data users.
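To illustrate the dominance idea behind a sweep-and-merge style reduction, the sketch below drops any candidate test point whose covered demand-node set is contained in another's, which cannot remove an optimal cover. The data format and tie-breaking are assumptions, not the dissertation's exact algorithm.

```python
# Sketch: subset-dominance pruning of candidate test points.
def reduce_test_points(coverage):
    """coverage: dict test_point -> frozenset of demand nodes it covers.
    Drop any TP whose coverage is a subset of another TP's coverage."""
    kept = {}
    for tp, covered in sorted(coverage.items(),
                              key=lambda kv: -len(kv[1])):  # sweep large to small
        if not any(covered <= other for other in kept.values()):
            kept[tp] = covered                              # not dominated: keep
    return kept

coverage = {"A": frozenset({1, 2, 3}),
            "B": frozenset({2, 3}),
            "C": frozenset({3, 4})}
print(sorted(reduce_test_points(coverage)))  # ['A', 'C']; B is dominated by A
```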
37

Modeling Team Performance For Coordination Configurations Of Large Multi-Agent Teams Using Stochastic Neural Networks

Polvichai, Jumpol 31 January 2008
Coordination of large numbers of agents to perform complex tasks in complex domains is a rapidly progressing area of research. Because of the high complexity of the problem, approximate and heuristic algorithms are typically used for key coordination tasks. Such algorithms usually require tuning algorithm parameters to yield the best performance under particular circumstances, and manually tuning parameters is sometimes difficult. In domains where characteristics of the environment can vary dramatically from scenario to scenario, it is desirable to have automated techniques for appropriately configuring the coordination. This research presents an approach to online reconfiguration of heuristic coordination algorithms. The approach uses an abstract simulation to produce a large performance data set that trains a stochastic neural network to concisely model the complex, probabilistic relationship between configurations, environments, and performance metrics. The final stochastic neural network, referred to as the team performance model, is then used as the core of a tool that allows rapid online or offline configuration of coordination algorithms for particular scenarios and user preferences. The overall system allows rapid adaptation of coordination, leading to better performance in new scenarios. Results show that the team performance model captured key features of a very large configuration space and largely captured the uncertainty in performance. The tool was often capable of reconfiguring the algorithms to meet user requests for increases or decreases in performance parameters. This work represents the first practical approach to quickly reconfiguring a complex set of algorithms for a specific scenario.
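A minimal sketch of how a trained team performance model might drive reconfiguration: enumerate candidate configurations and pick the one whose predicted metrics best match the user's weighted preferences. The `performance_model` stub stands in for the stochastic neural network; its signature, the metric names, and the configuration grid are assumptions.

```python
# Sketch: configuration search against a performance-model stub.
import itertools
import random

def performance_model(config, environment):
    """Stub for the trained stochastic NN: returns sampled metrics.
    Seeded so the demo is deterministic; a real model would be learned."""
    random.seed(str((config, environment)))
    return {"completion": random.uniform(0.5, 1.0),
            "messages": random.uniform(0.0, 1.0)}

def reconfigure(environment, weight_completion=1.0, weight_messages=-0.5):
    grid = itertools.product([2, 4, 8],        # e.g. team size (assumed knob)
                             [0.1, 0.5, 0.9])  # e.g. task-auction rate (assumed)
    def score(cfg):
        m = performance_model(cfg, environment)
        return (weight_completion * m["completion"]
                + weight_messages * m["messages"])
    return max(grid, key=score)

print(reconfigure(environment="urban-search"))
```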
38

ANCIENT ARCHITECTURE IN VIRTUAL REALITY: DOES IMMERSION REALLY AID LEARNING?

Jacobson, Jeffrey 07 July 2008
This study explored whether students benefited from an immersive panoramic display while studying subject matter that is visually complex and information-rich. Specifically, middle-school students learned about ancient Egyptian art and society using an educational learning game, Gates of Horus, which is based on a simplified three-dimensional computer model of an Egyptian temple. First, we demonstrated that the game is an effective learning tool by comparing written post-test results from students who played the game against students in a no-treatment control group. Next, we compared the learning results of two groups of students who used the same mechanical controls to navigate through the computer model of the temple and to interact with its features. One group saw the temple on a standard desktop computer monitor, while the other saw it in a visually immersive display (a partial dome). The major difference in the test results between the two groups appeared when the students gave a verbal show-and-tell presentation about the temple and the facts and concepts related to it. During that exercise, the students had no cognitive scaffolding other than the Virtual Egyptian Temple, which was projected on a wall; each student navigated through the temple and described its major features. Students who had used the visually immersive display volunteered notably more information than those who had used a computer monitor. The other major tests were questionnaires, which by their nature provide a great deal of scaffolding for the task of recalling the required information. For these tests we believe that this scaffolding aided students' recall to the point where it overwhelmed any differences produced by the display. We conclude that the immersive display provides better support for students' learning activities with this material. To our knowledge, this is the first formal study to show concrete evidence that visual immersion can improve learning for a non-science topic.
39

Credibility-based Binary Feedback Model for Grid Resource Planning

Chokesatean, Parasak 31 July 2008
In commercial grids, grid service providers (GSPs) improve their profitability by maintaining the smallest possible set of resources that meets client demand. Their goal is to maximize profits by optimizing resource planning, which requires feedback from clients to estimate demand for their service. The objective of this research is to develop an approach for building a useful value profile for a collection of heterogeneous grid clients. We use binary feedback as the theoretical framework for building the value profile, which can serve as a proxy for a demand function representing clients' willingness to pay for grid resources. However, clients may require incentives to provide feedback and deterrents against selfish behavior, such as misrepresenting their true preferences to obtain superior services at lower costs. To address this concern, we use credibility mechanisms to detect untruthful feedback and penalize insincere or biased clients, and we use game theory to study how cooperation can emerge. In this dissertation, we propose the use of credibility-based binary feedback to build value profiles that GSPs can use to plan their resources economically. The use of value profiles aims to benefit both GSPs and clients, and helps accelerate the adoption of commercial grids.
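As a toy illustration of credibility-weighted binary feedback, the sketch below aggregates 0/1 reports into a consensus and lowers the credibility of clients whose reports deviate from it. The update rule and all constants are illustrative assumptions, not the dissertation's mechanism.

```python
# Sketch: credibility-weighted aggregation of binary feedback.
def aggregate(reports, credibility):
    """reports: dict client -> 0/1; credibility: dict client -> weight (mutated)."""
    total = sum(credibility[c] for c in reports)
    consensus = sum(credibility[c] * r for c, r in reports.items()) / total
    for c, r in reports.items():            # penalize deviation from consensus
        credibility[c] = max(0.05,
                             credibility[c] - 0.2 * abs(r - round(consensus)))
    return consensus

cred = {"c1": 1.0, "c2": 1.0, "c3": 0.4}
print(aggregate({"c1": 1, "c2": 1, "c3": 0}, cred))  # ~0.83
print(cred)                                          # c3's weight drops to 0.2
```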
40

CAN CITYWIDE MUNICIPAL WIFI BE A FEASIBLE SOLUTION FOR LOCAL BROADBAND ACCESS IN THE US? AN EMPIRICAL EVALUATION OF A TECHNO-ECONOMIC MODEL

Huang, Kuang Chiu 24 July 2008
Citywide wireless fidelity (WiFi) offers an opportunity for municipalities and broadband Internet service providers (BISPs) to break through the duopoly broadband market structure that is prevalent in the US. Although municipal WiFi offers low deployment cost, short build time, high capacity, and wide coverage, competition in the local broadband market makes it difficult for such networks to be self-sustaining on public Internet access revenues alone. It is therefore useful not only to discuss the demographic features of existing WiFi projects but also to evaluate what is necessary for them to be economically sustainable. We study these questions by building a techno-economic model to determine the features, sustainability, and necessary subsidy of citywide WiFi for local broadband access, and we evaluate this model with data from several existing projects. To gain insight from previous experience and evaluate the feasibility of citywide WiFi, we carried out this research in three steps. First, we undertook a systematic study of all existing and operating citywide WiFi projects in the US, identifying the key geo-demographic differences between WiFi cities and non-WiFi cities and how private ISPs and municipalities implemented citywide projects with various business models and strategies. Next, we built a model linking access point density and network coverage, and used it to build a techno-economic model of municipal WiFi. Finally, we evaluated the effectiveness of the model using the projects identified in the empirical study and determined how much subsidy from a municipality would be reasonable to make WiFi projects sustainable. The outcome of this research is designed to assist policy makers, municipalities, and WiFi ISPs in evaluating, designing, and implementing a sustainable project.
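A minimal sketch of the techno-economic balance such a model evaluates: estimate the access points needed to cover an area, then the subsidy that closes the gap between annualized costs and subscription revenue. Every parameter here (AP radius, overlap factor, costs, fees) is an assumption for illustration, not a figure from the dissertation.

```python
# Sketch: access-point count and break-even subsidy for citywide WiFi.
import math

def required_aps(area_km2, ap_radius_m=150, overlap_factor=1.5):
    """APs needed to blanket an area, with overlap for reliable coverage."""
    cell_km2 = math.pi * (ap_radius_m / 1000) ** 2
    return math.ceil(overlap_factor * area_km2 / cell_km2)

def annual_subsidy(area_km2, subscribers, fee_per_month=20.0,
                   capex_per_ap=3000.0, opex_per_ap=600.0, amort_years=5):
    """Subsidy covering the gap between annualized cost and revenue."""
    aps = required_aps(area_km2)
    cost = aps * (capex_per_ap / amort_years + opex_per_ap)
    revenue = subscribers * fee_per_month * 12
    return max(0.0, cost - revenue)

print(required_aps(30.0))                       # APs for a 30 km^2 city
print(annual_subsidy(30.0, subscribers=2000))   # annual subsidy in dollars
```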
