71

An Access Control and Trust Management Framework for Loosely-Coupled Multidomain Environment

Zhang, Yue 06 May 2011 (has links)
Multidomain environments, in which multiple organizations interoperate with each other, are becoming a reality, as can be seen in emerging Internet-based enterprise applications. Access control to ensure secure interoperation in such an environment is a crucial challenge. A multidomain environment can be categorized as either tightly-coupled or loosely-coupled. The access control challenges in the loosely-coupled environment have not been studied adequately in the literature. In a loosely-coupled environment, different domains do not know each other before they interoperate, so traditional approaches based on users' identities cannot be applied directly. Motivated by this, researchers have developed several attribute-based authorization approaches to dynamically build trust between previously unknown domains. However, these approaches all focus on building trust between individual requesting users and the resource-providing domain. We demonstrate that such approaches are inefficient when the requests are issued by a set of users assigned to a functional role in the organization. Moreover, preserving the principle of security has long been recognized as a challenging problem when facilitating interoperation. Existing research has mainly addressed this problem in a tightly-coupled environment, where a global policy is used to preserve the principle of security. In this thesis, we propose a role-based access control and trust management framework for loosely-coupled environments. In particular, we allow users to specify interoperation requests in terms of requested permissions and propose several role mapping algorithms to map the requested permissions onto roles in the resource-providing domain. We then propose a Simplify algorithm that simplifies the distributed proof procedures when a set of requests is issued according to the functions of some roles in the requesting domain.
Our experiments show that our Simplify algorithm significantly simplifies such procedures when the total number of credentials in the environment is sufficiently large, which is quite common in practical applications. Finally, we propose a novel policy integration approach that uses the special semantics of hybrid role hierarchies to preserve the principle of security. The dissertation concludes with a brief discussion of an implemented prototype of our framework.
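The abstract does not reproduce the thesis's role mapping algorithms; as a rough illustration of the underlying idea only, a greedy sketch (with hypothetical role and permission names, not taken from the thesis) can map a set of requested permissions onto roles in the resource-providing domain:

```python
# Hypothetical sketch of permission-to-role mapping, not the thesis's
# algorithm: greedily pick roles whose permission sets cover the request.
def map_permissions_to_roles(requested, roles):
    """Return a list of role names whose combined permissions cover
    `requested`, or None if some permission is not grantable."""
    remaining = set(requested)
    chosen = []
    while remaining:
        # pick the role covering the most still-uncovered permissions
        best = max(roles, key=lambda r: len(roles[r] & remaining))
        gained = roles[best] & remaining
        if not gained:
            return None  # no role grants any remaining permission
        chosen.append(best)
        remaining -= gained
    return chosen

roles = {
    "clerk":   {"read_record"},
    "auditor": {"read_record", "read_log"},
    "admin":   {"read_record", "read_log", "write_record"},
}
print(map_permissions_to_roles({"read_record", "read_log"}, roles))
```

Greedy set cover is a natural first cut here because mapping permissions to a minimal set of roles is an instance of the (NP-hard) set cover problem; the thesis's actual algorithms may differ.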
72

The Real Options to Technology Management: Strategic Options for 3G Wireless Network Architecture and Technologies

Kim, Hak Ju 17 February 2011 (has links)
The increasing demand for high-quality multimedia services has challenged the wireless industry to rapidly develop wireless network architectures and technologies. This demand has left wireless service providers struggling with the network migration dilemma of how best to deliver high-quality multimedia services. Currently, there are many alternative wireless network technologies, such as TDMA, GSM, GPRS, EDGE, WCDMA, and cdma2000. These technology choices require close examination when making strategic decisions about network evolution. This study assesses the technology options for wireless networks to establish next-generation (i.e., 3G) networks, based on the real options approach (ROA), and discusses wireless network operators' technology migration strategies. The goal of this study is to develop a theoretical framework for wireless network operators to support their strategic decision-making process when considering technology choices. The study begins by tracing the evolution of wireless network technologies to place them in the proper context, and continues by developing the strategic technology option model (STOM) using ROA as an assessment tool. Finally, STOM is applied to the world and US wireless industries to formulate technology migration strategies. Consequently, this study will help wireless network service providers make strategic decisions when upgrading or migrating toward next-generation network architectures by showing the possible network migration paths and their relative value. Through this study, network operators can begin to think in terms of the available network options and to maximize the overall gain in their networks.
Since the migration issues concerning the next generation wireless network architecture and technologies remain the subject of debate, with no substantial implementation in progress now, this study will help the industry to decide where best to focus its efforts and can be expanded for further research.
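The STOM model itself is not given in the abstract; as a toy illustration of the real options approach it builds on, a one-period binomial valuation (hypothetical numbers, not from the study) shows how the option to defer an upgrade is priced:

```python
# Toy sketch of the real options approach (ROA), not the study's STOM
# model: value the option to defer a network upgrade one period,
# committing the cost only if the project value turns out to exceed it.
def defer_option_value(v_up, v_down, p_up, cost, rate):
    """Present value of waiting one period before investing `cost`."""
    payoff_up = max(v_up - cost, 0.0)      # invest only if worthwhile
    payoff_down = max(v_down - cost, 0.0)  # otherwise walk away
    expected = p_up * payoff_up + (1 - p_up) * payoff_down
    return expected / (1 + rate)           # discount back one period

# Hypothetical: upgrade costs 100; next year the 3G business is worth
# either 150 or 70 with equal probability, at a 5% discount rate.
value = defer_option_value(150.0, 70.0, 0.5, 100.0, 0.05)
print(round(value, 2))
```

The point of the ROA framing is that the ability to walk away in the bad state gives waiting a positive value even when committing immediately would have near-zero or negative expected NPV.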
73

Providing Fairness Through Detection and Preferential Dropping of High Bandwidth Unresponsive Flows

Chatranon, Gwyn 17 February 2011 (has links)
Stability of the Internet today depends largely on cooperation between end hosts, which employ the TCP (Transmission Control Protocol) protocol at the transport layer, and network routers along an end-to-end path. In the past several years, however, various types of traffic, including streaming media applications, have been increasingly deployed over the Internet. Such traffic is mostly based on UDP (User Datagram Protocol) and usually employs no, or only very limited, end-to-end congestion and flow control mechanisms. These applications can unfairly consume a greater share of bandwidth than competing responsive flows such as TCP traffic, leading to unfairness and congestion collapse. To avoid substantial memory requirements and complexity, fair Active Queue Management (AQM) schemes utilizing no or only partial flow state information have been proposed over the past several years to solve these problems. These schemes, however, exhibit several problems under different circumstances. This dissertation presents two fair AQM mechanisms, BLACK and AFC, that overcome the problems and limitations of the existing schemes. Both BLACK and AFC need to store only a small amount of state information to maintain and exercise their fairness mechanisms. Extensive simulation studies show that both schemes outperform the other schemes in terms of throughput fairness under a large number of scenarios. Not only can they handle multiple unresponsive flows, but they also improve fairness among TCP connections with different round-trip delays. AFC, with a little more overhead than BLACK, provides additional advantages: it achieves good fairness in scenarios with traffic of different sizes and bursty traffic, and provides smoother transfer rates for unresponsive flows, which usually carry real-time traffic.
This research also includes a comparative study of existing techniques for estimating the number of active flows, a crucial component of some fair AQM schemes, including BLACK and AFC. A further contribution of this dissertation is the first comprehensive evaluation of fair AQM schemes in the presence of various types of TCP-friendly traffic.
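The exact BLACK and AFC algorithms are not given in the abstract; in their general spirit, a simplified sketch (hypothetical parameters, not the dissertation's mechanisms) of sample-based detection and preferential dropping of high-bandwidth flows looks like this:

```python
import random

# Simplified sketch in the spirit of sample-based fair AQM (NOT the
# actual BLACK/AFC algorithms): estimate each flow's share from sampled
# arrivals and preferentially drop packets of flows above fair share.
class FairDropper:
    def __init__(self, fair_share, drop_factor=0.5, seed=1):
        self.fair_share = fair_share  # e.g. 1 / (number of active flows)
        self.drop_factor = drop_factor
        self.counts = {}              # sampled packet count per flow
        self.total = 0
        self.rng = random.Random(seed)

    def on_arrival(self, flow_id):
        """Return True if this packet should be dropped."""
        self.counts[flow_id] = self.counts.get(flow_id, 0) + 1
        self.total += 1
        share = self.counts[flow_id] / self.total
        excess = share - self.fair_share
        if excess <= 0:
            return False              # flow is within its fair share
        # drop probability grows with how far the flow is over fair share
        return self.rng.random() < self.drop_factor * excess / share

dropper = FairDropper(fair_share=0.25)
# one unresponsive flow ("udp") sends 4x the packets of each TCP flow
arrivals = (["udp"] * 4 + ["tcp1", "tcp2", "tcp3"]) * 200
drops = {}
for f in arrivals:
    if dropper.on_arrival(f):
        drops[f] = drops.get(f, 0) + 1
print(sorted(drops, key=drops.get, reverse=True)[0])  # "udp" dropped most
```

Only the unresponsive flow exceeds its fair share here, so only its packets are ever dropped, which is the qualitative behavior a fair AQM scheme targets.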
74

Topological Design of Multiple Virtual Private Networks Utilizing Sink-Tree Paths

Srikitja, Anotai 17 February 2011 (has links)
With the deployment of MultiProtocol Label Switching (MPLS) over core backbone networks, it is possible for a service provider to build Virtual Private Networks (VPNs) supporting various classes of service with QoS guarantees. Efficiently mapping the logical layout of multiple VPNs over a service provider network is a challenging traffic engineering problem. The use of sink-tree (multipoint-to-point) routing paths in an MPLS network makes the VPN design problem different from traditional design approaches, in which a full mesh of point-to-point paths is often the choice. The clear benefits of using sink-tree paths are the reduction in the number of label switched paths and the bandwidth savings due to larger granularities of bandwidth aggregation within the network. In this thesis, the design of multiple VPNs over an MPLS-like infrastructure network using sink-tree routing is formulated as a mixed integer programming problem that simultaneously finds a set of VPN logical topologies and their dimensions to carry multi-service, multi-hour traffic from various customers. Such a problem formulation is NP-hard. A heuristic path selection algorithm is proposed here to scale the VPN design problem by choosing a small but good candidate set of feasible sink-tree paths over which the optimal routes and capacity assignments are determined. The proposed heuristic is clearly shown to speed up the optimization process, and the solution can be obtained within a reasonable time for a realistic-size network. Nevertheless, when a large number of VPNs are laid out simultaneously, a standard optimization approach has limited scalability. Hence, a heuristic termed the Minimum-Capacity Sink-Tree Assignment (MCSTA) algorithm is proposed to approximate the optimal bandwidth and sink-tree route assignment for multiple VPNs in polynomial time.
Numerical results demonstrate that the MCSTA algorithm yields a good solution within a small error, and sometimes yields the exact solution. Lastly, the proposed VPN design models and solution algorithms are extended to multipoint traffic demands, including multipoint-to-point and broadcast connections.
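The claimed reduction in the number of label switched paths is easy to make concrete. As a back-of-the-envelope sketch (independent of the thesis's MCSTA algorithm), a full mesh needs one point-to-point LSP per ordered node pair, while sink-tree routing needs only one multipoint-to-point tree rooted at each egress:

```python
# Why sink-tree paths reduce label switched paths: counting LSPs for a
# full mesh versus one multipoint-to-point tree per egress node.
def full_mesh_lsps(n):
    return n * (n - 1)  # one point-to-point LSP per ordered node pair

def sink_tree_lsps(n):
    return n            # one sink tree rooted at each egress node

for n in (5, 10, 20):
    print(n, full_mesh_lsps(n), sink_tree_lsps(n))
```

The gap grows quadratically with network size, which is why sink-tree routing also enables coarser-grained bandwidth aggregation inside the network.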
75

A Neural Network Approach to Treatment Optimization

Sanguansintukul, Siripun 17 February 2011 (has links)
An approach for optimizing medical treatment as a function of measurable patient data is analyzed using a two-network system. The two-network approach is inspired by the field of control systems: one network, called a patient model (PM), is used to predict the outcome of the treatment, while the other, called a treatment network (TN), is used to optimize the predicted outcome. The system is tested with a variety of functions: a single objective criterion (with and without interaction between treatments) and multi-objective criteria (with and without interaction between treatments). Data are generated using a simple Gaussian function for some studies and with functions derived from the medical literature for other studies. The experimental results can be summarized as follows: 1) The relative importance of symptoms can be adjusted by applying different coefficient weights in the PM objective functions; different coefficients modulate the tradeoffs among symptoms, and higher coefficients in the cost function result in higher accuracy. 2) When different coefficients are applied to the objective functions of the TN and both objective functions are quadratic, the experimental results suggest that the higher the coefficient, the better the symptom outcome. 3) The simulation results of training the TN with a quadratic cost function and a quartic cost function indicate threshold-like behavior in the quartic cost function when the dose is in the neighborhood of the threshold. 4) In general, the network performs better than the quadratic model; however, it encounters local minima problems. Ultimately, the results offer a proof of concept that this framework might be a useful tool for augmenting a clinician's decisions when selecting dose strengths for an individual patient's needs.
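The control-systems idea of optimizing a treatment through a fixed outcome predictor can be sketched in a few lines. This is a hypothetical toy (a hand-written quadratic stands in for the trained patient model, and plain gradient descent stands in for the treatment network), not the thesis's implementation:

```python
# Toy sketch of the two-network idea: a frozen "patient model" predicts
# symptom severity from a dose, and the treatment side searches for the
# dose minimizing the prediction via gradient descent on the model output.
def patient_model(dose):
    """Stand-in for a trained PM: symptom severity is quadratic in dose,
    minimized at an ideal dose of 3.0 unknown to the optimizer."""
    return (dose - 3.0) ** 2 + 1.0

def optimize_dose(start=0.0, lr=0.1, steps=200, eps=1e-5):
    dose = start
    for _ in range(steps):
        # numerical gradient of predicted symptom w.r.t. the dose
        grad = (patient_model(dose + eps) - patient_model(dose - eps)) / (2 * eps)
        dose -= lr * grad
    return dose

best = optimize_dose()
print(round(best, 3))  # converges near the ideal dose 3.0
```

With a non-convex patient model, the same loop can stall in a poor basin depending on the starting dose, which mirrors the local minima problem the abstract reports.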
76

On Proximity Based Sub-Area Localization

Korkmaz, Aylin 24 August 2011 (has links)
A localization system can save lives in the aftermath of an earthquake, position people or valuable assets during a building fire, or track airplanes, among many other attractive applications. The Global Positioning System (GPS) is the most popular localization system and can provide 7-10 meter localization accuracy for outdoor users; however, it has certain drawbacks in indoor environments. Alternatively, since wireless networks are becoming pervasive and have been densely deployed indoors for communication among various types of devices, it is convenient to exploit them for the localization of people and other assets. Proximity-based localization, which estimates locations based on closeness to known reference points, coupled with a widely deployed wireless technology, can reduce the cost and effort of localization in local and indoor areas. In this dissertation, we propose a proximity-based localization algorithm that exploits knowledge of the overlapping coverage of known monitoring stations. We call this algorithm Sub-Area Localization (SAL). We present a systematic study of proximity-based localization by defining the factors and parameters that affect localization performance in terms of metrics such as accuracy and efficiency. Then, we demonstrate that SAL can be used in multi-floor buildings to take advantage of infrastructure elements deployed across floors, reducing the overall cost (in terms of the number of monitoring stations required) without harming accuracy. Finally, we present a case study of how SAL can be used for spatial spectrum detection in cognitive wireless networks.
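The core set-based intuition behind exploiting overlapping coverage can be sketched briefly. This is a hypothetical illustration (station and room names invented, and it is not the SAL algorithm itself): the target must lie in every detecting station's coverage and outside every non-detecting station's coverage:

```python
# Hypothetical sketch of proximity-based sub-area localization: the
# target's possible sub-areas are the intersection of the coverages of
# stations that detect it, minus the coverages of stations that do not.
def locate(coverage, heard_by):
    candidates = set.intersection(*(set(coverage[s]) for s in heard_by))
    for station, covered in coverage.items():
        if station not in heard_by:
            candidates -= set(covered)  # target is outside this coverage
    return candidates

coverage = {
    "A": {"room1", "room2"},
    "B": {"room2", "room3"},
    "C": {"room3", "room4"},
}
# heard by A and B but not C: only room2 is consistent
print(locate(coverage, heard_by={"A", "B"}))
```

Denser overlaps carve the floor plan into smaller sub-areas, so accuracy improves with the number of distinct coverage intersections rather than with signal-strength precision.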
77

Information visualization for knowledge repositories: Applications and impacts

Zhu, Bin January 2002 (has links)
Information technology plays a supportive role in knowledge management: it captures and stores knowledge in knowledge repositories, and it also improves access to the knowledge stored there. The codification strategy in knowledge management (Hansen et al., 1999) and the capturing functionality of information technology have made more and more knowledge repositories available. However, the utility of a knowledge repository may largely depend on how information is presented and requested through its interfaces. The interface requirements of a knowledge repository vary with the content of the knowledge and the media type in which the repository stores it. This dissertation provides an example of selecting appropriate information visualization and analysis technology to facilitate effective knowledge retrieval from different types of knowledge repository. It identifies four types of knowledge repository, each of which has unique interface requirements, and applies various visualization technologies to fulfill those requirements. The interfaces developed facilitate knowledge retrieval by helping users specify their information needs or by supporting their information browsing behavior. In addition, the dissertation presents four empirical studies evaluating the systems developed. Since the lack of evaluation studies in the field of information visualization has become an issue, these empirical studies also provide examples of approaches to evaluating different aspects of an interface.
78

Virtual mentor and media structuralization theory

Zhang, Dongsong January 2002 (has links)
In the 21st century, e-Learning has been widely used in both academic education and corporate training. However, many e-Learning systems present multimedia instructional material in a static, passive, and unstructured manner, giving learners little control over the learning content and process. As a result, the higher effectiveness and greater societal potential of e-Learning are hindered. This thesis makes two primary contributions to addressing this problem. From a theoretical perspective, we propose a new concept called the "Virtual Mentor (VM)" and a research framework called Media StructuRalization Theory (MSRT). The VM refers to a multimedia-based e-Learning environment that emphasizes interaction, flexibility, and self-direction. The MSRT aims to provide guidance toward the effective design and implementation of virtual mentor systems. From a technical perspective, we have developed a prototype VM system called Learning by Asking (LBA), which integrates various information technologies. Its major technical innovation is the adoption of a novel natural language approach to content-based video indexing and retrieval. We conducted empirical studies to validate several propositions of the MSRT. The results demonstrate that structuring multimedia content and using instructional videos significantly improved learning outcomes. The learning performance of students in an e-Learning environment with content structuring and synchronized multimedia instruction is comparable to that of students in traditional classrooms. Our research was enabled by the LBA system, which provides a learner-centered, self-paced, and interactive online learning environment. To enhance personalized and just-in-time learning, the LBA system allows learners to ask questions in conversational English and watch appropriate multimedia instruction, retrieved by LBA, that addresses their interests.
Traditional video indexing and retrieval approaches are based on scene changes or other image cues in videos that are not normally available in video lectures. We propose a novel two-phase natural language approach to identifying relevant video clips for content-based video indexing and retrieval. It integrates natural language processing, named entity extraction, frame-based indexing, and information retrieval techniques. The preliminary evaluation reveals that this approach is better than the traditional keyword-based approach in terms of precision and recall.
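The two-phase natural language approach itself is not detailed in the abstract; as a minimal stand-in for the retrieval side (a toy inverted index over invented transcript snippets, not the LBA system), content-based clip retrieval and its precision/recall scoring can be sketched as:

```python
# Toy sketch of content-based clip retrieval over lecture transcripts
# (hypothetical data, not the LBA system): build an inverted index from
# words to clips, answer a query by intersecting posting lists, and score
# the hits with precision and recall against a known relevant set.
def build_index(clips):
    index = {}
    for clip_id, text in clips.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(clip_id)
    return index

def retrieve(index, query):
    postings = [index.get(w.lower(), set()) for w in query.split()]
    return set.intersection(*postings) if postings else set()

clips = {
    "c1": "neural networks and backpropagation",
    "c2": "indexing video lectures with neural networks",
    "c3": "relational databases and indexing",
}
index = build_index(clips)
hits = retrieve(index, "neural indexing")
relevant = {"c2"}
precision = len(hits & relevant) / len(hits)
recall = len(hits & relevant) / len(relevant)
print(hits, precision, recall)
```

Keyword intersection is the baseline the thesis improves on; its NLP approach adds named entity extraction and frame-based indexing on top of retrieval machinery of roughly this shape.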
79

Book review of Stonier (1997), Information and Meaning: An Evolutionary Perspective. In: World Futures, vol. 53, pp. 367-376.

Hjørland, Birger January 1999 (has links)
No description available.
80

Information Science, Epistemology and the Knowledge Society. Invited speech, INFO 2008, Cuba. April 2008.

Hjørland, Birger January 2008 (has links)
The point of departure of this presentation is the challenges facing Information Science and the information profession. It provides a strategy for understanding and addressing information problems, and a vision of the role of information professionals in the Knowledge Society. The basic assumption is that any question put to a library or information service can be viewed from different perspectives, and that the ability to identify, evaluate and negotiate different perspectives is how advanced information services differentiate themselves from more primitive kinds of information services. Although it is technologically advanced that IR systems can retrieve documents based on, for example, combinations of words, such systems are primitive relative to what is needed when information is sought. Knowledge itself is organized socially according to the social division of labour in society (e.g. in disciplines and trades). It is also organized intellectually into theories. Although facts exist, the best strategy for information science is to assume the principle of fallibilism and to consider all knowledge as provisional and in principle open to revision and modification. 'Information' should thus be understood as 'knowledge claims' produced on the basis of certain preunderstandings and interests. Information services should therefore not just communicate fragmented claims, but should contribute to mapping the structures in which knowledge is organized, and also provide the contextual information needed for evaluating specific knowledge claims. Different perspectives on a given issue tend to develop their own languages, genres, documents, citation patterns, symbolic systems and cultural products. By considering such connections, many indicators can be used to identify different perspectives on a given topic.
The basic task for libraries and information science is to help users conceptualize and search for knowledge claims, based on the understanding that any given claim is always produced from a specific perspective, which is often connected with specific social interests and specific epistemological assumptions.
