  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Firewall Traversal in Mobile IPv6 Networks

Steinleitner, Niklas 09 October 2008 (has links)
No description available.
42

Reliable Vehicle-to-Vehicle Weighted Localization in Vehicular Networks

Unknown Date (has links)
Vehicular Ad Hoc Networks (VANETs) support wireless communication among vehicles using vehicle-to-vehicle (V2V) communication and between vehicles and infrastructure using vehicle-to-infrastructure (V2I) communication. This communication can be used to distribute safety and non-safety messages in the network. VANETs support a wide range of applications that rely on the messages exchanged within the network. Such applications enhance drivers' awareness and improve their driving experience. However, the efficiency of these applications depends on the availability of vehicles' real-time location information. A number of methods have been proposed to fulfill this requirement; however, designing a V2V-based localization method is challenged by the high mobility and dynamic topology of VANETs and by interference noise due to objects and buildings. Currently, vehicle localization is based on GPS technology, which is not always reliable, so V2V communication in a VANET can be used to enhance GPS positioning. With V2V-based localization, vehicles determine their locations by exchanging mobility data with neighboring vehicles. In this research work, we address the above challenges and design a realistic V2V-based localization method that extends centroid localization (CL) by assigning a weight value to each neighboring vehicle. This weight value is obtained using a weighting function that utilizes the following factors: 1) link-quality distance between the neighboring vehicles, 2) heading information, and 3) map information. We also use fuzzy logic to model neighboring vehicles' weight values. Because of the sensitivity and importance of the exchanged information, it is critical to ensure its integrity and reliability.
Therefore, in this work, we present the design and integration of a mobility-data verification component into the proposed localization method, so that only verified data from trusted neighboring vehicles are considered. We also use subjective logic to design a trust management system that evaluates the trustworthiness of neighboring vehicles based on the formulated subjective opinions. Extensive experimental work is conducted using simulation programs to evaluate the performance of the proposed methods. The results show improved location accuracy for varying vehicle densities and transmission ranges, as well as in the presence of malicious or untrusted neighboring vehicles. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
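The weighted extension of centroid localization described in this abstract can be sketched as follows. This is a minimal illustration only: the weight values are taken as given here, whereas the thesis derives them from link quality, heading and map information via fuzzy logic.

```python
# Sketch of weighted centroid localization (WCL), the baseline method the
# thesis extends. How the weights are computed is the thesis' contribution;
# this function only shows the weighted-average step.
def weighted_centroid(neighbors):
    """neighbors: list of (x, y, weight) tuples for trusted neighboring
    vehicles. Returns the estimated (x, y) position."""
    total = sum(w for _, _, w in neighbors)
    if total == 0:
        raise ValueError("no positive weights among neighbors")
    x = sum(xi * w for xi, _, w in neighbors) / total
    y = sum(yi * w for _, yi, w in neighbors) / total
    return x, y
```

With equal weights this reduces to plain centroid localization; skewing a weight pulls the estimate toward that neighbor, which is how the link-quality, heading and map factors influence the result.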
43

The future will be better tomorrow: a novel of apocalyptic sarcasm

Unknown Date (has links)
The Future Will Be Better Tomorrow is a satirical post-apocalyptic novel that examines the personal and social ironies that occur in a society unbalanced by an unexplained apocalyptic event. Working with a combination of dark humor and the terrifying realities of an apocalyptic event, in this case a blackout, the novel aims to challenge the machinery established by this particular subset of the science fiction genre. / Includes bibliography. / Thesis (M.F.A.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
44

Animalia

Unknown Date (has links)
The novel Animalia is a representation not just of human relationships but also of human beings’ relationships to other animals. While the story revolves around a family, the narrative as a whole is meant to bring the reader into a microcosmic ecosystem. Essentially, I am examining an ecosystem: not an ecosystem in the traditional sense, but an ecosystem nonetheless, because the narrative is a study of how varying species of heterotrophs interact with one another for both physical and emotional sustenance. Russell Water’s story is paramount, but the animals’ effect on one another is what lies below the peak and forms the mountain (an unintentional Hemingway reference). “It has often been observed that an object in a story does not derive its density of existence from the number and length of descriptions devoted to it, but from the complexity of its connections with the different characters” (Sartre 1210). Essentially, through complex and multiple connections between the human species and other species within Kingdom Animalia, I am attempting to develop an “ecosystem” that allows for narrative progression and the interconnection of relationships and thematic elements, which range from the capitalistic class system to natural selection. / Includes bibliography. / Thesis (M.F.A.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
45

Hybrid multicasting using Automatic Multicast Tunnels (AMT)

Alwadani, Dhaifallah January 2017 (has links)
Native Multicast plays an important role in distributing and managing the delivery of some of the most popular Internet applications, such as IPTV and media delivery. However, due to patchy support and the existence of multiple approaches to Native Multicast, its deployment is fragmented into isolated areas termed Multicast Islands. This renders Native Multicast unfit for use as an Internet-wide application. Instead, Application Layer Multicast, which does not have such network requirements but is more expensive in terms of bandwidth and overhead, can be used to connect the native multicast islands. This thesis proposes Opportunistic Native Multicast (ONM), which employs Application Layer Multicast (ALM), on top of a DHT-based P2P overlay network, and Automatic Multicast Tunnelling (AMT) to connect these islands. ALM is used for discovery and for initiating the AMT tunnels, which encapsulate the traffic travelling between islands' Primary Nodes (PNs). AMT was chosen for its added benefits, such as security and better support for traffic shaping and Quality of Service (QoS). While different approaches for connecting multicast islands exist, the system proposed in this thesis was designed with the following characteristics in mind: scalability, availability, interoperability, self-adaptation and efficiency. Importantly, by utilising AMT tunnels, this approach has unique properties that improve network security and management.
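The island-bridging idea in this abstract can be sketched at a very high level as follows. This is an illustrative model only: the dictionary overlay stands in for the DHT-based ALM layer, and the `tunnels` map stands in for real AMT tunnels; class and method names are invented for the sketch.

```python
# Toy model of ONM-style island bridging: each island's Primary Node (PN)
# registers in an overlay (stand-in for the DHT/ALM layer), discovers the
# PNs of other islands in the same multicast group, and relays traffic to
# them over per-island "tunnels" (stand-ins for AMT tunnels).
class PrimaryNode:
    def __init__(self, island_id, overlay):
        self.island_id = island_id
        self.overlay = overlay   # group -> {island_id: PrimaryNode}
        self.tunnels = {}        # island_id -> remote PrimaryNode

    def join_group(self, group):
        # ALM-style discovery: find PNs of other islands for this group,
        # then set up a bidirectional tunnel to each of them.
        self.overlay.setdefault(group, {})[self.island_id] = self
        for island, pn in self.overlay[group].items():
            if island != self.island_id:
                self.tunnels[island] = pn
                pn.tunnels[self.island_id] = self

    def forward(self, group, payload, delivered=None):
        # Deliver locally, then relay to tunneled islands not yet reached.
        delivered = [] if delivered is None else delivered
        delivered.append((self.island_id, payload))
        for pn in self.tunnels.values():
            if all(i != pn.island_id for i, _ in delivered):
                pn.forward(group, payload, delivered)
        return delivered
```

In the real system the payload would be a native multicast packet encapsulated in unicast for transit between islands; the sketch only shows the discovery-then-relay control flow.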
46

Discovery Of Application Workloads From Network File Traces

Yadwadkar, Neeraja 12 1900 (has links) (PDF)
An understanding of the Input/Output data access patterns of applications is useful in several situations. First, gaining insight into what applications are doing with their data at a semantic level helps in designing efficient storage systems. Second, it helps to create benchmarks that closely mimic realistic application behavior. Third, it enables autonomic systems, as the information obtained can be used to adapt the system in a closed loop. All these use cases require the ability to extract the application-level semantics of I/O operations. Methods such as modifying application code to associate I/O operations with semantic tags are intrusive. It is well known that network file system traces are an important source of information that can be obtained non-intrusively and analyzed either online or offline. These traces are a sequence of primitive file system operations and their parameters. Simple counting, statistical analysis or deterministic search techniques are inadequate for discovering application-level semantics in the general case, because of the inherent variation and noise in realistic traces. In this work, we describe a trace analysis methodology based on Profile Hidden Markov Models. We show that the methodology has powerful discriminatory capabilities that enable it to recognize applications based on the patterns in the traces, and to mark out regions in a long trace that encapsulate sets of primitive operations representing higher-level application actions. It is robust enough to work around discrepancies between training and target traces, such as differences in length and interleaving with other operations. We demonstrate the feasibility of recognizing patterns based on a small sampling of the trace, enabling faster trace analysis. Preliminary experiments show that the method is capable of learning accurate profile models on live traces in an online setting.
We present a detailed evaluation of this methodology in a UNIX environment using NFS traces of selected commonly used applications, such as compilations, as well as industrial-strength benchmarks such as TPC-C and PostMark, and discuss its capabilities and limitations in the context of the use cases mentioned above.
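The recognition step behind this methodology can be illustrated with a plain HMM rather than a full Profile HMM (which adds match/insert/delete states). The sketch below scores an operation trace against per-application models with the forward algorithm and classifies by maximum likelihood; all model parameters are invented for illustration.

```python
# Sketch: maximum-likelihood classification of a file-operation trace
# against per-application HMMs, using the forward algorithm. The thesis
# uses Profile HMMs; this plain HMM only shows the recognition idea.
def forward_prob(trace, init, trans, emit):
    """P(trace | model). init[s] and trans[s][t] are probabilities;
    emit[s][op] is the probability of state s emitting operation op."""
    states = list(init)
    alpha = {s: init[s] * emit[s].get(trace[0], 0.0) for s in states}
    for op in trace[1:]:
        alpha = {t: sum(alpha[s] * trans[s][t] for s in states)
                    * emit[t].get(op, 0.0)
                 for t in states}
    return sum(alpha.values())

def classify(trace, models):
    """models: {name: (init, trans, emit)}. Returns best-scoring name."""
    return max(models, key=lambda name: forward_prob(trace, *models[name]))
```

A production version would work in log space to avoid underflow on long traces; the probabilities here keep the sketch short.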
47

Improving The Communication Performance Of I/O Intensive And Communication Intensive Application In Cluster Computer Systems

Kumar, V Santhosh 10 1900 (has links)
Cluster computer systems assembled from commodity off-the-shelf components have emerged as a viable and cost-effective alternative to high-end custom parallel computer systems. In this thesis, we investigate how scalable performance can be achieved for database systems on clusters. In this context, we specifically considered database query processing to evaluate bottlenecks and suggest optimization techniques for obtaining scalable application performance. First, we systematically demonstrated that in a large cluster with high disk bandwidth, the processing capability and the I/O bus bandwidth are the two major performance bottlenecks in database systems. To identify and assess these bottlenecks, we developed a Petri net model of parallel query execution on a cluster. Once they were identified and assessed, we addressed the two bottlenecks by offloading certain application-related tasks to the processor on the network interface card, which shifts the bottleneck from the cluster processor to the I/O bus. Further, we propose a hardware scheme, network-attached disks, and a software scheme to achieve balanced utilization of resources such as the host processor, the I/O bus, and the processor on the network interface card. The proposed schemes result in a speedup of up to 1.47 compared to the base scheme, and ensure scalable performance up to 64 processors. Encouraged by the benefits of offloading application tasks to network processors, we explored the possibility of performing Bloom filter operations in network processors. We combined offloading of Bloom filter operations with the proposed hardware schemes to achieve up to a 50% reduction in execution time. The latter part of the thesis presents introductory experiments conducted with the Community Atmosphere Model (CAM), a large-scale parallel application used for global weather and climate prediction.
CAM is a communication-intensive application that involves collective communication of large messages. In our limited experiments, we chose CAM to study the effect of compression techniques and offloading techniques (as formulated for the database workloads) on the performance of communication-intensive applications. Due to time constraints, we considered only compression techniques for improving application performance; offloading could be taken up as a full-fledged research problem for further investigation. In our experiments, we found that compressing messages reduces message latencies and hence improves the execution time and scalability of the application. Without compression, performance measured on a 64-processor cluster resulted in a speedup of only 15.6. While lossless compression retains the accuracy and correctness of the program, it does not achieve a high compression ratio. We therefore propose a lossy compression technique that achieves higher compression yet retains the accuracy and numerical stability of the application while achieving scalable performance. This leads to a speedup of 31.7 on 64 processors, compared to a speedup of 15.6 without message compression. We establish that the accuracy, within a prescribed limit of variation, and the numerical stability of CAM are retained under lossy compression.
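One common way to realize lossy message compression of the kind described is precision reduction of floating-point payloads. The sketch below is an assumption on my part, not the thesis' actual scheme: it narrows float64 data to float16 before a (simulated) exchange, trading a bounded quantization error for a 4x smaller message.

```python
# Illustrative (not the thesis') lossy message compression: narrow the
# floating-point precision of a payload before sending, widen it back on
# receipt. Shows the size/accuracy trade-off behind lossy compression.
import numpy as np

def compress_lossy(msg: np.ndarray) -> bytes:
    # float16 payload is 4x smaller than the float64 original
    return msg.astype(np.float16).tobytes()

def decompress(buf: bytes) -> np.ndarray:
    return np.frombuffer(buf, dtype=np.float16).astype(np.float64)

msg = np.linspace(0.0, 1.0, 1024)   # stand-in for a collective's payload
buf = compress_lossy(msg)
out = decompress(buf)
ratio = msg.nbytes / len(buf)       # compression ratio
err = float(np.max(np.abs(out - msg)))  # bounded quantization error
```

Whether such an error bound preserves a model's accuracy and numerical stability is exactly what the thesis has to establish for CAM; the sketch only shows the mechanism.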
48

Link prediction and link detection in sequences of large social networks using temporal and local metrics

Cooke, Richard J. E. 01 November 2006 (has links)
This dissertation builds upon the ideas introduced by Liben-Nowell and Kleinberg in The Link Prediction Problem for Social Networks [42]. Link prediction is the problem of predicting between which unconnected nodes in a graph a link will form next, based on the current structure of the graph. The following research contributions are made:
• Highlighting the difference between the link prediction and link detection problems, which have been implicitly regarded as identical in current research. Despite hidden links and forming links having very significantly different metric values, they could not be distinguished from each other by a machine learning system using traditional metrics in an initial experiment. However, they could be distinguished from each other in a “simple” network (one where traditional metrics can be used for prediction successfully) using a combination of new graph analysis approaches.
• Defining temporal metric statistics by combining traditional statistical measures with measures commonly employed in financial analysis and traditional social network analysis. These metrics are calculated over time for a sequence of sociograms. It is shown that some of the temporal extensions of traditional metrics increase the accuracy of link prediction.
• Defining traditional metrics using different radii to those at which they are normally calculated. It is shown that this approach can increase the individual prediction accuracy of certain metrics, marginally increase the accuracy of a group of metrics, and greatly increase metric computation speed without sacrificing information content by computing metrics using smaller radii. It also solves the “distance-three task” (that common-neighbour metrics cannot predict links between nodes at a distance greater than three).
• Showing that the combination of local and temporal approaches to link prediction can lead to very high prediction accuracies.
Furthermore, in “complex” networks (ones where traditional metrics cannot be used for prediction successfully), local and temporal metrics become even more useful.
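The local and temporal ideas above can be sketched with the simplest metric family the dissertation builds on, common neighbours. The radius check and the moving average below are minimal stand-ins; the dissertation's temporal statistics (including financial-analysis-style measures) are richer than this.

```python
# Sketch: a common-neighbours link-prediction score, a bounded-radius
# reachability check (the "different radii" idea), and a simple temporal
# statistic averaged over a sequence of sociograms.
def common_neighbours(adj, u, v):
    """Number of shared neighbours of u and v in adjacency dict adj."""
    return len(adj.get(u, set()) & adj.get(v, set()))

def within_radius(adj, u, v, r):
    """Breadth-first check that v lies within r hops of u."""
    frontier, seen = {u}, {u}
    for _ in range(r):
        frontier = {w for x in frontier for w in adj.get(x, set())} - seen
        if v in frontier:
            return True
        seen |= frontier
    return False

def temporal_score(adj_seq, u, v):
    """Moving average of the common-neighbour score over time."""
    scores = [common_neighbours(adj, u, v) for adj in adj_seq]
    return sum(scores) / len(scores)
```

Ranking all unconnected pairs by such scores, and comparing the top of the ranking against the links that actually form in the next sociogram, is the standard evaluation loop for this family of methods.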
49

Incentive Strategies and Algorithms for Networks, Crowds and Markets

Dayama, Pankaj January 2015 (has links) (PDF)
This work is motivated by several modern applications involving social networks, crowds, and markets, and focuses on the theme of designing effective incentive strategies for these applications. Viral marketing is receiving much attention from practicing marketers and researchers alike. While not a new idea, it has come to the forefront because of several effects: products have become more complex, making buyers rely increasingly on the opinions of their peers; consumers have come to distrust advertising; and Web 2.0 has revolutionized the way people connect, communicate and share. With power shifting to consumers, it has become important for companies to devise effective viral marketing strategies. Incentives are also a critical aspect of crowdsourcing tasks and play a crucial role in attracting, motivating and sustaining participation. The thesis addresses the following problems. (i) Optimal Control of Information Epidemics: We address two problems concerning information propagation in a population: a) how to maximize the spread of a given message in the population within a stipulated time, and b) how to create a given level of buzz, measured by the fraction of the population engaged in conversation on a topic of interest, at a specified time horizon. (ii) Optimal Control Strategies for Social Influence (SI) Marketing: We investigate four SI strategies, namely recommendation programs, referral programs, consumer reviews and campaigns on online forums. The campaign is assumed to be of finite duration, and the objective is to maximize profit, the (undiscounted) revenue minus the expenditure on the SI strategy under consideration, over the campaign duration. For each SI strategy, we focus on its timing, i.e., determining at what times to execute it. We address two important questions: a) how to execute a given SI strategy optimally, and b) having executed it so, what gains does it lead to?
(iii) Optimal Mix of Incentive Strategies on Social Networks: The reach of a product in a population can be influenced by offering (a) direct incentives to influence the buying behavior of potential buyers and (b) referral rewards to exploit the impact of social influence in inducing a purchasing decision. The company is interested in an optimal mix of these incentive programs. We report results on the structure of optimal strategies for the company, with significant practical implications. (iv) Truthful Tractable Mechanisms with Applications to Crowdsourcing: We focus on crowdsourcing applications that involve specialized tasks for which the planner hardly has any idea about crowdworkers' costs, for example tagging geographical regions with air pollution levels or the severity level of an Ebola-like disease. The mechanisms have to be robust to untruthful bidding by the crowdworkers. In our work, we propose tractable allocation algorithms that are monotone, leading to the design of truthful mechanisms that can be successfully deployed in such applications.
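The monotone-allocation route to truthfulness mentioned in (iv) can be illustrated with a classical textbook mechanism, not the thesis' own algorithms: hire the k lowest bidders and pay each winner the (k+1)-th lowest bid, their critical value. Lowering one's bid never hurts (the allocation is monotone), and a winner's payment does not depend on their own bid, so truthful bidding is a dominant strategy.

```python
# Sketch of a simple truthful reverse auction for crowdsourcing: select
# the k cheapest workers, pay each the critical value (the (k+1)-th lowest
# bid). This is the standard monotone-allocation + critical-payment recipe,
# used here only to illustrate the principle behind the thesis' mechanisms.
def hire_k_truthful(bids, k):
    """bids: {worker: declared cost}. Returns (winners, payment each)."""
    ranked = sorted(bids, key=bids.get)      # cheapest first
    winners = ranked[:k]
    # Critical value: the bid at which a winner would stop winning.
    payment = bids[ranked[k]] if len(ranked) > k else float("inf")
    return winners, payment
```

For specialized tasks with unknown worker costs, this kind of payment rule is what removes the incentive to overstate costs.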
50

Visual simultaneous localization and mapping in a noisy static environment

Makhubela, J. K. 03 1900 (has links)
M. Tech. (Department of Information and Communication Technology, Faculty of Applied and Computer Sciences), Vaal University of Technology / Simultaneous Localization and Mapping (SLAM) has seen tremendous interest in the research community in recent years due to its ability to make a robot truly independent in navigation. Visual Simultaneous Localization and Mapping (VSLAM) equips an autonomous mobile robot with a vision sensor, such as a monocular, stereo vision, omnidirectional or Red Green Blue Depth (RGBD) camera, to localize within and map an unknown environment. The purpose of this research is to address the problem of environmental noise, such as light intensity in a static environment, which can render a VSLAM system ineffective. In this study, we introduce a light-filtering algorithm into the VSLAM method to reduce the amount of noise and improve the robustness of the system in a static environment, together with the Extended Kalman Filter (EKF) algorithm for localization and mapping and the A* algorithm for navigation. Simulations are used for the experimental evaluation. Experimental results show detection of 60% of the total landmarks or land features within a simulation environment, and a root mean square error (RMSE) of 0.13 m, which is small compared with other SLAM systems from the literature. The inclusion of the light-filtering algorithm has enabled the VSLAM system to navigate in an obscure environment.
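Of the three components named in this abstract, the A* navigation step is the most self-contained and can be sketched directly. The grid size, obstacle set, and Manhattan heuristic below are illustrative; the EKF-SLAM and light-filtering components are not reproduced.

```python
# Sketch: A* path search on a 10x10 occupancy grid with a Manhattan
# heuristic, the navigation component named in the abstract.
import heapq

def a_star(blocked, start, goal, size=10):
    """blocked: set of impassable (row, col) cells. Returns the shortest
    4-connected path from start to goal as a list of cells, or None."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    best_g = {start: 0}
    while open_heap:
        _, g, cur, path = heapq.heappop(open_heap)
        if cur == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if nxt in blocked:
                continue
            if not (0 <= nxt[0] < size and 0 <= nxt[1] < size):
                continue
            if g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                heapq.heappush(
                    open_heap, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None
```

In a VSLAM pipeline the occupancy grid would come from the map built by the EKF, with the light-filtering step cleaning the observations that feed it.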