1

Dynamic anticipatory mobility management for personal communication networks

Liu, Liang Qui January 1999 (has links)
No description available.
2

Design of optical networks: performance bounds, complexity and algorithms

Saad, Mohamed Elsayed Mostafa. Luo, Zhi-Quan. January 2004 (has links)
Thesis (Ph.D.)--McMaster University, 2005. Advisor: Zhi-Quan Luo. Includes bibliographical references (leaves 136-145). Also available online.
3

Women's internet usage in university settings in Malaysia and the United Kingdom : a comparative case study

Husain, Kalthom January 2010 (has links)
The revolution in information technology has resulted in innovations that are having increasingly important effects on the lives of their users, both personal and professional. In particular, the Internet and associated applications such as email and the World Wide Web have had profound impacts over the twenty or so years that they have been in widespread use, raising issues about various types of digital divide, including that between more and less developed nations. This thesis reports a study carried out on two continents, Europe and Asia, to compare and contrast the adoption of these innovations in a roughly comparable context, that of a university department. Interviews were carried out with 27 women drawn from administrative and academic staff at the University of Brighton (UK) and Kolej Universiti Teknikal Kebangsaan (Malaysia).
4

A linear logic approach to RESTful web service modelling and composition

Zhao, Xia January 2013 (has links)
RESTful Web Services are gaining increasing attention from both the service and the Web communities. The rising number of services being implemented and made available on the Web is creating a demand for modelling techniques that can abstract REST design from the implementation in order better to specify, analyse and implement large-scale RESTful Web systems. It can also help by providing suitable RESTful Web Service composition methods which can reduce costs by efficiently re-using the large number of services that are already available and by exploiting existing services for complex business purposes. This research considers RESTful Web Services as state transition systems and proposes a novel Linear Logic based approach, the first of its kind, for both the modelling and the composition of RESTful Web Services. The thesis demonstrates the capabilities of resource-sensitive Linear Logic for modelling five key REST constraints and proposes a two-stage approach to service composition involving Linear Logic theorem proving and proof-as-process based on the π-calculus. Whereas previous approaches have focused on each aspect of the composition of RESTful Web Services individually (e.g. execution or high-level modelling), this work bridges the gap between abstract formal modelling and application-level execution in an efficient and effective way. The approach not only ensures the completeness and correctness of the resulting composed services but also produces their process models naturally, providing the possibility to translate them into executable business languages. Furthermore, the research encodes the proposed modelling and composition method into the Coq proof assistant, which enables both the Linear Logic theorem proving and the π-calculus extraction to be conducted semi-automatically. The feasibility and versatility studies performed in two disparate user scenarios (shopping and biomedical service composition) show that the proposed method provides a good level of scalability when the numbers of services and resources grow.
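As an illustration of this style of modelling (the resource names below are invented for the example, not taken from the thesis), a single RESTful interaction can be written as a linear implication in which the pre-state and request are consumed and the post-state and response are produced; composing services then corresponds to chaining such implications in a Linear Logic proof.

```latex
% Illustrative only: one REST interaction as a resource-sensitive transition.
% A pending order consumed together with a payment request yields a paid
% order and a receipt representation.
\[
  \mathit{Order}_{\mathrm{pending}} \otimes \mathit{PaymentRequest}
  \;\multimap\;
  \mathit{Order}_{\mathrm{paid}} \otimes \mathit{Receipt}
\]
```

Because resources joined by ⊗ are consumed exactly once, a proof that the goal state is derivable from the initial resources doubles as a composition plan, which the π-calculus extraction stage described in the abstract can then turn into a process model.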
5

Interference mitigation in cognitive femtocell networks

Kpojime, Harold Orduen January 2015 (has links)
Femtocells have been introduced as a solution to poor indoor coverage in cellular communication and have attracted considerable interest from network operators and stakeholders. Femtocells are designed to co-exist alongside macrocells, providing, among other benefits, improved spatial frequency reuse and higher spectrum efficiency. When deployed in a two-tier architecture with macrocells, it is therefore necessary to mitigate the inherent co-tier and cross-tier interference. The integration of cognitive radio (CR) into femtocells gives them the ability to adapt dynamically to varying network conditions through learning and reasoning. This research focuses on exploiting cognitive radio in femtocells to mitigate the mutual interference caused in the two-tier architecture. It presents original contributions to interference mitigation through practical approaches, including a power control scheme in which a femtocell adaptively controls its transmit power level to reduce the interference it causes in the network. This is especially useful because femtocells are deployed by users, so the scheme mitigates the interference that arises from their blind placement in an indoor environment. Hybrid interference mitigation schemes that combine power control with resource allocation and scheduling are also implemented. In a joint threshold-power-based admittance and contention-free resource allocation scheme, the mutual interference between a Femtocell Access Point (FAP) and nearby User Equipments (UEs) is mitigated through admission control. In addition, a hybrid scheme in which FAPs opportunistically use the Resource Blocks (RBs) of Macrocell User Equipments (MUEs), based on their traffic load, is employed. Simulation analysis shows improvements when these schemes are applied, with emphasis on Long Term Evolution (LTE) networks, especially in terms of Signal to Interference plus Noise Ratio (SINR).
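A minimal sketch of an adaptive power-control loop of the kind described above is shown below. It is an assumption-laden illustration rather than the thesis's actual scheme: the log-distance path-loss model, the 1 dB step size and the interference cap are all invented for the example.

```python
import math

def path_loss_db(distance_m, exponent=3.0, pl_1m_db=37.0):
    """Simple log-distance path-loss model (illustrative parameters only)."""
    return pl_1m_db + 10.0 * exponent * math.log10(max(distance_m, 1.0))

def adapt_fap_power(p_max_dbm, mue_distance_m, interference_cap_dbm,
                    p_min_dbm=-10.0, step_db=1.0):
    """Assumed power-control loop (not the thesis's exact scheme): the FAP
    lowers its transmit power in 1 dB steps until the interference it would
    cause at the nearest macrocell UE falls below a target cap."""
    p_tx = p_max_dbm
    while (p_tx - path_loss_db(mue_distance_m) > interference_cap_dbm
           and p_tx > p_min_dbm):
        p_tx -= step_db
    return p_tx

# A victim MUE 20 m away, interference to be kept below -60 dBm.
print(adapt_fap_power(p_max_dbm=20.0, mue_distance_m=20.0,
                      interference_cap_dbm=-60.0))  # -> 16.0 dBm
```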
6

Enhancing the governance of information security in developing countries : the case of Zanzibar

Shaaban, Hussein Khamis January 2014 (has links)
Organisations in developing countries need to protect their information assets (IA) in an optimal way. This thesis is based on the argument that a fully effective information security management (ISM) strategy requires looking at information security in a socio-technical context, i.e. the cultural, ethical, moral and legal dimensions together with tools, devices and techniques. The motivation for this study originated from concern about the social chaos that results from ineffective information security practices in organisations in developing nations. Existing strategies were developed for organisations in countries whose culture differs from that of the developing world, and culture has been identified as an important factor in human behaviour. This research seeks to enhance information security culture in the context of Zanzibar by integrating both social and technical issues. The theoretical foundation for the research is based on cultural theories and the theory of semiotics; in particular, the study utilised the GLOBE Project (House et al., 2004), the Competing Values Framework (Quinn and Cameron, 1983) and the Semiotic Framework (Liu, 2000), which guided the cultural and semiotic studies. The research seeks to understand better how culture impacts the governance of information security and to develop a framework that enhances that governance in non-profit organisations. The ISO/IEC 27002 best practices in information security management provided technical guidance for this work. The major findings include a lack of benchmarking in the governance of information security and the impact of cultural issues on that governance. Drawing on the evidence from the case study, a framework for information security culture is proposed. In addition, a novel process model for information security analysis based on semiotics is developed. The process model and the framework integrate both social and technical issues and could be implemented in any non-profit organisation operating within a societal context with cultural features similar to those of Zanzibar. The framework was evaluated using the process model developed in this research, and the evaluated framework provides opportunities for future research in this area.
7

Localisation in wireless sensor networks for disaster recovery and rescuing in built environments

Gu, Shuang January 2014 (has links)
Progress in micro-electromechanical systems (MEMS) and radio frequency (RF) technology has fostered the development of wireless sensor networks (WSNs). Unlike traditional networks, WSNs are data-centric, self-configuring and self-healing. Although WSNs have been successfully applied in built environments (e.g. security and services in smart homes), their applications and benefits have not been fully explored in areas such as disaster recovery and rescue. There are issues related to self-localisation as well as practical constraints to be taken into account, and the current state-of-the-art communication technologies used in disaster scenarios are challenged by various limitations (e.g. the uncertainty of received signal strength, RSS). Localisation in WSNs (location sensing) is a challenging problem, especially in disaster environments, and technological developments are needed to cater for disaster conditions. This research seeks to design and develop novel localisation algorithms for WSNs that overcome the limitations of existing techniques. A novel probabilistic fuzzy-logic-based range-free localisation algorithm (PFRL) is devised to solve localisation problems for WSNs. Simulation results show that the proposed algorithm outperforms other range-free localisation algorithms (namely DV-hop, Centroid and Amorphous localisation) in terms of localisation accuracy by 15-30% for various numbers of anchors and degrees of radio propagation irregularity. In disaster scenarios, for example if WSNs are applied to sense fire hazards in a building, wireless sensor nodes will be deployed on different floors. To this end, PFRL has been extended to solve sensor localisation problems in 3D space. Computational results show that the 3D localisation algorithm provides better localisation accuracy when the system parameters are varied under different communication and deployment models. PFRL is further developed by applying dynamic distance measurement updates among the moving sensors in a disaster environment. Simulation results indicate that the new method scales very well.
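For illustration, the Centroid algorithm, one of the range-free baselines the abstract compares against, can be sketched in a few lines: each unknown node simply averages the coordinates of the anchors whose beacons it can hear. The sketch below is a generic illustration of that baseline only; PFRL itself additionally uses probabilistic and fuzzy-logic reasoning, which is not reproduced here.

```python
import numpy as np

def centroid_localise(anchor_positions, hearable_mask):
    """Range-free Centroid localisation: each unknown node estimates its
    position as the mean of the coordinates of the anchors it can hear.

    anchor_positions: (A, 2) array of known anchor coordinates.
    hearable_mask:    (N, A) boolean array, True if node n hears anchor a.
    Returns an (N, 2) array of estimates (NaN if a node hears no anchor).
    """
    estimates = np.full((hearable_mask.shape[0], 2), np.nan)
    for n, heard in enumerate(hearable_mask):
        if heard.any():
            estimates[n] = anchor_positions[heard].mean(axis=0)
    return estimates

# Toy example: four corner anchors, one unknown node hearing three of them.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
heard = np.array([[True, True, True, False]])
print(centroid_localise(anchors, heard))  # -> [[3.333..., 3.333...]]
```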
8

Cell identity allocation and optimisation of handover parameters in self-organised LTE femtocell networks

Zhang, Xu January 2013 (has links)
A femtocell is a small cellular base station used by operators to extend indoor service coverage and enhance overall network performance. In Long Term Evolution (LTE), a femtocell works under macrocell coverage and combines with the macrocell to constitute a two-tier network. Compared with the traditional single-tier network, the two-tier scenario creates many new challenges, which have led the 3rd Generation Partnership Project (3GPP) to introduce an automation technology called the Self-Organising Network (SON) in order to achieve lower cost and enhanced network performance. This thesis focuses on inbound and outbound handovers (handover between femtocell and macrocell); in detail, it provides solutions for predicting the intensity of femtocell handovers, for Physical Cell Identity (PCI) allocation and for handover triggering parameter optimisation, all implemented within the SON framework. In order to manage radio resource allocation efficiently, this research investigates the conventional UE-based prediction model and proposes a cell-based prediction model to predict the intensity of a femtocell's handovers, overcoming the drawbacks of the conventional models in the two-tier scenario. The predictor is then used in the proposed dynamic group PCI allocation approach to solve the problem of PCI allocation for femtocells. Based on SON, this approach is implemented in the structure of a centralised Automated Configuration of Physical Cell Identity (ACPCI); it overcomes the drawbacks of the conventional method by reducing inbound handover failures related to Cell Global Identity (CGI). The thesis also tackles optimisation of the handover triggering parameters to minimise handover failure. A dynamic hysteresis-adjusting approach for each User Equipment (UE) is proposed, using the received average Reference Signal-Signal to Interference plus Noise Ratio (RS-SINR) of the UE as the criterion. Based on SON, this approach is implemented in the structure of hybrid Mobility Robustness Optimisation (MRO) and is able to offer a uniquely optimised hysteresis value to each individual UE in the network. To evaluate the performance of the proposed approaches against existing methods, a System Level Simulation (SLS) tool provided by the Centre for Wireless Network Design (CWiND) research group is utilised, which models the structure of two-tier LTE femtocell-based networks.
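The handover-triggering side of this work can be pictured with a small sketch: an A3-style entering condition with a hysteresis margin, plus an illustrative per-UE rule that maps average RS-SINR to a hysteresis value. The linear mapping, its limits and the omission of cell-specific offsets are assumptions made for the example, not the thesis's actual algorithm.

```python
def a3_handover_triggered(serving_rsrp_dbm, neighbour_rsrp_dbm, hysteresis_db):
    """A3-style entering condition (cell offsets omitted for brevity):
    the neighbour must exceed the serving cell by the hysteresis margin."""
    return neighbour_rsrp_dbm > serving_rsrp_dbm + hysteresis_db

def adapt_hysteresis(rs_sinr_db, hys_min_db=0.0, hys_max_db=10.0,
                     sinr_lo_db=0.0, sinr_hi_db=20.0):
    """Illustrative per-UE rule (assumed, not the thesis's exact mapping):
    UEs with poor average RS-SINR get a smaller hysteresis so they hand over
    sooner; UEs with good RS-SINR get a larger one to avoid ping-pong."""
    frac = min(max((rs_sinr_db - sinr_lo_db) / (sinr_hi_db - sinr_lo_db), 0.0), 1.0)
    return hys_min_db + frac * (hys_max_db - hys_min_db)

hys = adapt_hysteresis(rs_sinr_db=5.0)                # -> 2.5 dB
print(hys, a3_handover_triggered(-95.0, -91.0, hys))  # -> 2.5 True
```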
9

Radio resource scheduling in homogeneous coordinated multi-point joint transmission of future mobile networks

Shyam Mahato, Ben Allen January 2013 (has links)
Demand from mobile users for high data-rate services continues to increase, and to satisfy it operators must continue to enhance their existing networks. The radio interface is a well-known bottleneck because the radio spectrum is limited and therefore expensive, so efficient use of the spectrum is very important. To utilise the spectrum efficiently, any channel can be used simultaneously in any cell as long as the interference generated by the base stations using the same channel remains below an acceptable level. In cellular networks based on Orthogonal Frequency Division Multiple Access (OFDMA), inter-cell interference reduces the link throughput of users close to the cell edge. To improve the performance of cell-edge users, a technique called Coordinated Multi-Point (CoMP) transmission is being researched for use in the next generation of cellular networks. For a network to benefit from CoMP, its resources must be scheduled efficiently. This thesis focuses on the development of a resource scheduling algorithm for the CoMP joint transmission scheme in OFDMA-based cellular networks. In addition to the algorithm, the thesis provides an analytical framework for evaluating the performance of the CoMP technique. System-level simulation results show that the proposed resource scheduling based on joint maximum throughput provides higher spectral efficiency than a joint proportional fairness scheduling algorithm under different traffic loads and different cell-edge decision criteria. A hybrid model combining the analytical and simulation approaches has been developed to evaluate the average system throughput; its results are in line with the simulation-based results, and its benefit is that the throughput of any possible call state in the system can be evaluated. Two empirical path loss models for an indoor-to-outdoor environment in a residential area have been developed, based on measurement data at carrier frequencies of 900 MHz and 2 GHz. These models can be used as analytical expressions to estimate the level of interference caused by a femtocell to a macrocell user in link-level simulations.
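A toy sketch of the two scheduling policies compared in the abstract is given below: per resource block, joint maximum throughput picks the user with the highest instantaneous rate, while proportional fair weights that rate by the user's past average throughput. The array shapes, the per-RB independence and the averaging convention are assumptions for the example, not the thesis's full CoMP scheduler.

```python
import numpy as np

def schedule_rb(inst_rate, avg_tput, policy="pf"):
    """Pick one user per resource block (RB).

    inst_rate: (U, R) achievable instantaneous rate of user u on RB r.
    avg_tput:  (U,)   exponentially averaged past throughput per user.
    policy:    "max" = joint maximum throughput, "pf" = proportional fair.
    Returns an (R,) array with the index of the user assigned to each RB.
    """
    if policy == "max":
        metric = inst_rate                       # favour peak spectral efficiency
    else:
        metric = inst_rate / avg_tput[:, None]   # trade efficiency for fairness
    return metric.argmax(axis=0)

rates = np.array([[5.0, 1.0], [4.0, 4.0]])   # 2 users x 2 RBs (bit/s/Hz)
avg = np.array([3.0, 1.0])                   # user 1 has been starved so far
print(schedule_rb(rates, avg, "max"))        # -> [0 1]
print(schedule_rb(rates, avg, "pf"))         # -> [1 1]
```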
10

Applying the finite-difference time-domain to the modelling of large-scale radio channels

Rial, Alvaro Valcarce January 2010 (has links)
Finite-difference models have been used for nearly 40 years to solve electromagnetic problems of a heterogeneous nature. These techniques are well known for being computationally expensive, as well as subject to various numerical artifacts. However, little is yet understood about the errors arising in the simulation of wideband sources with the finite-difference time-domain (FDTD) method. Within this context, the thesis focuses on two problems. On the one hand, the speed and accuracy of current FDTD implementations are analysed and increased; on the other hand, the distortion of numerical pulses is characterised and mitigation techniques are proposed. Recent developments in general-purpose computing on graphics processing units (GPGPU) have unveiled new methods for the efficient implementation of FDTD algorithms, and this thesis therefore proposes specific GPU-based guidelines for implementing the standard FDTD. Metaheuristics are then used for the calibration of an FDTD-based narrowband simulator. Regarding the simulation of wideband sources, the thesis first uses Lagrange multipliers to characterise the extrema of the numerical group velocity, and then characterises the spread of numerical Gaussian pulses analytically in terms of the FDTD grid parameters. The usefulness of the proposed solutions is illustrated using coverage and wideband predictions in large-scale scenarios; in particular, the indoor-to-outdoor radio channel in residential areas is studied, and coverage and wideband measurements are used to validate the predictions. As a result, the thesis first introduces an efficient and accurate FDTD simulator, then characterises analytically the propagation of numerical pulses, and finally models the narrowband and wideband indoor-to-outdoor channels using the developed techniques.
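As a minimal illustration of the numerical method in question (not the thesis's large-scale GPU implementation), a one-dimensional FDTD leapfrog update in normalised units with a soft Gaussian source can be written as follows; the grid size, Courant number and source width are arbitrary example values.

```python
import numpy as np

# Minimal 1D FDTD leapfrog update in free space (normalised field units,
# PEC boundaries), illustrating the E/H staggering whose numerical
# dispersion the thesis analyses.
c0 = 299_792_458.0          # speed of light (m/s)
nx, nt = 400, 600           # grid cells, time steps
dx = 0.01                   # spatial step (m)
dt = 0.5 * dx / c0          # Courant number S = c0*dt/dx = 0.5 (stable, S <= 1)

ez = np.zeros(nx)           # electric field samples
hy = np.zeros(nx - 1)       # magnetic field, staggered half a cell

for n in range(nt):
    hy += (c0 * dt / dx) * np.diff(ez)              # H update
    ez[1:-1] += (c0 * dt / dx) * np.diff(hy)        # E update (interior cells)
    ez[nx // 2] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian pulse source

print(ez.max())  # the pulse has propagated outwards from the source
```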
