271. Implicit Concept-based Image Indexing and Retrieval for Visual Information Systems. Azzam, Ibrahim Ahmed Aref, January 2006 (PDF)
This thesis focuses on Implicit Concept-based Image Indexing and Retrieval (ICIIR) and the development of a novel method for the indexing and retrieval of images. Image indexing and retrieval using a concept-based approach involves the extraction, modelling and indexing of image content information, and computer vision offers a variety of techniques for searching images in large collections. We propose a method in which components of an image are categorised on the basis of their relative importance within the image, in combination with filtered representations. Our method concentrates on matching subparts of images, defined in a variety of ways, in order to find particular objects. The storage of images involves an implicit, rather than an explicit, indexing scheme. Retrieval is then achieved by applying an algorithm based on this categorisation, which allows relevant images to be identified and retrieved accurately and efficiently. Throughout, we use fuzzy expert systems, density measures, supporting factors, weights and other attributes of image components to identify and retrieve images.
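As an illustration of the kind of weighted component matching described above, the following is a minimal sketch; the attribute names, the fuzzy-style scoring and the toy data are assumptions for illustration, not the actual ICIIR algorithm.

```python
# A minimal sketch of weight-based image component matching; all names and
# the scoring scheme are illustrative assumptions, not the thesis's method.
from dataclasses import dataclass

@dataclass
class Component:
    density: float   # assumed: fraction of the image area the component covers
    weight: float    # assumed: relative importance of the component in the image
    features: tuple  # assumed: filtered representation, e.g. colour/texture values

def component_similarity(a: Component, b: Component) -> float:
    """Fuzzy-style similarity in [0, 1]: feature closeness scaled by importance."""
    feat = sum(1.0 - abs(x - y) for x, y in zip(a.features, b.features)) / len(a.features)
    return feat * min(a.weight, b.weight)

def rank_images(query: Component, images: dict[str, list[Component]]) -> list[tuple[str, float]]:
    """Score each image by its best-matching component, then rank descending."""
    scores = {name: max(component_similarity(query, c) for c in comps)
              for name, comps in images.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

images = {"beach.jpg": [Component(0.4, 0.9, (0.8, 0.2))],
          "city.jpg":  [Component(0.2, 0.5, (0.1, 0.7))]}
print(rank_images(Component(0.3, 1.0, (0.7, 0.3)), images))
```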
272. Critical Factors in the Adoption and Diffusion of E-Government Initiatives in Oman. AlShihi, Hafedh, January 2006 (PDF)
Many significant barriers must be faced in the adoption and dissemination of e-government systems, regardless of how advanced or modest a country is in terms of ICT infrastructure and deployment. This research investigates the impediments associated with the development and diffusion of e-government, concentrating on non-technical and country-specific factors. The focus of the research was on Oman's efforts to develop an e-government system, using advanced nations' experiences in the same domain to establish benchmarks. Initially, this research undertook a general literature review to define the barriers to the uptake of e-government and to set and refine the aims, scope and questions of the research. Subsequently, a more focused literature review was conducted on the experiences of advanced nations with e-government, to identify possible lessons for, and solutions to, the barriers facing the take-up of e-government. In parallel, an exploratory case study of the Oman e-government project was conducted, which aimed to test the extent to which the barriers and solutions drawn from the largely Western-centric literature apply in the Omani situation, and to investigate other possible cultural and country-specific barriers. Semi-structured interviews and face-to-face administered questionnaires were the primary data collection strategies used throughout the case study phase. The study found that non-technical barriers in Oman, such as users' lack of IT knowledge and the absence of marketing campaigns, have negatively affected people's decisions to use the technology and inhibited decision makers from implementing or adopting technology initiatives. In addition, several country-specific limits to e-government growth were identified. Government decision makers in Oman were found to be prone to short-term planning, which prevents them from anticipating the long-term potential of e-government. Additionally, frequent structural changes within ministries, and the fact that the e-government project is neither given high priority nor urgently needed at present, have contributed to delays in the development and improvement of such a system. Ultimately, this research delivered a socio-technical framework for adoption, detailing the causes and effects of the critical factors in the adoption and diffusion of e-government initiatives in Oman.
273. Adaptive Duplexer for Software Radio. Kannangara, Shyama Dilrukshi, January 2006 (PDF)
Different geographies and localities around the world have adopted various wireless interface standards for mobile communications. As a result, roaming users require multiple handsets with multiple standards and multiple band capabilities. Triple-band handsets are currently offered to high-end users, and in the future quad-band handsets including GSM 850 will become common in the market. This trend will continue, and the addition of third-generation functionality to second-generation platforms will be even more difficult and complex. The radio handset should be able to use the same hardware for communications anywhere in the world; users will therefore require small, low-cost terminals with multimode/multi-band capability. The software radio concept has been developed to address these challenges. The replacement of fixed-frequency components in the front end of the software radio is one of the key architectural changes required, and the duplexer is one such component. Since duplexing filters are not normally tuneable, each band requires a separate duplexer in a multi-band system. Duplexers are passive devices (ceramic or SAW), and multiple duplexers lead to a dramatic increase in terminal cost and size. This thesis proposes a new adaptive duplexer architecture to reduce or eliminate the multiple-duplexer problem in software radio. The technique is based on combining a low-isolation device with an adaptive double-loop cancelling scheme. The proposed double-loop cancellation provides the required transmitter leakage and transmitter noise isolation over a wide bandwidth, using a delay element and an adjustable vector attenuator in each cancellation path. This thesis analyses the double-loop cancellation technique. The cancellation path delay constraints are derived for coefficients with a limited adjustment range in the cancellation paths. A linear relationship between the bandwidth and the achievable cancellation level is obtained, and it is shown that the residual signal power is proportional to the square of the duplexing frequency. It is concluded that the delays in the cancellation paths should be chosen to straddle the expected range variation of the delay in the main path, predominantly caused by variations in antenna matching. The new algorithm uses a single cost function to achieve simultaneous cancellation in both the transmit band and the receive band. A direct conversion receiver architecture was chosen for the hardware prototype, since it is more suitable for multi-band systems; alternative structures are also possible. A prototype of the adaptive duplexer using a 20 dB circulator and a single-loop cancelling technique was designed and implemented. It achieved a total Tx leakage cancellation of 69 dB at 2 GHz with a 45 MHz duplexing frequency; however, it was not possible to simultaneously cancel the transmitter noise in the receiver band. The original prototype was therefore extended to include the second loop. The achieved isolation between the transmit and receive signals, and the achieved reduction of the transmitter noise in the receiver band, were 66.8 dB and 58 dB respectively. These results were obtained over a 5 MHz bandwidth using a 190 MHz duplexing frequency. The performance is more than adequate for W-CDMA applications. Lowering the duplexing frequency improves the cancellation bandwidth, so the scheme performs even better with other standards, such as IS-95 (CDMA), which use a 45 MHz duplexing offset.
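To make the double-loop idea concrete, here is a minimal frequency-domain sketch that solves for the two complex loop coefficients minimising the transmitter-leakage residual over one band; it is a simplification (ideal delays, flat gains, a single band rather than the thesis's joint transmit/receive cost function), and the delay and bandwidth values are illustrative, not those of the prototype.

```python
# A frequency-domain sketch of double-loop cancellation under simplifying
# assumptions; parameter values are illustrative, not the thesis prototype's.
import numpy as np

f = np.linspace(-2.5e6, 2.5e6, 201)          # 5 MHz band around the carrier
tau_main = 1.30e-9                            # unknown main-path (antenna) delay
tau1, tau2 = 1.0e-9, 1.6e-9                   # loop delays straddling tau_main

H_main = np.exp(-2j * np.pi * f * tau_main)   # leakage path response
A = np.column_stack([np.exp(-2j * np.pi * f * tau1),
                     np.exp(-2j * np.pi * f * tau2)])

# Least-squares choice of the two complex loop coefficients (the vector
# attenuator settings) that minimise the residual over the whole band.
w, *_ = np.linalg.lstsq(A, H_main, rcond=None)
residual = H_main - A @ w
print("worst-case cancellation: %.1f dB" % (-20 * np.log10(np.abs(residual).max())))
```

Because the two fixed delays straddle the main-path delay, the least-squares coefficients can synthesise both the amplitude and the group delay of the leakage path, which is why the residual stays small across the band rather than only at a single frequency.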
274. Complexity reduction in multiple input multiple output algorithms. Gor, Leon, January 2007 (PDF)
Wireless communication devices are currently enjoying increasing popularity and widespread use. The constantly growing number of users, however, results in a shortage of available spectrum, and various techniques have been proposed to increase the spectrum efficiency of wireless systems to solve the problem. Multiple Input Multiple Output (MIMO) is one solution that employs multiple antennas at the transmitter and receiver. MIMO algorithms are usually highly complex and computationally intensive, which results in increased power consumption and reduced battery lifespan. This thesis investigates the complexity-performance trade-off of two MIMO algorithms. Space Time Block Coding (STBC) is a MIMO-based algorithm that efficiently exploits spatial and temporal diversity and has recently been specified in a number of 3G standards. However, little attention has been paid to the implementation issues of this algorithm. One such issue, clipping of the Analog to Digital Converter (ADC) at the receiver, is described in the first part of the thesis (chapter 3). A small amount of clipping in an ADC can improve dynamic range and reduce power consumption; however, the increased clipping distortion of the signal can adversely affect the overall performance of the system. It is shown in this dissertation that STBC is more sensitive to clipping than uncoded single-antenna systems. Two receiver structures are considered: a Direct Conversion (DC) structure, where the ADCs impose a square clipping function, and a Log-Polar structure, where the ADC induces a circular clipping function. Log-Polar receivers were found to be insensitive to clipping for the given target Symbol Error Rate (SER) of 1×10⁻³, which makes them an obvious choice for system designers. The second part of the thesis (chapter 4) addresses the complexity problem associated with the QR decomposition algorithm, which is frequently used as a faster alternative to channel inversion in a MIMO scheme. Channel tracking can be employed with QR equalization in order to reduce the pilot overhead of a MIMO system in a non-stationary environment. QR decomposition is part of the QR equalization method and has to be performed every time a channel estimate is obtained. The high rate of QR decomposition, a computationally intensive technique, results in a high computational complexity per symbol. Some novel modifications are proposed to address this problem. Reducing the repetition rate of QR decompositions and tracking R (the upper triangular matrix) directly, while holding the unitary matrix Q fixed, can significantly reduce the complexity per symbol at the expense of some introduced error. An additional modification of the CORDIC algorithm (a square-root- and division-free algorithm used to perform QR decomposition) yields computational complexity savings of more than 80%. Further, Minimum Mean Squared Error (MMSE) detection is applied to Least Mean Squares (LMS) based R tracking and channel tracking algorithms, which are then compared in complexity and performance to the Recursive Least Squares Decision Feedback Equalizer (RLS-DFE) tracking system in [1]. The R tracking scheme is shown to achieve more accurate channel estimates than the channel tracking scenario, but this advantage does not translate into better Bit Error Rate (BER) results due to errors on the first layer of the detector.
Both LMS strategies have inferior BER performance compared to the RLS-DFE system of [1], and surprisingly the LMS schemes show no significant complexity improvement.
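For readers unfamiliar with QR-based detection, the following is a minimal sketch of the underlying equalization step for a 2×2 QPSK system. It shows only the standard decompose-rotate-back-substitute flow (the step whose repetition rate the thesis reduces), not the R-tracking or CORDIC modifications, and all parameter values are illustrative.

```python
# A minimal sketch of QR-based MIMO detection for a 2x2 QPSK system.
import numpy as np

rng = np.random.default_rng(0)
qpsk = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)

H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = rng.choice(qpsk, size=2)                      # transmitted symbols
y = H @ x + 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

Q, R = np.linalg.qr(H)                            # the costly step the thesis amortises
z = Q.conj().T @ y                                # z = R x + rotated noise

x_hat = np.zeros(2, dtype=complex)
for k in (1, 0):                                  # back-substitute, last layer first
    s = (z[k] - R[k, k+1:] @ x_hat[k+1:]) / R[k, k]
    x_hat[k] = qpsk[np.argmin(np.abs(qpsk - s))]  # slice to the nearest symbol
print("detected correctly:", np.allclose(x_hat, x))
```

Note how the last layer is detected first and fed back into the earlier layers; a slicing error there propagates upward, which is consistent with the abstract's observation that first-layer errors limit the BER of the R-tracking scheme.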
275. Co-operative satisfaction factors for effective strategic alliances in the Australian telecommunications industry. Karagiannidis, Vanaja, January 2008 (PDF)
The telecommunications industry is critically important to Australia's economic future, and for this reason it was chosen as the subject of this thesis. The industry has progressed through a number of stages. From 1975 to 1991 it was a monopolistic public utility; during the nineties it became a duopoly (1991-97) before becoming an oligopolistic industry dominated by a few major and powerful competitors. By the early part of the twenty-first century the industry had rapidly expanded and developed into a more competitive market. Many transactions and alliances developed between the large dominant market leaders and smaller firms, in which they co-operated to achieve a common purpose. Some alliances were intended to facilitate co-operation between members of a value chain, such as suppliers of raw materials or components, suppliers and end users; others were used to share tacit knowledge or expertise. Previous research suggested that, more so than other forms of relationships, alliances depend on social factors for their continuing success. The purpose of this study was to analyse the relationship between co-operative satisfaction factors (effective communication, commitment and trust, workable power and control, compatibility, cultural respect, and the perception that the alliance is worthwhile) and alliance effectiveness in the Australian telecommunications industry. This research used a triangulation approach with two methods of data collection: a survey and interviews. The survey was used to obtain quantitative data from a sample of 120 telecommunication companies, achieving a response rate of 52.5%. A structured interview schedule was also used to collect qualitative data, which formed the basis of case studies of three companies. The objectives of this research were:
1. To clarify the meaning and import of the word 'alliance' in the telecommunications industry.
2. To determine the nature of the relationship between co-operative factors and the effectiveness of an alliance in the Australian telecommunications industry.
3. To determine the nature of the relationship between co-operative factors and alliance sustainability.
4. To ascertain the effect of organisational size on the relationship between co-operative factors and the effectiveness of an alliance.
The empirical findings confirmed that the term 'alliance' was used to describe a variety of co-operative arrangements, including contractual or non-contractual agreements, vendor arrangements, major tenders, major networks and cell/cluster groups. The quantitative findings supported the proposition that effective communication is positively related to alliance effectiveness; effective communication, followed by commitment and trust, were significant predictors of effectiveness. Size has an effect on the power and control factor in a relationship, a finding supported by the qualitative results, in which managers perceived size to matter in an alliance. On the one hand, a powerful partner could exert undue pressure on a relationship; on the other, small, creative, innovative and fragmented companies need relationships with big, strong and successful telecommunications firms that have established infrastructure foundations in the market. Further, the survey results suggested that respect had a positive influence on the sustainability of relationships. However, the qualitative results showed that the other circumstances that influenced respect are often unpredictable.
276. Web mining techniques for recommendation and personalization. Xu, Guandong, January 2008 (PDF)
Nowadays Web users face the problem of information overload due to the significant and rapid growth in the amount of information and the number of users. As a result, how to provide Web users with exactly the information they need is becoming a critical issue in web-based information retrieval and Web applications. In this work, we aim to improve the performance of Web information retrieval and Web presentation by developing and employing Web data mining paradigms. Web data mining is a process that discovers the intrinsic relationships among Web data, expressed in the form of textual, linkage or usage information, by analysing the features of the Web and web-based data using data mining techniques. In particular, we concentrate on discovering Web usage patterns via Web usage mining, and then utilise the discovered usage knowledge to present Web users with more personalised Web contents, i.e. Web recommendation. To analyse Web user behaviour, we first establish a mathematical framework, called the usage data analysis model, to characterise the co-occurrences observed in Web log files. In this mathematical model, the relationships between Web users and pages are expressed by a matrix-based usage data schema. On the basis of this data model, we devise algorithms to discover the mutual associations between Web pages and user sessions hidden in the collected Web log data and, in turn, use this knowledge to uncover user access patterns. To reveal the underlying relationships among Web objects, such as Web pages or user sessions, and to find the Web page categories and usage patterns in Web log files, we propose three latent semantic analytical techniques based on three statistical models, namely traditional Latent Semantic Indexing, Probabilistic Latent Semantic Analysis and the Latent Dirichlet Allocation model. In comparison to conventional Web usage mining approaches, the main strength of latent-semantic-based analysis is its capability not only to capture the mutual correlations hidden in the observed objects explicitly, but also to reveal the unseen latent factors/tasks associated with the discovered knowledge implicitly. In traditional Latent Semantic Indexing, a specific matrix operation, the Singular Value Decomposition algorithm, is applied to the usage data to discover Web user behaviour patterns over a transformed latent Web page space, which contains the closest approximation of the original Web page space. A k-means clustering algorithm is then applied to the transformed usage data to partition user sessions. Each discovered Web user session group is treated as a user session aggregation, in which all users share a like-minded access task or intention, and the centroids of the discovered user session clusters are constructed as user profiles. In addition to intuitive latent semantic analysis, the Probabilistic Latent Semantic Analysis and Latent Dirichlet Allocation approaches are also introduced into Web usage mining for Web page grouping and usage profiling via probability inference. Meanwhile, the latent task space is captured by interpreting the contents of prominent Web pages, which contribute significantly to the user access preference.
In contrast to traditional latent semantic analysis, the latter two approaches are capable of not only revealing the underlying associations between Web pages and users, but also capturing the latent task space, which corresponds to user navigational patterns and Web site functionality. Experiments are performed to discover user access patterns, reveal the latent task space and evaluate the proposed techniques in terms of quality of clustering. The discovered user profiles, represented by the centroids of the Web user session clusters, are then used to make usage-based collaborative recommendations via a top-N weighted scoring scheme. In this scheme, the user profiles are learned from usage data in an offline stage using the methods described above and are treated as a usage pattern knowledge base. When a new active user session arrives, a matching operation is carried out to find the closest usage pattern/user profile by measuring the similarity between the active user session and the learned user profiles. The user profile with the largest similarity is selected as the matched usage profile, reflecting the access interests most similar to those of the active user session. The pages in the matched usage profile are then ranked in descending order of their normalised page weights, which correspond to how likely the pages are to be visited in the near future. Finally, the top-N pages in the ranked list are recommended to the user as pages that are very likely to be visited in the coming period. To evaluate the effectiveness and efficiency of the recommendation, experiments are conducted in terms of the proposed recommendation accuracy metric. The experimental results demonstrate that the proposed latent semantic analysis models and related algorithms are able to extract the needed usage knowledge efficiently and to make Web recommendations accurately. Data mining techniques have recently been widely used in many other domains due to their powerful capability for non-linear learning from a wide range of data sources. In this study, we also extend the proposed methodologies and technologies to a biomechanical data mining application, namely gait pattern mining. As in the Web mining context, various clustering-based learning approaches are performed on the constructed gait variable data model, which is expressed as a feature vector of kinematic variables, to discover subject gait classes. The centroids of the partitioned gait clusters are used to represent different specific walking characteristics. Data analysis on two gait datasets corresponding to various specific populations is carried out to demonstrate the feasibility and applicability of gait pattern mining. The results show that the discovered gait pattern knowledge can serve as a useful means for human movement research and clinical applications.
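A compact sketch of this pipeline (SVD projection, session clustering, profile matching, top-N scoring) is given below; the toy usage matrix, the rank-2 truncation and the cluster count are illustrative assumptions, not the thesis's experimental setup.

```python
# A compact sketch of the LSI-based usage mining and top-N recommendation
# pipeline described above; data and dimensions are toy assumptions.
import numpy as np
from scipy.cluster.vq import kmeans2

np.random.seed(0)
# Rows: user sessions, columns: pages (1 = page visited in the session).
usage = np.array([[1, 1, 0, 0, 0],
                  [1, 1, 1, 0, 0],
                  [0, 0, 1, 1, 1],
                  [0, 0, 0, 1, 1]], dtype=float)

# LSI step: rank-2 SVD approximation projects sessions into a latent space.
U, s, Vt = np.linalg.svd(usage, full_matrices=False)
latent = U[:, :2] * s[:2]

# Partition sessions; cluster centroids in page space act as user profiles.
_, labels = kmeans2(latent, 2, minit="points")
profiles = np.array([usage[labels == c].mean(axis=0) for c in np.unique(labels)])

def recommend(active: np.ndarray, top_n: int = 2) -> np.ndarray:
    """Match the active session to the closest profile, return top-N page ids."""
    sims = profiles @ active / (np.linalg.norm(profiles, axis=1)
                                * np.linalg.norm(active) + 1e-12)
    best = profiles[np.argmax(sims)].copy()
    best[active > 0] = -1.0          # do not recommend already-visited pages
    return np.argsort(best)[::-1][:top_n]

print(recommend(np.array([1.0, 0, 0, 0, 0])))   # pages favoured by sessions 0/1
```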
277. Analysis and evaluation of visual information systems performance. Grubinger, Michael, January 2007 (PDF)
This dissertation investigates the system-centred evaluation of visual information retrieval from generic photographic collections. The development of visual information retrieval systems has long been hindered by the lack of standardised benchmarks. Researchers have proposed numerous systems and techniques, and although different systems clearly have their particular strengths, there is a tendency for researchers to use different means of showing retrieval performance to highlight their own algorithm's benefits. For the field of visual information search to advance, however, objective evaluation to identify, compare and validate the strengths and merits of different systems is essential. Benchmarks to carry out such evaluation have recently been developed, and evaluation events have been organised for several domains. Yet no efforts have considered the evaluation of retrieval from generic photographic collections (i.e. collections of everyday real-world photographs akin to those frequently found in private photographic collections, e.g. pictures of holidays and events). We therefore first analyse a multitude of variables and factors with respect to the performance and requirements of visual information systems, and then design and implement the framework and resources necessary to carry out such an evaluation. These resources include a parametric image collection, representative search requests, relevance assessments and a set of performance measures. In addition, we organise the first evaluation event for retrieval from generic photographic collections and report on its realisation. Finally, we present an analysis and evaluation of the participating retrieval systems as well as of the evaluation event itself. Filling this particular gap by making possible a systematic calibration and comparison of system performance for retrieval from generic photographic collections constitutes the main scientific contribution of this research. The dissertation thereby enables a deeper understanding of the complex conditions and constraints associated with visual information identification, the accurate capturing of user requirements, the appropriate specification and complexity of user queries, the execution of searches, and the reliability of performance indicators.
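As an illustration of the kind of performance measures such a benchmark relies on, here is a minimal sketch of precision at N and average precision; the ranked list and relevance judgements are toy assumptions, and the thesis's actual measure set may differ.

```python
# Two standard system-centred retrieval measures, sketched on toy data.
def precision_at_n(ranked: list[str], relevant: set[str], n: int) -> float:
    """Fraction of the top-n ranked items that are relevant."""
    return sum(1 for d in ranked[:n] if d in relevant) / n

def average_precision(ranked: list[str], relevant: set[str]) -> float:
    """Mean of the precision values at each rank where a relevant item appears."""
    hits, score = 0, 0.0
    for i, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            score += hits / i
    return score / max(len(relevant), 1)

ranked = ["img3", "img7", "img1", "img9"]
relevant = {"img3", "img1"}
print(precision_at_n(ranked, relevant, 3))   # 0.667
print(average_precision(ranked, relevant))   # (1/1 + 2/3) / 2 = 0.833
```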
278. Agent-based open connectivity for decision support systems. Zhang, Hao Lan, January 2007 (PDF)
One of the major problems discouraging the development of Decision Support Systems (DSSs) is the un-standardised DSS environment. Computers that support modern business processes are no longer stand-alone systems, but have become tightly connected both with each other and with their users. Having a standardised environment that allows different DSS applications to communicate and cooperate is therefore crucial. Integration difficulty is the most crucial problem affecting the development of DSSs, so an open and standardised environment for integrating various DSSs is required. Despite the critical need for an open architecture in DSS design, present DSS architectural designs are unable to provide a fundamental solution that enhances the flexibility, connectivity, compatibility and intelligence of a DSS. The emergence of intelligent agent technology fulfils the requirements of developing innovative and efficient DSS applications, as intelligent agents offer various advantages, such as mobility, flexibility and intelligence, for tackling the major problems in existing DSSs. Although various agent-based DSS applications have been suggested, most of them are unable to balance manageability with flexibility. Moreover, most existing agent-based DSSs are based on agent-coordinated design mechanisms and often overlook the living environment for agents. This can cause difficulties in agent cooperation and upgrading, because agent-based coordination mechanisms have only a limited capability to provide agents with relatively comprehensive information about global system objectives. This thesis proposes a novel multi-agent-based architecture for DSSs, called Agent-based Open Connectivity for Decision support systems (AOCD). The AOCD architecture adopts a hybrid agent network topology that makes use of a unique feature called the Matrix-agent connection. The novel component, the Matrix, provides a living environment for agents: it allows agents to upgrade themselves through interacting with the Matrix. This architecture is able to overcome the difficulties in concurrency control and synchronous communication that plague many decentralised systems. Performance analysis has been carried out on this framework, and we find that it provides a high degree of flexibility and efficiency compared with other frameworks. The thesis explores the detailed design of the AOCD framework and its major components, namely the Matrix, agents, and the unified Matrices structure. The proposed framework is able to enhance system reusability and maximise system performance. By using a set of interoperable autonomous agents, more creative decision-making can be accomplished than with a hard-coded programmed approach. In this research, we systematically classify agent network topologies and develop an experimental program to evaluate system performance under three different agent network topologies. The experimental results provide evidence that the hybrid topology is efficient in the AOCD framework design. Furthermore, a novel topological description language for agent networks (TDLA) is introduced, which provides an efficient mechanism for agents to perceive information about the network that interconnects them. A new Agent-Rank algorithm is introduced in the thesis to provide an efficient matching mechanism for agent cooperation.
The computational results from our recently developed program for agent matchmaking demonstrate the efficiency and effectiveness of the Agent-Rank algorithm in the agent-matching and re-matching processes.
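The abstract does not spell out the Agent-Rank computation. Purely as an illustrative analogy, the sketch below ranks agents with a PageRank-style power iteration over a weighted agent connectivity matrix; every detail here (the link matrix, the damping factor, the iteration count) is an assumption, not the thesis's algorithm.

```python
# An illustrative PageRank-style ranking over an agent connectivity matrix;
# this is an analogy only, not the thesis's Agent-Rank algorithm.
import numpy as np

links = np.array([[0, 1, 1, 0],   # row i lists agents that agent i cooperates with
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 0, 0]], dtype=float)
P = links / links.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

rank = np.full(4, 0.25)                        # start with uniform rank
for _ in range(50):                            # power iteration with damping 0.85
    rank = 0.15 / 4 + 0.85 * (P.T @ rank)

print(np.argsort(rank)[::-1])                  # agents ordered by cooperation rank
```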
279. Spectroscopy of thulium doped silica glass. Simpson, David Allan, January 2008 (PDF)
The increasing demand for bandwidth in optical fibre communication systems has prompted a significant research effort into developing efficient fibre-based optical amplifiers at operating wavelengths neighbouring the erbium-doped fibre amplifier. Of the possible candidates, thulium-doped fibre amplifiers appear best suited to serve this need, with amplification available from 1460 to 1530 nm. However, the current limitation of these devices is that the amplifying transition can only provide sufficient gain when the ion is doped into host materials with relatively low phonon energies. The ultimate goal for thulium-doped amplifiers is to incorporate the ion into a host material which can be easily integrated into standard communication systems without compromising gain; this involves, to some degree, a silica-based host material. To date, optical amplification in the telecommunication S-band using thulium-doped silica fibres has been inefficient due to the high phonon energy associated with Si-O bonds in the glass. This work undertakes a systematic study of the effects of network modifiers on the S-band amplifying transition, in an effort to improve the radiative quantum efficiency in silica-based glasses. To this end, the techniques employed in this investigation included modifying the glass network with elements which may act to reduce the local phonon energy surrounding the thulium (Tm3+) ion, and co-doping the Tm3+ ions with sensitising ytterbium (Yb3+) ions.
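The radiative quantum efficiency at issue is conventionally written as below; the energy-gap law in the second relation is the standard rare-earth spectroscopy model (quoted for context, not taken from this thesis) for why high-phonon-energy hosts quench the transition.

```latex
% Quantum efficiency and the multiphonon energy-gap law (standard
% rare-earth spectroscopy relations, quoted here for context).
\[
  \eta = \frac{W_{\mathrm{rad}}}{W_{\mathrm{rad}} + W_{\mathrm{nr}}},
  \qquad
  W_{\mathrm{nr}} \propto \varepsilon^{\,p},
  \quad
  p = \frac{\Delta E}{\hbar\omega_{\mathrm{max}}},\ \varepsilon < 1 .
\]
% Here $\Delta E$ is the energy gap to the next lower Tm$^{3+}$ level and
% $\hbar\omega_{\mathrm{max}}$ the highest phonon energy of the host glass;
% lowering the local phonon energy increases the phonon order $p$ needed to
% bridge the gap, suppressing non-radiative decay and raising $\eta$.
```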
280. Tourism information systems integration and utilization within the semantic web. Abrahams, Brooke, January 2006 (PDF)
The objective of this research was to generate grounded theory about the extent to which the Semantic Web and related technologies can assist with the creation, capture, integration and utilization of accurate, consistent, timely and up-to-date Web-based tourism information. Tourism is vital to the economies of most countries worldwide (developed and less-developed), and advanced Destination Marketing Systems (DMS) are essential if a country's tourism infrastructure, facilities and attractions are to receive maximum exposure. A necessary prerequisite here is that relevant data must be captured, 'cleansed', organized, integrated and made available to key industry parties (e.g. travel agents and inbound tour operators). While more and more tourists are using the Internet for travel planning, the usability of the Internet as a travel information source remains a problem, with travellers often having trouble finding the information they seek as the amount of online travel-related information increases. The problem is largely caused by the current Web's lack of structure, which makes the integration of heterogeneous data a difficult, time-consuming task. Traditional approaches to overcoming heterogeneity have to a large extent been unsuccessful. In the past, organizations attempted to rectify the problem by investing heavily in top-down strategic information systems planning (SISP) projects, with the ultimate aim of establishing a new generation of systems built around a single common set of enterprise databases. An example of this approach to integration is that undertaken by the Bell companies (Nolan, Puryear & Elron 1989), whose massive investment in computer systems turned out to be more of a liability than an asset. The Semantic Web offers a new approach to integration. Broadly speaking, the Semantic Web (Berners-Lee, Hendler & Lassila 2001) refers to a range of standards, languages, development frameworks and tool development initiatives aimed at annotating Web pages with well-defined metadata so that intelligent agents can reason more effectively about the services offered at particular sites. The technology is being developed by a number of scientists and industry organizations in a collaborative effort led by the World Wide Web Consortium (W3C), with the goal of providing machine-readable Web intelligence derived from hyperlinked vocabularies that enable Web authors to explicitly define their words and concepts. It is based on new markup languages such as the Resource Description Framework (RDF) (Manola & Miller 2004) and the Web Ontology Language (OWL) (McGuinness & Harmelen 2004), and on ontologies, which provide a shared and formal description of key concepts in a given domain. The ontology-driven approach to integration advocated here might be considered 'bottom-up', since individual enterprises (and parts of the one enterprise) can apply the technology largely independently, thereby mirroring the processes by which the Web itself evolved. The idea is that organizations could be provided with a common model (the Semantic Web ontology), and associated easy-to-use software could then be employed to guide them in the development of their Websites. Because Website production is driven by the common ontology, consistency and convenient integration are almost an automatic by-product for all companies that take advantage of the technology and approach.
In many cases, organizations would not have to change their present data structures or naming conventions, which could potentially overcome many of the change management issues that have led to the failure of previous integration initiatives. Many researchers (e.g. El Sawy 2001) have stressed the necessity of taking a holistic view of technology, people, structure and processes in IT projects and, more specifically, Sharma et al. (2000, p. 151) have noted that as significant as DMS technological problems are, they may well pale into insignificance when compared with the managerial issues that need to be resolved. With this in mind, a systems development research approach, supported by a survey of tourism operators and secondary interviews, was used to generate grounded theory. The systems development and evaluation were designed to uncover the technical benefits of using the Semantic Web for the integration and utilization of online tourism information, while the survey of tourism operators and secondary data interviews aimed to provide an understanding of attitudes towards the adoption of a radical new online technology among industry stakeholders. A distinguishing feature of this research was its applied and pragmatic focus: in particular, one aim was to determine just what of practical use can be accomplished today, with current (albeit extended) technology, in a real industry setting.
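To make the annotation idea concrete, here is a minimal sketch using the Python rdflib library; the tourism namespace and its class and property names are invented for illustration, not the ontology developed in the thesis.

```python
# A minimal sketch of ontology-driven annotation with rdflib; the "tour"
# namespace and its terms are hypothetical, not the thesis's ontology.
from rdflib import Graph, Literal, Namespace, RDF

TOUR = Namespace("http://example.org/tourism#")  # assumed shared ontology URI

g = Graph()
g.add((TOUR.GrandHotel, RDF.type, TOUR.Accommodation))
g.add((TOUR.GrandHotel, TOUR.locatedIn, TOUR.Melbourne))
g.add((TOUR.GrandHotel, TOUR.starRating, Literal(4)))

# Because every site annotates against the same ontology, one query can
# integrate data published independently by different tourism operators.
results = g.query(
    "SELECT ?place ?rating WHERE { "
    "?place a tour:Accommodation ; tour:starRating ?rating }",
    initNs={"tour": TOUR})
for place, rating in results:
    print(place, rating)
```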