1

Segmentation-based and region-adaptive lossless image compression underpinned by a stellar-field image model

Grunler, Christian Dieter January 2010 (has links)
The central question addressed in this research is whether lossless compression of stellar-field images can be enhanced, in terms of compression ratio, by using image segmentation and region-adaptive bit-allocation based on a suitable image model. To that end, special properties of stellar-field images which compression algorithms could exploit are studied. The research proposes and develops novel lossless compression algorithms for the compaction of stellar-field images. The proposed algorithms are based on image segmentation coupled to a domain-specific image data model and to a region-adaptive allocation of pixel bits. The algorithms exploit the distinctive characteristics of stellar-field images and aim to meet the requirements for compressing scientific-quality astronomical images. The image data model used is anchored on the characterisation of a stellar-field image as consisting of "dot-like bright objects on a noisy background". The novel algorithms segment the dot-like bright objects, corresponding to the high-dynamic-range areas of the image, from the noise-like low-dynamic-range background sky areas. Following the segmentation of the image, the algorithms perform region-adaptive image compression tuned to each specific component of the image data model. Besides the development of novel algorithms, the research also presents a survey of the state of the art in compression algorithms for astronomical images. It reviews and compares existing methods claimed to achieve lossless compression of stellar-field images and contributes an evaluation of a set of existing methods. Experiments to evaluate the performance of the algorithms investigated in this research were conducted on a set of standard astronomical test images. The results of the experiments show that the novel algorithms developed in this research can achieve compression ratios comparable to, and often better than, existing methods. The evaluation results show that significant compaction can be achieved by image segmentation and region-adaptive bit-allocation anchored on a domain-specific image data model. Based on the evaluation results, this research suggests application classes for the tested algorithms. On the test image set, existing methods which do not explicitly exploit the special characteristics of astronomical images were shown to achieve average compression ratios of 1.97 up to 3.92. Great differences were found between the results on 16-bit-per-pixel images and those on 32-bit-per-pixel images: for these existing methods, the average results range from 1.37 up to 2.81 on 16-bit-per-pixel images, and from 3.81 up to 6.42 on 32-bit-per-pixel images. It is therefore concluded that, for archiving data, compression methods may indeed save costs for storage media or data transfer time, especially if a large part of the raw images is encoded with 32 bits per pixel. With average compression ratios on the test image set in the range of 3.37 to 3.82, the simplest of the new algorithms developed in this research achieved results comparable to the best existing methods. These simple algorithms use general-purpose methods, which have limited performance, for encoding the data streams of the separate image regions corresponding to components of a stellar-field image.
The most advanced of the new algorithms, which uses data encoders tuned to each image signal component, outperformed existing methods in terms of size efficiency by about 10 per cent (an average of 4.29 on the test image set), and it can yield a compression ratio of 7.87. Especially for applications where high volumes of image data have to be stored, the most advanced of the new algorithms should also be considered.
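A minimal sketch of the segment-then-encode idea described above, assuming NumPy and zlib as stand-ins for the thesis's tuned encoders; the median-plus-k-sigma threshold and the general-purpose back-end compressor are illustrative choices, not the algorithms developed in the thesis.

```python
import zlib
import numpy as np

def compress_stellar_field(image, k=3.0):
    """Toy 'segment then encode separately' scheme: split bright dot-like
    pixels from the noisy background and compress the two streams on their own."""
    # Robust background estimate: integer median plus a noise scale from the MAD.
    background = int(np.median(image))
    sigma = 1.4826 * np.median(np.abs(image.astype(np.int64) - background)) + 1e-9
    # Pixels well above the background noise are treated as stellar objects.
    mask = image > background + k * sigma

    # Background residuals have a low dynamic range; store them relative to the median.
    residuals = image[~mask].astype(np.int32) - background
    # Object pixels keep their full dynamic range.
    objects = image[mask].astype(np.int32)

    return {
        "shape": image.shape,
        "background": background,
        "mask": zlib.compress(np.packbits(mask).tobytes(), 9),
        "residuals": zlib.compress(residuals.tobytes(), 9),
        "objects": zlib.compress(objects.tobytes(), 9),
    }

# Synthetic 16-bit "stars on a noisy sky" frame for a quick check.
rng = np.random.default_rng(0)
sky = rng.normal(1000, 20, size=(256, 256))
for y, x in rng.integers(0, 256, size=(30, 2)):
    sky[y, x] += float(rng.integers(5_000, 60_000))
frame = np.clip(sky, 0, 65535).astype(np.uint16)

parts = compress_stellar_field(frame)
compressed = sum(len(v) for v in parts.values() if isinstance(v, bytes))
print("compression ratio ~ %.2f" % (frame.nbytes / compressed))
```

Because the stored background value, mask, residuals and object pixels together determine every original pixel exactly, the split remains lossless; the gain comes from encoding the low-dynamic-range background stream separately from the sparse high-dynamic-range object stream.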
2

WEBFRAME : a framework for informing web developers' methodology selection

Kinmond, Robert M. January 2012 (has links)
This research explores web information systems developers' choices and use of methodologies. The stated aim of the research is to identify key features of developers' requirements for methodologies and, from this, to design a framework for use in practice. The literature review reveals that a great many methodologies are available, but recent research also suggests that these are poorly used in practice. This study explores whether or not this is so and, if so, why. Using an interpretivist epistemological framework, the principles of Grounded Theory Methodology are used to conduct a mixed-methods investigation. Structuration Theory offers a theoretical framework for analysis and development of the theory. An initial web-based survey aims to capture a breadth of developers' views and experiences. This is followed by semi-structured interviews which enable exploration of the area in depth. The findings suggest that web information system developers are not following a published methodology but prefer instead to develop their own 'bespoke' approach to suit the project. The developers seem to be aware of, and are using, traditional information system tools, importing them as appropriate into their web development methodologies. They are, however, less aware of, or concerned with, published web methodologies, apparently needing greater flexibility and choice for developing web information systems than the published methodologies offer. Thus, the proposed new framework (entitled WEBFRAME) aims to provide web developers with a set of key principles to facilitate the development of web information system development methodologies. This proposed framework is evaluated and validated by an expert panel of web developers, with findings from this evaluation and validation reported here.
3

An animated pedagogical agent for assisting novice programmers within a desktop computer environment

Case, Desmond Robert January 2012 (has links)
Learning to program for the first time can be a daunting process, fraught with difficulty and setbacks. The novice learner is faced with learning two skills at the same time, each of which depends on the other: how a program needs to be constructed to solve a problem, and how the structures of a program work towards solving it. In addition, the learner has to develop practical skills such as how to design a solution, how to use the programming development environment, how to recognise errors, how to diagnose their cause and how to correct them successfully. The nature of learning how to program a computer can cause frustration to many and lead some to disengage before they have a chance to progress. Numerous authorities have observed that novice programmers make the same mistakes and encounter the same problems when learning their first programming language. Learner errors usually stem from a fixed set of misconceptions that are easily corrected by experience and with appropriate guidance. This thesis demonstrates how a virtual animated pedagogical agent, called MRCHIPS, can extend the Beliefs-Desires-Intentions model of agency to provide mentoring and coaching support to novice programmers learning their first programming language, Python. The Cognitive Apprenticeship pedagogy provides the theoretical underpinning of the agent's mentoring strategy. Case-Based Reasoning is also used to support MRCHIPS in reasoning, coaching and interacting with the learner. The results of a small controlled study indicate that novice learners assisted by MRCHIPS are more productive than those working without assistance and perform better on problem-solving exercises; they also show a higher degree of engagement and better learning of the language syntax.
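A minimal Python sketch of the Beliefs-Desires-Intentions cycle that an agent such as MRCHIPS builds on; the class, the example beliefs and the plan library here are hypothetical illustrations, not material taken from the thesis.

```python
class BDIMentor:
    """Minimal BDI deliberation cycle: perceive an event, update beliefs,
    adopt intentions from triggered plans, then act on the next intention."""

    def __init__(self, plan_library):
        self.beliefs = set()          # facts about the learner's session
        self.intentions = []          # plans the agent has committed to
        self.plan_library = plan_library

    def perceive(self, event):
        # e.g. "syntax_error:missing_colon" observed in the learner's editor
        self.beliefs.add(event)

    def deliberate(self):
        # Desires are triggered by beliefs; commit to plans whose trigger is believed.
        for trigger, plan in self.plan_library.items():
            if trigger in self.beliefs and plan not in self.intentions:
                self.intentions.append(plan)

    def act(self):
        if self.intentions:
            return self.intentions.pop(0)()   # execute the next committed plan
        return None


# Hypothetical plan library mapping observed learner problems to coaching actions.
plans = {
    "syntax_error:missing_colon":
        lambda: "Hint: 'if' and 'for' statements in Python end with a colon.",
    "idle_too_long":
        lambda: "Prompt: would you like to review a worked example?",
}

agent = BDIMentor(plans)
agent.perceive("syntax_error:missing_colon")
agent.deliberate()
print(agent.act())
```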
4

Applications of the genetic algorithms optimisation approach in the design of circular polarised microstrip antennas

Al-Jibouri, Belal January 2005 (has links)
No description available.
5

Migrating relational databases into object-based and XML databases

Maatuk, Abdelsalam January 2009 (has links)
Rapid changes in information technology, the emergence of object-based and WWW applications, and the interest of organisations in securing benefits from new technologies have made information systems re-engineering in general, and database migration in particular, an active research area. In order to improve the functionality and performance of existing systems, the re-engineering process requires identifying and understanding all of the components of such systems. An underlying database is one of the most important components of an information system. A considerable body of data is stored in relational databases (RDBs), yet they have limitations in supporting the complex structures and user-defined data types provided by relatively recent databases such as object-based and XML databases. Instead of throwing away the large amount of data stored in RDBs, it is more appropriate to enrich and convert such data for use by new systems. Most research into the migration of RDBs to object-based/XML databases has concentrated on schema translation and on accessing and publishing RDB data using newer technology, while little attention has been paid to the conversion of data and the preservation of data semantics, e.g., inheritance and integrity constraints. In addition, existing work does not appear to provide a solution for more than one target database. Thus, research on the migration of RDBs is not fully developed. We propose a solution that offers automatic migration of an RDB as a source into recent database technologies as targets, based on available standards such as ODMG 3.0, SQL4 and XML Schema. A canonical data model (CDM) is proposed to bridge the semantic gap between an RDB and the target databases. The CDM preserves and enhances the metadata of existing RDBs to fit the essential characteristics of the target databases. The adoption of standards is essential for increased portability, flexibility and constraint preservation. This thesis contributes a solution for migrating RDBs into object-based and XML databases. The solution takes an existing RDB as input, enriches its metadata representation with the required explicit semantics, and constructs an enhanced relational schema representation (RSR). Based on the RSR, a CDM is generated which is enriched with the RDB's constraints and data semantics that may not have been explicitly expressed in the RDB metadata. The CDM so obtained facilitates both schema translation and data conversion. We design sets of rules for translating the CDM into each of the three target schemas, and provide algorithms for converting RDB data into the target formats based on the CDM. A prototype of the solution has been implemented, which generates the three target databases. An experimental study has been conducted to evaluate the prototype. The experimental results show that the target schemas produced by the prototype and those generated by existing manual mapping techniques were comparable. We have also shown that the source and target databases were equivalent, and demonstrated that the solution, conceptually and practically, is feasible, efficient and correct.
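A minimal sketch of the schema-translation step, assuming a hand-written dictionary as a stand-in for the enriched relational metadata (RSR/CDM) and a toy SQL-to-XSD type mapping; the entity, column names and mapping table are illustrative, not the thesis's translation rules.

```python
# Hand-written stand-in for the enriched relational metadata; a real migration
# tool would derive this from the RDB catalogue and its implicit semantics.
cdm_entity = {
    "name": "Employee",
    "columns": [
        {"name": "emp_id",  "type": "INTEGER", "nullable": False},
        {"name": "surname", "type": "VARCHAR", "nullable": False},
        {"name": "salary",  "type": "DECIMAL", "nullable": True},
    ],
    "primary_key": ["emp_id"],
}

# Toy mapping from SQL column types to XML Schema built-in types.
SQL_TO_XSD = {"INTEGER": "xs:integer", "VARCHAR": "xs:string", "DECIMAL": "xs:decimal"}

def to_xml_schema(entity):
    """Translate one canonical entity into an XML Schema element declaration."""
    lines = [f'<xs:element name="{entity["name"]}">',
             "  <xs:complexType>",
             "    <xs:sequence>"]
    for col in entity["columns"]:
        min_occurs = "0" if col["nullable"] else "1"   # nullable columns become optional
        lines.append(
            f'      <xs:element name="{col["name"]}" type="{SQL_TO_XSD[col["type"]]}" '
            f'minOccurs="{min_occurs}"/>'
        )
    lines += ["    </xs:sequence>", "  </xs:complexType>", "</xs:element>"]
    return "\n".join(lines)

print(to_xml_schema(cdm_entity))
```

Analogous rule sets would emit an ODMG class or an object-relational (SQL4) type from the same canonical entity, which is the point of routing every target through one CDM.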
6

A compositional analysis of broadcasting embedded systems

Brockway, Michael J. January 2010 (has links)
This work takes as its starting point D. Kendall's CANdle/bCANdle algebraic framework for the formal modelling and specification of broadcasting embedded systems based on CAN networks. Checking real-time properties of such systems is beset by problems of state-space explosion, and so a scheme is given for recasting systems specified in Kendall's framework as parallel compositions of timed automata; a CAN network channel is modelled as an automaton. This recasting is shown to be bisimilar to the original bCANdle model. In the recast framework, "compositionality" theorems allow one to infer that a model of a system is simulated by some abstraction of the model, and hence that properties of the model expressible in ACTL can be inferred from analogous properties of the abstraction. These theorems are reminiscent of "assume-guarantee" reasoning, allowing one to build simulations component-wise, although, unfortunately, components participating in a "broadcast" are required to be abstracted "atomically". Case studies are presented to show how this can be used in practice, and how systems which take impossibly long to model-check can be tackled by compositional methods. The work is of broader interest also, as the models are built as UPPAAL systems and the compositionality theorems apply to any UPPAAL system in which the components do not share local variables. The method could, for instance, extend to systems using some network other than CAN, provided it can be modelled by timed automata. Possibilities also exist for building it into an automated tool, complementing other methods such as counterexample-guided abstraction refinement.
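The "parallel composition of timed automata" at the heart of the recasting can be pictured with a small structural sketch. The Python below is an illustration only, with string guards, handshake synchronisation and no symbolic clock analysis; it is not the bCANdle-to-UPPAAL translation itself, and the sender/channel example is hypothetical.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class TimedAutomaton:
    locations: frozenset
    initial: object
    transitions: tuple   # each entry: (source, action, clock_guard, clocks_to_reset, target)

def compose(a, b):
    """Structural parallel composition: handshake on shared actions, interleave
    the rest; clock guards are simply carried along as conjoined strings."""
    shared = {t[1] for t in a.transitions} & {t[1] for t in b.transitions}
    trans = []
    for (s1, act1, g1, r1, t1), (s2, act2, g2, r2, t2) in product(a.transitions, b.transitions):
        if act1 == act2 and act1 in shared:                  # synchronised step
            trans.append(((s1, s2), act1, f"({g1}) && ({g2})", r1 | r2, (t1, t2)))
    for s1, act, g, r, t1 in a.transitions:
        if act not in shared:                                # independent step of a
            trans.extend(((s1, s2), act, g, r, (t1, s2)) for s2 in b.locations)
    for s2, act, g, r, t2 in b.transitions:
        if act not in shared:                                # independent step of b
            trans.extend(((s1, s2), act, g, r, (s1, t2)) for s1 in a.locations)
    return TimedAutomaton(frozenset(product(a.locations, b.locations)),
                          (a.initial, b.initial), tuple(trans))

# Tiny example: a sender and a CAN-like channel handshaking on "send".
sender = TimedAutomaton(frozenset({"idle", "wait"}), "idle",
                        (("idle", "send", "x <= 5", frozenset({"x"}), "wait"),))
channel = TimedAutomaton(frozenset({"free", "busy"}), "free",
                         (("free", "send", "true", frozenset(), "busy"),))
print(compose(sender, channel).transitions)
```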
7

An investigation into server-side static and dynamic web content survivability using a web content verification and recovery (WCVR) system

Aljawarneh, Shadi January 2008 (has links)
Malicious web content manipulation software can be used to tamper with any type of web content (e.g., text, images, video, audio and objects), and as a result organisations are vulnerable to data loss. In addition, several security incident reports from emergency response teams such as CERT and AusCERT clearly demonstrate that the available security mechanisms have not made system break-ins impossible. Therefore, ensuring web content integrity against unauthorised tampering has become a major issue. This thesis investigates the survivability of server-side static and dynamic web content using the Web Content Verification and Recovery (WCVR) system. We have developed a novel security system architecture which provides mechanisms to address known security issues, such as violations of data integrity, that arise in tampering attacks. We propose a real-time web security framework consisting of a number of components that can be used to verify server-side static and dynamic web content, and to recover the original web content if the requested content has been compromised. A conceptual model to extract the client interaction elements, and a strategy to utilise hashing performance, have been formulated in this research. A prototype of the solution has been implemented and experimental studies have been carried out to address the security and performance objectives. The results indicate that the WCVR system can provide tamper detection and recovery for server-side static and dynamic web content. We have also shown that the overhead of the verification and recovery processes is relatively low, and that the WCVR system can efficiently and correctly determine whether the web content has been tampered with.
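A minimal sketch of the hash-compare-and-restore principle behind tamper detection and recovery, assuming a file-based site and a trusted offline backup; the paths and function names are illustrative, and this is not the WCVR implementation (which also covers dynamic content and client interaction elements).

```python
import hashlib
import shutil
from pathlib import Path

def baseline_hashes(web_root):
    """Record a trusted SHA-256 digest for every file under the web root."""
    return {p: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in Path(web_root).rglob("*") if p.is_file()}

def verify_and_recover(web_root, baseline, backup_root):
    """Re-hash served files; restore any file whose digest no longer matches."""
    recovered = []
    for path, trusted_digest in baseline.items():
        current = hashlib.sha256(path.read_bytes()).hexdigest()
        if current != trusted_digest:
            backup = Path(backup_root) / path.relative_to(web_root)
            shutil.copy2(backup, path)      # roll back to the trusted copy
            recovered.append(path)
    return recovered

# Usage sketch (hypothetical paths): take a baseline at deployment time, then
# verify per request or periodically and serve only content that passes.
# baseline = baseline_hashes("/var/www/site")
# tampered = verify_and_recover("/var/www/site", baseline, "/var/backups/site")
```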
8

Automatic schedule computation for distributed real-time systems using timed automata

Park, Young-Saeng January 2008 (has links)
The time-triggered architecture is becoming accepted as a means of implementing scalable, safer and more reliable solutions for distributed real-time systems. In such systems, the execution of distributed software components and the communication of messages between them take place in a fixed pattern and are scheduled in advance, within a given scheduling round, by a global scheduling policy. The principal obstacle in the design of time-triggered systems is the difficulty of finding a static schedule for all resources which satisfies constraints on the activities within the scheduling round, such as the meeting of deadlines. The scheduler has to consider not only the requirements on each processor but also the global requirements of system-wide behaviour, including messages transmitted on networks. Finding an efficient way of building an appropriate global schedule for a given system is a major research challenge. This thesis proposes a novel approach to designing time-triggered schedules which is radically different from existing mathematical methods or algorithms for schedule generation. It entails the construction of timed automata to model the arrival and execution of software tasks and inter-task message communication for a system; the behaviour of an entire distributed system is thus a parallel composition of these timed automata models. A job comprises a sequence of tasks and messages; this expresses a system-wide transaction which may be distributed over a system of processors and networks. The job is formalised as a timed automaton, based on the principle that a task or message can be modelled by finite states and a clock variable. Temporal logic properties are formed to express constraints on the behaviour of the system components, such as precedence relationships between tasks and messages and adherence to deadlines. Schedules are computed by formally verifying that these properties hold for an evolution of the system; a successful schedule is simply a trace generated by the verifier, in this case the UPPAAL model-checking tool, which has been employed to perform the behaviour verification. This approach is guaranteed to generate a practical schedule if one exists and will fail to construct any schedule if none exists. A prototype toolset has been developed to automate the proposed approach: it creates timed automata models, undertakes the analysis, extracts schedules from traces and visualises the generated schedules. Two case studies, one of a cruise control system and the other of a manufacturing cell system, are presented to demonstrate the applicability and usability of the approach and the application of the toolset. Finally, further constraints are considered in order to yield schedules with limited jitter, increased efficiency and system-wide properties.
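The idea that "a successful schedule is simply a trace generated by the verifier" can be pictured with a much simpler search. The sketch below is a hypothetical single-processor toy with no clocks, networks or UPPAAL; it only shows a feasible trace being returned as the schedule, and the task set is invented for illustration.

```python
def find_schedule(tasks):
    """Exhaustive search over task orderings on one processor; the first
    feasible assignment found is returned as a trace (the schedule).

    tasks: list of (name, release, duration, deadline) with integer times."""
    def search(remaining, busy_until, trace):
        if not remaining:
            return trace                          # every task placed: this trace is the schedule
        for i, (name, release, duration, deadline) in enumerate(remaining):
            start = max(busy_until, release)
            if start + duration <= deadline:      # deadline constraint respected
                found = search(remaining[:i] + remaining[i + 1:],
                               start + duration,
                               trace + [(name, start, start + duration)])
                if found is not None:
                    return found
        return None                               # no ordering satisfies the constraints
    return search(sorted(tasks, key=lambda t: t[1]), 0, [])

# Hypothetical task set: (name, release, duration, deadline)
jobs = [("sense", 0, 2, 4), ("compute", 2, 3, 8), ("actuate", 5, 1, 10)]
print(find_schedule(jobs))
```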
9

The formal evaluation and design of routing protocols for wireless sensor networks in hostile environments

Saghar Malik, Kashif January 2010 (has links)
Wireless Sensor Networks (WSNs) have attracted considerable research attention in recent years because of the perceived potential benefits offered by self-organising, multi-hop networks consisting of low-cost, small wireless devices for monitoring or control applications in difficult environments. WSNs may be deployed in hostile or inaccessible environments and are often unattended. These conditions present many challenges in ensuring that WSNs work effectively and survive long enough to fulfil their functions. Securing a WSN against malicious attack is a particular challenge. Due to the limited resources of nodes, traditional routing protocols are not appropriate in WSNs, and innovative methods are used to route data from source nodes to sink nodes (base stations). To evaluate routing protocols against DoS attacks, an innovative design method combining formal modelling and computer simulation has been proposed. This research has shown that, by using formal modelling, hidden bugs (e.g. vulnerability to attacks) in routing protocols can be detected automatically. In addition, through rigorous testing, a new routing protocol, RAEED (Robust formally Analysed protocol for wirEless sEnsor networks Deployment), was developed which is able to operate effectively in the presence of hello flood, rushing, wormhole, black hole, gray hole, sink hole, INA and jamming attacks. It has been proved formally, and shown through computer simulation, that RAEED can counter these DoS attacks. A second contribution of this thesis is the development of a framework to check the vulnerability of different routing protocols against Denial of Service (DoS) attacks. This has allowed us to formally evaluate some existing and well-known routing protocols against various DoS attacks, including the TinyOS Beaconing, Authentic TinyOS using uTesla, Rumour Routing, LEACH, Directed Diffusion, INSENS, ARRIVE and ARAN protocols. This has resulted in the development of an innovative and simple defence technique, with no additional hardware cost, for deployment against wormhole and INA attacks. The detection of weaknesses in the INSENS, ARRIVE and ARAN protocols was also addressed formally. Finally, an efficient design methodology using a combination of formal modelling and simulation is proposed to evaluate the performance of routing protocols against DoS attacks.
10

Objective models for subjective quality estimate of stereoscopic video

Malekmohamadi, Hossein January 2013 (has links)
No description available.
