11

Photogrammetric multi-view stereo and imaging network design

Hosseininaveh Ahmadabadian, A. January 2014 (has links)
This thesis proposes a new approach, photogrammetric multi-view stereo, for accurate and dense 3D reconstruction, including scale recovery from images. The novelty of the method lies in exploiting the length of the stereo camera base separation to define scale within a robust adjustment algorithm. The method is tested by imaging a series of known objects with stereo camera systems of varying quality. In each case, the baseline-scaled network output is used as input into four different state-of-the-art dense matching packages in order to generate a series of dense (detailed) point clouds. Results demonstrate that networks comprising some 50 images captured with consumer-grade digital SLR cameras can deliver 3D point data with an uncertainty of around 100 μm, of comparable quality to a metrology-grade triangulation laser scanner. Experience gained with these stereo systems exposed a key data handling limitation in that both image capture and processing time are highly dependent on the number of camera views used. In particular, the volumes of data make the dense matching process impractical on current consumer-level computing hardware. This problem demands a logical clustering and selection of the most suitable viewpoints (vantage viewpoints) from the large image dataset to provide a reduced network with similar overall capability. The second part of the research described in this thesis addresses this challenge through the development and testing of a novel methodology capable of structuring the viewpoints into clusters and then selecting vantage images, resulting in more effective processing whilst ensuring a specified level of coordinate precision and point cloud completeness. The developed method is compared against the established CMVS clustering method using an in-house Imaging Network Designer (IND) suite of software developed during the course of the research. 
Results demonstrate that IND can provide a better image selection for subsequent dense reconstruction than CMVS in terms of completeness. The ability to select vantage images raises another research question: the feasibility of designing a complete imaging network from scratch. This issue is investigated and validated through a novel stereo imaging network design strategy. Again, this is implemented within the IND software framework and evaluated with both simulation and practical tests. In simulation tests, IND performance was tested by: 1) a comparison between spherical and ellipsoidal imaging network configurations; 2) the capability to select an appropriate stereo camera system to achieve a given level of point cloud precision. In practical tests, IND was evaluated using a purpose-built imaging robot, INDRo, to capture images from each designed camera pose. The images were then used for accurate and dense 3D reconstruction using the photogrammetric multi-view stereo method in two modes: 1) resolving scale with stereo camera base separation; 2) resolving scale with control points. The system provided the opportunity to test the effect of incidence angle, one of the key internal IND parameters, which affects the density of the network and should be adjusted with respect to the performance of feature matching algorithms. The results of this test showed that the matching algorithms work effectively for incidence angles of 10°. The results demonstrated that the final point cloud generated with the system, resolving scale with either control points or stereo camera base separation, agrees to within 0.3 mm with a dataset generated with an Arius3D laser scanner. Furthermore, work on inverse robotic kinematics demonstrated the feasibility of using Particle Swarm Optimization to achieve the required camera poses with this non-standard robot.
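The core scale-recovery idea, using the known stereo base separation to fix the metric scale of an otherwise scale-free reconstruction, can be sketched as a toy computation (hypothetical values; the thesis embeds the constraint inside a robust adjustment rather than applying a simple ratio):

```python
import numpy as np

def recover_scale(cam_left, cam_right, known_baseline_m):
    """Scale factor mapping reconstruction units to metres.

    cam_left, cam_right: reconstructed camera centres (arbitrary units).
    known_baseline_m: physically measured stereo base separation (metres).
    """
    reconstructed_base = np.linalg.norm(np.asarray(cam_right, dtype=float)
                                        - np.asarray(cam_left, dtype=float))
    return known_baseline_m / reconstructed_base

# Toy example: a structure-from-motion solve placed the stereo pair
# 2.5 arbitrary units apart, but the rig's base is known to be 0.25 m.
s = recover_scale([0.0, 0.0, 0.0], [2.5, 0.0, 0.0], 0.25)

point_cloud = np.array([[4.0, 1.0, 10.0]])  # arbitrary units
point_cloud_m = point_cloud * s             # now in metres
```

Applied to every point, the same factor turns the whole dense cloud into metric coordinates, which is what allows the 0.3 mm comparison against the laser-scanner dataset.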
12

Real-time motion capture for analysis and presentation within virtual environments

Brownridge, Adam Mark January 2014 (has links)
This thesis describes motion capture methods with an application for real-time recording of extreme human movement. A wireless gyroscopic sensor based system is used to record and evaluate misalignments in ankle position of ballet dancers in a performance environment. Anatomic alignment has been shown to contribute to dance related injuries, and results of this work show that subtle variations in joint rotation can be clearly measured. The workflow has been developed to extract performance analysis data for fault detection in order to assist augmented feedback methods for the prevention of injury and improved performance. Infra-red depth sensing technology, commonly used in garment design, has been used to produce a representation of a scanned human subject and a workflow established to utilise this character avatar for animation using motion capture data. The process of presenting a visually acceptable representation of an overall performance in addition to the numerical evaluation of specific joint orientation provides a significant contribution to knowledge.
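The joint-rotation measurement described above ultimately rests on integrating angular-rate samples from the gyroscopic sensors into orientation angles. A minimal one-axis sketch (hypothetical sample values; a real pipeline fuses gyroscope, accelerometer and magnetometer data to limit drift):

```python
def integrate_gyro(rates_deg_s, dt):
    """Integrate angular-rate samples (deg/s) into an angle (deg)
    by the rectangular rule.  Drift accumulates with time, which is
    why wireless capture systems fuse multiple sensor modalities."""
    angle = 0.0
    for rate in rates_deg_s:
        angle += rate * dt
    return angle

# Hypothetical data: 100 Hz samples of a constant 90 deg/s ankle
# rotation for 0.5 s should integrate to 45 degrees.
samples = [90.0] * 50
angle = integrate_gyro(samples, dt=0.01)
```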
13

Maximising microprocessor reliability through game theory and heuristics

Docherty, James January 2014 (has links)
Embedded systems are becoming ever more pervasive in our society, with most routine daily tasks now involving their use in some form and the market predicted to be worth USD 220 billion, a rise of 300%, by 2018. Consumers expect more functionality with each design iteration, but for no detriment in perceived performance. These devices can range from simple low-cost chips to expensive and complex systems and are a major cost driver in the equipment design phase. For more than 35 years, designers have kept pace with Moore's Law, but as device size approaches the atomic limit, layouts are becoming so complicated that current scheduling techniques are also reaching their limit, meaning that more resource must be reserved to manage and deliver reliable operation. With the advent of many-core systems and further sources of unpredictability, such as changeable power supplies and energy harvesting, this reservation of capability may become so large that systems will not be operating at their peak efficiency. These complex systems can be controlled through many techniques, with jobs scheduled either offline, prior to execution beginning, or online, at each time or event change. Increased processing power and job types mean that current online scheduling methods that employ exhaustive search techniques will not be suitable to define schedules for such enigmatic task lists, and that new techniques using statistic-based methods must be investigated to preserve Quality of Service. A new paradigm of scheduling through complex heuristics is one way to administer these next levels of processor effectively and allow the use of simpler devices in complex systems, thus reducing unit cost while retaining reliability, a key goal identified by the International Technology Roadmap for Semiconductors for Embedded Systems in Critical Environments. These changes would be beneficial in terms of cost reduction and system flexibility within the next generation of devices. 
This thesis investigates the use of heuristics and statistical methods in the operation of real-time systems, examining the feasibility of Game Theory and Statistical Process Control for the successful supervision of high-load and critical jobs. Heuristics are identified as an effective method of controlling complex real-time issues, with two-person non-cooperative games delivering Nash-optimal solutions where these exist. The simplified algorithms for creating and solving Game Theory events allow for their use within small embedded RISC devices and an increase in reliability for systems operating at the apex of their limits. Within this thesis, heuristic and game-theoretic algorithms for a variety of real-time scenarios are postulated, investigated, refined and tested against existing schedule types; initially through MATLAB simulation, before testing on an ARM Cortex-M3 architecture functioning as a simplified automotive Electronic Control Unit.
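The "Nash-optimal solutions where these exist" of a two-person non-cooperative game can be found by mutual best-response checking, which is cheap enough for small embedded devices. A minimal sketch with hypothetical payoffs (the actual game formulations in the thesis encode scheduling utilities, not these numbers):

```python
def pure_nash(payoff_a, payoff_b):
    """Find pure-strategy Nash equilibria of a two-player game.

    payoff_a[i][j], payoff_b[i][j]: payoffs to the row and column
    players when row picks i and column picks j.  A cell is an
    equilibrium when neither player can gain by deviating alone.
    """
    rows, cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][k] for k in range(cols))
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

# Hypothetical scheduler-vs-workload game with prisoner's-dilemma-style
# payoffs; the single equilibrium is the mutually stable cell (1, 1).
A = [[3, 0], [5, 1]]
B = [[3, 5], [0, 1]]
eq = pure_nash(A, B)
```

The double loop is O(rows × cols × (rows + cols)), small enough that even an exhaustive equilibrium search fits comfortably on a Cortex-M3-class device for the game sizes involved.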
14

Development and characterisation of a near-infrared femtosecond optical parametric oscillator frequency comb

Ferreiro, Teresa I. January 2013 (has links)
This thesis describes a 280 MHz MgO:PPLN-based optical parametric oscillator (OPO) synchronously pumped by a 50 fs Ti:sapphire laser to produce ultrafast pulses in the near-infrared. The OPO tuned over wavelength ranges of 1450–1550 nm and 1624–1750 nm for the signal and idler respectively. The carrier-envelope-offset (CEO) frequency of the signal pulses was stabilised to a 10 MHz reference frequency without f-2f self-referencing, with an RMS phase variation of 0.56 rad over an observation time of 1 second. The relative intensity noise measured for the CEO-stabilised OPO over an observation time of 64 seconds was 0.04%. The repetition frequency of the OPO was stabilised to 280 MHz using a frequency synthesiser at the eighth harmonic (2.24 GHz). This locking loop had an RMS phase variation of 0.98 mrad over a 1 second observation time. The CEO and repetition frequencies were then locked simultaneously to a synthesiser referenced to a Rb-disciplined source, to generate a fully stabilised 1.5 μm frequency comb with an absolute uncertainty in comb mode position of 110 Hz. The upper limit for the fractional instability of a comb mode at 200 THz was found to be 2 × 10⁻¹¹, limited by the stability of the Rb reference. A five-fold increase in comb mode spacing to 1.4 GHz was demonstrated with the stabilised frequency comb. This was achieved using a passive filter cavity, stabilised to a transmission peak via dither locking. The FWHM bandwidth of the optical spectrum of the filtered frequency comb was reduced to 8 nm; however, no increase in comb linewidth was observed. An additional experiment was carried out in which an external cavity diode laser was frequency-stabilised to a saturated absorption peak in Rb at 780.2 nm using dither locking, providing an optical frequency reference for future OPO frequency combs.
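The comb relation underlying these locks, f_n = n·f_rep + f_CEO, makes the quoted figures easy to check; the mode index below is illustrative (computed from the stated 280 MHz spacing), not taken from the thesis:

```python
def comb_mode_frequency(n, f_rep_hz, f_ceo_hz):
    """Optical frequency of comb mode n: f_n = n * f_rep + f_ceo."""
    return n * f_rep_hz + f_ceo_hz

f_rep = 280e6   # repetition frequency, Hz
f_ceo = 10e6    # CEO frequency locked to the 10 MHz reference, Hz

# Mode index of the comb line nearest 200 THz (~1.5 um region).
n = round(200e12 / f_rep)
f_n = comb_mode_frequency(n, f_rep, f_ceo)

# A fractional instability of 2e-11 at ~200 THz corresponds to a few kHz.
delta_f = 2e-11 * f_n
```

Because both f_rep and f_CEO are traceable to the Rb-disciplined reference, the absolute position of every mode inherits that reference's stability, which is why the 2 × 10⁻¹¹ floor is attributed to the Rb source.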
15

Towards task automation for mobile device forensics

Andriotis, Panagiotis January 2015 (has links)
Developed societies have lately experienced the invasion of mobile devices into people's lives. Smartphones, tablets and wearables are embedded in our personal and professional routines. The various smartphone applications take advantage of these devices' increased capabilities and assist users in communicating with other people, entertaining themselves and completing several other tasks. These applications store a significant amount of information in the device's internal memory, and these data can be used as evidence in court. Hence, the need for their extraction, analysis, presentation and preservation has established the Digital Forensics community. The findings of this thesis can be helpful to Law Enforcement Agencies (LEA) and forensic analysts during triage and analysis, assisting them to efficiently extract and visualise underlying patterns in the data. This thesis proposes concepts towards automating specific tasks related to forensic analyses on mobile devices. Its basic theme is the analysis of high-level non-volatile data that exist in the internal memory of mobile devices. It studies three major categories of data (text, multimedia and social network relationships), proposing concepts towards content analysis automation. It proposes a framework to assist forensic analysts to extract data from devices running the Android OS in a forensically sound fashion. Furthermore, it analyses human factors that dictate biased behaviour when users form graphical passwords, in order to investigate whether lightweight methodologies can bypass user authentication schemes such as the Android pattern lock screen. In addition, it investigates whether text-mining tasks can be performed on short texts, with a special focus on SMS sentiment analysis. Moreover, social network analysis tools are implemented, aiming to depict relationships among the people that interact with the user of the device. 
Finally, the performance of a steganalysis tool is evaluated and a framework to describe and visualise data sharing and distribution among applications is proposed.
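The SMS sentiment-analysis task mentioned above can be illustrated in its simplest lexicon-based form; the word weights below are invented for illustration and are not the thesis's method, which investigates whether such text-mining techniques transfer to short, noisy SMS data at all:

```python
def sms_sentiment(text, lexicon):
    """Classify a short message by summing lexicon weights of its tokens.

    Tokens absent from the lexicon contribute zero, which is exactly the
    weakness that makes short, slang-heavy SMS text a hard target for
    conventional sentiment techniques.
    """
    score = sum(lexicon.get(token, 0) for token in text.lower().split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# Illustrative lexicon (hypothetical weights).
lexicon = {"great": 2, "thanks": 1, "sorry": -1, "terrible": -2}
label = sms_sentiment("thanks that was great", lexicon)
```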
16

Realization methods for asynchronous sequential circuits

Magó, Gyula Antal January 1970 (has links)
No description available.
17

A new approach to transistor switching analysis, with applications to digital computer circuits

Vickery, Joseph Colin January 1962 (has links)
This thesis is an account of studies of the high speed switching properties of logic circuits employing transistors. Properties of both the transistor and its associated circuitry are described largely in terms of charge parameters, to simplify the analysis of the transient responses. A system of analysis which uses piecewise-linear switching waveforms is presented, which is capable of reducing an essentially non-linear problem to linear terms whilst preserving sufficient accuracy to yield useful results. By this means predictions of the transient response of circuits operating under typical conditions of interconnection can be made with a minimum of mathematical complexity. The effect of each circuit parameter on the overall performance of the circuits is preserved in physical terms which are useful to the circuit engineer. A particular family of defined current circuits is used to illustrate the theory, and optimum circuits obtained from application of the analysis. Results show that the desirable working conditions for fastest response are not always attainable in practice due to the limitations of the working ranges of practical devices. A study is made of the recovery of reactive elements in the coupling circuits, and leads to a fundamental conclusion regarding the resolving time of reactively coupled circuits when they are not allowed to become underdamped. As a result, optimisation of such circuits can be described completely in terms of the response to the first two edges of a suddenly applied switching signal. Improvements to the basic circuits suggested by the results of the analysis are discussed, and the thesis ends with an indication of other investigations which might be carried out.
18

Deterministic SpaceWire networks

Gibson, David James January 2017 (has links)
SpaceWire-D is an extension to the SpaceWire protocol that adds deterministic capabilities over existing equipment. It does this by using time-division multiplexing, controlled by the sequential broadcasting of time-codes by a network manager. A virtual bus abstraction is then used to divide the network architecture into segments in which all traffic is controlled by a single Remote Memory Access Protocol (RMAP) transaction initiator. Virtual buses are then allocated a number of time-slots in which they are allowed to operate, forming the SpaceWire-D schedule. This research starts by contributing an efficient embedded SpaceWire-D software layer, running on top of the RTEMS real-time operating system, for use in the initiators of a SpaceWire-D network. Next, the SpaceWire-D software layer was used in two LEON2-FT processor boards in combination with multiple other RMAP target boards, routers, a network manager, and a host PC running a suite of applications to create a SpaceWire-D Demonstrator. The SpaceWire-D software layer and SpaceWire-D Demonstrator were used to verify and demonstrate the SpaceWire-D protocol during the ESA SpaceWire-D project and resulted in multiple deliverables to ESA. Finally, this research contributes a novel SpaceWire-D scheduling strategy using a combination of path selection and transaction allocation algorithms. This strategy allows for a SpaceWire-D network to be defined as a list of periodic, aperiodic and payload data bandwidth requirements and outputs a list of paths and an allocation of transactions to time-slots which satisfy the networking requirements of a mission.
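The time-division-multiplexing idea, dividing the schedule into time-slots signalled by broadcast time-codes and giving each virtual bus a set of slots, can be sketched with a simple greedy allocator. This is an illustrative toy, not the thesis's scheduling strategy, which also performs path selection; the 64-slot period matches the 6-bit time-code range:

```python
def allocate_slots(requirements, num_slots=64):
    """Greedily assign time-slots to virtual buses.

    requirements: {bus_id: number_of_slots_needed}.  Each slot hosts at
    most one virtual bus (one RMAP initiator's traffic).  The most
    demanding buses are served first, and each bus's slots are spread
    across the period to bound the gap between its opportunities.
    """
    schedule = [None] * num_slots
    free = list(range(num_slots))
    for bus, needed in sorted(requirements.items(), key=lambda kv: -kv[1]):
        if needed > len(free):
            raise ValueError(f"not enough slots for bus {bus}")
        stride = max(1, len(free) // needed)
        for slot in free[::stride][:needed]:
            schedule[slot] = bus
            free.remove(slot)
    return schedule

# Hypothetical requirements: a payload bus needing 8 slots per period
# and a housekeeping bus needing 2.
sched = allocate_slots({"housekeeping": 2, "payload": 8})
```

Spreading a bus's slots evenly, rather than packing them back-to-back, keeps the worst-case waiting time for its periodic transactions low, which is the property a deterministic schedule must guarantee.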
19

Automated synthesis of delay-insensitive circuits

Sayle, Roger Anthony January 1996 (has links)
The technological trend towards VLSI circuits built from increasing numbers of transistors continues to challenge the ingenuity of both designers and engineers. The use of asynchronous design techniques presents a method for taming the complexity of large concurrent VLSI system design and offers a number of attractive advantages over conventional design styles. In this thesis, we concentrate on the useful class of delay insensitive asynchronous circuits. These have the property that their correct operation is independent of the speed of the individual elements and the delays in the connecting wires. Traditionally, asynchronous circuits are considered much harder to design than their synchronous equivalents due to their inherent nondeterminism. The use of automated formal methods for generating such circuits shields the designer from this complexity. This allows the designer to abstract away from implementation issues and reason about the system behaviour in terms of concurrent processes or high level programs. Because each step of the compilation process can be shown to be sound, the resulting circuits are correct-by-construction. This thesis presents a compilation methodology for designing delay insensitive VLSI systems from behavioural specifications. The synthesis method makes use of a graph-based representation of the circuit's behaviour. Optimization of the global behaviour, by graph transformation, enables the generation of more efficient circuits than those produced by previous asynchronous circuit compilers based on syntax-directed translation. The resulting circuits are further improved by semantics-preserving circuit transformations. A compiler has been constructed that automatically performs the translation and transformation.
20

Computer accessibility for colour-blind people

Jefferson, Luke Alexander January 2008 (has links)
Colour vision deficiency (CVD), often and erroneously called colour-blindness, is the collective term for a variety of abnormal physiological conditions, usually congenital, which result in fewer colour responses than normal. Despite the surprisingly high incidence of CVD (8% of men are colour vision deficient), very few commercial interfaces tackle the problem explicitly. This thesis reviews the different types of accessibility problems encountered by colour vision deficient computer users. It reports results from a questionnaire study designed to ascertain the extent to which CVD impacts computer use, the frequency at which people with CVD experience problems and the severity of these problems. This thesis demonstrates how computational models of CVD can be applied to a variety of existing software tools and interfaces to improve computer accessibility for people with CVD. Specifically, it shows how it is possible to integrate models of CVD and measures of colour difference to facilitate the selection of accessible colour schemes, and to use multi-dimensional scaling to automatically remap combinations of colours that colour vision deficient people find hard to discern. The recolouring algorithm is evaluated using a computerised version of the standard Ishihara pseudoisochromatic plate colour vision test. Applying the algorithm significantly decreases error (and correspondingly increases performance) for both simulated and real colour vision deficient observers, to a level comparable with that of a normal colour observer. In addition to introducing a fully automatic recolouring method, a new semi-automatic recolouring method is described, along with an interface that allows the method to be delivered as an adaptive technology. The interface allows users with CVD to recolour images for their own colour vision impairment in real time. 
The interface is evaluated using a perceptual image similarity task, highlighting the benefits and limits of the proposed method.
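The pairing of a CVD simulation model with a colour-difference measure, the two ingredients the recolouring approach integrates, can be sketched as follows. The matrix is a commonly quoted linear-RGB approximation of protanopia, used purely for illustration; the thesis relies on physiologically grounded CVD models and perceptual difference metrics rather than Euclidean RGB distance:

```python
import numpy as np

# Commonly quoted linear-RGB approximation of protanopia (illustrative).
PROTAN = np.array([[0.567, 0.433, 0.000],
                   [0.558, 0.442, 0.000],
                   [0.000, 0.242, 0.758]])

def simulate_protanopia(rgb):
    """Map an RGB colour (0-1 floats) to its approximate protanope appearance."""
    return PROTAN @ np.asarray(rgb, dtype=float)

def colour_difference(c1, c2):
    """Euclidean RGB distance.  A recolouring tool would instead use a
    perceptual metric (e.g. a Lab-space difference) for this comparison."""
    return float(np.linalg.norm(np.asarray(c1) - np.asarray(c2)))

red, green = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
d_normal = colour_difference(red, green)
d_protan = colour_difference(simulate_protanopia(red),
                             simulate_protanopia(green))
# Red and green, trivially distinct to a normal observer, nearly
# collapse under the simulated deficiency.
```

A recolouring algorithm searches for replacement colours that restore the simulated difference d_protan while staying close to the original palette, which is exactly the trade-off the automatic and semi-automatic methods above negotiate.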
