351

Supporting project comprehension with revision control system repository analysis

Burn, Andrew James January 2011 (has links)
Context: Project comprehension is an activity relevant to all aspects of software engineering, from requirements specification to maintenance. The historical, transactional data stored in revision control systems can be mined and analysed to produce a great deal of information about a project. Aims: This research aims to explore how the mining, analysis and presentation of revision control system data can be used to augment aspects of project comprehension, including change prediction, maintenance, visualization, management, profiling, sampling and assessment. Method: A series of case studies investigates how transactional data can be used to support project comprehension. A thematic analysis of revision logs is used to explore the development process and developer behaviour. A benchmarking study of a history-based model of change prediction is conducted to assess how successfully such a technique can augment syntax-based models. A visualization tool is developed for managers of student projects with the aim of evaluating which visualizations best support their roles. Finally, a quasi-experiment is conducted to determine how well an algorithmic model can automatically select a representative sample of code entities from a project, in comparison with expert strategies. Results: The thematic analysis case study classified maintenance activities in 22 undergraduate projects and four real-world projects. The change prediction study calculated information retrieval metrics for 34 undergraduate projects and three real-world projects, and included an in-depth exploration of the model's performance and applications in two selected projects. File samples for seven projects were generated by six experts and three heuristic models and compared to assess agreement rates, both among the experts and between the experts and the models. Conclusions: When the results from each study are evaluated together, the evidence strongly shows that the information stored in revision control systems can indeed be used to support a range of project comprehension activities in a manner which complements existing, syntax-based techniques. The case studies also help to develop the empirical foundation of repository analysis in the areas of visualization, maintenance, sampling, profiling and management; the research also shows that students can be viable substitutes for industrial practitioners in certain areas of software engineering research, which weakens one of the primary obstacles to empirical studies in these areas.
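
The abstract describes the change prediction model only at a high level. As a rough illustration of history-based change prediction, the sketch below implements simple co-change counting (files committed together in the past are predicted to change together in the future); it is a minimal baseline under an assumed data shape, not the thesis's actual model, and the names (`commits`, `min_conf`) are illustrative.

    from collections import defaultdict
    from itertools import combinations

    def co_change_counts(commits):
        """Count how often each pair of files appears in the same commit.
        `commits` is a list of lists of file paths (one list per commit)."""
        pair_counts = defaultdict(int)
        file_counts = defaultdict(int)
        for files in commits:
            files = sorted(set(files))
            for f in files:
                file_counts[f] += 1
            for a, b in combinations(files, 2):
                pair_counts[(a, b)] += 1
        return pair_counts, file_counts

    def predict_co_changes(changed_file, pair_counts, file_counts, min_conf=0.5):
        """Files that historically changed alongside `changed_file`,
        ranked by confidence = co-change count / change count."""
        predictions = []
        for (a, b), n in pair_counts.items():
            if changed_file in (a, b):
                other = b if a == changed_file else a
                conf = n / file_counts[changed_file]
                if conf >= min_conf:
                    predictions.append((other, conf))
        return sorted(predictions, key=lambda p: -p[1])

    # Toy history: three commits touching overlapping files.
    commits = [["a.c", "a.h"], ["a.c", "a.h", "main.c"], ["main.c"]]
    pairs, files = co_change_counts(commits)
    print(predict_co_changes("a.c", pairs, files))  # [('a.h', 1.0), ('main.c', 0.5)]
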
352

Electronic Nose Optimisation

Scott, Simon Michael January 2005 (has links)
No description available.
353

QoS-aware and policy based mobile data offloading

Amani, Mojdeh January 2014 (has links)
The rapid growth in the number of 3G/4G-enabled devices such as smartphones and tablets has created exceptional demand for ubiquitous connectivity and high-quality applications. As a result, cellular networks are struggling to keep up with this explosive demand for data traffic. The emergence of LTE has boosted cellular network throughput; however, these improvements are not sufficient given the limited availability of licensed spectrum. To meet the requirements of capacity-hungry applications, Wi-Fi offloading has been intensively researched as an essential approach to alleviate the mobile data traffic load on the cellular network by providing extra capacity and improving overall performance. Offloading algorithms should be evaluated and compared, against at least the baseline case of carrying all data traffic over LTE, on how well they steer Wi-Fi offloading to increase the combined throughput and network performance of the LTE and Wi-Fi access technologies connected to the evolved packet core (EPC). In this thesis, novel offloading algorithms are proposed and implemented to address challenges in Wi-Fi offloading from LTE networks and to provide solutions when performance needs exceed the capability of the LTE access network. In the design of such smart offloading techniques, important issues such as scalability and stability are considered. Through an extensive set of simulations, the performance of the proposed techniques is thoroughly investigated, focusing on the figures of merit that affect user experience: the end-to-end throughput that a flow can achieve, offloading efficiency and packet dropping rate. Furthermore, these evaluations demonstrate that offloading users from LTE to Wi-Fi reduces the burden on the LTE network without affecting user experience. It is also shown that the mobile communication architecture can be improved further by applying the principles of Software-Defined Networking (SDN), providing logically centralized control of the overall infrastructure and enabling programmability.
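
The proposed algorithms are not spelled out in the abstract. Purely as an illustration of what a QoS-aware, policy-based offload decision can look like, here is a toy sketch; the thresholds, parameters and rule set are assumptions invented for the example, not the thesis's policy.

    from dataclasses import dataclass

    @dataclass
    class Flow:
        throughput_req: float   # Mbps required by the application
        delay_sensitive: bool   # e.g. a VoIP or video call

    def should_offload(flow, lte_load, wifi_load, wifi_capacity,
                       load_threshold=0.8):
        """Toy policy: offload a flow to Wi-Fi only when LTE is congested,
        Wi-Fi has spare capacity for it, and QoS would not suffer."""
        wifi_headroom = wifi_capacity * (1.0 - wifi_load)  # Mbps still free
        if lte_load < load_threshold:
            return False          # LTE can still serve the flow well
        if wifi_headroom < flow.throughput_req:
            return False          # Wi-Fi cannot honour the QoS demand
        if flow.delay_sensitive and wifi_load > load_threshold:
            return False          # keep latency-critical flows on LTE
        return True

    # LTE at 90% load, Wi-Fi at 30% of 50 Mbps: offload a 5 Mbps best-effort flow.
    print(should_offload(Flow(5.0, False), lte_load=0.9,
                         wifi_load=0.3, wifi_capacity=50.0))   # True
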
354

Robust optimization for multi-antenna downlink transmission in cellular networks

Nasseri, Saba January 2015 (has links)
In multi-cell networks, where resources are aggressively reused and cell sizes are shrinking to accommodate more users, eliminating interference is the key factor in reducing system energy consumption. The growth in demand for wireless services has urged researchers to find new and efficient ways of increasing coverage and reliability, such as coordinated signal processing across base stations. The benefits of coordinated signal processing are exploited optimally when perfect channel state information at the transmitter (CSIT) is available. The assumption of perfect knowledge of the channel is, however, often unrealistic in practice. Noise-prone channel estimation, quantization effects, fast-varying environments combined with delay requirements, and hardware limitations are some of the most important sources of error. Providing robustness to imperfect channel state information (CSI) is, therefore, a task of significant practical interest. Current robust designs address channel imperfections with worst-case and stochastic approaches. In worst-case analysis, the channel uncertainties are considered deterministic and norm-bounded, and the resulting design is a conservative optimization that guarantees a certain quality of service (QoS) for every allowable perturbation. The stochastic approach focuses on the average performance under assumptions about the channel statistics, such as the mean and covariance. System performance can break down when persistent extreme errors occur. Thus, an outage probability-based approach is developed that keeps the probability of the channel condition falling below an acceptable level low. Compared to worst-case methods, this approach can optimize the average performance while still accounting for extreme scenarios proportionally. In the existing literature, robust precoder designs for single-cell downlink transmission have been extensively investigated, with inter-cell interference treated as background noise. However, robust multi-cell signal processing has not been adequately explored. In this thesis, we focus on the robust design of downlink beamforming vectors for multiple-antenna base stations (BSs) in a multi-cell interference network. We formulate a robust distributed beamforming (DBF) problem to independently design beamformers for the local users of each BS. In DBF, the combination of each BS's total transmit power and its resulting interference power towards other BSs' users is minimized while the required signal-to-interference-plus-noise ratios (SINRs) for its local users are maintained. In our first approach to solving the proposed robust downlink beamforming problem for a multiple-input single-output (MISO) system, we assume only imperfect knowledge of the channel covariance is available at the base stations. The uncertainties in the channel covariance matrices are assumed to be confined in ellipsoids of given sizes and shapes. We obtain exact reformulations of the worst-case quality of service (QoS) and inter-cell interference constraints based on Lagrange duality, avoiding the coarse approximations used by previous solutions. The final problem formulations are converted to convex forms using semidefinite relaxation (SDR). Through simulation results, we investigate the achievable performance and the impact of parameter uncertainty on the overall system performance.
In the second approach, in contrast to the 'average-case' and 'worst-case' estimation error scenarios in the literature, the outage probability-based approach is proposed for the aforementioned optimization problem to provide robustness against channel imperfections. The outages are due to the uncertainties that naturally emerge in the estimation of channel covariance matrices between a BS and its intra-cell local users as well as the users of other cells. We model these uncertainties using random matrices, analyze their statistical behavior, and formulate a tractable probabilistic approach to the design of optimal robust downlink beamforming vectors by transforming the probabilistic constraints into a semidefinite programming (SDP) form with linear matrix inequality (LMI) constraints. The performance and power efficiency of the proposed probabilistic algorithm compared to the worst-case approach are assessed and demonstrated through simulation results. Finally, we shift to the case where imperfect channel state information is available at both the transmitter and receiver sides; hence we adopt a bounded deterministic model for the error in the instantaneous CSI and design the downlink beamformers accordingly. The robustness criterion is to minimize the transmitted power while guaranteeing a certain quality of service per user for every possible realization of the channel that is compatible with the available channel state information. To derive closed-form solutions for the original nonconvex problem, we transform the worst-case constraints into an SDP with LMI constraints using the standard rank relaxation and the S-procedure. The superiority of the proposed model is confirmed through simulation results.
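
The robust formulations themselves are not given in the abstract. As a point of reference, the sketch below solves the classical non-robust downlink beamforming power-minimisation problem via semidefinite relaxation (SDR) with cvxpy (assumed available) — the baseline that robust variants of this kind build on; the channel values, SINR target and noise power are illustrative.

    import numpy as np
    import cvxpy as cp

    np.random.seed(0)
    M, K = 4, 2                  # transmit antennas, single-antenna users
    sigma2, gamma = 1.0, 2.0     # noise power, per-user SINR target
    H = (np.random.randn(K, M) + 1j * np.random.randn(K, M)) / np.sqrt(2)

    # One Hermitian PSD matrix W_k (standing in for w_k w_k^H) per user;
    # the rank-one constraint is dropped -- this is the relaxation.
    W = [cp.Variable((M, M), hermitian=True) for _ in range(K)]
    constraints = [Wk >> 0 for Wk in W]
    for k in range(K):
        signal = cp.real(H[k].conj() @ W[k] @ H[k])
        interference = sum(cp.real(H[k].conj() @ W[j] @ H[k])
                           for j in range(K) if j != k)
        constraints.append(signal >= gamma * (interference + sigma2))

    prob = cp.Problem(cp.Minimize(sum(cp.real(cp.trace(Wk)) for Wk in W)),
                      constraints)
    prob.solve(solver=cp.SCS)
    print("total transmit power:", prob.value)
    # If an optimal W_k is (numerically) rank one, the beamformer w_k is its
    # leading eigenvector scaled by the square root of the leading eigenvalue.
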
355

Spatial models in computer-based information systems

Thomas, Adrian Lynn January 1976 (has links)
From a series of initial studies in the area of computer cartography, a dual data structure was evolved based on matrix representations of graphs and the use of boolean expressions. This data structure was used principally to represent zones in space, though, by using the boundaries of zones, it was possible to create line networks. The original idea was to use the boolean expressions as an input language for creating volume and area descriptions and to use the graph matrices for internal manipulation and creating graphic output. However, a way was found to interpret the boolean expressions directly into the form of graphic output suitable for the raster-scan displays given by television monitors. The software implementation of this process was very slow but, with the current developments in integrated circuitry, it suggested a way of creating a new form of parallel display processor. This possibility was investigated initially as a general processor to carry out several related spatial operations and then, finally, merely to create displays. The applications depend on (1) the general nature of the data structure used and the graphic languages it makes possible and (2) the real-time manipulation of displays. In the case of three-dimensional scenes, this includes an automatic hidden-line and hidden-area removal capability. The particular applications which have been considered include the fast access and display of maps and technical drawings from planning, architectural and engineering databases; the real-time generation of displays for training simulation; the preparation of animated films for teaching and entertainment; the control of numerically-controlled machine tools; and solving the placement problem in computer-aided design work and overlap problems in typesetting and map annotation.
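
The thesis's data structure is only described in outline, but the core idea of evaluating a boolean zone description directly during a raster scan can be sketched in a few lines; the half-plane representation and the specific zone below are assumptions chosen for illustration.

    def halfplane(a, b, c):
        """Zone primitive: the set of points with a*x + b*y + c >= 0."""
        return lambda x, y: a * x + b * y + c >= 0

    def render(zone, width, height):
        """Evaluate a boolean zone description at every pixel of a raster scan."""
        for y in range(height):
            print("".join("#" if zone(x, y) else "." for x in range(width)))

    # A wedge-shaped zone as the AND of three half-planes; OR and NOT of
    # predicates compose more complex zone descriptions the same way.
    right_of = halfplane(1, 0, -2)      # x >= 2
    above    = halfplane(0, 1, -1)      # y >= 1
    inside   = halfplane(-1, -1, 12)    # x + y <= 12
    render(lambda x, y: right_of(x, y) and above(x, y) and inside(x, y), 16, 8)
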
356

Tactile probing strategies for soft tissue examination

Konstantinova, Jelizaveta January 2015 (has links)
This thesis investigates how behavioural examination strategies are used to enhance tactile perception during the examination of soft viscoelastic environments. During the last few decades, Robot-assisted Minimally Invasive Surgery (RMIS) has become increasingly popular and has been employed in various medical procedures with proven benefits for the patient. However, direct tactile feedback, which greatly improves surgical outcomes, is not present in RMIS. Artificial probing performed on ex-vivo and phantom tissues shows high variability in the obtained data. The variability is caused by non-homogeneous tissue stiffness and the resulting nonlinear temporal dynamics of tool indentation. Numerical simulations indicate that the probing speed for a fixed indentation influences the variability of tissue viscoelastic parameter estimation. To reduce variability in artificial tactile examination, suitable probing behaviour should be implemented. Three techniques of manual palpation are studied to understand the behavioural patterns of soft tissue tactile examination, namely, modulation of the applied force, localised examination, and global examination. Humans use force modulation strategies to enhance their perception of harder areas in non-homogeneous environments. A second-order reactive autoregressive model has been found to describe this type of technique. Robotic palpation showed how this technique enhances the perceived inhomogeneous stiffness of the environment. The study of unidirectional localised manual palpation has shown that combinations of force-velocity strategies can be used to enhance the perception of hard formations in viscoelastic environments. To validate the obtained probing strategies, finite element simulations and tele-manipulated palpation were used. It was found that kinaesthetic and/or force feedback is used by humans to detect hard formations. Finally, a remote palpation procedure on a silicone phantom utilizing a tele-manipulation setup was performed to study the behaviour of remote global palpation. The results demonstrate the effectiveness of the global palpation patterns used during manual soft tissue examination as well as in tele-manipulated palpation.
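
As a small illustration of why probing speed matters for parameter estimation, the following sketch uses a Kelvin-Voigt viscoelastic model (a standard choice, assumed here; the thesis's simulations may differ): the damping term makes the measured force, and hence any naive stiffness estimate, depend on the indentation speed.

    def indentation_force(depth, speed, k=200.0, c=5.0):
        """Kelvin-Voigt viscoelastic model: F = k*x + c*dx/dt.
        k (N/m) and c (N*s/m) are invented tissue parameters."""
        return k * depth + c * speed

    # The same 5 mm indentation probed at three speeds gives three different
    # forces, so a naive stiffness estimate F/x varies with probing speed.
    depth = 0.005  # metres
    for speed in (0.001, 0.01, 0.05):  # metres per second
        force = indentation_force(depth, speed)
        print(f"speed {speed * 1000:4.0f} mm/s -> force {force:.3f} N, "
              f"apparent stiffness {force / depth:6.1f} N/m")
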
357

Characteristics and regularities of graphs and words

Christou, Michalis January 2014 (has links)
In recent years there has been an increasing interest in discrete structures such as graphs, trees and strings that appear in Mathematics and Computer Science but also in many other interdisciplinary areas including bioinformatics, pattern matching and data compression. We investigate several types of regularities and characteristics appearing in these structures, as well as algorithms for their computation. Graphs are the most popular objects of study in discrete mathematics. We show some progress in extremal graph theory, i.e. problems which investigate extremal graphs satisfying certain properties (maximizing planar graphs under girth constraints, the degree/diameter problem for trees and pseudotrees, bipartite Ramsey numbers involving stars, stripes and trees). Words, also called strings, are structures that have acquired great importance in recent years, mainly due to their use in DNA modelling. We show some new results regarding the identification and appearance of seeds in strings (a linear-time algorithm that computes the minimal left-seed array of a given string, an O(n log n) time algorithm that computes the minimal right-seed array of a given string x, a linear-time solution to compute the maximal left-seed/right-seed array, and the appearance of quasiperiodic structures in Fibonacci words and general words). We also show some progress regarding the identification and appearance of abelian regularities in strings (quadratic-time algorithms for the identification of all abelian borders of a string, bounds on the number of abelian borders of a word, and bounds on the length of the shortest border of a binary string). Furthermore, we investigate the average number of regularities in a word and reveal some interesting properties of Padovan words, a family of words related to the Fibonacci sequence. Finally, we present a problem on trees: how to output all subtree repeats of a tree in linear time using a string representation of the tree.
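
The abstract mentions quadratic-time identification of all abelian borders. A direct sketch of the notion: an abelian border of length k is a prefix and a suffix of length k that are permutations of each other (equal Parikh vectors); maintaining the two letter-count tables incrementally gives O(n·σ) time. This is a straightforward reconstruction from the definition, not necessarily the thesis's algorithm.

    from collections import Counter

    def abelian_borders(w):
        """Lengths k for which the prefix and the suffix of length k are
        permutations of each other (equal Parikh vectors). Updating and
        comparing the two count tables costs O(sigma) per step."""
        n = len(w)
        prefix, suffix = Counter(), Counter()
        lengths = []
        for k in range(1, n):          # proper borders only
            prefix[w[k - 1]] += 1      # extend the prefix to length k
            suffix[w[n - k]] += 1      # extend the suffix to length k
            if prefix == suffix:
                lengths.append(k)
        return lengths

    print(abelian_borders("abba"))  # [1, 2, 3]: 'a'/'a', 'ab'/'ba', 'abb'/'bba'
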
358

Strategies for the execution of long-term continuous and simultaneous tasks in grids

Haberland, Valeriia January 2015 (has links)
Increasingly large amounts of computing resources are required to execute resource-intensive, continuous and simultaneous tasks. For instance, automated monitoring of temperature within a building is necessary for maintaining comfortable conditions for people, and it has to be continuous and simultaneous for all rooms in the building. Such monitoring may function for months or even years. Continuity means that a task has to produce results in a real-time manner without significant interruptions, while simultaneity means that tasks have to run at the same time because of data dependencies. Although a Grid environment has a large amount of computational resources, they might be scarce at times due to high demand, and resources occasionally fail. A Grid might be unable or unwilling to commit to providing clients' tasks with resources for long durations such as years. Therefore, each task will be interrupted sooner or later, and our goal is to reduce the duration and number of interruptions. To find a mutually acceptable compromise, a client and the Grid resource allocator (GRA) negotiate over time slots of resource utilisation. Assuming a client is not aware of resource availability changes, it can infer this information from the GRA's proposals. Resource availability is considered to change near-periodically over time, which a client can exploit. We developed a client negotiation strategy which can adapt to the tendencies in resource availability changes using fuzzy control rules. A client might become more generous towards the GRA if there is a risk of resource exhaustion or if the interruption (current or total) is too long. A client may also ask for a shorter task execution if this execution would end around the time of maximum resource availability. In addition, a task re-allocation algorithm is introduced for inter-dependent tasks, for the case when one task can donate its resources to another.
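
The fuzzy control rules are not given in the abstract. The toy sketch below shows the flavour of such a strategy: membership functions grade how long the current interruption is and how steeply availability is falling, and the strongest firing rule scales the client's concession. All membership shapes and constants are invented for illustration, not taken from the thesis.

    def ramp(x, lo, hi):
        """Saturating fuzzy membership: 0 below lo, 1 above hi, linear between."""
        return min(1.0, max(0.0, (x - lo) / (hi - lo)))

    def concession_rate(interruption_hours, availability_trend):
        """Toy fuzzy rules: the longer the current interruption and the steeper
        the decline in resource availability, the more generous the next offer."""
        long_interruption = ramp(interruption_hours, 0.0, 24.0)
        falling_resources = ramp(-availability_trend, 0.0, 1.0)
        firing = max(long_interruption, falling_resources)  # strongest rule wins
        return 0.05 + 0.25 * firing   # fraction by which to relax the demand

    # 8-hour interruption, availability falling at 0.4 units/hour -> concede 15%.
    print(concession_rate(interruption_hours=8, availability_trend=-0.4))
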
359

Identification of haptic based guidance in low visibility conditions

Dissanayake Mudiyanselage, Anuradha Ranasinghe January 2015 (has links)
This thesis presents the identification of abstracted dynamics of haptic-based human control policies and human responses in guiding/following using hard reins in low-visibility conditions. The extracted haptic-based guidance policies can be implemented on a robot to guide a human in low-visibility conditions such as indoor fire-fighting, disaster response, and search and rescue. Firstly, the thesis presents haptic-based guidance in human-human interactions. The control policies were modeled by a simple linear auto-regressive (AR) model. It was found that the guiding agent's control policy can be modeled as a 3rd-order predictive AR system and the human follower can be modeled as a 2nd-order reactive AR system. Secondly, the human follower's dynamics were modeled by a time-varying virtual damped inertial system to understand how trust in the guider is reflected in physical variables. Experimental results on human trust showed that the coefficient of virtual damping is most sensitive to the follower's trust. Thirdly, the thesis evaluates human-robot interactions when the control policy identified from human guiders was implemented on a planar 1-DoF robotic arm to perturb blindfolded subjects' dominant arms to guide them to a desired position in leftward/rightward directions. Experiments were carried out with naive and trained subjects. Human behavior in leftward/rightward movements is asymmetric for naive subjects and symmetric for trained subjects. Moreover, it was found that naive subjects elicit a 2nd-order reactive behavior similar to that in the human demonstration experiments, whereas trained subjects developed a 2nd-order predictive following behavior. Furthermore, naive and trained subjects' arm muscle activation is significantly different between leftward and rightward arm perturbations. Finally, the thesis presents how humans trained on primitive haptic patterns delivered through a wearable sleeve can recognize shifted and linearly combined versions of those patterns.
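
As an illustration of the AR identification underlying results like these, the sketch below fits an autoregressive model to a trajectory by ordinary least squares and recovers the coefficients of a synthetic 2nd-order process; it is a generic AR fit, not the thesis's exact estimation procedure.

    import numpy as np

    def fit_ar(y, order):
        """Ordinary least-squares fit of an AR(p) model:
        y[t] = a_1*y[t-1] + ... + a_p*y[t-p] + e[t]."""
        p = order
        # Column i holds the lag-(i+1) values y[t-(i+1)] for t = p..n-1.
        X = np.column_stack([y[p - i - 1: len(y) - i - 1] for i in range(p)])
        coeffs, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
        return coeffs

    # Synthetic 'follower' trajectory from a known, stable AR(2) process.
    rng = np.random.default_rng(1)
    a1, a2, y = 1.2, -0.4, np.zeros(500)
    for t in range(2, 500):
        y[t] = a1 * y[t - 1] + a2 * y[t - 2] + 0.05 * rng.standard_normal()
    print(fit_ar(y, 2))   # approximately [ 1.2  -0.4 ]
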
360

Enhanced heuristics for numeric temporal planning

Piacentini, Chiara January 2015 (has links)
After 50 years of fundamental research, domain-independent planning has recently started to be applied to numerous real-world problems. However, this has shown that the techniques developed so far are not completely mature: improvements can be made in different directions, such as in the area of metric temporal planning. This PhD research focuses on how more sophisticated and informative heuristics can be used in the general context of automated planning when numeric and temporal constraints are a significant part of the problem. As a starting point, we use as a reference example the voltage control problem in distributed electricity networks of power systems. This domain is a real-world application of planning in which non-linear numeric effects and exogenous events are combined, challenging state-of-the-art planners. A power system is a nation-wide infrastructure which delivers electricity from suppliers to consumers. Technical and economic considerations impose constraints on the different elements of the network and subsequently on the control variables of interest. One of the main parameters is the voltage, which must lie within strict boundaries. The voltage, as well as all the other physical quantities involved, is subject to the variation of the electrical output, which changes through time. The effects of these changes propagate across the network. This comes with a substantial computational burden and calls for extensions to be developed to enable the application of automated planning. In addition, the presence of events representing predicted load and supply over time requires the planner to interact with uncontrollable events. The voltage control problem is the starting point for our investigation of how the standard delete relaxation behaves in the presence of numeric and temporal constraints. Fully relaxing all the negative effects can result in a heuristic that is too poorly informed. In this thesis we explore different ways to enhance the heuristic, adding selected negative effects, while not compromising the efficiency of the heuristic computation too much. In particular, we study the numeric temporal heuristic of the planner popf2, based on the Temporal Relaxed Planning Graph (TRPG), and propose a way to take into account numeric effects that are calculated by external modules connected to the planner. Negative effects of predictable numeric exogenous events in the presence of trajectory constraints are also taken into account in the heuristic.
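
To make "relaxing negative effects" concrete, the sketch below computes the classical additive delete-relaxation heuristic h_add for a propositional problem, ignoring all delete lists; heuristics such as the TRPG extend this idea to time and numbers and selectively restore negative effects. The toy domain is invented for illustration.

    import math

    def h_add(state, goal, actions):
        """Additive delete-relaxation heuristic: delete lists are ignored and
        each fact's cost is propagated through its cheapest achiever.
        `actions` is a list of (preconditions, add_effects, cost) triples."""
        cost = {f: 0.0 for f in state}
        changed = True
        while changed:
            changed = False
            for pre, adds, c in actions:
                if all(p in cost for p in pre):
                    new_cost = sum(cost[p] for p in pre) + c
                    for f in adds:
                        if new_cost < cost.get(f, math.inf):
                            cost[f] = new_cost
                            changed = True
        return sum(cost.get(g, math.inf) for g in goal)

    actions = [
        (frozenset({"at_a"}), frozenset({"at_b"}), 1.0),                  # move
        (frozenset({"at_b"}), frozenset({"have_key"}), 1.0),              # pick up
        (frozenset({"at_b", "have_key"}), frozenset({"door_open"}), 1.0), # open
    ]
    print(h_add({"at_a"}, {"door_open"}, actions))
    # 4.0: h_add double-counts the shared precondition at_b (the real plan costs 3)
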
