191

A model of e-learning uptake and continuance in Higher Educational Institutions

Pinpathomrat, Nakarin January 2015 (has links)
To predict and explain E-learning usage in higher educational institutions (HEIs) better, this research conceptualized E-learning usage as two steps: E-learning uptake and continuance. The aim was to build a model of effective uptake and continuance of E-learning in HEIs, or ‘EUCH’. The EUCH model was constructed by applying five grounded theories: the Unified Theory of Acceptance and Use of Technology (UTAUT); Keller’s ARCS model; the Theory of Reasoned Action (TRA); Cognitive Dissonance Theory (CDT); and Adaptation Level Theory (ALT). A preliminary study was conducted with experts and end users (students) to confirm the factors of E-learning uptake and continuance. With confirmation through triangulation from at least two sources of data (literature, expert review and end-user review), all the proposed factors were indeed confirmed. A longitudinal study was conducted in a Thai university to: (a) assess the model’s performance for E-learning uptake and continued use; (b) validate the relationships between the proposed EUCH model variables; and (c) investigate the consequence of E-learning usage for students’ learning performance. The results of the longitudinal study suggested that: (a) the EUCH model predicts the uptake and continued use of E-learning as well as the existing comparative models (TAM, UTAUT and ECM), but offers an improvement in explanation; (b) students’ initial expectations influence their uptake of E-learning, and changes in their expectations during the usage period influence their continued use; and (c) no influence was found from E-learning usage on students’ learning performance. Even though the effect of E-learning usage on students’ learning performance was not confirmed by the empirical results of this study, it could be argued that E-learning usage is an initial condition for realizing the benefits of E-learning for students and HEIs: if there is no use, there will be no benefit. Although its predictive power and precision on E-learning uptake and continuance were not found to improve on comparative models on purely statistical grounds, the EUCH model, which bridges the existing gap between findings on uptake and continuance of E-learning, provides an improved understanding of the processes of E-learning usage and the prediction of E-learning usage at any given time within a single model.
192

Telling ancient tales to modern machines : ontological representation of Sumerian literary narratives

Nurmikko, Terhi January 2015 (has links)
This thesis examines the potential of semantic web technologies to support and complement scholarship in Assyriology. Building on prior research, it is unique in its assessment of the suitability of three existing OWL ontologies (the CIDOC Conceptual Reference Model, FRBRoo and Ontomedia) to adequately capture and represent the heterogeneous and incomplete narratives published as composites by the Electronic Text Corpus of Sumerian Literature. Its agenda sits firmly within the interdisciplinary context of the Digital Humanities and Web Science, and it describes a process centred on the development, implementation and evaluation of an ontological representation system (mORSuL), designed to reflect the needs, desires, challenges and opportunities of Assyriological research paradigms. Underlying the process are two fundamental assumptions: firstly, that semantic technologies can be used to support academic endeavours in the Humanities; and secondly, that the benefits of doing so can be identified and evaluated. The thesis culminates in the conclusion that these existing ontologies are mostly suitable for the representation of the narrative content of these ancient texts, requiring only a few additions and changes.
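As an illustration of the kind of ontological representation being assessed, the sketch below uses Python's rdflib to encode a single, hypothetical narrative event in CIDOC CRM terms. The event, the participants and the example namespace are invented for demonstration and are not drawn from the ETCSL composites or from mORSuL; the use of the Erlangen CRM namespace URI is likewise an assumption about which OWL serialisation of the CRM one would load.

```python
# A minimal, illustrative sketch (not the mORSuL system itself): representing one
# hypothetical narrative event as CIDOC CRM-style triples with rdflib.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

# Assumed namespaces: the Erlangen CRM OWL implementation and an invented example space.
CRM = Namespace("http://erlangen-crm.org/current/")
EX = Namespace("http://example.org/etcsl/")

g = Graph()
g.bind("crm", CRM)
g.bind("ex", EX)

# Placeholder event: a protagonist travels to a place (all identifiers are invented).
event = EX["event/journey-01"]
hero = EX["actor/hero-01"]
place = EX["place/city-01"]

g.add((event, RDF.type, CRM["E5_Event"]))
g.add((event, RDFS.label, Literal("Journey episode (composite line range unspecified)")))
g.add((hero, RDF.type, CRM["E21_Person"]))
g.add((event, CRM["P11_had_participant"], hero))
g.add((place, RDF.type, CRM["E53_Place"]))
g.add((event, CRM["P7_took_place_at"], place))

print(g.serialize(format="turtle"))
```

Gaps and damaged passages in a composite could then be marked with further annotation properties, which is broadly where the thesis finds that the existing ontologies need small additions.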
193

The computational assessment of mechanical fixation failure in cemented total hip arthroplasty

Coultrup, Oliver J. January 2010 (has links)
No description available.
194

Using social data as context for making recommendations (semantics of people and culture)

Noor, Salma January 2013 (has links)
This research explores the potential of utilising social Web data as a source of contextual information for searching and information retrieval tasks. Using a semantic and ontological approach, it works towards a support system for providing adaptive and personalised recommendations for Cultural Heritage Resources. Most knowledge systems nowadays hold an impressive amount of information, and in the case of Web-based systems the size is ever growing. Among the difficulties faced by these systems is the problem of overwhelming the user with a vast amount of unrequired data, often referred to as information overload. The problem is exacerbated by ever-increasing time constraints and the extensive use of handheld devices. Use of context is a possible way out of this situation. To provide a more robust approach to context gathering, we propose the use of social Web technologies alongside the Semantic Web. As the social Web is the part of the Web most heavily used today, it can provide a better understanding of a user’s interests and intentions. The proposed system gathers information about users from their social Web identities, enriches it with ontological knowledge, and interlinks the mapped data with online Linked Open Data (LOD) resources such as DBpedia. The resulting interest model for the user serves as a good source of contextual knowledge. This work bridges the gap between the user and search by analysing the virtual existence of a user and making interesting recommendations accordingly. It opens a way for the vast amount of structured Cultural Heritage data to be exposed to the users of social networks, according to their tastes and likings.
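As a concrete illustration of the interlinking step, the sketch below resolves a hypothetical user interest against DBpedia with a SPARQL query and collects its dct:subject categories as contextual signals. The public endpoint and query pattern are standard, but the property choice, the example resource and the surrounding code are assumptions for demonstration rather than the system described in the thesis.

```python
# Illustrative sketch: enrich a social-profile interest with Linked Open Data from DBpedia.
# The example interest and the property choice are assumptions for demonstration only.
from SPARQLWrapper import SPARQLWrapper, JSON

def dbpedia_categories(resource_uri: str, limit: int = 10):
    """Fetch the dct:subject categories of a DBpedia resource as contextual signals."""
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setReturnFormat(JSON)
    sparql.setQuery(f"""
        SELECT ?category WHERE {{
            <{resource_uri}> <http://purl.org/dc/terms/subject> ?category .
        }} LIMIT {limit}
    """)
    results = sparql.query().convert()
    return [row["category"]["value"] for row in results["results"]["bindings"]]

# Example: a user whose profile mentions Stonehenge (a hypothetical interest).
for cat in dbpedia_categories("http://dbpedia.org/resource/Stonehenge"):
    print(cat)
```

Categories gathered this way could then be matched against the metadata of Cultural Heritage collections to rank candidate recommendations.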
195

Bridging the air gap : an information assurance perspective

Richardson, Christopher January 2012 (has links)
The military has five domains of operations: Land, Sea, Air, Space and now Cyber. This 5th Domain is a heterogeneous network (of networks) of Communication and Information Systems (CIS) which were designed and accredited to meet Netcentric capability requirements: to be robust, secure and functional to the organisation’s needs. Those needs have changed. In the globalised economy and across the Battlespace, organisations now need to share information. Keeping our secrets secret has been the watchword of Information Security and the accreditation process, whilst sharing them securely across coalition, geo-physically dispersed networks has become the cyber security dilemma. The diversity of Advanced Persistent Threats, the contagion of Cyber Power and the insecurity of coalition interoperability have generated a plethora of vulnerabilities in the Cyber Domain. Necessity (fiscal and time constraints) has created security gaps in deployed CIS architectures through their interconnections. This federated environment for superior decision making and shared situational awareness requires that bridging the (new capability) gaps be more than just improving the security (Confidentiality, Integrity and Availability) mechanisms at the technical system interfaces. The solution needs a new approach to creating and understanding a trusted, social-technical CIS environment and to how these (sensitive) information assets should be managed, stored and transmitted. Information Assurance (IA) offers a cohesive architecture for coalition system (of systems) interoperability: the identification of the strategies, skills and business processes required for effective information operations, management and exploitation. IA provides trusted, risk-managed social-technical (Enterprise) infrastructures which are safe, resilient, dependable and secure. This thesis redefines IA architecture and creates models that recognise the integrated, complex issues within technical-to-organisational interoperability and the assurance that the right information is delivered to the right people at the right time in a trustworthy environment. It also identifies the need for IA practitioners and a necessary IA education for all Cyber Warriors.
196

Modelling and design optimisation of a hollow cathode thruster

Frollani, Daniele January 2014 (has links)
The present trend in spacecraft is to have two separate thruster systems performing different tasks: a main electric propulsion system operating on xenon and a chemical system, usually bipropellant or cold gas. The development of a low-power electric propulsion system operating on xenon to replace the chemical thrusters on board spacecraft would be beneficial: it would bring significant advantages in terms of mass savings from the sharing of tanks, pipes and the flow control unit, together with improvements in specific impulse. In recent years experiments have demonstrated the possibility of using hollow cathodes as standalone thrusters, with indirect thrust measurements performed at the University of Southampton. Nevertheless, indirect thrust measurements carry large uncertainties in the real value of the thrust. For the first time, direct thrust measurements were carried out with two different thrust balances on two different hollow cathode thrusters, derived from the T5 and T6 hollow cathodes, with unique design modifications to the orifice and anode geometry. These measurements provide a unique insight into the real performance range of hollow cathode thrusters. Significant improvements in thrust, specific impulse and thrust efficiency have been achieved thanks to the optimised design of the T6 hollow cathode. The design of the thruster was modified using a one-dimensional theoretical model developed within this research. With the help of the theoretical model, the hollow cathode thruster design was optimised and a better understanding was achieved of the physical mechanisms which contribute to the generation of thrust, with the conclusion that electrothermal and electromagnetic phenomena are the main contributors. The main conclusions of the research and recommendations for related future work are also presented.
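The abstract does not give the equations of the one-dimensional model, but a rough decomposition along the lines it describes (a gas-dynamic, electrothermal term plus a self-field electromagnetic term, here the textbook Maecker expression) is sketched below. The formulas chosen and the operating-point numbers are illustrative assumptions, not the thesis model or its data.

```python
# Hedged illustration only: splitting hollow cathode thruster thrust into a
# gas-dynamic (electrothermal) contribution and an electromagnetic contribution.
# The Maecker formula and the sample numbers are textbook-style assumptions,
# not the one-dimensional model developed in the thesis.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def gasdynamic_thrust(mdot_kg_s: float, exhaust_velocity_m_s: float) -> float:
    """Thrust from heated propellant expelled at an effective exhaust velocity."""
    return mdot_kg_s * exhaust_velocity_m_s

def maecker_thrust(current_a: float, r_anode_m: float, r_cathode_m: float) -> float:
    """Classical self-field electromagnetic thrust estimate (Maecker expression)."""
    return (MU0 * current_a**2 / (4 * math.pi)) * (math.log(r_anode_m / r_cathode_m) + 0.75)

# Assumed operating point for illustration: 0.8 mg/s xenon, 2 km/s effective velocity, 10 A.
t_gd = gasdynamic_thrust(0.8e-6, 2000.0)
t_em = maecker_thrust(10.0, r_anode_m=5e-3, r_cathode_m=0.5e-3)
print(f"gas-dynamic ~ {t_gd*1e3:.2f} mN, electromagnetic ~ {t_em*1e3:.3f} mN")
```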
197

Polynomially searchable exponential neighbourhoods for sequencing problems in combinatorial optimisation

Congram, Richard K. January 2000 (has links)
In this thesis, we study neighbourhoods of exponential size that can be searched in polynomial time. Such neighbourhoods are used in local search algorithms for classes of combinatorial optimisation problems. We introduce a method, called dynasearch, of constructing new neighbourhoods, and of viewing some previously derived exponentially sized neighbourhoods which are searchable in polynomial time. We produce new neighbourhoods by combining simple well-known neighbourhood moves (such as swap, insert, and k-opt) so that the moves can be performed together as a single move. In dynasearch neighbourhoods, the moves are combined in such a way that the effect of the combined move on the objective function is equal to the sum of the effects of the individual moves from the underlying neighbourhood. Dynasearch neighbourhoods can be formed using dynamic programming from underlying moves that are nested within each other, disjoint from each other, or, in the case of the TSP, overlapping one another. Our dynasearch neighbourhoods made from underlying disjoint moves are successfully implemented within well-known local search methods to form competitive algorithms for the travelling salesman problem and state-of-the-art algorithms for the total weighted tardiness problem and the linear ordering problem. By viewing moves from some known travelling salesman problem neighbourhoods as a combination of underlying moves, each reversing a section of the tour, greater insight into the structure of the neighbourhoods may be obtained. This insight has enabled us both to calculate the size of a number of neighbourhoods and to demonstrate how some neighbourhoods are contained within others.
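As an illustration of the disjoint-move case, a minimal sketch of a dynasearch-style dynamic program is given below. The interface is assumed for demonstration: gain(i, j) stands for the change in the objective caused by swapping the items at positions i and j (including the effect on everything between them), and the code is not the thesis's implementation.

```python
# Minimal sketch of a dynasearch-style dynamic program over disjoint swap moves.
# Because disjoint moves contribute additively to the objective, the best combined
# move is found by a shortest-path-style recursion over positions.
from typing import Callable, List, Tuple

def best_disjoint_swaps(n: int, gain: Callable[[int, int], float]) -> Tuple[float, List[Tuple[int, int]]]:
    """Return the best total gain (negative = improvement) and the disjoint swaps achieving it."""
    best = [0.0] * (n + 1)          # best[i]: best gain using only positions 0..i-1
    choice = [None] * (n + 1)       # swap ending at position i-1, if any
    for i in range(1, n + 1):
        best[i], choice[i] = best[i - 1], None      # option 1: leave position i-1 untouched
        for j in range(i - 1):                      # option 2: close a swap (j, i-1)
            candidate = best[j] + gain(j, i - 1)
            if candidate < best[i]:
                best[i], choice[i] = candidate, (j, i - 1)
    # Recover the chosen disjoint swaps by walking back through the table.
    swaps, i = [], n
    while i > 0:
        if choice[i] is None:
            i -= 1
        else:
            swaps.append(choice[i])
            i = choice[i][0]
    return best[n], list(reversed(swaps))
```

In a full local search, the best combined move returned by this recursion would be applied repeatedly until no improving combination remains, typically inside a standard restart or perturbation loop.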
198

Changing the behaviour of healthcare professionals using theory based, computer-delivered interventions

McDermott, Lisa January 2013 (has links)
Non-adherence to clinical guidelines has been identified as a consistent finding in general practice. The purpose of this research was to develop and evaluate theory-informed, computer-delivered interventions to promote the implementation of guidelines in general practice which GPs viewed as feasible and acceptable. The interventions aimed to promote guideline adherence for antibiotic prescribing in respiratory tract infections, and adherence to recommendations for secondary stroke prevention. An intervention development study involved the creation of computer-delivered prompts using aspects of social cognitive theory and drawing on nationally recommended standards for clinical content. Prompts were presented to GPs during interviews and iteratively refined based on feedback. GPs reported being more likely to use prompts if they were perceived as offering support and choice, as opposed to being an enforcement method. The prompts were then entered into a trial (not reported here), and two process evaluation studies were conducted with GPs who had taken part in the trial. A qualitative evaluation study, involving interviews with GPs, revealed that the prompts were perceived as useful and acceptable in practice, but GPs who had not been informed of the prompts’ appearance reported being less likely to engage with them. A quantitative evaluation study involved a questionnaire consisting of theory-based measures and an intervention evaluation measure. GPs were satisfied with the usability of the prompts, and intervention-group GPs reported higher levels of self-efficacy in managing patients according to guidelines compared with control-group GPs. Overall the intervention was viewed as feasible and acceptable. A key characteristic of an acceptable computer-delivered intervention appears to be that it should be perceived as a useful tool supporting GP practice. However, the conclusions of the evaluation were limited by a small and potentially non-representative sample of trial GPs.
199

A computer model for investigating the biomechanical effects of radiation exposure on pathological and non-pathological living human cells

Johnston, G. J. January 2017 (has links)
This work investigates the cellular response to radiation insult; studies were carried out on aspects of the cytoskeleton and on the force response of the cell when probed with an atomic force microscope (AFM). It is confirmed for the first time that there is a statistically significant difference between the PNT2 and PC3 cell lines in their response to probing with the AFM tip, and time is eliminated as a possible influencing factor on the force response in the short term (1 hour). It is shown that the Hertz model is not sufficient for indentation depths greater than 500 nm because of the strain-hardening effect in biological cells, whose non-linear force response becomes marked beyond the 500 nm region. The orientation of actin was investigated and a bimodal variation was found to be statistically significant; although the larger tendency was for a 90-degree separation, there were indications consistent with earlier theoretical work by Pollard (2008). The importance of the contact point for the cell lines PNT2, DU145 and PC3 at indentations greater than 500 nm is shown; four different methods of locating it are tested and the most robust of these, the ‘line projection’ method created by the author, is chosen for the distances and cell lines involved. Finally, a method that normalises AFM force-curve data is presented; it minimises the contact-point error at the same time and therefore provides biologists with a way to compare cell lines using standard normal population tests.
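For context, the Hertz relation referred to above is the standard spherical-indenter form F = (4/3) · E/(1 − ν²) · √R · δ^(3/2). The sketch below simply evaluates it; the modulus, Poisson ratio and tip radius are illustrative assumptions, not values measured in the study.

```python
# Hedged illustration: the standard Hertz force-indentation relation for a spherical
# AFM tip, F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2). Parameter values are
# assumptions for demonstration, not measurements from the thesis.
import math

def hertz_force(delta_m: float, youngs_modulus_pa: float = 2e3,
                poisson_ratio: float = 0.5, tip_radius_m: float = 5e-6) -> float:
    """Force (N) predicted by the Hertz model at indentation depth delta_m (m)."""
    reduced = youngs_modulus_pa / (1.0 - poisson_ratio**2)
    return (4.0 / 3.0) * reduced * math.sqrt(tip_radius_m) * delta_m**1.5

# Beyond roughly 500 nm the measured force on a cell rises faster than this prediction
# (strain hardening), which is why the model is found insufficient in that regime.
for depth_nm in (100, 250, 500, 1000):
    f = hertz_force(depth_nm * 1e-9)
    print(f"{depth_nm:4d} nm -> {f*1e12:.1f} pN")
```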
200

Intelligent control system for CFD modelling software

Janes, Dominik Sebastian January 2003 (has links)
In this thesis we show that it is possible to create an intelligent agent capable of emulating the human ability to control CFD simulations and of providing similar benefits in terms of performance, overall reliability and result accuracy. We initially consider the rule-based approach proposed by other researchers, and argue that heuristic search is better suited to modelling the techniques used by human experts. The residual graphs are identified as the most important source of heuristic information relevant to the control decisions. Three graph features are found to be the most important, and dedicated algorithms are developed for their extraction. A heuristic evaluation function employing the new extraction algorithms is proposed and implemented in the first version of the heuristic control system (ICS 1.0). Analysis of the test results gives rise to the next version of the system (ICS 2.0), which employs an additional expert system responsible for dynamic pruning of the search space using rules obtained by statistical analysis of the initial results. Other features include dedicated goal-driven search plans that help reduce the search space even further. The simulation results and overall improvements are compared with non-controlled runs, and we present a detailed analysis of a fire case solved with different control techniques. The effect of the automatic control on the accuracy of the results is explained and discussed. Finally, we provide some directions for further research that promise even greater performance gains.
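The three residual-graph features themselves are not listed in the abstract, so the sketch below only illustrates the general idea of turning a residual history into numbers that a heuristic evaluation function can score. The chosen features (log-slope, oscillation rate, divergence flag) and the weights are assumptions for illustration, not those extracted in ICS 1.0 or 2.0.

```python
# Illustrative sketch only: extracting simple features from a CFD residual history
# and combining them into a heuristic score. The feature set and weights are
# assumptions, not the features or evaluation function developed in the thesis.
import math
from typing import Sequence

def residual_features(residuals: Sequence[float]) -> dict:
    logs = [math.log10(max(r, 1e-300)) for r in residuals]
    n = len(logs)
    # Least-squares slope of log10(residual) vs iteration: negative means converging.
    xbar, ybar = (n - 1) / 2.0, sum(logs) / n
    slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(logs)) / sum((i - xbar) ** 2 for i in range(n))
    # Oscillation: fraction of steps where the residual trend changes direction.
    diffs = [logs[i + 1] - logs[i] for i in range(n - 1)]
    flips = sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)
    oscillation = flips / max(len(diffs) - 1, 1)
    diverging = logs[-1] > logs[0]
    return {"slope": slope, "oscillation": oscillation, "diverging": diverging}

def heuristic_score(residuals: Sequence[float]) -> float:
    """Lower is better; a controller would prefer solver settings with lower scores."""
    f = residual_features(residuals)
    return f["slope"] + 0.5 * f["oscillation"] + (10.0 if f["diverging"] else 0.0)

print(heuristic_score([1.0, 0.5, 0.26, 0.14, 0.07, 0.035]))
```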
