  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

Design of a wireless intelligent fuzzy controller network

Saeed, Bahghtar Ibraheem January 2014 (has links)
Since the first application of fuzzy logic in the field of control engineering, fuzzy logic control has been successfully employed in controlling a wide variety of applications, such as commercial appliances, industrial automation, robots, traffic control, cement kilns and automotive engineering. Human knowledge of controlling complex and non-linear processes can be incorporated into a controller in the form of linguistic expressions. Despite these achievements, however, there is still a lack of an empirical or analytical design study that adequately addresses a systematic auto-tuning method. Indeed, tuning is one of the most crucial parts of the overall design of fuzzy logic controllers, and it has become an active research field. Various techniques, from trial and error to very advanced optimisation methods, have been utilised to develop algorithms that fine-tune the controller parameters. The structure of fuzzy logic controllers is not as straightforward as that of PID controllers. In addition, there is a set of parameters that can be adjusted, and it is not always easy to find the relationship between these parameters and the controller performance measures. Moreover, controllers generally operate over a wide range of setpoints, and changing from one value to another requires the controller parameters to be re-tuned in order to maintain satisfactory performance over the entire range. This thesis deals with the design and implementation of a new intelligent algorithm for fuzzy logic controllers in a wireless network structure. The algorithm enables the controllers to learn about their plants and systematically tune their gains. It also provides the capability of retaining the knowledge acquired during tuning. Furthermore, this knowledge is shared with other controllers on the network through a wireless communication link.
Based on the relationships between controller gains and the closed-loop characteristics, an auto-tuning algorithm is developed. Simulation experiments using standard second order systems demonstrate the effectiveness of the algorithm with respect to auto-tuning, tracking setpoints and rejecting external disturbances. Furthermore, a zero overshoot response is produced with improvements in the transient and the steady state responses. The wireless network structure is implemented using LabVIEW by composing a network of several fuzzy controllers. The results demonstrate that the controllers are able to retain and share the knowledge.
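The idea of tuning gains from measured closed-loop characteristics can be sketched in Python. This is a minimal illustration only: a PD controller on a simple second-order plant stands in for the thesis's fuzzy controller, and the plant parameters, initial gains and update factor are invented for the example, not taken from the thesis.

```python
import numpy as np

def step_response(kp, kd, a=0.5, T=30.0, dt=0.01):
    """Unit-step response of the plant y'' + a*y' = u under PD control
    u = kp*e - kd*y', integrated with semi-implicit Euler."""
    n = int(T / dt)
    y, yd = 0.0, 0.0
    ys = np.empty(n)
    for i in range(n):
        e = 1.0 - y
        ydd = kp * e - (a + kd) * yd   # closed-loop acceleration
        yd += dt * ydd
        y += dt * yd
        ys[i] = y
    return ys

def auto_tune(kp=4.0, kd=0.2, iters=30):
    """Raise derivative action until the measured overshoot is negligible,
    mimicking tuning driven by observed closed-loop characteristics."""
    overshoot = 1.0
    for _ in range(iters):
        ys = step_response(kp, kd)
        overshoot = max(ys.max() - 1.0, 0.0)
        if overshoot <= 0.01:          # zero-overshoot target reached
            break
        kd *= 1.3                      # more damping to suppress overshoot
    return kd, overshoot

kd_tuned, os_final = auto_tune()
```

A fuzzy controller would replace the fixed update factor with rule-based adjustments, but the loop structure, simulate, measure characteristics, adjust gains, retain the result, is the same.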
132

The use of advanced soft computing for machinery condition monitoring

Ahmed, Mahmud January 2014 (has links)
The demand for cost-effective, reliable and safe machinery operation requires accurate fault detection and classification. These issues are of paramount importance, as potential failures of rotating and reciprocating machinery can then be managed properly and in some cases avoided. Various methods have been applied to tackle these issues, but their accuracy is variable and leaves scope for improvement. This research proposes appropriate methods for fault detection and diagnosis. The main aim of this study is to use Artificial Intelligence (AI) and related mathematical approaches to build a condition monitoring (CM) system with incremental learning capabilities that selects effective diagnostic features for the fault diagnosis of a reciprocating compressor (RC). The investigation involved a series of experiments conducted on a two-stage RC, first at baseline condition and then with faults introduced into the intercooler, the drive belt, and the second-stage discharge and suction valves respectively. In addition, three combined faults were created and simulated to test the model: discharge valve leakage combined with intercooler leakage, suction valve leakage combined with intercooler leakage, and discharge valve leakage combined with suction valve leakage. The vibration data collected from the experimental RC passed through pre-processing, feature extraction and feature selection stages before the diagnosis and classification models were built. A large number of potential features were calculated from the time domain, the frequency domain and the envelope spectrum. Applying Neural Networks (NNs), Support Vector Machines (SVMs) and Relevance Vector Machines (RVMs) integrated with Genetic Algorithms (GAs), as well as principal component analysis (PCA) combined with principal component optimisation, to these features showed that the features from envelope analysis have the most potential for differentiating various common faults in RCs.
The practical results for fault detection, diagnosis and classification show that the proposed methods perform very well and accurately and can be used as effective tools for diagnosing reciprocating machinery failures.
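Envelope analysis, which the study found most discriminative, can be sketched as follows: build the analytic signal with an FFT-based Hilbert transform, then inspect the spectrum of the envelope for the modulation (fault) frequency. The signal below is a synthetic amplitude-modulated vibration, not compressor data, and the carrier and fault frequencies are illustrative assumptions.

```python
import numpy as np

def envelope(x):
    """Signal envelope as the magnitude of the analytic signal,
    computed with an FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

# Synthetic vibration: a 200 Hz carrier amplitude-modulated at a 20 Hz
# "fault" frequency, as a periodic impact train might produce.
fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
x = (1.0 + 0.5 * np.cos(2 * np.pi * 20 * t)) * np.sin(2 * np.pi * 200 * t)

env = envelope(x)
spec = np.abs(np.fft.rfft(env - env.mean()))   # envelope spectrum
freqs = np.fft.rfftfreq(len(env), 1 / fs)
fault_freq = freqs[np.argmax(spec)]            # dominant modulation frequency
```

The peak of the envelope spectrum recovers the 20 Hz modulation even though the raw spectrum is dominated by the 200 Hz carrier, which is why envelope features separate impact-type faults well.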
133

Software quality and governance in agile software development

Abbas, Noura January 2009 (has links)
Looking at software engineering from a historical perspective, we can see how software development methodologies have evolved over the past 50 years. Using the right software development methodology with the right settings has always been a challenge. Therefore, there has always been a need for empirical evidence about what worked well and what did not, and about which factors affect the different variables of the development process. Probably the most noticeable change to software development methodology in the last 15 years has been the introduction of the word “agile”. As any area matures, there is a need to understand its components and relations, as well as a need for empirical evidence about how well agile methods work in real-life settings. In this thesis, we empirically investigate the impact of agile methods on different aspects of quality, including product quality, process quality and stakeholders’ satisfaction, as well as the different factors that affect these aspects. Quantitative and qualitative research methods were used, including semi-structured interviews and surveys. Quality was studied in two projects that used agile software development. The empirical study showed that both projects were successful, with multiple releases, improved product quality and stakeholders’ satisfaction. The data analysis produced a list of 13 refined grounded hypotheses, of which 5 were supported throughout the research. One project was studied in depth by collecting quantitative data about the process used via a newly designed iteration monitor. The iteration monitor was used by the team over three iterations and helped identify issues and trends within the team in order to improve the process in the following iterations. Data about other organisations, collected via surveys, was used to generalise the obtained results.
A variety of statistical analysis techniques were applied; these suggested that when agile methods have a good impact on quality, they also have a good impact on productivity and satisfaction, and that when they have a good impact on these aspects, they also reduce cost. More importantly, the analysis clustered 58 agile practices into 15 factors, including incremental and iterative development, agile quality assurance, and communication. These factors can be used as a guide for agile process improvement. The previous results raised questions about agile project governance, and to answer them an agile projects governance survey was conducted. This survey collected 129 responses, and its statistically significant results suggested that retrospectives are more effective when applied properly, having more impact when the whole team participated and comments were recorded; that organisation size has a negative relationship with success; and that good practices go together, in that when a team does one aspect well, it tends to do all aspects well. Finally, the research results supported the hypotheses that agile software development can produce good quality software, achieve stakeholders’ satisfaction, motivate teams, and assure a quick and effective response to stakeholders’ requests, and that it proceeds in stages, matures, and improves over time.
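Clustering many observed practices into a few underlying factors is typically done with factor-analytic techniques. The sketch below uses a plain eigen-decomposition of a correlation matrix with the Kaiser (eigenvalue greater than 1) criterion on synthetic survey data; the ratings, practice count and factor structure are invented for illustration and are unrelated to the 58 practices studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey: 40 respondents rate 6 practices on a continuous
# scale. Two latent factors drive the ratings: practices 0-2 load on
# factor 1, practices 3-5 on factor 2, plus small noise.
f1 = rng.normal(size=(40, 1))
f2 = rng.normal(size=(40, 1))
ratings = np.hstack(
    [f1 + 0.1 * rng.normal(size=(40, 1)) for _ in range(3)]
    + [f2 + 0.1 * rng.normal(size=(40, 1)) for _ in range(3)]
)

# Practice-by-practice correlation matrix and its eigen-decomposition.
R = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)

# Kaiser criterion: retain components with eigenvalue > 1 as factors.
n_factors = int((eigvals > 1.0).sum())
```

On this synthetic data the criterion recovers the two planted factors; on real survey responses the retained components would then be interpreted and named, as the thesis does with its 15 factors.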
134

End-user data-centric interactions over linked data

Popov, Igor January 2013 (has links)
The ability to build tools that support gathering and querying information from distributed sources on the Web rests on the availability of structured data. Linked Data, as a way of publishing and linking distributed structured data sources on the Web, provides an opportunity to create such tools. Currently, however, the ability to complete such tasks over Linked Data sources is limited to users with advanced technical skills, resulting in an online information space largely inaccessible to non-technical end users. This thesis explores the challenges of designing user interfaces that enable end users, those without technical skills, to use Linked Data to solve information tasks that require combining information from multiple sources. The thesis explores the design space of interfaces that support access to Linked Data on demand, suggests potential use cases and stakeholders, and proposes several direct-manipulation tools for end users with diverse needs and skills. User studies indicate that the tools built offer solutions to the various challenges in accessing Linked Data identified in this thesis.
135

Performance visualization of parallel programs

D'Paola, Oscar Naim January 1995 (has links)
No description available.
136

Implementation and validation of model-based multi-threaded Java applications and Web services

Xue, Pengfei January 2008 (has links)
In the software engineering world, many modelling notations and languages have been developed to aid application development. Java and Web services play an increasingly important role in web applications. However, because of issues of complexity, it is difficult to build multi-threaded Java applications and Web service applications, and even more difficult to model them. Furthermore, it is difficult to reconcile the directly-coded application with the model-based application. Based on the formal modelling system RDT, the new work here covers: (i) a translator, RDTtoJava, used to automatically convert an RDT model into an executable multi-threaded Java application; (ii) a framework for developing an RDT model into a Java synchronous distributed application supported by JAX-RPC Web services; and (iii) a framework for developing an RDT model into a Java asynchronous distributed application supported by JMS Web services. Experience was gained by building distributed computing models and client/server models and by generating applications based on such models. This work will be helpful to software developers and researchers working on formal software development.
137

The automated translation of integrated formal specifications into concurrent programs

Yang, Letu January 2008 (has links)
The PROB model checker [LB03] provides tool support for an integrated formal specification approach, which combines the state-based B specification language [Abr96] with the event-based process algebra CSP [Hoa78]. The JCSP package [WM00b] presents a concurrent Java implementation of CSP/occam. In this thesis, we present a development strategy for implementing such a combined specification as a concurrent Java program. The combined semantics in PROB is flexible and ideal for model checking, but is too abstract to be implemented directly in programming languages. Also, although the JCSP package gave us significant inspiration for implementing formal specifications in Java, we argue that it is not suitable for directly implementing the combined semantics in PROB. Therefore, we began by defining a restricted semantics derived from the original one in PROB. We then developed a new Java package, JCSProB, for implementing the restricted semantics in Java. The JCSProB package implements multi-way synchronization with choice for combined B and CSP events, as well as a new multi-threading mechanism at the process level. A GUI sub-package is also designed for constructing GUI programs with JCSProB, allowing user interaction and runtime assertion checking. A set of translation rules relates the integrated formal models to Java and JCSProB, and we also implement these rules in an automated translation tool for automatically generating Java programs from such models. To demonstrate and exercise the tool, several B/CSP models, varying both in syntactic structure and in behavioural properties, are translated by the tool. The models manifest the presence and absence of various safety, deadlock, and fairness properties; the generated Java code is shown to reproduce them faithfully. Run-time safety and fairness assertion checking is also demonstrated. We also experimented with composition and decomposition on several combined models, as well as on the Java programs generated from them.
Composition techniques can help the user to develop large distributed systems, and can significantly improve the scalability of the development of the combined models of PROB.
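The multi-way synchronization described above, where every process registered on an event must engage before any of them proceeds, can be illustrated with a barrier. This Python sketch mirrors only the idea; it does not reproduce the JCSProB API, and the resolution of external choice between competing events is omitted.

```python
import threading

class MultiWayEvent:
    """A CSP-style event: all registered parties must engage before
    any of them may continue (multi-way synchronization)."""
    def __init__(self, name, parties):
        self.name = name
        self._barrier = threading.Barrier(parties)

    def engage(self, pid, log, lock):
        self._barrier.wait()              # block until every party arrives
        with lock:
            log.append((self.name, pid))  # record the joint engagement

ev = MultiWayEvent("transfer", 3)
log, lock = [], threading.Lock()
workers = [threading.Thread(target=ev.engage, args=(i, log, lock))
           for i in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
# every party has engaged in the shared event exactly once
```

No thread passes the `wait()` call until all three have reached it, which is the behaviour a combined B/CSP event with three participants requires.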
138

A competency model for semi-automatic question generation in adaptive assessment

Sitthisak, Onjira January 2009 (has links)
The concept of competency is increasingly important, since it conceptualises intended learning outcomes within the process of acquiring and updating knowledge. A competency model is critical to successfully managing assessment and to achieving the goals of resource sharing, collaboration, and automation to support learning. Existing e-learning competency standards, such as the IMS Reusable Definition of Competency or Educational Objective (IMS RDCEO) specification and the HR-XML standard, are not able to accommodate complicated competencies, link competencies adequately, support comparisons of competency data between different communities, or support tracking of the knowledge state of the learner. Recently, the main goal of assessment has shifted away from content-based evaluation towards evaluation based on intended learning outcomes; as a result, the focus of assessment has moved to identifying learned capability rather than learned content. This change is associated with changes in the method of assessment. This thesis presents a system that demonstrates adaptive assessment and automatic generation of questions from a competency model, based on a sound pedagogical and technological approach. The system’s design and implementation involve an ontological database that represents the intended learning outcome to be assessed across a number of dimensions, including level of cognitive ability and subject-matter content. The system generates a list of the questions and tests that are possible from a given learning outcome, which may then be used to test for understanding, and so could determine the degree to which learners actually acquire the desired knowledge. Experiments were carried out to demonstrate and evaluate the generation of assessments and the sequencing of generated assessments from a competency data model, and to compare a variety of adaptive sequences. For each experiment, the methods and experimental results are described.
The way in which the system has been designed and evaluated is discussed, along with its educational benefits.
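Generating candidate questions by crossing a cognitive-ability dimension with subject-matter content can be sketched very simply. The Bloom-style levels, templates and concepts below are invented illustrations, not the ontology or question forms used in the thesis.

```python
from itertools import product

# Hypothetical competency model: cognitive levels paired with question
# templates, and subject-matter concepts to be assessed.
levels = {
    "remember":   "Define {}.",
    "understand": "Explain how {} works.",
    "apply":      "Use {} to solve the following problem.",
}
concepts = ["normalisation", "transactions", "indexing"]

def generate_questions(levels, concepts):
    """Enumerate one candidate question per (level, concept) pair."""
    return [(lvl, template.format(c))
            for (lvl, template), c in product(levels.items(), concepts)]

questions = generate_questions(levels, concepts)
# 3 levels x 3 concepts -> 9 candidate questions
```

An adaptive assessment would then sequence these candidates, for example by moving up the cognitive levels for a concept only after the lower-level questions are answered correctly.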
139

Social activism and diffusion in the Free Software movement in Chile

Báez Bezama, Eric Rolando 03 1900 (has links)
This research is the result of the author's professional experience in communication media and technology centres, as well as of his active involvement and participation in the national and international communities that promote the free creation and diffusion of knowledge, in particular Free Software.
140

Defect correction based domain decomposition methods for some nonlinear problems

Siahaan, Antony January 2011 (has links)
Defect correction schemes, as a class of nonoverlapping domain decomposition methods, offer several advantages in the way they split a complex problem into several subdomain problems of lower complexity. The schemes need a nonlinear solver to take care of the residual at the interface. The adaptive-α solver can converge locally in the ∞-norm, where the sufficient condition requires a relatively small local neighbourhood and the problem must have a strongly diagonally dominant Jacobian matrix with a very small condition number. Yet its advantage can be highly significant in computational cost, since it needs only a scalar as the approximation of the Jacobian matrix. Other nonlinear solvers employed for the schemes are a Newton-GMRES method, a Newton method with a finite-difference Jacobian approximation, and nonlinear conjugate gradient solvers with the Fletcher-Reeves and Polak-Ribière search direction formulas. The schemes are applied to three nonlinear problems. The first problem is heat conduction in a multichip module, where the domain is assembled from many components of different conductivities and physical sizes. Here the implementations of the schemes satisfy the component meshing and gluing concept. A finite-difference approximation of the residual of the governing equation turns out to be a better defect equation than the equality of normal derivatives. Of all the nonlinear solvers implemented in the defect correction scheme, the nonlinear conjugate gradient method with the Fletcher-Reeves search direction has the best performance. The second problem is a 2D single-phase fluid flow with heat transfer, where the PHOENICS CFD code is used to run the subdomain computations. The Newton method with a finite-difference Jacobian is a reasonable interface solver for coupling these subdomain computations. The final problem is multiphase heat and moisture transfer in a porous textile. The PHOENICS code is also used to solve the system of partial differential equations governing the multiphase process in each subdomain, while the coupling of the subdomain solutions is handled by the defect correction schemes through some FORTRAN code. A scheme using a modified-α method fails to obtain decent solutions in both the single-layer and two-layer cases. On the other hand, the scheme using the above Newton method produces satisfying results for both cases, where it can lead initially distant interface data to a good convergent solution. However, it is found that, in general, the number of nonlinear iterations of the defect correction schemes increases with mesh refinement.
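The interface iteration at the heart of such schemes can be sketched on a toy problem: a 1D Poisson equation split into two nonoverlapping subdomains, with a scalar secant iteration (in the spirit of a scalar Jacobian approximation) driving the jump in one-sided derivatives at the interface to zero. The linear model problem and the discretisation below are illustrative only; the thesis treats nonlinear problems with PHOENICS-driven subdomain solves.

```python
import numpy as np

def subdomain_solve(a, b, ua, ub, f, n=50):
    """Finite-difference solve of u'' = f on (a, b) with Dirichlet data."""
    x = np.linspace(a, b, n + 1)
    h = x[1] - x[0]
    A = (np.diag(-2.0 * np.ones(n - 1))
         + np.diag(np.ones(n - 2), 1)
         + np.diag(np.ones(n - 2), -1))
    rhs = f(x[1:-1]) * h**2
    rhs[0] -= ua                      # fold boundary values into the RHS
    rhs[-1] -= ub
    u = np.empty(n + 1)
    u[0], u[-1] = ua, ub
    u[1:-1] = np.linalg.solve(A, rhs)
    return x, u

def interface_defect(g, f):
    """Jump in one-sided derivatives at the interface x = 0.5 when both
    subdomains are solved with interface value g (second-order stencils)."""
    xl, ul = subdomain_solve(0.0, 0.5, 0.0, g, f)
    xr, ur = subdomain_solve(0.5, 1.0, g, 0.0, f)
    hl, hr = xl[1] - xl[0], xr[1] - xr[0]
    dl = (3 * ul[-1] - 4 * ul[-2] + ul[-3]) / (2 * hl)   # left-side u'(0.5)
    dr = (-3 * ur[0] + 4 * ur[1] - ur[2]) / (2 * hr)     # right-side u'(0.5)
    return dl - dr

f = lambda x: -np.pi**2 * np.sin(np.pi * x)   # exact solution u = sin(pi*x)

# Secant iteration on the scalar interface value g.
g0, g1 = 0.0, 0.5
d0, d1 = interface_defect(g0, f), interface_defect(g1, f)
for _ in range(20):
    if abs(d1) < 1e-8:
        break
    g0, g1 = g1, g1 - d1 * (g1 - g0) / (d1 - d0)
    d0, d1 = d1, interface_defect(g1, f)
# g1 now approximates u(0.5) = sin(pi/2) = 1 up to discretisation error
```

Because the model problem is linear, the defect is affine in g and the secant step lands on the root immediately; in the nonlinear setting the same loop becomes a genuine iteration, and a Newton or adaptive-α variant replaces the secant update.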
