31

A hybrid intelligent technique for induction motor condition monitoring

Wen, Xin January 2011 (has links)
The objective of this research is to advance the field of condition monitoring and fault diagnosis for induction motors. This involves processing the signals produced by induction motors, classifying the types of faults and estimating their severity. A typical process of condition monitoring and fault diagnosis for induction motors consists of four steps: data acquisition, signal analysis, fault detection and post-processing. A description of the various kinds of faults that can occur in induction motors is presented. The features reflecting faults are usually embedded in transient motor signals, so signal analysis is a crucial step in the diagnosis process: it extracts the features related to specific fault modes. The signal analysis methods available for feature extraction from motor signals are discussed. Wavelet packet decomposition yields a simultaneous time-frequency representation of a signal, which is inherently suited to the transient events in motor fault signals; a wavelet packet transform (WPT) based analysis method is therefore proposed to extract features from motor signals. Fault detection then has to establish a relationship between the motor symptoms and its condition. Classifying the motor condition and estimating the severity of faults from motor signals have never been easy tasks, and they are affected by many factors. AI techniques, such as expert systems (ES), fuzzy logic systems (FLS), artificial neural networks (ANN) and support vector machines (SVM), have been applied to fault diagnosis of very complex systems where accurate mathematical models are difficult to build. These techniques use association, reasoning and decision-making processes, as the human brain would, in solving diagnostic problems. An ANN is a computation and information processing method that mimics the processing found in biological neurons. However, when ANN-based methods are used for fault diagnosis, local minima caused by the traditional training algorithms often result in large approximation errors that can undermine their reliability. In this research, a novel method of condition monitoring and fault diagnosis for induction motors is proposed using hybrid intelligent techniques based on the WPT. The ANN is trained by an improved genetic algorithm (IGA). The WPT is used to decompose motor signals and extract the feature parameters, and the extracted features at different frequency resolutions are used as the input of the ANN for fault diagnosis. Finally, the proposed method is tested on 1.5 kW and 3.7 kW induction motor rigs. The experimental results demonstrate that the proposed method improves the sensitivity and accuracy of ANN-based condition monitoring and fault diagnosis for induction motors.
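As a concrete illustration of the feature-extraction step, the sketch below computes normalised sub-band energies from a wavelet packet decomposition, assuming the PyWavelets library; the wavelet ('db4'), decomposition depth and synthetic stator-current signal are illustrative choices, not the settings used in the thesis.

```python
import numpy as np
import pywt

def wpt_features(signal, wavelet="db4", level=3):
    """Energy of each terminal wavelet-packet node, as a fixed-size feature vector."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")        # sub-bands ordered by frequency
    energies = np.array([np.sum(node.data ** 2) for node in nodes])
    return energies / energies.sum()                 # normalise so features sum to 1

# Example: a synthetic stator-current-like signal with a small fault-frequency component
fs = 5000                                            # sampling rate (Hz), illustrative
t = np.arange(0, 1, 1 / fs)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 275 * t)
features = wpt_features(current)                     # 2**3 = 8 sub-band energies -> ANN input
print(features)
```

Each element of the vector summarises the signal's energy in one frequency band at a given resolution, which is the kind of multi-resolution feature the abstract describes feeding to the ANN.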
32

Statistical-based optimization of process parameters of fused deposition modelling for improved quality

Alhubail, Mohammad A. M. J. January 2012 (has links)
Fused Deposition Modelling (FDM) is a rapid prototyping system that produces physical models directly from computer-aided design (CAD) drawings. These models can be used to evaluate the assembly and functionality of a design, as well as to produce manufacturing tools and end-use parts. Parts are built with production-grade thermoplastics that match traditionally machined parts under real-world conditions. Combined with reverse engineering techniques such as engineering scanning or digitising systems, FDM can quickly produce functional parts, used mainly in medical and automotive applications. Knowledge of the quality characteristics of FDM-fabricated parts is crucial, and quality depends significantly on the process parameters. Optimising the process parameters of FDM can make the system more precise and repeatable, and such advancement can lead to the use of FDM in rapid manufacturing applications rather than only producing prototypes. Part building is influenced by varying processing conditions; FDM process parameters therefore need to be optimised collectively rather than individually. To address this issue, this study presents results of experimental work on the effect of the main FDM process parameters of layer thickness (A), air gap (B), raster width (C), contour width (D) and raster orientation (E) on the quality characteristics of surface roughness (Ra), dimensional accuracy (DA) and tensile strength (TS). Previous studies have investigated these quality characteristics, but limited knowledge is available on newly improved FDM materials; the new ABS-M30i biomedical material was therefore used to build the parts in this experimental work. A full factorial experiment was used to obtain the test runs, and analytical methods including regression analysis, analysis of variance (ANOVA) and Pareto analysis were used to determine the influence of the FDM process parameter settings. Results show that these process parameters have a significant effect on the quality of the finished products. For example, the surface roughness and tensile strength of processed parts are greatly influenced by the air gap parameter, as it affects the part's bead structure: overlapping the material beads strengthens the bonding between them and reduces the voids between the beads. Scanning electron microscopy (SEM) was undertaken to characterise the experimental results. The results will be important for FDM-produced parts in different functional applications as rapid manufacturing becomes increasingly accepted.
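A hedged sketch of the analysis style described above: a two-level full factorial in the five process parameters with an ANOVA on one response. The factor labels follow the abstract, but the response values are simulated placeholders (with the air gap effect made dominant, as the abstract reports), not thesis data.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

factors = ["A", "B", "C", "D", "E"]   # layer thickness, air gap, raster width,
                                      # contour width, raster orientation
runs = pd.DataFrame(list(itertools.product([-1, 1], repeat=5)), columns=factors)

# Placeholder response: surface roughness dominated by air gap (B), plus noise.
rng = np.random.default_rng(0)
runs["Ra"] = 10.0 - 3.0 * runs["B"] - 1.0 * runs["A"] + rng.normal(0, 0.5, len(runs))

model = ols("Ra ~ A + B + C + D + E", data=runs).fit()
anova = sm.stats.anova_lm(model, typ=2)            # significance of each main effect
print(anova.sort_values("F", ascending=False))     # Pareto-style ranking by F value
```

Sorting the ANOVA table by F value gives the same kind of Pareto ranking of parameter influence that the study uses to identify air gap as the dominant factor.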
33

In vitro studies of bone-cement interface and related work on cemented acetabular replacement

Tozzi, Gianluca January 2012 (has links)
The lasting integrity of the bond between bone cement and bone defines the long-term stability of cemented acetabular replacements. Although several studies have been carried out on the bone-cement interface at the continuum level, the micromechanics of the interface have been studied only recently, for tensile and shear loading cases. Furthermore, the mechanical and microstructural behaviour of this interface is complex, owing to the variation in morphology and properties that can arise from a range of factors. In this work, in vitro studies of the bone-cement interfacial behaviour under selected loading conditions were carried out using a range of experimental techniques. Damage development in cemented acetabular reconstructs was studied under a combined physiological loading block representative of routine activities in a saline environment. A custom-made environmental chamber was developed to allow testing of acetabular reconstructs in a wet condition for the first time, and damage was monitored by scanning at selected loading intervals using micro-focus computed tomography (μCT). Preliminary results showed that, as in dry cases, debonding at the bone-cement interface defined the failure of the cement fixation. However, the combination of mechanical loading and a saline environment appears to affect the damage initiation site, drastically reducing the survival lives of the reconstructs. The interfacial behaviour of the bone-cement interface was studied under tensile, shear and mixed-mode loading conditions. Bone-cement coupons were first mechanically tested and then μCT imaged. The influence of the loading angle, the extent of cement penetration and the failure mechanisms with regard to the loading mode were examined. Both the mechanical testing and the post-failure morphologies suggest an effect of the loading angle on the failure mechanism of the interface. The micromechanical performance of the bone-cement interface under compression was also examined. Samples were tested in step-wise compression using a custom-made micromechanical loading stage within the μCT chamber, and the damage evolution with load was monitored. Results showed that load transfer at the bone-cement interface occurred mainly in the bone-cement contact region, resulting in progressively developed deformation due to trabeculae bending and buckling. The compressive and fatigue behaviour of bovine cancellous bone and selected open-cell metallic foams were also studied, and their suitability as bone-analogue materials for cemented biomechanical testing was investigated. Whilst the morphological parameters of the foams and the bone are comparable, their mechanical properties vary significantly. Despite these differences, the general deformation behaviour is similar across the bone and the foams. Multi-step fatigue tests were carried out to study the deformation behaviour under increasing compressive cyclic stresses. Optical and scanning electron microscopy (SEM) were used to characterise the microstructure of the foams and bone before and after mechanical testing. The results showed that residual strain accumulation is the predominant driving force leading to failure of the foams and the bone. Although both fail by the same mechanism of cyclic creep, the deformation behaviour in the transient region of each step differed between the two materials.
Preliminary results of foam-cement interface performance under mixed-mode loading conditions are also presented.
34

Developing a compositional ontology alignment framework for unifying business and engineering domains

Azzam, Said Rabah January 2012 (has links)
In the context of the Semantic Web, ontologies refer to the consensual and formal description of shared concepts in a domain. Ontologies are a way to aid communication between humans and machines, and also between machines in agent communication. The importance of ontologies for providing a shared understanding of common domains, and as a means of data exchange at the syntactic and semantic level, has increased considerably in recent years. Ontology management has therefore become a significant task in making distributed and heterogeneous knowledge bases available to end users. Ontology alignment is the process whereby ontologies from different domains are matched and can be processed further together, hence sharing a common understanding of the structure of information among different people. This research starts from a comprehensive review of the current development of ontologies, the concepts of ontology alignment and the relevant approaches. The first motivation of this work is to summarise the common features of ontology alignment and identify underdeveloped areas of ontology alignment. It then examines how complex businesses can be designed and managed through semantic modelling, which helps define data entities and the relationships between them, providing the ability to abstract different kinds of data and an understanding of how the data elements relate. The main contribution of this work is a framework for handling an important category of ontology alignment based on the logical composition of classes, especially the case where a class from one domain becomes a logical prerequisite (assumption) of a class from a different domain (commitment), which holds only if the class from the first domain is valid. Under this logic, previously unalignable or misaligned classes can be aligned in a significantly improved manner. The well-known rely/guarantee method has been adopted to express such relationships between newly alignable classes clearly. The proposed methodology has been implemented and evaluated on a realistic case study.
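A minimal sketch of the rely/guarantee pairing described above, under invented assumptions: the class names (ApprovedOrder, BuildJob) and the toy validity check are illustrative stand-ins, not the thesis's case study or implementation.

```python
from dataclasses import dataclass

@dataclass
class OntClass:
    name: str
    domain: str

@dataclass
class RelyGuarantee:
    rely: OntClass       # assumption: a class from the first domain
    guarantee: OntClass  # commitment: the class it enables in the second domain

    def aligned(self, valid: set) -> bool:
        # The commitment holds only while the rely-class is valid in its domain.
        return self.rely.name in valid

# A business-domain class acts as the logical prerequisite of an
# engineering-domain class; they become alignable once the rely is valid.
order = OntClass("ApprovedOrder", "business")
build = OntClass("BuildJob", "engineering")
link = RelyGuarantee(rely=order, guarantee=build)
print(link.aligned({"ApprovedOrder"}))   # True -> the classes can now be aligned
```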
35

Analysis of human electrocardiogram for arrhythmia auto-classification and biometric recognition systems using analytic and autoregressive modeling parameters

Alhamdi, Mustafa January 2015 (has links)
The electrocardiogram is a skin-surface measurement of the electrical activity of the heart over time. This activity is detected by electrodes attached to the surface of the skin and recorded or displayed by an external medical device. Doctors use electrocardiograms to detect and diagnose conditions such as arrhythmias (abnormal heart rhythms) and myocardial infarctions (heart attacks). The work described in this thesis investigates systems designed for two primary applications: an electrocardiogram classification system based on autoregressive models, which distinguishes normal (healthy) from abnormal (unhealthy) electrocardiogram signals, and an electrocardiogram biometric system based on analytic and modelling features, which identifies each person individually from his or her electrocardiogram. In recent years, a number of signal processing techniques have been used to design electrocardiogram auto-classification and biometric identification systems. The classification and biometric systems implemented in this thesis are compared with a number of other recently described techniques and methods for identifying electrocardiogram signals. The aim of any electrocardiogram classification or biometric system described in this work is to achieve a high accuracy rate when identifying electrocardiograms. Such systems consist of four major stages: pre-processing of the electrocardiogram signal, QRS complex detection, feature extraction and classification. Each of these steps is discussed and explained in a separate chapter, with a variety of techniques and methods employed to achieve each step. The developed systems based on autoregressive models achieved accurate classification rates with high efficiency, owing to the small number of parameters extracted by the autoregressive models. The proposed electrocardiogram classification and biometric systems achieved a 100% correct classification rate both in distinguishing normal from abnormal electrocardiogram signals and in identifying each person individually from his or her electrocardiogram signal. It was also shown that autoregressive models can represent electrocardiogram signals with 91% matching accuracy between the original electrocardiogram signal and the modelled signal.
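A minimal sketch of autoregressive feature extraction, assuming the statsmodels library; the model order (4) and the synthetic beat are illustrative stand-ins for the thesis's signals and settings.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def ar_features(beat, order=4):
    """Fit an AR(order) model to one ECG segment; the coefficients are the features."""
    model = AutoReg(beat, lags=order).fit()
    return model.params[1:]              # drop the intercept, keep the AR coefficients

rng = np.random.default_rng(1)
beat = np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.1 * rng.standard_normal(300)
print(ar_features(beat))                 # small, fixed-size vector -> classifier input
```

The appeal of this representation, as the abstract notes, is its compactness: a whole segment is reduced to a handful of coefficients, keeping the downstream classifier small and fast.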
36

A study of the implementation of quality management systems (QMS) within the Kuwaiti manufacturing industry

Al-Khadher, Abdullah M. Kh. M. January 2015 (has links)
The intended contribution and unifying purpose of this research was to identify the enablers and barriers affecting the implementation of quality management systems (QMS) within the Kuwaiti industrial context. This was achieved by assessing the current practices adopted by a number of manufacturing firms in Kuwait and then measuring the gaps or areas that need to be filled. A QMS is defined as an organisation's structure, responsibilities, processes, procedures and resources, which guide and control the organisation with regard to quality in order to constantly improve the effectiveness and efficiency of its performance. A QMS enables businesses to minimise undesired impacts on their processes while maximising the effectiveness and efficiency of the processes that deliver products or services to end users. This PhD research focuses on identifying the general state of the manufacturing sector within Kuwaiti Small, Medium and Large Enterprises (KSMLEs) and analyses the links, enablers and barriers related to the implementation of QMS, linking in the possible effects of culture. The research was carried out using a case study as well as a mixed-methods approach involving both quantitative and qualitative research methods. In the first instance, a single case study organisation, a large enterprise within the Kuwaiti manufacturing sector, was investigated as a pilot study; this helped to refine both the quantitative survey questionnaire and the qualitative interview questions. Thereafter, for the quantitative approach, a total of 192 valid responses out of 308 received data sets were generated using physical and online surveys across KSMLEs, covering three levels of management (top, middle and shop floor); the target participants were managers working in the industrial sector in Kuwait. The qualitative research design comprised twenty-five interviews across twenty-five KSMLEs, also at different management levels (six top, six middle and thirteen shop-floor managers) in the manufacturing sector in Kuwait. Findings from this study confirmed that leaders and managers in the manufacturing sector in Kuwait are actively engaged with customers; however, they lack processes for delegating tasks and empowering people. Findings further suggest that firms' strategies must focus on appreciating cultural aspects such as organisational culture, investing in employees, developing training programmes, and addressing community and social responsibility. This study contributes to the body of knowledge on QMS and provides a viable framework (model) developed over time through an iterative series of revisions, literature review, a case study and expert views. The new model designed and proposed in this thesis is called the Kuwait Quality Culture Model (KQCM). This model portrays the importance of culture, along with other factors, in the initiation and implementation of QMS, particularly in an Arabic culture. Overall, the findings confirm the researcher's expectation that QMS performance (managing people, policy & strategy, partnerships & resources, and culture) is positively associated with the commitment and leadership of top management, and managers at different levels stress the importance and necessity of internal communication channels. Moreover, employees' empowerment and the relevant processes (people, policy & strategy, partnerships & resources, and culture) are crucial in implementing any quality practices. 
Furthermore, culture was found to be a vital factor, as it plays an important role in the implementation of QMS in Kuwait. Finally, the study confirmed that, when implemented, the KQCM results (people, customers and society) would lead to better organisational performance.
37

A context-aware framework for personalised recommendation in mobile environments

Yeung, Kam Fung January 2011 (has links)
Context-awareness has become an essential part of various personalised applications such as mobile recommender systems and mobile information retrieval. Much progress has been made in context-aware applications. However, there is a lack of a general framework for supporting the rapid development of context-aware applications and enabling the sharing and dissemination of context information across different applications.
38

Novel nano-particulate/polymer treatment systems for masonry enhancement and protection

MacMullen, James January 2012 (has links)
Fundamental issues associated with addressing the UK housing shortage are climate change and the lack of usable building space. The conservation of old buildings, the maintenance of green land and a country filled with single-walled older properties mean that the UK government has to retrofit the existing older building stock. This would make a significant impact on reducing the carbon footprint of each household as well as helping to alleviate energy supply problems. The investigation of novel zinc oxide and titanium dioxide nanoparticulates in aqueous silane/siloxane oil-in-water (O/W) emulsions for practical exterior facade applications is presented. An initial emulsion was developed and optimised before further improvement was achieved through nanoparticulate incorporation. Nanoparticulates of zinc oxide and titanium dioxide were each dispersed by ultrasonication in n-isooctyltriethoxysilane before being incorporated into the base emulsion. Once the formulations were optimised, applied studies and a fundamental assessment of these emulsions were conducted. The aim of the work presented was to produce a practical facade emulsion that could be used in retrofitting existing building stock or for heritage remedial treatments. Initial research indicated that water was the governing factor in a diverse range of facade degradation mechanisms; improving water repellence and thermal envelope efficiency while reducing biofouling was recognised as being of key interest in this field. The study gives insight into aqueous water-repellent emulsions, nanoparticulate integration by commercially practical means, and assessment of the attributes exhibited by such treatments. Rheological and morphological characterisation concluded that a gelled network structure is produced by the incorporation of the fabricated nanoparticulate colloids. The emulsions retained shear-thinning characteristics, ideal for achieving deep treatment penetration in porous silicate substrates. Accelerated ageing tests showed that the nanoparticulate emulsions were substantially more thermodynamically stable than the control emulsion, while also being physically more stable due to surfactant-particulate stabilisation mechanisms of the polar phases. Further investigation found that hydroxyl-terminated siloxane could be integrated into these emulsions, helping to improve the 'green credentials' of such systems by replacing the conventionally used trimethoxy-terminated siloxanes, which release harmful methanol upon curing; substitution is therefore preferable and in line with current European policy. The treatments improved the thermal envelope efficiency of structures by reducing retained water in various forms, including rising damp. Assessment was carried out using model houses exposed to various temperature and humidity scenarios under controlled heating. The findings showed that water was the root cause of heat loss and thus a key parameter when considering the improvement of a structure's carbon footprint. These treatments allow water vapour to permeate passively out of a structure, reducing internal humidity issues, including microbiological degradation and health-related problems experienced by occupants. An investigation of bioreceptivity, conducted through an 8-week algal culture streaming study, concluded that treatments with <0.1 wt% of the aforementioned nanoparticulates could reduce biofouling through photo-induced sanitisation. 
Furthermore, it was also found that while water repellence may vary at the facade interface due to the photocatalytic nature of the respective metal oxides, interior water contact angles in the substrate should remain high due to the absence of UV light. This is of key importance, as it implies that simultaneous antifouling and rising-damp remediation may be achieved by such treatments. One major advantage of the treatments presented is that they do not appreciably alter the aesthetics of the substrate, unlike photo-induced 'self-cleaning' coatings that turn the facade white. In addition, these treatments are not susceptible to problems related to the cracking or chipping of coated surfaces, allowing them to provide better protection. From the above points it is clear that these treatments are the next paradigm in facade protection, complying with current social, ecological, political and industrial needs. This study presents the inception, development and investigation of the key attributes and mechanisms of these novel treatments, while showing critically that such emulsion treatments are practical, with great potential to enhance the lives of UK residents.
39

An intelligent monitoring system to predict potential catastrophic incidents

Painting, Andrew David January 2014 (has links)
This dissertation identified a gap in research for an intelligent monitoring system that monitors various indicators within complex engineering industries in order to predict situations that may lead to catastrophic failures. The accuracy of prediction was based upon lessons learnt from historic catastrophic incidents. Such incidents are normally attributed to combinations of several minor errors or failures, and seldom occur through single-point failures. A new system to monitor, identify and predict the conditions likely to cause a catastrophic failure could improve safety, reduce downtime and help prioritise funding. The novel approach involved the classification of ten common traits that are known to lead to catastrophe, based on six headings used by the Health and Safety Executive and four headings used in engineering governance. These were weight-averaged to provide a 'state' condition for each asset and amalgamated with a qualitative fault tree representation of a known catastrophic failure type. The current 'state' was plotted onto a coloured 2D surface graph over a period of time to demonstrate one particular visual tool. The research demonstrated that it was possible to create the monitoring system within Microsoft Excel and to run Visual Basic programs alongside Boolean logic calculations for the fault tree and the predictive tools, based upon trend analysis of historic data. Another significant research success was the development of a standardised approach to the investigation of incidents and the dissemination of information.
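A hedged sketch of the two mechanisms just described: a weighted-average 'state' score over monitored traits, and a qualitative Boolean fault tree over threshold events. The trait names, weights, threshold and tree structure are invented placeholders; the thesis implemented this in Microsoft Excel and Visual Basic rather than Python.

```python
traits = {                      # score in [0, 1]: 0 = healthy, 1 = worst case
    "maintenance_backlog": 0.7,
    "procedural_violations": 0.4,
    "training_gaps": 0.2,
}
weights = {"maintenance_backlog": 0.5, "procedural_violations": 0.3, "training_gaps": 0.2}

# Weighted-average 'state' condition for the asset.
state = sum(traits[t] * weights[t] for t in traits)
print(f"asset state: {state:.2f}")

# Qualitative fault tree: the top event fires when an AND/OR combination of
# basic events (trait scores above a threshold) is satisfied.
def event(trait, threshold=0.5):
    return traits[trait] >= threshold

top_event = event("maintenance_backlog") and (
    event("procedural_violations") or event("training_gaps")
)
print("catastrophic-failure precursor present:", top_event)
```

Tracking the `state` value over time gives the trend data that the predictive tools and the coloured 2D surface graph would draw on.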
40

Pedestrian counting and event detection in crowded environments

Shbib, Reda January 2015 (has links)
Crowd density estimation and pedestrian counting are becoming areas of interest, for example in assessing the social effects and interactions of small groups of people within a crowd. Existing crowd analyses performed by human operators are time-consuming: as visual surveillance becomes an essential need, it is impractical to watch and study all recorded video given the huge number of cameras being installed. The image-processing field has therefore attracted academic research into automatic counting and monitoring algorithms. This thesis presents novel contributions in three areas: pedestrian counting, event detection and queue monitoring. Firstly, it presents an original contribution in the pedestrian counting domain. In recent years, many proposed counting techniques have used global features to estimate crowd density. In this thesis, a new approach is introduced that replaces global image features with low-level features specific to individuals and clusters within the crowd; the total number of pedestrians is then the summation over all clusters that constitute the crowd. Experimental results across different datasets showed that low-level features performed better than global features. In addition to pedestrian counting, this thesis contributes to pedestrian flow monitoring through the development of a virtual-door algorithm, in which pedestrians are counted as they pass through a proposed virtual count line. Discriminant feature points are detected in the region of interest, and the system assembles the optical flow of this discrete group of extracted feature points along the trajectory direction. Finally, the thesis presents a novel technique for estimating queue parameters, such as the numbers entering and leaving and their frequency, in order to obtain a clear picture of queue traffic and flow. To obtain these parameters, the proposed pedestrian counting and virtual-door approaches were integrated. Experimental results demonstrate that the proposed system is robust in real-life environments.
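A minimal sketch of the virtual-door idea under stated assumptions: OpenCV's Shi-Tomasi corner detector and Lucas-Kanade optical flow stand in for the thesis's feature and flow methods, and the video path, line position and per-pedestrian clustering of crossing events are placeholders.

```python
import cv2

LINE_Y = 240                                   # y-coordinate of the virtual count line
cap = cv2.VideoCapture("corridor.mp4")         # placeholder clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
count = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is not None:
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        for p, n, s in zip(pts.reshape(-1, 2), nxt.reshape(-1, 2), status.ravel()):
            # A tracked point crossing the line downwards counts as one entry event.
            if s and p[1] < LINE_Y <= n[1]:
                count += 1
    prev_gray = gray

print("crossing events:", count)   # in practice, events are grouped per pedestrian
```

Counting crossings in each direction separately yields the entering/leaving tallies from which the queue parameters above can be estimated.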
