  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1371

Nonlinear continuous-time generalised predictive control

Siller-Alcalá, Irma Irasema January 1998 (has links)
The development of the nonlinear version of the Continuous-time Generalised Predictive Control (NCGPC) is presented. Unlike the linear version, the nonlinear version is developed in state-space form and shown to include Nonlinear Generalised Minimum Variance (NGMV), and a new algorithm, Nonlinear Predictive Generalised Minimum Variance (NPGMV), as special cases. Through simulations, it is demonstrated that NCGPC can deal with nonlinear systems whose relative degree is not well defined and nonlinear systems with unstable zero dynamics. Geometric approaches, such as exact linearisation, are shown to be included in the NCGPC as special cases.
1372

An intermittent predictive control approach to modelling sustained human motor control

Mamma-Graham, Adamantia S. January 2014 (has links)
Although human sustained control movements are continuous in nature, there is still controversy over the mechanisms underlying such physiological systems. A popular topic of debate is whether human motor control mechanisms could be modelled as engineering control systems, and if so, what control algorithm is most appropriate. Since the early years of modelling sustained control tasks in human motor control, the servomechanism has been an adequate model to describe human tracking tasks. Another continuous-time system model often used to model sustained control tasks is the predictive controller, which is based on internal models and includes prediction and optimisation. On the other hand, studies have suggested intermittent behaviour of the "human controller" in sustained motor control tasks. This thesis investigated whether intermittent control is a suitable approach to describe sustained human motor control. It was investigated how well an intermittent control system model could approximate both the deterministic and non-deterministic parts of experimental data from a visual-manual compensatory tracking task. Finally, a preliminary study was conducted to explore issues associated with the practical implementation of the intermittent control model. To fit the deterministic part of the experimental data, a frequency-domain identification method was used. Identification results obtained with an intermittent controller were compared against results using continuous-time non-predictive and predictive controllers. The results show that the identified frequency response functions of the intermittent control model not only fit the frequency response functions derived from the experimental data well, but, most importantly, yielded identified controller parameters similar to those identified using a predictive controller, with values that appear to be physiologically meaningful.
A novel way to explain human variability, as represented by the non-deterministic part of the experimental data (the "remnant"), was developed, based on an intermittent control model with a variable intermittent interval. This model was compared against the established paradigm, in which variability is explained by a predictive controller with added noise, either signal-dependent control signal noise or observation noise. The study has shown that the intermittent controller with a variable intermittent interval could model the non-deterministic experimental data as well as the predictive controller model with added noise. This provides a new explanation for the source of remnant in human control as inherent to the controller structure, rather than as a noise signal, and enables a new interpretation of the physiological basis for human variability. Finally, the theoretical intermittent control model was implemented in real time in the context of the physiological control mechanism of human standing balance. An experimental method was developed to apply automatic artificial balance of an inverted pendulum in the context of human standing, via functional electrical stimulation control of the lower leg muscles of a healthy subject. The significance of this study is, firstly, that frequency-domain identification was applied for the first time with intermittent control, showing that both intermittent and predictive control models can model deterministic experimental data from manual tracking tasks equally well. Secondly, for the first time, the inherent variability in human motor control tasks, represented by the remnant signal, could be modelled as part of the structure of the intermittent controller rather than as an added noise model. 
Although the experimental method to apply automatic artificial balance of an inverted pendulum in the context of human standing was not successful, the intermittent controller was implemented for the first time in real time and combined with electrical muscle stimulation to control a physiological mechanism.
1373

The economics of power

Javary, Michele January 1998 (has links)
No description available.
1374

Radiation damage in hexagonal-close-packed metals

Yellen, Duncan Howard January 1990 (has links)
No description available.
1375

Enhancing user's privacy : developing a model for managing and testing the lifecycle of consent and revocation

Agrafiotis, Ioannis January 2012 (has links)
Increasingly, people turn to the Internet for access to services, which often require disclosure of a significant amount of personal data. Networked technologies have enabled an explosive growth in the collection, storage and processing of personal information with notable commercial potential. However, there are asymmetries in relation to how people are able to control their own information when handled by enterprises. This raises significant privacy concerns and increases the risk of privacy breaches, thus creating an imperative need for mechanisms offering information control functionalities. To address the lack of controls in online environments, this thesis focuses on consent and revocation mechanisms to introduce a novel approach for controlling the collection, usage and dissemination of personal data and managing privacy expectations. Drawing on an extensive multidisciplinary review on privacy and on empirical data from focus groups, this research presents a mathematical logic as the foundation for the management of consent and revocation controls in technological systems. More specifically, this work proposes a comprehensive conceptual model for consent and revocation and introduces the notion of 'informed revocation'. Based on this model, a Hoare-style logic is developed to capture the effects of expressing individuals' consent and revocation preferences. The logic is designed to support certain desirable properties, defined as healthiness conditions. Proofs that these conditions hold are provided with the use of Maude software. This mathematical logic is then verified in three real-world case study applications with different consent and revocation requirements: the management of employee data in a business environment, medical data in a biobank and identity assurance in government services. The results confirm the richness and the expressiveness of the logic. In addition, a novel testing strategy underpinned by this logic is presented. 
This strategy is able to generate testing suites for systems offering consent and revocation controls, such as the EnCoRe system, where testing was carried out successfully and resulted in identifying faults in the EnCoRe implementation.
1376

A cost-benefit forecasting framework for assessment of advanced manufacturing technology development

Jones, Mark Benjamin January 2014 (has links)
Development of new Advanced Manufacturing Technology (AMT) for the aerospace industry is critical to enhancing the manufacture and assembly of aerospace products. These novel AMTs involve high development costs and specialist resource capabilities, and have long development periods, high technological risks and lengthy payback durations. This creates industry reluctance to fund the initial AMT development stages, affecting their success within an increasingly competitive environment. Selection of suitable AMTs for development is typically performed by managers who make little reference to estimating the non-recurring development effort in resources and hardware cost. In addition, performance at the conceptual stage is predicted using expert opinion, producing subjective and inaccurate outputs. Selected AMTs are then submerged into development research and heavily invested in, with incorrect selections having a detrimental impact on the business. A detailed study of the UK aerospace manufacturing industry corroborated these findings and revealed the requirement for a new process map to resolve the problem of managing AMT developments at the conceptual stages. This process map defined the final research protocol, establishing the requirement for a Cost-Benefit Forecasting Framework. The framework improves the decision-making process for selecting the most suitable AMTs for development, from concept to full-scale demonstration. Cost is the first element and is capable of estimating the AMT development effort in person-hours and the cost of hardware using two parametric cost models. Benefit is the second element and forecasts the AMT's tangible and intangible performance. The framework plots these quantified cost-benefit parameters and is capable of presenting development value advice for a diverse range of AMTs with varied applications. 
A detailed case study is presented evaluating a total of 23 novel aerospace AMTs verifying the capability and high accuracy of the framework within a large aerospace manufacturing organisation. Further validation is provided by quantifying the responses from 10 AMT development experts, after utilising the methodology within an industrial setting. The results show that quantifying the cost-benefit parameters provides manufacturing research and technology with the ability to select AMTs that provide the best value to a business.
1377

Contractors' bidding behaviour and tender price prediction

McCaffer, Ronald January 1976 (has links)
Data relating to the bids for 384 roads contracts and 190 buildings contracts, together with a library of individual unit prices, were obtained. The normality or near normality of the distribution of bids for buildings and roads contracts is established. This allows the relationship between mean and lowest bids to be defined using normal order statistics, and permits outlier tests to be applied in identifying unrealistically low bids. The average mean-standardised bids of contractors have a strong negative correlation with the contractor's success ratio. This allows contractors to predict the success ratios of others using their mean-standardised bids; the data required for this are not limited to the competitions the contractor himself enters. Contractors have different behaviour patterns: some have disproportionate numbers of high or low bids, while others behave randomly. These behaviour features correlate well with the average mean-standardised bids. Graphs of the cumulative sum of (bid - mean bid)/mean bid are useful in identifying contractors who are seeking work and those who are not, and can be used to identify serious rivals for particular contracts. Contractors differ in the sensitivity of their success ratio to changes in bid value, indicating different market judgements. Contractors also show different trends in their standardised bids with contract value, although this only affects success ratios in extreme cases. Designers' predictions of the lowest bid have standard deviations of 16.63% and 20.14% for buildings and roads contracts respectively. Price models based on multiple regression analysis produce similar accuracies for comparable construction works. The tender price prediction system developed, based on a library of unit prices and inflation indices, achieved a standard deviation of 8.30% in predicting the mean bid and 11.08% in predicting the lowest bid for roads contracts. 
This could be improved with more data in the price library, but is nevertheless a substantial improvement on the results achieved by designers' estimating.
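The mean-standardised bid and cumulative-sum measures described in this abstract have simple closed forms. The following is a minimal sketch of one plausible reading of them, with hypothetical bid data (the thesis's actual datasets and definitions are not reproduced here):

```python
# Hypothetical illustration of two measures from the abstract:
# a contractor's mean-standardised bid (bid / mean of all bids in that
# competition) and the cumulative sum of (bid - mean bid)/mean bid over
# successive contracts. A persistently falling cusum path suggests a
# contractor bidding below the market, i.e. one who is seeking work.

def mean_standardised_bid(bid, all_bids):
    """Bid divided by the mean of all bids in the competition."""
    mean = sum(all_bids) / len(all_bids)
    return bid / mean

def cusum_of_relative_deviations(bids_by_contract):
    """Cumulative sum of (bid - mean bid) / mean bid across contracts.

    bids_by_contract: list of (contractor_bid, all_bids_for_that_contract).
    """
    total, path = 0.0, []
    for bid, all_bids in bids_by_contract:
        mean = sum(all_bids) / len(all_bids)
        total += (bid - mean) / mean
        path.append(total)
    return path

# Hypothetical example: a contractor consistently bidding below the mean.
history = [
    (95.0, [95.0, 100.0, 105.0]),  # mean 100 -> deviation -0.05
    (98.0, [98.0, 100.0, 102.0]),  # mean 100 -> deviation -0.02
]
print(cusum_of_relative_deviations(history))  # falling (negative) path
```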
1378

Hydrogen production through sorption-enhanced steam reforming of ethanol using CaO-based sorbent mixed with iron oxide catalyst

Elfaki, Hind Omer Elsheikh January 2016 (has links)
Novel synthetic CaO-based sorbents for carbon dioxide (CO2) capture in sorption-enhanced steam reforming (SESR) were prepared by the co-precipitation method. Magnesium oxide (MgO) and cerium oxide (CeO2) were mixed with calcium oxide (CaO) in different molar percentages in order to obtain the optimum composition providing high CO2 uptake capacity and cyclic stability. The TGA results for CO2 uptake revealed that, for CaO:MgO:CeO2 molar ratios of 6:2:1 and 4:2:1, the sorbents had CO2 capture capacities of 29 and 25 wt.% respectively. The fresh sorbents were characterized using X-ray diffraction, mercury porosimetry, N2 physisorption and scanning electron microscopy. It was found that the sorbents with higher CO2 uptake capacities had relatively porous surface structures, with porosity above 66%. Modelling of the CO2 uptake kinetics showed that the JMA (Johnson-Mehl-Avrami) model best fits the first and second stages, except for the 4:2:1 sample, where surface chemisorption (SC) fits the initial stage and JMA the second; the contracting volume model (CV2/3) fits the final stage of all the studied sorbents. The stability of the sorbents at high temperatures was examined over multiple cycles of carbonation/de-carbonation reactions. After 45 cycles, the uptake of the 6:2:1 sample remained as high as 25 wt.% (0.43 g CO2 per g CaO), only a 25% decrease from its CO2 uptake capacity as a fresh sample. This sample was therefore selected to be mixed with an iron oxide catalyst and used for SESR. Ethanol steam reforming over an iron oxide catalyst, with and without in-situ CO2 removal, was then investigated. The results confirmed that iron oxide exhibits catalytic activity for hydrogen (H2) production from ethanol steam reforming/decomposition reactions. 
Furthermore, the CaO-based sorbent successfully decreased the amount of CO2 produced during the ethanol reforming reaction for up to 70 min of reaction time. Ethanol reforming with in-situ CO2 removal was investigated at 550-700 °C. The maximum H2 yield achieved was 3.5 mol (H2)/mol (EtOH) at 600 °C. GC results revealed no evidence of CO or C across the studied temperature range. The results showed enhanced reactivity with increasing gas hourly space velocity (GHSV). The amount of H2 produced remained stable over 10 cycles, equivalent to 30 hours of reaction time.
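The kinetic models named in this abstract have standard closed forms in the gas-solid sorption literature. A minimal sketch, assuming the usual textbook expressions (the abstract gives neither the exact model forms nor fitted parameters, so the rate constants below are hypothetical):

```python
import math

# Standard closed forms assumed here (hypothetical parameters):
#   JMA (Johnson-Mehl-Avrami):      x(t) = 1 - exp(-(k*t)**n)
#   Contracting volume (one common
#   integrated form):               x(t) = 1 - (1 - k*t)**3

def jma(t, k, n):
    """Fraction converted under the Johnson-Mehl-Avrami model."""
    return 1.0 - math.exp(-((k * t) ** n))

def contracting_volume(t, k):
    """Fraction converted under a contracting-volume model."""
    return 1.0 - (1.0 - k * t) ** 3

# Hypothetical rate constants: conversion rises from 0 toward 1 with time.
for t in (0.0, 0.5, 1.0, 2.0):
    print(t, jma(t, k=1.0, n=2.0), contracting_volume(t, k=0.1))
```

In a kinetics study such as this one, these expressions would be fitted to the TGA conversion curves stage by stage, and the model with the smallest residual chosen for each stage.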
1379

Credit risk analysis using artificial intelligence : evidence from a leading South African banking institution

Moonasar, Viresh January 2007 (has links)
Credit risk analysis is an important topic in financial risk management. Financial institutions (e.g. commercial banks) that grant consumers credit need reliable models that can accurately detect and predict defaults. This research investigates the ability of artificial neural networks to act as a decision support system that can automatically detect and predict “bad” credit risks based on customers' demographic, biographic and behavioural characteristics. The study focuses specifically on the learning vector quantization neural network algorithm. This thesis contains a short overview of credit scoring models, an introduction to artificial neural networks and their applications, and presents the performance evaluation results of a credit risk detection model based on learning vector quantization networks.
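Learning vector quantization, the algorithm this abstract centres on, is simple to state: each class is represented by prototype vectors, and training moves the nearest prototype toward a sample of the same class and away from a sample of a different class. A minimal LVQ1 sketch with toy data (the thesis's bank dataset and network configuration are not reproduced; all values here are hypothetical):

```python
# Minimal LVQ1 sketch on toy two-feature data standing in for
# "good"/"bad" credit risks. Hypothetical data and parameters.

def euclidean2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def lvq1_train(samples, labels, prototypes, proto_labels, lr=0.1, epochs=20):
    """Move the winning prototype toward same-class samples, away otherwise."""
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # find the nearest prototype (the "winner")
            i = min(range(len(prototypes)),
                    key=lambda k: euclidean2(x, prototypes[k]))
            sign = 1.0 if proto_labels[i] == y else -1.0  # attract or repel
            prototypes[i] = [p + sign * lr * (xj - p)
                             for p, xj in zip(prototypes[i], x)]
    return prototypes

def lvq1_predict(x, prototypes, proto_labels):
    """Classify x by the label of its nearest prototype."""
    i = min(range(len(prototypes)),
            key=lambda k: euclidean2(x, prototypes[k]))
    return proto_labels[i]

# Two toy clusters; one prototype per class.
samples = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
labels = ["good", "good", "bad", "bad"]
protos = lvq1_train(samples, labels, [[0.0, 0.0], [1.0, 1.0]], ["good", "bad"])
print(lvq1_predict([0.15, 0.15], protos, ["good", "bad"]))  # prints "good"
```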
1380

Examining teachers' and college students' perspectives toward e-textbooks as an educational tool

Alawami, Nariman Ali 07 April 2017 (has links)
The purpose of this research was to understand the perspectives of teachers and students regarding their use of specific e-textbooks in a higher education learning environment. The e-textbooks used by the teachers and students were examined in order to determine the functions and features to focus on during the inquiry. This study was particularly interested in the design of e-textbooks and how they are used by both teachers and students in the learning process. The participants in the research were also prompted to suggest improvements to e-textbooks.

A case study approach was used as the methodology, examining three teachers representing three majors in three different colleges, together with a small sample of their students, to gather information to answer the research questions. Qualitative data from multiple sources, such as in-depth interviews and document reviews, were used to analyze and examine the overall utility of e-textbooks for learning and teaching, and the functions related to the instructional and navigational design of e-textbooks.

Results of this study showed overall agreement that choosing books needs to be based on the quality of the material contained within the book, regardless of the format, print or electronic. E-textbooks facilitate teachers' and students' interactions and access to materials and resources. However, both teachers and students viewed their e-textbooks as supplementary materials, even though these e-textbooks allowed them to interact with the text using different tools. One recurring finding was how flexible e-textbooks are in individualizing student learning. Recommendations by students and teachers included improving the page layout and interface, increasing window size, providing more complex tasks, keeping up with technology, and ensuring the quality of information within the e-textbook.
Implications for future research include further investigation into the use of e-textbooks as supplementary materials, and whether printed texts are also being considered supplementary. Finally, there are indications that advanced technology may be changing how students learn, and whether e-textbooks reflect this change. Further research into this possible change in the ways students learn would shed additional light on this question.
