461

An efficient approach for high-fidelity modeling incorporating contour-based sampling and uncertainty

Crowley, Daniel R. 13 January 2014 (has links)
During the design process for an aerospace vehicle, decision-makers must have an accurate understanding of how each choice will affect the vehicle and its performance. This understanding is based on experiments and, increasingly often, computer models. In general, as a computer model captures a greater number of phenomena, its results become more accurate for a broader range of problems. This improved accuracy typically comes at the cost of significantly increased computational expense per analysis. Although rapid analysis tools have been developed that are sufficient for many design efforts, those tools may not be accurate enough for revolutionary concepts subject to demanding flight conditions such as transonic or supersonic flight and extreme angles of attack. At such conditions, the simplifying assumptions of the rapid tools no longer hold. Accurate analysis of such concepts would require models that do not make those simplifying assumptions, with the corresponding increases in computational effort per analysis. As computational costs rise, exploration of the design space can become exceedingly expensive. If this expense cannot be reduced, decision-makers would be forced to choose between a thorough exploration of the design space using inaccurate models and the analysis of a sparse set of options using accurate models. This problem is exacerbated as the number of free parameters increases, limiting the number of trades that can be investigated in a given time. In the face of limited resources, it can become critically important that only the most useful experiments be performed, which raises two questions: how can the most useful experiments be identified, and how can experimental results be used in the most effective manner? This research effort focuses on identifying and applying techniques that could address these questions. The demonstration problem for this effort was the modeling of a reusable booster vehicle, which would be subject to a wide range of flight conditions while returning to its launch site after staging. Contour-based sampling, an adaptive sampling technique, seeks cases that will improve the prediction accuracy of surrogate models for particular ranges of the responses of interest. In the case of the reusable booster, contour-based sampling was used to emphasize configurations with small pitching moments; the broad design space included many configurations which produced uncontrollable aerodynamic moments for at least one flight condition. By emphasizing designs that were likely to trim over the entire trajectory, contour-based sampling improved the predictive accuracy of surrogate models for such designs while minimizing the number of analyses required. The simplified models mentioned above, although less accurate for extreme flight conditions, can still be useful for analyzing performance at more common flight conditions. The simplified models may also offer insight into trends in the response behavior. Data from these simplified models can be combined with more accurate results to produce useful surrogate models with better accuracy than the simplified models alone but at less cost than if only expensive analyses were used. Of the data fusion techniques evaluated, Ghoreyshi cokriging was found to be the most effective for the problem at hand. Lastly, uncertainty present in the data was found to negatively affect the predictive accuracy of surrogate models. Most surrogate modeling techniques neglect uncertainty in the data and treat all cases as deterministic. This assumption is plausible, especially for data produced by computer analyses that are assumed to be perfectly repeatable and thus truly deterministic. However, a number of sources of uncertainty, such as solver iteration or surrogate-model prediction accuracy, can introduce noise into the data. If these sources of uncertainty could be captured and incorporated when surrogate models are trained, the resulting surrogate models would be less susceptible to that noise and correspondingly have better predictive accuracy. This was accomplished in the present effort by capturing the uncertainty information via nuggets added to the Kriging model. Combining these techniques made it possible to create surrogate models with better predictive accuracy while selecting the most informative experiments possible, significantly reducing the computational effort expended compared with a more standard approach using space-filling samples and data from a single source. The relative contributions of each technique were identified, and observations were made pertaining to the most effective way to apply the separate and combined methods.
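The two ideas in this abstract can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example — the RBF kernel, the 1-D design space, the noise variances, and the "probability of lying near the target contour" selection rule are illustrative assumptions, not the thesis's actual models — showing (a) a Kriging predictor in which per-case uncertainty enters as nuggets on the covariance diagonal, and (b) a contour-oriented rule that picks the next case most likely to lie near a target response value such as a near-zero pitching moment.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential covariance between two sets of 1-D design points."""
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length_scale**2)

def kriging_predict(x_train, y_train, x_new, nugget, length_scale=1.0):
    """Zero-mean Kriging prediction with per-case noise variances (nuggets).

    Cases with larger known uncertainty (e.g. solver-iteration noise) get a
    larger nugget and are trusted less by the surrogate.
    """
    K = rbf_kernel(x_train, x_train, length_scale) + np.diag(nugget)
    k_star = rbf_kernel(x_new, x_train, length_scale)
    mean = k_star @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, k_star.T)
    var = 1.0 - np.sum(k_star * v.T, axis=1)        # RBF prior variance is 1
    return mean, np.sqrt(np.maximum(var, 1e-12))

def next_contour_sample(candidates, mean, std, target=0.0, tol=0.05):
    """Contour-oriented selection: pick the candidate most likely to lie
    within `tol` of the target response value (e.g. near-zero pitching moment)."""
    p_near = norm.cdf(target + tol, mean, std) - norm.cdf(target - tol, mean, std)
    return candidates[np.argmax(p_near)]

# Illustrative 1-D design space with noisy "pitching moment" observations.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)
nugget = np.full(8, 0.01)                            # assumed noise variances
y = np.sin(2 * np.pi * x) + rng.normal(0.0, np.sqrt(nugget))

cand = np.linspace(0.0, 1.0, 200)
mean, std = kriging_predict(x, y, cand, nugget)
print(next_contour_sample(cand, mean, std))          # next case to analyse
```

In the actual study the candidates would be multi-dimensional vehicle configurations and the nuggets would come from the estimated solver and surrogate-prediction uncertainties rather than the fixed values assumed here.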
462

Odhad parametru při dvoufázovém stratifikovaném a skupinovém výběru / Parameter Estimation under Two-phase Stratified and Cluster Sampling

Šedová, Michaela January 2011 (has links)
Title: Parameter Estimation under Two-phase Stratified and Cluster Sampling Author: Mgr. Michaela Šedová Department: Department of Probability and Mathematical Statistics Supervisor: Doc. Mgr. Michal Kulich, Ph.D. Abstract: In this thesis we present methods of parameter estimation under two-phase stratified and cluster sampling. In contrast to classical sampling theory, we do not deal with finite population parameters, but focus on model parameter inference, where the observations in a population are considered to be realisations of a random variable. However, we consider the sampling schemes used, and thus we incorporate much of survey sampling theory. Therefore, the presented methods of the parameter estimation can be understood as a combination of the two approaches. For both sampling schemes, we deal with the concept where the population is considered to be the first-phase sample, from which a subsample is drawn in the second phase. The target variable is then observed only for the subsampled subjects. We present the mean value estimation, including the statistical properties of the estimator, and show how this estimation can be improved if some auxiliary information, correlated with the target variable, is observed for the whole population. We extend the method to the regression problem....
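As a concrete illustration of the two-phase idea described above, the sketch below (illustrative data and a simple regression-type adjustment; not the thesis's exact estimator, and without its stratified or cluster refinements) shows how a phase-two subsample mean of the target variable can be improved using an auxiliary variable observed for the entire phase-one sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Phase 1: auxiliary variable x observed for the whole (large) first-phase sample.
N = 10_000
x = rng.normal(50.0, 10.0, N)
y = 2.0 * x + rng.normal(0.0, 5.0, N)   # target variable, observed only in phase 2

# Phase 2: simple random subsample on which y is actually measured.
n = 500
idx = rng.choice(N, size=n, replace=False)
x2, y2 = x[idx], y[idx]

# Regression (difference) estimator of the mean of y: adjust the subsample mean
# by how far the subsample's x-mean falls from the phase-1 x-mean.
b = np.cov(x2, y2)[0, 1] / np.var(x2, ddof=1)
y_bar_reg = y2.mean() + b * (x.mean() - x2.mean())

print(y_bar_reg, y2.mean())  # the adjusted estimate typically has smaller variance
```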
463

Pesquisas sob amostragem informativa utilizando o FBST / Surveys under informative sampling using the FBST

Daniel Mendes Azerêdo 28 May 2013 (has links)
Pfeffermann, Krieger e Rinott (1998) apresentaram uma metodologia para modelar processos de amostragem que pode ser utilizada para avaliar se este processo de amostragem é informativo. Neste cenário, as probabilidades de seleção da amostra são aproximadas por uma função polinomial dependendo das variáveis resposta e concomitantes. Nesta abordagem, nossa principal proposta é investigar a aplicação do teste de significância FBST (Full Bayesian Significance Test), apresentado por Pereira e Stern (1999), como uma ferramenta para testar a ignorabilidade amostral, isto é, para avaliar uma relação de significância entre as probabilidades de seleção da amostra e a variável resposta. A performance desta modelagem estatística é testada com alguns experimentos computacionais. / Pfeffermann, Krieger and Rinott (1998) introduced a framework for modeling sampling processes that can be used to assess if a sampling process is informative. In this setting, sample selection probabilities are approximated by a polynomial function depending on outcome and auxiliary variables. Within this framework, our main purpose is to investigate the application of the Full Bayesian Significance Test (FBST), introduced by Pereira and Stern (1999), as a tool for testing sampling ignorability, that is, to detect a significant relation between the sample selection probabilities and the outcome variable. The performance of this statistical modelling framework is tested with some simulation experiments.
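A rough sketch of how an FBST e-value could be computed from posterior draws is given below. It is a simplified Monte Carlo version for a point null (all coefficients linking the selection probabilities to the outcome equal to zero, i.e. ignorable sampling) with a flat reference function; for a composite null the density would first be maximised over the null set. The illustrative posterior and the kernel density estimate are assumptions, not the implementation used in the thesis.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fbst_evalue(posterior_draws, theta_null):
    """Monte Carlo FBST evidence value in favour of a point null.

    posterior_draws: array of shape (n_draws, dim) from the posterior of the
    coefficients tying the selection probabilities to the outcome variable.
    theta_null: the null value (e.g. all-zero coefficients = ignorable sampling).
    """
    kde = gaussian_kde(posterior_draws.T)        # density estimate over the draws
    f_draws = kde(posterior_draws.T)             # density at each draw
    f_null = kde(np.atleast_2d(theta_null).T)    # density at the null point
    # Tangent set: draws whose posterior density exceeds the density at the null.
    tangent_mass = np.mean(f_draws > f_null)
    return 1.0 - tangent_mass                    # small value -> evidence against H0

# Illustrative posterior: two polynomial coefficients centred away from zero.
rng = np.random.default_rng(2)
draws = rng.multivariate_normal([0.4, 0.1], 0.01 * np.eye(2), size=5_000)
print(fbst_evalue(draws, np.zeros(2)))
```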
464

Identifiering av variabler vid framtagning av optimerad stickprovsfrekvens / Identifying Variables for Developing Optimized Sampling Frequency

Gunnarsson Ljungblom, Joel, Larsson, Rikard January 2017 (has links)
Arbetet kring mätfrekvenser, alltså hur ofta en producerad detalj ska mätas, inom produktionen på Volvo Cars följer i dagsläget inget standardiserat arbetssätt. Arbetet kring det bygger i stort på tidigare erfarenheter och vad liknande utrustningar har för mätfrekvens. Volvo Cars efterfrågar mer kunskap inom området för att få en mer kostnadseffektiv kvalitetssäkring. Arbetets huvudsyfte har innefattats av identifiering gällande vilka variabler som påverkar mätfrekvensen, samt uppbyggnad av en enklare modell där variablerna applicerats. Intervjuer har även genomförts på ett flertal företag, där några av de viktigaste slutsatserna är: Mätfrekvenser arbetas med retroaktivt, snarare än proaktivt. Duglighet är i dagsläget vanligast att använda vid arbete med mätfrekvenser. Arbete med mätfrekvenser sker inte standardiserat. Förbättring av mätfrekvenser jobbas med i låg grad och när det väl görs är det ofta triggat av en mantidsanalys. Arbetet har resulterat i identifiering av två huvudvariabler; duglighet och kvalitetskostnader. Även om verkligheten är mer komplicerad, kan dessa två variabler ses som huvudkategorier. Under duglighet och kvalitetskostnader finns sedan underkategorier. För duglighet finns verktygsrelaterade egenskaper såsom förslitning och dess material. Även detaljens material och dess termodynamiska egenskaper har inverkan på dugligheten. Slutligen återfinns felintensitet, vibrationer som uppstår och processens stabilitet. Gällande kvalitetsbristkostnader finns felkostnader som uppstår inom företagets väggar, interna felkostnader, och de felkostnader som uppstår när produkt levererats till kund, externa felkostnader. Utöver de två finns även kontrollkostnader och förebyggande kostnader. Arbetet har dessutom mynnat ut i en enklare modell där erfarenhet från intervjuer och data från Volvo Cars tagits i beaktande. Flera av de data som återfinns i modellen har tagits fram genom analysering av tre veckors produktionsdata från Volvo Cars. Data som används i modellen berörande kvalitet är duglighet och den procentuella fördelningen av den aktuella varianten. De data som har inverkan på kvalitetskostnaderna är hur många operationer flödet har och aktuell operations placering i relation till totala antalet. Även råämnets kostnad, allvarlighetsgraden för kvalitetsbristen hos aktuell egenskap och skrotkostnaden används. Modellen har sedan applicerat på en maskinerna som omfattats av arbetet för att kontrollera utfallet. Med data införd baserad på produktionsdata från Volvo Cars har en stickprovsfrekvens på 62 genererats. / Work on measuring frequencies, which is how often a produced detail is to be measured, within Volvo Cars’ production currently does not follow a standardized approach. The work around it basically builds on past experiences and what similar equipment has for measurement frequency. Volvo Cars requests more knowledge in the area to get more cost-effective quality assurance. The main objective of the work has contained identification of the variables that affect the measurement frequency, as well as construction of a simpler model where the variables are applied. Interviews have also been conducted on a number of companies, where some of the key conclusions are: Measuring frequencies are worked retrospectively, rather than proactively. Capability is currently the most common for work with measurement frequencies. Working with measurement frequencies does not occur standardized. 
Improvement of measurement frequencies is pursued only to a limited extent, and when it is done it is often triggered by a man-time analysis. The work has resulted in the identification of two main variables: capability and quality costs. Although reality is more complicated, these two variables can be seen as main categories, each with its own subcategories. Under capability fall tool-related properties such as wear and tool material; the material of the part and its thermodynamic properties also affect capability, as do failure intensity, vibrations that arise, and the stability of the process. Regarding quality-deficiency costs, there are failure costs that arise within the company's walls (internal failure costs) and failure costs that arise once the product has been delivered to the customer (external failure costs); in addition, there are inspection costs and prevention costs. The work has also resulted in a simpler model that takes into account experience from the interviews and data from Volvo Cars. Several of the inputs to the model were derived by analysing three weeks of production data from Volvo Cars. The quality-related data used in the model are the capability and the percentage share of the current variant. The data affecting quality costs are the number of operations in the flow, the position of the current operation relative to the total number of operations, the cost of the raw material, the severity of the quality deficiency for the characteristic in question, and the scrap cost. The model was then applied to one of the machines covered by the work in order to check the outcome. With inputs based on production data from Volvo Cars, a sampling frequency of 62 was generated.
465

Non-uniformity issues and workarounds in bounded-size sampling

Gemulla, Rainer, Haas, Peter J., Lehner, Wolfgang 27 January 2023 (has links)
A variety of schemes have been proposed in the literature to speed up query processing and analytics by incrementally maintaining a bounded-size uniform sample from a dataset in the presence of a sequence of insertion, deletion, and update transactions. These algorithms vary according to whether the dataset is an ordinary set or a multiset and whether the transaction sequence consists only of insertions or can include deletions and updates. We report on subtle non-uniformity issues that we found in a number of these prior bounded-size sampling schemes, including some of our own. We provide workarounds that can avoid the non-uniformity problem; these workarounds are easy to implement and incur negligible additional cost. We also consider the impact of non-uniformity in practice and describe simple statistical tests that can help detect non-uniformity in new algorithms.
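For context, the insertion-only case is the classic reservoir sampling scheme sketched below (a standard Algorithm R implementation, not code from the paper); the subtle non-uniformity issues the authors report arise in the more elaborate schemes that also handle deletions and updates, which this sketch deliberately leaves out.

```python
import random

class BoundedUniformSample:
    """Bounded-size uniform sample over an insertion-only stream
    (classic reservoir sampling). Deletions and updates — where the subtle
    non-uniformity issues discussed in the paper arise — are not handled here.
    """

    def __init__(self, capacity, seed=None):
        self.capacity = capacity
        self.seen = 0
        self.sample = []
        self.rng = random.Random(seed)

    def insert(self, item):
        self.seen += 1
        if len(self.sample) < self.capacity:
            self.sample.append(item)
        else:
            # Replace a slot with probability capacity / seen, so every item
            # seen so far is in the sample with equal probability.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.sample[j] = item

# Usage: stream 10,000 insertions, keep a uniform sample of 100.
s = BoundedUniformSample(capacity=100, seed=42)
for i in range(10_000):
    s.insert(i)
print(len(s.sample), s.seen)
```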
466

Spectrographic determination of rhenium in molybdenite with the D.C. arc

Ho, Show-Jy. January 1964 (has links)
Call number: LD2668 .T4 1964 H67 / Master of Science
467

Application of an automated labour performance measuring system at a confectionery company

Van Blommestein, D.L., Matope, S., Ruthven, G., Van der Merwe, A.F. January 2013 (has links)
Published Article / This paper focuses on the implementation of a labour performance measuring system at a confectionery company. The computer-vision-based system builds on the work sampling methodology. It consists of four cameras linked to a central computer via USB extenders. The computer uses a random function in C++ in order to determine when measurements are to be taken. OpenCV is used to track the movement of a target worker's dominant hand at a given work station. Tracking is accomplished through the use of a colour band-pass filter. The speed of the worker's hand is used to identify whether the worker is busy, idle or out of the frame over the course of the sampling period. Data collected by the system is written into a number of text files. The stored data is then exported to a Microsoft Excel 2007 spreadsheet where it is analysed and a report on the labour utilisation is generated.
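A minimal sketch of the kind of colour-band tracking and busy/idle classification described above is shown below, using OpenCV's Python bindings rather than the authors' C++ implementation. The HSV range, speed threshold, and camera index are illustrative assumptions, and the random work-sampling trigger and text-file logging are omitted.

```python
import cv2
import numpy as np

# Illustrative HSV band for a coloured glove or marker on the worker's hand.
HSV_LOW, HSV_HIGH = np.array([100, 120, 70]), np.array([130, 255, 255])
SPEED_THRESHOLD = 5.0   # pixels per frame; below this the worker is treated as idle

def hand_centroid(frame_bgr):
    """Return the (x, y) centroid of pixels inside the colour band, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LOW, HSV_HIGH)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                      # hand not visible in the frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def classify(prev, curr):
    """Classify the worker's state from two consecutive centroids."""
    if curr is None:
        return "out of frame"
    if prev is None:
        return "busy"                    # just re-entered the frame
    speed = np.hypot(curr[0] - prev[0], curr[1] - prev[1])
    return "busy" if speed > SPEED_THRESHOLD else "idle"

cap = cv2.VideoCapture(0)                # camera index is an assumption
prev = None
ok, frame = cap.read()
while ok:
    curr = hand_centroid(frame)
    print(classify(prev, curr))
    prev = curr
    ok, frame = cap.read()
cap.release()
```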
468

IEEE1588 – A solution for synchronization of networked data acquisition systems?

Corry, Diarmuid 10 1900 (has links)
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / One of the problems for manufacturers and users of flight test data acquisition equipment is to guarantee synchronization between multiple units acquiring data on the vehicle. Past solutions have involved proprietary interconnects and multiple wire installations, increasing weight and complexity and reducing inter-operation of units. This problem has become particularly important given the trend towards commercial buses, especially Ethernet, as a system interconnect. The IEEE1588 standard offers a way of transmitting time accurately over Ethernet. This paper discusses the standard, how it might be implemented, and examines the issues involved in adopting this standard for flight test data acquisition. A particular implementation that results in a synchronized four-wire Ethernet-based distributed data acquisition system is discussed in section 3.
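At its core, IEEE 1588 estimates clock offset and path delay from the timestamps of a Sync / Delay_Req exchange, assuming a symmetric network path. The sketch below shows just that arithmetic with made-up timestamps; it is not the implementation discussed in the paper.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Basic IEEE 1588 offset/delay estimate from one Sync / Delay_Req exchange.

    t1: master sends Sync          (master clock)
    t2: slave receives Sync        (slave clock)
    t3: slave sends Delay_Req      (slave clock)
    t4: master receives Delay_Req  (master clock)
    Assumes a symmetric network path, as the standard does.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# Illustrative timestamps in nanoseconds: slave ~500 ns ahead, ~2 us path delay.
offset, delay = ptp_offset_and_delay(t1=1_000_000, t2=1_002_500,
                                     t3=1_010_000, t4=1_011_500)
print(offset, delay)   # -> 500.0, 2000.0
```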
469

Design Considerations for a Variable Sample Rate Signal Conditioning Module

Lee, Jeffrey C. 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / Modern telemetry systems require flexible sampling rates for analog signal conditioning within telemetry encoders in order to optimize mission formats for varying data acquisition needs and data rate constraints. Implementing a variable sample rate signal conditioning module for a telemetry encoder requires consideration of several possible architectural topologies that place different system requirements on data acquisition modules within the encoder in order to maintain adequate signal fidelity of sensor information. This paper focuses on the requirements, design considerations and tradeoffs associated with differing architectural topologies for implementing a variable sample rate signal conditioning module and the resulting implications on the encoder system's data acquisition units.
470

Architectural Considerations for a Variable Bit Rate Data Acquisition Telemetry Encoder

Lee, Jeffrey C. 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Modern telemetry systems require flexible bit rate telemetry encoders in order to optimize mission formats for varying data rate requirements and/or signal-to-noise conditions given a fixed transmitter power. Implementing a variable bit rate telemetry encoder requires consideration of several possible architectural topologies that place different system requirements on data acquisition modules within the encoder in order to maintain adequate signal fidelity of sensor information. This paper focuses on the requirements, design considerations and tradeoffs associated with differing architectural topologies for implementing a variable bit rate encoder and the resulting implications on the encoder system's data acquisition units.
