41

Design, Construction and Performance Evaluation of a Submersible Pump with Numerical Experimentation

Engin, Ertan 01 September 2003
Due to increasing demand, nonclog sewage pumps are designed and manufactured in large numbers all over the world. However, no methodology for the design of these special-duty pumps is found in the literature, so manufacturers tend to develop their own empirical methodologies. In this thesis, a nonclog pump is designed and constructed on the basis of suitable approaches drawn from known centrifugal pump design methods. In this frame, the aim is a nonclog submersible pump capable of handling solids up to 80 mm in diameter. The designed pump delivers a flow rate of 100 l/s against a head of 24 m at a rotational speed of 1000 rpm. The design procedure and the important points that distinguish nonclog pump design from standard centrifugal pump design are given. In addition, the hydraulic characteristics of two nonclog pumps, one of which is the pump designed in this study, are investigated by means of a computational fluid dynamics (CFD) code. The designed pump was manufactured and tested at Layne Bowler Pump Company Inc. The test results indicate that the design point is reached with a deviation within the limits of the related standard. The wire-to-water total best efficiency obtained in the test is 60%. Close agreement between the actual test results and the numerical experimentation performed with the CFD code shows that CFD analysis is a quite useful tool for predicting the hydraulic characteristics of nonclog pumps. Moreover, the pump was tested at 750 rpm and the results are found to be in good agreement with the similitude analysis results.
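The similitude analysis mentioned in this abstract rests on the standard pump affinity laws (Q ∝ N, H ∝ N², P ∝ N³). The sketch below is a minimal illustration, not code from the thesis, scaling the stated design point from 1000 rpm down to the 750 rpm test speed:

```python
# Pump affinity (similitude) laws for a speed change at fixed impeller
# diameter: Q2/Q1 = N2/N1, H2/H1 = (N2/N1)^2, P2/P1 = (N2/N1)^3.

def scale_pump(q1_lps: float, h1_m: float, n1_rpm: float, n2_rpm: float):
    """Scale flow rate (l/s) and head (m) from speed n1 to speed n2."""
    r = n2_rpm / n1_rpm
    return q1_lps * r, h1_m * r**2

# Design point from the abstract: 100 l/s at 24 m head, 1000 rpm.
q2, h2 = scale_pump(100.0, 24.0, 1000.0, 750.0)
print(f"At 750 rpm: Q = {q2:.0f} l/s, H = {h2:.1f} m")  # 75 l/s, 13.5 m
```

Agreement between the measured 750 rpm performance and these scaled values is what the similitude comparison in the abstract checks.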
42

Attachment and closeness in parent-child relationships in late adolescence and young adulthood

Saebel, Judith January 2008
Analyses of data from 317 young people (16-24 yrs) suggested that their attachment to parents is best represented by an overall attachment scale and 1(2) specific subscales. Analyses of data from 146 parents indicated that closeness to their children is captured by an overall scale and two specific subscales.
43

Analýza a návrh informačního systému pro firmu eSports.cz, s.r.o. / The Analysis and Design of an Information System for the Company eSports.cz, s.r.o.

Kobelka, Michal January 2015
The master's thesis deals with the analysis and design of an information system for the company eSports.cz, s.r.o. The first chapter presents the theoretical basis necessary for understanding the problem. The second chapter assesses the current situation of eSports.cz, s.r.o. and analyzes the company's main processes. The last chapter provides proposals for the data model of the new information system.
44

Plánovaný experiment / Design of Experiment

Sabová, Iveta January 2015
This thesis deals with the possibility of applying the method of Design of Experiments (DoE) to specific data. The first chapter of the theoretical part describes the method in detail, setting out the basic principles and guidelines for designing an experiment. The next two chapters describe factorial designs and response surface designs, the latter including the central composite design and the Box-Behnken design. The practical part focuses on modelling the firing range of a ball launched from a catapult using the above three types of experimental design. The models are analysed together with their various characteristics and compared using prediction and confidence intervals and by response optimization. The last part of the thesis comprises an overall evaluation.
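As a small illustration of the factorial designs discussed here, the sketch below enumerates a 2³ full factorial in coded units; the catapult factor names are hypothetical, not taken from the thesis:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Enumerate all runs of a full factorial design.

    levels_per_factor: one list of coded levels per factor.
    """
    return list(product(*levels_per_factor))

# A 2^3 factorial in coded units (-1 = low, +1 = high) for three
# hypothetical catapult factors: arm length, draw angle, band tension.
runs = full_factorial([[-1, 1], [-1, 1], [-1, 1]])
for run in runs:
    print(run)  # 8 treatment combinations
```

A central composite design would extend this cube with center and axial points; a Box-Behnken design instead places points at the midpoints of the cube's edges.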
45

Contextuality and Noncontextuality in Human Choice Behavior

Victor Hernando Cervantes Botero 06 May 2020
The Contextuality-by-Default theory describes the contextual effects on random variables: how the identity of random variables changes from one context to another. Direct influences and true contextuality constitute different types of effects of contexts upon sets of random variables. Changes in the distributions of random variables across contexts define direct influences. True contextuality is defined by the impossibility of sewing all the variables of a system of random variables into a particular overall joint distribution. In the absence of direct influences, the theory specializes to the theory of selective influences in psychology and the traditional treatment of contextuality in quantum mechanics. Consistently connected (i.e., with no direct influences) noncontextual systems are the systems with selective influences. However, observable systems of human behavior are seldom consistently connected. Contextuality-by-Default allows one to classify and measure the degree of deviation from or adherence to the pattern of selective influences, both for consistently and inconsistently connected systems.

The papers here included follow the development of the Contextuality-by-Default theory. The theory is presented for cyclic systems of binary random variables, for arbitrary systems of binary random variables, and for systems that include categorical random variables. Although contextuality has been searched for in human behavior since at least the 1990s, I report here the first experiments that have demonstrated contextuality in choice behavior without making the mistake of ignoring the direct influences present in the systems of random variables. A psychophysical experiment was conducted and then analyzed using the theory for systems of binary random variables. Its results showed no contextuality in a double-detection paradigm, that is, in an experiment in which each participant was asked to make dual conjoint judgments of signal detection for two stimuli at a time. Several crowdsourcing experiments were conducted and analyzed using the theory for cyclic systems of binary random variables. These experiments demonstrate contextuality using a between-subjects experimental design. Among them, the Snow Queen experiment, in which each participant made two conjoint choices in accordance with a simple story line, provided a methodological template (used afterward to design the other crowdsourcing experiments) for systematically exploring contextuality. Lastly, another psychophysical experiment was conducted and then analyzed using the theory for systems with categorical random variables. This one is the first experiment that demonstrates contextuality in a within-subject design.

In addition to the experimental work reported in these papers, I also present the development of the Contextuality-by-Default theory from the theory for cyclic systems to the theory for systems with categorical random variables. The nominal dominance theorem, which states a necessary condition for noncontextuality of systems where all dichotomizations of categorical variables are considered, is the most relevant theoretical result of this development. The role that the notion of contextuality can play in psychology is difficult to fully understand at our present stage of knowledge. Most obviously, contextuality analysis is a generalization of the traditional psychological problem of selective influences. It is, in fact, the only existing theoretical tool for classifying and quantifying patterns of deviations from the hypothesis of selective influences. It is less evident whether the degree of (non)contextuality correlates with specific aspects of behavior that may be of interest. Although some such correlations seem to suggest themselves, to be certain and precise in identifying them, we need to expand our knowledge of the degree of (non)contextuality to a broader class of behavioral systems.
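For cyclic systems of binary (±1) random variables, Contextuality-by-Default admits a closed-form criterion (due to Kujala and Dzhafarov): a rank-n system is contextual exactly when s_odd(c_1, ..., c_n) − Δ − (n − 2) > 0, where the c_k are the within-context product expectations, Δ sums the absolute differences of the two means in each connection, and s_odd maximizes the ±-signed sum over patterns with an odd number of minus signs. A minimal sketch, assuming these expectations have already been estimated from data:

```python
from itertools import product

def s_odd(xs):
    """Max of sum(+/-x) over sign patterns with an odd number of minuses."""
    return max(
        sum(s * x for s, x in zip(signs, xs))
        for signs in product((1, -1), repeat=len(xs))
        if signs.count(-1) % 2 == 1
    )

def cbd_cyclic_degree(products, mean_pairs):
    """Contextuality test for a cyclic system of binary +/-1 variables.

    products:   within-context product expectations, one per context.
    mean_pairs: (mean in one context, mean in the other) per connection.
    A positive return value indicates a contextual system.
    """
    n = len(products)
    delta = sum(abs(a - b) for a, b in mean_pairs)
    return s_odd(products) - delta - (n - 2)

# A PR-box-like rank-4 (CHSH-type) consistently connected system:
print(cbd_cyclic_degree([1, 1, 1, -1], [(0, 0)] * 4))  # 4 - 0 - 2 = 2 > 0
```

The example reproduces the familiar CHSH-type bound: with no direct influences (Δ = 0) and rank 4, noncontextuality requires s_odd ≤ 2.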
46

Transgender as an Identity Challenge: A Media-Pedagogical Analysis of the Film "Transamerica"

Hamisch, Mariann 21 July 2017 (has links)
Engagement with transgender issues at the political, social, and cultural level is repeatedly negotiated in the media as well. To foster understanding and acceptance and to counter discrimination and transphobia, the topic also needs to be addressed pedagogically. The topic is first laid out from medical, social, and legal perspectives, and it is then shown to what extent it is present in the everyday world of young people and in school. It becomes evident that schools have catching up to do with respect to pedagogical approaches. To sensitize young people to the identity challenges faced by transgender people, the US feature film "Transamerica" is well suited; it is analyzed here from a media-pedagogical standpoint, with particular attention to the identity challenges of the main characters. The concluding media-pedagogical assessment discusses the extent to which the film can also be used in the classroom.
47

Perceived organizational support as social validation: Concept clarity and content validation

Andrew T Jebb 29 June 2020
Perceived organizational support (POS) is an important construct in organizational science that describes employees' degree of perceived support from their organization. However, in the academic literature, no paper has openly consulted real employees about how they understand and experience organizational support. The goal of the present dissertation was to conduct a qualitative, person-centric study to investigate the meaning of POS from the employee's perspective. To do this, techniques based on current best-practice recommendations were used, including examining incidents of the phenomenon and collecting lay definitions from key informants. It was found that a wide range of organizational behaviors can count as support; in the data, 25 distinct support forms were identified along with 27 lack-of-support forms. Through thematic analysis, these forms were aggregated into six themes of organizational support (e.g., "Organization helps the employee perform their job effectively") and ultimately formed a single higher-order theme that represented the meaning of POS. That is, POS is the holistic perception of whether or not an employee is valued by their organization. This aligns with the classical academic definition of POS (perceptions of how much the organization values one's well-being and work contributions) but also suggests the construct should be considered more broadly.

Because how a construct is conceptualized determines its essential content, the second half of this dissertation performed a systematic content validation of the Survey of Perceived Organizational Support (SPOS) and its short forms. Little formal content validation had been done for this scale, but it was found that all four aspects of content validity examined (content deficiency, relevance, distinctiveness, and balance) were satisfactory in the SPOS and its short forms. Thus, researchers using these scales can be confident of content validity, although there is a need to improve content validation processes and reduce the number of SPOS short forms in current use.
48

Statistical Methods for Launch Vehicle Guidance, Navigation, and Control (GN&C) System Design and Analysis

Rose, Michael Benjamin 01 May 2012 (has links)
A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
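A toy sketch, unrelated to the thesis tool, of why linear covariance analysis runs orders of magnitude faster than Monte Carlo: for a linearized system x_{k+1} = Φx_k + w, a single matrix recursion P ← ΦPΦᵀ + Q replaces thousands of sampled trajectories. All matrices below are illustrative stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
Phi = np.array([[1.0, 0.1], [0.0, 1.0]])  # toy state-transition matrix
Q = np.diag([1e-4, 1e-3])                 # process-noise covariance
P0 = np.diag([1e-2, 1e-2])                # initial dispersion covariance
steps, n_mc = 100, 5000

# Linear covariance: one matrix recursion per step.
P = P0.copy()
for _ in range(steps):
    P = Phi @ P @ Phi.T + Q

# Monte Carlo: propagate thousands of sampled trajectories.
x = rng.multivariate_normal([0.0, 0.0], P0, size=n_mc)
for _ in range(steps):
    w = rng.multivariate_normal([0.0, 0.0], Q, size=n_mc)
    x = x @ Phi.T + w

print(np.diag(P))         # dispersions from linear covariance
print(np.var(x, axis=0))  # dispersions estimated by Monte Carlo
```

For a truly linear system the two agree up to sampling error; the thesis validates the same correspondence on the nonlinear ascent GN&C problem.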
49

Handling research data at the front end of the design process

Sachidanandam, Vignesh 10 September 2008 (has links)
No description available.
50

Optimization Techniques for Pharmaceutical Manufacturing and Design Space Analysis

Daniel Joseph Laky 21 July 2022
In this dissertation, numerical analysis frameworks and software tools for digital design of process systems are developed, with a focus on the pharmaceutical manufacturing space. Batch processing represents the traditional and still predominant pathway to manufacture pharmaceuticals in both the drug substance and drug product spaces. Drug substance processes start with raw materials or precursors and produce an active pharmaceutical ingredient (API) through synthesis and purification. Drug product processes take this pure API in powder form, add excipients, and process the powder into consumer doses such as capsules or tablets.

Continuous manufacturing has allowed many other chemical industries to take advantage of real-time process management through process control, process optimization, and real-time detection of off-spec material. The possibility of reducing the total cleaning time of units and encouraging green chemistry through solvent reduction or recycling also makes continuous manufacturing an attractive alternative to batch manufacturing. However, to fully understand and take advantage of real-time process management, digital tools are required, both as soft sensors during process control and during process design and optimization. Since the shift from batch to continuous manufacturing will proceed in stages, processes will likely combine continuous and batch unit operations, which we will call hybrid pharmaceutical manufacturing routes. Even though such processes will soon become common in the industry, digital tools that address the comparison of batch, hybrid, and continuous manufacturing routes in the pharmaceutical space are lacking, especially for hybrid routes. For this reason, PharmaPy, an open-source tool for pharmaceutical process development, was created to enable rapid in-silico design of hybrid pharmaceutical processes.

Throughout this work, the focus is on analyzing alternative operating modes within the drug substance manufacturing context. First, the mathematical models for PharmaPy's synthesis, crystallization, and filtration units are discussed. Then, the simulation capabilities of PharmaPy are highlighted, showcasing dynamic simulation of both fully continuous and hybrid processes. The technical focus of the work as a whole, however, is primarily on optimization techniques for pharmaceutical process design. Thus, several derivative-free optimization frameworks for simulation-optimization were constructed, with PharmaPy performing the simulations.

The timeline of work began with derivative-based methods to solve mixed-integer programs (MIPs) for water network sampling and security, as well as nonlinear programs (NLPs) and some mixed-integer nonlinear programs (MINLPs) for design space and feasibility analysis. Building on this, a method for process design was implemented that combines the ease of implementation of a process simulator (PharmaPy) with the computational performance of derivative-based optimization. Recent developments in Pyomo through the PyNumero package allow callbacks to an input-output or black-box model while using Ipopt as a derivative-based solver through the cyipopt interface. Using this approach, it was found that embedding a PharmaPy simulation as a black box within a derivative-based solver resulted in quicker solve times than traditional derivative-free optimization strategies, and offers a much quicker implementation path than a simultaneous, equation-oriented algebraic formulation of the problem.

Uncertainty exists in virtually all process systems. Traditionally, it is analyzed through sampling approaches such as Monte Carlo simulation, which quickly become computational obstacles as problem scale increases. In the 1980s, chemical plant design under uncertainty through flexibility analysis became an option for explicitly considering model uncertainty using mathematical programming. However, such formulations pose computational obstacles of their own, as most process models produce challenging MINLPs under the flexibility analysis framework. For pharmaceutical processes specifically, recent FDA initiatives have piqued interest in flexibility analysis because of the so-called design space: the region over which critical quality attributes (CQAs) can be guaranteed for a given set of interactions between inputs and process parameters. Since uncertainty is intrinsic to such operations, industry is interested in guaranteeing that CQAs hold with a set confidence level over a given operating region. In this work, the probabilistic design space defined by these confidence levels is presented to show the computational advantages of a fully model-based flexibility analysis framework over a Monte Carlo sampling approach. The results show that the flexibility analysis framework decreased design space identification time by more than two orders of magnitude.

Given the implementation difficulty of new digital tools for both students and professionals, educational material was developed for PharmaPy and presented as part of a pharmaceutical API process development course at Purdue. Surveyed afterward, many of the students found the framework approachable through the use of Jupyter notebooks and would consider using PharmaPy and Python for pharmaceutical modeling and data analysis in the future.

Through software development and the development of numerical analysis frameworks, digital design of pharmaceutical processes has expanded and become more approachable. The incorporation of rigorous simulations under process uncertainty promotes the use of digital tools in regulatory filings and reduces unnecessary process development costs through model-based design. These improvements are evident in the development of PharmaPy, a simulation-optimization framework built on it, and the flexibility analysis tools, which together yielded computational benefits of one to two orders of magnitude compared with methods used in practice and, in some cases, reduce the modeling time required to determine optimal operating conditions or the design space of a pharmaceutical manufacturing process.
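PharmaPy's actual API is not shown in this abstract, so the sketch below substitutes a hypothetical stand-in objective; it illustrates only the general derivative-free simulation-optimization pattern described above, with SciPy's Nelder-Mead calling the simulator and never its derivatives:

```python
import numpy as np
from scipy.optimize import minimize

def simulate(design):
    """Stand-in for a black-box flowsheet simulation (e.g., a PharmaPy run).

    Takes design variables (temperature, residence time -- hypothetical
    names) and returns a scalar cost; a real objective would be computed
    from the simulator's outputs (yield, purity, throughput).
    """
    temp, tau = design
    return (temp - 320.0) ** 2 / 100.0 + (tau - 45.0) ** 2 / 10.0

# Derivative-free simulation-optimization: the solver only ever evaluates
# the simulator, so no algebraic model of the process is needed.
result = minimize(simulate, x0=np.array([300.0, 30.0]), method="Nelder-Mead")
print(result.x)  # design variables found by the direct search
```

The derivative-based alternative the abstract describes would instead expose the same black box to Ipopt through Pyomo/PyNumero's cyipopt interface, trading this simple setup for faster solves.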
