251. Methodology to Evaluate Proposed Leading Indicators of Space System Performance Degradation Due to Contamination. Seasly, Elaine Ellen, 25 April 2018.
Leading indicators can be used to monitor a system and detect whether a risk is present or increasing over time during key development phases such as integration and test. However, no leading indicator is perfect, and each contains inherent holes that can miss signals of risk. While the Swiss cheese model is a well-known framework for conceptualizing the propagation of risks through holes in system defenses, research on characterizing these holes is lacking. There are many choices of leading indicators, and to select an appropriate indicator for a system, engineering managers need to know how well the indicator will detect a signal of risk and what it can miss. A methodology was developed to quantify holes in proposed leading indicator methods and to determine the impact on system performance if the methods miss detecting the risk. The methodology was executed through a case study that empirically evaluated two techniques for detecting and monitoring molecular contamination risk to space system hardware performance during systems integration: visual inspections and portable Raman spectroscopy. Performance model results showed the impact a contaminant film has on space system surfaces and how a missed detection by each indicator method could affect the system. Results from the methodology characterize the limitations of leading indicators' risk detection and monitoring techniques, helping engineering managers effectively select, deploy, and improve them.
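A minimal sketch of the quantification idea: if each leading indicator is treated as an independent layer of defense with its own detection probability, the chance that a contamination risk slips through every layer is the product of the individual miss probabilities. The detection probabilities below are illustrative placeholders, not values from the study.

```python
# Sketch: Swiss cheese view of leading-indicator "holes".
# Detection probabilities are illustrative assumptions, not study results.

def prob_risk_undetected(detection_probs):
    """Probability that every indicator layer misses the risk,
    assuming the layers detect independently."""
    p_all_miss = 1.0
    for p_detect in detection_probs:
        p_all_miss *= 1.0 - p_detect
    return p_all_miss

# Hypothetical layers: visual inspection and portable Raman spectroscopy.
layers = {"visual_inspection": 0.60, "portable_raman": 0.85}
print(prob_risk_undetected(layers.values()))  # 0.4 * 0.15 = 0.06
```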
252. Technology Maturation Framework: Extending the Technology Readiness Index to the Systems Engineering Lifecycle. Davis, Carrie Elizabeth, 26 April 2018.
Maturing new technologies through the systems engineering lifecycle has been a perpetual issue for the U.S. systems engineering community. Although there are many reasons why technologies fail to mature, this study examines perceptions of why technologies succeed. To examine these perceptions, the study extends an established marketing framework, the Technology Readiness Index (TRI), into the systems engineering space. The TRI was originally developed to understand customers' and employees' interactions with technology and their propensity to embrace cutting-edge technology, termed a person's technology readiness. This study uses the TRI framework to survey the U.S. systems engineering community's perceptions of successful technology maturation traits and to relate these perceptions to the systems engineering lifecycle. This survey framework, combined with the Mann-Whitney U test for detecting statistical differences in non-parametric data, yields multiple common technology maturation perceptions throughout the systems engineering lifecycle, with cross-cutting themes of communicating regularly with interested parties, reducing implementation risk, and having a positive business case to sell to stakeholders. Moreover, it shows where the community's perceptions differ with respect to these common perceptions, as well as on specific technology transition traits.
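As a minimal illustration of the statistical method named above, the sketch below applies the Mann-Whitney U test to two hypothetical groups of Likert-scale survey responses; the groups and data are invented for illustration, not drawn from the study.

```python
# Sketch: Mann-Whitney U test on hypothetical Likert-scale survey responses.
from scipy.stats import mannwhitneyu

group_a = [4, 5, 3, 4, 4, 5, 2, 4]  # hypothetical ratings on a 1-5 scale
group_b = [3, 2, 4, 3, 2, 3, 3, 2]

u_stat, p_value = mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
if p_value < 0.05:
    print("The two groups' perceptions differ (non-parametric evidence).")
```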
253. A Micromechanical Model for Numerical Study of Rock Dilation and Ductility. Norouzi, Siavash, 13 February 2018.
The newly implemented micromechanical model in the CA2 computer program was studied in this work. The purpose was to address some of the issues in numerical studies involving the Bonded Particle Model (BPM), including unrealistically low q_u/σ_t ratios, overall dilation behavior, and the post-failure response of rocks. The plasticity model allows both tensile and shear softening of the filling material at the contact points of the particles. It is shown that for a more ductile material, there is less scatter of micro-cracking at the peak load. Furthermore, the ductility parameter appears to be a good tool for controlling the ratio of compressive to uniaxial tensile strength of rock. While the ductility of the filling at the contact points of the particles has a drastic effect on the macroscopic post-peak rock behavior in direct tensile testing, its role in dictating the post-peak rock behavior in compression is negligible and needs further study. The combined effect of ductility and initial micro-cracking on rock strength characteristics was studied as well. The numerical results suggest that the ratio of Brazilian to direct tensile strength of the simulated material is affected by the initial micro-crack intensity; this ratio is around 1 for a material with no initial micro-cracks but gradually increases as the initial micro-crack intensity is increased. In terms of the overall dilation behavior, it is shown that the macro-dilation angle can be controlled by means of the micro-dilation angle, in a positive correlation, provided that the average grain size is sufficiently small or a joint is involved. As the grain size increases, the resulting macro-asperities suppress the effect of the micro-dilation angle and, consequently, the macro-dilation angle cannot be controlled. Further, it is shown that the genesis pressure can help govern the overall dilation behavior. This parameter is also able to control the post-peak behavior of a bonded particle system: high values of the genesis pressure yield a more brittle BPM system with greater dilation angles and steeper post-peak curves.
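A minimal sketch of the kind of contact-softening rule the abstract describes, assuming a simple linear tensile-softening form; the functional form and parameters are illustrative assumptions, not the CA2 program's actual plasticity model.

```python
# Sketch: linear tensile softening of the "filling" material at a particle
# contact. Form and parameters are assumed for illustration only.

def bond_tensile_stress(delta, sigma_t, delta_peak, ductility):
    """Contact normal stress vs. opening displacement delta:
    elastic up to delta_peak, then linear softening over an extra
    ductility * delta_peak of opening before the bond breaks."""
    delta_fail = delta_peak * (1.0 + ductility)
    if delta <= delta_peak:          # elastic branch
        return sigma_t * delta / delta_peak
    if delta < delta_fail:           # softening branch
        return sigma_t * (delta_fail - delta) / (delta_fail - delta_peak)
    return 0.0                       # bond fully broken

# Larger ductility -> longer softening tail -> more ductile macro response.
for ductility in (0.5, 2.0):
    print(ductility, bond_tensile_stress(1.5e-6, 5e6, 1e-6, ductility))
```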
254. Numerical simulation of noise attenuating perforated combustor liners and the combustion instability issue in gas turbine engines. Wang, Jianguo, January 2017.
Combustion instability represents a significant problem in the application of low-emission lean premixed combustion for gas turbines and has become one of the primary concerns in the modern gas turbine industry. Effusion cooling has become common practice in gas turbine combustors, and when calibrated, perforated combustor liners are able to attenuate combustion instability over a wide frequency range. However, the acoustic attenuation of perforated liner absorbers varies with a considerable number of flow and geometry factors. The traditional approach to designing perforated combustor liners relies heavily on expensive and lengthy trial-and-error experimentation. Computational fluid dynamics (CFD), and especially the large eddy simulation (LES) method, has gained recognition as a viable tool for simulating unsteady flows and combustion instability in gas turbine combustors. However, detailed resolution of the many small-scale features, such as effusion cooling holes, is computationally very expensive and restricts the routine simulation of detailed engine geometries. In this thesis, a novel homogeneous porous media model is proposed for simulating the acoustic attenuation of gas turbine perforated liners. The model is validated against a number of well-established experiments and is shown to predict the acoustic attenuation properties of gas turbine liners in both the linear and non-linear absorption regimes, as well as the effects of bias flow, grazing flow, and flow temperature on the acoustic properties of the liners. The model is applied to a large eddy simulation of the lab-scale premixed combustor "PRECCINSTA" and is demonstrated to successfully represent the noise attenuation of perforated liner absorbers in both cold flow and reacting flow conditions. The model provides a significant reduction in overall computational time compared with directly resolved geometries and is therefore a viable option for routine engineering simulation of perforated combustor liners.
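The abstract does not give the model's formulation; as a hedged sketch of one common way to represent a perforated liner as a homogeneous porous layer, the example below applies a generic Darcy-Forchheimer pressure-drop term, with illustrative coefficients rather than the thesis's calibrated model.

```python
# Sketch: homogeneous porous-media surrogate for a perforated liner,
# written as a generic Darcy-Forchheimer pressure drop. Coefficients are
# illustrative assumptions, not the thesis's calibrated model.

def liner_pressure_drop(u, thickness, mu=1.8e-5, rho=1.2,
                        permeability=1.0e-9, inertial_coeff=1.5e4):
    """Pressure drop [Pa] across a porous layer for face velocity u [m/s]:
    dp = (mu/K * u + C2 * rho/2 * u * |u|) * thickness."""
    viscous = mu / permeability * u                      # Darcy term
    inertial = inertial_coeff * 0.5 * rho * u * abs(u)   # Forchheimer term
    return (viscous + inertial) * thickness

print(liner_pressure_drop(u=2.0, thickness=1e-3))  # bias flow through liner
```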
255. Machine Learning and Cellular Automata: Applications in Modeling Dynamic Change in Urban Environments. Curtis, Brian J., 27 April 2018.
Several studies have advocated the need for, and the feasibility of, advanced techniques to support decision makers in urban planning and resource monitoring. One such technique is a framework that leverages remote sensing and geospatial information systems (GIS) in conjunction with cellular automata (CA) to monitor land use / land cover change phenomena such as urban sprawl. Much research has been conducted using learning techniques spanning all levels of complexity, from simple logistic regression to advanced artificial intelligence methods (e.g., artificial neural networks). In a high percentage of the published research, simulations are performed using only one or two techniques applied to a case study of a single geographical region. Typically, the findings are favorable and present the studied methods as superior. This work found no research comparing the performance of several machine learning techniques across an array of geographical locations. Additionally, the current literature was found lacking in investigating the impact that scene parameters (e.g., sprawl, urban growth) have on simulation results. Therefore, this research set out to understand the sensitivities and correlations associated with the selection of machine learning methods used in CA-based models. The results indicate that simpler algorithms, which are easier to comprehend and implement, can perform as well as more complicated algorithms. It is also shown that the quantity of urbanization in the studied area directly impacts the simulation results.
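A minimal sketch of a CA urbanization step of the kind described above: each non-urban cell converts when a score combining a per-cell suitability value (in practice, the output of a trained classifier such as logistic regression on GIS layers) and its neighborhood urban density crosses a threshold. The weights, threshold, and random inputs are illustrative assumptions.

```python
# Sketch: one cellular-automata urbanization step on a toy 50x50 grid.
# Suitability would come from a trained model; here it is a placeholder.
import numpy as np

rng = np.random.default_rng(0)
urban = rng.random((50, 50)) < 0.10      # initial urban mask (~10% urban)
suitability = rng.random((50, 50))       # placeholder per-cell scores

def ca_step(urban, suitability, weight=0.5, threshold=0.7):
    n, m = urban.shape
    padded = np.pad(urban.astype(float), 1)
    # 3x3 neighborhood sum, then exclude the center cell and normalize
    nbr_sum = sum(padded[i:i + n, j:j + m]
                  for i in range(3) for j in range(3))
    density = (nbr_sum - urban) / 8.0
    score = weight * suitability + (1.0 - weight) * density
    return urban | ((score > threshold) & ~urban)

urban = ca_step(urban, suitability)
print(urban.sum(), "urban cells after one step")
```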
256. The Effect of Integrating Risk Management on Large-Scale Information Technology Projects Using an Integrated Software Risk Management Tool. Odeh, Muhammad F., 27 April 2018.
A risk is, in principle, an uncertain event that, if it occurs, may have an adverse or positive effect on a project's objectives. Adverse risks imply threats; positive ones may lead to opportunities. Proper risk management creates an appropriate environment and policy for maintaining efficient and effective processes, elements vital not only to the success of projects but to the organization as a whole.

The occurrence of risks is a reality in information technology (IT) projects, whether they involve the implementation of proven technology, a new technology, or the development of software for a specific function. An appropriate approach is a practice whereby organizations methodically address these risks with the objective of achieving sustained benefit within each project and across the organization as a whole. Poor management, by contrast, forgoes the chance to anticipate schedule, resource, scope, or budget risks. Risk management incorporates risk planning, identification, analysis, response planning, monitoring and controlling, and closing processes. These standard methods marshal an understanding of the potential upside and downside factors that can affect the business, and they increase the probability of success while reducing not only the likelihood of failure and cost overrun but also the uncertainty of achieving the organization's overall financial objectives.

This praxis presents a modeling approach focused on the impact of risk management on information technology projects through the development of an integrated risk management tool: a framework for proactively adhering to risk management processes so that risks are identified and analyzed, proper responses are planned, and risks are tracked, monitored, controlled, and closed. Successful implementation of this risk management tool can serve as a guide for developing and implementing systematic project risk management suitable for other organizations. It provides better control over the future of the project and improves the chances of the project meeting its objectives and completing on time and on budget.

This praxis contributes to the practice of risk management in IT projects by refining the perception of proactively utilizing proper risk management processes from the inception of the project and throughout its lifecycle. It also improves the understanding of what drives the use of risk management processes and methodologies on IT projects to improve project success rates and the overall health of the organization.

The rate at which IT projects fail to meet their objectives is a common concern and a frustrating challenge for executives. Integrating risk management processes throughout the project management lifecycle using appropriate methodologies, techniques, and tools, mastering technology, relying on skilled project managers and effective teams, and managing stakeholders are practical, proven applications that companies, project managers, and practitioners can employ to increase the value of, and make the most of, their IT projects.
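To make the probability-and-impact framing concrete, here is a common quantitative device (an assumption for illustration, not a prescription from the praxis): expected monetary value, where each register entry's exposure is probability times impact and opportunities carry negative impact.

```python
# Sketch: expected-monetary-value (EMV) roll-up of a project risk register.
# Entries are hypothetical; the praxis's actual tool is not reproduced here.

risk_register = [
    # (description,             probability, impact_usd)  negative = opportunity
    ("vendor delivery slips",         0.30,  120_000),
    ("key developer attrition",       0.15,  200_000),
    ("scope reduction approved",      0.25,  -80_000),
]

total_exposure = sum(p * impact for _, p, impact in risk_register)
print(f"Net risk exposure: ${total_exposure:,.0f}")
# 0.30*120k + 0.15*200k - 0.25*80k = 36k + 30k - 20k = $46,000
```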
257. Time-Dependent Nonlinear Control of Bipedal Robotic Walking. Gu, Yan, 03 November 2017.
Although bipedal walking control has been studied extensively for the past forty years, it remains a challenging task. To achieve high-performance bipedal robotic walking, this dissertation investigates control strategies for both fully actuated and underactuated bipedal robots based on nonlinear control theories and formal stability analysis.

Previously, the Hybrid-Zero-Dynamics (HZD) framework, a state-based feedback controller design built on full-order dynamic modeling and input-output linearization, successfully realized stable, agile, and efficient walking for both fully actuated and underactuated bipedal robots. However, the critical issue of achieving high walking versatility has not been fully addressed by the HZD framework. In this dissertation, we propose and develop a time-dependent controller design methodology to achieve not only stable, agile, and efficient but also versatile walking for fully actuated bipeds. Furthermore, the proposed time-dependent approach achieves better walking robustness to implementation imperfections for both fully actuated and underactuated bipeds by effectively solving the high sensitivity of state-based approaches to sensor noise.

In our controller design methodology, the full-order hybrid walking dynamics are first modeled, consisting of both continuous-time dynamics and rigid-body impact dynamics. Then, the desired path/motion for the biped to track is planned, and the output function is defined as the tracking error of that desired path/motion. Based on the full-order model of the walking dynamics, input-output linearization is used to synthesize a controller that exponentially drives the output function to zero during continuous phases. Finally, sufficient conditions are developed to evaluate the stability of the hybrid, time-varying closed-loop control system. By enforcing these conditions, stable bipedal walking can be automatically realized, and the desired motion can be satisfactorily followed.

Both full actuation and underactuation are common in bipedal robotic walking. Full actuation occurs when the number of degrees of freedom equals the number of independent actuators; underactuation occurs when the number of degrees of freedom exceeds the number of independent actuators. Because a fully actuated biped can directly control each of its joints, more objectives may be achieved for a fully actuated biped than for an underactuated one. In this dissertation, exponential tracking of a straight-line contour in Cartesian space is achieved for both planar and three-dimensional (3-D) walking, which greatly improves the versatility of fully actuated bipedal robots. To guarantee closed-loop stability, the first sufficient stability conditions are developed based on the construction of multiple Lyapunov functions.

Underactuated walking is much more difficult to control than fully actuated walking because an underactuated biped cannot directly control each of its joints. In this dissertation, the control design of periodic, underactuated walking is investigated, and the first set of sufficient conditions for time-dependent orbitally exponential stabilization is established based on time-dependent nonlinear feedback control. Without modification, the proposed controller design can be applied directly to both planar and 3-D bipeds subject to either underactuation or full actuation.

Extensive computer simulation results validated the proposed time-dependent controller design methodology for bipedal robotic walking. Specifically, three bipedal models were simulated: a fully actuated, planar bipedal model with three revolute joints; a fully actuated, 3-D bipedal model with nine revolute joints; and an underactuated, planar bipedal model with five revolute joints.
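For readers unfamiliar with the input-output linearization step named above, here is the standard textbook sketch for a relative-degree-two output during a continuous phase (a generic derivation, not the dissertation's exact construction):

```latex
% Generic input-output linearization, relative degree 2.
\dot{x} = f(x) + g(x)\,u, \qquad y = h(x)
\qquad\Longrightarrow\qquad
\ddot{y} = L_f^2 h(x) + L_g L_f h(x)\,u .

% Choosing the feedback
u = \bigl(L_g L_f h(x)\bigr)^{-1}\bigl(-L_f^2 h(x) + v\bigr),
\qquad v = -K_p\,y - K_d\,\dot{y},

% yields the linear, exponentially stable output dynamics
\ddot{y} + K_d\,\dot{y} + K_p\,y = 0 \qquad (K_p, K_d \succ 0),

% so the tracking error y is driven exponentially to zero
% during each continuous phase.
```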
258. Rheological Measurements and Core Flood Data Analysis in Support of Chemical Enhanced Oil Recovery Formulation Design. Tang, Huiling, 03 November 2017.
This research involved rheological measurements and the analysis of core flood test data in support of the design of a formulation for Chemical Enhanced Oil Recovery (cEOR) at the Pioneer Rock Hill reservoir, a site characterized by relatively low formation brine salinity and temperature. Extensive and systematic rheological measurements identified viscosity values and rheological behaviors of different polymers, surfactants, and polymer-surfactant solutions over a range of concentrations, salinities, and temperatures relevant to the targeted field conditions. The results were used to support formulation design in combination with phase behavior studies and interfacial tension measurements, to provide information relevant to in-tank mixing and pumping operations, and to maximize sweep efficiency and mobility control in the core flood tests. Further rheological measurements were conducted on the primary surfactant, Petrostep® S13D, over a broad range of concentrations in both deionized water and two synthetic brines, up to the neat solution. These tests indicate that different structures (micellar solution, hexagonal liquid crystal, and lamellar liquid crystal) form at different concentrations, supporting SAXS observations performed by another research group.

In a separate effort, data obtained from core flood tests conducted in the Purdue EOR laboratory to evaluate and optimize formulations were collected and organized. Five performance parameters were selected to evaluate test efficiency on the basis of technical and economic feasibility: recovery factor in terms of %ROIP, oil saturation after chemical flood (S_orc), maximum injection pressure during chemical flood, surfactant sorption, and total injectant cost. Performance analysis of the core flood data and comparison with data from the literature show average to very good performance of the Purdue core flood tests.
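A minimal sketch of how two of the listed parameters can be computed from core flood volumetrics, under assumed definitions (%ROIP taken here as oil recovered over residual oil in place before the chemical flood); all volumes are invented for illustration.

```python
# Sketch: two core-flood performance parameters from volumetric data.
# Definitions are assumed for illustration; the values are invented.

pore_volume_ml = 45.0
oil_before_chemical_flood_ml = 14.0   # residual oil in place (ROIP)
oil_recovered_ml = 11.5               # produced by the chemical flood

recovery_factor = 100.0 * oil_recovered_ml / oil_before_chemical_flood_ml
s_orc = (oil_before_chemical_flood_ml - oil_recovered_ml) / pore_volume_ml

print(f"Recovery factor: {recovery_factor:.1f} %ROIP")
print(f"Oil saturation after chemical flood, S_orc: {s_orc:.3f}")
```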
259. Practice and Fit in the Allocation of the Resource of Faculty Time: A Study of Current and Preferred Scholarly Practice of the Faculties of Ten Theological Schools Affiliated with the Presbyterian Church (USA). Nicholson, Roger Allen, 01 January 1997.
Faculties, deans, and trustees of theological schools affiliated with the Presbyterian Church (USA) were surveyed to determine current faculty practice; the practice preferred by faculty, deans, and trustees; the fit between current and preferred practice; and differences in faculty practice and preference according to categorical variables such as gender, race, and rank.
Seven variables defined faculty practice: workweek in hours, instruction, scholarship, service, advising, governance, and other. Scholarship was subdivided into three categories adapting Ernest Boyer's multi-dimensional definition of scholarship: originative, applied, and teaching. Fit was defined in two ways: statistical fit and practical fit.
The reported workweek was comparable to that reported by faculties at other types of universities and colleges. The time theological faculties reported spending on teaching exceeded only that of research university faculty. The theological faculties reported spending more time on scholarship than liberal arts college and comprehensive university faculties, but less than doctoral and research faculties. Theological faculties reported spending significantly more time on service than faculty at other types of institutions.
While statistical differences were found between current practice and the preferences of deans and trustees, practical differences were negligible. A statistical and practical difference was found between the preferences of faculty and deans for governance activities and between faculty and trustee preferences for the categories of instruction and scholarship.
Considered by categorical variables, preferred faculty practice varied most by teaching discipline. Implications of the findings for planning and assessment in theological schools were discussed.
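The distinction between statistical and practical fit can be illustrated with a standard effect-size computation; this is a generic device with invented data, not the study's actual procedure.

```python
# Sketch: statistical vs. practical difference on hypothetical workweek hours.
# Data are invented; the study's fit criteria are not reproduced here.
import statistics
from scipy.stats import ttest_ind

current_hours   = [52, 55, 50, 53, 54, 51, 56, 52, 53, 55]
preferred_hours = [51, 54, 49, 52, 53, 50, 55, 51, 52, 54]

t_stat, p = ttest_ind(current_hours, preferred_hours)

n1, n2 = len(current_hours), len(preferred_hours)
s1, s2 = statistics.stdev(current_hours), statistics.stdev(preferred_hours)
pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
cohens_d = (statistics.mean(current_hours)
            - statistics.mean(preferred_hours)) / pooled_sd

# p asks "is the gap statistically real?"; d asks "is it big enough to matter?"
print(f"p = {p:.3f} (statistical fit), d = {cohens_d:.2f} (practical fit)")
```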
260. Identifying and Overcoming the Barriers to Cloud Adoption within the Government Space. Taylor, Cyril Mark, Sr., 16 November 2017.
Cloud computing services have supported consumers' enterprise demands for almost 20 years, and longer if the cloud is conceptualized as a decoupled mainframe. Small companies and individuals with meager resources can gain and maintain the same class of service once attainable only by large, well-resourced, and well-funded companies. Additionally, the only impediments to a commercial company's growth in the cloud are financial or service limitations. This is not the case for the government as a consumer.

This praxis research effort focused on identifying the common barriers to cloud adoption and on developing a data-aggregating solution to better facilitate enterprise migrations to the cloud with the government as a consumer. The applied nature of this research enabled the integration of existing capabilities and standards into the proposed solution. Initial solution development was based on the constructivist paradigm of qualitative methods, which holds that knowledge is gained interactively. The follow-on phase of research was quantitative in nature, following the positivist paradigm; its quantitative methods validated the initial solution and provided refinements.

Knowledge of cloud computing was gained through intrinsic subject matter expertise tempered with literature review and investigation. Statistical methods and tools were employed to provide engineering rigor to both validate and refine the proposed solution, the enterprise cloud management framework (ECMF).