11

Displacement and interstory drift constraint design in GT STRUDL

Maham, Andrew S. 08 1900 (has links)
No description available.
12

Application Software of the Future-Filter Design with Gem

George, Alan D. 01 January 1985 (has links) (PDF)
As the use of computers in engineering design, as well as in other areas, increases, it becomes ever more important that the application software used be as simple, convenient, and powerful as possible. The engineer is not interested in the internal workings of the computer or its operating system; it is the design itself that takes precedence. The filter design package developed for this project, known as FILTER, is such an application. With FILTER, coupled with the Digital Research Graphics Environment Manager, the engineer is led through the analog and digital filter design phases on a personal computer with carefully designed interactive computer graphics requiring little or no computer knowledge.
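The FILTER package itself is not reproduced in this record; as a rough, modern illustration of the kind of design step such a tool walks the user through, the sketch below designs a digital low-pass Butterworth filter with SciPy. The order, cutoff frequency, and sampling rate are arbitrary placeholder values, not parameters from the thesis.

```python
# Illustrative sketch only: designs a digital low-pass Butterworth filter,
# the kind of task an interactive package like FILTER guides the user through.
# Order, cutoff, and sampling rate are arbitrary assumptions, not thesis values.
import numpy as np
from scipy.signal import butter, freqz

fs = 8000.0          # sampling rate in Hz (assumed)
cutoff = 1000.0      # desired -3 dB cutoff in Hz (assumed)
order = 4            # filter order (assumed)

# Design the digital filter; passing `fs` lets the cutoff be given in Hz.
b, a = butter(order, cutoff, btype="low", analog=False, fs=fs)

# Evaluate the frequency response for a quick sanity check of the design.
w, h = freqz(b, a, worN=512, fs=fs)
print("gain at cutoff (dB):", 20 * np.log10(abs(h[np.argmin(abs(w - cutoff))])))
```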
13

A user friendly preprocessor for plane and space frames and space trusses

Pugh, James Christopher 08 September 2012 (has links)
A user-friendly preprocessor was developed and documented for the plane frame, space frame, and space truss structural analysis programs that are based on the matrix displacement method. The preprocessor comprises three programs. The main program allows the user to create error-free input data files and to modify existing ones. The other two programs are the library manager and the graphics presentation. The library manager maintains the libraries of element and material properties, and the graphics presentation displays a plane structure on the graphics display. In Chapter 2, the development of the preprocessor is discussed. After a short review of the extension of the analysis program from plane frame to space frame in Chapter 3, the preprocessor and its supporting programs are described in detail in the user manual in Chapter 4. Possible extensions to the preprocessor are discussed in Chapter 5. The appendix contains examples of input data files for these structural analysis programs. / Master of Science
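The analysis programs served by this preprocessor are based on the matrix displacement (direct stiffness) method. As a minimal sketch of that method only, the following assembles and solves a two-bar plane truss; the geometry, section properties, and loads are invented for illustration and do not come from the thesis or its input data files.

```python
# Minimal matrix displacement (direct stiffness) sketch for a plane truss.
# Geometry, section properties, and loads are made-up illustrative values,
# not data from the thesis or its analysis programs.
import numpy as np

E, A = 200e9, 1e-3                     # Young's modulus (Pa), area (m^2) -- assumed
nodes = np.array([[0.0, 0.0],          # node 0 (pinned support)
                  [4.0, 0.0],          # node 1 (pinned support)
                  [4.0, 3.0]])         # node 2 (loaded)
elements = [(0, 2), (1, 2)]            # two-bar truss

K = np.zeros((6, 6))                   # 2 DOFs per node
for n1, n2 in elements:
    dx, dy = nodes[n2] - nodes[n1]
    L = np.hypot(dx, dy)
    c, s = dx / L, dy / L
    ke = (E * A / L) * np.array([[ c*c,  c*s, -c*c, -c*s],
                                 [ c*s,  s*s, -c*s, -s*s],
                                 [-c*c, -c*s,  c*c,  c*s],
                                 [-c*s, -s*s,  c*s,  s*s]])
    dofs = [2*n1, 2*n1 + 1, 2*n2, 2*n2 + 1]
    K[np.ix_(dofs, dofs)] += ke        # assemble element into global stiffness

F = np.zeros(6)
F[4], F[5] = 10e3, -20e3               # loads at node 2 (assumed values, N)

free = [4, 5]                          # nodes 0 and 1 are fully restrained
u = np.zeros(6)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("node 2 displacements (m):", u[4], u[5])
```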
14

A microcomputer program for the design of minimum weight bridge plate girders

Allison, Donald K January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
15

Development of micro-computer programs for the analysis of an open spandrel arch

Koontz, John Jay. January 1984 (has links)
Call number: LD2668 .T4 1984 K66 / Master of Science
16

Reliability growth models for attributes (Bayes, Smith)

Sanatgar Fard, Nasser. January 1982 (has links)
In this dissertation the estimation of reliability for a developmental process generating attribute-type data is examined. It is assumed that the process consists of m stages, and that the probability of failure is constant or decreasing from stage to stage. Several models for estimating the reliability at each stage of the developmental process are examined. In the classical area, Barlow and Scheuer's model, Lloyd and Lipow's model, and a cumulative maximum likelihood estimation model are investigated. In the Bayesian area, A.F.M. Smith's model, an empirical Bayes model, and a cumulative beta Bayes model are investigated. These models are analyzed both theoretically and by computer simulation. The strengths and weaknesses of each are pointed out, and modifications are made in an attempt to improve their accuracy. The constrained maximum likelihood estimation model of Barlow and Scheuer is shown to be inaccurate when no failures occur at the final stage. Smith's model is shown to be incorrect, and a corrected algorithm is presented. The simulation results of these models with the same data indicate that, with the exception of Barlow and Scheuer's model, they are all conservative estimators. When reliability estimation with growth is considered, it is reasonable to emphasize data obtained at recent stages and de-emphasize data from the earlier stages. A methodology is developed using geometric weights to improve the estimates. This modification is applied to the cumulative MLE model, Lloyd and Lipow's model, Barlow and Scheuer's model, and the cumulative beta Bayes model. The simulation results of these modified models show that considerable improvement is obtained in the cumulative MLE model and the cumulative beta Bayes model. For Bayesian models, in the absence of prior knowledge, the uniform prior is usually used. A prior with maximum variance is examined theoretically and through simulation experiments for use with the cumulative beta Bayes model. These results show that the maximum variance prior results in faster convergence of the posterior distribution than the uniform prior. The revised Smith's model is shown to provide good estimates of the unknown parameter during the developmental process, particularly for the later stages. The beta Bayes model with maximum variance prior and geometric weights also provides good estimates.
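As a rough illustration of one idea described above, a cumulative beta Bayes estimate in which geometric weights de-emphasize earlier development stages, the sketch below assumes binomial pass/fail data at each stage and a Beta prior. The stage data, prior parameters, and weight ratio are invented placeholders, not values or code from the dissertation.

```python
# Sketch of a cumulative beta Bayes reliability estimate with geometric
# weights on earlier stages (all data and parameters are invented).
stages = [(20, 14), (20, 16), (20, 18), (20, 19)]   # (trials, successes) per stage
a0, b0 = 1.0, 1.0                                   # Beta prior (uniform, assumed)
w = 0.8                                             # geometric weight ratio (assumed)

m = len(stages)
a, b = a0, b0
for i, (n, s) in enumerate(stages):
    weight = w ** (m - 1 - i)          # most recent stage gets weight 1.0
    a += weight * s                    # weighted successes update the Beta "alpha"
    b += weight * (n - s)              # weighted failures update the Beta "beta"

posterior_mean = a / (a + b)           # point estimate of current-stage reliability
print(f"estimated reliability at stage {m}: {posterior_mean:.3f}")
```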
17

Three-dimensional knowledge representation using extended structure graph grammars

20 November 2014 (has links)
M.Sc. (Computer Science) / The purpose of this dissertation is to study methods of representing structures in three dimensions. Because chemical molecules are mostly complex three-dimensional structures, we used chemical molecules as our application domain. A literature study of current chemical information systems was undertaken. The whole spectrum of information systems was covered because almost all of these systems represent chemical molecules in one way or another. Various methods of three-dimensional structure representation were found in our literature study. Each of these methods was discussed in the context of its own application domain. Structure graph grammars were examined and explained in detail. A small object-based system with structure graph grammars as the underlying principle was developed. We speculated on the use of such "intelligent" graph grammars in structure interpretation and identification. Further research in this area was also identified.
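To make the notion of a structure graph grammar concrete, the toy sketch below applies one hypothetical production rule to a molecular graph stored as plain Python dictionaries, rewriting a non-terminal placeholder node into a small subgraph. The rule, the molecule, and the coordinates are illustrative assumptions only and are not the formalism developed in the dissertation.

```python
# Toy structure-graph-grammar step: rewrite a non-terminal node "R" in a
# molecular graph into a hydroxyl (-OH) subgraph. Everything here is an
# illustrative assumption, not the dissertation's formalism.
graph = {
    "nodes": {"C1": ("C", (0.0, 0.0, 0.0)),
              "R1": ("R", (1.5, 0.0, 0.0))},        # "R" is a non-terminal
    "edges": [("C1", "R1")],
}

def apply_hydroxyl_rule(g, target):
    """Replace non-terminal `target` with O-H, keeping its attachment edges."""
    label, (x, y, z) = g["nodes"].pop(target)
    assert label == "R", "rule only applies to non-terminal R nodes"
    g["nodes"]["O1"] = ("O", (x, y, z))             # oxygen takes R's position
    g["nodes"]["H1"] = ("H", (x + 1.0, y, z))       # hydrogen placed nearby
    # Re-attach every edge that pointed at the non-terminal to the new oxygen.
    g["edges"] = [(a if a != target else "O1", b if b != target else "O1")
                  for a, b in g["edges"]]
    g["edges"].append(("O1", "H1"))
    return g

print(apply_hydroxyl_rule(graph, "R1"))
```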
18

Improved Methodology for Limit States Finite Element Analysis of Lattice Type Structures using Nonlinear Post-Buckling Member Performance

Ostendorp, Markus 01 January 1992 (has links)
In an attempt to achieve more efficient designs, the technological frontier is pushed further and further. Every year science probes for a better understanding of natural phenomena, discovering new and improved methods to perform the same task more efficiently and with better results. One of these new technologies is the nonlinear analysis of structural systems using inelastic post-buckling member performance. Inelastic post-buckling member performance is defined as the constitutive relationship between axial load and displacement after the ultimate member capacity has been exceeded. A nonlinear analysis is able to predict the failure behavior of a structural system under ultimate loads more accurately than the traditionally used linear elastic analysis. Consequently, designs can be improved and become more efficient, which reduces the realization cost of a project. An improved nonlinear analysis solution algorithm has been developed that allows the analyst to perform a nonlinear analysis using post-buckling member performances faster than previously possible. Furthermore, the original post-buckling member performance database was expanded using results obtained from physical member compression tests. Based on the experimental results, new post-buckling member performance model curves were developed to be used together with the improved nonlinear solution algorithm. In addition, a program was developed that allows the analyst to perform a valid nonlinear analysis using a finite element program (LIMIT). The program combines a numerical pre-processor and input and output data evaluation modules based on human expertise with the LIMIT analysis package. Extensive on-line help facilities together with graphical pre- and post-processors were also integrated into the program. The resulting package essentially combines all of the components required to perform a nonlinear analysis using post-buckling member performances into one complete analysis package.
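The constitutive relationship described above, axial load versus axial displacement beyond the ultimate member capacity, can be pictured with a simple piecewise curve. The sketch below uses an invented tri-linear shape (elastic rise, softening post-buckling branch, residual plateau) purely for illustration; it is not one of the model curves calibrated in the dissertation.

```python
# Illustrative tri-linear post-buckling member curve: linear-elastic up to the
# ultimate capacity, then a degrading branch down to a residual load level.
# All parameters are assumed for illustration, not the dissertation's curves.
def axial_load(delta, k_elastic=50.0, p_ult=100.0, k_degrade=-10.0, p_residual=20.0):
    """Axial load (kips) as a function of axial shortening delta (inches)."""
    delta_ult = p_ult / k_elastic                 # shortening at ultimate capacity
    if delta <= delta_ult:
        return k_elastic * delta                  # pre-buckling elastic branch
    p = p_ult + k_degrade * (delta - delta_ult)   # softening post-buckling branch
    return max(p, p_residual)                     # flattens out at a residual capacity

for d in (0.5, 2.0, 4.0, 12.0):
    print(f"delta = {d:5.1f} in  ->  P = {axial_load(d):6.1f} kips")
```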
19

Rapid Architecture Alternative Modeling (RAAM): a framework for capability-based analysis of system of systems architectures

Iacobucci, Joseph Vincent 04 April 2012 (has links)
The current national security environment and fiscal tightening make it necessary for the Department of Defense to transition away from a threat-based acquisition mindset towards a capability-based approach to acquiring portfolios of systems. This requires that groups of interdependent systems regularly interact and work together as systems of systems to deliver desired capabilities. Technological advances, especially in the areas of electronics, computing, and communications, also mean that these systems of systems are tightly integrated and more complex to acquire, operate, and manage. In response to this, the Department of Defense has turned to system architecting principles along with capability-based analysis. However, because of the diversity of the systems, technologies, and organizations involved in creating a system of systems, the design space of architecture alternatives is discrete and highly non-linear. The design space is also very large due to the hundreds of systems that can be used, the numerous variations in the way systems can be employed and operated, and the thousands of tasks that are often required to fulfill a capability. This makes it very difficult to fully explore the design space. As a result, capability-based analysis of system of systems architectures often considers only a small number of alternatives. This places a severe limitation on the development of capabilities that are necessary to address the needs of the warfighter. The research objective for this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual phase of design of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission-dependent and mission-independent metrics are considered. Mission-dependent metrics are determined by the performance of systems accomplishing a task, such as probability of success. In contrast, mission-independent metrics, such as acquisition cost, are determined solely by the systems in the portfolio. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of the computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed: streaming algorithms and recursive architecture alternative evaluation algorithms that reduce computer memory requirements. Lastly, a domain specific language is created to reduce the computational time of executing the system of systems models.
A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics of the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system- and operational-level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example, it was possible to consider both system- and operational-level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has trends and properties similar to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of systems architecture studies of different sizes. This was necessary since systems of systems may be called upon to accomplish thousands of tasks. It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability-based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes: the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability-based analysis.
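As a very small sketch of the kind of exhaustive architecture-alternative enumeration RAAM automates, the following generates every assignment of one candidate system to each task and scores each portfolio with one mission-dependent metric (probability of success) and one mission-independent metric (acquisition cost). The systems, tasks, and numbers are invented placeholders, not data, metrics, or code from the dissertation.

```python
# Tiny illustration of capability-based architecture enumeration: every way of
# assigning one candidate system to each task is generated and scored. All
# systems, tasks, probabilities, and costs are invented placeholder data.
from itertools import product

# Candidate systems per task: (name, probability of task success, cost in $M)
candidates = {
    "detect": [("SensorA", 0.90, 120), ("SensorB", 0.80, 60)],
    "engage": [("StrikerA", 0.85, 200), ("StrikerB", 0.95, 340)],
    "assess": [("DroneA", 0.70, 40), ("DroneB", 0.90, 90)],
}

def score(portfolio):
    """Mission-dependent metric: P(success), assuming independent tasks.
    Mission-independent metric: total acquisition cost of the chosen systems."""
    p_success, cost = 1.0, 0
    for name, p, c in portfolio:
        p_success *= p
        cost += c
    return p_success, cost

tasks = list(candidates)
alternatives = []
for choice in product(*(candidates[t] for t in tasks)):   # exhaustive enumeration
    p, c = score(choice)
    alternatives.append((p, c, [name for name, _, _ in choice]))

# Rank by capability, then cost -- a crude stand-in for a "top n" analysis.
for p, c, names in sorted(alternatives, key=lambda a: (-a[0], a[1]))[:3]:
    print(f"P(success)={p:.3f}  cost=${c}M  systems={names}")
```

A full study would enumerate far larger spaces of tasks and systems, which is where the streaming algorithms, recursive evaluation, and parallel execution strategies mentioned in the abstract come into play.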
20

Computer method for the generation of the geometry of tensegrity structures

Charalambides, Jason Evelthon 28 August 2008 (has links)
Not available / text