131

A Strain Energy Function for Large Deformations of Curved Beams

Mackenzie, Ian January 2008 (has links)
This thesis develops strain and kinetic energy functions and a finite beam element useful for analyzing curved beams that undergo large deflections, such as a hockey stick being swung and bent substantially as it hits the ice. The resulting beam model is shown to be rotation invariant and to compute the correct strain energy and reaction forces for a specified deformation. A method is also described by which the model could be used to perform static or dynamic simulations of a beam.
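The rotation-invariance claim can be checked numerically: a strain energy that depends only on geometric invariants of the deformed shape must be unchanged by a rigid rotation. The sketch below illustrates the check on a deliberately simplified axial-stretch energy; the toy energy, node count, and stiffness are illustrative assumptions, not the thesis's formulation.

```python
# A minimal sketch (not the thesis's model) of a rotation-invariance check:
# an energy built from segment lengths alone cannot change under rotation.
import numpy as np

def toy_strain_energy(nodes, rest_lengths, k_axial=1.0):
    """Toy axial strain energy of a polyline beam: a sum of quadratic
    stretch terms, a rotation-invariant stand-in for the real model."""
    seg = np.diff(nodes, axis=0)
    lengths = np.linalg.norm(seg, axis=1)
    strain = (lengths - rest_lengths) / rest_lengths
    return 0.5 * k_axial * np.sum(strain**2 * rest_lengths)

rng = np.random.default_rng(0)
nodes = rng.normal(size=(10, 3))          # deformed nodal positions
rest = np.full(9, 1.0)                    # rest segment lengths

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

E0 = toy_strain_energy(nodes, rest)
E1 = toy_strain_energy(nodes @ Q.T, rest)  # rigidly rotated configuration
assert np.isclose(E0, E1)                  # energy unchanged by rotation
```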
132

Computer Aided Ferret Design

Siu, Selina January 2003 (has links)
Ferrets are amusing, flexible creatures that have been underrepresented in computer models. Because their bodies can assume almost any curved shape, splines are the natural tool for modelling ferrets. Surface pasting is a hierarchical method of modelling with spline surfaces, in which features are added onto a base surface. Existing surface pasting techniques are limited to modelling rectilinear shapes. Using the task of modelling a ferret as a driving problem, I propose a method of pasting cylinders in world space, examine methods for reducing the distortion of pasted features, and present a method for pasting trimmed features, allowing features that do not have the rectilinear shape of standard pasting. With these methods, modelling ferrets with surface pasting is easier, and the resulting models are closer to a real ferret.
134

An Efficient Robust Concept Exploration Method and Sequential Exploratory Experimental Design

Lin, Yao 31 August 2004 (has links)
Experimentation and approximation are essential for efficiency and effectiveness in concurrent engineering analyses of large-scale complex systems. The approximation-based design strategy is not fully utilized in industrial applications, in which designers must deal with multi-disciplinary, multi-variable, multi-response, and multi-objective analysis using very complicated and expensive-to-run computer analysis codes or physical experiments. With current experimental design and metamodeling techniques, it is difficult for engineers to develop acceptable metamodels for irregular responses and to achieve good design solutions in large design spaces at low cost. To circumvent this problem, engineers tend either to adopt low-fidelity simulations or models, with which important response properties may be lost, or to restrict the study to very small design spaces. Information from expensive physical or computer experiments is often used for validation in late design stages rather than as an analysis tool in early-stage design, which increases the likelihood of expensive re-design and lengthens the time-to-market. In this dissertation, two methods, the Sequential Exploratory Experimental Design (SEED) and the Efficient Robust Concept Exploration Method (E-RCEM), are developed to address these problems. SEED and E-RCEM help develop acceptable metamodels for irregular responses with expensive experiments and achieve satisficing design solutions in large design spaces with limited computational or monetary resources. It is verified that more accurate metamodels are developed and better design solutions are achieved with SEED and E-RCEM than with traditional approximation-based design methods. SEED and E-RCEM facilitate the full utility of the simulation-and-approximation-based design strategy in engineering and scientific applications. Several preliminary approaches for metamodel validation with additional validation points are proposed in this dissertation, after verifying that the most widely used method, leave-one-out cross-validation, is theoretically inappropriate for testing the accuracy of metamodels. A comparison of the performance of kriging and MARS metamodels is also presented, and a sequential metamodeling approach is proposed to utilize different types of metamodels along the design timeline. Several single-variable or two-variable examples and two engineering examples, the design of pressure vessels and the design of unit cells for linear cellular alloys, are used to facilitate these studies.
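As a concrete illustration of the validation issue raised above, the hedged sketch below contrasts the leave-one-out cross-validation error of a kriging (Gaussian-process) metamodel with its error at additional validation points. The test function, sample sizes, and use of scikit-learn are illustrative assumptions, not the dissertation's setup.

```python
# A sketch of metamodel validation: leave-one-out error on the training
# design versus error at fresh validation points, for a kriging surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_score

def response(x):
    """Cheap stand-in for an expensive simulation (irregular 1-D response)."""
    return np.sin(8 * x[:, 0]) + 0.3 * x[:, 0] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(15, 1))   # "expensive" design points
y_train = response(X_train)

gp = GaussianProcessRegressor(normalize_y=True)

# Leave-one-out mean squared error on the training design ...
loo_mse = -cross_val_score(gp, X_train, y_train,
                           scoring="neg_mean_squared_error",
                           cv=LeaveOneOut()).mean()

# ... versus error at additional validation points, as proposed above.
gp.fit(X_train, y_train)
X_val = rng.uniform(0, 1, size=(50, 1))
val_mse = np.mean((gp.predict(X_val) - response(X_val)) ** 2)

print(f"LOO RMSE: {loo_mse**0.5:.3f}  validation RMSE: {val_mse**0.5:.3f}")
```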
135

Implementation of B-splines in a Conventional Finite Element Framework

Owens, Brian C. 16 January 2010 (has links)
The use of B-spline interpolation functions in the finite element method (FEM) is not a new subject. B-splines have been utilized in finite elements for many reasons. One is the higher continuity of derivatives and smoothness of B-splines. Another is the possibility of reducing the required number of degrees of freedom compared with a conventional finite element analysis. Furthermore, if B-splines are used to represent the geometry of a finite element model, a finite element analysis program can interface with existing computer-aided design programs, which make extensive use of B-splines. While B-splines have been used in finite element analysis toward these goals, it is difficult to find resources that describe the process of implementing B-splines in an existing finite element framework, so it is necessary to document this methodology. The implementation should conform to the structure of conventional finite elements and depart from it only where absolutely necessary; one goal is to implement B-spline interpolation functions in a finite element framework such that the result appears very similar to conventional finite elements and is easily understandable by those with a finite element background. The use of B-spline functions in finite element analysis is studied for its advantages and disadvantages. Two-dimensional B-spline and standard FEM are compared for accuracy as well as computational efficiency. Results show that for a given number of degrees of freedom, B-spline FEM can produce solutions with lower error than standard FEM. Furthermore, for a given solution time and total analysis time, B-spline FEM will typically produce solutions with lower error than standard FEM. However, due to a more strongly coupled system of equations and a larger elemental stiffness matrix, B-spline FEM takes longer per degree of freedom for solution and assembly than standard FEM. Three-dimensional B-spline FEM is also validated by comparing a three-dimensional model with plane-strain boundary conditions to an equivalent two-dimensional model using plane-strain conditions.
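Since the abstract centers on using B-spline interpolation functions as FEM shape functions, a minimal evaluation sketch may help. The Cox-de Boor recursion below is the standard construction; the knot vector, degree, and Python setting are illustrative assumptions rather than the thesis's implementation.

```python
# A minimal sketch of evaluating B-spline basis functions by the
# Cox-de Boor recursion, the usual starting point for B-spline FEM.
def bspline_basis(i, p, knots, x):
    """Value of the i-th B-spline basis function of degree p at x."""
    if p == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:  # guard against repeated knots (0/0 terms are taken as 0)
        left = (x - knots[i]) / d1 * bspline_basis(i, p - 1, knots, x)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - x) / d2 * bspline_basis(i + 1, p - 1, knots, x)
    return left + right

# Quadratic basis on a clamped knot vector; at any x the nonzero basis
# functions sum to one (partition of unity), and interior continuity is
# C^(p-1) -- the smoothness advantage cited above.
knots = [0, 0, 0, 1, 2, 3, 3, 3]
x = 1.5
values = [bspline_basis(i, 2, knots, x) for i in range(len(knots) - 3)]
print(values, sum(values))   # sums to 1.0
```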
136

Predicting bid prices in construction projects using non-parametric statistical models

Pawar, Roshan 15 May 2009 (has links)
Bidding is a very competitive process in the construction industry; each competitor's business depends on winning or losing these bids. Contractors would therefore like to predict the bids that may be submitted by their competitors, which helps them obtain contracts and grow their business. Unit prices estimated for each quantity differ from contractor to contractor. These unit costs depend on factors such as the historical data used for estimating unit costs, vendor quotes, market surveys, the amount of material estimated, the number of projects the contractor is working on, equipment rental costs, the amount of equipment owned by the contractor, and the risk averseness of the estimator. These factors are much the same when estimators are pricing similar projects. Thus, there is a relationship between the projects that a particular contractor has bid in previous years and the cost the contractor is likely to quote for future projects, and this relationship can be used to predict the bids the contractor might quote. For example, a contractor may use historical data from a certain year for bidding on a certain type of project; the unit prices may be adjusted for size, time, and location, but the basis for bidding on projects of similar types is the same. Statistical tools can be used to model the underlying relationship between the final cost of the project quoted by a contractor and the quantities of materials or amount of tasks performed in the project. There are a number of statistical modeling techniques, but a model used for predicting costs should be flexible enough to adapt to any underlying pattern. Data such as the amount of work to be performed for a certain line item, a material cost index, a labor cost index, and a unique identifier for each participating contractor are used to predict the bids a contractor might quote for a certain project. To perform the analysis, artificial neural networks and multivariate adaptive regression splines are used. The results obtained from the two techniques are compared, and it is found that multivariate adaptive regression splines predict the cost better than artificial neural networks.
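Multivariate adaptive regression splines, the better-performing technique above, build their fits from paired hinge ("hockey-stick") basis functions max(0, x − t) and max(0, t − x). The sketch below fits a single-knot piecewise-linear model by least squares to made-up bid data; the knot location, data, and recovered coefficients are illustrative assumptions, not results from the thesis.

```python
# A hedged illustration of the MARS building blocks: hinge basis functions.
import numpy as np

def hinge_pair(x, t):
    """The two mirrored hinge basis functions MARS builds its fits from."""
    return np.maximum(0.0, x - t), np.maximum(0.0, t - x)

rng = np.random.default_rng(2)
quantity = rng.uniform(0, 100, size=200)              # e.g. line-item quantity
bid = 50 + 3.0 * np.maximum(0, quantity - 40) + rng.normal(0, 5, 200)

# Fit a piecewise-linear model with a single knot at t = 40 by least squares.
h_plus, h_minus = hinge_pair(quantity, 40.0)
A = np.column_stack([np.ones_like(quantity), h_plus, h_minus])
coef, *_ = np.linalg.lstsq(A, bid, rcond=None)
print("intercept, slope above knot, slope below knot:", coef.round(2))
```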
137

Bayesian Hierarchical, Semiparametric, and Nonparametric Methods for International New Product Diffusion

Hartman, Brian Matthew August 2010 (has links)
Global marketing managers are keenly interested in being able to predict the sales of their new products. Understanding how a product is adopted over time allows managers to optimally allocate their resources. With the world becoming ever more global, there are strong and complex interactions between the countries of the world. My work explores how to describe the relationships between those countries and determines the best way to leverage that information to improve sales predictions. In Chapter II, I describe how diffusion speed has changed over time. The most recent major study on this topic, by Christophe Van den Bulte, investigated new product diffusions in the United States. Van den Bulte notes that a similar study is needed in the international context, especially in developing countries. Additionally, his model contains the implicit assumption that the diffusion speed parameter is constant throughout the life of a product. I model the time component as a nonparametric function, allowing the speed parameter the flexibility to change over time. I find that early in the product's life, the speed parameter is higher than expected. Additionally, as the Internet has grown in popularity, the speed parameter has increased. In Chapter III, I examine whether the interactions can be described through a reference hierarchy in addition to the cross-country word-of-mouth effects already in the literature. I also expand the word-of-mouth effect by relating the magnitude of the effect to the distance between the two countries; the current literature applies the effect equally to the n closest countries (forming a neighbor set). This leads to an analysis of how best to measure the distance between two countries. I compare four possible distance measures: distance between the population centroids, trade flow, tourism flow, and cultural similarity. Including the reference hierarchy improves the predictions by 30 percent over the current best model. Finally, in Chapter IV, I look more closely at the Bass diffusion model. It is prominently used in the marketing literature and is the basis of my analysis in Chapter III. All of the current formulations include the implicit assumption that all the regression parameters are equal for each country; however, a one-dollar increase in GDP should have more of an effect in a poor country than in a rich country. A Dirichlet process prior enables me to cluster the countries by their regression coefficients. Incorporating the distance measures can improve the predictions by 35 percent in some cases.
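For readers unfamiliar with the Bass diffusion model underlying Chapters III and IV, the sketch below evaluates its standard closed form; the parameter values are typical textbook magnitudes, not estimates from this dissertation.

```python
# A short sketch of the Bass diffusion model. F(t) is the cumulative
# adoption fraction; p is the coefficient of innovation, q of imitation.
import numpy as np

def bass_cumulative(t, p, q):
    """Closed-form cumulative adoption fraction of the Bass model."""
    e = np.exp(-(p + q) * t)
    return (1.0 - e) / (1.0 + (q / p) * e)

t = np.linspace(0, 15, 6)
F = bass_cumulative(t, p=0.03, q=0.38)   # typical textbook magnitudes
sales = np.diff(F) * 1_000_000           # period sales for a 1M-unit market
print(F.round(3))
print(sales.round(0))
```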
138

Improving Tool Paths for Impellers

Kuo, Hsin-Hung 02 September 2004 (has links)
Impellers are important components in the aerospace, energy technology, and precision machine industries. Given the high accuracy and structural integrity required, impellers are often manufactured by cutting. Due to their complex geometries and the high degree of interference during machining, multi-axis machines are required to produce impellers. The objective of this thesis is to improve 5-axis tool paths for the surface quality of impellers by smoothing point-cutting tool paths expressed as linear segments and as B-splines, and by using flank milling with linear-segment and B-spline tool paths. Experimental results show that the surface quality of impeller blades can be improved by point cutting with smoothed tool paths and by flank milling. Moreover, the required milling time can be reduced by 18 percent and 13 percent using smoothed linear tool paths and smoothed B-spline tool paths, respectively.
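A hedged sketch of the smoothing step described above: fitting a B-spline through the points of a linear-segment tool path so the machine follows a smooth curve instead of a polyline. The sample path, noise level, and smoothing tolerance are invented; the thesis's 5-axis paths and machine setup are not reproduced.

```python
# Smoothing a polyline tool path with a cubic B-spline via SciPy.
import numpy as np
from scipy.interpolate import splprep, splev

# A jagged 3-D polyline standing in for a point-cutting tool path.
t = np.linspace(0, 2 * np.pi, 40)
path = np.vstack([np.cos(t), np.sin(t), 0.05 * t])
path += np.random.default_rng(3).normal(0, 0.01, path.shape)  # segment noise

# Fit a cubic B-spline with a smoothing tolerance, then resample it densely.
tck, u = splprep(path, k=3, s=0.01)
smooth = np.array(splev(np.linspace(0, 1, 400), tck))
print(smooth.shape)   # (3, 400) smoothed tool-path coordinates
```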
139

Development Of A Matlab Based Software Package For Ionosphere Modeling

Nohutcu, Metin 01 September 2009 (has links) (PDF)
Modeling of the ionosphere has been a highly interesting subject within the scientific community due to its effects on the propagation of electromagnetic waves. The development of the Global Positioning System (GPS) and the creation of extensive ground-based GPS networks started a new period in observation of the ionosphere, which resulted in several studies on GPS-based modeling of the ionosphere. However, software studies on the subject that are open to the scientific community have not progressed in a similar manner, and the options for the research community to obtain ionospheric modeling results are still limited. To address this need, a new MATLAB-based ionosphere modeling package, TECmapper, is developed in this study. The software uses three different algorithms for modeling the Vertical Total Electron Content (VTEC) of the ionosphere: 2D B-spline, 3D B-spline, and spherical harmonic models. The study includes modifications to the original forms of the B-spline and spherical harmonic approaches. To decrease the effect of outliers in the data, a robust regression algorithm is utilized as an alternative to least squares estimation. In addition, two regularization methods are employed to stabilize the ill-conditioned problems in the parameter estimation stage. The software and models are tested on a real data set from ground-based GPS receivers over Turkey. Results indicate that the B-spline models are more successful for local or regional modeling of the VTEC; however, spherical harmonics should be preferred for global applications, since the B-spline approach is based on Euclidean theory.
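To make the spherical-harmonic option concrete, the sketch below evaluates a truncated spherical-harmonic expansion of VTEC at a given latitude and longitude. The degree limit, random coefficients, and use of Python/SciPy (rather than TECmapper's MATLAB code) are illustrative assumptions.

```python
# A minimal sketch of a truncated spherical-harmonic VTEC representation.
import numpy as np
from scipy.special import sph_harm

def vtec(lat_deg, lon_deg, coeffs, n_max):
    """Evaluate a truncated spherical-harmonic expansion on the sphere.

    coeffs maps (n, m) -> complex coefficient; for simplicity the real
    part of each term is summed to give a real-valued field."""
    theta = np.radians(lon_deg) % (2 * np.pi)   # azimuth
    phi = np.radians(90.0 - lat_deg)            # colatitude
    total = 0.0
    for n in range(n_max + 1):
        for m in range(-n, n + 1):
            c = coeffs.get((n, m), 0.0)
            total += (c * sph_harm(m, n, theta, phi)).real
    return total

rng = np.random.default_rng(4)
coeffs = {(n, m): rng.normal() + 1j * rng.normal()
          for n in range(4) for m in range(-n, n + 1)}
print(vtec(39.9, 32.8, coeffs, n_max=3))   # value at Ankara, arbitrary units
```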
140

Piecewise polynomial functions on a planar region: boundary constraints and polyhedral subdivisions

McDonald, Terry Lynn 16 August 2006 (has links)
Splines are piecewise polynomial functions of a given order of smoothness r on a triangulated (or polyhedrally subdivided) region Δ of R^d. The set of splines of degree at most k forms a vector space C^r_k(Δ). Moreover, a nice way to study C^r_k(Δ) is to embed Δ in R^{d+1} and form the cone Δ̂ of Δ with the origin. It turns out that the set of splines on Δ̂ is a graded module C^r(Δ̂) over the polynomial ring R[x_1, ..., x_{d+1}], and the dimension of C^r_k(Δ) is the dimension of the k-th graded piece of C^r(Δ̂). This dissertation follows the works of Billera and Rose, as well as Schenck and Stillman, who each approached the study of splines from the viewpoint of homological and commutative algebra. They both defined chain complexes of modules such that C^r(Δ̂) appeared as the top homology module. First, we analyze the effects of gluing planar simplicial complexes. Suppose Δ1, Δ2, and Δ = Δ1 ∪ Δ2 are all planar simplicial complexes which triangulate pseudomanifolds. When Δ1 ∩ Δ2 is also a planar simplicial complex, we use the Mayer-Vietoris sequence to obtain a natural relationship between the spline modules C^r(Δ̂), C^r(Δ̂1), C^r(Δ̂2), and C^r(Δ̂1 ∩ Δ̂2). Next, given a simplicial complex Δ, we study splines which also vanish on the boundary of Δ; in this case, we discover a formula relating the Hilbert polynomial of this module of boundary-vanishing splines to that of C^r(Δ̂). Finally, we consider splines defined on a polygonally subdivided region P of the plane. By adding only edges to P to form a simplicial subdivision Δ, we are able to find bounds for the dimensions of the vector spaces C^r_k(P) for k ≥ 0. In particular, these bounds are given in terms of the dimensions of the vector spaces C^r_k(Δ) and geometric data of both P and Δ. This dissertation concludes with some thoughts on future research questions and an appendix describing the Macaulay2 package SplineCode, which allows the study of the Hilbert polynomials of the spline modules.
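The Mayer-Vietoris relationship referred to above can be written out explicitly. A plausible rendering in the standard notation of the spline literature (the dissertation's exact conventions are an assumption here) is:

```latex
% Left-exact Mayer--Vietoris sequence relating the four spline modules:
% the second map restricts a spline to each piece, and the third takes
% the difference of restrictions to the common subcomplex.
\[
0 \longrightarrow C^r(\hat{\Delta})
  \longrightarrow C^r(\hat{\Delta}_1) \oplus C^r(\hat{\Delta}_2)
  \longrightarrow C^r\bigl(\widehat{\Delta_1 \cap \Delta_2}\bigr)
\]
```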
