About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
581

The Effects of Diaphragm Flexibility on the Seismic Performance of Light Frame Wood Structures

Pathak, Rakesh 11 July 2008 (has links)
This dissertation studies the effects of diaphragm flexibility on the seismic performance of light frame wood structures (LFWS). A finite element approach is adopted for modeling LFWS because it explicitly incorporates individual structural elements and their material properties, and it captures the detailed response of both LFWS components and the structure as a whole. The modeling methodology developed here builds on prior finite element research in this area; however, no submodeling or substructuring of subassemblages is performed. Instead, a detailed model is developed that represents nearly every connection in the shear walls and diaphragms. The studs, plates, sills, blocking and joists are modeled with linear isotropic three-dimensional frame elements, and a linear orthotropic shell element combining membrane and plate behavior is used for the sheathing. Connections are modeled with oriented springs whose stiffnesses follow a modified Stewart hysteresis. The oriented spring pair was found to represent sheathing-to-framing connections in shear walls and diaphragms more accurately than the non-oriented or single springs typically used in earlier work. Fifty-six finite element models of LFWS are created with this methodology, and eighty-eight nonlinear response history analyses are performed using the Imperial Valley and Northridge ground motions. These analyses form a parametric study of house models with varying aspect ratios, diaphragm flexibility and lateral force resisting systems. Torsionally irregular house models showed the largest variation in peak base shear of individual shear walls when corresponding flexible and rigid diaphragm models are compared.
It is also found that the presence of an interior shear wall reduces peak base shears in the boundary walls of torsionally irregular models, and that it also reduces the flexibility of the diaphragm. A few analyses showed that the nail connections, rather than the sheathing, are the major source of in-plane flexibility within a diaphragm, irrespective of its aspect ratio. A major part of the dissertation focuses on the development of a new high-performance nonlinear dynamic finite element analysis program, which is used to analyze all the LFWS finite element models in this study. The program, named WoodFrameSolver, is written in Microsoft Visual Studio .NET on a mixed-language platform using object-oriented C++, C and FORTRAN. It performs basic structural analysis tasks such as static and dynamic analysis of 3D structures and offers a wide collection of linear, nonlinear and hysteretic elements commonly used in LFWS analysis. The advanced analysis features include static, nonlinear dynamic and incremental dynamic analysis. A unique aspect of the program is its ability to capture the elastic displacement participation (sensitivity) of spring, link, frame and solid elements in static analysis. Its performance and accuracy are comparable to those of SAP 2000, which was chosen as the benchmark for validating results. Fast and efficient serial and parallel solver libraries from Intel reduce the solution time for repetitive dynamic analyses, and use of the standard C++ template library for iteration, storage and access further speeds up the analysis, especially for problems with a large number of degrees of freedom. / Ph. D.
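The abstract's distinction between an oriented spring pair and the single, non-oriented spring used in earlier work can be sketched as follows. This is an illustrative simplification, not the dissertation's model: the functions use linear stiffnesses in place of the modified Stewart hysteresis, and all names are hypothetical.

```python
import math

def single_spring_force(dx, dy, k):
    """Non-oriented spring: one scalar spring acting along the
    instantaneous displacement direction, with magnitude k * |d|."""
    d = math.hypot(dx, dy)
    if d == 0.0:
        return (0.0, 0.0)
    f = k * d
    return (f * dx / d, f * dy / d)

def oriented_pair_force(dx, dy, kx, ky):
    """Oriented spring pair: two uncoupled springs fixed along the
    connection's initial local axes. With a hysteretic law each axis
    keeps its own loading history, which is what makes the pair more
    accurate for sheathing-to-framing nail connections."""
    return (kx * dx, ky * dy)
```

For a linear stiffness the two coincide whenever kx == ky; the pair only departs from the single spring once a hysteretic (e.g. modified Stewart) law makes each axis history-dependent under cyclic loading.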
582

Development of Methods for Improved Data Integrity and Efficient Testing of Wind Tunnel Models for Dynamic Test Conditions in Unsteady and Nonlinear Flight Regimes

Heim, Eugene Henry DeWendt 05 February 2004 (has links)
Today's high-performance aircraft operate in expanded flight envelopes, often maneuvering at high angular rates and high angles of attack, even above maximum lift. Current aerodynamic models are inadequate for predicting flight characteristics in the expanded envelope, such as rapid aircraft departures and other unusual motions, and the unsteady flows around such aircraft are a real concern. The ability to accurately measure aerodynamic loads directly impacts the ability to accurately model and predict flight. Current wind tunnel testing techniques do not adequately address the data fidelity of a test point under the influence of fluctuating loads and moments. Additionally, forced oscillation test techniques, one of the primary tools used to develop dynamic models, do not currently provide estimates of the uncertainty of the results during an oscillation cycle. Further, in testing models across a range of flight conditions, some parts of the envelope are well behaved and require few data points to arrive at a sound answer, while other parts are much more active and require a large sample of data to reach an answer with statistical significance. Current test methods do not factor changes in flow physics into data acquisition schemes, so in many cases data are obtained over more iterations than required, or insufficient data are obtained to determine a valid estimate. Methods of providing a measure of data integrity for static and forced oscillation test techniques are presented with examples. A method for optimizing the required number of forced oscillation cycles, based on the decay of uncertainty gradients and balance tolerances, is also presented. / Master of Science
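The closing idea — stop acquiring forced oscillation cycles once the uncertainty has decayed below a balance tolerance — can be sketched as an adaptive sampling loop. This is a hedged illustration of the general principle, not the thesis's actual method; the tolerance, cycle limits, and the simulated noisy measurement are invented for the example.

```python
import math
import random
import statistics

def cycles_needed(sample_cycle, tol, min_cycles=5, max_cycles=200):
    """Acquire oscillation cycles until the standard error of the
    running mean estimate drops below the tolerance `tol`."""
    samples = []
    for n in range(1, max_cycles + 1):
        samples.append(sample_cycle())
        if n >= min_cycles:
            stderr = statistics.stdev(samples) / math.sqrt(n)
            if stderr < tol:
                break
    return len(samples), statistics.mean(samples)

# Simulated noisy per-cycle estimate of an aerodynamic coefficient.
random.seed(0)
n, est = cycles_needed(lambda: random.gauss(0.5, 0.01), tol=0.005)
```

A well-behaved test point (low scatter) terminates after a handful of cycles, while an unsteady one keeps sampling toward the cap, which is exactly the envelope-dependent behavior the abstract describes.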
583

Electromagnetic Field Computation for Power Transmission Lines Using Quasi-Static Sub-Gridding Finite-Difference Time-Domain Approach

Ramli, Khairun N., Abd-Alhameed, Raed, See, Chan H., Noras, James M., Excell, Peter S. 06 1900 (has links)
Yes / A new approach to modelling electromagnetic wave propagation and its penetration into small objects is investigated and analysed. The travelling electromagnetic wave from the source is simulated by solving the time-dependent Maxwell equations. A subgridding technique is applied at the point of interest to observe the electromagnetic field in high resolution. The computational burden caused by the large number of time steps is mitigated by a quasi-static approach. The induced electromagnetic fields near a buried pipeline running parallel to a 400 kV power transmission line are presented and discussed.
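The time stepping at the heart of a finite-difference time-domain (FDTD) solver can be sketched in one dimension. This is only the plain Yee leapfrog update with an illustrative Gaussian source; the paper's contributions — the subgridding scheme and the quasi-static approximation — sit on top of updates of this form and are not reproduced here.

```python
import math

def fdtd_1d(nx=200, nt=300, src=50, courant=0.5):
    """Plain 1D FDTD: leapfrog update of E and H on a staggered grid.
    Grid size, step count, and source shape are arbitrary demo values."""
    ez = [0.0] * nx
    hy = [0.0] * nx
    for t in range(nt):
        for i in range(nx - 1):                 # update H from the curl of E
            hy[i] += courant * (ez[i + 1] - ez[i])
        ez[src] += math.exp(-((t - 40.0) ** 2) / 100.0)  # soft Gaussian source
        for i in range(1, nx):                  # update E from the curl of H
            ez[i] += courant * (hy[i] - hy[i - 1])
    return ez

field = fdtd_1d()
```

A subgridding scheme refines exactly this update locally (finer cells and time steps around a small object such as the pipeline), rather than refining the whole grid.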
584

Bradford Multi-Modal Gait Database: Gateway to Using Static Measurements to Create a Dynamic Gait Signature

Alawar, Hamad M.M.A., Ugail, Hassan, Kamala, Mumtaz A., Connah, David 25 November 2014 (has links)
Yes / Aims: To create a gait database with optimum accuracy of joint rotational data and an accurate representation of 3D volume, and to explore the potential of using the database in studying the relationship between static and dynamic features of a human's gait. Study Design: The study collected gait samples from 38 subjects, who were asked to walk, run, transition from walk to run, and walk with a bag. The motion capture, video, and 3D measurement data extracted were used to analyse and build correlations between features. Place and Duration of Study: The study was conducted at the University of Bradford. With ethical approval from the University, the motion and body volumes of 38 subjects were recorded at the motion capture studio from May 2011 to February 2013. Methodology: To date, the database includes 38 subjects (5 females, 33 males) conducting walk cycles with speed and load as covariates. A correlation analysis was conducted to explore the potential of using the database to study the relationship between static and dynamic features. The volumes and surface areas of body segments were used as static features. Phase-weighted magnitudes, extracted through a Fourier transform of the temporal rotation data of the joints from the motion capture, were used as dynamic features. The Pearson correlation coefficient was used to evaluate the relationship between the two sets of data. Results: A new database was created with 38 subjects conducting four forms of gait (walk, run, walk to run, and walking with a hand bag). Each subject recording included a total of 8 samples of each form of gait, and a 3D point cloud representing the 3D volume of the subject. Using a P-value (P < .05) as the criterion for statistical significance, 386 pairs of features displayed a strong relationship. Conclusion: A novel database available to the scientific community has been created.
The database can be used as an ideal benchmark to apply gait recognition techniques, and based on the correlation analysis, can offer a detailed perspective of the dynamics of gait and its relationship to volume. Further research in the relationship between static and dynamic features can contribute to the field of biomechanical analysis, use of biometrics in forensic applications, and 3D virtual walk simulation.
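The statistic underlying the correlation analysis — the Pearson coefficient between a static column (e.g. a segment volume) and a dynamic column (e.g. a phase-weighted magnitude) — is simple to state in code. A minimal sketch with invented sample values; the database's actual feature columns are not reproduced here.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two feature columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical columns: a segment volume vs. a joint-rotation magnitude.
r = pearson_r([7.1, 7.9, 8.4, 9.0], [0.31, 0.36, 0.38, 0.41])
```

Each of the 386 "strong relationship" pairs reported above is a static/dynamic column pair whose |r| passed the P < .05 significance criterion.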
585

Integrated Design of Binder Jet Print Produced Hydraulic Automatic Valve System

Heming Liu (14380014) 18 January 2023 (has links)
<p>Binder jet printing (BJP) is an additive manufacturing (AM) method with the potential to be applied at the high annual production volumes of the automotive industry, and it provides an excellent opportunity to innovate transmission valve body components. The three-layer design and the complex hydraulic control channels of the valve body housing form a new electro-hydraulic system with features made possible by BJP. For the valve body, BJP enabled a fundamentally new approach to both the valve and hydraulic channel design. The spool valve is housed in a sleeve that integrates orifices and port controls, the hydraulic channel layout of the valve body assembly is greatly simplified and space-saving, and the support components are replaced with a lightweight design that maintains the same functionality. The resulting integrated design of the BJP-produced hydraulic automatic valve system is entirely new, and its static performance was compared to that of the conventional 948TE ZF9HP48 transmission valve body. Similar performance indicates that a valve body design built around BJP has great potential for various industrial applications.</p>
586

An Investigation into the Relationship between Static and Dynamic Gait Features: A Biometrics Perspective

Alawar, Hamad M.M.A. January 2014 (has links)
A biometric is a unique physical or behavioural characteristic of a person. This unique attribute, such as a fingerprint or gait, can be used for identification or verification purposes. Gait is an emerging biometric with great potential. Gait recognition identifies a person by the manner in which they walk; its potential lies in the fact that it can be captured at a distance and does not require the cooperation of the subject. This advantage makes it a very attractive tool for forensic cases and applications, where it can assist in identifying a suspect when other evidence such as DNA, fingerprints, or a face is not attainable. Gait can be used for recognition in a direct manner when the two samples are captured at similar camera resolution, position, and conditions. Yet in some cases, the only sample available is of an incomplete gait cycle, low resolution, low frame rate, a partially visible subject, or a single static image. Most of these conditions have one thing in common: static measurements. A gait signature is usually formed from a number of dynamic and static features. Static features are physical measurements of height, length, or build, while dynamic features are representations of joint rotations or trajectories. The aim of this thesis is to study the potential of predicting dynamic features from static features. For this thesis, we created a database that uses a 3D laser scanner to capture the accurate shape and volume of a person, and a motion capture system to accurately record motion data. The first analysis examined the correlation between twenty-one 2D static features and eight dynamic features. Eleven pairs of features were regarded as significant under the criterion of a P-value less than 0.05. Other features also showed a strong correlation that indicated potential predictive power. The second analysis focused on 3D static and dynamic features.
Through the correlation analysis, 1196 pairs of features were found to be significantly correlated. Based on these results, a linear regression analysis was used to predict a dynamic gait signature. The predictors were chosen using two adaptive methods developed in this thesis: the "top-x" method and the "mixed" method. The predictions were assessed both for their accuracy and for their classification potential in gait recognition. The top results produced a 59.21% mean matching percentile. This result will act as a baseline for future research in predicting a dynamic gait signature from static features. The results of this thesis bear potential for applications in biomechanics, biometrics, forensics, and 3D animation.
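The "top-x" idea — rank the static features by correlation strength with a dynamic feature, then regress on the best ones — can be sketched for x = 1 with ordinary least squares. The feature names and data below are hypothetical, and the thesis's mixed method and multi-predictor regression are not reproduced.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two feature columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

def fit_top_predictor(static_cols, dynamic_col):
    """'Top-x' with x = 1: keep the static feature most correlated with
    the dynamic feature and fit y = a + b*x by least squares."""
    name, col = max(static_cols.items(),
                    key=lambda kv: abs(pearson_r(kv[1], dynamic_col)))
    n = len(col)
    mx, my = sum(col) / n, sum(dynamic_col) / n
    b = (sum(x * y for x, y in zip(col, dynamic_col)) - n * mx * my) \
        / (sum(x * x for x in col) - n * mx * mx)
    return name, (lambda v, a=my - b * mx, b=b: a + b * v)
```

For x > 1 the same ranking feeds a multiple regression; the "top-x" selection simply truncates the ranked list of static predictors.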
587

A Study of Some Aspects of Numerically Controlled Machine Tools

Heideman, Murdoch 11 1900 (has links)
<p> This thesis is a study of numerically controlled machine tools (NCMT), and is divided into four sections. </p> <p> Section A is a literature survey of current concepts, criteria and techniques in the design of NCMT structures and drives. Several of the author's own ideas are also included. </p> <p> Section B deals with NCMT manual and computer-aided programming techniques. The structure and function of post processors is also covered. </p> <p> Section C is a practical combination of computer design optimization and numerically controlled manufacture. In an example, the geometrical dimensions of a hydrostatic thrust bearing are optimized and used as input to a generalized APT programme, written to produce a numerical control tape for the manufacture of this bearing type. </p> <p> Section D is the discussion and conclusion. </p> / Thesis / Master of Engineering (MEngr)
588

A Hybrid Software Change Impact Analysis for Large-scale Enterprise Systems

Chen, Wen 11 1900 (has links)
This work is concerned with analysing the potential impact of direct changes to large-scale enterprise systems and, in particular, with minimising the testing effort for such changes. A typical enterprise system may consist of hundreds of thousands of classes and millions of methods, so it is extremely costly and difficult to apply conventional testing techniques to such a system. Retesting everything after a change is very expensive and in practice generally not necessary. Selective testing can be more effective, but it requires a deep understanding of the target system, and a lack of that understanding can lead to insufficient test coverage. Change impact analysis can be used to estimate the impact of the changes to be applied, giving developers and testers confidence in selecting necessary tests and identifying untested entities. Conventional change impact analysis approaches include static analysis, dynamic analysis, or a hybrid of the two. They have proved useful on small or medium-sized programs, providing users an inside view of the system within an acceptable running time. However, large-scale enterprise systems are orders of magnitude larger, and conventional approaches often run into resource problems such as insufficient memory and/or unacceptable running time (up to weeks). More critically, those approaches can generate a large number of false negatives and false positives. In this work, a conservative static analysis capable of dealing with inheritance was conducted on an enterprise system and its associated changes to obtain all the potential impacts. An aspect-based dynamic analysis was then used to instrument the system and collect a set of dynamic impacts at run time. We are careful not to discard impacts unless we can show that they are definitely not affected by the change.
Reachability analysis examines the program to see "whether a given path in a program representation corresponds to a possible execution path". In other words, we employ reachability analysis to eliminate infeasible paths (i.e., mismatched calls and returns) identified in the control flow of the program. Furthermore, in the alias analysis phase, we aim to identify paths that are feasible but cannot be affected by the direct changes to the system, by searching a set of possible pairs of accesses that may be aliased at each program point of interest. Our contribution is a hybrid approach that combines static and dynamic analysis with reachability analysis and alias/pointer analysis. It can be used to (1) solve the scalability problem on large-scale systems, (2) reduce false positives without introducing false negatives, (3) extract both direct and indirect changes, and (4) identify impacts even before the changes are made. Using our approach, organizations can focus on a much smaller, relevant subset of the overall test suite instead of blindly running their entire suite of tests. It also enables testers to augment the test suite with tests covering uncovered impacts. We include an empirical study that illustrates the savings that can be attained. / Thesis / Doctor of Philosophy (PhD)
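The conservative static step — everything that can transitively reach a changed method is potentially impacted — amounts to a reachability walk over the reverse call graph. A minimal sketch with an invented call graph; the real analysis also handles inheritance, aliasing and the dynamic trace, none of which are shown here.

```python
from collections import deque

def impacted(call_graph, changed):
    """Walk the reverse call graph from the changed methods; every
    transitive caller is conservatively marked as impacted."""
    callers = {}
    for caller, callees in call_graph.items():
        for callee in callees:
            callers.setdefault(callee, set()).add(caller)
    seen, queue = set(changed), deque(changed)
    while queue:
        for c in callers.get(queue.popleft(), ()):
            if c not in seen:
                seen.add(c)
                queue.append(c)
    return seen

# Hypothetical call graph: tests call services, services call utilities.
graph = {"testA": ["OrderService.place"],
         "OrderService.place": ["TaxUtil.rate"],
         "testB": ["ReportService.run"]}
```

Test selection then keeps only the tests inside the impacted set, which is how the approach shrinks the suite without (conservatively) dropping any affected test.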
589

Multipole moments of axisymmetric spacetimes

Bäckdahl, Thomas January 2006 (has links)
In this thesis we study multipole moments of axisymmetric spacetimes. Using the recursive definition of the multipole moments of Geroch and Hansen, we develop a method for computing all multipole moments of a stationary axisymmetric spacetime without the use of a recursion. This generalises a method developed by Herberthson for the static case. Using Herberthson's method we also develop a method for finding a static axisymmetric spacetime with arbitrary prescribed multipole moments, subject to a specified convergence criterion. In general this method has a step in which one must find an explicit expression for an implicitly defined function; however, if the number of multipole moments is finite, we give an explicit expression in terms of power series. / <p>Note: The two articles are also available in the pdf-file. Report code: LiU-TEK-LIC-2006:4.</p>
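For context, the recursion that the thesis sidesteps can be written out. This is sketched from the standard literature formulation of the Geroch–Hansen moments, not taken from the thesis itself: on the conformally rescaled 3-manifold of orbits, with derivative operator $D_a$ and Ricci tensor $R_{ab}$, a sequence of tensors is built from the rescaled potential $\tilde{\phi}$,

```latex
P = \tilde{\phi}, \qquad
P_{a_1} = D_{a_1} P, \qquad
P_{a_1 \dots a_{n+1}}
  = C\!\left[ D_{a_1} P_{a_2 \dots a_{n+1}}
      - \tfrac{n(2n-1)}{2}\, R_{a_1 a_2}\, P_{a_3 \dots a_{n+1}} \right],
```

where $C[\,\cdot\,]$ denotes the totally symmetric, trace-free part; the multipole moments are the values of these tensors at the point representing infinity. Evaluating this recursion order by order is what becomes impractical at high $n$, which motivates a non-recursive method.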
590

MLpylint: Automating the Identification of Machine Learning-Specific Code Smells

Hamfelt, Peter January 2023 (has links)
Background. Machine learning (ML) has rapidly grown in popularity, becoming a vital part of many industries. This swift expansion has brought new challenges in technical debt, maintainability and the general software quality of ML systems. With ML applications becoming more prevalent, there is an emerging need for extensive research to keep up with the pace of development. Research on code smells in ML applications is currently limited, and there is a lack of tools and studies that address these issues in depth. This gap highlights the necessity of a focused investigation into the validity of ML-specific code smells in ML applications, setting the stage for this research study. Objectives. This study addresses the limited research on ML-specific code smells within Python-based ML applications. It begins by identifying these ML-specific code smells. The next objective is to choose suitable methods and tools to design and develop a static code analysis tool based on code smell criteria. After development, an empirical evaluation assesses both the tool's efficacy and performance, and feedback from industry professionals is sought to measure the tool's feasibility and usefulness. Methods. This research employed Design Science Methodology. In the problem identification phase, a literature review was conducted to identify ML-specific code smells. In solution design, a secondary literature review and consultations with experts were performed to select methods and tools for implementing the tool. Additionally, 160 open-source ML applications were sourced from GitHub. The tool was empirically tested against these applications, with a focus on assessing its performance and efficacy. Furthermore, using the static validation method, feedback on the tool's usefulness was gathered through an expert survey involving 15 ML professionals from Ericsson. Results.
The study introduced MLpylint, a tool designed to identify 20 ML-specific code smells in Python-based ML applications. MLpylint analyzed the 160 ML applications within 36 minutes, identifying 5380 code smells in total, although the results highlight the need for further refinement of each code smell checker to accurately identify specific patterns. In the expert survey, 15 ML professionals from Ericsson acknowledged the tool's usefulness, user-friendliness and efficiency; however, they also indicated room for improvement in fine-tuning the tool to avoid ambiguous smells. Conclusions. Current studies on ML-specific code smells are limited, and few tools address them. The development and evaluation of MLpylint is a significant advancement in the ML software quality domain, enhancing reliability and reducing the associated technical debt of ML applications. As the industry integrates such tools, it is vital that they evolve to detect code smells from new ML libraries. This aids developers in upholding software quality and promotes further research in the ML software quality domain.
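The kind of static check such a tool performs can be sketched with Python's `ast` module. This is an illustrative checker for one plausible smell (calling a random-number routine without first setting a seed), written for this summary; it is not one of MLpylint's actual checkers, and the attribute names it looks for are assumptions.

```python
import ast

SMELL = "randomness used without an explicit seed"

def check_source(source):
    """Flag attribute calls like np.random.rand(...) that occur before
    any *.seed(...) call in a simple top-to-bottom script."""
    seeded, findings = False, []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
            if node.func.attr == "seed":
                seeded = True
            elif node.func.attr in {"rand", "randn", "random", "shuffle"} and not seeded:
                findings.append((node.lineno, SMELL))
    return findings
```

Note that `ast.walk` traverses breadth-first, so the "before" ordering only holds for flat, top-level scripts; a production checker would track control flow and scopes, which is where the per-checker refinement discussed above comes in.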
