  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
771

Development of a DME Simulator

Brown, Robert W. 01 January 1974 (has links) (PDF)
This report summarizes the design of a DME (Distance Measuring Equipment) simulator to be used in the testing of an Area Navigation System. The purpose of the simulator is to generate a signal representing an aircraft's distance from a ground station. This information is in the form of two pulses whose separation represents the elapsed transmission time for an aircraft to receive a reply from the ground station to an interrogation by the aircraft. The pulse spacing must be selectable as fixed distances for static tests and as distance changing at a constant rate, to simulate flying to or from the station, for dynamic testing. Thumbwheel switches are used to input fixed distances, and up/down counters provide inbound and outbound range rates. The rate clock is derived from a crystal oscillator whose output is divided down by a programmable, modulo-n divider to the desired rate/frequency. This input distance information, available in parallel binary coded decimal format, is then converted to the required pulse pair spacing. This is accomplished with presettable down counters clocked by another crystal oscillator whose frequency represents two-way propagation time for radio waves.
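The distance-to-pulse-spacing conversion the report describes can be sketched as below. The 12.359 µs-per-nautical-mile round-trip constant is the standard DME figure; the 10 MHz counter clock in the usage note is a hypothetical value for illustration, not one taken from the report.

```python
# Round-trip propagation time for radio waves is ~12.359 microseconds per
# nautical mile -- the standard DME conversion constant.
US_PER_NMI_ROUND_TRIP = 12.359

def pulse_pair_spacing_us(distance_nmi: float) -> float:
    """Pulse separation (microseconds) encoding the given slant range."""
    return distance_nmi * US_PER_NMI_ROUND_TRIP

def counter_preset(distance_nmi: float, clock_hz: float) -> int:
    """Preset value for a presettable down counter clocked at clock_hz,
    so that counting down to zero spans the required pulse spacing."""
    return round(pulse_pair_spacing_us(distance_nmi) * clock_hz / 1e6)
```

With a hypothetical 10 MHz counter clock, a fixed distance of 100 nmi maps to a pulse spacing of 1235.9 µs, i.e. a counter preset of 12359 counts.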
772

Development of a UHF Digital Frequency Synthesizer for Distance Measuring Equipment

Sharpe, Claude A. 01 January 1975 (has links) (PDF)
This report summarizes the design of a digital frequency synthesizer for airborne distance measuring equipment. It is the purpose of the frequency synthesizer to provide a stable frequency source for the local oscillator of the airborne receiver and for the power amplifiers in the transmitter chain. The synthesizer is required to furnish frequencies ranging from 260.250 MHz to 287.500 MHz in channel steps of 250 kHz at a power level of +7.0 dBm. The stability of the frequency must be better than 0.005% over the temperature range of minus 45 degrees centigrade to plus 55 degrees centigrade, requiring a crystal controlled source. Digital techniques are employed, using two crystal controlled oscillators to synthesize all required channel frequencies. Linear circuits using standard configurations are employed for the oscillators, buffers, and mixers. Primary attention is paid to optimizing the transient characteristics of the synthesizer, which employs programmable digital counters to change the division ratio in a phase locked loop. Decoding is provided to interface the modulus of the counters with the aircraft cockpit controls.
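The channel arithmetic above can be illustrated with a single-loop phase-locked loop whose reference equals the 250 kHz channel step, so that f_out = N × 250 kHz. This is a common textbook arrangement assumed purely for illustration; the report's actual design synthesizes channels from two crystal oscillators.

```python
CHANNEL_STEP_HZ = 250_000  # channel spacing from the abstract

def division_ratio(f_out_hz: float, f_ref_hz: float = CHANNEL_STEP_HZ) -> int:
    """N such that f_out = N * f_ref in a single-loop PLL synthesizer."""
    n = f_out_hz / f_ref_hz
    if abs(n - round(n)) > 1e-9:
        raise ValueError("frequency is not on the channel grid")
    return round(n)

# Division ratios spanning the 260.250 MHz - 287.500 MHz coverage:
n_low = division_ratio(260.250e6)    # 1041
n_high = division_ratio(287.500e6)   # 1150
```

The loop counter therefore only needs to step N from 1041 to 1150 to cover all 110 channels, which is what makes a programmable divider the natural channel-select mechanism.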
773

The 'Who' and 'Where' of Events: Infants' Processing of Figures and Grounds in Events

Goksun-Yoruk, Tilbe January 2010 (has links)
Learning relational terms such as verbs and prepositions is fundamental to language development. To learn relational words, children must first dissect and process dynamic event components, and then uncover how the particular language they are learning encodes these constructs. Building on a new area of research, this dissertation investigated two event components, figure (i.e., the moving entity) and ground (i.e., the stationary setting) that are central to learning relational words. In particular, we examine how English- and Japanese-reared infants process figures and grounds in nonlinguistic events and how language learning interacts with their conceptualization of these constructs. Four studies were designed to probe our questions. Study 1 examined English-reared infants' ability to form nonnative ground categories encoded only in Japanese. For example, "crossing a road," which extends in a line and is bounded, is expressed differently than "crossing a field" that extends in a plane and is unbounded. We found that infants can detect the geometry of the ground and form a nonnative ground category. Study 2 indicated that the path of an action plays a role in construing these categorical ground distinctions such that without the bounded paths infants do not differentiate between grounds. Study 3 demonstrated that even though infants notice figures and grounds in static representations of the dynamic events (even earlier for the ground discrimination), the Japanese categorical ground differentiation no longer emerged. In the last set of studies, we showed that despite the sensitivity to the event structure and categorical ground distinctions in dynamic events by both English- and Japanese-reared infants (Study 4a), only Japanese toddlers retained these categorical distinctions (Study 4b). 
Overall, these results suggest that 1) infants distinguish between figures and grounds in events with differential attention to static and dynamic displays; 2) before learning much about their native language infants form nonnative event categories; and 3) the process of learning language appears to shift earlier formed categorical boundaries. / Psychology
774

Object Trackers Performance Evaluation and Improvement with Applications using High-order Tensor

Pang, Yu January 2020 (has links)
Visual tracking is one of the fundamental problems in computer vision. This topic has been a widely explored area attracting a great amount of research effort. Over the decades, hundreds of visual tracking algorithms, or trackers for short, have been developed, and a great number of public datasets are available alongside them. As the number of trackers grows, it becomes a common problem to evaluate which tracker is better. Many metrics have been proposed, together with numerous evaluation datasets. In my research work, we first present an application that tracks multiple objects in a restricted scene with a very low frame rate. It poses a unique challenge: the image quality is low, and we cannot assume that images are close together in temporal space. We design a framework that utilizes background subtraction and object detection, then apply template matching algorithms to achieve tracking by detection. While exploring applications of tracking algorithms, we recognized a problem when authors compare their proposed tracker with others: there are unavoidable subjective biases, since it is non-trivial for authors to optimize other trackers, while they can reasonably tune their own tracker to its best. Our assumption is that authors use the default settings for other trackers, so the performances of the other trackers are less biased. We therefore apply a leave-their-own-tracker-out strategy to weigh the performances of the other trackers, and derive four metrics to justify the results. Besides the biases in evaluation, the datasets we use as ground truth may not be perfect either. Because all of them are labeled by human annotators, they are prone to label errors, especially due to partial visibility and deformation. We demonstrate some human errors in existing datasets and propose smoothing techniques to detect and correct them. We use a two-step adaptive image alignment algorithm to find the canonical view of the video sequence, 
then use different techniques to smooth the trajectories to certain degrees. The results show smoothing can slightly improve the trained model, but would overfit if overcorrected. Once we have a clear understanding of, and reasonable approaches to, the visual tracking scenario, we apply the principles to multi-target tracking. We formulate it as a multi-dimensional assignment problem and encode the motion information in a high-order tensor framework. We propose to solve it using rank-1 tensor approximation, with a tensor power iteration algorithm to obtain the solution efficiently. It applies to pedestrian tracking, aerial video tracking, as well as curvilinear structure tracking in medical video. Furthermore, the proposed framework can also fit the affinity measurement of multiple objects simultaneously. We propose the Multiway Histogram Intersection to obtain the similarities between histograms of more than two targets. With the tensor power iteration solution, we show it can be applied in several multi-target tracking applications. / Computer and Information Science
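The Multiway Histogram Intersection mentioned above generalizes the classic two-histogram intersection kernel by taking the bin-wise minimum across all targets' histograms. The sketch below assumes normalized histograms and is an illustration of the idea rather than the dissertation's implementation.

```python
import numpy as np

def multiway_histogram_intersection(histograms: np.ndarray) -> float:
    """Similarity of k histograms (shape k x bins): sum of bin-wise minima.

    For k = 2 this reduces to the classic histogram intersection kernel.
    """
    return float(np.minimum.reduce(histograms, axis=0).sum())

# Identical normalized histograms give the maximal similarity of 1.0;
# histograms with disjoint support give 0.
h = np.array([[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]])
print(multiway_histogram_intersection(h))  # 1.0
```

Because the bin-wise minimum is taken jointly over all k targets, the score directly measures the affinity of a whole candidate tuple, which is what lets it slot into the high-order tensor formulation.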
775

The Contribution of Below Knee Wobbling Mass to the Estimation of Vertical Knee Joint Reaction Forces Following Impact with the Ground / The Effect of Wobbling Mass on Knee Joint Force Estimates

Andrews, David 07 1900 (has links)
In human impacts involving high peak accelerations, the wobbling mass (skin, muscle, fat, and connective tissue) of the body will accelerate independently of the rigid mass (bone). The purpose of this study was to quantify the effect that below knee wobbling mass has, if any, on the attenuation of peak forces transmitted through the leg to the knee joint, following impacts with the ground. Fifteen healthy subjects dropped vertically from heights of 5 and 10 cm, onto a force platform, with the ankle and knee joints of the support leg held rigidly. A uni-axial accelerometer was fixed, with skin bond cement, to the skin overlying the anterior upper tibia, and another to the posterior wobbling tissue of the support limb. Vertical ground reaction forces and accelerations were used in rigid-only and rigid-and-wobbling link segment models of the leg, which resulted in estimates of vertical knee joint reaction forces (VKJRF). Mean peak VKJRFs resulting from rigid-only calculations were 2.66±.55 x body weight (bw) and 3.53±.68 x bw, and from rigid-and-wobbling calculations were 2.64±.55 x bw and 3.52±.68 x bw, for the 5 and 10 cm heights, respectively. A two-way, repeated measures ANOVA revealed that there was no main effect for calculation method. Validation of the subject results was attempted indirectly by comparing them to the actual forces (load cell) at the knee of a manufactured model. The model was constructed in proportion to a human subject of mass 75 kg. When the model was dropped from the same heights as the subjects, the mean peak VKJRFs (22.9 x bw) greatly overestimated the actual load cell values (8.2 x bw), and were unrealistic relative to subject values (approximately 3.0 x bw). Although mean peak VKJRFs were also overestimated when the wobbling mass accelerations were included, they were much closer to the actual values (9.9 x bw vs. 8.2 x bw). / Thesis / Master of Science (MS)
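The two estimation methods compared above amount to applying Newton's second law to the below-knee segment, once with a single rigid mass and once with the rigid and wobbling masses accelerating independently. The sketch below is a schematic reconstruction of that bookkeeping (upward positive, SI units); the function names and sign convention are illustrative assumptions, not the thesis model.

```python
G = 9.81  # gravitational acceleration, m/s^2

def vkjrf_rigid(m_leg: float, a_leg: float, grf: float) -> float:
    """Rigid-only estimate: the whole below-knee mass shares one acceleration.
    Segment equilibrium (up positive): F_knee + GRF - m*g = m*a."""
    return m_leg * (a_leg + G) - grf

def vkjrf_wobbling(m_rigid: float, a_rigid: float,
                   m_wobble: float, a_wobble: float, grf: float) -> float:
    """Rigid-and-wobbling estimate: bone and soft tissue accelerate
    independently, each contributing its own inertial term."""
    return m_rigid * (a_rigid + G) + m_wobble * (a_wobble + G) - grf
```

When the wobbling tissue accelerates less sharply than the bone at impact, the second estimate's inertial term is smaller, which is consistent with the (small) reduction in estimated peak VKJRF reported above. If both masses share the same acceleration, the two estimates coincide exactly.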
776

A study of potash mining methods related to ground control criteria /

Molavi, M. A. January 1987 (has links)
No description available.
777

The development of coastal bluffs in a permafrost environment : Kivitoo Peninsula, Baffin Island, Canada

Algus, Mitchell January 1986 (has links)
No description available.
778

A study of building response and damage due to mining-induced ground movements

Yu, Zhanjing 11 July 2007 (has links)
Several methods have been developed to predict mining-induced ground movements. Some of these methods, such as the profile and influence functions, have been used successfully in a number of applications. Prediction methods, however, do not address the response of surface buildings and structures to mining-induced ground movements. In order to study the response of a building to ground movements, a finite element model has been developed. The program, named SRMP (Subsidence Response Modelling Program), is a large displacement, small strain, two dimensional finite element program. Such a model is more appropriate than the commonly used small-displacement formulations and describes this particular problem more accurately because large displacements are involved in mining-induced ground movements. Four types of elements are employed in the program, namely plane, beam, transition, and friction elements. Total Lagrangian (T.L.) formulation is used for plane elements and Updated Lagrangian (U.L.) formulation for beam, transition, and friction elements. The program consists of twenty-six subroutines and requires about one megabyte of memory. It can model the slippage between foundation and subgrade. An important feature of SRMP is that it can simulate the excavation process continuously, without re-initiating the system variables and boundary conditions. Ground movement, building displacement, and stresses can therefore be obtained at each excavation stage. The accuracy of the finite element model was verified through field data. The slippage between foundation and subgrade was analysed in depth. Structural deformations and stresses induced by ground movements were also studied, and damage criteria in terms of ground displacements were proposed. Finally, based on the SRMP analyses, appropriate measures were developed which can provide better protection to surface structures affected by excavation-induced ground movements. / Ph. D.
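As a point of reference for the beam elements mentioned above, the standard local bending stiffness matrix of a 2D Euler-Bernoulli beam element can be assembled as follows. This is the textbook small-displacement matrix, shown only for orientation; SRMP's beam elements use an Updated Lagrangian large-displacement formulation, which this sketch does not reproduce.

```python
import numpy as np

def beam_bending_stiffness(E: float, I: float, L: float) -> np.ndarray:
    """Local 4x4 bending stiffness for DOFs [v1, theta1, v2, theta2]
    of a 2D Euler-Bernoulli beam element of length L."""
    k = E * I / L**3
    return k * np.array([
        [ 12,     6*L,   -12,     6*L  ],
        [ 6*L,  4*L**2,  -6*L,  2*L**2 ],
        [-12,    -6*L,    12,    -6*L  ],
        [ 6*L,  2*L**2,  -6*L,  4*L**2 ],
    ])

# Illustrative steel beam: E = 200 GPa, I = 1e-6 m^4, L = 2 m.
K = beam_bending_stiffness(E=200e9, I=1e-6, L=2.0)
# The matrix is symmetric and singular (rigid-body modes) until
# boundary conditions are applied during global assembly.
```

Global assembly sums such element matrices into the structure stiffness matrix; the slippage behaviour described above would enter through separate friction elements at the foundation-subgrade interface.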
779

Passive Site Remediation for Mitigation of Liquefaction Risk

Gallagher, Patricia M. 28 November 2000 (has links)
Passive site remediation is a new concept proposed for non-disruptive mitigation of liquefaction risk at developed sites susceptible to liquefaction. It is based on the concept of slow injection of stabilizing materials at the edge of a site and delivery of the stabilizer to the target location using the natural groundwater flow. The purpose of this research was to establish the feasibility of passive site remediation through identification of stabilizing materials, a study of how to design or adapt groundwater flow patterns to deliver the stabilizers to the right place at the right time, and an evaluation of potential time requirements and costs. Stabilizer candidates need to have long, controllable gel times and low viscosities so they can flow into a liquefiable formation slowly over a long period of time. Colloidal silica is a potential stabilizer for passive site remediation because at low concentrations it has a low viscosity and a wide range of controllable gel times of up to about 100 days. Loose Monterey No. 0/30 sand samples (Dr = 22%) treated with colloidal silica grout were tested under cyclic triaxial loading to investigate the influence of colloidal silica grout on the deformation properties. Distinctly different deformation properties were observed between grouted and ungrouted samples. Untreated samples developed very little axial strain after only a few cycles and prior to the onset of liquefaction. Once liquefaction was triggered, large strains occurred rapidly and the samples collapsed within a few additional cycles. In contrast, grouted sand samples experienced very little strain during cyclic loading. What strain accumulated did so uniformly throughout loading and the samples remained intact after cyclic loading. In general, samples stabilized with 20 weight percent colloidal silica experienced very little (less than two percent) strain during cyclic loading. 
Sands stabilized with 10 weight percent colloidal silica tolerated cyclic loading well, but experienced slightly more (up to eight percent) strain. Treatment with colloidal silica grout significantly increased the deformation resistance of loose sand to cyclic loading. Groundwater and solute transport modeling were done using the codes MODFLOW, MODPATH, and MT3DMS. A "numerical experiment" was done to determine the ranges of hydraulic conductivity and hydraulic gradient where passive site remediation might be feasible. For a treatment area of 200 feet by 200 feet, a stabilizer travel time of 100 days, and a single line of low-head (less than three feet) injection wells, it was found that passive site remediation could be feasible in formations with hydraulic conductivity values of 0.05 cm/s or more and hydraulic gradients of 0.005 and above. Extraction wells will increase the speed of delivery and help control the down gradient extent of stabilizer movement. The results of solute transport modeling indicate that dispersion will play a large role in determining the concentration of stabilizer that will be required to deliver an adequate concentration at the down gradient edge. Consequently, thorough characterization of the hydraulic conductivity throughout the formation will be necessary for successful design and implementation of passive site remediation. The cost of passive site remediation is expected to be competitive with other methods of chemical grouting, i.e. in the range of $60 to $180 per cubic meter of treated soil, depending on the concentration of colloidal silica used. / Ph. D.
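The feasibility threshold quoted above follows from Darcy's law: the average linear (seepage) velocity is v = K·i/n, and the stabilizer travel time across the treatment area is its width divided by v. The porosity of 0.4 below is an assumed value for illustration (the abstract does not state one).

```python
def seepage_velocity_cm_s(K_cm_s: float, gradient: float, porosity: float) -> float:
    """Average linear (seepage) velocity v = K * i / n from Darcy's law."""
    return K_cm_s * gradient / porosity

# Threshold case from the abstract: K = 0.05 cm/s, i = 0.005,
# with an assumed porosity n = 0.4 (not stated in the abstract).
v = seepage_velocity_cm_s(0.05, 0.005, 0.4)  # cm/s
m_per_day = v * 86400 / 100                  # convert to m/day -> 0.54
travel_days = (200 * 0.3048) / m_per_day     # 200 ft is about 61 m
```

Under these assumptions the stabilizer crosses the 200 ft treatment area in roughly 113 days, of the same order as the 100-day travel time targeted in the abstract; a lower conductivity or gradient lengthens this proportionally, which is why the quoted values act as feasibility limits.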
780

Stability of Embankments Founded on Soft Soil Improved with Deep-Mixing-Method Columns

Navin, Michael Patrick 25 August 2005 (has links)
Foundations constructed by the deep mixing method have been used to successfully support embankments, structures, and excavations in Japan, Scandinavia, the U.S., and other countries. The current state of practice is that design is based on deterministic analyses of settlement and stability, even though deep mixed materials are highly variable. Conservative deterministic design procedures have evolved to limit failures. Disadvantages of this approach include (1) designs with an unknown degree of conservatism and (2) contract administration problems resulting from unrealistic specifications for deep mixed materials. This dissertation describes research conducted to develop reliability-based design procedures for foundations constructed using the deep mixing method. The emphasis of the research and the included examples are for embankment support applications, but the principles are applicable to foundations constructed for other purposes. Reliability analyses for foundations created by the deep mixing method are described and illustrated using an example embankment. The deterministic stability analyses for the example embankment were performed using two methods: limit equilibrium analyses and numerical stress-strain analyses. An important finding from the research is that both numerical analyses and reliability analyses are needed to properly design embankments supported on deep mixed columns. Numerical analyses are necessary to address failure modes, such as column bending and tilting, that are not addressed by limit equilibrium analyses, which only cover composite shearing. Reliability analyses are necessary to address the impacts of variability of the deep mixed materials and other system components. Reliability analyses also provide a rational basis for establishing statistical specifications for deep mixed materials. 
Such specifications will simplify administration of construction contracts and reduce claims while still providing assurance that the design intent is satisfied. It is recommended that reliability-based design and statistically-based specifications be implemented in practice now. / Ph. D.
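A minimal sketch of the kind of reliability analysis advocated above is a Monte Carlo estimate of the probability that the factor of safety falls below one, given variable deep-mixed material strength. The lognormal strength distribution, the coefficient of variation of 0.5, and the deterministic factor of safety of 1.5 are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative inputs: column strength lognormally distributed with a
# coefficient of variation of 0.5 (deep mixed materials are highly
# variable), and factor of safety proportional to strength, with a
# deterministic design value of FS = 1.5 at the mean strength.
n_trials = 100_000
mean_strength, cov = 1.0, 0.5
sigma_ln = np.sqrt(np.log(1 + cov**2))
mu_ln = np.log(mean_strength) - 0.5 * sigma_ln**2
strengths = rng.lognormal(mu_ln, sigma_ln, n_trials)

fs = 1.5 * strengths / mean_strength  # factor of safety per realization
p_failure = np.mean(fs < 1.0)         # probability that FS drops below 1
```

The point of such an exercise is that a deterministic FS of 1.5 can still carry a substantial failure probability when material variability is high, which is why the deterministic degree of conservatism is "unknown" and why statistically based specifications are recommended.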
