  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Study of friction losses of radially loaded ball bearings

Gartner, Joseph R. January 1961 (has links)
Thesis (M.S.)--University of Wisconsin--Madison, 1961. / Typescript. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 134-136).
2

Heat treating effects on microyield properties of two bearing steels

Lee, William Le Roy. January 1965 (has links)
Thesis (M.S.)--University of Wisconsin--Madison, 1965. / eContent provider-neutral record in process. Description based on print version record. Bibliography: l. 44-45.
3

INVESTIGATION INTO THE LUBRICATION MECHANISM OF THE BALL BEARING CAGE

Thomas Russell (16934733) 08 September 2023 (has links)
<p dir="ltr">This thesis presents an investigation into the mechanism of friction generation and lubrication of cages used in modern Deep Groove Ball Bearings (DGBBs). Although cages provide a necessary function, e.g., ensuring proper spacing between rolling elements during assembly and operation, they also serve as an undesirable source of friction to the overall assembly. Cage friction originates primarily from two sources: i) localized cage pocket friction between the balls and the cage pocket surfaces and ii) churning losses from excess lubricant inside the bearing cavity. Localized cage pocket frictional losses were characterized through the development of a novel Bearing Cage Friction Test Rig (BCFTR). This rig was designed and developed to replicate the orientation and relative motion of a fully assembled bearing in steady state operation while measuring cage friction. The BCFTR uses a six-axis load cell to record forces and torques generated due to a rotating ball inside a rigidly fastened cage segment. The test rig can be set up in two different configurations: i) a load control configuration where a friction coefficient is calculated from a constant force applied between the ball and the cage segment and ii) a position control configuration where frictional torque is measured for specific positions of the ball relative to the cage.</p><p dir="ltr">In order to gain a deeper understanding of the relationship between cage position, lubrication, and friction, an acrylic cage segment with an exact cage pocket geometry was developed and tested on the BCFTR over a broad range of operating conditions. The clear acrylic cage allowed for the visualization of lubricant flow inside the cage pocket. Videos of oil flow revealed that the quantity of oil inside the pocket correlates closely with the measured frictional torque. Oil volume information from the videos was then used as an input to a cage pocket lubrication model. 
The model uses the finite difference method to solve the Reynolds and film thickness equations over a spherically defined cage pocket domain. The model was developed primarily to estimate cage pocket friction and corroborate the results from the BCFTR; however, the model was also used to investigate the pressure distribution and lubricant shear stress in a variety of cage pocket shapes. The finite difference model uses oil volume fraction data to estimate frictional torque and corroborate experimental friction measurements. The results obtained from the model and experiments are in good agreement, confirming that the key information required to estimate cage friction is the quantity of oil inside of the cage pocket.</p><p dir="ltr">The main contribution to overall cage friction in DGBBs can be attributed to local drag from inside the cage pocket; however, there remains an appreciable amount of friction and drag losses due to the interaction of the outside of the cage with oil in the bearing cavity. Because DGBB cages reside in the space between the rolling elements and have a significant effect on the churning behavior of the oil, it is paramount to understand how the size and shape of these cages affect the lubricant flow. To achieve this objective, a series of Computational Fluid Dynamics (CFD) models were developed. A full-scale simulation of the inner cavity of a DGBB was developed to observe fluid flow as a function of bearing geometry, operating conditions, and cage shape. Considerable effort was devoted to optimization studies of the solution method. In addition, an efficient CFD model covering only three rolling elements was also used to investigate fluid flow in a bearing. This model utilized symmetry, periodic boundary conditions, and rotating reference frames to produce equivalent results to the full bearing simulation with a great reduction in computational effort. 
Results from the model were analyzed both qualitatively and quantitatively through the generation of contour maps of pressure and wall shear stress and the calculation of force and drag coefficient values for each cage.</p><p dir="ltr">The final development presented in this thesis is a high-fidelity Dynamic Bearing Model (DBM) capable of resolving local pocket and external cage lubrication effects of bearings in operation. In this dynamic simulation, the motion of the cage was determined using the finite difference method to solve for the pressure generation and resultant forces inside of each cage pocket at each time step. The computational domain of the finite difference model was designed to reflect the specific cage pocket geometry of four common cage designs. Additional testing on the bearing cage friction test rig was performed to characterize the lubrication state inside of each cage. An inverse distance weighting scheme was utilized to predict starvation parameters for a general ball position inside of the cage pocket. Additionally, the fluid drag losses associated with cage lubrication outside of the cage pocket were included in select dynamic simulations in the form of a drag torque applied to the cage. Results from the dynamic simulation reveal new knowledge on the effect of cage geometry and lubrication on dynamic behavior. Compared to simulations without cage lubrication, results from the new DBM demonstrate a reduction in median ball-cage contact force and improved stability in the trajectory of the center of mass of the cage.</p>
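The inverse distance weighting step described above admits a compact sketch. The sample positions, starvation values, and power parameter below are hypothetical placeholders for illustration, not values from the thesis:

```python
import numpy as np

def idw_interpolate(known_pts, known_vals, query_pt, power=2.0, eps=1e-12):
    """Inverse distance weighting: weights fall off as 1/d**power.

    known_pts:  (n, d) array of sampled ball positions in the pocket
    known_vals: (n,) starvation parameter measured at each position
    query_pt:   (d,) general ball position to evaluate
    """
    d = np.linalg.norm(known_pts - query_pt, axis=1)
    if np.any(d < eps):                  # query coincides with a sample point
        return float(known_vals[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * known_vals) / np.sum(w))

# Hypothetical example: four measured pocket positions (2-D for brevity)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([0.2, 0.4, 0.6, 0.8])
print(idw_interpolate(pts, vals, np.array([0.5, 0.5])))  # equidistant → 0.5
```

At the center all four distances are equal, so the result is the plain mean; off-center queries are pulled toward the nearest measured position.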
4

Analysis of Heat Generation and Temperature in High Speed, High Temperature Bearing Balls

Ringger, Hans R. 01 April 1973 (has links)
This thesis reports an investigation of the generation of heat on, and the prediction of temperature of high-speed, dry-film lubricated, stainless steel bearing balls.
5

Air-oil mist lubrication of small bore ball bearings at high speeds

Pinckney, Francis Douglas January 1985 (has links)
Deep groove and angular contact 25 and 30 mm bore ball bearings were tested to high speeds using air-oil mist lubrication. Test conditions included cooling air flow rates of 1.5, 3.0, and 6.0 scfm (0.05, 0.10, and 0.20 kg/min), thrust loads of 50, 75, and 100 lb (222, 334, and 445 N), and a constant radial load of 25 lb (111 N). Steady-state bearing outer race temperature was recorded at various speeds under each set of test conditions. Maximum DN values of 1.9 x 10⁶, 1.5 x 10⁶, 1.4 x 10⁶, and 1.26 x 10⁶ were achieved on the 30 mm deep groove, the 25 mm deep groove, the 25 mm angular contact, and the 30 mm angular contact bearings, respectively. Tests were usually terminated when the stabilized outer race temperature reached approximately 200°F (366 K), although the 30 mm deep groove bearing was operated to 240°F (389 K). A cooling air flow rate of 1.5 scfm (0.05 kg/min) was judged not adequate for high speed bearing operation under the tested conditions. An outer-race temperature prediction equation, based on a regression analysis of the test results, is presented for each test bearing. / M.S.
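A note on the DN values quoted above: DN is the standard rolling-bearing speed index, the bore diameter in millimetres multiplied by the shaft speed in rpm. A minimal sketch (the helper names are our own):

```python
def dn_value(bore_mm, speed_rpm):
    """DN = bore diameter (mm) x shaft speed (rpm), a conventional
    severity index for rolling-bearing lubrication and speed limits."""
    return bore_mm * speed_rpm

def speed_for_dn(dn, bore_mm):
    """Shaft speed (rpm) needed to reach a given DN on a given bore."""
    return dn / bore_mm

# The 30 mm deep groove bearing's reported maximum of 1.9 x 10^6 DN
# corresponds to roughly 63,300 rpm:
print(round(speed_for_dn(1.9e6, 30)))  # → 63333
```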
6

Prediction of the running torque of instrument ball bearings at high speed under combined radial and axial loads

Clarke, George Edward 02 June 2010 (has links)
The purpose of this investigation was to develop an expression to represent the torque versus speed behavior of instrument ball bearings between 1000 and 40,000 rpm with various combinations of radial and axial load ranging between 0 and 200 grams. Because of the lack of experimental data for instrument bearings over any range of speeds, loads and sizes, it was necessary to construct a suitable bearing tester and accumulate the required data. The testers used were based on previous work by H.H. Mabie at Sandia Corporation and G.E. Clarke at V.P.I. The driving source was a small air turbine developed by Mabie which performed smoothly and reliably between 0 and 50,000 rpm. The torque measuring system employed strain gages on a very small beam which was used to sense forces on the stationary outer race of the bearing while the inner race was driven at speed. Each test was conducted from 0 to 40,000 rpm. The radial load took on values of 50, 100, and 200 grams. The axial load was 0, 50, 100, and 200 grams. All combinations of these loads were used for each size bearing. The sizes tested were R-2, R-3, R-4. Six bearings of each size were used, with all six bearings of each size undergoing the same test program in order to yield statistically reasonable averages. Investigation of analytical methods of predicting the running torque indicated that production tolerances of ball bearings rendered such an approach impractical. This led to the development of an empirical expression to predict the running torque within the same range of sizes, loads, and speeds for which experimental test data was obtained. Such an empirical expression was successfully developed and the resulting torque predictions compared with the experimental values of torque. The empirical expression proved capable of predicting the running torques within the envelope of the sample standard deviations for a given bearing size and loading in most cases. 
During the investigation of supplementary topics, it was determined that frictional heating was insignificant during the conduct of the torque tests, which had a duration of approximately two minutes. All tests were at ambient temperature. All tests conducted were with oil lubricant and ribbon retainer ball bearings. There was no evidence that the empirical expression for friction torque developed here was valid when extrapolated beyond the limits of size, load, and speed used in its development. / Ph. D.
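The abstract does not give the form of the empirical torque expression, but the kind of regression it describes can be sketched with a hypothetical power-law model fitted by log-linear least squares. The model form, coefficients, and synthetic data below are assumptions for illustration only:

```python
import numpy as np

# Hypothetical model: running torque T = a * N**b * P**c, with N the speed
# in rpm and P the equivalent load in grams, fitted in log space.
rng = np.random.default_rng(0)
N = rng.uniform(1_000, 40_000, size=200)   # speeds in the tested range
P = rng.uniform(50, 200, size=200)         # loads in the tested range
T = 1e-4 * N**0.6 * P**0.5                 # synthetic "measurements"

# log T = log a + b log N + c log P  →  ordinary least squares
X = np.column_stack([np.ones_like(N), np.log(N), np.log(P)])
coef, *_ = np.linalg.lstsq(X, np.log(T), rcond=None)
log_a, b, c = coef
print(np.exp(log_a), b, c)  # recovers ~1e-4, 0.6, 0.5 on noise-free data
```

With noisy real measurements, the recovered exponents would carry confidence intervals, matching the thesis's point that predictions hold only within the sample standard deviation envelope.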
7

Development of a pneumatic sensor for measuring the torque of instrument ball bearings

Edwards, Earl Garland January 1968 (has links)
Of the studies that have been conducted on the operational characteristics of instrument ball bearings, a great majority have been in accordance with MIL-STD-206. Since tests in compliance with this specification determine bearing quality or rate bearings comparatively, nothing was known of the operational characteristics of the bearings in their final application. A few investigators have developed sensors to study torque characteristics of instrument ball bearings. However, in no case has a report been made of the effect on torque when both radial and axial loads were varied. In seeking to obtain improvements in methods of measuring small torques, a pneumatic sensor was developed for testing R-3 instrument ball bearings under varying radial and axial loads. This sensor was based upon the principle of the flapper-nozzle valve. The flapper valve consisted of two orifices in series, one of constant area, the other of variable area, which was determined by flapper position. Since the pressure between the two orifices was dependent upon flapper position, indirect measurements of torque acting on the flapper were obtained by measuring this pressure. As a result of this study, it was concluded that the pneumatic sensor accurately measured the running torque of R-3 instrument bearings. This statement was based upon good agreement with data from other investigators working under identical conditions. It was also concluded that for a range of 50 to 200 gm. radial loading, no significant effect on torque was observed. For axial loads in the same range, the torque was found to vary in proportion to the equivalent load acting on the bearing. / Master of Science
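The flapper-nozzle principle described above can be sketched with an idealized model of two orifices in series carrying equal incompressible flow (equal discharge coefficients assumed; units arbitrary):

```python
def intermediate_pressure(p_supply, p_ambient, a_fixed, a_variable):
    """Pressure between two orifices in series.

    Equating incompressible orifice flows gives
        A_f**2 * (Ps - P) = A_v**2 * (P - Pa)
    so P = (A_f**2 * Ps + A_v**2 * Pa) / (A_f**2 + A_v**2).
    The variable-orifice area A_v grows as the flapper moves away from
    the nozzle, so the measured pressure P falls toward ambient.
    """
    af2, av2 = a_fixed**2, a_variable**2
    return (af2 * p_supply + av2 * p_ambient) / (af2 + av2)

# Flapper nearly closed (small variable area): P approaches supply.
print(intermediate_pressure(100.0, 0.0, 1.0, 0.1))  # ≈ 99.0
# Flapper wide open: P approaches ambient.
print(intermediate_pressure(100.0, 0.0, 1.0, 3.0))  # → 10.0
```

Because the intermediate pressure is a monotonic function of flapper position, measuring it gives an indirect reading of the torque deflecting the flapper, which is the working principle the abstract describes.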
8

The measurement of the running torque of oil and grease lubricated instrument ball bearings under combined radial and axial loads

Clarke, George Edward January 1966 (has links)
Although many studies have been made on the operating characteristics of instrument bearings, most were conducted at two rpm or less and with thrust load only. A study by H.H. Mabie tested the running torques of radially loaded bearings from 1,000 to 40,000 rpm. The purpose of this investigation was to study the running torques of R-3 size instrument ball bearings at speeds up to 40,000 rpm while under combined radial and axial load. Much of this investigation was devoted to the construction of an accurate torque sensing device. The method employed relied on the amplification of strain gage signals by a strain gage indicator and an x-y plotter. The strain gages were used to detect the strain at the base of a small beam that prevented rotation of the outer race of a test bearing while the inner race was driven at test speed by an air turbine. The accumulated data was the result of 30 test series, with each series consisting of a test sample of six ball bearings. From the study, it was concluded that the strain gage method of torque sensing accurately measured the running torque of R-3 size ball bearings at ambient temperatures. It was also concluded that the effect of axial loading on an R-3 ball bearing loaded with 47 grams radially is negligible until the axial load equals or exceeds the radial loading. By comparing lubricants, it was concluded that grease lubricated ball bearings demonstrate running torques approximately twice as great as bearings lubricated with a similar weight of oil. / Master of Science
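The strain-gage method described above reduces to a cantilever-beam calculation: bending strain at the gage gives the moment there, the moment over the lever arm gives the restraining force, and that force times its radius from the bearing axis gives the running torque. The beam dimensions and strain value below are hypothetical, not the thesis's apparatus:

```python
def torque_from_strain(strain, E, b, h, L_gage, r_contact):
    """Running torque inferred from bending strain at the base of a small
    cantilever restraining the outer race.

    strain    : measured bending strain at the gage (dimensionless)
    E         : beam elastic modulus (Pa)
    b, h      : rectangular beam width and thickness (m)
    L_gage    : distance from gage to where the race pushes on the beam (m)
    r_contact : radius from bearing axis to that contact point (m)
    """
    section_modulus = b * h**2 / 6        # I/c for a rectangular section
    moment_at_gage = E * strain * section_modulus
    force = moment_at_gage / L_gage       # race reaction on the beam
    return force * r_contact              # torque about the bearing axis

# Hypothetical steel beam: 3 mm wide, 0.5 mm thick, gage 10 mm from the
# contact, contact 6 mm off the bearing axis, 100 microstrain measured.
T = torque_from_strain(100e-6, 200e9, 3e-3, 0.5e-3, 10e-3, 6e-3)
print(T)  # running torque, N*m
```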
9

Assembly Optimization for Double Row Ball Bearings

Holland, Michael L. 02 September 1998 (has links)
This thesis is a treatise on optimal assembly methods for double row ball bearings. As with common single row bearings, double row ball bearings consist of four general components, namely, an inner ring, an outer ring, a complement of balls, and a cage or retainer to keep the balls separate. Unlike single row bearings, however, double row ball bearings have two complements of balls in two distinct parallel races. Although this double row configuration is desirable in a number of applications, it makes the bearings more difficult and expensive to assemble. In addition, current manual assembly procedures require a great deal of digital manipulation, leading to concern about carpal tunnel syndrome and other long-term repetitive motion injuries. This thesis attempts to develop an improved assembly process for all types of double row bearings. Although the work is intended to be general, the Torrington 5203 double row ball bearing is adopted as a specific application example. This bearing's assembly difficulties and additional cost are a result of its manual Conrad assembly method and a rubber O-ring and groove used solely for bearing assembly. In the assembly process, the O-ring supports the upper balls temporarily until the two rings can be aligned concentrically, thus snapping the balls into the bearing races. This thesis addresses the replacement of the rubber O-ring and explores opportunities for bearing assembly automation. Design synthesis of a retractable or reusable assembly component to replace the rubber O-ring supporting the upper balls during assembly is presented. A large group of design concepts is developed and evaluated, resulting in a small group of feasible designs. These feasible solutions are then tested, and a design that has the potential for immediate implementation in an improved manual assembly process is proposed. In addition, two design concepts are presented as candidates for possible implementation in an automated assembly process. 
/ Master of Science
10

Studies on the Dynamic Analysis and the Lapping Tracks in the Ball-Lapping Systems

Hwang, Yih-chyun 18 August 2006 (has links)
A general closed-form analytical solution is derived for the lapping tracks and their kinematics in the concentric V-groove lapping system. The lapping tracks on the ball surface for the three contact points are fixed circles, and their lengths are linearly proportional to , , and , respectively. In practice, if the orientation is randomized as the ball enters the lap again, then the distribution of the lapping tracks is dense after many cycles, and the larger the lapping length in each cycle, the smaller the number of cycles required to achieve the maximum lapped area ratio. In the geometry design of ball lapping, the V-groove half-angle should be larger than 45°, but to prevent the splash of abrasives, it should be less than 75°. Since the spin angular speed and spin angle vary continuously with time in the eccentric lapping system, the lapping tracks are not fixed circles. In practice, the lapped areas are complementary at the contact points A and B. The total lapped area ratio is higher than 87% for a slip ratio less than 0.5. Hence, it is possible to lap the entire surface of a ball by changing the slip ratio during the lapping process. Moreover, the larger the V-groove half-angle, the smaller the eccentricity needed to achieve the optimum lapped area ratio. In order to understand the ball motion and ball lapping mechanism in the magnetic fluid lapping system, the force and moment equilibrium equations are derived and solved numerically. As the balls traveling in a train are assumed to be the same size, only one ball is considered in the dynamic analysis. Results show that as the ball separates from the shaft and the float, the spin angle increases quickly and approaches 90°. Hence, the ball changes its attitude and thereby generates new lapping tracks on the ball surface. 
Consequently, after many cycles the lapping tracks cover more of the surface, and this is one of the spherical surface generation mechanisms. Surface waviness of the ball causes a variation in the lapping load. When , it is possible for the ball to separate from the float, and the lapping load is zero during the separation period. No matter how the ball separates from the float, the spin angle always varies within a small range. Hence, only a very small region can be ground due to the effect of the surface waviness. Therefore, it is not the main lapping mechanism of spherical surface generation. In fact, during the lapping process, many balls with different diameters are lapped. To understand the balls' lapping mechanism of spherical surface generation, it is necessary to consider a batch of balls. For a batch of balls with different diameters, the applied load differs from ball to ball. Generally, the larger the diameter of a ball, the larger the friction force between the ball and the shaft and the higher the ball circulation speed. Therefore, collisions between the larger and the smaller balls are possible. To understand the interaction between balls traveling in a train, a dynamic analysis of multiple balls is developed. As the balls interact with each other, the spin angle can change, thereby achieving a larger variation range of the lapping tracks. During the lapping process of a batch of balls, the shaft and the ball can also separate, which causes the ball to change its attitude and achieve more uniform lapping tracks.
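The claim that randomizing the ball's orientation between cycles makes the lapping-track distribution dense can be illustrated with a small Monte Carlo sketch. The track angle, band width, and single-track-per-cycle simplification are assumptions for illustration, not the thesis's parameters:

```python
import numpy as np

def covered_fraction(n_cycles, track_polar_angle, band_halfwidth,
                     n_points=20_000, seed=0):
    """Fraction of a ball's surface lying within a narrow angular band of
    at least one lapping-track circle, when the ball re-enters the lap
    with a uniformly random orientation each cycle.

    Each cycle leaves one circle at polar angle `track_polar_angle` about
    a random axis; a surface point counts as lapped if its angular
    distance to some cycle's axis falls within `band_halfwidth` of that
    polar angle.
    """
    rng = np.random.default_rng(seed)
    pts = rng.normal(size=(n_points, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # uniform on sphere
    covered = np.zeros(n_points, dtype=bool)
    for _ in range(n_cycles):
        axis = rng.normal(size=3)
        axis /= np.linalg.norm(axis)
        ang = np.arccos(np.clip(pts @ axis, -1.0, 1.0))
        covered |= np.abs(ang - track_polar_angle) < band_halfwidth
    return covered.mean()

# Coverage grows toward 1 as cycles accumulate (hypothetical 60 deg track
# latitude, 2 deg band half-width):
for n in (1, 10, 100):
    print(n, covered_fraction(n, np.radians(60), np.radians(2)))
```

Each cycle's band covers only a few percent of the sphere, but because the bands land independently, the uncovered fraction shrinks roughly geometrically with the number of cycles, consistent with the dense-coverage argument above.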
