171. An improved scoring system for the Available Motions Inventory (AMI). Nerhood, Robert C. 10 July 2009.
The role of engineering in the rehabilitation of the disabled has been steadily increasing in recent years. With the enactment of the Americans with Disabilities Act, that role has taken on a new level of importance. Uncomfortable with the qualitative, disability-oriented assessment tools used by their teammates in special education, occupational and physical therapy, and medicine, engineers have come to rely more and more on their own quantitative assessment devices. Among these, the Available Motions Inventory (AMI) has shown great promise as a tool for developing job modifications for moderately disabled individuals. From its seventy-one sub-tests, the AMI provides raw and processed data on an individual's capability to manipulate switches, orient settings, demonstrate strength, and perform light assembly tasks. Included in the output is a weighted set of scores showing the subject's strength, range of motion, and reach/reaction capabilities. However, the AMI has its drawbacks: the scoring algorithms can underestimate the capabilities of subjects who fail to perform certain tasks, and the system does not permit recombination or selective omission of the various sub-tests. This study examined the feasibility of implementing the AMI analysis algorithms in a spreadsheet format in order to better analyze the data generated by persons with limited-range-of-motion disabilities. Ten Subject Matter Experts (SMEs) were asked to analyze a series of data profiles and place the individuals described by the profiles into one of four job options. The data profiles are the AMI scores for individuals falling into one of three categories of disability: normal, hemiplegic, and limited range of motion. The jobs increase in difficulty from a position of non-feasible employment to employment as a pizza chef. The data generated were analyzed using the Sign Test.
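For illustration, the paired comparison described above can be carried out with a standard exact Sign Test. The sketch below uses only the Python standard library; the job-level coding (1 = non-feasible employment through 4 = pizza chef) follows the abstract, but the SME placement data are hypothetical, not the thesis results.

```python
from math import comb

def sign_test(pairs):
    """Two-sided exact sign test on paired placements.

    Each pair is (job level under the current scoring system, job level under
    the updated system), coded 1 = non-feasible employment ... 4 = pizza chef.
    Tied pairs are discarded, as is standard for the sign test.
    """
    diffs = [new - old for old, new in pairs if new != old]
    n = len(diffs)
    if n == 0:
        return None                      # every pair tied; the test is undefined
    k = sum(1 for d in diffs if d > 0)   # number of positive differences
    # exact binomial tail probabilities under H0: P(positive sign) = 0.5
    lower = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    upper = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2.0 * min(lower, upper))

# hypothetical placements from ten SMEs for one limited-range-of-motion profile
placements = [(1, 2), (2, 3), (1, 1), (2, 4), (1, 3),
              (2, 2), (1, 2), (3, 4), (2, 3), (1, 2)]
print(f"two-sided sign test p = {sign_test(placements):.4f}")
```

Ties, where an SME placed the profile identically under both systems, carry no information about the direction of the difference and are dropped before the binomial calculation.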
The results showed that a difference exists between the current scoring system and the updated system in the placement of individuals; in particular, a difference between the systems was established for individuals with a limited range of motion. More importantly, the SMEs frequently chose more complex jobs for individuals with limited range of motion, suggesting that the new system provides a more realistic picture of this category of disabled persons. The results of this research permit more effective use of the AMI by implementing an updated scoring system, which offers several additional benefits during analysis. The scoring system is based on an Excel spreadsheet and is therefore operable in both the Windows (PC) and Apple environments. Better data control and manipulation allows for better representation of an individual's capabilities. The system operates in the same manner as the existing system; however, the spreadsheet design allows for customization of the data output. Finally, it is believed that the use of the new system will increase the chance of job placement for severely disabled individuals with a limited range of motion. / Master of Science
172. Liquid collection efficiencies after supercritical fluid extraction. Thompson, Peter G. 28 August 2003.
The design of any supercritical fluid extraction (SFE) experiment can be broken down into two parts: 1) removal of the analytes of interest from the bulk matrix and 2) deposition of those analytes in a user-friendly form for further studies (spectroscopy or chromatography). While great attention has been paid to the extraction of analytes, less attention has been paid to their efficient collection after extraction. Currently, solid-phase trapping and liquid trapping are available for off-line collection of analytes after SFE.
A polarity test mix consisting of acetophenone, <i>N,N</i>-dimethylaniline, naphthalene, <i>n</i>-decanoic acid, 2-naphthol, and <i>n</i>-tetracosane was spiked onto sand and extracted with supercritical carbon dioxide to evaluate the collection efficiency of various solvents and solvent mixtures. Nine single collection solvents and four mixed collection solvent systems were studied. When one-component collection solvents were employed, quantitative (above 90%) recovery of all analytes was not possible. With mixed collection solvents, recoveries of 90% or better were possible for all analytes.
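A minimal sketch of the percent-recovery calculation behind the 90% criterion above; the analyte list matches the test mix, but the spiked and recovered masses are hypothetical, not the measured data.

```python
# percent recovery = (mass collected after SFE / mass spiked) * 100
spiked_ug = {"acetophenone": 50.0, "N,N-dimethylaniline": 50.0, "naphthalene": 50.0,
             "n-decanoic acid": 50.0, "2-naphthol": 50.0, "n-tetracosane": 50.0}
recovered_ug = {"acetophenone": 47.2, "N,N-dimethylaniline": 45.9, "naphthalene": 48.8,
                "n-decanoic acid": 41.3, "2-naphthol": 46.0, "n-tetracosane": 44.1}

for analyte, spiked in spiked_ug.items():
    recovery = 100.0 * recovered_ug[analyte] / spiked
    note = "quantitative" if recovery >= 90.0 else "below 90%"
    print(f"{analyte:>20}: {recovery:5.1f}%  ({note})")
```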
Additional studies were performed with carbon dioxide modified with 1 or 4% acetonitrile, or with 1, 4, or 8% methanol or toluene. With these extraction fluids, quantitative collection of the analytes with a mixed collection solvent was not possible, but excellent collection efficiencies were observed for collection in hexane and in methanol. / Master of Science
173. A Tidal Prism analysis of the soluble copper mixing zone around the Occoquan Water Treatment Plant. Luettinger, Jason Crosby. 11 June 2009.
The Fairfax County Water Authority applies copper sulfate as an algaecide in the Occoquan Reservoir to control the growth of taste- and odor-causing algae. The Occoquan Water Treatment Plant removes water from the reservoir for distribution to portions of Fairfax County. Approximately once a day, the treatment plant backwashes one of its five clarifier/filters and discharges this backwash water directly into the Occoquan River. Concern has arisen that the plant may be concentrating this copper during treatment, and therefore may be responsible for violations of the toxicity standards for copper in the Occoquan River below the outfall. This investigation attempted to define the extent of the mixing zone around the Occoquan Water Treatment Plant.
A dynamic computer model was developed based upon the 1988 version of the Tidal Prism Model. Simulations were run on the "worst case scenarios" in the Occoquan River in an attempt to define the mixing zone. It is our initial conclusion that the treatment plant discharge has a negligible effect upon the concentrations of dissolved copper that may exist in the Occoquan River. The treatment plant appears to be removing copper from the raw water rather than concentrating it as initially assumed. Because both the flow and the background dissolved copper in the Occoquan River seem to be controlled by the Occoquan Reservoir, the definition of a static mixing zone boundary is not practical. A dynamic boundary, which incorporates the many variables in this system, was proposed as an alternative. / Master of Science
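The 1988 Tidal Prism Model itself is not reproduced here, but the tidally averaged flushing balance that such models build on can be sketched as a simple box model. Everything below (segment volume, tidal prism, return ratio, copper load) is a hypothetical illustration, not the Occoquan simulation.

```python
def copper_after_n_cycles(c0_ug_l, background_ug_l, segment_volume_m3,
                          tidal_prism_m3, return_ratio, load_ug_per_cycle, n_cycles):
    """Tidally averaged box-model sketch for dissolved copper in one river segment.

    Each tidal cycle, a volume equal to the tidal prism leaves on the ebb and a
    fraction (return_ratio) of it comes back on the flood, so the net flushing
    volume per cycle is tidal_prism_m3 * (1 - return_ratio).
    """
    litres_per_m3 = 1000.0
    flushed_m3 = tidal_prism_m3 * (1.0 - return_ratio)
    c = c0_ug_l
    for _ in range(n_cycles):
        # add the backwash discharge load, fully mixed over the segment volume
        c += load_ug_per_cycle / (segment_volume_m3 * litres_per_m3)
        # replace the flushed fraction with water at the background concentration
        c = (c * (segment_volume_m3 - flushed_m3)
             + background_ug_l * flushed_m3) / segment_volume_m3
    return c

# hypothetical numbers: 500 g of copper per backwash into a 200,000 m3 segment
c_final = copper_after_n_cycles(c0_ug_l=3.0, background_ug_l=3.0,
                                segment_volume_m3=2.0e5, tidal_prism_m3=4.0e4,
                                return_ratio=0.5, load_ug_per_cycle=5.0e8,
                                n_cycles=50)
print(f"dissolved Cu after 50 tidal cycles: {c_final:.1f} ug/L")
```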
174. NO<sub>x</sub> reduction for natural gas engines with increased ignition energy and plasma jet ignitors. Ochel, Ralf. 12 June 2009.
Five plasma jet ignitor designs were tested on a Waukesha ASTM-CFR engine fueled with natural gas. The pollutant emissions, fuel and air flow rates, and dynamic cylinder pressure were measured over the full range of air/fuel ratios. From these measurements the indicated power and specific fuel consumption were calculated. The energy for the ignitors was provided by a variable high-energy ignition system, and each ignitor was supplied with 0.00, 0.08, 0.32, 0.72, and 1.28 joules of energy in addition to that provided by the standard ignition coil. To differentiate between the benefits gained by use of the plasma jet ignitors and those due to the higher ignition energies, an ordinary spark plug was also tested with added ignition energies.
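A sketch of how indicated power and indicated specific fuel consumption are typically obtained from a measured cylinder pressure trace and fuel flow rate; the coarse pressure-volume loop and operating numbers below are hypothetical, not the Waukesha CFR data.

```python
def indicated_power_kw(pressures_pa, volumes_m3, engine_speed_rpm):
    """Indicated power from one closed cycle of p-V data for a four-stroke engine.

    The indicated work per cycle is the closed p-V loop integral, evaluated here
    with the trapezoidal rule; a four-stroke engine completes one cycle every
    two crankshaft revolutions.
    """
    work_j = 0.0
    n = len(pressures_pa)
    for i in range(n):
        j = (i + 1) % n                      # wrap around to close the loop
        work_j += 0.5 * (pressures_pa[i] + pressures_pa[j]) * (volumes_m3[j] - volumes_m3[i])
    cycles_per_second = engine_speed_rpm / 60.0 / 2.0
    return work_j * cycles_per_second / 1000.0

def isfc_g_per_kwh(fuel_flow_kg_per_h, power_kw):
    """Indicated specific fuel consumption in g/kWh."""
    return fuel_flow_kg_per_h * 1000.0 / power_kw

# hypothetical, very coarse p-V loop (six samples) and operating point
volumes = [6.0e-4, 4.0e-4, 2.0e-4, 1.0e-4, 2.0e-4, 4.0e-4]     # m3
pressures = [1.0e5, 1.8e5, 4.5e5, 3.0e6, 1.2e6, 4.0e5]          # Pa
p_ind = indicated_power_kw(pressures, volumes, engine_speed_rpm=900)
print(f"indicated power = {p_ind:.2f} kW, ISFC = {isfc_g_per_kwh(0.3, p_ind):.0f} g/kWh")
```

In practice the cylinder pressure would be sampled at fine crank-angle increments and converted to volume through the slider-crank geometry; the six-point loop here only demonstrates the integration step.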
The goal of the experiment was to find an ignitor that could be used to extend the lean operating limit of a natural gas fueled engine, so that the emission of NO<sub>x</sub> and other pollutants could be reduced. The following table shows the optimum pollutant emission reductions achieved by the use of the most effective plasma jet ignitor and the high-energy spark plug, compared with the emissions from the engine when operated with the standard-equipment spark plug near stoichiometric conditions. The plasma jet ignitor for which the results are displayed in this table consisted of an 83 mm³ cavity and a 118 mm³ ejector, both of which were insulated with ceramic cylinders. / Master of Science
175. Gas permeability of polyimide/polysiloxane block copolymers. Mecham, Sue Jewel. 11 June 2009.
A series of perfectly alternating polyimide/poly(dimethylsiloxane) microphase-separated block copolymers ranging from 0 to 50 wt % poly(dimethylsiloxane) were measured for permeability characteristics. The polyimide segment of the copolymers was based on oxydiphthalic dianhydride (ODPA) and 1,4-bis(4-amino-1,1-dimethylbenzyl)benzene (Bis P). The polysiloxane was an aminopropyl-terminated poly(dimethylsiloxane). Randomly segmented block copolymers of approximately 20 wt % poly(dimethylsiloxane) with different segment lengths, based on the same materials, were also studied for comparison with the perfectly alternating versions of the same block copolymers. Permeability measurements were performed on tough, microphase-separated, transparent films with O₂, N₂, CH₄, and CO₂ gases, in that order. The effects of the chemical composition and block lengths on permeability coefficients and selectivity values were evaluated. The permeability of the copolymer films to gases was found to be highly sensitive to the morphology of the copolymer. The morphology was found to be controlled by varying the amount and the segment length of each component, and this allowed for fine control of the permeability characteristics. Conversely, the measurement of permeability characteristics can lead to more information about the morphology of complicated microphase-separated block copolymers. / Master of Science
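A small sketch of the standard way permeability coefficients and ideal selectivities are computed from steady-state pure-gas flux measurements; the film thickness, driving pressure, and flux values below are hypothetical, not the copolymer data reported here.

```python
BARRER = 1.0e-10   # cm3(STP) * cm / (cm2 * s * cmHg)

def permeability_barrer(flux_cm3stp_per_cm2_s, thickness_cm, delta_p_cmhg):
    """Permeability coefficient P = (steady-state flux * film thickness) / pressure difference."""
    return flux_cm3stp_per_cm2_s * thickness_cm / delta_p_cmhg / BARRER

# hypothetical steady-state pure-gas fluxes through a 25 um (2.5e-3 cm) film,
# with a 76 cmHg (1 atm) pressure difference across the film
fluxes = {"O2": 6.1e-6, "N2": 1.4e-6, "CH4": 2.0e-6, "CO2": 3.0e-5}   # cm3(STP)/(cm2*s)
perm = {gas: permeability_barrer(f, 2.5e-3, 76.0) for gas, f in fluxes.items()}

for gas, p in perm.items():
    print(f"P({gas}) = {p:6.1f} Barrer")
# ideal selectivity is the ratio of pure-gas permeability coefficients
print(f"alpha(O2/N2)   = {perm['O2'] / perm['N2']:.1f}")
print(f"alpha(CO2/CH4) = {perm['CO2'] / perm['CH4']:.1f}")
```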
176. Modified atmosphere packaging of hard grated cheeses. Yoder, Jonna D. 21 July 2009.
The objective of this study was to use modified atmosphere packaging (MAP) technology to produce safe, shelf-stable, high-quality, hard grated cheeses that do not require preservatives or refrigeration during distribution and sale.
Initially, a challenge study with Staphylococcus aureus (S. aureus) was conducted to determine the water activity (Aw) level of high-moisture cheeses necessary to prevent the growth of a food pathogen when packaged under a modified atmosphere (25% CO₂ and 75% N₂). Other microbial analyses included mold and yeast enumerations. Secondly, product quality and shelf stability were determined biweekly by sensory, microbial, and instrumental analysis to evaluate product safety and changes in the natural aromas and flavors of hard grated cheeses. CIE L*, a*, b* values were determined by instrumental color analysis to measure color changes.
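The instrumental color changes can be summarized with the CIE 1976 ΔE*ab color-difference formula; a minimal sketch with hypothetical L*, a*, b* readings, not the measured cheese values.

```python
from math import sqrt

def delta_e_ab(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance between two (L*, a*, b*) points."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

fresh = (92.1, -1.2, 17.5)    # hypothetical L*, a*, b* for freshly grated cheese
week8 = (89.4, -0.6, 21.3)    # hypothetical readings after 8 weeks of MAP storage
print(f"delta E*ab after 8 weeks: {delta_e_ab(fresh, week8):.2f}")
```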
Parmesan cheese with high Aw levels (Aw = 0.90 and 0.88) supported the growth and survival of S. aureus. The microorganism was incapable of surviving at Aw levels of 0.86 and below. S. aureus was not able to survive on Romano cheese. Mold and yeast proliferated on higher-Aw Parmesan cheeses. Visible mold was detected on the Parmesan sample at Aw = 0.90 after 8 weeks of storage. No mold growth was observed on Romano cheese. However, yeast were capable of growing on Romano cheese.
The sensory evaluation study of hard grated cheeses was unable to detect a difference between the fresh cheese sample and the cheeses packaged under MAP. / Master of Science
177. An image quality analysis of ANVIS-6 night vision goggles. Abel, Derek H. 10 November 2009.
This study was undertaken in an effort to relate ANVIS-6 Night Vision Goggle image quality to user performance. The purpose was to determine which of five image quality metrics best related to performance tasks. The image quality metrics examined were Modulation Transfer Function Area (MTFA), Integrated Contrast Sensitivity (ICS), Square Root Integral (SQRI), Resolution, and Signal-to-Noise Ratio (SNR). The performance tasks were detection and recognition of targets under various levels of moon illumination. The metric that best related to target detection was SNR; this result is consistent with findings from visual psychophysics on signal-to-noise effects. The metric that best related to target recognition was Resolution; this result is consistent with the position that recognition performance for suprathreshold targets improves as resolving power increases. / Master of Science
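As a concrete example of one of the metrics compared, the Modulation Transfer Function Area can be approximated as the area between the system MTF and the observer's contrast-threshold curve. The sketch below uses hypothetical sampled curves, not the ANVIS-6 measurements.

```python
def mtfa(freqs_cpd, mtf, threshold):
    """Approximate MTFA: area between the MTF and the contrast-threshold curve.

    All three sequences are sampled at the same spatial frequencies
    (cycles/degree); only the region where the MTF exceeds the threshold
    contributes, and the area is evaluated with the trapezoidal rule.
    """
    area = 0.0
    for i in range(len(freqs_cpd) - 1):
        d0 = max(mtf[i] - threshold[i], 0.0)
        d1 = max(mtf[i + 1] - threshold[i + 1], 0.0)
        area += 0.5 * (d0 + d1) * (freqs_cpd[i + 1] - freqs_cpd[i])
    return area

# hypothetical sampled curves for an image-intensifier system and a human observer
freqs     = [0, 2, 4, 8, 12, 16, 20, 24]                         # cycles/degree
mtf       = [1.00, 0.92, 0.80, 0.55, 0.35, 0.20, 0.10, 0.05]
threshold = [0.02, 0.02, 0.03, 0.05, 0.09, 0.16, 0.30, 0.55]
print(f"MTFA = {mtfa(freqs, mtf, threshold):.1f}")
```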
178. Relaxation mechanism in methyl stearate monolayer films at the air/water interface. Tiwari, Rajesh Kumar. 11 June 2009.
A monolayer film of methyl stearate was compressed until catastrophic film collapse took place. Surface pressure relaxation was then followed as a function of time. Investigation of the effects of film compression beyond the collapse pressure revealed an important process in the surface pressure relaxation mechanism. When the monolayer was compressed beyond the collapse pressure and then held at a constant area, the surface pressure relaxation, in a plot of surface pressure versus time, was delayed during the initial stage of the process. A similar delay in the surface pressure relaxation was also observed for a monolayer film of methyl stearate when it was compressed to 40 mN/m, below the collapse pressure, and held there for some time before being allowed to relax under a constant-area condition. A relaxation mechanism has been proposed to explain the delay observed during the surface pressure relaxation at constant area: at collapse, the monolayer film buckles and folds over to form bilayer molecular channels (ridges and ribbons). The ridges and ribbons act as a reservoir of monolayer material that makes up for molecules lost from the air/water interface to the growth of a bulk (crystalline) phase under a constant-area condition.
The results from temperature dependence studies as well as from the area-relaxation experiments strongly support the proposed relaxation mechanism. The Langmuir-Blodgett films of methyl stearate, deposited before and after the catastrophic film collapse, revealed interesting structural features of the collapsed film.
The experimental results from the pressure-time, area-time, and pressure-area isotherms strongly suggest that the methyl stearate monolayer film undergoes an organized film collapse. This work helps to better understand the relaxation mechanism in monolayer films at the air/water interface. / Master of Science
179. Ethics in Congress. McDanal, Charles E. 23 June 2009.
Public confidence in Congress has dropped to an all-time low, and yet the notion that Congress is full of "crooks" does not seem to bear close scrutiny. Instances of ethical impropriety that were significant either for setting constitutional precedent or for providing the impetus for reform are reviewed individually. This is followed by a discussion of the attempts at reform and of the ethics codes that were established. Information about documented offenses and the members involved is then analyzed in an attempt to examine potential explanations for and influences on the types of offense committed, by whom, and to what outcome. Finally, this thesis closes with conclusions regarding the direction of, the need for, and the efficacy of the ethics reforms that have been undertaken. This research shows that much of the opportunity for gross misbehavior has been limited by changes occurring in the latter part of the twentieth century. A review of these incidents and of reform attempts indicates a relatively small proportion of misdeeds and some fairly strong efforts to eliminate the most egregious forms of misbehavior. Even though Congress has often been accused of doing little to police itself unless forced, the fact that it does act in the more extreme cases sits well with a desire to prevent witch hunts that would induce legislative paralysis. The data collected on those who have had their misdeeds brought to light fail to show significant effects of party membership or offense type on whether the member remained in Congress. Small but statistically significant effects were found for election year and institution served. These data seem to indicate that while the system may not be perfect, it does work. Almost eighty percent of those accused of improper conduct left Congress soon thereafter. The overall negative perception of the American public toward Congress tends to perpetuate the notion that Congress is corrupt, when in reality only a small proportion of members appear to engage in unethical behavior. / Master of Arts
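The reported test of party membership against whether a member remained in Congress is the kind of question a contingency-table chi-square test addresses; the sketch below uses hypothetical counts, not the thesis data, and the thesis does not necessarily use this particular test.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            chi2 += (observed - expected) ** 2 / expected
    return chi2   # compare with 3.84, the df = 1 critical value at alpha = 0.05

# hypothetical counts: rows = party (Democrat, Republican),
# columns = (remained in Congress, left Congress)
table = [[14, 21],
         [11, 19]]
print(f"chi-square = {chi_square_2x2(table):.2f} (df = 1)")
```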
180. An experimental apparatus for the measurement of moisture permeability of building materials. Mosier, Roger Carhart. 10 July 2009.
An experimental apparatus was built and operated for the measurement of moisture permeability of building materials. The data are intended for use in resolving problems associated with moisture buildup in porous building materials. The apparatus is capable of maintaining simultaneous humidity and temperature differences across a test specimen. In contrast with existing experimental methods, the relative humidity on either side of the specimen is controlled without the use of quiescent saturated salt solutions. Forced-air convection at the surface of the specimen is used, resulting in spatially uniform conditions and faster results. Data are obtained for fiberboard sheathing at various temperature and humidity setpoints.
The apparatus consists of two environmental chambers between which a wood-based test specimen is sealed. An external forced-air conditioning system using distilled water and molecular sieve desiccant humidifies or dries the chamber air as needed. The moisture transfer rate across the specimen is determined gravimetrically: the desiccant column is weighed to measure its change in mass as a result of moisture diffusion across the specimen. The apparatus is capable of maintaining relative humidities over a range of 5 to 65 percent RH, with a temperature difference across the specimen of up to 20°C. Furthermore, the apparatus is capable of automated relative humidity and temperature control to within ±0.5 percent RH and ±0.5°C of the setpoints, respectively.
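A sketch of the gravimetric permeance calculation implied above: the steady-state rate of desiccant mass gain divided by specimen area and vapor-pressure difference gives permeance, and multiplying by thickness gives permeability. The Magnus saturation-pressure approximation and all test numbers below are assumptions for illustration, not the fiberboard results.

```python
from math import exp

def p_sat_pa(temp_c):
    """Saturation vapor pressure of water (Magnus approximation), in Pa."""
    return 610.78 * exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_permeance(mass_gain_kg, hours, area_m2, rh_warm, t_warm_c, rh_cold, t_cold_c):
    """Water vapor permeance in kg/(s*m2*Pa) from a gravimetric measurement.

    mass_gain_kg is the change in desiccant-column mass over the test period,
    taken as the moisture that diffused through the specimen at steady state.
    """
    delta_pv_pa = rh_warm * p_sat_pa(t_warm_c) - rh_cold * p_sat_pa(t_cold_c)
    rate_kg_per_s = mass_gain_kg / (hours * 3600.0)
    return rate_kg_per_s / (area_m2 * delta_pv_pa)

# hypothetical test: 0.85 g gained in 48 h through a 0.10 m2 specimen,
# 50% RH at 35 C on the warm side and 20% RH at 20 C on the cold side
permeance = vapor_permeance(0.85e-3, 48.0, 0.10, 0.50, 35.0, 0.20, 20.0)
thickness_m = 0.0127    # 12.7 mm thick specimen (hypothetical)
print(f"permeance    = {permeance:.3e} kg/(s*m2*Pa)")
print(f"permeability = {permeance * thickness_m:.3e} kg/(s*m*Pa)")
```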
Test results for fiberboard sheathing subjected to a range of humidity and temperature conditions are presented. Results are compared with the limited data from the literature. Recommendations for improvement of the data measurement methods are included. / Master of Science