71 |
DETERMINATION OF AN OPTIMAL DATA BUS ARCHITECTURE FOR A FLIGHT DATA SYSTEM. Crawford, Kevin; Johnson, Martin. October 2001 (has links)
International Telemetering Conference Proceedings / October 22-25, 2001 / Riviera Hotel and Convention Center, Las Vegas, Nevada / NASA/Marshall Space Flight Center (MSFC) is continually looking for methods to reduce cost
and schedule while keeping the quality of work high. MSFC is NASA’s lead center for space
transportation and microgravity research. When supporting NASA's programs, several decisions
concerning the avionics system must be made. Usually, many trade studies must be conducted to
determine the best way to meet the customer's requirements. When defining the flight data
system, one of the first trade studies normally conducted is the determination of the data bus
architecture. The schedule, cost, reliability, and environments are some of the factors that are
reviewed in the determination of the data bus architecture. Based on the studies, the data bus
architecture could result in a proprietary data bus or a commercial data bus. The cost factor
usually removes the proprietary data bus from consideration. The commercial data bus
architectures range from Versa Module Eurocard (VME) to Compact PCI to STD 32 to PC 104. If
cost, schedule, and size are the prime factors, VME is usually not considered, leaving Compact
PCI, STD 32, and PC 104 as the candidate data bus architectures.
MSFC’s center director has funded a study from his discretionary fund to determine an optimal
low cost commercial data bus architecture. The goal of the study is to functionally and
environmentally test Compact PCI, STD 32 and PC 104 data bus architectures. This paper will
summarize the results of the data bus architecture study.
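Although the study's actual evaluation criteria and results are reported in the paper itself, the kind of trade study described above is often reduced to a weighted scoring exercise. The sketch below is purely illustrative; the weights, scores, and factor values are invented, not MSFC's.

```python
# Illustrative weighted trade study for candidate data bus architectures.
# All weights and per-factor scores below are hypothetical.

def trade_score(scores, weights):
    """Weighted sum of per-factor scores (higher is better)."""
    return sum(scores[f] * w for f, w in weights.items())

# Hypothetical factor weights: cost and schedule dominate, as in the text.
weights = {"cost": 0.35, "schedule": 0.25, "size": 0.20, "reliability": 0.20}

candidates = {
    "Compact PCI": {"cost": 7, "schedule": 8, "size": 6, "reliability": 8},
    "STD 32":      {"cost": 8, "schedule": 7, "size": 7, "reliability": 7},
    "PC 104":      {"cost": 9, "schedule": 7, "size": 9, "reliability": 6},
}

ranked = sorted(candidates, key=lambda c: trade_score(candidates[c], weights),
                reverse=True)
```

In practice the factor weights themselves come out of requirements analysis; changing them (e.g. weighting reliability over cost) can reorder the candidates entirely, which is why the study tests the architectures functionally and environmentally rather than relying on scores alone.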
|
72 |
'Les cent nouvelles nouvelles': a linguistic study of MS Glasgow Hunter 252. Roger, Geoffrey. January 2011 (has links)
MS Glasgow Hunter 252 is the sole surviving manuscript copy of the 'Cent Nouvelles Nouvelles'. The present PhD thesis, funded by a Glasgow University Scholarship and supervised by James Simpson and Peter Davies, explores the language of this collection of bawdy tales, attributed to the court of Philippe III de Bourgogne (1396-1467). Most existing studies on the language of the 'Cent Nouvelles Nouvelles' have offered a literary (e.g. stylistic, narratological) perspective, and very few have considered the document within the wider context of French historical linguistics. The present thesis aims to fill this gap by:
• Presenting elements of linguistic interest within the document (dialectalisms, archaisms, rare features, cultural references, etc.), through a comprehensive survey of phonology, morphology, syntax and vocabulary.
• Expanding and reassessing existing theories on orthographic standardisation and dialectal input in written and, more speculatively, spoken Middle French.
• Providing scriptological evidence towards the localisation of other textual resources within the online 'Dictionnaire du moyen français (1330-1500)'.
• Investigating the authenticity of the mise-en-scène of the 'Cent Nouvelles Nouvelles'; reflecting on linguistic practices and note-taking at the Court of Burgundy.
• Exploring spoken language as rendered by direct speech passages, with special consideration of linguistic variation and stereotyping.
• Publishing textual databases for future analysis (tables of main spelling variants, alphabetical list of words, etc.).
|
73 |
Theoretical Investigation of Thermodiffusion (Soret Effect) in Multicomponent Mixtures. Abbasi, Alireza. 23 February 2011 (has links)
Thermodiffusion is one of the mechanisms in transport phenomena in which molecules are transported in a multicomponent mixture driven by temperature gradients. Thermodiffusion in associating mixtures presents a larger degree of complexity than non-associating mixtures, since the direction of flow in associating mixtures may change with variations in composition and temperature. In this study a new activation energy model is proposed for predicting the ratio of evaporation energy to activation energy. The new model has been implemented for prediction of thermodiffusion for acetone-water, ethanol-water and isopropanol-water mixtures. In particular, a sign change in the thermodiffusion factor for associating mixtures has been predicted, which is a major step forward in modeling of thermodiffusion for associating mixtures.
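For readers unfamiliar with the terminology, the standard phenomenological definitions (textbook convention, not taken from the thesis itself) express the mass flux of component 1 in a binary mixture as a Fickian term plus a thermodiffusive term, with the Soret coefficient as their ratio:

```latex
\vec{J}_1 = -\rho D \,\nabla c_1 \;-\; \rho D_T\, c_1 (1 - c_1)\, \nabla T,
\qquad
S_T = \frac{D_T}{D}
```

A sign change in $S_T$ (equivalently, in the thermodiffusion factor) reverses the direction in which component 1 migrates along the temperature gradient, which is the behaviour the thesis predicts for associating mixtures.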
In addition, a new model for the prediction of thermodiffusion coefficients for linear chain hydrocarbon binary mixtures is proposed using the theory of irreversible thermodynamics and a kinetics approach. The model predicts the net amount of heat transported based on an available volume for each molecule. This model has been found to be the most reliable and represents a significant improvement over earlier models. Also, a new approach to predicting the Soret coefficient in binary mixtures of linear chain and aromatic hydrocarbons using the thermodynamics of irreversible processes is presented. This approach is based on a free volume theory which explains the diffusivity in diffusion-limited systems. The proposed model combined with the Shukla and Firoozabadi model has been applied to predict the Soret coefficient for binary mixtures of toluene and n-hexane, and benzene and n-heptane. Comparisons of theoretical results with experimental data show good agreement. The proposed model has also been applied to estimate thermodiffusion coefficients of binary mixtures of n-butane & carbon dioxide and n-dodecane & carbon dioxide at different temperatures. The results have also been incorporated into the CFD software FLUENT for three-dimensional simulations of thermodiffusion and convection in porous media. The predictions show that the thermodiffusion phenomenon is dominant at low permeabilities (0.0001 to 0.01), but as the permeability increases, convection plays an important role in establishing the concentration distribution.
Finally, the activation energy in Eyring’s viscosity theory is examined for associating mixtures. Several methods are used to estimate the activation energy of pure components and then extended to mixtures of linear hydrocarbon chains. The activation energy model based on alternative forms of Eyring’s viscosity theory is implemented to estimate the thermodiffusion coefficient for hydrocarbon binary mixtures. Comparisons of theoretical results with the available thermodiffusion coefficient data have shown a good performance of the activation energy model.
|
74 |
Development of a PC-Based Object-Oriented Real-Time Robotics Controller. Tran, Hang. January 2005 (has links)
The industrial world of robotics requires leading-edge controllers to match the speed of new manipulators. At the University of Waterloo, a three degree-of-freedom ultra-high-speed cable-based robot called Deltabot was created. In order to improve the performance of the Deltabot, a new controller called the QNX Multi-Axis Robotic Controller (QMARC) was developed. QMARC is a PC-based controller built as a replacement for the existing commercial controller, PMAC, manufactured by Delta Tau Data Systems. Although the PMAC has its own real-time processor, its rigid and complex internal structure makes it difficult to apply advanced control algorithms and interpolation methods. Adding unconventional hardware to the PMAC, such as a camera and vision system, is also quite challenging. With the development of QMARC, the flexibility issue of the controller is resolved. QMARC's open-sourced, object-oriented software structure allows the addition of new control and interpolation techniques as required. In addition, the software structure of the main Controller process is decoupled from the hardware, so that any hardware change affects only the hardware drivers, not the main controller. QMARC is also equipped with a user-friendly graphical user interface and many safety protocols, making it a safe and easy-to-use system.
Experimental tests have proven QMARC to be a safe and reliable controller. The stable software foundation created by QMARC will allow future development of the controller as research on the Deltabot progresses.
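The hardware decoupling described above can be sketched as an abstract driver interface that the controller programs against; concrete hardware (or a simulator) plugs in behind it. All class and method names here are hypothetical illustrations, not QMARC's actual API.

```python
# Sketch of controller/hardware decoupling: the controller depends only on
# an abstract driver interface, so swapping hardware (or adding a camera)
# touches only the driver layer. Names are invented for illustration.
from abc import ABC, abstractmethod

class AxisDriver(ABC):
    """Abstract interface between the controller and any motion hardware."""

    @abstractmethod
    def command(self, position: float) -> None: ...

    @abstractmethod
    def feedback(self) -> float: ...

class SimulatedAxis(AxisDriver):
    """Stand-in hardware: echoes commanded positions back as feedback."""
    def __init__(self):
        self._pos = 0.0
    def command(self, position: float) -> None:
        self._pos = position
    def feedback(self) -> float:
        return self._pos

class Controller:
    """Holds only AxisDriver references, never concrete hardware types."""
    def __init__(self, axes):
        self.axes = axes
    def move(self, targets):
        for axis, target in zip(self.axes, targets):
            axis.command(target)
        return [axis.feedback() for axis in self.axes]
```

With this structure, replacing `SimulatedAxis` with a real driver class leaves `Controller` untouched, which is the flexibility benefit the abstract claims.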
|
75 |
A PC-Based Data Acquisition and Compact Disc Recording System. Bretthauer, Joy W.; Davis, Rodney A. November 1995 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / The Telemetry Data Distribution System (TDDS) addresses the need to record, archive, and distribute sounding rocket and satellite data on a compact, user-friendly medium, such as CD-Recordable discs. The TDDS also archives telemetry data on floppy disks, nine-track tapes, and magneto-optical disc cartridges. The PC-based, semi-automated TDDS digitizes, time stamps, formats, and archives frequency-modulated (FM) or pulse-code-modulated (PCM) telemetry data. An analog tape or a real-time signal may provide the telemetry data source. The TDDS accepts IRIG A, B, G, H, and NASA 36 analog code sources for time stamp data. The output time tag includes time, frame, and subframe status information. Telemetry data may be time stamped based upon a user-specified number of frames, subframes, or words. Once the data is recorded, the TDDS performs data quality testing, formatting, and validation, and logs the results automatically. Telemetry data is quality checked to ensure a good analog source track was selected. Raw telemetry data is formatted by dividing the data into records and appending header information. The formatted telemetry data is validated by checking consecutive time tags and subframe identification counter values (if applicable) to identify data drop-outs. After validation, the TDDS archives the formatted data to any of the following media types: CD-Recordable (CD-R) disc (650 megabytes capacity), nine-track tape (180 megabytes capacity), and erasable optical disc (499 megabytes capacity). Additionally, previously archived science data may be re-formatted and archived to a different output medium.
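As an illustration of the validation step described above, checking consecutive time tags for drop-outs amounts to flagging inter-frame gaps that exceed the nominal frame period. The tag representation and tolerance below are assumptions for the sketch, not the actual TDDS record format.

```python
# Illustrative drop-out check in the spirit of the TDDS validation step:
# consecutive frame time tags should advance by roughly one frame period;
# larger gaps indicate missing frames. Tolerances here are hypothetical.

def find_dropouts(time_tags, frame_period, tolerance=0.5):
    """Return (index, gap) pairs where the gap between consecutive time
    tags exceeds the nominal frame period by more than `tolerance` of a
    period (i.e. gap > frame_period * (1 + tolerance))."""
    dropouts = []
    for i in range(1, len(time_tags)):
        gap = time_tags[i] - time_tags[i - 1]
        if gap > frame_period * (1.0 + tolerance):
            dropouts.append((i, gap))
    return dropouts
```

A real implementation would also consult the subframe identification counter, as the abstract notes, since a counter discontinuity catches drop-outs even when time tags happen to look plausible.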
|
76 |
Digitally Recorded Data Reduction on a PC Using CAPS. Rarick, Michael J.; Lawrence, Ben-z. November 1995 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / The Common Airborne Processing System (CAPS) provides a general purpose data reduction capability for digitally recorded telemetry data on a cost-efficient platform. Telemetry data can be imported from a variety of formats into the CAPS standard file format. Parameter dictionaries describing raw data structures and output product descriptions describing the desired outputs can be created and edited from within CAPS. All of this functionality is performed on an IBM compatible personal computer within the framework of the graphical user interface provided by Microsoft Windows.
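The "parameter dictionary" idea, a description of where each parameter lives in the raw data so imported frames can be decoded, can be sketched as follows. The field names, offsets, and formats are invented for illustration; CAPS's actual dictionary and file formats are not described here.

```python
# Sketch of a parameter dictionary mapping parameter names to their
# location and binary encoding within one raw telemetry frame.
# Layout below is hypothetical, not the CAPS standard file format.
import struct

# parameter name -> (byte offset within frame, struct format string)
PARAM_DICT = {
    "altitude": (0, "<H"),   # unsigned 16-bit, little-endian
    "airspeed": (2, "<H"),
    "status":   (4, "<B"),   # unsigned 8-bit
}

def decode_frame(frame: bytes, param_dict=PARAM_DICT):
    """Extract each dictionary parameter from one raw frame."""
    return {name: struct.unpack_from(fmt, frame, offset)[0]
            for name, (offset, fmt) in param_dict.items()}
```

Keeping the layout in data rather than code is what lets a tool like CAPS be edited from within the program itself: importing a new raw format means writing a new dictionary, not new decoding logic.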
|
77 |
Begåvningstest med persondator och smarttelefon - finns det skillnader? (Ability testing with personal computer and smartphone - are there differences?). Forsgren, Helena. January 2016 (has links)
The easy access to the Internet, computers and mobile devices brings new technological means of ability testing. The aim of this thesis is to investigate whether there is any difference in results when an ability test of deductive reasoning is administered on two different platforms, computer and smartphone. In an experiment with a within-subjects design, the differences between the platforms in test-score level and usability were investigated. In addition, the test-retest reliability of the ability test was assessed. No significant differences between the scores from the two platforms were observed. However, test-retest reliability was low, ranging from r = .372 for smartphone to r = .526 for computer. The low reliability can explain why the score levels on the different platforms could not be distinguished. The usability results indicate that the participants preferred taking the test on a computer rather than on a smartphone; even the most experienced smartphone users rated the smartphone-delivered test unfavourably. It is recommended to replicate the experiment with a larger sample, and also with other handheld platforms, such as tablet PCs.
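Test-retest reliability, as reported above, is the Pearson correlation between scores from two administrations of the same test to the same participants. A minimal computation with made-up scores (the thesis's own data are not reproduced):

```python
# Pearson correlation between two test sessions; values near 1 indicate
# high test-retest reliability. Scores below are invented for illustration.

def pearson_r(x, y):
    """Pearson product-moment correlation of paired samples x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Hypothetical deductive-reasoning scores, first vs. second session:
session1 = [12, 15, 9, 20, 14, 11]
session2 = [13, 14, 10, 18, 15, 12]
r = pearson_r(session1, session2)
```

Coefficients like the reported r = .372 and r = .526 fall well below the levels usually expected of ability tests, which is why the thesis treats the platform comparison as inconclusive.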
|
78 |
An investigation of technical support issues influencing user satisfaction. Gutierrez, Charletta Frances. 05 1900 (has links)
The widespread distribution of personal computers (PCs) throughout organizations has made a substantial impact on information systems. Additionally, the tremendous growth of the Internet has changed the way business is carried out. As the user population evolves into a much more technical and demanding group, their needs are also changing. With this change, Management Information Systems (MIS) departments must develop new ways of providing service and support to the user community.
This study investigates the relationship between information systems support structures, support services, service quality and the characteristics of a diverse user population. This includes investigating technical support issues influencing user satisfaction. This study attempts to improve the understanding of the support function within MIS. The results of this study clarify the support needs of the users and identify user satisfaction factors, as well as factors relative to the quality of the support received.
Six streams of prior research were reviewed when developing the research framework. These include: user support, end users and end-user computing, identifying and classifying user types, information centers, user satisfaction, service quality and other sources of computer support.
A survey instrument was designed using the user satisfaction (UIS) instrument developed by Doll and Torkzadeh (1988) and the SERVQUAL instrument as modified by Kettinger and Lee (1994). The survey was distributed to 720 individuals. A total of 155 usable responses were analyzed, providing mixed results. Of the ten hypotheses, only four were rejected. The findings of this study differ from those in earlier studies. The variables found to be significant to users for service quality are the method of support provided to the user, i.e., help desk or local MIS support, and the support technician's experience level.
For user satisfaction, the location of the service personnel made a difference to the end user. As with service quality, the support technician's experience level added to the users' satisfaction with MIS support. The results of this study are pertinent to managers of MIS departments, as they clarify the support needs of the users and identify issues of user satisfaction and service quality.
|
79 |
Sistema embebido para adquisición de parámetros ambientales con comunicación a PC (Embedded system for acquisition of environmental parameters with communication to a PC). Bustamante Avanzini, Renzo Emilio. January 2008 (has links)
No description available.
|
80 |
L'italiano neostandard: un'analisi linguistica attraverso la stampa sportiva (Neostandard Italian: a linguistic analysis through the sports press). Chalupinski, Beniamin Kazimierz. January 2014 (has links)
Since the first definition of "italiano neostandard" appeared in the Eighties, "neostandard" forms, already present in common speech, have featured more and more often in the written media, and even find their space in contemporary grammaticography. Through a corpus-based analysis, this dissertation aims to assess the vitality of the neostandard as it appears in the written columns of three daily papers during a selected period of 2007. In particular, two phenomena are explored: the use of the clitics ci, ne and lo in the function of case marker (marca complementare), and the tendency to reduce the use of the subjunctive in epistemic modality. For the first, this contribution proposes the integration of different approaches into one interpretation of the mechanism of cliticization as a continuum running from facultative to obligatory uses of case markers. For the second, the reduction in the use of the epistemic subjunctive is described as a restructuring (ristrutturazione): according to this study, within the category of the epistemic subjunctive it is necessary to distinguish particular contexts in which the subjunctive preserves its status from those in which it tends to be replaced by the indicative or the conditional.
|