121 |
The Development of an Information Literacy Indicator for Incoming College Freshmen / Critchfield, Ron, 01 January 2005
This study developed a comprehensive information literacy instrument based on the ACRL (2000) Information Literacy Competency Standards for Higher Education. Items were composed for the initial instrument (version 1). Three experts in the field of information literacy examined instrument version 1 to determine content validity. The instrument was then administered to 78 college freshmen at Warner Southern College. Scale reliability was assessed by calculating Cronbach's alpha for each construct of instrument version 1. The results were positive and indicated that the instrument is a reliable indicator of information literacy skills. The major scales for ACRL standards two, three, and five all achieved alpha values above .90. The reliability of the major scale for ACRL standard one was near the upper end of the moderate-to-high range, with an alpha value of .89. The major scale for ACRL standard four achieved an alpha value of .75, a value within the statistical range of a reliable scale. To triangulate the data, interviews were conducted with 14 randomly selected freshman volunteers. The interviews also confirmed the reliability of the instrument.
Instrument version 1 items that were less consistent than others in their scale were eliminated to produce the final instrument (version 2). As a confirmatory test, instrument version 2 was administered to a second group of 81 college students from Warner Southern College. The analysis again yielded Cronbach's alpha values indicating a reliable instrument. All major scales were comparable between the two trials except for major scale 5, yet even that result remained within the statistical range of reliability. The study concluded that the information literacy indicator developed and tested here is a valid and reliable measure of information literacy skills across all five ACRL information literacy standards.
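The reliability figures reported above are Cronbach's alpha values. As a point of reference only, a minimal sketch of the computation appears below; the score matrix is randomly generated for illustration and is not the study's data.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-item scale answered by 10 respondents (random responses,
# so the resulting alpha will be low; a coherent scale scores higher).
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(10, 5))
print(round(cronbach_alpha(scores), 2))

Alpha approaches 1 as the items in a scale covary strongly, which is why values above .90 are read as indicating high internal consistency.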
|
122 |
A Study of the Impact of Price Elasticity on the Adoption Decision of Minicomputers / Crittenden, Dorrell W., 01 January 2005
The rate at which a new technology is adopted by a population, called the diffusion rate, has been studied extensively in many fields, particularly the Information Systems field. Price elasticity, the rate at which quantity demanded changes as a function of price, could be a useful measure in explaining diffusion patterns. In this empirical study, the well-known Bass diffusion model is fitted to a new dataset. Changes in price elasticity over time were modeled after explaining away the effect of product characteristics in a hedonic regression. The main findings are: a) price elasticity changes over time, and b) the changes in price elasticity are related to the stage of the product life cycle. These findings could offer a valuable empirical guide for managers evaluating price and the diffusion effects faced by the adopter population. The implications are that not all adopters react the same with respect to time and price effects. Another important implication is that an improved focus on the determinants of price elasticity could also play an important part in the Information Systems decision-making process.
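As a rough illustration of the kind of diffusion-model fitting described above, the sketch below fits the standard Bass cumulative-adoption curve to a made-up adoption series with SciPy. The data, starting values, and variable names are assumptions for illustration; they are not the study's dataset, and the hedonic-regression step is not reproduced here.

import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Cumulative adopters under the Bass model: market potential m times F(t)."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Hypothetical cumulative shipments over ten periods (illustrative numbers only)
t = np.arange(1, 11)
adopters = np.array([5, 14, 30, 55, 85, 115, 140, 158, 168, 173], dtype=float)

(m, p, q), _ = curve_fit(bass_cumulative, t, adopters, p0=(200, 0.03, 0.4), maxfev=10000)
print(f"market potential m={m:.0f}, innovation p={p:.3f}, imitation q={q:.3f}")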
|
123 |
The Effect of Distress on the Quality of Computer Programming / Cross, William C., 01 January 1990
Quality means meeting or exceeding customer expectations, and the quality of computer programming has become part of the broader quality emphasis within business enterprises. The research demonstrates that negative stress (distress) experienced by Information Systems professionals lowers the quality of the computer programs these professionals produce, resulting in higher cost and lower customer satisfaction.
The research measured distress levels for sample populations of programmers in seven business enterprises over a three-year period. These distress levels were then compared to the programming quality levels in the same enterprises for the same periods.
The research results show that episodic distress has a correlation coefficient of -0.38160 with programming quality, while situational distress has a correlation coefficient of -0.24939 with programming quality. These results indicate that both episodic and situational distress contribute to poor programming quality, and they support the introduction of stress reduction efforts in the Information Systems profession.
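The coefficients reported above are ordinary product-moment (Pearson) correlations. A minimal sketch of how such a coefficient is computed is shown below; the distress and quality values are invented for illustration and do not reproduce the study's measurements.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired observations: a distress score and a programming-quality
# index for the same programmer samples (illustrative values only).
distress = np.array([62, 48, 75, 55, 80, 40, 68, 52])
quality = np.array([71, 83, 60, 78, 55, 90, 65, 80])

r, p_value = pearsonr(distress, quality)
print(f"r = {r:.3f}, p = {p_value:.3f}")  # a negative r mirrors the direction reported above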
Stress reduction among programming professionals should concentrate on reducing the vulnerability of the individual to a stressor and changing the context in which the stressor is received. Successful treatment of distress among Information Systems professionals should yield economic, sociological, political, psychological, and environmental benefits.
|
124 |
Evolutionary Algorithms for VLSI Test Automation / Cruz, Alfredo, 01 January 2002
The generation of binary test patterns for VLSI devices belongs to the class of NP-complete problems, and as the complexity of VLSI circuits increases, generating test vectors becomes computationally expensive. This dissertation focuses on an evolutionary algorithm (EA) approach for generating effective test vectors for single and multiple fault detection in VLSI circuits. EAs provide significant speedup while retaining good-quality solutions through heuristic procedures; although not guaranteed to find an optimal solution, they find very good solutions for a wide range of problems. Three basic steps are performed during the generation of the test vectors: selection, crossover, and mutation. In the selection step, a biased roulette wheel and steady-state replacement are used as the reproduction operators. Once crossover and mutation occur, the new candidate test vectors with the highest fitness replace the old ones, so population members steadily improve their fitness level with each generation. A new mutation operator is introduced that increases the Hamming distance among the candidate solutions for shrinkage faults in Programmable Logic Arrays (PLAs). The genetic operators (selection, crossover, and mutation) are also applied to a CNF-satisfiability formulation for the generation of test vectors for growth faults in PLAs. The CNF constraint-satisfaction formulation has several advantages over other approaches used for PLA testing: the proposed method eliminates the possibility of intersecting a redundant growth term with a valid candidate test vector, and the genetic operator, unlike previous operators used in PLA test generation, does not use lookups or backtracking. That is, the CNF technique eliminates operations (such as backtracking and the sharp (#) operation) that can become intractable with increasing PLA size. The proposed evolutionary-algorithm-based solution aims to address the shortcomings of existing methods for VLSI testing. Deterministic procedures were used to identify untestable faults and to improve fault coverage; this hybrid deterministic/genetic test generator helps improve fault-detection effectiveness and reduce CPU run time. Experimental results confirmed that the number of untestable faults identified contributed to test generation effectiveness.
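To make the selection/crossover/mutation loop concrete, the following is a toy genetic-algorithm skeleton for evolving binary test vectors. It is a sketch only: the fitness function is a stand-in (a real ATPG flow would score a vector by the faults it detects under fault simulation), and the dissertation's biased roulette wheel, steady-state replacement, Hamming-distance mutation operator, and CNF handling are not reproduced.

import random

def roulette_select(population, fitnesses):
    """Fitness-proportionate (roulette-wheel) selection of one parent."""
    total = sum(fitnesses)
    pick, running = random.uniform(0, total), 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness
        if running >= pick:
            return individual
    return population[-1]

def crossover(a, b):
    """Single-point crossover of two binary test vectors."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(vector, rate=0.02):
    """Bit-flip mutation of a binary test vector."""
    return [bit ^ 1 if random.random() < rate else bit for bit in vector]

def evolve(fitness_fn, vector_len=16, pop_size=30, generations=50):
    population = [[random.randint(0, 1) for _ in range(vector_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        fitnesses = [fitness_fn(v) for v in population]
        population = [mutate(crossover(roulette_select(population, fitnesses),
                                       roulette_select(population, fitnesses)))
                      for _ in range(pop_size)]
    return max(population, key=fitness_fn)

# Stand-in fitness: reward set bits; in practice this would be fault coverage.
print(evolve(fitness_fn=sum))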
|
125 |
Preliminary Examination for Empirical Knowledge (PEEK): A Fast Heuristic to Estimate the Inherent Degree of Clustering in Data Sets / Cupp, J. William, 01 January 2007
The general area of this research is data clustering, in which an unsupervised classification process is used to discover and extract the clusters that naturally exist in some data set. These inherent patterns are then used to understand the data in a manner consistent with what the data represent. Such clustering methods may be used to discover natural grouping of raw data and to abstract structures which might reside there, without having any prior knowledge of whether such structures exist.
Many different clustering algorithms are in use, each having relative strengths or other points of merit. For example, some have lower asymptotic running time than others, some require a priori knowledge of the underlying data, and some produce results which are highly dependent on the input parameters.
The goal of this dissertation was to develop an approach that measures the degree to which the data under study contain natural clusters; it develops measures of the degree of clustering inherent in a data set. That is, using the measures developed, a researcher can know whether the underlying data possess natural clusters, so further processing may proceed by choosing a method known to provide the best results given the degree of clustering exhibited by the native data. Moreover, understanding the native clustering tendency of the data facilitates measuring clustering validity, or how well the produced clusters actually partition the data in a manner that is meaningful in the real-world domain of the data set.
Such measures permit a researcher to choose intelligently, perhaps employing a computationally intensive technique with the confidence that the underlying data warrant such effort.
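One widely used measure of clustering tendency, offered here only as a point of reference and not as the PEEK heuristic itself, is the Hopkins statistic: it compares nearest-neighbor distances of real data points against uniformly generated points and returns values near 0.5 for structureless data and near 1.0 for strongly clustered data. A minimal sketch with assumed sample data follows.

import numpy as np
from scipy.spatial import cKDTree

def hopkins(X, sample_size=None, seed=0):
    """Hopkins statistic: ~0.5 for uniform data, approaching 1.0 for clustered data."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    m = sample_size or max(1, n // 10)
    tree = cKDTree(X)
    # Distances from uniform points (drawn in the data's bounding box) to the data
    uniform = rng.uniform(X.min(axis=0), X.max(axis=0), size=(m, d))
    u, _ = tree.query(uniform, k=1)
    # Distances from sampled data points to their nearest *other* data point
    sample = X[rng.choice(n, m, replace=False)]
    w = tree.query(sample, k=2)[0][:, 1]
    return u.sum() / (u.sum() + w.sum())

# Two tight blobs should score well above 0.5
rng = np.random.default_rng(1)
blobs = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(3, 0.1, (50, 2))])
print(round(hopkins(blobs), 2))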
|
126 |
Developing a Wider View of Educational Technology Through Ubiquitous Computing: A Qualitative Case Study / Curry, D. Bruce, 01 January 2003
Ubiquitous computing in elementary schools is in its early phases of development. However, many signs point to the likelihood that some form of one-to-one personal computing in schools is inevitable in the near future. The research pertaining to this model of computing technology is relatively scarce, leaving schools with a knowledge void when attempting to make decisions pertaining to this paradigm. The investigation sought to help alleviate this gap in the knowledge base by providing a qualitative case study that examined an elementary school's wireless laptop program over an extended period of one and one-half years. It gathered and described basic information about this model of technology in education and helped form a database for future comparison and/or theory building related to ubiquitous computing. Additionally, the relationship between identifying/removing barriers to technology use and developing a wider view of educational computing among members of the school community was examined.
The theoretical purpose of the investigation explored how participants in the program developed a wider view of educational computing, which was defined as the processes by which individuals come to understand how technology can enhance the school environment. The more basic purposes were threefold: to describe richly what occurred as this school made the transition to ubiquitous computing, to interpret inductively the findings into a series of lessons learned from the experience, and to apply practically the findings to the future direction of the program. The qualitative inquiry into this laptop program employed an embedded single-case study design to achieve the stated purposes and made use of a variety of data-gathering strategies (interviews, observations, and document analysis), consistent with the investigator's goals.
By advancing our understanding of the processes by which people come to use ubiquitous computing to empower teaching and learning, this research made a substantial contribution to knowledge and practice. Its timing relative to this paradigm's entry into schools gives it the potential to have a disproportionately significant impact on the model's development.
|
127 |
Visually Searching the World Wide Web for Content: A Study of Two Search Interfaces / Cusano, Carol, 01 January 2002
The vast amount of data available over the World Wide Web has created the necessity for new initiatives that translate this data into useful information for users. Due to humans' acute visual perception, applications that utilize information visualization (IV) methodologies may ease user frustration when facing an abundance of search results from an Internet query. The introduction of ditto.com, an Internet search engine that provides users with a graphical depiction of search-result documents, is one recent initiative that employs IV methodologies.
This research is based upon the usability of traditional information retrieval systems and Internet search applications, and the impact IV methodologies have had on these systems. A usability evaluation was implemented to determine whether IV methodologies can facilitate users' search needs when searching for information over the Internet. Fifteen randomly selected participants who matched the diversity of Web users were asked to compare two Internet search-results interfaces: Yahoo!, a search engine that provides users with text-based search results, and the graphical displays found within ditto.com.
Descriptive data were collected through usability questionnaires and by observing users as they searched for information. Measurable data were collected by testing the performance of each search engine as users searched for answers to ready-reference questions; time to complete search tasks, task accuracy, and error rates were collected from this session. Users were also asked to state their preference for one of the search engines. The data were analyzed for means, for the occurrence of specific incidents that helped or hindered users, and for the distribution of results by user experience. The results of this study are presented in a narrative report of users' preferences and concerns.
|
128 |
Design and Implementation: A Custom Interactive Multimedia CBT for the Blood-Cell Analysis Industry / Czelusniak, Vernon L., 01 January 1998
The purpose of this study was to develop a custom interactive multimedia computer-based training (CBT) program for field-based employees of a medical instrument manufacturing company. An informal needs analysis determined that safety training for field-based employees was currently met through video training tapes and paper-based materials. This form of training is sporadic, incomplete, and lacks consistency. The training department decided to investigate the feasibility of using alternate training methods, such as CBT and interactive multimedia, for the delivery of training. Through analysis, a decision was made to develop a custom interactive multimedia CBT program, which offers a viable avenue of training for several reasons. First, a multimedia training program would allow for the consistency in training required by the Occupational Safety and Health Administration (OSHA). Second, training would no longer depend on the availability of trainers or on congregating in a centralized area for extended periods of time. Third, it would allow companies to train a globally dispersed population. Fourth, it would provide a cost-effective training program within a limited timeframe.
A custom interactive multimedia CBT program was created for this study using Asymetrix ToolBook II Instructor as the authoring system. The development of this educational program followed a systematic development process prescribed by Hudspeth and Sullivan (1997) and Alessi and Trollip (1991). The development model for an interactive multimedia CBT program includes planning, project definition, design, development/scripting, programming, implementation, and evaluation phases.
The CBT program was provided to 20 globally dispersed field-based employees. A follow-up questionnaire was administered to measure interest, attitudes, and expectations toward the computer learning experience. The program was received very favorably: it provided the training at the desired time, offered flexibility in administration time, and proved a cost-effective method of training. Ultimately, the success of this project determined that additional custom-designed interactive multimedia CBT programs would be developed and provided for future education and training programs.
|
129 |
A Study of the Impact of Users' Involvement, Resistance and Computer Self-Efficacy on the Success of a Centralized Identification System Implementation / Danet, Theon L., 01 January 2006
A recent Presidential Directive (PD) mandated, as an IT requirement, that all government agencies establish a centralized identification management system. This study investigated the impact of users' involvement, computer self-efficacy, and users' resistance on the success of a centralized identification management system. The research methodology was a web-based survey conducted at NASA Langley Research Center. Information System (IS) use was the construct employed to measure IS implementation success.
The results of this study indicated strong reliability for the measures of all constructs (users' involvement, computer self-efficacy, users' resistance, and system use). Two statistical methods were used to formulate models and test predictive power: Multiple Linear Regression (MLR) and Ordinal Logistic Regression (OLR). In both models (MLR and OLR), the user involvement dimension had the highest predictor weight in predicting system use. This empirical study showed that implementation success is related to users' involvement, computer self-efficacy, and users' resistance. Results were consistent with prior literature, demonstrating that the partial model is also valid in new contexts such as government agencies.
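As a rough sketch of the two modeling approaches named above, the code below fits a multiple linear regression and an ordinal logistic regression with statsmodels. The data frame, variable names, and coefficients are synthetic and exist only to show the mechanics; they are not the survey data, and the study's actual model specification may differ.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Synthetic survey responses; column names only mirror the study's constructs.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "involvement":   rng.integers(1, 8, 120),   # 7-point Likert-style items
    "self_efficacy": rng.integers(1, 8, 120),
    "resistance":    rng.integers(1, 8, 120),
})
df["system_use"] = (0.5 * df.involvement + 0.3 * df.self_efficacy
                    - 0.2 * df.resistance
                    + rng.normal(0, 1, 120)).round().clip(1, 7)

# Multiple linear regression (MLR)
X = sm.add_constant(df[["involvement", "self_efficacy", "resistance"]])
mlr = sm.OLS(df.system_use, X).fit()
print(mlr.params)

# Ordinal logistic regression (OLR), treating system use as ordered categories
olr = OrderedModel(df.system_use.to_numpy(dtype=int),
                   df[["involvement", "self_efficacy", "resistance"]].to_numpy(),
                   distr="logit").fit(method="bfgs", disp=False)
print(olr.params)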
|
130 |
Anatomy of a Software Maintenance Training Program / David, Larry G., 01 January 1988
The process of developing and delivering the software training portion of a large, complex command and control system under government contract was traced from the initial advertisement to the completion of formal training. The Royal Thai Air Defense System (RTADS) contract, viewed from the perspective of the software training manager, was used as the vehicle for describing the development and delivery of a software maintenance training program.
The early aspects of the contracting process were reviewed in general terms, from the initial public announcement to the contract award. Emphasized was the need for thorough analysis of the request for proposal (RFP), the system specification, and the references included in both. Each included reference could lead to further references, and failure to examine all such references could result in underestimating the amount of work needed to complete the contract. Such a failure could result in not bidding enough money to do the job within the proposed schedule.
Once the contract was awarded, the processes involved in doing the project were described. These included acquiring and training the necessary staff; analyzing the project needs; coordinating with subcontractors; developing the training and training equipment plan (TTEP); developing budgets and schedules; coordinating with governmental oversight agencies; designing the courses, lessons, and instructional materials; producing the lesson plans, student study guides, and other materials; securing approvals; scheduling students and classes; and finally delivering the planned and prepared training. The problems encountered in coordinating and implementing a program where multiple agencies have shared responsibilities were discussed. Also described were the complications of developing training materials for teaching computer programs that were simultaneously being developed and were thus changing regularly. The added complications associated with training Thai military personnel were covered, such as language and cultural problems.
The software maintenance training program was described as it grew from a few lines of general statements in the RFP, to about 60 pages of the 800-page TTEP, to about 3,800 pages of training materials and 500 graphic slides developed specifically for the three software maintenance courses. Those three courses were presented successfully over a two-year period to three different groups of people. Conclusions emphasized the need to plan in great detail, to expect problems, to coordinate with everyone concerned, to adhere to budgets and schedules, and to expect to expend much time and energy on students' personal matters because of cultural and language difficulties.
The RTADS software maintenance training program was evaluated as successful by all concerned: all students completed their training and returned home safely to Thailand.
|