441. A Study of the Processes by Which Enterprise Architecture Decisions are Made
Tanigawa, Utako, 01 January 2004
This dissertation presents the findings of a descriptive study of enterprise architecture decision-making processes. Decisions regarding enterprise systems architecture are among the most complex in the IS domain. Enterprise architecture evaluations and recommendations, such as the choice of an enterprise system and buy-versus-build decisions for applications, are strategic in the sense that they influence and constrain corporate decisions. Formal and informal methodologies are discussed in this dissertation regarding how enterprise architecture decisions ought to be made in an abstract, ideal situation (i.e., normatively) and in describing best practices (i.e., prescriptively). How enterprise architecture decisions are made in practice, however, has not been rigorously studied (i.e., descriptively). The first purpose of this research is to examine the processes by which enterprise architects make decisions in practice. Drawing on concepts from complex decision-making (e.g., heuristics, scripts, and schemas) and sociology (e.g., mimetic isomorphism), the aim is to understand architectural decision-making in practice from the descriptive view. A second purpose of this research is to provide at least the beginnings of a new theory of enterprise architecture decision-making processes. Using a grounded theory approach, the study uses systematically collected field data to clarify and refine concepts of architecture decision processes and how organizational issues influence them. This descriptive and empirical study was used to propose a new theory for conceptualizing the influence of organizational and environmental issues on enterprise architecture. The dissertation thus has important implications for research and practice.
442. Information Technology in a University: An Institutional Case Study of Instructional and Research Computing in a Client/Server Environment
Tellis, Winston M., 01 January 1997
This study examined the problem of financing information technology in higher education institutions, along with some of the pressures those institutions face as they attempt to control expenditures while continuing to support information technology. Particular emphasis was placed on the pace of technology acquisition, the use of client/server computing, the Internet, and the World Wide Web. The study replicated a case study conducted by Samuel Levy at the University of Arizona in 1988 and extended it, as described above, to explore various aspects of recent technological advances at Fairfield University. Levy (1988) used two surveys to assess computer use by faculty and administrators. Those surveys were modified to reflect the environment in the case organization and to capture data on the aspects that extended the original study. As recommended by the literature, the study drew on multiple sources of data, including interviews and internal documents relevant to the research agenda, to improve its reliability. The results of the survey showed that respondents expect their use of the Internet and the World Wide Web to increase in the next few years. The current method of equipment procurement was found unsatisfactory by a majority of the respondents. With the use of client/server computing increasing at all institutions, and with the rapid growth in the use of the Internet and the World Wide Web, the results of this study should be of value to many other institutions. The recommendations of the study include shortening the information technology planning cycle to reflect the rapid advances in technology. Each planning cycle should be followed by a survey to evaluate user acceptance of the initiatives in that cycle. The study also recommended formal capacity planning procedures for client/server environments to ensure the efficiency and integrity of the operation.
Future researchers might be able to use the new groupings of data that were developed in this study. New variables could lead to new analyses and fresh insight into the problem of rapid acquisition of information technology.
443. A Model of Trusted Computing Acceptance in Higher Education
Teo, Jeff, 01 January 2005
Virus and other computer attacks are common occurrences on today's open computing platforms. As a result of these attacks, users and companies, both here and abroad, have suffered untold losses and incurred tremendous costs. Until recently, the IT industry reacted to these problems merely by introducing more software-based solutions, such as updated virus definitions, whenever a computer virus attack was publicized. And yet the attacks continued unabated.
While hardware implementations to harden and thus improve computer security were developed in the 1980s, only users of specialized applications (such as the military) could afford them. A group of leading technology companies, including IBM, Microsoft, HP, and Intel, formed the Trusted Computing Group (TCG), which aims to improve trust and security in today's open computing platforms by utilizing both software and hardware implementations. To realize this vision of trusted computing (TC), TCG began incorporating hardware through the use of a Trusted Platform Module (TPM), a low-cost device containing several built-in features that improve security and trust in networked platforms. Several million TPMs have already been shipped, and TCG is aiming for even wider deployment across all computing platforms. While trusted computing may be a viable option for improving trust and security in computing platforms, opponents cite concerns about privacy, digital rights management (DRM), and the restriction of software choices that could result from this technology. Little if any research has been done to determine whether institutions of higher education (IHEs) will adopt trusted computing. IHEs are unique entities: they promote open access to information and academic freedom, while at the same time they are required to protect the privacy and confidentiality of students, staff, and faculty. The goal of the researcher in this study was to determine the relationships between variables involved in TC acceptance in higher education. To that end, the researcher integrated several influential streams of literature: the theory of reasoned action (TRA), the theory of planned behavior (TPB), the technology acceptance model (TAM), trust and risk, and trusted computing technologies.
444. Development of a Multimedia Publication in Hospitality and Tourism Ethics for Undergraduate Students and Workplace Training Programs
Tesone, Dana V., 01 January 1995
This qualitative study addresses the development process for a text manuscript and storyboard layout used for a multimedia product on ethics in hospitality/tourism management, aimed at undergraduate students and workplace training programs. The literature indicates that ethics courses are being offered as part of business curricula and training programs, and that hospitality management and training programs are focusing on applications of business ethics in an industry-specific setting. Computer-based learning (CBL) and other forms of multimedia are methodologies that educators and trainers in hospitality management are beginning to implement, according to the literature.
The dissertation uses a needs analysis to collect information from various user groups to determine the feasibility of developing a text script and storyboard for a multimedia production. The needs analysis and the literature provide insight concerning the development of content and format for the product. Formative and summative reviews by established colleagues provide further guidance in the development of a final product designed for use by trainers and educators. The development process and the final product could be useful to subsequent education researchers and designers.
445. Marketing the Learning Resources Center at South Florida Community College: A Business Approach
Teuton, Luella Bosman, 26 April 1991
Administrators in many community college libraries must deal with dwindling funds, rising costs of materials, and staff shortages. The increased pressures and competition that result from outside forces (technological innovations, networks, user demands, and the increased information requirements of the 1990s) compel library professionals to rethink their position in today's world. With the library no longer perceived as an information monopoly, the need for the institution is being questioned. The Learning Resources Center of South Florida Community College (LRC/SFCC) is among the community college libraries with an uncertain future. To survive and prosper, it must make strategic management, planning, marketing, and public relations an essential part of that future. By conducting a four-month marketing campaign from May to August 1990, the LRC/SFCC demonstrated the value of such strategies for increasing knowledge and usage of its resources and services.
446. Sinclair Curriculum eXchange (SCX): Sharing Learning Resources to Improve Part-Time Instruction
Thibeault, Nancy, 01 January 2005
This dissertation effort focused on improving the quality and consistency of instruction across multiple course sections taught by full-time and part-time faculty. The Sinclair Curriculum eXchange (SCX), an online repository of learning objects (LOs), was designed, implemented, and used to deliver a consistent set of teaching materials to introductory Computer Information Systems (CIS) students. Experienced CIS faculty documented successful learning activities along with instructions for using those activities in the classroom. The SCX system was used to assemble the materials for three LOs and one lesson, and then to share the materials with all faculty teaching the course.
The quality and consistency of instruction were measured by a faculty survey and an analysis of student quiz scores. Overall, the faculty agreed that the materials were effective, that they liked the teaching approach, and that the materials made teaching easier. Student quiz scores were compared across instructors, course sections, and instructor status. Statistical analysis revealed no significant differences on three of the four quizzes or on all quizzes combined.
The results of the faculty survey and the analysis of student quiz scores suggest that the SCX system has the potential to increase the quality and consistency of instruction across multiple course sections. It is therefore recommended that a complete course be developed in SCX and that the system then be re-evaluated. Two major issues surfaced during the study: faculty participation in the development of course materials was problematic, and the fine level of granularity used required the creation of a prohibitive number of files.
447. An Instrument for the Distribution and Collection of Data Using Computer-Based Technology
Thombs, Michael, 21 October 1989
A diskette-based instrument for an electronic survey, created by the writer, was used to distribute questionnaires to and collect responses from technology educators. The instrument contained five elements: a menu control subsystem, a color storyboard instructional subsystem, a distribution and collection procedure, a hardcopy print facility, and a free software download subsystem. An IBM-compatible diskette was programmed to coordinate the five elements. The writer measured the effectiveness of this electronic survey and proposed changes to improve it.
Two questionnaires were created: a paper-based questionnaire and a diskette-based questionnaire. Two samples of 100 technology education users were selected at random from the same population: technology educators belonging to the International Technology Education Association (ITEA). One group was sent a preview letter asking whether they would be willing to participate in a paper-based survey; respondents were sent the paper-based questionnaire. A different preview letter was sent to the second sample of 100 technology education users, and diskettes were sent to its respondents. A textbook and free software were offered as incentives to both groups.
At the end of the 6-week collection period, 30 people had asked for the diskette-based questionnaire and 33 people had asked for the paper-based questionnaire. Thirty diskette-based questionnaires were mailed and 14 were returned; thirty-three paper-based questionnaires were mailed and 26 were returned. Three extra paper-based questionnaires and one extra diskette were collected from colleagues of respondents who copied the questionnaire in the respective medium; these extra surveys were not included in the reported statistics. The response rate of returned preview letters from the paper-based sample was greater than that from the diskette-based sample, and the response rate of returned paper-based surveys was greater than that of diskette-based surveys (differences were tested by chi-square analysis at the .05 level of significance).
Two main null hypotheses were tested in this study. The first was "The difference between the number of diskette-based preview letter responses and the number of paper-based preview letter responses is zero." This null hypothesis was not rejected. The second was "The difference between the number of returned diskette-based survey responses and the number of returned paper-based survey responses is zero." This null hypothesis was rejected.
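The reported outcome pattern can be illustrated with a short calculation. The sketch below, a minimal example assuming SciPy and a 2x2 contingency-table formulation, runs chi-square tests on the counts given above (33 of 100 versus 30 of 100 preview-letter responses; 26 of 33 versus 14 of 30 questionnaires returned); the original analysis was not necessarily set up exactly this way.

    # Chi-square tests on 2x2 tables built from the counts in the abstract.
    # Assumes SciPy; the table construction is an illustrative reconstruction.
    from scipy.stats import chi2_contingency

    # Preview-letter responses: [responded, did not respond] per sample of 100.
    preview = [[33, 67], [30, 70]]
    chi2, p, dof, expected = chi2_contingency(preview)
    print(f"preview letters: chi2={chi2:.2f}, p={p:.3f}")   # p well above .05

    # Returned questionnaires: [returned, not returned] per mailed batch.
    returned = [[26, 7], [14, 16]]
    chi2, p, dof, expected = chi2_contingency(returned)
    print(f"returned surveys: chi2={chi2:.2f}, p={p:.3f}")  # p below .05

Consistent with the hypothesis results above, the first table shows no significant difference at the .05 level, while the second does.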
Seven recommendations were made following this study. First, if diskettes are used as a survey medium, the hardware and software should match those of the users. Differences in hardware and software configuration between the diskette-based survey medium and the technology users' own equipment prevented several people, contacted through follow-up phone calls, from completing the diskette-based version. Technology users in this study were unwilling to bear the burden of converting from one size of floppy diskette to another or of locating a machine with a compatible drive. Second, a handwritten cover page and a real stamp on the return mailer should be used; the survey administrators considered this an important feature of the survey that should be retained in future surveys. Third, a preview letter should be used to save costs. Once contact between the survey administrators and a user was made by the preview letter, response rates improved. Preview letter responses could let respondents indicate what size of diskette they need, so that only the correct diskette would be mailed, and could also help researchers find people who are more likely to respond. Fewer diskettes would be mailed, and returned diskettes could be recycled. Fourth, attractive incentives should be offered for surveys that use diskettes; the incentive has to motivate users to solve technical problems when they arise.
Fifth, an eight-week collection period should be used to give researchers time to contact delinquent users and to solve hardware and software problems. Sixth, users should be technologically oriented. Seventh, the appearance of the instrument should be attractive. The writer valued the promotional impact of the diskette and believed that it could outweigh poor response rates when advertising and promotion goals are important; commercial printing services should be used to add this feature to the instrument.
The survey instrument was tested at Davis Publications, Incorporated (Davis), of Worcester, Massachusetts. Survey administrators and personnel at Davis were pleased with the appearance of the physical diskette and the software programmed on it. Personnel in the Public Relations and Marketing Departments at Davis thought that the instrument had prestige value and expressed interest in using the medium in future projects.
448. A Comparative Study of the Effectiveness of Three Models of Distance Education on Student Achievement and Level of Satisfaction
Thorn, Virgelean J., 01 January 2000
The discovery of interesting patterns in database transactions is one of the major problems in knowledge discovery in databases. One such pattern is the set of association rules extracted from these transactions. The goal of this research was to develop and implement a parallel algorithm for mining association rules. Dynamic Distributed Rule Mining (DDRM) is a lattice-based algorithm that partitions the itemset lattice into sublattices, which are assigned to processors for processing and identification of frequent itemsets. DDRM uses a dynamic load-balancing approach to assign classes to processors, which then analyze those classes to determine whether any rules are present in them. Parallel algorithms are required for mining association rules because of the very large databases used to store the transactions. Previous parallel algorithms include Count Distribution (CD), Data Distribution (DD), Candidate Distribution (CDD), Intelligent Data Distribution (IDD), and Hybrid Distribution (HD); the dominant costs associated with these algorithms are hash-tree construction, hash-tree traversal, communication overhead, input/output (I/O) cost, and data movement, respectively. These algorithms assign tasks to the processors using a static scheduling scheme. The main challenge for a static scheduling scheme is determining how much time each task will need; that information is then used to compute the total processing time and to divide the tasks among the processors so that each processor receives an equal amount of work, with processing time as the unit of measurement. Experimental results show that DDRM utilizes the processors efficiently and performed better than prefix-based and Partition algorithms that use a static approach to assign classes to the processors. The DDRM algorithm scales well and shows good speedup.
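The abstract does not reproduce the algorithm itself, but its two key ideas, partitioning the itemset lattice into prefix-based classes and handing those classes to processors dynamically rather than statically, can be sketched as below. This is a minimal illustration under assumed names and toy data, not the DDRM implementation.

    # Illustrative sketch: prefix-based classes of frequent itemsets are mined
    # by a pool of workers that pull work dynamically, so faster workers take
    # more classes (simple dynamic load balancing). The data and names are
    # toy assumptions, not DDRM's actual code.
    from itertools import combinations
    from multiprocessing import Pool

    TRANSACTIONS = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c", "e"}]
    MIN_SUPPORT = 2

    def support(itemset):
        # Number of transactions containing every item in the itemset.
        return sum(1 for t in TRANSACTIONS if itemset <= t)

    def equivalence_classes():
        # Group frequent 2-itemsets by their first item (the prefix); each
        # group corresponds to a sublattice that can be mined independently.
        items = sorted(set().union(*TRANSACTIONS))
        classes = {}
        for a, b in combinations(items, 2):
            if support({a, b}) >= MIN_SUPPORT:
                classes.setdefault(a, []).append((a, b))
        return list(classes.items())

    def mine_class(cls):
        # Extend candidates within one class only; no communication with
        # other workers is needed, which is the point of the partitioning.
        prefix, pairs = cls
        frequent = list(pairs)
        suffixes = [b for _, b in pairs]
        for x, y in combinations(suffixes, 2):
            if support({prefix, x, y}) >= MIN_SUPPORT:
                frequent.append((prefix, x, y))
        return frequent

    if __name__ == "__main__":
        with Pool(processes=2) as pool:
            # imap_unordered hands out classes as workers become free, a
            # dynamic assignment, in contrast to a static pre-partitioning.
            for result in pool.imap_unordered(mine_class, equivalence_classes()):
                print(result)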
449. Attitudes of Inmates Toward the Use of Computers in Tennessee Correctional Institutions
Tobias, Renee, 01 January 1993
The purpose of this study was to investigate inmates' attitudes toward computers in Tennessee correctional institutions. Specific attention was given to the relationship between attitudes and gender, race, age, education, and prior computer experience.
The instrument used in this study, the Computer Attitudes Scale (Loyd and Gressard, 1984), consisted of 40 items organized into four subscales (anxiety, confidence, liking, and usefulness). The subjects were 188 inmates from correctional institutions in the State of Tennessee.
The inmates in the control group were randomly selected through cluster sampling by the individual teacher at each correctional site. Ninety-four inmates were enrolled in a computer class at the four sites in the fall of 1992, and the researcher had a 100% return rate from this experimental group. The number of surveys administered to the control group was chosen to match the number of surveys returned from the experimental group.
The data were tabulated and analyzed using the Statistical Package for the Social Sciences (SPSS). Data related to the variables of gender, race, age, education, and experience were analyzed using a multivariate analysis of variance (MANOVA).
The results of the data analysis revealed the following. There was a statistically significant difference between the experimental group and the control group in their Attitude Difference Score (ADS) for computer anxiety, but no statistically significant difference in their ADS for computer confidence, computer liking, or computer usefulness. Gender, race, age, and education were not found to be significantly related to changes in attitudes toward computers, while prior computer experience appeared to make a statistically significant difference in inmates' attitudes toward computers. Word processing was the most popular course taken, and the IBM computer was the most widely used.
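As a rough sketch of how such an analysis might look today, the fragment below fits a MANOVA over the four subscale scores. The column names and the data file are invented for illustration, and the original analysis was performed in SPSS rather than Python.

    # MANOVA over the four Computer Attitudes Scale subscales, assuming a
    # hypothetical CSV with one row per inmate; all column names are invented.
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    df = pd.read_csv("inmate_attitudes.csv")  # hypothetical data file

    # Subscale scores as dependent variables; demographics and prior
    # computer experience as the factors of interest.
    mv = MANOVA.from_formula(
        "anxiety + confidence + liking + usefulness ~ "
        "gender + race + age + education + experience",
        data=df,
    )
    print(mv.mv_test())  # Wilks' lambda, Pillai's trace, etc., per factor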
450. Pictogram: The Design and Implementation of a New Visual Programming Language
Tomizawa, Takaaki, 01 January 1999
The objective of this dissertation was to design and implement a platform-independent, distributed visual programming language / visual programming environment (VPL/VPE) called the PictGram system. PictGram (PICTorial proGRAMming) is based on the functional programming paradigm. The PictGram system required the development of three challenging components: (1) a visual lexical specification for graphical tokens, (2) a visual syntactic definition specifying the rules by which expressions can be legally combined, and (3) a visual parsing mechanism for graphically represented programs. The construction of PictGram required an intensive analysis of theories for distributed functional programming languages and extensive experimentation with possible VPL/VPE implementations. The theoretical investigation developed a formalism of the functional programming language in three design phases: (1) a lexical representation of a visual primitive, (2) visually expressed syntactic rules for the lexical representation, and (3) a semantic interpretation of the visual expression. The practical experiments integrated these formalisms into two realistic implementation components: (1) the front end of PictGram, which manages construction of the visual expression, and (2) the back end of PictGram, a distributed interpreter, which evaluates the visual expression. PictGram was constructed by integrating three sub-goals: (1) developing a theory of the PictGram VPL/VPE, (2) designing and implementing the PictGram VPL/VPE, and (3) integrating the PictGram VPL/VPE with distributed interpreters. PictGram allows users to construct a graphically represented source program, translates that graphical expression into a textual expression, uses an interpreter to evaluate the expression, and translates the result back to an appropriate graphical form. All programming activities are supported interactively through the system's graphical user interface. This dissertation investigated visual programming methodologies based on a functional programming paradigm, and a visual programming system, PictGram, has been proposed. A lexeme is expressed by graphical user interface components, and a syntactic relationship is specified by a click-and-drop operation. The semantics of the graphically represented source programs are interpreted by the distributed interpreters. PictGram provides a simple interface that supports a general programming language paradigm.
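The pipeline the abstract describes (build a graphical expression, translate it to a textual expression, interpret it, and render the result back) can be illustrated with a small sketch. The Node structure, operator set, and evaluator below are assumptions for illustration, not PictGram's actual design.

    # Toy translate-then-interpret pipeline: a graph of nodes stands in for
    # the graphical tokens, is flattened to a Lisp-like textual expression,
    # and is then evaluated. All names here are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        label: str                                     # operator or literal
        children: list = field(default_factory=list)   # wired sub-expressions

    def to_text(node: Node) -> str:
        # Translate the visual graph into a textual prefix expression.
        if not node.children:
            return node.label
        return "(" + node.label + " " + " ".join(to_text(c) for c in node.children) + ")"

    OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

    def evaluate(node: Node):
        # Interpret the expression; a real VPL/VPE would render the result
        # back into graphical form for the user.
        if not node.children:
            return float(node.label)
        return OPS[node.label](*(evaluate(c) for c in node.children))

    # (* (+ 1 2) 4), built as a user might wire icons on screen.
    expr = Node("*", [Node("+", [Node("1"), Node("2")]), Node("4")])
    print(to_text(expr))   # (* (+ 1 2) 4)
    print(evaluate(expr))  # 12.0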