81
Proposition of the Temporal Variation Data Model and Evaluation of an Implementation. Brown, Mark A., 01 January 1998.
This dissertation identifies the need to develop practical implementations of temporal databases. Most databases today model the real world at one particular time. As new information is collected, older information is deleted. This degrades the capabilities of decision support systems because only trends that were anticipated and included in the design of the database can be reported. The retention of older information requires significant increases in resources. This dissertation focuses on temporal databases that retain older information rather than delete it.
This dissertation reviews the relevance and significance of the use of temporal databases. It then presents a problem identified through review of the literature on temporal databases: using current models to design relations in a temporal database results in large databases that are costly to query. The Temporal Variation Data Model (TVDM) is proposed to address this problem. There are two basic approaches to implementing temporal databases: attribute versioning and tuple versioning. Ahn (1986) provides calculations to compare the storage requirements of these alternatives. Shiftan (1986) assesses the temporal differentiation of attributes as an implementation strategy. His proposed implementation included both tuple- and attribute-versioned relations. Shiftan notes an additional step, separating event information, for use in organizing tuples. That additional step, combined with further consideration of the variation of the attributes Shiftan included in tuple-versioned relations, serves as the foundation for the TVDM.
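To make the contrast between the two baseline approaches concrete, the following sketch models a hypothetical employee relation under tuple versioning (every change re-stamps a whole row copy) and under attribute versioning (each time-varying attribute keeps its own history). The relation, attribute names, and dates are invented for illustration and are not taken from the dissertation's source data.

```python
from dataclasses import dataclass, field
from datetime import date

# Tuple versioning: any change to any attribute creates a new row copy stamped
# with its period of validity, duplicating the values that did not change.
@dataclass
class EmployeeTupleVersion:
    emp_id: int
    name: str
    salary: int
    dept: str
    valid_from: date
    valid_to: date                      # date.max marks a still-current row

tuple_versioned = [
    EmployeeTupleVersion(1, "Lee", 50000, "Sales", date(1996, 1, 1), date(1997, 1, 1)),
    EmployeeTupleVersion(1, "Lee", 55000, "Sales", date(1997, 1, 1), date.max),  # only salary changed
]

# Attribute versioning: each time-varying attribute carries its own history of
# (value, valid_from, valid_to) entries, so an update touches only that attribute.
@dataclass
class EmployeeAttributeVersion:
    emp_id: int
    name: list = field(default_factory=list)
    salary: list = field(default_factory=list)
    dept: list = field(default_factory=list)

attr_versioned = EmployeeAttributeVersion(
    emp_id=1,
    name=[("Lee", date(1996, 1, 1), date.max)],
    salary=[(50000, date(1996, 1, 1), date(1997, 1, 1)),
            (55000, date(1997, 1, 1), date.max)],
    dept=[("Sales", date(1996, 1, 1), date.max)],
)
```

The TVDM, as described in the abstract, additionally separates event information and considers how individual attributes vary over time; the sketch above illustrates only the two alternatives against which it is compared.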
Using one set of source data, attribute- and tuple-versioned databases are created; the TVDM is then used to create a third temporal database. This dissertation reports the results of a case study that includes a three-way comparison. It found increased utility for a temporal database as a result of using the new data model. Utility is operationally defined as a scaled comparison of central processing unit (CPU) time, input/output (I/O) activity, and required storage measurements. The CPU time and I/O requirements are recorded for a series of test queries accessing each of the alternatives. The TVDM-based database in the case study required the least storage space and had the lowest requirements for CPU time and I/O.
82
System Requirements for a New Management Control Expert System Generalized Inference Engine Interfacing With a Client/Server Notification System. Brown, Abby H., 01 January 1998.
Modern organizations struggle with controlling computer platform maintenance expenses. This maintenance effort is further complicated by the existence of multiple, heterogeneous platform environments. A diverse group of computer maintenance engineers is responsible for diagnosing and carrying out the tasks required to assure quality platform performance. Client/server notification systems are an effective tool for managing these computer resources, and yet one of the least mature within the computer platform maintenance domains. The system requirements for a new expert system generalized inference engine were developed based upon research and analysis of expert system generalized inference methods to date, in addition to related efforts in the client/server notification system arena. Existing computer notification systems have done little to address the increased complexity of monitoring and managing the maintenance of multiple computer platforms, each with its own unique environment and applications. The system requirements for a generalized inference engine that utilizes binding, matching, and unification to categorize alarms, with rules written in Prolog, were developed and validated. When the inferencing method was applied to the notification interface, it generalized the heterogeneous data. The rules were applied to an instance of a frame through unification. The inference engine then identified the correct node reporting the error and applied the appropriate resolution. The requirements also detailed an explanation facility. The system requirements for the new expert system generalized inference engine provide for effective centralized processing of multiple computer alarms generated by diverse systems. The system requirements were tested via test case execution to verify the logical processing of the new generalized inference engine. The results indicate that the growing discipline of computer maintenance engineering will benefit from system requirements for a generalized inference engine that provides effective centralized processing of computer alarms generated by diverse computer platforms. The implication is that organizations will be able to achieve cost-effective centralized maintenance of heterogeneous platforms with fewer computer maintenance engineers, each requiring less platform-specific expertise.
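The abstract does not reproduce the rule syntax or the inference engine itself; as a rough, hypothetical illustration of the binding-and-unification idea it describes, the sketch below matches an alarm frame against rule patterns, with variables (strings beginning with "?") binding to alarm field values. All field names, rules, and resolutions are invented.

```python
# Minimal sketch of frame/rule matching via variable binding, in the spirit of
# the unification-based alarm categorization described above. Variables are
# written as strings beginning with "?"; everything else must match literally.

def is_var(term):
    return isinstance(term, str) and term.startswith("?")

def unify_frames(pattern, frame, bindings=None):
    """Try to bind the pattern's variables to the frame's values."""
    bindings = dict(bindings or {})
    for slot, term in pattern.items():
        if slot not in frame:
            return None
        value = frame[slot]
        if is_var(term):
            if term in bindings and bindings[term] != value:
                return None          # conflicting binding
            bindings[term] = value
        elif term != value:
            return None              # literal mismatch
    return bindings

# Hypothetical rules: (pattern, resolution) pairs.
rules = [
    ({"severity": "critical", "subsystem": "disk", "node": "?n"},
     "dispatch engineer to node {n}"),
    ({"severity": "warning", "subsystem": "?s", "node": "?n"},
     "log {s} warning for node {n} and continue monitoring"),
]

def categorize(alarm):
    for pattern, resolution in rules:
        bindings = unify_frames(pattern, alarm)
        if bindings is not None:
            return resolution.format(n=bindings.get("?n"), s=bindings.get("?s"))
    return "no matching rule"

# Example: a hypothetical alarm reported by one node in a heterogeneous network.
print(categorize({"severity": "critical", "subsystem": "disk", "node": "srv-07"}))
```

A notification interface would first normalize alarms from the various platforms into such frames before matching, which corresponds to the generalization of heterogeneous data the abstract mentions.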
83
Credible Webcast for Financial Communications. Brown, William C., 01 January 2004.
A growing body of knowledge suggests that credibility can be engineered into computer applications. This dissertation evaluates the moderating variables of engagement, persuasive tools, and attunement and common ground to determine whether the credibility of financial webcasts is enhanced. A sample population that understands financial reporting evaluated three alternative webcasts that used (a) streaming audio, (b) streaming audio and slides with financial content, and (c) integrated streaming audio, slides with financial content, and an interface design intended to enhance engagement, offer persuasive tools, and create attunement and common ground. When participants were asked what they liked most about the streaming audio webcast, the favorable response to the professional actor used in the research was the dominant theme. When participants were asked what they liked most about the webcast that used streaming audio and slides, the favorable response to the slides with financial content was the dominant theme. The enhanced webcast, with an interface design that attempted to enhance engagement, offer persuasive tools, and create attunement and common ground, produced subdued favorable feedback. The enhanced webcast paced the slide presentation with the speaker and ultimately caused participants to make unfavorable comments about excessive speed and confusion. The webcast with streaming audio and slides with financial content in print form produced higher levels of satisfaction in graphics, usefulness, value, and trust. Within the scope of this research, it could not be determined whether selected features used in the enhanced webcast could further enhance credibility. Several areas of additional research are suggested, including designs that further enhance user control and streaming video. Over 50% of the Fortune 1000 use streaming audio webcasts, an application design that produces lower levels of usefulness, value, and trust than a webcast using streaming audio and slides with financial content. The Securities and Exchange Commission recommended financial webcasts in 2001 to enhance financial disclosure through asynchronous communications. Companies in the Fortune 1000 use webcasts extensively for investor relations. Additionally, this research has broad implications for webcast and multimedia applications in other domains.
84
A Study of Chief Information Officer Effectiveness in Higher Education. Brown, Wayne A., 01 January 2004.
For almost as long as information technology (IT) has existed, there has been a communication and action gap between IT departments and their institutions. These gaps cause a variety of dysfunctional outcomes that include multimillion-dollar failed projects, inefficient operations, and the inability of other departments to focus on their jobs. The consequential effects are well documented in the IT leadership literature. Responsibility for resolving the negative effects rests with the senior IT executive, or chief information officer (CIO). CIOs come from any number of backgrounds, and each may have very different attributes. In some institutions, the CIO is responsible for all technology initiatives and may have a direct link to the chief executive officer (CEO). In other organizations, he may be a peer of academic department leaders. The position the CIO holds within the organization may affect the organization's perception of IT. Furthermore, the organizational view of IT and the configuration of the IT department may have an impact on a CIO's effectiveness.
This qualitative and quantitative case study used CIOs in four-year or above higher education institutions in the United States as the case study group. Separate surveys of CIOs and the institution management teams (IMTs) were used to determine how attributes, management team membership, organizational view of IT, and a centralized or decentralized IT structure related to effectiveness. Results showed a correlation between CIO attributes and effectiveness in all of the CIO roles.
The effectiveness of the CIO was not affected by his membership on the IMT. In addition, the business partner role was the only CIO role affected by the decision to centralize the IT department. There was no correlation between the effectiveness of the CIO and the institutional view of IT. Recommendations are made to CIOs and IMTs about the benefits of eliminating the communication and action gap. Also discussed are the attributes and organizational configurations shown to be important to CIO, and therefore organizational, success.
85
The Application of Genetic Programming to the Automatic Generation of Object-Oriented Programs. Bruce, Wilker Shane, 01 January 1995.
Genetic programming is an automatic programming method that creates computer programs to satisfy a software designer's input/output specification through the application of principles from genetics and evolutionary biology. A population of programs is maintained where each program is represented in the chromosome data structure as a tree. Programs are evaluated to determine their fitness in solving the specified task. Simulated genetic operations like crossover and mutation are probabilistically applied to the more highly fit programs in the population to generate new programs. These programs then replace existing programs in the population according to the principles of natural selection. The process repeats until a correct program is found or an iteration limit is reached.
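As a rough sketch of the evolutionary loop described above (not the dissertation's implementation), the Python below evolves a population of expression trees toward a toy input/output specification. The tree representation, genetic operators, and all parameters are illustrative assumptions.

```python
import random

# Minimal genetic-programming loop. Programs are expression trees over {+, *}
# with terminals {x, 1}; the toy specification is to match x*x + 1.

FUNCTIONS = ["+", "*"]
TERMINALS = ["x", 1]

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return [random.choice(FUNCTIONS), random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, x):
    if tree == "x":
        return x
    if not isinstance(tree, list):
        return tree                      # numeric constant
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == "+" else a * b

def fitness(tree):
    # Lower is better: summed error against the target specification.
    return sum(abs(evaluate(tree, x) - (x * x + 1)) for x in range(-5, 6))

def random_subtree(tree):
    if isinstance(tree, list) and random.random() < 0.5:
        return random_subtree(random.choice(tree[1:]))
    return tree

def crossover(parent_a, parent_b):
    # Simplified subtree crossover: graft a random subtree of b into a copy of a.
    if not isinstance(parent_a, list):
        return random_subtree(parent_b)
    child = list(parent_a)
    child[random.choice([1, 2])] = random_subtree(parent_b)
    return child

def mutate(tree):
    return random_tree(depth=2) if random.random() < 0.2 else tree

population = [random_tree() for _ in range(200)]
for generation in range(50):
    population.sort(key=fitness)         # evaluate the fitness of every program
    if fitness(population[0]) == 0:      # a correct program has been found
        break
    survivors = population[:50]          # selection favors the more fit programs
    population = survivors + [
        mutate(crossover(random.choice(survivors), random.choice(survivors)))
        for _ in range(150)
    ]

best = min(population, key=fitness)
print("best program:", best, "error:", fitness(best))
```

Because the search is stochastic, a run may hit the iteration limit without finding an exact program; practical GP systems use richer function sets, larger populations, and more careful operator design.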
This research concerns itself with the application of genetic programming to the generation of object-oriented programs. A new chromosome data structure is presented in which the entire set of methods associated with an object are stored as a set of program trees. Modified genetic operators that manipulate this new structure are defined. Indexed memory methods are used to allow the programs generated by the system to access and modify object memory. The result of these modifications to the standard genetic programming paradigm is a system that can simultaneously generate all of the methods associated with an object.
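A plausible, purely hypothetical rendering of the multi-tree chromosome idea follows: one program tree per object method, plus an indexed memory array those trees can read and write during fitness evaluation. The method names, tree primitives, and memory layout are invented for illustration and are not the dissertation's actual structures.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectChromosome:
    # One program tree per method of the target object (here, a toy stack).
    # Trees are plain nested lists; "read-mem"/"write-mem" stand in for the
    # indexed-memory primitives an interpreter would provide.
    method_trees: dict = field(default_factory=lambda: {
        "push":  ["write-mem", "top", "arg0"],        # store the argument at the top index
        "pop":   ["read-mem", ["sub", "top", 1]],     # fetch the value below the top
        "empty": ["eq", "top", 0],                    # report whether the stack is empty
    })
    memory_size: int = 16

    def fresh_memory(self):
        # Indexed memory shared by all method trees during one fitness evaluation,
        # so the evolved methods can cooperate through the object's internal state.
        return [0] * self.memory_size

chromosome = ObjectChromosome()
print(sorted(chromosome.method_trees))   # all methods evolve together in one chromosome
```

Crossover and mutation would then operate on (or across) these per-method trees, which is the kind of modified genetic operator the abstract refers to.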
Experiments were performed to compare the sequential generation of object methods with two variants of simultaneous generation. The first variant used information about both method return values and object internal memory state in its fitness function. The second variant used only information about method return values. It was found that simultaneous generation of methods is possible in the domain of simple collection objects, both with and without the availability of internal memory state in the fitness function. It was also found that this technique is up to four orders of magnitude more computationally expensive, in terms of the number of individuals generated during the search, than sequential generation of the same set of methods on an individual basis.
86
Development and Implementation of a Plan for an Online Educational Technology Program Leading to Cross Endorsement for Connecticut Educators. Bruciati, Antoinette P., 01 January 2005.
On January 8, 2002, President George W. Bush reauthorized the Elementary and Secondary Education Act of 1965 by signing the No Child Left Behind Act of 2001 into law (USDOE, 2002a). Through federal mandates, the United States Department of Education directed state and local school officials to assume a prominent role in ensuring that individuals teaching in all core academic subject areas become highly qualified by the end of the 2005-06 academic year (USDOE, 2002b).
In addition to federal mandates, the emergence of a global knowledge economy challenges educational policymakers to abandon the traditional approaches to education in favor of a new educational model that will support a student's lifelong learning needs.
In a global knowledge economy, technological skills and innovation are valued resources for economic growth. Communication, problem-solving, collaboration, and the development of information and communication technology skills are important knowledge economy competencies that must complement a learner's current educational core skills in the areas of language, mathematics, and science.
In the State of Connecticut, changes in teacher certification regulations were slated to become effective in July 2003 and had included a cross endorsement in the area of computer technology for educators teaching kindergarten through twelfth grade (CSDE, 1998a). These regulations were subsequently repealed during a Regular Session of the Connecticut General Assembly (CGA, 2003, June 26). As a result, educators can continue to receive their teaching certificates under the teacher certification regulations in effect since August 1998.
As a result of this investigation, this researcher developed a taxonomy of online learning behaviors along with guidelines for improving teacher quality through the availability of a cross endorsement in the area of computer technology. In order to achieve the goal of this research, an online educational technology program was designed, implemented, and evaluated based on the Whitten, Bentley, and Dittman (2004) information systems design model. One online course was measured and discussed in terms of the benchmarks for quality in distance learning programs that are recommended by Phipps, Merisotis, Harvey, and O'Brien (2000).
87
Technological Literacy Assessment in Secondary Schools Through Portfolio Development. Bryan, Joyce Bethea, 01 January 1998.
Secondary school students lacking the technological literacy required for job success in the 21st century participated in an action-oriented research study to increase their literacy levels. A team of teachers, 9th-grade students, media specialists, and a researcher implemented a technological and information skills model across subject-area disciplines in an effort to identify the needed skills and implement an instructional program for technological literacy. The researcher worked with a formative and summative committee to design and produce a conceptual design, scope, sequence, and instructional schedule that served four grade levels across subject-area curricula. Teachers used an interdisciplinary approach to instruction and determined that effective and efficient teaching for technological literacy across the curriculum was achieved. Students successfully demonstrated performance in 14 core competencies over a two-month period during regular courses in five major disciplines. During the study, students benefited from opportunities to engage in supplemental technological activities by individual choice. Performance of technological objectives was marked and entered on checklists for planned future entry into a networked database for use by all teachers and administrators. Individual checklists were printed and became a part of student portfolios displaying technological learning. Other items in the portfolios included self-analyses at entry and exit and pre- and post-instruction compositions. Assessment instruments developed for the study were used to evaluate teacher attitudes, portfolio development, student attitudes, and class performance. Teachers and technology committee members judged the program to be successful and projected a need for implementing the program for the entire school population. Findings and recommendations showed that cross-discipline instruction based on the model used in the study was a solution for increasing student literacy levels through increased understanding and demonstrated performance. The study revealed a need for further research in the areas of curriculum space, cooperative work, and contextual problem-solving education as they apply to improving technological literacy in secondary schools.
88
An Extension to the Information System Architecture Framework for Eliciting and Representing System Requirements. Buchanan, Candice L., 01 January 1997.
This dissertation explains how management should be able to describe to others the problem they want to resolve, or the opportunity they wish to take advantage of, in terms of their needs, desires, and anticipated results. This statement of the problem or opportunity is the basis for eliciting and representing the system requirements for those problems and opportunities that require an automated solution (i.e., an information system). The goal of this dissertation was to minimize and, where feasible, eliminate the major issues and problems associated with the elicitation and representation of system requirements.
A System Requirements Knowledge Structure and an Extended Information System Architecture (ISA) Framework were developed to assist in identifying, defining, and specifying system requirements. The System Requirements Knowledge Structure consists of a set of graphic representations and narratives that identify the types of system requirements that should be considered when defining the requirements for an information system; it serves as a model for identifying, articulating, defining, and classifying system requirements.
The Extended ISA Framework serves as a model for describing, conveying, sharing, validating, and verifying system requirements. It is based on the Information System Architecture Framework, developed initially by John Zachman (1987) and later extended by John Sowa and John Zachman (1992). The ISA Framework and Extended ISA Framework are based on the six basic English interrogatives: Who, What, Where, When, How, and Why. These represent the columns in a table that describes the real world, while the rows capture the different perspectives of the parties involved in the development of an information system. More specifically, the top two rows of the Extended ISA Framework (which consist of nine models) represent information about an organization, and the third row (which consists of seven models) reveals the initial definition of the technology to be used to develop or enhance one or more information systems.
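To make the row/column structure concrete, the sketch below lays out an ISA-style framework as a small matrix keyed by perspective (row) and interrogative (column). The row labels follow the commonly cited Zachman perspectives and the cell contents are placeholders; neither reflects the specific models defined in the dissertation.

```python
# Illustrative-only sketch of an ISA-style framework matrix: columns are the six
# interrogatives, rows are stakeholder perspectives. Cell contents are placeholders.

INTERROGATIVES = ["What", "How", "Where", "Who", "When", "Why"]

# Assumed row labels; the abstract does not enumerate the rows themselves.
PERSPECTIVES = ["Scope (planner)", "Enterprise model (owner)", "System model (designer)"]

framework = {
    (perspective, interrogative): f"model describing {interrogative.lower()} "
                                  f"from the {perspective.lower()} perspective"
    for perspective in PERSPECTIVES
    for interrogative in INTERROGATIVES
}

# Example lookup: the cell an analyst might consult when capturing motivation
# ("Why") at the scope level.
print(framework[("Scope (planner)", "Why")])
```

Keying requirements artifacts to a (perspective, interrogative) cell in this way is one reading of how such a framework helps classify and validate system requirements.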
89
Implementing and Evaluating a Bibliographic Retrieval System for Print and Non-Print Media Materials. Buchholz, James L., 01 January 1987.
A fast-growing South Florida school district struggled to provide needed central cataloging and processing services for library books and non-print media materials to its 103 school centers. Previous methods involved the manual typing of spine labels, book/material checkout cards and pockets, and either the original production of catalog cards, the duplication of cards held in the master file, or the ordering of available cards from the Library of Congress by U.S. Mail. Prior analysis by the researcher indicated that a computer-based bibliographic retrieval system, properly configured to meet district and school specifications, might be implemented to eliminate the mail ordering of card sets from the Library of Congress and to simplify and expedite the in-house production of cards and the processing of materials not cataloged by the Library of Congress. The researcher considered the provision of district-wide cataloging services and full "shelf-ready" processing of media materials to 103 school centers to be a significant effort worthy of study and relevant to existing problems in the information science field.
A comprehensive search of the professional literature was conducted to obtain more information about bibliographic retrieval systems then in use, including their merits and disadvantages. Media supervisors in selected colleges and other Florida school districts were queried for input about research they had conducted and solutions they had employed, relative to the selection phase of the study. Based on this information-gathering process, retrieval systems and/or ancillary products capable of solving the institutional problem were identified. Selected vendors were contacted for specific information about their products, which was further analyzed for possible acquisition. Based on the information received from all sources, the Biblio-File system was found to be the most cost-effective solution and the one most capable of enhancing cataloging and processing operations. Its purchase was recommended to, and approved by, higher-level district administrative personnel.
Once the system was received, it had to be configured to ensure that the materials it produced were consistent with both existing institutional guidelines and the MARC, AACR II, and ISBD formats. During this phase, existing personnel were trained to use the system and queried for input relative to its implementation. Care was taken to ensure that existing cataloging and processing standards were not sacrificed through an inadvertent enthusiasm to effect positive implementation of the system. By the same token, safeguards were taken to ensure that dislike of change, particularly automated change, on the part of existing personnel did not adversely affect the implementation of the system. During the configuration and limited implementation stages, which lasted two months, many procedural changes were identified that would enhance the full implementation of the system. Configuration adjustments were made throughout these stages until the materials the system produced were of the desired quality and format.
Once the system was up and running and producing materials at a high level of staff satisfaction, system utilization moved into the full implementation stage. During this six-month phase the system was used to produce processing materials for all books and audiovisual materials cataloged by the Library of Congress. Additionally, the system was used for the in-house production of processing materials for books and audiovisual materials for which there was no cataloging data either in the system database or in the district master file. During this phase, many procedural changes were identified and implemented, resulting in the writing of revised procedures for the Processing Section. Significant hardware changes were also made during this phase to enhance production capabilities.
Following the full implementation phase, it became necessary to evaluate the system's effectiveness. In the researcher's opinion, system evaluation had to be based on both a survey of school media specialists relative to their needs and expectations and an in-house time-cost study conducted at the institutional level to determine the relative costs or savings of the new system as opposed to the preexisting procedures. In that regard, an evaluative instrument was constructed and distributed to district media personnel to facilitate the gathering of data about the effectiveness of the newly operational system from their point of view. Also, a time-cost study comparing the production of processing materials under the old set of procedures and with the new system was conducted by gathering direct time measurements of the cataloging and processing functions. Results from both analyses strongly indicated that system production was viewed favorably, both from the standpoint of district school media specialists and administratively from a cost-effectiveness point of view. Several recommendations from staff and media specialists were analyzed and incorporated into the system's production capability. Additionally, the researcher has considered several future measures that would facilitate the storage of cataloging data in a proposed district union catalog. The researcher was able to supervise the selection, installation, configuration, implementation, and evaluation of the Biblio-File system.
90
A Study of Computer Aversion Factors and Their Effect on an Older Adult Population's Computer Anxiety. Burkett, William Henry, 01 January 1993.
The purpose of this study was to examine the relationship between older adults and computer anxiety by identifying specific computer aversion factors and their influence on an older adult population's computer anxiety. The study was undertaken in Palm Beach County, Florida, in the fall of 1992 and the winter of 1993. The study collected and analyzed the responses of 393 adults to Meier's Computer Aversion Survey. Though both populations exhibited traits of computer anxiety, the older adult population exhibited a higher level of anxiety on all three factors observed.
Older adults are not computer illiterate because they cannot or will not learn to use computers. The study determined that they are computer illiterate due in part to their lack of an appropriate outcome expectation for computer use. They do not think they can realize a usable outcome from using a computer system; therefore, they do not attempt to use one. They have not been shown, or convinced of, what they could achieve through computer use. Furthermore, they will continue to miss the advantages of the computer revolution if they are not directly and specifically confronted with the benefits of computer systems. If they do not become computer literate, society will have lost the knowledge that they could have provided had they had the opportunity to meld their years of expertise with the capabilities of the computer revolution.