  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

A Systems Analysis of a Networked Information Technology System at a Local Police Department: The Melbourne Police Department Case Study

Bass, M. Joanna 01 January 2002 (has links)
As the new millennium begins, networked information technology systems are progressively more essential to the success of an organization (Turban, McLean & Wetherbe, 2001). With the steady growth of computer power and ever-increasing access to communications networks, digital information is an increasingly important resource (Li, Wang, & Wiederhold, 2000). As information technology has changed, so too has its implementation. The changing digital information era presents significant challenges to organizations because there is a growing interdependence between organizational management methods and procedures on one side and information technology and communications on the other (Laudon & Laudon, 2001). A change in any one of the components often requires changes in other components (Laudon & Laudon, 2001). The changes and challenges to organizations brought about by information technology apply to governmental entities such as local police departments. The widespread use of computers and the rapidly developing technology of communications have combined to dramatically increase the volume and complexity of digital information resources available for criminal investigations and criminal-related research (Freeh, 2000). Following the September 11, 2001 terrorist attacks on the United States, emphasis has been placed on using networked information resources as a means to identify people and determine associated criminal histories. This single case study investigated the implementation of a networked information technology system operating in the real-world setting of a local police department. The objective of this study was to examine information technology successfully implemented as an investigative resource for law enforcers.
This included reviewing criminal justice information resources that became available between 1999 and 2001, and reviewing the impact of the September 11, 2001 events on information technology implementation for criminal investigation and research. It also included the examination of theories taken from recent literature of the period 1998 to 2002 regarding methods of successful system implementation. The major contribution of the study is that it provides a broad, comprehensive analysis of a mixed-mode wireless and wireline networked information technology system implemented to support criminal investigations and criminal-related research. The study presents a timely model of technology implementation, following the September 11, 2001 terrorist attacks and passage of the USA PATRIOT Act. Other police departments and criminal justice agencies can use the study model to implement similar networked information systems. Results of the study indicated that the implementation of the information technology was successful because the implementation followed accepted theories in recent literature and has had a consistent positive impact on the Melbourne Police Department. Positive and significant productivity results were achieved via the implementation processes utilized by the police department. Lastly, the study provides recommendations to local police departments and criminal justice agencies on whether continued implementation of information technology developed under current theories is warranted.
62

A Front-Loaded Agile Software Process for Web Applications

Bautista, Eduardo L. 01 January 2004 (has links)
The current state of application development for the World Wide Web is characterized by anarchy and ad hoc methodologies. In recent years, various hypermedia methodologies have been proposed to facilitate the deployment of Web applications. However, no standard has emerged to fulfill the need for a systematic and methodological approach to complex and dynamic Web application development. The primary goal of this dissertation was to elaborate a software development process for Web applications that focuses most of the developers' efforts and creativity on the phases that determine software requirements, analysis, and design, while remaining flexible and adaptable enough that developers can respond to changes in requirements without major cost overruns or delays. In order to develop an effective process, the researcher examined the recommended tasks for Web development as provided by the literature. As a result, the new software process defines the following phases: Feasibility Study, Requirements Definition, System Specification, System Design, Program Design and Development, System Test, Implementation and Production, and Maintenance. The Web software process was developed for a particular initial-level organization. A statistical instrument, the Rating and Evaluation Guide (REG), which utilizes data gathered via questionnaire, was used to measure the perceived effectiveness of the developed software process. The new process was evaluated against a standard development methodology from the literature. The evaluation was performed by randomly selected teams of developers in an initial-level organization, and it revealed that the new Web application development process was perceived to be more effective than a generic software process provided by the literature. The scores for all twenty-six evaluation items in the REG were higher for the new software process than for the generic methodology.
Additionally, the new software process achieved an overall score of 175 points on the REG scale, a high rating, substantially higher than the anticipated minimum score of 132. Furthermore, the new software process scored 41 out of a maximum 48 points in the REG Properties category, which exceeds the anticipated minimum score of 40. A high score in the Properties category reveals that the inherent attributes of a high-quality methodology are included in FLASOFTi, and the high scores for all six distinct properties in this category indicate that the method's overall quality was judged by the participants to be quite high. The findings of this study will be of practical value to initial-level organizations in which the corporate culture tends to require planning development projects thoroughly before working on them. Although the researcher recognizes that a "one-size-fits-all" approach is not appropriate when applying a software process to Web projects, and that the factors influencing the development of a Web site are complex and vary from organization to organization, most computing environments that require small to mid-size Web application development projects will be able to use the new method with minor revisions to suit both the project and the organizational culture.
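The REG thresholds reported above (an anticipated overall minimum of 132 and a Properties minimum of 40 out of 48) lend themselves to a simple aggregation check. A minimal Python sketch; the item names and scores below are hypothetical, and only the thresholds are taken from the abstract:

```python
def reg_scores(item_scores, properties_items):
    """Aggregate Rating and Evaluation Guide (REG) item scores: the overall
    total across all evaluation items plus the Properties-category subtotal."""
    overall = sum(item_scores.values())
    properties = sum(item_scores[i] for i in properties_items)
    return overall, properties

def meets_thresholds(overall, properties, min_overall=132, min_properties=40):
    # Defaults are the anticipated minimum scores reported in the study.
    return overall >= min_overall and properties >= min_properties
```

With the study's reported results (175 overall, 41 in Properties), both thresholds are met.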
63

Karel The Robot: A Gentle Introduction To The Art Of Programming For The Apple IIE

Beckner, Howard E. 01 January 1992 (has links)
'Karel the Robot' is a pre-Pascal programming tool specifically designed to introduce students to the structure and form of programming. The original version was written by Richard Pattis for the UNIX operating system, and it is now available for the IBM and Macintosh. Because of the high number of Apple IIe computers in use in schools, the author has written a version of 'Karel the Robot' for this machine. The program was used to teach 60 high school computer students the Karel language. The IBM version was then used to teach the same lessons to 49 additional students. The students were given a teacher-made test to evaluate their knowledge of Karel principles and were asked to respond to a questionnaire. The results of the two groups' tests were compared using a z-test. The z-test result of -1.72 at a .05 level of significance indicated that there was no significant difference between the two groups. The test was also checked for internal consistency using the split-half method and the Spearman-Brown formula. The Spearman-Brown result of 0.833 indicated a high degree of consistency within the test. Other comparisons were made by sex, grade level, computer experience, and math enrollment. The only significant difference in the test results occurred with math enrollment. The statistical results confirmed that in this case the two computer programs were equally effective in teaching the Karel language. The Apple version of 'Karel the Robot' is available free of charge from the author or from the Alabama Council for Technology in Education.
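The two statistics named above have simple closed forms. A generic sketch (formulas only, not the study's raw data): the Spearman-Brown prophecy formula predicts full-test reliability from the split-half correlation, and a two-sample z statistic compares two independent group means.

```python
import math

def spearman_brown(r_half):
    # Spearman-Brown prophecy formula: reliability of the full-length test
    # predicted from the correlation between its two halves.
    return 2 * r_half / (1 + r_half)

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    # z statistic for the difference between two independent sample means.
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se
```

A |z| of 1.72 falls short of the two-tailed .05 critical value of 1.96, which is why the abstract reports no significant difference between the two groups.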
64

A Model for Cultural Resistance in Business Process Re-engineering Failure

Beebe, Larry E. 01 January 1997 (has links)
The need for a new way of conducting organizational business has been identified as essential to remaining competitive. Increasingly, businesses and organizations have turned to redesigning or re-engineering operational business processes to improve performance and competitiveness. Business process re-engineering (BPR) has become a methodology that management uses when radical change is required in organizational practices. Despite the widespread implementation of BPR, most projects have failed. A major reason for re-engineering failure is cultural resistance. The evidence about culture in re-engineering suggests that the majority of BPR projects are implemented by cross-functional, multi-disciplined teams, so such teams were the focus of the research. A review of the literature failed to provide a significant guideline that management could use to address cultural resistance. Accordingly, it was necessary to examine social issues in order to determine what management could do to reduce cultural resistance in BPR teams. The hypothesis was that cultural resistance in BPR implementations can be reduced and that a model can be developed that will effectively guide management intervention in the implementation of BPR. Findings suggested that cultural resistance could be reduced if the correct combination of team characteristics is present, such as: openness and candor, leadership that does not dominate, decisions by consensus, understood and accepted goals, progress and results assessed, a comfortable atmosphere, common access to information, and a win-win approach to conflict. Results indicate that these characteristics can be measured and relationships established using the Myers-Briggs Temperament Index, the Belbin Leadership Model, and the Motivational Potential Score. The QFD Matrix has been demonstrated to provide a sound approach for assessing these relationships. Committees and a Pilot Group provided feedback during the development of the model.
It seems clear that BPR methodology, with a credible plan for social re-engineering implementation, can play a significant role in gaining competitive advantage in the modern organization. BPR without consideration of social or cultural factors is likely to meet significant resistance. This resistance will result in disappointing re-engineering implementation results, wasting vital organizational resources.
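The claim that team characteristics "can be measured and relationships established" via a QFD matrix can be illustrated with a toy weighted-matrix computation. Everything below is hypothetical except the QFD-matrix idea itself: the characteristic and factor names, the 9/3/1 relationship weights (a common QFD convention), and the importance ratings are all invented for illustration.

```python
# Hypothetical QFD-style relationship matrix: rows are team characteristics,
# columns are cultural-resistance factors; entries use the common 9/3/1 scale.
CHARACTERISTICS = ["openness and candor", "decisions by consensus", "understood goals"]
RESISTANCE_FACTORS = ["fear of change", "loss of status", "poor communication"]
RELATIONSHIP = [
    [9, 1, 9],
    [3, 9, 3],
    [9, 3, 9],
]
FACTOR_IMPORTANCE = [5, 3, 4]  # illustrative importance rating per factor

def characteristic_priorities(matrix, importance):
    """Priority of each characteristic: its weighted sum across all factors,
    indicating where management intervention should concentrate first."""
    return [sum(w * i for w, i in zip(row, importance)) for row in matrix]
```

Ranking the resulting priorities points management at the characteristics most strongly related to the highest-importance resistance factors.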
65

The Effect of Computer-Based Accounting Practice Sets on The Achievement of Introductory College Accounting Students

Bernard, Bryce A. 01 January 2002 (has links)
The purpose of this study was to measure the effectiveness of using a computer-based practice set to teach college students enrolled in an introductory accounting course the processes, procedures, and records that are used in an accounting system. The study addressed the ongoing concern of accounting educators about the effectiveness of computer-based accounting practice sets by comparing test results for a group of students enrolled in a lower-division accounting principles course. The course was structured with a common lecture component and two accounting lab sections. Students in one lab section completed a manual accounting practice set, and students in the other lab section completed a computer-based accounting practice set. All students were pretested at the beginning of the semester and post-tested at the end of the semester. The results of this study indicate that the difference in treatment had no significant impact on post-test scores.
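The comparison described above is typically done with a two-sample test on the post-test scores of the two lab sections. The abstract does not name the statistic used; Welch's t (which does not assume equal variances) is one standard choice, sketched here with illustrative data only:

```python
import math

def welch_t(scores_a, scores_b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances -- one common way to compare section post-test means."""
    na, nb = len(scores_a), len(scores_b)
    ma = sum(scores_a) / na
    mb = sum(scores_b) / nb
    # unbiased sample variances
    va = sum((x - ma) ** 2 for x in scores_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in scores_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

A t value near zero, as with equal section means, is consistent with the study's finding of no significant treatment effect.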
66

Knowledge Discovery by Attribute-Oriented Approach Under Directed Acyclic Concept Graph (DACG)

Bi, Wenyi 01 January 2001 (has links)
Knowledge discovery in databases (KDD) is an active and promising research area with potentially high payoffs in business and scientific applications. The great challenge of knowledge discovery in databases is to process large quantities of raw data automatically, to identify the most significant and meaningful patterns, and to present this knowledge in an appropriate form for decision making and other purposes. In previous research, Attribute-Oriented Induction implemented the artificial intelligence "learning from examples" paradigm. This method integrates traditional database operations to extract rules from database systems. The key techniques in attribute-oriented induction are attribute generalization and undesirable attribute removal. Attribute generalization is implemented by replacing a low-level concept with its corresponding high-level concept. The core part of this approach is a concept hierarchy, a linear tree schema built on each individual and independent domain (attribute), which controls concept generalization. Because such a linear concept hierarchy represents concepts confined to each independent domain, this topology leads to a learning process without the capability of conditional concept generalization. Therefore, it is unable to extract rich knowledge implied in different directions of a non-linear concept schema. Although some recent improvements have extended the basic attribute-oriented induction (BAOI) approach, they have shortcomings. For example, rule-based attribute-oriented induction has to invoke a backtracking algorithm to tackle the information loss problem, whereas path-id generalization has to transform each data value (at great cost) in the database into its corresponding path id in order to perform generalization on the path-id relation instead.
To overcome the above limitations, we propose a non-linear concept schema, the Directed Acyclic Concept Graph (DACG), to extend the power of BAOI in order to discover knowledge across multiple domains conditionally. By utilizing graph theory, a DACG can be transformed into an equivalent linear concept tree, which is a linear concept schema. Additionally, we apply functional mappings, which map values from multiple domains to their high-level concepts in the codomains, to implement concept generalization. Therefore, our approach overcomes the limitations of BAOI and enriches the spectrum of learned patterns. Even though concept learning under a non-linear concept schema is substantially more complicated than under the linear concept tree of BAOI, this research shows that our approach is feasible and practical. In addition to the theoretical discussion presented in this dissertation, our solution has been implemented both in Java (JDK 1.2) against Oracle 8i under Solaris on Ultra 450 machines and in PL/SQL in Oracle 8i under Windows 2000 to generalize rich knowledge from live production databases.
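The core BAOI step described above, replacing a low-level concept with its higher-level concept under a concept hierarchy and merging the resulting duplicate tuples, can be sketched in a few lines. The hierarchy below is a hypothetical linear (tree-shaped) one for a single attribute; DACG's contribution is precisely to generalize this structure to a directed acyclic graph spanning multiple domains.

```python
# Hypothetical concept hierarchy for a 'city' attribute (child -> parent).
CITY_HIERARCHY = {
    "Miami": "Florida", "Orlando": "Florida",
    "Dallas": "Texas", "Austin": "Texas",
    "Florida": "USA", "Texas": "USA",
}

def generalize(value, hierarchy, levels=1):
    """Climb the concept hierarchy, replacing a low-level concept with its
    higher-level concept (the core attribute-generalization step)."""
    for _ in range(levels):
        value = hierarchy.get(value, value)
    return value

def generalize_relation(rows, attr, hierarchy, threshold=3):
    """Generalize one attribute until its distinct values fall to the
    threshold (or the hierarchy is exhausted), then merge identical
    tuples and attach a support count to each generalized tuple."""
    while len({r[attr] for r in rows}) > threshold:
        new_rows = [{**r, attr: generalize(r[attr], hierarchy)} for r in rows]
        if new_rows == rows:  # no value climbed any further; stop
            break
        rows = new_rows
    merged = {}
    for r in rows:
        key = tuple(sorted(r.items()))
        merged[key] = merged.get(key, 0) + 1
    return [{**dict(key), "count": count} for key, count in merged.items()]
```

The attribute threshold plays the role BAOI's generalization threshold plays in deciding when a relation is "general enough" to read off rules.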
67

Web Information System (WIS): Information Delivery Through Web Browsers

Bianco, Joseph 01 January 2000 (has links)
The Web Information System (WIS) is a new type of Web browser capable of retrieving and displaying the physical attributes (retrieval time, age, size) of a digital document. In addition, the WIS can display the status of Hypertext Markup Language (HTML) links using an interface that is easy to use and interpret. The WIS also has the ability to dynamically update HTML links, thereby informing the user of the status of the information. The first generation of World Wide Web browsers allowed for the retrieval and rendering of HTML documents for reading and printing. These browsers also provided basic management of HTML links, which are used to point to often-used information. Unfortunately, HTML links are static in nature -- other than serving as a locator for information, an HTML link provides no other useful data. Because of the elusive characteristics of electronic information, document availability, document size (page length), and absolute age of the information can only be assessed after retrieval. WIS addresses these shortcomings of the Web by using a different approach to delivering digital information within a Web browser. By attributing the physical parameters of printed documentation, such as retrieval time, age, and size, to digital information, the WIS makes using online information easier and more productive than the current method.
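The physical attributes WIS displays (retrieval time, age, size) map naturally onto HTTP response metadata. A sketch of how such attributes could be gathered with an HTTP HEAD request, so size and age are known before the document body is fetched; this illustrates the idea, not the WIS implementation itself:

```python
import time
import urllib.request

def summarize(status, headers, elapsed_s):
    """Shape response metadata into the attributes a WIS-style UI would show."""
    return {
        "status": status,
        "retrieval_time_s": round(elapsed_s, 3),
        "size_bytes": headers.get("Content-Length"),    # document size
        "last_modified": headers.get("Last-Modified"),  # basis for document age
    }

def document_attributes(url):
    """Issue a HEAD request: headers only, no body, so availability,
    size, and age can be assessed before full retrieval."""
    req = urllib.request.Request(url, method="HEAD")
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=10) as resp:
        return summarize(resp.status, resp.headers, time.monotonic() - start)
```

Servers are not required to send `Content-Length` or `Last-Modified` on HEAD responses, so a real implementation would need a fallback (e.g., a ranged GET) when these fields are absent.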
68

A Model for Developing Interactive Instructional Multimedia Applications for Electronic Music Instructors

Biello, Antonio D. 01 January 2005 (has links)
This study investigated methods for designing a procedural model for the development of interactive multimedia applications for electronic music instruction. The model, structured as a procedural guide, was derived from methodologies synthesized from related research areas and documented as a reference for educators, instructional designers, and product developers for developing interactive multimedia applications. While the model was designed primarily for junior college electronic music students, it has the potential for generalization to other related disciplines. A Formative Committee consisting of five experts in the areas of education, music education, cognitive psychology, and institutional research assisted in the development of a set of criteria for the model. Utilizing the Nominal Group Technique, the committee examined, evaluated, and scored the efficacy of each proposed criterion according to its relevance to the model. Criteria approved by the committee and the researcher were incorporated in the model design. A Design Committee comprised of five experts in the areas of instructional design, media/interaction design, behavioral psychology, and electronic music evaluated and validated the criteria set established by the Formative Committee. The validation was realized through surveys and formative feedback of the criteria set developed by the Formative Committee. Prototype instantiations of the process model were an integral part of the model development process. Prototypes derived from the model were used to test the efficacy of the model criteria. A Development Committee comprised of members of the Formative and Design committees examined and evaluated prototype instantiations. Recommendations for improvements were implemented in the model design. A Pilot Study was conducted by the Development Committee to assist the product development process and to evaluate the efficacy of the model. 
As a result of the study, a number of suggestions proposed by the committee were implemented for further improvement of the model. A Summative Committee comprised of educational experts having significant experience in educational research examined the efficacy of the model criteria established and validated by the Formative and Design committees. The Summative Committee evaluated the model and made recommendations for improving the model. Founded on a set of criteria, the electronic music model was successfully developed and evaluated by a team of professionals. The Development and Summative Committees were satisfied with the results of this study and the criteria developed for the model design were deemed to be complete and relevant to the model. The results of this study suggest that instruction based on this model will support the unique learning needs of students having diverse cultural, learning, and educational backgrounds.
69

Evolutionary Algorithm for Generation of Air Pressure and Lip Pressure Parameters for Automated Performance of Brass Instruments

Bilitski, James A. 01 January 2006 (has links)
The artificial mouth is a robotic device that simulates a human mouth. It consists of moveable lips and an adjustable air supply. The uses of an artificial mouth include research on physical modeling of the lips and automatic performance. Automatic performance of a musical instrument occurs when an instrument is played without the direct interaction of a human; typically, mechanics and robotics are used instead. In this study, the use of a genetic algorithm to compute air pressure and lip pressure values so that the artificial mouth can correctly play five notes on a brass instrument is investigated. In order to properly play a brass instrument, a player must apply proper tension between the lips and proper airflow so that the lips vibrate at the proper frequency. A player changes notes on a brass instrument by depressing keys and changing lip pressure and air flow. This study investigated a machine learning approach to finding lip pressure and air pressure parameters so that an artificial mouth could play five notes of a scale on a trumpet. A fast search algorithm was needed because it takes about 4 seconds to measure the frequency produced by each combination of pressure parameters. This measurement is slow because of the slow-moving mechanics of the system and a delay produced while the notes are measured for pitch. Two different mouthpieces were used to investigate the ability to adapt to different mouthpieces. The algorithm started with a randomly generated population and evolved the lip pressure and air pressure parameters with an evolutionary algorithm using crossover and mutation designed for the knowledge scheme in this application. The efficiency of this algorithm was compared to that of an exhaustive search. Experimentation was performed using various combinations of genetic parameters, including population size, crossover rate, and mutation rate.
The evolutionary search was shown to be about 10 times faster than the exhaustive search because the evolutionary algorithm searches only a very small portion of the search space. A recommendation for future research is to conduct further experimentation to determine more effective crossover and mutation rates.
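The evolutionary loop described above, a population of (lip pressure, air pressure) pairs refined by selection, crossover, and mutation toward a target pitch, can be sketched as follows. The frequency model is a toy stand-in for the roughly 4-second physical measurement on the artificial mouth, and every numeric choice (ranges, rates, population size) is illustrative rather than taken from the study.

```python
import random

def simulated_frequency(lip, air):
    # Toy stand-in for the slow physical measurement: maps the two
    # pressure parameters to a produced frequency in Hz (hypothetical model).
    return 220.0 + 2.0 * lip + 1.5 * air

def fitness(individual, target_hz):
    # Higher is better: negative distance from the target pitch.
    lip, air = individual
    return -abs(simulated_frequency(lip, air) - target_hz)

def evolve(target_hz, pop_size=20, generations=50,
           crossover_rate=0.7, mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, target_hz), reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1]) if rng.random() < crossover_rate else a
            if rng.random() < mutation_rate:    # clamped Gaussian mutation
                child = (min(100, max(0, child[0] + rng.gauss(0, 5))),
                         min(100, max(0, child[1] + rng.gauss(0, 5))))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(ind, target_hz))
```

The efficiency argument from the abstract is visible in the evaluation budget: this sketch evaluates at most pop_size × generations candidates, while an exhaustive sweep of the pressure grid would evaluate every combination at ~4 seconds each.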
70

The Development of Reliable Metrics to Measure the Efficiency of Object-Oriented Dispatching Using Ada 95, a High-Level Language Implementing Hard-Deadline Real-Time Programming

Bingue, Eugene W. P. 01 January 2002 (has links)
The purpose of this study is to produce a metric that accurately captures the effects of real-time dispatching using object-oriented (OO) programming applied in the maintenance phase of the life cycle of hard real-time systems. The hypothesis presented is that object-oriented programming constructs can be applied in a manner that will have beneficial life-cycle maintenance effects while avoiding adverse timing side effects. This study will use complexity measurement instruments to calculate cyclomatic complexity. This study will examine the dispatching time of each program and utilize utilities to calculate the number of machine cycles for each program component. Coding techniques will be presented for various program design dilemmas that examine the object-oriented dispatching features.
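Cyclomatic complexity, the measure named above, is McCabe's V(G) = E - N + 2P, which for a single structured subprogram reduces to the number of decision points plus one. A naive token-counting sketch over Ada-like source; the keyword set is illustrative, and a real instrument would parse the program rather than pattern-match text:

```python
import re

# Keywords counted as predicate (decision) nodes, loosely modeled on Ada
# control constructs: if/elsif, while/for loops, and case alternatives.
DECISION_TOKENS = re.compile(r"\b(if|elsif|while|for|when)\b")

def cyclomatic_complexity(source):
    """McCabe's V(G) approximated as decision points + 1 for one subprogram."""
    # Ada closing keywords ("end if", "end loop", "end case") are not
    # decisions, so strip them before counting.
    body = re.sub(r"\bend\s+(if|loop|case)\b", "", source)
    return len(DECISION_TOKENS.findall(body)) + 1
```

A dispatching call replaces an explicit case-style selection with a runtime vtable lookup, so an OO refactoring can lower a subprogram's V(G) while moving the timing cost into dispatch, which is exactly the trade-off the study's metric targets.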
