281

An Extensible Markup Language (XML) Application for the University Course Timetabling Problem

Lehman, Jeffrey L. 01 January 2004 (has links)
The university course timetabling problem involves the assignment of instructors, courses, and course sections to meeting rooms, dates, and times. Timetabling research has generally focused on the algorithms and techniques for solving specific scheduling problems. The independent evaluation and comparison of timetabling problems and solutions is limited by the lack of a standard timetabling language. This dissertation created an Extensible Markup Language (XML) application, called Course Markup Language (CourseML), for the university course timetabling problem. CourseML addressed the need for a standardized timetabling language to facilitate the efficient exchange of timetabling data and provided a means for the independent evaluation and comparison of timetabling problems and solutions. A sample real-world university course timetabling problem was defined, and CourseML was used to define the sample problem. CourseML was evaluated based on how well it captured the sample problem, including hard and soft constraints, and how well it represented a solution instance. The qualities that made CourseML a candidate for general use were identified, as was the set of characteristics that made XML an appropriate language for specifying university course timetabling problems and solutions.
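A minimal sketch of how a timetabling instance, with hard and soft constraints, might be encoded in such an XML application (the element and attribute names here are hypothetical; the published CourseML schema is not reproduced in this abstract):

    # Build a hypothetical CourseML-style document with the standard library.
    # All element and attribute names are illustrative assumptions.
    import xml.etree.ElementTree as ET

    problem = ET.Element("timetablingProblem")

    course = ET.SubElement(problem, "course", id="CS101", title="Intro to Computing")
    ET.SubElement(course, "section", id="CS101-A", instructor="Smith")

    # Hard constraint: the section must meet in a room with enough seats.
    ET.SubElement(problem, "hardConstraint", type="roomCapacity",
                  section="CS101-A", minSeats="40")

    # Soft constraint: the instructor prefers morning meetings.
    ET.SubElement(problem, "softConstraint", type="timePreference",
                  instructor="Smith", preferred="morning")

    print(ET.tostring(problem, encoding="unicode"))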
282

The Mapping and Integration of The Haskell Language to The Common Object Request Broker Architecture

Leitner, Lee J. 01 January 1996 (has links)
This dissertation is about the mapping and integration of the pure functional language Haskell to the Object Management Group's (OMG) Common Object Request Broker Architecture (CORBA). The purpose of this work is to create the definitions necessary for programs written in the Haskell language to interoperate successfully with programs written in any other programming language operating within the OMG/CORBA environment. This work extended prior work in the areas of language integration into distributed environments and language mappings to the OMG/CORBA environment. It also extended and synthesized prior theoretical and applied research to integrate imperative and object-oriented characteristics into the Haskell programming language. To accomplish this objective, a language mapping from the OMG Interface Definition Language to Haskell was created, and specific extensions were created in Haskell to support the semantics of this interface definition language. These extensions also respected Haskell's pure functional, non-strict semantics. It is expected that the results of this work are sufficient for object brokerage systems to be implemented that support the mapping and integration definitions defined in this dissertation. In addition, it is expected that the extensions and techniques defined in this work may have further utility in similar theoretical and applied problem domains.
283

A National Approach to Touch Keyboarding Instruction on Computers in Primary Schools in Belize

Lewis, Gilda 01 January 1998 (has links)
As a result of this study, it was possible to make suggestions for informed pedagogical decisions regarding the manner in which learning should be structured for a national approach to touch keyboarding instruction on computers at the primary school level in Belize. The population consisted of 1,757 Standard 2 students in the 68 primary schools in the Belize District. The design was a posttest-only control group design with random assignment of subjects to four sub-groups, and random assignment to two types of treatment at counter-balanced times and days. A cluster sample of 29 students in an intact class, divided into four sub-groups, was drawn from a typical, co-educational, inner-city primary school in Belize City. Two sub-groups -- the experimental group -- used the Herzog System of Keyboarding, i.e., the Herzog Fast-Track text and Hub-Key Sensors, with presentation of the keys in alphabetic sequence. The other two sub-groups -- the control group -- received keyboarding instruction by the traditional method, i.e., a keyboarding text, and home keys followed by random letters. Subjects were taught the alphabet keys, period, comma, shift lock, and shift keys in about 11 1/2 hours spread over an 8-week period, divided into 4-week sessions for each treatment. Towards the end of the period of instruction, subjects used their keyboarding capability to compose language arts material at the keyboard. At the end of the period of instruction, two 3-minute straight-copy timings were administered as the posttest. Scores for each subject were averaged and analyzed by a parametric statistical test, viz., analysis of variance (ANOVA). It was hypothesized that subjects taught touch keyboarding by the Herzog System would achieve higher speeds, be more accurate, and be more adept at composing at the keyboard than students taught by the traditional method. The analysis of variance procedure did not support the first two hypotheses, but it supported the third.
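As a sketch of the kind of analysis described (a one-way ANOVA comparing averaged posttest speed scores between the two treatment groups), using made-up scores rather than the study's data:

    # Hypothetical posttest words-per-minute scores; the study's actual
    # data are not reproduced in this abstract.
    from scipy.stats import f_oneway

    herzog_wpm = [12.5, 14.0, 11.8, 13.2, 15.1, 12.9]
    traditional_wpm = [11.9, 13.5, 12.2, 14.0, 12.8, 11.5]

    # One-way ANOVA; with two groups this is equivalent to a t-test (F = t^2).
    f_stat, p_value = f_oneway(herzog_wpm, traditional_wpm)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")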
284

A Load Balancing Data Allocation for Parallel Query Processing

Lin, Wen-Ya 01 January 1998 (has links)
This paper presents a multidimensional schema, called the multidimensional range tree (MDR-tree), to manage multidimensionally partitioned relational database tables in parallel database system environments. To support speed-up and scalability for intensive data processing, the parallel data processing paradigm has proved to be one of the best solutions for query processing. One of the most important issues in parallel data processing systems is the management of dynamic load balancing for partitioned relational database tables. To support this management, we employ a dynamic multidimensional data structure, the MDR-tree, and its associated operations such as insertion, deletion, split, merge, and tuning. The major purpose of the split, merge, and tuning operations is to balance the distribution of data records across processors or network nodes. By using the tuning strategy, each processor or network node is initially assigned partitioned data with a load equal to that of the other processors, and this equal load is maintained as the database undergoes frequent updates such as insertions and deletions. The application of multidimensional partitioning using the MDR-tree is to support multidimensional query processing, such as OLAP (on-line analytic processing), in large database or data warehouse environments. Given the data-intensive and real-time constraints of database and data warehouse applications, both speed-up and scalability require good practical solutions for load-balanced partitioning and management of huge volumes of data. In the specific case of star queries in data warehouse applications, multidimensional joins can be processed effectively and in parallel with the support of the MDR-tree. We believe that multidimensional data partitioning and management can be one of the alternatives for these types of applications. We have performed simulation experiments to validate the effectiveness of the MDR-tree. All experimental results indicate that the MDR-tree is an effective index structure for multidimensional data partitioning.
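A minimal sketch of the load-balancing idea behind the split operation, using a simplified one-dimensional stand-in for the MDR-tree (whose full structure, and its merge and tuning operations, are not given in this abstract): each node owns a range partition, and an overloaded partition is split at its median so records stay evenly distributed.

    # Simplified stand-in for MDR-tree style load balancing: a partition
    # splits at its median key when it exceeds a capacity threshold.
    # The threshold and flat list of partitions are illustrative assumptions.
    MAX_LOAD = 4

    def insert(partitions, key):
        """Insert a key into the partition covering it, splitting on overflow."""
        for i, (lo, hi, keys) in enumerate(partitions):
            if lo <= key < hi:
                keys.append(key)
                if len(keys) > MAX_LOAD:          # overflow: split at the median
                    keys.sort()
                    mid = keys[len(keys) // 2]
                    left = (lo, mid, [k for k in keys if k < mid])
                    right = (mid, hi, [k for k in keys if k >= mid])
                    partitions[i:i + 1] = [left, right]
                return

    partitions = [(0, 100, [])]                   # one node initially owns [0, 100)
    for k in [5, 42, 77, 13, 64, 8, 91, 55]:
        insert(partitions, k)
    for lo, hi, keys in partitions:
        print(f"[{lo},{hi}) -> {len(keys)} records")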
285

The Implementation and Integration of the Interactive Markup Language to the Distributed Component Object Model Protocol in the Application of Distributed File System Security

Lin, Jenglung 01 January 1999 (has links)
This dissertation is about the implementation and integration of the interactive markup language with the distributed component object model protocol, applied to modeling distributed file system security. Among the many research areas in network security, the file system usually occupies the least important end of the spectrum. From the simple Disk Operating System (DOS) to the modern Network Operating System (NOS), the file system relies only on one or more login passwords to protect it from misuse. Today the most thorough protection scheme for the file system comes from virus protection and removal applications, but these do not prevent a hostile yet well-behaved program from deleting files or formatting a hard disk. Several network-monitoring systems provide packet-level examination, although they suffer significant degradation in system performance. To accomplish this objective, an implementation and integration of the interactive markup language with the distributed component object model protocol was created. The framework is also associated with a network security model for protecting the file system against unfriendly users or programs. The research utilizes a comprehensive set of methods that includes software signatures, caller identification, backup of vital files, and encryption of selected system files. It is expected that the results of this work are sufficient for component objects to be implemented that support the integration definitions defined in this dissertation. In addition, it is expected that the extensions and techniques defined in this work may have further utility in similar theoretical and applied problem domains.
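One of the listed methods, software signatures for vital files, can be sketched as a generic integrity check (this is an illustration of the general technique, not the dissertation's actual implementation):

    # Generic file-signature check: record a hash of each vital file and
    # flag any later mismatch. Illustrative only.
    import hashlib

    def file_signature(path):
        """Return the SHA-256 digest of a file's contents."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def verify(path, recorded_signature):
        """Report whether a vital file still matches its recorded signature."""
        return file_signature(path) == recorded_signature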
286

The Future of Newspapers: A Study of the World Wide Web and Its Relationship to Electronic Publishing of Newspapers

Lindoo, Edward C. 01 January 1998 (has links)
The purpose of this study was to determine the effects that the World Wide Web (Web) is having on newspaper publishing. With the development of the Web, more than 4,000 electronic publishers have created Web sites and are now in competition not only with each other but also with traditional media such as newspapers, magazines, radio, and television. Due to a variety of factors, including advertisers expanding into Web markets, newspaper publishers perceive this new competition to be not only the most immediate but also the most serious, systemic, long-term threat to traditional newspaper publishing to date. Therefore, the goal of this dissertation was to study the development of electronic newspapers, to ascertain how newspapers are currently using the World Wide Web, to suggest how new technologies such as the Web might be used by newspaper companies in the future to keep their share of the information dissemination marketplace, and to draw conclusions as to the importance of the Web to newspapers, now and in the future. Millions of dollars have been spent by newspapers to get their product on the Web, and millions more will be spent to keep their presence on the Web. However, few publishers are making money on the Web, and some have ceased Web operations almost as quickly as they started. Through the implementation of a survey, questions about profitability, staffing, pricing, promotion, and subscriptions were answered. Ultimately, this information can be used by newspaper publishers to enhance their Web product as they move into the future. The success of this project was based on the literature review, the results of the survey, and the final analysis, which put into perspective where the electronic publishing industry is today and what newspaper publishers need to do in the future to remain competitive while maintaining their share of the market. Although the literature contained much hype about the Web, the final results of this paper show that the newspaper industry as a whole is not in as much danger as most publishers fear. And because most newspapers have created Web sites, they have positioned themselves well to fight off competition.
287

Applying Decision Theory to Quantify the Cost of Network Security Risk

Linnes, Cathrine 01 January 2006 (has links)
This research quantifies the maximum potential loss due to a breach of security, to help decision makers understand and justify the expenses necessary to properly protect information systems and to identify the optimally priced security features that provide the maximum cost-benefit ratio. The purpose is to help assess and reduce the value of risk as close to zero as possible, so that companies spend neither too little nor too much on security prevention. The research uses decision analysis, specifically a "decision tree" and "influence diagram," to model the problem, quantify the losses, and gauge the risk associated with network intrusions and the security technologies applied by an organization. The model is designed to help decision makers balance the costs of security procedures against the potential costs of internal and external information systems misuse and computer crime, whether an attack is intentional or unintentional. The methodology can be used to better plan for the prevention of attacks. The model incorporates sufficient flexibility to accommodate the different risks and associated costs faced by different organizations, and it will help managers understand and justify the expenses necessary to protect information systems properly.
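The decision-tree calculation described here reduces to comparing expected costs across branches; a toy example with made-up probabilities and losses (not figures from the study):

    # Toy decision-tree comparison: expected cost of each safeguard option.
    # All probabilities, costs, and option names are made up for illustration.
    def expected_cost(control_cost, breach_probability, breach_loss):
        """Expected total cost = up-front control cost + expected breach loss."""
        return control_cost + breach_probability * breach_loss

    options = {
        "no safeguard": expected_cost(0, 0.30, 500_000),
        "firewall upgrade": expected_cost(50_000, 0.10, 500_000),
        "full IDS deployment": expected_cost(150_000, 0.02, 500_000),
    }

    # Decision rule: pick the branch with the lowest expected cost.
    for name, cost in options.items():
        print(f"{name}: expected cost ${cost:,.0f}")
    print("choose:", min(options, key=options.get))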
288

Models for Optimizing the Use of Contractors in Information Systems Application Development Outsourcing

Lipman, Craig S. 01 January 1999 (has links)
The goal of this research was to design a model for optimizing the use of Information Systems (IS) sub-contractors during application development. This was accomplished through an investigation of organizations that were providers or sellers of IS, as well as an investigation of buyer-seller relationships in the manufacturing and health care industries, which have had considerable success with this process of matching buyers and sellers. Models and structures of organizations and interorganizational relationships investigated within the targeted industries were ported to, and modified for use within, the IS industry. This study identified and evaluated four models for potential porting. Structured analysis consisting of concept modeling, process modeling, and the delta chart methodology of Bockle et al. (1996) refined the models and reduced the final number to two. These models led to the conclusion that improved models for outsourcing IS application development could be derived from other industries. One of the two final models, the Vendor Network (VN) model, was derived from the health care industry; the other, the IS Vendor Strategic Center (VSC) model, was from the manufacturing industry. Both models were examples of how outsourcing functionality from the domain of a single buyer into the generic or multi-buyer domain enabled opportunities not afforded to buyers that insourced the same functions.
289

Electronic Student Portfolios: A Tool for Performance-Based Assessment (A Pilot Project in the Berks County, Pennsylvania Schools)

Lipton, Robert B. 01 January 1997 (has links)
The Pennsylvania Department of Education (PDE), under the Chapter 5 - Curriculum state regulations (Outcomes Based Education Act), requires each school district to implement a performance-based assessment system. To meet this regulation, the Department of Education requires all school districts to create and maintain portfolios for each student. The 18 school districts that comprise the Berks County Intermediate Unit had identified a need to develop a comprehensive program to implement student portfolios within their districts. A Goals 2000 federal government grant was secured (1995) to support a pilot program to evaluate the feasibility of implementing electronic portfolios in Pennsylvania's Berks County schools. A systematic approach to creating, storing, managing, and assessing student electronic portfolios was instituted and refined based on feedback from the pilot groups. Both qualitative and quantitative research formats were chosen to present the information derived from the pilot study. A quantitative analysis of a survey was used to choose the school districts that would participate in the pilot and to identify important trends and concepts that needed to be included in the pilot project. Computer resources and comprehensive training were provided to all participants for the length of the pilot study. A qualitative analysis of all the data was performed at the end of the pilot project using the surveys, interviews, and observations collected during each of the pilot studies. The study concluded that the computerization of the portfolio process aided in managing the collection and assessment activities performed by students and teachers. The pilot participants preferred software created especially for the management of electronic portfolios, such as Grady Profile. Nevertheless, the electronic portfolio process must be supported with the appropriate computer technology, training, instructional time, and staff resources to be a viable measurement of performance and assessment.
290

A JAVA based API for the Implementation and Maintenance of Domain Specific Collaborative Virtual Environments

Litman, Mike B. 01 January 2006 (has links)
Collaborative virtual environments (CVEs) are environments that actively support human-human communication in addition to human-machine communication, and that use the virtual environment as the user interface. The number of domains to which CVEs are being applied is constantly increasing, yet there are currently no standards governing the implementation of CVEs. Not surprisingly, most CVEs are proprietary to some degree and are difficult to adapt to domains for which they were not originally designed. This dissertation proposes a set of abstractions common to many CVEs and presents an API named CORE (Creation of Reusable Environments), based on these abstractions, for the implementation and maintenance of CVEs. Additionally, three case studies were undertaken to validate the API: a chat room, a virtual classroom, and a traditional multi-user dungeon (MUD). These case studies demonstrate typical uses of the API for creating CVEs. The finding of this research is that a common set of abstractions can be identified across many CVEs and that, if harnessed within an API, these abstractions make the implementation of CVEs more convenient and structured.
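A sketch of the kind of shared abstractions such an API might expose, applicable to all three case studies (class and method names here are assumptions; the published CORE interface is not shown in this abstract):

    # Minimal sketch of CVE abstractions: environments contain users, and
    # messages are broadcast to co-located users. Names are hypothetical,
    # not the actual CORE API.
    class Environment:
        """A shared space (chat room, classroom, MUD room) holding users."""
        def __init__(self, name):
            self.name = name
            self.users = []

        def join(self, user):
            self.users.append(user)
            self.broadcast(f"* {user.name} has entered {self.name}")

        def broadcast(self, message):
            for user in self.users:
                user.receive(message)

    class User:
        """A participant who can send and receive messages."""
        def __init__(self, name):
            self.name = name

        def receive(self, message):
            print(f"[to {self.name}] {message}")

    # The chat room, the simplest of the three case studies, in a few lines:
    room = Environment("chat room")
    room.join(User("alice"))
    room.join(User("bob"))
    room.broadcast("alice: hello, everyone")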
