211

The Design of a Logarithmic File Data Allocation Algorithm for Extent-Based File Systems

Heger, Dominique A. 01 January 2001 (has links)
I/O has become the major bottleneck in application performance as processor speed has skyrocketed over the last few years, leaving storage hardware and file systems struggling to keep pace. This study presented a new methodology to evaluate workload-dependent file system performance. Applying the new methodology, the benchmarks conducted on the VxFS file system served as a baseline for the design of a new logarithmic file data allocation algorithm for extent-based file systems. Most of the I/O performance evaluations conducted today have reached a steady state in which new developments are evolutionary, not revolutionary. The performance model introduced in this thesis is revolutionary, as it produces both an application and a system characterization vector, allowing researchers to conduct an in-depth analysis of a complete system environment. The constructed performance hierarchy incorporates performance dependencies at all levels of a system, which allows the system to be compared, evaluated, and analyzed at any level of abstraction. When the performance characterization vectors are combined, a single, application-specific metric allows the analyst to conduct a sensitivity study. The characterization vectors and the resulting single metric satisfy all the potential demands of I/O performance analysis, in particular systems evaluation, comparison, and optimization. The main argument made throughout this study is that systems performance has to be measured in the context of a particular application and a particular workload. The performance evaluation methodology presented in this thesis enables such measurements. Utilizing the methodology, the in-depth file system performance analysis conducted on different hardware and software configurations revealed that the new approach taken in this study was successful, as it was possible to rank the test systems in their proper order of relative performance. The methodology made it possible to normalize the results and to quantify performance in terms of performance paths that included an application, an operating system, and the underlying hardware subsystems. The file system benchmarks conducted on the VxFS and UFS file systems disclosed the strengths and weaknesses of an extent-based and a block-based file system design, respectively. In both architectures, fragmentation substantially impacts I/O performance as the file systems age. The performance measurements outlined the correlation between internal and external fragmentation and made a strong case for a greatly enhanced file data allocation algorithm for extent-based file systems. The second part of this research introduced a new file data allocation algorithm for extent-based file systems that is unique in its approach in that it questioned established boundaries and redistributed existing responsibilities. The new allocation algorithm fulfilled the major requirement of increasing I/O performance by lowering internal fragmentation without significantly increasing the metadata overhead. The presented analytical data model, as well as the actual simulation of the new file data allocation algorithm, proved the great potential of the new design. The study concluded with the recommendation for a new I/O model that is fundamentally different from all the existing ones. The vision is that a completely redesigned I/O model is necessary to provide adequate I/O performance.
To be efficient, the new I/O model will have to work around the expensive memory copies to and from user and kernel address space. Moving the data in and out of virtual address space is pure overhead, and if the subsystem that is moving the data does not implement any virtual memory technique to avoid the data copy, performance is limited to approximately 1/4 of the memory speed. The next-generation I/O model envisioned in this study focuses on eliminating all unnecessary overhead in the I/O path, allowing the hardware to operate at its full potential.
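The abstract does not detail the allocation policy itself, but the motivating idea of a logarithmic scheme can be illustrated: if each newly allocated extent doubles in size, a file is covered by a number of extents that grows only logarithmically with its size, keeping metadata overhead low, while a cap on the extent size bounds internal fragmentation. A minimal sketch under those assumptions (the function name and parameters are hypothetical, not Heger's actual design):

```python
# Illustrative sketch only; not the dissertation's actual algorithm.
# Extent sizes double on each allocation, so a file of n blocks needs
# O(log n) extents; capping the extent size bounds worst-case internal
# fragmentation in the final extent.

def plan_extents(file_size, base_extent=8, max_extent=2048):
    """Return a list of extent sizes (in blocks) covering file_size blocks."""
    extents = []
    size = base_extent
    remaining = file_size
    while remaining > 0:
        extents.append(min(size, max_extent))
        remaining -= extents[-1]
        size = min(size * 2, max_extent)
    return extents

# A 500-block file is covered by 6 extents (4 blocks of internal
# fragmentation) instead of 63 fixed-size 8-block extents.
print(plan_extents(500))  # [8, 16, 32, 64, 128, 256]
```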
212

A Data-Driven Soft Real-Time Expert System for Producing Coral Bleaching Alerts

Hendee, James C. 01 January 2000 (has links)
In the Florida Keys there are many physical, chemical and biological events of interest and concern to personnel of the Florida Keys National Marine Sanctuary, marine biologists, oceanographers, fishermen and divers. Large volumes of continuously generated meteorological and oceanographic data from instruments in the SEAKEYS (Sustained Ecological Research Related to Management of the Florida Keys Seascape) network help researchers understand these events. However, since no one has the time to look at every printout of data from every station, every day, seven days a week, it is highly desirable to have an automated system that can monitor parameters of interest and produce specialized alerts of specific events, as indicated by prescribed or abnormal ranges, or combinations of parameters. A soft real-time expert system was developed to produce such alerts based on data input from the SEAKEYS network. The prototype system collected data from the Sombrero Reef station in the network and produced automated e-mail and World-Wide Web alerts when conditions were thought to be conducive to, or predictive of, coral bleaching, which occurs under environmental conditions stressful to corals. Configuration of the system included a point system for three coral bleaching models (high sea temperature only, high sea temperature plus low winds, high sea temperature plus low winds plus low tide). The approach is an important development in the use of knowledge-based systems to solve environmental problems, as it provides for knowledge synthesis (in the form of data summaries) from any environmental ASCII data stream or table, be it real-time or not.
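The abstract names the three nested bleaching models but not their thresholds or weights; a minimal sketch of how such a point system might be coded follows, with all threshold values as hypothetical placeholders rather than the study's calibrated figures:

```python
# Hedged sketch of a nested point system for the three models named in
# the abstract; all thresholds below are illustrative placeholders only.
BLEACHING_TEMP_C = 30.0  # hypothetical "high sea temperature" threshold
LOW_WIND_MS = 3.0        # hypothetical "low wind" threshold (m/s)
LOW_TIDE_M = 0.2         # hypothetical "low tide" threshold (m)

def bleaching_alert_score(sea_temp_c, wind_ms, tide_m):
    """Score the nested models: high temperature; plus low winds;
    plus low tide. A higher score means more models are satisfied."""
    score = 0
    if sea_temp_c >= BLEACHING_TEMP_C:
        score += 1                       # model 1: high sea temperature
        if wind_ms <= LOW_WIND_MS:
            score += 1                   # model 2: ... plus low winds
            if tide_m <= LOW_TIDE_M:
                score += 1               # model 3: ... plus low tide
    return score

if bleaching_alert_score(30.5, 2.1, 0.1) >= 2:
    print("Issue coral bleaching alert (e-mail / Web)")
```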
213

Computer-Aided Performance Analysis using Product-Form Queueing Networks to Model Steady-State Behavior: An Examination of a Medical Device Communications Network

Hendrickson, David B. 01 January 2001 (has links)
Healthcare information systems share records through common messaging standards and exchange information via universal network communications protocols. This interaction benefits hospitals by lowering administration costs and improving the accuracy of recorded information. This sharing and exchanging of information benefits patients by providing easier access to medical records, enabling point-of-care services, and simplifying retrieval of real-time patient data, resulting in better patient care. Bedside medical device data complements the overall healthcare information system by providing a more complete understanding of a patient's health. This dissertation presents a simulation based on a standard network communications protocol for medical devices. This researcher addressed the problem that the use of the IEEE 1073 communications protocol to facilitate communication between legacy medical devices and hospital information systems is not adequately understood. The goal of this research was to further develop the understanding of such a system. The objective of this study was to develop a simulation model based on a reasonable approximation of a hypothetical system to identify the parameters involved and quantify its performance characteristics using real-world inputs. A model was developed using product-form queueing networks to model the steady-state behavior of a medical device communications system. The simulation model consisted of elements representing both physical and logical resources. Only the Physical and Data-Link Open Systems Interconnection (OSI) layers were considered. System configurations were limited to those defined by the IEEE 1073 communications protocol that support legacy medical devices. The inputs to the model consisted of real-world information compiled from vendor data specifications, including physical communication media, microprocessors, medical devices, and representative software implementations. The results of the simulation suggest that a medical device communications network employing an IEEE 1073 communications protocol can support a limited number of legacy medical devices, assuming a 1-limited round-robin scheduling policy with a store-and-forward data coherency strategy. However, under heavily loaded conditions, the network cannot deliver data generated by periodic multi-class workloads in a timely manner. This conclusion demonstrates the need for an efficient medium access protocol schedule specification.
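For a closed, single-class product-form network, steady-state metrics of the kind this model quantifies are conventionally computed with exact Mean-Value Analysis (MVA). A minimal sketch, with hypothetical service demands standing in for the dissertation's vendor-derived inputs:

```python
# Exact single-class MVA for a closed product-form queueing network.
# Service demands below are hypothetical, not the dissertation's inputs.

def mva(demands, customers, think_time=0.0):
    """demands: service demand D_k (seconds) at each queueing station.
    Returns network throughput X and per-station mean queue lengths."""
    q = [0.0] * len(demands)                 # Q_k(0) = 0
    x = 0.0
    for n in range(1, customers + 1):
        # residence time at station k with n circulating customers
        r = [d * (1.0 + qk) for d, qk in zip(demands, q)]
        x = n / (think_time + sum(r))        # Little's law, whole network
        q = [x * rk for rk in r]             # Little's law, per station
    return x, q

# e.g., a shared medium and a host interface, with eight attached
# legacy devices acting as the circulating customers.
throughput, queues = mva(demands=[0.004, 0.010], customers=8)
print(f"X = {throughput:.1f} messages/s, queues = {queues}")
```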
214

A Study of the Use of Paper Book Metaphors in the Design of Electronic Books

Henke, Harold A. 01 January 2002 (has links)
The goal of this research was to determine which, if any, paper book metaphors are useful in the design of electronic books and to answer the following research question: Does the inclusion of paper book metaphors in electronic books provide improved user satisfaction with electronic books? The objectives were to 1) gather user requirements for electronic book features and 2) determine how quickly users can find information in an electronic book using paper book (index and table of contents) and non-paper book (bookmarks and search tool) features. Data was gathered from a user survey and a user review of an electronic book. In the user survey, 48 features were rated by 163 participants; of these 48 features, 36 were non-paper (electronic) and 12 were paper. Of the top ten features, three (title page, table of contents, and bookshelf) were based on paper metaphors. Of the bottom ten features, only one, the watermark feature, was based on a paper metaphor. Furthermore, the majority of low-rated features were clearly associated with electronic capabilities not found in paper books. Of the survey participants, 23 completed the user review, and those who used the non-paper book features (bookmarks and the search tool) found information more quickly and were more satisfied with those features. A key finding of the user review was that the index is an important tool for finding information, and users should be provided tools to create a dynamic index within electronic books to aid them in finding information. The significance of this study is that there are few experimental studies available where the participants represented actual users of electronic books. This study also validated a list of features that can be used in future research, such as determining preference for features based on genre as well as participant age.
215

Factors Affecting the Use of Computers in Classrooms

Herring, Donna F. 17 February 1992 (has links)
The problem addressed by this investigation was the potential disparity between the availability of computers for classroom use and the extent to which computers were being used in Northwest Georgia classrooms. The study was based on three objectives: (1) to document the availability of computers for instructional purposes, (2) to determine the extent to which teachers were using computers for instructional purposes, and (3) to determine the effect of 49 identified factors on classroom computer utilization. Data for this study was collected through three procedures: (1) completion of a nationally published technology survey, (2) observations of classrooms where computers were being used for instructional purposes, and (3) interviews with classroom teachers. The population for the technology survey consisted of 880 lead teachers from the 16 member school systems of Northwest Georgia RESA. Participants for the classroom observations and teacher interviews were selected by the principals of 8 randomly selected school systems. Appropriate data from the surveys, classroom observations, and teacher interviews were categorized and tabulated in terms of both frequency counts and percentages. In addition, specific data from the surveys and teacher interviews were analyzed using the chi-square statistical technique to determine whether there were any differences between survey participants and interview participants on the questions used. The major recommendations, based on the discussion and implications of the findings from this investigation, are: (1) the 16 school systems that were involved in this investigation need to develop comprehensive plans for using computer technology throughout the K-12 curriculum; (2) this investigation, which was limited to lead teachers, should be repeated and expanded to include all teachers from the 16 Northwest Georgia school systems; (3) before the investigation is repeated, certain modifications need to be made to the survey instrument for purposes of clarity as well as for collecting other important information; and (4) the 16 school systems that were involved in this investigation need to develop comprehensive staff development plans to train teachers to use computers for instructional as well as management purposes.
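The chi-square comparison the study describes can be sketched with a standard contingency-table test of independence; the counts below are invented purely for illustration:

```python
# Sketch of a chi-square test of independence between survey and
# interview participants on one question; all counts are hypothetical.
from scipy.stats import chi2_contingency

#                  yes   no
observed = [[412, 180],   # survey participants (hypothetical)
            [ 31,  22]]   # interview participants (hypothetical)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
if p < 0.05:
    print("Responses differ significantly between the two groups")
```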
216

Enterprise Management Software Approaches for Economical Selective Outsourcing by Managed Service Providers

HerrNeckar, Adam D. 01 January 2008 (has links)
Organizations of all types and sizes are increasingly dependent upon reliable network interconnectivity for facilitating information exchange in a global economy. Given the expanding mix of technologies and the complexity of networked environments, companies frequently struggle to contain rising operational costs, optimize resource effectiveness, and retain focus on strategic objectives. Selectively outsourcing network services and computer infrastructure provides organizations with an alternative strategy to increase efficiency, improve financial performance, and become more competitive. A symbiotic arrangement is subsequently formed when the outsourcing vendor, also termed the managed service provider (MSP), is able to economically satisfy the unique needs of each customer. A fundamental problem for the MSP involves effectively and efficiently balancing individual customer needs and requirements within the existing set of base services to remain economically viable. More specifically, the existing set of base services and processes must be capable of delivering exponential value to enable management of evolving technologies. The MSP forms a critical advantage by supporting an operational environment composed of software applications that are relevant, accurate, economical, timely, and reliable, facilitating proven processes and meeting customer service level agreements (SLAs). The goal of this research was to propose an economically bounded software engineering framework and a proof-of-concept monitoring application suite for effective, efficient, and extensible network management automation. A mixed-methods research approach was based on a survey of secondary marketing data representing 42 commercial and open source management solutions. The findings were synthesized with the literature to describe management capabilities supporting converged networks. Additionally, software prototypes were developed to demonstrate strategic elements of the enterprise management system. Finally, quasi-experiments involving eight management queries indicated significant differences between communication protocols when responding to a similar request. The results indicated that no single protocol was efficient and effective in satisfying the needs of enterprise management under all conditions. The author recommended careful selection and assessment of each management query to optimize the MSP's ability to scale economically while meeting customer expectations. Further, the MSP should be capable of providing adaptable monitoring mechanisms and should encourage the use of exception-based and publish/subscribe data acquisition.
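The exception-based acquisition the author recommends can be sketched simply: rather than polling every metric on a fixed cycle, the monitor publishes an event only when a reading breaches its threshold. A minimal sketch, where device names, metrics, and thresholds are all illustrative rather than taken from the dissertation:

```python
# Hedged sketch of exception-based, publish/subscribe-style acquisition;
# all names and thresholds below are illustrative.
import queue

events = queue.Queue()  # stands in for a publish/subscribe message bus

THRESHOLDS = {"cpu_util": 0.90, "if_error_rate": 0.01}  # hypothetical

def check_and_publish(device, metric, value):
    """Publish an event only when a metric breaches its threshold,
    cutting management traffic versus periodic full polling."""
    limit = THRESHOLDS.get(metric)
    if limit is not None and value > limit:
        events.put({"device": device, "metric": metric, "value": value})

check_and_publish("router-01", "cpu_util", 0.95)  # breach: published
check_and_publish("router-01", "cpu_util", 0.42)  # normal: suppressed
while not events.empty():
    print("ALERT:", events.get())
```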
217

Development of Computer Skills in Physical Therapist Students

Hill, Cheryl J. 01 January 2001 (has links)
The purposes of this study were to determine the current status of computer use in physical therapist education programs, what is being done to develop computer skills in physical therapist students, the factors impacting computer use in physical therapist education, and whether there is a disparity between the type of computing technology used in the profession and that used in the education of physical therapists. The directors of all physical therapist education programs accredited by the Commission on Accreditation in Physical Therapy Education (CAPTE) in the United States were surveyed in nine different topic areas related to computer use in physical therapist education. Descriptive statistics were used to analyze the results. Ninety-eight of 162 surveys were returned, for a 60% response rate. Respondents rated their faculty highest in word processing, Internet, e-mail, and presentation program skills, and lowest in authoring systems/courseware, creating home pages, and troubleshooting computer problems. Almost all respondents (96.9%) use computer-aided instruction in the delivery of their curricula, and 95% require students to submit assignments using various computer skills. Hardware, software, technical support, and computer training are available to faculty and, to a lesser extent, to students. Most respondents believed it was not their responsibility to develop relevant computer skills in their students, and the vast majority did not favor having CAPTE establish minimum computer skills for graduates of physical therapist education programs. There does not appear to be a disparity between the type of computing technology used in the profession and that used in the education of physical therapists in most areas. What is lacking is a commitment to developing database skills, encouraging innovation in using the Internet for practice, creating educational courseware, and using or creating virtual reality for physical therapists. It is recommended that faculty reconsider where the responsibility resides for developing relevant computer skills in physical therapist students. It is also recommended that research be done on the use of computer skills in the various practice environments. In addition, much more needs to be done in the research and development of computer applications specific to physical therapist education and practice.
218

Development and Production of the Electronic "Virtual" Pharmaceutical Dossier

Hilscher, Arthur E. 01 January 1998 (has links)
This dissertation proposes the implementation of a new electronic document management system (EDMS) to assist a United States pharmaceutical company in the filing of an electronic or "virtual" drug submission to the Food and Drug Administration (FDA). The goal of this EDMS was to reengineer the process of producing pharmaceutical submissions by creating a digital document repository providing a single point of reference and control for all document versions, thereby decreasing the cycle time of business-critical documents circulating among departments and between the company and the FDA. Another objective was to clarify the document standards that form the basis for the "compound" document management architecture of this emerging technology (ET) by incorporating object-oriented technology. The project management approach was used to define requirements, maintain funding continuity, and evaluate progress. A project manager served as the integrative force throughout the project and selected a project team composed of people drawn from the critical document production path. The project team selected an EDMS that met all the technical requirements and the price target and that offered an intuitive, user-friendly graphical user interface (GUI). A pilot project using a discrete set of documents tested this ET. The success of the project was judged by testing the usability and effectiveness of the EDMS GUI with 16 participants and by the shortening of the cycle time of business-critical documents within the company. Critical success factors included the availability of immediate benefits, such as direct access to different file formats from a common digital document repository offering version control and an audit trail of documents via workflow, the ability to search the company's knowledge base, improved internal communication, and the reusability of information by all process members. Test results and post-implementation feedback from users demonstrated that the EDMS pilot project was the path to the electronic submission. A quicker time to market for pharmaceutical products plus a greater market share for the company are the anticipated longer-term business deliverables of this ET. The major innovation is that knowledge workers will now have the means to create an electronic submission by processing documents electronically instead of manipulating physical documents.
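The repository behavior the abstract describes, a single point of reference with version control and an audit trail, can be sketched in a few lines; all class, method, and field names below are hypothetical, for illustration only:

```python
# Minimal sketch of a versioned document repository with an audit trail;
# class and field names are hypothetical, for illustration only.
from datetime import datetime, timezone

class DocumentRepository:
    def __init__(self):
        self.versions = {}    # doc_id -> ordered list of version records
        self.audit_log = []   # chronological trail of every action

    def check_in(self, doc_id, content, author):
        history = self.versions.setdefault(doc_id, [])
        record = {"version": len(history) + 1, "content": content,
                  "author": author, "at": datetime.now(timezone.utc)}
        history.append(record)
        self.audit_log.append(("check_in", doc_id, record["version"], author))
        return record["version"]

    def latest(self, doc_id):
        self.audit_log.append(("read", doc_id))
        return self.versions[doc_id][-1]

repo = DocumentRepository()
repo.check_in("NDA-001-clinical", "draft 1", "author_a")
repo.check_in("NDA-001-clinical", "draft 2", "author_b")
print(repo.latest("NDA-001-clinical")["version"])  # 2: one point of reference
```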
219

A Plan for Implementation of a Computer-Assisted Academic Advising System at Centenary College of Louisiana

Hitchcock, Miles E. 01 January 1992 (has links)
Centenary College, a small private liberal arts college, used a completely manual method of academic advising. The success of the system rested solely on the dedication of the individual faculty advisors. Errors in degree plans and academic advising were frequent and sometimes resulted in an extra semester or extra courses for the student. Preliminary work, in the form of interviews with administrative department heads and a survey of academic advisors, indicated great support for a computer-assisted academic advising system. The academic dean approved the results of a feasibility study and authorized development of a systems analysis for a computer-assisted academic advising system. A project steering committee was appointed to serve as the primary evaluative body for the project, and a detailed systems analysis was completed. The analysis included a description of the manual method, a list of current system deficiencies, a list and discussion of new system requirements, a list of restrictions that would limit the implementation of the new system, and a description, in both narrative and graphic form, of the process and data flow of the new system. A prototype of the degree audit process was developed. The prototype used official transcript information that was downloaded from the mainframe computer database, along with manual entries to reflect exceptions and transfer courses. The prototype used the college, core, and major requirements from the 1990-1991 Centenary College Catalogue. To validate the prototype, a degree plan audit was performed on the records of twenty students who graduated in May 1991. The results of the prototype were compared to the manual degree plans and to the registrar's evaluation of those manual degree plans. The systems analysis and the plan were approved by the entire committee and the academic dean. The dean authorized computing center personnel to begin development and implementation of the system.
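The core of the degree-audit prototype can be sketched as a set comparison between catalogue requirements and a transcript augmented with approved exceptions; the course codes and requirement areas below are invented for illustration:

```python
# Sketch of a degree audit: catalogue requirements vs. completed courses.
# Course codes and requirement areas are hypothetical.

CATALOGUE_1990_91 = {
    "core":  {"ENGL101", "MATH121", "HIST110"},
    "major": {"CSCI210", "CSCI320", "CSCI410"},
}

def audit(transcript, exceptions=frozenset()):
    """Return unmet requirements per area, treating approved exceptions
    (transfer credit, substitutions) as completed courses."""
    completed = set(transcript) | set(exceptions)
    return {area: required - completed
            for area, required in CATALOGUE_1990_91.items()
            if required - completed}

missing = audit(transcript={"ENGL101", "MATH121", "CSCI210", "CSCI320"},
                exceptions={"HIST110"})   # e.g., a transfer-course exception
print(missing or "All requirements satisfied")  # {'major': {'CSCI410'}}
```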
220

Total Quality Management (TQM): Implementation in a Not-For-Profit Research and Development Organization

Holt, Amos E. 01 January 1994 (has links)
No description available.
