About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
341

Minimizing Parallel Virtual Machine [PVM] Tasks Execution Times Through Optimal Host Assignments

Mtshali, Progress Q. T. 01 January 2000 (has links)
PVM is a message-passing system designed to run parallel programs across a network of workstations. Its default scheduler uses the round-robin scheduling algorithm to assign tasks to hosts on the virtual machine. This task assignment strategy is deficient on a network of heterogeneous workstations; heterogeneity can be software related, hardware related, or both. The default scheduler does not attempt to assign a PVM task to the node within the virtual machine that best services that task. The problem with this allocation scheme is that the resource requirements of the tasks to be assigned vary. The default scheduler could assign a computation-intensive parallel task to a computer with a slow central processing unit (CPU), thus taking longer to execute the task to completion (Ju and Wang, 1996). It is also possible that a communication-bound process is assigned to a computer with very low bandwidth. The scheduler needs to assign each task to a computer that will fulfill that task's resource requirements: schedule and balance the workload in such a way that the overall execution time is minimized. In this dissertation, a stochastic task assignment method was developed that takes into account not only the heterogeneity of the hosts but also state information for the tasks to be assigned and the hosts onto which they are assigned. Task assignment methods that exist today, such as those used in Condor, PvmJobs, CROP, and Matchmaker, tend to be designed for a homogeneous environment running the UNIX operating system or its variants. They also tend to use operating system data structures that are not common across most architectures, which makes them difficult to port.
The task assignment method developed in this dissertation was initially implemented across a PVM environment consisting of a network of Win32®, Linux, and Sun Solaris machines. A portable prototype implementation that runs across these architectures was developed. A simulator based on the CSIMIS toolkit was developed and utilized to study the behavior of the M/G/1 queueing model on which the task assignment is based. Four PVM programs were run on a PVM environment consisting of six nodes. The run-times of these programs were measured for both the default scheduler and the method proposed in this dissertation. The analysis of the run-times suggests that the proposed task assignment method works better for PVM programs that are CPU intensive than for those that are small and do not require much CPU time. This behavior can be attributed to the high overhead required to compute the task assignment probabilities for all the hosts in the PVM environment. However, the proposed method appears to minimize execution times for CPU-intensive tasks. The ease with which external resource managers can be incorporated into the PVM environment at run-time makes the proposed method an alternative to the default scheduler. More work needs to be done before the results obtained in this dissertation could be generalized, if at all possible, across the board. A number of recommendations on the reduction of overhead are specified at the end of this report.
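The abstract contrasts PVM's default round-robin policy with a stochastic, state-aware assignment but does not give the assignment rule itself. The Python sketch below is a minimal illustration under assumed host names and speed weights (all invented), not the dissertation's actual method: it draws each host with probability proportional to its relative speed, standing in for the state information the proposed method uses.

```python
import random

def round_robin(tasks, hosts):
    """PVM's default policy: assign tasks to hosts in cyclic order,
    ignoring host speed or load."""
    return {t: hosts[i % len(hosts)] for i, t in enumerate(tasks)}

def stochastic_assign(tasks, host_speeds, rng=random.Random(0)):
    """Illustrative weighted policy: pick each host with probability
    proportional to its relative speed (a stand-in for the host state
    information the dissertation's method incorporates)."""
    hosts = list(host_speeds)
    total = sum(host_speeds.values())
    weights = [host_speeds[h] / total for h in hosts]
    return {t: rng.choices(hosts, weights=weights)[0] for t in tasks}

# Hypothetical six-task, three-host virtual machine.
tasks = [f"task{i}" for i in range(6)]
hosts = ["slow", "medium", "fast"]
speeds = {"slow": 1.0, "medium": 2.0, "fast": 4.0}

rr = round_robin(tasks, hosts)        # each host gets two tasks regardless of speed
st = stochastic_assign(tasks, speeds) # faster hosts are drawn more often
```

A real implementation would replace the static speed table with run-time state gathered from each host (CPU load, bandwidth), which is where the per-host probability computation overhead discussed in the abstract arises.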
342

An Analysis of the Implementation of a Workflow System for Health Information Management

Murray, Mary Gregory Coffin 01 January 1999 (has links)
Workflow occurs in all business settings (Bajaj, 1997). Computerized workflow systems provide automated support for business processes. Workflow is categorized as a group support system, which supports a group of people trying to solve problems with the use of communications, computing, and decision support technologies (Aiken, Vanjami & Krosp, 1995). Workflow refers to software integration and development to automate business processes. Information is routed among users and applications in a formal manner to meet established business requirements. Workflow involves a combination of human- and machine-based activities that interact with information systems and information technology applications and tools (Hollingsworth, 1994). Workflow is praised as the technology at the forefront of collaborative computing efforts (Bothrick, 1997). However, the challenges to the successful implementation of workflow systems are greater than expected. Several reasons are cited for this, including a miscalculation of the complexities of human interaction required to implement collaborative technologies (Khoshafian & Buckiewicz, 1995), a misunderstanding of how to use the technology (Bothrick, 1997), and a misunderstanding of system capabilities (Leibert, 1997). This study investigated the implementation of a workflow system in health information management, where workflow systems are used to automate the flow of medical records and other processes requiring access to the patient information contained in those records (Mahoney, 1997). Workflow is an emerging technology (Silver, 1995). The purpose of this research is to increase the knowledge base in relation to the successful implementation of workflow technology. The research methodology employed a multiple case study in which workflow implementation was assessed in real-world settings.
Six theoretical propositions defined the scope of the study and provided the framework for data collection and data presentation. These propositions address the multiple phenomena identified as having a major impact on the implementation of workflow technologies. The goal of the research is to provide information to healthcare organizations to assist them in the successful implementation of the powerful new paradigm of workflow technology.
343

A Retention Issue: Predicting the At-Risk Student in Community College Web-Based Classes

Muse, Herbert E., Jr. 01 January 2003 (has links)
This report describes a quasi-replication of an earlier study. The problem of this study was the need to develop an assessment tool that would assess and predict whether Web-based learners at the community college were at risk of failure in this mode of learning. In this study the instrument was used to identify factors that could then be used to discriminate between 276 successful and non-successful Web-based learners at the community college level. Twenty-eight ordinal-level questions, as used in the previous study, and eight more items related to computer and Web-based skills produced seven factors using factor analysis. The seven factors and seven background variables from the original study were used as input for additional quantitative analyses. Discriminant function analysis produced a significant discriminant function and five variables that contributed significantly to that function: Grade Point Average (GPA), Study Environment, Age, Last College Class, and Background Preparation. The function was used to classify (predict) student membership into successful and non-successful groups and classified two-thirds of the cases correctly. This study also presented results from a qualitative investigation into dropout of the Web-based learner. Twenty-two randomly selected students who dropped their Web-based course were interviewed, and each was questioned about their reasons for dropping the course. The reason given most often for dropout was that the student could not obtain, access, or install all the required learning materials in a timely manner, and that he/she dropped the course while a chance to do so was still available. According to the findings of this investigation, students who had a history of academic achievement, were older, had a positive learning space, and believed they were prepared for this learning environment were more likely to be successful than others who had lesser amounts of these qualities.
The study also showed that students who could react quickly to logistical demands early in the course were more likely to persist. Recommendations for further research included using more questions to characterize the domains studied in this inquiry, using another statistic to compute the likelihood of success and failure of the Web-based student, and testing students in other populations.
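The abstract names the five significant predictors but not the fitted discriminant weights. The Python sketch below, with purely illustrative coefficients and cutoff (none of these numbers come from the study), shows the form of the two-group linear classification the discriminant function performs:

```python
# Hypothetical coefficients: the study reports that GPA, study environment,
# age, last college class, and background preparation contributed
# significantly, but the fitted weights are not in the abstract.
COEFFS = {"gpa": 0.9, "study_env": 0.5, "age": 0.3,
          "last_class": 0.2, "background_prep": 0.4}
CUTOFF = 2.5  # illustrative classification threshold

def predict_success(student):
    """Classify a student as 'successful' when the linear discriminant
    score exceeds the cutoff, mirroring the two-group classification
    described in the abstract."""
    score = sum(COEFFS[k] * student[k] for k in COEFFS)
    return "successful" if score > CUTOFF else "non-successful"

# Two invented student profiles on arbitrary predictor scales.
at_risk = {"gpa": 1.5, "study_env": 0.5, "age": 0.5,
           "last_class": 0.5, "background_prep": 1.0}
strong = {"gpa": 3.6, "study_env": 3.0, "age": 2.0,
          "last_class": 2.0, "background_prep": 3.0}
```

In the study, such a function classified about two-thirds of the 276 cases correctly; a fitted model would estimate the coefficients and cutoff from the data rather than assume them.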
344

An Instructional Technology Guide for Technical Trainers

Mylott, David T. 01 January 2008 (has links)
The goal of this study was to develop a guide for technical training instructors that combines instructional theory, adult learning principles, distance online learning methodologies, and multimedia principles. There is currently no definitive source of information to guide instructors through the transition from traditional classroom-based instruction to blended learning programs. These programs utilize multiple instructional methods, including traditional classroom training, live distance learning, and various asynchronous technology-based methods. The guide was developed based on a research and development methodology and was validated through a series of steps involving review, feedback, and revision. The process ensured the end product was validated and relevant to the end users. A needs analysis was conducted to determine the objectives and content of the guide. A thorough review of the literature was conducted in the process of conducting this analysis. A preliminary guide was developed based on the identified objectives. The guide was reviewed by expert jurors to determine its relevance, validity, and usefulness. The feedback was synthesized and revisions made. The revised guide was field-tested by potential users of the guide: technical trainers within the semiconductor industry. Their feedback further validated the guide, its usefulness, and its applicability. Final revisions were made and included in the dissertation report along with the analysis of the process and results. The product is a resource for instructors who are encountering the inevitable entanglement of technology in technical training. As a tool, it enables instructors to make better use of available technologies to enhance the learning process. Additionally, it adds to the overall body of knowledge for the corporate training environment.
345

Improving Information Systems Security Through Management Practices: A Non-technical Approach

Nard, Karen D. 01 January 2004 (has links)
Most organizations have acknowledged the importance of information systems security, yet in this environment of heightened awareness many organizations focus on technology and overlook the non-technical security resources available to them. This project focused on the non-technical side of security and the management practices that can be used to establish an important layer in a comprehensive security solution. A security planning matrix was developed by drawing from the theoretical and practical body of knowledge in the information systems security field. The matrix was designed to support generally accepted security principles, standards, and legislation so that information systems management can use the product to protect information systems using non-technical controls and techniques such as people, policies, practices, training, awareness, and the organizational structure and culture. A hybrid waterfall/spiral process model, the Microsoft Solutions Framework (MSF), was used to develop the security planning matrix. Specific procedures emulated those used by the National Institute of Standards and Technology (NIST), based on their experience and expertise in developing security guidelines and other security tools. A prototype of the product was developed early in the process based on requirements abstracted from security standards, legislation, and industry best practices. The prototype was then reviewed by an expert panel to refine both product requirements and design. One round of feedback and two versions of the prototype were required before the panel approved the prototype for use in the pilot study. The pilot was performed in a real-world setting at Republic Mortgage Insurance Corporation (RMIC), where user acceptance testing, success criteria evaluation, and security performance improvement testing were all performed to evaluate and stabilize the product.
The research improved professional practice and added to the body of information systems security knowledge by identifying and demonstrating methods for defining requirements of, developing, and evaluating a product such as the security planning matrix. Results of the research also showed that the product's features and functions were acceptable to both subject matter experts and real-world users and that implementation and use of the security planning matrix could improve the level of security preparedness as evidenced by pilot study results at RMIC.
346

A Systems Analysis of Information Technology and the Use of WLANs Implemented by an FBI Field Office for Crisis Response Incidents: The Columbia Field Office Case Study

Neubauer, Michael J. 01 January 2007 (has links)
A comprehensive crisis management system must be ready to operate on a moment's notice and in the midst of an onslaught of information. Within the crisis incident, the management of information may be characterized by magnitude and urgency as well as by preciseness and uncertain reliability. Described as rapidly changing, high risk, and non-routine, crisis incidents challenge the mandates of the Federal Bureau of Investigation (FBI) to protect and to share information. Complicating the development of a crisis system are security constraints imposed on the information technology (IT) infrastructure by crisis participants such as the FBI. Constraints circumvent collaborative initiatives fundamental to crisis response efforts. For example, mandates to secure information may inhibit access to critical, real-time, decision-making data and thereby result in additional danger and damage due to limitations placed on analytical processes affecting recovery efforts. The author addressed the aforementioned issues confronting an FBI field office (FO) during a crisis response effort. A single-case study analyzing the implementation of IT by the Columbia FO during a crisis incident was conducted. Results of this study indicated that IT used by the Columbia FO during a crisis incident had a positive impact on FBI crisis response operations. The results further indicated that strict IT security controls placed upon FBI systems affect how technologies such as wireless local area networks (WLANs) are implemented to support the investigative processes during a crisis incident. The outcomes from this research showed an increase in information exchange capability through the use of WLANs. Since the terrorist attacks on the World Trade Center, the Pentagon, and a rural field in Pennsylvania on September 11, 2001, the FBI has been undergoing a transformation from an investigative agency to an investigative and intelligence agency.
These changes include the use of IT for meeting the renewed emphasis on information exchange. The major contribution of this study is that it provides a comprehensive analysis of IT used by an FBI FO and the potential of WLAN technology in support of crisis responders. This study also provides recommendations on how a WLAN pilot project could be initiated within an FO crisis response.
347

The Effect of Vocabulary Integration in the Curriculum on Learning at the Junior High School Level

Neumeier, Burton F. 01 January 1991 (has links)
Previous research has shown that student achievement can be improved by vocabulary-oriented instruction in the content area. This research expands on previous studies because it uses integrated vocabulary instruction across four main subjects. The overall purpose of this study was to provide evidence that learning and achievement of regular seventh-grade students can be improved with vocabulary instruction activities that are integrated across curriculum boundaries. The study was conducted in two phases: (a) development of a list of vocabulary words for seventh grade and (b) classroom implementation using vocabulary-integrated instructional activities in math, English, social studies, and science. The instructional activities were employed for 5 to 10 minutes daily over a period of 7 weeks in the four main subject areas with the experimental class. The control class received similar instruction without the vocabulary-integrated instructional activities. Two standardized examinations were administered, one at the beginning of the implementation and one at the end. The results of the investigation indicate that achievement could be significantly higher for the experimental group as compared to the control group. These results showed achievement in the areas of mathematics and reading can be enhanced with integrated vocabulary activities across curriculum boundaries. The results did not show that achievement in the area of reading can be enhanced with integrated vocabulary activities across curriculum boundaries.
348

An Investigation of the Attitudes, Costs, and Benefits of Telework for Information Systems Employees

Noble, Bernard J. 01 January 2007 (has links)
Although telework has been around for more than 30 years, it still does not have a precise definition in the literature, and the number of people doing telework is growing worldwide. Telework has been defined by one author as a work arrangement by which an employee works from his or her home office rather than a central office. Another author expanded this definition by noting that teleworkers work in remote locations, such as their homes or neighborhood satellite offices, one or more days a week. Likewise, teleworking is also viewed as any form of substitution of information technologies involving telecommunications and computers for work-related travel. Regardless of its meaning, the growth of telework in today's high-technology environment has resulted from the need to conserve time and resources significant to America's economy, though most organizations are built around models that bring their employees to a central location for work. However, as stated in the literature, with the advances in information system technology, teleworking should have been eagerly adopted by the Information Systems (IS) industry. Yet IS professionals' enthusiasm toward telework appears to be the opposite. Despite the benefits of flexibility and autonomy for employees, and an increase in human resources productivity for employers, there is a considerable amount of reluctance by many organizations to aggressively pursue widespread implementation of telework programs. Therefore, in this research, the author examined factors such as costs, attitudes, and benefits that relate to telework at the level of the IS worker and the IS organization, utilizing a survey instrument to examine the attitudes of IS workers and IS managers toward telework in IS environments. Since there is no clear or precise definition of telework, for the purpose of this study, telework is defined as work that is performed during normal work hours at a site other than an office setting.
The objectives were achieved by using data collected from a telework survey of various organizations, including those from the government and private sectors of northern Alabama located in the Tennessee Valley.
349

Students' Success With World Wide Web Search Engines: Retrieving Relevant Results With Respect to End-User Relevance Judgments

Nowicki, Stacy A. 01 January 2002 (has links)
Search engines are currently the most popular method of information retrieval on the World Wide Web. However, researchers have not thoroughly examined search engines as they are used and judged by novice end-users. Calumet College of St. Joseph (CCSJ) required an investigation of search engine results to reveal how Web search engines supported the information seeking of CCSJ students. This study determined the effectiveness of information gathering through six popular search engines: Excite, Google, Lycos, MSN, Northern Light, and Yahoo!. It included an investigation of the relevance of search engine results as established by end-user relevance judgments made by novice information seekers, CCSJ students. Students in seven CCSJ English classes participated in this study. A questionnaire gathered demographic data and information about students' computer use, information retrieval experience, and experience with the World Wide Web. Students searched six search engines with search topics and queries of their choice and ranked the first 10 results according to their own relevance judgments (1 was most relevant and 10 was least relevant). The Pearson Product Moment Correlation Coefficient determined what correlation existed between the relevance rankings of the search engines and the rankings of the students. Results showed a low correlation, though a test of significance determined that this correlation was not statistically significant. Therefore, currently popular search engines are not effective in retrieving results for information-seeking CCSJ students, though they may be successful some of the time. No search engine outperformed the others in this experiment, though it is also evident that no search engine consistently performed badly enough to indicate that it was the poorest performer. Furthermore, the frequency with which students used the Web, online databases/indexes, and search engines was highly correlated with search success.
Two issues surfaced during the course of this study: some students' lack of computer skills, and some students' inability to construct appropriate search statements. CCSJ should take action in the areas of computer literacy and information literacy (specifically information retrieval on the World Wide Web) in order to prepare these students for the increased importance of this popular measure of information retrieval in their lives.
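The statistic the study relied on is straightforward to compute. The Python sketch below implements the Pearson product-moment correlation coefficient and applies it to a pair of hypothetical ten-item rankings; the data shown are invented for illustration and are not the study's:

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient: covariance of the
    two series divided by the product of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: a search engine's ranking of ten results (1 = top hit)
# versus one student's relevance ranking of the same ten results.
engine_rank  = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
student_rank = [5, 1, 8, 2, 10, 3, 7, 4, 9, 6]
r = pearson(engine_rank, student_rank)  # ≈ 0.32, a low positive correlation
```

A low r like this, pooled over many student-engine pairs and checked with a significance test, is the pattern the study reports: engine ranking order only weakly matches end-user relevance judgments.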
350

Data Conversion: An Investigation of Management Role in the Change Process

Odoh, Mc-Chester O. 01 January 2003 (has links)
No description available.
