41

The Use of Computers to Enhance the Administrative Function in Tennessee Schools

Cole, Jerry W. 01 May 1992 (has links)
The purpose of this study was to compare the levels of computer use by school principals in administering their schools. Comparisons were made of the different techniques employed by school principals as they manage the vast amounts of data present in today's educational process. A comprehensive collection of computer applications was identified, and school principals were surveyed regarding their use of these applications. A random sample was selected from a population of 1,800 school principals in the state of Tennessee. School principals from 430 public schools and 70 private/parochial schools in Tennessee were surveyed about their practices regarding the use of computers in the management of their schools. Surveys were mailed in early January 1992 to the principals identified in the sample and were returned over a period of several weeks, for a return rate of 71%. Findings include the determination that schools have computers specifically for the purpose of completing administrative tasks, and that principals and office staffs are using these computers to improve their management of school data. The primary tasks principals reported performing by computer were attendance, management of student data, word processing, grade reporting, and transportation. Principals indicated that the major avenues for computer training are seminars and workshops. The major conclusions included the need for additional computer training in principal preparation curricula and for exposure to innovative uses of computers to enhance the administrative function.
42

Using web services for customised data entry

Deng, Yanbo January 2007 (has links)
Scientific databases often need to be accessed from a variety of different applications. There are usually many ways to retrieve and analyse data already in a database. However, it can be more difficult to enter data that was originally stored in different sources and formats (e.g. spreadsheets, other databases, statistical packages). This project investigates a generic, platform-independent way to simplify the loading of databases. The proposed solution uses Web services as middleware to supply essential data management functionality: inserting, updating, deleting and retrieving data. These functions allow application developers to easily customise their own data entry applications according to local data sources, formats and user requirements. We implemented a Web service to support loading data into the Germinate database at the New Zealand Institute of Crop & Food Research (CFR). We also provided language-specific client toolkits to help developers invoke the Web service. The toolkits allow applications to be easily customised for different platforms. In addition, we developed sample applications to help end users load data from their project data sources via the Web service. The Web service approach was evaluated through user and developer trials. Feedback from the developer trial showed that using Web services as middleware is a useful approach that lets developers and competent end users customise data entry with minimal effort. More importantly, the customised client applications enabled end users to load data directly from their project spreadsheets and databases, significantly reducing the effort required to export or transform the source data.
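The abstract does not reproduce the service's interface, so the following is only a sketch of how a language-specific client toolkit might wrap such a middleware layer. The URL, resource names and payload shapes are assumptions, and the sketch uses a plain HTTP/JSON style where the original service may well have been SOAP-based.

```python
import requests

# Hypothetical endpoint and schema: the thesis does not publish the real
# service URL, operation names, or payload format.
SERVICE_URL = "http://example.org/germinate-ws"

class DataEntryClient:
    """Client-toolkit sketch wrapping the four data-management operations
    the abstract names: inserting, updating, deleting and retrieving data."""

    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    def insert(self, table: str, record: dict) -> dict:
        # POST one record, e.g. a row parsed from a local spreadsheet.
        resp = requests.post(f"{self.base_url}/{table}", json=record)
        resp.raise_for_status()
        return resp.json()

    def update(self, table: str, record_id: str, record: dict) -> dict:
        resp = requests.put(f"{self.base_url}/{table}/{record_id}", json=record)
        resp.raise_for_status()
        return resp.json()

    def delete(self, table: str, record_id: str) -> None:
        requests.delete(f"{self.base_url}/{table}/{record_id}").raise_for_status()

    def retrieve(self, table: str, record_id: str) -> dict:
        resp = requests.get(f"{self.base_url}/{table}/{record_id}")
        resp.raise_for_status()
        return resp.json()

# A customised data-entry application maps local columns onto the service
# schema and pushes rows through the client:
client = DataEntryClient(SERVICE_URL)
client.insert("accessions", {"name": "sample-001", "source": "field-trial-07"})
```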
43

Examining Methods and Practices of Source Data Verification in Canadian Critical Care Randomized Controlled Trials

Ward, Roxanne E. 21 March 2013 (has links)
Statement of the Problem: Source data verification (SDV) is the process of comparing data collected at the source to data recorded on a Case Report Form, either paper or electronic (1), to ensure that the data are complete, accurate and verifiable. Good Clinical Practice (GCP) Guidelines are vague and lack evidence as to the degree of SDV required and whether or not SDV affects study outcomes. Methods of Investigation: We performed systematic reviews to establish the published evidence base for methods of SDV and to examine the effect of SDV on study outcomes. We then conducted a national survey of Canadian Critical Care investigators and research coordinators regarding their attitudes and beliefs about SDV. We followed with an audit of the completed and in-progress Randomized Controlled Trials (RCTs) of the Canadian Critical Care Trials Group (CCCTG). Results: Systematic Review of Methods of SDV: The most commonly reported or recommended approach (10/14, 71%) was either to base the frequency of SDV on level of risk or to conduct it early (i.e., after the first patient is enrolled). The amount of SDV recommended or reported varied from 5% to 100%. Systematic Review of Impact of SDV on Study Outcomes: There was no difference in study outcomes for one trial, and the effect could not be assessed in the other. National Survey of Critical Care Investigators and Research Coordinators: 95.8% (115/120) of respondents believed that SDV was an important part of Quality Assurance; 73.3% (88/120) felt that academic studies should do more SDV; and 62.5% (75/120) felt that there is insufficient funding available for SDV. Audit of Source Data Verification Practices in CCCTG RCTs: In the national audit of in-progress and completed CCCTG RCTs, 9/15 (60%) included a plan for SDV and 8/15 (53%) actually conducted it. Of the 9 completed published trials, 44% (4/9) conducted SDV. Conclusion: There is little evidence for the methods and effect of SDV on study outcomes. Based on the results of the systematic review, survey, and audit, more research is needed to build the evidence base for the methods and effect of SDV on study outcomes.
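As a hedged illustration of the comparison the opening definition describes — not the CCCTG's actual procedure; the record layout and field names are invented — a field-by-field SDV pass over a monitored sample might look like this:

```python
def verify_source_data(crf_records, source_records, fields):
    """Compare Case Report Form entries against source documents field by
    field and collect discrepancies for monitor review."""
    discrepancies = []
    for patient_id, crf in crf_records.items():
        source = source_records.get(patient_id)
        if source is None:
            discrepancies.append((patient_id, None, "missing source document"))
            continue
        for field in fields:
            if crf.get(field) != source.get(field):
                discrepancies.append(
                    (patient_id, field,
                     f"CRF={crf.get(field)!r} vs source={source.get(field)!r}"))
    return discrepancies

# Risk-based monitoring verifies a sample rather than 100% of records:
print(verify_source_data({"P001": {"sbp": 120}},
                         {"P001": {"sbp": 124}},
                         ["sbp"]))
```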
44

Data Consistency Checks on Flight Test Data

Mueller, G. October 2014 (has links)
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA / This paper presents the principal results of a study performed internally by Airbus's flight test centers. The purpose of the study was to share the body of knowledge concerning data consistency checks among all Airbus business units. An analysis of the test process is followed by the identification of the process stakeholders involved in ensuring data consistency. The main part of the paper lists several different possibilities for improving data consistency; it is left to the discretion of the reader to determine the appropriateness of these methods.
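The abstract stops short of listing the methods themselves; the sketch below shows three generic checks of the kind commonly applied to recorded flight-test parameters (range, frozen-sensor and cross-parameter checks). All function names and thresholds are illustrative assumptions, not the study's actual methods.

```python
def range_check(samples, lo, hi):
    """Flag sample indices outside a parameter's physically plausible range."""
    return [i for i, v in enumerate(samples) if not lo <= v <= hi]

def frozen_check(samples, min_run=50):
    """Flag runs where a sensor repeats one value, a common failure mode."""
    flagged, start = [], 0
    for i in range(1, len(samples) + 1):
        if i == len(samples) or samples[i] != samples[start]:
            if i - start >= min_run:
                flagged.append((start, i - 1))
            start = i
    return flagged

def cross_check(airspeed, ground_speed, max_wind=120.0):
    """Cross-parameter check: airspeed and ground speed should not diverge
    by more than a plausible wind component (knots, illustrative)."""
    return [i for i, (a, g) in enumerate(zip(airspeed, ground_speed))
            if abs(a - g) > max_wind]

# e.g. a pressure-altitude channel should stay within sensor limits:
print(range_check([1200.0, 1210.0, -9999.0, 1225.0], -1000.0, 60000.0))
```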
45

Data Management at the University of Arizona: Working Across Campus to Develop Support and Services

Kollen, Chris 24 April 2012 (has links)
Poster presentation from the Living the Future 8 Conference, April 23-24, 2012, University of Arizona Libraries, Tucson, AZ. / In January 2011, the National Science Foundation instituted a requirement that all grant proposals include a data management plan. In response, many academic libraries began to focus on developing library services that support storing and curating data in order to increase research productivity. The University of Arizona, with the Libraries taking a lead, wanted to look at how the campus could support researchers as they developed data management plans. With the goal of making substantial advances in this area, the Dean of Libraries designated 1 FTE librarian for data management, and the Dean and Vice-President for Research (VPR) established the Campus Data Management and Curation Advisory Committee with members from the Libraries, the VPR's office, and the faculty. The poster covers this work on data management: the Campus Committee's charge and recommendations (including which units need to collaborate), progress made, next steps, and useful tools and initiatives to keep an eye on.
47

The Ebola Virus Disease Outbreak in Guinea, Liberia and Sierra Leone - Data Management Implementation and Outcomes for Movement and Monitoring of Travelers at Points of Entry

Washburn, Faith M 09 January 2015 (has links)
Data management in resource-limited settings can be a daunting problem if not approached with a thorough understanding of those limitations and a mindset prepared for rapid changes in the environment. It becomes even more challenging at points of entry, where many interwoven parts work together to move a potential traveler from his/her first steps into the airport area to boarding a plane, all while ensuring that the traveler has been thoroughly screened for any signs or symptoms of a possible Ebola virus disease infection. This capstone describes the history of the International Health Regulations' effects on the control of disease spread and importation at points of entry, the Do Not Board/Lookout List's role in disease control in the United States, and the CDC International Assistance Team's unique task of creating and implementing country-specific databases to meet the needs of Ebola-affected countries. The most critical data management need at these countries' points of entry is to prevent the exportation of Ebola virus disease, in order to keep each country's airspace open and allow goods, personnel and services to continue to be imported during the sustained Ebola outbreak.
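The capstone does not publish the databases' schemas; as a loose illustration of the kind of record an exit-screening database might hold, every field and threshold below is an assumption rather than the CDC team's actual design.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TravelerScreening:
    """Illustrative exit-screening record for a point of entry; not the
    CDC International Assistance Team's actual schema."""
    traveler_id: str
    screening_date: date
    temperature_c: float
    symptoms_reported: bool
    contact_with_case: bool
    cleared_to_board: bool = field(init=False)

    def __post_init__(self):
        # Cleared only if afebrile, asymptomatic, and without reported
        # contact; the 38.0 C fever threshold is illustrative.
        self.cleared_to_board = (self.temperature_c < 38.0
                                 and not self.symptoms_reported
                                 and not self.contact_with_case)

record = TravelerScreening("T-0001", date(2014, 12, 1), 36.8, False, False)
print(record.cleared_to_board)  # True
```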
48

Diffusion of regional spatial data infrastructures: with particular reference to Asia and the Pacific

Rajabifard, Abbas January 2002 (has links) (PDF)
The development of a Regional Spatial Data Infrastructure (Regional SDI) is much more challenging than the development of a National SDI initiative within a single nation, mainly because cooperation and participation in a Regional SDI initiative are voluntary at the multi-national level. As a result, despite considerable interest and activity, the development of an effective and comprehensive Asia-Pacific Regional Spatial Data Infrastructure (APSDI) is hampered by a lack of support from member nations, which leaves the initiative only an innovative concept. Based on this situation, the aim of this research is to design an improved conceptual model for Regional SDI and an implementation strategy. It is proposed that the problem can be partly addressed by increasing the level of awareness about the nature and value of SDIs, improving the SDI conceptual model to better meet the needs of nations, and identifying key factors that facilitate development through a better understanding of the complexity of the interaction between social, economic and political issues.
50

A comparative quality of life survey in Elsies River and Basuto QwaQwa

Erlank, D January 1985 (has links)
Bibliography: leaves 199-204. / This thesis is concerned with developing a method for determining the Quality of Life of a group or community in quantitative terms. The method devised is aimed at providing decision-makers with a useful tool when allocating public funds. It involves setting critical values for indicators and then applying a mathematical formula in order to standardise information gathered from several different sources; a value for each indicator of a particular group or community is thus calculated. This procedure made it possible to compare data from the different sources. Arising out of this, the values for individual indicators were aggregated to produce indices evaluating the Quality of Life, in a form that may be readily used by decision-makers. Surveys were run in Elsies River, a coloured suburb of Cape Town, and in Basuto QwaQwa, a homeland in the Orange Free State, using two questionnaires. The results were computed, and the method developed here was used to compare and aggregate the data. Other sources of data included opinions from experts and objective data concerning the two survey areas, which were also standardised and aggregated. The results show that the method is pragmatic and could be useful to decision-makers. The standardisation provided the means for arriving at the indices, which show how different aspects of the Quality of Life may be assessed. The results, however, are not absolute and could change through a process of negotiation: in fact this is an essential qualification.
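The abstract names the ingredients (critical values, a standardising formula, aggregation into indices) without giving the formula itself; the linear scaling and weighted mean below are therefore assumptions sketching the general approach, not the thesis's actual method.

```python
def standardise(value, critical, best):
    """Map a raw indicator onto a 0-1 scale: the critical value scores 0
    and the best (target) value scores 1. The linear form is an assumption."""
    score = (value - critical) / (best - critical)
    return max(0.0, min(1.0, score))

def quality_of_life_index(scored_indicators):
    """Aggregate standardised indicator scores into one index with a
    weighted mean; weights are illustrative and open to negotiation."""
    total = sum(weight for _, weight in scored_indicators)
    return sum(score * weight for score, weight in scored_indicators) / total

# e.g. two invented indicators for one community (lower raw values better):
index = quality_of_life_index([
    (standardise(6.5, critical=10.0, best=2.0), 0.6),  # persons per room
    (standardise(0.8, critical=5.0, best=0.5), 0.4),   # km to nearest clinic
])
print(round(index, 3))
```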
