41

Fundamental methodologies and tools for the employment of webometric analyses - a discussion and proposal for improving the foundation of webometrics

Fugl, Liv Danman 06 1900 (has links)
The paper Fundamental methodologies and tools for the employment of webometric analyses defines the most important rules to keep in mind before performing webometric analyses. The paper deals with the two basic elements that constitute the foundation of webometric analyses: the documents being analysed, and the tools that are applied for the data collection. The concepts of a citation theory and a link theory are discussed through a study of the current literature. Different methodologies for uncovering motivations for making references in scientific articles are reviewed and discussed. A methodology for uncovering motivations for making links on webpages is proposed and applied to six researchers' websites at the Royal School of Library and Information Science in Denmark, to all the institutes at the same institution, and to selected institutes at the Technical University of Denmark. The paper further contains a review of the link topology of the Internet and of the current status of the tools available for data collection. Finally, alternative possible tools for applying webometric analyses are proposed. The alternative tools are the ResearchIndex invented by Lawrence and Giles (Lawrence, Bollacker & Giles, 1999b; Giles, Bollacker & Lawrence, 1998), Kleinberg's HITS algorithm employed in the Clever search engine (The Clever Project, n.d.; Kleinberg, 1998), proposals for possible extensions to the HTTP protocol to facilitate the collection and navigation of backlink information in the World Wide Web made by Chakrabarti, Gibson and McCurley (Chakrabarti, Gibson & McCurley, 1999c), and finally Link Agent, a program we have developed for this paper. The program makes it possible to uncover the reciprocal linking webpages that exist in relation to the outgoing links from a chosen webpage. Keywords: Informetrics, Webometrics, Citation theory, Link theory, Motivations for links, Motivations for references, Search engines, Webometric tools
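A minimal Python sketch of the reciprocal-link idea the abstract describes (not the Link Agent program itself, which is not reproduced here): collect a page's outgoing links and test whether each target links back. The helper names and the example URL are illustrative assumptions, and error handling is deliberately thin.

# Illustrative sketch of reciprocal-link detection; assumes plain HTML pages
# reachable over HTTP and compares URLs by exact string match.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def outgoing_links(url):
    """Return the set of absolute URLs linked from `url`."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return {urljoin(url, href) for href in parser.links}


def reciprocal_links(url):
    """Return the subset of pages linked from `url` that link back to it."""
    reciprocal = set()
    for target in outgoing_links(url):
        try:
            if url in outgoing_links(target):
                reciprocal.add(target)
        except (OSError, ValueError):
            continue  # unreachable or non-HTTP targets are simply skipped
    return reciprocal


if __name__ == "__main__":
    # Hypothetical starting page; substitute a real researcher's homepage.
    print(reciprocal_links("https://example.org/researcher.html"))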
42

Evaluation of Algorithm Performance on Identifying OA

Antelman, Kristin, Bakkalbasi, Nisa, Goodman, David, Hajjem, Chawki, Harnad, Stevan 12 1900 (has links)
This is a second signal-detection analysis of the accuracy of a robot in detecting open access (OA) articles (by checking by hand how many of the articles the robot tagged OA were really OA, and vice versa). We found that the robot significantly overcodes for OA. In our Biology sample, 40% of identified OA was in fact OA. In our Sociology sample, only 18% of identified OA was in fact OA. Missed OA was lower: 12% in Biology and 14% in Sociology. The sources of the error are impossible to determine from the present data, since the algorithm did not capture URLs for documents identified as OA. In conclusion, the robot is not yet performing at a desirable level, and future work may be needed to determine the causes and improve the algorithm.
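For readers unfamiliar with this hand-checking arithmetic, the sketch below shows one plausible reading of the reported rates as proportions of hand-checked samples; the counts are invented purely to reproduce the structure of the calculation, not the study's data.

# Hypothetical worked example; only the structure of the calculation
# (not the numbers) reflects the study described above.

def confirmed_oa_rate(tagged_oa_truly_oa: int, tagged_oa_not_oa: int) -> float:
    """Fraction of robot-tagged OA articles that hand-checking confirms as OA."""
    return tagged_oa_truly_oa / (tagged_oa_truly_oa + tagged_oa_not_oa)

def missed_oa_rate(tagged_non_oa_truly_oa: int, tagged_non_oa_not_oa: int) -> float:
    """Fraction of robot-tagged non-OA articles that hand-checking finds to be OA."""
    return tagged_non_oa_truly_oa / (tagged_non_oa_truly_oa + tagged_non_oa_not_oa)

# e.g. 40 of 100 robot-tagged OA articles truly OA -> 0.40 confirmed OA
print(confirmed_oa_rate(40, 60))
# e.g. 12 of 100 robot-tagged non-OA articles truly OA -> 0.12 missed OA
print(missed_oa_rate(12, 88))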
43

Usability evaluation of the South London and Maudsley NHS Trust Library web site

Ebenezer, Catherine 09 1900 (has links)
A usability evaluation of the recently launched South London and Maudsley NHS Trust library web site was carried out using a variety of standard methodologies: content and design evaluation of selected comparable sites, focus groups, a questionnaire survey of library and web development staff, heuristic evaluation, observation testing, card sorting/cluster analysis, and label intuitiveness/category membership testing. All test participants were staff of, or providers of services to, the trust. Demographic information was recorded for each participant. Unsuccessful attempts were made to evaluate user feedback and to compare usability test results with usage statistics. Test participants' overall responses to the site were enthusiastic and favourable, indicating the scope and content of the site to be broadly appropriate to the user group. Numerous suggestions for new content areas were made by testers. Usability problems were discovered in two main areas: in the organisation of the site, and in the terminology used to refer to information services and sources. On the basis of test results, proposals for a revised menu structure, improved accessibility, and changes to the terminology used within the site are presented.
44

Adalhendings and skothendings in the Old Saxon Heliand

Pierce, James Parker, 1937- January 1961 (has links)
No description available.
45

Tracing rays: the past, present and future of ray tracing performance / Jan Cornelis Willem Kroeze

Kroeze, Jan Cornelis Willem January 2010 (has links)
The metrics used to compare the performance of various ray tracers in the literature are flawed because they are non-standard and depend on the hardware configuration of the specific system used to gather data. A different way of measuring the relative performance of ray tracing algorithms is proposed and tested across several hardware platforms using correlation coefficients. / Thesis (M.Sc. (Computer Science))--North-West University, Vaal Triangle Campus, 2011
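A minimal sketch of the kind of cross-platform comparison the abstract describes, using Pearson's correlation coefficient on frame times measured for the same set of ray tracers on two machines. The tracer names and timings are hypothetical; this is not the thesis's own benchmarking harness.

# Hypothetical sketch: correlating relative ray-tracer performance across platforms.
from statistics import correlation  # Pearson's r; requires Python 3.10+

tracers = ["tracer_a", "tracer_b", "tracer_c"]
platform_1 = [35.2, 48.7, 61.3]   # frame times (ms) on machine 1, invented
platform_2 = [18.9, 25.4, 33.8]   # frame times (ms) on machine 2, invented

for name, t1, t2 in zip(tracers, platform_1, platform_2):
    print(f"{name}: {t1} ms vs {t2} ms")

# A coefficient near 1 suggests the tracers rank the same way on both machines,
# i.e. the relative comparison is largely hardware-independent.
print("Pearson correlation:", correlation(platform_1, platform_2))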
47

A Geometrical probability approach to location-critical network performance metrics

Zhuang, Yanyan 23 March 2012 (has links)
The field of wireless communications has been experiencing tremendous growth with the ever-increasing dependence on wireless services. In the operation of a communication network, the network coverage and node placement are of profound importance. The network performance metrics can be modeled as nonlinear functions of inter-node distances. Therefore, a geometric abstraction of the distance between wireless devices becomes a prerequisite for accurate system modeling and analysis. A geometrical probability approach is presented in this dissertation to characterize the probabilistic distance properties, for analyzing the location-critical performance metrics through various spatial distance distributions. Ideally, research in geometrical probability should give results for the distance distributions 1) over elementary geometries such as a straight line, squares and rectangles, and 2) over complex geometries such as rhombuses and hexagons. Both 1) and 2) are representative topological shapes for communication networks. The current probability and statistics literature has explicit results for 1), whereas results for 2) do not exist. In particular, the absence of the distance distributions for rhombuses and hexagons has posed challenges to the analytical modeling of location-critical performance metrics in complex geometries. This dissertation is dedicated to the application of existing results for the elementary geometries in 1) to the networking area, and to the development of a new approach to deriving the distance distributions for the complex geometries in 2), bridging the gap between geometrical probability and networking research. The contribution of this dissertation is twofold. First, the one-dimensional Poisson point process in 1) is applied to message dissemination in vehicular ad-hoc networks, where the network geometry is constrained by highways and city blocks. Second, a new approach is developed to derive the closed-form distributions of inter-node distances associated with rhombuses and hexagons in 2), which are obtained for the first time in the literature. Analytical models can be constructed for characterizing location-critical network performance metrics, such as connectivity, nearest/farthest neighbor, transmission power, and path loss in wireless networks. Through both analytical and simulation results, this dissertation demonstrates that this geometrical probability approach provides accurate information essential to successful network protocol and system design, and goes beyond approximations or Monte Carlo simulations by gracefully eliminating the empirical errors. / Graduate
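To make the notion of a spatial distance distribution concrete, here is a small Monte Carlo sketch estimating P(D <= d) for two nodes placed uniformly in a unit square, an elementary geometry for which a closed form already exists. The sample size and the threshold of 0.5 are arbitrary illustrative choices; the dissertation's analytical derivations for rhombuses and hexagons are not reproduced here.

# Monte Carlo sketch of an inter-node distance distribution in a unit square.
import math
import random

def sample_distance_unit_square() -> float:
    """Distance between two points drawn uniformly in the unit square."""
    x1, y1 = random.random(), random.random()
    x2, y2 = random.random(), random.random()
    return math.hypot(x1 - x2, y1 - y2)

def empirical_cdf(threshold: float, trials: int = 200_000) -> float:
    """P(D <= threshold) estimated by simulation."""
    hits = sum(sample_distance_unit_square() <= threshold for _ in range(trials))
    return hits / trials

# Probability that two uniformly placed nodes are within distance 0.5 of each
# other; an analytical distance distribution would give this without simulation.
print(empirical_cdf(0.5))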
48

Evaluating courseware development effort estimation measures and models

Marshall, Ian Mitchell January 1996 (has links)
No description available.
49

Rhythm and meaning in the Homeric hexameter

Krawitz, Sherry. January 1986 (has links)
No description available.
50

Normalization of Process Safety Metrics

Wang, Mengtian 2012 August 1900 (has links)
This study is aimed at exploring new process safety metrics for measuring process safety performance in the processing industries. Following a series of catastrophic incidents such as the Bhopal chemical tragedy (1984) and the Phillips 66 explosion (1989), process safety became a more important subject than ever. These incidents triggered the development and promulgation of the Process Safety Management (PSM) standard in 1992. While PSM enables management to optimize their process safety programs and organizational risks, there is an emerging need to evaluate process safety implementation across an organization through measurement. Thus, the process safety metric is applied as a powerful tool that measures safety activities, status, and performance within PSM. In this study, process safety lagging metrics were introduced to describe the contribution of process-related parameters in determining the safety performance of an organization. Lagging metrics take process safety incidents as the numerator and divide them by different process-related denominators. Currently, a process lagging metric (which uses work hours as the denominator) introduced by the Center for Chemical Process Safety (CCPS) is used to evaluate safety performance in the processing industries. However, this lagging metric does not include enough process safety information. Therefore, modified denominators are proposed in this study and compared with the existing time-based denominator to validate the effectiveness and applicability of the new metrics. Each proposed metric was validated using available industry data. A statistical unitization method was used to convert incident rates of different ranges to a common scale for ease of comparison. Trend-line analysis was the key indicator for determining the appropriateness of the new metrics. Results showed that some of the proposed process-related metrics have the potential to serve, alongside the time-based metric, as alternatives for evaluating process safety performance within organizations.
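A small sketch of the two calculations the abstract names: a lagging metric formed by dividing process safety incidents by a chosen denominator (scaled to a fixed exposure base), and a unitization step that rescales metrics of different ranges onto [0, 1] for comparison. Min-max scaling is only one plausible reading of the unitization method, and every figure below is invented.

# Hypothetical illustration of a process safety lagging metric and of rescaling
# ("unitization") for comparison. The 200,000-hour exposure base is a commonly
# used convention (roughly 100 full-time workers for a year); treat it and the
# min-max scaling as assumptions, not the study's exact method.

def lagging_metric(incident_count: float, denominator: float, scale: float = 200_000) -> float:
    """Incidents per `scale` units of the chosen denominator (e.g. work hours)."""
    return incident_count * scale / denominator

def min_max_unitize(values: list[float]) -> list[float]:
    """Rescale metric values onto [0, 1] so metrics of different ranges can be compared."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Incident rate per 200,000 work hours at three hypothetical sites.
rates = [lagging_metric(3, 1_200_000), lagging_metric(1, 800_000), lagging_metric(5, 2_500_000)]
print(rates)
print(min_max_unitize(rates))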
