About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

The Influence of User Perceptions on Software Utilization: Application and Evaluation of a Theoretical Model of Technology Acceptance

Morris, Michael G., Dillon, Andrew January 1997 (has links)
This paper presents and empirically evaluates the Technology Acceptance Model (TAM), which can serve as a simple, cost-effective tool for evaluating applications and reliably predicting whether users will accept them. After presenting TAM, the paper reports on a study designed to evaluate its effectiveness at predicting system use. The researchers gave 76 novice users an overview and hands-on demonstration of Netscape, then gathered data on user perceptions of and attitudes toward Netscape based on this initial exposure to the system. Follow-up data were gathered two weeks later to measure actual use of Netscape following the demonstration. Results suggest that TAM is an effective and economical tool for predicting end-user acceptance of systems. Suggestions for future research and conclusions for both researchers and practitioners are offered.
122

Information navigation on the web by clustering and summarizing query results

Roussinov, Dmitri G., Chen, Hsinchun January 2001 (has links)
Artificial Intelligence Lab, Department of MIS, University of Arizona / We report our experience with a novel approach to interactive information seeking grounded in the idea of summarizing query results through automated document clustering. We went through a complete system development and evaluation cycle: designing the algorithms and interface for our prototype, implementing them, and testing them with human users. Our prototype acted as an intermediate layer between the user and a commercial Internet search engine (Alta Vista), thus allowing searches of a significant portion of the World Wide Web. In our final evaluation, we processed data from 36 users and concluded that our prototype improved search performance over using the same search engine (Alta Vista) directly. We also analyzed the effects of various demographic and task-related parameters.
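The authors' algorithms are not reproduced in this abstract. As a minimal sketch of the general idea of clustering result snippets and labeling each cluster with its top terms, the following uses a greedy single-pass scheme with an invented similarity threshold; it is not the paper's method.

```python
import math
import re
from collections import Counter

STOP = {"the", "a", "of", "and", "to", "in", "for", "on", "is"}

def vectorize(text):
    """Term-frequency vector over lowercase word tokens, stopwords removed."""
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOP]
    return Counter(words)

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0

def cluster(snippets, threshold=0.3):
    """Greedy single-pass clustering: each snippet joins the first cluster
    whose centroid it resembles, otherwise it starts a new cluster."""
    clusters = []  # list of (centroid Counter, member indices)
    for i, snip in enumerate(snippets):
        vec = vectorize(snip)
        for centroid, members in clusters:
            if cosine(vec, centroid) >= threshold:
                centroid.update(vec)  # fold the snippet into the centroid
                members.append(i)
                break
        else:
            clusters.append((vec.copy(), [i]))
    # Label each cluster by its most frequent terms, as a crude summary.
    return [(", ".join(t for t, _ in centroid.most_common(2)), members)
            for centroid, members in clusters]
```

Run on mixed-sense snippets (e.g. "java" the language vs. "java" the island), the labels give the user a summary of each group before drilling into individual results.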
123

Revolutionizing Science and Engineering Through Cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure

Atkins, Daniel 01 1900 (has links)
This 84-page report defines the Cyberinfrastructure program proposed by the National Science Foundation (NSF). Here is the text of the news release from the University of Michigan School of Information:

"Atkins committee issues NSF report on development of cyberinfrastructure (Feb 2003)

A National Science Foundation (NSF) committee chaired by University of Michigan professor Daniel Atkins has recommended the organization spend an additional $1 billion per year developing the nation's 'cyberinfrastructure' to support scientific research. The Advisory Committee on Cyberinfrastructure argues that investment in a comprehensive cyberinfrastructure can change profoundly what scientists and engineers do, how they do it, and who participates. Its recommendations are detailed in a newly released report titled Revolutionizing Science and Engineering through Cyberinfrastructure.

In the same way society now depends on highways, water systems, and power grids, the panel contends, scientific research in the coming years will depend on the quality of the cyberinfrastructure -- the integrated information, computing, and communications systems that tie us together. 'It's not just the raw technology, but also the organization and the people,' says Atkins, who is professor in the School of Information and the Department of Electrical Engineering and Computer Science at U-M. It's also the standards for interoperability that will allow different disciplines to use the same infrastructure, 'just the way we agreed long ago on a standard gauge for railroad tracks.'

'The path forward that this report envisions ... truly has the potential to revolutionize all fields of research and education,' says Peter Freeman, assistant director of the NSF for Computer and Information Sciences and Engineering (CISE), the NSF arm that commissioned the report. The report was issued on the same day the NSF submitted its $5.48 billion budget request for fiscal year 2004.

'NSF has been a catalyst for creating the conditions for a nascent cyberinfrastructure-based revolution,' says Atkins, a revolution being driven from the ground up. 'We've clearly documented extensive grass-roots activity in the scientific and engineering research community to create and use cyberinfrastructure to empower the next wave of discovery.' The committee cites NSF support for such projects as the Network for Earthquake Engineering Simulations (NEES), the TeraGrid effort, and the Digital Libraries Initiative as seminal in the development of a cyberinfrastructure.

At the same time, the report makes clear that the cyberinfrastructure needed cannot be built with today's off-the-shelf technology, and it argues for increased NSF support for fundamental research in computer science and engineering. The report emphasizes the importance of acting quickly and the risks of failing to do so. Those risks include lack of coordination, which could leave key data in irreconcilable formats; long-term failures to archive and curate data collected at great expense; and artificial barriers between disciplines built from incompatible tools and structures. The NSF has a 'once-in-a-generation opportunity,' according to the committee, to lead the scientific and engineering community in the coordinated development and expansive use of cyberinfrastructure."
124

CI Spider: a tool for competitive intelligence on the Web

Chen, Hsinchun, Chau, Michael, Zeng, Daniel January 2002 (has links)
Artificial Intelligence Lab, Department of MIS, University of Arizona / Competitive Intelligence (CI) aims to monitor a firm's external environment for information relevant to its decision-making process. As an excellent information source, the Internet provides significant opportunities for CI professionals, along with the problem of information overload. Internet search engines have been widely used to facilitate information search on the Internet; however, many problems hinder their effective use in CI research. In this paper, we introduce the Competitive Intelligence Spider, or CI Spider, designed to address some of the problems associated with using Internet search engines in the context of competitive intelligence. CI Spider performs real-time collection of Web pages from sites specified by the user and applies indexing and categorization analysis to the documents collected, thus providing the user with an up-to-date, comprehensive view of the Web sites of interest. In this paper, we report on the design of the CI Spider system and on a user study that compares CI Spider with two alternative focused information-gathering methods: Lycos search constrained by Internet domain, and manual within-site browsing and searching. Our study indicates that CI Spider achieves better precision and recall than Lycos, and that it outperforms both Lycos and within-site browsing and searching with respect to ease of use. We conclude that there is strong evidence for the potentially significant value of applying the CI Spider approach in CI applications.
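CI Spider's implementation is not described in enough detail in this abstract to reproduce. As a toy sketch of the collect-then-index pattern the abstract describes, the following crawls pages reachable from a user-specified start URL and builds an inverted index over their text. Fetching is simulated by a dict of URL-to-HTML; the real system would issue HTTP requests and resolve relative links.

```python
from collections import defaultdict
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Collects outgoing links and visible text from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def spider(start_url, fetch, limit=20):
    """Breadth-first collection from a start URL, then inverted indexing.
    `fetch` maps a URL to its HTML; a real spider would do an HTTP GET."""
    index = defaultdict(set)          # term -> set of URLs containing it
    queue, seen = [start_url], set()
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen or url not in fetch:
            continue
        seen.add(url)
        parser = PageParser()
        parser.feed(fetch[url])
        for word in parser.words:
            index[word].add(url)
        queue.extend(parser.links)    # follow links found on this page
    return index
```

The resulting term-to-pages index is the raw material for the kind of categorization analysis the abstract mentions; categorization itself (e.g. grouping pages by salient phrases) would sit on top of it.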
125

Advances in Classification Research, Volume 17: Proceedings of the 17th ASIS&T SIG/CR Classification Research Workshop

January 2006 (has links)
Papers from the 17th ASIS&T Special Interest Group for Classification Research Workshop, held on November 4, 2006 in Austin, Texas.
126

Making Your Web Site Senior Friendly: A Checklist

National Institute on Aging, U.S., National Library of Medicine, U.S. 09 1900 (has links)
Published by the U.S. National Institute on Aging and the National Library of Medicine / This fifteen-page pamphlet includes recommendations regarding designing readable text, presenting information to older adults, improving Web site navigation, and incorporating media into Web sites. A list of references and additional readings is included.
127

Web Searching, Search Engines and Information Retrieval

Lewandowski, Dirk January 2005 (has links)
This article discusses Web search engines, focusing on the challenges of indexing the World Wide Web, user behaviour, and the ranking factors these engines use. Ranking factors are divided into query-dependent and query-independent factors, the latter of which have become increasingly important in recent years. The potential of these factors is limited, particularly of those based on the widely used link-popularity measures. The article concludes with an overview of factors that should be considered in determining the quality of Web search engines.
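As a sketch of the query-independent link-popularity measures the article refers to, here is a minimal PageRank-style power iteration over a small link graph. It is illustrative only, not the article's formulation; the damping factor and iteration count are conventional choices, not values from the source.

```python
def link_popularity(links, damping=0.85, iterations=50):
    """Power-iteration sketch of a PageRank-style query-independent score.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start from a uniform score
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                  # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                             # share score among link targets
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank
```

Because the score depends only on the link structure, it can be computed offline and combined at query time with query-dependent factors such as term matches, which is what makes such measures attractive to search engines.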
128

A web-based system for distributed product realization

Kulkarni, Rahul Suresh 12 1900 (has links)
No description available.
129

Spatial metaphors and spatial context on the World Wide Web

Elson, Shawn 05 1900 (has links)
No description available.
130

A distributed engineering problem generator

Boyd, Martin C. 08 1900 (has links)
No description available.
