About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.

Web system design and development using open source technology /

Dowla, Rafi, January 2007 (has links)
Thesis (M.S.)--Edinboro University of Pennsylvania, 2007. / "Technical report 06-03." Typescript, photocopy. Includes bibliographical references. Also available online: http://cslab103.cs.edinboro.edu/~WEBEBOOK/

The crossover point between keyword rich website text and spamdexing

Zuze, Herbert January 2011 (has links)
Thesis submitted in fulfilment of the requirements for the degree Magister Technologiae in Business Information Systems, Faculty of Business, Cape Peninsula University of Technology, 2011. / With over a billion Internet users surfing the Web daily in search of information, buying, selling and accessing social networks, marketers focus intensively on developing websites that appeal to both searchers and search engines. Millions of webpages are submitted each day to search engines for indexing. The success of a search engine lies in its ability to provide accurate search results. Search engines' algorithms constantly evaluate websites and webpages that could violate their respective policies, and for this reason some websites and webpages are blacklisted from their indexes.
Websites are increasingly being used as marketing tools, which results in major competition amongst websites. Website developers strive to develop websites of high quality that are unique and content rich, as this helps them obtain a high ranking from search engines. By focusing on websites of a high standard, website developers use search engine optimisation (SEO) strategies to earn a high search engine ranking. From time to time SEO practitioners abuse SEO techniques in order to trick the search engine algorithms, but the algorithms are programmed to identify and flag these techniques as spamdexing. Search engines do not clearly explain how they interpret keyword stuffing (one form of spamdexing) in a webpage. They treat spamdexing in many different ways and do not provide enough detail to clarify what crawlers take into consideration when determining the spamdexing status of a website. Furthermore, search engines differ in how they interpret spamdexing, yet offer no clear quantitative evidence for the crossover point from keyword-dense website text to spamdexing.
Scholars hold differing views on spamdexing, characterised by different keyword density measurements in the body text of a webpage. This raised several fundamental questions that form the basis of this research. The research was carried out using triangulation to determine how scholars, search engines and SEO practitioners interpret spamdexing. Five websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. The experiment was run in two phases and the results were recorded. During both phases almost all of the webpages, including one with a keyword density of 97.3%, were indexed. This enabled the research to conclusively disregard keyword stuffing, blacklisting and any form of penalisation. Designers are urged instead to concentrate on usability and on the sound values behind building a website. The research explored the fundamental contribution of keywords to webpage indexing and visibility. Keywords, whether or not they are used at an optimum level of richness, influence website ranking and indexing. However, the focus should be on how the end user would interpret the displayed content, rather than on how the search engine would react to it. Furthermore, spamdexing is likely to scare away potential clients and end users instead of attracting them, which is why the time spent on spamdexing would be better used to produce quality content.
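The keyword densities discussed in this abstract are simple word-frequency ratios over a page's body text. A minimal sketch of how such a figure might be computed (the function name and sample text are illustrative, not taken from the thesis):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words in the text, as a percentage."""
    # Tokenise on runs of letters, digits and apostrophes, case-insensitively.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# 4 occurrences of "cheap" out of 10 words -> 40.0% density.
body = "cheap flights cheap flights book cheap flights today cheap flights"
print(keyword_density(body, "cheap"))  # 40.0
```

A density near the 97.3% extreme mentioned above would mean the body text is almost nothing but the target keyword, which is the pattern the study tested against the search engines' indexing behaviour.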

Issues to consider during the development and promotion of a primary school web site

Du Preez, Hendrihette Janette 22 December 2005 (has links)
The purpose of this study was to develop a Web site for a primary school and to determine the quality of the web site in comparison with the web sites of other primary schools. The specific focus of this study was to determine the promotability of the web site and to point out the advantages for the school concerned. Ms Mariaan Greyvenstein, my co-researcher, focussed on the content development and management aspects of the web sites of primary schools. The dissertations of both the researchers discuss the web site of one specific school, and for this reason some information overlaps periodically. Both of the dissertations have separate functions and each one is special in its own way. A detailed description of the development and testing of the product is given. The acquisition and evaluation of results are discussed. The researcher discusses the findings to assure the profitability of the product. / Dissertation (MA (Information Science))--University of Pretoria, 2007. / Information Science / unrestricted

Phi Beta Delta: Implementation of a self-maintaining web site

Pillutla, Pallavi 01 January 2007 (has links)
The purpose of this project was to develop an easy-to-maintain web site for the Gamma Lambda Chapter of the Phi Beta Delta International Honor Society at California State University, San Bernardino. The site manages complete and up-to-date information about the mission, members, officers and activities of the honor society.

Online product information load impact on maximizers and satisficers within a choice context /

Mosteller, Jill Renee. January 2007 (has links)
Thesis (Ph. D.)--Georgia State University, 2007. / Naveen Donthu, committee chair; Detmar Straub, Corliss Thornton, Sevo Eroglu, committee members. Electronic text (149 p. : ill.) : digital, PDF file. Title from file title page. Description based on contents viewed Dec. 10, 2007. Includes bibliographical references (p. 142-149).

WIDE web interface development environment /

Okamoto, Sohei. January 2005 (has links)
Thesis (M.S.)--University of Nevada, Reno, 2005. / "December, 2005." Includes bibliographical references (leaves 73-77). Online version available on the World Wide Web.

Optimizing communication performance of web services using differential deserialization of SOAP messages

Abu-Ghazaleh, Nayef Bassam. January 2006 (has links)
Thesis (Ph. D.)--State University of New York at Binghamton, Computer Science Department, 2006. / Includes bibliographical references.

Creating an online shopping Website for "Chinguun-Tulga" office supply store /

Ginjbaatar, Bilguun, January 2007 (has links)
Thesis (M.S.)--Edinboro University of Pennsylvania, 2007. / Typescript, photocopy. Includes bibliographical references (l. 16-17).

On-line electronic document collaboration and annotation /

Harmon, Trev R., January 2006 (has links) (PDF)
Thesis (M.S.)--Brigham Young University. Dept. of Technology, 2006. / Includes bibliographical references.

A pattern-based approach to the specification and validation of web services interactions

Li, Zheng. January 2007 (has links)
Thesis (MSc)--Swinburne University of Technology, Faculty of Information & Communication Technologies, 2006. / A thesis submitted to the Faculty of Information and Communication Technologies, Swinburne University of Technology for the degree of Master of Science by Research, 2007. Typescript. Bibliography p. 107-112.
