Thesis
Submitted in fulfilment
of the requirements for the degree
MAGISTER TECHNOLOGIAE
in
BUSINESS INFORMATION SYSTEMS
in the
FACULTY OF BUSINESS
at the
CAPE PENINSULA UNIVERSITY OF TECHNOLOGY
2011

With over a billion Internet users surfing the Web daily in search of information, buying,
selling and accessing social networks, marketers focus intensively on developing websites
that appeal to both searchers and search engines. Millions of webpages are submitted to
search engines for indexing each day. The success of a search engine lies in its ability to
provide accurate search results. Search engines' algorithms constantly evaluate websites
and webpages for violations of their respective policies, and offending websites and
webpages are subsequently blacklisted from their indices.
Websites are increasingly being utilised as marketing tools, which results in intense
competition amongst websites. Website developers strive to build websites of high quality
that are unique and content rich, as this assists them in obtaining a high ranking from
search engines. Beyond maintaining a high standard of design, website developers apply
search engine optimisation (SEO) strategies to earn a high search engine ranking.
From time to time SEO practitioners abuse SEO techniques in order to trick the search
engine algorithms, but the algorithms are programmed to identify and flag these techniques
as spamdexing. Search engines do not clearly explain how they interpret keyword stuffing
(one form of spamdexing) in a webpage, and they do not provide enough detail to clarify
what their crawlers take into consideration when assessing the spamdexing status of a
website. Furthermore, search engines differ in the way they interpret spamdexing, yet offer
no clear quantitative evidence for the point at which keyword-dense website text crosses
over into spamdexing. Scholars have likewise expressed differing views on spamdexing,
characterised by different keyword density measurements for the body text of a webpage.
This raised several fundamental questions that form the basis of this research.
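The crossover point at issue depends on how keyword density is measured in the first place.
As a minimal illustrative sketch only (the function name and the word-level tokenisation
below are assumptions for illustration; scholars differ on details such as stop words and
multi-word phrases, which is part of the ambiguity this research examines), keyword density
is commonly computed as keyword occurrences divided by total words in the body text,
expressed as a percentage:

    import re

    def keyword_density(body_text: str, keyword: str) -> float:
        """Keyword density = occurrences / total words * 100.

        A common, simplified definition; search engines do not publish
        the measurement their crawlers actually apply.
        """
        words = re.findall(r"[a-z0-9']+", body_text.lower())
        if not words:
            return 0.0
        return 100.0 * words.count(keyword.lower()) / len(words)

    # Example: a short body text dominated by one keyword, in the
    # spirit of the high-density experimental webpages described below.
    text = "cheap flights cheap flights book cheap flights today"
    print(f"{keyword_density(text, 'cheap'):.1f}%")  # 3 of 8 words -> 37.5%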
This research was carried out using triangulation in order to determine how scholars,
search engines and SEO practitioners interpret spamdexing. Five websites with varying
keyword densities were designed and submitted to Google, Yahoo! and Bing. The experiment
was conducted in two phases and the results were recorded. During both phases almost all
of the webpages, including the one with a keyword density of 97.3%, were indexed. This
enabled the research to conclusively set aside the keyword stuffing issue, blacklisting
and any form of penalisation. Designers are urged to concentrate instead on usability and
the sound values behind building a website.
The research explored the fundamental contribution of keywords to webpage indexing and
visibility. Keywords, whether or not used at an optimum level of richness or poorness,
still result in website ranking and indexing. However, the focus should be on the way in
which the end user would interpret the displayed content, rather than on how the search
engine would react to it. Furthermore, spamdexing is likely to scare away potential clients
and end users instead of attracting them, which is why the time spent on spamdexing should
rather be used to produce quality content.
Identifier | oai:union.ndltd.org:netd.ac.za/oai:union.ndltd.org:cput/oai:localhost:20.500.11838/1767 |
Date | January 2011 |
Creators | Zuze, Herbert |
Publisher | Cape Peninsula University of Technology |
Source Sets | South African National ETD Portal |
Language | English |
Detected Language | English |
Type | Thesis |
Rights | http://creativecommons.org/licenses/by-nc-sa/3.0/za/ |