A prototype to discover and penetrate access restricted web pages in an Extranet

Van Jaarsveld, Rudi, 13 October 2014
M.Sc. (Information Technology)

The internet has grown exponentially over the last decade. With ever more information available on the web, search engines use web crawlers, also known as web bots, to gather that information and index billions of web pages; the resulting index helps users find relevant information on the internet. An extranet is a subset of the internet that grants a selected audience access to a specific resource; such sites are also referred to as restricted web sites. Various industries use extranets for different purposes and store different types of information on them. Some of this information is confidential, so it must be adequately secured and kept out of reach of web bots. In practice, web bots can accidentally stumble onto poorly secured pages in an extranet and add those restricted pages to their indexed search results; researchers have found that the crawlers of well-known search engines such as Google can reach poorly secured pages in access restricted web sites in exactly this way. The risk is compounded by the fact that not all web bots have good intentions: some are malicious, searching for vulnerabilities in extranets and exploiting them to access confidential information. The main objective of this dissertation is to develop a prototype web bot, called Ferret, that crawls through a web site supplied by its developer(s), attempts to discover and access poorly secured restricted pages in the extranet, and reports the weaknesses it finds. From the findings of this research, a best practice guideline will be drafted to help developers ensure that access restricted web pages are secured and invisible to web bots.
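To make the idea concrete, the following is a minimal Python sketch of the kind of check a Ferret-style bot performs: crawl a site without sending any credentials and flag restricted-looking pages that are served anyway. This is not the dissertation's implementation; the seed URL, the restricted path markers, and the page limit are illustrative assumptions.

    # Hypothetical sketch of a Ferret-style probe, not the dissertation's code.
    # SEED and RESTRICTED_MARKERS are assumptions for illustration.
    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    SEED = "https://extranet.example.com/"                    # assumed test site
    RESTRICTED_MARKERS = ("/admin", "/private", "/reports")   # assumed markers

    class LinkParser(HTMLParser):
        """Collects href values from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, max_pages=50):
        host = urlparse(seed).netloc
        queue, seen, findings = [seed], set(), []
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                # Deliberately send no cookies or credentials, like an
                # anonymous web bot would.
                with urlopen(url, timeout=10) as resp:
                    status = resp.status
                    final_url = resp.geturl()  # urlopen follows redirects
                    body = resp.read().decode("utf-8", "replace")
            except (HTTPError, URLError):
                continue  # 401/403 or a network error: the secure outcome
            path = urlparse(url).path
            if (status == 200 and final_url == url
                    and any(m in path for m in RESTRICTED_MARKERS)):
                findings.append(url)  # restricted page served without auth
            parser = LinkParser()
            parser.feed(body)
            for href in parser.links:
                nxt = urljoin(url, href)
                if urlparse(nxt).netloc == host:  # stay on the target site
                    queue.append(nxt)
        return findings

    if __name__ == "__main__":
        for url in crawl(SEED):
            print("Possibly exposed restricted page:", url)

A 401 or 403 response, or a redirect to a login page, is the secure outcome here; a 200 response on a restricted path without credentials is exactly the weakness the dissertation sets out to detect and report.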

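The abstract does not reproduce the guideline itself, but one item any such guideline would have to cover is that robots.txt and obscure URLs are hints to well-behaved bots, not security: every restricted URL must be denied on the server unless the request is authenticated. Below is a minimal standard-library sketch of that deny-by-default pattern; the path prefix and the token check are placeholders, not the dissertation's guideline text.

    # Illustrative server-side enforcement for restricted paths; the
    # prefix and token check are placeholder assumptions.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    RESTRICTED_PREFIX = "/extranet/"  # assumed restricted area

    def is_authenticated(handler):
        # Placeholder check: a real extranet would validate a session
        # cookie or token against a server-side store.
        return handler.headers.get("Authorization") == "Bearer demo-token"

    class ExtranetHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path.startswith(RESTRICTED_PREFIX) and not is_authenticated(self):
                # Deny by default: an anonymous crawler gets 403 and never
                # sees the page body, regardless of robots.txt.
                self.send_response(403)
                self.end_headers()
                self.wfile.write(b"Forbidden")
                return
            self.send_response(200)
            # Belt and braces: ask well-behaved bots not to index the page
            # even if they somehow reach it.
            self.send_header("X-Robots-Tag", "noindex, nofollow")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Extranet content")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ExtranetHandler).serve_forever()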