About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1. A comparison of DHTML and JAVA applets

Freeby, James M. January 2001 (has links)
Thesis (M.S.)--University of California, Santa Cruz, 2001. / Typescript. Includes bibliographical references (leaves 109-111).
2. An introduction to computer programming for complete beginners using HTML, JavaScript, and C#

Parker, Rembert N. January 2008 (has links)
Thesis (D. Ed.)--Ball State University, 2008. / Title from PDF t.p. (viewed on Nov. 09, 2009). Includes bibliographical references (p. 116-118).
3. Detection, Triage, and Attribution of PII Phishing Sites

Roellke, Dennis January 2022 (has links)
Stolen personally identifiable information (PII) can be abused to commit a multitude of crimes in the victim's name. For instance, credit card information can be used in the drug trade, Social Security numbers and health IDs can be used in insurance fraud, and passport data can be used for human trafficking or terrorism. Even information typically considered publicly available (e.g., name, birthday, phone number) can be used for unauthorized registration of services and generation of new accounts in the victim's identity (unauthorized account creation). Accordingly, modern phishing campaigns have outgrown the goal of account takeover and are trending toward more sophisticated goals. While criminal investigations in the real world evolved over centuries, digital forensics is only a few decades into the art. In digital forensics, threat analysts have pioneered the field of enhanced attribution, a study of threat intelligence that aims to find a link between attacks and attackers. Their findings provide valuable information for investigators, ultimately bolstering takedown efforts and helping determine the proper course of legal action. Despite the overwhelming number of security solutions on offer today, each suggesting strong threat-analysis capabilities, vendors share only attack signatures, and additional intelligence remains locked inside each vendor's ecosystem. Victims often hesitate to disclose attacks, fearing reputational damage and accidental disclosure of intellectual property. This phenomenon limits the availability of postmortem analyses of real-world attacks and often forces third-party investigators, such as government agencies, to mine their own data. In the absence of industry data, it can be promising to actively infiltrate fraudsters in an independent sting operation. Undercover agents can monitor online markets for illegal offerings; another common industry practice is to trap attackers in monitored sandboxes called honeypots.
Using honeypots, investigators lure and deceive an attacker into believing an attack was successful while simultaneously studying the attacker's behavior. Insights gathered from this process allow investigators to examine the latest attack vectors, methodologies, and overall trends. For either approach, investigators crave additional information about the attacker so that they know what to look for. In the context of phishing attacks, it has been repeatedly proposed to "shoot tracers into the cloud" by stuffing phishing sites with fake information that can later be recognized in one way or another. However, to the best of our knowledge, no existing solution can keep up with modern phishing campaigns, because existing solutions focus on credentials only, while modern campaigns steal more than user credentials: they increasingly target PII. We observe that the use of HTML form input fields is a commonality among both credential-stealing and identity-stealing phishing sites, and we propose to thoroughly evaluate this feature for the detection, triage, and attribution of phishing attacks. This process includes extracting the phishing site's target PII from its HTML <label> tags, investigating how JavaScript code stylometry can be used to fingerprint a phishing site for its detection, and determining commonalities between threat actors' personal styles. Our evaluation shows that <input> tag identifiers and <label> tags are the most important features for this machine learning classification task, lifting the accuracy from 68% without these features to up to 92% when including them. We show that <input> tag identifiers and code stylometry can also be used to decide whether a phishing site uses cloaking. Finally, we propose to build the first denial-of-phishing engine (DOPE) that handles all phishing, both credential stealing and PII theft. DOPE analyzes HTML <label> tags to learn which information to provide, and we craft this information in a believable manner, meaning that it can be expected to pass the phisher's credibility checks.
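The form-field feature extraction this abstract describes can be sketched in a few lines. The snippet below is an illustrative sketch, not the thesis's implementation: it collects <input> name/id identifiers and <label> text from a page using only the Python standard library, producing the kind of raw features a classifier could consume.

```python
# Sketch: collect <input> identifiers and <label> text from HTML as
# candidate features for phishing-site classification (illustrative only).
from html.parser import HTMLParser

class FormFeatureExtractor(HTMLParser):
    """Gathers <input> name/id attributes and visible <label> text."""
    def __init__(self):
        super().__init__()
        self.input_identifiers = []   # e.g. "ssn", "card_number"
        self.labels = []              # text content of <label> tags
        self._in_label = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input":
            for key in ("name", "id"):
                if attrs.get(key):
                    self.input_identifiers.append(attrs[key])
        elif tag == "label":
            self._in_label = True

    def handle_endtag(self, tag):
        if tag == "label":
            self._in_label = False

    def handle_data(self, data):
        if self._in_label and data.strip():
            self.labels.append(data.strip())

html = """
<form>
  <label for="ssn">Social Security Number</label>
  <input name="ssn" type="text">
  <label for="cc">Card number</label>
  <input id="cc" type="text">
</form>
"""
extractor = FormFeatureExtractor()
extractor.feed(html)
print(extractor.input_identifiers)  # ['ssn', 'cc']
print(extractor.labels)             # ['Social Security Number', 'Card number']
```

A label such as "Social Security Number" signals that the form targets PII rather than login credentials, which is the distinction the classification task above exploits.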
4. An introduction to computer programming for complete beginners using HTML, JavaScript, and C#

Parker, Rembert N. January 2008 (has links)
Low student success rates in introductory computer programming classes result in low student retention rates in computer science programs. Some sections of the course followed a traditional approach, beginning immediately with C# in the .NET development environment. One section used an experimental redesign that began with a study of HTML and JavaScript and focused on having students build web pages for several weeks; after that, the experimental course used C# and the .NET development environment, covering all the material covered in the traditional sections. Students were more successful in the experimental section: a higher percentage of students passed the course, and a higher percentage continued on to take at least one additional computer science course. / Department of Computer Science
5. Accessing timesheets via internet through ASP and ODBC

Challa, Varshi 01 January 2000 (has links)
The purpose of this project is to develop a computerized timesheet application. Using this application, an employee of a company can log onto the company's Web site and fill out a timesheet from anywhere in the world. The project involved automating timesheet data entry and approval procedures using contemporary technologies such as Active Server Pages (ASP), JavaScript, VBScript, Component Object Model (COM) components, and Open Database Connectivity (ODBC).
6. Presentations world wide systems

Hengstebeck, Sandra Marie 01 January 2001 (has links)
The purpose of Presentations World Wide System (PWWS) is to allow students to view a live presentation through an Internet browser and allow the instructor to have control over the presentation.
7. Web Texturizer: Exploring intra web document dependencies

Tandon, Seema Amit 01 January 2004 (has links)
The goal of this project is to create a customized web browser that facilitates skimming documents by offsetting each document with relevant information. The project adds learning-based information retrieval techniques to Web Texturizer to automate the web browsing experience. The script runs on the Web Texturizer website and allows users to quickly navigate through the web page.
8. Real-time monitoring of distributed real-time and embedded systems using Web

Puranik, Darshan Gajanan 03 January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Asynchronous JavaScript and XML (AJAX) is the primary method for enabling asynchronous communication over the Web. Although AJAX provides the Web with much-needed real-time capabilities, it requires unconventional programming methods at the expense of extensive resource usage. WebSockets, an emerging protocol, has the potential to address many challenges of implementing asynchronous communication over the Web. There has, however, been no in-depth study that quantitatively compares AJAX and WebSockets. This thesis therefore provides two contributions to Web development. First, it provides an experience report for adding real-time monitoring support over the Web to the Open-source Architecture of Software Instrumentation of Systems (OASIS), an open-source real-time instrumentation middleware for distributed real-time and embedded (DRE) systems. Second, it quantitatively compares using AJAX and WebSockets to stream collected instrumentation data over the Web in real time. Results from the quantitative comparison show that a WebSockets server consumes 50% less network bandwidth than an AJAX server; a WebSockets client consumes memory at a constant rate rather than an increasing rate; and WebSockets can send up to 215.44% more data samples when consuming the same amount of network bandwidth as AJAX.
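The bandwidth gap reported in this abstract follows largely from per-message overhead: each AJAX poll repeats full HTTP request and response headers, while a WebSocket frame adds only a few bytes on an already-open connection. The back-of-the-envelope sketch below uses assumed header and payload sizes (not measurements from the thesis) to show the shape of that difference.

```python
# Illustrative per-sample overhead: AJAX polling vs. a WebSocket stream.
# All byte counts below are assumptions for the sake of the sketch.
HTTP_REQUEST_HEADERS = 450    # assumed typical HTTP request headers, bytes
HTTP_RESPONSE_HEADERS = 350   # assumed typical HTTP response headers, bytes
WEBSOCKET_FRAME_HEADER = 4    # WebSocket frame header for small payloads
PAYLOAD = 200                 # one instrumentation data sample, bytes

# One AJAX poll pays full header cost in both directions per sample;
# a WebSocket frame pays only its small frame header.
ajax_per_sample = HTTP_REQUEST_HEADERS + HTTP_RESPONSE_HEADERS + PAYLOAD
ws_per_sample = WEBSOCKET_FRAME_HEADER + PAYLOAD

print(f"AJAX bytes/sample:      {ajax_per_sample}")   # 1000
print(f"WebSocket bytes/sample: {ws_per_sample}")     # 204
print(f"Samples per MB, AJAX:      {1_000_000 // ajax_per_sample}")
print(f"Samples per MB, WebSocket: {1_000_000 // ws_per_sample}")
```

Under these assumed sizes a WebSocket stream fits several times more samples into the same bandwidth, which is consistent in direction with the thesis's measured results, though the exact ratios depend on real header and payload sizes.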
