1 |
Web-based Stereo Rendering for Visualization and Annotation of Scientific Volumetric Data. Eng, Daniel C. 16 January 2010 (has links)
Advances in high-throughput microscopy technology such as Knife-Edge
Scanning Microscopy (KESM) are enabling the production of massive amounts of high-resolution,
high-quality volumetric data of biological microstructures. To fully
utilize these data, they should be efficiently distributed to the scientific research community
through the Internet and should be easily visualized, annotated, and analyzed.
Given the volumetric nature of the data, visualizing them in 3D is important. However,
since we cannot assume that every end user has high-end hardware, an approach
that has minimal hardware and software requirements will be necessary, such as a
standard web browser running on a typical personal computer. There are several web
applications that facilitate the viewing of large collections of images. Google Maps
and Google Maps-like interfaces such as Brainmaps.org allow users to pan and zoom
2D images efficiently. However, they do not yet support the rendering of volumetric
data in their standard web interface.
The goal of this thesis is to develop a lightweight volumetric image viewer using
existing web technologies such as HTML, CSS, and JavaScript, while exploiting the
properties of stereo vision to facilitate the viewing and annotation of volumetric data.
Stereograms were chosen over other techniques because they allow the raw image stacks
produced by the 3D microscope to be used without any extra computation on the data.
The operations needed to generate stereo images from 2D image stacks include
distance attenuation and binocular disparity. Because both operations are computationally
cheap in HTML and JavaScript, they can be performed dynamically in a standard
web browser by overlaying the images with intervening semi-opaque layers.
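As a rough illustration of how such overlays might be assembled (a hedged sketch, not the viewer described in the thesis; the element IDs, disparity step, and opacity value are invented), each slice can be absolutely positioned with a depth-dependent horizontal shift, and a semi-opaque layer can be inserted above it so that deeper slices appear fainter:

```typescript
// Minimal sketch: assemble one eye's view of a stereogram from a stack of
// slice images by shifting each slice horizontally in proportion to its depth
// (binocular disparity) and placing a semi-opaque layer above each slice so
// that deeper slices are progressively dimmed (distance attenuation).
// Container IDs, the disparity step, and the opacity are illustrative only.
function buildEyeView(
  container: HTMLElement,
  sliceUrls: string[],        // ordered from deepest slice to shallowest
  eyeSign: 1 | -1,            // +1 for the left view, -1 for the right view
  disparityPerSlice = 2,      // horizontal shift in pixels per depth step
  attenuationOpacity = 0.15   // opacity of each intervening layer
): void {
  container.style.position = "relative";
  sliceUrls.forEach((url, depth) => {
    const img = document.createElement("img");
    img.src = url;
    img.style.position = "absolute";
    img.style.top = "0px";
    img.style.left = `${eyeSign * depth * disparityPerSlice}px`;
    container.appendChild(img);

    // Semi-opaque layer: dims everything already drawn, i.e. the deeper slices.
    const veil = document.createElement("div");
    veil.style.position = "absolute";
    veil.style.top = "0";
    veil.style.left = "0";
    veil.style.width = "100%";
    veil.style.height = "100%";
    veil.style.background = `rgba(255, 255, 255, ${attenuationOpacity})`;
    container.appendChild(veil);
  });
}

// Hypothetical usage: two side-by-side containers form the stereo pair.
// buildEyeView(document.getElementById("left-eye")!, slices, 1);
// buildEyeView(document.getElementById("right-eye")!, slices, -1);
```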
The annotation framework has also been implemented and tested. For
annotation to work in this environment, the annotations must themselves be in the form
of a stereogram and should aid the merging of the stereo pairs. The current technique allows
users to place a mark (dot) on one image stack, and its projected position on the other
image stack is calculated dynamically on the client side. Additional metadata such
as textual descriptions can be entered by the user as well. To cope with the occlusion
problem caused by changes in the z direction, the structure traced by the user is
displayed on the side, together with the data stacks. Using the same stereogram-creation
techniques, the traces made by the user are dynamically generated and shown
as a stereogram.
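A hedged sketch of the client-side projection step (the geometry follows the illustrative disparity convention used in the sketch above, not necessarily the thesis's): a dot placed at a given (x, y) on a slice of one view can be mapped to the paired view by applying the opposite horizontal shift for that depth.

```typescript
// Illustrative only: project an annotation dot from the left view to the right
// view by undoing the left shift and applying the right shift for its depth.
// The disparity step mirrors the hypothetical constant in the earlier sketch.
function projectToOtherEye(
  x: number,
  y: number,
  depth: number,              // index of the slice the dot was placed on
  disparityPerSlice = 2
): { x: number; y: number } {
  // The left view shifts a slice by +depth*d and the right view by -depth*d,
  // so the projected dot moves by the total offset 2*depth*d; y is unchanged.
  return { x: x - 2 * depth * disparityPerSlice, y };
}
```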
We expect the approach presented in this thesis to be applicable to broader
scientific domains, including geology and meteorology.
|
2 |
Traumatic brain injury options web application. Nagulavancha, Sruthi. January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel A. Andresen / According to the Division of Injury Response, Centers for Disease Control and Prevention, approximately 1.4 million Americans sustain a traumatic brain injury each year. The aim of the project is to create a web interface to link survivors, family members, and caregivers of individuals suffering from traumatic brain injuries (TBI) to potentially helpful agencies or service centers within their local communities. TBI service centers located in remote places are often difficult to find, so this website concentrates mainly on small rural centers located in Kansas.
The portal will offer basic, two-dimensional information about traumatic brain injury centers, specifically about access to resources. Within the portal, a link to an interactive map will be provided. A data-entry form lets service centers publish their presence and the regions they serve. A distance-search feature is also built into the website; it interactively finds the latitude/longitude pair (the TBI service center) nearest to the user's location using the haversine formula.
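For reference, a minimal sketch of the haversine computation behind such a distance search (the function name, Earth-radius constant, and the sorting snippet are illustrative, not the project's code):

```typescript
// Haversine sketch: great-circle distance in kilometers between two points
// given as (latitude, longitude) in degrees. The mean Earth radius constant
// and the function name are illustrative.
const EARTH_RADIUS_KM = 6371;

function haversineKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const sinLat = Math.sin(dLat / 2);
  const sinLon = Math.sin(dLon / 2);
  const a = sinLat * sinLat +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * sinLon * sinLon;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
}

// A nearest-center search can then sort candidate centers by this distance:
// centers.sort((c1, c2) =>
//   haversineKm(user.lat, user.lon, c1.lat, c1.lon) -
//   haversineKm(user.lat, user.lon, c2.lat, c2.lon));
```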
|
3 |
Blue search in Kansas river database. Sama, Haritha Reddy. January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Daniel Andresen / Students who are research assistants under Dr. Craig Paukert, Division of Biology, Kansas State University, have problems storing and retrieving their field data directly from a database. Their data generally relate to fishes in the Kansas River. All of these data are entered on data sheets during field research.
They record fish-related data such as tag numbers (each fish caught is given a tag and, after its data are noted, is released back into the river) and other information such as the size of the fish, the place it was caught (stations), and details about shore habitats, water temperature, depth, and so on. They also need data to recognize aging in a recaptured fish (a fish caught with a tag). The data from these data sheets are entered into the database manually. Each student enters information into a separate database, even though they should all be working on a single database, which leads to inconsistent data. The database also does not immediately show the concentration or locations of the fish; that requires a lot of additional analysis of the data.
The main objective of this project is to provide a solution to this problem: an interface that stores and retrieves the data, displays it on maps for easy analysis, and keeps the data centralized. The interface is user friendly and removes the hassle of entering the data into the database manually.
|
4 |
Web application Security. Charpentier Rojas, Jose Enrique. January 2013 (has links)
Problems related to web application security arise in many ways; one example is inexperienced programmers, not only in how they code and program but also in which language and structure they use to code. Not only programmers but also software companies leave holes in the software they develop, of course without intention.
Because it has been shown that most vulnerabilities start on the web application side, as developers we need to follow certain principles, test our code, and learn as much as possible about the subject, as a foundation of web application security, in order to know how to prevent the most significant threats.
The penetration test aimed to help the IT business discover vulnerabilities in its systems, ensure their integrity, and continue further in the web application security process. The vulnerability research performed in this report is the introduction to a larger body of work that the company is continuing.
Finally, following the security standards, processes, and methodologies applied in this field is considered the best approach to ensuring web application security, and yields invaluable information to benefit from.
|
5 |
Detecting And Diagnosing Web Application Performance Degradation In Real-Time At The Method Call Level. Wang, Mengliao. Unknown Date
No description available.
|
6 |
Web application for Office of International Programs. Jarugu, Swathi. January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Mitchell L. Neilsen / The mission of the Office of International Programs is to lead the internationalization of Kansas State University by supporting and collaborating with faculty, students, and scholars. The units within the Office of International Programs provide primary support for incoming international students, faculty, and researchers at K-State, for faculty and researchers going abroad, and for other groups.
It seldom happens that all the international activity of a university is organized under one roof. There is no conventional source that gathers all of this information (international activity) into a single web application; it is usually scattered across the web. This makes it difficult for prospective students or researchers to search for people or programs. The current project was implemented to overcome this issue.
The main objective of the current project is to design a website for the Office of International Programs. The website hosts information about International Faculty, Study Abroad Programs, International Agreements, Alumni Groups, and International Students on a single platform. The portal offers administrators different modules where information about a particular program can be added, edited, deleted, and viewed.
The primary focus of the application is to display information in an innovative format and to enable different users to navigate the website easily. The website makes searching easy for users. It presents information for viewing and maintains a database of the various people and programs of the Office of International Programs at Kansas State University.
|
7 |
A Contextualized Web-Based Learning Environments for DEVS Models. Srivrunyoo, Inthira. 27 November 2007 (has links)
With advances in applying technology to education, the traditional lecture-driven teaching style is gradually being replaced by a more active style in which students play a more active role in the learning process. In this paper we introduce a new initiative to provide a suite of online tools for learning DEVS models. What distinguishes this tutorial project is its integration of information technology and multimedia into education through the development of an interactive tutorial, and its contextualized-learning approach. The tutorial teaches students the basic aspects of discrete event systems and simulation. It fully utilizes information and multimedia technology, web applications, and the Java programming language to achieve rich interactivity and enhance students' learning. The tutorial supports human-computer collaboration to enhance learning and satisfy user goals by letting the user interact effectively.
|
8 |
Malicious Web Page Detection Based on Anomaly Semantics. Luo, Jing-Siang. 20 August 2009 (has links)
Web services are becoming the dominant way to provide access to online information. They have many applications, such as e-mail, web search engines, auction networks, and internet banking. Web application technology and dynamic webpage technology are central to these services, but hackers exploit web application vulnerabilities and dynamic webpage technology to inject malicious code into webpages, and some websites neglect the issue of security. In this paper, we propose a novel approach for detecting malicious webpages based on URL features, anomaly semantics, potentially dangerous tags, and tag attributes. The proposed approach consists mainly of three parts: (1) a scripting-language and automatic-link filter, (2) malicious features, and (3) a scoring mechanism. The first part filters out normal webpages to increase detection speed; the second identifies known malicious attacks; the third finds previously unknown malicious webpages by scoring them. Our experimental results show that the proposed approach achieves a low false positive rate and a low false negative rate.
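To make the scoring idea concrete (a hedged sketch only; the pattern list, weights, and threshold are invented and not the authors' values), a page's suspicion score can be accumulated over potentially dangerous tags and calls and compared against a threshold:

```typescript
// Illustrative scoring sketch: accumulate weights for potentially dangerous
// tags and calls found in a page's HTML and flag the page if the total passes
// a threshold. The patterns, weights, and threshold are invented.
const PATTERN_WEIGHTS: Record<string, number> = {
  "<iframe": 3,          // commonly used to pull in external malicious content
  "<script": 1,
  "eval(": 2,
  "unescape(": 2,
  "document.write(": 2,
};

function suspicionScore(html: string): number {
  const lower = html.toLowerCase();
  let score = 0;
  for (const [pattern, weight] of Object.entries(PATTERN_WEIGHTS)) {
    // Count occurrences of the pattern and add its weight for each one.
    const count = lower.split(pattern).length - 1;
    score += count * weight;
  }
  return score;
}

function isSuspicious(html: string, threshold = 5): boolean {
  return suspicionScore(html) >= threshold;
}
```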
|
9 |
Transformation of round-trip web application to use AJAX. Chu, Jason. 19 June 2008 (has links)
AJAX is a web application programming technique that allows portions of a web page to be loaded dynamically, separately from other parts of the web page. This gives the user a much smoother experience when viewing the web page. The technique also conserves bandwidth by transmitting only the new data relevant to the user, keeping all other content on the web page unchanged. Migrating a traditional round-trip web application to an AJAX-based web application can be difficult because of the many details required by AJAX.
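For context, a generic sketch of the AJAX pattern the transformation targets (not code produced by the thesis; the URL, element ID, and the use of the fetch API are placeholders for illustration):

```typescript
// Generic AJAX sketch: fetch a page fragment from the server and swap it into
// one part of the DOM, leaving the rest of the page untouched. The URL and
// element ID are placeholders.
async function refreshFragment(fragmentUrl: string, targetId: string): Promise<void> {
  const response = await fetch(fragmentUrl);          // only the new data travels
  if (!response.ok) {
    throw new Error(`Fragment request failed: ${response.status}`);
  }
  const html = await response.text();
  const target = document.getElementById(targetId);
  if (target) {
    target.innerHTML = html;                          // everything else stays as-is
  }
}

// refreshFragment("/orders/table-fragment", "orders-table");
```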
In this thesis, an approach is presented to automate the process of AJAX conversion using source transformation and backward slicing techniques. The result is an AJAX-based web page that will enhance the user experience and also conserve bandwidth. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2008-06-13 09:43:55.515
|
10 |
Integrating research root cause analysis tools into a commercial IT service manager. Li, Xiaochun. 13 December 2011 (has links)
IT environments are growing more complex by the day, and this trend is poised to continue in the coming years. To manage IT resources and maximize productivity, large organizations are striving for better methods to control their current environments. They also have to prepare for future growth in complexity as their environments cater to growing IT needs. In the current economic recession, organizations are not only threatened by the growing complexity but also have to cope with limited personnel due to financial constraints. Organizations are eager to obtain new technology that gives them firmer control over different platforms, vendors, and solutions at a reasonable cost. At the same time, this new technology must deliver quality services that effectively fulfill customer needs.
To deal with these IT management challenges, CA Inc. (formerly Computer Associates) developed Spectrum Service Assurance Manager (SAM), a product for solving service management problems in complex IT environments. SAM can give organizations a wide-ranging view of their multi-faceted IT environments by providing vital pieces of information that no other software can perceive. Thus, SAM can monitor and manage systems, databases, networks, applications, and end-user experiences. Although this technology is able to detect many errors and problems, it still lacks a good mechanism to diagnose the detected problems and uncover their root causes for end users to fix.
Four research groups, from the Universities of Alberta, Toronto, Victoria, and Waterloo, built different tools for root-cause analysis and detection under the auspices of the Consortium for Software Engineering Research. To integrate these solutions, the research groups worked together with CA Inc. to produce a web-based integration tool that incorporates the add-ons into the main SAM application. The resulting framework does not affect any of SAM's existing features, as the additions involve only a new web communication layer that acts from the core of the software to detect and present root causes. The detection tools only parse the log files for vital information, so the core functionality of the software remains unaffected.
My contributions to this research project are presented in this thesis. At the beginning of the thesis, I report background research on SAM and describe how it addresses the increasing complexity of IT environments. I then propose two software integration approaches for integrating root-cause diagnosis tools with SAM and briefly describe CA's latest software integration framework, Catalyst. Toward the end of the thesis, I compare our integration solution with Catalyst and discuss the advantages and disadvantages of these integration solutions. / Graduate
|