1 |
Improving Enterprise Data Governance Through Ontology and Linked Data
DeStefano, R.J., 16 April 2016
<p> In the past decade, the role of data has increased exponentially, from being the output of a process to becoming a true corporate asset. As the business landscape becomes increasingly complex and the pace of change increasingly fast, companies need a clear awareness of their data assets, their movement, and how they relate to the organization in order to make informed decisions, reduce cost, and identify opportunity. The increased complexity of corporate technology has also created a high level of risk, as data moving across a multitude of systems is more likely to impact dependent processes and systems should something go wrong or be changed. The result of this increased difficulty in managing corporate data assets is poor enterprise data quality, the impacts of which range into the billions of dollars in waste and lost opportunity to businesses. </p><p> Tools and processes exist to help companies manage this phenomenon; however, data projects are often subject to a high degree of scrutiny as senior leadership struggles to identify return on investment. While there are many tools and methods to increase a company's ability to govern data, this research proceeds from the premise that you cannot govern what you do not know. This lack of awareness of the corporate data landscape impairs the ability to govern data, which in turn lowers overall data quality within organizations. </p><p> This research proposes a means for companies to better model the landscape of their data, processes, and organizational attributes through the use of linked data, via the Resource Description Framework (RDF), and ontology. The outcome of adopting such techniques is an increased level of data awareness within the organization, resulting in an improved ability to govern corporate data assets. It does this primarily by addressing corporate leadership's low tolerance for taking on large-scale, data-centric projects. The nature of linked data, with its incremental and decentralized approach to storing information, combined with a rich ecosystem of open source or low-cost tools, reduces the financial barriers to entry for these initiatives. Additionally, linked data's distributed nature and flexible structure help foster maximum participation throughout the enterprise in capturing information about data assets. This increased participation raises the quality of the information captured by empowering more of the individuals who handle the data to contribute. </p><p> Ontology, in conjunction with linked data, provides an incredibly powerful means to model the complex relationships between an organization, its people, processes, and technology assets. Combined with the graph-based nature of RDF, the model lends itself to presenting concepts such as data lineage, allowing an organization to see the true reach of its data. This research further proposes an ontology based on data governance standards, visualization examples and queries against data that simulate common data governance situations, and guidelines to assist in its implementation in an enterprise setting. </p><p> Adopting such techniques will allow an enterprise to accurately reflect the data assets, stewardship information, and integration points that are so necessary to institute effective data governance.</p>
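The data-lineage idea in this abstract can be illustrated with a minimal sketch: RDF-style triples stored as plain tuples, and a breadth-first walk that finds every downstream asset a dataset feeds. The predicate and system names here are invented for illustration; they are not taken from the dissertation's ontology.

```python
# Hypothetical sketch: modeling data lineage as RDF-style triples and
# traversing them to find every downstream system an asset feeds.
from collections import deque

# (subject, predicate, object) triples; all names are illustrative.
triples = [
    ("crm:Customers",  "gov:feedsInto", "dw:CustomerDim"),
    ("dw:CustomerDim", "gov:feedsInto", "rpt:ChurnReport"),
    ("dw:CustomerDim", "gov:feedsInto", "ml:ChurnModel"),
    ("erp:Orders",     "gov:feedsInto", "dw:OrderFact"),
]

def downstream(asset):
    """Breadth-first walk over gov:feedsInto edges: the 'true reach' of an asset."""
    seen, queue = set(), deque([asset])
    while queue:
        node = queue.popleft()
        for s, p, o in triples:
            if s == node and p == "gov:feedsInto" and o not in seen:
                seen.add(o)
                queue.append(o)
    return seen

print(sorted(downstream("crm:Customers")))
# → ['dw:CustomerDim', 'ml:ChurnModel', 'rpt:ChurnReport']
```

Because the triples are just edges in a graph, the same query answers impact analysis ("what breaks if this feed changes?") without any central schema, which is the incremental, decentralized property the abstract emphasizes.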
|
2 |
Gamification in Introductory Computer Science
Behnke, Kara Alexandra, 31 December 2015
<p> This thesis investigates the impact of gamification on student motivation and learning in several introductory computer science educational activities. The use of game design techniques in education offers the potential to make learning more motivating and more enjoyable for students. However, the design, implementation, and evaluation of game elements that actually realize this promise remain a largely unmet challenge. This research examines whether the introduction of game elements into the curriculum positively impacts student motivation and intended learning outcomes for entry-level computer science education, across four studies that apply similar game design techniques in different introductory computer science settings. The results of these studies are evaluated using mixed methods to compare the effects of game elements on student motivation and learning in both formal and non-formal learning environments.</p>
|
3 |
Cyber terrorism threats
Gobran, Ashraf, 27 May 2015
<p> The purpose of this study is to explore the potential threats that are posed uniquely by cyber terrorism. While traditional terrorism has affected governmental policy and inflicted physical damage to people and infrastructure across the world, computers and the Internet allow for attacks as well. As terrorist groups begin to adapt to, and take advantage of, cyber tools and capabilities, the threat they pose will grow accordingly. While a terrorist is not able to directly kill people with cyber tools, the mayhem or social disruption that such attacks can cause fits well with these organizations' objectives. The anonymity of cyberspace permits terrorist groups to plan and execute attacks without being identified immediately, if ever. In addition, the targets of cyber terrorists are often underprepared and fairly vulnerable to various forms of cyber attack. While these organizations may be aware of the risk posed by failing to adequately address cyber security deficiencies, their solutions are likely not sufficient to truly prevent cyber terrorism attacks. In order to encourage technological advancement and efficient cyber security, and to spread awareness of the subject, this study highlights existing threats and provides an overview of what can be done to mitigate them. </p><p> Keywords: Intelligence, Cyber security, Professor Albert Orbanati</p>
|
4 |
Understanding and Rejecting Errant Touches on Multi-touch Tablets
Shu, Ke, 28 December 2013
<p> Given the pervasiveness of multi-touch tablets, pen-based applications have rapidly moved onto this new platform. Users draw both with bare fingers and with capacitive pens, as they would on paper in the past. Unlike paper, these tablets cannot distinguish legitimate finger or pen input from accidental touches by other parts of the user's hand. In this thesis, we refer to this as the errant touch rejection problem, since users may unintentionally touch the screen with other parts of their hand. </p><p> In this thesis, I design, implement and evaluate a new approach, bezel-focus rejection, for preventing errant touches on multi-touch tablets. I began the research by conducting a formal study to collect and characterize errant touches. I analyzed the data collected from the study, and the results guided the design of the rejection techniques. I concluded the research by developing bezel-focus rejection and evaluating its performance. The results show that bezel-focus rejection yields a high rejection rate for errant touches and makes users more inclined to rest their hands on the tablet than comparison techniques do. </p><p> This research makes two major contributions to the Human Computer Interaction (HCI) community. First, my proposed errant touch rejection approach can be applied to other pen-based note-taking applications. Second, my experimental results can serve as a guide to others developing similar techniques.</p>
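The abstract does not describe how bezel-focus rejection works internally, but the general idea of focus-region rejection can be sketched: input is accepted only inside a band the user positions from the bezel, and everything outside it (such as a resting palm) is rejected. The geometry, band size, and interaction below are illustrative assumptions, not the thesis's actual design.

```python
# Illustrative sketch of a focus-band rejection rule (NOT the thesis's
# actual bezel-focus design): the user positions a horizontal "focus"
# band by touching the bezel; only touches inside the band are accepted.

def make_bezel_focus(screen_height, band_height=200):
    focus_top = 0  # updated when the user slides a finger along the bezel

    def set_focus(bezel_y):
        # Clamp so the band always fits on screen.
        nonlocal focus_top
        focus_top = max(0, min(bezel_y, screen_height - band_height))

    def accept(touch_x, touch_y):
        # Accept pen/finger input only inside the focus band;
        # a palm resting elsewhere is treated as errant and rejected.
        return focus_top <= touch_y < focus_top + band_height

    return set_focus, accept

set_focus, accept = make_bezel_focus(screen_height=800)
set_focus(300)
print(accept(100, 350))  # touch inside the band → True (accepted)
print(accept(100, 700))  # resting palm below the band → False (rejected)
```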
|
5 |
A quantitative experimental study of the effectiveness of systems to identify network attackers
Handorf, C. Russell, 14 February 2017
<p> This study analyzed the metadata collected from a honeypot that was run by the Federal Bureau of Investigation for a period of five years. The analysis compared the use of existing industry methods and tools, such as Intrusion Detection System alerts, network traffic flow and system log traffic, within the Open Source Security Information Manager (OSSIM), against techniques used to prioritize the detailed analysis of the data and thereby aid in the faster identification of attackers. It was found that adding the results of computing a Hilbert curve, popularity analysis, cadence analysis and modus operandi analysis did not introduce significant or detrimental latency in the identification of attacker traffic. Furthermore, when coupled with the traditional tools within OSSIM, these techniques greatly enhanced the identification of attacker traffic. Future research should consider additional statistical models that can guide the strategic use of more intensive analysis conducted by deep packet inspection software, as well as broader intelligence models built from reviewing attacks against multiple organizations. Additional improvements in detection strategies are possible when these mechanisms can review a full data collection.</p>
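The Hilbert curve mentioned above is a space-filling curve commonly used in security visualization: mapping a one-dimensional index (such as an IP address) to two-dimensional coordinates so that numerically close addresses land near each other on a plot, making scanning ranges show up as contiguous blobs. The following is the standard distance-to-coordinates conversion, not the dissertation's actual code; the /24 example is illustrative.

```python
# Standard Hilbert-curve d2xy conversion (not the dissertation's code):
# turns distance d along the curve into (x, y) on an n-by-n grid,
# where n is a power of two.

def d2xy(n, d):
    """Convert distance d along a Hilbert curve to (x, y) on an n-by-n grid."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:                 # reflect the quadrant
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x                 # rotate the quadrant
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Map a /24's worth of host addresses onto a 16x16 grid: consecutive
# addresses stay adjacent, so attacker scans form contiguous regions.
points = [d2xy(16, host) for host in range(256)]
print(points[:4])  # → [(0, 0), (1, 0), (1, 1), (0, 1)]
```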
|
6 |
Detection of communication over DNSSEC covert channels
Hands, Nicole M., 01 November 2016
<p> Unauthorized removal and modification of data from information systems represent a major and formidable threat in modern computing. Security researchers are engaged in a constant and escalating battle with the writers of malware and other methods of network intrusion to detect and mitigate this threat. Advanced malware behaviors include encryption of communications between the server and infected client machines, as well as various strategies for resilience and obfuscation of infrastructure. These techniques evolve to use any and all available mechanisms. As the Internet has grown, DNS has been expanded and given security updates. This study analyzed the potential uses of DNSSEC as a covert channel by malware writers and operators. The study found that changing information in the Start of Authority (SOA) record and re-signing the zone can create a covert channel. The study provided a proof of concept for this previously undocumented covert channel that uses DNSSEC. </p>
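The abstract does not detail the channel's encoding, but the general shape of an SOA-based covert channel can be sketched: a malicious authoritative server leaks bytes by encoding them in successive SOA serial increments (each change followed by a zone re-signing), and a listener recovers them by polling the serial. Everything below is an illustrative toy, not the study's proof of concept.

```python
# Illustrative toy (not the study's proof of concept): leaking bytes
# through successive SOA serial increments. Each leak_byte() stands in
# for "bump the serial and re-sign the zone"; observe() is the listener
# decoding bytes from consecutive serial deltas.

class CovertZone:
    def __init__(self, serial=2016110100):
        self.serial = serial

    def leak_byte(self, b):
        # Encode one byte as a serial increment in 1..256 (monotonic,
        # so the zone still looks like it is being routinely updated).
        self.serial += b + 1
        return self.serial  # what a SOA query would now return

def observe(serials):
    """Listener side: decode bytes from consecutive serial deltas."""
    return bytes((b - a) - 1 for a, b in zip(serials, serials[1:]))

zone = CovertZone()
seen = [zone.serial]
for byte in b"hi":
    seen.append(zone.leak_byte(byte))

print(observe(seen))  # → b'hi'
```

Detection work like this study's would look for exactly this kind of anomaly: a zone whose serial advances (and is re-signed) far more often than legitimate edits would explain.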
|
7 |
The impact of consumer security awareness on adopting the Internet of Things: A correlational study
Harper, Allen A., 28 December 2016
<p> The topic of this study is the impact of consumer security awareness on the adoption of the Internet of Things. The Internet of Things (IoT) is the emerging network of Internet-connected smart devices. Several authors have predicted that adoption of the IoT will be hindered if security issues are not addressed. Others have noted that users often trade security and privacy for convenience. To better understand these two points of view, the main research question of this study is: to what extent does consumer security awareness impact adoption of the Internet of Things? To address the competing factors impacting adoption, the unified theory of acceptance and use of technology (UTAUT) was used as the base model of this study and was extended to account for the construct of security awareness. A quantitative, non-experimental, correlational study was designed to measure the impact. The population of this study is U.S. adult consumers of Internet-connected smart devices. The sampling frame was selected from the SurveyMonkey™ voluntary audience panel. Multiple regression was used as the statistical analysis to perform hypothesis testing and answer the research questions. The findings showed that although security awareness has a statistically significant impact on adoption of the IoT, it is not the dominant factor. Other factors, such as performance expectancy and effort expectancy, proved to be better indicators of adoption of the IoT at this time. Several recommendations are given to improve future studies in this area. The results of this study provide business managers, IoT device manufacturers and service providers with valuable information on the relation between awareness of security risks and adoption of the IoT.</p>
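The multiple regression used in the study can be illustrated with ordinary least squares solved via the normal equations. The predictor names echo UTAUT constructs, but the numbers below are fabricated toy scores chosen to yield an exact fit; they are not the survey data.

```python
# Fabricated toy illustration of multiple regression (OLS via the
# normal equations X'X b = X'y). Predictors echo UTAUT constructs;
# the data are made up, not the study's survey results.

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(rows, y):
    """Fit y = b0 + b1*x1 + ... by solving the normal equations."""
    X = [[1.0] + list(r) for r in rows]   # prepend intercept column
    cols = list(zip(*X))
    XtX = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    Xty = [sum(a * b for a, b in zip(ci, y)) for ci in cols]
    return solve(XtX, Xty)

# Toy scores: (performance expectancy, security awareness) per respondent,
# with intention constructed as exactly 1 + 2*pe + 0.5*sa.
predictors = [(1, 1), (2, 0), (0, 2), (3, 1)]
intention = [3.5, 5.0, 2.0, 7.5]

beta = ols(predictors, intention)
print([round(b, 6) for b in beta])  # → [1.0, 2.0, 0.5]
```

Comparing the magnitudes of such standardized coefficients is how a study like this one would conclude that performance expectancy outweighs security awareness as a predictor of adoption.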
|
8 |
End user software product line support for smart spaces
Tzeremes, Vasilios, 29 March 2017
<p> Smart spaces are physical environments equipped with pervasive technology that sense and react to human activities and changes in the environment. End User Development (EUD) skills vary significantly among end users who want to design, develop and deploy software applications for their smart spaces. Typical end user development is opportunistic, requirements are usually unplanned and undocumented, applications are simplistic in nature, design is ad-hoc, reuse is limited, and software testing is typically haphazard, leading to many quality issues. On the other hand, technical end users with advanced EUD skills and domain expertise have the ability to create sophisticated software applications for smart spaces that are well designed and tested.</p><p> This research presents a systematic approach for adopting reuse in end user development for smart spaces by using Software Product Line (SPL) concepts. End User (EU) SPL Designers (who are technical end users and domain experts) design and develop EU SPLs for smart spaces whereas less technical end users derive their individual smart space applications from these SPLs. Incorporating SPL concepts in EUD for smart spaces makes it easier for novice end users to derive applications for their spaces without having to interface directly with devices, networks, programming logic, etc. End users only have to select and configure the EU SPL features needed for their space. Another benefit of this approach is that it promotes reuse. End user requirements are mapped to product line features that are realized by common, optional, and variant components available in smart spaces. Product line features and the corresponding component product line architecture can then be used to derive EU applications. Derived EU applications can then be deployed to different smart spaces, thereby avoiding end users having to create EU applications from scratch. 
Finally, the proposed approach has the potential to improve software quality, since testing will be an integral part of the EU SPL process.</p><p> In particular, this research has: (a) defined a systematic approach for EU SPL Designers to design and develop EU SPLs, (b) provided an EU SPL application derivation approach to enable end users to derive software applications for their spaces, (c) designed an EU SPL meta-model to capture the underlying representation of EU SPL and derived application artifacts in terms of meta-classes and relationships that supports different EUD platforms, (d) designed and implemented an EUD development environment that supports EU SPL development and application derivation, and (e) provided a testing approach and framework for systematic testing of EU SPLs and derived applications.</p>
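The derivation step described above (common, optional, and variant features; end users select and configure rather than program) can be sketched with a toy feature model: deriving an application means picking features, and the checker enforces that common features are always included and exactly one variant is chosen per variation point. The feature names are invented for illustration, not taken from the dissertation.

```python
# Toy sketch of SPL-style application derivation (illustrative feature
# names, not the dissertation's model): common features are mandatory,
# optional features are free picks, and each variation point requires
# exactly one variant.

FEATURE_MODEL = {
    "common":   {"Lighting", "DoorLock"},
    "optional": {"EnergyReport", "VoiceControl"},
    "variants": {"Thermostat": {"ScheduleBased", "OccupancyBased"}},
}

def derive(selected):
    """Derive a smart-space application from a feature selection."""
    chosen = set(selected) | FEATURE_MODEL["common"]  # common is mandatory
    for point, options in FEATURE_MODEL["variants"].items():
        if len(chosen & options) != 1:
            raise ValueError(f"choose exactly one variant for {point}")
    known = (FEATURE_MODEL["common"] | FEATURE_MODEL["optional"]
             | set().union(*FEATURE_MODEL["variants"].values()))
    if chosen - known:
        raise ValueError(f"unknown features: {chosen - known}")
    return chosen

app = derive({"EnergyReport", "OccupancyBased"})
print(sorted(app))
# → ['DoorLock', 'EnergyReport', 'Lighting', 'OccupancyBased']
```

The point of the check is the abstract's quality argument: invalid configurations are caught at derivation time, before anything is deployed to the space.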
|
9 |
Global scale identity management
Tambasco, Michael J., 11 November 2015
<p> Global scale identity management aims to identify and authenticate entities such as people, hardware devices, distributed sensors and actuators, and software applications whenever they access critical information technology (IT) systems, from anywhere. The term global-scale is intended to emphasize the pervasive nature of identities, and implies the existence of identities in federated systems that may be beyond the control of any single organization. The purpose of this research was to analyze the current state of global scale identity management. Today, news of security breaches is far too commonplace. The results reveal that global scale identity management would have a positive effect on individual people, businesses, government agencies, and institutions. However, for global scale identity management to become operational, much work remains. The remaining work is split between the physical realm (biometric equipment, quantum-resistant cryptography) and the abstract realm (legal considerations, social and cultural mores, privacy issues, and international considerations). The research concluded that humans are repeatedly the weak link in password security, which ultimately undermines a system's stability. For the short term, the best suggestion is to use password managers and have systems disallow poor password choices. For the long term, the suggestion is to build infrastructures with quantum-resistant cryptography that interface with the ubiquitous smartphone to provide multifactor authentication. </p>
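The short-term suggestion above ("have systems disallow poor password choices") amounts to a server-side policy check. A minimal sketch, with a tiny illustrative banned list standing in for a real breach corpus:

```python
# Minimal sketch of a server-side "disallow poor passwords" policy:
# reject short passwords and entries from a known-bad list. The banned
# set here is a tiny illustrative stand-in for a real breach corpus.

BANNED = {"password", "123456", "qwerty", "letmein"}

def acceptable(candidate):
    """Policy check: long enough and not a known-bad password."""
    return len(candidate) >= 12 and candidate.lower() not in BANNED

print(acceptable("correct horse battery"))  # → True  (long, not banned)
print(acceptable("password"))               # → False (on the banned list)
print(acceptable("short1"))                 # → False (too short)
```

A password manager pairs naturally with such a policy: it generates long random strings that pass the check without requiring the human, the weak link identified above, to memorize them.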
|
10 |
A system to support clerical review, correction and confirmation assertions in entity identity information management
Chen, Cheng, 12 August 2015
<p> Clerical review of Entity Resolution (ER) is crucial for maintaining the entity identity integrity of an Entity Identity Information Management (EIIM) system. However, the clerical review process presents several problems. These include Entity Identity Structures (EIS) that are difficult to read and interpret, the excessive time and effort needed to review a large Identity Knowledgebase (IKB), and the duplication of effort in repeatedly reviewing the same EIS within one EIIM review cycle or across multiple review cycles. Although the original EIIM model envisioned and demonstrated the value of correction assertions, these are applied to correct errors after they have been found; the original EIIM design did not focus on the features needed to support the clerical review process that finds these errors. </p><p> The research presented here extends and enhances the original EIIM model in two very significant ways. The first is a design for a pair of confirmation assertions that complement the original set of correction assertions. The confirmation assertions confirm correct linking decisions so that they can be excluded from further clerical review. The second is a design and demonstration of a comprehensive visualization system that supports clerical review and both correction and confirmation assertion configurations in EIIM. This dissertation also describes how the confirmation assertions and the new visualization system have been successfully integrated into the OYSTER open source EIIM framework.</p>
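The role of confirmation assertions can be illustrated with a small sketch: a reviewer confirms that an entity identity structure is correct, and confirmed structures are then filtered out of later review queues, eliminating the duplicated effort the abstract describes. Identifiers and data layout are illustrative, not from the OYSTER framework.

```python
# Illustrative sketch (not OYSTER code): confirmation assertions mark
# an Entity Identity Structure as reviewed-and-correct, so it drops
# out of subsequent clerical review cycles.

identities = {
    "E1": ["rec01", "rec07"],   # records resolved to one entity
    "E2": ["rec03"],
    "E3": ["rec04", "rec09"],
}
confirmed = set()

def confirm(entity_id):
    """Clerical reviewer asserts this EIS is correctly linked."""
    confirmed.add(entity_id)

def review_queue():
    # Only unconfirmed structures come back in the next review cycle.
    return [e for e in identities if e not in confirmed]

confirm("E1")
print(review_queue())  # → ['E2', 'E3']  (E1 no longer appears)
```

A correction assertion would be the complementary operation: splitting or merging clusters in `identities` when the reviewer finds a linking error, after which the repaired structure could itself be confirmed.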
|