11. Supporting source code comprehension during software evolution and maintenance
Alhindawi, Nouh (13 June 2014)
This dissertation addresses the problems of program comprehension in support of the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts, and categorize changes, within very large bodies of source code and their versioned histories. More specifically, advanced Information Retrieval (IR) and Natural Language Processing (NLP) techniques are utilized and enhanced to support various software engineering tasks. This research is not aimed at directly improving IR or NLP approaches; rather, it is aimed at understanding how additional information can be leveraged to improve the final results. The work advances the field by investigating approaches to augment and re-document source code with different types of abstract behavioral information. The hypothesis is that enriching the source code corpus with meaningful descriptive information, and integrating this orthogonal information (semantic and structural) extracted from source code, will improve the results of IR methods for indexing and querying information. Moreover, adding this new information to a corpus is a form of supervision: a priori knowledge is often used to direct and supervise machine learning and IR approaches.

The main contributions of this dissertation involve improving on the results of previous work in feature location and source code querying. The dissertation demonstrates that the addition of statically derived information from source code (e.g., method stereotypes) can improve the results of IR methods applied to the problem of feature location. Further contributions include showing the effects of excluding certain textual information (comments and function calls) when indexing source code for feature/concept location. Moreover, the dissertation demonstrates an IR-based method of natural language topic extraction that assists developers in gaining an overview of past maintenance activities based on software repository commits.

The ultimate goal of this work is to reduce the cost, effort, and time of software maintenance by improving on the results of previous work in feature location and source code querying, and by supporting a new platform for enhancing program comprehension and facilitating software engineering research.
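The abstract does not include the underlying tooling; as a rough, hypothetical sketch of what IR-based feature location over a stereotype-enriched corpus can look like, the snippet below indexes each method's text (with an appended stereotype tag) using TF-IDF and ranks methods against a free-text query. The method names, texts, and stereotype labels are made up for illustration and are not taken from the dissertation.

```python
# Minimal sketch of IR-based feature location: methods are indexed as text
# documents, optionally enriched with stereotype labels, and ranked against
# a natural-language query by cosine similarity. Hypothetical example data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each "document" is the identifier/comment text of one method; the optional
# stereotype tag (e.g., "accessor", "mutator", "command") is appended so that
# it becomes part of the indexed vocabulary.
methods = {
    "Cart.addItem":   ("add item cart quantity update total", "mutator"),
    "Cart.getTotal":  ("return current cart total price", "accessor"),
    "Order.checkout": ("checkout order payment charge card", "command"),
}

docs = [f"{text} stereotype_{tag}" for text, tag in methods.values()]
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)

def locate(query, top_k=2):
    """Rank methods by similarity to a free-text feature description."""
    q = vectorizer.transform([query])
    scores = cosine_similarity(q, doc_matrix).ravel()
    return sorted(zip(methods, scores), key=lambda x: -x[1])[:top_k]

print(locate("where is the shopping cart total computed"))
```

In this sketch the stereotype tag simply becomes another indexed term, which is one plausible way structural information can bias retrieval toward the intended methods.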
12. In-network processing for mission-critical wireless networked sensing and control: A real-time, efficiency, and resiliency perspective
Xiang, Qiao (13 June 2014)
As wireless cyber-physical systems (WCPS) are increasingly deployed in mission-critical applications, it becomes imperative to consider application QoS requirements in in-network processing (INP). In this dissertation, we explore the potential of two INP methods, packet packing and network coding, for improving network performance while satisfying application QoS requirements. We find that not only can these two techniques increase the energy efficiency, reliability, and throughput of WCPS while satisfying application QoS requirements in a relatively static environment, but they can also provide low-cost proactive protection against transient node failures in a more dynamic wireless environment.

We first study the problem of jointly optimizing packet packing and the timeliness of data delivery. We identify the conditions under which the problem is strongly NP-hard, and we find that the problem complexity depends heavily on aggregation constraints rather than on network and traffic properties. For the cases in which the problem is NP-hard, we show that there is no polynomial-time approximation scheme (PTAS); for the cases in which the problem can be solved in polynomial time, we design polynomial-time offline algorithms for finding optimal packet packing schemes. We design a distributed, online protocol, tPack, that schedules packet transmissions to maximize the local utility of packet packing at each node. We evaluate the properties of tPack in the NetEye testbed. We find that jointly optimizing data delivery timeliness and packet packing, and considering real-world aggregation constraints, significantly improves network performance.

We then address the problem of minimizing the transmission cost of network coding (NC) based routing in sensor networks. We propose what is, to our knowledge, the first mathematical framework for computing the expected transmission cost of NC-based routing in terms of the expected number of transmissions. Based on this framework, we design a polynomial-time greedy algorithm for forwarder set selection and prove its optimality with respect to transmission cost minimization. We design EENCR, an energy-efficient NC-based routing protocol that implements our forwarder set selection algorithm to minimize the overall transmission cost. Through a comparative study of EENCR and other state-of-the-art routing protocols, we show that EENCR significantly outperforms CTP, MORE, and CodeOR in delivery reliability, delivery cost, and network goodput.

Furthermore, we study the 1+1 proactive protection problem using network coding. We show that even under a simplified setting, finding two node-disjoint routing braids with minimal total cost is NP-hard. We then design a heuristic algorithm to construct two node-disjoint braids with a transmission cost upper bounded by that of the two shortest node-disjoint paths. We also design ProNCP, a proactive NC-based protection protocol that follows a design philosophy similar to EENCR's. We evaluate the performance of ProNCP under various transient network failure scenarios. Experimental results show that ProNCP is resilient to these failure scenarios and provides stable performance in terms of reliability, delivery cost, and goodput.

Our findings explore the challenges, benefits, and solutions in designing real-time, efficient, resilient, and QoS-guaranteed wireless cyber-physical systems, and our solutions shed light on future research on related topics.
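The abstract names a greedy forwarder-set selection algorithm but does not reproduce it. As a hedged sketch of the general idea behind such selection in opportunistic, network-coding-style routing, the code below considers neighbors in order of increasing cost-to-sink and keeps adding them while the expected number of transmissions (an ETX/anypath-style estimate) decreases. The link probabilities, costs, and the cost model itself are illustrative assumptions, not the dissertation's EENCR.

```python
# Hedged sketch of greedy forwarder-set selection for opportunistic routing:
# neighbors are considered in order of increasing cost-to-sink and added to
# the forwarder set while doing so reduces the expected transmission cost.
# Numbers and the cost model are illustrative, not taken from the dissertation.

def expected_cost(p_links, costs):
    """Anypath/ETX-style estimate: one local transmission reaches *some*
    forwarder with probability 1 - prod(1 - p_i); the remaining cost is the
    priority-weighted cost-to-sink of whichever forwarder received it."""
    miss, weighted_rest = 1.0, 0.0
    for p, c in zip(p_links, costs):
        weighted_rest += miss * p * c   # this forwarder heard it first
        miss *= (1.0 - p)
    hit = 1.0 - miss
    return float("inf") if hit == 0.0 else (1.0 + weighted_rest) / hit

def select_forwarders(neighbors):
    """neighbors: list of (node, link_success_prob, cost_to_sink)."""
    ordered = sorted(neighbors, key=lambda n: n[2])   # closest to sink first
    chosen, best = [], float("inf")
    for node, p, c in ordered:
        trial = chosen + [(node, p, c)]
        cost = expected_cost([x[1] for x in trial], [x[2] for x in trial])
        if cost < best:
            chosen, best = trial, cost
        else:
            break   # adding farther neighbors no longer helps
    return [n for n, _, _ in chosen], best

fwd, cost = select_forwarders([("B", 0.8, 2.0), ("C", 0.5, 2.5), ("D", 0.9, 4.0)])
print(fwd, round(cost, 2))   # e.g., ['B', 'C'] 3.17
```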
13. An analysis of open source security software products downloads
Barta, Brian J. (16 April 2014)
Despite the continued demand for open source security software, a gap persists in identifying the factors associated with the success of open source security software. No studies accurately assess the extent of this gap, particularly with respect to the strength of the relationships between open source software development attributes and the number of security software downloads. The research conducted in this study investigates the strength of the relationships between particular open source software project development factors and a particular measure of open source security software success. The research focuses on open source software development with an emphasis on anti-virus, firewall, and intrusion detection software. Additionally, this study reviews key cyber-security events that have shaped the cyber landscape, as well as some security technologies that emerged as a result of those events. The level of correlation between the dependent variable, number of software downloads, and the independent variables, project team size and number of software project events, is analyzed in this research.
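As a minimal illustration of the kind of correlation analysis described, the sketch below computes Pearson correlations between download counts and the two predictors named in the abstract. The data values are made up for the example; the study's dataset is not reproduced here.

```python
# Minimal sketch of the correlation analysis described above, using
# hypothetical data rather than the study's dataset: Pearson correlation
# between download counts and two project-level predictors.
from scipy.stats import pearsonr

downloads      = [1200, 450, 3100, 800, 2600, 150]   # dependent variable
team_size      = [8, 3, 15, 5, 12, 2]                # independent variable 1
project_events = [40, 12, 95, 25, 70, 6]             # independent variable 2

for name, xs in [("team size", team_size), ("project events", project_events)]:
    r, p = pearsonr(xs, downloads)
    print(f"downloads vs {name}: r={r:.2f}, p={p:.3f}")
```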
14. Architecting a Cybersecurity Management Framework: Navigating and Traversing Complexity, Ambiguity, and Agility
Tisdale, Susan M. (09 June 2018)
Despite advancements in technology, countermeasures, and situational awareness, cybersecurity (CS) breaches continue to increase in number, complexity, and severity. This qualitative study is one of the few to comprehensively explore CS management. The study used a systems approach to identify business, socioeconomic, and information technology (IT) factors and their interrelationships. The study examined IT management frameworks, CS standards, and the literature; interviews and a focus group of subject matter experts followed. The research found that CS is a leadership issue, not a technical one. CS is an ecosystem; its components are interrelated and inseparable, requiring qualitative, subjective, risk and knowledge management interventions. CS, IT, and threats are too complex and volatile for organizations to manage all risks and vulnerabilities in a timely, agile manner. CS lexicons lack uniformity and consistency. An IT management framework is better suited for CS. Companies must segregate and encrypt the most sensitive information and curb their appetite for new, unsecured technology. CS and IT are multilayered, requiring subspecialists who often serve conflicting business needs and security objectives. Organizations need to minimize mid-level CS management, raise CS to a business-level function (not subordinate to IT), and involve cyber specialists at all levels of the business lifecycle. Cross-pollinating people from all business areas, especially finance, CS, and IT, increases awareness of one another's responsibilities and obligations and facilitates more rapid portfolio and lifecycle CS activities, from investments to detection and response. Future studies should focus on these issues as critical success factors. Finally, the study of CS requires agile, qualitative, multidisciplinary methodology to produce thick, quick, actionable information.
15. Accepting the Cloud: A Quantitative Predictive Analysis of Cloud Trust and Acceptance Among IT Security Professionals
Peake, Chris (20 December 2018)
Industry experts recognize the cloud and cloud-based services as advantageous from both operational and economic perspectives, yet individuals and organizations hesitate to accept the cloud because of concerns about security and privacy. The purpose of this study is to examine the factors that may influence cloud acceptance by IT professionals, focusing on the principal research question: To what extent do ease of use, usefulness, attitude, security apprehensions, compatibility, and trust predict IT security professionals' acceptance of cloud computing? The population for this study consisted of IT security professionals who either held industry security certifications or had been in a security position for at least two years. Sample inclusion criteria consisted of IT professionals with the qualifications described above, over the age of 18, and living in the United States. The study survey was administered using SurveyMonkey, which randomly selected and recruited potential participants who met the sample criteria from a participant database, resulting in ninety-seven total study participants. Among the six factors examined, perceived usefulness, attitude, security apprehensions, and trust were found to significantly predict cloud acceptance. The results indicate that cloud service providers should focus their attention on these factors in order to promote cloud acceptance.
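A quantitative predictive analysis of this kind is commonly run as a multiple regression of acceptance on the six predictors. The sketch below shows one such analysis on synthetic data (the study's survey responses are not public); the variable names mirror the research question, but the coefficients and data frame are assumptions for illustration only.

```python
# Hedged sketch of a predictive analysis like the one described: ordinary
# least squares with the six predictors named in the research question.
# The data frame here is synthetic; it is not the study's survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 97  # the study reports ninety-seven participants
df = pd.DataFrame({
    "ease_of_use": rng.normal(4, 1, n),
    "usefulness": rng.normal(4, 1, n),
    "attitude": rng.normal(3.5, 1, n),
    "security_apprehensions": rng.normal(3, 1, n),
    "compatibility": rng.normal(3.8, 1, n),
    "trust": rng.normal(3.2, 1, n),
})
# Synthetic outcome loosely driven by the predictors the study found significant.
df["acceptance"] = (0.4 * df.usefulness + 0.3 * df.attitude
                    - 0.2 * df.security_apprehensions + 0.35 * df.trust
                    + rng.normal(0, 0.5, n))

X = sm.add_constant(df.drop(columns="acceptance"))
model = sm.OLS(df["acceptance"], X).fit()
print(model.summary())
```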
16. Analysis and Detection of the Silent Thieves
Perez, Jon (13 September 2018)
As the cryptocurrency market becomes more lucrative and accessible, cybercriminals will continue to adapt their strategies to monetize the unauthorized use of system resources for mining operations. Some of these strategies involve infecting systems with malware that deploys a cryptomining application. Other attack strategies involve delivering code to a target's web browser that causes the browser to perform mining operations. This research examines existing cryptomining malware, commonalities in targeting and infection vectors, techniques used by cryptomining malware, and the distinguishable differences between legitimate and malicious use.

The research found that cybercriminals employing cryptomining malware attack targets indiscriminately. Additionally, the techniques employed by cryptomining malware are also used by other types of malware. The research tested the impact of cryptomining applications on CPU utilization and showed a clear distinction when comparing the CPU utilization of cryptomining applications to that of common applications on a desktop PC. The research also found that distinguishing between authorized and unauthorized cryptomining relies heavily on a holistic examination of the system in question.

The research synthesized the existing literature and the results of the CPU testing to recommend two strategies for detecting malicious cryptomining activity. The optimal strategy involves endpoint, network, and CPU monitoring together with the ability to aggregate and correlate the events or alerts produced. A less optimal strategy involves multiple event sources with manual or no correlation, or a single event source.
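To make the CPU-utilization signal concrete, the sketch below flags sustained near-saturation CPU use, the pattern the testing above distinguishes from ordinary desktop workloads. The threshold, sampling window, and use of psutil are illustrative assumptions, not values or tooling taken from the research, and such a check would be only one input to the holistic examination the research recommends.

```python
# Minimal sketch of a CPU-utilization heuristic in the spirit of the testing
# described above: flag sustained, near-saturation CPU usage. The threshold
# and window are illustrative assumptions, not values from the research.
import psutil

THRESHOLD = 85.0      # percent CPU considered suspicious for this sketch
WINDOW = 12           # consecutive samples (~1 minute at 5 s apart)

def sustained_high_cpu(samples=WINDOW, interval=5.0):
    """Return True if CPU stays above THRESHOLD for `samples` readings in a row."""
    high = 0
    for _ in range(samples):
        usage = psutil.cpu_percent(interval=interval)
        high = high + 1 if usage >= THRESHOLD else 0
        if high >= samples:
            return True
    return False

if __name__ == "__main__":
    if sustained_high_cpu():
        print("Sustained high CPU: investigate for possible cryptomining.")
```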
17. Probabilistic Clustering Ensemble Evaluation for Intrusion Detection
McElwee, Steven M. (18 August 2018)
Intrusion detection is the practice of examining information from computers and networks to identify cyberattacks. It is an important topic in practice, since the frequency and consequences of cyberattacks continue to increase and affect organizations. It is important for research, since many problems remain open for intrusion detection systems. Intrusion detection systems monitor large volumes of data and frequently generate false positives, resulting in additional effort for security analysts to review and interpret alerts. After long hours spent reviewing alerts, security analysts become fatigued and make bad decisions. There is currently no approach to intrusion detection that reduces the workload of human analysts by providing a probabilistic prediction that a computer is experiencing a cyberattack.

This research addressed this problem by estimating the probability that a computer system was being attacked, rather than alerting on individual events. It combined concepts from cyber situation awareness by applying clustering ensembles, probability analysis, and active learning. The unique contribution of this research is that it provides a higher level of meaning for intrusion alerts than traditional approaches.

Three experiments were conducted in the course of this research to demonstrate the feasibility of these concepts. The first experiment evaluated cluster generation approaches that provided multiple perspectives of network events using unsupervised machine learning. The second experiment developed and evaluated a method for detecting anomalies from the clustering results; this experiment also determined the probability that a computer system was being attacked. Finally, the third experiment integrated active learning into the anomaly detection results and evaluated its effectiveness in improving accuracy.

This research demonstrated that clustering ensembles with probabilistic analysis were effective for identifying normal events. Abnormal events remained uncertain and were assigned a belief. By aggregating the beliefs to find the probability that a computer system was under attack, the resulting probability was highly accurate for source IP addresses and reasonably accurate for destination IP addresses. Active learning, which simulated feedback from a human analyst, eliminated the residual error for the destination IP addresses with a low number of events requiring labeling.
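As a rough sketch of the clustering-ensemble idea, the code below runs several differently seeded k-means clusterings over synthetic event data, treats distance from every run's nearest centroid as a per-event belief of abnormality, and aggregates those beliefs per source IP into an attack probability. The data, number of runs, and belief calculation are assumptions for illustration, not the dissertation's method or dataset.

```python
# Hedged sketch of a clustering ensemble with per-source belief aggregation.
# Several k-means runs with different seeds score each event by distance to
# the nearest centroid; events far from all centroids look anomalous, and
# beliefs are averaged per source IP. All data here is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
normal = rng.normal(0, 1, size=(200, 4))           # bulk of benign events
odd = rng.normal(5, 1, size=(10, 4))               # a few anomalous events
events = np.vstack([normal, odd])
sources = ["10.0.0.1"] * 200 + ["10.0.0.99"] * 10  # hypothetical source IPs

# Ensemble: distance to nearest centroid, averaged over differently seeded runs.
scores = np.zeros(len(events))
for seed in range(5):
    km = KMeans(n_clusters=3, n_init=10, random_state=seed).fit(events)
    scores += km.transform(events).min(axis=1)
scores /= 5

belief = scores / scores.max()                     # crude per-event belief in [0, 1]

# Aggregate beliefs per source IP into an "under attack" probability.
for ip in sorted(set(sources)):
    b = belief[[i for i, s in enumerate(sources) if s == ip]]
    print(ip, "attack probability ~", round(float(b.mean()), 2))
```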
18. Protecting Digital Evidence during Natural Disasters: Why It Is Important
Dodrill, Charles A. (12 May 2018)
Safeguarding digital evidence, valuable corporate proprietary intellectual property, and the original objects on which they reside, such as cell phones, tablets, external drives, and laptops, becomes a more complex challenge when a natural disaster is imminent. Natural devastation disrupts the investigative and legal process, often destroying the evidentiary elements required to serve justice. Traditional methods such as backups to external drives, copies, and cloud storage options are inadequate for the evidence-gathering and chain-of-custody documentation required by the courts to prove original evidence. Courts point to the original data-containing object as proof of digital evidence validity and admissibility. Current research provides general guidelines for safeguarding digital evidence, but lacks specific detail on how to safeguard or evacuate it during a natural disaster. Recent natural disasters have completely destroyed law enforcement or court facilities, leaving them open to the elements and water damage. In some cases, digital evidence has been destroyed and cases dismissed post-disaster for lack of evidence. For these reasons, geographical relocation of digital evidence makes sense and is the best way to truly protect digital evidence and continue the analysis of data that will serve justice and put criminals away. Borrowing from the U.S. military, the mobile digital evidence room can be adopted by law enforcement and private digital forensic laboratories, as well as the commercial and business sectors, to ensure that digital evidence remains intact.
19. Security, Computation and Data Issues in Clouds
Li, Lifeng (07 September 2017)
The Cloud has recently become quite attractive due to its elasticity, availability, and scalability. However, the technologies that build up the Cloud, such as virtualization, are a double-edged sword because they expand the attack surface to the entire hardware-software stack. Moreover, homogeneous computing in the Cloud severely limits the computational power it could potentially provide. As a result, new and comprehensive solutions are strongly desired that capture all the benefits of the Cloud while suppressing its drawbacks. This thesis proposes three new solutions to address security, computation, and data issues in the Cloud. First, a GPU MapReduce framework aims specifically at improving performance and reducing energy consumption for data-parallel problems in the Cloud. Second, the P-CP-ABE scheme not only overcomes the difficulties of data security, access control, and key management in the Cloud, but also dramatically improves on the performance of the original CP-ABE. Finally, multi-tenancy on top of an insecure network requires a strong network authentication protocol suite to assure authenticity and non-repudiation in the Cloud.
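For readers unfamiliar with the MapReduce pattern the first solution builds on, the sketch below shows the bare map/reduce structure of a data-parallel problem (word counting), using Python multiprocessing as a stand-in for the parallel back end. The thesis's framework targets GPUs and is not reproduced here; this is only an illustration of the programming model.

```python
# Illustrative map/reduce skeleton for a data-parallel problem (word count),
# using multiprocessing as a stand-in for the GPU back end described above.
from collections import Counter
from multiprocessing import Pool

def map_phase(chunk):
    """Emit per-word counts for one chunk of input."""
    return Counter(chunk.split())

def reduce_phase(partials):
    """Merge the per-chunk counters into a single result."""
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    chunks = ["cloud security cloud", "mapreduce on the cloud", "gpu gpu gpu"]
    with Pool() as pool:
        partials = pool.map(map_phase, chunks)   # data-parallel map step
    print(reduce_phase(partials))                # sequential reduce step
```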
20. Constructing a Clinical Research Data Management System
Quintero, Michael C. (03 January 2018)
Clinical study data is usually collected without knowing in advance what kind of data will be collected. In addition, the set of all possible data points that can apply to a patient in any given clinical study is almost always a superset of the data points actually recorded for that patient. As a result, clinical data resembles sparse data with an evolving data schema. To help researchers at the Moffitt Cancer Center better manage clinical data, a tool called GURU was developed that uses the Entity-Attribute-Value (EAV) model to handle sparse data and allow users to manage a database entity's attributes without any changes to the database table definition. The EAV model's read performance improves as the data becomes sparser, but it was observed to perform many times worse than a wide table when the attribute count is not sufficiently large. Ultimately, the design trades read performance for flexibility in the data schema.
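A minimal sketch of the Entity-Attribute-Value layout follows, using a generic SQLite table (not GURU's actual schema): each row stores one (entity, attribute, value) triple, so a new attribute needs no ALTER TABLE, and a patient simply has no rows for attributes that were never recorded. The table and attribute names are hypothetical.

```python
# Minimal Entity-Attribute-Value sketch in SQLite (a generic illustration,
# not GURU's schema): each row stores one (entity, attribute, value) triple,
# so adding a new attribute requires no change to the table definition.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observation (
        entity_id INTEGER NOT NULL,   -- e.g., a patient or study visit
        attribute TEXT    NOT NULL,   -- e.g., 'tumor_stage', 'hemoglobin'
        value     TEXT,               -- stored as text for simplicity
        PRIMARY KEY (entity_id, attribute)
    )
""")

rows = [
    (1, "tumor_stage", "II"),
    (1, "hemoglobin", "13.2"),
    (2, "tumor_stage", "III"),        # patient 2 simply lacks other attributes
]
conn.executemany("INSERT INTO observation VALUES (?, ?, ?)", rows)

# Pivot back to a (sparse) attribute/value view for one patient at query time.
for attribute, value in conn.execute(
        "SELECT attribute, value FROM observation WHERE entity_id = ?", (1,)):
    print(attribute, "=", value)
```

The query-time pivot is where the trade-off described above appears: reads touch many narrow rows instead of one wide row, which is why read performance suffers unless the data is very sparse.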