231

Open source UAV platform development for aerial photography

Richards, Daniel L. 20 May 2015
Aerial photography is an important layer in Geographic Information Systems (GIS), and generally provides the base layer from which many other digital map layers are derived. Capturing these photos from a traditional full-sized airplane is a complex and expensive process. The recent development of Unmanned Aerial Vehicles (UAVs) and associated technology provides an alternative to the traditional aerial mapping process. UAVs produced by popular commercial vendors are effective at capturing photos, but are highly expensive to acquire and equally expensive to maintain.

This research project demonstrates the development and successful implementation of a relatively inexpensive ($2000) unmanned aerial vehicle capable of acquiring high-resolution digital aerial photography. The UAV was developed using open source technology and commercially available components. The methods outlined encompass the platform selection, component inventory, design, construction, configuration, implementation, and testing of the UAV, as well as an analysis of the photography produced by the process. This approach can be used by others to implement similar UAV projects.
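As a point of reference for the resolution claims above, the ground sample distance (GSD) relates camera geometry and flight altitude to the ground footprint of one pixel. The sketch below uses the standard photogrammetric approximation with hypothetical camera parameters; neither the formula's inputs nor the example values are taken from the thesis.

```python
def ground_sample_distance(sensor_width_mm, image_width_px,
                           focal_length_mm, altitude_m):
    """Approximate ground sample distance (cm per pixel) for nadir photos:
    GSD = (sensor width * altitude) / (focal length * image width)."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Hypothetical small-format camera: 6.17 mm sensor width, 4000 px wide images,
# 4.5 mm lens, flown at 100 m -> roughly 3.4 cm per pixel.
print(round(ground_sample_distance(6.17, 4000, 4.5, 100.0), 2))
```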
232

Mapping Webs of Information, Conversation, and Social Connections| Evaluating the Mechanics of Collaborative Adaptive Management in the Sierra Nevada Forests

Lei, Shufei 28 March 2015
Managing within social-ecological systems at the landscape scale, such as in the national forests of the Sierra Nevada of California, is challenging for natural resource managers (e.g., the U.S. Forest Service) due to the uncertainties in natural processes and the complexities of social dynamics. Collaborative adaptive management (CAM) has recently been adopted as a viable strategy to reduce uncertainties in natural processes through iterative policy experimentation and adaptation, as well as to overcome conflicting values and goals among diverse environmental stakeholders by fostering and facilitating collaboration. While many CAM studies have focused on evaluating the management impact on natural systems and processes, few have examined the social engagements and dynamics of management itself. To address this knowledge gap, I examined the various social engagements in CAM, particularly the flow of information products, dialogues in public meetings, and social connections among participants, based on my research case study, the Sierra Nevada Adaptive Management Project (SNAMP).

SNAMP began in 2005 in response to the USDA Forest Service's 2004 Sierra Nevada Forest Plan Amendment, which calls for managing the forest using the best information available to protect forests and homes. The participants in the project can be sorted into three primary categories of environmental stakeholders: federal and state environmental agencies, the public and environmental advocacy groups, and university scientists. The project studies the impact of forest fuel reduction treatment on forest health, fire mitigation and prevention, wildlife, and water quality and quantity at two study sites: Last Chance in the northern region of the Sierra forests and Sugar Pine in the southern region. The primary strategies and methods for fostering partnership and facilitating collaboration among the diverse participants are producing science information and making it transparent and publicly accessible, as well as facilitating discussions about such research and management results in public meetings.

To evaluate the effectiveness of CAM in the case of SNAMP, I used a mixed-methods research approach (i.e., citation analysis, web analytics, content analysis, self-organizing maps, and social network analysis), leveraging available information technologies and tools to characterize and analyze the flow of digital information products, the outcomes of facilitated discussions in SNAMP public meetings, and the resilience of the social networks in SNAMP. Some of the interesting findings include: 1) scientific knowledge products, in the form of peer-reviewed journal publications, contributed to knowledge transfer between scientists and environmental managers; 2) facilitated discussions helped environmental stakeholders stay engaged on the important administrative and research topics through time; and 3) the social networks experienced turbulence but remained resilient due to the existence of a committed and consistent core group of environmental stakeholders representing diverse backgrounds and interests. As the picture of how information, conversation, and social connections contributed to the success of CAM emerged, my dissertation provides recommendations to natural resource managers on how to improve in these areas for future implementations of CAM.
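The network-resilience finding above lends itself to a small illustration. The sketch below, which is not drawn from the dissertation's data, builds a hypothetical participant co-attendance graph with networkx, uses the 2-core as a stand-in for the committed core group, and checks whether the network stays connected when a peripheral participant leaves.

```python
import networkx as nx

# Hypothetical co-attendance network: an edge links two participants who
# attended the same public meeting (all names are placeholders).
G = nx.Graph()
G.add_edges_from([
    ("agency_1", "scientist_1"), ("agency_1", "public_1"),
    ("scientist_1", "public_1"), ("scientist_1", "scientist_2"),
    ("scientist_2", "agency_1"), ("public_2", "agency_1"),
    ("public_3", "scientist_2"),
])

# Crude proxy for the "committed core": the 2-core, i.e. participants who
# still have at least two ties once peripheral members are pruned away.
core = nx.k_core(G, k=2)
print("core members:", sorted(core.nodes))

# One simple resilience check: does the network remain connected when a
# non-core participant drops out?
G_minus = G.copy()
G_minus.remove_node("public_3")
print("still connected:", nx.is_connected(G_minus))
```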
233

Executive security awareness primer

Toussaint, Gregory W. 22 April 2015
The purpose of this paper was to create a primer for a security awareness program to educate senior-level executives on the key aspects of cyber security. The project addresses a discovered gap: the lack of executive-focused security awareness programs and the lack of executives who fully abide by their company's security policies. This, coupled with research showing that executives are highly targeted by attackers, was the impetus behind the project. It was determined that the content of an executive security awareness program should be similar to that of a security awareness program for all other employees, with the differences being in the delivery and time frame of each segment. Accordingly, literature was reviewed on the various topics of security awareness. Research revealed the importance of capturing an executive's attention in order to keep their interest in the program. It was recommended that individuals charged with creating an executive security awareness program begin by holding one-on-one meetings with the executives in their company. These meetings help assess the time constraints of the company's executives as well as their current knowledge of the various security awareness topics, which in turn helps tailor the program specifically to them. This primer may be used by any company or organization in the beginning stages of creating its own security awareness program for executives. Keywords: Cybersecurity, Professor Albert Orbinati, Executive Security Awareness, Internet Safety.
234

Windows hibernation and memory forensics

Ayers, Amy L. 30 April 2015
The purpose of this capstone project was to research the hibernation file, its role in memory forensics, and current technology, techniques, and concepts for its analysis. The study includes an in-depth look at the Windows hibernation feature, the file format, potential evidence saved to the file, and its impact on digital forensic investigations. This research was performed to demonstrate the importance of the hibernation file and to generate awareness of this forensic artifact. The research questions presented were designed to identify the properties of Windows hibernation and its significance in digital forensics, and to identify the important concepts analysts should understand in selecting forensic software and in hibernation analysis. Through the literature review process, the hibernation file was identified as an essential part of digital forensics that provides analysts with snapshots of system memory from various points in the past. This data includes web, email, and chat sessions in addition to running processes, login credentials, encryption keys, program data, and much more. Beyond forensics, the hibernation file is useful in the fields of data recovery and incident response. A review of current hibernation file publications revealed incomplete and conflicting works, culminating in the acknowledgment that more research is needed in order to close these gaps. More awareness of hibernation forensics, through its inclusion in future published works and in computer forensic educational courses, is recommended. These inclusions will help arm practitioners with the ability to accurately utilize the hibernation file in order to obtain the highest quality forensic evidence. Keywords: Cybersecurity, hiberfil.sys, hybrid sleep, malware, slack space, Albert Orbinati.
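A quick triage step that often accompanies the kind of analysis described above is checking the hiberfil.sys header signature before committing to a full conversion. The sketch below assumes the commonly reported four-byte signatures ("hibr"/"HIBR" for a saved memory image, "wake"/"WAKE" or zeroes after a clean resume); treat those values, and the file path, as assumptions to verify against your own references rather than facts from this capstone.

```python
def hibernation_signature(path):
    """Peek at the first four bytes of a hiberfil.sys copy and report what the
    commonly documented header signatures would suggest (assumed values,
    not taken from this capstone)."""
    with open(path, "rb") as f:
        sig = f.read(4)
    if sig.lower() == b"hibr":
        return "saved memory image likely present"
    if sig.lower() == b"wake":
        return "system resumed; header cleared, remaining data may be stale"
    return f"unrecognized signature: {sig!r}"

# Usage against a forensic copy (hypothetical path):
# print(hibernation_signature(r"E:\evidence\hiberfil.sys"))
```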
235

A quantitative investigation of the Technology Obsolescence Model (TOM) factors that influence the decision to replace obsolete systems

Marchek, Scott P. 01 July 2015
The Technology Obsolescence Model (TOM) provides a framework of key factors involved in assessing influences on the decision to replace obsolete Information Technology (IT) systems. TOM focuses on what is important and significant to the replacement decision. Formulated from well-established models of decision making and technology acceptance, TOM presents a structured interface of influence factors crossing technical, business, organizational, and interpersonal effects, matched with demographic influence assessment. Survey responses to questions exploring TOM are analyzed for insight into decision motivations and their influence on, and significance to, the replacement decision. Primary questions employ both a 7-point Likert importance scale and ordered ranking for prioritization assessment. The survey augments quantitative material with qualitative rationale for prioritized responses. The reviewed survey responses focus on the IT department of a large, multinational conglomerate. Primary assessment tools include ANOVA, regression, factor, and correlation analysis; validity and reliability are examined in detail. Assessment of the responses indicates a business-centric focus among decision makers, in which systems obsolescence may influence, but is not a primary causal factor in, a replacement decision. While the business and technical benefits of replacement systems are perceived by respondents as most important, statistical analysis identifies obsolescence as one of the few potentially significant influencing factors. Demographic effects also demonstrated influence. Findings and recommendations for instrument improvements, as well as continued research opportunities in additional venues, demographic modification, and longitudinal studies, are identified.
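For readers unfamiliar with the tooling behind such an analysis, the sketch below shows what an ANOVA and a correlation test over 7-point Likert responses look like in Python with scipy. The ratings, group labels, and factor names are hypothetical placeholders, not data from this study.

```python
from scipy import stats

# Hypothetical 7-point Likert importance ratings from two respondent groups.
group_a = [6, 7, 5, 6, 7, 6, 5, 7]
group_b = [4, 5, 5, 4, 6, 5, 4, 5]

# One-way ANOVA: do the two groups rate the replacement driver differently?
f_stat, p_value = stats.f_oneway(group_a, group_b)
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.3f}")

# Correlation between two rated factors across the same respondents.
obsolescence = [5, 6, 4, 5, 7, 6, 5, 6]
business_benefit = [6, 7, 5, 6, 7, 7, 6, 7]
r, p = stats.pearsonr(obsolescence, business_benefit)
print(f"Pearson r={r:.2f}, p={p:.3f}")
```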
236

Project managers' perceptions of the primary factors contributing to success or failure of projects| A qualitative phenomenological study

Hickson, Ray C. 30 June 2015
This qualitative interpretative phenomenological study increased the understanding of project managers' perceptions and lived experiences of the primary issues contributing to the success or failure of projects. This study used method triangulation to analyze the experiences of 48 project managers. The study was conducted in three phases: a pilot study, an open-ended questionnaire, and one-on-one interviews. The project managers' lived experiences indicated that stakeholder communication; collaboration; and consensus on governance, leadership methods, definition of requirements, and success criteria during the project initiation stage are critical to achieving higher project success rates. The major themes that emerged from this study are the definition of project success, requirements and success criteria, stakeholder consensus and engagement, transparency, and project management methodologies. Additional research is suggested to determine whether there is a relationship among experience, qualifications, certification, and project success or failure, and to determine implementable solutions to improve project success rates.
237

Adopting Workgroup Collaboration Tools in 3D Virtual Worlds

Schott, Thomas R. 11 September 2014
Collaboration is vital in today's information age, and tools are increasingly used to bring together teams that are geographically dispersed. Second Life, a 3D virtual world, can incorporate most of the visual, auditory, and spatial elements of the real world, and can create a feeling of presence, or the sense of actually "being there," for users. Common 2D groupware collaboration tools, such as the web conferencing and conference calls used for virtual team collaboration in professional contexts, are key enablers for virtual teams. However, businesses and organizations have not adopted virtual worlds for virtual teams and workgroup collaboration. Shen & Eder (2009) conducted a study applying their modified Technology Acceptance Model (TAM) to the adoption of Second Life for business purposes, using college students who were new to Second Life as participants. The purpose of this research is to examine how the seven factors identified in Shen and Eder's (2009) extended TAM relate to the behavioral intention to use workgroup collaboration tools in Second Life, using a non-student sample of experienced Second Life users that was more demographically representative of the Second Life population. Although this research supported many of Shen and Eder's findings, it found a negative relationship between the construct of perceived enjoyment and behavioral intention. This finding is important because, contrary to the positive relationship observed in gaming and entertainment environments, perceived enjoyment is not an antecedent of behavioral intention in 3D virtual worlds when they are used for productivity activities. The results of this study may provide insight for tool developers and integrators on where to focus efforts that lead to improved adoption of these workgroup collaboration tools.
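As an illustration of how a relationship such as perceived enjoyment versus behavioral intention can be tested, the sketch below runs an ordinary least squares fit on placeholder construct scores. The variable names, the values, and the choice of plain OLS (rather than whatever analysis the thesis actually used) are all assumptions for illustration.

```python
import numpy as np

# Placeholder construct scores (e.g., averaged Likert items) for a handful
# of hypothetical respondents; not data from this study.
perceived_usefulness = np.array([5.0, 6.0, 4.5, 6.5, 5.5, 6.0])
perceived_enjoyment = np.array([4.0, 3.5, 5.0, 3.0, 4.5, 3.5])
behavioral_intention = np.array([5.5, 6.5, 4.0, 7.0, 5.0, 6.5])

# Ordinary least squares: intercept plus two predictors.
X = np.column_stack([np.ones_like(perceived_usefulness),
                     perceived_usefulness, perceived_enjoyment])
coef, *_ = np.linalg.lstsq(X, behavioral_intention, rcond=None)
print("intercept, usefulness, enjoyment coefficients:", np.round(coef, 2))
```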
238

Supporting source code comprehension during software evolution and maintenance

Alhindawi, Nouh 13 June 2014
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts, and categorize changes, within very large bodies of source code and their versioned histories. More specifically, advanced Information Retrieval (IR) and Natural Language Processing (NLP) techniques are utilized and enhanced to support various software engineering tasks. This research is not aimed at directly improving IR or NLP approaches; rather, it is aimed at understanding how additional information can be leveraged to improve the final results. The work advances the field by investigating approaches to augment and re-document source code with different types of abstract behavioral information. The hypothesis is that enriching the source code corpus with meaningful descriptive information, and integrating this orthogonal information (semantic and structural) extracted from source code, will improve the results of IR methods for indexing and querying. Moreover, adding this new information to a corpus is a form of supervision: a priori knowledge is often used to direct and supervise machine-learning and IR approaches.

The main contributions of this dissertation involve improving on the results of previous work in feature location and source code querying. The dissertation demonstrates that the addition of statically derived information from source code (e.g., method stereotypes) can improve the results of IR methods applied to the problem of feature location. Further contributions include showing the effects of excluding certain textual information (comments and function calls) when indexing source code for feature/concept location. Moreover, the dissertation demonstrates an IR-based method of natural language topic extraction that assists developers in gaining an overview of past maintenance activities based on software repository commits.

The ultimate goal of this work is to reduce the costs, effort, and time of software maintenance by improving on the results of previous work in feature location and source code querying, and by supporting a new platform for enhancing program comprehension and facilitating software engineering research.
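To make the indexing-and-querying idea concrete, the sketch below ranks a toy set of methods against a natural-language query using TF-IDF and cosine similarity. The method names, their descriptive text, and the query are invented for illustration; the dissertation's actual corpus construction and augmentation (e.g., method stereotypes) are richer than this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus: each "document" stands in for the identifiers, comments, and
# added descriptive text of one method (all names are invented).
methods = {
    "Cart.addItem": "add item shopping cart collection mutator",
    "Cart.total": "compute total price of items in cart accessor",
    "Auth.login": "validate user password and create login session",
}

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(methods.values())

# A feature-location query is just another document projected into the same
# vector space; methods are ranked by cosine similarity to the query.
query_vec = vectorizer.transform(["where is the shopping cart total computed"])
scores = cosine_similarity(query_vec, matrix).ravel()
for name, score in sorted(zip(methods, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {name}")
```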
239

In-network processing for mission-critical wireless networked sensing and control| A real-time, efficiency, and resiliency perspective

Xiang, Qiao 13 June 2014
As wireless cyber-physical systems (WCPS) are increasingly deployed in mission-critical applications, it becomes imperative that we consider application QoS requirements in in-network processing (INP). In this dissertation, we explore the potential of two INP methods, packet packing and network coding, for improving network performance while satisfying application QoS requirements. We find that not only can these two techniques increase the energy efficiency, reliability, and throughput of WCPS while satisfying the QoS requirements of applications in a relatively static environment, but they can also provide low-cost proactive protection against transient node failures in a more dynamic wireless environment.

We first study the problem of jointly optimizing packet packing and the timeliness of data delivery. We identify the conditions under which the problem is strongly NP-hard, and we find that the problem complexity depends heavily on aggregation constraints rather than on network and traffic properties. For cases when the problem is NP-hard, we show that there is no polynomial-time approximation scheme (PTAS); for cases when the problem can be solved in polynomial time, we design polynomial-time, offline algorithms for finding optimal packet packing schemes. We design a distributed, online protocol, tPack, that schedules packet transmissions to maximize the local utility of packet packing at each node. We evaluate the properties of tPack in the NetEye testbed. We find that jointly optimizing data delivery timeliness and packet packing, and considering real-world aggregation constraints, significantly improves network performance.

We then work on the problem of minimizing the transmission cost of network coding (NC) based routing in sensor networks. We propose what is, to our knowledge, the first mathematical framework for computing the expected transmission cost of NC-based routing in terms of the expected number of transmissions. Based on this framework, we design a polynomial-time greedy algorithm for forwarder set selection and prove its optimality for transmission cost minimization. We designed EENCR, an energy-efficient NC-based routing protocol that implements our forwarder set selection algorithm to minimize the overall transmission cost. Through a comparative study of EENCR and other state-of-the-art routing protocols, we show that EENCR significantly outperforms CTP, MORE, and CodeOR in delivery reliability, delivery cost, and network goodput.

Furthermore, we study the 1+1 proactive protection problem using network coding. We show that even under a simplified setting, finding two node-disjoint routing braids with minimal total cost is NP-hard. We then design a heuristic algorithm to construct two node-disjoint braids with a transmission cost upper bounded by that of two shortest node-disjoint paths. We also design ProNCP, a proactive NC-based protection protocol using a design philosophy similar to EENCR's. We evaluate the performance of ProNCP under various transient network failure scenarios. Experimental results show that ProNCP is resilient to various network failure scenarios and provides stable performance in terms of reliability, delivery cost, and goodput.

Our findings in this dissertation explore the challenges, benefits, and solutions in designing real-time, efficient, resilient, and QoS-guaranteed wireless cyber-physical systems, and our solutions shed light on future research in related topics.
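A useful primitive behind any "expected number of transmissions" argument is the classic ETX link metric, sketched below for a multi-hop path. This is background illustration only: the dissertation's framework for NC-based routing cost is more involved, and the delivery ratios used here are hypothetical.

```python
def expected_transmissions(p_forward, p_reverse):
    """Classic ETX link metric: expected transmissions (with retransmissions)
    to deliver and acknowledge one packet over a lossy link, given forward
    and reverse delivery ratios."""
    return 1.0 / (p_forward * p_reverse)

def path_cost(link_ratios):
    """Additive path cost: sum of per-hop ETX values."""
    return sum(expected_transmissions(pf, pr) for pf, pr in link_ratios)

# Hypothetical two-hop path with lossy links.
print(round(path_cost([(0.9, 0.8), (0.7, 0.9)]), 2))  # ~2.98 expected transmissions
```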
240

An analysis of open source security software products downloads

Barta, Brian J. 16 April 2014
Despite the continued demand for open source security software, a gap persists in the identification of factors related to the success of open source security software. No studies accurately assess the extent of this gap, particularly with respect to the strength of the relationships between open source software development attributes and the number of security software downloads. The research conducted in this study investigates the strength of the relationships between particular open source software project development factors and a particular measure of open source security software success. This research focuses on open source software development with an emphasis on anti-virus, firewall, and intrusion detection software. Additionally, this study reviews some key cyber-security events that have shaped the cyber landscape, as well as descriptions of some security technologies that have emerged as a result of those events. The level of correlation between the dependent variable, number of software downloads, and the independent variables, project team size and number of software project events, is analyzed in this research.
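For readers who want to see what the correlation analysis amounts to in code, the sketch below computes Pearson correlations between downloads and the two independent variables with scipy. The project records are hypothetical placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical project records (placeholders, not study data).
team_size = [3, 5, 8, 2, 12, 7]
project_events = [14, 30, 55, 9, 80, 41]
downloads = [1200, 3400, 9100, 600, 15000, 5200]

for name, variable in [("project team size", team_size),
                       ("number of project events", project_events)]:
    r, p = stats.pearsonr(variable, downloads)
    print(f"downloads vs {name}: r={r:.2f}, p={p:.3f}")
```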
