1

Project managers' perceptions of the primary factors contributing to success or failure of projects: A qualitative phenomenological study

Hickson, Ray C. 30 June 2015 (has links)
This qualitative interpretative phenomenological study increased the understanding of project managers' perceptions and lived experiences of the primary issues contributing to the success or failure of projects. The study used method triangulation to analyze the experiences of 48 project managers and was conducted in three phases: a pilot study, an open-ended questionnaire, and one-on-one interviews. The project managers' lived experiences indicated that stakeholder communication; collaboration; and consensus on governance, leadership methods, definition of requirements, and success criteria during the project initiation stage are critical to achieving higher project success rates. The major themes that emerged from the study are the definition of project success, requirements and success criteria, stakeholder consensus and engagement, transparency, and project management methodologies. Additional research is suggested to determine whether there is a relationship among experience, qualifications, certification, and project success or failure, and to identify implementable solutions for improving project success rates.
2

Informed Technology Adoption Decisions Based on Innovation-Related Factors

Hsieh, David A. 29 December 2018 (has links)
Timely technology adoption decisions, whether to invest or divest, can be made by using a multicriteria decision-making approach that integrates established innovation theories, industry best practices in the technology acquisition lifecycle, statistical analysis of available technology profiles, expert opinion, and trend analysis. This research aimed to develop an analytical approach to assess the correlation between objective data (such as innovation maturity rating and market penetration) and subjective data (such as benefit rating and "time to plateau") and to provide organizations with insight into technology adoption decisions. The objective of this study is not to study Gartner's Hype Cycles themselves but to utilize the longitudinal technology innovation profile data as factors for informed technology adoption decisions. We combined mapping to the Department of Defense Technology Readiness Level (TRL), statistical analysis, correlations, multiple regression analysis, and trend analysis into an objective and quantifiable methodology that provides insight into the characteristics of innovations. The goal is to derive a logical and balanced approach to organizations' decision making based on analysis of objective data (the technology maturity ratings and market surveys) and subjective data (expert opinion in benefit ratings and time-to-plateau predictions). We used Rogers' concept of "Diffusion of Innovation" as a notional reference for organizational technology adoption and conducted a statistical analysis of a selected set of 345 Gartner technology profiles from 2009 to 2015, using market penetration data as a proxy for technology acceptance. To ensure fit for purpose, we compared Gartner's definition of technology maturity with the Department of Defense TRL. Trend analysis of market penetration, maturity rating, benefit rating, and time to technology plateau showed that a 2nd-order polynomial model provided the best statistical goodness of fit in all cases. We discuss the non-linear nature of the data and the need for a more predictive association of technological maturity with organizational adoption. Further empirical approaches using traditional analysis, machine learning, or artificial intelligence would allow researchers to test, explore, and better understand the diffusion of innovation first pioneered by Rogers, Moore, and Bass.
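To illustrate the kind of trend analysis described above, the sketch below fits a linear and a 2nd-order polynomial trend to a hypothetical market-penetration series and compares their R² values. The years, data points, and variable names are illustrative assumptions, not the study's Gartner profile data.

```python
import numpy as np

# Hypothetical yearly market-penetration values for one technology profile
# (illustrative numbers only; not the study's data).
years = np.array([2009, 2010, 2011, 2012, 2013, 2014, 2015], dtype=float)
penetration = np.array([1.0, 2.5, 4.5, 7.5, 11.0, 15.5, 21.0])  # percent of target market

def r_squared(y, y_hat):
    """Coefficient of determination for a fitted trend."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

x = years - years.min()                       # center on the first year
linear = np.poly1d(np.polyfit(x, penetration, 1))
quadratic = np.poly1d(np.polyfit(x, penetration, 2))

print("linear R^2:   ", round(r_squared(penetration, linear(x)), 4))
print("quadratic R^2:", round(r_squared(penetration, quadratic(x)), 4))
print("2017 projection (quadratic):", round(quadratic(2017 - years.min()), 1))
```

On this toy series the 2nd-order fit attains the higher R², loosely mirroring the abstract's finding that a 2nd-order polynomial gave the best goodness of fit.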
3

Prioritizing Security Controls Using Multiple Criteria Decision Making for Home Users

Waxler, John 26 April 2018 (has links)
Hundreds of thousands of home users are victimized by cyber-attacks every year. Many experts agree that average home users are not doing enough to protect their home computers from cyber-attacks. Improperly managed home computers can lead to lost data, slow systems, identity loss or theft, and ransom payments. En masse, attacks can act in concert to infect personal computers in business and government. Home users currently receive conflicting guidance, often in the form of recommendations such as 'Top 10' lists, which are not appropriate for their specific needs; in many instances users ignore all guidance. Often, these 'Top 10' lists appear to be based solely on subjective opinion. Ultimately, the researchers asked themselves the following question: how can we provide home users with better guidance for determining and applying appropriate security controls that meet their needs and can be verified by the cyber security community? This praxis proposes a methodology for determining and prioritizing the most appropriate security controls for home computing. Using Multi-Criteria Decision Making (MCDM) and subject matter expertise, this praxis identifies, analyzes, and prioritizes security controls used by government and industry to determine which controls can substantively improve home computing security. The praxis then applies the methodology to examples to demonstrate its benefits.
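As one concrete illustration of how an MCDM approach can rank controls, the sketch below applies a simple weighted-sum aggregation. The control names, criteria, scores, and weights are hypothetical placeholders rather than the praxis's elicited data, and the praxis's actual MCDM technique may differ.

```python
# Minimal weighted-sum MCDM sketch for ranking home-computing security controls.
# All names, scores, and weights below are illustrative assumptions.
controls = {
    "Automatic OS updates":   {"effectiveness": 9, "ease_of_use": 8, "affordability": 9},
    "Password manager":       {"effectiveness": 8, "ease_of_use": 6, "affordability": 7},
    "Full-disk encryption":   {"effectiveness": 7, "ease_of_use": 5, "affordability": 8},
    "Weekly offline backups": {"effectiveness": 9, "ease_of_use": 4, "affordability": 6},
}

# Criterion weights (e.g., elicited from subject-matter experts); they sum to 1.
weights = {"effectiveness": 0.5, "ease_of_use": 0.3, "affordability": 0.2}

def weighted_score(scores):
    """Aggregate per-criterion scores (1-10, higher is better) into one value."""
    return sum(weights[criterion] * value for criterion, value in scores.items())

# Rank controls from highest to lowest aggregate score.
for name, scores in sorted(controls.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{weighted_score(scores):.2f}  {name}")
```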
4

Considerations on the optimal and efficient processing of information-bearing signals

Harms, Herbert Andrew 27 November 2013 (has links)
Noise is a fundamental hurdle that impedes the processing of information-bearing signals, specifically the extraction of salient information. Processing that is both optimal and efficient is desired; optimality ensures the extracted information has the highest fidelity allowed by the noise, while efficiency ensures limited resource usage. Optimal detectors and estimators have long been known, e.g., for maximum likelihood or minimum mean-squared error criteria, but might not admit an efficient implementation. A tradeoff often exists between the two goals. This thesis explores the tradeoff between optimality and efficiency in a passive radar system and an analog-to-digital converter. A passive radar system opportunistically uses illuminating signals from the environment to detect and track targets of interest, e.g., airplanes or vehicles. As an opportunistic user of signals, the system does not have control over the transmitted waveform. The available waveforms are not designed for radar and often have undesirable properties for radar systems, so the burden is on the receiver processing to overcome these obstacles. A novel technique is proposed for the processing of digital television signals as passive radar illuminators that eases the need for complex detection and tracking schemes while incurring only a small penalty in detection performance. An analog-to-digital converter samples analog signals for digital processing. The Shannon-Nyquist theorem describes a sufficient sampling and recovery scheme for bandlimited signals from uniformly spaced samples taken at a rate twice the bandwidth of the signal. Frequency-sparse signals are composed of relatively few frequency components and have fewer degrees of freedom than a frequency-dense bandlimited signal. Recent results in compressed sensing describe sufficient sampling and recovery schemes for frequency-sparse signals that require a sampling rate proportional to the spectral density and the logarithm of the bandwidth, while providing high fidelity and requiring many fewer samples, which saves resources. A proposed sampling and simple recovery scheme is shown to efficiently recover the locations of tones in a large bandwidth nearly optimally using relatively few samples. The proposed sampling scheme is further optimized for full recovery of the input signal by matching the statistics of the scheme to the statistics of the input signal.
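The compressed-sensing idea in the final sentences can be sketched with a toy experiment: locate a few tones from far fewer random time samples than uniform Nyquist sampling of the full band would use. The greedy (orthogonal matching pursuit) recovery, signal sizes, and noise level below are illustrative assumptions, not the thesis's proposed sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frequency-sparse signal: 3 cosine tones in a length-1024 window.
n, k = 1024, 3
true_bins = rng.choice(n // 2, size=k, replace=False)
t = np.arange(n)
signal = sum(np.cos(2 * np.pi * b * t / n) for b in true_bins)

# Take far fewer random time samples than the full Nyquist grid.
m = 128
sample_idx = np.sort(rng.choice(n, size=m, replace=False))
y = signal[sample_idx] + 0.01 * rng.standard_normal(m)

# Greedy recovery (orthogonal matching pursuit) over a cosine dictionary.
freqs = np.arange(n // 2)
dictionary = np.cos(2 * np.pi * np.outer(sample_idx, freqs) / n)   # m x (n/2)
dictionary /= np.linalg.norm(dictionary, axis=0)

residual, support = y.copy(), []
for _ in range(k):
    support.append(int(np.argmax(np.abs(dictionary.T @ residual))))
    A = dictionary[:, support]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residual = y - A @ coef

print("true tone bins:     ", sorted(true_bins.tolist()))
print("recovered tone bins:", sorted(support))
```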
5

A Model-Based Framework for Analyzing Cloud Service Provider Trustworthiness and Predicting Cloud Service Level Agreement Performance

Maeser, Robert K., III 27 April 2018 (has links)
Analytics firm Cyence estimated that Amazon's four-hour cloud computing outage on February 28, 2017 "cost S&P 500 companies at least $150 million" (Condliffe 2017), and traffic monitoring firm Apica claimed "54 of the top 100 online retailers saw site performance slump by at least 20 percent" (Condliffe 2017). Data center outages in 2015 cost Fortune 1000 companies between $1.25 and $2.5 billion (Ponemon 2017). Despite potential risks, the cloud computing industry continues to grow. For example, the Internet of Things, which is projected to grow 266% between 2013 and 2020 (MacGillivray et al. 2017), will drive increased demand for and dependency on cloud computing as data across multiple industries is collected and sent back to cloud data centers for processing. Enterprises continue to increase demand and dependency, with 85% having multi-cloud strategies, up from 2016 (RightScale 2017a). This growth and dependency will influence risk exposure and potential for impact (e.g., availability, reliability, performance, security, financial). The research and proposed solution in this praxis focus on calculating cloud service provider (CSP) trustworthiness based on cloud service level agreement (SLA) criteria and predicting cloud SLA availability performance for cloud computing services. Evolving industry standards for cloud SLAs (EC 2014; Hunnebeck et al. 2011; ISO/IEC 2016; NIST 2015; Hogben, Giles and Dekker 2012) and existing work on CSP trustworthiness (Ghosh, Ghosh, and Das 2015; Taha et al. 2014) are leveraged as the predictive model (using linear regression analysis) is constructed to analyze CSP cloud computing services, SLA performance, and CSP trustworthiness.
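As a rough illustration of the predictive-model portion, the sketch below fits an ordinary least-squares linear regression that maps a few hypothetical CSP attributes to measured monthly availability. The feature choices, numbers, and the new-CSP prediction are invented for illustration and are not the praxis's model or data.

```python
import numpy as np

# Illustrative-only observations for several CSPs (not the praxis's data).
# Features: promised SLA availability (%), trustworthiness score (0-1),
# past-incident count. Target: measured availability for the month (%).
X = np.array([
    [99.95, 0.90, 1],
    [99.99, 0.95, 0],
    [99.90, 0.80, 3],
    [99.95, 0.85, 2],
    [99.99, 0.92, 1],
    [99.90, 0.70, 4],
])
y = np.array([99.93, 99.99, 99.80, 99.90, 99.97, 99.70])

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict availability for a hypothetical new CSP profile.
new_csp = np.array([1.0, 99.95, 0.88, 2])   # intercept, promised %, trust, incidents
print("predicted availability: %.3f%%" % (new_csp @ coef))
```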
6

Application of a Hidden Bayes Naive Multiclass Classifier in Network Intrusion Detection

Koc, Levent 11 January 2013
7

Cable modems' transmitted RF: A study of SNR, error rates, transmit levels, and trouble call metrics

Tebbetts, Jo A. 24 April 2013 (has links)
Hypotheses were developed and tested to measure how cable modems' operational metrics respond to a reconfiguration of the modems' transmitted RF applied at the CMTS. The purpose of this experiment was to compare two groups on the use of non-federal RF spectrum and determine whether reconfiguring the cable modems' transmitted RF from 25.2 MHz and 31 MHz (each 6.4 MHz wide, 64 QAM) to 34.8 MHz (6.4 MHz wide, 64 QAM) improved the data-service operational metrics a wireline service operator uses to measure the quality of its product. The experiment tests the theory that configuring the cable modems' transmitted RF to 34.8 MHz, 6.4 MHz wide, 64 QAM on the CMTS significantly impacts a cable modem's operational metrics and, as a result, increases operational effectiveness.

A randomized experiment on 117,084 cable modems showed a significant impact on SNR and transmit rates but no significant impact on error rates and the trouble call metrics. The results showed that reconfiguring the cable modems' transmitted RF from 25.2 MHz and 31 MHz (6.4 MHz wide, 64 QAM) to 34.8 MHz (6.4 MHz wide, 64 QAM) significantly increased SNR and transmit rates but did not significantly impact error rates or trouble call truck roll metrics. The results are discussed in relation to other work on engineering RF management strategies and the impact on cable modems' operational metrics of moving the modems' RF from the lower end of the RF spectrum into the middle of the spectrum configured on a wireline service operator's CMTS.
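A simplified version of the SNR comparison could look like the following Welch's t-test sketch. The simulated SNR distributions, group sizes, and effect size are assumptions for illustration and do not reproduce the study's measurements on 117,084 modems or its actual statistical procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical upstream SNR readings (dB): modems left on the original
# channels vs. modems moved to 34.8 MHz. Values are simulated for illustration.
snr_control = rng.normal(loc=33.0, scale=2.0, size=500)   # original RF configuration
snr_treated = rng.normal(loc=35.5, scale=2.0, size=500)   # reconfigured to 34.8 MHz

# Welch's two-sample t-test: did the reconfiguration shift mean SNR?
t_stat, p_value = stats.ttest_ind(snr_treated, snr_control, equal_var=False)
print(f"mean difference: {snr_treated.mean() - snr_control.mean():.2f} dB")
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")
```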
8

A systems thinking approach to cloud computing that complements the NIST guidance

Brow, Raymond A. 08 April 2014 (has links)
The move to cloud computing mandated by the US federal CIO (Kundra, 2010) is one of the key initiatives expected to provide relief from US federal IT budget concerns. NIST was commissioned to provide guidance for the move. Federal agencies expressed concern that the guidance provided to them was deficient, and GAO (2012) further stated that more planning was required. There is no research investigating the possible role systems thinking could play in complementing the NIST guidance and enhancing that planning. This study presents systems thinking as a complementary option to the NIST guidance. Using a mixed method, this study first demonstrates quantitatively, through content analysis, that the NIST documentation does not take a systems thinking approach. Second, it qualitatively explores a systems thinking framework to supplement the NIST guidance. The framework is established from a thorough review of current scholarship in the areas of systems thinking and cloud computing, using the established tools and methods of content analysis. The review of cloud computing demonstrates the diversity, complexity, and uncertainty of the technology; systems thinking is shown to address just such situations. Based upon the research, a systems thinking framework is created that demonstrates how systems thinking could supplement the NIST guidance on moving to the cloud. By applying the framework, US federal agencies could more confidently manage the risk of moving federal assets to the cloud and thereby gain a firmer foothold in arresting IT budget concerns.
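The quantitative content-analysis step might be sketched as simple term-frequency counting over guidance text, as below. The systems-thinking term list and the sample passage are hypothetical stand-ins, not the study's coding scheme or the actual NIST documents.

```python
# Minimal term-frequency sketch of quantitative content analysis.
# Term list and sample passage are illustrative assumptions only.
systems_terms = ["holistic", "interdependence", "feedback", "emergence",
                 "boundary", "interrelationship", "leverage point"]

guidance_text = """
Agencies should assess each workload individually and migrate services to
the cloud based on cost, security, and readiness considerations, with
feedback from stakeholders gathered after migration.
""".lower()

# Count how often each systems-thinking term appears in the passage.
hits = {term: guidance_text.count(term) for term in systems_terms}
print("systems-thinking term frequencies:", hits)
print("total hits:", sum(hits.values()))
```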
9

Reasons for non-compliance with mandatory information assurance policies by a trained population

Shelton, D. Cragin 13 May 2015 (has links)
Information assurance (IA) is about protecting key attributes of information and data systems. Treating IA as a system, it is appropriate to consider the three major elements of any system: people, processes, and tools. While IA tools exist in the form of hardware and software, tools alone cannot assure key information attributes; IA procedures, and the people who must follow those procedures, are also part of the system. It is not in dispute that people fail to follow IA procedures. A review of the literature showed not only that there is no general consensus on why people do not follow IA procedures, but also that no identified studies simply asked people their reasons. Published studies addressed reasons for non-compliance, but always within a framework of one of several assumed theories of human performance. The study described here took a first small step by asking a sample from an under-studied population, users of U.S. federal government information systems, why they had failed to comply with two IA procedures related to password management, and how often. The results may lay the groundwork for extending the same methodology across a range of IA procedures, eventually suggesting new approaches to motivating people, modifying procedures, or developing tools to better meet IA goals. In the course of the study, an unexpected result occurred: the study plan had included comparing the data for workers with and without IA duties, but almost all of the respondents declared having IA duties. A comment from a pilot study participant suggested that IA awareness programs emphasizing universal responsibility for information security may have caused the unexpected responses. The study conclusions address suggestions for refining the question in future studies.
10

Mobility and Traffic Correlations in Device-to-Device (D2D) Communication Networks

Li, Yujin 24 March 2015 (has links)
No description available.
