  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Information technology programming standards and annual project maintenance costs

Mynyk, John 15 February 2014 (has links)
<p> Organizations that depend on IT in their business models must maintain their systems and keep them current to survive (Filipek, 2008; Kulkarni, Kumar, Mookerjee, & Sethi, 2009; Unterkalmsteiner et al., 2012). Because most IT departments allocate as much as 80% of their budget to maintaining stability, leaving only 20% for improvements (Telea et al., 2010), the high cost of stability may be a reason many IT organizations cannot afford efficient staffing; it may even jeopardize the existence of the organization (Filipek, 2008; Talib, Abdullah, Atan, & Murad, 2010). The purpose of this exploratory mixed methods study was to discover the IT programming standards used in IT departments that predict a decrease in project maintenance costs. The study employed exploratory mixed methods data collection and analysis to develop and test a collection of universal programming standards. The qualitative portion of the study produced a list of IT programming standards from the Fortune 20 companies of 2011. Using survey data from IT departments in the Fortune 500 companies of 2011, the quantitative portion correlated the degree of enforcement of each IT programming standard with a decrease in average project maintenance costs using a backward stepwise regression. At a 95% confidence interval with a 5% margin of error (α = .05), the backward stepwise regression discarded 18 of the 22 IT programming standards. The remaining correlations gave evidence that (a) the more the department enforces waiting for feedback, the higher the maintenance costs; (b) the more the department enforces having the architectural team develop coding guidelines, the lower the maintenance costs; and (c) the more the IT department enforces following change management procedures, the higher the maintenance costs.</p>
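The backward elimination procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the study's analysis: the predictor names, the synthetic data, and the large-sample critical value of 1.96 (≈ α = .05) are all assumptions for demonstration.

```python
import numpy as np

def backward_stepwise(X, y, names, t_crit=1.96):
    """Repeatedly fit OLS and drop the weakest predictor until every
    remaining |t| statistic exceeds t_crit (~alpha = .05 for large n)."""
    X = X.copy()
    names = list(names)
    while X.shape[1] > 0:
        Xd = np.column_stack([np.ones(len(y)), X])      # add intercept column
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ beta
        dof = len(y) - Xd.shape[1]
        sigma2 = resid @ resid / dof                    # residual variance
        cov = sigma2 * np.linalg.inv(Xd.T @ Xd)
        t = beta[1:] / np.sqrt(np.diag(cov)[1:])        # t-stats, intercept excluded
        weakest = int(np.argmin(np.abs(t)))
        if abs(t[weakest]) >= t_crit:
            break                                       # all predictors significant
        X = np.delete(X, weakest, axis=1)               # discard weakest predictor
        names.pop(weakest)
    return names

# Synthetic demo (hypothetical standards): costs depend on the first
# predictor only; the second is noise and should be discarded.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=200)
kept = backward_stepwise(X, y, ["enforce_guidelines", "wait_for_feedback"])
```

In the study's setting, the columns of `X` would be the enforcement degrees of the 22 standards and `y` the change in average project maintenance cost.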
82

User-defined key pair protocol

Hassan, Omar 26 February 2014 (has links)
<p> E-commerce applications have flourished on the Internet because of their ability to perform secure transactions in which the identities of the two parties can be verified and the communications between them encrypted. The Transport Layer Security (TLS) protocol makes secure transactions possible by creating a secure tunnel between the user's browser and the server with the help of Certificate Authorities (CAs). A CA is a third party trusted by both the user's browser and the server and is responsible for establishing secure communication between them. The major limitation of this model is the use of CAs as single points of trust, which can introduce severe security breaches globally. In my thesis, I provide a high-level design for a new protocol in the application layer of the TCP/IP suite that builds a secure tunnel between the user's browser and the server without the involvement of any third party. My proposed protocol is called User-Defined Key Pair (UDKP); its objective is to build a secure tunnel between the user's browser and the server using a public/private key pair generated for the user on the fly inside the user's browser, based on the user's credential information. This key pair is used by the protocol instead of the server certificate as the starting point for creating the secure tunnel.</p>
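The core idea of the abstract — deterministically deriving key material from user credentials inside the browser, so the key pair can be regenerated on the fly — can be sketched with a password-based KDF. This is an illustrative sketch only: the KDF choice, iteration count, and use of the username as salt are assumptions, not the UDKP specification.

```python
import hashlib

def derive_seed(username: str, password: str, iterations: int = 600_000) -> bytes:
    """Derive a deterministic 32-byte seed from user credentials.
    The username doubles as the salt here so each user gets a distinct
    seed; a real design would use a memory-hard KDF (e.g. scrypt/Argon2)
    and carefully chosen per-user salts."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        password.encode("utf-8"),
        username.encode("utf-8"),   # salt (illustrative assumption)
        iterations,
        dklen=32,
    )

# The same credentials always reproduce the same seed, so the browser can
# regenerate the user's key pair on demand without storing it anywhere.
seed = derive_seed("alice@example.com", "correct horse battery staple")
# The 32-byte seed could then feed a key generator (e.g. an Ed25519
# private key), replacing the server certificate as the trust anchor.
```

The determinism is the point: because the seed is a pure function of the credentials, no third party needs to escrow or certify the key material.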
83

Designing online conversations to engage local practice: a framework for the mutual development of tacit and explicit knowledge

Wise, Alyssa Friend. January 2007 (has links)
Thesis (Ph.D.)--Indiana University, School of Education, 2007. / Source: Dissertation Abstracts International, Volume: 68-07, Section: A, page: 2816. Adviser: Thomas M. Duffy. Title from dissertation home page (viewed Apr. 14, 2008).
84

The 2014 Green Book: A qualitative historical case study

Meyer, Robert A. 15 January 2016 (has links)
<p> Effective internal controls to protect government information technology (IT) investments are essential as annual deficits exceed $700 billion and government shutdowns and sequestrations are threatened. The purpose of this qualitative historical single-case study was to explore, analyze, and describe feedback collected by the United States Government Accountability Office as IT governance and control requirements were rationalized. Prior to publishing an updated Standard for Internal Control in the Federal Government, the Federal Register asked participants to respond to a series of questions directed toward the 2013 Draft Standards for Internal Control in the Federal Government. Four major themes emerged from the 43 respondents: (a) challenges exist with financial constraints and control documentation requirements; (b) the central oversight body must ensure that federal, state, and county departments and agencies have shared understanding and objectives; (c) federal regulatory reform includes requirements identifying internal controls for both the 2014 Government Accountability Office Standards and the 2013 Committee of Sponsoring Organizations Standards; and (d) there are implications to adapting a Standards for Internal Control publication to align with the Federal Government rather than adopting the publication. An efficient and effective approach to identify, integrate, and balance regulatory guidelines, stakeholders' concerns, and technical requirements for government leadership, contractors, and non-federal entities is proposed for assessment and development. This technique could provide government leadership a method to assess factors affecting or influencing proposed and/or existing regulatory controls. Additionally, a conceptual historical narrative construct and a crosswalk between the COSO and Federal Standards for Internal Control are included.</p>
85

Examining Tuckman's Team Theory in Non-collocated Software Development Teams Utilizing Collocated Software Development Methodologies

Crunk, John 23 August 2018 (has links)
<p> The purpose of this qualitative, multi-case study was to explain Tuckman's attributes within software development when using a collocated software design methodology in a non-collocated setting. Agile is a software development methodology intended for use in a collocated setting; however, organizations are using it in non-collocated settings, which is increasing software errors in the final product. The New Agile Process for Distributed Projects (NAPDiP) was developed to fix the software errors that arise when using Agile in a non-collocated setting, but it has not been effective. This research utilized Tuckman's team theory to explore why these errors still occur. The research question asked how software development programmers explain Tuckman's attributes (i.e., forming, storming, norming, performing) on software development projects. The study adopted a qualitative model, using nomothetic major and minor themes to explore shared sentiments among participants. The study's population comprised seven participants in the United States and India who used the Agile development methodology and worked on teams of at least thirty individuals at various organizations. The seven participants reached saturation in this multi-case study, supporting the research question explored. The findings demonstrated that development teams do not reach all stages and attributes of Tuckman's team development. Future research should explore additional ways that software development teams can satisfy more of Tuckman's team development stages.</p>
86

Factors Influencing the Adoption of Cloud Computing Driven by Big Data Technology: A Quantitative Study

Chowdhury, Naser 25 August 2018 (has links)
<p>A renewed interest in cloud computing adoption has occurred in academic and industry settings because emerging technologies have strong links to cloud computing and Big Data technology. Big Data technology is driving cloud computing adoption in large business organizations. For cloud computing adoption to increase, cloud computing must transition from low-level technology to high-level business solutions. The purpose of this study was to develop a predictive model for cloud computing adoption that included Big Data technology-related variables, along with other variables from two widely used technology adoption theories: the technology acceptance model (TAM) and the technology-organization-environment (TOE) framework. The inclusion of Big Data technology-related variables extended cloud computing's mixed-theory adoption approach. The six independent variables were perceived usefulness, perceived ease of use, security effectiveness, cost-effectiveness, intention to use Big Data technology, and the need for Big Data technology. Data collected from 182 U.S. IT professionals or managers were analyzed using binary logistic regression. The results showed that the model involving the six independent variables was statistically significant for predicting cloud computing adoption with 92.1% accuracy. Independently, perceived usefulness was the only predictor variable that could increase cloud computing adoption. These results indicate that cloud computing may grow if it can be leveraged into emerging Big Data technology trends to make cloud computing more useful for its users.</p>
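A binary logistic regression of the kind the abstract describes can be sketched as follows. The predictor names, the synthetic data, and the gradient-descent fit are illustrative assumptions standing in for the study's survey data and statistical software.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Fit binary logistic regression by gradient descent on the log-loss.
    Returns the weight vector, with the intercept as the first entry."""
    Xd = np.column_stack([np.ones(len(y)), X])     # add intercept column
    w = np.zeros(Xd.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xd @ w))          # predicted adoption probability
        w -= lr * Xd.T @ (p - y) / len(y)          # gradient step on log-loss
    return w

def predict(X, w):
    """Classify as adopt (1) / not adopt (0) at the 0.5 threshold."""
    Xd = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-Xd @ w)) >= 0.5).astype(int)

# Synthetic demo with 182 "respondents" (matching the study's n):
# adoption driven mainly by the first predictor, perceived usefulness.
rng = np.random.default_rng(1)
X = rng.normal(size=(182, 2))                      # usefulness, ease of use
y = (2.5 * X[:, 0] + rng.normal(scale=0.5, size=182) > 0).astype(int)
w = fit_logistic(X, y)
accuracy = (predict(X, w) == y).mean()
```

In the study's setting, `X` would hold the six independent variables and `y` whether each respondent's organization adopted cloud computing; the reported 92.1% figure is the classification accuracy of such a model.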
87

Examination of Insider Threats: A Growing Concern

Hartline, Cecil L., Jr. 06 January 2018 (has links)
<p> The National Infrastructure Advisory Council (NIAC) reports that "...preventing all insider threats is neither possible nor economically feasible..." because the threat is already behind perimeter defenses and often knows exactly where vulnerabilities exist within organizations (Cline, 2016). The purpose of this research was to determine the prevalence of malicious and unintentional insider threats. Statistically, the numbers support the idea that insider threats are increasing and occurring more frequently. The true numbers, which account only for incidents that were reported, are likely higher still because organizations fear reputational damage and client loss; organizations also cite reasons such as insufficient evidence for conviction or difficulty proving guilt. The paper finds that companies focus most of their resources on external threats rather than the insider threat, which is costlier to remediate and considered the most damaging of all threats. The research focuses on malicious and unintentional insider threats and how they differ. A 2018 Crowd Research Partners report found 90% of organizations believe they are vulnerable to insider attacks, while 53% of businesses confirmed they had experienced an insider threat in the past 12 months (Crowd Research Partners, 2017a). The insider threat is hard to manage because an organization must not only worry about its own employees but also monitor and manage third-party vendors, partners, and contractors. However, with a combination of technical and nontechnical solutions, including an insider threat program, companies can detect, deter, prevent, or at least reduce the impacts of insider threats.</p>
88

Leading Across Boundaries: Collaborative Leadership and the Institutional Repository in Research Universities and Liberal Arts Colleges

Seaman, David M. 03 November 2017 (has links)
<p> Libraries often engage in services that require collaboration across stakeholder boundaries to be successful. Institutional repositories (IRs) are a good example of such a service. IRs are an infrastructure to preserve intellectual assets within a university or college and to provide an open access showcase for that institution's research, teaching, and creative excellence. They involve multiple stakeholders (librarians, IT experts, administrators, faculty, and students), are typically operated by academic libraries, and have existed since the early 2000s. </p><p> Collaborative leadership has been studied in areas such as health care and business, but it has received little attention in studies of library leadership and management. Collaborative leadership has been shown to be an effective leadership style for an increasingly networked world; it is an interactive process in which people set aside self-interests, share power, work across boundaries, and discuss issues openly and supportively. Collaborative leadership moves organizations beyond mere cooperation towards a state of interdependence; it empowers all members of a team to help each other achieve broader goals, find personal satisfaction in their work, and sustain productive relationships over time. A better understanding of collaborative leadership can inform both IR development and future complex multi-stakeholder campus services. </p><p> Two methodologies – content analysis of IR web pages and surveys of library directors and IR developers – were employed to determine if IRs revealed evidence of collaborative leadership. The study populations were those members of the Association of Research Libraries (ARL) and the Oberlin Group of liberal arts colleges that operated IR services by July 2014 (146 institutions overall). The research examined whether IR format, size, age, nomenclature, or technology platform varied between ARL and Oberlin Group members. It also asked whether there is any difference in the perception of collaborative leadership traits, perceived IR success, or collaborative involvement with stakeholder communities between ARL and Oberlin Group members or between library directors and IR developers. The study found evidence of all six collaborative leadership traits examined: assessing the environment for collaboration, creating clarity, building trust, sharing power, developing people, and self-reflection. </p>
89

A Study of How Young Adults Leverage Multiple Profile Management Functionality in Managing their Online Reputation on Social Networking Sites

McCune, T. John 26 October 2017 (has links)
<p> With privacy settings on social networking sites (SNS) perceived as complex and difficult to use and maintain, young adults can be left vulnerable to others accessing and using their personal information. Consequences of not regulating the boundaries of their information on SNS include the possibility that current and future employers will make career-impacting decisions based upon their online reputation, which may include disqualifying them as job candidates. </p><p> On SNS such as Facebook, LinkedIn, and Twitter, young adults must decide how to manage their online reputation by regulating boundaries around their personal and professional information and identities. One known practice for regulating boundaries is multiple profile management (MPM), where users create and use multiple accounts on an SNS to separate the social and professional identities they disclose publicly and privately. </p><p> The purpose of the study was to understand the lived experiences of young adults as they regulate boundaries on SNS through MPM while managing their online reputation to different audiences. The practice was studied by applying interpretative phenomenological analysis (IPA) through interviews with young adults 18-23 years of age who use MPM on an SNS. Semi-structured interviews permitted participants to provide in-depth descriptions of their lived experiences.</p><p> Eight themes were identified and described based on the analysis of the interviews: SNS use with online audiences, motivations for using MPM, processes for the presentation of self, online search results, privacy settings, untagging SNS posts, self-editing and censorship, and new features. The themes describe the complexity and challenges that young adults face in regulating boundaries around their professional and social identities online through MPM.</p><p> Findings from this study have implications for a variety of audiences. Through these findings, SNS developers can introduce new features, improve usability related to privacy management, and further encourage use of their networks. Users of SNS can use this study to understand the risks of using SNS and to learn practices for managing their online reputation.</p>
90

An Investigation of Circumstances Affecting Consumer Behavioral Intentions to Use Telemedicine Technology| An Interpretative Phenomenological Study

Cutts, Haywood 16 November 2017 (has links)
<p> Concerns related to the protection of personal identification information, the graphical user interface, patient privacy, and consumer acceptance, to name a few, have plagued the implementation of telemedicine. Advocates of telemedicine have gained the interest of consumers but failed to recognize the true nature of consumer attitudes towards the use of telemedicine. This research was a significant step towards understanding consumer unwillingness to use telemedicine. Understanding and acknowledging what customers feel is essential to improving the telemedicine implementation process. The purpose of this qualitative study was to explore consumers who may have experienced cognitive dissonance between their interest in and use of wireless body area networks. The interpretative phenomenological method was employed to understand and contribute knowledge about the phenomenon. The research participants were randomly selected patients, physicians, nurses, paramedics, and healthcare professionals. The findings contribute to knowledge by exposing the relevance of understanding cognitive dissonance and its underrated affiliations. Such alliances play a meaningful role when embracing or rejecting the use of telemedicine. Future research may consider aligning and employing use-behavior models, such as social cognitive theory or social capital theory, to help increase knowledge and understanding of consumer cognitive dissonance towards the use of telemedicine. Advocates planning to implement telemedicine in rural areas could use these findings to help diminish or subdue indigenous consumer anxiety towards the use of telemedicine.</p>
