581

Strategies for Cloud Services Adoption in Saudi Arabia

Mahmoud, Wessam Hussein Abdulghani 01 January 2019 (has links)
The adoption rate of cloud computing is low among business organizations in Saudi Arabia, despite the cost-saving benefits of using cloud services. The purpose of this multiple case study was to explore the strategies that information technology (IT) leaders in the manufacturing industry in Saudi Arabia used to adopt cloud computing to reduce IT costs. The target population of this study consisted of 5 IT leaders from 5 different manufacturing companies in Saudi Arabia who successfully adopted cloud computing in their companies to reduce IT costs. Rogers's diffusion of innovation theory was the conceptual framework for this research. Data collected from face-to-face, semistructured interviews and a review of relevant corporate documentation were analyzed using Yin's 5-step data analysis method, which included compiling, disassembling, reassembling, interpreting, and concluding the data. Five themes emerged from the data analysis: identify business needs and requirements, apply value realization metrics, plan for migration, choose the right cloud service provider, and provide adequate training and awareness sessions. The implications of this study for positive social change include the potential to improve the local economy in Saudi Arabia by ensuring the sustainability of firms in the manufacturing industry through the implementation of cost-saving strategies associated with cloud computing adoption.
582

Strategies Used by Cloud Security Managers to Implement Secure Access Methods

Harmon, Eric 01 January 2018 (has links)
Cloud computing can be used as a way to access services and resources for many organizations; however, hackers have created security concerns for users that incorporate cloud computing in their everyday functions. The purpose of this qualitative multiple case study was to explore strategies used by cloud security managers to implement secure access methods to protect data on the cloud infrastructure. The population for this study was cloud security managers employed by 2 medium-size businesses in the Atlanta, Georgia metropolitan area that had strategies to implement secure access methods to protect data on the cloud infrastructure. The technology acceptance model was used as the conceptual framework for the study. Data were collected from semistructured interviews of 7 security managers and a review of 21 archived documents that reflected security strategies from past security incidents. Data analysis was performed using methodological triangulation and resulted in the identification of 3 major themes: implementing security policies, implementing strong authentication methods, and implementing strong access control methods. The findings from this research may contribute to positive social change by decreasing customers' concerns regarding personal information stored on the cloud being compromised.
583

The Training Deficiency in Corporate America: Training Security Professionals to Protect Sensitive Information

Johnson, Kenneth Tyrone 01 January 2017 (has links)
Increased internal and external training approaches are elements senior leaders need to know before creating a training plan for security professionals to protect sensitive information. The purpose of this qualitative case study was to explore training strategies telecommunication industry leaders use to ensure security professionals can protect sensitive information. The population consisted of 3 senior leaders in a large telecommunication company located in Dallas, Texas that has a large footprint in securing sensitive information. The conceptual framework on which this study was based was the security risk planning model. Semistructured interviews and document reviews helped to support the findings of this study. Using the thematic approach, 3 major themes emerged: security training is required for all professionals, different approaches to training are beneficial, and internal and external training should complement each other. The findings revealed senior leaders used different variations of training programs to train security professionals on how to protect sensitive information. The senior leaders' highest priority was the ability to ensure all personnel accessing the network received the proper training. The findings may contribute to social change by enhancing area schools' technology programs with evolving cybersecurity technology, helping kids detect and eradicate threats before any loss of sensitive information occurs.
584

Management Strategies for Adopting Agile Methods of Software Development in Distributed Teams

Schtein, Igor A 01 January 2018 (has links)
Between 2003 and 2015, more than 61% of U.S. software development teams failed to satisfy project requirements, budgets, or timelines. Failed projects cost the software industry an estimated 60 billion dollars. Lost opportunities and misused resources are often the result of software development leaders failing to implement appropriate methods for managing software projects. The purpose of this qualitative multiple case study was to explore strategies software development managers use in adopting Agile methodology in the context of distributed teams. The tenets of the Agile approach are individual interaction over tools and processes, working software over documentation, and collaboration over contract negotiation. The conceptual framework for the study was adapting Agile development methodologies. The targeted population was software development managers of U.S.-based companies located in Northern California who had successfully adopted Agile methods for distributed teams. Data were collected through face-to-face interviews with 5 managers and a review of project-tracking documentation and tools. Data analysis included inductive coding of transcribed interviews and evaluation of secondary data to identify themes through methodological triangulation. Findings indicated that coaching and training of teams, incremental implementation of Agile processes, and proactive management of communication effectiveness are effective strategies for adopting Agile methodology in the context of distributed teams. Improving the efficacy of Agile adoption may translate to increased financial stability for software engineers across the world, as well as accelerate the successful development of information systems, thereby enriching human lives.
585

Plug-and-Play Web Services

Jain, Arihant 01 December 2019 (has links)
The goal of this research is to make it easier to design and create web services for relational databases. A web service is a software service for providing data over computer networks. Web services provide data endpoints for many web applications. We adopt a plug-and-play approach to web service creation whereby a designer constructs a “plug,” which is a simple specification of the output produced by the service. If the plug can be “played” on the database, then the web service is generated. Our plug-and-play approach has three advantages. First, a plug is portable: you can take the plug to any data source and generate a web service. Second, a plug-and-play service is more reliable, because web service generation checks the database to determine whether the service can be safely and correctly generated. Third, plug-and-play web services are easier to code for complex data, since a service designer can write a simple plug, abstracting away the data’s real complexity. We describe how to build a system for plug-and-play web services, and experimentally evaluate the system. The software produced by this research will make life easier for web service designers.
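
A minimal sketch of the plug-and-play idea, assuming a SQLite table as the data source; the plug format, the catalog.db database, and the can_play helper are hypothetical illustrations of the technique, not the thesis's actual specification language:

    # Hypothetical sketch: a "plug" declares the output a web service should
    # produce; "playing" it checks that the database can actually supply that
    # output before any service is generated.
    import sqlite3

    plug = {
        "service": "list_parts",
        "table": "parts",                   # table the service reads from
        "fields": ["id", "name", "price"],  # columns the service returns
    }

    def can_play(plug, conn):
        """Return True if the database exposes every field the plug needs."""
        cols = {row[1] for row in conn.execute(f"PRAGMA table_info({plug['table']})")}
        return set(plug["fields"]) <= cols

    conn = sqlite3.connect("catalog.db")
    if can_play(plug, conn):
        print(f"generating service {plug['service']}")  # emit handler code here
    else:
        print("plug cannot be played on this database")

Because the check runs at “play” time, an incompatible plug fails before any service code exists, rather than failing at query time in a running service.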
586

Multi-Robot Complete Coverage Using Directional Constraints

Malan, Stefanus 01 January 2018 (has links)
Complete coverage relies on a path planning algorithm that will move one or more robots, including the actuator, sensor, or body of the robot, over the entire environment. Complete coverage of an unknown environment is used in applications like automated vacuum cleaning, carpet cleaning, lawn mowing, chemical or radioactive spill detection and cleanup, and humanitarian de-mining. The environment is typically decomposed into smaller areas and then assigned to individual robots to cover. The robots typically use the Boustrophedon motion to cover the cells. The location and size of obstacles in the environment are unknown beforehand. An online algorithm using sensor-based coverage with unlimited communication is typically used to plan the path for the robots. For certain applications, like robotic lawn mowing, a pattern might be desirable over a random irregular pattern for the coverage operation. Assigning directional constraints to the cells can help achieve the desired pattern if the path planning part of the algorithm takes the directional constraints into account. The goal of this dissertation is to adapt the distributed coverage algorithm with unrestricted communication developed by Rekleitis et al. (2008) so that it can be used to solve the complete coverage problem with directional constraints in unknown environments while minimizing repeat coverage. It is a sensor-based approach that constructs a cellular decomposition while covering the unknown environment. The new algorithm takes directional constraints into account during the path planning phase. An implementation of the algorithm was evaluated in simulation software and the results from these experiments were compared against experiments conducted by Rekleitis et al. (2008) and with an implementation of their distributed coverage algorithm. The results of this study confirm that directional constraints can be added to the complete coverage algorithm using multiple robots without any significant impact on performance. The high-level goals of complete coverage were still achieved. The work was evenly distributed between the robots to reduce the time required to cover the cells.
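
As a rough illustration of the directional constraints involved, the sketch below (hypothetical, not taken from the dissertation) generates a boustrophedon -- back-and-forth -- path over a rectangular grid cell while forcing the sweep lines to run along a required axis:

    # Hypothetical sketch: Boustrophedon coverage of a w-by-h grid cell where
    # a directional constraint fixes the axis the sweep lines run along.
    def boustrophedon(w, h, sweep_axis="x"):
        """Yield (x, y) cells in a back-and-forth pattern.

        sweep_axis="x" makes the lines run east-west, row by row;
        sweep_axis="y" makes them run north-south, column by column.
        """
        if sweep_axis == "x":
            for y in range(h):
                xs = range(w) if y % 2 == 0 else range(w - 1, -1, -1)
                for x in xs:
                    yield (x, y)
        else:
            for x in range(w):
                ys = range(h) if x % 2 == 0 else range(h - 1, -1, -1)
                for y in ys:
                    yield (x, y)

    # A lawn-mowing pattern constrained to east-west stripes:
    path = list(boustrophedon(4, 3, sweep_axis="x"))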
587

Integrative Computational Genomics Based Approaches to Uncover the Tissue-Specific Regulatory Networks in Development and Disease

Srivastava, Rajneesh 03 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Regulatory protein families such as transcription factors (TFs) and RNA binding proteins (RBPs) are increasingly being appreciated for their role in regulating their targeted genomic/transcriptomic elements, resulting in dynamic transcriptional regulatory networks (TRNs) and post-transcriptional regulatory networks (PTRNs) in higher eukaryotes. The mechanistic understanding of these two regulatory network types requires high-resolution, tissue-specific functional annotation of both the proteins and their target sites. This dissertation addresses the need to uncover the tissue-specific regulatory networks in development and disease. This work establishes multiple computational genomics based approaches to further enhance our understanding of regulatory circuits and decipher the associated mechanisms at several layers of biological processes. This study potentially contributes to the research community by providing valuable resources, including novel methods, web interfaces, and software, which transform our ability to build high-quality regulatory binding maps of RBPs and TFs in a tissue-specific manner using multi-omics datasets. The study deciphered the broad spectrum of temporal and evolutionary dynamics of the transcriptome and their regulation at transcriptional and post-transcriptional levels. It also advances our ability to functionally annotate hundreds of RBPs and their RNA binding sites across tissues in the human genome, which helps in decoding the role of RBPs in the context of disease phenotypes, networks, and pathways. The approaches developed in this dissertation are scalable and adaptable to further investigation of tissue-specific regulators in any biological system. Overall, this study contributes toward accelerating progress in molecular diagnostics and drug target identification using regulatory network analysis methods in disease and pathophysiology.
588

The effect of blockchain technology on the South African banking environment

Gray, Jared January 2018 (has links)
A research article submitted to the Faculty of Commerce, Law and Management, University of the Witwatersrand, in partial fulfilment of the requirements for the degree of Master of Business Administration, Johannesburg, 2018 / Blockchain technology is a foundational technology with various use cases that can significantly impact the manner in which banking is carried out in South Africa. This paper seeks to put together a framework for understanding the potential effect of blockchain technology on the South African banking environment, with a specific focus on how blockchain technology will impact the South African banking environment (i.e., the applications and use cases) and when this impact will take place. A qualitative approach to addressing the problem statement was adopted, specifically in the form of focus interviews and strategic discussions with subject matter experts in both the blockchain and South African banking environments. Findings indicate that there are a number of blockchain applications that can impact the South African banking environment, namely private digital ledgers, smart contracts, and tokens/cryptocurrencies. Further, research indicates that the first is most likely to be adopted in the short term, while the latter two applications are subject to a high level of stakeholder coordination, a high level of effort in educating the end customer, and a high level of friction from existing systems and processes, and will therefore only realise mass adoption in the long term. As a result, this research contributes an initial view of which applications are most likely to be adopted by South African banks and can form the foundation for further research in this area. / E.K. 2019
589

An Analysis of Generational Caching Implemented in a Production Website

Zych, Marc E 01 June 2013 (has links) (PDF)
Website scaling has been an issue since the inception of the web. The demand for user-generated content and personalized web pages requires the use of a database for a storage engine. Unfortunately, scaling the database to handle large amounts of traffic is still a problem many companies face. One such company is iFixit, a provider of free, publicly-editable, online repair manuals. Like many websites, iFixit uses Memcached to decrease database load and improve response time. However, the caching strategy used is a very ad hoc one and therefore can be greatly improved. Most research regarding web application caching focuses on cache invalidation, the process of keeping cached content consistent with the permanent data store. Generational caching is a technique that involves including the object’s last modified date in the cache key. This ensures that cache keys change whenever the object is modified, which effectively invalidates all relevant cache entries. Fine-grained caching is now very simple because the developer doesn’t need to explicitly delete all possible cache keys that might need to be invalidated when an object is modified. This is particularly useful for caching arbitrary fragments of HTML without increasing the complexity of cache invalidation. In this work, we describe the process of implementing a caching strategy based on generational caching concepts in iFixit’s website. Our implementation yielded a 20% improvement in page response time by caching fragments of HTML and results of database queries.
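
A minimal sketch of generational caching, using a plain dictionary in place of a Memcached client; the Guide class and key scheme are hypothetical illustrations of the technique, not iFixit's actual code:

    # Hypothetical sketch: the object's last-modified timestamp is part of the
    # cache key, so editing the object changes the key and implicitly
    # invalidates every cached fragment derived from the old version.
    import time

    cache = {}  # stand-in for a Memcached client

    class Guide:
        def __init__(self, guide_id):
            self.id = guide_id
            self.modified = time.time()  # bumped on every write

        def cache_key(self, fragment):
            return f"guide:{self.id}:{int(self.modified)}:{fragment}"

    def render_summary(guide):
        key = guide.cache_key("summary_html")
        if key not in cache:
            cache[key] = f"<div>Guide {guide.id}</div>"  # expensive render
        return cache[key]

    g = Guide(42)
    render_summary(g)              # miss: renders and caches
    render_summary(g)              # hit: served from cache
    g.modified = time.time() + 1   # edit -> new key; old entry is orphaned
    render_summary(g)              # miss again: rendered fresh

No explicit deletion ever happens; stale entries are simply never requested again and age out of the cache.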
590

Geospatial Data Modeling to Support Energy Pipeline Integrity Management

Wylie, Austin 01 June 2015 (has links) (PDF)
Several hundred thousand miles of energy pipelines span the whole of North America -- responsible for carrying the natural gas and liquid petroleum that power the continent's homes and economies. These pipelines, so crucial to everyday goings-on, are closely monitored by various operating companies to ensure they perform safely and smoothly. Happenings like earthquakes, erosion, and extreme weather, however -- and human factors like vehicle traffic and construction -- all pose threats to pipeline integrity. As such, there is a tremendous need to measure and indicate useful, actionable data for each region of interest, and operators often use computer-based decision support systems (DSS) to analyze and allocate resources for active and potential hazards. We designed and implemented a geospatial data service, REST API for Pipeline Integrity Data (RAPID), to improve the amount and quality of data available to DSS. More specifically, RAPID -- built with a spatial database and the Django web framework -- allows third-party software to manage and query an arbitrary number of geographic data sources through one centralized REST API. Here, we focus on the process and peculiarities of creating RAPID's model and query interface for pipeline integrity management; this contribution describes the design, implementation, and validation of that model, which builds on existing geospatial standards.
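
As a hedged sketch of the kind of model and query endpoint such a service might expose, assuming GeoDjango over a spatial database; the PipelineSegment model, its fields, and the bounding-box endpoint are illustrative guesses, not RAPID's actual schema or API:

    # Hypothetical sketch (lives inside a Django app with django.contrib.gis).
    from django.contrib.gis.db import models
    from django.contrib.gis.geos import Polygon
    from django.http import JsonResponse

    class PipelineSegment(models.Model):
        name = models.CharField(max_length=100)
        operator = models.CharField(max_length=100)
        path = models.LineStringField(srid=4326)  # geometry stored in the spatial DB

    def segments_in_bbox(request):
        """GET /segments?bbox=minx,miny,maxx,maxy -> segments crossing the box."""
        bbox = [float(v) for v in request.GET["bbox"].split(",")]
        region = Polygon.from_bbox(bbox)  # GEOS polygon for the query window
        qs = PipelineSegment.objects.filter(path__intersects=region)
        return JsonResponse({"segments": [s.name for s in qs]})

A DSS could then pull hazard-relevant geometry for any region of interest with a single HTTP GET, regardless of which underlying data source supplied it.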
