  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
321

Feasibility study to aid management's decision on a computer

Sylvester, Robert A. January 1965 (has links)
Thesis (M.B.A.)--Boston University / PLEASE NOTE: Boston University Libraries did not receive an Authorization To Manage form for this thesis or dissertation. It is therefore not openly accessible, though it may be available by request. If you are the author or principal advisor of this work and would like to request open access for it, please contact us at open-help@bu.edu. Thank you. / 2031-01-01
322

Outsourcing of IT Services: Studies on Diffusion and New Theoretical Perspectives

January 2012 (has links)
abstract: Information technology (IT) outsourcing, including foreign or offshore outsourcing, has been steadily growing over the last two decades. This growth in IT outsourcing has led to the development of different hubs of services across nations, and has resulted in increased competition among service providers. Firms have been using IT outsourcing to not only leverage advanced technologies and services at lower costs, but also to maintain their competitive edge and grow. Furthermore, as prior studies have shown, there are systematic differences among industries in terms of the degree and impact of IT outsourcing. This dissertation uses a three-study approach to investigate issues related to IT outsourcing at the macro and micro levels, and provides different perspectives for understanding the issues associated with IT outsourcing at a firm and industry level. The first study evaluates the diffusion patterns of IT outsourcing across industries at an aggregate level and within industries at a firm level. In addition, it analyzes the factors that influence the diffusion of IT outsourcing and tests models that help us understand the rate and patterns of diffusion at the industry level. This study establishes the presence of hierarchical contagion effects in the diffusion of IT outsourcing. The second study explores the role of location and proximity of industries to understand the diffusion patterns of IT outsourcing within clusters using the spatial analysis technique of space-time clustering. It establishes the presence of simultaneous space and time interactions at the global level in the diffusion of IT outsourcing. The third study examines the development of specialized hubs for IT outsourcing services in four developing economies: Brazil, Russia, India, and China (BRIC). In this study, I adopt a theory-building approach involving the identification of explanatory anomalies, and propose a new hybrid theory called knowledge network theory.
The proposed theory suggests that the growth and development of the IT and related services sector is a result of close interactions among adaptive institutions. It is also based on new knowledge that is created, and which flows through a country's national diaspora of expatriate entrepreneurs, technologists and business leaders. In addition, relevant economic history and regional geography factors are important. This view diverges from the traditional view, wherein effective institutions are considered to be the key determinants of long-term economic growth. / Dissertation/Thesis / Ph.D. Business Administration 2012
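The contagion-driven diffusion the first study describes can be illustrated with a small sketch. This is a generic Bass diffusion simulation, not the dissertation's actual model; the coefficients p (external influence) and q (imitation, i.e. contagion) and the number of periods are illustrative values only.

```python
def bass_adoption(p, q, periods):
    """Simulate cumulative adoption share F(t) under the Bass diffusion model.

    p: coefficient of innovation (external influence)
    q: coefficient of imitation (internal/contagion influence)
    Returns a list of cumulative adoption fractions, one per period.
    """
    F = 0.0
    path = []
    for _ in range(periods):
        # Hazard of adoption rises with the installed base (contagion effect).
        new_adopters = (p + q * F) * (1.0 - F)
        F = min(1.0, F + new_adopters)
        path.append(F)
    return path

# Illustrative coefficients; with q much larger than p the curve is S-shaped,
# the signature of contagion-dominated diffusion.
curve = bass_adoption(p=0.03, q=0.38, periods=25)
```

With imitation dominating innovation, adoption starts slowly, accelerates as the installed base grows, and saturates, the qualitative pattern a contagion-effect test looks for.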
323

The Impact of the Evidence-Based Clinical Decision Support Resource "UpToDate" on the Speed and Accuracy of Determining Drug-Drug Interactions in a Dental Setting: A Randomized Crossover Controlled Pilot Trial

Dragan, Irina F. 09 August 2018 (has links)
Aim & Hypothesis: The aim of the study was to compare the time dental students need to answer questions about drug-drug interactions (DDIs) when using the Evidence-Based Clinical Decision Support Resource (EBCDSR) UpToDate® to retrieve patient-critical information versus general internet access, during a preclinical session. We hypothesized that the dental students using UpToDate® would take less time to identify the correct DDIs and obtain higher examination scores than the group with only internet access.
Materials & Methods: The study design was a randomized, blinded, crossover controlled pilot in which each subject examined four computer-based virtual cases over two study visits. In the first visit, one group assessed two cases presented in axiUm (Tufts University School of Dental Medicine's electronic health record system) with UpToDate® access, while the other group assessed the other two cases using their own electronic resources without UpToDate® access, and determined the DDIs. At the second visit, after a ten-day wash-out period, the groups crossed over. Each case was followed by three questions on drug-drug interactions, focusing on the use of antibiotics, analgesics, and local anesthetics. The mean duration of each subject's sessions was captured and calculated. Chi-square tests were used for the statistical analysis of the examination scores. All statistical analyses were performed using SAS Version 9.2 (SAS Institute, Cary, NC).
Results: A total of 50 dental students presented for the first study visit and 44 for the second. The third-year dental students using UpToDate® took a similar amount of time to identify the correct DDIs compared with those with only internet access (p = 0.429). Both groups obtained similar examination scores for all questions related to antibiotics (p = 0.797), analgesics (p = 0.850), and local anesthetics (p = 0.850).
Conclusions: The current study has shown that UpToDate® can provide answers to clinical questions at the point of care in a timely manner, with a high level of student satisfaction. Future studies might involve a more seamless entry into EBCDSRs using "Infobutton" in the Electronic Health Record (EHR).
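The chi-square comparison of examination scores described above can be sketched in Python (the study itself used SAS 9.2). This is a minimal 2x2 illustration; the counts below are invented, not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table:
        [[a, b],   e.g. [[correct_with_tool, incorrect_with_tool],
         [c, d]]         [correct_without,   incorrect_without]]
    """
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for observed, row, col in ((a, row1, col1), (b, row1, col2),
                               (c, row2, col1), (d, row2, col2)):
        expected = row * col / n  # expected count under independence
        stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: correct/incorrect answers with and without the resource.
stat = chi_square_2x2(40, 10, 38, 12)
# At alpha = 0.05 with 1 degree of freedom the critical value is 3.841;
# a statistic below it means no significant difference between the groups.
significant = stat > 3.841
```

With these made-up counts the statistic is small and the difference non-significant, mirroring the pattern of results the study reports.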
324

The Impact of Mindfulness on Non-malicious Spillage within Images on Social Networking Sites

Landress, Angela D. 14 August 2018 (has links)
Insider threat by employees in organizations is a problematic issue in today's fast-paced, internet-driven society. Gone are the days when securing the perimeter of one's network protected the business. Security threats are now mobile, and employees can share sensitive business data with hundreds of people instantaneously from mobile devices. While prior research has addressed social networking topics such as trust in relation to information systems, the use of social networking sites, social networking security, and social networking sharing, there is a lack of research on the mindfulness of users who spill sensitive data contained within images posted on social networking sites (SNS). The author seeks to provide an understanding of how non-malicious spillage through images relates to the mindfulness of employees, who are also deemed insiders. Specifically, the study explores the relationships between the following variables: mindfulness, proprietary information spillage, and spillage of personally identifiable information (PII). A quasi-experimental study was designed, correlational in nature, with individuals as the unit of analysis. A sample population of business managers with SNS accounts was studied. A series of video vignettes was used to measure mindfulness, and surveys were used to collect and analyze data. There was a positive correlation between non-malicious spillage of sensitive business data, both personally identifiable information and proprietary data, and a lack of mindfulness.
325

Analysis and Detection of the Silent Thieves

Perez, Jon 13 September 2018 (has links)
As the cryptocurrency market becomes more lucrative and accessible, cybercriminals will continue to adapt strategies to monetize the unauthorized use of system resources for mining operations. Some of these strategies involve infecting systems with malware that deploys a cryptomining application; others involve delivering code to a target's web browser that causes the browser to perform mining operations. This research examines existing cryptomining malware, commonalities in targeting and infection vectors, techniques used by cryptomining malware, and distinguishable differences between legitimate and malicious use.
The research found that cybercriminals employing cryptomining malware attack targets indiscriminately. Additionally, the techniques employed by cryptomining malware are also used by other types of malware. The research tested the impact of cryptomining applications on CPU utilization and showed a clear distinction when comparing the CPU utilization of cryptomining applications to that of common applications on a desktop PC. The research also found that distinguishing between authorized and unauthorized cryptomining relied heavily on a holistic examination of the system in question.
The research synthesized existing literature and the results of the CPU testing to recommend two strategies for detecting malicious cryptomining activity. The optimal strategy involves endpoint, network, and CPU monitoring together with the ability to aggregate and correlate the events or alerts produced. A less optimal strategy involves multiple event sources with manual or no correlation, or a single event source.
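The CPU-utilization distinction the research reports might be operationalized with a simple heuristic like the following sketch. The threshold, sampling scheme, and sample traces are assumptions for illustration, not the study's actual method or data.

```python
def sustained_high_cpu(samples, threshold=90.0, min_fraction=0.9):
    """Flag a CPU-utilization trace (percent per sample) as cryptomining-like.

    Cryptominers tend to pin cores near 100% continuously, while typical
    desktop applications are bursty. This heuristic flags a trace when at
    least `min_fraction` of its samples exceed `threshold` percent.
    """
    if not samples:
        return False
    high = sum(1 for s in samples if s >= threshold)
    return high / len(samples) >= min_fraction

miner_like = [97, 99, 98, 100, 99, 98, 97, 99]   # sustained near-max load
desktop_like = [12, 85, 30, 5, 60, 8, 95, 20]    # bursty everyday use

flagged = sustained_high_cpu(miner_like)      # sustained load is flagged
clean = sustained_high_cpu(desktop_like)      # bursty load is not
```

As the research notes, a heuristic like this alone cannot separate authorized from unauthorized mining; it is one signal to correlate with endpoint and network events.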
326

Probabilistic Clustering Ensemble Evaluation for Intrusion Detection

McElwee, Steven M. 18 August 2018 (has links)
Intrusion detection is the practice of examining information from computers and networks to identify cyberattacks. It is an important topic in practice, since the frequency and consequences of cyberattacks continue to increase and affect organizations. It is important for research, since many problems remain for intrusion detection systems. Intrusion detection systems monitor large volumes of data and frequently generate false positives, which means additional effort for the security analysts who review and interpret alerts. After long hours spent reviewing alerts, security analysts become fatigued and make bad decisions. There is currently no approach to intrusion detection that reduces the workload of human analysts by providing a probabilistic prediction that a computer is experiencing a cyberattack.
This research addressed this problem by estimating the probability that a computer system was being attacked, rather than alerting on individual events. It combined concepts from cyber situation awareness by applying clustering ensembles, probability analysis, and active learning. The unique contribution of this research is that it provides a higher level of meaning for intrusion alerts than traditional approaches.
Three experiments were conducted in the course of this research to demonstrate the feasibility of these concepts. The first evaluated cluster generation approaches that provided multiple perspectives of network events using unsupervised machine learning. The second developed and evaluated a method for detecting anomalies from the clustering results; this experiment also determined the probability that a computer system was being attacked. Finally, the third integrated active learning into the anomaly detection results and evaluated its effectiveness in improving accuracy.
This research demonstrated that clustering ensembles with probabilistic analysis were effective for identifying normal events. Abnormal events remained uncertain and were assigned a belief. By aggregating the belief to find the probability that a computer system was under attack, the resulting probability was highly accurate for source IP addresses and reasonably accurate for destination IP addresses. Active learning, which simulated feedback from a human analyst, eliminated the residual error for the destination IP addresses with only a small number of events requiring labeling.
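The belief-aggregation step described above might look something like the following sketch, which combines per-event anomaly beliefs into one host-level attack probability with a noisy-OR rule. The noisy-OR choice is an assumption for illustration, not necessarily the dissertation's exact aggregation.

```python
def host_attack_probability(event_beliefs):
    """Aggregate per-event anomaly beliefs into one host-level probability.

    Uses a noisy-OR combination: the host is considered "under attack"
    unless every anomalous event is independently a false alarm, so many
    weak signals can still add up to high confidence.
    """
    p_all_benign = 1.0
    for belief in event_beliefs:
        p_all_benign *= (1.0 - belief)
    return 1.0 - p_all_benign

# Four weakly-anomalous events attributed to one source IP; individually
# uncertain, together they yield a fairly confident attack probability.
p = host_attack_probability([0.3, 0.4, 0.5, 0.2])
```

This illustrates the abstract's key idea: reasoning about the probability that a system is under attack, rather than alerting on each individual event.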
327

Success Factors of Implementing Enterprise Resource Planning Systems in North American Organizations

Alghamdi, Mazen 05 October 2018 (has links)
Enterprise Resource Planning (ERP) is a single suite of software applications, covering functions such as finance, sales, and human resources, used to integrate business functions into one computer system so that different systems can work together. This quantitative correlational study determines to what extent, if any, the critical success factors (independent variables, IVs) correlate with the successful implementation of ERP systems (dependent variable, DV) in the Western region of the United States (specifically Washington, Oregon, and California). The IVs are the critical success factors (CSFs): clear goals and objectives, top management support, business process re-engineering, use of consultants, effective communication, ERP vendor selection, ERP customization, ERP vendor support, and user training. The DV is the successful implementation of ERP. The study was designed to predict successful ERP system implementation using various technical and managerial constructs, controlling for other demographics, in a sample of Information Technology (IT) leaders working in Washington, Oregon, and California. The population of this study included 90 current IT leaders from the Western region of the United States, including Chief Information Officers (CIOs), project managers, consultants, and developers. According to the correlation results, none of the subscales was a significant predictor of successful ERP implementation, but four of the five technical success factors (ERP package selection, ERP customization, vendor support, and user training) had a moderate effect in increasing the likelihood of successful implementation.
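A correlation analysis of the kind the study performed can be sketched with a plain Pearson coefficient between one CSF's survey scores and rated implementation success. The responses below are invented for illustration; the study's actual instrument and data are not reproduced here.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 1-5 Likert responses: user-training scores vs. rated
# implementation success for a handful of respondents.
training = [2, 3, 3, 4, 5, 5]
success  = [2, 2, 3, 4, 4, 5]
r = pearson_r(training, success)
```

In a full analysis each CSF subscale would be correlated with the success measure (and entered into a regression) to judge which factors predict successful implementation.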
328

Informed Technology Adoption Decisions Based on Innovation-Related Factors

Hsieh, David A. 29 December 2018 (has links)
The timely adoption of technology, that is, organizations making the right investment or divestment decisions, can be supported by a multicriteria decision-making approach that integrates established innovation theories, industry best practices in the technology acquisition lifecycle, statistical analysis of available technology profiles, expert opinion, and trend analysis. This research aimed to develop an analytical approach to assess the correlation among objective data (such as innovation maturity rating and market penetration) and subjective data (such as benefit rating and "time to plateau") to give organizations insight into technology adoption decisions. The objective of this study is not to study Gartner's Hype Cycles but to use the longitudinal technology innovation profile data as factors for informed technology adoption decisions. We combined mapping to the Department of Defense Technology Readiness Level (TRL), statistical analysis, correlations, multiple regression analysis, and trend analysis to provide an objective and quantifiable methodology that gives insight into the characteristics of innovations. The goal is to derive a logical and balanced approach to organizations' decision making based on analysis of objective data (the technology maturity rating and market survey) and subjective data (expert opinion in benefit rating and time-to-plateau predictions). We used Rogers' concept of "Diffusion of Innovation" as a notional reference for organizational technology adoption to conduct a statistical analysis of a selected set of 345 Gartner technology profiles from 2009 to 2015, with market penetration data as a proxy for technology acceptance. To ensure fit for purpose, we compared Gartner's definition of technology maturity with that of the Department of Defense TRL.
The trending data on market penetration, maturity rating, benefit rating, and time to technology plateau showed that a 2nd-order polynomial model provided the best statistical goodness of fit in all cases. We discuss the non-linear nature of the data and the need for a more predictive association of technological maturity with organizational adoption. Further empirical approaches with traditional analysis, machine learning, or artificial intelligence would allow researchers to test, explore, and better understand the diffusion of innovation first pioneered by Rogers, Moore, and Bass.
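A 2nd-order polynomial fit of the kind the study found best can be sketched with ordinary least squares via the normal equations. The market-penetration readings below are invented for illustration; this is not the study's data or code.

```python
def fit_quadratic(ts, ys):
    """Least-squares fit of y = c0 + c1*t + c2*t**2 via normal equations."""
    # Build X^T X and X^T y for the design matrix X = [1, t, t^2]:
    # entry (i, j) of X^T X is the power sum of t^(i+j).
    s = [sum(t ** k for t in ts) for k in range(5)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = [sum(y * t ** k for t, y in zip(ts, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for i in range(3):
        pivot = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[pivot] = A[pivot], A[i]
        b[i], b[pivot] = b[pivot], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    # Back substitution.
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coeffs[i] = (b[i] - sum(A[i][c] * coeffs[c]
                                for c in range(i + 1, 3))) / A[i][i]
    return coeffs  # [c0, c1, c2]

# Hypothetical market-penetration readings (percent) over seven survey years.
years = [0, 1, 2, 3, 4, 5, 6]
penetration = [1.0, 2.1, 4.9, 9.1, 16.2, 24.8, 36.1]
c0, c1, c2 = fit_quadratic(years, penetration)
```

Comparing the residuals of this quadratic against linear and higher-order fits is one way to arrive at the goodness-of-fit conclusion the abstract describes.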
329

Evaluating Applications of a Telemedicine Taxonomy on the Classification of Research

January 2015 (has links)
abstract: By offering increased access to medical care, telemedicine offers significant opportunity for the process of development under Amartya Sen's definition that development is freedom, including freedom from illness, early death, and preventable disease. It advances development by freeing people from these burdens. However, as with many emerging technologies, organizing information and understanding the field face significant challenges. This paper applies Bashshur's three-dimensional model of telemedicine to the classification of telemedicine literature found in databases to assess the value of the model as a tool for classification. By standardizing language and creating a centralized repository of research done to date, the field can better understand how it is progressing and where work still needs to be done. This paper aims to determine whether Bashshur's model serves well for this task. / Dissertation/Thesis / Masters Thesis Global Technology and Development 2015
330

From Understanding Telephone Scams to Implementing Authenticated Caller ID Transmission

January 2017 (has links)
abstract: The telephone network is used by almost every person in the modern world, and with the rise of Internet access to the PSTN, it is rife with telephone spam and scams. Unlike email spam, spam calls demand immediate attention; they are not only significant annoyances for telephone users but also the source of significant financial losses in the economy. According to complaint data from the FTC, complaints about illegal calls have reached record numbers in recent years, and Americans lose billions to fraud due to malicious telephone communication, despite various efforts to subdue telephone spam, scams, and robocalls. In this dissertation, a study of what causes users to fall victim to telephone scams is presented, and it demonstrates that impersonation is at the heart of the problem. Most solutions today rely primarily on gathering offending caller IDs; however, they do not work effectively when the caller ID has been spoofed. Due to the lack of authentication in the PSTN caller ID transmission scheme, fraudsters can manipulate the caller ID to impersonate a trusted entity and further a variety of scams. To address this fundamental problem, a novel architecture and method to authenticate the transmission of the caller ID is proposed. The solution enables a security indicator that can provide an early warning to help users stay vigilant against telephone impersonation scams, as well as a foundation for existing and future defenses to stop unwanted telephone communication based on caller ID information. / Dissertation/Thesis / Doctoral Dissertation Computer Science 2017
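The authenticated caller ID transmission the dissertation proposes can be illustrated at a very high level with a message-authentication sketch: bind the asserted caller ID to a timestamp and verify the binding before trusting it. This HMAC/shared-key scheme is a simplification of the general idea, not the dissertation's architecture (a deployable design would need a public-key infrastructure rather than a shared secret); all names and values are illustrative.

```python
import hmac
import hashlib
import time

SHARED_KEY = b"demo-key"  # stand-in; a real scheme would use PKI, not a shared secret

def sign_caller_id(caller_id: str, timestamp: int) -> str:
    """Produce an authentication tag binding the caller ID to a timestamp."""
    msg = f"{caller_id}|{timestamp}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def verify_caller_id(caller_id: str, timestamp: int, tag: str,
                     now: int, max_age: int = 30) -> bool:
    """Check the tag and reject stale assertions (limits replay attacks)."""
    if now - timestamp > max_age:
        return False
    expected = sign_caller_id(caller_id, timestamp)
    return hmac.compare_digest(expected, tag)

# A verifier could drive a security indicator: accept the displayed caller ID
# only when the tag checks out and is fresh.
t = int(time.time())
tag = sign_caller_id("+15551234567", t)
genuine = verify_caller_id("+15551234567", t, tag, now=t)
```

A spoofed number or a replayed old assertion fails verification, which is exactly the early-warning signal the proposed security indicator would surface to the user.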
