
Navigating the Data Stream - Enhancing Inbound Logistics Processes through Big Data Analytics : A Study of Information Processing Capabilities facilitating Information Utilisation in Warehouse Resource Planning

Zuber, Johannes, Hahnewald, Anton January 2024 (has links)
Background: Nowadays an ever-increasing amount of data is generated, which is why companies face the challenge of extracting valuable information from these data streams. Enhanced Information Utilisation carries the opportunity for improved decision-making. This could address challenges that come with delayed trucks in inbound logistics and the associated warehouse resource planning. Purpose: This study aims to deepen the understanding of Big Data Analytics capabilities that foster Information Integration and decision support to facilitate Information Utilisation. We apply this to the context of warehouse resource replanning in inbound logistics in case of unexpected short-term deviations. Method: We conducted a qualitative research study, combining a Grounded Theory approach with abductive reasoning. Building on a literature review, we adapted an existing framework and proposed our own conceptual framework after conducting and analysing 14 semi-structured interviews with inbound logistics practitioners and experts. Conclusion: We identified four interconnected capabilities that facilitate Information Utilisation. Data Generation Capabilities and Data Integration & Management Capabilities contribute to improved Information Integration, establishing a base for subsequent data analytics. Consequently, Data Analytics Capabilities and Data Interpretation Capabilities lead to enhanced decision support, facilitating Information Utilisation.
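The warehouse resource replanning scenario described in this abstract — reacting to short-term deviations such as delayed trucks — can be illustrated with a toy staff-reassignment routine. This is a sketch of the general idea only, not the thesis's framework; the truck IDs, hours, and staffing ratio are invented:

```python
def replan_dock_staff(arrivals, delays, workers_per_truck=2):
    """Recompute dock staffing per hour slot after delay notifications.

    arrivals: {truck_id: scheduled_hour}
    delays:   {truck_id: hours_late}  (trucks not listed are on time)
    """
    staffing = {}
    for truck, hour in arrivals.items():
        actual = hour + delays.get(truck, 0)  # shift delayed trucks
        staffing[actual] = staffing.get(actual, 0) + workers_per_truck
    return staffing

# Truck T2 is reported two hours late: its crew moves from the
# 08:00 slot to the 10:00 slot.
plan = replan_dock_staff({"T1": 8, "T2": 8, "T3": 9}, {"T2": 2})
```

In this sketch, each delay notification simply shifts the truck's staffing demand to a later hour slot; real resource planning would also account for worker availability and shift boundaries.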

Användningen av Industri 4.0-teknologier i inköpsprocessen : En systematisk litteraturgenomgång / The use of Industry 4.0 technologies in the procurement process : A systematic literature review

Abdullah, Rahaf, Nedeva, Mirjana January 2024 (has links)
Abstract: Industry 4.0 signifies the latest phase of industrialization, focusing on digitalization, automation, and data exchange. The fourth industrial revolution builds on earlier industrial revolutions and makes use of technologies such as IoT, Big Data Analytics, and Cloud Services to increase productivity and competitiveness. In the procurement process, Industry 4.0 implies a need to streamline and achieve flexible processes to meet growing demands for innovation and quality. This work aims to identify and analyze the use of Industry 4.0 technologies in the procurement process to improve efficiency. The research questions focus on which technologies are used in the procurement process, how they are used, and their role in improving efficiency; the scope is limited to the procurement process within supply chain management, using a systematic literature review as the method. The use of IoT, CPS, Big Data Analytics, Blockchain, and Cloud Services automates and streamlines procurement activities. IoT is used for real-time monitoring and automated inventory replenishment. CPS coordinates digital and physical components to improve traceability and decision-making. Big Data Analytics forecasts demand and evaluates supplier performance. Blockchain ensures transparent and secure transactions with the help of smart contracts. Cloud Services manage real-time data and improve communication and collaboration in the supply chain. These technologies play a central role in making the procurement process more efficient: manual inventories decrease, supplier negotiations become more efficient, and demand forecasts, traceability, and decision-making improve. This, in turn, increases transparency, secures transactions, and facilitates strategic decisions, leading to improved efficiency, resource management, and competitiveness.
Purpose: The purpose of the work is to compile information on Industry 4.0 technologies using a systematic literature review, to identify the use of these technologies in the procurement process, and to analyze what role they play in improving efficiency. Methodology: A systematic literature review was conducted to gather relevant data, which in turn was used to analyze and answer the research questions.
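The automated inventory replenishment that this abstract attributes to IoT can be illustrated with a minimal reorder-point policy: when a sensed stock level falls to or below a threshold, an order tops the stock back up to a target level. This is a generic sketch, not taken from the thesis; the thresholds and quantities are invented:

```python
def replenishment_order(stock_level, reorder_point, order_up_to):
    """Reorder-point policy: when the sensed stock level falls to or
    below the reorder point, order enough to reach the target level."""
    if stock_level <= reorder_point:
        return order_up_to - stock_level  # quantity to order now
    return 0  # stock is sufficient, no order triggered

# A sensor reading of 5 units against a reorder point of 10 triggers
# an order of 45 units, restoring the stock to 50.
qty = replenishment_order(5, reorder_point=10, order_up_to=50)
```

In an IoT setting, `stock_level` would come from a shelf or bin sensor feed rather than a manual count, which is what removes the manual inventory step the abstract mentions.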

Are HiPPOs losing power in organizational decision-making? : An exploratory study on the adoption of Big Data Analytics

Moquist Sundh, Ellinor January 2019 (has links)
Background: In the past decades, big data (BD) has become a buzzword associated with opportunities for gaining competitive advantage and enhanced business performance. However, data in a vacuum is not valuable; its value can be harnessed when it is used to drive decision-making. Consequently, big data analytics (BDA) is required to generate insights from BD. Nevertheless, many companies struggle to adopt BDA and create value: organizations need to deal with the hard work necessary to benefit from analytics initiatives. Therefore, businesses need to understand how they can effectively manage the adoption of BDA to reach decision-making quality. The study answers the following research questions: What factors could influence the adoption of BDA in decision-making? How can the adoption of BDA affect the quality of decision-making? Purpose: The purpose of this study is to explore the opportunities and challenges of adopting big data analytics in organizational decision-making. Method: Data is collected through interviews based on a theoretical framework. The empirical findings are deductively and inductively analysed to answer the research questions. Conclusion: To harness value from BDA, companies need to deal with several challenges and develop capabilities that lead to decision-making quality. The major challenges of BDA adoption are talent management, leadership focus, organizational culture, technology management, regulation compliance and strategy alignment. Companies should aim to develop capabilities regarding knowledge exchange, collaboration, process integration, routinization, flexible infrastructure, big data source quality and decision maker quality. Potential opportunities generated by the adoption of BDA, leading to improved decision-making quality, are automated decision-making, predictive analytics and more confident decision makers.

How Big Data Analytics are perceived as a driver for Competitive Advantage : A qualitative study on food retailers

Galletti, Alessandro, Papadimitriou, Dimitra-Christina January 2013 (has links)
The recent explosion of digital data has led the business world into a new era of more evidence-based decision-making. Companies nowadays collect, store, and analyze huge amounts of data, and terms such as Big Data Analytics are used to describe those practices. This paper investigates how Big Data Analytics (BDA) can be perceived and used as a driver of companies' Competitive Advantage (CA). It thus contributes to the debate about the potential role of IT assets as a source of CA through a Resource-Based View approach, by introducing a new phenomenon, BDA, into that traditional theoretical background. A conceptual model developed by Wade and Nevo (2010) is used as guidance, in which the synergy developed between IT assets and other organizational resources is seen as crucial to creating such a CA. We focus on the Food Retail industry and specifically investigate two case studies, ICA Sverige AB and Masoutis S.A. The evidence shows that, although the process is at an embryonic stage, the companies perceive the implementation of BDA as a key driver for the creation of CA. Efforts are being made to develop a successful implementation of BDA within the company as a strategic tool for several departments; however, some hurdles have been spotted that might impede that practice.

Elektroninės komercijos, naudojant didžiuosius duomenis, rinkodaros modelis / Big Data Driven E-commerce marketing model

Milišauskas, Paulius 18 February 2014 (has links)
This master's thesis begins with an analysis of e-commerce, traditional and digital marketing, and data analytics theories, and identifies the main problem areas in data analysis. It then presents the results of a survey of owners of small and medium-sized e-shops operating in Lithuania on digital marketing and the analysis of user-generated data. A theoretical marketing model is constructed and subsequently refined with insights from the survey results, and the main problem areas that were resolved are indicated. Based on the new marketing model, a business model is proposed. The thesis ends with conclusions. Structure: introduction, theory analysis, analytical research, project section, conclusions and suggestions, references. The thesis consists of 63 pages of text without appendixes, 9 figures, 3 tables, and 79 bibliographical entries. Appendixes are included separately.

Is Big data too Big for Swedish SMEs? : A quantitative study examining how the employees of small and medium-sized enterprises perceive Big data analytics

Danielsson, Lukas, Toss, Ronja January 2018 (has links)
Background: Marketing is evolving because of Big data, and there are many possibilities as well as challenges associated with Big data, especially for small and medium-sized enterprises (SMEs), which face barriers that prevent them from taking advantage of Big data. To analyze Big data, companies use Big data analytics, which helps them analyze large amounts of data. However, previous research is lacking with regard to how SMEs can implement Big data analytics and how Big data analytics are perceived by SMEs. Purpose: The purpose of this study is to investigate how the employees of Swedish SMEs perceive Big data analytics. Research Questions: How do employees of Swedish SMEs perceive Big data analytics in their current work environment? How do the barriers impact the perceptions of Big data analytics? Methodology: The research uses a quantitative cross-sectional design as the source of empirical data. To gather the data, a survey was administered to the employees of Swedish companies with fewer than 250 employees; these companies were regarded as SMEs. 139 respondents answered the survey, of which 93 answers could be used in the analysis. The data was analyzed using established theories, such as the Technology Acceptance Model (TAM). Findings: The research concluded that the employees had positive perceptions of Big data analytics. Further, two of the barriers analyzed (security and resources) impacted the perceptions of the employees, whereas privacy of personal data did not. Theoretical Implications: This study adds to the scarce Big data research and improves the understanding of Big data and Big data analytics, helping to fill the existing gap in the literature and providing a more comprehensive view of Big data. Limitations: The main limitation of the study is that previous literature has been vague and ambiguous and therefore may not be applicable.
Practical Implications: The study helps SMEs understand how to better implement Big data analytics and which barriers need to be prioritized regarding Big data analytics. Originality: To the best of the authors' knowledge, there is a significant lack of academic literature regarding Big data, Big data analytics, and Swedish SMEs; this study could therefore be one of the pioneering studies examining these topics and contribute significantly to current research.
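Survey analyses built on the Technology Acceptance Model, like the one described here, typically aggregate Likert items into per-construct scores before testing relationships. A minimal sketch with entirely hypothetical items and responses (the actual questionnaire and constructs of this study are not reproduced):

```python
import statistics

# Hypothetical Likert responses (1 = strongly disagree, 5 = strongly agree)
# grouped by TAM construct; item wording and values are invented.
responses = {
    "perceived_usefulness":  [4, 5, 4, 3],
    "perceived_ease_of_use": [3, 3, 4, 2],
}

# Average the items of each construct into a single construct score
construct_scores = {construct: statistics.mean(items)
                    for construct, items in responses.items()}
```

Construct scores like these would then feed into the tests of how barriers (security, resources, privacy) relate to perceptions.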

Big data analytics em cloud gaming: um estudo sobre o reconhecimento de padrões de jogadores / Big data analytics in cloud gaming: a study on the recognition of player patterns

Barros, Victor Perazzolo 06 February 2017 (has links)
The advances in Cloud Computing and communication technologies have made the concept of Cloud Gaming a reality. Through PCs, consoles, smartphones, tablets, smart TVs, and other devices, people can access and play games via data streaming, regardless of the computing power of these devices. The Internet is the fundamental means of communication between the device and the game, which is hosted and processed in an environment known as the Cloud. In the Cloud Gaming model, games are available on demand and offered at large scale to users. The players' actions and commands are sent to servers that process the information and send the result (reaction) back to the players. The volume of data processed and stored in these Cloud environments exceeds the limits of analysis and manipulation of conventional tools, but the data contains information about the players' profiles, singularities, actions, behavior, and patterns that can be valuable when analyzed. To properly understand this raw data and make it interpretable, it is necessary to use appropriate techniques and platforms to manipulate data at this scale. These platforms belong to an ecosystem built around the concepts of Big Data. The model known as Big Data Analytics is an effective way not only to work with these data but to understand their meaning, providing inputs for assertive analysis and predictive actions. This study seeks to understand how these technologies work and proposes a method capable of analyzing and identifying patterns in players' behavior and characteristics in a virtual environment. By knowing the patterns of different players, it is possible to group and compare information in order to optimize the user experience and revenue for developers, and to raise the level of control over the environment to the point where players' actions can be predicted. The results presented are based on different analysis models using the Hadoop technology combined with data visualization tools and information from open data sources, applied to a dataset of the game World of Warcraft. Fraud detection, users' play patterns, inputs for churn prevention, and relations with game attractiveness elements are examples of the modeling used. In this research, it was possible to map and identify the players' behavior patterns and to predict their frequency of play and their tendency to leave or stay in the game.
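The churn-prevention input mentioned in this abstract can be illustrated by flagging players whose most recent session is older than an inactivity cutoff. This is a toy sketch of the idea, not the thesis's Hadoop pipeline; the player IDs, dates, and 14-day cutoff are invented:

```python
from datetime import date, timedelta

def churn_risk(sessions, today, inactive_days=14):
    """Flag players whose most recent session is older than `inactive_days`.

    sessions: iterable of (player_id, session_date) pairs.
    """
    last_seen = {}
    for player, day in sessions:
        last_seen[player] = max(last_seen.get(player, day), day)
    return {player: (today - last) > timedelta(days=inactive_days)
            for player, last in last_seen.items()}

# p1 plays every other day all month; p2 stops after 3 January.
sessions = ([("p1", date(2024, 1, d)) for d in range(1, 28, 2)] +
            [("p2", date(2024, 1, d)) for d in (1, 2, 3)])
risk = churn_risk(sessions, today=date(2024, 1, 31))
```

Recency is only one signal; a fuller model would combine it with frequency and in-game behavior, which is closer to what the pattern-recognition modeling in the thesis aims at.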

Big Data Analytics for Fault Detection and its Application in Maintenance / Big Data Analytics för Feldetektering och Applicering inom Underhåll

Zhang, Liangwei January 2016 (has links)
Big Data analytics has attracted intense interest recently for its attempt to extract information, knowledge and wisdom from Big Data. In industry, with the development of sensor technology and Information & Communication Technologies (ICT), reams of high-dimensional, streaming, and nonlinear data are being collected and curated to support decision-making. The detection of faults in these data is an important application in eMaintenance solutions, as it can facilitate maintenance decision-making. Early discovery of system faults may ensure the reliability and safety of industrial systems and reduce the risk of unplanned breakdowns. Complexities in the data, including high dimensionality, fast-flowing data streams, and high nonlinearity, impose stringent challenges on fault detection applications. From the data modelling perspective, high dimensionality may cause the notorious “curse of dimensionality” and lead to deterioration in the accuracy of fault detection algorithms. Fast-flowing data streams require algorithms to give real-time or near real-time responses upon the arrival of new samples. High nonlinearity requires fault detection approaches to have sufficiently expressive power and to avoid overfitting or underfitting problems. Most existing fault detection approaches work in relatively low-dimensional spaces. Theoretical studies on high-dimensional fault detection mainly focus on detecting anomalies on subspace projections. However, these models are either arbitrary in selecting subspaces or computationally intensive. To meet the requirements of fast-flowing data streams, several strategies have been proposed to adapt existing models to an online mode to make them applicable in stream data mining. But few studies have simultaneously tackled the challenges associated with high dimensionality and data streams. 
Existing nonlinear fault detection approaches cannot provide satisfactory performance in terms of smoothness, effectiveness, robustness and interpretability. New approaches are needed to address this issue. This research develops an Angle-based Subspace Anomaly Detection (ABSAD) approach to fault detection in high-dimensional data. The efficacy of the approach is demonstrated in analytical studies and numerical illustrations. Based on the sliding window strategy, the approach is extended to an online mode to detect faults in high-dimensional data streams. Experiments on synthetic datasets show the online extension can adapt to the time-varying behaviour of the monitored system and, hence, is applicable to dynamic fault detection. To deal with highly nonlinear data, the research proposes an Adaptive Kernel Density-based (Adaptive-KD) anomaly detection approach. Numerical illustrations show the approach’s superiority in terms of smoothness, effectiveness and robustness.
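The sliding-window strategy described above — scoring each new sample against a window of recent samples, then sliding the window forward — can be sketched with a generic kernel-density detector. This illustrates the general idea only, not the thesis's ABSAD or Adaptive-KD algorithms; the window size, bandwidth, and threshold are arbitrary:

```python
import numpy as np

def kde_score(window, point, bandwidth=0.5):
    """Mean Gaussian kernel value of `point` against the window:
    a low score means the point lies in a low-density region."""
    d2 = ((window - point) ** 2).sum(axis=1)  # squared distances
    return np.exp(-d2 / (2 * bandwidth ** 2)).mean()

def sliding_window_detect(stream, window_size=200, threshold=1e-3):
    """Score each new sample against the most recent `window_size`
    samples, then slide the window forward by one."""
    flags = []
    for i in range(window_size, len(stream)):
        window = stream[i - window_size:i]
        flags.append(kde_score(window, stream[i]) < threshold)
    return flags

# Synthetic 2-D stream: 200 warm-up inliers, one far-away outlier,
# then more inliers.
rng = np.random.default_rng(0)
inliers = rng.normal(0.0, 0.3, size=(210, 2))
stream = np.vstack([inliers[:200], [[10.0, 10.0]], inliers[200:]])
flags = sliding_window_detect(stream)  # the first scored point is the outlier
```

Because the window slides, the detector adapts to slow drift in the monitored system, which is the property the online extension in the thesis relies on; the density estimator itself is just a stand-in here.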

‘Data over intuition’ – How big data analytics revolutionises the strategic decision-making processes in enterprises

Höcker, Filip, Brand, Finn January 2020 (has links)
Background: Digital technologies are increasingly transforming traditional businesses, and their pervasive impact is leading to a radical restructuring of entire industries. While the significance of generating competitive advantages for businesses utilizing big data analytics is recognized, there is still a lack of consensus on how big data analytics influences strategic decision-making in organisations. As big data and big data analytics become increasingly common, understanding the factors influencing decision-making quality becomes of paramount importance for businesses. Purpose: This thesis investigates how big data and big data analytics affect the operational strategic decision-making processes in enterprises through the theoretical lens of the strategy-as-practice framework. Method: The study follows an abductive research approach by testing a theory (i.e., strategy-as-practice) through a qualitative research design. A single case study of IKEA was conducted to generate the primary data for this thesis. Sampling was carried out internally at IKEA by first identifying the heads of the different departments within data analysis and then applying the snowball sampling technique to increase the number of interviewees and ensure the collection of enough data for coding. Findings: The findings show that big data analytics has a decisive influence on practitioners. At IKEA, data analysts have become an integral part of the operational strategic decision-making processes, and discussions are driven by data and rigor rather than by gut and intuition. In terms of practices, it became apparent that big data analytics has led to a more performance-oriented use of strategic tools and has enabled IKEA to make strategic decisions in real time, which not only increases agility but also mitigates the risk of wrong decisions.

Big data analytics capability and market performance: The roles of disruptive business models and competitive intensity

Olabode, Oluwaseun E., Boso, N., Hultman, M., Leonidou, C.N. 08 October 2021 (has links)
Research shows that big data analytics capability (BDAC) is a major determinant of firm performance. However, scant research has theoretically articulated and empirically tested the mechanisms and conditions under which BDAC influences performance. This study advances existing knowledge on the BDAC–performance relationship by drawing on the knowledge-based view and contingency theory to argue that how and when BDAC influences market performance is dependent on the intervening role of disruptive business models and the contingency role of competitive intensity. We empirically test this argument on primary data from 360 firms in the United Kingdom. The results show that disruptive business models partially mediate the positive effect of BDAC on market performance, and this indirect positive effect is strengthened when competitive intensity increases. These findings provide new perspectives on the business model processes and competitive conditions under which firms maximize marketplace value from investments in BDACs.
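A partial mediation like the one this abstract reports is typically assessed by estimating the path from the predictor to the mediator (a), the path from the mediator to the outcome controlling for the predictor (b), and their product as the indirect effect. A minimal sketch on simulated data — the coefficients and noise levels are invented, and only the sample size mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 360  # sample size mirrors the study; the data here are simulated

bdac = rng.normal(size=n)                                      # predictor
bm   = 0.5 * bdac + rng.normal(scale=0.5, size=n)              # mediator
perf = 0.3 * bdac + 0.6 * bm + rng.normal(scale=0.5, size=n)   # outcome

def ols(y, *xs):
    """Least-squares slope estimates (intercept fitted, then dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

(a,) = ols(bm, bdac)               # path a: BDAC -> business model
b, c_prime = ols(perf, bm, bdac)   # path b and direct effect c'
indirect = a * b                   # mediated portion of the BDAC effect
```

A nonzero `c_prime` alongside a nonzero `indirect` is the pattern of partial mediation; testing the moderation by competitive intensity would additionally require an interaction term, omitted here for brevity.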
