141 |
DataOps : Towards Understanding and Defining Data Analytics Approach. Mainali, Kiran. January 2020.
Data collection and analysis approaches have changed drastically in the past few years. The reason behind adopting a different approach is improved data availability and continuously changing analysis requirements. Data have always been there, but data management is vital nowadays due to the rapid generation and availability of data in various formats. Big data has opened the possibility of dealing with potentially infinite amounts of data in numerous formats in a short time. Data analytics is becoming complex due to data characteristics, sophisticated tools and technologies, changing business needs, varied interests among stakeholders, and the lack of a standardized process. DataOps is an emerging approach advocated by data practitioners to address the challenges in data analytics projects. Data analytics projects differ from software engineering in many aspects. DevOps has proven to be an efficient and practical approach to project delivery in the software industry. DataOps, however, is still in its infancy, only beginning to be recognized as an independent and essential task in data analytics. In this thesis, we examine DataOps as a methodology for implementing data pipelines by conducting a systematic search of research papers. As a result, we define DataOps, outlining ambiguities and challenges, and we explore how DataOps covers the different stages of the data lifecycle. We created comparison matrices of different tools and technologies, categorizing them into functional groups to demonstrate their usage in data lifecycle management. Following DataOps implementation guidelines, we implemented a data pipeline using Apache Airflow as the workflow orchestrator inside Docker and compared it with simple manual execution of a data analytics project.
According to the evaluation, the DataOps pipeline provided automation of task execution, orchestration of the execution environment, testing and monitoring, and communication and collaboration, and it reduced both the end-to-end product delivery cycle time and the pipeline execution time.
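The thesis implements its pipeline with Apache Airflow, which models a pipeline as a directed acyclic graph (DAG) of tasks and runs them in dependency order. A rough, dependency-free Python sketch of that orchestration idea follows; the task names and pipeline shape are illustrative assumptions, not the thesis's actual pipeline code:

```python
# Dependency-free sketch of what a workflow orchestrator such as Apache
# Airflow does: execute pipeline tasks in dependency order. Task names
# and the pipeline shape are illustrative, not taken from the thesis.
from graphlib import TopologicalSorter

def extract():    return "raw"
def validate(x):  return x           # data-quality checks would go here
def transform(x): return x.upper()   # stand-in for cleaning/reshaping
def load(x):      return f"loaded:{x}"

# Each task lists its upstream dependencies, much like Airflow's
# `t1 >> t2` operator chains.
dependencies = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
}

def run_pipeline():
    results = {}
    for task in TopologicalSorter(dependencies).static_order():
        if task == "extract":
            results[task] = extract()
        elif task == "validate":
            results[task] = validate(results["extract"])
        elif task == "transform":
            results[task] = transform(results["validate"])
        elif task == "load":
            results[task] = load(results["transform"])
    return results

print(run_pipeline()["load"])  # loaded:RAW
```

In Airflow itself each function would become an operator task and the scheduler, rather than a loop, would walk the DAG, which is what gives the automated execution, monitoring, and retries the evaluation credits to the DataOps pipeline.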
|
142 |
Development and evaluation of an interactive e-module on the Central Limit Theorem. Holovchenko, Anastasiia. 04 May 2023.
No description available.
|
143 |
Användningen av Industri 4.0-teknologier i inköpsprocessen : En systematisk litteraturgenomgång / The use of Industry 4.0 technologies in the procurement process : A systematic literature review. Abdullah, Rahaf; Nedeva, Mirjana. January 2024.
Abstract: Industry 4.0 signifies the latest phase of industrialization, focusing on digitalization, automation, and data exchange. Building on earlier industrial revolutions, it draws on technologies such as IoT, Big Data Analytics, and Cloud Services to increase productivity and competitiveness. Within the procurement process, Industry 4.0 implies a need for efficient and flexible processes to meet growing demands for innovation and quality. This work aims to identify and analyze the use of Industry 4.0 technologies in the procurement process to improve efficiency. The research questions focus on which technologies are used, how they are used, and their role in improving efficiency; the scope is limited to the procurement process within supply chain management, and a structured literature review is used as the method. The use of IoT, CPS, Big Data Analytics, Blockchain, and Cloud Services automates and streamlines procurement activities. IoT is used for real-time monitoring and automated inventory replenishment. CPS coordinates digital and physical components to improve traceability and decision-making. Big Data Analytics forecasts demand and evaluates supplier performance. Blockchain ensures transparent and secure transactions with the help of smart contracts. Cloud Services manage real-time data and improve communication and collaboration in the supply chain. These technologies play a central role in making the procurement process more efficient: manual inventories decrease, supplier negotiations become more efficient, and demand forecasting, traceability, and decision-making improve. This, in turn, increases transparency, secures transactions, and facilitates strategic decisions, leading to improved efficiency, resource management, and competitiveness.
Purpose: The purpose of the work is to compile information on Industry 4.0 technologies through a systematic literature review, to identify how these technologies are used in the procurement process, and to analyze what role they play in improving efficiency. Methodology: A systematic literature review was conducted to gather relevant data, which was then used to analyze and answer the research questions.
|
144 |
Evaluation of Health Data Warehousing: Development of a Framework and Assessment of Current Practices. Leenaerts, Marianne. 09 April 2015.
While the practitioner community has gathered knowledge in the area of health data warehousing evaluation, it rests mostly on anecdotal evidence rather than academic research. Isolated dimensions have received more attention and benefit from definitions and performance measures. However, very few cases in the literature describe how the technology can be assessed, and those cases do not provide insight into how such assessment could be systematized.
The research in this dissertation is aimed at bridging this knowledge gap by developing an evaluation framework, and conducting an empirical study to further investigate the state of health data warehousing evaluation and the use of the technology to improve healthcare efficiency, as well as to compare these findings with the proposed framework.
The empirical study involved an exploratory approach and used a qualitative method, i.e. audio-taped semi-structured interviews. The interviews were conducted in collaboration with the Healthcare Data Warehousing Association and involved 21 participants who were members of the Association working in a mid- to upper-level management capacity on the development and implementation of health data warehousing. All audio-taped interviews were transcribed, and the transcripts were coded using a qualitative analysis software package (NVivo, QSR International). Results were obtained in three areas. First, the study established that current health data warehousing systems are typically not formally evaluated. Systematic assessments relying on predetermined indicators and commonly accepted evaluation methods are very seldom performed, and Critical Success Factors are not used as a reference to guide the system's evaluation. This finding appears to explain why a literature review on the topic returns so few publications. Second, from patient throughput to productivity tracking and cost optimization, the study provided evidence of the contribution of data warehousing to the improvement of healthcare systems' efficiency. Participants gave multiple examples of the ways in which the technology contributed to streamlining the care process and increasing healthcare efficiency in their respective organizations. Third, the study compared the proposed framework with current practices. Because formal evaluations were seldom performed, the empirical study offered limited feedback on the framework's structure and instead informed its content and the assessment factors initially defined.
|
145 |
DYNAMICS OF IDENTITY THREATS IN ONLINE SOCIAL NETWORKS: MODELLING INDIVIDUAL AND ORGANIZATIONAL PERSPECTIVES. Syed, Romilla. 01 January 2015.
This dissertation examines the identity threats perceived by individuals and organizations in Online Social Networks (OSNs). The research comprises two major studies. Using the concepts of Value Focused Thinking and the related methodology of Multiple Objectives Decision Analysis, the first study develops qualitative and quantitative value models to explain the social identity threats perceived by individuals in OSNs. The qualitative value model defines a value hierarchy, i.e. the fundamental objectives for preventing social identity threats, and a taxonomy of user responses, referred to as Social Identity Protection Responses (SIPR), for averting those threats. The quantitative value model describes the utility of the current social networking sites and of SIPR in achieving the fundamental objectives for averting social identity threats in OSNs. The second study examines threats to the external identity of organizations, i.e. Information Security Reputation (ISR), in the aftermath of a data breach. The threat analysis examines the discourse about the data breaches at Home Depot and JPMorgan Chase on the popular microblogging website Twitter to identify: 1) the dimensions of information security discussed in the Twitter postings; 2) the attribution of data breach responsibility and the related sentiments expressed in the postings; and 3) the subsequent diffusion of the tweets that threaten organizational reputation.
|
146 |
NEW ARTIFACTS FOR THE KNOWLEDGE DISCOVERY VIA DATA ANALYTICS (KDDA) PROCESS. Li, Yan. 01 January 2014.
Recently, interest in the business application of analytics and data science has increased significantly. The popularity of data analytics and data science comes from the clear articulation of business problem solving as an end goal. To address limitations in the existing literature, this dissertation provides four novel design artifacts for Knowledge Discovery via Data Analytics (KDDA). The first artifact is a Snail Shell KDDA process model that extends existing knowledge discovery process models while addressing many of their limitations. At the top level, the KDDA process model highlights the iterative nature of KDDA projects and adds two new phases, namely Problem Formulation and Maintenance. At the second level, the generic tasks of the KDDA process model are presented in a comparative manner, highlighting the differences between the new KDDA process model and traditional knowledge discovery process models. Two case studies demonstrate how to use the KDDA process model to guide real-world KDDA projects. The second artifact, a methodology for theory building based on quantitative data, is a novel application of the KDDA process model. The methodology is evaluated using a theory-building case from the public health domain. It is not only an instantiation of the Snail Shell KDDA process model but also makes theoretical contributions to theory building, demonstrating how analytical techniques can serve as quantitative gauges to assess important construct relationships during the formative phase of theory building. The third artifact is a data mining ontology, the DM3 ontology, which bridges the semantic gap between business users and KDDA experts and facilitates analytical model maintenance and reuse. The DM3 ontology is evaluated using both a criteria-based and a task-based approach. The fourth artifact is a decision support framework for MCDA software selection. The framework enables users to choose relevant MCDA software based on a specific decision-making situation (DMS). A DMS modeling framework is developed to structure the DMS based on the decision problem and the users' decision preferences. The framework is implemented in a decision support system and evaluated using application examples from the real-estate domain.
|
147 |
Are HiPPOs losing power in organizational decision-making? : An exploratory study on the adoption of Big Data Analytics. Moquist Sundh, Ellinor. January 2019.
Background: In the past decades, big data (BD) has become a buzzword associated with the opportunities of gaining competitive advantage and enhanced business performance. However, data in a vacuum is not valuable; its value can be harnessed only when it is used to drive decision-making. Consequently, big data analytics (BDA) is required to generate insights from BD. Nevertheless, many companies struggle to adopt BDA and create value from it: organizations need to deal with the hard work necessary to benefit from their analytics initiatives. Businesses therefore need to understand how they can effectively manage the adoption of BDA to reach decision-making quality. The study answers the following research questions: What factors could influence the adoption of BDA in decision-making? How can the adoption of BDA affect the quality of decision-making? Purpose: The purpose of this study is to explore the opportunities and challenges of adopting big data analytics in organizational decision-making. Method: Data is collected through interviews based on a theoretical framework. The empirical findings are analysed both deductively and inductively to answer the research questions. Conclusion: To harness value from BDA, companies need to deal with several challenges and develop capabilities that lead to decision-making quality. The major challenges of BDA adoption are talent management, leadership focus, organizational culture, technology management, regulation compliance, and strategy alignment. Companies should aim to develop capabilities regarding knowledge exchange, collaboration, process integration, routinization, flexible infrastructure, big data source quality, and decision-maker quality. Potential opportunities generated by the adoption of BDA, leading to improved decision-making quality, are automated decision-making, predictive analytics, and more confident decision makers.
|
148 |
Business Analytics Maturity Model : An adaptation to the e-commerce industry. Nilsson, Valentin; Dahlgren, André. January 2019.
Maturity models have become a widely used framework for assessing various capabilities and technologies among businesses. This thesis develops a maturity model for assessing Business Analytics (BA) in Swedish e-commerce firms. Business Analytics has become an increasingly important part of modern businesses, and firms are continuously looking for new ways to analyse the data available to them. The most prominent previous maturity models within BA have mainly been developed by IT consultancy firms with the underlying intent of selling their IT services. Consequently, these models focus primarily on the technical factors of Business Analytics maturity, partly neglecting the importance of organisational factors. This thesis develops a Business Analytics Maturity Model (BAMM), which fills an identified research gap for academic maturity models that emphasize the organisational factors of BA maturity. Using a qualitative research design, the BAMM is adapted to the Swedish e-commerce industry through two sequential evaluation stages. The study finds that organisational factors have a greater impact on BA maturity than previous research suggests. The BAMM and the study's results contribute knowledge about Business Analytics and provide e-commerce firms with insights into how to leverage their data.
|
149 |
Digitaliseringens påverkan på revisionen : Vilken påverkan har digitaliseringen haft på revisionsmetodiken och revisionskvaliteten? / Digitalization's impact on auditing : What impact has digitalization had on audit methodology and audit quality? Frykenberger, Robin. January 2019.
The purpose of this study is to explain and understand the influence of digitalization on auditing. The auditing industry is in the midst of a significant process of change, in which traditional methods are being replaced by more modern auditing techniques. Digitalization is said to be a strong driving force behind this change and has a great impact on the profession. Because customers have become more digital and new digital audit tools have been developed, a new type of audit has become possible. Previous research suggests that the profession is on its way to being automated, and researchers believe that robotization will become relevant in the auditing industry as well. Focusing on two research questions, the impact of digitalization on audit methodology and its impact on audit quality, a qualitative study was conducted based on interviews with twelve auditors and software developers. The study shows that the digitalization of auditing has had a major impact on the profession. Above all, it has affected the ability to handle larger amounts of data: the audit has gone from working with statistical samples to focusing on analyzing data. New data analysis tools have made it possible for auditors to move from sampling to reviewing the entire population of data, so that the audit can be better targeted at areas that deviate or carry greater risk. The study shows that this change has had a positive impact on audit quality, partly because the audit evidence has become more relevant and reliable, and partly because the risk of the auditors issuing an incorrect opinion has decreased. More focus is now placed on the auditors understanding the customer's business and the factors that can affect it, and the auditors' analytical ability has become increasingly important for the profession. The development has also enabled the auditors to provide better advice, which creates new opportunities in the industry.
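The shift the study describes, from testing a statistical sample to screening the entire population of transactions and focusing on deviations, can be sketched in a few lines of Python. The ledger data, the 4-standard-deviation threshold, and the flagging rule below are illustrative assumptions, not taken from the thesis:

```python
# Illustrative sketch (not from the thesis): screen every transaction
# in a ledger and flag only the deviating ones, instead of testing a
# random sample. The data and threshold are invented for illustration.
import random
import statistics

random.seed(0)
# Hypothetical ledger: the entire population, not a sample.
ledger = [round(random.gauss(100, 10), 2) for _ in range(10_000)]
ledger += [480.0, 512.5]  # two unusual entries planted as anomalies

mean = statistics.mean(ledger)
stdev = statistics.stdev(ledger)

# Flag amounts more than 4 standard deviations from the mean, so the
# auditor's attention goes to the riskiest items only.
flagged = [amount for amount in ledger if abs(amount - mean) > 4 * stdev]
print(f"screened {len(ledger)} transactions, flagged {len(flagged)}")
```

In practice the audit tools the interviewees describe apply far richer rules (duplicate detection, Benford's-law tests, risk scoring), but the principle is the same: full-population coverage with attention concentrated on the deviations.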
|
150 |
How Big Data Analytics are perceived as a driver for Competitive Advantage : A qualitative study on food retailers. Galletti, Alessandro; Papadimitriou, Dimitra-Christina. January 2013.
The recent explosion of digital data has led the business world into a new era of more evidence-based decision making. Companies nowadays collect, store, and analyze huge amounts of data, and terms such as Big Data Analytics are used to describe those practices. This paper investigates how Big Data Analytics (BDA) can be perceived and used as a driver for companies' Competitive Advantage (CA). It thus contributes to the debate about the potential role of IT assets as a source of CA, through a Resource-Based View approach, by introducing a new phenomenon, BDA, into that traditional theoretical background. A conceptual model developed by Wade and Nevo (2010) is used as guidance, in which the synergy developed between IT assets and other organizational resources is seen as crucial for creating such a CA. We focus on the food retail industry and specifically investigate two case studies, ICA Sverige AB and Masoutis S.A. The evidence shows that, although this process is at an embryonic stage, the companies perceive the implementation of BDA as a key driver for the creation of CA. Efforts are being made to implement BDA within the companies as a strategic tool for several departments; however, some hurdles have been identified that might impede that practice.
|