
Visual analytics of arsenic in various foods

Johnson, Matilda Olubunmi 06 1900
Arsenic is a naturally occurring toxic metal, and its presence in food composites poses a potential risk to the health of both humans and animals. Arsenic-contaminated groundwater is often used for human and animal consumption and for irrigation of soils, which can lead to arsenic entering the human food chain. Adverse health effects include multiple organ damage, cancers, heart disease, diabetes mellitus, hypertension, lung disease and peripheral vascular disease. Research investigations, epidemiologic surveys and total diet studies (market baskets) provide datasets, information and knowledge on arsenic content in foods. The determination of the concentration of arsenic in rice varieties is an active area of research. With the increasing capability to measure the concentration of arsenic in foods, there are volumes of varied and continuously generated datasets on arsenic in food groups. Visual analytics, which integrates techniques from information visualization and computational data analysis via interactive visual interfaces, presents an approach for visually representing data on arsenic concentrations. The goal of this doctoral research in Environmental Science is to address the need for visual analytical decision support tools on arsenic content in various foods, with special emphasis on rice. The hypothesis of this doctoral thesis research is that software-enabled visual representation and user interaction, facilitated by visual interfaces, will help discover hidden relationships between arsenic content and food categories. The specific objectives investigated were to: (1) provide insightful visual analytic views of compiled data on arsenic in food categories; (2) categorize table-ready foods by arsenic content; (3) compare arsenic content across rice product categories; and (4) identify informative sentences on arsenic concentrations in rice. 
The overall research method is secondary data analysis using visual analytics techniques implemented in Tableau Software. Several datasets were used to construct visual analytical representations of data on arsenic concentrations in foods: (i) arsenic concentrations in 459 crop samples; (ii) arsenic concentrations in 328 table-ready foods from multi-year total diet studies; (iii) estimates of daily inorganic arsenic intake for 49 food groups from multi-country total diet studies; (iv) arsenic content in rice product categories for 193 samples of rice and rice products; and (v) 758 sentences extracted from PubMed abstracts on arsenic in rice. Several key insights emerged from this doctoral research. The concentration of inorganic arsenic in instant rice was lower than that of other rice types. The concentration of dimethylarsinic acid (DMA) in wild rice, an aquatic grass, was notably lower than in rice varieties (e.g., 0.0099 ppm versus 0.182 ppm for a long-grain white rice). The categorization of the 328 table-ready foods into 12 categories enhances communication about arsenic concentrations. Outlier concentrations of arsenic in rice were observed in views constructed to integrate data from four total diet studies. The 193 rice samples were divided into two groups using a cut-off level of 3 mcg of inorganic arsenic per serving, and the visual analytics views constructed allow users to specify the desired cut-off level. A total of 86 sentences from 53 PubMed abstracts were identified as informative for arsenic concentrations. These sentences enabled literature curation for arsenic concentration along with supporting information such as the location of the research; one informative sentence provided a global “normal” range of 0.08 to 0.20 mg/kg for arsenic in rice. A further visual analytics resource developed was a dashboard that facilitates interaction with the text and connects to the knowledge base of the PubMed literature database. 
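The cut-off grouping described above can be sketched in a few lines. This is an illustrative Python sketch with hypothetical sample values, not the thesis's actual Tableau implementation:

```python
# Minimal sketch (not the thesis's Tableau dashboards): grouping rice
# samples by a user-specified cut-off of inorganic arsenic per serving.
# The sample names and values below are hypothetical.

def group_by_cutoff(samples, cutoff_mcg_per_serving=3.0):
    """Split (name, mcg_per_serving) pairs into two groups at the cut-off."""
    groups = {"at_or_below_cutoff": [], "above_cutoff": []}
    for name, mcg in samples:
        key = "at_or_below_cutoff" if mcg <= cutoff_mcg_per_serving else "above_cutoff"
        groups[key].append(name)
    return groups

samples = [("instant rice", 1.2), ("long grain white", 3.8), ("wild rice", 0.4)]
print(group_by_cutoff(samples))
# {'at_or_below_cutoff': ['instant rice', 'wild rice'], 'above_cutoff': ['long grain white']}
```

Exposing `cutoff_mcg_per_serving` as a parameter mirrors the interactive views, which let users choose the cut-off rather than fixing it at 3 mcg.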
The research reported provides a foundation for additional investigations on visual analytics of data on arsenic concentrations in foods. Considering the massive and complex data associated with contaminants in foods, the development of visual analytics tools is needed to facilitate diverse human cognitive tasks. Visual analytics tools can provide the integrated automated analysis, interaction with data, and data visualization critically needed to enhance decision making. Stakeholders that would benefit include consumers, food and health safety personnel, farmers, and food producers. The arsenic content of baby foods warrants attention because early-life exposures could have lifetime adverse health consequences. The action of microorganisms in the soil is associated with the availability of arsenic species for uptake by plants, and genomic data on microbial communities presents a wealth of data for identifying mitigation strategies for arsenic uptake by plants. Arsenic metabolism pathways encoded in microbial genomes warrant further research, and visual analytics tasks could facilitate the discovery of biological processes for mitigating arsenic uptake from soil. The increasing availability of central resources on data from total diet studies and research investigations creates a need for personnel with diverse levels of skill in data management and analysis. Training workshops and courses on the foundations and applications of visual analytics can contribute to global workforce development in food safety and environmental health, and research investigations could determine the learning gains accomplished through hardware and software for visual analytics. Finally, there is a need to develop and evaluate informatics tools with visual analytics capabilities in the domain of contaminants in foods. / Environmental Sciences / P. Phil. (Environmental Science)

Revealing the Non-technical Side of Big Data Analytics : Evidence from Born analyticals and Big intelligent firms

Denadija, Feda, Löfgren, David January 2016
This study aspired to gain a more nuanced understanding of emerging analytics technologies and the vital capabilities that ultimately drive evidence-based decision making. Big data technology is widely discussed by varying groups in society and believed to revolutionize corporate decision making. In spite of big data's promising possibilities, only a small fraction of firms deploying big data analytics (BDA) have gained significant benefits from their initiatives. To explain this inability, we drew on prior IT literature suggesting that IT resources can only be successfully deployed when combined with organizational capabilities. We identified key theoretical components at the organizational, relational, and human levels. The data collection included 20 interviews with decision makers and data scientists from four analytical leaders. Early on we divided the companies into two categories based on their empirical characteristics, coining the terms “Born analyticals” and “Big intelligent firms”. The analysis concluded that social, non-technical elements play a crucial role in building BDA abilities. These capabilities differ among companies but can still enable BDA in different ways, indicating that organizations' history and context seem to influence how firms deploy capabilities. Some capabilities proved more important than others: the individual mindset towards data is seemingly the most decisive capability in building BDA ability. Varying mindsets foster different BDA environments in which other capabilities behave accordingly. Born analyticals seemed to display an environment favouring evidence-based decisions.

Close and Distant Reading Visualizations for the Comparative Analysis of Digital Humanities Data

Jänicke, Stefan 19 July 2016
Traditionally, humanities scholars carrying out research on one or more literary works are interested in the analysis of related texts or text passages. But the digital age has opened possibilities for scholars to enhance their traditional workflows. Enabled by digitization projects, humanities scholars can nowadays reach a large number of digitized texts through web portals such as Google Books or Internet Archive. Digital editions exist also for ancient texts; notable examples are PHI Latin Texts and the Perseus Digital Library. This shift from reading a single book “on paper” to the possibility of browsing many digital texts is one of the origins and principal pillars of the digital humanities domain, which helps develop solutions for handling vast amounts of cultural heritage data, text being the main data type. In contrast to traditional methods, the digital humanities allow scholars to pose new research questions on cultural heritage datasets. Some of these questions can be answered with existing algorithms and tools provided by the computer science domain, but for other humanities questions scholars need to formulate new methods in collaboration with computer scientists. Developed in the late 1980s, the digital humanities primarily focused on designing standards to represent cultural heritage data, such as the Text Encoding Initiative (TEI) for texts, and on aggregating, digitizing and delivering data. In recent years, visualization techniques have gained more and more importance for analyzing data. For example, Saito introduced her 2010 digital humanities conference paper with: “In recent years, people have tended to be overwhelmed by a vast amount of information in various contexts. Therefore, arguments about ’Information Visualization’ as a method to make information easy to comprehend are more than understandable.” A major impulse for this trend was given by Franco Moretti. 
In 2005, he published the book “Graphs, Maps, Trees”, in which he proposes so-called distant reading approaches for textual data that steer the traditional way of approaching literature in a completely new direction. Instead of reading texts in the traditional way – so-called close reading – he invites scholars to count, graph and map them; in other words, to visualize them. This dissertation presents novel close and distant reading visualization techniques for hitherto unsolved problems. Appropriate visualization techniques have been applied to support basic tasks, e.g., visualizing geospatial metadata to analyze the geographical distribution of cultural heritage data items, or using tag clouds to illustrate textual statistics of a historical corpus. In contrast, this dissertation focuses on developing information visualization and visual analytics methods that support research questions requiring the comparative analysis of various digital humanities datasets. We first take a look at the state of the art of existing close and distant reading visualizations developed to support humanities scholars working with literary texts, and thereby provide a taxonomy of visualization methods applied to show various aspects of the underlying digital humanities data. We point out open challenges and present our visualizations designed to support humanities scholars in comparatively analyzing historical datasets. In short, we present (1) GeoTemCo for the comparative visualization of geospatial-temporal data, (2) the two tag cloud designs TagPies and TagSpheres that comparatively visualize faceted textual summaries, (3) TextReuseGrid and TextReuseBrowser to explore re-used text passages among the texts of a corpus, (4) TRAViz for the visualization of textual variation between multiple text editions, and (5) the visual analytics system MusikerProfiling to detect musicians similar to a given musician of interest. 
Finally, we summarize our own collaboration experiences and those of other visualization researchers to emphasize the ingredients required for a successful project in the digital humanities, and we take a look at future challenges in this research field.
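The comparative idea behind faceted tag cloud designs such as TagPies can be illustrated with a small sketch: count each term's frequency per corpus facet so the facets can be compared side by side. The corpora below are hypothetical toy texts; this is not the dissertation's implementation:

```python
# Sketch of the comparative principle behind faceted tag clouds:
# per-facet term frequencies, which a tag cloud would then scale
# and lay out visually. The input corpora are made up.
from collections import Counter

def faceted_frequencies(corpora):
    """Map each facet name to a Counter of its token frequencies."""
    return {facet: Counter(text.lower().split()) for facet, text in corpora.items()}

corpora = {
    "edition_a": "arma virumque cano troiae qui primus ab oris",
    "edition_b": "arma virumque cano troiae qui primus",
}
freqs = faceted_frequencies(corpora)
# A term present in one facet but absent in another (Counter returns 0)
# is exactly the kind of contrast a comparative tag cloud highlights.
print(freqs["edition_a"]["oris"], freqs["edition_b"]["oris"])  # 1 0
```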

Towards Prescriptive Analytics in Cyber-Physical Systems

Siksnys, Laurynas 11 November 2015
More and more of our physical world today is being monitored and controlled by so-called cyber-physical systems (CPSs). These are compositions of networked autonomous cyber and physical agents such as sensors, actuators, computational elements, and humans in the loop. Today, CPSs are still relatively small-scale and very limited compared to the CPSs to be witnessed in the future. Future CPSs are expected to be far more complex, large-scale, widespread, and mission-critical, and to be found in a variety of domains such as transportation, medicine, manufacturing, and energy, where they will bring many advantages such as increased efficiency, sustainability, reliability, and security. To unleash their full potential, CPSs need to be equipped with, among other features, support for automated planning and control, where computing agents collaboratively and continuously plan and control their actions in an intelligent and well-coordinated manner to secure and optimize a physical process, e.g., electricity flow in the power grid. In today’s CPSs, control is typically automated, but planning is performed solely by humans. Unfortunately, it is intractable and infeasible for humans to plan every action in a future CPS due to the complexity, scale, and volatility of a physical process. Because of these properties, control and planning have to be continuous and automated in future CPSs. Humans may only analyse and tweak the system’s operation using a set of prescriptive analytics tools that allow them (1) to make predictions, (2) to obtain suggestions for the most promising set of actions (decisions) to be taken, and (3) to analyse the implications as if such actions were taken. This thesis considers planning and control in the context of a large-scale multi-agent CPS. 
Based on a smart-grid use case, it presents a so-called PrescriptiveCPS – (the conceptual model of) a multi-agent, multi-role, and multi-level CPS that automatically and continuously takes and realizes decisions in near real-time and provides (human) users with prescriptive analytics tools to analyse and manage the performance of the underlying physical system (or process). Acknowledging the complexity of CPSs, this thesis provides contributions at three levels of scale: (1) the level of a (full) PrescriptiveCPS, (2) the level of a single PrescriptiveCPS agent, and (3) the level of a component of a CPS agent software system. At the CPS level, the contributions include the definition of PrescriptiveCPS as a system of interacting physical and cyber (sub-)systems. Here, the cyber system consists of hierarchically organized, inter-connected agents that collectively manage instances of so-called flexibility, decision, and prescription models, which are short-lived, focus on the future, and represent, respectively, a capability, a (user’s) intention, and actions to change the behaviour (state) of a physical system. At the agent level, the contributions include the three-layer architecture of an agent software system, integrating a number of components specially designed or enhanced to support the functionality of PrescriptiveCPS. Most of the thesis contributions are provided at the component level. 
The contributions include the description, design, and experimental evaluation of (1) a unified multi-dimensional schema for storing flexibility and prescription models (and related data), (2) techniques to incrementally aggregate flexibility model instances and disaggregate prescription model instances, (3) a database management system (DBMS) with built-in optimization problem solving capability that allows optimization problems to be formulated using SQL-like queries and solved “inside a database”, (4) a real-time data management architecture for processing instances of flexibility and prescription models under (soft or hard) timing constraints, and (5) a graphical user interface (GUI) to visually analyse flexibility and prescription model instances. Additionally, the thesis discusses and exemplifies (but provides no evaluations of) (1) domain-specific and in-DBMS generic forecasting techniques for forecasting instances of flexibility models based on historical data, and (2) powerful ways to analyse the past, present, and future based on so-called hypothetical what-if scenarios and the flexibility and prescription model instances stored in a database. Most of the contributions at this level are based on the smart-grid use case. In summary, the thesis provides (1) the model of a CPS with planning capabilities, (2) the design and experimental evaluation of prescriptive analytics techniques for effectively forecasting, aggregating, disaggregating, visualizing, and analysing complex models of the physical world, and (3) a use case from the energy domain showing how the introduced concepts are applicable in the real world. We believe that these contributions make a significant step towards developing planning-capable CPSs in the future.
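One of the component-level contributions, incremental aggregation of flexibility model instances, can be illustrated with a simplified sketch. Here a flexibility model is reduced to per-time-slot (min, max) energy bounds; this representation is an assumption for illustration, not the thesis's actual schema:

```python
# Hedged sketch of flexibility-model aggregation: each agent offers
# per-slot (min_kwh, max_kwh) consumption bounds, and the aggregate
# flexibility is the element-wise sum of those bounds across agents.
# The (min, max)-bounds representation is assumed for this example.

def aggregate(flex_models):
    """Element-wise sum of per-slot (min, max) consumption bounds."""
    return [
        (sum(lo for lo, _ in slot), sum(hi for _, hi in slot))
        for slot in zip(*flex_models)  # pair up the same slot across agents
    ]

f1 = [(0.0, 1.5), (0.5, 2.0)]  # agent 1: bounds for two time slots
f2 = [(0.2, 1.0), (0.0, 1.0)]  # agent 2
print(aggregate([f1, f2]))     # [(0.2, 2.5), (0.5, 3.0)]
```

Aggregating bottom-up like this is what lets higher-level agents plan against one compact model instead of thousands of individual ones; disaggregation of prescriptions would run in the opposite direction.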

“People analytics kan jämföras med tonårssex: alla pratar om det, ingen har gjort det och ingen vet hur man ska göra” : En kvalitativ studie om hur svenska praktiker upplever people analytics / ”People analytics can be compared to teenage sex: everyone talks about it, no one has done it, and no one knows how to do it” : A qualitative study on how practitioners in Sweden experience people analytics

Stenborg, Vera, Högren, Louise January 2019
Technical development has been a driving force for organizational efficiency throughout history. Along with this drive for efficiency, ways of working and demands on employees have changed, creating a need to manage personnel-related issues in a structured way. Out of this, human resources (HR) emerged as an organizational function. Technical development continues, and today technical solutions serve as tools for many of an organization's functions. However, prior research argues that HR is not keeping pace with this development, which affects the HR function's role in the organization and poses a challenge both now and in the future. One widely discussed technical solution today, in the form of a set of technology-based tools, activities and ways of working, that is considered capable of helping HR as a function take the next step in its development is people analytics. The purpose of this study is to increase the understanding of people analytics in Sweden, the effects that are sought, and the opportunities and challenges involved in achieving those effects. This is motivated by the need for empirically grounded research in a Swedish context, which is currently lacking. Through a qualitative survey study of an interpretive character with semi-structured interviews, an empirical material emerges that is analyzed with the help of prior research and sociotechnical theory. The study shows that people analytics is a set of activities that is valuable to organizations as well as HR functions because it enables evidence-based strategic work, grounded in data and statistics, on personnel-related issues. By using people analytics, organizations can follow up their personnel-related work in a more structured way, and HR as a function has also gained increased legitimacy in the organization as a result. 
However, a number of complexly intertwined factors currently hinder people analytics from reaching its full potential in Swedish organizations; these factors are found in both the social and the technical aspects of people analytics work. By creating an understanding of why these factors affect the use and effects of people analytics, this study contributes valuable insights for both practitioners and academics. These insights concern how people analytics work can achieve the long-term goals and purposes found both in research and in Swedish organizations.

KPI-framtagning på ledningsnivå : Ett tillämpat arbete baserat på modellen Analytics for Management, anpassat efter SAAB Aeronautics / KPI-development on management level : An applied thesis based upon the model of Analytics for Management, adapted to SAAB Aeronautics

Westerdahl, Robert, Yngemo, Martin January 2019
The goal of this master's thesis is to supply management at Saab Aeronautics with information and tools that allow fast and effective insight into the manufacturing process. Additionally, it aims to suggest improvements concerning measurements and the use of data for the internal workshops. The method for creating good performance indicators is also part of the deliverables to Saab for future use. The thesis is based on Analytics for Management by Tanima Chowdhury and Louise Sandén, an earlier master's thesis that describes a method for generating KPIs. Furthermore, this thesis answers a set of research questions concerning how well Analytics for Management can be adapted to a large organization like Saab, together with an evaluation of the previously used KPIs. The main changes to the method described in Analytics for Management concern the data collection and the addition of a fourth phase. The new fourth phase uses the output from the three phases described in Analytics for Management to assess which KPIs are most suitable for top managerial use. The first phase uses the company vision and strategy to create suitable tactical goals; in this project, the company strategy is linked to goals set up for the internal workshops at Saab Aeronautics. In the second phase, these goals are used to produce KPIs for the workshops, so the output of phase two is the chosen KPIs, confirmed with management. The third phase evaluates the measurement system for the KPIs, improving it or creating it where none exists; this part has been intentionally reduced in this report due to its complexity. The fourth phase evaluates which KPIs should be selected and how they should be visualized to create a good overview for management. 
The main result or output of the master thesis report consists of two parts, the method of generating KPI:s and the actual generated KPI:s. The KPI:s created during the master thesis are very similar to the KPI:s previously used, but improved. This indicates that the previous KPI:s is use at Saab were up to standard. However, the underlying data and traceability was not. These results generate good value for the future and the findings underline an important area. However, according to the authors, the main value should consist of highlighting the defective underlying data and solutions created to solve the problem / Med denna rapport vill man tillgodose ledningen på Saab Aeronautics med information och verktyg som effektivt ger en inblick kring verksamhetens produktionsenheter. Man ämnar också ge förslag till taktiska ledningen angående mätetal och datahantering. Utöver konkreta förslag tar man även fram metodiken för framtida bruk på Saab. För att genomföra detta appliceras en anpassning av metodiken beskriven i Analytics for Management av Tanima Chowdhury och Louise Sandén. I Analytics for Management presenteras en metodik för att ta fram och utvärdera KPI:er. I samband med metodiken ställs också forskningsfrågor upp angående anpassning av Analytics for Management mot verksamheten och en utvärdering av dagens mätetal hos Saab. Som nämnt är rapporten baserad på anpassning och tillämpning av Analytics for Management. Efter analys av metodiken och en genomförd litteraturstudie infördes bl.a. en fjärde fas samt förändringar kring datainsamling. Den fjärde fasen behandlar främst aggregering gentemot ledning och nyttjar resultatet ur fas ett till tre för att lyfta fram KPI:er som passar bra på många olika nivåer i organisationen. Den första av faserna behandlar företagets vision och strategi för att kunna koppla den mot mål på taktisk nivå. I fallet på Saab handlar det om att koppla företagsstrategi mot mål på verkstadsnivå. 
Sedan genereras KPI:er i fas två, där dessa också valideras mot intressenter och ansvariga i produktionen. Den tredje fasen konkretiserar mätningar och visualisering för de valda KPI:erna. Här får man alltså ut färdiga KPI:er för taktiska nivån. Fjärde fasen aggregerar samman genererade KPI:er för ledningen, på så sätt får man en överblicksbild över en större del av organisationen. Resultatet från examensarbetet består av två delar: dels metoden för framtagning av lämpliga KPI:er, men också de KPI:erna som arbetsgången lett fram till. Gällande KPI:erna som utvecklats under examensarbetet, så finns det stora likheter med de KPI:er som redan används på Saab. Skillnaden ligger i spårbarheten samt hur underliggande data hanteras. Genom metodiken och med de framtagna KPI:erna finns därför värde som kan nyttjas på flera verksamheter i organisationen. Vad författarna dock vill lyfta fram är att det kanske största värdet ligger i upptäckten kring brister i underliggande data, som idag många av mätetalen bygger på.
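The traceability problem the authors highlight can be made concrete with a small sketch. The following is an illustration only, not code from the thesis: it computes a simple on-time-delivery KPI while logging every record excluded for missing data, so the reported figure stays traceable to its inputs instead of silently shrinking. All names and fields are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class KPIResult:
    value: float          # the KPI itself (share of on-time orders)
    used: int             # records that contributed to the figure
    excluded: list = field(default_factory=list)  # (record id, reason) pairs

def on_time_delivery_kpi(orders):
    """Share of orders delivered on or before the promised date.

    Each order is a dict with 'id', 'promised' and 'delivered' day numbers.
    Orders with missing dates are excluded but logged, keeping the KPI
    traceable to its underlying data.
    """
    on_time, used, excluded = 0, 0, []
    for o in orders:
        if o.get("promised") is None or o.get("delivered") is None:
            excluded.append((o.get("id"), "missing date"))
            continue
        used += 1
        if o["delivered"] <= o["promised"]:
            on_time += 1
    value = on_time / used if used else float("nan")
    return KPIResult(value, used, excluded)

orders = [
    {"id": 1, "promised": 10, "delivered": 9},
    {"id": 2, "promised": 10, "delivered": 12},
    {"id": 3, "promised": 10, "delivered": None},  # defective record
]
result = on_time_delivery_kpi(orders)
# result.value == 0.5, with order 3 logged as excluded rather than dropped
```

A dashboard built on such results can show the exclusion count next to each KPI, surfacing exactly the kind of defective underlying data the thesis found.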
327

Big Data em conteúdo espontâneo não-estruturado da internet como estratégia organizacional de orientação para o mercado / Big Data in spontaneous unstructured internet content as an organizational strategy for market orientation

Corrêa Junior, Dirceu Silva Mello 25 April 2018 (has links)
Big Data is a social reality with growing business impact. However, a survey of US executives at large corporations identified a low capacity to effectively exploit this competitive-intelligence opportunity in their companies. To deepen the understanding of this context from the perspective of Market Orientation, this dissertation presents an exploratory analysis of the current capacity of large companies operating nationally to absorb value from Big Data, focusing on a specific type of content called Unstructured Data. The companies studied were found to be at a peculiar moment for the modern management of Market Orientation, a kind of evolutionary and transitional process in understanding and exploiting this deluge of data. This moment of adaptation is further reinforced by a trend towards the use of more spontaneous data from consumers. The study first presents five dimensions of this peculiar moment, systematically addressing questions related to internal organization; suppliers and investment profiles; internal adaptations; and other strategic findings. It then details the current path towards an effective understanding of Big Data, based on the practices identified in this business context.
328

Em busca da imagem videojográfica: uma cartografia das imagens de jogos digitais de 1976 a 2017 / In search of the videogamegraphics image: a cartography of digital game images from 1976 to 2017

Menezes, João Ricardo de Bittencourt 07 June 2018 (has links)
UNISINOS - Universidade do Vale do Rio dos Sinos / The main goal of this research was to develop the concept of the videogamegraphics image, proposing it as an articulation of three layers: machine, ludic, and audiovisual. To this end, a methodological process was constructed that combines quantitative techniques applicable to large volumes of data, drawn from the cultural analytics developed by Lev Manovich, with an ontological perspective on digital games proposed by Alexander Galloway. Galloway places games in a framework composed of two axes (diegetic/non-diegetic and operator/machine) that generate four forms of game action: diegetic machine acts, non-diegetic machine acts, diegetic operator acts, and non-diegetic operator acts. Theoretically, we start from an understanding of the phenomenon of play as a virtuality that is actualized in different ways, including in machines created by humans for the purpose of playing. From a perspective of archaeological inspiration, we identified little-known machines related to the context of digital games as the first actualizations of this videogamegraphics image, understanding these machines of the digital era as part of a mediatization process that produces countless contents derived from the performance of play. We noted the role of the pixel in this technocultural revolution, tied to the exponentially growing presence of these technical images of play, which motivates the use of cultural analytics as a way of coping with this profusion. 
Its use enabled the first movement of analysis, through a series of cartographies of the images of play. We then sought to understand, from a conceptual proposal about the videogamegraphics image, how it was actualized in the technical images of digital games produced between 1976 and 2017. A second movement was adopted, drawing on the frames methodology proposed by Kilpp and developed in the research group Audiovisualities and Technoculture: Communication, Memory and Design, in order to observe the results of the cartographies through the generation of mean images. In addition to the considerations developed in the first movement, it was understood that videogamegraphics images are actualized in their three layers as ethicities, within four great territories of signification, of frames that permeate them: the platforms, the graphical-interface instruments, the diegesis, and the magic circle. These territories are framed in such a way that we can understand how their senses are negotiated, which in turn is reflected in how the ethicities are actualized in videogamegraphics images. Technical surfaces frame the platforms; gadgets, information panels, dialog boxes, and action bars frame the graphical-interface instruments. The diegesis is an important frame of audiovisuality, framed in turn by dialog boxes, decorative panels, and diegetic panels. Finally, the magic circle, recurrent and at times excessively dominant in reflections on the act of playing, is now perceived as a frame that enters into tension with the others through the same framings.
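The "mean images" used in Manovich-style cultural analytics are simply per-pixel averages over a set of screenshots. A minimal sketch of the idea, assuming frames have already been loaded as equally sized arrays; this is an illustration, not the author's actual pipeline:

```python
import numpy as np

def mean_image(frames):
    """Per-pixel average of a stack of equally sized RGB frames.

    frames: iterable of uint8 arrays with shape (H, W, 3).
    Returns a uint8 array of the same shape; regions where the set of
    game images tends to agree stand out in the average.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return np.rint(stack.mean(axis=0)).astype(np.uint8)

# two tiny 1x2 "frames": averaging black and white gives mid-grey
a = np.zeros((1, 2, 3), dtype=np.uint8)
b = np.full((1, 2, 3), 255, dtype=np.uint8)
m = mean_image([a, b])
# m is uniformly 128 (127.5 rounded)
```

Averaging in float64 before rounding avoids the overflow that summing uint8 frames directly would cause.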
329

Estratégias para minimizar a evasão e potencializar a permanência em EAD a partir de sistema que utiliza mineração de dados educacionais e learning analytics / Strategies to minimize dropout and foster retention in distance education, based on a system using educational data mining and learning analytics

Portal, Cleber 29 February 2016 (has links)
CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This master's dissertation, developed in the context of the research group Grupo de Pesquisa Educação Digital GPe-dU UNISINOS/CNPq and linked to the research line Educação, Desenvolvimento e Tecnologias of the Graduate Program in Education, investigated how the different actors involved in Distance Education (EaD) devise strategies to minimize dropout and foster student retention in this modality, drawing on a set of information and indicators generated by a system, GVWise, that uses data mining and Learning Analytics. The research is exploratory and qualitative in nature. It is grounded in Actor-Network Theory (LATOUR, 2012) and uses the methodology of the cartography of controversies (LATOUR, 2012). It also involves a documentary analysis of the system's records (the non-human actor, NHA) and semi-structured interviews with the human actors (HA) at different levels: coordinators responsible for EaD management and for the undergraduate courses, and the professors and tutors of those courses. The main goal was to understand how the information provided by the system is being understood by the different actors, and whether the articulation of that information is effectively contributing to the creation of strategies that minimize dropout and foster student retention in this modality. 
Regarding the NHA (the system), the main results indicate that it provides a set of information which, when articulated, shows that most dropout occurs before the assessments, that is, before the completion of Degrees B and C. Regarding the HA in EaD management and in the courses, as well as the professors and tutors, the results show distinct and singular understandings of dropout and retention, and of how to articulate the information provided by the system when creating strategies, even though these HA belong to the same team, with different roles. This result manifests as controversies, which are accessed by opening the black boxes at the moment these actors are prompted to reflect on the strategies used. The relations among the HA are distant from one another, especially in the course with the highest dropout, producing inefficient communication that creates obstacles in the methodological field of the course, hinders possible positive changes, and restricts the development of pedagogical processes. The strategy of contacting the student HA, if used appropriately, can open possibilities for a better understanding of the dropout phenomenon and broaden the institutional strategic vision. The main contribution of the dissertation is the diagram of mediations: the design of the distribution of mobility, the movements of construction, in the search for a strategy that can minimize dropout and foster student retention in EaD.
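The finding that most dropout occurs before the Degree B and C assessments suggests a simple early-warning rule: flag students with no recorded activity in the window leading up to an assessment. The sketch below is a hypothetical illustration of such a rule, not the GVWise system; all names and thresholds are assumptions.

```python
def at_risk(last_activity_day, assessment_day, window=14):
    """Flag a student as at risk if they were inactive for more than
    `window` days before the assessment. Days are course-day numbers;
    None means no activity was ever recorded."""
    if last_activity_day is None:
        return True
    return assessment_day - last_activity_day > window

# activity log: student -> course day of last recorded access
last_seen = {"ana": 55, "bruno": 30, "carla": None}
assessment_day = 60  # e.g. the day of the Degree B assessment

flagged = sorted(s for s, d in last_seen.items() if at_risk(d, assessment_day))
# flagged == ['bruno', 'carla']
```

A list like `flagged` is the kind of actionable output that could feed the "contact the student" strategy the dissertation discusses.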
330

Compréhension fine du comportement des lignes des réseaux métro, RER et tramway pour la réalisation des études d’exploitabilité / Detailed understanding of the metro, RER and streetcar network lines behaviour for the realization of operating studies

Dimanche, Vincent 11 June 2018 (has links)
Dense railway networks face significant saturation, and matching the theoretical offer to growing demand imposes strong operability constraints. An imbalance generates points of conflict, such as bottlenecks, which delay the trains upstream. Since the human factor, among many others, influences operations, taking it into account more accurately should improve the understanding and modelling of railway lines so as to increase capacity without sacrificing passenger comfort. To meet this objective, our work relies on an adapted visualization of the operating data and on their automated mining. Both have been adapted and applied to the railway sector, particularly to the lines of the rail networks operated by RATP. 
The "Visual Analytics" process implemented in our work encompasses the steps needed to extract value from the data, from their preparation through graphical representation and data-mining algorithms to expert analysis. Among these algorithms, CorEx and Sieve allowed us, through unsupervised learning based on a measure of multivariate mutual information, to analyze operating data and extract characteristics of human behaviour. Finally, we also propose an intuitive visualization of a large amount of data that enables their integration and facilitates an overall diagnosis of the behaviour of railway lines.
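CorEx and Sieve both build on total correlation, the multivariate generalization of mutual information: TC(X) = Σᵢ H(Xᵢ) − H(X₁, …, Xₙ), which is zero exactly when the variables are independent. As a hedged illustration of the quantity itself (a plug-in estimate on discrete samples, not the CorEx implementation):

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a sequence of hashable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

def total_correlation(columns):
    """TC = sum of marginal entropies minus the joint entropy.

    columns: list of equal-length sequences, one per variable.
    Zero iff the empirical distribution factorizes; large values mean
    the variables share a lot of information.
    """
    joint = list(zip(*columns))
    return sum(entropy(col) for col in columns) - entropy(joint)

x = [0, 0, 1, 1]
y = [0, 0, 1, 1]  # duplicates x: fully dependent
z = [0, 1, 0, 1]  # independent of x under this empirical distribution
# total_correlation([x, y]) == 1.0 bit; total_correlation([x, z]) == 0.0
```

CorEx searches for latent factors that, once conditioned on, drive this quantity towards zero; the sketch only shows what is being measured.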
