1

Using the IBM Watson™ Dialog Service for Assisting Parallel Programming

Calvo, Adrián January 2016 (has links)
IBM Watson is on the verge of becoming a milestone in computer science, as it uses a new technology that relies on cognitive systems. IBM Watson is able to understand questions in natural language and give proper answers. The use of cognitive computing in parallel programming is an open research issue. Therefore, the objective of this project is to investigate how IBM Watson can help in parallel programming through the Dialog Service. To answer our research question, an application was built on the IBM Watson Dialog Service and a survey was carried out. The results of our research demonstrate that the developed application offers valuable answers to the questions asked by a programmer, and the survey reveals that students would be interested in using it.
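A minimal sketch of how a programmer's question might be sent to a Dialog-based assistant over HTTP. The endpoint URL, credentials and dialog ID below are placeholders for illustration only, not the values or code used in the thesis:

```python
import requests
from typing import Optional

# Placeholder endpoint, credentials and dialog ID -- assumptions for illustration only.
DIALOG_URL = "https://example.com/dialog/api/v1/dialogs"
DIALOG_ID = "parallel-programming-assistant"
AUTH = ("username", "password")

def ask(question: str, conversation_id: Optional[str] = None) -> dict:
    """Send a programmer's question to the dialog service and return its JSON reply."""
    payload = {"input": question}
    if conversation_id:
        payload["conversation_id"] = conversation_id
    resp = requests.post(f"{DIALOG_URL}/{DIALOG_ID}/conversation",
                         data=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(ask("How do I parallelise a for loop with OpenMP?"))
```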
2

Artificial Intelligence (AI): Multidisciplinary Perspectives on Emerging Challenges, Opportunities, and Agenda for Research, Practice and Policy

Dwivedi, Y.K., Hughes, L., Ismagilova, Elvira, Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A., Galanos, V., Ilavarasan, P.V., Janssen, M., Jones, P., Kar, A.K., Kizgin, Hatice, Kronemann, B., Lal, B., Lucini, B., Medaglia, R., Le Meunier-FitzHugh, K., Le Meunier-FitzHugh, L.C., Misra, S., Mogaji, E., Sharma, S.K., Singh, J.B., Raghaven, V., Raman, R., Rana, Nripendra P., Samothrakis, S., Spencer, J., Tamilmani, Kuttimani, Tubadji, A., Walton, P., Williams, M.D. 08 August 2019 (has links)
Yes / As far back as the industrial revolution, significant developments in technical innovation have succeeded in transforming numerous manual tasks and processes that had been in existence for decades, where humans had reached the limits of physical capacity. Artificial Intelligence (AI) offers this same transformative potential for the augmentation and potential replacement of human tasks and activities within a wide range of industrial, intellectual and social applications. The pace of change for this new AI technological age is staggering, with new breakthroughs in algorithmic machine learning and autonomous decision-making engendering new opportunities for continued innovation. The impact of AI could be significant, with industries ranging from finance, healthcare, manufacturing, retail, supply chain, logistics and utilities all potentially disrupted by the onset of AI technologies. The study brings together the collective insight of a number of leading expert contributors to highlight the significant opportunities, realistic assessment of impact, challenges and potential research agenda posed by the rapid emergence of AI within a number of domains: business and management, government, public sector, and science and technology. This research offers significant and timely insight into AI technology and its impact on the future of industry and society in general, whilst recognising the societal and industrial influence on the pace and direction of AI development.
3

Retail atmospherics effect on store performance and personalised shopper behaviour: A cognitive computing approach

Behera, R.K., Bala, P.K., Tata, S.V., Rana, Nripendra P. 19 June 2021 (has links)
Yes / Purpose: The best possible way for brick-and-mortar retailers to maximise engagement with personalised shoppers is to capitalise on intelligent insights. Retailers operate differently with diversified items and services, but the influence of retail atmospherics on personalised shoppers' perception remains the same across industries. Retail atmospheric stimuli such as design, smell and others create behavioural modifications. The purpose of this study is to explore the atmospheric effects on brick-and-mortar store performance and personalised shopper behaviour using cognitive computing based in-store analytics in the context of an emerging market. Design/methodology/approach: The data are collected from 35 shoppers of a brick-and-mortar retailer through a questionnaire survey and analysed using quantitative methods. Findings: The analysis reveals month-on-month growth in footfall count (46%), conversion rate (21%), units per transaction (27%), average order value (23%), dwell time (11%), purchase intention (29%) and emotional experience (40%), and a month-on-month decline in remorse (20%). Retailers need to focus on three control gates of shopper behaviour: entry, browsing and exit. Attention should be paid to the cognitive computing solution to judge the influence of retail atmospherics on store performance and the behaviour of personalised shoppers. Retail atmospherics create the right experience for individual shoppers, and forceful use of them has an adverse impact. Originality/value: The paper focuses on the strategic decisions of retailers and the tactical value of personalised shoppers, and empirically identifies the effect of retail atmospherics on brick-and-mortar store performance and personalised shopper behaviour.
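For reference, the month-on-month figures above are plain relative changes; the short sketch below shows the calculation with made-up footfall numbers (the study's raw counts are not given in the abstract):

```python
def month_on_month_change(previous: float, current: float) -> float:
    """Relative month-on-month change, expressed as a percentage."""
    return (current - previous) / previous * 100.0

# Illustrative numbers only -- not taken from the study.
footfall_previous, footfall_current = 10_000, 14_600
print(f"Footfall MoM growth: {month_on_month_change(footfall_previous, footfall_current):.0f}%")  # 46%
```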
4

Cognitive Computing

11 November 2015 (has links) (PDF)
"Cognitive Computing" has initiated a new era in computer science. Cognitive computers are not rigidly programmed computers anymore, but they learn from their interactions with humans, from the environment and from information. They are thus able to perform amazing tasks on their own, such as driving a car in dense traffic, piloting an aircraft in difficult conditions, taking complex financial investment decisions, analysing medical-imaging data, and assist medical doctors in diagnosis and therapy. Cognitive computing is based on artificial intelligence, image processing, pattern recognition, robotics, adaptive software, networks and other modern computer science areas, but also includes sensors and actuators to interact with the physical world. Cognitive computers – also called "intelligent machines" – are emulating the human cognitive, mental and intellectual capabilities. They aim to do for human mental power (the ability to use our brain in understanding and influencing our physical and information environment) what the steam engine and combustion motor did for muscle power. We can expect a massive impact of cognitive computing on life and work. Many modern complex infrastructures, such as the electricity distribution grid, railway networks, the road traffic structure, information analysis (big data), the health care system, and many more will rely on intelligent decisions taken by cognitive computers. A drawback of cognitive computers will be a shift in employment opportunities: A raising number of tasks will be taken over by intelligent machines, thus erasing entire job categories (such as cashiers, mail clerks, call and customer assistance centres, taxi and bus drivers, pilots, grid operators, air traffic controllers, …). A possibly dangerous risk of cognitive computing is the threat by “super intelligent machines” to mankind. As soon as they are sufficiently intelligent, deeply networked and have access to the physical world they may endanger many areas of human supremacy, even possibly eliminate humans. Cognitive computing technology is based on new software architectures – the “cognitive computing architectures”. Cognitive architectures enable the development of systems that exhibit intelligent behaviour.
5

Kognitiva tjänster på en myndighet : Förstudie om hur Lantmäteriet kan tillämpa IBM Watson [Cognitive services at a public authority: A pre-study of how Lantmäteriet can apply IBM Watson]

Åström, Gustav January 2017 (has links)
Many milestones have been passed in computer science, and we are currently on our way to passing yet another: artificial intelligence. One of the characteristics of AI is the ability to interpret so-called unstructured data, i.e., data that lacks structure. Unstructured data can be useful, and with the new tools within AI it is possible to interpret it and use it to solve problems. This has the potential to be useful in practical applications such as case processing and decision support. The work was done at Apendo AB, which has the Swedish National Land Survey (Lantmäteriet) as a customer. The task is to investigate how AI-driven cognitive services through IBM Watson can be applied at the Swedish National Land Survey. The goal is to answer the following questions: Is it already possible to apply cognitive services through Watson's services to give decision support to the Swedish National Land Survey? In what ways can Watson's services be used to create decision support? How effective can the solution for the Swedish National Land Survey be, i.e. how much time and cost could be saved by using Watson's services for the chosen concept? As a practical part of the AI study, a perceptron was developed and evaluated. Through an agile approach, tests and studies of IBM Watson took place in parallel with interviews with employees at the Swedish National Land Survey. The tests were performed in the PaaS service IBM Bluemix with both Node-RED and a custom-built web application. Through the interviews, the Watson service Retrieve and Rank emerged as interesting and was examined more closely. With Retrieve and Rank, questions can be answered by ranking passages of a selected corpus, which is then trained for better answers. Uploading the corpus with related questions resulted in 75% of the questions being answered correctly. An application for the Swedish National Land Survey could thus be a cognitive, trainable search function that helps administrators search for information in manuals and the law book.
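The perceptron developed in the practical part is not described in detail in the abstract; the following is only a generic sketch of the classic perceptron learning rule, trained here on a toy AND-gate dataset:

```python
import numpy as np

class Perceptron:
    """Minimal single-layer perceptron trained with the classic update rule."""

    def __init__(self, n_inputs: int, lr: float = 0.1):
        self.w = np.zeros(n_inputs)
        self.b = 0.0
        self.lr = lr

    def predict(self, x: np.ndarray) -> int:
        return int(np.dot(self.w, x) + self.b > 0)

    def fit(self, X: np.ndarray, y: np.ndarray, epochs: int = 20) -> None:
        for _ in range(epochs):
            for xi, target in zip(X, y):
                error = target - self.predict(xi)
                self.w += self.lr * error * xi
                self.b += self.lr * error

# Toy example: learn a logical AND gate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
p = Perceptron(n_inputs=2)
p.fit(X, y)
print([p.predict(x) for x in X])  # expected: [0, 0, 0, 1]
```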
6

MAGNETO-ELECTRIC APPROXIMATE COMPUTATIONAL FRAMEWORK FOR BAYESIAN INFERENCE

Kulkarni, Sourabh 27 October 2017 (has links) (PDF)
Probabilistic graphical models like Bayesian Networks (BNs) are powerful artificial-intelligence formalisms, with similarities to cognition and higher-order reasoning in the human brain. These models have been applied, with great success, to several challenging real-world applications. Use of these formalisms in a greater set of applications is impeded by the limitations of currently used software-based implementations. New emerging-technology-based circuit paradigms which leverage physical equivalence, i.e., operating directly on probabilities rather than introducing layers of abstraction, promise orders-of-magnitude increases in the performance and efficiency of BN implementations, enabling networks with millions of random variables. While the majority of applications with small network sizes (100s of nodes) require only single-digit precision for accurate results, applications with larger sizes (1000s to millions of nodes) require higher-precision computation. We introduce a new BN integrated circuit fabric based on mixed-signal magneto-electric circuits which perform probabilistic computations based on the principle of approximate computation. Precision scaling in this fabric is logarithmic in area, versus linear in prior directions. Results show a 33x area benefit for a 0.001 precision compared to the prior direction, while maintaining three orders of magnitude performance benefits versus 100-core processor implementations.
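To make the target computation concrete, here is a toy two-node Bayesian network evaluated by brute-force enumeration in software; the conditional probabilities are illustrative only, and the thesis concerns hardware acceleration of far larger networks:

```python
# Toy BN: Rain -> WetGrass. Posterior P(Rain | WetGrass = true) by enumeration.
# Illustrative CPT values only.
p_rain = 0.2
p_wet_given_rain = 0.9
p_wet_given_dry = 0.1

joint_rain_wet = p_rain * p_wet_given_rain        # P(Rain, Wet)
joint_dry_wet = (1 - p_rain) * p_wet_given_dry    # P(~Rain, Wet)
posterior = joint_rain_wet / (joint_rain_wet + joint_dry_wet)
print(f"P(Rain | Wet) = {posterior:.3f}")          # ~0.692
```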
7

Augmenting MPI Programming Process with Cognitive Computing

Kazilas, Panagiotis January 2019 (has links)
Cognitive Computing is a new and quickly advancing technology. In the last decade Cognitive Computing has been used to assist researchers in their endeavors in many different scientific fields such as health and medicine, education, marketing, psychology and financial services. On the other hand, parallel programming is a more complex concept than sequential programming. The additional complexity of parallel programming is introduced by its nature, which requires implementations of more complex algorithms and introduces additional concepts to developers, namely the communication between the processes that execute the parallel program (distributed-memory systems) and their synchronization (shared-memory systems). As a result of this additional complexity, many novice developers are reserved in their attempts to implement parallel programs. The objective of this research project was to investigate whether we can assist the parallel programming process through cognitive computing solutions. In order to achieve our objective, the MPI Assistant, a Q&A system, has been developed and a case study has been carried out to determine our application's efficiency in our attempt to assist parallel programming developers. The case study showed that our MPI Assistant system indeed helped developers reduce the time they spend developing their solutions, but did not improve the quality or efficiency of the programs, as these improvements require features that are out of this research project's scope. However, the case study had a limited number of participants, which may affect our results' reliability. As a next step in our attempt to determine whether cognitive computing technologies are able to assist developers in their parallel programming development, we investigated whether cognitive solutions can extract better and more complete responses compared to the manually created responses we wrote for the MPI Assistant. We experimented with two different approaches to the problem: one where we manually created responses for the MPI Assistant, and one where we investigated whether cognitive solutions can automatically extract better and more complete responses. We compared the quality of the latter automatic responses with the quality of the former, manually created ones.
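A minimal example of the point-to-point message passing that the MPI Assistant is meant to help with, written with mpi4py; it is a generic illustration, not code from the thesis:

```python
# Run with: mpiexec -n 2 python ping.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # Rank 0 sends a Python object to rank 1.
    comm.send({"msg": "hello from rank 0"}, dest=1, tag=11)
elif rank == 1:
    # Rank 1 blocks until the matching message arrives.
    data = comm.recv(source=0, tag=11)
    print(f"rank 1 received: {data}")
```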
8

Cognitive Computing: Collected Papers

Püschel, Georg, Furrer, Frank J. 11 November 2015 (has links)
'Cognitive Computing' has initiated a new era in computer science. Cognitive computers are no longer rigidly programmed computers; instead, they learn from their interactions with humans, from the environment and from information. They are thus able to perform amazing tasks on their own, such as driving a car in dense traffic, piloting an aircraft in difficult conditions, taking complex financial investment decisions, analysing medical-imaging data, and assisting medical doctors in diagnosis and therapy. Cognitive computing is based on artificial intelligence, image processing, pattern recognition, robotics, adaptive software, networks and other modern computer science areas, but also includes sensors and actuators to interact with the physical world. Cognitive computers – also called 'intelligent machines' – emulate human cognitive, mental and intellectual capabilities. They aim to do for human mental power (the ability to use our brain in understanding and influencing our physical and information environment) what the steam engine and combustion motor did for muscle power. We can expect a massive impact of cognitive computing on life and work. Many modern complex infrastructures, such as the electricity distribution grid, railway networks, the road traffic structure, information analysis (big data), the health care system, and many more will rely on intelligent decisions taken by cognitive computers. A drawback of cognitive computers will be a shift in employment opportunities: a rising number of tasks will be taken over by intelligent machines, thus erasing entire job categories (such as cashiers, mail clerks, call and customer assistance centres, taxi and bus drivers, pilots, grid operators, air traffic controllers, …). A possibly dangerous risk of cognitive computing is the threat that 'superintelligent machines' pose to mankind. As soon as they are sufficiently intelligent, deeply networked and have access to the physical world, they may endanger many areas of human supremacy and possibly even eliminate humans. Cognitive computing technology is based on new software architectures – the 'cognitive computing architectures'. Cognitive architectures enable the development of systems that exhibit intelligent behaviour.
Contents: Introduction 5; 1. Applying the Subsumption Architecture to the Genesis Story Understanding System – A Notion and Nexus of Cognition Hypotheses (Felix Mai) 9; 2. Benefits and Drawbacks of Hardware Architectures Developed Specifically for Cognitive Computing (Philipp Schröppel) 19; 3. Language Workbench Technology for Cognitive Systems (Tobias Nett) 29; 4. Networked Brain-based Architectures for more Efficient Learning (Tyler Butler) 41; 5. Developing Better Pharmaceuticals – Using the Virtual Physiological Human (Ben Blau) 51; 6. Management of Existential Risks of Applications Leveraged through Cognitive Computing (Robert Richter) 61
9

SkyNet: Memristor-based 3D IC for Artificial Neural Networks

Bhat, Sachin 27 October 2017 (has links)
Hardware implementations of artificial neural networks (ANNs) have become feasible due to the advent of persistent two-terminal devices such as memristors, phase-change memory, MTJs, etc. Hybrid memristor-crossbar/CMOS systems have been studied extensively and demonstrated experimentally. In these circuits, however, the memristors located at each cross point in a crossbar are stacked on top of CMOS circuits using back-end-of-line (BEOL) processing, which limits scaling. Each neuron's functionality is spread across layers of CMOS and the memristor crossbar and thus cannot support the connectivity required to implement large-scale multi-layered ANNs. This work proposes a new fine-grained 3D integrated circuit technology for ANNs that is one of the first IC technologies for this purpose. Synaptic weights implemented with devices are incorporated in a uniform vertical nanowire template, co-locating the memory and computation requirements of ANNs within each neuron. Novel 3D routing features are used for interconnections in all three dimensions between the devices, enabling high connectivity without the need for special pins or metal vias. To demonstrate a proof of concept for this fabric, classification of binary images using a perceptron-based feed-forward neural network is shown. Bottom-up evaluations of the proposed fabric considering 3D implementation of fabric components reveal up to 19x density and 1.2x power benefits when compared to 16nm hybrid memristor/CMOS technology.
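The core operation such a crossbar accelerates is an analog vector-matrix multiply: input voltages on the rows, synaptic weights stored as conductances, and column currents summing the products. A NumPy sketch of the idealised computation, with illustrative values and ignoring device non-idealities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealised crossbar: conductance matrix G (rows = inputs, columns = outputs).
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens (illustrative range)
v_in = np.array([0.1, 0.0, 0.2, 0.1])      # input voltages applied to the rows

# Kirchhoff's current law: each column current is the dot product of voltages and conductances.
i_out = v_in @ G
print(i_out)  # column currents feeding the neuron circuits
```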
10

Technologies émergentes de mémoire résistive pour les systèmes et application neuromorphique / Emerging Resistive Memory Technology for Neuromorphic Systems and Applications

Suri, Manan 18 September 2013 (has links)
Research in the field of neuromorphic and cognitive computing has generated a lot of interest in recent years. With potential applications in fields such as large-scale data-driven computing, robotics and intelligent autonomous systems, to name a few, bio-inspired computing paradigms are being investigated as the next-generation (post-Moore, non-von Neumann) ultra-low-power computing solutions. In this work we discuss the role that different emerging non-volatile resistive memory technologies (RRAM), specifically (i) Phase Change Memory (PCM), (ii) Conductive-Bridge Memory (CBRAM) and (iii) Metal-Oxide based Memory (OXRAM), can play in dedicated neuromorphic hardware. We focus on the emulation of synaptic plasticity effects such as long-term potentiation (LTP), long-term depression (LTD) and spike-timing dependent plasticity (STDP) with RRAM synapses. We developed novel low-power architectures, programming methodologies and simplified STDP-like learning rules, optimized specifically for some RRAM technologies. We show the implementation of large-scale, energy-efficient neuromorphic systems with two different approaches: (i) deterministic multi-level synapses and (ii) stochastic binary synapses. Prototype applications such as complex visual and auditory pattern extraction are also demonstrated using feed-forward spiking neural networks (SNN). We also introduce a novel methodology to design highly compact stochastic neurons that exploit the intrinsic physical characteristics of CBRAM devices.
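As a rough illustration of the plasticity rules mentioned above, the following is a generic pair-based STDP update (textbook exponential form with assumed parameters); the simplified, device-specific rules developed in the thesis differ from this:

```python
import math

def stdp_delta_w(t_pre: float, t_post: float,
                 a_plus: float = 0.01, a_minus: float = 0.012,
                 tau: float = 20.0) -> float:
    """Weight change for one pre/post spike pair (times in ms).

    Pre before post (dt > 0) -> potentiation (LTP);
    post before pre (dt < 0) -> depression (LTD).
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

print(stdp_delta_w(t_pre=10.0, t_post=15.0))   # small positive change (LTP)
print(stdp_delta_w(t_pre=15.0, t_post=10.0))   # small negative change (LTD)
```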
