41

Energy Consumption of Browser-based Creative AI

Lund, Leonard; Blomkvist, Felix January 2022
Creative AI in the music field has in recent years begun stepping out of the confines of academia and seen increased adoption among musicians, thanks to developers launching consumer products powered by AI. These new tools are opening up new possibilities in music-making, but their increased use and development prompt inquiry into their sustainability. While studies have been conducted on the sustainability of training AI models, the sustainability of the usage of Creative AI remains largely unexplored. To address this gap, this paper studies the energy consumption of using four music-related browser-based Creative AI tools. The four tools are Tone Transfer, Piano Scribe, MidiMe and Performance RNN, all developed by Google Magenta. The energy consumption of the tools was found by measuring the power provided to the computer, using a smart plug connected between the computer’s power cord and the wall socket. We found that Tone Transfer consumed the most energy per use, with an average energy consumption of 392 J, while MidiMe consumed the least energy per use with 138 J. All the tools consumed less energy per use than leaving the computer running in steady-state for 70 seconds. With this study, we have shown that the usage of music-related Creative AI tools does not represent a threat to sustainability goals. Our findings indicate that the tools studied in this paper manage to be efficient while being both powerful and useful. This disputes the notion that there is a trade-off between performance and efficiency in the design of AI tools. We postulate that when developing tools for local use by consumers, developers are bound by limitations that force them to design efficient tools.
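The per-use figures above come from measuring the power drawn by the computer during a tool run. As a rough illustration of that arithmetic, the sketch below (Python, with invented power samples, sampling interval, and idle draw — none of these are the study's measurements) integrates a power trace to joules and compares it with a 70-second steady-state baseline.

```python
# A minimal sketch (not the authors' actual pipeline) of deriving per-use energy
# in joules from smart-plug power readings. All values below are hypothetical.

def energy_joules(power_samples_w, interval_s):
    """Integrate power (W) over time (s) with a simple rectangle rule."""
    return sum(p * interval_s for p in power_samples_w)

# Hypothetical trace: power drawn while one tool run executes, sampled once per second.
tool_run_power_w = [28.0, 35.5, 41.2, 39.8, 33.1, 29.4]   # 6 s of samples
steady_state_power_w = 25.0                                # assumed idle draw

per_use_energy = energy_joules(tool_run_power_w, interval_s=1.0)
baseline_70s_energy = steady_state_power_w * 70            # 70 s steady-state reference

print(f"Energy per use:           {per_use_energy:.0f} J")
print(f"70 s steady-state energy: {baseline_70s_energy:.0f} J")
```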
42

Decomposition and fire retardancy of naturally occurring mixtures of huntite and hydromagnesite

Hollingbery, Luke A. January 2011
Mixtures of the two minerals huntite and hydromagnesite have been successfully used as a fire retardant additive in polymers for many years. The onset of decomposition of hydromagnesite is at a higher temperature than that of aluminium hydroxide but lower than that of magnesium hydroxide, the two most commonly used mineral fire retardants. This makes it an ideal addition to the range of materials available to polymer compounders for improving fire retardant properties. In comparison to the better-known mineral fire retardants there has been little published research on the fire retardant properties of huntite and hydromagnesite. What has been published has often been commercially orientated, and the limited quantity of scientific literature does not fully explain the fire retardant mechanism of these blends of minerals, often dismissing huntite as having no useful fire retardant action other than diluting the solid-phase fuel. Standard thermal analysis techniques (thermogravimetric analysis, differential scanning calorimetry, Fourier transform infra-red analysis) have been used to characterise the thermal decomposition of huntite and hydromagnesite from a source in Turkey. This has led to an understanding of the decomposition mechanism of the minerals in terms of mass loss, enthalpy of decomposition, and evolved gases between room temperature and 1000°C. Hydromagnesite endothermically decomposes between about 220°C and 500°C, initially releasing water followed by carbon dioxide. The rate of heating and the partial pressure of carbon dioxide in the atmosphere can influence the mechanism of carbon dioxide release. Huntite endothermically decomposes between about 450°C and 800°C, releasing carbon dioxide in two stages. The use of the cone calorimeter to study the rate of heat release during combustion of ethylene vinyl acetate based polymer compounds has led to an understanding of how both huntite and hydromagnesite affect the burning processes at different stages of the fire. By varying the ratio of the two minerals, hydromagnesite has been shown to increase the time to ignition and reduce the initial peak in rate of heat release, while huntite has been shown to reduce the rate of heat release later in the fire. It has been shown that huntite is far from being an inactive diluent filler. The endothermic decomposition of huntite in the later stages of the fire reduces the heat reaching the underlying polymer and continues to dilute the flame with inert carbon dioxide. The platy huntite particles have been shown to align themselves in such a way that they can hinder the escape of volatiles from the decomposing polymer and also physically reinforce the inorganic ash residue.
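As a rough illustration of the cone calorimeter post-processing described above, the sketch below (Python, with an invented heat release rate curve rather than measured data) locates the initial peak in the rate of heat release and integrates the curve for total heat released.

```python
# Hypothetical cone-calorimeter analysis sketch: find the peak heat release rate
# (HRR) and integrate the HRR curve for total heat released. The HRR values and
# sampling interval are invented for illustration, not the thesis data.

hrr_kw_m2 = [0, 40, 180, 260, 210, 170, 190, 230, 150, 60, 10]  # HRR samples (kW/m2)
dt_s = 10.0                                                      # sampling interval (s)

peak_hrr = max(hrr_kw_m2)
time_to_peak_s = hrr_kw_m2.index(peak_hrr) * dt_s
total_heat_mj_m2 = sum(h * dt_s for h in hrr_kw_m2) / 1000.0     # kJ/m2 -> MJ/m2

print(f"Peak HRR:           {peak_hrr} kW/m2 at {time_to_peak_s:.0f} s")
print(f"Total heat release: {total_heat_mj_m2:.2f} MJ/m2")
```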
43

Improving the identification and management of aspiration after stroke

Boaden, Elizabeth L. January 2011
Dysphagia, a common clinical corollary of stroke, may contribute to aspiration pneumonia, malnutrition, and dehydration, which may significantly impair patient rehabilitation. Survey Aim: Establish current clinical practice regarding nurse dysphagia screening. Method: A cross-sectional regional postal survey was undertaken with 60 nurses and 45 Speech and Language Therapists (SLTs). Results: Nurses were taught to use water swallow screening tools but, in reality, used a variety of testing materials. Conclusion: This demonstrated the need for a clinically useful bedside swallow screening tool. Pilot Study Aim: Develop and evaluate the diagnostic accuracy of a new BEdside Swallow Screening Tool (BESST), for use by nurses with acute stroke patients. Method: A literature search was undertaken to inform the BESST. Face validity was established using an iterative process of semi-structured interviews with eight specialist SLTs and eight nurses. The tool was piloted on 12 purposefully selected stroke patients by comparing the management options chosen by two nurses using the BESST with those of the specialist SLT using their bedside assessment (gold standard). Results: The BESST demonstrated excellent sensitivity (100%) but the specificity demonstrated by both nurses was poor (< 45% for both). Conclusion: A larger validation study of a modified BESST would be appropriate. Main Study Aim: Establish the diagnostic accuracy and utility of the BESST. Method: Ratings by nurses using the BESST were compared with experienced SLT bedside assessment in 124 consecutively admitted stroke patients. Results: The BESST demonstrated good agreement between nurses (81%) and within nurses (87% nurse 1, 86% nurse 2), 93% sensitivity, 82% specificity, 71% positive predictive value, 95% negative predictive value, and overall efficiency of 84%. The BESST dictated the same management as the SLT in 75% of cases, and safely allowed 92% of patients modified oral intake when compared to the water swallow screening tool. Conclusion: The BESST has potential use in clinical practice, but further research is needed.
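The accuracy figures quoted above follow from a 2x2 comparison of nurse screening outcomes against the SLT gold standard. A minimal sketch of that calculation, using hypothetical counts rather than the study's data:

```python
# Diagnostic-accuracy metrics from a 2x2 table of screening outcomes.
# The counts below are hypothetical, not the BESST study's data.

def diagnostic_accuracy(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),      # at-risk patients correctly flagged
        "specificity": tn / (tn + fp),      # safe swallows correctly passed
        "ppv":         tp / (tp + fp),      # flagged patients truly at risk
        "npv":         tn / (tn + fn),      # passed patients truly safe
        "efficiency":  (tp + tn) / total,   # overall agreement with gold standard
    }

# Hypothetical counts for nurse ratings vs. SLT bedside assessment.
metrics = diagnostic_accuracy(tp=40, fp=16, fn=3, tn=65)
for name, value in metrics.items():
    print(f"{name}: {value:.0%}")
```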
44

Synthesis of project planning networks using an intelligent knowledge-based systems methodology

Marshall, G. January 1988
No description available.
45

The artificial engineer

Rolph, R. N. January 1982
No description available.
46

An investigation to determine the kinematic variables associated with the production of topspin in the tennis groundstrokes

Protheroe, Laurence January 2011
The ability to impart topspin to the ball when playing forehand and backhand groundstrokes can give a tennis player a tactical advantage in a rally. Recent developments in racket technology and tactical approaches to the game have increased the prevalence of topspin strokes. However, there is a limited scientific knowledge base for players and coaches to draw upon when seeking to improve this aspect of the game. Many of the kinematic analyses of tennis groundstrokes were conducted more than ten years ago, with measurement techniques that may not have accurately measured the anatomical rotations important for generating racket velocity. It has only recently become possible to measure the spin rate of a ball, and this has not been investigated in relation to the kinematics of a player. This study aimed to make an important contribution to the knowledge of tennis professionals by establishing which kinematic variables are related to the production of high ball spin rates resulting from topspin strokes. In order to achieve this aim, consideration was given to the accurate measurement of the joint rotations of the player in all planes of movement and the quantification of the ball spin rate. This information was used to answer three further questions: what are the kinematic differences between flat and topspin groundstrokes, how do these differences relate to the spin rate of the ball, and how do these findings relate to individual players? Joint rotations were calculated based on three-dimensional data captured from twenty participants playing flat and topspin forehand and backhand strokes. The resulting ball spin rate was captured using a high-speed camera. The participants produced larger ball spin rates when playing the topspin strokes, indicating that they were able to produce spin if required. Analysis of the joint rotations revealed that there were adaptations in the stroke in order to achieve the higher spin rates. The adaptations were not uniform among participants, but did produce similar alterations in racket trajectory, inclination and velocity for the topspin strokes. It was these measures that were found to be the strongest predictors of ball spin rates, accounting for over 60% of the variation in ball spin rate in the forehand stroke and over 70% in the backhand. Case study analyses confirmed the importance of the optimal racket kinematics at impact and provided models of technique throughout the forward swing of each stroke. This study has made a contribution to the knowledge of generating topspin in the tennis groundstrokes by establishing the parameters that predict high spin rates and applying them to analyses of individual players. In doing so, this investigation has also demonstrated methodology that is capable of accurately measuring the joint rotations associated with tennis strokes, and suggested a method by which the spin rate of the ball can be calculated.
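As an illustration of the kind of variance-explained analysis mentioned above, the sketch below fits a simple least-squares regression of ball spin rate on a single racket kinematic predictor and reports R². The data, predictor choice, and coefficients are invented and are not the study's measurements or model.

```python
# Hypothetical sketch: regress ball spin rate on one racket kinematic variable
# and report the proportion of variance explained (R^2). Standard library only.
import random
random.seed(1)

# Invented per-trial data: racket vertical velocity (m/s) predicts spin (rev/min);
# racket inclination contributes unmodelled variation so R^2 stays below 1.
trials = []
for _ in range(20):
    vert_vel = random.uniform(2.0, 8.0)
    inclination = random.uniform(-5.0, 15.0)
    spin = 250 * vert_vel + 40 * inclination + random.gauss(0, 150)
    trials.append((vert_vel, spin))

n = len(trials)
mean_x = sum(x for x, _ in trials) / n
mean_y = sum(y for _, y in trials) / n
sxx = sum((x - mean_x) ** 2 for x, _ in trials)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in trials)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in trials)
ss_tot = sum((y - mean_y) ** 2 for _, y in trials)
r_squared = 1 - ss_res / ss_tot

print(f"spin = {slope:.1f} * vertical_velocity + {intercept:.1f}  (R^2 = {r_squared:.2f})")
```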
47

AI in computer games : generating interesting interactive opponents by the use of evolutionary computation

Yannakakis, Georgios N. January 2005
Which features of a computer game contribute to the player’s enjoyment of it? How can we automatically generate interesting and satisfying playing experiences for a given game? These are the two key questions addressed in this dissertation. Player satisfaction in computer games depends on a variety of factors; here the focus is on the contribution of the behaviour and strategy of game opponents in predator/prey games. A quantitative metric of the ‘interestingness’ of opponent behaviours is defined based on qualitative considerations of what is enjoyable in such games, and a mathematical formulation grounded in observable data is derived. Using this metric, neural-network opponent controllers are evolved for dynamic game environments where limited inter-agent communication is used to drive spatial coordination of opponent teams. Given the complexity of the predator task, cooperative team behaviours are investigated. Initial candidates are generated using off-line learning procedures operating on minimal neural controllers with the aim of maximising opponent performance. These example controllers are then adapted using on-line (i.e. during play) learning techniques to yield opponents that provide games of high interest. The on-line learning methodology is evaluated using two dissimilar predator/prey games with a number of different computer player strategies. It exhibits generality across the two game test-beds and robustness to changes of player, initial opponent controller selected, and complexity of the game field. The interest metric is also evaluated by comparison with human judgement of game satisfaction in an experimental survey. A statistically significant number of players were asked to rank game experiences with a test-bed game using perceived interestingness and their ranking was compared with that of the proposed interest metric. The results show that the interest metric is consistent with human judgement of game satisfaction. Finally, the generality, limitations and potential of the proposed methodology and techniques are discussed, and other factors affecting the player’s satisfaction, such as the player’s own strategy, are briefly considered. Future directions building on the work described herein are presented and discussed.
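As a schematic of the on-line adaptation idea described above — not the dissertation's actual interest metric, controllers, or learning rule — the sketch below perturbs a controller's weight vector during play and keeps variants whose placeholder interest score improves.

```python
# Schematic sketch of on-line adaptation toward more "interesting" opponents.
# The interest function, controller representation, and mutation scheme are
# placeholders, not the dissertation's formulations.
import random
random.seed(0)

def interest(weights):
    """Placeholder interest score; in the dissertation this is a metric
    grounded in observable game data, not a function of the weights alone."""
    return -sum((w - 0.5) ** 2 for w in weights)

def mutate(weights, sigma=0.1):
    """Gaussian perturbation of a controller's weight vector."""
    return [w + random.gauss(0, sigma) for w in weights]

# An off-line-trained controller would be the starting point; random weights stand in here.
current = [random.random() for _ in range(8)]
current_score = interest(current)

# Simple on-line hill climb: try one variant per generation during play,
# keep it if the observed interest improves.
for generation in range(50):
    candidate = mutate(current)
    candidate_score = interest(candidate)
    if candidate_score > current_score:
        current, current_score = candidate, candidate_score

print(f"Final interest score: {current_score:.3f}")
```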
48

The logical modelling of computational multi-agent systems

Wooldridge, Michael J. January 1992
No description available.
49

Managing indie-auteurism in an era of sectoral media convergence

Stubbs, Andrew January 2019
Since the mid-1980s, authorship has become an increasingly prominent component in the promotional, extratextual and critical discourse surrounding independent and indie film. During the same period, independent and indie film has become more lucrative and has increasingly drawn attention and investment from Hollywood studios and other vertically and/or horizontally integrated media institutions seeking to further expand their businesses. In a context of sectoral media convergence, therefore, the thesis explores the management of indie-auteurism, defined as a discursive construct conveying authenticity, autonomy, artistry, natural talent, innovation and quality attached to authorial figures associated loosely with American independent or indie film. It explores especially the role played by producers and talent managers, two types of talent intermediaries, in constructing and managing indie-auteurism, the industrial and economic functions it serves, as well as its cultural repercussions. The thesis begins by analysing the Coen brothers' collaboration with various producers to explore the construction and management of indie-auteurism across three periods of contemporary independent film outlined by Yannis Tzioumakis (2013): independent, indie and indiewood. The thesis goes on to expand this periodisation, however, by exploring the strategies and operations of two highly diversified talent management and media production companies, Propaganda Films and Anonymous Content, in using indie-auteurism to sell and/or market their film, television, music video and commercial spot projects and productions. In doing so, the thesis helps to develop understandings of independent and indie film in two interrelated ways. First, it sheds light on the role that producers and talent managers, figures who have been under-researched in the study of independent film (and in media studies generally), have played in constructing and disseminating indie-auteurism and in shaping independent and indie film. Second, it expands the history of independent and indie film by tracing talent management strategies across media and reconfiguring indie-auteurism within an era of media convergence.
50

Steps towards an empirically responsible AI : a methodological and theoretical framework

Svedberg, Peter O.S. January 2004
Initially we pursue a minimal model of a cognitive system. This in turn forms the basis for the development of a methodological and theoretical framework. Two methodological requirements of the model are that explanation be from the perspective of the phenomena, and that we have structural determination. The minimal model is derived from the explanatory side of a biologically based cognitive science. Francisco Varela is our principal source for this part. The model defines the relationship between a formally defined autonomous system and an environment, in such a way as to generate the world of the system, its actual environment. The minimal model is a modular explanation in that we find it on different levels in bio-cognitive systems, from the cell to small social groups. For the latter, and for the role played by artefactual systems, we bring in Edwin Hutchins' observational study of a cognitive system in action. This necessitates the introduction of a complementary form of explanation. A key aspect of Hutchins' findings is the social domain as environment for humans. Aspects of human cognitive abilities usually attributed to the person are more properly attributed to the social system, including artefactual systems.

Developing the methodological and theoretical framework means making a transition from the bio-cognitive to the computational. The two complementary forms of explanation are important for the ability to develop a methodology that supports the construction of actual systems. This has to be able to handle the transition from external determination of a system in design to internal determination (autonomy) in operation.

Once developed, the combined framework is evaluated in an application area. This is done by comparing the standard conception of the Semantic Web with how this notion looks from the perspective of the framework. This includes the development of the methodological framework as a metalevel external knowledge representation. A key difference between the two approaches is the directness with which the semantics are approached. Our perspective puts the focus on interaction and the structural regularities this engenders in the external representation, regularities which in turn form the basis for machine processing. In this regard we see the relationship between representation and inference as analogous to the relationship between environment and system. Accordingly we have the social domain as environment for artefactual agents. For human-level cognitive abilities the social domain as environment is important. We argue that a reasonable shortcut to systems we can relate to, about that very domain, is for artefactual agents to have an external representation of the social domain as environment.
