  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Guidelines for Creating Game Levels

Kedman, Viktor January 2012
The aim of this research was to create guidelines for designing levels in the RPG genre. This was done by relying heavily on a literature study and the work of others, but also by exploring the current standards of the gaming industry. The research results in a proposed set of rules and guidelines aimed at helping a novice designer successfully create game levels. Some emphasis has been placed on the RPG genre, but the proposed rules and guidelines are adaptable to any game genre.
72

What Level Design Elements Determine Flow? : How Light and Objects Guide the Player in Overwatch and Doom

Eliasson, David January 2017
This thesis presents a comparative study of Overwatch (2016) and Doom (2016) to determine how these fast-paced games facilitate flow in their gameplay. The second chapter reviews formal definitions of flow and level design to establish a vocabulary for the following chapters. Through formal analysis, the level designs of both games are then examined to establish which elements guide players and maintain flow during gameplay. The thesis also examines how the initial gameplay design principles, rooted in the older shooter genre, have influenced the level design. The author draws on screenshots from both games, interviews with the design teams, and published literature on game design. It was found that the architectural design of a level in hero-based gameplay (Overwatch) could control pacing by changing the elements that enable certain types of movement, such as climbing, or by creating setups that favor one team over the other. At the individual player level, flow is maintained through the intentional placement of light and bright colors to guide the player. While Doom uses different abilities and movement sets for its hero, its tools for guiding the player proved very similar, but with a heavier focus on environmental markings and lights. In both cases, the look of these guiding tools was adapted to fit the game world without breaking the player's immersion.
73

Horror game design – what instills fear in the player? : A study on the effects of horror game design theories and level design patterns on player behaviour in a horror environment. / Skräckspelsdesign – Vad ingjuter skräck hos spelaren? : En studie om nivådesign och skräckspelsteorier på spelarbeteende i skräckspelsmiljö.

Årnell, Tobias, Stojanovic, Nikola January 2020
This research paper aimed to study how to make a horror game scary and what, in turn, makes such games frightening. The study uses an original game called The House, designed and created specifically by the authors, in order to study how implementing level design and navigation patterns together with horror game design theories affects player behaviour and reactions. The study involved 10 participants, who each took part in a 15-minute play session and were later interviewed using the data-gathering method of stimulated recall. The results show that level design had no significant effect on the amount of fear the participants expressed. Implementing proven horror game design theories contributed successfully to the general horror experience, and combining elements of level design with horror game theories proved successful at scaring the participants.
74

Modularization of Test Rigs / Modularisering av provningsriggar

Williamsson, David January 2015
This Master of Science thesis contains the results of a product development project conducted in collaboration with Scania CV AB in Södertälje. Scania has a successful history of vehicle modularization and therefore wanted to investigate the possibility of modularizing its test rigs as well, in order to gain various benefits. However, the section UTT (Laboratory Technology) at Scania, where the project was conducted, had little experience in product modularization. The author therefore identified a specific test rig and modularized it using appropriate methods. Moreover, the author developed a new method in order to modularize the test rig according to both product complexity and company strategies. This was done by adapting the DSM (Design Structure Matrix) with strategies from the MIM (Module Indication Matrix) before clustering it with the IGTA++ clustering algorithm. The results of the different modularization methods were finally evaluated and compared before the most suitable modular test rig architecture was chosen. The chosen architecture was then analyzed to determine the potential benefits it could offer. Another purpose of the thesis was to answer the research questions of whether a DSM and a MIM can be combined, and whether doing so improves the result when modularizing a product. The thesis also aimed to provide the project owners with a theoretical background in product modularization and system-level (embodiment) design. The conclusions of the thesis are that the chosen modular test rig architecture has 41% less complexity than the original architecture and could potentially increase flexibility, reduce the risk of design mistakes, and reduce development time by up to 70%. It would also be theoretically possible to reuse up to 57% of the modules when redesigning the test rig in the future. The thesis further showed that it is possible to transfer some information from a MIM into a DSM, which answered one of the research questions; it was, however, not possible to claim that this will always improve the result.
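The DSM clustering idea at the heart of this record can be illustrated with a toy example. The sketch below is not the IGTA++ algorithm or the author's method; the component names, interactions, and the simple boundary-cost measure are invented purely to show the objective such clustering optimizes: grouping interacting components into the same module so that few interactions cross module boundaries.

```python
# Toy Design Structure Matrix (DSM): entry [i][j] = 1 when components
# i and j interact. Names and interactions are hypothetical.
components = ["frame", "clamp", "sensor", "amplifier", "logger"]
dsm = [
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
]

def boundary_cost(dsm, modules):
    """Count interactions that cross module boundaries; clustering
    aims to keep interacting components inside the same module."""
    label = {}
    for m, members in enumerate(modules):
        for i in members:
            label[i] = m
    return sum(
        1
        for i in range(len(dsm))
        for j in range(i + 1, len(dsm))
        if dsm[i][j] and label[i] != label[j]
    )

flat = [[0], [1], [2], [3], [4]]   # every component in its own module
grouped = [[0, 1, 2], [3, 4]]      # grouped by interaction
print(boundary_cost(dsm, flat))     # 4 crossing interactions
print(boundary_cost(dsm, grouped))  # 0
```

A real method would also fold in strategic drivers (the MIM side) and search over clusterings rather than comparing two by hand; this only shows the complexity half of the objective.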
75

Development and validation of NESSIE: a multi-criteria performance estimation tool for SoC / Développement et validation de NESSIE: un outil d'estimation de performances multi-critères pour systèmes-sur-puce.

Richard, Aliénor 18 November 2010
The work presented in this thesis aims at validating an original multi-criteria performance estimation tool, NESSIE, dedicated to performance prediction to accelerate the design of electronic embedded systems.

This tool was developed in a previous thesis to cope with the limitations of existing design tools, and it offers a new answer to the growing complexity of current applications and electronic platforms and the multiple constraints they are subject to. More precisely, the goal of the tool is to propose a flexible framework targeting embedded systems in a generic way and to enable fast exploration of the design space, based on the estimation of user-defined criteria and a joint hierarchical representation of the application and the platform.

In this context, the purpose of the thesis is to put the original NESSIE framework to the test and analyze whether it is indeed useful and able to solve current design problems. The dissertation therefore presents:

- A study of the state of the art in existing design tools. I propose a classification of these tools and compare them against typical criteria. This substantial survey completes the state of the art established in the previous work and shows that the NESSIE framework offers solutions to the limitations of these tools.
- The framework of our original mapping tool and its calculation engine. Through this presentation, I highlight the main ingredients of the tool and explain the implemented methodology.
- Two external case studies, chosen to validate NESSIE, which form the core of the thesis. These case studies pose two different design problems (a reconfigurable processor, ADRES, applied to a matrix multiplication kernel, and a 3D-stacked MPSoC problem applied to a video decoder) and show the ability of our tool to target different applications and platforms.

The validation is based on comparing a multi-criteria estimation of the performance of a significant number of solutions between NESSIE and the external design flow. In particular, I discuss the prediction capability of NESSIE and the accuracy of the estimation. For each case study, the analysis is completed by quantifying the modeling time and design time in both flows, in order to measure the gain achieved by using our tool upstream of the classical tool chain compared with the existing design flow alone.

The results showed that NESSIE is able to predict with a high degree of accuracy the solutions that are the best candidates for the lower design flows. Moreover, in both case studies, modeled at a low and a higher abstraction level respectively, I obtained a significant gain in design time. However, I also identified limitations that impact the modeling time and could prevent efficient use of the tool for more complex problems. To address these issues, I conclude by proposing several improvements to the framework and giving perspectives for further development of the tool. / Doctorat en Sciences de l'ingénieur
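The kind of design-space exploration NESSIE performs can be shown in miniature. The sketch below is a heavily simplified stand-in, not NESSIE's engine: the task workloads, processor parameters, and the two criteria (latency, energy) are invented for illustration. It enumerates task-to-processor mappings, estimates each against both criteria, and keeps the Pareto-optimal candidates, which is the multi-criteria pruning idea the thesis evaluates.

```python
from itertools import product

# All numbers are invented for illustration; NESSIE's models are
# user-defined and far richer than this.
tasks = {"fetch": 200, "decode": 500, "render": 800}  # kcycles
procs = {
    "big":    (2.0, 900.0),  # (clock in GHz, power in mW)
    "little": (1.0, 300.0),
    "medium": (1.5, 800.0),  # worse than "big" on both counts here
}

def estimate(mapping):
    """Estimate (latency_us, energy_nJ) of a task-to-processor
    mapping, assuming the tasks run one after another."""
    latency = energy = 0.0
    for task, proc in mapping.items():
        freq_ghz, power_mw = procs[proc]
        t_us = tasks[task] / freq_ghz  # kcycles / GHz = microseconds
        latency += t_us
        energy += power_mw * t_us      # mW * us = nJ
    return latency, energy

# Enumerate the whole design space and keep the Pareto-optimal
# mappings: those no other mapping beats on both criteria at once.
points = {c: estimate(dict(zip(tasks, c))) for c in product(procs, repeat=3)}
pareto = [c for c, (lat, en) in points.items()
          if not any(l <= lat and e <= en and (l, e) != (lat, en)
                     for l, e in points.values())]
print(len(points), len(pareto))  # 27 candidate mappings, 8 Pareto-optimal
```

Every mapping that uses "medium" is dominated by swapping it for "big", so the Pareto front contains only the big/little trade-off curve, illustrating how estimation prunes the space before any detailed design work.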
76

Developing multi-criteria performance estimation tools for Systems-on-chip

Vander Biest, Alexis 23 March 2009
The work presented in this thesis targets the analysis and implementation of multi-criteria performance prediction methods for Systems-on-Chip (SoC).

These new SoC architectures offer the opportunity to integrate complete heterogeneous systems into a single chip and can be used to design battery-powered handhelds, security-critical systems, consumer electronics devices, and so on. However, this variety of applications usually comes with many different performance objectives, such as power consumption, yield, design cost, production cost, and silicon area. These requirements are often very difficult to meet simultaneously, so SoC design usually relies on making the right design choices and finding the best performance compromises. In parallel with this architectural paradigm shift, new Very Deep Submicron (VDSM) silicon processes have a growing impact on performance and deeply modify the way a VLSI system is designed, even at the first stages of a design flow. In such a context, where many new technological and system-related variables enter the game, early exploration of the impact of design choices becomes crucial to estimate the performance of the system under design and to reduce its time-to-market.

In this context, this thesis presents:

- A study of state-of-the-art tools and methods used to estimate the performance of VLSI systems, and an original classification based on several of the features and concepts they use. Based on this comparison, we highlight their weaknesses and gaps in order to identify new opportunities in performance prediction.
- The definition of new concepts to enable the automatic exploration of large design spaces, based on flexible performance criteria and degrees of freedom representing design choices.
- The implementation of two new tools of our own:
  - Nessie, a tool enabling hierarchical representation of an application along with its platform, which automatically performs the mapping and the estimation of their performance.
  - Yeti, a C++ library enabling the definition and evaluation of closed-form expressions and table-based relations. It provides the user with input and model sensitivity analysis, simulation scripting, run-time building, and automatic plotting of results. Additionally, Yeti can work in standalone mode as an independent framework for model estimation and analysis.

To demonstrate the use and interest of these tools, this thesis provides several case studies whose results are discussed and compared with the literature. Using Yeti, we successfully reproduced the results of a model estimating multi-core computation power and extended them thanks to the representation flexibility of our tool. We also built several models from the ground up to help dimension interconnect links and optimize clock frequency. Thanks to Nessie, we were able to reproduce the NoC power consumption results of an H.264/AVC decoding application running on a multicore platform. These results were then extended to the case of a 3D die-stacked architecture, and the performance benefits are discussed. We end by highlighting the advantages of our technique and discussing future opportunities for performance prediction tools. / Doctorat en Sciences de l'ingénieur
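Yeti's core idea, evaluating closed-form models and probing their sensitivity to inputs, can be sketched in a few lines. The model below (dynamic CMOS power, P = activity · C · V² · f) and its parameter values are illustrative assumptions, not taken from the thesis, and Yeti itself is a C++ library; this is only a minimal Python analogue of input sensitivity analysis.

```python
# Toy closed-form model in the spirit of Yeti: dynamic CMOS power,
# P = activity * C * V^2 * f. All parameter values are made up.
def power_mw(activity, cap_nf, vdd, freq_mhz):
    return activity * cap_nf * vdd ** 2 * freq_mhz  # nF * V^2 * MHz = mW

def sensitivity(model, params, rel=0.01):
    """Elasticity of the model output with respect to each input:
    relative output change divided by relative input change."""
    base = model(**params)
    result = {}
    for name, value in params.items():
        bumped = dict(params, **{name: value * (1 + rel)})
        result[name] = (model(**bumped) - base) / base / rel
    return result

params = dict(activity=0.2, cap_nf=1.5, vdd=1.1, freq_mhz=400.0)
elasticities = sensitivity(power_mw, params)
# vdd enters quadratically, so its elasticity is ~2; the linear
# parameters (activity, capacitance, frequency) come out at ~1.
print(elasticities)
```

A sensitivity report like this tells the designer which knob matters most: here, lowering the supply voltage pays off roughly twice as fast as lowering the clock.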
77

AI as a Creative Colleague in Game Level Design

Larsson, Tinea January 2022
This paper describes a modification to the Evolutionary Dungeon Designer, a mixed-initiative co-creative tool, and a study performed to evaluate and analyze the co-creative relationship between the human designer and the AI. This relationship is delicate, as adjusting the AI's autonomy can negatively affect the user experience, and perfecting this kind of human-AI collaboration requires further research. In this thesis, different degrees of AI initiative are explored to gain a deeper understanding of mixed-initiative co-creative tools. A user study was performed on the Evolutionary Dungeon Designer with three varying degrees of AI initiative. The study highlighted elements of frustration that the human designer experiences when using the tool, and AI behaviour that could strain the relationship. The paper concludes with the identified issues and possible solutions, as well as suggestions for further research.
78

En färgglad studie om spelarbeteenden / A colorful study of player behaviors

André, Andreas January 2014
This is the result of a 10-week study on whether player behaviors and choices are affected by the surrounding colors in their environment. A number of players were asked to move through a virtual environment, while being timed, making left and right decisions based on different colors (red, green, blue, and gray). While most choices involved two different colors, some used the same color. The results show that the players most likely prefer one color over another, and that they most likely prefer a non-neutral color over a neutral one. Based on these results and the interviews conducted, the conclusion is that the players' choices are very likely affected by the colors in the environment, but it is not clear exactly how they are affected.
79

Game Design Patterns for Designing Stealth Computer Games

Hu, Mengchen January 2014
Design patterns are widely used in game design, especially in action games, and can be seen as distilled, reusable gameplay solutions. A stealth game is a video game genre that rewards the player for using stealth (concealing the player's avatar in order to avoid enemies) to overcome antagonists. In some cases there is a conflict between difficulty and game experience in stealth games. To address this problem, we researched design patterns in stealth games. We collected a set of stealth game design patterns from three different stealth games, using a dedicated game design pattern template. We then created a questionnaire to collect opinions from designers with experience in the stealth game area. Based on this data, we designed and created a prototype application. Unlike general websites or books, the application presents game design patterns for a single genre (stealth games). Using the application, designers can look up stealth game design patterns based on a design document. The application introduces stealth game design patterns to designers and shows how to use them in stealth game design.
80

An Investigation of Methods to Improve Area and Performance of Hardware Implementations of a Lattice Based Cryptosystem

Beckwith, Luke Parkhurst 05 November 2020
With continuing research into quantum computing, current public key cryptographic algorithms such as RSA and ECC will become insecure. These algorithms are based on the difficulty of integer factorization or discrete logarithm problems, which are difficult to solve on classical computers but become easy with quantum computers. Because of this threat, government and industry are investigating new public key standards, based on mathematical assumptions that remain secure under quantum computing. This paper investigates methods of improving the area and performance of one of the proposed algorithms for key exchanges, "NewHope." We describe a pipelined FPGA implementation of NewHope512cpa which dramatically increases the throughput for a similar design area. Our pipelined encryption implementation achieves 652.2 Mbps and a 0.088 Mbps/LUT throughput-to-area (TPA) ratio, which are the best known results to date, and achieves an energy efficiency of 0.94 nJ/bit. This represents TPA and energy efficiency improvements of 10.05× and 8.58×, respectively, over a non-pipelined approach. Additionally, we investigate replacing the large SHAKE XOF (hash) function with a lightweight Trivium based PRNG, which reduces the area by 32% and improves energy efficiency by 30% for the pipelined encryption implementation, and which could be considered for future cipher specifications. / Master of Science / Cryptography is prevalent in almost every aspect of our lives. It is used to protect communication, banking information, and online transactions. Current cryptographic protections are built specifically upon public key encryption, which allows two people who have never communicated before to setup a secure communication channel. However, due to the nature of current cryptographic algorithms, the development of quantum computers will make it possible to break the algorithms that secure our communications. 
Because of this threat, new algorithms based on principles that stand up to quantum computing are being investigated to find a suitable alternative to secure our systems. These algorithms will need to be efficient in order to keep up with the demands of the ever growing internet. This paper investigates four hardware implementations of a proposed quantum-secure algorithm to explore ways to make designs more efficient. The improvements are valuable for high throughput applications, such as a server which must handle a large number of connections at once.
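The SHAKE function this record proposes replacing is an extendable-output function (XOF): it expands a short seed into a pseudorandom stream of any requested length, for instance so that both parties in a key exchange can regenerate the same public parameters from a shared seed. A minimal sketch using Python's standard library follows; the 1344-byte output length is an arbitrary illustrative choice, not a value from the NewHope specification.

```python
import hashlib

def expand_seed(seed: bytes, n_bytes: int) -> bytes:
    """Expand a short seed into n_bytes of pseudorandom output using
    SHAKE-128. As an extendable-output function (XOF), SHAKE lets the
    caller request any output length, unlike a fixed-size hash."""
    return hashlib.shake_128(seed).digest(n_bytes)

seed = bytes(32)                  # a 256-bit all-zero seed, for the demo
stream = expand_seed(seed, 1344)  # output length chosen arbitrarily here
assert len(stream) == 1344
# The expansion is deterministic: the same seed always yields the
# same stream, so a transmitted 32-byte seed stands in for kilobytes
# of shared pseudorandom data.
assert expand_seed(seed, 1344) == stream
```

In hardware, this SHAKE expansion is a large fixed cost per call, which is what motivates the paper's investigation of a lightweight stream-cipher-based PRNG as a drop-in source of pseudorandom bytes.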
