221 |
A study on dynamic memory allocation mechanisms for small block sizes in real-time embedded systems. Heikkilä, V. (Valtteri), 08 February 2013 (has links)
Embedded real-time and battery-powered systems are increasing in number, and their software complexity is growing. This creates a demand for more efficient dynamic memory allocation in real-time embedded systems. Small improvements in dynamic memory allocation can greatly reduce a system's overall memory usage, fragmentation and energy consumption. Most of today’s general-purpose allocators are unsuitable for real-time embedded systems since they are not designed for real-time constraints.
This thesis contains a study on the suitability of dynamic memory allocation mechanisms for small block allocation in real-time embedded systems. We first perform a literature survey on dynamic memory allocation mechanisms and then analyze general-purpose allocators. From this we arrive at a set of allocation mechanisms for additional experimental study. We then simulate the selected mechanisms using both real and synthetic traces to measure each mechanism's fragmentation and worst-case execution time (WCET). Finally, we evaluate the mechanisms and their tradeoffs and present an analysis of their suitability for small block allocation in real-time embedded systems.
This thesis additionally introduces the Bitframe allocator, a new bitmapped-fits allocator. The introduced allocator demonstrates that bitmapped fits can be used effectively for dynamic memory allocation. It remains unclear, however, whether bitmapped fits can offer better efficiency than other mechanisms.
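The abstract does not include the Bitframe source, so the following is only a rough sketch of the general bitmapped-fits idea, under the assumption of a pool of fixed-size blocks: a bitmap records which blocks are free, and allocation scans for a run of free bits long enough for the request.

```python
# Illustrative bitmapped-fits sketch (not the actual Bitframe allocator):
# a bitmap marks which fixed-size blocks in a pool are free (0) or used (1),
# and allocation searches for the first run of free bits that fits.
BLOCK_SIZE = 16            # bytes per block (hypothetical)
NUM_BLOCKS = 64

bitmap = [0] * NUM_BLOCKS  # 0 = free, 1 = used

def alloc(size: int):
    """Return the starting block index of a free run, or None if nothing fits."""
    blocks_needed = -(-size // BLOCK_SIZE)  # ceiling division
    run_start, run_len = 0, 0
    for i, used in enumerate(bitmap):
        if used:
            run_start, run_len = i + 1, 0
            continue
        run_len += 1
        if run_len == blocks_needed:
            for j in range(run_start, run_start + blocks_needed):
                bitmap[j] = 1               # mark the run as used
            return run_start
    return None

def free(start: int, size: int) -> None:
    """Mark the blocks of a previous allocation as free again."""
    for j in range(start, start + -(-size // BLOCK_SIZE)):
        bitmap[j] = 0

print(alloc(40))  # needs 3 blocks -> 0
print(alloc(16))  # needs 1 block  -> 3
```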
Our results confirm that TLSF is one of the best allocators for real-time systems in terms of performance and fragmentation. They also confirm that reaps have low fragmentation and very low WCET when small blocks are allocated. Our results further show that the simple segregated storage and region mechanisms should not be used in real-time systems due to their high worst-case fragmentation.
|
222 |
A smart environment at home (Älykäs ympäristö kotona). Kajaste, E. (Enni), 17 January 2014 (has links)
Smart technology and smart homes have become increasingly common in recent decades. Smart home construction is growing, and smart apartments have entered the ordinary consumer market. The needs, attitudes and acceptance of smart home users should be studied more closely so that smart homes can meet residents' everyday needs.
The aim of this study was to answer the following research question: How do ordinary people experience smart technology as part of their home?
The study set out to examine a smart environment in everyday life: what residents' attitudes towards smart homes are, how residents experience smart technology in their everyday lives, how well a smart environment meets ordinary people's everyday needs, and how smart homes should be developed in the future.
The study used a qualitative research approach. The empirical material was collected through interviews, conducted as semi-structured thematic interviews. The interviewees were eight men and women aged 22–82 from the Kummatti residential area in Raahe, living in apartments equipped with everyday technology.
The results show that the interviewees' knowledge of everyday technology and smart homes was limited. Their need for technology in everyday life was modest, centring mainly on the computer and the mobile phone. For the future, the interviewees wished for more community, social interaction, meeting other people and an active life. Homes should remain homes even when they contain smart technology; in the home, the person should come first and the technology second.
|
223 |
Choosing a platform as a service for software as a service provider. Kreivi, A. (Antti), 05 May 2014 (has links)
Cloud computing is still sparsely researched, especially the platform as a service (PaaS) service model. PaaS is still very new in cloud computing and is often confused with the more familiar IaaS and SaaS models. First, the basic concepts of cloud computing, such as the essential characteristics, deployment models and service models, are presented; everything else in cloud computing builds on these main concepts.
The main objectives of this thesis are to find out why PaaS is a better alternative than building one's own platform on top of a selected infrastructure, and to justify why the selected PaaS is the most suitable for a cloud software project. PaaS has many advantages and concerns that must be sorted out before a decision to deploy one can be made. With PaaS, developers do not need to know much about the underlying cloud infrastructure, which is usually completely hidden, and can instead focus on development. PaaS platforms also bring cost reductions through reduced hardware and staffing needs and more predictable expenditure. The biggest challenges with PaaS are security and privacy concerns and the risk of vendor lock-in.
The four PaaS alternatives evaluated are Amazon Web Services with Elastic Beanstalk, Google App Engine, Heroku and Cloud Foundry. These were selected from a larger set of alternatives together with the contractor: they are among the best-known PaaS providers and were the most suitable for the contractor's project and the criteria. To evaluate them, five justified criteria derived from the literature are defined; the measurable factors cover scalability, service agreements and vendor lock-in, security, frameworks and supported languages, and pricing models.
The choosing-by-advantages (CBA) decision-making method is used to find the best alternative. CBA is designed for fast and sound decision making and is based on the importance of advantages. The factors derived from the criteria are defined so that they uncover the advantages of each PaaS alternative; disadvantages are also identified and listed to further strengthen the decision. With the help of the justified criteria and the CBA analysis, the goal is to support and justify the PaaS decision to be made.
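As an illustrative sketch of CBA-style scoring only, with made-up advantages and importance values rather than the thesis's actual analysis: each alternative's advantage on a factor is given an importance score, and the alternative with the largest total importance of advantages is preferred.

```python
# Hypothetical importance-of-advantages scores per factor (0-100); these
# numbers are invented for illustration and are not from the thesis.
advantages = {
    "AWS Elastic Beanstalk": {"scalability": 60, "languages": 40, "lock-in": 10},
    "Google App Engine":     {"scalability": 70, "languages": 20, "lock-in": 10},
    "Heroku":                {"scalability": 40, "languages": 50, "lock-in": 30},
    "Cloud Foundry":         {"scalability": 30, "languages": 50, "lock-in": 70},
}

# Sum the importance of advantages for each alternative and rank them.
totals = {alt: sum(scores.values()) for alt, scores in advantages.items()}
for alt, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{alt:22} total importance of advantages: {total}")
print("Preferred alternative:", max(totals, key=totals.get))
```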
|
224 |
Students versus professionals as experiment subjects: an investigation on the effectiveness of TDD on code quality. Salman, I. (Iflaah), 10 April 2015 (has links)
Background: Most empirical studies in software engineering use students as subjects for experiments. This raises the concern of whether results acquired with students are applicable to industry. Researchers argue that experiments with students lack realism. This not only threatens the generalizability of research results but also becomes a potential barrier to the adoption of research results in industry.
Aim: The objective of the study is to investigate whether students are representative of professionals in software engineering experiments. We address this objective by investigating the difference in code quality between the two subject groups, students and professionals, in the context of the effectiveness of Test-Driven Development (TDD) on internal code quality. The study tests two hypotheses: the first tests the difference in code quality between the two subject groups; the second tests the difference in code quality between tasks implemented following the TDD and Test-Last Development (TLD) methodologies.
Method: The study follows a quantitative research approach using experimentation. It involved graduate students from academia and professionals from industry, each in their own environmental setting. Both subject groups implemented tasks using the TDD and TLD approaches. For the first hypothesis, the treatments are professionals and students, to study the difference between the two subject groups. For the second hypothesis, the treatments are TDD and TLD, to assess the difference in code quality between the two methodologies.
Results: Students differed from professionals in the TLD and TDD1 implementations. The results could not show a difference for the TDD2 task. For professionals, code quality differed as an effect of TDD in the TLD vs. TDD1 and TLD vs. TDD2 comparisons. The students' data showed a difference in code quality for TLD vs. TDD1 but not for TLD vs. TDD2. The null hypotheses were rejected at p < 0.05.
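The abstract does not name the statistical test used; purely as an illustration of rejecting a null hypothesis at the 0.05 level, the sketch below compares a hypothetical code-quality metric between two subject groups with a nonparametric Mann-Whitney U test. The data values and the choice of test are assumptions, not taken from the thesis.

```python
# Illustrative two-group comparison with hypothetical code-quality scores;
# the thesis does not specify the exact test or metric used.
from scipy.stats import mannwhitneyu

students = [0.62, 0.58, 0.71, 0.55, 0.66, 0.60]       # hypothetical metric values
professionals = [0.74, 0.69, 0.80, 0.77, 0.72, 0.75]  # hypothetical metric values

stat, p_value = mannwhitneyu(students, professionals, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the groups differ.")
else:
    print("Cannot reject the null hypothesis.")
```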
Conclusion: The results of the study are confined to the context of TDD and, to some extent, to the experimental design. A future extension could compare the two subject groups by replicating the same experimental design. Additionally, the comparison could be studied in other areas of software engineering for better generalizability of the comparison results.
|
225 |
Free-to-play games: what are gamers paying for? Riekki, J. (Johannes), 29 February 2016 (has links)
Free-to-play games are a rather new phenomenon in the world of video games, yet they have proved immensely popular among many gamers. The purpose of this study was to find out more about the business model of free-to-play games and their content, both free and monetized. To support this, a closer look was taken at a few games that offer players free access to the game.
The other purpose of this study was to examine the behaviour and motivations of the gamers themselves. By conducting a survey and reviewing previous studies on gamer behaviour, it was found that gamers value both the functional and the visual improvements that games offer for a charge. It was also discovered that players aim to enhance their immersion and enjoyment of the gameplay by consuming paid in-game content.
|
226 |
Discovering value for health with grocery shopping data. Ponsimaa, P. (Petteri), 25 May 2016 (has links)
Food retailers are taking a more active role in the customer value creation process, shifting their attention from the sale of goods to supporting customers' value creation and discovering more innovative service-based business models. With access to their customer data, consumers may develop more responsible consumption behaviour, make more economical choices, and become more aware of the healthiness of their food. This exploratory study sets out to answer the question of what value, if any, the use of grocery shopping data brings to customers.
Using design science research, the thesis makes use of the grocery purchase data available to S-Group customers and presents ways of applying the data while making it meaningful for them. The aim was to construct visualization application prototypes for exploring the value and benefits of purchase data as experienced by customers. To evaluate the application design, a study group of eight customers was invited to provide purchase data and feedback on the data visualizations. The focus was on building visualizations of grocery consumption patterns based on customer interviews and then evaluating their impact on the study group through interviews and usage data.
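As a minimal sketch of the kind of visualization such a prototype might build, assuming a hypothetical purchase-data layout with date, product_category and eur columns; this is not the actual S-Group data schema or the thesis prototypes.

```python
# Sketch: monthly grocery spend per category from a hypothetical CSV export.
import pandas as pd
import matplotlib.pyplot as plt

purchases = pd.read_csv("purchases.csv", parse_dates=["date"])  # assumed layout
purchases["month"] = purchases["date"].dt.to_period("M").astype(str)

# Aggregate spend by month and category to surface consumption patterns.
monthly = (purchases.groupby(["month", "product_category"])["eur"]
           .sum()
           .unstack(fill_value=0))

monthly.plot(kind="bar", stacked=True, figsize=(10, 5))
plt.ylabel("Spend (EUR)")
plt.title("Monthly grocery spend by category")
plt.tight_layout()
plt.show()
```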
The visualization prototypes allowed the participants to discover something new about their shopping and food consumption behaviour that was not known to them before the study and not visible from the raw purchase data alone. The interviews suggested that visualizations of health-related data encourage reflection on consumption habits and may thus be used as a tool for increasing awareness of one's shopping behaviour. A number of limitations in data utilization were encountered that hindered inference-making and reflection on the data. Lastly, the prototypes led the participants to envision new digital health services, some of which might have commercial value.
|
227 |
Cloud computing: migrating to the cloud, Amazon Web Services and Google Cloud Platform. Quadri, S. (Samsideen), 15 March 2017 (has links)
Cloud computing offers great benefits, as well as challenges, for small, medium, and large organisations. Whether it operates in the financial, technology, or engineering sector, any company may find a cloud component useful for its needs. Although the benefits come with many challenges, experts believe that the advantages of cloud computing outweigh the disadvantages, and with more research in the area the challenges can be addressed.
Amazon and Google are prominent cloud service providers with a broad and evolving set of components and services. However, there are many other cloud vendors as well, which makes it difficult for potential cloud users to choose a suitable vendor to migrate to from the numerous available options.
Using a design science approach, this thesis describes how different providers implement cloud computing; specifically, it examines and demonstrates how Amazon and Google have implemented and structured their cloud infrastructure. It also provides criteria for assessing the suitability of a business or organisation for choosing a cloud provider. In addition, a web application hosted on both the AWS and GCP platforms was developed to demonstrate the workability of the framework and guideline for selecting a cloud provider. The application, called KlaudCelet, is a recommender system built on previous research on cloud deployment frameworks and on a proposed model (CPSM, Cloud Provider Selection Model); its recommendations are grounded in research results. KlaudCelet recommends a deployment model, a cloud service provider, and a decision on whether or not to migrate. CPSM itself was not used in KlaudCelet; instead, the models by Keung & Kwok and by Misra & Mondal were used.
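As a loose illustration of how a score-based recommendation step can work, the sketch below uses a hypothetical weighted-criteria scoring with invented weights and ratings; it does not reproduce KlaudCelet, CPSM, or the Keung & Kwok and Misra & Mondal models.

```python
# Hypothetical weighted-scoring sketch of a provider recommendation step.
WEIGHTS = {"cost": 0.3, "security": 0.3, "scalability": 0.2, "lock_in_risk": 0.2}

# Illustrative 1-5 ratings an organisation might assign to each provider.
providers = {
    "AWS": {"cost": 3, "security": 5, "scalability": 5, "lock_in_risk": 2},
    "GCP": {"cost": 4, "security": 4, "scalability": 5, "lock_in_risk": 3},
}

def score(ratings: dict) -> float:
    """Weighted sum of criterion ratings for one provider."""
    return sum(WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

best = max(providers, key=lambda name: score(providers[name]))
for name, ratings in providers.items():
    print(f"{name}: {score(ratings):.2f}")
print(f"Recommended provider: {best}")
```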
The guideline developed was used to create the recommender framework. The need for solving this problem was shown by reviewing previous research and by conducting a small-scale survey among IT experts. The survey found that IT experts are optimistic about cloud computing adoption despite challenges such as security, privacy and the learning curve. The decision to migrate to the cloud must take into account the organisation's state and culture, the cloud deployment model, and the choice of a suitable cloud service provider.
|
228 |
Pervasive gaming: from special to mundane. Taipaleenmäki, T. (Tomi), 09 June 2014 (has links)
In the early 2000s the idea of digital pervasive gaming was rather limited, as there were many technological obstacles that made implementing pervasive elements in practice difficult or expensive. Playing a pervasive game could mean carrying a plethora of different devices, depending on whether you needed a GPS for navigation, a laptop for data search, some contraption for virtual reality, or a camera to take photos with. On top of that, accessing the internet on the go was, if not impossible, potentially very expensive and not necessarily easy, depending on your location.
Gradually the technology has evolved in a more versatile direction. Today a good mobile phone can replace most of the devices that were cumbersome to lug around ten years ago. 3G and 4G connections provide internet access almost anywhere, whether in the city or in a forest. Of course there are still gaps in network coverage, but at least there is less need to find a phone outlet.
Pervasive gaming is no longer just a location-based outdoor activity. Thanks to advances in technology, elements enabling pervasive gaming have also found their way into home game consoles, MMO games and many kinds of social platform games.
Pervasive games come in many shapes and forms. The elements used can be as simple as social interaction, where the player informs their social network when the game is played or what achievements they have earned, thereby trying to lure in more gamers; or they can be more massive in style, where social activity is a must and where and when the game is played also has an effect. The game can use specialized controllers, such as motion or voice, or the player can simply stream the gameplay to the web, enabling others to spectate and comment on it.
Designers need to ask themselves what kinds of features they want to use and what kinds of devices players need in order to use these pervasive features. There are also questions of player security, whether the physical wellbeing of the player at the real-life location of the game or the information security of the data gathered during the game.
This thesis tries to provide theoretical insight into what pervasive gaming has been, what it is now and where it is heading. It also offers some speculation about how pervasive elements should be designed and used, as well as points designers should take note of.
|
229 |
Bringing usability into open source projects: a case study of four UKKOSS projects (Käytettävyyden vieminen avoimen lähdekoodin projekteihin: tapaustutkimus neljästä UKKOSS-projektista). Hietala, L. (Lassi), 03 May 2017 (has links)
Open source software (OSS), which emerged in the 1980s and 1990s, has established its place as part of modern computing. The earliest open source programs were mainly tools intended for specialists, but nowadays almost anyone may have encountered, for example, the Mozilla Firefox browser or the Ubuntu operating system at work or at home. As the user base has become more mainstream, increasing attention has been paid to the usability of open source software, whose shortcomings have been widely criticized. Possible reasons for the poor usability of open source software can be found in its hacker-driven history and in OSS developers' mindset, which centres on technology and program functionality. This master's thesis focuses on how OSS developers and users relate to usability.
Since 2007, the University of Oulu has organized UKKOSS projects whose purpose is to bring usability into open source projects. This master's thesis examines, by means of a case study, four UKKOSS projects carried out in 2013–2014. The study uses qualitative content analysis. It seeks to find out how open source communities respond to usability contributions coming from outside the community. In addition, it investigates whether open source communities use the gatekeeping tactics described by Rajanen, Iivari & Lanamäki to block usability activities.
The results show that open source developers are interested in usability and respond mostly positively to the work of a usability team coming from outside the development community. The use of gatekeeping tactics was observed only sparingly, although the tactic of social exclusion appeared to varying degrees in all cases. All UKKOSS project teams were able to identify several usability problems of varying severity in their target software and to offer suggestions for improvement; thus the student projects succeeded in producing information about usability issues that was useful to the OSS developers.
Because only four UKKOSS projects were studied, with target software of quite different types, the results are not directly generalizable. The case study nevertheless offers new perspectives on the usability of open source software and on OSS developer communities' views of usability issues. The results can be put to use in future UKKOSS projects.
|
230 |
Metadata management in distributed file systems. Laitala, J. (Joni), 12 September 2017 (has links)
The purpose of this research has been to study the architectures of popular distributed file systems used in cloud computing, with a focus on their metadata management, in order to identify differences between and issues within varying designs from the metadata perspective. File system and metadata concepts are briefly introduced before the comparisons are made.
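As background illustration only, and not drawn from the thesis: many distributed file systems separate metadata from data, with a dedicated metadata service mapping file paths to blocks and blocks to the data nodes holding their replicas, roughly in the spirit of an HDFS-style namenode. The toy in-memory sketch below uses a hypothetical API.

```python
# Toy sketch of centralized metadata management: the metadata service maps
# paths to block IDs and block IDs to replica nodes. Hypothetical interface,
# not code from the thesis or from any real distributed file system.
from dataclasses import dataclass, field

@dataclass
class MetadataService:
    namespace: dict = field(default_factory=dict)        # path -> [block ids]
    block_locations: dict = field(default_factory=dict)  # block id -> [data nodes]
    _next_block: int = 0

    def create_file(self, path: str, num_blocks: int, nodes: list) -> None:
        """Register a file and assign each of its blocks to replica nodes."""
        blocks = []
        for _ in range(num_blocks):
            block_id = self._next_block
            self._next_block += 1
            self.block_locations[block_id] = list(nodes)
            blocks.append(block_id)
        self.namespace[path] = blocks

    def lookup(self, path: str) -> list:
        """Return (block id, replica nodes) pairs a client would read from."""
        return [(b, self.block_locations[b]) for b in self.namespace[path]]

meta = MetadataService()
meta.create_file("/logs/2017-09-12.txt", num_blocks=2, nodes=["dn1", "dn2", "dn3"])
print(meta.lookup("/logs/2017-09-12.txt"))
```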
|