71.
Från data till kunskap : En kvalitativ studie om interaktiv visualisering av big data genom dashboards [From data to knowledge: A qualitative study of interactive visualization of big data through dashboards]. Agerberg, David; Eriksson, Linus. January 2016.
Rapidly growing volumes of data demand new solutions for analysis and visualization. The growing amount of data contains valuable information which organizations in an increasingly digitized society need to manage. Visualizing data, both statically and interactively, is a major challenge. Visualization of big data opens several opportunities, including risk assessment and decision support. Previous research indicates a lack of standards and guidelines for the development of interactive dashboards. To study success factors from a user-centered perspective, we took a qualitative approach using semi-structured interviews, complemented by a thorough examination of the existing literature in this field. A total of eight interviews were held; all eight respondents had experience of using or developing dashboards. The results indicate that user experience is an important yet insufficiently applied principle. They also indicate challenges in managing big data and, in particular, in visualizing it. The results were developed into a model which illustrates guidelines and the vital components to orchestrate when developing a dashboard. A user-centered approach should pervade the entire development process. Interactive functionality is a necessity rather than a recommendation; with interactivity come drill-down functions, which make use more intuitive. User experience is an essential component of the model, highlighting individual customization while also accommodating a large target group. The last component highlights the importance of early prototyping and an iterative approach to software development. The conclusion of the study is our complete model, which offers opportunities to transform big data into great knowledge.
72.
Compaction Strategies in Apache Cassandra : Analysis of the Default Cassandra Stress Model. Ravu, Venkata Sathya Sita J S. January 2016.
Context. The present trend in a large variety of applications, ranging from the web and social networking to telecommunications, is to gather and process very large and fast-growing amounts of information, leading to a common set of problems known collectively as "Big Data". The ability to run large-scale analytics over large numbers of data sets has, in the last decade, proved to be a competitive advantage in a wide range of industries such as retail, telecom, and defense. In response to this trend, the research community and the IT industry have proposed a number of platforms to facilitate large-scale data analytics, including a new class of databases often referred to as NoSQL data stores. Apache Cassandra is one such NoSQL data store. This research analyzes the performance of different compaction strategies in different use cases for the default Cassandra stress model. Objectives. The performance of the compaction strategies is observed in various scenarios on the basis of three use cases: write-heavy (90/10), read-heavy (10/90), and balanced (50/50), for the default Cassandra stress model, so as to provide the events and specifications that suggest when to switch from one compaction strategy to another. Methods. A single-node Cassandra deployment is set up on a web server, and its read and write performance under different compaction strategies is studied with read-heavy, write-heavy, and balanced workloads. Its performance metrics are collected and analyzed. Results. Performance metrics of the different compaction strategies are evaluated and analyzed. Conclusions. With a detailed analysis and logical comparison, we conclude that the Leveled Compaction Strategy performs better for a read-heavy (10/90) workload under the default Cassandra stress model, compared to the Size Tiered and Date Tiered Compaction Strategies, while for the balanced (50/50) workload the Date Tiered Compaction Strategy performs better than the Size Tiered and Leveled Compaction Strategies.
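To make the suggested switch concrete, here is a minimal sketch of changing a table's compaction strategy with the DataStax Python driver. The keyspace and table names follow cassandra-stress defaults, and the strategy classes are Cassandra's built-in ones; none of this is configuration taken from the thesis itself.

```python
# A minimal sketch, not the thesis's setup: switching compaction strategy
# per workload via the DataStax Python driver. "keyspace1"/"standard1" are
# cassandra-stress's default keyspace and table names.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])       # single-node deployment, as in the study
session = cluster.connect("keyspace1")

# Read-heavy (10/90) phase: Leveled compaction bounds per-read SSTable lookups.
session.execute("""
    ALTER TABLE standard1
    WITH compaction = {'class': 'LeveledCompactionStrategy',
                       'sstable_size_in_mb': 160}
""")

# Later, for the balanced (50/50) phase on time-ordered data,
# switch the same table to Date Tiered compaction.
session.execute("""
    ALTER TABLE standard1
    WITH compaction = {'class': 'DateTieredCompactionStrategy'}
""")

cluster.shutdown()
```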
73.
Personlighetens betydelse för upplevda hälsorisker på arbetsplatsen samt copingstrategi [The significance of personality for perceived health risks in the workplace, and coping strategy]. Momeni, Ellen. January 2006.
The Swedish welfare system faces major problems owing to the high proportion of people unable to work, and research in this area is considerable. Traditional quantitative studies usually focus on factors at work that cause ill health, without taking personality-related factors into account. In contrast, the present study aims to examine the significance of personality for the individual's coping strategy and for the perception of psychosocial health risk factors in the workplace. Qualitative interviews were conducted with 23 individuals, all working in the white-collar sector, and both inductive and deductive analysis were used. Personality was analyzed in terms of the Big Five, and coping styles in terms of problem-focused and emotion-focused coping. Personality proved to matter for coping strategy, although the pattern shows some inconsistency across the personality dimensions, which could be due to insufficient data. The significance of personality for perceived health risk factors was harder to discern; nevertheless, the study corroborates earlier quantitative findings in work environment research and gives demands, control, and social support a deeper, qualitative meaning.
74.
Estimation of net economic benefits of the Oregon big game resource to hunters. Shalloof, Faisal M. 22 April 1981.
Much outdoor recreation occurs on publicly owned land and water resources, or involves use of these public resources. Consequently, an economic problem arises concerning the value of recreational resources which do not have a conventional market price. Without a price to guide the allocation of resources, it is difficult to obtain optimal decisions in allocating these publicly owned natural resources among alternative uses, including recreation, timber, and domestic livestock production.

In Oregon, the big game resource has a great impact on the economy of the state. Positive values of this resource are related to recreational use and to the income generated, which benefits local economies. Negative values of big game include its competition for resources used for timber production and/or livestock grazing.

In order to better assess the value of the big game resource, an attempt has been made in this thesis to improve the demand models from which the net economic value of the Oregon big game resource can be derived. The data used in this study were obtained from questionnaires mailed to a random sample of Oregon big game hunters during the fall of 1968. The travel cost method was used to estimate the demand for big game hunting, based on the actual behavior of the hunters. Several algebraic forms of the travel cost demand equation were estimated for the Northeast and the Central regions of Oregon.

The concept of consumers' surplus was used to estimate the net economic value of the Oregon big game resource. Net economic value for the Northeast and Central regions of Oregon in 1968 dollars was approximately $14.3 million, based on the exponential demand function. Net economic value for the same two regions was approximately $11 million, based on the linear demand function.
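For readers who want the mechanics: the consumers' surplus behind those two figures is the area under the estimated demand curve above the observed travel cost. A sketch under generic symbols (not the thesis's estimated parameters) is:

```latex
% Consumers' surplus (CS) under the two travel-cost demand forms; a, b > 0
% are generic demand parameters and (p_0, q_0) is the observed travel cost
% and trip quantity -- placeholders, not the thesis's estimates.

% Linear demand: trips fall to zero at the choke price a/b.
q = a - bp, \qquad
CS_{\mathrm{lin}} = \int_{p_0}^{a/b} (a - bp)\, dp
                  = \frac{(a - bp_0)^2}{2b} = \frac{q_0^2}{2b}

% Exponential demand: the tail integral gives observed trips over b.
q = a e^{-bp}, \qquad
CS_{\mathrm{exp}} = \int_{p_0}^{\infty} a e^{-bp}\, dp
                  = \frac{a e^{-bp_0}}{b} = \frac{q_0}{b}
```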
An attempt was made in this study to predict the changes in consumers' surplus from changes in the number of deer and elk harvested. The regression models in this thesis implied that a ten percent increase in harvest would increase the consumers' surplus of hunters by more than ten percent. However, the hypothesis that a ten percent increase in harvest would increase consumers' surplus by exactly ten percent was not rejected by a statistical test. Therefore, a good deal more research is needed to determine the value of marginal changes in the number of deer and elk harvested.

It is thought that the estimation of net economic value in this study for the Northeast and Central regions of Oregon will be useful from the viewpoint of big game management and resource allocation in Oregon. / Graduation date: 1981
75.
Ecology and hunting behaviour of lions and leopards. Stander, Philip. January 1994.
No description available.
76.
Stonehenge - mer än bara stora stenar [Stonehenge - more than just big stones]. Traiven, Charlie. January 2016.
Stonehenge is a place of mystery and wonder, standing as a last witness to long-forgotten religious practices and rituals. Its sophisticated stone structure still makes a huge impression after more than 4,000 years, and it raises questions as to how it was built, and why. Stonehenge is today one of the most famous megalithic monuments in the world, and rightly so. But Stonehenge is more than just big stones; it has a much longer and richer history than that. Stonehenge also has many surrounding monuments from the same time period, which are today thought to have coexisted with it and filled different, specific functions in a ritual landscape. The more archaeologists learn about Stonehenge, the more complex the picture gets. Today, the understanding of Stonehenge lies just as much in the surrounding landscape as in the monument itself.
77.
Levererar Big-4 en högre revisionskvalitet jämfört med Non-Big 4? : En kvantitativ studie som jämför större och mindre revisionsbolags revisionskvalitet relaterat till revisionsarvodet / Do Big-4 audit companies deliver a higher quality compared to Non-Big 4 companies? : A quantitative study comparing large and small audit companies' audit quality related to the audit fee. Dahlström, Viktor; Danielsson, Robin. January 2017.
Aim: In the accounting literature, large audit firms have long been associated with higher audit quality than small audit firms. More recently, the question has been raised whether the high audit fees of large audit firms reflect higher audit quality or market power. This study provides new empirical evidence on the comparison between large and small audit firms, using the audit fee as a proxy for audit quality. Unlike earlier studies, it also considers audit quality in different litigation-risk environments. Method: The study uses a quantitative method with a positivist, deductive approach, where earlier research grounds the stated hypotheses. Secondary financial data for a total of 2,518 companies were collected from the Thomson Reuters Datastream database. Results & Conclusions: The results show significant differences in audit quality between large and small audit firms across the study's risk environments. For the European countries studied, the relationship between large and small audit firms is equivalent, while large and small audit firms in the US differ significantly in audit quality. Contribution of the thesis: The study makes two contributions to the accounting literature: new empirical evidence on audit quality between large and small audit firms, and new results on audit quality across litigation-risk environments. The results also give practitioners incentives to scrutinize the audit market in their own interest, and address standard setters' concerns about the balance of the audit market between large and small firms. Suggestions for future research: The study was carried out without considering qualitative factors that may affect audit quality, which opens space for comparative studies with a qualitative approach. It is also possible to expand the number of stock exchanges per country, or the number of countries in different risk environments.
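As a hedged illustration of the fee-as-proxy design such studies rely on, the sketch below regresses the log audit fee on a Big-4 indicator plus standard client-size controls; it is not the thesis's actual specification, and every column name is a hypothetical placeholder.

```python
# A minimal sketch, not the thesis's actual model: a Big-4 fee premium that
# survives size/risk controls is the usual fee-based proxy evidence for an
# audit-quality difference. All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("audit_sample.csv")  # assumed: one row per company

model = smf.ols(
    "log_audit_fee ~ big4 + log_total_assets + leverage + roa + C(country)",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(model.summary())  # the coefficient on big4 is the estimated fee premium
```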
78.
Exploiting Application Characteristics for Efficient System Support of Data-Parallel Machine Learning. Cui, Henggang. 01 May 2017.
Large-scale machine learning has many characteristics that can be exploited in system designs to improve its efficiency. This dissertation demonstrates that the characteristics of ML computations can be exploited in the design and implementation of parameter server systems, improving their efficiency by an order of magnitude or more. We support this thesis statement with three case-study systems: IterStore, GeePS, and MLtuner. IterStore is an optimized parameter server design that exploits the repeated data access patterns characteristic of ML computations. Its optimizations allow IterStore to reduce the total run time of our ML benchmarks by up to 50×. GeePS is a parameter server specialized for deep learning on distributed GPUs. By exploiting the layer-by-layer data access and computation pattern of deep learning, GeePS provides almost linear scalability from single-machine baselines (13× more training throughput with 16 machines) and also supports neural networks that do not fit in GPU memory. MLtuner is a system for automatically tuning the training tunables of ML tasks. It exploits the characteristic that the best tunable settings can often be decided quickly with just a short trial time. By making use of optimization-guided online trial-and-error, MLtuner can robustly find and re-tune tunable settings for a variety of machine learning applications, including image classification, video classification, and matrix factorization, and is over an order of magnitude faster than traditional hyperparameter tuning approaches.
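As a hedged illustration of the common substrate: the sketch below shows the generic read/update parameter-server interface and the repeated per-iteration access pattern that IterStore-style optimizations exploit. The class and its API are hypothetical stand-ins, not code from any of the three systems.

```python
# A minimal sketch of a parameter-server client, assuming a hypothetical API;
# this is the general pattern the dissertation's systems build on, not
# IterStore's or GeePS's actual interface.
from typing import Dict
import numpy as np

class ParamServerClient:
    """Toy in-process stand-in for a distributed parameter-server shard."""

    def __init__(self):
        self.table: Dict[str, np.ndarray] = {}

    def read(self, key: str) -> np.ndarray:
        # In a real system this is a network fetch, ideally served from a
        # cache prefetched using the known per-iteration access pattern.
        return self.table[key]

    def update(self, key: str, delta: np.ndarray) -> None:
        # Updates are additive, so they commute and can be buffered/batched.
        self.table[key] = self.table.get(key, np.zeros_like(delta)) + delta

# Typical data-parallel training loop: every iteration touches the same keys,
# which is exactly the repeated-access characteristic IterStore exploits.
ps = ParamServerClient()
ps.update("w", np.zeros(4))
for epoch in range(3):
    w = ps.read("w")
    grad = -0.1 * w + 0.05          # placeholder gradient
    ps.update("w", -0.01 * grad)    # SGD-style additive update
```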
79.
Performance Optimization Techniques and Tools for Distributed Graph Processing. Kalavri, Vasiliki. January 2016.
In this thesis, we propose optimization techniques for distributed graph processing. First, we describe a data processing pipeline that leverages an iterative graph algorithm for automatic classification of web trackers. Using this application as a motivating example, we examine how the asymmetrical convergence of iterative graph algorithms can be used to reduce the amount of computation and communication in large-scale graph analysis. We propose an optimization framework for fixpoint algorithms and a declarative API for writing fixpoint applications. Our framework uses a cost model to automatically exploit asymmetrical convergence and to evaluate execution strategies at runtime. We show that our cost model achieves speedups of up to 1.7x and communication savings of up to 54%. Next, we propose using the concepts of semi-metricity and the metric backbone to reduce the amount of data that needs to be processed in large-scale graph analysis. We provide a distributed algorithm for computing the metric backbone using the vertex-centric programming model. Using the backbone, we can reduce graph sizes by up to 88% and achieve speedups of up to 6.7x.
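To illustrate the semi-metricity idea: an edge is semi-metric when some indirect path between its endpoints is shorter than the edge itself, and the metric backbone is what remains after such edges are removed. Below is a deliberately simplified single-machine sketch using networkx; the thesis's algorithm is distributed and vertex-centric, not this brute-force check.

```python
# A simplified, single-machine sketch of the metric-backbone concept; the
# thesis gives a distributed, vertex-centric algorithm, not this one.
import networkx as nx

def metric_backbone(g: nx.Graph) -> nx.Graph:
    """Keep only edges that are themselves shortest paths (metric edges)."""
    backbone = g.copy()
    for u, v, w in g.edges(data="weight"):
        # Edge (u, v) is semi-metric if some indirect path beats its weight;
        # the shortest path is strictly shorter than w only in that case.
        if nx.shortest_path_length(g, u, v, weight="weight") < w:
            backbone.remove_edge(u, v)
    return backbone

g = nx.Graph()
g.add_weighted_edges_from([("a", "b", 1), ("b", "c", 1), ("a", "c", 5)])
print(metric_backbone(g).edges())  # ("a", "c") is semi-metric: 1 + 1 < 5
```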
80.
Who am I? : The Neurobiology of the Big Five. Huynh, Yen Nhi. January 2019.
Personality is something that sets every human being apart, yet it has been quite hard to pin down. Recently, neuroscientists have begun identifying the neural correlates of personality traits, with a focus on the Big Five, sparking a whole new subfield within personality research known as personality neuroscience. Using neuroscientific methods and techniques to find the underpinnings of the Big Five has led to a deeper and broader understanding of how genetics and the environment combine to make individuals who they are. This research has also been helpful in predicting various outcomes, e.g. academic performance and achievement, and neuropsychological disorders. In this thesis, the supposed neural correlates of the Big Five are examined through thorough and critical investigation, in which evidence from the existing relevant studies is reviewed and compared, along with the various problems and complexities the field of personality neuroscience is dealing with. The findings in this thesis show that extraversion has a neurobiological basis in the frontal areas of the brain; neuroticism is associated with reduced volume in the frontal areas; agreeableness with frontoparietal areas related to theory of mind as well as temporal regions; conscientiousness with frontal areas associated with planning and goal orientation; and openness/intellect with frontoparietal areas as well as subcortical regions, which have been linked with intelligence and creativity. However, some of the correlations were inconsistent and scattered, and further research needs to be done. Analyses of academic achievement and performance, as well as of neuropsychological disorders, in relation to the Big Five using neuroimaging have proven to be limited, so much more research is needed.