371 |
Locality Optimizations for Regular and Irregular Applications
Rajbhandari, Samyam 28 December 2016 (has links)
No description available.
|
372 |
Performance of Digital Currency and Improvements : An analysis of current implementations and the future of digital currency / Prestanda av digital valuta och förbättringar
Johannesson, Tobias January 2022 (has links)
Currency has changed considerably over time, and the introduction of the Internet accelerated that evolution. Digital currency introduced many benefits compared to physical currencies, and concepts such as cryptocurrencies serve as alternatives to other means of payment. During the recent pandemic, interest in new digital currencies has increased, prompting more research on digital currency. With the introduction of new currencies and their increased popularity, many central banks have started looking into the idea of innovating currency, and this line of research has coined the term central bank digital currency. As of today, there is no single consensus on how a digital currency should work or be implemented, and with many competing designs the future is still unclear. There seem to be vulnerabilities to solve and many potential ways to improve current systems. When building such a new currency, it is crucial to know what different use cases could demand from the implementation. In conclusion, the results show that digital currency is still in early development, with central bank digital currency research showing promise. It is theoretically possible to create a better transaction solution compared to traditional currencies. More research is needed on the topic of digital currency, but incremental improvements to today's currency could lead to better future solutions. / Currency has changed throughout history, and the introduction of the Internet accelerated this development. Digital currency has introduced many advantages compared to physical currencies, and further concepts such as cryptocurrencies have been introduced as alternative means of payment. During the recent pandemic, interest in new digital currencies has grown, leading to more research in the field of digital currencies. Owing to this growing popularity and the emergence of new digital currencies, many central banks have begun exploring the idea of reinventing currency, which has given rise to the term central bank digital currency. There is no single solution today for how digital currencies should work or be implemented, and with many variants the future is still unclear. There appear to be problems to solve and many possible ways to improve existing systems. When building such a new currency, it is essential to understand what matters and how requirements may differ depending on how the currency is to be used. According to the results, digital currency is still early in its development, and research on central bank digital currencies appears promising. More research will be needed in the field of digital currency, but many small improvements to today's currency could lead to better future solutions.
|
373 |
Utvärdering utifrån ett mjukvaruutveckling perspektiv av ramverk för SharePoint / Evaluation from a software development perspective of a framework for SharePoint
Al-Battat, Ahmed, Anwer, Noora January 2017 (has links)
Within a company or an organization, an intranet is of great use as a working tool for sharing information. A well-functioning intranet contributes to a better flow of information and more effective collaboration. SharePoint is an intranet platform with interactive features. Omnia is a framework adapted for Microsoft SharePoint 2013. This work examines how Omnia functions as a framework and what the product is suited for. The Omnia framework was evaluated thoroughly, and an independent assessment was carried out during the thesis project. The evaluation was based on scientific investigations using qualitative and quantitative research methodology. The main areas of the evaluation were system performance, scalability, architecture, and functionality. A test prototype in the form of a web-based application was developed with Omnia during the project. The Omnia framework proved suitable for developing interactive web-based intranet applications in SharePoint. However, it lacked finished documentation/API, which made the development process more demanding. The solution architecture of the system met the requirements for scalable systems, since it was based on a layered architecture. The system also performed well, although performance degraded once the number of users exceeded one thousand. The functionality was tested with two different tests, which showed that the product is suitable for use in an intranet. / Within a company or an organization, there are great benefits from using an intranet as a tool for sharing information. A good intranet contributes to a better flow of information and effective cooperation. SharePoint is an intranet platform with interactive features; it makes work easier for both staff and the company. The framework Omnia is a solution designed for Microsoft SharePoint 2013. This essay evaluates how Omnia performs as a framework and what the product is suitable for. The Omnia framework was evaluated carefully, and an independent assessment was carried out during this essay. The evaluation is based on scientific studies grounded in qualitative and quantitative research methodology. The main areas of the evaluation are system performance, scalability, architecture, and functionality. A test prototype, an employee vacation request application, was developed during the process using the Omnia framework. The framework Omnia is considered suitable for the development of interactive web-based applications for SharePoint. The architecture of the system meets the requirements for scalable systems because it is based on a tier architecture. The system also has good performance, but it needs improvement if the number of users exceeds one thousand. The functionality was tested with two different tests, which showed that the product is suitable for intranet use.
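The scalability finding above (performance degrading once the user count passes roughly one thousand) is the kind of result a simple concurrency sweep can surface. The sketch below is a minimal, hypothetical load-test harness, not the evaluation tooling used in the thesis; the endpoint URL and concurrency levels are placeholders.

```python
# Minimal load-test sketch: measure median latency at increasing concurrency.
# The URL below is a placeholder for a hypothetical intranet endpoint.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

ENDPOINT = "https://intranet.example.com/api/vacation-requests"  # hypothetical

def timed_request(url: str) -> float:
    """Fetch the URL once and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

def sweep(url: str, levels=(10, 100, 500)):
    """For each concurrency level, fire two requests per worker and report median latency."""
    for level in levels:
        n_requests = 2 * level
        with ThreadPoolExecutor(max_workers=level) as pool:
            latencies = list(pool.map(lambda _: timed_request(url), range(n_requests)))
        print(f"{level:>4} workers: median latency {statistics.median(latencies):.3f} s")

if __name__ == "__main__":
    sweep(ENDPOINT)
```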
|
374 |
Ten thousand applications in ten minutes : Evaluating scalable recruitment, evaluation and screening methods of candidates for sales jobs
Kirk, Stephen January 2017 (has links)
While personnel evaluation has been extensively covered in the literature, little is known about evaluation procedures that screen a large number of applicants. The aim of this research was to investigate whether candidates for sales positions can be evaluated in a scalable way (where the number of applications has little impact on the cost of evaluation) for an on-demand sales platform. The study consists of interviews with the recruiters and growth leads of the studied firm, a case study of a firm that has omitted resumes from its salesperson recruitment process, and sample tests performed on candidates for sales positions. Further, some data on salespeople was collected and analysed. In summary, the study links the findings to the restrictions of a process that requires scalability. Previous research outlines how various indicators (personality facets, biodata, and optimism) predict sales performance in salespeople. The mental ability of candidates is relevant especially for the work-training phase. Some of these findings were supported by the case study. While traditional resumes contain information that predicts sales ability, some sales managers argue that they are obsolete. Previous research shows that recruiters risk drawing broad generalizations based on resume content. Video resumes have some potential but currently have technical and ethical limitations. Personality and mental-ability tests show predictive ability for sales performance and are scalable. Previous research discusses the limitations of many personality tests being commercial, which restricts how they may be modified, the transparency of their scoring, and the feasibility of validity studies. Another limitation of personality tests in evaluation settings is that they are prone to faking. The study also suggests future research on how culture defines what an ideal salesperson is, and on extending these findings to areas other than sales. / While the assessment of applicants has been covered in previous research, little is known about evaluation processes that assess a large number of applicants. This study seeks to answer whether candidates for sales positions can be evaluated in a scalable way (where the number of applicants has little impact on the cost of evaluation) for a sales platform. The study consists of interviews with recruiters and growth leads at the studied company, a case study of a company that has dropped resumes from its application process, and tests on candidates for sales positions. Furthermore, existing data on salespeople was analysed. In summary, the study links the results to the constraints imposed by a scalable process. Previous research shows how various indicators (personality, biographical data, and optimism) can predict sales ability. A candidate's mental ability is particularly relevant for the training phase. Some of these results are supported by the case study. While resumes contain information that predicts sales ability, some sales managers argue that they are outdated. Previous research shows that recruiters sometimes generalize broadly based on the content of a resume. Video-based resumes have some potential but currently have ethical and technical shortcomings. Personality tests and tests measuring mental ability show predictive potential for sales ability and are also scalable. Previous research also discusses the limitations that arise from many personality tests being commercial, which restricts how they can be modified, the transparency of their scoring, and the feasibility of conducting validity studies on them. Another limitation of personality tests is that candidates can manipulate the results. The study also proposes future research on, for example, how culture defines the ideal salesperson and whether these results can be extended to areas other than sales.
|
375 |
Optimized Renewable Energy Forecasting in Local Distribution Networks
Ulbricht, Robert, Fischer, Ulrike, Lehner, Wolfgang, Donker, Hilko 16 September 2022 (has links)
The integration of renewable energy sources (RES) into local energy distribution networks is becoming increasingly important. Renewable energy depends strongly on weather conditions, making it difficult to maintain stability in such networks. To still enable efficient planning and balancing, forecasts of energy supply are essential. However, typical distribution networks contain a variety of heterogeneous RES installations (e.g. wind, solar, water), each with different characteristics and weather dependencies. Additionally, advanced meters, which allow fine-granular production curves to be communicated to the network operator, are not available at all RES sites. Despite these heterogeneities and missing measurements, reliable forecasts over the whole local distribution network have to be provided. This poses significant challenges in choosing the right input parameters, statistical models, and forecasting granularity (e.g. single RES installations vs. aggregated data). In this paper, we discuss such problems in energy supply forecasting using a real-world scenario. Subsequently, we introduce our idea of a generalized optimization approach that determines the best forecasting strategy for a given scenario and sketch research challenges we plan to investigate in future work.
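As a hedged illustration of the strategy-selection idea sketched above, the following toy example compares two forecasting granularities, per-installation versus aggregated, using a simple persistence baseline on synthetic production data; the models, data, and error metric are placeholders, not the approach proposed in the paper.

```python
# Toy comparison of forecasting granularities: per-installation vs. aggregated.
# Synthetic data and a naive persistence model stand in for real RES measurements
# and real forecasting models.
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 14                      # two weeks of hourly data
n_sites = 5                          # hypothetical RES installations

# Synthetic production: shared daily pattern plus site-specific scaling and noise.
daily = np.clip(np.sin(np.linspace(0, 2 * np.pi, 24)), 0, None)
base = np.tile(daily, hours // 24)
production = np.stack([base * rng.uniform(0.5, 1.5) + rng.normal(0, 0.05, hours)
                       for _ in range(n_sites)])

def persistence_forecast(series: np.ndarray, horizon: int = 24) -> np.ndarray:
    """Forecast the last `horizon` values by repeating the previous day."""
    return series[-2 * horizon:-horizon]

def mae(actual: np.ndarray, forecast: np.ndarray) -> float:
    return float(np.mean(np.abs(actual - forecast)))

horizon = 24
actual_total = production[:, -horizon:].sum(axis=0)

# Strategy A: forecast each installation separately, then sum the forecasts.
per_site = sum(persistence_forecast(production[i]) for i in range(n_sites))
# Strategy B: aggregate first, then forecast the single total series.
aggregated = persistence_forecast(production.sum(axis=0))

print("per-installation MAE:", mae(actual_total, per_site))
print("aggregated MAE:      ", mae(actual_total, aggregated))
```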
|
376 |
Programming Model and Protocols for Reconfigurable Distributed Systems
Arad, Cosmin January 2013 (has links)
Distributed systems are everywhere. From large datacenters to mobile devices, an ever richer assortment of applications and services relies on distributed systems, infrastructure, and protocols. Despite their ubiquity, testing and debugging distributed systems remains notoriously hard. Moreover, aside from inherent design challenges posed by partial failure, concurrency, or asynchrony, there remain significant challenges in the implementation of distributed systems. These programming challenges stem from the increasing complexity of the concurrent activities and reactive behaviors in a distributed system on the one hand, and the need to effectively leverage the parallelism offered by modern multi-core hardware, on the other hand. This thesis contributes Kompics, a programming model designed to alleviate some of these challenges. Kompics is a component model and programming framework for building distributed systems by composing message-passing concurrent components. Systems built with Kompics leverage multi-core machines out of the box, and they can be dynamically reconfigured to support hot software upgrades. A simulation framework enables deterministic execution replay for debugging, testing, and reproducible behavior evaluation for large-scale Kompics distributed systems. The same system code is used for both simulation and production deployment, greatly simplifying the system development, testing, and debugging cycle. We highlight the architectural patterns and abstractions facilitated by Kompics through a case study of a non-trivial distributed key-value storage system. CATS is a scalable, fault-tolerant, elastic, and self-managing key-value store which trades off service availability for guarantees of atomic data consistency and tolerance to network partitions. We present the composition architecture for the numerous protocols employed by the CATS system, as well as our methodology for testing the correctness of key CATS algorithms using the Kompics simulation framework. Results from a comprehensive performance evaluation attest that CATS achieves its claimed properties and delivers a level of performance competitive with similar systems which provide only weaker consistency guarantees. More importantly, this testifies that Kompics admits efficient system implementations. Its use as a teaching framework as well as its use for rapid prototyping, development, and evaluation of a myriad of scalable distributed systems, both within and outside our research group, confirm the practicality of Kompics. / Kompics / CATS / REST
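To make the component-composition idea concrete, here is a minimal, generic sketch of message-passing components connected through ports and driven by a simple event loop. It illustrates the general pattern only; it does not use or reproduce the actual Kompics API, and all class and method names are invented for this example.

```python
# Generic sketch of composing message-passing components (not the Kompics API).
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    payload: object

class Port:
    """A connection point; events triggered here are queued for all subscribers."""
    def __init__(self, scheduler):
        self.scheduler = scheduler
        self.subscribers = []          # (component, handler) pairs

    def connect(self, component, handler):
        self.subscribers.append((component, handler))

    def trigger(self, event: Event):
        for component, handler in self.subscribers:
            self.scheduler.enqueue(component, handler, event)

class Scheduler:
    """Single-threaded event loop; a real framework would use a thread pool."""
    def __init__(self):
        self.queue = deque()

    def enqueue(self, component, handler, event):
        self.queue.append((component, handler, event))

    def run(self):
        while self.queue:
            component, handler, event = self.queue.popleft()
            handler(component, event)

# Two toy components, a pinger and a ponger, wired together through ports.
class Ponger:
    def __init__(self, reply_port):
        self.reply_port = reply_port
    def on_ping(self, event):
        print("Ponger received:", event.payload)
        self.reply_port.trigger(Event("pong"))

class Pinger:
    def on_pong(self, event):
        print("Pinger received:", event.payload)

if __name__ == "__main__":
    sched = Scheduler()
    ping_port, pong_port = Port(sched), Port(sched)
    ponger, pinger = Ponger(pong_port), Pinger()
    ping_port.connect(ponger, Ponger.on_ping)
    pong_port.connect(pinger, Pinger.on_pong)
    ping_port.trigger(Event("ping"))
    sched.run()
```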
|
377 |
[en] A NEW APPROACH FOR ESTIMATING THE COEFFICIENTS OF SCALABILITY ASSOCIATED WITH NON PARAMETRIC ITEM RESPONSE THEORY / [pt] UMA NOVA ABORDAGEM PARA A ESTIMAÇÃO DOS COEFICIENTES DE ESCALONABILIDADE ASSOCIADOS À TEORIA DE RESPOSTA AO ITEM NÃO PARAMÉTRICA
MARCIA SANTOS ANDRADE 10 April 2014 (has links)
[pt] The purpose of this thesis is to propose point estimators for the scalability coefficients associated with Nonparametric Item Response Theory (NIRT), namely Hij, Hi, and H, together with their respective variance estimators, based on the finite-population sampling approach. To investigate the quality of these estimators empirically, the reference populations considered are formed by the students attending the 9th year of elementary school in public schools in urban areas of the states of Roraima and Rio de Janeiro who took part in Prova Brasil 2007. The responses of these students to a set of 10 dichotomized items measuring the economic capital of their families were used to construct the scalability coefficients of the Monotone Homogeneity Model of NIRT. Repeated samples were selected from each reference population using two sampling designs: AC1S (single-stage cluster sampling) and AC2-SAEB (with selection of schools and classes, stratification, and selection of the first-stage units with probability proportional to a measure of school size). Point estimation is based on the superpopulation model. Two techniques were considered for variance estimation: the Ultimate Cluster method and the delete-1 jackknife. The usual measures of relative bias, mean relative error, confidence intervals, and design effect are used to evaluate the quality of the estimators in terms of bias and precision. The study indicates that the point estimators have good properties and, in addition, that the variance estimator corrected by the finite-population correction factor is the most appropriate in terms of bias and precision. The complex sampling design adopted had an impact on the point and variance estimation of the scalability coefficient estimators. / [en] The purpose of this thesis is to propose estimators for the coefficients of scalability associated with Nonparametric Item Response Theory (NIRT), namely Hij, Hi, and H, and their variance estimators, based on the approach of sampling finite populations. To investigate the quality of these estimators empirically, the reference populations considered are formed by students attending the 9th year of elementary school in public schools in urban areas of the states of Roraima and Rio de Janeiro who participated in Prova Brasil 2007. The responses of these students to a set of 10 dichotomized items that measure the economic status of their families were used in the construction of the coefficients of scalability of the Monotone Homogeneity Model. Repeated samples were selected from each reference population using two sampling plans: AC1S (single-stage cluster sampling) and AC2-SAEB (with selection of schools and classes, stratification, and selection of the first-stage units with probability proportional to a measure of school size). The point estimation is based on the superpopulation model approach. Two techniques were used to estimate the variance: the Ultimate Cluster method and the delete-1 jackknife. The usual measures of relative bias, mean relative error, confidence intervals, and design effect are used to assess the quality of the estimators in terms of bias and accuracy. The study notes that the estimators have good properties and, in addition, that the variance estimator corrected by the finite-population correction factor is the most appropriate in terms of accuracy and bias. The complex sampling design (AC2-SAEB) impacted the point and variance estimation of the estimators of the coefficients of scalability.
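To illustrate the quantities being estimated, the sketch below computes classical (Loevinger-type) scalability coefficients Hij, Hi, and H from a matrix of dichotomous item responses and attaches a naive delete-1 jackknife standard error to H. It assumes simple random sampling of respondents, so it ignores the cluster sampling designs, weighting, and finite-population corrections that are central to the thesis; it is an illustrative baseline only, and the simulated data are invented.

```python
# Scalability coefficients (Hij, Hi, H) for dichotomous items, with a naive
# delete-1 jackknife SE for H. Assumes simple random sampling; the complex
# designs (AC1S, AC2-SAEB) studied in the thesis are not reflected here.
import numpy as np

def h_coefficients(X: np.ndarray):
    """X: (n_respondents, n_items) 0/1 matrix. Returns (Hij, Hi, H)."""
    n, k = X.shape
    p = X.mean(axis=0)                       # item popularities
    F = np.zeros((k, k))                     # observed Guttman errors
    E = np.zeros((k, k))                     # expected errors under independence
    for i in range(k):
        for j in range(k):
            if i == j:
                continue
            # Order the pair: 'hard' = less popular item, 'easy' = more popular.
            hard, easy = (i, j) if p[i] <= p[j] else (j, i)
            F[i, j] = np.sum((X[:, hard] == 1) & (X[:, easy] == 0))
            E[i, j] = n * p[hard] * (1.0 - p[easy])
    with np.errstate(divide="ignore", invalid="ignore"):
        Hij = 1.0 - F / E                    # pairwise coefficients (diagonal undefined)
    Hi = 1.0 - F.sum(axis=1) / E.sum(axis=1) # item coefficients
    H = 1.0 - np.triu(F, 1).sum() / np.triu(E, 1).sum()  # scale coefficient
    return Hij, Hi, H

def jackknife_se_H(X: np.ndarray) -> float:
    """Delete-1 jackknife over respondents (not over primary sampling units)."""
    n = X.shape[0]
    reps = np.array([h_coefficients(np.delete(X, r, axis=0))[2] for r in range(n)])
    return float(np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    theta = rng.normal(size=300)                         # latent trait
    difficulty = np.linspace(-1.5, 1.5, 10)              # 10 items
    prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulty[None, :])))
    X = (rng.uniform(size=prob.shape) < prob).astype(int)
    Hij, Hi, H = h_coefficients(X)
    print(f"H = {H:.3f}  (jackknife SE {jackknife_se_H(X):.3f})")
```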
|
378 |
Error-robust coding and transformation of compressed hybered hybrid video streams for packet-switched wireless networks
Halbach, Till January 2004 (has links)
This dissertation considers packet-switched wireless networks for transmission of variable-rate layered hybrid video streams. Target applications are video streaming and broadcasting services. The work can be divided into two main parts.

In the first part, a novel quality-scalable scheme based on coefficient refinement and encoder quality constraints is developed as a possible extension to the video coding standard H.264. After a technical introduction to the coding tools of H.264 with the main focus on error resilience features, various quality scalability schemes in previous research are reviewed. Based on this discussion, an encoder-decoder framework is designed for an arbitrary number of quality layers, hereby also enabling region-of-interest coding. After that, the performance of the new system is exhaustively tested, showing that the bit rate increase typically encountered with scalable hybrid coding schemes is, for certain coding parameters, only small to moderate. The double- and triple-layer constellations of the framework are shown to perform superior to other systems.

The second part considers layered code streams as generated by the scheme of the first part. Various error propagation issues in hybrid streams are discussed, which leads to the definition of a decoder quality constraint and a segmentation of the code stream to transmit. A packetization scheme based on successive source rate consumption is drafted, followed by the formulation of the channel code rate optimization problem for an optimum assignment of available codes to the channel packets. Proper MSE-based error metrics are derived, incorporating the properties of the source signal, a terminate-on-error decoding strategy, error concealment, inter-packet dependencies, and the channel conditions. The Viterbi algorithm is presented as a low-complexity solution to the optimization problem, showing a great adaptivity of the joint source channel coding scheme to the channel conditions. An almost constant image quality is achieved, also in mismatch situations, while the overall channel code rate decreases only as little as necessary as the channel quality deteriorates. It is further shown that the variance of code distributions is only small, and that the codes are assigned irregularly to all channel packets.

A double-layer constellation of the framework clearly outperforms other schemes with a substantial margin.

Keywords — Digital lossy video compression, visual communication, variable bit rate (VBR), SNR scalability, layered image processing, quality layer, hybrid code stream, predictive coding, progressive bit stream, joint source channel coding, fidelity constraint, channel error robustness, resilience, concealment, packet-switched, mobile and wireless ATM, noisy transmission, packet loss, binary symmetric channel, streaming, broadcasting, satellite and radio links, H.264, MPEG-4 AVC, Viterbi, trellis, unequal error protection
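The channel-code-rate assignment problem described above can be pictured with a small dynamic program: with terminate-on-error decoding, a packet only contributes quality if it and all earlier packets decode, so the expected quality of a rate assignment factors recursively. The sketch below is a deliberately simplified stand-in with invented packet sizes, quality gains, and per-rate decoding probabilities; it is not the MSE-based metrics or Viterbi trellis formulation used in the dissertation.

```python
# Simplified unequal-error-protection allocation: choose a channel code rate per
# packet to maximize expected received quality under a total-bit budget, with
# terminate-on-error decoding (a failed packet voids all later packets).
# Packet sizes, quality gains, and decoding probabilities are invented numbers.
from functools import lru_cache

gains = [40.0, 12.0, 6.0, 3.0]          # quality contribution of each packet
source_bits = [800, 800, 800, 800]      # source bits per packet
# Candidate code rates with assumed probabilities that the packet decodes.
codes = [(1/2, 0.99), (2/3, 0.95), (4/5, 0.85)]
BUDGET = 5200                           # total channel bits available

@lru_cache(maxsize=None)
def best(i: int, bits_left: int):
    """Max expected quality from packets i.. given the remaining bit budget."""
    if i == len(gains):
        return 0.0, ()
    best_val, best_plan = 0.0, ()       # option: stop sending packets here
    for rate, p_ok in codes:
        cost = int(source_bits[i] / rate)        # channel bits after coding
        if cost > bits_left:
            continue
        tail_val, tail_plan = best(i + 1, bits_left - cost)
        val = p_ok * (gains[i] + tail_val)       # later packets need this one too
        if val > best_val:
            best_val, best_plan = val, (rate,) + tail_plan
    return best_val, best_plan

if __name__ == "__main__":
    value, plan = best(0, BUDGET)
    print("expected quality:", round(value, 2))
    print("code rates per packet:", plan)
```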
|
379 |
Ring amplification for switched capacitor circuits
Hershberg, Benjamin Poris 19 July 2013
A comprehensive and scalable solution for high-performance switched capacitor amplification is presented. Central to this discussion is the concept of ring amplification. A ring amplifier is a small modular amplifier derived from a ring oscillator that naturally embodies all the essential elements of scalability. It can amplify with accurate rail-to-rail output swing, drive large capacitive loads with extreme efficiency using slew-based charging, naturally scale in performance according to process trends, and is simple enough to be quickly constructed from only a handful of inverters, capacitors, and switches. In addition, the gain-enhancement technique of Split-CLS is introduced and used to extend the efficacy of ring amplifiers specifically and of other amplifiers in general. Four different pipelined ADC designs are presented which explore the practical implementation options and design considerations relevant to ring amplification and Split-CLS, and are used to establish ring amplification as a new paradigm for scalable amplification. / Graduation date: 2012 / Access restricted to the OSU Community, at author's request, from July 19, 2012 - July 19, 2013
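As a rough intuition for "slew-based charging," the toy behavioral model below drives a capacitive load with a constant current until the output enters a small dead-zone around the target, then lets it settle exponentially. It is a hand-made illustration with invented component values, not a circuit-accurate model of the ring amplifiers developed in this work.

```python
# Toy behavioral model of slew-then-settle charging of a capacitive load,
# loosely inspired by ring-amplifier operation. All values are invented.
import numpy as np

def slew_then_settle(v_target, c_load=1e-12, i_slew=100e-6,
                     deadzone=0.05, tau=2e-9, dt=0.1e-9, t_end=50e-9):
    """Return (time, v_out) arrays for a step from 0 V to v_target."""
    steps = int(t_end / dt)
    t = np.arange(steps) * dt
    v = np.zeros(steps)
    for n in range(1, steps):
        error = v_target - v[n - 1]
        if abs(error) > deadzone:
            # Slewing: the output stage dumps a fixed current into the load.
            v[n] = v[n - 1] + np.sign(error) * (i_slew / c_load) * dt
        else:
            # Inside the dead-zone: first-order exponential settling.
            v[n] = v[n - 1] + error * (dt / tau)
    return t, v

if __name__ == "__main__":
    t, v = slew_then_settle(v_target=1.0)
    settled = np.argmax(np.abs(v - 1.0) < 0.001)
    print(f"settled to within 1 mV of target after about {t[settled] * 1e9:.1f} ns")
```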
|