About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
281

RR3D: Uma solução para renderização remota de imagens médicas tridimensionais / RR3D: A solution for remote rendering of three-dimensional medical images

Papaiz, Fabiano 05 April 2013 (has links)
Made available in DSpace on 2014-12-17T15:48:05Z (GMT). No. of bitstreams: 1 FabianoP_DISSERT.pdf: 3905877 bytes, checksum: 0721827c114aec9f61c1c690dc27dd1d (MD5) Previous issue date: 2013-04-05 / The visualization of three-dimensional (3D) images is increasingly used in medicine, helping physicians diagnose diseases and issue reports. Advances in the scanners used to acquire these 3D exams, such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), enable the generation of images with higher resolutions and, therefore, much larger files; the images in a CT exam currently occupy tens to hundreds of megabytes, making 3D visualization computationally expensive and demanding a powerful computer from the user. Direct remote access to these images over the internet is also inefficient, since all images must be transferred to the user's equipment before the 3D visualization process can start. With these problems in mind, this work proposes and analyses a solution for the remote rendering of 3D medical images, called RR3D. In RR3D, the whole rendering process is performed by a server, or a cluster of servers, with high computational power, and only the resulting image is transferred to the client, which can still perform operations such as rotation and zoom. The solution was developed using web services written in Java and an architecture that uses the scientific visualization package ParaView, the ParaViewWeb framework and the PACS server DCM4CHEE. It was tested in two scenarios, one in which rendering was performed by a server with graphics hardware (GPU) and one in which it was performed by a server without GPUs; in the scenarios without GPUs, the solution was executed in parallel with varying numbers of cores (processing units) dedicated to it. To compare the solution with other medical visualization applications, a third scenario was used in which rendering was done locally. In all three scenarios the solution was tested at different network speeds. The solution satisfactorily solved the problem of the delay in transferring DICOM files, while allowing low-end computers, and even tablets and smartphones, to be used as clients for visualizing the exams.
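A minimal sketch of how such a server-side rendering service might be exposed as a Java web service follows; the class, method names and parameters are illustrative assumptions, not the actual RR3D API, and the ParaView/DCM4CHEE integration is reduced to a placeholder.

    import javax.jws.WebMethod;
    import javax.jws.WebService;

    // Hypothetical remote-rendering endpoint (names are assumptions, not the RR3D API).
    // The server loads the exam, renders the requested view, and returns only the
    // resulting 2D image, so the DICOM series never crosses the network.
    @WebService
    public class RemoteRenderingService {

        @WebMethod
        public byte[] renderView(String examId, double azimuth, double elevation, double zoom) {
            // In the real system a rendering back end (e.g. ParaView driven through
            // ParaViewWeb) would produce the projection; here it is only sketched.
            return renderWithBackend(examId, azimuth, elevation, zoom);
        }

        private byte[] renderWithBackend(String examId, double az, double el, double zoom) {
            // Placeholder for the ParaView/DCM4CHEE integration.
            return new byte[0]; // a single PNG per interaction, regardless of exam size
        }
    }

Under this design a client requests a fresh image after each rotation or zoom gesture instead of downloading the full DICOM series.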
282

Improvement of web service composition using semantic similarities and formal concept analysis / Amélioration du processus de composition de services web en utilisant les similarités sémantiques et l'analyse de concepts formels

Abid, Ahmed 19 July 2017 (has links)
Service Oriented Architectures (SOA) have been progressively confirmed as an essential tool in inter-company exchanges thanks to their strategic and technological potential. Their implementation is realised through Web services, and one of the main assets of services is their composability. With the emergence of the Semantic Web, the discovery and composition of semantic Web services become a real challenge. The discovery process is generally based on traditional registries with syntactic descriptions in which services are statically grouped, which poses a problem related to the heterogeneity of syntactic descriptions and the rigidity of the classification. The composition process in turn depends on the quality of the Web service matching performed in the discovery phase. In this dissertation we propose the architecture of a framework that covers all the phases of the composition process. We then propose a semantic similarity measure for matching Web service descriptions. The Web service discovery process relies on the proposed similarity measure, the Formal Concept Analysis (FCA) formalism, and the organisation of services into a lattice. The composition is then based on the establishment of coherent composite services that are relevant to the expected functionality. The main strengths of this architecture are the adaptation and integration of semantic technologies, the calculation of semantic similarity, and the use of this semantic similarity together with the FCA formalism to optimise the composition process.
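As a rough illustration of the matching step, the sketch below scores two service operation descriptions with a simple token-overlap (Jaccard) similarity; this is a deliberately naive stand-in for the semantic measure proposed in the thesis, and all names are assumptions.

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    // Naive lexical similarity between two operation descriptions. The thesis uses a
    // semantic, ontology-aware measure; this token-overlap score only marks where
    // such a measure plugs into the discovery pipeline.
    public final class ServiceMatcher {

        static double similarity(String descriptionA, String descriptionB) {
            Set<String> a = tokens(descriptionA);
            Set<String> b = tokens(descriptionB);
            if (a.isEmpty() && b.isEmpty()) return 1.0;
            Set<String> inter = new HashSet<>(a);
            inter.retainAll(b);
            Set<String> union = new HashSet<>(a);
            union.addAll(b);
            return (double) inter.size() / union.size();
        }

        private static Set<String> tokens(String text) {
            return new HashSet<>(Arrays.asList(text.toLowerCase().split("\\W+")));
        }

        public static void main(String[] args) {
            // Prints 0.2: only "ticket" is shared among five distinct tokens.
            System.out.println(similarity("book flight ticket", "reserve airline ticket"));
        }
    }

Pairs whose score exceeds a chosen threshold would then populate the formal context from which the FCA lattice is built.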
283

Plataforma computacional híbrida de coprocessamento paralelo distribuído por web services aplicada à radiointerferometria / Hybrid computational platform for distributed parallel coprocessing through web services applied to radio interferometry

Silva, Gustavo Poli Lameirão da 19 August 2013 (has links)
Made available in DSpace on 2016-06-02T19:03:58Z (GMT). No. of bitstreams: 1 5593.pdf: 13078959 bytes, checksum: 1cc88a226e87c0a4ca26af32176acea5 (MD5) Previous issue date: 2013-08-19 / Financiadora de Estudos e Projetos / The requirements imposed by new applications, scientific or otherwise, present great challenges to computing, and no single computer architecture is capable of meeting all of them. Parallel and hybrid computer arrangements arise as a solution to this scenario: an arrangement of CPU-coprocessor pairs can be specialized into a computational instrument for a particular application. This doctoral thesis proposes a distributed parallel and hybrid computational platform, named CoP-WS, that uses the interoperability technology known as Web Services. The coprocessor used is the graphics processing unit (GPU), which in recent years has been employed for thread-level parallel processing in general-purpose applications. The feasibility test of the platform was inspired by radio astronomy, and two applications were implemented: a complex correlator of the signals provided by a radio interferometric array, and a system for recognizing solar flares in a solar radio interferometry image. Both processing tasks can be inserted into a pipeline execution context, using a sufficient configuration of CPU-GPU pairs, with the signals from the antennas of the interferometric array as input on one side and the result of the solar flare recognition on the other. The results obtained for both applications show the feasibility of the CoP-WS platform for processing large volumes of data in quasi real time: the average processing time of each integration period of the correlator was around 160 ms, and solar flare recognition took 48 ms per solar disk image.
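To make the correlator workload concrete, the sketch below accumulates the complex cross-correlation of two antenna signals over one integration period on the CPU; in the platform described above this inner loop would run on the GPU behind the Web Services front end, and the interleaved array layout is an assumption.

    // Complex cross-correlation accumulated over one integration period.
    // Signals are stored as interleaved (re, im) pairs; in CoP-WS this kernel
    // would execute on the GPU, with only the accumulated result returned
    // through the Web Services layer.
    public final class Correlator {

        static double[] correlate(double[] x, double[] y) {
            double re = 0.0, im = 0.0;
            for (int i = 0; i < x.length; i += 2) {
                double xr = x[i], xi = x[i + 1];
                double yr = y[i], yi = y[i + 1];
                // Accumulate x * conj(y).
                re += xr * yr + xi * yi;
                im += xi * yr - xr * yi;
            }
            return new double[] { re, im };
        }
    }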
284

Arquitetura de um provedor de serviços interativos para sistemas de televisão digital / Architecture of an interactive service provider for digital television systems

Prado, Gabriel Massote 12 December 2011 (has links)
Made available in DSpace on 2016-06-02T19:06:03Z (GMT). No. of bitstreams: 1 4998.pdf: 1357751 bytes, checksum: d370f0d3802dc9fc019fc57a6dc5ab16 (MD5) Previous issue date: 2011-12-12 / Financiadora de Estudos e Projetos / The evolution of television systems from analog to digital provides a wide range of new services, given the possibility of transmitting data together with high-quality audio and video. With the aggregation of data to the transmission of television programs, Interactive Digital TV systems arise, allowing interactive content to be embedded and resulting in a new way of producing and transmitting television programs. This work presents the architecture of an interactive service provider, named Interactive Service Provider for Digital Television Systems (ispTV), for interactive digital television systems. It is based on a service-oriented architecture that enables the publication and offering of services in different technologies, such as JAX-WS, Axis2 and ispTV services, and includes an abstract communication layer intended to simplify the development of interactive programs through standardized communication APIs provided by the architecture. One of the objectives of the Brazilian Digital Television System is to promote social, cultural and technological inclusion through broadcasting, transmitting educational, governmental and commercial applications for the user's interaction with television programs. In this context, this work presents some interactive applications, educational and supporting viewers with special needs, built on the proposed architecture. These applications demonstrate usability aspects of the ispTV architecture, together with usability tests of the abstraction APIs it offers. Performance and scalability tests of the architecture are also presented, showing that its operation allows it to be used to transmit interactive applications for various classes of projects, institutions and companies.
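The abstraction layer mentioned above can be pictured as a thin interface that hides which technology (JAX-WS, Axis2 or a native ispTV service) actually answers a call; the sketch below is an illustrative assumption about its shape, not the published ispTV API.

    // Hypothetical communication abstraction: interactive TV applications call
    // invoke() and never see whether the provider is JAX-WS, Axis2 or ispTV-native.
    public interface InteractiveServiceClient {
        String invoke(String serviceName, String operation, String payload) throws Exception;
    }

    // One possible binding; Axis2 and ispTV-native bindings would implement the same
    // interface, so interactive programs stay unchanged when the provider changes.
    class JaxWsClient implements InteractiveServiceClient {
        @Override
        public String invoke(String serviceName, String operation, String payload) {
            // Dispatch through a generated JAX-WS port (omitted in this sketch).
            return "<response/>";
        }
    }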
285

Uma plataforma para disponibilização centralizada de dados abertos governamentais como suporte para aplicações no contexto de cidades inteligentes. / A platform for centralized distribution of open government data as support for applications in the context of smart cities

Vieira, Daniel Ianegitz 27 July 2016 (has links)
Made available in DSpace on 2017-06-01T17:34:45Z (GMT). No. of bitstreams: 1 VIEIRA_Daniel_2016.pdf: 25243835 bytes, checksum: f2ce45fadf28ef4fb0273bbec68dec6d (MD5) Previous issue date: 2016-07-27 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / The Smart City (SC) concept aims to provide cities with dynamic and flexible systems that interact with each other and help maintain the stability and scalability of the city's resources and services, usually through sensors and collectors with high deployment and maintenance costs. Open Government Data (OGD), on the other hand, are data related to governmental activity made available on the Internet. These publications are fragmented across distinct, uncorrelated portals and do not follow a standard format. This work therefore proposes a platform to collect and distribute OGD in a centralized way, in order to provide consolidated information to the SC context. Two validations of this proposal are presented: the first consolidates the platform itself, and the second is an experimental study carried out in an academic environment, in which eight undergraduate students, divided into two groups, were supervised during the development of two SC projects consisting of portals for comparative and graphical visualization of OGD. The experimental group used the platform proposed here as its OGD source, while the control group used OGD collected directly from governmental transparency portals. At the end of the study, in order to identify benefits of using a centralized platform, statistical tests were applied to the collected metrics, and significant advantages were found for the experimental group with respect to the number of lines of source code and the development time, indicating that the platform offered benefits during the project's development cycle.
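A minimal sketch of the collection side of such a platform follows: fetch one dataset from a transparency portal so it can be normalised and re-exposed from a single, consolidated API. The URL and query parameters are placeholders, not real portals.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Sketch of the collection step: download one open-government dataset so it can
    // be normalised and served from a centralised endpoint. The URL is a placeholder.
    public final class OgdCollector {

        public static void main(String[] args) throws Exception {
            HttpClient http = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://transparency.example.gov/api/expenses?year=2016"))
                    .header("Accept", "application/json")
                    .build();
            HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
            // A real platform would normalise the heterogeneous formats here and store
            // the result behind one consolidated API for smart-city applications.
            System.out.println(response.statusCode() + ": " + response.body().length() + " bytes");
        }
    }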
286

Uma metodologia baseada na lógica linear para análise de processos de workflow interorganizacionais / A methodology based on linear logic for the analysis of interorganizational workflow processes

Passos, Lígia Maria Soares 22 February 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This work formalizes four methods based on Linear Logic for the verification of interorganizational workflow processes modelled by interorganizational WorkFlow nets, the Petri nets that model such processes. The first method concerns verification of the Soundness criterion for interorganizational workflow processes and is based on the construction and analysis of Linear Logic proof trees that represent both the local processes and the global process. The second and third methods concern, respectively, verification of the Relaxed Soundness and Weak Soundness criteria for interorganizational workflow processes; they are obtained by reusing and analysing the Linear Logic proof trees built for the verification of the Soundness criterion. The fourth method aims to detect deadlock-free scenarios in interorganizational workflows and is based on the construction and analysis of Linear Logic proof trees that consider, first, the local processes and the communication between them and, afterwards, the candidate scenarios. A case study is carried out in the context of Web service composition verification, since there is a close correspondence between the model of an interorganizational workflow process and a Web service composition; the four methods proposed in the interorganizational workflow context are therefore applied to a Web service composition. The evaluation of the results shows that the reuse of Linear Logic proof trees initially constructed to verify the Soundness criterion does in fact occur when verifying the Relaxed Soundness and Weak Soundness criteria. In addition, the evaluation shows how Linear Logic sequents and their proof trees make explicit the collaboration possibilities that exist in a Web service composition. An evaluation that takes into account the number of Linear Logic proof trees constructed shows that this number can be significantly reduced in the deadlock-free scenario detection method. An approach for resource planning based on symbolic date calculation, which uses data extracted from the Linear Logic proof trees, is presented and validated through simulations performed in the CPN Tools simulator. Two approaches for monitoring the deadlock-free scenarios are introduced, showing how data obtained from the Linear Logic proof trees can be used to guide the execution of such scenarios. / Doutor em Ciência da Computação
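For readers unfamiliar with this kind of encoding, a workflow-net transition is typically represented as a linear-logic formula that consumes its input places and produces its output places; a schematic sequent of that form (an illustration, not taken from the thesis) is shown below.

    % Schematic encoding: transition t consumes one token in p1 and one in p2
    % and produces one token in p3.
    t \;:\; p_1 \otimes p_2 \multimap p_3
    \qquad\qquad
    p_1,\; p_2,\; t \;\vdash\; p_3
    % Proving the right-hand sequent corresponds to firing t from a marking that
    % contains p1 and p2 and reaching a marking that contains p3; proof trees over
    % such sequents are what the four methods construct and analyse.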
287

Achieving Autonomic Web Service Compositions with Models at Runtime

Alférez Salinas, Germán Harvey 26 December 2013 (has links)
Over the last years, Web services have become increasingly popular, because they allow businesses to share data and business process (BP) logic through a programmatic interface across networks. To reach the full potential of Web services, they can be combined to achieve specific functionalities. Web services run in complex contexts where arising events may compromise the quality of the system (e.g. a sudden security attack). As a result, it is desirable to have mechanisms to adapt Web service compositions (or simply service compositions) according to problematic events in the context. Since critical systems may require prompt responses, manual adaptations are unfeasible in large and intricate service compositions. Thus, it is suitable to have autonomic mechanisms to guide their self-adaptation. One way to achieve this is by implementing variability constructs at the language level. However, this approach may become tedious, difficult to manage, and error-prone as the number of configurations for the service composition grows. The goal of this thesis is to provide a model-driven framework to guide autonomic adjustments of context-aware service compositions. This framework spans design time and runtime to face arising known and unknown context events (i.e., foreseen and unforeseen at design time) in the closed and open worlds respectively. At design time, we propose a methodology for creating the models that guide autonomic changes. Since Service-Oriented Architecture (SOA) lacks support for systematic reuse of service operations, we represent service operations as Software Product Line (SPL) features in a variability model. As a result, our approach can support the construction of service composition families in mass-production environments. In order to reach optimum adaptations, the variability model and its possible configurations are verified at design time using Constraint Programming (CP). At runtime, when problematic events arise in the context, the variability model is leveraged to guide autonomic changes of the service composition. The activation and deactivation of features in the variability model result in changes in a composition model that abstracts the underlying service composition. Changes in the variability model are reflected in the service composition by adding or removing fragments of Web Services Business Process Execution Language (WS-BPEL) code, which are deployed at runtime. Model-driven strategies guide the safe migration of running service composition instances. Under the closed-world assumption, the possible context events are fully known at design time; these events will eventually trigger the dynamic adaptation of the service composition. Nevertheless, it is difficult to foresee all the possible situations arising in the uncertain contexts where service compositions run. Therefore, we extend our framework to cover the dynamic evolution of service compositions to deal with unexpected events in the open world. If model adaptations cannot resolve the uncertainty, the supporting models self-evolve according to abstract tactics that preserve the expected requirements. / Alférez Salinas, GH. (2013). Achieving Autonomic Web Service Compositions with Models at Runtime [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/34672 / TESIS
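The runtime side of such a framework can be imagined as a small loop that reacts to a context event by activating or deactivating features in the variability model and then reconfiguring the composition; the sketch below is an illustrative assumption, with WS-BPEL fragment deployment reduced to stubs, and is not the thesis's actual tooling.

    import java.util.HashSet;
    import java.util.Set;

    // Sketch of the runtime adaptation loop: a context event toggles features in the
    // variability model, and each change is reflected on the running composition by
    // weaving in (or removing) the corresponding WS-BPEL fragment. All names are
    // illustrative assumptions.
    public final class AdaptationLoop {

        private final Set<String> activeFeatures = new HashSet<>();

        public void onContextEvent(String event) {
            if ("SECURITY_ATTACK".equals(event) && activeFeatures.add("MessageEncryption")) {
                deployFragment("encrypt-messages.bpel");  // add behaviour to the composition
            }
            if ("ATTACK_CLEARED".equals(event) && activeFeatures.remove("MessageEncryption")) {
                removeFragment("encrypt-messages.bpel");  // undo it when no longer needed
            }
        }

        private void deployFragment(String fragment) { /* hand the fragment to the BPEL engine */ }
        private void removeFragment(String fragment) { /* retract it, migrating running instances safely */ }
    }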
288

Automatické zjišťování rozměrových změn betonových směsí při tuhnutí / Automatic determination of dimensional changes of concrete mixtures during solidification

Hortvík, Martin January 2017 (has links)
This diploma thesis deals with the design and implementation of a desktop application for determining the dimensional changes of concrete mixtures during solidification. The work also includes the implementation of a server portal for real-time data evaluation. The introductory part deals with the analysis of the measurement task and the original solution, as well as the specification of the new application and its deployment in real operation. The next part of the thesis describes current software technologies used in the development of desktop and server applications. The design and implementation of both parts of the solution form the major part of the work. In the last part, a real-time functionality test of the application is presented.
289

Modernizace portálu fakturyonline.eu / Upgrade of fakturyonline.eu Service

Jahoda, Vojtěch January 2014 (has links)
The goal of this thesis is the modernization and extension of an established web portal used to create and manage invoices. The design and implementation emphasize feedback from users, so that the changes are really a benefit and not a burden. The work uses a wide range of technologies, and the implementation follows an incremental model. It also analyzes the impact of the changes on the number of visitors and gathers feedback from users.
290

Webová aplikace zprostředkovávající výsledky testování výkonu platformy JBoss / Test Result Repository with Web User Interface

Vlasák, Jaroslav January 2013 (has links)
This thesis deals with the development of a client-server application for the Red Hat company. The client participates in the testing process of the JBoss platform and collects user-defined performance data, which it sends during testing to the server application through platform-independent communication. The server application allows the received data to be analyzed and compared from several perspectives, and these analysis and comparison services are accessible to server users through a web client. The server application also supports importing performance data stored in XML files and exporting them to the qVue portal. The client part of the application is implemented in Java, and the server application is based on the Java EE platform.
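As an illustration of the client-to-server hand-off described above, the sketch below serialises a few performance samples as XML and sends them over HTTP; the element names and upload URL are assumptions, not the actual repository format or endpoint.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Sketch of the test client's upload step: package measured values as XML and
    // push them to the result repository over plain HTTP, keeping the exchange
    // platform independent. The XML schema and URL are illustrative assumptions.
    public final class ResultUploader {

        public static void main(String[] args) throws Exception {
            String xml = "<run test=\"jboss-startup\">"
                       + "<metric name=\"throughput\" value=\"1532.4\"/>"
                       + "<metric name=\"latency-ms\" value=\"8.7\"/>"
                       + "</run>";

            HttpClient http = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://perf-repository.example.org/api/runs"))
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(xml))
                    .build();
            HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Upload status: " + response.statusCode());
        }
    }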
