About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
901

Resource Centre Sites: The New Gatekeepers of the Web?

Bruns, Axel Unknown Date (has links)
This thesis introduces and analyses the emerging Website genre of Resource Centre Sites. RCSs are sites which combine news, rumours and background information as well as community discussion and commentary on their chosen topic, and frequently serve as a first point of entry for readers interested in learning more about the field. They also offer spaces for virtual communities of specialists or enthusiasts to emerge, who in the process and as a product of their interaction on these sites collate detailed resource collections and hyperlink directories for their fields of interest. Resource Centre Sites therefore significantly involve their users as content contributors and producers, turning them into what is here termed ‘produsers’ of the site. Aiming to evaluate all the content relevant to their field that is becoming available online, and to co-opt or at least link to this information from the news and resources collection that is a central part of the RCS, Resource Centre Site produsers adapt both traditional journalistic gatekeeping methodologies and librarianly resource collection approaches to the Web environment: in the absence of gates to keep online, they have become ‘gatewatchers’, observing the publication of news and information in other sources and publicising its existence through their own sites. Their operation is studied here through a number of case studies of major existing Resource Centre Sites from various fields of interest. These sites are analysed both on the basis of their available Web content and using background information obtained in a series of email interviews with RCS creators. In combination, this offers insights into the operating philosophies of sites and site editors, and provides an opportunity to assess to what extent these ideas have been translated into everyday practice.
Chapter 1 provides an overview of past and current theoretical views of the Web in an effort to evaluate their suitability for the current study. Older approaches positing an abstract ‘ideal’ form of hypertext are rejected in favour of a direct engagement with the World Wide Web as the now dominant mode of hypertextuality. Chapter 2 outlines the principles of gatewatching in contrast to traditional methods of evaluating news and information as they exist in journalistic media and archival institutions, and investigates the effects such gatewatching practices may have on editors and users. Chapter 3 describes the overall characteristics of Resource Centre Sites as a genre of Web publications. It notes the special role site users play in the operation of such sites (in their new role as ‘produsers’), and distinguishes the RCS genre from similar Website models such as portals and cybermediaries. Chapter 4 observes the everyday operation of such Websites in practice, using case studies of major existing Resource Centre Sites including Slashdot, MediaChannel and CountingDown, and interviews with their creators. (These interviews are included in full in the Appendix.) This analysis takes both a synchronic view of the variety of topics existing Resource Centre Sites are able to address, and a diachronic view of the evolution of proto-RCSs (such as enthusiast community or online advocacy sites) into fully featured Resource Centre Sites. Finally, building on this analysis, Chapter 5 points out some of the implications and effects that increasing use of this media form may have on its users and on the network of news and information publications on- and offline, and indicates the potential for further development of the site genre.
903

A data management and analytic model for business intelligence applications

Banda, Misheck 05 1900 (has links)
Most organisations use several data management and business intelligence solutions, on-premise and/or cloud-based, to manage and analyse their constantly growing business data. Challenges faced by organisations nowadays include, but are not limited to, growth limitations, big data, and inadequate analytics, computing and data storage capabilities. Although these organisations are able to generate reports and dashboards for decision-making in most cases, effective use of their business data and an appropriate business intelligence solution could achieve and sustain informed decision-making and allow competitive reaction to the dynamic external environment. A data management and analytic model is proposed on which organisations could rely for decisive guidance when planning to procure and implement a unified business intelligence solution. To achieve a sound model, the literature was reviewed by studying business intelligence extensively and by exploring and developing various deployment models and architectures (naïve, on-premise, and cloud-based), which revealed their benefits and challenges. The outcome of the literature review was the development of a hybrid business intelligence model and its accompanying architecture as the main contribution of the study. In order to assess the state of business intelligence utilisation, and to validate and improve the proposed architecture, two case studies targeting users and experts were conducted using quantitative and qualitative approaches. The case studies established that a decision to procure and implement a successful business intelligence solution rests on a number of crucial elements, such as applications, devices, tools, business intelligence services, data management and infrastructure. The findings further recognised the proposed hybrid architecture as a solution for managing complex organisations with serious data challenges. / Computing / M. Sc. (Computing)
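The crucial elements named in the abstract lend themselves to a simple decision aid for comparing deployment options. The following is a minimal sketch under stated assumptions: element names are taken from the abstract, but the scores, weights and scoring scale are hypothetical, not values prescribed by the thesis.

```python
# Minimal sketch: score candidate BI deployment models against the crucial
# elements named in the study. Scores and weights are hypothetical.
ELEMENTS = ["applications", "devices", "tools",
            "bi_services", "data_management", "infrastructure"]

def score(option: dict[str, int], weights: dict[str, float]) -> float:
    """Weighted sum of per-element readiness scores (0-5 scale assumed)."""
    return sum(weights[e] * option[e] for e in ELEMENTS)

weights = {e: 1.0 for e in ELEMENTS}          # equal weighting assumed
on_premise = dict(applications=4, devices=3, tools=4,
                  bi_services=2, data_management=4, infrastructure=3)
cloud      = dict(applications=3, devices=4, tools=3,
                  bi_services=5, data_management=3, infrastructure=5)
# A hybrid model draws on whichever side serves each element better.
hybrid     = {e: max(on_premise[e], cloud[e]) for e in ELEMENTS}

for name, opt in [("on-premise", on_premise), ("cloud", cloud), ("hybrid", hybrid)]:
    print(f"{name:11s} -> {score(opt, weights):.1f}")
```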
904

Studies On The Viability Of The Boundary Element Method For The Real-Time Simulation Of Biological Organs

Kirana Kumara, P 22 August 2016 (has links) (PDF)
Realistic and real-time computational simulation of biological organs (e.g., human kidneys, human liver) is a necessity when one tries to build a quality surgical simulator that can simulate surgical procedures involving these organs. Currently, deformable models, spring-mass models, and finite element models are widely used to achieve realistic simulations and/or real-time performance. It is widely agreed that continuum mechanics based numerical techniques are preferred over deformable models or spring-mass models, but those techniques are computationally expensive, and hence the higher accuracy they offer comes at the expense of speed. There is therefore a need to study the speed of different numerical techniques while keeping an eye on their accuracy. Such studies are available for the Finite Element Method (FEM) but rarely for the Boundary Element Method (BEM). Hence the present work aims to conduct a study on the viability of BEM for the real-time simulation of biological organs; the study is justified by the fact that BEM is considered to be inherently efficient compared to mesh based techniques like FEM, and a significant portion of the literature on the real-time simulation of biological organs suggests the use of BEM to achieve better simulations.
Simulating a biological organ requires its geometry, which is often not readily available; hence there is a need to extract the three dimensional (3D) geometry of biological organs from a stack of two dimensional (2D) scanned images. Software packages that can readily reconstruct 3D geometry of biological organs from 2D images are expensive. Hence, a novel procedure that requires only a few free software packages to obtain the geometry of biological organs from 2D image sequences is presented. The geometry of a pig liver is extracted from CT scan images for illustration purposes. Next, the three dimensional geometry of the human kidney (left and right kidneys of a male, and left and right kidneys of a female) is obtained from the Visible Human Dataset (VHD). The procedure presented in this work can be used to obtain patient specific organ geometry from patient specific images, without requiring any of the many commercial software packages that can readily do the job.
To carry out studies on the speed and accuracy of BEM, a source code for BEM is needed. Since no BEM code for 3D elasticity is readily available, a BEM code that can solve 3D linear elastostatic problems without accounting for body forces is developed from scratch. The code comes in three varieties: a MATLAB version, a sequential Fortran version, and a parallelised Fortran version. This is the first free and open source BEM code for 3D elasticity. The developed code is used to carry out studies on the viability of BEM for the real-time simulation of biological organs, and a few representative problems involving kidneys and the liver are found to give accurate solutions. The present work demonstrates that it is possible to simulate linear elastostatic behaviour in real time using BEM without resorting to any type of precomputation, by fully parallelising the simulations on a computer cluster and running them on different numbers of processors and with different block sizes. Since it is possible to obtain a complete solution in real time, there is no need to separately prove that every type of cutting, suturing etc. can be simulated in real time. Future work could involve incorporating nonlinearities into the simulations. Finally, a BEM based simulator may be built, after taking into account details like rendering.
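For readers unfamiliar with why BEM reduces to a dense linear algebra problem: collocation BEM for linear elastostatics yields a system H u = G t relating boundary displacements u to boundary tractions t. The following is a minimal sketch under stated assumptions — the influence matrices H and G are stood in by random matrices, whereas in a real code their entries come from integrating the Kelvin fundamental solution over boundary elements (the expensive, parallelisable step); it is not the thesis author's code.

```python
import numpy as np

# Collocation BEM for 3D linear elastostatics leads to H @ u = G @ t.
# With all tractions prescribed and all displacements unknown (the
# simplest boundary-condition case), the solve is a single dense system.
rng = np.random.default_rng(0)
n_dof = 300                      # e.g. 100 boundary nodes x 3 dof each
H = rng.standard_normal((n_dof, n_dof)) + n_dof * np.eye(n_dof)  # stand-in
G = rng.standard_normal((n_dof, n_dof))                          # stand-in
t_known = rng.standard_normal(n_dof)     # prescribed boundary tractions

u = np.linalg.solve(H, G @ t_known)      # dense O(n^3) solve
print(u[:3])                             # displacements at first node
```

Because only the boundary is discretised, n_dof stays small compared to a volumetric FEM mesh of the same organ, which is the efficiency argument the thesis builds on.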
905

Projekt vývoje Integrovaného testovacího nástroje / Project Development of Integrated Testing Node

Ženíšek, Jan January 2014 (has links)
Nowadays the development speed of new software products is a key to success, whether the aim is to satisfy customers' needs or to get ahead of competitors and fill a market gap. As development speeds up, the demands on the software quality assurance process increase. Two types of tools support software quality assurance. On the one hand, there are comprehensive commercial testing tools that usually include many functions but are extremely expensive to purchase. On the other hand, there are open-source tools that are available for free, run on many operating systems and can be modified, but whose functions typically cover only a certain subset of the software quality assurance process. The company TRASK solution a.s. decided to change this situation and asked the Software Quality Assurance competence centre at the University of Economics in Prague to create an Integrated Testing Node (ITN) that would combine the advantages of open-source tools while offering a range of functions as broad as a commercial solution. The purpose of this thesis is to describe the relevant phases of creating the Integrated Testing Node from both the factual and the methodological point of view. This aim is divided into partial goals, including task analysis and the design of the resulting system, an analysis of the open-source product portfolio, the choice of the most suitable tools for subsequent integration, the choice of an information system development method, the evaluation of client feedback and a proposal for the tool's future development. The biggest contribution of this thesis is the realisation of the ITN project, which can be used in informatics classes at the University of Economics in Prague as well as for software quality control in commercial companies.
906

Thermal finite element analysis of ceramic/metal joining for fusion using X-ray tomography data

Evans, Llion Marc January 2013 (has links)
A key challenge facing the nuclear fusion community is how to design a reactor that will operate in environmental conditions not easily reproducible in the laboratory for materials testing. Finite element analysis (FEA), commonly used to predict components’ performance, typically uses idealised geometries. An emerging technique shown to have improved accuracy is image-based finite element modelling (IBFEM). This involves converting a three-dimensional image (such as from X-ray tomography) into an FEA mesh. A main advantage of IBFEM is that models include microstructural and non-idealised manufacturing features. The aim of this work was to investigate the thermal performance of a CFC-Cu divertor monoblock, a carbon fibre composite (CFC) tile joined through its centre to a CuCrZr pipe with a Cu interlayer. As a plasma-facing component located where thermal flux in the reactor is at its highest, one of its primary functions is to extract heat by active cooling; characterisation of its thermal performance is therefore vital. Investigation of the thermal performance of CFC-Cu joining methods by laser flash analysis and X-ray tomography showed a strong correlation between microstructures at the material interface and a reduction in thermal conductivity. This problem therefore lent itself well to further investigation by IBFEM. However, because these high-resolution models require such large numbers of elements, commercial FEA software could not be used. This served as motivation to develop parallel software capable of performing the necessary transient thermal simulations. The resultant code was shown to scale well with increasing problem sizes, and a simulation with 137 million elements was successfully completed using 4096 cores. In comparison with a low-resolution IBFEM and traditional FEA simulations it was demonstrated to provide additional accuracy. IBFEM was used to simulate a divertor monoblock mock-up, where it was found that a region of delamination existed at the CFC-Cu interface. Predictions showed that if this was aligned unfavourably it would increase thermal gradients across the component, thus reducing its lifespan. As this was a feature introduced in manufacturing, it would not have been accounted for without IBFEM. The technique developed in this work has broad engineering applications. It could be used similarly to accurately model components in conditions unfeasible to produce in the laboratory, to assist in research and development of component manufacturing, or to verify commercial components against manufacturers’ claims.
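The core of the image-to-mesh conversion is direct: each segmented tomography voxel becomes one hexahedral element, so defects such as the delamination found at the CFC-Cu interface appear in the mesh as missing elements. The following is a minimal sketch of that idea, not the thesis's code; it assumes a thresholded greyscale volume and omits the smoothing, material mapping and transient thermal solver a real IBFEM workflow needs.

```python
import numpy as np

def voxels_to_hex_mesh(volume: np.ndarray, threshold: float):
    """Turn voxels above `threshold` into unit hexahedral elements.

    Returns (nodes, elements): node coordinates and 8-node connectivity,
    the minimal input an FEA solver needs. Voxels below the threshold
    (pores, delamination) are left out of the mesh, which is how IBFEM
    captures manufacturing features that idealised CAD geometry misses.
    """
    nx, ny, nz = volume.shape
    node_id = -np.ones((nx + 1, ny + 1, nz + 1), dtype=int)
    nodes, elements = [], []
    for i, j, k in zip(*np.nonzero(volume > threshold)):
        conn = []
        for di, dj, dk in [(0,0,0),(1,0,0),(1,1,0),(0,1,0),
                           (0,0,1),(1,0,1),(1,1,1),(0,1,1)]:
            corner = (i + di, j + dj, k + dk)
            if node_id[corner] < 0:           # create node on first use
                node_id[corner] = len(nodes)
                nodes.append(corner)
            conn.append(node_id[corner])
        elements.append(conn)
    return np.array(nodes, float), np.array(elements)

vol = np.random.default_rng(1).random((20, 20, 20))   # stand-in for CT data
nodes, elems = voxels_to_hex_mesh(vol, 0.5)
print(len(nodes), "nodes,", len(elems), "elements")
```

One element per voxel is what drives the element counts into the hundreds of millions at full scan resolution, and hence the need for the parallel solver described above.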
907

Aplicação de práticas de usabilidade ágil em software livre / Application of agile usability practices in free and open source software

Ana Paula Oliveira dos Santos 22 March 2012 (has links)
This Master's thesis was part of the Qualipso project (Quality Platform for Open Source Software), whose main objective was to improve the reliability of free and open source software systems. Within that context, the focus of this research is one of the attributes of software quality: usability. Usability practices in free and open source software development are most often applied in projects that are sponsored by large companies or that have usability experts as team members; in smaller community projects, generally composed of developers, usability is rarely considered. However, usability is an essential attribute of a system's quality in use. Based on values shared between the agile methods and free and open source software communities, this thesis proposes adapting usability practices from the context of agile methods to the context of free and open source software communities. Through a study of the literature, we gathered the main usability practices both within agile methods and within free and open source software, and classified them according to the phases of User-Centered Design, describing each one in the format name-context-problem-solution-examples. The practices were explored in free and open source software projects, which enabled a greater understanding of the problems faced in real contexts. This experience resulted in a proposal for adapting agile usability practices to the context of free and open source software communities. We describe an action research study in the Arquigrafia-Brazil project, a case study in the Mezuro project, and the application of usability practices in four projects of the IME-USP FLOSS Competence Center.
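The name-context-problem-solution-examples format maps naturally onto a small record type. The following is a minimal sketch; the entry shown is hypothetical, for illustration only, and does not reproduce the thesis's actual catalogue.

```python
from dataclasses import dataclass, field

@dataclass
class UsabilityPractice:
    """One catalogue entry in the name-context-problem-solution-examples format."""
    name: str
    context: str
    problem: str
    solution: str
    examples: list[str] = field(default_factory=list)
    ucd_phase: str = "design"   # User-Centered Design phase it is classified under

# Hypothetical entry, for illustration only.
practice = UsabilityPractice(
    name="Lightweight remote usability test",
    context="FLOSS project with no co-located users or usability experts",
    problem="Developers cannot observe real users working with the software",
    solution="Run short screen-shared sessions with volunteer community members",
    examples=["session notes posted to the project mailing list"],
    ucd_phase="evaluation",
)
print(practice.name, "-", practice.ucd_phase)
```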
908

Open Call / Open Call

Gajdošík, Andreas January 2018 (has links)
In the diploma thesis Open Call I focus on the unequal position of artists in the current art world, in which, despite transparent practices like open calls, the cult of the name and the power of networking and personal recommendation still persist. I process this topic artistically in the form of a practical artistic intervention, close to the 1:1 scale tactics of Arte Útil: the creation of a software tool called Nomin. Its purpose is to support weakened or marginalized groups of artists. Nomin uses properties of the SMTP email protocol to allow its users-spectators to send fake self-recommending emails, apparently from the email addresses of famous curators, to the inboxes of various galleries and other art institutions. During the development of Nomin and its technical background (software documentation, web page etc.) I followed the paradigm of free, libre, open source software (FLOSS) as well as the methodology of agile software development, in order to provide the users-spectators of this gesamtsoftwerk with fully functional, user-friendly software and to give them the possibility to influence the further development of Nomin or participate in it directly. The created artwork is thus not a single artefact but a set of interconnected objects and practices, grounded in a network of social bonds and behaviours, which balances on the edge of institutional critique, useful art, participatory art and collective performance.
909

Automatické generování testů pro GNOME GUI aplikace z metadat AT-SPI / Automated Generation of Tests for GNOME GUI Applications Using AT-SPI Metadata

Krajňák, Martin January 2020 (has links)
The goal of this thesis is the development of a tool for the automated generation of tests for applications with a graphical user interface in the GNOME environment. Test generation uses assistive technology metadata, specifically AT-SPI. The proposed test generator uses this metadata to build a model of the application under test. The model maps the sequences of events that the generator performs on the application under test during test generation. The generation process also detects serious errors in the tested applications. The output of the generation process is a set of automated tests suitable for regression testing. The functionality of the implemented test generator was successfully verified by testing 5 open source applications. Testing applications with the proposed tool demonstrated its ability to detect new bugs.
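The AT-SPI accessibility tree that such a generator models can be inspected from Python with the pyatspi bindings. The following is a minimal sketch, not the thesis's generator: it assumes a running GNOME session with accessibility enabled, and simply walks the tree printing each widget's role, name and available actions — the raw material from which event sequences would be generated.

```python
import pyatspi

def walk(accessible, depth=0):
    """Recursively print the AT-SPI tree: role, name and available actions."""
    try:
        action = accessible.queryAction()
        actions = [action.getName(i) for i in range(action.nActions)]
    except NotImplementedError:        # widget exposes no actions
        actions = []
    print("  " * depth, accessible.getRoleName(), repr(accessible.name), actions)
    for child in accessible:
        walk(child, depth + 1)

# Desktop 0 holds every accessible application in the session; a test
# generator would turn paths through this tree into event sequences.
desktop = pyatspi.Registry.getDesktop(0)
for app in desktop:
    walk(app)
```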
910

Návrh bezpečnostní infrastruktury elektronického archivu / Design of security infrastructure for electronic archive

Doležel, Radek January 2009 (has links)
This master's thesis deals with the design of a security infrastructure for an electronic archive. The theoretical part discusses the technical resources on which security services are based, and the protocols and methods used for protection. Based on the theoretical part, a model of the security infrastructure is designed and built in the laboratory. The model is based on open source software, and cryptographic USB tokens are used as secure storage for users' private authentication data. The thesis includes the design and construction of a real infrastructure for a secured electronic archive. Throughout, the main emphasis is placed on security and on a clear explanation of every step, from the initial design of the security infrastructure model to the completion of its construction.
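The role such tokens play — holding a private key that never leaves the device while the archive verifies possession of it — can be illustrated with a plain challenge-response check. The following is a minimal sketch using the Python cryptography package; the token's signing operation is stood in by a software key, whereas on real hardware the non-exportable key would sign on the token itself (typically via a PKCS#11 interface). It is not taken from the thesis.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the key pair generated on a cryptographic USB token.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()      # registered with the archive

challenge = os.urandom(32)                 # archive sends a fresh nonce
signature = private_key.sign(              # "token" signs the challenge
    challenge, padding.PKCS1v15(), hashes.SHA256())

# The archive verifies the response; raises InvalidSignature on failure.
public_key.verify(signature, challenge, padding.PKCS1v15(), hashes.SHA256())
print("token authenticated")
```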
