561

Scientific application: reengineering to add workflow concepts

Marques, Thiago Manhente de Carvalho 17 January 2017 (has links)
The use of workflow techniques in scientific computing is widely adopted for running experiments and building in silico models. By analysing some challenges faced by a scientific application in the geosciences domain, we noticed that workflows could be used to represent the geological models created with the application and to ease the development of features that meet those challenges. Most work and tools on scientific workflows, however, are designed for distributed-computing contexts such as web services and grid computing, which makes them unsuitable for integration into simpler scientific applications. In this dissertation, we discuss how to enable the composition and representation of workflows within an existing scientific application. We describe a conceptual architecture for a workflow engine designed to be embedded in a stand-alone application, along with an implementation model for a C++ application that uses Petri nets to model workflows and C++ functions to represent tasks. As a proof of concept, we implemented this workflow model in an existing application and analysed its impact on the application.
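The dissertation describes but does not reproduce the engine's interface; the following is a minimal sketch, with hypothetical names, of how Petri-net places and transitions might drive C++ functions as workflow tasks.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// A transition consumes tokens from its input places, runs a task,
// and deposits tokens in its output places.
struct Transition {
    std::vector<std::string> inputs;   // places that must each hold a token
    std::vector<std::string> outputs;  // places that receive a token after firing
    std::function<void()> task;        // the C++ function representing the work
};

class PetriNetWorkflow {
    std::map<std::string, int> marking_;   // current token count per place
    std::vector<Transition> transitions_;
public:
    void addPlace(const std::string& p, int tokens = 0) { marking_[p] = tokens; }
    void addTransition(Transition t) { transitions_.push_back(std::move(t)); }

    bool enabled(const Transition& t) const {
        for (const auto& p : t.inputs)
            if (marking_.at(p) < 1) return false;
        return true;
    }

    // Fire enabled transitions until the net reaches a dead marking.
    void run() {
        bool fired = true;
        while (fired) {
            fired = false;
            for (const auto& t : transitions_) {
                if (!enabled(t)) continue;
                for (const auto& p : t.inputs)  --marking_[p];
                t.task();                        // execute this workflow step
                for (const auto& p : t.outputs) ++marking_[p];
                fired = true;
            }
        }
    }
};

int main() {
    PetriNetWorkflow wf;
    wf.addPlace("raw_data", 1);          // initial token: input is available
    wf.addPlace("model", 0);
    wf.addTransition({{"raw_data"}, {"model"},
                      [] { /* call the application's model-building function */ }});
    wf.run();                            // fires the build step once, then halts
}
```

In this sketch each transition wraps one of the application's existing functions, and the marking encodes which steps of the pipeline are ready to run.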
562

Design and manufacture of engineered titanium-based materials for biomedical applications

Almushref, Fares R. January 2017 (has links)
Metallic materials have recently gained much attention in the areas of medical devices and orthopaedics. Artificial organs, dental implants, prostheses and implants that replace damaged or malfunctioning parts of the body are, or contain, metal components. Our ageing society creates growing demand for devices and implants that perform better than traditional solutions. Matching the mechanical properties (i.e. stiffness and strength) of the device to those of the host tissue is a major challenge in the design and manufacture of engineered metal materials for biomedical applications; failure to do so provokes implant loosening, patient discomfort and repeated surgeries. Tailoring the physical properties and biocompatibility of these materials is therefore the overall aim of this research programme. This PhD study focused on tailoring the mechanical properties of titanium-based materials and alloys, taking titanium's inertness and the selection of biocompatible alloying elements as the baseline. Two approaches were employed to decrease stiffness (i.e. Young's modulus): first, introducing porosity into a titanium matrix to reduce its Young's modulus; second, designing and manufacturing beta-titanium-based alloys with a reduced Young's modulus. Titanium scaffolds were manufactured by powder metallurgy using the space-holder technique followed by sintering. Four categories of space-holder size were used to study the effect of pore size and porosity on the mechanical properties of the porosity-engineered Ti scaffolds. Ti-based alloys were manufactured by techniques such as sintering and arc-melting, and the effects of the different fabrication processes and of the addition of beta-stabilising elements were investigated. The mechanical properties obtained across the pore sizes and porosities studied fell within the range matching bone, indicating that these materials are suitable for biomedical applications, and the beta-Ti alloy results show that mechanical properties can be lowered by tailoring the crystal structure. The characterisation of the Ti-based alloys supports the development of this material for use in biomedical applications.
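The abstract does not state which stiffness-porosity relation was used; as a hedged point of reference, the Gibson-Ashby scaling law for open-cell cellular solids (a standard outside relation, not a result of this thesis) captures why introducing porosity lowers the apparent modulus:

\[ \frac{E}{E_s} \approx C \left( \frac{\rho}{\rho_s} \right)^2 \]

Here \(E_s\) and \(\rho_s\) are the Young's modulus and density of fully dense titanium, \(\rho/\rho_s\) is the relative density (one minus the porosity fraction), and \(C\) is a geometry constant of order 1. Halving the relative density thus cuts stiffness roughly fourfold, which is how a porous Ti scaffold can approach the 10-30 GPa range typical of cortical bone.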
563

Nouveauté technologique et milieu professionnel : la tablette tactile et les acteurs de l'immobilier / Technological novelty and occupational world : the touchscreen tablet and the real estate’s professionals

Thevenot, Pauline 06 March 2015 (has links)
In occupational settings, the actors involved with a new technological device face many unknowns: an unfamiliar technical object, an unmastered technology, different ways of working, unknown users, and so on. Starting from a study of the representations and practices associated with touchscreen tablets, we examined how the meaning of uses is constructed in situations of technological novelty. To this end, we studied a web application for real-estate inventory reports (états des lieux) developed shortly after the launch of touchscreen tablets in France. The representations at work arise in reaction to the unknown and the unmastered, and constitute a first way of managing them. For the touchscreen tablet, an object of technological and functional convergence, the unknown and one's relation to it point to a paradox: the tablet presents itself as a break with the existing digital landscape while positioning itself on pre-existing features and practices. Between experience and technological novelty, then, how is the meaning of uses constructed in a professional situation? And how does the tablet's paradox come into play in software development?
564

Modeling the power consumption of computing systems and applications through machine learning techniques / Modélisation de la consommation énergétique des systèmes informatiques et ses applications grâce à des techniques d'apprentissage automatique

Fontoura Cupertino, Leandro 17 July 2015 (has links)
The number of computing systems has grown continuously in recent years, and the popularity of data centers has turned them into some of the most power-demanding facilities. Data-center use divides into high-performance computing (HPC) and Internet services, or Clouds. Computing speed is crucial in HPC environments, while on Cloud systems it may vary according to service-level agreements; some data centers even offer hybrid environments, and all of them are energy hungry. The present work is a study of power models for computing systems. Such models allow a better understanding of the energy consumption of computers and are a first step towards better monitoring and management policies for these systems, whether to increase energy savings or to charge end-users for the energy they consume. Energy management and control policies are subject to many limitations: most energy-aware scheduling algorithms use restricted power models that carry a number of open problems. Previous work on power modeling of computing systems proposed using system information to monitor the power consumption of applications, but those models are either too specific to a given kind of application or lack accuracy. This thesis presents techniques to improve the accuracy of power models by tackling issues ranging from the acquisition of power measurements to the definition of a generic workload that enables the creation of a generic model, i.e. one that can be used for heterogeneous workloads. To achieve such models, machine learning techniques are proposed. Machine learning models adapt to the architecture and are the core of this research. Specifically, this work evaluates artificial neural networks (ANN) and linear regression (LR) as machine learning techniques for non-linear statistical modeling. The models are created through a data-driven approach, so their parameters can be adapted from information collected while running synthetic workloads. The use of machine learning aims at highly accurate application- and system-level estimators. The proposed methodology is architecture-independent and can easily be reproduced in new environments. The results show that artificial neural networks yield highly accurate estimators; however, they cannot be applied at the process level due to modeling constraints, where predefined models must instead be calibrated to achieve fair results. Process-level models enable the estimation of virtual machines' power consumption, which can be used for Cloud provisioning.
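The thesis's own models are not reproduced here; as an illustration of the linear-regression strand of the approach, the following sketch fits a system-level model P ≈ w0 + w1*u_cpu + w2*u_mem by ordinary least squares, with feature names and sample values invented for the example.

```cpp
#include <array>
#include <cstdio>
#include <vector>

// One training sample: utilization counters and the measured power draw.
struct Sample { double cpu, mem, watts; };

// Fit P ~ w0 + w1*cpu + w2*mem by ordinary least squares,
// solving the 3x3 normal equations by Gaussian elimination.
std::array<double, 3> fitPowerModel(const std::vector<Sample>& data) {
    double A[3][4] = {};                       // augmented normal-equation matrix
    for (const Sample& s : data) {
        const double x[3] = {1.0, s.cpu, s.mem};
        for (int i = 0; i < 3; ++i) {
            for (int j = 0; j < 3; ++j) A[i][j] += x[i] * x[j];
            A[i][3] += x[i] * s.watts;
        }
    }
    for (int i = 0; i < 3; ++i)                // forward elimination
        for (int k = i + 1; k < 3; ++k) {
            double f = A[k][i] / A[i][i];
            for (int j = i; j < 4; ++j) A[k][j] -= f * A[i][j];
        }
    std::array<double, 3> w{};
    for (int i = 2; i >= 0; --i) {             // back-substitution
        double s = A[i][3];
        for (int j = i + 1; j < 3; ++j) s -= A[i][j] * w[j];
        w[i] = s / A[i][i];
    }
    return w;
}

int main() {
    // Synthetic calibration run: power grows with CPU and memory activity.
    std::vector<Sample> data = {
        {0.10, 0.20, 95.0}, {0.50, 0.30, 140.0},
        {0.80, 0.60, 190.0}, {0.95, 0.80, 215.0},
    };
    auto w = fitPowerModel(data);
    std::printf("P(u_cpu,u_mem) = %.1f + %.1f*u_cpu + %.1f*u_mem\n",
                w[0], w[1], w[2]);
}
```

An ANN estimator, as evaluated in the thesis, would replace the closed-form fit with an iteratively trained non-linear model over the same counter data.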
565

Gestão do design na prática : framework para implementação em empresas

Moreira, Bruna Ruschel January 2016 (has links)
In recent decades, design has come to be practised at increasingly complex levels. However, challenges remain in the integrated insertion of designers' competencies into companies, owing to the fragmented view many corporations still have of design's benefits. Design management therefore remains a challenge when it seeks to coordinate design practices vertically across the strategic, tactical and operational levels. Researchers have investigated this area, but in general the results amount to contributions that are more theoretical than applicable in companies. This research presents a design management framework for application in product-developing companies. To reach this objective it was necessary to understand, from a theoretical and practical perspective, the levels at which this kind of management is applied in companies, together with its benefits and challenges; to identify the critical success factors in the integrated application of design in companies; to explore ways of achieving a practical application in companies; to analyse the components of a conceptual framework; and to evaluate the framework's applicability. To support these discussions, the investigation was designed around a literature review, an exploratory field study (gathering the opinions of ninety designers), a case study (conducted with employees and managers of three companies) and action research (carried out with five product-developing companies). The results enabled the construction of a design management framework composed of eight stages that aim, respectively, to intend, engage, structure and execute the insertion of integrated design practices in companies. Each stage of the framework clearly states its objective, the actions required and the organizational impacts those activities produce. In addition, continuous processes are recommended for sustaining the full insertion of design in companies.
566

Why Johnny Still Can’t Pentest: A Comparative Analysis of Open-source Black-box Web Vulnerability Scanners

Khalil, Rana Fouad 19 December 2018 (has links)
Black-box web application vulnerability scanners are automated tools that crawl a web application to look for vulnerabilities. They are typically used in one of two ways. In the first, point-and-shoot approach, a scanner is given only the root URL of an application and asked to scan the site. In the second approach, the scanner is first configured to maximize crawling coverage and vulnerability-detection accuracy. Although the performance of leading commercial scanners has been thoroughly studied, very little research has evaluated open-source scanners. This paper presents a feature and performance evaluation of five open-source scanners. We analyze crawling coverage, vulnerability-detection accuracy, scanning speed, and reporting and usability features. The scanners are tested against two well-known benchmarks, WIVET and WAVSEP, and additionally against a realistic web application called WackoPicko. The chosen benchmarks cover a wide range of vulnerabilities and crawling challenges. Each scanner is tested in two modes, default and configured, and the scanners are also compared with the state-of-the-art commercial scanner Burp Suite Professional. Our results show that properly crawling a web application is critical to detecting vulnerabilities. Unfortunately, most of the scanners evaluated had difficulty crawling common web technologies such as dynamically generated JavaScript content and Flash applications. We also identified several classes of vulnerabilities that the scanners fail to detect. Furthermore, our results show that the scanners improved considerably when run in configured mode.
567

Assistência de ar em barra, ângulo e volume de pulverização no controle de doenças e na produtividade da cultura do trigo

Christovam, Rafael de Souza [UNESP] 19 August 2011 (has links) (PDF)
In order to improve the control efficiency of Blumeria graminis tritici and Puccinia tritici in the wheat crop, this study evaluated different spraying techniques with respect to spray deposits, control of powdery mildew and leaf rust, and crop yield. Eight experiments were conducted in the experimental area of FCA/UNESP, Botucatu, on wheat (cultivar IAC-370), divided into two phases: four experiments in the experimental area of the Plant Production - Agriculture Department (Phase I) and four in the area known as Cascalheira (Phase II), during the 2009 season. The first experiment used a randomized block design with six treatments and four replications, totalling 24 plots. Treatments were arranged in a 3 x 2 factorial scheme: three air-assistance speeds (0, 15 and 29 km h-1) combined with two spray volumes (100 L ha-1 with XR 110015 flat-fan nozzles and 200 L ha-1 with XR 11003 flat-fan nozzles). Plots measured 8.0 x 10.0 m (width x length), and 20 plants were selected at random from each plot for sampling of spray deposits. Deposits were assessed using a tracer (Brilliant Blue dye) at 0.15% of the spray volume and quantified by spectrophotometry. The second experiment was carried out at the same site with the treatments arranged in the same design as the previous experiment, plus an untreated control. At growth stages 32 (second node visible) and 45 (booting complete), a triazole fungicide (tebuconazole) was sprayed at 100 and 200 L ha-1. The severity of powdery mildew and leaf rust was assessed by... (complete abstract: click electronic access below)
568

Data visualization for the modern web : A look into tools and techniques for visualizing data in Angular 5 applications

Almroth, Tobias January 2018 (has links)
This paper looks into how data is best visualized and how visualizations should be designed to be most easily perceived. It also surveys the tools available on the market today for visualizing data in Angular 5 applications. On behalf of a client, a developer team at the Swedish Police IT department, the tools are evaluated and the one most suitable for the client is identified. The paper further examines how a dynamic data solution can be developed in Angular 5: a solution where data can be selected in one component and displayed in another. To answer these questions, previous research into data visualization was studied, as was the way Angular 5 applications are developed. Interviews were held with the client to identify their specific requirements for visualization tools. The visualization tools available on the market were then catalogued and evaluated against the client's requirements, and a prototype application was developed, showcasing both the most suitable tool and its integration as well as a dynamic data solution in Angular 5. In conclusion, data visualizations should be made as simple as possible, with the main focus on the data. Among the tools, the one most suitable for the client was Chart.js, which integrated easily into an Angular 5 application; thanks to Angular's features, such an application is well equipped for handling and developing dynamic data solutions.
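The abstract does not show the Angular code. In Angular 5, component-to-component data flow of this kind is commonly built around a shared service exposing an observable stream; that is an assumption about the design, not a detail from the paper. The underlying publish/subscribe idea is sketched below in C++ with hypothetical names.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

// A minimal shared "data service": one component pushes a selection,
// any number of subscribed components react to it.
class SelectionService {
    std::vector<std::function<void(const std::string&)>> subscribers_;
public:
    void subscribe(std::function<void(const std::string&)> cb) {
        subscribers_.push_back(std::move(cb));
    }
    void select(const std::string& dataset) {   // called by the selecting component
        for (auto& cb : subscribers_) cb(dataset);
    }
};

int main() {
    SelectionService service;
    // A "chart component" subscribes and redraws on every new selection.
    service.subscribe([](const std::string& d) {
        std::printf("chart: redrawing with dataset '%s'\n", d.c_str());
    });
    service.select("crimes-per-region");        // the "picker component" publishes
}
```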
569

Beach Museum Web Application

Kakkireni, Nithin Kumar January 1900 (has links)
Master of Science / Department of Computer Science / Daniel Andresen / This project involves in developing a responsive web application for Beach Museum at Manhattan, Kansas. Application is built on development boxes using Amazon web services. Project is built on MVC architecture that helps user to search images, create their own collection from the images and include an admin module. Migrating the current existing SQL database to couchDB for better performance of the available data. Integrated Apache Lucene to support text search in the couch database writing different indexes to retrieve the results. Implementing core functionalities like basic search, advanced search, filter objects with respective to artist, decade, object type and relevance using different indexes and Mango queries in the couchDB. Search Results are further chunked and displayed to the user. Web storage API’s were used to provide the functionality for a user to create their own collection (set of Images). Built an Admin module to perform CRUD operations the database. Admin module involves in creating exhibitions, adding/editing works and artists in the couch DB.
570

Near Me – a location-aware to-do Android application

Garlapati, Deepti Reddy January 1900 (has links)
Master of Science / Computing and Information Sciences / Daniel A. Andresen / The growing needs of different products is only being increased from day to day and among these numerous products that each person plan to purchase, it has become a tedious task to keep track of all the products that should be purchased. One such important thing is that, everyone wish to keep track of an item when the location associated with the item is nearby. We have many To-Do applications where we can just note down our day to day needs and things to get. But we might face situations like forgetting to keep track of what we have in our To Do list related to buying an item and when the location associated to the item is nearby, there are high possibilities that we overlook and just forget about purchasing these items. These situations occurred most of the times when I did a small survey among my friends. This difficulty has lead me to rethink and find a solution. The usage of smart phones has become very common these days. Android market which is an open source has helped many people to develop their own applications and these could easily be run on Android smart phones. I thought of developing an android application that helps in tracking not only the To-Do list of the items that a person tracks down to purchase but also stores the location where that item can be purchased. This To-Do app then provide notifications when a person is nearby the location associated with the item. It also triggers an alarm so that the user can easily remember what item he has planned to get in that particular location. The proposed app tries to solve most of the problems by providing an intuitive interface to the user where the user can note down all their planned purchases with location of the products and get reminders about it when passing through that location. The Near Me application is about tracking the items to be purchased or the tasks that are to be done specific to the location. Each to-do item is associated with a date, location and notes. Storing the locations in the application helps the user with timely notifications and alarms according to the location the user is in and the tasks that should be done in that location. Also these To-Do items can also be synced with online storage application like Dropbox.
