311 |
Implante neural controlado em malha fechada / Closed loop controlled neural implant
Araujo, Carlos Eduardo de, 15 December 2015 (has links)
One of the challenges that neuroscience researchers pose to biomedical engineers is brain-machine interaction. The nervous system communicates by interpreting electrochemical signals, and implantable circuits can make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of the neurotransmitter dopamine (DA). Different techniques have been employed to control dopamine concentration, such as electrical and magnetic stimulators and drugs. In this work the neurotransmitter concentration was controlled automatically, something not done in current practice. To that end, four systems were designed and developed: deep brain stimulation (DBS), transmagnetic stimulation (TMS), an infusion pump control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), a sensing circuit that detects changes in the concentration of neurotransmitters such as dopamine. Software was also developed for data display and analysis in synchrony with ongoing experimental events, easing the use of these devices together with infusion pumps; the system is flexible enough that DBS or TMS can be applied manually or automatically, alongside other stimulation techniques such as lights and sounds. The resulting system controls the DA concentration automatically, with a resolution of 0.4 µmol/L and a concentration-correction interval adjustable between 1 and 90 seconds. It controls concentrations between 1 and 10 µmol/L with an error of about +/- 0.8 µmol/L. Although designed for dopamine, the system can be used to control other neurotransmitters. It is proposed to continue the closed-loop development, combining FSCV with DBS (or TMS, or infusion) in parkinsonian animal models.
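Stated as control engineering, the loop above reduces to: sample the dopamine concentration with FSCV, compare it against a setpoint, and actuate (a DBS/TMS pulse train or an infusion bolus) when the error exceeds the sensing resolution. The following Python sketch illustrates that cycle under stated assumptions: the plant model, noise figure and bolus size are invented for illustration, and only the 0.4 µmol/L resolution and the 1-90 s correction interval come from the abstract.

```python
import random
import time

RESOLUTION = 0.4        # µmol/L resolution quoted in the abstract
CORRECTION_PERIOD = 1   # seconds; the abstract says 1-90 s is adjustable

class SimulatedPreparation:
    """Toy stand-in for the biological preparation: dopamine concentration
    decays between samples and each stimulus adds a small bolus."""

    def __init__(self, concentration=2.0):
        self.concentration = concentration       # µmol/L

    def read_fscv(self):
        """Hypothetical FSCV readout with a little measurement noise."""
        self.concentration *= 0.95               # clearance between samples
        return self.concentration + random.gauss(0.0, 0.1)

    def stimulate(self):
        """Hypothetical DBS pulse train / infusion bolus."""
        self.concentration += 0.5

def control_loop(plant, setpoint, cycles):
    """On/off control: act only when the error exceeds the sensing
    resolution, consistent with the reported +/- 0.8 µmol/L control error."""
    for _ in range(cycles):
        error = setpoint - plant.read_fscv()
        if error > RESOLUTION:
            plant.stimulate()
        time.sleep(CORRECTION_PERIOD)            # correction interval

control_loop(SimulatedPreparation(), setpoint=5.0, cycles=10)
```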
|
312 |
Linguagem e compilador para o paradigma orientado a notificações (PON): avanços e comparações / Language and compiler for the Notification-Oriented Paradigm (NOP): advances and comparisons
Ferreira, Cleverson Avelino, 28 August 2015 (has links)
The current software programming paradigms, namely the Imperative Paradigm (IP) and the Declarative Paradigm (DP), have weaknesses that affect application performance and the achievement of decoupling (or minimal coupling) between software elements. To mitigate these weaknesses, the Notification-Oriented Paradigm (NOP) was developed. NOP draws on concepts from the IP (e.g. objects) and the DP (e.g. bases of facts and rules), but changes the essence of execution, namely the logical-causal inference. Basically, NOP uses objects to represent facts and rules as compositions of smaller objects that exhibit a degree of autonomy, independence, reactivity and collaboration, carrying out inference by means of precise notifications among themselves. These concepts were first materialized as a framework developed in C++. This materialization has been used as an alternative for developing applications under the paradigm and indeed made it possible to build applications for usual computing environments based on the Von Neumann architecture. Despite these contributions, application development in NOP still achieved neither the performance that its asymptotic analysis suggests nor the ease of programming claimed as one of its main characteristics. In this context, this work advances the state of the art of NOP by creating a programming language and a compiler for the paradigm. The work presents the definition of the language through practical examples guided by application development, and then details the compiler and its structure. To demonstrate the advance with respect to performance (e.g. processing time) and ease of programming, comparative studies were carried out using the language and compiler. These studies were based on two applications, Target Shooting (Mira ao Alvo) and Sales, developed in the NOP language; the experiments simulated execution sequences in order to evaluate the processing time of the code generated by the NOP compiler. The experiments also allowed a subjective evaluation of the NOP language with respect to ease of programming. These comparative studies showed that the results achieved by the NOP compiler were satisfactory when compared with those obtained with the framework and with equivalent applications developed under the Object-Oriented Paradigm (OOP).
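The notification chain at the heart of NOP can be pictured in a few lines: an attribute notifies only the premises that reference it, a premise notifies its rules only when its truth value flips, and a rule fires when all of its premises hold. The Python sketch below is a loose illustration of that idea, assuming nothing about the real C++ framework or the NOP language; all class and method names are invented.

```python
class Attribute:
    """Holds a value and notifies only the premises that depend on it."""
    def __init__(self, value):
        self.value = value
        self.premises = []

    def set(self, value):
        if value != self.value:          # notify only on actual change
            self.value = value
            for premise in self.premises:
                premise.notify()

class Premise:
    """A single logical condition over one attribute."""
    def __init__(self, attribute, predicate):
        self.attribute = attribute
        self.predicate = predicate
        self.rules = []
        attribute.premises.append(self)
        self.state = predicate(attribute.value)

    def notify(self):
        new_state = self.predicate(self.attribute.value)
        if new_state != self.state:      # propagate only truth-value flips
            self.state = new_state
            for rule in self.rules:
                rule.notify()

class Rule:
    """Fires its action when every premise holds; re-evaluated on
    notification rather than by polling a fact base."""
    def __init__(self, premises, action):
        self.premises = premises
        self.action = action
        for p in premises:
            p.rules.append(self)

    def notify(self):
        if all(p.state for p in self.premises):
            self.action()

# Usage: a rule that fires when the target is in range and the gun is loaded.
in_range = Attribute(False)
loaded = Attribute(True)
rule = Rule([Premise(in_range, bool), Premise(loaded, bool)],
            lambda: print("fire!"))
in_range.set(True)   # -> "fire!"
```

The point of the chain is that re-evaluation is driven by change notifications instead of repeatedly scanning a fact base, which is the source of the performance argument made above.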
|
313 |
Desenvolvimento de software e hardware para diagnóstico e acompanhamento de lesões dermatológicas suspeitas para câncer de pele / Development of software and hardware for the diagnosis and follow-up of skin lesions suspected of skin cancer
Oselame, Gleidson Brandão, 28 August 2014 (has links)
Cancer is responsible for about 7 million deaths annually worldwide. An estimated 25% of all cancers are skin cancers, which in Brazil are the most frequent type in every geographic region. Among them is melanoma, which accounts for 4% of skin cancers and whose incidence has doubled worldwide over the past ten years. Among the diagnostic methods employed is the ABCD rule, which considers the asymmetry (A), border (B), color (C) and diameter (D) of spots or nevi. Digital image processing has shown good potential to aid the early diagnosis of melanoma. The objective of this study was therefore to develop software, on the MATLAB® platform, together with hardware to standardize image acquisition, in order to diagnose and monitor skin lesions suspected of malignancy (melanoma). The ABCD rule guided the development of the computational analysis methods, and MATLAB was used as the programming environment for the digital image processing software. The images were acquired from two open-access image banks: melanoma images (n=15) and nevus (non-cancer) images (n=15). RGB images were converted to grayscale, an 8x8 median filter was applied, followed by a 3x3 neighborhood approximation technique. The images were then binarized and black/white inverted for subsequent extraction of features of the lesion contour. For standardized image acquisition, a hardware prototype was developed; it was not employed in this study (which used images with a confirmed diagnosis from image banks), but it was validated for assessing lesion diameter (D). Descriptive statistics were used, the groups were compared with the non-parametric Mann-Whitney U test for two independent samples, and ROC curves were used to evaluate the sensitivity (SE) and specificity (SP) of each variable. The classifier was a radial-basis-function artificial neural network, which achieved a diagnostic accuracy of 100% for melanoma images and 90.9% for non-cancer images, giving an overall predictive accuracy of 95.5%. Regarding SE and SP, the proposed method achieved an area under the ROC curve of 0.967, suggesting excellent diagnostic predictive ability at low cost, since the software can run on the large majority of operating systems in use today.
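The preprocessing chain described above (RGB to grayscale, 8x8 median filter, binarization, inversion, contour extraction) maps directly onto standard image-processing primitives. A minimal sketch using OpenCV and SciPy instead of MATLAB follows; the Otsu threshold and the particular border/diameter features are assumptions for illustration, not the study's exact parameters, and OpenCV >= 4 is assumed for the findContours signature.

```python
import cv2
import numpy as np
from scipy import ndimage

def lesion_features(path):
    """Contour features for one lesion image, following the pipeline above."""
    bgr = cv2.imread(path)                        # OpenCV loads files as BGR
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    smooth = ndimage.median_filter(gray, size=8)  # 8x8 median filter
    # Binarize; Otsu's threshold is an assumption, since the abstract does
    # not state the exact thresholding step.
    _, binary = cv2.threshold(smooth, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    binary = cv2.bitwise_not(binary)              # invert: lesion becomes white
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    lesion = max(contours, key=cv2.contourArea)   # keep the largest blob
    area = cv2.contourArea(lesion)
    perimeter = cv2.arcLength(lesion, True)
    (_, _), radius = cv2.minEnclosingCircle(lesion)
    return {
        "area": area,
        # Compactness (perimeter^2 / 4*pi*area) as a stand-in for border (B).
        "border_irregularity": perimeter**2 / (4 * np.pi * area),
        "diameter": 2 * radius,                   # diameter (D), in pixels
    }
```

These features, together with asymmetry and colour descriptors, would then feed the radial-basis-function network described above; that classification stage is not sketched here.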
|
314 |
Context driven workflow adaptation applied to healthcare planning = Adaptação de workflows dirigida por contexto aplicada ao planejamento de saúde
Vilar, Bruno Siqueira Campos Mendonça, 1982-, 25 August 2018 (has links)
Advisors (Orientadores): Claudia Maria Bauzer Medeiros, André Santanchè / Doctoral thesis (Tese de doutorado) - Universidade Estadual de Campinas, Instituto de Computação
Issue date: 2014 / Abstract: Workflow Management Systems (WfMS) are used to manage the execution of processes, improving the efficiency and efficacy of the procedures in use. The driving forces behind the adoption and development of WfMSs are business and scientific applications, and the associated research efforts have resulted in consolidated mechanisms, consensual protocols and standards. In particular, a scientific WfMS helps scientists to specify and run distributed experiments. It provides several features that support activities within an experimental environment, such as flexibility to change workflow design and keeping the provenance (and thus reproducibility) of experiments. On the other hand, barring a few research initiatives, WfMSs do not provide appropriate support for dynamic, context-based customization at run-time; on-the-fly adaptations usually require user intervention. This thesis is concerned with mending this gap, providing WfMSs with a context-aware mechanism to dynamically customize workflow execution. As a result, we designed and developed DynFlow, a software architecture that allows such customization, applied to a specific domain: healthcare planning. This application domain was chosen because it is a very good example of context-sensitive customization. Indeed, healthcare procedures constantly undergo unexpected changes that may occur during a treatment, such as a patient's reaction to a medicine. To meet dynamic customization demands, healthcare planning research has developed semi-automated techniques to support fast changes of the careflow steps according to a patient's state and evolution. One such technique is Computer-Interpretable Guidelines (CIG), whose most prominent member is the Task-Network Model (TNM), a rule-based approach able to build a plan on the fly according to the context. Our research led us to conclude that CIGs do not support features required by health professionals, such as distributed execution, provenance and extensibility, which are available in WfMSs. In other words, CIGs and WfMSs have complementary characteristics, and both are directed towards the execution of activities. Given the above, the main contributions of the thesis are: (a) the design and development of DynFlow, whose underlying model blends TNM characteristics with WfMS features; (b) the characterization of the main advantages and disadvantages of CIG models and workflow models; and (c) the implementation of a prototype, based on ontologies, applied to nursing care. Ontologies are used as a solution to enable interoperability across distinct SWfMS internal representations, as well as to support distinct healthcare vocabularies and procedures. / Doutorado em Ciência da Computação (Doctorate in Computer Science)
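A TNM-style engine can be pictured as a rule base consulted against the current patient context to choose the next careflow tasks, re-evaluated whenever the context changes. The sketch below is a toy illustration of that idea under invented rules and context keys; it is not DynFlow's actual model or API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """Guard + task: the task is eligible whenever the guard holds."""
    guard: Callable[[dict], bool]
    task: str
    priority: int = 0

def next_tasks(context: dict, rules: list[Rule]) -> list[str]:
    """Re-evaluated whenever the context changes (e.g. new vital signs),
    so the plan is built on the fly instead of being fixed up front."""
    eligible = [r for r in rules if r.guard(context)]
    eligible.sort(key=lambda r: r.priority, reverse=True)
    return [r.task for r in eligible]

# Invented careflow fragment for a fever/allergy scenario.
rules = [
    Rule(lambda c: c["temperature"] > 38.5, "administer antipyretic", 1),
    Rule(lambda c: c.get("allergic_reaction"), "suspend medication", 2),
    Rule(lambda c: True, "record vital signs", 0),
]

print(next_tasks({"temperature": 39.1, "allergic_reaction": True}, rules))
# -> ['suspend medication', 'administer antipyretic', 'record vital signs']
```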
|
315 |
The development, design, and theory of educational interactive multimedia software
Chapman, William James, 01 January 2002 (has links)
This project provides teachers and students with background information regarding software design. Specifically, it focuses on design theory, including content, interface, graphics, animation, navigation, and audio issues that may arise during the design and construction of educational multimedia software. These design issues are demonstrated in the software "You can make a difference ... Human Rights".
|
316 |
Easy Net Admin: Inventory tool for network administrators
Reddy, Rohini Mopu, 01 January 2004 (has links)
The purpose of this project was to develop a tool / utility for network administrators to maintain information about all the systems operated by the College of Natural Sciences at California State University, San Bernardino.
|
317 |
Java/XML-based Trading Information Processing System for produce wholesale market
Yang, Ching-Ling, 01 January 2004 (has links)
The purpose of this project is to investigate the use of the emerging XML technologies to improve online Business to Business (B2B) supply chain processes.
|
318 |
Development of a framework to understand the factors that influence software productivity in agile teams
Nzou, Viola, 10 1900 (has links)
Productivity improvement in the software industry is one of the major challenges facing many software development companies in this century. Most companies have adopted agile methodologies in order to profit from the benefits claimed for them. Agile methodologies are characterised by frequent software delivery, short feedback loops, quicker response to change, and problem identification earlier in the development process.
The agile approach has been recognised as paving the way for companies to achieve higher software productivity: delivering good-quality, cost-effective software and enabling software development companies to respond to business demands for high quality, high performance and high development speed in delivering the final product. For companies that adopt agile methodologies, understanding the factors that influence their teams' software development productivity is a challenging task for management and practitioners today.
In this research, an analysis is presented that identifies the productivity factors that affect agile teams. It is a study of agile methods that identifies common agile practices and/or values that have an impact on productivity, and it describes suitable metrics that could be used to measure agile team productivity. A qualitative research approach was used, and the case study was chosen as the research strategy. Two South African companies, located in two different provinces, that had adopted agile methodologies in their software development were selected for the case studies. Qualitative content analysis was used to permit subjective interpretation of the factors that influence agile team productivity and to analyse to what extent these factors affected productivity. This research has shown that an understanding of the factors that influence an agile team's productivity gives significant insight into the way agile teams work, motivates team members to work together, and leads to uniform metrics for tracking each team's progress. The study indicates that tracking an agile team's work and providing the tools needed to execute its tasks improves agile team productivity. It should be recognised that using metrics to measure performance in agile teams is helpful in creating a team culture and trust. In this study, it was found that the factors identified in both the literature and the case studies affected productivity in the two companies under study, both positively and negatively. The study also found that applying the correct metrics in assessing, analysing and reviewing an agile team's performance is important when monitoring productivity. Successful software delivery is only possible if individuals are committed to their work, are provided with the necessary tools and have access to a stable working environment. In addition, individual factors such as knowledge, skills, abilities, personality and experience should be considered when forming agile teams. Considering these factors will result in grouping people who are able to work together and achieve a common goal, which is important in improving productivity. A conceptual framework for agile team productivity is proposed, and the findings are discussed in more detail in this research. / School of Computing / M.Sc. (Computing)
|
319 |
From specification through refinement to implementation: a comparative study
Van Coppenhagen, Ingrid H. M., 30 June 2002 (has links)
This dissertation investigates the role of specification, refinement and implementation in the software development cycle. Both the structured and object-oriented paradigms are considered. Particular emphasis is placed on the role of the refinement process.
The requirements for the product (system) are determined, the specifications are drawn up, the product is designed, specified, implemented and tested. The stage between the (formal) specification of the system and the implementation of the system is the refinement stage.
The refinement process consists of data refinement, operation refinement, and operation decomposition. In this dissertation, Z, Object-Z and UML (the Unified Modelling Language) are used as specification languages, and C, C++, Cobol and Object-Oriented Cobol are used as implementation languages.
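As a concrete taste of the notation involved, the following is a minimal Z-style state schema with one operation, typeset in LaTeX on the assumption of a Z package such as fuzz or zed-csp; the names are invented for illustration and are not the dissertation's actual schemas.

```latex
% A minimal Z sketch, assuming the fuzz (or zed-csp) LaTeX package.
\begin{zed}
  [ITEM]   % a given set: the space of all items
\end{zed}

% State schema: the system tracks a set of known items.
\begin{schema}{Stock}
  items : \power ITEM
\end{schema}

% Operation schema: adding a previously unknown item.
\begin{schema}{AddItem}
  \Delta Stock \\
  item? : ITEM
\where
  item? \notin items \\
  items' = items \cup \{ item? \}
\end{schema}
```

Data refinement would then replace the abstract set items with a concrete representation (for instance an indexed table in Object-Oriented Cobol), with a retrieve relation used to show that the concrete operation still satisfies AddItem; operation decomposition breaks the refined operation into executable steps.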
As an illustration, a small system, the ITEM System, is specified in Z and UML and implemented in Object-Oriented Cobol. / Computing / M. Sc. (Information Systems)
|
320 |
Coping with evolution in information systems: a database perspective
Lawrence, Gregory, 25 August 2009 (has links)
Business organisations today are faced with the complex problem of dealing with evolution in their software information systems. This effectively concerns the accommodation and facilitation of change, in terms of both changing user requirements and changing technological requirements. An approach that uses the software development life-cycle as a vehicle to study the problem of evolution is adopted. This involves the stages of requirements analysis, system specification, design, implementation, and finally operation and maintenance. The problem of evolution is one requiring proactive as well as reactive solutions for any given application domain. Measuring evolvability in conceptual models and the specification of changing requirements are considered. However, even "best designs" are limited in dealing with unanticipated evolution, and require implementation-phase paradigms that can facilitate an evolution correctly (semantic integrity), efficiently (minimal disruption of services) and consistently (all affected parts are consistent following the change). These are also discussed. / Computing / M. Sc. (Information Systems)
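One implementation-phase pattern that satisfies the three criteria just listed is the expand-contract (parallel change) schema migration: add the new structure alongside the old, backfill atomically, and remove the old structure only once all dependents have moved. The sketch below illustrates it with Python's built-in sqlite3 module; the table and the split of name into first/last name are invented for the example, not taken from the dissertation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customer (name) VALUES ('Ada Lovelace')")

# Expand: add new columns alongside the old one, so existing readers keep
# working (minimal disruption of services).
conn.execute("ALTER TABLE customer ADD COLUMN first_name TEXT")
conn.execute("ALTER TABLE customer ADD COLUMN last_name TEXT")

# Backfill in one transaction so the old and new columns never disagree
# (semantic integrity; all affected parts stay consistent).
with conn:
    rows = conn.execute("SELECT id, name FROM customer").fetchall()
    for row_id, name in rows:
        first, _, last = name.partition(" ")
        conn.execute(
            "UPDATE customer SET first_name = ?, last_name = ? WHERE id = ?",
            (first, last, row_id),
        )

# Contract (later, once all dependents read the new columns):
#   ALTER TABLE customer DROP COLUMN name   -- needs SQLite >= 3.35
print(conn.execute("SELECT first_name, last_name FROM customer").fetchall())
```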
|