  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

El rol del Estado en el nuevo régimen previsional en garantías, control y beneficios mínimos / The role of the State in the new pension regime: guarantees, oversight and minimum benefits

Raicevich Flores, Carlos Nicolás January 2018
Thesis (Licentiate in Legal and Social Sciences)
82

Fusing DL Reasoning with HTN Planning as a Deliberative Layer in Mobile Robotics

Hartanto, Ronny 08 March 2010
Action planning has been used in the field of robotics for solving long-running tasks; in robot architectures it is also known as the deliberative layer. However, there is still a gap between the symbolic representation on the one hand and the low-level control and sensor representation on the other. In addition, defining a planning problem for a complex, real-world robot is not trivial, and the planning process can become intractable as its search space grows. Since the defined planning problem determines the complexity and tractability of solving it, it should contain only relevant states. In this work, a novel approach which amalgamates Description Logic (DL) reasoning with Hierarchical Task Network (HTN) planning is introduced. The planning domain description as well as fundamental HTN planning concepts are represented in DL and can therefore be subject to DL reasoning; from these representations, concise planning problems are generated for HTN planning. The method is presented through an example in the robot navigation domain, and a case study of the RoboCup@Home domain is given. As proof of concept, a well-known benchmark planning problem, the blocks world, is modeled and solved using this approach. A performance analysis shows that this approach yields significantly smaller planning problem descriptions than those generated by current representations in HTN planning.
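As a rough illustration of the idea of using DL reasoning to prune an HTN planning problem, the following sketch uses a toy concept hierarchy in place of a real DL reasoner. All names and the filtering rule are illustrative assumptions, not Hartanto's actual representation:

```python
# Toy concept hierarchy (child -> parent), standing in for a DL TBox.
TBOX = {
    "Kitchen": "Room", "LivingRoom": "Room",
    "Room": "Location", "Hallway": "Location",
    "Cup": "Object", "Robot": "Agent",
}

def subsumed_by(concept, ancestor):
    """True if `concept` equals `ancestor` or is a descendant of it."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = TBOX.get(concept)
    return False

def relevant_states(world, task_concepts):
    """Keep only individuals whose concept is subsumed by a concept the
    task declares relevant -- this is what shrinks the HTN problem."""
    return {name: c for name, c in world.items()
            if any(subsumed_by(c, t) for t in task_concepts)}

world = {"kitchen1": "Kitchen", "hall1": "Hallway",
         "cup1": "Cup", "sofa1": "Furniture"}
# A navigation task only cares about Locations and Agents, so the cup
# and the sofa never enter the generated planning problem.
print(relevant_states(world, {"Location", "Agent"}))
```

The planner then receives only the surviving individuals, which is the mechanism behind the smaller problem descriptions reported in the abstract.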
83

Deep Learning Based Detection, Quantification, and Subdivision of White Matter Hyperintensities in Brain MRI

Fryckstedt, Inna January 2023
White matter hyperintensities (WMH) commonly appear as bright regions in brain MRI images of older individuals. They are associated with various neurological and vascular diseases, such as stroke, dementia, and cardiovascular disorders. WMH is also one of the seven radiological parameters included in the idiopathic normal pressure hydrocephalus (iNPH) Radscale, used to grade the radiological signs of normal pressure hydrocephalus. The radiological patterns are highly heterogeneous, making quantification and classification of WMHs complex. In this project, an automated method for quantitative and qualitative assessment of white matter hyperintensities was developed based on the deep learning architecture nnU-Net. Different configurations of the nnU-Net network were trained on a publicly available dataset from the 2017 Medical Image Computing and Computer Assisted Intervention (MICCAI) WMH segmentation challenge, comprising different grades of WMHs, and were ensembled to produce the final output of the segmentation model. Based on the segmentation result, a volumetric assessment was performed using the Fazekas scale, which can then be used as one of the essential radiological parameters in the iNPH Radscale. Furthermore, the pipeline subdivides and classifies the hyperintense regions based on spatial information and T1-signal intensity, which is believed to have a significant impact on the pathology of the WMHs. The final pipeline accurately segments WMHs from T1-weighted and FLAIR MRI images with a Dice similarity coefficient of 0.81, quantitatively classifies each case according to the Fazekas scale and further subdivides each hyperintense voxel based on its location in the brain and its intensity in the T1-weighted image. Hopefully, this can serve as a meaningful tool in the diagnosis of iNPH as well as in future research aiming to fully understand the clinical implications of different types of WMHs.
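For readers unfamiliar with the metrics involved, a brief sketch of the Dice similarity coefficient used to evaluate the segmentation, together with a volume-based grading helper. The volume cutoffs in `fazekas_from_volume` are an illustrative assumption, not thresholds from the thesis:

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0

def fazekas_from_volume(volume_ml):
    """Map total WMH volume (ml) to a 0-3 grade. The cutoff values are
    an illustrative assumption, not taken from the thesis."""
    for grade, cutoff in enumerate((1.0, 5.0, 15.0)):
        if volume_ml < cutoff:
            return grade
    return 3

pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
truth = np.array([[1, 0, 0],
                  [0, 1, 1]])
print(round(dice(pred, truth), 3))  # -> 0.667
print(fazekas_from_volume(8.2))     # -> 2
```

A Dice score of 0.81, as reported above, means the predicted and reference lesion masks overlap substantially relative to their combined size.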
84

What Are Radiologists' Perceptions in Regard to Image Quality and Increased Utilization Due to Vendor Provided Deep Learning Signal to Noise Ratio and Deep Learning Reconstruction on 3.0T Magnetic Resonance Imaging?

Venturi, Gianni 02 August 2023
No description available.
85

Machine Learning Approaches to Develop Weather Normalized Models for Urban Air Quality

Ngoc Phuong, Chau January 2024
According to the World Health Organization, almost the entire human population (99%) lives in 117 countries with over 6000 cities where air pollutant concentrations exceed recommended thresholds. The most common ("criteria") air pollutants affecting human health are particulate matter (PM) and gas-phase pollutants (SO2, CO, NO2, O3 and others). Many countries and regions worldwide have therefore imposed regulations or interventions to reduce these effects. Whenever an intervention occurs, air quality changes alongside ambient factors such as weather conditions and human activities. One approach for assessing the effects of interventions or events on air quality is the use of a Weather Normalized Model (WNM). However, current deterministic models struggle to accurately capture the complex, non-linear relationship between pollutant concentrations and their emission sources. Hence, the primary objective of this thesis is to examine the power of machine learning (ML) and deep learning (DL) techniques to develop and improve WNMs. These enhanced WNMs are then employed to assess the impact of events on air quality. Furthermore, the ML/DL-based WNMs can serve as valuable tools for exploratory data analysis (EDA), uncovering the correlations between the independent variables (meteorological and temporal features) and air pollutant concentrations within the models. DL techniques have demonstrated efficiency and high performance in fields such as natural language processing, image processing, biology, and environmental science. Therefore, several appropriate DL architectures (Long Short-Term Memory - LSTM, Recurrent Neural Network - RNN, Bidirectional Recurrent Neural Network - BiRNN, Convolutional Neural Network - CNN, and Gated Recurrent Unit - GRU) were tested to develop the WNMs presented in Paper I.
When comparing these DL architectures with the Gradient Boosting Machine (GBM), LSTM-based methods (LSTM, BiRNN) obtained superior results in developing WNMs. The study also showed that our DL-based WNMs can capture the correlations between the input variables (meteorological and temporal variables) and five criteria contaminants (SO2, CO, NO2, O3 and PM2.5); the SHapley Additive exPlanations (SHAP) library allowed us to identify the most significant factors in the DL-based WNMs. Additionally, these WNMs were used to assess the air quality changes during COVID-19 lockdown periods in Ecuador. Existing normalized models operate on the original units of pollutants and are designed to assess pollutant concentrations under "average" or consistent weather conditions. Predicting pollution peaks presents an even greater challenge because they often lack discernible patterns. To address this, we enhanced the WNMs to boost their performance specifically during daily concentration peak conditions. In the second paper, we accomplished this by developing supervised learning techniques, including ensemble deep learning methods, to distinguish between daily peak and non-peak pollutant concentrations. This approach offers flexibility in categorizing pollutant concentrations as either daily peaks or non-peaks; however, it may introduce bias when selecting non-peak values. In the third paper, WNMs are applied directly to daily concentration peaks to predict and analyse the correlations between meteorological and temporal features and the daily concentration peaks of air pollutants.
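The core weather-normalization idea — model the weather-explained variation, then remove it so residual changes reflect interventions rather than meteorology — can be sketched on synthetic data. A linear least-squares fit stands in for the thesis's LSTM/GBM models, and all numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic record: a pollutant driven by wind speed, plus a hidden
# 5-unit step change (an "intervention") halfway through the series.
n = 400
wind = rng.uniform(1, 10, n)
step = np.where(np.arange(n) < n // 2, 0.0, -5.0)
conc = 30.0 - 2.0 * wind + step + rng.normal(0, 0.5, n)

# Fit concentration ~ weather with ordinary least squares.
X = np.column_stack([np.ones(n), wind])
coef, *_ = np.linalg.lstsq(X, conc, rcond=None)

# "Deweather" the series: subtract the weather-explained variation so
# the residual trend reflects emissions/interventions, not meteorology.
deweathered = conc - coef[1] * (wind - wind.mean())

before = deweathered[: n // 2].mean()
after = deweathered[n // 2 :].mean()
print(round(before - after, 1))  # recovers roughly the 5-unit step
```

In the raw series the step is buried under wind-driven swings of up to 18 units; after normalization it stands out directly, which is how the thesis assesses the COVID-19 lockdown effects.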
86

Hardware Implementation of Learning-Based Camera ISP for Low-Light Applications

Preston Rashad Rahim (17676693) 20 December 2023
<p dir="ltr">A camera's image signal processor (ISP) is responsible for taking the mosaiced and noisy image signal from the image sensor and processing it in such a way that the resulting image is informative and accurately captures the scene. Real-time video capture in photon-limited environments remains a challenge for many ISPs today: in these conditions, the image signal is dominated by photon shot noise. Deep learning methods show promise in extracting the underlying image signal from the noise, but modern AI-based ISPs are too computationally complex to be realized as fast and efficient hardware ISPs. An ISP algorithm, BLADE2, has been designed that leverages AI in a computationally conservative manner to demosaic and denoise low-light images. The original implementation of this algorithm is in Python/PyTorch. This thesis explores porting BLADE2 to a general-purpose GPU via a suite of Nvidia optimization toolkits, as well as a low-level implementation in C/C++, bringing the algorithm closer to FPGA realization. The GPU implementation demonstrated significant throughput gains, and the C/C++ implementation demonstrated the feasibility of further hardware development.</p>
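The photon-shot-noise regime mentioned above is easy to demonstrate: for a Poisson-distributed photon count, the SNR grows only as the square root of the signal, so low-light pixels are inherently noisy. A short simulation, independent of BLADE2 itself:

```python
import numpy as np

rng = np.random.default_rng(1)

def snr_db(mean_photons, n=100_000):
    """SNR (dB) of a pixel receiving `mean_photons` on average, with
    photon shot noise as the only noise source."""
    samples = rng.poisson(mean_photons, n).astype(float)
    return 20.0 * np.log10(samples.mean() / samples.std())

# SNR rises only ~10 dB per decade of light: 1 photon/pixel is ~0 dB,
# 1000 photons/pixel is ~30 dB.
for level in (1, 10, 100, 1000):
    print(level, round(snr_db(level), 1))
```

This is why a denoiser must recover structure from a signal whose noise floor cannot be engineered away, and why the thesis targets hardware budgets rather than ever-larger networks.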
87

Deep Image Processing with Spatial Adaptation and Boosted Efficiency & Supervision for Accurate Human Keypoint Detection and Movement Dynamics Tracking

Chao Yang Dai (14709547) 31 May 2023
<p>This thesis aims to design and develop a spatial adaptation approach based on spatial transformers to improve the accuracy of human keypoint recognition models. We studied different model types and design choices to gain an accuracy increase over models without spatial transformers, and analyzed how spatial transformers improve prediction accuracy. A neural network called Widenet was leveraged as a specialized network for providing the parameters of the spatial transformer. Further, we evaluated methods to reduce the model parameters, as well as strategies to enhance the learning supervision, to further improve the performance of the model. Our experiments show that the proposed deep learning framework detects human keypoints more effectively than the baseline methods. We also reduced the model size without significantly impacting performance, and the enhanced supervision improved performance further. This study is expected to advance deep learning methods for human keypoint detection and movement dynamics tracking.</p>
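What a spatial transformer contributes can be illustrated with a toy affine resampling step: a small set of parameters (hand-chosen here; in the thesis they are predicted by Widenet) defines a map through which the input is resampled, so the downstream network sees a canonicalized view. Nearest-neighbour sampling and the example image are illustrative stand-ins for the differentiable bilinear sampler used in practice:

```python
import numpy as np

def affine_sample(img, theta):
    """Nearest-neighbour resampling of `img` through the 2x3 affine map
    `theta`, in pixel coordinates (output pixel -> source pixel)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            sx = theta[0, 0] * x + theta[0, 1] * y + theta[0, 2]
            sy = theta[1, 0] * x + theta[1, 1] * y + theta[1, 2]
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = img[sy, sx]
    return out

img = np.arange(16).reshape(4, 4)
# Each output pixel reads from one column to its left: the image is
# shifted one pixel to the right, with zeros filling the exposed edge.
shift_right = np.array([[1.0, 0.0, -1.0],
                        [0.0, 1.0,  0.0]])
print(affine_sample(img, shift_right))
```

In a real spatial transformer this sampling is differentiable, so gradients flow back into the network that predicts `theta`.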
88

Convolutional Neural Network Optimization Using Genetic Algorithms

Reiling, Anthony J. January 2017
No description available.
89

Algorithmes pour la factorisation d'entiers et le calcul de logarithme discret / Algorithms for integer factorization and discrete logarithms computation

Bouvier, Cyril 22 June 2015
In this thesis, we study the problems of integer factorization and discrete logarithm computation in finite fields. First, we study the ECM algorithm for integer factorization and present a method to analyze the elliptic curves used in this algorithm by studying the Galois properties of division polynomials. Then, we present in detail the NFS algorithm for integer factorization, studying in particular the polynomial selection step, for which we propose improvements to existing algorithms. Next, we present two algorithms for computing discrete logarithms in finite fields: NFS-DL and FFS. We also give details of two discrete logarithm computations carried out during this thesis, one with NFS-DL and the other with FFS. Finally, we study a step common to the NFS algorithm for integer factorization and to the NFS-DL and FFS algorithms for discrete logarithm computation: the filtering step. We study this step thoroughly and present an improvement, whose impact we assess using data from several factorization and discrete logarithm computations.
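The filtering step mentioned above removes, among other things, "singleton" relations: a relation containing a prime ideal that appears in no other relation can never take part in a dependency, so it is deleted, and deletions can expose new singletons. A minimal, dependency-free sketch of that pass, with invented toy data (real NFS filtering works on vastly larger, sparser sets and also performs clique removal and merges):

```python
def remove_singletons(relations):
    """Iteratively delete relations containing an ideal that occurs in
    only one relation, until no singletons remain."""
    relations = [set(r) for r in relations]
    while True:
        counts = {}
        for rel in relations:
            for ideal in rel:
                counts[ideal] = counts.get(ideal, 0) + 1
        singletons = {i for i, c in counts.items() if c == 1}
        if not singletons:
            return relations
        relations = [r for r in relations if not (r & singletons)]

# Ideal 7 occurs once, so {5, 7} is removed; no new singletons appear.
rels = [{2, 3}, {3, 5}, {5, 7}, {2, 5}]
print(remove_singletons(rels))  # -> [{2, 3}, {3, 5}, {2, 5}]
```

Shrinking the relation set this way reduces the size of the matrix handed to the subsequent linear algebra step, which is what the improvement studied in the thesis targets.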
90

Clinical Decision Support Systems for Palliative Care Referral: Design and Evaluation of Frailty and Mortality Predictive Models

Blanes Selva, Vicent 30 December 2022
Palliative Care (PC) is specialized medical care that aims to improve the quality of life of patients with serious illnesses. Historically, it has been applied to terminally ill patients, especially those with oncologic diagnoses. However, current research suggests that PC positively affects the quality of life of patients with a range of conditions, and the current trend is to include non-oncological patients with conditions such as Chronic Obstructive Pulmonary Disease (COPD), organ failure or dementia. Identifying the patients with those needs is complex, however, so alternative tools based on clinical data are required. The growing demand for PC may benefit from a screening tool that identifies patients with PC needs during hospital admission. Several tools, such as the Surprise Question (SQ) or various indexes and scores, have been proposed with varying degrees of success. Recently, artificial intelligence algorithms, specifically Machine Learning (ML), have arisen as a potential solution, given their capacity to learn from Electronic Health Records (EHRs) and the expectation that they can provide accurate predictions for admission to PC programs. This thesis focuses on creating ML-based digital tools for identifying patients with palliative care needs at hospital admission. We used mortality and frailty as the two clinical criteria for decision-making, with short survival and increased frailty as our prediction targets. We also focused on implementing these tools in clinical settings and studying their usability and acceptance in clinical workflows.
To accomplish these objectives, we first studied and compared ML algorithms for predicting one-year survival in adult patients at hospital admission. To do so, we defined a binary target variable equivalent to the SQ, and a set of predictive variables based on the literature. We compared models based on Support Vector Machine (SVM), k-Nearest Neighbours (kNN), Random Forest (RF), Gradient Boosting Machine (GBM) and Multilayer Perceptron (MLP), attending to their performance, especially the Area Under the ROC Curve (AUC ROC). Additionally, we obtained information on the importance of variables for the tree-based models using the GINI criterion. Second, we studied the measurement of frailty and Quality of Life (QoL) in candidates for PC intervention. For this second study, we narrowed the target population to elderly patients (≥ 65 years). We then created three different models: 1) an adaptation of the one-year mortality model for elderly patients, 2) a regression model estimating the number of days from admission to death, to complement the results of the first model, and 3) a predictive model for frailty status at one year. These models were shared with the academic community through a web application that allows data input and shows the predictions of the three models, along with graphs of variable importance. Third, we proposed a version of the one-year mortality model in the form of an online calculator. This version was designed to maximize access for professionals by minimizing data requirements and making the software responsive on current technological platforms. We therefore eliminated administrative variables specific to the dataset source and worked on a process to minimize the required input variables while keeping the model's AUC ROC high. As a result, this model retained most of the predictive power and required only seven bedside inputs.
Finally, we evaluated the Clinical Decision Support System (CDSS) web tool on PC with an actual set of users. This evaluation comprised three domains: evaluation of participants' predictions against the ML baseline, the usability of the graphical interface, and user experience measurement. A first evaluation was performed, followed by a period of implementation of improvements and corrections to the platform. / Blanes Selva, V. (2022). Clinical Decision Support Systems for Palliative Care Referral: Design and Evaluation of Frailty and Mortality Predictive Models [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/190993
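Since AUC ROC is the headline metric throughout this model comparison, here is a small, dependency-free sketch of what it measures: the probability that a randomly chosen positive case (e.g. deceased within one year) is scored above a randomly chosen negative one, with ties counted as half. The labels and scores below are invented for illustration:

```python
def auc_roc(labels, scores):
    """AUC ROC as the pairwise ranking probability P(score+ > score-)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 0, 1, 0, 1, 0]
scores = [0.9, 0.2, 0.7, 0.4, 0.3, 0.1]
# 8 of the 9 positive/negative pairs are ranked correctly.
print(round(auc_roc(labels, scores), 3))  # -> 0.889
```

An AUC of 0.5 means the model ranks no better than chance, and 1.0 means every positive is scored above every negative, which is why the thesis uses it to compare classifiers independently of any decision threshold.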
