11

Materials and Device Engineering for Efficient and Stable Polymer Solar Cells

Hansson, Rickard January 2017 (has links)
Polymer solar cells form a promising technology for converting sunlight into electricity, and have reached record efficiencies over 10% and lifetimes of several years. The performance of polymer solar cells depends strongly on the distribution of electron donor and acceptor materials in the active layer. To achieve longer lifetimes, degradation processes in the materials have to be understood. In this thesis, a set of complementary spectroscopy and microscopy techniques, among them soft X-ray techniques, has been used to determine the morphology of polymer:fullerene-based active layers. We have found that the morphology of TQ1:PC70BM films is strongly influenced by the processing solvent and the use of solvent additives. We have also found, by using soft X-ray techniques, that not only the light-absorbing polymer TQ1, but also the fullerene is susceptible to photo-degradation in air. Moreover, the fullerene degradation is accelerated in the presence of the polymer. Additionally, this thesis addresses the role of the interfacial layers in device performance and stability. The commonly used hole transport material PEDOT:PSS has the advantage of being solution processable at room temperature, but this layer is also known to contribute to device degradation. We have found that low-temperature processed NiOx is a promising alternative to PEDOT:PSS, leading to improved device performance. Even for encapsulated polymer solar cells, some photo-induced degradation of the electrical performance is observed and is found to depend on the nature of the hole transport material. We found a better initial stability for solar cells with MoO3 hole transport layers than with PEDOT:PSS. In the pursuit of understanding the initial decrease in electrical performance of PEDOT:PSS-based devices, simulations were performed, from which a number of degradation sources could be excluded. / With the increasing global demand for energy, solar cells provide a clean method for converting the abundant sunlight to electricity. Polymer solar cells can be made from a large variety of light-harvesting and electrically conducting molecules and are inexpensive to produce. They have additional advantages, like their mechanical flexibility and low weight, which open opportunities for novel applications. In order for polymer solar cells to be more competitive, however, both the power conversion efficiencies and lifetimes need to improve further. One way to achieve this is to optimize the morphology of the active layer. The active layer of a polymer solar cell consists of electron donating and electron accepting molecules whose distribution in the bulk of the film is a major factor that determines the solar cell performance. This thesis presents the use of complementary spectroscopy and microscopy methods to probe the local composition in the active layer of polymer solar cells. The stability of the active layer is studied and the interplay between the photo-degradation of the donor and acceptor molecules is investigated. Additionally, this thesis addresses how the interfacial layers between the active layer and the electrodes can influence device performance and stability. / Note: the publication states an incorrect ISBN, 978-91-7063-739-1.
12

Cloud-based Knowledge Management in Greek SME’s

Dimitrios, Rekleitis January 2018 (has links)
Nowadays, cloud technologies are commonly used by many large organizations to aid knowledge sharing. This brings benefits such as reduced costs, improved security, enhanced content accessibility, and improved efficiency. Small and Medium Enterprises (SMEs), on the other hand, tend to manage their information in a more informal way, without using the specific language or terminology of Knowledge Management (KM). Moreover, SMEs are reluctant to adopt cloud-based techniques for managing information, for reasons discussed later. This thesis sets out to identify the benefits and drawbacks of cloud-based Knowledge Management techniques in Greek SMEs and to find out how knowledge processes are used in Greek SMEs in relation to cloud-based Knowledge Management techniques. Through this work I also identify the benefits and drawbacks of applying cloud-based techniques for managing information and knowledge in SMEs. To accomplish this, I adopted a qualitative methodology. More specifically, I provide an exhaustive literature review and then contacted five SMEs in Greece to explore, using different techniques, whether these SMEs can benefit from cloud-based Knowledge Management techniques and how inclined they are to adopt such techniques in their organization. I found that three of the SMEs use cloud-based techniques for Knowledge Management, while two of them do not; in fact, one of these two does not manage its knowledge at all. However, all five organizations showed great interest in adopting cloud-based and information system technologies for Knowledge Management. In the end, this work arrives at the following findings and insights: cloud-based Knowledge Management techniques can bring considerable benefits in terms of cost savings and performance, provided that they are used correctly and efficiently. Failing to use cloud-based Knowledge Management techniques efficiently may lead to drawbacks such as reduced organizational performance and reduced savings. This thesis also discusses directions for future work, such as analysing a larger set of organizations, carrying out a quantitative analysis, and combining the qualitative and quantitative approaches.
13

Optimizing Lexicon-Based Sentiment Analysis for COVID-19 Twitter: Interactions in Health Contexts

Ramin, Jafari January 2023 (has links)
During the COVID-19 pandemic, the surge in social media usage has elevated interest in sentiment analysis, especially for health-related applications. This bachelor's thesis explores the effectiveness of two lexicon-based sentiment analysis techniques, with a focus on enhancing the accuracy of the Valence Aware Dictionary for Sentiment Reasoning (VADER) algorithm. By assessing 5000 manually labeled COVID-19-related tweets across four dataset versions, we gauge the relative effectiveness of these methods. The focus lies on understanding the role preprocessing techniques play in sentiment analysis and on refining the VADER algorithm. The insights drawn can inform the design of more effective public health policies and communication approaches by capturing public sentiment expressed in tweets more accurately. In health contexts like COVID-19, it is vital to gauge public sentiment, which helps identify and manage psychological distress, anxiety, and fear. Through this sentiment exploration, healthcare providers can offer comprehensive care and improve support systems and mechanisms during global health crises like COVID-19.
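As an illustration of the lexicon-based approach described above, the following is a minimal sketch of scoring tweets with the VADER analyzer from the vaderSentiment package. The example tweets, the preprocessing steps, and the ±0.05 threshold for mapping compound scores to labels are illustrative assumptions, not details taken from the thesis.

```python
# Minimal sketch: lexicon-based sentiment scoring of tweets with VADER.
# Assumes the `vaderSentiment` package is installed (pip install vaderSentiment).
import re
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

def preprocess(tweet: str) -> str:
    """Illustrative preprocessing: strip URLs and user mentions, collapse whitespace."""
    tweet = re.sub(r"https?://\S+", "", tweet)   # remove links
    tweet = re.sub(r"@\w+", "", tweet)           # remove mentions
    return re.sub(r"\s+", " ", tweet).strip()

def label(compound: float) -> str:
    """Map VADER's compound score to a label (the ±0.05 cut-off is a common convention)."""
    if compound >= 0.05:
        return "positive"
    if compound <= -0.05:
        return "negative"
    return "neutral"

analyzer = SentimentIntensityAnalyzer()

# Hypothetical example tweets, not taken from the labeled dataset used in the thesis.
tweets = [
    "Finally got my vaccine appointment, feeling hopeful! https://example.org",
    "@someone another lockdown... this is exhausting and scary",
]

for t in tweets:
    scores = analyzer.polarity_scores(preprocess(t))
    print(label(scores["compound"]), scores)
```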
14

Materials aspects in spin-coated films for polymer photovoltaics

Anselmo, Ana Sofia January 2013 (has links)
Polymer-based photovoltaics have the potential to contribute to boosting photovoltaic energy conversion overall. Besides allowing large-area inexpensive processing, polymeric materials have the added benefit of opening new market applications for photovoltaics due to their low weight and interesting mechanical properties. The energy conversion efficiencies of polymer photovoltaics have reached new record values over the past years. It is, however, crucial that stability issues are addressed together with efficiency optimization. Understanding fundamental materials aspects is key in both areas. In the work presented in this thesis, the morphology of polymer:fullerene films and its influence on device performance was studied, as well as the effect of light exposure on the surface of fullerene films. Several polyfluorene copolymers were used for the morphology studies, where the effects of changing spin-coating solvent and of side chain engineering were investigated with dynamic secondary ion mass spectrometry (dSIMS) and near-edge X-ray absorption fine structure (NEXAFS) spectroscopy. Polymer-enriched surfaces were found in all blend films, even in the cases with homogeneous distributions in the bulk. Side chain engineering of the polymer led to gradual changes in the compositional variations perpendicular to the surface, and to slight variations in the photocurrent. The electronic structure of the fullerene derivative PCBM was studied in detail and the spectroscopic fingerprint of the materials was analysed by comparison with theoretically simulated spectra. Photo-stability studies done in air showed that the surface of fullerene films underwent severe damage at the molecular level, which is evident from changes in the valence band and X-ray absorption spectra. These changes were explained by transitions from sp2-type to sp3 hybridization of the carbon atoms, which resulted in the destruction of the fullerene cage.
15

Spatial analysis and geographic information systems: the provision of urban facilities from the perspective of the urban environment and the sustainable development of cities

Vidigal, Ana January 2001 (has links)
No description available.
16

Uncertainty visualization of ensemble simulations

Sanyal, Jibonananda 09 December 2011 (has links)
Ensemble simulation is a commonly used technique in operational forecasting of weather and floods. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists and hydrologists are interested in understanding the uncertainties associated with the simulation, specifically the variability between the ensemble members. The visualization of ensemble members is currently accomplished through spaghetti plots or hydrographs. To improve visualization techniques and tools for forecasters, we conducted a user study to evaluate the effectiveness of existing uncertainty visualization techniques on 1D and 2D synthetic datasets. We designed an uncertainty evaluation framework to enable easier design of such studies for scientific visualization. The techniques evaluated are error bars, scaled size of glyphs, color-mapping on glyphs, and color-mapping of uncertainty on the data surface. Although we did not find a consistent order among the four techniques for all tasks, we found that the efficiency of the techniques depended highly on the tasks being performed. Error bars consistently underperformed throughout the experiment. Scaling the size of glyphs and color-mapping of the surface performed reasonably well. With results from the user study, we iteratively developed a tool named ‘Noodles’ to interactively explore the ensemble uncertainty in weather simulations. Uncertainty was quantified using standard deviation, inter-quartile range, width of the 95% confidence interval, and by bootstrapping the data. A coordinated view of ribbon- and glyph-based uncertainty visualization, spaghetti plots, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers and avoiding the parametrizations leading to these outliers. Additionally, they could identify spatial regions with high uncertainty, thereby determining poorly simulated storm environments and deriving physical interpretations of these model issues. We also describe uncertainty visualization capabilities developed for a tool named ‘FloodViz’ for visualization and analysis of flood simulation ensembles. Simple member and trend plots and composited inundation maps with uncertainty are described, along with different types of glyph-based uncertainty representations. We also provide feedback from a hydrologist using various features of the tool from an operational perspective.
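As a rough illustration of the per-grid-point uncertainty measures mentioned above (standard deviation, inter-quartile range, width of the 95% confidence interval, and a bootstrap estimate), here is a minimal NumPy sketch over a synthetic ensemble. The array shapes, member count, and bootstrap settings are assumptions for illustration only, not details of the Noodles or FloodViz tools.

```python
# Minimal sketch: summarizing spread across ensemble members at each grid point.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical ensemble: 20 members on a 50x50 grid (members axis first).
ensemble = rng.normal(loc=280.0, scale=2.0, size=(20, 50, 50))

std_dev = ensemble.std(axis=0)                                # standard deviation
q25, q75 = np.percentile(ensemble, [25, 75], axis=0)
iqr = q75 - q25                                               # inter-quartile range
lo, hi = np.percentile(ensemble, [2.5, 97.5], axis=0)
ci95_width = hi - lo                                          # width of the 95% interval

# Bootstrap estimate of the uncertainty of the ensemble mean (resample members).
n_boot = 1000
idx = rng.integers(0, ensemble.shape[0], size=(n_boot, ensemble.shape[0]))
boot_means = ensemble[idx].mean(axis=1)                       # shape (n_boot, 50, 50)
boot_spread = boot_means.std(axis=0)

print(std_dev.mean(), iqr.mean(), ci95_width.mean(), boot_spread.mean())
```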
17

Proactive university library book recommender system

Mekonnen, Tadesse Zewdu January 2021 (has links)
M. Tech. (Department of Information Communication Technology, Faculty of Applied and Computer Sciences), Vaal University of Technology. / The overwhelming number of options on the internet is the reason for the information overload problem encountered when trying to obtain relevant information. A recommender system is a technique that filters information from large sets of data and recommends the most relevant items based on people's preferences. Collaborative and content-based techniques are the core techniques used to implement a recommender system. The combined use of collaborative and content-based techniques, called a hybrid technique, provides relatively good recommendations by avoiding the common problems arising from each technique. In this research, a proactive University Library Book Recommender System is proposed in which hybrid filtering is used for enhanced and more accurate recommendations. The prototype designed was able to recommend the top ten books for each user. We evaluated the accuracy of the results using Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The MAE of 0.84904 and RMSE of 0.9579 achieved by our system show that the combined use of both techniques gives improved prediction accuracy for the University Library Book Recommender System.
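To make the reported error measures concrete, the following is a minimal sketch of how MAE and RMSE are computed from predicted and actual ratings. The rating values shown are invented for illustration and are not the data behind the 0.84904 MAE and 0.9579 RMSE reported above.

```python
# Minimal sketch: computing MAE and RMSE for rating predictions.
import numpy as np

# Hypothetical held-out ratings (actual) and the recommender's predictions.
actual    = np.array([4.0, 3.0, 5.0, 2.0, 4.0])
predicted = np.array([3.5, 3.4, 4.2, 2.8, 4.1])

errors = predicted - actual
mae  = np.mean(np.abs(errors))          # Mean Absolute Error
rmse = np.sqrt(np.mean(errors ** 2))    # Root Mean Squared Error

print(f"MAE={mae:.4f}  RMSE={rmse:.4f}")
```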
18

Automatic Burns Analysis Using Machine Learning

Abubakar, Aliyu January 2022 (has links)
Burn injuries are a significant global health concern, causing high mortality and morbidity rates. Clinical assessment is the current standard for diagnosing burn injuries, but it suffers from inter-observer variability and is not suitable for intermediate burn depths. To address these challenges, this thesis proposes machine learning-based techniques to evaluate burn wounds. The study utilized image-based networks to analyze two medical image databases of burn injuries from Caucasian and Black-African cohorts. A deep learning-based model, called BurnsNet, was developed and used for real-time processing, achieving high accuracy in discriminating between different burn depths and pressure ulcer wounds. A multiracial data representation approach was also used to address data representation bias in burn analysis, resulting in promising performance. The machine learning approach proved its objectivity and cost-effectiveness in assessing burn depths, providing an effective adjunct to clinical assessment. The study's findings suggest that machine learning-based techniques can reduce the workflow burden for burn surgeons and significantly reduce errors in burn diagnosis. They also highlight the potential of automation to improve burn care and enhance patients' quality of life. / Petroleum Technology Development Fund (PTDF); Gombe State University study fellowship
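The abstract does not describe BurnsNet's architecture; as a loose illustration of the general approach (an image classifier over burn-wound photographs), here is a minimal transfer-learning sketch in PyTorch. The three depth classes, the ResNet-18 backbone, and the training details are illustrative assumptions only, not the thesis's model.

```python
# Minimal sketch: a transfer-learning classifier for burn-depth images (illustrative only;
# this is NOT the BurnsNet architecture, which the abstract does not specify).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # assumed classes, e.g. superficial / partial-thickness / full-thickness

# Pretrained ResNet-18 backbone (torchvision >= 0.13) with a new classification head.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a random batch (stand-in for real wound images).
images = torch.randn(8, 3, 224, 224)          # batch of 8 RGB images
labels = torch.randint(0, NUM_CLASSES, (8,))  # hypothetical depth labels

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```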
19

Novel chemometric proposals for advanced multivariate data analysis, processing and interpretation

Vitale, Raffaele 03 November 2017 (has links)
The present Ph.D. thesis, primarily conceived to support and reinforce the relation between academic and industrial worlds, was developed in collaboration with Shell Global Solutions (Amsterdam, The Netherlands) in the endeavour of applying and possibly extending well-established latent variable-based approaches (i.e. Principal Component Analysis - PCA - Partial Least Squares regression - PLS - or Partial Least Squares Discriminant Analysis - PLSDA) for complex problem solving not only in the fields of manufacturing troubleshooting and optimisation, but also in the wider environment of multivariate data analysis. To this end, novel efficient algorithmic solutions are proposed throughout all chapters to address very disparate tasks, from calibration transfer in spectroscopy to real-time modelling of streaming flows of data. The manuscript is divided into the following six parts, focused on various topics of interest: Part I - Preface, where an overview of this research work, its main aims and justification is given together with a brief introduction on PCA, PLS and PLSDA; Part II - On kernel-based extensions of PCA, PLS and PLSDA, where the potential of kernel techniques, possibly coupled to specific variants of the recently rediscovered pseudo-sample projection, formulated by the English statistician John C. Gower, is explored and their performance compared to that of more classical methodologies in four different applications scenarios: segmentation of Red-Green-Blue (RGB) images, discrimination of on-/off-specification batch runs, monitoring of batch processes and analysis of mixture designs of experiments; Part III - On the selection of the number of factors in PCA by permutation testing, where an extensive guideline on how to accomplish the selection of PCA components by permutation testing is provided through the comprehensive illustration of an original algorithmic procedure implemented for such a purpose; Part IV - On modelling common and distinctive sources of variability in multi-set data analysis, where several practical aspects of two-block common and distinctive component analysis (carried out by methods like Simultaneous Component Analysis - SCA - DIStinctive and COmmon Simultaneous Component Analysis - DISCO-SCA - Adapted Generalised Singular Value Decomposition - Adapted GSVD - ECO-POWER, Canonical Correlation Analysis - CCA - and 2-block Orthogonal Projections to Latent Structures - O2PLS) are discussed, a new computational strategy for determining the number of common factors underlying two data matrices sharing the same row- or column-dimension is described, and two innovative approaches for calibration transfer between near-infrared spectrometers are presented; Part V - On the on-the-fly processing and modelling of continuous high-dimensional data streams, where a novel software system for rational handling of multi-channel measurements recorded in real time, the On-The-Fly Processing (OTFP) tool, is designed; Part VI - Epilogue, where final conclusions are drawn, future perspectives are delineated, and annexes are included. 
/ Vitale, R. (2017). Novel chemometric proposals for advanced multivariate data analysis, processing and interpretation [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/90442
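Part III of the thesis concerns choosing the number of PCA components by permutation testing. As a rough, generic illustration of that idea (not the thesis's specific algorithm), the sketch below compares the variance explained by each successive component against a null distribution obtained by permuting each column of the data independently; the synthetic data, number of permutations, and 0.05 threshold are assumptions for illustration.

```python
# Minimal sketch: choosing the number of PCA components by permutation testing.
# Generic illustration only; the thesis describes its own, more elaborate procedure.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)

# Hypothetical data: 100 samples x 20 variables with 2 real latent factors plus noise.
scores = rng.normal(size=(100, 2))
loadings = rng.normal(size=(2, 20))
X = scores @ loadings + 0.5 * rng.normal(size=(100, 20))

def explained_variance(data, n_components):
    return PCA(n_components=n_components).fit(data).explained_variance_ratio_

observed = explained_variance(X, 10)

# Null distribution: permute each column independently to break between-variable structure.
n_perm = 200
null = np.empty((n_perm, 10))
for p in range(n_perm):
    Xp = np.column_stack([rng.permutation(X[:, j]) for j in range(X.shape[1])])
    null[p] = explained_variance(Xp, 10)

# Retain components whose observed explained variance exceeds the permutation null
# (p-value below 0.05 here, an illustrative threshold).
p_values = (null >= observed).mean(axis=0)
n_selected = int(np.argmax(p_values >= 0.05)) if (p_values >= 0.05).any() else 10
print("components retained:", n_selected, "p-values:", np.round(p_values, 3))
```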
20

Measuring Semantic Distance using Distributional Profiles of Concepts

Mohammad, Saif 01 August 2008 (has links)
Semantic distance is a measure of how close or distant in meaning two units of language are. A large number of important natural language problems, including machine translation and word sense disambiguation, can be viewed as semantic distance problems. The two dominant approaches to estimating semantic distance are the WordNet-based semantic measures and the corpus-based distributional measures. In this thesis, I compare them, both qualitatively and quantitatively, and identify the limitations of each. This thesis argues that estimating semantic distance is essentially a property of concepts (rather than words) and that two concepts are semantically close if they occur in similar contexts. Instead of identifying the co-occurrence (distributional) profiles of words (distributional hypothesis), I argue that distributional profiles of concepts (DPCs) can be used to infer the semantic properties of concepts and indeed to estimate semantic distance more accurately. I propose a new hybrid approach to calculating semantic distance that combines corpus statistics and a published thesaurus (Macquarie Thesaurus). The algorithm determines estimates of the DPCs using the categories in the thesaurus as very coarse concepts and, notably, without requiring any sense-annotated data. Even though the use of only about 1000 concepts to represent the vocabulary of a language seems drastic, I show that the method achieves results better than the state-of-the-art in a number of natural language tasks. I show how cross-lingual DPCs can be created by combining text in one language with a thesaurus from another. Using these cross-lingual DPCs, we can solve problems in one, possibly resource-poor, language using a knowledge source from another, possibly resource-rich, language. I show that the approach is also useful in tasks that inherently involve two or more languages, such as machine translation and multilingual text summarization. The proposed approach is computationally inexpensive, it can estimate both semantic relatedness and semantic similarity, and it can be applied to all parts of speech. Extensive experiments on ranking word pairs as per semantic distance, real-word spelling correction, solving Reader's Digest word choice problems, determining word sense dominance, word sense disambiguation, and word translation show that the new approach is markedly superior to previous ones.
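As a simplified illustration of estimating semantic distance from distributional profiles of concepts, the sketch below builds concept-word co-occurrence counts from a toy corpus and compares concepts by cosine similarity. The tiny corpus, the hand-assigned categories, and the sentence-level co-occurrence counting are all illustrative assumptions, far simpler than the Macquarie-Thesaurus-based method described above.

```python
# Minimal sketch: distributional profiles of concepts (DPCs) and cosine-based closeness.
# Toy illustration only; the thesis uses roughly 1000 Macquarie Thesaurus categories as concepts.
from collections import Counter
import math

# Hypothetical coarse concepts and words mapped to them (stand-in for thesaurus categories).
concept_of = {
    "bank": "FINANCE", "money": "FINANCE", "loan": "FINANCE",
    "river": "NATURE", "water": "NATURE", "shore": "NATURE",
}

corpus = [
    "the bank approved the loan and the money transfer",
    "the river water reached the shore near the bank",
    "money was deposited at the bank after the loan",
]

# Build a co-occurrence profile per concept: other words appearing in the same sentence.
profiles = {c: Counter() for c in set(concept_of.values())}
for sentence in corpus:
    words = sentence.split()
    for w in words:
        c = concept_of.get(w)
        if c is not None:
            profiles[c].update(x for x in words if x != w)

def cosine(p, q):
    """Cosine similarity between two co-occurrence profiles (higher = semantically closer)."""
    dot = sum(p[w] * q[w] for w in p)
    norm = math.sqrt(sum(v * v for v in p.values())) * math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

print("FINANCE vs NATURE:", round(cosine(profiles["FINANCE"], profiles["NATURE"]), 3))
```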
