About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Desenvolvimento de algoritmos para análise e modelagem variográfica / Development of algorithms for variogram analysis and modeling

Drumond, David Alvarenga, January 2016
Spatial continuity analysis comprises a set of tools for estimating and modeling the continuity of regionalized random variables, and it underpins most geostatistical methods of mineral deposit evaluation. The fitted variogram model is of paramount importance, since it influences the results of the kriging and simulation algorithms that follow. Both academic and commercial software packages can be improved in their graphics, their user interactivity, and their automated tools for modeling spatial continuity. SGeMS (Stanford Geostatistical Modeling Software) is a free program widely used in the geostatistical community with great potential for development, but it does not yet incorporate a full set of spatial continuity analysis tools. GSLIB, by contrast, is a good and more complete free library for geostatistical analysis, but its programs are driven by editing .txt parameter files and running command lines, which makes the software unfriendly to users despite the robustness and quality of its routines. Given the limitations of these two most widely used and complete free geostatistical packages, this dissertation transcribes and adapts the GSLIB algorithm GamV.f into SGeMS, reworking its programming logic to create auxiliary tools such as h-scatterplots and variogram and covariogram maps. The results show that adapting old, stable algorithms leads to an inexpensive solution. In addition, an algorithm for optimizing variogram model fitting by weighted least squares was developed. The routines were written in both C++ and Python, and the algorithms were validated against values produced by GSLIB. All of the developed plug-ins were tested and validated on two illustrative case studies, an iron ore deposit and a polymetallic deposit, and the results proved consistent with those obtained from well-known commercial software.
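The two core ideas in this record, an experimental variogram routine in the spirit of GSLIB's GamV and a weighted-least-squares fit of a variogram model, can be illustrated with a short sketch. The Python fragment below is only a minimal illustration on synthetic one-dimensional data, not the SGeMS plug-in described in the dissertation; the lag parameters and the choice of a spherical model are assumptions made here.

```python
# Minimal sketch (not the thesis plug-in): experimental semivariogram on a 1-D
# transect and a weighted-least-squares fit of a spherical model, on synthetic data.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.arange(200.0)                       # sample coordinates along a transect
z = np.cumsum(rng.normal(size=x.size))     # spatially correlated toy variable

def experimental_variogram(x, z, lag_width, n_lags):
    """Average 0.5*(z_i - z_j)^2 over pairs whose separation falls in each lag bin."""
    d = np.abs(x[:, None] - x[None, :])            # pairwise distances
    g = 0.5 * (z[:, None] - z[None, :]) ** 2       # pairwise semivariances
    iu = np.triu_indices(x.size, k=1)              # count each pair once
    d, g = d[iu], g[iu]
    lags, gammas, npairs = [], [], []
    for k in range(1, n_lags + 1):
        mask = (d > (k - 1) * lag_width) & (d <= k * lag_width)
        if mask.any():
            lags.append(d[mask].mean())
            gammas.append(g[mask].mean())
            npairs.append(mask.sum())
    return np.array(lags), np.array(gammas), np.array(npairs)

def spherical(h, nugget, psill, a):
    """Spherical variogram model with nugget, partial sill and range a."""
    s = np.clip(h / a, 0.0, 1.0)
    return nugget + psill * (1.5 * s - 0.5 * s ** 3)

lags, gammas, npairs = experimental_variogram(x, z, lag_width=5.0, n_lags=20)

# Weighted least squares: residuals weighted by sqrt(number of pairs in each lag).
def residuals(p):
    nugget, psill, a = p
    return np.sqrt(npairs) * (spherical(lags, nugget, psill, a) - gammas)

fit = least_squares(residuals, x0=[0.1, gammas.max(), lags.max() / 2],
                    bounds=([0, 0, 1e-6], [np.inf, np.inf, np.inf]))
print("nugget, partial sill, range:", fit.x)
```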
2

Does it pay to be green? : an empirical study of the South African mining industry / T.F. Prinsloo

Prinsloo, Thomas Frederik, January 2010
In recent years, the growing importance of environmental and social issues has put pressure on companies to implement environmental and social systems. Under this pressure to improve environmental performance, environmental management accounting provides a valuable tool that enables companies to respond to environmental challenges. The purpose of this study is to determine the relationship between environmental performance and economic performance in the South African mining industry, and to identify and evaluate opportunities to improve both a company's environmental performance and its economic performance. Scatter plot diagrams were used to examine the relationship between environmental performance and economic performance; ten South African mining companies were selected, and their financial and environmental information for the period 2005 to 2009 was used. The analysis of the scatter plot diagrams found that it pays to be green for coal-mining companies, but not for gold- and platinum-mining companies. The study also found that environmental management accounting is essential for identifying and effectively managing environmental costs in order to improve environmental performance, and that it is an important tool for helping companies implement environmentally friendly programmes that secure a company's long-term strategic position. Despite the risks and challenges facing the mining industry, opportunities to improve a company's environmental and economic performance include emissions trading, the development of new technologies, investment in renewable energy projects, and an increase in demand for mining products due to the effects of climate change. The value of the study is that it is the first to investigate the relationship between environmental performance and economic performance in the South African mining industry. It is also unique in that it takes into account how investors view the company in terms of environmental performance, and it uses economic performance measures from both an internal and an external point of view rather than merely an internal one, as previous studies did. Companies in the mining industry as well as investors can use the findings presented in this study to appreciate the importance of preserving the environment and of triple bottom line accounting. / Thesis (M.Com. (Management Accounting))--North-West University, Potchefstroom Campus, 2011.
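As a rough illustration of the scatter-plot methodology described above, the following sketch plots a hypothetical environmental score against a hypothetical economic performance measure for ten companies and summarizes the relationship with a Pearson correlation; all numbers are invented and are not the study's data.

```python
# Hedged sketch of a scatter-plot analysis: hypothetical per-company environmental
# scores against a hypothetical return measure, with a Pearson correlation as a
# numerical summary of the visual relationship.
import numpy as np
import matplotlib.pyplot as plt

env_score = np.array([62, 55, 71, 48, 80, 66, 59, 74, 52, 68])            # hypothetical scores
roe = np.array([11.2, 8.4, 13.1, 7.9, 15.6, 10.8, 9.5, 14.2, 8.8, 12.3])  # hypothetical ROE, %

r = np.corrcoef(env_score, roe)[0, 1]       # Pearson correlation coefficient
plt.scatter(env_score, roe)
plt.xlabel("Environmental performance score")
plt.ylabel("Return on equity (%)")
plt.title(f"Hypothetical data, Pearson r = {r:.2f}")
plt.show()
```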
3

Detekce anomálií v datech výzkumu PROSO / Anomaly detection in PROSO research data

Šormová, Hana, January 2014
The thesis deals with algorithms for detecting anomalies in the data collected by the Problem Solving Tutor (PROSO) research. In the theoretical part, the author introduces the concept of an anomaly, the ideas behind the PROSO research, and a detailed overview of existing anomaly detection algorithms. In the practical part, selected algorithms are implemented; real outputs, results, and recommendations to the user are presented, and the chapter is supplemented by a number of graphs. The implemented algorithms are also compared with existing data mining software, and the thesis includes an example of applying these data mining tools to the PROSO data together with an explanation of their outputs. In the summary, an appropriate methodology for behavior analysis and anomaly detection is determined.
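As a hedged illustration of one generic approach to the anomaly detection discussed above (not the specific algorithms implemented in the thesis), the following sketch flags unusually long or short response times with a robust z-score based on the median and the median absolute deviation, using synthetic data.

```python
# Minimal sketch: robust z-score outlier detection on synthetic response times.
import numpy as np

rng = np.random.default_rng(1)
response_times = np.concatenate([rng.lognormal(mean=2.0, sigma=0.4, size=500),
                                 np.array([120.0, 150.0, 0.01])])   # injected anomalies

median = np.median(response_times)
mad = np.median(np.abs(response_times - median))
robust_z = 0.6745 * (response_times - median) / mad   # 0.6745 scales MAD to the normal sigma

anomalies = np.where(np.abs(robust_z) > 3.5)[0]       # conventional 3.5 cutoff
print("flagged indices:", anomalies, "values:", response_times[anomalies])
```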
4

Teacher Implementation of a Pretreatment Assessment Procedure in a Public Middle School

Alcala, Angelo L. (Angelo Lee) 05 1900
In an attempt to determine the effectiveness of a pretreatment assessment procedure known as the scatter plot (Touchette, MacDonald, & Langer, 1985), direct observational data were collected by 13 middle school teachers on four "problem" students. After four weeks of data collection, interobserver agreement (IOA) probes were calculated and a visual analysis of the plotted data was performed to identify a possible pattern of problem behavior. In addition, the 13 teachers were asked to complete a questionnaire to assess their perceptions of the scatter plot. Although a visual analysis of the plotted data suggested a possible pattern of problem behavior, the interobserver agreement probes failed to achieve the desired overall reliability of 90% or higher. Despite the low IOA, the results of the questionnaire generally supported the use of the scatter plot as a means of assessing student behavior. Possible reasons for failing to attain an IOA of 90% or higher include the total number of students in a class, the number of subjects observed per period, the teacher's location in the classroom, and the subjects' ability to recognize whether the teacher was "looking." Recommendations are provided for future research concerning the scatter plot and other, more formal approaches to assessing student behavior.
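A minimal sketch of the interval-by-interval interobserver agreement computation implied by the 90% reliability criterion above, using a simulated scatter-plot grid rather than the study's data; the grid dimensions and the disagreement rate are assumptions made here.

```python
# Illustrative sketch: interval-by-interval IOA for two observers' scatter-plot
# grids, where each cell codes whether the problem behavior occurred in that period.
import numpy as np

rng = np.random.default_rng(2)
observer_a = rng.integers(0, 2, size=(20, 7))           # 20 school days x 7 class periods
observer_b = observer_a.copy()
flip = rng.random(observer_a.shape) < 0.15               # simulate 15% disagreement
observer_b[flip] = 1 - observer_b[flip]

agreements = (observer_a == observer_b).sum()
ioa = 100.0 * agreements / observer_a.size               # agreements / total intervals
print(f"interval-by-interval IOA: {ioa:.1f}%  (target in the study: 90% or higher)")
```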
5

Introdução à econometria no Ensino Médio : aplicações da regressão linear / Introduction to econometrics in high school: applications of linear regression

Will, Ricardo de Souza, January 2016
Advisor: Prof. Dr. André Ricardo Oliveira da Fonseca / Dissertação (mestrado) - Universidade Federal do ABC, Programa de Pós-Graduação em Mestrado Profissional em Matemática em Rede Nacional, 2016. / This dissertation aims to propose, and to give support to, mathematics teachers of the 3rd year of high school on topics involving statistics and economics, taking econometrics as the suggested theme, specifically simple and multiple linear regression. These concepts are broad enough to let students understand a variety of applications, recognize linear phenomena, and use regression to make predictions. Chapter 1 reviews statistical concepts, emphasizing the standard deviation, confidence intervals, and hypothesis testing. Chapter 2 presents concepts of econometrics: the origin of the word, the econometric analysis of a mathematical model, and the goals and methodology of econometrics, with particular attention to Keynes and his postulates on the marginal propensity to consume and to save. It also has students apply their knowledge to collect and tabulate data on the observed variables, build the scatter plot, fit a line through the points, and determine the parameters and the equation of the line. Chapter 3 deals with the simple linear regression model: it first gives special attention to Galton, who originated the concept of correlation, and then moves on to the calculation of the parameters, the residuals, the variance, the standard deviation, and the coefficients of correlation and determination, using the statistical concepts recalled in Chapter 1. Chapter 4 treats the multiple linear regression model and discusses the differences that arise with two or more explanatory variables; here the teacher may review the concepts of matrices. Finally, Chapter 5 presents the lesson plan, the choice of target audience and school unit, the calendar and schedule of activities with the students, and the results obtained.
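The chapter-3 material on simple linear regression can be illustrated with a short sketch: an ordinary least squares fit of a hypothetical consumption-versus-income data set in the spirit of the Keynesian consumption function, where the slope estimates the marginal propensity to consume. The numbers are invented for illustration.

```python
# Minimal sketch: ordinary least squares for a simple linear regression,
# y = b0 + b1*x, on a hypothetical consumption-vs-income example.
import numpy as np

income = np.array([1500., 1800., 2200., 2500., 3000., 3400., 3900., 4500.])        # hypothetical
consumption = np.array([1400., 1650., 1980., 2200., 2600., 2900., 3250., 3700.])   # hypothetical

x_mean, y_mean = income.mean(), consumption.mean()
b1 = ((income - x_mean) * (consumption - y_mean)).sum() / ((income - x_mean) ** 2).sum()  # slope = MPC estimate
b0 = y_mean - b1 * x_mean                                                                  # intercept
r = np.corrcoef(income, consumption)[0, 1]                                                 # Pearson correlation

print(f"consumption ≈ {b0:.1f} + {b1:.3f} * income   (r = {r:.3f})")
```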
6

Modeled Estimates of Solar Direct Normal Irradiance and Diffuse Horizontal Irradiance in Different Terrestrial Locations

Abyad, Emad, January 2017
The transformation of solar energy into electricity is starting to affect the overall worldwide energy production mix, and photovoltaic-generated electricity can play a significant role in minimizing the use of non-renewable energy sources. Sunlight consists of three main components: global horizontal irradiance (GHI), direct normal irradiance (DNI), and diffuse horizontal irradiance (DHI). Typically, these components are measured using specialized instruments in order to study solar radiation at a location. However, such measurements are not always available, especially for the DNI and DHI components, so many models have been developed to estimate these components from available GHI data, each with its own merits. For this thesis, solar radiation data collected at four locations have been analyzed: Al-Hanakiyah (Saudi Arabia), Boulder (U.S.), Ma'an (Jordan), and Ottawa (Canada). The BRL, Reindl*, DISC, and Perez models have been used to estimate DNI and DHI from the experimentally measured GHI data. The findings show that the Reindl* and Perez models offered similar accuracy in computing DNI and DHI values when compared with detailed experimental data for Al-Hanakiyah and Ma'an. For Boulder, the Perez and BRL models estimate DHI with similar accuracy, and the DISC and Perez models are better estimators of DNI. The Reindl* model performs better when modeling DHI and DNI for the Ottawa data. The BRL and DISC models show similar error metrics, except for the Ma'an location, where the BRL model shows high MAE, RMSE, and standard deviation (σ) values. The Boulder and Ottawa datasets were incomplete, which affected the model performance metrics; for these locations the metrics show very high, unreasonable RMSE and σ values. Since the current models have their own limitations, it is advised that a global model be developed by collecting data from many locations, as a way to help minimize the error between actual and modeled values. Availability of multi-year data, of parameters such as albedo and aerosols, and of data at one-minute to hourly time steps could help minimize the error between measured and modeled values. In addition to accurate data, analysis of spectral data is important for evaluating its impact on solar technologies.
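The decomposition discussed above rests on the closure relation GHI = DHI + DNI·cos(θz). The following sketch applies that relation and the error metrics used in the comparison (MAE, RMSE, σ) to hypothetical numbers; it does not re-implement the BRL, Reindl*, DISC, or Perez models.

```python
# Sketch: closure relation GHI = DHI + DNI*cos(zenith) plus MAE/RMSE/sigma error
# metrics, on hypothetical values; the stand-in "model" is just a scaled copy.
import numpy as np

zenith_deg = np.array([30., 45., 60., 70.])                  # hypothetical solar zenith angles
ghi = np.array([750., 600., 400., 250.])                     # hypothetical measured GHI, W/m^2
dhi = np.array([120., 140., 150., 130.])                     # hypothetical measured DHI, W/m^2

dni_measured = (ghi - dhi) / np.cos(np.radians(zenith_deg))  # DNI implied by the closure relation
dni_modeled = dni_measured * np.array([0.97, 1.05, 0.90, 1.10])  # stand-in for a model's output

err = dni_modeled - dni_measured
mae = np.abs(err).mean()
rmse = np.sqrt((err ** 2).mean())
sigma = err.std()
print(f"MAE = {mae:.1f}, RMSE = {rmse:.1f}, sigma = {sigma:.1f} W/m^2")
```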
7

Méthode géométrique de séparation de sources non-négatives : applications à l'imagerie dynamique TEP et à la spectrométrie de masse / Geometrical method for non-negative source separation : Application to dynamic PET imaging and mass spectrometry

Ouedraogo, Wendyam, 28 November 2012
This thesis addresses the problem of blind separation of non-negative sources (i.e., quantities that are positive or zero). Linear instantaneous mixtures of non-negative sources occur in many signal and image processing problems, such as the decomposition of signals measured by a spectrometer (mass spectra, Raman spectra, infrared spectra), the decomposition of images (medical, multi-spectral or hyperspectral), or the estimation of the activity of a radionuclide. In these problems the quantities are inherently non-negative, and this property must be preserved during estimation, since it is what gives the estimated components their physical meaning. Most existing non-negative source separation methods require "strong" assumptions on the sources (such as mutual independence, local dominance, or total additivity), which are not always satisfied in practice. In this work we propose a new geometrical method for separating non-negative sources, based on the geometrical distribution of the scatter plot of the observations. The mixing coefficients and the sources are estimated by finding the minimum-aperture simplicial cone containing the scatter plot of the mixed data. The proposed method requires neither the mutual independence of the sources, nor their decorrelation, nor their local dominance, nor their total additivity. Only one condition is necessary and sufficient: the positive orthant must be the unique minimum-aperture simplicial cone containing the scatter plot of the source signals. The proposed algorithm is successfully evaluated in two very different non-negative source separation problems. In the first, we separate mass spectra measured at the output of a liquid chromatograph in order to identify and quantify the different metabolites (small molecules) present in the urine of a rat treated with phenobarbital. In the second, we estimate the different pharmacokinetic compartments of the fluorine-18-labelled FluoroDeoxyGlucose radiotracer ([18F]-FDG) in the brain of a human patient from a series of 3D PET images of this organ, without blood sampling. Among these pharmacokinetics, the arterial input function is of great interest for evaluating the effectiveness of anti-cancer treatment in oncology.
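The mixing model and the geometric estimation principle described above can be summarized compactly; the notation below (X, A, S and the aperture functional) is introduced here for illustration and is not taken from the thesis.

```latex
% Compact restatement of the non-negative linear instantaneous mixing model and
% the minimum-aperture simplicial cone criterion; notation chosen for illustration.
\[
  \mathbf{X} = \mathbf{A}\,\mathbf{S}, \qquad
  \mathbf{A} \in \mathbb{R}_{\ge 0}^{m \times n}, \quad
  \mathbf{S} \in \mathbb{R}_{\ge 0}^{n \times T},
\]
\[
  \hat{\mathbf{A}} \;=\; \arg\min_{\mathbf{A}}\;
  \operatorname{aperture}\!\bigl(\mathcal{C}(\mathbf{A})\bigr)
  \quad \text{s.t.} \quad
  \mathbf{x}_t \in \mathcal{C}(\mathbf{A}) \;\; \forall t,
\]
% where \(\mathcal{C}(\mathbf{A})\) is the simplicial cone spanned by the columns of
% \(\mathbf{A}\); separation is identifiable when the positive orthant is the unique
% minimum-aperture simplicial cone containing the scatter plot of the sources.
```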
