111

Informacinių technologijų rizikos valdymo sistema / Information technology risk management framework

Virbalas, Linas 08 September 2009 (has links)
This work presents an IT risk management system capable of modelling and managing risks that arise from IT and are related to IS downtime and slow response times. The system is implemented around a proposed neural network architecture that serves as the core of the modelling engine, and it is trained on statistical data accumulated from existing information systems. The user specifies which statistical time series should be modelled, i.e. those that express the risk (such as server load or IS response time). The system then automatically determines which statistical time series are correlated, groups the related ones, and creates a separate model for each group; each model generalizes a previously unknown dependency between the time series by means of a neural network. Given the values of the influencing parameters, each model produces an estimate of the risk parameter. Experiments showed that the system can be used successfully in a mixed IT environment and is able to model various IT and IS component parameters that give rise to risk.
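As a rough, hedged sketch of the pipeline this abstract describes — select the time series that correlate with a chosen risk series and train a small neural network on them — the following illustrative code uses assumed names (build_risk_model, metric keys such as 'cpu_load') and a generic scikit-learn regressor rather than the thesis's own network architecture:

```python
# Minimal sketch, not the thesis implementation: keep the metrics correlated with
# the chosen risk series and fit one small neural network that predicts the risk
# value from them. Threshold, names and network shape are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

def build_risk_model(series, risk_key, threshold=0.6):
    """series: dict mapping metric names (e.g. 'cpu_load') to equally long 1-D arrays."""
    risk = series[risk_key]
    # Keep only metrics whose absolute correlation with the risk series is high.
    inputs = {k: v for k, v in series.items()
              if k != risk_key and abs(np.corrcoef(v, risk)[0, 1]) >= threshold}
    if not inputs:
        return None, None
    X = np.column_stack(list(inputs.values()))
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X, risk)                      # learn the previously unknown dependency
    return sorted(inputs), model            # model.predict() estimates the risk value
```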
112

Development Towards a Three-Component Three-Dimensional Micro Velocity Measurement Technique

Abdolrazaghi, Mona Unknown Date
No description available.
113

On the experimental design of the material microstructures

Staraselski, Yauheni 03 May 2014 (has links)
The design techniques for components at the macro level are well established in the scientific community, yet they remain far behind the real performance limits of materials. To reach those limits, a deeper understanding of the material structure is required. The production of new components through standard alloying is the basis of modern materials-science manufacturing; designing materials with the required, expected performance limits is the next conceptual step for the materials scientist. To make this step, precise analysis of the material structure at the microstructural level is of major importance for next-generation materials design. The complexity of the material structure across scales (macro to micro) requires new non-deterministic methods for a better understanding of the connection between material performance and microstructural features. This work presents various new research methodologies and techniques for material microstructure characterization and numerical design, with future application to the analysis of material behaviour. The focus of this research was to analyse a new cross-correlation function of the material structure at the micro length scale and to develop a novel framework that allows a better understanding of important material phenomena such as failure initiation and recrystallization.
114

A spectroscopic study of detached binary systems using precise radial velocities

Ramm, David John January 2004 (has links)
Spectroscopic orbital elements and/or related parameters have been determined for eight binary systems, using radial-velocity measurements that have a typical precision of about 15 m s⁻¹. The orbital periods of these systems range from about 10 days to 26 years, with a median of about 6 years. Orbital solutions were determined for the seven systems with shorter periods. The measurement of the mass ratio of the longest-period system, HD217166, demonstrates that this important astrophysical quantity can be estimated in a model-free manner with less than 10% of the orbital cycle observed spectroscopically.

Single-lined orbital solutions have been derived for five of the binaries. Two of these systems are astrometric binaries: β Ret and ν Oct. The other SB1 systems were 94 Aqr A, θ Ant, and the 10-day system, HD159656. The preliminary spectroscopic solution for θ Ant (P ~ 18 years) is the first one derived for this system. The improvement to the precision achieved for the elements of the other four systems was typically between 1 and 2 orders of magnitude. The very high precision with which the spectroscopic solution for HD159656 has been measured should allow an investigation into possible apsidal motion in the near future. In addition to the variable radial velocity owing to its orbital motion, the K-giant ν Oct has been found to have an additional long-term irregular periodicity, attributed, for the time being, to the rotation of a large surface feature.

Double-lined solutions were obtained for HD206804 (K7V+K7V), which previously had two competing astrometric solutions but no spectroscopic solution, and a newly discovered seventh-magnitude system, HD181958 (F6V+F7V). This latter system has the distinction of having components and orbital characteristics whose study should be possible with present ground-based interferometers. All eight of the binary systems have had their mass ratio and the masses of their components estimated.

The following comments summarize the motivation for getting these results, and the manner in which the research was carried out.

The majority of stars exist in binary systems rather than singly as does the Sun. These systems provide astronomers with the most reliable and proven means to determine many of the fundamental properties of stars. One of these properties is the stellar mass, which is regarded as being the most important of all, since most other stellar characteristics are very sensitive to the mass. Therefore, empirical masses, combined with measurements of other stellar properties, such as radii and luminosities, are an excellent test for competing models of stellar structure and evolution.

Binary stars also provide opportunities to observe and investigate many extraordinary astrophysical processes that do not occur in isolated stars. These processes often arise as a result of direct and indirect interactions between the components, when they are sufficiently close to each other. Some of the interactions are relatively passive, such as the circularization of the mutual orbits, whilst others result from much more active processes, such as mass exchange leading to intense radiation emissions.

A complete understanding of a binary system's orbital characteristics, as well as the measurement of the all-important stellar masses, is almost always only achieved after the binary system has been studied using two or more complementary observing techniques. Two of the suitable techniques are astrometry and spectroscopy.
In favourable circumstances, astrometry can deduce the angular dimensions of the orbit, the total mass of the system, and sometimes its distance from us. Spectroscopy, on the other hand, can determine the linear scale of the orbit and the ratio of the stellar masses, based on the changing radial velocities of both stars. When a resolved astrometric orbital solution is also available, the velocities of both stars can allow the binary system's parallax to be determined, and the velocities of one star can provide a measure of the system mass ratio.

Unfortunately, relatively few binary systems are suited to these complementary studies. Underlying this difficulty are the facts that, typically, astrometrically-determined orbits favour those with periods of years or decades, whereas spectroscopic orbital solutions are more often measured for systems with periods of days to months. With the development of high-resolution astrometric and spectroscopic techniques in recent years, it is hoped that many more binary systems will be amenable to these complementary strategies.

Several months after this thesis began, a high-resolution spectrograph, HERCULES, commenced operations at the Mt John University Observatory, to be used in conjunction with the 1-metre McLellan telescope. For late-type stars, the anticipated velocity precision was ≲ 10 m s⁻¹. The primary goals of this thesis were: 1. to assess the performance of HERCULES and the related reduction software that subsequently followed, 2. to carry out an observational programme of 20 or so binary systems, and 3. to determine the orbital and stellar parameters which characterize some of these systems. The particular focus was on those binaries that have resolved or unresolved astrometric orbital solutions, which therefore may be suited to complementary investigations.

HERCULES was used to acquire spectra of the programme stars, usually every few weeks, over a timespan of about three years. High-resolution spectra were acquired for the purpose of measuring precise radial velocities of the stars. When possible, orbital solutions were derived from these velocities, using the method of differential corrections.
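As a hedged illustration of the kind of model such spectroscopic solutions are fitted to (a sketch under standard conventions, not code from the thesis), the Keplerian radial-velocity curve for one component can be evaluated as follows; the function name and the simple fixed-point solution of Kepler's equation are assumptions made for brevity:

```python
# Minimal sketch of the standard Keplerian radial-velocity model that orbital fits
# (e.g. by differential corrections) are based on. Illustrative only.
import numpy as np

def radial_velocity(t, P, T0, e, omega, K, gamma):
    """Radial velocity at times t (array-like) for one binary component.
    P: period, T0: time of periastron, e: eccentricity, omega: argument of
    periastron [rad], K: velocity semi-amplitude, gamma: systemic velocity."""
    M = 2.0 * np.pi * (np.asarray(t, dtype=float) - T0) / P     # mean anomaly
    E = M.copy()
    for _ in range(100):                                        # Kepler's equation, fixed-point iteration
        E = M + e * np.sin(E)
    nu = 2.0 * np.arctan2(np.sqrt(1.0 + e) * np.sin(E / 2.0),
                          np.sqrt(1.0 - e) * np.cos(E / 2.0))   # true anomaly
    return gamma + K * (np.cos(nu + omega) + e * np.cos(omega))
```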
115

AVALIAÇÃO DA QUALIDADE DO PROCESSO DE LINGOTAMENTO CONTÍNUO NA PRESENÇA DE CORRELAÇÃO CRUZADA / QUALITY EVALUATION OF CONTINUOUS CASTING PROCESS IN PRESENCE OF CROSS-CORRELATION

Mezzomo, Meire 25 July 2013 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / In today's competitive market, most companies have as their main goal the continuous improvement of their products and services, so the application of statistical methods is highly relevant to quality evaluation, helping in the understanding and monitoring of processes. In this context, the present study addresses the use of multivariate control charts in the evaluation of production processes in the presence of cross-correlation. The objective is to verify the stability of the continuous casting process in the production of steel billets by means of Hotelling's T² multivariate control chart applied to the residuals estimated from linear mathematical models. Initially, the existence of autocorrelation in the data was verified, which made ARIMA modelling necessary: when autocorrelation is present, the residuals must be determined and the multivariate control charts applied to the residuals rather than to the original variables. The cross-correlation among the variables proved to be significant, which is one of the assumptions for applying the T² statistic. Once instability was detected in the T² chart, it was necessary to identify the variable, or set of variables, among the steel temperatures in the distributor and the distributor weight that was responsible for the instability. The estimated residuals were then decomposed into principal components and, with the help of the correlation between the original variables and the principal components, the variables that contributed most to the formation of each component were identified. It was thus possible to detect the variables causing the instability of the system: for the steel temperature in the distributor these were T4 and T5, followed by T6, T3, T7 and T2, and for the distributor weight, PD4, PD5, PD3, PD6 and PD2, respectively. In this way, the residuals estimated from the mathematical models, the Hotelling T² multivariate control chart and the decomposition into principal components were able to represent the production process. This methodology made it possible to understand the behaviour of the variables and helped in the monitoring of the process, as well as in the determination of the variables likely to cause instability in the continuous casting process.
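As a rough sketch of the monitoring scheme this abstract describes — model the autocorrelated variables, take the residuals, and chart them with Hotelling's T² — the following illustrative code (assumed names; an AR(1) fit stands in for the dissertation's ARIMA models) shows the core computation:

```python
# Illustrative sketch, not the dissertation's code: residuals from a simple AR(1)
# model per variable, then the Hotelling T^2 statistic per sample on the residual matrix.
import numpy as np

def ar1_residuals(x):
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]           # crude AR(1) coefficient estimate
    return x[1:] - phi * x[:-1]

def hotelling_t2(residuals):
    """residuals: (n_samples, n_variables) array. Returns the T^2 value per sample."""
    d = residuals - residuals.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(residuals, rowvar=False))
    return np.einsum('ij,jk,ik->i', d, S_inv, d)

# Usage sketch: stack the residuals of the temperature and weight series column-wise,
# compute t2 = hotelling_t2(R), and compare each value against an F-based control limit.
```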
116

Avaliação e implementação de métodos de estimação de tempo de atraso de sinais de ultra-som / Evaluation and implementation of time-delay estimation methods for ultrasonic signals

Martinhon, Guilherme. January 2007 (has links)
Advisor: Ricardo Tokio Higuti / Committee: Alexandre César Rodrigues da Silva / Committee: Flávio Buiochi / Abstract: Time-delay estimation between two ultrasonic signals is a very common and important task in several applications, such as distance measurement in positioning systems, thickness measurement in non-destructive testing and measurement cells for material properties, among others. Some applications require high accuracy and precision in the determination of the time delay, which depend on several transducer parameters, on the excitation and the propagation medium, and on the estimation method and numerical representation. In this work, three time-delay estimators are evaluated, with fixed- and floating-point implementations: cross-correlation with parabolic interpolation, Hilbert transform of the correlation, and analytic signal envelope. The estimators are evaluated in MATLAB with floating-point representation, using synthesized signals and real signals acquired in the laboratory, and in fixed point using a Texas Instruments TMS320VC5416 digital signal processor. Parameters such as transducer central frequency, sampling frequency, bandwidth, signal-to-noise ratio and medium attenuation are considered. The performance of the methods is compared by means of mean errors (or bias) and standard deviations of the measurements. / Master's
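As a sketch of the first estimator named above — cross-correlation with parabolic interpolation — the following illustrative Python (the thesis evaluates MATLAB and fixed-point DSP implementations; names here are assumptions) refines the integer-sample correlation peak to a sub-sample delay:

```python
# Illustrative sketch of cross-correlation time-delay estimation with 3-point
# parabolic interpolation around the correlation peak. Not the thesis's code.
import numpy as np

def delay_xcorr_parabolic(x, y, fs):
    """Estimate the delay of y relative to x, in seconds, at sampling rate fs [Hz]."""
    c = np.correlate(y, x, mode='full')            # cross-correlation
    k = int(np.argmax(c))                          # integer-sample peak position
    # Parabolic (3-point) interpolation for sub-sample precision.
    if 0 < k < len(c) - 1:
        den = c[k - 1] - 2.0 * c[k] + c[k + 1]
        if den != 0:
            k = k + 0.5 * (c[k - 1] - c[k + 1]) / den
    return (k - (len(x) - 1)) / fs                 # convert peak index to lag in seconds
```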
117

Leis de potências e correlações em séries temporais de preços de produtos agrícolas / Power laws and correlations in time series of agricultural commodity prices

SIQUEIRA JÚNIOR, Erinaldo Leite 10 August 2009 (has links)
Financial markets are complex systems that contain large numbers of interacting units, including interactions among units in the same market and interactions between units in different markets. Various methods from economics, statistics and econophysics have been developed to analyze financial time series (such as price returns, share volume and number of transactions) and to establish theoretical models for the underlying stochastic processes. The availability of financial data on the internet and increasing computational power have enabled researchers to conduct a large number of empirical studies on financial markets. These studies have shown some universal properties: the risk function of price returns is scale invariant, with power-law behavior and a similar exponent for different markets, and the absolute values of returns (volatility) exhibit long-range power-law correlations. In this work, we use methods of econophysics to study the statistical properties of Brazilian financial markets. We analyze and compare the scaling properties of risk functions and correlations in time series of price returns of agricultural commodities and of stocks of various companies traded at Bovespa. We analyze daily data for the period 2000-2008: the prices of five commodities (sugar, cotton, coffee, soybean and cattle) and of twenty stocks divided into four groups — banks, energy, telecommunications and steel — with five companies in each group; for the stocks, the opening and closing prices, maximum and minimum values, volume and turnover were considered. For both commodities and stocks, the risk function of daily price returns shows power-law behavior with an exponent outside the Lévy stable region, and the exponents are higher for stocks than for commodities. We use Detrended Fluctuation Analysis (DFA), a method developed to quantify long-range correlations in non-stationary time series, to study correlations in the daily series of absolute returns (volatility). All analyzed series show persistent behavior, meaning that large (small) values are more likely to be followed by large (small) values; the DFA exponent is higher for commodities than for stocks. We also use Detrended Cross-Correlation Analysis (DCCA), a generalization of DFA, to study cross-correlations between pairs of series. The DCCA exponents are above 0.5 for all series, indicating the existence of long-range cross-correlations: each stock or commodity has long memory of its own previous values and of the previous values of the other stocks or commodities studied. These results are in agreement with results obtained for American financial markets.
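As a brief sketch of the DFA method described above (illustrative code, not from the dissertation; first-order detrending and non-overlapping windows are assumed), the fluctuation function F(n) scales as n^α, and α > 0.5 indicates the persistent behavior reported:

```python
# Illustrative Detrended Fluctuation Analysis (DFA-1): F(n) ~ n^alpha, where
# alpha > 0.5 signals persistent long-range correlations in the volatility series.
import numpy as np

def dfa(x, scales):
    """Return the fluctuation F(n) for each window size n in `scales`."""
    y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
    F = []
    for n in scales:
        n_win = len(y) // n
        segments = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        ms = []
        for seg in segments:
            coeff = np.polyfit(t, seg, 1)          # local linear trend in the window
            ms.append(np.mean((seg - np.polyval(coeff, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

# The DFA exponent is the slope of log F(n) versus log n:
# alpha = np.polyfit(np.log(scales), np.log(dfa(abs_returns, scales)), 1)[0]
```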
118

Padrão da ocupação da baleia-franca-austral (Eubalaena australis) em enseadas do litoral catarinense e influência das anomalias climáticas em sua taxa de natalidade / Occupation pattern of the southern right whale (Eubalaena australis) in bays of the Santa Catarina coast and the influence of climate anomalies on its birth rate

Seyboth, Elisa January 2013 (has links)
Dissertation (Master's) – Universidade Federal do Rio Grande, Programa de Pós-Graduação em Oceanografia Biológica, Instituto de Oceanografia, 2013. / The vulnerability of marine mammals to threats that can compromise the maintenance of their populations is one of the reasons why many species are targets of conservation research. The southern right whale, Eubalaena australis, is one such species, and whaling was a strong threat to all of its populations. On the Brazilian coast, its main reproductive concentration occurs along the coast of Santa Catarina State, where individuals of the species are observed annually between July and November. These individuals belong to a population shared between Brazil and Argentina that is recovering at significant rates. Efforts have been made to preserve this important area for the species, but its habitat use in the region needs to be better understood, as do the factors that may influence its relative birth rate, which is strongly related to population recovery. The objective of this study was to test the influence of temporal and environmental variables on the species' distribution along the southern coast of Santa Catarina and to assess whether climate anomalies influence its relative birth rate, using Generalized Linear Models and cross-correlation, respectively. The results suggest that both cow-calf pairs and unaccompanied adult groups prefer wide bays with gentle slopes, and that they seem to avoid bays with steep inclination angles when strong easterly winds act on them. The reproductive success of individuals appears to be influenced by climate anomalies, mainly those related to sea surface temperature, which affect food availability in the species' feeding area in the vicinity of the South Georgia Islands.
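A minimal sketch (assumed variable names, annual series) of the cross-correlation step mentioned above, relating the relative birth rate to a climate-anomaly index at different lags:

```python
# Illustrative only: lagged Pearson correlation between an annual relative-birth-rate
# series and a climate-anomaly series (e.g. a sea surface temperature index).
import numpy as np

def cross_correlation(birth_rate, climate_anomaly, max_lag=5):
    """Both inputs are equal-length annual arrays; a positive lag means the
    anomaly leads the birth rate by `lag` years."""
    b = (birth_rate - birth_rate.mean()) / birth_rate.std()
    c = (climate_anomaly - climate_anomaly.mean()) / climate_anomaly.std()
    return {lag: float(np.mean(b[lag:] * c[:len(c) - lag]))
            for lag in range(max_lag + 1)}
```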
119

Um novo algoritmo de granulometria com aplicação em caracterização de nanoestruturas de silício. / A new correlation-based granulometry algorithm with application in characterizing porous silicon nanomaterials.

Ricardo Hitoshi Maruta 14 October 2011 (has links)
Granulometry is the process of measuring the size distribution of objects in an image of granular material. Usually, algorithms based on mathematical morphology or edge detection are used for this task. We propose an entirely new approach to granulometry using cross-correlations with circles of different sizes. This technique is primarily suited to detecting circular objects, but it can be extended to other shapes using other correlation kernels. Experiments show that the new algorithm is highly robust to noise and can detect even faint objects and/or objects with partial superposition. This work also reports quantitative structural characteristics of the porous silicon layer, obtained by applying the proposed algorithm to Scanning Electron Microscopy (SEM) images. The new algorithm, which we call Granul, computes the areas and frequencies of the pores; additional processing with other algorithms classifies the pores as circular or square. We relate these quantitative results to the fabrication process and discuss the mechanism of square pore formation in silicon. The new algorithm proved reliable in processing SEM images and is a promising tool for controlling the pore-formation process.
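As a hedged sketch of the core idea — cross-correlating the image with circular kernels of different sizes — the following illustrative code (function names and normalization are assumptions; the thesis's Granul algorithm is more elaborate) scores how strongly discs of each radius match the image:

```python
# Illustrative correlation-based granulometry: correlate the image with zero-mean
# disc kernels of several radii and record the strongest response per radius.
import numpy as np
from scipy.signal import fftconvolve

def disc_kernel(radius):
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = (x * x + y * y <= radius * radius).astype(float)
    k -= k.mean()                        # zero mean, so flat regions give no response
    return k / np.linalg.norm(k)

def granulometry_scores(image, radii):
    """Return, for each radius, the maximum normalized correlation response."""
    img = image.astype(float) - image.mean()
    # Cross-correlation = convolution with the flipped kernel (symmetric here).
    return {r: float(fftconvolve(img, disc_kernel(r)[::-1, ::-1], mode='same').max())
            for r in radii}
```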
120

Visualisation Studio for the analysis of massive datasets

Tucker, Roy Colin January 2016 (has links)
This thesis describes the research underpinning, and the development of, a cross-platform application for the analysis of simultaneously recorded multi-dimensional spike trains. These spike trains are believed to carry the neural code that encodes information in a biological brain. A number of statistical methods already exist to analyse the temporal relationships between the spike trains. Historically, hundreds of spike trains have been simultaneously recorded; however, as a result of technological advances, recording capability has increased and the analysis of thousands of simultaneously recorded spike trains is now a requirement. Effective analysis of large data sets requires software tools that fully exploit the capabilities of modern research computers and effectively manage and present large quantities of data. To be effective, such software tools must: be targeted at the field under study; be engineered to exploit the full compute power of research computers; and prevent information overload of the researcher despite presenting a large and complex data set. The Visualisation Studio application produced in this thesis brings together the fields of neuroscience, software engineering and information visualisation to produce a software tool that meets these criteria. A visual programming language for neuroscience is produced that allows extensive pre-processing of spike train data prior to visualisation. The computational challenges of analysing thousands of spike trains are addressed using parallel processing to fully exploit the modern researcher's computer hardware. In the case of the computationally intensive pairwise cross-correlation analysis, the option to use a high-performance compute (HPC) cluster is seamlessly provided. Finally, the principles of information visualisation are applied to key visualisations in neuroscience so that the researcher can effectively manage and visually explore the resulting data sets. The final visualisations can typically represent data sets 10 times larger than previously while remaining highly interactive.
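As an illustrative sketch (not the application's code) of the pairwise cross-correlation analysis mentioned above, the following computes a cross-correlogram between two spike trains; for thousands of trains, the loop over all pairs is what the parallel and HPC options would distribute:

```python
# Illustrative cross-correlogram: count how often spikes in train_b occur at a given
# time offset from spikes in train_a, within a window around each spike.
import numpy as np

def cross_correlogram(train_a, train_b, bin_width=0.001, window=0.05):
    """train_a, train_b: sorted spike times in seconds. Returns (bin_edges, counts)."""
    edges = np.arange(-window, window + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for t in train_a:
        # Only spikes of train_b within +/- window of t contribute to the histogram.
        lo, hi = np.searchsorted(train_b, [t - window, t + window])
        counts += np.histogram(train_b[lo:hi] - t, bins=edges)[0]
    return edges, counts

# For thousands of trains, the (i, j) pair loop can be distributed, e.g. with
# multiprocessing.Pool.map over all pairs of recorded trains.
```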
