51

ANISOTROPIC POLARIZED LIGHT SCATTER AND MOLECULAR FACTOR COMPUTING IN PHARMACEUTICAL CLEANING VALIDATION AND BIOMEDICAL SPECTROSCOPY

Urbas, Aaron Andrew 01 January 2007 (has links)
Spectroscopy and other optical methods can often be employed with limited or no sample preparation, making them well suited for in situ and in vivo analysis. This dissertation focuses on the use of near-infrared spectroscopy (NIRS) and polarized light scatter for two such applications: the assessment of cardiovascular disease, and the validation of cleaning processes for pharmaceutical equipment. There is a need for more effective in vivo techniques for assessing intravascular disorders, such as aortic aneurysms and vulnerable atherosclerotic plaques. These and other cardiovascular disorders are often associated with structural remodeling of vascular walls. NIRS has previously been demonstrated as an effective technique for the analysis of intact biological samples. In this research, traditional NIRS is used in the analysis of aortic tissue samples from a murine knockout model that develops abdominal aortic aneurysms (AAAs) following infusion of angiotensin II. Effective application of NIRS in vivo, however, requires a departure from traditional instrumental principles. Toward this end, the groundwork for a fiber optic-based catheter system employing a novel optical encoding technique, termed molecular factor computing (MFC), was developed for differentiating cholesterol, collagen and elastin through intervening red blood cell solutions. In MFC, the transmission spectra of chemical compounds are used to collect measurements directly correlated with the desired sample information. Pharmaceutical cleaning validation is another field that can greatly benefit from novel analytical methods. Conventionally, cleaning validation is accomplished through surface residue sampling followed by analysis with a traditional analytical method. Drawbacks to this approach include cost, analysis time, and uncertainties associated with the sampling and extraction methods. This research explores the development of in situ cleaning validation methods to eliminate these issues. The use of light scatter and polarization was investigated for the detection and quantification of surface residues. Although effective, these techniques did not establish the ability to discriminate between residues. With that aim in mind, the differentiation of surface residues using NIRS and MFC was also investigated.
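
To make the MFC principle concrete: a single optical filter whose transmission spectrum plays the role of a regression vector can compute a calibration score directly in hardware. A minimal sketch of this idea, with all symbols illustrative rather than drawn from the dissertation:

```latex
% MFC measurement sketch: the filter transmission T(\lambda) weights the
% sample spectrum S(\lambda), so the detector integrates the product optically.
\[
  y \;=\; \int T(\lambda)\, S(\lambda)\, \mathrm{d}\lambda
  \;\approx\; \mathbf{t}^{\top}\mathbf{s}
\]
% i.e., one detector reading y is the inner product that a digital
% multivariate calibration would otherwise compute from a full spectrum.
```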
52

Contributions to spectral CT

Opie, Alexander M. T. January 2013 (has links)
Spectral x-ray computed tomography (CT) is an important nascent imaging modality with several exciting potential applications. The research presented in this thesis separates into two primary areas with the common underlying theme of spectral CT: the first is Compton scatter estimation and the second is interior tomography. First, the research is framed and its outputs are identified. Background on the concepts used in the thesis is offered, including x-ray imaging and computed tomography, CT scanner architecture, spectral imaging, interior tomography and x-ray scatter. The mathematical background of techniques for image reconstruction from x-ray transmission measurements is presented. Many of the tools used to perform the research, both hardware and software, are described. An algorithm is developed for estimating the intensity of Compton scattered photons within a spectral CT scan, and a major approximation used by the algorithm is analysed. One proposed interior reconstruction algorithm is briefly evaluated; while this is not directly linked to spectral CT, it is related to the work on a novel hybrid spectral interior micro-CT architecture. Conclusions are summarised and suggestions for future work are offered. Scatter is known to cause artefacts in CT reconstructions, and several methods exist to correct data that have been corrupted by scatter. Compton scatter affects the energy of photons, so spectral CT measurements offer the potential to correct for this phenomenon more accurately than conventional measurements. A Compton scatter algorithm is developed and found to match Monte Carlo validation simulations very well, under the constraints that the object is at the micro-CT scale and that electron-binding effects are omitted. Development of the algorithm uses an approximation of the post-scatter attenuation to simplify the estimation problem and enable implementation. The consequences of this approximation are analysed, and the error introduced is found to be less than 5% in most biomedical micro-CT situations. Interior tomography refers to the incomplete data situation caused by the truncation of some or all CT projections, and is an active research area. A recently proposed interior reconstruction algorithm is evaluated with regard to its sensitivity to input error, and is found to have mediocre performance in this respect. Published results are not found to be reproducible, suggesting some omission from the published algorithm. A novel hybrid spectral interior architecture is described, along with an iterative reconstruction algorithm for hybrid data sets. The system combines a full field of view conventional imaging chain and an interior field of view spectral imaging chain to enable spectral measurement of a region of interest, and addresses some important limitations of spectral x-ray detectors; promising results are shown. Spectral reconstructions from interior data are shown to have sufficient information to distinguish two k-edge contrast agents (iodine and gadolinium) not only within the interior field of view but also beyond it. The architecture is further explored in the context of radiation exposure reduction, including testing of an analytical hybrid reconstruction algorithm.
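
The physics underlying the spectral approach to scatter correction is the Compton energy shift, which an energy-resolving detector can observe directly; the standard relation (textbook physics, not quoted from the thesis, and stated without the electron-binding effects the algorithm omits) is:

```latex
\[
  E' = \frac{E}{1 + \dfrac{E}{m_e c^2}\left(1 - \cos\theta\right)},
  \qquad m_e c^2 \approx 511~\mathrm{keV}
\]
```

where E is the incident photon energy, E' the scattered photon energy and θ the scattering angle; the energy downshift is what lets spectral measurements separate scattered from primary photons.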
53

Managing Slash to Minimize Colonization of Residual Leave Trees by Ips and Other Bark Beetle Species Following Thinning in Southwestern Ponderosa Pine

DeGomez, Tom, Fettig, Christopher J., McMillin, Joel D., Anhold, John A., Hayes, Christopher 05 1900 (has links)
12 pp. / Pine Bark Beetles, THE PIÑON IPS BARK BEETLE, FIREWOOD AND BARK BEETLES IN THE SOUTHWEST, USING INSECTICIDES TO PREVENT BARK BEETLE ATTACKS ON CONIFERS, GUIDELINES FOR THINNING PONDEROSA PINE FOR IMPROVED FOREST HEALTH AND FIRE PREVENTION / Various techniques to reduce brood production of Ips and Dendroctonus spp. in ponderosa pine slash are discussed.
54

Coding Strategies for X-ray Tomography

Holmgren, Andrew January 2016 (has links)
This work focuses on the construction and application of coded apertures to compressive X-ray tomography. Coded apertures can be made in a number of ways, each method having an impact on system background and signal contrast. Methods of constructing coded apertures for structuring X-ray illumination and scatter are compared and analyzed. Apertures can create structured X-ray bundles that investigate specific sets of object voxels. The tailored bundles of rays form a code (or pattern) and are later estimated through computational inversion. Structured illumination can be used to subsample object voxels and make inversion feasible for low dose computed tomography (CT) systems, or it can be used to reduce background in limited angle CT systems.

On the detection side, coded apertures modulate X-ray scatter signals to determine the position and radiance of scatter points. By forming object dependent projections in measurement space, coded apertures multiplex modulated scatter signals onto a detector. The multiplexed signals can be inverted with knowledge of the code pattern and system geometry. This work shows two systems capable of determining object position and type in a 2D plane, by illuminating objects with an X-ray "fan beam," using coded apertures and compressive measurements. Scatter tomography can help identify materials in security and medicine that may be ambiguous with transmission tomography alone. / Dissertation
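
The computational inversion described above is conventionally posed as a linear measurement model with a sparsity prior; a generic sketch (notation illustrative, not taken from the dissertation):

```latex
\[
  \mathbf{g} = \mathbf{H}\mathbf{f} + \mathbf{n},
  \qquad
  \hat{\mathbf{f}} = \arg\min_{\mathbf{f}\ge 0}\;
  \lVert \mathbf{g} - \mathbf{H}\mathbf{f} \rVert_2^2 + \lambda\,\Phi(\mathbf{f})
\]
```

where the rows of H encode the code pattern and system geometry, g collects the multiplexed detector measurements, n is noise, and Φ is a sparsity-promoting regularizer of the kind compressive measurement schemes rely on.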
55

Investigation and Development of a Fully 3D Tilt Capable Hybrid SPECT - CT System for Dedicated Breast Imaging

Shah, Jainil January 2015 (has links)
X-ray mammography has been the gold standard for breast imaging for decades, despite the significant limitations posed by two dimensional (2D) image acquisitions. Difficulty in diagnosing lesions close to the chest wall and axilla, a high degree of structural overlap and patient discomfort due to compression are only some of these limitations. To overcome these drawbacks, three dimensional (3D) breast imaging modalities have been developed, including dual modality single photon emission computed tomography (SPECT) and computed tomography (CT) systems. This thesis focuses on the development and integration of the next generation of such a device for dedicated breast imaging. The goals of this dissertation work are to: [1] understand and characterize any effects of fully 3D trajectories on reconstructed image scatter correction, absorbed dose and Hounsfield Unit accuracy, and [2] design, develop and implement the fully flexible, third generation hybrid SPECT-CT system, whose imaging chains can traverse complex 3D orbits about a pendant breast volume without interfering with each other. Such a system would overcome artifacts resulting from incompletely sampled divergent cone beam imaging schemes and allow imaging closer to the chest wall, which other systems currently under research and development elsewhere cannot achieve.

The dependence of x-ray scatter radiation on object shape, size, material composition and CT acquisition trajectory was investigated with a well-established beam stop array (BSA) scatter correction method. While the 2D scatter to primary ratio (SPR) was the main metric used to characterize total system scatter, a new metric called ‘normalized scatter contribution’ was developed to compare the results of scatter correction on 3D reconstructed volumes. Scatter estimation studies were undertaken with a sinusoidal saddle (±15° polar tilt) orbit and a traditional circular (AZOR) orbit. Clinical studies to acquire data for scatter correction were used to evaluate the 2D SPR on a small set of patients scanned with the AZOR orbit. Clinical SPR results showed a clear dependence of scatter on breast composition and glandular tissue distribution, otherwise consistent with the overall phantom-based size and density measurements. Additionally, SPR dependence on the acquisition trajectory was also observed, with 2D scatter increasing with the polar tilt angle of the system.

The dose delivered by any imaging system is of primary importance from the patient's point of view, and therefore trajectory related differences in the dose distribution in a target volume were evaluated. Monte Carlo simulations as well as physical measurements using radiochromic film were undertaken using saddle and AZOR orbits. Results illustrated that both orbits deliver comparable dose to the target volume, and only slightly differ in distribution within the volume. Simulations and measurements showed similar results, and all measured dose values were within the standard screening mammography-specific 6 mGy dose limit, which is used as a benchmark for dose comparisons.

Hounsfield Units (HU) are used clinically to differentiate tissue types in a reconstructed CT image, and therefore the HU accuracy of a system is very important, especially when using non-traditional trajectories. Uniform phantoms filled with various uniform density fluids were used to investigate differences in HU accuracy between saddle and AZOR orbits. Results illustrate the considerably better performance of the saddle orbit, especially close to the chest and nipple region of what would clinically be a pendant breast volume. The AZOR orbit causes shading artifacts near the nipple due to insufficient sampling, rendering a major portion of the scanned phantom unusable, whereas the saddle orbit performs exceptionally well and provides a tighter distribution of HU values in reconstructed volumes.

Finally, the third generation, fully-suspended SPECT-CT system was designed and developed in our lab. A novel mechanical method using a linear motor was developed for tilting the CT system. A new x-ray source and a custom made 40 × 30 cm² detector were integrated into this system. The SPECT system was nested in the center of the gantry, orthogonal to the CT source-detector pair. The SPECT system tilts on a goniometer, and the newly developed CT tilting mechanism allows ±15° maximum polar tilt of the CT system. The entire gantry is mounted on a rotation stage, allowing complex arbitrary trajectories for each system, without interference from the other, while maintaining a common field of view. This hybrid system shows potential for clinical use as a diagnostic tool for dedicated breast imaging. / Dissertation
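
For reference, the Hounsfield scale discussed above is defined relative to the linear attenuation coefficients of water and air (standard CT convention, not specific to this work):

```latex
\[
  \mathrm{HU} = 1000 \times
  \frac{\mu - \mu_{\mathrm{water}}}{\mu_{\mathrm{water}} - \mu_{\mathrm{air}}}
\]
```

so sampling-induced shading artifacts shift the apparent attenuation coefficient μ and thereby widen the HU distribution that the saddle orbit keeps tight.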
56

The scatter search metaheuristic in vehicle routing problems with simultaneous delivery and pickup: application in the Brazilian Air Force

Mesquita, Antônio Célio Pereira de 08 April 2010 (has links)
This work deals with the solution to the problem of drawing up transport schedules in the material distribution system of the Brazilian Air Force (BAF). These transport schedules consist in defining the routes for material pickup and delivery to be accomplished simultaneously at each delivery/pickup location from a distribution center, considering a homogeneous fleet of vehicles. This is characteristic of a Vehicle Routing Problem with Simultaneous Delivery and Pickup (VRPSDP). The management of the physical distribution system of the BAF considers the complexity of this system and the data regarding the cargo transport demands at each of those locations to draw up transport schedules. These schedules are drawn up with regard to the capacity limits of the vehicles, the physical characteristics of the cargoes and the shipping priorities. Good visibility of transport demands at each location is available to the manager of this system, but due to the great quantity of data to handle and the high complexity of the physical distribution system of the BAF, it is impossible to draw up, manually, transport schedules that result in efficient distribution trips. The VRPSDP was solved by means of the Scatter Search metaheuristic integrated with the Variable Neighborhood Descent metaheuristic as the solution improvement method. The results exceeded or equaled some of those obtained by other authors on the same test problems with the same restrictions, which indicates that the implemented Scatter Search is competitive for solving the VRPSDP. As for the application in the BAF, the results showed that the solution method developed will produce transport schedules in short processing times, and that these will have a positive impact on the efficiency of the material distribution system of the BAF.
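
A compact sketch of the scatter search template this record applies, shown on a toy single-vehicle routing objective; the reference-set sizes, the combination rule, and the 2-swap improvement step (standing in for the dissertation's Variable Neighborhood Descent) are all illustrative assumptions:

```python
# Scatter search skeleton on a toy routing problem: build a diverse pool,
# keep a reference set of good + diverse solutions, combine pairs, improve.
import itertools
import random

random.seed(42)
POINTS = [(2, 3), (5, 1), (1, 7), (6, 4), (3, 8), (8, 2)]  # toy stops

def tour_cost(perm):
    # Cost of visiting POINTS in the given order from a depot at (0, 0).
    route = [(0, 0)] + [POINTS[i] for i in perm] + [(0, 0)]
    return sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in zip(route, route[1:]))

def improve(perm):
    # Improvement method: first-improvement 2-swap descent (simple stand-in
    # for the VND used in the dissertation).
    perm, best = list(perm), tour_cost(perm)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            c = tour_cost(perm)
            if c < best - 1e-12:
                best, improved = c, True
            else:
                perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm

def combine(a, b):
    # Keep half of parent a's positions; fill the rest in parent b's order.
    keep = set(random.sample(range(len(a)), len(a) // 2))
    kept = {a[i] for i in keep}
    fill = iter(g for g in b if g not in kept)
    return [a[i] if i in keep else next(fill) for i in range(len(a))]

def distance(p, refset):
    # Diversity of p w.r.t. the reference set: minimum Hamming distance.
    return min(sum(x != y for x, y in zip(p, q)) for q in refset)

def scatter_search(iters=20, pool=30, n_quality=4, n_diverse=2):
    sols = sorted((improve(random.sample(range(len(POINTS)), len(POINTS)))
                   for _ in range(pool)), key=tour_cost)
    refset = sols[:n_quality]                       # best by quality
    rest = sorted(sols[n_quality:], key=lambda q: -distance(q, refset))
    refset += rest[:n_diverse]                      # most diverse of the rest
    for _ in range(iters):
        children = [improve(combine(a, b))
                    for a, b in itertools.combinations(refset, 2)]
        refset = sorted(refset + children, key=tour_cost)[:n_quality + n_diverse]
    return refset[0]

best = scatter_search()
print("best order:", best, "cost: %.3f" % tour_cost(best))
```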
57

Space-time sampling strategies for electronically steerable incoherent scatter radar

Swoboda, John Philip 10 March 2017 (has links)
Incoherent scatter radar (ISR) systems allow researchers to peer into the ionosphere via remote sensing of intrinsic plasma parameters. ISR sensors have been used since the 1950s and, until the past decade, were mainly equipped with a single mechanically steerable antenna. As such, the ability to develop a two or three dimensional picture of the plasma parameters in the ionosphere has been constrained by the relatively slow mechanical steering of the antennas. A newer class of systems using electronically steerable array (ESA) antennas has removed this constraint, allowing researchers to create 3-D reconstructions of plasma parameters. There have been many studies associated with reconstructing 3-D fields of plasma parameters, but no systematic analysis of the sampling issues that arise, nor of how to reconstruct these plasma parameters in an optimal sense rather than through different forms of interpolation. The research presented here forms a framework that scientists and engineers can use to plan experiments with ESA ISR capabilities and to better analyze the resulting data. This framework attacks the problem of space-time sampling by ESA ISR systems from the point of view of signal processing, simulation and inverse-theoretic image reconstruction. We first describe a physics based model of incoherent scatter from the ionospheric plasma, along with the processing methods needed to create the plasma parameter measurements. Our approach leads to the development of the space-time ambiguity function, forming a theoretical foundation of the forward model for ISR. This forward model is novel in that it takes into account the shape of the antenna beam and the scanning method, along with integration time, to develop the proper statistics for a desired measurement precision. Once the forward model is developed, we present the simulation method behind the Simulator for ISR (SimISR). SimISR takes input plasma parameters over space and time and creates complex voltage samples in a form similar to that produced by a real ISR system, allowing researchers to evaluate different experiment configurations in order to efficiently and accurately sample specific phenomena. We present example simulations using input conditions derived from a multi-fluid ionosphere model and reconstructions using standard interpolation techniques. Lastly, methods are presented to invert the space-time ambiguity function using techniques from the image reconstruction literature. These methods are tested using SimISR to quantify accurate plasma parameter reconstruction over a simulated ionospheric region.
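
A generic statement of the inverse problem the last sentences describe, with the ambiguity function acting as the forward operator (notation illustrative; the thesis's actual formulation may differ):

```latex
\[
  y = \mathcal{A}\,\theta + e,
  \qquad
  \hat{\theta} = \arg\min_{\theta}\;
  \lVert y - \mathcal{A}\,\theta \rVert_2^2 + \lambda\, R(\theta)
\]
```

where 𝒜 is the space-time ambiguity operator mapping plasma-parameter fields θ(t, x) to radar measurements, e is measurement noise, and R stands in for whichever regularizer the chosen image-reconstruction technique supplies.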
58

Development of algorithms for variographic analysis and modeling

Drumond, David Alvarenga January 2016 (has links)
Spatial continuity analysis includes a series of tools to estimate and model the continuity of regionalized random variables. It is the basis for many geostatistical mineral deposit evaluation methods. The fitted model is of paramount importance and influences the results of many subsequent kriging and simulation algorithms. Both commercial and academic software can be improved in graphics, user interactivity and automated tools for modeling spatial continuity. SGeMS (Stanford Geostatistical Modeling Software) is a freeware program used in the geostatistical community which has great potential for development; however, it does not yet incorporate a full set of spatial continuity analysis tools. Unlike SGeMS, GSLIB is a good and more complete free library for geostatistical analysis, but the program inputs are modified by editing .txt files and using DOS command lines, which makes the software less user friendly, despite its robustness and quality. Given the limitations of the two most used and complete freeware geostatistical packages, this dissertation aims at transcribing and adapting an algorithm from GSLIB (GamV.f) into the SGeMS software, adapting the programming logic to create different auxiliary tools such as h-scatterplots and variogram and covariogram maps. The results demonstrated that the adaptation of old and stable algorithms leads to an inexpensive solution. Furthermore, an algorithm was developed for optimizing variogram modeling based on the weighted least squares method. The routines were developed in both C++ and Python. The algorithms were validated against values generated by GSLIB. All developed plug-ins were tested and validated using two illustrative case studies: an iron ore deposit and a polymetallic one. The results proved to be consistent and similar to those obtained by well-known commercial software.
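
A sketch of the two pieces this record describes: a GamV-style experimental semivariogram and a weighted least squares fit of a spherical model. The synthetic data, lag settings and weighting scheme are illustrative assumptions, not the dissertation's implementation:

```python
# Experimental semivariogram (GamV-style lag binning) plus a WLS fit
# of a spherical variogram model to the binned values.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic 1-D regionalized variable sampled along a line.
x = np.sort(rng.uniform(0, 100, 200))
z = np.sin(x / 10.0) + 0.2 * rng.standard_normal(x.size)

def experimental_variogram(x, z, lag, nlags, tol=0.5):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs whose separation
    falls within k*lag +/- tol, for k = 1..nlags."""
    h = np.abs(x[:, None] - x[None, :])
    d2 = 0.5 * (z[:, None] - z[None, :]) ** 2
    lags, gammas, npairs = [], [], []
    for k in range(1, nlags + 1):
        mask = np.triu(np.abs(h - k * lag) <= tol, 1)  # each pair once
        if mask.any():
            lags.append(h[mask].mean())
            gammas.append(d2[mask].mean())
            npairs.append(mask.sum())
    return np.array(lags), np.array(gammas), np.array(npairs)

def spherical(h, nugget, sill, a):
    """Spherical variogram model with range a."""
    hr = np.minimum(h / a, 1.0)
    return nugget + sill * (1.5 * hr - 0.5 * hr ** 3)

lags, gammas, n = experimental_variogram(x, z, lag=2.0, nlags=20)

# Weighted least squares: down-weight lags with few pairs
# (sigma ~ 1/sqrt(n) gives pair-count-proportional weights).
popt, _ = curve_fit(spherical, lags, gammas,
                    p0=[0.01, gammas.max(), 30.0],
                    sigma=1.0 / np.sqrt(n))
print("nugget=%.3f sill=%.3f range=%.1f" % tuple(popt))
```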
59

New applications of metaheuristics to the solution of the electric energy transmission system expansion planning problem

Taglialenha, Silvia Lopes de Sena. January 2008 (has links)
Advisor: Rubén Augusto Romero Lázaro / Committee: José Roberto Sanches Mantovani / Committee: Antonio Padilha Feltrin / Committee: Luiz Carlos Pereira da Silva / Committee: Eduardo Nobuhiro Asada / Abstract: The electric energy transmission network expansion planning problem consists in choosing, from a pre-defined set of candidate circuits, those that must be incorporated into the system so as to minimize investment and operation costs and meet future energy demand reliably over a planning horizon, assuming the generation plan is known. It is a very complex and difficult problem because it is mixed-integer nonlinear, non-convex, multimodal and highly combinatorial. This problem has been solved using traditional techniques such as Benders decomposition and branch and bound, as well as heuristic algorithms and metaheuristics, with diverse results but a series of drawbacks such as high computational effort and poor convergence. This work tests two new solution techniques for the problem, namely the metaheuristics Variable Neighborhood Search and Scatter Search. Variable Neighborhood Search is a technique based on systematically changing neighborhood structures within a local search algorithm, and the Scatter Search metaheuristic is an evolutionary method that systematically combines sets of solutions to obtain better ones. These techniques offer new alternatives that address the problems encountered with other methods, namely low computational effort and better convergence, which is the main contribution of this work. The techniques are presented systematically, explaining their algorithms and how they are adapted to solve the network expansion planning problem under the mathematical formulations known as the transportation model and the DC model. They are tested on the Southern Brazilian 46-bus system and the IEEE 24-bus system; results are compared with those obtained with other metaheuristics, with excellent results and better performance in both processing speed and computational effort. / Doctorate
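
The neighborhood-changing idea named here (and used as the improvement method in record 56) reduces to a simple loop over neighborhood structures; a toy Variable Neighborhood Descent sketch, with an illustrative objective and moves that are assumptions rather than the thesis's:

```python
# Variable Neighborhood Descent: cycle through a list of neighborhood moves,
# restarting from the first whenever any move improves the solution.
import itertools
import random

random.seed(1)
weights = [random.uniform(1, 10) for _ in range(12)]

def cost(sol):
    # Toy objective: total weight jump between consecutive items
    # (minimized when the permutation sorts the weights).
    return sum(abs(weights[a] - weights[b]) for a, b in zip(sol, sol[1:]))

def swap_moves(sol):
    for i, j in itertools.combinations(range(len(sol)), 2):
        s = sol[:]
        s[i], s[j] = s[j], s[i]
        yield s

def insert_moves(sol):
    for i in range(len(sol)):
        for j in range(len(sol)):
            if i != j:
                s = sol[:]
                s.insert(j, s.pop(i))
                yield s

def reverse_moves(sol):
    for i, j in itertools.combinations(range(len(sol)), 2):
        yield sol[:i] + sol[i:j + 1][::-1] + sol[j + 1:]

def vnd(sol, neighborhoods):
    k = 0
    while k < len(neighborhoods):
        best = min(neighborhoods[k](sol), key=cost)
        if cost(best) < cost(sol):
            sol, k = best, 0  # improvement: restart from the first neighborhood
        else:
            k += 1            # neighborhood exhausted: try the next one
    return sol

start = list(range(12))
random.shuffle(start)
result = vnd(start, [swap_moves, insert_moves, reverse_moves])
print("cost %.2f -> %.2f" % (cost(start), cost(result)))
```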
60

29-Day Analysis of Scale Heights and the Inference of the Topside Ionosphere Over Millstone Hill During the 2002 Incoherent Scatter Radar Campaign

Meehan, Jennifer L 01 August 2017 (has links)
Ionospheric scale height is a measure of the topside altitude dependence of electron density and is a key ionospheric parameter due to its intrinsic connection to ionospheric dynamics, plasma temperature, and composition. A longtime problem has been that information on the bottomside ionospheric profile is readily available, but observation of the topside ionosphere is still challenging. Despite numerous techniques for characterizing the topside ionosphere, knowledge of its behavior and the resulting scale heights remains insufficient. The goal of this study is to evaluate whether we can characterize the topside ionospheric density and temperature profiles, using a cost-effective method, in the event that neither temperature nor electron density is measured. In a simple model, the electron density in the F-region topside decreases exponentially with height. This exponential decay is mainly driven by thermal diffusive equilibrium, but it also depends on the dominant ion species, as well as on other drivers during nondiffusive conditions. A scale height based on observations of the temperature can generate topside electron density profiles, while a measured electron density profile enables a scale height, and hence temperature information, to be inferred. We found a new way to represent how much of the total electron content (TEC) is allotted to the topside ionosphere, and used this information to successfully determine TEC from ionosonde data containing only bottomside electron density information. For the first time, slab thickness, which is directly proportional to scale height, was found to be correlated with the peak density height and was introduced as a new index, k. Ultimately, k relates electron density parameters and can be a very useful tool for describing the topside ionosphere shape and, subsequently, scale height. The methodology of combining cost-effective, readily available ionosonde bottomside electron density data with GPS TEC was found to be capable of inferring the topside ionosphere. This was verified with incoherent scatter radar (ISR) data, though major issues with the availability of ionogram data during nighttime hours greatly limited our study, especially during diffusive equilibrium conditions. Also, significant differences were found between ISR- and ionosonde-determined peak density parameters, NmF2 and hmF2, raising concerns about how the instruments were calibrated.
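
The exponential topside model and the slab-thickness relation invoked above can be written explicitly (standard ionospheric conventions; the symbols are not quoted from the thesis):

```latex
\[
  N_e(h) = N_m F2 \,\exp\!\left(-\frac{h - h_m F2}{H}\right),
  \qquad
  \tau = \frac{\mathrm{TEC}}{N_m F2}
\]
```

where H is the scale height (under diffusive equilibrium H = k_B(T_e + T_i)/(m_i g) for dominant ion mass m_i) and τ is the slab thickness, which, being directly proportional to scale height, is what links bottomside ionosonde parameters and GPS TEC to the topside shape.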
