201

Modelos digitais: o ensino de sistemas estruturais para Arquitetura e Urbanismo / Digital models: teaching structural systems to Architecture and Urbanism

Cunto, Ivanóe De 04 March 2016 (has links)
Given the great complexity of the design process in Architecture and Urbanism and the many factors inherent to it, this thesis examines whether schools and teachers are prepared, from the standpoint of structural knowledge, to face this reality in the education of architects. To that end, the thesis analyzed final undergraduate projects (TFG), the introductory Structural Systems courses, and the procedures adopted by their teachers at two schools of Architecture and Urbanism in Londrina, Paraná: a public university and a private university center. The findings show that the whole process of architectural education requires significant changes, including updates to curricula and to the procedures adopted by teachers, given the gaps observed during the research. The thesis first proposes a new approach, allied to calculation, that gives the student a realistic visualization of the various structural systems, supported by digital models of real buildings. Applied to students in the introductory courses, this procedure yielded positive results in terms of knowledge acquired. In the following terms, however, monitoring of a group of these students showed that their work in the subsequent Architectural Design courses did not draw on this acquired knowledge. A survey of the design teachers revealed the difficulty the teachers themselves faced in advising on structural questions. Beyond the initial use of digital models, therefore, further changes are needed in the curriculum and pedagogical proposal of each institution studied, so that the knowledge acquired through digital models becomes permanent.
To ground these pedagogical and curricular changes, the thesis builds on two foundations: the Theory of Multiple Intelligences of the American psychologist Howard Gardner, with a focus on spatial intelligence, and Bloom's Taxonomy of the cognitive domain, developed by Benjamin Samuel Bloom, which assesses learning through six hierarchical categories, from the simplest to the most complex: knowledge, comprehension, application, analysis, synthesis, and evaluation. The research concludes that the courses studied should provide knowledge of structural systems through updated and more effective pedagogical projects, together with teachers committed to modern teaching methodologies, so that students gain a broad understanding of the content of their courses and the structural knowledge acquired early on is actually applied in the subsequent courses.
202

Detekce a monitoring potenciálně toxických sinicových lipopeptidů / Detection and monitoring of potentially toxic cyanobacterial lipopeptides

BÁRTOVÁ, Marie January 2019 (has links)
The aim of this study was to design and optimize new PCR primers for the detection of potential cyanobacterial producers of the cytotoxic lipopeptides puwainaphycins and minutissamides in environmental samples. Samples from two distinct localities, suggested by preliminary data, were tested. The first set consisted of cyanobacterial soil biofilms from sheep pastures affected by alveld disease in Norway. The other contained samples of planktic cyanobacterial blooms from the Třeboň Protected Landscape Area and its vicinity. Three different approaches were used to evaluate the presence of cyanobacterial lipopeptide producers: microscopy, PCR with the designed primers, and liquid chromatography-mass spectrometry analysis. The results confirmed the specificity of the newly designed PCR primers, and the presence of puwainaphycin/minutissamide producers was proven at both tested localities.
203

Os concursos vestibulares das universidades estaduais paulistas e o ensino de Química no nível médio / The entrance exams of the São Paulo state universities and the teaching of Chemistry at the secondary level

Amauro, Nicéa Quintino 17 September 2010 (has links)
This research studied the influence of the final assessment system on Chemistry teaching practices at the secondary level. To that end, we identified and characterized the level of understanding of chemical knowledge demanded of Brazilian secondary-school graduates in the selection of future undergraduates. We focused our investigation on the Chemistry questions of the second phase of the entrance exams of the Universidade de São Paulo and the Universidade de Campinas, as well as on the specific-knowledge exams for careers in the Biological Sciences and Exact Sciences at the Universidade Estadual Paulista, together with the corresponding performance reports, between 1998 and 2008.
The methodology analyzes the questions along three axes: (1) the Chemistry teaching topic, based on the 1998 Curriculum Proposal for Chemistry teaching of the state of São Paulo; (2) the cognitive process, with Bloom's Taxonomy as the reference; and (3) the candidates' average performance on the Chemistry questions. Triangulation of the data made clear the normative, guiding, and controlling character of these exams over the teaching system that precedes them.
204

Physical Mechanisms Driving Harmful Algal Blooms Along the Texas Coast

Ogle, Marcus 1982- 14 March 2013 (has links)
Commonly referred to as “red tide”, harmful algal blooms (HABs) formed by Karenia brevis occur frequently in the Gulf of Mexico (GOM). A bloom is defined as a cell abundance >10⁵ cells L⁻¹. This thesis focuses primarily on Karenia brevis, formerly known as Gymnodinium breve, in the Gulf of Mexico. K. brevis is harmful because it produces brevetoxin, a ladder-frame polyether that acts as a potent neurotoxin in vertebrates. K. brevis commonly causes fish kills, respiratory irritation in humans, and Neurotoxic Shellfish Poisoning (NSP) if ingested. Blooms of K. brevis occur almost annually along the West Florida Shelf (WFS) in the late summer and early fall, when the coastal current is favorable for bloom initiation. Along the Texas-Louisiana shelf (TLS), however, blooms of K. brevis are infrequent and sporadic. While much is known of the blooms along the WFS due to their frequent presence, little is known of the mechanisms driving the blooms along the TLS due to their inconsistent presence. To understand the stochastic nature of HABs along the TLS, historical data on bloom occurrences from 1996 to the present were compared with wind, sea-level pressure, and air and water temperature data from NOAA station PTAT2 and with NCEP NARR-A sea-level pressure data. The difference in the monthly-mean along-shore component of the wind was statistically significant between bloom and non-bloom years in September (p<<0.001) and April (p=0.0015), with bloom years having a strong downcoast current. Monthly-mean water temperature values yielded similar results: both March and September monthly-mean water temperatures were lower during non-bloom years, with p-values of 0.01 and 0.048, respectively. These results suggest the possibility of forecasting HABs along the TLS with currently measured, publicly available data.
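The kind of bloom-year versus non-bloom-year comparison reported above can be illustrated with a simple two-sample permutation test on the difference of means. The wind values below are invented stand-ins, not NOAA PTAT2 measurements, and the thesis does not state which significance test it used; this is only a sketch of how such a p-value can be obtained.

```python
import random
from statistics import mean

def permutation_test(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Returns the fraction of random relabelings whose absolute mean
    difference is at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(mean(pooled[:len(a)]) - mean(pooled[len(a):])) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical monthly-mean along-shore wind components (m/s);
# negative = downcoast. Values are illustrative, not real data.
bloom_years = [-2.1, -1.8, -2.5, -1.9, -2.3]
nonbloom_years = [-0.4, -0.9, -0.2, -0.7, -0.5, -0.6]

p = permutation_test(bloom_years, nonbloom_years)
print(f"p = {p:.4f}")
```

With a clear separation between the groups, as here, only a handful of the random relabelings reach the observed difference, so the returned p-value is small.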
205

Space-efficient data sketching algorithms for network applications

Hua, Nan 06 July 2012 (has links)
Sketching techniques are widely adopted in network applications. Sketching algorithms “encode” data into succinct data structures that can later be accessed and “decoded” for various purposes, such as network measurement, accounting, and anomaly detection. Bloom filters and counter braids are two well-known representatives in this category. Such sketching algorithms usually need to strike a tradeoff between performance (how much information can be revealed, and how fast) and cost (storage, transmission, and computation). This dissertation is dedicated to the research and development of several sketching techniques, including improved forms of stateful Bloom filters, statistical counter arrays, and error estimating codes. A Bloom filter is a space-efficient randomized data structure for approximately representing a set in order to support membership queries. Bloom filters and their variants have found widespread use in many networking applications where it is important to minimize the cost of storing and communicating network data. In this thesis, we propose a family of Bloom filter variants augmented by a rank-indexing method. We show that such augmentation brings a significant reduction in space and in the number of memory accesses, especially when deletions of set elements from the Bloom filter need to be supported. An exact active counter array is another important building block in many sketching algorithms, where the storage cost of the array is of paramount concern. Previous approaches reduce the storage costs while either losing accuracy or supporting only passive measurements. In this thesis, we propose an exact statistics counter array architecture that supports active measurements (real-time reads and writes). It also leverages the aforementioned rank-indexing method and exploits statistical multiplexing to minimize the storage costs of the counter array.
Error estimating coding (EEC) has recently been established as an important tool to estimate bit error rates in the transmission of packets over wireless links. In essence, the EEC problem is also a sketching problem, since the EEC codes can be viewed as a sketch of the packet sent, which is decoded by the receiver to estimate the bit error rate. In this thesis, we first investigate the asymptotic bound of error estimating coding by viewing the problem from a two-party computation perspective, and then investigate its coding/decoding efficiency using Fisher information analysis. Further, we develop several sketching techniques, including the enhanced tug-of-war (EToW) sketch and the generalized EEC (gEEC) sketch family, which achieve around a 70% reduction in sketch size with similar estimation accuracy. For all the solutions proposed above, we use theoretical tools such as information theory and communication complexity to investigate how far our proposed solutions are from the theoretical optimum. We show that the proposed techniques are asymptotically or empirically very close to the theoretical bounds.
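The membership-query behavior that makes Bloom filters attractive in these settings can be sketched in a few lines of Python. This is a toy version: the parameters m and k are illustrative, a salted SHA-256 stands in for k independent hash functions, and it supports no deletions, unlike the rank-indexed variants the dissertation proposes.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k salted hashes over an m-bit array.

    Supports insertion and membership queries; queries may return
    false positives but never false negatives."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = [0] * m

    def _indexes(self, item):
        # Derive k bit positions by salting the hash input with i.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = 1

    def __contains__(self, item):
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
for flow in ["10.0.0.1:80", "10.0.0.2:443"]:   # illustrative flow keys
    bf.add(flow)
print("10.0.0.1:80" in bf)    # True: no false negatives
print("192.168.0.9:22" in bf)  # almost certainly False
```

The tradeoff that buys the space savings is visible in the query: a lookup answers "yes" whenever all k probed bits happen to be set, so a small false-positive rate is accepted in exchange for storing no keys at all.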
206

Improving The Communication Performance Of I/O Intensive And Communication Intensive Applications In Cluster Computer Systems

Kumar, V Santhosh 10 1900 (has links)
Cluster computer systems assembled from commodity off-the-shelf components have emerged as a viable and cost-effective alternative to high-end custom parallel computer systems. In this thesis, we investigate how scalable performance can be achieved for database systems on clusters. In this context we specifically considered database query processing, evaluating bottlenecks and suggesting optimization techniques for obtaining scalable application performance. First, we systematically demonstrated that in a large cluster with high disk bandwidth, the processing capability and the I/O bus bandwidth are the two major performance bottlenecks in database systems. To identify and assess the bottlenecks, we developed a Petri net model of parallel query execution on a cluster. Once they were identified and assessed, we addressed these two performance bottlenecks by offloading certain application-related tasks to the processor in the network interface card. Offloading application tasks to the processor in the network interface card shifts the bottleneck from the cluster processor to the I/O bus. Further, we propose a hardware scheme, network-attached disks, and a software scheme to achieve a balanced utilization of resources such as the host processor, the I/O bus, and the processor in the network interface card. The proposed schemes result in a speedup of up to 1.47 compared to the base scheme, and ensure scalable performance up to 64 processors. Encouraged by the benefits of offloading application tasks to network processors, we explore the possibility of performing Bloom filter operations in network processors. Combining offloaded Bloom filter operations with the proposed hardware schemes achieves up to a 50% reduction in execution time. The latter part of the thesis presents introductory experiments conducted on the Community Atmospheric Model (CAM), a large-scale parallel application used for global weather and climate prediction.
CAM is a communication-intensive application that involves collective communication of large messages. In our limited experiments, we selected CAM to study the effect of compression techniques and offloading techniques (as formulated for the database workloads) on the performance of communication-intensive applications. Due to time constraints, we considered only compression techniques for improving application performance; offloading could be taken up as a full-fledged research problem for further investigation. In our experiments, we found that compressing messages reduces message latencies and hence improves the execution time and scalability of the application. Without compression, performance measured on a 64-processor cluster resulted in a speedup of only 15.6. While lossless compression retains the accuracy and correctness of the program, it does not achieve high compression. We therefore propose a lossy compression technique that achieves higher compression while retaining the accuracy and numerical stability of the application, leading to a speedup of 31.7 on 64 processors compared to a speedup of 15.6 without message compression. We establish that the accuracy, within a prescribed limit of variation, and the numerical stability of CAM are retained under lossy compression.
207

Comparaison de novo de données de séquençage issues de très grands échantillons métagénomiques : application sur le projet Tara Oceans / De novo comparison of sequencing data from very large metagenomic samples: application to the Tara Oceans project

Maillet, Nicolas 19 December 2013 (has links) (PDF)
Metagenomics studies the genetic and genomic content of samples taken from natural environments. This recent discipline examines the genomes of the different organisms present in the same habitat, and it raises new questions from both a biological and a computational point of view: the volume of data generated by metagenomic studies and the complexity of the environments studied call for new dedicated data structures and algorithms. Among the existing approaches, comparative metagenomics compares several metagenomes to measure their degrees of similarity. When the comparison relies solely on the raw content of the samples, without external knowledge, it is called de novo comparative metagenomics. The goal of this work is to develop a method for extracting the similar sequences between two metagenomic datasets, where each dataset may contain hundreds of millions of short reads. The proposed comparison identifies the sequences of a first dataset that are similar to at least one sequence of a second dataset. To be fast and memory-frugal, the implementation of our method required the design of a new indexing structure based on the Bloom filter. The resulting software, named Compareads, has a low memory footprint (on the order of a few gigabytes) and can compute the intersection of two samples of 100 million reads each in about ten hours. Our method is a heuristic with a low false-positive rate. Compareads is dedicated to the analysis of large metagenomic datasets and is currently the only tool able to compare datasets of this size. It has been applied to several metagenomic projects.
Our tool produces robust, biologically meaningful results that agree with fundamentally different methods. It is currently used intensively on samples from the Tara Oceans expedition, where our method has shown that the large oceanic systems influence the global distribution of marine micro-organisms.
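The core comparison Compareads performs, flagging a read of one set that shares at least t k-mers with the reads of another set, can be sketched as follows. A plain Python set stands in for the Bloom-filter index the real tool uses to keep memory low, and the reads, k, and t below are illustrative.

```python
def kmers(seq, k):
    """All length-k substrings of a sequence, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similar_reads(set_a, set_b, k=4, t=2):
    """Return the reads of set_a sharing at least t k-mers with set_b.

    A plain set replaces the Bloom-filter index used by Compareads,
    which trades exactness for a much smaller memory footprint."""
    index = set()
    for read in set_b:
        index |= kmers(read, k)
    return [r for r in set_a if len(kmers(r, k) & index) >= t]

# Toy reads; real inputs are hundreds of millions of sequencing reads.
reads_a = ["ACGTACGTAC", "TTTTGGGGCC", "ACGTTTTTTT"]
reads_b = ["GGACGTACGA", "CCCCCCAAAA"]
print(similar_reads(reads_a, reads_b))  # ["ACGTACGTAC"]
```

The threshold t filters out chance single k-mer matches: the third read of `reads_a` shares one k-mer (`ACGT`) with the index but is not reported.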
208

The Influence of Genetic Variation on Susceptibility of Common Bottlenose Dolphins (<italic>Tursiops truncatus</italic>) to Harmful Algal Blooms

Cammen, Kristina Marstrand January 2014 (has links)
<p>The capacity of marine organisms to adapt to natural and anthropogenic stressors is an integral component of ocean health. Harmful algal blooms (HABs), which are one of many growing threats in coastal marine ecosystems, represent a historically present natural stressor that has recently intensified and expanded in geographic distribution partially due to anthropogenic activities. In the Gulf of Mexico, HABs of <italic>Karenia brevis</italic> occur almost annually and produce neurotoxic brevetoxins that have been associated with large-scale mortality events of many marine species, including the common bottlenose dolphin (<italic>Tursiops truncatus</italic>). The factors resulting in large-scale dolphin mortality associated with HABs are not well understood, particularly with regard to the seemingly different impacts of HABs in geographically disjunct dolphin populations. My dissertation investigates a genetic basis for resistance to HABs in bottlenose dolphins in central-west Florida and the Florida Panhandle. I used both genome-wide and candidate gene approaches to analyze genetic variation in dolphins that died putatively due to brevetoxicosis and live dolphins from the same geographic areas that survived HAB events. Using restriction site-associated DNA sequencing, I identified genetic variation that suggested both a common genetic basis for resistance to HABs in bottlenose dolphins across the Gulf coast of Florida and regionally specific resistance. Many candidate genes involved in the immune, nervous, and detoxification systems were found in close genomic proximity to survival-associated polymorphisms throughout the bottlenose dolphin genome. I further investigated two groups of candidate genes: nine voltage-gated sodium channel genes selected because of their putative role in brevetoxin binding, and four major histocompatibility complex (MHC) loci selected because of their genomic proximity to a polymorphism exhibiting a strong association with survival.
I found little variation in the sodium channel genes and conclude that bottlenose dolphins have not evolved resistance to HABs via mutations in the toxin binding site. The immunologically relevant MHC loci were highly variable and exhibited patterns of genetic differentiation among geographic regions that differed from neutral loci; however, genetic variation at the MHC also could not fully explain variation in survival of bottlenose dolphins exposed to HABs. In my final chapter, I consider the advantages and drawbacks of the genome-wide approach in comparison to a candidate gene approach and, as laid out in my dissertation, I recommend using both complementary approaches in future investigations of adaptation in genome-enabled non-model organisms.</p> / Dissertation
209

Scaling Software Security Analysis to Millions of Malicious Programs and Billions of Lines of Code

Jang, Jiyong 01 August 2013 (has links)
Software security is a big data problem. The volume of new software artifacts created far outpaces the current capacity of software analysis. This gap has brought an urgent challenge to our security community: scalability. If our techniques cannot cope with an ever increasing volume of software, we will always be one step behind attackers. Thus developing scalable analysis to bridge the gap is essential. In this dissertation, we argue that automatic code reuse detection enables an efficient data reduction of a high volume of incoming malware for downstream analysis and enhances software security by efficiently finding known vulnerabilities across large code bases. To demonstrate the benefits of automatic software similarity detection, we discuss two representative problems remedied by scalable analysis: malware triage and unpatched code clone detection. First, we tackle the onslaught of malware. Although over one million new malware samples are reported each day, existing research shows that most malware is not written from scratch; instead, it consists of automatically generated variants of existing malware. When groups of highly similar variants are clustered together, new malware stands out more easily. Unfortunately, current systems struggle to handle this high volume of malware. We scale clustering using feature hashing and perform semantic analysis using co-clustering. Our evaluation demonstrates that these techniques are an order of magnitude faster than previous systems and automatically discover highly correlated features and malware groups. Furthermore, we design algorithms to infer evolutionary relationships among malware, which helps analysts understand trends over time and make informed decisions about which malware to analyze first. Second, we address the problem of detecting unpatched code clones at scale. When buggy code gets copied from project to project, eventually all projects will need to be patched.
We call clones of buggy code that have been fixed in only a subset of projects unpatched code clones. Unfortunately, code copying is usually ad hoc and often untracked, which makes it challenging to identify all unpatched vulnerabilities in code bases at the scale of entire OS distributions. We scale unpatched code clone detection to spot over 15,000 latent security vulnerabilities in 2.1 billion lines of code from the Linux kernel, all Debian and Ubuntu packages, and all C/C++ projects on SourceForge, in three hours on a single machine. To the best of our knowledge, this is the largest set of bugs ever reported in a single paper.
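The feature-hashing idea mentioned above, compressing an unbounded feature set into a fixed-size fingerprint so that pairwise similarity stays cheap to compute, can be sketched as follows. The instruction-string features, the fingerprint width m, and the MD5-based hash are all illustrative choices, not the dissertation's actual design.

```python
import hashlib

def hash_features(features, m=1024):
    """Hash an arbitrary feature set into an m-bit integer fingerprint.

    Shared features always set shared bits, so similar feature sets
    yield overlapping fingerprints (at the cost of rare collisions)."""
    bits = 0
    for f in features:
        pos = int(hashlib.md5(f.encode()).hexdigest(), 16) % m
        bits |= 1 << pos
    return bits

def jaccard(a, b):
    """Approximate Jaccard similarity of two bit fingerprints."""
    inter = bin(a & b).count("1")
    union = bin(a | b).count("1")
    return inter / union if union else 0.0

# Made-up instruction features for two hypothetical malware variants
# and one unrelated sample (illustrative strings, not real data).
v1 = {"push ebp", "mov ebp esp", "call 0x401000", "xor eax eax"}
v2 = {"push ebp", "mov ebp esp", "call 0x401000", "inc eax"}
other = {"nop", "ret", "jmp 0xdeadbeef", "int3"}

print(jaccard(hash_features(v1), hash_features(v2)))     # high: 3 shared features
print(jaccard(hash_features(v1), hash_features(other)))  # low: nothing shared
```

Because each fingerprint is a fixed-size integer regardless of how many features a sample has, all-pairs similarity over a large corpus becomes a stream of cheap bitwise operations, which is what makes clustering at this scale feasible.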
210

Potencialiai toksinių planktoninių melsvabakterių erdvinio pasiskirstymo ypatumai šiaurinėje Kuršių marių dalyje / Spatial patterns of potentially toxic planktonic cyanobacteria occurrence in the northern part of the Curonian Lagoon

Vaičiūtė, Diana 23 June 2014 (has links)
Algae, microscopic planktonic organisms, are one of the main components of aquatic ecosystems and the primary producers of organic matter. As the trophic state of a water body increases, algal species diversity decreases and the complex of dominant species changes. Eutrophied water bodies often come to be dominated by prokaryotic autotrophic microorganisms, the cyanobacteria, which cause intensive water "blooms" in lakes, coastal lagoons, seas, and oceans, degrading water quality. In recent decades, toxic phytoplankton algae and cyanobacteria have become the subject of detailed study worldwide: research has shown that half of all water bloom events are toxic (RAPALA, LAHTI, 2002), and monitoring programs assess the development trends of toxic algae and cyanobacteria, their dependence on environmental conditions, and the causes of toxic blooms, while chemical and genetic methods are used to determine the composition of the toxins and their effects on living organisms. The Curonian Lagoon is a shallow transitional water basin in the south-eastern part of the Baltic Sea. Its southern and central parts contain fresh water due to the discharge of the Nemunas River, while salinity in the northern part varies from 0 to 8 PSU, depending on the wind-driven inflow of brackish water from the Baltic Sea.
The investigation was carried out in the fresh-brackish water mixing zone (the Baltic Sea influence zone), the central part, and the Nemunas River influence zone in July-August of 2004-2006. Physico-chemical parameters, chlorophyll a concentration, and phytoplankton and toxic algae cell densities were monitored. In total, 223 species and varieties were found in the summer plankton, belonging to five classes: 97 species (43 %) to the Chlorophyceae, 71 (32 %) to the Cyanophyceae, 40 (18 %) to the Bacillariophyceae, 9 (4 %) to the Euglenophyceae, and 6 (3 %) to the Dinophyceae. Of these, 26 species from three classes (Cyanophyceae, Chlorophyceae, and Dinophyceae) were identified as potentially toxic in the northern part of the Curonian Lagoon during the summers of 2004-2006. The dominant toxic species were Aphanizomenon flos-aquae, Microcystis aeruginosa, M. viridis, M. wesenbergii, and Woronichinia compacta. Surface phytoplankton biomass in the Curonian Lagoon ranged from 12.27 to 50.22 mg/l. The summer 2004 peak of phytoplankton (33.11 mg/l) and potentially toxic algae (28.67 mg/l) biomass was observed near the Klaipėda Strait, where Aphanizomenon flos-aquae contributed 36 % of the total biomass. In summer 2005 the highest phytoplankton (50.22 mg/l) and toxic algae (21.46 mg/l) biomasses were... [to full text]
