  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Teoria de correção de erros quânticos durante operações lógicas e medidas de diagnóstico de duração finita / Quantum error-correction theory during logical gates and finite-time syndrome measurements

Castro, Leonardo Andreta de 17 February 2012 (has links)
In this work, we study the theory of quantum error correction, one of the main methods of preventing loss of information in a quantum computer. This method, however, is normally studied under ideal conditions in which the operation of the quantum gates that constitute the quantum algorithm does not interfere with the kind of error the system undergoes. Moreover, the syndrome measurements employed in the traditional method are considered instantaneous. Our aims in this work are to evaluate how altering these two assumptions would modify the quantum error correction process. With respect to the first objective, we verify that, for errors caused by external environments, the action of a logical gate concurrent with the noise can produce errors that, in principle, may not be correctable by the code employed. We then propose a short-step correction method that can render the uncorrectable errors negligible, and that can also reduce the probability of occurrence of correctable errors. For the second objective, we first study how finite-time measurements affect the decoherence of a single qubit, concluding that this kind of measurement can actually protect the state being measured. Motivated by this, we demonstrate that, in certain cases, finite syndrome measurements performed concurrently with the noise protect the state of the qubits against errors more efficiently than if the measurements had been performed instantaneously at the end of the process.
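The syndrome-measurement idea at the heart of this abstract can be illustrated with the simplest stabilizer code. The sketch below is not the thesis's code: it shows the three-qubit bit-flip code restricted to computational basis states, where the stabilizers Z1Z2 and Z2Z3 reduce to classical parity checks.

```python
# Minimal sketch: syndrome extraction and recovery for the three-qubit
# bit-flip code, acting on computational basis states only, where the
# stabilizers Z1Z2 and Z2Z3 become parity checks on neighbouring bits.

def syndrome(bits):
    """Measure the Z1Z2 and Z2Z3 stabilizers as parity checks."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Apply the recovery operation selected by the syndrome."""
    s = syndrome(bits)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}   # syndrome -> qubit to flip
    out = list(bits)
    if s in flip:
        out[flip[s]] ^= 1
    return tuple(out)

# A single bit-flip error on any one qubit is detected and corrected:
for i in range(3):
    noisy = [0, 0, 0]
    noisy[i] ^= 1                     # X error on qubit i
    assert correct(tuple(noisy)) == (0, 0, 0)
```

The thesis's point is precisely that this clean picture breaks down when the syndrome measurement takes finite time or overlaps with gate operations.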
23

The importance of the physical analogue clock in mediating learning of analogue clock time in Grade 4 learners

Metelerkamp, Roger Gregory January 2014 (has links)
My research topic concerns how learners use the analogue clock (as a human tool) to make meaning of clock time. This study is informed by a Vygotskian socio-cultural framework for learning and development, based on the concept that human activities take place in cultural contexts and are mediated by tools. In this qualitative study I report on the learners' meaning-making of analogue clock time using the physical clock. The study was carried out at a South African primary school through an after-school intervention programme and employed a case study method. It involved a purposeful sample of 4 learners, drawn from the Grade 4 class group (n=38) based on their responses to a baseline assessment task. The selected sample spanned the ability spectrum to gain rich insight into how learners make meaning of analogue clock time. Data collection and analysis were done through an interpretive approach. The video-taped interviews and intervention programme were my main instruments of data collection; other research instruments included document analysis of responses to the baseline assessment tasks. Together these research tools yielded the data collected and allowed for triangulation. My research explored how the learners make meaning of analogue clock time, in particular the two-way movement in which learners use the physical tool, the analogue clock, to develop meaning, while the clock mediates clock knowledge in return. The findings suggest that learners find it difficult to conceptualise analogue clock symbols and signs in relation to the two hands, in particular the half-hour concept in Afrikaans. The physical analogue clock is also important for supporting and extending clock knowledge when solving time-related problems. This shows the power of the analogue clock to mediate meaning-making of clock time in young learners.
Because of its potential to improve the teaching and learning of analogue clock time in primary school, it is recommended that analogue clock time be further researched in South Africa.
24

Network Characterization using Active Measurements for Small Cell Networks

Saffarzadeh, Mozhgan January 2013 (has links)
Due to the rapid growth of mobile networks, network operators need to expand their coverage and capacity. Addressing these two needs is challenging. One factor is the requirement for cost-efficient transport via heterogeneous networks. To achieve this goal, Internet connectivity is considered a cost-efficient transport option by many operators for small cell backhaul. This thesis project investigates whether a small cell network's requirements can be fulfilled by utilizing Internet connectivity for backhaul. To answer this question, several measurements were made to assess different aspects of live networks and compare them with the network operator's requirements. Different measurement protocols, described in this thesis, are utilized to evaluate some of the key network characteristics, such as throughput, jitter, packet loss, and delay. Moreover, improving the bandwidth available in real-time (BART) measurement method was one of the main achievements of this thesis project. Evaluation of the measurement results indicates that fiber-based access together with Internet connectivity would be the best and cheapest backhaul solution for a small cell network in comparison with almost all other types of broadband access technologies. It should be noted that asymmetric digital subscriber line (ADSL) and cable-TV access networks proved unable to meet the requirements for small cell backhaul. This project gives a clear picture of the current broadband access network infrastructure's attributes and highlights the possibility of reducing backhaul costs by using broadband Internet connectivity as a backhaul transport option.
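One of the key characteristics this abstract lists, jitter, has a standard running estimator defined in RFC 3550 for RTP. The sketch below is illustrative only (the thesis does not publish its code); the timestamps are invented, and the only assumption is per-packet send/receive timestamps in seconds.

```python
# Hedged sketch: the RFC 3550 interarrival-jitter estimator, the kind of
# metric active measurement protocols report. J is updated per packet as
# J += (|D| - J) / 16, with D the change in one-way transit time.

def rtp_jitter(send_times, recv_times):
    """Return the final running jitter estimate over a packet trace."""
    j = 0.0
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s                    # one-way transit time
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            j += (d - j) / 16.0            # exponentially smoothed update
        prev_transit = transit
    return j

# Constant transit time (no jitter) yields an estimate near zero:
print(rtp_jitter([0.0, 1.0, 2.0], [0.1, 1.1, 2.1]))
```

The 1/16 gain makes the estimate robust to a single outlier, which matters when a few probe packets cross a congested queue.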
25

Potential Sustainability Improvements by Using Real-Time Measuring Temperature Sensors in Offices : A Case Study at Vasakronan’s Head Office Evaluating Sensor Solutions and Their Applicability. / Potentiella hållbarhetsfördelar av att använda realtidsmätande temperatursensorer i kontor : En studie på Vasakronans huvudkontor med utvärdering av sensorer och deras användbarhet

Franzén, Linda, Fredheim, Jessica January 2016 (has links)
This thesis is written on the subject of smart cities, where real-time temperature-measuring sensors were tested at Vasakronan's head office. The purpose was to evaluate the available sensor solutions for real-time measurements as well as to analyze the sustainability benefits of using them. Three sensors were tested: Yanzi Climate, Texas Instruments SensorTag, and Smart Citizen Board. The measurements from the sensors were compared within each sensor solution as well as between solutions and against the traditional measuring equipment, Testo 480 and Tinytags. A multi-criteria analysis was conducted to compare the qualities of the sensors, which showed that Yanzi was the best of these, mainly due to the quality of its measurements and the conformity within the sensor solution. The performance of the other sensor solutions was similar, although they had different strengths and weaknesses. Finally, the temperature measurements from Yanzi were used to make a temperature map of the office. If implemented in real time, this would serve as an indicator for the superintendents of the building when managing the HVAC system, which could improve energy efficiency and decrease costs. Additionally, employees could use the temperature map to choose the indoor temperature that suits them individually. This would improve social sustainability at the office, as well as economic sustainability through increased productivity of the employees. / Smart cities is a concept growing in popularity among sustainability researchers. So far the focus has mainly been on reducing energy and material flows for households, while applications for offices have been few. A need was therefore identified to explore which sustainability benefits could be achieved by applying smart-city strategies to offices. The purpose of this thesis is to explore how real-time temperature measurements in offices can contribute to sustainable development. This includes investigating how data from such measurements can be used to increase sustainability, but also implementing real-time temperature measurements in an office to assess what is currently possible. It was also necessary to assess whether the sensor solutions can replace the traditional equipment for temperature measurements.
The work required research in areas such as smart offices, sensors capable of measuring temperature in real time, energy consumption in buildings, traditional temperature-measuring equipment, and thermal comfort. Once sensors with suitable properties had been found, three sensor solutions were chosen for installation at Vasakronan's head office. Yanzi was chosen as the main sensor solution, to be deployed across the whole office; the two others, tested on a smaller scale, were Smart Citizen Board and Texas Instruments SensorTag. To judge the quality of the measurements, the data had to be stored properly. For Smart Citizen Board and Yanzi, this meant fetching data from their respective servers, which was done by writing programs that connected to the servers and retrieved data once a minute. For the Texas Instruments SensorTag, storage was done directly on the computer's hard drive. The Yanzi sensors installed at larger scale were also used during the test period to collect data for a temperature map of the office. The initial idea behind the temperature map was that the users of the office would be able to choose where to sit based on their thermal preferences.
Several tests were therefore carried out at Vasakronan's office to determine whether a temperature map for this use would be applicable there. The tests were divided into five parts: 1) comparison between the sensor solutions and the traditionally used Tinytags; 2) comparison between the air temperature, the operative temperature, and the thermal climate; 3) comparison between the sensor solutions and the traditionally used Testo 480; 4) testing the sensor solutions' coverage in different parts of the office; and 5) collection of data for the temperature map. In the first test, the sensor solutions were placed next to the Tinytags and set to measure from Friday to Tuesday. The results showed large differences in battery life between the sensor solutions, and that the conformity between the two units of each solution varied strongly between solutions; Yanzi was judged to have the best battery life and the best conformity between its sensors' measurements. In the second test, the traditional Testo 480 equipment was used to investigate whether there was a significant difference in thermal experience between different areas of the office and whether these differences could be used to create a temperature map, and also whether air temperature was a good indicator of the thermal experience in each area. The result was that the temperature differences at Vasakronan's office are large enough for a temperature map to be applicable, and that air temperature is a sufficiently good indicator of the thermal climate. In the third test, the sensor solutions' measurements were compared with those of the Testo 480 to determine their accuracy; Yanzi was the sensor solution with the most accurate measurements.
In the coverage test, it was clear that SCB had an advantage that depended only on the quality of the Wi-Fi coverage in the building. Neither Yanzi nor TI had coverage throughout the building, as the distances between gateway and sensors became too long; of the two, however, Yanzi had an advantage in that it can use other units to repeat the signals. In the last test, data were collected to create the temperature map on which the users of the office could base their choice of seat; a suitable scale and a prototype of the map were designed. After the test period, a multi-criteria analysis was carried out to determine which of the sensor solutions best suited this type of application. The conclusion was that Yanzi fitted best, mainly because of its measurement accuracy, the conformity between its sensors' measurements, and its reliability during the tests. TI and SCB had similar scores but different strengths and weaknesses, and should therefore be chosen to suit each individual case. The study found that real-time temperature measurements in the office can contribute to improved sustainability, mainly through energy savings from a different regulation of the HVAC system and through improved thermal comfort for the users of the office via the temperature map. Finally, the hope is that smart cities and smart buildings will continue to develop, enabled by sensor solutions that remain economically justifiable but are more reliable than today's.
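The zone-level temperature map described above can be illustrated with a few lines of aggregation code. This is a sketch only: sensor IDs, zone names, and readings are invented for the example, and the real system fetched its data from the vendors' servers once a minute rather than from an in-memory list.

```python
# Illustrative sketch: aggregating per-sensor readings into a simple
# zone-level "temperature map" of the kind the thesis builds from the
# Yanzi data. All identifiers and values below are hypothetical.
from collections import defaultdict
from statistics import mean

def temperature_map(readings, sensor_zone):
    """readings: list of (sensor_id, celsius); sensor_zone: id -> zone."""
    by_zone = defaultdict(list)
    for sensor_id, temp in readings:
        by_zone[sensor_zone[sensor_id]].append(temp)
    return {zone: round(mean(temps), 1) for zone, temps in by_zone.items()}

readings = [("a1", 21.4), ("a2", 22.0), ("b1", 24.1)]
zones = {"a1": "window side", "a2": "window side", "b1": "inner side"}
print(temperature_map(readings, zones))
# -> {'window side': 21.7, 'inner side': 24.1}
```

Office users could then pick a seat by zone temperature, which is the individual-comfort use case the abstract describes.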
26

Studies of urban air quality using electrochemical based sensor instruments

Popoola, Olalekan Abdul Muiz January 2012 (has links)
Poor air quality has been projected to be the world's top cause of environmental premature mortality by 2050, surpassing poor sanitation and dirty water (IGBP/IGAC press release, 2012). One of the major challenges of air quality management is how to adequately quantify both the spatial and temporal variations of pollutants for the purpose of implementing necessary mitigation measures. The work described in this thesis aims to address this problem using novel electrochemical-based air quality (AQ) sensors. These instruments are shown to provide cost-effective, portable, reliable, indicative measurements for urban air quality assessment as well as for personal exposure studies. Three principal pollutants, CO, NO and NO2, are simultaneously measured in each unit of the AQ instrument, which also includes temperature/RH measurement, GPS (for time and position) and GPRS for data transmission. Laboratory studies showed that the electrochemical sensor nodes can be highly sensitive, showing linear response during calibration tests at ppb level (0-160 ppb). The instrumental detection limits were found to be < 4 ppb (CO and NO) and < 1 ppb (NO2), with fast response times equivalent to t90 < 20 s. Several field studies were carried out involving deployment of both mobile and static electrochemical sensor nodes. Results from short-term studies in four different cities, Cambridge (UK), London (UK), Valencia (Spain) and Lagos (Nigeria), are presented. The measurements in these cities represent snapshots of the pollution levels; the stark contrast between the pollution level in Lagos, especially for CO (mean mixing ratio of 16 ppm over 3 hrs), and that in the other three cities reflects the poor air quality in that part of the world. Results from long-term AQ monitoring using a network of 46 static AQ sensors were used to characterise pollution in different environments, ranging from urban to semi-urban and rural locations.
By coupling meteorological information (wind measurements) with pollution data, pollution sources and phenomena like the street canyon effect can be studied. Results from the long-term study also revealed that the siting of current fixed monitoring stations can fail to capture the actual air quality distribution and may therefore be unrepresentative. This work has shown the capability of electrochemical-based AQ sensors to complement the existing fixed-site monitors, demonstrating an emerging measurement paradigm for air quality monitoring and regulation, source attribution and human exposure studies.
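The linear calibration implied by the ppb-level laboratory tests can be sketched as an ordinary least-squares line mapping raw sensor output to mixing ratio. The calibration points below are invented; the thesis reports only that the response was linear over 0-160 ppb.

```python
# Sketch of a two-point-per-decade linear calibration for an
# electrochemical sensor: fit y = a*x + b from reference gas exposures,
# then convert raw output (here, hypothetical mV) to ppb.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

raw_mv = [10.0, 20.0, 30.0, 40.0]       # hypothetical sensor output
ppb    = [0.0, 40.0, 80.0, 120.0]       # known reference concentrations
a, b = fit_line(raw_mv, ppb)

def to_ppb(mv):
    return a * mv + b

print(to_ppb(25.0))   # a perfectly linear sensor -> 60.0 ppb
```

In a real deployment the fit would be repeated per sensor and re-checked over time, since electrochemical cells drift with age, temperature and humidity.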
27

The Role of Nanoclay on the Deformation Behavior of Polypropylene/Maleic Anhydride Modified Polypropylene Films and Fibers in Full and Partially Molten State Processing

Fujiyama-Novak, Jane Hitomi 12 November 2009 (has links)
No description available.
28

Numerische Simulationen zur Rückrechnung und Prognose von Setzungen und Gebirgsdeformation

Wöhrl, Benedikt, Bock, Sven, Schürmann, Christopher 02 February 2024 (has links)
The ongoing development of monitoring and surveillance technology enables us to reproduce subsidence and rock mass deformation in numerical models with increasing precision. Laser scans performed prior to and during construction increase the spatial accuracy of numerical models. Adjusting the model geometry during construction also allows a recalibration of the models and increases the reliability of forecast simulations. Simulation results can be successively compared with the original construction plans, allowing support designs to be adjusted and optimized.
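The back-analysis ("Rückrechnung") and forecasting loop described above can be shown in miniature: calibrate a model parameter so the prediction matches a monitored settlement, then reuse the calibrated model as a forecast. The one-parameter linear model and all numbers below are invented for illustration; the thesis works with full numerical models recalibrated against laser-scan data.

```python
# Toy back-analysis: bisect for the stiffness that reproduces a measured
# settlement under a known load, then forecast settlement for a new load.

def settlement(load_kpa, stiffness):
    """Toy linear model: settlement (mm) = load / stiffness."""
    return load_kpa / stiffness

def back_calculate(load_kpa, measured_mm, lo=0.1, hi=1000.0):
    """Find the stiffness matching the observation (settlement is
    monotonically decreasing in stiffness, so bisection applies)."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if settlement(load_kpa, mid) > measured_mm:
            lo = mid        # too soft -> too much predicted settlement
        else:
            hi = mid
    return (lo + hi) / 2.0

k = back_calculate(load_kpa=200.0, measured_mm=25.0)   # -> k close to 8.0
forecast = settlement(300.0, k)                        # -> close to 37.5 mm
```

The same pattern, observe, recalibrate, re-forecast, is what makes construction-stage monitoring data valuable for prognosis.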
29

Calibration of the Measurement System for Methane Pyrolysis in Rocket Nozzle Cooling Channels

Ly, Jennifer January 2023 (has links)
Methane-based rocket propellant is gaining traction as a green technology with advantages in sustainability, cost-effectiveness, and performance. However, under the high temperatures found in rocket nozzle cooling channels, methane can undergo thermal decomposition, known as methane pyrolysis, resulting in the generation of hydrogen and solid carbon. This poses challenges to rocket engine performance and can eventually cause engine failure. Understanding and predicting the composition of evolved gases in rocket engine processes is therefore crucial. This thesis focuses on quantifying the production of hydrogen in the exhaust stream. To achieve this objective, a correlational measurement method utilizing sensors was developed and experimentally investigated. This approach involved the detailed mapping of sensor responses to variations in gas composition, temperature, and pressure, which were compared and validated against theoretical data derived from REFPROP, a widely used software tool for calculating gas properties. The sensors employed in this study enabled direct measurements of the speed of sound (SOS) and thermal conductivity (TCD) of the gas. The SOS measurements exhibited strong agreement with theoretical predictions in response to changes in hydrogen content. In contrast, the TCD measurements showed lower sensitivity to hydrogen. Temperature was observed to exert a substantially stronger influence on both SOS and TCD than pressure; however, the implementation of experimental and theoretical correction coefficients effectively compensated for these effects. The resulting calibration curves demonstrated an absolute deviation of 0.2-0.3 %vol in hydrogen concentration, which demonstrates the effectiveness of the developed method for quantifying hydrogen in gas mixtures. Lastly, the occurrence of methane pyrolysis was tested and confirmed.
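The correlational idea behind the SOS sensor can be sketched under an ideal-gas approximation: the speed of sound of a CH4/H2 mixture rises monotonically with hydrogen mole fraction, so a measured SOS can be inverted to a concentration. This is not the thesis's method in detail (it validates against REFPROP real-gas data); the heat capacities below are approximate room-temperature textbook values.

```python
# Sketch: ideal-gas speed of sound of a CH4/H2 mixture, c = sqrt(g*R*T/M),
# with mole-fraction-weighted molar mass and heat capacity, inverted by
# bisection to recover the H2 fraction from a measured SOS.
from math import sqrt

R = 8.314                               # J/(mol K)
M_H2, M_CH4 = 2.016e-3, 16.043e-3       # kg/mol
CP_H2, CP_CH4 = 28.8, 35.7              # J/(mol K), approximate

def speed_of_sound(x_h2, t_kelvin=300.0):
    cp = x_h2 * CP_H2 + (1 - x_h2) * CP_CH4
    gamma = cp / (cp - R)               # Cv = Cp - R for an ideal gas
    m = x_h2 * M_H2 + (1 - x_h2) * M_CH4
    return sqrt(gamma * R * t_kelvin / m)

def h2_fraction(sos, t_kelvin=300.0):
    """Invert by bisection: SOS rises monotonically with H2 content."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if speed_of_sound(mid, t_kelvin) < sos:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Round trip: a 10% H2 mixture is recovered from its computed SOS.
print(h2_fraction(speed_of_sound(0.10)))
```

The strong temperature dependence visible in the formula (c scales with sqrt(T)) is exactly why the thesis needs temperature correction coefficients before the calibration curves become usable.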
30

Temps de cohérence temporelle de structures turbulentes porteuses de scalaires passifs au sein d'une turbulence homogène quasi-isotrope / Coherence times of passive scalar space scales in homogeneous and quasi-isotropic turbulence

Lenoir, Jean-Michel 18 July 2011 (has links)
The main purpose of this work is to perform a turbulent-mixing experiment in which it is possible to determine and quantify the coherence times of the different spatial scales of the fluctuations of a scalar field. We measure concentration fluctuations of Rhodamine B by Planar Laser-Induced Fluorescence (PLIF); the scalar is transported and mixed by velocity fluctuations. The velocity fluctuations are generated by a grid placed perpendicularly to the flow in a water channel and are measured by Particle Image Velocimetry (PIV). The concentration field is injected into the flow through injectors regularly spaced on the grid, so that both the velocity and the concentration fields are statistically homogeneous and isotropic. To approach the theoretical case of statistically homogeneous and isotropic turbulence with no mean velocity, we consider, according to Taylor's hypothesis, that all scales associated with each of these fields are convected with the mean velocity U of the flow, and we follow a "turbulence box" that moves at U along the channel. Determining the state of the turbulence at a given point of the box at times t and t' = t + dt then amounts to studying, in the experiment, that point at position x of the test section at time t and at position x + dx at time t', with dx = U dt.
When statistical isotropy is satisfied, the experimental results allow us to verify a phenomenology of the evolution of the temporal coherence of the various spatial scales of the concentration fluctuation field, based on the ideas of Comte-Bellot and Corrsin. The experiment is also an opportunity to report results on the probability densities of various statistical properties of the fluctuating velocity field.
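The Taylor ("frozen turbulence") hypothesis invoked above amounts to a one-line conversion between temporal and spatial lags: dx = U dt. The sketch below illustrates it; the mean velocity and lag values are invented, not taken from the thesis.

```python
# Minimal illustration of Taylor's hypothesis: a time separation measured
# at a fixed probe maps to a spatial separation via the mean convection
# velocity, dx = U * dt, so temporal records yield spatial scales.

def time_to_space(dt_seconds, mean_velocity):
    """Convert a temporal lag to the advected spatial lag, dx = U * dt."""
    return mean_velocity * dt_seconds

U = 0.5                                  # m/s, hypothetical mean flow
lags_s = [0.01, 0.02, 0.04]              # temporal lags from a probe signal
lags_m = [time_to_space(dt, U) for dt in lags_s]
print(lags_m)                            # spatial lags in metres
```

In the experiment this is what lets two measurement stations separated by dx along the channel stand in for the same "turbulence box" observed at two successive instants.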
