
Hierarchical Group-Based Sampling

Gemulla, Rainer; Berthold, Henrike; Lehner, Wolfgang. 12 January 2023
Approximate query processing is an adequate technique to reduce response times and system load in cases where approximate results suffice. In the database literature, sampling has been proposed to evaluate queries approximately using only a subset of the original data. Unfortunately, most of these methods address either only specific problems that arise from using samples in databases (e.g., data skew) or only join operations involving multiple relations. We describe how well-known sampling techniques for group-by operations can be combined with foreign-key joins such that the join is computed after the sample is generated. Specifically, we show how senate sampling and small group sampling can be combined efficiently with the idea of join synopses. Additionally, we introduce algorithms that maintain the sample as the underlying data changes. Finally, we demonstrate the superiority of our method over the naive approach in an extensive set of experiments.
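The combination the abstract describes — sample first, join afterwards — can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: the table names, sizes and per-group sample size are invented, and a reservoir-based senate sample (equal sample size per group) stands in for the full method; the foreign key then lets every sampled fact row find its dimension partner, which is the idea behind join synopses.

```python
import random
from collections import defaultdict

def senate_sample(rows, group_key, per_group):
    """Senate sampling: draw an equal-size (reservoir) sample from every group,
    so small groups are represented as well as large ones."""
    samples = defaultdict(list)
    seen = defaultdict(int)
    for row in rows:
        g = group_key(row)
        seen[g] += 1
        if len(samples[g]) < per_group:
            samples[g].append(row)
        else:
            # reservoir sampling keeps each group's sample uniform
            j = random.randrange(seen[g])
            if j < per_group:
                samples[g][j] = row
    return [r for rs in samples.values() for r in rs]

# Hypothetical fact table (sales) with a foreign key into a dimension (stores).
random.seed(0)
stores = {sid: {"region": "north" if sid % 2 else "south"} for sid in range(10)}
sales = [{"store_id": random.randrange(10), "amount": random.uniform(1, 100)}
         for _ in range(10_000)]

# Sample first, join afterwards: the foreign key guarantees every sampled
# fact row has exactly one dimension partner.
sample = senate_sample(sales, lambda r: r["store_id"], per_group=50)
joined = [{**r, **stores[r["store_id"]]} for r in sample]
assert len(joined) == len(sample)
```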

Processor design-space exploration through fast simulation / Exploration de l'espace de conception de processeurs via simulation accélérée

Khan, Taj Muhammad. 12 May 2011
We focus on sampling as a simulation technique for reducing simulation time. Sampling rests on the observation that a program's execution is composed of repeating portions of code, called phases. Hence, instead of simulating a program in its entirety, each phase can be simulated just once and the performance of the whole program computed from the phases' individual performances. Two important questions arise: which parts of the program should be simulated, and how is the system state restored before each simulation? For the first question there are two approaches: representative sampling, which analyzes the execution in terms of phases and simulates each phase once, and statistical sampling, which selects the samples at random. For the second question, state restoration, techniques have recently been developed that warm up the system state adaptively, according to the needs of the code fragment being simulated. Sample-selection techniques either ignore warm-up mechanisms entirely or propose alternatives that require extensive simulator modification, while adaptive warm-up techniques are incompatible with most sampling schemes. In this thesis we reconcile sampling techniques with adaptive warm-up to develop a mechanism that is easy to use, accurate in its results, and transparent to the user. We took representative and statistical sampling and modified adaptive warm-up techniques to make them compatible within a single mechanism.
We show that adaptive warm-up techniques can indeed be employed with sampling. Our results are comparable in accuracy with the state of the art, but by relieving the user of warm-up issues and hiding the details of the simulation, we make the process easier. We also found that statistical sampling gives better results than representative sampling. / Simulation is a vital tool used by architects to develop new architectures. However, because of the complexity of modern architectures and the length of recent benchmarks, detailed simulation of programs can take extremely long times. This impedes the exploration of the processor design space that architects must perform to find the optimal configuration of processor parameters. Sampling is one technique that reduces the simulation time without adversely affecting the accuracy of the results. Yet most sampling techniques either ignore the warm-up issue or require significant development effort on the part of the user. In this thesis we tackle the problem of reconciling state-of-the-art warm-up techniques and the latest sampling mechanisms with the triple objective of keeping user effort to a minimum, achieving good accuracy, and remaining agnostic to software and hardware changes. We show that both representative and statistical sampling can be adapted to use warm-up mechanisms that accommodate the underlying architecture's warm-up requirements on the fly. We present experimental results showing accuracy and speed comparable to the latest research. We also leverage statistical calculations to provide an estimate of the robustness of the final results.
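The statistical-sampling idea the thesis builds on can be illustrated with a toy experiment. Everything below is hypothetical — the per-interval CPI values, the phase structure, and the sample size are invented — it only shows how simulating a random subset of execution intervals yields a mean-CPI estimate with a confidence interval.

```python
import math, random

def estimate_cpi(interval_cpis, sample_size, z=1.96, seed=1):
    """Statistical sampling: simulate a random subset of execution intervals
    in detail and report mean CPI with a ~95% confidence half-width."""
    rng = random.Random(seed)
    sample = rng.sample(interval_cpis, sample_size)
    mean = sum(sample) / sample_size
    var = sum((x - mean) ** 2 for x in sample) / (sample_size - 1)
    half = z * math.sqrt(var / sample_size)
    return mean, half

# Hypothetical per-interval CPIs for a benchmark with three program phases.
rng = random.Random(0)
phases = [1.2] * 400 + [0.8] * 300 + [2.0] * 300
intervals = [cpi + rng.gauss(0, 0.05) for cpi in phases]

true_mean = sum(intervals) / len(intervals)
est, half = estimate_cpi(intervals, sample_size=100)
# The estimate should land near the true mean, within the reported interval
# most of the time.
print(f"true {true_mean:.3f}, estimated {est:.3f} +/- {half:.3f}")
```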

Contribution of random sampling in the context of rotating machinery diagnostic / Apport de l'échantillonnage aléatoire dans le cadre de diagnostic de machines tournantes

Hajar, Mayssaa. 26 January 2018
Nowadays, machine monitoring and supervision has become one of the most important domains of research. Many axes of exploration are involved: signal processing, machine learning, and several others. Moreover, industrial systems can now be monitored remotely thanks to the availability of the internet. In fact, like many other systems, machines can now be connected to a network at a specified address following the Internet of Things (IoT) concept. However, this combination is challenging for data acquisition and storage. In 2004, compressive sensing was introduced to acquire data at a low rate in order to reduce energy consumption in wireless sensor networks. A similar effect can be achieved with random sampling (RS), which acquires data randomly at a low frequency (much lower than the Nyquist rate) while guaranteeing an alias-free spectrum. However, this sampling method is still not commercially available in hardware, so a comprehensive review of its concept, its impact on the sampled signal, and its hardware implementation is conducted. In this thesis, a study of RS and its different modes is presented, with their conditions and limitations in the time domain, followed by a detailed examination of the spectral analysis of randomly sampled signals. From there, the features of RS are derived, and recommendations regarding the choice of the adequate mode and convenient parameters are proposed. In addition, spectral analysis techniques are proposed for RS signals in order to provide an enhanced spectral representation. To validate the properties of such sampling, simulations and practical studies are shown. The research concludes with an application to vibration signals acquired from bearings and gears. The results obtained are satisfying, which shows that RS is promising and can be taken as a solution for reducing sampling frequencies and decreasing the amount of stored data. In conclusion, RS is an advantageous sampling process thanks to its anti-aliasing property. Further studies can address reducing its added noise, which was shown to be cyclostationary of order 1 or 2 depending on the chosen parameters.
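The anti-aliasing property of random sampling is easy to demonstrate numerically. The sketch below assumes NumPy and uses one simple randomized scheme (jittered sampling); the tone frequency, mean rate, and frequency grid are arbitrary choices for the demonstration, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

f0, fs, n = 90.0, 100.0, 4000        # 90 Hz tone, mean sampling rate only 100 Hz
# Jittered random sampling: one sample per period, at a uniformly random
# instant inside it. Uniform sampling at 100 Hz would alias 90 Hz down to 10 Hz.
t = np.arange(n) / fs + rng.uniform(0.0, 1.0 / fs, size=n)
x = np.sin(2 * np.pi * f0 * t)

# Direct (nonuniform) DFT on a frequency grid extending well above fs/2.
freqs = np.arange(1.0, 200.0)
spectrum = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ x) / n

peak = freqs[np.argmax(spectrum)]
print(f"dominant component at {peak:.0f} Hz")   # 90 Hz, not the 10 Hz alias
```

The randomization spreads the alias energy into a broadband noise floor instead of a coherent peak, which is exactly the anti-aliasing behavior the abstract refers to.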

個案無反應資料之各種加權方法分析比較 / Weighting Adjustments for Unit Nonresponse

劉淑芳 (Liou, Shue-Fang). Unknown date
In this thesis, a simulated population of 100,000 records is constructed, from which 1,068 completed responses are drawn by simple random sampling (SRS). Under both random and non-random unit-nonresponse mechanisms, three weighting adjustments are compared: (1) poststratification, (2) raking (raking ratio), and (3) weighting by estimated response rates. When nonresponse is completely random, the original sample is already representative, so neither poststratification nor raking has a significant effect; the improvement to the sample is negligible. When nonresponse is non-random, poststratification performs better when the variables are strongly correlated; raking performs well quite generally and deserves wide adoption; and the effectiveness of response-rate weighting depends on the accuracy of the estimated rates, since estimation bias can render the adjustment ineffective. Finally, the thesis offers guidance on when poststratification and raking are appropriate, as a reference for future survey practitioners.
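Of the three adjustments compared, raking is the easiest to sketch. The example below is a minimal illustration with invented cell counts and margins, not the thesis's simulation:

```python
def rake(table, row_targets, col_targets, tol=1e-9, max_iter=100):
    """Raking (iterative proportional fitting): rescale weighted cell counts
    until row and column sums match the known population margins."""
    for _ in range(max_iter):
        # match row margins
        for i, row in enumerate(table):
            s = sum(row)
            table[i] = [c * row_targets[i] / s for c in row]
        # match column margins
        for j in range(len(col_targets)):
            s = sum(row[j] for row in table)
            for row in table:
                row[j] *= col_targets[j] / s
        # stop once row sums still match after the column step
        if all(abs(sum(r) - t) < tol for r, t in zip(table, row_targets)):
            return table
    return table

# Hypothetical respondent counts by sex x education, with known census margins.
respondents = [[150.0, 190.0],      # male:   low, high education
               [240.0, 420.0]]      # female: low, high education
weighted = rake(respondents, row_targets=[480.0, 520.0],
                col_targets=[450.0, 550.0])
```

Each respondent's weight is the ratio of the adjusted cell count to the original one; unlike poststratification, raking needs only the margins, not the full joint population table.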

Application of Java on Mathematical Statistics Education

Su, Yi-Che. 20 June 2001
In recent years, the internet has developed rapidly. Through this convenient medium, information can be spread easily all over the world, and e-learning has become a burgeoning and efficient way to learn. The main idea of e-learning is to apply the concept of asynchronous course delivery and establish a learning environment on the internet: with a computer connected to the internet, users can learn in a convenient environment. To apply the concept of e-learning to a statistics course, we use the Java programming language to build an on-line interactive environment. Besides learning fundamental concepts of statistics, learners can also strengthen their ability to explore and research on their own. In this thesis we develop six interactive examples; beyond interpretation and illustration, we introduce the motivation, goal, related concepts, and applications of each example in detail. We hope that users can easily acquire statistical knowledge through this learning environment, so that the goal of e-learning for statistics education can be achieved.

Avaliação da Pesca da Lagosta Vermelha (Panulirus argus) e da Lagosta verde (Panulirus laevicauda) na Plataforma Continental do Brasil. / Assessment of the fishing of spiny lobsters (Panulirus argus and Panulirus laevicauda) in the continental shelf of Brazil

Barroso, Juarez Coelho. January 2012
BARROSO, Juarez Coelho. Avaliação da Pesca da Lagosta Vermelha (Panulirus argus) e da Lagosta verde (Panulirus laevicauda) na Plataforma Continental do Brasil. 2012. 109 f. Dissertação (mestrado) - Universidade Federal do Ceará, Centro de Ciências Agrárias, Departamento de Engenharia de Pesca, Fortaleza-CE, 2012. / Extractive lobster fishing in Brazil is an important economic activity that involves different social sectors in the coastal region and provides an average annual income of USD 84 million. The great demand for the product, its high value on the international market, the expansion of the artisanal fleet, the failure to enforce conservation measures, the impact of fishing on ecosystems, and perhaps climate variability have led to high exploitation or overexploitation of the crustacean stocks. Fishery assessment and prediction depend on the collection of biological and fishery data; over the last two decades the absence of these data, together with the lack of life-cycle studies, generated a high level of uncertainty in the management of the fishery. In the present study, a comprehensive analysis is made of the biological and fishery information generated by the thesis project and the data that precede it.
The stock aggregation areas were distributed over 31 strata covering an area of 356,610 km², divided into two regions: shallow, < 50 m (160,510 km²), and deep, between 50 and 100 m (196,100 km²). The mean productivity index (1999-2006) was 29.75 kg/km² for Panulirus argus and 8.39 kg/km² for P. laevicauda; across strata it varied between 0.02 and 217.0 kg/km² (both species combined). Catchability coefficients ranged between 0.10 and 0.14, showing that the fishing gears (caçoeira, manzuá, and cangalha) have low efficiency in the lobster fishery. Sampling of landings across different fishing periods (1970-1979, 1980-1988, and 1989-1993) showed a progressive decrease in small lobsters, and as a result the mean length showed an increasing trend. Random samples from boats that fished between 20 and 35 m (1999) revealed that recruited lobsters (50 to 75 mm carapace length) accounted for 91% (P. argus) and 96% (P. laevicauda) of the total catch. Together with the expansion of the fishing area (increased effort) and the decrease of CPUE by fishing grid (between 1974 and 1991), this is evidence of growth overfishing, which is probably driving the high variation of the annual catches and their negative trend over the last fifteen years (1995-2009). The analysis of CPUE against abundance (N) revealed a lack of proportionality (hyperstability) between the two, which could be distorting the relationship between CPUE and effort and inflating estimates of the maximum sustainable yield (MSY). We estimated a maximum sustainable yield per unit area of 14 kg/km² and a preliminary maximum sustainable yield between 5,000 and 5,604 t. This empirical result should be confirmed through the stratified random sampling program proposed in this study. There is no evidence of recruitment overfishing in the stocks, but the high exploitation rates in the deep stock (50 to 100 m) of P. argus, composed mainly of older specimens with high reproductive power, could be increasing the risk of low recruitment and collapse. The high level of uncertainty under which the lobster fishery on the continental shelf of Brazil operates must be addressed with new ways of thinking, a long-term vision, and new scientific knowledge that allows sustainable fisheries to develop.
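The stratified random sampling the study proposes boils down to an area-weighted estimator: each stratum contributes its area times its mean sampled density. The sketch below uses invented per-stratum density samples (only the two region areas come from the abstract) and, for simplicity, ignores the finite-population correction:

```python
import math

def stratified_total(strata):
    """Stratified estimator of a total: sum over strata of area x mean density,
    with a variance that ignores the finite-population correction."""
    total, var = 0.0, 0.0
    for area, densities in strata:          # kg/km2 density samples per stratum
        n = len(densities)
        mean = sum(densities) / n
        s2 = sum((d - mean) ** 2 for d in densities) / (n - 1)
        total += area * mean
        var += (area ** 2) * s2 / n         # variance of the stratum total
    return total, math.sqrt(var)

# Hypothetical strata: (area in km2, sampled catch densities in kg/km2).
strata = [(160_510.0, [30.1, 28.4, 31.0, 29.5]),   # shallow (< 50 m)
          (196_100.0, [8.0, 9.1, 7.8, 8.6])]       # deep (50-100 m)
total_kg, se_kg = stratified_total(strata)
print(f"estimated yield: {total_kg / 1000:.0f} +/- {se_kg / 1000:.0f} t")
```

Because sampling effort can be allocated per stratum, an estimator like this removes the hyperstability problem of CPUE-based indices: the estimate no longer depends on where the fleet chooses to fish.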

Schémas numériques adaptés aux accélérateurs multicoeurs pour les écoulements bifluides / Numerical simulations of two-fluid flow on multicores accelerator

Jung, Jonathan. 28 October 2013
This thesis deals with the modeling and numerical approximation of compressible gas-liquid flows. The main difficulty lies in modeling and approximating the liquid-gas interface. The two-fluid model is a system of conservation laws closed with a mixture pressure law. This law has to be chosen carefully: it conditions the good properties of the system, such as hyperbolicity and the existence of a Lax entropy. Classic conservative Godunov-type schemes lead to inaccuracies that make them unusable in practice. The existence of discontinuous solutions makes it difficult to build high-order schemes and requires very fine meshes to achieve acceptable accuracy. It is therefore essential to provide efficient algorithms for high-performance computing. This thesis addresses each of these issues: construction of a "good" pressure law, construction of adapted numerical schemes, and programming on a GPU or GPU cluster.
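To give a feel for what a Godunov-type finite-volume scheme looks like, here is a minimal sketch on a scalar stand-in (Burgers' equation) with the Rusanov flux. It is not the thesis's two-fluid scheme or pressure law — just the generic conservative template such schemes share, where each cell is updated from numerical fluxes at its interfaces.

```python
import math

def rusanov_step(u, dx, dt):
    """One finite-volume step for Burgers' equation u_t + (u^2/2)_x = 0
    with the Rusanov (local Lax-Friedrichs) numerical flux and
    periodic boundaries -- the simplest Godunov-type scheme."""
    n = len(u)
    flux = [0.0] * n                       # flux[i] approximates F at i+1/2
    for i in range(n):
        ul, ur = u[i], u[(i + 1) % n]
        a = max(abs(ul), abs(ur))          # local wave-speed bound
        flux[i] = 0.5 * (0.5 * ul * ul + 0.5 * ur * ur) - 0.5 * a * (ur - ul)
    return [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]

# Smooth initial data steepening into a shock; the conservative update
# keeps the total "mass" sum(u)*dx exact up to rounding.
n, dx = 200, 1.0 / 200
u = [math.sin(2 * math.pi * i * dx) for i in range(n)]
mass0 = sum(u) * dx
for _ in range(100):
    u = rusanov_step(u, dx, dt=0.002)     # dt chosen to satisfy the CFL bound
mass = sum(u) * dx
```

The exact discrete conservation shown here is precisely what becomes problematic at a two-fluid interface, where naive conservative schemes generate the pressure oscillations the abstract mentions.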

Développement d’une approche quantitative pour l’étude du poumon équin : fixation et échantillonnage pour l’application des principes de la stéréologie / Development of a quantitative approach for the study of the equine lung: fixation and sampling for the application of stereology principles

Gélinas-Lymburner, Emilie. 03 1900
The present study aimed to develop a fixation and sampling protocol for the horse lung in agreement with recently published guidelines for a design-based stereology approach. The left lungs of control and heaves-affected horses were fixed in 10% formaldehyde for 48 h at a controlled constant pressure of 25-30 cm H2O. Lungs were cut into 20-21 slices approximately 2.5 cm thick; 10-11 slices were then randomly and systematically selected for measurement of the reference volume using the Cavalieri method. A systematic uniform random sampling (SURS) protocol using a 17 mm biopsy punch and the smooth-fractionator principle was used to select a representative fraction of each lung. Vertical uniform random (VUR) and isotropic uniform random (IUR) sampling were both performed to compare the number of perpendicularly sectioned airways obtained with each method. The general architecture and the quality of the fixed tissues were also evaluated. Equine lung tissue was successfully sampled with a protocol designed to yield accurate morphometric data. The tissues were fixed with minimal artifacts and contained an adequate number of perpendicularly sectioned airways in both VUR and IUR sections. In conclusion, we developed a fixation and sampling protocol adapted to the equine lung that allows the use of a design-based stereology approach to study airway remodeling.
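The Cavalieri estimator used for the reference volume is simple enough to sketch: the volume is the slice spacing times the sum of the section areas on systematically spaced slices with one uniformly random start. The demo below applies it to a synthetic sphere rather than a lung; the slice count and all numbers are invented for illustration.

```python
import math, random

def cavalieri_volume(section_area, length, n_slices, rng):
    """Cavalieri estimator: V = T * sum of section areas on systematic
    slices a distance T apart, with one uniformly random start offset."""
    T = length / n_slices
    start = rng.uniform(0.0, T)
    return T * sum(section_area(start + k * T) for k in range(n_slices))

# Synthetic "organ": a sphere of radius 1 occupying [0, 2] along the axis;
# its circular section at height z has area pi * (1 - (z - 1)^2).
def sphere_section(z):
    r2 = 1.0 - (z - 1.0) ** 2
    return math.pi * r2 if r2 > 0 else 0.0

rng = random.Random(7)
v_true = 4.0 / 3.0 * math.pi            # exact sphere volume
v_est = cavalieri_volume(sphere_section, length=2.0, n_slices=10, rng=rng)
print(f"true {v_true:.4f}, Cavalieri estimate {v_est:.4f}")
```

The single random offset is what makes the estimator unbiased over repeated designs, which is the "design-based" guarantee the protocol relies on.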

The impact of nightclubs and restaurant bars noise pollution on the population of Melville, Johannesburg, South Africa

Mahapa, Tebogo Patience. 11 1900
Nightclubs and restaurant bars have become major sources of noise pollution, particularly in areas close to residential dwellings. The purpose of this study was to investigate the impact of noise emanating from nightclubs and restaurant bars on the community of Melville, Johannesburg. The study followed both qualitative and quantitative research methods. A total of 100 respondents were randomly sampled within the study area, and qualitative data was collected using a structured questionnaire. A calibrated sound level meter was used to measure environmental noise levels at 10 different measuring points, on weekends and public holidays, during the day from 10h00 to 14h30 and at night from 22h00 to 02h30. The research findings revealed that:
• 87% of the noise levels measured with the sound level meter did not comply with the officially acceptable level of 40 dB at night;
• 69% of respondents indicated that nightclubs are the main source of noise pollution;
• 78% of respondents described the noise as annoying, disturbing, and unwanted;
• 57% of respondents indicated that members of their household have suffered from sleep disorders because noise at night disrupts their sleep patterns, resulting in irritability and fatigue.
In short, the residents of Melville experience high levels of noise at night, with nightclubs as the major source, and the majority of the sampled population complained of irritability, fatigue, and sleep disorders due to noise exposure. The outcome of this research indicates the need for health education on the adverse effects of noise pollution, for sound insulation at places of entertainment, for the implementation of a noise management policy so that the municipality can effectively control and manage noise pollution within its area of jurisdiction, and for regular noise-level monitoring through measurements taken by law-enforcement officers.
/ Department of Environmental Sciences / M. Sc. (Environmental Management)
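Comparing measured levels with a limit such as 40 dB involves energy (logarithmic) averaging rather than a plain arithmetic mean, since decibels are a log scale. The sketch below uses invented readings; only the 40 dB night-time limit comes from the study.

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level: energy-average of dB readings
    (averaging 10**(L/10)), which weights loud events more than a plain mean."""
    mean_energy = sum(10 ** (l / 10) for l in levels_db) / len(levels_db)
    return 10 * math.log10(mean_energy)

# Hypothetical night-time readings (dBA) at one measuring point.
readings = [43.0, 47.5, 52.0, 61.0, 45.5, 49.0]
limit = 40.0                                  # night-time limit used in the study
level = leq(readings)
over = [l for l in readings if l > limit]
print(f"Leq = {level:.1f} dBA; {len(over)}/{len(readings)} readings over {limit} dBA")
```

Note how the single 61 dBA event dominates the equivalent level: one loud episode per night can put a measuring point out of compliance even if most readings are moderate.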
