  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On Non-Parametric Confidence Intervals for Density and Hazard Rate Functions & Trends in Daily Snow Depths in the United States and Canada

Xu, Yang 09 December 2016 (has links)
The nonparametric confidence interval for an unknown function is a useful tool in statistical inference, and thus there exists a wide body of literature on the topic. The primary issues are smoothing parameter selection using an appropriate criterion, followed by the coverage probability and length of the associated confidence interval. Here our focus is on the interval length in general and, in particular, on the variability in the lengths of nonparametric intervals for probability density and hazard rate functions. We start with the analysis of a nonparametric confidence interval for a probability density function, noting that the confidence interval length is directly proportional to the square root of the density function. That is, the variability of the length of the confidence interval is driven by the variance of the estimator used to estimate the square root of the density function. Therefore, we propose and use a kernel-based constant-variance estimator of the square root of a density function. The performance of the resulting confidence intervals is studied through simulations. The methodology is then extended to nonparametric confidence intervals for the hazard rate function. Changing direction somewhat, the second part of this thesis presents a statistical study of daily snow trends in the United States and Canada from 1960 to 2009. A storage model balance equation with periodic features is used to describe the daily snow depth process. Changepoints (inhomogeneity features) are permitted in the model in the form of mean level shifts. The results show that snow depths are mostly declining in the United States. In contrast, snow depths seem to be increasing in Canada, especially in north-western areas of the country. On the whole, more grids are estimated to have an increasing snow trend than a decreasing trend. The changepoint component in the model serves to lessen the overall magnitude of the trends in most locations.
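The proportionality between interval length and the square root of the density can be illustrated with a minimal sketch: a Gaussian-kernel density estimate with normal-approximation pointwise intervals, whose half-width scales with the square root of the estimated density. This is a simplified illustration (bias ignored, bandwidth fixed), not the constant-variance estimator the thesis proposes.

```python
import numpy as np

def kde_pointwise_ci(data, grid, h, z=1.96):
    """Gaussian-kernel density estimate with normal-approximation
    pointwise confidence intervals. Var(f_hat) ~ f * R(K) / (n h),
    so the interval half-width is proportional to sqrt(f_hat)."""
    n = len(data)
    u = (grid[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # Gaussian kernel
    f_hat = K.sum(axis=1) / (n * h)
    R_K = 1.0 / (2 * np.sqrt(np.pi))               # integral of K^2 for the Gaussian kernel
    half = z * np.sqrt(f_hat * R_K / (n * h))      # length proportional to sqrt(f_hat)
    return f_hat, f_hat - half, f_hat + half

rng = np.random.default_rng(0)
x = rng.normal(size=500)
grid = np.linspace(-3, 3, 61)
f, lo, hi = kde_pointwise_ci(x, grid, h=0.4)
```

Because the half-width tracks the square root of the density, intervals are visibly wider near the mode than in the tails, which is exactly the length variability the abstract targets.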
2

Crystallization and Emplacement of the Monte Amarelo Dikes: Magma Storage Assessment on Fogo, Cape Verde Islands / Intrusion och kristallisering av vulkaniska gångbergarter i Monte Amarelo-vulkanen: En studie om magmalagringssystem på ön Fogo, Kap Verde

Risby, Olle January 2017 (has links)
The volcanic island of Fogo belongs to the Cape Verde archipelago, a two-tiered chain of islands situated 500 km west of the African coast. Fogo is regarded as one of the most active volcanoes in the world, with 10 eruptions during the last 250 years. The former shield volcano Monte Amarelo reached 3500 m.a.s.l. before it collapsed into the Atlantic Ocean. The massive landslide event occurred between 124 and 86 ka, forming the Bordeira cliffs and the high plateau Cha das Caldeiras on Fogo. We have collected rock samples from the Bordeira dikes, which intruded into the Bordeira wall prior to collapse. The purpose of the project is to produce a magmatic storage model for Fogo using mineral chemistry and thermobarometric methods. Additionally, we aim to determine the processes prevailing in the magmatic system and the link between the volcanic and plutonic systems. Previous studies of the magma storage beneath Fogo have focused on the volcanics, which show crystallization pressures between 0.45 and 0.68 GPa using clinopyroxene-melt thermobarometry on rims. The Bordeira dikes are basanitic to nephelinitic in composition. The mineral assemblage of the 20 dike samples consists of phenocrystic clinopyroxene ± olivine ± plagioclase ± xenocrystic amphibole. Accessory minerals are titanomagnetite, apatite, nepheline, plagioclase and alkali feldspar in a microcrystalline groundmass. Clinopyroxene displays a large compositional variation, ranging from Mg#38 to Mg#85, with a mean of Mg#71±10 (2 s.d., n=614). Xenocrystic amphibole varies from Mg#37 to Mg#72, with a mean of Mg#62±15 (2 s.d., n=78). Interstitial feldspar forms two groups: one from An#24 to An#79, with a mean of An#66±19 (2 s.d., n=125), and a second from Or#19 to Or#100, with a mean of Or#69±42 (2 s.d., n=71). Bulk geochemistry of the 20 samples ranges from 1.82 to 11.5 wt% MgO. Our clinopyroxene-melt thermobarometry (Putirka et al. 2003) shows crystallization pressures ranging from 0.02 to 0.85 GPa, with a mean of 0.47±0.29 GPa (2 s.d., n=502). Structural data from the intrusive dikes in the Bordeira show three preferred orientations: N-S, NW-SE and E-W (n=371). The main process operating in the magmatic system is fractional crystallization; however, there is some evidence for phenocryst accumulation and magma recharge. Our magma storage model shows that clinopyroxene crystallization initiates in the lithospheric mantle, between 15 and 28 km depth. Significant clinopyroxene rim and microcryst crystallization occurs above the Moho, between 9 and 12 km, implying that magma storage levels do exist in the oceanic crust. The intrusive and extrusive rocks present on Fogo share common storage levels, suggesting that they formed in the same system, the difference being their residence time in the crustal-level storage. Our structural data and 3D model suggest that the Monte Amarelo rift zone was composed of three components, oriented NW-SE, N-NE and E-W. The flank collapse was caused by dike intrusions of N-S orientation, which enabled E-W extension of the shield volcano. / (Translated from Swedish:) The volcanic island of Fogo is part of the Cape Verde island group in the Atlantic, a two-tiered archipelago positioned 500 km west of the African mainland. Fogo has in recent times been one of the most active volcanoes in the world, with 10 eruptions during the last 250 years. The island was built up by the shield volcano Monte Amarelo, which reached 3500 m above sea level before parts of it collapsed into the Atlantic. The massive landslide, which occurred between 86 and 124 thousand years ago, created the high plateau Cha das Caldeiras and the surrounding cliff section, the Bordeira. We have collected rock samples from the intrusive rocks that have penetrated the Bordeira cliff section. The aim of our study is to create a model of how magma storage works beneath Fogo. We intend to map the magma storage depth using chemical variations in minerals, such as clinopyroxene, that can be used to determine crystallization pressure and temperature. We are also interested in which processes operate in the magmatic system and in the relationship between volcanic rocks, e.g. lava, and plutonic rocks. Previous studies of Fogo's magma storage have used volcanic rocks, which crystallized between 0.45 and 0.68 GPa according to the chemistry of clinopyroxene crystal rims. Twenty samples from the Bordeira cliffs have been analyzed; they have low silica contents, between 37 and 47%, and high contents of alkali oxides such as potassium and sodium. The mineral content of the samples consists mainly of larger crystals of the silicate minerals clinopyroxene ± olivine ± feldspar ± xenocrystic amphibole. The larger crystals are surrounded by a microcrystalline groundmass of iron-titanium oxides, apatite and feldspathoids. Clinopyroxene shows a relatively large chemical variation, from Mg#37 to Mg#85, with a mean of Mg#71. We also find two different types of feldspar: one group with a calcium-rich composition, classified as anorthite, and another with a potassium-rich composition, classified as orthoclase. Our clinopyroxene-melt analysis yields crystallization pressures ranging from 0.02 to 0.85 GPa, with a mean of 0.47 GPa. The dominant process in the magma storage system is fractional crystallization, as we see a linear decrease for many elements when plotted against magnesium content. Our magma storage model for Fogo shows that clinopyroxene crystallization begins in the lithospheric mantle, between 15 and 28 km depth. Crystallization of clinopyroxene rims and of smaller crystals in the groundmass occurs at shallower levels, between 9 and 12 km depth, showing that one or more magma storage levels exist in the oceanic crust. The volcanic and plutonic rocks record a shared magma system, indicating that the main difference between the two rock types is the time they spend at their respective storage levels. Our structural data and 3D model show that the intrusive activity was oriented primarily in NW-SE, N-NE and E-W directions. The landslide and collapse of the Monte Amarelo volcano were caused by intruding dikes with a general N-S orientation, which led to a landslide on the eastern flank.
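The Mg# values quoted above are the standard magnesium number, the molar ratio 100·Mg/(Mg+Fe). A minimal sketch of how it is computed from oxide weight percentages follows; the oxide values are illustrative placeholders, not measurements from the thesis dataset.

```python
import statistics

# Molar masses of the oxides, g/mol
M_MGO, M_FEO = 40.304, 71.844

def mg_number(mgo_wt, feo_wt):
    """Mg# = 100 * molar Mg / (molar Mg + molar Fe),
    converting oxide wt% to molar proportions first."""
    mg = mgo_wt / M_MGO
    fe = feo_wt / M_FEO
    return 100.0 * mg / (mg + fe)

# Hypothetical clinopyroxene analyses: (MgO wt%, FeO wt%)
analyses = [(14.2, 6.1), (12.8, 7.9), (15.0, 5.4)]
mg_nums = [mg_number(m, f) for m, f in analyses]
mean = statistics.mean(mg_nums)
two_sd = 2 * statistics.stdev(mg_nums)   # "±N (2 s.d.)" as reported in the abstract
```

Reporting a mean with twice the standard deviation, as the abstract does, summarizes the spread of a crystal population in a single interval.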
3

PERFORMANCE ANALYSIS FOR A RESIDENTIAL-SCALE ICE THERMAL ENERGY STORAGE SYSTEM

Andrew David Groleau (17499033) 30 November 2023 (has links)
<p dir="ltr">Ice thermal energy storage (ITES) systems have been an economical way to cut cooling costs in the commercial sector since the 1980s. An ITES system stores cooling capacity by forming ice within a storage tank during periods of the day when the cost of electricity is low, normally at night. The ice is then melted to absorb heat from the conditioned space. While ITES systems have been successful in the commercial sector, they have yet to take root in the residential sector.</p><p dir="ltr">The U.S. Department of Energy (DoE) has published guidelines for TES. These include providing a minimum of four hours of cooling; shifting 30-50% of a space’s cooling load to non-peak hours; minimizing the weight, volume, complexity, and cost of the system; creating a system that operates for over 10,000 cycles; enacting predictive control measures; and being modular so the system can scale to larger single-family and multi-family homes [1]. The purpose of this research is to develop a model that meets these guidelines.</p><p dir="ltr">Drawing on experimental data, technical specifications, existing models, and best practices from prior work, a MATLAB model was developed. The modeled ITES system comprises a tank 1 m in diameter and 1 m tall, with ice selected as the PCM. A baseline model was constructed with parameters deemed ideal. It describes an ITES system that can be charged in under four hours and can provide a total of 22.18 kWh of cooling for a single-family home over a four-hour period. The model was then validated against experimental data and found to have a root mean squared error of 0.0959 for the system state of charge. During the validation, both the experimental and modeled estimates of the water/ice state within the tank converged at the HTF supply temperature of -5.2°C.</p><p dir="ltr">With the model established, a parametric analysis was conducted to learn how adjusting a few of the system parameters affects it. The first parameter, reducing the pipe radius, has the potential to reduce charge time by 152.6 minutes. The second parameter, varying the heat transfer fluid (HTF) mass flow rate within the prescribed range of 0.7 kg/s to 1.2 kg/s, produced a 4.8-minute increase in charge time at the former value and a 5.4-minute decrease at the latter. The third parameter, increasing the pipe spacing and consequently the ratio of water mass to HTF mass, had a negative impact: a 7.1 mm increase in pipe spacing produced a 16.6-minute increase in charge time, while a 14.2 mm increase produced a 93.3-minute increase and exceeded the five-hour charging time limit.</p><p dir="ltr">This functioning model establishes the foundation for creating a residential-scale ITES system. The adjustability and scalability of the code enable it to be modified to user specifications, allowing various prototypes to be generated from it. The model also lays the groundwork for a combined model of an ITES system and a heat pump operating as one, aiding the understanding of residential-scale ITES systems and their energy effects.</p>
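The state-of-charge tracking and RMSE validation described above can be sketched with a toy latent-heat-only charging balance. The tank dimensions (1 m diameter, 1 m tall) come from the abstract; the 20 kW charging rate, one-minute time step, and the omission of sensible heat and HTF/pipe geometry are simplifying assumptions, not the thesis's MATLAB model.

```python
import math

L_F = 334e3          # latent heat of fusion of water, J/kg
RHO_WATER = 1000.0   # density of water, kg/m^3 (ice/water density difference ignored)

def charge_soc(q_dot_w, mass_kg, dt_s, steps, soc0=0.0):
    """March the state of charge forward under a constant cooling rate:
    each step freezes q_dot * dt joules' worth of water, capped at SOC = 1."""
    soc = soc0
    out = [soc]
    for _ in range(steps):
        soc = min(1.0, soc + q_dot_w * dt_s / (mass_kg * L_F))
        out.append(soc)
    return out

def rmse(model, measured):
    """Root mean squared error between modeled and measured series."""
    return math.sqrt(sum((m, x) == (m, x) and (m - x) ** 2
                         for m, x in zip(model, measured)) / len(model))

# Tank: 1 m diameter x 1 m tall, as in the abstract
mass = RHO_WATER * math.pi * 0.5**2 * 1.0            # ~785 kg of water
soc = charge_soc(q_dot_w=20e3, mass_kg=mass, dt_s=60, steps=240)  # 4 h at 20 kW
```

With these assumed numbers the tank reaches full charge within the four-hour window; the thesis's reported 0.0959 RMSE would be computed exactly as in `rmse`, against the measured state-of-charge series.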
4

Méthodes pour la réduction d’attaques actives à passives en cryptographie quantique / Methods for Reducing Active Attacks to Passive Attacks in Quantum Cryptography

Lamontagne, Philippe 12 1900 (has links)
No description available.
5

Workload- and Data-based Automated Design for a Hybrid Row-Column Storage Model and Bloom Filter-Based Query Processing for Large-Scale DICOM Data Management / Conception automatisée basée sur la charge de travail et les données pour un modèle de stockage hybride ligne-colonne et le traitement des requêtes à l’aide de filtres de Bloom pour la gestion de données DICOM à grande échelle

Nguyen, Cong-Danh 04 May 2018 (has links)
(Translated from French:) In the health care sector, ever-growing medical image data, the development of imaging technologies, the long-term retention of medical data and the increase in image resolution are driving a considerable growth in data volume. In addition, the variety of acquisition devices and the differing preferences of physicians and other health professionals have led to a great variety of data. Although the DICOM (Digital Imaging and Communication in Medicine) standard is now widely adopted for storing and transferring medical data, DICOM data still has the 3V characteristics of Big Data: high volume, high variety and high velocity. Moreover, there is a variety of workloads, including Online Transaction Processing (OLTP), Online Analytical Processing (OLAP) and mixed workloads. Existing systems have limitations with respect to these data and workload characteristics. In this thesis, we propose new efficient methods for storing and querying DICOM data. We propose a hybrid storage model of row and column stores, called HYTORMO, together with data storage and query processing strategies. First, HYTORMO is designed and implemented for deployment in a large-scale environment to enable the management of big medical data. Second, the data storage strategy combines vertical partitioning and a hybrid store to create data storage configurations that can reduce storage space demand and increase workload performance. 
To achieve such a data storage configuration, one of two data storage design approaches can be applied: (1) expert-based design and (2) automated design. In the first approach, experts manually create data storage configurations by grouping the attributes of DICOM data and selecting a suitable data layout for each column group. In the second approach, we propose a hybrid automated design framework, called HADF. HADF relies on similarity measures (between attributes) that take into account the impacts of workload- and data-specific information to generate data storage configurations automatically: Hybrid Similarity (a weighted combination of attribute access similarity and attribute density similarity) is used to group the attributes into column groups; Inter-Cluster Access Similarity is used to determine whether two column groups will be merged or not (to reduce the number of additional joins); and Intra-Cluster Access Similarity is applied to decide whether a column group will be stored in a row store or a column store. Finally, we propose a suitable and efficient query processing strategy built on HYTORMO. It considers the use of inner joins and left outer joins to prevent the data loss that would occur if only inner joins were used between vertically partitioned tables. Moreover, an intersection of Bloom filters is applied to remove irrelevant data from the input tables of join operations; this reduces network I/O costs. (...) 
/ In the health care industry, the ever-increasing medical image data, the development of imaging technologies, the long-term retention of medical data and the increase in image resolution are causing a tremendous growth in data volume. In addition, the variety of acquisition devices and the differing preferences of physicians and other health-care professionals have led to a high variety in data. Although the DICOM (Digital Imaging and Communication in Medicine) standard is now widely adopted for storing and transferring medical data, DICOM data still has the 3V characteristics of Big Data: high volume, high variety and high velocity. Besides, there is a variety of workloads including Online Transaction Processing (OLTP), Online Analytical Processing (OLAP) and mixed workloads. Existing systems have limitations in dealing with these characteristics of data and workloads. In this thesis, we propose new efficient methods for storing and querying DICOM data. We propose a hybrid storage model of row and column stores, called HYTORMO, together with data storage and query processing strategies. First, HYTORMO is designed and implemented for deployment in a large-scale environment to make it possible to manage big medical data. Second, the data storage strategy combines vertical partitioning and a hybrid store to create data storage configurations that can reduce storage space demand and increase workload performance. To achieve such a data storage configuration, one of two data storage design approaches can be applied: (1) expert-based design and (2) automated design. In the former approach, experts manually create data storage configurations by grouping attributes and selecting a suitable data layout for each column group. In the latter approach, we propose a hybrid automated design framework, called HADF. 
HADF depends on similarity measures (between attributes) that take into consideration the combined impact of both workload- and data-specific information to generate data storage configurations: Hybrid Similarity (a weighted combination of Attribute Access and Density Similarity measures) is used to group the attributes into column groups; Inter-Cluster Access Similarity is used to determine whether two column groups will be merged together or not (to reduce the number of joins); and Intra-Cluster Access Similarity is applied to decide whether a column group will be stored in a row or a column store. Finally, we propose a suitable and efficient query processing strategy built on top of HYTORMO. It considers the use of both inner joins and left-outer joins. Furthermore, an Intersection Bloom filter is applied to reduce network I/O cost. We provide experimental evaluations to validate the benefits of the proposed methods over real DICOM datasets. Experimental results show that the mixed use of both row and column stores outperforms a pure row store and a pure column store. The combined impact of both workload- and data-specific information helps HADF produce good data storage configurations. Moreover, the query processing strategy with the use of the Intersection Bloom filter can improve the execution time of an experimental query by up to 50% compared to the case where no filter is applied.
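The Intersection Bloom filter idea described above can be sketched with a minimal pure-Python filter: build one filter per table's join keys, AND the bit arrays to approximate the key intersection, and use the result to prune rows that cannot join. The class and the table keys below are illustrative assumptions, not HYTORMO's actual implementation.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter backed by a Python int as the bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # No false negatives; small false-positive probability
        return all(self.bits >> p & 1 for p in self._positions(item))

    def intersect(self, other):
        # Bitwise AND approximates the filter of the set intersection
        out = BloomFilter(self.m, self.k)
        out.bits = self.bits & other.bits
        return out

# Hypothetical join keys of two vertically partitioned tables
left_keys = ["p1", "p2", "p3", "p7"]
right_keys = ["p2", "p3", "p9"]
bf_l, bf_r = BloomFilter(), BloomFilter()
for key in left_keys:
    bf_l.add(key)
for key in right_keys:
    bf_r.add(key)
ibf = bf_l.intersect(bf_r)

# Prune rows whose key cannot appear in the join result,
# shrinking the data shipped to the join and hence network I/O
pruned_left = [key for key in left_keys if ibf.might_contain(key)]
```

Because a Bloom filter never produces false negatives, every key that actually joins survives the pruning; the occasional false positive only means a little unnecessary data is shipped, never a wrong join result.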
