51

Optimization of the conversion of lignocellulosic agricultural by-products to bioethanol using different enzyme cocktails and recombinant yeast strains

Mubazangi, Munyaradzi March 2011 (has links)
Thesis (MSc)--Stellenbosch University, 2011. / ENGLISH ABSTRACT: The need to mitigate the twin crises of peak oil and climate change has driven a headlong rush to biofuels. This study was aimed at developing a process to efficiently convert steam-explosion-pretreated (STEX) sugarcane bagasse into ethanol by using combinations of commercial enzyme cocktails and recombinant Saccharomyces cerevisiae strains. Though enzymatic saccharification is a promising route to sugars from lignocellulosics, the low enzymatic accessibility of the cellulose and hemicellulose is a key impediment, necessitating the development of an effective pretreatment scheme and optimized enzyme mixtures with the essential accessory activities. In this context, the effect of uncatalysed and SO2-catalysed STEX pretreatment of sugarcane bagasse on the composition of the pretreated material, the digestibility of the water-insoluble solids (WIS) fraction and the overall sugar recovery was investigated. STEX pretreatment with water impregnation was found to result in a higher glucose recovery (28.1 g/100 g bagasse) and produced WIS with a higher enzymatic digestibility, and was therefore used in the optimization of saccharification and fermentation. Response surface methodology (RSM) based on a 3³ factorial design was used to optimize the composition of the saccharolytic enzyme mixture so as to maximize glucose and xylose production from steam-exploded bagasse. It was established that a combination of 20 FPU cellulase/g WIS and 30 IU β-glucosidase/g WIS produced the highest desirability for glucose yield. Subsequently, the optimal enzyme mixture was used to supplement the enzyme activities of recombinant yeast strains co-expressing several cellulases and xylanases in simultaneous saccharification and fermentations (SSFs). In the SSFs, ethanol yield was found to be inversely proportional to substrate concentration, with the lowest ethanol yield of 70% being achieved at a WIS concentration of 10% (w/v). The ultimate process would, however, be a one-step "consolidated" bio-processing (CBP) of lignocellulose to ethanol, in which hydrolysis and fermentation of polysaccharides would be mediated by a single microorganism or microbial consortium without added saccharolytic enzymes. The cellulolytic yeast strains were able to multiply autonomously on sugarcane bagasse and concomitantly produce ethanol, though at very low titres (0.4 g/L). This study therefore confirms that saccharolytic enzymes exhibit synergism and that bagasse is a potential substrate for bioethanol production. Furthermore, the concept of CBP was proven to be feasible. 
/ AFRIKAANS ABSTRACT: The need to mitigate the twin crises of peak oil and climate change has led to biofuel being considered as an alternative energy source. This study was aimed at developing a process to efficiently convert steam-explosion-pretreated (STEX) sugarcane bagasse into ethanol by using combinations of commercial enzyme mixtures and recombinant Saccharomyces cerevisiae strains. Although enzymatic saccharification is promising for obtaining sugars from lignocellulose, the low enzymatic accessibility of the cellulose and hemicellulose is an obstacle, making the development of an effective pretreatment scheme and optimized enzyme mixtures with essential accessory activities necessary. In this context, the effect of uncatalysed and SO2-catalysed steam-explosion pretreatment of sugarcane bagasse on the composition of the pretreated material, the digestibility of the water-insoluble solids (WIS) fraction, and the overall sugar recovery was investigated. Steam-explosion (STEX) pretreatment with water impregnation was found to result in a higher sugar recovery (21.8 g/100 g bagasse) and produced WIS with a higher enzymatic digestibility, and was therefore used in the optimization of saccharification and fermentation. Response surface methodology (RSM), based on a 3³ factorial design, was used to optimize the composition of the saccharolytic enzyme mixture so as to maximize glucose and xylose production from steam-exploded bagasse. It was confirmed that a combination of 20 FPU cellulase/g WIS and 30 IU β-glucosidase/g WIS produced the highest desirability for glucose yield. The optimal enzyme mixture was then used to supplement the enzyme activities of recombinant yeast strains co-expressing various cellulases and xylanases in simultaneous saccharification and fermentations (SSFs). In the SSFs, ethanol production was found to be inversely proportional to substrate concentration, with the lowest ethanol yield of 70% achieved at a WIS concentration of 10% (w/v). The ultimate process would, however, be a one-step "consolidated" bio-processing (CBP) of lignocellulose to ethanol, in which the hydrolysis and fermentation of polysaccharides would be mediated by a single microorganism or microbial consortium without added saccharolytic enzymes. The cellulolytic yeast strains were able to multiply on sugarcane bagasse on their own and concomitantly produce ethanol, albeit at very low titres (0.4 g/L). This study therefore confirms that saccharolytic enzymes exhibit synergism and that bagasse is a potential substrate for bioethanol production. It was also shown that the concept of CBP is feasible. / The National Research Foundation (NRF) for financial support
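For readers unfamiliar with the response surface methodology mentioned above: a 3³ full factorial design varies three factors over three levels (27 runs) and fits a quadratic response surface to locate an optimum. The sketch below illustrates that mechanics only; the factor names, levels and response values are hypothetical placeholders and are not taken from the thesis.

```python
# Illustrative sketch only -- not the thesis code. Shows how a 3^3 full factorial
# design and a quadratic response surface (as used in RSM) can be set up.
# Factor names, levels and the response values below are hypothetical.
import itertools
import numpy as np

# Three factors at three coded levels (-1, 0, +1); e.g. cellulase loading,
# beta-glucosidase loading, and a hypothetical third factor such as solids content.
levels = [-1, 0, 1]
design = np.array(list(itertools.product(levels, repeat=3)))  # 27 runs

def quadratic_terms(x):
    """Full quadratic model terms: intercept, linear, interactions, squares."""
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

X = np.array([quadratic_terms(row) for row in design])

# Hypothetical glucose-yield responses for the 27 runs (one value per run).
rng = np.random.default_rng(0)
y = 60 + 8*design[:, 0] + 5*design[:, 1] - 4*design[:, 0]**2 + rng.normal(0, 1, 27)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit of the surface

# Evaluate the fitted surface on a fine grid to locate the predicted optimum.
grid = np.array(list(itertools.product(np.linspace(-1, 1, 21), repeat=3)))
pred = np.array([quadratic_terms(g) for g in grid]) @ beta
best = grid[np.argmax(pred)]
print("coded factor settings with highest predicted yield:", best)
```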
52

Studies in support of a quantitative approach to hazardous area classification

Cox, Andrew William January 1989 (has links)
A study was made of the feasibility of putting hazardous area classification (HAC) on a more quantitative basis. A review of current HAC practice showed that the widespread policy of setting fixed zone distances around sources of hazard was subjective and sometimes led to inconsistencies between different codes of practice when applied to the same situation. Fatality and injury statistics were used to show that there is a significant risk to workers from the ignition of flammable atmospheres, which should be reduced. Data were researched and compiled to fit into a proposed framework for the quantification of HAC. These included information concerning leak source inventory, source leak frequency, and source leak size distribution. Mathematical models were collected which could be used to describe the emission and dispersion of flammable releases. Example calculations were performed for typical leak scenarios to illustrate the variation in hazard distances. Estimates were made of the ignition and explosion probabilities of flammable leaks, which depended principally on emission size. To compensate for uncertainties in the researched data, a fire and explosion model was devised to estimate the ignition frequency on a typical process plant. The model was applied to a "standard" plant which was formulated from researched data. By iteratively checking the estimated ignition frequencies against historical data, it was concluded that reasonable agreement was achieved with some adjustment of the input data. The special problems of HAC of indoor plants were also addressed. It was concluded that the results of this study provided a basic framework for the quantification of HAC, although the quality of currently available data necessary for quantification is generally poor. The acquisition of better-quality leak and ignition data should provide a platform from which the current work may progress. Further work should include further refinement of the basic fire and explosion model to account for ignitions that HAC cannot protect against, such as autoignitions. It was also noted that the behaviour of indoor releases requires clarification, together with the concept of a minimum flammable inventory below which there is negligible risk of ignition.
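The quantitative framework described above combines a leak-source inventory, per-source leak frequencies, a leak-size distribution and size-dependent ignition probabilities into an overall plant ignition frequency. A minimal sketch of that bookkeeping follows; all source counts, frequencies and probabilities are hypothetical placeholders, not values from the thesis.

```python
# Illustrative sketch only -- not from the thesis. Shows the general shape of a
# quantified ignition-frequency estimate: (number of leak sources) x (leak
# frequency per source) x (ignition probability, increasing with leak size).
# All counts, frequencies and probabilities below are hypothetical placeholders.

# Hypothetical leak-source inventory for a "standard" plant.
inventory = {            # source type: number of items on the plant
    "flange": 400,
    "valve": 150,
    "pump_seal": 20,
}

# Hypothetical leak frequency per item per year, split by leak-size band.
leak_freq = {            # (source, size band) -> leaks per item-year
    ("flange", "minor"): 1e-4, ("flange", "major"): 1e-5,
    ("valve", "minor"): 5e-4, ("valve", "major"): 5e-5,
    ("pump_seal", "minor"): 5e-3, ("pump_seal", "major"): 5e-4,
}

# Hypothetical probability that a release of a given size finds an ignition source.
ignition_prob = {"minor": 0.01, "major": 0.1}

total = 0.0
for (source, size), freq in leak_freq.items():
    total += inventory[source] * freq * ignition_prob[size]

print(f"estimated ignition frequency: {total:.3e} per plant-year")
```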
53

Feedstock and process variables influencing biomass densification

Shaw, Mark Douglas 17 March 2008
Densification of biomass is often necessary to combat the negative storage and handling characteristics of these low bulk density materials. A consistent, high-quality densified product is strongly desired, but not always delivered. Within the context of pelleting and briquetting, binding agents are commonly added to comminuted biomass feedstocks to improve the quality of the resulting pellets or briquettes. Many feedstocks naturally possess such binding agents; however, they may not be abundant enough or available in a form or state to significantly contribute to product binding. Also, process parameters (pressure and temperature) and material variables (particle size and moisture content) can be adjusted to improve the quality of the final densified product.

Densification of ground biomass materials is still not a science, as much work is still required to fully understand how the chemical composition and physical properties, along with the process variables, impact product quality. Generating densification and compression data, along with physical and mechanical properties of a variety of biomass materials, will allow for a deeper understanding of the densification process. This in turn will result in the design of more efficient densification equipment, thus improving the feasibility of using biomass for chemical and energy production.

Experiments were carried out in which process (pressure and temperature) and material (particle size and moisture content) variables were studied for their effect on the densification process (compression and relaxation characteristics) and the physical quality of the resulting products (pellets). Two feedstocks were selected for the investigation, namely poplar wood and wheat straw, two prominent Canadian biomass resources. Steam explosion pretreatment was also investigated as a potential method of improving the densification characteristics and binding capacity of the two biomass feedstocks.

Compression/densification and relaxation testing was conducted in a closed-end cylindrical die at loads of 1000, 2000, 3000, and 4000 N (31.6, 63.2, 94.7, and 126.3 MPa) and die temperatures of 70 and 100°C. The raw poplar and wheat straw were first ground through a hammer mill fitted with 0.8 and 3.2 mm screens, while the particle size of the pretreated poplar and wheat straw was not adjusted. The four feedstocks (2 raw and 2 pretreated) were also conditioned to moisture contents of 9 and 15% wb prior to densification.

Previously developed empirical compression models fitted to the data showed that, along with particle rearrangement and deformation, additional compression mechanisms were present during compression. The compressibility and asymptotic modulus of the biomass grinds were increased by increasing the die temperature and decreasing product moisture content. While particle size did not have a significant effect on compressibility, reducing it increased the resultant asymptotic modulus. Steam explosion pretreatment served to decrease the compressibility and asymptotic modulus of the grinds.

In terms of physical quality of the resulting product, increasing the applied load naturally increased the initial density of the pellets (immediately after removal from the die). Increasing the die temperature increased the initial pellet density, decreased the dimensional (diametral and longitudinal) expansion (after 14 days), and increased the tensile strength of the pellets. Decreasing the raw feedstock particle size increased the initial pellet density, decreased diametral expansion (with no effect on longitudinal expansion), and increased the tensile strength of the pellets. Decreasing the moisture content of the feedstocks gave higher initial pellet densities, but also increased dimensional expansion. The pretreated feedstocks generally had higher initial pellet densities than the raw grinds; they also shrank in diameter and length and had higher tensile strengths than the raw feedstocks. The superior performance of the pretreated poplar and wheat straw (compared to their raw counterparts) was attributed to the disruption of the lignocellulosic structure and the removal/hydrolysis of hemicellulose during the steam pretreatment process, which was verified by chemical and Fourier transform infrared analyses. As a result, a higher relative amount of lignin was present, and the removal/hydrolysis of hemicellulose indicates that this lignin was more readily available for binding, thus producing superior pellets.
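The empirical compression models referred to above relate applied pressure to the volume reduction of the powder bed. As a hedged illustration only (the abstract does not specify which models were fitted), the sketch below fits the widely used Kawakita-Lüdde model to hypothetical pressure-volume data via its linearized form.

```python
# Illustrative sketch only -- the model shown here (Kawakita-Luedde) is one
# common empirical choice for powder/biomass compression; the pressure and
# volume numbers are hypothetical placeholders, not measurements from the study.
import numpy as np

# Hypothetical compression data: applied pressure (MPa) and bulk volume (cm^3).
P = np.array([31.6, 63.2, 94.7, 126.3])
V = np.array([4.1, 3.4, 3.1, 2.95])
V0 = 8.0                                # hypothetical initial bulk volume

C = (V0 - V) / V0                       # degree of volume reduction
# Kawakita: C = a*b*P / (1 + b*P), linearized as P/C = P/a + 1/(a*b),
# so a straight-line fit of P/C against P gives the constants a and b.
slope, intercept = np.polyfit(P, P / C, 1)
a = 1.0 / slope                         # 'a' ~ total achievable volume reduction
b = 1.0 / (a * intercept)               # 'b' relates to yield/deformation behaviour

print(f"Kawakita a = {a:.3f}, b = {b:.4f} 1/MPa")
```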
54

Numerical modelling and observations of nuclear-explosion coda wavefields

Zhang, Chaoying 04 May 2009
Frequency-dependent earthquake coda attenuation values are often reported; however, such measurements usually depend on the type of attenuation model employed. In this thesis, I use numerical modeling of Peaceful Nuclear Explosion (PNE) codas at far-regional to teleseismic distances to compare two such models, namely the conventional frequency-dependent attenuation with parameters (Q0, η) defined by Qcoda(f) = Q0·f^η, and frequency-independent effective attenuation (Qe) combined with geometrical attenuation (γ). The results strongly favour the (γ, Qe) model and illustrate the mechanisms leading to apparent Qcoda(f) dependencies. Tests for variations of the crustal velocity structures show that the values of γ are stable and related to lithospheric structural types, and that the inverted Qe values can be systematically mapped into the true S-wave attenuation factors within the crust. Modeling also shows that γ could increase in areas where relatively thin attenuating layers are present within the crust; such areas are likely related to younger and active tectonics. By contrast, when interpreted using the traditional (Q0, η) approach, the synthetic coda shows a strong and spurious frequency dependence with η ≈ 0.5, which is also similar to many published observations.

Observed Lg codas from two Peaceful Nuclear Explosions located in different areas of Russia show similar values of γ ≈ 0.75·10^-2 s^-1, which are also remarkably close to the independent numerical predictions in this thesis. At the same time, coda Qe values vary strongly, from 850 in the East European Platform to 2500 within the Siberian Craton. This suggests that the parameters γ and Qe could provide stable and transportable discriminants for differentiating between lithospheric tectonic types and ages, and also for seismic coda regionalization in nuclear-test monitoring research.
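A compact way to see how the frequency-independent (γ, Qe) description can masquerade as a frequency-dependent Qcoda(f) is to assume the coda amplitude decays as exp[-χ(f)t] with a total temporal attenuation coefficient χ(f) = γ + πf/Qe; equating this with the conventional πf/Qcoda(f) gives the relation below (a sketch of what the abstract implies, not the thesis's own derivation):

```latex
\frac{\pi f}{Q_{\mathrm{coda}}(f)} \;=\; \chi(f) \;=\; \gamma + \frac{\pi f}{Q_e}
\qquad\Longrightarrow\qquad
Q_{\mathrm{coda}}(f) \;=\; \frac{\pi f}{\gamma + \pi f / Q_e}
```

At low frequencies Qcoda(f) grows roughly as πf/γ, and at high frequencies it saturates toward Qe; fitting such behaviour with the conventional Qcoda(f) = Q0·f^η therefore yields an apparent positive η even though Qe itself is frequency independent.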
57

Applications of lattice theory to model checking

Kashyap, Sujatha 27 April 2015 (has links)
Society is increasingly dependent on the correct operation of concurrent and distributed software systems. Examples of such systems include computer networks, operating systems, telephone switches and flight control systems. Model checking is a useful tool for ensuring the correctness of such systems, because it is a fully automatic technique whose use does not require expert knowledge. Additionally, model checking allows for the production of error trails when a violation of a desired property is detected. Error trails are an invaluable debugging aid, because they provide the programmer with the sequence of events that lead to an error. Model checking typically operates by performing an exhaustive exploration of the state space of the program. Exhaustive state space exploration is not practical for industrial use in the verification of concurrent systems because of the well-known phenomenon of state space explosion caused by the exploration of all possible interleavings of concurrent events. However, the exploration of all possible interleavings is not always necessary for verification. In this dissertation, we show that results from lattice theory can be applied to ameliorate state space explosion due to concurrency, and to produce short error trails when an error is detected. We show that many CTL formulae exhibit lattice-theoretic structure that can be exploited to avoid exploring multiple interleavings of a set of concurrent events. We use this structural information to develop efficient model checking techniques for both implicit (partial order) and explicit (interleaving) models of the state space. For formulae that do not exhibit the required structure, we present a technique called predicate filtering, which uses a weaker property with the desired structural characteristics to obtain a reduced state space which can then be exhaustively explored. We also show that lattice theory can be used to obtain a path of shortest length to an error state, thereby producing short error trails that greatly ease the task of debugging. We provide experimental results from a wide range of examples, showing the effectiveness of our techniques at improving the efficiency of verifying and debugging concurrent and distributed systems. Our implementation is based on the popular model checker SPIN, and we compare our performance against the state-of-the-art state space reduction strategies implemented in SPIN.
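To give a sense of the interleaving blow-up the abstract refers to: the number of distinct interleavings of n concurrent processes that each execute k independent steps is the multinomial coefficient (nk)!/(k!)^n. The short sketch below evaluates it; it is illustrative only and is not tied to SPIN or to the dissertation's implementation.

```python
# Illustrative sketch only -- quantifies the "state space explosion" the abstract
# refers to: the number of distinct interleavings of n concurrent processes, each
# executing k independent steps, is (n*k)! / (k!)^n.
from math import factorial

def interleavings(n_procs: int, steps_each: int) -> int:
    """Number of ways to interleave n_procs sequences of steps_each events."""
    return factorial(n_procs * steps_each) // factorial(steps_each) ** n_procs

for n in (2, 3, 4, 5):
    print(n, "processes x 5 steps ->", interleavings(n, 5), "interleavings")
```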
58

A study of dispersion and combustion of particle clouds in post-detonation flows

Gottiparthi, Kalyana Chakravarthi 21 September 2015 (has links)
Augmentation of the impact of an explosive is routinely achieved by packing metal particles into the explosive charge. When the charge is detonated, the particles are ejected and dispersed. The ejecta influences the post-detonation combustion processes that bolster the blast wave and determines the total impact of the explosive. It is therefore vital to understand the dispersal and combustion of the particles in the post-detonation flow, and numerical simulations have been indispensable in developing important insights. Because of the accuracy of Eulerian-Lagrangian (EL) methods in capturing the particle interaction with the post-detonation mixing zone, EL methods have been preferred over Eulerian-Eulerian (EE) methods. However, in most cases the number of particles in the flow renders simulations using an EL method infeasible. To overcome this problem, a combined EE-EL approach is developed by coupling a massively parallel EL approach with an EE approach for granular flows. The overall simulation strategy is employed to simulate the interaction of ambient particle clouds with homogeneous explosions and the dispersal of particles after detonation of heterogeneous explosives. Explosives packed with aluminum particles are also considered, and the aluminum particle combustion in the post-detonation flow is simulated. The effect of particles, both reactive and inert, on the combustion processes is analyzed. The challenging task of solving for clouds of micron and sub-micron particles in complex post-detonation flows is successfully addressed in this thesis.
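As a rough illustration of the Eulerian-Lagrangian coupling described above (not the thesis's solver): each Lagrangian particle is dragged toward the local Eulerian gas velocity with a characteristic response time. The sketch below uses a simple Stokes-drag relaxation and an entirely hypothetical post-blast gas velocity field; real post-detonation drag laws are considerably more involved.

```python
# Illustrative sketch only -- not the solver from the thesis. Shows the core of an
# Eulerian-Lagrangian (EL) particle update: each Lagrangian particle is accelerated
# by drag toward the local Eulerian gas velocity. Gas properties, particle
# properties and the drag law (Stokes drag) are simplified placeholders.
import numpy as np

rho_p, d_p, mu_g = 2700.0, 5e-6, 1.8e-5     # particle density, diameter; gas viscosity (SI)
tau_p = rho_p * d_p**2 / (18.0 * mu_g)      # Stokes response time of the particle

def gas_velocity(x, t):
    """Hypothetical decaying post-blast gas velocity field (m/s)."""
    return 300.0 * np.exp(-x / 0.5) * np.exp(-t / 1e-3)

x, v = 0.0, 0.0                              # particle position and velocity
dt = 1e-6
for step in range(2000):
    t = step * dt
    u = gas_velocity(x, t)
    v += dt * (u - v) / tau_p                # drag relaxation toward gas velocity
    x += dt * v

print(f"particle position after {2000*dt*1e3:.1f} ms: {x*100:.2f} cm, velocity {v:.1f} m/s")
```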
59

The Influence of Particle Size and Crystalline Level on the Combustion Characteristics of Particulated Solids

Castellanos Duarte, Diana Yazmin 16 December 2013 (has links)
Over the past years, catastrophic dust explosion incidents have caused numerous injuries, fatalities and economic losses. Dust explosions are rapid exothermic reactions that take place when a combustible dust is mixed with air in the presence of an ignition source within a confined space. A variety of strategies are currently available to prevent dust explosion accidents. However, the recurrence of these tragic events confirms flaws in process safety for dust-handling industries. This dissertation reports advances in different approaches that can be followed to prevent and mitigate dust explosions. For this research, a 36 L dust explosion vessel was designed, assembled and automated to perform controlled dust explosion experiments. First, we explored the effect of size polydispersity on the evolution of aluminum dust explosions. By systematically modifying the span of the particle size distribution, we demonstrated the dramatic effect of polydispersity on the initiation and propagation of aluminum dust explosions. A semi-empirical combustion model was used to quantify the laminar burning velocity at varying particle size. Moreover, correlations of ignition sensitivity and rate of pressure rise with polydispersity were developed. Second, we analyzed the effect of particle size and crystalline level on the decomposition reactions of explosion inhibitor agents (i.e., phosphates). We fractionated monobasic (NH4H2PO4) and dibasic ((NH4)2HPO4) ammonium phosphate into different size ranges, and synthesized zirconium phosphate (Zr(HPO4)2·H2O) at varying size and crystalline levels. Particle size was found to be crucial to improving the rate of heat absorption of each inhibitor. A simplified model was developed to identify the factors dominating the efficiency of dust explosion inhibitors. Finally, we conducted computational fluid dynamics (CFD) simulations to predict overpressures in dust explosions vented through ducts in large-scale scenarios. We particularly focused on the adverse effects caused by flow restrictions in vent ducts. Critical parameters, including ignition position, geometric configuration of the vent duct, and obstructions of the outflow such as bends and panels, were investigated. Comparison between simulation and experimental results elucidated potential improvements to available guidelines. The theoretical analyses complemented the experimental work to provide a better understanding of the effects of particle size on the evolution of dust explosions. Furthermore, the validation of advanced simulation tools is considered crucial to overcoming current limitations in predicting dust explosions in large-scale scenarios.
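For context on how results from a 36 L vessel can be compared with other vessel sizes, explosion-severity data are conventionally reported through the cube-root scaling K_St = (dP/dt)_max · V^(1/3). The snippet below applies that relation to a hypothetical pressure-rise value; it is not a measurement from this work.

```python
# Illustrative sketch only -- not data from the dissertation. Applies the standard
# cube-root scaling law used to report dust-explosion severity,
# K_St = (dP/dt)_max * V^(1/3), which lets results from vessels of different
# volume (e.g. a 36 L vessel vs. the standard 20 L sphere) be compared.
V = 0.036                      # vessel volume in m^3 (36 L)
dP_dt_max = 350.0              # hypothetical maximum rate of pressure rise, bar/s

K_St = dP_dt_max * V ** (1.0 / 3.0)   # bar*m/s
print(f"K_St = {K_St:.0f} bar*m/s")
```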
60

Underwater Pressure Pulses Generated by Mechanically Alloyed Intermolecular Composites

Maines, Geoffrey C. 25 March 2014 (has links)
Recently, the use of thermite-based pressure waves for applications in cellular transfection and drug delivery has shown significant improvements over previous technologies. In the present study, a new technique for producing thermite-generated pressure pulses using fully dense nano-scale thermite mixtures was evaluated. This was accomplished by evaluating a stoichiometric mixture of aluminium (Al) and copper(II) oxide (CuO) prepared by mechanical alloying. Flame propagation speeds, constant-volume pressure characteristics and underwater pressure characteristics of both a micron-scale and a mechanically alloyed mixture were measured experimentally and compared with conventional nano-scale thermites. It was determined that mechanically alloyed mixtures are capable of attaining flame propagation speeds of the same order as nano-scale mixtures, with flame speeds reaching approximately 100 m/s. Constant-volume pressure experiments indicated that mechanically alloyed mixtures result in lower pressurization rates than conventional nano-scale mixtures; however, an improvement of as much as an order of magnitude was achieved compared with micron-scale mixtures. Thermochemical equilibrium predictions for the pressures observed in constant-volume reactions were found to capture the equilibrium pressure relatively well for both low and high values of relative density. Generally, the predictions overestimated the measured pressures by approximately 60%. Results from underwater experiments indicated that the mechanically alloyed samples produced peak shock pressures and waveforms similar to those for a nano-scale Al-Bi2O3 mixture reported by Apperson et al. (2008). In an effort to model the pressure signal obtained from the underwater reaction, calculations were performed based on the rate of expansion of the high-pressure gas sphere. The predicted pressures were found to agree fairly well in terms of both the peak pressure and the pressurization rate. The present study has thus identified the ability of mechanically alloyed thermite mixtures to produce underwater pressure profiles that may be conducive to applications in cellular transfection and drug delivery. 
/ FRENCH ABSTRACT: Recently, the use of pressure waves produced by thermite mixtures for applications in cellular transfection and drug delivery has shown significant improvements over previous technologies. In the present study, a new technique for producing pressure pulses generated by a mechanically alloyed thermite mixture was evaluated. This was accomplished by evaluating a stoichiometric mixture of aluminium (Al) and copper(II) oxide (CuO) prepared by mechanical alloying. Flame propagation speeds, constant-volume combustion pressure characteristics and underwater combustion pressure characteristics were measured experimentally and compared with conventional nano-scale thermites. We determined that the mechanically alloyed mixtures are capable of reaching flame propagation speeds of the same order as nano-scale mixtures, up to approximately 100 m/s. The constant-volume combustion experiments indicate that the mechanically alloyed mixtures produce lower pressurization rates than conventional nano-scale mixtures; however, an improvement of nearly an order of magnitude was achieved compared with micron-scale mixtures. Thermochemical predictions of the combustion pressures proved able to capture relatively well the values observed in the constant-volume experiments. In general, the predictions overestimated the measured pressures by approximately 60%. The results of the underwater experiments indicated that the mechanically alloyed samples produced pressures and waveforms similar to those produced by a nano-scale Al-Bi2O3 mixture, as reported by Apperson et al. (2008). To model the pressures obtained in the underwater experiments, calculations based on the rate of expansion of the high-pressure gas bubble were carried out. The predicted pressures were found to be in reasonable agreement with the observed peak pressure and pressurization rate. This study has thus identified the possibility of using mechanically alloyed thermite mixtures to produce underwater pressure profiles suitable for applications in cellular transfection and drug delivery.
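The underwater pressure modelling mentioned above is based on the expansion rate of the high-pressure gas sphere. One simple, hedged way to sketch that idea is the acoustic-monopole estimate p(r,t) ≈ ρ_w·V̈(t)/(4πr), where V̈ is the volume acceleration of the sphere; the radius history and observation distance below are hypothetical placeholders, not the thesis's model or data.

```python
# Illustrative sketch only -- not the model from the thesis. Estimates the radiated
# underwater pressure from the expansion of a small high-pressure gas sphere using
# the acoustic-monopole relation p(r, t) = rho_w * Vddot(t) / (4*pi*r), where
# Vddot is the volume acceleration. The radius history R(t) is hypothetical.
import numpy as np

rho_w = 1000.0                         # water density, kg/m^3
r_obs = 0.05                           # observation distance, m (5 cm)

t = np.linspace(0.0, 200e-6, 2001)     # 200 microseconds
R = 1e-3 + 4e-3 * (1.0 - np.exp(-t / 50e-6))   # hypothetical sphere radius, m

V = 4.0 / 3.0 * np.pi * R**3           # gas-sphere volume
Vddot = np.gradient(np.gradient(V, t), t)      # volume acceleration, m^3/s^2

p = rho_w * Vddot / (4.0 * np.pi * r_obs)      # radiated pressure, Pa
print(f"peak predicted pressure: {p.max()/1e6:.2f} MPa at t = {t[np.argmax(p)]*1e6:.0f} us")
```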
