821 |
Algorithms and protocols for multi-channel wireless networks
Kakumanu, Sandeep 03 November 2011 (has links)
A wireless channel is shared by all devices in the vicinity that are tuned to it, and at any given time only one of those devices can transmit. One way to overcome this limitation in throughput capacity is to use multiple orthogonal channels for different devices that want to transmit at the same time. In this work, we consider the use of multiple orthogonal channels in wireless data networks. We explore algorithms and protocols for such multi-channel networks under two broad categories of challenges: network-wide and link-level. Towards handling the network-wide issues, we consider channel assignment and routing in both single-radio and multi-radio multi-channel networks. For single-radio multi-channel networks, we propose a new granularity for channel assignment that we refer to as component-level channel assignment. The strategy is relatively simple and offers several practical advantages. For multi-radio multi-channel networks, we propose a joint routing and channel assignment protocol called Lattice Routing. The protocol manages the channels of the radios at the different nodes using information about current channel conditions, and adapts to varying traffic patterns in order to use the multiple channels efficiently. Through ns-2-based simulations, we show that both protocols outperform existing multi-channel protocols under different network environments. Towards handling the link-level challenges, we identify the practical obstacles to achieving a high-data-rate wireless link between two devices using multiple off-the-shelf wireless radios.
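As a rough illustration of assigning channels at component granularity (a toy sketch under assumed semantics, not the thesis protocol): nodes in one connected component share a channel, so intra-component links work with a single radio, and channels are reused round-robin when components outnumber channels.

```python
from collections import deque

def component_channel_assignment(links, channels):
    """Assign one channel per connected component (illustrative sketch).

    links: iterable of (u, v) node pairs; channels: list of orthogonal channels.
    All nodes in a component share a channel; channels are reused round-robin
    across components.
    """
    adj = {}
    for u, v in links:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    assignment, seen, comp = {}, set(), 0
    for start in adj:
        if start in seen:
            continue
        ch = channels[comp % len(channels)]  # reuse channels if needed
        comp += 1
        q = deque([start])
        seen.add(start)
        while q:  # BFS over the component, tagging every node with ch
            node = q.popleft()
            assignment[node] = ch
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    q.append(nb)
    return assignment

links = [("a", "b"), ("b", "c"), ("d", "e")]
print(component_channel_assignment(links, channels=[1, 6, 11]))
```

Here the component {a, b, c} gets channel 1 and the component {d, e} gets channel 6; the real scheme in the thesis involves further considerations not modeled here.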
Given that the IEEE 802.11 a/g standards define 3 orthogonal Wi-Fi channels in the 2.4 GHz band and 12 orthogonal Wi-Fi channels in the 5 GHz band, we answer the following question: can a pair of devices, each equipped with 15 Wi-Fi radios, use all the available orthogonal channels to achieve a high-data-rate link operating at 600 Mbps? Surprisingly, we find through experimental evaluation that the observed performance when using all fifteen orthogonal channels between two devices is a mere 91 Mbps. We identify the reasons behind the low performance and present Glia, a software-only solution that effectively exercises all available radios. We prototype Glia and show through experimental evaluation that it achieves close to 600 Mbps when using all possible Wi-Fi channels.
|
822 |
Characterizing, imaging, and quantifying the environmental behavior and biological interactions of metal-based nanoparticles
Zhang, Wen 24 June 2011 (has links)
Due to the rapid expansion of nanotechnology and the increasing applications of nanomaterials in production and development, it is essential to evaluate their potential impacts on human health, ecosystems, and the environment. This study focuses on the interactions between metal-based nanoparticles (NPs) and target cells, aiming to establish fundamental knowledge for understanding nanotoxicity and its connection to particle properties. The study is organized into three levels: the first quantitatively characterizes the physicochemical properties of the NPs of interest and their dynamic changes under varying environmental conditions. The second evaluates the biological interactions of representative NPs, with a specific focus on size-dependent adsorption processes, interfacial forces, cellular disruption, and membrane damage. The third develops effective, accurate, and validated tools based on atomic force microscopy (AFM) to characterize NPs in terms of nanoscale hydrophobicity and nanoscale electric properties, two of the most relevant properties in bio-nano interactions. Overall, this study systematically investigated the kinetic environmental behavior, biological interactions, and unique nanoscale properties of metal-based NPs, and should be of interest to those working on both the applications and implications of nanotechnology.
|
823 |
Untersuchungen zum Einfluss von London-Dispersionswechselwirkungen auf die Molekülaggregation / Influence of London dispersion on molecular aggregation
Altnöder, Jonas 21 May 2015 (has links)
No description available.
|
824 |
Impacts of Artificial Reefs on Surrounding Ecosystems
Manoukian, Sarine 01 January 2011 (has links)
Artificial reefs are becoming a popular biological and management component in shallow-water environments characterized by soft seabeds, serving both as important marine habitats and as tools to manage coastal fisheries and resources. An artificial reef in the marine environment acts as an open system with exchange of material and energy, altering the physical and biological characteristics of the surrounding area. Reef stability depends on the balance of scour, settlement, and burial resulting from ocean conditions over time. Because of the unstable nature of the sediments, reef sites require detailed and systematic investigation.
Acoustic systems like high-frequency multibeam sonar are efficient tools in monitoring the environmental evolution around artificial reefs, whereas water turbidity can limit visual dive and ROV inspections. A high-frequency multibeam echo sounder offers the potential of detecting fine-scale distribution of reef units, providing an unprecedented level of resolution, coverage, and spatial definition. How do artificial reefs change over time in relation to the coastal processes? How accurately does multibeam technology map different typologies of artificial modules of known size and shape? How do artificial reefs affect fish school behavior? What are the limitations of multibeam technology for investigating fish school distribution as well as spatial and temporal changes? This study addresses the above questions and presents results of a new approach for artificial reef seafloor mapping over time, based upon an integrated analysis of multibeam swath bathymetry data and geoscientific information (backscatter data analysis, SCUBA observations, physical oceanographic data, and previous findings on the geology and sedimentation processes, integrated with unpublished data) from Senigallia artificial reef, northwestern Adriatic Sea (Italy) and St. Petersburg Beach Reef, west-central Florida continental shelf. A new approach for observation of fish aggregations associated with Senigallia reef based on the analysis of multibeam backscatter data in the water column is also explored.
The settlement of the reefs and any terrain change are investigated over time, providing a useful description of the local hydrodynamics and geological processes. All the artificial structures (made of water-based concrete for the Senigallia reef and mainly steel for the St. Petersburg Beach reef) are identified, and those showing substantial horizontal and/or vertical movement are analyzed in detail. Most artificial modules of the Senigallia reef are not intact, and scour signatures are well developed around them, indicating reversals of the local current; this is due both to the wind pattern and to the close arrangement of the reef units, which tends to deflect the bottom flow. As regards the St. Petersburg Beach reef, all the man-made steel units remain upright. Only a large barge shows a gradual collapse of its south side, with well-developed scour at its east-northeast side indicating dominant bottom flow from west-southwest to east-northeast. While an overall seafloor shallowing of about 0.30 m from down-current deposits was observed at the Senigallia reef, an overall deepening of about 0.08 m due to scour was observed at the St. Petersburg Beach reef.
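At its core, repeat-survey change detection of the kind described here reduces to differencing co-registered bathymetric grids. A minimal numpy sketch with invented depths (the 0.30 m and 0.08 m signals merely mimic the magnitudes reported above):

```python
import numpy as np

# Hypothetical co-registered 5x5 depth grids (metres, positive down) from
# two multibeam surveys; all values are invented for illustration.
survey_t0 = np.full((5, 5), 12.0)
survey_t1 = survey_t0.copy()
survey_t1[2, 2] -= 0.30   # shallowing, e.g. down-current deposits
survey_t1[0, 4] += 0.08   # deepening, e.g. scour

change = survey_t0 - survey_t1   # positive = shallowing, negative = deepening
print("max shallowing (m):", round(float(change.max()), 2))
print("max deepening (m):", round(float(-change.min()), 2))
```

Real workflows add tide corrections, sound-velocity corrections, and grid co-registration before differencing; none of that is modeled here.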
Based on the backscatter data interpretation, surficial sediments are coarser in the vicinity of both artificial reefs than the corresponding surrounding sediments. Scouring reveals this coarser layer underneath the prevalent mud sediment at the Senigallia reef and the predominant silt sediment at the St. Petersburg Beach reef. Over the ten years of the Senigallia reef study, large-scale variations between clay and silt appear to be directly linked to large flood events that occurred just prior to each change.
As regards the water column investigation, acoustic backscatter from fish aggregations gives detailed information on their morphology and spatial distribution, and relative fish biomass estimates can be extrapolated. Results suggest that most fish aggregations are associated with the artificial modules, showing a tendency toward mid- and bottom-water depths rather than surface waters.
This study contributes to understanding how artificial reefs change over time in relation to coastal processes. Moreover, the preliminary results concerning the water-column backscatter data represent progress in fisheries acoustics research enabled by three-dimensional acoustics, and demonstrate the benefits of multibeam sonar as a tool to investigate and quantify the size distribution and geometry of fish aggregations associated with shallow marine habitats.
|
825 |
Multi-Agent Systems with Reciprocal Interaction Laws
Chen, Xudong 06 June 2014 (has links)
In this thesis, we investigate a special class of multi-agent systems, which we call reciprocal multi-agent (RMA) systems. The evolution of agents in an RMA system is governed by interactions between pairs of agents. Each interaction is reciprocal, and the magnitude of attraction/repulsion depends only on the distance between the agents. We investigate the class of RMA systems from four perspectives: two basic properties of the dynamical system, a formula for computing the Morse indices/co-indices of critical formations, and a formation control model that is a variation of the class of RMA systems. An important aspect of RMA systems is that an equivariant potential function is associated with each RMA system, so that the equations of motion of the agents form a gradient flow. The two basic properties of this class of gradient systems we investigate are the convergence of the gradient flow and the question whether the associated potential function is generically an equivariant Morse function. We develop systematic approaches to these two problems and establish important results. An RMA system often has multiple critical formations, which are in general hard to locate. We therefore consider a special class of RMA systems for which there is a geometric characterization of each critical formation, and develop an associated formula for computing the Morse index/co-index of each critical formation. This formula has a potential impact on the design and control of RMA systems. We also consider a formation control model in which the control of the formation is achieved by varying the interactions between selected pairs of agents. This model can be interpreted in different ways in terms of patterns of information flow, and we establish results about the controllability of this control system for both centralized and decentralized problems. / Engineering and Applied Sciences
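The gradient-flow structure of a reciprocal pairwise interaction can be sketched numerically for a toy potential (an illustrative phi chosen here, not one from the thesis): each agent descends the gradient of the summed pairwise potential, and the reciprocal force law drives the formation toward a critical formation.

```python
import numpy as np

def rma_step(x, grad_phi, dt=0.01):
    """One Euler step of the gradient flow x_i' = -sum_j phi'(r_ij) (x_i - x_j)/r_ij."""
    v = np.zeros_like(x)
    n = len(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = x[i] - x[j]
            r = np.linalg.norm(d)
            v[i] -= grad_phi(r) * d / r  # reciprocal: force on j is equal and opposite
    return x + dt * v

# Toy pairwise potential phi(r) = (r - 1)^2 / 2: repulsive inside r = 1,
# attractive beyond it, so critical formations have unit pairwise distances.
grad_phi = lambda r: r - 1.0

x = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.4]])
for _ in range(2000):
    x = rma_step(x, grad_phi)
print(round(float(np.linalg.norm(x[0] - x[1])), 2))  # distances approach 1
```

For three agents and this potential, the flow generically converges to an equilateral triangle with unit sides, one of the critical formations whose Morse indices the thesis's formula would classify.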
|
826 |
Essays in economic design: information, markets and dynamics
Khan, Urmee, 1977- 06 July 2011 (has links)
This dissertation consists of three essays that apply economic theory and econometric methods to understand the design and dynamics of institutions. In particular, it studies how institutions aggregate information and deal with uncertainty, and attempts to derive implications for optimal institution design. Here is a brief summary of the essays. In many economic, political, and social situations where the environment changes randomly and action is costly, we must choose both the timing of the action and the optimal action itself. In particular, if the stochastic environment has the property that the next environmental change becomes either more or less likely as more time passes since the last change (in other words, the hazard rate of environmental change is not constant over time), then the timing of the action takes on special importance. In the first essay, joint with Maxwell B. Stinchcombe, we model and solve a dynamic decision problem in a semi-Markov environment. We find that if the arrival times of state changes do not follow a memoryless process, the time since the last observed change of state, in addition to the current state, becomes a crucial variable in the decision. We characterize the optimal policy and the optimal timing of executing that policy in the differentiable case by a set of first-order conditions of a relatively simple form. These show that, for both increasing and decreasing hazard rates, the optimal response may be to wait before executing a policy change. Intuitively, waiting reveals information about the likelihood of the next change occurring, so waiting is valuable when actions are costly. This result sheds new light on the structure of optimal decisions in many interesting problems of institution design, including the fact that constitutions often have built-in delay mechanisms to slow the pace of legislative change.
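The role of a non-constant hazard rate can be made concrete with Weibull holding times, where the shape parameter controls whether the hazard rises or falls with time since the last change (an illustrative sketch, not the essay's model):

```python
def weibull_hazard(t, k, lam=1.0):
    """Hazard rate of a Weibull holding time: h(t) = (k/lam) * (t/lam)**(k-1).

    k < 1: decreasing hazard; k = 1: constant (memoryless, exponential);
    k > 1: increasing hazard. Only in the k = 1 case is time since the last
    change irrelevant to the decision.
    """
    return (k / lam) * (t / lam) ** (k - 1)

for k in (0.5, 1.0, 2.0):
    early, late = weibull_hazard(0.5, k), weibull_hazard(2.0, k)
    trend = "constant" if early == late else ("decreasing" if early > late else "increasing")
    print(f"k={k}: hazard is {trend}")
```

With k != 1 the elapsed time changes the probability of an imminent state change, which is exactly why the optimal policy in the essay conditions on time since the last observed change.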
Our model results could be used to characterize optimal timing rules for constitutional amendments. The paper also generalizes the methodology of semi-Markov decision theory by formulating a dynamic programming set-up that solves the timing-of-action problem, whereas the existing literature optimizes over a much more limited set of policies in which action can only be taken at the instant the state changes. In the second essay, we extend this research to situations where the current choice of action influences the future path of the stochastic process, and apply it to the legal framework surrounding environmental issues, particularly the 'Precautionary Principle' as applied to climate change legislation. We represent scientific uncertainty about environmental degradation using the concept of 'ambiguity' and show that ambiguity aversion generates a 'precautionary effect'. As a result, a justification is provided for the Precautionary Principle that differs from those provided by expected utility theory. This essay serves both as an application of the general theoretical results derived in the first essay and as a stand-alone analysis of a substantive question in environmental law. Prediction markets have attracted public attention in recent years for making accurate predictions about election outcomes, product sales, film box office, and myriad other variables of interest, and many believe they will soon become an important decision-support system in a wide variety of areas, including governance, law, and industry. Successful design of these markets requires a thorough understanding of their theoretical and empirical foundations, but the information aggregation process in these markets is not yet fully understood, and a number of open questions remain.
The third essay, joint with Robert Lieli, analyzes the direction and timing of information flow between prices, polls, and media coverage of events traded on prediction markets. Specifically, we examine the race between Barack Obama and Hillary Clinton in the 2008 Democratic primaries for the presidential nomination. Substantively, we ask the following questions: (i) Do prediction market prices have information that is not reflected in contemporaneous polls and media stories? (ii) Conversely, do prices react to information that appears to be news for pollsters or is prominently featured by the media? Quantitatively, we construct time series variables that reflect the "pollster's surprise" in each primary election, measured as the difference between the actual vote share and the vote share predicted by the latest poll before the primary, as well as indices that describe the extent of media coverage received by the candidates. We carry out Granger causality tests between the day-to-day percentage change in the price of the "Obama wins nomination" security and these information variables. Some key results can be summarized as follows. There appears to be mutual (two-way) Granger causality between prediction market prices and the surprise element in the primaries. There is also evidence of one-way Granger causality in the short run from price changes to the media news indices. These results suggest that prediction market prices anticipate at least some of the discrepancy between the actual outcome and the latest round of polls before the election. Nevertheless, prices also seem to be driven partly by election results, suggesting that an element of the pollster's surprise is genuine news for the market as well. / text
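A Granger-causality check of the kind described can be sketched with the textbook F-test on synthetic data (the data-generating process below is invented for illustration and is not the thesis dataset):

```python
import numpy as np

def granger_f(y, x, lags=1):
    """F-statistic for 'x Granger-causes y' with the given number of lags.

    Compares a restricted regression of y on its own lags against an
    unrestricted one that also includes lags of x (textbook construction,
    no small-sample refinements).
    """
    n = len(y)
    Y = y[lags:]
    own = np.column_stack([y[lags - k - 1:n - k - 1] for k in range(lags)])
    cross = np.column_stack([x[lags - k - 1:n - k - 1] for k in range(lags)])
    ones = np.ones((n - lags, 1))
    Xr = np.hstack([ones, own])           # restricted model
    Xu = np.hstack([ones, own, cross])    # unrestricted model
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df = n - lags - Xu.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / df)

rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = np.empty(300)
y[0] = 0.0
for t in range(1, 300):   # y depends on lagged x by construction
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(granger_f(y, x) > 10.0)   # large F: lags of x help predict y
```

In practice one would compare the statistic to an F critical value (or use a packaged routine such as the one in statsmodels) and test both directions, which is how the two-way causality finding above would be established.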
|
827 |
Βελτιστοποίηση ερωτημάτων με πολλαπλά κριτήρια σε βάσεις δεδομένων / Multiobjective query optimization under parametric aggregation constraints
Ρήγα, Γεωργία 24 September 2007 (has links)
The problem of multi-criteria query optimization in databases is a difficult and interesting research problem, because it is characterized by conflicting requirements: each step in answering a query can be executed in more than one way. Several algorithms have been proposed for such queries, most recently Mariposa, M', and Generate Partitions. Mariposa and M' are applied in the Mariposa database system, which allows the user to specify the desired delay/cost tradeoff for each query. The Mariposa algorithm follows a greedy approach, trying to maximize the "benefit" at each step, while M' uses sets of Pareto-optimal solutions, in place of the greedy criterion, to select the next step. Finally, the Generate Partitions algorithm partitions the answer space using R-tree structures, achieving very good performance. / The optimization of queries in distributed database systems is known to be subject to delicate trade-offs. For example, the Mariposa database system allows users to specify a desired delay-cost tradeoff (that is, to supply a decreasing function u(d) specifying how much the user is willing to pay in order to receive the query results within time d). Mariposa divides a query graph into horizontal strides, analyzes each stride, and uses a greedy heuristic to find the best plan for all strides.
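The Pareto-optimal plan selection that distinguishes M' from the greedy approach can be sketched as a simple dominance filter over (cost, delay) plan pairs (toy plans, not Mariposa's actual plan space):

```python
def pareto_optimal(plans):
    """Keep the Pareto-optimal (cost, delay) plans: a plan survives unless
    some other plan is at least as good on both criteria and strictly better
    on one. Sketch of the dominance idea only, not the thesis algorithms."""
    frontier = []
    for p in plans:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in plans)
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical candidate plans as (cost, delay) pairs.
plans = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
print(pareto_optimal(plans))
```

Plans like (9, 9) are dropped because (8, 6) beats them on both criteria; an optimizer then picks from the surviving frontier according to the user's u(d) tradeoff function.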
|
828 |
Molecular Mechanisms of Tau Protein Aggregation Inhibition
Akoury, Elias 30 September 2013 (has links)
No description available.
|
829 |
Application of the HLD and NAC Models to the Formation and Stability of Emulsions
Kiran, Sumit K. 10 January 2014 (has links)
This thesis explored how asphaltene and naphthenic amphiphile species influence the formation (morphology and size) and stability of heavy crude oil (bitumen) emulsions. It was experimentally shown that asphaltenes produce water-in-oil emulsions, whereas naphthenic amphiphiles flip the emulsion morphology to oil-in-water. It was further demonstrated that the size and stability of these emulsions are influenced by physicochemical effects such as pH, solvent-bitumen-water ratios, solvent aromaticity, and temperature. In view of these findings, the hydrophilic-lipophilic deviation (HLD) and net-average curvature (NAC) models were examined as potential means of predicting the formation and stability of emulsions. Owing to the complexity of bitumen emulsions, however, the HLD and NAC models were instead tested against well-defined sodium dihexyl sulfosuccinate-toluene-water emulsions. The morphologies of these emulsions were predicted as a function of the formulation salinity, whereas the corresponding droplet sizes were predicted as a function of the continuous phase density and interfacial tension (γow). Emulsion stability trends were in turn predicted using a collision-coalescence-separation assumption, under which emulsion stability was expressed as a function of the emulsion droplet collision frequency and activation energy. The key parameters of the closely scrutinized activation energy term included the γow, interfacial rigidity, and critical film thickness. In applying the same modeling approach to the stability of other emulsions already published in the literature, it was found that the rigidity of adsorbed multilayer/liquid crystal films cannot yet be fully accounted for. This shortcoming is why only minimum stability times were reported for bitumen emulsions.
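The HLD framework named above can be sketched numerically. The functional form below is one commonly published variant for ionic surfactants, and every parameter value is an illustrative assumption rather than data from this thesis:

```python
import math

def hld_ionic(salinity, eacn, cc, temp_c=25.0, K=0.17, alpha_t=0.01):
    """One common form of the HLD expression for ionic surfactants:
    HLD = ln(S) - K*EACN + Cc - alpha_t*(T - 25).
    The K, alpha_t, and Cc values here are illustrative, not fitted."""
    return math.log(salinity) - K * eacn + cc - alpha_t * (temp_c - 25.0)

# Sign convention: negative HLD favors oil-in-water curvature, positive
# favors water-in-oil. Inputs below are hypothetical surfactant/oil values.
h = hld_ionic(salinity=1.0, eacn=1.0, cc=-0.9)
print("HLD =", round(h, 2), "->", "o/w" if h < 0 else "w/o")
```

Raising the salinity raises ln(S) and can flip the sign, which mirrors the thesis's use of formulation salinity to predict morphology; the NAC model then relates the distance from HLD = 0 to droplet curvature and size.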
|
830 |
Réduction des dommages myocardiques par le célécoxib suite à une ischémie transitoire chez le rat / Reduction of myocardial damage by celecoxib following transient ischemia in the rat
Lada-Moldovan, Laura 11 1900 (has links)
This study was designed to evaluate the effect of long-term pre-treatment with celecoxib, a cyclooxygenase-2 inhibitor, on myocardial infarct size. Since celecoxib is an anti-inflammatory drug and myocardial damage can arise from inflammatory processes, inhibition of inflammation should hypothetically reduce the size of an eventual infarct. Celecoxib (3 mg/kg/day i.p.) or vehicle (DMSO 50%; EtOH 15%; distilled water) was administered chronically to male Sprague-Dawley rats (n=18 per group) through ALZET osmotic pumps for 28 days. Under anaesthesia, the animals were then subjected to occlusion of the left anterior descending coronary artery for 40 minutes, followed by 24 hours of reperfusion. The results show that myocardial infarct size in celecoxib-treated rats was significantly reduced compared to the control group (37.5±2.5% versus 48.0±2.6% of the area at risk, p < 0.05). Accumulation of neutrophils, estimated by myeloperoxidase levels, was increased in the ischemic area, including the subendocardial and subepicardial layers, without any significant difference between the treated and vehicle groups. No significant difference was observed between the groups in plasma prostaglandin E2 or tumour necrosis factor-alpha. Apoptosis, evaluated by the Bax/Bcl-2 ratio and by terminal dUTP nick-end labelling (TUNEL), was significantly decreased in the subendocardial layer of the ischemic area in celecoxib-treated rats. Adenosine diphosphate-induced platelet aggregation in whole blood suggested that celecoxib diminishes platelet aggregation. This study indicates that pre-treatment with celecoxib can reduce infarct size by a mechanism which may involve apoptosis.
|