71 |
Post critical heat flux heat transfer in a vertical tube including spacer grid effects. Cluss, Edward Max. January 1978 (has links)
Thesis. 1978. M.S.--Massachusetts Institute of Technology. Dept. of Mechanical Engineering. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING. / Includes bibliographical references. / by Edward M. Cluss, Jr. / M.S.
|
72 |
Couche MAC adaptative pour les applications critiques de surveillance à base d’un réseau de capteurs d’image / Designing of MAC layer for Mission-Critical Surveillance Applications in Wireless Image Sensor Networks. Ehsan, Muhammad. 09 June 2015 (has links)
Les réseaux de capteurs sans fil sont conçus dans le but de remplir différentes tâches de surveillance dans des conditions environnementales variées. Ces petits appareils électroniques sont capables de détecter, traiter et transmettre des données environnementales avec des communications multi-sauts et peuvent par conséquent aussi se coordonner. En même temps, ces dispositifs ont des ressources limitées (mémoire, capacités de calcul) et doivent fonctionner le plus souvent sur des batteries. C’est pour ces raisons que les recherches menées dans le domaine des réseaux de capteurs comportent naturellement un volet important concernant la réduction de la consommation d’énergie et l’auto-organisation du réseau. Nos recherches considèrent les applications critiques de surveillance. Ces applications peuvent avoir des exigences très différentes des réseaux de capteurs traditionnels. De plus, nous utilisons des capteurs images, dont l’activité est définie en fonction de la criticité de l’application. Un ordonnancement basé sur la criticité permet de définir des nœuds sentinelles qui possèdent une vitesse de capture plus grande, afin d’avoir une probabilité plus élevée de détecter des intrusions et d’alerter leurs nœuds voisins. Au niveau de la couche de contrôle d’accès au medium (couche MAC), les approches alternant activité et sommeil (consistant à allumer et éteindre la radio de manière cyclique) sont utilisées pour préserver l’énergie et prolonger la durée de vie du réseau. Cependant, tout en conservant l’énergie, les applications critiques de surveillance ne doivent pas compromettre la qualité de la surveillance, et le réseau doit toujours être en mesure de propager rapidement les messages d’alerte. Notre but est de définir un protocole MAC qui réduise la latence de propagation d’alerte et prolonge la durée de vie du réseau.
Nous proposons tout d’abord une approche originale pour déterminer dynamiquement la durée de la période d’activité radio des nœuds afin d’augmenter la probabilité de propager rapidement les alertes. Les résultats des simulations ont confirmé que notre approche réussit à améliorer la réactivité du système par rapport à une approche statique. En même temps, notre proposition permet de réduire considérablement la consommation d’énergie du réseau. Ensuite, nous avons implémenté notre approche sur des capteurs réels et les résultats obtenus sont très proches des résultats de simulation. / Wireless Sensor Networks (WSNs) are designed to carry out different monitoring tasks under various environmental conditions. The small electronic devices called sensors are capable of sensing, processing and communicating environmental data through multi-hop communication and coordination. These devices have limited resources (memory, computing capabilities) and usually run on batteries. This is why research on wireless sensor networks has focused on energy efficiency and self-organization of the network. We consider mission-critical surveillance applications in our research work. These applications can have requirements quite different from those of traditional WSNs. In addition, we use image sensor nodes, whose activity is defined based on the criticality of the application. The criticality-based scheduling scheme defines sentry nodes with faster capture rates, giving them a higher probability of detecting intrusions and alerting their neighbor nodes. At the Medium Access Control (MAC) layer, duty-cycled approaches are used to preserve energy and prolong the network lifetime. However, while conserving energy, mission-critical surveillance applications cannot compromise on the quality of surveillance, and the network should still be able to quickly propagate alert messages. In this thesis, we propose a low-latency, energy-efficient adaptive MAC protocol.
We first propose an original approach that dynamically determines the duty-cycle length of sensor nodes to increase the probability of quick alert propagation. Simulation results confirmed that our approach improves system responsiveness compared to a static duty-cycling approach. At the same time, our proposal considerably reduces the energy consumption of the network. We then implemented our approach on sensor node hardware, and the results were very close to our simulation results.
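The criticality-driven duty cycle described in this abstract lends itself to a short sketch. Everything below is an illustrative assumption (the cycle length, the linear mapping, and the function names are not taken from the thesis), but it shows why a sentry node with a longer awake period has a higher probability of catching an alert sent at a random instant.

```python
# Illustrative sketch of a criticality-driven duty cycle (assumed model, not
# the thesis's exact protocol): nodes with higher criticality r in [0, 1]
# stay awake longer within each cycle, raising the chance that an alert
# arriving at a random time finds an awake receiver.

CYCLE_MS = 1000          # full wake/sleep cycle length (assumption)
MIN_AWAKE_MS = 50        # floor so every node listens at least briefly

def awake_period_ms(criticality: float) -> int:
    """Map a criticality level r in [0, 1] to an awake period in ms."""
    r = min(max(criticality, 0.0), 1.0)
    return int(MIN_AWAKE_MS + r * (CYCLE_MS - MIN_AWAKE_MS))

def capture_probability(criticality: float) -> float:
    """Probability that an alert at a uniformly random instant is heard."""
    return awake_period_ms(criticality) / CYCLE_MS

# A sentry node (r = 0.9) listens far longer per cycle than a
# low-criticality node (r = 0.1), at a proportional energy cost.
```

The trade-off the thesis addresses is visible here: raising the awake period improves capture probability linearly but spends radio energy at the same rate, which is why the duty cycle must be adapted rather than fixed.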
|
73 |
Configuração da educação física no CTUR - entrelaçamentos cotidianos e possibilidades pedagógicas / Configuration of Physical Education at CTUR - Everyday Intertwinings and Pedagogical Possibilities. Costa, Regiane de Souza. 10 December 2010 (has links)
Previous issue date: 2010-12-10 / Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do RJ, FAPERJ, Brasil. / This dissertation is based on research conducted at the Technical College of the Rural University (CTUR/UFRRJ), with the participation of students in the 3rd year of the Technical Course in Organic Agriculture, one of the school's Physical Education teachers, and an observer of the pedagogical processes. Its central concern is a critical and emancipatory Physical Education and its constitutive elements for human and professional formation. Its general objective is to investigate how Physical Education manifests itself as a curricular component of the Agricultural Vocational Education offered at CTUR. To this end, the study drew on a theoretical and methodological framework able to capture the everyday variations of school reality, adopting a qualitative approach to research in education with Action Research as its main guiding methodology. Under this framework it was possible to learn, first, how Physical Education had manifested itself over the school years, based on the objectives-methods-assessment relation. Second, the descriptions of how Physical Education had been configured in previous years were used by the group involved in the research to build thematic workshops, intended to foster a process of awareness and reflection on the plurality of interpretations this field can present. The guiding themes of the workshops were based on the Reference Matrix for the New ENEM 2009, the reformulation of the National Secondary Education Examination proposed by the Ministry of Education. Taken together, the data obtained from the questionnaires, the interviews, the students' reports, and the observations of the pedagogical "spacetimes" indicated that Physical Education, when treated in its many possible interpretations and grounded in criticality, presents itself as an important component for the formation of the human being in a holistic perspective. / Esta dissertação está fundamentada na pesquisa realizada no Colégio Técnico da Universidade Rural (CTUR/UFRRJ), contando com a participação de estudantes do 3º ano do Curso Técnico em Agropecuária Orgânica, um dos professores de Educação Física da escola e, ainda, uma observadora dos processos pedagógicos. Traz consigo a preocupação central com uma Educação Física crítica e emancipatória e seus elementos constitutivos para a formação humana/profissional. Apresentou como objetivo geral investigar a manifestação da Educação Física Escolar como componente curricular da Educação Profissional Agrícola do CTUR. Para tanto, debruçou-se sobre um aporte teórico-metodológico que permitiu captar as variações cotidianas dispostas na realidade. Contou com a Abordagem Qualitativa de Pesquisa em Educação, tendo a Metodologia da Pesquisa-Ação o seu principal eixo orientador. Sob esta fundamentação foi possível conhecer, num primeiro momento, como a Educação Física se manifestou nos anos escolares, com base na relação objetivos-métodos-avaliação. Num segundo momento, foram utilizadas as descrições quanto à configuração da Educação Física nos anos anteriores para que fossem construídas, pelo coletivo envolvido na pesquisa, oficinas temáticas, visando possibilitar um processo de sensibilização/reflexão diante da pluralidade de interpretações que esta área é capaz de apresentar. Os temas orientadores das oficinas foram baseados na Matriz de Referência para o Novo ENEM 2009, reformulação do Exame Nacional do Ensino Médio proposta pelo Ministério da Educação. A conjugação dos dados obtidos com a aplicação dos questionários, das entrevistas, dos relatos dos estudantes e das observações dos espaçostempos pedagógicos sinalizou que a Educação Física, quando tratada nas suas múltiplas possibilidades interpretativas, apoiando-se na criticidade, se apresenta como um importante componente para a formação do homem, numa perspectiva holística.
|
74 |
Accélération de la convergence dans le code de transport de particules Monte-Carlo TRIPOLI-4® en criticité / Convergence acceleration in the Monte-Carlo particle transport code TRIPOLI-4® in criticality. Dehaye, Benjamin. 05 December 2014 (has links)
Un certain nombre de domaines tels que les études de criticité requièrent le calcul de certaines grandeurs neutroniques d'intérêt. Il existe deux types de codes : les codes déterministes et les codes stochastiques. Ces derniers sont réputés simuler la physique de la configuration traitée de manière exacte. Toutefois, le temps de calcul nécessaire peut s'avérer très élevé. Le travail réalisé dans cette thèse a pour but de bâtir une stratégie d'accélération de la convergence de la criticité dans le code de calcul TRIPOLI-4®. Nous souhaitons mettre en œuvre le jeu à variance nulle. Pour ce faire, il est nécessaire de calculer le flux adjoint. L'originalité de cette thèse est de calculer le flux adjoint directement par une simulation Monte-Carlo, sans passer par un code externe, grâce à la méthode de la matrice de fission. Ce flux adjoint est ensuite utilisé comme carte d'importance afin d'accélérer la convergence de la simulation. / Fields such as criticality studies require the computation of certain neutron-physics quantities of interest. Two kinds of codes may be used: deterministic ones and stochastic ones. Stochastic codes do not require approximations and are thus more exact; however, they may need a long time to converge to sufficient precision. The work carried out in this thesis aims to build an efficient convergence acceleration strategy in the TRIPOLI-4® code. We wish to implement the zero-variance game, which requires computing the adjoint flux. The originality of this work is to compute the adjoint flux directly from a forward Monte-Carlo simulation, without using external codes, thanks to the fission matrix method. This adjoint flux is then used as an importance map to bias the simulation.
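The fission-matrix route to the adjoint flux mentioned in this abstract can be sketched in a few lines. The 3x3 matrix and the plain power iteration below are illustrative assumptions, not TRIPOLI-4® output or internals; they only show why the transpose of the fission matrix yields an importance map.

```python
# Sketch of the fission-matrix route to the adjoint flux. The 3x3 matrix is
# invented for illustration (it is not a TRIPOLI-4 tally): entry F[i][j] is
# the expected number of fission neutrons born in region i per fission
# neutron born in region j. The dominant eigenvector of F gives the fission
# source shape; the dominant eigenvector of its transpose gives the adjoint
# flux, i.e. the importance map used to bias the simulation.

def power_iteration(m, iters=200):
    """Dominant eigenvalue and eigenvector of a small non-negative matrix."""
    n = len(m)
    v = [1.0 / n] * n
    k = 0.0
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        k = sum(w)                    # L1 normalisation yields the eigenvalue
        v = [x / k for x in w]
    return k, v

def transpose(m):
    return [list(row) for row in zip(*m)]

F = [[0.6, 0.3, 0.0],
     [0.2, 0.5, 0.2],
     [0.0, 0.3, 0.4]]

k_forward, source = power_iteration(F)                 # fission source shape
k_adjoint, importance = power_iteration(transpose(F))  # importance map
# A matrix and its transpose share eigenvalues, so the two k estimates agree.
```

This mirrors the structure of the thesis's idea: one Monte-Carlo run tallies F, and the same data then produce both the forward source and the adjoint importance, with no external deterministic code.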
|
75 |
Open Source Software Evolution and Its Dynamics. Wu, Jingwei. January 2006 (has links)
This thesis undertakes an empirical study of software evolution by analyzing open source software (OSS) systems. The main purpose is to aid in understanding OSS evolution. The work centers on collecting large quantities of structural data cost-effectively and analyzing such data to understand software evolution <em>dynamics</em> (the mechanisms and causes of change or growth). <br /><br /> We propose a multipurpose systematic approach to extracting program facts (<em>e.g.</em>, function calls). This approach is supported by a suite of C and C++ program extractors, which cover different steps in the program build process and handle both source and binary code. We present several heuristics to link facts extracted from individual files into a combined system model of reasonable accuracy. We extract historical sequences of system models to aid software evolution analysis. <br /><br /> We propose that software evolution can be viewed as <em>Punctuated Equilibrium</em> (<em>i.e.</em>, long periods of small changes interrupted occasionally by large avalanche changes). We develop two approaches to study such dynamical behavior. One approach uses the evolution spectrograph to visualize file level changes to the implemented system structure. The other approach relies on automated software clustering techniques to recover system design changes. We discuss lessons learned from using these approaches. <br /><br /> We present a new perspective on software evolution dynamics. From this perspective, an evolving software system responds to external events (<em>e.g.</em>, new functional requirements) according to <em>Self-Organized Criticality</em> (SOC). The SOC dynamics is characterized by the following: (1) the probability distribution of change sizes is a power law; and (2) the time series of change exhibits long range correlations with power law behavior. We present empirical evidence that SOC occurs in open source software systems.
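The first SOC signature named in this abstract, a power-law distribution of change sizes, can be tested with the standard continuous maximum-likelihood estimator alpha = 1 + n / sum(ln(x_i / x_min)). The data below are synthetic, drawn with a known exponent purely to exercise the estimator; real input would be per-commit or per-release change sizes mined from a repository, and the thesis's own analysis is not reproduced here.

```python
# Fit the exponent of a power-law distribution of change sizes, as one would
# to check the abstract's claim (1). Synthetic data with a known exponent.

import math
import random

def powerlaw_alpha(sizes, x_min=1.0):
    """MLE exponent of a continuous power law p(x) ~ x**(-alpha), x >= x_min."""
    tail = [x for x in sizes if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Inverse-CDF sampling: x = x_min * u**(-1/(alpha-1)) with u uniform in (0, 1].
random.seed(42)
true_alpha = 2.5
samples = [(1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
           for _ in range(20000)]

est = powerlaw_alpha(samples)  # should land near 2.5 for this sample size
```

For real change-size data one would also have to choose x_min carefully and compare the power-law fit against alternatives (lognormal, exponential) before claiming SOC behavior.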
|
77 |
Entropy-based diagnostics of criticality Monte Carlo simulation and higher eigenmode acceleration methodology. Shi, Bo. 10 June 2010 (has links)
Because of its accuracy and ease of implementation, Monte Carlo methodology is widely used in the analysis of nuclear systems. The resulting estimate of the multiplication factor (keff) or flux distribution is statistical by nature. In a criticality simulation of a nuclear system, which is based on the power iteration method, the initially guessed source distribution is generally far from the converged fundamental one. It is therefore necessary to ensure that convergence has been achieved before data are accumulated. Discarding a larger number of initial histories reduces the risk of contaminating the results with non-converged data but increases the computational expense. This issue is amplified for large, loosely coupled nuclear systems with low convergence rates. Since keff is a generation-based global value, frequently no explicit convergence criterion is applied to keff directly. As an alternative, the flux-based entropy check available in MCNP5 works well in many cases. However, when applied to a difficult storage fuel pool benchmark problem, it could not always detect non-convergence of the flux distribution. Preliminary evaluation indicates that this is due to collapsing local information into a single number. This thesis addresses the problem through two new developments. First, it seeks a more reliable way to assess convergence by analyzing local flux changes. Second, it introduces an approach that simultaneously computes both the first and second eigenmodes; by computing these eigenmodes, the approach can also increase the convergence rate. Improvement in these two areas could have a significant impact on the practicality of Monte Carlo criticality simulations.
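The flux-based entropy check this abstract refers to is, in essence, the Shannon entropy of the binned fission-source distribution tracked generation by generation; the binning and tallies below are made up for illustration. A drifting entropy signals that the source has not converged, while a flat, noisy plateau suggests convergence, and collapsing all spatial bins into this single number is exactly the weakness the thesis targets.

```python
# Shannon entropy of a binned fission-source distribution, the quantity
# behind the MCNP5-style convergence diagnostic. Tallies are illustrative.

import math

def source_entropy(counts):
    """Shannon entropy (bits) of fission-source counts over spatial bins."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

# Hypothetical 4-bin source tallies for one generation each:
early = [970, 10, 10, 10]    # source still concentrated near the initial guess
late = [250, 250, 250, 250]  # spread-out fundamental mode (made up)

# Entropy rises toward log2(4) = 2 bits as the source spreads out; a plot of
# source_entropy per generation flattening out is the convergence signal.
```

Two very different spatial distributions can still produce the same entropy value, which illustrates why the check "could not always detect the non-convergence of the flux distribution" on hard problems.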
|
78 |
Development and implementation of convergence diagnostics and acceleration methodologies in Monte Carlo criticality simulations. Shi, Bo. 12 December 2011 (has links)
Because of its accuracy and ease of implementation, the Monte Carlo methodology is widely used in the analysis of nuclear systems. The estimated effective multiplication factor (keff) and flux distribution are statistical by nature. In eigenvalue problems, however, neutron histories are not independent but are correlated through subsequent generations; it is therefore necessary to ensure that only converged data are used for further analysis. Discarding a larger number of initial histories reduces the risk of contaminating the results with non-converged data but increases the computational expense, an issue that is amplified for large nuclear systems with slow convergence. One solution is to use the convergence of keff or of the flux distribution as the criterion for initiating the accumulation of data. Although several approaches have been developed to identify convergence, they are not always reliable, especially for slowly converging problems. This dissertation attacks the difficulty by developing two independent but related methodologies: one finds a more reliable and robust way to assess convergence by statistically analyzing local flux changes; the other forms a basis for increasing the convergence rate and thus reducing the computational expense. Together, these two developments contribute to the ultimate goal of improving the reliability and efficiency of Monte Carlo criticality calculations.
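A convergence check driven by local flux change, as this abstract describes, can be given a minimal sketch. The form below is an assumption (the dissertation's actual statistic is more elaborate): declare the source converged once the largest relative bin-wise change between successive generations stays below a tolerance for a whole window of generations.

```python
# Minimal local-flux-change convergence check (assumed form, for illustration).

def converged(flux_history, tol=0.05, window=5):
    """flux_history: one list of per-bin flux tallies per generation."""
    if len(flux_history) < window + 1:
        return False  # not enough generations to judge
    recent = flux_history[-window - 1:]
    for prev, cur in zip(recent, recent[1:]):
        # Largest relative change in any bin between two generations.
        max_rel = max(abs(c - p) / p for p, c in zip(prev, cur) if p > 0)
        if max_rel > tol:
            return False
    return True

# Hypothetical 2-bin tallies: early generations drift, later ones only jitter.
drifting = [[100, 10], [80, 30], [60, 50], [55, 55],
            [54, 56], [55, 55], [54, 56]]
steady = drifting + [[55, 55], [56, 54]]
```

Unlike a single global number such as keff or an entropy value, this kind of test looks at every bin, so a local drift cannot be masked by compensation elsewhere, which is the point of the first methodology.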
|
80 |
Refinery Power Distribution Reliability and Interruption. Nygren, Leif. 11 1900 (has links)
In the refining industry, the cost of a power system interruption is dominated by the associated loss of production. Power distribution within a refinery serves a set of production units in a highly interdependent process, where the outage of a single unit affects the production of additional units. This thesis proposes a method to quantify the impact of this cascading effect, called the criticality enhancement function, in which a process reliability model is introduced to link electrical outage cut-sets with lost production. Power system criticality is analyzed using four different approaches to the calculation of the annual expected impact of load-point interruptions, in a case study of the 125,000 barrel-per-day Petro-Canada Edmonton Refinery. The thesis demonstrates how the proposed technique, with its marriage of electrical and process reliability models, enables the most accurate estimation of the impact of power system interruptions.
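The annual-expected-impact idea in this abstract reduces to a sum over electrical cut-sets. In the sketch below, each cut-set has an outage frequency and duration, and the lost production is scaled by a cascading multiplier standing in for the thesis's criticality enhancement function; all numbers and the linear form are illustrative assumptions, not figures from the Edmonton refinery study.

```python
# Expected annual production loss as a sum over electrical outage cut-sets,
# with a cascading multiplier standing in for the criticality enhancement
# function. All values are hypothetical.

def annual_expected_impact(cut_sets):
    """Sum over cut-sets of frequency * duration * cascaded production loss."""
    total = 0.0
    for cs in cut_sets:
        direct_loss = cs["lost_bpd"] / 24.0 * cs["duration_h"]  # barrels
        total += cs["freq_per_yr"] * direct_loss * cs["cascade_factor"]
    return total

# Two hypothetical cut-sets: a feeder outage whose downstream units amplify
# the loss (cascade_factor > 1), and an isolated load with no knock-on effect.
cut_sets = [
    {"freq_per_yr": 0.2, "duration_h": 8.0, "lost_bpd": 30000.0,
     "cascade_factor": 2.5},
    {"freq_per_yr": 1.0, "duration_h": 2.0, "lost_bpd": 5000.0,
     "cascade_factor": 1.0},
]

barrels_per_year = annual_expected_impact(cut_sets)  # expected barrels lost
```

The cascade factor is what distinguishes this from a plain load-point reliability calculation: it carries the process-model information about how one unit's outage propagates through the interdependent production chain.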
|