11 |
Stress and microstructural evolution during shape-preserving silica magnesiothermic reduction
Davis, Stanley Casey, 06 March 2012
Shape-preserving silica magnesiothermic reduction is a gas-solid reaction used to convert complex, 3-dimensional SiO₂ structures into replicas composed of a two-phase product of MgO and Si. The MgO/Si components of this reaction are found to form an interwoven aggregate product structure, which is sufficiently robust that the MgO phase can be selectively dissolved to yield porous Si. Here, the kinetics and mechanisms of growth of this robust product structure have been studied. The aggregate product structure was deduced to arise because stacked layers of MgO/Si product phases with planar interfaces are geometrically unstable, owing to the growth kinetics of the products. The interwoven nature of the aggregate may be explained by the presence of an amorphous magnesium silicate phase ahead of the MgO/Si product during reaction. Complex composition gradients in the magnesium silicate can lead to tortuous and branching growth of the MgO and Si phases as the magnesium silicate is consumed by the reaction. In addition, a large residual stress (> 5 GPa) was measured in the MgO/Si product layer formed during reaction of planar quartz. Despite the presence of such a large stress, no distortion or cracking of reacted structures was found to occur after reaction in the temperature range 650-900 °C. XRD-based residual stress measurements and morphological observations of product films on reacted quartz substrates were used to evaluate possible mechanisms of stress relief in the structure. The migration of MgO to the external surface of the product layer was found to correlate with the rate of stress relaxation that occurred in annealed product films. Finally, applications of silica magnesiothermic reduction and derivative processes were studied in the fields of chemical catalysis and optical chemical sensing.
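The XRD-based stress values mentioned above are the kind of quantity typically extracted from diffraction data with the classical sin²ψ method. As a purely illustrative sketch of that generic calculation (the elastic constants, lattice spacings, and tilt angles below are invented assumptions, not data or procedures from the thesis):

```python
import numpy as np

# Generic sin^2(psi) residual stress evaluation from XRD lattice spacings.
# All numbers below are hypothetical and only illustrate the arithmetic.
E = 160e9        # assumed Young's modulus of the probed phase [Pa]
nu = 0.22        # assumed Poisson's ratio
d0 = 1.9200      # assumed stress-free lattice spacing [Angstrom]

psi_deg = np.array([0.0, 15.0, 25.0, 35.0, 45.0])           # tilt angles
d_psi = np.array([1.9180, 1.9174, 1.9166, 1.9157, 1.9146])  # measured spacings

strain = (d_psi - d0) / d0                 # lattice strain at each tilt
x = np.sin(np.radians(psi_deg)) ** 2       # sin^2(psi) abscissa

slope, _ = np.polyfit(x, strain, 1)        # d(strain)/d(sin^2 psi)
sigma = slope * E / (1.0 + nu)             # in-plane stress component [Pa]
print(f"estimated in-plane residual stress: {sigma / 1e9:.2f} GPa")
```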
|
12 |
Quality aspects in direct shear testing of rock joints
Larsson, Jörgen, January 2021
The stability of rock masses is influenced by the occurrence of rock joints. Therefore, the shear strength of rock joints must be considered when dimensioning underground constructions. One way to predict the shear strength is through the use of failure criteria, which are validated against results of direct shear tests under controlled laboratory conditions. Consequently, the quality of the test results is crucial to the accuracy with which the criteria will be able to predict the shear strength. Since rock joints are unique by nature, the use of replicas (man-made copies of rock joints) is important in parameter studies. The overall objective of this work is to facilitate the development of improved criteria for predictions of the shear strength of rock joints. To support this objective, two sources of uncertainty have been investigated, namely the geometry of replicas and the influence of the normal stiffness of test systems. Two quality assurance parameters for evaluation of geometrical differences between replicas and rock joints based on scanning data have been derived. The first parameter describes the morphological deviations. The second parameter describes the deviations in orientation with respect to the shear plane. The effective normal stiffness approach, which compensates for the influence of the normal stiffness of the test system in direct shear testing, has been developed, validated, and applied. With the help of the quality assurance parameters it is demonstrated that it is possible to reproduce replicas within narrow tolerances. Application of the effective normal stiffness approach essentially eliminates the normal load error. In all, the results support the generation of test data of improved quality and, consequently, facilitate the development of shear strength criteria with improved accuracy. / Academic Dissertation which, with due permission of the KTH Royal Institute of Technology, is submitted for public defence for the Degree of Licentiate of Engineering on Wednesday the 9th June 2021, at 9:00 a.m. in M108, Brinellvägen 23, Stockholm.
Paper A: Larsson J, Flansbjer M, Portal N W, Johnson E, Johansson F, and Mas Ivars D. (2020) Geometrical Quality Assurance of Rock Joint Replicas in Shear Tests – Introductory Analysis. Paper presented at the ISRM International Symposium - EUROCK 2020, physical event not held. https://onepetro.org/ISRMEUROCK/proceedings-abstract/EUROCK20/All-EUROCK20/ISRM-EUROCK-2020-101/451187 In DiVA: http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-51987
Paper B: Larsson J, Johansson F, Mas Ivars D, Johnson E, Flansbjer M and Portal N W. (2021) Rock joint replicas in direct shear testing – Part 1: Extraction of geometrical quality assurance parameters. To be submitted to Rock Mechanics and Rock Engineering. In DiVA: http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-53111
Paper C: Larsson J and Flansbjer M. (2020) An Approach to Compensate for the Influence of the System Normal Stiffness in CNS Direct Shear Tests. Rock Mechanics and Rock Engineering 53, 2185–2199. https://doi.org/10.1007/s00603-020-02051-0 In DiVA: http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-44085
Paper D: Larsson J. (2021) Experimental investigation of the system normal stiffness of a 5 MN direct shear test setup and the compensation of it in CNS direct shear tests. Submitted to ISRM International Symposium - EUROCK 2021. In DiVA: http://urn.kb.se/resolve?urn=urn:nbn:se:ri:diva-53112
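The geometrical quality assurance parameters above are defined in Papers A and B. As a purely illustrative stand-in (not the thesis's actual definitions), the sketch below compares a scanned joint surface and its replica represented as aligned height maps, reporting an RMS morphological deviation and a difference in mean orientation relative to the shear plane; the surfaces and numbers are synthetic.

```python
import numpy as np

# Hypothetical sketch: compare a scanned rock-joint surface with its replica
# as two aligned height maps z(x, y) on the same grid. These are NOT the
# quality-assurance parameters of the thesis, only a generic illustration of
# quantifying morphological and orientation differences from scan data.

rng = np.random.default_rng(0)
nx, ny = 200, 200
x = np.linspace(0.0, 100.0, nx)            # mm
y = np.linspace(0.0, 100.0, ny)            # mm
X, Y = np.meshgrid(x, y)

z_joint = 0.5 * np.sin(0.15 * X) * np.cos(0.10 * Y)         # synthetic joint surface [mm]
z_replica = z_joint + rng.normal(0.0, 0.02, z_joint.shape)  # replica with casting error

deviation = z_replica - z_joint
rms_dev = np.sqrt(np.mean(deviation ** 2))                  # RMS morphological deviation
max_dev = np.max(np.abs(deviation))

# Deviation in mean orientation relative to the shear (x-y) plane, taken from
# the best-fit plane z = a*x + b*y + c of each surface (again, only illustrative).
def fit_plane(Z):
    A = np.column_stack([X.ravel(), Y.ravel(), np.ones(Z.size)])
    coeff, *_ = np.linalg.lstsq(A, Z.ravel(), rcond=None)
    return coeff[:2]                                         # (a, b) slopes

tilt_joint = np.degrees(np.arctan(np.linalg.norm(fit_plane(z_joint))))
tilt_replica = np.degrees(np.arctan(np.linalg.norm(fit_plane(z_replica))))

print(f"RMS morphological deviation: {rms_dev:.3f} mm (max {max_dev:.3f} mm)")
print(f"orientation difference w.r.t. shear plane: {abs(tilt_replica - tilt_joint):.3f} deg")
```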
|
13 |
Like Jacob with Esau: The 3D Printed Replica and the Future of the Museum
Walton, James Andrew, 13 June 2018
The importance of the aura, the metaphysical element that gives art, artifacts, and other objects of cultural heritage their authenticity, has been heavily contemplated ever since the publication of Walter Benjamin's "The Work of Art in the Age of Mechanical Reproduction." This thesis strives to add to this conversation and expand upon it by delving into the emergence of additive manufacturing, more commonly known as 3D printing, and its relation to museums and other institutions that comprise the public humanities. This technology challenges the auratic properties of an exhibit: the exhibit is first scanned and digitized on a computer, and the data is then sent to a 3D printer, which replicates the scanned exhibit down to incredibly fine detail. For museums, the prospect that 3D printed replicas, increasingly indistinguishable from the originals and easily produced in great numbers, could replace their auratic exhibits is very real. This thesis argues that some museums are responding by despatializing their exhibitions in order to uphold their auratic exhibits, while others are offsetting the potential loss by turning their exhibitions into tactile, multisensory experiences. The two options are not mutually exclusive, and either transforms the traditional museum. This thesis ultimately concludes that it is possible to reconcile the auratic exhibit with the 3D printed replica should these institutions properly adapt. Doing so will allow them to continue fulfilling their mission statements to preserve and promote their auratic exhibits well into the future. / Master of Arts / The emergence of 3D printing in recent years has brought with it many implications, not just for society but also for the museums and other institutions that comprise the public humanities. Along with printing objects designed entirely within a computer, 3D imaging and printing technology can easily process scanned objects that have been digitized and then reproduce them while accurately mimicking the specific features that once made them unique. As printers improve in their ability to work with more and more materials, the difficulty of distinguishing the original from its copies by sight alone poses an existential dilemma for museums that rely on the authenticity of the original to draw visitors. If everyone could potentially have their own life-size copy of David thanks to a 3D printer, what fate awaits the original David? This thesis details this possible development and how exhibitions are responding, arguing that some museums are utilizing 3D printers outside the halls of their institutions in order to uphold their original collections, while others offset the potential loss by embracing the reproducibility 3D printers provide, allowing visitors to touch and interact with 3D printed copies in multisensory exhibitions. The two options are not mutually exclusive, and either transforms the traditional museum-going experience. This thesis ultimately concludes that it is possible to reconcile the original with a perfectly mimicked copy should these institutions properly adapt, allowing them to fulfill their missions to preserve and promote the originals well into the future.
|
14 |
Física estatística de compressed sensing online / Statistical Physics of Online Compressed Sensing
Rossi, Paulo Victor Camargo, 02 March 2018
Neste trabalho, Compressed Sensing é introduzido do ponto de vista da Física Estatística. Após uma introdução sucinta onde os conceitos básicos da teoria são apresentados, incluindo condições necessárias para as medições e métodos básicos de reconstrução do sinal, a performance típica do esquema Bayesiano de reconstrução é analisada através de um cálculo de réplicas exposto em detalhe pedagógico. Em seguida, a principal contribuição original do trabalho é introduzida --- o algoritmo Bayesiano de Compressed Sensing Online faz uso de uma aproximação de campo médio para simplificar cálculos e reduzir os requisitos de memória e computação, enquanto mantém a acurácia de reconstrução do esquema offline na presença de ruído aditivo. A última parte deste trabalho contém duas extensões do algoritmo online que permitem reconstrução otimizada do sinal no cenário mais realista onde conhecimento perfeito da distribuição geradora não está disponível. / In this work, Compressed Sensing is introduced from a Statistical Physics point of view. Following a succinct introduction where the basic concepts of the framework are presented, including necessary measurement conditions and basic signal reconstruction methods, the typical performance of the Bayesian reconstruction scheme is analyzed through a replica calculation shown in pedagogical detail. Thereafter, the main original contribution of this work is introduced --- the Bayesian Online Compressed Sensing algorithm makes use of a mean-field approximation to simplify calculations and reduce memory and computation requirements, while maintaining the asymptotic reconstruction accuracy of the offline scheme in the presence of additive noise. The last part of this work presents two extensions of the online algorithm that allow for optimized signal reconstruction in the more realistic scenarios where perfect knowledge of the generating distribution is unavailable.
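The Bayesian online algorithm itself is developed in the thesis; as a point of reference for the "basic signal reconstruction methods" mentioned in the abstract, the sketch below recovers a sparse signal from underdetermined Gaussian measurements with plain iterative soft-thresholding (ISTA), a standard l1 baseline rather than the online mean-field scheme, on an invented toy instance.

```python
import numpy as np

# Minimal compressed-sensing baseline: recover a sparse x from y = A @ x + noise
# by iterative soft-thresholding (ISTA) on the LASSO objective
#   0.5 * ||y - A x||^2 + lam * ||x||_1 .
# Generic illustration only, not the Bayesian online algorithm of the thesis.

rng = np.random.default_rng(1)
n, m, k = 400, 160, 15                      # signal size, measurements, nonzeros

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)    # Gaussian measurement matrix
y = A @ x_true + 0.01 * rng.normal(size=m)  # noisy measurements

lam = 0.02
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - grad / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```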
|
15 |
Sécurité et disponibilité des données stockées dans les nuages / Data availability and security in cloud storage
Relaza, Théodore Jean Richard, 12 February 2016
Avec le développement de l'Internet, l'informatique s'est basée essentiellement sur les communications entre serveurs, postes utilisateurs, réseaux et data centers. Au début des années 2000, les deux tendances à savoir la mise à disposition d'applications et la virtualisation de l'infrastructure ont vu le jour. La convergence de ces deux tendances a donné naissance à un concept fédérateur qu'est le Cloud Computing (informatique en nuage). Le stockage des données apparaît alors comme un élément central de la problématique liée à la mise dans le nuage des processus et des ressources. Qu'il s'agisse d'une simple externalisation du stockage à des fins de sauvegarde, de l'utilisation de services logiciels hébergés ou de la virtualisation chez un fournisseur tiers de l'infrastructure informatique de l'entreprise, la sécurité des données est cruciale. Cette sécurité se décline selon trois axes : la disponibilité, l'intégrité et la confidentialité des données. Le contexte de nos travaux concerne la virtualisation du stockage dédiée à l'informatique en nuage (Cloud Computing). Ces travaux se font dans le cadre du projet SVC (Secured Virtual Cloud) financé par le Fond National pour la Société Numérique " Investissement d'avenir ". Ils ont conduit au développement d'un intergiciel de virtualisation du stockage, nommé CloViS (Cloud Virtualized Storage), qui entre dans une phase de valorisation portée par la SATT Toulouse-Tech-Transfer. CloViS est un intergiciel de gestion de données développé au sein du laboratoire IRIT, qui permet la virtualisation de ressources de stockage hétérogènes et distribuées, accessibles d'une manière uniforme et transparente. CloViS possède la particularité de mettre en adéquation les besoins des utilisateurs et les disponibilités du système par le biais de qualités de service définies sur des volumes virtuels. Notre contribution à ce domaine concerne les techniques de distribution des données afin d'améliorer leur disponibilité et la fiabilité des opérations d'entrées/sorties dans CloViS. En effet, face à l'explosion du volume des données, l'utilisation de la réplication ne peut constituer une solution pérenne. L'utilisation de codes correcteurs ou de schémas de seuil apparaît alors comme une alternative valable pour maîtriser les volumes de stockage. Néanmoins aucun protocole de maintien de la cohérence des données n'est, à ce jour, adapté à ces nouvelles méthodes de distribution. Nous proposons pour cela des protocoles de cohérence des données adaptés à ces différentes techniques de distribution des données. Nous analysons ensuite ces protocoles pour mettre en exergue leurs avantages et inconvénients respectifs. En effet, le choix d'une technique de distribution de données et d'un protocole de cohérence des données associé se base sur des critères de performance notamment la disponibilité en écriture et lecture, l'utilisation des ressources système (comme l'espace de stockage utilisé) ou le nombre moyen de messages échangés durant les opérations de lecture et écriture. / With the development of the Internet, information technology came to rely essentially on communications between servers, user stations, networks and data centers. In the early 2000s, two trends emerged: making applications available as services and virtualizing the infrastructure. The convergence of these two trends gave rise to a unifying concept, Cloud Computing. Data storage then appears as a central element of the problems raised by moving processes and resources into the cloud.
Whether it is a simple outsourcing of storage for backup purposes, the use of hosted software services, or the virtualization of the company's computing infrastructure at a third-party provider, data security is crucial. This security can be broken down into three axes: data availability, integrity and confidentiality. The context of our work is storage virtualization dedicated to Cloud Computing. This work is carried out within the SVC (Secured Virtual Cloud) project, financed by the National Fund for the Digital Society "Investment for the future". It has led to the development of a storage virtualization middleware, named CloViS (Cloud Virtualized Storage), which is entering a valorization phase driven by SATT Toulouse-Tech-Transfer. CloViS is a data management middleware developed within the IRIT laboratory. It allows the virtualization of distributed and heterogeneous storage resources with uniform and seamless access. CloViS aligns user needs and system availabilities through qualities of service defined on virtual volumes. Our contribution to this field concerns data distribution techniques that improve data availability and the reliability of I/O operations in CloViS. Indeed, faced with the explosion in the amount of data, the use of replication cannot constitute a sustainable solution. The use of erasure resilient codes or threshold schemes then appears as a valid alternative for keeping storage volumes under control. However, no data consistency protocol has, to date, been adapted to these new data distribution methods. We therefore propose data consistency protocols adapted to these different data distribution techniques. We then analyse these protocols, highlighting their respective advantages and disadvantages. Indeed, the choice of a data distribution technique and its associated data consistency protocol is based on performance criteria, notably the read and write availability, the use of system resources (such as the storage space used) and the average number of messages exchanged during read and write operations.
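CloViS's actual consistency protocols are the subject of the thesis; as a generic illustration of the trade-off mentioned at the end of the abstract (availability versus messages exchanged per read or write), the sketch below tabulates the properties of a classical quorum scheme over N full replicas, where any choice with R + W > N guarantees that a read quorum intersects the latest write quorum. The numbers are illustrative only and say nothing about erasure-coded placements.

```python
# Generic quorum bookkeeping for N replicated copies of a data block:
# a write must be acknowledged by W replicas and a read must contact R replicas;
# R + W > N guarantees every read quorum overlaps the most recent write quorum.
# Illustrative only -- not the CloViS protocols, which target erasure-coded data.

def quorum_properties(n_replicas: int, w: int, r: int) -> dict:
    overlap = r + w > n_replicas          # read sees at least one fresh copy
    return {
        "consistent_reads": overlap,
        "write_messages": 2 * w,          # request + acknowledgement per replica
        "read_messages": 2 * r,
        "tolerated_failures_on_write": n_replicas - w,
        "tolerated_failures_on_read": n_replicas - r,
    }

if __name__ == "__main__":
    for w, r in [(3, 1), (2, 2), (1, 3)]:  # different trade-offs for N = 3
        print((w, r), quorum_properties(3, w, r))
```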
|
17 |
Quality aspects in direct shear testing of rock joints
Larsson, Jörgen, January 2021
The stability of rock masses is influenced by the occurrence of rock joints. Therefore, the shear strength of rock joints must be considered when dimensioning underground constructions. One way to predict the shear strength is through the use of failure criteria, which are validated against results of direct shear tests under controlled laboratory conditions. Consequently, the quality of the test results is crucial to the accuracy with which the criteria will be able to predict the shear strength. Since rock joints are unique by nature, the use of replicas (man-made copies of rock joints) is important in parameter studies. The overall objective of this work is to facilitate the development of improved criteria for predictions of the shear strength of rock joints. To support this objective, two sources of uncertainty have been investigated, namely the geometry of replicas and the influence of the normal stiffness of test systems. Two quality assurance parameters for evaluation of geometrical differences between replicas and rock joints based on scanning data have been derived. The first parameter describes the morphological deviations. The second parameter describes the deviations in orientation with respect to the shear plane. The effective normal stiffness approach, which compensates for the influence of the normal stiffness of the test system in direct shear testing, has been developed, validated, and applied. With the help of the quality assurance parameters it is demonstrated that it is possible to reproduce replicas within narrow tolerances. Application of the effective normal stiffness approach essentially eliminates the normal load error. In all, the results support the generation of test data of improved quality and, consequently, facilitate the development of shear strength criteria with improved accuracy. / Bergmassors stabilitet påverkas av bergssprickor. Bergssprickors skjuvhållfasthet behöver därför beaktas vid fastställandet av vilka laster berganläggningar skall dimensioneras mot. Skjuvhållfastheten predikteras bland annat med hjälp av brottkriterier, vilka valideras med hjälp av resultaten från skjuvtester i kontrollerad laboratoriemiljö. Kvaliteten på resultaten från testerna är därför av avgörande betydelse för med vilken noggrannhet kriterierna kommer att kunna prediktera skjuvhållfastheten. Det övergripande målet med detta arbete är att underlätta utvecklingen av förbättrade kriterier för prediktioner av bergssprickors skjuvhållfasthet. Som ett bidrag till att uppnå detta mål har två osäkerhetsfaktorer undersökts, nämligen geometrin av replikor (kopior) av bergssprickor och inverkan av testsystems normalstyvhet. Två kvalitetssäkringsparametrar för utvärdering av de geometriska skillnaderna mellan replikor och bergprov baserade på skanningdata har tagits fram. Den första parametern beskriver de morfologiska avvikelserna. Den andra parametern beskriver avvikelserna i orientering med avseende på skjuvplanet. Ett tillvägagångssätt med en effektiv systemnormalstyvhet, vilken kompenserar för inverkan av testsystemets normalstyvhet, har utvecklats, validerats och tillämpats. Med hjälp av kvalitetssäkringsparametrarna påvisas att det är möjligt att reproducera replikor inom snäva toleranser. Genom tillämpning av tillvägagångssättet med en effektiv normalstyvhet kan felet i normallast i princip elimineras. Sammantaget stödjer resultaten framtagning av testdata med förbättrad kvalitet och därigenom underlättas även utvecklingen av skjuvhållfasthetskriterier med förbättrad noggrannhet. / QC 210518
|
18 |
Objectified and Digitized: Digital Replicas, Digital Capital and Models' Objectification in the Fashion Industry
van Halteren, Robin Naomi, January 2024
This thesis explores how racialized and gendered power dynamics are reflected in public discourse on digital replicas and in the personal narratives of models. Using the lenses of digital capital and objectification theory, it examines how these power relations impact models' agency and ownership over their images and digital bodies. Employing a combination of critical discourse analysis and autoethnography, the research provides macro and micro perspectives on power relations in the fashion industry. The findings show how digital capital shapes who gets to set the narrative on digital replicas and who reaps the benefits of these technologies, and how racialized and gendered power relations intersect with objectification and digital capital in determining who can benefit from them. The study highlights the significant challenges models face in maintaining autonomy and control over their careers and public image, and suggests that new technologies, such as digital replicas, can further exacerbate existing inequalities. The findings also show how power dynamics at both the societal and the interpersonal level shape the potential for digital replicas to enhance agents' control over models and to intensify models' experiences of commodification. By aligning with previous research on models' marginalization and on how new technologies can intensify existing inequalities, this thesis contributes to the broader conversation on how existing power relations, labor, and technological developments shape one another.
|
19 |
Percepção e presença - o corpo na escultura, cinema e biotecnologias
Taborda, Sérgio, January 1999
No description available.
|
20 |
Statistical physics of constraint satisfaction problems
Lamouchi, Elyes, 10 1900
La technique des répliques est une technique formidable prenant ses origines de la physique statistique, comme un moyen de calculer l'espérance du logarithme de la constante de normalisation d'une distribution de probabilité à haute dimension. Dans le jargon de physique, cette quantité est connue sous le nom de l'énergie libre, et toutes sortes de quantités utiles, telles que l'entropie, peuvent en être obtenues par des dérivées. Cependant, ceci est un problème NP-difficile, qu'une bonne partie de la statistique computationnelle essaye de résoudre, et qui apparaît partout, de la théorie des codes à la statistique en hautes dimensions, en passant par les problèmes de satisfaction de contraintes. Dans chaque cas, la méthode des répliques, et son extension par (Parisi et al., 1987), se sont révélées fort utiles pour éclairer quelques aspects concernant la corrélation des variables de la distribution de Gibbs et la nature fortement non convexe de son logarithme négatif. Algorithmiquement, il existe deux principales méthodologies adressant la difficulté de calcul que pose la constante de normalisation:
a). Le point de vue statique: dans cette approche, on reformule le problème en tant que graphe dont les nœuds correspondent aux variables individuelles de la distribution de Gibbs, et dont les arêtes reflètent les dépendances entre celles-ci. Quand le graphe en question est localement un arbre, les procédures de message-passing sont garanties d'approximer arbitrairement bien les probabilités marginales de la distribution de Gibbs et, de manière équivalente, d'approximer la constante de normalisation. Les prédictions de la physique concernant la disparition des corrélations à longue portée se traduisent donc par le fait que le graphe soit localement un arbre, permettant ainsi l'utilisation des algorithmes locaux de passage de messages. Ceci va être le sujet du chapitre 4.
b). Le point de vue dynamique: dans une direction orthogonale, on peut contourner le problème que pose le calcul de la constante de normalisation en définissant une chaîne de Markov le long de laquelle l'échantillonnage converge vers la distribution de Gibbs, de sorte qu'après un certain nombre d'itérations (connu sous le nom de temps de relaxation), les échantillons sont garantis d'être approximativement générés selon elle. Afin de discuter des conditions dans lesquelles chacune de ces approches échoue, il est très utile d'être familier avec la méthode de replica symmetry breaking de Parisi.
Cependant, les calculs nécessaires sont assez compliqués et requièrent des notions qui sont typiquement étrangères à ceux sans formation en physique statistique.
Ce mémoire a principalement deux objectifs : i) fournir une introduction à la théorie des répliques, ses prédictions et ses conséquences algorithmiques pour les problèmes de satisfaction de contraintes, et ii) donner un survol des méthodes les plus récentes adressant la transition de phase, prédite par la méthode des répliques, dans le cas du problème k-SAT, des points de vue statique et dynamique, pour finir en proposant un nouvel algorithme qui prend en considération la transition de phase en question. / The replica trick is a powerful analytic technique originating from statistical physics as a means to compute the expectation of the logarithm of the normalization constant of a high-dimensional probability distribution known as the Gibbs measure. In physics jargon this quantity is known as the free energy, and all kinds of useful quantities, such as the entropy, can be obtained from it using simple derivatives. The computation of this normalization constant is, however, an NP-hard problem that a large part of computational statistics attempts to deal with, and which shows up everywhere from coding theory to high-dimensional statistics, compressed sensing, protein folding analysis and constraint satisfaction problems. In each of these cases, the replica trick and its extension by (Parisi et al., 1987) have proven incredibly successful at shedding light on key aspects of the correlation structure of the Gibbs measure and the highly non-convex nature of the negative logarithm of the Gibbs measure. Algorithmically speaking, there exist two main methodologies for addressing the intractability of the normalization constant:
a) Statics: in this approach, one casts the system as a graphical model whose vertices represent individual variables and whose edges reflect the dependencies between them. When the underlying graph is locally tree-like, local message-passing procedures are guaranteed to yield near-exact marginal probabilities or, equivalently, the normalization constant Z. The physics prediction of vanishing long-range correlations in the Gibbs measure then translates into the associated graph being locally tree-like, hence permitting the use of message-passing procedures. This will be the focus of chapter 4.
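As a toy illustration of this statics viewpoint, the sketch below runs sum-product belief propagation on a small tree-structured pairwise model and checks the marginals against brute-force enumeration; the instance is invented and unrelated to the thesis's k-SAT ensembles.

```python
import itertools
import numpy as np

# Minimal sum-product (belief propagation) example on a tiny tree-structured
# pairwise model over binary variables s_i in {-1, +1}, with Gibbs weight
#   P(s) proportional to exp( sum_(ij) J_ij s_i s_j + sum_i h_i s_i ).
# On a tree, BP marginals are exact; the brute-force check confirms it.

edges = {(0, 1): 0.8, (1, 2): -0.5, (1, 3): 0.3}   # a small tree on 4 variables
h = np.array([0.2, 0.0, -0.4, 0.1])
n = 4
spins = np.array([-1.0, 1.0])

neighbours = {i: [] for i in range(n)}
for (i, j) in edges:
    neighbours[i].append(j)
    neighbours[j].append(i)

def coupling(i, j):
    return edges.get((i, j), edges.get((j, i)))

# msg[(i, j)] is the message from i to j, indexed by the value of s_j
msg = {(i, j): np.ones(2) for i in range(n) for j in neighbours[i]}
for _ in range(10):                                 # more than the tree diameter
    new = {}
    for (i, j) in msg:
        J = coupling(i, j)
        incoming = np.ones(2)                       # product over k in d(i)\j, indexed by s_i
        for k in neighbours[i]:
            if k != j:
                incoming *= msg[(k, i)]
        out = np.array([np.sum(np.exp(J * spins * sj + h[i] * spins) * incoming)
                        for sj in spins])
        new[(i, j)] = out / out.sum()               # normalise for stability
    msg = new

bp_marginals = []
for i in range(n):
    belief = np.exp(h[i] * spins)
    for k in neighbours[i]:
        belief *= msg[(k, i)]
    bp_marginals.append(belief / belief.sum())

# brute-force marginals for comparison
weights, states = [], list(itertools.product([-1.0, 1.0], repeat=n))
for s in states:
    e = sum(J * s[i] * s[j] for (i, j), J in edges.items()) + np.dot(h, s)
    weights.append(np.exp(e))
weights = np.array(weights) / np.sum(weights)
exact = [np.array([weights[[k for k, s in enumerate(states) if s[i] == v]].sum()
                   for v in spins]) for i in range(n)]

print("BP   :", np.round(bp_marginals, 4))
print("exact:", np.round(exact, 4))
```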
b) Dynamics: in an orthogonal direction, we can altogether bypass the issue of computing the normalization constant by defining a Markov chain along which sampling converges to the Gibbs measure, such that after a number of iterations known as the relaxation time, samples are guaranteed to be approximately distributed according to the Gibbs measure. To understand the conditions in which each of the two approaches is likely to fail (strong long-range correlations, high energy barriers, etc.), it is very helpful to be familiar with the so-called replica symmetry breaking picture of Parisi. The computations required are, however, quite involved, and come with a number of prescriptions and prerequisite notions (such as large deviation principles and saddle-point approximations) that are typically foreign to those without a statistical physics background. The purpose of this thesis is then twofold: i) to provide a self-contained introduction to replica theory, its predictions, and its algorithmic implications for constraint satisfaction problems, and ii) to give an account of state-of-the-art methods for addressing the predicted phase transitions in the case of k-SAT, from both the statics and the dynamics points of view, and to propose a new algorithm that takes these into consideration.
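As a toy illustration of the dynamics viewpoint, here is a minimal single-variable Metropolis sampler targeting the Gibbs measure of a small random 3-SAT instance, with energy equal to the number of violated clauses; the instance, inverse temperature, and run length are arbitrary choices, and this is not the algorithm proposed in the thesis.

```python
import numpy as np

# Minimal sketch of the dynamics viewpoint: single-variable Metropolis updates
# targeting the Gibbs measure P(x) proportional to exp(-beta * E(x)), where
# E(x) counts the clauses of a random 3-SAT formula violated by assignment x.
# A toy illustration only, not the algorithm proposed in the thesis.

rng = np.random.default_rng(42)
n, m, beta = 50, 150, 2.0                        # variables, clauses, inverse temperature

# each clause: 3 variable indices and 3 signs (+1 means the literal is x_i)
var = rng.integers(0, n, size=(m, 3))
sign = rng.choice([-1, 1], size=(m, 3))

def energy(x):
    # a clause is violated iff all three of its literals are false
    lit = sign * x[var]                          # +1 if literal true, -1 if false
    return np.sum(np.all(lit == -1, axis=1))

x = rng.choice([-1, 1], size=n)                  # random initial assignment
E = energy(x)
for t in range(20000):                           # relaxation + sampling steps
    i = rng.integers(n)
    x[i] = -x[i]                                 # propose a single-variable flip
    E_new = energy(x)
    if rng.random() < np.exp(-beta * (E_new - E)):
        E = E_new                                # accept
    else:
        x[i] = -x[i]                             # reject: undo the flip

print(f"violated clauses after sampling: {E} of {m}")
```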
|