
Detekce kompletnosti instrukční sady pro generování univerzálního překladače jazyka C / Instruction Set Completeness Detection for Retargetable C Compiler Generation

Nagy, Michal January 2012 (has links)
This thesis concerns the detection of completeness of an instruction set description for the LLVM compiler, i.e. the ability of the compiler to generate a target program for every valid source program in the appropriate high-level programming language. On the basis of regular tree grammar theory and several scientific theses that also concern this issue, a formal tool for checking the inclusion of two grammars has been developed. Furthermore, a method for automatically extracting the two grammars from the instruction set description has been devised, as a result of which the tool can be used to check the completeness of instruction selection. In combination with checking the completeness of the legalization process of the LLVM compiler, which precedes instruction selection, it should be feasible to check the completeness of most compiler parts dependent on the target architecture.
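The inclusion check described above can be approximated naively: if every IR expression tree (up to a bounded depth) generated by one regular tree grammar is also derivable in the other, the instruction set covers those trees. The toy grammar encoding below, the operator names, and the bounded-depth check are all illustrative assumptions, not the thesis's actual formalism or algorithm:

```python
from itertools import product

# A regular tree grammar as a dict: each nonterminal maps to a list of
# (operator, child_nonterminals) alternatives; leaves have no children.
# Hypothetical toy model, not the thesis's representation.

def trees(grammar, nt, depth):
    """All trees derivable from nonterminal `nt` within `depth` levels."""
    if depth == 0:
        return set()
    out = set()
    for op, kids in grammar[nt]:
        if not kids:
            out.add((op,))
        else:
            for combo in product(*(trees(grammar, k, depth - 1) for k in kids)):
                out.add((op,) + combo)
    return out

def derives(grammar, nt, tree, depth=10):
    """Can nonterminal `nt` derive the given tree?"""
    if depth == 0:
        return False
    op, kids = tree[0], tree[1:]
    return any(o == op and len(ks) == len(kids)
               and all(derives(grammar, k, t, depth - 1) for k, t in zip(ks, kids))
               for o, ks in grammar[nt])

def included_up_to(ga, gb, depth):
    """Bounded approximation of L(ga) ⊆ L(gb): every tree of ga up to
    `depth` must also be derivable in gb. Sound only up to the bound —
    the thesis's tool decides full inclusion, which this sketch does not."""
    return all(derives(gb, 'S', t) for t in trees(ga, 'S', depth))

# IR expression grammar over add/neg/reg.
ir = {'S': [('add', ['S', 'S']), ('neg', ['S']), ('reg', [])]}
# Instruction set lacking a `neg` pattern: incomplete for this IR.
isa = {'S': [('add', ['S', 'S']), ('reg', [])]}

print(included_up_to(ir, isa, 3))   # False: `neg` trees are not covered
print(included_up_to(isa, ir, 3))   # True
```

A real completeness check must decide inclusion of the full (infinite) tree languages, which is why the thesis builds on regular tree grammar theory rather than bounded enumeration.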

Investigation of how to implement successful KPIs for organizations – based on an empirical study at an international organization

Shen, Jing January 2013 (has links)
In the information age, KPIs (Key Performance Indicators) are commonly used for business performance measurement in organizations; however, there is no single best way to implement KPIs, which means that a company can select any arbitrary set of seemingly suitable KPIs. This thesis presents research based on existing academic theory and a case study at an international logistics company. The research aimed at helping organizations develop and implement successful KPIs that are effective, complete, aligned with their business strategy, and consistent with scientific theories of how KPIs should be implemented. I designed a framework that combines the critical factors of successful KPIs identified in the scientific literature; the framework is intended to help companies diagnose and improve the effectiveness, completeness, and alignment of their KPIs. The aim was fulfilled by successfully testing the proposed framework in a case study. The result of the case study shows that there was a gap between how the KPIs were used and the scientific theories of how they should be used. However, the framework needs to be tested further in future research to establish its general usefulness.

Hypercubes Latins maximin pour l’echantillonage de systèmes complexes / Maximin Latin hypercubes for experimental design

Le Guiban, Kaourintin 24 January 2018 (has links)
A maximin Latin Hypercube Design (LHD) is a set of points in a hypercube such that no two points share a coordinate on any dimension and such that the minimal distance between two points is maximal. Maximin LHDs are widely used in metamodeling thanks to their good sampling properties. As most work concerning LHDs has focused on heuristic algorithms to produce them, we decided to make a detailed study of this problem, including its complexity, its approximability, and the design of practical heuristic algorithms. We generalized the maximin LHD construction problem by defining the problem of completing a partial LHD while respecting the maximin constraint. The subproblem where the partial LHD is initially empty corresponds to the classical LHD construction problem. We studied the complexity of the completion problem and proved its NP-completeness in many cases. As we did not determine the complexity of the subproblem, we searched for performance guarantees for algorithms that may be designed for both problems. On the one hand, we proved that the completion problem is inapproximable for all norms in dimensions k ≥ 3, and we gave a weaker inapproximability result for the L1 norm in dimension k = 2. On the other hand, we designed an approximation algorithm for the construction problem, whose approximation ratio we established using two new upper bounds. Besides the theoretical aspects of this study, we worked on heuristic algorithms adapted to these problems, focusing on the simulated annealing metaheuristic. We proposed a new evaluation function for the construction problem and new mutations for both the construction and completion problems, improving on the results reported in the literature.
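The objects above are easy to make concrete: a Latin hypercube is one permutation of grid levels per dimension, and the maximin score is the smallest pairwise distance. The sketch below uses a plain column-swap mutation and the Metropolis acceptance rule as a generic simulated-annealing baseline; the evaluation function, mutations, cooling schedule, and parameter values are illustrative placeholders, not the thesis's improved ones:

```python
import itertools
import math
import random

def random_lhd(n, k):
    """Random Latin hypercube: one permutation of 0..n-1 per dimension,
    so no two points share a coordinate on any dimension."""
    cols = [random.sample(range(n), n) for _ in range(k)]
    return [tuple(col[i] for col in cols) for i in range(n)]

def min_dist2(pts):
    """Squared minimal L2 distance between any two points (the maximin score)."""
    return min(sum((a - b) ** 2 for a, b in zip(p, q))
               for p, q in itertools.combinations(pts, 2))

def anneal(n, k, iters=2000, t0=2.0, seed=0):
    """Toy simulated annealing: swap two values within one column (this
    preserves the Latin property) and accept by the Metropolis rule."""
    rng = random.Random(seed)
    cols = [rng.sample(range(n), n) for _ in range(k)]

    def score():
        return min_dist2([tuple(c[i] for c in cols) for i in range(n)])

    best = cur = score()
    for step in range(iters):
        t = t0 * (1 - step / iters) + 1e-9        # linear cooling
        d, i, j = rng.randrange(k), rng.randrange(n), rng.randrange(n)
        cols[d][i], cols[d][j] = cols[d][j], cols[d][i]
        new = score()
        if new >= cur or rng.random() < math.exp((new - cur) / t):
            cur = new
            best = max(best, new)
        else:
            cols[d][i], cols[d][j] = cols[d][j], cols[d][i]  # undo the swap
    return best

print(anneal(8, 2))  # best maximin squared distance found for an 8-point 2-D LHD
```

Recomputing the full score after every mutation is quadratic in the number of points; incremental evaluation of the affected distances is one of the obvious refinements a serious implementation would make.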

A Desk Study of the Education Policy Implications of Using Data from Multiple Sources: Example of Primary School Teacher Supply and Demand in Malawi

Khombe, Moses 01 December 2014 (has links) (PDF)
Malawi, as a country with very limited resources, needs to have educational policies in place to maximize the effectiveness of the public education system. Policymakers depend on accurate data, but variations in data between sources leave policymakers uncertain as they attempt to craft policies to address the growing educational crisis in Malawi. A desk study was performed to evaluate the policy implications of employing data from multiple sources, using primary school teacher supply and demand in Malawi as an illustration. This study examined one national organization, Malawi's Ministry of Education, Science, and Technology (MoEST); three international aid and assistance organizations (IAAOs), namely the UK Department for International Development (DFID), the Japan International Cooperation Agency (JICA), and the United States Agency for International Development (USAID); and one global organization, the United Nations Educational, Scientific and Cultural Organization (UNESCO). The study documented differences and similarities between the data sources. Among the factors considered were the nature of each institution and the effect it could have on data collection, aggregation, analysis, and reporting; the definitions used by each organization and their implications for data use; and each organization's methods of collection, aggregation, analysis, and reporting. The study found significant variations in the teacher supply and demand data presented by the five organizations, with variations of up to 333% between sources. To address this problem, it is recommended that the Government of Malawi (GoM) establish a central agency to standardize education data. Three policy scenarios are detailed, presenting the probable outcome of various actions the GoM could take regarding this recommendation.
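A between-source variation figure such as the 333% above is commonly computed as the spread between the lowest and highest estimate relative to the lowest. The figures below are hypothetical placeholders chosen only to reproduce that magnitude; they are not the study's actual data:

```python
# Hypothetical teacher-demand estimates from three sources (illustrative
# numbers, not the study's data), showing one common way a "variation of
# up to 333% between sources" can be computed: (max - min) / min.
estimates = {"source_a": 12000, "source_b": 21000, "source_c": 52000}

low, high = min(estimates.values()), max(estimates.values())
variation = (high - low) / low * 100
print(f"variation between lowest and highest source: {variation:.0f}%")
```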

A Tableau Algorithm for SROIQ under Infinitely Valued Gödel Semantics

Borgwardt, Stefan, Peñaloza, Rafael 20 June 2022 (has links)
Fuzzy description logics (FDLs) are knowledge representation formalisms capable of dealing with imprecise knowledge by allowing intermediate membership degrees in the interpretation of concepts and roles. One option for dealing with these intermediate degrees is to use the so-called Gödel semantics. Despite its apparent simplicity, developing reasoning techniques for expressive FDLs under this semantics is a hard task. We present a tableau algorithm for deciding consistency of a SROIQ ontology under Gödel semantics. This is the first algorithm that can handle the full expressivity of SROIQ as well as the full Gödel semantics.
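The Gödel semantics mentioned above interprets conjunction by the minimum t-norm and implication by its residuum; negation is implication into 0. These are the standard textbook definitions, not an excerpt of the paper's tableau algorithm:

```python
def goedel_and(x, y):
    """Gödel t-norm: the conjunction of membership degrees is their minimum."""
    return min(x, y)

def goedel_implies(x, y):
    """Residuum of the Gödel t-norm: x -> y equals 1 if x <= y, else y."""
    return 1.0 if x <= y else y

def goedel_not(x):
    """Gödel negation, defined as x -> 0: it collapses to 1 at 0 and to 0
    elsewhere, which is one source of this semantics' subtlety."""
    return goedel_implies(x, 0.0)

print(goedel_and(0.4, 0.7))      # 0.4
print(goedel_implies(0.4, 0.7))  # 1.0
print(goedel_implies(0.7, 0.4))  # 0.4
print(goedel_not(0.0), goedel_not(0.3))  # 1.0 0.0
```

Note that the implication depends only on the *order* between the two degrees, not on their actual values; this order-based character is what the paper's tableau reasoning over infinitely many truth values has to exploit.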

Hardness of Constraint Satisfaction and Hypergraph Coloring : Constructions of Probabilistically Checkable Proofs with Perfect Completeness

Huang, Sangxia January 2015 (has links)
A Probabilistically Checkable Proof (PCP) of a mathematical statement is a proof written in a special manner that allows for efficient probabilistic verification. The celebrated PCP theorem states that for every family of statements in NP, there is a probabilistic verification procedure that checks the validity of a PCP proof by reading only 3 bits from it. This landmark theorem, and the works leading up to it, laid the foundation for many subsequent works in computational complexity theory, the most prominent among them being the study of inapproximability of combinatorial optimization problems.

This thesis focuses on a broad class of combinatorial optimization problems called Constraint Satisfaction Problems (CSPs). In an instance of a CSP of arity k, we are given a set of variables taking values from some finite domain, and a set of constraints, each involving a subset of at most k variables. The goal is to find an assignment that simultaneously satisfies as many constraints as possible. An alternative formulation of the goal that is commonly used is Gap-CSP, where the goal is to decide whether a CSP instance is satisfiable or far from satisfiable, where the exact meaning of being far from satisfiable varies depending on the problem.

We first study Boolean CSPs, where the domain of the variables is {0,1}. The main question we study is the hardness of distinguishing satisfiable Boolean CSP instances from those for which no assignment satisfies more than some fraction epsilon of the constraints. Intuitively, as the arity increases, the CSP gets more complex, and thus the hardness parameter epsilon should decrease. We show that for Boolean CSPs of arity k, it is NP-hard to distinguish satisfiable instances from those that are at most 2^{~O(k^{1/3})}/2^k-satisfiable.

We also study coloring of graphs and hypergraphs. Given a graph or a hypergraph, a coloring is an assignment of colors to vertices such that all edges or hyperedges are non-monochromatic. The gap problem is to distinguish instances that are colorable with a small number of colors from those that require a large number of colors. For graphs, we prove that there exists a constant K_0 > 0 such that for any K >= K_0, it is NP-hard to distinguish K-colorable graphs from those that require 2^{Omega(K^{1/3})} colors. For hypergraphs, we prove that it is quasi-NP-hard to distinguish 2-colorable 8-uniform hypergraphs of size N from those that require 2^{(log N)^{1/4-o(1)}} colors.

In terms of techniques, all these results are based on constructions of PCPs with perfect completeness, that is, PCPs where the probabilistic proof verification procedure always accepts a correct proof. Not only is this a very natural property for proofs, but it can also be an essential requirement in many applications. It has always been particularly challenging to construct PCPs with perfect completeness for NP statements due to limitations in techniques. Our improved hardness results build on and extend many of the current approaches. Our Boolean CSP result and graph coloring result were proved by adapting the Direct Sum of PCPs idea by Siu On Chan to the perfect-completeness setting. Our proof of hypergraph coloring hardness improves and simplifies the recent work by Khot and Saket, in which they proposed the notion of superposition complexity of CSPs.
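The "fraction of satisfied constraints" that these gap problems speak about is simple to state in code. The instance below is a toy not-all-equal CSP of arity 3, chosen for illustration only; the brute-force search over all assignments is exactly what the hardness results say cannot scale:

```python
from itertools import product

# Toy Boolean CSP of arity 3: constraints are variable tuples paired with
# a predicate, here not-all-equal (NAE). Purely illustrative instance.
def nae(*bits):
    return min(bits) != max(bits)

constraints = [((0, 1, 2), nae), ((1, 2, 3), nae), ((0, 2, 3), nae)]
n_vars = 4

def satisfied_fraction(assignment):
    """Fraction of constraints satisfied by a 0/1 assignment tuple."""
    sat = sum(pred(*(assignment[v] for v in vars_))
              for vars_, pred in constraints)
    return sat / len(constraints)

# Exhaustive search for the best assignment: feasible only for toy sizes.
best = max(satisfied_fraction(a) for a in product((0, 1), repeat=n_vars))
print(best)  # 1.0 — this toy instance is satisfiable
```

A Gap-CSP instance asks only whether `best` equals 1 or is below some threshold epsilon; the thesis's results show that even this promise version is NP-hard for very small epsilon.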

Les deux corps du juge et le syndrome du dispositif : étude sur les causes de l'incomplétude normative, sa portée juridictionnelle et ses autres conséquences en droit continental français contemporain / The Two Bodies of the Judge and the Device Syndrome: A Study of the Causes of Normative Incompleteness, Its Jurisdictional Scope, and Its Other Consequences in Contemporary French Continental Law

Puma, André-Charles 03 October 2018 (has links)
The rule of law can be defined as an institutional system in which public authority is subject to the law. This notion was redefined at the beginning of the twentieth century by Hans Kelsen as "a State in which legal norms are hierarchized in such a way that its power is thereby limited." Such a system, which subjects those it governs to the rule, presupposes not only the legitimacy of its agents but also an objective translation of the normativity that issues from it. Yet observation of the resulting devices reveals structural and functional anomalies whose clinical effects, constituting a syndrome, point to the dysfunctions of a jurisdictional space essentially abandoned to individualities. Consequently, the interactions between agents and patients (litigants, defendants, plaintiffs) cannot be dissociated from the analysis of these manifestations, which are specific to continental law, notably contemporary French law. It is therefore after identifying the syndrome, analyzing the "device concept," observing a constant regulatory paradox, and recording the clinical signs of the affections that we deduced their vectors. The resulting observation led either to considering the phenomenon inevitable and integrating it, or to searching for its original causes and for ways likely to attenuate its effects. Hence it is in the light of a constant paradigm proper to continental law, after noting the effects and identifying the causes of the affections thus revealed by the device syndrome, that we conceived the concept of residualism. Then, after outlining both its foundations and its strategy, we sought the first avenues likely to reduce its scope and to lead to the development of an "assisted" device, aiming both to obtain the effective adherence of agents and patients and to relieve the judge of a normative responsibility that is not his.

Efficient query answering in peer data management systems

Roth, Armin 12 March 2012 (has links)
Peer data management systems (PDMS) consist of a highly dynamic set of autonomous, heterogeneous peers connected by schema mappings. Queries submitted at a peer are answered with data residing at that peer and by passing them on to neighboring peers. PDMS are the most general architecture for distributed, integrated information systems. With no need for central coordination, PDMS are highly flexible. However, due to the typically massive redundancy in mapping paths, PDMS tend to be very inefficient at computing the complete query result as the number of peers increases. Additionally, information loss accumulates along mapping paths due to selections and projections in the mappings. Users usually accept concessions on the completeness of query answers in large-scale data sharing settings. Our approach turns completeness into an optimization goal and thus trades off the benefit and cost of query answering. To this end, we propose several strategies that guide peers in deciding to which neighbors rewritten queries should be sent. In effect, the peers prune mappings that are expected to contribute little data. We propose a query optimization strategy that limits resource consumption and show that it can drastically increase efficiency while still yielding satisfying completeness of the query result. To estimate the potential data contribution of mappings, we adopted self-tuning histograms for cardinality estimation and developed techniques that ensure sufficient query feedback to adapt these statistics to the massive changes in a PDMS. Additionally, histograms can serve to maintain statistics on the data overlap between alternative mapping paths. Building on them, redundant query processing is reduced by avoiding overlapping areas of the multi-dimensional data space.
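The pruning idea above amounts to a benefit/cost test per outgoing mapping: forward the rewritten query only where the expected result size justifies the expected cost. The sketch below is a hypothetical reduction of that decision; the peer names, numbers, threshold, and the flat ratio model are all illustrative assumptions, not the thesis's estimator:

```python
# Hypothetical benefit/cost pruning of neighbor mappings. Each mapping
# carries an estimated result cardinality (as a self-tuning histogram
# might supply) and an estimated processing cost. Illustrative only.
mappings = [
    {"neighbor": "peer_a", "est_cardinality": 900, "est_cost": 30},
    {"neighbor": "peer_b", "est_cardinality": 40,  "est_cost": 25},
    {"neighbor": "peer_c", "est_cardinality": 500, "est_cost": 10},
]

def select_neighbors(mappings, min_efficiency=5.0):
    """Forward the query only over mappings whose expected results-per-cost
    ratio reaches a threshold; the remaining mappings are pruned."""
    return [m["neighbor"] for m in mappings
            if m["est_cardinality"] / m["est_cost"] >= min_efficiency]

print(select_neighbors(mappings))  # ['peer_a', 'peer_c']
```

Raising the threshold trades completeness for cost, which is precisely the benefit/cost compromise the abstract describes; the feedback loop (updating the cardinality estimates from actual query results) is what keeps such estimates usable in a highly dynamic PDMS.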

Error-Correcting Codes in Spaces of Sets and Multisets and Their Applications in Permutation Channels / Zaštitni kodovi u prostorima skupova i multiskupova i njihove primene u permutacionim kanalima

Kovačević Mladen 15 October 2014 (has links)
The thesis studies two communication channels and the corresponding error-correcting codes. Multiset codes are introduced and their applications described. Properties of entropy and relative entropy are investigated.

Planejamentos combinatórios construindo sistemas triplos de Steiner / Combinatorial designs: building Steiner triple systems

Barbosa, Enio Perez Rodrigues 26 August 2011 (has links)
Intuitively, the basic idea of design theory consists of a way to select subsets, also called blocks, of a finite set, so that some specified properties are satisfied. The most general case is that of balanced block designs. A PBD is an ordered pair (S, B), where S is a finite set of symbols and B is a collection of subsets of S called blocks, such that each pair of distinct elements of S occurs together in exactly one block of B. A Steiner triple system is a particular case of a PBD in which every block has size exactly 3; the blocks are called triples. The main focus is on techniques for constructing such systems. Through resolvability, it is discussed when a Steiner triple system is resolvable and when it is not. This theory has several applications, e.g., embeddings and even problems related to computational complexity.
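The defining PBD property with block size 3 is easy to verify mechanically. The example below uses the Fano plane, the standard smallest nontrivial Steiner triple system STS(7), as a stand-in instance (it is a textbook example, not one taken from the dissertation):

```python
from itertools import combinations

# The Fano plane: a Steiner triple system on 7 points, STS(7). Every pair
# of distinct points lies in exactly one triple.
points = range(7)
triples = [(0, 1, 2), (0, 3, 4), (0, 5, 6),
           (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]

def is_steiner_triple_system(points, triples):
    """Check the PBD property with block size 3: each pair of distinct
    points occurs together in exactly one block."""
    count = {pair: 0 for pair in combinations(points, 2)}
    for t in triples:
        if len(t) != 3:
            return False
        for pair in combinations(sorted(t), 2):
            count[pair] += 1
    return all(c == 1 for c in count.values())

print(is_steiner_triple_system(points, triples))  # True
```

Counting pairs also explains the classical necessary condition: an STS on v points must cover all v(v-1)/2 pairs with blocks of 3 pairs each, which forces v ≡ 1 or 3 (mod 6).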
