  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Developing Efficient Strategies for Automatic Calibration of Computationally Intensive Environmental Models

Razavi, Seyed Saman January 2013 (has links)
Environmental simulation models have played a key role in civil and environmental engineering decision making for decades. The utility of an environmental model depends on how well the model is structured and calibrated. Model calibration is typically automated: the simulation model is linked to a search mechanism (e.g., an optimization algorithm) that iteratively generates many parameter sets (often thousands) and evaluates them by running the model, aiming to minimize differences between observed data and the corresponding model outputs. The challenge arises when the environmental model is computationally intensive to run (with run-times of minutes to hours, for example), as any automatic calibration attempt then imposes a large computational burden. Such a challenge may lead model users to accept sub-optimal solutions and forgo the best model performance. The objective of this thesis is to develop innovative strategies to circumvent the computational burden associated with automatic calibration of computationally intensive environmental models. The first main contribution of this thesis is a strategy called “deterministic model preemption”, which opportunistically evades unnecessary model evaluations in the course of a calibration experiment and can save a significant portion of the computational budget (as much as 90% in some cases). Model preemption monitors intermediate simulation results while the model is running and terminates (i.e., pre-empts) the simulation early if it recognizes that running the model further would not guide the search mechanism. This strategy is applicable to a range of automatic calibration algorithms (i.e., search mechanisms) and is deterministic in that it leads to exactly the same calibration results as when preemption is not applied.
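The preemption idea above can be sketched in a few lines. The sketch assumes a minimization objective (sum of squared errors) that can only grow as the simulation advances, so a partial run that already exceeds the best objective seen so far can be terminated without changing the search outcome; all names (`preemptive_evaluate`, `best_so_far`) are illustrative, not taken from the thesis.

```python
def preemptive_evaluate(simulate_steps, observed, best_so_far):
    """Evaluate one candidate parameter set with deterministic model preemption.

    Because the running sum of squared errors is monotonically non-decreasing,
    once it exceeds the best objective found so far the full run can never
    beat the incumbent, and the simulation is pre-empted early. This yields
    exactly the same ranking of candidates as always running to completion.
    """
    sse = 0.0
    for t, simulated in enumerate(simulate_steps()):
        sse += (simulated - observed[t]) ** 2
        if sse > best_so_far:            # further running cannot guide the search
            return float('inf'), t + 1   # pre-empted after t+1 steps
    return sse, len(observed)            # full evaluation


# Illustrative use: "models" that just yield a fixed series per time step.
observed = [1.0, 2.0, 3.0, 4.0]
full, steps = preemptive_evaluate(lambda: iter([1.1, 2.1, 2.9, 4.2]),
                                  observed, best_so_far=1.0)   # good candidate
bad, early = preemptive_evaluate(lambda: iter([5.0, 5.0, 5.0, 5.0]),
                                 observed, best_so_far=1.0)    # pre-empted at step 1
```

The pre-empted candidate costs one model step instead of four, which is where the computational savings (up to 90% in the thesis's case studies) come from.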
Another main contribution of this thesis is developing and utilizing the concept of “surrogate data”: a small but representative subset of a full set of calibration data. This concept is inspired by existing surrogate modelling strategies, in which a surrogate model (also called a metamodel) is developed and utilized as a fast-to-run substitute for an original computationally intensive model. A framework is developed to efficiently calibrate hydrologic models to the full set of calibration data while running the original model only on surrogate data for the majority of candidate parameter sets, a strategy that leads to considerable computational saving. To this end, mapping relationships are developed to approximate the model performance on the full data from the model performance on surrogate data. This framework is applicable to the calibration of any environmental model for which appropriate surrogate data and mapping relationships can be identified. As another main contribution, this thesis critically reviews and evaluates the large body of literature on surrogate modelling strategies from various disciplines, as they are the most commonly used methods to relieve the computational burden associated with computationally intensive simulation models. To reliably evaluate these strategies, a comparative assessment and benchmarking framework is developed that presents a clear, computational-budget-dependent definition of the success or failure of surrogate modelling strategies. Two large families of surrogate modelling strategies are critically scrutinized and evaluated: “response surface surrogate” modelling, which involves statistical or data-driven function approximation techniques (e.g., kriging, radial basis functions, and neural networks), and “lower-fidelity physically-based surrogate” modelling strategies, which develop and utilize simplified models of the original system (e.g., a groundwater model with a coarse mesh).
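A minimal sketch of the “mapping relationship” idea, assuming (purely for illustration) that performance on the full data set is roughly linear in performance on the surrogate data; the linear form, the numbers, and all names here are this sketch's assumptions, not the method prescribed in the thesis.

```python
def fit_mapping(surrogate_scores, full_scores):
    """Least-squares linear mapping: full_score ≈ a * surrogate_score + b.

    Fit on a handful of parameter sets evaluated on BOTH the surrogate data
    and the full calibration data; thereafter, most candidates are run only
    on the cheap surrogate data and their full-data performance is predicted.
    """
    n = len(surrogate_scores)
    mx = sum(surrogate_scores) / n
    my = sum(full_scores) / n
    sxx = sum((x - mx) ** 2 for x in surrogate_scores)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(surrogate_scores, full_scores))
    a = sxy / sxx
    b = my - a * mx
    return a, b


# Objective values from a few calibration runs on both data sets
# (synthetic numbers lying exactly on full = 2*surrogate + 0.05).
surr = [0.2, 0.5, 0.9]
full = [0.45, 1.05, 1.85]
a, b = fit_mapping(surr, full)
predicted_full = a * 0.4 + b   # screen a new candidate without a full-data run
```

The computational saving comes from reserving full-data model runs for the candidates whose predicted full-data performance looks promising.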
This thesis raises fundamental concerns about response surface surrogate modelling and demonstrates that, although they might be less efficient, lower-fidelity physically-based surrogates are generally more reliable because they preserve, to some extent, the physics of the original model. Five different surface water and groundwater models are used across this thesis to test the performance of the developed strategies and to support the discussion. The strategies developed are, however, largely simulation-model-independent and can be applied to the calibration of any computationally intensive simulation model that has the required characteristics. This thesis leaves the reader with a suite of strategies for efficient calibration of computationally intensive environmental models, along with guidance on how to select, implement, and evaluate the appropriate strategy for a given environmental model calibration problem.
43

Normalita výjimečnosti? (Z)vládnutí krize v reformě azylové a migrační politiky Evropské unie / Normality of the exception? Crisis Governance in reforming the Asylum and Migration Policy of the European Union

Kaleta, Ondřej January 2019 (has links)
This doctoral thesis examines the European Union's crisis governance in the context of migration developments after 2015. The author investigates how the relevant EU institutions (the European Commission, the Council of the EU, and the European Council) construct exceptionality within the common asylum and migration policy and what its impacts on the functionality of this policy might be. Theoretically, the research is based on the concept of the "state of exception" originally introduced in the works of Carl Schmitt and Giorgio Agamben. The main objective of the thesis is to analyze and interpret the extraordinary migration measures from 2015 to 2018 that were proposed and implemented by EU political actors to address the migration situation. The institutional level is further broadened and contextualized by including three EU Member State governments - Hungary, Austria, and Germany - and their involvement in the interactive shaping of emergency policies. The author studies how the exception is constructed in official EU discourse, the relationship between exception and normality, and the exercise of power to create a state of exception at the supranational/intergovernmental level of the EU as an international organization. The thesis approaches the topic using critical discourse analysis. It...
44

An Assessment of the 2002 National Security Strategy of the United States: Continuity and Change.

Prince, Troy Jason January 2009 (has links)
The 2002 National Security Strategy of the US (NSS 2002) appeared to present a momentous approach to self-defense. To many, the doctrine of preemptive self-defense seemed to challenge the legal and political foundations of the post-World War II international order. Some saw in the US's stated reliance on preemption a direct threat to the international system embodied in the UN Charter. The prima facie case that the US position was novel, and even dangerous, appeared persuasive. This thesis attempts to assess the exceptionality of NSS 2002 in its formulation and implications. The question of exceptionality is broadly divided into two sections. The first deals with internal exceptionality, in terms of means (the deliberation and drafting processes) and ends (the US defense posture). The second deals with external exceptionality, in the broader terms of possible consequences outside the US. Section One begins by establishing the grounds for looking into the formulation of NSS 2002 and provides the background for that Strategy's mandated precursors. After exploring how National Security Strategy documents are conceived and framed, Section One discusses the Strategy as published and examines a sampling of contemporaneous reactions to its publication. Section Two concentrates on the second part of the research question and takes a thematic approach, in terms of the use of force, the international security environment, and international law. Possible consequences of the proposed US response to contemporary security challenges are considered in these three key areas.
45

Impacts of Complexity and Timing of Communication Interruptions on Visual Detection Tasks

Stader, Sally 01 January 2014 (has links)
Auditory preemption theory suggests two competing assumptions for the attention-capturing and performance-altering properties of auditory tasks. In onset preemption, attention is immediately diverted to the auditory channel. Strategic preemption involves a decision process in which the operator maintains focus on more complex auditory messages. The limitation in this process is that the human auditory, or echoic, memory store has a limit of 2 to 5 seconds, after which the message must be processed or it decays. In contrast, multiple resource theory suggests that visual and auditory tasks may be efficiently time-shared because two different pools of cognitive resources are used. Previous research regarding these competing assumptions has been limited and equivocal. Thus, the current research focused on systematically examining the effects of complexity and timing of communication interruptions on visual detection tasks. It was hypothesized that both timing and complexity levels would impact detection performance in a multi-task environment. Study 1 evaluated the impact of complexity and timing of communications occurring before malfunctions in an ongoing visual detection task. Twenty-four participants were required to complete each of the eight timing blocks that included simple or complex communications occurring simultaneously, and at 2, 5, or 8 seconds before detection events. For simple communications, participants repeated three pre-recorded words. However, for complex communications, they generated three words beginning with the same last letter of a word prompt. Results indicated that complex communications at two seconds or less occurring before a visual detection event significantly impacted response time with a 1.3 to 1.6 second delay compared to all the other timings. Detection accuracy for complex communication tasks under the simultaneous condition was significantly degraded compared to simple communications at five seconds or more prior to the task. 
This resulted in a 20% decline in detection accuracy. Additionally, participants' workload ratings for complex communications were significantly higher than for simple communications. Study 2 examined the timing of communications occurring at the corresponding seconds after the visual detection event. Twenty-four participants were randomly assigned to the communication complexity and timing blocks as in Study 1. The results did not show significant effects of timing or complexity of communications on detection performance. However, workload ratings for the 2- and 5-second complex communication presentations were higher than for the corresponding simple communication conditions. Overall, these findings support the strategic preemption assumption for well-defined, complex communications. The onset preemption assumption for simple communications was not supported. These results also suggest that the boundaries of the multiple resource theory assumption may extend up to the limits of the echoic memory store. Figures of merit for task performance under the varying levels of timing and complexity are presented. Several theoretical and practical implications are discussed.
46

Ordonnancement temps réel préemptif multiprocesseur avec prise en compte du coût du système d’exploitation / Multiprocessor preemptive real-time scheduling taking into account the operating system cost

Ndoye, Falou 03 April 2014 (has links)
In this thesis we study multiprocessor preemptive real-time scheduling taking into account the exact cost of the operating system (OS). This cost has two parts: a part that is easy to determine, corresponding to the scheduler cost, and a part that is difficult to determine, corresponding to the preemption cost. The difficulty stems from the fact that one preemption can trigger another, potentially creating an avalanche phenomenon. First, we studied off-line multiprocessor real-time scheduling of independent tasks taking into account the exact preemption cost, and proposed a schedulability analysis based on a multiprocessor scheduling heuristic. This heuristic uses the partitioned multiprocessor scheduling approach. To take the exact preemption cost on every processor into account, we used the schedulability condition proposed by Meumeu and Sorel. This condition, for fixed-priority tasks, is based on a binary scheduling operation that counts the exact number of preemptions and adds their cost to the schedulability analysis. The proposed heuristic maximizes the remaining utilization factor in order to distribute the tasks fairly across processors and reduce their response times. It produces an off-line scheduling table. Secondly, we studied off-line multiprocessor real-time scheduling of dependent tasks taking into account the exact preemption cost. Because the schedulability condition used for independent tasks applies only to fixed-priority tasks, it cannot handle the priority inversions that dependent tasks may cause. We therefore proposed a new schedulability condition that handles both fixed and dynamic priorities.
This condition takes into account the exact preemption cost and the dependences between tasks without any loss of data. Still using the partitioned scheduling approach, we then proposed for dependent tasks a multiprocessor scheduling heuristic that reuses this new schedulability condition on each processor. This heuristic takes inter-processor communication costs into account and also minimizes, on each processor, the makespan (the total execution time of the tasks). It produces for each processor an off-line scheduling table containing the start and end times of every task and every inter-processor communication. Assuming a time-triggered multiprocessor architecture in which all processors share a single time reference, we proposed for each processor an on-line scheduler that uses the scheduling table produced by the off-line schedulability analysis. This on-line scheduler has the advantage of a constant cost that is easy to determine exactly: it corresponds only to the time needed to read the scheduling table and obtain the task selected during the off-line schedulability analysis. In classical on-line schedulers, by contrast, this cost corresponds to updating the list of ready tasks and then selecting a task according to some algorithm (e.g., RM, DM, or EDF), so it varies with the number of ready tasks, which changes from one invocation of the scheduler to the next; it is this varying cost that the schedulability analyses mentioned above must use. A further advantage is that no synchronization of access to the data memories shared by several tasks is necessary, because this synchronization was already performed during the off-line schedulability analysis.
47

Humanitarian Intervention: Moral Perspectives

Clark, Tyrome 01 January 2016 (has links)
This thesis addresses primary concepts in the humanitarian intervention debates. I argue that humanitarian intervention is a perfect duty: the global community has a moral obligation to act decisively in the face of extreme human rights abuses. There are two contrasting theoretical perspectives on international relations and humanitarian intervention, statism and cosmopolitanism, which contest the relative value of state sovereignty and human rights. Some of the most prominent ethicists in the debate have determined that states have a “right” to intervene militarily in the internal affairs of other states to halt severe human rights abuses, but that there is no “duty” to intervene. These conclusions are largely based on consequentialist considerations. This thesis argues that a deontological perspective is essential. References are made to events in Rwanda, Darfur, and Kosovo. There is a critical role for preemptive action to play in addressing humanitarian crises and answering calls for global justice.
48

Národní bezpečnostní strategie Spojených států amerických 2002: Imperiální Grand Strategy? / National Security Strategy of the United States of America 2002: Imperial Grand Strategy?

Ludvík, Jan January 2009 (has links)
This paper offers a thorough examination of the 2002 National Security Strategy of the United States. The document is explored in its broader context, which allows us to understand its uniqueness and thereby offer a sufficient interpretation. Special attention is devoted to the decision-making process of the U.S. National Security Council (NSC), given the NSC's primary responsibility for coordinating American security policy. Further attention is paid to three problem-related parts that are often considered the most revolutionary aspects of the document. Preemption, unilateralism, and U.S. support for the spread of democracy are examined in the broader context of the U.S. foreign policy tradition, American identity, and historical development. On the basis of this research, the paper concludes that all major parts of the document are largely compatible with the development of U.S. security policy and represent the outcome of prior developments rather than a fundamental change or reformulation of strategy. The role of strategic documents is implicitly examined as well: the study suggests that such a document should be perceived as a product of bureaucratic politics, as summarized in a model by Graham Allison.
49

[pt] JOGOS DE OPÇÕES EM OLIGOPÓLIOS ASSIMÉTRICOS SOB AMEAÇA DE PREEMPÇÃO: UMA APLICAÇÃO NO SETOR DE LATAS DE ALUMÍNIO / [en] OLIGOPOLY GAMES UNDER ASYMMETRIC COSTS AND PREEMPTION: AN APPLICATION TO THE ALUMINUM CAN INDUSTRY

11 November 2021 (has links)
This study analyzes the impact of preemption in investment-timing games under asymmetric oligopoly and applies the resulting model to the Brazilian aluminum can industry. A first analysis of a monopolistic market helps in understanding the foundations of the model and the key aspects that influence the firm's value. We then turn to competition among symmetric firms, taking into account the strategic interactions and their consequences for investment timing; at this point the notion of preemption and its effects on optimal timing decisions are introduced. Finally, we study option games under asymmetric costs and preemption for the case of three firms in the market, representing the Brazilian aluminum can industry. An analytical solution is derived first, followed by a numerical application.
One of the main results is that the threat of preemption lowers the threshold that triggers investment, so investment occurs sooner, preventing firms from investing at the time that would maximize their values.
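For context, the monopoly benchmark in such option games is the classic real-options investment trigger in the spirit of Dixit and Pindyck; the sketch below is that textbook benchmark, not the thesis's calibrated three-firm model, and the parameter values are arbitrary.

```python
from math import sqrt


def investment_trigger(r, delta, sigma, investment_cost):
    """Classic real-options trigger V* = beta/(beta-1) * I for a project
    whose value follows a geometric Brownian motion (Dixit-Pindyck).

    r: risk-free rate, delta: dividend/convenience yield, sigma: volatility.
    beta is the positive root of 0.5*sigma^2*b*(b-1) + (r-delta)*b - r = 0.
    """
    a = 0.5 - (r - delta) / sigma ** 2
    beta = a + sqrt(a ** 2 + 2 * r / sigma ** 2)
    return beta / (beta - 1) * investment_cost


# A monopolist waits until project value exceeds this trigger (well above
# the bare investment cost); the thesis's result is that the threat of
# preemption pushes firms to invest below it.
v_star = investment_trigger(r=0.05, delta=0.03, sigma=0.2,
                            investment_cost=100.0)
```

The gap between `v_star` and the investment cost is the option value of waiting, which is exactly what preemption pressure erodes.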
50

Analyses of Bus Travel Time Reliability and Transit Signal Priority at the Stop-To-Stop Segment Level

Feng, Wei 02 June 2014 (has links)
Transit travel time is affected by many factors, including traffic signals and traffic conditions. Transit agencies have implemented strategies such as transit signal priority (TSP) to reduce transit travel time and improve service reliability. However, due to the lack of empirical data, the joint impact of these factors and improvement strategies on bus travel time has not been studied at the stop-to-stop segment level. This study utilizes and integrates three databases available along an urban arterial corridor in Portland, Oregon. Data sources include stop-level bus automatic vehicle location (AVL) and automatic passenger count (APC) data provided by the Tri-County Metropolitan Transportation District of Oregon (TriMet), Sydney Coordinated Adaptive Traffic System (SCATS) signal phase log data, and intersection vehicle count data provided by the City of Portland. Based on this unique collection and integration of fine-granularity empirical data, the research uses multiple linear regression models to understand and quantify the joint impact of intersection signal delay, traffic conditions, and bus stop location on bus travel time and its variability over stop-to-stop segments. Results indicate that intersection signal delay is the key factor affecting bus travel time variability, and that the amount of signal delay is nearly linearly associated with intersection red phase duration. The effect of traffic conditions (volumes) on bus travel time varies significantly by intersection and time of day. This study also proposes new performance measures for evaluating the effectiveness of TSP systems. Relationships between TSP requests (made when buses are late) and TSP phases were studied by comparing TSP phase start and end times with bus arrival times at intersections. Results show that green extension phases were rarely used by the buses that requested TSP and that most green extension phases were granted too late.
Early green effectiveness (percent of effective early green phases) is much higher than green extension effectiveness. The estimated average bus and passenger time savings from an early green phase are also greater compared to the average time savings from a green extension phase. On average, the estimated delay for vehicles on the side street due to a TSP phase is less than the time saved for buses and automobiles on the major street. Results from this study can be used to inform cities and transit agencies on how to improve transit operations. Developing appropriate strategies, such as adjusting bus stop consolidation near intersections and optimizing bus operating schedules according to intersection signal timing characteristics, can further reduce bus travel time delay and improve TSP effectiveness.
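A toy version of a TSP effectiveness measure in the spirit of the study's metrics (not its exact definition): count a granted priority phase as effective only if the requesting bus actually arrives within the phase window. All names and times below are hypothetical.

```python
def tsp_effectiveness(phases, bus_arrivals):
    """Share of granted TSP phases that were usable by the requesting bus.

    phases: list of (start, end) phase times, one per TSP grant.
    bus_arrivals: the requesting bus's arrival time at the intersection,
    paired with each phase. A phase counts as effective only if the bus
    arrives between the phase's start and end times.
    """
    effective = sum(1 for (start, end), arrival in zip(phases, bus_arrivals)
                    if start <= arrival <= end)
    return effective / len(phases)


# Three granted green extensions (times in seconds); only the first starts
# while the requesting bus is actually at the intersection.
phases = [(100, 110), (200, 208), (300, 306)]
arrivals = [105, 195, 296]
share = tsp_effectiveness(phases, arrivals)   # 1 of 3 phases effective
```

Comparing this share for early green versus green extension phases is one way to express the study's finding that extensions were frequently granted too late to help the bus that requested them.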
