261 |
The Dosimetric Consequences of Intensity Modulated Radiotherapy for Cervix Cancer - the Impact of Organ Motion, Deformation and Tumour Regression
Lim, Karen, 10 January 2011
Cervix cancer affects women of all ages and causes significant morbidity and mortality. Locally advanced disease is curable with radiotherapy (RT) in about 50% of patients, although often at the expense of serious side effects. In order to improve the therapeutic ratio of tumour control versus normal tissue toxicity, conformal intensity-modulated radiotherapy (IMRT) is being investigated. However, inter- and intra-fractional motion of cervix cancer can contribute to both geographical miss of the target and overdosing of surrounding normal tissues, particularly in the setting of conformal IMRT with steep dose gradients. Defining the target volume accurately and understanding the dose consequences of these complex intra-pelvic organ dynamics during external beam radiotherapy form the essential foundation for future treatment optimization and adaptation. This in turn will lead to improvements in tumour control and disease-free survival while minimising treatment toxicity.
|
263 |
A Social Theory of Knowledge
Miller, Boaz, 13 June 2011
We rely on science and other organized forms of inquiry to answer cardinal questions on issues ranging from global warming and public health to the political economy. In my thesis, which lies at the intersection of philosophy of science, social epistemology, and science and technology studies, I develop a social theory of knowledge that can help us tell when our beliefs and theories on such matters amount to knowledge, as opposed to mere opinion, speculation, or educated guess. The first two chapters discuss relevant shortcomings of mainstream analytic epistemology and the sociology of knowledge, respectively. Mainstream epistemology regards individuals, rather than communities, as the bearers of knowledge or justified belief. In Chapter 1, I argue that typically, only an epistemic community can collectively possess the justification required for knowledge. In Chapter 2, I present a case study in computer science that militates against the sociological understanding of knowledge as mere interest-based agreement. I argue that social interests alone cannot explain the unfolding of the events in this case. Rather, we must assume that knowledge is irreducible to social dynamics and interests. In Chapter 3, I begin my positive analysis of the social conditions for knowledge. I explore the question of when a consensus is knowledge-based. I argue that a consensus is knowledge-based when knowledge is the best explanation of the consensus. I identify three conditions for knowledge being the best explanation of a consensus: social diversity, apparent consilience of evidence, and meta-agreement. In Chapter 4, I illustrate my argument by analyzing the recent controversy about the safety of the drug Bendectin. I argue that the consensus in this case was not knowledge-based, and hence the deference to consensus to resolve this dispute was unjustified. In Chapter 5, I develop a new theory of the logical relations between evidence and social values.
I identify three roles social values play in evidential reasoning and justification: they influence the trust we extend to testimony, the threshold values we require for accepting evidence, and the process of combining different sorts of evidence.
|
265 |
Cooperative Strategies for Near-Optimal Computation in Wireless Networks
Nokleby, Matthew, 24 July 2013
Computation problems, such as network coding and averaging consensus, have become increasingly central to the study of wireless networks. Network coding, in which intermediate terminals compute and forward functions of others’ messages, is instrumental in establishing the capacity of multicast networks. Averaging consensus, in which terminals compute the mean of others’ measurements, is a canonical building block of distributed estimation over sensor networks. Both problems, however, are typically studied over graphical networks, which abstract away the broadcast and superposition properties fundamental to wireless propagation. The performance of computation in realistic wireless environments, therefore, remains unclear.
In this thesis, I seek near-optimal computation strategies under realistic wireless models. For both network coding and averaging consensus, cooperative communications plays a key role. For network coding, I consider two topologies: a single-layer network in which users may signal cooperatively, and a two-transmitter, two-receiver network aided by a dedicated relay. In the former topology, I develop a decode-and-forward scheme based on a linear decomposition of nested lattice codes. For a network having two transmitters and a single receiver, the proposed scheme is optimal in the diversity-multiplexing tradeoff; otherwise it provides significant rate gains over existing non-cooperative approaches. In the latter topology, I show that an amplify-and-forward relay strategy is optimal almost everywhere in the degrees-of-freedom. Furthermore, for symmetric channels, amplify-and-forward achieves rates near capacity for a non-trivial set of channel gains.
For averaging consensus, I consider large networks of randomly-placed nodes. Under a path-loss wireless model, I characterize the resource demands of consensus with respect to three metrics: energy expended, time elapsed, and time-bandwidth product consumed. I show that existing consensus strategies, such as gossip algorithms, are nearly order optimal in the energy expended but strictly suboptimal in the other metrics. I propose a new consensus strategy, tailored to the wireless medium and cooperative in nature, termed hierarchical averaging. Hierarchical averaging is nearly order optimal in all three metrics for a wide range of path-loss exponents. Finally, I examine consensus under a simple quantization model, showing that hierarchical averaging achieves a nearly order-optimal tradeoff between resource consumption and estimation accuracy.
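The gossip algorithms mentioned above can be illustrated with a minimal sketch (this is generic randomized pairwise gossip, not the thesis's hierarchical averaging scheme; the function name and iteration count are illustrative): at each step two randomly chosen nodes replace their values with the pair's average, driving every node toward the global mean.

```python
import random

def gossip_average(values, iters=2000, seed=0):
    """Randomized pairwise gossip: repeatedly pick two distinct nodes and
    replace both of their values with the pair's average. Every node's
    value converges to the mean of the initial measurements."""
    rng = random.Random(seed)
    x = list(values)
    n = len(x)
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)  # two distinct nodes
        avg = (x[i] + x[j]) / 2
        x[i] = x[j] = avg
    return x

x = gossip_average([1.0, 5.0, 9.0, 13.0])
# every entry approaches the global mean, 7.0
```

Each pairwise exchange preserves the sum of the values, which is why the protocol converges to the true average rather than drifting.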
|
266 |
COPIA: A New Software for Finding Consensus Patterns in Unaligned Protein Sequences
Liang, Chengzhi, January 2001
The consensus pattern problem (CPP) aims at finding conserved regions, or motifs, in unaligned sequences. This problem is NP-hard under various scoring schemes. To solve this problem for protein sequences more efficiently, a new scoring scheme and a randomized algorithm based on a substitution matrix are proposed here. Any practical solution to a bioinformatics problem must observe two principles: (1) the problem that it solves accurately describes the real problem; in CPP, this requires that the scoring scheme be able to distinguish a real motif from background; (2) it provides an efficient algorithm to solve the mathematical problem. A key question in protein motif-finding is how to determine the motif length. One problem in EM algorithms for CPP is how to find good starting points to reach the global optimum. These two questions were both well addressed under this scoring scheme, which made the randomized algorithm both fast and accurate in practice. A software package, COPIA (COnsensus Pattern Identification and Analysis), has been developed implementing this algorithm. Experiments using sequences from the von Willebrand factor (vWF) family showed that it worked well on finding multiple motifs and repeats. COPIA's ability to find repeats also makes it useful for illustrating the internal structures of multidomain proteins. Comparative studies using several groups of protein sequences demonstrated that COPIA performed better than commonly used motif-finding programs.
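As a rough illustration of substitution-matrix scoring, the sketch below slides a candidate consensus along a sequence and scores each window over a three-letter amino-acid alphabet (A, C, G). The tiny score table is invented for the example; it is neither the thesis's scoring scheme nor a real matrix such as BLOSUM.

```python
# Toy symmetric substitution scores (made up for illustration, not BLOSUM).
TOY = {("A", "A"): 4, ("C", "C"): 9, ("G", "G"): 6,
       ("A", "G"): 0, ("G", "A"): 0,
       ("A", "C"): -2, ("C", "A"): -2,
       ("C", "G"): -3, ("G", "C"): -3}

def window_score(window, consensus):
    """Sum of per-position substitution scores between a window and a motif."""
    return sum(TOY[(a, b)] for a, b in zip(window, consensus))

def best_window(seq, consensus):
    """Slide the consensus along seq; return the highest-scoring window."""
    L = len(consensus)
    return max((seq[i:i + L] for i in range(len(seq) - L + 1)),
               key=lambda w: window_score(w, consensus))

print(best_window("CAGAGGAC", "GAG"))  # 'GAG'
```

A real CPP solver must also choose the motif length and search over candidate consensus patterns, which is where the NP-hardness and the randomized search discussed above come in.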
|
267 |
Assessment of Affordable Housing Options using Collaborative Geospatial Software
Noble, Brad, January 2007
The scale of the affordable housing problem in Canada is enormous and the situation is worsening due to a number of recent social trends. Continued wealth inequality, an aging population, increased immigration, changing marriage and independence trends, and increased part-time employment have all contributed to a growing affordable housing problem in Canada. Certain groups such as single parents, recent immigrants, seniors living alone and tourism/seasonal workers are particularly vulnerable. In Canada, cities and tourism-based communities have the most pronounced affordable housing shortages, and this is expected to continue in the future.
New and innovative methods of public participation are needed in dealing with the challenges of affordable housing development. Spatial information technology such as Internet-based collaborative geospatial software aims to improve the public participation process. This technology is able to use the Internet, spatial data and carefully designed interfaces in order to engage citizens and increase community participation for difficult planning problems such as affordable housing development.
This thesis focuses on three objectives. The first objective is to define a collaborative, spatially-aware approach to create and assess affordable housing options in Collingwood, Ontario. This approach will use existing spatial data, participants with a vested interest in affordable housing, and an open source geospatial software tool called MapChat. The second objective is to implement the defined approach in a real-world setting in order to generate participatory input. The third and final thesis objective is to examine the spatial patterns of existing affordable housing and the locations generated in the study to determine sites that are most suitable for future affordable housing development in Collingwood.
The results of the thesis show that the approach used provides a proof of concept in the use of Internet-based collaborative geospatial software that can be applied to any town in Canada. Although the approach involved a modest study design, it was able to offer a number of potential advances in planning the locations of future affordable housing. The approach was successful in creating a set of potential affordable housing options, was effective in assessing those scenarios and was feasible to implement in a real-world setting. In addition, the approach had high potential in the generation and management of information and in supporting community participation and empowerment.
|
268 |
Voting-Based Consensus of Data Partitions
Ayad, Hanan, 08 1900
Over the past few years, there has been a renewed interest in the consensus
problem for ensembles of partitions. Recent work is primarily motivated by the
developments in the area of combining multiple supervised learners. Unlike the
consensus of supervised classifications, the consensus of data partitions is a
challenging problem due to the lack of globally defined cluster labels and to
the inherent difficulty of data clustering as an unsupervised learning problem.
Moreover, the true number of clusters may be unknown. A fundamental goal of
consensus methods for partitions is to obtain an optimal summary of an ensemble
and to discover a cluster structure with accuracy and robustness exceeding those
of the individual ensemble partitions.
The quality of the consensus partitions highly depends on the ensemble
generation mechanism and on the suitability of the consensus method for
combining the generated ensemble. Typically, consensus methods derive an
ensemble representation that is used as the basis for extracting the consensus
partition. Most ensemble representations circumvent the labeling problem. On
the other hand, voting-based methods establish direct parallels with consensus
methods for supervised classifications, by seeking an optimal relabeling of the
ensemble partitions and deriving an ensemble representation consisting of a
central aggregated partition. An important element of the voting-based
aggregation problem is the pairwise relabeling of an ensemble partition with
respect to a representative partition of the ensemble, which is referred to here
as the voting problem. The voting problem is commonly formulated as a weighted
bipartite matching problem.
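The pairwise relabeling step can be sketched as follows (an illustrative implementation, not the dissertation's code): build the cluster-overlap matrix between an ensemble partition and the representative partition, then pick the label permutation with maximum total overlap. Exhaustive search over permutations is used here for brevity; practical solvers treat this as a weighted bipartite matching problem and use the Hungarian algorithm.

```python
from itertools import permutations

def relabel(reference, labels):
    """Relabel `labels` to best match `reference` by maximizing total
    cluster overlap -- the voting problem as weighted bipartite matching.
    Brute-force over label permutations; fine for small k."""
    k = max(max(reference), max(labels)) + 1
    overlap = [[0] * k for _ in range(k)]
    for r, l in zip(reference, labels):
        overlap[r][l] += 1  # objects shared by reference cluster r, label l
    # best[l] = new label assigned to old label l, maximizing total overlap
    best = max(permutations(range(k)),
               key=lambda p: sum(overlap[p[l]][l] for l in range(k)))
    return [best[l] for l in labels]

ref    = [0, 0, 1, 1, 2, 2]
labels = [2, 2, 0, 0, 1, 1]   # the same clustering under a permuted labeling
print(relabel(ref, labels))   # [0, 0, 1, 1, 2, 2]
```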
In this dissertation, a general theoretical framework for the voting problem as
a multi-response regression problem is proposed. The problem is formulated as
seeking to estimate the uncertainties associated with the assignments of the
objects to the representative clusters, given their assignments to the clusters
of an ensemble partition. A new voting scheme, referred to as cumulative voting,
is derived as a special instance of the proposed regression formulation
corresponding to fitting a linear model by least squares estimation. The
proposed formulation reveals the close relationships between the underlying loss
functions of the cumulative voting and bipartite matching schemes. A useful
feature of the proposed framework is that it can be applied to model substantial
variability between partitions, such as a variable number of clusters.
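The least-squares instance of the regression formulation can be sketched minimally (illustrative only; the dissertation's framework is more general): when an ensemble partition is encoded as a one-hot indicator matrix, the least-squares fit of the reference's soft assignments onto it assigns each object the average reference assignment of its cluster's members, which is the essence of a cumulative-voting step.

```python
def cumulative_vote(reference, partition):
    """One relabeling step in the least-squares view of cumulative voting.
    `reference` is an n x k row-stochastic matrix of soft assignments;
    `partition` is a list of n hard cluster labels. With a one-hot
    indicator matrix U for `partition`, the least-squares solution of
    U A ~ reference maps each cluster to the mean reference row of its
    members, so each object receives its cluster's mean assignment."""
    n, k = len(reference), len(reference[0])
    clusters = {}
    for idx, c in enumerate(partition):
        clusters.setdefault(c, []).append(idx)
    out = [[0.0] * k for _ in range(n)]
    for members in clusters.values():
        avg = [sum(reference[m][j] for m in members) / len(members)
               for j in range(k)]
        for m in members:
            out[m] = avg[:]
    return out

# A partition that splits reference cluster boundaries yields soft votes:
print(cumulative_vote([[1, 0], [1, 0], [0, 1], [0, 1]], [0, 1, 1, 2]))
# [[1.0, 0.0], [0.5, 0.5], [0.5, 0.5], [0.0, 1.0]]
```

Accumulating such relabeled representations across the ensemble, rather than forcing a hard one-to-one matching, is what lets this view handle a variable number of clusters.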
A general aggregation algorithm with variants corresponding to
cumulative voting and bipartite matching is applied and a simulation-based
analysis is presented to compare the suitability of each scheme to different
ensemble generation mechanisms. The bipartite matching is found to be more
suitable than cumulative voting for a particular generation model, whereby each
ensemble partition is generated as a noisy permutation of an underlying
labeling, according to a probability of error. For ensembles with a variable
number of clusters, it is proposed that the aggregated partition be viewed as an
estimated distributional representation of the ensemble, on the basis of which
a criterion may be defined to seek an optimally compressed consensus partition.
The properties and features of the proposed cumulative voting scheme are
studied. In particular, the relationship between cumulative voting and the
well-known co-association matrix is highlighted. Furthermore, an adaptive
aggregation algorithm that is suited for the cumulative voting scheme is
proposed. The algorithm aims at selecting the initial reference partition and
the aggregation sequence of the ensemble partitions such that the loss of mutual
information associated with the aggregated partition is minimized. In order to
subsequently extract the final consensus partition, an efficient agglomerative
algorithm is developed. The algorithm merges the aggregated clusters such that
the maximum amount of information is preserved. Furthermore, it allows the
optimal number of consensus clusters to be estimated.
An empirical study using several artificial and real-world datasets demonstrates
that the proposed cumulative voting scheme leads to discovering substantially
more accurate consensus partitions compared to bipartite matching, in the case
of ensembles with a relatively large or a variable number of clusters. Compared
to other recent consensus methods, the proposed method is found to be comparable
with or better than the best performing methods. Moreover, accurate estimates of
the true number of clusters are often achieved using cumulative voting, whereas
consistently poor estimates are achieved based on bipartite matching. The
empirical evidence demonstrates that the bipartite matching scheme is not
suitable for these types of ensembles.
|
269 |
In Defense of Rawlsian Constructivism
Allen, William St. Michael, 03 May 2007
George Klosko attempts to solve a problem put forth by Rawls, namely how to create a persisting, just and stable liberal democracy in light of pluralism. He believes Rawls has failed at this task through the employment of political constructivism. Klosko claims that since Rawls does not utilize actual views within the existing public to form principles of justice, his method would fail to reach an overlapping consensus. As an alternative, Klosko proposes the method of convergence, which utilizes actual societal views to find overlapping concepts that inform the principles of justice. My argument is that Klosko misconstrues the method and aims of political constructivism. Klosko seems to incorrectly believe that stability is primary to establishing a liberal democracy, whereas it is secondary to the achievement of justice. Because of this error, Klosko’s method of convergence potentially has the consequence of creating a society which is stable but unjust.
|
270 |
Public Participation in Science and Technology Policy: Consensus Conferences and Social Inclusion
Bal, Ravtosh, 03 August 2012
This study looks at the National Citizens’ Technology Forum (NCTF), a modified version of the consensus conference, which took place in March 2008 in six cities across the U.S., to understand how inclusive these methods of public participation are in practice. The study focuses on two of these sites. Inclusion of participants was defined in terms of presence, voice and being heard. Transcripts of the audio-visual recordings of the proceedings were the main data for analysis. By focusing on the talk within these deliberative forums, the study examined how the rules of engagement and status differences (ascribed and achieved) between participants can affect inclusion. The analysis did not reveal any substantial effects of ascribed characteristics on deliberation. Facilitation and the presence of expertise among the participants were found to affect inclusion and equality among participants. These findings suggest that organizers and facilitators of deliberative exercises have to be reflexive about their own role as well as aware of group dynamics. The results also address larger questions within science and technology policy, such as the role of expertise and the public in decision making, the institutional design of participatory exercises, and their relation to political culture and the policy process.
|