41.
Directional separability in two and three dimensional space. Nussbaum, Doron. Carleton University, Dissertation, Computer Science, January 1988
Thesis (M.C.S.)--Carleton University, 1988. / Also available in electronic format on the Internet.
42.
Completeness and incompleteness. Schaefer, Marcus Georg. January 1999
Thesis (Ph. D.)--University of Chicago, Dept. of Computer Science, June 1999. / Includes bibliographical references. Also available on the Internet.
43.
Techniques for analyzing the computational power of constant-depth circuits and space-bounded computation. Trifonov, Vladimir Traianov, 2006 (PDF)
Thesis (Ph. D.)--University of Texas at Austin, 2006. / Vita. Includes bibliographical references.
44.
Complexity as aging non-Poisson renewal processes. Bianco, Simone; Grigolini, Paolo. January 2007
Thesis (Ph. D.)--University of North Texas, May, 2007. / Title from title page display. Includes bibliographical references.
45.
A Longitudinal Assessment of Website Complexity. Mostafavi, Seyed Hooman. 06 September 2018
Most people now use several websites daily for purposes such as social networking, shopping, and reading news, which underscores the importance of these sites in everyday life. Businesses can therefore profit considerably by designing high-quality websites that attract more visitors. An important aspect of a good website is its page load time, and many studies have analyzed this aspect of websites from different perspectives. In this thesis, we characterize and examine the complexity of a wide range of popular websites over the past six years in order to discover trends in their complexity metrics, such as the number, size, and type of objects on a page and the number and type of servers contacted to deliver those objects. Moreover, we analyze the correlation between these metrics and page load times.
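The thesis does not publish its measurement code here, but as a rough sketch of the complexity metrics named above (object counts, sizes, content types, contacted servers, page load time), the following Python function derives them from a HAR (HTTP Archive) capture of a page load; the HAR input format and the function name are assumptions for illustration, not the author's actual pipeline.

```python
# Illustrative sketch only: assumes each page snapshot is stored as a HAR file.
import json
from collections import Counter
from urllib.parse import urlparse

def complexity_metrics(har_path):
    """Derive simple page-complexity metrics from one HAR capture."""
    with open(har_path) as f:
        log = json.load(f)["log"]

    entries = log["entries"]
    sizes = [max(e["response"].get("bodySize", 0), 0) for e in entries]
    types = Counter(e["response"]["content"].get("mimeType", "unknown")
                    for e in entries)
    servers = Counter(urlparse(e["request"]["url"]).hostname for e in entries)
    pages = log.get("pages") or []

    return {
        "num_objects": len(entries),            # objects fetched for the page
        "total_bytes": sum(sizes),              # aggregate page weight
        "objects_by_type": dict(types),         # breakdown by MIME type
        "num_servers": len(servers),            # distinct servers contacted
        "page_load_ms": pages[0]["pageTimings"].get("onLoad") if pages else None,
    }
```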
46.
Complexity Measurement of Cyber-Physical Systems. January 2011
Abstract: Modern automotive and aerospace products are large cyber-physical systems involving both software and hardware, composed of mechanical, electrical, and electronic components. The increasing complexity of such systems is a major concern because it affects development time and effort as well as initial and operational costs. Toward the goal of measuring complexity, the first step is to determine the factors that contribute to it and metrics to quantify it. These complexity components can then be used to (a) estimate the cost of a cyber-physical system, (b) develop methods that reduce that cost, and (c) support decisions such as selecting one design from a set of possible solutions or variants. To determine the contributors to complexity, we conducted a survey at an aerospace company and identified three types of contributions to system complexity: artifact complexity, design-process complexity, and manufacturing complexity. In all three domains we found three types of metrics: size complexity, numeric complexity (degree of coupling), and technological complexity (solvability). We propose a formal representation of all three domains as graphs, but with different interpretations of entity (node) and relation (link) corresponding to the three aspects above. The complexities of these components are measured using algorithms defined in graph theory. Two experiments were conducted to check the meaningfulness and feasibility of the complexity metrics. The first, on a mechanical transmission, was scoped at the component level and considered all design stages from concept to manufacturing. The second, on hybrid powertrains, was scoped at the assembly level and, because of limited resources, considered only artifact complexity. Finally, calibration of these complexity measures was conducted at an aerospace company, but those results cannot be included in this thesis. / Dissertation/Thesis / M.S. Mechanical Engineering 2011
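As a hedged sketch of the graph representation described above, the snippet below treats a (sub)system as nodes (entities) and links (relations) and computes simple size and coupling figures with networkx; these particular formulas are common graph-theoretic stand-ins rather than the thesis's actual metrics, and technological complexity (solvability) is not modeled.

```python
# Minimal sketch under the assumptions stated above; networkx holds the graph.
import networkx as nx

def structural_complexity(components, relations):
    """components: entity names; relations: (a, b) pairs between entities."""
    g = nx.Graph()
    g.add_nodes_from(components)
    g.add_edges_from(relations)

    degrees = [d for _, d in g.degree()]
    return {
        "size_complexity": g.number_of_nodes() + g.number_of_edges(),
        "mean_coupling": sum(degrees) / len(degrees) if degrees else 0.0,
        "density": nx.density(g),   # 0 (no relations) .. 1 (fully coupled)
    }

# Toy example: a four-component transmission with three couplings.
print(structural_complexity(
    ["shaft", "gear_a", "gear_b", "housing"],
    [("shaft", "gear_a"), ("gear_a", "gear_b"), ("gear_b", "housing")]))
```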
47.
Great Conversations: Systems, Complexity, and Epic Encyclopedic Narratives in Contemporary American Fiction 1960-2007. Huggins, Paul Alexander. 01 August 2013
Encyclopedic narratives, as conceptualized by Edward Mendelson, "attempt to render the full range of knowledge and beliefs of a national culture, while identifying the ideological perspectives from which that culture shapes and interprets its knowledge." The development of systems paradigms in the sciences and humanities has shown that the complexity of the modern world-system precludes any such move toward totality. From this ideological shift in contemporary American culture, it follows that recent encyclopedic narratives incorporate these new dynamic perspectives. When systems paradigms are applied to works by John Barth, Richard Powers, Annie Proulx, and Junot Díaz, the emergence of the epic encyclopedic narrative as a distinct form signifies the necessity of diversity, ambiguity, and noise in the operation of systems and the production of knowledge. Rather than presenting totalized representations of a culture, epic encyclopedic narratives represent the dynamic modern world-system by emphasizing the presence of the emergent phenomena, recursive symmetry, and noise that are central to complex systems theories. The work of Ludwig von Bertalanffy, Immanuel Wallerstein, and Ilya Prigogine, amongst others, posits that complexity spurs the development of increased order and organization in socio-cultural systems; epic encyclopedic novels incorporate this philosophy by subverting hegemonic ideologies (i.e., mythopoetic narratives) through alternative and marginalized discourses that disrupt the status quo. The goal of an epic encyclopedic narrative is to revise or complicate the readers' perception of reality through discursive instruction. As such, these novels purposively introduce noise, such as data-dense passages of unfamiliar discourses, within the narrative to force the reader into discovering the contexts needed to derive understanding. Ultimately, epic encyclopedic narratives argue that systems become corrupted and stagnant if marginalized elements are not synthesized into a heterogeneous whole that recognizes individuality.
48.
Systematic parameterized complexity analysis in computational phonology. Wareham, Harold. 20 November 2017
Many computational problems are NP-hard and hence probably do not have fast, i.e., polynomial time, algorithms. Such problems may yet have non-polynomial time algorithms whose time complexities are functions of particular aspects of the problem, i.e., the algorithm's running time is upper bounded by f(k)|x|ᶜ, where f is an arbitrary function, |x| is the size of the input x to the algorithm, k is an aspect of the problem, and c is a constant independent of |x| and k. Given such algorithms, it may still be possible to obtain optimal solutions for large instances of NP-hard problems in which the appropriate aspects are of small size or value. Questions about the existence of such algorithms are most naturally addressed within the theory of parameterized computational complexity developed by Downey and Fellows.
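As a hedged illustration of the f(k)|x|ᶜ running-time form (the thesis itself analyses phonological processing problems, not graph problems), the sketch below is the standard bounded-search-tree algorithm for Vertex Cover, whose running time is roughly O(2ᵏ · |E|), i.e. exponential only in the parameter k.

```python
# Standard textbook FPT example (not from this thesis): decide whether a graph
# has a vertex cover of size at most k by branching on either endpoint of an
# uncovered edge. The search tree has at most 2^k leaves, so the running time
# has the form f(k) * |x|^c with f(k) = 2^k.
def has_vertex_cover(edges, k):
    if not edges:
        return True            # no edges left to cover
    if k == 0:
        return False           # edges remain but the budget is spent
    u, v = edges[0]            # any cover must contain u or v for this edge
    rest_without_u = [e for e in edges if u not in e]
    rest_without_v = [e for e in edges if v not in e]
    return (has_vertex_cover(rest_without_u, k - 1) or
            has_vertex_cover(rest_without_v, k - 1))

# A path on four vertices has a vertex cover of size 2 but not of size 1.
path = [(1, 2), (2, 3), (3, 4)]
assert has_vertex_cover(path, 2) and not has_vertex_cover(path, 1)
```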
This thesis considers the merits of a systematic parameterized complexity analysis in which results are derived relative to all subsets of a specified set of aspects of a given NP-hard problem. This set of results defines an "intractability map" that shows, for each set of aspects, whether the problem has an algorithm whose non-polynomial time complexity is purely a function of those aspects. Such maps are useful not only for delimiting the set of possible algorithms for an NP-hard problem but also for highlighting those aspects that are responsible for its NP-hardness.
These points will be illustrated by systematic parameterized complexity analyses of problems associated with five theories of phonological processing in natural languages—namely, Simplified Segmental Grammars, finite-state transducer based rule systems, the KIMMO system, Declarative Phonology, and Optimality Theory. The aspects studied in these analyses broadly characterize the representations and mechanisms used by these theories. These analyses suggest that the computational complexity of phonological processing depends not on such details as whether a theory uses rules or constraints or has one, two, or many levels of representation but rather on the structure of the representation-relations encoded in individual mechanisms and the internal structure of the representations. / Graduate
49.
Capacity for complexity, intelligence and personality. Comaroff, Yael. 09 July 2012
The chaos and instability that dominate today's organisational environment often lead to complexity: continuous ambiguity and change. Leaders and managers must be able to make effective decisions in these highly abstract circumstances, so selecting and managing employees who have the capacity to handle complexity has become very important (Yuksel, 2011). The Career Path Appreciation (CPA), an interview-based technique for assessing complexity, has become popular in the South African context; however, it is extremely costly, and organisations need assurance that the financial expense results in a valid, reliable and unique assessment. This research therefore explored the associations between three assessment measures: the CPA, the Wechsler Adult Intelligence Scale III (WAIS-III) and the California Psychological Inventory (CPI). The aim of the study was to investigate whether personality and/or intelligence were associated with one's capacity for complexity and whether the CPA was distinct in any way from other personality and/or intelligence measures.
The research was based on archival data collected from a final sample of 266 managers from a large international manufacturing organisation situated in South Africa. The only biographical information obtained was the individuals' age.
Correlation analyses showed that only one of the WAIS-III subscales, Similarities, was significantly and moderately correlated with current capacity for complexity. In terms of future potential, only Similarities and Block Design showed significant positive correlations. Many more of the personality factors were found to be related to capacity for complexity. Current capacity for complexity was moderately correlated with Achievement via Independence, Independence, Empathy, Social Presence, Capacity for Status and Flexibility. For future capacity for complexity, significant moderate relationships were found with Flexibility, Social Presence, Achievement via Independence, Intellectual Efficiency, Sociability and Empathy. Chi-squared tests of association were conducted on the nominal CPA Style data; of all the WAIS-III subscales and overall scales, only Digit Symbol Coding, Similarities and Block Design showed evidence of significant relationships. Five CPI factors proved to be significantly associated with CPA Style: Empathy, Tolerance, Achievement via Independence, Intellectual Efficiency and Psychological Mindedness.
A series of multiple regressions was conducted to determine which personality and intelligence facets predicted current and future capacity for complexity. Forty-one percent of the variance in current capability was explained by age, Dominance, Sociability, Independence, Good Impression, Wellbeing, Achievement via Independence, Similarities and Block Design. For future capability, forty-eight percent of the variance in Mode was explained by age, Dominance, Social Acceptance, Good Impression, Achievement via Independence, Flexibility, Similarities, Block Design and Comprehension.
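The archival data are not available, so the following is only a hypothetical sketch of the kind of multiple regression reported in this paragraph; the column names are invented for illustration, and ordinary least squares via statsmodels stands in for whatever software the study actually used.

```python
# Hypothetical sketch; column names such as "cpa_current" and "similarities"
# are invented for illustration and do not come from the study's data set.
import pandas as pd
import statsmodels.api as sm

def fit_capability_model(scores: pd.DataFrame, outcome, predictors):
    """Regress a CPA outcome on selected intelligence/personality facets."""
    X = sm.add_constant(scores[predictors])   # add an intercept term
    model = sm.OLS(scores[outcome], X).fit()
    return model.rsquared, model.params       # variance explained, coefficients

# Usage (hypothetical):
# r2, coefs = fit_capability_model(
#     scores, "cpa_current",
#     ["age", "dominance", "sociability", "similarities", "block_design"])
```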
The research concluded that the CPA is a highly effective and unique technique for outlining an individual's capacity for complexity, even though it is an extremely costly assessment tool in South Africa. Although aspects of the WAIS-III and the CPI were found to be related to the capacity for handling complexity, these results were not strong enough to conclude that the WAIS-III and the CPI overlap with the constructs measured in the CPA or could be used in its place.
50.
Software maintainability measurement: a task complexity perspective. He, Lulu. 10 December 2010
Software maintainability is one of the most crucial quality attributes of a software product. Software engineering researchers and practitioners have devoted considerable effort to developing "good design" methods, rules, and principles to improve software maintainability. But before the effectiveness of these methods can be validated, we first need an approach to measuring software maintainability. Existing maintainability measures usually have limited scope and accuracy, since they either isolate the software from its environment and focus only on its technical properties, or measure a confounding effect of the various factors involved in the maintenance process. Furthermore, these measures are often defined and collected at a coarse-grained level and provide no insight into what makes software difficult to change. This research addresses these problems by adapting the concept of task complexity from the human-behavior domain to the software engineering domain. The dissertation develops and validates a measurement model for software maintainability, to provide a better understanding of the difficulty of modifying software and the effect of software design methods on maintainability. A measurement protocol and a tool have been developed to support the application of the measurement method.