1 |
An evaluation of the significance of 'scars of parturition' in the Christ Church Spitalfields sample. Cox, Margaret J. January 1989 (has links)
The relationship between the preauricular sulcus and pitting on the dorsal aspect of the pubic corpus in association with pregnancy and parturition has aroused considerable interest since the early 1970s. The major limitation of much of the discussion is that it has been based on data derived from skeletal samples with either unknown or uncertain obstetric histories. The excavation of the crypts beneath Christ Church, Spitalfields between 1984 and 1986 produced 968 skeletons, 387 of which were recovered with securely associated, legible coffin plates. Of the 138 adult females in this sample, the parity status of 94 has been reconstructed from historical documentation. Such obstetric factors as parity status, number of births, age at first and last births and birth spacing have been examined in relation to the presence or absence of the preauricular sulcus, its type and size, pubic pitting, sacral scarring and the extension of the pubic tubercle. The results suggest that the preauricular sulcus and sacral scarring are independent of obstetric events and that, although the small number of females with more than one pubic pit or an extended pubic tubercle had all borne children, the absence of these features occurs in both parous and nulliparous females. Unlike previous studies, both localised cortical resorption and tubercle extension were evaluated as a component part of the obstetric pelvis. The more capacious pelvis proved to be associated with wider and longer preauricular sulci and with the presence of pubic pitting. In order to facilitate comparative studies, the Christ Church females are described as part of the sample from which they are derived. Their environmental and cultural backgrounds are discussed.
|
2 |
Human skeletal collections: the responsibilities of project managers, physical anthropologists, conservators and the need for standardized condition assessments. Janaway, Robert C., Wilson, Andrew S., Caffell, Anwen C., Roberts, Charlotte A. January 2001 (has links)
No abstract available.
|
3 |
Automated detection of structured coarse-grained parallelism in sequential legacy applications. Edler Von Koch, Tobias Joseph Kastulus. January 2014 (has links)
The efficient execution of sequential legacy applications on modern, parallel computer architectures is one of today's most pressing problems. Automatic parallelization has been investigated as a potential solution for several decades but its success generally remains restricted to small niches of regular, array-based applications. This thesis investigates two techniques that have the potential to overcome these limitations. Beginning at the lowest level of abstraction, the binary executable, it presents a study of the limits of Dynamic Binary Parallelization (DBP), a recently proposed technique that takes advantage of an underlying multicore host to transparently parallelize a sequential binary executable. While still in its infancy, DBP has received broad interest within the research community. This thesis seeks to gain an understanding of the factors contributing to the limits of DBP and the costs and overheads of its implementation. An extensive evaluation using a parameterizable DBP system targeting a CMP with light-weight architectural TLS support is presented. The results show that there is room for a significant reduction of up to 54% in the number of instructions on the critical paths of legacy SPEC CPU2006 benchmarks, but that it is much harder to translate these savings into actual performance improvements, with a realistic hardware-supported implementation achieving a speedup of 1.09 on average. While automatically parallelizing compilers have traditionally focused on data parallelism, additional parallelism exists in a plethora of other shapes such as task farms, divide & conquer, map/reduce and many more. These algorithmic skeletons, i.e. high-level abstractions for commonly used patterns of parallel computation, differ substantially from data parallel loops. Unfortunately, algorithmic skeletons are largely informal programming abstractions and are lacking a formal characterization in terms of established compiler concepts.
This thesis develops compiler-friendly characterizations of popular algorithmic skeletons using a novel notion of commutativity based on liveness. A hybrid static/dynamic analysis framework for the context-sensitive detection of skeletons in legacy code that overcomes limitations of static analysis by complementing it with profiling information is described. A proof-of-concept implementation of this framework in the LLVM compiler infrastructure is evaluated against SPEC CPU2006 benchmarks for the detection of a typical skeleton. The results illustrate that skeletons are often context-sensitive in nature. Like the two approaches presented in this thesis, many dynamic parallelization techniques exploit the fact that some statically detected data and control flow dependences do not manifest themselves in every possible program execution (may-dependences) but occur only infrequently, e.g. for some corner cases, or not at all for any legal program input. While the effectiveness of dynamic parallelization techniques critically depends on the absence of such dependences, not much is known about their nature. This thesis presents an empirical analysis and characterization of the variability of both data dependences and control flow across program runs. The cBench benchmark suite is run with 100 randomly chosen input data sets to generate whole-program control and data flow graphs (CDFGs) for each run, which are then compared to obtain a measure of the variance in the observed control and data flow. The results show that, on average, the cumulative profile information gathered with at least 55, and up to 100, different input data sets is needed to achieve full coverage of the data flow observed across all runs. For control flow, the figures stand at 46 and 100 data sets, respectively.
This suggests that profile-guided parallelization needs to be applied with utmost care, as misclassification of sequential loops as parallel was observed even when up to 94 input data sets were used.
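The coverage measurement described above can be sketched in miniature. The data here is hypothetical (not the actual cBench profiles): each profiling run contributes a set of observed dependence edges, and we count how many runs are needed before the cumulative union stops growing.

```python
def runs_to_full_coverage(runs):
    """runs: list of sets of dependence edges, one set per profiled input.
    Returns the 1-based number of runs after which the cumulative set of
    observed dependences equals the union over all runs."""
    full = set().union(*runs)
    seen = set()
    for i, edges in enumerate(runs, start=1):
        seen |= edges
        if seen == full:
            return i
    return len(runs)

# Toy data: the third input exercises a dependence unseen in the first two,
# so full coverage of the observed data flow needs all three runs.
runs = [{("a", "b"), ("b", "c")}, {("a", "b")}, {("a", "b"), ("c", "d")}]
print(runs_to_full_coverage(runs))  # prints 3
```

This also shows why misclassification is possible: a dependence that none of the profiled inputs happens to exercise is invisible to the analysis.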
|
4 |
Magnetic skeletons and 3D magnetic reconnection. Haynes, Andrew L. January 2008 (has links)
The upper atmosphere of the Sun, the solar corona, is approximately 1,000,000 K hotter than the Sun's surface, a property which cannot be explained by the normal processes of heat conduction and radiation. It is now commonly believed that the magnetic fields which fill the solar atmosphere, and propagate down into the interior of the Sun, are important for transferring and transforming energy from the strong plasma flows inside the Sun into the corona as heat. I have investigated an elementary flux interaction which forms a fundamental building block of the coronal heating process. This interaction involves two opposite-polarity sources on the Sun's surface in the presence of an overlying magnetic field. To fully understand how this interaction transfers heat into the solar corona, the magnetic skeleton is required, which shows possible sites of heating due to magnetic reconnection. A magnetic field is best described by its magnetic skeleton. The most important parts of the magnetic skeleton to find are the null points, from which separatrix surfaces extend that divide magnetic flux of different topology. Part of this thesis proposes a new method of finding null points, whose accuracy is demonstrated and then compared with another commonly used method (which gave false results). Using these techniques for finding the magnetic skeleton in the magnetic interaction above, the evolution of the skeleton was found to pass through seven distinct states, some of which were far more complicated than expected. These included a high number of separators (the intersections of two separatrix surfaces), which are a known location of magnetic reconnection. This separator reconnection was shown to be the main heating mechanism in this interaction, from which the total amount and rates of reconnection in the experiment were calculated.
This led to the discovery of recursive reconnection, a process in which magnetic flux is reconnected and then reconnects back to its original state, allowing the process to repeat. Recursive reconnection was shown to allow far more reconnection than would previously have been expected, all of which releases heat into the neighbouring areas of the atmosphere. Finally, the interaction was modelled with sources of different magnetic radii but equal flux. This showed that when the antisymmetric nature of the previous interactions was removed, there was little change in the reconnection rates, but when the strength of the overlying magnetic field was increased, the reconnection rates were found to increase. The increased overlying field strength also produced a new magnetic feature called a bald-edge, which was found to replace some of the null points. These bald-edges were found to be associated with surfaces, similar to separatrix surfaces, that divide flux of different topology but do not extend from a null point. Features similar to separators also extend from these bald-edges.
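Null points are the zeros of the field B, so numerically they can be hunted with a Newton iteration. The sketch below is a generic root-finder under assumed conventions, not the specific detection method proposed in the thesis: it uses a finite-difference Jacobian and a linear, divergence-free test field B = (x, y, -2z) with a single null at the origin.

```python
def jacobian(B, x, h=1e-6):
    """Central-difference Jacobian of a vector field B at point x."""
    n = len(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        fp, fm = B(xp), B(xm)
        for i in range(n):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

def solve3(J, b):
    """Solve the 3x3 system J x = b by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det(J)
    out = []
    for j in range(3):
        m = [row[:] for row in J]
        for i in range(3):
            m[i][j] = b[i]
        out.append(det(m) / D)
    return out

def find_null(B, x0, tol=1e-10, itmax=50):
    """Newton iteration towards a zero (null point) of the field B."""
    x = list(x0)
    for _ in range(itmax):
        f = B(x)
        if max(abs(c) for c in f) < tol:
            break
        dx = solve3(jacobian(B, x), [-c for c in f])
        x = [a + d for a, d in zip(x, dx)]
    return x

# A divergence-free linear field with a single null at the origin.
B = lambda p: [p[0], p[1], -2.0 * p[2]]
null = find_null(B, [0.3, -0.2, 0.5])  # converges to approximately (0, 0, 0)
```

On gridded data the hard part is not the root-finder but deciding which cells contain nulls at all, which is why the comparison of detection methods in the thesis matters.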
|
5 |
Comparación de algoritmos de cálculo del Skeleton y su aplicación en biología / Comparison of skeleton computation algorithms and their application in biology. Lavado Abarzúa, Alejandro Andrés. January 2018 (has links)
Master of Science, specialization in Computing.
Civil Engineering degree in Computing / Skeleton computation algorithms are a widely used computational tool in the processing of medical and biological images and volumes. They are procedures that reduce a figure to a set of lines passing through its center.
The skeleton of a figure can be computed following very different strategies. As a result, each skeleton algorithm can produce a result very different from those of the other algorithms. So, when working on an application that requires a skeleton, how should the best algorithm to compute it be chosen?
This thesis proposes metrics originating in the morphological analysis of biological structures, such as the total length of the skeleton, its number of nodes and its bifurcation angles, to answer the above question quantitatively. These metrics make it possible to characterize a skeleton numerically and compare it with others. In this way, the best algorithm for a specific application can be selected based on the values of the metrics relevant to that application.
To demonstrate the effectiveness of these metrics, three skeleton computation algorithms based on different theoretical principles were implemented: topological thinning, divergence-based skeleton computation and distance-based skeleton computation. These algorithms, plus a fourth based on mesh contraction, were used to compute the skeletons of simulated and real biological models. The skeletons of the simulated models made it possible to measure each algorithm's deviation from the ideal value of each metric, revealing significant differences in some cases. An example is the total-length metric on neuron-like structures: mesh-contraction skeletonization produces a structure significantly shorter than the skeleton computed by a distance-based algorithm, whose total length is close to the real one. The mesh-contraction algorithm, however, is more appropriate for computing bifurcation angles. Finally, the metrics for skeletons of real models showed marked differences between the results produced by each algorithm for the same figure. / Partially funded by the Fondo Nacional de Desarrollo Científico y Tecnológico (FONDECYT 11161033), the Instituto Milenio de Neurociencias Biomédicas - BNI (P09-015-F) and the Anillo initiative (ACT1402)
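The metrics named above can be computed directly for a skeleton represented as a graph of nodes and edges. The sketch below uses a toy Y-shaped skeleton (not one of the thesis models) and reports total length, branch-node count and bifurcation angles.

```python
import math

def skeleton_metrics(nodes, edges):
    """nodes: {id: (x, y, z)}; edges: list of (id, id) pairs.
    Returns (total length, number of branch nodes, bifurcation angles in degrees)."""
    total = sum(math.dist(nodes[a], nodes[b]) for a, b in edges)
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    branch = [n for n, nb in adj.items() if len(nb) >= 3]
    angles = []
    for n in branch:
        nb = adj[n]
        for i in range(len(nb)):
            for j in range(i + 1, len(nb)):
                u = [c2 - c1 for c1, c2 in zip(nodes[n], nodes[nb[i]])]
                v = [c2 - c1 for c1, c2 in zip(nodes[n], nodes[nb[j]])]
                cosang = sum(a * b for a, b in zip(u, v)) / (math.hypot(*u) * math.hypot(*v))
                angles.append(math.degrees(math.acos(max(-1.0, min(1.0, cosang)))))
    return total, len(branch), angles

# Toy Y-shaped skeleton: a stem that bifurcates at node 1.
nodes = {0: (0, 0, 0), 1: (0, 1, 0), 2: (-1, 2, 0), 3: (1, 2, 0)}
edges = [(0, 1), (1, 2), (1, 3)]
total, nbranch, angles = skeleton_metrics(nodes, edges)
print(round(total, 3), nbranch, sorted(round(a) for a in angles))  # 3.828 1 [90, 135, 135]
```

With metrics in this form, two skeletons of the same figure produced by different algorithms can be compared number by number, which is exactly the selection procedure the thesis proposes.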
|
6 |
Linking Scheme code to data-parallel CUDA-C code. 2013 December 1900 (has links)
In Compute Unified Device Architecture (CUDA), programmers must manage memory operations, synchronization, and utility functions of Central Processing Unit programs that control and issue data-parallel general purpose programs running on a Graphics Processing Unit (GPU). NVIDIA Corporation developed the CUDA framework to enable and develop data-parallel programs for GPUs to accelerate scientific and engineering applications by providing a language extension of C called CUDA-C. A foreign-function interface comprising Scheme and CUDA-C constructs extends the Gambit Scheme compiler and enables linking of Scheme and data-parallel CUDA-C code to support high-performance parallel computation with reasonably low overhead in runtime. We provide six test cases — implemented both in Scheme and CUDA-C — in order to evaluate the performance of our implementation in Gambit and to show 0–35% overhead in the usual case. Our work enables Scheme programmers to develop expressive programs that control and issue data-parallel programs running on GPUs, while also reducing hands-on memory management.
|
7 |
Generación de Skeletons a Partir de Mallas de Superficie / Generation of skeletons from surface meshes. Alcayaga Gallardo, Liliana Francisca. January 2012 (has links)
The modeling and analysis of highly branched 3D microscopic biological structures is a challenging task due to their high complexity. One method for approaching this task is the generation of skeletons as reduced-dimension models. The skeleton of a 3D object is a 1D representation of it, approximately equidistant from the boundaries, that seeks to preserve its geometric and topological properties. Skeletons applied to complex biological structures of interest must satisfy the following properties: they must be one-dimensional, invariant under isometric transformations, approximately centered and homotopic.
The goal of this degree project was to implement a correct and robust skeletonization algorithm, based on a method described for 3D triangular surface meshes.
The implemented algorithm comprises three stages: contraction of the geometry; removal of all mesh faces, which transforms the mesh into a one-dimensional structure; and centering, to correct the resulting skeleton. To guarantee the robustness of the algorithm, a preprocessing routine was added to verify that the input mesh is valid; unit tests were also written to validate the different possible scenarios in the three stages of the method.
The implementation used the object-oriented programming paradigm and design patterns to ease extension and modification of the software. The implementation of the algorithm and of the proposed improvements was evaluated using (i) simple meshes of fantasy figures presented in previous work on skeletonization, and (ii) complex meshes of biological structures observed by confocal microscopy. In the latter case, expert biologists were consulted to evaluate the results.
When this method is applied to biological meshes of different sizes and complexity, correct skeletons satisfying the required properties are obtained. In particular, surface meshes of the following were used: the endoplasmic reticulum network of COS-7 cultured cells, the body and soma of neurons belonging to the parapineal organ of the zebrafish, and clusters of plasma membranes of zebrafish neural crest cells. When applied to the fantasy meshes, approximately centered skeletons are obtained in almost all cases; in one case, a region of the skeleton was located outside the original figure. Moreover, for this type of mesh, a symmetric input mesh does not imply that the resulting skeleton is symmetric. In all cases the implementation meets the robustness requirements in the three stages of the algorithm.
Finally, future extensions of this work cover topics such as parallelization of the application, and improvements to guarantee that the skeleton always lies inside the mesh and is approximately centered with respect to it.
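The contraction stage can be illustrated in miniature with a 2D polyline analogue, an assumption-laden sketch far simpler than the surface-mesh method implemented in the work: one Laplacian pass moves each vertex a fraction `lam` toward the average of its neighbours, shrinking the geometry toward a centered curve.

```python
def contract_once(points, neighbors, lam=0.5):
    """One Laplacian contraction pass: move each vertex a fraction `lam`
    toward the average position of its neighbours."""
    new = []
    for i, p in enumerate(points):
        nb = neighbors[i]
        cx = sum(points[j][0] for j in nb) / len(nb)
        cy = sum(points[j][1] for j in nb) / len(nb)
        new.append((p[0] + lam * (cx - p[0]), p[1] + lam * (cy - p[1])))
    return new

def spread(points):
    """Mean squared distance to the centroid: a simple size measure."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / len(points)

# Closed square contour: each vertex is linked to its two ring neighbours.
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
ring = {i: [(i - 1) % 4, (i + 1) % 4] for i in range(4)}
for _ in range(5):
    pts = contract_once(pts, ring)  # the square shrinks toward its centroid
```

On a real triangle mesh the same idea is typically applied with area- or cotangent-weighted Laplacians plus attraction constraints, so the surface collapses toward a curve rather than a point; that curve is what the face-removal and centering stages then clean up.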
|
8 |
Esqueletos euclidianos discretos em resolução aumentada / Discrete Euclidean skeletons in increased resolution. Saude, Andre Vital. 15 December 2006 (has links)
Advisors: Roberto de Alencar Lotufo, Michel Couprie / Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação
Abstract: The extraction of Euclidean skeletons is a subject of great importance in the domain of image processing, and it has been discussed by the scientific community for more than 20 years. Today it is a consensus that Euclidean skeletons should present the following characteristics: thin, centered, homotopic and reversible, i.e., sufficient for the reconstruction of the original object. In this work, we introduce the Exact Euclidean Medial Axis in Higher Resolution (HMA), with the objective of obtaining a medial axis which is thinner than the one obtained by the classical medial axis definition. By combining the HMA with an efficient parallel homotopic thinning algorithm, we propose a Euclidean skeleton which is centered, homotopic, reversible and thinner than similar existing skeletons in the literature.
The proposed skeleton has the additional particularity of being unique and independent of arbitrary choices. Algorithms and proofs are given, as well as applicative examples of the proposed skeletons in real images, showing the advantages of the proposal. The text also includes an overview of Euclidean distance transform algorithms, medial axis extraction and homotopic skeletons / Doctorate / Computer Engineering / Doctor in Electrical Engineering
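One classical ingredient named above, homotopic (topology-preserving) thinning, can be sketched on a binary 2D grid. The code below implements the well-known Zhang-Suen thinning as an illustration only; it is neither the parallel homotopic algorithm nor the HMA of the thesis.

```python
def zhang_suen(grid):
    """Topology-preserving thinning of a binary image (list of 0/1 rows).
    Returns a new grid; the input is left unchanged."""
    g = [row[:] for row in grid]
    h, w = len(g), len(g[0])

    def neighbours(y, x):
        # clockwise from the north neighbour: N, NE, E, SE, S, SW, W, NW
        return [g[y-1][x], g[y-1][x+1], g[y][x+1], g[y+1][x+1],
                g[y+1][x], g[y+1][x-1], g[y][x-1], g[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_zero = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if not g[y][x]:
                        continue
                    P = neighbours(y, x)
                    B = sum(P)  # number of object neighbours
                    # A counts 0 -> 1 transitions around the pixel
                    A = sum(P[i] == 0 and P[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:
                        cond = P[0]*P[2]*P[4] == 0 and P[2]*P[4]*P[6] == 0
                    else:
                        cond = P[0]*P[2]*P[6] == 0 and P[0]*P[4]*P[6] == 0
                    if A == 1 and 2 <= B <= 6 and cond:
                        to_zero.append((y, x))
            if to_zero:
                changed = True
            for y, x in to_zero:
                g[y][x] = 0
    return g

# A solid 5x7 block (with a zero border) thins down to a thin curve.
grid = [[0]*9] + [[0] + [1]*7 + [0] for _ in range(5)] + [[0]*9]
thin = zhang_suen(grid)
```

Thinning alone yields a thin, roughly centered, homotopic curve; it is the combination with a medial axis such as the HMA that adds reversibility, i.e., the ability to reconstruct the original object.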
|
9 |
Disconnected Skeletons For Shape Recognition. Aslan, Cagri. 01 June 2005 (has links)
This study presents a new shape representation scheme based on disconnected symmetry axes, along with a matching framework, to address the problem of generic shape recognition. The main idea is to define the relative spatial arrangement of local symmetry axes in a shape-centered coordinate frame. The resulting descriptions are invariant to scale, rotation, small changes in viewpoint and articulations. Symmetry points are extracted from a surface whose level curves roughly mimic motion by curvature. By increasing the amount of smoothing on the evolving curve, only those symmetry axes that correspond to the most prominent parts of a shape are extracted. The representation does not suffer from the common instability problems of traditional connected skeletons, and it captures the perceptual properties of shapes well. Therefore, finding the similarities and differences among shapes becomes easier. The matching process is able to find the correct correspondence of parts under various visual transformations. Highly successful classification results are obtained on a moderate-sized 2D shape database.
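The shape-centered coordinate frame can be sketched as a simple normalization. This toy version handles translation and uniform scale only (the representation in the thesis is additionally invariant to rotation, viewpoint changes and articulation): axis points are expressed relative to the shape centroid and divided by the RMS radius, so the same configuration at another position and size yields the same description.

```python
import math

def shape_centered(points):
    """Express 2D points in a translation- and scale-invariant frame:
    subtract the centroid and divide by the RMS radius."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    r = math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)
    return [((x - cx) / r, (y - cy) / r) for x, y in points]

axes = [(0, 0), (2, 0), (1, 3)]                     # toy symmetry-axis endpoints
moved = [(3 * x + 10, 3 * y - 4) for x, y in axes]  # same shape, scaled and shifted
same = all(math.dist(p, q) < 1e-9
           for p, q in zip(shape_centered(axes), shape_centered(moved)))
print(same)  # True
```

Because the descriptions agree after normalization, comparing two shapes reduces to comparing the arrangements of their axis points in this common frame.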
|
10 |
Human skeletal remains from Kimberley: an assessment of health in a 19th century mining community. Van der Merwe, Alie Emily. 10 July 2007
In April 2003 the Sol Plaatjie Municipality disturbed several unmarked graves while digging a storm-water trench next to what is today known as the Gladstone Cemetery in Kimberley, South Africa. The graves are believed to date to between 1897 and 1900. All remains were excavated and housed at the McGregor Museum in Kimberley, where they were investigated. The purpose of this study was to analyze and interpret the health status and diseases present within this sample, and to determine whether bone lesions caused by ossified haematomas and treponemal infection can be diagnosed through histological investigation. Standard anthropometric techniques were used to determine the age and sex of the individuals. All bones were assessed for signs of trauma and pathology, and histological bone samples were prepared according to a method described by Maat (2002). A total of 107 individuals were investigated, comprising 86 males and 15 females. The remains were mostly those of young persons, with the majority being younger than 30 years of age. A wealth of pathology was observed, with skeletal lesions indicating advanced treponemal disease, scurvy, non-specific osteomyelitis, several amputations, cranial fractures and osteoarthritis. A high incidence of dental caries, antemortem tooth loss and periodontal disease was also noted. The remains studied were those of migrant workers of low socio-economic status, mainly consuming a diet consisting of refined carbohydrates and lacking vitamin C. A high prevalence of degenerative changes and cranial fractures suggested participation in regular strenuous physical activity and a high incidence of interpersonal violence. The high incidence of infectious disease was ascribed to the poor living conditions as well as limited medical care. Surgical procedures were conducted regularly, as could be extrapolated from the high incidence of amputations.
It was also concluded that, at the histological level, a distinction could be made between bone reactions resulting from haemorrhage and lesions caused by an infectious condition. Three stages of ossified haematoma development and remodeling were described. It is hoped that this study gave some recognition to those so unceremoniously dumped in these pauper graves. / Dissertation (MSc (Anatomy))--University of Pretoria, 2007. / Anatomy / unrestricted
|