181

Využití powerpointových prezentací v hodinách HV / The Use of PowerPoint Presentations in Music Education

Moravcová, Jolana January 2014
This master's thesis examines the use of PowerPoint presentations in music teaching from the perspective of teachers at lower secondary schools in the Czech Republic. The theoretical part deals with the introduction of ICT into schools, the importance and use of interdisciplinary relations in teaching with ICT, and visualisation as one of the successful modern trends in pedagogy linked with new technologies. The practical part presents examples of PowerPoint presentations built from curriculum material selected from secondary school textbooks, together with proposals for their practical use in music lessons. A DVD with the created PowerPoint presentations accompanies the thesis. Key words: PowerPoint, presentation, ICT, visualisation, music class, interdisciplinary communication
182

Technology-assisted healthcare : exploring the use of mobile 3D visualisation technology to augment home-based fall prevention assessments

Hamm, Julian J. January 2018
Falls often cause devastating injuries which precipitate hospital and long-term care admission and result in an increased burden on health care services. Fall prevention interventions are used to overcome fall risk factors in an ageing population. There is an increasing need for technology-assisted interventions to reduce health care costs, whilst also lessening the burden that an ageing population increasingly places on health care services. Research effort has been spent on reducing intrinsic fall risk factors (i.e. functional ability deficits and balance impairments) in the older adult population through technology-assisted interventions, but relatively little has been expended on extrinsic risk factors (i.e. unsuitable environmental conditions and lack of assistive equipment use), despite the drive to move healthcare out of the clinical setting and into patients' homes. In the field of occupational therapy, the extrinsic fall-risk assessment process (EFAP) is a prominent preventive intervention used to promote independent living and alleviate fall risk factors via the provision of assistive equipment prescribed for use by patients in their home environment. Currently, paper-based forms with measurement guidance presented as 2D diagrams are used in the EFAP. These indicate the precise points and dimensions on a furniture item that must be measured as part of an assessment for equipment. However, this process involves challenges, such as inappropriate equipment being prescribed because inaccurate measurements are taken and recorded when the measurement guidance is misinterpreted. This is largely due to the poor visual representation of guidance provided by existing paper-based forms, and it results in high levels of equipment abandonment by patients. Consequently, there is a need to overcome these challenges by addressing the limitations of the paper-based approach to visualising measurement guidance for equipment. To this end, this thesis proposes the use of 3D visualisation technology in the form of a novel mobile 3D application (Guidetomeasure) to present measurement guidance in a more readily perceived form and support stakeholders with equipment prescriptions. To ensure that the artefact is a viable improvement over its 2D predecessor, it was designed, developed and empirically evaluated with both patients and clinicians through five user-centred design and experimental studies. A mixed-method analysis was undertaken to establish the design, effectiveness, efficiency and usability of the proposed artefact, compared with conventional approaches used for data collection and equipment prescription. The findings show that both patients and clinicians regard 3D visualisation as a promising basis for an alternative tool with the functionality to overcome existing issues in the EFAP. Overall, this research contributes a software artefact (primary contribution) that significantly improves practice, together with a conceptual contribution to the research domain (secondary), resulting in implications and recommendations for wider healthcare provision.
184

Enhanced biopsy and regional anaesthesia through ultrasound actuation of a standard needle

Sadiq, Muhammad January 2013
There is an urgent and unmet clinical need to improve accuracy and safety during needle-based interventional procedures, including regional anaesthesia and cancer biopsy. In ultrasound-guided percutaneous needle procedures there is a universal problem of imaging the needle, particularly the tip, especially in dense tissues and at steep insertion angles. Poor visualization of the needle tip can have serious consequences for patients, including nerve damage and internal bleeding in regional anaesthesia and, in the case of biopsy, mis-sampling, resulting in misdiagnosis or the need for repeat biopsy. The aim of the work was to design and develop an ergonomic ultrasound device to actuate standard, unmodified needles such that the visibility of the needle is enhanced when observed under the colour Doppler mode of ultrasound imaging. This makes needle procedures more efficient through accurate needle placement while reducing the overall procedure duration. The research reported in this thesis provides an insight into the new breed of piezoelectric materials. A methodology is proposed and implemented to characterize the new piezocrystals under ambient and extreme practical conditions. For the first time, the IEEE standard method (1987) was applied to an investigation of this type with binary (PMN-PT) and ternary (PIN-PMN-PT) compositions of piezocrystals. Using the existing data and the data obtained through characterization, finite element analysis (FEA) was carried out to design the ultrasound device adequately. Various configurations of the device were modelled and fabricated, using both piezoceramic and piezocrystal materials, in order to assess the dependency of the device's performance on the configuration and the type of piezoelectric material used. In order to prove the design concept and to measure the benefits of the device, pre-clinical trials were carried out on a range of specimens, including soft-embalmed Thiel cadavers. Furthermore, an ultrasound planar cutting tool with various configurations was also designed and developed as an alternative to the existing cumbersome ultrasonic scalpels. These configurations were based on new piezocrystals, including the Mn-doped ternary (Mn:PIN-PMN-PT) material. It is concluded that the needle-actuating device can significantly enhance the visibility of standard needles and additionally helps to reduce the penetration force. However, in order to make it clinically viable, further work is required to make it compliant with the medical environment. Although the piezocrystals tested under practical conditions offer extraordinary piezoelectric properties, they are vulnerable to extreme temperature and drive conditions. However, it is observed that newer piezocrystals, especially Mn:PIN-PMN-PT, have shown the potential to replace the conventional piezoceramics in high-power and actuator applications. Moreover, the d31-mode planar cutting tool contrasts with the cumbersome design of the mass-spring transducer structure and has the potential to be used in surgical procedures.
185

Management, visualisation & mining of quantitative proteomics data

Ahmad, Yasmeen January 2012
Exponential data growth in life sciences demands cross-discipline work that brings together computing and life sciences in a usable manner that can enhance knowledge and understanding in both fields. High-throughput approaches, advances in instrumentation and the overall complexity of mass spectrometry data have made it impossible for researchers to manually analyse data using existing market tools. By applying a user-centred approach to effectively capture the domain knowledge and experience of biologists, this thesis has bridged the gap between computation and biology through the PepTracker software (http://www.peptracker.com). This software provides a framework for the systematic detection and analysis of proteins that can be correlated with biological properties to expand the functional annotation of the genome. The tools created in this study aim to place analysis capabilities back in the hands of biologists, who are the experts in evaluating their data. Another major advantage of the PepTracker suite is the implementation of a data warehouse, which manages and collates highly annotated experimental data from numerous experiments carried out by many researchers. This repository captures the collective experience of a laboratory, which can be accessed via user-friendly interfaces. Rather than viewing datasets as isolated components, this thesis explores the potential that can be gained from collating datasets in a “super-experiment” ideology, leading to the formation of broad-ranging questions and promoting biology-driven lines of questioning. This has been uniquely implemented by integrating tools and techniques from the field of Business Intelligence with Life Sciences and successfully shown to aid in the analysis of proteomic interaction experiments. Having established a means of documenting a static proteomics snapshot of cells, the proteomics field is progressing towards understanding the extremely complex nature of cell dynamics. PepTracker facilitates this by providing the means to gather and analyse many protein properties to generate new biological insight, as demonstrated by the identification of novel protein isoforms.
186

Aplicação de texturas em visualização científica. / Texture applied in scientific visualisation.

Mendonça, Marcelo de Barros 14 December 2001
The increasing availability of computational resources for computation, simulation and data acquisition allows scientists and engineers to generate enormous datasets, two- or three-dimensional and generally multivariate. The application of Computer Graphics techniques aimed at gaining a better understanding of these data is the object of study of the area known as Scientific Visualization (ViSC). Texturing is a means of varying a surface's properties point by point so that the surface simulates details that are not actually present in its geometry. Texturing can be applied through texture mapping and procedural texture techniques. Three-dimensional vector datasets require complex and computationally expensive techniques to be visualized successfully. For dense vector datasets, visualization becomes even harder, since conventional techniques do not produce an adequate visualization of the data. The technique known as Line Integral Convolution (LIC) produces good results for dense vector datasets at an acceptable computational cost. It uses a white-noise texture and the computation of streamlines as the basis for generating the LIC image. The technique thus produces a two-dimensional image of a selected slice of the dataset by interpolating the noise texture along the vector field to be visualized. This work investigates the LIC technique and its relevance in the context of scientific visualization, culminating in the implementation of the vtkImageLIC class. This class follows object-oriented principles and can therefore be integrated into the VTK visualization library, making it portable and easily extensible.
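The abstract above outlines how LIC works: a white-noise texture is sampled along streamlines of the vector field and averaged to reveal the flow structure. The following is a minimal, illustrative numpy sketch of that idea only — it is not the thesis's vtkImageLIC class; the function name lic_2d, the uniform box kernel and the unit-step Euler streamline tracing are simplifying assumptions made here for brevity.

```python
import numpy as np

def lic_2d(vx, vy, noise, length=20):
    """Minimal Line Integral Convolution sketch: for every pixel, trace a short
    streamline forward and backward through the vector field and average the
    white-noise texture sampled along it (a uniform box convolution kernel)."""
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for direction in (1.0, -1.0):       # forward and backward half of the streamline
                px, py = float(x), float(y)
                for _ in range(length):
                    ix, iy = int(px), int(py)
                    if not (0 <= ix < w and 0 <= iy < h):
                        break                   # left the image
                    total += noise[iy, ix]
                    count += 1
                    u, v = vx[iy, ix], vy[iy, ix]
                    norm = np.hypot(u, v)
                    if norm < 1e-9:
                        break                   # stagnation point
                    px += direction * u / norm  # unit Euler step along the field
                    py += direction * v / norm
            out[y, x] = total / max(count, 1)
    return out

# Example: a rotational vector field convolved with a white-noise texture.
h = w = 128
ys, xs = np.mgrid[0:h, 0:w]
vx, vy = -(ys - h / 2.0), xs - w / 2.0
image = lic_2d(vx, vy, np.random.default_rng(0).random((h, w)))
```

A production implementation (such as the VTK-based one described above) would use sub-pixel streamline integration, a smoother convolution kernel and vectorised execution; the sketch trades all of that for readability.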
187

Knowledge-based scaling for biological models / Généralisation de modèles métaboliques par connaissances

Zhukova, Anna 18 December 2014
Genome-scale metabolic models describe the relationships between thousands of reactions and biochemical molecules, and are used to improve our understanding of an organism's metabolism. They find applications in the pharmaceutical, chemical and bioremediation industries. The complexity of metabolic models hampers many tasks that are important during the process of model inference, such as model comparison, analysis, curation and refinement by human experts. The abundance of detail in large-scale networks can mask errors and important organism-specific adaptations. It is therefore important to find the right levels of abstraction that are comfortable for human experts. These abstract levels should highlight the essential model structure and the divergences from it, such as alternative paths or missing reactions, while hiding inessential details. To address this issue, we defined a knowledge-based generalization that allows the production of higher-level abstract views of metabolic network models. We developed a theoretical method that groups similar metabolites and reactions based on the network structure and on knowledge extracted from metabolite ontologies, and then compresses the network based on this grouping. We implemented our method as a Python library, which is available for download from metamogen.gforge.inria.fr. To validate our method we applied it to 1,286 metabolic models from the Path2Model project, and showed that it helps to detect organism- and domain-specific adaptations, as well as to compare models. Based on discussions with users about how they navigate metabolic networks, we defined a three-level representation of metabolic networks: the full-model level, the generalized level and the compartment level. We combined our model generalization method with the zooming user interface (ZUI) paradigm and developed Mimoza, a user-centric tool for zoomable navigation and knowledge-based exploration of metabolic networks that produces this three-level representation. Mimoza is available both as an on-line tool and for download at mimoza.bordeaux.inria.fr.
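As a rough illustration of the generalization step described above — lifting metabolites onto ontology-derived classes and factoring together the reactions that become indistinguishable after the lifting — the toy Python sketch below groups a few hypothetical reactions. The reaction and metabolite names, the metabolite_class mapping and the generalize helper are invented for illustration; they are not taken from the metamogen or Mimoza code, and the sketch ignores the network-structure criteria that the actual method also uses.

```python
from collections import defaultdict

# Hypothetical toy inputs: reactions as (substrates, products) over metabolite ids,
# and an ontology-derived mapping from each metabolite to an ancestor class.
reactions = {
    "r1": ({"hexadecanoyl-CoA"}, {"tetradecanoyl-CoA"}),
    "r2": ({"tetradecanoyl-CoA"}, {"dodecanoyl-CoA"}),
    "r3": ({"dodecanoyl-CoA"}, {"decanoyl-CoA"}),
}
metabolite_class = {
    "hexadecanoyl-CoA": "fatty acyl-CoA",
    "tetradecanoyl-CoA": "fatty acyl-CoA",
    "dodecanoyl-CoA": "fatty acyl-CoA",
    "decanoyl-CoA": "fatty acyl-CoA",
}

def generalize(reactions, metabolite_class):
    """Lift each reaction onto metabolite classes and factor together the
    reactions that become indistinguishable after the lifting."""
    grouped = defaultdict(list)
    for rid, (subs, prods) in reactions.items():
        key = (frozenset(metabolite_class.get(m, m) for m in subs),
               frozenset(metabolite_class.get(m, m) for m in prods))
        grouped[key].append(rid)
    return grouped

for (subs, prods), members in generalize(reactions, metabolite_class).items():
    # The three chain-shortening steps collapse into one generalized reaction
    # "fatty acyl-CoA -> fatty acyl-CoA" representing the repeated pattern.
    print(sorted(members), ":", set(subs), "->", set(prods))
```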
188

The Mandala dancers : a collaborative inquiry into the experiences of participants in a program of creative meditation : an investigation into a means of celebrating the wonderful in ordinary people

Pearce, Malcolm, University of Western Sydney, Hawkesbury, Faculty of Health, Humanities and Social Ecology, School of Social Ecology January 1994
The thesis is the result of an inquiry into the experiences of a group of people engaged in a Buddhist-inspired creative meditation program, the main practice of which is the recognition and honouring of the spiritual dimensions, the divinity of self and others. The study employed a heuristic process of examining 'inner world' experiences. The inquiry was collaborative in the sense that its findings were not those of one person alone, but were a compilation of the results of interactions within the group. The inquiry was based on the hypothesis that creative meditation can facilitate changes in a person's perception of self and the external world. The principal aim was to explore that possibility and investigate the group members' thoughts and feelings as to the main function, significance and eventual outcome of their practice. The investigation seemed to show that for the core group participants there were changes in self-understanding involving more self-acceptance. Changes in attitudes to relationships of various kinds also took place, and these too seemed to involve the development of a greater degree of acceptance. For some participants, the association of the practice with favourable coincidence was an interesting but inexplicable feature. For some, there was an identification of mindsets which seemed to have a bearing on the quality of the meditation experience and its outcomes. The title of the study refers to the manner in which the meditations were often generated. A mandala, a symbolic picture, was designed by each participant and the features of this were imagined to move, sometimes dancing, through the meditations which followed. The second sub-title refers to an integral feature of the practice, which was an attempt to arouse a sense of the wonderful as a quality of the people who were imagined to appear in the meditations. / Master of Science (Hons) (Social Ecology)
189

Development of ice particle production system for ice jet process

Shanmugam, Dinesh Kumar, dshanmugam@swin.edu.au January 2005
This thesis presents a comprehensive study of the ice particle production process, through experimentation and numerical methods using computational fluid dynamics (CFD), that can be used to produce ice particles with controlled temperature and hardness for use in the ice jet (IJ) process for industrial applications. Analytical and numerical models of the heat exchanger system are developed that can predict the heat, mass and momentum exchange between the cold gas and the water droplets. Further, the feasibility of deploying ice particles produced by the ice jet system in possible cleaning and blasting applications is analysed numerically. Although the use of Abrasive Water Jet (AWJ) technology in cutting, cleaning, machining and surface processing is a very successful industrial process, a considerable amount of secondary particle waste and contamination impingement by abrasive materials has been an important issue in the AWJ process. Some alternative cryogenic jet methods involving vanishing abrasive materials, such as plain liquid nitrogen or carbon dioxide, have been tried for these applications, but they also suffer from certain drawbacks relating to quality, safety, process control and materials handling. The ice jet process involving minute ice particles has received relatively little attention in industrial applications. Some researchers have concentrated on studying the effects of ice jet outlet parameters of the nozzle and focus tube for machining soft and brittle materials. Most of the work in this area is qualitative, and researchers have paid only cursory attention to the temperature of the ice particles and the efficiency of their production. An extensive investigation into the parameters of the ice formation process is required to arrive at a deeper understanding of the entire ice jet process for production applications. Experimental investigations focused on the measurement of ice particle temperature, phase transitions, ice particle diameter, coalescence and hardness testing. The change in ice particle diameter from the inlet conditions to the exit point of the heat exchanger was investigated using the experimental results. These observations were extended to numerical analysis of temperature variations of ice particles at different planes inside the custom-built heat exchanger. The numerical predictions were carried out with the aid of visualization studies and temperature measurement results from experiments. The numerical models were further analysed to find out the behaviour of ice particles in the transportation stage, the mixing chamber of the nozzle and the focus tube. This was done to find out whether the methodology used in this research is feasible and whether it can be used in applications such as cleaning, blasting, drilling and perhaps cutting. The results of the empirical studies show that ice particles of the desired temperature and hardness could be produced successfully with the current novel design of the heat exchanger. At the optimum parameters, ice particles could be produced below −60 °C, with a hardness comparable to gypsum (Mohs hardness of 1.5 to 3). The visualization studies of the process assisted in observation of the phases of ice at various points along the heat exchanger. The results of the numerical analysis were found to agree well with the experiments and were supported by the statistical model assessments. Numerical analyses also show the survival of ice particles at the nozzle exit even with a high-pressure, high-velocity water/air mixture.
190

Animated proportional Venn diagrams: a study into their description, construction and business application

Hingston, Phillip Anthony, Hingston@bigpond.com January 2007
Anecdotal observation of the way in which data visualisation techniques are utilised to present relationships in data to audiences informed the author's view that data visualisation had not evolved to utilise the capabilities of ubiquitous business computer equipment. In an information-rich but attention-poor business environment, a search for a new tool was undertaken to supplement the techniques available to help audiences understand statistical relationships in presentation data. This search resulted in the development of a practical software tool based on animated Venn diagrams (Dvenn) that attempted to exploit the inherent human ability to perceive quantities visually, a faculty described herein as visual numeracy. The exploitation of this faculty is considered here to be a valuable aid for group understanding of business presentation data. The development of the tool was an essential part of the research undertaken, and the resulting software forms a significant portion of this practice-based research. The aim of the software development was to produce a readily accessible tool that could be utilised in a non-specialist business environment to better facilitate an honest shared meaning of numerical data between a presenter and their audience. The development of the tool progressed through a number of iterations, and the software that accompanies this work is an important component that needs to be viewed in conjunction with the text. The final version was tested with undergraduate university students, against the mature yardstick of scatter-plots, in an attempt to validate the efficacy of the data visualisation technique. Interestingly, the correlations presented by scatter-plots were not as readily identified as might have been assumed; however, the results of the Dvenn tests did not support the technique for widespread adoption. Nevertheless, further research into the best method of harnessing visual numeracy would seem to be justified.
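For readers unfamiliar with area-proportional ('proportional') Venn diagrams, the sketch below shows the geometric core of the two-set case: circle areas are made proportional to the set sizes, and the centre distance is solved numerically so that the overlap area matches the intersection size. This is a generic illustration under those assumptions, not the Dvenn tool described above; the function names lens_area and proportional_venn are chosen here for clarity.

```python
import math

def lens_area(d, r1, r2):
    """Area of the intersection of two circles with radii r1, r2 at centre distance d."""
    if d >= r1 + r2:
        return 0.0                          # disjoint circles
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2   # smaller circle fully contained
    a1 = r1 ** 2 * math.acos((d ** 2 + r1 ** 2 - r2 ** 2) / (2 * d * r1))
    a2 = r2 ** 2 * math.acos((d ** 2 + r2 ** 2 - r1 ** 2) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2) * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def proportional_venn(size_a, size_b, size_ab):
    """Return (r_a, r_b, d): circle radii proportional to the set sizes and the centre
    distance whose overlap area equals size_ab (assumes size_ab <= min(size_a, size_b))."""
    r_a, r_b = math.sqrt(size_a / math.pi), math.sqrt(size_b / math.pi)
    lo, hi = abs(r_a - r_b), r_a + r_b      # overlap shrinks monotonically as d grows
    for _ in range(100):                    # bisection on the centre distance
        mid = (lo + hi) / 2
        if lens_area(mid, r_a, r_b) > size_ab:
            lo = mid                        # too much overlap: push the circles apart
        else:
            hi = mid
    return r_a, r_b, (lo + hi) / 2

# Example: |A| = 120, |B| = 80, |A ∩ B| = 30
print(proportional_venn(120, 80, 30))
```

Drawing the two circles with these radii and centre distance gives a diagram whose regions are faithful to the underlying quantities, which is the property the visual-numeracy argument above relies on.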
