21

Σχεδίαση και ανάπτυξη εκπαιδευτικού λογισμικού με θέμα "Εξερευνώντας την ενέργεια" / Design and development of educational software on the topic "Exploring Energy"

Σίψα, Γρηγορία 26 August 2008 (has links)
This diploma thesis concerns the development of an educational package, in the form of a digital disc, on the topic of Energy, accompanied by educational material (teaching scenarios and activities). It is a useful supplementary tool for teachers of the last two grades of primary school and conforms to the curriculum of those grades. Its purpose is to facilitate the teaching process and to support the understanding and handling of the difficult concepts of energy through an engaging, interactive environment. It covers the concepts of energy, its forms, its sources, its conversion, and its degradation. It also presents energy as an exhaustible resource and aims to raise pupils' awareness of the need to conserve it. Accordingly, it presents the basic knowledge about energy that pupils of the last two primary-school grades should have, as well as simple practices that children can apply in their everyday lives to save energy, since one of its specific objectives is to help children form a sound attitude toward the issue. The information is presented in the form of text, narration, photographs, graphics, sound, and animation. The goal of the software and the activities is to offer teachers a complete proposal for integrating new technologies into their lessons, providing an educational-scenario framework with a variety of thematic content and presentation media, and to cover, as far as possible, the needs of pupil and teacher under varying conditions of class, level, and lesson duration.
22

Cognition driven deformation modelling

Janke, Andrew Lindsay Unknown Date (has links)
This thesis describes the development of a model of cerebral atrophic change associated with neurodegeneration. Neurodegenerative diseases such as Alzheimer's dementia present a significant health problem within the elderly population. Effective treatment relies upon the early detection of anatomic change, and the subsequent differential diagnosis of the disorder from other closely related neurological conditions. Importantly, this also includes the investigation of the relationship between atrophic change and cognitive function. In unison with the growth in neuroimaging technology, myriad methodologies have been developed since the first quantitative measures of atrophic change were deduced via manual tracing. Subsequently, automated region of interest analysis, segmentation, voxel-based morphometry and non-linear registration have all been used to investigate atrophy. These methods commonly report findings of ventricular enlargement and temporal lobe change in AD and other dementias. Whilst these results are accurate indicators of atrophy, they are largely non-specific in their diagnostic utility. In addition, the aforementioned methods have been employed to discern change observed at discrete intervals during a disease process. In order to gain a greater understanding of the temporal characteristics of changes that occur as a result of atrophy, a deformation modelling method that allows the continuous tracking of these changes in a cohort of AD patients and elderly control subjects is presented in this thesis. Deformation modelling involves non-linear registration of images to investigate the change that is apparent between two or more images. The non-linear registration results are analysed and presented via three metrics: local volume loss (atrophy); volume (CSF) increase; and translation (interpreted as representing collapse of cortical structures). Changes observed in the analyses in this thesis are consistent with results from neuro-anatomical studies of AD. Results using the more traditional methods of analysis are presented for comparative purposes.
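The "local volume loss" and "volume increase" metrics mentioned in the abstract above are commonly derived from the Jacobian determinant of the estimated deformation field. The following Python sketch is an illustration only, not the author's implementation: the synthetic displacement field, the grid spacing, and the 1% contraction example are assumptions introduced here to show how per-voxel volume change can be read off a displacement field.

import numpy as np

def local_volume_change(disp, spacing=(1.0, 1.0, 1.0)):
    """Per-voxel volume change for the mapping x -> x + u(x).

    disp has shape (3, X, Y, Z): the displacement of each voxel along
    x, y, z.  The Jacobian determinant is > 1 where tissue expands
    (e.g. ventricular/CSF enlargement) and < 1 where it contracts
    (local atrophy)."""
    grads = np.empty((3, 3) + disp.shape[1:])
    for i in range(3):        # displacement component
        for j in range(3):    # direction of differentiation
            grads[i, j] = np.gradient(disp[i], spacing[j], axis=j)
    jac = grads + np.eye(3).reshape(3, 3, 1, 1, 1)   # identity + displacement gradient
    return np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))

# Synthetic check: a uniform 1% contraction along each axis.
shape = (16, 16, 16)
coords = np.stack(np.meshgrid(*[np.arange(s, dtype=float) for s in shape], indexing="ij"))
disp = -0.01 * coords            # u(x) = -0.01 x, so det J = 0.99**3
print(local_volume_change(disp).mean())   # ~0.970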
24

Using Auditory Feedback to Teach Dance Skills to Adults with Intellectual Disabilities

Abreu, Aracely 29 October 2015 (has links)
The purpose of this study was to evaluate the use of auditory feedback for teaching individuals with intellectual disabilities the "Mississippi Cha Cha Slide." Participants were six males aged 35 to 61. During baseline, line dance skills were low for all participants. During the auditory feedback intervention, the trainer used a clicker to reinforce dance steps and forward chaining to link the movements into a sequence. Once auditory feedback was implemented, line dance skills increased substantially for all participants. Generalization assessments for four of the participants resulted in performance levels similar to baseline, demonstrating the need for future training with music. Follow-up data collected for all four participants showed that dance skills were maintained.
25

An investigation into the consistency and usability of selected minisatellite detecting software packages

Masombuka, Koos Themba January 2013 (has links)
A tandem repeat is a sequence of adjacent repetitions of a nucleotide pattern signature, called its motif, in a DNA sequence. The repetitions may be either exact or approximate copies of the motif. A minisatellite is a tandem repeat whose motif is of moderate length. One approach to searching for minisatellites assumes prior knowledge about the motif, limiting the search to specified motifs. An alternative approach tries to identify signatures autonomously from within a DNA sequence. Several different algorithms that use this approach have been developed. Since they do not use pre-specified motifs, and since a degree of approximation is tolerated, there may be ambiguity about where minisatellites start and end in a given DNA sequence. Various experiments were conducted on four well-known software packages to investigate this conjecture. The software packages were executed on the same data and their respective outputs were compared. The study found that the selected algorithms did not report the same output. The lack of precise definitions of the properties of such patterns may explain these differences: the definitions differ in the nature and extent of the approximation to be tolerated in the patterns during the search. This problem could potentially be overcome by agreeing on how to specify acceptable approximations when searching for minisatellites. Some of these packages are implemented as Academic/Research Software (ARS). Noting that ARS has a reputation for being difficult to use, this study also investigated the usability of these ARS implementations, relying on usability evaluation methods from the literature. Potential problems likely to affect the general usability of the systems were identified, relating, inter alia, to visibility, consistency and efficiency of use. Furthermore, usability guidelines from the literature were followed to modify the user interface of one of the implementations. A sample of users evaluated the before and after versions of this user interface. Their feedback suggests that the usability guidelines were indeed effective in enhancing the user interface. / Dissertation (MSc)--University of Pretoria, 2013. / gm2014 / Computer Science / unrestricted
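To make the ambiguity concrete, the short Python sketch below reports only exact tandem repeats; it is a hypothetical toy detector, not one of the four packages compared in the dissertation. Real minisatellite finders differ precisely in how they relax this exactness requirement, which is why their reported start and end positions diverge; note that even this toy reports nested, overlapping hits for the same repeat region.

def exact_tandem_repeats(seq, min_motif=10, max_motif=60, min_copies=2):
    """Report exact tandem repeats as (start, motif, copies) tuples.

    Minisatellite motifs are of moderate length, so only motif sizes in
    [min_motif, max_motif] are considered.  Approximate copies are NOT
    tolerated here -- the surveyed tools disagree exactly on that point."""
    hits = []
    n = len(seq)
    for start in range(n):
        for m in range(min_motif, min(max_motif, (n - start) // min_copies) + 1):
            motif = seq[start:start + m]
            copies = 1
            while seq[start + copies * m:start + (copies + 1) * m] == motif:
                copies += 1
            if copies >= min_copies:
                hits.append((start, motif, copies))
    return hits

dna = "ACGT" * 3 + "GATTACAGATC" * 3 + "TTAGC"
for start, motif, copies in exact_tandem_repeats(dna):
    print(start, motif, copies)    # (12, 'GATTACAGATC', 3) plus a nested hit at 23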
26

Investigating the Reproducibility of NPM packages

Goswami, Pronnoy 19 May 2020 (has links)
The meteoric rise in the popularity of JavaScript and its large developer community have led to the emergence of a large ecosystem of third-party packages distributed via the Node Package Manager (NPM) repository, which contains over one million published packages and witnesses a billion daily downloads. Most developers download these pre-compiled published packages from the NPM repository instead of building them from the available source code. Unfortunately, recent articles have revealed repackaging attacks on NPM packages. To achieve such attacks the attackers primarily follow three steps: (1) download the source code of a highly depended-upon NPM package, (2) inject malicious code, and (3) publish the modified package either as a misnamed package (i.e., a typo-squatting attack) or as the official package on the NPM repository using compromised maintainer credentials. These attacks highlight the need to verify the reproducibility of NPM packages. Reproducible Build is a concept that allows the verification of build artifacts for pre-compiled packages by re-building the packages using the same build environment configuration documented by the package maintainers. This motivates us to conduct an empirical study (1) to examine the reproducibility of NPM packages, (2) to assess the influence of any non-reproducible packages, and (3) to explore the reasons for non-reproducibility. First, we downloaded all versions/releases of the 226 most-depended-upon NPM packages and built each version from the source code available on GitHub. Second, we applied diffoscope, a differencing tool, to compare the versions we built against the versions downloaded from the NPM repository. Finally, we systematically investigated the reported differences. At least one version of 65 packages was found to be non-reproducible. Moreover, these non-reproducible packages are downloaded millions of times per week, which could impact a large number of users. Based on our manual inspection and static analysis, most reported differences were semantically equivalent but syntactically different. Such differences result from non-deterministic factors in the build process. We also infer that semantic differences are introduced by shortcomings in JavaScript uglifiers. Our research reveals the challenges of verifying the reproducibility of NPM packages with existing tools, identifies points of failure through case studies, and sheds light on future directions for developing better verification tools. / Master of Science / Software packages are distributed as pre-compiled binaries to facilitate software development. There are package repositories for various programming languages, such as NPM (JavaScript), pip (Python), and Maven (Java). Developers install these pre-compiled packages in their projects to implement certain functionality. Additionally, these package repositories allow developers to publish new packages, helping the developer community reduce delivery time and enhance the quality of software products. Unfortunately, recent articles have revealed an increasing number of attacks on package repositories. Moreover, developers trust pre-compiled binaries, which may contain malicious code. To address this challenge, we conducted an empirical investigation to analyze the reproducibility of NPM packages for the JavaScript ecosystem.
Reproducible Builds is a concept that allows any individual to verify build artifacts by replicating the build process of software packages. For instance, if developers could verify that the build artifacts of the pre-compiled software packages available in the NPM repository are identical to the ones generated when they individually build that specific package, they could detect and mitigate vulnerabilities in those packages. The build process is usually described in configuration files such as package.json and DOCKERFILE. We chose the NPM registry for our study for three primary reasons: (1) it is the largest package repository, (2) JavaScript is the most widely used programming language, and (3) no prior dataset or investigation of this kind had been published. We took a two-step approach in our study: (1) dataset collection, and (2) source-code differencing for each pair of software package versions. For the dataset collection phase, we downloaded all available releases/versions of 226 popularly used NPM packages, and for the code-differencing phase we used an off-the-shelf tool called diffoscope. We revealed some interesting findings. First, at least one version of 65 packages was found to be non-reproducible, and these packages have millions of downloads per week. Second, we found 50 package-versions to have divergent program semantics, which highlights potential vulnerabilities in the source code and improper build practices. Third, we found that the uglification of JavaScript code introduces non-determinism into the build process. Our research sheds light on the challenges of verifying the reproducibility of NPM packages with current state-of-the-art tools and the need to develop better verification tools in the future. To conclude, we believe that our work is a step towards realizing the reproducibility of NPM packages and making the community aware of the implications of non-reproducible build artifacts.
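A minimal sketch of the kind of reproducibility check described above is given below. It assumes npm, git and diffoscope are installed and on PATH; the package name, version, tag-naming scheme and repository URL are placeholders, and this is not the authors' actual pipeline (which also handled packages with dedicated build steps).

import pathlib
import subprocess
import tempfile

def check_reproducibility(pkg, version, repo_url):
    """Rebuild an NPM package from source and diff it against the
    tarball published on the registry."""
    work = pathlib.Path(tempfile.mkdtemp())

    # 1. Fetch the pre-compiled package as published on the NPM registry.
    subprocess.run(["npm", "pack", f"{pkg}@{version}"], cwd=work, check=True)
    published = next(work.glob("*.tgz"))

    # 2. Rebuild the same version from its source repository
    #    (assumes releases are tagged v<version>; adjust as needed).
    src = work / "src"
    subprocess.run(["git", "clone", "--depth", "1", "--branch", f"v{version}",
                    repo_url, str(src)], check=True)
    subprocess.run(["npm", "install"], cwd=src, check=True)
    subprocess.run(["npm", "pack"], cwd=src, check=True)
    rebuilt = next(src.glob("*.tgz"))

    # 3. diffoscope exits non-zero when the two artifacts differ.
    result = subprocess.run(["diffoscope", str(published), str(rebuilt)])
    return result.returncode == 0

# Hypothetical usage:
# check_reproducibility("left-pad", "1.3.0", "https://github.com/left-pad/left-pad.git")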
27

Fast Relabeling of Deformable Delaunay Tetrahedral Meshes Using a Compact Uniform Grid

Frogley, David C. 28 July 2011 (has links) (PDF)
We address the problem of fast relabeling of deformable Delaunay tetrahedral meshes using a compact uniform grid, with CPU parallelization through OpenMP. This problem is important in visualizing the simulation of deformable objects and arises in scientific visualization, games, computer vision, and motion picture production. Many existing software tools and APIs have the ability to manipulate 3D virtual objects. Prior mesh-based representations either allow topology changes or are fast. We aim for both. Specifically, we improve the efficiency of the relabeling step in the Delaunay deformable mesh invented by Pons and Boissonnat and improved by Tychonievich and Jones. The relabeling step assigns material types to deformed meshes and accounts for 70% of the computation time of Tychonievich and Jones' algorithm. We have designed a deformable mesh algorithm using a Delaunay triangulation and a compact uniform grid with CPU parallelization to obtain greater speed than other methods that support topology changes. On average, over all our experiments and with various 3D objects, the serial implementation of the relabeling step of our work reports a speedup of 2.145 over the previous fastest method, including one outlier whose speedup was 3.934. When running in parallel on 4 cores, on average the relabeling step of our work achieves a speedup of 3.979, with an outlier at 7.63. The average speedup of our parallel relabeling step over our own serial relabeling step is 1.841. Simulation results show that the resulting mesh supports topology changes.
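The compact uniform grid at the heart of the relabeling step can be illustrated with a serial Python sketch; the thesis implementation is parallelized with OpenMP and operates on tetrahedra, whereas the seed points, cell size and two-material setup below are assumptions introduced purely for illustration. Labelled sample points are binned into cells so that a relabeling query inspects only its own cell and the adjacent ones instead of scanning the whole mesh.

import numpy as np

class UniformGrid:
    """Compact uniform grid over 3-D points for fast label lookups."""

    def __init__(self, points, labels, cell_size):
        self.cell = float(cell_size)
        self.origin = points.min(axis=0)
        self.points, self.labels = points, labels
        self.bins = {}
        for idx, p in enumerate(points):
            self.bins.setdefault(self._key(p), []).append(idx)

    def _key(self, p):
        return tuple(((p - self.origin) // self.cell).astype(int))

    def relabel(self, query):
        """Label of the nearest binned point, searching only the query's
        cell and its 26 neighbours (brute-force fallback if all are empty)."""
        kx, ky, kz = self._key(query)
        candidates = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    candidates += self.bins.get((kx + dx, ky + dy, kz + dz), [])
        if not candidates:
            candidates = list(range(len(self.points)))
        d2 = ((self.points[candidates] - query) ** 2).sum(axis=1)
        return self.labels[candidates[int(np.argmin(d2))]]

# Hypothetical usage: relabel a deformed tetrahedron centroid from seed points.
rng = np.random.default_rng(0)
seeds = rng.random((1000, 3))
seed_labels = (seeds[:, 0] > 0.5).astype(int)       # two "materials"
grid = UniformGrid(seeds, seed_labels, cell_size=0.1)
print(grid.relabel(np.array([0.9, 0.2, 0.2])))      # almost certainly 1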
28

Analýza vnímání neúplných cen zájezdů v České Republice / Analysis of the perception of incomplete prices of travel packages in the Czech Republic

Valová, Lucie January 2008 (has links)
The thesis deals with the problem of price quoting for travel packages in the Czech Republic. The theoretical part covers the process of packaging, various approaches to pricing (together with an illustrative calculation), and catalogue pricing strategy. It also shows the difference between a company's and a customer's perception of price and discusses the psychological effect of prices on the final consumer. An important part of this thesis is an analysis of the current practice of price quoting for travel packages in the Czech Republic compared with the situation abroad. Individual chapters analyse the legislation governing packaging and the price quoting of travel packages, the practice of Czech and foreign travel agencies, and the actions taken by the institutions concerned to address this problem. The whole thesis is supplemented with a detailed analysis of data obtained from a questionnaire.
29

Modeling, design, fabrication and characterization of power delivery networks and resonance suppression in double-sided 3-D glass interposer packages

Kumar, Gokul 07 January 2016 (has links)
Effective power delivery in double-sided 3-D glass interposer packages was proposed, investigated, and demonstrated towards achieving high logic-to-memory bandwidth. Such 3-D interposers offer a simpler alternative to direct 3-D stacking by providing low-loss, wide-I/O channels between the logic device on one side of the ultra-thin glass interposer and the memory stack on the other side, eliminating the need for complex TSVs in the logic die. A simplified PDN design approach with power-ground planes was proposed to overcome resonance challenges arising from (a) added parasitic inductance in the lateral power delivery path from the printed wiring board (PWB), due to die placement on the bottom side of the interposer, and (b) the low-loss property of the glass substrate. Based on this approach, this dissertation developed three key suppression solutions using (a) the 3-D interposer package configuration, (b) the selection of embedded and SMT-based decoupling capacitors, and (c) coaxial power-ground planes with TPVs. The self-impedance of the 3-D glass interposer PDN was simulated using electromagnetic solvers, including printed-wiring-board (PWB) and chip-level models. Two-metal and four-metal-layer test vehicles were fabricated on 30-µm and 100-µm thick glass substrates using a panel-based double-sided fabrication process, for potentially lower cost and improved electrical performance. The PDN test structures were characterized up to 20 GHz to provide measured verification of (a) the 3-D glass interposer power delivery network and (b) resonance suppression. The data and analysis presented in this dissertation show that the objectives of this research were met, leading to the first demonstration of effective PDN design in ultra-thin (30-100 µm), 3-D double-sided glass BGA packages by suppressing PDN noise from mode resonances.
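Two textbook relations behind the PDN design problem described above can be sketched briefly: the target impedance that the power delivery network must stay under, and the self-resonant frequency above which a decoupling capacitor's parasitic inductance takes over. The Python snippet below uses placeholder numbers, not values from this dissertation.

import math

def target_impedance(vdd, ripple_fraction, transient_current):
    """PDN target impedance: keep the supply within its ripple budget
    at the worst-case transient current draw."""
    return vdd * ripple_fraction / transient_current

def self_resonant_frequency(c_farads, esl_henries):
    """Series resonance of a decoupling capacitor; above this frequency
    its parasitic inductance (ESL) dominates."""
    return 1.0 / (2.0 * math.pi * math.sqrt(esl_henries * c_farads))

# Placeholder numbers only:
z_t = target_impedance(vdd=1.0, ripple_fraction=0.05, transient_current=2.0)
f_r = self_resonant_frequency(c_farads=100e-9, esl_henries=0.5e-9)
print(f"target impedance    ~ {z_t * 1000:.0f} mOhm")    # 25 mOhm
print(f"100 nF / 0.5 nH SRF ~ {f_r / 1e6:.1f} MHz")       # ~22.5 MHz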
30

Modeling, design, fabrication and reliability characterization of ultra-thin glass BGA package-to-board interconnections

Singh, Bhupender 27 May 2016 (has links)
Recent trends toward miniaturized systems such as smartphones and wearables, as well as the rise of autonomous vehicles relying on all-electric and smart in-car systems, have brought unprecedented demands on performance, functionality, and cost. Transistor scaling alone cannot meet these metrics unless the remaining system components, such as substrates and interconnections, are scaled down to bridge the gap between transistor and system scaling. In this regard, 3D glass system packages have emerged as a promising alternative due to their ultra-short system interconnection lengths, higher component densities, and system reliability, enabled by the tailorable coefficient of thermal expansion (CTE), high dimensional stability and surface smoothness, outstanding electrical properties, and low-cost panel-level processability of glass. The research objective is to demonstrate board-level reliability of large, thin glass packages directly mounted on the PCB with conventional BGAs at SMT pitches of 400 µm and smaller. Two key innovations are introduced to accomplish this objective: (a) reworkable circumferential polymer collars that provide strain relief at critical high-stress areas in the solder joints, and (b) a novel Mn-doped SACM™ solder that provides superior drop-test performance without degrading thermomechanical reliability. Modeling, package and board design, fabrication, and reliability characterization were carried out to demonstrate reliable board-level interconnections for large, ultra-thin glass packages. Finite-element modeling (FEM) was used to investigate the effectiveness of circumferential polymer collars as a strain-relief solution for fatigue performance. Experimental results with polymer collars indicated a 2X improvement in drop performance and a 30% improvement in fatigue life. Failure analysis was performed using characterization techniques such as confocal scanning acoustic microscopy (C-SAM), optical microscopy, X-ray imaging, and scanning electron microscopy/energy-dispersive spectrometry (SEM/EDS). Model-to-experiment correlation was performed to validate the effectiveness of polymer collars as a strain-relief mechanism. The enhancement in board-level reliability achieved with advances in solder materials based on Mn-doped SACM™ is demonstrated in the last part of the thesis. These studies thus demonstrate material, design, and process innovations for package-to-board interconnection reliability with ultra-thin, large glass packages.
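The fatigue-life gains reported above follow the usual low-cycle-fatigue logic: strain relief at the solder joint lowers the plastic strain range per thermal cycle, which lengthens life. The sketch below illustrates that relationship with a generic Coffin-Manson model; the ductility coefficient, fatigue exponent and strain ranges are placeholders, not data or models from this thesis.

def coffin_manson_cycles(plastic_strain_range, ductility_coeff=0.3, exponent=-0.5):
    """Coffin-Manson low-cycle fatigue: delta_eps_p / 2 = eps_f' * (2 N_f)^c,
    solved here for the cycles to failure N_f."""
    return 0.5 * (plastic_strain_range / (2.0 * ductility_coeff)) ** (1.0 / exponent)

# Placeholder plastic strain ranges per thermal cycle (not measured values):
baseline = coffin_manson_cycles(0.010)   # joint without strain relief
collared = coffin_manson_cycles(0.008)   # assume ~20% lower strain with a collar
print(f"fatigue life ratio ~ {collared / baseline:.2f}x")   # ~1.56x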
