1 |
Design and performance analysis of raster graphics systems. Petrus, K. I. January 1987.
No description available.
|
2 |
A nonlinear statistical MESFET model using low order statistics of equivalent circuit model parameter sets. Crampton, Raymond J. 03 March 2009.
A nonlinear statistical MESFET model is presented which shows good agreement with measured results. A single-stage GaAs power amplifier tuned at 13.5 GHz is simulated with the model, and the simulation accurately predicts output power, input return loss, and power-added efficiency. Not only is nominal performance predicted well, but the spreads of amplifier performance seen over many wafers and several lots are also represented well. The model is built by capturing low order statistics (means, variances, and correlations) of equivalent circuit model parameter sets and using these to produce a continuous distribution of devices representative of those seen in manufacturing. Complete nonlinear and statistical modeling extraction software packages are also presented, which allow easy investigation of many aspects of statistical models, such as the sensitivity of data set statistics as a function of sample size and how model parameters vary over time for a given fabrication process. / Master of Science
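The statistical device-generation step described in this abstract lends itself to a simple illustration. The sketch below is not the author's extraction software; it only shows how hypothetical low-order statistics for a few equivalent-circuit parameters could be turned into a continuous population of simulated devices by sampling from a multivariate normal distribution. All parameter names, values, and correlations are illustrative assumptions.

    import numpy as np

    # Hypothetical low-order statistics for three equivalent-circuit parameters
    # (e.g. transconductance gm, gate-source capacitance Cgs, drain resistance Rd).
    means = np.array([50e-3, 0.30e-12, 5.0])        # units: S, F, ohm
    std_devs = np.array([4e-3, 0.02e-12, 0.4])
    corr = np.array([[ 1.0,  0.6, -0.3],
                     [ 0.6,  1.0, -0.1],
                     [-0.3, -0.1,  1.0]])

    # Covariance matrix built from standard deviations and correlations
    cov = np.outer(std_devs, std_devs) * corr

    # Draw a "wafer" of simulated devices whose spread mimics manufacturing
    rng = np.random.default_rng(seed=1)
    devices = rng.multivariate_normal(means, cov, size=1000)

    print(devices.mean(axis=0))    # approximates the target means
    print(np.corrcoef(devices.T))  # approximates the target correlations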
|
3 |
An investigation into the consistency and usability of selected minisatellite detecting software packages. Masombuka, Koos Themba. January 2013.
A tandem repeat is a sequence of adjacent repetitions of a nucleotide pattern signature, called its motif, in a DNA sequence. The repetitions may be either exact or approximate copies of the motif. A minisatellite is a tandem repeat whose motif is of moderate length.

One approach to searching for minisatellites assumes prior knowledge about the motif. This approach limits the search for minisatellites to specified motifs. An alternative approach tries to identify signatures autonomously from within a DNA sequence. Several different algorithms that use this approach have been developed. Since they do not use pre-specified motifs, and since a degree of approximation is tolerated, there may be ambiguity about where minisatellites start and end in a given DNA sequence.

Various experiments were conducted on four well-known software packages to investigate this conjecture. The software packages were executed on the same data and their respective outputs were compared. The study found that the selected computer algorithms did not report the same outputs. The lack of precise definitions of the properties of such patterns may explain these differences. The differences in definitions relate to the nature and extent of the approximation to be tolerated in the patterns during the search. This problem could potentially be overcome by agreeing on how to specify acceptable approximations when searching for minisatellites.
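To make the source of the ambiguity concrete, the sketch below is a deliberately naive scanner that reports only exact repetitions of a candidate motif; it is not one of the packages studied. Real minisatellite detectors also accept approximate copies of the motif, and it is precisely the unspecified tolerance for such approximation that lets different packages disagree about where a repeat starts and ends. The motif-length limits and the toy sequence are assumptions made for the example.

    def exact_tandem_repeats(seq, min_motif=10, max_motif=60, min_copies=2):
        """Report (start, motif, copies) for exact tandem repeats.

        Minisatellite motifs are of moderate length, so only motif sizes in
        [min_motif, max_motif] are scanned.  Real detectors also tolerate
        approximate copies of the motif, which this sketch does not.
        """
        hits = []
        n = len(seq)
        for size in range(min_motif, max_motif + 1):
            i = 0
            while i + 2 * size <= n:
                motif = seq[i:i + size]
                copies = 1
                while seq[i + copies * size:i + (copies + 1) * size] == motif:
                    copies += 1
                if copies >= min_copies:
                    hits.append((i, motif, copies))
                    i += copies * size
                else:
                    i += 1
        return hits

    # toy example: a 12-base motif repeated three times inside flanking sequence
    print(exact_tandem_repeats("ACGT" + "GATTACAGATTA" * 3 + "TTGC",
                               min_motif=12, max_motif=12))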
Some of these packages are implemented as Academic/Research Software (ARS). Noting that ARS has a reputation of being difficult to use, this study also investigated the usability of these ARS implementations. It relied on literature that offers usability evaluation methods. Potential problems that are likely to affect the general usability of the systems were identified. These problems relate, inter alia, to visibility, consistency, and efficiency of use. Furthermore, usability guidelines from the literature were followed to modify the user interface of one of the implementations. A sample of users evaluated the before and after versions of this user interface. Their feedback suggests that the usability guidelines were indeed effective in enhancing the user interface. / Dissertation (MSc)--University of Pretoria, 2013. / gm2014 / Computer Science / unrestricted
|
4 |
An Additive Bivariate Hierarchical Model for Functional Data and Related Computations. Redd, Andrew Middleton. August 2010.
The work presented in this dissertation centers on the theme of regression and computation methodology. Functional data is an important class of longitudinal data, and principal component analysis is an important approach to regression with this type of data. Here we present an additive hierarchical bivariate functional data model employing principal components to identify random effects. This additive model extends the univariate functional principal component model. These models are implemented in the pfda package for R. To fit the curves from this class of models, orthogonalized spline bases are used to reduce the dimensionality of the fit, but retain flexibility. Methods for handling spline basis functions in a purely analytical manner, including the orthogonalizing process and the computation of penalty matrices used to fit the principal component models, are presented. These methods are implemented in the R package orthogonalsplinebasis.
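As an illustration of the orthogonalization idea (the orthogonalsplinebasis package itself works analytically, whereas this sketch uses a simple numerical quadrature), the following Python fragment builds a cubic B-spline basis, forms its Gram matrix, and transforms it into an orthonormal basis. The knot sequence, degree, and evaluation grid are assumptions made purely for the example.

    import numpy as np
    from scipy.interpolate import BSpline

    # hypothetical cubic B-spline basis on [0, 1] with clamped (repeated) end knots
    k = 3
    interior = np.linspace(0.0, 1.0, 8)
    t = np.concatenate(([0.0] * k, interior, [1.0] * k))
    n_basis = len(t) - k - 1

    # evaluate every basis function on a midpoint grid (avoids endpoint issues)
    x = (np.arange(400) + 0.5) / 400.0
    B = np.column_stack([BSpline(t, np.eye(n_basis)[j], k)(x) for j in range(n_basis)])

    # numerical Gram matrix and its Cholesky factor
    w = 1.0 / len(x)
    G = B.T @ B * w
    L = np.linalg.cholesky(G)

    # orthonormalized basis: the columns of Phi are orthonormal in L2([0, 1])
    Phi = B @ np.linalg.inv(L).T
    print(np.allclose(Phi.T @ Phi * w, np.eye(n_basis)))   # True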
The projects discussed involve complicated coding for the implementations in R. To facilitate this I created the NppToR utility, which adds R functionality to the popular Windows code editor Notepad++. A brief overview of the use of the utility is also included.
|
5 |
The Resource Allocation Capabilities Of Commercial Project Management Software Packages For Resource Constrained Project Scheduling Problem. Cekmece, Kerem. 01 March 2009.
The resource constrained project scheduling problem (RCPSP) has been the subject of extensive research in the project management literature, as the RCPSP is one of the most challenging problems in project management and is of great practical importance. In this thesis, the resource allocation capabilities of Primavera Enterprise V.6.0-Project Management (P6) and MS Project 2007 (MS) were evaluated for solving resource over-allocation problems in the RCPSP. Forty-five resource over-allocated instance projects were selected from the PSPLIB to evaluate the performance of P6 and MS Project 2007. Three resource allocation priority rules of P6 and two resource allocation priority rules of MS were used for comparison. The best solutions of the different priority rules for P6 and MS were compared by using the t-test. Results of P6 and MS were compared with the lower bounds and optimum solutions of previous heuristic methods. The comparisons indicate that both P6 and MS have limited capabilities for solving over-allocation problems in the RCPSP. Especially for larger projects, these widely used project management software packages cannot provide optimum or near-optimum solutions.
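The statistical comparison described in this abstract can be outlined in a short sketch. The code below applies a paired t-test to hypothetical best makespans obtained by the two packages on the same set of instances; the numbers are invented and SciPy is assumed, so this is only an illustration of the kind of test used, not the thesis's actual analysis.

    import numpy as np
    from scipy import stats

    # Hypothetical best makespans (in days) produced by two packages on the
    # same ten PSPLIB instances; the actual study used 45 instances.
    p6_best = np.array([62, 71, 55, 80, 67, 74, 59, 88, 63, 70])
    ms_best = np.array([64, 70, 58, 83, 67, 77, 60, 90, 66, 72])

    # Paired t-test: are the per-instance differences significantly non-zero?
    t_stat, p_value = stats.ttest_rel(p6_best, ms_best)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")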
|
6 |
Risk management and tacit knowledge in IT projects: making the implicit explicit. Taylor, Hazel Ann. January 2004.
This research addressed the need for in-depth investigation of what actually happens in the practice of risk management in software package implementation projects. There is strong 'official' sanction in the IT literature for the use of formal risk management processes for IT projects but there is a confused picture of their application in practice. While many potential risk factors for IT projects have been identified, and formal procedures have been prescribed for the management of these risks, there has been little work investigating how project managers assess these risks in practice and what countermeasures they employ against these risks in their projects. In particular, the study used an interpretive critical decision interview approach to focus on those areas of risk management knowledge that project managers have acquired through experience, i.e. tacit knowledge. A new categorization of risk factors emanating from three sources (vendor, client, and third party) reveals risk factors not previously identified. Some of these new factors arise from the three sources noted, while others arise from the package implementation focus of the projects and from aspects arising from the location of the projects in Hong Kong. Key factors that cause problems even when anticipated and mitigated, and the most often unanticipated problems are also identified. The study further presents an examination of the studied managers' risk management practices, and the strategies they use to address both potential and actual problems. This examination revealed close conformance with recommended literature prescriptions at some stages of projects, and significant variation at other stages, with strategies applied being broad and general rather than risk specific. A useful categorization of these strategies into four broad groups relating to different sets of risk factors is presented, reflecting the actual practice of respondents. Tacit knowledge was revealed throughout these investigations in the variances observed between prescribed and actual practice, and particularly from an examination of project managers' decision-making practices from two different perspectives: rational and naturalistic. A hybrid decision-making model is proposed to capture the actual processes observed, and to provide prescriptive guidance for risk management practice. The investigation makes a contribution to the field of IT project risk management in three ways. First, the investigation has addressed the need for empirical studies into IT risk management practices and the factors influencing project managers in their choice and application of strategies to manage risk. Second, by examining how experienced IT project managers approach the task of managing risk in software package implementations, the study has extended our understanding of the nature of the knowledge and skills that effective IT project managers develop through experience. Third, the study makes a theoretical contribution to our understanding of IT project risk management by examining the decision-making processes followed by IT project managers from the perspective of two contrasting theories of decision-making: the rational method and the Naturalistic Decision Making theory.
|
7 |
DSPNexpress: a software package for the efficient solution of deterministic and stochastic Petri nets. Lindemann, Christoph. 10 December 2018.
This paper describes the analysis tool DSPNexpress, which has been developed at the Technische Universität Berlin since 1991. The development of DSPNexpress was motivated by the lack of a powerful software package for the numerical solution of deterministic and stochastic Petri nets (DSPNs) and by the complexity requirements imposed by evaluating memory consistency models for multicomputer systems. The development of DSPNexpress has benefited from the author's experience with version 1.4 of the software package GreatSPN. However, as opposed to GreatSPN, the software architecture of DSPNexpress is particularly tailored to the numerical evaluation of DSPNs. Furthermore, DSPNexpress contains a graphical interface running under the X11 window system. To the best of the author's knowledge, DSPNexpress is the first software package which contains an efficient numerical algorithm for computing steady-state solutions of DSPNs.
|
8 |
Finite Element Analysis of Unreinforced Concrete Block Walls Subject to Out-of-Plane Loading. He, Zhong. 12 1900.
Finite element modeling of the structural response of hollow concrete block walls subject to out-of-plane loading has become more common given the availability of computers and general-purpose finite element software packages. In order to develop appropriate models of full-scale walls with and without openings, a parametric study was conducted on simple wall elements to assess different modeling techniques. Two approaches were employed in the study: homogeneous models and heterogeneous models. A linear elastic analysis was carried out to quantify the effects of the modeling techniques for hollow blocks on the structural response of the assembly, specifically for out-of-plane bending. Three structural elements with varying span/thickness ratios were considered: a horizontal spanning strip, a vertical spanning strip, and a rectangular wall panel supported on four edges. The values computed using homogeneous and heterogeneous finite element models were found to differ significantly depending on the configuration and span/thickness ratio of the wall.

Further study was carried out through a discrete modeling approach to generate a three-dimensional heterogeneous model to investigate the nonlinear behaviour of full-scale walls under out-of-plane loading. The Composite Interface Model, established on the basis of multi-surface plasticity and capable of describing both tension and shear failure mechanisms, was incorporated into the analysis to capture adequately the inelastic behaviour of the unit-mortar interface. An effective solution procedure was achieved by implementing the Newton-Raphson method, constrained with the arc-length control method and enhanced by a line search algorithm. The proposed model was evaluated using experimental results for ten full-size walls reported in the literature. The comparative analysis indicated very good agreement between the numerical and experimental results in predicting the cracking and ultimate load values as well as the corresponding crack patterns. / Thesis / Master of Applied Science (MASc)
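For readers unfamiliar with the solution procedure mentioned in this abstract, the following is a minimal sketch of a Newton-Raphson iteration with a backtracking line search, applied to a toy two-equation system standing in for assembled finite element equations. It omits the arc-length control used in the thesis and is not the thesis implementation; all names and values are assumptions made for the example.

    import numpy as np

    def newton_raphson(residual, jacobian, u0, tol=1e-10, max_iter=50):
        """Basic Newton-Raphson with a backtracking line search.

        A simplified sketch of the incremental-iterative idea; arc-length
        control (needed to trace post-peak softening response) is omitted.
        """
        u = np.asarray(u0, dtype=float)
        for _ in range(max_iter):
            r = residual(u)
            if np.linalg.norm(r) < tol:
                return u
            du = np.linalg.solve(jacobian(u), -r)
            # backtracking line search on the residual norm
            step = 1.0
            while np.linalg.norm(residual(u + step * du)) > np.linalg.norm(r) and step > 1e-4:
                step *= 0.5
            u = u + step * du
        return u

    # toy nonlinear system with solution (1, 0)
    res = lambda u: np.array([u[0]**3 + u[1] - 1.0, u[1]**3 - u[0] + 1.0])
    jac = lambda u: np.array([[3 * u[0]**2, 1.0], [-1.0, 3 * u[1]**2]])
    print(newton_raphson(res, jac, [0.5, 0.5]))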
|
9 |
Development of registration methods for cardiovascular anatomy and function using advanced 3T MRI, 320-slice CT and PET imaging. Wang, Chengjia. January 2016.
Different medical imaging modalities provide complementary anatomical and functional information. One increasingly important use of such information is in the clinical management of cardiovascular disease. Multi-modality data is helping to improve diagnostic accuracy and individualize treatment. The Clinical Research Imaging Centre at the University of Edinburgh has been involved in a number of cardiovascular clinical trials using longitudinal computed tomography (CT) and multi-parametric magnetic resonance (MR) imaging. The critical image processing technique that combines the information from all these different datasets is known as image registration, which is the topic of this thesis. Image registration, especially multi-modality and multi-parametric registration, remains a challenging field in medical image analysis. The new registration methods described in this work were all developed in response to genuine challenges in ongoing clinical studies. These methods have been evaluated using data from these studies. In order to gain an insight into the building blocks of image registration methods, the thesis begins with a comprehensive literature review of state-of-the-art algorithms. This is followed by a description of the first registration method I developed to help track inflammation in abdominal aortic aneurysms. It registers multi-modality and multi-parametric images, with new contrast agents. The registration framework uses a semi-automatically generated region of interest around the aorta. The aorta is aligned based on a combination of the centres of the regions of interest and intensity matching. The method achieved sub-voxel accuracy. The second clinical study involved cardiac data. The first framework failed to register many of these datasets, because the cardiac data suffers from a common artefact of magnetic resonance images, namely intensity inhomogeneity. Thus I developed a new preprocessing technique that is able to correct the artefacts in the functional data using data from the anatomical scans. The registration framework, with this preprocessing step and a new particle swarm optimizer, achieved significantly improved registration results on the cardiac data, and was validated quantitatively using neuroimages from a clinical study of neonates. Although on average the new framework achieved accurate results, when processing data corrupted by severe artefacts and noise, premature convergence of the optimizer is still a common problem. To overcome this, I invented a new optimization method that achieves more robust convergence by encoding prior knowledge of registration. The registration results from this new registration-oriented optimizer are more accurate than those from other general-purpose particle swarm optimization methods commonly applied to registration problems. In summary, this thesis describes a series of novel developments to an image registration framework, aimed at improving accuracy, robustness and speed. The resulting registration framework was applied to, and validated by, different types of images taken from several ongoing clinical trials. In the future, this framework could be extended to include more diverse transformation models, aided by new machine learning techniques. It may also be applied to the registration of other types and modalities of imaging data.
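As background for the optimization work described in this abstract, the sketch below shows a plain global-best particle swarm optimizer applied to a toy cost function standing in for an image-similarity measure over transformation parameters. It is a generic textbook PSO, not the registration-oriented optimizer developed in the thesis, and every parameter choice is an assumption made for the example.

    import numpy as np

    def particle_swarm(cost, dim, n_particles=30, iters=200,
                       w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
        """Generic global-best particle swarm minimizer (textbook variant)."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))   # positions
        v = np.zeros((n_particles, dim))              # velocities
        pbest = x.copy()
        pbest_cost = np.array([cost(p) for p in x])
        gbest = pbest[pbest_cost.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            costs = np.array([cost(p) for p in x])
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
            gbest = pbest[pbest_cost.argmin()].copy()
        return gbest, pbest_cost.min()

    # toy cost standing in for an image-similarity measure over six
    # rigid-transformation parameters (3 translations + 3 rotations)
    sphere = lambda p: float(np.sum(p**2))
    print(particle_swarm(sphere, dim=6))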
|
10 |
Comparison Of The Resource Allocation Capabilities Of Project Management Software Packages In Resource Constrained Project Scheduling Problems. Hekimoglu, Ozge. 01 January 2007.
In this study, results of a comparison on benchmark test problems are presented to investigate the performance of Primavera V.4.1, with its two resource allocation priority rules, and MS Project 2003. The resource allocation capabilities of the packages are measured in terms of deviation from the upper bound of the minimum makespan. The resource constrained project scheduling problem instances are taken from PSPLIB, where they were generated under a factorial design using ProGen. Statistical tests are applied to the results to investigate the significance of the effects of the parameters.
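The deviation measure mentioned in this abstract can be sketched in a few lines. The makespan values below are invented and serve only to illustrate how percent deviation from the best-known (upper-bound) makespan would be computed per instance and averaged to rank the packages.

    import numpy as np

    # Hypothetical makespans: package result vs. best-known (upper-bound) value
    # for a handful of PSPLIB instances.
    package = np.array([65, 74, 58, 91, 70])
    best_known = np.array([62, 71, 55, 85, 68])

    # Percent deviation per instance, then the average used to compare packages
    deviation = (package - best_known) / best_known * 100.0
    print(deviation.round(2), "avg =", deviation.mean().round(2))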
|