21 |
Representing Component Variability In Configuration Management
Bayraktar, Gamze 01 September 2012 (has links) (PDF)
Reusability of assets within a family of products is the major goal of Software Product Line Engineering (SPLE); therefore, managing variability is an important task in SPLs. Configuration management in the context of software product line engineering is more complicated than in single-system engineering due to the "variability in space" of core assets, in addition to their "variability in time". In this study, a method for documenting variability in executable configuration items, namely components, is proposed by associating them with the Orthogonal Variability Model (OVM), which introduces variability as a separate model. The main aim is to trace variability in different configurations by explicitly documenting variability information for components. The links between OVM elements and components facilitate tool support for product derivation, as the components matching the selected variations can be gathered by following the links. The proposed scheme is demonstrated on a case study about a radar GUI variability model.
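The traceability idea in this abstract can be sketched in a few lines: components carry explicit links to OVM variation points, and product derivation simply follows those links. The sketch below is illustrative only; the class and function names (`VariationPoint`, `Component`, `derive_product`) are hypothetical and not the thesis's actual scheme.

```python
class VariationPoint:
    """A variation point from the OVM, with its allowed variants."""
    def __init__(self, name, variants):
        self.name = name
        self.variants = set(variants)

class Component:
    """An executable configuration item linked to the OVM: 'realizes' maps a
    variation point name to the variant this component implements."""
    def __init__(self, name, realizes):
        self.name = name
        self.realizes = realizes

def derive_product(components, selection):
    """Follow the component-to-OVM links: keep components whose realized
    variants all match the selection (components with no links are common
    to every product, since all() over an empty mapping is True)."""
    return [c.name
            for c in components
            if all(selection.get(vp) == variant
                   for vp, variant in c.realizes.items())]
```

Under this toy model, selecting a variant at a variation point gathers exactly the matching components plus the common ones, which is the product-derivation step the abstract describes.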
|
22 |
A Method To Decrease Common Problems In Effort Data Collection In The Software Industry
Ozkaya Eren, Aysegul 01 August 2012 (has links) (PDF)
Efficient project planning and project management are crucial to completing software projects within the expected time and requirements. The most critical stage in project planning is estimation of the software size, time and budget. In this stage, effort data is used for benchmarking data sets, effort estimation, and project monitoring and controlling. However, there are some problems related to effort data collection in the software industry. In this thesis, a pilot study and a survey study are conducted to observe common practices and problems in effort data collection in the industry, and the results are analyzed. These problems are explained in terms of tool, process and people factors, and solution suggestions are presented for each. In accordance with the findings, a method and a tool which facilitate the collection of more accurate data are developed. A case study is performed in order to validate the method and the applicability of the tool in the industry.
|
23 |
Automatic Cartoon Generation By Learning The Style Of An Artist
Kuruoglu, Betul 01 September 2012 (has links) (PDF)
In this study, we suggest an algorithm for generating cartoons from face images automatically.
The suggested method learns drawing style of an artist and applies this style to the face images
in a database to create cartoons.
The training data consists of a set of face images and corresponding cartoons drawn by the same artist. Initially, a set of control points is labeled and indexed to characterize the faces in the training data set, for both the images and the corresponding caricatures. Then, their features are extracted to model the style of the artist. Finally, a similarity matrix between the real face image set and the input image is constructed. With the help of this similarity matrix, a Distance-Weighted Nearest Neighbor algorithm calculates the exaggeration coefficients that the caricaturist would have designed for the input image in his mind. In the caricature generation phase, the Moving Least Squares algorithm is applied to distort the input image based on these coefficients. Caricatures generated by this approach successfully capture most of the caricaturist's key characteristics in his drawing.
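The distance-weighted nearest-neighbour step can be illustrated with a minimal sketch: the exaggeration coefficient for an input face is an inverse-distance-weighted average of the coefficients of the training faces. This is a generic illustration of distance-weighted NN regression, not the thesis's actual feature representation; the function name and the simple Euclidean distance are assumptions.

```python
def exaggeration_coefficient(input_features, training_set, eps=1e-9):
    """Distance-weighted nearest-neighbour estimate: training faces closer
    to the input face contribute more to the predicted coefficient.
    training_set is a list of (feature_vector, coefficient) pairs."""
    num = 0.0
    den = 0.0
    for features, coeff in training_set:
        dist = sum((a - b) ** 2 for a, b in zip(input_features, features)) ** 0.5
        w = 1.0 / (dist + eps)   # inverse-distance weight; eps avoids div-by-zero
        num += w * coeff
        den += w
    return num / den
```

An input exactly between two training faces receives the average of their coefficients, while an input coinciding with a training face essentially inherits that face's coefficient.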
|
24 |
Dds Based Mil-std-1553b Data Bus Interface Simulation
Deniz, Ertan 01 September 2012 (has links) (PDF)
This thesis describes distributed simulation of MIL-STD-1553B Serial Data Bus interface
and protocol based on the Data Distribution Service (DDS) middleware standard. The
data bus connects avionics system components and transports information among them in
an aircraft. It is important for system designers to be able to evaluate and verify their
component interfaces at the design phase. The 1553 serial data bus requires specialized
hardware and wiring to operate, thus it is expensive and complex to verify component
interfaces. Therefore modeling the bus on commonly available hardware and networking
infrastructure is desirable for evaluation and verification of component interfaces. The DDS
middleware provides publish-subscribe based communications with a number of QoS (Quality
Of Service) attributes. DDS makes it easy to implement distributed systems by providing an
abstraction layer over the networking interfaces of the operating systems. This thesis takes
advantage of the DDS middleware to implement a 1553 serial data bus simulation tool. In
addition, the tool provides XML based interfaces and scenario definition capabilities, which
enable easy and quick testing and validation of component interfaces. Verification of the tool
was performed over a case study using a scenario based on the MIL-STD-1760 standard.
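The publish-subscribe pattern that DDS provides can be illustrated with a toy in-process broker. This is not the thesis's tool nor a real DDS API (DDS additionally offers discovery, typed topics and QoS policies); the `Topic` class and the 1553-style command dictionary below are hypothetical simplifications.

```python
class Topic:
    """Toy stand-in for a DDS topic: publishers write samples and every
    subscribed reader receives them (no QoS handling in this sketch)."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, sample):
        for cb in self._subscribers:
            cb(sample)

def make_command(rt_address, subaddress, word_count, data):
    """A 1553-style message: the bus controller commands a remote terminal
    (field names here are illustrative, not the standard's word layout)."""
    return {"rt": rt_address, "sa": subaddress, "wc": word_count, "data": data}
```

A simulated remote terminal would subscribe to the bus topic and react to commands addressed to its `rt` field, mirroring how the real bus hardware is replaced by middleware communication.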
|
25 |
Automated Building Detection From Satellite Images By Using Shadow Information As An Object Invariant
Baris, Yuksel 01 October 2012 (has links) (PDF)
Apart from classical pattern recognition techniques applied for automated building detection in satellite images, a robust building detection methodology is proposed, in which self-supervision data can be automatically extracted from the image by using the shadow and its direction as an invariant for the building object. In this methodology, first, the vegetation, water and shadow regions are detected from a given satellite image, and local directional fuzzy landscapes representing the likely presence of buildings are generated from the shadow regions using the direction of illumination obtained from the image metadata. For each landscape, foreground (building) and background pixels are automatically determined, and a bipartitioning is obtained using a graph-based algorithm, GrabCut. Finally, the local results are merged to obtain the final building detection result. Considering the performance evaluation results, this approach can be seen as a proof of concept that the shadow is an invariant for a building object and that promising detection results can be obtained even when only a single invariant for an object is used.
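The directional part of the methodology can be illustrated with a toy membership function: relative to a shadow pixel, candidate pixels lying in the direction of the building (derived from the illumination direction) get membership near 1, and pixels in other directions fall off to 0. This is a simplified sketch, assuming `building_dir_deg` already encodes the sun-derived direction in image coordinates; the function name and the cosine falloff are assumptions, not the thesis's actual fuzzy landscape definition.

```python
import math

def directional_membership(shadow_px, candidate_px, building_dir_deg):
    """Fuzzy membership in [0, 1]: ~1 when the candidate pixel lies in the
    expected building direction from the shadow pixel, 0 when it lies in
    the opposite half-plane. Pixels are (x, y) tuples."""
    dx = candidate_px[0] - shadow_px[0]
    dy = candidate_px[1] - shadow_px[1]
    if dx == 0 and dy == 0:
        return 0.0
    target = math.radians(building_dir_deg)
    angle = math.atan2(dy, dx)
    # Wrapped angular difference in (-pi, pi]
    diff = math.atan2(math.sin(angle - target), math.cos(angle - target))
    return max(0.0, math.cos(diff))   # cosine falloff, clipped at 0
```

Accumulating such memberships over all shadow pixels would yield a landscape whose high-membership regions serve as the automatically extracted foreground seeds for the graph-based bipartitioning.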
|
26 |
A Qualitative Model Of The Critical Success Factors For The Effectiveness Of Information Systems Outsourcing
Ucar, Erkan 01 October 2012 (links) (PDF)
The objective of this research is to construct a model of the critical success factors for the effectiveness of Information Systems (IS) outsourcing. "Lack of in-house expertise" and "cost effectiveness" are the widely accepted major factors of motivation for IS outsourcing. Although various decision models and analytical frameworks have been proposed before, the literature is not abundant in complete qualitative models. In contrast with the decision models, which are executed before an outsourcing engagement (a priori), an effectiveness model is an a posteriori guide that enables clients to measure their outsourcing performance and re-evaluate their business and management strategies. This thesis examines the critical success factors for outsourcing effectiveness through qualitative research conducted with multiple case studies of information systems developed for public and private clients. A conceptual model consisting of various hypotheses is constructed and qualitatively evaluated.
|
27 |
Organizational Learning Assessment In Software Development Organizations
Chouseinoglou, Oumout 01 October 2012 (has links) (PDF)
Knowledge is one of the most important assets of an organization, directly affecting business success, and its importance is even greater for organizations that use knowledge-intensive processes, such as those in the software development industry. In an industry in which technological developments are rapid, in order to keep up with the continuously increasing competition and to obtain competitive advantage, software organizations need to obtain the correct knowledge, use it efficiently and pass it on to future projects, evolving it accordingly. The major aim of this research is to propose a novel model, namely AiOLoS, for assessing the level of organizational learning and learning characteristics in software development organizations. The primary contributions of this two-legged AiOLoS model are the identification of the major process areas and core processes that a learning software organization follows during its organizational learning process, and the provision of the necessary measures and metrics, with the corresponding definitions/interpretations, for the assessment of the learning characteristics of a software development organization. The research is supported by multiple case-study work conducted with software development teams in order to identify the mapping of the core processes and the applicability of the AiOLoS model to software development organizations, its utilization as a tool for assessing organizational learning, and its ability to provide a basis for software process improvement.
|
28 |
Knowledge Discovery In Microarray Data Of Bioinformatics
Kocabas, Fahri 01 June 2012 (has links) (PDF)
This thesis analyzes major microarray repositories and presents a metadata
framework both to address the current issues and to promote the main operations
such as knowledge discovery, sharing, integration, and exchange. The proposed
framework is demonstrated in a case study on real data and can be used for other
high throughput repositories in biomedical domain.
Not only does the number of microarray experiments increase, but the size and complexity of the results also rise in response to biomedical inquiries. Moreover, experiment results are significant when examined in a batch and placed in a biological context. There have been standardization initiatives on content, object model, exchange format, and ontology. However, each has its own proprietary information space. There are backlogs, and the data cannot be exchanged among the repositories. There is a need for a format and data management standard at present.
We introduce a metadata framework that includes metadata cards and semantic nets to make experiment results visible, understandable and usable. They are encoded in standard syntax encoding schemes and represented in XML/RDF. They can be integrated with other metadata cards and semantic nets, and can be queried, exchanged and shared. We demonstrate the performance and potential benefits with a case study on a microarray repository.
This study does not replace any existing repository product. A metadata framework is required to manage such huge data. We state that the backlogs can be reduced, and that complex knowledge discovery queries and exchange of information can become possible, with this metadata framework.
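The metadata-card idea can be sketched with the standard library's XML tools: an experiment's descriptive fields are wrapped in a small, queryable, exchangeable XML document. The element names (`MetadataCard`, `Field`) are illustrative placeholders, not the thesis's actual XML/RDF schema.

```python
import xml.etree.ElementTree as ET

def metadata_card(experiment_id, fields):
    """Build a minimal XML metadata card for a microarray experiment.
    'fields' maps descriptive field names to their string values."""
    card = ET.Element("MetadataCard", id=experiment_id)
    for name, value in fields.items():
        field = ET.SubElement(card, "Field", name=name)
        field.text = value
    return ET.tostring(card, encoding="unicode")
```

Because the card is plain XML, it can be validated, merged with other cards, and queried with standard tooling, which is the sharing-and-integration benefit the abstract points to.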
|
29 |
Dynamic Approach To Wind Sensitive Optimum Cruise Phase Flight Planning
Yildiz, Guray 01 October 2012 (has links) (PDF)
A Flight Management System (FMS) performs 4-dimensional flight planning: lateral planning (calculation of the latitudes and longitudes of waypoints), vertical planning (calculation of the altitudes of waypoints) and temporal planning (calculation of the estimated time of arrival). In brief, an FMS correctly and accurately calculates the 4D flight path and then guides the pilot/airplane to track the route within specified accuracy limits in the lateral (Required Navigation Performance, RNP), vertical (Reduced Vertical Separation Minima, RVSM) and time (Required Time of Arrival, RTA) dimensions.
Any deviation of actual input values from the planned values, especially during emergency cases (e.g., one of the engines burning out), causes the aircraft to deviate from the plan and requires replanning that takes the current situation into consideration.
In emergency situations, especially on oceanic flights (flights whose cruise phase lasts more than 5 hours), optimum cruise phase flight route planning plays a vital role.
In the avionics domain, "optimum" does not mean "shortest path", mainly because weather data such as wind speed and direction directly affect the groundspeed.
In the scope of the current thesis, an algorithm employing dynamic programming paradigms is designed and implemented to find the optimum flight route. A top-down approach making use of an aircraft route planning ontology is implemented to fill the gap between the flight-plan-specific domain knowledge and the optimization techniques employed. Whereas the algorithm is generic, encapsulating the aircraft's performance characteristics, it is evaluated on the C-130 aircraft.
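The "optimum is not the shortest path" point can be made concrete with a small dynamic-programming sketch over a layered waypoint graph, where each leg's time is distance divided by groundspeed and groundspeed is airspeed plus the tailwind component. This is a generic illustration of the DP idea, not the thesis's algorithm; the graph layout, the single-scalar wind model and all names are assumptions.

```python
def min_time_route(layers, edges, airspeed):
    """Dynamic programming over a layered waypoint graph, minimizing time.
    layers: list of lists of waypoint ids; layer 0 holds the single origin,
            the last layer the single destination.
    edges:  dict (u, v) -> (distance_nm, tailwind_kt); tailwind is positive
            when the wind pushes the aircraft along the leg (simplified).
    Returns (total_hours, route)."""
    best = {layers[0][0]: (0.0, [layers[0][0]])}
    for i in range(len(layers) - 1):
        nxt = {}
        for u, (t_u, path) in best.items():
            for v in layers[i + 1]:
                if (u, v) not in edges:
                    continue
                dist, tailwind = edges[(u, v)]
                t = t_u + dist / (airspeed + tailwind)  # hours for this leg
                if v not in nxt or t < nxt[v][0]:
                    nxt[v] = (t, path + [v])
        best = nxt
    return best[layers[-1][0]]
```

In the test below, the geometrically shorter route fights a headwind and loses to a longer route flown with a tailwind, which is exactly why the optimum route is not the shortest one.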
|
30 |
Improving Scalability And Efficiency Of Ilp-based And Graph-based Concept Discovery Systems
Mutlu, Alev 01 July 2013 (has links) (PDF)
Concept discovery is the problem of finding definitions of a target relation in terms of other relations given as background knowledge. Inductive Logic Programming (ILP)-based and graph-based approaches are two competitors in the concept discovery problem. Although ILP-based systems have long dominated the area, graph-based systems have recently gained popularity as they overcome certain shortcomings of ILP-based systems. While having applications in numerous domains, ILP-based concept discovery systems still suffer from scalability and efficiency issues. These issues generally arise from the large search spaces such systems build. In this work we propose memoization-based and parallelization-based methods that modify the search space construction and evaluation steps of ILP-based concept discovery systems to overcome these problems.
In this work we propose three memoization-based methods, called Tabular CRIS, Tabular CRIS-wEF, and Selective Tabular CRIS. In these methods, evaluation queries are stored in look-up tables for later use. While preserving some core functions in common, each proposed method improves the efficiency and scalability of its predecessor by introducing constraints on what kind of evaluation queries to store in the look-up tables and for how long.
The proposed parallelization method, called pCRIS, parallelizes the search space construction and evaluation steps of ILP-based concept discovery systems in a data-parallel manner. The proposed method introduces policies to minimize redundant work and the waiting time among workers at synchronization points.
Graph-based approaches were first introduced to the concept discovery domain to handle the so-called local plateau problem. They have recently gained more popularity in concept discovery, as they provide a convenient environment for representing relational data and are able to overcome certain shortcomings of ILP-based concept discovery systems. Graph-based approaches can be classified into structure-based approaches and path-finding approaches. The first class of approaches needs to employ expensive algorithms, such as graph isomorphism, to find frequently appearing substructures. The methods that fall into the second class need to employ sophisticated indexing mechanisms to find the frequently appearing paths that connect nodes of interest. In this work, we also propose a hybrid method for graph-based concept discovery which requires neither costly substructure matching algorithms nor a path indexing mechanism. The proposed method builds the graph in such a way that similar facts are grouped together, and paths that eventually turn out to be concept descriptors are built while the graph is constructed.
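The look-up-table idea behind the memoization-based methods can be sketched as a small cache wrapper: an evaluation query is run once, stored, and answered from the table on repeat requests. This is a generic sketch in the spirit of the Tabular CRIS idea; the class name, the size-based storage constraint and the hit/miss counters are illustrative, not the systems' actual policies.

```python
class EvaluationCache:
    """Look-up table for concept-discovery evaluation queries: identical
    queries are answered from the table instead of being re-evaluated."""
    def __init__(self, max_entries=1000):
        self.table = {}
        self.max_entries = max_entries
        self.hits = 0
        self.misses = 0

    def evaluate(self, query, run_query):
        """Return the cached result for 'query', or run it and store the
        result (subject to a simple capacity constraint)."""
        if query in self.table:
            self.hits += 1
            return self.table[query]
        self.misses += 1
        result = run_query(query)
        if len(self.table) < self.max_entries:  # illustrative storage policy
            self.table[query] = result
        return result
```

Varying what is admitted to the table and how long it stays there, as the three proposed methods do, trades memory for the number of evaluation queries actually executed.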
|