1 |
From syllable to meaning: effects of knowledge of syllable in learning the meaning bearing units of language. Coltekin, Cagri, 01 December 2006.
This thesis investigates the role of the syllable, a non-meaning-bearing unit, in learning higher-level meaning-bearing units---the lexical items of language. A computational model has been developed to learn the meaning-bearing units of the language, assuming knowledge of syllables. The input to the system consists of words marked at syllable boundaries together with their meanings. Using a statistical learning algorithm, the model discovers the meaning-bearing elements with their respective syntactic categories. The model's success has been tested against a second model trained on the same corpus segmented at morpheme boundaries. The lexicons learned by the two models have been found to be similar, with an exact overlap of 71%.
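To make the flavor of such a statistical learner concrete, the toy sketch below scores contiguous syllable sequences by how strongly they co-occur with meanings. The mini-language, its syllables and meanings are invented, and the pointwise-mutual-information scoring is only an illustrative stand-in for the thesis's actual algorithm.

```python
# Toy sketch: associate candidate syllable sequences with meanings.
# All data are invented; the scoring is simple PMI, not the thesis's model.
import math
from collections import Counter
from itertools import combinations

# (syllabified word, meaning label) pairs in an invented mini-language
corpus = [
    (["ka", "to"], "dog"),
    (["ka", "to", "lar"], "dogs"),
    (["mi", "su"], "cat"),
    (["mi", "su", "lar"], "cats"),
]

pair_counts = Counter()     # (candidate unit, meaning) co-occurrence counts
unit_counts = Counter()     # how many words contain each candidate unit
meaning_counts = Counter()  # how often each meaning occurs

for syllables, meaning in corpus:
    meaning_counts[meaning] += 1
    # every contiguous syllable sequence is a candidate meaning-bearing unit
    for i, j in combinations(range(len(syllables) + 1), 2):
        unit = tuple(syllables[i:j])
        unit_counts[unit] += 1
        pair_counts[(unit, meaning)] += 1

n = len(corpus)

def pmi(unit, meaning):
    """Pointwise mutual information between a candidate unit and a meaning."""
    joint = pair_counts[(unit, meaning)] / n
    return math.log(joint / ((unit_counts[unit] / n) * (meaning_counts[meaning] / n)))

scored = [(u, round(pmi(u, "dog"), 3)) for u in unit_counts if pair_counts[(u, "dog")]]
for unit, score in sorted(scored, key=lambda t: -t[1]):
    print(unit, score)
```

The thesis's model goes further, inferring syntactic categories for the discovered units; the snippet only shows the co-occurrence idea on which such learners rest.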
|
2 |
Local Modeling Of The Ionospheric Vertical Total Electron Content (VTEC) Using Particle Filter. Aghakarimi, Armin, 01 September 2012.
M.Sc. thesis, Department of Geodetic and Geographic Information Technologies. Supervisor: Prof. Dr. Mahmut Onur Karslioglu. September 2012, 98 pages.
Ionosphere modeling is an important field of current research because of the ionosphere's influence on the propagation of electromagnetic signals. Among the various methods of obtaining ionospheric information, the Global Positioning System (GPS) is the most prominent because of the extensive network of stations distributed all over the world. There are several studies in the literature on modeling the ionosphere in terms of Total Electron Content (TEC). However, most of these studies investigate the ionosphere at global and regional scales, whereas the complex dynamics of the ionosphere call for further study of the local structure of the TEC distribution. In this work, a particle filter has been used to investigate the local character of the ionospheric VTEC. In addition, the standard Kalman filter, an effective method for optimal state estimation, is applied to the same data sets so that its results can be compared with those of the particle filter. The comparison shows that the particle filter performs better than the standard Kalman filter, especially during geomagnetic storms. MATLAB R2011 has been used for programming all processes and algorithms of the study.
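For readers unfamiliar with the method, the sketch below shows a generic bootstrap particle filter tracking a scalar random-walk state from noisy observations. The state model, noise levels and data are assumptions for illustration only and are unrelated to the thesis's VTEC model.

```python
# Minimal bootstrap particle filter on a scalar state (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
T, n_particles = 200, 500
q, r = 0.05, 0.5             # assumed process and measurement noise std. devs.

# simulate a slowly varying "true" state and noisy observations of it
truth = np.cumsum(rng.normal(0.0, q, T)) + 10.0
obs = truth + rng.normal(0.0, r, T)

particles = rng.normal(10.0, 1.0, n_particles)   # initial particle cloud
estimates = np.empty(T)

for t in range(T):
    # propagate particles through the (random-walk) state model
    particles += rng.normal(0.0, q, n_particles)
    # weight particles by the Gaussian likelihood of the new observation
    w = np.exp(-0.5 * ((obs[t] - particles) / r) ** 2)
    w /= w.sum()
    estimates[t] = np.dot(w, particles)          # weighted-mean state estimate
    # systematic resampling to avoid weight degeneracy
    cum = np.cumsum(w)
    cum[-1] = 1.0
    positions = (rng.random() + np.arange(n_particles)) / n_particles
    particles = particles[np.searchsorted(cum, positions)]

print("RMSE of particle filter estimate:", np.sqrt(np.mean((estimates - truth) ** 2)))
```

Unlike the Kalman filter, this recursion makes no linearity or Gaussianity assumptions about the state dynamics, which is why it can cope better with disturbed conditions such as geomagnetic storms.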
|
3 |
A Mixed Integer Second Order Cone Programming Reformulation For A Congested Location And Capacity Allocation Problem On A Supply Chain Network. Mohammad, Salimian, 01 January 2013.
Supply chain network design involves location decisions for production facilities and distribution centers. We consider a make-to-order supply chain environment where distribution centers serve as cross-docking terminals. Long waiting times may occur at a cross-docking terminal unless sufficient handling capacity is installed. In this study, we deal with a facility location problem with congestion effects at distribution centers. Along with location decisions, we make capacity allocation (service rate) and demand allocation decisions so that the total cost, including facility opening, transportation and congestion costs, is minimized.

Response time to customer orders is a critical performance measure for a supply chain network. Decisions such as where the plants and distribution centers are located affect the response time of the system, and response time is more sensitive to these decisions in a make-to-order business environment. In a distribution network where distribution centers function as cross-docking terminals, the capacity or service rate decisions also affect the response time performance.

This study is closely related to a recent work by Vidyarthi et al. (2009), which models distribution centers as M/G/1 queuing systems. They use the average waiting time formula of the M/G/1 queuing model, so the average waiting time at a distribution center is a nonlinear function of the demand rate allocated to it and the service rate available at it. Vidyarthi et al. (2009) propose a linear approximation approach and a Lagrangian-based heuristic for the problem.

Different from the solution approach proposed in Vidyarthi et al. (2009), we propose a closed-form formulation for the problem. In particular, we show that the waiting time function derived from the M/G/1 queuing model can be represented via second order conic inequalities. The problem then becomes a mixed integer second order cone programming problem, which can be solved using commercial branch-and-bound software such as IBM ILOG CPLEX. Our computational tests show that the proposed reformulation can be solved in reasonable CPU times for practical-size instances.
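The sketch below illustrates the general idea of casting a congested location and allocation model as a mixed-integer SOCP, using the cvxpy modeling library with fixed (parameter) service rates and an M/M/1-style congestion term. The data, the simplifications and this particular formulation are assumptions for the demo; they do not reproduce the thesis's M/G/1 reformulation.

```python
# Toy congestion-aware facility location as a mixed-integer SOCP (illustrative).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_cust, n_dc = 8, 3
d = rng.uniform(1.0, 3.0, n_cust)            # customer demand rates (made up)
c = rng.uniform(1.0, 5.0, (n_cust, n_dc))    # unit transportation costs (made up)
f = np.array([40.0, 55.0, 50.0])             # fixed facility opening costs
mu = np.array([12.0, 15.0, 14.0])            # service rates, fixed parameters here
theta = 2.0                                  # weight on the congestion cost

x = cp.Variable((n_cust, n_dc), nonneg=True) # fraction of each demand sent to each DC
y = cp.Variable(n_dc, boolean=True)          # open/close decisions
w = cp.Variable(n_dc)                        # bounds on the congestion at each DC

lam = d @ x                                  # total arrival rate routed to each DC
constraints = [
    cp.sum(x, axis=1) == 1,                  # every customer's demand is fully served
    lam <= 0.95 * cp.multiply(mu, y),        # closed DCs receive nothing; utilization < 1
    # w_j >= lam_j / (mu_j - lam_j), rewritten as mu_j / (mu_j - lam_j) - 1;
    # cvxpy represents inv_pos through second-order cone constraints internally.
    w >= cp.multiply(mu, cp.inv_pos(mu - lam)) - 1,
]
cost = f @ y + cp.sum(cp.multiply(c * d[:, None], x)) + theta * cp.sum(w)
prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()   # requires a mixed-integer SOCP-capable solver, e.g. CPLEX or SCIP
print("open DCs:", np.rint(y.value), "total cost:", round(prob.value, 2))
```

In a full model the service rates would themselves be decision variables, which is exactly where a conic reformulation of the waiting time function, as developed in the thesis, becomes essential.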
|
4 |
Software Engineering Process Improvement. Sezer, Bulent, 01 April 2007.
This thesis presents a software engineering process improvement study. The literature on software process improvement is reviewed. Then the current design verification process at one of the Software Engineering Departments of the X Company, Ankara, Türkiye (SED) is analyzed. Static software development process metrics have been calculated for the SED based on a recently proposed approach, and improvement suggestions have been made based on the metric values calculated according to the proposals of that study. In addition, the author's improvement suggestions have been discussed with the senior staff of the department, and a final version of the improvements has been compiled. A discussion comparing these two approaches follows. Finally, a new software design verification process model has been proposed. Some of the suggestions have already been applied and preliminary results have been obtained.
|
5 |
Combinatorial Auction Problems. Baykal, Safak, 01 August 2007.
Electronic commerce is becoming more important day by day. Many transactions and much business are conducted electronically, and many people no longer want paperwork. When a firm wants to buy raw materials or components, it announces its need on related websites or in newspapers; similar demands and announcements can be seen almost everywhere nowadays. Consequently, auctions need to be performed as quickly and reliably as possible. On the other hand, buyers consider not only cost but also many other aspects such as quality, warranty period and lead time when they want to purchase something. This situation leads to more complex problems in the purchasing process.
As a consequence, some researchers started to consider auction mechanisms that
support bids characterized by several attributes in addition to the price (quality of
the product, quantity, terms of delivery, quality of the supplier etc.). These are
referred to as multi-attribute combinatorial auctions.
In this thesis, Combinatorial Auctions are analyzed. Single-attribute multi-unit,
multi-attribute multi-unit combinatorial auction models are studied and an
interactive method is applied for solving the multi-attribute multi-unit
combinatorial auction problem.
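As a concrete, if tiny, instance of the underlying optimization problem, the sketch below solves the winner-determination problem of a single-attribute (price-only), multi-unit reverse combinatorial auction by brute-force enumeration. The items, bundles and prices are invented for the demo.

```python
# Toy winner determination for a multi-unit reverse combinatorial auction:
# the buyer must cover the required quantity of every item at minimum cost,
# accepting or rejecting each bundle bid as a whole. Data are invented.
from itertools import combinations

required = {"steel": 10, "copper": 4}            # units the buyer needs
bids = [                                          # (supplier, offered bundle, price)
    ("S1", {"steel": 6, "copper": 2}, 70.0),
    ("S2", {"steel": 5, "copper": 0}, 40.0),
    ("S3", {"steel": 4, "copper": 4}, 65.0),
    ("S4", {"steel": 10, "copper": 1}, 95.0),
]

def covers(chosen):
    """True if the selected bids jointly supply every required quantity."""
    return all(sum(b[1].get(item, 0) for b in chosen) >= qty
               for item, qty in required.items())

best_cost, best_set = float("inf"), None
for r in range(1, len(bids) + 1):                 # brute force over bid subsets
    for chosen in combinations(bids, r):
        cost = sum(b[2] for b in chosen)
        if cost < best_cost and covers(chosen):
            best_cost, best_set = cost, chosen

print("winning bids:", [b[0] for b in best_set], "total cost:", best_cost)
```

For realistic numbers of bids this enumeration is hopeless, since winner determination is NP-hard; such problems are normally formulated as integer programs and handed to a solver, and the multi-attribute models studied in the thesis add further criteria beyond price.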
|
6 |
Evaluation And Comparison Of Helicopter Simulation Models With Different Fidelities. Yilmaz, Deniz, 01 July 2008.
This thesis concerns the development, evaluation, comparison and testing of a UH-1H helicopter
simulation model with various fidelity levels. In particular, the well known minimum
complexity simulation model is updated with various higher fidelity simulation components,
such as the Peters-He inflow model, horizontal tail contribution, improved tail rotor model,
control mapping, ground effect, fuselage interactions, ground reactions etc. Results are compared
with available flight test data. The dynamic model is integrated into the open source
simulation environment called FlightGear. Finally, the model is cross-checked through evaluations
using test pilots.
|
7 |
An Assessment Model For Web-based Information System Effectiveness. Tokdemir, Gul, 01 January 2009.
Information System (IS) effectiveness assessment is an important issue for organizations, as IS have become critical for their survival. With the incorporation of Internet technologies into the business environment, it is now more difficult to measure IS effectiveness, because the Internet provides a borderless, non-stop, flexible communication medium. Assessing the effectiveness of web-based information systems (WIS) is therefore vital for survival and competitive advantage, yet it is a complicated task, since there are several interacting factors to consider. Several methods have been proposed in the literature for IS assessment. However, those studies fall short of providing a broad, comprehensive evaluation framework for any type of web-based IS independent of its domain. In this study, a generic WIS effectiveness assessment framework is proposed. The framework is applied in case studies covering four organizations in the e-commerce and e-banking domains.
|
8 |
Spectral Modular Multiplication. Akin, Ihsan Haluk, 01 February 2008.
Spectral methods have been widely used in various fields of engineering and applied mathematics. In the field of computer arithmetic, data compression, polynomial multiplication and the spectral integer multiplication of Schönhage and Strassen are among their most important successful utilizations. Recent advancements suggest that spectral methods may also be beneficial for the modular operations heavily used in public key cryptosystems.

In this study, we evaluate the use of spectral methods in modular multiplication. We carefully compare their timing performances with respect to the full return algorithms. Based on our evaluation, we introduce new approaches for spectral modular multiplication for polynomials and exhibit standard reduction versions of the spectral modular multiplication algorithm for polynomials, eliminating the overhead of Montgomery's method.

Moreover, merging the bipartite method and the standard approach, we introduce the bipartite spectral modular multiplication to improve the hardware performance of spectral modular multiplication for polynomials. Finally, we introduce the Karatsuba combined bipartite method for polynomials and its spectral version.
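The sketch below shows only the basic spectral idea: two small polynomials are multiplied by pointwise multiplication of their FFT spectra, and the product is then reduced by schoolbook division. The example polynomials and modulus are assumptions for the demo; the thesis's spectral modular multiplication, the Montgomery-overhead elimination and the bipartite variants are not reproduced here.

```python
# Spectral (FFT-based) polynomial multiplication followed by naive modular
# reduction. Illustrative only; real spectral modular multipliers interleave
# the reduction with the spectral arithmetic.
import numpy as np

def spectral_poly_mult(a, b):
    """Multiply polynomials a and b (lowest-degree coefficient first)."""
    n = len(a) + len(b) - 1
    size = 1 << (n - 1).bit_length()          # FFT length: next power of two
    fa = np.fft.rfft(a, size)
    fb = np.fft.rfft(b, size)
    prod = np.fft.irfft(fa * fb, size)[:n]    # back to the coefficient domain
    return np.rint(prod).astype(int)          # round away floating-point noise

def poly_mod(p, m):
    """Schoolbook reduction of polynomial p modulo a monic polynomial m."""
    p = list(p)
    while len(p) >= len(m):
        lead = p[-1]
        shift = len(p) - len(m)
        for i, coef in enumerate(m):
            p[shift + i] -= lead * coef
        while p and p[-1] == 0:
            p.pop()
    return p

a = [3, 0, 2, 1]          # 3 + 2x^2 + x^3
b = [1, 4, 1]             # 1 + 4x + x^2
m = [1, 0, 0, 1]          # modulus x^3 + 1 (monic)
product = spectral_poly_mult(a, b)
print("a*b       :", list(product))
print("a*b mod m :", poly_mod(product, m))
```

Cryptographic implementations use number-theoretic transforms over finite fields rather than floating-point FFTs, so that no rounding error can occur; the floating-point version above is only safe for small coefficients.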
|
9 |
Induction And Control Of Large-scale Gene Regulatory Networks. Tan, Mehmet, 01 June 2009.
Gene regulatory networks model the interactions within the cell and thus it is essential to understand their structure and to develop some control mechanisms that could effectively deal with them. This dissertation tackles these two aspects. To handle the first problem, a new constraint-based modeling algorithm is proposed that can both increase the quality of the output and decrease the computational requirements for learning the structure of gene regulatory networks by integrating multiple biological data types and applying a special method for dense nodes in the network. Constraint-based structure learning algorithms generally perform well on sparse graphs and it is true that sparsity is not uncommon. However, some domains like gene regulatory networks are characterized by the possibility of having some dense regions in the underlying graph and the proposed algorithm is capable of dealing with this issue. The algorithm is based on a well-known structure learning algorithm called the PC algorithm, and extends it in multiple aspects. Once a network exists, we could address the second problem, namely control of the gene regulatory network for various applications where the curse of dimensionality is the main issue. It is possible that hundreds of genes may regulate one biological activity in an organism and this implies a huge state space even in the case of Boolean models. The thesis proposes effective methods to find control policies for large-scale networks. The modeling and control algorithms proposed in this dissertation have been evaluated on both synthetic and real data sets. The test results demonstrate the efficiency and effectiveness of the proposed approaches.
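As background for the structure-learning part, the sketch below implements the skeleton phase of the plain PC algorithm with a Fisher-z partial-correlation independence test on Gaussian toy data. The thesis's extensions, integrating multiple biological data types and treating dense nodes specially, are not included, and the chain-structured toy data are assumptions for the demo.

```python
# Skeleton phase of the (plain) PC algorithm on Gaussian toy data.
import itertools
import math
import numpy as np
from scipy import stats

def ci_test(data, i, j, cond, alpha=0.05):
    """Fisher-z partial-correlation test: are i and j independent given cond?"""
    idx = [i, j] + list(cond)
    corr = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.pinv(corr)                      # precision matrix
    r = -prec[0, 1] / math.sqrt(prec[0, 0] * prec[1, 1])
    r = min(0.999999, max(-0.999999, r))
    z = 0.5 * math.log((1 + r) / (1 - r))
    stat = math.sqrt(data.shape[0] - len(cond) - 3) * abs(z)
    return 2 * (1 - stats.norm.cdf(stat)) > alpha    # True => independent

def pc_skeleton(data, alpha=0.05):
    p = data.shape[1]
    adj = {v: set(range(p)) - {v} for v in range(p)}
    level = 0
    while any(len(adj[v]) - 1 >= level for v in adj):
        for i in range(p):
            for j in list(adj[i]):
                # condition on subsets of i's other neighbours of the current size
                for cond in itertools.combinations(adj[i] - {j}, level):
                    if ci_test(data, i, j, cond, alpha):
                        adj[i].discard(j)
                        adj[j].discard(i)
                        break
        level += 1
    return {(i, j) for i in adj for j in adj[i] if i < j}

# toy chain x0 -> x1 -> x2: the edge (0, 2) should vanish once we condition on x1
rng = np.random.default_rng(0)
x0 = rng.normal(size=2000)
x1 = 0.8 * x0 + rng.normal(size=2000)
x2 = 0.8 * x1 + rng.normal(size=2000)
print(pc_skeleton(np.column_stack([x0, x1, x2])))
```

The full PC algorithm goes on to orient the remaining edges; constraint-based learners of this kind scale poorly around densely connected nodes, which is precisely the weakness the dissertation's modeling algorithm addresses.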
|
10 |
Context Based Interoperability To Support Infrastructure Management In Municipalities. Tufan, Emrah, 01 September 2010.
Interoperability between the Geographic Information Systems (GIS) of different infrastructure companies is still a problem to be handled. Infrastructure companies deal with many operations as part of their daily routine, such as regular maintenance, and sometimes they deal with unexpected situations, such as a malfunction due to a natural event like a flood or an earthquake. These situations may affect all companies, and the affected infrastructure companies respond to their effects. The responses have consequences, and in order to model these consequences on a GIS, the GISs must be able to share information, which brings the interoperability problem into the picture.

The present research aims at finding an answer to the interoperability problem between the GISs of different companies by considering contextual information. Throughout the study, geographical features are handled as the major concern and the interoperability problem is examined with them as the target. The model constructed in this research is based on ontology, and because the meaning of the terms in an ontology depends on the context, ontology-based context modeling is also used.
In this research, a system implementation is done for two different GISs of two
|