About: The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Exact Methods for Solving Single-Vehicle Pickup and Delivery Problems in Real Time

O'Neil, Ryan James, 02 March 2019
The Traveling Salesman Problem with Pickup and Delivery (TSPPD) describes the problem of finding a minimum cost path in which pickups precede their associated deliveries. The TSPPD is particularly important in the growing field of Dynamic Pickup and Delivery Problems (DPDP). These include the many-to-many Dial-A-Ride Problems (DARP) of companies such as Uber and Lyft, and the Meal Delivery Routing Problem (MDRP) of Grubhub. We examine exact methods for solving TSPPDs where orders from different pickup locations can be carried simultaneously. Our interest lies in solving such problems for real-time applications, in which finding high quality solutions quickly (often in less than a second) is more important than proving the optimality of such solutions.

We begin by considering enumeration, Constraint Programming (CP) with and without Assignment Problem (AP) inference duals, Mixed Integer Programming (MIP), and hybrid methods combining CP and MIP. Our CP formulations examine multiple techniques for ensuring pickup and delivery precedence relationships. We then propose a new MIP formulation for the Asymmetric Traveling Salesman Problem with Pickup and Delivery, along with valid inequalities for the Sarin-Sherali-Bhootra formulation. We study these models in their complete forms, relax complicating constraints of these models, and compare their performance. Finally, we examine the use of low-width Multivalued Decision Diagrams (MDDs) in a branch-and-bound with and without AP inference duals as a primal heuristic for finding high quality solutions to TSPPDs within strict time budgets.

In our results and conclusions, we attempt to provide guidance about which of these methods may be most appropriate for fast TSPPD solving given various time budgets and problem sizes. We present computational results showing the promise of our new MIP formulations when applied to pickup and delivery problems. Finally, we show that hybridized low-width MDDs can be more effective than similarly structured hybrid CP techniques for real-time combinatorial decision making.
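The defining constraint of the TSPPD, that each pickup precede its associated delivery, can be illustrated with a small enumeration-based solver in the spirit of the baseline methods described in the abstract. The instance below (node numbering, line-metric distances) is hypothetical, not one of the dissertation's benchmarks:

```python
from itertools import permutations

def solve_tsppd(nodes, dist, pickup_of):
    """Enumerate all tours starting and ending at depot 0; keep the
    cheapest tour in which every pickup precedes its delivery."""
    best_tour, best_cost = None, float("inf")
    for perm in permutations(nodes):
        tour = (0,) + perm + (0,)
        pos = {n: i for i, n in enumerate(tour)}
        # Precedence check: pickup_of maps each delivery to its pickup.
        if all(pos[p] < pos[d] for d, p in pickup_of.items()):
            c = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
            if c < best_cost:
                best_tour, best_cost = tour, c
    return best_tour, best_cost

# Hypothetical 3-request instance on a line metric: pickups 1-3, deliveries 4-6.
dist = [[abs(i - j) for j in range(7)] for i in range(7)]
tour, cost = solve_tsppd([1, 2, 3, 4, 5, 6], dist, {4: 1, 5: 2, 6: 3})
```

Enumeration like this is viable only for very small instances; the CP, MIP, and MDD methods studied in the dissertation exist precisely because the count of precedence-feasible tours grows factorially.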
2

A Multivariate Bayesian Approach to Modeling Vulnerability Discovery in the Software Security Lifecycle

Johnston, Reuben Aaron, 31 July 2018
Software vulnerabilities that enable well-known exploit techniques for committing computer crimes are preventable, but they continue to be present in releases. When Blackhats (i.e., malicious researchers) discover these vulnerabilities, they oftentimes release corresponding exploit software and malware. If vulnerabilities, or discoveries of them, are not prevented, mitigated, or addressed, customer confidence could be reduced. In addressing the issue, software-makers must choose which mitigation alternatives will provide maximal impact and use vulnerability discovery modeling (VDM) techniques to support their decision-making process. In the literature, applications of these techniques have used traditional approaches to analysis and, despite the dearth of data, have not included information from experts and do not include influential variables describing the software release (SR) (e.g., code size and complexity characteristics) and security assessment profile (SAP) (e.g., security team size or skill). Consequently, they have been limited to modeling discoveries over time for SR and SAP scenarios of unique products, whose results are not readily comparable without making assumptions that equate all SR and SAP combinations under study.

This research takes an alternative approach, applying Bayesian methods to modeling the vulnerability-discovery phenomenon. Relevant data were obtained from expert judgment (i.e., information elicited from security experts in structured workshops) and from public databases. The open-source framework, MCMCBayes, was developed to perform Bayesian model averaging (BMA). It combines predictions of interval-grouped discoveries by performance-weighting results from six variants of the non-homogeneous Poisson process, two regression models, and two growth-curve models. Utilizing expert judgment also enables forecasting expected discoveries over time for arbitrary SR and SAP combinations, thus helping software-makers to better understand the effects of influential variables they control on the phenomenon. This requires defining variables that describe arbitrary SR and SAP combinations, as well as constructing VDM extensions that parametrically scale results from a defined baseline SR and SAP to the arbitrary SR and SAP of interest. Scaling parameters were estimated using elicited multivariate data gathered with a novel paired-comparison approach. MCMCBayes uses the multivariate data with the BMA model for the baseline to perform predictions for desired SR and SAP combinations and to demonstrate how multivariate VDM techniques could be used. The research is applicable to software-makers and persons interested in applications of expert-judgment elicitation or those using Bayesian analysis techniques with phenomena having non-decreasing counts over time.
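The performance-weighted combination at the core of BMA can be sketched in a few lines. The per-model predictions and scores below are placeholders, not MCMCBayes output:

```python
import numpy as np

def bma_combine(predictions, scores):
    """Combine per-model predictions of interval-grouped discovery counts
    by performance weighting: normalize the model scores into weights
    w_m, then form the weighted average sum_m w_m * pred_m."""
    w = np.asarray(scores, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(predictions, dtype=float)

# Hypothetical: three models predict discoveries in four time intervals.
preds = [[5, 4, 3, 2], [6, 5, 3, 1], [4, 4, 4, 4]]
combined = bma_combine(preds, scores=[0.5, 0.3, 0.2])
```

The weighting means no single model's functional form (Poisson process, regression, or growth curve) has to be right on its own; the averaged forecast hedges across all of them.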
3

Computational Models for Scheduling in Online Advertising

Arkhipov, Dmitri I., 27 October 2016
Programmatic advertising is an actively developing industry and research area. Some of the research in this area concerns the development of optimal or approximately optimal contracts and policies between publishers, advertisers, and intermediaries such as ad networks and ad exchanges. Both the development of contracts and the construction of policies governing their implementation are difficult challenges, and different models take different features of the problem into account. In programmatic advertising, decisions are made in real time, and time is a scarce resource, particularly for publishers who are concerned with content load times. Policies for advertisement placement must execute very quickly once content is requested; this requires policies either to be pre-computed and accessed as needed, or to execute very efficiently. We formulate a stochastic optimization problem for per-publisher ad sequencing with binding latency constraints. Within our context, an ad request lifecycle is modeled as a sequence of one-by-one solicitation (OBOS) subprocesses/lifecycle stages. From the viewpoint of a supply side platform (SSP) (an entity acting in proxy for a collection of publishers), the duration/span of a given lifecycle stage/subprocess is a stochastic variable. This stochasticity is due both to the stochasticity inherent in Internet delay times and to the lack of information regarding the decision processes of independent entities. In our work we model the problem facing the SSP, namely the problem of optimally or near-optimally choosing the next lifecycle stage of a given ad request lifecycle at any given time. We solve this problem to optimality (subject to the granularity of time) using a classic application of Richard Bellman's dynamic programming approach to the 0/1 Knapsack Problem. The DP approach does not scale to a large number of lifecycle stages/subprocesses, so a sub-optimal approach is needed.

We use our DP formulation to derive a focused real time dynamic programming (FRTDP) implementation, a heuristic method with optimality guarantees for solving our problem. We empirically evaluate (through simulation) the performance of our FRTDP implementation relative both to the DP implementation (for tractable instances) and to several alternative heuristics for intractable instances. Finally, we make the case that our work is usefully applicable to problems outside the domain of online advertising.
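The Bellman recursion referenced above is the standard one for the 0/1 knapsack; a minimal sketch with generic items (not the dissertation's ad-sequencing state space):

```python
def knapsack(values, weights, capacity):
    """Bellman's DP for the 0/1 knapsack: best[c] holds the maximum value
    achievable with total weight at most c after each item is considered."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities in reverse so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]
```

The O(n·C) table is exactly what stops scaling as the number of stages and the time granularity grow, which is what motivates the FRTDP heuristic.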
4

A Vector Parallel Branch and Bound Algorithm

Guilbeau, Jared T., 13 September 2017
Global optimization problems sometimes attain their extrema on infinite subsets of the search space, forcing mathematically rigorous programs to require large amounts of data to describe these sets. This makes these programs natural candidates for both vectorization methods and parallel computing. Here, we give a brief overview of parallel computing and vectorization methods, exploit their availability by constructing a fully distributed implementation of a mathematically rigorous Vector Parallel Branch and Bound Algorithm using MATLAB's SPMD architecture and interval arithmetic, and analyze the performance of the algorithm across different methods of inter-processor communication.
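A serial, scalar sketch of the interval branch-and-bound idea: bound the objective over a box with interval arithmetic, tighten an upper bound with midpoint evaluations, and discard boxes whose lower bound exceeds it. The dissertation's algorithm is vectorized and distributed in MATLAB; the one-dimensional objective here is a hypothetical stand-in:

```python
import heapq

def interval_sq(lo, hi):
    """Interval extension of x**2."""
    if lo <= 0 <= hi:
        return 0.0, max(lo * lo, hi * hi)
    a, b = lo * lo, hi * hi
    return min(a, b), max(a, b)

def f_interval(lo, hi):
    # Hypothetical objective f(x) = (x - 2)**2 as an interval extension.
    return interval_sq(lo - 2, hi - 2)

def f_point(x):
    return (x - 2) ** 2

def interval_branch_and_bound(lo, hi, tol=1e-9):
    """Best-first branch and bound: the heap orders boxes by their interval
    lower bound; midpoint evaluations tighten the upper bound; boxes whose
    lower bound exceeds it can never contain the global minimum."""
    heap = [(f_interval(lo, hi)[0], lo, hi)]
    ub = f_point((lo + hi) / 2)
    while heap:
        flo, a, b = heapq.heappop(heap)
        if flo > ub:            # rigorous pruning step
            continue
        if b - a < tol:         # enclosure of the minimum is tight enough
            return flo, ub
        m = (a + b) / 2
        ub = min(ub, f_point(m))
        for c, d in ((a, m), (m, b)):
            heapq.heappush(heap, (f_interval(c, d)[0], c, d))
    return None
```

The rigor comes from the interval bounds: the returned pair encloses the true minimum regardless of rounding in the point evaluations, which is the property the parallel MATLAB implementation preserves at scale.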
5

Optimal supply chain configuration for the additive manufacturing of biomedical implants

Emelogu, Adindu Ahurueze, 11 January 2017
In this dissertation, we study two important problems related to additive manufacturing (AM). In the first part, we investigate the economic feasibility of using AM to fabricate biomedical implants at the sites of hospitals versus traditional manufacturing (TM). We propose a cost model to quantify the supply-chain level costs associated with the production of biomedical implants using AM technology, and formulate the problem as a two-stage stochastic programming model, which determines the number of AM facilities to be established and the volume of product flow between manufacturing facilities and hospitals at a minimum cost. We use the sample average approximation (SAA) approach to obtain solutions to the problem for a real-world case study of hospitals in the state of Mississippi. We find that the ratio between the unit production costs of AM and TM (ATR), demand, and product lead time are key cost parameters that determine the economic feasibility of AM.

In the second part, we investigate the AM facility deployment approaches which affect both the supply chain network cost and the extent of benefits derived from AM. We formulate the supply chain network cost as a continuous approximation model and use optimization algorithms to determine how centralized or distributed the AM facilities should be and how much raw material these facilities should order so that the total network cost is minimized. We apply the cost model to a real-world case study of hospitals in 12 states of the southeastern USA. We find that the demand for biomedical implants in the region, the fixed investment cost of AM machines, the personnel cost of operating the machines, and transportation cost are the major factors that determine the optimal AM facility deployment configuration.

In the last part, we propose an enhanced sample average approximation (eSAA) technique that improves the basic SAA method. The eSAA technique uses clustering and statistical techniques to overcome the sample size issue inherent in basic SAA. Our results from extensive numerical experiments indicate that eSAA can perform up to 699% faster than basic SAA, making it a competitive solution approach for large-scale stochastic optimization problems.
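The SAA idea, replacing the expectation in a stochastic program with an average over sampled scenarios, can be sketched on a toy newsvendor-style problem. The facility-location model in the dissertation is far richer; all numbers here are hypothetical:

```python
import random

def saa_solve(sample_demand, n_scenarios, candidates, price, unit_cost, seed=0):
    """Sample average approximation: draw n_scenarios demand samples once,
    then pick the candidate decision maximizing the *average* profit over
    those samples, a deterministic proxy for the true expectation."""
    rng = random.Random(seed)
    scenarios = [sample_demand(rng) for _ in range(n_scenarios)]

    def avg_profit(q):
        return sum(price * min(q, d) - unit_cost * q for d in scenarios) / n_scenarios

    return max(candidates, key=avg_profit)

# Hypothetical data: uniform integer demand on [0, 100], sell at 2, buy at 1.
q_hat = saa_solve(lambda r: r.randint(0, 100), 2000, range(101), 2.0, 1.0)
```

For these parameters the critical-fractile optimum is q* = 50, and the SAA solution converges to it as the sample count grows; the sample-size burden of that convergence is precisely what the eSAA clustering above is aimed at reducing.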
6

Two combinatorial optimization problems at the interface of computer science and operations research /

Kao, Gio K., January 2008
Thesis (Ph.D.), University of Illinois at Urbana-Champaign, 2008. Source: Dissertation Abstracts International, Volume 69-11, Section B, page 6920. Adviser: Sheldon Jacobson. Includes bibliographical references (leaves 120-128). Available on microfilm from ProQuest Information and Learning.
7

An Investigation of Anomaly-based Ensemble Models for Multi-domain Intrusion Detection

Mikhail, Joseph W., 29 November 2018
Although the traditional intrusion detection problem has been well studied with the release of the KDD'99 and NSL-KDD datasets, recent intrusion detection has expanded to include wireless 802.11 networks and Industrial Control Systems & Supervisory Control and Data Acquisition (ICS/SCADA) systems. This research investigates the application of two novel models to multi-domain intrusion detection. The first model is a hybrid ensemble that uses complementary-based diversity measures in an efficient greedy search pruning process. The proposed hybrid ensemble is constructed from a heterogeneous combination of decision tree and Naive Bayes classifiers and evaluated for intrusion detection performance on an 802.11 wireless system, a power generation system, and a gas pipeline system. The second model is based on a one-versus-all (OVA) binary framework comprising multiple nested sub-ensembles. To provide good generalization ability, each sub-ensemble contains a collection of sub-learners, and only a portion of the sub-learners implement boosting. A class weight based on the sensitivity metric (true positive rate), learned from the training data only, is assigned to the sub-ensembles of each class. The second model is applied to traditional and 802.11 wireless network intrusion detection. Overall, the proposed models achieve higher detection rates and good overall false-positive performance compared to state-of-the-art methods for effective multi-domain intrusion detection.
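A greedy, diversity-aware pruning pass in the spirit of the first model can be sketched as follows. The prediction vectors are toy data, not the KDD or SCADA pipelines, and accuracy-first selection with a disagreement tie-break is just one simple stand-in for a complementarity measure:

```python
import numpy as np

def disagreement(p, q):
    """Fraction of samples on which two ensemble members disagree."""
    return float(np.mean(p != q))

def greedy_prune(preds, y, k):
    """Grow a sub-ensemble of size k greedily: seed with the most accurate
    member, then repeatedly add the member that maximizes majority-vote
    accuracy, breaking ties by average disagreement with those chosen."""
    preds = [np.asarray(p) for p in preds]
    y = np.asarray(y)
    chosen = [int(np.argmax([np.mean(p == y) for p in preds]))]
    while len(chosen) < k:
        best_i, best_key = None, None
        for i in range(len(preds)):
            if i in chosen:
                continue
            votes = np.mean([preds[j] for j in chosen + [i]], axis=0)
            acc = float(np.mean((votes > 0.5) == y))
            div = float(np.mean([disagreement(preds[i], preds[j]) for j in chosen]))
            if best_key is None or (acc, div) > best_key:
                best_i, best_key = i, (acc, div)
        chosen.append(best_i)
    return chosen
```

The point of the diversity term is visible even on tiny inputs: a member that merely echoes the current ensemble adds nothing to the majority vote, while a complementary one can correct its mistakes.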
8

A Coherent Classifier/Prediction/Diagnostic Problem Framework and Relevant Summary Statistics

Eiland, E. Earl, 23 November 2017
Classification is a ubiquitous decision activity. Regardless of whether it is predicting the future (e.g., a weather forecast), determining an existing state (e.g., a medical diagnosis), or some other activity, classifier outputs drive future actions. Because of their importance, classifier research and development is an active field.

Regardless of whether one is a classifier developer or an end user, evaluating and comparing classifier output quality is important. Intuitively, classifier evaluation may seem simple; however, it is not. There is a plethora of classifier summary statistics, and new summary statistics seem to surface regularly. Summary statistic users appear not to be satisfied with the existing summary statistics. For end users, many existing summary statistics do not provide actionable information. This dissertation addresses the end user's quandary.

The work consists of four parts:

1. Considering eight summary statistics with regard to their purpose (what questions they quantitatively answer) and efficacy (as defined by measurement theory).
2. Characterizing the classification problem from the end user's perspective and identifying four axioms for end-user-efficacious classifier evaluation summary statistics.
3. Applying the axioms and measurement theory to evaluate eight summary statistics and create two compliant (end-user efficacious) summary statistics.
4. Using the compliant summary statistics to show the actionable information they generate.

By applying the recommendations in this dissertation, both end users and researchers benefit. Researchers have summary statistic selection and classifier evaluation protocols that generate the most usable information. End users can also generate information that facilitates tool selection and optimal deployment, if classifier test reports provide the necessary information.
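For concreteness, a handful of the familiar confusion-matrix summary statistics. The dissertation evaluates eight and derives new compliant ones; these are just the standard textbook definitions:

```python
def summary_stats(tp, fp, fn, tn):
    """A few common classifier summary statistics computed from the
    confusion-matrix counts; each answers a different end-user question."""
    total = tp + fp + fn + tn
    stats = {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "precision": tp / (tp + fp),     # positive predictive value
    }
    stats["f1"] = 2 * stats["precision"] * stats["sensitivity"] / (
        stats["precision"] + stats["sensitivity"])
    return stats
```

That the same four counts yield such different-looking numbers is a small illustration of the dissertation's point: which statistic answers the end user's actual question depends on the decision being made.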
9

Dynamics of global supply chain and electric power networks: Models, pricing analysis, and computations

Matsypura, Dmytro, 01 January 2006
In this dissertation, I develop a new theoretical framework for the modeling, pricing analysis, and computation of solutions to electric power supply chains with power generators, suppliers, transmission service providers, and the inclusion of consumer demands. In particular, I advocate the application of finite-dimensional variational inequality theory, projected dynamical systems theory, game theory, network theory, and other tools that have been recently proposed for the modeling and analysis of supply chain networks (cf. Nagurney (2006)) to electric power markets. This dissertation contributes to the extant literature on the modeling, analysis, and solution of supply chain networks, including global supply chains, in general, and electric power supply chains, in particular, in the following ways. It develops a theoretical framework for modeling, pricing analysis, and computation of electric power flows/transactions in electric power systems using the rationale for supply chain analysis. The models developed include both static and dynamic ones. The dissertation also adds a new dimension to the methodology of the theory of projected dynamical systems by proving that, irrespective of the speeds of adjustment, the equilibrium of the system remains the same. Finally, I include alternative fuel suppliers, along with their behavior, into the supply chain modeling and analysis framework. This dissertation has strong practical implications. In an era in which technology and globalization, coupled with increasing risk and uncertainty, complicate electricity demand and supply within and between nations, the successful management of electric power systems and pricing become increasingly pressing topics with relevance not only for economic prosperity but also for national security. This dissertation addresses such related topics by providing models, pricing tools, and algorithms for decentralized electric power supply chains.
This dissertation is based heavily on the following coauthored papers: Nagurney, Cruz, and Matsypura (2003), Nagurney and Matsypura (2004, 2005, 2006), Matsypura and Nagurney (2005), Matsypura, Nagurney, and Liu (2006).
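The discrete-time scheme underlying projected dynamical systems can be sketched as an Euler iteration whose fixed points solve the associated variational inequality. The two-commodity linear map below is a hypothetical stand-in, not one of the dissertation's electric power network models:

```python
import numpy as np

def project_nonneg(x):
    """Projection onto the nonnegative orthant (the feasible set here)."""
    return np.maximum(x, 0.0)

def euler_pds(F, x0, step=0.2, iters=500):
    """Discrete-time Euler scheme for a projected dynamical system:
    x_{k+1} = P_K(x_k - step * F(x_k)). Its stationary points coincide
    with the solutions of the variational inequality VI(F, K)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = project_nonneg(x - step * F(x))
    return x

# Hypothetical linear "market" map F(x) = A x + b with equilibrium [1, 1].
A = np.array([[2.0, 1.0], [1.0, 2.0]])
b = np.array([-3.0, -3.0])
F = lambda x: A @ x + b
```

For this monotone F the iterates contract toward the equilibrium from any nonnegative start, which mirrors the dissertation's point that the equilibrium itself does not depend on the speeds of adjustment.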
10

Evolutionary algorithms for statistics and finance

Karavas, Vassilios N, 01 January 2003
Several models in econometrics and finance have been proven to be computationally intractable due to their complexity. In this dissertation, we propose an evolutionary (genetic) algorithm for solving these types of problems. We extend the models so that less restrictive assumptions are required, and we cope with the increased complexity by using a modified version of the evolutionary algorithm proposed for the simpler cases. More specifically, we study more closely the estimation of switching regression models as introduced by Quandt (1958). The applicability of the proposed algorithms is examined through disequilibrium models: models that provide supply and demand functions for markets when the price is not adjusted so that the quantity supplied equals the quantity demanded. We focus on the computational aspect of the deterministic switching regression models and suggest a self-evolving genetic algorithm for solving these types of problems. As an illustration, we present results from Monte Carlo simulations, and thereafter we apply the algorithm to the disequilibrium model proposed for the gasoline market during the “energy crisis”. We further extend the “general model” for markets in disequilibrium by incorporating dynamic relationships, and we examine the applicability of the proposed genetic algorithm in this more complex and realistic problem. Subsequently, the proposed genetic algorithm for markets in disequilibrium is applied to financial models, where the structure and computational complexity are comparable with those of the switching regression models. As an example, we apply the algorithm to minimizing portfolio tracking error with respect to a pre-specified index. The proposed genetic algorithm possesses unique characteristics that maximize the fitness of the algorithm itself for each individual problem. This is achieved through a self-evolving process that teaches the genetic algorithm what internal parameters improve the algorithm's fitness.
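A plain real-coded genetic algorithm for the tracking-error application might look like the sketch below. It is a generic GA with fixed internal parameters, not the self-evolving variant, and the three-asset return data and index weights are hypothetical:

```python
import random

WSTAR = [0.5, 0.3, 0.2]  # hypothetical index-replicating weights
RETURNS = [[0.9, 0.1, 0.2], [0.3, 1.1, 0.4], [0.2, 0.5, 0.8]]  # periods x assets

def tracking_error(w):
    """Squared tracking error of portfolio w against the index over the
    sample periods; zero exactly when w reproduces the index returns."""
    return sum(
        sum(r * (wi - ti) for r, wi, ti in zip(row, w, WSTAR)) ** 2
        for row in RETURNS
    )

def ga_minimize(fitness, dim, pop_size=60, gens=200, sigma=0.05, seed=1):
    """Generic real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and renormalization onto valid weight vectors."""
    rng = random.Random(seed)

    def normalize(w):  # nonnegative weights summing to one
        w = [max(x, 0.0) for x in w]
        s = sum(w) or 1.0
        return [x / s for x in w]

    def tournament():
        return min(rng.sample(pop, 3), key=fitness)

    pop = [normalize([rng.random() for _ in range(dim)]) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:2]  # elitism: carry the two best forward unchanged
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            u = rng.random()
            child = [u * x + (1 - u) * y + rng.gauss(0.0, sigma)
                     for x, y in zip(a, b)]
            nxt.append(normalize(child))
        pop = nxt
    return min(pop, key=fitness)

best = ga_minimize(tracking_error, dim=3)
```

The fixed mutation scale and tournament size here are exactly the kind of internal parameters the dissertation's self-evolving process would tune on the fly per problem instance.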
