1

Exact Methods for Solving Single-Vehicle Pickup and Delivery Problems in Real Time

O'Neil, Ryan James 02 March 2019 (has links)
The Traveling Salesman Problem with Pickup and Delivery (TSPPD) describes the problem of finding a minimum-cost path in which pickups precede their associated deliveries. The TSPPD is particularly important in the growing field of Dynamic Pickup and Delivery Problems (DPDP). These include the many-to-many Dial-A-Ride Problems (DARP) of companies such as Uber and Lyft, and the Meal Delivery Routing Problem (MDRP) of Grubhub. We examine exact methods for solving TSPPDs where orders from different pickup locations can be carried simultaneously. Our interest lies in solving such problems for real-time applications, in which finding high-quality solutions quickly (often in less than a second) is more important than proving the optimality of such solutions.

We begin by considering enumeration, Constraint Programming (CP) with and without Assignment Problem (AP) inference duals, Mixed Integer Programming (MIP), and hybrid methods combining CP and MIP. Our CP formulations examine multiple techniques for ensuring pickup and delivery precedence relationships. We then propose a new MIP formulation for the Asymmetric Traveling Salesman Problem with Pickup and Delivery, along with valid inequalities for the Sarin-Sherali-Bhootra formulation. We study these models in their complete forms, relax their complicating constraints, and compare their performance. Finally, we examine the use of low-width Multivalued Decision Diagrams (MDDs) in a branch-and-bound, with and without AP inference duals, as a primal heuristic for finding high-quality solutions to TSPPDs within strict time budgets.

In our results and conclusions, we attempt to provide guidance about which of these methods may be most appropriate for fast TSPPD solving given various time budgets and problem sizes. We present computational results showing the promise of our new MIP formulations when applied to pickup and delivery problems. Finally, we show that hybridized low-width MDDs can be more effective than similarly structured hybrid CP techniques for real-time combinatorial decision making.
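As a point of reference for the exact methods above, the sketch below is a naive brute-force TSPPD solver over a hypothetical five-node instance (the cost matrix and pickup/delivery pairs are invented for illustration); it is not one of the dissertation's formulations, but it makes the defining precedence rule explicit.

```python
"""Brute-force TSPPD on a toy instance: a minimal sketch, not the
dissertation's enumeration scheme. Keeps only tours in which every
pickup precedes its paired delivery."""
from itertools import permutations

# Hypothetical symmetric cost matrix over nodes 0..4 (node 0 is the depot).
cost = [
    [0, 3, 4, 2, 7],
    [3, 0, 4, 6, 3],
    [4, 4, 0, 5, 8],
    [2, 6, 5, 0, 6],
    [7, 3, 8, 6, 0],
]
pairs = [(1, 2), (3, 4)]  # (pickup, delivery) node pairs

def tour_cost(tour):
    return sum(cost[a][b] for a, b in zip(tour, tour[1:]))

best, best_tour = float("inf"), None
for perm in permutations([1, 2, 3, 4]):
    tour = (0, *perm, 0)  # start and end at the depot
    # Precedence constraint: each pickup must appear before its delivery.
    if all(tour.index(p) < tour.index(d) for p, d in pairs):
        c = tour_cost(tour)
        if c < best:
            best, best_tour = c, tour

print(best_tour, best)
```

Enumeration like this is exact but scales factorially, which is exactly why the abstract turns to CP, MIP, and MDD-based branch-and-bound for real-time use.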
2

A Multivariate Bayesian Approach to Modeling Vulnerability Discovery in the Software Security Lifecycle

Johnston, Reuben Aaron 31 July 2018 (has links)
Software vulnerabilities that enable well-known exploit techniques for committing computer crimes are preventable, but they continue to be present in releases. When Blackhats (i.e., malicious researchers) discover these vulnerabilities, they oftentimes release corresponding exploit software and malware. If vulnerabilities, or discoveries of them, are not prevented, mitigated, or addressed, customer confidence could be reduced. In addressing the issue, software-makers must choose which mitigation alternatives will provide maximal impact and use vulnerability discovery modeling (VDM) techniques to support their decision-making process. In the literature, applications of these techniques have used traditional approaches to analysis and, despite the dearth of data, have not included information from experts and do not include influential variables describing the software release (SR) (e.g., code size and complexity characteristics) and security assessment profile (SAP) (e.g., security team size or skill). Consequently, they have been limited to modeling discoveries over time for SR and SAP scenarios of unique products, whose results are not readily comparable without making assumptions that equate all SR and SAP combinations under study. This research takes an alternative approach, applying Bayesian methods to modeling the vulnerability-discovery phenomenon. Relevant data were obtained from expert judgment (i.e., information elicited from security experts in structured workshops) and from public databases. The open-source framework MCMCBayes was developed to perform Bayesian model averaging (BMA). It combines predictions of interval-grouped discoveries by performance-weighting results from six variants of the non-homogeneous Poisson process, two regression models, and two growth-curve models. Utilizing expert judgment also enables forecasting expected discoveries over time for arbitrary SR and SAP combinations, thus helping software-makers to better understand the effects of influential variables they control on the phenomenon. This requires defining variables that describe arbitrary SR and SAP combinations, as well as constructing VDM extensions that parametrically scale results from a defined baseline SR and SAP to the arbitrary SR and SAP of interest. Scaling parameters were estimated using elicited multivariate data gathered with a novel paired-comparison approach. MCMCBayes uses the multivariate data with the BMA model for the baseline to perform predictions for desired SR and SAP combinations and to demonstrate how multivariate VDM techniques could be used. The research is applicable to software-makers and persons interested in applications of expert-judgment elicitation or those using Bayesian analysis techniques with phenomena having non-decreasing counts over time.
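As a rough illustration of the model-averaging idea the abstract describes, the sketch below performance-weights two toy discovery curves: an exponential NHPP mean function and a logistic growth curve. The model forms, parameters, and weights are all hypothetical; in MCMCBayes the weights would come from Bayesian estimation, not be fixed by hand.

```python
"""Performance-weighted model averaging: a minimal sketch of the BMA idea,
not the MCMCBayes implementation. All parameters are illustrative."""
import numpy as np

t = np.arange(1, 13)  # months since release

# Toy model 1: exponential NHPP mean function for cumulative discoveries.
def nhpp_exponential(t, a=40.0, b=0.2):
    return a * (1.0 - np.exp(-b * t))

# Toy model 2: logistic growth curve for cumulative discoveries.
def logistic_growth(t, k=35.0, r=0.6, t0=6.0):
    return k / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical posterior model weights (estimated via MCMC in practice).
weights = np.array([0.7, 0.3])
preds = np.vstack([nhpp_exponential(t), logistic_growth(t)])

bma_prediction = weights @ preds  # weighted average prediction per month
print(np.round(bma_prediction, 1))
```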
3

Parallelization of Entity-Based Models in Computational Social Science: A Hardware Perspective

Brearcliffe, Dale K. 31 March 2018 (has links)
The use of simulations by social scientists in exploring theories and hypotheses is well documented. As computer systems have grown in capacity, so have the interests of social scientists in executing larger simulations. Social scientists often approach their simulation design from the top down by selecting an Entity-Based Model (EBM) framework from those that are readily available, thus limiting modeling capability to the available frameworks. Ultimately, the framework is dependent upon what is at the bottom: the hardware architecture that serves as the foundation of the computing system. Parallel hardware architecture supports the simultaneous execution of a problem split into multiple pieces; thus, the problem is solved faster in parallel. In this thesis, a selection of parallel hardware architectures is examined with a goal of providing support for EBMs. The hardware's capability to support parallelization of EBMs is described and contrasted. A simple EBM is tested to illustrate these capabilities, and implementation challenges specific to parallel hardware are explored. The results of this research offer social scientists better-informed choices than the sequential EBM frameworks that currently exist. Matching the model to the correct supporting hardware will permit larger-scale problems to be examined and expand the range of models that a social scientist can explore.
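A minimal sketch of the kind of data-parallel agent update an EBM framework maps onto parallel hardware, assuming a trivially decomposable step rule; the model, update rule, and worker count are invented for illustration and are not the thesis's benchmark model.

```python
"""Data-parallel agent stepping: a minimal sketch, assuming agents update
independently within a tick so their updates can run in parallel."""
from multiprocessing import Pool

def step_agent(state):
    # Toy update rule: each agent's state drifts toward a fixed point.
    return state + 0.1 * (50.0 - state)

if __name__ == "__main__":
    agents = [float(i) for i in range(100)]  # hypothetical agent states
    with Pool(processes=4) as pool:
        for tick in range(10):
            # One synchronized tick: all agents advance in parallel.
            agents = pool.map(step_agent, agents)
    print(agents[:5])
```

When agents interact within a tick, this simple decomposition breaks down, which is one of the implementation challenges the thesis contrasts across architectures.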
4

Computational Models for Scheduling in Online Advertising

Arkhipov, Dmitri I. 27 October 2016 (has links)
Programmatic advertising is an actively developing industry and research area. Some of the research in this area concerns the development of optimal or approximately optimal contracts and policies between publishers, advertisers, and intermediaries such as ad networks and ad exchanges. Both the development of contracts and the construction of policies governing their implementation are difficult challenges, and different models take different features of the problem into account. In programmatic advertising, decisions are made in real time, and time is a scarce resource, particularly for publishers who are concerned with content load times. Policies for advertisement placement must execute very quickly once content is requested; this requires policies either to be pre-computed and accessed as needed, or for the policy execution to be very efficient. We formulate a stochastic optimization problem for per-publisher ad sequencing with binding latency constraints. Within our context, an ad request lifecycle is modeled as a sequence of one-by-one solicitation (OBOS) subprocesses/lifecycle stages. From the viewpoint of a supply side platform (SSP) (an entity acting in proxy for a collection of publishers), the duration/span of a given lifecycle stage/subprocess is a stochastic variable. This stochasticity is due both to the stochasticity inherent in Internet delay times and to the lack of information regarding the decision processes of independent entities. In our work we model the problem facing the SSP, namely the problem of optimally or near-optimally choosing the next lifecycle stage of a given ad request lifecycle at any given time. We solve this problem to optimality (subject to the granularity of time) using a classic application of Richard Bellman's dynamic programming approach to the 0/1 Knapsack Problem. The DP approach does not scale to a large number of lifecycle stages/subprocesses, so a sub-optimal approach is needed. We use our DP formulation to derive a focused real-time dynamic programming (FRTDP) implementation, a heuristic method with optimality guarantees for solving our problem. We empirically evaluate (through simulation) the performance of our FRTDP implementation relative to both the DP implementation (for tractable instances) and to several alternative heuristics for intractable instances. Finally, we make the case that our work is usefully applicable to problems outside the domain of online advertising.
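The exact baseline the abstract cites is Bellman's classic dynamic program for the 0/1 Knapsack Problem. A textbook version of that DP appears below; the item values, weights, and capacity are hypothetical, and the ad-sequencing model itself layers stochastic stage durations on top of this scheme.

```python
"""Classic 0/1 knapsack dynamic program, the textbook building block the
abstract names; the instance data here are hypothetical."""

def knapsack(values, weights, capacity):
    # dp[c] = best value achievable with capacity c using items seen so far.
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacity downward so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack(values=[6, 10, 12], weights=[1, 2, 3], capacity=5))  # -> 22
```

The table has O(n * capacity) entries, which is why the abstract notes the DP fails to scale with many lifecycle stages and motivates the FRTDP heuristic.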
5

Influence and Authority of Information Sources in the Highlands: Exploring the Immigration Debate During the Scottish Independence Referendum

Stewart, Kristine N. 16 April 2019 (has links)
This dissertation examines the role of the mass news media as an influencer of opinions on immigration through an examination of the information sources used by host Highland community members. There is an extensive range of research exploring the experiences of immigrants and policy responses in the UK, but little is known about how host communities process and respond to increasing cultural diversity. Addressing the latter is essential to overcome the assimilation tendencies in discourses about the integration of immigrants. Critical discourse analysis was used to analyze newspapers and interviews in this mixed-methods study conducted in the year prior to the Scottish Independence Referendum. Findings of this study revealed the negative and homogenizing portrayal of immigrants in the mass news media, the importance of first- and second-hand experiences as sources of information on immigration in Scottish Highland communities, and the influence of sociocultural factors on how people establish the authority of information sources. Findings suggest the need for stronger institutional infrastructures to address increasing diversity in the UK. Of particular interest is the context of this research, during a time of crisis, which reveals that the act of decision-making is based on the often unconscious, ontological construction of information behaviors through the worldview of participants.
6

A Vector Parallel Branch and Bound Algorithm

Guilbeau, Jared T. 13 September 2017 (has links)
Global optimization problems sometimes attain their extrema on infinite subsets of the search space, forcing mathematically rigorous programs to require large amounts of data to describe these sets. This makes these programs natural candidates for both vectorization methods and parallel computing. Here, we give a brief overview of parallel computing and vectorization methods, exploit their availability by constructing a fully distributed implementation of a mathematically rigorous Vector Parallel Branch and Bound Algorithm using MATLAB's SPMD architecture and interval arithmetic, and analyze the performance of the algorithm across different methods of inter-processor communication.
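A serial, scalar sketch of interval branch and bound is shown below for a one-dimensional function; the thesis's algorithm is vectorized and distributed via MATLAB's SPMD, whereas this version hand-codes an interval enclosure for f(x) = x^2 - 2x purely for illustration.

```python
"""Interval branch and bound for global minimization: a minimal serial
sketch, not the thesis's distributed implementation. The enclosure below
is hand-derived for f(x) = x^2 - 2x."""
import heapq

def f(x):
    return x * x - 2.0 * x

def f_enclosure(lo, hi):
    # Rigorous range enclosure: range of x^2 minus range of 2x on [lo, hi].
    sq_lo = 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
    sq_hi = max(lo * lo, hi * hi)
    return sq_lo - 2.0 * hi, sq_hi - 2.0 * lo

def minimize(lo, hi, tol=1e-6):
    best = f((lo + hi) / 2.0)  # incumbent upper bound on the minimum
    heap = [(f_enclosure(lo, hi)[0], lo, hi)]
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb > best:            # box cannot contain the global minimum
            continue
        mid = (a + b) / 2.0
        best = min(best, f(mid))
        if b - a > tol:          # bisect and bound both halves
            for c, d in ((a, mid), (mid, b)):
                sub_lb = f_enclosure(c, d)[0]
                if sub_lb <= best:
                    heapq.heappush(heap, (sub_lb, c, d))
    return best

print(minimize(-2.0, 3.0))  # global minimum of x^2 - 2x is -1 at x = 1
```

Each box's bound computation is independent, which is what makes the queue of boxes amenable to the vectorized, distributed processing the thesis develops.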
7

Emergency Responders as Inventors: An Action Research Examination of Public Information Work

St. Denis, Lise Ann 31 December 2015 (has links)
The development of information and communication technologies (ICTs) has expanded the ways that people communicate and share information with one another. In the context of disaster, this has disrupted and reshaped the nature of the communication of emergency information and public participation in the emergency response process itself. Members of the public have been much quicker at adapting and improvising solutions in this new communication ecology than emergency response organizations. This difference in adoption reflects key differences in the formal constraints and responsibilities faced by emergency responders in comparison to the ability in the public sphere to improvise and organize more fluidly. My research focuses on the design and ongoing development of sociotechnical solutions within a community of emergency responders interested in integrating social media into emergency response practices. I look at both the solutions emerging across this community and the sociotechnical arrangements that support ongoing communication and the evolution of new ideas in a continual process of invention. My research spans four years, starting with an initial case study and progressing over time into a collaborative role that leverages my skills and knowledge of crisis informatics in the joint exploration of data analysis strategies and communication strategies.
8

Optimal supply chain configuration for the additive manufacturing of biomedical implants

Emelogu, Adindu Ahurueze 11 January 2017 (has links)
In this dissertation, we study two important problems related to additive manufacturing (AM). In the first part, we investigate the economic feasibility of using AM to fabricate biomedical implants at the sites of hospitals versus traditional manufacturing (TM). We propose a cost model to quantify the supply-chain-level costs associated with the production of biomedical implants using AM technology, and formulate the problem as a two-stage stochastic programming model, which determines the number of AM facilities to be established and the volume of product flow between manufacturing facilities and hospitals at a minimum cost. We use the sample average approximation (SAA) approach to obtain solutions to the problem for a real-world case study of hospitals in the state of Mississippi. We find that the ratio between the unit production costs of AM and TM (ATR), demand, and product lead time are key cost parameters that determine the economic feasibility of AM.

In the second part, we investigate the AM facility deployment approaches, which affect both the supply chain network cost and the extent of benefits derived from AM. We formulate the supply chain network cost as a continuous approximation model and use optimization algorithms to determine how centralized or distributed the AM facilities should be and how much raw material these facilities should order so that the total network cost is minimized. We apply the cost model to a real-world case study of hospitals in 12 states of the southeastern USA. We find that the demand for biomedical implants in the region, the fixed investment cost of AM machines, the personnel cost of operating the machines, and transportation cost are the major factors that determine the optimal AM facility deployment configuration.

In the last part, we propose an enhanced sample average approximation (eSAA) technique that improves the basic SAA method. The eSAA technique uses clustering and statistical techniques to overcome the sample size issue inherent in basic SAA. Our results from extensive numerical experiments indicate that the eSAA can perform up to 699% faster than the basic SAA, thereby making it a competitive solution approach for large-scale stochastic optimization problems.
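To make the SAA idea concrete, the sketch below applies it to a toy newsvendor-style ordering problem rather than the dissertation's AM supply-chain model: sample demand scenarios, then optimize the first-stage decision against the sample average of the second-stage outcome. The demand distribution, costs, and candidate grid are all hypothetical.

```python
"""Sample average approximation (SAA) on a toy two-stage problem: a minimal
sketch of the technique named above, with invented data."""
import numpy as np

rng = np.random.default_rng(0)
# Sampled demand scenarios stand in for the true demand distribution.
scenarios = rng.lognormal(mean=3.0, sigma=0.5, size=1000)

unit_cost, unit_price = 4.0, 9.0

def saa_profit(order_qty, demands):
    # Average second-stage profit across the demand sample.
    sales = np.minimum(order_qty, demands)
    return np.mean(unit_price * sales - unit_cost * order_qty)

# First-stage decision: order quantity maximizing the SAA objective.
candidates = np.arange(0.0, 60.0, 0.5)
profits = [saa_profit(q, scenarios) for q in candidates]
best_q = candidates[int(np.argmax(profits))]
print(best_q, max(profits))
```

The sample-size tension is visible even here: too few scenarios and the solution is noisy, too many and each evaluation grows expensive, which is the issue the eSAA enhancement targets with clustering.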
9

Two combinatorial optimization problems at the interface of computer science and operations research

Kao, Gio K., January 2008 (has links)
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2008. Source: Dissertation Abstracts International, Volume 69-11, Section B, page 6920. Adviser: Sheldon Jacobson. Includes bibliographical references (leaves 120-128). Available on microfilm from ProQuest Information and Learning.
10

Dissociating components of cognitive control using high-density event-related potentials: implementation of control, conflict processing, and error monitoring

Larson, Michael James. January 2004 (has links)
Thesis (M.S.)--University of Florida, 2004. Typescript. Title from title page of source document. Document formatted into pages; contains 60 pages. Includes vita. Includes bibliographical references.
