1

Integrated service selection, pricing and fulfillment planning for express parcel carriers - Enriching service network design with customer choice and endogenous delivery time restrictions

Martin, Florian, Hemmelmayr, Vera, Wakolbinger, Tina January 2019 (has links) (PDF)
Express parcel carriers offer a wide range of guaranteed delivery times in order to separate customers who value quick delivery from those who are less time-sensitive but more price-sensitive. Such segmentation, however, adds a whole new layer of complexity to the task of optimizing the logistics operations. While many sophisticated models have been developed to assist network planners in minimizing costs, few approaches account for the interplay between service pricing, customer decisions and the associated restrictions in the distribution process. This paper attempts to fill this research gap by introducing a heuristic solution approach that simultaneously determines the ideal set of services, the associated pricing and the fulfillment plan in order to maximize profit. By integrating revenue management techniques into vehicle routing and fleet planning, we derive a new type of formulation called the service selection, pricing and fulfillment problem (SSPFP). It combines a multi-product pricing problem with a cycle-based service network design formulation. In order to derive good-quality solutions for realistically-sized instances we use an asynchronous parallel genetic algorithm and follow the intuition that small changes to prices and customer assignments cause minor changes in the distribution process. We thus base every new solution on the most similar already evaluated fulfillment plan. This adapted initial solution is then iteratively improved by a newly-developed route-pattern exchange heuristic. The performance of the developed algorithm is demonstrated on a number of randomly created test instances and is compared to the solutions of a commercial MIP-solver. / Series: Schriftenreihe des Instituts für Transportwirtschaft und Logistik - Supply Chain Management
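The warm-start idea in this abstract — reusing the fulfillment plan of the most similar already-evaluated solution — can be sketched as follows. The data layout (`assignment` tuples of customer-to-service choices, opaque `plan` objects) and the Hamming distance are illustrative assumptions of this sketch, not the paper's actual representation.

```python
def hamming(a, b):
    # Number of customers assigned to a different service in the two solutions.
    return sum(x != y for x, y in zip(a, b))

def warm_start(candidate_assignment, archive):
    """Pick the initial fulfillment plan for a new GA individual by reusing
    the plan of the most similar already-evaluated solution in the archive.

    archive: list of dicts {"assignment": tuple, "plan": object}.
    """
    best = min(archive, key=lambda s: hamming(s["assignment"], candidate_assignment))
    return best["plan"]
```

The returned plan would then serve as the starting point for local improvement (the paper's route-pattern exchange heuristic), rather than building a fulfillment plan from scratch for every individual.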
2

A solution of the two parameter gamma model to relate unit hydrograph features to basin characteristics

Cruise, James Franklin 07 July 2010 (has links)
The problem of correlating unit hydrograph features to topographic and man-made basin characteristics received attention in this report. The unit graph features considered herein were the peak discharge and the time lag of basin response. In order to facilitate the desired regression analysis, the two-parameter gamma model proposed by Edson was utilized in the investigation. The parameters of the model were obtained by the simultaneous solution of the equations for unit graph peak and lag using observed unit hydrographs for 16 basins in the Piedmont region of North Carolina and 14 basins located in Northern Virginia. In the opinion of many, these parameters are a better measure of the complex relationship which exists between the runoff from a basin and the topographic features of that basin than are the values of the unit graph peak and lag time themselves. The basin characteristics utilized in the investigation were: basin area, length of the longest streamcourse in the basin, average stream slope between points 10 percent and 85 percent downstream of the headwaters, and the percent of impervious area contained in the basin. This last factor served as a measure of the amount of urban development present in the watershed. The investigation was hampered by a regrettable lack of sufficient data to derive regression equations of good reliability. This was due to the reduction of the data into groups by narrow geographical ranges; thus, the number of stations available for analysis in any one group was insufficient for purposes of a reliable regression analysis. From the investigation, it appears that the most significant basin characteristics affecting runoff are length, slope, and urban development. The strongest regression equations were derived using those three characteristics. It appears that the length and slope factors give better results when combined in the form (L/√S). / Master of Science
3

What did you really earn last year?: explaining measurement error in survey income data

Angel, Stefan, Disslbacher, Franziska, Humer, Stefan, Schnetzer, Matthias January 2019 (has links) (PDF)
This paper analyses the sources of income measurement error in surveys with a unique dataset. We use the Austrian 2008-2011 waves of EU-SILC which provide individual information on wages, pensions and unemployment benefits from survey interviews and officially linked administrative records. Thus, we do not have to fall back on complex two-sample matching procedures like related studies. We empirically investigate four sources of measurement error, namely (i) social desirability, (ii) socio-demographic characteristics of the respondent, (iii) the survey design, and (iv) the presence of learning effects. We find strong evidence for a social desirability bias in income reporting, while the presence of learning effects is mixed and depends on the income type under consideration. An Owen value decomposition reveals that social desirability is a major explanation of misreporting in wages and pensions, whereas socio-demographic characteristics are most relevant for mismatches in unemployment benefits. / Series: INEQ Working Paper Series
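As an illustration of what the record linkage buys: once survey responses and administrative records are matched by row, a misreporting measure needs no two-sample matching at all. The 5% relative-error tolerance and the field layout below are assumptions of this toy sketch, not EU-SILC conventions.

```python
def misreport_share(survey, register, tol=0.05):
    """Fraction of respondents whose reported income deviates from the
    linked administrative record by more than `tol` (relative error).

    survey, register: equal-length sequences of incomes for the same
    individuals, matched by position (the linkage the administrative
    data provides directly). Records with non-positive register income
    are skipped.
    """
    flagged = sum(
        abs(s - r) > tol * r
        for s, r in zip(survey, register)
        if r > 0
    )
    n = sum(1 for r in register if r > 0)
    return flagged / n if n else 0.0
```

With per-income-type sequences (wages, pensions, unemployment benefits), the same function yields the type-specific mismatch rates whose drivers the paper then decomposes.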
4

Big Data and Regional Science: Opportunities, Challenges, and Directions for Future Research

Schintler, Laurie A., Fischer, Manfred M. January 2018 (has links) (PDF)
Recent technological, social, and economic trends and transformations are contributing to the production of what is usually referred to as Big Data. Big Data, which is typically defined by four dimensions -- Volume, Velocity, Veracity, and Variety -- changes the methods and tactics for using, analyzing, and interpreting data, requiring new approaches for data provenance, data processing, data analysis and modeling, and knowledge representation. The use and analysis of Big Data involves several distinct stages from "data acquisition and recording" over "information extraction" and "data integration" to "data modeling and analysis" and "interpretation", each of which introduces challenges that need to be addressed. There are also cross-cutting challenges that underlie many, sometimes all, of the stages of the data analysis pipeline. These relate to "heterogeneity", "uncertainty", "scale", "timeliness", "privacy" and "human interaction". Using the Big Data analysis pipeline as a guiding framework, this paper examines the challenges arising in the use of Big Data in regional science. The paper concludes with some suggestions for future activities to realize the possibilities and potential for Big Data in regional science. / Series: Working Papers in Regional Science
5

Coherent transfer between electron and nuclear spin qubits and their decoherence properties

Brown, Richard Matthew January 2012 (has links)
Conventional computing faces a huge technical challenge as traditional transistors will soon reach their size limitations. This will halt progress toward faster processing speeds, and overcoming this problem will require an entirely new approach. Quantum computing (QC) is a natural solution, offering a route to miniaturisation by, for example, storing information in electron or nuclear spin states, whilst harnessing the power of quantum physics to perform certain calculations exponentially faster than its classical counterpart. However, QCs face many difficulties, such as protecting the quantum-bit (qubit) from the environment and its irreversible loss through the process of decoherence. Hybrid systems provide a route to harnessing the benefits of multiple degrees of freedom through the coherent transfer of quantum information between them. In this thesis I show coherent qubit transfer between electron and nuclear spin states in a ¹⁵N@C₆₀ molecular system (comprising a nitrogen atom encapsulated in a carbon cage) and a solid state system, using phosphorus donors in silicon (Si:P). The propagation uses a series of resonant microwave and radiofrequency pulses and is shown with a two-way fidelity of around 90% for an arbitrary qubit state. The transfer allows quantum information to be held in the nuclear spin for up to 3 orders of magnitude longer than in the electron spin, producing a ¹⁵N@C₆₀ and Si:P ‘quantum memory’ of up to 130 ms and 1.75 s, respectively. I show that electron and nuclear spin relaxation (T₁), in both systems, is dominated by a two-phonon process resonant with an excited state, with a constant electron/nuclear T₁ ratio.
The thesis further investigates the decoherence and relaxation properties of metal atoms encapsulated in a carbon cage, termed metallofullerenes, finding that exceptionally long electron spin decoherence times are possible and that metallofullerenes can therefore be considered a viable QC candidate.
