121

Aisle-based order pick systems with batching, zoning, and sorting

Choe, Kyung-Il 12 1900 (has links)
No description available.
122

Determining Realistic Loss Estimates for Rack Storage Warehouse Fires

Porter, Timothy Miller January 2006 (has links)
At present there is no simple yet scientifically robust method for calculating insurance loss estimates due to a fire. Building owners and insurers therefore cannot make suitably informed decisions when selecting fire protection measures or setting premiums, as they have no way of defining the true risk they face. This research consequently aims to investigate a number of techniques in an effort to identify one as appropriate for further research. Three different methods were explored: risk-based analysis, deterministic hand calculations and Computational Fluid Dynamics (CFD). Extensive literature reviews were conducted in each area and the final models were based on the outcomes of this research. Rack storage warehouses were chosen for analysis as they are currently topical within the fire engineering community and are a particular concern for insurers. The risk-based analysis employed statistical techniques, including event tree analysis and Monte Carlo simulation, to calculate loss distributions and conduct sensitivity analyses. The hand calculation method was based on equations presented in the literature and incorporated the use of a zone model (BRANZFire) to calculate deterministic loss estimates. The CFD model used was Fire Dynamics Simulator, and full scale warehouse fires were modelled using this programme. It was concluded that Fire Dynamics Simulator is an inappropriate tool, as providing loss estimates in a timely manner is currently beyond the model's capabilities. Of the two remaining methods, the statistical risk-based model was selected as the most appropriate for further investigation. The primary reasons for this decision were the ability to calculate loss distributions and conduct sensitivity analyses, as well as its versatility and user friendliness. Improved statistical data was identified as imperative for future development of the model.
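As a rough illustration of the risk-based method this abstract describes (an event tree combined with Monte Carlo simulation to produce loss distributions), the following minimal Python sketch samples losses over a hypothetical two-branch event tree. The branch probabilities, loss ranges and scenario structure are placeholders, not values from the thesis.

```python
import random
import statistics

# Hypothetical event tree for a rack storage warehouse fire.
# Probabilities and loss ranges below are illustrative only.
SPRINKLER_SUCCESS_P = 0.90   # sprinklers control the fire
BRIGADE_SUCCESS_P = 0.70     # fire brigade stops spread if sprinklers fail

LOSS_RANGES = {              # uniform loss ranges in dollars (hypothetical)
    "controlled_by_sprinklers": (50_000, 250_000),
    "controlled_by_brigade": (500_000, 2_000_000),
    "total_loss": (5_000_000, 20_000_000),
}

def sample_loss(rng: random.Random) -> float:
    """Walk the event tree once and sample a loss for the resulting scenario."""
    if rng.random() < SPRINKLER_SUCCESS_P:
        scenario = "controlled_by_sprinklers"
    elif rng.random() < BRIGADE_SUCCESS_P:
        scenario = "controlled_by_brigade"
    else:
        scenario = "total_loss"
    low, high = LOSS_RANGES[scenario]
    return rng.uniform(low, high)

def loss_distribution(n_trials: int = 100_000, seed: int = 42) -> list[float]:
    rng = random.Random(seed)
    return [sample_loss(rng) for _ in range(n_trials)]

if __name__ == "__main__":
    losses = sorted(loss_distribution())
    print(f"expected loss:   ${statistics.fmean(losses):,.0f}")
    print(f"95th percentile: ${losses[int(0.95 * len(losses))]:,.0f}")
```

A sensitivity analysis of the kind mentioned in the abstract would rerun this simulation with perturbed branch probabilities and compare the resulting loss distributions.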
123

Determining optimal staffing levels for the picking and packing operations in a distribution center

Badurdeen, Fathima Fazleena. January 2002 (has links)
Thesis (M.S.)--Ohio University, November, 2002. / Title from PDF t.p.
124

Determining realistic loss estimates for rack storage warehouse fires : a thesis submitted in partial fulfillment of the requirements for the degree of Master of Engineering in Fire Engineering, Department of Civil Engineering, University of Canterbury ... Christchurch, New Zealand /

Porter, Tim January 1900 (has links)
Thesis (M.E.F.E.)--University of Canterbury, 2004. / Typescript (photocopy). "October 2004." Includes bibliographical references. Also available via the World Wide Web.
125

Symbiotic existence

Raymond, Anne-Marie E. January 2008 (has links)
Thesis (M.Arch.)--University of Detroit Mercy, 2008. / "April 28, 2008". Includes bibliographical references (p. [19-21]).
126

Inventory management of packaging materials in a pharmaceutical warehouse : a case study /

Kaluskar, Pranav. January 2008 (has links)
Thesis (M.S.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2008. / Includes bibliographical references.
127

An analysis of implementing an open bond system in Hong Kong

Wong, Yuk-mei, Kathy. January 2001 (has links)
Thesis (M.P.A.)--University of Hong Kong, 2001. / Includes bibliographical references (leaves 87-90). Also available in print.
128

Επέκταση του OLAP μοντέλου με σημαντικά δίκτυα / Extending the OLAP model with semantic networks

Στραγαλινός, Ευάγγελος 19 May 2011 (has links)
Considering that the use of computers has moved from research organisations into businesses, computing has become an integral business tool. By the early 1990s, large companies and government bodies held enormous quantities of data that needed to be exploited. This, combined with the now established view that information is the most valuable asset, created the need for applications that analyse and process large volumes of data. The solution is provided by data warehouse technologies and online analytical processing (OLAP). Recently, considerable research has been conducted on ontologies and semantic networks, where information is described conceptually so that it is easier to retrieve, use and compare; this creates the need for a new, combined way of representing knowledge. In this thesis the reader is introduced to semantic networks. The types of semantic networks are presented in detail, particularly those used to handle exceptions. An introductory description of database theory is given, with particular emphasis on multidimensional databases and data warehouses. OLAP tools and the basic concepts behind them are presented in detail. Finally, a new technique is presented that is based on methods for handling exceptions in a semantic network, with the aim of extending an OLAP model so that it can handle exceptions. The proposed extension allows exceptions among the dimension values of a data hypercube, which play an important role in drawing correct conclusions.
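As one possible reading of "exceptions among dimension values" (an illustrative sketch, not the technique proposed in the thesis), the snippet below lets members of a dimension hierarchy inherit default attribute values from their parent while explicit exception links override that inheritance; the dimension, attributes and members are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DimensionValue:
    """A member of an OLAP dimension hierarchy with defaults and exceptions."""
    name: str
    parent: "DimensionValue | None" = None
    defaults: dict[str, object] = field(default_factory=dict)    # attributes asserted at this member
    exceptions: dict[str, object] = field(default_factory=dict)  # overrides of inherited attributes

    def attribute(self, key: str):
        # Resolution order: local exception, local default, then the parent chain,
        # so the most specific value wins (inheritance with exceptions).
        if key in self.exceptions:
            return self.exceptions[key]
        if key in self.defaults:
            return self.defaults[key]
        if self.parent is not None:
            return self.parent.attribute(key)
        raise KeyError(key)

# Hypothetical 'Product' dimension: beverages are taxable by default,
# but bottled water is declared an exception to its category's default.
beverages = DimensionValue("Beverages", defaults={"taxable": True})
cola = DimensionValue("Cola", parent=beverages)
water = DimensionValue("Bottled water", parent=beverages, exceptions={"taxable": False})

print(cola.attribute("taxable"))   # True  (inherited from the Beverages default)
print(water.attribute("taxable"))  # False (the exception overrides inheritance)
```

An OLAP aggregation over such a dimension could then group facts by the resolved attribute value rather than by the raw hierarchy alone.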
129

Does data warehouse end-user metadata add value?

Foshay, N, Mukherjee, Avinandan, Taylor, W. Andrew January 2007 (has links)
No / Many data warehouses are currently underutilized by managers and knowledge workers. Can high-quality end-user metadata help to increase levels of adoption and use?
130

Processamento de consultas SOLAP drill-across e com junção espacial em data warehouses geográficos / Processing of drill-across and spatial join SOLAP queries over geographic data warehouses

Brito, Jaqueline Joice 28 November 2012 (has links)
A geographic data warehouse (GDW) is a special kind of multidimensional database. It is subject-oriented, integrated, historical, non-volatile and usually organized in levels of aggregation. Furthermore, a GDW also stores spatial data in one or more dimensions or in at least one numerical measure. Aiming at decision support, GDWs allow SOLAP (spatial online analytical processing) queries, i.e., multidimensional analytical queries (e.g., drill-down, roll-up, drill-across) extended with spatial predicates (e.g., intersects, contains, is contained) defined for range and spatial join queries. A challenging issue related to the processing of these complex queries is how to efficiently retrieve spatial and conventional data stored in huge GDWs. In the literature, there are few access methods dedicated to indexing GDWs, and none of these methods focus on drill-across and spatial join SOLAP queries. In this master's thesis, we propose novel strategies for processing these complex queries.
We introduce two strategies for processing SOLAP drill-across queries (namely, Divide and Unique), define a set of guidelines for the design of a GDW schema that enables the execution of these queries, and determine a set of classes of these queries to be issued over a GDW schema that follows the proposed guidelines. As for the processing of spatial join SOLAP queries, we propose the SJB strategy, and also identify the characteristics of a GDW schema that enables the execution of these queries as well as define the format of these queries. We validated the proposed strategies through performance tests that compared them with the star join computation and the use of materialized views. The obtained results showed that our strategies are very efficient. Regarding the SOLAP drill-across queries, the Divide and Unique strategies showed a time reduction that ranged from 82.7% to 98.6% with respect to the star join computation and the use of materialized views. Regarding the SOLAP spatial join queries, the SJB strategy guaranteed the best results for most of the analyzed queries. For these queries, the performance gain of the SJB strategy ranged from 0.3% to 99.2% over the star join computation and the use of materialized views.
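To make the query class concrete, here is a minimal, self-contained sketch of what a spatial-join SOLAP query computes over a tiny in-memory star schema: a measure aggregated by a conventional dimension level, restricted to facts whose spatial member intersects a query window. It illustrates the query semantics only; it is not the SJB strategy (nor any index from the dissertation), and all table contents and names are hypothetical.

```python
from collections import defaultdict

# Spatial dimension: member id -> axis-aligned bounding box (xmin, ymin, xmax, ymax)
city_bbox = {
    1: (0.0, 0.0, 10.0, 10.0),    # city A
    2: (20.0, 20.0, 30.0, 30.0),  # city B
    3: (5.0, 5.0, 15.0, 15.0),    # city C
}

# Conventional dimension: member id -> (year, product category)
sale_info = {
    10: (2011, "food"),
    11: (2011, "electronics"),
    12: (2012, "food"),
}

# Fact table: (conventional dimension key, spatial dimension key, revenue measure)
facts = [
    (10, 1, 100.0),
    (11, 1, 250.0),
    (12, 3, 80.0),
    (10, 2, 999.0),   # city B does not intersect the query window used below
]

def intersects(a, b):
    """Bounding-box intersection test (stand-in for a real spatial predicate)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def solap_spatial_join(query_window, group_by="category"):
    """Revenue per product category, restricted to cities intersecting the window."""
    totals = defaultdict(float)
    for sale_key, city_key, revenue in facts:
        if intersects(city_bbox[city_key], query_window):
            year, category = sale_info[sale_key]
            totals[category if group_by == "category" else year] += revenue
    return dict(totals)

print(solap_spatial_join(query_window=(0.0, 0.0, 12.0, 12.0)))
# {'food': 180.0, 'electronics': 250.0}
```

The strategies evaluated in the dissertation are concerned with answering this kind of query efficiently over huge fact tables, rather than with its nested-loop semantics shown here.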
