  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Effect of Emotional Stimulation on Recognition and Inference

Haddan, Eugene E. 08 1900 (has links)
The purpose of this study was to assess the probable effects of extraneous stimulation on students' cognitive achievement (learning) and on their emotional reactions and autonomic arousal. While the focus of interest was on possible disruptive effects, the planned measurements made it possible to observe effects of either kind, disruptive or facilitative.
2

Sambandet mellan saldodifferenser och effektivitet : En fallstudie på utomhuslager inom Sandvik Materials Technology / The relationship between inventory discrepancies and efficiency: a case study of outdoor storage at Sandvik Materials Technology

Lundgren, Amelie, Jervill, Anna January 2014 (has links)
To meet customer needs and demands, accurate information about how much is available in stock is necessary. When the information in the system does not match the physical inventory, inventory inaccuracy occurs. Inventory inaccuracy is common among companies and may contribute to increased labor costs, excess inventory, production disruptions, wasted time, late deliveries, poor service, and lost customers. The purpose of this study is to identify factors contributing to inventory inaccuracy in businesses with outdoor storage, and to investigate how efficiency factors can reduce it. A case study was conducted at a company in the steel industry chosen to examine inventory inaccuracy in outdoor storage. During the case study, a series of interviews was conducted with employees and managers; together with information from observations and documents, the interview material was compiled into flowcharts. Inventory inaccuracy affects companies in many ways and leads to inefficiency: incorrect inventory records cause unnecessary work and costs and can reduce the efficiency of businesses with outdoor storage. Inventory inaccuracy may be reduced with continuous inventory checks, simplified processes, more automation, and enhanced information and communication; improved routines can also help. To increase efficiency, the staff must be seen as an important resource.
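The mechanism described above, records drifting from physical stock until a check realigns them, can be illustrated with a minimal sketch. The demand, shrinkage, and cycle-count parameters are illustrative assumptions, not figures from the study: sales are captured by the system, unrecorded shrinkage is not, and periodic counts bound the resulting discrepancy.

```python
import random

def simulate(days=30, demand=5, shrink_p=0.2, count_every=None, seed=1):
    """Average gap between recorded and physical stock over a horizon."""
    rng = random.Random(seed)
    recorded = physical = 200
    discrepancies = []
    for day in range(1, days + 1):
        sold = min(demand, physical)
        physical -= sold
        recorded -= sold                      # sales are captured by the system
        if rng.random() < shrink_p:
            physical = max(0, physical - 3)   # theft/damage is not captured
        if count_every and day % count_every == 0:
            recorded = physical               # cycle count realigns the record
        discrepancies.append(abs(recorded - physical))
    return sum(discrepancies) / days

# Without cycle counts the record error drifts upward; weekly counts keep it bounded.
print(simulate(count_every=None))
print(simulate(count_every=7))
```

Under the same random shrinkage sequence, the run with periodic counts never shows a larger average discrepancy than the run without them.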
3

The Impact of the RFID Technology in Improving Performance of Inventory Systems subject to Inaccuracies

Rekik, Yacine 08 December 2006 (has links) (PDF)
Unlike more traditional identification systems such as the barcode, RFID (Radio Frequency IDentification) technology uses radio-frequency waves to transmit data between a tag and a reader in order to identify, locate, or track an entity in a supply chain. This gives it several advantages over other identification and data-capture systems (easy access to information, continuous tracking, improved data accuracy, detection of theft and counterfeiting, etc.). We start from the observation that this technology will let supply-chain actors share higher-quality, more complete, and more reliable information about physical flows and product location. Yet most classical inventory-management models implicitly assume perfect knowledge of inbound and outbound flows. The research objective is to integrate into these models the degradations that distort the nominal flow and to analyze their consequences in terms of additional cost. Strong emphasis is placed on developing solutions that combine effectiveness and simplicity, and on how the cost of the technology should be shared among supply-chain actors: is it better to share its benefits in a coordinated environment or in a competitive one? The results of this thesis concern theoretical inventory-management models for production, distribution, and procurement in a supply chain that account for both the cost and the potential gains of this new automatic-identification technology.
4

Essays on Retail Operations

Chuang, Hao-Chun 03 October 2013 (has links)
This dissertation comprises three essays in which we develop optimization, econometric, and simulation models to help traditional retailers improve in-store operations. Our models tackle inventory record inaccuracy (IRI) and suboptimal staffing levels, both pervasive problems in retailing that cause non-trivial profit loss. In the first essay, we devise two optimization models representing current industry practices for minimizing costs induced by IRI: daily-fraction and all-or-none inspection. A case study identifies deficiencies of store operating practices under different risk preferences, and our findings provide practical guidelines for managers designing cost-efficient inspection policies. In the second essay, we develop a dynamic simulation model to analyze multiple antecedents of IRI. Based on the simulation results, we derive two hypotheses on the association between IRI and labor; panel data analysis shows that both the level and the mix of store labor strongly affect IRI, yielding qualitative insights for retail managers seeking to prevent it. Finally, in the third essay, we perform an empirical study to improve staffing decisions in retailing. We first develop a response function quantifying the impact of labor and traffic on sales, and on this basis propose a traffic-based staffing heuristic that performs close to the optimum and outperforms existing staffing levels in counterfactual experiments. A major contribution of our study is quantifying the benefits of building labor plans from traffic information; the staffing approach is also easy to use and obviates the need for traffic forecasting.
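The traffic-based staffing idea can be sketched as follows. The response function, wage, and margin values are illustrative assumptions, not the dissertation's estimates: expected sales rise with staff at a diminishing rate, and the heuristic picks the labor level with the highest expected profit for a given traffic forecast.

```python
import math

WAGE = 15.0    # hourly cost per employee (assumed)
MARGIN = 4.0   # profit per converted visitor (assumed)

def expected_sales(labor, traffic, k=0.3):
    """Diminishing-returns response: more staff converts more of the traffic."""
    return traffic * (1 - math.exp(-k * labor))

def staff_for_traffic(traffic, max_labor=20):
    """Choose the labor level maximizing expected profit for a traffic forecast."""
    return max(range(1, max_labor + 1),
               key=lambda l: MARGIN * expected_sales(l, traffic) - WAGE * l)

print(staff_for_traffic(50))   # light traffic -> small crew
print(staff_for_traffic(500))  # heavy traffic -> larger crew
```

Because the response function is concave in labor, heavier forecast traffic always justifies at least as large a crew, which is the qualitative behavior a traffic-based heuristic relies on.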
5

Impact of inaccurate data on supply chain inventory performance

Basinger, Karen Lynn 30 November 2006 (has links)
No description available.
6

Laboratórios via sistema tradicional e espectroscopia de reflectância: avaliação da qualidade analítica dos atributos do solo / Laboratories in the traditional system and reflectance spectroscopy: evaluation of analytical quality of soil attributes

Bedin, Luis Gustavo 26 August 2016 (has links)
Soil analysis is an essential tool for liming recommendation, fertilization, and soil management. Given the growing demand for food and the need for a sustainable increase in agricultural productivity, it is essential to keep improving the quality of soil analyses while reducing their cost and turnaround time. In this context, remote sensing techniques, at laboratory, field, aerial, and orbital scales, offer advantages, especially for assessing large areas. The quality of laboratory measurements is critical for soil management recommendations, which makes it important to question the degree of analytical variability between different laboratories and between laboratory determinations and reflectance spectroscopy. This study evaluated the uncertainties in traditional soil analysis and how they affect spectral prediction models (350-2,500 nm), with the aim of understanding the advantages and limitations of both methodologies and enabling better decisions for soil management. Soil samples under extensive sugarcane cultivation were collected in 29 municipalities of the state of São Paulo. Forty-eight soil profiles were opened to a depth of approximately 1.5 m, and about 10 kg of soil was taken from each profile at depths of 0-0.2 and 0.8-1.0 m, totaling 96 primary samples. The chemical attributes analyzed were pH, organic matter (OM), resin phosphorus (P), exchangeable potassium (K+), exchangeable calcium (Ca2+), exchangeable magnesium (Mg2+), exchangeable aluminum (Al3+), potential acidity (H + Al), sum of exchangeable bases (SB), cation exchange capacity (CEC), base saturation (V%), and Al3+ saturation (m%); the particle-size fractions sand, silt, and clay were also determined.
Four spectroradiometers (350-2,500 nm) were used to obtain the reflectance spectra. Variations in liming recommendations across laboratories were also evaluated, with laboratories assessed on indices of imprecision and inaccuracy. The attributes with the largest errors, averaged over all laboratories and in descending order, were m%, Al3+, Mg2+, and P. These errors significantly influenced the calibration of the sensor-based prediction models, and the analytical uncertainties often affected the liming recommendation: for one of the laboratories studied, the recommendation error exceeded 1 t ha-1. Prediction models calibrated with data from the laboratory with the fewest errors achieved R2 above 0.7 and RPD above 1.8 for OM, Al3+, CEC, H + Al, sand, silt, and clay. The methodology made it possible to quantify the acceptable level of uncertainty in laboratory determinations and to evaluate how laboratory analytical errors propagate into sensor predictions. Reflectance spectroscopy proves to be an efficient complementary alternative to traditional soil analysis methods.
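The R2 and RPD figures quoted in the abstract are standard quality measures for spectral calibration models. A minimal sketch of how they are computed, with made-up observed/predicted values for illustration (RPD is taken here as the ratio of the reference values' standard deviation to the RMSE of the predictions):

```python
import math
import statistics

def rpd(observed, predicted):
    """Ratio of performance to deviation: SD of reference values over RMSE."""
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))
    return statistics.stdev(observed) / rmse

def r2(observed, predicted):
    """Coefficient of determination of predictions against lab reference values."""
    mean = statistics.fmean(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

obs = [1, 2, 3, 4, 5]            # hypothetical lab reference values
pred = [1.1, 1.9, 3.2, 3.8, 5.1] # hypothetical sensor predictions
print(r2(obs, pred), rpd(obs, pred))
```

On thresholds like those reported (R2 > 0.7, RPD > 1.8), a calibration is usually considered usable for screening; both statistics degrade together as prediction errors grow relative to the spread of the reference data.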
8

Mechanisms to Reduce Routing Information Inaccuracy Effects: Application to MPLS and WDM Networks

Masip Bruin, Xavier 07 October 2003 (has links)
Les xarxes IP tradicionals utilitzen el model de transmissió "best-effort" per transportar tràfic entre clients de la xarxa. Aquest model de transmissió de tràfic no és el més adequat per les aplicacions en temps real com per exemple, vídeo sota demanda, conferències multimedia o realitat virtual que per altra banda tenen cada cop més adeptes entre els clients de la xarxa. A fi de garantir el correcte funcionament d'aquest tipus d'aplicacions, l'estructura de la xarxa ha de ser substancialment modificada amb l'objectiu final de poder optimitzar els seus propis recursos i així poder fer front a aquells tipus de tràfics i de clients que requereixen certes garanties de "Qualitat de Servei" (QoS) per a la seva correcta transmissió.Aquestes modificacions o millores de la xarxa poden ser perfectament realitzades sota l'entorn d'Enginyeria de Tràfic (Traffic Engineering, TE). Dos són els principals aspectos relacionats amb el funcionament de la xarxa en aquest entorn de TE: els mecanismes de commutació i els mecanismes d'encaminament. Així, per una banda es necessita un mecanisme de commutació molt ràpid en els nodes interns de la xarxa a fi de que els paquets de dades puguin ser processats amb el menor temps possible. En xarxes IP aquest objectiu s'aconsegueix amb el Multiprotocol Label Switching (MPLS). Per altra banda, a fi de garantir certa QoS, les decisions d'encaminament s'han de realitzar tenint en compte quines són les restriccions de QoS sol·licitades per el node client que origina el tràfic. 
Aquest objectiu s'aconsegueix modificant els esquemes d'encaminament tradicionals, incorporant-hi els paràmetres de QoS en les decisions d'encaminament, generant el que es coneix com algorismes d'encaminament amb QoS (QoS routing).Centrant-nos en aquest darrer aspecte, la majoria dels algorismes d'encaminament amb QoS existents, realitzen la selecció de la ruta a partir de la informació d'estat de l'enllaç emmagatzemada en les bases de dades d'estat de l'enllaç contingudes en els nodes. Per poder garantir que els successius canvis en l'estat de la xarxa estiguin perfectament reflectits en aquesta informació d'encaminament, el protocol d'encaminament ha d'incloure un mecanisme d'actualització que faci possible garantir que la selecció de les rutes es fa a partir d'informació acurada de l'estat real de la xarxa. En un entorn IP tradicional, el qual inicialment no inclou paràmetres de QoS, els canvis produïts en la informació d'encaminament són tan sols deguts a modificacions en la topologia i connectivitat de la xarxa. En aquest entorn, donat que la freqüència en la qual s'espera rebre missatges advertint d'aquestes modificacions no és elevada, la majoria dels mecanismes d'actualització es basen en la inclusió d'un cert període de refresc. Així, les bases de dades s'actualitzen periòdicament mitjançant la distribució d'uns missatges que informen a la resta de nodes de l'estat de la xarxa,a fi de que cada node pugui actualitzar la seva base de dades.No obstant això, hem de tenir en compte que en aquelles xarxes IP/MPLS altament dinàmiques amb requeriments de QoS, aquest mecanisme d'actualització basat en un refresc periòdic no serà útil. 
Això és degut a la rigidesa que presenta aquest mecanisme, la qual fa que no sigui aplicable a un entorn que presenti contínues variacions del paràmetres dels enllaços cada cop que s'estableixi o s'alliberi una connexió (ara a més de la topologia i connectivitat, s'inclouen paràmetres de QoS, com ampla de banda, retard, variació del retard, etc.). Per tot això, s'haurà de generar un mecanisme d'actualització molt més eficient que sigui capaç de mantenir les bases de dades dels nodes perfectament actualitzades reflectint els continus canvis en l'estat de la xarxa. L'alta granularitat d'aquest mecanisme provocarà una sobrecàrrega de la xarxa, degut a l'enorme quantitat de missatges d'actualització que seran necessaris per poder mantenir informació actualitzada en les bases de dades d'estat de l'enllaç en cada node.Per reduir aquesta sobrecàrrega de senyalització apareixen les polítiques d'activació (triggering policies) que tenen per objectiu determinar en quin moment un node ha d'enviar un missatge d'actualització a la resta de nodes de la xarxa advertint-los de les variacions produïdes en els seus enllaços. Desafortunadament, l'ús d'aquestes polítiques d'activació produeix un efecte negatiu sobre el funcionament global de la xarxa. En efecte, si l'actualització de la informació de l'estat de l'enllaç en els nodes no es fa cada cop que aquesta informació es veu modificada, sinó que es fa d'acord a una certa política d'activació, no es podrà garantir que aquesta informació representi de forma acurada l'esta actual de la xarxa en tot moment. Això pot provocar una selecció no òptima de la ruta seleccionada i un increment en la probabilitat de bloqueig de noves connexions a la xarxa. / Las redes IP tradicionales utilizan el modelo de transmisión best-effort para transportar tráfico entre clientes de la red. 
Es bien sabido que este modelo de transmisión de tráfico no es el más adecuado para las aplicaciones en tiempo real, tales como video bajo demanda, conferencias multimedia o realidad virtual, que cada vez son más de uso común entre los clientes de la red. Para garantizar el correcto funcionamiento de dichas aplicaciones la estructura de la red debe ser modificada a fin de optimizar la utilización de sus propios recursos y para poder hacer frente a aquellos tráficos que requieran ciertas garantías de Calidad de Servicio (QoS) para su correcta transmisión.Estas modificaciones o mejoras de la red pueden ser perfectamente realizadas bajo el entorno de Traffic Engineering (TE). Dos son los principales aspectos relacionados con el funcionamiento de la red en el entorno de TE: los mecanismos de conmutación y los mecanismos de encaminamiento. Así, por una parte, se necesita un mecanismo de conmutación muy rápido en los nodos intermedios de la red a fin de que los paquetes de datos puedan ser procesados con el menor tiempo posible. En redes IP este objetivo se consigue con el Multiprotocol Label Switching (MPLS). Por otra parte a fin de garantizar cierta QoS, las decisiones de encaminamiento se deben realizar acorde con los parámetros de QoS requeridos por el cliente que origina tráfico. Este objetivo se consigue modificando los esquemas de encaminamiento tradicionales e incorporando parámetros de QoS en las decisiones de encaminamiento, lo que deriva en la generación de encaminamiento con QoS (QoS routing).Centrándonos en este último aspecto de encaminamiento, la mayoría de los algoritmos de QoS routing existentes realizan la selección de la ruta a partir de la información de estado del enlace que está almacenada en las bases de datos de estado del enlace contenidas en los nodos. 
A fin de garantizar que los sucesivos cambios en el estado de la red estén perfectamente reflejados en dicha información, el mecanismo de encaminamiento debe incorporar un mecanismo de actualización cuyo objetivo sea garantizar que las decisiones de encaminamiento se realizan a partir de información fidedigna del estado de la red. En un entorno IP tradicional, el cual no incluye parámetros de QoS, los cambios producidos en dicha información son los debidos a modificaciones en la topología y conectividad. En dicho entorno dado que no son esperadas frecuentes variaciones de la topología de la red, la mayoría de los mecanismos de actualización están basados en la inclusión de un cierto periodo de refresco.Sin embargo, en redes IP/MPLS altamente dinámicas con requerimientos de QoS, este mecanismo de actualización no será adecuado debido a su rigidez y a las continuas variaciones de los parámetros de los enlaces (que ahora incluirá parámetros de QoS, tales como, ancho de banda, retardo, variación del retado, etc.) que se producirán cada vez que se establezca/libere una conexión. Por tanto, se deberá generar un mecanismo de actualización mucho más eficiente que sea capaz de actualizar las bases de datos de los nodos a fin de reflejar las constantes variaciones del estado de la red. La alta granularidad de este mecanismo provocará una sobrecarga de la red, debido a la enorme cantidad de mensajes de actualización necesarios para mantener información actualizada del estado de la red. Para reducir esta sobrecarga de señalización aparecen las políticas de disparo (triggering policies), cuyo objetivo es determinar en qué momento un nodo debe enviar un mensaje de actualización al resto de nodos de la red advirtiéndoles de las variaciones producidas en sus enlaces.Desafortunadamente el uso de dichas políticas de disparo produce un efecto negativo sobre el funcionamiento global de la red. 
En efecto, si la actualización de la información de estado del enlace en los nodos no se realiza cada vez que dicha información es modificada sino de acuerdo con cierta política de disparo, no se puede garantizar que dicha información represente fielmente el estado de la red. Así, la selección de la ruta, podrá ser realizada basada en información inexacta o imprecisa del estado de lo red, lo cual puede provocar una selección no óptima de la ruta y un incremento en la probabilidad de bloqueo de la red.Esta Tesis se centra en definir y solucionar el problema de la selección de rutas bajo información inexacta o imprecisa de la red (routing inaccuracy problem). Se consideran dos escenarios de trabajo, las actuales redes MPLS y las futuras redes WDM, para los cuales se propone un nuevo mecanismo de encaminamiento: BYPASS Based Routing (BBR) para redes IP/MPLS y BYPASS Based Optical Routing (BBOR) para redes WDM. Ambos mecanismos de encaminamiento se basan en un concepto común denominado "bypass dinámico".El concepto de "bypass dinámico" permite que un nodo intermedio de la red encamine el mensaje de establecimiento que ha recibido del nodo fuente, a través de una ruta distinta a la calculada por el nodo fuente (y explícitamente indicada en el mensaje de establecimiento), cuando detecte que inesperadamente el enlace de salida no dispone de recursos suficientes para soportar las garantías de QoS requeridas por la conexión a establecer. Estas rutas alternativas, denominadas bypass-paths, son calculadas por el nodo fuente o de entrada a la red simultáneamente con la ruta principal para ciertos nodos intermedios de la misma. En redes IP/MPLS el mecanismo BBR aplica el concepto de "bypass dinámico" a las peticiones de conexión con restricciones de ancho de banda. En cambio, en redes WDM, el mecanismo BBOR aplica el concepto de "bypass dinámico" a la hora de asignar una longitud de onda por la cual se va a transmitir el trafico. 
Traditional IP networks are based on the best-effort model to transport traffic flows between network clients. Since this model cannot properly support the requirements of several emerging real-time applications (such as video on demand, multimedia conferencing or virtual reality), some modifications in the network structure, mainly oriented to optimising network performance, are required in order to provide Quality of Service (QoS) guarantees. Traffic Engineering is an excellent framework to achieve these network enhancements. Two main aspects in this context strongly interact with network performance: switching mechanisms and routing mechanisms. On one hand, a quick switching mechanism is required to reduce the processing time in intermediate nodes; in IP networks this behaviour is obtained by introducing Multiprotocol Label Switching (MPLS). On the other hand, a powerful routing mechanism that includes QoS attributes when selecting routes (QoS Routing) is also required.

Focusing on the latter aspect, most QoS routing algorithms select paths based on the information contained in the network state databases stored in the network nodes. Because of this, routing mechanisms must include an updating mechanism to guarantee that the network state information accurately represents the current network state. Since network state (topology) changes do not occur very often in conventional IP networks without QoS capabilities, most updating mechanisms there are based on a periodic refresh. In contrast, highly dynamic, large IP/MPLS networks with QoS capabilities need a finer-grained updating mechanism. Such a mechanism generates significant, undesirable signalling overhead if perfectly accurate network state information is pursued; to reduce this overhead, triggering policies are used.

The main function of a triggering policy is to determine when a network node must advertise changes in its directly connected links to other network nodes. As a consequence of the reduced signalling, the information in the network state databases might not represent an accurate picture of the actual network state. Hence, path selection may be done according to inaccurate routing information, which can cause both non-optimal path selection and an increase in connection blocking frequency.

This Thesis deals with this routing inaccuracy problem, introducing new mechanisms to reduce its effects on global network performance when selecting explicit paths under inaccurate routing information. Two network scenarios are considered, namely current IP/MPLS networks and future WDM networks, and one routing mechanism is suggested per scenario: BYPASS Based Routing (BBR) for IP/MPLS networks and BYPASS Based Optical Routing (BBOR) for WDM networks. Both mechanisms are based on a common concept, defined as dynamic bypass.

According to the dynamic bypass concept, whenever an intermediate node along the selected path unexpectedly does not have enough resources to cope with the incoming MPLS/optical-path demand requirements, it has the capability to reroute the set-up message through alternative pre-computed paths (bypass-paths). In IP/MPLS networks the BBR mechanism therefore applies the dynamic bypass concept to incoming LSP demands under bandwidth constraints, while in WDM networks the BBOR mechanism applies it when selecting light-paths (i.e., selecting the proper wavelength in both wavelength-selective and wavelength-interchangeable networks). The applicability of the proposed BBR and BBOR mechanisms is validated by simulation and compared with existing methods in their respective network scenarios. These scenarios have been selected so that the obtained results may be extrapolated to realistic networks.
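The dynamic bypass idea can be illustrated with a minimal sketch (hypothetical code, not the thesis implementation; all names are invented for illustration): the ingress node pre-computes bypass-paths for selected intermediate nodes, and when an intermediate node's outgoing link unexpectedly lacks the requested bandwidth, the set-up message is detoured onto the bypass-path instead of the connection being blocked.

```python
# Hypothetical sketch of the "dynamic bypass" concept: the ingress pre-computes
# alternative paths (bypass-paths) around selected intermediate nodes; if a
# node's outgoing link unexpectedly lacks bandwidth, the set-up message is
# rerouted onto that node's bypass-path instead of being blocked.

def set_up(path, bypass_paths, available_bw, demand):
    """Walk the explicit path; detour via a bypass-path on a bandwidth miss.

    path         -- explicit route as a list of nodes, e.g. ['A', 'B', 'C', 'D']
    bypass_paths -- dict: node -> pre-computed alternative route to the egress,
                    starting at that node (assumed to reach the egress)
    available_bw -- dict: (u, v) link -> bandwidth actually free on that link
    demand       -- bandwidth requested by the connection
    Returns the route actually established, or None if the set-up is blocked.
    """
    route = [path[0]]
    i = 0
    while i < len(path) - 1:
        u, v = path[i], path[i + 1]
        if available_bw.get((u, v), 0) >= demand:
            route.append(v)                          # link holds: keep going
            i += 1
        elif u in bypass_paths:                      # detour instead of blocking
            detour = bypass_paths[u]
            for x, y in zip(detour, detour[1:]):
                if available_bw.get((x, y), 0) < demand:
                    return None                      # bypass-path also blocked
            return route + detour[1:]
        else:
            return None                              # no bypass: connection blocked
    return route

bw = {('A', 'B'): 10, ('B', 'C'): 0, ('B', 'E'): 10, ('E', 'D'): 10}
print(set_up(['A', 'B', 'C', 'D'], {'B': ['B', 'E', 'D']}, bw, 5))
# -> ['A', 'B', 'E', 'D']: the set-up detoured around the congested B-C link
```

Without the bypass-path for node B, the same call would return None, i.e. the connection would be blocked even though an alternative with sufficient bandwidth exists.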
9

A simulation approach for modelling and investigation of inventory inaccuracy in warehouse operation

Kamaludin, Adzhar January 2010 (has links)
This thesis focuses on a simulation modelling approach to the inventory inaccuracy problems in warehouse operation. The research was motivated by inventory inaccuracy issues highlighted by a logistics company. Most previous and current research into inventory inaccuracy concerns RFID technology as a possible solution and focuses on the overall measurement of inventory management in retail business; the research presented in this thesis differs in that it focuses on inventory inaccuracy within a warehouse operation. Here, warehouse operation is studied as a detailed sequence of processes in which items flow physically while the related information is stored in the computer system in parallel. In these processes there are many places where errors can occur: in counting or recording details of inventory, or in physically moving, storing or picking items incorrectly. These details of a warehouse operation are used to develop a conceptual model of inventory inaccuracy in warehouse operations. The study also found that a product typically needs to be considered differently at different stages of its progress through the warehouse (and therefore within different sections of the conceptual model). Initially, batches of a product are likely to be delivered from a supplier, so if an error occurs soon after the product arrives at the warehouse it might involve the whole batch (for example, the batch may be misplaced and put in an incorrect storage location) or just part of the batch (for example, poor transportation by forklift truck may damage the packaging carton and some of the items within it).
When the product is stored ready for meeting customer orders, it needs to be considered as individual items (and errors can occur in counting of individual items or individual items may be misplaced or stolen). Finally, when a customer order is received, the product will be picked and grouped to meet the requirements of the order (for example, one order may require 10 of the product whilst another order may require 20 of the product). Errors might again occur to the whole group or to just part of the group. (Continued ...)
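The kind of discrepancy this conceptual model captures can be sketched in a few lines (an illustrative toy, not the thesis model; class and method names are invented): the system record and the physical count start equal and drift apart as error events hit a whole batch, part of a batch, or individual items.

```python
# Minimal illustrative sketch (not the thesis model) of inventory inaccuracy:
# the system record and the physical stock start equal and diverge as error
# events affect a whole batch, part of a batch, or individual items.

class WarehouseProduct:
    def __init__(self, system_qty):
        self.system = system_qty      # quantity the computer system believes
        self.physical = system_qty    # quantity actually available for picking

    def misplace_batch(self, qty):
        """Whole or partial batch put in the wrong location: physically
        unavailable for picking, but still recorded in the system."""
        self.physical -= qty

    def unrecorded_damage(self, qty):
        """Items damaged in handling and discarded without a transaction."""
        self.physical -= qty

    def miscount_receipt(self, qty):
        """Receiving error: more booked into the system than was delivered."""
        self.system += qty

    @property
    def inaccuracy(self):
        """Positive value: the system overstates what can really be picked."""
        return self.system - self.physical

p = WarehouseProduct(system_qty=100)
p.misplace_batch(20)        # a carton stored in the wrong aisle
p.unrecorded_damage(3)      # forklift damage, never entered in the system
p.miscount_receipt(5)       # typo at goods-in
print(p.inaccuracy)         # -> 28: system shows 28 more than are pickable
```

Running many such error events against many products, with probabilities attached to each error type, is the essence of a simulation study of inventory inaccuracy.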
10

The Problem of Missing Items at the Time of Production : A Case Study at Fläkt Woods in Jönköping

Smedberg, Karl, Asamoah-Barnieh, Raymond January 2009 (has links)
In today's manufacturing environment, different parts manufactured in-house and bought from suppliers are often assembled into a finished product. Competition has made it very important for companies to deliver a customized product on a promised date. However, when inventory items are missing at the time of production, product lead times become uncertain, making it difficult to fulfil a customer order on the promised date. It is thus important to explore the causes of missing items at the time of production in order to solve the problem. This Master of Science thesis, carried out through a case study at Fläkt Woods in collaboration with Jönköping University, addresses the problem of not finding specific inventory items in the locations specified by the computer system. It is delimited to inventory items that are physically within the company premises, or that according to the computer system are within the premises. The questions at issue have been what the causes of missing items within the company are and how to reduce the problem effectively. The thesis was carried out over an entire academic semester as full-time work in the company. The sources of the problem were found to include the work procedure, the underlying software used during work (the in-house developed ERP system), stealing from orders, ineffective barcode scans, re-sequencing at the component manufacturing department (called pre-manufacturing in the company) due to the need to fulfil multiple objectives, set-up times at the component manufacturing department, and human errors. The suggestions given include modifying the work procedure and the underlying software, increasing effective scanning, and using checks at critical points in the material flow. Areas for further research are given to further reduce the impact of the problem on the production system.
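One of the suggested countermeasures, checks at critical points in the material flow, can be sketched as a simple reconciliation of scanned locations against the locations the system records (illustrative only; the data layout and all names are hypothetical assumptions, not the company's actual ERP interface):

```python
# Illustrative sketch (hypothetical names) of a check at a critical point in
# the material flow: compare where the system says each item is against what
# a barcode scan of the locations actually found, and flag every mismatch.

def find_missing_items(system_locations, scanned_locations):
    """Return items whose scanned location disagrees with the system record.

    system_locations  -- dict: item id -> location recorded in the system
    scanned_locations -- dict: item id -> location from the latest scan
                         (an absent key means the item was not found at all)
    """
    issues = {}
    for item, loc in system_locations.items():
        found = scanned_locations.get(item)
        if found is None:
            issues[item] = (loc, 'MISSING')      # in system, not found physically
        elif found != loc:
            issues[item] = (loc, found)          # misplaced: found elsewhere
    return issues

system = {'fan-101': 'A1', 'fan-102': 'A2', 'motor-7': 'B4'}
scanned = {'fan-101': 'A1', 'motor-7': 'C2'}
print(find_missing_items(system, scanned))
# -> {'fan-102': ('A2', 'MISSING'), 'motor-7': ('B4', 'C2')}
```

Run at goods-in, before picking, or at dispatch, such a reconciliation surfaces misplaced or missing items before they delay production, rather than at the moment an order cannot be fulfilled.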
