51

An extension for an analytical model of serial transfer lines with unreliable machines and unreliable buffers

Slatkovsky, Greg D. January 2000
No description available.
52

Effects of Forestry Streamside Management Zones on Stream Water Quality, Channel Geometry, Soil Erosion, and Timber Management in the Virginia Piedmont

Lakel, William 04 September 2008
The major study objectives were to determine whether a 50-foot streamside management zone (SMZ), as described in the Virginia BMP Manual (VDOF 2002), is generally sufficient to protect stream water quality, riparian soils, and stream bank integrity in headwater streams where forest harvesting has taken place, and to compare other SMZ widths against the same environmental protection criteria. In 2003, 16 forested watersheds were clear-cut harvested for commercial timber production. Four SMZ treatments were installed across four experimental blocks during harvest. Each of the 16 watersheds was subsequently site-prepared with prescribed burning and planted with loblolly pine (Pinus taeda). The established treatments were a 100-foot width with no thinning, a 50-foot width without thinning, a 50-foot width with thinning, and a 25-foot "stringer." Each of the four treatments was conducted within three of the four blocks (an incomplete block design). After a two-year post-harvest monitoring period, the SMZ treatments were found to have no significant effect on water quality, channel geometry, or soil erosion in and around the streams. There was no apparent water quality degradation as a result of harvesting timber, and larger SMZs did not affect any of the parameters studied. Leaving narrower SMZs or thinning within SMZs likewise caused no apparent environmental degradation. Finally, landowners who leave SMZs on their property have very limited opportunities to manage the timber within them for long-term financial gain. / Ph. D.
53

Recovery of Channel Morphology and Benthic Macroinvertebrate Assemblages after Livestock Exclusion

Ranganath, Sheila Casaba 13 July 2007
Measurements in paired stream reaches with and without livestock access in southwestern Virginia suggest that livestock exclusion practices installed on short, isolated stream reaches improve geomorphic and riparian vegetation condition but do not significantly improve the benthic macroinvertebrate assemblage. Detailed longitudinal and cross-sectional surveys, pebble counts, and rapid geomorphic assessments were conducted on contiguous, paired stream reaches (5 pairs) with and without active livestock access, across a range of times since livestock exclusion was implemented. In addition, bank characteristics were quantified by measuring groundcover biomass, shrub crown volume, tree density and diameter, soil bulk density, and particle size. Benthic macroinvertebrates were collected with a D-frame dip net and quantified using the Virginia Stream Condition Index (SCI) and other benthic macroinvertebrate metrics. We determined that: 1) small lengths of livestock exclusion can significantly increase channel depth, decrease the width-to-depth ratio, and increase groundcover vegetation growth, but do not significantly alter benthic macroinvertebrate assemblages; and 2) qualitative geomorphic assessment scores showed trends with time since exclusion (0 to more than 50 years), whereas none of the other parameters evaluated did. These observations suggest that a more targeted and holistic approach addressing watershed-wide impacts must be implemented to restore aquatic habitat. (Key words: CREP, stream channel morphology, livestock exclusion, agriculture, benthic macroinvertebrates, riparian buffers.) / Master of Science
54

Improving TCP Data Transportation for Internet of Things

Khan, Jamal Ahmad 31 August 2018
Internet of Things (IoT) is the idea that every device around us is connected, and that these devices continually collect and communicate data for analysis at a large scale in order to enable better end-user experience, resource utilization, and device performance. Data is therefore central to the concept of IoT, and the amount being collected is growing at an unprecedented rate. Current networking systems and hardware are not fully equipped to handle this influx of data, which is a serious problem because it can lead to erroneous interpretation of the data, resulting in low resource utilization and a bad end-user experience, defeating the purpose of IoT. This thesis aims at improving data transportation for IoT. In IoT systems, devices are connected to one or more cloud services over the internet via an access link. The cloud processes the data sent by the devices and sends back appropriate instructions. Hence, the performance of the two ends of the network, i.e., the access networks and the datacenter network, directly impacts the performance of IoT. The first portion of our research targets improvement of the access networks by improving access link (router) design. Among the important design aspects of routers is the size of their output buffer queue. Selecting an appropriate size for this buffer is crucial because it impacts two key metrics of an IoT system: 1) access link utilization and 2) latency. We have developed a probabilistic model to calculate the size of the output buffer that ensures high link utilization and low latency for packets, eliminating limiting assumptions of prior art that do not hold true for IoT. Our results show that for TCP-only traffic, the buffer size calculated by state-of-the-art schemes results in at least 60% higher queuing delay compared to our scheme, while achieving almost similar access link utilization, loss rate, and goodput. For UDP-only traffic, our scheme achieves at least 91% link utilization with very low queuing delays and an aggregate goodput that is approximately 90% of link capacity. Finally, for mixed-traffic scenarios, our scheme achieves higher link utilization than the TCP-only and UDP-only scenarios, as well as low delays, low loss rates, and an aggregate goodput that is approximately 94% of link capacity. The second portion of the thesis focuses on datacenter networks, where the applications that control IoT devices reside. The performance of these applications is affected by the choice of TCP variant used for data communication between virtual machines (VMs). However, cloud users have little to no knowledge about the network between the VMs and hence lack a systematic method to select a TCP variant. We have focused on characterizing TCP Cubic, Reno, Vegas, and DCTCP from the perspective of cloud tenants while treating the network as a black box. We have conducted experiments on the transport layer and the application layer. Our transport-layer experiments show that TCP Vegas outperforms the other variants in terms of throughput, RTT, and stability. Application-layer experiments show that Vegas has the worst response time, while all other variants perform similarly. The results also show that different inter-request delay distributions have no effect on throughput, RTT, or response time. / Master of Science / Internet of Things (IoT) is the idea that every electronic device around us, like watches, thermostats, and even refrigerators, is connected to the others, and that these devices continually collect and communicate data. This data is analyzed at a large scale in order to enable better user experience and improve the utilization and performance of the devices. Data is therefore central to the concept of IoT and, because of the unprecedented increase in the number of connected devices, the amount being collected is growing at an unprecedented rate. Current computer networks, over which the data is transported, are not fully equipped to handle this influx of data. This is a serious problem because it can lead to erroneous analysis of the data, resulting in low device utilization and a bad user experience, hence defeating the purpose of IoT. This thesis aims at improving data transportation for IoT by improving different components of the networks involved. In IoT systems, devices are connected to cloud computing services over the internet through a router. The router acts as a gateway to send data to and receive data from the cloud services. The cloud services act as the brain of IoT, i.e., they process the data sent by the devices and send back appropriate instructions for the devices to perform. Hence, the performance of the two ends of the network, i.e., routers in the access networks and cloud services in the datacenter network, directly impacts the performance of IoT. The first portion of our research targets the design of routers. Among the important design aspects of routers is the size of their output buffer queue, which holds the data packets waiting to be sent out. We have developed a novel probabilistic model to calculate the size of the output buffer that ensures that link utilization stays high and the latency of the IoT devices stays low, ensuring good performance. Results show that our scheme outperforms state-of-the-art schemes for TCP-only traffic and shows very favorable results for UDP-only and mixed-traffic scenarios. The second portion of the thesis focuses on improving application service performance in datacenter networks. Applications that control IoT devices reside in the cloud, and their performance is directly affected by the protocol chosen to send data between different machines. However, cloud users have almost no knowledge about the configuration of the network between the machines allotted to them in the cloud. Hence, they lack a systematic method to select a protocol variant suitable for their application. We have focused on characterizing different protocols, TCP Cubic, Reno, Vegas, and DCTCP, from the perspective of cloud tenants while treating the network as a black box. We provide in-depth analysis of and insights into the throughput and latency behaviors, which should help cloud tenants make a more informed choice of TCP congestion control.
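[Editor's note: for context on the buffer-sizing problem this record describes, here is a minimal sketch of two classical rules from the prior art the thesis positions itself against: the bandwidth-delay product (BDP) rule and the small-buffer rule of Appenzeller et al. (SIGCOMM 2004). The thesis's own probabilistic model is not reproduced in this record, so the sketch does not implement it; the link speed, RTT, and flow count below are illustrative.]

```python
def bdp_buffer_bytes(link_mbps: float, rtt_ms: float) -> float:
    # Classical rule of thumb: buffer one bandwidth-delay product (BDP).
    return link_mbps * 1e6 / 8 * rtt_ms / 1e3

def small_buffer_bytes(link_mbps: float, rtt_ms: float, n_flows: int) -> float:
    # Appenzeller et al. (2004): BDP / sqrt(N) suffices for N long-lived,
    # desynchronized TCP flows -- an assumption that may not hold for IoT.
    return bdp_buffer_bytes(link_mbps, rtt_ms) / n_flows ** 0.5

# A 100 Mbps access link with a 50 ms round-trip time and 100 TCP flows:
print(bdp_buffer_bytes(100, 50))         # 625000.0 bytes (~625 KB)
print(small_buffer_bytes(100, 50, 100))  # 62500.0 bytes (~62.5 KB)
```

The square-root reduction rests on the assumption of many long-lived, desynchronized TCP flows, which is exactly the kind of prior-art assumption the abstract says does not hold true for IoT traffic.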
55

METODOLOGIA PARA GERENCIAMENTO DE RISCOS EM PROJETOS DE TRANSMISSÃO DE ENERGIA ELÉTRICA COM FOCO NO PRAZO DE CONCLUSÃO APLICANDO O GERENCIAMENTO DE PROJETOS PELA CORRENTE CRÍTICA E SIMULAÇÃO DE MONTE CARLO / METHODOLOGY FOR RISK MANAGEMENT OF ELECTRIC ENERGY TRANSMISSION PROJECTS WITH FOCUS ON THE TIME FOR COMPLETION ADOPTING THE CRITICAL CHAIN AND MONTE CARLO SIMULATION FOR PROJECT MANAGEMENT

Pimentel, Jacinto Maia 18 October 2016
Electricity generation projects are built where specific conditions for their implementation exist, and these sites are not always close to consumer markets. The energy must be transported to consumption centers located many kilometers away, and power transmission lines are used for this purpose. Such projects are characterized by many different situations that affect their completion time. In order to pursue project management options that keep deadlines achievable, this research applies a risk management methodology for projects based on quantitative risk assessment, targeted at the energy transmission area with a focus on the project deadline. The proposed model adopts Critical Chain Project Management (CCPM), whereby activity durations are set to the shortest achievable times. The use of CCPM calls for placing buffers along the project schedule to control the uncertainties. The proposed methodology applies Monte Carlo Simulation (MCS) to the schedule, associating minimum, most-likely, and maximum durations with the activities; the identified risks are related to these activities. The result is a quantitative risk analysis of the project indicating the probability that the buffers will be consumed, the activities that most affect that consumption, and the risks associated with them, providing useful information for containment and contingency actions.
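[Editor's note: as a hedged illustration of the methodology described above, the sketch below applies Monte Carlo simulation to a serial critical chain whose activities carry (minimum, most likely, maximum) durations, and estimates how much of a CCPM project buffer would be consumed. The activity durations and the 50%-of-chain buffer-sizing rule are assumptions for the example, not values taken from the dissertation.]

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical critical-chain activities of a transmission-line project:
# (minimum, most likely, maximum) duration in days.
activities = [(10, 14, 25), (20, 28, 45), (15, 18, 30), (30, 40, 70)]

# CCPM-style aggressive schedule: the chain is the sum of most-likely
# durations, protected by a project buffer (here, half the chain length).
chain = sum(most_likely for _, most_likely, _ in activities)
buffer_days = 0.5 * chain
deadline = chain + buffer_days

# Sample the total duration, drawing triangular(min, mode, max) per activity.
n = 100_000
totals = np.zeros(n)
for lo, mode, hi in activities:
    totals += rng.triangular(lo, mode, hi, size=n)

overrun = np.clip(totals - chain, 0.0, None)  # days of buffer consumed
print(f"P(buffer fully consumed) = {(totals > deadline).mean():.3f}")
print(f"mean buffer consumption  = {(overrun / buffer_days).mean():.1%}")
```

Ranking activities by their correlation with the overrun would then identify, as the abstract puts it, the activities that most affect buffer consumption.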
56

Shorter project lead times in construction / Kortare ledtider för projekt i byggbranschen

Bergstrand, Karin, Appel, Viktor January 2016
The manufacturing industry has made an extensive journey over the last century when it comes to increasing productivity. The construction industry lags far behind manufacturing and shows nowhere near the same efficiency increase, improved quality, or decreased costs. There are many potential savings in the form of activities that consume resources without creating value for the end customer, also called waste. Waste, in combination with uncertainty and variability in task duration, forces project durations to be longer than necessary. The purpose of this study is to investigate the opportunity to shorten the lead time of a construction project and to investigate how construction companies can work continuously towards shorter project lead times. The purpose was fulfilled through a time study performed at four projects managed by one of the leading construction developers of residential buildings in Scandinavia. Waste and variability were mapped in the interior phase. The measured time for each activity was categorized as value-adding time (VT), necessary but non-value-adding time (NNVT), or non-value-adding time (NVT). The results showed that mounting consists of 48% VT, 28% NNVT, and 24% NVT, while material handling consists of 0% VT, 75% NNVT, and 25% NVT. The results further showed large variability in task duration between the projects. Theory from lean, logistics, and scheduling was combined to form three scenarios for shortening project lead times. The first scenario describes how to remove waste from the process. The second describes how to reduce variability and thus be able to reduce buffers in the time plan. The last describes how support processes, for example material handling, can be removed from the critical path in order to reduce project lead times. A proposal for a general approach to working continuously with decreasing project lead times is also presented. Takt time, variability, and risk were concluded to be central aspects for shortening project lead times. The proposed approach combines the three scenarios, taking their impact on each other into account.
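[Editor's note: a minimal sketch of the time-study bookkeeping behind the percentages above: classifying observed minutes as VT, NNVT, or NVT and computing each activity's shares. The observation records below are hypothetical; only the category scheme comes from the study.]

```python
from collections import defaultdict

# Hypothetical time-study records: (activity, category, minutes observed).
observations = [
    ("mounting", "VT", 120), ("mounting", "NNVT", 70), ("mounting", "NVT", 60),
    ("material handling", "VT", 0), ("material handling", "NNVT", 45),
    ("material handling", "NVT", 15),
]

totals = defaultdict(lambda: defaultdict(float))
for activity, category, minutes in observations:
    totals[activity][category] += minutes

# Report each activity's share of VT / NNVT / NVT.
for activity, cats in totals.items():
    whole = sum(cats.values())
    shares = ", ".join(f"{cat}: {mins / whole:.0%}" for cat, mins in cats.items())
    print(f"{activity}: {shares}")
```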
57

Benchmarking the request throughput of conventional API calls and gRPC : A Comparative Study of REST and gRPC / Mätning av genomströmningen av förfrågningar hos konventionella API-anrop och gRPC : En jämförande studie mellan REST och gRPC

Berg, Johan, Mebrahtu Redi, Daniel January 2023
As the demand for better and faster applications increases every year, so does the demand for new communication systems between computers. Today, a common method for computers and software systems to exchange information is the use of REST APIs, but there are cases where more efficient solutions are needed. In such cases, RPC can provide a solution. There are many RPC libraries to choose from, but gRPC is the most widely used today. gRPC is said to offer faster and more efficient communication than conventional web-based API calls. The problem investigated in this thesis is that there are few available resources demonstrating how this performance difference translates into request throughput on a server. The purpose of the study is to benchmark the difference in request throughput between conventional API calls (REST) and gRPC, with the goal of providing a basis for better decisions regarding the choice of communication infrastructure between applications. A qualitative research method supported by quantitative data was used to evaluate the results. REST and gRPC servers were implemented in three programming languages, and a benchmarking client was implemented to measure request throughput against the servers. The benchmarks were conducted on a local network between two hosts. The results indicate that gRPC outperforms REST for larger message payloads in terms of request throughput. REST initially outperforms gRPC for small payloads but falls behind as the payload size increases. The results can be beneficial for software developers and other stakeholders who strive to make informed decisions regarding communication infrastructure when developing and maintaining applications at scale.
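[Editor's note: the study's benchmarking client is not included in this record; below is a minimal closed-loop sketch of the REST side of such a measurement, sequentially POSTing fixed-size payloads and reporting requests per second. The endpoint URL, payload sizes, and single-connection design are illustrative assumptions, not the study's actual setup.]

```python
import time
import requests  # assumes some REST echo endpoint is running locally

def benchmark(url: str, payload: bytes, duration_s: float = 10.0) -> float:
    """Sequentially POST `payload` for `duration_s` seconds; return req/s."""
    session = requests.Session()  # reuse one TCP connection between requests
    count = 0
    end = time.perf_counter() + duration_s
    while time.perf_counter() < end:
        session.post(url, data=payload).raise_for_status()
        count += 1
    return count / duration_s

for size in (2**10, 2**15, 2**20):  # 1 KiB, 32 KiB, 1 MiB payloads
    rps = benchmark("http://localhost:8080/echo", b"x" * size)
    print(f"{size:>8} B payload: {rps:.1f} req/s")
```

Sweeping the payload size, as the loop does, is what exposes the crossover the results describe: REST ahead for small messages, gRPC ahead as payloads grow.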
58

Monensina sódica, bicarbonato de sódio e duas fontes de calcário em rações com alto teor de concentrado na terminação de cordeiros Santa Inês / Monensin, sodium bicarbonate and limestone sources in rations with high concentrated for feedlot lambs

Gastaldello Junior, Adilson Luis 08 February 2008
Two trials were conducted to evaluate performance, carcass characteristics, apparent digestibility, and ruminal parameters in feedlot lambs fed a ration of 90% concentrate (based on corn and soybean meal) and 10% "coastcross" hay, with two limestone sources and sodium bicarbonate (NaHCO3), with or without monensin. Treatments were: CC = 1.3% calcitic limestone; CCF = 1.3% "Filler" calcitic limestone; CC + BS = 1% sodium bicarbonate + 1.3% calcitic limestone, each with or without monensin (30 mg/kg DM basis). Lambs were assigned to complete randomized blocks in a 2 x 3 factorial design, according to body weight and age at the beginning of the trial. In Experiment 1, forty-two Santa Ines ram lambs were distributed across 7 blocks and 6 treatments, with an initial average body weight of 21.2 ± 3 kg at 88 ± 5 days of age. There were no differences in DMI or ADG between treatments (P > 0.05), which averaged 0.955 and 0.290 kg/day, respectively. However, lambs fed diets containing monensin had better feed conversion (P < 0.05; 3.32 kg DM/kg of gain) than those fed diets without monensin (3.58 kg DM/kg of gain). There were no differences in carcass characteristics, which averaged 50.38%, 2.42%, 13.94 cm2, and 1.87 mm for hot carcass yield, chilling losses, longissimus muscle area, and fat thickness, respectively. In Experiment 2, twenty-four lambs were distributed by body weight into 4 blocks and placed in metabolism crates to evaluate apparent digestibility of the diets in the total digestive tract, nitrogen metabolism, and ruminal parameters (pH, total volatile fatty acids, and N-NH3). Dry matter, organic matter, and crude protein digestibilities were greater for the CCF treatments (P < 0.05) than for the other buffers. Likewise, neutral detergent fiber digestibility was higher for treatments with monensin than for treatments without it. There were no differences (P > 0.05) in nitrogen metabolism, pH, water intake, acetate:propionate ratio, or butyrate concentration. However, there was an interaction (P < 0.05) between the CCF treatment and monensin inclusion for N-NH3, total volatile fatty acid, acetate, and propionate concentrations, which were lower when the ionophore was included in the CCF ration. There was also an interaction between the CC + BS treatment and monensin for total volatile fatty acid, acetate, and propionate concentrations, which were lower in the CC + BS treatment without monensin. "Filler" calcitic limestone inclusion in the diet had no detrimental effect on lamb performance or carcass characteristics, but it improved DM apparent digestibility and showed a negative associative effect with monensin. Sodium bicarbonate inclusion did not change performance or carcass results and showed a positive associative effect with monensin for total volatile fatty acids. In addition, monensin inclusion in the diet improved feed conversion.
59

Documenting and Understanding Everyday Activities through the Selective Archiving of Live Experiences

Hayes, Gillian Rachael 18 May 2007
This research focuses on the development and study of socially appropriate ways to archive data about important life experiences during unexpected and unstructured situations. This work involves three significant phases: formative studies to understand the data capture needs of particular populations of users in these situations; design and development of a technical architecture for capture and access in these settings, coupled with design and development of applications for two specific domain problems; and evaluation of this solution as it pertains to those domain problems. The underlying solution presented in this dissertation is known as selective archiving, in which services are always on and available for recording but require some explicit action to archive data. If no such action is taken, recorded data is deleted automatically after a specified time. Selectively archived segments of data can provide an efficient way to recover and analyze higher-quality data than is traditionally available. The projects presented in this dissertation provide insight into the ways in which we can support record-keeping in informal and unstructured settings. Furthermore, when examined together, these projects provide a view into the larger generalized problem of unstructured capture and access and the acceptability of capture technologies. These considerations evolved into a set of seven tensions surrounding recording technologies that are presented in this dissertation. Finally, the experiences surrounding the deployment and evaluation of selective archiving technologies demonstrate the ways in which people use different types of knowledge and cues from the world to determine their reactions to and adoption of such technologies.
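[Editor's note: the dissertation's capture-and-access architecture is not detailed in this abstract, but the core selective-archiving behavior it describes, always-on recording with automatic expiry unless an explicit action saves the window, can be sketched as a rolling buffer. Class and method names here are hypothetical, not taken from the dissertation.]

```python
import time
from collections import deque

class SelectiveArchiver:
    """Always-on recorder: samples live only in a rolling time window and
    are deleted automatically unless archive() explicitly saves them."""

    def __init__(self, window_s: float = 300.0):
        self.window_s = window_s   # retention window in seconds
        self._buffer = deque()     # (timestamp, sample) pairs

    def record(self, sample) -> None:
        now = time.time()
        self._buffer.append((now, sample))
        # Auto-delete anything older than the retention window.
        while self._buffer and self._buffer[0][0] < now - self.window_s:
            self._buffer.popleft()

    def archive(self) -> list:
        # Explicit user action: snapshot the last window_s seconds of data.
        return list(self._buffer)

# recorder = SelectiveArchiver(window_s=300)
# recorder.record(frame)        # called continuously by the capture loop
# saved = recorder.archive()    # called only when a user marks the moment
```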
60

Models Of Synchronous Production Lines With No Intermediate Buffers

Cetinay, Hande 01 July 2010
Production lines with unreliable machines have received a great amount of attention in the literature. Two-station systems have been studied most, because they are easier to handle than longer lines. Longer lines are usually evaluated by a decomposition algorithm, whereby the long line is partitioned into two-station sublines; decomposition algorithms require intermediate buffer storages with a capacity of at least two or three. Trends in modern manufacturing practice, on the other hand, such as the Toyota Production System, dictate that intermediate storage be eliminated. Our work studies multi-station lines with no intermediate storage. We develop software to automate the generation of transition probability matrices, allowing the analysis of system behavior; the algorithm lets software packages handle the computations and solve for exact solutions. Long-run behavior is obtained via the algorithm implemented in MATLAB. The purpose is to analyze system performance measures such as station starvation and blockage times, production rate, and work-in-process. In addition, the production rate and work-in-process measures are curve-fit over failure and repair probabilities to establish simple and useful empirical formulas for lines consisting of three, four, and five identical stations. Numerical analyses show that the proposed algorithm is effective for exact solutions and that the suggested formulas are valid as approximations.
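[Editor's note: as a toy version of the computation the thesis automates, the sketch below generates the joint transition probability matrix for a line of identical two-state (up/down) stations and solves for the stationary distribution; with no intermediate buffers, a synchronous line produces only in cycles where every station is up. It assumes independent, time-dependent failures and ignores blocking and starvation dynamics, which the thesis's actual state space does capture, so treat it as an illustration, not the author's model.]

```python
import numpy as np
from functools import reduce

def station_matrix(p: float, r: float) -> np.ndarray:
    # Per-station two-state chain: state 0 = up, state 1 = down.
    # The station fails with probability p and is repaired with probability r.
    return np.array([[1 - p, p],
                     [r, 1 - r]])

def line_matrix(stations: list) -> np.ndarray:
    # Joint chain over all stations (independence assumed -- a simplification).
    return reduce(np.kron, stations)

def stationary(P: np.ndarray) -> np.ndarray:
    # Solve pi = pi P, sum(pi) = 1, via the eigenvector for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    return pi / pi.sum()

# Three identical stations with failure prob 0.05 and repair prob 0.3:
P = line_matrix([station_matrix(0.05, 0.3)] * 3)
pi = stationary(P)
# Joint state 0 is "all stations up" under the Kronecker ordering, so the
# long-run production rate of the zero-buffer synchronous line is:
print("production rate ≈", pi[0])
```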
