  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
211

Aplicação do controle estatístico de processo (CEP) como ferramenta para a melhoria da qualidade do leite / Application of statistical process control (SPC) as a tool for improvement of milk quality

Takahashi, Fabio Henrique 01 September 2011 (has links)
This study aimed to employ statistical process control (SPC) as a tool to improve milk quality. The first study evaluated SPC as a tool to identify non-natural variation in milk quality that can be acted upon. Data on milk production, somatic cell count (SCC), and total bacterial count (TBC) from 384 farms, obtained from the Clínica do Leite - ESALQ/USP database for 2009, were used. The effects of natural sources of variation (season and milk production) on SCC and TBC were evaluated, and control charts were built for somatic cell score (SCS) and transformed total bacterial count (tTBC) to identify sources of non-natural variation in a group of four farms. The variability of the data was assessed with the standard deviation estimator (sigma), calculated from the moving range. Season significantly influenced SCS and tTBC. Within each season, the control charts indicated non-natural variation in SCS and tTBC in the group of farms evaluated, and also characterized farms that were in statistical process control. The second study used SPC to identify and classify farms with higher probabilities of violating the quality standards used by industry. 
SCC and TBC data from 452 farms, from January 2009 to March 2010, were used. The proportions of violations of the quality standards were calculated against the limits SCC = 400,000 cells/mL and TBC = 100,000 CFU/mL. Capability indices (Cpk) were calculated, and farms were classified into four categories of means and Cpk indices. Farms with higher means and standard deviations violated the standards more frequently. Farms with means below the proposed SCC and TBC limits represented, respectively, 25.05% and 97.78% of farms. However, the farms that delivered milk consistently within the evaluated quality standards (Cpk ≥ 1) represented only 4.65% and 35.17% of farms for SCC and TBC, respectively. Therefore, applying SPC on farms is an additional tool for monitoring the quality of the milk produced, and the Cpk index can be used by industry alongside current models for characterizing farm milk quality, since it identifies herds that are more consistent in producing milk within a quality standard.
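The two statistics this abstract leans on — a moving-range estimate of process sigma for individuals charts and a one-sided capability index against an upper limit — can be sketched in a few lines of Python. The farm readings, and the d2 = 1.128 constant for moving ranges of size two, are illustrative assumptions, not data from the thesis:

```python
import statistics

def sigma_from_moving_range(samples):
    # Individuals-chart sigma estimate: mean moving range / d2 (d2 = 1.128 for n = 2)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return statistics.mean(moving_ranges) / 1.128

def cpk_upper(samples, upper_spec):
    # One-sided Cpk against an upper specification limit (e.g. an SCC or TBC ceiling)
    mean = statistics.mean(samples)
    return (upper_spec - mean) / (3 * sigma_from_moving_range(samples))

# Hypothetical monthly SCC readings for one farm, in 1,000 cells/mL:
scc = [210, 250, 230, 270, 240, 260, 220, 255]
capable = cpk_upper(scc, 400) >= 1  # Cpk >= 1: consistently within the 400,000 cells/mL limit
```

A herd with a mean just under the limit but a large moving range would fail this screen even though its average passes, which is exactly the distinction the Cpk classification draws.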
212

3-D Nautical Charts and Safe Navigation

Porathe, Thomas January 2006 (has links)
In spite of all the electronic navigation devices on a modern ship's bridge, navigators still lose their orientation. Reasons include excessive cognitive workload caused by too many instruments to read and integrate, navigation information displayed in a cognitively demanding way, short decision times due to high speed, and fatigue due to minimum manning and long work hours. This work addresses the problem of map information displayed in a less than optimal way. Three new concepts are presented: the bridge perspective, the NoGo area polygons, and a dual-lane seaway network. Map reading can be difficult because of the mental rotations a user must perform to match the map to the world. By allowing a 3-D nautical chart to be viewed from an egocentric bridge perspective, the need for mental rotations can be removed. The cognitively demanding calculations needed to determine whether there is enough water under the keel can be performed by the chart system and the result displayed as free water surfaces and NoGo areas. On land, car driving is facilitated by a road network and a sign system; this notion can be developed further at sea to make navigation easier and safer. These concepts were tested in a laboratory experiment, in interviews, and in a prototyping project. The results were very promising. The experiment in a laboratory maze showed that map reading from an egocentric perspective was more efficient than using traditional paper and electronic maps, and interviews and expert evaluations of the prototypes showed great interest from practitioners in the field.
213

Analysing Message Sequence Graph Specifications

Chakraborty, Joy 04 1900 (has links)
Message Sequence Charts (MSCs) are a visual representation of a system specification that shows how the participating processes interact with each other. Message Sequence Graphs (MSGs) provide modularity by allowing several Message Sequence Charts to be combined to express more complicated system behavior. Requirements modeled as Message Sequence Graphs give a global view of the system, since the interactions across all participating processes can be seen; systems modeled as MSGs are thus like sequential compositions of parallel processes. This makes them attractive during the requirements gathering and review phases, which require interworking between stakeholders with varied domain knowledge and expertise: requirements engineers, system designers, end customers, test professionals, and so on. In this thesis we give a detailed construction of a finite-state transition system for a com-connected Message Sequence Graph. Though this result is fairly well known in the literature, there has been no precise description of such a transition system. Several analysis and verification problems concerning MSG specifications can be solved using this transition system, and it can be used to build correct tools for problems such as model checking and detecting implied scenarios in MSG specifications. This thesis makes several contributions. Firstly, we provide a detailed construction of a transition system that exactly implements the message sequence graph, together with detailed correctness arguments. Secondly, the construction works for general Message Sequence Graphs and is not limited to com-connected graphs, although we show that a finite model can be ensured only if the original graph is com-connected; we also show that the construction works for both synchronous and asynchronous messaging systems. 
Thirdly, we show how to find implied scenarios using the generated transition model, and we discuss some flaws in existing approaches. Fourthly, we prove undecidability for non-com-connected MSGs with synchronous messaging.
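The com-connectedness condition that separates the finite case from the infinite one can be illustrated with a small sketch: build the communication graph of an MSC from its messages and test connectivity with union-find. The process names and message pairs here are hypothetical:

```python
def com_connected(processes, messages):
    # processes: iterable of process names; messages: (sender, receiver) pairs.
    # An MSG guarantees a finite transition system only if every loop's
    # combined MSC has a connected communication graph (com-connectedness).
    parent = {p: p for p in processes}

    def find(p):
        while parent[p] != p:
            parent[p] = parent[parent[p]]  # path compression
            p = parent[p]
        return p

    for sender, receiver in messages:
        parent[find(sender)] = find(receiver)  # union the two components
    roots = {find(p) for p in processes}
    return len(roots) == 1
```

Checking each loop of the graph with such a test is how a tool can decide, up front, whether the finite-state construction applies.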
214

EXIT charts based analysis and design of rateless codes for the erasure and Gaussian channels

Mothi Venkatesan, Sabaresan 02 June 2009 (has links)
Luby Transform (LT) codes were the first class of universal erasure codes introduced to fully realize the concept of scalable and fault-tolerant distribution of data over computer networks, also called a Digital Fountain. Raptor codes, a generalization of LT codes, were later introduced to trade off complexity against performance. In this work, we show that an even broader class of codes exists that is near optimal for the erasure channel, and that Raptor codes form a special case. More precisely, Raptor-like codes can be designed based on an iterative (joint) decoding schedule in which information is transferred between the LT decoder and an outer decoder iteratively. The design of these codes can be formulated as an LP problem using EXIT charts and density evolution. We show the existence of codes, other than Raptor codes, that perform as well as the existing ones. We extend this framework of joint decoding of the component codes to additive white Gaussian noise channels and introduce the design of rateless codes for these channels. Under this setting, for asymptotic lengths, it is possible to design codes that work for a class of channels defined by the signal-to-noise ratio. We show that good profiles can be designed using density evolution and a Gaussian approximation. EXIT charts prove to be an intuitive tool and aid in formulating the code design problem as an LP problem; however, EXIT charts are not exact because of their inherent approximations, so we use density evolution to analyze the performance of these codes. In the Gaussian case, we show that for asymptotic lengths a range of rateless code designs exists to choose from, based on the required complexity and overhead. Moreover, under this framework we can design incrementally redundant schemes for existing outer codes, making the communication system more robust to channel noise variations.
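As a rough sketch of the LT encoding this abstract builds on, the following generates coded symbols using the ideal soliton degree distribution (practical fountains use the robust soliton; the seed and byte values are arbitrary illustrations):

```python
import random

def ideal_soliton(k):
    # Ideal soliton distribution: rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d = 2..k
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode_symbol(source, rng):
    # One LT-coded symbol: sample a degree, XOR that many distinct source symbols
    k = len(source)
    degree = rng.choices(range(1, k + 1), weights=ideal_soliton(k))[0]
    neighbours = rng.sample(range(k), degree)
    value = 0
    for i in neighbours:
        value ^= source[i]
    return neighbours, value

rng = random.Random(42)
data = [0x12, 0x34, 0x56, 0x78]
coded = [lt_encode_symbol(data, rng) for _ in range(8)]
```

A receiver can recover the source from slightly more than `len(data)` such symbols by iteratively peeling degree-one symbols, which is the LT decoder that the joint schedule couples to an outer decoder.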
215

Model-based Code Generation For The High Level Architecture Federates

Adak, Bulent Mehmet 01 December 2007 (has links) (PDF)
We tackle the problem of automated code generation for a High Level Architecture (HLA)-compliant federate application, given a model of the federation architecture including the federate's behavior model. The behavior model is based on Live Sequence Charts (LSCs), adopted as the behavioral specification formalism in the Federation Architecture Metamodel (FAMM). The FAMM is constructed conforming to metaGME, the meta-metamodel offered by the Generic Modeling Environment (GME), and serves as a formal language for describing federation architectures. We present a code generator that generates Java/AspectJ code directly from a federation architecture model. One objective is to help verify a federation architecture by testing it early in the development lifecycle; another is to help developers construct complete federate applications. Our approach is aspect-oriented in that the code generated from the LSCs, in conjunction with the Federation Object Model (FOM), serves as the base code on which the computation logic is weaved as an aspect.
216

Metamodeling For The HLA Federation Architectures

Topcu, Okan 01 December 2007 (has links) (PDF)
This study proposes a metamodel, named Federation Architecture Metamodel (FAMM), for describing the architecture of a High Level Architecture (HLA) compliant federation. The metamodel provides a domain specific language and a formal representation for the federation adopting Domain Specific Metamodeling approach to HLA-compliant federations. The metamodel supports the definitions of transformations both as source and as target. Specifically, it supports federate base code generation from a described federate behavior, and it supports transformations from a simulation conceptual model. A salient feature of FAMM is the behavioral description of federates based on live sequence charts (LSCs). It is formulated in metaGME, the meta-metamodel for the Generic Modeling Environment (GME). This thesis discusses specifically the following points: the approach to building the metamodel, metamodel extension from Message Sequence Chart (MSC) to LSC, support for model-based code generation, and action model and domain-specific data model integration. Lastly, this thesis presents, through a series of modeling case studies, the Federation Architecture Modeling Environment (FAME), which is a domain-specific model-building environment provided by GME once FAMM is invoked as the base paradigm.
217

Controlling High Quality Manufacturing Processes: A Robustness Study Of The Lower-Sided TBE EWMA Procedure

Pehlivan, Canan 01 September 2008 (has links) (PDF)
In quality control applications, Time-Between-Events (TBE) observations may be monitored using Exponentially Weighted Moving Average (EWMA) control charts. A widely accepted model for TBE processes is the exponential distribution, and TBE EWMA charts are therefore designed under this assumption. Nevertheless, practical applications do not always conform to the theory, and it is common for the observations not to fit the exponential model. Control charts that are robust to departures from the assumed distribution are therefore desirable in practice. In this thesis, the robustness of lower-sided TBE EWMA charts to the assumption of exponentially distributed observations is investigated. Weibull and lognormal distributions are considered to represent departures from the assumed exponential model, and a Markov chain approach is used to evaluate the chart's performance. Based on the performance results, design settings are suggested for achieving robust lower-sided TBE EWMA charts.
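A lower-sided TBE EWMA of the kind examined here can be sketched as follows; the smoothing constant, limit width, and the asymptotic lower control limit (using Var(X) = mu0² for exponential observations) are generic textbook choices, not the thesis's exact design parameters:

```python
import math

def lower_tbe_ewma(times, lam=0.1, mu0=1.0, L=2.0):
    # EWMA of time-between-events data; a drop below the LCL signals that
    # events are arriving faster than the in-control rate (deterioration).
    lcl = mu0 - L * mu0 * math.sqrt(lam / (2 - lam))  # asymptotic lower limit
    z = mu0  # start the statistic at the in-control mean
    for i, x in enumerate(times):
        z = lam * x + (1 - lam) * z
        if z < lcl:
            return i  # index of the first out-of-control signal
    return None  # no signal
```

With in-control inter-event times near mu0 the statistic hovers around the mean; a stream of short inter-event times pulls it below the limit within a few observations. The robustness question the thesis studies is how this signaling behavior changes when `times` come from a Weibull or lognormal distribution instead of the assumed exponential.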
218

The presentation and interpretation of arrow symbolism in biology diagrams at secondary-level.

Du Plessis, Lynn. January 2006 (has links)
The literature contains conflicting ideas about the effectiveness of diagrams, and their constituent symbolism, as teaching and learning tools. In addition, only limited research has been specifically conducted on the presentation and interpretation of arrow symbolism used in biology diagrams, let alone on the nature, source and remediation of student difficulties caused by arrows. On the basis of this limited research and 30 years of experience of teaching biology at secondary level, the author suspected that students might have difficulties interpreting arrow symbolism in diagrams used as explanatory tools and decided to thoroughly investigate this issue. The hypothesis, 'Secondary-level students have difficulty with the use of arrow symbolism in biology diagrams', was formulated and the following broad research questions defined to address the hypothesis: 1. How much of a problem is arrow symbolism in diagrams? 2. How effectively is arrow symbolism used in diagrams to promote the communication of intended ideas? 3. To what extent does the design of arrow symbolism in diagrams influence students' interpretation and difficulties? 4. How can the emerging empirical data and ideas from literature be combined to illustrate the process of interpretation of arrow symbolism? 5. What measures can be suggested for improving the presentation and interpretation of arrow symbolism in biology diagrams at secondary level? To address Research question 1, a content analysis of all arrow symbolism in seven popular secondary-level biology textbooks was undertaken. This revealed a wide diversity of arrow styles, spatial organisations, purposes and meanings that could be confusing to students. These results suggested the need for an evaluation of the effectiveness of arrow symbolism (Research question 2). 
As there was no definitive set of guidelines available for specifically evaluating arrows, general guidelines from the literature on diagrams were used to develop a set of 10 criteria, to evaluate the syntactic, semantic and pragmatic dimensions of arrow symbolism, which were validated by selected educators, students and a graphic design expert. Application of the criteria (which constituted expert opinion) to the arrow symbolism used in 614 realistic, stylised and abstract diagram types, revealed a relatively high incidence (30%) of inappropriately presented arrow designs that could mislead students. To establish whether this problem could be the cause of student difficulties, and to thereby address Research question 3, a stylised and an abstract diagram were selected and evaluated according to the criteria. The results of the evaluation were compared to the responses given by 174 students to a range of written and interview probes and student modified diagrams. In this way, student performance was correlated with expert opinion. The results confirmed that students experience a wide range of difficulties (26 categories) when interpreting arrow symbolism, with some (12 categories) being attributable to inappropriately presented arrow symbolism and others (14 categories) to student-related processing skills and strategies at both surface- and deeper-levels of reasoning. To address question 4, the emerging empirical data from the evaluation and student studies was combined with a wide range of literature, to inform the development of a 3-level, non-tiered model of the process of interpretation of arrow symbolism in diagrams. As this model emphasised the importance of both arrow presentation in diagrams and arrow interpretation by students, it could be used as an effective explanatory tool as well as a predictive tool to identify sources of difficulty with the use of arrow symbolism. 
This model was, in turn, used to inform the compilation of a range of guidelines for improving the presentation and interpretation of arrow symbolism, and so target Research question 5. These, and other guidelines grounded in the data and relevant literature, were suggested for all role players, including students, educators, textbook writers, graphic artists and researchers, to use as remedial tools. Future research should focus on the implementation of these guidelines and studying their effectiveness for improving the presentation and interpretation of diagrams with arrow and other types of symbolism. / Thesis (Ph.D.)-University of KwaZulu-Natal, Pietermaritzburg, 2006.
219

A chemistry-inspired middleware for flexible execution of service based applications

Wang, Chen 28 May 2013 (has links) (PDF)
With the advent of cloud computing and Software-as-a-Service, Service-Based Applications (SBAs) represent a new paradigm for building rapid, low-cost, interoperable, and evolvable distributed applications. A new application is created by defining a workflow that coordinates a set of third-party Web services accessible over the Internet. In such a distributed and loosely coupled environment, the execution of an SBA requires a high degree of flexibility: suitable constituent services can be selected and integrated at runtime based on their Quality of Service (QoS), and the service composition must be dynamically modified in response to unexpected runtime failures. In this context, the main objective of this dissertation is to design, develop, and evaluate a service middleware for flexible execution of SBAs using the chemical programming model. Using the chemical metaphor, service-based systems are modeled as distributed, self-organized, and self-adaptive biochemical systems. Service discovery, selection, coordination, and adaptation are expressed as a series of pervasive chemical reactions in the middleware, performed in a distributed, concurrent, and autonomous way. Additionally, in building flexible service-based systems, we do not restrict our research to chemical-based solutions. A second objective of this thesis is therefore to find generic solutions, such as models and algorithms, to some of the most challenging problems in the flexible execution of SBAs. I have proposed a two-phase online prediction approach that can accurately decide to proactively execute an adaptation plan before failures actually occur.
220

Controle de qualidade de biodieseis de macaúba e algodão e suas misturas com o diesel usando Espectrometria no Infravermelho Médio e Cartas de Controle Multivariadas / Quality control of macaúba and cotton biodiesels and their diesel blends using Mid-Infrared Spectrometry and Multivariate Control Charts

Guimarães, Eloiza 23 February 2018 (has links)
CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico / Brazilian Law No. 13.033/2014 mandates the addition of 7 ± 0.5% (v/v) of biodiesel to the diesel oil used in the road system and prohibits the addition of vegetable oils or any solvents to this blend. However, there are cases of irregular addition of vegetable and/or waste oils to diesel, mainly because these raw materials cost less than the final product, as well as contamination during transport and storage. Analyses that provide immediate and efficient answers are therefore needed to guarantee the quality of marketed fuels. This study proposes monitoring the quality of biodiesel/diesel blends (7% biodiesel and 93% diesel) using Mid-Infrared Spectrometry combined with Multivariate Control Charts based on the net analyte signal (NAS). The biodiesels were produced from macaúba (Acrocomia aculeata) and cotton oils using methanol and ethanol. 
For each model, three charts were developed: the NAS chart, which corresponds to the analyte of interest (here, biodiesel); the interferent chart, related to the contributions of the other components in the sample (diesel); and the residual chart, corresponding to non-systematic variation in the spectra (instrumental noise). A total of 1508 samples were used in the calibration and validation of six models (cotton ethyl and methyl biodiesels, macaúba kernel methyl biodiesels, macaúba mesocarp methyl and ethyl biodiesels, and commercial B7). In the calibration step, 103 samples within the quality specifications were used to establish the statistical limits for each chart. The validation step used samples both within and outside the quality specifications (1405 samples). Validation with out-of-specification samples was carried out in two ways: with respect to the biodiesel content in diesel, and with respect to the presence of adulterants in the biodiesel, the diesel, and the blend. Adulteration consisted of partial replacement of biodiesel by soybean, corn, and waste oils; partial replacement of diesel by lubricating oil, kerosene, and gasoline; and direct addition of these adulterants to the biodiesel/diesel blend (B7) in the range of 3.5 to 43.5% (v/v), corresponding to an adulteration range of 0.2 to 30% (v/v) in the blend. It was thus possible to separate conforming from non-conforming samples, both with respect to the biodiesel content in diesel and to the presence of adulterants, since in-specification samples stayed within the established limits and out-of-specification samples exceeded the limit in at least one of the charts. The results show that the method described in this study is a viable, efficient, and fast alternative for biofuel quality control. / Doctoral Thesis (Tese de Doutorado)
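The NAS-based split behind the three charts can be illustrated with a small orthogonal-projection sketch: the spectrum is decomposed into a part along the analyte direction net of interferents (NAS), a part in the interferent space, and a residual. The three-channel vectors are toy values, not spectra from the thesis:

```python
import numpy as np

def nas_decompose(x, analyte_dir, interferent_basis):
    # Project out the interferent space, take the analyte direction orthogonal
    # to it (the net-analyte-signal direction), and split x into three
    # mutually orthogonal parts: NAS, interferent, residual.
    B = np.atleast_2d(interferent_basis)      # rows span the interferent space
    P = B.T @ np.linalg.pinv(B.T)             # projector onto that space
    nas_dir = analyte_dir - P @ analyte_dir
    nas_dir = nas_dir / np.linalg.norm(nas_dir)
    nas = (x @ nas_dir) * nas_dir             # NAS-chart contribution
    interferent = P @ x                       # interferent-chart contribution
    residual = x - nas - interferent          # residual-chart contribution
    return nas, interferent, residual

# Toy example: analyte along [1, 1, 0], one interferent along [1, 0, 0]
x = np.array([2.0, 3.0, 5.0])
nas, interferent, residual = nas_decompose(
    x, np.array([1.0, 1.0, 0.0]), np.array([[1.0, 0.0, 0.0]])
)
```

Charting the magnitude of each of the three parts over incoming samples gives, respectively, the NAS, interferent, and residual control charts; an adulterant shifts at least one of them outside its statistical limits.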
