211

Assessment of the practice and potential of industrial solid waste minimisation : case study of Stellenbosch

Semoli, Belemane Petrose, 2005
Thesis (MSc)--Stellenbosch University, 2005.
Abstract: There is increasing pressure on factories and on government to practise cleaner technology. The public is becoming increasingly environmentally aware, and external pressure from international competitors is forcing companies to adopt environmentally sound production practices. Our natural resources and the environment require environmentally friendly practices. Waste minimisation is not only prudent practice for manufacturing industries, but is also an integral part of environmental regulations in many countries, including South Africa. This research investigates the extent of, and potential for, industrial waste minimisation in Stellenbosch. The objectives of the thesis are, first, to establish and evaluate the present range and extent of industrial solid waste minimisation practices; second, to identify and evaluate potential industrial solid waste minimisation measures that could, if necessary, be instituted in future; and finally, to propose a general strategy for the minimisation of industrial solid waste in Stellenbosch. The findings show that there is currently little waste minimisation awareness and practice in Stellenbosch. The most common waste minimisation method currently practised by industries is recycling through the selling of recyclables; the least common is equipment-related change, owing to the high costs involved in adopting it. Based on the findings, a suitable regional waste management strategy was developed, which could be adopted elsewhere in South Africa.
Key words: waste minimisation, waste management, re-use, recycling, factory, environment, practice, participation, cleaner technology, awareness, Stellenbosch
212

Modelo matemático para custo e energia na produção de açúcar e álcool / Mathematical model for cost and energy in sugar and alcohol production

Ramos, Rômulo Pimentel, 1985. January 2010
Advisor: Helenice de Oliveira Florentino Silva / Examination committee: Adriana Cristina C. Nicola, Antonio Roberto Balbo / Abstract: Sugarcane is of great social and economic importance for Brazil, the largest producer of sugarcane in the world. In the 2009/2010 season, production reached 597.8 million tonnes, up 4.4% from the 571.4 million tonnes harvested in 2008/2009. That season the country had 420 mills and distilleries, which handled about 51 billion reais (1.5% of Brazilian GDP), generated 4.5 million direct and indirect jobs, exported 23.2 million tonnes of sugar and 3.3 billion liters of ethanol, and attracted average investment of 6 billion reais a year, which shows the great importance of sugarcane to the Brazilian economy. The accelerated growth of this sector, however, has brought large structural and environmental problems and has complicated the management of enterprises. It is therefore important to understand the whole process, from soil preparation through cultivation to the industrial stage, which runs from the reception of the cane to the output of the final products, and to seek ways to minimize production costs. This creates a need for tools that help company managers make decisions; mathematical modeling can serve as an important source of estimates for this sector, simplifying calculations and supporting decisions. The objective of this work was to study the entire sugar and alcohol production chain and to investigate mathematical models for quantitatively estimating the energy balance and the production cost, considering the whole process from the planting of the cane to the sale of these products. / Master's degree
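The record above does not reproduce the models themselves. As a purely illustrative sketch, a linear production-chain formulation of the kind the abstract describes could take the following shape; the stage set, symbols and linear form are assumptions made here for illustration, not the thesis's actual model.

```latex
% Hypothetical linear cost and energy-balance formulation; all symbols are
% illustrative assumptions, not taken from the thesis.
\[
  C_{\text{total}} \;=\; \sum_{j \in S} c_j\, x_j ,
  \qquad
  E_{\text{balance}} \;=\; \frac{E_{\text{out}}}{\sum_{j \in S} e_j\, x_j},
\]
where $S$ is the set of stages in the production chain (soil preparation,
planting, harvesting, transport, milling, distillation, \dots), $x_j$ is the
activity level of stage $j$, $c_j$ its unit cost, $e_j$ its unit energy
input, and $E_{\text{out}}$ the energy content of the sugar and ethanol sold.
```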
213

Génération aléatoire d'automates et analyse d'algorithmes de minimisation / Random generation of automata and analysis of their state minimization algorithms

David, Julien 28 September 2010
This thesis is about the uniform random generation of finite automata and the analysis of their state minimization algorithms. Random generation makes it possible to conduct an experimental study of the properties of the generated objects and of the algorithms that apply to them. It is also a research tool that facilitates the theoretical study of the average behavior of algorithms. Average-case analysis of algorithms follows the pioneering work of Donald Knuth. Usually, the analysis of an algorithm focuses on the worst-case scenario, which is often not representative of the practical behavior of the algorithm. From a theoretical point of view, one defines what happens "often" by fixing a probability law on the algorithm's inputs; the average-case analysis then consists in estimating the resources required under this probability distribution. In this context, I worked on several algorithms for the random generation of deterministic accessible automata (complete or not). Those algorithms are based on bijective combinatorics, which allows the use of a generic tool, the Boltzmann generator. I implemented those methods in two software packages, REGAL and PREGA. I then studied the average complexity of state minimization algorithms and obtained results showing that the average case of the two algorithms due to Moore and Hopcroft is far better than the worst case.
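For context on the algorithms whose average case is analyzed above, the sketch below is a textbook version of Moore's partition-refinement minimization of a complete DFA; the dictionary-based encoding is an assumption for illustration, and this is not the thesis's implementation nor the REGAL/PREGA code. Hopcroft's algorithm refines the same idea with a cleverer choice of splitter sets, improving the worst case.

```python
def moore_minimize(states, alphabet, delta, accepting):
    """Return the partition of `states` into equivalence classes of a
    complete DFA, using Moore's partition-refinement algorithm.

    states:    iterable of state ids
    alphabet:  iterable of input symbols
    delta:     dict mapping (state, symbol) -> state (total function)
    accepting: set of accepting states
    """
    # Initial partition: accepting vs. non-accepting states.
    block_of = {q: (q in accepting) for q in states}

    while True:
        # Signature of a state: its block plus the blocks of its successors.
        signature = {
            q: (block_of[q], tuple(block_of[delta[(q, a)]] for a in alphabet))
            for q in states
        }
        # Renumber signatures to obtain the refined partition.
        new_block_of, numbering = {}, {}
        for q in states:
            sig = signature[q]
            if sig not in numbering:
                numbering[sig] = len(numbering)
            new_block_of[q] = numbering[sig]
        # Refinement only splits blocks, so an unchanged count means stability.
        if len(set(new_block_of.values())) == len(set(block_of.values())):
            return new_block_of
        block_of = new_block_of


# Tiny usage example: a 3-state DFA over {a, b} in which states 1 and 2 are
# equivalent, so minimization yields 2 classes.
states = [0, 1, 2]
alphabet = ["a", "b"]
delta = {(0, "a"): 1, (0, "b"): 2,
         (1, "a"): 1, (1, "b"): 1,
         (2, "a"): 2, (2, "b"): 2}
accepting = {1, 2}
print(moore_minimize(states, alphabet, delta, accepting))
```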
214

Optimization methods for side-chain positioning and macromolecular docking

Moghadasi, Mohammad 08 April 2016
This dissertation proposes new optimization algorithms targeting protein-protein docking which is an important class of problems in computational structural biology. The ultimate goal of docking methods is to predict the 3-dimensional structure of a stable protein-protein complex. We study two specific problems encountered in predictive docking of proteins. The first problem is Side-Chain Positioning (SCP), a central component of homology modeling and computational protein docking methods. We formulate SCP as a Maximum Weighted Independent Set (MWIS) problem on an appropriately constructed graph. Our formulation also considers the significant special structure of proteins that SCP exhibits for docking. We develop an approximate algorithm that solves a relaxation of MWIS and employ randomized estimation heuristics to obtain high-quality feasible solutions to the problem. The algorithm is fully distributed and can be implemented on multi-processor architectures. Our computational results on a benchmark set of protein complexes show that the accuracy of our approximate MWIS-based algorithm predictions is comparable with the results achieved by a state-of-the-art method that finds an exact solution to SCP. The second problem we target in this work is protein docking refinement. We propose two different methods to solve the refinement problem. The first approach is based on a Monte Carlo-Minimization (MCM) search to optimize rigid-body and side-chain conformations for binding. In particular, we study the impact of optimally positioning the side-chains in the interface region between two proteins in the process of binding. We report computational results showing that incorporating side-chain flexibility in docking provides substantial improvement in the quality of docked predictions compared to the rigid-body approaches. Further, we demonstrate that the inclusion of unbound side-chain conformers in the side-chain search introduces significant improvement in the performance of the docking refinement protocols. In the second approach, we propose a novel stochastic optimization algorithm based on Subspace Semi-Definite programming-based Underestimation (SSDU), which aims to solve protein docking and protein structure prediction. SSDU is based on underestimating the binding energy function in a permissive subspace of the space of rigid-body motions. We apply Principal Component Analysis (PCA) to determine the permissive subspace and reduce the dimensionality of the conformational search space. We consider the general class of convex polynomial underestimators, and formulate the problem of finding such underestimators as a Semi-Definite Programming (SDP) problem. Using these underestimators, we perform a biased sampling in the vicinity of the conformational regions where the energy function is at its global minimum. Moreover, we develop an exploration procedure based on density-based clustering to detect the near-native regions even when there are many local minima residing far from each other. We also incorporate a Model Selection procedure into SSDU to pick a predictive conformation. Testing our algorithm over a benchmark of protein complexes indicates that SSDU substantially improves the quality of docking refinement compared with existing methods.
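To make the MWIS formulation of side-chain positioning concrete, the sketch below builds the kind of conflict graph described (one node per candidate rotamer, weighted by its score, with edges between incompatible choices) and applies a simple greedy heuristic. It only illustrates the problem statement under assumed data structures; the dissertation's own algorithm is a distributed relaxation with randomized estimation heuristics, not this greedy rule.

```python
def greedy_mwis(weights, edges):
    """Greedy heuristic for Maximum Weighted Independent Set.

    weights: dict node -> positive weight (e.g., a negated rotamer self-energy)
    edges:   iterable of (u, v) pairs marking incompatible nodes
             (e.g., clashing rotamers, or two rotamers of the same residue)
    Returns a set of mutually non-adjacent nodes.
    """
    adj = {u: set() for u in weights}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    chosen, excluded = set(), set()
    # Repeatedly take the best remaining node by weight/(degree+1),
    # a standard greedy ratio for weighted independent set.
    while True:
        candidates = [u for u in weights if u not in chosen and u not in excluded]
        if not candidates:
            break
        best = max(candidates,
                   key=lambda u: weights[u] / (1 + len(adj[u] - excluded - chosen)))
        chosen.add(best)
        excluded |= adj[best]          # neighbours can no longer be picked
    return chosen


# Toy instance: two residues with two rotamers each; rotamers of the same
# residue conflict, and rotamer ('R1', 0) clashes with ('R2', 1).
weights = {('R1', 0): 3.0, ('R1', 1): 2.0, ('R2', 0): 1.5, ('R2', 1): 2.5}
edges = [(('R1', 0), ('R1', 1)), (('R2', 0), ('R2', 1)),
         (('R1', 0), ('R2', 1))]
print(greedy_mwis(weights, edges))
```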
215

Využití numerické lineární algebry k urychlení výpočtu odhadů MCD / Exploiting numerical linear algebra to accelerate the computation of the MCD estimator

Sommerová, Kristýna January 2018
This work deals with speeding up the algorithmization of the MCD estimator for estimating the mean and the covariance matrix of normally distributed multivariate data contaminated with outliers. First, the main idea of the estimator and its well-known approximation by the FastMCD algorithm are discussed. The main focus is on possibilities for speeding up the iteration step known as the C-step while maintaining the quality of the estimates. This proved to be problematic, if not impossible. The work therefore aims at creating a new implementation based on the C-step and the Jacobi method for eigenvalues. The proposed JacobiMCD algorithm is compared to FastMCD in terms of floating-point operation count and results. In conclusion, JacobiMCD is not fully equivalent to FastMCD, but it hints at the possibility of its use on larger problems. The numerical experiments suggest that the computation can indeed be faster by an order of magnitude, while the quality of the results is close to that of FastMCD in some settings.
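For reference, the C-step that this work tries to accelerate is itself short. Below is a plain NumPy sketch of one FastMCD-style concentration step in its generic textbook form; the data layout, subset size and stopping rule are assumptions for illustration, and this is not the proposed JacobiMCD variant.

```python
import numpy as np

def c_step(X, subset, h):
    """One concentration step of the (Fast)MCD estimator.

    X:      (n, p) data matrix
    subset: indices of the current h-subset
    h:      subset size (roughly n/2 < h <= n)
    Returns the new h-subset, its mean, and its covariance matrix.
    The determinant of the covariance never increases from step to step.
    """
    mean = X[subset].mean(axis=0)
    cov = np.cov(X[subset], rowvar=False)
    # Squared Mahalanobis distances of all points w.r.t. the subset estimate.
    diff = X - mean
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    # Keep the h points closest to the current centre.
    new_subset = np.argsort(d2)[:h]
    return new_subset, X[new_subset].mean(axis=0), np.cov(X[new_subset], rowvar=False)


# Minimal usage: iterate C-steps from a random start until the subset stabilizes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:20] += 8.0                      # a block of outliers
h = 110
subset = rng.choice(len(X), size=h, replace=False)
for _ in range(50):
    new_subset, mean, cov = c_step(X, subset, h)
    if set(new_subset) == set(subset):
        break
    subset = new_subset
print(np.round(mean, 2))           # robust location estimate (close to 0)
```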
216

Stanovení vybraných parametrů souprav pro minimalizační způsoby zpracování půdy / Determination of selected parameters of machinery sets for minimization tillage methods

KRÝSL, Zdeněk January 2018
This diploma thesis briefly described the basics of soil cultivation, the machines used and the basic procedures, with regard to minimization (reduced) tillage. The practical part of the thesis observed and described the parameters of two machines used for minimization tillage and evaluated their effectiveness.
217

Sur quelques applications du codage parcimonieux et sa mise en oeuvre / On compressed sampling applications and its implementation

Coppa, Bertrand 08 March 2013
Compressed sensing allows a signal to be reconstructed from a few linear projections of it, under the assumption that the signal has a sparse representation, that is, one with only a few coefficients, on a known dictionary. Coding is very simple, and all the complexity is shifted to the reconstruction. After a more detailed explanation of the principle of compressed sensing, a presentation of some theoretical results from the literature, and a few simulations giving an idea of the achievable performance, we focus on three problems. First, the design of a system that codes a signal with a binary matrix, and the benefits brought by such an implementation. Second, the learning of a dictionary for sparse representation of the signal. Finally, the possibility of performing operations such as classification on the signal without reconstructing it.
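A minimal numerical sketch of the pipeline discussed above, a random binary measurement matrix followed by an l1-based reconstruction, is given below. The ISTA solver and all parameter choices are generic textbook ingredients assumed for illustration; they are not the specific system or algorithms designed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse signal: n samples, only k non-zero coefficients (canonical dictionary).
n, m, k = 256, 96, 8
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

# Binary measurement matrix (entries +/- 1, scaled), as in hardware-friendly designs.
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = A @ x_true                       # m linear projections: the "coded" signal

# ISTA: iterative soft thresholding for  min_x 0.5*||y - Ax||^2 + lam*||x||_1
lam = 0.01
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(3000):
    grad = A.T @ (A @ x - y)
    z = x - grad / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold

print("relative reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```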
218

Modelo matemático para custo e energia na produção de açúcar e álcool / Mathematical model for cost and energy in sugar and alcohol production

Ramos, Rômulo Pimentel [UNESP], 09 September 2010 (PDF)
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
219

An evaluation of solid waste management with specific reference to the municipality of Maputo City (Mozambique)

Manhica, Elsa Alberto Pondja January 2012
Thesis (MTech (Public Management))--Cape Peninsula University of Technology, 2012.
One of the greatest problems Mozambique currently faces is that a growing number of sectors produce large amounts of solid waste on a daily basis. These sectors span activities in homes, industry, mining, agriculture and commerce. As a result, this problem needs to be handled efficiently by the Municipality of Maputo. The solid waste produced each day in Maputo is not only an aesthetic problem; it poses a threat to citizens' health and damages the environment. Faced with the production of large amounts of waste each day, the Municipality of Maputo operates an ineffective solid waste management system. This ineffectiveness has a number of causes, including a lack of resources, inadequate or no staff training, poor management of solid waste by both the municipality and the government, inappropriate laws to regulate solid waste collection, poor enforcement of such laws in terms of removal and disposal of the waste, reliance on colonial-era methods for dealing with solid waste, and poor community involvement. The problem affects not only the Municipality of Maputo but also citizens and the environment. Ineffective solid waste management is linked to poor management, a lack of resources, poor staff training and unskilled public officials. The city gets dirtier as the amount of waste increases day by day, because citizens from rural areas migrated to the city looking for work after the civil war, which took place between 1977 and 1994. Emerging from a severely damaged, war-torn economy, Mozambique is still in the process of reconstituting many of its public institutions. Communities, local government, industry, commerce, civil society, academics and religious organisations can no longer turn a blind eye to poor solid waste management; instead, they need to join forces to fight it. The current situation demonstrates that too few individuals, non-profit organisations and private companies are involved in solid waste management activities. Solid waste management can only be effective if it engages all producers of waste and captures the policy strategies, planning and challenges of sustainable development.
220

Analytical logical effort formulation for local sizing / Formulação analítica baseada em logical effort para dimensionamento local

Alegretti, Caio Graco Prates January 2013
The microelectronics industry has been relying more and more on cell-based design methodology to face the growing complexity of digital integrated circuit design, since cell-based integrated circuits are designed faster and more cheaply than full-custom circuits. Nevertheless, in spite of the advances in the field of Electronic Design Automation, cell-based digital integrated circuits show inferior performance compared with full-custom circuits. It is therefore desirable to find ways to bring the performance of cell-based circuits closer to that of full-custom circuits without compromising design costs. Bearing this goal in mind, this thesis presents contributions towards an automatic flow of local optimization for cell-based digital circuits. By local optimization is meant circuit optimization within small context windows, in which optimizations are done taking the global context into account. Local optimization may thus include the detection and isolation of critical regions of the circuit and the generation of logic and transistor networks of different topologies; these networks are sized according to the existing design constraints. Since local optimizations act in a reduced context, several solutions may be obtained under the local constraints, out of which the fittest is chosen to replace the original subcircuit (critical region). The specific contribution of this thesis is the development of a subcircuit sizing method capable of obtaining minimum-active-area solutions while respecting the maximum input capacitance, the output load to be driven, and the imposed delay constraint. The method is based on the logical effort formulation, and the main contribution is to compute the area derivative to obtain minimum area, instead of taking the delay derivative to obtain minimum delay as is done in the traditional logical effort formulation. Electrical simulations show that the proposed method is very precise for a first-order approach, with average errors of 1.48% in power dissipation, 2.28% in propagation delay, and 6.5% in transistor sizes.
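For readers unfamiliar with the framework, the sketch below works through the traditional, delay-minimizing logical effort sizing that the thesis departs from: stage efforts are equalized along the path, which minimizes delay for a given load, whereas the thesis differentiates active area under a delay constraint. The gate parameters and the example path below are illustrative assumptions.

```python
def logical_effort_sizing(g, p, b, c_in, c_load):
    """Classical (delay-minimizing) logical effort sizing of an N-stage path.

    g: per-stage logical efforts, p: per-stage parasitic delays,
    b: per-stage branching factors, c_in: path input capacitance,
    c_load: capacitance of the load driven by the last stage.
    Returns the input capacitance of every stage and the minimum path delay
    (in units of tau, the delay of a parasitic-free inverter driving itself).
    """
    n = len(g)
    G = 1.0
    for gi in g:
        G *= gi                       # path logical effort
    B = 1.0
    for bi in b:
        B *= bi                       # path branching effort
    H = c_load / c_in                 # path electrical effort
    F = G * B * H                     # path effort
    f_hat = F ** (1.0 / n)            # optimal (equal) effort per stage

    # Work backwards from the load: C_i = g_i * b_i * C_(i+1) / f_hat.
    caps = [0.0] * (n + 1)
    caps[n] = c_load
    for i in range(n - 1, -1, -1):
        caps[i] = g[i] * b[i] * caps[i + 1] / f_hat

    delay = n * f_hat + sum(p)        # minimum achievable path delay
    return caps[:n], delay


# Example: inverter -> 2-input NAND -> inverter driving a 64x load,
# with textbook efforts (inverter g=1, NAND2 g=4/3) and no branching.
sizes, d = logical_effort_sizing(g=[1.0, 4/3, 1.0], p=[1.0, 2.0, 1.0],
                                 b=[1.0, 1.0, 1.0], c_in=1.0, c_load=64.0)
print([round(s, 2) for s in sizes], round(d, 2))
```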
