581

Conceptual design of long-span trusses using multi-stage heuristics

Agarwal, Pranab 16 August 2006 (has links)
A hybrid method that addresses the design and optimization of long-span steel trusses is presented. By utilizing advancements in present-day computing and biologically inspired analysis and design, an effort has been made to automate the process of evolving optimal trusses in an unstructured problem domain. Topology, geometry, and sizing optimization of trusses are addressed simultaneously using a three-stage methodology. Multi-objective genetic algorithms are used to optimize the member section sizes of truss topologies and geometries. Converting constraints into additional objectives provides a robust algorithm with improved convergence to the Pareto-optimal set of solutions. In addition, the Pareto curve plotted from how well the different objectives are satisfied helps identify the trade-offs that exist between these objectives, while also providing an efficient way to rank the population of solutions during the search process. A comparison study between multi-objective genetic algorithms, simulated annealing, and reactive taboo search is conducted to evaluate the efficiency of each method in terms of overall performance, computational expense, sensitivity to initial parameter settings, and repeatability in finding near-global optimal designs. The benefit of using a three-stage approach, and of implementing the entire model on parallel computers, is the high level of computational efficiency obtained for the entire process and the near-optimal solutions produced. The overall efficiency and effectiveness of this method has been established by comparing the truss designs obtained with this method on bridge and roof truss benchmark problems against truss designs obtained by other researchers. One of the salient features of this research is the large number of optimal trusses produced as the final result. The range of designs available provides the user with the flexibility to select the truss design that best matches their design requirements. By supporting human-computer interaction between these stages, the program also incorporates subjective aesthetic criteria, which assist in producing final designs in consonance with the user's requirements.
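As a rough illustration of the constraint-handling idea described in this abstract (constraint violations treated as extra objectives and candidates ranked by Pareto dominance), the sketch below shows a minimal non-dominated ranking in Python. It is not the thesis's implementation; the objective names (weight, deflection) and the single aggregated violation measure are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class TrussDesign:
    weight: float          # objective 1: total member weight (minimize), assumed for illustration
    deflection: float      # objective 2: midspan deflection (minimize), assumed for illustration
    violation: float       # aggregated constraint violation, treated as an additional objective

def objectives(d: TrussDesign):
    # All three values are minimized; the violation term is the "constraint as objective" device.
    return (d.weight, d.deflection, d.violation)

def dominates(a: TrussDesign, b: TrussDesign) -> bool:
    """True if a is at least as good as b in every objective and strictly better in at least one."""
    fa, fb = objectives(a), objectives(b)
    return all(x <= y for x, y in zip(fa, fb)) and any(x < y for x, y in zip(fa, fb))

def pareto_rank(population: list) -> list:
    """Rank 0 is the non-dominated front; higher ranks are successively dominated layers."""
    ranks = [0] * len(population)
    remaining = set(range(len(population)))
    rank = 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(population[j], population[i]) for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks
```

Selection in the genetic algorithm can then favor lower ranks, so feasible, well-performing topologies drift toward the Pareto front without a hand-tuned penalty weight.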
582

Advances in LTL load plan design

Zhang, Yang 07 July 2010 (has links)
A load plan specifies how freight is routed through a linehaul terminal network operated by a less-than-truckload (LTL) carrier. Determining the design of the load plan is critical to effective operations of such carriers. This dissertation makes contributions in modeling and algorithm design for three problems in LTL load plan design: (1) Refined execution cost estimation. Existing load plan design models use approximations that ignore important facts such as the nonlinearity of transportation costs with respect to the number of trailers, and empty travel beyond what is required for trailer balance that results from driver rules. We develop models that more accurately capture key operations of LTL carriers and produce more accurate estimates of operational execution costs; (2) Dynamic load planning. Load plans are traditionally revised infrequently by LTL carriers due to the difficulty of solving the associated optimization problem. Technological advances have now enabled carriers to consider daily load plan updates. We develop technologies that efficiently and effectively adjust a nominal load plan for a given day based on the actual freight to be served by the carrier. We present an integer programming based local search procedure and a greedy randomized adaptive search heuristic; and (3) Stochastic load plan design. Load plan design models commonly represent origin-destination freight volumes using average demands, which do not describe freight volume fluctuations. We investigate load plan design models that explicitly utilize information on freight volume uncertainty and design load plans that most cost-effectively deal with varying freight volumes and lead to the lowest expected cost. We present a Sample Average Approximation approach and a variant of the method for solving the stochastic integer programming formulations.
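The Sample Average Approximation (SAA) step mentioned for the stochastic problem follows a standard pattern: solve several small sampled problems, then evaluate the candidate plans on a larger independent sample. The generic loop below is a sketch only; the scenario sampler, the deterministic load plan solver, and the sample sizes are placeholders, not the dissertation's actual models.

```python
import random
import statistics

def saa_load_plan(sample_scenarios, solve_deterministic, evaluate_cost,
                  n_batches=10, batch_size=20, eval_size=500, seed=0):
    """Generic Sample Average Approximation loop (illustrative interfaces, assumed here).

    sample_scenarios(k, rng)     -> list of k freight-volume scenarios
    solve_deterministic(batch)   -> load plan minimizing average cost over the batch
    evaluate_cost(plan, scenario)-> execution cost of a plan under one scenario
    """
    rng = random.Random(seed)
    candidates = []
    for _ in range(n_batches):
        batch = sample_scenarios(batch_size, rng)      # small sample keeps each problem tractable
        candidates.append(solve_deterministic(batch))  # e.g. an integer program or local search

    # Estimate each candidate's expected cost on a larger, independent sample and keep the best.
    test = sample_scenarios(eval_size, rng)
    return min(candidates,
               key=lambda plan: statistics.mean(evaluate_cost(plan, s) for s in test))
```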
583

Heuristic Scheduling Algorithms For Parallel Heterogeneous Batch Processors

Mathirajan, M 11 1900 (has links)
In the last decade, market pressures for a greater variety of products forced a gradual shift from continuous manufacturing to batch manufacturing in various industries. Consequently, batch scheduling problems have attracted the attention of researchers in production and operations management. This thesis addresses the scheduling of parallel non-identical batch processors in the presence of dynamic job arrivals, incompatible job families and non-identical job sizes. This problem abstracts the scheduling of heat-treatment furnace operations for castings in a steel foundry. The problem is of considerable interest in this sector, as a large proportion of the total production time is spent in heat-treatment processing. It is also encountered in other industrial settings such as the burn-in operation in the final testing stage of semiconductor manufacturing, and the manufacturing of steel, ceramics, aircraft parts, footwear, etc. A detailed literature review and personal communications with experts revealed that this class of batch scheduling problems has not been addressed hitherto. A major concern in the management of foundries is to maximize throughput and reduce flow time and work-in-process inventories. Therefore the primary scheduling objective chosen is the utilization of batch processors, with the minimization of overall flow time and of weighted average waiting time per job as secondary objectives. This formulation can be considered an extension of the problems studied by Dobson and Nambinadom (1992), Uzsoy (1995), Zee et al. (1997) and Mehta and Uzsoy (1998). Our effort to carefully catalogue the large number of variants of deterministic batch scheduling problems led us to the development of a taxonomy and notation. Not surprisingly, we are able to show that our problem is NP-hard and is therefore in the company of many scheduling problems that are difficult to solve. Initially two heuristic algorithms were developed: a mathematical-programming-based heuristic algorithm (MPHA) and a greedy heuristic algorithm. Due to the computational overheads of MPHA compared with the greedy heuristic, we chose to focus on the latter as the primary scheduling methodology. Preliminary experimentation showed that the performance of greedy heuristics depends critically on the selection of job families, so eight variants of the greedy heuristic that differ mainly in the "job-family selection" decision were proposed. These eight heuristics form two sets, {A1, A2, A3, A4} and the modified {MA1, MA2, MA3, MA4}, which differ only in how the job-family index, weighted shortest processing time, is computed. Computational experiments were carried out to evaluate the performance of the eight heuristics, and the analysis of the experimental data is presented from two perspectives. The first evaluates the absolute quality of the solutions obtained by the proposed heuristic algorithms against estimated optimal solutions; the second compares the relative performance of the proposed heuristics. The test problems were designed to reflect real-world scheduling problems observed in the steel-casting industry. Three important problem parameters for test set generation are the number of jobs [n], job priority [P], and job family [F]; we considered 5 levels for n, 2 levels for P and 2 levels for F.
The test set reflects that (i) job sizes vary uniformly, (ii) there are two batch processors and (iii) there are five incompatible job families with different processing times. Fifteen problem instances were generated for each combination of (n, P, F). Of the many procedures available in the literature for estimating the optimal value of combinatorial optimization problems, we used the procedure based on the Weibull distribution as discussed in Rardin and Uzsoy (2001). For each of the 300 randomly generated problem instances, 15 feasible solutions (i.e., values of the average utilization of batch processors, AUBP) were obtained using a "random decision" rule for the first two stages and a "best-fit" heuristic for the last stage of the scheduling problem. These 15 feasible solutions were used to estimate the optimal value and are expected to provide that estimate with very high probability. Both the average and the worst-case performance of the heuristics indicated that algorithms A3 and A4 on average yielded better utilization than the estimated optimal value, which suggests that the Weibull-based technique may have produced conservative estimates of the optimum. The remaining heuristic algorithms found solutions inferior to the estimated optimal value, but the deviations were very small; from this we may infer that all the proposed heuristic algorithms are acceptable. The relative evaluation of the heuristics considered both computational effort and solution quality. The computational burden is low enough, on average, to run all the proposed heuristics on each problem instance and select the best solution. It is also observed that any algorithm from the first set {A1, A2, A3, A4} takes more computational time than any algorithm from the second set {MA1, MA2, MA3, MA4}. Regarding solution quality, the following inferences were made: (i) in general the heuristic algorithms are sensitive to the choice of problem factors with respect to all the scheduling objectives; (ii) the three algorithms A3, MA4 and MA1 are superior with respect to the scheduling objectives of maximizing the average utilization of batch processors (AUBP), minimizing overall flow time (OFT) and minimizing weighted average waiting time (WAWT), respectively. Further, the heuristic algorithm MA1 turns out to be the best choice if all three objectives AUBP, OFT and WAWT are traded off. Finally, simple sensitivity analysis experiments were carried out to understand the influence of selected scheduling parameters on the performance of the heuristic algorithms, with one-at-a-time changes in (1) the job-size distribution, (2) the capacities of the batch processors and (3) the processing times of the job families. The analyses indicate that changes in these input parameters do influence performance, and the results can guide the selection of a heuristic for a particular combination of input parameters; for example, if a single heuristic algorithm must be chosen, MA1 is the best choice considering both performance and the robustness indicated by the sensitivity analysis. In summary, this thesis examined a problem arising in the scheduling of heat-treatment operations in the steel-casting industry, abstracted to a class of deterministic batch scheduling problems.
We analyzed the computational complexity of this problem and showed that it is NP-hard and therefore unlikely to admit a scalable exact method. Eight variants of a fast greedy heuristic were designed to solve the scheduling problem of interest. Extensive computational experiments compared the performance of the heuristics with estimated optimal values (using the Weibull technique) and with each other, showing that the heuristics consistently obtain near (estimated) optimal solutions with very low computational burden, even for large-scale problems. Finally, a comprehensive sensitivity analysis studied the influence of a few parameters, changed one at a time, on the performance of the heuristic algorithms; this type of analysis gives users some confidence in the robustness of the proposed heuristics.
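The Weibull-based estimate of the optimal utilization referred to above works by treating the sample of feasible solution values for an instance as extreme-value data and reading the optimum off the fitted location parameter. The sketch below uses SciPy's generic three-parameter Weibull fit as a stand-in for the specific estimator in Rardin and Uzsoy (2001), and the sample numbers are purely illustrative; because utilization is maximized, values are negated before fitting.

```python
import numpy as np
from scipy.stats import weibull_min

def estimate_optimal_utilization(feasible_utilizations):
    """Estimate the (maximum) optimal utilization from a sample of feasible solutions.

    Assumes, as in Weibull-based extreme-value estimation, that the best tail of the
    sampled solution values follows a three-parameter Weibull; the location parameter
    of the fit to the negated (minimization) sample then estimates the negated optimum.
    """
    values = np.asarray(feasible_utilizations, dtype=float)
    shape, loc, scale = weibull_min.fit(-values)  # generic MLE fit; not the exact thesis estimator
    return -loc

# Example with 15 feasible AUBP values for one problem instance (made-up numbers).
sample = [0.71, 0.74, 0.69, 0.73, 0.75, 0.70, 0.72, 0.74, 0.68, 0.73,
          0.71, 0.76, 0.72, 0.70, 0.74]
print(round(estimate_optimal_utilization(sample), 3))
```

A heuristic whose utilization exceeds this estimate, as reported for A3 and A4, simply indicates that the fitted location parameter was a conservative proxy for the true optimum.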
584

Assembly and test operations with multipass requirement in semiconductor manufacturing

Gao, Zhufeng 30 June 2014 (has links)
In semiconductor manufacturing, wafers are grouped into lots and sent to a separate facility for assembly and test (AT) before being shipped to the customer. Up to a dozen operations are required during AT. The facility in which these operations are performed is a reentrant flow shop consisting of several dozen to several hundred machines and up to a thousand specialized tools. Each lot follows a specific route through the facility, perhaps returning to the same machine multiple times. Each step in the route is referred to as a "pass." Lots in work in process (WIP) that have more than a single step remaining in their route are referred to as multi-pass lots. The multi-pass scheduling problem is to determine machine setups, lot assignments and lot sequences that achieve optimal output, as measured by four objectives related to key device shortages, throughput, machine utilization, and makespan, prioritized in this order. The two primary goals of this research are to develop a new formulation for the multi-pass problem and to design a variety of solution algorithms that can be used for both planning and real-time control. To begin, the basic AT model considering only single-pass scheduling and the previously developed greedy randomized adaptive search procedure (GRASP), along with its extensions, are introduced. Then two alternative schemes are proposed to solve the multi-pass scheduling problem. In the final phase of this research, an efficient procedure is presented for prioritizing machine changeovers in an AT facility on a periodic basis, providing real-time support. In daily planning, target machine-tooling combinations are derived based on work in process, due dates, and backlogs. As machines finish their current lots, they need to be reconfigured to match their targets. The proposed algorithm is designed to run in real time.
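The GRASP framework cited for single-pass scheduling follows a standard two-phase pattern: greedy randomized construction followed by local search, repeated for a fixed number of iterations. The skeleton below is generic; the construction heuristic, neighborhood, objective function and parameter values are placeholders rather than the dissertation's AT-specific components.

```python
import random

def grasp(construct_randomized, local_search, cost, iterations=100, alpha=0.3, seed=0):
    """Generic GRASP skeleton (interfaces assumed for illustration).

    construct_randomized(rng, alpha) -> feasible schedule built greedily, with the candidate
        list restricted to elements within a factor `alpha` of the best greedy score
    local_search(schedule)           -> locally improved schedule
    cost(schedule)                   -> scalar objective to minimize (e.g. weighted shortage)
    """
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        schedule = construct_randomized(rng, alpha)   # phase 1: randomized greedy construction
        schedule = local_search(schedule)             # phase 2: improve to a local optimum
        c = cost(schedule)
        if c < best_cost:
            best, best_cost = schedule, c
    return best, best_cost
```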
585

対応経験を活用した避難対策と災害対応計画策定手法に関する研究 / A study on evacuation measures and methods for developing disaster response plans that utilize response experience

三宅, 英知 23 March 2015 (has links)
Kyoto University (京都大学) / 0048 / New-system doctoral degree by coursework / Doctor of Informatics / 甲第19114号 / 情博第560号 / 新制||情||99 / 32065 / Department of Social Informatics, Graduate School of Informatics, Kyoto University / Examining committee: Professor 林 春男 (chief examiner), Professor 田中 克己, Professor 喜多 一 / Qualified under Article 4, Paragraph 1 of the Degree Regulations
586

Humor-Centered Design: Using Humor as a Rhetorical Approach in Design

Delaney, Chelsey 01 May 2011 (has links)
My thesis pursues the development of a tool to empower designers and non-designers to better understand humor's function in design and to encourage the use of humor as a rhetorical device for tackling social problems. Humor research is a field largely based on linguistic studies, but its multidisciplinary expansion over the past decade has given it a broad rhetorical influence; however, it has yet to form a substantial relationship with design. Through a literature review of linguistic, rhetorical, and design theories, I identified a set of heuristics that guide how humor should operate in design. I then tested the effectiveness of the heuristics and, with their final revision, applied them to designing for motivational problems associated with public displays of political mobilization. My user research informed the creation of a mobile instructional tool that guides the collaborative and/or individual production of political communication artifacts (e.g., rally signs), which use humor to confront socially complex issues. The artifacts' implicit intent is to motivate political mobilization and to found and/or empower communities. My project focus entails the creation and testing of the tool on the individual level. Whether the artifacts created produce the desired effect regarding mobilization and community strength is unknown; future work should test humorous design's effect on political mobilization and its ability to empower communities.
587

On the performance of recent swarm based metaheuristics for the traveling tournament problem.

Saul, Sandile Sinethemba. 08 October 2014 (has links)
M.Sc. University of KwaZulu-Natal, Durban 2013.
588

La résolution du problème de formation de cellules dans un contexte multicritère / Solving the cell formation problem in a multicriteria context

Ahadri, Mohamed Zaki 01 1900 (has links)
Group technology techniques are now widely used in many manufacturing facilities. These techniques decompose industrial systems into subsystems, or cells, made up of parts and machines. Finding the most effective grouping is formulated in operations research as the cell formation problem. Solving this problem yields several advantages, such as reduced inventory and simplified production planning. Several criteria can be included in the problem's constraints, such as intercellular flow, intracellular load balancing, subcontracting costs and machine duplication costs. The cell formation problem is an NP-hard optimization problem; consequently, exact methods cannot solve large instances within a reasonable time, whereas heuristic methods can generate solutions of lower quality in a reasonable execution time. In this thesis, we consider the problem in a bi-objective context specified in terms of an autonomy factor and the load balance between cells. We present three types of metaheuristics for its resolution and compare them numerically. Furthermore, for problems of small dimension that can be solved exactly with CPLEX, we verify that these metaheuristics generate optimal solutions.
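The two objectives named in this abstract, a cell autonomy factor and load balance between cells, can be made concrete with a small evaluation routine over a machine-part incidence matrix. The exact definitions used in the thesis may differ; in this illustrative formulation, autonomy is the fraction of operations processed inside their own cell and imbalance is the spread of per-cell workload.

```python
def evaluate_cells(incidence, machine_cell, part_cell):
    """Score a cell formation solution on two illustrative objectives.

    incidence[m][p]   = 1 if part p requires machine m, else 0
    machine_cell[m]   = index of the cell machine m is assigned to
    part_cell[p]      = index of the cell part p is assigned to
    Returns (autonomy, imbalance): autonomy is maximized, imbalance is minimized.
    """
    n_cells = max(max(machine_cell), max(part_cell)) + 1
    total_ops, internal_ops = 0, 0
    cell_load = [0] * n_cells
    for m, row in enumerate(incidence):
        for p, needs in enumerate(row):
            if needs:
                total_ops += 1
                cell_load[machine_cell[m]] += 1
                if machine_cell[m] == part_cell[p]:
                    internal_ops += 1   # operation handled within its own cell
    autonomy = internal_ops / total_ops if total_ops else 0.0
    imbalance = max(cell_load) - min(cell_load)   # simple spread measure of workload
    return autonomy, imbalance

# Tiny example: 3 machines, 4 parts, 2 cells (hypothetical data).
incidence = [[1, 1, 0, 0],
             [1, 0, 0, 1],
             [0, 0, 1, 1]]
print(evaluate_cells(incidence, machine_cell=[0, 0, 1], part_cell=[0, 0, 1, 1]))
```

A metaheuristic for this problem would search over the machine and part assignments while scoring candidates with a routine of this kind.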
589

Reverse logistics: models and applications

Soto Zuluaga, Juan Pablo 12 January 2006 (has links)
In recent years, Reverse Logistics has become a relevant topic not only for academia but also for the business world. Companies attach increasing importance to this field because of environmental issues and the benefits that can be obtained by improving their returns processes. Efficient and successful Reverse Logistics processes also require collaboration among the members of the supply chain. This thesis focuses on both of these topics, Collaboration and Reverse Logistics. Its purpose is twofold: first, to analyze the returns-process problems that companies face today from a management point of view, starting from a general perspective and then examining the Spanish publishing industry; and second, to propose four mathematical models for the planning problems that companies face when returns are incorporated, together with solution methods for them.
590

Aspectos semióticos quebrando paradigmas da avaliação de interfaces web por critérios ergonômicos / Semiotic aspects breaking paradigms in evaluating web interfaces for ergonomics

Jardim Filho, Airton Jordani 20 July 2015 (has links)
This study presents an analysis of the ergonomic criteria known as heuristics, questioning whether they are still suitable for use twenty years after their conception. The interactions between subjects and digital interfaces are taken as the theoretical object, and the home page of the Google search engine as the empirical object. The overall objective of the research is to determine, through expert analysis, whether the usability attributes found on the website taken as the empirical object coincide with the ergonomic criteria established by both academia and the market, namely the heuristics of Nielsen (1995b). The thesis also comprises a bibliographic review of ergonomic criteria and their main exponents: Bastien and Scapin (1993), Shneiderman (2005), Jordan (1998) and Nielsen (1990, 1995b), from which it was concluded that the heuristics presented by the latter are sufficiently representative of what the other authors advocate. The work also examines the concept of the syncretic, verbal-textual text, the history of graphical interfaces and of Google, and usability on the World Wide Web and in web search devices. After submitting the empirical object to analysis by six experts, it is safe to affirm that the Nielsen heuristics, although well established, are not the most appropriate way to assess the usability of a web interface. This statement is based on the results of the assessments made by the evaluators: even though problems were identified in relation to every heuristic analyzed, the empirical object of this research is the most used website in the world and a major reference for Internet users. In parallel, it was observed that, over time, Google's mechanism has changed gradually, which shows its adjustment to the user who, in turn, also adapts to it. To frame this, the research employed Landowski's (2014) concept of manipulation, transposed to the context of web interfaces (the empirical object and the way it relates to and interacts with its users) and the resulting sense effect of adjustment generated by that interaction. In addition, a parallel was drawn between the adjustment regime of interaction and User-Centered Design, which, despite being concepts of distinct origins, both advocate close contact between the actors in an interaction, so that each senses the other's dynamics and pace and balance are maintained. It can therefore be argued that tools beyond those offered by usability engineering are needed to evaluate a web interface adequately.
