About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
421

Soziale Informationsverarbeitung in der juristischen Urteilsfindung : experimentelle Untersuchungen zur Ankerheuristik / Social information processing and legal decision making : experimental studies on anchoring and adjustment

Bieneck, Steffen January 2006
Judgmental heuristics comprise top-down, schema-driven strategies of social information processing that make it possible to reach sufficiently accurate judgments despite uncertain data. The anchoring-and-adjustment heuristic, one such rule of thumb, essentially describes the effect of externally provided numbers (so-called anchor values) on numerical estimates. Judgments under uncertainty can be observed, for example, in the courts, where decision processes should normatively proceed on the basis of the available information, i.e. in a data-driven manner.

In a series of three experiments, the anchoring heuristic was applied to the domain of legal decision making. Using the vignette technique, N = 229 junior barristers and N = 600 law students were surveyed about their sentencing behaviour. The studies pursued three aims: (1) replicating and extending the anchoring effect across a larger set of offence types; (2) analysing individual differences in anchor use with respect to personality variables (need for cognition and need for cognitive closure); and (3) encouraging more systematic information processing by inducing an accuracy motivation.

The anchoring effect in legal decision making was replicated for the different offence groups. The results show that the perceived severity of the described offences correlated with the recommended sentence; this relationship was markedly reduced by the introduction of anchor values. Contrary to previous findings, an anchoring effect was also observed among the junior barristers, although it was smaller than among the law students. Regarding the personality variables, the expectation was confirmed that a low need for cognition and a high need for cognitive closure go along with greater susceptibility to the anchoring heuristic. Creating pressure to justify one's decision, by contrast, prompted participants to engage more thoroughly with the materials and to proceed in a more data-driven manner. Implications for legal practice are discussed.

/ Decisions are usually based on beliefs about the likelihood that an uncertain event will occur (e.g., the result of an election or the liability of the accused). In estimating the likelihood of such events, people often resort to heuristics as a theory-driven processing strategy in order to reduce the effort of the decision-making process. On the one hand, heuristics can be quite helpful in controlling information processing; on the other hand, they can lead to systematic biases in judgments. Anchoring and adjustment describes a judgmental heuristic where individuals gauge numerical size by starting from an initial arbitrary or irrelevant value (an anchor) and adjusting it during the subsequent course of judgment to arrive at their final judgment. The adjustment, however, typically remains insufficient, leading to judgments that are biased in the direction of the starting value.

The concept of judgmental heuristics can be applied to legal decision making. Legal decision making is normatively defined as data-driven, which means that judgments about the culpability of a defendant need to be corroborated by evidence specific to the case at hand. Individuals involved in this process are required to assess the evidence without being affected by personal feelings and beliefs or by extraneous evidence.

A series of three experiments tested the impact of anchoring and adjustment on legal decision making.
Using the vignette technique, N = 229 junior barristers and N = 600 law students evaluated scenarios describing criminal offences. Apart from replicating the anchoring effect in different samples, the studies explored the impact of individual differences in personality variables (need for cognition and need for cognitive closure) on the anchoring effect. Further, a strategy to promote data-driven processing by inducing an accuracy motivation was evaluated.

The results clearly indicate an anchoring effect in legal decision making: there was a strong correlation between the perceived severity of the cases and the recommended sentence, and this correlation was significantly reduced when an anchor was introduced. In contrast to previous studies, junior barristers showed a less extreme bias in their judgments than law students. In terms of individual differences in the readiness to engage in elaborate information processing, the results showed a higher susceptibility to the anchoring information when need for cognition was low and need for cognitive closure was high. Introducing an accuracy motivation prompted the participants to engage in more data-driven processing, thus reducing the anchoring effect. The implications for social cognition research and legal practice are discussed.
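The anchoring-and-adjustment mechanism described in this abstract can be sketched in a few lines: a judgment starts at the anchor and adjusts only partially toward the estimate a purely data-driven judge would give. All numbers below (the 24-month estimate, the 0.6 adjustment rate, the 6- and 48-month anchors) are hypothetical illustrations, not values from the studies.

```python
def anchored_judgment(data_estimate, anchor, adjustment=0.6):
    """Anchoring and adjustment: the judgment starts at the anchor and
    moves only part of the way (insufficiently) toward the data-driven
    estimate, so the final judgment is biased toward the anchor."""
    return anchor + adjustment * (data_estimate - anchor)

# Hypothetical sentencing vignette: a data-driven sentence of 24 months,
# confronted with a low (6 months) and a high (48 months) anchor.
low_anchor_judgment = anchored_judgment(24, 6)    # 6 + 0.6*(24-6) = 16.8
high_anchor_judgment = anchored_judgment(24, 48)  # 48 + 0.6*(24-48) = 33.6
```

With full adjustment (adjustment = 1.0) the anchor would be ignored; any value below 1.0 reproduces the biased-toward-the-anchor pattern the experiments report.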
422

How do different densities in a network affect the optimal location of service centers?

Han, Mengjie, Håkansson, Johan, Rebreyend, Pascal January 2013
The p-median problem is often used to locate p service centers by minimizing their distances to a geographically distributed demand (n). The optimal locations are sensitive to the geographical context, such as the road network and the demand points, especially when these are asymmetrically distributed in the plane. Most studies focus on evaluating the performance of the p-median model when p and n vary. To our knowledge, how the optimal solutions are affected when the road network is altered is not a well-studied problem, especially in a real-world context. The aim of this study is to analyze how the optimal location solutions of the p-median model vary when the density of the road network is altered. The investigation is conducted by means of a case study in Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 service centers, we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal location, we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000. To find the optimal solution, we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is only limited improvement in the optimal solutions when the number of nodes in the road network increases and p is low. When p is high, the improvements are larger. The results also show that the choice of the best network depends on p: the larger p is, the denser the network needs to be.
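The model and search procedure described above can be sketched roughly as follows: a p-median objective evaluated over candidate nodes, minimized by simulated annealing with a cooling temperature. This is a minimal 1-D toy, not the thesis's implementation (which runs on up to 1.5 million road-network nodes with adaptive temperature tuning); all parameter values are illustrative.

```python
import math
import random

def pmedian_cost(centers, demand, weights=None):
    """p-median objective: total (weighted) distance from each demand
    point to its nearest chosen center (1-D coordinates for brevity)."""
    weights = weights or [1.0] * len(demand)
    return sum(w * min(abs(d - c) for c in centers)
               for d, w in zip(demand, weights))

def anneal(candidates, demand, p, temp=10.0, cooling=0.95, steps=500, seed=1):
    """Toy simulated annealing: swap one chosen center for an unused
    candidate; accept worse moves with a temperature-dependent probability."""
    rng = random.Random(seed)
    current = rng.sample(candidates, p)
    cost = pmedian_cost(current, demand)
    best, best_cost = list(current), cost
    for _ in range(steps):
        out_c = rng.choice(current)
        in_c = rng.choice([c for c in candidates if c not in current])
        trial = [in_c if c == out_c else c for c in current]
        trial_cost = pmedian_cost(trial, demand)
        # always accept improvements; accept worse moves with prob exp(-delta/T)
        if trial_cost < cost or rng.random() < math.exp((cost - trial_cost) / temp):
            current, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
        temp *= cooling
    return best, best_cost
```

Increasing the `candidates` list mimics the study's densification of the network: more candidate nodes enlarge the search space and can lower the achievable objective, especially for larger p.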
423

A Heuristic Method for Routing Snowplows After Snowfall

Sochor, Jana, Yu, Cecilia January 2004
Sweden experiences heavy snowfall during the winter season, and cost-effective road maintenance is significantly affected by the routing of snowplows. The routing problem becomes more complex as the Swedish National Road Administration (Vägverket) sets operational requirements such as satisfying a time window for each road segment. This thesis focuses on route optimization for snowplows after snowfall: the goal is to develop and implement an algorithm for finding combinations of generated routes which minimize the total cost. The results are compared to those stated in the licentiate thesis by doctoral student Nima Golbaharan (2001). The algorithm calculates a lower bound to the problem using a Lagrangian master problem. A common subgradient approach is used to find near-optimal dual variables, which are sent to a column-generation program that returns routes for the snowplows. A greedy heuristic then chooses a feasible solution, which gives an upper bound to the problem. This entire process is repeated as needed. This method for routing snowplows produces favorable results with a relatively small number of routes, and the results are comparable to Golbaharan's. An interesting observation involves the allocation of vehicles: certain depots were regularly over- or under-utilized. This suggests that the quantity and/or distribution of available vehicles may not be optimal.
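The lower-bounding idea used above — dual variables updated by a subgradient step — can be illustrated on a one-variable toy problem; the thesis applies this within a column-generation scheme on full routing instances, so everything below is a simplified sketch with made-up data.

```python
def lagrangian_lower_bound(c=1.0, b=3.0, ub=10.0, iters=200):
    """Toy subgradient loop for: minimize c*x subject to x >= b, 0 <= x <= ub.
    Relaxing x >= b with multiplier u gives L(u) = min_x (c - u)*x + u*b;
    the relaxed subproblem's solution yields a subgradient g = b - x*."""
    u, best = 0.0, float("-inf")
    for k in range(1, iters + 1):
        x_star = 0.0 if c - u >= 0 else ub   # solve the relaxed subproblem
        bound = (c - u) * x_star + u * b     # Lagrangian lower bound L(u)
        best = max(best, bound)              # keep the best bound seen
        g = b - x_star                       # violation of the relaxed constraint
        u = max(0.0, u + (1.0 / k) * g)      # diminishing step, project onto u >= 0
    return best
```

Here the true optimum is x = b with cost 3.0, and the dual iterates push the bound toward that value; in the thesis, the near-optimal duals found this way are additionally fed to column generation to price out new snowplow routes.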
424

The development of an integrated routing and carbon dioxide emissions model for goods vehicles

Palmer, Andrew 11 1900
The issues of global warming and climate change are a worldwide concern, and the UK government has committed itself to major reductions in CO2 emissions, the most significant of the six greenhouse gases. Road transport currently accounts for about 22% of total UK emissions of CO2, and this share has been steadily rising. Initiatives are therefore required to reduce emissions in this sector. The aim of this research has been to develop a computer-based vehicle routing model that calculates the overall amount of CO2 emitted from road journeys, as well as time and distance. The model has been used to examine a number of delivery strategies to assess how CO2 emissions vary. The aim has not been to produce new mathematical theories, but to produce an innovative basis for routing which provides new information and knowledge about how CO2 emissions vary for different minimisation and congestion criteria. The approach used in this research brings together elements from transportation planning and environmental modelling combined with logistics-based vehicle routing techniques. The model uses a digitised road network containing predicted traffic volumes, to which speed-flow formulae are applied so that a good representation of speed can be generated on each road. This means that the model is uniquely able to address the issue of congestion in the context of freight vehicle routing. It uses driving-cycle data to apply variability to the generated speeds, reflecting acceleration and deceleration, so that fuel consumption, and therefore CO2, can be estimated. Vehicle routing heuristics are integrated within the model to enable routes to be produced which minimise the specified criterion of time, distance or CO2. The results produced by the model show a potential to reduce CO2 emissions by about 5%. However, when other transport externalities are considered, the overall benefits depend on road traffic volumes.
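The chain "speed → fuel consumption → CO2" at the heart of such a model can be sketched as follows. The U-shaped consumption curve and its coefficients are illustrative placeholders, not the fitted values used in the thesis; 2.68 kg of CO2 per litre is a typical factor for diesel fuel.

```python
def co2_for_segment(distance_km, speed_kmh, a=0.1, b=5.0, c=0.00001):
    """Estimate CO2 (kg) for one road segment from a simple U-shaped
    fuel-consumption curve: litres/km = a + b/v + c*v**2.  High consumption
    at low (congested) speeds and again at high speeds.  The coefficients
    a, b, c are illustrative stand-ins, not the thesis's fitted model."""
    litres_per_km = a + b / speed_kmh + c * speed_kmh ** 2
    return distance_km * litres_per_km * 2.68  # ~2.68 kg CO2 per litre of diesel

def route_co2(segments):
    """Total CO2 over a route given (distance_km, speed_kmh) pairs,
    e.g. speeds derived from speed-flow formulae on each network link."""
    return sum(co2_for_segment(d, v) for d, v in segments)
```

A router minimising CO2 would use `route_co2` as its edge-cost function in place of distance or time, which is why congested (slow) links can make a longer route the lower-emission choice.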
425

Scheduling and Advanced Process Control in Semiconductor Manufacturing

Obeid, Ali 29 March 2012
In this thesis, we discussed various possibilities of integrating scheduling decisions with information and constraints from Advanced Process Control (APC) systems in semiconductor manufacturing. In this context, important questions were raised regarding the benefits of integrating scheduling and APC. An overview of processes, scheduling and Advanced Process Control in semiconductor manufacturing is given, including a description of semiconductor manufacturing processes. Two of the problems that result from integrating both systems were studied and analyzed: the Problem of Scheduling with Time Constraints (PTC) and the Problem of Scheduling with Equipment Health Factor (PEHF). Both have multicriteria objective functions. PTC aims at scheduling jobs in families on non-identical parallel machines with setup times and time constraints. Non-identical machines means that not all machines can (are qualified to) process all types of job families. The time constraints are inspired by APC needs: APC control loops must be regularly fed with information from metrology (inspection) operations within a time interval (threshold). The objective is to schedule job families on machines while minimizing the sum of completion times and the losses of machine qualifications. Moreover, PEHF was defined as an extension of PTC in which scheduling takes into account the Equipment Health Factor (EHF), an indicator of the state of a machine. Scheduling is then done by considering the yield resulting from the assignment of a job to a machine, where this yield is defined as a function of machine state and job state.
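A toy evaluation of a PTC-style objective — sum of completion times plus penalised qualification losses — might look as follows; the family-change setups, the time threshold and the penalty weight are hypothetical stand-ins for the thesis's actual formulation.

```python
def schedule_cost(machines, proc_time=1.0, setup=0.5, threshold=3.0, alpha=1.0):
    """Toy PTC-style schedule evaluation.  `machines` maps each machine to
    its ordered list of job families.  Cost = sum of completion times (a
    setup is paid whenever the family changes) plus alpha times the number
    of qualification losses.  All parameters are illustrative stand-ins."""
    total_completion, lost = 0.0, 0
    for seq in machines.values():
        t, last_family, last_run = 0.0, None, {}
        for fam in seq:
            if fam != last_family:
                t += setup               # family change triggers a setup
            t += proc_time
            total_completion += t        # completion time of this job
            last_run[fam] = t
            last_family = fam
        # a machine/family qualification is lost when the family was not
        # run again within `threshold` time units before the sequence ends
        lost += sum(1 for end in last_run.values() if t - end > threshold)
    return total_completion + alpha * lost
```

A scheduler for PTC would search over such assignments, trading shorter completion times against keeping qualified machine/family pairs "fresh" within their APC-driven time thresholds.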
426

Larmbolagets webbplats : En studie om skapandet av en tillgänglig och användbar webbplats / Larmbolaget’s Website : A Study on the Creation of an Accessible and Usable Website

Lundberg, Niklas, Wigren, Marcus January 2012
This paper accounts for the work on a website for a company whose real name will not be disclosed in the text of this paper. Instead, it has been assigned the fictitious name Larmbolaget, which can be translated as 'The Alarm Company'. During our work with Larmbolaget's website, we consulted a selection of accessibility guidelines from the Web Content Accessibility Guidelines (WCAG) 2.0, developed by a working group within the World Wide Web Consortium, as well as a number of design principles and heuristics outlined in this paper. We also conducted several usability tests with users from our intended target group. All the measures taken were an effort to ensure that Larmbolaget's website be as accessible and usable as possible. Our conclusion is that the resulting website does comply with the accessibility guidelines and design principles described in this paper, and thus we consider it to be accessible and usable to a wide range of users, including those with disabilities. / This report describes the work on a website for a company whose real name is not disclosed in the text of this report. Instead, it has been given the fictitious name Larmbolaget. During the work on Larmbolaget's website we took into account a selected subset of the accessibility guidelines in the Web Content Accessibility Guidelines (WCAG) 2.0, developed by a working group within the World Wide Web Consortium, as well as a number of design principles and heuristics that we summarize in this report. We also carried out several usability tests with users within our intended target group. All of these measures were an effort to make Larmbolaget's website as accessible and usable as possible. Our conclusion is that the resulting website follows the accessibility guidelines and design principles described in this report, and we therefore consider it accessible and usable for a wide range of users, including those with disabilities.
427

Optimization Approaches to Protein Folding

Yoon, Hyun-suk 20 November 2006
This research shows optimization approaches to protein folding. The protein folding problem is to predict the compact three-dimensional structure of a protein based on its amino acid sequence. This research focuses on ab initio mathematical models that find provably optimal solutions to the 2D HP-lattice protein folding model. We built two integer programming (IP) models and five constraint programming (CP) models, all of which give provably optimal solutions. We also developed CP techniques to solve the problem faster and compared the models' computational times on several protein instances. Our models, while probably too slow to use in practice, are significantly faster than the alternatives, and thus are mathematically relevant. We also provide reasons, based on complexity analysis, why protein folding is hard. This research will contribute to showing whether CP can be an alternative to, or a complement of, IP in the future. Moreover, figuring out techniques that combine CP and IP is a prominent research issue, and our work contributes to that literature. It also shows which IP/CP strategies can speed up the running time for this type of problem. Finally, it shows why a mathematical approach to protein folding is especially hard, not only mathematically (i.e., NP-hard) but also practically.
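For readers unfamiliar with the 2D HP-lattice model: a fold places each residue on an integer grid, and the objective rewards hydrophobic (H) residues that end up adjacent on the lattice without being adjacent in the chain. A minimal version of the energy function — the quantity the IP/CP models provably minimize — can be written as:

```python
def hp_energy(sequence, fold):
    """Energy of a 2D HP-lattice fold: -1 for every pair of hydrophobic
    (H) residues that are lattice neighbours but not consecutive in the
    chain.  `fold` lists one (x, y) lattice coordinate per residue."""
    # a valid fold is self-avoiding: no two residues share a lattice site
    assert len(sequence) == len(fold) and len(set(fold)) == len(fold)
    contacts = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):      # skip chain neighbours
            if sequence[i] == sequence[j] == "H":
                (xi, yi), (xj, yj) = fold[i], fold[j]
                if abs(xi - xj) + abs(yi - yj) == 1:  # lattice adjacency
                    contacts += 1
    return -contacts
```

For example, folding the sequence HPPH into a unit square brings the two H residues into contact, giving energy -1, whereas the straight-line fold has energy 0; the optimization problem is to find the self-avoiding fold with the lowest such energy.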
428

An Adaptive Simulated Annealing Method For Assembly Line Balancing And A Case Study

Guden, Huseyin 01 August 2006
The assembly line balancing problem is one of the most studied NP-hard problems. NP-hardness leads us to search for a good solution instead of the optimal solution, especially for large problem instances. Meta-heuristic algorithms are search methods developed to find good solutions to large combinatorial problems. This study aims at solving the multi-objective, multi-model assembly line balancing problem of a company. A meta-heuristic algorithm is developed to solve deterministic assembly line balancing problems. The algorithm is tested using test problems from the literature as well as the real-life problem of the company. The results are analyzed and found to be promising, and a solution is proposed for the firm.
429

A Location Routing Problem For The Municipal Solid Waste Management System

Ayanoglu, Cemal Can 01 February 2007
This study deals with a municipal solid waste management system in which strategic and tactical decisions are addressed simultaneously. In the system, the number and locations of the transfer facilities, which serve particular solid waste pick-up points and the landfill, are determined. Additionally, routing plans are constructed for the vehicles that collect the solid waste from the pick-up points, taking into account the load capacity of the vehicles and shift time restrictions. We formulate this reverse logistics system as a location-routing problem with two facility layers. Mathematical models of the problem are presented, and an iterative capacitated-k-medoids clustering-based heuristic method is proposed for the solution of the problem. A sequential clustering-based heuristic method is also presented as a benchmark for the iterative method. Computational studies are performed for both methods on problem instances with up to 1,000 pick-up points, 5 alternative transfer facility sites, and 25 vehicles. The results show that the iterative clustering-based method achieves considerable improvement over the sequential clustering-based method.
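The clustering engine behind such a heuristic can be illustrated with a plain (uncapacitated) k-medoids pass on 1-D points; the thesis's method adds vehicle capacities and iterates between clustering and routing, so this is only a sketch of the core assign/update loop.

```python
def k_medoids(points, k, iters=20):
    """Plain k-medoids on 1-D points: assign each point to its nearest
    medoid, then move each medoid to the cluster member that minimises
    total within-cluster distance, until the medoid set stabilises.
    (The thesis uses a capacitated variant of this idea.)"""
    medoids = sorted(points)[:: max(1, len(points) // k)][:k]  # spread-out start
    for _ in range(iters):
        clusters = {m: [] for m in medoids}
        for p in points:                       # assignment step
            nearest = min(medoids, key=lambda m: abs(p - m))
            clusters[nearest].append(p)
        new = sorted(                          # update step
            min(c, key=lambda cand: sum(abs(cand - q) for q in c))
            for c in clusters.values() if c
        )
        if new == medoids:
            break
        medoids = new
    return medoids
```

In the location-routing setting, medoids play the role of candidate transfer-facility sites and cluster members the pick-up points assigned to them, with routes then built inside each cluster.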
430

Applying cross-channel user experience design theory to practice : A case study of a public transportation company in Sweden

Lång, Ida, Schlegel, Anne January 2015
The emergence of digital technology, social media and ubiquitous computing in the 21st century changed customer behavior and created new possibilities, but also challenges, for companies offering their services. The new customer generation is more tech-savvy than ever before, and therefore places higher demands on companies to provide well-designed experiences with services that can be consumed through various channels. This study investigates these service environments to see whether companies actively shape them into cross-channel ecosystems or merely react to the demands of their customers. Furthermore, the goal of this thesis is to find out how current theory of cross-channel user experience can assist in formulating design strategies for service ecosystems. To determine this, the authors conducted a theoretical analysis of the current IS literature and, based on that, created a cross-channel user experience design framework. Within a case study of a Swedish transportation company, company and user interviews, direct observations of the available service artifacts, analysis of documentation, and the design of user journeys were carried out to assess the as-is ecosystem. On the basis of these results, it was shown that cross-channel ecosystems are shaped based on user demands. The created framework was applied to formulate a critique of the cross-channel user experience design in the underlying case study, and the framework proved applicable to practice after being adjusted to its final version.
