  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Palmprint Identification Based on Generalization of IrisCode

Kong, Adams 22 January 2007 (has links)
The development of accurate and reliable security systems is a matter of wide interest, and in this context biometrics is seen as a highly effective automatic mechanism for personal identification. Among biometric technologies, IrisCode, developed by Daugman in 1993, is regarded as a highly accurate approach, able to support real-time personal identification on large databases. Since 1993, various coding methods built on top of IrisCode have been proposed for iris and fingerprint identification. In this research, I extend and generalize IrisCode for real-time secure palmprint identification. PalmCode, the first coding method for palmprint identification, which I developed in 2002, directly applied IrisCode to extract phase information of palmprints as features. However, I observed that PalmCodes from different palms are similar, having many 45° streaks. Such structural similarities in the PalmCodes of different palms reduce the individuality of PalmCodes and the performance of palmprint identification systems. To reduce the correlation between PalmCodes, in this thesis I employ multiple elliptical Gabor filters with different orientations to compute different PalmCodes and merge them to produce a single feature, called Fusion Code. Experimental results demonstrate that Fusion Code performs better than PalmCode. Based on the results of Fusion Code, I further identify the orientation fields of palmprints as powerful features. Consequently, Competitive Code, which uses the real parts of six Gabor filters to estimate the orientation fields, is developed. To embed the properties of IrisCode, such as high-speed matching, in Competitive Code, a novel coding scheme and a bitwise angular distance are proposed. Experimental results demonstrate that Competitive Code is much more effective than other palmprint algorithms. 
Although many coding methods have been developed based on IrisCode for iris and palmprint identification, we lack a detailed analysis of IrisCode. One of the aims of this research is to provide such an analysis as a way of better understanding IrisCode, extending its coarse phase representation to a precise phase representation, and uncovering the relationship between IrisCode and other coding methods. This analysis demonstrates that IrisCode is a clustering process with four prototypes; that the locus of a Gabor function is a two-dimensional ellipse with respect to a phase parameter; and that the bitwise Hamming distance can be regarded as a bitwise angular distance. In this analysis, I also point out that the theoretical evidence for the impostor binomial distribution of IrisCode is incomplete. I use this analysis to develop a precise phase representation that enhances iris recognition accuracy and to relate IrisCode to other coding methods. Using this analysis, principal component analysis, and simulated annealing, near-optimal filters for palmprint identification are sought. The near-optimal filters perform better than Competitive Code in terms of the d′ index. Identical twins, having the closest genetic relationship, are expected to have maximum similarity in their biometrics, and classifying identical twins is a challenging problem for some automatic biometric systems. Palmprints have been studied for personal identification for many years, but genetically identical palmprints have not. I systematically examine Competitive Code on genetically identical palmprints for automatic personal identification and to uncover genetically related palmprint features. The experimental results show that the three principal lines and some portions of weak lines are genetically related features, but our palms still contain rich genetically unrelated features for classifying identical twins. 
As biometric systems are vulnerable to replay, database, and brute-force attacks, these potential attacks must be analyzed before such systems are widely deployed in security applications. I propose a projected multinomial distribution for studying the probability of successfully using brute-force attacks to break into a palmprint system based on Competitive Code. The proposed model indicates that it is computationally infeasible to break into the palmprint system using brute-force attacks. In addition to brute-force attacks, I address three other security issues: template re-issuance, also called cancellable biometrics; replay attacks; and database attacks. A random orientation filter bank (ROFB) is used to generate cancellable Competitive Codes for template re-issuance. Secret messages are hidden in templates to prevent replay and database attacks; this technique can be regarded as template watermarking. A series of analyses is provided to evaluate the security levels of these measures.
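As a minimal pure-Python sketch only (an editor's illustration of the two ideas named in the abstract, not code from the thesis: six orientations k·π/6 and a simplified winner-take-all rule are assumptions), the core of Competitive Code and its bitwise angular distance might look like:

```python
# Hedged sketch, not the thesis implementation: Competitive Code assigns each
# pixel the index (0..5) of the Gabor orientation with the strongest (most
# negative) real response, then matches codes with an angular distance.

def competitive_code(responses):
    """responses: the six per-orientation real Gabor responses at one pixel."""
    return min(range(6), key=lambda k: responses[k])

def angular_distance(code_a, code_b):
    """Normalised angular distance between two equal-length code sequences."""
    total = 0
    for a, b in zip(code_a, code_b):
        d = abs(a - b)
        total += min(d, 6 - d)          # orientations wrap around pi
    return total / (3 * len(code_a))    # per-pixel maximum distance is 3
```

In an actual system the per-pixel comparison is implemented with bitwise operations on a 3-bit code and a lookup table, which is what makes IrisCode-style high-speed matching possible.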
12

A Rescheduling Problem With Controllable Processing Times: Trade-off Between Number of Disrupted Jobs and Rescheduling Costs

Cincioglu, Derya 01 December 2011 (has links) (PDF)
In this thesis, we consider a rescheduling problem on non-identical parallel machines with controllable processing times. A period of unavailability occurs on one of the machines due to a machine failure, material shortage, or broken tool. Such disruptions may cause the original schedule to become inefficient and sometimes infeasible. In order to generate a new and feasible schedule, we deal simultaneously with two conflicting measures, called the efficiency and stability measures. The efficiency measure evaluates the satisfaction of a desired objective function value, and the stability measure evaluates the amount of change between the schedules before and after the disruption. In this study, we measure stability by the number of disrupted jobs; a job is referred to as disrupted if it completes processing after its planned completion time in the original schedule. Efficiency is measured by the additional manufacturing cost of jobs. Decreasing the number of disrupted jobs requires compressing the processing times of jobs, which increases their additional manufacturing cost; for that reason, these objectives cannot both be minimized at the same time. To handle this, we developed a mixed integer programming model for the problem by applying the epsilon-constraint approach, which focuses on a single objective while still generating efficient solutions. We therefore studied the problem of minimizing additional manufacturing cost subject to a limit on the number of disrupted jobs. We also considered a convex compression cost function for each job and solved a cost minimization problem by applying a conic quadratic reformulation of the model. The convexity of the cost functions is a major source of difficulty in finding optimal integer solutions for this problem, but applying a strengthened conic reformulation eliminates this difficulty. 
In addition, we develop an improvement search algorithm to find good solutions in reasonable CPU times. The heuristic builds on optimality properties we establish for a single-machine subproblem. We carried out computational experiments on small and medium-scale test problems, and compared the improvement search algorithm and the mathematical model in terms of solution quality and running time.
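The epsilon-constraint idea can be shown in a deliberately simplified toy sketch (an editor's illustration, not the thesis MIP: here abstract candidates scored by cost and number of disrupted jobs stand in for complete schedules). Fixing a cap eps on disrupted jobs and minimising cost subject to it, then sweeping eps, traces the cost/stability trade-off:

```python
# Toy epsilon-constraint sketch: each candidate is a (cost, n_disrupted) pair.

def epsilon_constraint(candidates, eps):
    """Cheapest candidate with at most eps disrupted jobs, None if infeasible."""
    feasible = [c for c in candidates if c[1] <= eps]
    return min(feasible, key=lambda c: c[0]) if feasible else None

def tradeoff_frontier(candidates):
    """Minimum cost for each achievable limit on the number of disrupted jobs."""
    max_disrupted = max(n for _, n in candidates)
    return [(eps, epsilon_constraint(candidates, eps))
            for eps in range(max_disrupted + 1)]
```

In the thesis the inner minimisation is a mixed integer program rather than an enumeration, but the structure of the bi-objective treatment is the same: one objective becomes a constraint with a varying bound.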
13

The role of transfer-appropriate processing in the effectiveness of decision-support graphics

Stiso, Michael E. 15 November 2004 (has links)
The current project is an examination of the effectiveness of decision-support graphics in a simulated real-world task, and of the role those graphics should play in training. It is also an attempt to apply a theoretical account of memory performance (transfer-appropriate processing) to naturalistic decision making. The task in question is a low-fidelity air traffic control simulation. In some conditions, that task includes decision-support graphics designed to explicitly represent elements of the task that normally must be mentally represented, namely trajectory and relative altitude. The assumption is that those graphics will encourage a type of processing different from that used in their absence. If so, then according to the theory of transfer-appropriate processing (TAP), the best performance should occur in conditions in which the graphics are present either during both training and testing, or else not at all. For other conditions, the inconsistent presence or absence of the graphics should lead to mismatches in the type of processing used during training and testing, thus hurting performance. A sample of 205 undergraduate students was randomly assigned to four experimental and two control groups. The results showed that the support graphics provided immediate performance benefits, regardless of their presence during training. However, presenting them during training had an apparent overshadowing effect, in that removing them during testing significantly hurt performance. Finally, although no support was found for TAP, some support was found for the similar but more general theory of identical elements.
14

Why do IKEA's products have different prices in different countries?

Chen, Mengling, Huang, Xin January 2012 (has links)
During the past decade, the law of one price and purchasing power parity theories have been empirically tested for their validity. IKEA, a world-famous furnishing company, sells identical products in different countries at different prices. The main emphasis of this paper is on whether and why IKEA's pricing departs from the law of one price and purchasing power parity. We focus on three main explanatory factors: the existence of trade costs, the influence of the non-traded cost components of the goods, and other possible pricing behaviors of the firm. To fulfill our objectives, a regression model combined with the theoretical framework and the institutional framework of IKEA has been used. The main findings are as follows: (I) price variation still exists after removing the influence of transportation costs, trade barriers, and taxes; (II) higher productivity contributes to higher national prices, but higher labor cost has no significant effect on price variation; (III) price discrimination and special market strategies in specific areas do play a role in the price variation.
15

"I feel that we are twins, but I have to be allowed to be myself." : A qualitative study of identical twins' experiences of their identity in their social environment.

Öman, Cecilia, Ladan, Antonija January 2014 (has links)
This study emphasizes the importance of the social environment for identical twins' identity formation. How does the identity of two genetically identical individuals who grow up in the same environment develop? The study is based on qualitative interviews with ten identical twins and also aims to give a glimpse of how identical twins experience the relationship with their twin and how they feel they are perceived by their social surroundings. In our theoretical framework we draw on various identity theories that emphasize interpersonal and group processes, Cooley's theory of the looking-glass self, and the social comparison theory presented by Festinger. The results show how identical twins' identity is perceived in relation to their twin: the social environment played a significant role in the twins' identity formation, and comparisons directly affected their self-esteem. We also note that all of the identical twins shared an experience of security and fellowship in the relationship with their twin.
16

Welfare implications of nonidentical time valuations under constrained road pricing policies : analytical studies with corridor and urban-wide networks

Sapkota, Virginia A. January 2004 (has links)
The goal of the research is to devise an equitable road pricing system which would leave the majority of routes free of tolls, so that low-income people would suffer no cash loss although they would probably suffer a loss of time. The aims of the dissertation are twofold. The first is to provide a numerical analysis of how urban commuters with differing abilities to pay would respond to additional road user charges. The welfare implications of such differential responses are examined and their policy implications analysed. The second aim is to develop a practical framework to model congestion pricing policies in the context of heterogeneous users. To achieve these aims, the following objectives have been set: (a) Using a simple network with two parallel competing routes, determine both welfare-maximising and revenue-maximising tolls under the constraint that only one route can be priced. In this setting, determine the allocation of traffic between the alternative routes, the efficiency gain, the revenue, the changes in travel cost, and the distributional effects. (b) Establish a realistic model of an actual urban area to examine the impacts of selectively tolling congestible routes. As in the simple network case, assess the effects of toll policy on traffic distribution, network efficiency, revenues, and the welfare of the individual consumer and society. (c) Evaluate whether the non-identical treatment of users will enhance the acceptability of congestion pricing as a transport policy. Results from the simulations indicate that non-identical treatment of drivers' responses to toll charges provides a better understanding of the differential impacts of various pricing policies. Allowing for heterogeneity in time valuation provides a better assessment of the efficiency of pricing policies and of the welfare impacts of toll charges, as it is able to capture their differential effects. 
More importantly, it shows that low-income commuters may not be significantly worse off with pricing, especially when there is a free alternative route. This research demonstrates the need to adopt appropriate analytical techniques and assumptions when modelling traffic equilibrium in a network with tolls. These include relaxing the homogeneity assumption, examining sensitivity to supply function parameter values and to the effect of vehicle operating cost, and using a route-based rather than link-based measure of consumer surplus.
17

Twin-to-twin transfusion syndrome: diagnosis, treatment, and long term outcomes

Ansari, Arisha 27 January 2023 (has links)
Twin-to-twin transfusion syndrome (TTTS) is a rare complication that can develop in monochorionic twin pregnancies, in which abnormal placental connections lead to hemodynamic imbalance between the two fetuses. The twin receiving the surplus of blood experiences polyhydramnios, whereas the twin donating blood experiences oligohydramnios. Diagnosis is based on the Quintero staging system, which consists of five stages ranging from non-critical presentations to the demise of one or both fetuses. The gold standard for treatment is ablating the abnormal vessel connections via laser therapy, which has been shown to reduce short- and long-term complications in the twins and to be the most effective means of halting the disproportionate blood supply between the fetuses. Long-term outcomes of TTTS mainly involve neurodevelopmental impairment, but cardiovascular and renal complications can also be present. Adverse neurodevelopmental outcomes should be monitored most closely postnatally in all TTTS survivors. For recipient twin survivors, cardiovascular outcomes should be watched via blood pressure monitoring and routine echocardiograms; for donor twin survivors, creatinine levels should be checked routinely to detect signs of chronic kidney disease in early childhood. Long-term outcomes of TTTS still require further investigation owing to the difficulty of gathering information postnatally. Limitations that further complicate this research include lack of education and reduced opportunities for underserved communities to access the advanced medical care required to treat and monitor this disease. Shedding light on this disparity can make mothers more aware of the signs and symptoms of the disease, leading to earlier detection and more positive outcomes.
18

Almost Mirror Image: Exploring The Similarities And Dissimilarities Of Identical Twins In Theatrical Solo Performance

Mignacca, Elizabeth 01 January 2015 (has links)
Almost Mirror Image: Exploring the Similarities and Dissimilarities of Identical Twins in Theatrical Solo Performance is an exploration of the psyche of identical twins within the context of devised solo performance. The author, an identical twin herself, has long been interested in twins' ability to cultivate both highly independent personalities and intensely co-dependent tendencies during development. What can twins tell us about the way we create close relationships, and how is their upbringing radically different from that of the majority of the world, who are born alone? Equally intrigued by society's growing technological dependence, the author delves into how the science and development of twins appear counterintuitive to the intra-personal technological world they grow up in, using personal, autobiographical solo performance as her research platform. The data collected from research sources such as Jo Bonney's Extreme Exposure and Michael Kearns' The Solo Performer's Journey will provide fodder for the thesis document and the author's devised solo piece, entitled Teach me how to be Lonely. While devising her own solo performance, the author compares and contrasts her process with that of select solo performers such as Anna Deavere Smith and Rachel Rosenthal. The author delves into various styles of solo work creation, including the testimony plays of Smith and the autobiographical style of Rosenthal, in order to view her own work through a self-reflective and identity-driven lens. Overall, the author hopes to achieve a more comprehensive understanding of her own experience as an identical twin through the facilitation of her solo work, as well as to explore how the creation of solo performance can offer artists in the 21st century more freedom of expression and identity than the performance of a standard play.
19

IDENTICAL CONSTITUENT COMPOUNDING: A CONCEPTUAL INTEGRATION-BASED MODEL

Benjamin, Brandon Lee 31 May 2018 (has links)
No description available.
20

Lot-sizing and scheduling optimization using genetic algorithm

Darwish, Mohammed January 2019 (has links)
The simultaneous lot-sizing and scheduling problem is the problem of deciding which products are to be produced on which machine and in which order, as well as the quantity of each product. Problems of this type are hard to solve; they have therefore been studied for years, and a considerable number of papers has been published on different lot-sizing and scheduling problems, particularly real-world cases. This work proposes a Real-Coded Genetic Algorithm (RCGA) with a new chromosome representation to solve a non-identical parallel machine capacitated lot-sizing and scheduling problem with sequence-dependent setup times and costs, machine cost, and backlogging. Such a problem can be found in a real-world production line at a furniture manufacturer in Sweden. Backlogging is an important feature of this problem, yet it is often ignored in the literature. This study implements three different types of crossover, one of which was chosen on the basis of numerical experiments. Four mutation operators have been combined to allow the genetic algorithm to scan the search space and maintain genetic diversity. Other steps, such as initialization of the population and a reinitialization process, have been designed carefully to achieve the best performance and to prevent the algorithm from being trapped in a local optimum. The proposed algorithm is implemented in MATLAB and tested on a set of standard medium- to large-size problems taken from the literature. A variety of problems were solved to measure the impact of different problem characteristics, such as the number of periods, machines, and products, on the quality of the solution provided by the proposed RCGA. To evaluate the performance of the proposed algorithm, the average deviation from the lower bound and the runtime of the proposed RCGA are compared with those of three other algorithms from the literature. 
The results show that, in addition to its high computational speed, the proposed RCGA outperforms the other algorithms on non-identical parallel machine problems, while it is outperformed by the other algorithms on problems with more identical parallel machines. The results also show that certain characteristics of the problem instances, such as increased setup cost and problem size, negatively influence the quality of the solutions provided by the proposed RCGA.
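The shape of a real-coded GA loop can be sketched as follows (an editor's generic illustration in Python, not the thesis design: the BLX-alpha blend crossover, Gaussian mutation, truncation selection, and toy fitness function here are all stand-ins for the chromosome encoding and operators described above):

```python
# Hedged sketch of a real-coded genetic algorithm (RCGA) for minimisation.
import random

def rcga(fitness, dim, pop_size=30, gens=50, alpha=0.5, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                 # best individuals first
        parents = pop[:pop_size // 2]         # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            child = []
            for a, b in zip(p1, p2):
                lo, hi = min(a, b), max(a, b)
                span = (hi - lo) * alpha
                g = rng.uniform(lo - span, hi + span)  # BLX-alpha blend
                if rng.random() < 0.1:                 # Gaussian mutation
                    g += rng.gauss(0, 0.1)
                child.append(g)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Toy usage: minimise the sphere function; the GA should approach the origin.
best = rcga(lambda x: sum(g * g for g in x), dim=3)
```

Because the parents are carried over unchanged each generation, the best fitness never worsens; in a lot-sizing setting the real-valued genes would instead encode production quantities and sequencing priorities, with fitness given by the total cost of the decoded schedule.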
