11

Functional recovery following neuromuscular blockade in critically ill adults

Foster, Janet G. Whetstone, 1953- 14 March 2011 (has links)
Not available / text
12

The analysis of a transistorized fast rise blocking oscillator

Guerin, James Howard, 1930- January 1960 (has links)
No description available.
13

A study of compressibility and scale and their influence in dead-end pressure filtration

Willmer, S. A. January 1996 (has links)
A systematic study has been made of the factors which influence compressibility in cake filtration. A fully automated dead-end pressure filtration rig was designed, constructed and assembled at Loughborough and used to provide accurate data for an experimental matrix. The parameters investigated included pressure, feed concentration, time, surface charge, size and shape of the particulate material, and scale of filtration. A proven electrical resistance measuring technique was used to determine transient solids concentrations through a filtering cake/suspension and subsequently interpret filtration performance. The particulate materials calcite and zinc sulphide, dispersed in water, were used to span a range of filter cake compressibility. The initial results from experiments using calcite were found to be reproducible and generally followed the expected trends. More compressible materials, such as zinc sulphide, were subsequently filtered at different pH values to investigate the influence of surface charge. Filter cells of different sizes were used to examine the influence of scale on filtration. Several curious observations were made, such as sudden increases in filtrate rate at apparently stable process conditions and lower concentration measurements near the base of the cake. The importance of considering all scale-up parameters and their relation to each other is highlighted in the work. Further points, such as the definition of compressibility over a narrow range of pressure and the structure of the filter cake, have been discussed. The results were analysed with the use of conventional and new filtration theories. The conventional and modern theories generally gave good predictions of cake height for all the materials tested, but for unstable/compressible suspensions the prediction of the concentration profile was less accurate. 
The use of the maximum solids concentration value at the base of the cake has been suggested to reduce this error and so help prevent filter undersizing. Results indicating retarded packing compressibility have been discussed, along with volume-time data which suggest an influence of scale on filtration. Scale-up constants were found to vary considerably in some cases. Large changes in cake resistance were seen for small changes in cake concentration, suggesting a need to use other structural characterising parameters as well as macroscopic cake concentration values. The Koenders and Wakeman model gave good predictions of the first linear part of the volume-time relationships for stable suspensions. Conclusions are drawn on the influence of each of the studied parameters on the filtration of compressible materials.
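The conventional theory mentioned in this abstract can be illustrated with a short numerical sketch. Under constant pressure, the classical (Ruth) filtration equation t/V = aV + b is linear in filtrate volume V, and the fitted slope and intercept yield the specific cake resistance and the medium resistance. All parameter values below are assumptions for the sketch, not data from the thesis.

```python
# Illustrative fit of the classical constant-pressure filtration equation
#   t/V = a*V + b,  a = mu*alpha*c / (2*A^2*dP),  b = mu*Rm / (A*dP)
# All numbers are invented for illustration, not taken from the thesis.

mu = 1.0e-3   # filtrate viscosity, Pa*s (water, assumed)
dP = 1.0e5    # applied pressure difference, Pa (assumed)
A = 0.01      # filtration area, m^2 (assumed)
c = 50.0      # effective solids concentration, kg/m^3 (assumed)

# Synthetic volume-time data obeying t = a*V^2 + b*V exactly
a_true, b_true = 2.0e7, 1.0e4
V = [i * 2.5e-5 for i in range(1, 21)]           # filtrate volume, m^3
t = [a_true * v**2 + b_true * v for v in V]      # time, s

# Least-squares fit of t/V against V (the classical linearisation)
x = V
y = [ti / vi for ti, vi in zip(t, V)]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * w for u, w in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

alpha = 2.0 * A**2 * dP * slope / (mu * c)   # specific cake resistance, m/kg
Rm = A * dP * intercept / mu                 # medium resistance, 1/m
print(f"alpha = {alpha:.3e} m/kg, Rm = {Rm:.3e} 1/m")
```

For a compressible cake, alpha itself varies with pressure, which is exactly why the narrow-pressure-range definition of compressibility discussed above matters.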
14

Some aspects of clinical pharmacology of rocuronium and rapacuronium

McCourt, Killian Charles January 1999 (has links)
No description available.
15

Chiral chromatography of enantiomeric cardiovascular and other drugs

Kingston, Gillian A. January 1990 (has links)
Since the enantiomers of a number of racemic drugs have been found to have different activities or modes of action, enantioselective analysis is becoming more important. A number of different approaches to chromatographic chiral resolution have been evaluated for their ability to resolve the enantiomers of racemic beta-blocking drugs. Three chiral HPLC columns were investigated: a cyclodextrin phase, an (R)-3,5-dinitrobenzoylphenylglycine phase and a protein phase. The acid glycoprotein phase successfully resolved atenolol, alprenolol, metoprolol, oxprenolol, propranolol and verapamil with 0.01M phosphate buffer eluents modified with either acetonitrile or isopropanol. The (R)-3,5-dinitrobenzoylphenylglycine phase was used with eluents of isopropanol in hexane, and resolutions of propranolol, oxprenolol, metoprolol, alprenolol and pronethalol were achieved after formation of the 1- or 2-naphthamide derivatives, although no separations were achieved for the underivatised samples. The cyclodextrin phase was also found to be unsuccessful in resolving underivatised samples of propranolol and verapamil. However, preliminary results indicate that resolutions are possible after the formation of their trifluoroacetyl derivatives. The cyclodextrin phase was also successfully used to resolve the enantiomers of chlorpheniramine and the geometric isomers of clomiphene. In addition to the chiral HPLC stationary phases, a (+)-10-camphorsulphonic acid mobile phase additive was investigated, although this was found to be completely unsuccessful. Finally, the use of a chiral diamide GLC column was investigated. This was not suited to the analysis of beta blockers, even after derivatisation, although derivatised amino acids were well resolved. 
The use of computer modelling to predict the degree of separation of enantiomers was also investigated for the (R)-3,5-dinitrobenzoylphenylglycine phase, with the interaction energies between the phase and both isomers of each compound calculated for the most stable conformation. From a comparison with the experimental results, it was shown that this approach to prediction was unsuccessful.
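The statements about "successful resolution" above rest on standard chromatographic figures of merit: retention factors, enantioselectivity, and resolution. A minimal sketch of those textbook definitions follows; the retention times and peak widths are invented for illustration, not data from this work.

```python
def chiral_separation(t0, t1, t2, w1, w2):
    """Standard figures of merit for an enantiomer pair on a chiral column.

    t0: column dead time; t1, t2: retention times of the two enantiomers;
    w1, w2: baseline peak widths (all in the same time units).
    Returns (k1, k2, alpha, Rs).
    """
    k1 = (t1 - t0) / t0                 # retention factor, first enantiomer
    k2 = (t2 - t0) / t0                 # retention factor, second enantiomer
    alpha = k2 / k1                     # enantioselectivity factor
    rs = 2.0 * (t2 - t1) / (w1 + w2)    # resolution; Rs >= 1.5 ~ baseline
    return k1, k2, alpha, rs

# Invented example: dead time 1 min, peaks at 3 and 5 min, widths 0.5 min
print(chiral_separation(1.0, 3.0, 5.0, 0.5, 0.5))
```

An alpha of 1 means the phase shows no chiral discrimination at all, which is why the underivatised beta blockers that co-eluted on the cyclodextrin phase counted as failures.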
16

Optimization on H.264 De-blocking Filter

Waheed, Abdul-Mohammed January 2008 (has links)
H.264/AVC is the state-of-the-art video coding standard, which promises the same video quality at about half the bit rate of previous standards (H.263, MPEG-2). This tremendous achievement in compression and perceptual quality is due to the inclusion of various innovative tools. These tools are highly complex and data intensive and, as a result, place a heavy computational burden on the processor. The de-blocking filter is one of them: it is the most time-consuming part of the H.264/AVC reference decoder. In this thesis, a performance analysis of the de-blocking filter is made on an Intel Pentium 4 processor, and accordingly various optimization techniques have been studied and implemented. For some techniques, a statistical analysis of video data was carried out and the optimization tuned according to the results; for other techniques, SIMD instructions were used. Comparison of the SIMD-optimized techniques with the reference software showed significant speedup, contributing to a real-time implementation of the de-blocking filter on a general-purpose platform. / The de-blocking filter is the most time-consuming part of the H.264 High Profile decoder. The de-block filtering process specified in the H.264/AVC standard is sequential and thus not computationally optimal. In this thesis, various optimization algorithms have been studied and implemented. Compared to the JM13.2 boundary-strength algorithm, the Static and ICME algorithms are quite primitive; as a result no performance gain is achieved, and in fact performance decreases. This dismal performance has several causes, prominent among them increased memory accesses, unrolling of the loop to 4x4 boundaries, and early detection of intra blocks. For the optimization of the edge filtering module, both algorithms (SIMD and the fast algorithm) showed significant improvement in performance compared to the JM13.2 edge filtering algorithm. 
This improvement is mainly due to the parallel filtering operation in the edge filtering module. Therefore, by using SSE2 instructions, a large speedup can be achieved on general-purpose processors such as Intel's, while keeping conformance with the standard.
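The boundary-strength decision that the reference decoder evaluates for every 4x4 block edge can be sketched as follows. This is a condensed paraphrase of the standard's rules with assumed parameter names, not the reference implementation, but it shows why the decision is branchy and data-dependent and therefore a natural optimization target.

```python
def boundary_strength(p_intra, q_intra, mb_edge, p_coeffs, q_coeffs,
                      same_ref, mv_diff_ge_4):
    """Simplified H.264 boundary-strength (Bs) decision for one 4x4 edge
    between blocks p and q. Bs = 4 triggers the strongest filtering and
    Bs = 0 leaves the edge unfiltered. Condensed paraphrase of the
    standard's rules, not the full normative logic."""
    if p_intra or q_intra:
        # Intra-coded neighbours: strongest filtering on macroblock edges
        return 4 if mb_edge else 3
    if p_coeffs or q_coeffs:
        # Either block carries non-zero residual coefficients
        return 2
    if (not same_ref) or mv_diff_ge_4:
        # Different reference frames, or motion vectors differing by
        # at least 4 quarter-pel units
        return 1
    return 0

# Intra block on a macroblock edge -> strongest filtering
print(boundary_strength(True, False, True, False, False, True, False))
```

Because Bs depends on per-block coding decisions, a statistical analysis of typical video (as done in the thesis) can reveal which branches dominate and thus which shortcuts pay off.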
17

Essays on economics of information : search, networks and price discrimination

Parakhoniak, Anastasiia 24 May 2018 (has links)
This thesis consists of three independent chapters addressing different questions of information economics. 
The first chapter studies optimal strategies of firms which are present in both offline and online markets. We study optimal pricing strategies of retailers in the presence of showrooming and their decisions on distribution channels. Showrooming is a situation where consumers try products at brick-and-mortar stores before purchasing them online at a lower price. One way to prevent showrooming is to use a price matching policy, whereby the price is the same in both the physical store and the online channel. We show that for small search costs, a price matching policy is indeed optimal. However, for higher search costs price matching is suboptimal, and online and offline purchases coexist with showrooming. A firm which faces online competition from a foreign multichannel retailer has an incentive to geo-block, i.e. refuse to serve foreign customers, even though this leads to a decrease in potential demand. Geo-blocking relaxes online competition and leads to higher prices both online and in brick-and-mortar stores. A legal price parity requirement helps to eliminate incentives to geo-block and thus restores online competition. The second chapter analyzes the information diffusion process in communication networks where social interactions are costly. We provide a dynamic model with strategic agents who decide how much effort to put into the propagation of information about a product in each period. We show that the equilibrium level of individual communication effort is convex in the proportion of informed agents, and lower than the socially optimal level due to a substantial free-riding effect. We show that for sufficiently high recommendation costs it is socially optimal for symmetric agents to exert the same communication effort, while for low recommendation costs this is not true. In the context of our model we analyze the advertising strategy of a firm launching a new product with positive network externalities for consumers. 
The analysis shows that the return on advertising declines quickly as the proportion of informed consumers grows, due to the free-riding effect. Thus, the firm optimally has to adjust and reduce the level of advertising in each period. The third chapter is a paper co-authored with Maarten Janssen and Alexei Parakhonyak. In this paper we propose a new equilibrium concept, Non-reservation price equilibria (Non-RPE). Reservation price equilibria (RPE) do not accurately assess market power in consumer search markets. In most search markets, consumers do not know important elements of the environment in which they search (such as, for example, firms' costs). We argue that when consumers learn while searching, RPE suffer from theoretical issues such as non-existence and critical dependence on specific out-of-equilibrium beliefs. We characterize equilibria where consumers rationally choose search strategies that are not characterized by a reservation price. Non-RPE always exist and do not depend on specific out-of-equilibrium beliefs. Non-RPE feature active consumer search and are consistent with recent empirical findings.
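The reservation-price benchmark that the third chapter departs from can be illustrated with the textbook sequential-search rule: a consumer keeps drawing prices until the expected benefit of one more search no longer covers its cost. The uniform price support and search cost below are assumptions for the sketch, not the chapter's model.

```python
import random

def reservation_price(support, cost):
    """Smallest price r in the support with E[max(r - p, 0)] >= cost,
    for prices drawn uniformly from `support`. Textbook sequential-search
    stopping rule, used here only as the benchmark RPE concept."""
    n = len(support)
    for r in sorted(support):
        expected_gain = sum(max(r - p, 0) for p in support) / n
        if expected_gain >= cost:
            return r
    return max(support)

def sequential_search(support, cost, rng):
    """Sample prices until one is at or below the reservation price."""
    r = reservation_price(support, cost)
    draws = 0
    while True:
        draws += 1
        p = rng.choice(support)
        if p <= r:
            return p, draws

support = list(range(1, 11))   # uniform prices 1..10 (assumed)
print("reservation price:", reservation_price(support, 0.5))
```

The paper's point is precisely that when consumers are uncertain about, say, firms' costs, the optimal strategy need not take this cutoff form at all.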
18

Network Bargaining: Creating Stability Using Blocking Sets

Steiner, David January 2012 (has links)
Bargaining theory seeks to answer the question of how to divide a jointly generated surplus between multiple agents. John Nash proposed the Nash Bargaining Solution to answer this question for the special case of two agents. Kleinberg and Tardos extended this idea to network games, introducing a model they call the Bargaining Game. They search for surplus divisions with a notion of fairness, defined as balanced solutions, that follow the Nash Bargaining Solution for all contracting agents. Unfortunately, there are many networks in which no balanced solution can be found; we call such networks unstable. In this thesis, we explore methods of changing unstable network structures to find fair bargaining solutions. We define the concept of blocking sets, introduced by Biro, Kern and Paulusma, and use them to create stability. We show that by removing a blocking set from an unstable network, we can find a balanced bargaining division in polynomial time. This motivates the search for minimal blocking sets. Unfortunately, this problem is NP-hard, and hence no known efficient algorithm exists for solving it. To overcome this hardness, we consider the problem restricted to special graph classes. We introduce an O(1)-factor approximation algorithm for the problem on planar graphs with unit edge weights. We then provide an algorithm to solve the problem optimally on graphs of bounded treewidth, which generalize trees.
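For a single contracting edge, the balanced division described above has a simple closed form: each endpoint receives its best outside option plus half of the remaining surplus. A minimal sketch with made-up numbers:

```python
def balanced_split(weight, alt_x, alt_y):
    """Balanced (Nash) division of one contracting edge's surplus.

    weight: surplus available on the edge; alt_x, alt_y: each endpoint's
    best outside option elsewhere in the network. Meaningful when
    weight >= alt_x + alt_y (otherwise the pair would not contract).
    Illustrative sketch of the two-agent rule, not the thesis's algorithm.
    """
    excess = weight - alt_x - alt_y     # surplus left after outside options
    return alt_x + excess / 2.0, alt_y + excess / 2.0

# Edge worth 10; x's best alternative offers 2, y's offers 4
print(balanced_split(10.0, 2.0, 4.0))
```

The difficulty in networks is that each agent's outside option depends on divisions on neighbouring edges, so these conditions must hold simultaneously everywhere, and in unstable networks no such simultaneous solution exists.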
19

Blocked and recovered memories of affective, distinctive, and neutral paragraphs

Corbisier, Barbara Lynn 15 May 2009 (has links)
Highly affective memories have been thought to be longer lasting and more detailed than other memories, and many experimental results have supported this assertion. The apparent robustness of these memories, however, may result from their high distinctiveness rather than their emotional content. Two experiments tested free and cued recall for negative affect, distinctive, and neutral paragraphs. Experiment 1 compared neutral and negative affect paragraphs using a blocked and recovered memory technique. Affective paragraphs were remembered significantly better than neutral paragraphs in free recall of paragraph titles, regardless of condition. Details of neutral paragraphs were remembered significantly better than those of affective paragraphs, regardless of condition. No recovery effect was found. Experiment 2 compared distinctive and neutral paragraphs using the same technique. Free recall of paragraph titles did not differ between paragraph types. Neutral paragraphs were remembered better than distinctive paragraphs in cued recall, regardless of condition. Participants remembered significantly more with cued recall, and significantly more in the forget condition, and distinctive paragraphs were subject to a much greater forgetting effect than neutral paragraphs. It is unclear why a robust forgetting effect was not found with these stimuli. Consistent with previous literature, affective stimuli were remembered well; inconsistent with it, distinctive stimuli were not. These results provide support for the claim that negative affect memories are more robust than other memories. This may result from their inherent emotional content as opposed to their being distinctive in some way.
20

Analysis and Control of Batch Order Picking Processes Considering Picker Blocking

Hong, Soon Do, August 2010 (has links)
Order picking operations play a critical role in the order fulfillment process of distribution centers (DCs). Picking a batch of orders is often favored when customers’ demands create a large number of small orders, since the traditional single-order picking process results in low utilization of order pickers and significant operational costs. Specifically, batch picking improves order picking performance by consolidating multiple orders in a "batch" to reduce the number of trips and total travel distance required to retrieve the items. As more pickers are added to meet increased demand, order picking performance is likely to decline due to significant picker blocking. However, in batch picking, the process of assigning orders to particular batches allows additional flexibility to reduce picker blocking. This dissertation aims to identify, analyze, and control, or mitigate, picker blocking while batch picking in picker-to-part systems. We first develop a large-scale proximity-batching procedure that can enhance the solution quality of traditional batching models to near-optimality as measured by travel distance. Through simulation studies, picker blocking is quantified. The results illustrate: a) a complex relationship between picker blocking and batch formation; and b) a significant productivity loss due to picker blocking. Based on our analysis, we develop additional analytical and simulation models to investigate the effects of picker blocking in batch picking and to identify the picking, batching, and sorting strategies that reduce congestion. A new batching model (called Indexed order Batching Model (IBM)) is proposed to consider both order proximity and picker blocking to optimize the total order picking time. We also apply the proposed approach to bucket brigade picking systems where hand-off delay as well as picker blocking must be considered. 
The research offers new insights about picker blocking in batch picking operations, develops batch picking models, and provides complete control procedures for large-scale, dynamic batch picking situations. The twin goals of added flexibility and reduced costs are highlighted throughout the analysis.
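As a deliberately simple stand-in for the proximity-batching idea described above, the sketch below greedily merges orders that share pick locations, up to a fixed batch capacity. It illustrates how batch formation reduces travel by consolidating nearby picks, but it is not the thesis's procedure and it ignores picker blocking, which the dissertation shows can erase such gains.

```python
def greedy_batches(orders, capacity):
    """Greedy proximity batching (illustrative, not the thesis's method).

    orders: list of orders, each a collection of pick locations.
    capacity: maximum number of orders per batch (picker cart limit).
    Repeatedly seeds a batch with the first unassigned order, then adds
    the unassigned order sharing the most pick locations with the batch.
    Returns batches as lists of order indices.
    """
    remaining = list(range(len(orders)))
    batches = []
    while remaining:
        seed = remaining.pop(0)
        batch, locs = [seed], set(orders[seed])
        while len(batch) < capacity and remaining:
            # Pick the order with the largest location overlap (ties: first)
            best = max(remaining, key=lambda i: len(locs & set(orders[i])))
            batch.append(best)
            locs |= set(orders[best])
            remaining.remove(best)
        batches.append(batch)
    return batches

# Four small orders over pick locations; capacity of two orders per batch
print(greedy_batches([[1, 2], [2, 3], [7, 8], [8, 9]], 2))
```

The thesis's IBM model goes further by scoring candidate batches on expected blocking as well as proximity, which a purely overlap-driven heuristic like this cannot capture.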
