761

Assessing demand for organic lamb using choice modelling

Rutledge, M. P. January 2009 (has links)
The worldwide market for organic foods is growing fast, but New Zealand meat producers have been slow to respond. Specifically, New Zealand producers offer little or no organic lamb for export or domestic sale. Part of the reason for this hesitancy to meet demand with supply is that the nature of the demand, and consumer willingness to pay for the product, is not well understood. The purpose of this study is to investigate New Zealand organic food consumers' attitudes towards organic food and production methods and to evaluate consumer willingness to pay for an organic lamb product. Data for this study were collected using computer-aided personal interviewing (CAPI) in supermarkets and speciality stores in Christchurch and Wellington. The study questioned consumers about their consumption habits and attitudes towards organic food and production methods, and presented choice modelling scenarios to test willingness to pay for different attributes of lamb. Factor analysis was used to group the 12 attitudinal questions into three factors, which were then placed into a two-step cluster analysis to create consumer segments. Choice modelling was then used to measure consumer preferences for the tested attributes of lamb. The factor and cluster analysis yielded three distinct consumer segments, labelled Committed Organic Seekers, Convenience Organic Consumers and Incidental Organic Consumers; these labels reflect each group's organic consumption habits and attitudes towards organic food. The choice modelling results show that there is a willingness to pay for organic lamb: the three identified consumer groups state they would pay premiums of 61%, 44% and 26% respectively for organic lamb over standard pasture-raised lamb. This paper gives an insight into consumer attitudes and preferences towards a product that could provide a way for New Zealand farmers to increase their returns.
It contributes to the body of knowledge about the likely profiles of regular consumers of organic food. Only a few other studies have attempted to measure consumer attitudes and willingness to pay for organic meat; the author is not aware of any published study that has specifically investigated demand for organic lamb anywhere in the world. The study provides information about stated willingness to pay for five different attributes of lamb; this information should be of value to the industry by showing which product offerings are likely to generate the highest sale price.
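The two-stage segmentation described above (factor analysis on the attitudinal items, then clustering on the factor scores) can be sketched as follows. The data are synthetic, and k-means stands in for the study's two-step cluster analysis; item counts and factor/cluster numbers follow the abstract, everything else is an illustrative assumption:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_respondents = 300
# Synthetic 1-5 Likert responses to 12 attitudinal items.
responses = rng.integers(1, 6, size=(n_respondents, 12))

# Stage 1: reduce the 12 attitudinal items to 3 latent factors.
factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(responses)

# Stage 2: cluster respondents on their factor scores into 3 segments
# (the study used a two-step cluster analysis; k-means is a stand-in).
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(factors)

print(factors.shape)          # (300, 3)
print(np.bincount(segments))  # respondents per segment
```

With real data, the three segments would then be profiled against consumption habits, as the study does for its Committed, Convenience and Incidental organic consumer groups.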
762

板橋地區空氣污染預測模式之探討 / Researching Forecast Model of Air Pollution at Pacho

藺超華, Lian, Chau Hwa Unknown Date (has links)
In recent years the number of cars and motorcycles has grown sharply and many major construction projects have started one after another, so air pollution has become increasingly serious; this thesis therefore studies forecasting models for nitric oxide (NO) concentrations in the Panchiao (板橋) area. We first apply cluster analysis to divide the NO observations into several clusters by concentration, then use discriminant analysis to check whether the clustering results are appropriate. Finally, we take the cluster containing the largest number of observations and fit a first-differenced multivariate time-series autoregressive model to it. The aim is to obtain more accurate forecasts of pollutant concentrations, providing reference information for environmental protection agencies.
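The pipeline described above can be sketched in miniature: cluster the concentration series, keep the most populous cluster, and fit an autoregressive model to its first differences. The data, the number of clusters, and the AR order are illustrative assumptions, not those of the thesis (whose model is multivariate):

```python
import numpy as np

rng = np.random.default_rng(1)
no_series = np.cumsum(rng.normal(0, 1, 500)) + 30.0  # synthetic hourly NO levels

# Step 1: a crude one-dimensional k-means (k = 3) on concentration level.
centroids = np.quantile(no_series, [0.2, 0.5, 0.8])
for _ in range(20):
    labels = np.argmin(np.abs(no_series[:, None] - centroids[None, :]), axis=1)
    centroids = np.array([no_series[labels == k].mean() if np.any(labels == k)
                          else centroids[k] for k in range(3)])

# Step 2 (the discriminant-analysis check is skipped in this sketch):
# keep the cluster with the most observations.
largest = int(np.bincount(labels).argmax())
sub = no_series[labels == largest]

# Step 3: fit AR(1) to the first-differenced sub-series by least squares.
d = np.diff(sub)
phi = np.linalg.lstsq(d[:-1, None], d[1:], rcond=None)[0][0]
forecast = sub[-1] + phi * d[-1]  # one-step-ahead forecast of the level
print(round(float(phi), 3), round(float(forecast), 2))
```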
763

Bayesian Cluster Analysis : Some Extensions to Non-standard Situations

Franzén, Jessica January 2008 (has links)
The Bayesian approach to cluster analysis is presented. We assume that all data stem from a finite mixture model, where each component corresponds to one cluster and is given by a multivariate normal distribution with unknown mean and variance. The method produces posterior distributions of all cluster parameters and proportions, as well as associated cluster probabilities for all objects. We extend this method in several directions to some common but non-standard situations. The first extension covers the case of a few deviant observations that do not belong to one of the normal clusters. An extra component/cluster is created for them, which has a larger variance or a different distribution, e.g. uniform over the whole range. The second extension is clustering of longitudinal data. All units are clustered separately at each time point, and the movements between time points are modelled by Markov transition matrices. This means that the clustering at one time point is affected by what happens at the neighbouring time points. The third extension handles datasets with missing data, e.g. item non-response. We impute the missing values iteratively in an extra step of the Gibbs sampler estimation algorithm. The Bayesian inference of mixture models has many advantages over the classical approach; however, it is not without computational difficulties. A software package for Bayesian inference of mixture models, written in Matlab, is introduced. The programs of the package handle the basic cases of clustering data assumed to arise from mixtures of multivariate normal distributions, as well as the non-standard situations.
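The Gibbs-sampling approach to mixture clustering can be illustrated with a deliberately simplified toy: a two-component univariate Gaussian mixture with known, equal variances (the thesis treats multivariate mixtures with unknown means and variances). All data and hyperparameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 150)])
n = data.size
sigma2, tau2 = 1.0, 100.0   # known component variance; prior variance on means
mu = np.array([-1.0, 1.0])  # initial component means
pi = np.array([0.5, 0.5])   # initial mixture proportions

for sweep in range(500):
    # 1. Sample cluster allocations given the current parameters.
    logp = np.log(pi) - 0.5 * (data[:, None] - mu[None, :]) ** 2 / sigma2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(n) < p[:, 1]).astype(int)

    # 2. Sample means given allocations (conjugate normal update, prior mean 0).
    for k in (0, 1):
        nk = np.sum(z == k)
        var = 1.0 / (nk / sigma2 + 1.0 / tau2)
        mu[k] = rng.normal(var * data[z == k].sum() / sigma2, np.sqrt(var))

    # 3. Sample proportions given allocations (Beta, i.e. 2-component Dirichlet).
    pi1 = rng.beta(1 + np.sum(z == 1), 1 + np.sum(z == 0))
    pi = np.array([1 - pi1, pi1])

print(sorted(mu.round(1)))  # posterior draws near the true means -2 and 3
```

Missing-data imputation, as described in the third extension, would add one more step to each sweep: draw the missing values from their conditional distribution given the current cluster allocation and parameters.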
764

The use of low energy photons in brachytherapy : dosimetric and microdosimetric studies around 103Pd and 125I seeds

Reniers, Brigitte 16 February 2005 (has links)
The general context of this work is the use of low energy photon sealed sources in brachytherapy. We have worked in particular on two isotopes: I-125 (mean energy 27 keV) and Pd-103 (mean energy 21 keV). The sealed sources are prepared as cylindrical seeds 4.5 mm in length and 0.8 mm in radius. Even if the external dimensions are standard, the internal design can differ greatly from one model to another as manufacturers try to improve the dosimetric characteristics of their sources. These isotopes are used mainly as permanent implants for prostate tumours but can also be used in the treatment of eye tumours. Compared to higher energy photon sources, they offer the physical advantages of safer manipulation from a radioprotection point of view and of a reduced dose to the surrounding healthy tissues. When performing a clinical treatment, it is absolutely mandatory to be able to report very precisely the various parameters that can have an impact on the patient's treatment outcome. These parameters are, for example, the prescribed dose, the doses to different organs, the degree of uniformity achieved over the target, and dose-volume information. Brachytherapy treatment planning systems (TPS) also increasingly make it possible to conform the treatment to the patient's anatomy, as in external treatments. In the case of prostate tumours, it has been possible for a few years, using ultrasound imaging, to check the positioning of the seeds and to calculate the dose distribution in real time during the implantation procedure. It is clear that to achieve good precision in the treatment itself, the sources' dosimetric characteristics must be known with a maximum of precision. As the dosimetric characteristics at these low energies are very dependent on the internal design, this implies a thorough study of any new source design.
This is the subject of the first part of this work: the determination of the dosimetric characteristics of two new seed models produced by the IBt company. That determination was done using measurements with thermoluminescent detectors and Monte Carlo calculations with two codes, MCNP4C and EGSnrc. The comparison of these two codes with the measurements highlighted the need to use up-to-date cross-section libraries in the calculations and to take into account the presence of the detectors during the measurements. However, dosimetry is only one part of the problem when dealing with these low energy sources in radiotherapy. The irradiated materials are complex living tissues, composed of many substructures on which radiation does not always have the same impact. It is now largely accepted that the most radiosensitive part of a living cell is the DNA. When photons interact with matter, they produce electrons that deposit their energy as ionisations and excitations of the atoms. These ionisations, if they occur in or close to the DNA molecule, are responsible for damaging it. That structure can sustain a certain amount of damage and stay functional thanks to repair mechanisms, but these mechanisms have limits in how they can handle multiple breaks in the DNA strands. If the breaks are too close in space and time, repair is not possible and the cell dies. The density of ionisations increases as the energy of the incoming photon decreases, so low energy photons are expected to be more efficient at killing cells and thus to have a higher Relative Biological Effectiveness than high energy photons. To study this, one must reduce the volume of matter considered during energy deposition events to sizes relevant for cells: micron or even nanometre volumes. That is why the branch of radiation physics dealing with this problem is called microdosimetry.
The second part of this work is dedicated to a theoretical microdosimetric study of these low energy photon sources, using microdosimetric functions for volumes of 1 µm and a cluster analysis for nanometric volumes. Both types of study show that photons of 20-30 keV are indeed more biologically effective than high energy photons. The microdosimetric results give a ratio of yD (the dose-mean lineal energy) relative to Co-60 of 2.6 for Pd-103 and 2.2 for I-125, which is concordant with the experimental values published by Wuu et al. (1996). The cluster analysis also shows that the electrons generated by photons of 20-30 keV produce more high-order clusters than electrons of 300 keV. The mean cluster order for clusters of 10 nm is 3.0 for Pd-103 and 3.3 for I-125, compared to 2.1 for electrons of 300 keV. In this case, I-125 shows a higher mean cluster order, which is related to a potentially higher biological effectiveness. This is explained by the fact that I-125 photons interact by the Compton effect with a probability of 51%, whereas that probability decreases to 27% for Pd-103. Compton interactions generate a high number of very low energy electrons that deposit their energy very locally with a very high density of ionisations. More radiobiological studies are needed to determine which site volume is more relevant and therefore which kind of study better reflects reality.
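The nanometre-scale cluster analysis mentioned above can be sketched on a synthetic one-dimensional track: ionisation positions closer together than a site size (10 nm, matching the abstract) are grouped into a cluster, and the "cluster order" is the number of ionisations per cluster. The track statistics here are invented solely to show the mechanics, not taken from the thesis:

```python
import numpy as np

def cluster_orders(positions, site_nm=10.0):
    """Group sorted 1-D ionisation positions into clusters separated by gaps > site_nm."""
    positions = np.sort(positions)
    gaps = np.diff(positions)
    # A new cluster starts wherever the gap to the previous ionisation
    # exceeds the site size.
    cluster_id = np.concatenate([[0], np.cumsum(gaps > site_nm)])
    return np.bincount(cluster_id)  # ionisations per cluster

rng = np.random.default_rng(5)
# Dense track (low-energy electrons): ~1 ionisation per 4 nm on average.
dense = np.cumsum(rng.exponential(4.0, 200))
# Sparse track (higher-energy electrons): ~1 ionisation per 40 nm.
sparse = np.cumsum(rng.exponential(40.0, 200))

print(cluster_orders(dense).mean(), cluster_orders(sparse).mean())
# The dense track yields a higher mean cluster order, mirroring the trend
# reported above for 20-30 keV photons versus 300 keV electrons.
```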
765

The Characterization of Fine Particulate Matter in Toronto Using Single Particle Mass Spectrometry

Rehbein, Peter J. G. 13 January 2011 (has links)
An Aerosol Time-of-Flight Mass Spectrometer (ATOFMS) was used to obtain mass spectra of individual aerosol particles in the 0.5 – 2 µm size range in downtown Toronto, Canada for one to two month periods during each season of 2007. A modified version of the Adaptive Resonance Theory (ART-2a) clustering algorithm, which clusters particles based on the similarity of their mass spectra, was shown to be more accurate than the existing algorithm and was used to cluster the ambient data. A total of 21 unique particle types were identified and were characterized based on their chemical composition, their size, and their temporal trends and seasonal variations. Potential sources are also discussed. Particles containing trimethylamine (TMA) were also observed and a more detailed investigation of ambient trends in conjunction with a laboratory experiment was performed in order to elucidate conditions for which TMA will be observed in the particle phase in Southern Ontario.
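The clustering idea behind ART-2a, as used above, can be sketched in a bare-bones form: each spectrum is normalised to unit length and compared to cluster centroids by dot product; a match above the vigilance threshold joins (and updates) that cluster, otherwise it seeds a new one. This is a generic ART-2a-style rule, not the modified algorithm of the thesis, and the vigilance, learning rate, and synthetic spectra are illustrative assumptions:

```python
import numpy as np

def art2a(spectra, vigilance=0.8, rate=0.05):
    centroids, labels = [], []
    for s in spectra:
        v = s / np.linalg.norm(s)
        best, best_sim = -1, -1.0
        for i, c in enumerate(centroids):
            sim = float(v @ c)
            if sim > best_sim:
                best, best_sim = i, sim
        if best_sim >= vigilance:
            # Nudge the winning centroid toward the new spectrum, renormalise.
            c = (1 - rate) * centroids[best] + rate * v
            centroids[best] = c / np.linalg.norm(c)
            labels.append(best)
        else:
            centroids.append(v)  # no match: seed a new cluster
            labels.append(len(centroids) - 1)
    return np.array(labels), np.array(centroids)

rng = np.random.default_rng(3)
# Two synthetic "particle types" with distinct mass-spectral peaks plus noise.
base = np.zeros((2, 50))
base[0, 10] = base[0, 30] = 1.0
base[1, 5] = base[1, 40] = 1.0
spectra = np.abs(base[rng.integers(0, 2, 200)] + rng.normal(0, 0.05, (200, 50)))
labels, cents = art2a(spectra)
print(len(cents))  # number of particle clusters found
```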
767

雙元存款產品對財富管理投資組合報酬率貢獻度分析 / The Study on the Contribution of Foreign-Exchange-Option-Linked Dual Currency Structure Notes for Wealth Management Portfolio

姜如意, Chiang, Ru Yi Unknown Date (has links)
With global stock markets unstable, foreign-exchange-option-linked dual currency structured notes have become popular wealth management products. These notes combine a foreign exchange option with a foreign currency time deposit, so the risks inherent in the embedded option must be examined. Using Nasdaq index daily returns and the spread between US Treasury bills and ten-year Treasury bonds, this study applies cluster analysis to explore the returns and risks of dual currency deposit products in USD/AUD, USD/GBP, and EUR/AUD. The empirical results are as follows. First, the appropriate strategy differs by market state. When the market is in cluster 1, with Nasdaq daily returns high but doubts about long-term growth, "short-term long AUD, short USD" is a sound basis for the currency strategy. When the market is in cluster 2, with economic growth and stock returns relatively optimistic, "short-term long GBP, short USD" and "short-term long AUD, short USD" are more suitable. When the market is in cluster 3, with equities depressed and the bond market signalling economic growth, "long AUD, short EUR" and "short-term long USD, short AUD" are the better strategies. Second, the cash-flow compensation mechanism of dual currency products must be set according to market state. The study finds that dual currency deposit products have different Mean/StDev values under different exchange rates and clusters, meaning investors and wealth managers must address how gains from the foreign exchange market are shared. Current dual currency products include principal-protection mechanisms at different exchange rates, so for wealth managers these products are one option for short-term fund allocation. Managers must therefore adjust quickly to different macroeconomic and market environments, designing different portfolios under different exchange-rate scenarios to create a win-win outcome for investors and providers.
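The Mean/StDev figure discussed above is a Sharpe-like ratio computed per market-state cluster. A minimal sketch with synthetic FX returns and synthetic cluster labels (both hypothetical, standing in for the study's Nasdaq/yield-spread clustering):

```python
import numpy as np

rng = np.random.default_rng(4)
fx_returns = rng.normal(0.0002, 0.006, 750)  # synthetic daily USD/AUD returns
clusters = rng.integers(0, 3, 750)           # synthetic market-state labels

# Mean/StDev of returns within each market-state cluster: a higher ratio
# suggests a more favourable state for the corresponding strategy.
for k in range(3):
    r = fx_returns[clusters == k]
    ratio = r.mean() / r.std(ddof=1)
    print(f"cluster {k}: n={r.size}, Mean/StDev={ratio:.3f}")
```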
768

Differentiation And Classification Of Counterfeit And Real Coins By Applying Statistical Methods

Tansel, Icten 01 June 2012 (has links)
Tansel, Içten. M.Sc., Archaeometry Graduate Program. Supervisor: Assist. Prof. Dr. Zeynep Isil Kalaylioglu; Co-supervisor: Prof. Dr. Sahinde Demirci. June 2012, 105 pages. In this study, forty coins obtained from the Museum of Anatolian Civilizations (MAC) in Ankara were investigated. Twenty-two of the coins were real and the remaining eighteen were fake. All forty were Greek coins dating back to the middle of the fifth century BCE and to the reign of Alexander the Great (336-323 BCE). The major aims of this study can be summarized as follows
769

Molekular-zytogenetische Untersuchungen und Expressionsanalysen des Multiplen Myeloms / Molecular cytogenetic investigations and expression analyses of multiple myeloma

Grandy, Isabell 05 December 2006 (has links)
By combining SKY, array CGH, and expression analyses, selected multiple myeloma (MM) cell lines were screened for aberrations, which were then analysed in detail. Thirty-two myeloma patients were examined by array CGH and, on the basis of their aberrations and clinical data, divided into four subgroups by a subsequent cluster analysis.
770

The structuring of management control in Swedish home care units : An explorative discourse study

Lindström, Linda January 2014 (has links)
Background. Research on management in Swedish home care has been conducted mainly from sociological perspectives in which structural conditions have been of interest (see for example Hagerman et al., 2013; Andersson, 2014; Österlind, 2013). The conditions impacting on management are described as differing ideals, the main ones being the care perspective and the cost perspective (see for example Andersson, 2014; Österlind, 2013). The conflict between ideals creates tensions between ideology and practice and differing expectations (Antonsson, 2013), and may also create problems, dilemmas and paradoxes (Österlind, 2013). The rules impacting on home care activities are bureaucratic rules stemming from the state and the municipality. However, Trydegård (2000) argues that there is at the same time room for autonomy and path dependence in home care units. Purpose and research approach. There seems to be a lack of studies on management control in home care, and in particular no study combining discourse, structures and theories of management control. The purpose of this thesis is to explore management control in home care in the relation between structures and managers' interpretative repertoires from a social-constructionist perspective. The purpose is also to create a prototype model for further research. The ontological positioning and theoretical framework build on Giddens' structuration theory (1979, 1984), in which structures are seen as both the medium and the outcome of social interaction and rules are important. The units of analysis are the managers' accounts of management control in semi-structured interviews. The accounts are analysed in a so-called case cluster analysis (McClintock et al., 1979) in the software program NVivo.
The codes build on Ouchi's theory of management control (1979), as 'input control', 'behaviour control', 'output control' and 'clan control', and on Giddens' structuration theory (1979, 1984), defined as eight characteristics of rules: 'normative sanctions', 'signification of meaning', and 'authoritative' or 'allocative resources'. Findings. The findings reveal that home care is highly bureaucratic in input and output control through formal rules stemming from the municipality or the state. In behaviour control, home care has a medium-low degree of bureaucracy when exercised through 'signification of meaning' and medium-high when exercised through 'authoritative resources'. In clan control, home care has a low degree of bureaucracy, which can be either positive or negative depending on how informal leaders in the unit affect cooperation between care personnel and on whether there is trust and good communication between manager and care personnel. Two main patterns of structuration appear: the creation of structures for an efficient process flow of home care to increase efficiency, and the co-creation of new rules for behaviour to increase cooperation. Managers focus on different situations of management control depending on conditions in the home care unit and their own interpretations. Two interpretative repertoires are identified: the discourse on hard matters is created in relation to matters that are more rigid in structure, such as legislation and municipal goals, and that are difficult to interpret differently, whereas soft matters are created in discourse around dilemmas and human or relational aspects of control.
