911

Concepts and methods of multivariate information synthesis for mineral resources estimation.

Pan, Guocheng. January 1989 (has links)
This study introduces a new methodology, referred to as geoinformation synthesis, for multivariate evaluation of mineral resources and integration of diverse geoscience data. Its most critical component is the notion of intrinsic samples and the methods for their delineation. Intrinsic samples replace the grid cells conventionally employed as the basic information reference. Grid-cell sampling imposes several serious limitations on the geoscience and genetic information that can be objectively related to mineral endowment; methods based upon intrinsic samples mitigate these problems to a certain extent and bring the critical genetic information into the geoscience information system that forms the basis for quantitative evaluation of mineral resources. The second major component of the methodology is the integration of factors describing exploration effects with other geodata in mineral endowment estimation; this combination effectively reduces the possibility of bias in the estimates of mineral endowment and recoverable resources due to incomplete knowledge of the control area and imperfect analogy with the study areas. The third component is the use of synthesized, and thereby considerably enhanced, geoinformation in the quantitative models, instead of the original measurements (geodata) directly. Several multivariate techniques are proposed and employed for synthesis of diverse information and estimation of mineral endowment, including an a priori weighted multivariate criterion, optimum discretization, coherency analysis, multidimensional scaling (p(ijk)), filtering analysis, and geochemical transport models. These methods were developed, tested, and demonstrated in an actual case study of the epithermal gold-silver deposits of the Walker Lake quadrangle of Nevada and California, using the various data sets available for this region: geochemical, structural, gravity and magnetic, lithology, and alteration.
Finally, the estimation of endowment in terms of epithermal gold-silver mineral occurrences is given for some selected intrinsic samples or information zones identified in the Walker Lake region.
912

An Assessment of Abundance, Diet, and Cultural Significance of Mexican Gray Wolves in Arizona

Rinkevich, Sarah Ellen January 2012 (has links)
I sampled the eastern portion of the Fort Apache Indian Reservation from June 19 to August 8, 2008, and from May 6 to June 19, 2009, using scat detection dogs to find wolf (Canis lupus baileyi) scat. My estimate of the wolf population was 19 individuals (95% CI = 14 - 58; SE = 8.30) over 2008 and 2009. My study also used DNA analyses to obtain an accurate assessment of Mexican wolf diet and to compare prey remains in Mexican gray wolf scat with those in scat of two sympatric carnivores (coyote, C. latrans, and puma, Puma concolor). Percent biomass of prey items consumed by Mexican wolves was 89% elk, 8% mule deer, and 3% coyote; for pumas it was 80% elk, 12% mule deer, 4% turkey, and 4% fox. I also included an ethnographic component in my research. My study showed evidence of shared knowledge about the wolf within Western Apache culture; my data fit the consensus model, based upon the large ratio between the first and second eigenvalues. I provided a literature review of how traditional ecological knowledge has enhanced the field of conservation biology, and of the challenges of collecting it and incorporating it with western science. Lastly, I provided an historical perspective on wolves throughout Arizona, an assessment of their historical abundance, and documentation of a possible mesocarnivore release. Between 1917 and 1964, 506 wolves, 117,601 coyotes, 2,608 mountain lions, 1,327 bears, 19,797 bobcats, and 21 jaguars were killed in Arizona by PARC agents, bounty hunters, and ranchers, as reported in U.S. Bureau of Biological Survey annual reports. The relationship between the numbers of coyotes and wolves destroyed was investigated using the Pearson correlation coefficient.
There was a negative correlation between the numbers of wolves and coyotes destroyed in Arizona between 1917 and 1964 (r = -0.40; N = 46; p = 0.01), suggesting a possible mesopredator release of coyotes with the extirpation of the wolf in Arizona.
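The final correlation analysis can be reproduced in miniature. The sketch below uses synthetic yearly removal counts (illustrative assumptions, not the thesis data) and `numpy.corrcoef` to compute a Pearson r between a declining wolf take and a rising coyote take:

```python
import numpy as np

# Hypothetical yearly removal counts for 46 years (illustrative only)
rng = np.random.default_rng(1)
wolves = np.linspace(40, 0, 46) + rng.normal(0, 4, 46)        # declining wolf take
coyotes = np.linspace(1800, 3200, 46) + rng.normal(0, 300, 46)  # rising coyote take

# Pearson correlation coefficient between the two series
r = np.corrcoef(wolves, coyotes)[0, 1]
```

With opposing trends like these, r comes out clearly negative, mirroring the sign (though not the magnitude) of the thesis's reported r = -0.40.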
913

INVESTIGATION OF IN-PIT ORE-WASTE SELECTION PROCEDURES USING CONDITIONALLY SIMULATED OREBODIES.

Arik, Abdullah. January 1982 (has links)
No description available.
914

Improved algorithms for basket default swap valuation

詹依倫, Chan, Yi-Lun Unknown Date (has links)
Among credit derivatives, the credit default swap (CDS) is the most widely known, but with the expansion of financial markets and contracts the underlying is no longer a single asset and may comprise several or even hundreds of obligors. A default swap written on multiple obligors is called a basket default swap (BDS). Following Chiang et al. ((2007), Journal of Derivatives, 8-19), applying importance sampling (IS) to estimate the default payment under a one-factor model not only guarantees that default events occur but also improves the efficiency of the estimation. This thesis extends that idea to multi-factor models in three ways. First, merge the independent factors into a single marginal factor and apply importance sampling to that marginal factor. Second, apply importance sampling to the factor with the largest factor loading. Third, for portfolio C of Glasserman ((2004), Journal of Derivatives, 24-42), divide the obligors into two independent groups and apply, part by part, an exponential twist together with the one-factor method of Chiang et al. (2007) to raise the probability that default events occur. 
Numerical simulations show that the first method performs best when the obligors follow a homogeneous model; the second method produces good estimates under every model considered, but its variance reduction is weaker and the procedure fits realistic financial settings less well; the third method is an application to a special model, since it applies only when the obligors can be split into independent groups, and the accuracy of its estimates depends strongly on where the exponential twist is applied. Section 4 reports the estimates and variance reduction for two different twist positions.
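The thesis's multi-factor algorithms are not reproduced here, but the underlying idea, raising the default probability by shifting (twisting) the systematic factor and reweighting by the likelihood ratio, can be sketched in the standard one-factor Gaussian copula setting of Chiang et al. (2007). All numerical parameters below are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def basket_default_prob(n_obligors=10, p=0.01, rho=0.3, k=5,
                        mu=-3.0, n_paths=100_000, seed=0):
    """Estimate P(at least k of n obligors default) in a one-factor Gaussian
    copula, using importance sampling: the systematic factor is drawn from
    N(mu, 1) instead of N(0, 1), which makes the rare default event common."""
    rng = np.random.default_rng(seed)
    c = norm.ppf(p)                        # individual default threshold
    z = rng.normal(mu, 1.0, n_paths)       # shifted systematic factor
    # likelihood ratio phi(z) / phi(z - mu) of target vs proposal density
    w = np.exp(-mu * z + 0.5 * mu**2)
    eps = rng.normal(size=(n_paths, n_obligors))
    latent = np.sqrt(rho) * z[:, None] + np.sqrt(1.0 - rho) * eps
    defaults = (latent <= c).sum(axis=1)   # defaults per path
    return np.mean(w * (defaults >= k))    # reweighted indicator
```

A plain Monte Carlo run with the same budget would see almost no paths with five simultaneous defaults; the shifted sampler hits the event constantly and corrects with the weight w.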
915

A naïve sampling model of intuitive confidence intervals

Hansson, Patrik January 2007 (has links)
A particular field of research on judgment and decision making (JDM) is concerned with the realism of confidence in one's knowledge. An interesting finding is the so-called format dependence effect: assessment of the same probability distribution generates different conclusions about over- or underconfidence depending on the assessment format. In particular, expressing a belief about some unknown continuous quantity (e.g., a stock value) in the form of an intuitive confidence interval is severely prone to overconfidence compared with expressing the same belief as a probability judgment. This thesis gives a tentative account of this finding in terms of a Naïve Sampling Model, which assumes that people accurately describe the information available in memory but are naive in the sense that they treat sample properties as proper estimators of population properties (Study 1). The effect of this naivety is investigated directly and empirically in Study 2. The prediction that short-term memory is a constraining factor for sample size in judgment, which implies that experience per se does not eliminate overconfidence, is investigated and verified in Study 3. Age-related increases in overconfidence were observed for intuitive confidence intervals but not for probability judgments (Study 4). The thesis suggests that no cognitive processing bias (e.g., Tversky & Kahneman, 1974) over and above this naivety is needed to understand and explain the overconfidence "bias" with intuitive confidence intervals, and hence the format dependence effect.
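The core mechanism of the Naïve Sampling Model can be illustrated with a small simulation. The sketch below is an assumption-laden toy, not the thesis's experimental design: intervals are built from tiny "memory samples" whose dispersion is taken at face value (no small-sample correction), and the resulting coverage falls well short of the nominal level, i.e., overconfidence:

```python
import numpy as np

def naive_interval_coverage(n_sample=4, trials=20_000, seed=0):
    """Coverage of nominally 90% intervals built naively: the sample mean and
    uncorrected sample SD of a small memory sample are treated as the
    population mean and SD. True values come from a standard normal."""
    rng = np.random.default_rng(seed)
    z = 1.645                                       # two-sided 90% normal quantile
    data = rng.normal(0.0, 1.0, (trials, n_sample)) # samples "retrieved from memory"
    m = data.mean(axis=1)
    s = data.std(axis=1)                            # ddof=0: dispersion at face value
    new = rng.normal(0.0, 1.0, trials)              # the quantity to be captured
    covered = np.abs(new - m) <= z * s
    return covered.mean()
```

With n_sample = 4, actual coverage lands far below the nominal 90%, which is exactly the overconfidence pattern the model predicts without any processing bias.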
916

Is the Intuitive Statistician Eager or Lazy? : Exploring the Cognitive Processes of Intuitive Statistical Judgments

Lindskog, Marcus January 2013 (has links)
Numerical information is ubiquitous, and people are continuously engaged in evaluating it by means of intuitive statistical judgments. Much research has evaluated whether people's judgments live up to the norms of statistical theory but has directed far less attention to the cognitive processes that underlie those judgments. The present thesis outlines, compares, and tests two cognitive models of intuitive statistical judgment, summarized in the metaphors of the lazy and the eager intuitive statistician. In short, the lazy statistician postpones judgment to the time of a query, when the properties of a small sample of values retrieved from memory serve as proxies for population properties. In contrast, the eager statistician abstracts summary representations of population properties online from incoming data. Four empirical studies were conducted. Study I outlined the two models and investigated whether an eager or a lazy statistician best describes how people make intuitive statistical judgments. In general, the results supported the notion that people spontaneously engage in a lazy process; under certain specific conditions, however, participants were able to induce abstract representations of the experienced data. Studies II and III extended the models to describe naive point estimates (Study II) and inference about a generating distribution (Study III). The results indicated that both types of judgment were better described by a lazy than by an eager model. Finally, Study IV, building on the support in Studies I-III, investigated boundary conditions for the lazy model by exploring whether statistical judgments are influenced by common memory effects (primacy and recency). No such effects were found, suggesting that the sampling from long-term memory in a lazy process is not conditional on when the data were encountered. The present thesis makes two major contributions. 
First, the lazy and eager models are first attempts at outlining a process model that could be applied to a large variety of statistical judgments. Second, because a lazy process imposes boundary conditions on the accuracy of statistical judgments, the results suggest that the limitations of a lazy intuitive statistician need to be taken into consideration in a variety of situations.
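The two metaphors can be caricatured in code. This is an illustrative sketch, not the thesis's formal models: the eager statistician abstracts a running summary online (here via Welford's algorithm), while the lazy statistician merely stores exemplars and retrieves a small sample at query time:

```python
import numpy as np

class EagerStatistician:
    """Abstracts summary representations online (Welford's running mean/variance)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def observe(self, x):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
    def judge(self):
        return self.mean, self.m2 / self.n   # mean and population variance

class LazyStatistician:
    """Stores exemplars; at query time a small retrieved sample is the proxy."""
    def __init__(self, sample_size=4, seed=0):
        self.memory, self.k = [], sample_size
        self.rng = np.random.default_rng(seed)
    def observe(self, x):
        self.memory.append(x)
    def judge(self):
        s = self.rng.choice(self.memory, size=self.k, replace=False)
        return s.mean(), s.var()
```

The eager judgment reproduces the full-data summary exactly, while repeated lazy judgments scatter around it, which is the boundary condition on accuracy that the thesis's second contribution points to.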
917

Music during venous blood sampling to alleviate patient stress and pain : An intervention study

Vasberg, Anna, Holm, Lina January 2014 (has links)
Background: One of the nurse's basic tasks is to alleviate the patient's suffering in connection with care. Relaxing music has proven to be an effective tool for reducing patients' experiences of stress and pain during many health care procedures. Parameters such as heart rate and blood pressure can be affected by music, and the release of hormones and signal substances can reduce the patient's feelings of discomfort of various kinds. Venous blood sampling is performed routinely at health care centers across Sweden, yet primary care has no guidelines on how relaxing music can be used to increase patient well-being in this care situation.  Aim: The aim of this study was to investigate whether relaxing background music affects patients' ratings of stress and pain during venipuncture at a health center, and to examine differences in those ratings depending on patient characteristics.  Method: A quantitative intervention study with a questionnaire survey was performed at a health center. Questionnaires were distributed immediately after blood sampling to 70 patients; 35 had been exposed to relaxing background music during the venipuncture and 35 had not.  Main results: No significant differences were found between the music intervention group and the control group. Significant differences in ratings of stress and pain were found between age groups, with younger patients rating stress and pain higher than older patients. 
A significant difference in pain ratings was found between male and female patients, with women rating their pain higher than men. Patients who expressed a fear of needles rated stress and pain higher than other patients. No difference in ratings of stress or pain was found between patients considered accustomed to venipuncture and those who were not.  Conclusion: No significant difference in ratings of stress and pain related to the music intervention could be demonstrated in this study. In light of previous research, however, more extensive studies on the same topic might produce a different result. The majority of the findings related to patient characteristics appear consistent with previous research in this field.
918

Importance Sampling to Accelerate the Convergence of Quasi-Monte Carlo

Hörmann, Wolfgang, Leydold, Josef January 2007 (has links) (PDF)
Importance sampling is a well-known variance reduction technique for Monte Carlo simulation. For quasi-Monte Carlo integration with low-discrepancy sequences it has been neglected in the literature, although it is easy to see that it can reduce the variation of the integrand for many important integration problems. For lattice rules importance sampling is particularly valuable, as it can be used to obtain a smooth periodic integrand and thus accelerate the convergence of the integration procedure. This can clearly speed up QMC algorithms for integration problems of up to 10 to 12 dimensions. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
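As a concrete illustration of combining the two techniques (not the lattice-rule construction studied in the report), the sketch below pairs a scrambled Sobol sequence with an importance-sampling change of variables to estimate a Gaussian tail probability; the shift parameter mu is an illustrative choice:

```python
import numpy as np
from scipy.stats import norm, qmc

def tail_prob_qmc_is(t=3.0, mu=3.0, m=12, seed=0):
    """Estimate P(X > t) for X ~ N(0, 1) using 2**m scrambled Sobol points.
    The change of variables samples from a N(mu, 1) proposal, so the rare
    tail region is well covered, and reweights by the density ratio."""
    u = qmc.Sobol(d=1, scramble=True, seed=seed).random_base2(m).ravel()
    x = mu + norm.ppf(u)                  # proposal draws via inverse CDF
    w = np.exp(-mu * x + 0.5 * mu**2)     # phi(x) / phi(x - mu)
    return np.mean(w * (x > t))
```

With 4096 points the estimate agrees with the exact value norm.sf(3.0) to several decimal places, far beyond what plain Monte Carlo achieves at this budget.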
919

Strategies for non-uniform rate sampling in digital control theory

Khan, Mohammad Samir January 2010 (has links)
This thesis is about digital control theory and presents an account of methods for enabling and analysing intentional non-uniform sampling in discrete compensators. Most conventional control algorithms run into numerical problems when data are collected at sampling rates substantially higher than the dynamics of the equivalent continuous-time operation being implemented. This is of particular interest in applications of digital control in which high sample rates are routinely dictated by system stability requirements rather than by signal processing needs. Considerable recent progress in reducing sample frequency requirements has been made through the use of non-uniform sampling schemes, so-called alias-free signal processing. The approach permits the simplification of complex systems and consequently improves the numerical conditioning of implementation algorithms that would otherwise require very high uniform sample rates. Such means of signal representation and analysis offers a variety of options and is therefore being researched and practised in a number of areas in communications. The control community, however, has not yet investigated the use of intentional non-uniform sampling, and hence the aim of this research project is to investigate the effectiveness of such sampling regimes and to exploit their benefits. Digital control systems exhibit bandwidth limitations enforced by their closed-loop frequency requirements, the calculation delays in the control algorithm, and the interfacing conversion times. These limitations introduce additional phase lags within the control loop that demand very high sample rates. Since non-uniform sampling can reduce the sample frequency requirements of digital processing, it offers the prospect of achieving a higher control bandwidth without resorting to very high uniform sample rates. 
The concept, to the author's knowledge, has not formally been studied, and very few definite answers exist in the control literature regarding the associated analysis techniques. The key contributions presented in this thesis include the development and analysis of a control algorithm designed to accommodate intentional non-uniform sample frequencies. In addition, implementation aspects are presented on an 8-bit microcontroller and an FPGA board. The work begins by establishing a brief historical perspective on the use of non-uniform sampling and its role in digital processing. The study is then applied to the problem of digital control design, and applications are further discussed. This is followed by consideration of implementation aspects on standard hardware.
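As a minimal illustration of a compensator element that tolerates non-uniform sample periods (an assumption-level sketch, not the thesis's algorithm), a first-order lag can be discretised interval by interval, mapping its pole exactly for each elapsed period dt so that the update stays correct however irregular the sample instants are:

```python
import numpy as np

def lowpass_nonuniform(t, u, tau):
    """First-order lag y' = (u - y)/tau stepped over non-uniform sample
    times t: the pole is mapped exactly for each interval via exp(-dt/tau),
    with the input held constant over the interval (zero-order hold)."""
    y = np.zeros_like(u, dtype=float)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        a = np.exp(-dt / tau)              # exact discrete pole for this interval
        y[i] = a * y[i - 1] + (1.0 - a) * u[i - 1]
    return y
```

Because each step uses its own dt, the filter's response to a step input settles to the correct steady state regardless of how the sample instants are distributed; a fixed-coefficient implementation would need one uniform rate to achieve the same.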
920

Highly degenerate diffusions for sampling molecular systems

Noorizadeh, Emad January 2010 (has links)
This work is concerned with sampling and the computation of rare events in molecular systems. In particular, we present new methods for sampling the canonical ensemble corresponding to the Boltzmann-Gibbs probability measure. We combine an equation for controlling the kinetic energy of the system with random noise to derive a highly degenerate diffusion (i.e. a diffusion equation in which diffusion happens only along one or a few degrees of freedom of the system). The concept of hypoellipticity is then used to show that the corresponding Fokker-Planck equation of the highly degenerate diffusion is well posed, and hence we prove that the solution of the highly degenerate diffusion is ergodic with respect to the Boltzmann-Gibbs measure. We find that the new method is more efficient for the computation of dynamical averages, such as autocorrelation functions, than the commonly used Langevin dynamics, especially in systems with many degrees of freedom. Finally, we study the computation of free energy using an adaptive method based on the adaptive biasing force technique.
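The thesis's highly degenerate diffusion is not reproduced here; as the standard baseline it is compared against, the sketch below implements overdamped Langevin dynamics via Euler-Maruyama, whose stationary distribution is also the Boltzmann-Gibbs measure. For the harmonic potential U(x) = x^2/2 with kT = 1, that stationary law is N(0, 1):

```python
import numpy as np

def overdamped_langevin(grad_U, x0=0.0, dt=0.01, n_steps=200_000, kT=1.0, seed=0):
    """Euler-Maruyama discretisation of dx = -grad_U(x) dt + sqrt(2 kT dt) dW.
    The trajectory samples (up to discretisation bias) the Boltzmann-Gibbs
    measure proportional to exp(-U(x)/kT)."""
    rng = np.random.default_rng(seed)
    xs = np.empty(n_steps)
    x = x0
    for i in range(n_steps):
        x += -grad_U(x) * dt + np.sqrt(2.0 * kT * dt) * rng.normal()
        xs[i] = x
    return xs

# Harmonic potential U(x) = x**2 / 2, so grad_U(x) = x
xs = overdamped_langevin(lambda x: x)
```

Discarding an initial burn-in, the empirical mean and variance of the trajectory approach 0 and 1, the moments of the target N(0, 1); a degenerate scheme of the kind the thesis develops aims at the same invariant measure while injecting noise into fewer degrees of freedom.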
