81. Mapping geospatial events based on extracted spatial information from web documents. Rock, Nathaniel Robert, 01 May 2011.
Web documents such as news articles, social feeds, and blogs provide an abundant and readily available data source of spatial information relating to dynamic events such as wildfires, storms, and chemical spills. Research in the fields of geographic information retrieval and natural language processing uses methods to extract place-names from web documents that can be used to geocode these events. However, much of the spatial information in these articles is difficult to use because of the inherent vagueness of natural language. This thesis aims to develop methods that handle the vagueness of natural language descriptions of events by integrating precise spatial information (landmarks and geographic coordinates) with imprecise spatial information, providing a map-based visualization of the likely spatial extent and location of web document events.
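
As a rough illustration of the idea in this abstract, the sketch below combines one precise piece of spatial information (a landmark coordinate) with an imprecise "near" relation to produce a likelihood surface whose high-valued cells approximate an event's plausible extent. The landmark, kernel shape, and width are invented for illustration and are not taken from the thesis.

    import numpy as np

    # Hypothetical landmark extracted from a news article (lon, lat) and a
    # vague "near" relation; the kernel width sigma_km is an assumed
    # fuzziness scale, not a value from the thesis.
    landmark = np.array([-116.8, 34.2])
    sigma_km = 15.0
    km_per_deg = 111.0  # rough degrees-to-km conversion at mid-latitudes

    # Evaluate a Gaussian "near" likelihood on a lon/lat grid around the landmark.
    lons = np.linspace(landmark[0] - 0.5, landmark[0] + 0.5, 200)
    lats = np.linspace(landmark[1] - 0.5, landmark[1] + 0.5, 200)
    lon_g, lat_g = np.meshgrid(lons, lats)
    d_km = km_per_deg * np.hypot(lon_g - landmark[0], lat_g - landmark[1])
    likelihood = np.exp(-0.5 * (d_km / sigma_km) ** 2)

    # Cells above a threshold form the plausible extent of the reported event.
    extent_mask = likelihood > 0.5
    print(f"plausible-extent cells: {extent_mask.sum()} of {extent_mask.size}")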

82. Physical and Functional Events Involved in Conjugal Transfer of Lactose Utilization in Lactococcus lactis subsp. lactis. Wang, Hua, 01 May 1992.
The nature of the cell surface components involved in donor cell clumping (Clu+) and the relationship of Clu+ to high-frequency conjugal transfer of lactose utilization (Lac) in Lactococcus lactis subsp. lactis ML3 were examined. Lactose-positive (Lac+), Clu+ transconjugants, containing a novel 104-kilobase Lac plasmid, were obtained by mating ML3 with LM2301. When used as Lac+ donors in second-round matings, these transconjugants transferred Lac at high frequencies, ranging from 10⁻² to 10⁻⁴ transconjugants per donor CFU. Treatment of donor cells with EDTA- and EGTA-containing solutions or proteolytic enzymes (proteinase K and chymotrypsin A) resulted in a loss of Clu+. By using a direct plate conjugation technique, these treatments were also shown to decrease the capacity for transferring Lac at high frequency. Analysis of cell-surface proteins by SDS-PAGE identified a novel protein of approximately 125 kDa which was present in Clu+ transconjugants but not in non-clumping transconjugants. These results suggest that Clu+ is required for high-frequency Lac transfer in ML3 transconjugants, and that at least one large protein is involved in Clu+. De novo synthesis requirements of donor cells for conjugal transfer of Lac were tested using the direct plate conjugation technique. Results indicate that de novo protein synthesis and RNA synthesis are not required for conjugal transfer of Lac.

83. Psychological Time: The effect of task complexity upon the human estimation of duration. Webber, Simon, January 2007.
This thesis was designed to investigate the effect of task complexity upon how humans estimate duration. Previous task complexity research suggests that duration is overestimated with simple tasks and underestimated with complex tasks. One hundred and forty-two first- and second-year university students participated. Twelve experiments were conducted, which required participants to complete computer-generated jigsaw puzzles and periodically estimate how long they thought they had been doing the puzzle. In Experiment 1, participants were required to complete a jigsaw puzzle before making an estimate. In the remaining eleven experiments, estimates were made throughout the session whilst participants worked on the jigsaw puzzle. In the first four experiments, a task was more complex if there were more puzzle pieces and simpler if there were fewer puzzle pieces. There were no significant results from these four experiments. Given this lack of effect, the next two experiments partially replicated two task complexity studies to determine how task complexity can be used as an explanation for why estimations of duration differ. Again, there were no significant results. The next four experiments tested whether people's estimates of duration were affected by the rate of reinforcement they received (i.e., successfully moving a puzzle piece to a new location per unit time). In the first two of these experiments (7 and 8), there was no effect of the manipulation, which consisted of decreasing the distance a puzzle piece could be moved on the screen relative to the distance the computer mouse was moved, and of fixing the speed at which a puzzle piece could be moved. In Experiments 9 and 10, more discriminative stimuli were used to indicate to participants that a change in the reinforcement rate was occurring. There was a significant result in one condition of Experiment 9, but this effect was not replicated in Experiment 10. In Experiment 11, the reinforcement rate was reduced to zero and there was a significant effect on participants' estimates of duration. However, these results suggested a confound: either the reduced reinforcement rate or the inability to access the jigsaw puzzle could have been affecting estimates of duration. In Experiment 12, access to the jigsaw puzzle was limited whilst the reinforcement rate was controlled, and the results showed that not having access to the jigsaw puzzle affected how participants estimate duration. These findings suggest that information can act as reinforcement, enabling a person to engage in private behaviour. When there is no access to reinforcement, time 'drags' for humans.

84. Sensor placement for microseismic event location. Errington, Angus Frank Charles, 07 November 2006.
Mining operations can produce highly localized, low intensity earthquakes that are referred to as microseismic events. Monitoring of microseismic events is useful in predicting and comprehending hazards, and in evaluating the overall performance of a mine design.

A robust localization algorithm is used to estimate the source position of the microseismic event by selecting the hypothesized source location that maximizes an energy function generated from the sum of the time-aligned sensor signals. The accuracy of localization for the algorithm, characterized by the variance of localization error, depends in part upon the configuration of sensors. Two algorithms, MAXSRC and MINMAX, are presented that use the variance of localization error, in a particular direction, as a performance measure for a given sensor configuration.

The variance of localization error depends, in part, upon the energy spectral density of the microseismic event. The energy spectral density characterizations of sensor signals received in two potash mines are presented and compared using two spectral estimation techniques: multitaper estimation and combined time and lag weighting. It is shown that the difference between the two estimation techniques is negligible. However, the differences between the two mine characterizations, though not large, are significant. An example uses the characterized energy spectral densities to determine the variance of error for a single-step localization algorithm.

The MAXSRC and MINMAX algorithms are explained. The MAXSRC sensor placement algorithm places a sensor as close as possible to the source position with the maximum variance. The MINMAX sensor placement algorithm minimizes the variance of the source position with the maximum variance after the sensor has been placed. The MAXSRC algorithm is simple and can be solved using an exhaustive search, while the MINMAX algorithm uses a genetic algorithm to find a solution. These algorithms are then used in three examples, two of which are simple and synthetic; the other is from Lanigan Potash Mine. The results show that both sensor placement algorithms produce similar results, with the MINMAX algorithm consistently doing better. The MAXSRC algorithm places a single sensor approximately 100 times faster than the MINMAX algorithm. The example shows that the MAXSRC algorithm has the potential to be an efficient and intuitively simple sensor placement algorithm for mine microseismic event monitoring. The MINMAX algorithm provides, at an increase in computational time, a more robust placement criterion which can be solved adequately using a genetic algorithm.
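
A minimal synthetic sketch of the localization scheme this abstract describes: hypothesized source locations are scanned over a grid, sensor traces are time-aligned according to each hypothesis, and the location maximizing the energy of the stacked signal is selected. The geometry, wave speed, pulse shape, and noise level are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    fs, v = 1000.0, 3000.0          # sample rate (Hz), wave speed (m/s)
    sensors = np.array([[0, 0], [500, 0], [0, 500], [500, 500]], float)
    true_src = np.array([320.0, 140.0])

    t = np.arange(0, 0.5, 1 / fs)
    def pulse(t0):                   # simple Ricker-like arrival at time t0
        s = (t - t0) * 60.0
        return (1 - 2 * s**2) * np.exp(-s**2)

    # Synthetic traces: arrivals delayed by travel time, plus noise.
    delays = np.linalg.norm(sensors - true_src, axis=1) / v
    traces = np.array([pulse(d) for d in delays]) \
        + 0.05 * rng.standard_normal((4, t.size))

    # Grid search: shift each trace back by its hypothesized delay and sum;
    # the correct hypothesis stacks arrivals constructively.
    xs = ys = np.arange(0, 501, 20)
    best, best_e = None, -np.inf
    for x in xs:
        for y in ys:
            d = np.linalg.norm(sensors - [x, y], axis=1) / v
            shifts = np.round((d - d.min()) * fs).astype(int)
            stack = sum(np.roll(tr, -s) for tr, s in zip(traces, shifts))
            e = np.sum(stack**2)
            if e > best_e:
                best, best_e = (int(x), int(y)), e
    print("estimated source:", best)  # lands near (320, 140)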

85. Inference for a General Class of Models for Recurrent Events with application to cancer data. González Ruiz, Juan Ramón, 29 December 2005.
Survival analysis arises when we are interested in studying the statistical properties of a variable that describes the time to a single event. In some situations we may observe that the event of interest occurs repeatedly in the same individual, as when a patient diagnosed with cancer relapses over time or when a person is repeatedly readmitted to a hospital. In this case we speak of survival analysis with recurrent events. The recurrent nature of the events makes it necessary to use techniques other than those used to analyze survival times for a single event. This dissertation deals with this type of analysis, mainly motivated by two studies in cancer research that were created specially for this work. One concerns hospital readmissions in patients diagnosed with colorectal cancer, while the other deals with patients diagnosed with non-Hodgkin's lymphoma. The latter study is especially relevant because it includes information about the effect of treatment after relapses, and some authors have noted the need to develop a specific model for relapsing patients in cancer settings.

Our first contribution to univariate analysis is a method for constructing confidence intervals for the median survival time in the recurrent event setting. Two approaches are developed. One is based on asymptotic variances derived from two existing estimators of the survival function, while the other uses bootstrap techniques. The bootstrap approach is useful because one of the estimators used does not yet have a closed form for its variance. The new contribution of this work is the examination of how to bootstrap in the presence of recurrent event data arising from a sum-quota accrual scheme with an informative right-censoring mechanism. Weak convergence is proved, and asymptotic confidence intervals are built according to this result.

Multivariate analysis addresses the problem of how to incorporate more than one covariate in the analysis. In recurrent event settings, we must also take into account that, apart from covariates, the heterogeneity, the number of occurrences, and especially the effect of interventions after reoccurrences may modify the probability of observing a new event in a patient. This last point is very important, since it has not yet been taken into consideration in biomedical studies. To address this problem, we base our work on a new model for recurrent events proposed by Peña and Hollander (2004). Our contribution here is to accommodate cancer relapses within this model, in which the effect of interventions is represented by an "effective age" process acting on the baseline hazard function. We call this the dynamic cancer model.

We also address the problem of estimating the parameters of the general class of models for recurrent events proposed by Peña and Hollander, of which the dynamic cancer model may be seen as a special case. Two approaches are developed. The first is based on semiparametric inference, where the baseline hazard function is specified nonparametrically, and uses the EM algorithm. The second is a penalized likelihood approach in which two different strategies are adopted. One penalizes the partial likelihood, where the penalization bears on the regression coefficients. The other penalizes the full likelihood and gives a nonparametric estimate of the baseline hazard function using a continuous estimator; the solution is approximated using splines. The main advantage of this method is that we can easily obtain smooth estimates of the hazard function and an estimate of the variance of the frailty variance, which is not possible with the other approaches. In addition, this last approach has a considerably lower computational cost. The results obtained with the dynamic cancer model on real data sets indicate that the flexibility of this method provides a safeguard for analyzing data in which patients relapse over time and interventions are performed after tumoral reoccurrences.

Computation is another important contribution of this work to the recurrent event setting. We have developed three R packages, called survrec, gcmrec, and frailtypack, that are available at CRAN, http://www.r-project.org/. These packages allow users, respectively, to compute the median survival time and its confidence intervals, to estimate the parameters of Peña and Hollander's model (in particular the dynamic cancer model) using the EM algorithm, and to estimate these parameters using the penalized approach.
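
As a simplified illustration of the bootstrap approach to median-survival confidence intervals, the sketch below resamples ordinary single-event survival data and takes percentile bounds on the Kaplan-Meier median. This covers only the classical single-event case; the thesis's estimators, implemented in the survrec package, handle the recurrent-event, sum-quota accrual setting, which this toy does not.

    import numpy as np

    rng = np.random.default_rng(1)

    def km_median(time, event):
        """Kaplan-Meier median: first time the survival curve drops to <= 0.5."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        n = len(time)
        at_risk = n - np.arange(n)               # risk set sizes n, n-1, ..., 1
        surv = np.cumprod(1.0 - event / at_risk)  # product-limit estimator
        crossed = np.nonzero(surv <= 0.5)[0]
        return time[crossed[0]] if crossed.size else np.nan

    # Invented data: exponential event times with random right censoring.
    n = 200
    t_event = rng.exponential(12.0, n)
    t_cens = rng.exponential(20.0, n)
    time = np.minimum(t_event, t_cens)
    event = (t_event <= t_cens).astype(float)

    # Percentile bootstrap: resample subjects, recompute the median each time.
    boots = []
    for _ in range(2000):
        idx = rng.integers(0, n, n)
        boots.append(km_median(time[idx], event[idx]))
    lo, hi = np.nanpercentile(np.array(boots), [2.5, 97.5])
    print(f"median = {km_median(time, event):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")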

87. Life Experiences and Resilience in College Students: A Relationship Influenced by Hope and Mindfulness. Collins, Amy B., 16 January 2010.
This study examines the effects of hope and mindfulness on the relationship between life experiences and resilience, as well as the direct relationships among these constructs. Participants were 537 undergraduates who were taking online courses at Texas A&M University. Most participants were traditionally aged and female. Quantitative self-report measures for each construct were administered online via SurveyMonkey.

The small, positive relationship between life events and resilience approached, but did not reach, statistical significance, and hope, but not mindfulness, was found to moderate this relationship. Correlations were significant, positive, and linear between resilience and hope (r = .57, p < .01), resilience and mindfulness (r = .50, p < .01), and hope and mindfulness (r = .44, p < .01). Suggestions for clinical interventions aimed at increasing resilience by increasing hope and mindfulness are provided.
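
For readers unfamiliar with moderation tests of the kind reported above, a minimal sketch follows: the moderating role of hope corresponds to a significant life-events-by-hope interaction in a regression predicting resilience. The data are simulated; only the analysis pattern, not any result, reflects the study.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in data with a built-in interaction effect.
    rng = np.random.default_rng(2)
    n = 537
    life_events = rng.normal(size=n)
    hope = rng.normal(size=n)
    resilience = (0.1 * life_events + 0.5 * hope
                  + 0.2 * life_events * hope + rng.normal(size=n))
    df = pd.DataFrame({"life_events": life_events, "hope": hope,
                       "resilience": resilience})

    # The interaction coefficient carries the moderation effect; in practice
    # predictors are often mean-centered before forming the product term.
    model = smf.ols("resilience ~ life_events * hope", data=df).fit()
    print(model.summary().tables[1])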

88. Simulation of Seismic Real and Virtual Data Using the 3D Finite-Difference Technique and Representation Theorem. Yang, Xiujun, 15 May 2009.
Seismic modeling is a technique for simulating wave propagation through the subsurface. For a given geological model, seismic modeling allows us to generate snapshots of wave propagation and synthetic data. In my dissertation, for real seismic events I have chosen to implement the finite-difference modeling technique. When adequate discretization in space and time is possible, the finite-difference technique is by far one of the most accurate tools for simulating elastic-wave propagation through complex geological models.

In recent years, a significant amount of work has been done in our group using 2D finite-difference modeling. For the complex salt structures that exploration and production industries meet today, 2D finite-difference modeling is not sufficient to study subsalt imaging or the demultiple of subsalt models. That is why I have developed a 3D finite-difference modeling code.

One of the key challenges I have met in developing the 3D finite-difference code is adapting the absorbing boundary conditions, which are needed to represent infinite geological models within a limited computing domain. I have validated the 3D finite-difference code by comparing its results with analytic solutions. I have used the 3D finite-difference program to generate data corresponding to a 3D complex model that describes salt and subsalt structures of the Gulf of Mexico. The resulting data include reflections, diffractions, and other scattering phenomena. I have also used the finite-difference program in an anisotropic context to show that we can effectively predict shear-wave splitting and triplication in the data.

There are new sets of events that are not directly recorded in seismic data; they have been called virtual events. These events are turning out to be as important as real events in modern data processing, so we also have to learn how to model them. Unfortunately, they cannot yet be modeled directly with finite differences. Here I describe how to model these events by using a cross-correlation-type representation theorem. As an illustration of how important virtual events are for seismic data processing, I also describe an internal multiple attenuation technique that utilizes virtual events.
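
A toy version of the core machinery this abstract describes: a second-order acoustic finite-difference scheme on a small 2D grid, with a damping "sponge" layer standing in for the absorbing boundary conditions. The dissertation's actual scheme, model, and boundary treatment are not specified here; all values below are illustrative.

    import numpy as np

    nx = nz = 200
    dx, dt, c = 10.0, 0.001, 3000.0   # m, s, m/s; CFL number c*dt/dx = 0.3
    npad = 30                          # sponge width in cells

    # Damping profile: ~1 in the interior, decaying toward the grid edges.
    taper = np.ones(nx)
    ramp = np.exp(-(0.015 * np.arange(npad, 0, -1)) ** 2)
    taper[:npad], taper[-npad:] = ramp, ramp[::-1]
    damp = np.outer(taper, taper)

    p_old = np.zeros((nz, nx))
    p = np.zeros((nz, nx))
    lap = np.zeros((nz, nx))
    for it in range(600):
        # Five-point Laplacian of the pressure field.
        lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:]
                           + p[1:-1, :-2] - 4 * p[1:-1, 1:-1]) / dx**2
        p_new = 2 * p - p_old + (c * dt) ** 2 * lap
        if it < 50:                    # injected Ricker-like source term
            t0 = (it * dt - 0.025) * 250.0
            p_new[nz // 2, nx // 2] += (1 - 2 * t0**2) * np.exp(-t0**2)
        # Sponge attenuates outgoing waves instead of letting them reflect.
        p_old, p = p * damp, p_new * damp
    print("max |p| after 600 steps:", np.abs(p).max())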

89. Things that happen. Tiles, J. E., 1981.
Revision of thesis (D.Phil.), Oxford University, 1978. Includes bibliographical references (p. [115]-116) and index.

90. Crisis growth: outcomes and determinants. Wonacott, Nancy Louise, January 2001.
Thesis (Ph.D.), University of Texas at Austin, 2001. Vita. Includes bibliographical references (leaves 223-238). Also available in a digital version from Dissertation Abstracts.