About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Evaluation of indicators of stress in populations of polar bears (Ursus maritimus) and grizzly bears (Ursus arctos)

Hamilton, Jason. 07 January 2008.
Grizzly and polar bears are top predators in their respective ecosystems and are therefore indicative of overall ecosystem health. At present there are few data on the stress status of these animals, and the development of reliable stress indicators is important as both species face rapid environmental change. Polar bears from Hudson Bay (Ontario, Canada) and grizzly bears from Alberta, Canada, were anaesthetized, and blood samples were collected and assayed for changes in serum-based indicators of stress. Serum cortisol, the predominant corticosteroid in mammals and a commonly used indicator of stress, was measured to evaluate its potential as a chronic stress indicator in bears. Because the cortisol response to a stressor is induced rapidly, it is influenced by the stress of capture itself. Serum levels of heat shock proteins (hsps), specifically the 60 kilodalton (hsp60) and 70 kilodalton (hsp70) families, were therefore also measured to evaluate their reliability as stress indicators in bears. Heat shock proteins have traditionally been measured in tissues; however, recent studies have detected them in serum in response to chronic stress. In addition, the study examined the feasibility of using corticosteroid-binding globulin (CBG), a serum protein that binds cortisol, as a stress indicator in bears. CBG regulates the availability of cortisol to the tissues (only unbound cortisol elicits a response) but, unlike cortisol, is not rapidly regulated by acute stress. Bear CBG was isolated and a specific anti-bear CBG antibody was generated. An enzyme-linked immunosorbent assay (ELISA) based on this antibody has the potential to be a useful tool for assessing the longer-term stress response in bears. Known life-history variables were correlated with the observed levels of the serum indicators to elucidate which environmental factors affect bears. The length of sea-ice coverage was the strongest determinant of serum cortisol and hsp70 levels in polar bears; longer ice cover means more feeding time, and ice cover is being reduced by climatic warming. This suggests that fasting-associated metabolic changes may be affecting the serum cortisol response and hsp70 levels in polar bears. For grizzly bears, the proportion of protected home range showed the strongest correlation with the stress indicators, suggesting that human impacts on the environment, including resource extraction and landscape change, result in altered serum cortisol and hsp70 levels. Hsp60 did not vary significantly with the environmental variables examined, so no correlation could be established between serum hsp60 levels and environmental variables in bears. Serum hsp70, in contrast, changed significantly in response to environmental variables in both polar and grizzly bears. Together with the changes in cortisol and other health-based indicators, these data suggest that hsp70 could be a useful indicator of altered health status in bears. This study is the first attempt to evaluate a suite of serum stress indicators as an integrated tool for assessing the health status of bears. The lack of a control group for comparison with the wild populations limits the utility of the observed variables as a tool for detecting stressed states in bears.
However, because these serum indicators are also modulated by the animal's health and life history, including food limitation, monitoring them along with other indicators of fed and fasted states may give a better picture of an animal's health status in relation to nutrient availability.
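The correlation of life-history and environmental variables with serum indicators described in this abstract can be illustrated with a minimal sketch. The variable names and numbers below are invented placeholders, not data from the study; the sketch only shows the kind of pairwise correlation the abstract refers to.

```python
# Hypothetical sketch: correlating an environmental variable (days of sea-ice
# cover) with a serum stress indicator (cortisol), one value per sampled bear.
# All numbers are made-up placeholders, not measurements from the study.
from scipy.stats import pearsonr

ice_cover_days = [170, 182, 165, 158, 175, 190, 160, 168]   # days of ice cover
serum_cortisol = [310, 250, 340, 380, 290, 230, 360, 330]   # nmol/L (invented)

r, p_value = pearsonr(ice_cover_days, serum_cortisol)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```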
2

Fecal cortisol in sheep: excretion curve and stability (Cortisol fecal em ovinos: curva de excreção e estabilidade)

Longo, Ana Luisa Silva. 26 February 2016.
The present study was divided into two experiments, with the aim of determining the excretion curve of fecal cortisol and its stability in feces exposed to different times and temperatures between collection and analysis, and of correlating fecal cortisol levels with the peak of blood cortisol. The project was carried out at the Biometeorology and Ethology Laboratory, Faculty of Animal Science and Food Engineering, University of São Paulo, Pirassununga, SP. In experiment 1, six crossbred (Dorper x Santa Inês) ewes had their total feces collected for 24 hours after administration of adrenocorticotropic hormone (ACTH); blood samples were taken before ACTH administration and 60, 120 and 300 minutes afterwards, and a reactivity score was assigned to each animal during these collections.
Shortly after these analyses, experiment 2 was started, in which nine crossbred (Dorper x Santa Inês) lambs were subjected to thermal stress from 11 a.m. to 3 p.m. and had their feces collected at 11 p.m. on the same day. After collection, the feces were pooled and homogenized into three separate groups, from which aliquots were taken for the proposed treatments: three temperatures (15, 25 and 35 °C) and four time intervals (1, 3, 6 and 12 hours). The excretion-curve data were analyzed by ANOVA, together with the correlations among blood cortisol, fecal cortisol and reactivity. Stability was analyzed with a two-factor ANOVA (temperature and time interval). Behavioral variables, expressed as percentages, were arcsine-square-root transformed before analysis of variance; the statistical model included the effect of day (1, 2 and 3), with individual analysis per animal. Blood cortisol, respiratory rate and rectal temperature were analyzed by t-test and Pearson correlation, and all comparisons of means were performed with F-tests and t-tests (PDIFF). Reactivity during collection had no significant effect on blood cortisol, which showed the highest means 60 minutes after ACTH administration; by 300 minutes the ewes had returned to cortisol levels considered normal for unstressed sheep. The peak of fecal cortisol, in contrast, occurred approximately 10-12 hours after the blood cortisol peak, with no significant decrease indicating a return to basal levels within the 24-hour period (P>0.05). No significant differences were found among the times and temperatures to which the fecal samples were subjected (P>0.05), indicating that fecal cortisol concentration in sheep tends to remain stable for 12 hours after collection.
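The stability analysis summarized above (a two-factor ANOVA on temperature and storage time, plus an arcsine-square-root transform for percentage-scale behavioral data) can be sketched as follows. The data frame, sample sizes and values are hypothetical placeholders chosen only to illustrate the procedure, not the study's measurements.

```python
# Hypothetical sketch of the two-factor stability analysis (temperature x time)
# and of the arcsine-square-root transform used for percentage data.
# All values are invented placeholders, not measurements from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Fecal cortisol (ng/g, made up) for 3 temperatures x 4 storage times, 3 pools each
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "temperature": np.repeat([15, 25, 35], 12),
    "time_h": np.tile(np.repeat([1, 3, 6, 12], 3), 3),
    "cortisol": rng.normal(50, 5, 36),
})

# Two-factor ANOVA: temperature, time and their interaction
model = ols("cortisol ~ C(temperature) * C(time_h)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))

# Arcsine-square-root transform for percentage-scale behavioral variables
percent_values = np.array([12.0, 35.0, 60.0, 85.0])        # % of time (made up)
transformed = np.arcsin(np.sqrt(percent_values / 100.0))   # result in radians
print(transformed)
```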
3

A Comparison of Child Morbidity and Mortality in Two Contrasting Medieval Cemeteries in Denmark.

Schutkowski, Holger; Bennike, P.; Lewis, Mary Elizabeth; Valentin, F. 29 June 2009.
This study compares associations between demographic profiles, long bone lengths, bone mineral content, and frequencies of stress indicators in the preadult populations of two medieval skeletal assemblages from Denmark. One is from a leprosarium, and thus probably represents a disadvantaged group (Næstved). The other comes from a normal, and in comparison rather privileged, medieval community (Æbelholt). Previous studies of the adult population indicated differences between the two skeletal collections with regard to mortality, dental size, and metabolic and specific infectious disease. The two samples were analyzed against the view known as the osteological paradox (Wood et al. [1992] Curr. Anthropol. 33:343-370), according to which skeletons displaying pathological modification are likely to represent the healthier individuals of a population, whereas those without lesions would have died without acquiring modifications as a result of a depressed immune response. Results reveal that older age groups among the preadults from Næstved are shorter and have less bone mineral content than their peers from Æbelholt. On average, the Næstved children have a higher prevalence of stress indicators, and in some cases display skeletal signs of leprosy. This is likely a result of the combination of compromised health and social disadvantage, thus supporting a more traditional interpretation. The study provides insights into the health of children from two different biocultural settings of medieval Danish society and illustrates the importance of comparing samples of single age groups.
4

Medieval populations, society and climate: an interdisciplinary approach to the study of two skeletal assemblages from Bucharest and Braşov (Romania), 14th-18th cent. AD

Diana, Annamaria. January 2016.
The complex relationship between human societies and the environment has become a thriving field of research over the past three decades. The contribution of human osteoarchaeology to exploring this relationship, however, has been rather limited. Two unpublished late medieval skeletal assemblages unearthed in the historical centres of Bucharest and Braşov (located in southern and north-central Romania respectively) seemed ideal choices for investigating the impact of the substantial climatic and environmental changes that took place worldwide between the 14th and the 18th century AD. As witnessed by medieval artistic and documentary sources, this unsettled climate was mirrored in human populations by social and political instability, epidemics and famine, but also by the rise of new cultural movements. The analysis of over 600 individuals (a minimum of 421 individuals from Bucharest and 206 from Braşov) was carried out to: 1) provide a thorough osteological analysis, and compare and statistically test the collected data to reconstruct demographic and pathological patterning; 2) identify 'skeletal environmental markers', i.e. possible indicators of the effect of climatic shifts on the human body; and 3) cross-reference osteological, archaeological, historical and climatological data in order to present a robust biocultural assessment of the impact of environmental and historical events on the Romanian population during the Middle Ages. The identification of low life expectancy, higher mortality rates for children and young adults, and generally high morbidity levels was in line with other studies of medieval populations. However, evidence for a high prevalence of specific physiological and psychological stress markers was observed in these two geographically, culturally and economically different urban communities. As a strong mortality- and morbidity-shaping factor, the detrimental effect of climate anomalies is one of the main explanations for these findings, and it is supported by medieval historical sources and recent advances in Romanian climatological studies. Despite some limitations (i.e. incomplete chronological information for most of the burial contexts, minimal local historical sources, lack of funding for isotopic analyses, and time constraints), the results of the present study offer a new perspective on the relationship between Romanian medieval populations and their living environment, and demonstrate the enormous potential of interdisciplinary bioarchaeological research in Romania.
5

Bringing Childhood Health into Focus: Incorporating Survivors into Standard Methods of Investigation

Holland, Emily. 09 January 2014.
The osteological paradox addresses how well interpretations of past population health generated from human skeletal remains reflect the health of the living population from which they were drawn. Selective mortality and hidden heterogeneity in frailty are particularly relevant when assessing childhood health in the past, as subadults are the most vulnerable group in a population and are therefore less likely to fully represent the health of those who survived. The ability of subadults to represent the health of those who survived is tested here by directly comparing interpretations of childhood stress based on non-survivors (subadults aged 6-20; 14 females and 9 males) to those based on retrospective analyses of survivors (adults aged 21-46; 26 females and 27 males). Non-survivors and survivors were directly matched by birth year, using the Coimbra Identified Skeletal Collection; interpretations of childhood stress therefore reflect a shared childhood. Long bone and vertebral canal growth, linear enamel hypoplasia, cribra orbitalia, porotic hyperostosis, scurvy indicators and periosteal bone reactions were assessed for both groups. Overall, long bone growth generates the same interpretation of health for both non-survivors and survivors, and both groups exhibit the same range of stress (mild to severe), but the pattern of stress experienced in childhood differs between the two groups. Female survivors reveal different timing of stress episodes and a higher degree of stress than female non-survivors. Male survivors exhibit less stress than male non-survivors. These different patterns suggest that interpretations based solely on non-survivors would under-represent the stress experienced by female survivors and over-represent the stress experienced by male survivors, further demonstrating the importance of addressing issues of selective mortality. In addition, these different patterns suggest that hidden heterogeneity in frailty may be sex-specific, with males more vulnerable to stress and females more able to develop resistance to stress and survive.
6

Validation of the use of physiological stress indicators as indicators of habitat quality (Validation de l'utilisation d'indicateurs physiologiques de stress comme indicateurs de qualité des habitats)

Lejeune, Cédric. 01 1900.
No description available.
7

Regulated deficit irrigation in citrus: agronomic response and water stress indicators

Ballester Lurbe, Carlos. 06 May 2013.
In the Mediterranean area water is a scarce natural resource and periods of drought are frequent, so it is important to increase the water-use efficiency of irrigated crops. One promising option for achieving this is regulated deficit irrigation (RDI). RDI consists of reducing water application during stages of crop development in which yield and fruit quality have low sensitivity to water stress, while full irrigation is provided during the rest of the season to maintain production and fruit quality at adequate levels (Behboudian and Mills, 1997). In citrus, flowering and fruit set are periods sensitive to water restrictions, because water stress at this time increases fruit drop (Ginestar and Castel, 1996). The most appropriate phenological period for applying water restrictions appears to be the summer, provided that irrigation returns to full dosage sufficiently before harvest to allow compensatory fruit growth (Cohen and Goell, 1988). Previous work by González-Altozano and Castel (1999) showed the feasibility of applying RDI in 'Clementina de Nules' and identified threshold values of plant water stress that allowed water savings of about 10-20% without any detrimental effect on yield or fruit size. It is now desirable to study the extrapolation of these results to commercial citrus orchards and to assess the use of RDI in different citrus cultivars. Two RDI strategies (RDI-1, irrigated at 50% of crop evapotranspiration (ETc) during the summer, and RDI-2, irrigated at 35% of ETc during the same period as RDI-1) will be compared with a control treatment irrigated to meet full requirements. Because the level of water stress reached by the trees is the key factor when RDI strategies are applied, accurate water-stress indicators for citrus also need to be studied. Thus, during the period of water restriction, sap flow and canopy temperature measurements, obtained by thermal imaging or by fixed infrared thermometer sensors, will be assessed and compared with classical methods such as stem water potential and stomatal conductance. / Ballester Lurbe, C. (2013). Regulated deficit irrigation in citrus: agronomic response and water stress indicators [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/28582
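As a rough illustration of the RDI strategies compared above (50% and 35% of crop evapotranspiration during summer, full irrigation otherwise), the sketch below computes a daily irrigation dose from a reference evapotranspiration value. The crop coefficient, the ETo value and the summer window are hypothetical placeholders, not parameters taken from the thesis.

```python
# Hypothetical sketch of computing RDI irrigation doses from crop evapotranspiration.
# KC, ETo and the summer window are invented placeholders, not values from the thesis.
from datetime import date

KC = 0.65                                        # assumed crop coefficient for mature citrus
SUMMER = (date(2024, 6, 15), date(2024, 9, 15))  # assumed deficit-irrigation window

def irrigation_dose(eto_mm: float, day: date, deficit_fraction: float) -> float:
    """Daily dose in mm: ETc = Kc * ETo, reduced only inside the summer window."""
    etc = KC * eto_mm
    if SUMMER[0] <= day <= SUMMER[1]:
        return etc * deficit_fraction   # e.g. 0.50 for RDI-1, 0.35 for RDI-2
    return etc                          # full irrigation the rest of the season

today = date(2024, 7, 20)
eto_today = 6.8  # mm/day, made-up reference evapotranspiration
for name, frac in [("Control", 1.00), ("RDI-1", 0.50), ("RDI-2", 0.35)]:
    print(f"{name}: {irrigation_dose(eto_today, today, frac):.2f} mm")
```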
