31

MAPPING AND DECOMPOSING SCALE-DEPENDENT SOIL MOISTURE VARIABILITY WITHIN AN INNER BLUEGRASS LANDSCAPE

Landrum, Carla 01 January 2013 (has links)
There is a shared desire among public and private sectors to make more reliable predictions, accurate maps, and appropriate scalings of soil moisture and associated parameters across landscapes. A discrepancy often exists between the scale at which soil hydrologic properties are measured and the scale at which they are modeled for management purposes. Moreover, little is known about the relative importance of hydrologic modeling parameters as soil moisture fluctuates with time. More research is needed to establish which observation scales in space and time are optimal for managing soil moisture variation over large spatial extents and how these scales are affected by fluctuations in soil moisture content with time. This research fuses high-resolution geoelectric and light detection and ranging (LiDAR) data as auxiliary measures to support sparse direct soil sampling over a 40 hectare inner Bluegrass, Kentucky (USA) landscape. A Veris 3100 was used to measure shallow and deep apparent electrical conductivity (aEC) in tandem with soil moisture sampling on three separate dates with ascending soil moisture contents ranging from plant wilting point to near field capacity. Terrain attributes were produced from 2010 LiDAR ground returns collected at ≤1 m nominal pulse spacing. Exploratory statistics revealed the variables that best associate with soil moisture on each date, including terrain features (slope, profile curvature, and elevation), soil physical and chemical properties (calcium, cation exchange capacity, organic matter, clay, and sand), and aEC. Multivariate geostatistics, time stability analyses, and spatial regression were performed to characterize scale-dependent soil moisture patterns in space and time and to determine which soil-terrain parameters influence soil moisture distribution. Results showed that soil moisture variation was time stable across the landscape and primarily associated with long-range (~250 m) soil physicochemical properties. When the soils approached field capacity, however, there was a shift in relative importance from long-range soil physicochemical properties to short-range (~70 m) terrain attributes, although this shift did not cause time instability. These results suggest that soil moisture's interaction with soil-terrain parameters is time dependent and that this dependence influences which observation scale is optimal for sampling and managing soil moisture variation.
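The time stability analysis named above is not detailed in the abstract; a minimal sketch of one standard formulation, the mean relative difference per sampling location, is given below. The array shapes, variable names, and synthetic data are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def time_stability(theta):
    """Rank sampling locations by temporal stability of soil moisture.

    theta : array of shape (n_dates, n_locations) holding volumetric
            soil moisture observations, one row per sampling date.
    Returns the mean relative difference (MRD) and its standard deviation
    (SDRD) for each location; locations with MRD near zero and small SDRD
    are the most time-stable.
    """
    # Spatial mean soil moisture on each date
    field_mean = theta.mean(axis=1, keepdims=True)
    # Relative difference of each location from the field mean
    rel_diff = (theta - field_mean) / field_mean
    mrd = rel_diff.mean(axis=0)          # persistent bias of each location
    sdrd = rel_diff.std(axis=0, ddof=1)  # temporal consistency of that bias
    return mrd, sdrd

# Synthetic example: 3 sampling dates (dry to near field capacity), 5 locations
theta = np.array([[0.12, 0.15, 0.10, 0.18, 0.14],
                  [0.22, 0.26, 0.19, 0.30, 0.25],
                  [0.31, 0.35, 0.28, 0.40, 0.33]])
mrd, sdrd = time_stability(theta)
print(np.argsort(sdrd))  # most time-stable locations first
```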
32

Thinking Aloud in the Science Classroom: Can a literacy strategy increase student learning in science?

Mockel, Lindsey Joan 27 August 2013 (has links)
This research study investigated the effect of using the think aloud protocol while reading informational text on students' ability to learn from text in a secondary science classroom. The participants in this study were high school students (n=47) in three classes of a mixed-grade Integrated Biology, Chemistry, and Physics course. The study tracked student achievement during a four-week curriculum unit on the theory of evolution and evidence for biological evolution. All students received instruction on using the think aloud protocol, and all students practiced the protocol when reading short articles related to scientific evidence for evolution. The researcher measured students' ability to read and understand science text by comparing each student's scores on a reading skills pre-assessment and post-assessment. Student surveys were conducted to gather feedback on the effectiveness of the think aloud protocol as a literacy strategy for reading science text. Data were analyzed using descriptive statistics.
33

A Paleoclimate Modeling Experiment to Calculate the Soil Carbon Respiration Flux for the Paleocene-Eocene Thermal Maximum

Tracy, David M 01 January 2012 (has links) (PDF)
The Paleocene-Eocene Thermal Maximum (PETM) (55 million years ago) stands as the largest in a series of extreme warming (hyperthermal) climatic events, which are analogous to the modern-day increase in greenhouse gas concentrations. Orbitally triggered (Lourens et al., 2005; Galeotti et al., 2010), the PETM is marked by a large (−3‰) carbon isotope excursion (CIE). The CIE has been hypothesized to be methane driven, but Zeebe et al. (2009) noted that a methane-based release would account for only 3.5°C of warming. An isotopically heavier carbon source, such as soil and C3 plant carbon, has the potential to account for both the warming and the CIE (Zachos et al., 2005). During the early Eocene, high-latitude surface temperatures created favorable conditions for the sequestration of terrestrial carbon. A large untapped terrestrial carbon reservoir, such as that within permafrost regions, has the potential, if degraded, to account for the CIE as well as the global temperature increase observed during the PETM. Using a fully integrated climate model (GENESIS) with a fully coupled vegetation model (BIOME4), we show that adequate conditions for permafrost growth and terrestrial carbon sequestration did exist during the lead-up to the PETM. By calculating the flux of net primary production (NPP) and soil respiration (Rs), we demonstrate that the biodegradation of permafrost-based carbon reservoirs had the potential to drive the PETM. Furthermore, we show that the natural planetary response to unbalanced carbon reservoirs resulted in the terrestrial sequestration of atmospheric carbon via permafrost regeneration, yielding a vulnerable carbon reservoir for the subsequent hyperthermal.
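The flux calculation is described only at a high level; the sketch below shows the basic bookkeeping of NPP against soil respiration under a simple Q10-style temperature dependence, which is an assumption for illustration and not the GENESIS/BIOME4 formulation. All parameter values are placeholders.

```python
import numpy as np

def net_carbon_flux(npp, soil_carbon, temp_c, r_base=0.025, q10=2.0, t_ref=10.0):
    """Net terrestrial carbon flux as NPP minus soil respiration (Rs).

    npp         : net primary production (kg C m^-2 yr^-1)
    soil_carbon : soil carbon stock (kg C m^-2)
    temp_c      : soil temperature (deg C)
    r_base      : base respiration rate at t_ref (yr^-1), placeholder value
    q10         : temperature sensitivity of respiration, placeholder value

    A positive result means the land surface sequesters carbon; a negative
    result means the soil reservoir degrades and releases carbon.
    """
    rs = r_base * soil_carbon * q10 ** ((temp_c - t_ref) / 10.0)
    return npp - rs

# Illustrative warming scenario: the same grid cell before and during the hyperthermal
print(net_carbon_flux(npp=0.6, soil_carbon=30.0, temp_c=5.0))   # positive: carbon sink
print(net_carbon_flux(npp=0.6, soil_carbon=30.0, temp_c=20.0))  # negative: carbon source
```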
34

Convolution and Autoencoders Applied to Nonlinear Differential Equations

Borquaye, Noah 01 December 2023 (has links) (PDF)
Autoencoders, a type of artificial neural network, have gained recognition among researchers in various fields, especially machine learning, owing to their wide application in learning data representations from inputs. Recently, researchers have explored extending autoencoders to solve nonlinear differential equations. Algorithms and methods employed in an autoencoder framework include sparse identification of nonlinear dynamics (SINDy), dynamic mode decomposition (DMD), Koopman operator theory, and singular value decomposition (SVD). These approaches use matrix multiplication to represent linear transformations, whereas machine learning algorithms often represent linear transformations with convolution. In our work, we modify these approaches to system identification and forecasting of solutions of nonlinear differential equations by replacing matrix multiplication with a convolution transformation. In particular, we develop a convolution-based approach to dynamic mode decomposition and discuss its application to problems not solvable otherwise.
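The abstract does not give the algorithm; the sketch below illustrates the underlying idea under simple assumptions. Instead of fitting a dense DMD operator that maps one snapshot to the next, it fits a circular convolution kernel, estimated frequency by frequency with FFT-based least squares on synthetic one-dimensional data. Function and variable names are hypothetical.

```python
import numpy as np

def conv_dmd(snapshots):
    """Fit circular-convolution dynamics: the next snapshot is approximated
    by convolving the current one with a fixed kernel k.

    snapshots : array of shape (n_times, n_points), consecutive states of a
                spatially discretized system.
    In the Fourier domain convolution is elementwise multiplication, so each
    frequency multiplier is a scalar least-squares fit across snapshot pairs.
    """
    X = np.fft.fft(snapshots[:-1], axis=1)   # states at times t
    Y = np.fft.fft(snapshots[1:], axis=1)    # states at times t + 1
    num = (np.conj(X) * Y).sum(axis=0)
    den = (np.abs(X) ** 2).sum(axis=0) + 1e-12
    K = num / den                            # transfer function per frequency
    return np.real(np.fft.ifft(K))           # convolution kernel in space

# Synthetic data: a Gaussian pulse advecting one grid point per step
n, steps = 64, 20
x0 = np.exp(-0.5 * ((np.arange(n) - 16) / 3.0) ** 2)
snapshots = np.array([np.roll(x0, t) for t in range(steps)])

k = conv_dmd(snapshots)
# One-step forecast: convolve the last snapshot with the learned kernel
pred = np.real(np.fft.ifft(np.fft.fft(k) * np.fft.fft(snapshots[-1])))
print(np.allclose(pred, np.roll(snapshots[-1], 1), atol=1e-8))  # True
```

Because the learned operator is a single kernel rather than an n-by-n matrix, the model has far fewer parameters and the fit reduces to independent per-frequency regressions.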
35

Newsvendor Models With Monte Carlo Sampling

Ekwegh, Ijeoma W 01 August 2016 (has links)
The newsvendor model is used to solve inventory problems in which demand is random. In this thesis, we focus on a method that uses Monte Carlo sampling to estimate the order quantity that either maximizes revenue or minimizes cost when demand is uncertain. Given demand data, the Monte Carlo approach is used to sample demand over scenarios and to estimate the probability density function, and a bootstrapping process yields an empirical distribution for the order quantity that maximizes the expected profit. Finally, the method is applied to a newsvendor example to show that it works in maximizing profit.
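A minimal sketch of the expected-profit estimate over sampled demand scenarios is given below; the demand distribution, price, and cost are illustrative assumptions rather than values from the thesis.

```python
import numpy as np

def best_order_quantity(demand_samples, price, cost, candidates):
    """Pick the order quantity that maximizes Monte Carlo expected profit.

    demand_samples : array of simulated (or bootstrapped) demand scenarios
    price, cost    : unit selling price and unit purchase cost
    candidates     : iterable of order quantities to evaluate
    """
    best_q, best_profit = None, -np.inf
    for q in candidates:
        sales = np.minimum(demand_samples, q)        # cannot sell more than ordered
        profit = (price * sales - cost * q).mean()   # average profit over scenarios
        if profit > best_profit:
            best_q, best_profit = q, profit
    return best_q, best_profit

# Illustrative example: normally distributed demand sampled 10,000 times
rng = np.random.default_rng(0)
demand = np.maximum(rng.normal(100, 20, size=10_000), 0)
q, profit = best_order_quantity(demand, price=5.0, cost=3.0, candidates=range(50, 151))
print(q, round(profit, 2))
```

Resampling the demand data many times and repeating the search gives the empirical distribution of the optimal order quantity that the bootstrapping step describes.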
36

Can a comprehensive transition plan to barefoot running be the solution to the injury epidemic in American endurance runners?

Scarlett, Michael A. 01 January 2018 (has links)
Fossils belonging to the genus Homo, dating as far back as two million years ago, exhibit uniquely efficient features suggesting that early humans had evolved to become exceptional endurance runners. Although they did not have the cushioning or stability-control features provided in our modern running shoes, our early human ancestors experienced far fewer of the running-related injuries we experience today. The injury rate has been estimated as high as 90% annually for Americans training for a marathon and as high as 79% annually for all American endurance runners. There is an injury epidemic in conventionally shod populations that does not exist in the habitually unshod or minimally shod populations around the world. This has led many to conclude that the recent advent of highly technological shoes might be the problem. Although the current literature has been inconclusive, there are two main limitations in virtually all of the studies: 1) transition phases of less than three months and 2) transition phases without rehabilitation exercises. Both aspects are key to addressing the structural consequences that habitual shoe use has had on the muscles and tendons of the foot and calf. This study includes a discussion of the cumulative consequences that lifelong shoe usage has on the development of the feet and lower legs. I propose a 78-week study that addresses the limitations of past studies by implementing a gradual, 32-week, multi-shoe transition complemented by an evidence-based rehabilitation program. I believe that this approach will restore strength and elasticity to muscles and tendons that have been inhibited by lifelong usage of overconstructed shoes and adequately prepare runners for the increased demand brought on by changing running mechanics. This comprehensive, multifaceted transition plan to a fully minimalist shoe will provide novel insight into the ongoing barefoot debate. Can this approach finally demonstrate the proposed benefits of losing the shoes?
37

The Effects of the Use of Technology In Mathematics Instruction on Student Achievement

Myers, Ron Y 30 March 2009 (has links)
The purpose of this study was to examine the effects of the use of technology on students' mathematics achievement, particularly their Florida Comprehensive Assessment Test (FCAT) mathematics results. Eleven schools within the Miami-Dade County Public School System participated in a pilot program on the use of Geometer's Sketchpad (GSP). Three of these schools were randomly selected for this study. Each school sent a teacher to a summer in-service training program on how to use GSP to teach geometry. In each school, the GSP class and a traditional geometry class taught by the same teacher were the study participants. Students' FCAT mathematics results were examined to determine whether GSP produced any effects. Students' scores were compared based on assignment to the control or experimental group as well as gender and SES; SES was measured by whether students qualified for free lunch. The findings of the study revealed a significant difference in the FCAT mathematics scores of students who were taught geometry using GSP compared with those taught by the traditional method. No significant differences existed between the FCAT mathematics scores of the students based on SES, and likewise none based on gender. In conclusion, the use of technology (particularly GSP) is likely to boost students' FCAT mathematics test scores. The findings also show that the use of GSP may be able to close known gender- and SES-related achievement gaps. The results of this study support policy changes in the way geometry is taught to 10th grade students in Florida's public schools.
38

Software Internationalization: A Framework Validated Against Industry Requirements for Computer Science and Software Engineering Programs

Vũ, John Huân 01 March 2010 (has links)
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c. In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." With increasing connectivity through the internet, the move towards a global economy, and the growing use of technology, software internationalization has become a more important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested five new courses to add to the computer science curriculum because of the growing "gap between what college graduates in any field are taught and what they need to know to work in industry." He concludes that "globalization and accessibility should be part of any course of introductory programming," stating: "A course on globalization and accessibility is long overdue on college campuses. It is embarrassing to take graduates from a college with a diverse student population and have to teach them how to write software for a diverse set of customers. This should be part of introductory software development. Anything less is insulting to students, their family, and the peoples of the world." There is very little research into how the subject of software internationalization should be taught to meet the major requirements of industry. The research question of the thesis is thus, "Is there a framework for software internationalization that has been validated against industry requirements?" The answer is no. Such a framework "would promote communication between academia and industry ... that could serve as a common reference point in discussions." Since no framework for software internationalization currently exists, one is developed here. The contribution of this thesis includes a provisional framework to prepare graduates to internationalize software and a validation of the framework against industry requirements. The aim of the framework is to provide a portable and standardized set of requirements for computer science and software engineering programs to teach future graduates.
