
The Effect of Aleks on Students' Mathematics Achievement in an Online Learning Environment and the Cognitive Complexity of the Initial and Final Assessments

Nwaogu, Eze, 11 May 2012
For many courses, mathematics included, there is an associated interactive e-learning system that provides assessment and tutoring. Some of these systems are classified as Intelligent Tutoring Systems. MyMathLab, MathZone, and Assessment and LEarning in Knowledge Spaces (ALEKS) are a few of the interactive e-learning systems in mathematics. In ALEKS, assessment and tutoring are based on Knowledge Space Theory. Previous studies in traditional learning environments have shown that ALEKS users perform as well as or better in mathematics achievement than students who did not use ALEKS. The purpose of this research was to investigate the effect of ALEKS on students' mathematics achievement in an online learning environment and to determine the cognitive complexity of the mathematical tasks enacted by ALEKS's initial (pretest) and final (posttest) assessments. The target population was undergraduate students enrolled in an online College Mathematics I course at a private university in the southwestern United States. The study used a quasi-experimental, one-group, non-randomized pretest-posttest design. Five methods of analysis and one model were used in analyzing the data: a t-test, correlational analysis, simple and multiple regression analysis, Cronbach's alpha reliability test, and Webb's depth-of-knowledge model. The t-test showed a significant difference between the pretest and posttest scores, indicating that ALEKS had a significant effect on students' mathematics achievement. The correlational analysis showed a significant positive linear relationship between the concept mastery reports and the formative and summative assessment reports, indicating a direct relationship between ALEKS concept mastery and assessment performance. The regression analysis showed that mathematics achievement with ALEKS is better predicted when the time spent learning in ALEKS and the concept mastery scores are included in the model.
According to Webb's depth-of-knowledge model, the cognitive complexity of the pretest and posttest question items used by ALEKS was as follows: 50.5% required application of skills and concepts, 37.1% required recall of information, and 12.4% required strategic thinking. None of the question items required extended thinking or complex reasoning, implying that ALEKS is appropriate for building skills and concepts at this level of mathematics.
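The pretest/posttest comparison described above can be sketched as a paired t-test. The scores below are invented for illustration and are not the study's data; a minimal stdlib-only version looks like this:

```python
import math
from statistics import mean, stdev

# Hypothetical pretest/posttest scores for eight students (not the study's data).
pretest = [52, 61, 48, 66, 70, 55, 63, 58]
posttest = [60, 71, 57, 77, 80, 67, 72, 69]

# Paired t-test: test whether the mean of the per-student differences is zero.
diffs = [post - pre for pre, post in zip(pretest, posttest)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))

# The two-tailed critical value for df = 7 at alpha = 0.05 is about 2.365,
# so a t statistic above that indicates a significant pre/post difference.
print(f"t = {t:.2f} with df = {n - 1}")
```

With real data one would normally use a statistics package (e.g. a paired t-test routine) rather than hand-computing the statistic, but the arithmetic is the same.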

Regime Change: Sampling Rate vs. Bit-Depth in Compressive Sensing

January 2012
The compressive sensing (CS) framework aims to ease the burden on analog-to-digital converters (ADCs) by exploiting inherent structure in natural and man-made signals. It has been demonstrated that structured signals can be acquired with just a small number of linear measurements, on the order of the signal complexity. In practice, this enables lower sampling rates that can be more easily achieved by current hardware designs. The primary bottleneck that limits ADC sampling rates is quantization, i.e., higher bit-depths impose lower sampling rates. Thus, the decreased sampling rates of CS ADCs accommodate the otherwise limiting quantizer of conventional ADCs. In this thesis, we consider a different approach to CS ADC by shifting towards lower quantizer bit-depths rather than lower sampling rates. We explore the extreme case where each measurement is quantized to just one bit, representing its sign. We develop a new theoretical framework to analyze this extreme case and develop new algorithms for signal reconstruction from such coarsely quantized measurements. The 1-bit CS framework leads us to scenarios where it may be more appropriate to reduce bit-depth instead of sampling rate. We find that there exist two distinct regimes of operation that correspond to high/low signal-to-noise ratio (SNR). In the measurement compression (MC) regime, a high SNR favors acquiring fewer measurements with more bits per measurement (as in conventional CS); in the quantization compression (QC) regime, a low SNR favors acquiring more measurements with fewer bits per measurement (as in this thesis). A surprise from our analysis and experiments is that in many practical applications it is better to operate in the QC regime, even acquiring as few as 1 bit per measurement. The above philosophy extends further to practical CS ADC system designs. 
We propose two new CS architectures, one of which takes advantage of the fact that the sampling and quantization operations are performed by two different hardware components: the former can be performed at high rates with minimal cost, while the latter cannot. Thus, we develop a system that discretizes in time, performs CS preconditioning techniques, and then quantizes at a low rate.
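The 1-bit acquisition model above can be sketched in a few lines: measurements keep only the sign of random projections, so only the direction of the signal is recoverable. The crude one-step estimator below (back-project the signs, keep the largest entries, renormalize) is a stand-in for the iterative reconstruction algorithms developed in the thesis, not a reproduction of them; all sizes and the Gaussian sensing matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 512, 5          # signal length, measurements, sparsity

# k-sparse unit-norm signal (1-bit measurements discard amplitude,
# so only the direction of x can be recovered).
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x /= np.linalg.norm(x)

Phi = rng.standard_normal((m, n))   # random Gaussian sensing matrix
y = np.sign(Phi @ x)                # 1-bit measurements: signs only

# Crude one-step estimate: back-project the signs, keep the k largest
# entries, renormalize.
proxy = Phi.T @ y
xhat = np.zeros(n)
support = np.argsort(np.abs(proxy))[-k:]
xhat[support] = proxy[support]
xhat /= np.linalg.norm(xhat)

print("correlation with true signal direction:", float(xhat @ x))
```

Even this one-step estimate typically aligns well with the true direction when the number of sign measurements is large relative to the sparsity, which is the regime the thesis analyzes.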

Chess Performance under Time Pressure: Evidence for the Slow Processes in Speed Chess

Chang, Yu-Hsuan, 16 September 2013
An influential theory of chess skill holds that expertise in chess is due not to greater depth of search by experts but, rather, to the ability to recognize familiar patterns of pieces. Although there is evidence that experts search deeper than non-experts, the data are not consistent. In this thesis, I propose a "key-position theory," which states that deep search is necessary only in a small number of key positions, and that it is in these positions that experts search more deeply than non-experts. Study 1 found, consistent with key-position theory, that the distribution of move times is extremely skewed, with some moves taking much longer than others; this pattern was more pronounced for the stronger players. Study 2 found that the errors made by weaker players involved less search than the errors made by stronger players. These findings suggest that search is an important component of chess expertise.

Bildbaserad rendering: Implementation och jämförelse av två algoritmer / Image-based rendering: Implementation and comparison of two algorithms

Härdling, Peter, January 2010
This thesis compares two algorithms for image-based rendering. Both algorithms use two images recorded in the MultiView plus depth format to render new intermediate views of a three-dimensional scene. Because the two-dimensional images are supplemented with a depth value for each pixel, rendering can be performed as perspective-correct warping in which every pixel is projected individually to its new position. During rendering, issues such as measurement errors in the original images and occluded regions have to be handled. Algorithm I does this partly by smoothing the seams between the two images' contributions to the novel view. Algorithm II divides the images into layers, in which layers considered safe take priority over layers judged more risky. The algorithms have been implemented in Matlab, and algorithm II has been modified by extending its layer-priority rules to handle more complex scenes. Algorithm II has proven better at preserving details in the rendered views and maintains a more even rendering speed. It also yields higher and more consistent PSNR values, although its MSSIM values are slightly lower. The additional steps have increased rendering times by up to 40% compared to algorithm I. The author suggests areas for further development of algorithm II; for example, the algorithm should be tested further to determine whether the thresholds used are general or must be adapted to different scenes.
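The core warping step that both algorithms build on can be sketched as follows. This toy version assumes a rectified camera pair, so each pixel simply shifts horizontally by a disparity proportional to focal length times baseline over depth, with a z-buffer for occlusions; the intrinsics and the nearest-wins occlusion rule are illustrative assumptions, not the specific procedures of algorithm I or II.

```python
import numpy as np

# Toy MultiView-plus-depth warp for a rectified camera pair.
def warp(image, depth, focal=100.0, baseline=0.1):
    h, w = image.shape
    out = np.zeros_like(image)
    zbuf = np.full((h, w), np.inf)   # nearest-surface-wins occlusion handling
    for y in range(h):
        for x in range(w):
            d = depth[y, x]
            x_new = int(round(x + focal * baseline / d))
            if 0 <= x_new < w and d < zbuf[y, x_new]:
                zbuf[y, x_new] = d      # a closer pixel occludes farther ones
                out[y, x_new] = image[y, x]
    return out

img = np.arange(16.0).reshape(4, 4)
dep = np.full((4, 4), 10.0)          # constant depth -> uniform 1-pixel shift
warped = warp(img, dep)
print(warped)
```

Holes (pixels no source pixel maps to) remain zero here; filling them and blending contributions from the second input view is exactly where the two compared algorithms differ.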

Focus controlled image coding based on angular and depth perception / Fokusstyrd bildkodning baserad på vinkel och djup perception

Grangert, Oskar, January 2003
In normal image coding, the image quality is the same in all parts of the image. When it is known where in the image a single viewer is focusing, it is possible to lower the image quality in other parts of the image without lowering the perceived image quality. This master's thesis introduces a coding scheme based on depth perception, in which the quality of the parts of the image that correspond to out-of-focus scene objects is lowered to obtain data reduction. To obtain further data reduction, the method is combined with angular perception coding, in which the quality is lowered in parts of the image corresponding to the peripheral visual field. It is concluded that depth perception coding can be done without lowering the perceived image quality and that the coding gain increases when the two methods are combined.
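The two cues above can be combined into a per-pixel quality map that a coder could use to allocate bits. The sketch below uses simple linear falloffs for angular eccentricity from the gaze point and for depth difference from the focused object; both falloff constants are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Toy quality map: 1.0 = full quality, 0.0 = lowest quality.
def quality_map(depth, gaze_xy, focus_depth,
                angular_falloff=0.02, depth_falloff=0.5):
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    ecc = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])          # pixels from gaze
    blur_angle = ecc * angular_falloff                         # peripheral penalty
    blur_depth = np.abs(depth - focus_depth) * depth_falloff   # out-of-focus penalty
    return np.clip(1.0 - blur_angle - blur_depth, 0.0, 1.0)

depth = np.full((8, 8), 2.0)
depth[:, 4:] = 5.0                    # right half lies behind the focus plane
q = quality_map(depth, gaze_xy=(2, 2), focus_depth=2.0)
print(q.round(2))
```

A coder would then, for example, scale quantization step sizes inversely with this map so that regions far from the gaze point or far from the focal plane receive fewer bits.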

Predicting temperature profiles during simulated forest fires

Enninful, Ebenezer Korsah, 19 September 2006
Below-ground effects during forest fires are among the important issues forest managers consider when conducting prescribed fire programs. Heat transfer models in soil are needed to predict soil temperatures during forest fires. Many soil heat transfer models that include the effects of moisture are complex and in most cases do not have very good predictive abilities; researchers believe that simple models that neglect the effects of moisture could predict well.

This study presents a one-dimensional numerical model of heat transfer in dry homogeneous sand. Both constant and temperature-dependent thermal properties of the sand were used in order to determine which had better predictive abilities, and the constant-properties model was also extended to a two-layer dry soil. A computer code written in Fortran was used to generate results from the model, and a number of experiments were conducted with dry sand to validate it. A comparison of the numerical and experimental results indicated that the temperature-dependent properties model had better predictive abilities than the constant-properties model. The models were found to predict temperature profiles and the depth of lethal heat penetration well at heat fluxes representative of forest fires.

Experiments were also conducted to determine the effect of moisture on temperature profiles and the depth of lethal heat penetration in sand, and the effect of inorganic content on the spread rate of smoldering combustion in peat moss; an experimental correlation for the latter effect was developed. Additionally, laboratory methods of validating soil heat transfer models were developed with the aim of limiting the dependence on full-scale testing. Specifically, the use of the cone calorimeter for validating numerical models of heat transfer in soil was developed, and the responses of forest-floor soil and laboratory-created soil samples to heat input were compared. The results indicated that the laboratory-created soil mimicked the heat response of the forest-floor soil very well, with a maximum difference in lethal heat penetration of 4%.
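A one-dimensional model of the kind described can be sketched as an explicit finite-difference scheme with a constant surface heat flux. The property values, flux, and grid below are illustrative assumptions (typical dry-sand magnitudes), not the thesis's validated inputs, and the thesis's code was in Fortran rather than Python.

```python
import numpy as np

# 1-D explicit finite-difference heat conduction into dry sand under a
# constant surface heat flux (illustrative constant-properties version).
k, rho, cp = 0.30, 1600.0, 800.0      # W/m-K, kg/m^3, J/kg-K (approx. dry sand)
alpha = k / (rho * cp)                 # thermal diffusivity, m^2/s

nx, L = 101, 0.10                      # nodes, domain depth (m)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha               # stability requires dt <= 0.5 dx^2 / alpha
q = 5000.0                             # surface heat flux, W/m^2

T = np.full(nx, 20.0)                  # initial soil temperature, deg C
for _ in range(2000):
    Tn = T.copy()
    # interior nodes: dT/dt = alpha * d2T/dx2
    T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
    T[0] = T[1] + q * dx / k           # imposed-flux boundary at the surface
    T[-1] = 20.0                       # deep soil held at ambient

print(f"surface {T[0]:.1f} C, at 1 cm depth {T[10]:.1f} C after {2000 * dt:.0f} s")
```

The depth of lethal heat penetration would then be read off as the deepest node exceeding a lethal-temperature threshold (often taken around 60 C for plant tissue).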

Habitatpreferenser hos tjockskalig målarmussla (Unio crassus) med avseende på vattendjup och beskuggning. / Habitat preferences of the thick-shelled river mussel (Unio crassus) regarding water depth and shading.

Lundberg, Malin, January 2012
The thick-shelled river mussel (Unio crassus) is a red-listed species classified as Endangered (EN) and is also covered by the EU Habitats Directive. Its distribution in Sweden is fragmented, with most occurrences in the south-eastern parts of the country. We investigated the presence of Unio crassus in a section of the stream Storån, Östergötland County, from Falerum to the inflow into Lake Åkervristen. The environmental parameters investigated were water depth, bottom substrate, shading, water velocity, and the slope of the water surface. In this thesis I have focused mainly on water depth and shading, comparing sites with and without mussels; in addition, I used a multivariate PCA to evaluate all parameters together. Water depth was significantly greater in habitats with mussels than in those without. Shading varied from 5 to 80%, but there was no significant difference between habitats with and without mussels, and neither water depth nor shading was correlated with mussel density. The PCA showed that habitats with and without mussels differed along the PC1 axis, which included water depth, bottom substrate, and water velocity. Water depth alone is not enough to predict the presence of Unio crassus in the stream, and it is likely that more parameters need to be considered. Previous work indicates that the more parameters and habitats that are investigated, the more confidently it can be stated which habitats Unio crassus prefers.
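The multivariate step above can be sketched as a PCA computed via the SVD of a standardized site-by-variable matrix. The data matrix below is fabricated for illustration; only the variable names echo the study.

```python
import numpy as np

# Minimal PCA via SVD on habitat variables. Rows = sites; columns =
# water depth (m), substrate grain size (mm), water velocity (m/s).
# All values are invented, not the study's measurements.
rng = np.random.default_rng(1)
X = rng.normal(loc=[0.5, 2.0, 0.3], scale=[0.2, 0.8, 0.1], size=(30, 3))

Xc = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)               # variance share of each PC
scores = Xc @ Vt.T                            # site coordinates on the PCs

print("explained variance ratios:", explained.round(2))
```

Plotting the first two columns of `scores`, colored by mussel presence, is the usual way to see whether the two groups of sites separate along PC1 as reported.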

Lorentz Lattice Gases on Graphs

Kreslavskiy, Dmitry Michael, 26 November 2003
The present work consists of three parts. In the first part (chapters III and IV), the dynamics of Lorentz lattice gases (LLG) on graphs is analyzed. We study the fixed-scatterer model on finite graphs; a tight bound is established on the size of the orbit for arbitrary graphs, and the model is shown to perform a depth-first search on trees. Rigidity models on trees are also considered, and the size of the resulting orbit is established. In the second part (chapter V), we give a complete description of the dynamics of LLG on the one-dimensional integer lattice, with particular interest in showing that these models are not capable of universal computation; some statistical properties of these models are also analyzed. In the third part (chapter VI), we attempt to partition a pool of workers into teams that will function as independent TSS lines. Such a partitioning may aim to ensure that all groups work at approximately the same rate; alternatively, we may seek to maximize the rate of convergence of the corresponding dynamical systems to their fixed points, with optimal production at the fastest rate. The first problem is shown to be NP-hard. For the second problem, a solution for splitting into pairs is given, and it is shown that this solution is not valid for partitioning into teams of more than two workers.
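Since the fixed-scatterer orbit on a tree behaves like a depth-first search, its length is governed by the standard DFS bound: each of the n-1 edges of an n-node tree is traversed exactly twice (once down, once back up). The sketch below illustrates that count with a plain DFS; it is not the LLG dynamics themselves, and the example tree is arbitrary.

```python
# Count edge traversals made by a depth-first search of a tree.
def dfs_edge_steps(tree, root):
    steps = 0
    def visit(v, parent):
        nonlocal steps
        for u in tree[v]:
            if u != parent:
                steps += 1          # walk down the edge to the child
                visit(u, v)
                steps += 1          # walk back up the same edge
    visit(root, None)
    return steps

# 5 nodes, 4 edges: 0 - 1, 0 - 2, 1 - 3, 1 - 4
tree = {0: [1, 2], 1: [0, 3, 4], 2: [0], 3: [1], 4: [1]}
print(dfs_edge_steps(tree, 0))  # → 8, i.e. 2 * (5 - 1)
```

So on trees the orbit of the fixed-scatterer model visits every vertex in time linear in the number of nodes.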

Development of Techniques to Quantify Chemical and Mechanical Modifications of Polymer Surfaces: Application to Chemical Mechanical Polishing

Diao, Jie, 01 December 2004
This thesis is devoted to the development of techniques to quantify chemical and mechanical influences during chemical mechanical polishing (CMP) near the surface of a polymer film, poly(biphenyl dianhydride-p-phenylenediamine) (BPDA-PDA). To quantify chemical modifications during CMP, an iterative algorithm is proposed to extract depth profiles, based on Fick's second law of diffusion in a multi-element system, from data supplied by angle-resolved X-ray photoelectron spectroscopy. It is demonstrated that the technique can quantify the depth of chemical modification of BPDA-PDA surfaces treated with alkaline solutions. Polymer chains near the surface realign during CMP, and polarized infrared spectroscopy is chosen in this thesis to quantify the chain orientations induced by CMP and thereby evaluate the mechanical influence. A theoretical framework based on a 4×4 matrix method for spectral simulation, together with an oscillator model for BPDA-PDA, is used to obtain quantitative chain-orientation information on a post-CMP BPDA-PDA sample by fitting simulated polarized infrared spectra to experimentally generated spectra. The oscillator model was verified using the complex refractive indices of BPDA-PDA films, which were determined with a new method (the R/T ratio method) developed in this thesis to extract the complex refractive indices of films with biaxial symmetry from polarized transmission and reflection spectra.
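For the chemical side, the classical closed-form solution of Fick's second law with a constant surface concentration gives the shape of depth profile the iterative algorithm targets: C(x, t) = Cs · erfc(x / (2√(Dt))). The sketch below evaluates that profile; the diffusivity and time are illustrative assumptions, not fitted values from the thesis.

```python
import math

# Depth profile from Fick's second law with constant surface concentration:
# C(x, t) = Cs * erfc(x / (2 * sqrt(D * t))). D and t here are made up for
# illustration; the thesis extracts such profiles from ARXPS data instead.
def depth_profile(x_nm, t_s, D_nm2_per_s=1.0, Cs=1.0):
    return Cs * math.erfc(x_nm / (2.0 * math.sqrt(D_nm2_per_s * t_s)))

for x in (0, 5, 10, 20):               # depth below the polymer surface, nm
    print(f"x = {x:2d} nm  C/Cs = {depth_profile(x, t_s=100.0):.3f}")
```

The concentration equals the surface value at x = 0 and decays monotonically with depth, which is why the modified-layer depth can be characterized by where the profile falls below a chosen fraction of Cs.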

Wetland Public Trust and Management Model in Taiwan: A Case Study of the Aogu Wetland, Chiayi, Taiwan

Shang, Shu-Ting, 06 July 2010
Wetlands cover a broad range of areas, including aquatic and terrestrial ecological systems. Many people and agencies are attracted by the value of their natural resources and have begun taking action to protect them. In 2007, the Construction and Planning Agency of the Ministry of the Interior (CPAMI) in Taiwan designated 75 sites as "National Wetlands of Importance," and a draft Wetland Conservation Law was completed in 2010, so the wetland conservation and mitigation mechanisms will become definite in the future. Wetland conservation is not only a government duty but also the responsibility of the private sector and citizens, yet non-governmental organizations and communities often face private land ownership, lack of income, and problems with Taiwan's current laws and regulations. Many wildlife habitats and critical wetlands, such as the Aogu Wetland, are not owned by the government; therefore, many researchers have begun to promote the idea of a "public trust" as one model for sustainable wetland management. This study uses a public trust management model for wetland conservation with the goal of maintaining "no net loss" of wetlands. Common consensus and co-management mechanisms between the public and private sectors thus become crucial issues. The major research method was in-depth interviews with experts from different fields to assess the solution and the feasibility of the proposed framework. A wetland public trust is a tool advocating collaboration and cooperation among the public, non-profit organizations, enterprises, and government to improve environmental conservation outcomes, and it can be overseen by particular authorities. This study proposes a wetland public trust as an appropriate framework and integrates the tax system to improve wetland conservation models in Taiwan.
