81

Speciallärares arbete med kartläggning i relation till SUM : - en studie om kommunikationens betydelse vid kartläggning i matematik. / Specialist teachers' work with mapping in relation to Special Educational needs in Mathematics (SEM) : - the importance of communication in mapping of mathematics.

Lind Behrenfors, Cecilia, Ljungberg, Camilla January 2022 (has links)
As prospective specialist teachers in mathematics, we have a particular interest in exploring specialist teachers' work with the mapping of pupils with special educational needs in mathematics (SEM; Swedish: SUM). The aim of this study is therefore to investigate how specialist teachers work with mapping in relation to pupils with SEM and what room communication is given in the different processes. The study also aims to find out what challenges can be seen in mapping work in everyday practice in relation to SEM. A number of specialist teachers from different F-6 schools were therefore selected for interviews to contribute empirical material to the study. Semi-structured interviews were chosen as the method so that the interviews would provide deeper insight into the specialist teachers' accounts. The approach is also inductive, with a qualitative orientation. Through the specialist teachers' experiences, methods and strategies, we want to make visible the specialist teacher's important professional competence in the mapping of pupils with SEM. The results show that mapping at the schools in question is organised around some form of annual cycle or mathematics development plan. The specialist teachers carry out mappings in relation to SEM and adapt the mapping material to suit the individual pupil. Oral communication, supported by physical tools in the form of concrete materials, mapping materials and various worksheets, is important in the mapping work. The specialist teacher has a mediating role in several different cultures, involving pupils, teachers, other specialist teachers and other professions. A clear conclusion of the study is that mapping is the specialist teacher's most important tool in working with pupils with SEM. Oral communication is a significant and invaluable tool for the specialist teacher in different contexts. It emerges that there are challenges in the mapping work that need to be considered both from the pupil's perspective and from the specialist teacher's perspective. These challenges mean that priorities have to be made, which makes the specialist teacher's professional role complex and surrounded by various dilemmas. At the same time, through the close work with pupils, the role provides a unique inside perspective and an overview of the school's needs, routines and resources.
82

Intensivundervisning i matematik ur ett elevperspektiv : En interventionsstudie med fokus på elevers upplevelser av intensivundervisning i matematik / Intensive teaching in mathematics from a student perspective : An intervention study focusing on students' experiences of intensive teaching in mathematics

Eriksson, Catharina, Förster, Maria January 2022 (has links)
This study provides insight into some pupils' experiences of mathematics in school. The aim was to investigate how pupils talk about their attitude towards mathematics and mathematics teaching, and about their experience of intensive teaching. The focus of the study is a pupil perspective, which was examined through qualitative interviews conducted before and after a four-week intervention in the form of intensive teaching. The interviews made visible how the pupils talked about their attitude towards mathematics and about their experience of the intensive teaching. The intensive teaching took the form of pre-lessons in which upcoming mathematical content was addressed, but it also included elements aimed at developing the pupils' number sense. Through the pupils' talk about their attitude towards mathematics and their experiences of the intensive teaching, an increased sense of coherence could be discerned. The pupils spoke somewhat more positively about mathematics after the intensive teaching and expressed that they had gained new knowledge during the intervention. This could mean that the pupils experienced increased inclusion in the mathematics classroom after the intensive teaching.
83

Episode 5.01 – The Sum-of-Products Expression

Tarnoff, David 01 January 2020 (has links)
Who knew how easy it would be to derive a Boolean expression from a truth table? By following a few simple steps, sum-of-products expressions are quickly converted to and from truth tables. In addition, the SOP expression is a heck of a performer.
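The conversion the episode walks through can be sketched in a few lines of code. The following is a minimal, illustrative Python sketch (the variable names and the example truth table are assumptions, not taken from the episode): it ORs together one product (AND) term for every truth-table row whose output is 1.

```python
# Minimal sketch: derive a sum-of-products (SOP) expression from a truth table.
# The variable names and the example truth table are illustrative assumptions,
# not taken from the episode itself.
from itertools import product

def sop_from_truth_table(variables, outputs):
    """Build an SOP expression string from a truth table.

    variables: list of variable names, e.g. ["A", "B", "C"]
    outputs:   list of 0/1 outputs, one per row, rows ordered as the
               binary count 000, 001, ..., 111.
    """
    terms = []
    for row, out in zip(product([0, 1], repeat=len(variables)), outputs):
        if out == 1:
            # One product (AND) term per row that outputs 1:
            # a variable is complemented where its input bit is 0.
            literals = [v if bit else v + "'" for v, bit in zip(variables, row)]
            terms.append("".join(literals))
    # The SOP expression ORs (sums) all of the product terms together.
    return " + ".join(terms) if terms else "0"

# Example: a 3-input majority function, true for rows 011, 101, 110, 111.
print(sop_from_truth_table(["A", "B", "C"], [0, 0, 0, 1, 0, 1, 1, 1]))
# -> A'BC + AB'C + ABC' + ABC
```

Converting in the other direction is just as mechanical: evaluate the expression for every input combination to rebuild the output column.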
84

A Novel Accurate Approximation Method of Lognormal Sum Random Variables

Li, Xue 15 December 2008 (has links)
No description available.
85

Studying the Impact of Solar Photovoltaic on Transient Stability of Power Systems using Direct Methods

Mishra, Chetan 07 December 2017 (has links)
The increasing penetration of inverter-based renewable generation, in the form of solar photovoltaic (PV) or wind, has introduced numerous operational challenges and uncertainties. Among the major challenges is the impact on the transient stability of the grid. At the same time, direct methods for transient stability assessment of power systems have evolved considerably over the past 30 years. This set of techniques, inspired by Lyapunov's direct method, provides clear insight into how system stability changes with a changing grid. Their most attractive feature is the large reduction in computational burden achieved by cutting down on simulation time. These advances were aimed at analyzing the stability of a nonlinear autonomous dynamical system, a definition the existing power system fits well. Due to changing renewable portfolio standards, however, the power system is undergoing serious structural and performance alterations. The very idea of power system stability is changing, and work on direct methods has largely not kept up with these changes. This dissertation aims to employ existing direct methods, as well as develop new techniques, to visualize and analyze the stability of a power system with the added complexities introduced by PV generation. / Ph. D.
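For context, the classical criterion behind Lyapunov's direct method can be stated compactly; this is the standard textbook form, not the specific energy functions developed in the dissertation:

```latex
% Lyapunov's direct method, textbook statement (not the dissertation's specific
% energy functions): an equilibrium x^* of \dot{x} = f(x) is stable if there is a V with
\[
  V(x^{*}) = 0, \qquad V(x) > 0 \ \text{for } x \neq x^{*}, \qquad
  \dot{V}(x) = \nabla V(x)^{\top} f(x) \le 0 \ \text{along trajectories},
\]
% and asymptotically stable if the last inequality is strict for x \neq x^{*}.
```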
86

Non-Wiener Characteristics of LMS Adaptive Equalizers: A Bit Error Rate Perspective

Roy, Tamoghna 12 February 2018 (has links)
Adaptive Least Mean Square (LMS) equalizers are widely used in digital communication systems primarily for their ease of implementation and lack of dependence on a priori knowledge of input signal statistics. LMS equalizers exhibit non-Wiener characteristics in the presence of a strong narrowband interference and can outperform the optimal Wiener equalizer in terms of both mean square error (MSE) and bit error rate (BER). There has been significant work in the past related to the analysis of the non-Wiener characteristics of the LMS equalizer, which includes the discovery of the shift in the mean of the LMS weights from the corresponding Wiener weights and the modeling of steady state MSE performance. BER is ultimately a more practically relevant metric than MSE for characterizing system performance. The present work focuses on modeling the steady state BER performance of the normalized LMS (NLMS) equalizer operating in the presence of a strong narrowband interference. Initial observations showed that a 2 dB improvement in MSE may result in two orders of magnitude improvement in BER. However, some differences in the MSE and BER behavior of the NLMS equalizer were also seen, most notably the significant dependence (one order of magnitude variation) of the BER behavior on the interference frequency, a dependence not seen in MSE. Thus, MSE cannot be used as a predictor for the BER performance; the latter further motivates the pursuit of a separate BER model. The primary contribution of this work is the derivation of the probability density of the output of the NLMS equalizer conditioned on a particular symbol having been transmitted, which can then be leveraged to predict its BER performance. The analysis of the NLMS equalizer, operating in a strong narrowband interference environment, resulted in a conditional probability density function in the form of a Gaussian Sum Mixture (GSM). Simulation results verify the efficacy of the GSM expression for a wide range of system parameters, such as signal-to-noise ratio (SNR), interference-to-signal ratio (ISR), interference frequency, and step-sizes over the range of mean-square stable operation of NLMS. Additionally, a low complexity approximate version of the GSM model is also derived and can be used to give a conservative lower bound on BER performance. A thorough analysis of the MSE and BER behavior of the Bi-scale NLMS equalizer (BNLMS), a variant of the NLMS equalizer, constitutes another important contribution of this work. Prior results indicated a 2 dB MSE improvement of BNLMS over NLMS in the presence of a strong narrowband interference. A closed-form MSE model is derived for the BNLMS algorithm. Additionally, BNLMS BER behavior was studied and showed the potential of two orders of magnitude improvement over NLMS. Analysis led to a BER model in the form of a GSM similar to the NLMS case but with different parameters. Simulation results verified that both models for MSE and BER provided accurate prediction of system performance for different combinations of SNR, ISR, interference frequency, and step-size. An enhanced GSM (EGSM) model to predict the BER performance for the NLMS equalizer is also introduced, specifically to address certain cases (low ISR cases) where the original GSM expression (derived for high ISR) was less accurate. Simulation results show that the EGSM model is more accurate in the low ISR region than the GSM expression. 
For the situations where the derived GSM expression was accurate, the BER estimates provided by the heuristic EGSM model coincided with those computed from the GSM expression. Finally, the two-interferer problem is introduced, where NLMS equalizer performance is studied in the presence of two narrowband interferers. Initial results show the presence of non-Wiener characteristics for the two-interferer case. Additionally, experimental results indicate that the BER performance of the NLMS equalizer operating in the presence of a single narrowband interferer may be improved by purposeful injection of a second narrowband interferer. / PHD
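For readers unfamiliar with the algorithm at the core of these equalizers, the sketch below shows a generic normalized LMS (NLMS) adaptive filter update. It is a minimal illustration under assumed parameters; the filter length, step size, and toy identification setup are not the dissertation's simulation configuration.

```python
# Minimal sketch of a normalized LMS (NLMS) adaptive filter update.
# The step size, filter length, and toy signals are illustrative assumptions,
# not the simulation parameters used in the dissertation.
import numpy as np

def nlms(x, d, num_taps=8, mu=0.5, eps=1e-8):
    """Run an NLMS adaptive filter.

    x: input (reference) signal
    d: desired signal
    Returns the filter output y and the error e = d - y.
    """
    w = np.zeros(num_taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]   # current and past samples, newest first
        y[n] = w @ u
        e[n] = d[n] - y[n]
        # NLMS: the step size is normalized by the input energy in the tap window.
        w += (mu / (eps + u @ u)) * e[n] * u
    return y, e

# Toy usage: identify a short FIR channel from noisy observations.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.7, -0.3, 0.2])
d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
_, e = nlms(x, d)
print("steady-state MSE ~", np.mean(e[-1000:] ** 2))
```

The normalization by the windowed input energy is what distinguishes NLMS from plain LMS and keeps the effective step size bounded as the input power varies.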
87

Summation formulae and zeta functions

Andersson, Johan January 2006 (has links)
This thesis in analytic number theory consists of 3 parts and 13 individual papers.

In the first part we prove some results in Turán power sum theory. We solve a problem of Paul Erdös and disprove conjectures of Paul Turán and K. Ramachandra that would have implied important results on the Riemann zeta function.

In the second part we prove some new results on moments of the Hurwitz and Lerch zeta functions (generalized versions of the Riemann zeta function) on the critical line.

In the third and final part we consider the following question: What is the natural generalization of the classical Poisson summation formula from the Fourier analysis of the real line to the matrix group SL(2,R)? There are candidates in the literature such as the pre-trace formula and the Selberg trace formula.

We develop a new summation formula for sums over the matrix group SL(2,Z) which we propose as a candidate for the title "The Poisson summation formula for SL(2,Z)". The summation formula allows us to express a sum over SL(2,Z) of smooth functions f on SL(2,R) with compact support, in terms of spectral theory coming from the full modular group, such as Maass wave forms, holomorphic cusp forms and the Eisenstein series. In contrast, the pre-trace formula allows us to get such a result only if we assume that f is also SO(2) bi-invariant.

We indicate the summation formula's relationship with additive divisor problems and the fourth power moment of the Riemann zeta function as given by Motohashi. We prove some identities on Kloosterman sums, and generalize our main summation formula to a summation formula over integer matrices of fixed determinant D. We then deduce some consequences, such as the Kuznetsov summation formula, the Eichler-Selberg trace formula and the classical Selberg trace formula.
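For reference, the classical Poisson summation formula on the real line, whose generalization to SL(2,Z) the thesis proposes, reads as follows (standard form, for sufficiently well-behaved f):

```latex
% Classical Poisson summation formula on the real line (standard form):
\[
  \sum_{n \in \mathbb{Z}} f(n) \;=\; \sum_{k \in \mathbb{Z}} \hat{f}(k),
  \qquad
  \hat{f}(k) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i k x}\, \mathrm{d}x .
\]
```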
88

New fictitious play procedure for solving Blotto games

Lee, Moon Gul 12 1900 (has links)
Approved for public release; distribution is unlimited. / In this thesis, a new fictitious play (FP) procedure is presented to solve two-person zero-sum (TPZS) Blotto games. The FP solution procedure solves TPZS games by assuming that the two players take turns selecting optimal responses to the opponent's strategy observed so far. It is known that FP converges to an optimal solution, and it may be the only realistic approach to solving large games. The algorithm uses dynamic programming (DP) to solve FP subproblems. Efficiency is obtained by limiting the growth of the DP state space. Blotto games are frequently used to solve simple missile defense problems. While it may be unlikely that the models presented in this thesis can be used directly to solve realistic offense and defense problems, it is hoped that they will provide insight into the basic structure of optimal and near-optimal solutions to these important, large games, and provide a foundation for the solution of more realistic, and more complex, problems. / Captain, Republic of Korea Air Force
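As a rough illustration of the fictitious play idea, not of the thesis's DP-based Blotto solver, the Python sketch below alternates best responses to the opponent's empirical mixed strategy in a small two-person zero-sum matrix game (the payoff matrix and iteration count are arbitrary assumptions):

```python
# Minimal sketch of fictitious play for a two-person zero-sum matrix game.
# This illustrates the alternating best-response idea only; it is not the
# DP-based Blotto solver developed in the thesis, and the payoff matrix is a toy example.
import numpy as np

def fictitious_play(A, iterations=10000):
    """Approximate optimal mixed strategies and the value of the zero-sum game
    with payoff matrix A (row player maximizes, column player minimizes)."""
    m, n = A.shape
    row_counts = np.zeros(m)   # how often each row action has been played
    col_counts = np.zeros(n)   # how often each column action has been played
    row_counts[0] += 1
    col_counts[0] += 1
    for _ in range(iterations):
        # Each player best-responds to the opponent's empirical strategy so far.
        row_br = np.argmax(A @ (col_counts / col_counts.sum()))
        col_br = np.argmin((row_counts / row_counts.sum()) @ A)
        row_counts[row_br] += 1
        col_counts[col_br] += 1
    x = row_counts / row_counts.sum()
    y = col_counts / col_counts.sum()
    return x, y, x @ A @ y

# Toy example: matching pennies has value 0 with uniform optimal strategies.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
x, y, v = fictitious_play(A)
print(np.round(x, 2), np.round(y, 2), round(v, 3))
```

For zero-sum games the empirical strategies are known to converge (if slowly) to an optimal mixed-strategy pair, which is the property the thesis exploits at a much larger scale.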
89

Code analysis : Uncovering hidden problems in a codebase by commits to a version control system / Kod analys : Avslöja gömda problem i en kodbas med commits till ett versionhanteringssystem

Reijo, Ken, Kåhre, Martin January 2019 (has links)
Purpose – The starting point of this study was to locate efficient and inefficient code in a codebase in order to improve upon it, at the request of a company. To support the request, a book by Adam Tornhill was used; he describes several methods (also called analysis methods) for this purpose, each using different variables to locate problems in the code. A web application was developed to make use of these variables. It was speculated that relationships between the variables may be missing that, if discovered, could improve the analysis methods and thereby uncover efficient and inefficient code more effectively. The main scientific purpose and contribution was therefore to discover associations between the specific variables presented by Adam Tornhill. One question was derived from this purpose: ● Are there correlations between variables in the existing analysis methods and, if so, what are they? Method – To answer this question, a closer look at the variables was needed to see which ones had potential connections. Quantitative data on the chosen variables was then gathered from 7 open source projects, covering the two preceding years. This was done by fetching commits from git (the commit history) and presenting the data as graphs. The graphs were reviewed to detect possible patterns, and statistical formulas (Pearson's correlation coefficient) were then used to calculate the exact correlation between the variables. A significance level was set at 0.001 and a p-value was calculated. The median and mean of the correlation coefficients were also calculated for the projects whose p-value was below the significance level. Findings – Two variables were ultimately inspected, the number of authors and the number of logical couplings for the same file, and were combined into a new analysis method with a graph. The graph analysis shows a clear pattern: as more people work on a module, the number of logical couplings for that module increases. For 6 of the 7 projects analyzed, the p-value was below the significance level, meaning 6 coefficients were highly statistically significant. A positive coefficient was calculated for 5 of these 6 projects. For the 6 projects with p-value below the significance level, the mean correlation coefficient was 0.41 and the median 0.63, both indicating a positive correlation between the number of authors and the number of logical couplings. Implications – Projects in which several people work on a module should watch out for logical couplings to that module. Preventative measures could perhaps be established to ensure that logical couplings are kept in check, or at least not needlessly accumulated, as more people start working on a module. Limitations – Only two analysis methods and their variables were examined for correlation, and further correlations may exist. Furthermore, the 7 projects used as data were all open source, so the results may not hold for closed source projects.
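A minimal sketch of the kind of correlation check described above is given below; the per-file counts are hypothetical placeholders, not data mined from the seven analyzed projects.

```python
# Minimal sketch of the correlation check described above: Pearson's r between
# the number of authors of a file and its number of logical couplings.
# The per-file counts below are hypothetical placeholders, not data mined from
# the seven open source projects analyzed in the thesis.
from scipy.stats import pearsonr

authors_per_file   = [1, 2, 2, 3, 4, 5, 6, 8]   # distinct commit authors per file
couplings_per_file = [0, 1, 3, 2, 5, 6, 9, 12]  # files that tend to change together with it

r, p_value = pearsonr(authors_per_file, couplings_per_file)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# A p-value below the chosen significance level (0.001 in the study) would mark
# the coefficient as highly statistically significant.
```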
90

Broadband vibrational sum frequency spectroscopy (VSFS) of modified graphene and polymeric thin films

Holroyd, Chloe January 2017 (has links)
The surface-specific technique of vibrational sum frequency spectroscopy (VSFS) can provide vibrational information about chemical bonds at surfaces and interfaces. Two photons, of visible and infrared frequency, are spatially and temporally overlapped at a surface/interface to produce a photon at the sum frequency (SF) of the two input photons. In addition to this process being allowed only in non-centrosymmetric media (i.e. VSFS is surface/interface specific), the SF process is enhanced when the IR beam is resonant with a vibrational transition. Broadband VSFS has been used in this project to study surfaces of two distinct classes of materials, namely graphene and polymers. Firstly, broadband VSFS was used to investigate the heating of polymeric thin films using a home-built heated sample cell. The cell was tested using self-assembled monolayers (SAMs) of 1-octadecanethiol (ODT) grown on gold substrates. It was subsequently used to investigate thin films of poly(methyl methacrylate) (PMMA) of four different thicknesses and two different molecular weights that were spin-coated onto gold substrates. It was shown that the monolayers of ODT become disordered upon heating and solidified to incorporate the disorder introduced by the heating process. The PMMA films were also shown to become more disordered as a function of temperature. Secondly, broadband VSFS was used to investigate modified graphene, motivated by the fact that modifications to pristine graphene, be they intentional (i.e. functionalisation) or unintentional (i.e. contamination), cause the properties of graphene to change. This project focused on studying hydrogenated graphene, N-methylbenzamide functionalised graphene and contamination on commercial graphene. A method for calculating the number of hydrogen atoms in a hydrogen island was developed. VSF spectra of CH stretches in N-methylbenzamide functionalised graphene were obtained. Residues on commercially bought graphene were detected using VSFS and RAIRS. These residues were assigned to PMMA remaining from the process of transferring the CVD graphene from the copper foil on which it was grown onto the gold substrates.
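The sum-frequency condition described above is simply the standard relation between the two input photon frequencies and the emitted photon:

```latex
% Standard sum-frequency relation between the two input photons and the emitted photon:
\[
  \omega_{\mathrm{SF}} \;=\; \omega_{\mathrm{vis}} + \omega_{\mathrm{IR}}
\]
```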
