111

Funding Matters : Archaeology and the Political Economy of the Past in the EU

Niklasson, Elisabeth January 2016
The aim of this thesis is to show how Europe is constructed at the intersection between archaeology, money and politics within EU cultural actions. Ever since the 1970s, the European Community has invested money and prestige in the idea of a common cultural heritage for Europe. Alongside symbolic attributes such as the flag and anthem, archaeological sites have been used as rhetorical fuel to create a sense of European belonging, much as in national identity building. As a result, archaeologists and heritage professionals have benefitted from EU funding for restoration of sites, training schools and cooperation projects since 1976. To address this mutual engagement, the research in this thesis explores the ways that EU grant systems in culture have fostered specific approaches to Europeanness, and how supported projects have responded to notions about a common heritage. By considering EU officials, expert reviewers, consultants and archaeologists as co-creators of the frameworks they participate in, this study raises the idea of financial ties as a place of interaction. The study takes an ethnographic approach and uses discourse analysis and tools from Actor-Network Theory. The material consists of observations made during an internship at the European Commission, 41 interviews with different actors, as well as policy documents, budgets and collected information about 160 supported projects with archaeological themes. This research demonstrates how the expectations linked to archaeology have turned it into both a problem and a promise in the search for a 'usable past' for the EU. On the one hand, archaeology has functioned as an anchor, mooring the notion of a common heritage to something solid. On the other, because of its strong commitment to nationhood, what archaeology claims for its own has often undermined the very idea of a shared European inheritance. Projects benefitting from EU support have taken advantage of the expectations placed upon archaeology to help create a European identity, using buzzwords and 'application poetry' in their proposals. Many projects continued to use EU goals and symbols in their outputs. Sometimes a European past and present were connected by rhetorically tying archaeological periods (such as the Middle Ages and the Roman Era) and phenomena (rock art or landscapes) to the EU political project. This link was more manifest in public settings than in academic ones. Taken together, the considerations brought up in this study show that funding matters. The EU strategy of vagueness, in which instructions and evaluation criteria foremost decide the frames but not the content of the projects, has inspired applicants to 'think Europe without thinking.' Once an application is written and submitted, a chain of translations by different actors works to depoliticise the act of constructing Europe. The EU, like other funding bodies, has become entangled in the political ecology of archaeology, an entanglement which is unavoidable but which needs to be critically addressed. Funding sources matter for the way we understand both the past and the meaning of archaeology in the present. / This thesis examines how Europe is created at the intersection of archaeology, money and politics within the European Union's cultural funding programmes.
Alongside symbolic attributes such as the flag and anthem, representatives of the European Community and the EU have committed themselves to the idea of a common European cultural heritage, on a metaphorical as well as a material level. Political legitimacy has been sought by invoking a sense of belonging stretching back millennia. In connection with this engagement, archaeologists and heritage professionals have, since the 1970s, received financial support for restoration projects at sites of European significance and for transnational cooperation projects that can create European added value. The study examines the ties between the EU and archaeology by highlighting funding as a site of interaction and meaning-making. An ethnographic method has been applied, with empirical material consisting of field observations from an internship at the European Commission, 41 interviews with different actors, as well as policy documents and archaeological texts. A database of 160 archaeological projects has also been created. Discourse analysis and concepts from Actor-Network Theory, such as translation and the black box, have been used to locate and conceptualise observations and meaningful intersections in the material. The study shows how EU officials, expert reviewers, consultants and archaeologists all take part in shaping archaeological research questions and building professional networks. The EU's soft strategies, in which instructions and evaluation criteria primarily set the frames but not the content of the funded projects, have inspired applicants to think Europe without thinking. When an application is written and submitted, a chain of translations begins through which different actors depoliticise the construction of Europe in the present. The results show that archaeological projects, by using the EU's goal formulations in their applications, have often exploited the EU's expectation that archaeology can help create a European identity. In several projects, a European sense of belonging in the past was tied to today's EUrope. In addition, many projects continued to use EU goals and symbols in their outputs. Here, the EU connection was more visible in public settings than in academic ones. Taken together, the study shows that the choice of funding source matters greatly. The EU's funding programmes have become part of the political ecology of archaeology, an entanglement that is unavoidable but important to address critically. These ties affect both our view of the past and society's view of archaeology today.
112

Impacts of Black Box Warning, National Coverage Determination, and Risk Evaluation and Mitigation Strategies on the Inpatient On-Label and Off-Label Use of Erythropoiesis-Stimulating Agents

Seetasith, Arpamas 01 February 2013
Background: The FDA black box warning, Risk Evaluation and Mitigation Strategies (REMS), and the CMS national coverage determination (NCD) aim to reduce inappropriate use of erythropoiesis-stimulating agents (ESAs), which are widely used in anemic patients. Previous studies have not linked specific safety interventions to changes in ESA utilization patterns in inpatient settings, nor have they assessed the effect of such interventions on off-label use of the drugs. An ineffective intervention, or a long lag between an intervention and the observed change in clinical practice, could lead to serious clinical outcomes. In addition, such interventions may unintentionally reduce on-label and some off-label use of ESAs considered "appropriate" in patients who could otherwise benefit. Objectives: The primary aim of the study is to quantify the impacts of (1) the addition of the black box warning, (2) the implementation of the NCD, and (3) the institution of REMS on the on-label and off-label ESA utilization patterns of adult inpatients. Demographic, clinical, physician, and hospital characteristics of ESA users, by use category, are also described in detail. Methods: Electronic health records in the Cerner database from January 1, 2005 to June 30, 2011 were used. The use of the two erythropoietic drugs, epoetin alfa and darbepoetin alfa, was categorized into three groups using ICD-9-CM diagnosis and procedure codes and patients' medication information: (1) on-label use (ONS), for indications approved by the FDA; (2) off-label supported use (OFS), for indications not approved by the FDA but backed by strong clinical evidence; and (3) off-label unsupported use (OFU), for indications not approved by the FDA and lacking clinical evidence. The immediate and trend impacts of the interventions on the proportion of ESAs prescribed for each usage category between 2005 and 2011 were assessed using an interrupted time series technique. The likelihood of receiving ESAs among patients with on-label, off-label supported, and off-label unsupported indications was assessed using a generalized estimating equation (GEE) approach with binary logistic regression, clustering on hospitals and controlling for potential confounders such as patient characteristics, patient clinical conditions, physician specialty, and hospital characteristics. Results: During the study period, there were 111,363 encounters of ESA use. These encounters represented 86,763 patients admitted to the Cerner health system between January 1, 2005 and June 30, 2011. Of these patients, 66,121 (76.2%) were prescribed epoetin alfa only; 20,088 (23.2%) darbepoetin alfa only; and 554 (0.6%) both epoetin alfa and darbepoetin alfa. Forty-nine percent of the patients used ESAs for on-label indications, 8.6% for off-label supported indications, and 42.7% for off-label unsupported indications. The main uses of ESAs in our sample were for CKD (ONS, 41.1%) and chronic anemia (OFU, 31.8%). From 2005 to 2010, the proportion of visits with ONS and OFS use of ESAs decreased by 53.2% and 81.9%, respectively, while OFU use increased by 112.6%. Results from the binary logistic regression using the GEE model showed overall decreasing trends in ESA use for the on-label and off-label supported indications, but not for the off-label unsupported indications. REMS had no impact on the odds of receiving ESAs among patients with on-label or off-label conditions. The black box warning reduced the odds of being prescribed epoetin alfa in patients with off-label unsupported conditions by 40%. It was also associated with 4% and 15% per-month reductions in the odds of using darbepoetin alfa in patients with off-label supported and unsupported conditions, respectively. Lastly, there was a significant decline in all categories of ESA use in the month after the Medicare national coverage determination was implemented. The impact of the NCD ranged from a 20% reduction in the odds of off-label supported use to a 37% reduction in on-label use. Age, gender, race, source of payment, admission type, clinical complexity, discharge disposition, and hospital size were significantly associated with on-label and off-label ESA use. Conclusion: This study was the first to determine the impact of safety interventions on on-label and off-label ESA utilization patterns in inpatient settings using the Cerner database. We demonstrated a lag between the interventions and the observed change in clinical practice, and the relative impacts of three types of safety interventions on on-label and off-label ESA use in hospital settings. An indirect impact of the reimbursement change was the potential unintended consequence of reducing the likelihood of receiving ESAs for patients with indicated conditions who could otherwise have benefited from the drugs.
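The segmented-regression setup behind an interrupted time series analysis of this kind can be sketched in a few lines. The sketch below is illustrative only: the monthly series is synthetic, the variable names are invented, and it models a single intervention rather than the three studied in the thesis.

```python
# Illustrative sketch of an interrupted time series (segmented regression)
# analysis. The synthetic data and all names are assumptions, not the
# thesis's data or exact model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(60)                      # monthly observations
intervention = 30                           # month the policy takes effect
post = (months >= intervention).astype(float)

# Synthetic monthly proportion of encounters with off-label use:
# baseline trend, plus a level drop and a trend change after the intervention.
y = (0.40 + 0.001 * months - 0.05 * post
     - 0.002 * post * (months - intervention)
     + rng.normal(0, 0.01, months.size))

# Segmented regression: intercept, baseline trend, level change, slope change.
X = sm.add_constant(np.column_stack([
    months,                                 # pre-existing trend
    post,                                   # immediate level change
    post * (months - intervention),         # change in trend after intervention
]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [intercept, trend, level shift, trend shift]
```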
113

Automatic parameter tuning in localization algorithms / Automatisk parameterjustering av lokaliseringsalgoritmer

Lundberg, Martin January 2019
Many algorithms today require a number of parameters to be set in order to perform well in a given application. Tuning these parameters manually is often difficult and tedious, especially when the number of parameters is large, and it is unlikely that a human can find the best possible solution for difficult problems. Being able to find good sets of parameters automatically could both provide better results and save a lot of time. In this work, two prominent methods, Bayesian optimization and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), are evaluated for automatic parameter tuning in localization algorithms. Both methods are evaluated using a localization algorithm on different datasets and compared in terms of computational time and the precision and recall of the final solutions. This study shows that it is feasible to automatically tune the parameters of localization algorithms using the evaluated methods. In all experiments performed in this work, Bayesian optimization made the biggest improvements early in the optimization, but CMA-ES always surpassed it and proceeded to reach the best final solutions after some time. This study also shows that automatic parameter tuning is feasible even when using noisy real-world data collected from 3D cameras.
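For readers unfamiliar with the two methods, the sketch below shows the shape of such a comparison on a toy objective standing in for a localization algorithm's score. It assumes the third-party packages scikit-optimize (skopt) and pycma (cma); the objective, bounds, and evaluation budgets are invented, not the thesis's setup.

```python
# Minimal sketch comparing Bayesian optimization and CMA-ES on a toy
# objective. Assumes the third-party packages scikit-optimize and pycma.
import numpy as np
import cma
from skopt import gp_minimize

def objective(params):
    """Stand-in for 'run the localization algorithm, return 1 - F1-score'."""
    x = np.asarray(params)
    return float(np.sum((x - 0.3) ** 2))    # toy objective, minimum at 0.3

bounds = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]

# Bayesian optimization: builds a Gaussian-process surrogate of the
# objective and picks each next parameter set from an acquisition function.
bo_result = gp_minimize(objective, bounds, n_calls=30, random_state=0)
print("BO best:", bo_result.x, bo_result.fun)

# CMA-ES: evolves a multivariate Gaussian search distribution over the
# parameter space, adapting its covariance matrix as it goes.
es_x, es = cma.fmin2(objective, x0=3 * [0.5], sigma0=0.2,
                     options={"bounds": [0.0, 1.0], "maxfevals": 300})
print("CMA-ES best:", es_x, es.result.fbest)
```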
114

Automated Testing of Robotic Systems in Simulated Environments

Andersson, Sebastian, Carlstedt, Gustav January 2019
With the simulation tools available today, simulation can be utilised as a platform for more advanced software testing. By introducing simulations to the software testing of robot controllers, the motion performance testing phase can begin at an earlier stage of development. This would benefit all parties involved with the robot controller: testers at ABB would be able to include more motion performance tests in the regression tests, ABB could save money by adopting simulated robot tests, and customers would be provided with more reliable software updates. In this thesis, a method is developed that utilises simulations to create a test set for detecting motion anomalies in new robot controller versions. It combines auto-generated test cases with a similarity analysis that calculates the Hausdorff distance between the trajectories of a test case executed on different controller versions, one of which contains an induced artificial bug. The resulting test set is able to detect anomalies in a robot controller with a bug.
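The similarity analysis can be illustrated with a minimal sketch: compute the Hausdorff distance between a reference trajectory and the same test case replayed on a modified controller, and flag the case if the distance exceeds a tolerance. The trajectories, injected anomaly, and threshold below are synthetic stand-ins, not ABB data.

```python
# Minimal sketch of trajectory comparison via the Hausdorff distance.
# The trajectories and the tolerance are invented for illustration.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

t = np.linspace(0, 2 * np.pi, 200)
reference = np.column_stack([np.cos(t), np.sin(t)])   # nominal 2-D path
candidate = reference + np.array([0.0, 0.02])         # small systematic offset
candidate[90:110] += 0.3                              # injected motion anomaly

# The symmetric Hausdorff distance is the max of the two directed distances.
d = max(directed_hausdorff(reference, candidate)[0],
        directed_hausdorff(candidate, reference)[0])

THRESHOLD = 0.1   # assumed tolerance; real limits would come from testing
print(f"Hausdorff distance: {d:.3f} -> {'anomaly' if d > THRESHOLD else 'ok'}")
```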
115

Deep Learning Black Box Problem

Hussain, Jabbar January 2019
The application of neural networks in deep learning is growing rapidly due to their ability to outperform other machine learning algorithms on many kinds of problems. One big disadvantage of deep neural networks, however, is that the internal logic by which they reach a desired output is neither understandable nor explainable. This behavior of deep neural networks is known as the "black box" problem. This leads to the first research question: how prevalent is the black box problem in the research literature during a specific period of time? Black box problems are usually addressed by so-called rule extraction, which leads to the second research question: what rule extraction methods have been proposed to solve such problems? To answer the research questions, a systematic literature review was conducted to collect data on the topics of the black box and rule extraction. Printed and online articles published in highly ranked journals and conference proceedings were selected to investigate and answer the research questions; this set of journal and conference articles formed the unit of analysis. The results show a gradually increasing interest in the black box problem over time, mainly because of new technological developments. The thesis also provides an overview of the different methodological approaches used in rule extraction methods.
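As an illustration of what rule extraction can look like in practice, the sketch below shows one common "pedagogical" approach: treating the network purely as a black box and fitting an interpretable surrogate to its input/output behavior. This is a generic technique chosen for illustration, not necessarily one of the methods surveyed in the thesis, and the dataset and model sizes are arbitrary.

```python
# Pedagogical rule extraction: fit a shallow decision tree to a trained
# network's *predictions*; its branches read as approximate if-then rules.
# Dataset and architecture are arbitrary stand-ins.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=2000, n_features=5, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                    random_state=0).fit(X, y)

# Train the surrogate on the network's labels, not the true ones, so the
# tree approximates the black box rather than the underlying data.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, net.predict(X))
print("fidelity to network:", surrogate.score(X, net.predict(X)))
print(export_text(surrogate, feature_names=[f"x{i}" for i in range(5)]))
```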
116

Automatic Random Variate Generation for Simulation Input

Hörmann, Wolfgang, Leydold, Josef January 2000
We develop and evaluate algorithms for generating random variates for simulation input. One group, called automatic or black-box algorithms, can be used to sample from distributions with known density. They are based on the rejection principle. The hat function is generated automatically in a setup step using the idea of transformed density rejection: the density is transformed into a concave function, and the minimum of several tangents is used to construct the hat function. The resulting algorithms are not too complicated and are quite fast. The principle is also applicable to random vectors. A second group of algorithms is presented that generates random variates directly from a given sample by implicitly estimating the unknown distribution. The best of these algorithms are based on the idea of naive resampling plus added noise, and can be interpreted as sampling from a kernel density estimate. This method can also be applied to random vectors, where it can be interpreted as a mixture of naive resampling and sampling from the multi-normal distribution that has the same covariance matrix as the data. The algorithms described in this paper have been implemented in ANSI C in a library called UNURAN, which is available via anonymous ftp. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
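The second group of algorithms can be illustrated with a short sketch of naive resampling plus added noise, which is equivalent to sampling from a Gaussian kernel density estimate of the data. The bandwidth rule used below (Silverman's rule of thumb) is an assumption for illustration; the paper's algorithms make their own choices.

```python
# Sketch of naive resampling plus added noise (smoothed bootstrap), i.e.
# sampling from a Gaussian KDE of the data. Bandwidth rule is an
# illustrative assumption, not the paper's.
import numpy as np

def resample_with_noise(data, n, rng=None):
    """Draw n variates from a Gaussian KDE of `data`."""
    rng = rng or np.random.default_rng()
    data = np.asarray(data, dtype=float)
    # Silverman's rule-of-thumb bandwidth for a 1-D Gaussian kernel.
    h = 1.06 * data.std(ddof=1) * data.size ** (-1 / 5)
    picks = rng.choice(data, size=n, replace=True)   # naive resampling
    return picks + rng.normal(0.0, h, size=n)        # plus added noise

observed = np.random.default_rng(1).lognormal(size=500)  # stand-in sample
variates = resample_with_noise(observed, 10_000)          # simulation input
```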
117

Model-Based Test Case Generation for Real-Time Systems

Hessel, Anders January 2007
Testing is the dominant verification technique used in the software industry today. The use of automatic test case execution is increasing, but the creation of test cases remains manual and is thus error prone and expensive. To automate the generation and selection of test cases, model-based testing techniques have been suggested.

In this thesis two central problems in model-based testing are addressed: the problem of how to formally specify coverage criteria, and the problem of how to generate a test suite from a formal timed system model such that the test suite satisfies a given coverage criterion. We use model checking techniques to explore the state space of a model until a set of traces is found that together satisfy the coverage criterion. A key observation is that a coverage criterion can be viewed as consisting of a set of items, which we call coverage items. Each coverage item can be treated as a separate reachability problem.

Based on our view of coverage items we define a language, in the form of parameterized observer automata, to formally describe coverage criteria. We show that the language is expressive enough to describe a variety of common coverage criteria from the literature. Two algorithms for test case generation with observer automata are presented. The first algorithm returns a trace that satisfies all coverage items with minimum cost; we use this algorithm to generate a test suite with minimal execution time. The second algorithm explores only states that may increase the already found set of coverage items. This algorithm works well together with observer automata.

The developed techniques have been implemented in the tool CoVer. The tool has been used in a case study together with Ericsson in which a WAP gateway was tested. The case study shows that the techniques have industrial strength.
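The key observation, that a coverage criterion decomposes into coverage items which are each a separate reachability problem, can be illustrated on a toy untimed transition system. The model, the edge-coverage items, and the breadth-first search below are invented for illustration; the thesis itself works with timed automata, observer automata, and the CoVer tool.

```python
# Toy illustration of coverage items as reachability problems: for each
# item (here, each edge label), search for a shortest trace covering it.
from collections import deque

transitions = {                      # state -> [(edge_label, next_state)]
    "s0": [("a", "s1"), ("b", "s2")],
    "s1": [("c", "s3")],
    "s2": [("d", "s3")],
    "s3": [],
}
coverage_items = {"a", "b", "c", "d"}   # edge coverage: traverse every edge

def trace_covering(item, start="s0"):
    """Breadth-first search for a shortest trace whose edges include `item`."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, trace = queue.popleft()
        if item in trace:
            return trace
        for label, nxt in transitions[state]:
            # Revisit a seen state only when taking the sought edge itself.
            if nxt not in seen or item == label:
                seen.add(nxt)
                queue.append((nxt, trace + [label]))
    return None

suite = [trace_covering(item) for item in sorted(coverage_items)]
print(suite)   # one (possibly overlapping) trace per coverage item
```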
118

Predictor development for controlling real-time applications over the Internet

Kommaraju, Mallik 25 April 2007
Over the past decade there has been a growing demand for interactive multimedia applications deployed over public IP networks. To achieve acceptable Quality of Service (QoS) without significantly modifying the existing infrastructure, end-to-end applications need to optimize their behavior and adapt to network characteristics. Most existing application optimization techniques are based on reactive strategies, i.e. reacting to occurrences of congestion. We propose the use of predictive control to address the problem in an anticipatory manner. This research deals with developing models to predict end-to-end single-flow characteristics of Wide Area Networks (WANs). A novel signal, in the form of single-flow packet accumulation, is proposed for feedback purposes. This thesis presents a variety of effective predictors for the above signal using Auto-Regressive (AR) models, Radial Basis Functions (RBF) and Sparse Basis Functions (SBF). The study consists of three sections. We first develop time-series models to predict the accumulation signal. Since encoder bit-rate is the most logical and generic control input, a statistical analysis is then conducted to analyze the effect of input bit-rate on end-to-end delay and the accumulation signal. Finally, models are developed using this bit-rate as an input to predict the resulting accumulation signal. The predictors are evaluated based on Noise-to-Signal Ratio (NSR) along with their accuracy at increasing accumulation levels. Among the time-series models, RBF gave the best NSR, closely followed by AR models, although analysis based on accuracy at increasing accumulation levels showed AR to be better in some cases. The study of the effect of bit-rate revealed that bit-rate may not be a good control input on all paths. Models such as Auto-Regressive with Exogenous input (ARX) and RBF were used to predict the accumulation signal using bit-rate as a modeling input. ARX and RBF models were found to give comparable accuracy, with RBF being slightly better.
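The time-series part of the study can be sketched as follows: fit an AR model to an accumulation-like signal and score its predictions with a noise-to-signal ratio. The signal, model order, and NSR definition below are assumed stand-ins, not the thesis's exact choices.

```python
# Sketch of AR prediction of an accumulation-like signal, scored by an
# assumed NSR (error variance over signal variance). Synthetic data.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
t = np.arange(2000)
accumulation = (50 + 20 * np.sin(2 * np.pi * t / 200)
                + rng.normal(0, 3, t.size))   # synthetic accumulation signal

train, test = accumulation[:1500], accumulation[1500:]
model = AutoReg(train, lags=10).fit()
# Out-of-sample prediction over the test window (recursive multi-step).
pred = model.predict(start=len(train), end=len(accumulation) - 1)

nsr = np.var(test - pred) / np.var(test)
print(f"NSR over test window: {nsr:.3f}")
```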
119

Urmakarens budbärare : Modern intelligent design-rörelse i jämförelse med brittisk naturteologi vid 1800-talets början / The watchmaker's messenger : The modern intelligent design movement compared with British natural theology at the beginning of the nineteenth century

Samuelsson, Jonatan January 2012
The essay compares William Paley's Natural Theology : or, Evidences of the Existence and Attributes of the Deity, Collected from the Appearances of Nature (1802) with Michael J. Behe's Darwin's Black Box : The Biochemical Challenge to Evolution (1996). The comparison is contextualised with a sketch of the history of opposition to evolutionary theory during the intervening period, which, together with scientific developments themselves, is assumed to explain the differences in tone and choice of arguments between the two works. In terms of basic message, structure of ideas, purpose and religiosity, the two works are found to be considerably closer to each other than might initially appear to be the case.
120

A web-based programming environment for novice programmers

Truong, Nghi Khue Dinh January 2007
Learning to program is acknowledged to be difficult; programming is a complex intellectual activity and cannot be learnt without practice. Research has shown that first-year IT students presently struggle with setting up compilers, learning how to use a programming editor and understanding abstract programming concepts. Large introductory class sizes pose a great challenge for instructors in providing timely, individualised feedback and guidance for students as they practise. This research investigates these problems and identifies solutions. An interactive and constructive web-based programming environment is designed to help beginning students learn to program in high-level, object-oriented programming languages such as Java and C#. The environment eliminates common starting hurdles for novice programmers and gives them the opportunity to successfully produce working programs at the earliest stage of their study. The environment allows students to undertake programming exercises anytime, anywhere, by "filling in the gaps" of a partial computer program presented in a web page, and enables them to receive guidance in getting their programs to compile and run. Feedback on quality and correctness is provided through a program analysis framework. Students learn by doing, receiving feedback and reflecting, all through the web. A key novel aspect of the environment is its support for small "fill in the gap" programming exercises. This type of exercise places a stronger emphasis on developing students' reading and code comprehension skills than the traditional approach of writing a complete program from scratch. It allows students to concentrate on the critical dimensions of the problem to be solved and reduces the complexity of writing programs.
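The "fill in the gap" mechanic can be illustrated with a minimal sketch: the student supplies only a missing fragment, the environment assembles the full program, and the feedback distinguishes compile failures from failing tests. The thesis's environment targets Java and C# on the web; this toy version is plain Python, and the exercise template and tests are invented.

```python
# Toy "fill in the gap" exercise checker. The template, gap, and tests
# are invented; a real environment would sandbox execution.
TEMPLATE = """
def is_even(n):
    return {gap}
"""

TESTS = [(0, True), (1, False), (2, True), (7, False)]

def check_submission(gap_code: str) -> str:
    namespace = {}
    try:
        exec(TEMPLATE.format(gap=gap_code), namespace)   # assemble and compile
    except SyntaxError as e:
        return f"Does not compile: {e}"                  # compile-stage feedback
    failures = [(arg, want) for arg, want in TESTS
                if namespace["is_even"](arg) != want]
    return "All tests passed." if not failures else f"Failed on: {failures}"

print(check_submission("n % 2 == 0"))   # -> All tests passed.
print(check_submission("n % 2 = 0"))    # -> Does not compile: ...
```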
