651

Europeiska Unionen : En resa genom tid om hur dagspressen i två nationer framställer europeisk gemenskap [The European Union: A journey through time on how the daily press in two nations portrays European community]

Eriksson, Jenny January 2010 (has links)
This thesis focuses on media portrayal of a large political institution: the European Union. Articles from two nations' daily newspapers have been analyzed in order to examine how the European Union is described and framed for their readers. The newspapers chosen for this study, and from which the material was collected, were the Swedish newspaper Svenska Dagbladet and the American newspaper The New York Times. The material was taken from three periods of time, so the results offer not only a picture of how the media report on the subject, but also an idea of how the character of the coverage has changed over time. Moreover, by analyzing articles from newspapers in different countries, the material can show whether media reporting and coverage of the European Union differ across national borders. The theories examined in this study, which served as tools in the analysis, are agenda-setting theory, media ideology, framing, media logic and political communication. Further influences considered and of value during the development of this thesis are the ideas of the modern theorist Jürgen Habermas and earlier studies on the subject, for example those by Lars Palm and Vanni Tjernström. The results showed differences between the two nations' ways of reporting on the subject, and also that changes have occurred over time. It was mainly the Swedish news reporting that showed evident change over the three investigated periods, which can be explained by the country's overall changed relationship to the European Union. The results further showed that the American articles included more actors and subjects than the Swedish articles, which can be attributed to the greater length of the American articles.
652

Scouting algorithms for field robots using triangular mesh maps

Liu, Lifang 31 July 2007
Labor shortages have prompted researchers to develop robot platforms for agricultural field scouting tasks. Sensor-based automatic topographic mapping and scouting algorithms for rough, large, unstructured environments are presented. They involve moving an image sensor to collect terrain and other information and concomitantly constructing a terrain map of the working field. In this work, a triangular mesh map was first used to represent the rough field surface and to plan exploration strategies. A 3D image sensor model was used to simulate the collection of field elevation information.

A two-stage exploration policy was used to plan the next best viewpoint by considering both the distance and the elevation change in the cost function. A greedy exploration algorithm based on an energy cost function was developed; the energy cost function not only considers the traveling distance, but also includes the energy required to change elevation and the rolling resistance of the terrain. An information-based exploration policy was developed to choose the next best viewpoint so as to maximize the information gain and minimize the energy consumption. In a partially known environment, the information gain was estimated by applying a ray tracing algorithm. A two-part scouting algorithm was developed to address the field sampling problem; the coverage algorithm identifies a reasonable coverage path to traverse the sampling points, while the dynamic path planning algorithm determines an optimal path between two adjacent sampling points.

The developed algorithms were validated in two agricultural fields and three virtual fields by simulation. The greedy exploration policy based on energy consumption outperformed other pattern methods in energy, time, and travel distance in the first 80% of the exploration task. The exploration strategy that incorporated energy consumption and information gain, using a ray tracing algorithm on a coarse map, showed an advantage over other policies in total energy consumption and path length by at least 6%. For scouting algorithms, line sweeping methods require less energy and a shorter distance than the potential function method.
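The greedy, energy-aware next-best-viewpoint selection described above can be sketched compactly. The snippet below is a minimal illustration, assuming candidate viewpoints are (x, y, elevation) tuples and that the score combines rolling resistance, elevation gain and an information credit estimated elsewhere (e.g. by ray tracing against a coarse map); all function names, weights and data are hypothetical and not taken from the thesis.

```python
import math

# Hypothetical weights: energy per metre travelled (rolling resistance),
# energy per metre of elevation gained, and a reward scale for new terrain seen.
ROLLING_COST = 1.0      # J per metre of horizontal travel
CLIMB_COST = 5.0        # J per metre of elevation gain
INFO_REWARD = 50.0      # J-equivalent credit per unit of expected new information

def energy_cost(current, candidate):
    """Energy to move from the current pose to a candidate viewpoint.

    Both poses are (x, y, z) tuples; only elevation *gain* is charged,
    mimicking a cost function that penalises climbing more than rolling.
    """
    dx, dy = candidate[0] - current[0], candidate[1] - current[1]
    distance = math.hypot(dx, dy)
    climb = max(0.0, candidate[2] - current[2])
    return ROLLING_COST * distance + CLIMB_COST * climb

def next_best_viewpoint(current, candidates, expected_info):
    """Greedy policy: pick the candidate with the best information-to-energy trade-off.

    expected_info[i] is the estimated information gain at candidates[i],
    e.g. from ray tracing against a coarse map of the partially known field.
    """
    best, best_score = None, -math.inf
    for vp, info in zip(candidates, expected_info):
        score = INFO_REWARD * info - energy_cost(current, vp)
        if score > best_score:
            best, best_score = vp, score
    return best

# Toy usage: three candidate viewpoints on a sloping field.
current_pose = (0.0, 0.0, 10.0)
candidates = [(5.0, 0.0, 10.5), (3.0, 4.0, 12.0), (8.0, 1.0, 9.0)]
expected_info = [0.4, 0.9, 0.3]   # fraction of unseen cells each view would cover
print(next_best_viewpoint(current_pose, candidates, expected_info))
```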
653

The Individual Mandate, Commerce Clause, and Supreme Court: Predicting the Court's Ruling in HHS v. Florida

Medling, Nicholas 01 January 2012 (has links)
An analysis of the evolution of the Commerce Clause, the Justices on the Supreme Court, and the arguments presented in this case indicate that the minimum coverage provision of the Patient Protection and Affordable Care Act will be struck down. Although the Court will likely be split 5 to 4 along ideological lines, each of the justices will have a unique rationale behind their decision. Chief Justice Roberts, Justice Scalia, and Justice Kennedy were heavily targeted by both parties’ oral and written arguments because there was speculation that any one of these traditionally conservative justices could be the fifth vote to uphold the individual mandate. However, it does not appear likely that the federal government supported their claims well enough to yield such a result. Instead, the Court will respond in the negative to the issue of "Whether Congress had the power under Article I of the Constitution to enact the minimum coverage provision." The Court’s interpretation of the Congress' commerce power has undergone two major expansions since the Constitution was ratified, and both of these expansions were met with a contractionary response to prevent the commerce clause’s growth into an unchecked power. This Court will not open a new frontier of power for the Congress, but rather it will respect the limits on Congressional power established by the Rehnquist Court.
654

Jackknife Empirical Likelihood for the Variance in the Linear Regression Model

Lin, Hui-Ling 25 July 2013 (has links)
The variance is a measure of spread about the center, so estimating it accurately has long been an important topic. In this paper, we consider the linear regression model, which is the most popular model in practice. We use the jackknife empirical likelihood method to obtain an interval estimate of the variance in the regression model. The proposed jackknife empirical likelihood ratio converges to the standard chi-squared distribution. A simulation study is carried out to compare the jackknife empirical likelihood method with the standard method in terms of coverage probability and interval length for confidence intervals of the variance in linear regression models; the proposed jackknife empirical likelihood method performs better. We also illustrate the proposed methods using two real data sets.
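A minimal sketch of the jackknife empirical likelihood idea the abstract describes: jackknife pseudo-values of a residual-variance estimator are formed, and the empirical likelihood ratio for their mean is inverted against a chi-squared quantile to obtain an interval. The RSS/n estimator, the grid-based inversion and all names below are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def variance_pseudo_values(X, y):
    """Jackknife pseudo-values of the error-variance estimator in y = X beta + e.

    The full-sample and leave-one-out estimators use RSS / n; pseudo-values are
    V_i = n * T_n - (n - 1) * T_{n-1,i}.
    """
    n = len(y)
    def sigma2(Xs, ys):
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        resid = ys - Xs @ beta
        return resid @ resid / len(ys)
    t_full = sigma2(X, y)
    pseudo = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        pseudo[i] = n * t_full - (n - 1) * sigma2(X[mask], y[mask])
    return pseudo

def el_log_ratio(values, theta):
    """-2 log empirical likelihood ratio for the mean of `values` at `theta`."""
    d = values - theta
    if d.max() <= 0 or d.min() >= 0:
        return np.inf                       # theta outside the convex hull
    lo, hi = -1.0 / d.max(), -1.0 / d.min()
    eps = 1e-10 * (hi - lo)
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo + eps, hi - eps)
    return 2.0 * np.sum(np.log1p(lam * d))

# Toy usage: simulate a linear model and report a 95% JEL interval for sigma^2.
rng = np.random.default_rng(0)
n, sigma2_true = 200, 4.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=np.sqrt(sigma2_true), size=n)

pseudo = variance_pseudo_values(X, y)
grid = np.linspace(pseudo.mean() * 0.5, pseudo.mean() * 1.8, 400)
cutoff = chi2.ppf(0.95, df=1)
inside = [t for t in grid if el_log_ratio(pseudo, t) <= cutoff]
print("95%% JEL interval for the variance: (%.3f, %.3f)" % (inside[0], inside[-1]))
```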
655

Health Safety-Net Crisis: A Case Study of News Discourse

Mitchell, Cecilia F. 13 August 2013 (has links)
This study is the first to analyze news coverage of a hegemonic struggle over a crisis that threatened to close a Southern safety net hospital. Such closure could have left indigent, African American men and women without health care access. The study utilizes critical discourse analysis to focus on news portrayals of patients and the struggle over whether the hospital would continue to be governed by a majority-Black, public board of directors or a nonprofit, private board recommended by a majority-White civic group. Results indicate that newspaper coverage privileged the elite, White view, while stereotypically representing indigent, Black patients as problematic. Coverage legitimized privatizing the hospital’s board through a neoliberal discourse that also portrayed its majority-Black board as incompetent.
656

Prediction of recurrent events

Fredette, Marc January 2004 (has links)
In this thesis, we will study issues related to prediction problems and put an emphasis on those arising when recurrent events are involved. First we define the basic concepts of frequentist and Bayesian statistical prediction in the first chapter. In the second chapter, we study frequentist prediction intervals and their associated predictive distributions. We will then present an approach based on asymptotically uniform pivotals that is shown to dominate the plug-in approach under certain conditions. The following three chapters consider the prediction of recurrent events. The third chapter presents different prediction models when these events can be modeled using homogeneous Poisson processes. Amongst these models, those using random effects are shown to possess interesting features. In the fourth chapter, the time homogeneity assumption is relaxed and we present prediction models for non-homogeneous Poisson processes. The behavior of these models is then studied for prediction problems with a finite horizon. In the fifth chapter, we apply the concepts discussed previously to a warranty dataset coming from the automobile industry. The number of processes in this dataset being very large, we focus on methods providing computationally rapid prediction intervals. Finally, we discuss the possibilities of future research in the last chapter.
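For the simplest recurrent-event setting, a homogeneous Poisson process, the plug-in approach that the thesis compares against its pivotal-based alternative can be sketched in a few lines: the estimated rate is plugged into the Poisson distribution of the future count, which ignores the estimation error in the rate and is exactly why plug-in intervals can under-cover. The function name, level and figures below are illustrative only.

```python
from scipy.stats import poisson

def plugin_prediction_interval(n_events, t_observed, t_future, level=0.95):
    """Plug-in prediction interval for the number of events in a future window.

    A homogeneous Poisson process with unknown rate is observed for t_observed
    time units, producing n_events events. The rate estimate is plugged into
    the Poisson distribution of the future count over t_future time units;
    the uncertainty in the estimated rate is ignored.
    """
    rate_hat = n_events / t_observed
    mu_future = rate_hat * t_future
    alpha = 1.0 - level
    lower = poisson.ppf(alpha / 2, mu_future)
    upper = poisson.ppf(1 - alpha / 2, mu_future)
    return int(lower), int(upper)

# Toy usage: 30 warranty claims observed over 2 years; predict claims in the next year.
print(plugin_prediction_interval(n_events=30, t_observed=2.0, t_future=1.0))
```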
657

Electrochemical Characterizations and Theoretical Simulations of Transport Behaviors at Nanoscale Geometries and Interfaces

Liu, Juan 12 November 2012 (has links)
Since single nanopores were first proposed as a potential rapid and low-cost tool for DNA sequencing in the 1990s (PNAS, 1996, 93, 13770), extensive studies on both biological and synthetic nanopores and nanochannels have been reported. Nanochannel-based stochastic sensing at the single-molecule level has been widely reported through the detection of transient ionic current changes induced by geometry blockage due to analyte translocation. Novel properties, including ion current rectification (ICR) and memristive and memcapacitive behaviors, have been reported. These fundamental properties of nanochannels arise from the nanoscale dimensions and enable applications not only in single-molecule sensing, but also in drug delivery, electrochemical energy conversion, concentration enrichment and separation, nanoprecipitation, nanoelectronics, etc. Electrostatic interactions at the nanometer scale between the fixed surface charges and mobile charges in solution play major roles in those applications due to the high surface-to-volume ratio. However, the surface charge density (SCD) at the nanometer scale is inaccessible within nanoconfinement and is often extrapolated from bulk planar values. The determination of SCD at the nanometer scale is urgently needed for the interpretation of the aforementioned phenomena. This dissertation mainly focuses on the determination of SCD confined in a nanoscale device with known geometry via combined electroanalytical measurements and theoretical simulation. The measured currents through charged nanodevices differ for potentials with the same amplitude but opposite polarities, a deviation from linear Ohmic behavior known as ICR. Through theoretical simulation of the experiments by solving the Poisson and Nernst-Planck equations, the SCD within nanoconfinement is directly quantified for the first time. An exponential gradient SCD is introduced on the interior surface of a conical nanopore based on the gradient distribution of the applied electric field. The physical origin is proposed to be the facilitated deprotonation of surface functional groups by the applied electric field. The two parameters that describe the non-uniform SCD distribution, the maximum SCD and the distribution length, are determined by fitting the high- and low-conductivity currents, respectively. The model is validated and applied successfully for the quantification and prediction of mass transport behavior in different electrolyte solutions. Furthermore, because of the surface charge distribution, the transport behaviors are intrinsically heterogeneous at the nanometer scale, and the concept is extended to noninvasively determine the surface modification efficacy of individual nanopore devices. Preliminary results of single-molecule sensing based on streptavidin-iminobiotin are included. The pH-dependent binding affinity of streptavidin-iminobiotin is confirmed by the different current change signals ("steps" and "spikes") observed at different pHs. Qualitative concentration and potential dependence have been established. The chemically modified nanopores are demonstrated to be reusable through regeneration of the binding surface.
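The ion current rectification (ICR) mentioned above is the asymmetry between currents measured at potentials of equal magnitude but opposite polarity. A simple illustrative calculation of the rectification ratio from an I-V sweep is sketched below with made-up data; it is not a result or a method from the dissertation.

```python
def rectification_ratio(voltages, currents, v_ref):
    """Ion current rectification ratio |I(+v_ref)| / |I(-v_ref)| from an I-V sweep.

    A ratio of 1 corresponds to linear (Ohmic) behaviour; charged conical
    nanopores typically show ratios well above or below 1 depending on the
    sign of the surface charge and the pore geometry.
    """
    iv = dict(zip(voltages, currents))
    return abs(iv[v_ref]) / abs(iv[-v_ref])

# Hypothetical I-V sweep (volts, nanoamperes) for a charged conical pore.
v = [-1.0, -0.5, 0.0, 0.5, 1.0]
i = [-2.8, -1.2, 0.0, 0.7, 1.1]
print(rectification_ratio(v, i, v_ref=1.0))   # a value far from 1 signals rectification
```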
658

THE CAPITAL REQUIREMENT DIRECTIVE IV : A study of national divergences in Sweden, Denmark and Germany's financial markets and the ability to implement the CRD IV

Larsson Nyheim, Robin, Larsson Nyheim, Kim January 2012 (has links)
The global financial market has been under considerable stress in recent years. The financial crisis that started in the US in 2008 and spread around the world created awareness that the world's financial market requires more regulation to withstand such a crisis. A new recommended framework for the global financial market was therefore developed by the Basel Committee on Banking Supervision: Basel III. Basel III ushered in a new era of stricter supervision of banks and tighter regulations. As one of the world's most integrated regions, the European Union strives to be the first to implement the Basel III framework; to achieve this, it created its own legislative package, the Capital Requirement Directive IV (CRD IV). The research purpose of this dissertation is to examine how divergences in Sweden, Denmark and Germany's national financial markets will affect their ability to implement the new CRD IV regulations. Based on the research, the conclusion is that our Swedish respondent is the best prepared of the three respondents to meet the new regulations; the characteristics of the Swedish financial market seem well suited to the new requirements. Both Germany and Denmark seem to be experiencing problems; the characteristics of their financial markets create obstacles when implementing the new regulations. Denmark has difficulties with its mortgage lending market due to its unique mortgage model. Germany will have problems with the leverage ratio and its inflexible three-pillar banking system. Germany's government has been skeptical of the new CRD IV regulations, and this might also have affected our German respondent negatively. With the implementation of the regulations, the European Commission aims to improve the banking sector in the member states so that banks can better endure periods of stress and help prevent another financial crisis. However, the implementation of the new regulations puts considerable pressure on the banks and on how well they can perform during the implementation process. For this research a questionnaire was created to help understand how three major banks in Sweden, Denmark and Germany will be affected by the new regulations, and whether the characteristics of their national financial markets give them advantages or disadvantages when implementing them. The answers also indicate which of the new regulations each respondent will have the most difficulty implementing. Future research is suggested into the Danish mortgage lending market and its unique mortgage model, to see whether it can co-exist with the new CRD IV regulations. In-depth research into the German three-pillar banking system would also be interesting, to find out whether it can be maintained or has to be restructured.
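The leverage ratio that the study flags as problematic for German banks has a compact definition under Basel III: Tier 1 capital divided by the total exposure measure, with a 3% minimum. The sketch below illustrates that check with made-up balance-sheet figures; it is only a toy example, not data from the dissertation.

```python
def leverage_ratio(tier1_capital, total_exposure):
    """Basel III leverage ratio: Tier 1 capital over the total exposure measure."""
    return tier1_capital / total_exposure

# Hypothetical bank figures (in billions); Basel III sets a 3% minimum.
tier1 = 45.0
exposure = 1700.0       # on-balance-sheet assets plus off-balance-sheet exposures
ratio = leverage_ratio(tier1, exposure)
print(f"Leverage ratio: {ratio:.2%} (minimum 3%) ->",
      "meets the requirement" if ratio >= 0.03 else "falls short")
```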
