311 |
Asymmetric effects of monetary policy: A Markov-Switching SVAR approach
Gaopatwe, Molebogeng Patience, 14 February 2022 (has links)
This paper examines the effects of monetary policy on macroeconomic variables in Botswana, a small developing economy, using the Markov-switching structural vector autoregressive (MS-SVAR) framework and quarterly time-series data from 1994:Q1 to 2019:Q4. The study uses the bank rate (the policy interest rate), inflation, and the output gap. The first model is a structural vector autoregressive (SVAR) model of the form employed by Rudebusch and Svensson (1999), while the second uses the same structure but adds Markov switching in the policy rule (i.e., a Markov-switching SVAR). Regime-switching models can describe the data-generating process effectively in both in-sample and out-of-sample evaluations, compared with linear models, which submerge the structural changes that have occurred in the economy over the years. The results from the SVAR show that monetary policy has a symmetric impact on the output gap and inflation; non-linearities in the structural model therefore do not necessarily imply asymmetric effects of shocks. Furthermore, the MS-SVAR shows that the Central Bank of Botswana responds differently to policy shocks in different regimes, underscoring the importance of regime-switching features in providing a more accurate description of the economy.
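The regime-dependent policy rule at the heart of an MS-SVAR can be sketched with a toy simulation. The sketch below is not the thesis model: the reaction coefficients, the transition matrix, and the simulated inflation and output-gap series are all hypothetical, chosen only to show how a Markov-switching policy rule makes the same inflation movement produce different interest-rate responses across regimes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical policy regimes, 0 ("dovish") and 1 ("hawkish"), with
# regime-dependent reaction coefficients on inflation and the output gap.
phi_pi = np.array([0.5, 1.8])    # response to inflation, per regime
phi_gap = np.array([0.2, 0.8])   # response to the output gap, per regime
P = np.array([[0.95, 0.05],      # regime transition matrix (rows sum to 1)
              [0.10, 0.90]])

T = 104                          # 1994:Q1-2019:Q4 is 104 quarters
s = np.zeros(T, dtype=int)       # regime path
pi = rng.normal(2.0, 1.0, T)     # simulated inflation
gap = rng.normal(0.0, 1.0, T)    # simulated output gap
r = np.zeros(T)                  # simulated policy (bank) rate

for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])  # Markov switch of the regime
    r[t] = phi_pi[s[t]] * pi[t] + phi_gap[s[t]] * gap[t] + rng.normal(0, 0.25)

# In the hawkish regime the same inflation movement moves the rate more.
for k in (0, 1):
    slope = np.polyfit(pi[s == k], r[s == k], 1)[0]
    print(f"regime {k}: estimated inflation response ~ {slope:.2f}")
```

Fitting a regime-specific slope of the rate on inflation recovers a larger response in regime 1, which is the asymmetry the MS-SVAR is designed to detect.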
|
312 |
Production of Poly(lactic acid) Biodegradable Films and the Introduction of a Novel Initiation Method for Free Radical Polymerization via Magnetic Fields
Miller, Kent R., 19 July 2012 (has links)
No description available.
|
313 |
Essays on Investment, Maintenance, and Repair
Li, Junkan, 12 August 2022 (has links)
No description available.
|
314 |
Can less be more? : A study of the significance of influencers' follower counts in influencer marketing
Holfve, Felicia; Jiddy, Mudjier, January 2022 (has links)
In recent years it has become increasingly important for companies to be active on social media and to update the various social platforms continuously, as influencer marketing has begun to replace traditional marketing as a strategy. Influencer marketing, in which companies use influencers to promote their products, is regarded as the most direct marketing strategy for reaching potential customers, and also the most cost-effective. The purpose of this study is to examine whether follower count matters, from both a company and a consumer perspective. To investigate this, the authors applied a qualitative method with interviews at 2 companies, followed by a quantitative method in the form of an unstructured questionnaire answered by 4 companies, and finally a survey that collected answers from a total of 161 respondents. The results are informative for both the company perspective and the consumer perspective. The study indicates that companies do not look directly at the follower count; instead they focus on follower engagement and reach when choosing an influencer, both of which follow from the follower count. A micro-influencer tends to have higher follower engagement because followers feel a friendship-like relationship with the influencer, and is considered more credible because of the close and open relationship with the followers, something that tends to diminish as an influencer grows. A macro-influencer, on the other hand, has greater reach thanks to a larger platform and follower count. The results show that companies should use a mix of both to achieve the best possible outcome and to reach different types of consumers; companies can thus analyze their campaigns and adapt the choice of influencer to the type of campaign and the results they are after.
|
315 |
How are office cap rates affected by macroeconomic changes? : A quantitative study of the relationship between cap rates and macroeconomic variables
Yilmaz, Rusen; Edlund, Viktor, January 2019 (has links)
The Swedish office market has in recent years developed into a state of record-breaking prime rents at an all-time high and cap rates at an all-time low, driven by favorable macroeconomic conditions, including a negative repo rate. This study aims to examine to what extent the development of cap rates for offices in Sweden's three largest cities, Stockholm, Gothenburg and Malmo, can be explained by macroeconomic changes. A point of interest is also how these cities relate to each other and to the development of the economy. The question is put in a general setting, where market- and property-specific factors are left out.

With the support of empirical studies and economic theories, such as the 4Q model, the question is approached through a selection of macro variables, processing of data, and basic statistical calculations. The selected macro variables are: CPI, the repo rate, 10-year government bonds, GDP, unemployment, and OMXS30. Correlation calculations and linear regression models are the central tools of the quantitative work. The statistical calculations are performed both with and without time lags between the cap rates and the macro variables; in this way, the inertia and cyclical nature of the property market are taken into account.

The result, which is in line with previous studies, shows that the movement of cap rates over time often cannot be explained by macroeconomic variables alone, and that there is no palpable difference between the cities in the development of cap rates. The result underlines the importance of the amount of data and the selection of independent variables when performing a regression analysis. That said, the study does demonstrate a clear connection between the development of cap rates and the chosen macro variables.
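The lagged-regression idea, running the fit both with and without time shifts to respect the property market's inertia, can be sketched as follows. The series and the data-generating process below are simulated stand-ins, not the study's data; the lag of two quarters and all coefficients are hypothetical, and only the mechanics of selecting the best-fitting lag are illustrated.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 80  # quarterly observations (illustrative)

repo = np.cumsum(rng.normal(0, 0.1, T))  # simulated repo-rate path
# Hypothetical data-generating process: cap rates follow the repo rate
# with a two-quarter lag, mimicking the inertia of the property market.
lag = 2
cap = 4.0 + 0.6 * np.roll(repo, lag) + rng.normal(0, 0.05, T)
cap[:lag] = cap[lag]  # discard the wrap-around introduced by np.roll

def lagged_r2(y, x, k):
    """R^2 of regressing y[t] on x[t - k] via ordinary least squares."""
    X = np.column_stack([np.ones(T - k), x[: T - k]])
    yk = y[k:]
    beta, *_ = np.linalg.lstsq(X, yk, rcond=None)
    resid = yk - X @ beta
    return 1 - resid.var() / yk.var()

# The fit should peak near the true two-quarter lag.
r2 = {k: lagged_r2(cap, repo, k) for k in range(5)}
best = max(r2, key=r2.get)
print("best lag:", best, {k: round(v, 3) for k, v in r2.items()})
```

Comparing the explained variance across candidate lags is the simple device that lets a study of this kind say whether cap rates react to macro variables with a delay.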
|
316 |
Determinants of Capital Structure : A Quantitative Study on Swedish Listed Firms
Johansson, Rasmus; Lindberg, Filip, January 2022 (has links)
In the finance literature, the determinants of capital structure have been widely debated. Previous studies have mainly focused on microeconomic determinants in countries outside Sweden, and research on the Swedish market has been sparse. This study aims to analyze how microeconomic determinants such as profitability, firm size and tangible assets affect capital structure, and further how determinants from the Swedish macro environment, such as inflation, the tax rate and the interest rate, affect capital structure. The study considers previous theories on the relevance of capital structure and evaluates the support among Swedish firms for the Irrelevance Theory, the Pecking Order Theory and the Trade-off Theory. In other words, by evaluating how the determinants affect capital structure, we were able to see connections between theory and how Swedish firms make their financing decisions. Based on a review of the literature and theories, the determinants, the quantitative approach and the collection method were decided. The data were collected over the 10-year period 2010-2019 and amounted to 1116 firms and 44 632 observations. A multiple regression was performed in which the dependent variable, the debt ratio, was split into short-term, long-term and total debt to get a better understanding of the results. Analysis of the results demonstrated that Swedish firms' total debt ratio had a significant negative relation to profitability. This indicates that Swedish firms choose to finance their operations with internal funds rather than with debt, which supports the Pecking Order Theory. However, the determinant growth shows a significant negative relationship to the debt ratio, which supports the Trade-off Theory. The results imply that Swedish firms show conflicting support for theories on capital structure.

Our results on changes in the Swedish macro environment show that inflation causes firms' long-term debt ratio to decrease, which potentially demonstrates the fear of higher interest rates as inflation hits and an unwillingness to finance with debt when the cost of financial distress increases. Previous studies have shown contrasting results on the determinants' effects on capital structure. We consider our findings to be in line with overall expectations and believe we add further knowledge that can be applied to the Swedish business environment.
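A minimal version of this kind of multiple-regression specification can be sketched as below. The data are simulated and every coefficient is hypothetical (the profitability sign is chosen to match the pecking-order prediction); the sketch only illustrates estimating determinant effects on a debt ratio, not the thesis's actual 44 632-observation panel.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000  # firm-year observations (illustrative; the thesis uses 44 632)

profit = rng.normal(0.08, 0.05, n)   # profitability (return on assets)
size = rng.normal(18.0, 1.5, n)      # firm size (log of total assets)
tang = rng.uniform(0.1, 0.6, n)      # tangible assets / total assets

# Hypothetical signs: the pecking order predicts a negative profitability
# coefficient; size and tangibility are assumed to enter positively.
debt = 0.30 - 1.2 * profit + 0.02 * size + 0.15 * tang + rng.normal(0, 0.05, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), profit, size, tang])
beta, *_ = np.linalg.lstsq(X, debt, rcond=None)
for name, b in zip(["const", "profitability", "size", "tangibility"], beta):
    print(f"{name:>13}: {b:+.3f}")
```

A negative estimated profitability coefficient is the pattern the thesis reads as support for the Pecking Order Theory; in practice the debt ratio would be split into short-term, long-term and total debt and the regression run for each.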
|
317 |
Macroscopic Traffic Safety Analysis Based On Trip Generation Characteristics
Siddiqui, Chowdhury, 01 January 2009 (has links)
Recent research has shown that incorporating roadway safety into transportation planning is one of the proactive approaches to improving safety, and aggregate-level analysis for predicting crash frequencies is considered an important step in this process. Previous studies have examined various categories of predictors at the macro level (census blocks, traffic analysis zones, census tracts, wards, counties and states) to find appropriate correlations with crashes. This study contributes to this ongoing macro-level road safety research by investigating various trip productions and attractions, along with roadway characteristics, within traffic analysis zones (TAZs) of four counties in the state of Florida. Crashes occurring in 1349 TAZs in Hillsborough, Citrus, Pasco, and Hernando counties during the years 2005 and 2006 were examined; the selected counties represent both urban and rural environments. To understand the prevalence of various trip attraction and production rates per TAZ, the Euclidean distances between the centroid of the TAZ containing a particular crash and the centroid of the ZIP area containing the at-fault driver's home address were calculated. It was found that almost all crashes in Hernando and Citrus counties for the years 2005-2006 took place within about a 27-mile radius of the at-fault driver's home, and that about sixty-two percent of crashes occurred at a distance of between 2 and 10 miles from the homes of the at-fault drivers. These results indicate that home-based trips may be more associated with crashes, and trip-related model estimates, significant at the 95% confidence level, later supported this hypothesis. Previous aggregate-level road safety studies have widely adopted the negative binomial distribution for crashes.

Properties of the data such as non-negative integer counts, a non-normal distribution, and over-dispersion make the negative binomial technique suitable, and it was therefore selected to build the crash prediction models in this research. Four response variables, aggregated at the TAZ level, were used: total number of crashes; severe (fatal and severe-injury) crashes; total crashes during peak hours; and pedestrian- and bicycle-related crashes. For each response, separate models were estimated using four different sets of predictors: i) various trip variables, ii) total trip production and total trip attraction, iii) road characteristics, and iv) all predictors together. It was found that the total crash model and the peak-hour crash model were best estimated by total trip productions and total trip attractions. On the basis of log-likelihoods, deviance per degree of freedom, and Pearson chi-square per degree of freedom, the severe crash model was best fit by the trip-related variables only, and the pedestrian- and bicycle-related crash model by the road-related variables only. The significant trip-related variables in the severe crash model were home-based work attractions, home-based shop attractions, light truck productions, heavy truck productions, and external-internal attractions. Only two variables, the sum of roadway segment lengths with a 35 mph speed limit and the number of intersections per TAZ, were significant in the pedestrian- and bicycle-related crash model developed using road characteristics only. The 1349 TAZs were then grouped into three clusters based on the quartile distribution of trip generation, termed less-tripped, moderately-tripped, and highly-tripped TAZs. It was hypothesized that separate models developed for these clusters would provide a better fit, as clustering increases homogeneity within each cluster.

The cluster models were re-run using the significant predictors obtained from the joint models and were compared with the previous sets of models; however, the differences in model fit (in terms of Akaike's Information Criterion values) were not significant. This study points to different approaches for predicting crashes at the zonal level. It adds to the literature on macro-level crash modeling by taking various trip-related data into account, since previous zone-level safety studies have not explicitly considered trip data as explanatory covariates.
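The centroid-distance step can be sketched as follows. The coordinates below are simulated stand-ins on a flat plane, not the actual Florida TAZ and ZIP geometry, and the spread of simulated home-to-crash offsets is hypothetical; the thresholds simply mirror the 27-mile and 2-to-10-mile figures discussed above.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500  # simulated crashes (the study examines 2005-2006 crashes in 1349 TAZs)

# Hypothetical planar coordinates, in miles: the centroid of the TAZ
# containing each crash and the centroid of the at-fault driver's home ZIP.
taz_xy = rng.uniform(0, 40, (n, 2))
home_xy = taz_xy + rng.normal(0, 5, (n, 2))  # crashes simulated near home

dx, dy = (taz_xy - home_xy).T
dist = np.hypot(dx, dy)  # Euclidean distance per crash, in miles

within_27 = np.mean(dist <= 27)                   # study: nearly all crashes
share_2_10 = np.mean((dist >= 2) & (dist <= 10))  # study reports about 62%
print(f"within 27 mi: {within_27:.0%}, between 2 and 10 mi: {share_2_10:.0%}")
```

Computing these shares over the real crash records is what supports the hypothesis that home-based trips are the ones most associated with crashes.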
|
318 |
Macro-Borings in Cretaceous Oysters from Drumheller, Alberta: Taphonomy and Paleoecology
Kemp, Kathleen Margaret, 06 1900 (has links)
Macro-borings and other biogenic structures found in Ostrea glabra valves were examined using a dissecting microscope, SEM and X-radiography. Shells were collected from in situ and transported oyster beds of the Bearpaw-Horseshoe Canyon Formation transition (Upper Cretaceous) at Drumheller, Alberta. Emended diagnoses for Entobia, Talpina and Zapfella were proposed, and systematic descriptions of these ichnogenera, along with Oichnus, were given. Repair blisters and partitions, apparently formed by the oyster in response to irritation, were also described and interpreted. It was confirmed that statistical analysis can differentiate between round holes drilled by naticids and muricids. It was possible to define fossil micro-environments on the basis of an interpretation of taphonomy and paleoecology considered within the sedimentary context. / Thesis / Bachelor of Science (BSc)
|
319 |
Adding hygiene to Gambit Scheme
Doucet, Antoine, 07 1900 (links)
The Scheme programming language is known for its powerful macro system. With Scheme source code represented as Scheme data, macro transformations allow the programmer to act directly on the underlying abstract syntax tree. Macro transformations use a syntax similar to regular procedures, but they define procedures meant to be executed at compile time. Those procedures return an abstract-syntax-tree representation that is substituted at the transformer's call site, and procedures executed at compile time have the same expressive power as those executed at run time. With such a macro system, the programmer can create specialized syntax rules without additional performance cost: these syntactic extensions allow code abstraction without the usual run-time cost of allocating a closure on the heap.

Scheme's representation of source code as values is inherited from the Lisp programming language. Source code is represented as manipulable lists of symbols, or lists of other lists: a structure called an S-expression. With this simplistic approach, however, accidental name clashes can occur. The binding to which an identifier refers is determined exclusively by the lexical context of that identifier. When an identifier is moved around in the abstract syntax tree, it can be caught within the lexical context of another binding with the same name. In such cases the moved identifier may no longer refer to the intended binding, since the second one can take precedence over the first: referential transparency is lost. The choice of identifier names then directly influences the behavior of the program, producing errors that are hard to understand. Name clashes can be fixed manually in the code, for instance by using unique identifier names, but the programmer then becomes responsible for the program's safety. The automatic preservation of referential transparency is called hygiene, a notion that has been studied extensively in the context of the lambda calculus and Lisp-family languages.

The latest Scheme revised report, used as a specification for the language, extends it with support for hygienic macro transformations. Until now, the Gambit implementation of Scheme did not provide a built-in hygienic macro system. As a contribution, we re-implemented Gambit's macro system to support hygienic transformations at its core. The chosen algorithm is based on the set-of-scopes algorithm created by Matthew Flatt and implemented in the Racket language. Racket is heavily inspired by Scheme but diverges on several important features; one of them is the extensive hygienic macro system on which most of Racket's core features are built, so the algorithm has been robustly tested in that context.

In this thesis, we give an overview of the Scheme language and its syntax. We state the hygiene problem and describe different strategies used to enforce hygiene automatically. We then justify our choice of algorithm and provide a formal definition. Finally, we present the original Gambit macro system, explain the changes required, and provide a validity and performance analysis comparing the original Gambit implementation with our new hygiene-supporting version.
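The capture problem and its gensym-style cure can be illustrated outside Scheme with a toy textual expander. The Python sketch below is purely illustrative, and a much simpler mechanism than the set-of-scopes algorithm: a hypothetical `swap` template introduces a temporary named `tmp`, naive expansion captures a user variable of the same name, and renaming the macro-introduced binder restores the intended behavior.

```python
import itertools

counter = itertools.count()

def gensym(base):
    """Fresh name, standing in for the renaming done by hygienic expanders."""
    return f"{base}_{next(counter)}"

# A toy "swap" macro expanded by naive textual substitution. The template
# introduces a temporary `tmp`; if the user's own variable is also named
# `tmp`, the naive expansion captures it.
TEMPLATE = ["tmp = {a}", "{a} = {b}", "{b} = tmp"]

def expand_naive(a, b):
    return [line.format(a=a, b=b) for line in TEMPLATE]

def expand_hygienic(a, b):
    t = gensym("tmp")  # rename the macro-introduced binder before expansion
    return [line.replace("tmp", t).format(a=a, b=b) for line in TEMPLATE]

def run(lines, env):
    for line in lines:
        exec(line, {}, env)  # toy evaluator for the expanded statements
    return env

# Swapping `x` and `tmp`: naive expansion clobbers the user's `tmp`,
# leaving x == 1; hygienic expansion yields the intended x == 2, tmp == 1.
bad = run(expand_naive("x", "tmp"), {"x": 1, "tmp": 2})
good = run(expand_hygienic("x", "tmp"), {"x": 1, "tmp": 2})
print(bad["x"], bad["tmp"], "vs", good["x"], good["tmp"])
```

Hygienic expanders achieve the same separation without literal renaming of source text; set-of-scopes attaches scope identifiers to every identifier so that macro-introduced and user-written names can never collide.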
|
320 |
Hemp fiber – an environmentally friendly fiber for concrete reinforcement
Giltner, Brian, 25 November 2020 (links)
The commercial use of hemp fiber in the construction industry within the United States is non-existent. This lack of use is due to State and Federal laws forbidding the growth of hemp in the United States, which has led to a lack of research. The absence of an established supply chain for hemp, coupled with limited research, has put the United States behind other countries in finding viable options for this renewable resource. This is a study of the performance of raw hemp fibers and processed hemp twine in a cement paste mixture subjected to tensile loading. Three water/cement ratios (0.66, 0.49, 0.42) were considered, and replacement of cement with fly ash was also part of the program to see whether it affects the performance of the system. A detailed description of the method of applying the tensile load to the micro/macro fibers, along with the fixture setup, is part of this article. The results of this investigation show that hemp twine and fibers bond to the cement matrix and can carry higher tensile loads at higher w/c ratios. The study shows that a 30 mm embedment length is best for hemp macro fibers and a 20 mm embedment length for hemp micro fibers. The study also includes a comparative investigation of the performance of hemp fibers against synthetic and steel fibers added to a concrete mix, examining the compressive strength of the fiber-reinforced concrete mixes, flexural capacity, ductility, flexural toughness, and the effect of the fibers on Young's modulus of elasticity. All fibers were introduced into the same mix design (w/c = 0.49) with replacement of 25% of the cement with fly ash. Hemp micro fibers at the same dosing rate as synthetic micro fibers have slightly higher toughness and equivalent flexural strength; hemp macro fibers at a higher dosing rate than synthetic fibers have similar toughness and equivalent flexural strength. Steel fibers performed better than the synthetic and natural fibers in 28-day compressive strength.
|