  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Algorithmes de géolocalisation à l’intérieur d’un bâtiment en temps différé / Post-processing algorithms for indoor localization

Zoubert-Ousseni, Kersane 10 April 2018 (has links)
Real-time indoor geolocalization has been widely studied in recent years and has many applications. Off-line (post-processed) trajectory estimation is also of interest: it makes it possible, for instance, to develop crowdsourcing approaches that take advantage of a large number of users to collect a large number of measurements. Knowing the trajectory of a smartphone user can, for example, feed an attendance map of the building. Estimating this trajectory does not need to be performed in real time, and doing it off-line offers two main benefits. First, a real-time approach estimates the current position using present and past measurements only, whereas an off-line approach has access to all the measurements and can produce an estimated trajectory that is smoother and more accurate. Second, the estimation can run on a server and does not need to be embedded in the smartphone, so more computing power and memory are available.
The objective of this PhD is to provide an off-line estimate of the trajectory of a smartphone user receiving wifi or bluetooth signal-strength (RSS) measurements and collecting inertial (IMU) measurements. First, without the floorplan of the building, a parametric model is proposed, based on an adaptive pathloss model for the RSS measurements and on a piecewise parametrization of the inertial trajectory obtained from the IMU measurements. The resulting average error is 6.2 m for the off-line estimate against 12.5 m in real time. Then, the displacement constraints induced by the walls of the building are taken into account to refine the estimated trajectory with a particle technique, as is common in the state of the art. With this second approach we developed a particle smoother and a maximum a posteriori estimator based on the Viterbi algorithm. Other numerical heuristics are also introduced. A first heuristic uses the parametric model developed without the floorplan to adjust the state model of the user, originally based on the IMU measurements alone. A second heuristic performs several realizations of a particle filter and defines two score functions, based on the RSS measurements and on the continuity of the estimated trajectory; the scores are then used to select the best realization as the final estimate. A global algorithm combining all of these approaches yields an average error of 3.6 m against 5.8 m in real time. Lastly, a statistical machine learning model based on random forests distinguishes correctly estimated trajectories using only a few variables, in view of a crowdsourcing application.
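The "several realizations plus scores" heuristic can be illustrated with a minimal sketch. Everything here (the 1D corridor, the pathloss constants, the noise levels, the use of each run's own RSS log-likelihood as its score) is an illustrative assumption, not the thesis's actual models:

```python
import math
import random

def rss_model(pos, beacon=0.0, p0=-40.0, gamma=2.0):
    """Log-distance pathloss: RSS (dBm) as a function of distance to a beacon."""
    d = max(abs(pos - beacon), 0.1)
    return p0 - 10.0 * gamma * math.log10(d)

def particle_filter(observations, n=200, seed=0):
    """Bootstrap particle filter for a 1D walker observed through noisy RSS.
    Returns the estimated path and the run's RSS log-likelihood score."""
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 10.0) for _ in range(n)]
    path, score = [], 0.0
    for z in observations:
        # propagate with a random-walk motion model
        particles = [p + rng.gauss(0.0, 0.5) for p in particles]
        # weight by the RSS likelihood (Gaussian noise, sigma = 4 dB)
        w = [math.exp(-0.5 * ((z - rss_model(p)) / 4.0) ** 2) for p in particles]
        s = sum(w) or 1e-300
        score += math.log(s / n)
        # resample and record the posterior mean
        particles = rng.choices(particles, weights=w, k=n)
        path.append(sum(particles) / n)
    return path, score

# several realizations; keep the one with the best RSS score
true_path = [1.0 + 0.1 * t for t in range(30)]
noise = random.Random(42)
obs = [rss_model(x) + noise.gauss(0.0, 4.0) for x in true_path]
runs = [particle_filter(obs, seed=s) for s in range(5)]
best_path, best_score = max(runs, key=lambda r: r[1])
```

A second score on trajectory continuity, as in the thesis, would simply be combined with the likelihood score before taking the maximum.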

[en] COMBINING TO SUCCEED: A NOVEL STRATEGY TO IMPROVE FORECASTS FROM EXPONENTIAL SMOOTHING MODELS / [pt] COMBINANDO PARA TER SUCESSO: UMA NOVA ESTRATÉGIA PARA MELHORAR A PREVISÕES DE MODELOS DE AMORTECIMENTO EXPONENCIAL

TIAGO MENDES DANTAS 04 February 2019 (has links)
[en] This thesis is set in the context of time series forecasting. Although many approaches have been developed, simple methods such as exponential smoothing usually produce extremely competitive results, often surpassing approaches of higher complexity. Seminal papers in the field showed that combining forecasts can dramatically reduce forecast error, and the combination of forecasts generated by exponential smoothing has been explored in recent papers. Although this can be done in many ways, a recently proposed method called Bagged.BLD.MBB.ETS uses Bootstrap Aggregating (bagging) in combination with exponential smoothing methods to generate forecasts, and was shown to produce more accurate monthly forecasts than all the analyzed benchmarks. That approach was considered the state of the art in combining bagging and exponential smoothing until the results obtained in this thesis. The thesis first validates Bagged.BLD.MBB.ETS on a data set relevant from the point of view of a real application, thus expanding the fields of application of the methodology. Subsequently, relevant drivers of error reduction are identified, and a new methodology combining bagging, exponential smoothing and clustering is proposed to treat the covariance effect, not previously identified in the literature on the method. The proposed approach was tested on different types of time series from three competitions (M3, CIF 2016 and M4), as well as on simulated data. The empirical results point to a substantial reduction in both variance and forecast error.
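Bagged.BLD.MBB.ETS chains a Box-Cox transform, an STL decomposition, a moving block bootstrap (MBB) of the remainder, and an ETS model fitted to every bootstrapped series. A heavily simplified sketch of the two core ingredients, block resampling and exponential smoothing, might look as follows; simple exponential smoothing stands in for the full ETS family, and the Box-Cox/STL steps are omitted:

```python
import random

def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing; returns the one-step-ahead forecast."""
    level = series[0]
    for v in series[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

def moving_block_bootstrap(resid, rng, block=4):
    """Resample residuals in contiguous blocks to preserve autocorrelation."""
    out = []
    while len(out) < len(resid):
        start = rng.randrange(0, len(resid) - block + 1)
        out.extend(resid[start:start + block])
    return out[:len(resid)]

def bagged_forecast(series, n_boot=50, alpha=0.3, seed=0):
    """Average the SES forecasts of many bootstrapped versions of the series."""
    rng = random.Random(seed)
    fitted, level = [], series[0]
    for v in series:
        fitted.append(level)
        level = alpha * v + (1 - alpha) * level
    resid = [v - f for v, f in zip(series, fitted)]
    forecasts = []
    for _ in range(n_boot):
        boot = [f + e for f, e in
                zip(fitted, moving_block_bootstrap(resid, rng))]
        forecasts.append(ses_forecast(boot, alpha))
    # aggregate the bootstrap forecasts (mean here; other choices are possible)
    return sum(forecasts) / len(forecasts)
```

Bagging helps because each bootstrapped series yields a slightly different fitted model, and averaging their forecasts reduces the variance component of the forecast error.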

Abordagem semi-paramétrica para cópulas variantes no tempo em séries temporais financeiras / Semiparametric approach for time-varying copulas in financial time series

Daniel de Brito Reis 21 September 2016 (has links)
In this work, bivariate time-varying copulas are used to model the dependence between financial return series. The aim is to present a semiparametric estimation approach for time-varying copulas, starting from a parametric copula function whose parameter varies over time. The unknown parameter function is estimated by Haar wavelet approximation, by a Taylor polynomial, and by kernel smoothing. The performance of the three approximation methods is compared via simulation studies, and an application to real data illustrates the methodology.
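The copula parameter function itself is not estimated here, but the Haar approximation underlying one of the three methods can be illustrated generically: projecting a sampled function onto the Haar approximation space at level j amounts to block-wise averaging over 2**j equal segments (the grid and the sample function below are illustrative):

```python
def haar_approx(values, level):
    """Project a sampled function onto the Haar approximation space at the
    given level: piecewise-constant averages over 2**level equal blocks.
    Assumes len(values) is divisible by 2**level."""
    k = 2 ** level
    block = len(values) // k
    out = []
    for b in range(k):
        seg = values[b * block:(b + 1) * block]
        avg = sum(seg) / len(seg)
        out.extend([avg] * len(seg))
    return out

# example: a "time-varying parameter" sampled on a grid, projected at level 1;
# the two constant regimes are recovered exactly
theta = [0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8]
coarse = haar_approx(theta, 1)
```

Higher levels give finer piecewise-constant approximations, trading bias for variance in the same way as a smaller kernel bandwidth.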

Análise de diagnóstico em modelos semiparamétricos normais / Diagnostic analysis in semiparametric normal models

Gleyce Rocha Noda 18 April 2013 (has links)
In this master's dissertation we present diagnostic methods for semiparametric models under normal errors, in particular semiparametric models with one nonparametric explanatory variable, also known as partially linear models. Cubic splines are used for the nonparametric fit, and penalized likelihood functions are applied to obtain maximum likelihood estimators together with their approximate standard errors. The properties of the hat matrix are also derived for this kind of model, so that it can serve as a tool for diagnostic analysis. Normal probability plots with simulated envelopes are also adapted to assess model suitability. Finally, two illustrative examples are presented in which the fits are compared with the usual normal linear models, both in the simple normal additive setting and in the partially linear setting.
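The hat matrix of a penalized linear smoother is central to the diagnostics described above. A minimal numpy sketch (with a toy polynomial design standing in for the cubic spline basis the dissertation uses) shows the two quantities usually read off it, leverages and effective degrees of freedom:

```python
import numpy as np

def hat_matrix(X, penalty=0.0):
    """Hat matrix H = X (X'X + penalty*I)^(-1) X' of a (penalized) linear fit.
    diag(H) gives the leverages; trace(H) the effective degrees of freedom."""
    XtX = X.T @ X
    return X @ np.linalg.solve(XtX + penalty * np.eye(X.shape[1]), X.T)

x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x, x ** 2])  # toy design, not a spline basis
H = hat_matrix(X)           # unpenalized: an orthogonal projection
H_pen = hat_matrix(X, 1.0)  # the penalty shrinks the effective degrees of freedom
```

Large diagonal entries of `H` flag high-leverage observations, the same use the dissertation makes of its hat matrix in the penalized spline setting.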

Estabilização digital em tempo real de imagens em seqüência de vídeos / Real time digital image stabilization in videos sequences

André Calheiros Silvestre 10 May 2007 (has links)
Frame-to-frame differences in a video sequence are caused by undesirable shakes and jiggles, by object motion within the image, or by motions intended by the camera operator. Image stabilization is the process of removing the unwanted fluctuations, and digital processing techniques for it are now commonly applied in the electronics industry. Digital image stabilization requires computational methods for motion estimation, motion smoothing and motion correction, for each of which many processing techniques exist; the most suitable technique should be chosen according to the application. Techniques such as block matching for motion estimation and low-pass filtering for motion smoothing appear in a great number of papers. This work presents a real-time digital image stabilization system capable of stabilizing video sequences with undesirable translational and rotational displacements between frames.
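Block matching, the standard motion estimation technique mentioned above, can be sketched in a few lines: for a block of the current frame, search a small window of the reference frame for the displacement that minimizes the sum of absolute differences (SAD). The block size, search radius and toy frames below are illustrative assumptions:

```python
def sad(a, b):
    """Sum of absolute differences between two equal-size blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def block_match(ref, cur, top, left, size=2, radius=2):
    """Find the displacement (dy, dx), within a square search window of the
    reference frame, that minimizes the SAD for the block of the current
    frame anchored at (top, left)."""
    block = [row[left:left + size] for row in cur[top:top + size]]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > len(ref) or x + size > len(ref[0]):
                continue
            cand = [row[x:x + size] for row in ref[y:y + size]]
            cost = sad(block, cand)
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best

ref = [[6 * i + j for j in range(6)] for i in range(6)]
# current frame: ref shifted down-right by one pixel (borders zero-filled)
cur = [[ref[i - 1][j - 1] if i > 0 and j > 0 else 0 for j in range(6)]
       for i in range(6)]
motion = block_match(ref, cur, top=2, left=2)  # the block originated at (1, 1)
```

A stabilizer aggregates such per-block vectors into a global motion estimate, low-pass filters the motion over time, and warps each frame by the residual.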

[en] SHORT TERM LOAD FORECASTING MODELS / [pt] MODELOS DE PREVISÃO DE CARGA DE CURTO PRAZO

GHEISA ROBERTA TELLES ESTEVES 10 July 2003 (has links)
[en] Two methodologies are applied to model and forecast the hourly electric load series of ESCELSA, an electric utility in southeastern Brazil: direct smoothing, and the more recent Holt-Winters method with multiple seasonal cycles. Both are used to produce hourly load forecasts, i.e. forecasts 24 steps ahead.
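The Holt-Winters recursions behind such 24-step-ahead hourly forecasts can be sketched as follows. This is the plain additive method with a single daily cycle and a crude initialization, not the multiple-seasonality variant the thesis applies, and all parameter values are illustrative:

```python
import math

def holt_winters_additive(y, m=24, alpha=0.3, beta=0.05, gamma=0.2, h=24):
    """Additive Holt-Winters with a single seasonal cycle of length m;
    returns h-step-ahead forecasts (here 24: one per hour of the next day)."""
    # crude initialization from the first two cycles
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t in range(len(y)):
        s = season[t % m]
        last_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s
    return [level + (k + 1) * trend + season[(len(y) + k) % m] for k in range(h)]

pattern = [math.sin(2 * math.pi * i / 24) for i in range(24)]
y = [10.0 + pattern[t % 24] for t in range(96)]  # four noiseless "days" of load
fc = holt_winters_additive(y)                    # forecast for the next day
```

The multiple-cycle version adds a second seasonal array (e.g. a weekly cycle of length 168) updated by the same kind of recursion.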

Tests de l'efficience faible à partir des ondelettes de Haar / Tests of weak form efficiency with Haar wavelet

Belsuz, Autran 24 November 2017 (has links)
This thesis uses Haar wavelets to create new technical indicators and evaluates their performance in order to test the validity of the weak form of the efficient market hypothesis. The chosen approach uses the ability of technical indicators to capture the long memory present in US and European stock indices through estimation of the trend by smoothing. The trend is an important component of economic and financial series and has been the subject of innumerable investigations in technical analysis, in signal processing and in business cycle theory. Its presence is nevertheless not taken into account in the classical theory of finance, because the main models focus on changes in stock prices; indeed, the trend is a source of non-stationarity that creates major difficulties for econometric and financial modeling. Exploiting the trend directly avoids, in this setting, assumptions of trend non-stationarity or unit roots. Based on the results obtained from a regime-switching model, we confirm that it is possible to exploit the presence of long memory in prices, and to beat the market even in the presence of transaction costs, on the American and European markets.

Curve Estimation and Signal Discrimination in Spatial Problems

Rau, Christian, rau@maths.anu.edu.au January 2003 (has links)
In many instances arising prominently, but not exclusively, in imaging problems, it is important to condense the salient information so as to obtain a low-dimensional approximant of the data. This thesis is concerned with two basic situations which call for such a dimension reduction. The first of these is the statistical recovery of smooth edges in regression and density surfaces. The edges are understood to be contiguous curves, although they are allowed to meander almost arbitrarily through the plane, and may even split at a finite number of points to yield an edge graph. A novel locally-parametric nonparametric method is proposed which enjoys the benefit of being relatively easy to implement via a `tracking' approach. These topics are discussed in Chapters 2 and 3, with pertaining background material given in the Appendix. In Chapter 4 we construct concomitant confidence bands for this estimator, which have asymptotically correct coverage probability. The construction can be likened to only a few existing approaches, and may thus be considered our main contribution.

Chapter 5 discusses numerical issues pertaining to the edge and confidence band estimators of Chapters 2-4. Connections are drawn to popular topics surrounding edge detection which originated in the fields of computer vision and signal processing, and these connections are exploited to obtain greater robustness of the likelihood estimator, for example in the presence of sharp corners.

Chapter 6 addresses a dimension reduction problem for spatial data where the ultimate objective of the analysis is the discrimination of these data into one of a few pre-specified groups. In the dimension reduction step, an instrumental role is played by the recently developed methodology of functional data analysis. Relatively standard non-linear image processing techniques, as well as wavelet shrinkage, are used prior to this step. A case study for remotely-sensed navigation radar data exemplifies the methodology of Chapter 6.
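The wavelet shrinkage used before the discrimination step can be illustrated with a one-level Haar transform and soft thresholding. This is a generic sketch of the technique, not the thesis's implementation, and the signal and threshold are illustrative:

```python
def haar_dwt(x):
    """One level of the orthonormal Haar transform: approximation and detail."""
    r = 0.5 ** 0.5
    approx = [r * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [r * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one level of the Haar transform."""
    r = 0.5 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out.extend([r * (a + d), r * (a - d)])
    return out

def soft(v, t):
    """Soft thresholding: shrink a coefficient toward zero by t."""
    if abs(v) <= t:
        return 0.0
    return v - t if v > 0 else v + t

def shrink(x, t):
    """One-level Haar wavelet shrinkage of a signal of even length."""
    approx, detail = haar_dwt(x)
    return haar_idwt(approx, [soft(d, t) for d in detail])

denoised = shrink([1.0, -1.0, 1.0, -1.0], 2.0)  # oscillation treated as noise
```

In practice several decomposition levels are used and the threshold is chosen from an estimate of the noise level, but the shrink-and-reconstruct structure is the same.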

多期損益平穩化行為之決定因素 / Determinants of multi-period income smoothing behavior

黃明潔 Unknown Date (has links)
Earnings is the simplest and most direct indicator of a company's performance, so earnings reporting can be a crucial concern for an investor's decision-making. Much research on earnings management and income smoothing has focused on how stakeholders and other factors affect the reported level of earnings. Since shareholders control a company, their influence on earnings reporting cannot be ignored; however, different types of investors have different investment goals and strategies, and their influence on earnings may differ accordingly. Moreover, because investors evaluate firms in the same industry against one another, competing firms may come to behave in a similar way when reporting earnings, while different industries may differ.

Using market data on listed companies from 1994 to 1999, this thesis examines each company's income smoothing behavior from an ownership-structure perspective (the shareholding ratios of long-term and short-term investors), in order to understand how investors' goals relate to the long-run trend of reported earnings. It also tests for an industry effect between the electronics industry and other industries, asking whether differences in income smoothing are industry-related and whether the electronics industry is inclined to smooth income.

The empirical results document that: (1) a company's income smoothing behavior is significantly related to its ownership structure; (2) companies with a larger proportion of long-term shareholders tend not to adopt income smoothing; (3) companies with a larger proportion of short-term shareholders tend to smooth their income; (4) the industry effect is significantly related to income smoothing: different industries show different patterns, and companies in the electronics industry tend not to smooth income.
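The abstract does not name its smoothing measure. A common operationalization in this literature is the Eckel-style index, which classifies a firm as a smoother when reported income varies less than the underlying activity; the sketch below is written under that assumption, with toy figures:

```python
def coeff_var(xs):
    """Coefficient of variation (sample std / |mean|); assumes a nonzero mean."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return var ** 0.5 / abs(m)

def eckel_index(income, sales):
    """CV of income changes over CV of sales changes; values below 1 are read
    as income smoothing (income varies less than the underlying activity)."""
    d_inc = [b - a for a, b in zip(income, income[1:])]
    d_sal = [b - a for a, b in zip(sales, sales[1:])]
    return coeff_var(d_inc) / coeff_var(d_sal)

smooth_income = [10.0, 10.5, 11.0, 11.5, 12.0]  # steady growth in income
sales = [100.0, 120.0, 105.0, 140.0, 110.0]     # volatile underlying activity
idx = eckel_index(smooth_income, sales)
```

A firm whose income moves exactly in step with its sales gets an index of 1; values well below 1 suggest deliberate leveling of reported profits.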

Svenska småföretags användning av reserveringar för resultatutjämning och intern finansiering / Swedish small firms’ utilization of allowances for income smoothing and internal financing

Andersson, Håkan A. January 2006 (has links)
Small firms often have inadequate access to the capital necessary for successful management. In the mid-1990s the Swedish Government introduced allowance rules that facilitate retention of profit for sole proprietorships and partnership firms. The tax credits arising from the allowances give certain benefits as a source of financing compared to traditional forms of credit. Among the more essential benefits is that payment of some parts of the tax credit can be put on hold almost indefinitely, or alternatively never be paid. The firms are free to use these means, and the responsibility for future payment of the postponed tax debt stays with the individual firm. The comprehensive purpose of the dissertation is to increase the understanding of small Swedish firms, especially sole proprietorships, utilizing the possibilities of allowances for income smoothing and internal financing.

The dissertation begins with case studies comprising a smaller selection of microfirms. Starting from the accounted and reported income-tax returns, alternative calculations are made in which additional positive tax and finance effects appear possible to obtain. One purpose of these studies is to increase insight into the possibilities of income smoothing and internal financing that arise from utilizing these allowances. The studies also illuminate to what extent, and in what way, the allowances are used in reality, and give a more substantive insight into the technique behind the different forms of allowance: appropriation to positive or negative interest rate allocation, appropriation to or dissolution of the tax allocation reserve, and appropriation to or dissolution of the "expansion fund".

Theories regarding the creation of resources through the building of capital, and theories on financial planning and strategy, are then studied. The purpose is to find support for the choice of theoretically grounded independent variables that can be used in cross-sectional studies to explain the use of the appropriation possibilities. The financial theories of greatest interest in operationalising these variables are those that discuss the choice between different financing alternatives for small firms: the "pecking order theory" describes a firm's order of priority when choosing among financing alternatives, and the concept of "financial bootstrapping" expands the frame of financing choices that especially very small firms have at their disposal. The last part of the theoretical frame deals with the phenomenon of "income smoothing", i.e. the leveling out of profits and losses. A number of financial and non-financial variables are supported by and operationalised from these theories, e.g. return on sales, capital turnover, quick ratio and debt-to-equity ratio, as well as age, gender and line of business.

Cross-sectional studies are carried out for the taxation years 1996 and 1999 on databases extracted from Statistics Sweden. The 87,276 sole proprietorships included in the study were required to complete tax returns and pay taxes for their business activity according to the supporting schedule N2, an accounting statement with information from the income statement and balance sheet that accompanies the income tax return. The possibilities of allowances are treated as dependent variables. The intention of the cross-sectional studies is to survey and describe the utilization of the possible allowances with the support of the financial and non-financial independent variables. The connection of these variables to the sole proprietorships' decision to appropriate to the tax allocation reserve is also summarized in a logistic regression model, and a number of theoretically based propositions are made for the purpose of observing how the variables relate to the probability that a sole proprietorship actually appropriates to this form of allowance.

Appropriation to the tax allocation reserve stands out as the most practiced form of allowance. The studies also make clear that utilization varies among the different forms of allowance, and that not all firms with the prerequisites to utilize the possibilities actually do so in full; further utilization of the different allowance possibilities is often conceivable. For the sole proprietorships not yet utilizing them, the allowances should be considered an eligible contribution to internal financing and increased access to capital.
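A logistic regression of the kind summarizing the appropriation decision can be sketched generically. The single toy feature and the data below are illustrative assumptions; the actual model uses the financial and non-financial ratios listed above:

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Logistic regression fitted by batch gradient descent.
    Returns weights; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err
            for j, a in enumerate(xi):
                grad[j + 1] += err * a
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    """Predicted probability of the positive class."""
    z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# toy data: one ratio-like feature; label 1 = "appropriates to the reserve"
X = [[0.0], [0.2], [0.8], [1.0]]
y = [0, 0, 1, 1]
w = fit_logistic(X, y)
```

The fitted coefficients play the role of the dissertation's propositions: their signs indicate how each variable shifts the odds of appropriating to the tax allocation reserve.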
