  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Testing specifications in partial observability models : a Bayesian encompassing approach

Almeida, Carlos 04 October 2007 (has links)
A structural approach to modelling a statistical problem makes it possible to introduce a contextual theory based on prior knowledge. This approach makes the parameters fully meaningful, but in the intermediate steps some unobservable characteristics are introduced because of their contextual meaning. Once the model is completely specified, it is marginalised onto the observed variables in order to obtain a statistical model. The variables can be discrete or continuous, both at the level of the unobserved variables and at the level of the observed (manifest) variables. We are sometimes faced, especially in the behavioural sciences, with ordinal variables; this is the case of the so-called Likert scales. An ordinal variable can therefore be interpreted as a discrete version of a latent concept (the discretization model). Normality of the latent variables reduces the study of this model to the analysis of the structure of the covariance matrix of the "ideally" measured variables, but only a sub-parameter of this matrix can be identified and consistently estimated (namely, the matrix of polychoric correlations). Two questions therefore arise: Is the normality of the latent variables testable? If not, which aspect of this hypothesis could be testable? In the discretization model we observe a loss of information relative to the information contained in the latent variables. To treat this situation we introduce the concept of partial observability through a (non-bijective) measurable function of the latent variable. We explore this definition and verify that other models can be fitted into this concept. The definition of partial observability allows us to distinguish two cases, depending on whether or not the function involved depends on a Euclidean parameter. Once partial observability is introduced, we set out a set of conditions for building a specification test at the level of the latent variables. The test is built using the encompassing principle in a Bayesian framework. More precisely, the problem treated in this thesis is: how to test, in a Bayesian framework, the multivariate normality of a latent vector when only a discretized version of that vector is observed. More generally, the problem can be rephrased as: how to test, in a Bayesian framework, a parametric specification on latent variables against a nonparametric alternative when only a partial observation of these latent variables is available.
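A minimal sketch of the discretization model described above may help fix ideas: a latent bivariate normal vector is observed only through ordinal (Likert-type) categories, so the data identify the polychoric correlation but not the full latent distribution. This is not code from the thesis; the latent correlation, cut points, and sample size are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

rho = 0.6                                  # assumed latent ("polychoric") correlation
cov = np.array([[1.0, rho], [rho, 1.0]])
latent = rng.multivariate_normal(np.zeros(2), cov, size=5000)

# Cut points defining a 5-point Likert scale for each manifest variable
cuts = np.array([-1.5, -0.5, 0.5, 1.5])

# Partial observability: only the category index (a non-bijective measurable
# function of the latent draw) is observed, never the latent value itself.
observed = np.digitize(latent, cuts)       # integers 0..4, shape (5000, 2)

# The observable information is the 5x5 contingency table of the two items;
# under latent normality it identifies rho but not the full latent law.
table = np.zeros((5, 5), dtype=int)
np.add.at(table, (observed[:, 0], observed[:, 1]), 1)
print(table)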
2

Futures-Based Forecasts of U.S. Crop Prices

Zhu, Jiafeng 03 October 2017 (has links)
Over the last decade, U.S. crop prices have become significantly more volatile. Volatile markets pose increased risks for agricultural market participants and create a need for reliable price forecasts. The research discussed in this paper aims to find different approaches to forecasting crop cash prices based on the prices of related futures contracts. Corn, soybeans, soft red winter wheat, and cotton are the focus of this research. Since the price data for these commodities are non-stationary, this paper uses two approaches to address the problem: the first forecasts the difference in prices between the current and the future period, and the second uses regimes. The five-year moving average is taken as the benchmark when comparing these approaches. Model performance is evaluated using R-squared, mean errors, root mean squared errors, the modified Diebold-Mariano test, and the encompassing test. The results show that both the difference model and the regime model perform better than the benchmark in most cases, but without a significant difference between each other. Based on these findings, the regime model was used to forecast the cash prices of corn and soybeans, the difference model was used for cotton, and the benchmark was used to forecast the SRW cash price. / Master of Science
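As a hedged illustration of the forecast-evaluation step mentioned above (not the thesis's own code or data), the sketch below computes RMSE and the Harvey-Leybourne-Newbold modified Diebold-Mariano statistic for two competing forecasts of a simulated price series; the series, the forecast labels, and the horizon are assumptions.

import numpy as np
from scipy import stats

def modified_dm(actual, f1, f2, h=1):
    # Modified Diebold-Mariano test for equal squared-error accuracy at horizon h
    d = (actual - f1) ** 2 - (actual - f2) ** 2      # loss differential
    n = len(d)
    d_bar = d.mean()
    # Long-run variance of d from autocovariances up to lag h-1
    gamma = [np.sum((d[k:] - d_bar) * (d[:n - k] - d_bar)) / n for k in range(h)]
    var_d = (gamma[0] + 2 * sum(gamma[1:])) / n
    dm = d_bar / np.sqrt(var_d)
    # Harvey-Leybourne-Newbold small-sample correction, compared with t(n-1)
    mdm = dm * np.sqrt((n + 1 - 2 * h + h * (h - 1) / n) / n)
    p_value = 2 * stats.t.sf(abs(mdm), df=n - 1)
    return mdm, p_value

rng = np.random.default_rng(1)
actual = 100 + np.cumsum(rng.normal(size=200))          # stand-in cash prices
f_regime = actual + rng.normal(scale=1.0, size=200)     # "regime model" forecast
f_bench = actual + rng.normal(scale=1.3, size=200)      # moving-average benchmark

rmse = [np.sqrt(np.mean((actual - f) ** 2)) for f in (f_regime, f_bench)]
print("RMSE (regime, benchmark):", rmse)
print("mDM statistic, p-value:", modified_dm(actual, f_regime, f_bench))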
3

Regime de metas no Brasil e previsão de inflação: acurácia e encompassing / Inflation targeting in Brazil and inflation forecasting: accuracy and encompassing

Abreu, Vanessa Castro 08 April 2015 (has links)
This study seeks to determine whether the adoption of the inflation targeting regime in Brazil in 1999 has been able to reduce the inflation forecast errors generated by the Naïve, ARIMA, GARCH, UC-SV, VAR, and Phillips curve models, and how the accuracy of the forecasts generated by these models has behaved over the post-adoption period. In addition, we seek to verify the occurrence of encompassing among the forecasts generated by these models. The period analysed runs from January 1996 to December 2013 and the forecast horizon is twelve months. Using the Root Mean Square Error (RMSE) statistic and the modified Diebold-Mariano test (mDM), the results show that forecast errors have fallen over time, so that the inflationary process appears to have become easier to predict. On the other hand, the Naïve model is more accurate than the other models examined, suggesting that inflation has become harder to predict. The modified HLN test (mHLN) shows that the inflation forecasts generated by the Naïve model contain, most of the time, all the information needed to make accurate predictions, with no need to incorporate information available in the forecasts generated by other models. However, this situation appears to be changing over time.
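A small sketch of a forecast-encompassing check in the spirit of the mHLN test cited above (not taken from the dissertation): regressing the Naïve forecast error on the difference between a rival forecast and the Naïve forecast, a statistically zero slope means the Naïve forecast already encompasses the rival. All series below are simulated, and the "rival" label is a placeholder for any of the competing models.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 200
inflation = 5 + np.cumsum(rng.normal(scale=0.2, size=T))     # stand-in 12-month inflation
y = inflation[12:]                                           # target at horizon 12
f_naive = inflation[:-12]                                    # Naive: value 12 months earlier
f_rival = y + rng.normal(scale=0.5, size=T - 12)             # placeholder rival (e.g. ARIMA)

# Encompassing regression: e_naive = const + lambda * (f_rival - f_naive) + error.
# A lambda that is statistically zero means the Naive forecast encompasses the rival.
e_naive = y - f_naive
X = sm.add_constant(f_rival - f_naive)
res = sm.OLS(e_naive, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})
print(res.params)
print(res.pvalues)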
4

The Empirical Study of the Dynamics of Taiwan Short-term Interest-rate

Lien, Chun-Hung 10 December 2006 (has links)
This study addresses three issues concerning the dynamics of the 30-day Taiwan commercial paper rate (CP2). The first issue focuses on the estimation of continuous-time short-term interest rate models. We discretize the continuous-time models using two different approaches and then use weekly and monthly data to estimate the parameters. The models are evaluated by data fit. We find that the estimated parameters are similar across discretization approaches and are more stable and efficient under quasi-maximum likelihood (QML) with weekly data. The Taiwan CP rate exhibits mean reversion, and the elasticity of volatility with respect to the level of the interest rate is less than 1 and smaller than that reported for U.S. T-Bill rates by CKLS (1992) and Nowman (1997). We also find that the CIR-SR model performs best for the Taiwan CP rate. The second issue compares the continuous-time short-term interest rate models empirically, using both predictive accuracy tests and encompassing tests. Having estimated the parameters of the models via the discretization of Nowman (1997) and QML, we produce forecasts of the conditional mean and volatility of the interest rate over multiple-step-ahead horizons. The results indicate that the more sophisticated models outperform the simpler models in in-sample data fit but differ in out-of-sample forecasting performance. Models equipped with mean reversion produce better forecasts of conditional means during some periods, and heteroskedastic-variance models outperform their counterparts in volatility forecasting in some periods. The third issue concerns the persistent and substantial volatility of short-term interest rates. This part asks how the realizations of Taiwan short-term interest rates can best be described empirically. Various popular volatility specifications are estimated and tested. The empirical findings reveal that mean reversion is an important characteristic of Taiwan interest rates and that a level effect exists. Overall, the GARCH-L model fits Taiwan interest rates well.
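To make the estimation step concrete, the following sketch (not the thesis's code) Euler-discretizes a CKLS-type short-rate model, dr = (alpha + beta*r) dt + sigma * r^gamma dW, and fits it by quasi-maximum likelihood on a simulated weekly series standing in for the CP2 rate; all parameter values and the simulated data are assumptions.

import numpy as np
from scipy.optimize import minimize

dt = 1 / 52                                   # weekly observations

def neg_loglik(params, r):
    # Gaussian QML objective for the Euler-discretised CKLS model
    alpha, beta, sigma, gamma = params
    mean = r[:-1] + (alpha + beta * r[:-1]) * dt
    var = (sigma ** 2) * (r[:-1] ** (2 * gamma)) * dt
    if np.any(var <= 0):
        return np.inf
    resid = r[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

# Simulate a mean-reverting path as a stand-in for the CP2 rate (per cent)
rng = np.random.default_rng(3)
r = np.empty(1000)
r[0] = 5.0
for t in range(999):
    r[t + 1] = abs(r[t] + (0.8 - 0.15 * r[t]) * dt
                   + 0.4 * r[t] ** 0.7 * np.sqrt(dt) * rng.normal())

fit = minimize(neg_loglik, x0=[0.5, -0.1, 0.3, 0.5], args=(r,), method="Nelder-Mead")
print("alpha, beta, sigma, gamma:", fit.x)    # gamma < 1 would echo the level effect above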
5

As cifras da transcendência na filosofia de Karl Jaspers / The ciphers of transcendence in Karl Jaspers' philosophy

Melo, Fernanda de Araújo 18 August 2009 (has links)
CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This thesis aims at understanding the "ciphers of transcendence" in Karl Jaspers' thought. We believe, however, that this investigation is also decisive for considering to what extent his philosophy of existence could be called a philosophy of transcendence. One could say that the reflection our author develops around human existence opens the way to transcendence at the moment he links existence to the horizon of failure and its consequent overcoming. It is in the confrontation with these limit situations that transcendence "occurs" in the form of "ciphered signs". The path established by Jaspers in grounding the specificity of the human condition thus culminates, in a decisive manner, in the sphere of transcendence. From this perspective, we believe this research contributes to understanding how Jaspers establishes the relation between existence and transcendence and, beyond that, how the status of his philosophy of transcendence is grounded.
6

Stochastic volatility : maximum likelihood estimation and specification testing

White, Scott Ian January 2006 (has links)
Stochastic volatility (SV) models provide a means of tracking and forecasting the variance of financial asset returns. While SV models have a number of theoretical advantages over competing variance-modelling procedures, they are notoriously difficult to estimate. The distinguishing feature of the SV estimation literature is that the algorithms that provide accurate parameter estimates are conceptually demanding and require significant computational resources to implement. Furthermore, although a significant number of distinct SV specifications exist, little attention has been paid to how one would choose the appropriate specification for a given data series. Motivated by these facts, a likelihood-based joint estimation and specification-testing procedure for SV models is introduced that largely overcomes the operational issues surrounding existing estimators. The estimation and specification-testing procedures in this thesis are made possible by the introduction of a discrete nonlinear filtering (DNF) algorithm. This procedure uses the nonlinear filtering equations to provide maximum likelihood estimates for the general class of nonlinear latent variable problems, which includes the SV model class. The DNF algorithm provides a fast and accurate implementation of the nonlinear filtering equations by treating the continuously valued state variable as if it were a discrete Markov variable with a large number of states. When the DNF procedure is applied to the standard SV model, very accurate parameter estimates are obtained. Since the accuracy of the DNF is comparable to that of other procedures, its advantages lie in ease and speed of implementation and in providing online filtering (prediction) of variance. Additionally, the DNF procedure is very flexible and can be used for any dynamic latent variable problem with closed-form likelihood and transition functions. Likelihood-based specification testing for non-nested SV specifications is undertaken by formulating and estimating an encompassing model that nests two competing SV models. Likelihood ratio statistics are then used to make judgements regarding the optimal SV specification. The proposed framework is applied to SV models that incorporate either extreme returns or asymmetries.
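A compact sketch of the DNF idea described above, assuming a standard SV model (y_t = exp(h_t/2) * e_t with AR(1) log-volatility h_t): the continuous state is replaced by a fine grid, so the filtering recursion becomes matrix arithmetic and the likelihood is available directly. Parameter values and grid size below are illustrative, not those used in the thesis.

import numpy as np
from scipy.stats import norm

def dnf_loglik(y, mu=-0.5, phi=0.95, sigma_eta=0.2, n_grid=100):
    # Grid over the stationary range of h_t (the log variance)
    sd_h = sigma_eta / np.sqrt(1 - phi ** 2)
    grid = np.linspace(mu - 4 * sd_h, mu + 4 * sd_h, n_grid)
    # Transition matrix of the discretised AR(1) state
    trans = norm.pdf(grid[None, :], loc=mu + phi * (grid[:, None] - mu), scale=sigma_eta)
    trans /= trans.sum(axis=1, keepdims=True)
    # Start the filter at the stationary distribution of h_t
    prior = norm.pdf(grid, loc=mu, scale=sd_h)
    prior /= prior.sum()
    loglik = 0.0
    for obs in y:
        lik = norm.pdf(obs, scale=np.exp(grid / 2))   # y_t | h_t ~ N(0, exp(h_t))
        joint = prior * lik
        step = joint.sum()
        loglik += np.log(step)
        prior = (joint / step) @ trans                # predict next period's state
    return loglik

# Simulate a standard SV series and evaluate the DNF likelihood at the true values
rng = np.random.default_rng(4)
h = np.zeros(500)
h[0] = -0.5
for t in range(1, 500):
    h[t] = -0.5 + 0.95 * (h[t - 1] + 0.5) + 0.2 * rng.normal()
y = np.exp(h / 2) * rng.normal(size=500)
print("DNF log-likelihood:", dnf_loglik(y))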
7

Självet som kärlekshandling : Om självets och identitetens problem i Karl Jaspers' Von der Wahrheit / The self as an act of love: On the problem of the self and identity in Karl Jaspers' Von der Wahrheit

Piispanen, Nichan January 2019 (has links)
In this essay I try to put Jaspers' philosophy of love in dialogue with some of the leading approaches to personal identity. I start briefly with the pre-Socratics Parmenides and Heraclitus. I then introduce some leading identity theories within the analytical tradition and, finally, the narrative identity theory with Paul Ricoeur at the forefront. In section 2.1, I analyse the basic concept of the Umgreifende and its relation to, among others, Kant and Kierkegaard. Knowledge and objectification of personal identity are seen as major problems in identity theories and contrasted with Jaspers' idea of identity as Existenz. In section 2.2 I analyse Jaspers' view on action and freedom. Love appears as the cornerstone of Jaspers' philosophy in general and of Existenz in particular. I link Jaspers' concept of love to Augustine and at the same time show the differences between the two. In section 2.3, I analyse Jaspers' view on communication and action, which he regards as the only way for the self to appear and become phenomenal. The world proves to have a key role as the medium that enables communication. In section 2.4, I clarify Jaspers' view on the role of language in intersubjective communication and the possibilities of hermeneutics. The essay concludes with a brief summary.
8

On testing the Phillips curves, the IS Curves, and the interaction between fiscal and monetary policies

Maka, Alexis 27 November 2013 (has links)
This dissertation consists of three essays on empirical testing of Phillips curves, IS curves, and the interaction between fiscal and monetary policies. The first essay ('Phillips Curves: An Encompassing Test') tests Phillips curves using an autoregressive distributed lag (ADL) specification that encompasses the accelerationist Phillips curve (APC), the New Keynesian Phillips curve (NKPC), the Hybrid Phillips curve (HPC), and the Sticky-Information Phillips curve (SIPC). We use data from the United States (1985Q1--2007Q4) and from Brazil (1996Q1--2012Q2), using the output gap and, alternatively, the real marginal cost as the measure of inflationary pressure. The empirical evidence rejects the restrictions implied by the NKPC, the HPC, and the SIPC, but does not reject those implied by the APC. The second essay ('IS Curves: An Encompassing Test') tests IS curves using an ADL specification that encompasses the traditional Keynesian IS curve (KISC), the New Keynesian IS curve (NKISC), and the Hybrid IS curve (HISC). We use data from the United States (1985Q1--2007Q4) and from Brazil (1996Q1--2012Q2). The evidence rejects the restrictions implied by the NKISC and the HISC, but does not reject those of the KISC. The third essay ('The Effects of Fiscal Policy and its Interactions with Monetary Policy in Brazil') analyzes the effects of fiscal policy shocks on the dynamics of the economy and the interaction between fiscal and monetary policy using structural vector autoregressions (SVARs). We test the Fiscal Theory of the Price Level for Brazil, analyzing the response of public sector liabilities to primary surplus shocks. For the hybrid identification we find that it is not possible to distinguish empirically between Ricardian (Monetary Dominance) and non-Ricardian (Fiscal Dominance) regimes. However, using sign restrictions, there is some evidence that the government followed a Ricardian (Monetary Dominance) regime from January 2000 to June 2008.
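As a hedged illustration of the encompassing strategy the first essay describes (not the dissertation's code or data), the sketch below fits a general ADL specification for inflation on its own lags and the output gap, then tests the restriction a particular Phillips curve imposes, here the accelerationist restriction that the inflation-lag coefficients sum to one. Lag orders, data, and the tested restriction are assumptions.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 160
gap = 0.8 * np.sin(np.arange(T) / 8) + rng.normal(scale=0.3, size=T)   # stand-in output gap
infl = np.zeros(T)
for t in range(2, T):
    infl[t] = (0.6 * infl[t - 1] + 0.4 * infl[t - 2]
               + 0.3 * gap[t - 1] + rng.normal(scale=0.2))

# General ADL(2,1): pi_t = c + b1*pi_{t-1} + b2*pi_{t-2} + g1*gap_{t-1} + e_t
X = sm.add_constant(np.column_stack([infl[1:-1], infl[:-2], gap[1:-1]]))
res = sm.OLS(infl[2:], X).fit()

# Accelerationist (APC) restriction: the inflation-lag coefficients sum to one
print(res.params)
print(res.f_test("x1 + x2 = 1"))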
9

Engagement relationnel et bénévolat en milieu carcéral : du don et de la reconnaissance en institution totalisante / Relational commitment and voluntary service in prison area

Petitgas, Bernard 01 September 2017 (has links)
The "Total Institution", as a body closed in on itself and sealed off from the outside world, does not exist, either as an absolute or as an ideal type against which, for instance, the penal institution could be compared. There are always human, material and temporal interfaces between the different social spaces, even when they are delimited by walls and barbed wire. To understand the prison world, the one in which we are detained and on which we focus our research, the term "all-encompassing" institution is better suited, insofar as a permanent conflict takes place there between formal and material rationality, along with a powerful conflict of socialization, of reconfiguration of behaviours and of subjectivation of lived experience. Our detention centre provides an example of this conflict between a normalized, repressive and security-driven universe and another filled with pragmatic strategies of survival or re-socialization. The resulting complexity mirrors society itself and the permanent relation between individuals and their institutions. Building on our previous research, this study endeavours to tackle two important questions, the gift paradigm and the theory of recognition, and to link them to voluntary work in prison. First of all, we want to show that many aspects of the gift paradigm and of the theory of recognition are already present in the context of incarceration. But these aspects have one particular consequence: they keep the life of the all-encompassing institution in a closed circuit, turned in on itself. The gift paradigm, through voluntary commitment, leads to a reconsideration of the prison space as a space of fully-fledged socialization, in constant interaction with the outside world. It is precisely because it is in constant relation with society that the all-encompassing institution requires voluntary work, so that this relation, taken up within the gift paradigm, makes the detainees themselves responsible for their exchanges with the outside world. In terms of rationalization, understood as the meaning actors give to their actions and to their search for ties, the aims are at once pragmatic, utilitarian and altruistic: the detainees redefine themselves within a frame of reciprocity, giving back and offering, rather than one of debt, stigmatization and punishment. The social bond is the basis of voluntary work as gift. It benefits everyone: in terms of reintegration and the struggle against stigma for some, and in terms of a place in the social game for others, just as it benefits the volunteers. Within a framework of "positive mutual indebtedness", it is socialization itself that is expressed.
10

The law in The Brothers Karamazov / El derecho en Los hermanos Karamazov

Zolezzi Ibárcena, Lorenzo 10 April 2018 (has links)
The Brothers Karamazov was Dostoievski's last novel and is therefore, in a way, a synthesis of his thinking. The article briefly develops some of its key ideas: in matters of guilt, attitude matters more than action; everyone is guilty of everything before everyone (universal guilt); suffering purifies the individual and acts as a remedy that promotes spiritual elevation; free will is central to human existence. But the novel is also the story of a crime. Somebody is murdered, and the reader discovers the perpetrator only at the very end. About a quarter of the novel is devoted to technical legal matters: the pre-trial investigation and the oral trial. Most interesting of all, an innocent man is found guilty, and he is found guilty because the law, given the facts as backed by the evidence, could not do otherwise. It would be possible to find him not guilty, but to do so it would be necessary to change the paradigm that is the backbone of modern criminal law.
