311

Essays on insider trading, innovation, and political economy

Chen, Jiawei 09 August 2022 (has links) (PDF)
I study how insider trading interacts with the political economy, regulators, and other corporate governance mechanisms. In the first section, I examine the impact of insider trading restriction enforcement on firm innovation. U.S. Securities and Exchange Commission enforcement actions are intended to protect investors and limit expropriation by firm insiders, but enforcement could also affect insiders' incentives to contribute to value-enhancing activities. I therefore explore how corporate innovation and performance respond to insider trading restrictions imposed by firms and regulators. Using manually collected data on SEC indictments against corporate insiders, I document more innovative activity following external insider trading restrictions. External restrictions are also followed by higher corporate investment, capital access, and operating performance. Similarly, internal blackout restrictions on insider trading are linked to more patents. SEC and congressional rule changes serve as quasi-natural experiments that generate shocks to enforcement and indictments, supporting identification and inference. Overall, the results suggest insider trading restrictions and enforcement actions shape subsequent firm activities and managerial decisions by protecting outside investment, resulting in more investment and innovation.

In the second section, I explore the relation between political uncertainty and insider trading. With political uncertainty elevated in recent years, I examine its role in insiders' behavior. Using firm-specific political risk measured from conference calls, I observe that insiders trade more actively during uncertain periods, with trading volume and transaction value increasing alongside political uncertainty. The results are driven by non-routine insider transactions and purchases at firms with CEO duality and fewer insider trading restrictions. I observe similar results when exploiting variation in election timing across states and alternative external measures. Moreover, I find evidence of informed insider trading, observing higher abnormal returns following insider trades amidst political uncertainty. Finally, I find political uncertainty is linked to lower bid-ask spreads and leverage, but observe higher shares outstanding with more insider trading when firms experience positive political uncertainty, consistent with insiders informing markets and improving liquidity. Overall, these results suggest insiders purchase more actively and opportunistically amidst political uncertainty, improving market information quality, especially when internal governance is accommodating.
312

Two Essays On Screening Strategies

Ganesh Pillai, Rajani 01 January 2009 (has links)
Consumers form consideration sets by screening from all available alternatives. Consumers typically utilize one of two types of screening strategies: an exclusion strategy, wherein alternatives not worthy of further consideration are rejected, or an inclusion strategy, wherein worthy alternatives are selected for further evaluation. Extant literature has documented the important role played by screening strategies in decision making. However, there is very limited understanding of when and why consumers may employ one screening strategy over the other, as well as of the impact of the screening strategy on decision accuracy. This dissertation studies an antecedent and a consequence of screening strategies.

Essay 1 investigates the role of consumers' perceived uncertainty in the choice of screening strategy. Four studies in this essay show that when consumers are highly uncertain they are more likely to choose an exclusion screening strategy, whereas when they are less uncertain they are more likely to use inclusion screening. Mediation analyses in Studies 1 and 2 show that the choice of screening strategy is primarily driven by the perceived accuracy of the strategy. Study 3 demonstrates that the effect of uncertainty on the choice of screening strategy is moderated by consideration set size: when uncertain consumers form smaller sets they are more likely to use exclusion screening, but this relationship flips when they form larger consideration sets. Finally, external validity for the relationship between uncertainty and choice of screening strategy is demonstrated in Study 4 using the popular TV game show Who Wants to be a Millionaire?

Essay 2 investigates the role of perceived uncertainty and consideration set size in the relationship between screening strategy and the objective accuracy of the decision. Utilizing an experimental study with an actual choice task, I demonstrate that perceived uncertainty moderates the screening strategy-decision accuracy relationship. Further, this interactive relationship is contingent on consideration set size. Whereas consumers with high perceived uncertainty make higher quality decisions with inclusion while forming smaller consideration sets, their decision quality is higher with exclusion when forming larger sets. Likewise, while consumers with low perceived uncertainty make more accurate decisions with exclusion when forming smaller sets, the accuracy of their decisions increases with inclusion when forming larger sets.

This dissertation contributes to the literature on screening strategies by explicating perceived uncertainty as a critical factor that leads consumers to prefer one screening strategy over the other. Furthermore, it adds to our understanding of an important consequence of using screening strategies: decision accuracy.
313

Framework for Estimating Performance and Associated Uncertainty of Modified Aircraft Configurations

Denham, Casey Leigh-Anne 22 June 2022 (has links)
Flight testing has been the historical standard for determining aircraft airworthiness; however, increases in the cost of flight testing and in the accuracy of inexpensive CFD promote certification by analysis to reduce or replace flight testing. A framework is introduced to predict performance in the special case of a modification to an existing, previously certified aircraft. This framework uses a combination of existing flight test or high-fidelity data for the original aircraft and lower-fidelity data for both the original and modified configurations. Two methods are presented that estimate the model form uncertainty of the modified configuration, which is then used to conduct non-deterministic simulations. The framework is applied to an example aircraft system with simulated flight test data to demonstrate the ability to predict the performance and associated uncertainty of modified aircraft configurations. However, it is important that the models and methods used are applicable and accurate throughout the intended use domain, so the factors and limitations of the framework are explored to determine its range of applicability. The effects of these factors on the performance and uncertainty results are demonstrated using the example aircraft system. The framework is then applied to NASA's X-57 Maxwell and each of its modifications. The estimated performance and associated uncertainties are compared to the airworthiness criteria to evaluate the potential of the framework as a component of the certification by analysis process. / Doctor of Philosophy / Aircraft are required to undergo an airworthiness certification process to demonstrate the capability for safe and controlled flight. This has historically been satisfied by flight testing, but there is a desire to use computational analysis and simulation to reduce the cost and time required. For aircraft that are based on an already certified design but contain minor changes, computational tools have the potential to provide a large benefit. This research proposes a framework to estimate the flight performance of these modified aircraft using inexpensive computational or ground-based methods, without requiring expensive flight testing. The framework is then evaluated to ensure that it provides accurate results and is suitable as a supplement to the airworthiness certification process.
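The following is a minimal, hypothetical sketch of the kind of non-deterministic estimate the framework describes: it is not the dissertation's method, and all numbers, the normal error model, and the 50% spread assumed on the discrepancy are placeholders chosen only to illustrate propagating model-form uncertainty from an original to a modified configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical performance metric (e.g. lift-to-drag ratio) at one flight condition.
flight_test_orig = 16.2   # flight-test value, original aircraft (placeholder)
low_fi_orig = 15.1        # low-fidelity prediction, original aircraft (placeholder)
low_fi_mod = 14.3         # low-fidelity prediction, modified aircraft (placeholder)

# Crude model-form error estimate: the low-fidelity tool's discrepancy on the
# original configuration, for which higher-fidelity data exist.
discrepancy = flight_test_orig - low_fi_orig

# Treat the discrepancy as uncertain (assumed 50% spread) and propagate it onto
# the modified configuration with a Monte Carlo (non-deterministic) simulation.
n = 10_000
samples = low_fi_mod + rng.normal(discrepancy, 0.5 * abs(discrepancy), size=n)

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"modified-aircraft estimate: {samples.mean():.2f} (95% interval {lo:.2f} to {hi:.2f})")
```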
314

The Use of Central Tendency Measures from an Operational Short Lead-time Hydrologic Ensemble Forecast System for Real-time Forecasts

Adams, Thomas Edwin III 05 June 2018 (has links)
A principal factor contributing to hydrologic prediction uncertainty is modeling error introduced by the measurement and prediction of precipitation. The research presented demonstrates the necessity of using probabilistic methods to quantify hydrologic forecast uncertainty due to the magnitude of precipitation errors. Significant improvements have been made in precipitation estimation that have led to greatly improved hydrologic simulations. However, advancements in the prediction of future precipitation have been marginal. This research shows that gains in forecasted precipitation accuracy have not significantly improved hydrologic forecasting accuracy. The use of forecasted precipitation, referred to as quantitative precipitation forecast (QPF), in hydrologic forecasting remains commonplace. Non-zero QPF is shown to improve hydrologic forecasts, but QPF duration should be limited to 6 to 12 hours for flood forecasting, particularly for fast-responding watersheds. Probabilistic hydrologic forecasting captures the hydrologic forecast error introduced by QPF for all forecast durations. However, public acceptance of probabilistic hydrologic forecasts is problematic. Central tendency measures from a probabilistic hydrologic forecast, such as the ensemble median or mean, have the appearance of a single-valued deterministic forecast. The research presented shows that hydrologic ensemble median and mean forecasts of river stage have smaller forecast errors than current operational methods at forecast lead times beginning at 36 hours for fast-response basins. Overall, hydrologic ensemble median and mean forecasts display smaller forecast error than current operational forecasts. / Ph. D.
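As a concrete (and entirely synthetic) illustration of the central-tendency comparison above, the short sketch below contrasts the absolute stage error of a single-valued forecast with the errors of the ensemble mean and median; the numbers are placeholders, not output from the operational forecast system.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example: 50 ensemble members of forecast river stage (m) at one
# lead time, plus a single-valued forecast and the observed stage.
ensemble = rng.normal(loc=4.8, scale=0.6, size=50)
deterministic = 5.6
observed = 4.9

forecasts = {
    "deterministic": deterministic,
    "ensemble mean": ensemble.mean(),
    "ensemble median": np.median(ensemble),
}
for name, value in forecasts.items():
    print(f"{name:16s} absolute error = {abs(value - observed):.2f} m")
```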
315

Stormwater Monitoring: Evaluation of Uncertainty due to Inadequate Temporal Sampling and Applications for Engineering Education

McDonald, Walter Miller 01 July 2016 (has links)
The world is faced with uncertain and dramatic changes in water movement, availability, and quality due to human-induced stressors such as population growth, climatic variability, and land use change. At the apex of this problem is the need to understand and predict the complex forces that control the movement and life-cycle of water, a critical component of which is stormwater runoff. Success in addressing these issues also depends on educating hydrology professionals who understand the physical processes that produce stormflow and the effects that these stressors have on stormwater runoff and water quality. This dissertation addresses these challenges through methodologies that can improve the way we measure stormflow and educate future hydrology professionals. A methodology is presented to (i) evaluate the uncertainty due to inadequate temporal sampling of stormflow data, and (ii) develop equations, using regional regression analysis, that can be used to select a stormflow sampling frequency for a watershed. A case study demonstrates how the proposed methodology has been applied to 25 stream gages with watershed areas ranging between 30 and 11,865 km² within the Valley and Ridge geomorphologic region of Virginia. Results indicate that the autocorrelation of stormflow hydrographs, the drainage area of the catchment, and the time of concentration are statistically significant predictor variables in single-variable regional regression analysis for estimating the site-specific stormflow sampling frequency under a specific magnitude of uncertainty. Methods and resources are also presented that utilize high-frequency continuous stormwater runoff data in hydrology education to improve student learning. Data from a real-time continuous watershed monitoring station (flow, water quality, and weather) were integrated into a senior-level hydrology course at Virginia Tech (30 students) and two freshman-level introductory engineering courses at Virginia Western Community College (70 students) over a period of 3 years using student-centered modules. The goal was to assess student learning through active and collaborative learning modules that provide students with field and virtual laboratory experiences. A mixed-methods assessment revealed that student learning improved through modules that incorporated watershed data, and that students most valued working with real-world data and the ability to observe real-time environmental conditions. / Ph. D.
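To make the temporal-sampling question concrete, here is a small sketch on a synthetic one-day storm hydrograph (not the study's gage data): it subsamples the 1-minute record at coarser intervals and reports how the estimated event runoff volume drifts from the high-frequency reference.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-minute hydrograph with a single storm peak (placeholder data).
t = np.arange(0, 24 * 60)                          # minutes in one day
flow = 5 + 40 * np.exp(-((t - 300) / 90.0) ** 2)   # baseflow + storm peak (m^3/s)
flow += rng.normal(0, 0.5, size=t.size)            # measurement noise

true_volume = flow.sum() * 60.0                    # reference volume from 1-min data (m^3)

# Error in event volume when the hydrograph is sampled less frequently.
for dt_min in (5, 15, 30, 60):
    sub = flow[::dt_min]
    volume = sub.sum() * dt_min * 60.0
    error_pct = 100.0 * (volume - true_volume) / true_volume
    print(f"sampling every {dt_min:2d} min -> volume error {error_pct:+.2f}%")
```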
316

Regulatory and Economic Consequences of Empirical Uncertainty for Urban Stormwater Management

Aguilar, Marcus F. 10 October 2016 (has links)
The responsibility for mitigating the ecological effects of urban stormwater runoff has been delegated to local government authorities through the Clean Water Act's National Pollutant Discharge Elimination System Stormwater (NPDES SW) and Total Maximum Daily Load (TMDL) programs. These programs require that regulated entities reduce the discharge of pollutants from their storm drain systems to the "maximum extent practicable" (MEP), using a combination of structural and non-structural stormwater treatment, known as stormwater control measures (SCMs). The MEP regulatory paradigm acknowledges that there is empirical uncertainty regarding SCM pollutant reduction capacity, but holds that through monitoring, evaluation, and learning, this uncertainty can be reduced with time. The objective of this dissertation is to demonstrate the existing sources and magnitude of variability and uncertainty associated with the use of structural and non-structural SCMs toward the MEP goal, and to examine the extent to which the MEP paradigm of iterative implementation, monitoring, and learning is manifest in its current outcomes in Virginia. To do this, three research objectives were fulfilled. First, the non-structural SCMs employed in Virginia in response to the second phase of the NPDES SW program were catalogued, and the variability in what is considered a "compliant" stormwater program was evaluated. Next, the uncertainty of several commonly used stormwater flow measurement devices was quantified in the laboratory and field, and the importance of this uncertainty for regulatory compliance was discussed. Finally, the third research objective quantified the uncertainty associated with structural SCMs as a result of measurement error and environmental stochasticity. The impacts of this uncertainty are discussed in the context of the large number of structural SCMs prescribed in TMDL Implementation Plans. The outcomes of this dissertation emphasize the challenge that empirical uncertainty creates for cost-effective spending of local resources on flood control and water quality improvements while successfully complying with regulatory requirements. The MEP paradigm acknowledged this challenge, and while the findings of this dissertation confirm the flexibility of the MEP paradigm, they suggest that the magnitude of SCM implementation has outpaced the ability to measure and functionally define SCM pollutant removal performance. This gap between implementation, monitoring, and improvement is discussed, and several potential paths forward are suggested. / Ph. D.
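As a generic, hedged illustration of flow-measurement uncertainty (not the specific devices or coefficients evaluated in the dissertation), the sketch below propagates an assumed head-measurement error through a weir-type rating curve Q = C * H**n by Monte Carlo sampling.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical rating-curve coefficients and sensor error; all values are assumptions.
C, n_exp = 1.4, 2.5        # rating curve Q = C * H**n_exp (SI units)
H_true = 0.15              # true head above the weir crest (m)
sigma_H = 0.005            # assumed 5 mm standard error in the level sensor

H_samples = rng.normal(H_true, sigma_H, size=20_000)
Q_samples = C * np.clip(H_samples, 0.0, None) ** n_exp   # clip avoids negative heads

Q_nominal = C * H_true ** n_exp
relative_spread = Q_samples.std() / Q_nominal
print(f"nominal Q = {Q_nominal * 1000:.1f} L/s, relative uncertainty approx {100 * relative_spread:.1f}%")
```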
317

Risk-Aware Planning by Extracting Uncertainty from Deep Learning-Based Perception

Toubeh, Maymoonah I. 07 December 2018 (has links)
The integration of deep learning models and classical techniques in robotics is constantly creating solutions to problems once thought out of reach. The issues arising in most working models involve the gap between experimentation and reality, creating a need for strategies that assess the risk of applying different models in real-world, safety-critical situations. This work proposes the use of Bayesian approximations of uncertainty from deep learning in a robot planner, showing that this produces more cautious actions in safety-critical scenarios. The case study investigated is motivated by a setup in which an aerial robot acts as a "scout" for a ground robot when the area below is unknown or dangerous, with applications in space exploration, the military, or search-and-rescue. Images taken from the aerial view are used to provide a less obstructed map to guide the navigation of the robot on the ground. Experiments are conducted using deep learning semantic image segmentation, followed by a path planner based on the resulting cost map, to provide an empirical analysis of the proposed method. The method is analyzed to assess the impact of variations in the uncertainty extraction, as well as the absence of an uncertainty metric, on the overall system, using a defined factor that measures surprise to the planner. The analysis is performed on multiple datasets, showing a similar trend of lower surprise when uncertainty information is incorporated in the planning, given that threshold values of the hyperparameters in the uncertainty extraction have been met. / Master of Science / Deep learning (DL) is the phrase used to refer to the use of large hierarchical structures, often called neural networks, to approximate semantic information from data input of various forms. DL has shown superior performance at many tasks, such as several forms of image understanding, often referred to as computer vision problems. Deep learning techniques are trained using large amounts of data to map input data to output interpretations. The method should then perform correct input-output mappings on new data, different from the data it was trained on. Robots often carry various sensors from which it is possible to make interpretations about the environment. Inputs from a sensor can be high dimensional, such as pixels given by a camera, and processing these inputs can be quite tedious and inefficient for a human interpreter. Deep learning has recently been adopted by roboticists as a means of automatically interpreting and representing sensor inputs, like images. The issue that arises with the traditional use of deep learning is twofold: it forces an interpretation of the inputs even when an interpretation is not applicable, and it does not provide a measure of certainty with its outputs. Many techniques have been developed to address this issue, aiming to produce a measure of uncertainty associated with DL outputs, such that even when an incorrect or inapplicable output is produced, it is accompanied by a high level of uncertainty. To explore the efficacy and applicability of these uncertainty extraction techniques, this thesis looks at their use as applied to part of a robot planning system. Specifically, the input to the robot planner is an overhead image taken by an unmanned aerial vehicle (UAV), and the output is a path, from a set start and goal position, to be taken by an unmanned ground vehicle (UGV) below.
The image is passed through a deep learning portion of the system that performs what is called semantic segmentation, mapping each pixel of the image to a meaningful class. Based on the segmentation, each pixel is given a cost proportionate to the perceived level of safety associated with that class. A cost map is thus formed over the entire image, from which traditional robotics techniques are used to plan a path from start to goal. A comparison is performed between the risk-neutral case, which uses the conventional DL method, and the risk-aware case, which uses uncertainty information accompanying the modified DL technique. The overall effects on the robot system are assessed by observing a metric called the surprise factor, where a high surprise factor signifies a poor prediction of the actual cost associated with a path. The risk-neutral case is shown to have a higher surprise factor than the proposed risk-aware setup, both on average and in safety-critical case studies.
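The toy sketch below illustrates the risk-aware cost-map idea in the abstract above: random numbers stand in for multiple stochastic (e.g. dropout-enabled) forward passes of a segmentation network, per-pixel predictive entropy serves as the uncertainty measure, and uncertain pixels receive inflated traversal costs. The class costs and the entropy-based inflation rule are illustrative assumptions, not the thesis's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Pretend we ran T stochastic forward passes of a segmentation net over a
# small H x W image with K classes (random logits stand in for real outputs).
T, H, W, K = 20, 8, 8, 3
logits = rng.normal(size=(T, H, W, K))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)   # per-pass softmax

mean_p = probs.mean(axis=0)                            # MC-averaged class probabilities
pred = mean_p.argmax(axis=-1)                          # per-pixel class prediction
entropy = -(mean_p * np.log(mean_p + 1e-12)).sum(-1)   # predictive uncertainty per pixel

# Hypothetical per-class traversal costs (e.g. road, grass, obstacle).
class_cost = np.array([1.0, 2.0, 10.0])
risk_neutral_cost = class_cost[pred]

# Risk-aware variant: inflate cost where the network is uncertain
# (entropy normalized by its maximum, log K).
risk_aware_cost = risk_neutral_cost * (1.0 + entropy / np.log(K))

print(f"mean cell cost, risk-neutral: {risk_neutral_cost.mean():.2f}")
print(f"mean cell cost, risk-aware:   {risk_aware_cost.mean():.2f}")
```

Either cost map could then be fed to a standard planner (e.g. A*) to produce the UGV path; the risk-aware map simply steers the planner away from cells whose labels the network is unsure about.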
318

An Empirical Analysis of Environmental Uncertainty, Real Options Decision Patterns and Firm Performance

Boccia Jr., Alfred M. 01 September 2009 (has links)
Real options theory has become an influential explanatory and normative framework for making resource allocation decisions. Despite a growing body of strategy research regarding real options, however, there is as yet little empirical confirmation (1) that firm resource allocation behavior conforms with real options theory, or (2) that employing real options principles has a positive impact on firm performance. This research examines these questions. Using a survey instrument designed to measure a range of real options-theoretic decision patterns, data were collected from a sample of 173 U.S. manufacturing firms. This data set has been used to test two central premises.

The first premise is that, in contrast to much of the real options literature, there is no inherently superior real options decision pattern. Instead, real options-optimal investment decisions depend on the magnitude and source of the uncertainties that firms encounter in their task environments. This premise is tested by measuring two important sources of uncertainty in the external environment: uncertainty regarding the level and composition of demand (market uncertainty) and uncertainty regarding the intentions and actions of competitors (competitive uncertainty). I develop the theoretical foundation for expecting that patterns of real options behavior vary with these two sources of uncertainty, and that different sources of uncertainty frequently promote competing real options-theoretic decision behavior. The research tests these hypothesized relationships empirically. The principal contribution of this analysis has been to develop a more fine-grained appreciation of the relationship between real options theory and a multidimensional conceptualization of uncertainty.

The second premise of the research is that making investment decisions based on real options principles has a positive effect on firm performance. There is ample theoretical foundation for the superiority of real options theory as a framework for making resource commitment decisions. The research examines this expectation empirically by testing whether the fit, or congruence, between real options decision patterns and environmental uncertainty is positively related to firm profitability, market value, and growth.
319

Relating with the Supernatural in Living Subjectivity in Søren Kierkegaard (1813 – 1855) and Maurice Blondel (1861 – 1949):

Agbaw-Ebai, Maurice Ashley January 2021 (has links)
Thesis advisor: Jeffrey Bloechl / The question of how one must relate with God today opens the door to the dialectics regarding the necessity for the supernatural, for relationality presupposes a personalistic dimension of interaction, communication, and engagement, which tends to assume a being with which such interactions and engagements must proceed. But how can we talk about God in a way that is sensitive to the modern and contemporary sense of human autonomy, that is, in a manner that is not patronizing but rather flows from exigencies that the human condition and human data present to us? In other words, is there a possibility that the reality of human experience itself can offer us the unavoidable necessity of engaging the God-question? If yes, what path can such a necessary engagement with God take so that it does not appear confessional and is not thereby summarily dismissed by the non-religious, even before the case is made? In Kierkegaard and Blondel, I felt one could discern real possibilities for answering the question of both the necessity for the supernatural and how relationality emerges from such a necessity. With the Danish philosopher Kierkegaard, relationality emerges in the prioritizing of the singular individual over the collectivism of Danish Lutheranism. In Kierkegaard's reading of the state of things, a Christianity that had become identifiable with the reigning culture, with the zeitgeist, could no longer possess the transformative energies that must define and shape a relationship with Jesus Christ. In almost polemical tones, Kierkegaard writes: "When Christianity entered into the world, people were not Christians, and the difficulty was to become a Christian. Nowadays the difficulty in becoming a Christian is that one must cease to become a Christian." (Søren Kierkegaard, Provocations: Spiritual Writings of Kierkegaard, compiled and edited by Charles E. Moore (Walden, NY: Plough Publishing House, 2002), 211). In other words, the Christianity that was operational in Kierkegaard's day was, in his assessment, distant from the Christianity of the New Testament. And so, ceasing to become a Christian meant that one had to eschew the cultural Christianity of Christendom and return to New Testament Christianity, a return which was the only path capable of reinvigorating the Christian faith. In Kierkegaard's eyes, this diagnosis meant much more than lamenting Christianity's loss of fervor. It was indicative as well of the absence of a living relationship with God, for a faith that has lost its steam cannot bring about the intersubjectivity that ought to define religious practice, in that the individual was no longer eager to build an engaging and active relationship with the supernatural and to live out the demands of such a relationship, thanks to the help that comes from the supernatural. Kierkegaard attributes this diminishment of a living faith to Christianity's acquiescence to a mindset of levelling that had become commonplace in society, a flattening that resulted in the forfeiture of any sense of particularity that ought to characterize the religious phenomenon. In this light, the urgency of recovering the singular individual, in his or her subjectivity, who comes to the realization of human brokenness and the human need for forgiveness, emerges as the path to a rediscovery of the Christian élan in its beauty and transformational spirit.
The subject is unable to save the self from the absence of satiety that characterizes the life of sin, estrangement, and anxiety. It is the individual who must reform or be converted, becoming a Christian. It is the individual who must open the self to God, allowing the internal dispositions to be shaped. To become a Christian is to become a single individual, and no one can teach one how to become an individual. It is not something that can be communicated. It is something that can only be lived by one's self. Christianity must therefore be lived in and through personal expressions, for, in typical Kierkegaardian fashion, human existence does not happen in the abstract. Humans live and think in the concrete situations of their lives, and not in rational speculations and speculative systems, which, to follow the Kierkegaardian view of things, result in the vanishing of authentic individuality. But what is really wrong with generality that it elicits such a consistent objection from Kierkegaard? It would appear that the answer resides in his conviction that the demands of New Testament Christianity are such that every individual as individual had to take a stand, for or against the spiritual élan that was being proposed by the New Testament. Every individual had to take up his or her cross daily and follow Jesus [Lk 9:23]. Individuality, very much different from individualism, is therefore central to becoming a Christian. And to the extent that generality or Hegelian collectivism shielded the individual from this responsibility of becoming a Christian by simply jumping on the bandwagon of the whole, Kierkegaard became convinced that the path toward a revived Christian spirituality and existence had to start with asserting the place of the singular individual over and even against the collective. And it is at this point that the question of the necessity and inescapability of the supernatural appears in bolder focus for Kierkegaard, in that, having ascertained the superficiality of the Church, the state, and the communal that has swallowed up the individual, thoughts of any possible spiritual rebirth bring into sharper focus the dialectics of the relationship between the individual and God. The state of estrangement from God is the state of non-being, of the absence of fulfillment. With humble acceptance of God's offer of forgiveness comes the rescue from the abyss of broken subjectivity. This rescue by God only takes place when the subject, having come to terms with his or her internal discord, consents to entrust the self into the hands of God by a leap of faith. This leap implies that I give up my ideals of what my life ought to be, embracing an unknown journey of faith, always conscious that God will be faithful to God's providential promises to me as a believer, just as he was to Abraham as recounted in the book of Genesis. In this sense, a new life of freedom is born. From my living relationality with God, I experience God's forgiveness. From my living relationality with God, I experience an unknown freedom. And from forgiveness and freedom come an unknown contentment, fulfillment, and happiness. Summarily, for Kierkegaard, living relationality with God is realizable through the acceptance of my brokenness in a spirit of humility and faith. For Blondel, on the other hand, God's forgiveness, faith, freedom, contentment, and living relationality with the Supernatural emerge in the unfolding of the phenomenon of human action.
He captures the essence of his philosophical undertaking in the famous opening lines of L'Action (1893): "Yes or no, does human life make sense, and does man have a destiny?" (Maurice Blondel, L'Action (1893): Essay on a Critique of Life and a Science of Practice, trans. Oliva Blanchette (Notre Dame, Indiana: University of Notre Dame Press, 2007), vii). In other words, how do I come to act in my life as a conscientious human being, in terms of my own existence here and now? It is by way of responding to this question that Blondel settles on action as the defining reality that explains who the human being is, for to be human is to act; the human condition is one of the necessity to act. As a human being, I am an acting person, and I can only be known when I act. Accordingly, it is thanks to my actions that my humanity manifests itself and makes me accessible to others. And this is the justification for why action cannot be peripheral to philosophy, if philosophy has to study the question of what it means to be human and the ultimate destiny of human existence. In effect, to study who the human being is, is to study human action, for one's person becomes translucent thanks to the way one acts. For the French philosopher, human action is always seeking fulfillment. Moving through concentric circles from family to immediate community and nation, action is understood not as a specific activity but as an unfolding reservoir of human willing, which continues to demand more. There is a wedge between the willing will and the willed will: a perfect fit never happens between the human being's ever-continuing desire and realized human action. And antecedent to the attitude before the supernatural lies the whole dynamic of human choosing, upon which rests the resolution of the impasse between finitude and infinitude that is characteristic of human existence, as it has emerged in the phenomenon of human action. This impasse between the willing and willed wills must be resolved, for two questions hang on it: first, whether human life makes sense; second, whether the human being has a destiny. These two questions make it impossible to offer a negative solution to the impasse that faces human willing and choosing. A burden is thus imposed on human beings, from which escape is existentially impossible. Dilettantism is not an option. And if human willing is unable to resolve the impasse between an ever-yearning for more and the concrete acts that never match it, then there appears in the phenomenon of human action what Blondel calls the one thing necessary. This one thing necessary is the supernatural. This is the Being that comes from outside human action to rescue the human being. At this point, philosophy has played its role in helping to navigate the uncertain seas of human action, showing the way to what is needed if action is not to be aborted. But philosophy, though it has raised the problem, cannot offer the solution. The rescuing of human action and, by extension, of the human being from the existential impossibility of a crushing human-only self-understanding is an offer that must now be articulated by religion. Herein appears a question for every human being, a question that emerges from the human quest for satiety: to be God with and through God, or to be God without and against God? Living relationality for Blondel suggests that the former is the most fitting response, for all attempts at the latter, as shown in the evolution of the phenomenon of action, have proven futile.
Summarily, for both Kierkegaard and Blondel, living relationality with the supernatural is, in the final analysis, a rescuing of the human being from the temptation of a human autonomy fashioned in a way that excludes God. Both Kierkegaard and Blondel clearly do not envisage that the question of the meaning of human life, fulfillment, contentment, and destiny can be resolved without or against God. Of crucial importance, likewise, is the realization that every human being is invited to take a stand on the question of whether human life can find contentment away from the supernatural, hence the necessity of the subjective in the philosophical landscape of religious existentialism. By demonstrating from the absence of satiety in human life (Kierkegaard) and from the impasse that emerges in the unfolding of human action (Blondel) that the supernatural is necessary to the realization of human fulfillment, Kierkegaard and Blondel emerge as necessary interlocutors for contemporary men and women in their pursuit of happiness, placing us in their debt regarding the specific question of the human search for meaning and fulfillment. / Thesis (PhD) — Boston College, 2021. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Philosophy.
320

Uncertainty Quantification in Neural Network-Based Classification Models

Amiri, Mohammad Hadi 10 January 2023 (has links)
Probabilistic behavior in perceiving the environment and making critical decisions plays an inevitable role in human life. A decision is concerned with a choice among the available alternatives and is always subject to unknown elements concerning the future. The lack of complete data; insufficient scientific, behavioral, and industrial development; and, of course, defects in measurement methods all affect the reliability of an action's outcome. Thus, having a proper estimate of this reliability, or uncertainty, can be very advantageous, particularly when an individual, or more generally a subject, is faced with high risk. Because there are always uncertainty elements whose values are unknown and which enter a process through multiple sources, it has been a primary challenge to design an efficient, objective representation of confidence. With the aim of addressing this problem, a variety of studies have been conducted to introduce frameworks in the metrology of uncertainty quantification that are comprehensive enough to transfer across different areas. It is also challenging to define a proper index that reflects more aspects of the problem and of the measurement process. With the significant advances in artificial intelligence over the past decade, one of the key elements in easing human life by giving more control to machines is to heed the uncertainty estimate attached to a prediction. With a focus on measurement aspects, this thesis demonstrates how the choice of measurement index affects the quality of the evaluated predictive uncertainty of neural networks. Finally, we propose a novel index that yields uncertainty values of the same or higher quality than existing methods, which emphasizes the benefits of having a proper measurement index in managing the risk of the outcome of a classification model.
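For context, one standard, generic index of predictive-uncertainty quality is the expected calibration error (ECE); the sketch below computes it on synthetic, slightly over-confident classifier outputs. It illustrates the kind of measurement index under discussion and is not the novel index proposed in the thesis.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Gap between reported confidence and observed accuracy, averaged over bins."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap
    return ece

# Synthetic outputs from a slightly over-confident classifier (placeholder data).
rng = np.random.default_rng(5)
confidence = rng.uniform(0.5, 1.0, size=5000)                        # reported confidence
is_correct = rng.uniform(size=5000) < np.clip(confidence - 0.1, 0.0, 1.0)

print(f"ECE = {expected_calibration_error(confidence, is_correct.astype(float), n_bins=15):.3f}")
```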
