  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

The Implementation of Sensory and Intelligible Elements in the Design Process

Bonnet, Cody January 2016
Sustainable Built Environments Senior Capstone Project / This project covers the implementation of sensory and intelligible elements in the design process. These elements pertain mainly to the human experience and to how design professionals can translate them into functional concepts. Incorporating sensory and intelligible principles is a relatively new premise in the realm of design, and as such there are many opportunities for the field to expand. Because of uncertainties in this field, such as the cost of production as well as the subjective nature of the information, there are few examples of design professionals using these concepts to their full potential. The research methodology is primarily a qualitative analysis, examining precedents and examples of sensory elements as well as their functional applications in the professional world. Significant findings of this research counter the perceived subjectivity of the field, as there are proven benefits to incorporating these elements in the design process: more memorable spaces, improved mood among participants, and safer spaces. Keywords: Intelligible, Kevin Lynch, sensory, Christopher Alexander, human experience, design process
2

La connaissance chez Jean Duns Scot : le rôle de la nature commune et la position de l'auteur dans la querelle des Universaux / Knowledge in John Duns Scotus: The Role of the Common Nature and the Author's Position in the Quarrel over Universals

Racette, Sylvain January 2008
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
3

Le bien intelligible de Platon et le principe de non-contradiction chez Aristote en tant que « anhypothétiques » / Plato's Intelligible Good and Aristotle's Principle of Non-Contradiction as "Anhypothetical"

Racine, Félix 10 1900
Le présent mémoire entend étudier le concept de l’anhypothétique dans l’oeuvre de Platon et d’Aristote. Plus exactement, nous examinons le Bien intelligible chez Platon ainsi que le principe de non-contradiction chez Aristote. Bien que ce sont des principes qui adoptent la même conceptualisation, ils ont chacun un contexte et des facteurs qui leur sont spécifiques. Cela s’explique notamment par le fait que les philosophes ont des divergences doctrinales par rapport à l’être, la théorie des formes, la dialectique, et cetera. Un tour d’horizon des philosophies respectives ainsi qu’une comparaison systématique des principes nous aideront à y voir plus clair. De plus, nous examinerons en quoi Aristote est redevable à Platon pour l’adoption du principe de non-contradiction. Cela dit, les deux principes à l’étude ont comme caractéristique fondamentale de ne pas être des hypothèses, c’est-à-dire qu’ils ne peuvent souffrir du statut de contingence qui est typique de l’hypothèse. En effet, l’anhypothétique cherche à neutraliser son alternative afin de s’ériger en tant que nécessité catégorique. Les raisons d’une telle nécessité s’expliquent entre autres parce que l’anhypothétique est condition de possibilité pour la connaissance des êtres. La potentialité de sa négation est donc nettement problématique sur le plan scientifique et ontologique, d’où son statut particulièrement éminent pour les philosophes. / This dissertation examines the concept of the anhypothetical in the works of Plato and Aristotle. More precisely, we examine the Intelligible Good in Plato and the principle of non-contradiction in Aristotle. Although these principles adopt the same conceptualization, they each have their own specific context and factors. This is because the philosophers have doctrinal differences with regard to being, the theory of forms, dialectics, etcetera. An overview of the respective philosophies, together with a systematic comparison of principles, will help us to get a clearer picture. We'll also examine how Aristotle is indebted to Plato for the adoption of the principle of non-contradiction. That said, the two principles under consideration have the fundamental characteristic of not being hypotheses, i.e. they cannot suffer from the status of contingency that is typical of hypothesis. Indeed, the anhypothetical seeks to neutralize its alternative in order to set itself up as a categorical necessity. The reasons for such a necessity include the fact that the anhypothetic is a condition of possibility for the knowledge of beings. The potentiality of its negation is therefore clearly problematic on a scientific and ontological level, hence its particularly eminent status for philosophers.
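For reference, the principle of non-contradiction at issue is commonly rendered in modern notation (a notation neither Plato nor Aristotle uses, offered here only as an aid) as:

    \neg(\varphi \land \neg\varphi)

that is, a statement and its negation cannot both hold, which is why, on the reading sketched above, its denial cannot be entertained even as a hypothesis.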
4

Actionable Visualization of Higher Dimensional Dynamical Processes

Pappu, Sravan Kumar 20 May 2011
Analyzing modern information systems that produce enormous volumes of multi-dimensional data, in the form of logs, traces, or events that unfold over time, can be tedious without adequate visualization, which motivates the need for an intelligible visualization. This thesis researched and developed a visualization framework that represents multi-dimensional, dynamic, and temporal process data in a potentially intelligible and actionable form. A prototype showing four different views, built from notional malware data abstracted from Normal Sandbox behavioral traces, was developed; in particular, the B-matrix view represents the DLL files used by the malware to attack a system. This representation aims to visualize large data sets without losing emphasis on the process unfolding over multiple dimensions.
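As a hedged illustration only (the thesis's actual framework, data formats, and the precise semantics of its B-matrix view are not described here; the build_b_matrix helper and the sample trace are hypothetical), one way to derive a time-step-by-DLL matrix from a behavioral trace looks like this:

    from collections import OrderedDict

    def build_b_matrix(events):
        """Build a binary time-step-by-DLL matrix from (time_step, dll_name) trace events."""
        times, dlls = OrderedDict(), OrderedDict()
        for t, dll in events:
            times.setdefault(t, len(times))   # row index per time step, in order of appearance
            dlls.setdefault(dll, len(dlls))   # column index per DLL, in order of first load
        matrix = [[0] * len(dlls) for _ in times]
        for t, dll in events:
            matrix[times[t]][dlls[dll]] = 1
        return list(dlls), matrix

    # Invented trace of a sample touching three DLLs over three time steps.
    trace = [(0, "kernel32.dll"), (0, "ws2_32.dll"), (1, "ws2_32.dll"), (2, "advapi32.dll")]
    columns, b_matrix = build_b_matrix(trace)
    print(columns)
    for row in b_matrix:
        print(" ".join("#" if cell else "." for cell in row))   # crude textual heat map

Each row then becomes one time slice of a heat-map-style rendering, which keeps the temporal unfolding of the process visible even for large traces.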
5

A New Approach to the Synthesis of Fuzzy Systems from Input-Output Data

Garriga Berga, Carles 07 October 2005
Fuzzy logic has been applied successfully to systems modeling for decades. One of its main advantages is that it provides an understandable knowledge representation. Nevertheless, most investigations have focused on achieving accurate models and, in doing so, have neglected the linguistic capabilities of fuzzy logic. This thesis investigates the issues surrounding intelligible fuzzy models: although fuzzy logic has been shown to yield models that are optimal in terms of error (a fuzzy model is in fact a universal approximator), only a few investigators have focused on achieving truly intelligible models, even at the cost of some accuracy. In this work we propose a complete methodology for finding an intelligible fuzzy model in a local manner (rule by rule) from input-output data. In this sense we find the number and position of the necessary fuzzy sets as well as the linguistic rules related to them. For this purpose we have developed a hierarchical process comprising several steps and techniques, some of which are original contributions. The resulting method is simple and itself intelligible; it therefore produces the final models at low computational cost and, furthermore, allows its different options to be tuned depending on the nature of the problem and the characteristics of the users. In this thesis we explain the whole methodology and illustrate its advantages (but also its problems) with several examples, most of which are benchmarks.
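The abstract does not spell out the algorithm, so the following is only a minimal sketch of the ingredients it names, namely fuzzy sets with a number and position over the input range and linguistic rules tied to them; the triangular set positions, the rule consequents, and the infer helper are invented for illustration:

    def triangle(x, a, b, c):
        """Triangular membership function with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Invented fuzzy sets over an input in [0, 10], one linguistic rule per set:
    # "if x is LOW then y is 1", "if x is MEDIUM then y is 5", "if x is HIGH then y is 9".
    sets = {"LOW": (-5.0, 0.0, 5.0), "MEDIUM": (0.0, 5.0, 10.0), "HIGH": (5.0, 10.0, 15.0)}
    rule_outputs = {"LOW": 1.0, "MEDIUM": 5.0, "HIGH": 9.0}

    def infer(x):
        """Evaluate the rule base rule by rule and combine by a weighted average."""
        degrees = {name: triangle(x, *abc) for name, abc in sets.items()}
        total = sum(degrees.values())
        return sum(degrees[n] * rule_outputs[n] for n in sets) / total if total else None

    print(infer(2.5))  # 3.0: halfway between the LOW and MEDIUM consequents

A rule-by-rule (local) construction of this kind is what keeps the resulting model readable: each rule can be stated as a short linguistic sentence.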
6

Social work discourses and the social work interview

Stenson, Kevin January 1989
It will be argued that, in order to understand particular exchanges between social workers and clients, it is essential to go beyond the view that sees them simply in terms of interaction between unique persons, and locate them within the wider discursive settings within which they occur. Most of the talk which takes place in these interviews concerns problematic issues within family life, particularly in terms of the relationships between parents and children. Behind these apparently mundane conversations lie agendas of social work issues which have been constructed historically with the rise of the caring professions. The early part of the thesis is concerned with uncovering the historically constructed norms of acceptable motherhood which underpin social work strategies with families and which help set the agendas of interviews. Then the analysis focuses on how general norms and objectives are translated into operational, professional techniques. This theme is carried forward through a focus on the social settings in which interviews take place, the building up of subject positions within interviews, for social worker and client, and the implications of translating from a predominantly oral to a literate based, professional mode of discourse. Finally, the analysis is concerned with the tentative attempts, marked by ambiguity and resistance, to go beyond the mere monitoring of the life of the client, and draw her/him into a form of discourse which is openly committed to social work aims, where the client seems to want to present his or her life problems in terms which are intelligible to, and manageable within, the strategies open to the social worker.
7

Time as a Policy Mechanism and Intelligible Principle: An Examination of the National Emergencies Act

Tull, Justin Wayne 05 June 2023
The effective oversight and management of national emergencies are critical to preserving democratic processes and norms. Congress passed the National Emergencies Act (NEA) of 1976 to regulate the open-ended and unchecked implementation of emergency authorities by the president. Notwithstanding the NEA's objectives, the number and duration of national emergencies are proliferating. National emergencies evoke a sense of urgency that results in exceptional governance procedures and alters official and public perceptions. However, national emergencies declared under the NEA rarely reflect the definition of urgency and endure for years, indicating potential oversight failures and a re-emergence of the president's unchecked use of emergency power. Concerns arise that a national emergency shifts legislative power to the executive, making government policy less democratic. The national scope of these emergencies also portends the potential for harm to a broad population. Ambiguous judicial and legislative instructions, presidential aspirations of demonstrating leadership, and congressional blame avoidance further complicate the governance of national emergencies. This research conceptualizes time as the intelligible principle that Congress used to meet the judicial requirements for delegating functional responsibilities to the executive branch while retaining constitutional obligations and maintaining oversight of executive action. Sequences, deadlines, and repetition are temporal mechanisms that help regulate government action and moderate authorities. Understanding how temporal policy mechanisms affect the use of emergency authority, shape government interaction, and adjust accountability is particularly important as the United States confronts a hyper-partisan environment and demands to confront new issues as national emergencies intensify. Employing a policy tracing methodology augmented by survival and qualitative comparative analysis, this dissertation analyzes national emergency data composed of declarations, continuations, amendments, and terminations. The analysis incorporates Supreme Court decisions, budgetary impact statements, and Federal Register data to track and evaluate national emergencies declared via presidential proclamation and executive order. The ensuing model delineates the properties of the national emergencies declared under the NEA and clarifies relational factors contributing to temporal variation amongst emergency declarations. The resulting clarity contributes to scholarly and governmental use of temporal policy mechanisms—particularly sequences, deadlines, and repetition—and offers recommendations for enhancing the oversight of U.S. national emergencies. / Doctor of Philosophy / The oversight and management of national emergencies are crucial for protecting democratic processes and norms. In 1976, Congress passed the National Emergencies Act (NEA) to prevent the president from using unconstrained emergency powers. However, the NEA has not been successful in controlling the frequency and duration of national emergencies. During a national emergency, a sense of urgency generally leads to exceptional governance procedures and changes how people perceive governance situations. This research examined national emergency declarations, continuations, amendments, and terminations to understand how the NEA governs emergencies and what principles guide it. 
The findings show that national emergencies declared under the NEA receive limited congressional oversight and are increasingly influenced by politics. The lack of clear instructions from the judiciary and the legislature, the president's desire to display leadership, and Congress' tendency to avoid blame further complicate the governance of national emergencies, allowing them to last for many years without proper oversight. To address the weaknesses in the NEA and improve the handling of national emergencies, this dissertation proposes the concept of temporal policy mechanisms, which use time as a guiding principle to delegate emergency authority and ensure accountability. Examples of temporal mechanisms include sequences, deadlines, and repetition, which regulate government actions and moderate authorities. The analysis also highlights origination bias, where Congress sets rules for others but fails to follow its own processes. By implementing transparent temporal policy mechanisms and reducing the sense of urgency during prolonged national emergencies, accountability and transparency can be enhanced, thereby upholding U.S. constitutional principles and benefiting the citizenry.
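As a hedged sketch of the duration bookkeeping implied by such an analysis (the declarations below are invented placeholders rather than the dissertation's Federal Register data, and a full survival analysis would model censoring formally), one can compute how long each declared emergency has persisted, treating still-active declarations as censored at an observation date:

    from datetime import date

    # Invented placeholder declarations: (label, declared, terminated or None if still active).
    emergencies = [
        ("Emergency A", date(1995, 1, 15), None),                 # ongoing -> censored
        ("Emergency B", date(2005, 6, 1), date(2012, 3, 1)),
        ("Emergency C", date(2019, 2, 15), None),
    ]

    as_of = date(2023, 6, 5)  # observation cutoff for still-active declarations
    durations = []
    for label, declared, terminated in emergencies:
        days = ((terminated or as_of) - declared).days
        durations.append(days)
        status = "terminated" if terminated else "ongoing (censored)"
        print(f"{label}: {days / 365.25:.1f} years, {status}")

    print(f"median observed duration: {sorted(durations)[len(durations) // 2] / 365.25:.1f} years")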
8

無形資產投入量與企業報酬及風險之研究 / Research on Enterprise Return, Risk, and Input in Intangible Assets

薛健宏, Hsueh, Chien-Hung Unknown Date
本研究旨在探討無形資產的報酬與風險。隨著知識經濟的來臨，無形資產愈發重要，然而，無形資產風險的探討卻極為有限。Barsky and Marchant (2000)明白指出，「無形資產」就是「智慧資本」。基於吳安妮﹙2004﹚、Kaplan and Norton﹙2004﹚的架構，「平衡計分卡」的三個非財務構面對應著「各無形資產項目」，即「顧客構面」對應著「關係資本」，「內部流程構面」對應著「創新資本」，「學習與成長構面」對應著「人力資本」。本研究參酌廖俊杰﹙2003﹚的設計，進一步將各類無形資產細分為投入與產出因子。 關於報酬與風險間的關連性，實證顯示，風險與報酬間呈顯著負向關係，與「資本資產訂價模式」有所不同。本研究依報酬水準分組，結果顯示，高報酬樣本的報酬、風險呈正向關係，而兩者在低報酬樣本呈負向關係。本研究以台灣近期的資料為主，而近來台灣景氣不佳，故整體而言，報酬、風險呈負向關係。 結果亦顯示，各無形資產(關係資本、創新資本、人力資本)的投入量有助於同類無形資產的產出，而各無形資產產出亦對企業績效具正面影響，驗證了無形資產的經濟價值。此外，實證結果亦支持「平衡計分卡」的論點，即「人力資本產出」對「創新資本產出」具正面效益，「創新資本產出」有利於「關係資本產出」，且「關係資本產出」也能提升「企業報酬」。此外，本研究亦發現，高報酬樣本僅研發支出的經濟效益顯著大於低報酬樣本，這意味著，研發密度係影響企業報酬水準的關鍵因素。 風險即不確定性，結果顯示，各無形資產對企業風險未呈一致性的結果，「創新資本產出」會增加企業風險，而「關係資本、人力資本產出」均可減輕公司的不確定性。在產出指標中，唯有「創新資本產出」(即專利)具高度的不確定性，其他無形資產產出皆基於「已確定的財務結果」或「已成事實的員工流動狀況」，故唯有創新資本產出與企業風險呈正向關係。在「高報酬、低風險」的環境下，關係資本、人力資本產出均與「企業風險」呈負向關係。 此外，所有「無形資產投入」(包括關係資本、創新資本、人力資本投入)均直接與「企業風險」均呈負向關係。「關係資本投入」具邊際效益遞減的特性，以致企業風險趨緩。再者，實證顯示，「研發支出」與「創新資本產出風險」間具正相關，「創新資本產出風險」與「企業風險」成凸向函數關係。這意味著，當企業不從事研發工作，未申請專利，即使該企業不具「創新資本產出風險」，但這類企業較不具競爭能力，面對生存危機，該企業的整體風險反而較高。換句話說，創新活動最終能減少企業的整體風險。就人力資本而言，除年資外，人力資本投入可增加公司報酬，減少企業風險，無怪乎常言「員工是企業最重要的資產」。 財報資料系基於一般公認會計原編製，與市場認知恐有所不同，本研究測試兩者的風險認知是否存在重大差異。結果顯示，無形資產投入的「市場風險」大多低於「會計風險」，並未發現「無形資產投入成果」因資訊不對稱之疑慮，而增加企業風險。 / This study examines the return and risk of intangible assets. With the coming of the knowledge economy, intangible assets are increasingly important, yet the literature on the risk of intangible assets is scarce. Barsky and Marchant (2000) pointed out that “intangible assets” are “intelligible capital”. Based on the framework of Kaplan and Norton (2004) and Wu (2004), the three non-financial perspectives of the balanced scorecard correspond to the categories of intangible assets: the customer perspective to relationship capital, the internal-process perspective to innovation capital, and the learning-and-growth perspective to human capital. Following Liao (2003), each category of intangible assets is further divided into input and output factors. Regarding the relationship between return and risk, the empirical results show a significant negative relation, inconsistent with the CAPM. When the sample is grouped by level of return, return and risk are positively related in the high-return group and negatively related in the low-return group. The sample consists mainly of recent Taiwanese data from a period of poor economic conditions, so the overall relationship between return and risk is negative. The results also show that the input to each intangible asset (relationship, innovation, and human capital) raises the output of the same capital, and that each intangible-asset output in turn improves enterprise performance, supporting the economic value of intangible assets. The empirical results further support the logic of the balanced scorecard: human-capital output benefits innovation-capital output, innovation-capital output benefits relationship-capital output, and relationship-capital output raises enterprise return. In addition, R&D expenditure is the only item whose economic benefit is significantly larger in the high-return group than in the low-return group, implying that R&D intensity is a key determinant of the level of enterprise return. With respect to risk, the results are not consistent across intangible assets: innovation-capital output increases enterprise risk, whereas relationship-capital and human-capital outputs reduce the firm's uncertainty.
Among the output indicators, only innovation-capital output, measured by patents, involves a high degree of uncertainty; the outputs of the other capitals are based on confirmed financial results or employee turnover that has already occurred. Innovation-capital output is therefore the only output positively related to enterprise risk, and in a “high-return, low-risk” environment the outputs of relationship capital and human capital are both negatively related to enterprise risk. In addition, all intangible-asset inputs, including inputs to relationship, innovation, and human capital, are directly and negatively related to enterprise risk. Relationship-capital input exhibits diminishing marginal benefits, which moderates enterprise risk. Moreover, the results show a positive association between R&D expenditure and the risk of innovation-capital output, and a convex relationship between enterprise risk and the risk of innovation-capital output. This implies that a firm that does no R&D and applies for no patents bears no innovation-capital output risk, but such a firm is less competitive and faces a serious survival threat, so its overall risk is ultimately higher. In other words, innovation activity ultimately reduces a firm's overall risk. With respect to human capital, except for employees' years of service, human-capital inputs increase enterprise return and reduce enterprise risk, which is consistent with the common saying that “employees are a company's most important asset.” Finally, financial reports are prepared under generally accepted accounting principles, whose view of risk may differ from the market's perception, so this study tests whether the two perceptions of risk differ materially. The results show that the market risk of most intangible-asset inputs is lower than the accounting risk, and no additional enterprise risk arising from information-asymmetry concerns about intangible-asset inputs is found.
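A hedged sketch of the basic return-and-risk measurement behind such comparisons (the firm return series below are fabricated and the variable names invented; the study itself uses Taiwanese firm data and many more variables): take the mean of each firm's returns as its return and the standard deviation as its risk, then compare the two across a median split on return.

    from statistics import mean, pstdev

    # Fabricated yearly returns for four firms (the study itself uses Taiwanese firm data).
    returns = {
        "FirmA": [0.12, 0.18, 0.09, 0.15],
        "FirmB": [0.02, -0.04, 0.05, 0.01],
        "FirmC": [0.20, 0.10, 0.25, 0.17],
        "FirmD": [-0.03, 0.04, -0.01, 0.02],
    }

    stats = {f: (mean(r), pstdev(r)) for f, r in returns.items()}   # (return, risk) per firm
    cutoff = sorted(m for m, _ in stats.values())[len(stats) // 2]  # median split on return

    for label, in_group in (("high-return", lambda m: m >= cutoff), ("low-return", lambda m: m < cutoff)):
        group = [(m, s) for m, s in stats.values() if in_group(m)]
        print(f"{label}: mean return {mean(m for m, _ in group):.3f}, mean risk {mean(s for _, s in group):.3f}")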
9

English Language Learners' Perspectives of the Communicative Language Approach

Barnes-Hawkins, Colonda LaToya 01 January 2016
The communicative language approach (CLA) dominates pedagogical practice in second language acquisition classrooms in the US. However, this approach does not emphasize independent pronunciation instruction, leaving learners to improve pronunciation on their own. This study explored the perspectives of English language learners (ELLs) being instructed via the CLA regarding the effectiveness of the CLA in providing intelligible pronunciation skills. The intelligibility principle of language served as the theoretical foundation underlying this study guided by research questions addressing how well the CLA met ELLs' pronunciation intelligibility needs and their perspectives on receiving independent pronunciation instruction to meet these needs. Using qualitative case study methods, the research questions were addressed through an analysis of interviews of 10 community college ELL adult volunteers who received instruction using the CLA as current or former students in the intensive English program, had linguistic skill levels ranging from beginner to advanced, and were graduates of U.S. schools. A typological analysis model was followed where the data were organized by themes, patterns, and identified relationships. Participants reported wanting to improve their pronunciation and that their pronunciation had improved with the CLA instructional strategies. Although all participants desired to receive some independent instruction in pronunciation, their preferred instructional modes differed. It is recommended that ELLs' perspectives be heard and that English as a Second Language educators instruct with the CLA while also providing explicit pronunciation instruction. The results of this study indicating student satisfaction with the CLA may elicit positive social change within the ELL community by providing a voice to ELLs.
