841

Lateral Load Analysis of Shear Wall-Frame Structures

Akis, Tolga 01 January 2004
The purpose of this study is to model and analyze the nonplanar shear wall assemblies of shear wall-frame structures. Two three-dimensional models, for open and closed section shear wall assemblies, are developed. These models are based on the conventional wide column analogy, in which a planar shear wall is replaced by an idealized frame consisting of a column and rigid beams located at floor levels. The rigid diaphragm floor assumption, which is widely used in the analysis of multistorey building structures, is also taken into consideration. In the model proposed for open section shear walls, the connections of the rigid beams are released against torsion. For modelling closed section shear walls, the torsional stiffness of the wide columns is additionally adjusted by using a series of equations. Several shear wall-frame systems having different shapes of nonplanar shear wall assemblies are analyzed by the static lateral load, response spectrum and time history methods using the proposed models. The results of these analyses are compared with those obtained by common shear wall modelling techniques.
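The wide column analogy the models build on is straightforward to state concretely. Below is a minimal sketch, assuming a rectangular wall segment of uniform thickness; the function name is illustrative, and the thin-walled torsion constant shown is a textbook approximation, not the thesis's adjusted torsional-stiffness equations.

```python
def wide_column_properties(length, thickness):
    """Section properties of the idealized column replacing a planar wall segment.

    Assumes a solid rectangular section; the torsion constant uses the
    open thin-walled approximation J = L*t^3/3 (an assumption here; the
    thesis adjusts torsional stiffness with its own set of equations).
    """
    A = length * thickness                  # axial area
    I_major = thickness * length**3 / 12.0  # in-plane bending inertia
    I_minor = length * thickness**3 / 12.0  # out-of-plane bending inertia
    J = length * thickness**3 / 3.0         # open thin-walled torsion constant
    return A, I_major, I_minor, J

# The column sits on the wall's centroidal axis; rigid beams of length L/2
# extend from it to the wall edges at every floor level, so adjoining frame
# members connect at the true wall boundary rather than at its centreline.
L, t = 4.0, 0.25                            # wall segment: 4 m long, 250 mm thick
A, I1, I2, J = wide_column_properties(L, t)
rigid_arm = L / 2.0
print(A, I1, I2, J, rigid_arm)
```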
842

Wide Baseline Stereo Image Rectification and Matching

Hao, Wei 01 December 2011
Perception of depth information is central to three-dimensional (3D) vision problems. Stereopsis is an important passive vision technique for depth perception. Wide baseline stereo is a challenging problem that has recently attracted much interest from both the theoretical and application perspectives. In this research we approach the problem of wide baseline stereo using the geometric and structural constraints within feature sets. The major contribution of this dissertation is a paradigm, proposed and implemented here, that handles the challenges introduced by perspective distortion in wide baseline stereo more efficiently than the state of the art. To support this paradigm, a new feature-matching algorithm that extends state-of-the-art matching methods to larger-baseline cases is proposed. The proposed matching algorithm takes advantage of both the local feature descriptor and the structural pattern of the feature set, and improves matching results under large viewpoint changes. In addition, an innovative rectification method for uncalibrated images is proposed to make dense matching in wide baseline stereo possible. Existing rectification methods do not take into account the need for shape adjustment. By introducing geometric constraints on the pattern of the feature points, we propose a rectification method that maximizes structure congruency based on Delaunay triangulation nets and thus avoids problems that affect other methods. The rectified stereo images can then be used to generate a dense depth map of the scene. The task is much simplified compared with existing methods because the 2D search problem is reduced to a 1D search. To validate the proposed methods, real-world images are used to test performance, and comparisons with state-of-the-art methods are provided. The performance of the dense matching with respect to changing baseline is also studied.
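For contrast, here is a minimal sketch of the standard uncalibrated two-view pipeline that the dissertation builds on and improves: SIFT matching with a ratio test, RANSAC estimation of the fundamental matrix, and rectification via OpenCV. This is not the structure-congruency method proposed above; the file names are placeholders and the ratio-test threshold is a conventional assumption.

```python
import cv2
import numpy as np

img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Local feature matching: SIFT descriptors with Lowe's ratio test
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Epipolar geometry from the putative matches, robust to outliers via RANSAC
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
inl1 = pts1[mask.ravel() == 1]
inl2 = pts2[mask.ravel() == 1]

# Uncalibrated rectification: homographies that make epipolar lines horizontal,
# reducing dense matching from a 2D search to a 1D search along scanlines
h, w = img1.shape
ok, H1, H2 = cv2.stereoRectifyUncalibrated(inl1, inl2, F, (w, h))
rect1 = cv2.warpPerspective(img1, H1, (w, h))
rect2 = cv2.warpPerspective(img2, H2, (w, h))
```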
843

Plant-wide Performance Monitoring and Controller Prioritization

Pareek, Samidh 06 1900
Plant-wide performance monitoring has generated a lot of interest in the control engineering community. The idea is to judge the performance of a plant as a whole rather than looking at the performance of individual controllers. Data-based methods are currently used to generate a variety of statistical performance indices to help judge the performance of production units and control assets. However, so much information can be overwhelming when it lacks precision. Powerful computing and data storage capabilities have enabled industries to store huge amounts of data. Commercial performance monitoring software, such as the packages available from vendors like Honeywell, Matrikon and ExperTune, typically uses this data to generate huge amounts of information. The problem of data overload has in this way turned into a problem of information overload. This work focuses on developing methods that reconcile these various statistical measures of performance and generate useful diagnostic measures in order to optimize the process performance of a unit/plant. These methods are also able to identify the relative importance of controllers in terms of how they affect the performance of the unit/plant under consideration. / Process Control
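As one concrete example of the kind of statistical index such methods reconcile, the sketch below estimates the Harris (minimum-variance) index, a standard controller-performance benchmark: it fits an AR model to the controlled output, inverts it to moving-average form, and treats the first d (time-delay) impulse-response terms as feedback-invariant. The function name, AR order and estimation details are illustrative assumptions, not the specific indices used in this work.

```python
import numpy as np

def harris_index(y, delay, ar_order=20):
    """Estimate the Harris (minimum-variance) index of a controlled output.

    A value near 1 means the loop is close to minimum-variance performance;
    a value near 0 flags a controller worth prioritizing for retuning.
    """
    y = np.asarray(y, dtype=float) - np.mean(y)
    N = len(y)
    # Fit an AR(p) model  y_t = phi_1 y_{t-1} + ... + phi_p y_{t-p} + a_t
    X = np.column_stack([y[ar_order - k: N - k] for k in range(1, ar_order + 1)])
    phi, *_ = np.linalg.lstsq(X, y[ar_order:], rcond=None)
    resid = y[ar_order:] - X @ phi
    sigma_a2 = np.var(resid)
    # Invert the AR model to MA form  y_t = sum_i psi_i a_{t-i}
    psi = np.zeros(delay)
    psi[0] = 1.0
    for j in range(1, delay):
        psi[j] = sum(phi[k - 1] * psi[j - k]
                     for k in range(1, min(j, ar_order) + 1))
    # The first `delay` MA terms are feedback-invariant: no controller,
    # however well tuned, can remove them from the output variance.
    sigma_mv2 = sigma_a2 * np.sum(psi ** 2)
    return sigma_mv2 / np.var(y)
```

Ranking loops by such an index across a unit is one simple way to turn a pile of per-controller statistics into a prioritized maintenance list.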
844

Unterstützung der Nutzung des kollektiven Wissens in einem LCMS / Supporting the use of social knowledge in an LCMS

Lorenz, Anja 07 July 2011
When producing learning materials for corporate training and continuing education, several requirements come together: the course materials should be technically correct, didactically sound and visually well presented. At the same time, companies have diverse target groups and use cases, so there are high demands on the effective use, and hence the reusability, of learning content once it has been created. Learning Content Management Systems (LCMS) address these challenges by providing functionality for creating, editing, managing and publishing learning objects and the course materials assembled from them: centralized learning-object repositories for XML-based content not only ease the reuse of information and learning objects in course materials for different learning scenarios, they are what makes it possible in the first place to transform the content into different distribution formats and language versions. While these features largely simplify the handling of learning objects for individual authors, a comprehensive treatment of how the collaboration of several authors within an LCMS can be supported has so far been missing. This dissertation investigates how collaboration principles from Web 2.0 can be transferred to serve as a model for the joint creation of content, and for the coordination processes this requires, by unstructured or flat-structured, heterogeneous groups of authors. DIN EN ISO/IEC 19796 (2009) serves as the methodological frame: it provides the structuring of the learning-content production processes needed for the analysis, and it supplies the quality criteria needed for the evaluation.
845

Growth and Process-Induced Deep Levels in Wide Bandgap Semiconductor GaN and SiC / 結晶成長及びプロセスにより導入されるワイドバンドギャップ半導体GaN及びSiC中の深い準位

Kanegae, Kazutaka 23 March 2022
Associated degree program: Kyoto University doctoral program "Advanced Photonic and Electronic Device Creation" (京都大学卓越大学院プログラム「先端光・電子デバイス創成学」) / Kyoto University / New-system doctoral course / Doctor of Engineering / Kō No. 23909 / Kōhaku No. 4996 / 新制||工||1780 (University Library) / Department of Electronic Science and Engineering, Graduate School of Engineering, Kyoto University / Examining committee: Prof. Tsunenobu Kimoto (chair), Prof. Yoichi Kawakami, Assoc. Prof. Yuichiro Ando / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
846

Classification of HTML Documents

Xie, Wei January 2006
Text Classification is the task of mapping a document into one or more classes based on the presence or absence of words (or features) in the document. It has been intensively studied, and different classification techniques and algorithms have been developed. This thesis focuses on the classification of online documents, which has become more critical with the development of the World Wide Web. The WWW vastly increases the availability of on-line documents in digital format and has highlighted the need to classify them. Against this background, we note the emergence of automatic Web classification. Such approaches mainly concentrate on classifying HTML-like documents into classes or categories, not only using methods inherited from the traditional Text Classification process but also utilizing the extra information provided only by Web pages. Our work is based on the fact that Web documents contain not only ordinary features (words) but also extra information, such as meta-data and hyperlinks, that can be used to the advantage of the classification process. The aim of this research is to study various ways of using this extra information, in particular the hyperlink information provided by HTML documents (Web pages). The merit of the approach developed in this thesis is its simplicity compared with existing approaches. We present different approaches to using hyperlink information to improve the effectiveness of web classification. Unlike other work in this area, we only use the mappings between linked documents and their own class or classes. In this case, we only need to add a few features, called linked-class features, into the datasets, and then apply classifiers to them for classification. In the numerical experiments we adopted two well-known Text Classification algorithms, Support Vector Machines and BoosTexter. The results obtained show that classification accuracy can be improved by using mixtures of ordinary and linked-class features. Moreover, out-links usually work better than in-links in classification. We also analyse and discuss the reasons behind this improvement. / Master of Computing
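A minimal sketch of the linked-class idea on a toy corpus, assuming scikit-learn and SciPy: ordinary TF-IDF word features are augmented with counts of the known classes of each page's out-links, and an SVM is trained on the mixed feature set. The helper name, toy data and use of LinearSVC (rather than the thesis's exact SVM setup or BoosTexter) are assumptions for illustration.

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Toy corpus: page texts, out-link targets (by page index), class labels
docs  = ["football season results", "new graphics card review",
         "league table update", "cpu benchmark scores"]
links = [[2], [3], [0], [1]]            # each page links to one other page
y     = np.array([0, 1, 0, 1])          # 0 = sport, 1 = tech
known_class = {0: 0, 1: 1, 2: 0, 3: 1}  # classes of linked pages (training labels)

def linked_class_features(link_lists, target_class, n_classes):
    """One row per page: how many of its out-links point at pages of each class."""
    feats = np.zeros((len(link_lists), n_classes))
    for i, targets in enumerate(link_lists):
        for t in targets:
            if t in target_class:        # skip links whose class is unknown
                feats[i, target_class[t]] += 1.0
    return feats

X_words  = TfidfVectorizer().fit_transform(docs)               # ordinary features
X_linked = csr_matrix(linked_class_features(links, known_class, n_classes=2))
X = hstack([X_words, X_linked])                                # mixed feature set
clf = LinearSVC().fit(X, y)
```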
848

Formulary approach to the taxation of transnational corporations: A realistic alternative?

Celestin, Lindsay Marie France Clement January 2000
The Formulary Approach to the Taxation of Transnational Corporations: A Realistic Alternative? Synopsis: The central hypotheses of this thesis are: that global formulary apportionment is the most appropriate method for the taxation of transnational corporations (TNCs) in lieu of the present system commonly referred to as the separate accounting/arm's length method; and that it is essential, in order to implement the proposed global formulary model, to create an international organisation which would fulfil, in the taxation field, a role equivalent to that of the World Trade Organisation (WTO) in international trade. The world economy is fast integrating and is increasingly dominated by the activities of transnational enterprises. These activities create a dual tax problem for various revenue authorities seeking to tax gains derived thereon: Firstly, when two or more countries entertain conflicting tax claims on the same base, there arises what is commonly referred to as a double taxation problem. Secondly, an allocation problem arises when different jurisdictions seek to determine the quantum of the gains to be allocated to each jurisdiction for taxation purposes. The traditional regime for solving both the double taxation and the allocation problem is enshrined in a series of bilateral treaties signed between various nations. These are, in general, based on the Organisation for Economic Co-operation and Development (OECD) Model Treaty. It is submitted, in this thesis, that while highly successful in an environment characterised by the coexistence of various national taxation systems, the traditional regime lacks the essential attributes suitable to the emerging 'borderless world'. The central theme of this thesis is the allocation problem. The OECD Model attempts to deal with this issue on a bilateral basis. Currently, the allocation problem is resolved through the application of Articles 7 and 9 of the OECD Model. In both instances the solution is based on the 'separate enterprise' standard, also known as the separate entity theory. This separate accounts/arm's length system was articulated in the 1930s when international trade consisted of flows of raw materials and other natural products as well as flows of finished manufactured goods. Such trade is highly visible and may be adequately valued both at the port of departure and at the port of entry in a country. It follows that within this particular system of international trade the application of the arm's length principle was relatively easy and proved to be extremely important in resolving both the double taxation and apportionment problems. Today, however, the conditions under which international trade is conducted are substantially different from those that prevailed until the 1960s.
* Firstly, apart from the significant increase in the volume of traditionally traded goods, trade in services now forms the bulk of international exchanges. In addition, the advent of the information age has dramatically increased the importance of specialised information whose value is notoriously difficult to ascertain for taxation purposes.
* Secondly, the globalisation phenomenon which gathered momentum over the last two decades has enabled existing TNCs to extend their global operations and has favoured the emergence of new transnational firms. Thus, intra-firm trade conducted outside market conditions accounts for a substantial part of international trade.
* Thirdly, further economic integration has been achieved following the end of the Cold War and the acceleration of the globalisation phenomenon. In this new world economic order only TNCs have the necessary resources to take advantage of emerging opportunities.
The very essence of a TNC is 'its ability to achieve higher revenues (or lower costs) from its different subsidiaries as a whole compared to the results that would be achieved under separate management on an arm's length basis.' Yet the prevailing system for the taxation of TNCs overlooks this critical characteristic and is therefore incapable of fully capturing, for taxation purposes, the aggregate gains of TNCs. The potential revenue loss arising from the inability of the present system to account for and to allocate synergy gains is substantial. It follows that the perennial questions of international taxation can no longer be addressed within the constraints of the separate entity theory and a narrow definition of national sovereignty. Indeed, in order to mirror the developments occurring in the economic field, taxation needs to move from a national to an international level. Moreover, a profound reform of the system is imperative in order to avoid harmful tax competition between nations and enhance compliance from TNCs. Such a new international tax system needs to satisfy the tests of simplicity, equity, efficiency, and administrative ease. To achieve these objectives international cooperation is essential. The hallmark of international cooperation has been the emergence, after World War II, of a range of international organisations designed to facilitate the achievement of certain goals deemed essential by various nations. The need for an organisation to deal specifically with taxation matters is now overwhelming. Consequently, this thesis recommends the creation of an international organisation to administer the proposed system. The main objective of this international organisation would be to initiate and coordinate the multilateral application of a formulary apportionment system which, it is suggested, would deal in a more realistic way with 'the difficult problems of determining the tax base and allocating it appropriately between jurisdictions'. The global formulary apportionment methodology is derived from the unitary entity theory. The unitary theory considers a TNC as a single business which, for convenience, is divided into 'purely formal, separately-incorporated subsidiaries'. Under the unitary theory the global income of TNCs needs to be computed; such income is then apportioned between the various component parts of the enterprise by way of a formula which reflects the economic contribution of each part to the derivation of profits. The question that arises is whether the world of international taxation is ready for such a paradigm shift. It is arguable that this shift has already occurred, albeit cautiously and in very subtle ways. Thus, the latest of the OECD Guidelines on the transfer pricing question provides that 'MNE [Multinational Enterprise] groups retain the freedom to apply methods not described in this Report to establish prices provided those prices satisfy the arm's length principle in accordance with these Guidelines.' Arguably, the globalisation process has created 'the specific situation' allowed for by the OECD.
This thesis, therefore, explores the relative obsolescence of the bilateral approach to the taxation of TNCs and then suggests that a multilateral system is better adapted to the emerging globalised economy. The fundamental building blocks of the model proposed in this thesis are the following:
* First, the administration and coordination of the proposed system is to be achieved by the creation of a specialised tax organisation, called Intertax, to which member countries would devolve a limited part of their fiscal sovereignty.
* Second, in order to enable the centralised calculation of TNCs' profits, the proposed system requires the formulation of harmonised methods for the measurement of the global profits of TNCs. Therefore, the efforts of the International Accounting Standards Committee (IASC) to produce international accounting standards and harmonised consolidation rules must be recognised and, if needs be, refined and ultimately implemented.
* Third, the major function of Intertax would be to determine the commercial profits of TNCs on a standardised basis and to apportion the latter to relevant countries by way of an appropriate formula or formulas. Once this is achieved, each country would be free, starting from its share of commercial profits, to determine the taxable income in accordance with the particular tax base that it adopts and, ultimately, the tax payable within its jurisdiction.
In the proposed system, therefore, a particular country would be able to independently set whatever depreciation schedules or investment tax credits it chooses, and adopt whatever tax accounting rules it deems fit relative to its policy objectives. Moreover, this thesis argues that the global formulary apportionment model it proposes is not dramatically opposed to the arm's length principle. Indeed, it suggests that the constant assumption to the contrary, even with regard to the usual formulary apportionment methodology, is extravagant, because both methodologies are based on a common endeavour: to give a substantially correct reflection of a TNC's true profits. It has often been objected that global formulary apportionment is arbitrary and ignores market conditions. This thesis addresses such concerns by rejecting the application of a single all-purpose formula. Rather, it recognises that TNCs operating in different industries require different treatment and, therefore, suggests the adoption of different formulas to satisfy specific industry requirements. For example, the formula applicable to a financial institution would differ from that applicable to the pharmaceutical industry. Each formula needs to be based on the fundamental necessity to capture the functions performed, taking into consideration assets used and risks assumed within that industry. In addition, if the need arises, each formula should be able to be fine-tuned to fit specific situations. Moreover, it is also pertinent to note that the OECD already accepts 'the selected application of a formula developed by both tax administrations in cooperation with a specific taxpayer or MNE group...such as it might be used in a mutual agreement procedure, advance transfer pricing agreement, or other bilateral or multilateral determination.'
The system proposed in this thesis can thus be easily reconciled with the separate accounting/arm's length method which the OECD so vehemently advocates. Both models have the same preoccupations, so that what is herein proposed may simply be characterised as an institutionalised version of the system advocated by the OECD. Multilateral formulary apportionment addresses both the double taxation and the allocation problems in international taxation. It resolves the apportionment question 'without depending on an extraordinary degree of goodwill or compliance from taxpayers.' It is therefore submitted that, if applied on a multilateral basis with a minimum of central coordination, it also seriously addresses the double taxation problem. Indeed, it is a flexible method, given that different formulas may be devised to suit the needs of TNCs operating in different sectors. Consequently, formulary apportionment, understood in this sense, is a realistic alternative to the limitations of the present system.
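To make the apportionment mechanics concrete, here is a minimal sketch using the classic equally weighted three-factor (property, payroll, sales) formula familiar from US state taxation. The thesis itself argues for industry-specific formulas administered by Intertax, so this particular formula and all figures are illustrative assumptions only.

```python
def apportion_profit(global_profit, factors_by_country):
    """Split a TNC's global commercial profit across jurisdictions with an
    equally weighted three-factor formula:
    share_i = mean(property_i/total, payroll_i/total, sales_i/total)."""
    totals = {f: sum(c[f] for c in factors_by_country.values())
              for f in ("property", "payroll", "sales")}
    shares = {}
    for country, c in factors_by_country.items():
        share = sum(c[f] / totals[f] for f in totals) / len(totals)
        shares[country] = global_profit * share
    return shares

# Illustrative figures only: a TNC with 300 of global profit
print(apportion_profit(300.0, {
    "A": {"property": 50, "payroll": 60, "sales": 90},   # gets 155.0
    "B": {"property": 30, "payroll": 20, "sales": 60},   # gets 80.0
    "C": {"property": 20, "payroll": 20, "sales": 50},   # gets 65.0
}))
```

Each country would then apply its own tax base and rate to its share, which is exactly the division of labour between Intertax and national authorities that the synopsis describes.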
849

Beyond the Digital Diva: Women on the World Wide Web

Kilpin, Carrie January 2004
In the year 2000, American researchers reported that women constituted 51 percent of Internet users. This was a significant discovery, as throughout the medium’s history, women were outnumbered by men as both users and builders of sites. This thesis probes not only this historical moment of change, but how women are mobilising the World Wide Web in their work, leisure and lives. Not considered in the ‘51% of American women now online’ headline is the lack of women engaged in Web building rather than Web shopping. In technical fields relating to the Web, women are outnumbered and marginalized, being poorly represented in computer-related college and university courses, in careers in computer science and computer programming, and also in digital policy. This thesis identifies the causes for the low number of women in these spheres. I consider the social and cultural reasons for their exclusion and explore the discourses which operate to discourage women’s participation. My original contribution to knowledge is forged as much through how this thesis is written as by the words and footnotes that graze these pages. With strong attention to methodology in Web-based research, I gather a plurality of women’s voices and experiences of under-confidence, humiliation and fear. Continuing the initiatives of Dale Spender’s Nattering on the Net, I research women’s use of the Web in placing a voice behind the statistics. I also offer strategies for digital intervention, without easy platitudes to the ‘potential’ for women in the knowledge economy or through Creative Industries strategies. The chapters of this thesis examine the contexts in which exclusionary attitudes are created and perpetuated. No technology is self-standing: we gain information about ‘new’ technologies from the old. I investigate representations and mediations of women’s relationship to the Web in fields including the media, the workplace, fiction, the Creative Industries and educational institutions. For example, the media is complicit in causing women to doubt their technological capabilities. The images and ideologies of women in film, newspapers and magazines that present computer and Web usage are often discriminatory and derogatory. I also found in educational institutions that patriarchal attitudes privilege men, and discourage female students’ interest in digital technologies. I interviewed high school and university students and found that the cultural values embedded within curricula discriminate against women. Limitations in Web-based learning were also discovered. In discussing the cultural and social foundations for women’s absence or under-confidence in technological fields, I engage with many theories from a prominent digital academic: Dale Spender. In her book Nattering on the Net: Women, Power and Cyberspace, Spender’s outlook is admonitory. She believes that unless women acquire a level of technological capital equal to their male counterparts, women will continue to be marginalised as new political and social ideologies develop. She believes women’s digital education must occur as soon as possible. While I welcome her arguments, I also found that Spender did not address the confluence between the analogue and the digital. She did not explore how the old media is shaping the new. While Spender’s research focused on the Internet, I ponder her theses in the context of the World Wide Web. 
In order to intervene in the patriarchal paradigm, to move women beyond digital shoppers and into builders of the digital world, I have created a website (included on CD-ROM) to accompany this thesis’s arguments. It presents links to many sites on the Web to demonstrate how women are challenging the masculine inscriptions of digital technology. Although the website is created to interact directly with Chapter Three, its content is applicable to all parts of the thesis. This thesis is situated between cultural studies and internet studies. This interdisciplinary dialogue has proved beneficial, allowing socio-technical research to resonate with wider political applications. The importance of intervention - and the need for change - has guided my words. Throughout the research and writing process of this thesis, organisations have released reports claiming gender equity on the Web. My task is to capture the voice, views and fears of the women behind these statistics.
850

Expert criteria for evaluating the quality of web-based child development information

Martland, Nancy F. January 1900 (has links)
Thesis (Ph.D.)--Tufts University, 2001. / Adviser: Fred Rothbaum. Submitted to the Dept. of Child Development. Includes bibliographical references. Access restricted to members of the Tufts University community. Also available via the World Wide Web.
