
PROVIZ: an integrated graphical programming, visualization and scripting framework for WSNs

Kumbakonam Chandrasekar, Ramalingam 01 April 2013 (has links)
Wireless Sensor Networks (WSNs) are rapidly gaining popularity in critical domains such as health care, critical infrastructure, and climate monitoring, where application builders have diverse development needs. Independent of the functionality their WSN applications provide, many developers rely on visualization, simulation, and programming tools. However, these tools are designed as separate stand-alone applications, forcing developers to use multiple tools; this often causes confusion and hampers efficient development. To avoid the complexity of using multiple tools, a new, extensible, multi-platform, scalable, and open-source framework called PROVIZ was designed. PROVIZ is an integrated visualization and programming framework with the following features: it 1) visualizes sensor nodes and WSN traffic by parsing the data received either from a packet sniffer (e.g., a sensor-based sniffer or a commercial TI SmartRF 802.15.4 packet sniffer) or from a simulator (e.g., OMNeT); 2) visualizes a heterogeneous WSN consisting of different sensor nodes sending packets with different packet payload formats; and 3) provides a programming framework offering graphical and script-based programming functionality for developing WSN applications. PROVIZ also includes built-in, extensible visual demo deployment capabilities that allow users to quickly craft network scenarios and share them with other users. Additionally, a secure and energy-efficient wireless code dissemination protocol, named SIMAGE, was developed; PROVIZ uses SIMAGE to wirelessly reprogram the sensor nodes. SIMAGE combines a link-quality-cognizant adaptive packet-sizing technique with energy-efficient encryption protocols for secure and efficient code dissemination.
In this thesis, the various features of PROVIZ's visualization and programming framework are explained, the functionality and performance of the SIMAGE protocol are described, an example WSN security attack scenario is analyzed, and the use of PROVIZ as a visual debugging tool to identify the security attack and aid in providing a software fix is discussed.
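The abstract does not reproduce SIMAGE's actual algorithm, but the idea of link-quality-cognizant adaptive packet sizing can be sketched as follows. The header size, candidate payload sizes, and per-bit error model here are illustrative assumptions, not values from the thesis:

```python
# A minimal sketch (not the SIMAGE implementation) of link-quality-
# cognizant adaptive packet sizing: given an estimated per-bit error
# rate for the current link, pick the payload size that maximizes the
# expected goodput per byte transmitted.

HEADER_BYTES = 11                               # assumed fixed per-packet overhead
CANDIDATE_PAYLOADS = [16, 32, 48, 64, 80, 96]   # assumed payload options

def packet_success_prob(payload_bytes: int, bit_error_rate: float) -> float:
    """Probability that every bit of the frame arrives intact."""
    total_bits = 8 * (HEADER_BYTES + payload_bytes)
    return (1.0 - bit_error_rate) ** total_bits

def expected_goodput(payload_bytes: int, bit_error_rate: float) -> float:
    """Useful bytes delivered per byte sent, weighted by success odds."""
    efficiency = payload_bytes / (HEADER_BYTES + payload_bytes)
    return efficiency * packet_success_prob(payload_bytes, bit_error_rate)

def choose_payload(bit_error_rate: float) -> int:
    """Adapt the payload size to the current link-quality estimate."""
    return max(CANDIDATE_PAYLOADS,
               key=lambda n: expected_goodput(n, bit_error_rate))
```

On a clean link, large packets win because they amortize the header; on a noisy link, smaller packets win because fewer bits must survive per frame, which is the trade-off an adaptive scheme exploits.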

Návrh metodiky a vytvoření vybraných programových modulů pro nastavování a snímání defektů soustružnických nástrojů pomocí laserového měřícího zařízení / The design of methodology and creating of chosen SW modules for setting and exploring of lathe tools defects using laser measuring device

Křížek, Michal January 2008 (has links)
This diploma thesis deals with the possibility of non-contact measurement of tool corrections and broken-tool detection on CNC lathes. For this purpose, a measurement methodology and its application in the NC program were developed. The program was tested in practice on an SPM 16 CNC lathe with a Sinumerik 840D controller, using a Renishaw NC4 laser probe as the measuring equipment.
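As an illustration of the underlying idea (not the thesis's actual methodology; the threshold and all numbers are invented), non-contact tool correction reduces to comparing a laser-measured tool dimension against its nominal value:

```python
# A minimal sketch: derive the wear offset for the controller's tool
# table from a laser measurement, and flag the tool as broken when the
# deviation exceeds a tolerance. The limit value is an assumption.

BROKEN_TOOL_LIMIT_MM = 0.5   # assumed limit; a real value comes from the methodology

def tool_correction(nominal_mm, measured_mm, limit_mm=BROKEN_TOOL_LIMIT_MM):
    """Return (wear_offset_mm, is_broken) for one measured dimension."""
    deviation = measured_mm - nominal_mm
    is_broken = abs(deviation) > limit_mm
    # Write the wear offset back only for an intact tool; a broken tool
    # should instead stop the NC program.
    return (0.0 if is_broken else deviation), is_broken

offset, broken = tool_correction(nominal_mm=120.000, measured_mm=119.962)
# offset is roughly -0.038 mm of wear to compensate; broken is False
```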

Manipulátor nástrojů, nástrojových hlav a držáků / Manipulator of Tools, Tool Heads and Tool Holders

Ševčík, Michal January 2010 (has links)
The aim of this diploma thesis is to design a manipulator for handling tools and tool heads between the tool magazines of two vertical machining centers. One part of the thesis is a survey of possible solutions, the selection of the optimal variant, its construction design, and the technical calculations of selected structural nodes. The remaining parts are technical drawings: an assembly drawing and two drawings of the main components.

Hello! How can I assist you today? : An Analysis of GPT Technology in Supporting International Entrepreneurship

LALLEE, Anaïs, MUCO, Nana January 2023 (has links)
This thesis investigates the applications and implications of Generative Pre-trained Transformer (GPT) technology in international entrepreneurship. The research questions focus on how GPT can serve as a strategic tool, a communication tool, and a knowledge-leverage tool, and how these applications influence decision-making and enhance performance. The findings of the analysis chapters highlight that GPT contributes significantly to strategic planning, market analysis, and operational management, thereby enhancing decision-making and performance. Acting as a potent communication tool, GPT technology nurtures more robust client relationships and eases cross-cultural interactions thanks to its language processing capabilities. The thesis discusses how these capabilities can reduce miscommunication and enhance client satisfaction, which in turn contributes to cost savings by retaining existing customers and attracting new ones, thereby enhancing profits. The main theoretical implications concern the time efficiency brought by automated, high-quality communication, which reduces man-hours spent on routine interactions and frees resources for strategic tasks in international entrepreneurship. Furthermore, this study enriches the literature on generative AI, exemplified by models such as GPT, in international business, particularly within the context of international entrepreneurship. It offers essential insights that can guide international entrepreneurs in understanding the potential advantages of integrating GPT technology into their business operations.

Strategic Decisions Creation-Implementation (SDCI) process: an empirical study

Abdulhadi, Samer Nazmi 10 1900 (has links)
The aim of this research was to explore empirically how firms create and implement strategic decisions (SDs). The research was inspired by the need to further understand the organizational processes underpinning the SD phenomenon and thereby potentially contribute to the overall performance of firms. Previous research on SDs has focused on formal strategic planning approaches, which have been criticized for their highly prescriptive view of SDs, for separating creation from implementation, and for focusing on content and discrete elements rather than the holistic process. Despite all these studies, our understanding of the actual nature of the SD phenomenon, from creation to implementation, remains incomplete. Motivated by the need to look empirically and holistically at this very complex social phenomenon, this research problematizes the above aspects of the SD literature and positions itself within a wider social and descriptive process-based approach. The research employed qualitative and Analytic Induction (AI) methodologies and addressed the above need in three projects. The objective of each project evolved and led to the emergence of the final findings, which suggest a possible answer to the overall research aim. The Scoping Study proposed a theoretical framework of successful SD implementation factors. Project 1 went further and investigated these factors empirically. Project 2 developed empirically the process by which people actually create and implement SDs. In Project 3, this process was analysed through the theoretical lens of the sensemaking perspective and applied by practitioners through an empirically tested diagnostic tool. This research is a step towards a better understanding of SDs in practice and contributes to academic knowledge by proposing a different yet viable descriptive process, which can improve the overall quality of SDs and potentially lead to better performance.

The development and implementation of a qualitative tool into a sensory product which can be used in a class situation for children with learning problems

Burger, Y., De Lange, R.H. January 2010 (has links)
Published Article / Children with Learning Problems (LP) differ from other children and are mostly identified in the primary grades. Factors which may influence the development of sensory products to stimulate children with LP include design factors such as illustrations, colour and themes, and supporting factors such as therapeutic practices and cultural sensitivity. The previously mentioned factors may be beneficial for text enhancement and reading comprehension within books for children with Learning Disabilities (LD). It is envisaged that if design factors as well as sensory stimulants are integrated into play-therapy mediums such as the Sensory Product (SP), the product will be able to stimulate a child with LP through different therapeutic practices. Special-needs teachers aid children with LP through intervention strategies once they are identified. Intervention strategies involve instruments such as scripted and prescribed programmes (Fuchs & Fuchs, 2006), reading aloud by teachers to children (Fisher, Flood, Lapp & Frey, 2004) and one-on-one instruction as part of the three-tiered Reading to Intervention Model (RIM) (Scanlon & Sweeney, 2008). SPs have the potential to assist teachers and children with LP, but only if those products are appropriate for the children's developmental level (Oravec, 2000).

THE TEST AND TRAINING ENABLING ARCHITECTURE, TENA, AN IMPORTANT COMPONENT IN JOINT MISSION ENVIRONMENT TEST CAPABILITY (JMETC) SUCCESSES

Hudgins, Gene 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The Joint Mission Environment Test Capability (JMETC) is a distributed live, virtual, and constructive (LVC) testing capability developed to support the acquisition community and to demonstrate Net-Ready Key Performance Parameters (KPP) requirements in a customer-specific Joint Mission Environment (JME). JMETC provides connectivity to the Services’ distributed test capabilities and simulations, as well as industry test resources. JMETC uses the Test and Training Enabling Architecture, TENA, which is well-designed for supporting JMETC events. TENA provides the architecture and software capabilities necessary to enable interoperability among range instrumentation systems, facilities, and simulations. TENA, used in major field exercises and numerous distributed test events, provides JMETC with a technology already being deployed in DoD.

The Design of an Application Used for Aircraft Stability Evaluation

Leite, Nelson Paiva Oliveira, Lopes, Leonardo Mauricio de Faria, Walter, Fernando 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / One of the most important characteristics of an aircraft is its capability to return to its stable trimmed flight state after a disturbance or gust without pilot intervention. The evaluation of this behavior, known as aircraft stability, is divided into three sections: lateral, directional, and longitudinal stability. Determining the stability of an experimental aircraft requires the execution of a Flight Test Campaign (FTC). For the stability FTC, the test bed should be equipped with a complete Flight Test Instrumentation (FTI) system, typically composed of a Pulse Code Modulation (PCM) Data Acquisition System (DAS), a sensor set, an airborne transmitter, and a data recorder. In real-time operations, live data received over the telemetry link is processed, distributed, and displayed at the Ground Telemetry System (GTS), enhancing the FTC's safety level and efficiency. Because real-time telemetry data is less reliable than recorded data (i.e., noisier), recorded data is retrieved in post-mission operations for the data reduction analysis. This process is time consuming because recorded data has to be downloaded, converted to Engineering Units (EU), sliced, filtered, and processed. The upcoming iNET technology could provide a very reliable telemetry link; the data reduction analysis could then be executed with live telemetry data in quasi-real time, once all valid test points have been received.
In this context, the Brazilian Flight Test Group (GEEV), along with EMBRAER and with the support of Financiadora de Estudos e Projetos (FINEP), started the development of several applications. This paper presents the design of a tool used in the Longitudinal Static Stability Flight Test Campaign. The application receives telemetry data over either a TCP/IP or a SCRAMnet network, performs data analysis and test-point validation in real time, and, when all points have been gathered, performs the data reduction analysis and automatically creates HyperText Markup Language (HTML) formatted test reports. The tool was evaluated during the instruction flights of the 2009 Brazilian Flight Test School (CEV). The results show an efficiency gain for the overall FTC.
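The reporting step described above can be sketched as follows. The function names, the sample data, and the pass criterion (a negative stick-force-versus-airspeed gradient through trim, the classic sign convention for positive stick-free longitudinal static stability) are illustrative assumptions, not the GEEV/EMBRAER tool's actual implementation:

```python
# A minimal sketch: fit the stick-force versus airspeed gradient over
# the validated test points, then emit an HTML-formatted test report.

def slope(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def html_report(points):
    """points: (airspeed_kt, stick_force_lbf) pairs, one per test point.
    A negative gradient through trim is taken here as the indication of
    positive longitudinal static stability (sign conventions vary)."""
    grad = slope([v for v, _ in points], [f for _, f in points])
    verdict = "STABLE" if grad < 0 else "REVIEW REQUIRED"
    rows = "\n".join(f"<tr><td>{v:.1f}</td><td>{f:+.1f}</td></tr>"
                     for v, f in points)
    return (f"<html><body><h1>Longitudinal Static Stability</h1>"
            f"<p>Gradient: {grad:.3f} lbf/kt ({verdict})</p>"
            f"<table>{rows}</table></body></html>")
```

In the quasi-real-time workflow the abstract envisions, such a report would be generated automatically the moment the last valid test point arrives over the telemetry link.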

Design and analysis of the internally cooled smart cutting tools with the applications to adaptive machining

Bin Che Ghani, Saiful Anwar January 2013 (has links)
Adaptive machining with internally cooled smart cutting tools is a smart solution for industrial applications with stringent manufacturing requirements such as contamination-free machining (CFM), high material removal rate, low tool wear, and better surface integrity. The absence of cutting fluid in CFM subjects the cutting tool and the workpiece to great thermal loads owing to higher friction and adhesion, and as a result may drastically increase tool wear. The increased cutting temperature may also influence the chip morphology, in turn producing metal chips in unfavourable ribbon or snarl forms. CFM is difficult to realize, as contaminants can take various forms in a machining operation and avoiding them totally requires very tightly controlled conditions. Nevertheless, ecological, economic, and technological demands compel manufacturing practitioners to implement environmentally clean machining processes (ECMP). Machining with innovative cooling techniques such as heat pipes, single-phase microducts, cryogenic cooling, or minimum quantity lubrication (MQL) has been intensively researched in recent years to reduce the cutting temperature in ECMP, so that the part quality, tool life, and material removal rate achieved in ECMP at least equal or surpass those obtained in conventional machining. On the other hand, the reduction of cutting temperature by these techniques is often excessive and adverse to the produced surface roughness, as the work material tends to become brittle and hard at low temperature. An open cooling system also requires a constant coolant supply and provides no feedback on the process condition. This Ph.D. project aims to investigate the design and analysis of internally cooled cutting tools and, in particular, their implementation and application perspectives for smart adaptive machining.
Circulating a water-based cooling fluid in a closed-loop circuit contributes to sustainable manufacturing. The advantage of removing localized heat at the tool tip of an internally cooled cutting tool is enhanced by the smart features of the tool, which is trained on real experimental data to cognitively vary the coolant flow rate, cutting feed rate and/or cutting speed so as to control the critical machining temperature and maintain optimum machining conditions. Environmentally friendly internal micro-cooling avoids contamination of the generated swarf while also reducing the cutting temperature, thus reducing tool wear, increasing machining accuracy, and optimizing machining economics. The design of the smart cutting tool with internal micro-cooling not only takes the environmental aspects into account but is also justified by its ability to reduce the machining cost: production cost is lowered through lower consumption of cooling fluid and improved machining resource and energy efficiency. Structural, heat transfer, computational fluid dynamics (CFD), and tool-life models provide useful insight into the performance of the internally cooled smart cutting tool. Experimental validation using the smart cutting tool to machine titanium, steel, and aluminium indicates that applying internally cooled smart cutting tools in adaptive machining can improve machining performance in terms of cutting temperature, cutting forces, and the surface quality generated. The useful tool life is also extended significantly in comparison with conventional machining. The internally cooled smart cutting tool has important implications for the application to ECMP in particular, by overcoming the stigma of high, uncontrollable cutting temperatures in the absence of cooling fluid.
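The adaptive idea described above (vary coolant flow rate and feed rate to hold the critical machining temperature) can be sketched as a simple feedback loop. This is a plain proportional controller with invented gains and limits, not the thesis's experimentally trained model:

```python
# A minimal control-step sketch: raise coolant flow as the measured
# tool-tip temperature approaches a critical machining temperature, and
# sacrifice feed rate (material removal rate) only once cooling is
# saturated. All constants are illustrative assumptions.

T_CRITICAL_C = 300.0            # assumed critical machining temperature, degC
FLOW_MIN, FLOW_MAX = 0.2, 2.0   # assumed coolant flow range, L/min
KP_FLOW = 0.02                  # assumed proportional gain, (L/min)/degC

def adapt(measured_temp_c, flow_lpm, feed_mm_rev):
    """One control step: return updated (flow_lpm, feed_mm_rev)."""
    error = measured_temp_c - T_CRITICAL_C          # positive when too hot
    flow = min(FLOW_MAX, max(FLOW_MIN, flow_lpm + KP_FLOW * error))
    # Back off the feed rate only when flow alone cannot hold the limit.
    if error > 0 and flow >= FLOW_MAX:
        feed_mm_rev *= 0.95
    return flow, feed_mm_rev
```

Called once per sampling interval, such a loop trades a little material removal rate for temperature control only when the closed-loop coolant circuit is already at full capacity.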

The development of a culture-based tool to predict team performance

Hodgson, Allan January 2014 (has links)
The effect of national culture on the performance of teams is becoming an increasingly important issue in advanced western countries. There are many interlinked reasons for this, including the increasing globalisation of companies and the use of joint ventures for the development of expensive platforms. A further issue relates to the export of complex sociotechnical systems, where a culture clash between designer/manufacturer and user can lead to significant problems. This report describes research work that was carried out to analyse the cultural factors that influence the performance of teams (including researchers, designers, operators and crews), and to determine whether these factors could be captured in a tool to provide assistance to team managers and team builders. The original point of interest related to the development of increasingly complex sociotechnical systems, for example nuclear power stations, oil refineries, offshore oil platforms, hospital systems and large transport aircraft. Answers that might be sought, in particular by the senior managers of global companies, include (1) the best teams (or best national locations) for fundamental research, industrial research & development, product/system improvement and other key activities, and (2) the implications for system performance and, as a result, for system design, of targeting an eastern Asian market, a South-American market, etc. A literature review was carried out of the effects of culture on team performance, of culture measures and tools and of task classifications; in addition, empirical evidence of the validity of measures and tools was sought. Significant evidence was found of the effects of culture on teams and crews, but no national culture-based team performance prediction tools were found. 
Based on the results of the literature review, Hofstede's original four-dimension cultural framework was selected as the basis for the collection and analysis of empirical data, including the results of studies from the literature and the researcher's own empirical studies. No team or task classification system was found that was suitable for linking culture to team performance, so a five-factor task classification was developed, based on the literature review, to form the basis of the initial modelling work. A detailed analysis of results from the literature and from the author's pilot studies revealed additional culture-performance relationships, including those relating to cultural diversity. Three culture-performance models were incorporated into software tools offering performance prediction capabilities. The first model was primarily a test bed for ideas; the second incorporated a task/behavioural approach which achieved limited success; the third and final model was evaluated against a range of team and crew performance data before being tested successfully for acceptability by users. The research results included the discoveries that the effects of cultural diversity must be sought at the level of individual cultural dimensions rather than at the composite level, that the effects of national culture on team performance are consistent and strong enough to be usefully captured in a predictive culture tool, and that the relationship between culture and behaviour is moderated by contextual factors.
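The finding that diversity effects appear at the individual-dimension level, not the composite level, can be illustrated with a small computation. The team scores below are invented, and this is not the thesis's actual tool, only a sketch of the distinction it draws:

```python
# A minimal sketch: compute the spread of team members' national-culture
# scores separately for each of Hofstede's four original dimensions,
# rather than collapsing them into one composite diversity index.
from statistics import pstdev

DIMENSIONS = ("PDI", "IDV", "MAS", "UAI")   # power distance, individualism,
                                            # masculinity, uncertainty avoidance

def per_dimension_diversity(team_scores):
    """team_scores: one 4-tuple of dimension scores per team member."""
    return {dim: pstdev(member[i] for member in team_scores)
            for i, dim in enumerate(DIMENSIONS)}

# This invented team is homogeneous on individualism but widely spread on
# power distance; a single composite distance would blur exactly the
# per-dimension distinction the research found to matter.
team = [(35, 80, 60, 40), (68, 78, 55, 65), (90, 82, 50, 85)]
```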
