41

Mitteilungen des URZ 1/2003

Ziegler, Richter, Riedel, Hille, 10 March 2003 (has links)
Mitteilungen des URZ 1/2003
42

La transition vers l'innovation soutenable pour les entreprises industrielles : une approche par les business models : application au domaine du génie industriel / Towards sustainable innovation for industrial companies : a business model approach

Bisiaux, Justine 14 October 2015 (has links)
Recent decades have seen the emergence of a new context in favor of sustainable development, in which new modes of consumption and production are appearing. This shift from intensive innovation towards sustainable innovation leads industrial companies to reorient their business models, but the change requires strategic and organizational upheavals that some companies fail to overcome. These situations reveal a double challenge: the need to characterize sustainable innovation on the one hand, and the need to define a strategy for evolving and diffusing sustainable innovation within companies on the other. To characterize sustainable innovation, three notions are mobilized: the business model, sustainability and functionality. The results of this exploration suggest using the business model as an intermediary object to promote the co-construction and evolution of business models. The study of sustainability leads to associating the functional (service-based) economy with eco-design, an environmentally driven design model, as a concrete form of sustainable innovation. The analysis of functionality reveals the complementarity of these two approaches: they make it possible to define new offers from the point of view of use values and to involve the user in the definition of the offer. Exploring these three notions also leads to proposing a paradigm shift towards the sustainable paradigm, followed by the development of business model trajectories that guide companies towards the highest levels of sustainable innovation in the long term. These theoretical results are then used to develop a decision-support method, the Business Model Explorer for Sustainability (BMES). The BMES method allows companies to design new sustainable business models and to define trajectories towards them as a long-term strategy for diffusing sustainable innovation; it relies on the notion of upgradability as an operational expression of sustainable innovation. The method was developed and tested with the two industrial partners of the IDCyclUM project, Neopost and Rowenta. One of the main proposed research perspectives is to continue defining sustainable innovation, which remains an ambiguous concept for which there is currently no consensus.
43

Utilization of Legacy Soil Data for Digital Soil Mapping and Data Delivery for the Busia Area, Kenya

Joshua O Minai (8071856) 06 December 2019 (has links)
Much older soil data and information lies idle in libraries and archives, largely unused, especially in developing countries like Kenya. We demonstrated the usefulness of a stepwise approach to bring legacy soil data 'back to life', using the 1980 Reconnaissance Soil Map of the Busia Area (quarter degree sheet No. 101) in western Kenya as an example. Three studies were conducted, using agronomic information, field observations, and laboratory data available in the published soil survey report as inputs to several digital soil mapping techniques. In the first study, the agronomic information in the survey report was interpreted to generate 10 land quality maps. The maps represented the ability of the land to perform specific agronomic functions. Nineteen crop suitability maps that were not previously available were also generated. In the second study, a dataset of 76 profile points mined from the survey report was used as input to three spatial prediction models for soil organic carbon (SOC) and texture. The three prediction models were (i) ordinary kriging, (ii) stepwise multiple linear regression, and (iii) the Soil Land Inference Model (SoLIM). Statistically, ordinary kriging performed better than SoLIM and stepwise multiple linear regression in predicting SOC (RMSE = 0.02), clay (RMSE = 0.32), and silt (RMSE = 0.10), whereas stepwise multiple linear regression performed better than SoLIM and ordinary kriging for predicting sand content (RMSE = 0.11). Ordinary kriging had the narrowest 95% confidence interval, while stepwise multiple linear regression had the widest. From a pedological standpoint, SoLIM conformed better to the soil-forming factors model than ordinary kriging and had a narrower confidence interval than stepwise multiple linear regression. In the third study, rules generated from the map legend and map unit descriptions were used to generate a soil class map. Information about soil distribution and parent material from the map unit polygon descriptions was combined with six terrain attributes to generate a disaggregated fuzzy soil class map. The terrain attributes were multiresolution ridgetop flatness (MRRTF), multiresolution valley bottom flatness (MRVBF), topographic wetness index (TWI), topographic position index (TPI), planform curvature, and profile curvature. The final result was a soil class map with a spatial resolution of 30 m, an overall accuracy of 58%, and a Kappa coefficient of 0.54. Motivated by the wealth of soil agronomic information generated by this study, we successfully tested the feasibility of delivering this information in rural western Kenya using the cell-phone-based Soil Explorer app (https://soilexplorer.net/). This study demonstrates that legacy soil data can play a critical role in providing sustainable solutions to some of the most pressing agronomic challenges currently facing Kenya and most African countries.
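As an illustration of the model comparison described in the second study, the following Python sketch computes cross-validated RMSE for ordinary kriging and a plain multiple linear regression on a table of profile points. The file name, column names and covariates are hypothetical placeholders, the stepwise selection and SoLIM steps are omitted, and the sketch assumes the pykrige and scikit-learn libraries; it illustrates the evaluation idea only, not the thesis's actual workflow.

import numpy as np
import pandas as pd
from pykrige.ok import OrdinaryKriging
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

# Hypothetical legacy dataset: profile points with coordinates, soil organic
# carbon (soc) and terrain covariates mined from the survey report.
pts = pd.read_csv("busia_profiles.csv")        # assumed file
covars = ["elevation", "slope", "twi"]         # assumed covariate names

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

ok_err, mlr_err = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(pts):
    tr, te = pts.iloc[train], pts.iloc[test]

    # Ordinary kriging: interpolate SOC from the point coordinates alone.
    ok = OrdinaryKriging(tr.x.values, tr.y.values, tr.soc.values,
                         variogram_model="spherical")
    z_pred, _ = ok.execute("points", te.x.values, te.y.values)
    ok_err.append(rmse(te.soc.values, np.asarray(z_pred)))

    # Multiple linear regression on terrain covariates (no stepwise selection here).
    mlr = LinearRegression().fit(tr[covars], tr.soc)
    mlr_err.append(rmse(te.soc.values, mlr.predict(te[covars])))

print("ordinary kriging  RMSE:", np.mean(ok_err))
print("linear regression RMSE:", np.mean(mlr_err))

Kriging here uses only the point coordinates, while the regression leans on terrain covariates, which is one reason the two approaches can rank differently across soil properties.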
44

Spacecraft-Plasma Interaction Modelling of Future Missions to Jupiter

Rudolph, Tobias January 2012 (has links)
As an orbiter cruising to Jupiter will encounter different plasma environments, varying degrees of spacecraft surface charging are expected. This surface potential can distort in-situ plasma measurements made by on-board sensors, which explains the interest in simulating the charging. In this thesis the spacecraft-plasma interactions for a future mission to Jupiter are modelled with the help of the Spacecraft Plasma Interaction System (SPIS), taking the cases of a Jupiter Ganymede Orbiter (JGO) and a Jupiter Europa Orbiter (JEO) as archetypes for a future mission. It is shown that in the solar wind at Earth and Jupiter, spacecraft potentials of about 8 V for the JEO and 10 V to 11 V for the JGO are expected. Furthermore, at a distance of 15 Jupiter radii from Jupiter, the JGO is expected to charge to an electric potential of 2 V, except in the planetary shadow, where it will charge to a high negative potential of -40 V. Moreover, close to the orbit of Callisto, JGO will charge to 12 V in the sun and to 4.6 V in eclipse, due to a high secondary electron emission yield. / Validated; 20120115 (anonymous)
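The potentials above come from SPIS simulations, but the quantity being solved for is a floating potential set by current balance: the spacecraft settles where the collected electron and ion currents (plus any emitted currents) cancel. The Python sketch below, a rough illustration rather than anything from the thesis, solves that balance for a spherical body in eclipse using simple orbit-motion-limited expressions; the plasma parameters are placeholders, and secondary electron and photoelectron emission (which the thesis shows are decisive in sunlight and near Callisto) are ignored.

import numpy as np
from scipy.optimize import brentq

E, K = 1.602e-19, 1.381e-23          # elementary charge, Boltzmann constant
ME, MI = 9.109e-31, 1.673e-27        # electron and proton mass

def eclipse_floating_potential(n, te_ev, ti_ev, radius):
    """Floating potential (V) of a sphere in eclipse: repelled electrons vs. attracted ions (OML)."""
    area = 4.0 * np.pi * radius**2
    te, ti = te_ev * E / K, ti_ev * E / K

    def net_current(v):              # net current to the spacecraft at potential v < 0
        i_e = -E * n * np.sqrt(K * te / (2 * np.pi * ME)) * area * np.exp(E * v / (K * te))
        i_i = E * n * np.sqrt(K * ti / (2 * np.pi * MI)) * area * (1.0 - E * v / (K * ti))
        return i_e + i_i

    # Between near-zero and a strongly negative potential the net current changes sign once.
    return brentq(net_current, -50.0 * te_ev, -1e-6 * te_ev)

# Placeholder parameters, not the JGO/JEO environments modelled in the thesis:
print(eclipse_floating_potential(n=1.0e6, te_ev=10.0, ti_ev=10.0, radius=1.5))

With electron temperatures of tens of eV, this simple balance already lands at a few times -kTe/e, the same order of magnitude as the eclipse values discussed above.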
45

Satellite meteorology in the cold war era: scientific coalitions and international leadership 1946-1964

Callahan, Angelina Long 13 January 2014 (has links)
In tracing the history of the TIROS meteorological satellite system, this dissertation details the convergence of two communities: the DOD space scientists who established US capability to launch and operate these remote sensing systems, and the US Weather Bureau meteorologists who would be the managers and users of satellite data. Between 1946 and 1964, these people participated in successive coalitions. These coalitions were necessary in part because satellite systems were too big, geographically, fiscally, and technically, to be developed and operated within a single institution. Thus, TIROS technologies and people trace their roots to several research centers, institutions that the USWB and later NASA attempted to coordinate for US R&D. The gradual transfer of people and hardware from the armed services to the non-military NASA sheds light on the US's evolution as a Cold War global power, shaped from the "top-down" (by the executive and legislative branches) as well as the "bottom-up" (by military and non-military scientific communities). Through these successive coalitions, actor terms centered on "basic science" or the circulation of atmospheric data were used to help define bureaucratic places (the Upper Atmospheric Rocket Research Panel, the International Geophysical Year, NASA, and the World Weather Watch) in which basic research would receive sustained support and collaboration could take place with international partners.
46

RE-ENGINEERING THE EUVE PAYLOAD OPERATIONS INFORMATION FLOW PROCESS TO SUPPORT AUTONOMOUS MONITORING OF PAYLOAD TELEMETRY

Kronberg, F., Ringrose, P., Losik, L., Biroscak, D., Malina, R. F. November 1995 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada / The UC Berkeley Extreme Ultraviolet Explorer (EUVE) Science Operations Center (ESOC) is developing and implementing knowledge-based software to automate the monitoring of satellite payload telemetry. Formerly, EUVE science payload data were received, archived, interpreted, and responded to during round-the-clock monitoring by human operators. Now, knowledge-based software will support, augment, and supplement human intervention. In response to and as a result of this re-engineering project, the creation, storage, revision, and communication of information (the information flow process) within the ESOC has been redesigned. We review the information flow process within the ESOC before, during, and after the re-engineering of telemetry monitoring. We identify six fundamental challenges we face in modifying the information flow process. (These modifications are necessary because of the shift from continuous human monitoring to a knowledge-based autonomous monitoring system with intermittent human response.) We describe the innovations we have implemented in the ESOC information systems, including innovations in each part of the information flow process for short-term or dynamic information (which changes or updates within a week) as well as for long-term or static information (which is valid for more than a week). We discuss our phased approach to these innovations, in which modifications were made in small increments and the lessons learned at each step were incorporated into subsequent modifications. We analyze some mistakes and present lessons learned from our experience.
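The shift described above, from continuous human monitoring to knowledge-based software with intermittent human response, amounts to encoding operator knowledge as machine-checkable rules over the telemetry stream. The toy Python sketch below shows the flavor of one such rule: a limit check with a persistence requirement before a human is paged. The channel names, limits and alert wording are invented for illustration and bear no relation to the actual EUVE/ESOC software.

# Hypothetical limit table: channel -> (low, high, samples that must persist)
LIMITS = {
    "detector_hv_volts": (1800.0, 2200.0, 3),
    "pump_temp_c":       (-10.0,   45.0, 5),
}

class LimitMonitor:
    """Toy rule-based telemetry monitor: flag a channel only after an
    out-of-limit condition persists, then hand off to a human operator."""

    def __init__(self, limits):
        self.limits = limits
        self.violations = {ch: 0 for ch in limits}

    def ingest(self, channel, value):
        low, high, persistence = self.limits[channel]
        if low <= value <= high:
            self.violations[channel] = 0      # back in limits, reset the counter
            return None
        self.violations[channel] += 1
        if self.violations[channel] >= persistence:
            return f"ALERT {channel}={value} outside [{low}, {high}]; notify on-call operator"
        return None

mon = LimitMonitor(LIMITS)
for v in (2190, 2240, 2250, 2260):            # simulated telemetry frames
    msg = mon.ingest("detector_hv_volts", v)
    if msg:
        print(msg)

Real systems layer many such rules plus trending and cross-channel checks, but the division of labor is the same: the software watches continuously and humans are called in only when a rule fires.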
47

Performance Analysis of JavaScript

Smedberg, Fredrik January 2010 (has links)
In the last decade, web browsers have seen a remarkable increase in performance, especially in their JavaScript engines. JavaScript has over the years gone from being a slow and rather limited language to today being feature-rich and fast. Its speed can be around the same as, or half of, comparable code written in C++, but this speed depends directly on the choice of web browser, and the best performance is seen in browsers using JIT compilation techniques. Even though the language has seen a dramatic increase in performance, there are still major problems regarding memory usage. JavaScript applications typically consume 3-4 times more memory than similar applications written in C++. Many browser vendors, like Opera Software, acknowledge this and are currently trying to optimize their memory usage, so this issue will hopefully be non-existent in the near future. Because the majority of scientific papers written about JavaScript only compare performance using the industry benchmarks SunSpider and V8, this thesis has chosen to widen the scope. Those benchmarks give no information about how JavaScript stands in comparison to C#, C++ and other popular languages. To be able to compare that, I have implemented a GIF decoder, an XML parser and various elementary tests in both JavaScript and C++ to compare how far apart the languages are in terms of speed, memory usage and responsiveness.
48

Kampen fortsätter : En studie om kompatibilitetsproblem mellan moderna webbläsare / The fight continues : A study of compatibility problems of modern web browsers

Trenkler, Silja January 2006 (has links)
During the 1990s the two leading web browsers, Internet Explorer and Netscape Navigator, fought a bitter battle for market share, the so-called browser war. The war left the browsers almost completely incompatible. Since then, there has been a continual development of common standards for the web. Today the conditions for compatibility are much better than ten years ago, but the problem is not completely solved: the modern web browsers Internet Explorer 6, Firefox 1.5, Opera 8.5 and Safari can display the exact same web page differently despite common standards. The aim of this essay is to investigate the technical causes of the problem and to develop suggested solutions for creating a web page that is fully compatible in modern browsers. The essay contains an extensive theoretical study covering definitions, history and problems, complemented by three field interviews with professional web developers. The investigations show that compatibility problems depend on several factors and that it is impossible to create one exhaustive solution that covers all problems. However, by combining different techniques one can create a method that addresses a large part of both general and specific compatibility problems without colliding with recommended standards.
49

Assembly, Integration, and Test of the Instrument for Space Astronomy Used On-board the Bright Target Explorer Constellation of Nanosatellites

Cheng, Chun-Ting 25 July 2012 (has links)
The BRIght Target Explorer (BRITE) constellation is revolutionary in the sense that the same scientific objectives can be achieved with much smaller (cm³ versus m³) and lighter (< 10 kg versus 1,000 kg) spacecraft. It is a space astronomy mission, observing the variations in the apparent brightness of stars. The work presented herein focuses on the assembly, integration and test of the instrument used on board the six nanosatellites that form the constellation. The instrument is composed of an optical telescope equipped with a Charge Coupled Device (CCD) imager and a dedicated computer. This thesis provides a particularly in-depth look into the inner workings of the CCD. Methods used to characterize the instrument CCD in terms of its bias level stability, gain factor determination, saturation, dark current and readout noise level evaluation are provided. These methodologies are not limited to CCDs; they provide a basis for anyone who wishes to characterize any type of imager for scientific applications.
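One standard way to obtain the gain factor and readout noise mentioned above is the photon-transfer method, which needs only two flat-field and two bias frames. The numpy sketch below shows that calculation on synthetic frames; the frame size, signal level and noise figures are placeholders, not BRITE measurements, and the full characterization in the thesis (bias stability, dark current, saturation) involves considerably more than this.

import numpy as np

def gain_and_read_noise(flat1, flat2, bias1, bias2):
    """Photon-transfer estimate from two flat-field and two bias frames (ADU in, electrons out)."""
    flat_diff = flat1.astype(float) - flat2.astype(float)
    bias_diff = bias1.astype(float) - bias2.astype(float)

    # Differencing the paired frames removes fixed-pattern structure, leaving
    # shot noise plus read noise; the variance of a difference is twice the per-frame variance.
    gain = ((flat1.mean() + flat2.mean()) - (bias1.mean() + bias2.mean())) / \
           (flat_diff.var() - bias_diff.var())            # e- per ADU
    read_noise = gain * bias_diff.std() / np.sqrt(2)      # electrons RMS
    return gain, read_noise

# Synthetic example frames (placeholder numbers, not BRITE data):
rng = np.random.default_rng(1)
true_gain, read_noise_adu, sky_adu, bias_adu = 2.0, 5.0, 20000.0, 1000.0

def fake_flat():
    shot = rng.poisson(sky_adu * true_gain, (512, 512)) / true_gain
    return bias_adu + shot + rng.normal(0.0, read_noise_adu, (512, 512))

def fake_bias():
    return bias_adu + rng.normal(0.0, read_noise_adu, (512, 512))

print(gain_and_read_noise(fake_flat(), fake_flat(), fake_bias(), fake_bias()))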
