  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
821

Open Access: Wissenschaftliches Publizieren im Zeitalter von Digitalität und Internet (Open Access: Scholarly Publishing in the Age of Digitality and the Internet)

Näder, Johannes January 2010 (has links)
Since the turn of the millennium, the catchword Open Access has shaped debates on scholarly publishing. Depending on the context, it refers to concrete strategies in the face of the serials crisis, to a more or less unified programme for bidding farewell to the Gutenberg era, or to visionary conceptions of digital knowledge cultures. The study pursues two goals: first, it explains which concepts the term Open Access refers to, how these concepts developed, and how they can be summarized. This perspective makes it possible, in a second step, to detach Open Access from its strategic and programmatic instrumentalization and to interpret it from a media and cultural studies standpoint: it is no coincidence that the Open Access movement emerged at a moment when the media-technological infrastructures of society, and with them those of science, were changing profoundly. The study analyses Open Access as the scholarly community's attempt, by influencing the media system, to maintain scholarly publicity under changed media conditions while avoiding disruptions to scholarly workflows and to the knowledge system as a whole. It becomes clear that different disciplines place different demands on scholarly publicity and that processes of negotiation are therefore necessary. 
At the same time, the study reflects that the newly emerging media infrastructures, like the changing scholarly publishing system, are fragile, fundamentally contingent structures whose future is by no means secured: instead of Open Access, other forms of organizing scholarly publicity in the digital age could establish themselves, forms that need not serve open and productive scholarly exchange or broad societal participation in scholarly culture. Against this background, Open Access appears as a cautious and comparatively low-damage process of negotiation, not as a radical revolution of the science system on the threshold of the digital age. Methodologically, the study draws on mediology, which seeks to avoid constructing one-sided causal relationships by describing the interactions between technical media and cultural practices. An excursus examines how the concepts of Open Access and Open Source / Free Software differ. The appendix contains the full text of the three declarations of Budapest, Bethesda and Berlin, in which important principles of Open Access are laid down. 
Contents: 1. New media: more than mere tools; 1.1 Science between technology euphoria and scepticism; 1.2 Definition and mediology of free access; 1.3 Reflections on terminology; 1.4 State of research; 1.5 Open access: focus on the humanities; 2. Open Access: a definitional approach; 2.1 BBB: programmes of free access; 2.2 Controversies over a conceptual catchword; 2.3 Vision, programme, strategy: facets of free access; 3. Digitality, networking and the scholarly publishing system; 3.1 Characteristics and innovation potential of the digital; 3.2 Scholarly publishing under the conditions of the graphosphere; 3.3 The stabilizing potential of digital technologies; 3.4 Freedom and openness in other micro-milieus (excursus); 3.5 Milieu preservation and irritation; 4. From media use to a new media knowledge. Appendix: Budapest Declaration, Bethesda Statement, Berlin Declaration
822

Towards the numerical modelling of salt / zeolite composites for thermochemical energy storage

Lehmann, Christoph 23 February 2021 (has links)
Composite adsorbents consisting of a zeolite host matrix impregnated with a hygroscopic salt are a promising material class for thermochemical energy storage (TCES). 
They combine the high heat storage density of the salt with the easy technical manageability of the zeolite, whose porous matrix prevents leakage of salt solution and accommodates volume changes upon ad- and desorption. The dynamic sorption behaviour of such composites, however, differs from that of the pure host matrix material. In particular, the adsorption kinetics are slower, which leads to issues such as low and non-steady thermal output power, incomplete adsorption, and long adsorption phases in TCES devices using these composite materials. Numerical modelling has proven to be a valuable tool for identifying the causes of such performance limitations, and thereby facilitates the development of TCES devices: optimum designs and operating procedures can be found by simulation before actual prototypes have to be built. In this thesis, a numerical model of a packed adsorbent bed in an open sorption chamber has been developed, implemented in the open-source finite element software OpenGeoSys, and validated against experimental data. The modelling results show that established sorption kinetics models capture the dynamic sorption behaviour of salt/zeolite composites under application-relevant operating conditions. Moreover, they show that the main cause of the differences between the composites' and pure zeolites' sorption behaviour lies in their qualitatively different sorption equilibria. A second focus of the thesis is to investigate whether a limited amount of experimental data suffices to calibrate the numerical models. This possibility has been confirmed by dynamic sorption simulations of the composite materials. Furthermore, criteria were determined that allow the reconstruction of a robust adsorption equilibrium description from a reduced experimental data set. Finally, in the context of the Dubinin-Polanyi theory of adsorption in micropores, it has been found that the choice of a specific adsorbate density model has only a small influence on performance predictions of adsorbents for TCES. 
In summary, the results from this thesis will facilitate the screening of materials, reactor geometries and operating conditions via numerical simulations during the design of TCES devices based on zeolites and composite sorbents. Contents: Used symbols and abbreviations; 1. Introduction; 2. Foundations; 2.1. Thermochemical energy storage; 2.2. Zeolites and salt/zeolite composites; 2.3. Dubinin-Polanyi theory; 2.4. Multiphysical model of a fixed adsorbent bed; 2.5. Experimental data; 3. Assessment of adsorbate density models; 4. Water loading lift and heat storage density prediction; 5. Modelling of sorption isotherms based on sparse experimental data; 6. Modelling sorption equilibria and kinetics of salt/zeolite composites; 7. Summary; 7.1. Main achievements; 7.2. Conclusions and outlook; Bibliography; A. Publications; A.1. Assessment of adsorbate density models; A.2. A comparison of heat storage densities; A.3. Water loading lift and heat storage density prediction; A.4. Modelling of sorption isotherms based on sparse experimental data; A.5. Modelling sorption equilibria and kinetics of salt/zeolite composites
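As a rough illustration of the Dubinin-Polanyi framework the abstract refers to, the widely used Dubinin-Astakhov isotherm relates the equilibrium water loading of a microporous adsorbent to the Polanyi adsorption potential A = RT ln(p_s/p). The sketch below is not the thesis's calibrated model; W0, E, and n are made-up example parameters:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def adsorption_potential(T, p_rel):
    """Polanyi adsorption potential A = R * T * ln(p_s / p) in J/mol."""
    return R * T * math.log(1.0 / p_rel)

def da_loading(T, p_rel, W0=0.30, E=5000.0, n=2.0):
    """Dubinin-Astakhov loading W = W0 * exp(-(A / E)**n).

    W0 [kg/kg] limiting loading, E [J/mol] characteristic energy,
    n [-] heterogeneity exponent -- all illustrative values, not
    fitted parameters for any real zeolite or salt composite.
    """
    A = adsorption_potential(T, p_rel)
    return W0 * math.exp(-((A / E) ** n))

# Loading falls monotonically as relative pressure decreases:
for p_rel in (1.0, 0.5, 0.1, 0.01):
    print(f"p/ps = {p_rel:5.2f} -> W = {da_loading(293.15, p_rel):.4f} kg/kg")
```

Comparing such a characteristic curve fitted to a pure zeolite with one fitted to a salt composite is one way to see the "qualitatively different sorption equilibria" the abstract mentions.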
823

Variable-Density Flow Processes in Porous Media On Small, Medium and Regional Scales

Walther, Marc 03 November 2014 (has links) (PDF)
Modern society depends strongly on its available resources and on the long-term stability of the surrounding ecosystem. Numerical modelling has become a standard means of evaluating past, current, and future system states in a large number of applications, supporting decision makers in proper management. To ensure that a simulation correctly represents the investigated processes, verification examples (benchmarks) based on observation data or analytical solutions are used to evaluate the numerical modelling tool. In many parts of the world, groundwater is an important source of freshwater. Not only is it limited in quantity; subsurface water bodies are often also threatened by contamination from natural or anthropogenic sources. Especially in arid regions, marine saltwater intrusion poses a major threat to aquifers, which are often the exclusive source of freshwater in these dry climates. In contrast to common groundwater modelling, density-driven flow and mass transport must be treated as vital processes in system and scenario simulations of fresh-/saltwater interactions. At the beginning of this thesis, the capability of the modelling tool OpenGeoSys to represent the relevant non-linear process couplings is verified against selected benchmarks. Afterwards, variable-density application and process studies on different scales are presented. The application studies comprise regional groundwater modelling of a coastal aquifer system used extensively for agricultural irrigation, as well as hydro-geological model development and parametrization. In two process studies, first, a novel method for modelling the gelation of a solute in porous media is developed and verified against small-scale laboratory observation data, and second, thermohaline double-diffusive Rayleigh regimes are investigated on the medium scale. 
With the growing world population and thus increasing pressure on non-renewable resources, intelligent management strategies intensify the demand for capable simulation tools and the development of novel methods. In this way, this thesis highlights not only OpenGeoSys' potential for density-dependent process modelling, but also the broader importance of variable-density flow and transport processes, connecting cutting-edge scientific research with real-world application challenges.
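For readers unfamiliar with the coupling involved, the essential non-linearity in variable-density flow is that fluid density depends on solute concentration, which feeds back into the Darcy flux through a buoyancy term. A minimal sketch, with made-up constant parameters rather than the OpenGeoSys formulation itself:

```python
def density(omega, rho0=998.2, beta=0.7):
    """Linear equation of state rho = rho0 * (1 + beta * omega),
    omega = salt mass fraction [-].  beta ~ 0.7 gives roughly seawater
    density near omega ~ 0.035; the values are illustrative only."""
    return rho0 * (1.0 + beta * omega)

def darcy_flux_z(dp_dz, omega, k=1e-11, mu=1e-3, g=9.81):
    """Vertical Darcy flux q_z = -(k / mu) * (dp/dz + rho(omega) * g).

    The rho(omega) * g term is the buoyancy coupling: heavier
    (saltier) fluid increases the downward driving force."""
    return -(k / mu) * (dp_dz + density(omega) * g)

# Under a hydrostatic freshwater pressure gradient, freshwater is at
# rest while saltier water sinks (negative, i.e. downward, flux):
dp_dz_fresh = -density(0.0) * 9.81
print(darcy_flux_z(dp_dz_fresh, 0.0))    # zero: equilibrium
print(darcy_flux_z(dp_dz_fresh, 0.035))  # negative: dense water sinks
```

Because the concentration field determines the density, which drives the flow, which in turn advects the concentration, the governing equations must be solved as a coupled non-linear system; that is what the benchmarks in the thesis verify.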
824

Variable-Density Flow Processes in Porous Media On Small, Medium and Regional Scales

Walther, Marc 07 May 2014 (has links)
Modern society depends strongly on its available resources and on the long-term stability of the surrounding ecosystem. Numerical modelling has become a standard means of evaluating past, current, and future system states in a large number of applications, supporting decision makers in proper management. To ensure that a simulation correctly represents the investigated processes, verification examples (benchmarks) based on observation data or analytical solutions are used to evaluate the numerical modelling tool. In many parts of the world, groundwater is an important source of freshwater. Not only is it limited in quantity; subsurface water bodies are often also threatened by contamination from natural or anthropogenic sources. Especially in arid regions, marine saltwater intrusion poses a major threat to aquifers, which are often the exclusive source of freshwater in these dry climates. In contrast to common groundwater modelling, density-driven flow and mass transport must be treated as vital processes in system and scenario simulations of fresh-/saltwater interactions. At the beginning of this thesis, the capability of the modelling tool OpenGeoSys to represent the relevant non-linear process couplings is verified against selected benchmarks. Afterwards, variable-density application and process studies on different scales are presented. The application studies comprise regional groundwater modelling of a coastal aquifer system used extensively for agricultural irrigation, as well as hydro-geological model development and parametrization. In two process studies, first, a novel method for modelling the gelation of a solute in porous media is developed and verified against small-scale laboratory observation data, and second, thermohaline double-diffusive Rayleigh regimes are investigated on the medium scale. 
With the growing world population and thus increasing pressure on non-renewable resources, intelligent management strategies intensify the demand for capable simulation tools and the development of novel methods. In this way, this thesis highlights not only OpenGeoSys' potential for density-dependent process modelling, but also the broader importance of variable-density flow and transport processes, connecting cutting-edge scientific research with real-world application challenges. Contents: Abstract; Zusammenfassung; Nomenclature; List of Figures; List of Tables; I Background and Fundamentals; 1 Introduction; 1.1 Motivation; 1.2 Structure of the Thesis; 1.3 Variable-Density Flow in Literature; 2 Theory and Methods; 2.1 Governing Equations; 2.2 Fluid Properties; 2.3 Modelling and Visualization Tools; 3 Benchmarks; 3.1 Steady-state Unconfined Groundwater Table; 3.2 Theis Transient Pumping Test; 3.3 Transient Saltwater Intrusion; 3.4 Development of a Freshwater Lens; II Applications; 4 Extended Inverse Distance Weighting Interpolation; 4.1 Motivation; 4.2 Extension of IDW Method; 4.3 Artificial Test and Regional Scale Application; 4.4 Summary and Conclusions; 5 Modelling Transient Saltwater Intrusion; 5.1 Background and Motivation; 5.2 Methods and Model Setup; 5.3 Simulation Results and Discussion; 5.4 Summary, Conclusion and Outlook; 6 Gelation of a Dense Fluid; 6.1 Motivation; 6.2 Methods and Model Setup; 6.3 Results and Conclusions; 7 Delineating Double-Diffusive Rayleigh Regimes; 7.1 Background and Motivation; 7.2 Methods and Model Setup; 7.3 Results; 7.4 Conclusions and Outlook; III Summary and Conclusions; 8 Important Achievements; 9 Conclusions and Outlook; Bibliography; Publications; Acknowledgements; Appendix
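The table of contents above lists an extended inverse distance weighting (IDW) interpolation. The baseline IDW scheme that such an extension builds on can be sketched in a few lines (standard Shepard weighting; the thesis's actual extension is not reproduced here):

```python
import math

def idw(x, y, points, power=2.0, eps=1e-12):
    """Shepard's inverse distance weighting.

    points: list of (xi, yi, zi) samples.  The estimate at (x, y) is
    sum(w_i * z_i) / sum(w_i) with w_i = 1 / d_i**power; a query
    closer than eps to a sample returns that sample's value exactly,
    avoiding division by zero.
    """
    num = den = 0.0
    for xi, yi, zi in points:
        d = math.hypot(x - xi, y - yi)
        if d < eps:
            return zi  # query point coincides with a sample
        w = 1.0 / d ** power
        num += w * zi
        den += w
    return num / den

samples = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0), (0.0, 1.0, 5.0)]
print(idw(0.0, 0.0, samples))  # -> 1.0, exact at a sample location
print(idw(0.5, 0.5, samples))  # -> 3.0, all samples equidistant here
```

Plain IDW interpolates only between scattered values; an extension of the kind the thesis describes would typically add directional or structural information (e.g. hydro-geological layering) on top of this weighting.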
825

軟體產業的顧客知識運用、產權與組織型式 (Customer Knowledge Utilization, Property Rights, and Organizational Forms in the Software Industry)

王盈勛, Wang, Ying-hsun Unknown Date (has links)
Abstract: The emergence of open source software in the 1990s has had a huge impact on the software industry. However, research on open source software often regards it as "software without property rights," or even frames it as a "holy war" for freedom, heightening the ideological conflict between liberalism and capitalism. The outcome of this research indicates that the emergence and growth of open source software is an effective response to a software market with ever-higher demands for individualization and differentiation. The more differentiated the software product, the more it must rely on customers' knowledge, and the open source software community is a form of organization that develops software entirely through the intellectual contributions of its customers. As a system of property rights, open source licensing is neither "without property rights" nor "against property rights"; rather, it is an institutional invention that efficiently combines community developers' specialized knowledge with the allocation of property rights. Built on copyright, open source licences allow software developers to use source code autonomously, expand knowledge sharing among community members, and prevent individuals or commercial companies from behaving opportunistically within the community. 
This study makes a preliminary examination of how the "three powers hypothesis" of organizational innovation applies to community organizations, and offers the software industry suggestions on how to make proper use of customer power in product innovation.
826

開放原始碼軟體平台與互補性資產建構—以Google與 Intel 為例 / Open Source Software Platform for Promoting Complementary Asset Developments–a Case Study of Google and Intel

高士翔, Shih-Hsiang (Sean) Kao Unknown Date (has links)
Open source software is Open Innovation only if it has a business model driving it (West and Gallagher 2006). Open Innovation is the paradigm describing the scenario in which firms use a broad range of external sources for innovation and seek a broad range of commercialization alternatives for internal innovation (Chesbrough 2003). The Platform Leader builds the platform and concentrates its efforts on promoting and directing innovation of complementary products in favor of its R&D direction (Cusumano and Gawer 2002). The author has chosen leaders in two distinctive industry sectors—Google, the leader in the search engine industry, and Intel, the leader in the microprocessor business for the personal computer industry—as the case study companies for this research. Both cases fit the definition of open innovation since both Google and Intel have specific business models for their open source software platforms. This research explores how industry leaders exploit open source software platforms to realize their specific strategic intents. The research problems are: (1) how companies can incorporate external creativity and innovation to maintain their own innovative momentum; (2) what the key factors and strategies are for building a successful open source software platform and its ecosystem; (3) how a company can use an open source software platform as part of its strategy to enter new markets and promote the development of complementary assets that build its competitive advantages. The author proposes the following framework to analyze how leading firms design open source platform strategies: (1) analyze the firm's core competencies; (2) analyze the firm's strategic intent for its open source software platform; (3) analyze the firm's strategies for designing the architecture of its open source software platform; (4) analyze the firm's strategies for designing the ecosystem around the platform. 
Based on the analysis of the two comparative cases, the author is convinced of the following propositions: 1. Firms can use an open source software platform to incorporate external creativity and innovation that promote the development of complementary assets, and to build or at least maintain their competitive advantage over competitors. 2. Instead of a purely open or purely proprietary platform strategy, platform owners can adopt a hybrid strategy that combines the advantages of open and closed source to retain control and differentiation. 3. Compared with a company-owned open source software platform, a community-owned platform will attract more community involvement and stimulate more innovation. 4. When developing complementary assets, firms should adopt an open innovation approach to incorporate external creativity and innovation; when building their core competencies, however, they should adopt a more closed innovation approach to maintain their distinctive competitive advantages. 5. A key determinant of a successful open source platform strategy is the platform owner's ability to create value and to enable every partner within the ecosystem to share a portion of it.
827

Session hijacking attacks in wireless local area networks

Onder, Hulusi 03 1900 (has links)
Approved for public release, distribution is unlimited / Wireless Local Area Network (WLAN) technologies are becoming widely used since they provide more flexibility and availability. Unfortunately, it is possible for WLANs to be implemented with security flaws which are not addressed in the original 802.11 specification. IEEE formed a working group (TGi) to provide a complete solution (code named 802.11i standard) to all the security problems of the WLANs. The group proposed using 802.1X as an interim solution to the deficiencies in WLAN authentication and key management. The full 802.11i standard is expected to be finalized by the end of 2004. Although 802.1X provides a better authentication scheme than the original 802.11 security solution, it is still vulnerable to denial-of-service, session hijacking, and man-in-the-middle attacks. Using an open-source 802.1X test-bed, this thesis evaluates various session hijacking mechanisms through experimentation. The main conclusion is that the risk of session hijacking attack is significantly reduced with the new security standard (802.11i); however, the new standard will not resolve all of the problems. An attempt to launch a session hijacking attack against the new security standard will not succeed, although it will result in a denial-of-service attack against the user. / Lieutenant Junior Grade, Turkish Navy
828

Essais sur les logiciels libres : licences doubles, effets de réseau, et concurrence (Essays on Free Software: Dual Licensing, Network Effects, and Competition)

Latulippe, Johan 10 1900 (has links)
This thesis examines the microeconomic consequences of the arrival of open source in the software market. Specifically, it analyzes three features of open source software using models of industrial organization. Open source software is free, and may be modified or duplicated by anyone. The first paper studies the entry of an open source software into a closed source software market. Using a model of horizontal differentiation, the analysis considers a closed source firm's investment in the quality of its software. The introduction of open source on the market reduces the firm's investment in quality and increases the price of its software. Moreover, the entry of open source software may reduce consumer welfare: after entry, the reduction in market share lowers the firm's incentive to invest in quality. The second paper uses vertical differentiation to study a monopolist selling a support product for its software. The study begins by contrasting the supply of support by an open source provider and a closed source vendor. The model shows that in both cases the levels of support offered are the same. In addition, consumer welfare is higher and profit lower under open source software. The paper then considers competition in the provision of support; here, the supply of high-level support is greater than under a monopolist. Finally, the monopolist adopts a dual licensing strategy to extract more surplus from developers interested in modifying open source software and redistributing the resulting product. When developers place a high value on the source code, this technique generates more profit if the monopolist chooses to publish as open source rather than closed source. The last paper studies how a closed source firm is affected by the introduction of an open source product that benefits from user contributions. 
A vertical differentiation model is used and reveals that, when user contribution is present, the closed source vendor may lower its price to a level that forces the open source product out of the market. The firm's lower price not only increases demand for its software but also induces consumers to switch from open to closed source software, thereby reducing the contribution of users.
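The first essay's setting can be illustrated with a textbook Hotelling line: a closed source firm at location 0 charging price p competes with a zero-price open source alternative at location 1, with consumers uniformly distributed and facing linear mismatch cost t. This toy version (all functional forms and numbers are illustrative, not the paper's actual model) finds the firm's profit-maximizing price by grid search:

```python
def demand(p, t):
    """Share buying the closed source product: the consumer indifferent
    between paying p at location 0 and the free alternative at
    location 1 sits at x* = (t - p) / (2 t)."""
    x_star = (t - p) / (2.0 * t)
    return max(0.0, min(1.0, x_star))

def best_price(t, c=0.0, grid=10001):
    """Grid-search the profit-maximizing price over [c, c + t].

    With c = 0 the analytic optimum is p* = t / 2, earning t / 8."""
    best = (0.0, 0.0)
    for i in range(grid):
        p = c + (i / (grid - 1)) * t
        profit = (p - c) * demand(p, t)
        if profit > best[1]:
            best = (p, profit)
    return best

p_star, profit = best_price(t=1.0)
print(p_star, profit)  # approximately 0.5 and 0.125
```

Even in this stripped-down version the paper's qualitative point appears: because the free rival cannot react strategically, the closed source firm serves a smaller market at a positive price rather than competing the price down.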
829

On the development of an open-source preprocessing framework for finite element simulations

Alexandra D Mallory (6640721) 14 May 2019 (has links)
Computational modeling is essential for material and structural analyses for a multitude of reasons, including improving designs and reducing manufacturing costs. However, the cost of commercial finite element analysis (FEA) packages prevents companies with limited financial resources from accessing them. Free finite element solvers, such as Warp3D, exist as robust alternatives to commercial FEA packages. These open-source solvers are not necessarily easy to use, however, mainly because they lack a preprocessing framework in which users can generate meshes, apply boundary conditions and forces, or define materials. We developed a preprocessor for Warp3D, referred to as W3DInput, to generate input files for the solver. W3DInput provides a general framework, at no cost, for going from CAD models to structural analysis. With this preprocessor, the user can import a mesh from a mesh generator (for this project, Gmsh was used), and the preprocessor steps the user through the inputs required for a Warp3D file. The generated input file is guaranteed to be in the correct order and in a format readable by the solver, making Warp3D more accessible to users of all levels. Five use cases were created with the preprocessor: a cantilever beam, a displacement control test, a displacement control test with a material defined by a user-supplied stress-strain curve, a crystal plasticity model, and a pallet. Results were written to Exodus II files for viewing in ParaView, and were verified by checking the stress-strain curves. These use cases show that the input files generated by the preprocessor were correct.
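The core task described above — serializing mesh data into the strict section order a solver expects — can be sketched generically. The format below is hypothetical (it is not the actual Warp3D input syntax, and `write_input` is an invented helper), but it shows the ordering guarantee a preprocessor like W3DInput provides:

```python
def write_input(path, nodes, elements, material, boundary):
    """Write a toy solver input deck with sections in a fixed order:
    material -> coordinates -> incidences -> constraints.

    nodes: {id: (x, y, z)}, elements: {id: [node ids]},
    material: dict with name/E/nu, boundary: list of (node, dof, value).
    """
    lines = [f"material {material['name']} e {material['E']} nu {material['nu']}"]
    lines.append("coordinates")
    for nid in sorted(nodes):
        x, y, z = nodes[nid]
        lines.append(f"  {nid} {x} {y} {z}")
    lines.append("incidences")
    for eid in sorted(elements):
        lines.append("  " + " ".join(str(n) for n in [eid] + elements[eid]))
    lines.append("constraints")
    for node, dof, value in boundary:
        lines.append(f"  {node} {dof} {value}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# A one-element toy mesh, fully fixed at one node:
nodes = {1: (0, 0, 0), 2: (1, 0, 0), 3: (0, 1, 0), 4: (0, 0, 1)}
elements = {1: [1, 2, 3, 4]}
write_input("toy.inp", nodes, elements,
            {"name": "steel", "E": 200e9, "nu": 0.3},
            [(1, "u", 0.0), (1, "v", 0.0), (1, "w", 0.0)])
```

Because the writer, not the user, owns the section ordering, a malformed deck cannot be produced by mistake — which is exactly the accessibility benefit the abstract claims for W3DInput.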
830

Sistemas de informática e informação da atenção básica do Sistema Único de Saúde e o software livre: possibilidades e perspectivas / The Brazilian Unified National Health System (SUS) Primary Health, its informatics and information systems and the free software: perspectives and possibilities

Cortizo, Carlos Tato 06 December 2007 (has links)
Introdução: A Atenção Básica do Sistema Único de Saúde SUS é definida pelo Ministério da Saúde como um conjunto de ações e serviços de saúde no âmbito individual e coletivo, desenvolvidos com práticas gerenciais, sanitárias e sociais participativas, através de ações complexas nos cuidados e atenção à saúde da população do seu território e fundamentada nos princípios da universalidade, integralidade e da eqüidade. Os sistemas de informática em saúde da atenção básica são tecnologias estratégicas na gestão e governança sobre a situação de saúde da população em cada nível de responsabilidade sanitária. O cerne de funcionamento dos sistemas de informática é o software. A literatura pesquisada relata que os softwares dos sistemas de informática em saúde apresentam vários aspectos: inflexibilidade para mudanças, altos custos, baixa eficácia, são frágeis em relação à segurança e a privacidade, não adotam padrões tecnológicos e de saúde, apresentam dificuldades na escalabilidade, são refratários a adaptações às culturas e línguas locais e induzem ao aprisionamento tecnológico dos sistemas de informação em saúde. Neste contexto, o objetivo deste estudo foi o de identificar e analisar quais são as contribuições e limitações do software livre para os sistemas de informática e informação na atenção básica do SUS. Metodologia: Estudo de caso exploratório e qualitativo, comparando dois municípios que utilizam software livre e software privativo nos sistemas de atenção básica do SUS, a partir de critérios obtidos na literatura pesquisada. Resultados: A utilização de software livre nos sistemas de atenção básica do SUS de Campinas e São Paulo apresentou limites nos seguintes tópicos, utilizados como critérios de análise: educação, segurança, privacidade e padrões abertos. 
A utilização do software livre demonstrou vantagens para os municípios estudados nos seguintes tópicos: custos, escalabilidade, autonomia tecnológica, adaptação do software ao idioma e à cultura local, estabilidade e impacto na qualidade dos serviços de saúde. Conclusão: O software livre é uma alternativa tecnológica viável, robusta e flexível e oferece novas perspectivas para a construção de sistemas de informática e informação da Atenção Básica em saúde / The Primary Health Care of the Brazilian Unified National Health System (SUS) is defined by the Health Ministry as a set of individual and collective health actions and services, developed through participative managerial, sanitary and social practices and through complex actions in the care of the health of the population within its territory, grounded in the principles of universality, integrality and equity. The primary care health informatics systems are strategic technologies for the management and governance of the population's health status at each level of sanitary responsibility. The core of any informatics system is its software. The literature reviewed reports that health informatics software exhibits several problems: inflexibility to change, high costs, low efficacy, fragility concerning privacy and security, failure to adopt technological and health standards, difficulties in scalability, resistance to adaptation to local languages and cultures, and a tendency to induce technological lock-in of health information systems. Within this context, the aim of this study was to identify and analyse the contributions and limitations of free software for the informatics and information systems of SUS primary health care. METHODOLOGY: An exploratory, qualitative case study comparing two municipalities that use free software and proprietary software in their SUS primary care systems, based on criteria drawn from the literature reviewed.
RESULTS: The use of free software in the SUS primary care systems of Campinas and São Paulo showed limitations in the following topics, used as analysis criteria: education, security, privacy and open standards. The use of free software proved advantageous for the studied municipalities in the following topics: costs, scalability, technological autonomy, stability, adaptation of the software to the local language and culture, and impact on the quality of health services. CONCLUSION: Free software is a viable, robust and flexible technological alternative that offers new perspectives for the construction of informatics and information systems for primary health care.
