301
Optimization and Verification of an Integrated DSP. Svensson, Markus; Österholm, Thomas. January 2008 (has links)
There are many applications for DSPs (digital signal processors) in the industry's most rapidly growing areas right now, as wireless communication along with audio and video products are getting more and more popular. In this report, a DSP developed at the Division of Computer Engineering at Linköping University is optimized and verified. Register forwarding was implemented at a general architecture level to avoid data hazards that may arise when implementing instruction pipelining in a processor. The very common FFT algorithm is also optimized, but at the instruction-set level. That means the algorithm is carefully analyzed to find operations that may execute in parallel, and new instructions are then created for these parallel operations. The optimization is concentrated on the butterfly operation, as it is such a major part of the FFT computation. Comparing the accelerated butterfly with the unaccelerated one gives an improvement of 30% in terms of clock cycles needed for the computation. The report also discusses the benefits and drawbacks of changing from a hardware to a software stack, mostly in terms of interrupts and the return instruction. Another important property of the processor is scalability: it is possible to attach extra peripherals to the core, which accelerate certain tasks. An interface towards these peripherals is developed, along with two template designs that may be used to develop other peripherals. After all these modifications, a new test bench is developed to verify the functionality.
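To make the accelerated operation concrete, the following is a minimal sketch of the radix-2 decimation-in-time butterfly that such an instruction-set optimization targets; the Python formulation and the comments about which operations could be fused into parallel instructions are illustrative assumptions, not the thesis's actual instruction encoding.

```python
# Hypothetical illustration: the radix-2 decimation-in-time butterfly that an
# accelerated DSP instruction would compute. The complex multiply and the
# add/subtract are independent of other butterflies in the same stage, which is
# what makes fusing them into one parallel instruction attractive.
import cmath

def butterfly(a, b, twiddle):
    """One radix-2 DIT butterfly: returns (a + W*b, a - W*b)."""
    t = twiddle * b          # complex multiply (4 real multiplications, 2 additions)
    return a + t, a - t      # two complex additions, independent of each other

def fft(x):
    """Iterative radix-2 FFT built from the butterfly above (len(x) must be a power of 2)."""
    n = len(x)
    if n & (n - 1):
        raise ValueError("length must be a power of two")
    x = list(x)
    # bit-reversal permutation
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            x[i], x[j] = x[j], x[i]
    # butterfly stages
    size = 2
    while size <= n:
        w_step = cmath.exp(-2j * cmath.pi / size)
        for start in range(0, n, size):
            w = 1.0
            for k in range(size // 2):
                x[start + k], x[start + k + size // 2] = butterfly(
                    x[start + k], x[start + k + size // 2], w)
                w *= w_step
        size *= 2
    return x

print(fft([1, 0, 0, 0, 0, 0, 0, 0]))  # impulse -> flat spectrum of ones
```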
302
E-handelsplattformar som är anpassade för skalbarhet och ökad försäljning : En jämförelse mellan e-handelsplattformar. / E-commerce platforms adapted for scalability and increased sales : A comparison between e-commerce platforms. Bertilsson, Robin; Date, Daniel. January 2017 (has links)
I denna studie jämförs 14 kostnadsfria e-handelsplattformar utifrån funktionalitet som har identifierats för att öka försäljning och underlätta skalbarhet. För att betygsätta funktionernas påverkan för ökad försäljning respektive underlätta skalbarheten så har e-handlare inom nischen kläder och accessoarer låtits besvara en enkät. För att identifiera relevant funktionalitet till de två olika synsätten utgick forskarna från litteraturen. Det är utifrån dessa funktioner som sedan jämförelsen av e-handelsplattformarna och enkäten bygger på. Studien indikerar att responsiv design och att kunna erbjuda sina kunder flera betallösningar är de två mest relevanta funktionerna för att nå en ökad försäljning. För att underlätta skalbarhet visar resultatet att det är relevant att kunna köpa till tillägg och teman. Sett till vilken e-handelsplattform som innehar mest relevant funktionalitet så ger studiens resultat indikationer på att Abantecart och OpenCart är de e-handelsplattformar som har den mest relevanta funktionaliteten för att öka försäljning och underlätta en framtida expansion. / In this study, 14 free e-commerce platforms are compared based on their functionality. The functionality has been identified as increasing sales and facilitating scalability. To rate the impact of the functions, e-tailers in the clothing and accessories niche were asked to respond to a questionnaire. To identify relevant functionality for both of the approaches, the researchers used the literature. The functionality was used as a framework when comparing the 14 e-commerce platforms. This study indicates that not all functions are equal. Responsive design and multiple payment solutions are considered most critical and are directly linked to increased sales figures. For the e-retailers to achieve scalability, this study finds that purchasable add-ons and themes play a big role. The e-commerce platforms were rated based on the collected data and its analysis. Abantecart and OpenCart both received high ratings regarding functionality and the ability to increase sales numbers and enable future expansions.
303
Optimisation énergétique des protocoles de communication des réseaux de capteurs sans fil / Energy optimization of communication protocols of the WSN. Randriatsiferana, Rivo Sitraka A. 03 December 2014 (has links)
Pour augmenter la durée de vie des réseaux de capteurs sans fil, une solution est d'améliorer l'efficacité énergétique des protocoles de communication. Le regroupement des nœuds du réseau de capteurs sans fil en cluster est l'une des meilleures méthodes. Cette thèse propose plusieurs améliorations en modifiant les paramètres du protocole de référence LEACH. Pour améliorer la distribution énergétique des "cluster-heads", nous proposons deux protocoles de clustering centralisés k-LEACH et sa version optimisée k-LEACH-VAR. Un algorithme distribué, appelé e-LEACH, est également proposé pour réduire l'échange d'information périodique entre les nœuds et la station de base lors de l'élection des "cluster-heads". Par ailleurs, le concept d'équilibrage énergétique est introduit dans les métriques d'élection pour éviter les surcharges des nœuds. Ensuite, nous présentons une version décentralisée de k-LEACH qui, en plus des objectifs précédents, intègre la consommation d'énergie globale du réseau. Ce protocole, appelé k-LEACH-C2D, vise également à favoriser la scalabilité du réseau. Pour renforcer ce dernier et l'autonomie des réseaux, les deux protocoles de routage "multi-hop" probabiliste, dénotés FRSM et CB-RSM, construisent des chemins élémentaires entre les "cluster-heads" et la station de base. Le protocole CB-RSM forme une hiérarchie des "cluster-heads" pendant la phase de formation des clusters, en mettant un accent sur l'auto-ordonnancement et l'auto-organisation entre les "cluster-heads" pour rendre les réseaux le plus "scalable". Ces différents protocoles reposent sur l'idée de base que les nœuds ayant l'énergie résiduelle la plus élevée et la plus faible variance de consommation de l'énergie deviennent "cluster-head". Nous constatons le rôle central de la consommation du nœud dans nos différentes propositions. Ce point fera l'objet de la dernière partie de cette thèse. Nous proposons une méthodologie pour caractériser expérimentalement la consommation d'un nœud. Les objectifs visent à mieux appréhender la consommation pour différentes séquences d'état du nœud. Enfin, nous proposons un modèle global de la consommation du nœud. / To increase the lifetime of wireless sensor networks, one solution is to improve the energy efficiency of the communication protocols. Grouping the nodes of the wireless sensor network into clusters is one of the best methods. This thesis proposes several improvements by changing the parameters of the reference protocol LEACH. To improve the energy distribution of the "cluster-heads", we propose two centralized clustering protocols, k-LEACH and its optimized version k-LEACH-VAR. A distributed algorithm, called e-LEACH, is also proposed to reduce the periodic exchange of information between the nodes and the base station during the election of "cluster-heads". Moreover, the concept of energy balance is introduced into the election metrics to avoid overloading nodes. We then present a decentralized version of k-LEACH which, in addition to the previous objectives, integrates the overall energy consumption of the network. This protocol, called k-LEACH-C2D, also aims to promote the scalability of the network. To reinforce scalability and network autonomy, the two probabilistic "multi-hop" routing protocols, denoted FRSM and CB-RSM, build elementary paths between the "cluster-heads" and the base station. The protocol CB-RSM forms a hierarchy of "cluster-heads" during the cluster-formation phase, with an emphasis on self-scheduling and self-organization between "cluster-heads" to make the networks more scalable. These protocols are based on the basic idea that the nodes with the highest residual energy and the lowest variance of energy consumption become "cluster-heads". We note the central role of the node's energy consumption in our proposals; this point is the subject of the last part of this thesis. We propose a methodology to characterize experimentally the consumption of a node. The objective is to better understand the consumption for different sequences of node states. Finally, we propose a global model of the node's consumption.
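As an illustration of the election idea just described (prefer nodes with high residual energy and low variance of energy consumption), here is a hypothetical sketch; the score formula, its weights and the helper names are assumptions made for illustration, not the exact metrics of k-LEACH-VAR or its variants.

```python
# Hypothetical sketch of the cluster-head election idea: favor high residual
# energy, penalize a noisy (high-variance) consumption profile. Weights are
# illustrative assumptions only.
from dataclasses import dataclass
from statistics import pvariance

@dataclass
class Node:
    node_id: int
    residual_energy: float        # Joules left in the battery
    consumption_history: list     # energy spent per round (Joules)

def election_score(node, alpha=0.7, beta=30.0):
    """Higher is better: reward residual energy, penalize consumption variance."""
    variance = pvariance(node.consumption_history) if len(node.consumption_history) > 1 else 0.0
    return alpha * node.residual_energy - beta * variance

def elect_cluster_heads(nodes, k):
    """Pick the k best-scoring nodes as cluster-heads for the next round."""
    return sorted(nodes, key=election_score, reverse=True)[:k]

nodes = [
    Node(1, residual_energy=4.2, consumption_history=[0.10, 0.11, 0.10]),
    Node(2, residual_energy=4.5, consumption_history=[0.05, 0.30, 0.02]),
    Node(3, residual_energy=3.0, consumption_history=[0.12, 0.12, 0.13]),
]
print([n.node_id for n in elect_cluster_heads(nodes, k=1)])
# -> [1]: node 2 has more energy left, but its much noisier consumption profile is penalized
```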
304
Upscaling Circular Business Models in Fashion Retail Value Chains. Hultberg, Emelie. January 2021 (has links)
The textile industry is currently operating in an unsustainable manner that is based on overproduction and wasteful, resource-draining practices. Therefore, recently, the concept of circular economy has been presented as a possible solution and a way forward. Changing linear economy business practices and basing them on the circular economy concept is anticipated to solve social and environmental problems while maintaining economic growth. However, fashion retail value chains essentially depend on the logic of mass production, fast fashion, and linear take-make-dispose models. Presently, circular initiatives in this context are rather limited. This thesis posits that circular business model (CBM) scalability is an important factor in the transition to a circular economy in the fashion retail value chain, and thus, a more sustainable fashion industry. Therefore, the purpose of this thesis is to further the understanding of CBM scalability in the context of fashion retail value chains. This includes expanding the notion of scalability to cover a more holistic perspective that goes beyond increasing production output solely for economic gains. Additionally, this involves enhancing the understanding of the required specific resources and capabilities that CBMs must have or develop to overcome challenges and increase their potential for scalability. Accordingly, this thesis covers three papers that utilise qualitative methods wherein archival material, such as peer reviewed journal articles and industry reports, as well as interviews with practitioners are used. Moreover, the extension of the notion of business model scalability goes beyond the boundary of the organisation and encompasses three different perspectives, namely, efficiency, adaptability, and altruism. In addition to this, three areas that challenge CBM scalability are identified. These challenges stem from different business model design themes and can be described as (i) inefficiency, (ii) lack of know-how and out-of-the-box solutions, and (iii) unfamiliarity resulting in scepticism and dissonance with current practices and policy. Finally, the thesis contributes to the CBM literature by utilising the theoretical lenses of resource-based theory and dynamic capabilities theory to identify resources and dynamic capabilities that are strategically important for scaling CBMs. This thesis expands the notion of scalability by going beyond the boundary of the single organisation as well as including a triple bottom line perspective, thus providing an important addition to the understanding of the scalability of CBMs. / Textilindustrin verkar för närvarande på ett ohållbart sätt som bygger på överproduktion och resurskrävande metoder. Cirkulär ekonomi som koncept, har därför nyligen presenterats som en möjlig lösning på de problem som industrin står inför. Genom att frångå linjära praxis och istället bygga affärsmetoder på cirkulär ekonomi förväntas sociala och miljömässiga problem kunna lösas samtidigt som den ekonomiska tillväxten bibehålls. Modeindustrins värdekedjor bygger dock fortfarande huvudsakligen på massproduktion, fast fashion och linjära take-make-dispose-modeller. Cirkulära initiativ är i detta sammanhang ganska begränsade. Denna avhandling framhäver därför, att skalbarhet av cirkulära affärsmodeller är en viktig faktor i övergången till en cirkulär ekonomi, och därmed en mer hållbar modeindustri. 
Syftet med denna avhandling är följaktligen att öka förståelsen för skalbarhet av cirkulära affärsmodeller inom modebranschens värdekedjor, med fokus på detaljhandeln. Innefattat i detta är ett utvidgat begrepp av skalbarhet, som inkluderar mer än enbart ökad produktion för ekonomisk vinst. Till detta hör även en ökad förståelse av de specifika resurser och förmågor som cirkulära affärsmodeller måste ha eller utveckla för att övervinna utmaningar och bli mer skalbara. Avhandlingen omfattar tre artiklar som använder kvalitativa metoder där arkivmaterial, såsom refereegranskade journalartiklar och branschrapporter, samt intervjuer med personer verksamma i branschen, används. Resultatet påvisar att cirkulära affärsmodellers skalbarhet går bortom organisationens gränser och omfattar tre olika perspektiv: effektivitet, anpassningsförmåga och altruism. Utöver detta identifieras tre områden som speciellt utmanande för skalbarheten. Dessa utmaningar härrör från två olika designteman av affärsmodeller och kan beskrivas som (i) ineffektivitet, (ii) bristande kunskap och brist på färdiga lösningar, och (iii) obekantskap som resulterar i skepsis och dissonans med nuvarande praxis och policy. Slutligen bidrar avhandlingen till litteraturen om cirkulära affärsmodeller genom att använda resursbaserad teori och dynamiska förmågor-teori för att identifiera resurser och dynamiska förmågor som är strategiskt viktiga för skalbarhet av cirkulära affärsmodeller. Avhandlingen bidrar således till kunskapen om cirkulära affärsmodellers skalbarhet genom att utvidga begreppet till att även innefatta aktiviteter som sträcker sig utanför den enskilda organisationen, samt inkludera ett triple bottom line-perspektiv. / Delvis finansierat av Handelsbankens forskningsstiftelser (Jan Wallander and Tom Hedelius Foundation, Tore Browalds Foundation)
305
Blockchain on Data Security : An interpretive approach on Cyber Security Professionals’ perceptions. Gkougkaras, Vasileios. January 2021 (has links)
The main purpose of this Master’s Thesis is to identify the perceptions of Cyber Security Professionals on Blockchain Technology and whether it could potentially become a substitute for the current data security systems or not. A literature review was initially performed in order to explore previous related research in the field of Cyber Security, its infrastructure and the main aspects of Blockchain Technology. Qualitative research was conducted with regard to the participants’ perceptions. More specifically, individual interviews were held with each one of the participants, all of whom are professionals in the field of Cyber Security. Five different themes emerged from the interviews, which were performed by asking open-ended questions and holding a dialogue with the participants. Those themes start with (1) their opinions on current data security infrastructure and methods. Following that theme, a half-hour presentation on basic blockchain operations and applications was held in order to identify their (2) current understanding of blockchain. (3) Discussions were held regarding blockchain’s scalability and sustainability, followed by (4) the security of blockchains. In the end, the (5) fifth and final theme covered the main purpose of this master’s thesis, which is whether blockchain can be implemented as an alternate technology for data privacy and security. This Master’s Thesis contributes to the current knowledge on Blockchain within the field of informatics by providing the perceptions of Cyber Security Professionals with regard to its operation and implementation. Furthermore, it aims to identify whether any possible applications of blockchain technology could be suggested for future implementation in the domain of data security. The discussion summarizes the empirical findings acquired from the interviews and the perceptions of the participants on Blockchain Technology. By exploring the themes that emerged from the interviews, it is clearly evident that, according to the participants, blockchain is still not a technology that can be trusted as an alternative to the current security infrastructures and methods. Despite that fact, a lot of advantages and optimistic elements were derived, since blockchain’s immutability and security demonstrate a high level of tampering resistance, thus making it suitable as a technology within the information security industry. In case blockchain manages to overcome its shortcomings, it could prove to be a necessary tool in data privacy and security.
306
Passage à l'échelle pour la visualisation interactive exploratoire de données : approches par abstraction et par déformation spatiale / Addressing scaling challenges in interactive exploratory visualization with abstraction and spatial distortion. Richer, Gaëlle. 26 November 2019 (has links)
La visualisation interactive est un outil essentiel pour l'exploration, la compréhension et l'analyse de données. L'exploration interactive efficace de jeux de données grands ou complexes présente cependant deux difficultés fondamentales. La première est visuelle et concerne les limitations de la perception et cognition humaine, ainsi que celles des écrans. La seconde est computationnelle et concerne les limitations de capacité mémoire ou de traitement des machines standards. Dans cette thèse, nous nous intéressons aux techniques de passage à l'échelle relativement à ces deux difficultés, pour plusieurs contextes d'application. Pour le passage à l'échelle visuelle, nous présentons une approche versatile de mise en évidence de sous-ensembles d'éléments par déformation spatiale appliquée aux vues multiples et une représentation abstraite et multi-échelle de coordonnées parallèles. Sur les vues multiples, la déformation spatiale vise à remédier à la diminution de l'efficacité de la surbrillance lorsque les éléments graphiques sont de taille réduite. Sur les coordonnées parallèles, l'abstraction multi-échelle consiste à simplifier la représentation tout en permettant d'accéder interactivement au détail des données, en les pré-agrégeant à plusieurs niveaux de détail. Pour le passage à l'échelle computationnelle, nous étudions des approches de pré-calcul et de calcul à la volée sur des infrastructures distribuées permettant l'exploration de jeux de données de plus d'un milliard d'éléments en temps interactif. Nous présentons un système pour l'exploration de données multi-dimensionnelles dont les interactions et l'abstraction respectent un budget en nombre d'éléments graphiques qui, en retour, fournit une borne théorique sur les latences d'interactions dues au transfert réseau entre client et serveur. Avec le même objectif, nous comparons des stratégies de réduction de données géométrique pour la reconstruction de cartes de densité d'ensembles de points. / Interactive visualization is helpful for exploring, understanding, and analyzing data. However, increasingly large and complex data challenges the efficiency of visualization systems, both visually and computationally. The visual challenge stems from human perceptual and cognitive limitations as well as screen space limitations, while the computational challenge stems from the processing and memory limitations of standard computers. In this thesis, we present techniques addressing the two scalability issues for several interactive visualization applications. To address visual scalability requirements, we present a versatile spatial-distortion approach for linked emphasis on multiple views and an abstract and multi-scale representation based on parallel coordinates. Spatial distortion aims at alleviating the weakened emphasis effect of highlighting when applied to small-sized visual elements. Multiscale abstraction simplifies the representation while providing detail on demand by pre-aggregating data at several levels of detail. To address computational scalability requirements and scale data processing to billions of items in interactive times, we use pre-computation and real-time computation on a remote distributed infrastructure. We present a system for multidimensional data exploration in which the interactions and abstract representation comply with a visual item budget and, in return, provide a guarantee on network-related interaction latencies. With the same goal, we compared several geometric reduction strategies for the reconstruction of density maps of large-scale point sets.
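To make the pre-aggregation idea concrete, here is a minimal sketch, assuming a simple multidimensional binning scheme and an arbitrary item budget, of how data could be pre-aggregated at several levels of detail and the finest level fitting the budget selected; it illustrates the principle only and is not the system described in the thesis.

```python
# Illustrative sketch (not the thesis's actual system): pre-aggregate a large
# table at several levels of detail so an interactive view can pick the finest
# level whose number of graphical items fits a fixed budget. Bin counts and the
# budget value are assumptions for the example.
import numpy as np

def build_levels(data, bin_counts=(4, 8, 16, 32)):
    """Pre-aggregate rows of `data` (n_rows x n_dims, values in [0, 1]): each level
    maps multidimensional bin indices to the number of rows falling in that bin."""
    levels = []
    for bins in bin_counts:
        idx = np.minimum((data * bins).astype(int), bins - 1)
        keys, counts = np.unique(idx, axis=0, return_counts=True)
        levels.append({"bins": bins, "keys": keys, "counts": counts})
    return levels

def pick_level(levels, item_budget=2000):
    """Return the finest level whose aggregated item count stays within the budget;
    the budget in turn bounds how much must travel from server to client."""
    chosen = levels[0]
    for level in levels:
        if len(level["counts"]) <= item_budget:
            chosen = level
    return chosen

rng = np.random.default_rng(0)
data = rng.random((200_000, 4))          # 200k rows, 4 parallel-coordinate axes
level = pick_level(build_levels(data), item_budget=2000)
print(level["bins"], len(level["counts"]))   # -> 4 bins per axis, 256 aggregated items
```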
307
Réseaux virtualisés de prochaine génération basés sur SDN / Next-generation SDN based virtualized networks. Rifai, Myriana. 25 September 2017 (links)
Les réseaux logiciels (Software Defined Network - SDN) permettent la programmation du réseau et facilitent sa configuration. Bien que SDN améliore les performances, il reste confronté à de multiples défis. Dans cette thèse, nous avons développé des solutions qui constituent un premier pas vers les réseaux SDN de prochaine génération. D’abord, nous présentons MINNIE qui permet la scalabilité des commutateurs SDN, qui ne supportent que quelques milliers de règles dans leur coûteuse mémoire TCAM. MINNIE comprime dynamiquement les règles de routage installées dans la TCAM, augmentant ainsi le nombre de règles pouvant être installées. Ensuite, nous abordons le problème de la dégradation de performance des flux courts avec un prototype d’ordonnancement qui exploite les statistiques des commutateurs pour diminuer leur délai de bout-en-bout. Puis, nous visons à diminuer l’intervalle de protection de 50ms qui n’est plus adapté aux applications modernes et réduit leur qualité d’expérience. Notre solution PRoPHYS s’appuie sur les statistiques des commutateurs dans les réseaux hybrides pour découvrir les pannes de liens plus vite que les solutions existantes. Enfin, nous abordons le problème de l’efficacité énergétique qui souvent mène à une dégradation de performance. Nous présentons SENAtoR, qui exploite les nœuds SDN en réseaux hybrides pour éteindre les nœuds réseau sans entraver la performance. Également, nous présentons SEaMLESS qui convertit le service fourni par une machine virtuelle inactive en une fonction de réseaux virtuelle pour permettre à l’administrateur d’utiliser les ressources bloquées tout en maintenant la disponibilité du service. / Software Defined Networking (SDN) was created to provide network programmability and ease complex configuration. Though SDN enhances network performance, it still faces multiple limitations. In this thesis, we build solutions that form a first step towards creating next-generation SDN based networks. In the first part, we present MINNIE to scale the number of rules of SDN switches far beyond the few thousand rules commonly available in TCAM memory, which permits handling typical data center traffic at very fine grain. To do so, MINNIE dynamically compresses the routing rules installed in the TCAM, increasing the number of rules that can be installed. In the second part, we tackle the degraded performance of short flows and present a coarse-grained scheduling prototype that leverages SDN switch statistics to decrease their end-to-end delay. Then, we aim at decreasing the 50 ms failure protection interval, which is not adapted to current broadband speeds and can lead to degraded Quality of Experience. Our solution PRoPHYS leverages the switch statistics in hybrid networks to anticipate link failures, drastically decreasing the number of packets lost. Finally, we tackle the greening problem, where energy efficiency often comes at the cost of performance degradation. We present SENAtoR, our solution that leverages SDN nodes in hybrid networks to turn off network devices without hindering the network performance. We also present SEaMLESS, which converts idle virtual machines into virtual network functions (VNF) to enable the administrator to further consolidate the data center by turning off more physical servers and reusing resources (e.g. RAM) that are otherwise monopolized.
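As an illustration of the kind of routing-table compression MINNIE performs, the sketch below aggregates rules that share a destination and adds a default rule; the rule format and the compression heuristic are assumptions made for illustration and do not reproduce MINNIE's actual algorithm.

```python
# Hypothetical sketch of TCAM routing-rule compression: if all rules towards the
# same destination share one output port, replace them with a single wildcard
# rule; the most common port becomes a default rule matched last.
from collections import Counter, defaultdict

def compress_by_destination(rules):
    """rules: list of (src, dst, port). Returns a smaller rule list in which
    ('*', dst, port) wildcards replace per-destination groups with a unique port
    and ('*', '*', default_port) absorbs everything else (matched last)."""
    by_dst = defaultdict(set)
    for src, dst, port in rules:
        by_dst[dst].add(port)

    default_port = Counter(p for _, _, p in rules).most_common(1)[0][0]
    compressed = []
    for src, dst, port in rules:
        if len(by_dst[dst]) == 1:               # whole destination group -> one wildcard rule
            if port != default_port and ("*", dst, port) not in compressed:
                compressed.append(("*", dst, port))
        elif port != default_port:              # mixed ports: keep the exact rule
            compressed.append((src, dst, port))
    compressed.append(("*", "*", default_port))  # default rule, lowest priority
    return compressed

rules = [("h1", "h4", 2), ("h2", "h4", 2), ("h1", "h6", 3), ("h2", "h6", 3),
         ("h3", "h6", 3), ("h1", "h5", 3), ("h2", "h5", 2)]
print(compress_by_destination(rules))
# 7 exact rules shrink to 3: one destination wildcard, one exact rule, one default
```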
308
A concept for nanoparticle-based photocatalytic treatment of wastewater from textile industry. Le, Hoai Nga. 14 September 2018 (has links)
Industrial wastewater, such as the effluents from textile and garment companies, may contain toxic organic pollutants, which resist conventional wastewater treatment. Their complete and environmentally friendly degradation requires innovative technologies. Photocatalysis, an advanced oxidation process, can serve this purpose. Since 1972, when the photocatalytic activity of titanium dioxide was first noticed, photocatalysis has drawn the attention of scientists and engineers but it has not yet been widely applied in industrial practice. This is mainly related to the challenges of up-scaling from laboratory experiments to large production sites.
The main goal of this thesis is to develop a concept of nanoparticle-based photocatalysis for the treatment of wastewater. Ideally, process parameters should be adjustable and process conditions should be well-defined. These constraints are prerequisite for establishing process models and comparing the photocatalytic efficiency of different photocatalysts or for different pollutants. More importantly, the configuration should be scalable, in order to cover a wide spectrum of applications.
In response to these requirements, this thesis introduces a new reactor concept for photocatalytic wastewater treatment, which relies on finely dispersed photocatalysts as well as uniform and defined process conditions with regard to illumination and flow. The concept was realized in a photocatalytic setup with an illuminated flow reactor. The flow channel has a rectangular cross section and meanders in a plane exposed to two-dimensional illumination. Crucial process parameters, e.g., volumetric flow rate and light intensity, can be adjusted in a defined manner. This facilitates the study of the photocatalytic degradation of different organic pollutants in the presence of various photocatalytic materials under arbitrary illumination.
The thesis provides a comprehensive description of the operational procedures necessary to run photocatalytic reactions in the experimental setup. It includes three main steps: i) dispersion of photocatalysts, ii) equilibration with respect to pollutant adsorption and iii) accomplishing the photocatalytic reaction. Samples are collected in a mixing tank for online or offline analysis. The progressive decrease in the concentration of the organic pollutant is used to assess the activity of the photocatalytic materials.
A particular focus lies on the first of these steps, the dispersion of photocatalysts, because it is ignored in most studies. Typically, photocatalysts are in an aggregated state. The thesis demonstrates that type, intensity and energy of dispersion exert a crucial influence on size and morphology of the photocatalyst particles and, thus, on their optical properties and, accordingly, macroscopic photocatalytic behavior. Apart from this, proper dispersion is necessary to reduce the speed of gravitational solid-liquid separation, at best preventing catalyst sedimentation, and to avoid misleading results.
The photocatalytic performance was intensively investigated for the color removal of a model dye substance, methylene blue. Commercial titanium dioxide nanoparticles, widely explored in the literature, were used as a photocatalyst. Their characteristics (size, morphology, stability and optical properties) were determined. Photocatalytic experiments were carried out under UV irradiation. Influences of different factors, including the concentration of the photocatalyst, the concentration of the organic compounds, light intensity, optical pathlength and pH, were examined. The degradation was quantified via the decrease of methylene blue concentration. This conversion is, however, an immediate result influenced by all process parameters, e.g., the volume, the light intensity, the optical pathlength. Hence, kinetic models on macroscopic and microscopic levels are established. Normalizations with respect to process conditions are proposed. The apparent reaction kinetics are traced back to volume- and intensity-related reaction rate constants, and the reaction rate constant at the illuminated surface of the reactor. Additionally, the model is modified to be used for time-variant UV intensities, as encountered for solar photocatalysis. These achievements allow for a comparison of the experimental results from different laboratories. Moreover, they are a prerequisite for the translation of laboratory results into large-scale plants.
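As an illustration of the kinetic treatment described above, the following sketch fits an apparent pseudo-first-order rate constant from a measured dye decay and normalizes it by volume, illuminated area and light intensity; the decay data and the specific normalization formula are assumptions made for illustration, not the exact model of the thesis.

```python
# Hedged sketch: fit an apparent pseudo-first-order rate constant k_app from the
# measured dye decay C(t) = C0 * exp(-k_app * t), then normalize it by treated
# volume, illuminated area and light intensity. The normalization
# k_norm = k_app * V / (A * I) is an illustrative assumption only.
import numpy as np

def fit_k_app(t_min, c_mg_l):
    """Least-squares fit of ln(C0/C) = k_app * t through the origin; k_app in 1/min."""
    y = np.log(c_mg_l[0] / np.asarray(c_mg_l, dtype=float))
    t = np.asarray(t_min, dtype=float)
    return float(np.sum(t * y) / np.sum(t * t))

def normalized_rate_constant(k_app, volume_l, area_m2, intensity_w_m2):
    """Illustrative normalization: rate constant per illuminated area and per unit
    UV intensity, scaled by the treated volume."""
    return k_app * volume_l / (area_m2 * intensity_w_m2)

# Hypothetical methylene blue decay under UV
t = [0, 10, 20, 30, 45, 60]              # minutes
c = [10.0, 7.4, 5.5, 4.1, 2.6, 1.6]      # mg/L
k_app = fit_k_app(t, c)
print(round(k_app, 4), "1/min")          # roughly 0.03 1/min for these data
print(normalized_rate_constant(k_app, volume_l=0.5, area_m2=0.01, intensity_w_m2=30.0))
```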
Selected case studies for further applications are introduced. The photocatalytic degradation of different organic molecules (one antibiotic and two commercial dyes) with different photocatalytic materials (commercial nanomaterials and self-synthesized magnetic particles) under artificial or natural light sources was performed. Additionally, photocatalysis was studied in a realistic application. Preliminary tests with dye solutions of a textile company in Danang, Vietnam, impressively showed the feasibility of wastewater treatment by means of photocatalysis. Based on the reported wastewater capacity of the company's current treatment plant, the necessary process parameters were assessed. This rough estimation showed that photocatalysis can improve the capability of the current wastewater treatment plant.
In conclusion, this thesis presents a concept for wastewater treatment by slurry photocatalysis. As the process conditions are adjustable and definable, the process can be ideally performed in laboratories for research purposes, where different materials need to be tested and the working volume can be lower than hundreds of milliliters. The photocatalytic configuration is expected to work with a capacity of hundreds of liters, although the corresponding experimental evidence is reserved for further up-scaling studies.
309
Skalierbare Ausführung von Prozessanwendungen in dienstorientierten Umgebungen / Scalable execution of process applications in service-oriented environments. Preißler, Steffen. 25 October 2012 (has links)
Die Strukturierung und Nutzung von unternehmensinternen IT-Infrastrukturen auf Grundlage dienstorientierter Architekturen (SOA) und etablierter XML-Technologien ist in den vergangenen Jahren stetig gewachsen. Lag der Fokus anfänglicher SOA-Realisierungen auf der flexiblen Ausführung klassischer, unternehmensrelevanter Geschäftsprozesse, so bilden heutzutage zeitnahe Datenanalysen sowie die Überwachung von geschäftsrelevanten Ereignissen weitere wichtige Anwendungsklassen, um sowohl kurzfristig Probleme des Geschäftsablaufes zu identifizieren als auch um mittel- und langfristige Veränderungen im Markt zu erkennen und die Geschäftsprozesse des Unternehmens flexibel darauf anzupassen. Aufgrund der geschichtlich bedingten, voneinander unabhängigen Entwicklung der drei Anwendungsklassen, werden die jeweiligen Anwendungsprozesse gegenwärtig in eigenständigen Systemen modelliert und ausgeführt. Daraus resultiert jedoch eine Reihe von Nachteilen, welche diese Arbeit aufzeigt und ausführlich diskutiert. Vor diesem Hintergrund beschäftigte sich die vorliegende Arbeit mit der Ableitung einer konsolidierten Ausführungsplattform, die es ermöglicht, Prozesse aller drei Anwendungsklassen gemeinsam zu modellieren und in einer SOA-basierten Infrastruktur effizient auszuführen. Die vorliegende Arbeit adressiert die Probleme einer solchen konsolidierten Ausführungsplattform auf den drei Ebenen der Dienstkommunikation, der Prozessausführung und der optimalen Verteilung von SOA-Komponenten in einer Infrastruktur. / The structuring and use of company-internal IT infrastructures based on service-oriented architectures (SOA) and established XML technologies has grown steadily in recent years. While early SOA implementations focused on the flexible execution of classical, business-relevant processes, timely data analyses and the monitoring of business-relevant events nowadays form further important application classes, both to identify problems in business operations at short notice and to detect medium- and long-term changes in the market and adapt the company's business processes to them flexibly. Because of the historically separate, mutually independent development of the three application classes, their application processes are currently modelled and executed in separate systems. This, however, leads to a number of drawbacks, which this work identifies and discusses in detail. Against this background, this thesis derives a consolidated execution platform that makes it possible to model processes of all three application classes together and to execute them efficiently in an SOA-based infrastructure. The thesis addresses the problems of such a consolidated execution platform on the three levels of service communication, process execution, and the optimal distribution of SOA components in an infrastructure.
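As a rough illustration of the consolidation idea, the sketch below expresses a classical business-process step, a data-analysis step and an event-monitoring rule in one process model executed by a single engine; the step types, the dispatch interface and the example rules are assumptions made for illustration, not the platform design developed in the thesis.

```python
# Illustrative sketch only: one consolidated process model in which a service
# call, a data-analysis step and an event-monitoring rule are described
# uniformly and executed by the same engine.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    name: str
    kind: str                        # "service" | "analysis" | "event"
    action: Callable[[dict], dict]   # consumes and enriches the process context

class ConsolidatedEngine:
    """Executes all three application classes through one execution path."""
    def __init__(self, steps: List[Step]):
        self.steps = steps

    def run(self, context: dict) -> dict:
        for step in self.steps:
            context = step.action(context)
        return context

process = ConsolidatedEngine([
    Step("check-order", "service",  lambda ctx: {**ctx, "order_ok": ctx["amount"] < 10_000}),
    Step("sales-trend", "analysis", lambda ctx: {**ctx, "trend": sum(ctx["history"]) / len(ctx["history"])}),
    Step("alert-rule",  "event",    lambda ctx: {**ctx, "alert": ctx["trend"] > 500 and not ctx["order_ok"]}),
])
print(process.run({"amount": 12_000, "history": [400, 550, 700]}))
```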
310
Modelo de Negocio InspirARTE: Boxes de pintura personalizados / Business Model InspirARTE: Personalized paint boxes. Diaz Espejo, Pedro Enrique; Junco Meza, Cynthia Lorena; Párraga Jiménez, Emily De Los Ángeles; Sotelo Colonia, Gabriel Ricardo; Soto Castro, Victor Nicolaz. 27 November 2020 (has links)
En el presente proyecto, se desarrolló un modelo de negocio justificable, escalable y competitivo en el largo plazo. El proyecto se basa en la comercialización de un box personalizado que incluye diversos artículos de arte variados dirigidos a hombres y mujeres de 18 a 55 años. El proyecto inició a través de un proceso de ideación que fue a su vez validado por una metodología cualitativa, que se desarrolló a través de entrevistas al mercado seleccionado y a expertos del rubro.
Para validar los supuestos del modelo de negocio, se desarrolló un prototipo del producto (MVP) que fue presentado a los segmentos seleccionados, lo cual nos permitió obtener aprendizajes valiosos sobre los gustos y preferencias de los consumidores.
Posteriormente, se realizó el concierge del proyecto, en el cual, se demostró que los socios claves proveerán de los productos que tengan una excelente relación entre calidad y precio. Así mismo, el mercado tiene un alto nivel de aceptación con relación a nuestro producto, haciendo atractiva su adquisición.
Se considera que el producto será escalable y viable de acuerdo a las tendencias que existen en el mercado. Además, las estrategias a implementar por la organización, como las de marketing, y las alianzas estratégicas, potenciarán a la empresa para alcanzar la visión y misión establecida. / In this project, a justifiable, scalable and competitive business model was developed in the long term. The project is based on the commercialization of a personalized box that includes various articles of art aimed at men and women from 18 to 55 years old. The project began through an ideation process that was in turn validated by a qualitative methodology, which was developed through interviews with the selected market and experts in the field.
To validate the assumptions of the business model, a product prototype (MVP) was developed and presented to the selected segments, which allowed us to obtain valuable insights on consumer tastes and preferences.
Furthermore, the concierge phase was carried out, in which it was demonstrated that key partners will provide us with products that have an excellent relationship between quality and price, and also that the market has a high level of acceptance of our product, making its acquisition attractive.
It is considered that the product will be scalable and viable according to the trends that exist in the market. Furthermore, the strategies to be implemented by the organization, such as marketing, and strategic alliances, will empower the company to achieve the established vision and mission. / Trabajo de investigación