101

Utvärdering av kalciumnitrat som bindetidsaccelerator / Evaluation of calcium nitrate as a setting time accelerator

Rafiq, Ari, HamaAmin, Garmian January 2013 (has links)
The aim is to shorten the setting time of concrete, that is, the period during which the concrete surface can still be worked so that it is smooth after casting. This is a major challenge for producers of ready-mixed concrete in winter, since the setting time grows longer the colder the climate is. The purpose of this laboratory study was to show how calcium nitrate works as a setting time accelerator in concrete, and whether calcium nitrate affects the physical properties of the concrete. The following factors were studied to see how they affect the setting time when calcium nitrate is used: the initial temperature of the concrete, the type of superplasticizer in the concrete, the initial consistency of the concrete, and the storage climate of the concrete. A further goal was to find the right dosage for the product to be profitable to use in practice. All tests were carried out at the Sika AB laboratory. All data were carefully examined and processed in Excel to produce tables and diagrams. The results and conclusions of the investigations were as follows. The setting time can be shortened with calcium nitrate without risking the physical properties of the concrete. According to the setting time diagrams, the 2.0 % and 2.5 % dosages gave the best results, that is, the shortest setting times. Note that the +5 °C storage climate gave illogical results: the reference concrete without the accelerator had the shortest setting time. The compressive strength is not affected by the accelerator, so the product can be used without risking the load-bearing capacity of the concrete. The results showed that the initial consistency of the concrete strongly influences the setting time: the higher the consistency value, the longer the setting time. The initial temperature of the concrete also affects the setting time: the higher the concrete temperature, the shorter the setting time.
102

Coordinate conversion for the Hough transform

Eriksson, Edvin January 2021 (has links)
This thesis attempts to develop a conversion algorithm between local coordinates in constituent detector modules and global coordinates encompassing the whole detector structure in a generic detector. The thesis is part of preparatory work for studying the Hough transform as a means of track reconstruction in the level-1 hardware trigger of the upgraded trigger and data acquisition (TDAQ) system in the phase 2 upgrade of the ATLAS detector at CERN. The upgrades are made to withstand the much more extreme conditions that come with the High-Luminosity Large Hadron Collider (HL-LHC). Two algorithms have been developed and implemented in Python scripts to test their feasibility and to compare them against each other. The Rotation algorithm uses several rotations to correctly place the local coordinates in the global system. The second, the Shear algorithm, simplifies the process into two shears and one rotation, using the small angle approximation. Both algorithms need to be extended to work with more parts of the detector to be considered complete. Despite having lower maximum precision, the second algorithm is considered the most promising attempt, since it is much less sensitive to the truncation error that results from working in an integer environment, which is a requirement for use in FPGAs.
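As a rough illustration of the two approaches compared above, the sketch below converts a local hit coordinate to global coordinates in two ways: an exact rotation plus translation, and a small-angle linearisation of the same transform. The module position, tilt angle and hit coordinates are invented for the example, and the linearised version is only a simplified stand-in for the two-shears-plus-rotation algorithm described in the thesis, not a reproduction of it.

```python
import math

def rotate_to_global(x_loc, y_loc, x_mod, y_mod, phi):
    """Exact conversion: rotate the local hit by the module tilt phi, then translate."""
    x_glob = x_mod + x_loc * math.cos(phi) - y_loc * math.sin(phi)
    y_glob = y_mod + x_loc * math.sin(phi) + y_loc * math.cos(phi)
    return x_glob, y_glob

def linearised_to_global(x_loc, y_loc, x_mod, y_mod, phi):
    """Small-angle approximation: cos(phi) ~ 1 and sin(phi) ~ phi."""
    x_glob = x_mod + x_loc - y_loc * phi
    y_glob = y_mod + x_loc * phi + y_loc
    return x_glob, y_glob

# Hypothetical module placed at (100, 250) with a 30 mrad tilt; local hit at (4.2, 1.3).
print(rotate_to_global(4.2, 1.3, 100.0, 250.0, 0.030))
print(linearised_to_global(4.2, 1.3, 100.0, 250.0, 0.030))
```

At small tilt angles the two results agree closely, which is what makes the cheaper linearised form attractive for an integer-only FPGA implementation.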
103

Designing radiation protection for a linear accelerator : using Monte Carlo simulations / Framtagning av förslag på förstärkt strålskydd för en linjäraccelerator : med hjälp av Monte Carlo-simuleringar

Lindahl, Jonatan January 2019 (has links)
The Department of Radiation Sciences at Umeå University has obtained an old linear accelerator intended for educational purposes. The goal of this thesis was to find suitable reinforced radiation protection for the intended bunker (a room with thick concrete walls), to ensure that the radiation outside the bunker stays within acceptable levels. The main method was Monte Carlo simulation. To simulate the accelerator properly, knowledge of the energy distribution of the emitted radiation was needed. For this, a novel method for spectrum determination was developed, using several depth-dose measurements including off-axis measurements; the method shows promising results in finding the spectrum when measurements outside the primary beam are included. The determined energy spectrum was then used to simulate the accelerator in the intended bunker. The resulting dose distribution was visualized together with 3D CAD images of the bunker, making it easy to see at which locations outside the bunker the dose was high. An important finding was that some changes are required to ensure that the public does not receive too high doses of radiation in a public outdoor area located above the bunker; otherwise, the accelerator may only be run 1.8 hours per year. A workaround could be to plant a thorn bush covering the dangerous area, which has a radius of 3 m. After such a measure has been taken, which is assumed in the following results, the focus moves to the radiation that leaks into the accelerator's intended control room, located right outside the bunker's entrance door. The results show that the accelerator may only be run for a maximum of 6.1 or 3.3 hours per year (depending on the placement of the accelerator in the room) without a specific extra reinforced radiation protection consisting mainly of lead bricks. With this extra protection added, the accelerator may instead be run 44 or 54 hours per year, a distinct improvement. However, the dose rate in the control room was still quite high, 13.7 μGy/h or 11.2 μGy/h, compared with the average dose rate received by someone living in Sweden, which is 0.27 μGy/h. Therefore, further measures are recommended. This is, however, a worst-case scenario, since the leakage radiation from the accelerator itself was simulated as having the same energy spectrum as the primary beam at 0.1 % of its intensity, which is the maximum leakage dose according to the accelerator's specifications. This probably overestimates the intensity, and the energy spectrum of the leakage is probably lower than that of the primary beam in at least some directions. Incorporating more knowledge of the leakage spectrum in future work should therefore result in more allowed run hours for the accelerator.
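The allowed run hours quoted above follow from dividing an annual dose constraint by the simulated dose rate at the location of interest. The snippet below shows that arithmetic for one of the reported dose rates; the annual constraint used here is an assumed example value, not the limit applied in the thesis.

```python
# Illustrative only: allowed beam-on hours from a dose-rate estimate.
# The annual dose constraint below is an assumed example value.
annual_dose_constraint_uGy = 600.0   # assumed yearly constraint at the control room [µGy]
dose_rate_uGy_per_h = 13.7           # simulated dose rate with the extra lead shielding [µGy/h]

allowed_hours_per_year = annual_dose_constraint_uGy / dose_rate_uGy_per_h
print(f"Allowed run time: {allowed_hours_per_year:.1f} h/year")  # ~43.8 h/year with these values
```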
104

Datenzentrierte Bestimmung von Assoziationsregeln in parallelen Datenbankarchitekturen / Data-centric determination of association rules in parallel database architectures

Legler, Thomas 15 August 2009 (has links) (PDF)
This thesis deals with the everyday usability of modern mass data processing, in particular with the problem of association rule analysis. The available data volumes are growing rapidly, but analysing them is difficult for untrained users, so companies forgo information that is in principle available. Association rules reveal dependencies between the elements of a dataset, for example between sold products. These rules can be annotated with interestingness measures that allow the user to recognize important relationships. Approaches are presented that make it easier for the user to analyse the data; this concerns both the robust operation of the methods and the simple evaluation of the rules. The presented algorithms adapt themselves to the data being processed, which distinguishes them from other methods. Association rule mining requires the extraction of frequent itemsets. Ways are shown to adapt solution approaches to the characteristics of modern systems. As one approach, methods for computing the N most frequent itemsets are described, which, unlike known approaches, are easy to configure. Modern systems also often compute in a distributed manner. Such clusters can process large data volumes in parallel but require local results to be merged. For distributed top-N frequent-itemset extraction on realistic partitionings, approaches with different properties are presented. Association rules are then built from the frequent itemsets, and preparing them for the user should likewise be easy. Many measures have been proposed in the literature; depending on the requirements, each corresponds to a particular subjective assessment, though not necessarily that of the user. It is therefore investigated how several interestingness measures can be combined into one global measure, which finds rules that appear important under several measures; the user can use these suggestions to narrow down the search goal. A second approach groups rules via the frequencies of the rule elements, which form the basis of interestingness measures; the rules in such a group are therefore similar with respect to many interestingness measures and can be evaluated together, reducing the user's manual effort. This thesis shows ways to extend association rule mining to a broad range of users and to reach new users. Association rule mining is simplified to such an extent that it can be used as an easy-to-use data analysis tool instead of a specialist application. / The importance of data mining is widely acknowledged today. Mining for association rules and frequent patterns is a central activity in data mining. Three main strategies are available for such mining: APRIORI, FP-tree-based approaches like FP-GROWTH, and algorithms based on vertical data structures and depth-first mining strategies like ECLAT and CHARM. Unfortunately, most of these algorithms are only moderately suitable for many “real-world” scenarios because their usability and the special characteristics of the data are two aspects of practical association rule mining that require further work. All mining strategies for frequent patterns use a parameter called minimum support to define a minimum occurrence frequency for searched patterns. This parameter cuts down the number of patterns searched to improve the relevance of the results.
In complex business scenarios, it can be difficult and expensive to define a suitable value for the minimum support because it depends strongly on the particular datasets. Users are often unable to set this parameter for unknown datasets, and unsuitable minimum-support values can extract millions of frequent patterns and generate enormous runtimes. For this reason, it is not feasible to permit ad-hoc data mining by unskilled users. Such users do not have the knowledge and time to define suitable parameters by trial-and-error procedures. Discussions with users of SAP software have revealed great interest in the results of association-rule mining techniques, but most of these users are unable or unwilling to set very technical parameters. Given such user constraints, several studies have addressed the problem of replacing the minimum-support parameter with more intuitive top-n strategies. We have developed an adaptive mining algorithm to give untrained SAP users a tool to analyze their data easily without the need for elaborate data preparation and parameter determination. Previously implemented approaches of distributed frequent-pattern mining were expensive and time-consuming tasks for specialists. In contrast, we propose a method to accelerate and simplify the mining process by using top-n strategies and relaxing some requirements on the results, such as completeness. Unlike such data approximation techniques as sampling, our algorithm always returns exact frequency counts. The only drawback is that the result set may fail to include some of the patterns up to a specific frequency threshold. Another aspect of real-world datasets is the fact that they are often partitioned for shared-nothing architectures, following business-specific parameters like location, fiscal year, or branch office. Users may also want to conduct mining operations spanning data from different partners, even if the local data from the respective partners cannot be integrated at a single location for data security reasons or due to their large volume. Almost every data mining solution is constrained by the need to hide complexity. As far as possible, the solution should offer a simple user interface that hides technical aspects like data distribution and data preparation. Given that BW Accelerator users have such simplicity and distribution requirements, we have developed an adaptive mining algorithm to give unskilled users a tool to analyze their data easily, without the need for complex data preparation or consolidation. For example, Business Intelligence scenarios often partition large data volumes by fiscal year to enable efficient optimizations for the data used in actual workloads. For most mining queries, more than one data partition is of interest, and therefore, distribution handling that leaves the data unaffected is necessary. The algorithms presented in this paper have been developed to work with data stored in SAP BW. A salient feature of SAP BW Accelerator is that it is implemented as a distributed landscape that sits on top of a large number of shared-nothing blade servers. Its main task is to execute OLAP queries that require fast aggregation of many millions of rows of data. Therefore, the distribution of data over the dedicated storage is optimized for such workloads. Data mining scenarios use the same data from storage, but reporting takes precedence over data mining, and hence, the data cannot be redistributed without massive costs. 
Distribution by special data semantics or user-defined selections can produce many partitions of very different sizes. Handling such real-world distributions in frequent-pattern mining is an important task, but it conflicts with the requirement of balanced partitions.
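A minimal sketch of the top-N idea described above: each partition counts its itemsets locally, the exact local counts are merged, and only the N most frequent itemsets are kept, so no minimum-support parameter is needed. The transactions, the restriction to small itemsets, and the naive full merge are illustrative simplifications, not the algorithm developed in the thesis.

```python
from collections import Counter
from itertools import combinations

def local_counts(transactions, max_size=2):
    """Count itemsets of up to max_size items in one data partition."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for size in range(1, max_size + 1):
            counts.update(combinations(items, size))
    return counts

def top_n_over_partitions(partitions, n=3):
    """Merge exact local counts from all partitions and keep the n most frequent itemsets."""
    total = Counter()
    for part in partitions:
        total.update(local_counts(part))
    return total.most_common(n)

# Two illustrative partitions, e.g. sales data split by fiscal year.
p1 = [["bread", "milk"], ["bread", "butter"], ["milk"]]
p2 = [["bread", "milk", "butter"], ["milk", "butter"]]
print(top_n_over_partitions([p1, p2], n=3))
```

The user only chooses how many results to see, which is the intuitive replacement for the technical minimum-support threshold discussed above.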
105

Datenzentrierte Bestimmung von Assoziationsregeln in parallelen Datenbankarchitekturen / Data-centric determination of association rules in parallel database architectures

Legler, Thomas 22 June 2009 (has links)
This thesis deals with the everyday usability of modern mass data processing, in particular with the problem of association rule analysis. The available data volumes are growing rapidly, but analysing them is difficult for untrained users, so companies forgo information that is in principle available. Association rules reveal dependencies between the elements of a dataset, for example between sold products. These rules can be annotated with interestingness measures that allow the user to recognize important relationships. Approaches are presented that make it easier for the user to analyse the data; this concerns both the robust operation of the methods and the simple evaluation of the rules. The presented algorithms adapt themselves to the data being processed, which distinguishes them from other methods. Association rule mining requires the extraction of frequent itemsets. Ways are shown to adapt solution approaches to the characteristics of modern systems. As one approach, methods for computing the N most frequent itemsets are described, which, unlike known approaches, are easy to configure. Modern systems also often compute in a distributed manner. Such clusters can process large data volumes in parallel but require local results to be merged. For distributed top-N frequent-itemset extraction on realistic partitionings, approaches with different properties are presented. Association rules are then built from the frequent itemsets, and preparing them for the user should likewise be easy. Many measures have been proposed in the literature; depending on the requirements, each corresponds to a particular subjective assessment, though not necessarily that of the user. It is therefore investigated how several interestingness measures can be combined into one global measure, which finds rules that appear important under several measures; the user can use these suggestions to narrow down the search goal. A second approach groups rules via the frequencies of the rule elements, which form the basis of interestingness measures; the rules in such a group are therefore similar with respect to many interestingness measures and can be evaluated together, reducing the user's manual effort. This thesis shows ways to extend association rule mining to a broad range of users and to reach new users. Association rule mining is simplified to such an extent that it can be used as an easy-to-use data analysis tool instead of a specialist application. / The importance of data mining is widely acknowledged today. Mining for association rules and frequent patterns is a central activity in data mining. Three main strategies are available for such mining: APRIORI, FP-tree-based approaches like FP-GROWTH, and algorithms based on vertical data structures and depth-first mining strategies like ECLAT and CHARM. Unfortunately, most of these algorithms are only moderately suitable for many “real-world” scenarios because their usability and the special characteristics of the data are two aspects of practical association rule mining that require further work. All mining strategies for frequent patterns use a parameter called minimum support to define a minimum occurrence frequency for searched patterns. This parameter cuts down the number of patterns searched to improve the relevance of the results.
In complex business scenarios, it can be difficult and expensive to define a suitable value for the minimum support because it depends strongly on the particular datasets. Users are often unable to set this parameter for unknown datasets, and unsuitable minimum-support values can extract millions of frequent patterns and generate enormous runtimes. For this reason, it is not feasible to permit ad-hoc data mining by unskilled users. Such users do not have the knowledge and time to define suitable parameters by trial-and-error procedures. Discussions with users of SAP software have revealed great interest in the results of association-rule mining techniques, but most of these users are unable or unwilling to set very technical parameters. Given such user constraints, several studies have addressed the problem of replacing the minimum-support parameter with more intuitive top-n strategies. We have developed an adaptive mining algorithm to give untrained SAP users a tool to analyze their data easily without the need for elaborate data preparation and parameter determination. Previously implemented approaches of distributed frequent-pattern mining were expensive and time-consuming tasks for specialists. In contrast, we propose a method to accelerate and simplify the mining process by using top-n strategies and relaxing some requirements on the results, such as completeness. Unlike such data approximation techniques as sampling, our algorithm always returns exact frequency counts. The only drawback is that the result set may fail to include some of the patterns up to a specific frequency threshold. Another aspect of real-world datasets is the fact that they are often partitioned for shared-nothing architectures, following business-specific parameters like location, fiscal year, or branch office. Users may also want to conduct mining operations spanning data from different partners, even if the local data from the respective partners cannot be integrated at a single location for data security reasons or due to their large volume. Almost every data mining solution is constrained by the need to hide complexity. As far as possible, the solution should offer a simple user interface that hides technical aspects like data distribution and data preparation. Given that BW Accelerator users have such simplicity and distribution requirements, we have developed an adaptive mining algorithm to give unskilled users a tool to analyze their data easily, without the need for complex data preparation or consolidation. For example, Business Intelligence scenarios often partition large data volumes by fiscal year to enable efficient optimizations for the data used in actual workloads. For most mining queries, more than one data partition is of interest, and therefore, distribution handling that leaves the data unaffected is necessary. The algorithms presented in this paper have been developed to work with data stored in SAP BW. A salient feature of SAP BW Accelerator is that it is implemented as a distributed landscape that sits on top of a large number of shared-nothing blade servers. Its main task is to execute OLAP queries that require fast aggregation of many millions of rows of data. Therefore, the distribution of data over the dedicated storage is optimized for such workloads. Data mining scenarios use the same data from storage, but reporting takes precedence over data mining, and hence, the data cannot be redistributed without massive costs. 
Distribution by special data semantics or user-defined selections can produce many partitions of very different sizes. Handling such real-world distributions in frequent-pattern mining is an important task, but it conflicts with the requirement of balanced partitions.
106

Simulations of Quantum Black Hole collisions at the LHC with PYTHIA

Niblaeus, Carl January 2011 (has links)
In this bachelor's thesis the concept of microscopic black hole production at colliders is investigated. By using extra-dimensional models in which the value of the Planck mass can be reduced to the TeV scale, gravity can be made stronger at small distances and the hierarchy problem can be solved. Since gravity is then much stronger already at the TeV scale, there is a possibility that microscopic black holes are produced at the LHC. In this thesis the possibility of producing Quantum Black Holes, black holes with masses around the Planck mass, is implemented in the event generator PYTHIA. Events in which the Quantum Black Holes decay into two particles are simulated and studied. A main contribution is the successful colour connection of the final-state particles. An open issue for future simulations is how to give the black holes a spectrum of masses.
107

Application of GEANT4 toolkit for simulations of high gradient phenomena

Persson, Daniel January 2018 (has links)
To study electron emission and dark currents in the accelerating structures of particle colliders, a test facility with a spectrometer has been constructed at CERN. This spectrometer has been simulated with the C++ toolkit GEANT4, and in this project the simulation was improved to handle new, realistic input data for the emitted electrons. The goal was to find relations between where the electrons are emitted inside the accelerating structure and the energy or position of the particles measured by the spectrometer. The result was that there is a linear relation between the initial position of the electrons and the spread in the positions of the particles measured by the spectrometer. There also appears to be a relation between the energy the emitted electrons gain in the accelerating structure, which is related to their position, and the energy they deposit in the spectrometer. Further studies in which the simulations are compared with real measurement data are required to determine whether these relations hold, to establish their reliability, and to gain a better understanding of the phenomena.
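As a small illustration of how such a linear relation could be quantified, the sketch below fits a straight line to pairs of emission position and measured position spread; the numbers are invented for the example and are not taken from the thesis or its simulations.

```python
import numpy as np

# Hypothetical data: emission position along the structure [mm] versus the spread
# (standard deviation) of the measured positions in the spectrometer [mm].
emission_pos = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
measured_spread = np.array([1.1, 1.9, 3.2, 4.0, 5.1, 5.9])

# Least-squares fit of a first-degree polynomial, i.e. a straight line.
slope, intercept = np.polyfit(emission_pos, measured_spread, 1)
print(f"spread ~ {slope:.3f} * position + {intercept:.3f}")
```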
108

Thermo-mechanical analysis of cryo-cooled electrode system in COMSOL

Olofsson, Joel January 2018 (has links)
In the planned linear accelerator called the Compact Linear Collider, CLIC, electrons and positrons will be accelerated to velocities near the speed of light. A limiting factor in accelerating structures is vacuum breakdowns, that is, electrical discharges from a surface caused by a large applied electric field. In the preparatory studies for CLIC, Uppsala University, in collaboration with the European Organization for Nuclear Research, CERN, is building a DC spark system to analyze vacuum breakdowns. This system, which contains large planar electrodes, will be cooled all the way down to around 4 K in order to limit the rate at which vacuum breakdowns occur. When cooling such a system, which consists of components made of different materials, the question is how the system will be affected. The objective of this project is to investigate how the cooling affects the stability of the system in terms of stresses and to analyze its cool-down time. Another goal is to make a material recommendation for a few parts based on the results. This is done by simulating the cooling in COMSOL Multiphysics, a program that uses finite element analysis to solve complex problems in which different branches of physics interact. The conclusion is that the system will most likely be stable as it is and there is no need to redesign it. The recommended material is alumina, since it should cause the least stress and the smallest gap between the electrodes after cool-down. There was no large difference in cool-down time between the materials. Further studies and simulations of the system are also recommended, since many factors were not taken into consideration in this study.
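As a rough back-of-the-envelope check of the thermal-stress question discussed above, the sketch below estimates the stress in a fully constrained part cooled from room temperature to 4 K using sigma = E * alpha * deltaT. The material values are assumed, order-of-magnitude textbook numbers for alumina, not the data used in the COMSOL model, and a real assembly is never fully constrained, so the finite element result will be lower.

```python
# Thermal stress in a fully constrained part: sigma = E * alpha * delta_T.
# Material data are assumed, order-of-magnitude values for alumina.
E = 370e9          # Young's modulus [Pa]
alpha = 5e-6       # assumed effective thermal expansion coefficient over the range [1/K]
T_start = 293.0    # room temperature [K]
T_end = 4.0        # operating temperature [K]

sigma = E * alpha * (T_start - T_end)   # [Pa], tensile if contraction is fully prevented
print(f"Estimated thermal stress: {sigma / 1e6:.0f} MPa")
```

The same comparison for candidate materials, each with its own E and alpha, is the simplest way to rank them before running the full multiphysics simulation.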
109

Den Nätverkande Inkubatorn : En kvalitativ studie om företagsinkubation och nätverkande som en väg mot rikt socialt kapital / The Networking Incubator : A qualitative study of business incubation and networking as a path toward rich social capital

Al Halabi, Danni January 2020 (has links)
A great challenge for startup companies is the lack of market knowledge and access to resources. Knowledge and resources can be acquired through networking; however, as many startup companies are young and unknown, they have difficulty establishing themselves in relevant networks. Incubators have for many years been an important instrument for the development of new companies and for regional development. One important thing incubators offer startup companies is access to a network. Networking establishes connections and strengthens relationships, which promotes richer social capital. The purpose of the study is to examine which factors of an incubator affect a startup company's development of social capital through networking, and the challenges associated with networking. The underlying idea of the study is that association with an incubator can strengthen the startup company's legitimacy and develop its social capital by offering access to a network. For a young and unknown company, legitimacy and trust are important elements for establishing itself in networks. Young companies run by founders without previous track records often lack legitimacy and become just one of many trying to be heard in the noise of the startup scene. The study uses a qualitative approach; three semi-structured interviews were conducted with representatives from two incubators and one industry organization that specializes in networking. The data were analyzed through a deductive thematic analysis. The theoretical framework draws on theories from the business administration field concerning entrepreneurship, organizational networks, business incubators, and the creation of social capital within an organizational environment. The study finds that association with an incubator can increase the company's legitimacy in the form of competence-based trust, which is grounded in knowledge of or involvement with another actor. However, trust based on the founder's reputation and past experience has a greater legitimating effect than merely being associated with an incubator. The study also finds that the network an incubator makes available to startup companies consists of a network within the incubator's own ecosystem. The network ties consist of specialized mentors with recognized knowledge within one or several fields, as well as the incubator management, which can be interpreted as the incubator offering a qualitative social capital from which startup companies can acquire knowledge and resources. The study further finds that aspiring Born Global founders who want access to an international network should first review what type of network the incubator can offer. Finally, the study identifies the challenges associated with networking through an incubator. Internal networking is challenged by competition between companies, a perception that networking does not add any value, and a fear of accidentally sharing technological secrets, which shifts companies from interacting and generating richer social capital toward becoming isolated or focusing on already established contacts the founder trusts. Challenges linked to networking with mentors include a lack of openness and full transparency. Challenges linked to external networking outside the incubator's ecosystem may include stakeholder recommendations to network only within the incubator's own ecosystem.
110

Key Business Services within Open Innovation Collaboration between Startups and large established Firms : A multiple case study of the value offering of Swedish corporate accelerators and incubators from a startup perspective / Centrala Affärsutvecklingstjänster inom Öppen Innovations-samarbeten mellan Startupföretag och stora väletablerade Företag : En multipel fallstudie av värdeerbjudandet av företagsdrivna acceleratorer och inkubatorer inom den svenska marknaden från ett startup-perspektiv

Abu Zeid, Houda, Syed, Tanya January 2019 (has links)
Open innovation is a term that has become popularized over the years, owing to changes in how business is done as a result of globalization and digital transformation. Incumbent companies are making efforts to collaborate with external parties to a greater extent, and at the same time the startup landscape has contributed new technologies and innovations that in some cases have disrupted markets. A collaboration between large companies and startups can bring about positive synergies, since these two types of organizations are different and can complement each other. This master's thesis looks into the outside-in model of open innovation, specifically examining corporate accelerator programs and incubation hubs from a startup perspective. The research explores which key services are offered within these corporate programs and how they can be improved, according to startups that have previously taken part in them. The research is a qualitative study with an abductive approach. As part of the research, 10 semi-structured, in-depth interviews were held with representatives from a variety of startups. The main services the interviewees want included in corporate-run startup programs range from access to internal and external networks to a greater focus on a variety of funding alternatives. Early-stage startups expressed a desire for help with understanding their market and customers. The key improvement areas brought up by the startup companies included the presence of internal champions who can help speed up certain processes and act as facilitators for important meetings. Many startups point to the importance of being able to customize their program experience. In addition, accelerator and incubator employees with previous entrepreneurial experience are considered very helpful by the startups, since they can better grasp the struggles of a startup. Furthermore, more financing opportunities are desirable.
