91 |
Who is left out? Hidden Patterns of Birth Under-registration: A Case Study about Iran
Samadi Dezfouli, Sahba, January 2017
Universal coverage of birth registration by 2030 is one of the Sustainable Development targets, and it is of great significance for the accomplishment of many development goals, such as poverty eradication, inclusion, and the improvement of several health factors. Despite the importance of this topic, little academic attention has been paid to the problem of birth under-registration from the perspective of development studies. This research studies the issue of birth under-registration through a case study of Iran. Its four main questions concern the quantitative significance of the problem, the main causes of birth under-registration, the most affected social groups, and the main problematic domain of action, in the context of Iran. By utilizing an abductive content analysis method, this research aims to understand the problem rather than propose policy recommendations. This desk study uses secondary sources, almost all of a qualitative nature. It is not based on any pre-defined theory and therefore does not aim to generalize or theorize the findings; it does, however, draw on available theories to develop the analytical framework. The adopted framework is Bottleneck Analysis, a method designed by UNICEF specifically for birth registration programming and policy evaluation. Birth under-registration in Iran, compared to other countries in the region, turned out to be very low. The findings provide information on many good practices in birth registration programming, as well as on several areas in need of improvement in Iran. By applying the analytical framework to the findings, the identified disincentives have been categorized into three domains (supply, demand, and enabling environment), and the significance of the disincentives in each domain has been assessed.
Based on the analysis, it can be concluded that almost all of the main causes of birth under-registration are of a legal nature, especially patriarchal nationality laws. The main risk groups were found to be children of illegal immigrants, non-nationals, and unregistered parents, and the main problematic domain was found to be that of supply.
|
92 |
Analys och utveckling av godsterminal med hjälp av Lean / Analysis and Development of a Freight Terminal Using Lean
Sawaya, Christopher, Vukman, Boris, January 2021
Purpose: To examine a freight terminal's logistics flow, with a focus on an automated parcel conveyor, in order to identify waste that can lead to unused working time and bottlenecks, and to identify causes and develop solutions that lead to fully utilized work capacity through the implementation of Lean tools. The purpose of the study was based on two research questions: • How can waste within a freight terminal be identified and cost-optimized using Lean tools? • How can root causes of bottlenecks at automated parcel conveyors be identified and counteracted using Lean tools? Method and implementation: The purpose and research questions were addressed through a case study and a literature review. The case study consisted of interviews and observations at the case company, carried out to understand how the company works and how its work processes are designed, in order to identify waste and misplaced priorities in the terminal layout around the parcel conveyor. In the literature review, theory covering logistics costs, waste, and Lean was collected to answer the research questions. Results: Using Lean, waste at the case company's automated parcel conveyor could be identified. Waiting was one of the wastes found, and focus was placed on finding a method to counteract it and thereby make the workflow more efficient. Unnecessary transports were another problem that became evident during the case study; the study resulted in a considerable reduction of the distances travelled in unnecessary transports, so that operators at the parcel conveyor can instead spend their time on value-adding work. The study also reduced unnecessary work steps by identifying them at the conveyor's defect-picking station, which yielded time savings and a structured working method for that task. Together, the identified wastes make it possible to counteract bottlenecks and prevent production stops on the parcel conveyor.
Recommendations: For the company to counteract bottlenecks and identify waste at the freight terminal, all work steps should be mapped. Focus should be placed on the profitability of each work step, and the workforce should, where possible, be trained in Lean so that all affected staff can jointly strive toward standardization and implementation of Lean.
|
93 |
Explorative bioinformatic analysis of cardiomyocytes in 2D & 3D in vitro culture system
Janardanan, Sruthy, January 2021
In vitro cell culture models of human pluripotent stem cell (hPSC)-derived cardiomyocytes (CMs) have gained predominant value in the field of drug discovery and are considered an attractive tool for cardiovascular disease modelling. However, despite several reports of different protocols for hPSC differentiation into CMs, the development of an efficient, controlled and reproducible 3D differentiation remains challenging. The main aim of this research study was to understand the changes in gene expression caused by the spatial orientation of hPSC-derived CMs in 2D (two-dimensional) and 3D (three-dimensional) culture conditions, and to identify the topologically important Hub and Hub-Bottleneck proteins using centrality measures, in order to gain new knowledge for standardizing pre-clinical models for the regeneration of CMs. This aim was achieved through an extensive bioinformatic analysis of the differentially expressed genes (DEGs) identified from RNA-sequencing (RNA-Seq). Functional annotation analysis of the DEGs from both 2D and 3D cultures was performed using the Cytoscape plug-in ClueGO, followed by topological analysis of the protein-protein interaction network (PPIN) using two centrality parameters, Degree and Betweenness, in the Cytoscape plug-in CenTiScaPe. The results revealed that, compared to 2D, DEGs in 3D are primarily associated with cell signalling, suggesting increased interaction between cells as an effect of the 3D microenvironment. The topological analysis identified 32 Hub and 39 Hub-Bottleneck proteins in 3D, indicating that these genes and their corresponding proteins could, with further research, be utilized as cardiac disease biomarkers.
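The Hub / Hub-Bottleneck idea above can be sketched in a few lines: Hubs are high-degree nodes, and Hub-Bottlenecks are Hubs that also have high betweenness. This is an illustrative sketch on a toy graph with quantile thresholds chosen here for demonstration; it is not the thesis's PPIN or the exact CenTiScaPe cut-offs.

```python
import networkx as nx

def classify_hubs(g, deg_q=0.8, btw_q=0.8):
    """Classify nodes as Hub (high degree) or Hub-Bottleneck
    (high degree AND high betweenness), using quantile cut-offs
    on the two centrality distributions."""
    deg = dict(g.degree())
    btw = nx.betweenness_centrality(g)
    # thresholds: empirical quantiles of each centrality distribution
    ds, bs = sorted(deg.values()), sorted(btw.values())
    d_cut = ds[int(deg_q * (len(ds) - 1))]
    b_cut = bs[int(btw_q * (len(bs) - 1))]
    hubs = {n for n in g if deg[n] >= d_cut}
    hub_bottlenecks = {n for n in hubs if btw[n] >= b_cut}
    return hubs, hub_bottlenecks

# Toy network: two dense clusters joined through a short bridge.
# The cluster nodes touching the bridge come out as Hub-Bottlenecks.
g = nx.barbell_graph(4, 1)
hubs, hub_bottlenecks = classify_hubs(g)
```

On the barbell graph, nodes 3 and 5 (the cluster nodes adjacent to the bridge) are both high-degree and high-betweenness, so they are the Hub-Bottlenecks; the bridge node itself has high betweenness but low degree, so it is neither.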
|
94 |
Apprentissage et exploitation de représentations sémantiques pour la classification et la recherche d'images / Learning and exploiting semantic representations for image classification and retrieval
Bucher, Maxime, 27 November 2018
Dans cette thèse nous étudions différentes questions relatives à la mise en pratique de modèles d'apprentissage profond. En effet malgré les avancées prometteuses de ces algorithmes en vision par ordinateur, leur emploi dans certains cas d'usage réels reste difficile. Une première difficulté est, pour des tâches de classification d'images, de rassembler pour des milliers de catégories suffisamment de données d'entraînement pour chacune des classes. C'est pourquoi nous proposons deux nouvelles approches adaptées à ce scénario d'apprentissage, appelé <<classification zero-shot>>.L'utilisation d'information sémantique pour modéliser les classes permet de définir les modèles par description, par opposition à une modélisation à partir d'un ensemble d'exemples, et rend possible la modélisation sans donnée de référence. L'idée fondamentale du premier chapitre est d'obtenir une distribution d'attributs optimale grâce à l'apprentissage d'une métrique, capable à la fois de sélectionner et de transformer la distribution des données originales. Dans le chapitre suivant, contrairement aux approches standards de la littérature qui reposent sur l'apprentissage d'un espace d'intégration commun, nous proposons de générer des caractéristiques visuelles à partir d'un générateur conditionnel. Une fois générés ces exemples artificiels peuvent être utilisés conjointement avec des données réelles pour l'apprentissage d'un classifieur discriminant. Dans une seconde partie de ce manuscrit, nous abordons la question de l'intelligibilité des calculs pour les tâches de vision par ordinateur. En raison des nombreuses et complexes transformations des algorithmes profonds, il est difficile pour un utilisateur d'interpréter le résultat retourné. Notre proposition est d'introduire un <<goulot d'étranglement sémantique>> dans le processus de traitement. La représentation de l'image est exprimée entièrement en langage naturel, tout en conservant l'efficacité des représentations numériques. 
L'intelligibilité de la représentation permet à un utilisateur d'examiner sur quelle base l'inférence a été réalisée et ainsi d'accepter ou de rejeter la décision suivant sa connaissance et son expérience humaine. / In this thesis, we examine some practical difficulties of deep learning models. Indeed, despite the promising results in computer vision, implementing them in some situations raises some questions. For example, in classification tasks where thousands of categories have to be recognised, it is sometimes difficult to gather enough training data for each category. We propose two new approaches for this learning scenario, called <<zero-shot learning>>. We use semantic information to model classes, which allows us to define models by description, as opposed to modelling from a set of examples. In the first chapter, we propose to learn a metric in order to transform the distribution of the original data and to obtain an optimal attribute distribution. In the following chapter, unlike the standard approaches of the literature that rely on learning a common embedding space, we propose to generate visual features from a conditional generator. The artificial examples can be used in addition to real data for learning a discriminant classifier. In the second part of this thesis, we address the question of computational intelligibility for computer vision tasks. Due to the many complex transformations of deep learning algorithms, it is difficult for a user to interpret the returned prediction. Our proposition is to introduce what we call a <<semantic bottleneck>> into the processing pipeline: a crossing point at which the representation of the image is expressed entirely in natural language, while retaining the efficiency of numerical representations. This semantic bottleneck makes it possible to detect failure cases in the prediction process, so as to accept or reject the decision.
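The feature-generation idea for zero-shot learning can be illustrated with a toy sketch: synthesize features for unseen classes from their attribute vectors, then train a classifier on the synthetic features alone. Everything here is an assumption for illustration — the fixed linear map `w` and Gaussian noise stand in for the learned conditional generator, and a nearest-centroid rule stands in for the discriminant classifier of the thesis.

```python
import numpy as np

def generate_features(attr, n, w, noise=0.1, rng=None):
    """Sample n synthetic visual features for a class from its
    attribute vector, via an assumed linear map w plus noise
    (a toy stand-in for a trained conditional generator)."""
    rng = np.random.default_rng(0) if rng is None else rng
    return attr @ w + noise * rng.standard_normal((n, w.shape[1]))

def zero_shot_classifier(class_attrs, w, n_per_class=50):
    """Nearest-centroid classifier built purely from generated
    features, i.e. without any real images of the unseen classes."""
    rng = np.random.default_rng(0)
    centroids = {c: generate_features(a, n_per_class, w, rng=rng).mean(axis=0)
                 for c, a in class_attrs.items()}
    def predict(x):
        return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
    return predict

# Hypothetical 2-attribute classes; w is the identity for simplicity.
w = np.eye(2)
attrs = {"zebra": np.array([1.0, 0.0]), "horse": np.array([0.0, 1.0])}
predict = zero_shot_classifier(attrs, w)
```

In the real setting the generator is learned on seen classes and reused for unseen ones; the synthetic examples can then be mixed with real data, as the abstract describes.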
|
95 |
The applicability of modelling and simulation: A case study within the medical device industry
Nyström, Anton, Hellberg, David, January 2020
The medical device industry has long lagged behind other industries in adopting new tools for process improvement. Despite promising results from various industries, some more heavily regulated than others, modelling and simulation has not yet gained traction within the medical device industry as a means of improving production. The industry has instead relied upon proven improvement philosophies that are believed to generate a desirable outcome. With the purpose of investigating how this novel tool can be combined with current improvement efforts, as well as understanding why it has not yet been accepted, a case study was conducted at the Uppsala facility of Johnson & Johnson Vision. A mixed methodological approach was used, in which quantitative and qualitative data were analyzed in combination. Semi-structured interviews and structured observations provided empirical evidence for a thematic analysis and a simulation-based bottleneck analysis. Rather than merely proving that a simulation-based bottleneck analysis was possible in this particular setting, the study used it to confirm its applicability in combination with other tools and improvement philosophies. The study concludes that the issue is not strictly related to the use of modelling and simulation, but rather to the reactive mind-set that has become a consequence of the rigorous regulatory landscape encompassing the industry.
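At its core, a simulation-based bottleneck analysis like the one mentioned above measures where capacity saturates. A minimal deterministic flow-line sketch makes the idea concrete; the station times and job count below are illustrative assumptions, not data from the case study.

```python
def simulate_serial_line(service_times, n_jobs):
    """Deterministic simulation of a serial production line:
    n_jobs, all available at time 0, pass through the stations in
    order; each station serves one job at a time with unlimited
    buffers in between. Returns each station's busy fraction of
    the makespan; the maximum marks the bottleneck."""
    m = len(service_times)
    free_at = [0.0] * m          # time each station next becomes free
    busy = [0.0] * m             # accumulated processing time per station
    finish = 0.0
    for _ in range(n_jobs):
        t = 0.0                  # job release time
        for i, s in enumerate(service_times):
            start = max(t, free_at[i])
            free_at[i] = start + s
            busy[i] += s
            t = free_at[i]
        finish = t               # completion time of the last job so far
    return [b / finish for b in busy]

# Three stations with service times 2, 3 and 1 time units:
utilizations = simulate_serial_line([2, 3, 1], n_jobs=4)
```

With these numbers the middle station is busy 80% of the makespan and is the bottleneck; a stochastic discrete-event model adds variability on top of exactly this logic.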
|
96 |
Optimal and heuristic solutions for the single and multiple batch flow shop lot streaming problems with equal sublots
Kalir, Adar A., 06 March 1999
This research is concerned with the development of efficient solutions to various problems that arise in flow-shop environments that utilize lot streaming. Lot streaming is a commonly used process of splitting production lots into sublots and then scheduling the sublots in an overlapping fashion on the machines, so as to expedite the progress of orders in production and to improve the overall performance of the production system.
The different lot-streaming problems that arise in various flow-shop environments have been divided into two categories: single-lot problems and multiple-lot problems. In this work, the multiple-lot problems are further classified into the lot-streaming sequencing problem (LSSP) and the flow-shop lot-streaming (FSLS) problem, a classification motivated by the occurrence of these problems in industry. Several variants of these problems are addressed in this research. In agreement with numerous practical applications, we assume sublots of equal sizes. It turns out that this restriction paves the way to relaxing several typical limitations of current lot-streaming models, such as the assumption of negligible transfer and setup times or the consideration of only the makespan criterion. For the single-lot problem, a goal programming (GP) approach is utilized to solve the problem for a unified cost objective function comprising the makespan, the mean flow time, the average work-in-process (WIP), and the setup and handling related costs. A very fast optimal solution algorithm is proposed for finding the optimal number of sublots (and, consequently, the sublot size) for this unified cost objective function in a general m-machine flow shop.
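The trade-off behind choosing a sublot count can be sketched with the classic equal-sublot makespan expression for an m-machine flow shop: the first sublot traverses every machine, and the remaining sublots queue behind it on the slowest machine. The `transfer_cost` term below is an illustrative stand-in for the setup and handling costs of the unified objective, not the thesis's GP formulation.

```python
def makespan_equal_sublots(lot_size, n_sublots, unit_times):
    """Makespan of one lot split into n equal sublots in a flow shop,
    given per-unit processing times on each machine: the first sublot
    crosses all machines, the other n-1 follow on the slowest one."""
    sub = lot_size / n_sublots
    return sub * sum(unit_times) + (n_sublots - 1) * sub * max(unit_times)

def best_sublot_count(lot_size, unit_times, transfer_cost):
    """Toy unified objective: makespan plus an assumed per-sublot
    transfer/setup cost. Enumerates n and returns the minimizer."""
    def cost(n):
        return makespan_equal_sublots(lot_size, n, unit_times) + transfer_cost * n
    return min(range(1, lot_size + 1), key=cost)
```

With no transfer cost, more sublots always shorten the makespan; a positive per-sublot cost makes the objective U-shaped in n, which is why a fast search for the optimal sublot count is meaningful.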
For the more complicated multiple-lot problem, a near-optimal heuristic for the solution of the LSSP is developed. This proposed heuristic procedure, referred to as the Bottleneck Minimal Idleness (BMI) heuristic, identifies and employs certain properties of the problem that are absent from traditional flow-shop problems, particularly the fact that sublots emanating from the same lot type have identical sizes and identical processing times (on the same machines). The BMI heuristic attempts to maximize the time buffer prior to the bottleneck machine, thereby minimizing potential bottleneck idleness, while also looking ahead to sequence the lots with large remaining process times earlier in the schedule. A detailed experimental study shows that the BMI heuristic outperforms the Fast Insertion Heuristic (the best known heuristic for flow-shop scheduling) when the latter is modified for lot streaming (FIHLS) and applied to the problem at hand.
For the FSLS problem, several algorithms are developed. For the two-machine FSLS problem with an identical sublot-size for all the lots, an optimal pseudo-polynomial solution algorithm is proposed. For all practical purposes (i.e., even for very large lot sizes), this algorithm is very fast. For the case in which the sublot-sizes are lot-based, optimal and heuristic procedures are developed. The heuristic procedure is developed to reduce the complexity of the optimal solution algorithm. It consists of a construction phase and an improvement phase. In the construction phase, it attempts to find a near-optimal sequence for the lots and then, in the improvement phase, given the sequence, it attempts to optimize the lot-based sublot-sizes of each of the lots. Extensions of the solution procedures are proposed for the general m-machine FSLS problem.
A comprehensive simulation study of a flow-shop system under lot streaming is conducted to support the validity of the results and to demonstrate the effectiveness of the heuristic procedures. This study clearly indicates that, even in dynamic practical situations, the BMI rule, which is based on the proposed BMI heuristic, outperforms existing WIP rules commonly used in industry for scheduling a flow shop that utilizes lot streaming. With respect to the primary performance measure, cycle time (or MFT), the BMI rule demonstrates a clear improvement over other WIP rules. It is further shown that it also outperforms other WIP rules with respect to output variability, another important measure in flow-shop systems. The effects of several other factors, namely system randomness, system loading, and bottleneck location and number, in a flow shop under lot streaming are also reported. / Ph. D.
|
97 |
Visual Analytics of Cascaded Bottlenecks in Planar Flow Networks
Post, Tobias, Gillmann, Christina, Wischgoll, Thomas, Hamann, Bernd, Hagen, Hans, 25 January 2019
The task of finding bottlenecks and eliminating them to increase the overall flow of a network often appears in real-world applications, such as production planning, factory layout, flow-related physical problems, and even cyber security. In many cases, several edges together form a bottleneck (a cascaded bottleneck). This work presents a visual analytics methodology to analyze such cascaded bottlenecks. The methodology consists of multiple steps: identification of bottlenecks, identification of potential improvements, communication of bottlenecks, interactive adaptation of bottlenecks, and a feedback loop that allows users to adapt flow networks and their resulting bottlenecks until they are satisfied with the flow network configuration. To achieve this, the definition of a minimal cut is extended to identify the network edges that form a (cascaded) bottleneck. To show the effectiveness of the presented approach, we applied the methodology to two flow network setups and show how the overall flow of these networks can be improved.
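The connection between minimal cuts and cascaded bottlenecks can be sketched with a standard max-flow / min-cut computation: the saturated edges of a minimum s-t cut are exactly the edges limiting the overall flow, and when the cut contains more than one edge the bottleneck is cascaded. This is a simplified reading under assumed capacities, not the paper's exact extension.

```python
import networkx as nx

def bottleneck_edges(g, source, sink):
    """Return the max-flow value and the edges of a minimum s-t cut;
    a cut with several edges corresponds to a cascaded bottleneck."""
    cut_value, (reachable, non_reachable) = nx.minimum_cut(g, source, sink)
    # Cut edges lead from the source-side partition to the sink side.
    cut = [(u, v) for u in reachable
           for v in g.successors(u) if v in non_reachable]
    return cut_value, cut

# Hypothetical network: one wide inflow splitting into two narrow
# branches, so the bottleneck is the pair of branch edges together.
g = nx.DiGraph()
g.add_edge("s", "a", capacity=5)
g.add_edge("a", "b", capacity=2)
g.add_edge("a", "c", capacity=2)
g.add_edge("b", "t", capacity=5)
g.add_edge("c", "t", capacity=5)
value, cut = bottleneck_edges(g, "s", "t")
```

Here no single edge limits the flow to 4; only the two edges leaving `a` do so jointly, which is the cascaded case the methodology targets.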
|
98 |
Future production state at Strömsholmen AB: A single case study analyzing the impacts of eliminating a bottleneck with a total cost perspective
Åkerberg, Filip, Flodell, Jakob, January 2021
All producing companies strive to deliver high-quality products that meet their customers' requirements at a reasonable price. For companies to stay competitive, it is important to have an efficient flow through production and therefore to continually analyze and develop it. This master thesis was carried out at Strömsholmen AB (SAB), a company that produces and develops gas springs and hydraulic springs. Its production consists of a machining hall and an assembly hall, with a component inventory in between. SAB plans to upgrade its surface treatment process, the current bottleneck in the production. The goal of the thesis is to point out the future bottlenecks and constraints and to propose improvements for a more efficient flow. To discover the improvements, three research questions were asked. The first question concerns mapping the current production as well as the production once the surface treatment is upgraded. The second question is about finding the next bottleneck in the production given an increased order stock. The last question concerns identifying improvements and their effects on production performance. In order to fulfill the purpose and answer the research questions, a literature survey and a collection of empirical data were conducted. The literature survey provided a connection between existing theory and the study. The empirical data consist of quantitative data that illustrate the current production, while the collected qualitative data give a more in-depth knowledge of SAB's production. The data have been collected through observations, semi-structured interviews and documentation. The analysis was built upon the five-why method, to find the root causes in the production. The empirical data established that SAB has a complex material flow through the production and a large assortment of components.
Beyond that, the surface treatment process was verified as the current bottleneck of the production. Once the surface treatment is upgraded, seven percent of additional capacity is available before the next machine group reaches its capacity cap. In the analysis, the current production and potential improvements were compared with existing theories. The study pointed out the value-creating and wasteful steps of the production and how to adjust the flow accordingly. Furthermore, future bottlenecks were pointed out that could limit the potential to increase the order stock. Two distinct results could be established from the analysis. First, it would be suitable to implement a flow shop for the components of the A products (which stand for 70% of the revenue) that have the same material flow and go through the same machine groups. This would reduce the production throughput time and enable lower stock levels, thereby decreasing the inventory carrying cost. The second result considers how to eliminate the future bottlenecks in the production; this can be done either with sub-contractors or by balancing the workload. Consequently, this will enable a 20% increase in order stock from the production. In addition to these two major improvements, several other improvements were found during the study that will affect SAB's potential to gain more profit, including investigating the quality of their data, starting to perform forecasting, and reviewing the number of products and components.
|
99 |
Caught in a Bottleneck: Habitat Loss for Woolly Mammoths in Central North America and the Ice-Free Corridor During the Last Deglaciation
Wang, Yue, Widga, Chris, Graham, Russell W., McGuire, Jenny L., Porter, Warren, Wårlind, David, Williams, John W., 01 February 2021
Aim: Identifying how climate change, habitat loss, and corridors interact to influence species survival or extinction is critical to understanding macro-scale biodiversity dynamics under changing environments. In North America, the ice-free corridor was the only major pathway for northward migration by megafaunal species during the last deglaciation. However, the timing and interplay among the late Quaternary megafaunal extinctions, climate change, habitat structure, and the opening and reforestation of the ice-free corridor have been unclear. Location: North America. Time period: 15–10 ka. Major taxa studied: Woolly mammoth (Mammuthus primigenius). Methods: For central North America and the ice-free corridor between 15 and 10 ka, we used a series of models and continental-scale datasets to reconstruct habitat characteristics and assess habitat suitability. The models and datasets include biophysical and statistical niche models Niche Mapper and Maxent, downscaled climate simulations from CCSM3 SynTraCE, LPJ-GUESS simulations of net primary productivity (NPP) and woody cover, and woody cover based upon fossil pollen from Neotoma. Results: The ice-free corridor may have been of limited suitability for traversal by mammoths and other grazers due to persistently low productivity by herbaceous plants and quick reforestation after opening 14 ka. Simultaneously, rapid reforestation and decreased forage productivity may have led to declining habitat suitability in central North America. This was possibly amplified by a positive feedback loop driven by reduced herbivory pressures, as mammoth population decline led to the further loss of open habitat. Main conclusions: Declining habitat availability south of the Laurentide Ice Sheet and limited habitat availability in the ice-free corridor were contributing factors in North American extinctions of woolly mammoths and other large grazers that likely operated synergistically with anthropogenic pressures. 
The role of habitat loss and attenuated corridor suitability in the woolly mammoth extinction reinforces the critical importance of protecting habitat connectivity during changing climates, particularly for large vertebrates.
|
100 |
Effektivisering av produktionsflöden inom livsmedelsbranschen: En studie om utmaningar & möjligheter med digital tvilling / Streamlining Production Flows in the Food Industry: A study on challenges & opportunities with Digital Twin
Larsson, Erik, Ringdahl, Anna, January 2023
En ökad effektivitet i de interna flödena är en viktig faktor för att bibehålla konkurrenskraft inom livsmedelsindustrin. Ny teknik som digitala tvillingar kan därför vara en möjlighet till att identifiera och åtgärda begränsande faktorer i produktflödet. Syftet med denna studie var att bidra till ökad förståelse för vilka möjligheter och utmaningar som kan uppstå vid tillämpningen av digital tvilling för att effektivisera flödesprocesser. För att samla in relevant empirisk data genomfördes två semistrukturerade intervjuer, en observation samt dokumentstudier. Studiens resultat visar att den digitala tvillingen användes för datadrivna simuleringar med hjälp av realtidsdata som producerats av RFID-teknik. Simuleringarna gav möjligheten att förebygga fel som annars hade uppstått i flödet som därmed även sänkte både kostnader och tiden som krävdes vid flödesförändringen. Den digitala tvillingen i kombination med VR gav även möjlighet till visualisering av den digitala tvillingen och är en möjlighet till att engagera medarbetare i förändringar. Det fanns även potential att använda den digitala tvillingen tillsammans med VR för utbildning vilket kan leda till minskade kostnader och fel i det verkliga flödet. Utmaningar som identifierades i studien var att kostnaden för skapandet och underhållet av den digitala tvillingen och att den skapades i rätt tid i projektet för att kunna utnyttja dess potential maximalt. / Increased efficiency in internal flows is an important factor in maintaining competitiveness within the food industry. New technologies such as digital twins can therefore provide an opportunity to identify and address limiting factors in the product flow. The purpose of this study was to contribute to a better understanding of the opportunities and challenges that may arise from the application of digital twins to streamline flow processes. To collect relevant empirical data, two semi-structured interviews, an observation and document studies were conducted.
The results of the study show that the digital twin was used for data-driven simulations using real-time data produced by RFID technology. The simulations provided the opportunity to prevent errors that would otherwise have occurred in the flow, thus reducing both costs and the time required for flow changes. The digital twin, in combination with virtual reality (VR), also allowed for visualization of the digital twin and provided an opportunity to engage employees in the changes. There was also potential to use the digital twin together with VR for training, which could lead to reduced costs and errors in the actual flow. Challenges identified in the study included the cost of creating and maintaining the digital twin, as well as ensuring that it was created at the right time in the project to fully exploit its potential.
|