1 |
Systémy detekce a prevence průniku / Intrusion Detection and Prevention Systems. Černý, Michal. January 2010.
Intrusion detection and prevention systems can be realized as standalone hardware appliances or installed in software form on a host. The primary purpose of these protective elements is to detect undesirable activity, such as violations of file integrity, invalid attempts to connect to a remote service, or the capture of local network data. The systems react to such events with actions defined by internal rules; possible countermeasures include sending an alert or blocking the communication. The thesis describes the basic principles of intrusion detection and prevention systems, various types of analysis of captured data, the process of creating the internal rules, and the alert formats used. It also considers alternatives for placing these systems in the network, including the advantages of selected configurations, and describes the installation and configuration of the individual elements of the realized network and security systems. To verify their functionality and the degree of protection provided, several selected types of attacks were carried out.
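The rule-driven reaction loop described above can be sketched in a few lines; the rule format, rule names, and payload patterns here are purely illustrative and are not taken from any real IDS product:

```python
# Hypothetical sketch of a rule-based IDS reaction loop: each internal
# rule pairs a payload pattern with an action (alert or block).

from dataclasses import dataclass

@dataclass
class Rule:
    name: str        # what the rule detects
    pattern: bytes   # payload substring that triggers it
    action: str      # "alert" (send a caution) or "block" (drop traffic)

RULES = [
    Rule("ssh-bruteforce-marker", b"Failed password", "alert"),
    Rule("exploit-shellcode", b"\x90\x90\x90\x90", "block"),
]

def inspect(payload: bytes) -> list[tuple[str, str]]:
    """Return (rule name, action) for every rule the payload matches."""
    return [(r.name, r.action) for r in RULES if r.pattern in payload]

hits = inspect(b"... Failed password for root ...")
```

A real system would of course match on protocol fields and stateful context rather than raw substrings; the point is only the rule-to-action mapping.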
|
2 |
Design and Implementation of a Graceful Degradation Approach for Polymorphic Role Invocation in Object Teams. Kummer, Cornelius. 07 September 2021.
In the ever-evolving world of modern software engineering, dynamic and context-dependent adaptability becomes increasingly important. A promising new paradigm that has been proposed is role-oriented programming, an extension of object-oriented programming that allows collaborative relationships between objects to be modeled. Through the introduction of roles and contexts, the behavior of objects can be adapted at run-time via the addition or modification of attributes and methods. This dynamism, however, incurs a high overhead, especially in the area of role function invocation. Recent research has found a remedy inspired by polymorphic inline caches, allowing the reuse of so-called dispatch plans, which encode the steps directly required for the execution of adaptations. With this optimization, an average speedup of 4.0× was achieved in static contexts and 1.1× in variable contexts. Still, performance drops off sharply beyond a certain degree of volatility as a consequence of cache capacity exhaustion. This thesis presents a fallback mechanism to be used at highly variable call sites, which would normally cause a significant slowdown with the new approach. In addition, an optimized reuse mechanism is proposed, further improving execution efficiency. Evaluation through benchmarking shows complete elimination of the aforementioned overhead, meaning a speedup of 16.5×, while the previously achieved speedup is maintained.
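A minimal sketch of the idea, assuming a per-call-site cache of dispatch plans with a capacity bound and an uncached fallback for volatile sites; the class, the capacity value, and the plan representation are all invented for illustration and do not reflect the thesis implementation:

```python
# Illustrative per-call-site dispatch-plan cache with graceful degradation:
# once the cache proves too volatile, the site falls back to uncached
# dispatch instead of paying constant miss/eviction overhead.

CACHE_CAPACITY = 4  # hypothetical number of plans kept per call site

class CallSite:
    def __init__(self):
        self.plans = {}          # context key -> precomputed dispatch plan
        self.degraded = False    # True once the cache was exhausted

    def dispatch(self, context_key, build_plan, args):
        if self.degraded:
            # Fallback path: build and run the plan without caching.
            return build_plan(context_key)(*args)
        plan = self.plans.get(context_key)
        if plan is None:
            if len(self.plans) >= CACHE_CAPACITY:
                self.degraded = True          # degrade this site for good
                return build_plan(context_key)(*args)
            plan = build_plan(context_key)    # compile and cache the plan
            self.plans[context_key] = plan
        return plan(*args)
```

Here `build_plan` stands in for the expensive plan-construction step; stable sites pay it once per context, while volatile sites switch to the cheap uncached path.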
|
3 |
A Performance Comparison of Dynamic and Inline Ray Tracing in DXR: An Application in Soft Shadows. Sjöberg, Joakim; Zachrisson, Filip. January 2021.
Background. Ray tracing is a tool that can be used to increase the quality of the graphics in games. One application in which ray tracing excels is the generation of shadows, because ray tracing can simulate how shadows are formed in real life more accurately than rasterization techniques can. With the release of GPUs with hardware support for ray tracing, it can now be used in real-time graphics applications to some extent. However, it is still a computationally heavy task requiring performance improvements. Objectives. This thesis evaluates the difference in performance of three ray-tracing methods in DXR Tier 1.1, namely dynamic ray tracing and two forms of inline ray tracing. To further investigate ray-tracing performance, soft shadows are implemented to see whether the driver can perform optimizations differently (depending on the choice of ray-tracing method) on the subsequent and/or preceding API interactions. With the pipelines implemented, benchmarks are performed using different GPUs, scenes, and a varying number of shadow-casting lights. Methods. The scientific method is based on an experimental approach, using both implementation and performance tests. The experimental approach begins by extending an in-house DirectX 12 renderer. The extension adds ray-tracing functionality so that hard shadows can be generated using both dynamic ray tracing and the inline forms. Afterwards, soft shadows are generated by implementing a state-of-the-art denoiser with some modifications, which is added to each ray-tracing method. Finally, the renderer is used to benchmark various scenes with varying numbers of shadow-casting lights and object complexity, to cover a broad range of scenarios that could occur in a game and/or other similar applications. Results and Conclusions.
The results gathered in this experiment suggest that, under the experimental conditions of the chosen scenes, objects, and numbers of lights, AMD's GPUs were faster when using dynamic ray tracing than when using inline ray tracing, whilst Nvidia's GPUs were faster with inline ray tracing than with dynamic ray tracing. Also, with an increasing number of shadow-casting lights, the choice of ray-tracing method had little to no impact apart from a linear increase in execution time in each test. Finally, adding soft shadows (subsequent and preceding API interactions) also had little to no relative impact on the results for the different ray-tracing methods.
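The per-light shadow query at the heart of both dispatch styles can be illustrated CPU-side; this is a toy occlusion test against a list of spheres, not DXR/HLSL code, and all geometry values are hypothetical:

```python
import math

# CPU-side sketch of the shadow-ray test that both dynamic and inline
# ray tracing ultimately perform: trace a ray from a surface point
# toward a light and check for any occluder before the light.

def ray_hits_sphere(origin, direction, center, radius, t_max):
    # Solve |o + t*d - c|^2 = r^2 for the nearest t in (0, t_max);
    # direction is assumed normalized, so the quadratic's a == 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < t_max    # epsilon avoids self-intersection

def in_shadow(point, light_pos, spheres):
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    d = [x / dist for x in to_light]
    return any(ray_hits_sphere(point, d, c, r, dist) for c, r in spheres)
```

In DXR the same test is expressed either through the ray-tracing pipeline (dynamic) or a `RayQuery`-style loop in the calling shader (inline); only where the traversal code lives differs, not the geometry of the query.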
|
4 |
Multiple treatment objectives of solar driven electrolytic oxidant production for decentralized water treatment in developing regions and its economic feasibility. Otter, Philipp. 21 March 2022.
In 2017 nearly 2 billion people consumed water that was contaminated with feces, causing almost 500,000 diarrheal deaths. At the same time, freshwater resources are being depleted and water scarcity already affects 4 billion people worldwide. From a technical perspective, the continuous supply of chemicals needed to ensure sufficient disinfection remains a major challenge in rural water treatment, and existing technical solutions to adequately disinfect water have failed in the past.
This dissertation evaluates the technical and economic feasibility of solar-driven inline electrolytic production of chlorine (ECl2) as an alternative to external chlorine supply. During ECl2 disinfection the water passes through the electrolytic cell and chlorine is produced "inline" from the natural chloride content of the water. Under optimal conditions, no chemicals are required to safely disinfect drinking water and treated wastewater. Furthermore, the ability of ECl2 to enhance the removal of iron and arsenic from contaminated groundwater and to degrade trace organic compounds (TOrC) when combined with UV irradiation was analyzed. All relevant tests were conducted in long-term field studies in future deployment areas. This enabled the evaluation of potential operational challenges of such systems under real-world conditions. The experience gathered from these field trials represents the major benefit of this dissertation work.
The trials have shown that with ECl2 water can be safely disinfected and supplied with an adequate amount of residual disinfectant. Here, the combination with natural pre-treatment systems has proven to be beneficial. For the drinking water trial conducted in Uttarakhand, India, the ECl2 system received bank filtrate and achieved overall log removal rates of >5.0 for total coliforms and >3.5 for E. coli.
For the disinfection of treated wastewater, the combination with a vertical flow constructed wetland (VFCW) has largely simplified the disinfection with chlorine by equalizing wastewater (WW) inlet quality fluctuations, removing ammonium, COD, and turbidity. This has also substantially reduced the chlorine demand of the water, and pathogen indicator-free conditions were achieved with log unit removals of 5.1 and ≥ 4.0 for total coliforms and E. coli, respectively. Wastewater reuse applications that go beyond irrigation become permissible through this approach and the use of limited freshwater resources can be substituted.
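The log-unit removals quoted for the indicator organisms follow from a simple ratio of influent to effluent counts; the counts in this sketch are illustrative, not measured values from the trials:

```python
import math

# Log-unit removal: log10 of influent count over effluent count.
# A 5.0-log removal means a 100,000-fold reduction in the indicator.

def log_removal(c_in, c_out):
    return math.log10(c_in / c_out)

# Illustrative: 200,000 CFU/100 mL in, 2 CFU/100 mL out
lrv = log_removal(2e5, 2.0)
```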
Removal rates for arsenic and iron of 94 % and >99 %, respectively, were achieved by treating contaminated groundwater with the combination of ECl2 and subsequent co-precipitation and filtration proposed here. With effluent arsenic concentrations of 10 ± 4 µg/L, however, the strict WHO guideline value of 10 µg/L could not be reliably met, and further optimization requirements were identified.
The removal rate for benzotriazole of 5% through ECl2 alone could be increased to 89 % when combined with UV lamps. Similar results were achieved for other selected TOrCs. Still, more advanced studies are required to understand the degradation process in detail, and to evaluate a potential increase in toxicity through the formation of transformation products.
The field trials have shown that ECl2 as an innovative treatment technology is capable of safely disinfecting drinking water and treated wastewater. Its application also enhances the treatment of the other contaminants evaluated. However, cathode scaling was identified as the most critical technical issue, despite the use of polarity inversion. ECl2 systems could only operate reliably in waters with total hardness values < 200 mg/L CaCO3. As raw waters frequently exceed this hardness, the fields of application of ECl2 are limited, which required other chlorination technologies as alternatives to the originally planned inline electrolysis.
An initial derivative of an ECl2 system was also applied during a wastewater disinfection trial in Spain. In this setting the portion of water passing by the electrodes and therefore the quantities of scaling agents were reduced to between 4 and 23 % of the total water volume treated. With this approach, deposit formation was completely prevented and reliable, nearly maintenance-free operation was ensured. However, such onsite chlorine generation (OCG) units commonly require the addition of chloride. From the author’s perspective and the experience collected during the field trials, the addition of NaCl is justifiable considering the increased reliability of system operation. OCG offers further advantages regarding process stability and energy demand. Lab studies have also shown that the formation of inorganic disinfection byproducts has not been an issue with OCG systems.
To evaluate the economic feasibility of the drinking water treatment systems tested in real-world scenarios, an ECl2 treatment system operating in Egypt and two OCG units operating in Tanzania and Nepal were analyzed. The study shows that long-term operation and maintenance costs of such units can be covered. However, seed investment is required for the construction of the initial infrastructure. Once those costs are covered, the treatment approaches presented here can sustainably play an important role in reducing the number of people consuming contaminated water, especially in rural developing regions.
|
5 |
The Use of XBRL for Financial Statement Analysis / Využití formátu XBRL pro analýzu účetních dat. Šťastný, Daniel. January 2011.
This thesis demonstrates the use of XML (eXtensible Markup Language) instance documents based on XBRL (eXtensible Business Reporting Language) taxonomies for financial statement analysis. It illustrates how XBRL could change the data workflow in order to decrease the costs of maintaining, reporting, and analyzing financial data. The practical output is a software extension of MS Excel that can read and analyze online XBRL documents prepared as HTML with embedded XBRL. The theoretical part comprises an explanation of the technical background of XBRL and the important data workflows related to financial analysis and the reporting of financial data.
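As a rough sketch of the parsing step such an extension must perform, the snippet below pulls numeric facts out of XHTML with embedded (inline) XBRL using only the standard library; real filings additionally need scale, sign, and context handling, and the thesis tool itself is built on MS Excel, so this is only an illustration, with a made-up sample document:

```python
import xml.etree.ElementTree as ET

# Minimal sketch: extract ix:nonFraction facts from well-formed XHTML
# with embedded inline XBRL, keyed by the fact's taxonomy name.

IX = "http://www.xbrl.org/2013/inlineXBRL"

def extract_facts(xhtml: str) -> dict:
    root = ET.fromstring(xhtml)
    facts = {}
    for el in root.iter(f"{{{IX}}}nonFraction"):
        facts[el.get("name")] = float(el.text.replace(",", ""))
    return facts

sample = (
    f'<html xmlns:ix="{IX}"><body>'
    '<span><ix:nonFraction name="us-gaap:Assets" contextRef="c1" '
    'unitRef="usd">1,500</ix:nonFraction></span>'
    "</body></html>"
)
facts = extract_facts(sample)  # {'us-gaap:Assets': 1500.0}
```

An analysis layer would then map such fact names onto ratio formulas, which is essentially the workflow change the thesis describes.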
|
6 |
Generic Simulation Model Development of Hydraulic Axial Piston Machines. Kayani, Omer Khaleeq; Sohaib, Muhammad. January 2012.
This master thesis presents a novel methodology for the development of simulation models for hydraulic pumps and motors. In this work, a generic simulation model capable of representing multiple axial piston machines is presented, implemented, and validated. Validation of the generic simulation model is done by comparing its results with experimental measurements. The generic model is developed in AMESim. Today, simulation models are an integral part of any development process concerning hydraulic machines, so an improved methodology for developing them reduces both development cost and time. Traditionally, specific simulation models dedicated to a certain pump or motor are created. This implies that the model structure has to be completely rethought when modeling a new pump or motor; when dealing with a large number of pumps and motors, this traditional way of model development can therefore lead to long development times and high costs. This thesis work presents a unique way of simulation model development in which a single model can represent multiple pumps and motors, resulting in lower development time and cost. An automated routine for simulation model creation is developed and implemented. This routine uses the generic simulation model as a template to automatically create the simulation models requested by the user. For this purpose, a user interface has been created using Visual Basic scripting. This interface communicates with the generic simulation model, allowing the user either to change it parametrically or to transform it completely into another pump or motor. To determine the level of accuracy offered by the generic simulation model, simulation results are compared with experimental data. Moreover, an optimization routine to automatically fine-tune the simulation model is also presented.
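The template idea, one generic parameter schema specialized per machine, can be sketched abstractly; the machine names and parameters below are hypothetical, and AMESim's actual scripting interface is not used here:

```python
# Illustrative sketch of "one generic model, many machines": a single
# template whose open parameters are filled from a per-machine library.
# All names and values are invented for illustration.

GENERIC_TEMPLATE = {
    "n_pistons": None,         # filled in per machine
    "displacement_cc": None,
    "max_speed_rpm": None,
    "variable_displacement": False,
}

MACHINE_LIBRARY = {
    "pump_A": {"n_pistons": 9, "displacement_cc": 45, "max_speed_rpm": 3000},
    "motor_B": {"n_pistons": 7, "displacement_cc": 110, "max_speed_rpm": 2400,
                "variable_displacement": True},
}

def build_model(machine_id: str) -> dict:
    """Specialize the generic template into a concrete machine model."""
    model = dict(GENERIC_TEMPLATE)
    model.update(MACHINE_LIBRARY[machine_id])
    if any(v is None for v in model.values()):
        raise ValueError(f"incomplete parameter set for {machine_id}")
    return model
```

The benefit mirrors the thesis's argument: adding a machine means adding a parameter set, not rethinking the model structure.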
|
7 |
Highly Efficient Thermal Ablation of Silicon and Ablation in Other Materials. Yu, Joe X.Z. 06 June 2011.
Laser micromachining has become increasingly prominent in various industries given its speed, lack of tool wear, and ability to create features on the order of micrometres. Inherent stochastic variations in thermal ablation, along with detrimental heat effects, however, limit the feasibility of achieving high precision. The large number of control parameters that makes laser micromachining versatile also hinders optimization, owing to long exploration times. The introduction of high-intensity nonlinear ablation leads to more precise cuts, but at a much higher, often restrictive, cost.
The work here shows that by combining an imaging technique frequently used in ophthalmology called optical coherence tomography (OCT) with a machining platform, in situ observation of ablation can be made. This combination, known as in-line coherent imaging (ICI), allows information to be gathered about the dynamics of the ablation process. Experimental results show that quality cutting of silicon can be achieved with thermal ablation and at a wavelength of 1070 nm. This result is surprising as silicon absorbs this wavelength very weakly at room temperature. It is shown here that a nonlinear thermal dependence in absorption allows a cascaded absorption effect to enable machining. With the aid of ICI, the model shown here is able to accurately predict the thermal ablation rate and help understand the ablation process. The high quality cutting achieved allows for a more cost efficient alternative to current techniques using ultraviolet diode-pumped solid state (UV DPSS) systems.
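The cascaded-absorption feedback can be caricatured numerically: absorption grows with temperature, absorbed power raises the temperature, and above a threshold intensity the loop runs away. All coefficients below are invented for illustration and are not the thesis model:

```python
import math

# Toy sketch of cascaded absorption: silicon absorbs 1070 nm weakly when
# cold, but absorption rises with temperature, so absorbed laser power
# heats the spot, which absorbs more, and so on. Coefficients are made up.

def simulate_heating(intensity, t_end=1.0, dt=1e-3):
    T = 300.0                                   # K, room temperature
    for _ in range(int(t_end / dt)):
        alpha = math.exp((T - 300.0) / 150.0)   # absorption grows with T
        heating = alpha * intensity             # absorbed power heats spot
        cooling = 0.5 * (T - 300.0)             # losses to the bulk
        T += (heating - cooling) * dt           # forward Euler step
    return T

# At low intensity the spot settles near ambient; at high intensity the
# positive feedback dominates and the temperature climbs super-linearly.
low, high = simulate_heating(10.0), simulate_heating(100.0)
```

The qualitative threshold behavior, mild warming below it and runaway heating above it, is the mechanism the thesis invokes to explain why a weakly absorbed wavelength can still machine silicon.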
Where thermal effects such as heat-affected zones (HAZ) cannot be overcome, high intensity nonlinear ablation allows the processing of lead zirconate titanate (PZT) for high frequency arrays (used in ultrasound applications) at speeds two orders of magnitude greater than found in the literature, and potential feature sizes (< 100 µm) in polymethyl methacrylate (PMMA) unachievable by thermal ablation. The ablation mechanism here is Coulombic explosion (CE), which is a non-thermal process. Coupled with demonstrated manual and automatic feedback abilities of ICI, the processes shown here may open up new avenues for fabrication. / Thesis (Master, Physics, Engineering Physics and Astronomy) -- Queen's University, 2011.
|
8 |
Automatisering av mätningar i processindustri : Inline-mätning av syrgas i lösning och konduktivitet vid framställning av parenteral nutrition / Automation of Measurements in the Process Industry: Inline Measurement of Oxygen in Solution and Conductivity in the Production of Parenteral Nutrition. Andersson, Gustav; Joneby, Cecilia. January 2018.
In line with developments in sensor technology, the possibilities for performing online quality controls of fluid processes also increase. Two such controls have been investigated in this project: measurement of oxygen dissolved in liquid, and control of conductivity after completion of clean-in-place. The research questions are: "What sensors and possibilities are available on the market to enable automation of these controls?" and "How can the sensors be implemented in the production process, and what gains can be made?" The aim of the thesis is to present the possibilities and limitations involved in purchasing sensors to automate the measurements mentioned above, to suggest suitable sensors, and to develop layout proposals describing where in the process the controls should be implemented and what impact they would have. Several sensors were found, two of which are considered best suited for introduction at the pharmaceutical production facility where the thesis work was carried out. Considering the time aspect and the complexity involved in manufacturing, the work was limited in scope, so an actual implementation was not conceivable. A process-flow analysis, literature studies, interviews, and a selection process resulted in an assessment of the current situation, a market survey, and layout proposals. These were then analyzed, the different solutions were ranked, and the gains and proposed working methods for implementation were mapped. Finally, the conclusion was that the two selected sensors can be implemented. Automating these controls brings a number of benefits, such as an improved work environment, reduced waste, and financial gains. The implementation should not adversely affect the process, and it would be an appropriate step towards further digitization and Industry 4.0.
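One of the two controls, conductivity verification after clean-in-place, reduces to a simple release criterion once an inline sensor streams readings; the limit and the stability window below are illustrative assumptions, not the plant's actual specification:

```python
# Hypothetical sketch of an automated CIP rinse-endpoint check: the line
# is released once the last few conductivity readings are all in spec.
# Limit and window size are illustrative only.

CONDUCTIVITY_LIMIT_US_CM = 1.3   # e.g. a WFI-style limit in µS/cm

def rinse_complete(readings, n_stable=3):
    """Release the line once the last n_stable readings are all in spec."""
    if len(readings) < n_stable:
        return False
    return all(r <= CONDUCTIVITY_LIMIT_US_CM for r in readings[-n_stable:])
```

Requiring several consecutive in-spec readings rather than a single one is a common guard against a momentary dip triggering a premature release.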
|
9 |
A Literature Review on Differences Between Robotic and Human In-Line Quality Inspection in Automotive Manufacturing Assembly Lines. Avvari, Ddanukash. January 2021.
The advent of the industrial revolution has brought a great number of changes to the functioning of various processes in manufacturing industries. The ways and means of working have evolved rapidly with the implementation of advanced technology. Moreover, customer demands have varied dynamically as requirements increasingly focus on individual customization. To cope with this dynamic demand, manufacturing industries have had to ensure their products are manufactured with higher quality and shorter lead times. Implementation and efficient use of technology have provided industries with the tools needed to meet market demand and stay competitive through continuous growth. The transformation aims to reach the level of zero-defect manufacturing and ensure a higher first-time-right yield with minimum utilization of available resources. However, technological advancement has not developed the quality inspection process at the same pace as other manufacturing processes. As a result, quality inspection is still human-dependent, requiring a highly skilled operator to perform inspection procedures using sensory abilities to detect deviations. Research suggests that human quality inspection is prone to errors due to fatigue, as the process is continuous, strenuous, and tedious. The efficiency of human inspection is around 80 %, which becomes a chronic problem in safety-critical and high-value manufacturing environments. Moreover, with increasing customization and technological complexity, products are acquiring more intricate shapes, and human inspection alone is not enough to meet customer requirements, especially in automotive body-in-white applications, where outer body panels and engine parts carry tight tolerances.
Advancements in the field of metrology have led to the introduction of coordinate measuring machines (CMMs), which are classified as contact and non-contact measuring machines. Measurements are performed offline, away from the production line, using a sampling method. Contact measuring machines are equipped with touch-trigger probes that travel over the part to build a virtual image of the product, which is time-consuming but accurate. Non-contact measuring machines are equipped with laser scanners or optical devices that scan the part and develop a virtual model, which is fast but suffers accuracy and repeatability issues due to external factors. Coordinate measuring machines have, however, proven to be bottlenecks, as they cannot keep pace with production and cannot inspect all produced parts, which would help in collecting data. The gathered data can be used to analyze root causes and identify trends in defect detection. With advancements in non-contact measuring systems, automotive industries have also realized the potential of implementing inline measurement techniques for quality inspection. A non-contact measuring system consists of a robotic arm or setup equipped with a camera, sensors, and a complex algorithm to identify defects. This provides the robotic arm with machine vision, which works by taking a series of images of the product from various angles and processing these images to detect deviations using digital image processing techniques. Inline measurement has proven accurate, fast, and repeatable enough to be implemented in synchronization with the production line. Further, the automotive industries are moving towards hybrid inspection systems, which capitalize on the measuring speed of the robot and the fast decision-making ability of human senses.
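The machine-vision deviation check described above can be reduced to its simplest form, a golden-image comparison; real systems use calibrated optics and far more robust algorithms, and the threshold and images here are arbitrary illustrations:

```python
# Simplified sketch of inline vision inspection: compare an image of the
# part against a golden reference and flag pixels whose deviation
# exceeds a threshold (images are plain 2D lists of gray levels here).

def find_defects(image, reference, threshold=30):
    """Return (row, col) of pixels deviating from the reference."""
    return [
        (r, c)
        for r, row in enumerate(image)
        for c, px in enumerate(row)
        if abs(px - reference[r][c]) > threshold
    ]

golden = [[100, 100], [100, 100]]
scan   = [[100, 100], [100, 180]]     # one deviating region
defects = find_defects(scan, golden)  # [(1, 1)]
```

Because such a check runs in milliseconds per frame, it can keep pace with the line, which is exactly where sampling-based CMM inspection falls behind.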
|
10 |
Analysis of Droplet Impact on a Liquid Pool. Radhika Arvind Bhopatkar (9012413). 25 June 2020.
Secondary atomization is very important in applications such as IC engine and aircraft engine performance, agricultural sprays, and inkjet printing, to name a few. In the case of IC engines and aircraft engines, a good understanding of the modes of secondary atomization and the resultant drop sizes can contribute to improving fuel injection and hence the efficiency of the engine. Similarly, with the help of appropriate secondary atomization, the desired agro-spray quality, ink usage, and print quality can be achieved, which would optimize the usage of chemicals and ink respectively and avoid harmful effects on the environment.

One of the causes of secondary atomization that occurs very often in most spray applications is drop impact on a solid or liquid surface. It is especially important to understand the impact of a drop on a liquid film, since even when liquid drops impact a solid surface, drops injected at a later time will encounter a target surface consisting of a thin liquid film on the solid base, formed by the accumulation of the previously injected drops. Drop impact on liquid films with non-dimensional thickness ranging from 0.1 to 1 has been analyzed thoroughly before (Cossali et al., 2004; Vander Waal et al., 2006; Moreira et al., 2010); however, the analysis of drop impact on liquid films with non-dimensional thickness greater than 1 is still at a rudimentary stage. This work focuses on determining the probability density functions of the secondary drop sizes produced by drop impact on a liquid film while varying the h/d ratio beyond 1. The experimental set-up used to study drop impact includes a droplet generator and a DIH system, as described in Yao et al. (2017). The DIH set-up includes a CW laser, spatial filter, beam expander, and collimator, as adapted from Guildenbecher et al. (2016). The height of drop impact, and thereby the impact We, is varied by adjusting the syringe height. Three fluids (DI water, ethanol, and glycerol) are tested to examine the effect of viscosity on the resultant drop sizes. Results are plotted with respect to viscosity, impact We, and the non-dimensional film thickness, as the fragmentation of drops is directly associated with these parameters. Results indicate that the majority of the secondary droplets lie in the size range of 25 µm to 50 µm. It is also observed that the tendency towards secondary atomization from crown splashing increases with increasing We and decreases with increasing Oh.
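The quantities the results are organized around can be made concrete with a short sketch: the impact Weber and Ohnesorge numbers, and an empirical size distribution built from a histogram. The fluid properties and sample diameters below are illustrative, not the measured data:

```python
import math

# We = rho*v^2*d/sigma (inertia vs surface tension at impact);
# Oh = mu/sqrt(rho*sigma*d) (viscous damping of breakup).

def weber(rho, v, d, sigma):
    return rho * v * v * d / sigma

def ohnesorge(mu, rho, sigma, d):
    return mu / math.sqrt(rho * sigma * d)

def size_pdf(diams_um, bin_edges):
    """Normalized histogram: fraction of secondary drops per size bin."""
    counts = [0] * (len(bin_edges) - 1)
    for d in diams_um:
        for i in range(len(counts)):
            if bin_edges[i] <= d < bin_edges[i + 1]:
                counts[i] += 1
                break
    n = len(diams_um)
    return [c / n for c in counts]

# Illustrative: a 2 mm water drop (sigma ~ 0.072 N/m) falling at 2 m/s
we = weber(1000.0, 2.0, 0.002, 0.072)  # ~111
```

Plotting such a normalized histogram per (We, Oh, h/d) combination is, in essence, how the probability density functions described above are reported.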
|