31 |
Quality-of-Service-Aware Data Stream Processing. Schmidt, Sven. 21 March 2007.
Data stream processing has gained increasing importance in recent years, both in industry and in academia. Consider the monitoring of industrial processes as an example: sensors gather large amounts of data within a short time. Storing and post-processing these data may occasionally be useless or even impossible. On the one hand, only a small part of the monitored data is relevant, so to use storage capacity efficiently, only a preselection of the data should be kept. On the other hand, the volume of incoming data may simply be too high to be stored in time, or, in other words, the technical effort for storing the data in time would be out of scale. Processing data streams in the context of this thesis means applying database operations to the stream on the fly, without explicitly storing the data. The challenges of this task lie in the limited amount of available resources, while data streams are potentially infinite. Furthermore, data stream processing must be fast and the results have to be disseminated as soon as possible. This thesis focuses on the latter issue. The goal is to provide a Quality of Service (QoS) for the data stream processing task. To this end, adequate QoS metrics such as maximum output delay or minimum result data rate are defined. Thereafter, a cost model for deriving the required processing resources from the specified QoS is presented. On that basis, the stream processing operations are scheduled. Depending on the required QoS and the available resources, weight can be shifted among the individual resources and QoS metrics. Calculating and scheduling resources requires considerable expert knowledge about the characteristics of the stream operations and the incoming data streams. Often, this knowledge is based on experience, so the resource calculation and reservation must be revised from time to time. This leads to occasional interruptions of the continuous data stream processing, of the delivery of results, and thus of the negotiated Quality of Service. The proposed robustness concept supports the user and reduces the number of such interruptions by provisioning additional resources.
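As a minimal sketch (not taken from the thesis) of how the two QoS metrics named above, maximum output delay and minimum result data rate, might be monitored for a single stream operator, the following Python class records result emissions and checks them against negotiated thresholds; the class name, thresholds and window size are illustrative assumptions.

```python
import time
from collections import deque

class QosMonitor:
    """Checks two negotiated QoS metrics for a stream operator (illustrative sketch).

    max_output_delay_s: maximum allowed time between a tuple's arrival and
                        the emission of its result.
    min_result_rate:    minimum required number of result tuples per second.
    """

    def __init__(self, max_output_delay_s=0.5, min_result_rate=100.0, window_s=5.0):
        self.max_output_delay_s = max_output_delay_s
        self.min_result_rate = min_result_rate
        self.window_s = window_s
        self._emit_times = deque()   # timestamps of emitted results
        self._violations = []

    def record_result(self, arrival_ts, emit_ts=None):
        emit_ts = emit_ts if emit_ts is not None else time.time()
        delay = emit_ts - arrival_ts
        if delay > self.max_output_delay_s:
            self._violations.append(("output_delay", delay))
        self._emit_times.append(emit_ts)
        # Drop emissions that have fallen out of the sliding rate window.
        while self._emit_times and emit_ts - self._emit_times[0] > self.window_s:
            self._emit_times.popleft()

    def result_rate(self):
        return len(self._emit_times) / self.window_s

    def qos_satisfied(self):
        return not self._violations and self.result_rate() >= self.min_result_rate
```

In a real system such a monitor would feed back into the scheduler, which is where the thesis's cost model and robustness concept come in.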
32 |
Positive und negative Aspekte des Lehramtsstudiums aus Sicht der Studierenden. Grüneberg, Tillmann; Knopf, Antje; Herfter, Christian. 02 September 2014.
In the course of demographic change, universities are entering into competition for students. If, in this sense, the university is understood as an enterprise, degree programmes as services, and students as customers of these services, then securing and further developing the quality of the entire study environment moves to the centre of attention. Quality is understood here primarily as customer satisfaction, that is, as the satisfaction of the students. Beyond that, student feedback contributes to the further development of teacher education in general. The overall view of these student assessments helps, on the one hand, to identify problem areas of the modularized BA programme and thus to recognize potential for improvement. On the other hand, looking at the positive aspects of the programme provides feedback on improvements already achieved, as well as on areas in which changes should be retained or reinforced. Not least, this serves the positive public image of the teacher education programme at Universität Leipzig.
33 |
Nachhaltig stabile Bahnlaufprozesse – Fehlerursachenanalyse und Reaktion. Ludat, Nicole; Schult, Andre. 30 May 2018.
The production of plastics and their further processing into semi-finished goods such as films and sheets, as well as into products such as packaging or structural parts, is considered highly important in German industry, not least because of the high export quotas [STAT2016]. At the same time, semi-finished products have to meet ever higher functional requirements while using less raw material [DAUN2014]. Against this background, high demands are placed today on stable process control, with increasingly little tolerance for quality fluctuations during converting (e.g. faulty print images or seal seams) [VÄTH2011]. [... from the introduction]
34 |
Positive und negative Aspekte des Lehramtsstudiums aus Sicht der Studierenden: Projektbericht; Ergebnisse aus Fragebogenstudien mit Lehramtsstudierenden an der Universität Leipzig. Grüneberg, Tillmann; Knopf, Antje; Herfter, Christian. January 2013.
35 |
Visualization and Analysis of Flow Fields based on Clifford Convolution. Ebling, Julia. 19 October 2017.
Vector fields from flow visualization often contain millions of data values, so direct inspection of the data by the user is tedious. An automated approach for the preselection of features is therefore essential for a complete analysis of nontrivial flow fields. This thesis deals with automated detection, analysis, and visualization of flow features in vector fields based on techniques transferred from image processing. The work is built on rotation-invariant template matching with Clifford convolution as developed in the author's diploma thesis. A detailed analysis of the possibilities of this approach is carried out, and further techniques and algorithms, up to a complete segmentation of vector fields, are developed in the process. One of the major contributions is the definition of a Clifford Fourier transform in 2D and 3D and the proof of a corresponding convolution theorem for the Clifford convolution, as well as other major theorems. This Clifford Fourier transform allows a frequency analysis of vector fields and of the behavior of vector-valued filters, as well as an acceleration of the convolution computation, since a fast transform exists. The depth and precision of flow field analysis based on template matching and Clifford convolution is studied in detail for a specific application, namely flow fields measured in the wake of a helicopter rotor. Determining the features and their parameters in these data is an important step towards a better understanding of the observed flow. Specific techniques dealing with subpixel accuracy and the parameters to be determined are developed along the way. Regarding the flow as a superposition of simpler features is a necessity for this application, since nearby vortices influence each other. Convolution is a linear operation and is therefore suited to this kind of analysis. The suitability of other flow analysis and visualization methods for this task is studied as well. The knowledge and techniques developed in this work are finally brought together to compute and visualize feature-based segmentations of flow fields. The resulting visualizations display important structures of the flow and highlight the interesting features. Thus, a major step towards robust and automatic detection, analysis and visualization of flow fields is taken.
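The following is a minimal sketch of the general idea behind rotation-invariant template matching on a 2D vector field, not the author's implementation: the template is rotated in steps (both its sample grid and its vectors), and the scalar (dot-product) part of the correlation, which a Clifford convolution of two vector fields also contains, is taken as the similarity. The scipy-based discretization, angle step and function names are assumptions.

```python
import numpy as np
from scipy.ndimage import rotate
from scipy.signal import correlate2d

def rotate_template(tu, tv, angle_deg):
    """Rotate a small vector-valued template: rotate the sample grid of each
    component and then rotate the vectors themselves by the same angle.
    (The sign convention depends on whether the y axis points up or down.)"""
    a = np.deg2rad(angle_deg)
    ru = rotate(tu, angle_deg, reshape=False, order=1)
    rv = rotate(tv, angle_deg, reshape=False, order=1)
    cu = np.cos(a) * ru - np.sin(a) * rv
    cv = np.sin(a) * ru + np.cos(a) * rv
    return cu, cv

def match_template(U, V, tu, tv, angles=range(0, 360, 15)):
    """Similarity field: for every position, the best (over all rotations)
    sum of dot products between template vectors and field vectors.
    This is the scalar part of a Clifford correlation of two vector
    fields; the bivector part is ignored in this sketch."""
    best = np.full(U.shape, -np.inf)
    for ang in angles:
        cu, cv = rotate_template(tu, tv, ang)
        sim = correlate2d(U, cu, mode="same") + correlate2d(V, cv, mode="same")
        best = np.maximum(best, sim)
    return best
```

Normalizing the field and template vectors beforehand makes the similarity depend on direction patterns rather than on vector magnitude, which is usually what feature detection in flow fields needs.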
36 |
Komponentendokumentationen - Eine systematische Bewertung von Ordnungssystemen aus formaler Sicht. Fettke, Peter; Loos, Peter. 26 March 2001.
Assembling individual, interacting software components into an overall system presupposes that the components needed to realize that system have first been located in a component documentation. A classification scheme that manages components in a structured way enables a targeted and systematic search. Such classification schemes can be designed in different ways. This contribution first presents different classification schemes. Criteria for evaluating classification schemes are then derived, against which the previously presented schemes are critically discussed.
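Purely as an illustration (not taken from the paper), a faceted scheme is one common way to structure such a component catalogue; the facet names and the query interface below are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    facets: dict = field(default_factory=dict)  # e.g. {"domain": "payment", "platform": "JVM"}

class FacetedCatalogue:
    """Toy faceted classification: components are retrievable by any
    combination of facet values, supporting targeted, systematic search."""

    def __init__(self):
        self._components = []

    def add(self, component: Component):
        self._components.append(component)

    def search(self, **facet_query):
        return [c for c in self._components
                if all(c.facets.get(k) == v for k, v in facet_query.items())]

catalogue = FacetedCatalogue()
catalogue.add(Component("InvoicePrinter", {"domain": "billing", "platform": "JVM"}))
catalogue.add(Component("CardGateway", {"domain": "payment", "platform": "JVM"}))
print([c.name for c in catalogue.search(domain="payment", platform="JVM")])
```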
37 |
Qualität in Kindertagesstätten. Schreiber, Mike. 08 May 2004.
Quality in child day-care centres.
38 |
Das Leipziger Inventar zur Evaluierung von Ringvorlesungen (LIER) - Fragebogen. Aust, Robert; Herfter, Christian. 15 October 2019.
With this paper we pursue two goals. First, as a contribution to higher-education and evaluation research, we scrutinize the multidimensional model of teaching quality (Rindermann, 1999) with respect to a so far neglected yet significant type of course with changing lecturers, so-called Ringvorlesungen (lecture series). Building on this, we secondly present the L***er Inventory for the Evaluation of Lecture Series (LIER) and assess its generalizability and validity. Although its preliminary and exemplary character must be pointed out, we hope for a productive response in the field of teaching evaluation with regard to the application, (statistical) examination and further development of the inventory.
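The kind of statistical examination invited here typically includes internal-consistency checks of the inventory's scales. A minimal sketch of Cronbach's alpha for one scale follows; the item data are invented and this is not part of the published inventory.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of one scale:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(sum score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert ratings from 6 respondents on a 4-item scale.
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```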
39 |
Quality assurance of 61Cu using ICP mass spectroscopy and metal complexation. Asad, A. H.; Morandeau, L.; Chan, S.; Jeffery, C. M.; Smith, S. V.; Price, R. I. January 2015.
Introduction
61Cu (T1/2 = 3.33 h, Eβ = 1.22 MeV, 61.4 %) is an attractive isotope for positron emission tomography (PET) radiopharmaceutical agents such as ATSM and PTSM. Various separation processes have been reported for the production of 61Cu on a medium-energy cyclotron using 13–22 MeV protons on natural Zn and enriched 64Zn target materials [1,2].
This work investigates the production of 61Cu using both natural Zn and enriched 64Zn targets, and its separation. Three types of resin were assessed for their efficiency and speed in separating the desired 61Cu from 66,67,68Ga and 64Zn, and for recycling of the 64Zn target material. The effective specific activity of the purified 61Cu was determined by ICP-MS and by titration with various polyaza and polycarboxylate complexing ligands.
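For orientation, a small decay-correction helper based on the half-life quoted in the introduction (3.33 h); the activity and elapsed time in the example call are invented.

```python
import math

HALF_LIFE_H = 3.33  # 61Cu half-life in hours, as quoted above

def decay_correct(activity_gbq: float, elapsed_h: float) -> float:
    """Back-correct a measured activity to an earlier reference time
    (e.g. end of bombardment): A0 = A * exp(lambda * t)."""
    lam = math.log(2) / HALF_LIFE_H
    return activity_gbq * math.exp(lam * elapsed_h)

# Hypothetical example: 1.8 GBq measured 2.5 h after end of bombardment.
print(f"Activity at EOB: {decay_correct(1.8, 2.5):.2f} GBq")
```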
Material and Methods
1. Production and Separation
Targets were irradiated with the proton beam of an IBA 18/18 MeV cyclotron via the 64Zn(p,α)61Cu and natZn(p,x)61Cu reactions, using an enriched 64Zn foil (15×15×0.05 mm, ~50 mg) and a natural Zn foil (diameter 25 mm, 0.05 mm thick, ~60 mg). Thirty-minute irradiations were conducted with incident proton energies between 11.7 and 12.0 MeV and beam currents of 20 and 40 µA. Irradiated Zn targets were dissolved in 8 M HCl at 150 °C and then evaporated to dryness. Trace water was added to the resultant residue (twice) and the resulting solutions were evaporated to dryness. The residue was re-dissolved in 2 ml of 0.01 M HCl before loading onto a Cu-resin column (FIG. 1). Zn and Ga isotopes were collectively eluted using 30 ml of 0.01 M HCl. The Cu was then removed using 1.5 ml of 8 M HCl and passed directly onto a cation exchange column followed by an anion exchange column. An additional 3 ml of 8 M HCl was used to rinse the cation exchange column and ensure quantitative removal of Cu(II) ions. The Cu was finally eluted from the anion exchange column using 3 ml of 2 M HCl. The Cu solution was heated at 150 °C until evaporated to dryness, and the final 61Cu product was dissolved in 400–800 μL of 0.01 M HCl.
2. Specific activity of 61Cu
The specific activity (GBq/µmol) of the purified 61Cu was determined by ICP-MS and compared with that determined using the DOTA, NOTA and diamsar complexing ligands.
For each 61Cu production run, an aliquot of the final solution (100 µL) was left to decay before being diluted to 10 mL with 10% HNO3. Decayed samples were sent to the ChemCentre (Curtin University) for ICP-MS analysis. Each sample was analysed for Cu, Al, Ca, Co, Fe, Ga, Ni, Si and Zn, which are known to compete with Cu2+ for ligand complexation.
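As an illustrative aside (not from the paper), the arithmetic that converts an ICP-MS copper reading into a specific activity in GBq/µmol might look as follows; the function name and all sample values are invented.

```python
CU_MOLAR_MASS = 63.55  # g/mol (natural Cu, adequate for trace quantification)

def specific_activity_gbq_per_umol(activity_gbq: float,
                                   cu_conc_ng_per_ml: float,
                                   sample_volume_ml: float) -> float:
    """Specific activity = activity / total Cu amount in the same aliquot.
    cu_conc_ng_per_ml is the ICP-MS reading of the decayed, diluted sample."""
    cu_ng = cu_conc_ng_per_ml * sample_volume_ml
    cu_umol = cu_ng * 1e-9 / CU_MOLAR_MASS * 1e6  # ng -> g -> mol -> umol
    return activity_gbq / cu_umol

# Hypothetical inputs: 2.0 GBq (decay-corrected) in the aliquot,
# ICP-MS reading 640 ng/mL of Cu in the 10 mL diluted sample.
print(f"{specific_activity_gbq_per_umol(2.0, 640.0, 10.0):.1f} GBq/umol")
```

With these made-up inputs the result, roughly 20 GBq/µmol, happens to fall inside the 9.2 to 32.4 GBq/µmol range reported in the results.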
Effective specific activity of the 61Cu was determined by titrating known concentrations of the ligands against the 61Cu solution. The method is detailed in the literature [3]. Briefly, varying concentrations of each ligand were prepared in 0.1 M sodium acetate buffer (pH 6.5) to a total volume of 20 µL. A fixed concentration of diluted 61Cu (in 0.01 M HCl, 10 µL) was added to each ligand solution. The mixtures were vortexed and then left to incubate at room temperature for 30 min. Two µL aliquots were withdrawn (in triplicate) from each reaction mixture and spotted onto ITLC-SA. [Mobile phase: 0.1 M NaCl : 0.1 M EDTA (9:1) for Cu2+ and diamsar mixtures (Rf < 0.2 Cu-diamsar; Rf > 0.8 free Cu2+), and 0.1 M sodium acetate pH 4.5 : H2O : MeOH : ammonium hydroxide (20:18:2:1 v/v) for Cu2+ DOTA and NOTA mixtures (Rf > 0.8 Cu-DOTA and Cu-NOTA; Rf < 0.2 free Cu2+).] Complexation of the 61Cu with each ligand was complete within 30 min at room temperature. The concentration of Cu2+ was determined from the 50% labelling efficiency point.
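Purely to illustrate the last step, the sketch below linearly interpolates the ligand concentration at which labelling efficiency crosses 50% from a titration series; the data points are invented, and the conversion of that concentration into an effective specific activity follows the referenced method [3] and is not reproduced here.

```python
import numpy as np

def ligand_conc_at_50_percent(ligand_umol_l, labelling_pct):
    """Interpolate the ligand concentration at 50% labelling efficiency
    from a titration series (ligand concentration vs. % of 61Cu complexed)."""
    ligand = np.asarray(ligand_umol_l, dtype=float)
    pct = np.asarray(labelling_pct, dtype=float)
    order = np.argsort(pct)          # np.interp needs ascending x values
    return float(np.interp(50.0, pct[order], ligand[order]))

# Hypothetical titration of one ligand against a fixed 61Cu aliquot.
ligand_umol_l = [0.5, 1.0, 2.0, 4.0, 8.0]
labelling_pct = [12.0, 28.0, 47.0, 71.0, 93.0]
print(f"50% point at ~{ligand_conc_at_50_percent(ligand_umol_l, labelling_pct):.2f} umol/L ligand")
```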
Results and Conclusion
1. Production and Separation
The production of radioisotope by-products from the natZn target must be minimized by choosing the optimum proton energy, in order to reduce the radiation dose in the final product. The excitation functions of 66,67,68Ga, 65Zn and 61Cu are shown in FIG. 2. A proton beam energy of 11.7 MeV was used for both Zn targets to minimise the production of Ga isotopes and to prevent the formation of 65Zn. For the enriched 64Zn target (99.30%), a higher proton energy could be used for the production of 61Cu, allowing increased yields and reduced radiocontaminants. Previously, we used anion and cation exchange resins as described in the literature to separate the 61Cu [1]. Unfortunately, that method was too long (up to 3 hours) and required high concentrations of HCl and long evaporation times, compromising the achievable yields [4]. Thieme S. et al., 2013 [2] reported the successful use of Cu-resin for the separation of Cu radioisotopes, and it was of interest to the current work to test this material for the separation of 61Cu in our hands.
A cation exchange, an anion exchange and a Cu-resin column were combined into a closed system (FIG. 1). The system is designed to contain the transfer of solutions between the columns, using a simple plunger to force the solution through and between each column. This afforded an easy, reliable and fast separation of 61Cu that could be completed within 30 min.
2. Specific activity
The specific activity of 61Cu determined using ICP-MS and by titration with the three ligands is summarized in TABLE 1. The ICP-MS data show values ranging from 9.2 to 32.4 GBq/μmol for 8 production runs. Specific activities determined using NOTA and DOTA were in all cases lower than the ICP-MS data, indicating some interference from other metal ion contaminants such as Fe(II/III), Ni(II), Ca(II), Zn(II) and Ga(III). The specific activity determined using diamsar, which is known to be highly selective for Cu(II) (as well as Zn(II) and Fe(III)) in the presence of alkali and alkaline earth ions, gave significantly higher effective specific activity values than those obtained using ICP-MS. The variations in values can be explained by the presence of contaminating metal ions.
40 |
Improving the quality of the text, a pilot project to assess and correct the OCR in a multilingual environment. Maurer, Yves. 16 October 2017.
The user expectation of a digitized collection is that a full-text search can be performed and that it will retrieve all the relevant results. The reality, however, is that the errors introduced during Optical Character Recognition (OCR) degrade the results significantly, and users do not get what they expect. The National Library of Luxembourg started its digitization programme in 2000 and began performing OCR on the scanned images in 2005. The OCR was always performed by the scanning suppliers, so over the years quite a number of different OCR programs in different versions have been used. The manual parts of the digitization chain (handling, scanning, zoning, …) are difficult, costly and largely incompressible, so the library took the view that the suppliers should focus on a high quality level for these parts. OCR is an automated process, and the library believed that the text recognized by the OCR could later be improved automatically, since OCR software improves over the years. This is why the library has never asked the suppliers for a minimum recognition rate.
The author proposes to test this assumption by first evaluating the base quality of the text extracted by the original supplier, then running a contemporary OCR program, and finally comparing its quality to the first extraction. The corpus used is the collection of digitized newspapers from Luxembourg, published from the 18th to the 20th century. A complicating factor is that the corpus contains three main languages, German, French and Luxembourgish, which are often present together on a single newspaper page. A preliminary step is therefore added to detect the language used in a block of text, so that the correct dictionaries and OCR engines can be used.
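A minimal sketch of how such a quality comparison might be scored (not the project's actual tooling): the character error rate of each OCR output against a small manually corrected ground-truth sample, computed with a plain Levenshtein distance. The sample strings are invented.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between two strings (insertions, deletions, substitutions)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def character_error_rate(ocr_text: str, ground_truth: str) -> float:
    return levenshtein(ocr_text, ground_truth) / max(len(ground_truth), 1)

# Hypothetical ground-truthed newspaper line and two OCR outputs.
truth = "Die Regierung hat gestern eine neue Verordnung erlassen."
ocr_old = "Dic Regierung hat gestem eine ncue Verordnung erlasscn."
ocr_new = "Die Regierung hat gestern eine neue Verordnung erlassen."
print(f"old OCR CER: {character_error_rate(ocr_old, truth):.3f}")
print(f"new OCR CER: {character_error_rate(ocr_new, truth):.3f}")
```

Averaging this rate over a stratified sample of pages per language and per period would show whether re-running a contemporary OCR engine actually improves on the suppliers' original text.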