  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
561

Combined Optimization for Discontinuous Production with an Undefined Quality Criterion

Schulz, Thomas, Nekrasov, Ivan 27 January 2022 (has links)
This thesis deals with a real batch-production case from the pharmaceutical industry. The problem under investigation lies in optimizing batch quality and minimizing scrap, given that the relevant quality parameters are not measured in the plant control system. The technique proposed in this work introduces a virtual quality criterion, applied to each batch, based on the users' limited knowledge of which batch can be regarded as optimal (also called the golden batch) and can therefore serve as a reference for the batch currently in production. For this purpose we use the classical integral performance criterion, widely applied in the theory of optimal control of dynamical systems, to measure how far the current state of the system is from the 'optimal' point. Using this technique, borrowed from the neighboring discipline mentioned above, we were able to quantify the quality of each batch as a continuous measure, which allowed us to apply several efficient continuous analysis techniques to this originally batch-production case.
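The virtual quality criterion described in the abstract can be sketched as a numerical integral of the squared deviation between a batch trajectory and the golden-batch reference. A minimal numpy sketch, with all trajectory values, the sampling interval, and the rectangle-rule integration invented for illustration (the thesis's actual criterion and data are not reproduced here):

```python
import numpy as np

def integral_quality_criterion(batch, golden, dt=1.0):
    """Virtual quality criterion: integral of the squared deviation
    between a batch trajectory and the 'golden batch' reference, in the
    spirit of classical optimal-control performance indices.
    Rectangle-rule integration; dt is the sampling interval."""
    deviation = np.asarray(batch, dtype=float) - np.asarray(golden, dtype=float)
    return float(np.sum(deviation ** 2) * dt)

# Illustrative trajectories (not real plant data)
golden = np.array([0.0, 1.0, 2.0, 2.0, 1.0])
close_batch = np.array([0.1, 1.0, 1.9, 2.1, 1.0])
far_batch = np.array([0.5, 2.0, 3.0, 1.0, 0.0])

# A batch that tracks the golden reference more closely scores lower (better)
assert integral_quality_criterion(close_batch, golden) < \
       integral_quality_criterion(far_batch, golden)
```

Because the criterion is a continuous scalar per batch, standard continuous analysis techniques (ranking, trending, thresholding) can be applied directly, which is the point the abstract makes.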
562

A Machine Learning Based Visible Light Communication Model Leveraging Complementary Color Channel

Jiang, Ruizhe 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Unobtrusive Visible Light Communication (VLC) over screen-camera channels has recently gained great popularity. It overcomes the inherent drawbacks of traditional approaches based on coded images such as barcodes. One popular unobtrusive method uses the alpha channel or color channels to encode bits as changes in pixel translucency or color intensity, requiring only off-the-shelf smart devices. In particular, Uber-in-Light proved to be a successful model that encodes data into color-intensity changes using only off-the-shelf devices. However, Uber-in-Light exploits only Multiple Frequency-Shift Keying (MFSK), which limits the overall throughput of the system since each data segment is only 3 digits long. Motivated by previous work such as Inframe++ and Uber-in-Light, this thesis proposes a new VLC model that encodes data into color-intensity changes on the red and blue channels of video frames. Multiple Phase-Shift Keying (MPSK) together with MFSK is used to map 4-digit and 5-digit data segments to specific transmission frequencies and phases. To ensure transmission accuracy, a modified correlation-based demodulation method and two learning-based methods using SVM and Random Forest are also developed.
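The MFSK side of such a scheme can be illustrated in a few lines: each 4-bit data segment selects one flicker frequency for the color-intensity change, and a correlation receiver recovers it by matching each symbol window against every candidate frequency. The frequency set, frame rate, symbol length, and amplitude below are invented for illustration and are not the thesis's actual parameters:

```python
import numpy as np

# Hypothetical MFSK carrier set: 16 frequencies, one per 4-bit segment.
# With a 60 fps frame rate and 60-frame symbols, integer frequencies
# below Nyquist (30 Hz) are mutually orthogonal over a symbol window.
FREQS = np.arange(1.0, 17.0)  # 1..16 Hz, assumed

def mfsk_modulate(bits, frame_rate=60, frames_per_symbol=60, amplitude=0.05):
    """Encode 4-bit segments as small sinusoidal intensity changes on a
    color channel, sampled at the video frame rate."""
    assert len(bits) % 4 == 0
    t = np.arange(frames_per_symbol) / frame_rate
    symbols = []
    for i in range(0, len(bits), 4):
        idx = int(bits[i:i + 4], 2)  # 4-bit segment -> frequency index
        symbols.append(amplitude * np.sin(2 * np.pi * FREQS[idx] * t))
    return np.concatenate(symbols)

def correlate_demodulate(signal, frame_rate=60, frames_per_symbol=60):
    """Correlation-based receiver: for each symbol window, pick the
    candidate frequency whose reference sinusoid correlates most
    strongly with the observed intensity changes."""
    t = np.arange(frames_per_symbol) / frame_rate
    out = []
    for s in range(0, len(signal), frames_per_symbol):
        chunk = signal[s:s + frames_per_symbol]
        scores = [abs(np.dot(chunk, np.sin(2 * np.pi * f * t))) for f in FREQS]
        out.append(format(int(np.argmax(scores)), "04b"))
    return "".join(out)

assert correlate_demodulate(mfsk_modulate("10110010")) == "10110010"
```

MPSK would add a phase dimension to the same matched-filter idea; the thesis's learning-based receivers (SVM, Random Forest) replace the argmax over correlation scores with a trained classifier on those scores.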
563

Anthrax Event Detection: Analysis of Public Opinion Using Twitter During Anthrax Scares, The Mueller Investigation, and North Korean Threats

Miller, Michele E. January 2020 (has links)
No description available.
564

The networking of NC and CNC machines to facilitate the electronic transfer of CAD/CAM data

Hill, Mark Jeffrey 27 January 2015 (has links)
No description available.
565

Application of pattern recognition and adaptive DSP methods for spatio-temporal analysis of satellite based hydrological datasets

Turlapaty, Anish Chand 01 May 2010 (has links)
Data assimilation of satellite-based observations of hydrological variables with full numerical physics models can be used to downscale these observations from coarse to high resolution to improve microwave sensor-based soil moisture observations. Moreover, assimilation can also be used to predict related hydrological variables; e.g., precipitation products can be assimilated in a land information system to estimate soil moisture. High-quality spatio-temporal observations of these processes are vital for a successful assimilation, which in turn calls for detailed analysis and enhancement of the datasets. In this research, pattern recognition and adaptive signal processing methods are developed for the spatio-temporal analysis and enhancement of soil moisture and precipitation datasets. These methods are applied to accomplish the following tasks: (i) a consistency analysis of level-3 soil moisture data from the Advanced Microwave Scanning Radiometer – EOS (AMSR-E) against in-situ soil moisture measurements from the USDA Soil Climate Analysis Network (SCAN). This method performs a consistency assessment of each entire time series in relation to the others and provides a spatial distribution of consistency levels. The methodology is based on a combination of wavelet-based feature extraction and a one-class support vector machine (SVM) classifier. Spatial distributions of consistency levels are presented as consistency maps for a region including the states of Mississippi, Arkansas, and Louisiana. These results are well correlated with the spatial distributions of average soil moisture and the cumulative counts of dense vegetation; (ii) a modified singular spectral analysis based interpolation scheme is developed and validated on a few geophysical data products, including GODAE's high-resolution sea surface temperature (GHRSST). This method is later employed to fill the systematic gaps in the level-3 AMSR-E soil moisture dataset; (iii) a combination of artificial neural networks and a vector space transformation function is used to fuse several high resolution precipitation products (HRPP). The final merged product is statistically superior to any of the individual datasets over a seasonal period. The results have been tested against ground-based measurements of rainfall over our study area, and the average accuracies obtained are 85% in the summer and 55% in winter 2007.
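The wavelet-feature consistency idea in task (i) can be illustrated with a numpy-only sketch. Here a crude Haar decomposition provides per-level detail energies as features, and a simple z-score distance from the reference population stands in for the one-class SVM; the decomposition depth, series lengths, and noise levels are all assumed for illustration:

```python
import numpy as np

def haar_features(series, levels=3):
    """Crude Haar wavelet decomposition: returns the energy of the
    detail coefficients at each level, a simple stand-in for the
    wavelet-based feature extraction described in the abstract."""
    x = np.asarray(series, dtype=float)
    feats = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        feats.append(np.sum(detail ** 2))
        x = approx
    return np.array(feats)

def consistency_score(candidate, references):
    """Distance of a candidate series' feature vector from the mean of
    the reference (in-situ-consistent) population, in units of the
    population spread; lower is more consistent. A one-class SVM plays
    this role in the actual study."""
    ref_feats = np.array([haar_features(r) for r in references])
    mu = ref_feats.mean(axis=0)
    sigma = ref_feats.std(axis=0) + 1e-9  # avoid division by zero
    z = (haar_features(candidate) - mu) / sigma
    return float(np.linalg.norm(z))
```

Evaluating the score on a grid of satellite pixels, one per time series, yields exactly the kind of spatial consistency map the abstract describes.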
566

Characterization of a Conventional Friction Stir Welding Machine

Brendel, Michael Smith 12 May 2012 (has links)
Process forces arising during Friction Stir Welding (FSW) have become of interest to investigators seeking weld-quality information in recorded weld data. Successful analysis of process forces requires separating force signals stemming from material-flow mechanisms within the weld from signals influenced by the FSW machine. Three modes of FSW control were characterized for system response: Servo Position Control (SPC), Electronic Deflection Compensation (EDC), and Constant Load Control (CLC). The gain values of the feedback loops associated with the EDC and CLC modes were altered and characterized. The response of SPC mode to vertical changes in tool position was also characterized. Machine-specific force signatures associated with the motor transmission assembly and spindle resonance were also identified. Characterizing the influence of machine control modes and other machine-specific frequencies on process force signals will allow future investigators to identify segments of welds during which machine actions influenced the recorded force data.
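Identifying machine-specific force signatures of the kind mentioned above typically comes down to locating persistent spectral peaks in the recorded force signal. A minimal sketch, where the sampling rate and the two component frequencies (a tool-rotation line and a spindle line) are invented for illustration and are not the thesis's measured values:

```python
import numpy as np

def dominant_frequencies(force, sample_rate, n_peaks=2):
    """Locate the strongest spectral peaks in a recorded force signal.
    Peaks that persist across welds (e.g. at transmission or spindle
    frequencies) can be flagged as machine signatures rather than
    material-flow effects."""
    spectrum = np.abs(np.fft.rfft(force - np.mean(force)))
    freqs = np.fft.rfftfreq(len(force), d=1.0 / sample_rate)
    top = np.argsort(spectrum)[::-1][:n_peaks]
    return sorted(freqs[top])

fs = 1000  # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic force: 12 Hz tool-rotation component + 60 Hz spindle line
force = 1.0 * np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
assert dominant_frequencies(force, fs) == [12.0, 60.0]
```

Subtracting or notching these machine lines before analysis leaves the residual force signal attributable to material flow, which is the separation the abstract calls for.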
567

Categorical approach to automata theory

Sznajder-Glodowski, Malgorzata January 1986 (has links)
No description available.
568

An approach to the implementation of industrial sequential logic controllers /

Tabachnik, Ritchie L. (Ritchie Lee) January 1982 (has links)
No description available.
569

Applications of algebraic automata theory to quantum finite automata

Mercer, Mark. January 2007 (has links)
No description available.
570

A Strategy Oriented, Machine Learning Approach to Automatic Quality Assessment of Wikipedia Articles

De La Calzada, Gabriel 01 April 2009 (has links) (PDF)
This work discusses an approach to modeling and measuring the information quality of Wikipedia articles. The approach is based on the idea that the quality of Wikipedia articles with distinctly different profiles needs to be measured using different information quality models. To implement this approach, a software framework written in the Java language was developed to collect and analyze information about Wikipedia articles. We report on our initial study, which involved two categories of Wikipedia articles: "stabilized" (those whose content has not undergone major changes for a significant period of time) and "controversial" (articles that have undergone vandalism, revert wars, or whose content is subject to internal discussion among Wikipedia editors). In addition, we present simple information quality models and compare their performance on a subset of Wikipedia articles with the information quality evaluations provided by human users. Our experiment shows that using special-purpose models for information quality captures user sentiment about Wikipedia articles better than using a single model for both categories of articles.
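The category-specific modeling idea can be sketched very simply: score an article with a model chosen by its profile rather than with one model for all articles. The feature names, weights, and example values below are invented for illustration; the thesis's framework is Java-based and its actual models are not reproduced here:

```python
# Hypothetical per-category quality models: linear scoring functions
# with weights chosen per article profile. All names and numbers are
# illustrative, not the study's fitted models.
STABILIZED_WEIGHTS = {"references_per_kb": 0.6, "section_depth": 0.4}
CONTROVERSIAL_WEIGHTS = {"references_per_kb": 0.3, "section_depth": 0.1,
                         "revert_ratio": -0.6}

def quality_score(features, category):
    """Weighted linear quality model, selected by article category."""
    weights = (CONTROVERSIAL_WEIGHTS if category == "controversial"
               else STABILIZED_WEIGHTS)
    return sum(w * features.get(name, 0.0) for name, w in weights.items())

stable_article = {"references_per_kb": 0.8, "section_depth": 0.5}
contested_article = {"references_per_kb": 0.8, "section_depth": 0.5,
                     "revert_ratio": 0.7}

# A heavily reverted article is penalized only under the model that
# knows about reverts -- the point of using special-purpose models.
assert quality_score(stable_article, "stabilized") > \
       quality_score(contested_article, "controversial")
```

The single-model baseline in the study corresponds to applying one weight set to both categories, which ignores category-specific signals such as revert activity.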
