  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

The Impacts of Social-Emotional Competence and Other Student, Parent, and School Influences on Kindergarten Achievement

Schiavone, Vincent J. 10 December 2018 (has links)
No description available.
22

Mikrofonní pole malých rozměrů pro odhad směru přicházejícího zvuku / Small-Size Microphone Array for Estimation of Direction of Arrival of Sound

Kubišta, Ladislav January 2020 (has links)
This thesis describes the detection of the direction of arriving sound using a small-size microphone array. The work is based on analyzing methods of time delay estimation, energy decay, and signal phase difference, and focuses mainly on finding the angle of arrival at small time differences. The conclusion presents measurement results for both programmatically generated sound and sound recorded in laboratory conditions and in a real environment. All calculations were done in Matlab.
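The time-delay approach the abstract mentions can be sketched briefly: with two microphones, the inter-channel delay found at the peak of the cross-correlation, together with the microphone spacing and the speed of sound, gives a far-field angle of arrival. This is a minimal illustration in Python/NumPy rather than the thesis's Matlab code; the signal, the 5 cm spacing, and the function names are assumptions for the example.

```python
import numpy as np

def tdoa(x, y, fs):
    """Time (seconds) by which signal x lags signal y, estimated
    from the peak of their full cross-correlation."""
    corr = np.correlate(x, y, mode="full")
    lag = np.argmax(corr) - (len(y) - 1)  # lag in samples
    return lag / fs

def angle_of_arrival(delay, mic_distance, c=343.0):
    """Far-field angle (radians) from array broadside, given the
    inter-microphone delay, mic spacing (m), and speed of sound (m/s)."""
    sin_theta = np.clip(c * delay / mic_distance, -1.0, 1.0)
    return np.arcsin(sin_theta)

# Synthetic check: a Gaussian pulse arriving 3 samples later at mic B.
fs = 48_000
d = 0.05                      # hypothetical 5 cm spacing
n = np.arange(1024)
pulse = np.exp(-((n - 200) ** 2) / 50.0)
sig_a = pulse                 # reference microphone
sig_b = np.roll(pulse, 3)     # delayed by 3 samples

tau = tdoa(sig_b, sig_a, fs)          # recovered delay in seconds
theta = angle_of_arrival(tau, d)      # angle in radians
```

Phase-difference and energy-decay methods, also examined in the thesis, trade off differently against this approach (e.g. phase methods suffer ambiguity above the spatial aliasing frequency), but the cross-correlation sketch above captures the basic geometry.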
23

Large-Context Question Answering with Cross-Lingual Transfer

Sagen, Markus January 2021 (has links)
Models based on the transformer architecture have become among the most prominent for solving a multitude of natural language processing (NLP) tasks since the architecture's introduction in 2017. However, much research related to the transformer model has focused primarily on achieving high performance, and many problems remain unsolved. Two of the most prominent currently are the lack of high-performing non-English pre-trained models and the limited number of words most trained models can incorporate as context. Solving these problems would make NLP models more suitable for real-world applications, improving information retrieval, reading comprehension, and more. All previous research has focused on incorporating long context for English language models. This thesis investigates cross-lingual transferability between languages when training for long context in English only. Training long-context models in English only could make long context in low-resource languages, such as Swedish, more accessible, since such data is hard to find in most languages and training for each language is costly. This could become an efficient method for creating long-context models in other languages without needing such data in all languages or pre-training from scratch. We extend the models' context using the training scheme of the Longformer architecture and fine-tune on a question-answering task in several languages. Our evaluation could not satisfactorily confirm nor deny whether transferring long-term context is possible for low-resource languages. We believe that using datasets that require long-context reasoning, such as a multilingual TriviaQA dataset, could demonstrate our hypothesis's validity.
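The Longformer training scheme the abstract refers to replaces full self-attention with a sparse pattern dominated by a sliding local window, which is what lets the context grow beyond the usual limit. A minimal sketch of that local-attention mask, assuming a toy sequence length and window size (the thesis itself fine-tunes real pre-trained models rather than building masks by hand):

```python
import numpy as np

def sliding_window_mask(seq_len, window):
    """Boolean attention mask where each token attends only to tokens
    within `window` positions on either side, as in Longformer-style
    local attention (global-attention tokens omitted for brevity)."""
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window

# Toy example: 8 tokens, window of 2 positions per side.
mask = sliding_window_mask(8, 2)
```

Because each row of the mask has at most `2 * window + 1` true entries, attention cost scales linearly in sequence length instead of quadratically, which is the property that makes long-context fine-tuning tractable.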
