
Generating rhyming poetry using LSTM recurrent neural networks

Current approaches to generating rhyming English poetry with a neural network
involve constraining output to enforce the condition of rhyme. We investigate whether
this approach is necessary, or if recurrent neural networks can learn rhyme patterns
on their own. We compile a new dataset of amateur poetry which allows rhyme
to be learned without external constraints because of the dataset’s size and high
frequency of rhymes. We then evaluate models trained on the new dataset using a
novel framework that automatically measures the system’s knowledge of poetic form
and generalizability. We find that our trained model generalizes the pattern of
rhyme and generates rhymes unseen in the training data, and that the learned word
embeddings for rhyming sets of words are linearly separable. Our model generates
couplets that rhyme 68.15% of the time; this is the first time a recurrent neural
network has been shown to generate rhyming poetry at such a high rate. Additionally,
we show that crowd-sourced workers can distinguish between our generated couplets
and couplets from our dataset only 63.3% of the time, indicating that our model
generates poetry with coherence, semantic meaning, and fluency comparable to
couplets written by humans.
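
The automatic rhyme measurement mentioned in the abstract can be illustrated with the CMU Pronouncing Dictionary. The sketch below is not the thesis's evaluation code; it is a minimal, hypothetical checker that judges a couplet as rhyming when the last words of its two lines share a rhyming part, using the `pronouncing` library.

```python
# Minimal sketch of an automatic rhyme check for couplets.
# Assumption: this is NOT the thesis's evaluation code; it illustrates one
# common approach using the CMU Pronouncing Dictionary via the `pronouncing`
# library (pip install pronouncing).
from typing import Optional

import pronouncing


def rhyming_part(word: str) -> Optional[str]:
    """Return the rhyming part (final stressed vowel onward) of a word,
    or None if the word is not in the CMU dictionary."""
    phones = pronouncing.phones_for_word(word.lower())
    if not phones:
        return None
    # Use the first listed pronunciation; a real evaluator might try all.
    return pronouncing.rhyming_part(phones[0])


def couplet_rhymes(line1: str, line2: str) -> bool:
    """Judge a couplet as rhyming if the last words of both lines share a
    rhyming part (a hypothetical criterion, for illustration only)."""
    last1 = line1.strip().split()[-1].strip(".,;:!?")
    last2 = line2.strip().split()[-1].strip(".,;:!?")
    r1, r2 = rhyming_part(last1), rhyming_part(last2)
    return r1 is not None and r1 == r2


if __name__ == "__main__":
    print(couplet_rhymes("The night was calm and deep",
                         "The world had gone to sleep"))  # True: deep/sleep
```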
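The linear-separability claim about embeddings of rhyming word sets can likewise be probed with a simple linear classifier. The sketch below assumes a hypothetical word-to-vector mapping (`embeddings`) and two example rhyme classes; it fits a logistic-regression probe and reports training accuracy, which would approach 100% if the classes were linearly separable.

```python
# Sketch of a linear-separability probe for rhyme classes in embedding space.
# Assumptions: `embeddings` (word -> vector) and the two rhyme classes are
# hypothetical stand-ins; the thesis's actual probe may differ.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical learned embeddings; in practice these come from the LSTM.
words = ["deep", "sleep", "keep", "weep", "night", "light", "bright", "sight"]
embeddings = {w: rng.normal(size=32) for w in words}

# Two rhyming sets: the "-eep" class (label 0) and the "-ight" class (label 1).
X = np.stack([embeddings[w] for w in words])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

probe = LogisticRegression().fit(X, y)
# High training accuracy on the real learned embeddings would indicate the
# two rhyme classes are linearly separable in the embedding space.
print("probe accuracy:", probe.score(X, y))
```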

Identifier: oai:union.ndltd.org:uvic.ca/oai:dspace.library.uvic.ca:1828/10801
Date: 30 April 2019
Creators: Peterson, Cole
Contributors: Fyshe, Alona
Source Sets: University of Victoria
Language: English
Type: Thesis
Format: application/pdf
Rights: Available to the World Wide Web
