[PDF] Generating sequences with recurrent neural networks
A Graves - arXiv preprint arXiv:1308.0850, 2013 - waynenterprises.com
This paper shows how Long Short-term Memory recurrent neural networks can be used to
generate complex sequences with long-range structure, simply by predicting one data point
at a time. The approach is demonstrated for text (where the data are discrete) and online
handwriting (where the data are real-valued). It is then extended to handwriting synthesis by
allowing the network to condition its predictions on a text sequence. The resulting system is
able to generate highly realistic cursive handwriting in a wide variety of styles.
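The abstract describes autoregressive generation: the network predicts a distribution over the next data point, samples from it, and feeds the sample back in as the next input. Below is a minimal sketch of this loop for the discrete (text) case, assuming PyTorch; the CharLSTM class, hidden size, and sample helper are illustrative stand-ins, not the paper's exact architecture (which also uses mixture-density outputs for the real-valued handwriting data).

```python
# Minimal sketch of autoregressive character-level generation with an LSTM.
# Assumes PyTorch; names and sizes here are illustrative, not from the paper.
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        # Returns next-symbol logits at every step, plus the recurrent state.
        h, state = self.lstm(self.embed(tokens), state)
        return self.head(h), state

def sample(model, start_token, length):
    """Generate a sequence one symbol at a time, feeding each sample back in."""
    tokens = [start_token]
    state = None
    inp = torch.tensor([[start_token]])
    with torch.no_grad():
        for _ in range(length):
            logits, state = model(inp, state)
            probs = torch.softmax(logits[0, -1], dim=-1)  # distribution over next symbol
            nxt = torch.multinomial(probs, 1).item()      # sample, then feed back in
            tokens.append(nxt)
            inp = torch.tensor([[nxt]])
    return tokens

# Example usage with an untrained model (hypothetical vocabulary size):
model = CharLSTM(vocab_size=64)
print(sample(model, start_token=0, length=20))
```

For handwriting synthesis, the same loop applies, except the output layer parameterizes a mixture of bivariate Gaussians over pen offsets, and the predictions are additionally conditioned on the text string to be written.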