![A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models | Semantic Scholar](https://d3i71xaburhd42.cloudfront.net/723fcade538f71df5fe5d1cde279686240f97b9f/4-Figure2-1.png)
![Xiang Lisa Li on Twitter: "https://t.co/sUXsgBxOAH We propose Diffusion-LM, a non-autoregressive language model based on continuous diffusions. It enables complex controllable generation. We can steer the LM to generate text with desired](https://pbs.twimg.com/media/FUm4QUWUAAIDNp0.jpg:large)
![Text Generation with No (Good) Data: New Reinforcement Learning and Causal Frameworks - Speaker Deck](https://files.speakerdeck.com/presentations/0a88a0017d014c398b2e3cc52e47b578/slide_6.jpg)