
Text Generation

Base

ACL 2019

Syntax-Infused Variational Autoencoder for Text Generation. ACL 2019. [PDF]
We present a syntax-infused variational autoencoder (SIVAE) that integrates sentences with their syntactic trees to improve the grammar of generated sentences.

Enhancing Variational Autoencoders with Mutual Information Neural Estimation for Text Generation. ACL 2019. [PDF]

Using Semantic Similarity as Reward for Reinforcement Learning in Sentence Generation. ACL 2019. [PDF]

Towards Generating Long and Coherent Text with Multi-Level Latent Variable Models. ACL 2019. [PDF]

Adversarial Domain Adaptation Using Artificial Titles for Abstractive Title Generation. ACL 2019. [PDF]
This paper examines techniques for adapting from a labeled source domain to an unlabeled target domain in the context of an encoder-decoder model for text generation.

Learning to Control the Fine-grained Sentiment for Story Ending Generation. ACL 2019. [PDF]
In this paper, we propose to generate sentences from disentangled syntactic and semantic spaces.

Neural Keyphrase Generation via Reinforcement Learning with Adaptive Rewards. ACL 2019. [PDF]
To address this problem, we propose a reinforcement learning (RL) approach for keyphrase generation, with an adaptive reward function that encourages a model to generate both sufficient and accurate keyphrases.
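
The "adaptive" part can be sketched roughly as below: reward recall while the model has generated fewer keyphrases than the gold set, and F1 once it has generated enough. This is a minimal illustration of the sufficient-and-accurate idea, not necessarily the paper's exact reward definition.

```python
def adaptive_reward(predicted, gold):
    """Encourage both sufficiency and accuracy: recall while the prediction
    set is smaller than the gold set (push for more keyphrases), F1 once
    enough have been generated (push for precision). A sketch of the general
    idea, not the paper's exact formula."""
    pred, ref = set(predicted), set(gold)
    if not pred or not ref:
        return 0.0
    correct = len(pred & ref)
    recall = correct / len(ref)
    precision = correct / len(pred)
    if len(pred) < len(ref):        # "insufficient" phase: reward coverage
        return recall
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)   # "sufficient" phase

print(adaptive_reward(["neural networks"],
                      ["neural networks", "keyphrase generation"]))      # 0.5 (recall)
print(adaptive_reward(["neural networks", "deep learning"],
                      ["neural networks", "keyphrase generation"]))      # 0.5 (F1)
```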

Decomposable Neural Paraphrase Generation. ACL 2019. [PDF]
This paper presents Decomposable Neural Paraphrase Generator (DNPG), a Transformer-based model that can learn and generate paraphrases of a sentence at different levels of granularity in a disentangled way.

Large-Scale Transfer Learning for Natural Language Generation. ACL 2019. [PDF]
We focus in particular on open-domain dialog as a typical high entropy generation task, presenting and comparing different architectures for adapting pretrained models with state of the art results.

Self-Attention Architectures for Answer-Agnostic Neural Question Generation. ACL 2019. [PDF]
We explore how Transformers can be adapted to the task of Neural Question Generation without constraining the model to focus on a specific answer passage.

Dual Supervised Learning for Natural Language Understanding and Generation. ACL 2019. [PDF]
This paper proposes a novel learning framework for natural language understanding and generation on top of dual supervised learning, providing a way to exploit the duality.

Word2Sense: Sparse Interpretable Word Embeddings. ACL 2019. [PDF]
We present an unsupervised method to generate Word2Sense word embeddings that are interpretable – each dimension of the embedding space corresponds to a fine-grained sense, and the non-negative value of the embedding along the j-th dimension represents the relevance of the j-th sense to the word.
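
To make the interpretability claim concrete, the sketch below reads off the top senses of a word from such a non-negative, sparse embedding. The matrix and sense labels here are random placeholders, not the authors' released embeddings.

```python
import numpy as np

# Placeholder stand-ins: a non-negative word-by-sense matrix and sense labels.
rng = np.random.default_rng(0)
num_senses = 8
vocab = ["bank", "river", "loan", "guitar", "rock"]
embeddings = np.maximum(rng.normal(size=(len(vocab), num_senses)), 0.0)
sense_labels = [f"sense_{j}" for j in range(num_senses)]

def top_senses(word, k=3):
    """Return the k senses with the largest non-zero weight for a word;
    dimension j is interpreted as the relevance of the j-th sense."""
    vec = embeddings[vocab.index(word)]
    order = np.argsort(-vec)
    return [(sense_labels[j], round(float(vec[j]), 3)) for j in order[:k] if vec[j] > 0]

print(top_senses("bank"))
```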

Enhancing Unsupervised Generative Dependency Parser with Contextual Information. ACL 2019. [PDF]
In this paper, we propose a novel probabilistic model called discriminative neural dependency model with valence (D-NDMV) that generates a sentence and its parse from a continuous latent representation, which encodes global contextual information of the generated sentence.

Exploring Pre-trained Language Models for Event Extraction and Generation. ACL 2019. [PDF]
To promote event extraction, we first propose an event extraction model that overcomes the role overlap problem by separating the argument prediction in terms of roles. Moreover, to address the problem of insufficient training data, we propose a method that automatically generates labeled data by editing prototypes and screens out generated samples by ranking their quality.

Incorporating Linguistic Constraints into Keyphrase Generation. ACL 2019. [PDF]
In this paper, we propose the parallel Seq2Seq network with the coverage attention to alleviate the overlapping phrase problem.
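
For reference, the sketch below shows the generic coverage idea: accumulate past attention and penalise source positions that have already received mass, which discourages re-generating the same phrase. The paper's parallel Seq2Seq formulation may differ in detail.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def coverage_attention(step_scores, penalty=1.0):
    """Generic coverage mechanism: subtract the accumulated attention
    (coverage) from the raw scores before each step's softmax."""
    coverage = np.zeros_like(step_scores[0])
    history = []
    for scores in step_scores:
        attn = softmax(scores - penalty * coverage)
        coverage += attn
        history.append(attn)
    return history

# The same raw scores at two steps: the second step shifts away from
# positions already attended to at the first step.
steps = [np.array([2.0, 1.0, 0.5]), np.array([2.0, 1.0, 0.5])]
for attn in coverage_attention(steps):
    print(np.round(attn, 3))
```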

Generating Summaries with Topic Templates and Structured Convolutional Decoders. ACL 2019. [PDF]
In this paper we propose a structured convolutional decoder that is guided by the content structure of target summaries.

Keeping Notes: Conditional Natural Language Generation with a Scratchpad Encoder. ACL 2019. [PDF]
We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks.

Argument Generation with Retrieval, Planning, and Realization. ACL 2019. [PDF]
In this paper, we study the specific problem of counter-argument generation, and present a novel framework, CANDELA.

Inducing Document Structure for Aspect-based Summarization. ACL 2019. [PDF]
We tackle the task of aspect-based summarization, where, given a document and a target aspect, our models generate a summary centered around the aspect.

NAACL 2019

Topic-Guided Variational Auto-Encoder for Text Generation. NAACL 2019. [PDF]
We propose a topic-guided variational auto-encoder (TGVAE) model for text generation.

Keyphrase Generation: A Text Summarization Struggle. NAACL 2019. [PDF]
In this paper, we explore the possibility of considering the keyphrase string as an abstractive summary of the title and the abstract. First, we collect, process and release a large dataset of scientific paper metadata that contains 2.2 million records.

Jointly Optimizing Diversity and Relevance in Neural Response Generation. NAACL 2019. [PDF]
In this paper, we propose a SpaceFusion model to jointly optimize diversity and relevance that essentially fuses the latent space of a sequence-to-sequence model and that of an autoencoder model by leveraging novel regularization terms.

Improving Human Text Comprehension through Semi-Markov CRF-based Neural Section Title Generation. NAACL 2019. [PDF]
In particular, we present an extractive pipeline for section title generation by first selecting the most salient sentence and then applying deletion-based compression.

Pun Generation with Surprise. NAACL 2019. [PDF]
In this paper, we propose an unsupervised approach to pun generation based on a large corpus of raw (unhumorous) text and a surprisal principle.

Latent Code and Text-based Generative Adversarial Networks for Soft-text Generation. NAACL 2019. [PDF]
In this work, we introduce a novel text-based approach called Soft-GAN to effectively exploit GAN setup for text generation.

Neural Text Generation from Rich Semantic Representations. NAACL 2019. [PDF]
We propose neural models to generate high-quality text from structured representations based on Minimal Recursion Semantics (MRS).

Text Generation with Exemplar-based Adaptive Decoding. NAACL 2019. [PDF]
We propose a novel conditioned text generation model.

Towards Content Transfer through Grounded Text Generation. NAACL 2019. [PDF]
This paper introduces the notion of Content Transfer for long-form text generation, where the task is to generate a next sentence in a document that both fits its context and is grounded in a content-rich external textual source such as a news story. As another contribution of this paper, we release a benchmark dataset of 640k Wikipedia referenced sentences paired with the source articles to encourage exploration of this new task.

An Integrated Approach for Keyphrase Generation via Exploring the Power of Retrieval and Extraction. NAACL 2019. [PDF]
In this paper, we present a novel integrated approach for keyphrase generation (KG).

Accelerated Reinforcement Learning for Sentence Generation by Vocabulary Prediction. NAACL 2019. [PDF]
To improve the efficiency of reinforcement learning, we present a novel approach for reducing the action space based on dynamic vocabulary prediction.
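
A minimal sketch of the action-space reduction: mask the logits to a small per-sentence vocabulary predicted in advance, so the RL policy only samples from that subset. The vocabularies below are hypothetical; the prediction model itself is omitted.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

full_vocab = ["<eos>", "the", "cat", "sat", "dog", "ran", "on", "mat"]
predicted_vocab = {"<eos>", "the", "cat", "sat", "on", "mat"}   # hypothetical prediction

def restricted_distribution(logits):
    """Mask out tokens outside the predicted vocabulary before the softmax,
    shrinking the action space the policy samples from."""
    mask = np.array([0.0 if w in predicted_vocab else -1e9 for w in full_vocab])
    return softmax(np.asarray(logits) + mask)

logits = np.random.default_rng(0).normal(size=len(full_vocab))
probs = restricted_distribution(logits)
print({w: round(float(p), 3) for w, p in zip(full_vocab, probs) if p > 1e-6})
```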

Pre-trained language model representations for language generation. NAACL 2019. [PDF]
In this paper, we examine different strategies to integrate pre-trained representations into sequence to sequence models and apply it to neural machine translation and abstractive summarization.

Pragmatically Informative Text Generation. NAACL 2019. [PDF]
We consider two pragmatic modeling methods for text generation: one where pragmatics is imposed by information preservation, and another where pragmatics is imposed by explicit modeling of distractors.

Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation. NAACL 2019. [PDF]
In this paper, we propose to use the Wasserstein autoencoder (WAE) for probabilistic sentence generation, where the encoder could be either stochastic or deterministic.
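
For orientation, a WAE-style objective typically replaces the VAE's per-sample KL term with a distribution-level penalty such as MMD between encoded codes and prior samples. The sketch below is a minimal numpy version; the kernel, weighting and reconstruction term are illustrative, not the paper's settings.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd(z_q, z_p, sigma=1.0):
    """Biased MMD^2 estimate between encoder codes z_q and prior samples z_p."""
    return (rbf_kernel(z_q, z_q, sigma).mean()
            + rbf_kernel(z_p, z_p, sigma).mean()
            - 2 * rbf_kernel(z_q, z_p, sigma).mean())

rng = np.random.default_rng(0)
z_q = rng.normal(loc=0.5, size=(64, 16))   # stand-in for encoder outputs
z_p = rng.normal(size=(64, 16))            # samples from the N(0, I) prior
reconstruction_loss = 1.23                 # placeholder for the seq2seq term
lam = 10.0
print("WAE-style objective:", reconstruction_loss + lam * mmd(z_q, z_p))
```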

NAACL 2018

Discourse-Aware Neural Rewards for Coherent Text Generation. NAACL 2018. [PDF]
In this paper, we investigate the use of discourse-aware rewards with reinforcement learning to guide a model to generate long, coherent text.

Neural Text Generation in Stories Using Entity Representations as Context. NAACL 2018. [PDF]
We introduce an approach to neural text generation that explicitly represents entities mentioned in the text.

A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. NAACL 2018. [PDF]
We describe an ensemble neural language generator, and present several novel methods for data representation and augmentation that yield improved results in our model.

Adversarial Example Generation with Syntactically Controlled Paraphrase Networks. NAACL 2018. [PDF]
We propose syntactically controlled paraphrase networks (SCPNs) and use them to generate adversarial examples.

Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation. NAACL 2018. [PDF]
We present the Word Embedding Attention Network (WEAN), which generates words by querying distributed word representations (i.e., neural word embeddings) for paraphrase generation, aiming to better capture the meaning of the generated words.

Natural Language to Structured Query Generation via Meta-Learning. NAACL 2018. [PDF]
In this work, we explore a different learning protocol that treats each example as a unique pseudo-task, by reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function.

Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network. NAACL 2018. [PDF]
We propose a guiding generation model that combines the extractive method and the abstractive method.

Natural Language Generation by Hierarchical Decoding with Linguistic Patterns. NAACL 2018. [PDF]
This paper introduces a hierarchical decoding NLG model based on linguistic patterns in different levels, and shows that the proposed method outperforms the traditional one with a smaller model size.

RankME: Reliable Human Ratings for Natural Language Generation. NAACL 2018. [PDF]
We present a novel rank-based magnitude estimation method (RankME), which combines the use of continuous scales and relative assessments.

Identifying the Most Dominant Event in a News Article by Mining Event Coreference Relations. NAACL 2018. [PDF]
Identifying the most dominant and central event of a document, which governs and connects other foreground and background events in the document, is useful for many applications, such as text summarization, storyline generation and text segmentation.

EMNLP 2019

Sentence-Level Content Planning and Style Specification for Neural Text Generation. EMNLP 2019. [PDF]
Denoising-based Sequence-to-Sequence Pre-training for Text Generation. EMNLP 2019. [PDF]
A Topic Augmented Text Generation Model: Joint Learning of Semantics and Structural Features. EMNLP 2019. [PDF]
ARAML: A Stable Adversarial Training Framework for Text Generation. EMNLP 2019. [PDF]
Deep Copycat Networks for Text-to-Text Generation. EMNLP 2019. [PDF]
Implicit Deep Latent Variable Models for Text Generation. EMNLP 2019. [PDF]
Long and Diverse Text Generation with Planning-based Hierarchical Variational Model. EMNLP 2019. [PDF]
Select and Attend: Towards Controllable Content Selection in Text Generation. EMNLP 2019. [PDF]
Autoregressive Text Generation beyond Feedback Loops. EMNLP 2019. [PDF]
Translate and Label! An Encoder-Decoder Approach for Cross-lingual Semantic Role Labeling. EMNLP 2019. [PDF]
Contrastive Attention Mechanism for Abstractive Sentence Summarization. EMNLP 2019. [PDF]
Clickbait? Sensational Headline Generation with Auto-tuned Reinforcement Learning. EMNLP 2019. [PDF]
Concept Pointer Network for Abstractive Summarization. EMNLP 2019. [PDF]
Mixture Content Selection for Diverse Sequence Generation. EMNLP 2019. [PDF]
An End-to-End Generative Architecture for Paraphrase Generation. EMNLP 2019. [PDF]
Exploring Diverse Expressions for Paraphrase Generation. EMNLP 2019. [PDF]
Attending to Future Tokens for Bidirectional Sequence Generation. EMNLP 2019. [PDF]

AAAI 2020

Attractive or Faithful? Popularity-Reinforced Learning for Inspired Headline Generation. [PDF]
Learning to Compare for Better Training and Evaluation of Open Domain Text Generation Models. [PDF]
Sentence Generation for Entity Description with Content-plan Attention. [PDF]
Recurrent Nested Model for Sequence Generation. [PDF]
Active Learning with Query Generation for Cost-Effective Text Classification. [PDF]
CatGAN: Category-aware Generative Adversarial Networks with Hierarchical Evolutionary Learning for Category Text Generation. [PDF]
Structure Learning for Headline Generation. [PDF]
A Pre-training Based Personalized Dialogue Generation Model with Persona-sparse Data. [PDF]
A Dataset for Low-Resource Stylized Sequence-to-Sequence Generation. [PDF]
Complementary Auxiliary Classifiers for Label-Conditional Text Generation. [PDF]
Cross-Lingual Natural Language Generation via Pre-Training. [PDF]
Open Domain Event Text Generation. [PDF]
Joint Parsing and Generation for Abstractive Summarization. [PDF]
A Meta Cooperative Training Paradigm for Improving Adversarial Text Generation. [PDF]
Sequence Generation with Optimal-Transport-Enhanced Reinforcement Learning. [PDF]

AAAI 2019

Differentiated Distribution Recovery for Neural Text Generation. AAAI 2019. [PDF]

AAAI 2018

Controlling Global Statistics in Recurrent Neural Network Text Generation. AAAI 2018. [PDF]
Long Text Generation via Adversarial Training with Leaked Information. AAAI 2018. [PDF]
Order-Planning Neural Text Generation From Structured Data. AAAI 2018. [PDF]

Table-Text

ACL 2019

Towards Comprehensive Description Generation from Factual Attribute-value Tables. ACL 2019. [PDF]
To relieve these problems, we first propose a force attention (FA) method that encourages the generator to pay more attention to uncovered attributes, so that key attributes are not missed. Furthermore, we propose reinforcement learning for information richness, generating descriptions of tables that are both more informative and more faithful.

Key Fact as Pivot: A Two-Stage Model for Low Resource Table-to-Text Generation. ACL 2019. [PDF]
In this work, we consider the scenario of low resource table-to-text generation, where only limited parallel data is available.

EMNLP 2019

Table-to-Text Generation with Effective Hierarchical Encoder on Three dimensions (Row, Column and Time). EMNLP 2019. [PDF]

AAAI 2018

Table-to-text Generation by Structure-aware Seq2seq Learning. AAAI 2018. [PDF]

Data-Text

ACL 2019

Data-to-text Generation with Entity Modeling. ACL 2019. [PDF]
In this work we propose an entity-centric neural architecture for data-to-text generation.

Learning to Select, Track, and Generate for Data-to-Text. ACL 2019. [PDF]
We propose a data-to-text generation model with two modules, one for tracking and the other for text generation.

ACL 2018

A Graph-to-Sequence Model for AMR-to-Text Generation. ACL 2018. [PDF]

NAACL 2019

Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation. NAACL 2019. [PDF]
For training a plan-to-text generator, we present a method for matching reference texts to their corresponding text plans.

Text Generation from Knowledge Graphs with Graph Transformers. NAACL 2019. [PDF]
In this work, we address the problem of generating coherent multi-sentence texts from the output of an information extraction system, and in particular a knowledge graph.

Structural Neural Encoders for AMR-to-text Generation. NAACL 2019. [PDF]
We investigate the extent to which reentrancies (nodes with multiple parents) have an impact on AMR-to-text generation by comparing graph encoders to tree encoders, where reentrancies are not preserved.

EMNLP 2019

Modeling Graph Structure in Transformer for Better AMR-to-Text Generation. EMNLP 2019. [PDF]
Enhancing AMR-to-Text Generation with Dual Graph Representations. EMNLP 2019. [PDF]
Enhancing Neural Data-To-Text Generation Models with External Background Knowledge. EMNLP 2019. [PDF]
Neural data-to-text generation: A comparison between pipeline and end-to-end architectures. EMNLP 2019. [PDF]

AAAI 2019

Data-to-Text Generation with Content Selection and Planning. AAAI 2019. [PDF]
Hierarchical Encoder with Auxiliary Supervision for Table-to-text Generation: Learning Better Representation for Tables. AAAI 2019. [PDF]

Multimodal-Text

ACL 2019

What Should I Ask? Using Conversationally Informative Rewards for Goal-oriented Visual Dialog. ACL 2019. [PDF]
In this work, we focus on the task of goal-oriented visual dialogue, aiming to automatically generate a series of questions about an image with a single objective.

Dense Procedure Captioning in Narrated Instructional Videos. ACL 2019. [PDF]
Motivated by video dense captioning, we propose a model to generate procedure captions from narrated instructional videos which are a sequence of step-wise clips with description.

Bridging by Word: Image Grounded Vocabulary Construction for Visual Captioning. ACL 2019. [PDF]
To tackle this problem, we propose to construct an image-grounded vocabulary, based on which, captions are generated with limitation and guidance.

Improving Visual Question Answering by Referring to Generated Paragraph Captions. ACL 2019. [PDF]
Hence, we propose a combined Visual and Textual Question Answering (VTQA) model which takes as input a paragraph caption as well as the corresponding image, and answers the given question based on both inputs.

Ordinal and Attribute Aware Response Generation in a Multimodal Dialogue System. ACL 2019. [PDF]
In this paper, we propose a novel position and attribute aware attention mechanism to learn enhanced image representation conditioned on the user utterance.

NAACL 2018

What’s This Movie About? A Joint Neural Network Architecture for Movie Content Analysis. NAACL 2018. [PDF]
We present a novel end-to-end model for overview generation, consisting of a multi-label encoder for identifying screenplay attributes, and an LSTM decoder to generate natural language sentences conditioned on the identified attributes. We create a dataset that consists of movie scripts, attribute-value pairs for the movies’ aspects, as well as overviews, which we extract from an online database.

ECCV 2018

Diverse and Coherent Paragraph Generation from Images. ECCV 2018. [PDF]

Question Answer

ACL 2019

Generating Question-Answer Hierarchies. ACL 2019. [PDF]
In this paper, we present SQUASH (Specificity-controlled Question-Answer Hierarchies), a novel and challenging text generation task that converts an input document into a hierarchy of question-answer pairs.

Generating Question Relevant Captions to Aid Visual Question Answering. ACL 2019. [PDF]
We present a novel approach to better VQA performance that exploits this connection by jointly generating captions that are targeted to help answer a specific visual question.

Synthetic QA Corpora Generation with Roundtrip Consistency. ACL 2019. [PDF]
We introduce a novel method of generating synthetic question answering corpora by combining models of question generation and answer extraction, and by filtering the results to ensure roundtrip consistency.
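
A minimal sketch of the roundtrip filter with stub question-generation and answer-extraction components; the function bodies below are placeholders, not an API from the paper.

```python
def generate_question(passage, answer):
    """Stub question generator."""
    return f"What does the passage say about {answer}?"

def extract_answer(passage, question):
    """Stub answer extractor (always returns the first token)."""
    return passage.split()[0]

def roundtrip_filter(passage, candidate_answers):
    """Keep only (question, answer) pairs where the QA model recovers the
    original answer from the generated question."""
    kept = []
    for ans in candidate_answers:
        question = generate_question(passage, ans)
        if extract_answer(passage, question) == ans:
            kept.append((question, ans))
    return kept

print(roundtrip_filter("Paris is the capital of France.", ["Paris", "France"]))
```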

Improving the Robustness of Question Answering Systems to Question Paraphrasing. ACL 2019. [PDF]
Using a neural paraphrasing model trained to generate multiple paraphrased questions for a given source question and a set of paraphrase suggestions, we propose a data augmentation approach that requires no human intervention to re-train the models for improved robustness to question paraphrasing.

Asking the Crowd: Question Analysis, Evaluation and Generation for Open Discussion on Online Forums. ACL 2019. [PDF]
In this paper, we take the first step on teaching machines to ask open-answered questions from real-world news for open discussion (openQG).

Cross-Lingual Training for Automatic Question Generation. ACL 2019. [PDF]
We propose a cross-lingual QG model which uses the following training regime: (i) Unsupervised pretraining of language models in both primary and secondary languages and (ii) joint supervised training for QG in both languages.

Interconnected Question Generation with Coreference Alignment and Conversation Flow Modeling. ACL 2019. [PDF]
We propose an end-to-end neural model with coreference alignment and conversation flow modeling.

Learning to Ask Unanswerable Questions for Machine Reading Comprehension. ACL 2019. [PDF]
In this work, we propose a data augmentation technique by automatically generating relevant unanswerable questions according to an answerable question paired with its corresponding paragraph that contains the answer.

Reinforced Dynamic Reasoning for Conversational Question Generation. ACL 2019. [PDF]
Towards that end, we propose a new approach named the Reinforced Dynamic Reasoning network, which is based on the general encoder-decoder framework but incorporates a reasoning procedure in a dynamic manner to better understand what has been asked and what to ask next about the passage.

NAACL 2018

Natural Answer Generation with Heterogeneous Memory. NAACL 2018. [PDF]
In this work, we propose a novel attention mechanism to encourage the decoder to actively interact with the memory by taking its heterogeneity into account.

Zero-Shot Question Generation from Knowledge Graphs for Unseen Predicates and Entity Types. NAACL 2018. [PDF]
We present a neural model for question generation from knowledge graph triples in a “zero-shot” setup, that is, generating questions for predicates, subject types or object types that were not seen at training time.

Leveraging Context Information for Natural Question Generation. NAACL 2018. [PDF]
We propose a model that matches the answer with the passage before generating the question.

EMNLP 2019

Generating Questions for Knowledge Bases via Incorporating Diversified Contexts and Answer-Aware Loss. EMNLP 2019. [PDF]
Addressing Semantic Drift in Question Generation for Semi-Supervised Question Answering. EMNLP 2019.
Incorporating External Knowledge into Machine Reading for Generative Question Answering. EMNLP 2019.

AAAI 2020

Improving Question Generation with Sentence-level Semantic Matching and Answer Position Inferring. [PDF]
Conclusion-Supplement Answer Generation for Non-Factoid Questions. [PDF]
Joint Learning of Answer Selection and Answer Summary Generation in Community Question Answering. [PDF]
Neural Question Generation with Answer Pivot. [PDF]
Capturing Greater Context for Question Generation. [PDF]

Dialogue

ACL 2019

ReCoSa: Detecting the Relevant Contexts with Self-Attention for Multi-turn Dialogue Generation. ACL 2019. [PDF]
In this paper, we propose a new model, named ReCoSa, to tackle this problem.

Neural Response Generation with Meta-words. ACL 2019. [PDF]
We present open domain dialogue generation with meta-words.

Semantically Conditioned Dialog Response Generation via Hierarchical Disentangled Self-Attention. ACL 2019. [PDF]
To alleviate this scalability issue, we exploit the structure of dialog acts to build a multi-layer hierarchical graph, where each act is represented as a root-to-leaf route on the graph.

Generating Responses with a Specific Emotion in Dialog. ACL 2019. [PDF]
We propose an emotional dialogue system (EmoDS) that can generate meaningful responses with a coherent structure for a post, while expressing the desired emotion explicitly or implicitly within a unified framework.

A Working Memory Model for Task-oriented Dialog Response Generation. ACL 2019. [PDF]
Inspired by psychological studies on working memory, we propose a working memory model (WMM2Seq) for dialog response generation.

Domain Adaptive Dialog Generation via Meta Learning. ACL 2019. [PDF]
We propose a domain adaptive dialog generation method based on meta-learning (DAML).

NAACL 2019

Affect-Driven Dialog Generation. NAACL 2019. [PDF]
In this paper, we present an affect-driven dialog system, which generates emotional responses in a controlled manner using a continuous representation of emotions.

What makes a good conversation? How controllable attributes affect human judgments. NAACL 2019. [PDF]
In this work, we examine two controllable neural text generation methods, conditional training and weighted decoding, in order to control four important attributes for chit-chat dialogue: repetition, specificity, response-relatedness and question-asking.
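
Weighted decoding, in its generic form, adds feature-based bonuses to the logits at every decoding step; the toy sketch below boosts a hand-made "question-asking" feature. The feature and weights are illustrative, not the paper's.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

vocab = ["yes", "maybe", "what", "why", "okay", "?"]
question_feature = np.array([0, 0, 1, 1, 0, 1], dtype=float)   # toy attribute feature

def controlled_distribution(logits, weight):
    """Add weight * feature to the logits to steer generation toward the attribute."""
    return softmax(np.asarray(logits) + weight * question_feature)

logits = np.array([1.0, 0.5, 0.2, 0.1, 0.8, 0.0])
print("uncontrolled:        ", np.round(controlled_distribution(logits, 0.0), 3))
print("question-asking bias:", np.round(controlled_distribution(logits, 3.0), 3))
```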

NAACL 2018

Dialog Generation Using Multi-Turn Reasoning Neural Networks. NAACL 2018. [PDF]
In this paper, we propose a generalizable dialog generation approach that adapts multi-turn reasoning, a recent advancement in the field of document comprehension, to generate responses (“answers”) by treating the current conversation session context as a “document” and the current query as a “question”.

Automatic Dialogue Generation with Expressed Emotions. NAACL 2018. [PDF]
In this research, we address the problem of forcing the dialogue generation to express emotion.

EMNLP 2019

Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue. [PDF]
Dirichlet Latent Variable Hierarchical Recurrent Encoder-Decoder in Dialogue Generation. [PDF]
Hierarchy Response Learning for Neural Conversation Generation. [PDF]
Knowledge Aware Conversation Generation with Explainable Reasoning over Augmented Graphs. [PDF]
Adaptive Parameterization for Neural Dialogue Generation. [PDF]
DyKgChat: Benchmarking Dialogue Generation Grounding on Dynamic Knowledge Graphs. [PDF]

AAAI 2020

Learning from Easy to Complex: Adaptive Multi-curricula Learning for Neural Dialogue Generation. [PDF]
Improving Knowledge-aware Dialogue Generation via Knowledge Base Question Answering. [PDF]
MALA: Cross-Domain Dialogue Generation with Action Learning. [PDF]

Other Applications

ACL 2019

Rhetorically Controlled Encoder-Decoder for Modern Chinese Poetry Generation. ACL 2019. [PDF]
In this paper, we propose a rhetorically controlled encoder-decoder for modern Chinese poetry generation.

Automatic Grammatical Error Correction for Sequence-to-sequence Text Generation: An Empirical Study. ACL 2019. [PDF]
In this paper, we present a preliminary empirical study on whether and how much automatic grammatical error correction can help improve seq2seq text generation.

Coherent Comments Generation for Chinese Articles with a Graph-to-Sequence Model. ACL 2019. [PDF]
In this paper, we propose to generate comments with a graph-to-sequence model that models the input news as a topic interaction graph.

Topic-Aware Neural Keyphrase Generation for Social Media Language. ACL 2019. [PDF]
To facilitate automatic language understanding, we study keyphrase prediction, distilling salient information from massive posts.

NAACL 2019

Corpora Generation for Grammatical Error Correction. NAACL 2019. [PDF]
We describe two approaches for generating large parallel datasets for GEC using publicly available Wikipedia data.

Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems. NAACL 2019. [PDF]
Motivated by the intuition of how humans generate equations from problem texts, this paper presents a neural approach to automatically solve math word problems by operating on symbols according to their semantic meanings in the texts.

NAACL 2018

Interpretable Charge Predictions for Criminal Cases: Learning to Generate Court Views from Fact Descriptions. NAACL 2018. [PDF]
In this paper, we propose to study the problem of court view generation from the fact description in a criminal case.

TypeSQL: Knowledge-Based Type-Aware Neural Text-to-SQL Generation. NAACL 2018. [PDF]
In this paper, we present a novel approach TypeSQL which formats the problem as a slot filling task in a more reasonable way.

Learning to Generate Wikipedia Summaries for Underserved Languages from Wikidata. NAACL 2018. [PDF]
In this work, we investigate the generation of open domain Wikipedia summaries in underserved languages using structured data from Wikidata.

AAAI 2020

An Iterative Polishing Framework based on Quality Aware Masked Language Model for Chinese Poetry Generation. [PDF]
MixPoet: Diverse Poetry Generation via Learning Controllable Mixed Latent Space. [PDF]
Label Error Correction and Generation Through Label Relationships. [PDF]
A Character-Centric Neural Model for Automated Story Generation. [PDF]
Automatic Generation of Headlines for Online Math Questions. [PDF]
TreeGen: A Tree-Based Transformer Architecture for Code Generation. [PDF]
On the Generation of Medical Question-Answer Pairs. [PDF]

Dataset

ACL 2019

Storyboarding of Recipes: Grounded Contextual Generation. ACL 2019. [PDF]
We introduce a dataset for sequential procedural (how-to) text generation from images in the cooking domain.

Curate and Generate: A Corpus and Method for Joint Control of Semantics and Style in Neural NLG. ACL 2019. [PDF]
We present YelpNLG, a corpus of 300,000 rich, parallel meaning representations and highly stylistically varied reference texts spanning different restaurant attributes, and describe a novel methodology that can be scalably reused to generate NLG datasets for other domains.

Towards Empathetic Open-domain Conversation Models: A New Benchmark and Dataset. ACL 2019. [PDF]
This work proposes a new benchmark for empathetic dialogue generation and EmpatheticDialogues, a novel dataset of 25k conversations grounded in emotional situations.

Explain Yourself! Leveraging Language Models for Commonsense Reasoning. ACL 2019. [PDF]
We collect human explanations for commonsense reasoning in the form of natural language sequences and highlighted annotations in a new dataset called Common Sense Explanations (CoS-E). We use CoS-E to train language models to automatically generate explanations that can be used during training and inference in a novel Commonsense Auto-Generated Explanation (CAGE) framework.

Evaluation

ACL 2019

Sentence Mover's Similarity: Automatic Evaluation for Multi-Sentence Texts. ACL 2019. [PDF]

Know More about Each Other: Evolving Dialogue Strategy via Compound Assessment. ACL 2019. [PDF]
In this paper, a novel Generation-Evaluation framework is developed for multi-turn conversations with the objective of letting both participants know more about each other.

Handling Divergent Reference Texts when Evaluating Table-to-Text Generation. ACL 2019. [PDF]
We propose a new metric, PARENT, which aligns n-grams from the reference and generated texts to the semi-structured data before computing their precision and recall.
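
A toy version of the entailed-precision idea is sketched below: an n-gram in the generated text gets credit if it appears in the reference or if the table "entails" it. The real PARENT metric uses softer word-overlap and co-occurrence entailment models, and also computes an entailed recall against the table.

```python
def ngrams(tokens, n):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def table_entails(ngram, table_values):
    """Toy entailment: every token of the n-gram appears in some table value."""
    table_tokens = {tok for value in table_values for tok in value.split()}
    return all(tok in table_tokens for tok in ngram)

def parent_like_precision(generated, reference, table_values, n=1):
    gen_ngrams = ngrams(generated.split(), n)
    ref_ngrams = ngrams(reference.split(), n)
    credited = [g for g in gen_ngrams if g in ref_ngrams or table_entails(g, table_values)]
    return len(credited) / max(len(gen_ngrams), 1)

table = ["born 1967", "Canadian actor"]
print(parent_like_precision("the Canadian actor was born in 1967",
                            "he is an actor born in 1967", table))   # ~0.71
```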

NAACL 2019

Evaluating Rewards for Question Generation Models. NAACL 2019. [PDF]
We therefore optimise directly for various objectives beyond simply replicating the ground truth questions, including a novel approach using an adversarial discriminator that seeks to generate questions that are indistinguishable from real examples.

Unifying Human and Statistical Evaluation for Natural Language Generation. NAACL 2019. [PDF]
In this paper, we propose a unified framework which evaluates both diversity and quality, based on the optimal error rate of predicting whether a sentence is human- or machine-generated.
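
A crude sketch of the underlying idea: estimate how well a simple classifier can separate human-written from machine-generated samples. An error rate near 0.5 means the two distributions are hard to tell apart (quality and diversity together); an error near 0 means they are easy to distinguish. The per-sentence features below are hypothetical stand-ins, not the paper's.

```python
import numpy as np

def loo_1nn_error(features, labels):
    """Leave-one-out 1-nearest-neighbour error as a rough proxy for the
    optimal error of classifying human (1) vs. machine (0) samples."""
    errors = 0
    for i in range(len(features)):
        dist = np.linalg.norm(features - features[i], axis=1)
        dist[i] = np.inf                     # exclude the point itself
        errors += int(labels[int(np.argmin(dist))] != labels[i])
    return errors / len(features)

rng = np.random.default_rng(0)
# Hypothetical features per sentence, e.g. (length, model log-probability).
human = rng.normal(loc=[20.0, -40.0], scale=[5.0, 8.0], size=(50, 2))
machine = rng.normal(loc=[15.0, -25.0], scale=[4.0, 6.0], size=(50, 2))
features = np.vstack([human, machine])
labels = np.array([1] * 50 + [0] * 50)
print("estimated discrimination error:", loo_1nn_error(features, labels))
```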

EMNLP 2019

MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance. EMNLP 2019. [PDF]
