The Devil is in the Details: Evaluating Limitations of Transformer-based Methods for Granular Tasks
Brihi Joshi, Neil Shah, Francesco Barbieri, et al.
Published: 11/02/2020
Question Answering, Sentiment Analysis
Contextual embeddings derived from transformer-based neural language models have shown state-of-the-art performance for various tasks such as question answering, sentiment analysis, and textual similarity in recent years. Extensive work shows …
Coresets for Regressions with Panel Data
Lingxiao Huang, K. Sudhir, Nisheeth K. Vishnoi, et al.
Published: 11/02/2020
This paper introduces the problem of coresets for regression to panel data settings. We first define coresets for several variants of regression problems with panel data and then present …
Hierarchical Bi-Directional Self-Attention Networks for Paper Review Rating Recommendation
Zhongfen Deng, Hao Peng, Congying Xia, et al.
Published: 11/02/2020
Decision Making
Review rating prediction of text reviews is a rapidly growing technology with a wide range of applications in natural language processing. However, most existing methods either use hand-crafted features or …
Multi-Task Learning for Calorie Prediction on a Novel Large-Scale Recipe Dataset Enriched with Nutritional Information
Robin Ruede, Verena Heusser, Lukas Frank, et al.
Published: 11/02/2020
Multi-Task Learning
A rapidly growing amount of content posted online, such as food recipes, opens the door to exciting new applications at the intersection of vision and language. In this work, we aim …
Exploring Question-Specific Rewards for Generating Deep Questions
Yuxi Xie, Liangming Pan, Dongzhe Wang, et al.
Published: 11/02/2020
Question Generation
Recent question generation (QG) approaches often utilize the sequence-to-sequence framework (Seq2Seq) to optimize the log-likelihood of ground-truth questions using teacher forcing. However, this training objective is inconsistent with actual question …
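For readers unfamiliar with the training setup this abstract refers to, the sketch below illustrates a generic teacher-forcing log-likelihood objective for a toy Seq2Seq model in PyTorch. It is not the paper's method; the model, dimensions, and data are illustrative placeholders.

```python
# Minimal sketch (not from the paper): teacher-forcing log-likelihood
# training step for a toy Seq2Seq generator. All names and sizes are
# illustrative assumptions.
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden = 1000, 64, 128

embed = nn.Embedding(vocab_size, emb_dim)
encoder = nn.GRU(emb_dim, hidden, batch_first=True)
decoder = nn.GRU(emb_dim, hidden, batch_first=True)
out_proj = nn.Linear(hidden, vocab_size)
loss_fn = nn.CrossEntropyLoss()

# Toy batch: source passage tokens and ground-truth target tokens.
src = torch.randint(0, vocab_size, (8, 20))   # (batch, src_len)
tgt = torch.randint(0, vocab_size, (8, 12))   # (batch, tgt_len)

_, state = encoder(embed(src))                # encode the source

# Teacher forcing: feed the gold tokens (shifted right) as decoder
# inputs and predict the next gold token at every step.
dec_in, dec_target = tgt[:, :-1], tgt[:, 1:]
dec_out, _ = decoder(embed(dec_in), state)
logits = out_proj(dec_out)                    # (batch, tgt_len-1, vocab)

# Negative log-likelihood of the ground-truth sequence under the model.
loss = loss_fn(logits.reshape(-1, vocab_size), dec_target.reshape(-1))
loss.backward()
```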
Emergent Communication Pretraining for Few-Shot Machine Translation
Yaoyiran Li, Edoardo M. Ponti, Ivan Vulić, et al.
Published: 11/02/2020
Machine Translation, Transfer Learning
While state-of-the-art models that rely upon massively multilingual pretrained encoders achieve sample efficiency in downstream applications, they still require abundant amounts of unlabelled text. Nevertheless, most of the world's languages …