[AIRG] Generalizing Word Embeddings, 10/24


Date: Mon, 22 Oct 2018 21:22:40 +0000
From: JINMAN ZHAO <jzhao237@xxxxxxxx>
Subject: [AIRG] Generalizing Word Embeddings, 10/24

Hi AIRG,

 

This Wednesday, I am going to do a practice talk for our upcoming EMNLP short paper "Generalizing Word Embeddings using Bag of Subwords". I will first present some background. Then the talk itself will be about 10 minutes. After that, I am happy to hear your feedback, as well as discuss more details and related topics.

 

Title: Generalizing Word Embeddings using Bag of Subwords

Authors: Jinman Zhao, Sidharth Mudgal, Yingyu Liang

Link: https://arxiv.org/abs/1809.04259

Presenter: Jinman Zhao

Time: 4 pm, Wednesday, 10/24, 2018

Place: CS 3310

 

Word embedding, or obtaining word vectors that capture syntactic and semantic features, plays an important role in many neural solutions to NLP tasks. However, popular embedding methods only maintain vectors for a fixed set of words (the vocabulary) and provide nothing for words outside it. The aim of this work is to generalize pre-trained word vectors beyond the fixed-size vocabulary. The method we use is simple yet effective.
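To give a rough picture ahead of the talk: an out-of-vocabulary word can be assigned a vector by averaging learned embeddings of its character n-grams (a "bag of subwords"), where the subword embeddings are fit so that this average reconstructs the known pre-trained vectors of in-vocabulary words. The Python sketch below only illustrates that general recipe; the class name, the n-gram range, and the plain squared-error training loop are simplifications for illustration, not the exact model from the paper.

    # Illustrative sketch: estimate a vector for an out-of-vocabulary word
    # by averaging learned embeddings of its character n-grams.
    # Names and training details are simplified, not the paper's exact model.
    import numpy as np

    def char_ngrams(word, n_min=3, n_max=6):
        """Character n-grams of the boundary-padded word, e.g. '<where>'."""
        padded = f"<{word}>"
        grams = []
        for n in range(n_min, n_max + 1):
            grams.extend(padded[i:i + n] for i in range(len(padded) - n + 1))
        return grams

    class BagOfSubwords:
        def __init__(self, dim, seed=0):
            self.dim = dim
            self.rng = np.random.default_rng(seed)
            self.subword_vecs = {}  # n-gram -> vector

        def _vec(self, gram):
            # Lazily initialize a small random vector for an unseen n-gram.
            if gram not in self.subword_vecs:
                self.subword_vecs[gram] = self.rng.normal(scale=0.1, size=self.dim)
            return self.subword_vecs[gram]

        def embed(self, word):
            """Average the subword vectors of the word's character n-grams."""
            return np.mean([self._vec(g) for g in char_ngrams(word)], axis=0)

        def fit(self, word_vecs, epochs=5, lr=0.1):
            """Fit subword vectors so embed(w) approximates the known vector of w."""
            for _ in range(epochs):
                for word, target in word_vecs.items():
                    grams = char_ngrams(word)
                    pred = np.mean([self._vec(g) for g in grams], axis=0)
                    # Gradient of 0.5 * ||pred - target||^2 w.r.t. each subword vector.
                    grad = (pred - target) / len(grams)
                    for g in grams:
                        self.subword_vecs[g] -= lr * grad

Once fit on the pre-trained vocabulary, embed() can be called on any word, in-vocabulary or not, which is the generalization the talk is about.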

 

Hope to see you there!

 

Thanks,

Jinman Zhao

PhD student at University of Wisconsin-Madison
