https://github.com/jojonki/BiDAF/blob/3e5ac9c76d02de2d8f75b1eda6632f8a9432eba6/layers/char_embedding.py#L28 This code looks strange to me. Why do you sum over the `word_len` dimension instead of applying a 1D filter over the `word_len` dimension? Thank you.
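To illustrate the alternative the question suggests, here is a minimal NumPy sketch (not the repo's code; all names and sizes are assumptions) of the usual char-CNN pattern: convolve 1D filters along the `word_len` (character-position) axis, then max-pool over positions to get a fixed-size word vector.

```python
import numpy as np

# Hypothetical shapes for illustration only: a word of 5 characters,
# each embedded into a 4-dim vector; 3 filters of width 2.
word_len, emb_dim, num_filters, width = 5, 4, 3, 2
rng = np.random.default_rng(0)

chars = rng.standard_normal((word_len, emb_dim))          # (word_len, emb_dim)
filters = rng.standard_normal((num_filters, width, emb_dim))

# 1D convolution over the word_len axis: each filter spans `width`
# consecutive characters and the full embedding dimension.
conv_out = np.stack([
    np.array([np.sum(chars[i:i + width] * f)
              for i in range(word_len - width + 1)])
    for f in filters
])                                                        # (num_filters, word_len - width + 1)

# Max-pool over character positions -> one fixed-size vector per word.
word_vec = conv_out.max(axis=1)                           # (num_filters,)
print(word_vec.shape)
```

Summing over `word_len` instead collapses character order before pooling, which is what makes line 28 surprising; sliding the filter along `word_len` preserves local character n-gram patterns.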