Gapping parsing using pretrained embeddings, attention mechanism and NCRF
This article addresses the problem of automatic gapping resolution for the Russian language. We use the BERT language model as an embedder, with a bidirectional recurrent network, an attention mechanism, and an NCRF layer on top. Unlike other BERT-based models, we apply BERT only as an embedder, without any fine-tuning. Our implementation took second place in the AGRR-2019 competition.
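To make the described stack concrete, below is a minimal sketch of such an architecture in PyTorch, assuming the `transformers` and `pytorch-crf` packages; the multilingual BERT checkpoint, hyperparameters, and tag inventory are illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf


class GappingTagger(nn.Module):
    """Frozen BERT embedder -> BiLSTM -> self-attention -> CRF (a sketch)."""

    def __init__(self, num_tags: int, hidden: int = 256, heads: int = 4):
        super().__init__()
        # BERT serves purely as an embedder: its weights stay frozen.
        self.bert = BertModel.from_pretrained("bert-base-multilingual-cased")
        for p in self.bert.parameters():
            p.requires_grad = False
        dim = self.bert.config.hidden_size
        # Bidirectional recurrent encoder over the contextual embeddings.
        self.rnn = nn.LSTM(dim, hidden, batch_first=True, bidirectional=True)
        # Self-attention over the recurrent states.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.proj = nn.Linear(2 * hidden, num_tags)
        # CRF output layer scoring tag sequences jointly.
        self.crf = CRF(num_tags, batch_first=True)

    def _emissions(self, input_ids, attention_mask):
        with torch.no_grad():  # no fine-tuning: gradients stop at BERT
            emb = self.bert(input_ids,
                            attention_mask=attention_mask).last_hidden_state
        h, _ = self.rnn(emb)
        h, _ = self.attn(h, h, h, key_padding_mask=~attention_mask.bool())
        return self.proj(h)

    def forward(self, input_ids, attention_mask, tags):
        # Training loss: negative log-likelihood of the gold tags under the CRF.
        emissions = self._emissions(input_ids, attention_mask)
        return -self.crf(emissions, tags, mask=attention_mask.bool())

    def decode(self, input_ids, attention_mask):
        # Inference: Viterbi decoding of the best tag sequence.
        emissions = self._emissions(input_ids, attention_mask)
        return self.crf.decode(emissions, mask=attention_mask.bool())
```

Freezing BERT keeps the trainable parameter count small, so only the recurrent encoder, attention, projection, and CRF transition weights are updated during training.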