Topics of Interest

We invite submissions with original contributions addressing all topics related to the productive interaction between large pre-trained language models and large semantic networks. Areas of interest include, but are not limited to, the following:

  • Building and enriching monolingual, multilingual and cross-lingual lexical knowledge bases, semantic networks and wordnets using deep learning techniques and large pre-trained language models.
  • Exploiting lexical knowledge bases, semantic networks and wordnets to create world-knowledge and common-sense probes for testing large pre-trained language models.
  • Using lexical knowledge bases, semantic networks and wordnets to create prompts for zero-shot, few-shot or transfer-learning NLP tasks.
  • Leveraging lexical knowledge bases, semantic networks and wordnets, together with large pre-trained language models, towards Natural Language Understanding.