As part of the community event on pre-training models with Flax, organized by Hugging Face in collaboration with Google Cloud, the POSTDATA team has released ALBERTI, a model based on the multilingual version of BERT and specifically trained on poetry and poetic texts. The main difference between ALBERTI and other language models is that it aims to be more poetic and less literal in semantic tasks. All the information is available at https://huggingface.co/flax-community/alberti-bert-base-multilingual-cased
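
Since the checkpoint lives on the Hub, it can be queried with the standard transformers masked-language-modeling API. Below is a minimal sketch using the model ID from the URL above; the `from_flax=True` flag is an assumption that only Flax weights are published in the repo (it converts them to PyTorch on the fly, and can be dropped if PyTorch weights are available), and the example verse is simply an illustrative line from Rubén Darío's "Sonatina".

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "flax-community/alberti-bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# from_flax=True loads the Flax checkpoint into a PyTorch model;
# omit it if the repo also ships PyTorch weights.
model = AutoModelForMaskedLM.from_pretrained(model_id, from_flax=True)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Ask the model to complete a masked word in a famous Spanish verse.
for prediction in fill_mask("La princesa está triste... ¿qué tendrá la [MASK]?"):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```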

You can try it at:

https://huggingface.co/spaces/flax-community/alberti