I have a draft post where I try out a few new sentence-transformer models on HF that use ModernBERT as a base. So far, I think I prefer the standard all-mpnet-base-v2 for similarity — but more models have popped up since I wrote the draft.
Given how the MPNet sentence-transformer was trained and fine-tuned (the fine-tuning alone used over a billion sentence pairs), I don’t know if any community model based on ModernBERT and trained by one person on a graphics card is going to be as good.
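For what it's worth, the comparison itself is a short script with the sentence-transformers library. Here's a rough sketch of the kind of thing I'm doing in the draft; the ModernBERT model id below is just a placeholder, not a specific model I'm endorsing:

```python
from sentence_transformers import SentenceTransformer, util

# all-mpnet-base-v2 is the model linked below; the ModernBERT-based id is a
# placeholder: swap in whichever community model you want to test.
MODELS = [
    "sentence-transformers/all-mpnet-base-v2",
    "some-user/modernbert-embed-model",  # placeholder id, not a real model
]

sentences = [
    "The cat sat on the mat.",
    "A feline was resting on the rug.",
    "Quarterly earnings beat analyst expectations.",
]

for name in MODELS:
    model = SentenceTransformer(name)
    # Normalized embeddings so cosine similarity is just a dot product
    embeddings = model.encode(sentences, normalize_embeddings=True)
    # Pairwise cosine similarities between all sentences
    scores = util.cos_sim(embeddings, embeddings)
    print(name)
    print(scores)
```

Eyeballing whether the related pair scores high and the unrelated one scores low across models is about as far as my testing goes so far.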
https://huggingface.co/sentence-transformers/all-mpnet-base-v2