arxiv:2605.12438

A Causal Language Modeling Detour Improves Encoder Continued Pretraining

Published on May 12 · Submitted by Rian Touchent on May 13
Abstract

Switching from Masked Language Modeling to Causal Language Modeling during encoder adaptation improves downstream performance on biomedical texts through dense supervision effects in lower transformer layers.

AI-generated summary

When adapting an encoder to a new domain, the standard approach is to continue training with Masked Language Modeling (MLM). We show that temporarily switching to Causal Language Modeling (CLM) followed by a short MLM decay improves downstream performance. On biomedical texts with ModernBERT, this CLM detour outperforms MLM baselines trained on identical data and compute across 8 French and 11 English biomedical tasks, by +1.2-2.8pp and +0.3-0.8pp respectively, depending on model size. We investigate the reasons for these gains. We find that CLM's dense supervision impacts low transformer layers (0-7) far more than MLM does. Freezing low layers during CLM eliminates the downstream benefit; freezing mid layers preserves it. The representational changes persist through the MLM decay phase, even when it matches the CLM phase in length, and they scale with model capacity. We release ModernCamemBERT-bio and ModernBERT-bio as state-of-the-art biomedical encoders in Base and Large sizes.
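The detour described in the summary amounts to a simple phase schedule: a CLM phase followed by a short MLM "decay" phase on the same data budget. A minimal sketch of such a schedule, where the function name and the phase split are illustrative assumptions rather than the paper's actual values:

```python
# Hypothetical sketch of a CLM-detour training schedule.
# The 70/30 phase split is an illustrative assumption, not the paper's setting.

def training_objective(step: int, total_steps: int,
                       clm_frac: float = 0.7, decay_frac: float = 0.3) -> str:
    """Return the objective to use at a given step: a CLM phase
    followed by an MLM decay phase that restores the encoder objective."""
    assert abs(clm_frac + decay_frac - 1.0) < 1e-9
    if step < clm_frac * total_steps:
        return "clm"  # causal LM: dense next-token supervision on every position
    return "mlm"      # MLM decay: back to masked-token prediction

# Over a toy 10-step run, the first 7 steps use CLM, the last 3 use MLM.
schedule = [training_objective(s, 10) for s in range(10)]
print(schedule)
```

In a real run, the returned string would select the loss and data collator (e.g. causal shifting vs. random masking) for that training step.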

Community

Hi @rntc , very cool idea! Do you plan to release the code, by the way? I would like to try this with other models for domain adaptation 😃


Hi @stefan-it , thank you very much! I will try to release it ASAP. Until then, the code is a modification of this very cool codebase by @orionweller et al.: https://github.com/JHU-CLSP/ettin-encoder-vs-decoder

Paper submitter

Release of ModernBERT-bio and ModernCamemBERT-bio



Models citing this paper 4
