
fix: if layers == 0, layers were not initialized#306

Open
stephantul wants to merge 4 commits into MinishLab:main from stephantul:fix-bug-init

Conversation

@stephantul
Contributor

While working on a classifier with no hidden layers, I noticed that the linear layer was not initialized explicitly when n_layers == 0: the short circuit returned a Linear with PyTorch's default initialization, which happens to be Kaiming anyway.

Not a big deal, but fixed for consistency. I also switched the final layer to xavier_uniform, because this could help (Kaiming init is typically used before ReLU, Xavier for linear or tanh outputs), although I didn't see any difference on 20 Newsgroups.
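A minimal sketch of the initialization scheme described above. This is a hypothetical reconstruction, not model2vec's actual `classifier.py`: the function name `build_classifier_head` and its parameters are made up for illustration. It shows the n_layers == 0 degenerate case and the Kaiming-for-hidden / Xavier-for-final split:

```python
import torch
from torch import nn


def build_classifier_head(
    in_dim: int, out_dim: int, hidden_dim: int, n_layers: int
) -> nn.Sequential:
    """Build an MLP head; with n_layers == 0 it is a single linear layer.

    Hypothetical sketch of the fix: hidden layers get Kaiming init
    (suited to the ReLU that follows), the final layer gets Xavier uniform
    (suited to a purely linear output).
    """
    layers: list[nn.Module] = []
    dim = in_dim
    for _ in range(n_layers):
        linear = nn.Linear(dim, hidden_dim)
        # Kaiming init matches the ReLU nonlinearity applied after this layer.
        nn.init.kaiming_uniform_(linear.weight, nonlinearity="relu")
        nn.init.zeros_(linear.bias)
        layers.extend([linear, nn.ReLU()])
        dim = hidden_dim
    final = nn.Linear(dim, out_dim)
    # Xavier uniform for the linear output layer.
    nn.init.xavier_uniform_(final.weight)
    nn.init.zeros_(final.bias)
    layers.append(final)
    # Before the fix, an n_layers == 0 short circuit could return a Linear
    # with only PyTorch's default (Kaiming-style) init, skipping this path.
    return nn.Sequential(*layers)
```

With n_layers == 0 the loop body never runs, so every layer (including the degenerate single-Linear case) now goes through the same explicit initialization.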

@codecov

codecov bot commented Feb 25, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.

| Files with missing lines | Coverage Δ |
| --- | --- |
| model2vec/train/classifier.py | 97.61% <100.00%> (+0.05%) ⬆️ |

