feat: Add custom metrics#2736

Draft
auguste-probabl wants to merge 5 commits into probabl-ai:main from auguste-probabl:feat-custom-metrics

Conversation


@auguste-probabl auguste-probabl commented Apr 9, 2026

Change description

This PR makes it possible to register a custom metric on an EstimatorReport, so that it is shown alongside the default metrics in summarize:

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import make_scorer, mean_absolute_error
from skore import evaluate
X, y = load_breast_cancer(return_X_y=True)
classifier = LogisticRegression(max_iter=10_000)
report = evaluate(classifier, X, y, splitter=0.2, pos_label=1)
report.metrics.register(
    make_scorer(mean_absolute_error, response_method="predict")
)
report.metrics.summarize().frame()

                     LogisticRegression
Metric
Accuracy                       0.947368
Precision                      0.984127
Recall                         0.925373
ROC AUC                        0.993649
Brier score                    0.036154
Fit time (s)                   0.324200
Predict time (s)               0.000323
Mean Absolute Error            0.052632

Closes #2061

Contribution checklist

  • Unit tests were added or updated (if necessary)
  • Documentation was added or updated (if necessary)
  • The documentation builds and renders properly (if it does, our bot will add a comment linking
    to a preview of the documentation to review it visually)
  • A new changelog entry was added to CHANGELOG.rst (if necessary; typically, if your change
    requires updating tests)

AI usage disclosure

AI tools were involved for:

  • Code generation (e.g., when writing an implementation or fixing a bug)
  • Test/benchmark generation
  • Documentation (including examples)
  • Research and understanding

I had an LLM write down specs in docs/design, write some tests based on those specs, and write some of the implementation.

@auguste-probabl auguste-probabl changed the title Feat custom metrics feat: Add custom metrics Apr 9, 2026
    def test_no_response_method(self, fixture_name, request):
        """Check that computing a custom metric without passing the
        response method raises an error."""

    def test_summarize_single_list_equivalence(self, fixture_name, request):
auguste-probabl (Collaborator, Author) commented:
This test was done for every report, so I moved it here

Development

Successfully merging this pull request may close these issues.

feat(skore): Registry for custom and extra metrics