
Predictions and Test & Score: MAPE produces value where it should fail #7041

@wvdvegte

Description

What's wrong?
The formula for the Mean Absolute Percentage Error (see here or here) divides by the actual value, so it involves a division by zero whenever the test data contains actual values equal to zero.
Yet Predictions and Test & Score report a (very large) MAPE value instead of failing when there are zeros in the test data.
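
For reference, the definition behind those links, in its standard form, is

$$\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{A_t - F_t}{A_t}\right|$$

where $A_t$ is the actual value and $F_t$ the prediction; the summand is undefined for every row with $A_t = 0$.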

How can we reproduce the problem?
In the attached workflow, the Formula widget computes the Absolute Percentage Error for every record; MAPE is the mean of that column over all rows. As expected, Formula fails when rows with a target of zero are included, yet Predictions and Test & Score still report very large values for MAPE (a numeric sketch of the same discrepancy follows below the attachment).

MAPE 0div.zip
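
As a minimal sketch outside Orange, the snippet below reproduces the discrepancy numerically; it assumes the huge number comes from an epsilon-clamped denominator, as in scikit-learn's `mean_absolute_percentage_error` (an assumption about Orange's internals, not verified here):

```python
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error

y_true = np.array([0.0, 2.0, 5.0])   # first actual value is zero
y_pred = np.array([0.5, 2.5, 4.0])

# Per-row absolute percentage error, as the Formula widget computes it:
# the first row divides by zero (numpy emits a warning and returns inf).
ape = np.abs(y_true - y_pred) / np.abs(y_true)
print(ape)                                             # [inf 0.25 0.2]

# scikit-learn clamps the denominator to machine epsilon instead of failing,
# so a zero actual value turns into an enormous but finite contribution.
print(mean_absolute_percentage_error(y_true, y_pred))  # ~7.5e14
```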

In hindsight, my suggestion to include MAPE may not have been a good one, since it is a problematic measure. Perhaps consider adding sMAPE (Python code provided here), or even replacing MAPE with it.
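
For comparison, here is a rough sketch of sMAPE (my own version, not the implementation linked above); it stays finite for rows where the actual value is zero, as long as the prediction is not also zero, and is bounded above by 2:

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric MAPE: |F - A| divided by the mean of |A| and |F|, averaged over rows."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return float(np.mean(np.abs(y_pred - y_true) / denom))

print(smape([0.0, 2.0, 5.0], [0.5, 2.5, 4.0]))   # ≈ 0.81
```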

Metadata

Assignees: no one assigned

Labels: bug (A bug confirmed by the core team), needs discussion (Core developers need to discuss the issue), snack (This will take an hour or two)
