climpred.metrics._spearman_r

climpred.metrics._spearman_r(forecast: xarray.Dataset, verif: xarray.Dataset, dim: Optional[Union[str, List[str]]] = None, **metric_kwargs: Any) → xarray.Dataset

Spearman’s rank correlation coefficient.

corr = \mathrm{pearsonr}(\mathrm{ranked}(f), \mathrm{ranked}(o))

This correlation coefficient is nonparametric and assesses how well the relationship between the forecast and verification data can be described using a monotonic function. It is computed by first ranking the forecasts and verification data, and then correlating those ranks using the _pearson_r() correlation.
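
As an illustration of this rank-then-correlate equivalence, here is a minimal sketch using SciPy on synthetic arrays (not part of the climpred API; the names f and o below are placeholders):

>>> import numpy as np
>>> from scipy import stats
>>> rng = np.random.default_rng(0)
>>> f = rng.normal(size=100)  # synthetic "forecast"
>>> o = f + rng.normal(size=100)  # synthetic "verification"
>>> # Spearman's rho is Pearson's r computed on the ranks
>>> rho = stats.spearmanr(f, o)[0]
>>> rho_from_ranks = stats.pearsonr(stats.rankdata(f), stats.rankdata(o))[0]
>>> bool(np.isclose(rho, rho_from_ranks))
True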

This is also known as the anomaly correlation coefficient (ACC) when comparing anomalies, although the Pearson product-moment correlation coefficient _pearson_r() is typically used when computing the ACC.

Note

Use metric _spearman_r_p_value() or _spearman_r_eff_p_value() to get the corresponding p value.
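
For instance, the p value could be requested through the same verify() call used in the Example below (a sketch assuming that HindcastEnsemble object; the returned Dataset is not shown here):

>>> pval = HindcastEnsemble.verify(
...     metric="spearman_r_p_value",
...     comparison="e2o",
...     alignment="same_verifs",
...     dim="init",
... )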

Parameters
  • forecast – Forecast.

  • verif – Verification data.

  • dim – Dimension(s) to perform metric over.

  • metric_kwargs – additional keyword arguments passed on to xskillscore.spearman_r() (see the sketch after this list).
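
The keyword arguments accepted here are those of xskillscore.spearman_r() itself, e.g. skipna or weights. A minimal sketch calling xskillscore directly on synthetic data (the names fcst and obs are placeholders; the returned DataArray is not shown):

>>> import numpy as np
>>> import xarray as xr
>>> import xskillscore as xs
>>> fcst = xr.DataArray(np.random.rand(4, 3), dims=["init", "lon"])
>>> obs = xr.DataArray(np.random.rand(4, 3), dims=["init", "lon"])
>>> # the same keywords can be passed to verify() via **metric_kwargs
>>> score = xs.spearman_r(fcst, obs, dim="init", skipna=True)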

Notes

  • minimum: -1.0

  • maximum: 1.0

  • perfect: 1.0

  • orientation: positive

Example

>>> HindcastEnsemble.verify(
...     metric="spearman_r",
...     comparison="e2o",
...     alignment="same_verifs",
...     dim="init",
... )
<xarray.Dataset>
Dimensions:  (lead: 10)
Coordinates:
  * lead     (lead) int32 1 2 3 4 5 6 7 8 9 10
    skill    <U11 'initialized'
Data variables:
    SST      (lead) float64 0.9336 0.9311 0.9293 0.9474 ... 0.9465 0.9346 0.9328
Attributes:
    prediction_skill_software:     climpred https://climpred.readthedocs.io/
    skill_calculated_by_function:  HindcastEnsemble.verify()
    number_of_initializations:     64
    number_of_members:             10
    alignment:                     same_verifs
    metric:                        spearman_r
    comparison:                    e2o
    dim:                           init
    reference:                     []