
Fix typo in `evaluation/metrics.py`

Shaw, 2 years ago
parent
commit
00f588b031
1 changed file with 1 addition and 1 deletion

+ 1 - 1
docs/evaluate-your-own-tasks.md

@@ -72,7 +72,7 @@ The default metrics for the generation task are EM(Exact-Match) and F1. Given in
 
 ## Implement Your Metrics
 
-You can customize your evaluation metrics function and add it to `DEFAULT_METRICS` in `generation/metrics.py`, and then you can specify `metric: ['Your metric name']` in the task YAML file.
+You can customize your evaluation metrics function and add it to `DEFAULT_METRICS` in `evaluation/metrics.py`, and then you can specify `metric: ['Your metric name']` in the task YAML file.
 
 ## Fully customize the evaluation process
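The corrected line says a custom metric is added to `DEFAULT_METRICS` in `evaluation/metrics.py` and then referenced by name in the task YAML. As a minimal sketch of that registration pattern, assuming `DEFAULT_METRICS` is a dict mapping metric names to scoring functions (the function name `substring_match` and the dict layout here are illustrative, not taken from the repository):

```python
# Hypothetical sketch of registering a custom metric, assuming
# DEFAULT_METRICS in evaluation/metrics.py maps metric names to
# functions that score a prediction against a ground truth.

def substring_match(prediction: str, ground_truth: str) -> float:
    """Return 1.0 if the normalized ground truth appears in the prediction."""
    return float(ground_truth.strip().lower() in prediction.strip().lower())

# Stand-in for the registry defined in evaluation/metrics.py.
DEFAULT_METRICS = {
    "substring_match": substring_match,
}
```

With the metric registered, the task YAML would then select it by its key, e.g. `metric: ['substring_match']`.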