
Fix typo in `evaluation/metrics.py`

Shaw 2 years ago
parent
commit
00f588b031
1 changed file with 1 addition and 1 deletion

+ 1 - 1
docs/evaluate-your-own-tasks.md

@@ -72,7 +72,7 @@ The default metrics for the generation task are EM(Exact-Match) and F1. Given in
 
 ## Implement Your Metrics
 
-You can customize your evaluation metrics function and add it to `DEFAULT_METRICS` in `generation/metrics.py`, and then you can specify `metric: ['Your metric name']` in the task YAML file.
+You can customize your evaluation metrics function and add it to `DEFAULT_METRICS` in `evaluation/metrics.py`, and then you can specify `metric: ['Your metric name']` in the task YAML file.
 
 ## Fully customize the evaluation process
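The corrected doc line describes registering a custom metric in `DEFAULT_METRICS` inside `evaluation/metrics.py`. A minimal sketch of what such a registration might look like, assuming `DEFAULT_METRICS` is a name-to-function mapping and that metrics take a prediction and a reference string (the `keyword_recall` function here is hypothetical, not part of the repo):

```python
def keyword_recall(prediction: str, reference: str) -> float:
    """Hypothetical metric: fraction of reference tokens found in the prediction."""
    ref_tokens = set(reference.lower().split())
    if not ref_tokens:
        return 0.0
    pred_tokens = set(prediction.lower().split())
    return len(ref_tokens & pred_tokens) / len(ref_tokens)

# Assumed shape of the registry in evaluation/metrics.py:
# a dict mapping metric names to metric functions (e.g. EM, F1).
DEFAULT_METRICS = {
    "keyword_recall": keyword_recall,
}
```

With the metric registered under that name, the task YAML would then reference it as `metric: ['keyword_recall']`, per the doc line above.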