Error generating predictions

Hello everyone, our last submission failed in the middle of generating predictions, although we were able to run it locally.
We encountered the following error, related to the trim_predictions_to_max_token_length function, that we couldn't understand. Could you please provide us with further insight into this issue?
Traceback (most recent call last):
  File "/home/aicrowd/runner.py", line 110, in run_task
    local_evaluation.trim_predictions_to_max_token_length(
  File "/home/aicrowd/local_evaluation.py", line 100, in trim_predictions_to_max_token_length
    tokenized_prediction = tokenizer.encode(prediction)
  File "/home/aicrowd/.local/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2629, in encode
    encoded_inputs = self.encode_plus(
  File "/home/aicrowd/.local/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 3037, in encode_plus
    return self._encode_plus(
  File "/home/aicrowd/.local/lib/python3.9/site-packages/transformers/tokenization_utils_fast.py", line 576, in _encode_plus
    batched_output = self._batch_encode_plus(
  File "/home/aicrowd/.local/lib/python3.9/site-packages/transformers/tokenization_utils_fast.py", line 504, in _batch_encode_plus
    encodings = self._tokenizer.encode_batch(
TypeError: TextEncodeInput must be Union[TextInputSequence, Tuple[InputSequence, InputSequence]]

Error generating predictions.
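
For context, our current guess is that this TypeError is raised when one of our predictions is not a plain Python string (for example None or a list), since the fast tokenizer's encode() only accepts string inputs. Below is a minimal sketch of the guard we are considering adding before the trimming step; the bert-base-uncased tokenizer here is only a placeholder for illustration, not necessarily the one the evaluator loads.

from transformers import AutoTokenizer

# Placeholder tokenizer for illustration; the evaluation harness may use a different one.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

prediction = None  # example of a non-string value a model might accidentally return

# Coerce the prediction to a string before encoding, since passing None
# (or another non-string type) to encode() raises the TypeError shown above.
if not isinstance(prediction, str):
    prediction = "" if prediction is None else str(prediction)

tokenized_prediction = tokenizer.encode(prediction)
print(len(tokenized_prediction))

Does this sound like a plausible cause, or is something else going on on the evaluation side?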