Submission failure for Task 2

@dipam, @mohanty, @tomtom,

Hi Team, I have not been able to make any submissions for Task 2 since 2/21 at 17:00 CST. I have also left messages on the Discord channel.

Submissions that previously succeeded can no longer be resubmitted. Each one passes the “Conv AI Validation” step, but immediately after entering “Conv AI Phase 1” it fails with no useful error message for debugging.

Can any of the admins help address this issue?

Hi Iris,

The issue appears to be specific to submissions from your account, as there have been numerous successful submissions from other participants after the timestamp you reported.

We acknowledge that error propagation still has some cluster-specific issues, which we are working to resolve.
In the meantime, please find the error log for your submission below.

submission-248780-evaluation-4331-188153-bl8rn-1190196615: /home/aicrowd/.conda/lib/python3.9/site-packages/transformers/convert_slow_tokenizer.py:515: UserWarning: The sentencepiece tokenizer that you are converting to a fast tokenizer uses the byte fallback option which is not implemented in the fast tokenizers. In practice this means that the fast version of the tokenizer can produce unknown tokens whereas the sentencepiece version would have converted these unknown tokens into a sequence of byte tokens matching the original piece of text.
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   warnings.warn(
submission-248780-evaluation-4331-188153-bl8rn-1190196615: Evaluation started
submission-248780-evaluation-4331-188153-bl8rn-1190196615: Init wrapper
submission-248780-evaluation-4331-188153-bl8rn-1190196615: Traceback (most recent call last):
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/run.py", line 3, in <module>
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     start_test_client()
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/client_launcher.py", line 20, in start_test_client
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     client.run_agent()
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 193, in run_agent
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     raw_response, status, message = self.process_request(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 99, in process_request
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     "data": self.route_agent_request(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 129, in route_agent_request
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self.execute(target_attribute, *args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 142, in execute
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return method(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/aicrowd_wrapper.py", line 30, in classify_link
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self.agent.classify_link(test_data_batch)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/agents/deberta_v3_large_nli_comFact_agent.py", line 56, in classify_link
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     head_logits = self.model(**head_inputs).logits
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1300, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     outputs = self.deberta(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1070, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     encoder_outputs = self.encoder(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 514, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     output_states = layer_module(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 362, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     attention_output = self.attention(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 293, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     self_output = self.self(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return self._call_impl(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     return forward_call(*args, **kwargs)
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 721, in forward
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     rel_att = self.disentangled_attention_bias(
submission-248780-evaluation-4331-188153-bl8rn-1190196615:   File "/home/aicrowd/.conda/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 819, in disentangled_attention_bias
submission-248780-evaluation-4331-188153-bl8rn-1190196615:     p2c_att = torch.gather(
submission-248780-evaluation-4331-188153-bl8rn-1190196615: torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 1.29 GiB. GPU 0 has a total capacty of 14.75 GiB of which 1.02 GiB is free. Process 38611 has 13.73 GiB memory in use. Of the allocated memory 11.06 GiB is allocated by PyTorch, and 2.54 GiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
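
From the trace, the agent runs out of GPU memory inside classify_link (agents/deberta_v3_large_nli_comFact_agent.py) during a DeBERTa-v3-large forward pass. Two common mitigations are to run inference under torch.no_grad() and to process the incoming batch in smaller chunks. Below is a minimal sketch of that idea; the helper name classify_in_chunks, the chunk size, and the assumption that test_data_batch is a list of input strings are illustrative only and not taken from your actual agent code:

```python
import torch

CHUNK_SIZE = 8  # hypothetical value; lower it further if OOM persists

def classify_in_chunks(model, tokenizer, test_data_batch, chunk_size=CHUNK_SIZE):
    """Run sequence-classification inference on a large batch in small chunks."""
    all_logits = []
    # torch.no_grad() avoids storing activations for backprop, which is the
    # largest source of GPU memory use during inference.
    with torch.no_grad():
        for start in range(0, len(test_data_batch), chunk_size):
            chunk = test_data_batch[start:start + chunk_size]
            inputs = tokenizer(
                chunk, padding=True, truncation=True, return_tensors="pt"
            ).to(model.device)
            logits = model(**inputs).logits
            # Move results off the GPU immediately so they do not accumulate there.
            all_logits.append(logits.cpu())
    return torch.cat(all_logits, dim=0)
```

If memory pressure persists even with small chunks, the allocator hint at the end of the error message can also be applied before the agent starts, e.g. by setting PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 in the environment.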

Best,
Mohanty