Submission #284483 shows "Evaluation failed with exit code 1". Can you provide more log information?

Evaluation failed with exit code 1

Logs are not available.

Please provide me with your submission ID. It is very difficult for me to find your submission given the current information.

Thank you for your reply. The ID is #284483, as mentioned in the title.

I cannot fully make sense of your log, but I am pasting everything here in case it helps you.

2025-05-13 20:47:11.319
[rank0]: Traceback (most recent call last):
[rank0]:   File "/aicrowd-source/launcher.py", line 213, in <module>
[rank0]:     raise exc
[rank0]:   File "/aicrowd-source/launcher.py", line 198, in <module>
[rank0]:     main()
[rank0]:   File "/aicrowd-source/launcher.py", line 191, in main
[rank0]:     serve()
[rank0]:   File "/aicrowd-source/launcher.py", line 178, in serve
[rank0]:     oracle_client.run_agent()
[rank0]:   File "/usr/local/lib/python3.10/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 195, in run_agent
[rank0]:     raw_response, status, message = self.process_request(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 101, in process_request
[rank0]:     "data": self.route_agent_request(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 131, in route_agent_request
[rank0]:     return self.execute(target_attribute, *args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/aicrowd_gym/clients/base_oracle_client.py", line 144, in execute
[rank0]:     return method(*args, **kwargs)
[rank0]:   File "/aicrowd-source/launcher.py", line 125, in batch_generate_response
[rank0]:     return run_with_timeout(
[rank0]:   File "/aicrowd-source/launcher.py", line 161, in run_with_timeout
[rank0]:     return fn(*args, **kwargs)
[rank0]:   File "/aicrowd-source/agents/task2_agent.py", line 229, in batch_generate_response
[rank0]:     outputs = self.llm.generate(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/utils.py", line 1134, in inner
[rank0]:     return fn(*args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 470, in generate
[rank0]:     outputs = self._run_engine(use_tqdm=use_tqdm)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 1409, in _run_engine
[rank0]:     step_outputs = self.llm_engine.step()
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 1431, in step
[rank0]:     outputs = self.model_executor.execute_model(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 140, in execute_model
[rank0]:     output = self.collective_rpc("execute_model",
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/executor/uniproc_executor.py", line 56, in collective_rpc
[rank0]:     answer = run_method(self.driver_worker, method, args, kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/utils.py", line 2378, in run_method
[rank0]:     return func(*args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/worker/worker_base.py", line 420, in execute_model
[rank0]:     output = self.model_runner.execute_model(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
[rank0]:     return func(*args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/worker/enc_dec_model_runner.py", line 208, in execute_model
[rank0]:     output: SamplerOutput = self.model.sample(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/models/mllama.py", line 1230, in sample
[rank0]:     next_tokens = self.sampler(logits, sampling_metadata)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]:     return self._call_impl(*args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]:     return forward_call(*args, **kwargs)
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 287, in forward
[rank0]:     maybe_deferred_sample_results, maybe_sampled_tokens_tensor = _sample(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 775, in _sample
[rank0]:     return _sample_with_torch(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 744, in _sample_with_torch
[rank0]:     return get_pythonized_sample_results(
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 616, in get_pythonized_sample_results
[rank0]:     sample_results = _random_sample(seq_groups,
[rank0]:   File "/usr/local/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 485, in _random_sample
[rank0]:     random_samples = random_samples.cpu()
[rank0]:   File "/aicrowd-source/launcher.py", line 149, in _timeout_handler
[rank0]:     raise TimeoutError("Operation timed out")
[rank0]: TimeoutError: Operation timed out

2025-05-13 20:47:12.018
[rank0]:[W513 12:47:12.430780681 ProcessGroupNCCL.cpp:1496] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())

2025-05-13 20:47:12.022
Exception ignored in: <function tqdm.__del__ at 0x7f1e981f7b50>
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/tqdm/std.py", line 1148, in __del__
  File "/usr/local/lib/python3.10/site-packages/tqdm/std.py", line 1302, in close
  File "/usr/local/lib/python3.10/site-packages/tqdm/std.py", line 1495, in display
  File "/usr/local/lib/python3.10/site-packages/tqdm/std.py", line 1151, in __str__
  File "/usr/local/lib/python3.10/site-packages/tqdm/std.py", line 1453, in format_dict
  File "/usr/local/lib/python3.10/site-packages/tqdm/utils.py", line 336, in _screen_shape_linux
TypeError: 'NoneType' object is not callable
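Reading the traceback from the bottom up: the evaluator's launcher wraps your agent's batch_generate_response (task2_agent.py, line 229) in run_with_timeout, and its _timeout_handler raised TimeoutError while vLLM was still sampling, i.e. the batch exceeded the allowed time budget. For reference, here is a minimal sketch of how such a signal-based timeout wrapper typically works; this is my guess at the mechanism, not the actual launcher.py code, and the budget value is a placeholder:

import signal

def run_with_timeout(fn, *args, timeout_seconds=600, **kwargs):
    """Run fn(*args, **kwargs), raising TimeoutError if it takes longer than timeout_seconds."""
    def _timeout_handler(signum, frame):
        # SIGALRM fires inside whatever frame is currently executing,
        # which is why the TimeoutError surfaces deep inside vLLM's sampler.
        raise TimeoutError("Operation timed out")

    old_handler = signal.signal(signal.SIGALRM, _timeout_handler)
    signal.alarm(timeout_seconds)  # hypothetical budget; the real limit is set by the evaluator
    try:
        return fn(*args, **kwargs)
    finally:
        signal.alarm(0)  # cancel the pending alarm
        signal.signal(signal.SIGALRM, old_handler)

If the timeout is indeed the cause, reducing per-batch generation cost (smaller max_tokens, smaller batches, or a smaller model) is the usual way to get under the limit.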

Thank you for your careful reply. Could you please provide the reason for the failure of submission #284972?

I think #284972 was graded successfully according to the logs. I will trigger a manual update.

In our internal logs we see this

2025-05-16 14:18:37.413 | INFO     | __main__:<module>:85 - evaluation completed! {"score": 0, "score_secondary": 0, "meta": {"all_total": 1548.0, "all_correct_exact": 11.0, "all_correct": 374.0, "all_miss": 328.0, "all_hallucination": 846.0, "all_exact_match": 0.0071059431524547806, "all_accuracy": 0.24160206718346253, "all_missing": 0.21188630490956073, "all_hallucination_rate": 0.5465116279069767, "all_truthfulness_score": -0.30490956072351416, "all_mean_multi_turn_conversation_score": -0.3049095607235142, "ego_total": 1191.0, "ego_correct_exact": 8.0, "ego_correct": 267.0, "ego_miss": 282.0, "ego_hallucination": 642.0, "ego_exact_match": 0.006717044500419815, "ego_accuracy": 0.22418136020151133, "ego_missing": 0.2367758186397985, "ego_hallucination_rate": 0.5390428211586902, "ego_truthfulness_score": -0.3148614609571788, "ego_mean_multi_turn_conversation_score": -0.3148614609571788}}
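As a sanity check, the rates in that meta blob follow directly from the logged counts. Here is a small snippet (my own reconstruction from the numbers above, not the official scoring code) that reproduces the "all_*" figures:

# Counts taken verbatim from the "all_*" fields of the logged meta dict.
total, correct_exact, correct, miss, hallucination = 1548, 11, 374, 328, 846

print(correct_exact / total)              # 0.007105943... == all_exact_match
print(correct / total)                    # 0.241602067... == all_accuracy
print(miss / total)                       # 0.211886304... == all_missing
print(hallucination / total)              # 0.546511627... == all_hallucination_rate
print((correct - hallucination) / total)  # -0.304909560... == all_truthfulness_score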

So the evaluation did finish, but for some unknown reason the leaderboard was not updated. We will do a manual refresh.


Why did submission #285628 fail? Can you provide more log information? Thank you for your time.

It seems that you hit the storage limit while downloading LLaMA3-90B.

We offer 300 GB of storage, but note that the OS and Kubernetes take up some of it as overhead. Using LLaMA3-90B is probably not a good idea anyway, because it is unlikely to fit on the L40S GPUs.
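For rough context, here are my own back-of-the-envelope numbers, assuming the 90B model is stored and served in 16-bit precision:

params = 90e9
bytes_per_param = 2  # fp16 / bf16 weights
weights_gb = params * bytes_per_param / 1e9
print(weights_gb)    # ~180 GB of weights alone

# An NVIDIA L40S has 48 GB of VRAM, so even before the KV cache and activations
# the weights are several times larger than a single GPU's memory, and the
# download also consumes a large share of the 300 GB storage quota.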

We have carefully reviewed our code and we do not use the 90B model anywhere. Can you provide the log?

Sorry about that, I checked the wrong log.

It seems that you are having trouble building your dependencies: pip's resolver is backtracking and downloading many candidate versions of transformers and vllm to find a compatible set, which is causing the build to time out (see the suggestion after the log excerpt below).

2025-05-23 17:59:38 – 18:00:22
#9 50.06   Downloading vllm-0.8.5-cp38-abi3-manylinux1_x86_64.whl (326.4 MB)
#9 51.43   Downloading vllm-0.8.4-cp38-abi3-manylinux1_x86_64.whl (294.1 MB)
#9 52.83   Downloading vllm-0.8.3-cp38-abi3-manylinux1_x86_64.whl (294.0 MB)
#9 54.20   Downloading vllm-0.8.2-cp38-abi3-manylinux1_x86_64.whl (293.6 MB)
#9 55.61   Downloading vllm-0.8.1-cp38-abi3-manylinux1_x86_64.whl (265.3 MB)
#9 57.05   Downloading vllm-0.8.0-cp38-abi3-manylinux1_x86_64.whl (265.3 MB)
#9 58.50   Downloading vllm-0.7.3-cp38-abi3-manylinux1_x86_64.whl (264.6 MB)
#9 59.64   Downloading vllm-0.7.2-cp38-abi3-manylinux1_x86_64.whl (264.3 MB)
#9 60.79   Downloading vllm-0.7.1-cp38-abi3-manylinux1_x86_64.whl (264.2 MB)
#9 62.16   Downloading vllm-0.7.0-cp38-abi3-manylinux1_x86_64.whl (264.1 MB)
#9 63.54   Downloading vllm-0.6.6.post1-cp38-abi3-manylinux1_x86_64.whl (201.1 MB)
#9 64.56   Downloading vllm-0.6.6-cp38-abi3-manylinux1_x86_64.whl (201.1 MB)
#9 65.57   Downloading vllm-0.6.5-cp38-abi3-manylinux1_x86_64.whl (201.1 MB)
#9 66.66   Downloading vllm-0.6.4.post1-cp38-abi3-manylinux1_x86_64.whl (198.9 MB)
#9 67.70   Downloading vllm-0.6.4-cp38-abi3-manylinux1_x86_64.whl (198.9 MB)
#9 69.32   Downloading vllm-0.6.3.post1-cp38-abi3-manylinux1_x86_64.whl (194.8 MB)
#9 70.81   Downloading vllm-0.6.3-cp38-abi3-manylinux1_x86_64.whl (193.5 MB)
#9 72.30   Downloading vllm-0.6.2-cp38-abi3-manylinux1_x86_64.whl (228.3 MB)
#9 89.14   Downloading transformers-4.52.2-py3-none-any.whl (10.5 MB)
#9 89.21   Downloading transformers-4.52.1-py3-none-any.whl (10.5 MB)
#9 89.28   Downloading transformers-4.51.3-py3-none-any.whl (10.4 MB)
#9 89.34   Downloading transformers-4.51.2-py3-none-any.whl (10.4 MB)
#9 89.43   Downloading transformers-4.51.1-py3-none-any.whl (10.4 MB)
#9 89.51   Downloading transformers-4.51.0-py3-none-any.whl (10.4 MB)
#9 89.57   Downloading transformers-4.50.3-py3-none-any.whl (10.2 MB)
#9 89.63   Downloading transformers-4.50.2-py3-none-any.whl (10.2 MB)
#9 89.70   Downloading transformers-4.50.1-py3-none-any.whl (10.2 MB)
#9 89.78   Downloading transformers-4.50.0-py3-none-any.whl (10.2 MB)
#9 89.84   Downloading transformers-4.49.0-py3-none-any.whl (10.0 MB)
#9 89.90   Downloading transformers-4.48.3-py3-none-any.whl (9.7 MB)
#9 89.97   Downloading transformers-4.48.2-py3-none-any.whl (9.7 MB)
#9 90.03   Downloading transformers-4.48.1-py3-none-any.whl (9.7 MB)
#9 90.10   Downloading transformers-4.48.0-py3-none-any.whl (9.7 MB)
#9 90.16   Downloading transformers-4.47.1-py3-none-any.whl (10.1 MB)
#9 90.23   Downloading transformers-4.47.0-py3-none-any.whl (10.1 MB)
#9 90.29   Downloading transformers-4.46.3-py3-none-any.whl (10.0 MB)
#9 90.36   Downloading transformers-4.46.2-py3-none-any.whl (10.0 MB)
#9 90.43   Downloading transformers-4.46.1-py3-none-any.whl (10.0 MB)
#9 90.50   Downloading transformers-4.45.2-py3-none-any.whl (9.9 MB)
#9 90.57   Downloading transformers-4.45.1-py3-none-any.whl (9.9 MB)
#9 90.63   Downloading transformers-4.45.0-py3-none-any.whl (9.9 MB)
#9 90.70   Downloading transformers-4.44.2-py3-none-any.whl (9.5 MB)
#9 90.76   Downloading transformers-4.44.1-py3-none-any.whl (9.5 MB)
#9 90.82   Downloading transformers-4.44.0-py3-none-any.whl (9.5 MB)
#9 90.89   Downloading transformers-4.43.4-py3-none-any.whl (9.4 MB)
#9 90.95   Downloading transformers-4.43.3-py3-none-any.whl (9.4 MB)
#9 91.02   Downloading transformers-4.43.2-py3-none-any.whl (9.4 MB)
#9 91.09   Downloading transformers-4.43.1-py3-none-any.whl (9.4 MB)
#9 91.15   Downloading transformers-4.43.0-py3-none-any.whl (9.4 MB)
#9 91.27   Downloading transformers-4.42.4-py3-none-any.whl (9.3 MB)
#9 91.33   Downloading transformers-4.42.3-py3-none-any.whl (9.3 MB)
#9 91.40   Downloading transformers-4.42.2-py3-none-any.whl (9.3 MB)
#9 91.46   Downloading transformers-4.42.1-py3-none-any.whl (9.3 MB)
#9 91.52   Downloading transformers-4.42.0-py3-none-any.whl (9.3 MB)
#9 91.58   Downloading transformers-4.41.2-py3-none-any.whl (9.1 MB)
#9 91.64   Downloading transformers-4.41.1-py3-none-any.whl (9.1 MB)
#9 91.71   Downloading transformers-4.41.0-py3-none-any.whl (9.1 MB)
#9 91.77   Downloading transformers-4.40.2-py3-none-any.whl (9.0 MB)
#9 91.83   Downloading transformers-4.40.1-py3-none-any.whl (9.0 MB)
#9 91.89   Downloading transformers-4.40.0-py3-none-any.whl (9.0 MB)
#9 91.95   Downloading transformers-4.39.3-py3-none-any.whl (8.8 MB)
#9 92.02   Downloading transformers-4.39.2-py3-none-any.whl (8.8 MB)
#9 92.08   Downloading transformers-4.39.1-py3-none-any.whl (8.8 MB)
#9 92.15   Downloading transformers-4.39.0-py3-none-any.whl (8.8 MB)
#9 92.21   Downloading transformers-4.38.2-py3-none-any.whl (8.5 MB)
#9 92.27   Downloading transformers-4.38.1-py3-none-any.whl (8.5 MB)
#9 92.34   Downloading transformers-4.38.0-py3-none-any.whl (8.5 MB)
#9 92.39   Downloading transformers-4.37.2-py3-none-any.whl (8.4 MB)
#9 92.45   Downloading transformers-4.37.1-py3-none-any.whl (8.4 MB)
#9 92.51   Downloading transformers-4.37.0-py3-none-any.whl (8.4 MB)
#9 92.57   Downloading transformers-4.36.2-py3-none-any.whl (8.2 MB)
#9 92.63   Downloading transformers-4.36.1-py3-none-any.whl (8.3 MB)
#9 92.69   Downloading transformers-4.36.0-py3-none-any.whl (8.2 MB)
#9 92.74   Downloading transformers-4.35.2-py3-none-any.whl (7.9 MB)
#9 92.80   Downloading transformers-4.35.1-py3-none-any.whl (7.9 MB)
#9 92.86   Downloading transformers-4.35.0-py3-none-any.whl (7.9 MB)
#9 92.92   Downloading transformers-4.34.1-py3-none-any.whl (7.7 MB)
#9 92.97   Downloading transformers-4.34.0-py3-none-any.whl (7.7 MB)
#9 93.03   Downloading transformers-4.33.3-py3-none-any.whl (7.6 MB)
#9 93.08   Downloading transformers-4.33.2-py3-none-any.whl (7.6 MB)
#9 93.13   Downloading transformers-4.33.1-py3-none-any.whl (7.6 MB)
#9 93.19   Downloading transformers-4.33.0-py3-none-any.whl (7.6 MB)
#9 93.24   Downloading transformers-4.32.1-py3-none-any.whl (7.5 MB)
#9 93.29   Downloading transformers-4.32.0-py3-none-any.whl (7.5 MB)
#9 93.34   Downloading transformers-4.31.0-py3-none-any.whl (7.4 MB)
#9 93.40   Downloading transformers-4.30.2-py3-none-any.whl (7.2 MB)
#9 93.45   Downloading transformers-4.30.1-py3-none-any.whl (7.2 MB)
#9 93.50   Downloading transformers-4.30.0-py3-none-any.whl (7.2 MB)
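A common way to avoid this kind of resolver backtracking is to pin exact, mutually compatible versions in your requirements file, so pip only ever downloads one wheel per package. For example, an illustrative sketch (these particular pins are simply versions that appear in the log above; they must match what your code actually needs):

# requirements.txt (illustrative pins, not a recommendation of these exact versions)
vllm==0.6.3.post1
transformers==4.45.2

Pinning also makes the image build reproducible, so a submission that builds once will keep building the same way.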

Thanks for your time, this issue has been resolved. Could you please look at #285654, which failed at 95% of processing?

Tried again, and hit the same error during processing:

2025-05-24 00:47:34.954
[rank0]: ValueError: The decoder prompt (length 8384) is longer than the maximum model length of 8192. Make sure that max_model_len is no smaller than the number of text tokens plus multimodal tokens. For image inputs, the number of image tokens depends on the number of images, and possibly their aspect ratios as well.

This is the error that ultimately causes the failure.
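The error message itself points at the two possible fixes: either keep the prompt (text plus image tokens) under the model's context length, or raise max_model_len when constructing the vLLM engine. A rough sketch of the second option, assuming your agent builds the engine itself (the model name and value here are placeholders, not your actual configuration, and a larger context still has to fit in GPU memory):

from vllm import LLM

# max_model_len must be at least the prompt length (8384 tokens in the failing request).
llm = LLM(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct",  # placeholder model name
    max_model_len=16384,                               # placeholder value > 8384
)

Alternatively, truncate or summarize the multi-turn context on your side before calling generate, so the longest conversations stay within 8192 tokens.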

Thank you for your time.