Inference failed. View the submission for more details

Could you please help…?

Submission #118869 failed…

Not sure whether this is due to the xgb model I am deploying…

FYI, I wrote “import xgboost as xgb” in both “def predict_expected_claim” and “def fit_model(X_raw, y_raw)”

Please help…thanks a ton

Hi @YaoLin and welcome to the challenge :partying_face:!

That’s why we’re here :muscle:

About the imports :inbox_tray:

You only need to define your imports in the designated area of the notebook; they will then be available throughout. Though multiple imports won’t hurt anyway :+1:
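
For instance, something like this in that designated cell (the exact libraries are just an example):

    # Designated imports cell: everything imported here is available to
    # all of the functions you define later in the notebook.
    import numpy as np
    import pandas as pd
    import xgboost as xgb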

About errors in general :hammer_and_wrench:

Generally, for failed submissions we provide the full traceback of the error raised on your training data. In your case you can see the link to the error here, where it says “[generate-predictions]”:

Your specific error :floppy_disk:

In your specific case, if you click on that link you will find the traceback below, which says that you should be using XGBoost’s native save and load functions to save your model (a sketch follows the traceback). If you are saving multiple models, please see this discussion on how to do that.

Traceback (most recent call last):
  File "predict.py", line 30, in <module>
    model = load_model(submission_config["model_path"])
  File "/home/aicrowd/load_model.py", line 7, in load_model
    return pickle.load(target)
  File "/usr/local/lib/python3.8/site-packages/xgboost/core.py", line 1087, in __setstate__
    _check_call(
  File "/usr/local/lib/python3.8/site-packages/xgboost/core.py", line 189, in _check_call
    raise XGBoostError(py_str(_LIB.XGBGetLastError()))
xgboost.core.XGBoostError: [17:32:31] ../src/learner.cc:922: Check failed: header == serialisation_header_: 

  If you are loading a serialized model (like pickle in Python) generated by older
  XGBoost, please export the model by calling `Booster.save_model` from that version
  first, then load it back in current version.  There's a simple script for helping
  the process. See:

    https://xgboost.readthedocs.io/en/latest/tutorials/saving_model.html

  for reference to the script, and more details about differences between saving model and
  serializing.
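
As a minimal sketch of that native save/load round trip (the dummy data and the model.json file name are just for illustration):

    import numpy as np
    import xgboost as xgb

    # Dummy data standing in for your real training set.
    X, y = np.random.rand(50, 3), np.random.rand(50)

    # Instead of pickling the fitted object, save it in XGBoost's native
    # format, which is stable across XGBoost versions.
    model = xgb.XGBRegressor(n_estimators=10)
    model.fit(X, y)
    model.save_model("model.json")

    # Later (for example inside your load_model function), rebuild an
    # empty estimator and reload the saved parameters into it.
    loaded = xgb.XGBRegressor()
    loaded.load_model("model.json")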

I hope this helps :slight_smile:

Ah yes, you can do this in three ways:

  1. Follow the advice on the other thread about creating a zip file in the Colab environment.
  2. Save your xgb models using the native save_model function, then pickle the list of those saved files and submit that pickle file (see the sketch after this list).
  3. Go through the zip submission process. That way you can fully customise your save and load functions.
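
For option 2, a minimal sketch of the bundling step (the segment names, file names, and dummy data are all hypothetical):

    import pickle
    import numpy as np
    import xgboost as xgb

    # Dummy data standing in for your real training set.
    X, y = np.random.rand(100, 4), np.random.rand(100)

    # Suppose you fit one model per segment (segment names are made up).
    filenames = []
    for name in ["bicycle", "motor", "household"]:
        model = xgb.XGBRegressor(n_estimators=10)
        model.fit(X, y)
        fname = f"model_{name}.json"
        model.save_model(fname)  # native, version-stable format
        filenames.append(fname)

    # Pickle the list of saved files so they can be reloaded later.
    with open("model_index.pkl", "wb") as f:
        pickle.dump(filenames, f)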

Hi @YaoLin

I finally had a chance to dig a little deeper into this and we have a solution :rocket:

Overall, the process we follow for Python Colab submissions is:

  1. We take your load_model function and give it one argument: your model path.
  2. Your load_model is then expected to do everything needed to load your model, which is then passed on to your predict functions.

So in your case you can make the following changes:

  1. Inside the notebook there is a cell where you specify the name of your model object inside the config class. You should put the zip file name there. I’m attaching a screenshot of that.
  2. Your load_model can only take one argument, which will be the file path you add to the config above; you should hardcode everything else inside it. For example, if you are submitting 100 different models, you could include in the zip file a pickled list of their names to be loaded (a sketch follows this list).
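
Putting that together, here is a minimal sketch of such a load_model (the zip layout, file names, and pickled index are assumptions matching the bundling sketch above, not a prescribed structure):

    import pickle
    import zipfile
    import xgboost as xgb

    def load_model(model_path):
        # model_path is the single argument we pass in: the zip file you
        # named in the config class. Everything else is hardcoded here.
        with zipfile.ZipFile(model_path) as zf:
            zf.extractall("unpacked_models")

        # Hypothetical index: a pickled list of the model files that
        # were bundled into the zip.
        with open("unpacked_models/model_index.pkl", "rb") as f:
            filenames = pickle.load(f)

        # Reload each model from XGBoost's native format.
        models = []
        for fname in filenames:
            booster = xgb.XGBRegressor()
            booster.load_model(f"unpacked_models/{fname}")
            models.append(booster)
        return models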

That should do it :+1:

Unfortunately I can’t test this for you, because your current submission does not include the zip file: it was not specified in the config class (that’s how we know what to take from the local Colab directories!).

Let me know if that doesn’t work and we’ll get to the bottom of it :gear:

Never mind @alfarzan, I got it figured out now. Thanks a ton for your help!!
