Inference failed errors

Now that submissions are working (thanks, admins!!) I see I’m not alone in getting “inference failed” error messages.

In my case it’s because I’ve got inconsistencies between the feature names of my locally trained model and the feature names I’ve put in the prepare_data routine of the zip file submission.

Sharing in case it helps others get their submissions fixed before the extended deadline.


It turned out I then hit another error which may be worth @alfarzan investigating, and worth checking for other R zip file submitters who have recently upgraded to xgboost 1.3 and are also getting submission errors.

I upgraded my local version of xgboost to 1.3, which for R users was released on CRAN in the last week. After the upgrade I’ve not been successful in making submissions from a locally trained zip file, even though it passes all the local tests as per the zip instructions (and I’d been making successful zip-based submissions in previous weeks with xgboost 1.2).

After I downgraded my local version of xgboost back to 1.2, retrained, and re-ran the tests (which again passed), my reloaded submission then worked.

Could it be that AIcrowd is loading xgboost 1.2 for R users, and that models trained with the new 1.3 version have incompatibilities causing submission errors? In which case I guess I need to specify my version of xgboost when installing packages.


Thanks @nigel_carpenter

Yes, this is true. We were not expecting packages to break backward compatibility in this way.
I will make an announcement about this issue tomorrow after the leaderboard.

From tomorrow we will be:

  1. Updating the packages daily, in case something has been released.
  2. Providing instructions on how to specify version numbers in your install.R script or the install_packages function on colab.
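Ahead of those instructions, here is a minimal sketch of how pinning a package version in install.R could look. The use of the remotes package and the exact version string "1.2.0.1" are assumptions; check the CRAN archive for the precise 1.2.x version you trained with.

```r
# install.R -- pin xgboost to the 1.2 series so the evaluation
# environment matches the version used for local training.
# NOTE: "1.2.0.1" is an assumed CRAN archive version, not confirmed here.
install.packages("remotes")
remotes::install_version(
  "xgboost",
  version = "1.2.0.1",
  repos   = "https://cloud.r-project.org"
)
```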

Thanks for figuring it out!

I also had an “Inference failed” error during the Generating Predictions (Debug) phase.

It says:

Error in x[[jj]] <- v :
attempt to select more than one element in integerOneIndex

In my environment everything works, and I have only added a piece of code with a case_when.
Do you have any idea why I am getting this error?
I really cannot solve it.
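For what it’s worth, one common way to trigger that exact message (this is an assumption about the cause, not taken from the submission itself) is assigning into a data frame with `[[` when the column index has more than one element:

```r
# Minimal sketch reproducing this class of error.
df <- data.frame(a = 1:3, b = 4:6)
jj <- c(1, 2)    # length-2 index -- `[[` only accepts a single element

# df[[jj]] <- 0  # Error in x[[jj]] <- v :
                 #   attempt to select more than one element in integerOneIndex

df[jj] <- 0      # `[` accepts multiple columns; this form works
```

So it is worth checking any place where a computed index (e.g. from a lookup or match) might unexpectedly have length greater than one.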

Hi @pejo92

That looks to be an indexing error from R. Can you please link me to your submission ID and I can try to have a look :slight_smile:

Hi,

I also get the same error and couldn’t figure out where it is coming from (Submission #114123). Any ideas what might be causing this?

Hi @sina1374

I’m sending you the traceback privately.

For clarity, the reason you are not getting the full traceback is that the evaluation works in 4 stages:

  1. Building your environment and installing packages
  2. Using a sample of the training data and generating predictions
  3. Generating predictions on the leaderboard data
  4. Computing your score

We give you tracebacks on steps 1 and 2. Full traceback on step 3 is not provided to reduce the risk of data leakage. Step 4 is from our side and should very rarely contain an error :slight_smile:

Hi @alfarzan
I figured out that there was a problem in my code.
I realized it is possible that we see new make_model values compared to what we have in our training dataset, isn’t it?

Yes! just like a real insurance company may encounter vehicles they’ve never seen before in a new year, so can your model.

But this mostly applies to make_model; the other categorical features have far fewer unique values, so unseen levels are much less likely there.
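One simple way to guard against this (a sketch with hypothetical column and level names, not the organisers’ recommended approach) is to map any level not seen in training onto a fallback category before prediction:

```r
# Map factor levels unseen during training (e.g. a brand-new make_model)
# onto a fallback level so prediction does not fail.
handle_unseen <- function(new_values, train_levels, fallback = "OTHER") {
  ifelse(new_values %in% train_levels, new_values, fallback)
}

train_levels <- c("ford_fiesta", "vw_golf")           # hypothetical
x_new <- c("ford_fiesta", "tesla_model3")             # tesla_model3 unseen

factor(handle_unseen(x_new, train_levels),
       levels = c(train_levels, "OTHER"))
```

The train_levels vector would be saved alongside the model at training time so the same mapping is applied in prepare_data at prediction time.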

That’s right! Makes total sense to me, thanks! :grinning:


Hi @alfarzan,
My latest submissions (e.g. 114780) have encountered a problem with the same error message this thread is referring to: “Inference failed. View the submission for more details.”
In the log I can find the following detailed error message:

“Error in load_model(model_output_path) :
unused argument (model_output_path)”

I really can’t find any solution. Can you please take a look at my submission?

Hi @imfilip

This was a small issue on our side. Can you please try again? And please copy a new notebook (or reset your notebook to the factory runtime).

Thanks for catching it! :handshake:
I’ve made a clarification post about this as well.

Hi I’ve been getting inference errors trying to submit h2o model. Can you please help?


Hi @actuarist

Can you please provide a submission number so I can look into this one? I’m looking into the one from @Baracuda already as they’ve reached out on Discord as well :slight_smile:

@actuarist for @Baracuda who is using zip submissions the issue was that the save and load model functions need to be altered to work with how h2o models are saved and loaded. Let me know if this doesn’t resolve your issue as well.

Finally working :slight_smile:
I’ve mostly been using the colab; it seems to run fine, but I couldn’t get the submission working.
It worked with a zip submission after adding initialising library statements in predict.R.
I thought it would automatically initialise the packages from install.R?

Good to hear :muscle:

In colab, any initialisation you need should be done inside global_imports, which is sourced at the top of every file, including predict.R. In zip submissions, you should do that inside model.R, which gets sourced at the top of predict.R.

Can I have your submission ID to make sure things are working fine from our end?

117334 is my latest working zip submission.
Can we modify predict.R in colab?

Thanks for that!

I would advise that you do not change any of the predict.R scripts even for your zip submissions.

I’ve looked into your submission and actually the issue is:

  1. You should include all your packages at the top of the model.R code and start h2o up there as well.
  2. To load your model, you should hard-code your model path into your load_model function and the load_model itself cannot take any arguments.

I have sent you a modified version of your submission that does not require changing the predict.R to illustrate this :slight_smile:
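Putting those two points together, the top of model.R for an h2o zip submission could look something like this sketch (the save directory and model name here are hypothetical placeholders, not the actual paths used in the evaluation):

```r
# model.R -- sketch for a zip submission using h2o.
# Packages are loaded and h2o is started here, since model.R is
# sourced at the top of predict.R.
library(h2o)
h2o.init()

save_model <- function(model) {
  # h2o.saveModel writes the model to a directory and returns the path
  h2o.saveModel(model, path = "trained_model", force = TRUE)
}

load_model <- function() {
  # Hard-coded path and no arguments, as required by the evaluator.
  # "my_h2o_model" is a placeholder for the saved model's name.
  h2o.loadModel("trained_model/my_h2o_model")
}
```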

For notebook submissions the only difference would be that the hard-coded path of the saved model has to be in a directory called saved_objects. Otherwise it should work very similarly, with the exception of having to use the install_packages and global_imports functions; h2o.init() would fit into the latter.