Can you please tell me the default local directory where the model files are downloaded?

I want to use the “BAAI/bge-m3” model with HuggingFaceEmbedding from LlamaIndex, and I’ve added it to aicrowd.json:

{
    "repo_id": "BAAI/bge-m3",
    "revision": "main"
},

I tried two ways to load this model:

embedding_model = HuggingFaceEmbedding(model_name="BAAI/bge-m3")
embedding_model = HuggingFaceEmbedding(model_name=os.path.expanduser("~/.cache/huggingface/hub/models--BAAI--bge-m3/resolve/main"))

But both attempts failed.
Can you please tell me the default local directory where the model files are downloaded, or the proper way to load a locally cached model with HuggingFaceEmbedding?

Thanks in advance!

Not sure if this helps. You can try passing local_files_only, or test your code with HF_HUB_OFFLINE=1 set as an environment variable. I’ve observed that some packages use the Hugging Face Hub snapshot method to check whether a repo contains certain files, and if local_files_only is not set in that call, it tries to make a network request and fails when it doesn’t get a correct response.
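
For reference, here is a minimal sketch of what I mean, assuming the model was already downloaded into the default Hugging Face cache (~/.cache/huggingface/hub, or $HF_HOME/hub if HF_HOME is set) and that huggingface_hub plus the llama-index-embeddings-huggingface package (which provides the import path used below) are installed:

import os

# Force Hugging Face libraries to use only the local cache; set this before
# any hub call is made.
os.environ["HF_HUB_OFFLINE"] = "1"

from huggingface_hub import snapshot_download
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Resolve the locally cached snapshot directory, e.g.
# ~/.cache/huggingface/hub/models--BAAI--bge-m3/snapshots/<commit-hash>/
local_dir = snapshot_download(
    repo_id="BAAI/bge-m3",
    revision="main",
    local_files_only=True,  # never hit the network; raise if files are missing
)

# Point HuggingFaceEmbedding at the resolved local directory instead of the repo id.
embedding_model = HuggingFaceEmbedding(model_name=local_dir)

Also note that the on-disk cache layout uses snapshots/<commit-hash>, not resolve/main (that is the URL layout on the Hub website), which may be why your second attempt above failed.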