Evaluation result says file too large?

2019-11-25T14:04:59.69570723Z [Errno 27] File too large
2019-11-25T14:05:00.438503169Z Ending traning phase

This is the response from the evaluation, but our uploaded package is no more than 4 MB in total.
The checkpoint model file will be larger than 30 MB; isn't that acceptable?

Hi, that is acceptable. You can train models in the train/ folder, up to 1000Gi in size.

Can you share the submission id?

Also, the error doesn't seem to be size-related; see [1]:

On attempting to open files with sufficiently long file names, python throws IOError: [Errno 27] File too large.  This is misleading, and perhaps should be relabeled as 'File name too long.'

[1] https://bugs.python.org/issue9271
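If you want to check whether that is what's happening, here is a minimal sketch (the filename length is purely illustrative) that shows which errno your environment actually reports for an over-long file name:

import os

# Minimal sketch: try to create a file whose name exceeds the usual
# NAME_MAX limit (255 bytes on most filesystems). Depending on the
# platform, the resulting OSError may report ENAMETOOLONG (errno 36)
# or, as described in [1], the misleading "File too large" (errno 27).
long_name = "x" * 1000  # illustrative, far beyond NAME_MAX

try:
    with open(long_name, "w") as f:
        f.write("test")
except OSError as exc:
    print(exc.errno, os.strerror(exc.errno), exc)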

AIcrowd Submission Received #25266


Please find it above. Thanks.

We did consider this, yet the only “long file name” we reach in the code is the record directory name of the human data, and that works well on our local machine too.

Hi @rolanchen,

It seems the error message wasn't very helpful. I re-ran your submission with a minor change to print the traceback. This should help you debug further; the error seems to be coming from your server.init():

Traceback (most recent call last):
  File "/home/aicrowd/run.py", line 14, in <module>
    train.main()
  File "/home/aicrowd/train.py", line 159, in main
    server.init()
  File "/home/aicrowd/aiserver.py", line 52, in init
    self.kick_cbuf = SharedCircBuf(self.instance_num, {'NAN':np.zeros([2,2])}, ['NAN'])
  File "/home/aicrowd/lock_free_queue.py", line 86, in __init__
    self.read_queue = SafeQueue(queue_size)
  File "/home/aicrowd/lock_free_queue.py", line 25, in __init__
    sary = multiprocessing.sharedctypes.RawArray('b', 8 * size)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/sharedctypes.py", line 61, in RawArray
    obj = _new_value(type_)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/sharedctypes.py", line 41, in _new_value
    wrapper = heap.BufferWrapper(size)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/heap.py", line 263, in __init__
    block = BufferWrapper._heap.malloc(size)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/heap.py", line 242, in malloc
    (arena, start, stop) = self._malloc(size)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/heap.py", line 134, in _malloc
    arena = Arena(length)
  File "/srv/conda/envs/notebook/lib/python3.7/multiprocessing/heap.py", line 77, in __init__
    os.ftruncate(self.fd, size)
OSError: [Errno 27] File too large
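
For context, this is my reading of the traceback rather than something I have verified against your code: RawArray('b', 8 * size) asks multiprocessing for a shared-memory block of 8 * size bytes, which on Linux is backed by a temporary file (e.g. under /dev/shm or TMPDIR) that gets os.ftruncate'd to that size. If queue_size ends up very large, or the evaluation container has a tighter file-size limit than your local machine, that ftruncate can fail with [Errno 27] File too large. A rough way to check the requested size in isolation (the queue_size value below is just a placeholder):

import multiprocessing.sharedctypes as sharedctypes

# Placeholder value; substitute whatever size SafeQueue.__init__
# actually receives in your run.
queue_size = 10 ** 6

requested_bytes = 8 * queue_size
print(f"Requesting {requested_bytes} bytes of shared memory")

try:
    # Same call as lock_free_queue.py line 25; the buffer is backed by a
    # temporary file that is ftruncated to this size before being mmapped.
    buf = sharedctypes.RawArray('b', requested_bytes)
    print("Allocation succeeded")
except OSError as exc:
    print("Allocation failed:", exc)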

Thanks for the log! But it is really strange… we didn't encounter any problem with this part locally… we will keep debugging it anyway. Thanks again.