Search before asking
Question
I'm currently trying to train on a custom dataset that has around 25k images.
Despite a decent computer configuration: 32.0 GB of RAM,
AMD Ryzen 7 3800X (8 cores),
NVIDIA Quadro RTX 4000,
I'm running into the following error:
error: (-4:Insufficient memory) Failed to allocate xxxx bytes in function 'cv::OutOfMemoryError'
Since I don't want to downscale my images, reduce my dataset, or disable the --cache option, I wanted to know if I can train my model in 2 or 3 sessions to avoid the problem.
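For context, a rough back-of-the-envelope estimate shows why caching the whole dataset in RAM can exhaust 32 GB. This sketch assumes images are cached as uint8 RGB at a 640×640 training resolution; the resolution is an assumption, not something stated above:

```python
# Rough estimate of RAM needed to cache the dataset in memory.
num_images = 25_000                      # dataset size from the question
height, width, channels = 640, 640, 3    # ASSUMED training resolution (not stated)

bytes_per_image = height * width * channels   # uint8 -> 1 byte per value
total_bytes = num_images * bytes_per_image
total_gb = total_bytes / 1e9

print(f"~{total_gb:.1f} GB")  # ~30.7 GB, already close to the 32 GB of system RAM
```

This leaves almost nothing for the OS, the dataloader workers, and OpenCV's own buffers, which is consistent with the cv::OutOfMemoryError above. If the trainer supports it (e.g. YOLOv5's `--cache disk`), caching decoded images to disk instead of RAM avoids this pressure without downscaling or shrinking the dataset.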
If training in several sessions is not possible, I'm also wondering whether I should use transfer learning to train in 2 or 3 stages, since I'm currently training from scratch.
Thank you in advance for any kind of help.
Additional
No response