I want to download a very large zip file (up to 300GB). But my school's network speed in China is limited, so I am trying to download the data to Google Drive via Colab. However, Colab can only handle files up to about 60GB at a time. **So I wonder if there is a way to split the 300GB zip file into several 60GB zip files and download them one at a time?**

If you have rapport with the provider of the data, you can ask them to re-upload it in smaller files. If you do not have rapport with the data provider but have some software engineering skills, you can write code that makes "Range Requests" (see https://en.wikipedia.org/wiki/Byte_serving) to fetch just part of the file at a time; a sketch of this approach is below. If you find you require more disk space, with no way to avoid it, you could try a different machine learning environment, like AWS SageMaker.
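
For example, here is a minimal sketch of the range-request approach in Python. It assumes the server actually supports byte serving, that Google Drive is mounted at the usual Colab path, and that the URL, part size, and file names are placeholders you would replace with your own:

```python
# Sketch: download a large file in fixed-size chunks using HTTP Range requests.
# Assumes the server supports byte serving (responds with 206 Partial Content).
import os
import requests

URL = "https://example.com/data.zip"     # hypothetical download URL
CHUNK = 50 * 1024 ** 3                   # 50 GB per part, safely under Colab's limit
OUT_DIR = "/content/drive/MyDrive/data"  # Google Drive mount point in Colab

# Ask the server for the total file size without downloading the body.
total = int(requests.head(URL, allow_redirects=True).headers["Content-Length"])

os.makedirs(OUT_DIR, exist_ok=True)
for part, start in enumerate(range(0, total, CHUNK)):
    end = min(start + CHUNK, total) - 1  # Range header bounds are inclusive
    headers = {"Range": f"bytes={start}-{end}"}
    with requests.get(URL, headers=headers, stream=True) as resp:
        resp.raise_for_status()          # expect HTTP 206 Partial Content
        with open(os.path.join(OUT_DIR, f"data.zip.part{part:03d}"), "wb") as f:
            for block in resp.iter_content(chunk_size=1 << 20):  # 1 MiB blocks
                f.write(block)
```

Once all parts are on your own machine, they can be reassembled byte-for-byte, e.g. with `cat data.zip.part* > data.zip` on Linux, since each part is just a contiguous slice of the original file.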
