When uploading large directories via domino upload-dataset, a slow or unstable network connection can cause the CLI command to fail with an error like:
Error!
Snapshot upload failed. You can retry this upload with the following command:
domino upload-dataset testuser/testproject/testdata /user/ghud/mydirectory 301d95e2-00db-407d-8ebc-6055b68bac17
The underlying root cause can vary depending on the total size of the directory, the number of files, the size of the largest file, the number of network hops between hosts, latency between hosts, and other network- or connection-specific factors outside Domino's control.
There are four possible solutions to try.
- Work with your network administrators to ensure your system has a fast, reliable connection to https://<your domino instance url>
- Toggle two environment variables that alter the network traffic sent by the CLI. Because network conditions vary from customer to customer, this may require iterative testing of different values to find an improvement.
a) Configurable Chunk Size
- Environment Variable: DOMINO_UPLOAD_CHUNK_BYTES
- Default: 3145728 (3 MB)
- Benefit: Can be tuned based on average file size and network characteristics
b) Multi-threaded Upload
- Environment Variable: DOMINO_UPLOAD_THREADS
- Default: 8
- Benefit: Allows parallel chunk uploads
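As a sketch, the two variables above could be set before retrying the upload. The values shown are illustrative assumptions only, not recommendations; you will need to experiment for your own network:

```shell
# Illustrative tuning values -- adjust iteratively for your network.
# Larger chunks can suit fast, stable links; fewer threads can reduce
# contention on congested connections.
export DOMINO_UPLOAD_CHUNK_BYTES=8388608   # 8 MB chunks (default: 3145728)
export DOMINO_UPLOAD_THREADS=4             # parallel uploads (default: 8)

# Then retry the upload using the command from the error message, e.g.:
# domino upload-dataset testuser/testproject/testdata /user/ghud/mydirectory <snapshot-id>
echo "chunk=$DOMINO_UPLOAD_CHUNK_BYTES threads=$DOMINO_UPLOAD_THREADS"
```

The variables only affect domino commands run in the same shell session where they were exported.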
- Try zipping or tarring your directory. Text compresses well, which gives the upload less data and fewer files to move. Once uploaded, open a workspace and run
sudo apt-get install unzip
and
unzip file.zip -d destination_folder
or use tar to unarchive the data you uploaded.
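A minimal sketch of the archive-and-extract workflow using tar (the directory and file names here are illustrative placeholders, not paths from your system):

```shell
# Demonstration: compress a directory before uploading.
# "mydirectory" stands in for the directory you intend to upload.
mkdir -p mydirectory && echo "sample data" > mydirectory/sample.txt
tar -czf mydirectory.tar.gz mydirectory

# After uploading mydirectory.tar.gz and opening a workspace,
# unpack it into the dataset location:
mkdir -p destination_folder
tar -xzf mydirectory.tar.gz -C destination_folder
```

gzip compression (-z) works well for text-heavy directories; for already-compressed data (images, parquet), tar without -z still helps by collapsing many small files into one.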
- Upload files into the dataset via the Domino UI. On that page you can upload up to 50 GB or 50,000 individual files. See https://docs.dominodatalab.com/en/5.1/user_guide/0efc5a/update-a-dataset/ and https://docs.dominodatalab.com/en/5.1/user_guide/d1a1ae/create-a-dataset/
Note: on Mac or Linux use "export" to set environment variables; on Windows use "set".