Preprocessing is GONE and rerunning

Ugh ....
I have a preprocessing stage that takes several days to run, so it was a heavy investment of quota. Assuming it only has to run once, that's fine .... but after waiting a long time for training results I decided to look at the log file, and now I see that the preprocessing stage is running again from scratch.
Under what conditions are preprocessing results deleted? If mine are gone, then I really need to stop here and work on other competitions.
Is there a way you guys can tell me which preprocessing images of mine you still have?
Created by Clinton Mielke (subcosmos)
> Please ensure this is documented

Sect. 3.6 of the instructions previously said:

> If the same submission file is submitted a second time as is, the pre-processing stage will not run because the system knows that the pre-processed data are already available, therefore there is no need to regenerate them.

I added the following sentence:

> (Note: The system caches only one pre-processing result per team. If the pre-processing image is changed in a subsequent submission, the previously cached preprocessing result is removed.)

That's what I suspected, but I certainly didn't know about this behavior until it was too late. Please ensure this is documented for future competitions. This cost me 4 days of quota time.

@subcosmos I will have a look. The high-level comment is: the system only caches one 'preprocessing' output for you. If you change your preprocessing method from submission to submission, you trigger the system to remove what it has cached and recompute with the preprocessing algorithm of the latest submission.
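To make that behavior concrete, here is a minimal sketch of a single-slot preprocessing cache keyed by the preprocessing image digest. This is only an illustration of the behavior described above; the class and method names are assumptions, not the platform's actual implementation.

```python
# Minimal sketch of a single-slot preprocessing cache, keyed by image digest.
# Illustration of the described behavior only; not the challenge infrastructure code.

class PreprocessingCache:
    def __init__(self):
        self.cached_digest = None   # digest of the preprocessing image last run
        self.cached_result = None   # handle to that image's preprocessed output

    def get_or_run(self, preprocessing_digest, run_preprocessing):
        """Reuse the cached output if the digest matches; otherwise discard it
        and rerun preprocessing with the new image."""
        if preprocessing_digest == self.cached_digest:
            return self.cached_result            # cache hit: no recompute
        # Cache miss: the previous result is removed and preprocessing reruns.
        self.cached_result = run_preprocessing(preprocessing_digest)
        self.cached_digest = preprocessing_digest
        return self.cached_result
```

The key point is that there is only one slot: a cache miss does not add a second entry, it replaces the existing one.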
**Update**: As suspected, looking at your four most recent submissions, you seem to be switching back and forth between two preprocessing methods:
(1) docker.synapse.org/syn7988514/prep@sha256:a778d0e7643e557dd77ee768bd63d3156620b98b71450878b8183596322ffded
(2) docker.synapse.org/syn7988514/extract@sha256:00ec2f38337eb46d997a0903338c5aff519b26b908d53410462adf33d5c49881
We only cache the results of one at a time.
syn8378813 (preprocessing method 1)
    preprocessing=docker.synapse.org/syn7988514/prep@sha256:a778d0e7643e557dd77ee768bd63d3156620b98b71450878b8183596322ffded
    training=docker.synapse.org/syn7988514/train1@sha256:a4b4248c1ebe67299fd869166a7c84aba466723801eea55c875014bfc66138e1

syn8375786 (preprocessing method 2)
    preprocessing=docker.synapse.org/syn7988514/extract@sha256:00ec2f38337eb46d997a0903338c5aff519b26b908d53410462adf33d5c49881
    training=docker.synapse.org/syn7988514/autoencoder@sha256:5c1f621f7f4a0921b97f8e45f4a6fe8f3ba7e4a98e39d059c4b144ce6088c6c9

syn8378813 (preprocessing method 1)
    preprocessing=docker.synapse.org/syn7988514/prep@sha256:a778d0e7643e557dd77ee768bd63d3156620b98b71450878b8183596322ffded
    training=docker.synapse.org/syn7988514/train1@sha256:a4b4248c1ebe67299fd869166a7c84aba466723801eea55c875014bfc66138e1

syn8375786 (preprocessing method 2)
    preprocessing=docker.synapse.org/syn7988514/extract@sha256:00ec2f38337eb46d997a0903338c5aff519b26b908d53410462adf33d5c49881
    training=docker.synapse.org/syn7988514/autoencoder@sha256:5c1f621f7f4a0921b97f8e45f4a6fe8f3ba7e4a98e39d059c4b144ce6088c6c9
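Given that single cached slot, the alternating pattern above means each of those four submissions was a cache miss that triggered a fresh preprocessing run. A small self-contained illustration of that effect (digests abbreviated; behavior as described in this thread, not actual platform code):

```python
# Alternating between two preprocessing images defeats a single-slot cache:
# each of the four submissions below is a cache miss and triggers a full rerun.
# Abbreviated digests stand in for the prep (1) and extract (2) images above.

cached_digest = None
submissions = ["a778d0e7", "00ec2f38", "a778d0e7", "00ec2f38"]

for digest in submissions:
    if digest == cached_digest:
        print(f"{digest}: cache hit, preprocessed data reused")
    else:
        print(f"{digest}: cache miss, preprocessing recomputed from scratch")
        cached_digest = digest   # the previously cached result is discarded here
```

With only one slot there is no history to fall back on: each switch evicts the other image's results, so alternating always recomputes.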
[Embedded leaderboard widget listing this team's submissions (status, progress, training quota remaining, log folder, model state) omitted.]

Poke
Any comment on whether my preprocessed images can be recovered, or on the conditions under which they are deleted? (To avoid this happening again with the precious little time remaining.)

Update:
I have run two preprocessing images over the course of this competition. Image1 hasn't been used for a while, and image2 has been in regular daily use for several weeks. Yesterday I ran several jobs with image2 that went straight into training, using the cached preprocessing results prepared on Jan 31st.
Today I decided to switch back to the image1 preprocessing image to compare how the model trains on the older preprocessed data. I found that preprocessing started over from scratch, which I have no time to let finish. I assumed you delete older results after a timeout.
But now the image2 preprocessing results are ALSO gone. Unless there is a serious problem on your backend, I'm really hoping that submitting a job depending on image1 didn't clobber the image2 results... please tell me this isn't intentional.
Either way, this experience has effectively ended any chance of competing in this competition.
image1: sha256:a778d0e7643e557dd77ee768bd63d3156620b98b71450878b8183596322ffded
image2: sha256:00ec2f38337eb46d997a0903338c5aff519b26b908d53410462adf33d5c49881
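For what it's worth, one way to guard against triggering this again is to check, before submitting, whether the preprocessing line in the new submission file matches the one submitted last. A minimal sketch, assuming the submission file uses key=value lines like those listed earlier (file names here are hypothetical):

```python
# Sketch: before submitting, warn if the preprocessing image differs from the
# previous submission, since a change discards the cached preprocessing result.
# File names are hypothetical; the key=value format follows the listings above.

def preprocessing_line(path):
    """Return the 'preprocessing=...' line of a submission file, or None."""
    with open(path) as f:
        for line in f:
            if line.startswith("preprocessing="):
                return line.strip()
    return None

previous = preprocessing_line("previous_submission.txt")
current = preprocessing_line("new_submission.txt")

if previous and current and current != previous:
    print("WARNING: preprocessing image changed; the cached result from the")
    print("previous preprocessing run will be removed and recomputed.")
```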