Dear organizer,
I'm trying to submit a Docker package for task 1 (submissions 9746385, 9746374, 9746373); however, it's taking too long and I don't know whether it will complete successfully.
Could I request details of the computing resources, such as RAM, CPU, and GPU?
Sincerely yours,
Tsai-Min
@chentsaimin Indeed, submission `9748115` was submitted to the task 1 Final Round.
Dear @gaia.sage,
Thanks for your kind reminder. We have uploaded our Docker and write-up submissions for the task 1 Final Round.
Could you please check whether you received them?
Sincerely yours,
Tsai-Min
Dear @chentsaimin
The leaderboard round is over, so submissions must now be made to the Final Round queue.
Thanks!
Dear organizer,
I could not upload my Docker submission for task 1.
It shows:
```
The given date is outside the time range allowed for submissions.
```
Could you please help me check it?
Sincerely yours,
Tsai-Min
Dear Tsai-Min,
It appears the Docker container for submission 9747041 reached the 3-hour time limit and is running a retry.
We are investigating the reason for the retry, but we encourage your team to make another submission that can execute within the allotted time.
Thank you for bringing this to our attention!
Jenny
Dear organizer,
My Docker submission for task 1 (**9747041**) is taking too long to finish.
Could you please help me check it?
Sincerely yours,
Tsai-Min
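(For teams running into the 3-hour execution limit mentioned in the organizer replies above, here is a minimal sketch of a wall-clock budget guard; the loop body is only a placeholder standing in for a real pipeline's batches.)
```
import time

# Minimal wall-clock budget guard: stop with a safety margin before the
# platform's 3-hour limit instead of being killed mid-run. The sleep below is
# a placeholder for one batch of a team's real workload.
TIME_LIMIT_S = 3 * 60 * 60      # platform limit mentioned in the thread
SAFETY_MARGIN_S = 10 * 60       # finish ~10 minutes early
start = time.monotonic()

def seconds_left() -> float:
    return TIME_LIMIT_S - SAFETY_MARGIN_S - (time.monotonic() - start)

completed = 0
for batch_id in range(1000):    # stand-in for the real batches
    if seconds_left() <= 0:
        print(f"Budget exhausted after {completed} batches; writing partial output.")
        break
    time.sleep(0.01)            # stand-in for one batch of real work
    completed += 1

print(f"Processed {completed} batches within the time budget.")
```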
Dear Jyoti
Thank you for bringing this to our attention. We've discovered the source of the error and patched the pipeline accordingly. We've also re-processed your `9746809` submission. Please view the leaderboard for your submission results and the updated Docker log.
Thank you,
Jenny
Dear organizers,
I submitted several Docker packages that need the "train_data/PEGS_genomic_data/methylation/PEGS_methylation_beta_train.rds" file (the most recent being **9746809**). The script works in my local Docker environment, but the .rds file seems to be too large for the servers: the script begins reading the file and then just stops. The synthetic file is around 8 GB in size.
There seems to be a problem with the computing resources. Can you please look into it?
Best regards,
Jyoti
Dear Tsai-Min,
I wanted to inform you that the resources allocated to the machines running your Docker containers are sufficient for the challenge. We have already received several successful submissions using these resources. Our pipeline runs are set up with the current configuration, so we are unable to modify them or add GPUs.
Thank you for your understanding!
Dear Tsai-Min,
I'd like to correct my previous statement: We are allocating 32 GB of RAM and 8 CPU cores for the execution of the Docker container, and 16 GB of RAM and 4 CPU cores to other processes within the pipeline.
We are currently reviewing your request for more compute resources for the submission processing.
A side note: The backend issue has been fixed and your previous 3 submissions have been evaluated.
Thank you for your feedback!
Jenny
Dear organizer @jenny.medina,
Thanks for your help.
Regarding the computing resources you mentioned below:
```
To answer your question, each submission is allotted 16 GB of RAM and 4 CPU cores for the execution of the Docker container.
```
Considering that the "PEGS_GWAS_genotypes_v1.1_train_synthetic.bed" file alone exceeds 14.163 GB, isn't this allocation too limited to make use of the genomic data?
Could you provide more RAM and multiple GPUs with the tensorflow-gpu Python package?
Sincerely yours,
Tsai-Min
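(To work within that allotment, here is a minimal sketch of streaming the genotype .bed file in variant chunks rather than loading it whole; it assumes the bed-reader Python package is installed in the submission image, and the chunk size, file location, and per-chunk statistic are placeholders.)
```
import numpy as np
from bed_reader import open_bed  # assumption: bed-reader is available in the image

# Minimal sketch: read the ~14 GB .bed file in variant chunks so only
# (samples x CHUNK_VARIANTS) genotypes are ever held in RAM at once.
BED_PATH = "PEGS_GWAS_genotypes_v1.1_train_synthetic.bed"  # adjust to the actual location
CHUNK_VARIANTS = 10_000

with open_bed(BED_PATH) as bed:
    n_samples, n_variants = bed.iid_count, bed.sid_count
    allele_freq = np.empty(n_variants, dtype=np.float64)
    for start in range(0, n_variants, CHUNK_VARIANTS):
        stop = min(start + CHUNK_VARIANTS, n_variants)
        chunk = bed.read(np.s_[:, start:stop], dtype="float32")    # missing -> NaN
        allele_freq[start:stop] = np.nanmean(chunk, axis=0) / 2.0  # example statistic

print(f"Processed {n_variants} variants from {n_samples} samples in chunks.")
```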
Dear Tsai-Min,
I took a look at the pipeline evaluating your submissions, and there is an issue with the computing environment that is unrelated to your submission. The engineers will be looking into the fix, after which we will send your submission back into the pipeline for evaluation.
We recommend holding off on new submissions until this is complete.
To answer your question, each submission is allotted 16 GB of RAM and 4 CPU cores for the execution of the Docker container.
We apologize for this inconvenience and thank you for your patience,
Jenny
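(For anyone who wants to confirm what their container actually sees, here is a minimal diagnostic sketch that logs the visible memory limit and CPU count; it assumes a Linux container using cgroup v1 or v2, and CPU quotas set via cgroups may not appear in the affinity mask.)
```
import os
from pathlib import Path

# Minimal diagnostic: print the memory limit and CPU count visible inside the
# container, so the submission log can be compared with the stated allotment.
def container_memory_limit_bytes():
    for p in ("/sys/fs/cgroup/memory.max",                     # cgroup v2
              "/sys/fs/cgroup/memory/memory.limit_in_bytes"):  # cgroup v1
        path = Path(p)
        if path.exists():
            raw = path.read_text().strip()
            return None if raw == "max" else int(raw)
    return None

limit = container_memory_limit_bytes()
print("memory limit:", f"{limit / 1024**3:.1f} GiB" if limit else "not found / unlimited")
print("cpus in affinity mask:", len(os.sched_getaffinity(0)))  # quotas may not appear here
```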
(solved) The details of computing resources for task 1