Hi! It's me again. I'm having a LOT of trouble because I don't know which features are included in the datasets. This led me to try many submissions that work perfectly fine in the fast lane but kept failing again and again in the R1 submissions. Twice I got "submission invalid" because a dataset contained NAs (due to this problem of not knowing the actual features in it), but now I cannot make new submissions: I don't know what score my model got, nor can I re-submit another version I was also trying. This is VERY frustrating because I got everything working just fine on my computer, on my virtual machine, and in the fast lane, and now I don't know my scores and don't have the same possibilities as the other competitors. Is there any way this could be fixed? Or, at the very least, could I see the scores of the datasets that got any, so I can keep working on that basis?

Created by Martin Guerrero martinguerrero89
@martinguerrero89 The gene names have been posted; please see the new thread.
That is great! Thank you! I understand that a great method should be able to work for many datasets with varying features. I'm trying to do that, but in the process I ran into this problem. It is good for me at least to know which features are present, though I can manage without knowing exactly; I guess the real problem is not having a fast lane that is representative of the range of possibilities. Anyway, thank you very much for your help!
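In case it helps anyone hitting the same NA problem: a minimal defensive sketch (assuming expression data in a pandas DataFrame and a fixed feature list from training; the gene names and the `align_features` helper here are hypothetical, not part of the challenge code):

```python
import pandas as pd

def align_features(df: pd.DataFrame, train_features: list, fill_value: float = 0.0) -> pd.DataFrame:
    """Reindex a dataset to the training feature list.

    Features missing from df are added and filled with fill_value,
    extra features are dropped, so the model never sees NAs caused
    by an unexpected column set.
    """
    return df.reindex(columns=train_features).fillna(fill_value)

# Toy example: dataset missing GENE_B and carrying an extra column.
train_features = ["GENE_A", "GENE_B", "GENE_C"]
ds = pd.DataFrame({"GENE_A": [1.0, 2.0], "GENE_C": [3.0, 4.0], "EXTRA": [9.0, 9.0]})

aligned = align_features(ds, train_features)
print(list(aligned.columns))          # columns now match the training set
print(int(aligned.isna().sum().sum()))  # no NAs remain
```

Filling with a constant is just the simplest choice; imputing with per-gene medians from the training data would likely behave better, but either way the submission won't fail on NAs.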
Hi @martinguerrero89. The invalid submissions shouldn't have counted towards your total; that's been fixed, and you can now submit to the R1 queues as before. We are still deciding whether or not to release a feature list per dataset.

Submission invalid but cannot submit new because the maximum has been reached