This is the submission ID: 9706709.
UPDATE: After carefully comparing the running time and memory usage of the training steps on the synthetic data, both on my local machine and in your online environment, it appears that the synthetic data volume used in the online environment is much smaller than the synthetic data we downloaded. So the problem is likely that the UW data is larger than the online synthetic data, and my code's memory usage exceeded the 10G cap when training on the UW data (while staying within it on the synthetic data).
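For reference, this is roughly how I checked per-step memory locally (a minimal sketch in Python; train_one_step and num_steps are placeholders for my actual training loop):

    import resource
    import psutil

    def log_memory(step, proc=psutil.Process()):
        # Current resident set size of this process, in MiB
        rss_mib = proc.memory_info().rss / (1024 ** 2)
        # Peak RSS so far (ru_maxrss is reported in KiB on Linux)
        peak_mib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024
        print(f"step {step}: rss={rss_mib:.0f} MiB, peak={peak_mib:.0f} MiB")

    for step in range(num_steps):
        train_one_step()   # placeholder for the actual training step
        log_memory(step)

On the synthetic data the peak stayed well under the cap, which is why I suspect the difference comes from the UW data volume rather than a leak in the code.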
What is the size of the UW data? How many persons does it contain, and how many rows are in each table?