I'm getting confused about the 'preprocessing' and 'training' phases. It's good to have the new format for job monitoring, because it makes a few things clearer. I asked before about when the jobs are run, and the answer was that the training job only runs after the preprocessing job. However, I find that if I submit them both at the same time, they both start running. I need to look at the log file to see what has happened, but for some reason a log file isn't always created. My training phase always creates one, whereas my preprocessing phase seems to create one only if the output is too big and the job has aborted for that reason. What is required for a log file to be produced? As far as I can see, I'm sending output to stdout. Does it matter if I run one script from another?

Created by Peter Brooks (fustbariclation)
Dear Peter, apologies for the late reply. If the log file exceeds 1 MB, the pipeline stops capturing STDOUT and STDERR. The error message saying that the file exceeded 1 MB is not part of the log file's content. This small log-file size limit is an interim implementation; we are waiting for more information to see whether we actually need to limit the log file size at all. Does this help? You should run the preprocessing step first, before the training step, because your training step probably needs the files you pre-process. Best, Thomas
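To make the ordering explicit rather than relying on submission timing, one option is to chain the two steps in a small driver script that only launches training after preprocessing exits successfully. This is only a sketch of that idea, not the pipeline's actual implementation: the step commands (`PREPROCESS`, `TRAIN`) are hypothetical placeholders, and the 1 MB truncation here merely mimics the capture limit Thomas describes.

```python
import subprocess
import sys

MAX_LOG_BYTES = 1_000_000  # assumed 1 MB cap, mirroring the pipeline's limit

# Hypothetical step commands; replace with your real preprocessing/training scripts.
PREPROCESS = [sys.executable, "-c", "print('preprocessing done')"]
TRAIN = [sys.executable, "-c", "print('training done')"]

def run_step(cmd, log_path):
    """Run one pipeline step, writing its STDOUT and STDERR to a log file,
    truncated at MAX_LOG_BYTES. Returns the step's exit code."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    output = result.stdout + result.stderr
    with open(log_path, "w") as log:
        log.write(output[:MAX_LOG_BYTES])
    return result.returncode

# Run preprocessing first; only start training if it succeeded.
if run_step(PREPROCESS, "preprocess.log") == 0:
    run_step(TRAIN, "train.log")
```

Because the driver checks the preprocessing exit code before launching training, the two jobs can never run concurrently, and each step always leaves a log file regardless of how much it printed.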
