
Scheduled task parallelisation – Bash script doesn’t pass $i argument correctly to Python script

My goal is to start multiple instances of the same Python script in parallel from a scheduled task, so that the overall process finishes faster. Here’s a minimal example that reproduces the problem:

    #!/bin/bash

    for i in {1..2}
    do
        echo "Starting batch $i..."
        python3 /home/myuser/test_script.py $i &
    done

    wait
    echo "All done."

In test_script.py, I then read the command-line argument as batch_group = int(sys.argv[1]). However, I have now noticed that all instances of test_script seem to run with batch_group=1. So, for debugging purposes, I did the following:

    import sys
    print("sys.argv:", sys.argv, flush=True)

I'd now expect to see something like this in the task log:

    Starting batch 1...
    Starting batch 2...
    sys.argv: ['/home/myuser/test_script.py', '1']
    sys.argv: ['/home/myuser/test_script.py', '2']
    All done.

But instead I'm seeing:

    Starting batch 1...
    Starting batch 2...
    1
    1
    All done.

Could anyone explain what is going on here?

If you're getting that output, then the code you are running is not the code you have shown here. The code you've shown would print a "sys.argv: [...]" line for each of the two batches, and those lines are not in your log – it shows bare "1"s instead, so whatever produced them isn't the snippet above.
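
One way to check is to put a few diagnostic lines at the very top of the file you believe is being launched. This is only a sketch, assuming the script really lives at /home/myuser/test_script.py as in your bash snippet; __file__ and sys.executable are just standard Python ways to see which file and interpreter actually ran:

    import sys

    # Print which file and interpreter this process is actually running,
    # plus the raw argument list it received from the shell.
    print("running file:", __file__, flush=True)
    print("interpreter:", sys.executable, flush=True)
    print("sys.argv:", sys.argv, flush=True)

    # The same argument handling you described, made explicit.
    batch_group = int(sys.argv[1])
    print("batch_group:", batch_group, flush=True)

If these new lines never show up in the task log, the scheduled task is launching a different copy of test_script.py than the one you edited; if they do show up, the printed path and sys.argv tell you exactly what each instance received.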