Pass environment variables through ssh
I deployed batchspawner again on Comet with a deployment quite similar to my old setup: https://github.com/jupyterhub/jupyterhub-deploy-hpc/tree/master/batchspawner-xsedeoauth-sshtunnel-sdsccomet
Here I ssh into a Comet login node to submit jobs, so the JupyterHub environment variables need to survive the SSH session so that they can then be passed on into the SLURM job.
I am not sure how this could have worked in my old deployment, so it is possible I am missing something.
In my newer deployment I have to call ssh and explicitly pass all the variables, see:
https://gist.github.com/zonca/55f7949983e56088186e99db53548ded#file-spawner-py-L42
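Roughly, the approach amounts to prefixing the remote command with `env VAR=value ...` so the variables exist on the login node. A minimal sketch (host, variable names, and values are placeholders, not my actual config):

```python
import shlex


def build_ssh_cmd(host, env, remote_cmd):
    """Build an ssh command line that forwards selected environment
    variables by prefixing the remote command with `env VAR=value ...`.
    Sketch only; in the real spawner this wraps the sbatch call."""
    assignments = " ".join(
        f"{name}={shlex.quote(value)}" for name, value in env.items()
    )
    return f"ssh {host} env {assignments} {remote_cmd}"


# Example usage with a placeholder token:
cmd = build_ssh_cmd(
    "comet.sdsc.edu",
    {"JUPYTERHUB_API_TOKEN": "secret-token"},
    "sbatch",
)
```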
Everything works fine, but there must be a better way, any suggestions?
I'll contribute this as another example for https://github.com/jupyterhub/jupyterhub-deploy-hpc
This doesn't look too bad to me. There is now an `exec_prefix` option for which this sort of makes sense, but really it doesn't matter and I'd leave well enough alone. I don't know how it could have worked through ssh without something like this. Overall I'd say you are operating in the normal unixy range here, but someone smarter than me may have a better idea.
One idea would be to set these variables in the batch script itself, but I don't think we have the plumbing to get the variables and values there easily.
Looking at this again...
The SSH server has to accept the variables via `AcceptEnv` in `sshd_config` (the default is to accept nothing). Perhaps this was configured on the server before but isn't anymore?
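For reference, that would look something like this on each side (the exact variable pattern depends on the deployment):

```
# /etc/ssh/sshd_config on the login node (server side):
AcceptEnv JUPYTERHUB_*

# Client side, either in ~/.ssh/config:
#   SendEnv JUPYTERHUB_*
# or on the command line:
#   ssh -o SendEnv=JUPYTERHUB_API_TOKEN comet.sdsc.edu sbatch
```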
Do you think this issue can be closed now or should we try to do more?