Resource locking serializes concurrent buf executions?
Hello all,
It seems to me that buf convert (and maybe other commands) locks on some shared resource. If so, could this be changed, documented, or worked around on my side?
For example, in a hacky script to process a bunch of files I tried:
for ((i=1; i<=20; i++)); do
  echo "Starting $i" && \
  cat ... | buf convert ... > "file_$i.binpb" && \
  echo "Done with $i" &  # <-- Spawning processes to hopefully do some work in parallel
done
As you can see on the fourth line, I spawn the processes in the background to try to speed things up. However, the total runtime was the same as when I ran them sequentially, so it seems the buf convert processes are queueing up, waiting on some resource.
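For reference, here is a cleaner version of that loop using xargs -P, which runs a fixed pool of workers instead of spawning 20 raw background jobs at once. This is only a sketch: do_convert is a hypothetical stand-in for the real cat | buf convert pipeline, simulated here with a sleep.

```shell
#!/bin/bash
# Sketch of the same workload driven by xargs -P (bounded parallelism).
# do_convert is a hypothetical stand-in for the real conversion pipeline.
do_convert() {
  # In reality this would be something like:
  #   cat "input_$1.json" | buf convert ... > "file_$1.binpb"
  sleep 0.1
  echo "Done with $1"
}
export -f do_convert  # make the function visible to the bash -c subshells

# Run 20 conversions, at most 4 at a time.
seq 1 20 | xargs -P 4 -I{} bash -c 'do_convert "$@"' _ {}
```

If buf itself serializes on a lock, this of course won't help either, but it makes the intended concurrency explicit.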
It's possible. Depending on what is in ..., for example, it may be making a network call. You'd have to give us a specific, reproducible example for us to investigate. If you can provide that, we're happy to look into it.
@bufdev It's all local:
cat fooRequest.json | buf convert ./my_proto_project --type="com.mycompany.FooRequest" --from -#format=json --to -#format=binpb
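To check whether the invocations really serialize, a rough timing harness like the following could help: if the background jobs truly run in parallel, five of them should take roughly as long as one, not five times as long. This is only a sketch, with the_cmd as a hypothetical stand-in (a sleep) for the real buf convert pipeline.

```shell
#!/bin/bash
# the_cmd is a hypothetical stand-in for the real buf convert pipeline.
the_cmd() { sleep 0.3; }

# Five invocations in the background, then wait for all of them.
start=$(date +%s%N)
for i in 1 2 3 4 5; do the_cmd & done
wait
parallel_ms=$(( ($(date +%s%N) - start) / 1000000 ))

# The same five invocations run strictly one after another.
start=$(date +%s%N)
for i in 1 2 3 4 5; do the_cmd; done
serial_ms=$(( ($(date +%s%N) - start) / 1000000 ))

echo "parallel: ${parallel_ms}ms, serial: ${serial_ms}ms"
```

With a real parallel command, parallel_ms stays near the cost of a single run while serial_ms is about five times that; if both numbers come out similar with buf convert substituted in, something is serializing the processes.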
We need something that we can run locally. If you can set up, for example, a temporary GitHub repository with the code you are executing, so that we can run it and reproduce the issue, we're happy to look into it!
No problem. I'll get to that soon!
Closing since we can't reproduce, but if you can provide us a reproducible example, we're happy to look into it!