llm_describe_ebm_graph function not working
Hi, I wanted to try out your interesting project locally, but I currently receive the error message
"raise Exception(f"Too many (more than {self.llm.max_retries}) OpenAI API RateLimitError's in a row!") Exception: Too many (more than 5) OpenAI API RateLimitError's in a row!"
when calling the "llm_describe_ebm_graph(...)" function.
It looks like too many requests were sent to the API within a short period of time, more than the API allows. I suppose this problem did not occur when you made your repo public 8 months ago, but rather stems from changes in newer versions of the guidance and openai packages. I did find that your t2ebm package only seems to work with openai package versions >= 0.27.10 and <= 0.28.1; when using a newer version, I instead receive the following error message:
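For reference, this is how I pin the openai package to the range that works for me (the version bounds come from my own testing above, so they may not be the exact supported range):

```shell
# Pin openai to the last version range that t2ebm appears to accept
pip install "openai>=0.27.10,<=0.28.1"
```

With this pin in place, the import error below goes away, but the RateLimitError on top still occurs.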
"File "[...]\talk2ebm\Lib\site-packages\t2ebm\utils.py", line 24, in
At first I thought this error (the RateLimitError quoted at the top) might be caused by the complexity of my dataset, so I tried calling the function on the Spaceship Titanic dataset (the one you used in your demo notebook), but the error still occurs. I also tried calling the function with a different api_key from another OpenAI account, but the error persists.
It would be very nice if you could try to replicate the error, to confirm that this is in fact an issue caused by an outdated "llm_describe_ebm_graph(...)" function.