balajiChundi
This is clearly a prompt mistake. Can you share your training config settings?
Sorry to hear this, but I can't find any problem here. Can you check your dependencies, as @2021xyl suggested? If I remember correctly, the transformers version caused a lot of trouble...
"Sending in multiple pages for each request", if you define your use case like this - model's max_positional_embeddings (you might have to parameter tune) might not be sufficient to incorporate...
First and preferred way: get predictions from the model twice, once per page (for a two-page invoice); you can parallelize the model predictions for faster output....
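The per-page approach above can be sketched like this. This is a minimal sketch: `predict_page` is a hypothetical stand-in for a single-page model call (e.g. one `model.generate(...)` invocation), not the actual API.

```python
from concurrent.futures import ThreadPoolExecutor

def predict_page(page_image):
    # Hypothetical stand-in for running the model on one page;
    # replace with your real single-page inference call.
    return {"page": page_image, "fields": {}}

def predict_invoice(pages):
    # One prediction per page, run in parallel; pool.map preserves
    # the input order, so results[i] corresponds to pages[i].
    with ThreadPoolExecutor(max_workers=len(pages)) as pool:
        return list(pool.map(predict_page, pages))
```

You can then merge the per-page field dictionaries into one invoice-level result.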
return paj.read_json(f)
  File "pyarrow/_json.pyx", line 258, in pyarrow._json.read_json
  File "pyarrow/error.pxi", line 144, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 100, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: JSON parse error: Column() changed from object to string...
I am facing a similar issue: memory gets allocated during requests and is never deallocated afterwards, which eventually makes the application crash. I performed memory profiling...
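For this kind of per-request leak, a quick way to locate the growth sites is the standard-library tracemalloc module. A minimal sketch (the wrapped callable stands in for one request handler):

```python
import tracemalloc

def profile_block(fn):
    # Snapshot heap allocations before and after fn() and return the
    # per-line allocation diffs; lines that keep growing across
    # repeated requests are leak candidates.
    tracemalloc.start()
    before = tracemalloc.take_snapshot()
    fn()
    after = tracemalloc.take_snapshot()
    tracemalloc.stop()
    return after.compare_to(before, "lineno")
```

Calling `profile_block(handle_request)` around a few requests and comparing the top entries usually narrows the leak down to a specific line.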
I have faced a similar problem and identified that the training set had some ambiguous entries: a) your training set might have outliers, in this case, text that is...
I have around 15k images and trained for 10 epochs, which resulted in repetitions in some cases; around 10-15% of the images I validated on have this problem...
I figured I had made a mistake in preparing the training data, so even after training for 20 epochs there were repetitions. I rectified the error in data prep...
Hey @jackkwok, please share the experiment outcome!!