Amit Arora

Results: 15 comments by Amit Arora

Exact same request from my side as well and for the same reasons as @jeremyjensen3.

I had the same problem while using langchain in AWS Lambda; the way I resolved this was by downloading the numpy wheel from [here](https://files.pythonhosted.org/packages/f4/f4/45e6e3f7a23b9023554903a122c95585e9787f9403d386bafb7a95d24c9b/numpy-1.24.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl) on the [PyPI numpy files](https://pypi.org/project/numpy/1.24.2/#files) download page...
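A minimal sketch of that workaround, assuming a Python 3.9 Lambda runtime and the standard `python/lib/python3.9/site-packages` layer layout; the `layer/` output folder and target path are illustrative choices, and the wheel URL is the one linked above:

```python
# Sketch: vendor the manylinux numpy wheel into a Lambda layer directory.
# Assumes a Python 3.9 Lambda runtime; the layer/ folder name is arbitrary.
import io
import urllib.request
import zipfile

WHEEL_URL = "https://files.pythonhosted.org/packages/f4/f4/45e6e3f7a23b9023554903a122c95585e9787f9403d386bafb7a95d24c9b/numpy-1.24.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"

# Standard layout for a Python Lambda layer: the runtime adds
# python/lib/python3.9/site-packages to sys.path at cold start.
TARGET_DIR = "layer/python/lib/python3.9/site-packages"

with urllib.request.urlopen(WHEEL_URL) as resp:
    wheel_bytes = resp.read()

# A .whl file is a zip archive, so it can be extracted directly
# into the site-packages directory of the layer.
with zipfile.ZipFile(io.BytesIO(wheel_bytes)) as whl:
    whl.extractall(TARGET_DIR)

print(f"numpy unpacked into {TARGET_DIR}")
```

After this, zipping the `layer/` folder and publishing it as a Lambda layer (or bundling the same directory tree into the deployment package) gets the Linux-built numpy onto Lambda without a local compile.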

@reachlin please see https://github.com/aws-samples/llm-apps-workshop/blob/main/blogs/rag/api/deploy.sh for a recipe to package dependencies including numpy and langchain into AWS Lambda.

@Duonghailee, you need to create a Python 3.9 conda environment first and then run this script from within that environment. Yes, I have seen this error and it will...

Thanks for the quick reply. I have provided the icons via the options argument; please see the code snippet I posted. I had looked at several examples, including the one...

Any workarounds or suggestions? I really want this feature for a very important project; any help would be greatly appreciated.

@madhurprash please get details from Shamik and add here in this ticket.

Do this in the same way we have the bring-your-own deployment script: there is an `inference` function which is called from the run inference notebook.
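A hypothetical sketch of such a module, assuming a SageMaker endpoint behind it; the only thing taken from the comment is that the module exposes an `inference` function the run-inference notebook calls, while the parameter names and payload shape here are made up for illustration:

```python
# Hypothetical bring-your-own inference module; the actual signature and
# payload format in the repo may differ.
import json
from typing import Any, Dict

import boto3  # assumed dependency; the workshop code runs on AWS


def inference(endpoint_name: str, payload: Dict[str, Any]) -> Dict[str, Any]:
    """Send `payload` to the deployed endpoint and return the parsed response."""
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps(payload),
    )
    return json.loads(response["Body"].read())
```

The run-inference notebook would then simply import this function and call it with the endpoint name and a prompt payload.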

We are working on a new FMBench release which supports hosting both the LLMs and FMBench on the same EC2 instance (so no SageMaker, no S3 dependency) and this should...