p9anand

Results: 9 comments of p9anand

I have faced the same issue: after each image, memory consumption keeps increasing. Any solution yet?

Did you get a chance to look at why memory consumption keeps increasing during inference when you pass multiple images one by one?

@piernikowyludek: Thanks, it worked. There is no longer a memory leak during inference.

Yes, I tried the following code:

from petastorm.spark import SparkDatasetConverter, make_spark_converter
# Specify a cache dir first; the dir is used to save materialized Spark DataFrame files.
spark.conf.set(SparkDatasetConverter.PARENT_CACHE_DIR_URL_CONF, 's3a://****/*****/')
df_train, ...
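For context, a minimal runnable sketch of the converter flow this snippet is truncated from, assuming an existing SparkSession named spark, a Spark DataFrame df_train, and a placeholder local cache URL in place of the original s3a path:

```python
# Sketch of the petastorm SparkDatasetConverter setup (assumptions: `spark` is an
# active SparkSession, `df_train` is a Spark DataFrame, and the cache URL below is a
# placeholder for the s3a:// path used in the original comment).
from petastorm.spark import SparkDatasetConverter, make_spark_converter

# Specify a cache dir first; it is used to save the materialized Spark DataFrame files.
spark.conf.set(SparkDatasetConverter.PARENT_CACHE_DIR_URL_CONF,
               'file:///tmp/petastorm_cache')

# Materialize the DataFrame to Parquet under the cache dir.
converter_train = make_spark_converter(df_train)

# Read the materialized data back as a tf.data.Dataset.
with converter_train.make_tf_dataset(batch_size=32) as dataset:
    for batch in dataset.take(1):
        print(batch)

# Remove the cached files once they are no longer needed.
converter_train.delete()
```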

Hi @selitvin, thanks for the response. In my org we use the s3a protocol because it supports larger files. I can try using the s3 protocol. Is there a plan to...

Error while using S3 buckets:

Working perfectly! Thanks.

Can we pass the augment argument at inference time?