jccarles

Results: 9 comments of jccarles

I want to pass some strings that help define the specifications of the distributed computing cluster I want to spawn. Do you think the type of information is relevant?

Also, to be more precise, I would like to use my custom executor operator with the KubeflowDagRunner. I seem to have an additional issue, which is to make aware the...

I think in the end my issue is with the container_entrypoint.py for Kubeflow-orchestrated pipelines, as it does not expose the `platform_config` and `custom_executor_operators` parameters. The `platform_config` is not exposed...
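For readers of the thread, here is a rough sketch of what a custom executor operator can look like. It assumes the `BaseExecutorOperator` interface under `tfx.orchestration.portable`; the class name, supported spec types, and body are hypothetical, and exact module paths and signatures may differ between TFX versions.

```python
# Hypothetical sketch of a custom executor operator; not the actual code
# discussed in this thread, and signatures may vary by TFX version.
from tfx.orchestration.portable import base_executor_operator
from tfx.orchestration.portable import data_types
from tfx.proto.orchestration import executable_spec_pb2
from tfx.proto.orchestration import execution_result_pb2


class FlinkExecutorOperator(base_executor_operator.BaseExecutorOperator):
  """Hypothetical operator that would run component executors on a Flink cluster."""

  SUPPORTED_EXECUTOR_SPEC_TYPE = [executable_spec_pb2.PythonClassExecutableSpec]
  SUPPORTED_PLATFORM_CONFIG_TYPE = []

  def run_executor(
      self, execution_info: data_types.ExecutionInfo
  ) -> execution_result_pb2.ExecutorOutput:
    # Configure or contact the Flink cluster here, run the wrapped executor,
    # then report its result back to the orchestrator.
    return execution_result_pb2.ExecutorOutput()
```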

Yes, exactly, I am using `beam-worker-pool` as a sidecar for all my Flink task managers. And sorry, indeed it is not clear: `pipeline_functions` is my custom module which holds the `preprocessing_fn`.
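For context, a minimal sketch of what such a module file could look like; the feature names and transformations below are placeholders, not the real `pipeline_functions` module.

```python
# pipeline_functions.py -- hypothetical sketch of a user module holding the
# preprocessing_fn consumed by the TFX Transform component.
import tensorflow_transform as tft


def preprocessing_fn(inputs):
  """Callback invoked by Transform; `inputs` maps feature names to tensors."""
  outputs = {}
  # Scale a hypothetical numeric feature to z-scores.
  outputs['age_scaled'] = tft.scale_to_z_score(inputs['age'])
  # Build and apply a vocabulary for a hypothetical categorical feature.
  outputs['country_id'] = tft.compute_and_apply_vocabulary(inputs['country'])
  # Pass the label through unchanged.
  outputs['label'] = inputs['label']
  return outputs
```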

I am pointing to a module file. When I was using GCS I used packaged user modules and `force_tf_compat_v1=False` and it was working fine. Now with Beam on Flink I...
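For reference, a hedged sketch of how a Transform component can be pointed at a module file with `force_tf_compat_v1=False`, together with the Beam pipeline args commonly used to target a Flink portable runner backed by the `beam-worker-pool` sidecar. The module path, endpoints, and upstream components are placeholders, not the actual setup from this thread.

```python
from tfx.components import Transform

# example_gen and schema_gen are assumed upstream components already defined
# in the pipeline; the module path below is a placeholder.
transform = Transform(
    examples=example_gen.outputs['examples'],
    schema=schema_gen.outputs['schema'],
    module_file='gs://my-bucket/modules/pipeline_functions.py',
    force_tf_compat_v1=False,
)

# Beam args often used with a Flink portable runner whose task managers run
# the beam-worker-pool sidecar (hence environment_type=EXTERNAL).
beam_pipeline_args = [
    '--runner=PortableRunner',
    '--job_endpoint=flink-jobmanager:8099',
    '--environment_type=EXTERNAL',
    '--environment_config=localhost:50000',
]
```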

Oh nice, I see! Were you able to include those custom Kubeflow components in a TFX-generated pipeline? I tried the approach of spawning a Flink cluster for each...

Thank you, I will try this solution, it looks quite elegant! I might have one last question! Have you ever experienced a Flink job hanging forever, I think over...

Hello! Thank you for your answer, here is the eval_config used. We used fake, very low bounds for testing. ``` { "model_specs": [ { "signature_name": "serving_default", "label_key": "label", "preprocessing_function_names":...
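Since the snippet above is cut off, here is, for illustration only, the general shape such a config can take when written with the tensorflow_model_analysis (TFMA) Python API instead of JSON. The metric, threshold values, and slicing below are hypothetical and are not the actual config from this thread.

```python
import tensorflow_model_analysis as tfma

# Hypothetical EvalConfig with deliberately low bounds, mirroring the idea of
# "fake, very low bounds for testing"; not the config used in this thread.
eval_config = tfma.EvalConfig(
    model_specs=[
        tfma.ModelSpec(
            signature_name='serving_default',
            label_key='label',
            preprocessing_function_names=['transform_features'],
        )
    ],
    slicing_specs=[tfma.SlicingSpec()],  # overall slice only
    metrics_specs=[
        tfma.MetricsSpec(
            metrics=[
                tfma.MetricConfig(
                    class_name='BinaryAccuracy',
                    threshold=tfma.MetricThreshold(
                        value_threshold=tfma.GenericValueThreshold(
                            lower_bound={'value': 0.01}
                        )
                    ),
                )
            ]
        )
    ],
)
```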

Hello, thank you for checking this issue. Did you have time to take a look? Have you identified anything so far? Can I help somehow?