Idan Yael
Hi, I'm experiencing an issue trying to sync a GCP secret as a Kubernetes Secret. Using the same method without the mapping to a k8s object works well and I...
If you are not using the async methods, you can work around the issue by disabling multiprocessing (that's my case...). ``` class dummy:     def close(self):         pass     def join(self):         pass     def... ```
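For context, a minimal sketch of that workaround, assuming the library expects a `multiprocessing.Pool`-like object with `close`/`join` (the `map` method here is an extra assumption added for illustration, since the original snippet is truncated):

```python
class DummyPool:
    """Stand-in for multiprocessing.Pool that runs work synchronously.

    Only the subset of the Pool interface the library actually calls
    needs to exist; close/join come from the comment above, map is an
    assumed addition.
    """

    def close(self):
        pass  # nothing to shut down

    def join(self):
        pass  # no worker processes to wait for

    def map(self, func, iterable):
        # Execute in the current process instead of a worker pool.
        return [func(item) for item in iterable]


if __name__ == "__main__":
    pool = DummyPool()
    print(pool.map(lambda x: x * 2, [1, 2, 3]))  # [2, 4, 6]
    pool.close()
    pool.join()
```

The idea is simply to hand this object to the code path that expects a pool, so all work runs in-process and the multiprocessing machinery is bypassed.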
It was a long time ago, but if I remember correctly, you have to mount it to the file system in order to trigger the driver, and only then it...
Encapsulating a pipeline into a `Pipeline` object and having the `PipelineRunner` accept multiple `Pipeline`s (as an array) sounds good to me. That will enable us to create multiple pipelines running sequentially...
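A rough sketch of that shape; the internals of `Pipeline` and `PipelineRunner` here are illustrative assumptions, not the project's actual API:

```python
from typing import Callable, Dict, List

Step = Callable[[Dict], Dict]


class Pipeline:
    """Illustrative pipeline: an ordered list of context-transforming steps."""

    def __init__(self, steps: List[Step]):
        self.steps = steps

    def run(self, context: Dict) -> Dict:
        for step in self.steps:
            context = step(context)
        return context


class PipelineRunner:
    """Accepts multiple Pipelines (as an array) and runs them sequentially."""

    def __init__(self, pipelines: List[Pipeline]):
        self.pipelines = pipelines

    def run(self) -> Dict:
        context: Dict = {}
        for pipeline in self.pipelines:
            # Each pipeline starts from the context the previous one produced.
            context = pipeline.run(context)
        return context
```

With this shape, `PipelineRunner([first, second]).run()` runs the pipelines in order while threading one shared context through them.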
`AsyncQueryResult`, `QuerySubscription`, and `AsyncMarketProvider` were written entirely to enable **history** loading from IB (`IBMarketProvider`). As you can see, the single function there is `request_symbol_history`. Ideally, in order to implement streaming...
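To make the history-vs-streaming distinction concrete, a hedged sketch: only `request_symbol_history` is a name from the code above; the provider shape, `stream_symbol`, and the sample data are assumptions.

```python
import asyncio
from typing import AsyncIterator, Dict, List


class MarketProvider:
    """Illustrative provider contrasting one-shot history with streaming."""

    async def request_symbol_history(self, symbol: str) -> List[Dict]:
        # History: a single request returning a finite, complete result.
        return [{"symbol": symbol, "close": 100.0}]

    async def stream_symbol(self, symbol: str) -> AsyncIterator[Dict]:
        # Streaming: an open-ended sequence of incremental updates.
        for close in (100.0, 100.5):
            yield {"symbol": symbol, "close": close}


async def demo() -> int:
    provider = MarketProvider()
    history = await provider.request_symbol_history("AAPL")
    updates = [u async for u in provider.stream_symbol("AAPL")]
    return len(history) + len(updates)


if __name__ == "__main__":
    print(asyncio.run(demo()))  # 3
```

The key design difference is the return shape: history is awaited once and completes, while streaming is consumed as an async iterator with no natural end.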
I'm not familiar enough with many brokers' streaming APIs to know how the interface should be designed. There are APIs that are symbol-based, where you have to request symbols one...
> **Problem:**
>
> When running multiple Pipelines with the same context and reintroducing the same candle from a source, the cache will append duplicates, causing problems for the rest...
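One way the append-duplicates issue could be avoided, sketched with assumed names: key the cache on candle identity (symbol and timestamp) so that reintroducing the same candle overwrites instead of appending.

```python
from typing import Dict, List, Tuple


class CandleCache:
    """Illustrative cache keyed on (symbol, timestamp) to stay idempotent."""

    def __init__(self):
        self._candles: Dict[Tuple[str, int], dict] = {}

    def add(self, symbol: str, timestamp: int, candle: dict) -> None:
        # Reintroducing the same candle overwrites the existing entry,
        # so running multiple pipelines over one source cannot duplicate it.
        self._candles[(symbol, timestamp)] = candle

    def get(self, symbol: str) -> List[dict]:
        # Return the symbol's candles in timestamp order.
        return [
            candle
            for (sym, _), candle in sorted(self._candles.items())
            if sym == symbol
        ]
```

This trades list-append semantics for dict-upsert semantics, which is exactly the idempotency the quoted problem needs.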
> Note: for serialization, I was exploring how this could be done with lambdas, and I found this library, which extends the pickle library (https://pypi.org/project/cloudpickle/), but arbitrary code execution...
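To illustrate why cloudpickle comes up at all: the standard `pickle` module serializes functions by reference (module plus qualified name) and therefore cannot handle lambdas, which cloudpickle works around by embedding the code object itself (the arbitrary-code-execution caveat above applies when loading untrusted data). A small stdlib-only check:

```python
import pickle


def can_pickle(obj) -> bool:
    """Return True if the standard pickle module can serialize obj."""
    try:
        pickle.dumps(obj)
        return True
    except Exception:
        return False


# Plain data pickles fine; a lambda does not, because its qualified
# name is "<lambda>" and pickle's lookup-by-reference fails.
# cloudpickle.dumps would succeed on the lambda instead.
print(can_pickle([1, 2, 3]))    # True
print(can_pickle(lambda x: x))  # False
```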
I read your use case and understand the issues. I'll give you a few pointers that I think might help clear things up. 1. Instead of having the whole `ContextInjectionFactorySource`...
Sorry, I meant `Source.read()`, so that when the second pipeline starts reading its source, it gets the context from the former pipeline. This answers the `MongoDBSource` question as...
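A minimal sketch of that handoff, with assumed class shapes (`Source`, `ListSource`): the runner passes the first pipeline's resulting context into the second source's `read()`, so every item the second pipeline reads carries the upstream state.

```python
from typing import Iterable, Iterator, Tuple


class Source:
    """Illustrative base: a source reads items in light of a shared context."""

    def read(self, context: dict) -> Iterator[Tuple[dict, object]]:
        raise NotImplementedError


class ListSource(Source):
    def __init__(self, items: Iterable):
        self.items = list(items)

    def read(self, context: dict) -> Iterator[Tuple[dict, object]]:
        # When the second pipeline starts reading, `context` already
        # holds whatever the former pipeline produced.
        for item in self.items:
            yield context, item
```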