Johannes Alkjær
You can simply put the firmware right after the bootloader (0x1000 = 4096 bytes, which is the size of the bootloader): `st-flash --flash=128k write umk4x4.stm32f103cb-128k-o3.bin 0x08001000`
The two WebhookConfigurations are first created by the Helm template:

* https://github.com/GoogleCloudPlatform/flink-on-k8s-operator/blob/master/helm-chart/flink-operator/templates/flink-operator.yaml#L11
* https://github.com/GoogleCloudPlatform/flink-on-k8s-operator/blob/master/helm-chart/flink-operator/templates/flink-operator.yaml#L386

The cert-job is then started, and it applies a version of the webhookConfig with the...
As a manual workaround, that obviously works (fsGroup: 9999) and could be applied as a patch. But I'll stick to my current workaround, as that doesn't introduce an additional step...
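For reference, the fsGroup workaround mentioned above would look something like this as a pod spec fragment (the group id 9999 comes from the comment; the field placement follows the standard Kubernetes podSecurityContext):

```yaml
# Illustrative pod spec fragment: setting fsGroup so mounted volumes
# are group-owned and writable by the container's processes.
spec:
  securityContext:
    fsGroup: 9999
```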
It's now supported in the master version. The helm chart in master is missing some permissions, and for some reason the CRD doesn't actually declare the new fields, but they...
I have tried the fsnotifier patch suggested in https://youtrack.jetbrains.com/issue/IDEA-126491/File-watcher-failed-to-start-table-error-collision-messages-in-the-log#focus=Comments-27-5914263.0-0 — the above error disappears, but the "scanning files" stall is still there.
> Yes, patch helped with non bazel projects and with previous versions of IDEA (2021.3 and before)

I'm on 2022.1.2 and had to apply the patched version in order for...
The `less -K` trick only works if the pipe is executed under bash and not /bin/sh (dash on my system). k9s will still complain with the < Ruroh? signal:...
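For anyone hitting the same thing, forcing the pipe through bash in a k9s plugin entry might look like this (a sketch based on the plugin format in the k9s docs; the shortcut and log command are assumptions):

```yaml
# ~/.config/k9s/plugins.yaml — illustrative only
plugins:
  logs-less:
    shortCut: Shift-L
    description: Logs via less
    scopes:
      - pods
    command: bash   # force bash so `less -K` exits cleanly on Ctrl-C
    args:
      - -c
      - "kubectl logs -f $NAME -n $NAMESPACE --context $CONTEXT | less -K"
```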
The webhook only adds volumes if the driver/executor has a volumeMount for them: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/webhook/patch.go#L138-L143. The same goes for configMaps: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/webhook/patch.go#L335C1-L339. The code doesn't check whether a driver/executor initContainer/sidecar mounts the...
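In other words, a volume is only patched in when a matching volumeMount appears on the driver/executor spec — something like this hypothetical SparkApplication fragment:

```yaml
# Hypothetical SparkApplication fragment: the webhook adds "config-vol"
# only because the driver declares a volumeMount with the same name.
spec:
  volumes:
    - name: config-vol
      configMap:
        name: my-config          # assumed configMap name
  driver:
    volumeMounts:
      - name: config-vol         # must match the volume name above
        mountPath: /opt/spark/conf-extra
```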
The operator is not hard-wired to any specific Spark version, so you can just package up your application with a Spark 3.4.0 image as base, and I'm certain that...
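As a sketch, packaging an application on a Spark 3.4.0 base could look like this (the base image tag and jar path are assumptions, not something the operator mandates):

```dockerfile
# Illustrative only: build an application image on a Spark 3.4.0 base.
FROM apache/spark:3.4.0
# Copy the application jar into the image (path is an assumption).
COPY target/my-app.jar /opt/spark/examples/jars/my-app.jar
```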
I think you need to be more specific. Try to submit the SparkPi example from the getting-started guide and report what issues you are seeing (i.e. copy error messages...