Feature Request: Add an arbitrary notification message field to identify the cluster Flagger is running in
Describe the feature
What problem are you trying to solve? Flagger's Slack notifications are great, but if multiple clusters run Flagger and all of them send notifications to the same Slack channel, it's hard to tell which cluster a canary is happening in. For example, if we have a workload called my-app running in a namespace called my-ns in both a staging and a prod cluster, we get identical messages in Slack:
my-app.my-ns
New revision detected, progressing canary analysis.
Target: Deployment/my-app.my-ns
Failed checks threshold: 5
Progress deadline: 60s
Traffic routing: Weight step: 10 max: 70
Proposed solution
What do you want to happen? Add any considered drawbacks. Add a new flag to Flagger (say --slack-msg-prepend) that accepts an arbitrary string to be shown in the Slack message, possibly prepended to the message title. This could be used to identify the cluster that Flagger lives in. If the flag isn't passed, the Slack message stays the same as today. A rough sketch of how it might be wired in is shown below.
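As a sketch only: --slack-msg-prepend is the hypothetical flag proposed here, not an existing Flagger option; the -slack-url and -slack-channel flags shown for context do exist today.

```yaml
# Hypothetical wiring of the proposed flag on the Flagger deployment.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flagger
  namespace: flagger-system
spec:
  template:
    spec:
      containers:
        - name: flagger
          image: ghcr.io/fluxcd/flagger:1.x
          args:
            - -slack-url=$(SLACK_URL)
            - -slack-channel=general
            # Hypothetical flag from this proposal: an arbitrary string
            # prepended to every Slack message title.
            - -slack-msg-prepend=[staging-eu-1]
```

With something like this, the example message above would start with "[staging-eu-1] my-app.my-ns ...", making the source cluster obvious at a glance.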
Any alternatives you've considered?
Renaming all of our Helm installs across our clusters so this info is surfaced as part of the workload name, but this isn't feasible for us. Sending Flagger Slack messages from each cluster to a separate channel, but this doesn't scale well.
Is there another way to solve this problem that isn't as good a solution? Not sure.
We could add a Summary field to the Alert API where people can describe the impact, environment, etc.
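Roughly, that might look like the sketch below on a Canary alert definition; the existing alerts fields (name, severity, providerRef) are real, while the summary field and its placement are only illustrative of this proposal.

```yaml
apiVersion: flagger.app/v1beta1
kind: Canary
metadata:
  name: my-app
  namespace: my-ns
spec:
  targetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-app
  # service and remaining analysis settings elided for brevity
  analysis:
    alerts:
      - name: on-call Slack
        severity: info
        providerRef:
          name: on-call
          namespace: flagger-system
        # Hypothetical field from this discussion: free-form text
        # included in alerts to identify the environment/impact.
        summary: "staging-eu-1 cluster"
```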
Yes, that would be great.
@stefanprodan we also really need this enhancement. We have many environments and would like a way to identify them; for production alone we have 6 dedicated clusters.
I have submitted https://github.com/fluxcd/flagger/pull/1041 for this.