utils.create_from_dict fails for custom objects
What happened (please include outputs or screenshots):
I want to do something like kubectl apply -f <yaml-file>, but with a Python Kubernetes API client that is dynamically generated for different clusters. I have a Knative distribution running on a cluster, but when I run utils.create_from_dict on a loaded Knative YAML object, it fails with the following error:
Traceback (most recent call last):
File "/Users/🙈/Documents/redox-backend/base/kluster.py", line 64, in <module>
utils.create_from_dict(kube, obj, True)
File "/opt/homebrew/anaconda3/envs/main/lib/python3.9/site-packages/kubernetes/utils/create_from_yaml.py", line 216, in create_from_dict
created = create_from_yaml_single_item(
File "/opt/homebrew/anaconda3/envs/main/lib/python3.9/site-packages/kubernetes/utils/create_from_yaml.py", line 242, in create_from_yaml_single_item
k8s_api = getattr(client, fcn_to_call)(k8s_client)
AttributeError: module 'kubernetes.client' has no attribute 'ServingKnativeDevV1Api'
From a basic inspection of the stack trace, implementing ServingKnativeDevV1Api looks like a lot of work. Is there a quick workaround? I don't need the full functionality of a ServingKnativeDevV1Api object; I just need to apply the YAML object, the way kubectl does.
What you expected to happen:
No error; the manifest should be applied successfully, just like kubectl does it.
How to reproduce it (as minimally and precisely as possible):
- Create a new cluster
- Install Knative.
- Use the following yaml file:
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "1"
        autoscaling.knative.dev/max-scale: "5"
    spec:
      containerConcurrency: 50
      containers:
        - image: <sample-image>
          ports:
            - containerPort: 8000
- Load the YAML file as a dict object in Python.
- Run utils.create_from_dict for that object (a minimal sketch of these two steps follows below).
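A minimal sketch of the last two steps, assuming a valid kubeconfig and the manifest above saved as service.yaml (the filename is only for illustration):

import yaml
from kubernetes import client, config, utils

config.load_kube_config()
kube = client.ApiClient()

# Load the manifest as a plain dict.
with open("service.yaml") as f:
    obj = yaml.safe_load(f)

# Fails with: AttributeError: module 'kubernetes.client' has no attribute 'ServingKnativeDevV1Api'
utils.create_from_dict(kube, obj, True)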
Anything else we need to know?:
Environment:
- Kubernetes version (kubectl version):
Client Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.5", GitCommit:"c285e781331a3785a7f436042c65c5641ce8a9e9", GitTreeState:"clean", BuildDate:"2022-03-16T15:51:05Z", GoVersion:"go1.17.8", Compiler:"gc", Platform:"darwin/arm64"}
Server Version: version.Info{Major:"1", Minor:"22", GitVersion:"v1.22.6-gke.300", GitCommit:"df413ee6225aa3fc539e18ca3464a48d723bd3ea", GitTreeState:"clean", BuildDate:"2022-01-24T09:29:08Z", GoVersion:"go1.16.12b7", Compiler:"gc", Platform:"linux/amd64"}
- OS (e.g., MacOS 10.13.6): MacOS 12.3.1
- Python version (python --version): 3.9.9
- Python client version (pip list | grep kubernetes): 23.3.0
/assign @fabianvf
Hello!
Here is a quick workaround that uses the dynamic client in place of utils.create_from_yaml and utils.create_from_dict for custom objects.
import pathlib
import yaml
import kubernetes


def apply_simple_item(dynamic_client: kubernetes.dynamic.DynamicClient, manifest: dict, verbose: bool = False):
    api_version = manifest.get("apiVersion")
    kind = manifest.get("kind")
    resource_name = manifest.get("metadata").get("name")
    namespace = manifest.get("metadata").get("namespace")
    # Discover the resource (built-in or CRD) from its apiVersion and kind.
    crd_api = dynamic_client.resources.get(api_version=api_version, kind=kind)
    try:
        # If the resource already exists, apply the manifest as a merge patch.
        crd_api.get(namespace=namespace, name=resource_name)
        crd_api.patch(body=manifest, content_type="application/merge-patch+json")
        if verbose:
            print(f"{namespace}/{resource_name} patched")
    except kubernetes.dynamic.exceptions.NotFoundError:
        # Otherwise, create it.
        crd_api.create(body=manifest, namespace=namespace)
        if verbose:
            print(f"{namespace}/{resource_name} created")


def apply_simple_item_from_yaml(dynamic_client: kubernetes.dynamic.DynamicClient, filepath: pathlib.Path, verbose: bool = False):
    with open(filepath, 'r') as f:
        manifest = yaml.safe_load(f)
    apply_simple_item(dynamic_client=dynamic_client, manifest=manifest, verbose=verbose)


# Usage
kubernetes.config.load_kube_config()
DYNAMIC_CLIENT = kubernetes.dynamic.DynamicClient(
    kubernetes.client.api_client.ApiClient()
)
apply_simple_item_from_yaml(DYNAMIC_CLIENT, "manifest.yml", verbose=True)
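For context: the function first tries to GET the resource; if it already exists, the manifest is applied as a JSON merge patch, otherwise it is created. This roughly mimics kubectl apply for simple objects, and because the dynamic client discovers resources from the API server, it works for CRDs like serving.knative.dev/v1 Service without any generated ServingKnativeDevV1Api class.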
Thanks, it seems to be working now!
The Kubernetes project currently lacks enough contributors to adequately respond to all issues and PRs.
This bot triages issues and PRs according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue or PR as fresh with /remove-lifecycle stale
- Mark this issue or PR as rotten with /lifecycle rotten
- Close this issue or PR with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
This bot triages issues and PRs according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue or PR as fresh with /remove-lifecycle rotten
- Close this issue or PR with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
This bot triages issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Reopen this issue with /reopen
- Mark this issue as fresh with /remove-lifecycle rotten
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/close not-planned
@k8s-triage-robot: Closing this issue, marking it as "Not Planned".
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
Hi, I have the same issue here when creating a resource of kind Workflow (for Argo Workflows) using utils.create_from_dict.
I get this error: module 'kubernetes.client' has no attribute 'ArgoprojIoV1alpha1Api'
but when I use the kubectl CLI it works fine with create -f file.yaml.
I tried the solution above, but I get this:
404
Reason: Not Found
HTTP response headers: HTTPHeaderDict({'Audit-Id': 'eee8ca83-533c-47cd-b6b1-a2e04566d0b2', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Kubernetes-Pf-Flowschema-Uid': '15bb149d-4605-405f-be86-f44fedbf33be', 'X-Kubernetes-Pf-Prioritylevel-Uid': 'd6fe364f-537d-4ef3-9f6b-31a9ee9729b9', 'Date': 'Tue, 22 Aug 2023 12:36:22 GMT', 'Content-Length': '200'})
HTTP response body: b'{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"namespaces \\"stg-argowf\\" not found","reason":"NotFound","details":{"name":"stg-argowf","kind":"namespaces"},"code":404}\n'
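For what it's worth, the 404 body says the stg-argowf namespace does not exist on the target cluster, so the Workflow has nowhere to be created. A likely fix is to create the namespace first (or point the manifest at an existing namespace). A minimal sketch using the same dynamic client, with the namespace name taken from the error message above:

import kubernetes

kubernetes.config.load_kube_config()
dynamic_client = kubernetes.dynamic.DynamicClient(
    kubernetes.client.api_client.ApiClient()
)

# "stg-argowf" comes from the error message above; adjust as needed.
ns_api = dynamic_client.resources.get(api_version="v1", kind="Namespace")
try:
    ns_api.get(name="stg-argowf")
except kubernetes.dynamic.exceptions.NotFoundError:
    ns_api.create(body={
        "apiVersion": "v1",
        "kind": "Namespace",
        "metadata": {"name": "stg-argowf"},
    })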