Deploying AWX Operator with kustomization.yaml gets stuck

I am trying to install the AWX Operator into minikube using a kustomization.yaml file, as the GitHub README describes: https://github.com/ansible/awx-operator#service-type. The first run deploys the awx-operator-controller-manager pod, and that works fine. Then I add the resource line to deploy the AWX demo, and when I run "kubectl logs -f deployments/awx-operator-controller-manager -c awx-manager", it only comes back with this:

{"level":"info","ts":1657068654.6148314,"logger":"cmd","msg":"Version","Go Version":"go1.16.9","GOOS":"linux","GOARCH":"amd64","ansible-operator":"v1.12.0","commit":"d3b2761afdb78f629a7eaf4461b0fb8ae3b02860"}
{"level":"info","ts":1657068654.6151662,"logger":"cmd","msg":"Watching single namespace.","Namespace":"awx"}
{"level":"info","ts":1657068654.7725334,"logger":"controller-runtime.metrics","msg":"metrics server is starting to listen","addr":"127.0.0.1:8080"}
{"level":"info","ts":1657068654.7742505,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_AWX_AWX_ANSIBLE_COM","default":2}
{"level":"info","ts":1657068654.7743437,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_AWXBACKUP_AWX_ANSIBLE_COM","default":2}
{"level":"info","ts":1657068654.774362,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_AWXRESTORE_AWX_ANSIBLE_COM","default":2}
{"level":"info","ts":1657068654.7743986,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"awx.ansible.com","Options.Version":"v1beta1","Options.Kind":"AWX"}
{"level":"info","ts":1657068654.7745326,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"awx.ansible.com","Options.Version":"v1beta1","Options.Kind":"AWXBackup"}
{"level":"info","ts":1657068654.7745917,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"awx.ansible.com","Options.Version":"v1beta1","Options.Kind":"AWXRestore"}
{"level":"info","ts":1657068654.776151,"logger":"proxy","msg":"Starting to serve","Address":"127.0.0.1:8888"}
{"level":"info","ts":1657068654.776901,"logger":"controller-runtime.manager","msg":"starting metrics server","path":"/metrics"}
I0706 00:50:54.776820 7 leaderelection.go:243] attempting to acquire leader lease awx/awx-operator…
I0706 00:51:09.884187 7 leaderelection.go:253] successfully acquired lease awx/awx-operator
{"level":"info","ts":1657068669.886074,"logger":"controller-runtime.manager.controller.awxbackup-controller","msg":"Starting EventSource","source":"kind source: awx.ansible.com/v1beta1, Kind=AWXBackup"}
{"level":"info","ts":1657068669.88644,"logger":"controller-runtime.manager.controller.awxbackup-controller","msg":"Starting Controller"}
{"level":"info","ts":1657068669.8860497,"logger":"controller-runtime.manager.controller.awx-controller","msg":"Starting EventSource","source":"kind source: awx.ansible.com/v1beta1, Kind=AWX"}
{"level":"info","ts":1657068669.8860552,"logger":"controller-runtime.manager.controller.awxrestore-controller","msg":"Starting EventSource","source":"kind source: awx.ansible.com/v1beta1, Kind=AWXRestore"}
{"level":"info","ts":1657068669.8869374,"logger":"controller-runtime.manager.controller.awxrestore-controller","msg":"Starting Controller"}
{"level":"info","ts":1657068669.8867931,"logger":"controller-runtime.manager.controller.awx-controller","msg":"Starting Controller"}
{"level":"info","ts":1657068669.9885616,"logger":"controller-runtime.manager.controller.awx-controller","msg":"Starting workers","worker count":8}
{"level":"info","ts":1657068669.9887462,"logger":"controller-runtime.manager.controller.awxbackup-controller","msg":"Starting workers","worker count":8}
{"level":"info","ts":1657068669.988807,"logger":"controller-runtime.manager.controller.awxrestore-controller","msg":"Starting workers","worker count":8}

It just stays there after that.
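For reference, my kustomization.yaml follows the README example, with the awx-demo.yaml resource added for the second step. Roughly like this (a sketch; <tag> stands in for whichever operator release is pinned, and awx-demo.yaml is the sample AWX spec from the repo):

```yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  # the operator itself, pinned to a release tag
  - github.com/ansible/awx-operator/config/default?ref=<tag>
  # the extra resource line added in the second step (sample AWX spec)
  - awx-demo.yaml
images:
  - name: quay.io/ansible/awx-operator
    newTag: <tag>
namespace: awx
```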

Not sure what is going on or what I am doing wrong.

Hey.

Delete the deployment and try again.

This happened to me yesterday with minikube and kustomize, on my Ubuntu server (bare metal, 16 GB RAM / 8 CPU cores, with minikube given 4 CPUs / 8 GB RAM).
The commands you need to run are:

kubectl delete namespace awx, or minikube kubectl -- delete namespace awx in case you haven't set your alias yet or do not have kubectl installed.
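In other words, one of these (the second form uses the kubectl bundled with minikube):

```shell
# if kubectl is installed or aliased
kubectl delete namespace awx

# if you only have the kubectl that ships with minikube
minikube kubectl -- delete namespace awx
```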

Then run the kustomize deployment again.
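Assuming your kustomization.yaml is in the current directory and the namespace is awx, the re-deploy step is something like:

```shell
# re-apply the kustomization (same as the first deployment)
kubectl apply -k .

# then watch the operator logs again once the pod is back up
kubectl logs -f deployments/awx-operator-controller-manager -c awx-manager -n awx
```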

If you reach the same error, rinse and repeat.

Hope this helps.