
Bug: Azure/k8s-deploy@v4 doesn't seem to be working after upgrading to AKS 1.24.9 #282

@ealasgarov

Description

What happened?

I upgraded my private cluster to the latest stable version, 1.24.9, and since then I cannot get the pipeline to work.
(I am also using a new service principal [Azure credentials] and a new ClusterRole/ClusterRoleBinding for that service principal, but I don't think the issue is there.)
I have deployed this way before with no problems, but now the deploy step fails with "Error: undefined".

Here's my pipeline:

name: deploy-test
on: workflow_dispatch
jobs:
  deploy:
    runs-on: platform-aks-runner
    steps:
      - name: Checkout source code 
        uses: actions/checkout@v3
      - name: Set up kubelogin for non-interactive login
        run: |
          sudo rm -f /usr/local/bin/kubelogin
          curl -LO https://github.com/Azure/kubelogin/releases/download/v0.0.28/kubelogin-linux-amd64.zip
          sudo unzip -j kubelogin-linux-amd64.zip -d /usr/local/bin
          rm -f kubelogin-linux-amd64.zip
          kubelogin --version
      - name: Azure login
        id: login
        uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS_DEV }}
      - name: Set AKS context
        id: set-context
        uses: azure/aks-set-context@v3
        with:
          resource-group: '${{ secrets.RESOURCE_GROUP_DEV }}' 
          cluster-name: '${{ secrets.CLUSTER_DEV }}'
          admin: 'false'
          use-kubelogin: 'true'
      - name: Setup kubectl
        id: install-kubectl
        uses: azure/setup-kubectl@v3
      - name: Deploy to AKS
        id: deploy-aks
        uses: Azure/k8s-deploy@v4
        with:
          resource-group: '${{ secrets.RESOURCE_GROUP_DEV }}' 
          name: '${{ secrets.CLUSTER_DEV }}'
          private-cluster: true 
          action: deploy
          force: true
          strategy: basic
          namespace: 'mynamespace'
          manifests: |
             ./resources.yaml
          images: '${{ secrets.registry_dev }}.azurecr.io/myrepo/myimage:latest'

I am not sure what else the issue could be. If the problem were with the credentials, I would expect a different error.
@OliverMKing any ideas perhaps?
Many thanks in advance!
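
To isolate whether the failure comes from the Azure/k8s-deploy action itself or from the cluster connection, a direct kubectl apply against the same kubeconfig should surface the underlying API-server error instead of "Error: undefined". This is a hypothetical extra debugging step (the step name is made up; the manifest path and namespace mirror the pipeline above):

```yaml
      # Hypothetical debugging step: bypass Azure/k8s-deploy and apply the
      # same manifest directly. kubectl prints the real error (auth, RBAC,
      # API deprecation after 1.24, etc.) rather than "Error: undefined".
      - name: Debug - apply manifest with kubectl
        run: |
          kubectl version
          kubectl apply -f ./resources.yaml -n mynamespace --dry-run=server
```

If the dry-run succeeds, the problem is likely inside the action; if it fails, the kubectl error message should point at the actual cause.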

Version

  • I am using the latest version

Runner

self-hosted on AKS, latest version 2.303

Relevant log output

Run Azure/k8s-deploy@v4
  with:
    resource-group: ***
    name: ***
    private-cluster: true
    action: deploy
    force: true
    strategy: basic
    namespace: mynamespace
    manifests: ./resources.yaml
  
    images: ***.azurecr.io/myrepo/myimage:latest
    pull-images: true
    route-method: service
    version-switch-buffer: 0
    traffic-split-method: pod
    percentage: 0
    token: ***
    annotate-namespace: true
    skip-tls-verify: false
  env:
    AZURE_HTTP_USER_AGENT: 
    AZUREPS_HOST_ENVIRONMENT: 
    KUBECONFIG: /runner/_work/_temp/kubeconfig_1680215099131
    KUBE_CONFIG_PATH: /runner/_work/_temp/kubeconfig_1680215099131

Deploying manifests
  Error: Error: undefined

Metadata

Labels

bug (Something isn't working) · idle (Inactive for 14 days)