Skip to content

Add kvmtest-t0 integration test stage to CI pipeline#21

Open
devin-ai-integration[bot] wants to merge 16 commits intomasterfrom
devin/1770961640-add-kvmtest-stage
Open

Add kvmtest-t0 integration test stage to CI pipeline#21
devin-ai-integration[bot] wants to merge 16 commits intomasterfrom
devin/1770961640-add-kvmtest-stage

Conversation

@devin-ai-integration
Copy link

@devin-ai-integration devin-ai-integration bot commented Feb 13, 2026

Why I did it

Enable KVM-based integration testing (kvmtest-t0) in the CI pipeline for the fork. The Test stage was previously commented out because it depended on the SONiC Elastictest managed service, which is not available in the Cisco-SONiC-PoC Azure DevOps organization. This PR adds a direct KVM test stage that runs on the self-hosted sonic-build agent pool, modeled after the existing .azure-pipelines/run-test-template.yml.

Work item tracking
  • Microsoft ADO (number only): N/A

How I did it

Added a new Test stage to azure-pipelines.yml that:

  1. Depends on BuildVS succeeding
  2. Runs environment diagnostics (KVM, libvirt, Docker, sonic-mgmt) and logs results
  3. Auto-creates the sonic-mgmt Docker container if it doesn't exist, using /data/sonic-mgmt/setup-container.sh with a manual fallback
  4. Downloads the VS image artifact from the build stage (with robust multi-path detection)
  5. Cleans existing testbed topology — stops and removes all vms6-1 containers, cEOS data directories, and stale vlab VMs via libvirt, then verifies the sonic-mgmt container survived cleanup
  6. Sets up testbed — resets /data/sonic-mgmt to origin/master, configures veos_vtb, auto-detects the cEOS image version on the agent and patches ceos.yml, then runs testbed-cli.sh refresh-dut
  7. Patches sonic-mgmt with upstream fixes (runs after git reset --hard origin/master to avoid being reverted):
    • Applies a base64-encoded Python script that adds teardown logic to the restore_test_env fixture in test_cacl.py, restoring config_db.json after the module completes (see sonic-mgmt PR #1)
    • Fixes stale kvmtest.sh reference: test_ipv6.py was renamed to test_ip_bgp.py upstream (PR #13650) but kvmtest.sh was never updated
  8. Runs kvmtest.sh with t0 topology against vms-kvm-t0 / vlab-01
  9. Collects test logs and kvmdump on failure
  10. Publishes test results as pipeline artifacts (conditionally, only if files exist)
  11. Posts failure details to the GitHub PR via post-failure-to-github.yml template

Also rewrote post-failure-to-github.yml to improve error visibility: extracts pytest short test summary sections, uses more precise error patterns, includes environment snapshots, and handles oversized comments by rebuilding with reduced output.

Updates since last revision

How to verify it

  1. Push a PR to trigger the pipeline
  2. Verify the BuildVS stage completes
  3. Verify the Test stage starts, cleans existing topology, creates the sonic-mgmt container if needed, downloads the artifact, and sets up the testbed
  4. Check update-sonic-mgmt.log for SUCCESS: test_cacl.py patched and kvmtest.sh patched
  5. Review pipeline logs and GitHub PR comments for kvmtest execution and results
  6. Verify config_db_check no longer fails after test_cacl.py (confirmed in builds update swss, swss-common, sairedis submodules sonic-net/sonic-buildimage#189, sonic-cfggen with sonicv2 dockers sonic-net/sonic-buildimage#190)

Human Review Checklist:

  • git reset ordering: The patch step now runs AFTER git reset --hard origin/master (Fixed in build update swss, swss-common, sairedis submodules sonic-net/sonic-buildimage#189)
  • Missing test_ipv6.py: File was renamed to test_ip_bgp.py upstream (Fixed in build sonic-cfggen with sonicv2 dockers sonic-net/sonic-buildimage#190 via sed patch)
  • Base64 patch fragility: The patch script uses exact string matching. If upstream changes the restore_test_env fixture text, the patch will silently fail (continueOnError: true). Consider merging sonic-mgmt PR #1 to upstream.
  • continueOnError on patch step: If the patch fails, the pipeline continues without the fix. Consider whether this should fail the build.
  • Upstream gnmi test failure: test_gnmi_configdb_full_01 fails in build sonic-cfggen with sonicv2 dockers sonic-net/sonic-buildimage#190. This appears to be an unrelated upstream issue, not caused by this PR.
  • Hardcoded group-name vms6-1: The cleanup step uses grep 'vms6-1' to find containers. This matches the group-name in vtestbed.yaml for vms-kvm-t0. If the testbed config changes, update the cleanup step accordingly.
  • libvirt connectivity: Diagnostics show virsh -c qemu:///system list fails on the agent. The cleanup step's VM removal and kvmdump collection depend on libvirt. Verify libvirtd is running and accessible to the azureuser account.
  • 3-minute sleep after refresh-dut: The sleep 180 is a fixed wait for DUT readiness. May need tuning based on actual boot times.

Which release branch to backport (provide reason below if selected)

  • 202305
  • 202311
  • 202405
  • 202411
  • 202505
  • 202511

Tested branch (Please provide the tested image version)

Description for the changelog

Add kvmtest-t0 integration test stage to the CI pipeline using direct KVM testing on the self-hosted build agent, with auto-creation of sonic-mgmt container, auto-detection of cEOS image version, cleanup of existing testbed topology, and improved GitHub failure reporting with pytest summary extraction

Link to config_db schema for YANG module changes

N/A

A picture of a cute animal (not mandatory but encouraged)

N/A


Link to Devin run: https://cisco-demo.devinenterprise.com/sessions/ec7f851273cb42029be7c8d2d912ec7b
Requested by: Arthur Poon (@arthurkkp-cog)

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@devin-ai-integration
Copy link
Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

devin-ai-integration bot and others added 3 commits February 13, 2026 05:49
Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
…path

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #177 | Commit: 70b6a89

ℹ️ No log files found

No .log files were found in /home/azureuser/_work/10/a/logs.

The build may have failed before generating logs, or logs are stored elsewhere.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #178 | Commit: eabfdc5

⚠️ setup-testbed.log (2 errors, 20 lines total)

Errors found:

16:+ echo 'ERROR: sonic-mgmt Docker container is not running. Please start it on the build agent.'
17:ERROR: sonic-mgmt Docker container is not running. Please start it on the build agent.

Last 200 lines:

+ VS_IMAGE=
+ '[' -f /home/azureuser/_work/10/target/sonic-vs.img.gz ']'
+ VS_IMAGE=/home/azureuser/_work/10/target/sonic-vs.img.gz
+ '[' -z /home/azureuser/_work/10/target/sonic-vs.img.gz ']'
+ echo 'Found VS image at: /home/azureuser/_work/10/target/sonic-vs.img.gz'
Found VS image at: /home/azureuser/_work/10/target/sonic-vs.img.gz
+ sudo mkdir -p /data/sonic-vm/images
+ sudo cp -v /home/azureuser/_work/10/target/sonic-vs.img.gz /data/sonic-vm/images/sonic-vs.img.gz
'/home/azureuser/_work/10/target/sonic-vs.img.gz' -> '/data/sonic-vm/images/sonic-vs.img.gz'
+ sudo gzip -fd /data/sonic-vm/images/sonic-vs.img.gz
++ id -un
+ username=azureuser
+ sudo chown -R azureuser.azureuser /data/sonic-vm
+ '[' '!' -d /data/sonic-mgmt ']'
+ docker exec sonic-mgmt echo OK
+ echo 'ERROR: sonic-mgmt Docker container is not running. Please start it on the build agent.'
ERROR: sonic-mgmt Docker container is not running. Please start it on the build agent.
+ echo 'Hint: docker run -d --name sonic-mgmt -v /data:/data -v /var/run/libvirt:/var/run/libvirt --privileged docker-sonic-mgmt'
Hint: docker run -d --name sonic-mgmt -v /data:/data -v /var/run/libvirt:/var/run/libvirt --privileged docker-sonic-mgmt
+ exit 1
⚠️ diagnostics.log (2 errors, 131 lines total)

Errors found:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Last 200 lines:

+ echo '=== Environment diagnostics ==='
=== Environment diagnostics ===
+ echo 'Pipeline.Workspace: /home/azureuser/_work/10'
Pipeline.Workspace: /home/azureuser/_work/10
+ echo 'Build.ArtifactStagingDirectory: /home/azureuser/_work/10/a'
Build.ArtifactStagingDirectory: /home/azureuser/_work/10/a
+ echo 'System.DefaultWorkingDirectory: /home/azureuser/_work/10/s'
System.DefaultWorkingDirectory: /home/azureuser/_work/10/s
+ echo 'Agent.BuildDirectory: /home/azureuser/_work/10'
Agent.BuildDirectory: /home/azureuser/_work/10
++ pwd
+ echo 'PWD: /home/azureuser/_work/10/s'
PWD: /home/azureuser/_work/10/s
++ whoami
+ echo 'whoami: azureuser'
whoami: azureuser
+ echo '=== Check KVM ==='
=== Check KVM ===
+ ls -la /dev/kvm
crw-rw---- 1 root kvm 10, 232 Feb 13 10:13 /dev/kvm
+ echo '=== Check libvirt ==='
=== Check libvirt ===
+ virsh --version
8.0.0
+ virsh -c qemu:///system list
+ echo 'FAIL: cannot connect to libvirt'
FAIL: cannot connect to libvirt
+ echo '=== Check sonic-mgmt directory ==='
=== Check sonic-mgmt directory ===
+ ls -la /data/sonic-mgmt
total 140
drwxrwxr-x  13 azureuser azureuser  4096 Feb 13 03:30 .
drwxr-xr-x   5 azureuser azureuser  4096 Feb 13 03:20 ..
drwxrwxr-x   9 azureuser azureuser  4096 Feb 13 02:41 .azure-pipelines
-rw-rw-r--   1 azureuser azureuser   416 Feb 13 02:41 .flake8
drwxrwxr-x   8 azureuser azureuser  4096 Feb 13 09:28 .git
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 .github
-rw-rw-r--   1 azureuser azureuser   477 Feb 13 02:41 .gitignore
drwxrwxr-x   3 azureuser azureuser  4096 Feb 13 02:41 .hooks
-rw-rw-r--   1 azureuser azureuser    39 Feb 13 02:41 .markdownlint.json
-rw-rw-r--   1 azureuser azureuser  1984 Feb 13 02:41 .pre-commit-config.yaml
-rw-rw-r--   1 azureuser azureuser   220 Feb 13 02:41 .pre-commit-hooks.yaml
drwxrwxr-x   3 azureuser azureuser  4096 Feb 13 03:30 .pytest_cache
-rw-rw-r--   1 azureuser azureuser   558 Feb 13 02:41 LICENSE
-rw-rw-r--   1 azureuser azureuser  2417 Feb 13 02:41 README.md
-rw-rw-r--   1 azureuser azureuser  2756 Feb 13 02:41 SECURITY.md
drwxrwxr-x  19 azureuser azureuser  4096 Feb 13 09:28 ansible
-rw-rw-r--   1 azureuser azureuser  2642 Feb 13 02:41 azure-pipelines.yml
drwxrwxr-x   9 azureuser azureuser  4096 Feb 13 02:41 docs
-rw-rw-r--   1 azureuser azureuser 22975 Feb 13 02:41 pylintrc
-rw-rw-r--   1 azureuser azureuser  1068 Feb 13 02:41 pyproject.toml
drwxrwxr-x   5 azureuser azureuser  4096 Feb 13 02:41 sdn_tests
-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 126 azureuser azureuser  4096 Feb 13 03:30 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE     COMMAND   CREATED   STATUS    PORTS     NAMES
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
+ echo 'FAIL: sonic-mgmt container not running or not accessible'
FAIL: sonic-mgmt container not running or not accessible
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 13 03:20 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  3 azureuser azureuser 4096 Feb 13 02:25 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030156
drwxr-xr-x 2 azureuser azureuser       4096 Feb 13 09:28 .
drwxr-xr-x 3 azureuser azureuser       4096 Feb 13 02:25 ..
-rw-r--r-- 1 azureuser azureuser 5150867456 Feb 13 09:27 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 13 03:20 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
sonic-slave-trixie-azureuser:86a2c601666                             e07cd74d1aa7         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED        STATUS        PORTS     NAMES
46054318c3f1   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   7 hours ago    Up 7 hours              ceos_vms6-1_VM0102
869da5d16fd1   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   7 hours ago    Up 7 hours              ceos_vms6-1_VM0100
cba8ba0394c2   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   7 hours ago    Up 7 hours              ceos_vms6-1_VM0103
dfb331f981cf   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   7 hours ago    Up 7 hours              ceos_vms6-1_VM0101
2e1721943205   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   7 hours ago    Up 7 hours              net_vms6-1_VM0103
9e9bcf00aadf   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   7 hours ago    Up 7 hours              net_vms6-1_VM0102
f8592fda4d9f   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   7 hours ago    Up 7 hours              net_vms6-1_VM0101
a0dd05607cc5   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   7 hours ago    Up 7 hours              net_vms6-1_VM0100
6240945900b7   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   7 hours ago    Up 7 hours              ptf_vms6-1
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   15 hours ago   Up 15 hours   22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago    Created                 hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===

🔍 4 error(s) found across 2 log file(s)

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #179 | Commit: a0559ea

⚠️ setup-testbed.log (3 errors, 1948 lines total)

Errors found:

773:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
1944:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": false, "msg": ["Failed, no working ceos image download URL is found. There are 2 options to fix it:", "  1. Fix ceos_image_url defined in ansible/group_vars/vm_host/ceos.yml", "  2. Manually put cEOS image to /home/azureuser/veos-vm/images/cEOS64-lab-4.32.5M.tar"]}
1947:STR-ACS-VSERV-01           : ok=285  changed=34   unreachable=0    failed=1    skipped=314  rescued=0    ignored=0   

Last 200 lines:

skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Bind ptf_ip to keysight_api_server] *****************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Pull and start Keysight IxANVL container] ***********************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Get dut ports] **************************************************
skipping: [STR-ACS-VSERV-01] => (item=vlab-01) 
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Create vlan ports for dut] **************************************
skipping: [STR-ACS-VSERV-01] => (item=vlab-01) 
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Bind topology t0 to VMs. base vm = VM0100] **********************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Create ptf container ptf_vms6-1] ********************************
changed: [STR-ACS-VSERV-01]

TASK [vm_set : Update ptf password] ********************************************
included: /data/sonic-mgmt/ansible/roles/vm_set/tasks/update_ptf_password.yml for STR-ACS-VSERV-01

TASK [vm_set : include_vars] ***************************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Render ptf secrets] *********************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Init default ptf_username] **************************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Init default ptf_password] **************************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Override default ptf_username] **********************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Override default ptf_password] **********************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Get ptf_alt_passwords from ptf_secrets] *************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : If ptf_alt_passwords is a list, set ptf_password to its first value] ***
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : If ptf_alt_passwords is not a list, log a debug message] ********
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Update ptf username and password] *******************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Enable ipv6 for docker container ptf_vms6-1] ********************
changed: [STR-ACS-VSERV-01]

TASK [vm_set : Set ipv6 route max size of ptf_vms6-1] **************************
changed: [STR-ACS-VSERV-01]

TASK [vm_set : Don't accept ipv6 router advertisements for docker container ptf_vms6-1] ***
changed: [STR-ACS-VSERV-01]

TASK [vm_set : Create file to store dut type in PTF] ***************************
changed: [STR-ACS-VSERV-01]

TASK [vm_set : Create file to store asic type in PTF] **************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Get dut ports] **************************************************
included: /data/sonic-mgmt/ansible/roles/vm_set/tasks/get_dut_port.yml for STR-ACS-VSERV-01 => (item=vlab-01)

TASK [vm_set : Get front panel port for vlan tunnel] ***************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Setup mgmt port for physical dut] *******************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : determine whether to include internal ports] ********************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : determine whether to sort port_alias by index] ******************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : set_fact] *******************************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Get DUT port alias] *********************************************
ok: [STR-ACS-VSERV-01 -> localhost]

TASK [vm_set : Get front panel and mgmt port for kvm vm] ***********************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Get front panel and mgmt port for SID] **************************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Get front panel and mgmt port for 8000e-sonic device] ***********
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : set_fact] *******************************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Create vlan ports for dut] **************************************
skipping: [STR-ACS-VSERV-01] => (item=vlab-01) 
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : debug] **********************************************************
ok: [STR-ACS-VSERV-01] => {
    "msg": {
        "vlab-01": {
            "0": "vlab-01-1",
            "1": "vlab-01-2",
            "10": "vlab-01-11",
            "11": "vlab-01-12",
            "12": "vlab-01-13",
            "13": "vlab-01-14",
            "14": "vlab-01-15",
            "15": "vlab-01-16",
            "16": "vlab-01-17",
            "17": "vlab-01-18",
            "18": "vlab-01-19",
            "19": "vlab-01-20",
            "2": "vlab-01-3",
            "20": "vlab-01-21",
            "21": "vlab-01-22",
            "22": "vlab-01-23",
            "23": "vlab-01-24",
            "24": "vlab-01-25",
            "25": "vlab-01-26",
            "26": "vlab-01-27",
            "27": "vlab-01-28",
            "28": "vlab-01-29",
            "29": "vlab-01-30",
            "3": "vlab-01-4",
            "30": "vlab-01-31",
            "31": "vlab-01-32",
            "4": "vlab-01-5",
            "5": "vlab-01-6",
            "6": "vlab-01-7",
            "7": "vlab-01-8",
            "8": "vlab-01-9",
            "9": "vlab-01-10"
        }
    }
}

TASK [vm_set : debug] **********************************************************
ok: [STR-ACS-VSERV-01] => {
    "msg": [
        "vlab-01-0"
    ]
}

TASK [vm_set : include_tasks] **************************************************
included: /data/sonic-mgmt/ansible/roles/vm_set/tasks/add_ceos_list.yml for STR-ACS-VSERV-01

TASK [vm_set : Check if cEOS docker image exists or not] ***********************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Check if ceos_image_orig exists or not] *************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Check if local ceos image file exists or not] *******************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Fail if skip_ceos_image_downloading is true] ********************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Init ceos_image_urls when ceos_image_url value type is string] ***
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Init ceos_image_urls when ceos_image_url value type is list] ****
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Init working_image_urls list] ***********************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Loop ceos_image_urls to find out working URLs] ******************
included: /data/sonic-mgmt/ansible/roles/vm_set/tasks/probe_image_url.yml for STR-ACS-VSERV-01 => (item=http://example1.com/cEOS64-lab-4.32.5M.tar)
included: /data/sonic-mgmt/ansible/roles/vm_set/tasks/probe_image_url.yml for STR-ACS-VSERV-01 => (item=http://example2.com/cEOS64-lab-4.32.5M.tar)

TASK [vm_set : Probe if the URL works] *****************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Append working URL to working_image_urls list] ******************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Probe if the URL works] *****************************************
ok: [STR-ACS-VSERV-01]

TASK [vm_set : Append working URL to working_image_urls list] ******************
skipping: [STR-ACS-VSERV-01]

TASK [vm_set : Fail if no working ceos image download url is found] ************
fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": false, "msg": ["Failed, no working ceos image download URL is found. There are 2 options to fix it:", "  1. Fix ceos_image_url defined in ansible/group_vars/vm_host/ceos.yml", "  2. Manually put cEOS image to /home/azureuser/veos-vm/images/cEOS64-lab-4.32.5M.tar"]}

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=285  changed=34   unreachable=0    failed=1    skipped=314  rescued=0    ignored=0   

⚠️ sonic-mgmt-container-setup.log (2 errors, 230 lines total)

Errors found:

132:�[91mcp: cannot stat '/var/AzDevOps/!(env-*)': No such file or directory
133:�[0m�[91mcp: cannot stat '/var/AzDevOps/': No such file or directory

Last 200 lines:

72e61d4dfe05: Pulling fs layer
26963a90d020: Pulling fs layer
cd9ca00facce: Pulling fs layer
e2b7aa600eb9: Pulling fs layer
5f4b10d2bb5c: Pulling fs layer
e1e9d061fb18: Pulling fs layer
4f4fb700ef54: Already exists
95af26b5fb96: Download complete
a73ed48333f5: Download complete
21e937b1c360: Download complete
fd1d403e3d06: Download complete
4325fbf7e62d: Download complete
0b39d165a0be: Download complete
a30df45cc924: Download complete
e2b7aa600eb9: Download complete
006d350d29e8: Download complete
38ddc29bb037: Download complete
e93fce65fb9f: Download complete
aac4f284c0c4: Download complete
72e61d4dfe05: Download complete
26963a90d020: Download complete
e1e9d061fb18: Download complete
823e769a039c: Download complete
2604b4a19545: Download complete
1ad063597f51: Download complete
34a4ddd51117: Download complete
cd9ca00facce: Download complete
dc838e875c3f: Download complete
785dde52c77f: Download complete
5f4b10d2bb5c: Download complete
e93fce65fb9f: Pull complete
38ddc29bb037: Pull complete
26963a90d020: Pull complete
95af26b5fb96: Pull complete
a73ed48333f5: Pull complete
34a4ddd51117: Pull complete
aac4f284c0c4: Pull complete
cd9ca00facce: Pull complete
dc838e875c3f: Pull complete
785dde52c77f: Pull complete
21e937b1c360: Pull complete
823e769a039c: Pull complete
2604b4a19545: Pull complete
e1e9d061fb18: Pull complete
5f4b10d2bb5c: Pull complete
4325fbf7e62d: Pull complete
e2b7aa600eb9: Pull complete
4f4fb700ef54: Pull complete
0b39d165a0be: Pull complete
72e61d4dfe05: Pull complete
a30df45cc924: Pull complete
006d350d29e8: Pull complete
1ad063597f51: Pull complete
fd1d403e3d06: Pull complete
Digest: sha256:1f066aae1744467e05c63f55bd3c252c841f614d07da874c67adacc7eeb8802a
Status: Downloaded newer image for sonicdev-microsoft.azurecr.io:443/docker-sonic-mgmt:latest
sonicdev-microsoft.azurecr.io:443/docker-sonic-mgmt:latest
NOTICE: using default docker image: sonicdev-microsoft.azurecr.io:443/docker-sonic-mgmt:latest
INFO: generate SSH key pair: id_rsa_docker_sonic_mgmt/id_rsa_docker_sonic_mgmt.pub
INFO: read SSH public key: /home/azureuser/.ssh/id_rsa_docker_sonic_mgmt.pub
INFO: setup a temporary dir: /tmp/tmp.xYuR0cSQHL
INFO: copy SSH key pair: id_rsa_docker_sonic_mgmt/id_rsa_docker_sonic_mgmt.pub
'/home/azureuser/.ssh/id_rsa_docker_sonic_mgmt' -> '/tmp/tmp.xYuR0cSQHL/id_rsa'
'/home/azureuser/.ssh/id_rsa_docker_sonic_mgmt.pub' -> '/tmp/tmp.xYuR0cSQHL/id_rsa.pub'
INFO: prepare a Dockerfile template: /tmp/tmp.xYuR0cSQHL/Dockerfile.j2
INFO: prepare an environment file: /tmp/tmp.xYuR0cSQHL/data.env
INFO: generate a Dockerfile: /tmp/tmp.xYuR0cSQHL/Dockerfile
INFO: building docker image from /tmp/tmp.xYuR0cSQHL: docker-sonic-mgmt-azureuser:master ...
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
            BuildKit is currently disabled; enable it by removing the DOCKER_BUILDKIT=0
            environment-variable.

Sending build context to Docker daemon  15.87kB

Step 1/27 : FROM sonicdev-microsoft.azurecr.io:443/docker-sonic-mgmt:latest
 ---> 1f066aae1744
Step 2/27 : USER root
 ---> Running in 70d792aa07ec
 ---> Removed intermediate container 70d792aa07ec
 ---> bf5de3ea8f91
Step 3/27 : RUN if getent passwd ubuntu; then userdel -r ubuntu; fi
 ---> Running in 3d32ab348b37
 ---> Removed intermediate container 3d32ab348b37
 ---> 6b4999bf31c1
Step 4/27 : RUN if getent group azureuser; then groupmod -o -g 1000 azureuser; else groupadd -o -g 1000 azureuser; fi
 ---> Running in 0efa9eb67e5b
 ---> Removed intermediate container 0efa9eb67e5b
 ---> 5a6f4a394644
Step 5/27 : RUN if getent passwd azureuser; then userdel azureuser; fi
 ---> Running in 2e1723fc2886
 ---> Removed intermediate container 2e1723fc2886
 ---> 2b9213a2a0b0
Step 6/27 : RUN useradd -o -l -g 1000 -u 1000 -m -d /home/azureuser -s /bin/bash azureuser;
 ---> Running in f756389c6ec0
 ---> Removed intermediate container f756389c6ec0
 ---> 5f0ce8b808f8
Step 7/27 : RUN if getent group docker; then groupmod -o -g 999 docker; else groupadd -o -g 999 docker; fi
 ---> Running in b6a7e5fdfe44
 ---> Removed intermediate container b6a7e5fdfe44
 ---> a465329e5821
Step 8/27 : RUN if [ 'azureuser' != 'AzDevOps' ]; then /bin/bash -O extglob -c 'cp -a -f /var/AzDevOps/!(env-*) /home/azureuser/'; for hidden_stuff in '.profile .local .ssh'; do /bin/bash -c 'cp -a -f /var/AzDevOps/$hidden_stuff /home/azureuser/ || true'; done fi
 ---> Running in 510407c99371
�[91mcp: cannot stat '/var/AzDevOps/!(env-*)': No such file or directory
�[0m�[91mcp: cannot stat '/var/AzDevOps/': No such file or directory
�[0m ---> Removed intermediate container 510407c99371
 ---> f07f72ad1891
Step 9/27 : RUN usermod -a -G sudo azureuser
 ---> Running in 3f13bab00dda
 ---> Removed intermediate container 3f13bab00dda
 ---> a25504ed8c74
Step 10/27 : RUN usermod -a -G docker azureuser
 ---> Running in bedaa7480a19
 ---> Removed intermediate container bedaa7480a19
 ---> 17293667bc47
Step 11/27 : RUN echo 'azureuser ALL=(ALL) NOPASSWD:ALL' > /etc/sudoers.d/azureuser
 ---> Running in 597f14790ed9
 ---> Removed intermediate container 597f14790ed9
 ---> 35207384869d
Step 12/27 : RUN chmod 0440 /etc/sudoers.d/azureuser
 ---> Running in 94a8e6ea279c
 ---> Removed intermediate container 94a8e6ea279c
 ---> 72a6133ec9d4
Step 13/27 : RUN chown -R '1000:1000' /home/azureuser
 ---> Running in 8adac1aff940
 ---> Removed intermediate container 8adac1aff940
 ---> e9a4d73cb64b
Step 14/27 : RUN sed -i -E 's/^#?PermitRootLogin.*$/PermitRootLogin yes/g' /etc/ssh/sshd_config
 ---> Running in 587c67809f79
 ---> Removed intermediate container 587c67809f79
 ---> ed556e12b02f
Step 15/27 : RUN echo 'root:root' | chpasswd
 ---> Running in f0af7ef7831e
 ---> Removed intermediate container f0af7ef7831e
 ---> 1879fd674ea9
Step 16/27 : RUN echo 'azureuser:12345' | chpasswd
 ---> Running in 58b44950d2b4
 ---> Removed intermediate container 58b44950d2b4
 ---> d21a383f6de4
Step 17/27 : USER azureuser
 ---> Running in fd0464b300e0
 ---> Removed intermediate container fd0464b300e0
 ---> 835d7d4698d2
Step 18/27 : ENV HOME=/home/azureuser
 ---> Running in aeec22f84189
 ---> Removed intermediate container aeec22f84189
 ---> 819cf2a0563f
Step 19/27 : ENV USER=azureuser
 ---> Running in 57fcd9dc7b97
 ---> Removed intermediate container 57fcd9dc7b97
 ---> 36dc3ad6343f
Step 20/27 : COPY --chown=1000:1000 id_rsa id_rsa.pub ${HOME}/.ssh/
 ---> f38146eb8237
Step 21/27 : RUN chmod 0700 ${HOME}/.ssh
 ---> Running in 8b6d87aba783
 ---> Removed intermediate container 8b6d87aba783
 ---> 74cb0d1b3354
Step 22/27 : RUN chmod 0600 ${HOME}/.ssh/id_rsa
 ---> Running in edb726d374be
 ---> Removed intermediate container edb726d374be
 ---> 78c3762920c6
Step 23/27 : RUN chmod 0644 ${HOME}/.ssh/id_rsa.pub
 ---> Running in 5c4066e862d6
 ---> Removed intermediate container 5c4066e862d6
 ---> 59453fd6a622
Step 24/27 : RUN cat ${HOME}/.ssh/id_rsa.pub >> ${HOME}/.ssh/authorized_keys
 ---> Running in 69e39ba32bb5
 ---> Removed intermediate container 69e39ba32bb5
 ---> e1c8fd0dd12f
Step 25/27 : RUN chmod 0600 ${HOME}/.ssh/authorized_keys
 ---> Running in e605ca88e57e
 ---> Removed intermediate container e605ca88e57e
 ---> 17936fe28d8d
Step 26/27 : WORKDIR ${HOME}
 ---> Running in 16c44df9f770
 ---> Removed intermediate container 16c44df9f770
 ---> 3e9ad0c43226
Step 27/27 : RUN if [ -d /var/AzDevOps/env-python3 ]  && [ 'azureuser' != 'AzDevOps' ]  && ! pip3 list | grep -c pytest >/dev/null; then /bin/bash -c 'python3 -m venv ${HOME}/env-python3'; /bin/bash -c '${HOME}/env-python3/bin/pip install pip --upgrade'; /bin/bash -c '${HOME}/env-python3/bin/pip install wheel'; /bin/bash -c '${HOME}/env-python3/bin/pip install $(/var/AzDevOps/env-python3/bin/pip freeze | grep -vE "distro|PyGObject|python-apt|unattended-upgrades|dbus-python")'; fi
 ---> Running in 328e878cd442
 ---> Removed intermediate container 328e878cd442
 ---> dc1af3bbf6a5
Successfully built dc1af3bbf6a5
Successfully tagged docker-sonic-mgmt-azureuser:master
INFO: cleanup a temporary dir: /tmp/tmp.xYuR0cSQHL
INFO: creating a container: sonic-mgmt ...
43138ce8a47b89039a2371413f2d6d7ec0c8c1a3df87272fbf97d14b5e41e4ef
 * Restarting OpenBSD Secure Shell server sshd
   ...done.
INFO: verifying UID and GID in container matches host
******************************************************************************
EXEC: docker exec --user azureuser -ti sonic-mgmt bash
SSH:  ssh -i ~/.ssh/id_rsa_docker_sonic_mgmt azureuser@172.17.0.3
******************************************************************************

INFO: sonic-mgmt configuration is done!

+ docker exec sonic-mgmt echo OK
OK
+ docker exec sonic-mgmt echo OK
OK
+ echo 'sonic-mgmt container is ready'
sonic-mgmt container is ready
⚠️ diagnostics.log (2 errors, 131 lines total)

Errors found:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Last 200 lines:

+ echo '=== Environment diagnostics ==='
=== Environment diagnostics ===
+ echo 'Pipeline.Workspace: /home/azureuser/_work/10'
Pipeline.Workspace: /home/azureuser/_work/10
+ echo 'Build.ArtifactStagingDirectory: /home/azureuser/_work/10/a'
Build.ArtifactStagingDirectory: /home/azureuser/_work/10/a
+ echo 'System.DefaultWorkingDirectory: /home/azureuser/_work/10/s'
System.DefaultWorkingDirectory: /home/azureuser/_work/10/s
+ echo 'Agent.BuildDirectory: /home/azureuser/_work/10'
Agent.BuildDirectory: /home/azureuser/_work/10
++ pwd
+ echo 'PWD: /home/azureuser/_work/10/s'
PWD: /home/azureuser/_work/10/s
++ whoami
+ echo 'whoami: azureuser'
whoami: azureuser
+ echo '=== Check KVM ==='
=== Check KVM ===
+ ls -la /dev/kvm
crw-rw---- 1 root kvm 10, 232 Feb 13 11:28 /dev/kvm
+ echo '=== Check libvirt ==='
=== Check libvirt ===
+ virsh --version
8.0.0
+ virsh -c qemu:///system list
+ echo 'FAIL: cannot connect to libvirt'
FAIL: cannot connect to libvirt
+ echo '=== Check sonic-mgmt directory ==='
=== Check sonic-mgmt directory ===
+ ls -la /data/sonic-mgmt
total 140
drwxrwxr-x  13 azureuser azureuser  4096 Feb 13 03:30 .
drwxr-xr-x   5 azureuser azureuser  4096 Feb 13 03:20 ..
drwxrwxr-x   9 azureuser azureuser  4096 Feb 13 02:41 .azure-pipelines
-rw-rw-r--   1 azureuser azureuser   416 Feb 13 02:41 .flake8
drwxrwxr-x   8 azureuser azureuser  4096 Feb 13 09:28 .git
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 .github
-rw-rw-r--   1 azureuser azureuser   477 Feb 13 02:41 .gitignore
drwxrwxr-x   3 azureuser azureuser  4096 Feb 13 02:41 .hooks
-rw-rw-r--   1 azureuser azureuser    39 Feb 13 02:41 .markdownlint.json
-rw-rw-r--   1 azureuser azureuser  1984 Feb 13 02:41 .pre-commit-config.yaml
-rw-rw-r--   1 azureuser azureuser   220 Feb 13 02:41 .pre-commit-hooks.yaml
drwxrwxr-x   3 azureuser azureuser  4096 Feb 13 03:30 .pytest_cache
-rw-rw-r--   1 azureuser azureuser   558 Feb 13 02:41 LICENSE
-rw-rw-r--   1 azureuser azureuser  2417 Feb 13 02:41 README.md
-rw-rw-r--   1 azureuser azureuser  2756 Feb 13 02:41 SECURITY.md
drwxrwxr-x  19 azureuser azureuser  4096 Feb 13 09:28 ansible
-rw-rw-r--   1 azureuser azureuser  2642 Feb 13 02:41 azure-pipelines.yml
drwxrwxr-x   9 azureuser azureuser  4096 Feb 13 02:41 docs
-rw-rw-r--   1 azureuser azureuser 22975 Feb 13 02:41 pylintrc
-rw-rw-r--   1 azureuser azureuser  1068 Feb 13 02:41 pyproject.toml
drwxrwxr-x   5 azureuser azureuser  4096 Feb 13 02:41 sdn_tests
-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 126 azureuser azureuser  4096 Feb 13 03:30 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE     COMMAND   CREATED   STATUS    PORTS     NAMES
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
+ echo 'FAIL: sonic-mgmt container not running or not accessible'
FAIL: sonic-mgmt container not running or not accessible
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 13 03:20 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  3 azureuser azureuser 4096 Feb 13 02:25 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030284
drwxr-xr-x 2 azureuser azureuser       4096 Feb 13 10:39 .
drwxr-xr-x 3 azureuser azureuser       4096 Feb 13 02:25 ..
-rw-r--r-- 1 azureuser azureuser 5150998528 Feb 13 10:38 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 13 03:20 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 13 03:21 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
sonic-slave-trixie-azureuser:86a2c601666                             e07cd74d1aa7         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED        STATUS        PORTS     NAMES
46054318c3f1   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   9 hours ago    Up 9 hours              ceos_vms6-1_VM0102
869da5d16fd1   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   9 hours ago    Up 9 hours              ceos_vms6-1_VM0100
cba8ba0394c2   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   9 hours ago    Up 9 hours              ceos_vms6-1_VM0103
dfb331f981cf   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   9 hours ago    Up 9 hours              ceos_vms6-1_VM0101
2e1721943205   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   9 hours ago    Up 9 hours              net_vms6-1_VM0103
9e9bcf00aadf   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   9 hours ago    Up 9 hours              net_vms6-1_VM0102
f8592fda4d9f   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   9 hours ago    Up 9 hours              net_vms6-1_VM0101
a0dd05607cc5   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   9 hours ago    Up 9 hours              net_vms6-1_VM0100
6240945900b7   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   9 hours ago    Up 9 hours              ptf_vms6-1
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   16 hours ago   Up 16 hours   22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago    Created                 hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===

🔍 7 error(s) found across 3 log file(s)

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #180 | Commit: 96988a2

⚠️ setup-testbed.log (50 errors, 4484 lines total)

Errors found:

763:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.002849", "end": "2026-02-13 13:12:48.025991", "msg": "non-zero return code", "rc": 255, "start": "2026-02-13 13:12:48.023142", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
796:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
1998:ok: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j950521033811.781628', 'results_file': '/root/.ansible_async/j950521033811.781628', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
1999:ok: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j510535630245.781659', 'results_file': '/root/.ansible_async/j510535630245.781659', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2000:ok: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j220640312633.781690', 'results_file': '/root/.ansible_async/j220640312633.781690', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2001:ok: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j842969429004.781728', 'results_file': '/root/.ansible_async/j842969429004.781728', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2160:    "msg": "upstream_neighbor_groups=undefined, downstream_neighbor_groups=undefined"
2179:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_o5ze2ftl/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_o5ze2ftl/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3113:        "failed": false,
3131:        "failed": false,
3149:        "failed": false,
3167:        "failed": false,
3188:fatal: [VM0102 -> STR-ACS-VSERV-01(172.17.0.1)]: FAILED! => {"changed": true, "cmd": ["rm", "-rf", "//data/ceos/ceos_vms6-1_VM0102"], "delta": "0:00:00.005354", "end": "2026-02-13 13:15:31.297577", "msg": "non-zero return code", "rc": 1, "start": "2026-02-13 13:15:31.292223", "stderr": "rm: cannot remove '//data/ceos/ceos_vms6-1_VM0102': Directory not empty", "stderr_lines": ["rm: cannot remove '//data/ceos/ceos_vms6-1_VM0102': Directory not empty"], "stdout": "", "stdout_lines": []}
3189:fatal: [VM0101 -> STR-ACS-VSERV-01(172.17.0.1)]: FAILED! => {"changed": true, "cmd": ["rm", "-rf", "//data/ceos/ceos_vms6-1_VM0101"], "delta": "0:00:00.005937", "end": "2026-02-13 13:15:31.301477", "msg": "non-zero return code", "rc": 1, "start": "2026-02-13 13:15:31.295540", "stderr": "rm: cannot remove '//data/ceos/ceos_vms6-1_VM0101': Directory not empty", "stderr_lines": ["rm: cannot remove '//data/ceos/ceos_vms6-1_VM0101': Directory not empty"], "stdout": "", "stdout_lines": []}
3190:fatal: [VM0103 -> STR-ACS-VSERV-01(172.17.0.1)]: FAILED! => {"changed": true, "cmd": ["rm", "-rf", "//data/ceos/ceos_vms6-1_VM0103"], "delta": "0:00:00.004758", "end": "2026-02-13 13:15:31.311396", "msg": "non-zero return code", "rc": 1, "start": "2026-02-13 13:15:31.306638", "stderr": "rm: cannot remove '//data/ceos/ceos_vms6-1_VM0103': Directory not empty", "stderr_lines": ["rm: cannot remove '//data/ceos/ceos_vms6-1_VM0103': Directory not empty"], "stdout": "", "stdout_lines": []}
4439:STR-ACS-VSERV-01           : ok=337  changed=44   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4440:VM0100                     : ok=40   changed=4    unreachable=0    failed=0    skipped=51   rescued=0    ignored=0   
4441:VM0101                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
4442:VM0102                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
4443:VM0103                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
4444:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4445:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4446:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4447:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4448:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4449:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4450:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4451:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4452:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4453:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4454:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4455:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4456:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4457:VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4458:VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4459:VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4460:VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4461:VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4462:VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4463:VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4464:VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4465:VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4466:VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4467:VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4468:VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4469:VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4470:VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4471:VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4472:VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4473:VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Last 200 lines:

skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

TASK [cisco : Set properties list to values, when they're defined] *************
skipping: [VM0100]
skipping: [VM0104]
skipping: [VM0105]
skipping: [VM0106]
skipping: [VM0107]
skipping: [VM0108]
skipping: [VM0109]
skipping: [VM0110]
skipping: [VM0111]
skipping: [VM0112]
skipping: [VM0113]
skipping: [VM0114]
skipping: [VM0115]
skipping: [VM0116]
skipping: [VM0117]
skipping: [VM0118]
skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

TASK [cisco : Expand ARISTA01T1 properties into props] *************************
skipping: [VM0100] => (item=common) 
skipping: [VM0100]
skipping: [VM0104]
skipping: [VM0105]
skipping: [VM0106]
skipping: [VM0107]
skipping: [VM0108]
skipping: [VM0109]
skipping: [VM0110]
skipping: [VM0111]
skipping: [VM0112]
skipping: [VM0113]
skipping: [VM0114]
skipping: [VM0115]
skipping: [VM0116]
skipping: [VM0117]
skipping: [VM0118]
skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

TASK [cisco : include_tasks] ***************************************************
skipping: [VM0100]
skipping: [VM0104]
skipping: [VM0105]
skipping: [VM0106]
skipping: [VM0107]
skipping: [VM0108]
skipping: [VM0109]
skipping: [VM0110]
skipping: [VM0111]
skipping: [VM0112]
skipping: [VM0113]
skipping: [VM0114]
skipping: [VM0115]
skipping: [VM0116]
skipping: [VM0117]
skipping: [VM0118]
skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=44   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=40   changed=4    unreachable=0    failed=0    skipped=51   rescued=0    ignored=0   
VM0101                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
VM0102                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
VM0103                     : ok=23   changed=1    unreachable=0    failed=1    skipped=3    rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   


🔍 50 error(s) found across 2 log file(s)

…sh-dut

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #181 | Commit: cfc9908

devin-ai-integration bot and others added 2 commits February 13, 2026 15:35
…verify sonic-mgmt

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
…comments

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #183 | Commit: 965088b

⚙️ Environment snapshot at failure time
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up 50 minutes
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 52 minutes
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 8 hours
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 24 hours
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory listing ===
total 1148
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 19:35 .
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 19:35 ..
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 19:35 1vlan
-rw-r--r-- 1 azureuser azureuser   3669 Feb 13 18:40 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser   8955 Feb 13 18:37 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 982444 Feb 13 19:35 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser   4096 Feb 13 19:35 ptf
-rw-r--r-- 1 azureuser azureuser 147606 Feb 13 18:45 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser    218 Feb 13 18:37 sonic-mgmt-container-setup.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  163G  831G  17% /
/dev/root       993G  163G  831G  17% /

=== Recent dmesg (last 10 lines) ===

🔍 1 log file(s) processed, 0 error patterns matched — check full log tails above for the actual failure

…tion, handle oversized comments

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #184 | Commit: 12ab8e6

⚠️ kvmtest-run.log (30 errors, 7791 lines, 982444B)

Pytest summary:

=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip snappi metadata generation for non-tgen testbed
SKIPPED [1] test_pretest.py:469: No URL specified for python saithrift package
SKIPPED [1] common/helpers/assertions.py:16: Skip 'test_backend_acl_load' on non t0-backend testbeds.
=== 10 passed, 3 skipped, 4897 deselected, 727 warnings in 258.84s (0:04:18) ===
=== Running tests individually ===
Running: python3 -m pytest dns/test_dns_resolv_conf.py --inventory ../ansible/veos_vtb --host-pattern vlab-01 --dpu-pattern None --testbed vms-kvm-t0 --testbed_file vtestbed.yaml --log-cli-level warning --log-file-level debug --kube_master unset --showlocals --assert plain --show-capture no -rav --ignore=ptftests --ignore=acstests --ignore=saitests --ignore=scripts --ignore=k8s --ignore=sai_qualify --maxfail=1 --log-file logs/1vlan/dns/test_dns_resolv_conf.log --junitxml=logs/1vlan/dns/test_dns_resolv_conf.xml --allow_recover --completeness_level=confident
============================= test session starts ==============================
=========================== short test summary info ============================
SKIPPED [1] generic_config_updater/test_eth_interface.py:199: Bypass as it is blocking submodule update
SKIPPED [1] generic_config_updater/test_eth_interface.py:335: Bypass as this is not a production scenario
SKIPPED [1] generic_config_updater/test_eth_interface.py:362: Bypass as this is not a production scenario
=========== 10 passed, 3 skipped, 372 warnings in 303.49s (0:05:03) ============
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 120.98s (0:02:00) ====

Error lines:

4:+ exit_on_error=
9:+ exit_on_error=true
89:TASK [get connection graph if defined for dut (ignore any errors)] *************
321:TASK [saved original minigraph file in SONiC DUT(ignore errors when file does not exist)] ***
322:fatal: [vlab-01]: FAILED! => {"changed": true, "cmd": "mv /etc/sonic/minigraph.xml /etc/sonic/minigraph.xml.orig", "delta": "0:00:00.003466", "end": "2026-02-13 20:59:13.398390", "msg": "non-zero return code", "rc": 1, "start": "2026-02-13 20:59:13.394924", "stderr": "mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory", "stderr_lines": ["mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory"], "stdout": "", "stdout_lines": []}
373:    "msg": "Stat result is {'changed': False, 'stat': {'exists': False}, 'failed': False}"
546:ASYNC FAILED on vlab-01: jid=j680454585934.17963
547:fatal: [vlab-01]: FAILED! => {"ansible_job_id": "j680454585934.17963", "changed": true, "cmd": ["chronyd", "-F", "1", "-q"], "delta": "0:00:10.555745", "end": "2026-02-13 21:00:05.680968", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j680454585934.17963", "start": "2026-02-13 20:59:55.125223", "started": 1, "stderr": "2026-02-13T20:59:55Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)\n2026-02-13T20:59:55Z Timezone right/UTC failed leap second check, ignoring\n2026-02-13T20:59:55Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift\n2026-02-13T20:59:55Z Loaded seccomp filter (level 1)\n2026-02-13T21:00:05Z No suitable source for synchronisation\n2026-02-13T21:00:05Z chronyd exiting", "stderr_lines": ["2026-02-13T20:59:55Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)", "2026-02-13T20:59:55Z Timezone right/UTC failed leap second check, ignoring", "2026-02-13T20:59:55Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift", "2026-02-13T20:59:55Z Loaded seccomp filter (level 1)", "2026-02-13T21:00:05Z No suitable source for synchronisation", "2026-02-13T21:00:05Z chronyd exiting"], "stdout": "", "stdout_lines": []}
657:vlab-01                    : ok=85   changed=24   unreachable=0    failed=0    skipped=93   rescued=0    ignored=2   
1737:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152
1865:WARNING  pytest_plus:__init__.py:94 Test <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1866:WARNING  pytest_plus:__init__.py:94 <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status --fetch-from-hardware]> has an id that looks above 60 characters.
1876:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_verify_fec_stats_counters', found at tests/platform_tests/test_intf_fec.py:125 and tests/layer1/test_fec_error.py:27
1892:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -h]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1893:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --help]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1894:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -v]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1895:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --version]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1896:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -j]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1897:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --json]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1898:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -r]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1899:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --raw]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
2816:layer1/test_fec_error.py:8
2817:  /data/sonic-mgmt/tests/layer1/test_fec_error.py:8: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
2820:layer1/test_port_error.py:10
2821:  /data/sonic-mgmt/tests/layer1/test_port_error.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
3548:  /data/sonic-mgmt/tests/snappi_tests/dash/ha/ha_helper.py:71: PytestCollectionWarning: cannot collect test class 'TestPhase' because it has a __init__ constructor (from: tests/snappi_tests/dash/test_cps.py)
3980:=========================== short test summary info ============================
4279:WARNING  tests.conftest:conftest.py:3006 Core dump or config check failed for test_cacl.py, results: {"core_dump_check": {"failed": false, "new_core_dumps": {"vlab-01": []}}, "config_db_check": {"failed": true, "pre_only_config": {"vlab-01": {"null": {"DNS_NAMESERVER": {"10.11.0.5": {}, "10.11.0.6": {}}, "BMP": {"table": {"bgp_neighbor_table": "true", "bgp_rib_in_table": "true", "bgp_rib_out_table": "true"}}, "TACPLUS": {"global": {"auth_type": "login", "passkey": "testing123"}}, "AAA": {"accounting": {"login": "tacacs+,local"}, "authentication": {"login": "tacacs+"}, "authorization": {"login": "tacacs+"}}}}}, "cur_only_config": {"vlab-01": {"null": {}}}, "inconsistent_config": {"vlab-01": {"null": {"DEVICE_METADATA": {"pre_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "orch_northbond_route_zmq_enabled": "true", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}, "cur_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}}, "FEATURE": {"pre_value": {"bgp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "frr_bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "gbsyncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}, "cur_value": {"bgp": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gbsyncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}}}}}}}
4281:DEBUG:tests.conftest:append custom_msg: {'dut_check_result': {'core_dump_check_failed': False, 'config_db_check_failed': True}}
4434:=========================== short test summary info ============================

Tail (80 lines):


voq/test_voq_ipfwd.py:730
  /data/sonic-mgmt/tests/voq/test_voq_ipfwd.py:730: PytestUnknownMarkWarning: Unknown pytest.mark.express - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.param(128, 64, marks=pytest.mark.express),

voq/test_voq_disrupts.py:25
  /data/sonic-mgmt/tests/voq/test_voq_disrupts.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_scale_ecmp.py:19
  /data/sonic-mgmt/tests/vxlan/test_scale_ecmp.py:19: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_vnet_decap.py:14
  /data/sonic-mgmt/tests/vxlan/test_vnet_decap.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vnet_vxlan.py:25
  /data/sonic-mgmt/tests/vxlan/test_vnet_vxlan.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

DEBUG:tests.conftest:[log_custom_msg] item: <Function test_collect_dualtor_logs>
INFO:root:Can not get Allure report URL. Please check logs
vxlan/test_vxlan_multi_tunnel.py:13
  /data/sonic-mgmt/tests/vxlan/test_vxlan_multi_tunnel.py:13: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vxlan_tunnel_route_scale.py:23
  /data/sonic-mgmt/tests/vxlan/test_vxlan_tunnel_route_scale.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

zmq/test_gnmi_zmq.py:12
  /data/sonic-mgmt/tests/zmq/test_gnmi_zmq.py:12: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_file.py::test_file_stat is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_system.py::test_system_time is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance[NOTSET] is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance_stats is skipped when no topology marker is given
    warnings.warn(warn_msg)

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /opt/venv/lib/python3.12/site-packages/pytest_ansible/host_manager/v213.py:13: DeprecationWarning: Host management is deprecated and will be removed in a future release
    class HostManagerV213(BaseHostManager):

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_recover_rsyslog_rate_limit[vlab-01]
tests/test_posttest.py::test_enable_startup_tsa_tsb_service
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_dualtor_logs
  /opt/venv/lib/python3.12/site-packages/ansible/plugins/loader.py:1485: UserWarning: AnsibleCollectionFinder has already been configured
    warnings.warn('AnsibleCollectionFinder has already been configured')

tests/test_posttest.py: 107 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=12090) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /data/sonic-mgmt/tests/conftest.py:1248: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    record_testsuite_property("timestamp", datetime.utcnow())

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------ generated xml file: /data/sonic-mgmt/tests/logs/1vlan/posttest.xml ------
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 120.98s (0:02:00) ====
⚠️ setup-testbed.log (30 errors, 4645 lines, 147604B)

Error lines:

763:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.002846", "end": "2026-02-13 20:51:53.320582", "msg": "non-zero return code", "rc": 255, "start": "2026-02-13 20:51:53.317736", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
796:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
1998:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j99883768144.29978', 'results_file': '/root/.ansible_async/j99883768144.29978', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
1999:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j366421642221.30009', 'results_file': '/root/.ansible_async/j366421642221.30009', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2000:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j33070435826.30040', 'results_file': '/root/.ansible_async/j33070435826.30040', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2001:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j847833268909.30081', 'results_file': '/root/.ansible_async/j847833268909.30081', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2179:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_t5wsd_mf/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_t5wsd_mf/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3113:        "failed": false,
3131:        "failed": false,
3149:        "failed": false,
3167:        "failed": false,
4598:STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4599:VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
4600:VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4601:VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4602:VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4603:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4604:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4605:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4606:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4607:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4608:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4609:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4610:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4611:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4612:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4613:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4614:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4615:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4616:VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Tail (80 lines):

skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Done
+ sleep 180
📄 clean-testbed.log (94 lines, 3669B)

Tail (80 lines):

+ docker stop ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ docker rm -f ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0102'
Stopping and removing container: ceos_vms6-1_VM0102
+ docker stop ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ docker rm -f ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0101'
Stopping and removing container: ceos_vms6-1_VM0101
+ docker stop ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ docker rm -f ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0100'
Stopping and removing container: ceos_vms6-1_VM0100
+ docker stop ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ docker rm -f ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0103'
Stopping and removing container: net_vms6-1_VM0103
+ docker stop net_vms6-1_VM0103
net_vms6-1_VM0103
+ docker rm -f net_vms6-1_VM0103
net_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0102'
Stopping and removing container: net_vms6-1_VM0102
+ docker stop net_vms6-1_VM0102
net_vms6-1_VM0102
+ docker rm -f net_vms6-1_VM0102
net_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0101'
Stopping and removing container: net_vms6-1_VM0101
+ docker stop net_vms6-1_VM0101
net_vms6-1_VM0101
+ docker rm -f net_vms6-1_VM0101
net_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0100'
Stopping and removing container: net_vms6-1_VM0100
+ docker stop net_vms6-1_VM0100
net_vms6-1_VM0100
+ docker rm -f net_vms6-1_VM0100
net_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ptf_vms6-1'
Stopping and removing container: ptf_vms6-1
+ docker stop ptf_vms6-1
ptf_vms6-1
+ docker rm -f ptf_vms6-1
ptf_vms6-1
+ echo '=== Cleaning cEOS data directories ==='
=== Cleaning cEOS data directories ===
+ sudo rm -rf /data/ceos/ceos_vms6-1_VM0100 /data/ceos/ceos_vms6-1_VM0101 /data/ceos/ceos_vms6-1_VM0102 /data/ceos/ceos_vms6-1_VM0103
+ echo '=== Removing stale vlab VMs ==='
=== Removing stale vlab VMs ===
+ virsh -c qemu:///system list --all
+ grep -q vlab
+ echo '=== Verifying sonic-mgmt container still running ==='
=== Verifying sonic-mgmt container still running ===
+ docker exec sonic-mgmt echo OK
OK
+ echo '=== Remaining containers ==='
=== Remaining containers ===
+ docker ps -a --format 'table {{.Names}}\t{{.Status}}'
NAMES                STATUS
sonic-mgmt           Up 9 hours
hardcore_keller      Up 25 hours
hopeful_mcclintock   Created
+ echo '=== Cleanup complete ==='
=== Cleanup complete ===
📄 sonic-mgmt-container-setup.log (5 lines, 218B)

Tail (80 lines):

+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo 'sonic-mgmt container already running, nothing to do'
sonic-mgmt container already running, nothing to do
+ exit 0
⚠️ diagnostics.log (2 errors, 132 lines, 8955B)

Error lines:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Tail (80 lines):

-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 129 azureuser azureuser  4096 Feb 13 18:57 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE                                COMMAND       CREATED       STATUS       PORTS     NAMES
43138ce8a47b   docker-sonic-mgmt-azureuser:master   "/bin/bash"   9 hours ago   Up 9 hours   22/tcp    sonic-mgmt
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 13 18:44 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  4 azureuser azureuser 4096 Feb 13 11:58 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030348
drwxr-xr-x 2 azureuser azureuser       4096 Feb 13 18:40 .
drwxr-xr-x 4 azureuser azureuser       4096 Feb 13 11:58 ..
-rw-r--r-- 1 azureuser azureuser 5151064064 Feb 13 18:40 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 13 18:44 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 13 18:45 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 13 18:45 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 13 18:45 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 13 18:45 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-mgmt-azureuser:master                                   dc1af3bbf6a5       5.08GB          992MB   U    
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED        STATUS        PORTS     NAMES
12914646c38f   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago    Up 2 hours              ceos_vms6-1_VM0103
8019af88717a   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago    Up 2 hours              ceos_vms6-1_VM0102
8a30a5163191   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago    Up 2 hours              ceos_vms6-1_VM0101
3d327f5f1dcb   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago    Up 2 hours              ceos_vms6-1_VM0100
6aaa7a63041b   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago    Up 2 hours              net_vms6-1_VM0103
13604328d96a   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago    Up 2 hours              net_vms6-1_VM0102
dcdd25bfb5f7   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago    Up 2 hours              net_vms6-1_VM0101
f4f22d93f5cd   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago    Up 2 hours              net_vms6-1_VM0100
b9ab77a4e0b9   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   2 hours ago    Up 2 hours              ptf_vms6-1
43138ce8a47b   docker-sonic-mgmt-azureuser:master                    "/bin/bash"              9 hours ago    Up 9 hours    22/tcp    sonic-mgmt
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   25 hours ago   Up 25 hours   22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago    Created                 hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===
⚙️ Environment snapshot
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up 50 minutes
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 51 minutes
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 10 hours
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 26 hours
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory ===
total 1152
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 21:45 .
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 21:45 ..
drwxr-xr-x 4 azureuser azureuser   4096 Feb 13 21:45 1vlan
-rw-r--r-- 1 azureuser azureuser   3669 Feb 13 20:50 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser   8955 Feb 13 20:47 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 982444 Feb 13 21:45 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser   4096 Feb 13 21:45 ptf
-rw-r--r-- 1 azureuser azureuser 147604 Feb 13 20:56 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser    218 Feb 13 20:47 sonic-mgmt-container-setup.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  163G  831G  17% /
/dev/root       993G  163G  831G  17% /

🔍 62 error(s) across 5 log file(s)

Adds a pipeline step to fetch the fixed test_cacl.py from
arthur-cog-sonic/sonic-mgmt fork before running kvmtest. The fix
adds teardown to restore_test_env fixture so config_db_check passes.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
Copy link
Author

@devin-ai-integration devin-ai-integration bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Devin Review found 1 potential issue.

View 5 additional findings in Devin Review.

Open in Devin Review

Comment on lines +298 to +299
git remote update
git reset --hard origin/master
Copy link
Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

🔴 git reset --hard origin/master in "Setup testbed" step overwrites the test fix applied by the earlier "Update sonic-mgmt" step

The "Update sonic-mgmt with test fixes" step at azure-pipelines.yml:206 checks out a patched test_cacl.py from the arthur-fork branch into /data/sonic-mgmt. However, the later "Setup testbed" step at azure-pipelines.yml:299 runs git reset --hard origin/master, which resets the entire working tree back to origin/master, completely erasing the file checked out in the earlier step.

Root Cause and Impact

The step ordering is:

  1. Line 206: git checkout arthur-fork/devin/1771222385-fix-cacl-config-restore -- tests/generic_config_updater/test_cacl.py
  2. Line 299: git reset --hard origin/master — this discards ALL uncommitted changes, including the checkout from step 1.

The intent of the "Update sonic-mgmt with test fixes" step is to patch test_cacl.py with a config restore fix before tests run. But because the setup-testbed step does a hard reset afterward, the fix is never actually present when kvmtest.sh executes at line 327. The test_cacl.py config restore fix is silently lost every run.

Impact: The test fix for test_cacl.py is never applied, so kvmtests always run with the unpatched version from origin/master.

Prompt for agents
In azure-pipelines.yml, the "Setup testbed" step (around line 297-299) does `pushd /data/sonic-mgmt`, `git remote update`, then `git reset --hard origin/master`. This erases the test_cacl.py fix that was checked out from arthur-fork in the "Update sonic-mgmt with test fixes" step (line 206). To fix this, either: (1) Move the "Update sonic-mgmt with test fixes" step (lines 193-210) to AFTER the `git reset --hard origin/master` in the "Setup testbed" step (i.e., after line 299 but before line 300), or (2) Re-apply the checkout of test_cacl.py after the git reset in the setup-testbed script. For example, after line 299 add: `git fetch arthur-fork devin/1771222385-fix-cacl-config-restore 2>/dev/null && git checkout arthur-fork/devin/1771222385-fix-cacl-config-restore -- tests/generic_config_updater/test_cacl.py 2>/dev/null || true`.
Open in Devin Review

Was this helpful? React with 👍 or 👎 to provide feedback.

@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #185 | Commit: 60d5031

⚠️ kvmtest-run.log (30 errors, 7791 lines, 982447B)

Pytest summary:

=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip snappi metadata generation for non-tgen testbed
SKIPPED [1] test_pretest.py:469: No URL specified for python saithrift package
SKIPPED [1] common/helpers/assertions.py:16: Skip 'test_backend_acl_load' on non t0-backend testbeds.
=== 10 passed, 3 skipped, 4897 deselected, 727 warnings in 257.53s (0:04:17) ===
=== Running tests individually ===
Running: python3 -m pytest dns/test_dns_resolv_conf.py --inventory ../ansible/veos_vtb --host-pattern vlab-01 --dpu-pattern None --testbed vms-kvm-t0 --testbed_file vtestbed.yaml --log-cli-level warning --log-file-level debug --kube_master unset --showlocals --assert plain --show-capture no -rav --ignore=ptftests --ignore=acstests --ignore=saitests --ignore=scripts --ignore=k8s --ignore=sai_qualify --maxfail=1 --log-file logs/1vlan/dns/test_dns_resolv_conf.log --junitxml=logs/1vlan/dns/test_dns_resolv_conf.xml --allow_recover --completeness_level=confident
============================= test session starts ==============================
=========================== short test summary info ============================
SKIPPED [1] generic_config_updater/test_eth_interface.py:199: Bypass as it is blocking submodule update
SKIPPED [1] generic_config_updater/test_eth_interface.py:335: Bypass as this is not a production scenario
SKIPPED [1] generic_config_updater/test_eth_interface.py:362: Bypass as this is not a production scenario
=========== 10 passed, 3 skipped, 372 warnings in 308.98s (0:05:08) ============
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 124.46s (0:02:04) ====

Error lines:

4:+ exit_on_error=
9:+ exit_on_error=true
89:TASK [get connection graph if defined for dut (ignore any errors)] *************
321:TASK [saved original minigraph file in SONiC DUT(ignore errors when file does not exist)] ***
322:fatal: [vlab-01]: FAILED! => {"changed": true, "cmd": "mv /etc/sonic/minigraph.xml /etc/sonic/minigraph.xml.orig", "delta": "0:00:00.003459", "end": "2026-02-16 07:35:31.471152", "msg": "non-zero return code", "rc": 1, "start": "2026-02-16 07:35:31.467693", "stderr": "mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory", "stderr_lines": ["mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory"], "stdout": "", "stdout_lines": []}
373:    "msg": "Stat result is {'changed': False, 'stat': {'exists': False}, 'failed': False}"
546:ASYNC FAILED on vlab-01: jid=j523617241189.18521
547:fatal: [vlab-01]: FAILED! => {"ansible_job_id": "j523617241189.18521", "changed": true, "cmd": ["chronyd", "-F", "1", "-q"], "delta": "0:00:10.556127", "end": "2026-02-16 07:36:28.191460", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j523617241189.18521", "start": "2026-02-16 07:36:17.635333", "started": 1, "stderr": "2026-02-16T07:36:17Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)\n2026-02-16T07:36:17Z Timezone right/UTC failed leap second check, ignoring\n2026-02-16T07:36:17Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift\n2026-02-16T07:36:17Z Loaded seccomp filter (level 1)\n2026-02-16T07:36:28Z No suitable source for synchronisation\n2026-02-16T07:36:28Z chronyd exiting", "stderr_lines": ["2026-02-16T07:36:17Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)", "2026-02-16T07:36:17Z Timezone right/UTC failed leap second check, ignoring", "2026-02-16T07:36:17Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift", "2026-02-16T07:36:17Z Loaded seccomp filter (level 1)", "2026-02-16T07:36:28Z No suitable source for synchronisation", "2026-02-16T07:36:28Z chronyd exiting"], "stdout": "", "stdout_lines": []}
657:vlab-01                    : ok=85   changed=24   unreachable=0    failed=0    skipped=93   rescued=0    ignored=2   
1737:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152
1865:WARNING  pytest_plus:__init__.py:94 Test <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1866:WARNING  pytest_plus:__init__.py:94 <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status --fetch-from-hardware]> has an id that looks above 60 characters.
1876:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_verify_fec_stats_counters', found at tests/platform_tests/test_intf_fec.py:125 and tests/layer1/test_fec_error.py:27
1892:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -h]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1893:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --help]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1894:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -v]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1895:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --version]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1896:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -j]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1897:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --json]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1898:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -r]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1899:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --raw]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
2816:layer1/test_fec_error.py:8
2817:  /data/sonic-mgmt/tests/layer1/test_fec_error.py:8: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
2820:layer1/test_port_error.py:10
2821:  /data/sonic-mgmt/tests/layer1/test_port_error.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
3548:  /data/sonic-mgmt/tests/snappi_tests/dash/ha/ha_helper.py:71: PytestCollectionWarning: cannot collect test class 'TestPhase' because it has a __init__ constructor (from: tests/snappi_tests/dash/test_cps.py)
3980:=========================== short test summary info ============================
4279:WARNING  tests.conftest:conftest.py:3006 Core dump or config check failed for test_cacl.py, results: {"core_dump_check": {"failed": false, "new_core_dumps": {"vlab-01": []}}, "config_db_check": {"failed": true, "pre_only_config": {"vlab-01": {"null": {"BMP": {"table": {"bgp_neighbor_table": "true", "bgp_rib_in_table": "true", "bgp_rib_out_table": "true"}}, "DNS_NAMESERVER": {"10.11.0.5": {}, "10.11.0.6": {}}, "AAA": {"accounting": {"login": "tacacs+,local"}, "authentication": {"login": "tacacs+"}, "authorization": {"login": "tacacs+"}}, "TACPLUS": {"global": {"auth_type": "login", "passkey": "testing123"}}}}}, "cur_only_config": {"vlab-01": {"null": {}}}, "inconsistent_config": {"vlab-01": {"null": {"FEATURE": {"pre_value": {"bgp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "frr_bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "gbsyncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}, "cur_value": {"bgp": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gbsyncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}}, "DEVICE_METADATA": {"pre_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "orch_northbond_route_zmq_enabled": "true", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}, "cur_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}}}}}}}
4281:DEBUG:tests.conftest:append custom_msg: {'dut_check_result': {'core_dump_check_failed': False, 'config_db_check_failed': True}}
4434:=========================== short test summary info ============================

Tail (80 lines):


voq/test_voq_ipfwd.py:730
  /data/sonic-mgmt/tests/voq/test_voq_ipfwd.py:730: PytestUnknownMarkWarning: Unknown pytest.mark.express - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.param(128, 64, marks=pytest.mark.express),

voq/test_voq_disrupts.py:25
  /data/sonic-mgmt/tests/voq/test_voq_disrupts.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_scale_ecmp.py:19
  /data/sonic-mgmt/tests/vxlan/test_scale_ecmp.py:19: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_vnet_decap.py:14
  /data/sonic-mgmt/tests/vxlan/test_vnet_decap.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vnet_vxlan.py:25
  /data/sonic-mgmt/tests/vxlan/test_vnet_vxlan.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

DEBUG:tests.conftest:[log_custom_msg] item: <Function test_collect_dualtor_logs>
INFO:root:Can not get Allure report URL. Please check logs
vxlan/test_vxlan_multi_tunnel.py:13
  /data/sonic-mgmt/tests/vxlan/test_vxlan_multi_tunnel.py:13: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vxlan_tunnel_route_scale.py:23
  /data/sonic-mgmt/tests/vxlan/test_vxlan_tunnel_route_scale.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

zmq/test_gnmi_zmq.py:12
  /data/sonic-mgmt/tests/zmq/test_gnmi_zmq.py:12: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_file.py::test_file_stat is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_system.py::test_system_time is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance[NOTSET] is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance_stats is skipped when no topology marker is given
    warnings.warn(warn_msg)

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /opt/venv/lib/python3.12/site-packages/pytest_ansible/host_manager/v213.py:13: DeprecationWarning: Host management is deprecated and will be removed in a future release
    class HostManagerV213(BaseHostManager):

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_recover_rsyslog_rate_limit[vlab-01]
tests/test_posttest.py::test_enable_startup_tsa_tsb_service
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_dualtor_logs
  /opt/venv/lib/python3.12/site-packages/ansible/plugins/loader.py:1485: UserWarning: AnsibleCollectionFinder has already been configured
    warnings.warn('AnsibleCollectionFinder has already been configured')

tests/test_posttest.py: 107 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=22532) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /data/sonic-mgmt/tests/conftest.py:1248: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    record_testsuite_property("timestamp", datetime.utcnow())

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------ generated xml file: /data/sonic-mgmt/tests/logs/1vlan/posttest.xml ------
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 124.46s (0:02:04) ====
⚠️ setup-testbed.log (30 errors, 4651 lines, 148003B)

Error lines:

768:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.003010", "end": "2026-02-16 07:27:04.443917", "msg": "non-zero return code", "rc": 255, "start": "2026-02-16 07:27:04.440907", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
801:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
2003:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j476107640916.13605', 'results_file': '/root/.ansible_async/j476107640916.13605', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
2004:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j22313272308.13636', 'results_file': '/root/.ansible_async/j22313272308.13636', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2005:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j96716876201.13668', 'results_file': '/root/.ansible_async/j96716876201.13668', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2006:FAILED - RETRYING: [STR-ACS-VSERV-01]: Wait for creation of net base containers (10 retries left).
2007:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j422859825225.13707', 'results_file': '/root/.ansible_async/j422859825225.13707', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2185:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_p2jzyxv2/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_p2jzyxv2/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3119:        "failed": false,
3137:        "failed": false,
3155:        "failed": false,
3173:        "failed": false,
4604:STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4605:VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
4606:VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4607:VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4608:VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4609:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4610:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4611:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4612:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4613:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4614:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4615:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4616:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4617:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4618:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4619:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4620:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4621:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Tail (80 lines):

skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Done
+ sleep 180
📄 clean-testbed.log (94 lines, 3666B)

Tail (80 lines):

+ docker stop ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ docker rm -f ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0103'
Stopping and removing container: ceos_vms6-1_VM0103
+ docker stop ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ docker rm -f ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0100'
Stopping and removing container: ceos_vms6-1_VM0100
+ docker stop ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ docker rm -f ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0101'
Stopping and removing container: ceos_vms6-1_VM0101
+ docker stop ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ docker rm -f ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0103'
Stopping and removing container: net_vms6-1_VM0103
+ docker stop net_vms6-1_VM0103
net_vms6-1_VM0103
+ docker rm -f net_vms6-1_VM0103
net_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0102'
Stopping and removing container: net_vms6-1_VM0102
+ docker stop net_vms6-1_VM0102
net_vms6-1_VM0102
+ docker rm -f net_vms6-1_VM0102
net_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0101'
Stopping and removing container: net_vms6-1_VM0101
+ docker stop net_vms6-1_VM0101
net_vms6-1_VM0101
+ docker rm -f net_vms6-1_VM0101
net_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0100'
Stopping and removing container: net_vms6-1_VM0100
+ docker stop net_vms6-1_VM0100
net_vms6-1_VM0100
+ docker rm -f net_vms6-1_VM0100
net_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ptf_vms6-1'
Stopping and removing container: ptf_vms6-1
+ docker stop ptf_vms6-1
ptf_vms6-1
+ docker rm -f ptf_vms6-1
ptf_vms6-1
+ echo '=== Cleaning cEOS data directories ==='
=== Cleaning cEOS data directories ===
+ sudo rm -rf /data/ceos/ceos_vms6-1_VM0100 /data/ceos/ceos_vms6-1_VM0101 /data/ceos/ceos_vms6-1_VM0102 /data/ceos/ceos_vms6-1_VM0103
+ echo '=== Removing stale vlab VMs ==='
=== Removing stale vlab VMs ===
+ virsh -c qemu:///system list --all
+ grep -q vlab
+ echo '=== Verifying sonic-mgmt container still running ==='
=== Verifying sonic-mgmt container still running ===
+ docker exec sonic-mgmt echo OK
OK
+ echo '=== Remaining containers ==='
=== Remaining containers ===
+ docker ps -a --format 'table {{.Names}}\t{{.Status}}'
NAMES                STATUS
sonic-mgmt           Up 2 days
hardcore_keller      Up 3 days
hopeful_mcclintock   Created
+ echo '=== Cleanup complete ==='
=== Cleanup complete ===
📄 update-sonic-mgmt.log (17 lines, 1060B)

Tail (80 lines):

+ echo '=== Updating /data/sonic-mgmt from arthur-cog-sonic fork ==='
=== Updating /data/sonic-mgmt from arthur-cog-sonic fork ===
+ cd /data/sonic-mgmt
+ git remote get-url arthur-fork
+ git remote add arthur-fork https://github.com/arthur-cog-sonic/sonic-mgmt.git
+ git fetch arthur-fork devin/1771222385-fix-cacl-config-restore
From https://github.com/arthur-cog-sonic/sonic-mgmt
 * branch                devin/1771222385-fix-cacl-config-restore -> FETCH_HEAD
 * [new branch]          devin/1771222385-fix-cacl-config-restore -> arthur-fork/devin/1771222385-fix-cacl-config-restore
+ git checkout arthur-fork/devin/1771222385-fix-cacl-config-restore -- tests/generic_config_updater/test_cacl.py
+ echo '=== Updated test_cacl.py with config restore fix ==='
=== Updated test_cacl.py with config restore fix ===
+ git diff --stat HEAD
 ansible/group_vars/vm_host/ceos.yml       | 16 +++++-----------
 ansible/veos_vtb                          |  4 ++--
 tests/generic_config_updater/test_cacl.py |  5 +++++
 3 files changed, 12 insertions(+), 13 deletions(-)
📄 sonic-mgmt-container-setup.log (5 lines, 218B)

Tail (80 lines):

+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo 'sonic-mgmt container already running, nothing to do'
sonic-mgmt container already running, nothing to do
+ exit 0
⚠️ diagnostics.log (2 errors, 132 lines, 8912B)

Error lines:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Tail (80 lines):

-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 129 azureuser azureuser  4096 Feb 13 21:08 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE                                COMMAND       CREATED      STATUS      PORTS     NAMES
43138ce8a47b   docker-sonic-mgmt-azureuser:master   "/bin/bash"   2 days ago   Up 2 days   22/tcp    sonic-mgmt
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 13 20:55 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  4 azureuser azureuser 4096 Feb 13 11:58 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 4995980
drwxr-xr-x 2 azureuser azureuser       4096 Feb 13 20:51 .
drwxr-xr-x 4 azureuser azureuser       4096 Feb 13 11:58 ..
-rw-r--r-- 1 azureuser azureuser 5115871232 Feb 13 20:50 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 13 20:55 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 13 20:55 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 13 20:55 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 13 20:55 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 13 20:55 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-mgmt-azureuser:master                                   dc1af3bbf6a5       5.08GB          992MB   U    
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED       STATUS      PORTS     NAMES
fb734c766aba   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 days ago    Up 2 days             ceos_vms6-1_VM0102
64511993d378   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 days ago    Up 2 days             ceos_vms6-1_VM0103
ee4f1b79bcae   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 days ago    Up 2 days             ceos_vms6-1_VM0100
1fffd17f7252   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 days ago    Up 2 days             ceos_vms6-1_VM0101
ac047e9f336e   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 days ago    Up 2 days             net_vms6-1_VM0103
94d75a1d108c   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 days ago    Up 2 days             net_vms6-1_VM0102
10217b187370   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 days ago    Up 2 days             net_vms6-1_VM0101
b4e0408e77e3   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 days ago    Up 2 days             net_vms6-1_VM0100
af298bad4e2e   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   2 days ago    Up 2 days             ptf_vms6-1
43138ce8a47b   docker-sonic-mgmt-azureuser:master                    "/bin/bash"              2 days ago    Up 2 days   22/tcp    sonic-mgmt
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   3 days ago    Up 3 days   22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago   Created               hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===
⚙️ Environment snapshot
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up 51 minutes
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 52 minutes
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 53 minutes
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 2 days
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 3 days
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory ===
total 1156
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 08:22 .
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 08:22 ..
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 08:22 1vlan
-rw-r--r-- 1 azureuser azureuser   3666 Feb 16 07:25 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser   8912 Feb 16 07:23 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 982447 Feb 16 08:22 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser   4096 Feb 16 08:22 ptf
-rw-r--r-- 1 azureuser azureuser 148003 Feb 16 07:32 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser    218 Feb 16 07:23 sonic-mgmt-container-setup.log
-rw-r--r-- 1 azureuser azureuser   1060 Feb 16 07:23 update-sonic-mgmt.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  168G  825G  17% /
/dev/root       993G  168G  825G  17% /

🔍 62 error(s) across 6 log file(s)

devin-ai-integration bot and others added 2 commits February 16, 2026 08:30
The previous approach using git fetch from the fork likely failed on the
build agent due to network/auth issues. Replace with an inline Python
script that patches test_cacl.py directly on the agent filesystem.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
The previous inline Python code broke Azure Pipelines YAML parser
because unindented Python lines violated the YAML literal block scalar
indentation rules. Fix by base64-encoding the patch script and
decoding it at runtime on the agent.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #188 | Commit: 996eab6

⚠️ kvmtest-run.log (30 errors, 7791 lines, 982442B)

Pytest summary:

=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip snappi metadata generation for non-tgen testbed
SKIPPED [1] test_pretest.py:469: No URL specified for python saithrift package
SKIPPED [1] common/helpers/assertions.py:16: Skip 'test_backend_acl_load' on non t0-backend testbeds.
=== 10 passed, 3 skipped, 4897 deselected, 727 warnings in 259.35s (0:04:19) ===
=== Running tests individually ===
Running: python3 -m pytest dns/test_dns_resolv_conf.py --inventory ../ansible/veos_vtb --host-pattern vlab-01 --dpu-pattern None --testbed vms-kvm-t0 --testbed_file vtestbed.yaml --log-cli-level warning --log-file-level debug --kube_master unset --showlocals --assert plain --show-capture no -rav --ignore=ptftests --ignore=acstests --ignore=saitests --ignore=scripts --ignore=k8s --ignore=sai_qualify --maxfail=1 --log-file logs/1vlan/dns/test_dns_resolv_conf.log --junitxml=logs/1vlan/dns/test_dns_resolv_conf.xml --allow_recover --completeness_level=confident
============================= test session starts ==============================
=========================== short test summary info ============================
SKIPPED [1] generic_config_updater/test_eth_interface.py:199: Bypass as it is blocking submodule update
SKIPPED [1] generic_config_updater/test_eth_interface.py:335: Bypass as this is not a production scenario
SKIPPED [1] generic_config_updater/test_eth_interface.py:362: Bypass as this is not a production scenario
=========== 10 passed, 3 skipped, 372 warnings in 303.09s (0:05:03) ============
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 122.45s (0:02:02) ====

Error lines:

4:+ exit_on_error=
9:+ exit_on_error=true
89:TASK [get connection graph if defined for dut (ignore any errors)] *************
321:TASK [saved original minigraph file in SONiC DUT(ignore errors when file does not exist)] ***
322:fatal: [vlab-01]: FAILED! => {"changed": true, "cmd": "mv /etc/sonic/minigraph.xml /etc/sonic/minigraph.xml.orig", "delta": "0:00:00.003619", "end": "2026-02-16 18:56:49.300683", "msg": "non-zero return code", "rc": 1, "start": "2026-02-16 18:56:49.297064", "stderr": "mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory", "stderr_lines": ["mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory"], "stdout": "", "stdout_lines": []}
373:    "msg": "Stat result is {'changed': False, 'stat': {'exists': False}, 'failed': False}"
546:ASYNC FAILED on vlab-01: jid=j751403936083.18011
547:fatal: [vlab-01]: FAILED! => {"ansible_job_id": "j751403936083.18011", "changed": true, "cmd": ["chronyd", "-F", "1", "-q"], "delta": "0:00:10.519419", "end": "2026-02-16 18:57:43.542974", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j751403936083.18011", "start": "2026-02-16 18:57:33.023555", "started": 1, "stderr": "2026-02-16T18:57:33Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)\n2026-02-16T18:57:33Z Timezone right/UTC failed leap second check, ignoring\n2026-02-16T18:57:33Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift\n2026-02-16T18:57:33Z Loaded seccomp filter (level 1)\n2026-02-16T18:57:43Z No suitable source for synchronisation\n2026-02-16T18:57:43Z chronyd exiting", "stderr_lines": ["2026-02-16T18:57:33Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)", "2026-02-16T18:57:33Z Timezone right/UTC failed leap second check, ignoring", "2026-02-16T18:57:33Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift", "2026-02-16T18:57:33Z Loaded seccomp filter (level 1)", "2026-02-16T18:57:43Z No suitable source for synchronisation", "2026-02-16T18:57:43Z chronyd exiting"], "stdout": "", "stdout_lines": []}
657:vlab-01                    : ok=85   changed=24   unreachable=0    failed=0    skipped=93   rescued=0    ignored=2   
1737:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152
1865:WARNING  pytest_plus:__init__.py:94 Test <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1866:WARNING  pytest_plus:__init__.py:94 <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status --fetch-from-hardware]> has an id that looks above 60 characters.
1876:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_verify_fec_stats_counters', found at tests/platform_tests/test_intf_fec.py:125 and tests/layer1/test_fec_error.py:27
1892:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -h]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1893:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --help]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1894:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -v]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1895:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --version]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1896:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -j]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1897:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --json]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1898:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -r]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1899:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --raw]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
2816:layer1/test_fec_error.py:8
2817:  /data/sonic-mgmt/tests/layer1/test_fec_error.py:8: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
2820:layer1/test_port_error.py:10
2821:  /data/sonic-mgmt/tests/layer1/test_port_error.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
3548:  /data/sonic-mgmt/tests/snappi_tests/dash/ha/ha_helper.py:71: PytestCollectionWarning: cannot collect test class 'TestPhase' because it has a __init__ constructor (from: tests/snappi_tests/dash/test_cps.py)
3980:=========================== short test summary info ============================
4279:WARNING  tests.conftest:conftest.py:3006 Core dump or config check failed for test_cacl.py, results: {"core_dump_check": {"failed": false, "new_core_dumps": {"vlab-01": []}}, "config_db_check": {"failed": true, "pre_only_config": {"vlab-01": {"null": {"TACPLUS": {"global": {"auth_type": "login", "passkey": "testing123"}}, "DNS_NAMESERVER": {"10.11.0.5": {}, "10.11.0.6": {}}, "BMP": {"table": {"bgp_neighbor_table": "true", "bgp_rib_in_table": "true", "bgp_rib_out_table": "true"}}, "AAA": {"accounting": {"login": "tacacs+,local"}, "authentication": {"login": "tacacs+"}, "authorization": {"login": "tacacs+"}}}}}, "cur_only_config": {"vlab-01": {"null": {}}}, "inconsistent_config": {"vlab-01": {"null": {"DEVICE_METADATA": {"pre_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "orch_northbond_route_zmq_enabled": "true", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}, "cur_value": {"localhost": {"bgp_asn": "65100", "buffer_model": "traditional", "cloudtype": "Public", "default_bgp_status": "up", "default_pfcwd_status": "disable", "deployment_id": "1", "docker_routing_config_mode": "separated", "hostname": "vlab-01", "hwsku": "Force10-S6000", "mac": "22:48:23:27:33:d8", "platform": "x86_64-kvm_x86_64-r0", "region": "None", "synchronous_mode": "enable", "timezone": "UTC", "type": "ToRRouter", "yang_config_validation": "disable"}}}, "FEATURE": {"pre_value": {"bgp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "frr_bmp": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "false"}, "gbsyncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "disabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "disabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "disabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "disabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}, "cur_value": {"bgp": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "database": {"auto_restart": "always_enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "always_enabled", "support_syslog_rate_limit": "true"}, "dhcp_relay": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "enabled", "support_syslog_rate_limit": "True"}, "dhcp_server": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "False"}, "eventd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gbsyncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "gnmi": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "lldp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "macsec": {"auto_restart": "enabled", "check_up_status": "False", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "set_owner": "local", "state": "disabled", "support_syslog_rate_limit": "True"}, "mgmt-framework": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "mux": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "always_disabled", "support_syslog_rate_limit": "true"}, "nat": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "pmon": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "radv": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "sflow": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "disabled", "support_syslog_rate_limit": "true"}, "snmp": {"auto_restart": "enabled", "delayed": "True", "has_global_scope": "True", "has_per_asic_scope": "False", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "swss": {"auto_restart": "enabled", "check_up_status": "false", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "syncd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "teamd": {"auto_restart": "enabled", "delayed": "False", "has_global_scope": "False", "has_per_asic_scope": "True", "high_mem_alert": "disabled", "state": "enabled", "support_syslog_rate_limit": "true"}, "telemetry": {"delayed": "False", "has_global_scope": "True", "has_per_asic_scope": "False", "state": "disabled"}}}}}}}}
4281:DEBUG:tests.conftest:append custom_msg: {'dut_check_result': {'core_dump_check_failed': False, 'config_db_check_failed': True}}
4434:=========================== short test summary info ============================

Tail (80 lines):


voq/test_voq_ipfwd.py:730
  /data/sonic-mgmt/tests/voq/test_voq_ipfwd.py:730: PytestUnknownMarkWarning: Unknown pytest.mark.express - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.param(128, 64, marks=pytest.mark.express),

voq/test_voq_disrupts.py:25
  /data/sonic-mgmt/tests/voq/test_voq_disrupts.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_scale_ecmp.py:19
  /data/sonic-mgmt/tests/vxlan/test_scale_ecmp.py:19: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_vnet_decap.py:14
  /data/sonic-mgmt/tests/vxlan/test_vnet_decap.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vnet_vxlan.py:25
  /data/sonic-mgmt/tests/vxlan/test_vnet_vxlan.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

DEBUG:tests.conftest:[log_custom_msg] item: <Function test_collect_dualtor_logs>
INFO:root:Can not get Allure report URL. Please check logs
vxlan/test_vxlan_multi_tunnel.py:13
  /data/sonic-mgmt/tests/vxlan/test_vxlan_multi_tunnel.py:13: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vxlan_tunnel_route_scale.py:23
  /data/sonic-mgmt/tests/vxlan/test_vxlan_tunnel_route_scale.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

zmq/test_gnmi_zmq.py:12
  /data/sonic-mgmt/tests/zmq/test_gnmi_zmq.py:12: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_file.py::test_file_stat is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_system.py::test_system_time is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance[NOTSET] is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance_stats is skipped when no topology marker is given
    warnings.warn(warn_msg)

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /opt/venv/lib/python3.12/site-packages/pytest_ansible/host_manager/v213.py:13: DeprecationWarning: Host management is deprecated and will be removed in a future release
    class HostManagerV213(BaseHostManager):

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_recover_rsyslog_rate_limit[vlab-01]
tests/test_posttest.py::test_enable_startup_tsa_tsb_service
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_dualtor_logs
  /opt/venv/lib/python3.12/site-packages/ansible/plugins/loader.py:1485: UserWarning: AnsibleCollectionFinder has already been configured
    warnings.warn('AnsibleCollectionFinder has already been configured')

tests/test_posttest.py: 107 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=504) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /data/sonic-mgmt/tests/conftest.py:1248: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    record_testsuite_property("timestamp", datetime.utcnow())

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------ generated xml file: /data/sonic-mgmt/tests/logs/1vlan/posttest.xml ------
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 122.45s (0:02:02) ====
⚠️ setup-testbed.log (30 errors, 4646 lines, 147629B)

Error lines:

764:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.002832", "end": "2026-02-16 18:48:50.107359", "msg": "non-zero return code", "rc": 255, "start": "2026-02-16 18:48:50.104527", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
797:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
1999:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j555455324253.26161', 'results_file': '/root/.ansible_async/j555455324253.26161', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
2000:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j825253436434.26192', 'results_file': '/root/.ansible_async/j825253436434.26192', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2001:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j120562052816.26223', 'results_file': '/root/.ansible_async/j120562052816.26223', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2002:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j338456256947.26261', 'results_file': '/root/.ansible_async/j338456256947.26261', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2180:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_nyk8e7nj/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_nyk8e7nj/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3114:        "failed": false,
3132:        "failed": false,
3150:        "failed": false,
3168:        "failed": false,
4599:STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4600:VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
4601:VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4602:VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4603:VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4604:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4605:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4606:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4607:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4608:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4609:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4610:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4611:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4612:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4613:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4614:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4615:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4616:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4617:VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Tail (80 lines):

skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Done
+ sleep 180
📄 clean-testbed.log (94 lines, 3666B)

Tail (80 lines):

+ docker stop ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ docker rm -f ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0100'
Stopping and removing container: ceos_vms6-1_VM0100
+ docker stop ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ docker rm -f ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0103'
Stopping and removing container: ceos_vms6-1_VM0103
+ docker stop ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ docker rm -f ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0101'
Stopping and removing container: ceos_vms6-1_VM0101
+ docker stop ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ docker rm -f ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0103'
Stopping and removing container: net_vms6-1_VM0103
+ docker stop net_vms6-1_VM0103
net_vms6-1_VM0103
+ docker rm -f net_vms6-1_VM0103
net_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0102'
Stopping and removing container: net_vms6-1_VM0102
+ docker stop net_vms6-1_VM0102
net_vms6-1_VM0102
+ docker rm -f net_vms6-1_VM0102
net_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0101'
Stopping and removing container: net_vms6-1_VM0101
+ docker stop net_vms6-1_VM0101
net_vms6-1_VM0101
+ docker rm -f net_vms6-1_VM0101
net_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0100'
Stopping and removing container: net_vms6-1_VM0100
+ docker stop net_vms6-1_VM0100
net_vms6-1_VM0100
+ docker rm -f net_vms6-1_VM0100
net_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ptf_vms6-1'
Stopping and removing container: ptf_vms6-1
+ docker stop ptf_vms6-1
ptf_vms6-1
+ docker rm -f ptf_vms6-1
ptf_vms6-1
+ echo '=== Cleaning cEOS data directories ==='
=== Cleaning cEOS data directories ===
+ sudo rm -rf /data/ceos/ceos_vms6-1_VM0100 /data/ceos/ceos_vms6-1_VM0101 /data/ceos/ceos_vms6-1_VM0102 /data/ceos/ceos_vms6-1_VM0103
+ echo '=== Removing stale vlab VMs ==='
=== Removing stale vlab VMs ===
+ virsh -c qemu:///system list --all
+ grep -q vlab
+ echo '=== Verifying sonic-mgmt container still running ==='
=== Verifying sonic-mgmt container still running ===
+ docker exec sonic-mgmt echo OK
OK
+ echo '=== Remaining containers ==='
=== Remaining containers ===
+ docker ps -a --format 'table {{.Names}}\t{{.Status}}'
NAMES                STATUS
sonic-mgmt           Up 3 days
hardcore_keller      Up 3 days
hopeful_mcclintock   Created
+ echo '=== Cleanup complete ==='
=== Cleanup complete ===
📄 update-sonic-mgmt.log (32 lines, 3334B)

Tail (80 lines):

+ TARGET=/data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
+ echo '=== Patching test_cacl.py with config restore fix ==='
=== Patching test_cacl.py with config restore fix ===
+ echo 'Before patch:'
Before patch:
+ sed -n 49,53p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
+ echo aW1wb3J0IHN5cwp0YXJnZXQgPSBzeXMuYXJndlsxXQp3aXRoIG9wZW4odGFyZ2V0LCAncicpIGFzIGY6CiAgICBjb250ZW50ID0gZi5yZWFkKCkKb2xkID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgY29uZmlnX3NvdXJjZT0ibWluaWdyYXBoIiwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICB5aWVsZCcKbmV3ID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGR1dGhvc3Quc2hlbGwoImNwIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24gL2V0Yy9zb25pYy9jb25maWdfZGIuanNvbi5iZWZvcmVfY2FjbCIpXG4gICAgY29uZmlnX3JlbG9hZChkdXRob3N0LCBjb25maWdfc291cmNlPSJtaW5pZ3JhcGgiLCBzYWZlX3JlbG9hZD1UcnVlKVxuICAgIHlpZWxkXG4gICAgbG9nZ2VyLmluZm8oIlJlc3RvcmluZyBvcmlnaW5hbCBjb25maWdfZGIuanNvbiBhZnRlciB0ZXN0X2NhY2wgbW9kdWxlIilcbiAgICBkdXRob3N0LnNoZWxsKCJjcCAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24iKVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICBkdXRob3N0LnNoZWxsKCJybSAtZiAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIiknCmlmIG9sZCBpbiBjb250ZW50OgogICAgY29udGVudCA9IGNvbnRlbnQucmVwbGFjZShvbGQsIG5ldykKICAgIHdpdGggb3Blbih0YXJnZXQsICd3JykgYXMgZjoKICAgICAgICBmLndyaXRlKGNvbnRlbnQpCiAgICBwcmludCgnU1VDQ0VTUzogdGVzdF9jYWNsLnB5IHBhdGNoZWQnKQplbHNlOgogICAgcHJpbnQoJ1dBUk5JTkc6IFBhdHRlcm4gbm90IGZvdW5kLCBjaGVja2luZyBmaWxlIGNvbnRlbnQuLi4nKQogICAgZm9yIGksIGxpbmUgaW4gZW51bWVyYXRlKGNvbnRlbnQuc3BsaXQoJ1xuJyksIDEpOgogICAgICAgIGlmICdyZXN0b3JlX3Rlc3RfZW52JyBpbiBsaW5lIG9yICdiZWZvcmVfY2FjbCcgaW4gbGluZToKICAgICAgICAgICAgcHJpbnQoZicgIExpbmUge2l9OiB7bGluZX0nKQo=
+ base64 -d
+ python3 /tmp/patch_cacl.py /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
SUCCESS: test_cacl.py patched
+ echo 'After patch:'
After patch:
+ sed -n 49,62p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    duthost.shell("cp /etc/sonic/config_db.json /etc/sonic/config_db.json.before_cacl")
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
    logger.info("Restoring original config_db.json after test_cacl module")
    duthost.shell("cp /etc/sonic/config_db.json.before_cacl /etc/sonic/config_db.json")
    config_reload(duthost, safe_reload=True)
    duthost.shell("rm -f /etc/sonic/config_db.json.before_cacl")


@pytest.fixture(scope="module", autouse=True)
def disable_port_toggle(duthosts, tbinfo, restore_test_env):
📄 sonic-mgmt-container-setup.log (5 lines, 218B)

Tail (80 lines):

+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo 'sonic-mgmt container already running, nothing to do'
sonic-mgmt container already running, nothing to do
+ exit 0
⚠️ diagnostics.log (2 errors, 132 lines, 8951B)

Error lines:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Tail (80 lines):

-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 129 azureuser azureuser  4096 Feb 16 07:44 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE                                COMMAND       CREATED      STATUS      PORTS     NAMES
43138ce8a47b   docker-sonic-mgmt-azureuser:master   "/bin/bash"   3 days ago   Up 3 days   22/tcp    sonic-mgmt
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 16 07:31 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  4 azureuser azureuser 4096 Feb 13 11:58 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030284
drwxr-xr-x 2 azureuser azureuser       4096 Feb 16 07:26 .
drwxr-xr-x 4 azureuser azureuser       4096 Feb 13 11:58 ..
-rw-r--r-- 1 azureuser azureuser 5150998528 Feb 16 07:25 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 16 07:31 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 16 07:31 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 16 07:31 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 16 07:31 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 16 07:31 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-mgmt-azureuser:master                                   dc1af3bbf6a5       5.08GB          992MB   U    
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED        STATUS        PORTS     NAMES
791e830e7125   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   11 hours ago   Up 11 hours             ceos_vms6-1_VM0102
675f423f43d3   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   11 hours ago   Up 11 hours             ceos_vms6-1_VM0100
f11658098148   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   11 hours ago   Up 11 hours             ceos_vms6-1_VM0103
97bc3b256e91   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   11 hours ago   Up 11 hours             ceos_vms6-1_VM0101
5fdd06a3eab9   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   11 hours ago   Up 11 hours             net_vms6-1_VM0103
e2bdff0fead9   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   11 hours ago   Up 11 hours             net_vms6-1_VM0102
1e03c28be860   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   11 hours ago   Up 11 hours             net_vms6-1_VM0101
d10dcd88dc63   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   11 hours ago   Up 11 hours             net_vms6-1_VM0100
52dacb690f2d   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   11 hours ago   Up 11 hours             ptf_vms6-1
43138ce8a47b   docker-sonic-mgmt-azureuser:master                    "/bin/bash"              3 days ago     Up 3 days     22/tcp    sonic-mgmt
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   3 days ago     Up 3 days     22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago    Created                 hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===
⚙️ Environment snapshot
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up 50 minutes
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up 50 minutes
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 51 minutes
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 51 minutes
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 3 days
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 4 days
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory ===
total 1156
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 19:43 .
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 19:43 ..
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 19:43 1vlan
-rw-r--r-- 1 azureuser azureuser   3666 Feb 16 18:47 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser   8951 Feb 16 18:44 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 982442 Feb 16 19:43 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser   4096 Feb 16 19:43 ptf
-rw-r--r-- 1 azureuser azureuser 147629 Feb 16 18:53 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser    218 Feb 16 18:44 sonic-mgmt-container-setup.log
-rw-r--r-- 1 azureuser azureuser   3334 Feb 16 18:44 update-sonic-mgmt.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  174G  819G  18% /
/dev/root       993G  174G  819G  18% /

🔍 62 error(s) across 6 log file(s)

The patch was being applied before the setup testbed step which
runs 'git reset --hard origin/master', reverting the patch.
Move patch step to after setup testbed but before test execution.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #189 | Commit: 4316694

⚠️ kvmtest-run.log (30 errors, 7785 lines, 971230B)

Pytest summary:

=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip snappi metadata generation for non-tgen testbed
SKIPPED [1] test_pretest.py:469: No URL specified for python saithrift package
SKIPPED [1] common/helpers/assertions.py:16: Skip 'test_backend_acl_load' on non t0-backend testbeds.
=== 10 passed, 3 skipped, 4897 deselected, 727 warnings in 264.18s (0:04:24) ===
=== Running tests individually ===
Running: python3 -m pytest dns/test_dns_resolv_conf.py --inventory ../ansible/veos_vtb --host-pattern vlab-01 --dpu-pattern None --testbed vms-kvm-t0 --testbed_file vtestbed.yaml --log-cli-level warning --log-file-level debug --kube_master unset --showlocals --assert plain --show-capture no -rav --ignore=ptftests --ignore=acstests --ignore=saitests --ignore=scripts --ignore=k8s --ignore=sai_qualify --maxfail=1 --log-file logs/1vlan/dns/test_dns_resolv_conf.log --junitxml=logs/1vlan/dns/test_dns_resolv_conf.xml --allow_recover --completeness_level=confident
============================= test session starts ==============================
=========================== short test summary info ============================
SKIPPED [1] generic_config_updater/test_eth_interface.py:199: Bypass as it is blocking submodule update
SKIPPED [1] generic_config_updater/test_eth_interface.py:335: Bypass as this is not a production scenario
SKIPPED [1] generic_config_updater/test_eth_interface.py:362: Bypass as this is not a production scenario
=========== 10 passed, 3 skipped, 372 warnings in 310.20s (0:05:10) ============
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 123.67s (0:02:03) ====

Error lines:

4:+ exit_on_error=
9:+ exit_on_error=true
89:TASK [get connection graph if defined for dut (ignore any errors)] *************
321:TASK [saved original minigraph file in SONiC DUT(ignore errors when file does not exist)] ***
322:fatal: [vlab-01]: FAILED! => {"changed": true, "cmd": "mv /etc/sonic/minigraph.xml /etc/sonic/minigraph.xml.orig", "delta": "0:00:00.003480", "end": "2026-02-16 21:10:59.469723", "msg": "non-zero return code", "rc": 1, "start": "2026-02-16 21:10:59.466243", "stderr": "mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory", "stderr_lines": ["mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory"], "stdout": "", "stdout_lines": []}
373:    "msg": "Stat result is {'changed': False, 'stat': {'exists': False}, 'failed': False}"
546:ASYNC FAILED on vlab-01: jid=j309229884628.18137
547:fatal: [vlab-01]: FAILED! => {"ansible_job_id": "j309229884628.18137", "changed": true, "cmd": ["chronyd", "-F", "1", "-q"], "delta": "0:00:10.554180", "end": "2026-02-16 21:11:54.237824", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j309229884628.18137", "start": "2026-02-16 21:11:43.683644", "started": 1, "stderr": "2026-02-16T21:11:43Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)\n2026-02-16T21:11:43Z Timezone right/UTC failed leap second check, ignoring\n2026-02-16T21:11:43Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift\n2026-02-16T21:11:43Z Loaded seccomp filter (level 1)\n2026-02-16T21:11:54Z No suitable source for synchronisation\n2026-02-16T21:11:54Z chronyd exiting", "stderr_lines": ["2026-02-16T21:11:43Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)", "2026-02-16T21:11:43Z Timezone right/UTC failed leap second check, ignoring", "2026-02-16T21:11:43Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift", "2026-02-16T21:11:43Z Loaded seccomp filter (level 1)", "2026-02-16T21:11:54Z No suitable source for synchronisation", "2026-02-16T21:11:54Z chronyd exiting"], "stdout": "", "stdout_lines": []}
657:vlab-01                    : ok=85   changed=24   unreachable=0    failed=0    skipped=93   rescued=0    ignored=2   
1737:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152
1865:WARNING  pytest_plus:__init__.py:94 Test <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1866:WARNING  pytest_plus:__init__.py:94 <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status --fetch-from-hardware]> has an id that looks above 60 characters.
1876:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_verify_fec_stats_counters', found at tests/platform_tests/test_intf_fec.py:125 and tests/layer1/test_fec_error.py:27
1892:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -h]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1893:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --help]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1894:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -v]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1895:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --version]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1896:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -j]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1897:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --json]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1898:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -r]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1899:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --raw]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
2816:layer1/test_fec_error.py:8
2817:  /data/sonic-mgmt/tests/layer1/test_fec_error.py:8: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
2820:layer1/test_port_error.py:10
2821:  /data/sonic-mgmt/tests/layer1/test_port_error.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
3548:  /data/sonic-mgmt/tests/snappi_tests/dash/ha/ha_helper.py:71: PytestCollectionWarning: cannot collect test class 'TestPhase' because it has a __init__ constructor (from: tests/snappi_tests/dash/test_cps.py)
3980:=========================== short test summary info ============================
4430:=========================== short test summary info ============================
4489:ERROR: file or directory not found: generic_config_updater/test_ipv6.py
5543:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152

Tail (80 lines):


voq/test_voq_ipfwd.py:730
  /data/sonic-mgmt/tests/voq/test_voq_ipfwd.py:730: PytestUnknownMarkWarning: Unknown pytest.mark.express - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.param(128, 64, marks=pytest.mark.express),

voq/test_voq_disrupts.py:25
  /data/sonic-mgmt/tests/voq/test_voq_disrupts.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_scale_ecmp.py:19
  /data/sonic-mgmt/tests/vxlan/test_scale_ecmp.py:19: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_vnet_decap.py:14
  /data/sonic-mgmt/tests/vxlan/test_vnet_decap.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vnet_vxlan.py:25
  /data/sonic-mgmt/tests/vxlan/test_vnet_vxlan.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

DEBUG:tests.conftest:[log_custom_msg] item: <Function test_collect_dualtor_logs>
INFO:root:Can not get Allure report URL. Please check logs
vxlan/test_vxlan_multi_tunnel.py:13
  /data/sonic-mgmt/tests/vxlan/test_vxlan_multi_tunnel.py:13: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vxlan_tunnel_route_scale.py:23
  /data/sonic-mgmt/tests/vxlan/test_vxlan_tunnel_route_scale.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

zmq/test_gnmi_zmq.py:12
  /data/sonic-mgmt/tests/zmq/test_gnmi_zmq.py:12: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_file.py::test_file_stat is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_system.py::test_system_time is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance[NOTSET] is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance_stats is skipped when no topology marker is given
    warnings.warn(warn_msg)

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /opt/venv/lib/python3.12/site-packages/pytest_ansible/host_manager/v213.py:13: DeprecationWarning: Host management is deprecated and will be removed in a future release
    class HostManagerV213(BaseHostManager):

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_recover_rsyslog_rate_limit[vlab-01]
tests/test_posttest.py::test_enable_startup_tsa_tsb_service
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_dualtor_logs
  /opt/venv/lib/python3.12/site-packages/ansible/plugins/loader.py:1485: UserWarning: AnsibleCollectionFinder has already been configured
    warnings.warn('AnsibleCollectionFinder has already been configured')

tests/test_posttest.py: 107 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=10897) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /data/sonic-mgmt/tests/conftest.py:1248: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    record_testsuite_property("timestamp", datetime.utcnow())

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------ generated xml file: /data/sonic-mgmt/tests/logs/1vlan/posttest.xml ------
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 123.67s (0:02:03) ====
📄 update-sonic-mgmt.log (32 lines, 3334B)

Tail (80 lines):

+ TARGET=/data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
+ echo '=== Patching test_cacl.py with config restore fix ==='
=== Patching test_cacl.py with config restore fix ===
+ echo 'Before patch:'
Before patch:
+ sed -n 49,53p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
+ echo aW1wb3J0IHN5cwp0YXJnZXQgPSBzeXMuYXJndlsxXQp3aXRoIG9wZW4odGFyZ2V0LCAncicpIGFzIGY6CiAgICBjb250ZW50ID0gZi5yZWFkKCkKb2xkID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgY29uZmlnX3NvdXJjZT0ibWluaWdyYXBoIiwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICB5aWVsZCcKbmV3ID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGR1dGhvc3Quc2hlbGwoImNwIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24gL2V0Yy9zb25pYy9jb25maWdfZGIuanNvbi5iZWZvcmVfY2FjbCIpXG4gICAgY29uZmlnX3JlbG9hZChkdXRob3N0LCBjb25maWdfc291cmNlPSJtaW5pZ3JhcGgiLCBzYWZlX3JlbG9hZD1UcnVlKVxuICAgIHlpZWxkXG4gICAgbG9nZ2VyLmluZm8oIlJlc3RvcmluZyBvcmlnaW5hbCBjb25maWdfZGIuanNvbiBhZnRlciB0ZXN0X2NhY2wgbW9kdWxlIilcbiAgICBkdXRob3N0LnNoZWxsKCJjcCAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24iKVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICBkdXRob3N0LnNoZWxsKCJybSAtZiAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIiknCmlmIG9sZCBpbiBjb250ZW50OgogICAgY29udGVudCA9IGNvbnRlbnQucmVwbGFjZShvbGQsIG5ldykKICAgIHdpdGggb3Blbih0YXJnZXQsICd3JykgYXMgZjoKICAgICAgICBmLndyaXRlKGNvbnRlbnQpCiAgICBwcmludCgnU1VDQ0VTUzogdGVzdF9jYWNsLnB5IHBhdGNoZWQnKQplbHNlOgogICAgcHJpbnQoJ1dBUk5JTkc6IFBhdHRlcm4gbm90IGZvdW5kLCBjaGVja2luZyBmaWxlIGNvbnRlbnQuLi4nKQogICAgZm9yIGksIGxpbmUgaW4gZW51bWVyYXRlKGNvbnRlbnQuc3BsaXQoJ1xuJyksIDEpOgogICAgICAgIGlmICdyZXN0b3JlX3Rlc3RfZW52JyBpbiBsaW5lIG9yICdiZWZvcmVfY2FjbCcgaW4gbGluZToKICAgICAgICAgICAgcHJpbnQoZicgIExpbmUge2l9OiB7bGluZX0nKQo=
+ base64 -d
+ python3 /tmp/patch_cacl.py /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
SUCCESS: test_cacl.py patched
+ echo 'After patch:'
After patch:
+ sed -n 49,62p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    duthost.shell("cp /etc/sonic/config_db.json /etc/sonic/config_db.json.before_cacl")
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
    logger.info("Restoring original config_db.json after test_cacl module")
    duthost.shell("cp /etc/sonic/config_db.json.before_cacl /etc/sonic/config_db.json")
    config_reload(duthost, safe_reload=True)
    duthost.shell("rm -f /etc/sonic/config_db.json.before_cacl")


@pytest.fixture(scope="module", autouse=True)
def disable_port_toggle(duthosts, tbinfo, restore_test_env):
⚠️ setup-testbed.log (30 errors, 4646 lines, 147629B)

Error lines:

764:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.003134", "end": "2026-02-16 21:03:31.955171", "msg": "non-zero return code", "rc": 255, "start": "2026-02-16 21:03:31.952037", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
797:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
1999:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j286393489374.31235', 'results_file': '/root/.ansible_async/j286393489374.31235', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
2000:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j914571739060.31266', 'results_file': '/root/.ansible_async/j914571739060.31266', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2001:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j365448338482.31297', 'results_file': '/root/.ansible_async/j365448338482.31297', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2002:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j338886718343.31340', 'results_file': '/root/.ansible_async/j338886718343.31340', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2180:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_apae106v/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_apae106v/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3114:        "failed": false,
3132:        "failed": false,
3150:        "failed": false,
3168:        "failed": false,
4599:STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4600:VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
4601:VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4602:VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4603:VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4604:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4605:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4606:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4607:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4608:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4609:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4610:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4611:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4612:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4613:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4614:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4615:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4616:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4617:VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Tail (80 lines):

skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Done
+ sleep 180
📄 clean-testbed.log (94 lines, 3666B)

Tail (80 lines):

+ docker stop ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ docker rm -f ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0102'
Stopping and removing container: ceos_vms6-1_VM0102
+ docker stop ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ docker rm -f ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0101'
Stopping and removing container: ceos_vms6-1_VM0101
+ docker stop ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ docker rm -f ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0100'
Stopping and removing container: ceos_vms6-1_VM0100
+ docker stop ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ docker rm -f ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0103'
Stopping and removing container: net_vms6-1_VM0103
+ docker stop net_vms6-1_VM0103
net_vms6-1_VM0103
+ docker rm -f net_vms6-1_VM0103
net_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0102'
Stopping and removing container: net_vms6-1_VM0102
+ docker stop net_vms6-1_VM0102
net_vms6-1_VM0102
+ docker rm -f net_vms6-1_VM0102
net_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0101'
Stopping and removing container: net_vms6-1_VM0101
+ docker stop net_vms6-1_VM0101
net_vms6-1_VM0101
+ docker rm -f net_vms6-1_VM0101
net_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0100'
Stopping and removing container: net_vms6-1_VM0100
+ docker stop net_vms6-1_VM0100
net_vms6-1_VM0100
+ docker rm -f net_vms6-1_VM0100
net_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ptf_vms6-1'
Stopping and removing container: ptf_vms6-1
+ docker stop ptf_vms6-1
ptf_vms6-1
+ docker rm -f ptf_vms6-1
ptf_vms6-1
+ echo '=== Cleaning cEOS data directories ==='
=== Cleaning cEOS data directories ===
+ sudo rm -rf /data/ceos/ceos_vms6-1_VM0100 /data/ceos/ceos_vms6-1_VM0101 /data/ceos/ceos_vms6-1_VM0102 /data/ceos/ceos_vms6-1_VM0103
+ echo '=== Removing stale vlab VMs ==='
=== Removing stale vlab VMs ===
+ virsh -c qemu:///system list --all
+ grep -q vlab
+ echo '=== Verifying sonic-mgmt container still running ==='
=== Verifying sonic-mgmt container still running ===
+ docker exec sonic-mgmt echo OK
OK
+ echo '=== Remaining containers ==='
=== Remaining containers ===
+ docker ps -a --format 'table {{.Names}}\t{{.Status}}'
NAMES                STATUS
sonic-mgmt           Up 3 days
hardcore_keller      Up 4 days
hopeful_mcclintock   Created
+ echo '=== Cleanup complete ==='
=== Cleanup complete ===
📄 sonic-mgmt-container-setup.log (5 lines, 218B)

Tail (80 lines):

+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo 'sonic-mgmt container already running, nothing to do'
sonic-mgmt container already running, nothing to do
+ exit 0
⚠️ diagnostics.log (2 errors, 132 lines, 8925B)

Error lines:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Tail (80 lines):

-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 129 azureuser azureuser  4096 Feb 16 19:05 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE                                COMMAND       CREATED      STATUS      PORTS     NAMES
43138ce8a47b   docker-sonic-mgmt-azureuser:master   "/bin/bash"   3 days ago   Up 3 days   22/tcp    sonic-mgmt
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 16 18:52 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  4 azureuser azureuser 4096 Feb 13 11:58 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030220
drwxr-xr-x 2 azureuser azureuser       4096 Feb 16 18:48 .
drwxr-xr-x 4 azureuser azureuser       4096 Feb 13 11:58 ..
-rw-r--r-- 1 azureuser azureuser 5150932992 Feb 16 18:47 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 16 18:52 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 16 18:53 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 16 18:53 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 16 18:53 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 16 18:53 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-mgmt-azureuser:master                                   dc1af3bbf6a5       5.08GB          992MB   U    
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED       STATUS       PORTS     NAMES
a75917725f3c   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago   Up 2 hours             ceos_vms6-1_VM0103
5e8930e05321   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago   Up 2 hours             ceos_vms6-1_VM0102
0555083de52e   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago   Up 2 hours             ceos_vms6-1_VM0101
3118430cfee5   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   2 hours ago   Up 2 hours             ceos_vms6-1_VM0100
a8403c7b4ae6   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago   Up 2 hours             net_vms6-1_VM0103
97fb1b6a9c9d   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago   Up 2 hours             net_vms6-1_VM0102
ad39017906f1   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago   Up 2 hours             net_vms6-1_VM0101
17f0fa602674   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   2 hours ago   Up 2 hours             net_vms6-1_VM0100
fad48a897670   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   2 hours ago   Up 2 hours             ptf_vms6-1
43138ce8a47b   docker-sonic-mgmt-azureuser:master                    "/bin/bash"              3 days ago    Up 3 days    22/tcp    sonic-mgmt
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   4 days ago    Up 4 days    22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago   Created                hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===
⚙️ Environment snapshot
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up 51 minutes
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up 51 minutes
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 53 minutes
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 53 minutes
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 53 minutes
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 53 minutes
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 53 minutes
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 3 days
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 4 days
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory ===
total 1148
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 21:58 .
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 21:58 ..
drwxr-xr-x 4 azureuser azureuser   4096 Feb 16 21:58 1vlan
-rw-r--r-- 1 azureuser azureuser   3666 Feb 16 21:02 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser   8925 Feb 16 20:59 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 971230 Feb 16 21:58 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser   4096 Feb 16 21:58 ptf
-rw-r--r-- 1 azureuser azureuser 147629 Feb 16 21:07 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser    218 Feb 16 20:59 sonic-mgmt-container-setup.log
-rw-r--r-- 1 azureuser azureuser   3334 Feb 16 21:10 update-sonic-mgmt.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  174G  819G  18% /
/dev/root       993G  174G  819G  18% /

🔍 62 error(s) across 6 log file(s)

Add sed patch to fix the stale test_ipv6.py reference in kvmtest.sh
that was renamed to test_ip_bgp.py in sonic-net/sonic-mgmt#13650.
This was causing 'file or directory not found' errors during kvmtest.

Co-Authored-By: Arthur Poon <arthur.poon@windsurf.com>
@arthur-cog-sonic
Copy link
Owner

❌ Build Failed: kvmtest-t0

Build: #190 | Commit: d037980

⚠️ kvmtest-run.log (30 errors, 8349 lines, 1013561B)

Pytest summary:

=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip snappi metadata generation for non-tgen testbed
SKIPPED [1] test_pretest.py:469: No URL specified for python saithrift package
SKIPPED [1] common/helpers/assertions.py:16: Skip 'test_backend_acl_load' on non t0-backend testbeds.
=== 10 passed, 3 skipped, 4897 deselected, 727 warnings in 258.07s (0:04:18) ===
=== Running tests individually ===
Running: python3 -m pytest dns/test_dns_resolv_conf.py --inventory ../ansible/veos_vtb --host-pattern vlab-01 --dpu-pattern None --testbed vms-kvm-t0 --testbed_file vtestbed.yaml --log-cli-level warning --log-file-level debug --kube_master unset --showlocals --assert plain --show-capture no -rav --ignore=ptftests --ignore=acstests --ignore=saitests --ignore=scripts --ignore=k8s --ignore=sai_qualify --maxfail=1 --log-file logs/1vlan/dns/test_dns_resolv_conf.log --junitxml=logs/1vlan/dns/test_dns_resolv_conf.xml --allow_recover --completeness_level=confident
============================= test session starts ==============================
=========================== short test summary info ============================
SKIPPED [1] generic_config_updater/test_eth_interface.py:199: Bypass as it is blocking submodule update
SKIPPED [1] generic_config_updater/test_eth_interface.py:335: Bypass as this is not a production scenario
SKIPPED [1] generic_config_updater/test_eth_interface.py:362: Bypass as this is not a production scenario
=========== 10 passed, 3 skipped, 372 warnings in 307.28s (0:05:07) ============
=========================== short test summary info ============================
FAILED gnmi/test_gnmi_configdb.py::test_gnmi_configdb_full_01 - AssertionErro...
!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!
============ 1 failed, 12 passed, 948 warnings in 918.76s (0:15:18) ============
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 122.16s (0:02:02) ====

Error lines:

4:+ exit_on_error=
9:+ exit_on_error=true
89:TASK [get connection graph if defined for dut (ignore any errors)] *************
321:TASK [saved original minigraph file in SONiC DUT(ignore errors when file does not exist)] ***
322:fatal: [vlab-01]: FAILED! => {"changed": true, "cmd": "mv /etc/sonic/minigraph.xml /etc/sonic/minigraph.xml.orig", "delta": "0:00:00.003462", "end": "2026-02-16 23:58:50.867614", "msg": "non-zero return code", "rc": 1, "start": "2026-02-16 23:58:50.864152", "stderr": "mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory", "stderr_lines": ["mv: cannot stat '/etc/sonic/minigraph.xml': No such file or directory"], "stdout": "", "stdout_lines": []}
373:    "msg": "Stat result is {'changed': False, 'stat': {'exists': False}, 'failed': False}"
546:ASYNC FAILED on vlab-01: jid=j348391802249.18485
547:fatal: [vlab-01]: FAILED! => {"ansible_job_id": "j348391802249.18485", "changed": true, "cmd": ["chronyd", "-F", "1", "-q"], "delta": "0:00:10.549890", "end": "2026-02-16 23:59:45.912131", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j348391802249.18485", "start": "2026-02-16 23:59:35.362241", "started": 1, "stderr": "2026-02-16T23:59:35Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)\n2026-02-16T23:59:35Z Timezone right/UTC failed leap second check, ignoring\n2026-02-16T23:59:35Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift\n2026-02-16T23:59:35Z Loaded seccomp filter (level 1)\n2026-02-16T23:59:45Z No suitable source for synchronisation\n2026-02-16T23:59:45Z chronyd exiting", "stderr_lines": ["2026-02-16T23:59:35Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)", "2026-02-16T23:59:35Z Timezone right/UTC failed leap second check, ignoring", "2026-02-16T23:59:35Z Frequency 0.000 +/- 1000000.000 ppm read from /var/lib/chrony/chrony.drift", "2026-02-16T23:59:35Z Loaded seccomp filter (level 1)", "2026-02-16T23:59:45Z No suitable source for synchronisation", "2026-02-16T23:59:45Z chronyd exiting"], "stdout": "", "stdout_lines": []}
657:vlab-01                    : ok=85   changed=24   unreachable=0    failed=0    skipped=93   rescued=0    ignored=2   
1737:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_gnmi_authorize_failed_with_invalid_cname', found at tests/gnmi_e2e/test_gnmi_auth.py:33 and tests/gnmi/test_gnmi.py:152
1865:WARNING  pytest_plus:__init__.py:94 Test <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1866:WARNING  pytest_plus:__init__.py:94 <Function test_check_sfputil_error_status[vlab-01-None-sudo sfputil show error-status --fetch-from-hardware]> has an id that looks above 60 characters.
1876:WARNING  pytest_plus:__init__.py:94 Duplicate test name 'test_verify_fec_stats_counters', found at tests/platform_tests/test_intf_fec.py:125 and tests/layer1/test_fec_error.py:27
1892:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -h]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1893:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --help]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1894:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -v]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1895:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --version]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1896:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -j]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1897:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --json]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1898:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat -r]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
1899:WARNING  pytest_plus:__init__.py:94 Test <Function test_portstat_no_exceptions[vlab-01-portstat --raw]> has an id that does not match our safe pattern '^[\w_\-\.:]+$' for use with a terminal.
2816:layer1/test_fec_error.py:8
2817:  /data/sonic-mgmt/tests/layer1/test_fec_error.py:8: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
2820:layer1/test_port_error.py:10
2821:  /data/sonic-mgmt/tests/layer1/test_port_error.py:10: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
3548:  /data/sonic-mgmt/tests/snappi_tests/dash/ha/ha_helper.py:71: PytestCollectionWarning: cannot collect test class 'TestPhase' because it has a __init__ constructor (from: tests/snappi_tests/dash/test_cps.py)
3980:=========================== short test summary info ============================
4430:=========================== short test summary info ============================
4711:ERROR    tests.common.plugins.sanity_check.checks:checks.py:224 Failed to get BGP status on host vlab-01: run module bgp_facts failed, Ansible Results =>
4712:failed = True

Tail (80 lines):


voq/test_voq_ipfwd.py:730
  /data/sonic-mgmt/tests/voq/test_voq_ipfwd.py:730: PytestUnknownMarkWarning: Unknown pytest.mark.express - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.param(128, 64, marks=pytest.mark.express),

voq/test_voq_disrupts.py:25
  /data/sonic-mgmt/tests/voq/test_voq_disrupts.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_scale_ecmp.py:19
  /data/sonic-mgmt/tests/vxlan/test_scale_ecmp.py:19: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

vxlan/test_vnet_decap.py:14
  /data/sonic-mgmt/tests/vxlan/test_vnet_decap.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vnet_vxlan.py:25
  /data/sonic-mgmt/tests/vxlan/test_vnet_vxlan.py:25: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

DEBUG:tests.conftest:[log_custom_msg] item: <Function test_collect_dualtor_logs>
INFO:root:Can not get Allure report URL. Please check logs
vxlan/test_vxlan_multi_tunnel.py:13
  /data/sonic-mgmt/tests/vxlan/test_vxlan_multi_tunnel.py:13: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer

vxlan/test_vxlan_tunnel_route_scale.py:23
  /data/sonic-mgmt/tests/vxlan/test_vxlan_tunnel_route_scale.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

zmq/test_gnmi_zmq.py:12
  /data/sonic-mgmt/tests/zmq/test_gnmi_zmq.py:12: PytestUnknownMarkWarning: Unknown pytest.mark.disable_loganalyzer - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    pytest.mark.disable_loganalyzer,

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_file.py::test_file_stat is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/gnxi/test_gnoi_system.py::test_system_time is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance[NOTSET] is skipped when no topology marker is given
    warnings.warn(warn_msg)

common/plugins/custom_markers/__init__.py:88
  /data/sonic-mgmt/tests/common/plugins/custom_markers/__init__.py:88: UserWarning: testcase tests/performance_meter/test_performance.py::test_performance_stats is skipped when no topology marker is given
    warnings.warn(warn_msg)

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /opt/venv/lib/python3.12/site-packages/pytest_ansible/host_manager/v213.py:13: DeprecationWarning: Host management is deprecated and will be removed in a future release
    class HostManagerV213(BaseHostManager):

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
tests/test_posttest.py::test_recover_rsyslog_rate_limit[vlab-01]
tests/test_posttest.py::test_enable_startup_tsa_tsb_service
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_ptf_logs
tests/test_posttest.py::test_collect_dualtor_logs
  /opt/venv/lib/python3.12/site-packages/ansible/plugins/loader.py:1485: UserWarning: AnsibleCollectionFinder has already been configured
    warnings.warn('AnsibleCollectionFinder has already been configured')

tests/test_posttest.py: 107 warnings
  /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=25547) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

tests/test_posttest.py::test_restore_container_autorestart[vlab-01]
  /data/sonic-mgmt/tests/conftest.py:1248: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
    record_testsuite_property("timestamp", datetime.utcnow())

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------ generated xml file: /data/sonic-mgmt/tests/logs/1vlan/posttest.xml ------
=========================== short test summary info ============================
SKIPPED [1] common/helpers/assertions.py:16: Skip on vs testbed
=== 4 passed, 1 skipped, 4905 deselected, 615 warnings in 122.16s (0:02:02) ====
📄 update-sonic-mgmt.log (38 lines, 3698B)

Tail (80 lines):

+ TARGET=/data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
+ echo '=== Patching test_cacl.py with config restore fix ==='
=== Patching test_cacl.py with config restore fix ===
+ echo 'Before patch:'
Before patch:
+ sed -n 49,53p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
+ echo aW1wb3J0IHN5cwp0YXJnZXQgPSBzeXMuYXJndlsxXQp3aXRoIG9wZW4odGFyZ2V0LCAncicpIGFzIGY6CiAgICBjb250ZW50ID0gZi5yZWFkKCkKb2xkID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgY29uZmlnX3NvdXJjZT0ibWluaWdyYXBoIiwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICB5aWVsZCcKbmV3ID0gJ0BweXRlc3QuZml4dHVyZShzY29wZT0ibW9kdWxlIiwgYXV0b3VzZT1UcnVlKVxuZGVmIHJlc3RvcmVfdGVzdF9lbnYoZHV0aG9zdHMsIHJhbmRfb25lX2R1dF9mcm9udF9lbmRfaG9zdG5hbWUpOlxuICAgIGR1dGhvc3QgPSBkdXRob3N0c1tyYW5kX29uZV9kdXRfZnJvbnRfZW5kX2hvc3RuYW1lXVxuICAgIGR1dGhvc3Quc2hlbGwoImNwIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24gL2V0Yy9zb25pYy9jb25maWdfZGIuanNvbi5iZWZvcmVfY2FjbCIpXG4gICAgY29uZmlnX3JlbG9hZChkdXRob3N0LCBjb25maWdfc291cmNlPSJtaW5pZ3JhcGgiLCBzYWZlX3JlbG9hZD1UcnVlKVxuICAgIHlpZWxkXG4gICAgbG9nZ2VyLmluZm8oIlJlc3RvcmluZyBvcmlnaW5hbCBjb25maWdfZGIuanNvbiBhZnRlciB0ZXN0X2NhY2wgbW9kdWxlIilcbiAgICBkdXRob3N0LnNoZWxsKCJjcCAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIC9ldGMvc29uaWMvY29uZmlnX2RiLmpzb24iKVxuICAgIGNvbmZpZ19yZWxvYWQoZHV0aG9zdCwgc2FmZV9yZWxvYWQ9VHJ1ZSlcbiAgICBkdXRob3N0LnNoZWxsKCJybSAtZiAvZXRjL3NvbmljL2NvbmZpZ19kYi5qc29uLmJlZm9yZV9jYWNsIiknCmlmIG9sZCBpbiBjb250ZW50OgogICAgY29udGVudCA9IGNvbnRlbnQucmVwbGFjZShvbGQsIG5ldykKICAgIHdpdGggb3Blbih0YXJnZXQsICd3JykgYXMgZjoKICAgICAgICBmLndyaXRlKGNvbnRlbnQpCiAgICBwcmludCgnU1VDQ0VTUzogdGVzdF9jYWNsLnB5IHBhdGNoZWQnKQplbHNlOgogICAgcHJpbnQoJ1dBUk5JTkc6IFBhdHRlcm4gbm90IGZvdW5kLCBjaGVja2luZyBmaWxlIGNvbnRlbnQuLi4nKQogICAgZm9yIGksIGxpbmUgaW4gZW51bWVyYXRlKGNvbnRlbnQuc3BsaXQoJ1xuJyksIDEpOgogICAgICAgIGlmICdyZXN0b3JlX3Rlc3RfZW52JyBpbiBsaW5lIG9yICdiZWZvcmVfY2FjbCcgaW4gbGluZToKICAgICAgICAgICAgcHJpbnQoZicgIExpbmUge2l9OiB7bGluZX0nKQo=
+ base64 -d
+ python3 /tmp/patch_cacl.py /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
SUCCESS: test_cacl.py patched
+ echo 'After patch:'
After patch:
+ sed -n 49,62p /data/sonic-mgmt/tests/generic_config_updater/test_cacl.py
@pytest.fixture(scope="module", autouse=True)
def restore_test_env(duthosts, rand_one_dut_front_end_hostname):
    duthost = duthosts[rand_one_dut_front_end_hostname]
    duthost.shell("cp /etc/sonic/config_db.json /etc/sonic/config_db.json.before_cacl")
    config_reload(duthost, config_source="minigraph", safe_reload=True)
    yield
    logger.info("Restoring original config_db.json after test_cacl module")
    duthost.shell("cp /etc/sonic/config_db.json.before_cacl /etc/sonic/config_db.json")
    config_reload(duthost, safe_reload=True)
    duthost.shell("rm -f /etc/sonic/config_db.json.before_cacl")


@pytest.fixture(scope="module", autouse=True)
def disable_port_toggle(duthosts, tbinfo, restore_test_env):
+ echo '=== Fixing kvmtest.sh: test_ipv6.py was renamed to test_ip_bgp.py ==='
=== Fixing kvmtest.sh: test_ipv6.py was renamed to test_ip_bgp.py ===
+ KVMTEST=/data/sonic-mgmt/tests/kvmtest.sh
+ sed -i 's|generic_config_updater/test_ipv6\.py|generic_config_updater/test_ip_bgp.py|g' /data/sonic-mgmt/tests/kvmtest.sh
+ echo 'kvmtest.sh patched'
kvmtest.sh patched
⚠️ setup-testbed.log (30 errors, 4650 lines, 147970B)

Error lines:

768:fatal: [STR-ACS-VSERV-01]: FAILED! => {"changed": true, "cmd": "arp -d  10.250.0.101", "delta": "0:00:00.003120", "end": "2026-02-16 23:51:26.751191", "msg": "non-zero return code", "rc": 255, "start": "2026-02-16 23:51:26.748071", "stderr": "", "stderr_lines": [], "stdout": "No ARP entry for 10.250.0.101", "stdout_lines": ["No ARP entry for 10.250.0.101"]}
801:TASK [vm_set : Fail if kickstart gives error for vlab-01] **********************
2003:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j367700580818.34773', 'results_file': '/root/.ansible_async/j367700580818.34773', 'changed': True, 'vm_name': 'VM0100', 'ansible_loop_var': 'vm_name'})
2004:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j43541470966.34804', 'results_file': '/root/.ansible_async/j43541470966.34804', 'changed': True, 'vm_name': 'VM0101', 'ansible_loop_var': 'vm_name'})
2005:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j599425000977.34835', 'results_file': '/root/.ansible_async/j599425000977.34835', 'changed': True, 'vm_name': 'VM0102', 'ansible_loop_var': 'vm_name'})
2006:changed: [STR-ACS-VSERV-01] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j497761502421.34875', 'results_file': '/root/.ansible_async/j497761502421.34875', 'changed': True, 'vm_name': 'VM0103', 'ansible_loop_var': 'vm_name'})
2184:fatal: [STR-ACS-VSERV-01 -> localhost]: FAILED! => {"changed": false, "msg": "Traceback (most recent call last):\n  File \"/tmp/ansible_test_facts_payload_3i9qk3co/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 240, in main\n    testbed_topo = topoinfo.get_testbed_info(testbed_name)\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File \"/tmp/ansible_test_facts_payload_3i9qk3co/ansible_test_facts_payload.zip/ansible/modules/test_facts.py\", line 193, in get_testbed_info\n    return self.testbed_topo[testbed_name]\n           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^\nKeyError: 'vms-kvm-t0'\n"}
3118:        "failed": false,
3136:        "failed": false,
3154:        "failed": false,
3172:        "failed": false,
4603:STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
4604:VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
4605:VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4606:VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4607:VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
4608:VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4609:VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4610:VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4611:VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4612:VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4613:VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4614:VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4615:VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4616:VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4617:VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4618:VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4619:VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4620:VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
4621:VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Tail (80 lines):

skipping: [VM0119]
skipping: [VM0120]
skipping: [VM0121]
skipping: [VM0122]
skipping: [VM0123]
skipping: [VM0124]
skipping: [VM0125]
skipping: [VM0126]
skipping: [VM0127]
skipping: [VM0128]
skipping: [VM0129]
skipping: [VM0130]
skipping: [VM0131]
skipping: [VM0132]
skipping: [VM0133]
skipping: [VM0134]
skipping: [VM0135]
skipping: [VM0136]
skipping: [VM0137]
skipping: [VM0138]
skipping: [VM0139]
skipping: [VM0140]
skipping: [VM0141]
skipping: [VM0142]
skipping: [VM0143]

PLAY [servers:&vm_host] ********************************************************

TASK [Integrated traffic generator] ********************************************
skipping: [STR-ACS-VSERV-01]

PLAY RECAP *********************************************************************
STR-ACS-VSERV-01           : ok=337  changed=53   unreachable=0    failed=0    skipped=358  rescued=0    ignored=2   
VM0100                     : ok=38   changed=5    unreachable=0    failed=0    skipped=53   rescued=0    ignored=0   
VM0101                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0102                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0103                     : ok=33   changed=5    unreachable=0    failed=0    skipped=49   rescued=0    ignored=0   
VM0104                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0105                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0106                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0107                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0108                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0109                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0110                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0111                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0112                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0113                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0114                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0115                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0116                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0117                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0118                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0119                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0120                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0121                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0122                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0123                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0124                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0125                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0126                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0127                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0128                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0129                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0130                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0131                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0132                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0133                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0134                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0135                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0136                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0137                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0138                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0139                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0140                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0141                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0142                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   
VM0143                     : ok=0    changed=0    unreachable=0    failed=0    skipped=42   rescued=0    ignored=0   

Done
+ sleep 180
📄 clean-testbed.log (94 lines, 3666B)

Tail (80 lines):

+ docker stop ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ docker rm -f ceos_vms6-1_VM0103
ceos_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0102'
Stopping and removing container: ceos_vms6-1_VM0102
+ docker stop ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ docker rm -f ceos_vms6-1_VM0102
ceos_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0101'
Stopping and removing container: ceos_vms6-1_VM0101
+ docker stop ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ docker rm -f ceos_vms6-1_VM0101
ceos_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ceos_vms6-1_VM0100'
Stopping and removing container: ceos_vms6-1_VM0100
+ docker stop ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ docker rm -f ceos_vms6-1_VM0100
ceos_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0103'
Stopping and removing container: net_vms6-1_VM0103
+ docker stop net_vms6-1_VM0103
net_vms6-1_VM0103
+ docker rm -f net_vms6-1_VM0103
net_vms6-1_VM0103
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0102'
Stopping and removing container: net_vms6-1_VM0102
+ docker stop net_vms6-1_VM0102
net_vms6-1_VM0102
+ docker rm -f net_vms6-1_VM0102
net_vms6-1_VM0102
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0101'
Stopping and removing container: net_vms6-1_VM0101
+ docker stop net_vms6-1_VM0101
net_vms6-1_VM0101
+ docker rm -f net_vms6-1_VM0101
net_vms6-1_VM0101
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: net_vms6-1_VM0100'
Stopping and removing container: net_vms6-1_VM0100
+ docker stop net_vms6-1_VM0100
net_vms6-1_VM0100
+ docker rm -f net_vms6-1_VM0100
net_vms6-1_VM0100
+ for c in $(docker ps -a --format '{{.Names}}' | grep 'vms6-1')
+ echo 'Stopping and removing container: ptf_vms6-1'
Stopping and removing container: ptf_vms6-1
+ docker stop ptf_vms6-1
ptf_vms6-1
+ docker rm -f ptf_vms6-1
ptf_vms6-1
+ echo '=== Cleaning cEOS data directories ==='
=== Cleaning cEOS data directories ===
+ sudo rm -rf /data/ceos/ceos_vms6-1_VM0100 /data/ceos/ceos_vms6-1_VM0101 /data/ceos/ceos_vms6-1_VM0102 /data/ceos/ceos_vms6-1_VM0103
+ echo '=== Removing stale vlab VMs ==='
=== Removing stale vlab VMs ===
+ virsh -c qemu:///system list --all
+ grep -q vlab
+ echo '=== Verifying sonic-mgmt container still running ==='
=== Verifying sonic-mgmt container still running ===
+ docker exec sonic-mgmt echo OK
OK
+ echo '=== Remaining containers ==='
=== Remaining containers ===
+ docker ps -a --format 'table {{.Names}}\t{{.Status}}'
NAMES                STATUS
sonic-mgmt           Up 3 days
hardcore_keller      Up 4 days
hopeful_mcclintock   Created
+ echo '=== Cleanup complete ==='
=== Cleanup complete ===
📄 sonic-mgmt-container-setup.log (5 lines, 218B)

Tail (80 lines):

+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo 'sonic-mgmt container already running, nothing to do'
sonic-mgmt container already running, nothing to do
+ exit 0
⚠️ diagnostics.log (2 errors, 132 lines, 8925B)

Error lines:

26:+ echo 'FAIL: cannot connect to libvirt'
27:FAIL: cannot connect to libvirt

Tail (80 lines):

-rwxrwxr-x   1 azureuser azureuser 18986 Feb 13 02:41 setup-container.sh
-rw-rw-r--   1 azureuser azureuser  2025 Feb 13 02:41 sonic_dictionary.txt
drwxrwxr-x  15 azureuser azureuser  4096 Feb 13 02:41 spytest
drwxrwxr-x   6 azureuser azureuser  4096 Feb 13 02:41 test_reporting
drwxrwxr-x 129 azureuser azureuser  4096 Feb 16 21:19 tests
+ ls /data/sonic-mgmt/tests/kvmtest.sh
/data/sonic-mgmt/tests/kvmtest.sh
+ echo '=== Check sonic-mgmt docker container ==='
=== Check sonic-mgmt docker container ===
+ docker ps -a --filter name=sonic-mgmt
CONTAINER ID   IMAGE                                COMMAND       CREATED      STATUS      PORTS     NAMES
43138ce8a47b   docker-sonic-mgmt-azureuser:master   "/bin/bash"   3 days ago   Up 3 days   22/tcp    sonic-mgmt
+ docker exec sonic-mgmt echo 'sonic-mgmt container is running'
sonic-mgmt container is running
+ echo '=== Check /data contents ==='
=== Check /data contents ===
+ ls -la /data/
total 20
drwxr-xr-x  5 azureuser azureuser 4096 Feb 13 03:20 .
drwxr-xr-x 22 root      root      4096 Feb 13 02:25 ..
drwxr-xr-x  6 root      root      4096 Feb 16 21:06 ceos
drwxrwxr-x 13 azureuser azureuser 4096 Feb 13 03:30 sonic-mgmt
drwxr-xr-x  4 azureuser azureuser 4096 Feb 13 11:58 sonic-vm
+ echo '=== Check cEOS images ==='
=== Check cEOS images ===
+ ls -la /data/sonic-vm/images/
total 5030348
drwxr-xr-x 2 azureuser azureuser       4096 Feb 16 21:02 .
drwxr-xr-x 4 azureuser azureuser       4096 Feb 13 11:58 ..
-rw-r--r-- 1 azureuser azureuser 5151064064 Feb 16 21:02 sonic-vs.img
+ ls -la /data/ceos/
total 24
drwxr-xr-x   6 root      root      4096 Feb 16 21:06 .
drwxr-xr-x   5 azureuser azureuser 4096 Feb 13 03:20 ..
drwxrwxr-x+ 10 root      root      4096 Feb 16 21:07 ceos_vms6-1_VM0100
drwxrwxr-x+ 10 root      root      4096 Feb 16 21:07 ceos_vms6-1_VM0101
drwxrwxr-x+ 10 root      root      4096 Feb 16 21:07 ceos_vms6-1_VM0102
drwxrwxr-x+ 10 root      root      4096 Feb 16 21:07 ceos_vms6-1_VM0103
+ echo '=== Docker images ==='
=== Docker images ===
+ docker images
+ head -20
IMAGE                                                                ID             DISK USAGE   CONTENT SIZE   EXTRA
ceosimage:4.29.10.1M                                                 cf484164b16d       2.89GB          731MB        
ceosimage:4.29.10.1M-1                                               c6a1850ef28f       2.89GB          731MB   U    
docker-ptf:latest                                                    f28aaf787373       9.09GB         4.39GB        
docker-sonic-mgmt-azureuser:master                                   dc1af3bbf6a5       5.08GB          992MB   U    
docker-sonic-vs:latest                                               93c8cf3b870e       1.74GB          828MB        
publicmirror.azurecr.io/debian:bookworm                              c66c66fac809        185MB         52.2MB        
publicmirror.azurecr.io/debian:trixie                                c71b05eac0b2        186MB         52.5MB        
sonic-slave-bookworm-azureuser:1828b4d7c29                           983be73adef0       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:35feeb650be                           ae838157f361       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:6db1f584aa2                           266b697e04d1       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:73c8df02574                           dfe58e9aafc4       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:9eaf6be7f19                           79c4b6740f13       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:a27a4904ede                           c2a93f135256       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:ada88ee24f1                           06a1ce313b86       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:b8b044053f8                           e27f53eab83a       13.5GB         3.34GB        
sonic-slave-bookworm-azureuser:eee57d1af57                           18f6072bb74c       13.5GB         3.34GB        
sonic-slave-bookworm:733a2a69061                                     0114860d906a       13.5GB         3.34GB        
sonic-slave-trixie-azureuser:01d13355124                             896b1ab6f42e         14GB         3.34GB        
sonic-slave-trixie-azureuser:83b460050f6                             b1c07fdfb78c         14GB         3.34GB        
+ echo '=== Docker containers ==='
=== Docker containers ===
+ docker ps -a
CONTAINER ID   IMAGE                                                 COMMAND                  CREATED       STATUS       PORTS     NAMES
008442b73581   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   3 hours ago   Up 3 hours             ceos_vms6-1_VM0103
1c12267352d4   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   3 hours ago   Up 3 hours             ceos_vms6-1_VM0102
d0bd407e07ed   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   3 hours ago   Up 3 hours             ceos_vms6-1_VM0101
250a81f7fcee   ceosimage:4.29.10.1M-1                                "/sbin/init systemd.…"   3 hours ago   Up 3 hours             ceos_vms6-1_VM0100
0d55440dc90c   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   3 hours ago   Up 3 hours             net_vms6-1_VM0103
ace7d5d70808   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   3 hours ago   Up 3 hours             net_vms6-1_VM0102
7a205158aea5   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   3 hours ago   Up 3 hours             net_vms6-1_VM0101
7d2bb3e64298   sonicdev-microsoft.azurecr.io:443/debian:bookworm     "bash"                   3 hours ago   Up 3 hours             net_vms6-1_VM0100
66136771b59b   sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   "/root/env-python3/b…"   3 hours ago   Up 3 hours             ptf_vms6-1
43138ce8a47b   docker-sonic-mgmt-azureuser:master                    "/bin/bash"              3 days ago    Up 3 days    22/tcp    sonic-mgmt
d269a1130978   sonic-slave-trixie-azureuser:b0cc8a9f29a              "bash -c 'make -f sl…"   4 days ago    Up 4 days    22/tcp    hardcore_keller
8ebc852cc991   d0474f6ff0b1                                          "/bin/sh -c '#(nop) …"   5 weeks ago   Created                hopeful_mcclintock
+ echo '=== Diagnostics complete ==='
=== Diagnostics complete ===
⚙️ Environment snapshot
=== Docker containers ===
NAMES                IMAGE                                                 STATUS
ceos_vms6-1_VM0103   ceosimage:4.29.10.1M-1                                Up About an hour
ceos_vms6-1_VM0102   ceosimage:4.29.10.1M-1                                Up About an hour
ceos_vms6-1_VM0100   ceosimage:4.29.10.1M-1                                Up About an hour
ceos_vms6-1_VM0101   ceosimage:4.29.10.1M-1                                Up About an hour
net_vms6-1_VM0103    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 2 hours
net_vms6-1_VM0102    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 2 hours
net_vms6-1_VM0101    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 2 hours
net_vms6-1_VM0100    sonicdev-microsoft.azurecr.io:443/debian:bookworm     Up 2 hours
ptf_vms6-1           sonicdev-microsoft.azurecr.io:443/docker-ptf:latest   Up 2 hours
sonic-mgmt           docker-sonic-mgmt-azureuser:master                    Up 3 days
hardcore_keller      sonic-slave-trixie-azureuser:b0cc8a9f29a              Up 4 days
hopeful_mcclintock   d0474f6ff0b1                                          Created

=== Log directory ===
total 1188
drwxr-xr-x 4 azureuser azureuser    4096 Feb 17 01:24 .
drwxr-xr-x 4 azureuser azureuser    4096 Feb 17 01:24 ..
drwxr-xr-x 5 azureuser azureuser    4096 Feb 17 01:24 1vlan
-rw-r--r-- 1 azureuser azureuser    3666 Feb 16 23:50 clean-testbed.log
-rw-r--r-- 1 azureuser azureuser    8925 Feb 16 23:47 diagnostics.log
-rw-r--r-- 1 azureuser azureuser 1013561 Feb 17 01:24 kvmtest-run.log
drwxr-xr-x 3 azureuser azureuser    4096 Feb 17 01:24 ptf
-rw-r--r-- 1 azureuser azureuser  147970 Feb 16 23:55 setup-testbed.log
-rw-r--r-- 1 azureuser azureuser     218 Feb 16 23:47 sonic-mgmt-container-setup.log
-rw-r--r-- 1 azureuser azureuser    3698 Feb 16 23:58 update-sonic-mgmt.log

=== Disk usage ===
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       993G  174G  819G  18% /
/dev/root       993G  174G  819G  18% /

🔍 62 error(s) across 6 log file(s)

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant