icann/TAMS-Security-Assets
TAMS Security Testing Scenarios README

Snapshot date: February 25, 2026

Purpose

This document explains:

  • What ICANN’s TLD Application Management System (TAMS) security testing artifacts are (and how they relate to each other)
  • How to read and use the security test scenarios
  • How to provide feedback

ICANN’s goal is to make security testing of TAMS transparent, repeatable, and easy to review by interested stakeholders.

What artifacts exist and how they fit together

Security Test Scenario Matrix

The Security Test Scenario Matrix document defines and governs the scope of the security testing performed; it:

  • Defines what will be tested (scenario categories and intent)
  • Specifies the level of access granted to each role
  • Establishes a shared baseline for coverage and consistency across phases of the application lifecycle
  • Helps reviewers quickly understand the test boundary (what’s in scope vs. out of scope)

The matrix acts as the map of what we intend to test and who should be able to do what.

Test Scenario Scripts

The Test Scenario Scripts implement the matrix in executable form. Each test scenario includes:

  • The exact scenario being tested
  • The precondition and step-by-step instructions (accounts, roles, data state, and actions to be performed)
  • Expected results

The scenario scripts are the playbook for execution.

Feedback

Feedback is encouraged and welcome, and should be provided via GitHub. Reviewers may not be able to run the scenarios themselves, but feedback is still valuable. Please alert us to gaps, mistakes, or ambiguities in the scenario definitions or in how the test cases are written.

Where to provide feedback

Use one of the following in GitHub:

  • Issue (preferred for anything that should be tracked, triaged, or assigned).
  • Pull Request (preferred if you can propose edits directly to documentation).

What to include in feedback

To help us triage efficiently, please include:

  • Reference: Test case ID(s) (e.g., ID-1234) and/or scenario name from the matrix/scenario script

  • Observation: What you believe is wrong, missing, or unclear.

    Example: “Expected result does not define whether the button is hidden or blocked with an error.”

Important Note

These artifacts represent a point-in-time snapshot of the security testing scenarios. Some scenarios may become outdated due to product changes, role model changes, and/or configuration updates. The test scenarios and matrix reflect what was known at the time they were authored. Updates may occur, but there is no fixed cadence, and changes may not be reflected immediately (or at all).

Use these materials as a transparent view of the intended testing scope, not as a guarantee that the documentation always matches the latest deployed behavior. If you identify outdated scenarios, unclear steps, or missing coverage, please provide feedback so we can prioritize improvements where they add the most value.
