
Add tensorflow adapted ASL #109

Open
Mooster1 wants to merge 1 commit into Alibaba-MIIL:main from Mooster1:main

Conversation

@Mooster1

@Mooster1 Mooster1 commented Aug 4, 2023

Add a tf_losses.py file that provides an edited version of the AsymmetricLoss.forward() function for use with TensorFlow.

@sonarqubecloud

Quality Gate failed

Failed conditions
Maintainability Rating on New Code is B (required ≥ A)

See analysis details on SonarQube Cloud


@mrT23
Contributor

mrT23 commented Feb 11, 2026

/describe

@qodo-code-review

User description

Add a tf_losses.py file that provides an edited version of the AsymmetricLoss.forward() function for use with TensorFlow.


PR Type

Enhancement


Description

  • Add TensorFlow adaptation of Asymmetric Loss function

  • Implements ASL with configurable gamma parameters

  • Supports multi-label classification with asymmetric emphasis

  • Returns compiled loss function for TensorFlow models


Diagram Walkthrough

flowchart LR
  A["ASL Wrapper Function"] -- "gamma_neg, gamma_pos" --> B["ASL_func"]
  B -- "logits, targets" --> C["Probability Calculation"]
  C --> D["Cross Entropy Loss"]
  D --> E["Asymmetric Focusing"]
  E --> F["Weighted Loss Output"]

File Walkthrough

Relevant files
Enhancement
tf_losses.py
TensorFlow Asymmetric Loss implementation                               

src/loss_functions/tf_losses.py

  • New file implementing TensorFlow adaptation of Asymmetric Loss
  • Wrapper function AsymmetricLoss accepts gamma_neg and gamma_pos
    parameters
  • Inner ASL_func computes asymmetric cross-entropy loss with focusing
    mechanism
  • Calculates probabilities, applies basic CE, and applies asymmetric
    weighting
+42/-0   
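The flow the walkthrough describes (logits to probabilities, basic cross-entropy, then asymmetric focusing) can be sketched in plain Python. This is a minimal, framework-free illustration of the described pipeline, not the PR's actual code; all names here are illustrative, and the sigmoid/clipping steps reflect the reference losses.py rather than the submitted file:

```python
import math

def asymmetric_loss(gamma_neg=4.0, gamma_pos=1.0, eps=1e-8):
    """Illustrative per-element ASL factory mirroring the described flow:
    logits -> probabilities -> cross-entropy -> asymmetric focusing."""
    def asl(target, logit):
        prob = 1.0 / (1.0 + math.exp(-logit))   # sigmoid: logit -> (0, 1)
        prob = min(max(prob, eps), 1.0 - eps)   # clip so log stays finite
        ce = target * math.log(prob) + (1 - target) * math.log(1.0 - prob)
        # Asymmetric focusing: easy examples are down-weighted, with a
        # stronger exponent (gamma_neg) on the negative class.
        pt = prob if target else 1.0 - prob
        gamma = gamma_pos if target else gamma_neg
        return -ce * (1.0 - pt) ** gamma
    return asl

loss = asymmetric_loss()
# A confident correct positive should cost far less than a confident miss.
print(loss(1, 3.0) < loss(1, -3.0))  # True
```

In a real TensorFlow version the same steps would operate on tensors elementwise and reduce to a scalar, which is what allows the factory's return value to be handed to model.compile().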

@mrT23
Contributor

mrT23 commented Feb 11, 2026

/agentic_review

@qodo-code-review

Code Review by Qodo

🐞 Bugs (3) 📘 Rule violations (2) 📎 Requirement gaps (0)



Action required

1. tf.math.log() can hit -inf 📘 Rule violation ⛯ Reliability
Description
The loss computes tf.math.log(xs_pos) and tf.math.log(xs_neg) without clipping/validating that
x is in (0,1), which can produce -inf/NaN and break training. This is missing boundary/edge-case
handling and input validation.
Code

src/loss_functions/tf_losses.py[R23-30]

+    # Calculating Probabilities
+    xs_pos = x
+    xs_neg = 1 - x
+
+    # Basic CE calculation
+    los_pos = y * tf.math.log(xs_pos)
+    los_neg = (1 - y) * tf.math.log(xs_neg)
+    loss = los_pos + los_neg
Evidence
PR Compliance ID 3 and 6 require explicit handling/validation of boundary values and inputs; the new
code performs logarithms on values derived directly from x and 1 - x with no clipping or
validation, so x at 0/1 (or outside [0,1]) will cause invalid logs.

Rule 3: Generic: Robust Error Handling and Edge Case Management
Rule 6: Generic: Security-First Input Validation and Data Handling
src/loss_functions/tf_losses.py[23-30]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`tf.math.log()` is applied to `x` and `1 - x` without ensuring values are strictly within (0,1), which can produce `-inf/NaN` for boundary/out-of-range inputs.

## Issue Context
This function is intended to be used as a Keras loss. Model outputs may be logits (unbounded) or probabilities that can reach exactly 0/1 depending on activations and numerics.

## Fix Focus Areas
- src/loss_functions/tf_losses.py[23-30]

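To make the failure mode concrete, here is a plain-Python sketch; the eps value and helper name are illustrative, and in TensorFlow the equivalent guard would typically be tf.clip_by_value before tf.math.log:

```python
import math

def bce_term(target, prob, eps=1e-7):
    """One binary cross-entropy term with the probability clipped into
    [eps, 1 - eps] so neither log() argument can reach zero."""
    prob = min(max(prob, eps), 1.0 - eps)
    return target * math.log(prob) + (1 - target) * math.log(1.0 - prob)

# Without clipping, target=0 with prob=1.0 hits log(0) = -inf;
# with the clip the term stays finite.
print(math.isfinite(bce_term(0, 1.0)))  # True
```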


2. Logits used as probs 🐞 Bug ✓ Correctness
Description
ASL_func’s docstring says x is logits, but the implementation uses x directly as a probability
(xs_pos=x, xs_neg=1-x) and omits the sigmoid step used by the reference ASL, producing an incorrect
loss when model outputs are logits.
Code

src/loss_functions/tf_losses.py[R23-26]

+    # Calculating Probabilities
+    xs_pos = x
+    xs_neg = 1 - x
+
Evidence
The TF version directly treats x as probability, while the repo’s reference ASL explicitly applies
sigmoid to logits before computing xs_pos/xs_neg; training code also documents that logits are
passed and sigmoid is applied inside the loss.

src/loss_functions/tf_losses.py[15-26]
src/loss_functions/losses.py[15-27]
train.py[110-113]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`src/loss_functions/tf_losses.py` documents `x` as logits but computes probabilities as `xs_pos = x` / `xs_neg = 1 - x`, which breaks ASL semantics when the model outputs logits.

## Issue Context
The repo’s reference ASL (`src/loss_functions/losses.py`) applies `sigmoid` to logits inside the loss, and training code passes logits with an explicit comment that sigmoid happens in the loss.

## Fix Focus Areas
- src/loss_functions/tf_losses.py[15-26]
- src/loss_functions/tf_losses.py[23-31]
- src/loss_functions/losses.py[23-27]
- train.py[110-113]

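The mismatch is easy to see with a plain-Python sigmoid (illustrative only; the TF fix would apply tf.math.sigmoid inside the loss, mirroring how the reference implementation applies sigmoid to logits):

```python
import math

def sigmoid(logit):
    """Map an unbounded logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-logit))

# Treating a logit of 2.0 directly as a probability gives the invalid
# pair xs_pos=2.0, xs_neg=-1.0; after sigmoid both land in (0, 1).
xs_pos = sigmoid(2.0)
xs_neg = 1.0 - xs_pos
print(0.0 < xs_pos < 1.0 and 0.0 < xs_neg < 1.0)  # True
```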



Remediation recommended

3. ASL_func uses y, x 📘 Rule violation ✓ Correctness
Description
The new loss function uses single-letter parameters (y, x) and a nonstandard order (targets
first, logits second), making it easy to misuse and reducing readability. This violates the
meaningful naming/self-documenting requirement.
Code

src/loss_functions/tf_losses.py[R15-21]

+  def ASL_func(y, x):
+    """"
+    Parameters
+    ----------
+    x: input logits (y hat)
+    y: targets (multi-label binarized vector)
+    """
Evidence
PR Compliance ID 2 requires identifiers to clearly express intent and discourages single-letter
names except for short-lived iterators; the function signature introduces y and x for long-lived
semantic inputs (targets and logits).

Rule 2: Generic: Meaningful Naming and Self-Documenting Code
src/loss_functions/tf_losses.py[15-21]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
The inner loss function signature uses ambiguous single-letter names (`y`, `x`) and an unusual argument order versus common TensorFlow/Keras conventions.

## Issue Context
Keras losses are typically called with `(y_true, y_pred)`; unclear naming/order increases the chance of wiring the loss incorrectly.

## Fix Focus Areas
- src/loss_functions/tf_losses.py[15-21]

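One way to address this is to adopt the Keras calling convention directly; this is a signature sketch only (the loss body is elided, and the inner name asl_func is illustrative):

```python
def AsymmetricLoss(gamma_neg=4.0, gamma_pos=1.0):
    """Factory whose inner loss follows the Keras (y_true, y_pred)
    naming and argument order, so model.compile(loss=...) wires it
    the way Keras will actually call it."""
    def asl_func(y_true, y_pred):
        # y_true: multi-label binarized targets
        # y_pred: model outputs (logits)
        ...  # loss computation unchanged, elided here
    return asl_func
```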


4. TensorFlow dep unmanaged 🐞 Bug ⛯ Reliability
Description
The new module unconditionally imports TensorFlow, but the project’s requirements don’t include it;
any environment that imports this module without TensorFlow installed will fail immediately.
Code

src/loss_functions/tf_losses.py[1]

+import tensorflow as tf
Evidence
tf_losses.py requires TensorFlow at import time, while requirements.txt lists only torch-related
dependencies. This makes TensorFlow either an undeclared hard dependency or requires guarding for
optional use.

src/loss_functions/tf_losses.py[1-1]
requirements.txt[1-5]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`tf_losses.py` imports TensorFlow unconditionally, but TensorFlow is not declared in dependencies.

## Issue Context
This repository appears torch-first (torch is in requirements), so TF may be intended as an optional capability.

## Fix Focus Areas
- src/loss_functions/tf_losses.py[1-1]
- requirements.txt[1-5]

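A hedged sketch of one way to keep TensorFlow optional: guard the import and fail with a clear message only when the loss is actually requested (the wrapper name mirrors the PR's; the guard itself is illustrative):

```python
try:
    import tensorflow as tf  # optional: only needed for the TF loss
except ImportError:
    tf = None

def AsymmetricLoss(gamma_neg=4.0, gamma_pos=1.0):
    """Factory for the TF asymmetric loss; raises at call time with a
    clear message when TensorFlow is absent, instead of failing on
    module import."""
    if tf is None:
        raise ImportError(
            "tf_losses requires TensorFlow, which is not listed in "
            "requirements.txt; install it separately or use the "
            "PyTorch loss in losses.py"
        )
    def ASL_func(y_true, y_pred):
        ...  # TF implementation would go here
    return ASL_func
```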



Advisory comments

5. Malformed docstrings 🐞 Bug ✓ Correctness
Description
The docstrings start with four quotes ("""") instead of triple quotes ("""), leaving a stray quote
at the beginning of the docstring and making the docstrings misleading/poorly formatted.
Code

src/loss_functions/tf_losses.py[R3-17]

+def AsymmetricLoss(gamma_neg=4.0, gamma_pos=1.0): # Wrapper for ASL function
+  """"
+  Tensorflow adaptation of "Official Pytorch Implementation of: 'Asymmetric Loss For Multi-Label Classification'(ICCV, 2021) paper" --> https://github.com/Alibaba-MIIL/ASL/blob/main/src/loss_functions/losses.py
+  Returns a loss function with asymmetric, specifiable emphases for false negatives & false positives. Output can be passed in as loss function for model.compile().
+  ----------
+  Parameters
+  ----------
+  gamma_neg: asymmetric emphasis on false negatives
+  gamma_pos: assymetric emphasis on false positives
+  """
+  
+  # Return ASL function with custom emphases
+  def ASL_func(y, x):
+    """"
+    Parameters
Evidence
Both the outer factory and inner loss function use """" as the opening delimiter rather than
standard triple quotes, which results in malformed docstring content.

src/loss_functions/tf_losses.py[3-12]
src/loss_functions/tf_losses.py[15-17]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
Docstrings are opened with four quotes instead of three, resulting in malformed docstring text.

## Issue Context
This impacts readability and tooling (help(), IDE doc popups, doc generation).

## Fix Focus Areas
- src/loss_functions/tf_losses.py[3-12]
- src/loss_functions/tf_losses.py[15-21]

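The effect of the extra quote can be demonstrated in a few lines (hypothetical function names; only the delimiters differ between the two):

```python
def broken():
    """"
    Opened with four quotes: a stray '"' leads the docstring text.
    """

def fixed():
    """
    Opened with three quotes: a clean docstring.
    """

# The fourth quote is parsed as docstring content, not as a delimiter.
print(broken.__doc__.startswith('"'))  # True
```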



