Conversation

@psiva01 psiva01 commented Nov 20, 2025

…05" (#686)"

This reverts commit aeaedc6.

Revert "VPLAY-9274:[DASH] optimization: independent track downloads (#614)"

This reverts commit 3746211.

@psiva01 psiva01 requested a review from a team as a code owner November 20, 2025 18:41

rdkcmf-jenkins commented Nov 20, 2025

Coverity Issue - Double unlock

"PushNextFragment" unlocks "pMediaStreamContext->mutex" while it is unlocked.

Medium Impact, CWE-765
LOCK

Issue location

This issue was discovered outside the diff for this Pull Request. You can find it at:
fragmentcollector_mpd.cpp:6377
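The CWE-765 pattern Coverity flags here is a callee releasing a mutex the caller has already released. A minimal hypothetical sketch (names like `push_next_fragment` and `track_mutex` are illustrative stand-ins, not AAMP's real API) of the bug shape and the RAII fix:

```cpp
#include <mutex>

std::mutex track_mutex;

// Buggy shape Coverity reports: the caller unlocks, then the callee
// unlocks again, so the mutex is unlocked while already unlocked (UB):
//   void push_next_fragment() { track_mutex.unlock(); }
//
// Safer shape: scope-based locking, so every lock is paired with
// exactly one unlock no matter how the function returns.
int push_next_fragment()
{
    std::lock_guard<std::mutex> guard(track_mutex); // unlocks on scope exit
    return 1; // pretend one fragment was queued
}
```

With `std::lock_guard` (or `std::unique_lock`) the ownership of the unlock is unambiguous, which is what removes the double-unlock report.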


rdkcmf-jenkins commented Nov 20, 2025

Coverity Issue - Double unlock

"PushNextFragment" unlocks "pMediaStreamContext->mutex" while it is unlocked.

Medium Impact, CWE-765
LOCK

Issue location

This issue was discovered outside the diff for this Pull Request. You can find it at:
fragmentcollector_mpd.cpp:7018

mBasePeriodOffset += (pMediaStreamContext->fragmentTime - currFragTime);
}

if (PushNextFragment(pMediaStreamContext, getCurlInstanceByMediaType(static_cast<AampMediaType>(trackIdx))))


Coverity Issue - Double unlock

"PushNextFragment" unlocks "pMediaStreamContext->mutex" while it is unlocked.

Medium Impact, CWE-765
LOCK

// Important DEBUG area, live downloader is delayed due to some external factors (Injector or Gstreamer)
if (pMediaStreamContext->IsInjectionFromCachedFragmentChunks())
{
AAMPLOG_ERR("%s Live downloader is not advancing at the moment cache (%d / %d)", GetMediaTypeName((AampMediaType)trackIdx), pMediaStreamContext->numberOfFragmentChunksCached, pMediaStreamContext->maxCachedFragmentChunksPerTrack);


Coverity Issue - Data race condition

Accessing "pMediaStreamContext->numberOfFragmentChunksCached" without holding lock "MediaTrack.mutex". Elsewhere, "MediaTrack.numberOfFragmentChunksCached" is written to with "MediaTrack.mutex" held 4 out of 4 times (2 of these accesses strongly imply that it is necessary).

Medium Impact, CWE-366
MISSING_LOCK
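The CWE-366 report says the read of `numberOfFragmentChunksCached` is unguarded while every write site holds `MediaTrack.mutex`. A hypothetical sketch of the guarded-reader fix (the struct and member names mirror the snippet but are illustrative only, not the real class):

```cpp
#include <mutex>

// Illustrative stand-in for the MediaTrack members involved.
struct MediaTrackSketch {
    std::mutex mutex;
    int numberOfFragmentChunksCached = 0;

    void cache_chunk() {
        std::lock_guard<std::mutex> guard(mutex);
        ++numberOfFragmentChunksCached;   // writers hold the lock (4/4 sites)...
    }
    int cached_chunks() {
        std::lock_guard<std::mutex> guard(mutex); // ...so readers must too,
        return numberOfFragmentChunksCached;      // e.g. before logging it
    }
};
```

Reading the counter through an accessor that takes the same lock (or making the counter `std::atomic<int>`) would clear the race the checker describes.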

if(pMediaStreamContext->WaitForFreeFragmentAvailable(0))
{
pMediaStreamContext->profileChanged = false;
FetchFragment(pMediaStreamContext, std::move(initialization), fragmentDuration, true, getCurlInstanceByMediaType(pMediaStreamContext->mediaType), //CurlContext 0=Video, 1=Audio)


Coverity Issue - Double unlock

"FetchFragment" unlocks "pMediaStreamContext->mutex" while it is unlocked.

Medium Impact, CWE-765
LOCK

if(pMediaStreamContext->WaitForFreeFragmentAvailable(0))
{
pMediaStreamContext->profileChanged = false;
FetchFragment(pMediaStreamContext, std::move(initialization), fragmentDuration, true, getCurlInstanceByMediaType(pMediaStreamContext->mediaType), //CurlContext 0=Video, 1=Audio)


Coverity Issue - Wait without held lock

"FetchFragment" waits while "pMediaStreamContext->mutex" is unlocked.

Medium Impact, CWE-667
LOCK
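The CWE-667 finding is the condition-variable counterpart of the unlock issues: `std::condition_variable::wait` requires the associated mutex to be held when it is called, so waiting after the lock has been dropped is undefined behavior. A hypothetical sketch of the correct shape (`wait_for_free_fragment`, `track_mutex`, and `fragment_free` are illustrative names, not the real `WaitForFreeFragmentAvailable` internals):

```cpp
#include <condition_variable>
#include <mutex>

std::mutex track_mutex;
std::condition_variable fragment_freed;
bool fragment_free = true; // pretend a cache slot is already free

bool wait_for_free_fragment()
{
    // The unique_lock must own the mutex at the point of the wait call;
    // wait() atomically releases it while blocked and reacquires it
    // before returning.
    std::unique_lock<std::mutex> lock(track_mutex);
    fragment_freed.wait(lock, [] { return fragment_free; });
    return fragment_free;
}
```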

{
uint32_t timeScale = 0;
if (buffer.getTimeScale(timeScale))
buffer.getTimeScale(timeScale);


Coverity Issue - Unchecked return value

Calling "getTimeScale" without checking return value (as is done elsewhere 6 out of 6 times).

Medium Impact, CWE-252
CHECKED_RETURN
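For the CWE-252 report, the checker notes that the other six call sites test `getTimeScale`'s return value before trusting the out-parameter. A hypothetical sketch of the checked pattern (`BufferSketch` and `timescale_or_default` are stand-ins for illustration, not the real buffer API):

```cpp
#include <cstdint>

// Illustrative stand-in for a buffer whose timescale may be absent.
struct BufferSketch {
    bool has_timescale = false;
    uint32_t stored = 90000;
    bool getTimeScale(uint32_t &out) const {
        if (!has_timescale) return false;
        out = stored;
        return true;
    }
};

uint32_t timescale_or_default(const BufferSketch &buffer, uint32_t fallback)
{
    uint32_t timeScale = 0;
    if (buffer.getTimeScale(timeScale)) // check the result, as the other
        return timeScale;               // call sites do
    return fallback;                    // out-param is untouched on failure
}
```

Ignoring the `bool` leaves `timeScale` at its initial value on failure, so the unchecked call silently propagates a timescale of 0 downstream.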
