General idea: take some exemplars, run feature extractors to embed them, then compare (same basic loop as #24)
Possible feature extractors:
- https://github.com/antoine77340/S3D_HowTo100M
- https://github.com/gulvarol/bsl1k
  - That one also links to "our improved model"
https://github.com/DCNemesis/hidden-layer-tokenizer/tree/main provides code that should be relatively easy to adapt: it already accepts "any PyTorch model" and extracts features, it just puts them in a FAISS index and uses them for clustering. I want to keep the embeddings around and compute cosine distances, so I can compare the video for HOUSE against various sections of a long video.
To use this in the similarity loop I just need a function that lets me compare two sequences of frames. Something like:

```python
def compare_features(
    query_video_path: Path,
    long_video_path: Path,
    search_segment_start_frame_index: int,
    search_segment_end_frame_index: int,
) -> float:
    """Given a short query video and a long video to search inside, embed the
    query video with a feature extractor, embed the given segment of the long
    video, and return the cosine distance between the two embeddings.
    Precomputing and storing in FAISS is valid too."""
```