
Conversation

@gaikwadrahul8

Hi, Team

I have created this PR with some code improvements. The changes are:

Loader security hardening (ai_edge_torch/generative/utilities/loader.py): changed the glob pattern from `*pt` to `*.pt` for exact checkpoint-file matching, added `map_location=torch.device("cpu")` so checkpoints are loaded on CPU only, and added `weights_only=True` to `torch.load` to prevent pickle-based attacks.
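The hardened loading described above can be sketched roughly as follows. This is a minimal illustration, not the actual loader code from the PR; the function name `load_safetensors_style_checkpoint` and the flat state-dict merge are assumptions for the example:

```python
from pathlib import Path

import torch


def load_safetensors_style_checkpoint(checkpoint_dir: str) -> dict:
  """Loads all .pt files in a directory into one flat state dict (sketch)."""
  # "*.pt" matches only files ending in the ".pt" suffix; the earlier "*pt"
  # pattern would also match unrelated names such as "script" or "report".
  ckpt_files = sorted(Path(checkpoint_dir).glob("*.pt"))
  tensors = {}
  for f in ckpt_files:
    # map_location pins all tensors to CPU regardless of where they were
    # saved; weights_only=True restricts unpickling to tensor/container
    # types, rejecting arbitrary pickled objects.
    state = torch.load(
        f, map_location=torch.device("cpu"), weights_only=True
    )
    tensors.update(state)
  return tensors
```

With `weights_only=True`, a checkpoint containing a malicious pickled object raises an error instead of executing code during load.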

LoRA random initialization fix (ai_edge_torch/generative/layers/lora.py): changed `torch.randint` to `torch.rand` when the dtype is floating point, which eliminates the runtime error raised when LoRA weights are floating point.
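A small sketch of the dtype-aware initialization, assuming the fix branches on the target dtype (the helper name `random_lora_tensor` and the integer fallback bounds are illustrative, not taken from the PR):

```python
import torch


def random_lora_tensor(
    shape: tuple, dtype: torch.dtype = torch.float32
) -> torch.Tensor:
  """Returns a random tensor suitable for LoRA weight init (sketch)."""
  if dtype.is_floating_point:
    # torch.rand draws uniform floats in [0, 1), matching float LoRA
    # weights; torch.randint expects integer semantics and errors out
    # for this use case.
    return torch.rand(shape, dtype=dtype)
  # Integer fallback with illustrative bounds.
  return torch.randint(0, 2, shape, dtype=dtype)
```

The key point is simply that `torch.rand` is the random generator intended for floating-point tensors, while `torch.randint` samples integers.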

I would request you to please review this PR. If you have any feedback or suggestions, please let me know; that would be very helpful. Thank you for your consideration.

@gaikwadrahul8 gaikwadrahul8 changed the title Fix harden checkpoint loading and fix LoRA initialization bugs Fix harden checkpoint loading and LoRA initialization issue Aug 7, 2025
@gaikwadrahul8 gaikwadrahul8 force-pushed the fix/secure-loader-and-lora-initialization branch from b604b38 to cc1a9d1 Compare August 7, 2025 22:49
…bugs

- loader: use precise .pt detection (glob: *.pt, suffix: .pt)
- loader: enforce torch.load on CPU with weights_only for safety
- lora: use torch.rand for float dtype in random() to avoid runtime error
- code cleanup: remove unused code and simplify operations
@gaikwadrahul8 gaikwadrahul8 force-pushed the fix/secure-loader-and-lora-initialization branch from cc1a9d1 to 5432f60 Compare August 7, 2025 22:51
@gaikwadrahul8 gaikwadrahul8 added the status:awaiting review Awaiting PR review label Sep 9, 2025
