The new 'bitnet' quantization method uses 1.58 bits per weight, and there are already models quantized with it, such as "HF1BitLLM/Llama3-8B-1.58-100B-tokens". However, when I try to download and load it with transformers, I get an error saying that the 'bitnet' quantization method is not found.
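For reference, this is roughly what I'm running (a minimal sketch of the loading call that triggers the error; the specific arguments are just what I tried, not taken from the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HF1BitLLM/Llama3-8B-1.58-100B-tokens"

# Standard Hugging Face loading path; this is where the
# "bitnet quantization not found" error appears for me
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```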
