Description
### After setting `share=True` and running `python inference/gradio_demo.py --model-path Alibaba-DAMO-Academy/RynnEC-2B`, Gradio fails to create a public URL. The output log is below:
```
/home/tc/anaconda3/envs/RynnEC/lib/python3.10/site-packages/torch/nn/modules/module.py:2068: UserWarning: for obj_ptr_proj.layers.1.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/home/tc/anaconda3/envs/RynnEC/lib/python3.10/site-packages/torch/nn/modules/module.py:2068: UserWarning: for obj_ptr_proj.layers.1.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/home/tc/anaconda3/envs/RynnEC/lib/python3.10/site-packages/torch/nn/modules/module.py:2068: UserWarning: for obj_ptr_proj.layers.2.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
/home/tc/anaconda3/envs/RynnEC/lib/python3.10/site-packages/torch/nn/modules/module.py:2068: UserWarning: for obj_ptr_proj.layers.2.bias: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Some weights of the model checkpoint at Alibaba-DAMO-Academy/RynnEC-2B were not used when initializing RynnecQwen2ForCausalLM: ['grounding_encoder.sam2_model.memory_encoder.fuser.layers.0.weight', 'grounding_encoder.sam2_model.memory_encoder.fuser.layers.1.weight']
- This IS expected if you are initializing RynnecQwen2ForCausalLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RynnecQwen2ForCausalLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RynnecQwen2ForCausalLM were not initialized from the model checkpoint at Alibaba-DAMO-Academy/RynnEC-2B and are newly initialized: ['grounding_encoder.sam2_model.memory_encoder.fuser.layers.0.gamma', 'grounding_encoder.sam2_model.memory_encoder.fuser.layers.1.gamma']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Running on local URL: http://127.0.0.1:7860
```
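For reference, a minimal sketch of how the share option is normally requested from Gradio (the `build_launch_kwargs` helper is hypothetical, not from the RynnEC repo; `share` and `server_name` are standard `launch()` parameters). When `share=True` is set but only the local URL is printed, the public `*.gradio.live` tunnel may have failed to start, so checking that these kwargs actually reach `launch()` in `gradio_demo.py` is a first step:

```python
# Hypothetical helper illustrating the launch options that request a
# public share link from Gradio; mirrors the standard launch() API.
def build_launch_kwargs(share: bool = True, server_name: str = "0.0.0.0"):
    """Return kwargs asking Gradio for a public URL and LAN-visible bind."""
    return {"share": share, "server_name": server_name}

# In the demo script this would be used roughly as:
#   demo.launch(**build_launch_kwargs())
# which, if the share tunnel can be established, prints a public URL
# in addition to the local one.
```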