
When compiling with rustc 1.90.0 (rustup version 1.28.2), an error was encountered #88

@yangxudong

Description


When compiling with rustc 1.90.0 (rustup version 1.28.2), I encountered the following errors. How can I solve this? Thanks!

Compiling tokenizers-c v0.1.0 (/fg/build_in_docker/_deps/tokenizers_cpp-src/rust)
error: implicit autoref creates a reference to the dereference of a raw pointer
--> src/lib.rs:218:20
|
218 | *out_len = (*handle).decode_str.len();
| ^^------^^^^^^^^^^^^^^^^^^
| |
| this raw pointer has type *mut TokenizerWrapper
|
= note: creating a reference requires the pointer target to be valid and imposes aliasing requirements
note: autoref is being applied to this expression, resulting in: &std::string::String
--> src/lib.rs:218:20
|
218 | *out_len = (*handle).decode_str.len();
| ^^^^^^^^^^^^^^^^^^^^
note: method calls to len require a reference
--> /rustc/1159e78c4747b02ef996e55082b704c09b970588/library/alloc/src/string.rs:1854:5
= note: #[deny(dangerous_implicit_autorefs)] on by default
help: try using a raw pointer method instead; or if this reference is intentional, make it explicit
|
218 | *out_len = (&(*handle).decode_str).len();
| ++ +

error: implicit autoref creates a reference to the dereference of a raw pointer
--> src/lib.rs:251:20
|
251 | *out_len = (*handle).id_to_token_result.len();
| ^^------^^^^^^^^^^^^^^^^^^^^^^^^^^
| |
| this raw pointer has type *mut TokenizerWrapper
|
= note: creating a reference requires the pointer target to be valid and imposes aliasing requirements
note: autoref is being applied to this expression, resulting in: &std::string::String
--> src/lib.rs:251:20
|
251 | *out_len = (*handle).id_to_token_result.len();
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
note: method calls to len require a reference
--> /rustc/1159e78c4747b02ef996e55082b704c09b970588/library/alloc/src/string.rs:1854:5
help: try using a raw pointer method instead; or if this reference is intentional, make it explicit
|
251 | *out_len = (&(*handle).id_to_token_result).len();
| ++ +

error: could not compile tokenizers-c (lib) due to 2 previous errors
gmake[2]: *** [_deps/tokenizers_cpp-build/release/libtokenizers_c.a] Error 101
gmake[1]: *** [_deps/tokenizers_cpp-build/CMakeFiles/tokenizers_c.dir/all] Error 2
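For reference, the errors come from the `dangerous_implicit_autorefs` lint, which the log shows is deny-by-default in this rustc: calling `.len()` on a `String` field reached through a raw `*mut TokenizerWrapper` implicitly creates a `&String`, and the compiler now requires that reference to be written out. Below is a minimal sketch of the compiler-suggested fix. The struct and field names (`TokenizerWrapper`, `decode_str`, `id_to_token_result`) are taken from the error output, but the surrounding function signatures are illustrative, not the exact ones in `rust/src/lib.rs`.

```rust
// Sketch of the fix the compiler suggests: make the autoref explicit.
// Field and type names come from the error messages above; the function
// names and signatures here are hypothetical stand-ins for the real
// FFI functions in rust/src/lib.rs.

pub struct TokenizerWrapper {
    pub decode_str: String,
    pub id_to_token_result: String,
}

/// # Safety
/// `handle` must point to a valid, live `TokenizerWrapper`, and `out_len`
/// must be valid for a write of `usize`.
pub unsafe fn decode_str_len(handle: *mut TokenizerWrapper, out_len: *mut usize) {
    unsafe {
        // Before (rejected by `dangerous_implicit_autorefs`):
        //     *out_len = (*handle).decode_str.len();
        // After: spell out the reference so the autoref is explicit.
        *out_len = (&(*handle).decode_str).len();
    }
}

/// # Safety
/// Same requirements as `decode_str_len`.
pub unsafe fn id_to_token_result_len(handle: *mut TokenizerWrapper, out_len: *mut usize) {
    unsafe {
        *out_len = (&(*handle).id_to_token_result).len();
    }
}
```

Applying the same one-character-pair change at both reported call sites (lib.rs lines 218 and 251), exactly as the compiler's `help:` output shows, should let tokenizers-c build again under rustc 1.90.0.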
