[DebugInfo] Add bit size to _BitInt name in debug info #165583
Conversation
@llvm/pr-subscribers-clang-codegen @llvm/pr-subscribers-debuginfo

Author: Orlando Cazalet-Hyams (OCHyams)

Changes

Follow on from #164372. DWARF for _BitInt(23):

  DW_TAG_base_type
    DW_AT_byte_size (0x04)
    DW_AT_encoding  (DW_ATE_signed)
    DW_AT_bit_size  (0x17)
-   DW_AT_name      ("_BitInt")
+   DW_AT_name      ("_BitInt(23)")

This matches GCC. It's what our (Sony) debugger would prefer to see, but it's possibly not useful for LLDB - so maybe we want to avoid doing this with -glldb tuning?

Full diff: https://github.com/llvm/llvm-project/pull/165583.diff

2 Files Affected:
diff --git a/clang/lib/CodeGen/CGDebugInfo.cpp b/clang/lib/CodeGen/CGDebugInfo.cpp
index 07a2cfb21bef2..fd2f6dcf182b5 100644
--- a/clang/lib/CodeGen/CGDebugInfo.cpp
+++ b/clang/lib/CodeGen/CGDebugInfo.cpp
@@ -1174,7 +1174,10 @@ llvm::DIType *CGDebugInfo::CreateType(const BuiltinType *BT) {
}
llvm::DIType *CGDebugInfo::CreateType(const BitIntType *Ty) {
- StringRef Name = Ty->isUnsigned() ? "unsigned _BitInt" : "_BitInt";
+ SmallString<32> Name;
+ llvm::raw_svector_ostream OS(Name);
+ OS << (Ty->isUnsigned() ? "unsigned _BitInt(" : "_BitInt(")
+ << Ty->getNumBits() << ")";
llvm::dwarf::TypeKind Encoding = Ty->isUnsigned()
? llvm::dwarf::DW_ATE_unsigned
: llvm::dwarf::DW_ATE_signed;
diff --git a/clang/test/DebugInfo/Generic/bit-int.c b/clang/test/DebugInfo/Generic/bit-int.c
index 94b93013e3b46..88ecc139eee9f 100644
--- a/clang/test/DebugInfo/Generic/bit-int.c
+++ b/clang/test/DebugInfo/Generic/bit-int.c
@@ -4,5 +4,5 @@
unsigned _BitInt(17) a;
_BitInt(2) b;
-// CHECK: !DIBasicType(name: "_BitInt", size: 8, dataSize: 2, encoding: DW_ATE_signed)
-// CHECK: !DIBasicType(name: "unsigned _BitInt", size: 32, dataSize: 17, encoding: DW_ATE_unsigned)
+// CHECK: !DIBasicType(name: "_BitInt(2)", size: 8, dataSize: 2, encoding: DW_ATE_signed)
+// CHECK: !DIBasicType(name: "unsigned _BitInt(17)", size: 32, dataSize: 17, encoding: DW_ATE_unsigned)
Seems good to me. I guess we still want LLDB to be able to gracefully handle GCC's generated DWARF.
LLDB's support for GCC generated DWARF is pretty best effort, but yeah, it would be good to account for both. There aren't any bots out there running the LLDB test-suite with GCC (that I know of). That being said, LLDB tuning for this may be overly careful. I'll double check LLDB's behaviour, but tentatively this makes sense to do. Being consistent with GCC seems sensible.
FWIW LLDB support seems limited currently as the type is printed as short:

$ cat test.c -n
     1  #include <string.h>
     2
     3  int main() {
     4    _BitInt(15) a;
     5    memset(&a, 0x7f, 2);
     6    return 0; // GDB: b 6, p a
     7  }

* thread #1, name = 'test.elf', stop reason = breakpoint 1.1
    frame #0: 0x0000555555555162 test.elf`main at test.c:6:3
   3    int main() {
   4      _BitInt(15) a;
   5      memset(&a, 0x7f, 2);
-> 6      return 0; // GDB: b 6, p a
   7    }
(lldb) p a
(short) 32639
(lldb) im lookup -t _BitInt(15)
Best match found in /home/och/scratch/test.elf:
id = {0x00000042}, name = "_BitInt(15)", qualified = "short", byte-size = 2, compiler_type = "short"

If there's any LLDB testing I can do on my side please let me know.
Yeah, the way that LLDB determines types for builtins is here:
So