C# Language Design Notes for Aug 9, 2017 #822
Replies: 87 comments
-
I would suggest the following:
The use of default expressions within generic types, at least for all of the cases described in the notes, is covered by the proposal in #727.
-
This seems like a good warning.
I would expect that a warning on the use of a non-nullable reference type as a field of a struct would have a high rate of detecting potential bugs in the wild.
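To make the risk concrete, here is a minimal sketch (the type and member names are invented for illustration): because a struct can always be zero-initialized with `default`, a reference-type field declared non-nullable can still end up null without any assignment ever occurring.

```csharp
using System;

struct Wrapper
{
    // Under the proposed rules this field would be declared non-nullable.
    public string Name;
}

class Program
{
    static void Main()
    {
        // default(T) zero-initializes every field, bypassing any constructor.
        var w = default(Wrapper);

        // The "non-nullable" field is null anyway.
        Console.WriteLine(w.Name == null); // prints True
    }
}
```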
-
I disagree with the conclusions about arrays and default expressions. If the feature is to be useful, it should not lie to you about the underlying type system. When you create an array, it is always composed of nullable elements, and the warnings and type inference should treat it as such. How would you even distinguish between an array variable that can be null and an array variable that can hold nulls but cannot itself be null?

```csharp
var a = default(string); // This produces a warning? That seems bizarre.
var b = new string[30];  // The array reference should be non-nullable,
                         // but the array elements themselves should always be treated as nullable.
```
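For what it's worth, the annotation syntax that eventually shipped in C# 8 does distinguish these two cases; the position of the question mark determines whether the array reference or its elements are nullable. A small sketch:

```csharp
#nullable enable

class Example
{
    void M()
    {
        string[]? a = null;           // the array variable itself may be null
        string?[] b = new string?[3]; // the variable is non-null, but its elements may be null
        string?[]? c = null;          // both the variable and its elements may be null
    }
}
```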
-
Yeah, that's about what I was hoping for. 👍
Like the way that this feature is getting implemented isn't already lying about the type system. 🙃
-
Then we cannot deliver a feature. After all, the underlying type system can always give you nulls. There is nothing at the CLR level enforcing any of this. So at best, this is just trying to prevent you from dealing with nulls as long as people are not trying to subvert things. But it's trivial to subvert any of this (just use a language that doesn't understand/respect these annotations). Note: this is what other languages do as well (like Kotlin). And yet, the feature can be useful. So I reject the claim that for the feature to be useful it must always respect what the underlying type system may do.
-
But what you're complaining about for array elements is actually what's happening with fields, locals and parameters themselves! They can all hold null at runtime and are not guarded, even though the type system generates best-effort control flow warnings. I wish an equal amount of effort were going into CLR work to enable strongly-typed non-nullability in languages.
-
Warnings are inherently just a best effort to tell you that something may be a problem. As such, they will inherently be on a spectrum. Some people will want to be told about anything that could be a problem. Some people will find that unreasonable and will want something much more gentle. For example, I want a way to be able to express that an array contains non-null values. I will ensure that all the values in it are non-null. I want to do this so that I don't have to be warned at every point that there may be a null when I know that that's not the case.

Again, this goes back to the point that compilers can only do very weak analysis here. And if you want to be warned about anything outside that weak analysis, you're going to have a very rough time.
-
Yes. And that's why we give the user the control to tell us what it means, and we trust it and only want to warn about stuff that seems to go explicitly against that (and even then we give ways to opt out of the warning).
-
Like many language features, it's all a dial. You can provide really weak guarantees and really weak analysis and make the feature crippled, so that even people who might otherwise want to use it won't, because it's constantly lying to them. As I've said before, this feature risks making things worse than they are today. If I get an NRE from a variable marked as non-nullable, I no longer trust the feature. I stop using it. And not only that, but now I think less of the C# language as well. I go and talk with my colleagues about how they are right that so-and-so language is less broken and more modern. You guys are spending so much time hand-wringing over making it easy to upgrade that you risk compromising the much more important use case: the infinite future time after the feature is delivered. You are already hiding it behind a compiler switch. So make it as effective and honest as it can be once you decide to go and turn that switch on.
-
Not happening. And we've seen from other ecosystems that it does not need to happen in order for features like this to be incredibly useful and valuable. Without those existence proofs I'd be a lot more concerned, but we've seen how this plays out. Again, this feature is not about contracts or about 100% foolproof ensuring that no null shall ever be possible. This feature is about helping catch the vast majority of nulls in the vast majority of cases when people are playing by the rules and not trying to subvert the system. It's to help prevent nulls from accidentally creeping in by mistake.

To repeat myself from the other thread on this topic: #796 (comment)
-
This already exists today in other languages, and that hasn't been a problem. So, overall, the ecosystem isn't revolting in the way you indicate. Yes, I agree that some users will not use it. But I don't want to win over 100% of users; I want to be useful to enough users. A good example of this is F#. You can talk about non-null types in it, but it's not like the compiler actually enforces that completely. So you can totally get null reference exceptions when you use it. Is that a problem? For some users... maybe. But overall it isn't, and people accept that it's a great help even if it isn't perfect.
-
Evidently, and yet, I still wish.
-
As I've mentioned, we're entirely amenable to actually exposing this as a dial (and it's something we've talked about). There would be several different types of things users could choose from when turning on the feature. For example (again: example, example, example, not actual design), users could choose what level of analysis aggressiveness they care about.
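Purely as a hypothetical illustration of such a dial (every name below is invented here and is not actual design), the choice could surface as a source-level or project-level setting:

```
// Invented, illustrative syntax only - not a real compiler directive:
#nullable_analysis aggressive   // warn about anything that could possibly be null
#nullable_analysis standard     // warn only where flow analysis proves a possible null
#nullable_analysis gentle       // warn only on explicit contradictions of annotations
```

The point is just that aggressiveness is a spectrum, and different users would reasonably pick different points on it.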
-
Yup. And TypeScript has type system holes you can drive a massive truck through. I've been burned by them myself *while I worked on writing the TypeScript compiler itself :D*. Despite that, it's still a great help :)
-
Let's stop being theoretical. Arrays should always be nullable; here's why:

```csharp
void Main() {
    var a = new Foo[5]; // Compiler infers that a is composed of non-nullable elements
    Bar(a[3]);          // Bar expects a non-nullable Foo. Great! That's just what the compiler thinks we have!
}

void Bar(Foo foo) {
    foo.DoSomething();  // NRE - WTF - foo is supposed to be non-nullable
}
```

If I'm a user and this happens, my conclusion is that I still have to do defensive null checks everywhere anyway. So then what's the point of the feature? I may as well just use
-
@svick you cannot instantiate a
-
Only if
-
When does null mean a value?
-
@jnm2 When lazy developers (myself included) use it to mean things like "Other".
-
@bondsbw When you use it to mean "other", how is that different from using it to mean "no value"?
-
Using it for "other" means that the options provided are not adequate to express my intent, and perhaps it is coupled with an additional text response. "No value" might mean that the user simply hasn't selected an option. But what happens when...? When I fail to adequately communicate intent in my data types, I've failed to create proper data types.
-
@bondsbw I agree with all of that about communication, and yet, like you said, I wonder if some of that is because people try to conflate their data, domain and UI models. There are other good reasons not to do that. But that kind of whole-model conflation is the only scenario in which I can imagine conflicting assumptions being made about what caused the null.
-
What belongs where is certainly a design decision, with no one-size-fits-all solution. In the domains I have worked in, I find that storing as much information as possible is the best approach. I may never use that extra information, but I don't know what I'm going to want to do with it 5 years down the road. I may need to run usage analytics or investigate a situation. In my domains, storage is too cheap to throw away any data, even if I don't yet know what I'll do with it.
-
Certainly a good policy.
-
What I would propose is:
I would insist only on the second point; the other two are just suggestions.
-
That's already the case; the feature is called (confusingly, but due to the history of how the feature was developed) "Nullable Reference Types".
-
@Joe4evr My objection is about the "type" part, not the nullable/non-nullable part. As I have stated: "Nullable and non-nullable references" (no type).
-
Re: What is null? and Reference Type vs Reference: my 2c. Should it be Null Reference or Null Reference Type?

It's like Schrödinger's cat. The cat is null when the box is closed: unevaluable. But it has a type (Cat) and values (color, alive/dead, etc.); they are just cloaked from us. When we open the box we will see a dead cat or an alive cat, but still a cat.

It's the same with objects in memory. If we create a reference object, it has a type with memory allocation for that type, and a pointer to the base. If we make that null, we don't necessarily destroy the object; we remove the ability to resolve the pointer. The Reference Type isn't about the "type", it's about the reference (a pointer reference to the base location of a type instance). In this case, we expect the pointer to point to the base location of memory that can be read as a Cat. Dropping "Type" would translate to Pointer Referencing a Memory Location (Non-Nullable Reference) or Pointer Possibly Referencing a Memory Location (Nullable Reference). We have no idea what we might find at that location.

Since nullable isn't about the object itself, but our ability to evaluate the object, default values are something of a requirement, but really only in the case of bool. Booleans can only be true or false by definition. If we allow conversion between nullable and non-nullable, that means when we "de-cloak" a bool it must have a value. Value types like...

It seems to me that what we are really after is "Mandatory Valued Objects" as much as a true not-null: we want to ensure that our references point to locations that have "resolvable" objects, not just that they point to a location that can have an object.
-
@RandyBuchholz I'm not sure what you are agreeing with / disagreeing with / proposing. It seems like that is mostly an argument in favor of renaming the feature from "Nullable reference types" to just "Nullable references", but other than that, can you clarify the specific desired outcomes?
-
@sharwell Sorry, kind of rambling. I was jumping in on the conversation about what null is and whether it can have values, and the comments about dropping "type" from the name.

It seems like "null" is often considered a special value (enforced by being able to do...). So when we talk about different aspects/attributes of "nullable", that may be a perspective to consider. Some languages do that to different degrees, which is why we have null, none, and undefined appearing together. That leads to my last comment, that what we want isn't really tied to the definition of null. What we want is to be able to constrain objects (in this case reference objects) to always have a value, and to always be able to determine that value.

I'm disagreeing about dropping type. I think "Nullable Reference Type" is better than "Nullable Reference". But really (I'm not proposing this), "Mandatory" or "Required" would more accurately describe what we want.

I don't have a complete proposal for a desired outcome, other than not changing the direction things are going right now. But a general outcome I would like to see (and have been working on a proposal for) is something along the lines of a "Domain" concept in the language. Not in the DDD way, but in the UDT or database way. This isn't a syntax I would use, but just for the general idea:
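A purely hypothetical sketch of such a UDT/database-style domain constraint (the syntax below is invented solely for illustration and is not a proposal):

```
// Invented, illustrative syntax only:
domain NonEmptyName : string where value != null && value.Length > 0;

class Person
{
    public NonEmptyName Name;  // assignments would be checked against the domain constraint
}
```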
-
C# Language Design Notes for Aug 9, 2017
We discussed how nullable reference types should work in a number of different situations.
Please discuss!