I'm sure Bill understands what I'm about to say, but as a person on team "require explicit initializations" I think the mitigations I would be looking at are:
1. Only require that the programmer prove the variable is initialized before it's read. Riffing on Bill's example:
Object *x;      // Fine
if (is_foo) {
    x = &foo;
} else {
    x = &bar;
}
x->a;           // Fine: initialized on every path by the time it's read.
Of course this is still a trade-off: your compiler has to work harder to prove that every path flowing to each use is definitely-initialized. Once you have initialization functions, you also need parameter qualifiers like 'in', 'out', and 'in/out' so that an 'out' parameter can accept a pointer to uninitialized data, or something like MaybeUninit. This handles this example from Bill:
Foo f;
Bar b;
grab_data(&f, &b); // Fine if 'grab_data(out Foo *, out Bar *)'.
2. Something like Rust's MaybeUninit to explicitly opt in to dealing with possibly-uninitialized data. Obviously also a trade-off, especially if you want to force all the maybe-uninitialized handling into an 'unsafe' block.
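Rust happens to implement both mitigations, so a small sketch may make them concrete (my own example, not from the thread; `pick` and `grab_data` are hypothetical names):

```rust
use std::mem::MaybeUninit;

// Mitigation 1: flow-sensitive definite initialization.
// `x` starts out declared but unassigned; the compiler accepts the read
// at the end because every path assigns first. Deleting the `else` arm
// turns this into a compile error ("possibly-uninitialized `x`").
fn pick<'a>(is_foo: bool, foo: &'a i32, bar: &'a i32) -> &'a i32 {
    let x: &i32;
    if is_foo {
        x = foo;
    } else {
        x = bar;
    }
    x
}

// Mitigation 2: MaybeUninit as an explicit opt-in, playing the role of
// an 'out' parameter that is allowed to point at uninitialized data.
fn grab_data(out: &mut MaybeUninit<u32>) {
    out.write(42);
}

fn main() {
    let (foo, bar) = (1, 2);
    println!("{}", pick(true, &foo, &bar));

    let mut f = MaybeUninit::<u32>::uninit();
    grab_data(&mut f);
    // The unsafe block marks exactly where the "is it initialized yet?"
    // reasoning burden lives.
    println!("{}", unsafe { f.assume_init() });
}
```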
In the first case, in a language with type inference, I'd argue it would probably be written a bit differently. Probably something like one of the following two forms:
x := &bar
if is_foo { x = &foo }
// or
x := &foo if is_foo else &bar
I know this is a simple example, but it's kind of just a different way to think about the problem. Sometimes the default C style of having to specify the type first is a good enough nudge away from this pattern, when you can instead just infer the type from the expression. Especially when the variable is being used not as a variable in itself but as an alias for another. In Odin, the place where a `nil` pointer is more likely to occur due to no explicit initialization is within a struct, rather than on its own on the stack.
In the second case, Odin's approach would probably not even do that, but rather `f, b := grab_data()`. The use of `&` is for when a parameter needs to be in-out (i.e. read from and written to), which is different functionality.
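The same multiple-return shape can be sketched in Rust with a tuple (my analogy; `grab_data` is a hypothetical function, not from Bill's code):

```rust
// Returning both values directly means neither caller-side variable
// ever exists in an uninitialized state, so no 'out' qualifier and no
// MaybeUninit are needed for this case.
fn grab_data() -> (u32, u32) {
    (1, 2)
}

fn main() {
    // f and b are initialized at the instant they are bound.
    let (f, b) = grab_data();
    println!("{} {}", f, b);
}
```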
Making all registers caller-saved around context switches is a neat insight; it's intuitive how that could lead to executing fewer instructions when switching contexts.
I haven't yet digested exactly how the stacks are made to work. I think fig 8 should be obvious, but I'm getting stuck on picturing how the system guarantees that frame 1 can't grow into frame 2.
EDIT: Ah, I think I understand. It seems that they're taking advantage of a large virtual address space to give each stack loads of space, rather than them being contiguous like I assumed from fig 8.
> As modern 64-bit address-space is big enough for the allocation of many large continuous stacks [...]
> `foo :: Semigroup a, Traversable t => t a -> a` I already know that whatever is passed to this function can be traversed and have its sum computed. It's impossible to pass something which doesn't satisfy both of those constraints as this is a specification given at the type level which is checked at compile-time.
To add to your point, I don't think foo can even be implemented (more accurately: is not total), because neither `Semigroup a` nor `Traversable t` guarantees a way to get an `a`.
I think you'd need either `Monoid a` which has `mempty`, or `(Foldable1 t, Traversable t)` which guarantees that there's at least one `a` available.
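The same gap can be shown in Rust terms (my analogy, not the thread's): an associative combine with no identity element can't produce a value from an empty collection, so the honest return type is optional unless emptiness is ruled out or an identity (the `mempty` role) is supplied:

```rust
// Semigroup-style: combine only, no identity. Reducing an empty
// collection has no answer, hence Option.
fn combine_all<T>(items: Vec<T>, combine: impl Fn(T, T) -> T) -> Option<T> {
    items.into_iter().reduce(combine)
}

// Monoid-style: supplying an identity value makes the total version
// possible, even for an empty collection.
fn fold_all<T>(items: Vec<T>, mempty: T, combine: impl Fn(T, T) -> T) -> T {
    items.into_iter().fold(mempty, combine)
}
```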
Yep, I missed it, as I don't often work in Haskell anymore, but with the correction the rest of the above still stands (Haskell syntax is still the most efficient I'm aware of for talking about this). Also, being unimplementable is itself a nice property of having such a type system: you can prove that your specification is impossible to implement (more applicable in Lean, ATS2, Idris, Agda) or make invalid states impossible to even write (consider state transition rules at the type level).
Uh, you are right. C is the right name. It throws.
I mixed it up with a similar compile-time constness check in some of my libraries, where I decided whether something was const or not. gcc throws; only clang could give me a proper answer.
My main source was reading other source code and sometimes asking questions in the Pi forum. The already-linked kmscube shows how some of the mentioned techniques work. It was then mainly a matter of following up on API call names and parameters (DRM_MODE_ATOMIC_TEST_ONLY) to find other snippets of code that use them. Felt a bit more like code archeology :-)
> doesn't run against the grain of the entire language
Not an expert, but my gut says maybe it runs against zero values? As in, "what's the zero value for a non-nullable reference?" Maybe the answer is something like "you can only use this type for parameters", but that seems very limiting.
Half of the language is already non-nullable, and this is accomplished by allowing for zero values. Non-pointer variables are guaranteed to never be nil.
What is missing is the ability to have pointer variables where the compiler ensures they are never nil. I believe this was a design choice, not a technical limitation.
Like the sibling comment seems to be saying: a non-nil pointer would have to be set to some real (non-nil) pointer value anyway. So having a zero value does not seem to apply?
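For comparison, Rust reaches the same conclusion (my aside, not the thread's): a non-nullable reference has no zero value, so the compiler simply refuses to let one exist unbound, and "possibly absent" is spelled as a separate type that *does* have a natural zero-ish value:

```rust
fn main() {
    let value = 10;
    // A reference has no nil/zero value; it must be bound to a real
    // place before it can exist at all.
    let r: &i32 = &value;

    // "Possibly absent" is a distinct type, and its zero-ish value
    // is None rather than a nil pointer.
    let maybe: Option<&i32> = None;

    println!("{} {:?}", *r, maybe);
}
```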
I think the person you're responding to is agreeing with you that you should optimize for languages other than English rather than for people naming a label "Bin" and then switching to the UK locale.
FWIW, that's not dependent typing, because the type doesn't depend on run-time values. A bool is either true or false, a [0...1] is either 0 or, ..., or 1.
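For contrast, a genuinely dependent type has the argument *value* appearing in the result *type* (a sketch in Lean; `f` is a hypothetical name):

```lean
-- The return type `Fin (n + 1)` depends on the value `n`,
-- so different calls have different types.
def f (n : Nat) : Fin (n + 1) := 0

#check f 0   -- f 0 : Fin 1
#check f 5   -- f 5 : Fin 6
```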
I'd probably find the name Natural Seitan misleading (i.e. mistakenly buy it thinking it was vegan), but calling it "Animal Seitan" would be perfectly acceptable/clear to me.