Because it's not posted by a Russian/Indian account, duh!

> And finally, source-level debugging is gnarly. You would like to be able to embed DWARF information corresponding to the code you residualize; I don’t know how to do that when generating C.

I think emitting something like

    #line 12 "source.wasm"
for each line of your source before the generated code for that line does something that GDB recognizes well enough.

I dabbled in literate programming and wrote C basically in reStructuredText. I used '#line' to specify the correct file name and line number in my reStructuredText sources. Worked as advertised, no complaints.

In my case I had long-ish fragments, so I only put '#line' at the start of a fragment. The compiler counted subsequent lines itself.

It was a cross-platform project and worked equally well with GCC, Clang and MS Visual C++.
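A sketch of what the generated C can look like (hypothetical file name and line numbers): each fragment is prefixed with a '#line' directive pointing back at the literate source, so GDB and friends attribute the code to the original file:

    #line 42 "parser.rst"
    int is_digit(int c) {
        return c >= '0' && c <= '9';
    }
    #line 57 "parser.rst"
    int digit_value(int c) {
        return c - '0';
    }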


If you have ever used something like yacc/bison, debugging it is relatively sane with gdb.

You can find all the possible tricks for making it debuggable by reading the generated y.tab.c.

Including all the corner cases for odd compilers.

Re2c is a bit more modern if you don't need all the history of yacc.


Debugging Yacc with gdb is completely insane for other reasons, like that grammar rules aren't functions you can just put a breakpoint on and get a backtrace from, as you can with a recursive descent parser.

But yes, you can put a line-oriented breakpoint on your action code and step through it.
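For example, in a toy Bison rule (hypothetical grammar; the action body is plain C that ends up in y.tab.c with #line markers pointing back at the .y file), a line-oriented breakpoint like 'break calc.y:12' lands on the action:

    expr : expr '+' term
             { $$ = $1 + $3; /* a line breakpoint can be set here */ }
         ;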


Nim compiles to C, and it has a compiler option that does this.

The C preprocessor does that as well, because by the time the compiler proper is invoked, the original source code file is already gone.

> Empowering the 'User' (hardware owner) should have always been the focus.

The "user" and "hardware owner" are not necessarily the same person.


Yeah, but it's a shame that almost every language (and even the traditional presentation of mathematical logic) gets it wrong and uses zero as the representation of FALSE. It should represent TRUE!

Zero as the absence of something definitely makes more sense as false than true, as in these common C idioms:

    while (p) {
        // do something with p
        ...
        p = p->next;
    }

    if ((err = foo())) {
        printf("Error %d occurred\n", err);
        ...
    }

Pedantic: the axioms of Boolean algebra don’t assign any natural numbers to the elements “top” and “bottom” of the set it operates on. The notation is usually “1” and “0” but it doesn’t have to be. It’s a convenience that many computer languages have named those elements “true” and “false”, and yes, it’s totally valid that in some representations, top = 0 = true and bottom = 1 = false.

Technically true; in practice, using 0 for the bottom element and 1 for the top is a pretty strong convention, justified, for example, by the connection to probability measures and the isomorphism with the Boolean ring ℤ/2ℤ.

If you want to make an argument for representing boolean variables by something other than 0 for false and 1 for true, one could make the case for true being all bits set.

That would make it slightly easier to do things like memset()'ing a vector of booleans, or a struct containing a boolean like in this case. Backwards compatibility with pre-_Bool boolean expressions in C99 probably made that a non-starter in any case.
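A minimal sketch of that convenience, assuming an all-bits-set "true" and using signed char as a stand-in boolean type:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        signed char flags[8];               /* stand-in for a vector of booleans */
        memset(flags, 0xFF, sizeof flags);  /* one call sets every element to "true" (-1) */
        memset(flags, 0x00, sizeof flags);  /* one call clears them all to "false" */
        printf("%d\n", flags[0]);           /* prints 0 */
        return 0;
    }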


A 1-bit integer can be interpreted as either a signed integer or as an unsigned integer, exactly like an integer number of any other size.

Converting a 1-bit integer to a byte-sized or word-sized integer, by using the same extension rules as for any other size (i.e. by using either sign extension or zero extension), yields as the converted value for "true" either "1" for the unsigned integer interpretation or the value with all ones (i.e. "-1") for the signed integer interpretation.

So you could have "unsigned bool" and "signed bool", exactly like you have "unsigned char" and "signed char", to choose between the 2 possible representations.
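C's 1-bit bit-fields already behave exactly this way, which gives a rough sketch of what "signed bool" and "unsigned bool" would look like:

    #include <stdio.h>

    struct bits {
        signed int   s : 1;  /* sign-extends: the set bit reads back as -1 */
        unsigned int u : 1;  /* zero-extends: the set bit reads back as  1 */
    };

    int main(void) {
        struct bits b = { .s = -1, .u = 1 };  /* both bits set */
        printf("%d %d\n", b.s, b.u);          /* prints "-1 1" */
        return 0;
    }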


> one could make the case for true being all bits set

Historical note: this was the case in QBasic, where true was defined as -1.


There were, apparently, quite a number of ISAs where checking the sign bit was more convenient (or faster) than checking (in)equality with zero.

Some Fortran compilers also did this. MS Powerstation Fortran at least, IIRC.

That is right, and you could map the Boolean values to other numbers, e.g. mapping them to +1 and -1 corresponds better to many of the hardware implementation techniques for logic circuits.

However when the use of Boolean algebra is embedded in some bigger theories, there are cases when the mapping to 0 and 1 becomes mandatory, e.g. in relationship with the theory of probabilities or with the theory of binary polynomials, where the logical operations can be mapped to arithmetic or algebraic operations.

The mapping to 0 and 1 is fully exploited in APL and its derivatives, where it enables the concise writing of many kinds of conditional expressions (in a similar manner to how mask registers are used in GPUs and in AVX-512).
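A small C illustration of that APL-style use of the 0/1 mapping (a sketch, not APL itself): a comparison result participates directly in arithmetic as a mask, selecting between two values without a branch:

    #include <stdio.h>

    int main(void) {
        int a = 10, b = 20;
        int cond = (a < b);                   /* 1, under the 0/1 convention */
        int max = cond * b + (1 - cond) * a;  /* branchless, mask-style select */
        printf("%d\n", max);                  /* prints 20 */
        return 0;
    }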


> Pedantic: the axioms of Boolean algebra don’t assign any natural numbers to the elements “top” and “bottom” of the set it operates on.

Yes? That's precisely what I meant when I said that the traditional presentation of mathematical logic gets it wrong: it assigns 0 to FALSE and 1 to TRUE, but it can be done the other way around.


Including the words "top" and "bottom". In my language, if(x) is the same as if(x==bottom), and 1<2 resolves to bottom. Take that.

> 1. Explicitly set the C standard to C17 or older, so the code is built using the custom boolean type.

> Option 1) seemed like the easiest one, but it also felt a bit like kicking the can down the road – plus, it introduced the question of which standard to use.

Arguably, that's the sanest one: you can't expect the old C code to follow the rules of the new versions of the language. In a better world, each source file would start with something like

    #pragma lang_ver stdc89
and it would automatically put newer compilers into a compatibility mode, but oh well. Even modern languages such as Go miss this obvious solution.

On the topic of the article: yeah, sticking anything other than 0 or 1 into a C99 bool is UB. Use ints.
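A sketch of the hazard (the exact misbehavior varies by compiler and optimization level, since this is UB): writing a raw byte other than 0 or 1 into a bool's storage yields a value that is neither properly true nor properly false:

    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        bool b;
        memset(&b, 42, sizeof b);   /* bypasses the bool conversion rules: UB */
        if (b)  puts("b is true");
        if (!b) puts("b is false"); /* some compilers happily print both lines */
        return 0;
    }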


Yeah, it’s only kicking the can down the road if you’re the actual maintainer of the software.

If you’re just a packager, it’s your job to get the package to build and work correctly; for your own sanity, you should be making minimal changes to the underlying code to facilitate that. Get it building with the old language version and file a bug report.


Rust does the right thing, with the per-crate

    edition = "2021"
statement in its Cargo.toml.

With C you put that information as a build option in your Makefile or similar. That’s a consequence of C only standardizing the actual language (and a runtime library), not the build environment.

Since you mention Go, it offers something close to the feature you describe in the form of build constraints. A file starting with

    //go:build go1.18
tells the toolchain to compile that file only when building with Go 1.18 or newer. A slightly different syntax ("// +build") was used prior to Go 1.17, but the feature itself has existed since Go 1.0.

I know this is likely to be an unpopular take but: I wish it was normal to ship your compiler in your source repo.

Modern compilers are huge, bloated things, which makes this a bit impractical, but if it were a normal thing to do, we'd probably have optimized the binary sizes somewhat.

I just really like the idea of including _everything_ you need for the project. It also ensures that weird problems like this don't happen. As an extra benefit, if you included the compiler source and a bootstrapping path instead of just the latest binary, you could easily include project-specific compiler / language extensions with no extra effort.


That's pretty close to the underlying concept behind Guix and Nix. Give them a glance, if you can!

> you can't expect the old C code to follow the rules of the new versions of the language

Well, to be pedantic, the entire point of the C standard, and the standard body, is that you should expect it to work, as long as you're working within the standard!


Not really, no. Newer versions of the standard can (and occasionally do, I'll give the C standard committee that) introduce incompatibilities with earlier versions. E.g. at one point the standard explicitly allowed you to #undef "bool", "true", and "false" (and to redefine them later), but IIRC this has since been deprecated and removed.

In any case: blindly switching what is essentially a typedef-ed int into _Bool has no business working as expected, since _Bool is a rather quirky type.
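A sketch of that (now withdrawn) escape hatch as C99 through C17 permitted it; under C23, where bool, true, and false are keywords, the typedef below no longer compiles:

    #include <stdbool.h>

    #undef bool     /* explicitly permitted up to C17 */
    #undef true
    #undef false

    typedef int bool;   /* a legacy project's own boolean */
    #define true  1
    #define false 0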


I'm using the dictionary definition of expect here, which is compatible with what you're saying.

> That's a poor advice for the scripts you call relatively frequently.

Why? It protects you from someone else (cough updated packages introducing new commands cough) picking a name you already use.


Because it's useless extra typing. People try to narrow commands down to two fucking chars and you suggest typing the whole goddamn path!

Nobody suggested typing whole paths.

They did. Just not explicitly.

The only situation where you'd need to type a full path is when you have a name collision. But the suggestion to not worry about collisions with unknown system binaries also said to put the script folder first. In that situation running the script doesn't need a full path, and you won't be running the unknown system binary. So you won't be typing full paths.

Please explain the non-explicit suggestion you see, because I don't see it.

It's clear that "adding your script directory in front of the PATH" means a one time edit to .bashrc or equivalent, right? In normal use you're not typing it.


I've asked you why prepending paths to PATH instead of appending is a bad idea. What does having to type full paths have to do with it?

> if AI replaces work wholesale right now billions of people will die before society is reshaped accordingly.

Don't worry, the economists will slap the label "natural readjustment of labour supply levels" on this phenomenon, and it will make everything morally better.

Edit: in fact, we have historical precedents in e.g. the Indian famines and how the British administrations talked about and handled them [0][1]. Ah, Malthusianism and undisguised racism, what a mixture. Of course, nobody counts those as part of "the millions of victims of Capitalism".

[0] Rune Møller Stahl, "The Economics of Starvation: Laissez-Faire Ideology and Famine in Colonial India": https://www.researchgate.net/publication/304189843_The_Econo...

[1] Jayant Chandel, "Political Economy of Famines in the British Empire: An Analysis of the Great Famines in India from 1876–1879" (PDF): https://www.journalofpoliticalscience.com/uploads/archives/8...


Reminds me of a review (written somewhere in the early 60s, I believe) by some Soviet sci-fi writer of Hamilton's Star Kings (1949) and of Western sci-fi in general; to paraphrase: "it's astonishing that those writers set their scenery thousands of years in the future, with wildly imaginative technological advances and inventions, yet when they come to the social systems, all they can imagine is either the feudal order of the past or the modern style of capitalism".

But everything after the headers can (almost) be treated as a blob. Just copy buffers while taking care to track CRLF and to check whether what follows is a space; in fact, you have to do that anyhow, because line-folding is allowed in headers as well! And this "chunking long lines" technique has been around since the 70s, when people started writing parsers on top of hand-crafted buffered I/O.
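A minimal sketch of that CRLF-plus-space check (hypothetical buffer handling; a real parser does this incrementally across reads):

    #include <stdio.h>
    #include <string.h>

    /* Scan a header block: a CRLF followed by a space or tab marks a folded
       continuation of the previous header line, not a new header. */
    static void scan_headers(const char *buf, size_t len) {
        for (size_t i = 0; i + 1 < len; i++) {
            if (buf[i] != '\r' || buf[i + 1] != '\n')
                continue;
            if (i + 2 < len && (buf[i + 2] == ' ' || buf[i + 2] == '\t'))
                printf("folded continuation at offset %zu\n", i + 2);
            else
                printf("header line ends at offset %zu\n", i);
        }
    }

    int main(void) {
        const char hdrs[] = "X-Long: part one\r\n part two\r\nHost: example\r\n";
        scan_headers(hdrs, sizeof hdrs - 1);
        return 0;
    }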

I am willing to bet the whole xAI/SpaceX merger is simply a ploy by Musk to avoid releasing accurate historical information about SpaceX's finances. How much did it actually cost SpaceX to launch a kilogram of payload into space each year? How much is NASA actually donating to them per year?

I mean, I still remember the promises of $1000-per-kg space launches, of the Gigafactory producing half of the world's battery supply, and other non-scientific fiction peddled by Musk. Remember when SpaceX suggested in 2019 that the US Army could use its Starship rockets to transport troops and supplies across the planet in minutes? I do. By the way, have they finished testing Starship yet, is it ready?

