On the main page, shorts, like all the other videos, are served by the recommendation algorithm, which should filter out the general-audience crap you'd see if you're not logged in or have view history disabled. You'd normally see the same stuff you're subscribed to there, plus a few random videos of cats. Maybe a wamen butt occasionally. Might as well hide the main page entirely if you're not that easily entertained. To be quite frank, the main page has become such an echo chamber lately that I almost got myself unhooked from procrastinating on YouTube.
On the search page, shorts are mostly a mixed bag, but you do occasionally get useful results.
So what does this solve? Seems like a form of protest nobody important (those in power) cares about.
Another thing is, I have, to my own surprise, discovered a few decent channels that I like that post their videos exclusively in the form of shorts. That's a somewhat new trend, and mostly limited to humor or music channels, though.
Almost forgot to mention: YouTube recently added a scroll bar to shorts, so they aren't all that different from other videos now.
Filtering content is not "a form of protest"; it is about deciding what content you do and don't want to see in your browser. YouTube, even the paid version, does not offer much in terms of customising one's experience (imo the "algorithm" deciding what you should watch based on your history does not count as customisation), and shorts are a proven addictive pattern that one may not want to encounter online.
It is fine if you like watching shorts, such filter lists are for those who do not want to watch shorts.
I might be wrong, but I don't think people really care about the addictiveness in the first place. As I see it, shorts were irritating to see mainly because they were heavily out of tune with the rest of the recommendations. But they seem to have been tuned to be more in line with the rest of the videos now. If they're not that different from the rest of what one gets recommended, is there much point in hiding them? I'm not exactly defending shorts here. My point is, you can, of course, cut some of the videos from the feed, but the rest would still be shaped by the same algorithm. You still don't really get to filter anything. So what's the point?
If addictiveness really is that much of a factor, I rest my case.
The main benefit for me is hiding content I'm actively uninterested in seeing. Shorts are portrait mode content that pretty much never seem to be long enough to discuss anything interesting. I watch on widescreen monitors, so I just don't care for them. There's nothing else to it really.
It is not a bad rule to use online services / software whose owners, even if malicious, are likely neither after you nor in cahoots with the government where you live. Or you can take the Swiss option with stuff like ProtonVPN, Signal etc. :-)
There is normally a Wikipedia page for every popular program, which usually contains the official site URL. That's how I remember where to actually get PuTTY. Wikipedia can potentially be abused for lesser-known software, but, in general, it's a good indicator of legitimacy.
So Wikipedia is now (informally) part of the supply chain, which means there is yet another set of people who will try to hijack it, as if we didn't have enough already. Just great.
You can corroborate multiple trusted sources, especially those with histories. You can check the edit history of the Wikipedia article. Also, if you search "7zip" on HN, the second result with loads of votes and comments is 7-zip.org. Another is searching the Archlinux package repos; you can check the git history of the package build files to see where it's gotten the source from.
And we're really going to go through all that brouhaha for a single download of an alternative compressor? And then multiply that work as a best practice for every single interaction on the Internet? No, we're not.
The downloads for some programs are often on some subdomain page with like 2 lines of text and 10 download links for binaries, even for official programs. It's so hard to know whether they are legit or not.
My point was more along the lines of "there's no need to complain about Wikipedia being hijackable, there are other options", and now you're complaining about having too many options...
You don't need to do everything or anything. They're options. Use your own judgment.
Not exactly news; Wikipedia's been used for misinformation quite extensively, from what I recall. You can't always be 100% sure with any online source of information, but at least you know there is an extensive community that'll notice if something's fishy sooner rather than later.
It did get me thinking - maybe there should be IoTS devices, where the S stands for Security. A commitment to updates for a certain amount of time, the source code in escrow to be released when updates/support ceases, probably other things I'm not thinking of.
Seems like a fitting area for government regulation and certification. But in order for a government to even begin to consider the lack of security in IoT a problem, the adoption must be ubiquitous. I.e. the devices (or the number thereof) should pose enough of a threat to public infrastructure (think botnets) to be subjected to regulation. Is there such an incentive in any country at the moment?
Not OP, and not sure about OCaml and Haskell, but one example where Java's type system is unhelpful/incorrect is mutable subtyping.
E.g. Java assumes Cat[] is a subtype of Animal[]. But this only holds when reading from the array. The correct behavior would be:
- `readonly Cat[]` is a subtype of `Animal[]`
- `writeonly Cat[]` is a supertype of `Animal[]`
- `readwrite Cat[]` has no relationship with `Animal[]`
But Java doesn't track whether a reference is readable or writable. The runtime makes every reference read-write, but the type checker assumes every reference is read-only.
This results in both
- incorrect programs passing the type checker, e.g. when you try to write an Animal to an Animal[] (which, unbeknownst to you, is actually a Cat[]), you get a runtime exception (ArrayStoreException)
- correct programs not passing the type checker, e.g. passing an Animal[] into a writeCatIntoArray(Cat[] output) function is a type error, even though it would be safe.
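The first bullet can be reproduced in a few lines; Animal/Cat/Dog here are made-up stand-in classes, a minimal sketch of the unsound covariance described above:

```java
// Java treats Cat[] as a subtype of Animal[], so the assignment below
// compiles, and the runtime has to catch the bad write instead.
class ArrayVariance {
    static class Animal {}
    static class Cat extends Animal {}
    static class Dog extends Animal {}

    static boolean demonstrateStoreCheck() {
        Animal[] animals = new Cat[1];   // allowed by the type checker
        try {
            animals[0] = new Dog();      // a Dog is an Animal, so this type-checks...
            return false;
        } catch (ArrayStoreException e) {
            return true;                 // ...but the runtime store check rejects it
        }
    }

    public static void main(String[] args) {
        System.out.println(demonstrateStoreCheck());
    }
}
```

Every array write pays for this: the JVM has to check the stored value's class against the array's actual element class at runtime.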
(Although all that is assuming you're actually following the Liskov substitution principle, or in other words, writing your custom subtypes to follow the subtyping laws that the type checker assumes. You could always override a method to throw UnsupportedOperationException, in which case the type checker is thrown out of the window.)
Interestingly, AFAIK TypeScript makes these types both subtypes and supertypes at the same time, in the interest of not rejecting any correct programs. But that also allows even more incorrect programs.
Did you mean arrays instead of lists? Arrays behave as you describe (with ArrayStoreException when you write a wrong value to an array). List<> is invariant WRT its type parameter.
I can't make a Mappable interface and have my classes implement map(f), because map(f) will necessarily return Mappable, not my class itself. So no method chaining for me.
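There is a partial workaround via an F-bounded "self type" parameter, sketched below; Mappable and Box are illustrative names, not anything from the JDK. Note the remaining limitation: without higher-kinded types, the element type has to be pinned down (Integer here), so this still can't express a fully generic map:

```java
import java.util.function.UnaryOperator;

// SELF is constrained to the implementing class, so map() can return
// the concrete type and chaining works.
interface Mappable<SELF extends Mappable<SELF>> {
    SELF map(UnaryOperator<Integer> f);   // simplified: element type fixed to Integer
}

class Box implements Mappable<Box> {
    final int value;
    Box(int value) { this.value = value; }

    @Override
    public Box map(UnaryOperator<Integer> f) {
        return new Box(f.apply(value));   // returns Box, not Mappable
    }
}
```

With this, new Box(1).map(x -> x + 1).map(x -> x * 10) compiles and stays a Box the whole way through.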
Also null. Yeah, I know it's contentious. People don't want to let go of it. Since learning to hate null, I've also lost any nuance in my ability to explain why it's bad, because I no longer see it as 'subtly bad' or 'might lead to bugs'. It's just plain, on-the-surface wrong. One might as well have named it 'wrong' rather than 'null'.
'Null' is the thing which it isn't. I can write business logic that says every Person has a Name. Once you admit null into the mix, I can no longer make that simplest of statements. My autocomplete now lies to me, because person may or may not implement the method .name().
"But how will I half-arse instantiate a Person? I don't have a Name, yet I want to tell the computer I have a Person?" It makes me happy that you can't.
"I wrote a function that promises to return a Person. I was unable to return a Person. How can I tell the computer I'm returning a Person even though I'm not?" Glad that you can't.
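One way to get both guarantees in Java as it exists today is to reject null at construction and use Optional at the boundaries where absence is genuinely possible; a minimal sketch, with Person and Directory as made-up illustrative classes:

```java
import java.util.Optional;

class Person {
    private final String name;

    Person(String name) {
        // Can't half-instantiate: a Person without a Name is rejected up front.
        if (name == null) throw new IllegalArgumentException("a Person needs a Name");
        this.name = name;
    }

    String name() { return name; }   // always safe to call on any Person
}

class Directory {
    // The signature is honest: a lookup may find nobody, and the type says so.
    static Optional<Person> find(String name) {
        return "Ada".equals(name)
            ? Optional.of(new Person("Ada"))
            : Optional.empty();
    }
}
```

The caller is then forced to deal with the empty case explicitly instead of discovering it via NullPointerException later.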
Avoiding null is one of those things the FA complains about — you'll make your language three times as complicated to avoid it, and that might not be a good trade-off.
It's not really about the implementation of Java (might be bad, I don't know).
It is the specification.
- People talked about null being an issue, and that is a big one.
- The entire idea of OOP extremism Java implemented was a mistake - though just a consequence of the time it was born in. Much has been written about this topic by many people.
- Lacking facilities and, really, design for generic programming (also related to the OOP extremism and null issues)
There's much more you can find with Google or any LLM.
> If my computer goes to sleep, WSL becomes unresponsive. I have to save all my stuff and reboot to continue working.
Try wsl --shutdown. Works for me when WSL hangs for no apparent reason.
I've also noticed that, in my case, these hangs are somehow tied to Docker for Windows. Couldn't figure out what triggers them so far, though. I just restart DFW and kill WSL when that happens.
Restarting the vmcompute service sometimes helps. Doing so completely blue/blackscreened my machine this morning so it just makes me more confident in WSL's low level hooks.