Europe is not less innovative. Much of the advanced machinery that makes everything else originates and gets perfected in Europe.
The EU is also far more efficient at making citizen-friendly laws.

The USA just likes to splurge unnecessary amounts of money and call that "innovation" where there isn't any. They can do that because they have lots of money and an effectively infinite debt limit thanks to the US dollar's special status. This also makes everything more expensive for other players in the world. Remove the special status and see how the world changes.
(4) Or, just as the law states, you can update the regulation accordingly. There are avenues for updates built into the law itself. The connector just needs to be modern enough. At the same time, I do believe thermodynamics itself will be the limit, and 120 W is more than enough for any phone.
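For a sense of scale, some back-of-the-envelope arithmetic (ideal, lossless charging; the battery size is an assumed ballpark figure, not from this thread):

```python
# Rough check: how long 120 W would take to fill a typical phone battery,
# assuming no conversion or thermal losses (real charging tapers off).
battery_wh = 20          # ~5000 mAh at ~3.85 V nominal is roughly 19 Wh
charger_w = 120
minutes = battery_wh / charger_w * 60
print(round(minutes))    # -> 10
```

In practice, thermals force charge curves to taper well before the pack is full, which is why going beyond 120 W buys little for a phone-sized battery.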
Apple Silicon had a process node advantage over the Core 100 and 200 series thanks to a better allocation with TSMC.
Now Intel's process node is also state of the art and on par with TSMC's 2 nm, so the chips should be more or less equivalent, with the remaining differences coming down to which set of compromises each vendor makes in chip design.
Phones cannot ship with non-updated software due to another EU regulation, the Cyber Resilience Act: devices must be supported for at least 5 years, starting from December 2027.
How sure are you that the vendor is actually providing device-specific updates? What about firmware updates? Outside the x86 ecosystem, whole-device-family support in the mainline Linux kernel is rare. You're probably deceiving yourself if you believe your devices are up to date.
Most of the time, laptops and many "mainline-friendly" phones stop receiving firmware updates after 2 years. By "firmware" I mean the binary blobs for the peripherals. Most SoCs share unified memory between the LTE modem and the CPU, so if a vulnerability is found in the LTE module's firmware, it can be used for data extraction.
The CRA puts hard requirements on documenting and fixing vulnerabilities in device software over a 5-year period. It cannot be an infinite number of years, so a reasonable update period had to be chosen. It also covers everything provided by the vendor itself, so if there are vulnerabilities in the firmware, they have to fix them, unlike the current situation.
> How sure are you that the vendor is actually providing device-specific updates?
First of all, my phone runs an FSF-endorsed operating system, so there are no closed drivers. Granted, not everything has been upstreamed yet, but they're working on it and I trust it will be done soon. (They have already done it with the devkit.)
Second, my phone has a removable modem and a removable WiFi card (no unified memory), so when the firmware can no longer be updated, the card itself can be replaced. (They have actually already done this by releasing a new WiFi card; a 5G modem is also being tested.) In the worst case, the device can still be used as a pocket computer with no wireless communications.
Conda does not solve the problems of deployment, and it offers no reproducibility guarantees. That's not surprising, considering how Conda binaries are built.
That's why I emphasized Pixi. With Pixi you get a per-platform lockfile that guarantees installation of the exact versions.
If what you want is to deploy a server or a development environment, you already get that with Pixi. If you want a Windows installer with bundled DLLs, you don't get one; however, that was never the goal.
Actually, no. I use it to manage more and more non-Python dependencies, like the Protobuf compiler and LLVM tooling.
I am an embedded developer, and we don't use Python for the main project; it's just for scripting. Pixi doesn't get rid of everything, but it makes developer environment setup so much easier.
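To make the Pixi point concrete, a hypothetical manifest sketch (project name, versions, and package picks are illustrative, not from this thread). Because the packages come from conda-forge, non-Python tools can be pinned alongside scripting dependencies, and `pixi install` writes a per-platform lockfile (`pixi.lock`) from it:

```toml
# pixi.toml -- hypothetical sketch for an embedded project's tooling
[project]
name = "fw-tools"
channels = ["conda-forge"]
platforms = ["linux-64", "win-64", "osx-arm64"]

[dependencies]
python = "3.12.*"    # only used for scripting
protobuf = "*"       # protoc, a non-Python build dependency
clang = "*"          # LLVM tooling
```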
When you have switches that link two nodes together for only the duration of a one-way transmission, you don't need CSMA/CD. We literally have no use for it: we will never have two computers transmitting onto the same Ethernet wire anymore.
WiFi is different, of course. However, as the author wrote, your WiFi devices always go through the access point, where they use 802.11 RTS/CTS messages to request and receive permission to send packets. All nodes can see the CTS being broadcast, so they know somebody is sending something. So even CSMA/CA is becoming less useful.
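For anyone unfamiliar with what CSMA/CA actually does when it kicks in: it reduces to random backoff over a contention window that doubles on each failed attempt. A minimal sketch, assuming typical 802.11 OFDM-PHY parameters (cw_min = 15, cw_max = 1023, 9 µs slots):

```python
import random

def csma_ca_backoff(attempt, cw_min=15, cw_max=1023, slot_us=9):
    """Pick a random backoff delay (in microseconds) for a given retry,
    using 802.11-style binary exponential backoff: the contention
    window doubles on every retry, capped at cw_max."""
    cw = min(cw_max, (cw_min + 1) * 2 ** attempt - 1)
    return random.randint(0, cw) * slot_us

# Later retries draw from a wider window, spreading stations out in time.
for attempt in range(4):
    print(attempt, csma_ca_backoff(attempt))
```

The point upthread is that RTS/CTS (and, on wired links, full-duplex switching) means this lottery rarely has to resolve a real collision anymore.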
Yes, I'm only talking about WiFi networks. I get that CSMA/CD itself is becoming less useful, but that's because something else is doing its job, not because what it did is useless (that's why I wrote "or something similar" when I asked). WiFi is still, necessarily, a common bus where everyone talks.
CSMA/CD is Carrier Sense Multiple Access with Collision Detection; CSMA/CA is the Collision Avoidance variant. FYI, the article is from 2017!
For non-WiFi, we don't use CD because everything is bidirectional and every connection has its own lane; it's not needed because there will never be a collision. This holds down to the port level on the switches: the algorithm might still be there, but there's no use for it.
For WiFi, CD can never work, because "detecting" a collision after the fact is pointless on radio; we need to "avoid" instead, so CA has a real function there, since the medium is a shared lane. CA used to be a necessity, but now, in 2026, we truly don't need or use it as much: WiFi (802.11) now functions more like a switch. With OFDMA and RF beam steering at the PHY (physical) level, the radio side cancels out signals from other nearby devices, and we effectively "create" bidirectional lanes that behave much like switch ports.
The article is good and represents how the IETF operates, an (opinionated) view of what happens inside. We actually need an IETF equivalent for AI. It's genuinely good and a meritocracy, even though of late the big companies try to corrupt it or get their way; academia is still the driver and steers it, and all votes count when working groups self-organize. (My last IETF was in 2018, so I'm not sure how it is now in the 2020s.)
> freaking memory is 77% full when freaking Windows 11 starts up.
Unused memory is wasted memory. That 77% is basically caches + private process memory + shared memory. Unless you are comparing private committed working sets, you usually have no idea of the actual usage. .NET apps and browsers often allocate overcommitted memory to avoid making system calls.
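To illustrate why "allocated" is not "used", a small sketch (Linux/macOS anonymous-mapping semantics; Windows commit-charge accounting differs in detail, but the lazy-backing idea is the same):

```python
import mmap

# Reserve 1 GiB of anonymous memory. The OS only hands out address
# space here; physical pages are supplied lazily, on first touch.
GIB = 1 << 30
buf = mmap.mmap(-1, GIB)

# Untouched pages read back as zeros without ever having been written,
# courtesy of zero-fill-on-demand:
assert buf[GIB - 1] == 0

# Touch a single byte; only that one page gets backed by RAM.
buf[0] = 1
buf.close()
```

A task-manager-style "memory in use" number often counts reservations like this, which is exactly why it overstates what the apps actually need.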
I get it, using browsers for ToDo apps is slow, however measuring their impact is harder than you think. At the same time the best x-platform UI framework is the browser. Qt comes next but it lacks man-decade amount of fixes/polishing to match native font support and text rendering, media handling, accessibility support, hw acceleration and memory pressure behaviors of Skia and Chromium.
> Unused memory is wasted memory. 77% is basically caches + private process memory + shared memory.
In simplified overviews, Windows counts file-system caches (standby memory) as free (or rather, available) memory, so if 77% of 32 GB is to be taken literally, it still sounds rather on the high side.
Why does my computer freeze and become unusable when RAM hits 90%, then? That myth is complete nonsense. Free RAM is like a seatbelt or a crumple zone: it serves as a buffer between the user and crashes, and will hopefully never be tested in use.
Typst is to LaTeX what Rust is to C++: saner syntax, well-thought-out extensibility (including scripting and macros), tables that are sane, and a package manager. I'm happy I switched to it for documentation purposes. I'm looking forward to compiling web pages with it too.
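For a taste of what "tables that are sane" means, a small Typst sketch (the table contents and function name are illustrative):

```typst
// A table is just cells listed in reading order: no &, \\ or column specs.
#table(
  columns: 3,
  [*Name*], [*Unit*], [*Value*],
  [Voltage], [V], [3.3],
  [Current], [mA], [150],
)

// Scripting is built in: define a function, then call it from markup.
#let watts(v, ma) = v * ma / 1000
Power draw: #watts(3.3, 150) W
```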
Congrats on discovering what "thinking" models do internally. That's how they work: they generate "thinking" lines and feed them back to themselves on top of your prompt. There is no way of separating the two.
I am aware. That is not what the person above was suggesting, nor what I was.
Things generally exist without an LLM receiving and maintaining a representation about them.
If no provenance information or message separation is currently being emitted into the context window by tooling (the latter of which I'd be surprised by), and the models are not trained to attend to it, then what I'm suggesting is that these could be inserted and the models could be tuned so that this is mitigated.
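A toy sketch of the idea (the delimiter format and source names here are entirely made up; a real system would need tokens the model was actually trained on):

```python
# Hypothetical provenance wrapping: each context segment is labeled with
# where it came from, so a model tuned on such tags could weigh its own
# earlier "thinking" differently from user input or tool output.
def tag(source: str, text: str) -> str:
    return f"<|src={source}|>{text}<|end|>"

context = "".join([
    tag("user", "Summarize the report."),
    tag("model_thinking", "The report has three sections; start with..."),
    tag("tool_output", "report.txt: 1200 words"),
])
print(context)
```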
What I'm also suggesting is that the above person's snark-laden account of thinking mode, and of how resolvable this issue is, is thus false.