It's not just LLMs; people do that too. LLMs are trained on real writing from scientific, academic, and technology journals, as well as web pages and social media posts.
I've met genuinely hard-working people who have had to change their writing style because LLMs mimic it so heavily that it now takes extra effort to avoid being accused of cheating.
You can use tools to estimate the likelihood that text was written by an LLM, but as mentioned, they produce lots of false positives in fields that contributed heavily to the training data.
Your best option is to keep track of writers and journalists whose styles you've appreciated and follow them on whatever platforms they write for. Journalists often publish in more than one media outlet, and self-publishing platforms like Substack are growing quickly.
Even the slower ones are more like a Wii U, which is perfectly capable of everything a set-top box needs to do. Really, the hardware acceleration does all of the heavy lifting, and the processor only needs to render text and coordinate what to composite.
It's the bloat of the software layer on top that's slowing things down.
A 1st-generation Chromecast only has 512 MB of RAM and a dual-core 1.2 GHz processor, and it can handle video streaming just fine. Building an interface on top of that doesn't take a lot of resources, if the underlying layers aren't bloated. With current Android/iOS development, they very much are.
Russia is years behind on this: western European countries have long maligned Durov for refusing to give governments open access to user communications on Telegram.
I bought a PinePhone with a keyboard case, and it's a great form factor, but the implementation is really bad, with the keys immediately binding if pressed at even the slightest angle.
A proper implementation could make phones as usable as laptops.
It's a double whammy, because at cold temperatures the total capacity of the battery is reduced, and now they're only using 75% of that.
Also, recalls are much, much more common with lithium cobalt batteries than with other battery technologies, an issue inherent to that specific chemistry.
Lithium cobalt batteries have great specific power and energy at a reasonable price per watt-hour, and because the chemistry has been around a long time, it's easy to manufacture. But there are a lot of trade-offs.
Lithium iron phosphate batteries have lower specific power and energy, often making them too heavy for automotive use, but they have a great price per watt-hour. That has driven enough adoption that they're becoming easier to manufacture, which makes them even more affordable, and new variations are improving their specific power and energy.
Public transit buses are very large, don't carry a lot of weight, and need to charge and operate in cold areas, so sodium ion batteries are the best option despite their low specific power and energy. They also theoretically have an even lower price per watt-hour, but adoption is so limited that there's no cost-optimized manufacturing yet. They're exactly what's needed for buses, and they're also great for battery backup in cold climates, but as with lithium iron phosphate, it will take some time for them to become viable.
Early adopters are what make technology happen, but they have to deal with underperformance and high costs. We may have to wait for the technology to mature before it's practical for public infrastructure.
Video encoding and image compression are a huge, common use case, so much so that many chips include dedicated hardware for them. Of course, offloading that work to dedicated accelerators does reduce the use of SIMD instructions, but any time a specific codec or algorithm isn't accelerated, SIMD instructions are absolutely necessary.
Emulators also use them a lot, often in unintended ways, because they're very flexible. That's partly because the emulator itself can use that flexibility to optimize emulation, but also because hand-optimizing with SIMD instructions can significantly improve the performance of any application, which was necessary on the low-performance processors common in video game consoles.
It doesn't need to be logged on to a Google account, and it supports locally storing map data and generating routes, so you could turn on network access, download local maps, block network access, then use it for navigation without it calling home.
As long as copying some numbers printed on a piece of plastic into an online order form is all the authentication a transaction needs, anything more than that is inherently security theater.
That's why most credit card transactions in my country require an extra validation step in the mobile app. It's mostly American websites that don't enable this functionality.
Yes, because we don't want these stupid locked-down apps. Credit cards give buyers many protections, and it's very easy to dispute an illegitimate transaction.
The consumer does not typically pay this directly. It may be passed on to the consumer indirectly through higher prices, but those apply to everyone regardless of payment method. On the contrary, I get cash back on purchases and other rewards.
Because we have anti-fraud consumer protection rules, and CCs operate on a make-money-first basis. The debit networks, on the other hand, are a different story.