hereonout2's comments

I'm curious from the other direction, what are the conversations like if you feel they are easy to move?

Do you have the memory feature disabled? I have the feeling this in particular is doing absolutely loads behind the scenes, e.g. summarising all conversations and adding additional hidden context to every request.

I can start a new chat in the UI right now, ask it what my job is, what my current project is, how many kids I have, what car I drive etc. It'll know the answer already.

I think it's this conversation history - or maybe better yet if we think of it as this "relationship" - that people are saying is going to make it hard to move.


I ask for code snippets, occasional recipes, translations... I don't have memory enabled. I start a new chat for each question. At times I ask things in different languages, if the question is tied to culture or location. If I notice I asked the wrong question, I start a new session instead of continuing the old one, so it doesn't try to merge the questions somehow.

I don't see any benefit in it knowing anything about me. Instead I'm usually quite vague to avoid biased answers.


This is not the case.

I use OpenAI a lot on the paid plan via the UI. It now knows absolutely loads about me and seems to have a massive amount of cross conversational memory. It's really getting very close to what you'd expect from a human conversation in this regard.

Sure the model itself is still stateless, and if you use the API then what you say is true.

But they are doing so much unseen summarisation and longer-context building behind the scenes in the webapp; what you see in the current conversation history is just a fraction of what gets sent to the model.


> It now knows absolutely loads about me

Baffled that someone tech-literate would be boasting about this in the year 2026. I mean, you do you, we all have different priorities and threat models, but this is the furthest from what I would personally want.


It's not boasting, I'm not sure why what I wrote would come across that way. I'm describing how I use a product and the functionality it presents to me.

But yes, it's an emerging area and I am questioning if I am sharing too much with it. I 100% would not want my chat histories exposed.

Saying that though, Facebook can read my highly personal messages, Google every email, my phone is tracking my every move, I have to sign up for random janky websites for my kids' school where their medical info is stored, etc.

LLM chat history presents a new risk and a different set of data, but it's a crowded minefield already.


This is the same as when Google got big (and Facebook, etc...). We have some privacy focused competitors (Kagi, etc...) but most people are quite happy to just give Google (and worse, Facebook) everything.

AI is just a new technology but this has been ramping up for decades now.


Others getting nostalgic over the Xbox 360 reminds me how old I am!

Now for an additional kicker: the nostalgia here is for the NXE, which itself famously displaced the original Blades dashboard.

I loved the Blades dashboard. Something about idly pressing the shoulder buttons to flip through the blades while talking to my friend with that goofy wireless "Xbox communicator" on my ear.

Blades was better than the redesign

Yeah it was. I hated the NXE so much. It was both harder to use and slower than the original UI. It looked prettier but that was it.

Best Xbox console. It had pretty good games. Sad they were unable to keep that momentum going and are basically nope’ing from the console business altogether now.

Late night uno sessions were a lot of fun. Not everyone had a camera so voice chat was "off the chain" as they used to say

I just bought this the other day, https://www.retro-gamer.de/shop/heft/retro-gamer-2-26-einzel...

The Xbox 360 is now considered a retro gaming device. That was such a reminder of how old I am now, given that my first home computer was a Timex 2068.


I was able to pull together a Halo 3 LAN party last year, although the "consoles" were Linux PCs and the game was the MCC edition (60 fps instead of 30). Split-screen was resurrected via mods. I bought a Microsoft gamepad receiver to use original Xbox 360 controllers under Linux. Some people insisted they get to play on the original gamepad (otherwise it was a mixed bag of PlayStation and newer Xbox/PC controllers). I also realised that Halo 3 itself would have been old enough to drink with us!

The Xbox 360 is about as old now as the NES was when the Xbox 360 came out.

Yeah, and that is why some of us feel rather old. :)

I still remember when all that Nintendo had were Game & Watch handhelds, before NES came to be.

https://en.wikipedia.org/wiki/List_of_Game_%26_Watch_games


> my first home computer was a Timex 2068.

I don't know if the Altair 8800 would count as my first home computer, as I was too young to really understand what it was and mostly just liked to play with the paper tape feed on the Teletype attached to it. By the time we got the PET 2001, I was old enough to actually use it as intended.


I still have it in a box with all its games

I still love its controller design.


I was playing about with ChatGPT the other day, uploading screenshots of sheet music and asking it to convert them to ABC notation so I could make a MIDI file.

The results seemed impressive until I noticed some of the "Thinking" statements in the UI.

One made it apparent that the model / agent / whatever had read the title from the screenshot and was off searching for existing ABC transcriptions of the piece, Ode to Joy.

So the whole thing was far less impressive after that: it wasn't reading the score anymore, just reading the title and using the internet to answer my query.


Yes, I have found that Grok, for example, suddenly becomes quite sane when you tell it to stop querying the internet and just rethink the conversation data and answer the question.

It's weird, it's like many agents are now in a phase of constantly getting more information and never just thinking with what they've got.


But isn't this what we wanted? We complained so much that LLMs use deprecated or outdated APIs instead of current versions because they relied so heavily on what they remembered.


To be clear, what I mean is that Grok will query 30 pages, answer your question vaguely or wrongly, ask for clarification of what you meant, and then go and re-query everything again... I can imagine why it might need to revisit pages, and it might be a UI thing, but it still feels like it doesn't activate its "think with what you've got" mode until you yell at it to stop searching and just summarise.

I guess we could call this "gather, then do your best conditional on what you've found right now".


2010's: Google Search is making humans who constantly rely on it dumber

2020's: LLMs are making humans who constantly rely on them dumber

2026: Google Search is making LLMs who constantly rely on it dumber


Touché, that is what we humans are doing to some degree as well.


Sounds pretty human like! Always searching for a shortcut


It sounds like it's lying and making stuff up, something everybody seems to be okay with when using LLMs.


I'm not sure why... you want the LLM to solve problems, not come up with answers itself. It's allowed to use tools precisely because it tends to make stuff up. In general, only if you're benchmarking LLMs do you care whether the LLM itself provided the answer or used a tool. If you ask it to convert the notation of sheet music, it might use a tool, and that's probably the right decision.


The shortcut is fine if it's a bog-standard canonical arrangement of the piece. If it's a custom jazz rendition you composed with odd key changes and shifting time signatures, taking that shortcut is not going to yield the intended result. It's choosing the wrong tool for the job, which makes it unreliable for this task.


For structured outputs like that, wouldn't it be better to get the LLM to create a script that performs the translation repeatably?


I went to Lidl UK's first walk-out shop a few weeks ago. You get the bill and receipt about 40 minutes after you've left.

It certainly felt like it could have been sent off to a lower paid country for a human to tot up.

Also consider you're in the store for, what, 10 minutes? That's a lot of video processing, presumably using state-of-the-art CV models. It's quite possibly cheaper to pay a human than to rent the H100s to do it.


I don't get this kind of indignation against anything shell-related.


I often favour low-maintenance, low-overhead solutions. Most recently I made a stupidly large static website with over 50k items (i.e. pages).

I think a lot of people would have used a database at this point, but the site didn't need to be updated once built so serving a load of static files via S3 makes ongoing maintenance very low.
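For what it's worth, the whole "build once, then nothing to maintain" pipeline can be a few lines of shell. This is only a sketch: the item names, bucket name, and page template are invented for illustration, and the actual build would loop over 50k real records.

```shell
# Generate one static HTML page per item, then do a one-off upload.
# Item names and bucket are made up for this demo.
mkdir -p site
for item in widget-001 widget-002 widget-003; do
  cat > "site/${item}.html" <<EOF
<html><body><h1>${item}</h1></body></html>
EOF
done

# After this single sync there is nothing to run or patch server-side
# (commented out here because it assumes a hypothetical bucket):
# aws s3 sync site/ s3://example-static-site-bucket/ --delete
```

Once the files are in the bucket, serving them is S3's problem, which is the whole appeal.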

I also feel a slight sense of superiority when I see colleagues write a load of pandas scripts to generate some basic summary stats vs my usual throwaway approach based around awk.
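As a flavour of that throwaway approach, here is a minimal awk one-pass over a made-up two-column CSV (file name and data are invented) that prints the count, mean, min, and max, no pandas or virtualenv required:

```shell
# Throwaway summary stats with awk: count, mean, min, max of column 2.
cat > sales.csv <<EOF
widget,12
gadget,7
sprocket,23
EOF

awk -F, '
  NR == 1 || $2 < min { min = $2 }   # track smallest value seen
  NR == 1 || $2 > max { max = $2 }   # track largest value seen
  { sum += $2; n++ }                 # running total and row count
  END { printf "n=%d mean=%.2f min=%d max=%d\n", n, sum/n, min, max }
' sales.csv
# prints: n=3 mean=14.00 min=7 max=23
```

The whole thing is a single pass over the file and works on anything with awk installed.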


It's because phone speakers aren't loud enough to be audible over the sound of the tube itself!

It is noticeable on buses and the Overground when people play things out loud, but to be honest it's quite rare in the grand scheme of things.


That's true. I made several complaints about that to TfL before capitulating and just settling for noise-cancelling headphones.

Never been happier.

The clincher was noticing that the drivers themselves have access to ear defenders... TfL said that's because they're down there for extended periods of time. Sounds reasonable, but I'm not buying it as an excuse for not fixing the issue while my ears are exposed to the worst bits of the tube.

They also have the ancillary benefit of blocking out those rare times (for me) when people do have their phone on speaker or are having a chat I'm uninterested in.


Welcome to the UK, where citizens are so apathetic they don't care about ageing infrastructure or government money being siphoned away.


I'm not really a big gamer but was looking into buying an Xbox again. I already had a controller and thought, why not try Xbox Cloud Gaming on my Samsung TV?

With a decent internet connection I now struggle to see why anyone would want to buy a hardware Xbox. Games on the cloud version load instantly, play brilliantly and cost the same as the usual Game Pass as far as I can tell. The catalogue seems smaller maybe but aside from that I see little downside.

I could see it working well for PCs too, as long as the terminal device is seamless. I guess us devs have been renting computers in "the cloud" for decades anyway.


> I could see it working well for PCs too

I moonlight in film restoration. One 2 hr movie out of our scanner is easily 16 TiB or more depending on the settings we scanned with.

Getting this uploaded to a remote server would take ~39 hr over a fully saturated 1 GbE pipe.
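The ~39 hr figure checks out with back-of-envelope shell arithmetic (assuming a perfectly saturated link with no protocol overhead):

```shell
# 16 TiB expressed in bits, pushed over a ~10^9 bit/s (1 GbE) link.
bits=$((16 * 1024 * 1024 * 1024 * 1024 * 8))
seconds=$((bits / 1000000000))
echo "$((seconds / 3600)) hours"
# prints: 39 hours
```

In practice overheads push it even higher, so multi-day uploads per film are the realistic baseline.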


Clearly one use case where it wouldn't work.

On the other hand, I'm a software engineer and my incredibly powerful MacBook could be not much more than a fancy dumb terminal; to be honest, it almost is already.

If I can play a very responsive multiplayer game of the latest Call of Duty on my $300 TV with a little ARM chip in it, then I could well imagine doing my job on a cloud Mac if the terminal device looked and felt like a MacBook but had the same tiny CPU my TV has.

Not sure if I'd choose it as a personal device, but for corporations it seems a no-brainer.


I dunno, I bought a Pi 500+ with an SSD, 16GB RAM, a little screen, PSU, mouse and cables. It was around £300.

It's not super powerful, but my young kids use it to surf the net, play Minecraft, do art projects, etc. (we are yet to play with the GPIO).

I don't get on with the keyboard, but otherwise it would make a decent development machine for me, considering my development starts with me ssh'ing into some remote VM and running vim.

The whole lot is tiny and extremely portable; we pack it away in a drawer when not in use.

All in it felt like good value for money for something that took about 3 minutes to get up and running.


You can get much more powerful PCs for much less, e.g.:

https://www.amazon.co.uk/dp/B0CFPRDQY8/


That's actually about the same price as the Pi 500+ without the screen. Except that one has a 500 GB vs 256 GB SSD, but doesn't have the snazzy LED keyboard.

Here's a processor comparison too:

https://www.cpu-monkey.com/en/compare_cpu-raspberry_pi_5_b_b...

