HN | new | past | comments | ask | show | jobs | submit | login

I'm genuinely concerned about the WebGPU attack vector. The possibilities are exciting, but we (everyone) have virtually no experience securing it (compared to decades of securing x86 - which we still can't pull off). My biggest concern is fingerprinting.


Somehow I can’t resign myself to this brave new world of web apps with low level hardware access. I do not want web apps doing GPGPU work on my machine. If the browser engine implements high level functionality that way, fine, but I don’t want arbitrary websites using low level hardware directly.

They were so preoccupied with whether they could, they never stopped to consider whether they should.


I feel like fingerprinting is inevitable with any hardware access, including WebGL or WebGPU. It’s one of my big concerns about Chrome exposing more and more of the hardware it runs on, with the goal of becoming a Web-based OS.

That said, fingerprinting is not as big a risk as what I was thinking of, which is one process being able to peer into another’s memory on the GPU. There are various takes on isolation on the GPU, but they tend to have strong caveats attached.


Fingerprinting is probably inevitable if it is enabled by default, given that game code itself relies on the exact device model to work around GPU implementation bugs. GPU compatibility has always been a shit show that relies on all sorts of device-specific workarounds. You may spoof the device, but don't assume everything will work perfectly for any moderate-to-large program.


Is this substantially different from, say, containers with GPU access?

Lots of computation is moving to GPUs.



