
I follow your thinking. The ad networks could begin transforming their content in subtle ways that the eye cannot detect but that throw off simple pattern matching, thereby forcing the blocker client to employ an algorithmic approach.

see https://en.wikipedia.org/wiki/BPCS-Steganography
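To make the point concrete, here is a minimal sketch (the pixel data and the single-bit mutation are hypothetical stand-ins) of why exact-match blocking breaks under this kind of mutation. Flipping one low-order bit, in the spirit of BPCS-style bit-plane embedding, is imperceptible but changes the content's hash completely:

```python
import hashlib

# Stand-in for an ad image's raw pixel bytes.
ad = bytes(range(64))

# Flip the least-significant bit of one pixel -- invisible to the eye.
mutated = bytearray(ad)
mutated[0] ^= 0x01

# Hash-based matching now misses the mutated copy entirely.
print(hashlib.sha256(ad).hexdigest() == hashlib.sha256(bytes(mutated)).hexdigest())
# -> False
```

Any fingerprint scheme based on exact bytes fails the same way, which is what pushes the blocker toward fuzzier, more expensive detection.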

The networks would be able to utilize many cores in parallel to mutate the content, but the blocker client would have to run its detection on mobile devices.

The more processing the blocker client must perform, the more burden it places on that relatively underpowered mobile CPU, which increases latency. Also, once you get into algorithmic detection you have to start considering the false-positive rate, because any noticeable false positives will break the user's expectations and eventually erode their confidence in the client's utility.
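That tradeoff can be sketched with a toy perceptual hash (this is a generic average-hash, not any particular blocker's algorithm; the pixel arrays and threshold are made up for illustration). A looser match threshold survives imperceptible mutations, but every notch looser also widens the net for false positives on unrelated content:

```python
def ahash(pixels):
    """Average hash over a flat list of 0-255 grayscale values."""
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hash bit-lists."""
    return sum(x != y for x, y in zip(a, b))

known_ad  = [10] * 32 + [200] * 32   # fingerprint of a known ad
mutated   = [11] * 32 + [199] * 32   # imperceptibly mutated copy
unrelated = [10, 200] * 32           # different content entirely

THRESHOLD = 8  # looser = catches more mutations, more false positives

print(hamming(ahash(known_ad), ahash(mutated)) <= THRESHOLD)    # True: still caught
print(hamming(ahash(known_ad), ahash(unrelated)) <= THRESHOLD)  # False: not flagged
```

Each extra bit of tolerance costs comparison work on-device and shrinks the margin between "mutated ad" and "innocent image", which is exactly the latency-versus-false-positive squeeze described above.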

As you rightly point out, though, there may be vectors for either side that neither of us has thought of.

