Hacker News | past | comments | ask | show | jobs | submit | markbeare's comments

I work for a cybersecurity company, and I don't think the method they used to check these links against the mentioned security companies reflects how those products actually detect threats. Many of these companies surely don't have these domains in their databases of known bad sites, but if you were running their products and visited the sites, heuristic detection would likely have flagged them.


I would have expected at least VirusTotal to flag them if that were the case. It does more than look URLs up in a database of known malicious sites, and I think the reputation of the domains is the key factor here.

https://www.virustotal.com/gui/url/6dd23e90ee436e1ff066725aa...

> BitDefender - government

> Sophos - government

> Forcepoint ThreatSeeker - government

- https://docs.virustotal.com/docs/how-it-works
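For anyone who wants to reproduce this kind of lookup, here is a minimal sketch of querying VirusTotal's v3 API for an existing URL report. Per the VT docs, a URL's identifier is the URL itself, base64url-encoded with the padding stripped; the example URL and the API-key placeholder are illustrative, not taken from the thread.

```python
import base64

def vt_url_id(url: str) -> str:
    """Return the VirusTotal v3 identifier for a URL report lookup:
    the base64url-encoded URL with '=' padding removed."""
    return base64.urlsafe_b64encode(url.encode()).decode().rstrip("=")

# A GET request to
#   https://www.virustotal.com/api/v3/urls/{vt_url_id(url)}
# with an "x-apikey: <your key>" header returns the stored analysis,
# including per-engine categories like the "government" labels quoted above.
print(vt_url_id("http://example.com"))  # identifier used in the request path
```

Note that this endpoint only fetches an existing report; submitting a URL for fresh analysis is a separate POST call.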


This probably relates to the Bing Search API. A lot of these alternative search engines are using Bing for the organic results.


This has got to be somewhat interesting to Bing, Google and Brave?


Haha totally! That’s the true test


Agreed with the other comments here. As an engineering manager, I would love to hear this. I fear the work becoming stale for my team, so I push growth opportunities to keep things interesting.


Exactly


I feel like this is a bit of a controversial topic. With better planning, an architectural specialist is less important, and people with domain knowledge of the product itself are more valuable. Thoughts?


This is a little easier to pull off when your product is less than a year old. Try doing it with a product that is 10-15 years old.


100%

