
Poul-Henning Kamp (the author of Varnish) explained it best. http://queue.acm.org/detail.cfm?id=2716278


Do you have another explanation which hasn't been torn apart by Hacker News commentators already? As someone who would like to avoid going to the trouble to implement HTTP2, I'm not being sarcastic - I truly want to know why HTTP2 is so bad, and I want to read it from someone who backs up their words with cold, hard facts.


Have a look at the IETF HTTP WG mailing list (archived at w3.org). PHK and others voiced all of their objections there in detail over a long period, and you can also read the rest of the WG's responses: https://www.w3.org/Search/Mail/Public/advanced_search?keywor... (all of PHK's posts to the list, in chronological order).


This, for example, happened in the HTTP working group earlier today; the whole thing was rushed, and the known flaws just keep adding up. https://lists.w3.org/Archives/Public/ietf-http-wg/2015JanMar...


I followed the thread a little further, and the response seemed reasonable: experiments were run on this idea, it didn't seem to help, and nobody (including phk) has offered any further evidence to the contrary since then.

I think you need to expand a little more on why you feel this is a "known flaw".


A 60-page document to explain how to compress text name/value pairs that are mostly unchanged between requests is "reasonable"?

A static compression table with proxy authentication fields in it, as if the network between your browser and LAN proxy is somehow the bottleneck, is reasonable?

This is a most over-engineered protocol, one where nobody knows how the tables were constructed or from what data, and nobody can quantify what the benefits of its features will really be. For instance how much is saved by using a huffman encoding instead of a simple LZ encoding or simple store/recall or not compressing it? Nobody knows! Google did some magic research in private and decided on the most complicated compression method, so therefore HTTP/2 must use it.
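
For what it's worth, the kind of number being asked for is easy to sketch, at least for the raw-vs-DEFLATE part of the question (DEFLATE is what SPDY used before HPACK). The header values below are invented, and this says nothing about HPACK's static Huffman table, which would need the code from the spec's Appendix B:

  # Rough measurement sketch: how much a typical header block shrinks
  # under DEFLATE with a shared context versus sending it raw.
  # Header values are invented for illustration.
  import zlib

  headers = (
      b"GET /index.html HTTP/1.1\r\n"
      b"Host: example.com\r\n"
      b"User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n"
      b"Accept: text/html,application/xhtml+xml\r\n"
      b"Accept-Encoding: gzip, deflate\r\n"
      b"Cookie: session=0123456789abcdef\r\n\r\n"
  )

  # A second, nearly identical request -- the redundancy HPACK targets.
  headers2 = headers.replace(b"/index.html", b"/style.css")

  co = zlib.compressobj()
  first = co.compress(headers) + co.flush(zlib.Z_SYNC_FLUSH)
  second = co.compress(headers2) + co.flush(zlib.Z_SYNC_FLUSH)

  print("raw:", len(headers) + len(headers2), "bytes")
  print("deflate, shared context:", len(first) + len(second), "bytes")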

This HTTP/2 process is insane.


> For instance how much is saved by using a huffman encoding instead of a simple LZ encoding or simple store/recall or not compressing it? Nobody knows!

It's not all about compression ratios. From the HPACK spec (https://http2.github.io/http2-spec/compression.html#rfc.sect...):

  SPDY [SPDY] initially addressed this redundancy by 
  compressing header fields using the DEFLATE [DEFLATE]
  format, which proved very effective at efficiently 
  representing the redundant header fields. However, that
  approach exposed a security risk as demonstrated by
  the CRIME attack (see [CRIME]).

  This specification defines HPACK, a new compressor for
  header fields which eliminates redundant header fields, 
  limits vulnerability to known security attacks, and which
  has a bounded memory requirement for use in constrained 
  environments. Potential security concerns for HPACK are
  described in Section 7.
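
To make the CRIME reference concrete: the problem with DEFLATE here is a length side channel. When attacker-controlled bytes are compressed in the same context as a secret header, a correct guess of the secret compresses better, and the attacker can see that in the message size. A minimal sketch of the idea (the cookie value is invented):

  # CRIME-style length oracle against DEFLATE-compressed headers:
  # a correct guess back-references the secret and compresses better.
  import zlib

  secret = b"Cookie: session=s3cr3t\r\n"

  def observed_length(guess: bytes) -> int:
      # Attacker-controlled bytes and the secret share one DEFLATE stream.
      return len(zlib.compress(b"Cookie: session=" + guess + b"\r\n" + secret))

  print(observed_length(b"s3cr3t"))  # typically shorter...
  print(observed_length(b"qwerty"))  # ...than a wrong guess

HPACK's answer is to drop general-purpose LZ matching entirely: each header field is either indexed whole or sent as a literal, so a partial guess can't shorten anything.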


The Huffman code is optional and the spec says not to use it on sensitive fields (no mention of what those are). You could use a separate LZ on each header field, compressing really long headers without letting information leak between headers and the bodies. You think they tested that? Or did the people who didn't know about CRIME in the first place just react? Where are the numbers showing that a static huffman code, one created from some unknown dataset at one point in time that can't be extended because there's only a single "compress or huffman" bit in the protocol, is what's needed? Like I said, insane.
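
For reference, the "single bit" here is the H flag on HPACK string literals (section 5.2 of the spec): one leading octet carries the Huffman flag plus a 7-bit prefix-coded length, then the string octets follow, so there is no spare code point for signalling a different scheme such as per-field LZ. A sketch of that framing (the Huffman transform itself, which needs the table from the spec's Appendix B, is omitted):

  # HPACK string literal framing (HPACK spec, section 5.2): the top bit of
  # the first octet is the Huffman flag H, the low 7 bits start a
  # prefix-coded length, and the (possibly Huffman-coded) octets follow.

  def encode_string_literal(data: bytes, huffman: bool = False) -> bytes:
      # huffman=True would also require transforming `data` with the static
      # Huffman code from the spec's Appendix B; that step is omitted here.
      h = 0x80 if huffman else 0x00
      if len(data) < 127:
          return bytes([h | len(data)]) + data
      # 7-bit prefix integer continuation for longer strings.
      out = bytearray([h | 0x7F])
      rest = len(data) - 127
      while rest >= 128:
          out.append((rest % 128) | 0x80)
          rest //= 128
      out.append(rest)
      return bytes(out) + data

  print(encode_string_literal(b"gzip, deflate").hex())
  # -> 0d677a69702c206465666c617465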


"As someone who would like to avoid going to the trouble to implement HTTP2 (...)"

Good luck with that. Google has been incredibly adept at using the leverage of their search engine to make webmasters adhere to best practices. ("mobile friendly" e-mails lately, anyone?) Once HTTP/2 lands as a default in Chrome, Apache, Nginx, and Netty, of course they'll push HTTP/2-friendly sites higher in search results.


I think you meant to say "Google's definition of best practices".


Except that, as HN commenters have mostly pointed out while ripping him apart, his argument is weak at best.

He goes into almost no detail, and where he does, he says things like "likely to increase CO2 consumption"?

Seriously?




