I found this:
https://github.com/wikimedia/mediawiki/commit/4f11b614544be8cb6198fbbef36e90206ed311bf#diff-6a25b7d123e94e4ae53bd62ecb2ebac4d94d85090177605742bcfda53d3c786cR1000-R1003
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/538362
T206283: Failed deferred updates should be queued as jobs if possible (Deadlock from LinksUpdate in WikiPage::updateCategoryCounts)
Commenting out the following lines:

ini_set( 'zlib.output_compression', 0 );
if ( function_exists( 'apache_setenv' ) ) {
	apache_setenv( 'no-gzip', '1' );
}

makes pages load for me again.
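For anyone trying to reproduce this outside Chrome, a rough sketch with curl (the URL is taken from the report below; the exact flags are my assumption, and curl's HTTP/2 stack may not trip the same error Chrome's does):

```shell
# Request the affected page over HTTP/2, accepting gzip, and dump only the
# response headers. Chrome aborts this request with
# net::ERR_HTTP2_PROTOCOL_ERROR 200; curl may or may not surface a
# comparable protocol error, so treat a clean run here as inconclusive.
curl --http2 -sS -D - -o /dev/null \
  -H 'Accept-Encoding: gzip' \
  'https://thegoodplace.wmflabs.org/wiki/Special:CreateAccount' || true
```

Comparing the response headers (Content-Encoding, Content-Length) between this request and one for a page that works may help narrow down what the disabled-compression code path changes.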
So I suspect this is a regression introduced by that change, or an interaction between it and something else around this part of the code.
It seems this has surfaced in a variety of places.
Original description:
When attempting to access a MediaWiki path in Chrome that sets a session cookie like https://thegoodplace.wmflabs.org/wiki/Special:CreateAccount the request fails with the following error:
net::ERR_HTTP2_PROTOCOL_ERROR 200
Accessing other pages like https://thegoodplace.wmflabs.org/wiki/Mars works fine.
This is only a problem in Chrome; it does not happen in Firefox or Safari. It also does not seem to happen locally or in production — the only place it seems to happen is on a Cloud VPS.
Here is a captured network log:
It can be browsed by uploading it to https://netlog-viewer.appspot.com