Python - HTTPS GET behind proxy succeeds with wget + TLSv1, but fails with requests even if the SSL protocol is forced to TLSv1
I'm trying to retrieve a file via HTTPS using requests 2.11.1 with Python 2.7.12 and OpenSSL 1.0.2h (all installed via Anaconda) on MacOS 10.11.6, behind a proxy. According to SSL Labs, the server supports TLS 1.0, 1.1, and 1.2. Moreover, I can retrieve the file with wget (linked against OpenSSL 1.0.2h) if I explicitly set its secure protocol to TLSv1 (but not if I set it to an unsupported protocol such as SSLv2). However, if I try to explicitly set the secure protocol used by requests to TLSv1, TLSv1_1, or TLSv1_2, e.g., as follows,
from requests_toolbelt import SSLAdapter
import requests
import ssl

s = requests.Session()
p = ssl.PROTOCOL_TLSv1
s.mount('https://', SSLAdapter(p))
r = s.get("https://anaconda.org/conda-forge/matplotlib/2.0.0b3/download/osx-64/matplotlib-2.0.0b3-np111py27_5.tar.bz2")
I encounter the following exception:
/Users/lebedov/anaconda2/lib/python2.7/site-packages/requests/adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies)
    495         except (_SSLError, _HTTPError) as e:
    496             if isinstance(e, _SSLError):
--> 497                 raise SSLError(e, request=request)
    498             elif isinstance(e, ReadTimeoutError):
    499                 raise ReadTimeout(e, request=request)

SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_record', 'wrong version number')],)",)
(Less surprisingly, explicitly setting the protocol to SSLv2, SSLv3, or SSLv23 also results in handshake exceptions.) I also don't observe the exception when accessing other sites via HTTPS. Any idea why the connection fails with requests when I force it to use TLSv1?
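For reference, the same protocol pinning can be done without requests_toolbelt by subclassing requests' HTTPAdapter and passing the version down to urllib3 — a sketch, where ForcedTLSAdapter is a made-up name and I pin TLSv1_2 as an example:

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager


class ForcedTLSAdapter(HTTPAdapter):
    # Hypothetical adapter that pins the SSL/TLS protocol version for
    # every connection made through it.
    def __init__(self, ssl_version, **kwargs):
        self.ssl_version = ssl_version
        super(ForcedTLSAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        # Hand the pinned protocol version to urllib3's pool manager.
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize, block=block,
            ssl_version=self.ssl_version, **kwargs)


s = requests.Session()
s.mount('https://', ForcedTLSAdapter(ssl.PROTOCOL_TLSv1_2))
```

This behaves like the SSLAdapter used above, so it reproduces the same handshake failure through the proxy; it is only useful for ruling out the toolbelt itself as the culprit.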
This issue proved to be due to the proxy rejecting HTTP requests with unrecognized User-Agent strings. Explicitly setting the User-Agent header to Mozilla/4.0 (compatible; MSIE 5.5; Windows 98) via a hacked version of cntlm solved the problem.
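For readers without cntlm, the same workaround can be applied from requests itself — a minimal sketch, assuming the proxy only filters on the User-Agent header, that sets the browser-like string on the session:

```python
import requests

# Present a browser-like User-Agent so the proxy passes the request through
# (assumes the proxy filters solely on this header).
s = requests.Session()
s.headers.update(
    {'User-Agent': 'Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)'})
# r = s.get("https://anaconda.org/conda-forge/matplotlib/2.0.0b3/download/osx-64/matplotlib-2.0.0b3-np111py27_5.tar.bz2")
```

Session-level headers are sent with every request made through the session, so the override also applies to redirects.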