One more test needs to be marked as .net #344

Closed
2 tasks done
mcepl opened this issue Mar 21, 2022 · 1 comment · Fixed by #391
mcepl commented Mar 21, 2022

Prerequisites

  • Did you make sure a similar issue didn't exist?
  • Did you update gTTS to the latest? (pip install --upgrade gTTS)

Current Behaviour (steps to reproduce)

When running the test suite in a network-isolated environment (with pytest -k 'not net'), I get:

[   15s] =================================== FAILURES ===================================
[   15s] _______________________________ test_bad_fp_type _______________________________
[   15s]
[   15s] self = <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>
[   15s]
[   15s]     def _new_conn(self):
[   15s]         """Establish a socket connection and set nodelay settings on it.
[   15s]
[   15s]         :return: New socket connection.
[   15s]         """
[   15s]         extra_kw = {}
[   15s]         if self.source_address:
[   15s]             extra_kw["source_address"] = self.source_address
[   15s]
[   15s]         if self.socket_options:
[   15s]             extra_kw["socket_options"] = self.socket_options
[   15s]
[   15s]         try:
[   15s] >           conn = connection.create_connection(
[   15s]                 (self._dns_host, self.port), self.timeout, **extra_kw
[   15s]             )
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connection.py:174:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] address = ('translate.google.com', 443), timeout = None, source_address = None
[   15s] socket_options = [(6, 1, 1)]
[   15s]
[   15s]     def create_connection(
[   15s]         address,
[   15s]         timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
[   15s]         source_address=None,
[   15s]         socket_options=None,
[   15s]     ):
[   15s]         """Connect to *address* and return the socket object.
[   15s]
[   15s]         Convenience function.  Connect to *address* (a 2-tuple ``(host,
[   15s]         port)``) and return the socket object.  Passing the optional
[   15s]         *timeout* parameter will set the timeout on the socket instance
[   15s]         before attempting to connect.  If no *timeout* is supplied, the
[   15s]         global default timeout setting returned by :func:`socket.getdefaulttimeout`
[   15s]         is used.  If *source_address* is set it must be a tuple of (host, port)
[   15s]         for the socket to bind as a source address before making the connection.
[   15s]         An host of '' or port 0 tells the OS to use the default.
[   15s]         """
[   15s]
[   15s]         host, port = address
[   15s]         if host.startswith("["):
[   15s]             host = host.strip("[]")
[   15s]         err = None
[   15s]
[   15s]         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
[   15s]         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
[   15s]         # The original create_connection function always returns all records.
[   15s]         family = allowed_gai_family()
[   15s]
[   15s]         try:
[   15s]             host.encode("idna")
[   15s]         except UnicodeError:
[   15s]             return six.raise_from(
[   15s]                 LocationParseError(u"'%s', label empty or too long" % host), None
[   15s]             )
[   15s]
[   15s] >       for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/util/connection.py:72:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] host = 'translate.google.com', port = 443, family = <AddressFamily.AF_UNSPEC: 0>
[   15s] type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
[   15s]
[   15s]     def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
[   15s]         """Resolve host and port into list of address info entries.
[   15s]
[   15s]         Translate the host/port argument into a sequence of 5-tuples that contain
[   15s]         all the necessary arguments for creating a socket connected to that service.
[   15s]         host is a domain name, a string representation of an IPv4/v6 address or
[   15s]         None. port is a string service name such as 'http', a numeric port number or
[   15s]         None. By passing None as the value of host and port, you can pass NULL to
[   15s]         the underlying C API.
[   15s]
[   15s]         The family, type and proto arguments can be optionally specified in order to
[   15s]         narrow the list of addresses returned. Passing zero as a value for each of
[   15s]         these arguments selects the full range of results.
[   15s]         """
[   15s]         # We override this function since we want to translate the numeric family
[   15s]         # and socket type values to enum constants.
[   15s]         addrlist = []
[   15s] >       for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
[   15s] E       socket.gaierror: [Errno -3] Temporary failure in name resolution
[   15s]
[   15s] /usr/lib64/python3.9/socket.py:954: gaierror
[   15s]
[   15s] During handling of the above exception, another exception occurred:
[   15s]
[   15s] self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fb289ec27c0>
[   15s] method = 'POST', url = '/_/TranslateWebserverUi/data/batchexecute'
[   15s] body = 'f.req=%5B%5B%5B%22jQ1olc%22%2C%22%5B%5C%22test%5C%22%2C%5C%22en%5C%22%2Cnull%2C%5C%22null%5C%22%5D%22%2Cnull%2C%22generic%22%5D%5D%5D&'
[   15s] headers = {'Referer': 'http://translate.google.com/', 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KH....0.2526.106 Safari/537.36', 'Content-Type': 'application/x-www-form-urlencoded;charset=utf-8', 'Content-Length': '134'}
[   15s] retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
[   15s] redirect = False, assert_same_host = False
[   15s] timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
[   15s] release_conn = False, chunked = False, body_pos = None
[   15s] response_kw = {'decode_content': False, 'preload_content': False}
[   15s] parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/_/TranslateWebserverUi/data/batchexecute', query=None, fragment=None)
[   15s] destination_scheme = None, conn = None, release_this_conn = True
[   15s] http_tunnel_required = False, err = None, clean_exit = False
[   15s]
[   15s]     def urlopen(
[   15s]         self,
[   15s]         method,
[   15s]         url,
[   15s]         body=None,
[   15s]         headers=None,
[   15s]         retries=None,
[   15s]         redirect=True,
[   15s]         assert_same_host=True,
[   15s]         timeout=_Default,
[   15s]         pool_timeout=None,
[   15s]         release_conn=None,
[   15s]         chunked=False,
[   15s]         body_pos=None,
[   15s]         **response_kw
[   15s]     ):
[   15s]         """
[   15s]         Get a connection from the pool and perform an HTTP request. This is the
[   15s]         lowest level call for making a request, so you'll need to specify all
[   15s]         the raw details.
[   15s]
[   15s]         .. note::
[   15s]
[   15s]            More commonly, it's appropriate to use a convenience method provided
[   15s]            by :class:`.RequestMethods`, such as :meth:`request`.
[   15s]
[   15s]         .. note::
[   15s]
[   15s]            `release_conn` will only behave as expected if
[   15s]            `preload_content=False` because we want to make
[   15s]            `preload_content=False` the default behaviour someday soon without
[   15s]            breaking backwards compatibility.
[   15s]
[   15s]         :param method:
[   15s]             HTTP request method (such as GET, POST, PUT, etc.)
[   15s]
[   15s]         :param url:
[   15s]             The URL to perform the request on.
[   15s]
[   15s]         :param body:
[   15s]             Data to send in the request body, either :class:`str`, :class:`bytes`,
[   15s]             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
[   15s]
[   15s]         :param headers:
[   15s]             Dictionary of custom headers to send, such as User-Agent,
[   15s]             If-None-Match, etc. If None, pool headers are used. If provided,
[   15s]             these headers completely replace any pool-specific headers.
[   15s]
[   15s]         :param retries:
[   15s]             Configure the number of retries to allow before raising a
[   15s]             :class:`~urllib3.exceptions.MaxRetryError` exception.
[   15s]
[   15s]             Pass ``None`` to retry until you receive a response. Pass a
[   15s]             :class:`~urllib3.util.retry.Retry` object for fine-grained control
[   15s]             over different types of retries.
[   15s]             Pass an integer number to retry connection errors that many times,
[   15s]             but no other types of errors. Pass zero to never retry.
[   15s]
[   15s]             If ``False``, then retries are disabled and any exception is raised
[   15s]             immediately. Also, instead of raising a MaxRetryError on redirects,
[   15s]             the redirect response will be returned.
[   15s]
[   15s]         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
[   15s]
[   15s]         :param redirect:
[   15s]             If True, automatically handle redirects (status codes 301, 302,
[   15s]             303, 307, 308). Each redirect counts as a retry. Disabling retries
[   15s]             will disable redirect, too.
[   15s]
[   15s]         :param assert_same_host:
[   15s]             If ``True``, will make sure that the host of the pool requests is
[   15s]             consistent else will raise HostChangedError. When ``False``, you can
[   15s]             use the pool on an HTTP proxy and request foreign hosts.
[   15s]
[   15s]         :param timeout:
[   15s]             If specified, overrides the default timeout for this one
[   15s]             request. It may be a float (in seconds) or an instance of
[   15s]             :class:`urllib3.util.Timeout`.
[   15s]
[   15s]         :param pool_timeout:
[   15s]             If set and the pool is set to block=True, then this method will
[   15s]             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
[   15s]             connection is available within the time period.
[   15s]
[   15s]         :param release_conn:
[   15s]             If False, then the urlopen call will not release the connection
[   15s]             back into the pool once a response is received (but will release if
[   15s]             you read the entire contents of the response such as when
[   15s]             `preload_content=True`). This is useful if you're not preloading
[   15s]             the response's content immediately. You will need to call
[   15s]             ``r.release_conn()`` on the response ``r`` to return the connection
[   15s]             back into the pool. If None, it takes the value of
[   15s]             ``response_kw.get('preload_content', True)``.
[   15s]
[   15s]         :param chunked:
[   15s]             If True, urllib3 will send the body using chunked transfer
[   15s]             encoding. Otherwise, urllib3 will send the body using the standard
[   15s]             content-length form. Defaults to False.
[   15s]
[   15s]         :param int body_pos:
[   15s]             Position to seek to in file-like body in the event of a retry or
[   15s]             redirect. Typically this won't need to be set because urllib3 will
[   15s]             auto-populate the value when needed.
[   15s]
[   15s]         :param \\**response_kw:
[   15s]             Additional parameters are passed to
[   15s]             :meth:`urllib3.response.HTTPResponse.from_httplib`
[   15s]         """
[   15s]
[   15s]         parsed_url = parse_url(url)
[   15s]         destination_scheme = parsed_url.scheme
[   15s]
[   15s]         if headers is None:
[   15s]             headers = self.headers
[   15s]
[   15s]         if not isinstance(retries, Retry):
[   15s]             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
[   15s]
[   15s]         if release_conn is None:
[   15s]             release_conn = response_kw.get("preload_content", True)
[   15s]
[   15s]         # Check host
[   15s]         if assert_same_host and not self.is_same_host(url):
[   15s]             raise HostChangedError(self, url, retries)
[   15s]
[   15s]         # Ensure that the URL we're connecting to is properly encoded
[   15s]         if url.startswith("/"):
[   15s]             url = six.ensure_str(_encode_target(url))
[   15s]         else:
[   15s]             url = six.ensure_str(parsed_url.url)
[   15s]
[   15s]         conn = None
[   15s]
[   15s]         # Track whether `conn` needs to be released before
[   15s]         # returning/raising/recursing. Update this variable if necessary, and
[   15s]         # leave `release_conn` constant throughout the function. That way, if
[   15s]         # the function recurses, the original value of `release_conn` will be
[   15s]         # passed down into the recursive call, and its value will be respected.
[   15s]         #
[   15s]         # See issue #651 [1] for details.
[   15s]         #
[   15s]         # [1] <https://github.com/urllib3/urllib3/issues/651>
[   15s]         release_this_conn = release_conn
[   15s]
[   15s]         http_tunnel_required = connection_requires_http_tunnel(
[   15s]             self.proxy, self.proxy_config, destination_scheme
[   15s]         )
[   15s]
[   15s]         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
[   15s]         # have to copy the headers dict so we can safely change it without those
[   15s]         # changes being reflected in anyone else's copy.
[   15s]         if not http_tunnel_required:
[   15s]             headers = headers.copy()
[   15s]             headers.update(self.proxy_headers)
[   15s]
[   15s]         # Must keep the exception bound to a separate variable or else Python 3
[   15s]         # complains about UnboundLocalError.
[   15s]         err = None
[   15s]
[   15s]         # Keep track of whether we cleanly exited the except block. This
[   15s]         # ensures we do proper cleanup in finally.
[   15s]         clean_exit = False
[   15s]
[   15s]         # Rewind body position, if needed. Record current position
[   15s]         # for future rewinds in the event of a redirect/retry.
[   15s]         body_pos = set_file_position(body, body_pos)
[   15s]
[   15s]         try:
[   15s]             # Request a connection from the queue.
[   15s]             timeout_obj = self._get_timeout(timeout)
[   15s]             conn = self._get_conn(timeout=pool_timeout)
[   15s]
[   15s]             conn.timeout = timeout_obj.connect_timeout
[   15s]
[   15s]             is_new_proxy_conn = self.proxy is not None and not getattr(
[   15s]                 conn, "sock", None
[   15s]             )
[   15s]             if is_new_proxy_conn and http_tunnel_required:
[   15s]                 self._prepare_proxy(conn)
[   15s]
[   15s]             # Make the request on the httplib connection object.
[   15s] >           httplib_response = self._make_request(
[   15s]                 conn,
[   15s]                 method,
[   15s]                 url,
[   15s]                 timeout=timeout_obj,
[   15s]                 body=body,
[   15s]                 headers=headers,
[   15s]                 chunked=chunked,
[   15s]             )
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connectionpool.py:703:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fb289ec27c0>
[   15s] conn = <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>
[   15s] method = 'POST', url = '/_/TranslateWebserverUi/data/batchexecute'
[   15s] timeout = Timeout(connect=None, read=None, total=None), chunked = False
[   15s] httplib_request_kw = {'body': 'f.req=%5B%5B%5B%22jQ1olc%22%2C%22%5B%5C%22test%5C%22%2C%5C%22en%5C%22%2Cnull%2C%5C%22null%5C%22%5D%22%2Cnull...0.2526.106 Safari/537.36', 'Content-Type': 'application/x-www-form-urlencoded;charset=utf-8', 'Content-Length': '134'}}
[   15s] timeout_obj = Timeout(connect=None, read=None, total=None)
[   15s]
[   15s]     def _make_request(
[   15s]         self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
[   15s]     ):
[   15s]         """
[   15s]         Perform a request on a given urllib connection object taken from our
[   15s]         pool.
[   15s]
[   15s]         :param conn:
[   15s]             a connection from one of our connection pools
[   15s]
[   15s]         :param timeout:
[   15s]             Socket timeout in seconds for the request. This can be a
[   15s]             float or integer, which will set the same timeout value for
[   15s]             the socket connect and the socket read, or an instance of
[   15s]             :class:`urllib3.util.Timeout`, which gives you more fine-grained
[   15s]             control over your timeouts.
[   15s]         """
[   15s]         self.num_requests += 1
[   15s]
[   15s]         timeout_obj = self._get_timeout(timeout)
[   15s]         timeout_obj.start_connect()
[   15s]         conn.timeout = timeout_obj.connect_timeout
[   15s]
[   15s]         # Trigger any extra validation we need to do.
[   15s]         try:
[   15s] >           self._validate_conn(conn)
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connectionpool.py:386:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fb289ec27c0>
[   15s] conn = <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>
[   15s]
[   15s]     def _validate_conn(self, conn):
[   15s]         """
[   15s]         Called right before a request is made, after the socket is created.
[   15s]         """
[   15s]         super(HTTPSConnectionPool, self)._validate_conn(conn)
[   15s]
[   15s]         # Force connect early to allow us to validate the connection.
[   15s]         if not getattr(conn, "sock", None):  # AppEngine might not have  `.sock`
[   15s] >           conn.connect()
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connectionpool.py:1040:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>
[   15s]
[   15s]     def connect(self):
[   15s]         # Add certificate verification
[   15s] >       conn = self._new_conn()
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connection.py:358:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>
[   15s]
[   15s]     def _new_conn(self):
[   15s]         """Establish a socket connection and set nodelay settings on it.
[   15s]
[   15s]         :return: New socket connection.
[   15s]         """
[   15s]         extra_kw = {}
[   15s]         if self.source_address:
[   15s]             extra_kw["source_address"] = self.source_address
[   15s]
[   15s]         if self.socket_options:
[   15s]             extra_kw["socket_options"] = self.socket_options
[   15s]
[   15s]         try:
[   15s]             conn = connection.create_connection(
[   15s]                 (self._dns_host, self.port), self.timeout, **extra_kw
[   15s]             )
[   15s]
[   15s]         except SocketTimeout:
[   15s]             raise ConnectTimeoutError(
[   15s]                 self,
[   15s]                 "Connection to %s timed out. (connect timeout=%s)"
[   15s]                 % (self.host, self.timeout),
[   15s]             )
[   15s]
[   15s]         except SocketError as e:
[   15s] >           raise NewConnectionError(
[   15s]                 self, "Failed to establish a new connection: %s" % e
[   15s]             )
[   15s] E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connection.py:186: NewConnectionError
[   15s]
[   15s] During handling of the above exception, another exception occurred:
[   15s]
[   15s] self = <requests.adapters.HTTPAdapter object at 0x7fb289ec2b20>
[   15s] request = <PreparedRequest [POST]>, stream = False
[   15s] timeout = Timeout(connect=None, read=None, total=None), verify = False
[   15s] cert = None, proxies = {}
[   15s]
[   15s]     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
[   15s]         """Sends PreparedRequest object. Returns Response object.
[   15s]
[   15s]         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
[   15s]         :param stream: (optional) Whether to stream the request content.
[   15s]         :param timeout: (optional) How long to wait for the server to send
[   15s]             data before giving up, as a float, or a :ref:`(connect timeout,
[   15s]             read timeout) <timeouts>` tuple.
[   15s]         :type timeout: float or tuple or urllib3 Timeout object
[   15s]         :param verify: (optional) Either a boolean, in which case it controls whether
[   15s]             we verify the server's TLS certificate, or a string, in which case it
[   15s]             must be a path to a CA bundle to use
[   15s]         :param cert: (optional) Any user-provided SSL certificate to be trusted.
[   15s]         :param proxies: (optional) The proxies dictionary to apply to the request.
[   15s]         :rtype: requests.Response
[   15s]         """
[   15s]
[   15s]         try:
[   15s]             conn = self.get_connection(request.url, proxies)
[   15s]         except LocationValueError as e:
[   15s]             raise InvalidURL(e, request=request)
[   15s]
[   15s]         self.cert_verify(conn, request.url, verify, cert)
[   15s]         url = self.request_url(request, proxies)
[   15s]         self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
[   15s]
[   15s]         chunked = not (request.body is None or 'Content-Length' in request.headers)
[   15s]
[   15s]         if isinstance(timeout, tuple):
[   15s]             try:
[   15s]                 connect, read = timeout
[   15s]                 timeout = TimeoutSauce(connect=connect, read=read)
[   15s]             except ValueError as e:
[   15s]                 # this may raise a string formatting error.
[   15s]                 err = ("Invalid timeout {}. Pass a (connect, read) "
[   15s]                        "timeout tuple, or a single float to set "
[   15s]                        "both timeouts to the same value".format(timeout))
[   15s]                 raise ValueError(err)
[   15s]         elif isinstance(timeout, TimeoutSauce):
[   15s]             pass
[   15s]         else:
[   15s]             timeout = TimeoutSauce(connect=timeout, read=timeout)
[   15s]
[   15s]         try:
[   15s]             if not chunked:
[   15s] >               resp = conn.urlopen(
[   15s]                     method=request.method,
[   15s]                     url=url,
[   15s]                     body=request.body,
[   15s]                     headers=request.headers,
[   15s]                     redirect=False,
[   15s]                     assert_same_host=False,
[   15s]                     preload_content=False,
[   15s]                     decode_content=False,
[   15s]                     retries=self.max_retries,
[   15s]                     timeout=timeout
[   15s]                 )
[   15s]
[   15s] /usr/lib/python3.9/site-packages/requests/adapters.py:440:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fb289ec27c0>
[   15s] method = 'POST', url = '/_/TranslateWebserverUi/data/batchexecute'
[   15s] body = 'f.req=%5B%5B%5B%22jQ1olc%22%2C%22%5B%5C%22test%5C%22%2C%5C%22en%5C%22%2Cnull%2C%5C%22null%5C%22%5D%22%2Cnull%2C%22generic%22%5D%5D%5D&'
[   15s] headers = {'Referer': 'http://translate.google.com/', 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KH....0.2526.106 Safari/537.36', 'Content-Type': 'application/x-www-form-urlencoded;charset=utf-8', 'Content-Length': '134'}
[   15s] retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
[   15s] redirect = False, assert_same_host = False
[   15s] timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
[   15s] release_conn = False, chunked = False, body_pos = None
[   15s] response_kw = {'decode_content': False, 'preload_content': False}
[   15s] parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/_/TranslateWebserverUi/data/batchexecute', query=None, fragment=None)
[   15s] destination_scheme = None, conn = None, release_this_conn = True
[   15s] http_tunnel_required = False, err = None, clean_exit = False
[   15s]
[   15s]     def urlopen(
[   15s]         self,
[   15s]         method,
[   15s]         url,
[   15s]         body=None,
[   15s]         headers=None,
[   15s]         retries=None,
[   15s]         redirect=True,
[   15s]         assert_same_host=True,
[   15s]         timeout=_Default,
[   15s]         pool_timeout=None,
[   15s]         release_conn=None,
[   15s]         chunked=False,
[   15s]         body_pos=None,
[   15s]         **response_kw
[   15s]     ):
[   15s]         """
[   15s]         Get a connection from the pool and perform an HTTP request. This is the
[   15s]         lowest level call for making a request, so you'll need to specify all
[   15s]         the raw details.
[   15s]
[   15s]         .. note::
[   15s]
[   15s]            More commonly, it's appropriate to use a convenience method provided
[   15s]            by :class:`.RequestMethods`, such as :meth:`request`.
[   15s]
[   15s]         .. note::
[   15s]
[   15s]            `release_conn` will only behave as expected if
[   15s]            `preload_content=False` because we want to make
[   15s]            `preload_content=False` the default behaviour someday soon without
[   15s]            breaking backwards compatibility.
[   15s]
[   15s]         :param method:
[   15s]             HTTP request method (such as GET, POST, PUT, etc.)
[   15s]
[   15s]         :param url:
[   15s]             The URL to perform the request on.
[   15s]
[   15s]         :param body:
[   15s]             Data to send in the request body, either :class:`str`, :class:`bytes`,
[   15s]             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
[   15s]
[   15s]         :param headers:
[   15s]             Dictionary of custom headers to send, such as User-Agent,
[   15s]             If-None-Match, etc. If None, pool headers are used. If provided,
[   15s]             these headers completely replace any pool-specific headers.
[   15s]
[   15s]         :param retries:
[   15s]             Configure the number of retries to allow before raising a
[   15s]             :class:`~urllib3.exceptions.MaxRetryError` exception.
[   15s]
[   15s]             Pass ``None`` to retry until you receive a response. Pass a
[   15s]             :class:`~urllib3.util.retry.Retry` object for fine-grained control
[   15s]             over different types of retries.
[   15s]             Pass an integer number to retry connection errors that many times,
[   15s]             but no other types of errors. Pass zero to never retry.
[   15s]
[   15s]             If ``False``, then retries are disabled and any exception is raised
[   15s]             immediately. Also, instead of raising a MaxRetryError on redirects,
[   15s]             the redirect response will be returned.
[   15s]
[   15s]         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
[   15s]
[   15s]         :param redirect:
[   15s]             If True, automatically handle redirects (status codes 301, 302,
[   15s]             303, 307, 308). Each redirect counts as a retry. Disabling retries
[   15s]             will disable redirect, too.
[   15s]
[   15s]         :param assert_same_host:
[   15s]             If ``True``, will make sure that the host of the pool requests is
[   15s]             consistent else will raise HostChangedError. When ``False``, you can
[   15s]             use the pool on an HTTP proxy and request foreign hosts.
[   15s]
[   15s]         :param timeout:
[   15s]             If specified, overrides the default timeout for this one
[   15s]             request. It may be a float (in seconds) or an instance of
[   15s]             :class:`urllib3.util.Timeout`.
[   15s]
[   15s]         :param pool_timeout:
[   15s]             If set and the pool is set to block=True, then this method will
[   15s]             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
[   15s]             connection is available within the time period.
[   15s]
[   15s]         :param release_conn:
[   15s]             If False, then the urlopen call will not release the connection
[   15s]             back into the pool once a response is received (but will release if
[   15s]             you read the entire contents of the response such as when
[   15s]             `preload_content=True`). This is useful if you're not preloading
[   15s]             the response's content immediately. You will need to call
[   15s]             ``r.release_conn()`` on the response ``r`` to return the connection
[   15s]             back into the pool. If None, it takes the value of
[   15s]             ``response_kw.get('preload_content', True)``.
[   15s]
[   15s]         :param chunked:
[   15s]             If True, urllib3 will send the body using chunked transfer
[   15s]             encoding. Otherwise, urllib3 will send the body using the standard
[   15s]             content-length form. Defaults to False.
[   15s]
[   15s]         :param int body_pos:
[   15s]             Position to seek to in file-like body in the event of a retry or
[   15s]             redirect. Typically this won't need to be set because urllib3 will
[   15s]             auto-populate the value when needed.
[   15s]
[   15s]         :param \\**response_kw:
[   15s]             Additional parameters are passed to
[   15s]             :meth:`urllib3.response.HTTPResponse.from_httplib`
[   15s]         """
[   15s]
[   15s]         parsed_url = parse_url(url)
[   15s]         destination_scheme = parsed_url.scheme
[   15s]
[   15s]         if headers is None:
[   15s]             headers = self.headers
[   15s]
[   15s]         if not isinstance(retries, Retry):
[   15s]             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
[   15s]
[   15s]         if release_conn is None:
[   15s]             release_conn = response_kw.get("preload_content", True)
[   15s]
[   15s]         # Check host
[   15s]         if assert_same_host and not self.is_same_host(url):
[   15s]             raise HostChangedError(self, url, retries)
[   15s]
[   15s]         # Ensure that the URL we're connecting to is properly encoded
[   15s]         if url.startswith("/"):
[   15s]             url = six.ensure_str(_encode_target(url))
[   15s]         else:
[   15s]             url = six.ensure_str(parsed_url.url)
[   15s]
[   15s]         conn = None
[   15s]
[   15s]         # Track whether `conn` needs to be released before
[   15s]         # returning/raising/recursing. Update this variable if necessary, and
[   15s]         # leave `release_conn` constant throughout the function. That way, if
[   15s]         # the function recurses, the original value of `release_conn` will be
[   15s]         # passed down into the recursive call, and its value will be respected.
[   15s]         #
[   15s]         # See issue #651 [1] for details.
[   15s]         #
[   15s]         # [1] <https://github.com/urllib3/urllib3/issues/651>
[   15s]         release_this_conn = release_conn
[   15s]
[   15s]         http_tunnel_required = connection_requires_http_tunnel(
[   15s]             self.proxy, self.proxy_config, destination_scheme
[   15s]         )
[   15s]
[   15s]         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
[   15s]         # have to copy the headers dict so we can safely change it without those
[   15s]         # changes being reflected in anyone else's copy.
[   15s]         if not http_tunnel_required:
[   15s]             headers = headers.copy()
[   15s]             headers.update(self.proxy_headers)
[   15s]
[   15s]         # Must keep the exception bound to a separate variable or else Python 3
[   15s]         # complains about UnboundLocalError.
[   15s]         err = None
[   15s]
[   15s]         # Keep track of whether we cleanly exited the except block. This
[   15s]         # ensures we do proper cleanup in finally.
[   15s]         clean_exit = False
[   15s]
[   15s]         # Rewind body position, if needed. Record current position
[   15s]         # for future rewinds in the event of a redirect/retry.
[   15s]         body_pos = set_file_position(body, body_pos)
[   15s]
[   15s]         try:
[   15s]             # Request a connection from the queue.
[   15s]             timeout_obj = self._get_timeout(timeout)
[   15s]             conn = self._get_conn(timeout=pool_timeout)
[   15s]
[   15s]             conn.timeout = timeout_obj.connect_timeout
[   15s]
[   15s]             is_new_proxy_conn = self.proxy is not None and not getattr(
[   15s]                 conn, "sock", None
[   15s]             )
[   15s]             if is_new_proxy_conn and http_tunnel_required:
[   15s]                 self._prepare_proxy(conn)
[   15s]
[   15s]             # Make the request on the httplib connection object.
[   15s]             httplib_response = self._make_request(
[   15s]                 conn,
[   15s]                 method,
[   15s]                 url,
[   15s]                 timeout=timeout_obj,
[   15s]                 body=body,
[   15s]                 headers=headers,
[   15s]                 chunked=chunked,
[   15s]             )
[   15s]
[   15s]             # If we're going to release the connection in ``finally:``, then
[   15s]             # the response doesn't need to know about the connection. Otherwise
[   15s]             # it will also try to release it and we'll have a double-release
[   15s]             # mess.
[   15s]             response_conn = conn if not release_conn else None
[   15s]
[   15s]             # Pass method to Response for length checking
[   15s]             response_kw["request_method"] = method
[   15s]
[   15s]             # Import httplib's response into our own wrapper object
[   15s]             response = self.ResponseCls.from_httplib(
[   15s]                 httplib_response,
[   15s]                 pool=self,
[   15s]                 connection=response_conn,
[   15s]                 retries=retries,
[   15s]                 **response_kw
[   15s]             )
[   15s]
[   15s]             # Everything went great!
[   15s]             clean_exit = True
[   15s]
[   15s]         except EmptyPoolError:
[   15s]             # Didn't get a connection from the pool, no need to clean up
[   15s]             clean_exit = True
[   15s]             release_this_conn = False
[   15s]             raise
[   15s]
[   15s]         except (
[   15s]             TimeoutError,
[   15s]             HTTPException,
[   15s]             SocketError,
[   15s]             ProtocolError,
[   15s]             BaseSSLError,
[   15s]             SSLError,
[   15s]             CertificateError,
[   15s]         ) as e:
[   15s]             # Discard the connection for these exceptions. It will be
[   15s]             # replaced during the next _get_conn() call.
[   15s]             clean_exit = False
[   15s]
[   15s]             def _is_ssl_error_message_from_http_proxy(ssl_error):
[   15s]                 # We're trying to detect the message 'WRONG_VERSION_NUMBER' but
[   15s]                 # SSLErrors are kinda all over the place when it comes to the message,
[   15s]                 # so we try to cover our bases here!
[   15s]                 message = " ".join(re.split("[^a-z]", str(ssl_error).lower()))
[   15s]                 return (
[   15s]                     "wrong version number" in message or "unknown protocol" in message
[   15s]                 )
[   15s]
[   15s]             # Try to detect a common user error with proxies which is to
[   15s]             # set an HTTP proxy to be HTTPS when it should be 'http://'
[   15s]             # (ie {'http': 'http://proxy', 'https': 'https://proxy'})
[   15s]             # Instead we add a nice error message and point to a URL.
[   15s]             if (
[   15s]                 isinstance(e, BaseSSLError)
[   15s]                 and self.proxy
[   15s]                 and _is_ssl_error_message_from_http_proxy(e)
[   15s]             ):
[   15s]                 e = ProxyError(
[   15s]                     "Your proxy appears to only use HTTP and not HTTPS, "
[   15s]                     "try changing your proxy URL to be HTTP. See: "
[   15s]                     "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html"
[   15s]                     "#https-proxy-error-http-proxy",
[   15s]                     SSLError(e),
[   15s]                 )
[   15s]             elif isinstance(e, (BaseSSLError, CertificateError)):
[   15s]                 e = SSLError(e)
[   15s]             elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
[   15s]                 e = ProxyError("Cannot connect to proxy.", e)
[   15s]             elif isinstance(e, (SocketError, HTTPException)):
[   15s]                 e = ProtocolError("Connection aborted.", e)
[   15s]
[   15s] >           retries = retries.increment(
[   15s]                 method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
[   15s]             )
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/connectionpool.py:785:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
[   15s] method = 'POST', url = '/_/TranslateWebserverUi/data/batchexecute'
[   15s] response = None
[   15s] error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
[   15s] _pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fb289ec27c0>
[   15s] _stacktrace = <traceback object at 0x7fb28acfa9c0>
[   15s]
[   15s]     def increment(
[   15s]         self,
[   15s]         method=None,
[   15s]         url=None,
[   15s]         response=None,
[   15s]         error=None,
[   15s]         _pool=None,
[   15s]         _stacktrace=None,
[   15s]     ):
[   15s]         """Return a new Retry object with incremented retry counters.
[   15s]
[   15s]         :param response: A response object, or None, if the server did not
[   15s]             return a response.
[   15s]         :type response: :class:`~urllib3.response.HTTPResponse`
[   15s]         :param Exception error: An error encountered during the request, or
[   15s]             None if the response was received successfully.
[   15s]
[   15s]         :return: A new ``Retry`` object.
[   15s]         """
[   15s]         if self.total is False and error:
[   15s]             # Disabled, indicate to re-raise the error.
[   15s]             raise six.reraise(type(error), error, _stacktrace)
[   15s]
[   15s]         total = self.total
[   15s]         if total is not None:
[   15s]             total -= 1
[   15s]
[   15s]         connect = self.connect
[   15s]         read = self.read
[   15s]         redirect = self.redirect
[   15s]         status_count = self.status
[   15s]         other = self.other
[   15s]         cause = "unknown"
[   15s]         status = None
[   15s]         redirect_location = None
[   15s]
[   15s]         if error and self._is_connection_error(error):
[   15s]             # Connect retry?
[   15s]             if connect is False:
[   15s]                 raise six.reraise(type(error), error, _stacktrace)
[   15s]             elif connect is not None:
[   15s]                 connect -= 1
[   15s]
[   15s]         elif error and self._is_read_error(error):
[   15s]             # Read retry?
[   15s]             if read is False or not self._is_method_retryable(method):
[   15s]                 raise six.reraise(type(error), error, _stacktrace)
[   15s]             elif read is not None:
[   15s]                 read -= 1
[   15s]
[   15s]         elif error:
[   15s]             # Other retry?
[   15s]             if other is not None:
[   15s]                 other -= 1
[   15s]
[   15s]         elif response and response.get_redirect_location():
[   15s]             # Redirect retry?
[   15s]             if redirect is not None:
[   15s]                 redirect -= 1
[   15s]             cause = "too many redirects"
[   15s]             redirect_location = response.get_redirect_location()
[   15s]             status = response.status
[   15s]
[   15s]         else:
[   15s]             # Incrementing because of a server error like a 500 in
[   15s]             # status_forcelist and the given method is in the allowed_methods
[   15s]             cause = ResponseError.GENERIC_ERROR
[   15s]             if response and response.status:
[   15s]                 if status_count is not None:
[   15s]                     status_count -= 1
[   15s]                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
[   15s]                 status = response.status
[   15s]
[   15s]         history = self.history + (
[   15s]             RequestHistory(method, url, error, status, redirect_location),
[   15s]         )
[   15s]
[   15s]         new_retry = self.new(
[   15s]             total=total,
[   15s]             connect=connect,
[   15s]             read=read,
[   15s]             redirect=redirect,
[   15s]             status=status_count,
[   15s]             other=other,
[   15s]             history=history,
[   15s]         )
[   15s]
[   15s]         if new_retry.is_exhausted():
[   15s] >           raise MaxRetryError(_pool, url, error or ResponseError(cause))
[   15s] E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='translate.google.com', port=443): Max retries exceeded with url: /_/TranslateWebserverUi/data/batchexecute (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
[   15s]
[   15s] /usr/lib/python3.9/site-packages/urllib3/util/retry.py:592: MaxRetryError
[   15s]
[   15s] During handling of the above exception, another exception occurred:
[   15s]
[   15s] self = <gtts.tts.gTTS object at 0x7fb289f0d3d0>
[   15s]
[   15s]     def stream(self):
[   15s]         """Do the TTS API request(s) and stream bytes
[   15s]
[   15s]         Raises:
[   15s]             :class:`gTTSError`: When there's an error with the API request.
[   15s]
[   15s]         """
[   15s]         # When disabling ssl verify in requests (for proxies and firewalls),
[   15s]         # urllib3 prints an insecure warning on stdout. We disable that.
[   15s]         try:
[   15s]             urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
[   15s]         except:
[   15s]             pass
[   15s]
[   15s]         prepared_requests = self._prepare_requests()
[   15s]         for idx, pr in enumerate(prepared_requests):
[   15s]             try:
[   15s]                 with requests.Session() as s:
[   15s]                     # Send request
[   15s] >                   r = s.send(
[   15s]                         request=pr, proxies=urllib.request.getproxies(), verify=False
[   15s]                     )
[   15s]
[   15s] gtts/tts.py:265:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <requests.sessions.Session object at 0x7fb289eccc10>
[   15s] request = <PreparedRequest [POST]>
[   15s] kwargs = {'cert': None, 'proxies': {}, 'stream': False, 'verify': False}
[   15s] allow_redirects = True, stream = False, hooks = {'response': []}
[   15s] adapter = <requests.adapters.HTTPAdapter object at 0x7fb289ec2b20>
[   15s] start = 1647873461.4827988
[   15s]
[   15s]     def send(self, request, **kwargs):
[   15s]         """Send a given PreparedRequest.
[   15s]
[   15s]         :rtype: requests.Response
[   15s]         """
[   15s]         # Set defaults that the hooks can utilize to ensure they always have
[   15s]         # the correct parameters to reproduce the previous request.
[   15s]         kwargs.setdefault('stream', self.stream)
[   15s]         kwargs.setdefault('verify', self.verify)
[   15s]         kwargs.setdefault('cert', self.cert)
[   15s]         if 'proxies' not in kwargs:
[   15s]             kwargs['proxies'] = resolve_proxies(
[   15s]                 request, self.proxies, self.trust_env
[   15s]             )
[   15s]
[   15s]         # It's possible that users might accidentally send a Request object.
[   15s]         # Guard against that specific failure case.
[   15s]         if isinstance(request, Request):
[   15s]             raise ValueError('You can only send PreparedRequests.')
[   15s]
[   15s]         # Set up variables needed for resolve_redirects and dispatching of hooks
[   15s]         allow_redirects = kwargs.pop('allow_redirects', True)
[   15s]         stream = kwargs.get('stream')
[   15s]         hooks = request.hooks
[   15s]
[   15s]         # Get the appropriate adapter to use
[   15s]         adapter = self.get_adapter(url=request.url)
[   15s]
[   15s]         # Start time (approximately) of the request
[   15s]         start = preferred_clock()
[   15s]
[   15s]         # Send the request
[   15s] >       r = adapter.send(request, **kwargs)
[   15s]
[   15s] /usr/lib/python3.9/site-packages/requests/sessions.py:645:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <requests.adapters.HTTPAdapter object at 0x7fb289ec2b20>
[   15s] request = <PreparedRequest [POST]>, stream = False
[   15s] timeout = Timeout(connect=None, read=None, total=None), verify = False
[   15s] cert = None, proxies = {}
[   15s]
[   15s]     def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
[   15s]         """Sends PreparedRequest object. Returns Response object.
[   15s]
[   15s]         :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
[   15s]         :param stream: (optional) Whether to stream the request content.
[   15s]         :param timeout: (optional) How long to wait for the server to send
[   15s]             data before giving up, as a float, or a :ref:`(connect timeout,
[   15s]             read timeout) <timeouts>` tuple.
[   15s]         :type timeout: float or tuple or urllib3 Timeout object
[   15s]         :param verify: (optional) Either a boolean, in which case it controls whether
[   15s]             we verify the server's TLS certificate, or a string, in which case it
[   15s]             must be a path to a CA bundle to use
[   15s]         :param cert: (optional) Any user-provided SSL certificate to be trusted.
[   15s]         :param proxies: (optional) The proxies dictionary to apply to the request.
[   15s]         :rtype: requests.Response
[   15s]         """
[   15s]
[   15s]         try:
[   15s]             conn = self.get_connection(request.url, proxies)
[   15s]         except LocationValueError as e:
[   15s]             raise InvalidURL(e, request=request)
[   15s]
[   15s]         self.cert_verify(conn, request.url, verify, cert)
[   15s]         url = self.request_url(request, proxies)
[   15s]         self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
[   15s]
[   15s]         chunked = not (request.body is None or 'Content-Length' in request.headers)
[   15s]
[   15s]         if isinstance(timeout, tuple):
[   15s]             try:
[   15s]                 connect, read = timeout
[   15s]                 timeout = TimeoutSauce(connect=connect, read=read)
[   15s]             except ValueError as e:
[   15s]                 # this may raise a string formatting error.
[   15s]                 err = ("Invalid timeout {}. Pass a (connect, read) "
[   15s]                        "timeout tuple, or a single float to set "
[   15s]                        "both timeouts to the same value".format(timeout))
[   15s]                 raise ValueError(err)
[   15s]         elif isinstance(timeout, TimeoutSauce):
[   15s]             pass
[   15s]         else:
[   15s]             timeout = TimeoutSauce(connect=timeout, read=timeout)
[   15s]
[   15s]         try:
[   15s]             if not chunked:
[   15s]                 resp = conn.urlopen(
[   15s]                     method=request.method,
[   15s]                     url=url,
[   15s]                     body=request.body,
[   15s]                     headers=request.headers,
[   15s]                     redirect=False,
[   15s]                     assert_same_host=False,
[   15s]                     preload_content=False,
[   15s]                     decode_content=False,
[   15s]                     retries=self.max_retries,
[   15s]                     timeout=timeout
[   15s]                 )
[   15s]
[   15s]             # Send the request.
[   15s]             else:
[   15s]                 if hasattr(conn, 'proxy_pool'):
[   15s]                     conn = conn.proxy_pool
[   15s]
[   15s]                 low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
[   15s]
[   15s]                 try:
[   15s]                     skip_host = 'Host' in request.headers
[   15s]                     low_conn.putrequest(request.method,
[   15s]                                         url,
[   15s]                                         skip_accept_encoding=True,
[   15s]                                         skip_host=skip_host)
[   15s]
[   15s]                     for header, value in request.headers.items():
[   15s]                         low_conn.putheader(header, value)
[   15s]
[   15s]                     low_conn.endheaders()
[   15s]
[   15s]                     for i in request.body:
[   15s]                         low_conn.send(hex(len(i))[2:].encode('utf-8'))
[   15s]                         low_conn.send(b'\r\n')
[   15s]                         low_conn.send(i)
[   15s]                         low_conn.send(b'\r\n')
[   15s]                     low_conn.send(b'0\r\n\r\n')
[   15s]
[   15s]                     # Receive the response from the server
[   15s]                     try:
[   15s]                         # For Python 2.7, use buffering of HTTP responses
[   15s]                         r = low_conn.getresponse(buffering=True)
[   15s]                     except TypeError:
[   15s]                         # For compatibility with Python 3.3+
[   15s]                         r = low_conn.getresponse()
[   15s]
[   15s]                     resp = HTTPResponse.from_httplib(
[   15s]                         r,
[   15s]                         pool=conn,
[   15s]                         connection=low_conn,
[   15s]                         preload_content=False,
[   15s]                         decode_content=False
[   15s]                     )
[   15s]                 except:
[   15s]                     # If we hit any problems here, clean up the connection.
[   15s]                     # Then, reraise so that we can handle the actual exception.
[   15s]                     low_conn.close()
[   15s]                     raise
[   15s]
[   15s]         except (ProtocolError, socket.error) as err:
[   15s]             raise ConnectionError(err, request=request)
[   15s]
[   15s]         except MaxRetryError as e:
[   15s]             if isinstance(e.reason, ConnectTimeoutError):
[   15s]                 # TODO: Remove this in 3.0.0: see #2811
[   15s]                 if not isinstance(e.reason, NewConnectionError):
[   15s]                     raise ConnectTimeout(e, request=request)
[   15s]
[   15s]             if isinstance(e.reason, ResponseError):
[   15s]                 raise RetryError(e, request=request)
[   15s]
[   15s]             if isinstance(e.reason, _ProxyError):
[   15s]                 raise ProxyError(e, request=request)
[   15s]
[   15s]             if isinstance(e.reason, _SSLError):
[   15s]                 # This branch is for urllib3 v1.22 and later.
[   15s]                 raise SSLError(e, request=request)
[   15s]
[   15s] >           raise ConnectionError(e, request=request)
[   15s] E           requests.exceptions.ConnectionError: HTTPSConnectionPool(host='translate.google.com', port=443): Max retries exceeded with url: /_/TranslateWebserverUi/data/batchexecute (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
[   15s]
[   15s] /usr/lib/python3.9/site-packages/requests/adapters.py:519: ConnectionError
[   15s]
[   15s] During handling of the above exception, another exception occurred:
[   15s]
[   15s]     def test_bad_fp_type():
[   15s]         """Raise TypeError if fp is not a file-like object (no .write())"""
[   15s]         # Create gTTS and save
[   15s]         tts = gTTS(text="test")
[   15s]         with pytest.raises(TypeError):
[   15s] >           tts.write_to_fp(5)
[   15s]
[   15s] gtts/tests/test_tts.py:92:
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s] gtts/tts.py:310: in write_to_fp
[   15s]     for idx, decoded in enumerate(self.stream()):
[   15s] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[   15s]
[   15s] self = <gtts.tts.gTTS object at 0x7fb289f0d3d0>
[   15s]
[   15s]     def stream(self):
[   15s]         """Do the TTS API request(s) and stream bytes
[   15s]
[   15s]         Raises:
[   15s]             :class:`gTTSError`: When there's an error with the API request.
[   15s]
[   15s]         """
[   15s]         # When disabling ssl verify in requests (for proxies and firewalls),
[   15s]         # urllib3 prints an insecure warning on stdout. We disable that.
[   15s]         try:
[   15s]             urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
[   15s]         except:
[   15s]             pass
[   15s]
[   15s]         prepared_requests = self._prepare_requests()
[   15s]         for idx, pr in enumerate(prepared_requests):
[   15s]             try:
[   15s]                 with requests.Session() as s:
[   15s]                     # Send request
[   15s]                     r = s.send(
[   15s]                         request=pr, proxies=urllib.request.getproxies(), verify=False
[   15s]                     )
[   15s]
[   15s]                 log.debug("headers-%i: %s", idx, r.request.headers)
[   15s]                 log.debug("url-%i: %s", idx, r.request.url)
[   15s]                 log.debug("status-%i: %s", idx, r.status_code)
[   15s]
[   15s]                 r.raise_for_status()
[   15s]             except requests.exceptions.HTTPError as e:  # pragma: no cover
[   15s]                 # Request successful, bad response
[   15s]                 log.debug(str(e))
[   15s]                 raise gTTSError(tts=self, response=r)
[   15s]             except requests.exceptions.RequestException as e:  # pragma: no cover
[   15s]                 # Request failed
[   15s]                 log.debug(str(e))
[   15s] >               raise gTTSError(tts=self)
[   15s] E               gtts.tts.gTTSError: Failed to connect. Probable cause: Unknown
[   15s]
[   15s] gtts/tts.py:281: gTTSError
[   15s] ------------------------------ Captured log call -------------------------------
[   15s] DEBUG    gtts.tts:tts.py:131 text: test
[   15s] DEBUG    gtts.tts:tts.py:131 tld: com
[   15s] DEBUG    gtts.tts:tts.py:131 lang: en
[   15s] DEBUG    gtts.tts:tts.py:131 slow: False
[   15s] DEBUG    gtts.tts:tts.py:131 lang_check: True
[   15s] DEBUG    gtts.tts:tts.py:131 pre_processor_funcs: [<function tone_marks at 0x7fb28a9f39d0>, <function end_of_line at 0x7fb28a98c310>, <function abbreviations at 0x7fb28a98c3a0>, <function word_sub at 0x7fb28a98c430>]
[   15s] DEBUG    gtts.tts:tts.py:131 tokenizer_func: <bound method Tokenizer.run of re.compile('(?<=\\?).|(?<=!).|(?<=?).|(?<=!).|(?<!\\.[a-z])\\. |(?<!\\.[a-z]), |(?<!\\d):|\\[|\\(|;|…|\\)|:|¿|—|,|、|\\\n|。|¡|\\]|‥|،', re.IGNORECASE) from: [<function tone_marks at 0x7fb28a98c550>, <function period_comma at 0x7fb28a98c670>, <function colon at 0x7fb28a98c700>, <function other_punctuation at 0x7fb28a98c790>]>
[   15s] DEBUG    gtts.lang:lang.py:33 langs: {'af': 'Afrikaans', 'ar': 'Arabic', 'bg': 'Bulgarian', 'bn': 'Bengali', 'bs': 'Bosnian', 'ca': 'Catalan', 'cs': 'Czech', 'cy': 'Welsh', 'da': 'Danish', 'de': 'German', 'el': 'Greek', 'en': 'English', 'eo': 'Esperanto', 'es': 'Spanish', 'et': 'Estonian', 'fi': 'Finnish', 'fr': 'French', 'gu': 'Gujarati', 'hi': 'Hindi', 'hr': 'Croatian', 'hu': 'Hungarian', 'hy': 'Armenian', 'id': 'Indonesian', 'is': 'Icelandic', 'it': 'Italian', 'iw': 'Hebrew', 'ja': 'Japanese', 'jw': 'Javanese', 'km': 'Khmer', 'kn': 'Kannada', 'ko': 'Korean', 'la': 'Latin', 'lv': 'Latvian', 'mk': 'Macedonian', 'ms': 'Malay', 'ml': 'Malayalam', 'mr': 'Marathi', 'my': 'Myanmar (Burmese)', 'ne': 'Nepali', 'nl': 'Dutch', 'no': 'Norwegian', 'pl': 'Polish', 'pt': 'Portuguese', 'ro': 'Romanian', 'ru': 'Russian', 'si': 'Sinhala', 'sk': 'Slovak', 'sq': 'Albanian', 'sr': 'Serbian', 'su': 'Sundanese', 'sv': 'Swedish', 'sw': 'Swahili', 'ta': 'Tamil', 'te': 'Telugu', 'th': 'Thai', 'tl': 'Filipino', 'tr': 'Turkish', 'uk': 'Ukrainian', 'ur': 'Urdu', 'vi': 'Vietnamese', 'zh-CN': 'Chinese', 'zh-TW': 'Chinese (Mandarin/Taiwan)', 'zh': 'Chinese (Mandarin)'}
[   15s] DEBUG    gtts.tts:tts.py:172 pre-processing: <function tone_marks at 0x7fb28a9f39d0>
[   15s] DEBUG    gtts.tts:tts.py:172 pre-processing: <function end_of_line at 0x7fb28a98c310>
[   15s] DEBUG    gtts.tts:tts.py:172 pre-processing: <function abbreviations at 0x7fb28a98c3a0>
[   15s] DEBUG    gtts.tts:tts.py:172 pre-processing: <function word_sub at 0x7fb28a98c430>
[   15s] DEBUG    gtts.tts:tts.py:207 text_parts: ['test']
[   15s] DEBUG    gtts.tts:tts.py:208 text_parts: 1
[   15s] DEBUG    gtts.tts:tts.py:215 data-0: f.req=%5B%5B%5B%22jQ1olc%22%2C%22%5B%5C%22test%5C%22%2C%5C%22en%5C%22%2Cnull%2C%5C%22null%5C%22%5D%22%2Cnull%2C%22generic%22%5D%5D%5D&
[   15s] DEBUG    gtts.tts:tts.py:280 HTTPSConnectionPool(host='translate.google.com', port=443): Max retries exceeded with url: /_/TranslateWebserverUi/data/batchexecute (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fb289ec2e20>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
[   15s] =============================== warnings summary ===============================
[   15s] ../../../../../usr/lib/python3.9/site-packages/_pytest/config/__init__.py:1233
[   15s]   /usr/lib/python3.9/site-packages/_pytest/config/__init__.py:1233: PytestConfigWarning: Unknown config option: maxversion
[   15s]
[   15s]     self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")
[   15s]
[   15s] -- Docs: https://docs.pytest.org/en/stable/warnings.html
[   15s] =========================== short test summary info ============================
[   15s] FAILED gtts/tests/test_tts.py::test_bad_fp_type - gtts.tts.gTTSError: Failed ...
[   15s] ============ 1 failed, 35 passed, 81 deselected, 1 warning in 0.32s ============

Expected Behaviour

The test suite should pass

Context

Complete build log with all details of packages used and steps taken to run the test suite.

Suggested patch
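
A minimal sketch of the kind of change this calls for, assuming the suite keeps using the same pytest "net" marker that the other network-dependent tests already carry (the actual network-tests.patch, and the eventual fix in #391, may differ):

# Hypothetical sketch for gtts/tests/test_tts.py -- not the actual patch.
# Assumes the suite's existing "net" marker, so that pytest -k 'not net'
# (or pytest -m 'not net') deselects this test along with the others.
import pytest

from gtts import gTTS


@pytest.mark.net
def test_bad_fp_type():
    """Raise TypeError if fp is not a file-like object (no .write())"""
    # write_to_fp() calls stream(), which performs the Translate API
    # request before the fp argument is ever used, so this test needs
    # network access and fails in an isolated build without the marker.
    tts = gTTS(text="test")
    with pytest.raises(TypeError):
        tts.write_to_fp(5)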

Environment

  • gTTS version: 2.2.4, python 3.8.12
  • Operating System version: Linux, openSUSE/Tumbleweed (rolling distro as of today, 2022-03-21)
bmwiedemann pushed a commit to bmwiedemann/openSUSE that referenced this issue Mar 21, 2022
https://build.opensuse.org/request/show/963733
by user mcepl + dimstar_suse
- Make tests working at least a little bit. Requires two new patches:
  - demock.patch (gh#pndurette/gTTS#343)
  - network-tests.patch (gh#pndurette/gTTS#344)
- version update to 2.2.4
  2.2.4 (2022-03-14)
  ------------------
  Features
  ~~~~~~~
  - Added Malay language support (`#316 <https://github.com/pndurette/gTTS/issues/316>`_)
  - Added Hebrew language support (`#324 <https://github.com/pndurette/gTTS/issues/324>`_)
  - Added new ``gTTS.stream()`` method to stream bytes (`#319 <https://github.com/pndurette/gTTS/issues/319>`_)
  Misc
  ~~~
  - `#334 <https://github.com/pndurette/gTTS/issues/334>`_
  2.2.3 (2021-06-17)
  ------------------
  Features
  ~~~~~~~
  - Added Bulgarian language support (`#302 <https://github.com/pndurette/gTTS/issues/302>`_)
  2.2.2 (2021-02-03)
  --------
pndurette (Owner) commented:

@mcepl First of all, so sorry for the delay! And 2nd of all, yes, that one slipped by! Thanks a lot, will fix!

pndurette added the misc label Nov 21, 2022