tracing support enhancements #3084

Open
thehesiod opened this issue Jun 14, 2018 · 4 comments

@thehesiod
Contributor

Long story short

After writing an ugly 2.x monkeypatch for client tracing support in Datadog (DataDog/dd-trace-py#294), a new aiohttp tracing strategy was discussed and created. I'm now trying to move that PR to the new tracing support, but I've run into a few issues.

Expected behaviour

Ability to trace major network activities in aiohttp

Actual behaviour

  1. Unable to trace when aiohttp is waiting for the response headers of a request: https://github.com/aio-libs/aiohttp/blob/v3.3.1/aiohttp/client.py#L387
  2. Unable to trace chunked stream reads via StreamReader.read()

Steps to reproduce

TBD

Your environment

aiohttp 3.3.1

Other Thoughts from a client perspective

  1. Initially I thought a trace was missing for req.send: https://github.com/aio-libs/aiohttp/blob/v3.3.1/aiohttp/client.py#L385. However, after tracing through that method and what it calls, I don't think it needs to be an async function.
  2. I think it would be a lot simpler if there were a trace per iteration of the redirect while loop; otherwise I have to piece together connection timeout + header send + header response.
  3. I'm not sure what send_response_chunk_received represents. It would be nice if there were a diagram describing when each signal fires.
  4. It seems odd to have send_request_chunk_sent without a corresponding received signal; it reads as if the request is sent but a response never arrives. What I see on a request from aiobotocore is: send_request_start -> send_conn_create_start -> send_conn_create_end -> send_request_chunk_sent -> send_request_end (see the sketch below).
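
For reference, here is a minimal sketch of how that signal sequence surfaces through aiohttp's public TraceConfig hooks (the URL is just an example, and exact hook availability should be checked against the aiohttp version in use):

```python
import asyncio
import time

import aiohttp


def make_trace_config() -> aiohttp.TraceConfig:
    trace_config = aiohttp.TraceConfig()
    started = time.monotonic()

    def log(name):
        # Every trace hook is awaited with (session, trace_config_ctx, params).
        async def handler(session, trace_config_ctx, params):
            print(f"{time.monotonic() - started:8.4f}s  {name}  {params}")
        return handler

    trace_config.on_request_start.append(log("request_start"))
    trace_config.on_connection_create_start.append(log("connection_create_start"))
    trace_config.on_connection_create_end.append(log("connection_create_end"))
    trace_config.on_request_chunk_sent.append(log("request_chunk_sent"))
    trace_config.on_response_chunk_received.append(log("response_chunk_received"))
    trace_config.on_request_end.append(log("request_end"))
    return trace_config


async def main():
    async with aiohttp.ClientSession(trace_configs=[make_trace_config()]) as session:
        async with session.get("https://example.com") as resp:
            await resp.read()


asyncio.run(main())
```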
@asvetlov
Member

asvetlov commented Jun 15, 2018

Thank you very much for the feedback.

Unable to trace when aiohttp is waiting for the response headers of a request: https://github.com/aio-libs/aiohttp/blob/v3.3.1/aiohttp/client.py#L387

Agree, we need it.

Unable to trace chunked stream reads via StreamReader.read()

Maybe we need to move sending send_response_chunk_received() to the code that pushes data into the StreamReader? Do you really want a signal when reading data from the stream, or when pushing data into the stream? Or both?

Initially I thought a trace was missing for req.send: https://github.com/aio-libs/aiohttp/blob/v3.3.1/aiohttp/client.py#L385. However, after tracing through that method and what it calls, I don't think it needs to be an async function.

I suspect this private API will be changed soon.

I think it would be a lot simpler if there were a trace per iteration of the redirect while loop; otherwise I have to piece together connection timeout + header send + header response.

Please elaborate. Are you asking for a new signal that composes these three?

I'm not sure what send_response_chunk_received represents. It would be nice if there were a diagram describing when each signal fires.

Discussed above.

It seems odd to have send_request_chunk_sent without a corresponding received signal; it reads as if the request is sent but a response never arrives. What I see on a request from aiobotocore is: send_request_start -> send_conn_create_start -> send_conn_create_end -> send_request_chunk_sent -> send_request_end

A response is received after sending a request. What exactly confuses you?

@thehesiod
Contributor Author

thehesiod commented Jun 15, 2018

Unable to trace chunked stream reads via StreamReader.read()

Maybe we need to move sending send_response_chunk_received() to the code that pushes data into the StreamReader? Do you really want a signal when reading data from the stream, or when pushing data into the stream? Or both?

Ideally any network-related delay would be traceable. It would be neat if this were lower level, in core Python, but what can we do :) I think the closer to bare metal, the better. In ddtrace we want to know how long it took to read from the stream. Right now I capture the sending side via the overall request timing, but that's not critical to have, because sending typically doesn't take long.
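
To make the gap concrete: with no signal around StreamReader.read(), the caller currently has to time chunked body reads by hand, something like this rough sketch (URL and chunk size are arbitrary):

```python
import asyncio
import time

import aiohttp


async def fetch_with_read_timing(url: str) -> bytes:
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            chunks = []
            while True:
                started = time.monotonic()
                chunk = await resp.content.read(64 * 1024)  # StreamReader.read()
                elapsed = time.monotonic() - started
                if not chunk:
                    break
                print(f"read {len(chunk)} bytes in {elapsed:.6f}s")
                chunks.append(chunk)
            return b"".join(chunks)


asyncio.run(fetch_with_read_timing("https://example.com"))
```

A tracer can't inject this on its own without monkeypatching, which is exactly what a signal would avoid.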

I think it would be a lot simpler if there were a trace per iteration of the redirect while loop; otherwise I have to piece together connection timeout + header send + header response.

Please elaborate. Are you asking for a new signal that composes these three?

It's not really necessary; I have enough information from the other traces and from what we're discussing here.

It seems odd to have send_request_chunk_sent without a corresponding received signal; it reads as if the request is sent but a response never arrives. What I see on a request from aiobotocore is: send_request_start -> send_conn_create_start -> send_conn_create_end -> send_request_chunk_sent -> send_request_end

A response is received after sending a request. What exactly confuses you?

The confusing part is that there is a send_request_chunk_sent but no corresponding start signal (something like a sending_request_chunk), so there's no way to know how long the chunk took to send.

Thanks!

@asvetlov
Member

@thehesiod would you be willing to work on the issue?

I see a strong request for a signal that is raised when the response headers are received.
We can start with that. Please provide a PR; I'd be happy to review it.
After that, we can discuss and implement the other missing signals.
Does that work for you?
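
For what it's worth, here is a rough sketch of what a consumer callback for such a signal might look like. The signal name and the registration call in the comment are hypothetical (nothing like this exists in aiohttp 3.3.x); the real name and payload would be settled in the PR:

```python
import time

# Hypothetical: a callback for a yet-to-be-added "response headers received"
# trace signal. Registration would presumably follow the existing TraceConfig
# style, e.g. trace_config.on_response_headers_received.append(on_headers_received),
# once such a hook exists.


async def on_headers_received(session, trace_config_ctx, params):
    # trace_config_ctx is the per-request context object; a tracer could stash
    # the request start time there from on_request_start and close its
    # "waiting for headers" span here.
    started = getattr(trace_config_ctx, "start_time", None)
    if started is not None:
        print(f"response headers received after {time.monotonic() - started:.6f}s")
```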

Sorry, I cannot spend much time hacking on aiohttp now, at least until the end of May (the Python 3.8 feature freeze date), and realistically I will be busy writing documentation and fixing bugs even after the beta release.
But I'll find time for reviews.

P.S.
The beginning of the year was hard for me:

  1. a lot of work at my day job
  2. the aiohttp build process was broken by the (in)famous pip 19 release, and I had no clear vision of where things were going or what the proper fix was
  3. my now 11-month-old daughter requires more of my attention than when she was only 4 months old

Anyway, I'm starting to return to active FOSS development. The next Python release is my first target, then aiohttp (maybe with stops at multidict and yarl; both libraries need my love too).

@thehesiod
Contributor Author

Welcome to the baby club :) Today our son was shrieking all day :) I'll try to find time as well; this year has been crazy.
