
Too many open files (os error 24) #214

Closed
MagiMaster opened this issue Jul 11, 2019 · 4 comments
Labels
status: needs repro We need more information on how to reproduce this issue. type: bug Something happens that shouldn't happen

Comments

@MagiMaster

On Mac, if you leave "rojo serve" running overnight, it will eventually fail with "[ERROR hyper::server::tcp] accept error: Too many open files (os error 24)". This doesn't actually kill the server process, but it does make it unresponsive. (Manually killing and restarting it fixes things for another day.)

@LPGhatguy LPGhatguy added impact: small Minor papercuts in Rojo that don't warrant immediate resolution. type: bug Something happens that shouldn't happen labels Jul 12, 2019
@LPGhatguy
Contributor

Weird!

I wonder what layer this bug is happening at. Off the top of my head, it could be one of these (one way to check is sketched after the list):

  • Hyper itself (could it be keeping too many connections around, which are conceptually "files" in Unix?)
  • Rojo keeping too many actual file handles open, causing Hyper to fail
  • Some bug or resource leak in a library Rojo uses
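Not from the thread, but one way to narrow down which layer is leaking is to watch the process's own file-descriptor count while the server idles. A minimal sketch, assuming a Unix-like system where /dev/fd lists the current process's descriptors (both macOS and Linux do); `open_fd_count` is a made-up helper name, not part of Rojo:

```rust
use std::fs;

/// Count this process's open file descriptors by listing /dev/fd.
/// Note: read_dir itself briefly holds one descriptor open, so the
/// count is inflated by one while the iterator is alive.
fn open_fd_count() -> std::io::Result<usize> {
    Ok(fs::read_dir("/dev/fd")?.count())
}

fn main() -> std::io::Result<()> {
    // Log the count once; a real diagnostic would call this
    // periodically and watch whether it grows without bound.
    println!("open file descriptors: {}", open_fd_count()?);
    Ok(())
}
```

If the count climbs steadily while Studio sits connected and idle, the leak tracks the long-poll traffic; if it stays flat right up until the error, the descriptors are being exhausted somewhere else.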

@MagiMaster
Author

It looks like it might behave differently if Studio isn't left connected to it. I'm not 100% sure I left it open long enough that it should have crashed, but there was no error message this morning.

@LPGhatguy
Contributor

That makes sense!

Studio communicates with the Rojo server (implemented with Hyper/Tokio) through HTTP long-polling: it starts a request and relies on HTTP timeout behavior to create server->client notifications. In Roblox, that means the client has to restart the request every 30 seconds or so, assuming no changes are happening. Overnight, that's a couple thousand HTTP requests that time out. A leak in Rojo or one of Rojo's dependencies could be keeping those timed-out requests around.
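For context (this is not Rojo's actual code), the server side of that long-poll pattern boils down to racing a change notification against a timeout. A minimal sketch using Tokio's watch channel, with made-up names; the real implementation sits behind Hyper request handlers:

```rust
use std::time::Duration;
use tokio::{sync::watch, time::timeout};

/// One long-poll cycle: wait up to 30 seconds for a change notification.
/// On timeout the client immediately issues a new request, which is
/// where thousands of short-lived connections come from overnight.
async fn long_poll(mut changes: watch::Receiver<u64>) -> String {
    match timeout(Duration::from_secs(30), changes.changed()).await {
        Ok(Ok(())) => format!("changed, new cursor: {}", *changes.borrow()),
        _ => String::from("no changes; client should re-poll"),
    }
}

#[tokio::main]
async fn main() {
    let (notify, poll_rx) = watch::channel(0u64);
    let request = tokio::spawn(long_poll(poll_rx));

    // Simulate a filesystem change arriving while a client is polling.
    notify.send(1).expect("receiver still alive");
    println!("{}", request.await.expect("task panicked"));
}
```

If each timed-out cycle left a socket or any other descriptor behind, the per-process file limit (often only 256 by default on macOS) would be exhausted well within a single night.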

@LPGhatguy LPGhatguy added status: needs repro We need more information on how to reproduce this issue. and removed impact: small Minor papercuts in Rojo that don't warrant immediate resolution. labels Aug 27, 2019
@LPGhatguy
Contributor

I'm going to close this issue since we haven't seen anyone else run into it. Feel free to re-open it or leave a comment if you hit this again!

I think it's likely to be a Tokio issue, which we should also be able to resolve by upgrading some of our dependencies there.
