
Python dataclasses from JSON data using types defined in schema / how to test #1190

vashek opened this issue Mar 25, 2020 · 3 comments

@vashek

vashek commented Mar 25, 2020

I have only recently started learning about Connexion, and so far I love it, but there are two things I'm missing:

  • some way to automatically check/test/verify that my functions that implement the API operations actually accept and return what they're supposed to,
  • some way to help my IDE figure out what the functions should accept and return, so the IDE can help me write correct code. (I currently use PyCharm and like it.)

I was thinking: even though I like the one-step build process and generally try to avoid introducing generated code, in this case perhaps generating a module with dataclasses corresponding to the types defined in the schema would solve both?
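A rough sketch of what such a generator might look like. This is an illustration, not the actual script: it assumes flat OpenAPI object schemas with primitive property types, and the `schema_to_dataclass` name and type mapping are made up for the example.

```python
# Map OpenAPI primitive types to Python type names (assumed, simplified mapping).
OPENAPI_TO_PY = {"string": "str", "integer": "int", "number": "float", "boolean": "bool"}

def schema_to_dataclass(name: str, schema: dict) -> str:
    """Emit dataclass source text for one flat OpenAPI object schema."""
    lines = ["@dataclasses.dataclass", f"class {name}:"]
    for prop, spec in schema.get("properties", {}).items():
        py_type = OPENAPI_TO_PY.get(spec.get("type"), "object")
        lines.append(f"    {prop}: {py_type}")
    return "\n".join(lines)

print(schema_to_dataclass("Pet", {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
}))
```

A real generator would also need to handle nested objects, arrays, enums, and optional fields, but the principle is the same: spec in, module of dataclasses out.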

@RobbeSneyders
Member

We could leverage dataclasses-json or SimpleNamespace here.
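For reference, the `SimpleNamespace` option can be sketched with the standard library alone: it gives handlers attribute access on the parsed body, though without any schema validation. The payload here is a made-up example.

```python
import json
from types import SimpleNamespace

payload = '{"name": "rex", "tags": {"color": "brown"}}'
# object_hook converts every decoded JSON object into a SimpleNamespace,
# so nested objects become nested namespaces automatically.
pet = json.loads(payload, object_hook=lambda d: SimpleNamespace(**d))
print(pet.name, pet.tags.color)  # rex brown
```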

@vashek

vashek commented Apr 7, 2023

Ah, so what I actually ended up doing was:

  • make a script that takes my OpenAPI spec and generates the data model expressed using dataclasses and Enums
  • make a decorator for my API functions that inspects the type annotations, uses dacite.from_dict to convert the body from a dict to the correct dataclass (and the enum strings into actual Enums), calls the API function, and then uses dataclasses.asdict to convert the returned dataclass back to a dict

So my API implementation never has to deal with raw dicts: it accepts and returns dataclasses, mypy and pylint check my types and do static analysis, and my IDE can give me hints.
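The decorator pattern described above can be sketched roughly as follows. This is not the actual code from the comment: to keep it self-contained it uses a minimal stdlib stand-in (`_from_dict`) instead of dacite, handles only flat dataclasses, and assumes the request body arrives as a parameter named `body`; the `typed_body`, `Pet`, and `Status` names are invented for the example.

```python
import dataclasses
import enum
import functools
import typing

def _from_dict(cls, data):
    """Minimal stand-in for dacite.from_dict: nested dataclasses and Enums only."""
    kwargs = {}
    for f in dataclasses.fields(cls):
        value = data[f.name]
        if dataclasses.is_dataclass(f.type):
            value = _from_dict(f.type, value)
        elif isinstance(f.type, type) and issubclass(f.type, enum.Enum):
            value = f.type(value)  # enum string -> actual Enum member
        kwargs[f.name] = value
    return cls(**kwargs)

def typed_body(func):
    """Convert the `body` dict into the annotated dataclass before the call,
    and the returned dataclass back into a plain dict after it."""
    body_cls = typing.get_type_hints(func)["body"]

    @functools.wraps(func)
    def wrapper(body, **kwargs):
        result = func(_from_dict(body_cls, body), **kwargs)
        return dataclasses.asdict(result)

    return wrapper

class Status(enum.Enum):
    ACTIVE = "active"
    DISABLED = "disabled"

@dataclasses.dataclass
class Pet:
    name: str
    status: Status

@typed_body
def update_pet(body: Pet) -> Pet:
    # The handler works with real dataclasses, never with raw dicts.
    return Pet(name=body.name.title(), status=body.status)

print(update_pet({"name": "rex", "status": "active"}))
```

Note that `dataclasses.asdict` leaves Enum members as Enum objects in the resulting dict, so a real version would also need to serialize those back to their string values before returning the response.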

Now that I think about it, I think I basically achieved what FastAPI is doing, just using dacite instead of pydantic, the difference being that FastAPI generates a spec from code while I generate code from spec.

I suspect it might be too specific to the way I write my API spec (and also not very clean code TBH) but if there's interest, I suppose I could offer it up to the community.

@RaphaelCanin

+1
