"Too many queries or query is too complex" issue #1689

Open
jmainwaring opened this issue Aug 29, 2024 · 11 comments
Labels: Bug (Something isn't working), Triage 👀

Comments

@jmainwaring commented Aug 29, 2024

Hello! As seen in this thread, many of us continue to face the following error when run metadata is added in BigQuery:

Resources exceeded during query execution: Not enough resources for query planning - too many subqueries or query is too complex.

I see in the thread that the issue appears to have been resolved, but I and others later in the thread are still encountering this error after upgrading to version 0.16.0. This and this are two other examples of users who appear to be running into the same thing. If you all have any suggestions for config changes we need to make or would be up for discussing a potential fix, that would be greatly appreciated, as there are some wonderful features in the new version that we hope to not miss out on. Thank you!

Environment:
Elementary dbt package version: 1.6.0
dbt version you're using: 1.7.4
Data warehouse: BigQuery

jmainwaring added the Bug (Something isn't working) and Triage 👀 labels on Aug 29, 2024
@cwmuller commented Sep 2, 2024

I'm also experiencing this on 0.16.0 with dbt-bigquery 1.8.2. I tried setting the query_max_size and insert_rows_method parameters, as my impression was that these were added to alleviate this issue, but I am still getting the same error messages as before.

@belle-crisp (Contributor)

Same for me!

@cwmuller

Uploaded log file: dbt.log

@haritamar (Collaborator) commented Oct 1, 2024

Hi @jmainwaring and all,

Apologies for the delayed response here. This configuration (which was mentioned in one of the threads) should probably solve this issue:

  insert_rows_method: chunk
  dbt_artifacts_chunk_size: 500

Some extended context on this problem: BQ has two types of limits that affect insert queries:

  1. A limit on the number of rows inserted - this causes the error mentioned above.
  2. A limit on the length of the entire query.

The chunk insert method actually handles both types of limits today, and we'll make it the default in the upcoming package release.
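
To illustrate why chunking addresses both limits, here is a minimal Python sketch of the idea. This is not Elementary's actual implementation; chunked_inserts and render_insert are hypothetical names, and the defaults simply mirror the vars above:

  # Illustrative sketch only, not Elementary's actual code.
  def chunked_inserts(rows, render_insert, chunk_size=500, query_max_size=100_000):
      # Limit 1 (row count): emit fixed-size chunks of rows.
      for i in range(0, len(rows), chunk_size):
          chunk = rows[i:i + chunk_size]
          sql = render_insert(chunk)  # renders one INSERT ... VALUES ... statement
          # Limit 2 (query length): if the rendered SQL is too long,
          # halve the chunk and retry; a single row can't be split further.
          if len(sql) <= query_max_size or len(chunk) == 1:
              yield sql
          else:
              mid = len(chunk) // 2
              yield from chunked_inserts(chunk[:mid], render_insert, mid, query_max_size)
              yield from chunked_inserts(chunk[mid:], render_insert, len(chunk) - mid, query_max_size)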

Thanks,
Itamar

@cwmuller commented Oct 1, 2024

Hi @haritamar,

The log I posted above is the output from Elementary failing with exactly the configuration you mention. Unless I have set these parameters incorrectly, they have not fixed the issue.

@Maayan-s (Contributor) commented Oct 1, 2024

@cwmuller @belle-crisp
@nickozilla, @benoitgoujon, @wei-wei29, @chenxilll, @kylemcleland, @neethu2409

Let us know if the config @haritamar suggested doesn't solve this 🙏🏻

@belle-crisp (Contributor)

I have the same experience as @cwmuller

@haritamar (Collaborator) commented Oct 1, 2024

Hi @cwmuller @belle-crisp,
Sorry about that.

@cwmuller another setting that might help is tweaking the query_max_size parameter; I'm not sure if you adjusted that as well. We saw in the log that the failing query is around 240K characters. Have you tried setting query_max_size: 100000?

(I saw it suggested in one of the Slack threads, so I'm not sure if you already tried it.)
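
For reference, set as a dbt var that would look like this (the exact value may need tuning per project):

  vars:
    query_max_size: 100000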

Thanks,
Itamar

@haritamar (Collaborator)

Also, if you could share the full list of vars you've set up, that would be great 🙏

@cwmuller commented Oct 1, 2024

Uploaded a new log running with the following profiles config:

  elementary:
      insert_rows_method: "chunk"
      dbt_artifacts_chunk_size: 500
      query_max_size: 1000000

Will get back to you on vars.

dbt.log

@haritamar (Collaborator)

Hey @cwmuller,
The configuration above is what I meant. Interestingly, though, I think it is configured in the wrong place: these should be set as vars in dbt_project.yml, not under profiles.

e.g.:

vars:
  insert_rows_method: "chunk"
  dbt_artifacts_chunk_size: 500
  query_max_size: 100000

(I removed one zero from query_max_size)

If you could check whether this configuration makes any difference, that would be great.
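
For completeness, a minimal dbt_project.yml sketch showing where the vars block sits (the name and profile values below are placeholders for your own project):

  name: "my_project"      # placeholder
  profile: "my_profile"   # placeholder

  vars:
    insert_rows_method: "chunk"
    dbt_artifacts_chunk_size: 500
    query_max_size: 100000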
